Computing tools for implementing standards for single-case designs.
Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E
2015-11-01
In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook (the WWC standards). These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were found to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.
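As a concrete illustration of the kind of computation such tools automate, the sketch below calculates the Nonoverlap of All Pairs (NAP), one of several nonoverlap effect sizes used in SCD analysis, for a single baseline/intervention series. The data values are invented, and the example does not reproduce the behavior of any particular tool reviewed above.

```python
def nap(baseline, intervention):
    """Nonoverlap of All Pairs: proportion of (baseline, intervention)
    pairs in which the intervention value exceeds the baseline value,
    with ties counted as half an overlap."""
    pairs = [(b, t) for b in baseline for t in intervention]
    score = sum(1.0 if t > b else 0.5 if t == b else 0.0 for b, t in pairs)
    return score / len(pairs)

# Invented example data: one AB (baseline/intervention) series.
baseline = [2, 3, 3, 4, 2]
intervention = [5, 6, 4, 7, 6, 8]
print(f"NAP = {nap(baseline, intervention):.2f}")
```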
Physical Education Curriculum Analysis Tool (PECAT)
ERIC Educational Resources Information Center
Lee, Sarah M.; Wechsler, Howell
2006-01-01
The Physical Education Curriculum Analysis Tool (PECAT) will help school districts conduct a clear, complete, and consistent analysis of written physical education curricula, based upon national physical education standards. The PECAT is customizable to include local standards. The results from the analysis can help school districts enhance…
Data Standards for Flow Cytometry
SPIDLEN, JOSEF; GENTLEMAN, ROBERT C.; HAALAND, PERRY D.; LANGILLE, MORGAN; MEUR, NOLWENN LE; OCHS, MICHAEL F.; SCHMITT, CHARLES; SMITH, CLAYTON A.; TREISTER, ADAM S.; BRINKMAN, RYAN R.
2009-01-01
Flow cytometry (FCM) is an analytical tool widely used for cancer and HIV/AIDS research and treatment, stem cell manipulation, and the detection of microorganisms in environmental samples. Current data standards do not capture the full scope of FCM experiments, and there is a demand for software tools that can assist in the exploration and analysis of large FCM datasets. We are implementing a standardized approach to capturing, analyzing, and disseminating FCM data that will facilitate both more complex analyses and analysis of datasets that could not previously be efficiently studied. Initial work has focused on developing a community-based guideline for recording and reporting the details of FCM experiments. Open source software tools that implement this standard are being created, with an emphasis on facilitating reproducible and extensible data analyses. In addition, tools for electronic collaboration will support integrated access to and comprehension of experiments, empowering users to collaborate on FCM analyses. This coordinated, joint development of bioinformatics standards and software tools for FCM data analysis has the potential to greatly facilitate both basic and clinical research, impacting a notably diverse range of medical and environmental research areas. PMID:16901228
A survey of tools for the analysis of quantitative PCR (qPCR) data.
Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas
2014-09-01
Real-time quantitative polymerase chain reaction (qPCR) is a standard technique used in most laboratories for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview of quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
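Among the quantification strategies such tools implement, relative quantification is often reported with the comparative Cq (2^-ΔΔCq) method. The sketch below shows that calculation on invented Cq values; it illustrates one common strategy only and does not reproduce any specific package from the survey.

```python
def ddcq_fold_change(cq_target_treated, cq_ref_treated,
                     cq_target_control, cq_ref_control):
    """Relative expression by the comparative Cq (2^-delta-delta-Cq) method,
    assuming roughly 100% amplification efficiency for both assays."""
    dcq_treated = cq_target_treated - cq_ref_treated    # normalize to reference gene
    dcq_control = cq_target_control - cq_ref_control
    ddcq = dcq_treated - dcq_control                     # normalize to control sample
    return 2.0 ** (-ddcq)

# Invented Cq values for illustration.
print(f"fold change = {ddcq_fold_change(22.1, 18.0, 24.6, 18.2):.2f}")
```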
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
Using automated and standardized computer tools to calculate the pertinent test result values has several advantages, such as: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form, 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards, 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results, and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program, 2. ASTM F2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program, 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
Allred, Sharon K; Smith, Kevin F; Flowers, Laura
2004-01-01
With the increased interest in evidence-based medicine, Internet access and the growing emphasis on national standards, there is an increased challenge for teaching institutions and nursing services to teach and implement standards. At the same time, electronic clinical documentation tools have started to become a common format for recording nursing notes. The major aim of this paper is to ascertain and assess the availability of clinical nursing tools based on the NANDA, NOC and NIC standards. Faculty at 20 large nursing schools and directors of nursing at 20 hospitals were interviewed regarding the use of nursing standards in clinical documentation packages, not only for teaching purposes but also for use in hospital-based systems to ensure patient safety. A survey tool was utilized that covered questions regarding what nursing standards are being taught in the nursing schools, what standards are encouraged by the hospitals, and teaching initiatives that include clinical documentation tools. Information was collected on how utilizing these standards in a clinical or hospital setting can improve the overall quality of care. Analysis included univariate and bivariate analysis. The consensus between both groups was that the NANDA, NOC and NIC national standards are the most widely taught and utilized. In addition, a training initiative was identified within a large university where a clinical documentation system based on these standards was developed utilizing handheld devices.
Greenwald, William W; Li, He; Smith, Erin N; Benaglio, Paola; Nariai, Naoki; Frazer, Kelly A
2017-04-07
Genomic interaction studies use next-generation sequencing (NGS) to examine the interactions between two loci on the genome, with subsequent bioinformatics analyses typically including annotation, intersection, and merging of data from multiple experiments. While many file types and analysis tools exist for storing and manipulating single locus NGS data, there is currently no file standard or analysis tool suite for manipulating and storing paired-genomic-loci: the data type resulting from "genomic interaction" studies. As genomic interaction sequencing data are becoming prevalent, a standard file format and tools for working with these data conveniently and efficiently are needed. This article details a file standard and novel software tool suite for working with paired-genomic-loci data. We present the paired-genomic-loci (PGL) file standard for genomic-interactions data, and the accompanying analysis tool suite "pgltools": a cross-platform, PyPy-compatible python package available both as an easy-to-use UNIX package, and as a python module, for integration into pipelines of paired-genomic-loci analyses. Pgltools is a freely available, open source tool suite for manipulating paired-genomic-loci data. Source code, an in-depth manual, and a tutorial are available publicly at www.github.com/billgreenwald/pgltools, and a python module of the operations can be installed from PyPI via the PyGLtools module.
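To make the paired-locus data type concrete, here is a minimal, library-independent sketch that parses one tab-delimited paired-genomic-loci record into a small data structure. The column layout assumed here (chrA, startA, endA, chrB, startB, endB, plus optional annotation columns) is an illustration only; the pgltools documentation defines the actual PGL specification, and the example record is invented.

```python
from dataclasses import dataclass

@dataclass
class PairedLoci:
    chrom_a: str
    start_a: int
    end_a: int
    chrom_b: str
    start_b: int
    end_b: int
    annotations: tuple  # any extra columns, kept verbatim

def parse_pgl_line(line):
    """Parse one tab-delimited paired-loci record (assumed column order)."""
    fields = line.rstrip("\n").split("\t")
    return PairedLoci(fields[0], int(fields[1]), int(fields[2]),
                      fields[3], int(fields[4]), int(fields[5]),
                      tuple(fields[6:]))

# Invented example record.
rec = parse_pgl_line("chr1\t1000\t5000\tchr1\t200000\t204000\tloop_1")
print(rec.chrom_a, rec.start_a, rec.end_b, rec.annotations)
```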
Code Analysis and Refactoring with Clang Tools, Version 0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelley, Timothy M.
2016-12-23
Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.
Merlin, Fabrício Kurman; Pereira, Vera Lúcia Duarte do Valle; Pacheco, Waldemar
2012-01-01
Organizations are part of an environment in which they are pressured to meet society's demands and to act in a sustainable way. In an attempt to meet such demands, organizations make use of various management tools, among which ISO standards are used. Although there is evidence of the contributions provided by these standards, it is questionable whether their parameters converge in a way that could induce sustainable development in organizations. This work presents a theoretical study, descriptive and deductive in method and grounded in a structuralist world view, which aims to analyze the convergence of management tool parameters in ISO standards. To support the analysis, a generic framework for possible convergence was developed, based on a systems approach, linking five ISO standards (ISO 9001, ISO 14001, OHSAS 18001, ISO 31000 and ISO 26000) with sustainable development and positioning them according to organizational levels (strategic, tactical and operational). The framework was designed based on the Brundtland report concept. The analysis was performed by exploring the generic framework for possible convergence based on the Nadler and Tushman model. The results indicate that the standards can contribute to inducing sustainable development in organizations, as long as they meet certain minimum conditions related to their strategic alignment.
Murray-Davis, Beth; McDonald, Helen; Cross-Sudworth, Fiona; Ahmed, Rashid; Simioni, Julia; Dore, Sharon; Marrin, Michael; DeSantis, Judy; Leyland, Nicholas; Gardosi, Jason; Hutton, Eileen; McDonald, Sarah
2015-08-01
Adverse events occur in up to 10% of obstetric cases, and up to one half of these could be prevented. Case reviews and root cause analysis using a structured tool may help health care providers to learn from adverse events and to identify trends and recurring systems issues. We sought to establish the reliability of a root cause analysis computer application called Standardized Clinical Outcome Review (SCOR). We designed a mixed methods study to evaluate the effectiveness of the tool. We conducted qualitative content analysis of five charts reviewed by both the traditional obstetric quality assurance methods and the SCOR tool. We also determined inter-rater reliability by having four health care providers review the same five cases using the SCOR tool. The comparative qualitative review revealed that the traditional quality assurance case review process used inconsistent language and made serious, personalized recommendations for those involved in the case. In contrast, the SCOR review provided a consistent format for recommendations, a list of action points, and highlighted systems issues. The mean percentage agreement between the four reviewers for the five cases was 75%. The different health care providers completed data entry and assessment of the case in a similar way. Missing data from the chart and poor wording of questions were identified as issues affecting percentage agreement. The SCOR tool provides a standardized, objective, obstetric-specific tool for root cause analysis that may improve identification of risk factors and dissemination of action plans to prevent future events.
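The inter-rater reliability reported above is a simple percentage agreement averaged across reviewers and cases. The sketch below shows one way such an agreement figure can be computed; the ratings are invented stand-ins rather than actual SCOR review items, and the real tool's scoring scheme may differ.

```python
from itertools import combinations

def mean_pairwise_agreement(ratings_by_reviewer):
    """Mean percentage agreement over all reviewer pairs.
    `ratings_by_reviewer` is a list of equal-length rating lists."""
    n_items = len(ratings_by_reviewer[0])
    agreements = []
    for r1, r2 in combinations(ratings_by_reviewer, 2):
        matches = sum(a == b for a, b in zip(r1, r2))
        agreements.append(matches / n_items)
    return 100.0 * sum(agreements) / len(agreements)

# Invented ratings: 4 reviewers scoring 5 case-review items.
reviewers = [
    ["yes", "no", "yes", "yes", "no"],
    ["yes", "no", "yes", "no", "no"],
    ["yes", "yes", "yes", "yes", "no"],
    ["yes", "no", "yes", "yes", "no"],
]
print(f"{mean_pairwise_agreement(reviewers):.0f}% agreement")
```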
Fedorov, Andriy; Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R
2016-01-01
Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM(®)) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard.
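One processing step mentioned above, Standardized Uptake Value (SUV) normalization, has a widely used body-weight form. A minimal sketch of that calculation follows; the numbers are invented, and decay correction and the specific DICOM attributes involved are omitted for brevity.

```python
def suv_bw(voxel_activity_bq_per_ml, injected_dose_bq, patient_weight_kg):
    """Body-weight SUV: tissue activity concentration divided by
    injected dose per gram of body weight (1 kg = 1000 g, and
    1 ml of tissue is assumed to weigh roughly 1 g)."""
    dose_per_gram = injected_dose_bq / (patient_weight_kg * 1000.0)
    return voxel_activity_bq_per_ml / dose_per_gram

# Invented example: 12 kBq/ml voxel, 370 MBq injected, 75 kg patient.
print(f"SUVbw = {suv_bw(12_000, 370_000_000, 75):.2f}")
```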
A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis.
Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier
2018-05-01
The evaluation of the level of clinical competence acquired by the student is a complex process that must meet various requirements to ensure its quality. The psychometric analysis of the data collected by the assessment tools used is a fundamental aspect to guarantee the student's competence level. The objective was to conduct a psychometric analysis of an instrument which assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), which comprised 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). Participants were 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis was performed considering face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance. These three factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. The assessment tool has proven to be sufficiently reliable and valid for the assessment of the clinical competence of nursing students using standardized patients. This tool has three main components: the nursing process, communication skills, and safety management. Copyright © 2018 Elsevier Ltd. All rights reserved.
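The internal consistency reported above (higher than 0.80) is conventionally estimated with Cronbach's alpha. Assuming that is the statistic used, the sketch below computes it for a small invented matrix of item scores on the 0/1/2 scale described in the abstract; the actual SAT-SPS data are not reproduced here.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Invented scores for 6 students on 4 items (0 incorrect, 1 acceptable, 2 correct).
scores = [[2, 1, 2, 2],
          [1, 1, 1, 2],
          [2, 2, 2, 2],
          [0, 1, 0, 1],
          [1, 0, 1, 1],
          [2, 2, 1, 2]]
print(f"alpha = {cronbach_alpha(scores):.2f}")
```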
Guidelines for the analysis of free energy calculations
Klimovich, Pavel V.; Shirts, Michael R.; Mobley, David L.
2015-01-01
Free energy calculations based on molecular dynamics (MD) simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub at https://github.com/choderalab/pymbar-examples, that implements the analysis practices reviewed here for several reference simulation packages, which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope these tools and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations. PMID:25808134
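As a small illustration of one estimator covered by the review (thermodynamic integration), the sketch below integrates invented mean dH/dlambda values over the lambda schedule with the trapezoidal rule. The tool itself supports this and other estimators, and a real analysis would also require uncertainty estimates and equilibration/decorrelation checks as the review discusses; the values and units here are invented.

```python
# Invented lambda schedule and corresponding <dH/dlambda> averages (kcal/mol).
lambdas = [0.0, 0.25, 0.5, 0.75, 1.0]
dhdl_means = [12.4, 8.1, 4.9, 2.2, 0.6]

# Thermodynamic integration: Delta G = integral over lambda of <dH/dlambda>,
# approximated here with the trapezoidal rule over the sampled points.
delta_g = sum(0.5 * (dhdl_means[i] + dhdl_means[i - 1]) * (lambdas[i] - lambdas[i - 1])
              for i in range(1, len(lambdas)))
print(f"TI estimate of Delta G ~ {delta_g:.2f} kcal/mol")
```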
EEG and MEG data analysis in SPM8.
Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl
2011-01-01
SPM is a free and open source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools.
A Tool for Estimating Variability in Wood Preservative Treatment Retention
Patricia K. Lebow; Adam M. Taylor; Timothy M. Young
2015-01-01
Composite sampling is standard practice for evaluation of preservative retention levels in preservative-treated wood. Current protocols provide an average retention value but no estimate of uncertainty. Here we describe a statistical method for calculating uncertainty estimates using the standard sampling regime with minimal additional chemical analysis. This tool can...
Rahman, M Azizur; Rusteberg, Bernd; Gogu, R C; Lobo Ferreira, J P; Sauter, Martin
2012-05-30
This study reports the development of a new spatial multi-criteria decision analysis (SMCDA) software tool for selecting suitable sites for Managed Aquifer Recharge (MAR) systems. The new SMCDA software tool functions based on the combination of existing multi-criteria evaluation methods with modern decision analysis techniques. More specifically, non-compensatory screening, criteria standardization and weighting, and the Analytical Hierarchy Process (AHP) have been combined with Weighted Linear Combination (WLC) and Ordered Weighted Averaging (OWA). This SMCDA tool may be implemented with a wide range of decision-maker preferences. The tool's user-friendly interface helps guide the decision maker through the sequential steps for site selection, namely constraint mapping, criteria hierarchy, criteria standardization and weighting, and criteria overlay. The tool offers some predetermined default criteria and standard methods to improve the trade-off between ease of use and efficiency. Integrated into ArcGIS, the tool has the advantage of using GIS tools for spatial analysis, and herein data may be processed and displayed. The tool is non-site specific, adaptive, and comprehensive, and may be applied to any type of site-selection problem. To demonstrate the robustness of the new tool, a case study was planned and executed in the Algarve Region, Portugal. The efficiency of the SMCDA tool in the decision making process for selecting suitable sites for MAR was also demonstrated. Specific aspects of the tool such as built-in default criteria, explicit decision steps, and flexibility in choosing different options were key features which benefited the study. The new SMCDA tool can be augmented by groundwater flow and transport modeling so as to achieve a more comprehensive approach to the selection process for the best locations of the MAR infiltration basins, as well as the locations of recovery wells and areas of groundwater protection. The new spatial multi-criteria analysis tool has already been implemented within the GIS-based Gabardine decision support system as an innovative MAR planning tool. Copyright © 2012 Elsevier Ltd. All rights reserved.
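The core overlay step combines standardized criterion scores by weighted linear combination (WLC). The per-site sketch below shows that scoring with invented criteria, bounds, and weights and a simple min-max standardization; the actual tool performs the equivalent operation over GIS raster layers inside ArcGIS, and the criteria here are not taken from the study.

```python
def min_max_standardize(value, lo, hi):
    """Rescale a raw criterion value to a 0-1 suitability score."""
    return (value - lo) / (hi - lo)

def wlc_score(standardized_scores, weights):
    """Weighted linear combination: sum of weight * standardized score.
    Weights are assumed to sum to 1 (e.g., derived from AHP)."""
    return sum(w * s for w, s in zip(weights, standardized_scores))

# Invented criteria for one candidate MAR site, already oriented so that
# higher standardized values mean more suitable.
raw = [45.0, 12.0, 800.0]                        # e.g. infiltration, depth, distance
bounds = [(0.0, 60.0), (2.0, 30.0), (0.0, 2000.0)]
weights = [0.5, 0.3, 0.2]
scores = [min_max_standardize(v, lo, hi) for v, (lo, hi) in zip(raw, bounds)]
print(f"WLC suitability = {wlc_score(scores, weights):.2f}")
```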
Guidelines for the analysis of free energy calculations.
Klimovich, Pavel V; Shirts, Michael R; Mobley, David L
2015-05-01
Free energy calculations based on molecular dynamics simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub as part of the pymbar package (located at http://github.com/choderalab/pymbar), that implements the analysis practices reviewed here for several reference simulation packages, which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope this tool and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations.
Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R.
2016-01-01
Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard. PMID:27257542
Screening and Evaluation Tool (SET) Users Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pincock, Layne
This document is the user's guide to the Screening and Evaluation Tool (SET). SET is a tool for comparing multiple fuel cycle options against a common set of criteria and metrics. It does this using standard multi-attribute utility decision analysis methods.
A new software tool for 3D motion analyses of the musculo-skeletal system.
Leardini, A; Belvedere, C; Astolfi, L; Fantozzi, S; Viceconti, M; Taddei, F; Ensini, A; Benedetti, M G; Catani, F
2006-10-01
Many clinical and biomechanical research studies, particularly in orthopaedics, nowadays involve forms of movement analysis. Gait analysis, video-fluoroscopy of joint replacement, pre-operative planning, surgical navigation, and standard radiostereometry all require tools for easy access to three-dimensional graphical representations of rigid segment motion. Relevant data from this variety of sources need to be organised in structured forms. Registration, integration, and synchronisation of segment position data are additional necessities. With this aim, the present work exploits the features of a software tool recently developed within an EU-funded project ('Multimod') in a series of different research studies. Standard and advanced gait analysis on a normal subject, in vivo fluoroscopy-based three-dimensional motion of a replaced knee joint, patellar and ligament tracking on a knee specimen by a surgical navigation system, and the stem-to-femur migration pattern in a patient who underwent total hip replacement were analysed with standard techniques and all represented by this innovative software tool. Segment pose data were eventually obtained from these different techniques, and were successfully imported and organised in a hierarchical tree within the tool. Skeletal bony segments, prosthesis component models and ligament links were registered successfully to corresponding marker position data for effective three-dimensional animations. These were shown in various combinations, in different views, and from different perspectives, according to specific research interests. Bioengineering and medical professionals would find it much easier to interpret the motion analysis measurements necessary in their research fields, and would therefore benefit from this software tool.
NASA Technical Reports Server (NTRS)
Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.
1993-01-01
A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.
Tool Efficiency Analysis model research in SEMI industry
NASA Astrophysics Data System (ADS)
Lei, Ma; Nana, Zhang; Zhongqiu, Zhang
2018-06-01
One of the key goals in the SEMI industry is to improve equipment throughput and ensure that equipment production efficiency is maximized. This paper is based on SEMI standards for semiconductor equipment control, defines the transition rules between different tool states, and presents a TEA (Tool Efficiency Analysis) system model that analyzes tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was verified successfully; it produced the parameter values used to measure equipment performance, along with suggestions for improvement.
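A minimal sketch of the state-machine bookkeeping such a TEA system relies on is shown below: it accumulates time in each tool state from a sequence of timestamped state-change events and reports a simple utilization figure. The state names and event format are invented for illustration; the SEMI equipment-state definitions (for example, SEMI E10) define the real categories and transition rules, and the abstract does not state which specific standard the model uses.

```python
from collections import defaultdict

def accumulate_state_times(events):
    """`events` is a time-ordered list of (timestamp_hours, new_state).
    Returns total hours spent in each state up to the last event."""
    totals = defaultdict(float)
    for (t0, state), (t1, _next_state) in zip(events, events[1:]):
        totals[state] += t1 - t0
    return dict(totals)

# Invented event log for one tool over a 24-hour window.
events = [(0.0, "PRODUCTIVE"), (10.5, "IDLE"), (12.0, "PRODUCTIVE"),
          (20.0, "UNSCHEDULED_DOWN"), (24.0, "END")]
times = accumulate_state_times(events)
utilization = times["PRODUCTIVE"] / sum(times.values())
print(times, f"utilization = {utilization:.0%}")
```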
Détroit, Florent; Coudenneau, Aude; Moncel, Marie-Hélène
2016-01-01
There appears to be little doubt as to the existence of an intentional technological resolve to produce convergent tools during the Middle Palaeolithic. However, the use of these pieces as pointed tools is still subject to debate: i.e., handheld tool vs. hafted tool. Present-day technological analysis has begun to apply new methodologies in order to quantify shape variability and to decipher the role of the morphology of these pieces in relation to function; for instance, geometric morphometric analyses have recently been applied with successful results. This paper presents a study of this type of analysis on 37 convergent tools from level Ga of Payre site (France), dated to MIS 8–7. These pieces are non-standardized knapping products produced by discoidal and orthogonal core technologies. Moreover, macro-wear studies attest to various activities on diverse materials with no evidence of hafting or projectile use. The aim of this paper is to test the geometric morphometric approach on non-standardized artefacts applying the Elliptical Fourier analysis (EFA) to 3D contours and to assess the potential relationship between size and shape, technology and function. This study is innovative in that it is the first time that this method, considered to be a valuable complement for describing technological and functional attributes, is applied to 3D contours of lithic products. Our results show that this methodology ensures a very good degree of accuracy in describing shape variations of the sharp edges of technologically non-standardized convergent tools. EFA on 3D contours indicates variations in deviations of the outline along the third dimension (i.e., dorso-ventrally) and yields quantitative and insightful information on the actual shape variations of tools. Several statistically significant relationships are found between shape variation and use-wear attributes, though the results emphasize the large variability of the shape of the convergent tools, which, in general, does not show a strong direct association with technological features and function. This is in good agreement with the technological context of this chronological period, characterized by a wide diversity of non-standardized tools adapted to multipurpose functions for varied subsistence activities. PMID:27191164
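As a simplified relative of the elliptical Fourier analysis used in the study, the sketch below computes Fourier descriptors of a closed 2D outline by treating the contour points as complex numbers. This is not the EFA of the paper (and it works on a 2D rather than a 3D contour), but it illustrates how a closed outline can be reduced to a small set of shape coefficients suitable for multivariate comparison; the outline is invented.

```python
import numpy as np

def fourier_descriptors(contour_xy, n_harmonics=8):
    """Low-order Fourier coefficients of a closed 2D contour.
    `contour_xy` is an (N, 2) array of outline points in order.
    Descriptors are made translation- and scale-invariant by dropping
    the DC term and normalizing by the first harmonic's magnitude."""
    pts = np.asarray(contour_xy, dtype=float)
    z = pts[:, 0] + 1j * pts[:, 1]          # outline as complex numbers
    coeffs = np.fft.fft(z) / len(z)
    descriptors = coeffs[1:n_harmonics + 1]
    return np.abs(descriptors) / np.abs(descriptors[0])

# Invented outline: a slightly distorted ellipse sampled at 64 points.
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
outline = np.column_stack([3.0 * np.cos(theta),
                           1.2 * np.sin(theta) + 0.2 * np.sin(2 * theta)])
print(np.round(fourier_descriptors(outline), 3))
```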
Chacón, M Gema; Détroit, Florent; Coudenneau, Aude; Moncel, Marie-Hélène
2016-01-01
There appears to be little doubt as to the existence of an intentional technological resolve to produce convergent tools during the Middle Palaeolithic. However, the use of these pieces as pointed tools is still subject to debate: i.e., handheld tool vs. hafted tool. Present-day technological analysis has begun to apply new methodologies in order to quantify shape variability and to decipher the role of the morphology of these pieces in relation to function; for instance, geometric morphometric analyses have recently been applied with successful results. This paper presents a study of this type of analysis on 37 convergent tools from level Ga of Payre site (France), dated to MIS 8-7. These pieces are non-standardized knapping products produced by discoidal and orthogonal core technologies. Moreover, macro-wear studies attest to various activities on diverse materials with no evidence of hafting or projectile use. The aim of this paper is to test the geometric morphometric approach on non-standardized artefacts applying the Elliptical Fourier analysis (EFA) to 3D contours and to assess the potential relationship between size and shape, technology and function. This study is innovative in that it is the first time that this method, considered to be a valuable complement for describing technological and functional attributes, is applied to 3D contours of lithic products. Our results show that this methodology ensures a very good degree of accuracy in describing shape variations of the sharp edges of technologically non-standardized convergent tools. EFA on 3D contours indicates variations in deviations of the outline along the third dimension (i.e., dorso-ventrally) and yields quantitative and insightful information on the actual shape variations of tools. Several statistically significant relationships are found between shape variation and use-wear attributes, though the results emphasize the large variability of the shape of the convergent tools, which, in general, does not show a strong direct association with technological features and function. This is in good agreement with the technological context of this chronological period, characterized by a wide diversity of non-standardized tools adapted to multipurpose functions for varied subsistence activities.
Adapting HIV patient and program monitoring tools for chronic non-communicable diseases in Ethiopia.
Letebo, Mekitew; Shiferaw, Fassil
2016-06-02
Chronic non-communicable diseases (NCDs) have become a huge public health concern in developing countries. Many resource-poor countries facing this growing epidemic, however, lack systems for an organized and comprehensive response to NCDs. The lack of national NCD policies, strategies, treatment guidelines, and surveillance and monitoring systems is a feature of health systems in many developing countries. Successfully responding to the problem requires a number of actions by the countries, including developing context-appropriate chronic care models and programs and standardization of patient and program monitoring tools. In this cross-sectional qualitative study we assessed existing monitoring and evaluation (M&E) tools used for NCD services in Ethiopia. Since the HIV care and treatment program is the only large-scale chronic care program in the country, we explored the M&E tools being used in the program and analyzed how these tools might be adapted to support NCD services in the country. Document review and in-depth interviews were the main data collection methods used. The interviews were held with health workers and staff involved in data management, purposively selected from four health facilities with high HIV and NCD patient loads. Thematic analysis was employed to make sense of the data. Our findings indicate an apparent lack of information systems for NCD services, including the absence of standardized patient and program monitoring tools to support the services. We identified several HIV care and treatment patient and program monitoring tools currently being used to facilitate intake processes, enrolment, follow up, cohort monitoring, appointment keeping, analysis and reporting. An analysis of how each tool being used for HIV patient and program monitoring can be adapted to support NCD services is presented. Given the similarity between HIV care and treatment and NCD services and the huge investment already made to implement standardized tools for the HIV care and treatment program, adaptation and use of HIV patient and program monitoring tools for NCD services can improve the NCD response in Ethiopia through structuring services, standardizing patient care and treatment, supporting evidence-based planning and providing information on effectiveness of interventions.
simuwatt - A Tablet Based Electronic Auditing Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macumber, Daniel; Parker, Andrew; Lisell, Lars
2014-05-08
'simuwatt Energy Auditor' (TM) is a new tablet-based electronic auditing tool that is designed to dramatically reduce the time and cost to perform investment-grade audits and improve quality and consistency. The tool uses the U.S. Department of Energy's OpenStudio modeling platform and integrated Building Component Library to automate modeling and analysis. simuwatt's software-guided workflow helps users gather required data, and provides the data in a standard electronic format that is automatically converted to a baseline OpenStudio model for energy analysis. The baseline energy model is calibrated against actual monthly energy use to ASHRAE Standard 14 guidelines. Energy conservation measures from the Building Component Library are then evaluated using OpenStudio's parametric analysis capability. Automated reporting creates audit documents that describe recommended packages of energy conservation measures. The development of this tool was partially funded by the U.S. Department of Defense's Environmental Security Technology Certification Program. As part of this program, the tool is being tested at 13 buildings on 5 Department of Defense sites across the United States. Results of the first simuwatt audit tool demonstration are presented in this paper.
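Calibration against monthly utility data under the ASHRAE 14 guidelines referenced above is typically judged with the normalized mean bias error (NMBE) and the coefficient of variation of the root-mean-square error, CV(RMSE). The sketch below computes these two statistics from invented monthly values using an n-1 denominator; the acceptance thresholds themselves are defined in the guideline and are not restated here.

```python
import numpy as np

def nmbe(measured, simulated):
    """Normalized mean bias error, in percent."""
    m, s = np.asarray(measured, float), np.asarray(simulated, float)
    return 100.0 * (m - s).sum() / ((len(m) - 1) * m.mean())

def cv_rmse(measured, simulated):
    """Coefficient of variation of the RMSE, in percent."""
    m, s = np.asarray(measured, float), np.asarray(simulated, float)
    rmse = np.sqrt(((m - s) ** 2).sum() / (len(m) - 1))
    return 100.0 * rmse / m.mean()

# Invented monthly electricity use (kWh): utility bills vs. baseline model.
measured  = [980, 910, 870, 820, 900, 1100, 1250, 1230, 1050, 930, 940, 990]
simulated = [1010, 930, 850, 800, 880, 1150, 1200, 1260, 1020, 900, 955, 1005]
print(f"NMBE = {nmbe(measured, simulated):.1f}%  CV(RMSE) = {cv_rmse(measured, simulated):.1f}%")
```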
ERIC Educational Resources Information Center
Sheehan, Kathleen M.
2015-01-01
The "TextEvaluator"® text analysis tool is a fully automated text complexity evaluation tool designed to help teachers, curriculum specialists, textbook publishers, and test developers select texts that are consistent with the text complexity guidelines specified in the Common Core State Standards.This paper documents the procedure used…
deepTools: a flexible platform for exploring deep-sequencing data.
Ramírez, Fidel; Dündar, Friederike; Diehl, Sarah; Grüning, Björn A; Manke, Thomas
2014-07-01
We present a Galaxy based web server for processing and visualizing deeply sequenced data. The web server's core functionality consists of a suite of newly developed tools, called deepTools, that enable users with little bioinformatic background to explore the results of their sequencing experiments in a standardized setting. Users can upload pre-processed files with continuous data in standard formats and generate heatmaps and summary plots in a straight-forward, yet highly customizable manner. In addition, we offer several tools for the analysis of files containing aligned reads and enable efficient and reproducible generation of normalized coverage files. As a modular and open-source platform, deepTools can easily be expanded and customized to future demands and developments. The deepTools webserver is freely available at http://deeptools.ie-freiburg.mpg.de and is accompanied by extensive documentation and tutorials aimed at conveying the principles of deep-sequencing data analysis. The web server can be used without registration. deepTools can be installed locally either stand-alone or as part of Galaxy. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
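To illustrate what a "normalized coverage file" contains, the sketch below converts raw per-bin read counts into RPKM-style values, one of the normalization schemes deepTools offers when generating coverage files. The counts, bin size, and library size are invented, and the actual tools operate on BAM/bigWig files rather than in-memory lists.

```python
def rpkm_per_bin(bin_counts, bin_size_bp, total_mapped_reads):
    """Reads Per Kilobase per Million mapped reads for fixed-size genome bins."""
    per_million = total_mapped_reads / 1e6
    per_kilobase = bin_size_bp / 1e3
    return [count / per_million / per_kilobase for count in bin_counts]

# Invented example: 50-bp bins from a library with 20 million mapped reads.
counts = [0, 3, 12, 45, 30, 7, 1]
print([round(v, 2) for v in rpkm_per_bin(counts, 50, 20_000_000)])
```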
Ergonomic analysis of fastening vibration based on ISO Standard 5349 (2001).
Joshi, Akul; Leu, Ming; Murray, Susan
2012-11-01
Hand-held power tools used for fastening operations exert high dynamic forces on the operator's hand-arm, potentially causing injuries to the operator in the long run. This paper presents a study that analyzed the vibrations exerted by two hand-held power tools used for fastening operations with the operator exhibiting different postures. The two pneumatic tools, a right-angled nut-runner and an offset pistol-grip, are used to install shearing-type fasteners. A tri-axial accelerometer is used to measure the tool's vibration. The position and orientation of the transducer mounted on the tool follow the ISO 5349 standard. The measured vibration data is used to compare the two power tools at different operating postures. The data analysis determines the number of years required to reach a 10% probability of developing finger blanching. The results indicate that the pistol-grip tool induces more vibration in the hand-arm than the right-angled nut-runner and that the vibrations exerted on the hand-arm vary for different postures. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
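The standard's exposure metrics can be sketched as follows: the vibration total value is the root sum of squares of the three frequency-weighted axis accelerations, the daily exposure A(8) rescales it to an 8-hour reference period, and ISO 5349-1 relates A(8) to the exposure duration associated with a 10% prevalence of finger blanching. The constants in the last step are quoted from memory of the standard's annex and should be verified against ISO 5349-1 before use; the measured accelerations below are invented.

```python
import math

def vibration_total_value(awx, awy, awz):
    """Root-sum-of-squares of the frequency-weighted accelerations (m/s^2)."""
    return math.sqrt(awx**2 + awy**2 + awz**2)

def daily_exposure_a8(ahv, exposure_hours):
    """A(8): energy-equivalent acceleration over an 8-hour reference period."""
    return ahv * math.sqrt(exposure_hours / 8.0)

def years_to_10pct_blanching(a8):
    """Group exposure duration for ~10% prevalence of finger blanching,
    using the relation D = 31.8 * A(8)**-1.06 recalled from ISO 5349-1;
    verify the constants against the standard before relying on them."""
    return 31.8 * a8 ** -1.06

# Invented frequency-weighted axis values for a 2-hour daily fastening task.
ahv = vibration_total_value(2.5, 1.8, 3.1)
a8 = daily_exposure_a8(ahv, 2.0)
print(f"ahv = {ahv:.2f} m/s^2, A(8) = {a8:.2f} m/s^2, "
      f"~{years_to_10pct_blanching(a8):.1f} years to 10% blanching")
```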
ERIC Educational Resources Information Center
Sheehan, Kathleen M.
2016-01-01
The "TextEvaluator"® text analysis tool is a fully automated text complexity evaluation tool designed to help teachers and other educators select texts that are consistent with the text complexity guidelines specified in the Common Core State Standards (CCSS). This paper provides an overview of the TextEvaluator measurement approach and…
CFD Process Pre- and Post-processing Automation in Support of Space Propulsion
NASA Technical Reports Server (NTRS)
Dorney, Suzanne M.
2003-01-01
The use of Computational Fluid Dynamics (CFD) has become standard practice in the design and analysis of the major components used for space propulsion. In an attempt to standardize and improve the CFD process, a series of automated tools has been developed. Through the use of these automated tools, the application of CFD to the design cycle has been improved and streamlined. This paper presents a series of applications in which deficiencies were identified in the CFD process and corrected through the development of automated tools.
A population MRI brain template and analysis tools for the macaque.
Seidlitz, Jakob; Sponheim, Caleb; Glen, Daniel; Ye, Frank Q; Saleem, Kadharbatcha S; Leopold, David A; Ungerleider, Leslie; Messinger, Adam
2018-04-15
The use of standard anatomical templates is common in human neuroimaging, as it facilitates data analysis and comparison across subjects and studies. For non-human primates, previous in vivo templates have lacked sufficient contrast to reliably validate known anatomical brain regions and have not provided tools for automated single-subject processing. Here we present the "National Institute of Mental Health Macaque Template", or NMT for short. The NMT is a high-resolution in vivo MRI template of the average macaque brain generated from 31 subjects, as well as a neuroimaging tool for improved data analysis and visualization. From the NMT volume, we generated maps of tissue segmentation and cortical thickness. Surface reconstructions and transformations to previously published digital brain atlases are also provided. We further provide an analysis pipeline using the NMT that automates and standardizes the time-consuming processes of brain extraction, tissue segmentation, and morphometric feature estimation for anatomical scans of individual subjects. The NMT and associated tools thus provide a common platform for precise single-subject data analysis and for characterizations of neuroimaging results across subjects and studies. Copyright © 2017 Elsevier Inc. All rights reserved.
C++ software quality in the ATLAS experiment: tools and experience
NASA Astrophysics Data System (ADS)
Martin-Haugh, S.; Kluth, S.; Seuster, R.; Snyder, S.; Obreshkov, E.; Roe, S.; Sherwood, P.; Stewart, G. A.
2017-10-01
In this paper we explain how the C++ code quality is managed in ATLAS using a range of tools from compile-time through to run time testing and reflect on the substantial progress made in the last two years largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing with an example of how the GoogleTest framework can be applied to our codebase.
Integrated Data Visualization and Virtual Reality Tool
NASA Technical Reports Server (NTRS)
Dryer, David A.
1998-01-01
The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design and changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.
ARM Data File Standards Version: 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kehoe, Kenneth; Beus, Sherman; Cialella, Alice
2014-04-01
The Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth's atmosphere in diverse climate regimes. The result is a diverse collection of data sets containing observational and derived data, currently accumulating at a rate of 30 TB of data and 150,000 different files per month (http://www.archive.arm.gov/stats/storage2.html). Continuing the current processing while scaling this to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable development of automated analysis and discovery tools for the ever-growing volumes of data. It also will enable consistent analysis of the multiyear data, allow for development of automated monitoring and data health status tools, and facilitate development of future capabilities for delivering data on demand that can be tailored explicitly to user needs. This analysis ability will only be possible if the data follows a minimum set of standards. This document proposes a hierarchy that includes required and recommended standards.
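As a toy illustration of what a minimum set of file standards enables, the sketch below checks a file's global metadata against a list of required attributes. The attribute names are hypothetical stand-ins rather than the actual ARM-standard fields, and a real check would read them from the data file itself rather than from a plain dictionary.

```python
# Hypothetical required global attributes; the ARM standards document
# defines the actual required metadata fields.
REQUIRED_ATTRIBUTES = ["site_id", "facility_id", "data_level", "datastream"]

def missing_attributes(file_metadata, required=REQUIRED_ATTRIBUTES):
    """Return the required metadata fields that are absent or empty."""
    return [name for name in required if not file_metadata.get(name)]

# Invented metadata from one hypothetical data file.
metadata = {"site_id": "sgp", "facility_id": "C1", "data_level": "b1"}
problems = missing_attributes(metadata)
print("missing:", problems if problems else "none")
```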
Nephele: a cloud platform for simplified, standardized and reproducible microbiome data analysis.
Weber, Nick; Liou, David; Dommer, Jennifer; MacMenamin, Philip; Quiñones, Mariam; Misner, Ian; Oler, Andrew J; Wan, Joe; Kim, Lewis; Coakley McCarthy, Meghan; Ezeji, Samuel; Noble, Karlynn; Hurt, Darrell E
2018-04-15
Widespread interest in the study of the microbiome has resulted in data proliferation and the development of powerful computational tools. However, many scientific researchers lack the time, training, or infrastructure to work with large datasets or to install and use command line tools. The National Institute of Allergy and Infectious Diseases (NIAID) has created Nephele, a cloud-based microbiome data analysis platform with standardized pipelines and a simple web interface for transforming raw data into biological insights. Nephele integrates common microbiome analysis tools as well as valuable reference datasets like the healthy human subjects cohort of the Human Microbiome Project (HMP). Nephele is built on the Amazon Web Services cloud, which provides centralized and automated storage and compute capacity, thereby reducing the burden on researchers and their institutions. https://nephele.niaid.nih.gov and https://github.com/niaid/Nephele. darrell.hurt@nih.gov.
Nephele: a cloud platform for simplified, standardized and reproducible microbiome data analysis
Weber, Nick; Liou, David; Dommer, Jennifer; MacMenamin, Philip; Quiñones, Mariam; Misner, Ian; Oler, Andrew J; Wan, Joe; Kim, Lewis; Coakley McCarthy, Meghan; Ezeji, Samuel; Noble, Karlynn; Hurt, Darrell E
2018-01-01
Motivation: Widespread interest in the study of the microbiome has resulted in data proliferation and the development of powerful computational tools. However, many scientific researchers lack the time, training, or infrastructure to work with large datasets or to install and use command line tools. Results: The National Institute of Allergy and Infectious Diseases (NIAID) has created Nephele, a cloud-based microbiome data analysis platform with standardized pipelines and a simple web interface for transforming raw data into biological insights. Nephele integrates common microbiome analysis tools as well as valuable reference datasets like the healthy human subjects cohort of the Human Microbiome Project (HMP). Nephele is built on the Amazon Web Services cloud, which provides centralized and automated storage and compute capacity, thereby reducing the burden on researchers and their institutions. Availability and implementation: https://nephele.niaid.nih.gov and https://github.com/niaid/Nephele. Contact: darrell.hurt@nih.gov. PMID:29028892
USDA-ARS's Scientific Manuscript database
Background: Dietary intake assessment with diet records (DR) is a standard research and practice tool in nutrition. Manual entry and analysis of DR are time-consuming and expensive. New electronic tools for diet entry by clients and research participants may reduce the cost and effort of nutrient int...
Status of the AIAA Modeling and Simulation Format Standard
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce; Hildreth, Bruce L.
2008-01-01
The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.
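As an illustration of what a small site-specific import tool has to do, the sketch below reads a tabular function from a miniature XML document and evaluates it by linear interpolation. The XML layout here is invented for the example and is not the DAVE-ML schema; a real importer would follow the element definitions in the standard.

```python
import xml.etree.ElementTree as ET
import bisect

# A made-up miniature table format -- NOT the DAVE-ML schema -- used only to
# illustrate the import-then-interpolate pattern a simulation model needs.
DOC = """<model>
  <table name="CL_vs_alpha">
    <breakpoints>-4 0 4 8 12</breakpoints>
    <values>-0.2 0.1 0.4 0.7 0.9</values>
  </table>
</model>"""

def load_table(xml_text, name):
    """Read the breakpoints and values of one named table."""
    root = ET.fromstring(xml_text)
    table = root.find(f"table[@name='{name}']")
    bps = [float(v) for v in table.find("breakpoints").text.split()]
    vals = [float(v) for v in table.find("values").text.split()]
    return bps, vals

def interpolate(bps, vals, x):
    """Piecewise-linear lookup with clamping at the table ends."""
    if x <= bps[0]:
        return vals[0]
    if x >= bps[-1]:
        return vals[-1]
    i = bisect.bisect_right(bps, x)
    frac = (x - bps[i - 1]) / (bps[i] - bps[i - 1])
    return vals[i - 1] + frac * (vals[i] - vals[i - 1])

bps, vals = load_table(DOC, "CL_vs_alpha")
print(interpolate(bps, vals, 6.0))  # 0.55 for this invented table
```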
Muir-Paulik, S A; Johnson, L E A; Kennedy, P; Aden, T; Villanueva, J; Reisdorf, E; Humes, R; Moen, A C
2016-01-01
The 2005 International Health Regulations (IHR 2005) emphasized the importance of laboratory capacity to detect emerging diseases including novel influenza viruses. To support IHR 2005 requirements and the need to enhance influenza laboratory surveillance capacity, the Association of Public Health Laboratories (APHL) and the Centers for Disease Control and Prevention (CDC) Influenza Division developed the International Influenza Laboratory Capacity Review (Tool). Data from 37 assessments were reviewed and analyzed to verify that the quantitative analysis results accurately depicted a laboratory's capacity and capabilities. Subject matter experts in influenza and laboratory practice used an iterative approach to develop the Tool incorporating feedback and lessons learnt through piloting and implementation. To systematically analyze assessment data, a quantitative framework for analysis was added to the Tool. The review indicated that changes in scores consistently reflected enhanced or decreased capacity. The review process also validated the utility of adding a quantitative analysis component to the assessments and the benefit of establishing a baseline from which to compare future assessments in a standardized way. Use of the Tool has provided APHL, CDC and each assessed laboratory with a standardized analysis of the laboratory's capacity. The information generated is used to improve laboratory systems for laboratory testing and enhance influenza surveillance globally. We describe the development of the Tool and lessons learnt. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
Assessment of the Draft AIAA S-119 Flight Dynamic Model Exchange Standard
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce; Murri, Daniel G.; Hill, Melissa A.; Jessick, Matthew V.; Penn, John M.; Hasan, David A.; Crues, Edwin Z.; Falck, Robert D.; McCarthy, Thomas G.; Vuong, Nghia;
2011-01-01
An assessment of a draft AIAA standard for flight dynamics model exchange, ANSI/AIAA S-119-2011, was conducted on behalf of NASA by a team from the NASA Engineering and Safety Center. The assessment included adding the capability of importing standard models into real-time simulation facilities at several NASA Centers as well as into analysis simulation tools. All participants were successful at importing two example models into their respective simulation frameworks by using existing software libraries or by writing new import tools. Deficiencies in the libraries and format documentation were identified and fixed; suggestions for improvements to the standard were provided to the AIAA. An innovative tool to generate C code directly from such a model was developed. Performance of the software libraries compared favorably with compiled code. As a result of this assessment, several NASA Centers can now import standard models directly into their simulations. NASA is considering adopting the now-published S-119 standard as an internal recommended practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaffney, P.W.; Wooten, J.W.
1980-05-01
Four software tools, PFORT, DAVE, POLISH, and BRNANL, which may be used to ensure the standardization of FORTRAN software, are introduced. First, FORTRAN computer programs are loosely classified into three groups. Then reasons are given why the programs in two of these groups should adhere to a portable subset of the American National Standard (ANS) FORTRAN 1966. Next, the software tools PFORT, DAVE, POLISH, and BRNANL are briefly described, and examples of the output from PFORT, DAVE, and POLISH are given. Finally, the dissemination of information pertaining to the tools, together with their availability, is outlined. 11 figures.
NASA Astrophysics Data System (ADS)
Vines, Aleksander; Hansen, Morten W.; Korosov, Anton
2017-04-01
Existing international and Norwegian infrastructure projects, e.g., NorDataNet, NMDC and NORMAP, provide open data access through the OPeNDAP protocol following the conventions for CF (Climate and Forecast) metadata, designed to promote the processing and sharing of files created with the NetCDF application programming interface (API). This approach is now also being implemented in the Norwegian Sentinel Data Hub (satellittdata.no) to provide satellite EO data to the user community. Simultaneously with providing simplified and unified data access, these projects also seek to use and establish common standards for use and discovery metadata. This then allows development of standardized tools for data search and (subset) streaming over the internet to perform actual scientific analysis. A combination of software tools, which we call a Scientific Platform as a Service (SPaaS), will take advantage of these opportunities to harmonize and streamline the search, retrieval and analysis of integrated satellite and auxiliary observations of the oceans in a seamless system. The SPaaS is a cloud solution for integration of analysis tools with scientific datasets via an API. The core part of the SPaaS is a distributed metadata catalog to store granular metadata describing the structure, location and content of available satellite, model, and in situ datasets. The analysis tools include software for visualization (also online), interactive in-depth analysis, and server-based processing chains. The API conveys search requests between system nodes (i.e., interactive and server tools) and provides easy access to the metadata catalog, data repositories, and the tools. The SPaaS components are integrated in virtual machines, of which provisioning and deployment are automated using existing state-of-the-art open-source tools (e.g., Vagrant, Ansible, Docker). The open-source code for scientific tools and virtual machine configurations is under version control at https://github.com/nansencenter/, and is coupled to an online continuous integration system (e.g., Travis CI).
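As a rough sketch of the subset-streaming access such standards enable, the netCDF4 Python library can open an OPeNDAP endpoint directly and transfer only the requested slice. The URL and variable name below are placeholders, not actual NorDataNet or NORMAP endpoints.

```python
# Minimal sketch of OPeNDAP subset access with the netCDF4 library.
# The URL, variable name, and index ranges are placeholders only.
from netCDF4 import Dataset

OPENDAP_URL = "https://example.org/thredds/dodsC/sst_dataset.nc"  # placeholder

with Dataset(OPENDAP_URL) as ds:
    sst = ds.variables["sea_surface_temperature"]  # assumed variable name
    # Only the requested hyperslab is transferred over the network.
    subset = sst[0, 100:200, 100:200]
    print(subset.shape, float(subset.mean()))
```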
Systematic Omics Analysis Review (SOAR) Tool to Support Risk Assessment
McConnell, Emma R.; Bell, Shannon M.; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J.; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D.
2014-01-01
Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884
Paques, Joseph-Jean; Gauthier, François; Perez, Alejandro
2007-01-01
To assess and plan future risk-analysis research projects, 275 documents describing methods and tools for assessing the risks associated with industrial machines or with other sectors such as the military, and the nuclear and aeronautics industries, etc., were collected. These documents were in the format of published books or papers, standards, technical guides and company procedures collected throughout industry. From the collected documents, 112 documents were selected for analysis; 108 methods applied or potentially applicable for assessing the risks associated with industrial machines were analyzed and classified. This paper presents the main quantitative results of the analysis of the methods and tools.
O'Connor, Brian D.; Yuen, Denis; Chung, Vincent; Duncan, Andrew G.; Liu, Xiang Kun; Patricia, Janice; Paten, Benedict; Stein, Lincoln; Ferretti, Vincent
2017-01-01
As genomic datasets continue to grow, the feasibility of downloading data to a local organization and running analysis on a traditional compute environment is becoming increasingly problematic. Current large-scale projects, such as the ICGC PanCancer Analysis of Whole Genomes (PCAWG), the Data Platform for the U.S. Precision Medicine Initiative, and the NIH Big Data to Knowledge Center for Translational Genomics, are using cloud-based infrastructure to both host and perform analysis across large data sets. In PCAWG, over 5,800 whole human genomes were aligned and variant called across 14 cloud and HPC environments; the processed data was then made available on the cloud for further analysis and sharing. If run locally, an operation at this scale would have monopolized a typical academic data centre for many months, and would have presented major challenges for data storage and distribution. However, this scale is increasingly typical for genomics projects and necessitates a rethink of how analytical tools are packaged and moved to the data. For PCAWG, we embraced the use of highly portable Docker images for encapsulating and sharing complex alignment and variant calling workflows across highly variable environments. While successful, this endeavor revealed a limitation in Docker containers, namely the lack of a standardized way to describe and execute the tools encapsulated inside the container. As a result, we created the Dockstore ( https://dockstore.org), a project that brings together Docker images with standardized, machine-readable ways of describing and running the tools contained within. This service greatly improves the sharing and reuse of genomics tools and promotes interoperability with similar projects through emerging web service standards developed by the Global Alliance for Genomics and Health (GA4GH). PMID:28344774
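Dockstore pairs each image with a standardized descriptor (such as the Common Workflow Language); purely to illustrate the idea of a machine-readable tool description, the sketch below uses a simplified Python dictionary as a stand-in descriptor and builds the docker command it implies. The image name and arguments are hypothetical.

```python
# Illustrative only: a simplified machine-readable tool descriptor and the
# docker command it implies. Real Dockstore entries use CWL/WDL descriptors;
# the image name and arguments here are hypothetical.
import os

descriptor = {
    "image": "quay.io/example/variant-caller:1.0",    # hypothetical image
    "base_command": ["call-variants"],
    "inputs": {"--bam": "sample.bam", "--ref": "ref.fa"},
}

def build_command(desc):
    """Translate the descriptor into a docker run invocation."""
    cmd = ["docker", "run", "--rm",
           "-v", f"{os.getcwd()}:/data", "-w", "/data",
           desc["image"], *desc["base_command"]]
    for flag, value in desc["inputs"].items():
        cmd += [flag, value]
    return cmd

if __name__ == "__main__":
    # Print the command; it could be passed to subprocess.run() to execute.
    print(" ".join(build_command(descriptor)))
```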
Men'shikov, V V
2012-12-01
The article deals with the factors affecting the reliability of clinical laboratory information. The differences in quality of laboratory analysis tools produced by various manufacturers are discussed. These differences are the cause of discrepancies in the results of laboratory analyses of the same analyte. The role of the reference system in supporting the comparability of laboratory analysis results is demonstrated. A draft national standard is presented to regulate the requirements for standards and calibrators used in the analysis of qualitative and non-metrical characteristics of components of biomaterials.
The Assessment of a Tutoring Program to Meet CAS Standards Using a SWOT Analysis and Action Plan
ERIC Educational Resources Information Center
Fullmer, Patricia
2009-01-01
This article summarizes the use of SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis and subsequent action planning as a tool of self-assessment to meet CAS (Council for the Advancement of Standards in Higher Education) requirements for systematic assessment. The use of the evaluation results to devise improvements to increase the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palanisamy, Giri
The U.S. Department of Energy (DOE)’s Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth atmosphere in diverse climate regimes. The result is a huge archive of diverse data sets containing observational and derived data, currently accumulating at a rate of 30 terabytes (TB) of data and 150,000 different files per month (http://www.archive.arm.gov/stats/). Continuing the current processing while scaling this to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable development of automated analysis and discovery tools for the ever growing data volumes. It will enable consistent analysis of the multiyear data, allow for development of automated monitoring and data health status tools, and allow future capabilities of delivering data on demand that can be tailored explicitly for the user needs. This analysis ability will only be possible if the data follows a minimum set of standards. This document proposes a hierarchy of required and recommended standards.
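A minimal sketch of the kind of automated check such standards make possible: verify that a netCDF data file carries a required set of global attributes. The attribute names below are illustrative assumptions, not the actual ARM required list.

```python
# Minimal standards-compliance check on a netCDF file's global attributes.
# The required attribute names are illustrative, not the ARM standard itself.
from netCDF4 import Dataset

REQUIRED_ATTRS = ["site_id", "platform_id", "data_level", "doi"]  # assumed names

def missing_attributes(path, required=REQUIRED_ATTRS):
    """Return the required global attributes that are absent from the file."""
    with Dataset(path) as ds:
        present = set(ds.ncattrs())
    return [attr for attr in required if attr not in present]

if __name__ == "__main__":
    print(missing_attributes("example_arm_file.nc"))  # placeholder file name
```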
Postel, Alexander; Schmeiser, Stefanie; Zimmermann, Bernd; Becher, Paul
2016-01-01
Molecular epidemiology has become an indispensable tool in the diagnosis of diseases and in tracing the infection routes of pathogens. Due to advances in conventional sequencing and the development of high throughput technologies, the field of sequence determination is in the process of being revolutionized. Platforms for sharing sequence information and providing standardized tools for phylogenetic analyses are becoming increasingly important. The database (DB) of the European Union (EU) and World Organisation for Animal Health (OIE) Reference Laboratory for classical swine fever offers one of the world’s largest semi-public virus-specific sequence collections combined with a module for phylogenetic analysis. The classical swine fever (CSF) DB (CSF-DB) became a valuable tool for supporting diagnosis and epidemiological investigations of this highly contagious disease in pigs with high socio-economic impacts worldwide. The DB has been re-designed and now allows for the storage and analysis of traditionally used, well established genomic regions and of larger genomic regions including complete viral genomes. We present an application example for the analysis of highly similar viral sequences obtained in an endemic disease situation and introduce the new geographic “CSF Maps” tool. The concept of this standardized and easy-to-use DB with an integrated genetic typing module is suited to serve as a blueprint for similar platforms for other human or animal viruses. PMID:27827988
Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth
NASA Technical Reports Server (NTRS)
Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)
2001-01-01
The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Analytical tools are needed specifically for elastic-plastic fracture analysis, a regime that is currently handled empirically for the Space Shuttle External Tank (ET) by simulated service testing of pre-cracked panels.
Paediatric Automatic Phonological Analysis Tools (APAT).
Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T
2017-12-01
To develop the pediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). The content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between computerized and manual (traditional) methods. The development of these tools helps to fill existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allowed the analysis of different corpora.
Standardizing Exoplanet Analysis with the Exoplanet Characterization Tool Kit (ExoCTK)
NASA Astrophysics Data System (ADS)
Fowler, Julia; Stevenson, Kevin B.; Lewis, Nikole K.; Fraine, Jonathan D.; Pueyo, Laurent; Bruno, Giovanni; Filippazzo, Joe; Hill, Matthew; Batalha, Natasha; Wakeford, Hannah; Bushra, Rafia
2018-06-01
Exoplanet characterization depends critically on analysis tools, models, and spectral libraries that are constantly under development and have no single source nor sense of unified style or methods. The complexity of spectroscopic analysis and the initial time commitment required to become competitive are prohibitive to new researchers entering the field, as well as a remaining obstacle for established groups hoping to contribute in a comparable manner to their peers. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface including tools that address atmospheric characterization, transit observation planning with JWST, JWST coronagraphy simulations, limb darkening, forward modeling, and data reduction, as well as libraries of stellar, planet, and opacity models. The foundation of these software tools and libraries exists within pockets of the exoplanet community, but our project will gather these seedling tools and grow a robust, uniform, and well-maintained exoplanet characterization toolkit.
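One of the components mentioned is limb darkening; as a small illustration of what such a library evaluates, the standard quadratic limb-darkening law can be computed in a few lines. The coefficients below are arbitrary example values, not fitted stellar coefficients.

```python
# Quadratic limb-darkening law: I(mu)/I(1) = 1 - u1*(1 - mu) - u2*(1 - mu)**2.
# The coefficients are arbitrary example values, not fitted stellar values.
import numpy as np

def quadratic_limb_darkening(mu, u1=0.4, u2=0.2):
    """Relative intensity across the stellar disk; mu = cos(viewing angle)."""
    mu = np.asarray(mu, dtype=float)
    return 1.0 - u1 * (1.0 - mu) - u2 * (1.0 - mu) ** 2

if __name__ == "__main__":
    mu = np.linspace(0.0, 1.0, 5)        # disk edge (mu=0) to disk center (mu=1)
    print(quadratic_limb_darkening(mu))
```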
Selection of reference standard during method development using the analytical hierarchy process.
Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun
2015-03-25
A reference standard is critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for a reference standard are typically not directly measurable. The aim of this paper is to recommend a quantitative approach for the selection of a reference standard during method development based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed in the quantitative analysis of six phenolic acids from Salvia miltiorrhiza and its preparations using ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: feasibility to obtain, abundance in samples, chemical stability, accuracy, precision and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, with rosmarinic acid, at about 79.8% of that priority, as the second choice. The determination results successfully verified the evaluation ability of this model. The AHP allowed us to comprehensively consider the benefits and risks of the alternatives. It was an effective and practical tool for optimizing reference standards during method development. Copyright © 2015 Elsevier B.V. All rights reserved.
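A minimal sketch of the AHP weighting step described above: derive criterion priorities from a pairwise comparison matrix via its principal eigenvector. The comparison judgments in the example are made up for illustration, not taken from the paper.

```python
# AHP priority weights from a pairwise comparison matrix via the principal
# eigenvector. The comparison judgments below are illustrative only.
import numpy as np

def ahp_priorities(pairwise):
    """Return normalized priority weights for an AHP comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    eigenvalues, eigenvectors = np.linalg.eig(A)
    principal = eigenvectors[:, np.argmax(eigenvalues.real)].real
    weights = np.abs(principal)
    return weights / weights.sum()

if __name__ == "__main__":
    # Three criteria (e.g., accuracy vs. precision vs. chemical stability),
    # with made-up pairwise judgments on the usual 1-9 scale.
    A = [[1,   3,   5],
         [1/3, 1,   2],
         [1/5, 1/2, 1]]
    print(ahp_priorities(A).round(3))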
Boerner, Jana; Godenschwege, Tanja Angela
2010-09-01
The Drosophila standard brain has been a useful tool that provides information about position and size of different brain structures within a wild-type brain and allows the comparison of imaging data that were collected from individual preparations. Therefore the standard can be used to reveal and visualize differences of brain regions between wild-type and mutant brains and can provide spatial description of single neurons within the nervous system. Recently the standard brain was complemented by the generation of a ventral nerve cord (VNC) standard. Here the authors have registered the major components of a simple neuronal circuit, the Giant Fiber System (GFS), into this standard. The authors show that they can also virtually reconstruct the well-characterized synaptic contact of the Giant Fiber with its motorneuronal target when they register the individual neurons from different preparations into the VNC standard. In addition to the potential application for the standard thorax in neuronal circuit reconstruction, the authors show that it is a useful tool for in-depth analysis of mutant morphology of single neurons. The authors find quantitative and qualitative differences when they compared the Giant Fibers of two different neuroglian alleles, nrg(849) and nrg(G00305), using the averaged wild-type GFS in the standard VNC as a reference.
Healey, Lucy; Humphreys, Cathy; Howe, Keran
2013-01-01
Women with disabilities experience violence at greater rates than other women, yet their access to domestic violence services is more limited. This limitation is mirrored in domestic violence sector standards, which often fail to include the specific issues for women with disabilities. This article has a dual focus: to outline a set of internationally transferrable standards for inclusive practice with women with disabilities affected by domestic violence; and report on the results of a documentary analysis of domestic violence service standards, codes of practice, and practice guidelines. It draws on the Building the Evidence (BtE) research and advocacy project in Victoria, Australia in which a matrix tool was developed to identify minimum standards to support the inclusion of women with disabilities in existing domestic violence sector standards. This tool is designed to interrogate domestic violence sector standards for their attention to women with disabilities.
Perl One-Liners: Bridging the Gap Between Large Data Sets and Analysis Tools.
Hokamp, Karsten
2015-01-01
Computational analyses of biological data are becoming increasingly powerful, and researchers intending on carrying out their own analyses can often choose from a wide array of tools and resources. However, their application might be obstructed by the wide variety of different data formats that are in use, from standard, commonly used formats to output files from high-throughput analysis platforms. The latter are often too large to be opened, viewed, or edited by standard programs, potentially leading to a bottleneck in the analysis. Perl one-liners provide a simple solution to quickly reformat, filter, and merge data sets in preparation for downstream analyses. This chapter presents example code that can be easily adjusted to meet individual requirements. An online version is available at http://bioinf.gen.tcd.ie/pol.
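The chapter's examples are written in Perl; the same kind of quick, streaming reformat can be sketched in Python, reading a large tab-delimited file line by line and writing only selected columns of rows that pass a filter. The column indices and score cutoff are arbitrary illustration values.

```python
# Streaming filter/reformat of a large tab-delimited file, analogous to the
# quick one-line reformatting the chapter describes (Perl in the original).
# Column indices and the score cutoff are arbitrary illustration values.
import sys

def filter_rows(in_path, out_path, score_column=4, cutoff=30.0):
    """Keep the first three columns of rows whose score passes the cutoff."""
    with open(in_path) as src, open(out_path, "w") as dst:
        for line in src:
            fields = line.rstrip("\n").split("\t")
            try:
                score = float(fields[score_column])
            except (IndexError, ValueError):
                continue                      # skip headers or malformed rows
            if score >= cutoff:
                dst.write("\t".join(fields[:3]) + "\n")

if __name__ == "__main__":
    filter_rows(sys.argv[1], sys.argv[2])
```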
A Limit Theorem on the Cores of Large Standard Exchange Economies
Brown, Donald J.; Robinson, Abraham
1972-01-01
This note introduces a new mathematical tool, nonstandard analysis, for the analysis of an important class of problems in mathematical economics—the relation between bargaining and the competitive price system. PMID:16591988
Open source tools and toolkits for bioinformatics: significance, and where are we?
Stajich, Jason E; Lapp, Hilmar
2006-09-01
This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.
Systematic review: work-related stress and the HSE management standards.
Brookes, K; Limbert, C; Deacy, C; O'Reilly, A; Scott, S; Thirlaway, K
2013-10-01
The Health and Safety Executive (HSE) has defined six management standards representing aspects of work that, if poorly managed, are associated with lower levels of employee health and productivity, and increased sickness absence. The HSE indicator tool aims to measure organizations' performance in managing the primary stressors identified by the HSE management standards. The aims of the study are to explore how the HSE indicator tool has been implemented within organizations and to identify contexts in which the tool has been used, its psychometric properties and relationships with alternative measures of well-being and stress. Studies that matched specific criteria were included in the review. Abstracts were considered by two researchers to ensure a reliable process. Full texts were obtained when abstracts met the inclusion criteria. Thirteen papers were included in the review. Using factor analysis and measures of reliability, the studies suggest that the HSE indicator tool is a psychometrically sound measure. The tool has been used to measure work-related stress across different occupational groups, with a clear relationship between the HSE tool and alternative measures of well-being. Limitations of the tool and recommendations for future research are discussed. The HSE indicator tool is a psychometrically sound measure of organizational performance against the HSE management standards. As such it can provide a broad overview of sources of work-related stress within organizations. More research is required to explore the use of the tool in the design of interventions to reduce stress, and its use in different contexts and with different cultural and gender groups.
The Legacy Archive for Microwave Background Data Analysis (LAMBDA)
NASA Astrophysics Data System (ADS)
Miller, Nathan; LAMBDA
2018-01-01
The Legacy Archive for Microwave Background Data Analysis (LAMBDA) provides CMB researchers with archival data for cosmology missions, software tools, and links to other sites of interest. LAMBDA is one-stop shopping for CMB researchers. It hosts data from WMAP along with many suborbital experiments. Over the past year, LAMBDA has acquired new data from SPTpol, SPIDER and ACTPol. In addition to the primary CMB, LAMBDA also provides foreground data. LAMBDA has several ongoing efforts to provide tools for CMB researchers. These tools include a web interface for CAMB and a web interface for a CMB survey footprint database and plotting tool. Additionally, we have recently developed a Docker container with standard CMB analysis tools and demonstrations in the form of Jupyter notebooks. These containers will be publicly available through Docker's container repository and the source will be available on GitHub.
Tackling the 2nd V: Big Data, Variety and the Need for Representation Consistency
NASA Astrophysics Data System (ADS)
Clune, T.; Kuo, K. S.
2016-12-01
While Big Data technologies are transforming our ability to analyze ever larger volumes of Earth science data, practical constraints continue to limit our ability to compare data across datasets from different sources in an efficient and robust manner. Within a single data collection, invariants such as file format, grid type, and spatial resolution greatly simplify many types of analysis (often implicitly). However, when analysis combines data across multiple data collections, researchers are generally required to implement data transformations (i.e., "data preparation") to provide appropriate invariants. These transformations include changing file formats, ingesting into a database, and/or regridding to a common spatial representation, and they can either be performed once, statically, or each time the data is accessed. At the very least, this process is inefficient from the perspective of the community as each team selects its own representation and privately implements the appropriate transformations. No doubt there are disadvantages to any "universal" representation, but we posit that major benefits would be obtained if a suitably flexible spatial representation could be standardized along with tools for transforming to/from that representation. We regard this as part of the historic trend in data publishing. Early datasets used ad hoc formats and lacked metadata. As better tools evolved, published data began to use standardized formats (e.g., HDF and netCDF) with attached metadata. We propose that the modern need to perform analysis across data sets should drive a new generation of tools that support a standardized spatial representation. More specifically, we propose the hierarchical triangular mesh (HTM) as a suitable "generic" resolution that permits standard transformations to/from native representations in use today, as well as tools to convert/regrid existing datasets onto that representation.
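As a sketch of the representation being proposed, a hierarchical triangular mesh refines each spherical triangle into four children through its edge midpoints projected back onto the unit sphere. The snippet below illustrates only the subdivision step; it is not a full HTM indexing implementation.

```python
# Sketch of hierarchical triangular mesh (HTM)-style subdivision: each
# spherical triangle is split into four children through its edge midpoints,
# normalized back onto the unit sphere. Illustration only, not full HTM indexing.
import numpy as np

def _midpoint(a, b):
    """Midpoint of two unit vectors, projected back onto the unit sphere."""
    m = a + b
    return m / np.linalg.norm(m)

def subdivide(triangle, depth):
    """Return the leaf triangles after `depth` rounds of 4-way subdivision."""
    if depth == 0:
        return [triangle]
    v0, v1, v2 = triangle
    w0, w1, w2 = _midpoint(v1, v2), _midpoint(v0, v2), _midpoint(v0, v1)
    children = [(v0, w2, w1), (v1, w0, w2), (v2, w1, w0), (w0, w1, w2)]
    return [leaf for child in children for leaf in subdivide(child, depth - 1)]

if __name__ == "__main__":
    # One face of the initial octahedron (unit vectors along +x, +y, +z).
    face = (np.array([1.0, 0, 0]), np.array([0, 1.0, 0]), np.array([0, 0, 1.0]))
    print(len(subdivide(face, 3)))   # 4**3 = 64 leaf triangles
```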
Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory
NASA Astrophysics Data System (ADS)
Stoeckel, Gerhard P.; Doyle, Keith B.
2013-09-01
Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.
Understanding and Using the Fermi Science Tools
NASA Astrophysics Data System (ADS)
Asercion, Joseph
2018-01-01
The Fermi Science Support Center (FSSC) provides information, documentation, and tools for the analysis of Fermi science data, including both the Large-Area Telescope (LAT) and the Gamma-ray Burst Monitor (GBM). Source and binary versions of the Fermi Science Tools can be downloaded from the FSSC website, and are supported on multiple platforms. An overview document, the Cicerone, provides details of the Fermi mission, the science instruments and their response functions, the science data preparation and analysis process, and interpretation of the results. Analysis Threads and a reference manual available on the FSSC website provide the user with step-by-step instructions for many different types of data analysis: point source analysis - generating maps, spectra, and light curves, pulsar timing analysis, source identification, and the use of python for scripting customized analysis chains. We present an overview of the structure of the Fermi science tools and documentation, and how to acquire them. We also provide examples of standard analyses, including tips and tricks for improving Fermi science analysis.
ParamAP: Standardized Parameterization of Sinoatrial Node Myocyte Action Potentials.
Rickert, Christian; Proenza, Catherine
2017-08-22
Sinoatrial node myocytes act as cardiac pacemaker cells by generating spontaneous action potentials (APs). Much information is encoded in sinoatrial AP waveforms, but both the analysis and the comparison of AP parameters between studies is hindered by the lack of standardized parameter definitions and the absence of automated analysis tools. Here we introduce ParamAP, a standalone cross-platform computational tool that uses a template-free detection algorithm to automatically identify and parameterize APs from text input files. ParamAP employs a graphic user interface with automatic and user-customizable input modes, and it outputs data files in text and PDF formats. ParamAP returns a total of 16 AP waveform parameters including time intervals such as the AP duration, membrane potentials such as the maximum diastolic potential, and rates of change of the membrane potential such as the diastolic depolarization rate. ParamAP provides a robust AP detection algorithm in combination with a standardized AP parameter analysis over a wide range of AP waveforms and firing rates, owing in part to the use of an iterative algorithm for the determination of the threshold potential and the diastolic depolarization rate that is independent of the maximum upstroke velocity, a parameter that can vary significantly among sinoatrial APs. Because ParamAP is implemented in Python 3, it is also highly customizable and extensible. In conclusion, ParamAP is a powerful computational tool that facilitates quantitative analysis and enables comparison of sinoatrial APs by standardizing parameter definitions and providing an automated work flow. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
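ParamAP itself is a full Python application; as a simplified illustration of two of the waveform quantities it reports, the sketch below extracts the maximum diastolic potential and a mean cycle length from a synthetic voltage trace. This is not ParamAP's template-free detection algorithm, only a sketch of the quantities involved.

```python
# Simplified illustration of two AP waveform parameters (maximum diastolic
# potential and mean cycle length) on a synthetic trace. Not ParamAP's algorithm.
import numpy as np

def basic_ap_parameters(t, v, threshold=-20.0):
    """Return (maximum diastolic potential in mV, mean cycle length in ms)."""
    mdp = float(v.min())
    # Upstroke times: crossings of an assumed fixed threshold from below.
    crossings = np.flatnonzero((v[:-1] < threshold) & (v[1:] >= threshold))
    cycle = float(np.mean(np.diff(t[crossings]))) if len(crossings) > 1 else float("nan")
    return mdp, cycle

if __name__ == "__main__":
    t = np.arange(0.0, 2000.0, 1.0)                                  # ms
    v = -60.0 + 50.0 * np.maximum(np.sin(2 * np.pi * t / 333.0), 0.0)  # toy "APs"
    print(basic_ap_parameters(t, v))
```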
"Type Ia Supernovae: Tools for Studying Dark Energy" Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woosley, Stan; Kasen, Dan
2017-05-10
Final technical report for project "Type Ia Supernovae: Tools for the Study of Dark Energy" awarded jointly to scientists at the University of California, Santa Cruz and Berkeley, for computer modeling, theory and data analysis relevant to the use of Type Ia supernovae as standard candles for cosmology.
He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison
2018-01-12
Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) support different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its associated four principles. These XOD principles reuse existing terms and semantic relations from reliable ontologies, develop and apply well-established ontology design patterns (ODPs), and involve community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications to support data to be Findable, Accessible, Interoperable and Reusable (i.e., FAIR).
Koopmans, Bastijn; Smit, August B; Verhage, Matthijs; Loos, Maarten
2017-04-04
Systematic, standardized and in-depth phenotyping and data analyses of rodent behaviour empower gene-function studies, drug testing and therapy design. However, no data repositories are currently available for standardized quality control, data analysis and mining at the resolution of individual mice. Here, we present AHCODA-DB, a public data repository with standardized quality control and exclusion criteria aimed at enhancing robustness of data, enabled with web-based mining tools for the analysis of individually and group-wise collected mouse phenotypic data. AHCODA-DB allows monitoring of the in vivo effects of compounds collected from conventional behavioural tests and from automated home-cage experiments assessing spontaneous behaviour, anxiety and cognition without human interference. AHCODA-DB includes such data from mutant mice (transgenics, knock-out, knock-in), (recombinant) inbred strains, and compound effects in wildtype mice and disease models. AHCODA-DB provides real-time statistical analyses with single-mouse resolution and a versatile suite of data presentation tools. On March 9th, 2017, AHCODA-DB contained 650k data points on 2419 parameters from 1563 mice. AHCODA-DB provides users with tools to systematically explore mouse behavioural data, with both positive and negative outcomes, published and unpublished, across time and experiments with single-mouse resolution. The standardized (automated) experimental settings and the large current dataset (1563 mice) in AHCODA-DB provide a unique framework for the interpretation of behavioural data and drug effects. The use of common ontologies allows data export to other databases such as the Mouse Phenome Database. Unbiased presentation of positive and negative data obtained under the highly standardized screening conditions increases cost efficiency of publicly funded mouse screening projects and helps to reach consensus conclusions on drug responses and mouse behavioural phenotypes. The website is publicly accessible through https://public.sylics.com and can be viewed in every recent version of all commonly used browsers.
Paliwoda, Michelle; New, Karen; Bogossian, Fiona
2016-09-01
All newborns are at risk of deterioration as a result of failing to make the transition to extra uterine life. Signs of deterioration can be subtle and easily missed. It has been postulated that the use of an Early Warning Tool may assist clinicians in recognising and responding to signs of deterioration earlier in neonates, thereby preventing a serious adverse event. To examine whether observations from a Standard Observation Tool, applied to three neonatal Early Warning Tools, would hypothetically trigger an escalation of care more frequently than actual escalation of care using the Standard Observation Tool. A retrospective case-control study. A maternity unit in a tertiary public hospital in Australia. Neonates born in 2013 of greater than or equal to 34(+0) weeks gestation, admitted directly to the maternity ward from their birthing location and whose subsequent deterioration required admission to the neonatal unit, were identified as cases from databases of the study hospital. Each case was matched with three controls, inborn during the same period and who did not experience deterioration and neonatal unit admission. Clinical and physiological data recorded on a Standard Observation Tool, from time of admission to the maternity ward, for cases and controls were charted onto each of three Early Warning Tools. The primary outcome was whether the tool 'triggered an escalation of care'. Descriptive statistics (n, %, Mean and SD) were employed. Cases (n=26) comprised late preterm, early term and post-term neonates and matched by gestational age group with 3 controls (n=78). Overall, the Standard Observation Tool triggered an escalation of care for 92.3% of cases compared to the Early Warning Tools; New South Wales Health 80.8%, United Kingdom Newborn Early Warning Chart 57.7% and The Australian Capital Territory Neonatal Early Warning Score 11.5%. Subgroup analysis by gestational age found differences between the tools in hypothetically triggering an escalation of care. The Standard Observation Tool triggered an escalation of care more frequently than the Early Warning Tools, which may be as a result of behavioural data captured on the Standard Observation Tool and escalated, which could not be on the Early Warning Tools. Findings demonstrate that a single tool applied to all gestational age ranges may not be effective in identifying early deterioration or may over trigger an escalation of care. Further research is required into the sensitivity and specificity of Early Warning Tools in neonatal sub-populations. Copyright © 2016 Elsevier Ltd. All rights reserved.
Navigating freely-available software tools for metabolomics analysis.
Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph
2017-01-01
The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas, as well as in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. To compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks e.g. peak picking.
Writing Across the Curriculum: Reliability Testing of a Standardized Rubric.
Minnich, Margo; Kirkpatrick, Amanda J; Goodman, Joely T; Whittaker, Ali; Stanton Chapple, Helen; Schoening, Anne M; Khanna, Maya M
2018-06-01
Rubrics positively affect student academic performance; however, accuracy and consistency of the rubric and its use is imperative. The researchers in this study developed a standardized rubric for use across an undergraduate nursing curriculum, then evaluated the interrater reliability and general usability of the tool. Faculty raters graded papers using the standardized rubric, submitted their independent scoring for interrater reliability analyses, then participated in a focus group discussion regarding rubric use experience. Quantitative analysis of the data showed a high interrater reliability (α = .998). Content analysis of transcription revealed several positive themes: Consistency, Emphasis on Writing Ability, and Ability to Use the Rubric as a Teaching Tool. Areas for improvement included use of value words and difficulty with point allocation. Investigators recommend effective faculty orientation for rubric use and future work in developing a rubric to assess reflective writing. [J Nurs Educ. 2018;57(6):366-370.]. Copyright 2018, SLACK Incorporated.
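The interrater reliability above is reported as an alpha coefficient; a minimal sketch of Cronbach's alpha computed on a papers-by-raters score matrix is shown below. The scores are made up purely to show the computation.

```python
# Cronbach's alpha for a score matrix (rows = papers, columns = raters).
# The example scores are made up purely to show the computation.
import numpy as np

def cronbach_alpha(scores):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    X = np.asarray(scores, dtype=float)
    k = X.shape[1]                               # number of raters/items
    item_variances = X.var(axis=0, ddof=1).sum()
    total_variance = X.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

if __name__ == "__main__":
    scores = [[88, 90, 87],
              [75, 78, 74],
              [92, 91, 93],
              [66, 64, 67]]
    print(round(cronbach_alpha(scores), 3))
```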
Weech-Maldonado, Robert; Dreachslin, Janice L; Brown, Julie; Pradhan, Rohit; Rubin, Kelly L; Schiller, Cameron; Hays, Ron D
2012-01-01
The U.S. national standards for culturally and linguistically appropriate services (CLAS) in health care provide guidelines on policies and practices aimed at developing culturally competent systems of care. The Cultural Competency Assessment Tool for Hospitals (CCATH) was developed as an organizational tool to assess adherence to the CLAS standards. First, we describe the development of the CCATH and estimate the reliability and validity of the CCATH measures. Second, we discuss the managerial implications of the CCATH as an organizational tool to assess cultural competency. We pilot tested an initial draft of the CCATH, revised it based on a focus group and cognitive interviews, and then administered it in a field test with a sample of California hospitals. The reliability and validity of the CCATH were evaluated using factor analysis, analysis of variance, and Cronbach's alphas. Exploratory and confirmatory factor analyses identified 12 CCATH composites: leadership and strategic planning, data collection on inpatient population, data collection on service area, performance management systems and quality improvement, human resources practices, diversity training, community representation, availability of interpreter services, interpreter services policies, quality of interpreter services, translation of written materials, and clinical cultural competency practices. All the CCATH scales had internal consistency reliability of .65 or above, and the reliability was .70 or above for 9 of the 12 scales. Analysis of variance results showed that not-for-profit hospitals have higher CCATH scores than for-profit hospitals in five CCATH scales and higher CCATH scores than government hospitals in two CCATH scales. The CCATH showed adequate psychometric properties. Managers and policy makers can use the CCATH as a tool to evaluate hospital performance in cultural competency and identify and target improvements in hospital policies and practices that undergird the provision of CLAS.
Proposal of a micromagnetic standard problem for ferromagnetic resonance simulations
NASA Astrophysics Data System (ADS)
Baker, Alexander; Beg, Marijan; Ashton, Gregory; Albert, Maximilian; Chernyshenko, Dmitri; Wang, Weiwei; Zhang, Shilei; Bisotti, Marc-Antonio; Franchin, Matteo; Hu, Chun Lian; Stamps, Robert; Hesjedal, Thorsten; Fangohr, Hans
2017-01-01
Nowadays, micromagnetic simulations are a common tool for studying a wide range of different magnetic phenomena, including the ferromagnetic resonance. A technique for evaluating reliability and validity of different micromagnetic simulation tools is the simulation of proposed standard problems. We propose a new standard problem by providing a detailed specification and analysis of a sufficiently simple problem. By analyzing the magnetization dynamics in a thin permalloy square sample, triggered by a well defined excitation, we obtain the ferromagnetic resonance spectrum and identify the resonance modes via Fourier transform. Simulations are performed using both finite difference and finite element numerical methods, with OOMMF and Nmag simulators, respectively. We report the effects of initial conditions and simulation parameters on the character of the observed resonance modes for this standard problem. We provide detailed instructions and code to assist in using the results for evaluation of new simulator tools, and to help with numerical calculation of ferromagnetic resonance spectra and modes in general.
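A hedged sketch of the post-processing step such a standard problem relies on: take the time series of the spatially averaged magnetization after the excitation and locate resonance peaks in its Fourier power spectrum. The damped sinusoid below is a synthetic stand-in for OOMMF or Nmag output.

```python
# Ferromagnetic-resonance-style post-processing sketch: power spectrum of an
# averaged magnetization time series via FFT. The damped sinusoid is a
# synthetic stand-in for simulator (OOMMF/Nmag) output.
import numpy as np

def power_spectrum(m_t, dt):
    """Return (frequencies in Hz, power) for a magnetization trace sampled at dt."""
    m_t = m_t - m_t.mean()
    spectrum = np.fft.rfft(m_t)
    freqs = np.fft.rfftfreq(len(m_t), d=dt)
    return freqs, np.abs(spectrum) ** 2

if __name__ == "__main__":
    dt = 5e-12                                             # 5 ps sampling, example value
    t = np.arange(0, 20e-9, dt)                            # 20 ns of "simulation"
    m_y = np.exp(-t / 5e-9) * np.sin(2 * np.pi * 8e9 * t)  # one damped 8 GHz mode
    freqs, power = power_spectrum(m_y, dt)
    print(f"peak at {freqs[np.argmax(power)] / 1e9:.2f} GHz")
```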
NASA Astrophysics Data System (ADS)
Pedersen, N. L.
2015-06-01
The strength of a gear is typically defined relative to durability (pitting) and load capacity (tooth-breakage). Tooth-breakage is controlled by the root shape, and this part of the gear can be freely designed because there is no contact between mating gears there. The shape of gears is generally defined by different standards, with the ISO standard probably being the most common one. Gears are manufactured using two principally different tools: rack tools and gear tools. In this work, the bending stress of involute teeth is minimized by shape optimization performed directly on the final gear. This optimized shape is then used to find the cutting tool (the gear envelope) that can create this optimized gear shape. A simple but sufficiently flexible root parameterization is applied and emphasis is put on the importance of separating the shape parameterization from the finite element analysis of stresses. Large improvements in the stress level are found.
Preparing Laboratory and Real-World EEG Data for Large-Scale Analysis: A Containerized Approach
Bigdely-Shamlo, Nima; Makeig, Scott; Robbins, Kay A.
2016-01-01
Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain–computer interface models. However, the absence of standardized vocabularies for annotating events in a machine-understandable manner, the welter of collection-specific data organizations, the difficulty in moving data across processing platforms, and the unavailability of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a “containerized” approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining and (meta-)analysis. The EEG Study Schema (ESS) comprises three data “Levels,” each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data using a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. ESS schema and tools are freely available at www.eegstudy.org and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org). PMID:27014048
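As an illustration of the kind of basic, standardized preprocessing a Level 1 to Level 2 step involves (not the actual PREP pipeline), the sketch below high-pass filters a multichannel EEG array and re-references it to the common average.

```python
# Minimal, standardized-style EEG preprocessing sketch: high-pass filter plus
# common-average re-reference. Illustrates the kind of raw-to-preprocessed
# step described; it is not the PREP pipeline itself.
import numpy as np
from scipy.signal import butter, filtfilt

def basic_preprocess(eeg, fs, hp_cutoff=1.0):
    """eeg: array of shape (channels, samples); fs: sampling rate in Hz."""
    b, a = butter(4, hp_cutoff / (fs / 2.0), btype="highpass")
    filtered = filtfilt(b, a, eeg, axis=1)
    return filtered - filtered.mean(axis=0, keepdims=True)  # common average reference

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_eeg = rng.standard_normal((32, 10 * 256))   # 32 channels, 10 s at 256 Hz
    print(basic_preprocess(fake_eeg, fs=256).shape)
```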
Introducing a design exigency to promote student learning through assessment: A case study.
Grealish, Laurie A; Shaw, Julie M
2018-02-01
Assessment technologies are often used to classify student and newly qualified nurse performance as 'pass' or 'fail', with little attention to how these decisions are achieved. Examining the design exigencies of classification technologies, such as performance assessment technologies, provides opportunities to explore flexibility and change in the process of using those technologies. Evaluate an established assessment technology for nursing performance as a classification system. A case study analysis that is focused on the assessment approach and a priori design exigencies of performance assessment technology, in this case the Australian Nursing Standards Assessment Tool 2016. Nurse assessors are required to draw upon their expertise to judge performance, but that judgement is described as a source of bias, creating confusion. The definition of satisfactory performance is 'ready to enter practice'. To pass, the performance on each criterion must be at least satisfactory, indicating to the student that no further improvement is required. The Australian Nursing Standards Assessment Tool 2016 does not have a third 'other' category, which is usually found in classification systems. Introducing a 'not yet competent' category and creating a two-part, mixed methods assessment process can improve the Australian Nursing Standards Assessment Tool 2016 assessment technology. Using a standards approach in the first part, judgement is valued and can generate learning opportunities across a program. Using a measurement approach in the second part, student performance can be 'not yet competent' but still meet criteria for year level performance and a graded pass. Subjecting the Australian Nursing Standards Assessment Tool 2016 assessment technology to analysis as a classification system provides opportunities for innovation in design. This design innovation has the potential to support students who move between programs and clinicians who assess students from different universities. Copyright © 2017 Elsevier Ltd. All rights reserved.
Annotare--a tool for annotating high-throughput biomedical investigations and resulting data.
Shankar, Ravi; Parkinson, Helen; Burdett, Tony; Hastings, Emma; Liu, Junmin; Miller, Michael; Srinivasa, Rashmi; White, Joseph; Brazma, Alvis; Sherlock, Gavin; Stoeckert, Christian J; Ball, Catherine A
2010-10-01
Computational methods in molecular biology will increasingly depend on standards-based annotations that describe biological experiments in an unambiguous manner. Annotare is a software tool that enables biologists to easily annotate their high-throughput experiments, biomaterials and data in a standards-compliant way that facilitates meaningful search and analysis. Annotare is available from http://code.google.com/p/annotare/ under the terms of the open-source MIT License (http://www.opensource.org/licenses/mit-license.php). It has been tested on both Mac and Windows.
ERIC Educational Resources Information Center
Wilcox, Bethany R.; Pollock, Steven J.
2015-01-01
Standardized conceptual assessment represents a widely used tool for educational researchers interested in student learning within the standard undergraduate physics curriculum. For example, these assessments are often used to measure student learning across educational contexts and instructional strategies. However, to support the large-scale…
Anton TenWolde; Mark T. Bomberg
2009-01-01
Overall, despite the lack of exact input data, the use of design tools, including models, is much superior to simply following rules of thumb, and a moisture analysis should be standard procedure for any building envelope design. Exceptions can only be made for buildings in the same climate, similar occupancy, and similar envelope construction. This chapter...
Boerner, Jana; Godenschwege, Tanja Angela
2010-01-01
The Drosophila standard brain has been a useful tool that provides information about position and size of different brain structures within a wild-type brain and allows the comparison of imaging data that were collected from individual preparations. Therefore the standard can be used to reveal and visualize differences of brain regions between wild-type and mutant brains and can provide spatial description of single neurons within the nervous system. Recently the standard brain was complemented by the generation of a ventral nerve cord (VNC) standard. Here the authors have registered the major components of a simple neuronal circuit, the Giant Fiber System (GFS), into this standard. The authors show that they can also virtually reconstruct the well-characterized synaptic contact of the Giant Fiber with its motorneuronal target when they register the individual neurons from different preparations into the VNC standard. In addition to the potential application for the standard thorax in neuronal circuit reconstruction, the authors show that it is a useful tool for in-depth analysis of mutant morphology of single neurons. The authors find quantitative and qualitative differences when they compared the Giant Fibers of two different neuroglian alleles, nrg849 and nrgG00305, using the averaged wild-type GFS in the standard VNC as a reference. PMID:20615087
A Software Tool for Integrated Optical Design Analysis
NASA Technical Reports Server (NTRS)
Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)
2001-01-01
Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between each of the disciplines become stronger as systems are designed to be lighter in weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications, and it requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.
System of Systems Analytic Workbench - 2017
2017-08-31
and transitional activities with key collaborators. The tools include: System Operational Dependency Analysis/System Developmental Dependency Analysis...in the methods of the SoS-AWB involve the following: 1. System Operability Dependency Analysis (SODA)/System Development Dependency Analysis...available f. Development of standard dependencies with combinations of low-medium-high parameters Report No. SERC-2017-TR-111
Lommen, Arjen; van der Kamp, Henk J; Kools, Harrie J; van der Lee, Martijn K; van der Weg, Guido; Mol, Hans G J
2012-11-09
A new alternative data processing tool set, metAlignID, is developed for automated pre-processing and library-based identification and concentration estimation of target compounds after analysis by comprehensive two-dimensional gas chromatography with mass spectrometric detection. The tool set has been developed for and tested on LECO data. The software is developed to run multi-threaded (one thread per processor core) on a standard PC (personal computer) under different operating systems and is as such capable of processing multiple data sets simultaneously. Raw data files are converted into netCDF (network Common Data Form) format using a fast conversion tool. They are then preprocessed using previously developed algorithms originating from metAlign software. Next, the resulting reduced data files are searched against a user-composed library (derived from user or commercial NIST-compatible libraries) (NIST=National Institute of Standards and Technology) and the identified compounds, including an indicative concentration, are reported in Excel format. Data can be processed batch wise. The overall time needed for conversion together with processing and searching of 30 raw data sets for 560 compounds is routinely within an hour. The screening performance is evaluated for detection of pesticides and contaminants in raw data obtained after analysis of soil and plant samples. Results are compared to the existing data-handling routine based on proprietary software (LECO, ChromaTOF). The developed software tool set, which is freely downloadable at www.metalign.nl, greatly accelerates data-analysis and offers more options for fine-tuning automated identification toward specific application needs. The quality of the results obtained is slightly better than the standard processing and also adds a quantitative estimate. The software tool set in combination with two-dimensional gas chromatography coupled to time-of-flight mass spectrometry shows great potential as a highly-automated and fast multi-residue instrumental screening method. Copyright © 2012 Elsevier B.V. All rights reserved.
Review: visual analytics of climate networks
NASA Astrophysics Data System (ADS)
Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.
2015-09-01
Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.
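As a rough illustration of the standard procedure mentioned above (extracting network measures prior to static visualisation), the following sketch computes a few common measures on a tiny, invented geo-referenced network with the NetworkX library; it is not part of the CGV or GTX tools.

```python
# Minimal sketch: computing standard measures on a small geo-referenced network,
# as typically done before (static) visualisation of climate networks.
import networkx as nx

G = nx.Graph()
# Hypothetical grid points (id, latitude, longitude)
nodes = [(0, 52.5, 13.4), (1, 48.1, 11.6), (2, 59.3, 18.1), (3, 40.4, -3.7)]
for nid, lat, lon in nodes:
    G.add_node(nid, lat=lat, lon=lon)
# Hypothetical links from a statistical similarity analysis
G.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 3)])

degree = dict(G.degree())                    # local connectivity
betweenness = nx.betweenness_centrality(G)   # shortest-path centrality
clustering = nx.clustering(G)                # local transitivity

for n in G.nodes():
    print(n, G.nodes[n]["lat"], G.nodes[n]["lon"],
          degree[n], round(betweenness[n], 3), round(clustering[n], 3))
```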
Standardisation of DNA quantitation by image analysis: quality control of instrumentation.
Puech, M; Giroud, F
1999-05-01
DNA image analysis is frequently performed in clinical practice as a prognostic tool and to improve diagnosis. The precision of prognosis and diagnosis depends on the accuracy of analysis and particularly on the quality of image analysis systems. It has been reported that image analysis systems used for DNA quantification differ widely in their characteristics (Thunissen et al.: Cytometry 27: 21-25, 1997). This induces inter-laboratory variations when the same sample is analysed in different laboratories. In microscopic image analysis, the principal instrumentation errors arise from the optical and electronic parts of systems. They bring about problems of instability, non-linearity, and shading and glare phenomena. The aim of this study is to establish tools and standardised quality control procedures for microscopic image analysis systems. Specific reference standard slides have been developed to control instability, non-linearity, shading and glare phenomena and segmentation efficiency. Some systems have been controlled with these tools and these quality control procedures. Interpretation criteria and accuracy limits of these quality control procedures are proposed according to the conclusions of a European project called PRESS project (Prototype Reference Standard Slide). Beyond these limits, tested image analysis systems are not qualified to realise precise DNA analysis. The different procedures presented in this work determine if an image analysis system is qualified to deliver sufficiently precise DNA measurements for cancer case analysis. If the controlled systems are beyond the defined limits, some recommendations are given to find a solution to the problem.
LiPD and CSciBox: A Case Study in Why Data Standards are Important for Paleoscience
NASA Astrophysics Data System (ADS)
Weiss, I.; Bradley, E.; McKay, N.; Emile-Geay, J.; de Vesine, L. R.; Anderson, K. A.; White, J. W. C.; Marchitto, T. M., Jr.
2016-12-01
CSciBox [1] is an integrated software system that helps geoscientists build and evaluate age models. Its user chooses from a number of built-in analysis tools, composing them into an analysis workflow and applying it to paleoclimate proxy datasets. CSciBox employs modern database technology to store both the data and the analysis results in an easily accessible and searchable form, and offers the user access to the computational toolbox, the data, and the results via a graphical user interface and a sophisticated plotter. Standards are a staple of modern life, and underlie any form of automation. Without data standards, it is difficult, if not impossible, to construct effective computer tools for paleoscience analysis. The LiPD (Linked Paleo Data) framework [2] enables the storage of both data and metadata in systematic, meaningful, machine-readable ways. LiPD has been a primary enabler of CSciBox's goals of usability, interoperability, and reproducibility. Building LiPD capabilities into CSciBox's importer, for instance, eliminated the need to ask the user about file formats, variable names, relationships between columns in the input file, etc. Building LiPD capabilities into the exporter facilitated the storage of complete details about the input data (provenance, preprocessing steps, etc.) as well as full descriptions of any analyses that were performed using the CSciBox tool, along with citations to appropriate references. This comprehensive collection of data and metadata, which is all linked together in a semantically meaningful, machine-readable way, not only completely documents the analyses and makes them reproducible; it also enables interoperability with any other software system that employs the LiPD standard. [1] www.cs.colorado.edu/~lizb/cscience.html [2] McKay & Emile-Geay, Climate of the Past 12:1093 (2016)
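For readers unfamiliar with LiPD, a bundle stores machine-readable JSON metadata alongside the data tables; the sketch below reads such metadata under the assumption of a zipped bundle containing a .jsonld member, with an invented file name. The official LiPD utilities should be preferred in practice.

```python
# Minimal sketch: inspecting the machine-readable metadata in a LiPD-style bundle.
# The archive layout and file names are assumptions for illustration only.
import json
import zipfile

with zipfile.ZipFile("example_record.lpd") as bundle:   # hypothetical file
    meta_name = next(n for n in bundle.namelist() if n.endswith(".jsonld"))
    metadata = json.loads(bundle.read(meta_name))

# Provenance-style fields that make a dataset citable and searchable
print(metadata.get("dataSetName"))
print(metadata.get("archiveType"))
print(sorted(metadata.keys()))
```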
STAMPS: Software Tool for Automated MRI Post-processing on a supercomputer.
Bigler, Don C; Aksu, Yaman; Miller, David J; Yang, Qing X
2009-08-01
This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features. The STAMP image outputs can be used to perform brain analysis using Statistical Parametric Mapping (SPM) or single-/multi-image modality brain analysis using Support Vector Machines (SVMs). Since STAMPS is PBS-based, the supercomputer may be a multi-node computer cluster or one of the latest multi-core computers.
NASA Astrophysics Data System (ADS)
Solér, Cecilia; Sandström, Cecilia; Skoog, Hanna
2017-02-01
This article investigates the outcomes of mainstream coffee voluntary sustainability standards for high-biodiversity coffee diversification. By viewing voluntary sustainability standards certifications as performative marketing tools, we address the question of how such certification schemes affect coffee value creation based on unique biodiversity conservation properties in coffee farming. To date, the voluntary sustainability standards literature has primarily approached biodiversity conservation in coffee farming in the context of financial remuneration to coffee farmers. The performative analysis of voluntary sustainability standards certification undertaken in this paper, in which such certifications are analyzed in terms of their effect on mutually reinforcing representational, normalizing and exchange practices, provides an understanding of coffee diversification potential as dependent on standard criteria and voluntary sustainability standards certification as branding tools. We draw on a case of high-biodiversity, shade-grown coffee-farming practice in Kodagu, South-West India, which represents one of the world's biodiversity "hotspots".
CMS Configuration Editor: GUI based application for user analysis job
NASA Astrophysics Data System (ADS)
de Cosa, A.
2011-12-01
We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way and integrated within the CMS framework, which organizes user analysis code in a flexible way. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. Managing the configuration of the many required modules can be a challenging task for users, especially newcomers. For this reason a graphical tool has been conceived to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to the specific user requirements. CMS users can adopt this tool, the Config Editor, to create their analysis while visualizing in real time the effects of their actions. They can visualize the structure of their configuration, look at the modules included in the workflow, inspect the dependencies among the modules, and check the data flow. They can see the values to which parameters are set and change them according to what is required by their analysis task. Integrating common tools in the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and defining a layer of abstraction from which all PAT tools inherit.
Benchmarking and Threshold Standards in Higher Education. Staff and Educational Development Series.
ERIC Educational Resources Information Center
Smith, Helen, Ed.; Armstrong, Michael, Ed.; Brown, Sally, Ed.
This book explores the issues involved in developing standards in higher education, examining the practical issues involved in benchmarking and offering a critical analysis of the problems associated with this developmental tool. The book focuses primarily on experience in the United Kingdom (UK), but looks also at international activity in this…
Randomization and Data-Analysis Items in Quality Standards for Single-Case Experimental Studies
ERIC Educational Resources Information Center
Heyvaert, Mieke; Wendt, Oliver; Van den Noortgate, Wim; Onghena, Patrick
2015-01-01
Reporting standards and critical appraisal tools serve as beacons for researchers, reviewers, and research consumers. Parallel to existing guidelines for researchers to report and evaluate group-comparison studies, single-case experimental (SCE) researchers are in need of guidelines for reporting and evaluating SCE studies. A systematic search was…
Comparative Study of Platforms for E-Learning in the Higher Education
ERIC Educational Resources Information Center
Mondejar-Jimenez, Jose; Mondejar-Jimenez, Juan-Antonio; Vargas-Vargas, Manuel; Meseguer-Santamaria, Maria-Leticia
2008-01-01
Castilla-La Mancha University has decided to implement two tools, WebCT and Moodle, from which the "Virtual Campus" has emerged: www.campusvirtual.ulcm.es. This paper is dedicated to the analysis of this tool as a primary mode of e-learning expansion in the university environment. It can be used to carry out standard educational university activities…
Quality Tools for Professional Higher Education Review and Improvement. PHExcel Report
ERIC Educational Resources Information Center
Jørgensen, Malene Dahl; Sparre Kristensen, Regitze; Wimpf, Alexandre; Delplace, Stefan
2014-01-01
The report is the project's first outcome, and provides an overview of quality tools, quality models and quality labels, currently in use in (professional) higher education. It is followed by a gap analysis as regards the Standards and Guidelines for quality assurance in the European Higher Education Area (ESG), and the identified characteristics…
APMS: An Integrated Suite of Tools for Measuring Performance and Safety
NASA Technical Reports Server (NTRS)
Statler, Irving C.; Lynch, Robert E.; Connors, Mary M. (Technical Monitor)
1997-01-01
This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS will offer to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality, and data interchangeability, among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of air crews in mind. APMS tools must serve the needs of the government and air carriers, as well as air crews, to fully support the FOQA and AQP programs. They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but through statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.
NASA Technical Reports Server (NTRS)
Statler, Irving C.; Connor, Mary M. (Technical Monitor)
1998-01-01
This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS offers to the air transport community an open, voluntary standard for flight-data-analysis software; a standard that will help to ensure suitable functionality and data interchangeability among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of aircrews in mind. APMS tools must serve the needs of the government and air carriers, as well as aircrews, to fully support the FOQA and AQP programs. They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but also through statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the aircrew.
APMS: An Integrated Set of Tools for Measuring Safety
NASA Technical Reports Server (NTRS)
Statler, Irving C.; Reynard, William D. (Technical Monitor)
1996-01-01
This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS will offer to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality, and data interchangeability, among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of air crews in mind. APMS tools must serve the needs of the government and air carriers, as well as air crews, to fully support the FOQA and AQP programs. They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but through statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.
Independent Verification and Validation of Complex User Interfaces: A Human Factors Approach
NASA Technical Reports Server (NTRS)
Whitmore, Mihriban; Berman, Andrea; Chmielewski, Cynthia
1996-01-01
The Usability Testing and Analysis Facility (UTAF) at the NASA Johnson Space Center has identified and evaluated a potential automated software interface inspection tool capable of assessing the degree to which space-related critical and high-risk software system user interfaces meet objective human factors standards across each NASA program and project. Testing consisted of two distinct phases. Phase 1 compared analysis times and similarity of results for the automated tool and for human-computer interface (HCI) experts. In Phase 2, HCI experts critiqued the prototype tool's user interface. Based on this evaluation, it appears that a more fully developed version of the tool will be a promising complement to a human factors-oriented independent verification and validation (IV&V) process.
SNPConvert: SNP Array Standardization and Integration in Livestock Species.
Nicolazzi, Ezequiel Luis; Marras, Gabriele; Stella, Alessandra
2016-06-09
One of the main advantages of single nucleotide polymorphism (SNP) array technology is providing genotype calls for a specific number of SNP markers at a relatively low cost. Since its first application in animal genetics, the number of available SNP arrays for each species has been constantly increasing. However, in contrast to whole genome sequence data analysis, SNP array data does not have a common set of file formats or coding conventions for allele calling. Therefore, the standardization and integration of SNP array data from multiple sources have become an obstacle, especially for users with basic or no programming skills. Here, we describe the difficulties related to handling SNP array data, focusing on file formats, SNP allele coding, and mapping. We also present the SNPConvert suite, a multi-platform, open-source, and user-friendly set of tools to overcome these issues. This tool, which can be integrated with open-source and open-access tools already available, is a first step towards an integrated system to standardize and integrate any type of raw SNP array data. The tool is available at: https://github.com/nicolazzie/SNPConvert.git.
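As a rough illustration of the allele-coding differences the suite addresses, the sketch below translates invented AB-coded genotype calls to nucleotide calls with a per-marker allele map; it does not reflect SNPConvert's internal logic.

```python
# Illustrative only: converting AB-coded genotypes to nucleotide calls
# using a per-marker allele map (marker names and alleles are made up).
allele_map = {
    "SNP0001": {"A": "G", "B": "T"},
    "SNP0002": {"A": "C", "B": "A"},
}

def ab_to_nucleotide(marker, call):
    """Translate an AB call such as 'AB' into e.g. 'GT'; '--' stays missing."""
    if call == "--":
        return "--"
    alleles = allele_map[marker]
    return "".join(alleles[c] for c in call)

genotypes = {"SNP0001": "AB", "SNP0002": "BB"}
converted = {m: ab_to_nucleotide(m, g) for m, g in genotypes.items()}
print(converted)  # {'SNP0001': 'GT', 'SNP0002': 'AA'}
```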
A Study on the Development of Service Quality Index for Incheon International Airport
NASA Technical Reports Server (NTRS)
Lee, Kang Seok; Lee, Seung Chang; Hong, Soon Kil
2003-01-01
The main purpose of this study is to develop an Omnibus Monitoring System (OMS) for internal management, which will enable the establishment of standards, the identification of matters to be improved, and the systematic assessment of how they are addressed. This is done by developing subjective and objective estimation tools covering importance of use, perceived level, and a composite index for each principal service item at the international airport. The study was directed toward developing a metric analysis tool, utilizing quantitative secondary data, analysing perceived data through airport user surveys, systemizing the data collection-input-analysis process, presenting results graphically, planning service encounters and assigning control attribution, and ensuring competitiveness at least at minimal international standards. It was important to set up a pre-investigation plan on the basis of the existing foreign literature and an actual inspection of the international airport. Two tasks were executed together on the basis of this pre-investigation: one was developing subjective estimation standards for departing passengers, arriving passengers, and airport residents, and the other was developing objective standards as a complementary method. The study proceeded with the aim of monitoring services at the airport regularly and irregularly by developing a software system for operating the standards, after ensuring the credibility and feasibility of the estimation standards in a substantive and statistical way.
da Costa, Bruno R; Beckett, Brooke; Diaz, Alison; Resta, Nina M; Johnston, Bradley C; Egger, Matthias; Jüni, Peter; Armijo-Olivo, Susan
2017-03-03
The Cochrane risk of bias tool is commonly criticized for having a low reliability. We aimed to investigate whether training of raters, with objective and standardized instructions on how to assess risk of bias, can improve the reliability of the Cochrane risk of bias tool. In this pilot study, four raters inexperienced in risk of bias assessment were randomly allocated to minimal or intensive standardized training for risk of bias assessment of randomized trials of physical therapy treatments for patients with knee osteoarthritis pain. Two raters were experienced risk of bias assessors who served as reference. The primary outcome of our study was between-group reliability, defined as the agreement of the risk of bias assessments of inexperienced raters with the reference assessments of experienced raters. Consensus-based assessments were used for this purpose. The secondary outcome was within-group reliability, defined as the agreement of assessments within pairs of inexperienced raters. We calculated the chance-corrected weighted Kappa to quantify agreement within and between groups of raters for each of the domains of the risk of bias tool. A total of 56 trials were included in our analysis. The Kappa for the agreement of inexperienced raters with reference across items of the risk of bias tool ranged from 0.10 to 0.81 for the minimal training group and from 0.41 to 0.90 for the standardized training group. The Kappa values for the agreement within pairs of inexperienced raters across the items of the risk of bias tool ranged from 0 to 0.38 for the minimal training group and from 0.93 to 1 for the standardized training group. Between-group differences in Kappa for the agreement of inexperienced raters with reference always favored the standardized training group and was most pronounced for incomplete outcome data (difference in Kappa 0.52, p < 0.001) and allocation concealment (difference in Kappa 0.30, p = 0.004). Intensive, standardized training on risk of bias assessment may significantly improve the reliability of the Cochrane risk of bias tool.
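The chance-corrected weighted Kappa reported here can be computed with standard libraries; a minimal sketch with invented ordinal risk-of-bias ratings is shown below (this is not the authors' analysis code).

```python
# Minimal sketch: chance-corrected (linearly) weighted Kappa between two raters
# on ordinal risk-of-bias judgements (0=low, 1=unclear, 2=high). Data are invented.
from sklearn.metrics import cohen_kappa_score

rater_a = [0, 0, 1, 2, 2, 1, 0, 2, 1, 0]
rater_b = [0, 1, 1, 2, 2, 1, 0, 1, 1, 0]

kappa_linear = cohen_kappa_score(rater_a, rater_b, weights="linear")
print(f"weighted kappa = {kappa_linear:.2f}")
```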
ProCon - PROteomics CONversion tool.
Mayer, Gerhard; Stephan, Christian; Meyer, Helmut E; Kohl, Michael; Marcus, Katrin; Eisenacher, Martin
2015-11-03
With the growing amount of experimental data produced in proteomics experiments and the requirements/recommendations of journals in the proteomics field to publicly make available data described in papers, a need for long-term storage of proteomics data in public repositories arises. For such an upload one needs proteomics data in a standardized format. Therefore, it is desirable that proprietary vendors' software will in the future integrate such export functionality using the standard formats for proteomics results defined by the HUPO-PSI group. Currently not all search engines and analysis tools support these standard formats. In the meantime there is a need to provide user-friendly, free-to-use conversion tools that can convert the data into such standard formats in order to support wet-lab scientists in creating proteomics data files ready for upload into the public repositories. ProCon is such a conversion tool written in Java for conversion of proteomics identification data into the standard formats mzIdentML and PRIDE XML. It allows the conversion of Sequest™/Comet .out files, of search results from the popular and often used ProteomeDiscoverer® 1.x (x = versions 1.1 to 1.4) software, and of search results stored in the LIMS systems ProteinScape® 1.3 and 2.1 into mzIdentML and PRIDE XML. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015. Published by Elsevier B.V.
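ProCon itself is a Java desktop tool; once results are in the standardized mzIdentML format, they can be read programmatically by downstream consumers. The sketch below uses the independent pyteomics library and a hypothetical file name.

```python
# Minimal sketch: iterating over a standardized mzIdentML file (e.g. one produced
# by a converter such as ProCon). Uses the pyteomics library; file name is hypothetical.
from pyteomics import mzid

count = 0
with mzid.read("converted_results.mzid") as reader:
    for spectrum_result in reader:   # one SpectrumIdentificationResult per entry
        count += 1
print(f"{count} spectrum identification results")
```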
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCord, Jason
WLS gathers all known relevant contextual data along with standard event log information, processes it into an easily consumable format for analysis by 3rd party tools, and forwards the logs to any compatible log server.
Analysis of a Teacher's Pedagogical Arguments Using Toulmin's Model and Argumentation Schemes
ERIC Educational Resources Information Center
Metaxas, N.; Potari, D.; Zachariades, T.
2016-01-01
In this article, we elaborate methodologies to study the argumentation speech of a teacher involved in argumentative activities. The standard tool of analysis of teachers' argumentation concerning pedagogical matters is Toulmin's model. The theory of argumentation schemes offers an alternative perspective on the analysis of arguments. We propose…
Enhanced semantic interoperability by profiling health informatics standards.
López, Diego M; Blobel, Bernd
2009-01-01
Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method and suggest the necessary tooling for reusing standard health information models, by that way supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered as such a meta-model. The first step of the introduced method identifies the standard health information models and tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported on Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIMs specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.
Su, Cheng; Zhou, Lei; Hu, Zheng; Weng, Winnie; Subramani, Jayanthi; Tadkod, Vineet; Hamilton, Kortney; Bautista, Ami; Wu, Yu; Chirmule, Narendra; Zhong, Zhandong Don
2015-10-01
Biotherapeutics can elicit immune responses, which can alter the exposure, safety, and efficacy of the therapeutics. A well-designed and robust bioanalytical method is critical for the detection and characterization of relevant anti-drug antibodies (ADA) and the success of an immunogenicity study. As a fundamental criterion in immunogenicity testing, assay cut points need to be statistically established with a risk-based approach to reduce subjectivity. This manuscript describes the development of a validated, web-based, multi-tier customized assay statistical tool (CAST) for assessing cut points of ADA assays. The tool provides an intuitive web interface that allows users to import experimental data generated from a standardized experimental design, select the assay factors, run the standardized analysis algorithms, and generate tables, figures, and listings (TFL). It allows bioanalytical scientists to perform complex statistical analysis at the click of a button to produce reliable assay parameters in support of immunogenicity studies. Copyright © 2015 Elsevier B.V. All rights reserved.
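As a generic illustration of a screening cut point calculation (a common parametric approach in ADA assay validation, not necessarily the algorithm implemented in CAST), the sketch below estimates a cut point from invented drug-naive sample signals.

```python
# Illustrative parametric screening cut point: mean + 1.645*SD of log-transformed
# signals from drug-naive samples (targets a ~5% false-positive rate).
# Data are invented; outlier handling, normality checks, and mixed-effects
# modelling used in real validations are omitted.
import numpy as np

signals = np.array([1.02, 0.97, 1.10, 0.88, 1.05, 0.93, 1.20, 0.99, 1.08, 0.95])
log_signals = np.log(signals)

cut_point = np.exp(log_signals.mean() + 1.645 * log_signals.std(ddof=1))
print(f"screening cut point = {cut_point:.3f}")
```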
Kasahara, Kota; Kinoshita, Kengo
2016-01-01
Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., the complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
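Because the ion-binding state graph is exported in the standard GML format, it can be inspected with general-purpose network libraries as well as Cytoscape; the sketch below uses NetworkX with a hypothetical file name and edge attribute.

```python
# Minimal sketch: loading an ion-binding state graph exported in GML
# (graph modeling language) and listing the heaviest transitions.
# The file name is hypothetical; attribute names depend on the actual output.
import networkx as nx

g = nx.read_gml("ibisa_state_graph.gml")
print(g.number_of_nodes(), "states,", g.number_of_edges(), "transitions")

# If edges carry a transition count/weight attribute, rank them
weighted = [(u, v, d) for u, v, d in g.edges(data=True) if "weight" in d]
for u, v, d in sorted(weighted, key=lambda e: e[2]["weight"], reverse=True)[:5]:
    print(u, "->", v, d["weight"])
```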
Maser: one-stop platform for NGS big data from analysis to visualization
Kinjo, Sonoko; Monma, Norikazu; Misu, Sadahiko; Kitamura, Norikazu; Imoto, Junichi; Yoshitake, Kazutoshi; Gojobori, Takashi; Ikeo, Kazuho
2018-01-01
Abstract A major challenge in analyzing the data from high-throughput next-generation sequencing (NGS) is how to handle the huge amounts of data and variety of NGS tools and visualize the resultant outputs. To address these issues, we developed a cloud-based data analysis platform, Maser (Management and Analysis System for Enormous Reads), and an original genome browser, Genome Explorer (GE). Maser enables users to manage up to 2 terabytes of data to conduct analyses with easy graphical user interface operations and offers analysis pipelines in which several individual tools are combined as a single pipeline for very common and standard analyses. GE automatically visualizes genome assembly and mapping results output from Maser pipelines, without requiring additional data upload. With this function, the Maser pipelines can graphically display the results output from all the embedded tools and mapping results in a web browser. Therefore Maser realized a more user-friendly analysis platform especially for beginners by improving graphical display and providing the selected standard pipelines that work with built-in genome browser. In addition, all the analyses executed on Maser are recorded in the analysis history, helping users to trace and repeat the analyses. The entire process of analysis and its histories can be shared with collaborators or opened to the public. In conclusion, our system is useful for managing, analyzing, and visualizing NGS data and achieves traceability, reproducibility, and transparency of NGS analysis. Database URL: http://cell-innovation.nig.ac.jp/maser/ PMID:29688385
Experience with case tools in the design of process-oriented software
NASA Astrophysics Data System (ADS)
Novakov, Ognian; Sicard, Claude-Henri
1994-12-01
In Accelerator systems such as the CERN PS complex, process equipment has a life time which may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing data-base-dependent development chain, the lack of real-time simulation tools and of Object-Oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.
Reliability of the ECHOWS Tool for Assessment of Patient Interviewing Skills.
Boissonnault, Jill S; Evans, Kerrie; Tuttle, Neil; Hetzel, Scott J; Boissonnault, William G
2016-04-01
History taking is an important component of patient/client management. Assessment of student history-taking competency can be achieved via a standardized tool. The ECHOWS tool has been shown to be valid with modest intrarater reliability in a previous study but did not demonstrate sufficient power to definitively prove its stability. The purposes of this study were: (1) to assess the reliability of the ECHOWS tool for student assessment of patient interviewing skills and (2) to determine whether the tool discerns between novice and experienced skill levels. A reliability and construct validity assessment was conducted. Three faculty members from the United States and Australia scored videotaped histories from standardized patients taken by students and experienced clinicians from each of these countries. The tapes were scored twice, 3 to 6 weeks apart. Reliability was assessed using intraclass correlation coefficients (ICCs) and repeated-measures analysis of variance; these models assessed the ability of the tool to discern between novice and experienced skill levels. The ECHOWS tool showed excellent intrarater reliability (ICC [3,1]=.74-.89) and good interrater reliability (ICC [2,1]=.55) as a whole. The summary of performance (S) section showed poor interrater reliability (ICC [2,1]=.27). There was no statistical difference in performance on the tool between novice and experienced clinicians. A possible ceiling effect may occur when standardized patients are not coached to provide complex and obtuse responses to interviewer questions. Variation in familiarity with the ECHOWS tool and in use of the online training may have influenced scoring of the S section. The ECHOWS tool demonstrates excellent intrarater reliability and moderate interrater reliability. Sufficient training with the tool prior to student assessment is recommended. The S section must evolve in order to provide a more discerning measure of interviewing skills. © 2016 American Physical Therapy Association.
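Intraclass correlation coefficients of the kind reported here can be reproduced with standard statistics packages; a minimal sketch with invented scores, using the pingouin library, is shown below.

```python
# Minimal sketch: ICC(2,1) for agreement across raters on invented ECHOWS-style scores.
import pandas as pd
import pingouin as pg

scores = pd.DataFrame({
    "student": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "rater":   ["r1", "r2", "r3"] * 3,
    "echows":  [78, 75, 80, 62, 60, 65, 90, 88, 91],
})

icc = pg.intraclass_corr(data=scores, targets="student",
                         raters="rater", ratings="echows")
print(icc[icc["Type"] == "ICC2"][["Type", "ICC", "CI95%"]])
```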
Designing Real-time Decision Support for Trauma Resuscitations
Yadav, Kabir; Chamberlain, James M.; Lewis, Vicki R.; Abts, Natalie; Chawla, Shawn; Hernandez, Angie; Johnson, Justin; Tuveson, Genevieve; Burd, Randall S.
2016-01-01
Background Use of electronic clinical decision support (eCDS) has been recommended to improve implementation of clinical decision rules. Many eCDS tools, however, are designed and implemented without taking into account the context in which clinical work is performed. Implementation of the pediatric traumatic brain injury (TBI) clinical decision rule at one Level I pediatric emergency department includes an electronic questionnaire triggered when ordering a head computed tomography using computerized physician order entry (CPOE). Providers use this CPOE tool in less than 20% of trauma resuscitation cases. A human factors engineering approach could identify the implementation barriers that are limiting the use of this tool. Objectives The objective was to design a pediatric TBI eCDS tool for trauma resuscitation using a human factors approach. The hypothesis was that clinical experts will rate a usability-enhanced eCDS tool better than the existing CPOE tool for user interface design and suitability for clinical use. Methods This mixed-methods study followed usability evaluation principles. Pediatric emergency physicians were surveyed to identify barriers to using the existing eCDS tool. Using standard trauma resuscitation protocols, a hierarchical task analysis of pediatric TBI evaluation was developed. Five clinical experts, all board-certified pediatric emergency medicine faculty members, then iteratively modified the hierarchical task analysis until reaching consensus. The software team developed a prototype eCDS display using the hierarchical task analysis. Three human factors engineers provided feedback on the prototype through a heuristic evaluation, and the software team refined the eCDS tool using a rapid prototyping process. The eCDS tool then underwent iterative usability evaluations by the five clinical experts using video review of 50 trauma resuscitation cases. A final eCDS tool was created based on their feedback, with content analysis of the evaluations performed to ensure all concerns were identified and addressed. Results Among 26 EPs (76% response rate), the main barriers to using the existing tool were that the information displayed is redundant and does not fit clinical workflow. After the prototype eCDS tool was developed based on the trauma resuscitation hierarchical task analysis, the human factors engineers rated it to be better than the CPOE tool for nine of 10 standard user interface design heuristics on a three-point scale. The eCDS tool was also rated better for clinical use on the same scale, in 84% of 50 expert–video pairs, and was rated equivalent in the remainder. Clinical experts also rated barriers to use of the eCDS tool as being low. Conclusions An eCDS tool for diagnostic imaging designed using human factors engineering methods has improved perceived usability among pediatric emergency physicians. PMID:26300010
Slow Speed--Fast Motion: Time-Lapse Recordings in Physics Education
ERIC Educational Resources Information Center
Vollmer, Michael; Möllmann, Klaus-Peter
2018-01-01
Video analysis with a 30 Hz frame rate is the standard tool in physics education. The development of affordable high-speed cameras has extended the capabilities of the tool to much smaller time scales, down to the 1 ms range, using frame rates of typically up to 1000 frames per second, allowing us to study transient physics phenomena happening…
ERIC Educational Resources Information Center
Cramer, Nicholas; Asmar, Abdo; Gorman, Laurel; Gros, Bernard; Harris, David; Howard, Thomas; Hussain, Mujtaba; Salazar, Sergio; Kibble, Jonathan D.
2016-01-01
Multiple-choice questions are a gold-standard tool in medical school for assessment of knowledge and are the mainstay of licensing examinations. However, multiple-choice question items can be criticized for lacking the ability to test higher-order learning or integrative thinking across multiple disciplines. Our objective was to develop a novel…
V-FOR-WaTer - a new virtual research environment for environmental research
NASA Astrophysics Data System (ADS)
Strobl, Marcus; Azmi, Elnaz; Hassler, Sibylle; Mälicke, Mirko; Meyer, Jörg; Zehe, Erwin
2017-04-01
The preparation of heterogeneous datasets for scientific analysis is still a demanding task. Data preprocessing for hydrological models typically involves gathering datasets from different sources, extensive work within geoinformation systems, data transformation, the generation of computational grids and the definition of initial and boundary conditions. V-FOR-WaTer, a standardized and scalable data hub with compatible analysis tools, will ease comprehensive studies and significantly reduce data preparation time. The idea behind V-FOR-WaTer is to bring together various datasets (e.g. point measurements, 2D/3D data, time series data) from different sources (e.g. gathered in research projects, or as part of regular monitoring of state offices) and to provide common as well as innovative scaling tools in space and time to generate a coherent data grid. Each dataset holds detailed standardized metadata to ensure usability of the data, offer a comprehensive search function and provide reference information for appropriate citation of the dataset creators. V-FOR-WaTer includes a basis of data and tools, but its purpose is to grow by users who extend the virtual research environment with their own tools and research data. Researchers who upload new data or tools can receive a digital object identifier, or protect their data and tools from others until publication. Access to data and tools provided from V-FOR-WaTer happens via an easy-to-use web portal. Due to its modular architecture the portal is ready to be extended with new tools and features and also offers interfaces to Matlab, Python and R.
Rosella, L; Bowman, C; Pach, B; Morgan, S; Fitzpatrick, T; Goel, V
2016-07-01
Most quality appraisal tools were developed for clinical medicine and tend to be study-specific with a strong emphasis on risk of bias. In order to be more relevant to public health, an appropriate quality appraisal tool needs to be less reliant on the evidence hierarchy and consider practice applicability. Given the broad range of study designs used in public health, the objective of this study was to develop and validate a meta-tool that combines public health-focused principles of appraisal coupled with a set of design-specific companion tools. Several design methods were used to develop and validate the tool including literature review, synthesis, and validation with a reference standard. A search of critical appraisal tools relevant to public health was conducted; core concepts were collated. The resulting framework was piloted during three feedback sessions with public health practitioners. Following subsequent revisions, the final meta-tool, the Meta Quality Appraisal Tool (MetaQAT), was then validated through a content analysis of appraisals conducted by two groups of experienced public health researchers (MetaQAT vs generic appraisal form). The MetaQAT framework consists of four domains: relevancy, reliability, validity, and applicability. In addition, a companion tool was assembled from existing critical appraisal tools to provide study design-specific guidance on validity appraisal. Content analysis showed similar methodological and generalizability concerns were raised by both groups; however, the MetaQAT appraisers commented more extensively on applicability to public health practice. Critical appraisal tools designed for clinical medicine have limitations for use in the context of public health. The meta-tool structure of the MetaQAT allows for rigorous appraisal, while allowing users to simultaneously appraise the multitude of study designs relevant to public health research and assess non-standard domains, such as applicability. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
MAAMD: a workflow to standardize meta-analyses and comparison of affymetrix microarray data
2014-01-01
Background Mandatory deposit of raw microarray data files for public access, prior to study publication, provides significant opportunities to conduct new bioinformatics analyses within and across multiple datasets. Analysis of raw microarray data files (e.g. Affymetrix CEL files) can be time-consuming, complex, and requires fundamental computational and bioinformatics skills. The development of analytical workflows to automate these tasks simplifies the processing of, improves the efficiency of, and serves to standardize multiple and sequential analyses. Once installed, workflows facilitate the tedious steps required to run rapid intra- and inter-dataset comparisons. Results We developed a workflow to facilitate and standardize Meta-Analysis of Affymetrix Microarray Data analysis (MAAMD) in Kepler. Two freely available stand-alone software tools, R and AltAnalyze, were embedded in MAAMD. The inputs of MAAMD are user-editable csv files, which contain sample information and parameters describing the locations of input files and required tools. MAAMD was tested by analyzing 4 different GEO datasets from mice and drosophila. MAAMD automates data downloading, data organization, data quality control assessment, differential gene expression analysis, clustering analysis, pathway visualization, gene-set enrichment analysis, and cross-species orthologous-gene comparisons. MAAMD was utilized to identify gene orthologues responding to hypoxia or hyperoxia in both mice and drosophila. The entire set of analyses for 4 datasets (34 total microarrays) finished in approximately one hour. Conclusions MAAMD saves time, minimizes the required computer skills, and offers a standardized procedure for users to analyze microarray datasets and make new intra- and inter-dataset comparisons. PMID:24621103
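The workflow is driven by user-editable CSV files; the sketch below builds and reads a sample sheet of this general kind with pandas (the column names are invented and may differ from MAAMD's actual schema).

```python
# Illustrative sample sheet for a workflow driven by user-editable CSV files.
# Column names are invented; MAAMD's real input schema may differ.
import pandas as pd

samples = pd.DataFrame({
    "gsm_id":    ["GSM101", "GSM102", "GSM103", "GSM104"],
    "species":   ["mouse", "mouse", "drosophila", "drosophila"],
    "condition": ["hypoxia", "control", "hypoxia", "control"],
    "cel_file":  ["GSM101.CEL", "GSM102.CEL", "GSM103.CEL", "GSM104.CEL"],
})
samples.to_csv("maamd_samples.csv", index=False)

# Downstream steps would read the sheet and group arrays by condition
sheet = pd.read_csv("maamd_samples.csv")
print(sheet.groupby(["species", "condition"]).size())
```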
Using Galaxy to Perform Large-Scale Interactive Data Analyses
Hillman-Jackson, Jennifer; Clements, Dave; Blankenberg, Daniel; Taylor, James; Nekrutenko, Anton
2014-01-01
Innovations in biomedical research technologies continue to provide experimental biologists with novel and increasingly large genomic and high-throughput data resources to be analyzed. As creating and obtaining data has become easier, the key decision faced by many researchers is a practical one: where and how should an analysis be performed? Datasets are large, and analysis tool set-up and use are riddled with complexities outside of the scope of core research activities. The authors believe that Galaxy provides a powerful solution that simplifies data acquisition and analysis in an intuitive Web application, granting all researchers access to key informatics tools previously only available to computational specialists working in Unix-based environments. We will demonstrate through a series of biomedically relevant protocols how Galaxy specifically brings together (1) data retrieval from public and private sources, for example, UCSC's Eukaryote and Microbial Genome Browsers, (2) custom tools (wrapped Unix functions, format standardization/conversions, interval operations), and (3) 3rd-party analysis tools. PMID:22700312
Moyer, Jason T; Gnatkovsky, Vadym; Ono, Tomonori; Otáhal, Jakub; Wagenaar, Joost; Stacey, William C; Noebels, Jeffrey; Ikeda, Akio; Staley, Kevin; de Curtis, Marco; Litt, Brian; Galanopoulou, Aristea S
2017-11-01
Electroencephalography (EEG), the direct recording of the electrical activity of populations of neurons, is a tremendously important tool for diagnosing, treating, and researching epilepsy. Although standard procedures for recording and analyzing human EEG exist and are broadly accepted, there are no such standards for research in animal models of seizures and epilepsy: recording montages, acquisition systems, and processing algorithms may differ substantially among investigators and laboratories. The lack of standard procedures for acquiring and analyzing EEG from animal models of epilepsy hinders the interpretation of experimental results and reduces the ability of the scientific community to efficiently translate new experimental findings into clinical practice. Accordingly, the intention of this report is twofold: (1) to review current techniques for the collection and software-based analysis of neural field recordings in animal models of epilepsy, and (2) to offer pertinent standards and reporting guidelines for this research. Specifically, we review current techniques for signal acquisition, signal conditioning, signal processing, data storage, and data sharing, and include applicable recommendations to standardize collection and reporting. We close with a discussion of challenges and future opportunities, and include a supplemental report of currently available acquisition systems and analysis tools. This work represents a collaboration on behalf of the American Epilepsy Society/International League Against Epilepsy (AES/ILAE) Translational Task Force (TASK1-Workgroup 5), and is part of a larger effort to harmonize video-EEG interpretation and analysis methods across studies using in vivo and in vitro seizure and epilepsy models. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.
Schneider, T; Arumi, D; Crook, T J; Sun, F; Michel, M C
2014-09-01
To compare the effects of additional educational material on treatment satisfaction of overactive bladder (OAB) patients treated with a muscarinic receptor antagonist. In an observational study of OAB patients being treated by their physician with fesoterodine for 4 months (FAKTEN study), sites were randomised to providing standard treatment or additional educational material including the SAGA tool. Patient satisfaction was assessed by three validated patient-reported outcomes including the Treatment Satisfaction Question. Because of premature discontinuation of the study, descriptive statistical analysis was performed. A total of 431 and 342 patients received standard treatment or additional educational material, respectively. At study end, 76.1% [95% CI = 71.3, 80.4] of patients with standard care and 79.6% [95% CI = 74.4, 84.1] with additional SAGA tool were satisfied with treatment (primary end-point). Comparable outcomes with and without the additional educational material were also found in various patient subgroups, at the 1-month time point, and for the other patient-reported outcomes. A notable exception was the subgroup of treatment-naïve patients in which the percentage of satisfied patients was 77.2% vs. 89.5% with standard treatment and additional SAGA tool, respectively (post hoc analysis). In an observational study, most overactive bladder patients were satisfied with fesoterodine treatment. Because of the small sample size, the study does not support or refute the hypothesis that adding the SAGA tool will improve patient satisfaction with treatment. The potential effect of additional educational material in treatment-naïve patients warrants further dedicated studies. © 2014 John Wiley & Sons Ltd.
A Format for Phylogenetic Placements
Matsen, Frederick A.; Hoffman, Noah G.; Gallagher, Aaron; Stamatakis, Alexandros
2012-01-01
We have developed a unified format for phylogenetic placements, that is, mappings of environmental sequence data (e.g., short reads) into a phylogenetic tree. We are motivated to do so by the growing number of tools for computing and post-processing phylogenetic placements, and the lack of an established standard for storing them. The format is lightweight, versatile, extensible, and is based on the JSON format, which can be parsed by most modern programming languages. Our format is already implemented in several tools for computing and post-processing parsimony- and likelihood-based phylogenetic placements and has worked well in practice. We believe that establishing a standard format for analyzing read placements at this early stage will lead to a more efficient development of powerful and portable post-analysis tools for the growing applications of phylogenetic placement. PMID:22383988
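The placement format described here is JSON-based, so it can be read with any standard JSON parser. The snippet below sketches what parsing such a file might look like in Python; the key names and the field ordering shown are illustrative assumptions for this example, not quotations from the published specification.

# Schematic parse of a JSON-based phylogenetic placement file.
# Key names ("tree", "placements", "fields", ...) and field order are
# illustrative assumptions, not the normative specification described in the paper.
import json

example = """{
  "version": 3,
  "tree": "((A:0.1{0},B:0.2{1}):0.05{2},C:0.3{3}){4};",
  "fields": ["edge_num", "likelihood", "like_weight_ratio"],
  "placements": [
    {"p": [[0, -1234.5, 0.92], [2, -1236.1, 0.08]], "nm": [["read_001", 1]]}
  ],
  "metadata": {"invocation": "example only"}
}"""

doc = json.loads(example)
weight_col = doc["fields"].index("like_weight_ratio")
for placement in doc["placements"]:
    best = max(placement["p"], key=lambda row: row[weight_col])
    print(placement["nm"][0][0], "best edge:", best[0])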
HydroClimATe: hydrologic and climatic analysis toolkit
Dickinson, Jesse; Hanson, Randall T.; Predmore, Steven K.
2014-01-01
The potential consequences of climate variability and climate change have been identified as major issues for the sustainability and availability of the worldwide water resources. Unlike global climate change, climate variability represents deviations from the long-term state of the climate over periods of a few years to several decades. Currently, rich hydrologic time-series data are available, but the combination of data preparation and statistical methods developed by the U.S. Geological Survey as part of the Groundwater Resources Program is relatively unavailable to hydrologists and engineers who could benefit from estimates of climate variability and its effects on periodic recharge and water-resource availability. This report documents HydroClimATe, a computer program for assessing the relations between variable climatic and hydrologic time-series data. HydroClimATe was developed for a Windows operating system. The software includes statistical tools for (1) time-series preprocessing, (2) spectral analysis, (3) spatial and temporal analysis, (4) correlation analysis, and (5) projections. The time-series preprocessing tools include spline fitting, standardization using a normal or gamma distribution, and transformation by a cumulative departure. The spectral analysis tools include discrete Fourier transform, maximum entropy method, and singular spectrum analysis. The spatial and temporal analysis tool is empirical orthogonal function analysis. The correlation analysis tools are linear regression and lag correlation. The projection tools include autoregressive time-series modeling and generation of many realizations. These tools are demonstrated in four examples that use stream-flow discharge data, groundwater-level records, gridded time series of precipitation data, and the Multivariate ENSO Index.
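Two of the preprocessing steps listed, standardization and transformation to a cumulative departure, are simple to express directly. A minimal sketch follows (illustrative only, not HydroClimATe source code):

# Minimal sketch of two preprocessing steps named in the abstract:
# z-score standardization and a cumulative-departure transform.
# Illustrative only; this is not HydroClimATe code.
import numpy as np

def standardize(series):
    """Standardize a time series to zero mean and unit variance."""
    series = np.asarray(series, dtype=float)
    return (series - series.mean()) / series.std(ddof=1)

def cumulative_departure(series):
    """Running sum of departures from the long-term mean."""
    series = np.asarray(series, dtype=float)
    return np.cumsum(series - series.mean())

annual_precip = np.array([820.0, 650.0, 910.0, 700.0, 760.0, 880.0])
print(standardize(annual_precip))
print(cumulative_departure(annual_precip))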
Weech-Maldonado, Robert; Dreachslin, Janice L.; Brown, Julie; Pradhan, Rohit; Rubin, Kelly L.; Schiller, Cameron; Hays, Ron D.
2016-01-01
Background The U.S. national standards for culturally and linguistically appropriate services (CLAS) in health care provide guidelines on policies and practices aimed at developing culturally competent systems of care. The Cultural Competency Assessment Tool for Hospitals (CCATH) was developed as an organizational tool to assess adherence to the CLAS standards. Purposes First, we describe the development of the CCATH and estimate the reliability and validity of the CCATH measures. Second, we discuss the managerial implications of the CCATH as an organizational tool to assess cultural competency. Methodology/Approach We pilot tested an initial draft of the CCATH, revised it based on a focus group and cognitive interviews, and then administered it in a field test with a sample of California hospitals. The reliability and validity of the CCATH were evaluated using factor analysis, analysis of variance, and Cronbach’s alphas. Findings Exploratory and confirmatory factor analyses identified 12 CCATH composites: leadership and strategic planning, data collection on inpatient population, data collection on service area, performance management systems and quality improvement, human resources practices, diversity training, community representation, availability of interpreter services, interpreter services policies, quality of interpreter services, translation of written materials, and clinical cultural competency practices. All the CCATH scales had internal consistency reliability of .65 or above, and the reliability was .70 or above for 9 of the 12 scales. Analysis of variance results showed that not-for-profit hospitals have higher CCATH scores than for-profit hospitals in five CCATH scales and higher CCATH scores than government hospitals in two CCATH scales. Practice Implications The CCATH showed adequate psychometric properties. Managers and policy makers can use the CCATH as a tool to evaluate hospital performance in cultural competency and identify and target improvements in hospital policies and practices that undergird the provision of CLAS. PMID:21934511
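Internal consistency of the CCATH scales was summarized with Cronbach's alpha. As a reminder of what that statistic computes, here is a minimal implementation for a respondents-by-items score matrix (illustrative only, not the authors' analysis code):

# Minimal Cronbach's alpha calculation for a respondents-by-items matrix.
# Illustrative only; not the CCATH analysis code.
import numpy as np

def cronbach_alpha(item_scores):
    """item_scores: 2-D array, rows = respondents, columns = scale items."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

scores = np.array([[4, 5, 4], [3, 4, 3], [5, 5, 4], [2, 3, 2], [4, 4, 5]])
print(round(cronbach_alpha(scores), 3))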
REddyProc: Enabling researchers to process Eddy-Covariance data
NASA Astrophysics Data System (ADS)
Wutzler, Thomas; Moffat, Antje; Migliavacca, Mirco; Knauer, Jürgen; Menzer, Olaf; Sickel, Kerstin; Reichstein, Markus
2017-04-01
Analysing Eddy-Covariance measurements involves extensive processing, which places a substantial technical burden on researchers. There is a need to overcome difficulties in data processing associated with deploying, adapting and using existing software and online tools. We tackled that need by developing the REddyProc package in the open source cross-platform language R, which provides standard processing routines for reading half-hourly files in different formats, including from the recently released FLUXNET 2015 dataset, uStar threshold estimation and its associated uncertainty, gap-filling, flux partitioning (both night-time and daytime based), and visualization of results. Although different in some features, the package mimics the online tool that has been extensively used by many users and site Principal Investigators (PIs) in recent years and is available on the website of the Max Planck Institute for Biogeochemistry. Generally, REddyProc results are statistically equivalent to results based on state-of-the-art tools. The provided routines can be easily installed, configured, used, and integrated with further analysis. Hence the eddy covariance community will benefit from using the provided package, allowing easier integration of standard processing with extended analysis. This complements activities by AmeriFlux, ICOS, NEON, and other regional networks for developing codes for standardized data processing of multiple sites in FLUXNET.
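Gap-filling of half-hourly flux data is one of the routines the package provides. The sketch below illustrates only the general idea of filling a missing half-hour from the mean diurnal course of neighbouring days; it is a deliberately simplified illustration and not the REddyProc algorithm.

# Highly simplified illustration of gap-filling half-hourly flux data from the
# mean diurnal course of neighbouring days. This is NOT the REddyProc algorithm,
# only a sketch of the general idea of filling gaps from similar times of day.
import numpy as np

def fill_gaps_mean_diurnal(flux, slots_per_day=48, window_days=7):
    flux = np.array(flux, dtype=float)            # NaN marks missing half-hours
    filled = flux.copy()
    n_days = len(flux) // slots_per_day
    daily = flux[: n_days * slots_per_day].reshape(n_days, slots_per_day)
    for day in range(n_days):
        lo, hi = max(0, day - window_days), min(n_days, day + window_days + 1)
        slot_means = np.nanmean(daily[lo:hi], axis=0)
        for slot in range(slots_per_day):
            idx = day * slots_per_day + slot
            if np.isnan(filled[idx]):
                filled[idx] = slot_means[slot]
    return filled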
ERIC Educational Resources Information Center
Williams, Lynne J.; Abdi, Herve; French, Rebecca; Orange, Joseph B.
2010-01-01
Purpose: In communication disorders research, clinical groups are frequently described based on patterns of performance, but researchers often study only a few participants described by many quantitative and qualitative variables. These data are difficult to handle with standard inferential tools (e.g., analysis of variance or factor analysis)…
Wesner, Amber R.; Jones, Ryan; Schultz, Karen; Johnson, Mark
2016-01-01
The purpose of this study was to evaluate the impact of a standardized reflection tool on the development of a teaching philosophy statement in a pharmacy residency teaching and learning curriculum program (RTLCP). Pharmacy residents participating in the RTLCP over a two-year period were surveyed using a pre/post method to assess perceptions of teaching philosophy development before and after using the tool. Responses were assessed using a 5-point Likert scale to indicate level of agreement with each statement. For analysis, responses were divided into high (strongly agree/agree) and low (neutral/disagree/strongly disagree) agreement. The level of agreement increased significantly for all items surveyed (p < 0.05), with the exception of one area pertaining to the ability to describe characteristics of outstanding teachers, which was noted to be strong before and after using the tool (p = 0.5027). Overall results were positive, with 81% of participants responding that the reflection tool was helpful in developing a teaching philosophy, and 96% responding that the resulting teaching philosophy statement fully reflected their views on teaching and learning. The standardized reflection tool developed at Shenandoah University assisted pharmacy residents enrolled in a teaching and learning curriculum program to draft a comprehensive teaching philosophy statement, and was well received by participants. PMID:28970382
Open source cardiology electronic health record development for DIGICARDIAC implementation
NASA Astrophysics Data System (ADS)
Dugarte, Nelson; Medina, Rubén.; Huiracocha, Lourdes; Rojas, Rubén.
2015-12-01
This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under Health Level-7 (HL7) international standards. The novelty of the system is the integration of high resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools and telecardiology tools. The acquisition tools manage and control the functions of the DIGICARDIAC electrocardiograph. The processing tools support HRECG signal analysis in the search for patterns indicative of cardiovascular pathologies. The incorporation of telecardiology tools allows the system to communicate with other health care centers, decreasing the time needed to access patient information. The CEHR system was developed entirely with open source software. Preliminary results of process validation demonstrated the efficiency of the system.
Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi
2016-01-01
Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405
Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi
2015-11-01
Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.
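PAEA rests on principal angles between subspaces, which can be obtained from the singular values of the product of two orthonormal bases. A minimal sketch of that computation is shown below; it is illustrative only and is not the authors' implementation.

# Minimal sketch: principal angles between the column spaces of two matrices,
# computed from the SVD of Q1.T @ Q2 after orthonormalization.
# Illustrative only; not the PAEA implementation.
import numpy as np

def principal_angles(a, b):
    q1, _ = np.linalg.qr(a)
    q2, _ = np.linalg.qr(b)
    singular_values = np.linalg.svd(q1.T @ q2, compute_uv=False)
    return np.arccos(np.clip(singular_values, -1.0, 1.0))

rng = np.random.default_rng(0)
expr_subspace = rng.normal(size=(100, 3))   # e.g. leading components of expression data
gene_set_basis = rng.normal(size=(100, 3))  # indicator-like basis for a gene set
print(np.degrees(principal_angles(expr_subspace, gene_set_basis)))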
Personal Constructions of Biological Concepts--The Repertory Grid Approach
ERIC Educational Resources Information Center
McCloughlin, Thomas J. J.; Matthews, Philip S. C.
2017-01-01
This work discusses repertory grid analysis as a tool for investigating the structures of students' representations of biological concepts. Repertory grid analysis provides the researcher with a variety of techniques that are not associated with standard methods of concept mapping for investigating conceptual structures. It can provide valuable…
Causal Mediation Analysis: Warning! Assumptions Ahead
ERIC Educational Resources Information Center
Keele, Luke
2015-01-01
In policy evaluations, interest may focus on why a particular treatment works. One tool for understanding why treatments work is causal mediation analysis. In this essay, I focus on the assumptions needed to estimate mediation effects. I show that there is no "gold standard" method for the identification of causal mediation effects. In…
DRIS Analysis Identifies a Common Potassium Imbalance in Sweetgum Plantations
Mark D. Coleman; S.X. Chang; D.J. Robison
2003-01-01
DRIS (Diagnosis and Recommendation Integrated System) analysis was applied to fast-growing sweetgum (Liquidambar styraciflua L.) plantations in the southeast United States as a tool for nutrient diagnosis and fertilizer recommendations. First, standard foliar nutrient ratios for nitrogen (N), phosphorus (P), potassium (K), calcium (Ca), and...
Golden, Sherita Hill; Hager, Daniel; Gould, Lois J; Mathioudakis, Nestoras; Pronovost, Peter J
2017-01-01
In a complex health system, it is important to establish a systematic and data-driven approach to identifying needs. The Diabetes Clinical Community (DCC) of Johns Hopkins Medicine's Armstrong Institute for Patient Safety and Quality developed a gap analysis tool and process to establish the system's current state of inpatient diabetes care. The collectively developed tool assessed the following areas: program infrastructure; protocols, policies, and order sets; patient and health care professional education; and automated data access. For the purposes of this analysis, gaps were defined as those instances in which local resources, infrastructure, or processes demonstrated a variance against the current national evidence base or institutionally defined best practices. Following the gap analysis, members of the DCC, in collaboration with health system leadership, met to identify priority areas in order to integrate and synergize diabetes care resources and efforts to enhance quality and reduce disparities in care across the system. Key gaps in care identified included lack of standardized glucose management policies, lack of standardized training of health care professionals in inpatient diabetes management, and lack of access to automated data collection and analysis. These results were used to gain resources to support collaborative diabetes health system initiatives and to successfully obtain federal research funding to develop and pilot a pragmatic diabetes educational intervention. At a health system level, the summary format of this gap analysis tool is an effective method to clearly identify disparities in care to focus efforts and resources to improve care delivery. Copyright © 2016 The Joint Commission. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Criollo, Rotman; Velasco, Violeta; Vázquez-Suñé, Enric; Nardi, Albert; Marazuela, Miguel A.; Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura; Cannata, Massimiliano; De Filippis, Giovanna
2017-04-01
Due to the general increase of water scarcity (Steduto et al., 2012), water quantity and quality must be well known to ensure proper access to water resources in compliance with local and regional directives. This can be supported by tools that facilitate the process of data management and analysis. Such analyses have to provide researchers, professionals, policy makers and users with the ability to improve the management of water resources within standard regulatory guidelines. Compliance with the established standard regulatory guidelines (with a special focus on requirements deriving from the GWD) requires effective monitoring, evaluation, and interpretation of a large number of physical and chemical parameters. These datasets have to be assessed and interpreted by: (i) integrating data from different sources, gathered with different data access techniques and formats; (ii) managing data with varying temporal and spatial extent; and (iii) integrating groundwater quality information with other relevant information such as further hydrogeological data (Velasco et al., 2014), and pre-processing these data, generally for the realization of groundwater models. In this context, the Hydrochemical Analysis Tools, akvaGIS Tools, have been implemented within the H2020 FREEWAT project, which aims to manage water resources by modelling water resource management in an open source GIS platform (QGIS desktop). The main goal of the akvaGIS Tools is to improve water quality analysis through a geospatial database (implemented in SpatiaLite) that manages all data related to the case study conceptual model, and a set of tools for improving the harmonization, integration, standardization, visualization and interpretation of hydrochemical data. To achieve this, the commands cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data and facilitate the pre-processing analysis for use in groundwater modelling. They include ionic balance calculations, chemical time-series analysis, correlation of chemical parameters, and calculation of various common hydrochemical diagrams (salinity, Schoeller-Berkaloff, Piper, and Stiff), among others. Furthermore, the tools allow the generation of maps of the spatial distribution of parameters and diagrams, and thematic maps of the parameters measured and classified in the queried area. References: Rossetto R., Borsi I., Schifani C., Bonari E., Mogorovich P., Primicerio M. (2013). SID&GRID: Integrating hydrological modeling in GIS environment. Rendiconti Online Societa Geologica Italiana, Vol. 24, 282-283. Steduto, P., Faurès, J.M., Hoogeveen, J., Winpenny, J.T., Burke, J.J. (2012). Coping with water scarcity: an action framework for agriculture and food security. ISSN 1020-1203; 38. Velasco, V., Tubau, I., Vázquez-Suñé, E., Gogu, R., Gaitanaru, D., Alcaraz, M., Sanchez-Vila, X. (2014). GIS-based hydrogeochemical analysis tools (QUIMET). Computers & Geosciences, 70, 164-180.
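Among the listed capabilities, the ionic (charge) balance calculation is a standard hydrochemical check: total cation charge is compared with total anion charge in meq/L. A minimal sketch with approximate conversion factors follows; it is illustrative only and is not akvaGIS source code.

# Minimal charge-balance (ionic balance) error calculation for a water sample.
# Concentrations are converted from mg/L to meq/L using charge/molar-mass factors.
# Illustrative only; not akvaGIS source code.

MEQ_PER_MG = {          # charge / molar mass (approximate values)
    "Ca": 2 / 40.08, "Mg": 2 / 24.31, "Na": 1 / 22.99, "K": 1 / 39.10,
    "HCO3": 1 / 61.02, "SO4": 2 / 96.06, "Cl": 1 / 35.45, "NO3": 1 / 62.00,
}
CATIONS = {"Ca", "Mg", "Na", "K"}

def charge_balance_error(sample_mg_per_l):
    """Return the charge-balance error in percent for a dict of mg/L concentrations."""
    cations = sum(v * MEQ_PER_MG[k] for k, v in sample_mg_per_l.items() if k in CATIONS)
    anions = sum(v * MEQ_PER_MG[k] for k, v in sample_mg_per_l.items() if k not in CATIONS)
    return 100.0 * (cations - anions) / (cations + anions)

sample = {"Ca": 80, "Mg": 24, "Na": 46, "K": 4, "HCO3": 244, "SO4": 96, "Cl": 71, "NO3": 6}
print(f"CBE = {charge_balance_error(sample):.1f}%")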
BiGG Models: A platform for integrating, standardizing and sharing genome-scale models
King, Zachary A.; Lu, Justin; Drager, Andreas; ...
2015-10-17
In this study, genome-scale metabolic models are mathematically structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data.
BiGG Models: A platform for integrating, standardizing and sharing genome-scale models
King, Zachary A.; Lu, Justin; Dräger, Andreas; Miller, Philip; Federowicz, Stephen; Lerman, Joshua A.; Ebrahim, Ali; Palsson, Bernhard O.; Lewis, Nathan E.
2016-01-01
Genome-scale metabolic models are mathematically-structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data. PMID:26476456
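The abstract mentions a comprehensive application programming interface for accessing BiGG Models. A sketch of querying such an API over HTTP is shown below; the endpoint path "/api/v2/models/<model_id>" is an assumption based on the described interface and the site's base URL, so it should be checked against the current API documentation before use.

# Sketch of querying the BiGG Models web API for a model's metadata.
# The endpoint path ("/api/v2/models/<model_id>") is an assumption based on the
# described programming interface; consult the live API documentation to confirm.
import json
import urllib.request

def fetch_model_summary(model_id, base_url="http://bigg.ucsd.edu/api/v2"):
    with urllib.request.urlopen(f"{base_url}/models/{model_id}") as response:
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    summary = fetch_model_summary("e_coli_core")
    print(summary.get("organism"), "-", summary.get("metabolite_count"), "metabolites")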
Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W
2018-04-12
RNA sequencing has become a ubiquitous technology used throughout life sciences as an effective method of measuring RNA abundance quantitatively in tissues and cells. The increase in use of RNA-seq technology has led to the continuous development of new tools for every step of analysis, from alignment to downstream pathway analysis. However, effectively using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake, we have developed a user-friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand its capabilities. This capacity has already been exploited to include very recently developed tools that explore immune infiltrate and T-cell CDR (Complementarity-Determining Regions) reconstruction abilities. The pipeline has been conveniently packaged such that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively with a built-in capacity for customization and expansion.
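VIPER is built on Snakemake, whose workflows are written in a Python-based rule syntax. A minimal rule of the kind such a pipeline might contain is sketched below; the rule name, file paths, and aligner invocation are hypothetical examples and are not taken from the VIPER source.

# Minimal Snakemake rule sketch (Snakemake uses a Python-based rule syntax).
# Rule name, paths, and the aligner command are hypothetical examples, not VIPER code.
rule align_reads:
    input:
        r1="fastq/{sample}_R1.fastq.gz",
        r2="fastq/{sample}_R2.fastq.gz",
    output:
        bam="analysis/{sample}/Aligned.sortedByCoord.out.bam",
    threads: 8
    shell:
        "STAR --runThreadN {threads} --genomeDir star_index "
        "--readFilesIn {input.r1} {input.r2} --readFilesCommand zcat "
        "--outSAMtype BAM SortedByCoordinate "
        "--outFileNamePrefix analysis/{wildcards.sample}/"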
Problem Reporting Taxonomy and Data Preparation Tool Evaluation
NASA Technical Reports Server (NTRS)
Beil, Robert J.
2010-01-01
A member of the NASA Engineering and Safety Center (NESC) Systems Engineering Office (SEO) Technical Discipline Team (TDT) requested an SEO-managed activity to perform a gap analysis on the proposed NASA Standard 0006, "Common NASA Taxonomy for Problem Reporting, Analysis, and Resolution", and to create an input filter and set of instructions for using the data-mining/data-cleansing tool TechOasis with Space Shuttle Program (SSP) problem reporting data. The work that achieved these objectives and the deployment of TechOasis are discussed in this report.
Multispectral analysis tools can increase utility of RGB color images in histology
NASA Astrophysics Data System (ADS)
Fereidouni, Farzad; Griffin, Croix; Todd, Austin; Levenson, Richard
2018-04-01
Multispectral imaging (MSI) is increasingly finding application in the study and characterization of biological specimens. However, the methods typically used come with challenges on both the acquisition and the analysis front. MSI can be slow and photon-inefficient, leading to long imaging times and possible phototoxicity and photobleaching. The resulting datasets can be large and complex, prompting the development of a number of mathematical approaches for segmentation and signal unmixing. We show that under certain circumstances, just three spectral channels provided by standard color cameras, coupled with multispectral analysis tools, including a more recent spectral phasor approach, can efficiently provide useful insights. These findings are supported by a mathematical model relating spectral bandwidth and spectral channel number to achievable spectral accuracy. The utility of 3-band RGB and MSI analysis tools is demonstrated on images acquired using brightfield and fluorescence techniques, as well as a novel microscopy approach employing UV surface excitation. Supervised linear unmixing, automated non-negative matrix factorization and phasor analysis tools all provide useful results, with phasors generating particularly helpful spectral display plots for sample exploration.
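The spectral phasor approach mentioned above maps each pixel's spectrum to the first Fourier harmonic of its intensity profile, giving two coordinates (G, S) per pixel. With an RGB image the "spectrum" has only three channels, but the same transform applies. A minimal sketch follows (illustrative only, not the authors' implementation):

# Minimal spectral phasor sketch for an RGB image: each pixel's 3-channel
# "spectrum" is mapped to the real (G) and imaginary (S) parts of its first
# Fourier harmonic. Illustrative only; not the authors' implementation.
import numpy as np

def spectral_phasor(image):
    """image: H x W x C array of intensities; returns (G, S) phasor coordinates."""
    image = np.asarray(image, dtype=float)
    channels = image.shape[-1]
    k = np.arange(channels)
    total = image.sum(axis=-1) + 1e-12          # avoid division by zero
    g = (image * np.cos(2 * np.pi * k / channels)).sum(axis=-1) / total
    s = (image * np.sin(2 * np.pi * k / channels)).sum(axis=-1) / total
    return g, s

rgb = np.random.default_rng(1).uniform(size=(4, 4, 3))
g, s = spectral_phasor(rgb)
print(g.shape, s.shape)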
James T. Peterson; Sherry P. Wollrab
1999-01-01
Natural resource managers in the Inland Northwest need tools for assessing the success or failure of conservation policies and the impacts of management actions on fish and fish habitats. Effectiveness monitoring is one such potential tool, but there are currently no established monitoring protocols. Since 1991, U.S. Forest Service biologists have used the standardized...
Application of Risk Assessment Tools in the Continuous Risk Management (CRM) Process
NASA Technical Reports Server (NTRS)
Ray, Paul S.
2002-01-01
Marshall Space Flight Center (MSFC) of the National Aeronautics and Space Administration (NASA) is currently implementing the Continuous Risk Management (CRM) Program developed by Carnegie Mellon University and recommended by NASA as the Risk Management (RM) implementation approach. The four most frequently used risk assessment tools in the center are Failure Modes and Effects Analysis (FMEA), Hazard Analysis (HA), Fault Tree Analysis (FTA), and Probabilistic Risk Analysis (PRA). There are some guidelines for selecting the type of risk assessment tool during the project formulation phase, but there is not enough guidance on how to apply these tools in the CRM process. Yet the way safety and risk assessment tools are used makes a significant difference to the effectiveness of the risk management function. Decisions about which events are to be included in the analysis and to what level of detail the analysis should be carried make a significant difference in the effectiveness of a risk management program. The appropriate risk analysis tool also depends on the phase of a project; for example, at the initial phase of a project, when not much data are available on hardware, a standard FMEA cannot be applied, and a functional FMEA may be appropriate instead. This study attempted to provide some directives to alleviate the difficulty in applying FTA, PRA, and FMEA in the CRM process. Hazard Analysis was not included in the scope of the study due to the short duration of the summer research project.
ERIC Educational Resources Information Center
Marie, S. Maria Josephine Arokia; Edannur, Sreekala
2015-01-01
This paper focused on the analysis of test items constructed in the paper of teaching Physical Science for B.Ed. class. It involved the analysis of difficulty level and discrimination power of each test item. Item analysis allows selecting or omitting items from the test, but more importantly item analysis is a tool to help the item writer improve…
Web-based analysis and publication of flow cytometry experiments.
Kotecha, Nikesh; Krutzik, Peter O; Irish, Jonathan M
2010-07-01
Cytobank is a Web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a Web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permission, from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at http://www.cytobank.org. (c) 2010 by John Wiley & Sons, Inc.
Web-Based Analysis and Publication of Flow Cytometry Experiments
Kotecha, Nikesh; Krutzik, Peter O.; Irish, Jonathan M.
2014-01-01
Cytobank is a web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permissions from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at www.cytobank.org PMID:20578106
User Instructions for the Policy Analysis Modeling System (PAMS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNeil, Michael A.; Letschert, Virginie E.; Van Buskirk, Robert D.
PAMS uses country-specific and product-specific data to calculate estimates of impacts of a Minimum Efficiency Performance Standard (MEPS) program. The analysis tool is self-contained in a Microsoft Excel spreadsheet, and requires no links to external data, or special code additions to run. The analysis can be customized to a particular program without additional user input, through the use of the pull-down menus located on the Summary page. In addition, the spreadsheet contains many areas into which user-generated input data can be entered for increased accuracy of projection. The following is a step-by-step guide for using and customizing the tool.
Tools and techniques for developing policies for complex and uncertain systems.
Bankes, Steven C
2002-05-14
Agent-based models (ABM) are examples of complex adaptive systems, which can be characterized as those systems for which no model less complex than the system itself can accurately predict in detail how the system will behave at future times. Consequently, the standard tools of policy analysis, based as they are on devising policies that perform well on some best estimate model of the system, cannot be reliably used for ABM. This paper argues that policy analysis by using ABM requires an alternative approach to decision theory. The general characteristics of such an approach are described, and examples are provided of its application to policy analysis.
Measuring the Air Quality and Transportation Impacts of Infill Development
This report summarizes three case studies. The analysis shows how standard forecasting tools can be modified to capture at least some of the transportation and air quality benefits of brownfield and infill development.
Marcatto, Francesco; D'Errico, Giuseppe; Di Blas, Lisa; Ferrante, Donatella
2011-01-01
The aim of this paper is to present a preliminary validation of an Italian adaptation of the HSE Management Standards Work-Related Stress Indicator Tool (IT), an instrument for assessing work-related stress at the organizational level, originally developed in Britain by the Health and Safety Executive. A scale that assesses the physical work environment has been added to the original version of the IT. 190 employees of the University of Trieste were enrolled in the study. A confirmatory analysis showed a satisfactory fit of the eight-factor structure of the instrument. Further psychometric analysis showed adequate internal consistency of the IT scales and good criterion validity, as evidenced by the correlations with self-perception of stress, work satisfaction and motivation. In conclusion, the Indicator Tool proved to be a valid and reliable instrument for the assessment of work-related stress at the organizational level, and it is also compatible with the instructions provided by the Ministry of Labour and Social Policy (Circular letter 18/11/2010).
Application handbook for a Standardized Control Module (SCM) for DC-DC converters, volume 1
NASA Astrophysics Data System (ADS)
Lee, F. C.; Mahmoud, M. F.; Yu, Y.
1980-04-01
The standardized control module (SCM) was developed for application in the buck, boost and buck/boost DC-DC converters. The SCM used multiple feedback loops to provide improved input line and output load regulation, a stable feedback control system, good dynamic transient response and adaptive compensation of the control loop for changes in open loop gain and output filter time constants. The necessary modeling and analysis tools to aid the design engineer in the application of the SCM to DC-DC converters were developed. The SCM functional block diagram and the different analysis techniques were examined. The average time domain analysis technique was chosen as the basic analytical tool. The power stage transfer functions were developed for the buck, boost and buck/boost converters. The analog signal and digital signal processor transfer functions were developed for the three DC-DC converter types using the constant on-time, constant off-time and constant frequency control laws.
Application handbook for a Standardized Control Module (SCM) for DC-DC converters, volume 1
NASA Technical Reports Server (NTRS)
Lee, F. C.; Mahmoud, M. F.; Yu, Y.
1980-01-01
The standardized control module (SCM) was developed for application in the buck, boost and buck/boost DC-DC converters. The SCM used multiple feedback loops to provide improved input line and output load regulation, a stable feedback control system, good dynamic transient response and adaptive compensation of the control loop for changes in open loop gain and output filter time constants. The necessary modeling and analysis tools to aid the design engineer in the application of the SCM to DC-DC converters were developed. The SCM functional block diagram and the different analysis techniques were examined. The average time domain analysis technique was chosen as the basic analytical tool. The power stage transfer functions were developed for the buck, boost and buck/boost converters. The analog signal and digital signal processor transfer functions were developed for the three DC-DC converter types using the constant on-time, constant off-time and constant frequency control laws.
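For orientation, the small-signal duty-cycle-to-output transfer function of the ideal buck power stage in continuous conduction, one member of the family of power stage transfer functions such a handbook develops, is commonly written in the textbook form below (a generic expression, not quoted from the handbook):

\frac{\hat{v}_o(s)}{\hat{d}(s)} = \frac{V_{in}}{1 + s\,\frac{L}{R} + s^{2} L C}

where L and C are the output filter inductance and capacitance and R is the load resistance; parasitic resistances are neglected.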
Toward a Responsibility-Catering Prioritarian Ethical Theory of Risk.
Wikman-Svahn, Per; Lindblom, Lars
2018-03-05
Standard tools used in societal risk management such as probabilistic risk analysis or cost-benefit analysis typically define risks in terms of only probabilities and consequences and assume a utilitarian approach to ethics that aims to maximize expected utility. The philosopher Carl F. Cranor has argued against this view by devising a list of plausible aspects of the acceptability of risks that points towards a non-consequentialist ethical theory of societal risk management. This paper revisits Cranor's list to argue that the alternative ethical theory responsibility-catering prioritarianism can accommodate the aspects identified by Cranor and that the elements in the list can be used to inform the details of how to view risks within this theory. An approach towards operationalizing the theory is proposed based on a prioritarian social welfare function that operates on responsibility-adjusted utilities. A responsibility-catering prioritarian ethical approach towards managing risks is a promising alternative to standard tools such as cost-benefit analysis.
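To make the contrast concrete: standard expected-utility analysis ranks options by E[U] = \sum_i p_i u_i, whereas a prioritarian social welfare function applies a strictly concave transform to individual (here, responsibility-adjusted) utilities before aggregating, so that gains to the worse off count for more. A schematic form, illustrating the general prioritarian structure rather than the authors' specific formalization, is

W = \sum_i g\left(u_i^{R}\right), \qquad g' > 0, \; g'' < 0,

where u_i^{R} denotes the responsibility-adjusted utility of individual i.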
Fulton, James L.
1992-01-01
Spatial data analysis has become an integral component in many surface and sub-surface hydrologic investigations within the U.S. Geological Survey (USGS). Currently, one of the largest costs in applying spatial data analysis is the cost of developing the needed spatial data. Therefore, guidelines and standards are required for the development of spatial data in order to allow for data sharing and reuse; this eliminates costly redevelopment. In order to attain this goal, the USGS is expanding efforts to identify guidelines and standards for the development of spatial data for hydrologic analysis. Because of the variety of project and database needs, the USGS has concentrated on developing standards for documenting spatial data sets to aid in the assessment of data set quality and compatibility of different data sets. An interim data set documentation standard (1990) has been developed that provides a mechanism for associating a wide variety of information with a data set, including data about source material, data automation and editing procedures used, projection parameters, data statistics, descriptions of features and feature attributes, information on organizational contacts, lists of operations performed on the data, and free-form comments and notes about the data made at various times in the evolution of the data set. The interim data set documentation standard has been automated using a commercial geographic information system (GIS) and data set documentation software developed by the USGS. Where possible, USGS-developed software is used to enter data into the data set documentation file automatically. The GIS software closely associates a data set with its data set documentation file; the documentation file is retained with the data set whenever it is modified, copied, or transferred to another computer system. The Water Resources Division of the USGS is continuing to develop spatial data and data processing standards, with emphasis on standards needed to support hydrologic analysis, hydrologic data processing, and publication of hydrologic thematic maps. There is a need for the GIS vendor community to develop data set documentation tools similar to those developed by the USGS, or to incorporate USGS-developed tools in their software.
Geophysical data analysis and visualization using the Grid Analysis and Display System
NASA Technical Reports Server (NTRS)
Doty, Brian E.; Kinter, James L., III
1995-01-01
Several problems posed by the rapidly growing volume of geophysical data are described, and a selected set of existing solutions to these problems is outlined. A recently developed desktop software tool called the Grid Analysis and Display System (GrADS) is presented. The GrADS user interface is a natural extension of the standard procedures scientists apply to their geophysical data analysis problems. The basic GrADS operations have defaults that naturally map to data analysis actions, and there is a programmable interface for customizing data access and manipulation. The fundamental concept of the GrADS dimension environment, which defines both the space in which the geophysical data reside and the 'slice' of data being analyzed at a given time, is expressed. The GrADS data storage and access model is described. An argument is made in favor of describable data formats rather than standard data formats. The manner in which GrADS users may perform operations on their data and display the results is also described. It is argued that two-dimensional graphics provides a powerful quantitative data analysis tool whose value is underestimated in the current development environment, which emphasizes three-dimensional structure modeling.
SECIMTools: a suite of metabolomics data analysis tools.
Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M
2018-04-20
Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open-access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares - discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
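Principal component analysis is one of the visualization techniques listed. A minimal sketch of computing PCA scores for a samples-by-features metabolomics table with plain numpy is shown below; it is illustrative only and is not the SECIMTools implementation.

# Minimal PCA sketch for a samples-by-features metabolomics matrix using numpy only.
# Illustrative; not the SECIMTools implementation.
import numpy as np

def pca_scores(matrix, n_components=2):
    """Return sample scores on the first n_components principal components."""
    x = np.asarray(matrix, dtype=float)
    x = x - x.mean(axis=0)                      # center each feature
    u, s, _ = np.linalg.svd(x, full_matrices=False)
    return u[:, :n_components] * s[:n_components]

features = np.random.default_rng(2).normal(size=(12, 200))  # 12 samples, 200 features
print(pca_scores(features).round(2))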
Interactive Digital Signal Processor
NASA Technical Reports Server (NTRS)
Mish, W. H.
1985-01-01
The Interactive Digital Signal Processor, IDSP, consists of a set of time series analysis "operators" based on various algorithms commonly used for digital signal analysis. Processing of a digital signal time series to extract information is usually achieved by the application of a number of fairly standard operations. IDSP is an excellent teaching tool for demonstrating the application of time series operators to artificially generated signals.
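A typical operator of the kind described, a discrete Fourier transform used to recover the dominant frequency of a noisy synthetic signal, can be illustrated in a few lines (a generic example, not IDSP code):

# Illustrative time-series operator: FFT of a synthetic signal to recover its
# dominant frequency. This is a generic example, not IDSP code.
import numpy as np

fs = 100.0                                   # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
signal = np.sin(2 * np.pi * 3.0 * t) + 0.5 * np.random.default_rng(3).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
print(f"dominant frequency: {freqs[spectrum.argmax()]:.2f} Hz")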
A Laboratory Exercise Illustrating the Sensitivity and Specificity of Western Blot Analysis
ERIC Educational Resources Information Center
Chang, Ming-Mei; Lovett, Janice
2011-01-01
Western blot analysis, commonly known as "Western blotting," is a standard tool in every laboratory where proteins are analyzed. It involves the separation of polypeptides in polyacrylamide gels followed by the electrophoretic transfer of the separated polypeptides onto a nitrocellulose or polyvinylidene fluoride membrane. A replica of the…
Bogren, Malin; Sathyanarayanan Doraiswamy; Erlandsson, Kerstin; Akhter, Halima; Akter, Dalia; Begum, Momtaz; Chowdhury, Merry; Das, Lucky; Akter, Rehana; Begum, Sufia; Akter, Renoara; Yesmin, Syeada; Khatun, Yamin Ara
2018-06-01
Using the International Confederation of Midwives (ICM) Global Standards for Midwifery Education as a conceptual framework, the aim of this study was to explore and describe important 'must haves' for inclusion in a context-specific accreditation assessment tool in Bangladesh. A questionnaire study was conducted using a Likert rating scale and 111 closed-response single items on adherence to accreditation-related statements, ending with an open-ended question. The ICM Global Standards guided data collection, deductive content analysis and description of the quantitative results. Twenty-five public institutes/colleges (out of 38 in Bangladesh), covering seven out of eight geographical divisions in the country. One hundred and twenty-three nursing educators teaching the 3-year diploma midwifery education programme. This study provides insight into the development of a context-specific accreditation assessment tool for Bangladesh. Important components to be included in this accreditation tool are presented under the following categories and domains: 'organization and administration', 'midwifery faculty', 'student body', 'curriculum content', 'resources, facilities and services' and 'assessment strategies'. The identified components were a prerequisite to ensure that midwifery students achieve the intended learning outcomes of the midwifery curriculum, and hence contribute to a strong midwifery workforce. The components further ensure well-prepared teachers and a standardized curriculum supported at policy level to enable effective deployment of professional midwives in the existing health system. As part of developing an accreditation assessment tool, it is imperative to build ownership and capacity when translating the ICM Global Standards for Midwifery Education into the national context. This initiative can be used as lessons learned from Bangladesh to develop a context-specific accreditation assessment tool in line with national priorities, supporting the development of national policies. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ruby, Michael
In the last decades, scanning probe microscopy and spectroscopy have become well-established tools in nanotechnology and surface science. This opened the market for many commercial manufacturers, each with different hardware and software standards. Besides the advantage of a wide variety of available hardware, this diversity may, on the software side, complicate data exchange between scientists and data analysis for groups working with hardware developed by different manufacturers. Not only the file format differs between manufacturers; the data also often require further numerical treatment before publication. SpectraFox is an open-source and independent tool which manages, processes, and evaluates scanning probe spectroscopy and microscopy data. It aims at simplifying documentation in parallel with measurement, and it provides solid evaluation tools for a large number of data sets.
Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.
2012-01-01
New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616
PICASSO: an end-to-end image simulation tool for space and airborne imaging systems
NASA Astrophysics Data System (ADS)
Cota, Steve A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Chris J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Willkinson, Timothy S.
2008-08-01
The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
PICASSO: an end-to-end image simulation tool for space and airborne imaging systems
NASA Astrophysics Data System (ADS)
Cota, Stephen A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Christopher J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.
2010-06-01
The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
In Search of a Time Efficient Approach to Crack and Delamination Growth Predictions in Composites
NASA Technical Reports Server (NTRS)
Krueger, Ronald; Carvalho, Nelson
2016-01-01
Analysis benchmarking was used to assess the accuracy and time efficiency of algorithms suitable for automated delamination growth analysis. First, the Floating Node Method (FNM) was introduced and its combination with a simple exponential growth law (Paris Law) and Virtual Crack Closure technique (VCCT) was discussed. Implementation of the method into a user element (UEL) in Abaqus/Standard® was also presented. For the assessment of growth prediction capabilities, an existing benchmark case based on the Double Cantilever Beam (DCB) specimen was briefly summarized. Additionally, the development of new benchmark cases based on the Mixed-Mode Bending (MMB) specimen to assess the growth prediction capabilities under mixed-mode I/II conditions was discussed in detail. A comparison was presented, in which the benchmark cases were used to assess the existing low-cycle fatigue analysis tool in Abaqus/Standard® in comparison to the FNM-VCCT fatigue growth analysis implementation. The low-cycle fatigue analysis tool in Abaqus/Standard® was able to yield results that were in good agreement with the DCB benchmark example. Results for the MMB benchmark cases, however, only captured the trend correctly. The user element (FNM-VCCT) always yielded results that were in excellent agreement with all benchmark cases, at a fraction of the analysis time. The ability to assess the implementation of two methods in one finite element code illustrated the value of establishing benchmark solutions.
Tissue proteomics: a new investigative tool for renal biopsy analysis.
Sedor, John R
2009-05-01
Renal biopsy is viewed as the gold standard for diagnosis and management of many kidney diseases, especially glomerulopathies. However, the histopathological descriptions currently used in clinical practice often are neither diagnostic nor prognostic. The paper by Sethi et al. highlights the availability of a newer investigative tool that can be used to better define pathogenesis and, perhaps more important, to discover robust biomarkers of kidney disease cause and outcome.
ERIC Educational Resources Information Center
Kriston, Levente; Melchior, Hanne; Hergert, Anika; Bergelt, Corinna; Watzke, Birgit; Schulz, Holger; von Wolff, Alessa
2011-01-01
The aim of our study was to develop a graphical tool that can be used in addition to standard statistical criteria to support decisions on the number of classes in explorative categorical latent variable modeling for rehabilitation research. Data from two rehabilitation research projects were used. In the first study, a latent profile analysis was…
Setting the standards for signal transduction research.
Saez-Rodriguez, Julio; Alexopoulos, Leonidas G; Stolovitzky, Gustavo
2011-02-15
Major advances in high-throughput technology platforms, coupled with increasingly sophisticated computational methods for systematic data analysis, have provided scientists with tools to better understand the complexity of signaling networks. In this era of massive and diverse data collection, standardization efforts that streamline data gathering, analysis, storage, and sharing are becoming a necessity. Here, we give an overview of current technologies to study signal transduction. We argue that along with the opportunities the new technologies open, their heterogeneous nature poses critical challenges for data handling that are further increased when data are to be integrated into mathematical models. Efficient standardization through markup languages and data annotation is a sine qua non condition for a systems-level analysis of signaling processes. It remains to be seen to what extent, and how quickly, the emerging standardization efforts will be embraced by the signaling community.
NASA Astrophysics Data System (ADS)
Alfarra, M. R.; Coe, H.; Allan, J. D.; Bower, K. N.; Garforth, A. A.; Canagaratna, M.; Worsnop, D.
The aerosol mass spectrometer (AMS) is a quantitative instrument designed to deliver real-time, size-resolved chemical composition of the volatile and semi-volatile aerosol fractions. The AMS response to a wide range of organic compounds has been experimentally characterized, and has been shown to compare well with standard libraries of 70 eV electron impact ionization mass spectra. These results will be presented. Due to the scanning nature of the quadrupole mass spectrometer, the AMS provides the averaged composition of an ensemble of particles rather than single-particle composition. However, the mass spectra measured by the AMS are reproducible and similar to those of standard libraries, so analysis tools can be developed on large mass spectral libraries that can provide chemical composition information about the type of organic compounds in the aerosol. One such tool is presented and compared with laboratory measurements of single-species and mixed-component organic particles by the AMS. We will then discuss the applicability of these tools to interpreting field AMS data obtained in a range of experiments at different sites in the UK and Canada. The data will be combined with other measurements to show the behaviour of the organic aerosol fraction in urban and sub-urban environments.
Patient Safety—Incorporating Drawing Software into Root Cause Analysis Software
Williams, Linda; Grayson, Diana; Gosbee, John
2001-01-01
Drawing software from Lassalle Technologies (France) designed for Visual Basic is the tool we used to standardize the creation, storage, and retrieval of flow diagrams containing information about adverse events and close calls.
Patient Safety—Incorporating Drawing Software into Root Cause Analysis Software
Williams, Linda; Grayson, Diana; Gosbee, John
2002-01-01
Drawing software from Lassalle Technologies (France) designed for Visual Basic is the tool we used to standardize the creation, storage, and retrieval of flow diagrams containing information about adverse events and close calls.
Using a formal requirements management tool for system engineering: first results at ESO
NASA Astrophysics Data System (ADS)
Zamparelli, Michele
2006-06-01
Attention to proper requirements analysis and maintenance is growing in modern astronomical undertakings. The increasing degree of complexity that current and future generations of projects have reached requires substantial system engineering efforts and the usage of all available technology to keep project development under control. One such technology is a tool which helps manage relationships between deliverables at various development stages, and across functional subsystems and disciplines as different as software, mechanics, optics and electronics. The immediate benefits are traceability and the possibility of performing impact analysis. An industrially proven tool for requirements management is presented, together with the first results across some projects at ESO and a cost/benefit analysis of its usage. Experience gathered so far shows that the extensibility and configurability of the tool on the one hand, and its integration with common documentation formats and standards on the other, make it a promising solution even for small-scale system development.
Fagerlind Ståhl, Anna-Carin; Gustavsson, Maria; Karlsson, Nadine; Johansson, Gun; Ekberg, Kerstin
2015-03-01
The effect of lean production on conditions for learning is debated. This study aimed to investigate how tools inspired by lean production (standardization, resource reduction, visual monitoring, housekeeping, value flow analysis) were associated with an innovative learning climate and with collective dispersion of ideas in organizations, and whether decision latitude contributed to these associations. A questionnaire was sent out to employees in public, private, production and service organizations (n = 4442). Multilevel linear regression analyses were used. Use of lean tools and decision latitude were positively associated with an innovative learning climate and collective dispersion of ideas. A low degree of decision latitude was a modifier in the association with collective dispersion of ideas. Lean tools can enable shared understanding and collective spreading of ideas, needed for the development of work processes, especially when decision latitude is low. Value flow analysis played a pivotal role in the associations. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories.
McGibbon, Robert T; Beauchamp, Kyle A; Harrigan, Matthew P; Klein, Christoph; Swails, Jason M; Hernández, Carlos X; Schwantes, Christian R; Wang, Lee-Ping; Lane, Thomas J; Pande, Vijay S
2015-10-20
As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
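A minimal sketch of the kind of analysis described above, using the MDTraj Python API; the trajectory and topology filenames are hypothetical placeholders.

```python
# Load a trajectory, compute RMSD to the first frame, and assign secondary structure.
import mdtraj as md

traj = md.load("trajectory.dcd", top="topology.pdb")  # many common formats are supported
rmsd = md.rmsd(traj, traj, frame=0)                   # per-frame RMSD to frame 0, in nm
dssp = md.compute_dssp(traj)                          # per-residue secondary structure codes
print(rmsd.mean(), dssp[0][:10])
```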
MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories
McGibbon, Robert T.; Beauchamp, Kyle A.; Harrigan, Matthew P.; Klein, Christoph; Swails, Jason M.; Hernández, Carlos X.; Schwantes, Christian R.; Wang, Lee-Ping; Lane, Thomas J.; Pande, Vijay S.
2015-01-01
As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. PMID:26488642
Development of an Acoustic Signal Analysis Tool “Auto-F” Based on the Temperament Scale
NASA Astrophysics Data System (ADS)
Modegi, Toshio
The MIDI interface was originally designed for electronic musical instruments, but we consider that this music-note-based coding concept can be extended to general acoustic signal description. We proposed applying MIDI technology to the coding of bio-medical auscultation sound signals, such as heart sounds, for retrieving medical records and performing telemedicine. We have since tried to extend our encoding targets to include vocal sounds, natural sounds and electronic bio-signals such as ECG, using the Generalized Harmonic Analysis method. Currently, we are trying to separate vocal sounds included in popular songs and encode both the vocal sounds and the background instrumental sounds into separate MIDI channels. We are also trying to extract articulation parameters, such as MIDI pitch-bend parameters, in order to reproduce natural acoustic sounds using a GM-standard MIDI tone generator. In this paper, we present the overall algorithm of our acoustic signal analysis tool, based on this research, which can analyze given time-based signals on the musical temperament scale. The prominent feature of this tool is that it produces high-precision MIDI codes, which reproduce signals similar to the given source signal on a GM-standard MIDI tone generator, and also provides the analysis results as text in XML format.
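As a worked illustration of mapping an arbitrary frequency onto the equal-temperament MIDI scale with a residual that could be encoded as a pitch-bend correction (the standard conversion, not necessarily the tool's exact algorithm):

```python
# Convert a frequency to the nearest MIDI note plus a residual in cents.
# Equal-temperament relation: note = 69 + 12*log2(f / 440 Hz).
import math

def freq_to_midi(freq_hz):
    exact = 69.0 + 12.0 * math.log2(freq_hz / 440.0)
    note = int(round(exact))
    cents = 100.0 * (exact - note)   # residual, encodable via MIDI pitch bend
    return note, cents

print(freq_to_midi(261.63))  # middle C -> (60, ~0 cents)
```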
Automating linear accelerator quality assurance.
Eckhause, Tobias; Al-Hallaq, Hania; Ritter, Timothy; DeMarco, John; Farrey, Karl; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Perez, Mario; Park, SungYong; Booth, Jeremy T; Thorwarth, Ryan; Moran, Jean M
2015-10-01
The purpose of this study was twofold. The first purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second purpose was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish a reference for other centers. The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac, including jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The standard deviation in MLC position, as determined by EPID measurements, across the consortium was 0.33 mm for IMRT fields. With respect to the log files, the deviations between expected and actual positions for parameters were small (<0.12 mm) for all Linacs. Considering both log files and EPID measurements, all parameters were well within published tolerance values. Variations in collimator angle, MLC position, and gantry sag were also evaluated for all Linacs. The performance of the TrueBeam Linac model was shown to be consistent based on automated analysis of trajectory log files and EPID images acquired during delivery of a standardized test suite. The results can be compared directly to tolerance thresholds. In addition, sharing of results from standard tests across institutions can facilitate the identification of QA process and Linac changes. These reference values are presented along with the standard deviation for common tests so that the test suite can be used by other centers to evaluate their Linac performance against those in this consortium.
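A hedged sketch of the log-file check described above: comparing expected and actual component positions and flagging deviations that exceed a tolerance. The data structure and the 1 mm threshold are illustrative assumptions, not the consortium's actual file format or published tolerances.

```python
# Flag MLC leaf deviations that exceed an illustrative tolerance threshold.
expected_mm = [10.0, 12.5, 15.0]
actual_mm   = [10.02, 12.48, 15.9]
TOLERANCE_MM = 1.0

for leaf, (exp, act) in enumerate(zip(expected_mm, actual_mm)):
    deviation = abs(act - exp)
    status = "FLAG" if deviation > TOLERANCE_MM else "ok"
    print(f"leaf {leaf}: deviation {deviation:.2f} mm [{status}]")
```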
ExAtlas: An interactive online tool for meta-analysis of gene expression data.
Sharov, Alexei A; Schlessinger, David; Ko, Minoru S H
2015-12-01
We have developed ExAtlas, an on-line software tool for meta-analysis and visualization of gene expression data. In contrast to existing software tools, ExAtlas compares multi-component data sets and generates results for all combinations (e.g. all gene expression profiles versus all Gene Ontology annotations). ExAtlas handles both users' own data and data extracted semi-automatically from the public repository (GEO/NCBI database). ExAtlas provides a variety of tools for meta-analyses: (1) standard meta-analysis (fixed effects, random effects, z-score, and Fisher's methods); (2) analyses of global correlations between gene expression data sets; (3) gene set enrichment; (4) gene set overlap; (5) gene association by expression profile; (6) gene specificity; and (7) statistical analysis (ANOVA, pairwise comparison, and PCA). ExAtlas produces graphical outputs, including heatmaps, scatter-plots, bar-charts, and three-dimensional images. Some of the most widely used public data sets (e.g. GNF/BioGPS, Gene Ontology, KEGG, GAD phenotypes, BrainScan, ENCODE ChIP-seq, and protein-protein interaction) are pre-loaded and can be used for functional annotations.
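As an illustration of one of the meta-analysis options listed above (Fisher's method for combining p-values from independent studies); the p-values below are invented for the example.

```python
# Combine p-values from independent studies with Fisher's method.
from scipy.stats import combine_pvalues

pvals = [0.04, 0.20, 0.01, 0.08]
stat, p_combined = combine_pvalues(pvals, method="fisher")
print(stat, p_combined)
```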
Modeling and Simulation Tools for Heavy Lift Airships
NASA Technical Reports Server (NTRS)
Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John
2016-01-01
For conventional fixed-wing and rotary-wing aircraft, a variety of modeling and simulation tools have been developed to provide designers with the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the National laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.
CoryneBase: Corynebacterium Genomic Resources and Analysis Tools at Your Fingertips
Tan, Mui Fern; Jakubovics, Nick S.; Wee, Wei Yee; Mutha, Naresh V. R.; Wong, Guat Jah; Ang, Mia Yang; Yazdi, Amir Hessam; Choo, Siew Woh
2014-01-01
Corynebacteria are used for a wide variety of industrial purposes but some species are associated with human diseases. With an increasing number of corynebacterial genomes having been sequenced, comparative analysis of these strains may provide a better understanding of their biology, phylogeny, virulence and taxonomy, which may lead to the discovery of beneficial industrial strains or contribute to better management of diseases. To facilitate ongoing research on corynebacteria, a specialized central repository and analysis platform for the corynebacterial research community is needed to host the fast-growing amount of genomic data and facilitate the analysis of these data. Here we present CoryneBase, a genomic database for Corynebacterium with diverse functionality for the analysis of genomes, which aims to provide: (1) annotated genome sequences of Corynebacterium, where 165,918 coding sequences and 4,180 RNAs can be found in 27 species; (2) access to comprehensive Corynebacterium data through the use of advanced web technologies for interactive web interfaces; and (3) advanced bioinformatic analysis tools consisting of standard BLAST for homology search, VFDB BLAST for sequence homology search against the Virulence Factor Database (VFDB), the Pairwise Genome Comparison (PGC) tool for comparative genomic analysis, and a newly designed Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomic analysis. CoryneBase offers access to a range of Corynebacterium genomic resources as well as analysis tools for comparative genomics and pathogenomics. It is publicly available at http://corynebacterium.um.edu.my/. PMID:24466021
Ma, Bin; Xu, Jia-Ke; Wu, Wen-Jing; Liu, Hong-Yan; Kou, Cheng-Kun; Liu, Na; Zhao, Lulu
2017-01-01
To investigate the awareness and use of the Systematic Review Center for Laboratory Animal Experimentation's (SYRCLE) risk-of-bias tool, the Animal Research: Reporting of In Vivo Experiments (ARRIVE) reporting guidelines, and the Gold Standard Publication Checklist (GSPC) among basic medical researchers conducting animal experimental studies in China. A national questionnaire-based survey targeting basic medical researchers was carried out in China to investigate basic information and awareness of SYRCLE's risk-of-bias tool, the ARRIVE guidelines, the GSPC, and animal experimental bias risk control factors. The EpiData 3.1 software was used for data entry, and Microsoft Excel 2013 was used for statistical analysis in this study. The number of cases (n) and percentage (%) of classified information were statistically described, and the comparison between groups (i.e., current students vs. research staff) was performed using the chi-square test. A total of 298 questionnaires were distributed, and 272 responses were received, which included 266 valid questionnaires (from 118 current students and 148 research staff). Among the 266 survey participants, only 15.8% were aware of SYRCLE's risk-of-bias tool, with a significant difference between the two groups (P = 0.003), and the awareness rates of the ARRIVE guidelines and the GSPC were only 9.4% and 9.0%, respectively; 58.6% of survey participants believed that the reports of animal experimental studies in Chinese literature were inadequate, with a significant difference between the two groups (P = 0.004). In addition, only approximately one third of the survey participants had read systematic reviews and meta-analysis reports of animal experimental studies; only 16/266 (6.0%) had carried out or participated in, and 11/266 (4.1%) had published, systematic reviews/meta-analyses of animal experimental studies. The awareness and use rates of SYRCLE's risk-of-bias tool, the ARRIVE guidelines, and the GSPC were low among Chinese basic medical researchers. Therefore, specific measures are necessary to promote and popularize these standards and specifications and to introduce these standards into guidelines of Chinese domestic journals as soon as possible to raise awareness and increase use rates among researchers and journal editors, thereby improving the quality of animal experimental methods and reports.
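A hedged illustration of the between-group comparison described above: a chi-square test on a 2x2 table of "aware" versus "not aware" counts for students and research staff. The counts are hypothetical, since the record reports only the overall 15.8% awareness rate.

```python
# Chi-square test of independence for awareness by group (hypothetical counts).
from scipy.stats import chi2_contingency

table = [[10, 108],   # students: aware, not aware
         [32, 116]]   # staff:    aware, not aware
chi2, p, dof, expected = chi2_contingency(table)
print(chi2, p)
```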
A New Spin on Miscue Analysis: Using Spider Charts to Web Reading Processes
ERIC Educational Resources Information Center
Wohlwend, Karen E.
2012-01-01
This article introduces a way of seeing miscue analysis data through a "spider chart", a readily available digital graphing tool that provides an effective way to visually represent readers' complex coordination of interrelated cueing systems. A spider chart is a standard feature in recent spreadsheet software that puts a new spin on miscue…
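A minimal spider (radar) chart of the kind described, drawn with matplotlib's polar axes; the cueing-system categories and scores are invented for illustration.

```python
# Draw a simple spider chart for a set of category scores.
import numpy as np
import matplotlib.pyplot as plt

labels = ["Graphophonic", "Syntactic", "Semantic", "Self-correction"]
scores = [0.8, 0.6, 0.9, 0.5]

angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()
scores_closed = scores + scores[:1]          # close the polygon
angles_closed = angles + angles[:1]

ax = plt.subplot(polar=True)
ax.plot(angles_closed, scores_closed)
ax.fill(angles_closed, scores_closed, alpha=0.25)
ax.set_xticks(angles)
ax.set_xticklabels(labels)
plt.show()
```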
LabRS: A Rosetta stone for retrospective standardization of clinical laboratory test results.
Hauser, Ronald George; Quine, Douglas B; Ryder, Alex
2018-02-01
Clinical laboratories in the United States do not have an explicit result standard to report the 7 billion laboratory test results they produce each year. The absence of standardized test results creates inefficiencies and ambiguities for secondary data users. We developed and tested a tool to standardize the results of laboratory tests in a large, multicenter clinical data warehouse. Laboratory records, each of which consisted of a laboratory result and a test identifier, from 27 diverse facilities were captured from 2000 through 2015. Each record underwent a standardization process to convert the original result into a format amenable to secondary data analysis. The standardization process included the correction of typos, normalization of categorical results, separation of inequalities from numbers, and conversion of numbers represented by words (eg, "million") to numerals. Quality control included expert review. We obtained 1.266 × 10⁹ laboratory records and standardized 1.252 × 10⁹ records (98.9%). Of the unique unstandardized records (78.887 × 10³), most appeared <5 times (96%, eg, typos), did not have a test identifier (47%), or belonged to an esoteric test with <100 results (2%). Overall, these 3 reasons accounted for nearly all unstandardized results (98%). Current results suggest that the tool is both scalable and generalizable among diverse clinical laboratories. Based on observed trends, the tool will require ongoing maintenance to stay current with new tests and result formats. Future work to develop and implement an explicit standard for test results would reduce the need to retrospectively standardize test results. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
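A hedged sketch of two of the standardization steps described above, separating an inequality sign from its number and converting word-based magnitudes to numerals. The regular expressions are illustrative, not LabRS's actual rules.

```python
# Two illustrative result-cleaning steps: split inequalities and expand "million".
import re

def split_inequality(result):
    m = re.match(r"\s*(<=|>=|<|>)\s*([\d.]+)\s*$", result)
    return (m.group(1), float(m.group(2))) if m else (None, result)

def words_to_number(result):
    m = re.match(r"\s*([\d.]+)\s*million\s*$", result, flags=re.IGNORECASE)
    return float(m.group(1)) * 1_000_000 if m else result

print(split_inequality("<5"))          # ('<', 5.0)
print(words_to_number("2.1 million"))  # 2100000.0
```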
Objective Data Assessment (ODA) Methods as Nutritional Assessment Tools.
Hamada, Yasuhiro
2015-01-01
Nutritional screening and assessment should be a standard of care for all patients because nutritional management plays an important role in clinical practice. However, there is no gold standard for the diagnosis of malnutrition or undernutrition, although a large number of nutritional screening and assessment tools have been developed. Nutritional screening and assessment tools are classified into two categories, namely, subjective global assessment (SGA) and objective data assessment (ODA). SGA assesses nutritional status based on the features of medical history and physical examination. On the other hand, ODA consists of objective data provided from various analyses, such as anthropometry, bioimpedance analysis (BIA), dual-energy X-ray absorptiometry (DEXA), computed tomography (CT), magnetic resonance imaging (MRI), laboratory tests, and functional tests. This review highlights knowledge on the performance of ODA methods for the assessment of nutritional status in clinical practice. J. Med. Invest. 62: 119-122, August, 2015.
Sun, Shulei; Chen, Jing; Li, Weizhong; Altintas, Ilkay; Lin, Abel; Peltier, Steve; Stocks, Karen; Allen, Eric E.; Ellisman, Mark; Grethe, Jeffrey; Wooley, John
2011-01-01
The Community Cyberinfrastructure for Advanced Microbial Ecology Research and Analysis (CAMERA, http://camera.calit2.net/) is a database and associated computational infrastructure that provides a single system for depositing, locating, analyzing, visualizing and sharing data about microbial biology through an advanced web-based analysis portal. CAMERA collects and links metadata relevant to environmental metagenome data sets with annotation in a semantically-aware environment allowing users to write expressive semantic queries against the database. To meet the needs of the research community, users are able to query metadata categories such as habitat, sample type, time, location and other environmental physicochemical parameters. CAMERA is compliant with the standards promulgated by the Genomic Standards Consortium (GSC), and sustains a role within the GSC in extending standards for content and format of the metagenomic data and metadata and its submission to the CAMERA repository. To ensure wide, ready access to data and annotation, CAMERA also provides data submission tools to allow researchers to share and forward data to other metagenomics sites and community data archives such as GenBank. It has multiple interfaces for easy submission of large or complex data sets, and supports pre-registration of samples for sequencing. CAMERA integrates a growing list of tools and viewers for querying, analyzing, annotating and comparing metagenome and genome data. PMID:21045053
Sun, Shulei; Chen, Jing; Li, Weizhong; Altintas, Ilkay; Lin, Abel; Peltier, Steve; Stocks, Karen; Allen, Eric E; Ellisman, Mark; Grethe, Jeffrey; Wooley, John
2011-01-01
The Community Cyberinfrastructure for Advanced Microbial Ecology Research and Analysis (CAMERA, http://camera.calit2.net/) is a database and associated computational infrastructure that provides a single system for depositing, locating, analyzing, visualizing and sharing data about microbial biology through an advanced web-based analysis portal. CAMERA collects and links metadata relevant to environmental metagenome data sets with annotation in a semantically-aware environment allowing users to write expressive semantic queries against the database. To meet the needs of the research community, users are able to query metadata categories such as habitat, sample type, time, location and other environmental physicochemical parameters. CAMERA is compliant with the standards promulgated by the Genomic Standards Consortium (GSC), and sustains a role within the GSC in extending standards for content and format of the metagenomic data and metadata and its submission to the CAMERA repository. To ensure wide, ready access to data and annotation, CAMERA also provides data submission tools to allow researchers to share and forward data to other metagenomics sites and community data archives such as GenBank. It has multiple interfaces for easy submission of large or complex data sets, and supports pre-registration of samples for sequencing. CAMERA integrates a growing list of tools and viewers for querying, analyzing, annotating and comparing metagenome and genome data.
Sud, Manish; Fahy, Eoin; Cotter, Dawn; Azam, Kenan; Vadivelu, Ilango; Burant, Charles; Edison, Arthur; Fiehn, Oliver; Higashi, Richard; Nair, K. Sreekumaran; Sumner, Susan; Subramaniam, Shankar
2016-01-01
The Metabolomics Workbench, available at www.metabolomicsworkbench.org, is a public repository for metabolomics metadata and experimental data spanning various species and experimental platforms, metabolite standards, metabolite structures, protocols, tutorials, and training material and other educational resources. It provides a computational platform to integrate, analyze, track, deposit and disseminate large volumes of heterogeneous data from a wide variety of metabolomics studies including mass spectrometry (MS) and nuclear magnetic resonance spectrometry (NMR) data spanning over 20 different species covering all the major taxonomic categories including humans and other mammals, plants, insects, invertebrates and microorganisms. Additionally, a number of protocols are provided for a range of metabolite classes, sample types, and both MS and NMR-based studies, along with a metabolite structure database. The metabolites characterized in the studies available on the Metabolomics Workbench are linked to chemical structures in the metabolite structure database to facilitate comparative analysis across studies. The Metabolomics Workbench, part of the data coordinating effort of the National Institute of Health (NIH) Common Fund's Metabolomics Program, provides data from the Common Fund's Metabolomics Resource Cores, metabolite standards, and analysis tools to the wider metabolomics community and seeks data depositions from metabolomics researchers across the world. PMID:26467476
Mousavi, Soraya; Mariotti, Roberto; Regni, Luca; Nasini, Luigi; Bufacchi, Marina; Pandolfi, Saverio; Baldoni, Luciana; Proietti, Primo
2017-01-01
Germplasm collections of tree crop species represent fundamental tools for conservation of diversity and key steps for its characterization and evaluation. For the olive tree, several collections have been created all over the world, but only a few of them have been fully characterized and molecularly identified. The olive collection of Perugia University (UNIPG), established in the 1960s, represents one of the first attempts to gather and safeguard olive diversity, keeping together cultivars from different countries. In the present study, a previously uncharacterized set of 370 olive trees was screened with 10 standard simple sequence repeats (SSRs) and nine new EST-SSR markers, to correctly and thoroughly identify all genotypes, verify their representativeness of the entire cultivated olive variation, and validate the effectiveness of the new markers in comparison to standard genotyping tools. The SSR analysis revealed the presence of 59 genotypes, corresponding to 72 well-known cultivars, 13 of which are exclusively present in this collection. The new EST-SSRs showed diversity parameter values quite similar to those of the best standard SSRs. When compared to hundreds of Mediterranean cultivars, the UNIPG olive accessions were split into the three main populations (East, Central and West Mediterranean), confirming that the collection has a good representativeness of the entire olive variability. Furthermore, Bayesian analysis performed on the 59 genotypes of the collection using both sets of markers demonstrated their splitting into four clusters, with a well-balanced membership obtained by the EST-SSRs with respect to the standard SSRs. The new OLEST (Olea expressed sequence tags) SSR markers proved as effective as the best standard markers. The information obtained from this study represents a highly valuable tool for ex situ conservation and management of olive genetic resources, useful for building a common database from worldwide olive cultivar collections, also based on recently developed markers.
Chen, Josephine; Zhao, Po; Massaro, Donald; Clerch, Linda B; Almon, Richard R; DuBois, Debra C; Jusko, William J; Hoffman, Eric P
2004-01-01
Publicly accessible DNA databases (genome browsers) are rapidly accelerating post-genomic research (see http://www.genome.ucsc.edu/), with integrated genomic DNA, gene structure, EST/ splicing and cross-species ortholog data. DNA databases have relatively low dimensionality; the genome is a linear code that anchors all associated data. In contrast, RNA expression and protein databases need to be able to handle very high dimensional data, with time, tissue, cell type and genes, as interrelated variables. The high dimensionality of microarray expression profile data, and the lack of a standard experimental platform have complicated the development of web-accessible databases and analytical tools. We have designed and implemented a public resource of expression profile data containing 1024 human, mouse and rat Affymetrix GeneChip expression profiles, generated in the same laboratory, and subject to the same quality and procedural controls (Public Expression Profiling Resource; PEPR). Our Oracle-based PEPR data warehouse includes a novel time series query analysis tool (SGQT), enabling dynamic generation of graphs and spreadsheets showing the action of any transcript of interest over time. In this report, we demonstrate the utility of this tool using a 27 time point, in vivo muscle regeneration series. This data warehouse and associated analysis tools provides access to multidimensional microarray data through web-based interfaces, both for download of all types of raw data for independent analysis, and also for straightforward gene-based queries. Planned implementations of PEPR will include web-based remote entry of projects adhering to quality control and standard operating procedure (QC/SOP) criteria, and automated output of alternative probe set algorithms for each project (see http://microarray.cnmcresearch.org/pgadatatable.asp).
Chen, Josephine; Zhao, Po; Massaro, Donald; Clerch, Linda B.; Almon, Richard R.; DuBois, Debra C.; Jusko, William J.; Hoffman, Eric P.
2004-01-01
Publicly accessible DNA databases (genome browsers) are rapidly accelerating post-genomic research (see http://www.genome.ucsc.edu/), with integrated genomic DNA, gene structure, EST/ splicing and cross-species ortholog data. DNA databases have relatively low dimensionality; the genome is a linear code that anchors all associated data. In contrast, RNA expression and protein databases need to be able to handle very high dimensional data, with time, tissue, cell type and genes, as interrelated variables. The high dimensionality of microarray expression profile data, and the lack of a standard experimental platform have complicated the development of web-accessible databases and analytical tools. We have designed and implemented a public resource of expression profile data containing 1024 human, mouse and rat Affymetrix GeneChip expression profiles, generated in the same laboratory, and subject to the same quality and procedural controls (Public Expression Profiling Resource; PEPR). Our Oracle-based PEPR data warehouse includes a novel time series query analysis tool (SGQT), enabling dynamic generation of graphs and spreadsheets showing the action of any transcript of interest over time. In this report, we demonstrate the utility of this tool using a 27 time point, in vivo muscle regeneration series. This data warehouse and associated analysis tools provides access to multidimensional microarray data through web-based interfaces, both for download of all types of raw data for independent analysis, and also for straightforward gene-based queries. Planned implementations of PEPR will include web-based remote entry of projects adhering to quality control and standard operating procedure (QC/SOP) criteria, and automated output of alternative probe set algorithms for each project (see http://microarray.cnmcresearch.org/pgadatatable.asp). PMID:14681485
NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-01-01
This technical highlight describes NREL research to develop the Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.
Recent Developments in OVERGRID, OVERFLOW-2 and Chimera Grid Tools Scripts
NASA Technical Reports Server (NTRS)
Chan, William M.
2004-01-01
OVERGRID and OVERFLOW-2 feature easy-to-use multiple-body dynamics. The new features of OVERGRID include a preliminary chemistry interface, standard atmosphere and mass properties calculators, a simple unsteady solution viewer, and a debris tracking interface. Script library development in Chimera Grid Tools has applications in turbopump grid generation. This viewgraph presentation profiles multiple component dynamics, validation test cases for a sphere, cylinder, and oscillating airfoil, and debris analysis.
NASA Technical Reports Server (NTRS)
Bogert, Philip B.; Satyanarayana, Arunkumar; Chunchu, Prasad B.
2006-01-01
Splitting, ultimate failure load and the damage path in center-notched composite specimens subjected to in-plane tension loading are predicted using a progressive failure analysis methodology. A 2-D Hashin-Rotem failure criterion is used in determining intra-laminar fiber and matrix failures. This progressive failure methodology has been implemented in the Abaqus/Explicit and Abaqus/Standard finite element codes through the user-written subroutines "VUMAT" and "USDFLD", respectively. A 2-D finite element model is used for predicting the intra-laminar damage. Analysis results obtained from the Abaqus/Explicit and Abaqus/Standard codes show good agreement with experimental results. The importance of modeling delamination in the progressive failure analysis methodology is recognized for future studies. The use of an explicit integration dynamics code for simple specimen geometry and static loading establishes a foundation for future analyses where complex loading and nonlinear dynamic interactions of damage and structure will necessitate it.
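One common plane-stress form of the Hashin-Rotem intralaminar criteria is sketched below, with the fiber mode driven by the fiber-direction stress and the matrix mode combining transverse and shear stresses. The strength values are illustrative, and sign conventions and limits should be checked against the implementation actually used in the VUMAT/USDFLD routines rather than taken from this sketch.

```python
# Illustrative 2-D Hashin-Rotem failure indices (failure predicted when an index reaches 1.0).
def hashin_rotem_2d(s11, s22, t12, Xt, Xc, Yt, Yc, S):
    # Fiber mode: tension vs compression strength depending on the sign of s11.
    fiber = (s11 / Xt) ** 2 if s11 >= 0 else (s11 / Xc) ** 2
    # Matrix mode: transverse stress combined with in-plane shear.
    if s22 >= 0:
        matrix = (s22 / Yt) ** 2 + (t12 / S) ** 2
    else:
        matrix = (s22 / Yc) ** 2 + (t12 / S) ** 2
    return fiber, matrix

# Example with made-up stresses (MPa) and strengths (MPa).
print(hashin_rotem_2d(1200.0, 30.0, 40.0, Xt=2000, Xc=1500, Yt=50, Yc=200, S=90))
```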
Tool wear analysis during duplex stainless steel trochoidal milling
NASA Astrophysics Data System (ADS)
Amaro, Paulo; Ferreira, Pedro; Simões, Fernando
2018-05-01
In this study, a tool with interchangeable sintered-carbide inserts coated with AlTiN was used to mill a duplex stainless steel with trochoidal strategies. Cutting speeds ranging from 120 to 300 m/min were used, and the evaluation of tool deterioration and tool life was made according to the international standard ISO 8688-1. A progressive development of flank wear was observed, together with a cumulative cyclic process of localized adhesion of the chip to the cutting edge, followed by chipping, loss of the coating and substrate exposure. The tool life reached a maximum of 35 min for a cutting speed of 120 m/min. However, it was possible to maintain a tool life of 20-25 minutes when the cutting speed was increased up to 240 m/min.
Erberich, Stephan G; Bhandekar, Manasee; Chervenak, Ann; Kesselman, Carl; Nelson, Marvin D
2007-01-01
Functional MRI is successfully being used in clinical and research applications including preoperative planning, language mapping, and outcome monitoring. However, clinical use of fMRI is less widespread owing to the complexity of imaging, image workflow and post-processing, and a lack of algorithmic standards that hinders result comparability. As a consequence, widespread adoption of fMRI as a clinical tool is low, contributing to community physicians' uncertainty about how to integrate fMRI into practice. In addition, training of physicians in fMRI is in its infancy and requires clinical and technical understanding. Therefore, many institutions which perform fMRI rely on a team of basic researchers and physicians to perform fMRI as a routine imaging tool. In order to provide fMRI as an advanced diagnostic tool for the benefit of a larger patient population, image acquisition and image post-processing must be streamlined, standardized, and available even at institutions that lack these resources. Here we describe a software architecture, the functional imaging laboratory (funcLAB/G), which addresses (i) standardized image processing using Statistical Parametric Mapping and (ii) its extension to secure sharing and availability for the community using standards-based Grid technology (Globus Toolkit). funcLAB/G carries the potential to overcome the limitations of fMRI in clinical use and thus makes standardized fMRI available to the broader healthcare enterprise utilizing the Internet and HealthGrid Web Services technology.
Dynamic Hurricane Data Analysis Tool
NASA Technical Reports Server (NTRS)
Knosp, Brian W.; Li, Peggy; Vu, Quoc A.
2009-01-01
A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter: the mean, standard deviation, median, minimum, and maximum. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models and identify what types of measurements the next generation of instruments will need to collect.
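The summary statistics the tool reports can be computed directly with numpy; the sample values below are placeholders for a user-selected parameter.

```python
# Mean, standard deviation, median, minimum, and maximum of a selected parameter.
import numpy as np

values = np.array([27.3, 29.1, 25.8, 31.0, 28.4])   # e.g. sea-surface temperature samples
print(values.mean(), values.std(), np.median(values), values.min(), values.max())
```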
Kennerly, Susan; Heggestad, Eric D; Myers, Haley; Yap, Tracey L
2015-07-29
An effective workforce performing within the context of a positive cultural environment is central to a healthcare organization's ability to achieve quality outcomes. The Nursing Culture Assessment Tool (NCAT) provides nurses with a valid and reliable tool that captures the general aspects of nursing culture. This study extends earlier work confirming the tool's construct validity and dimensionality by standardizing the scoring approach and establishing norm-referenced scoring. Scoring standardization provides a reliable point of comparison for NCAT users. NCAT assessments support nursing's ability to evaluate nursing culture, use results to shape the culture into one that supports change, and advance nursing's best practices and care outcomes. Registered nurses, licensed practical nurses, and certified nursing assistants from 54 long-term care facilities in Kentucky, Nevada, North Carolina, and Oregon were surveyed. Confirmatory factor analysis yielded six first-order factors forming the NCAT's subscales (Expectations, Behaviors, Teamwork, Communication, Satisfaction, Commitment) (Comparative Fit Index 0.93) and a second-order factor, the Total Culture Score. Aggregated facility-level comparisons of observed group variance with expected random variance using rwg(J) statistics are presented. Normative scores, cumulative rank percentages, and guidance on how the NCAT can be used in implementing planned change are provided.
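For orientation, one common formulation of the rwg(J) within-group agreement index (following James, Demaree and Wolf) is sketched below, purely to illustrate the kind of statistic reported; the response scale, ratings, and uniform null distribution are hypothetical assumptions, not the study's actual data or exact computation.

```python
# Illustrative rwg(J) agreement index for one group (e.g. one facility).
import numpy as np

def rwg_j(ratings, n_response_options):
    """ratings: raters x items matrix of Likert responses for one group."""
    ratings = np.asarray(ratings, dtype=float)
    J = ratings.shape[1]
    s2_mean = ratings.var(axis=0, ddof=1).mean()        # mean observed item variance
    sigma2_eu = (n_response_options ** 2 - 1) / 12.0     # expected variance under a uniform null
    ratio = s2_mean / sigma2_eu
    return (J * (1 - ratio)) / (J * (1 - ratio) + ratio)

group = [[4, 5, 4], [5, 5, 4], [4, 4, 5]]   # 3 raters x 3 items on a 1-5 scale
print(rwg_j(group, n_response_options=5))
```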
The jmzQuantML programming interface and validator for the mzQuantML data standard.
Qi, Da; Krishna, Ritesh; Jones, Andrew R
2014-03-01
The mzQuantML standard from the HUPO Proteomics Standards Initiative has recently been released, capturing quantitative data about peptides and proteins, following analysis of MS data. We present a Java application programming interface (API) for mzQuantML called jmzQuantML. The API provides robust bridges between Java classes and elements in mzQuantML files and allows random access to any part of the file. The API provides read and write capabilities, and is designed to be embedded in other software packages, enabling mzQuantML support to be added to proteomics software tools (http://code.google.com/p/jmzquantml/). The mzQuantML standard is designed around a multilevel validation system to ensure that files are structurally and semantically correct for different proteomics quantitative techniques. In this article, we also describe a Java software tool (http://code.google.com/p/mzquantml-validator/) for validating mzQuantML files, which is a formal part of the data standard. © 2014 The Authors. Proteomics published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Tissue enrichment analysis for C. elegans genomics.
Angeles-Albores, David; N Lee, Raymond Y; Chan, Juancarlos; Sternberg, Paul W
2016-09-13
Over the last ten years, there has been explosive development in methods for measuring gene expression. These methods can identify thousands of genes altered between conditions, but understanding these datasets and forming hypotheses based on them remains challenging. One way to analyze these datasets is to associate ontologies (hierarchical, descriptive vocabularies with controlled relations between terms) with genes and to look for enrichment of specific terms. Although Gene Ontology (GO) is available for Caenorhabditis elegans, it does not include anatomical information. We have developed a tool for identifying enrichment of C. elegans tissues among gene sets and generated a website GUI where users can access this tool. Since a common drawback of ontology enrichment analyses is their verbosity, we developed a very simple filtering algorithm to reduce the ontology size by an order of magnitude. We adjusted these filters and validated our tool using a set of 30 gold standards from Expression Cluster data in WormBase. We show our tool can discriminate between embryonic and larval tissues and can even identify tissues down to the single-cell level. We used our tool to identify multiple neuronal tissues that are down-regulated due to pathogen infection in C. elegans. Our Tissue Enrichment Analysis (TEA) can be found within WormBase, and can be downloaded using Python's standard pip installer. It tests a slimmed-down C. elegans tissue ontology for enrichment of specific terms and provides users with a text and graphic representation of the results.
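This is not TEA's exact algorithm, but a hedged illustration of the underlying idea of term enrichment: a hypergeometric test asking whether a tissue term is over-represented in a gene list relative to the annotated background. All counts are invented.

```python
# Hypergeometric enrichment test for a single ontology term (hypothetical counts).
from scipy.stats import hypergeom

M = 20000   # annotated background genes
n = 150     # background genes annotated with the tissue term
N = 300     # genes in the user's list
k = 12      # list genes carrying the term
p_enrich = hypergeom.sf(k - 1, M, n, N)   # P(X >= k)
print(p_enrich)
```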
Pollock, James; Bolton, Glen; Coffman, Jon; Ho, Sa V; Bracewell, Daniel G; Farid, Suzanne S
2013-04-05
This paper presents an integrated experimental and modelling approach to evaluate the potential of semi-continuous chromatography for the capture of monoclonal antibodies (mAb) in clinical and commercial manufacture. Small-scale single-column experimental breakthrough studies were used to derive design equations for the semi-continuous affinity chromatography system. Verification runs with the semi-continuous 3-column and 4-column periodic counter current (PCC) chromatography system indicated the robustness of the design approach. The product quality profiles and step yields (after wash step optimisation) achieved were comparable to the standard batch process. The experimentally-derived design equations were incorporated into a decisional tool comprising dynamic simulation, process economics and sizing optimisation. The decisional tool was used to evaluate the economic and operational feasibility of whole mAb bioprocesses employing PCC affinity capture chromatography versus standard batch chromatography across a product's lifecycle from clinical to commercial manufacture. The tool predicted that PCC capture chromatography would offer more significant savings in direct costs for early-stage clinical manufacture (proof-of-concept) (∼30%) than for late-stage clinical (∼10-15%) or commercial (∼5%) manufacture. The evaluation also highlighted the potential facility fit issues that could arise with a capture resin (MabSelect) that experiences losses in binding capacity when operated in continuous mode over lengthy commercial campaigns. Consequently, the analysis explored the scenario of adopting the PCC system for clinical manufacture and switching to the standard batch process following product launch. The tool determined the PCC system design required to operate at commercial scale without facility fit issues and with similar costs to the standard batch process whilst pursuing a process change application. A retrofitting analysis established that the direct cost savings obtained by 8 proof-of-concept batches would be sufficient to pay back the investment cost of the pilot-scale semi-continuous chromatography system. Copyright © 2013 Elsevier B.V. All rights reserved.
a Standardized Approach to Topographic Data Processing and Workflow Management
NASA Astrophysics Data System (ADS)
Wheaton, J. M.; Bailey, P.; Glenn, N. F.; Hensleigh, J.; Hudak, A. T.; Shrestha, R.; Spaete, L.
2013-12-01
An ever-increasing list of options exists for collecting high-resolution topographic data, including airborne LIDAR, terrestrial laser scanners, bathymetric SONAR and structure-from-motion. An equally rich, arguably overwhelming, variety of tools exists with which to organize, quality control, filter, analyze and summarize these data. However, scientists are often left to cobble together their analysis as a series of ad hoc steps, often using custom scripts and one-time processes that are poorly documented and rarely shared with the community. Even when literature-cited software tools are used, the input and output parameters differ from tool to tool. These parameters are rarely archived, and the steps performed are lost, making the analysis virtually impossible to replicate precisely. What is missing is a coherent, robust framework for combining reliable, well-documented topographic data-processing steps into a workflow that can be repeated and even shared with others. We have taken several popular topographic data processing tools - including point cloud filtering and decimation as well as DEM differencing - and defined a common protocol for passing inputs and outputs between them. This presentation describes a free, public online portal that enables scientists to create custom workflows for processing topographic data using a number of popular topographic processing tools. Users provide the inputs required for each tool and specify the sequence in which they want to combine them. This information is then stored for future reuse (and optional sharing with others) before the user downloads a single package that contains all the input and output specifications together with the software tools themselves. The user then launches the included batch file that executes the workflow on their local computer against their topographic data. This ZCloudTools architecture helps standardize, automate and archive topographic data processing. It also represents a forum for discovering and sharing effective topographic processing workflows.
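A hedged sketch of the portal's central idea: a workflow is an ordered list of tool steps with recorded parameters, archived so the run can be repeated or shared, then executed step by step on the user's machine. Tool names and parameters below are invented placeholders, not the portal's actual interface.

```python
# Define, archive, and replay a simple topographic processing workflow.
import json

workflow = [
    {"tool": "point_cloud_filter", "params": {"min_returns": 2}},
    {"tool": "decimate",           "params": {"spacing_m": 0.5}},
    {"tool": "dem_difference",     "params": {"dem_new": "2013.tif", "dem_old": "2010.tif"}},
]

with open("workflow.json", "w") as f:   # archived so the analysis can be repeated/shared
    json.dump(workflow, f, indent=2)

for step in workflow:                   # a batch runner would dispatch each step in order
    print(f"running {step['tool']} with {step['params']}")
```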
Standardized data sharing in a paediatric oncology research network--a proof-of-concept study.
Hochedlinger, Nina; Nitzlnader, Michael; Falgenhauer, Markus; Welte, Stefan; Hayn, Dieter; Koumakis, Lefteris; Potamias, George; Tsiknakis, Manolis; Saraceno, Davide; Rinaldi, Eugenia; Ladenstein, Ruth; Schreier, Günter
2015-01-01
Data collected in the course of clinical trials are potentially valuable for additional scientific research questions in so-called secondary use scenarios. This is of particular importance in rare disease areas such as paediatric oncology. If data from several research projects need to be connected, so-called Core Datasets can be used to define which information needs to be extracted from every involved source system. In this work, the utility of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM) as a format for Core Datasets was evaluated, and a web tool was developed which receives source ODM XML files and, via Extensible Stylesheet Language Transformation (XSLT), generates standardized Core Dataset ODM XML files. Using this tool, data from different source systems were extracted and pooled for joint analysis in a proof-of-concept study, facilitating both basic syntactic and semantic interoperability.
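A hedged sketch of the transformation step described above: applying an XSLT stylesheet to a source ODM XML file to produce a Core Dataset ODM XML file, here with Python's lxml. File names are placeholders; the real stylesheet would encode the Core Dataset mapping rules.

```python
# Apply an XSLT stylesheet to a source ODM file and write the transformed result.
from lxml import etree

source = etree.parse("source_odm.xml")
transform = etree.XSLT(etree.parse("core_dataset.xslt"))
core = transform(source)

with open("core_dataset_odm.xml", "wb") as f:
    f.write(etree.tostring(core, pretty_print=True,
                           xml_declaration=True, encoding="UTF-8"))
```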
Teaching core competencies of reconstructive microsurgery with the use of standardized patients.
Son, Ji; Zeidler, Kamakshi R; Echo, Anthony; Otake, Leo; Ahdoot, Michael; Lee, Gordon K
2013-04-01
The Accreditation Council for Graduate Medical Education has defined 6 core competencies that residents must master before completing their training. Objective structured clinical examinations (OSCEs) using standardized patients are effective educational tools to assess and teach core competencies. We developed an OSCE specific for microsurgical head and neck reconstruction. Fifteen plastic surgery residents participated in the OSCE simulating a typical new patient consultation, which involved a patient with oral cancer. Residents were scored in all 6 core competencies by the standardized patients and faculty experts. Analysis of participant performance showed that although residents performed well overall, many lacked proficiency in systems-based practice. Junior residents were also more likely to omit critical elements of the physical examination compared to senior residents. We have modified our educational curriculum to specifically address these deficiencies. Our study demonstrates that the OSCE is an effective assessment tool for teaching and assessing all core competencies in microsurgery.
A Standards-Based Grading and Reporting Tool for Faculty: Design and Implications
ERIC Educational Resources Information Center
Sadik, Alaa M.
2011-01-01
The use of standards-based assessment, grading and reporting tools is essential to ensure that assessment meets acceptable levels of quality and standardization. This study reports the design, development and evaluation of a standards-based assessment tool for the instructors at Sultan Qaboos University, Sultanate of Oman. The Rapid Applications…
Elemental Analysis in Biological Matrices Using ICP-MS.
Hansen, Matthew N; Clogston, Jeffrey D
2018-01-01
The increasing exploration of metallic nanoparticles for use as cancer therapeutic agents necessitates a sensitive technique to track the clearance and distribution of the material once introduced into a living system. Inductively coupled plasma mass spectrometry (ICP-MS) provides a sensitive and selective tool for tracking the distribution of metal components from these nanotherapeutics. This chapter presents a standardized method for processing biological matrices, ensuring complete homogenization of tissues, and outlines the preparation of appropriate standards and controls. The method described herein utilized gold nanoparticle-treated samples; however, the method can easily be applied to the analysis of other metals.
ArrayInitiative - a tool that simplifies creating custom Affymetrix CDFs
2011-01-01
Background Probes on a microarray represent a frozen view of a genome and are quickly outdated when new sequencing studies extend our knowledge, resulting in significant measurement error when analyzing any microarray experiment. There are several bioinformatics approaches to improve probe assignments, but without in-house programming expertise, standardizing these custom array specifications as a usable file (e.g. as Affymetrix CDFs) is difficult, owing mostly to the complexity of the specification file format. However, without correctly standardized files there is a significant barrier for testing competing analysis approaches since this file is one of the required inputs for many commonly used algorithms. The need to test combinations of probe assignments and analysis algorithms led us to develop ArrayInitiative, a tool for creating and managing custom array specifications. Results ArrayInitiative is a standalone, cross-platform, rich client desktop application for creating correctly formatted, custom versions of manufacturer-provided (default) array specifications, requiring only minimal knowledge of the array specification rules and file formats. Users can import default array specifications, import probe sequences for a default array specification, design and import a custom array specification, export any array specification to multiple output formats, export the probe sequences for any array specification and browse high-level information about the microarray, such as version and number of probes. The initial release of ArrayInitiative supports the Affymetrix 3' IVT expression arrays we currently analyze, but as an open source application, we hope that others will contribute modules for other platforms. Conclusions ArrayInitiative allows researchers to create new array specifications, in a standard format, based upon their own requirements. This makes it easier to test competing design and analysis strategies that depend on probe definitions. Since the custom array specifications are easily exported to the manufacturer's standard format, researchers can analyze these customized microarray experiments using established software tools, such as those available in Bioconductor. PMID:21548938
TRAPR: R Package for Statistical Analysis and Visualization of RNA-Seq Data.
Lim, Jae Hyun; Lee, Soo Youn; Kim, Ju Han
2017-03-01
High-throughput transcriptome sequencing, also known as RNA sequencing (RNA-Seq), is a standard technology for measuring gene expression with unprecedented accuracy. Numerous bioconductor packages have been developed for the statistical analysis of RNA-Seq data. However, these tools focus on specific aspects of the data analysis pipeline, and are difficult to appropriately integrate with one another due to their disparate data structures and processing methods. They also lack visualization methods to confirm the integrity of the data and the process. In this paper, we propose an R-based RNA-Seq analysis pipeline called TRAPR, an integrated tool that facilitates the statistical analysis and visualization of RNA-Seq expression data. TRAPR provides various functions for data management, the filtering of low-quality data, normalization, transformation, statistical analysis, data visualization, and result visualization that allow researchers to build customized analysis pipelines.
A graph algebra for scalable visual analytics.
Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V
2012-01-01
Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
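As an illustration only (not the authors' framework), the two atomic operators named above, selection and aggregation, can be sketched on a small attributed graph with NetworkX.

# Illustrative selection and aggregation on a small attributed graph (not the paper's algebra).
import networkx as nx

G = nx.Graph()
G.add_nodes_from([(1, {"type": "host"}), (2, {"type": "host"}), (3, {"type": "router"})])
G.add_edges_from([(1, 3), (2, 3)])

# Selection: keep only nodes satisfying a predicate.
hosts = G.subgraph([n for n, d in G.nodes(data=True) if d["type"] == "host"])

# Aggregation: collapse nodes sharing an attribute value into one super-node.
agg = nx.Graph()
for n, d in G.nodes(data=True):
    agg.add_node(d["type"])
for u, v in G.edges():
    agg.add_edge(G.nodes[u]["type"], G.nodes[v]["type"])

print(list(hosts.nodes()), list(agg.edges()))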
Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H
2015-12-01
An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.
Engineering Analysis Using a Web-based Protocol
NASA Technical Reports Server (NTRS)
Schoeffler, James D.; Claus, Russell W.
2002-01-01
This paper reviews the development of a web-based framework for engineering analysis. A one-dimensional, high-speed analysis code called LAPIN was used in this study, but the approach can be generalized to any engineering analysis tool. The web-based framework enables users to store, retrieve, and execute an engineering analysis from a standard web-browser. We review the encapsulation of the engineering data into the eXtensible Markup Language (XML) and various design considerations in the storage and retrieval of application data.
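A minimal sketch of the XML-encapsulation idea, using Python's standard library, is shown below; the element and parameter names are invented for illustration and do not reflect the LAPIN or framework schema.

# Sketch: wrap engineering-analysis inputs in XML for storage and retrieval (element names invented).
import xml.etree.ElementTree as ET

case = ET.Element("analysisCase", attrib={"tool": "LAPIN"})
inputs = ET.SubElement(case, "inputs")
ET.SubElement(inputs, "parameter", name="machNumber").text = "2.4"
ET.SubElement(inputs, "parameter", name="inletGeometry").text = "axisymmetric"

ET.ElementTree(case).write("analysis_case.xml", xml_declaration=True, encoding="utf-8")

# A web front end can later parse the same file back into parameters.
for p in ET.parse("analysis_case.xml").findall(".//parameter"):
    print(p.get("name"), "=", p.text)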
VARED: Verification and Analysis of Requirements and Early Designs
NASA Technical Reports Server (NTRS)
Badger, Julia; Throop, David; Claunch, Charles
2014-01-01
Requirements are a part of every project life cycle; everything going forward in a project depends on them. Good requirements are hard to write, there are few useful tools to test, verify, or check them, and it is difficult to properly marry them to the subsequent design, especially if the requirements are written in natural language. In fact, the inconsistencies and errors in the requirements along with the difficulty in finding these errors contribute greatly to the cost of the testing and verification stage of flight software projects [1]. Large projects tend to have several thousand requirements written at various levels by different groups of people. The design process is distributed and a lack of widely accepted standards for requirements often results in a product that varies widely in style and quality. A simple way to improve this would be to standardize the design process using a set of tools and widely accepted requirements design constraints. The difficulty with this approach is finding the appropriate constraints and tools. Common complaints against the tools available include ease of use, functionality, and available features. Also, although preferable, it is rare that these tools are capable of testing the quality of the requirements.
ERIC Educational Resources Information Center
Yammine, Kaissar; Violato, Claudio
2015-01-01
Many medical graduates are deficient in anatomy knowledge and perhaps below the standards for safe medical practice. Three-dimensional visualization technology (3DVT) has been advanced as a promising tool to enhance anatomy knowledge. The purpose of this review is to conduct a meta-analysis of the effectiveness of 3DVT in teaching and learning…
COMPASS: a suite of pre- and post-search proteomics software tools for OMSSA
Wenger, Craig D.; Phanstiel, Douglas H.; Lee, M. Violet; Bailey, Derek J.; Coon, Joshua J.
2011-01-01
Here we present the Coon OMSSA Proteomic Analysis Software Suite (COMPASS): a free and open-source software pipeline for high-throughput analysis of proteomics data, designed around the Open Mass Spectrometry Search Algorithm. We detail a synergistic set of tools for protein database generation, spectral reduction, peptide false discovery rate analysis, peptide quantitation via isobaric labeling, protein parsimony and protein false discovery rate analysis, and protein quantitation. We strive for maximum ease of use, utilizing graphical user interfaces and working with data files in the original instrument vendor format. Results are stored in plain text comma-separated values files, which are easy to view and manipulate with a text editor or spreadsheet program. We illustrate the operation and efficacy of COMPASS through the use of two LC–MS/MS datasets. The first is a dataset of a highly annotated mixture of standard proteins and manually validated contaminants that exhibits the identification workflow. The second is a dataset of yeast peptides, labeled with isobaric stable isotope tags and mixed in known ratios, to demonstrate the quantitative workflow. For these two datasets, COMPASS performs equivalently or better than the current de facto standard, the Trans-Proteomic Pipeline. PMID:21298793
Integrated Analysis Tools for the NERRS System-Wide Monitoring Program Data
Standardized monitoring programs have vastly improved the quantity and quality of data that form the basis of environmental decision-making. One example is the NOAA-funded National Estuarine Research Reserve System (NERRS) System-wide Monitoring Program (SWMP) that was implement...
ZBIT Bioinformatics Toolbox: A Web-Platform for Systems Biology and Expression Data Analysis
Römer, Michael; Eichner, Johannes; Dräger, Andreas; Wrzodek, Clemens; Wrzodek, Finja; Zell, Andreas
2016-01-01
Bioinformatics analysis has become an integral part of research in biology. However, installation and use of scientific software can be difficult and often requires technical expert knowledge. Reasons include dependencies on certain operating systems or required third-party libraries, missing graphical user interfaces and documentation, or nonstandard input and output formats. In order to make bioinformatics software easily accessible to researchers, we here present a web-based platform. The Center for Bioinformatics Tuebingen (ZBIT) Bioinformatics Toolbox provides web-based access to a collection of bioinformatics tools developed for systems biology, protein sequence annotation, and expression data analysis. Currently, the collection encompasses software for conversion and processing of community standards SBML and BioPAX, transcription factor analysis, and analysis of microarray data from transcriptomics and proteomics studies. All tools are hosted on a customized Galaxy instance and run on a dedicated computation cluster. Users only need a web browser and an active internet connection in order to benefit from this service. The web platform is designed to facilitate the usage of the bioinformatics tools for researchers without advanced technical background. Users can combine tools for complex analyses or use predefined, customizable workflows. All results are stored persistently and are reproducible. For each tool, we provide documentation, tutorials, and example data to maximize usability. The ZBIT Bioinformatics Toolbox is freely available at https://webservices.cs.uni-tuebingen.de/. PMID:26882475
GIS Toolsets for Planetary Geomorphology and Landing-Site Analysis
NASA Astrophysics Data System (ADS)
Nass, Andrea; van Gasselt, Stephan
2015-04-01
Modern Geographic Information Systems (GIS) allow expert and lay users alike to load and position geographic data and perform simple to highly complex surface analyses. For many applications dedicated and ready-to-use GIS tools are available in standard software systems while other applications require the modular combination of available basic tools to answer more specific questions. This also applies to analyses in modern planetary geomorphology where many of such (basic) tools can be used to build complex analysis tools, e.g. in image- and terrain model analysis. Apart from the simple application of sets of different tools, many complex tasks require a more sophisticated design for storing and accessing data using databases (e.g. ArcHydro for hydrological data analysis). In planetary sciences, complex database-driven models are often required to efficiently analyse potential landings sites or store rover data, but also geologic mapping data can be efficiently stored and accessed using database models rather than stand-alone shapefiles. For landings-site analyses, relief and surface roughness estimates are two common concepts that are of particular interest and for both, a number of different definitions co-exist. We here present an advanced toolset for the analysis of image and terrain-model data with an emphasis on extraction of landing site characteristics using established criteria. We provide working examples and particularly focus on the concepts of terrain roughness as it is interpreted in geomorphology and engineering studies.
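One common working definition of terrain roughness, the local standard deviation of elevation within a moving window, can be sketched as follows; the DEM array and window size are placeholders, and this is only one of the co-existing definitions referred to above.

# Roughness as the local standard deviation of elevation (one of several possible definitions).
import numpy as np
from scipy.ndimage import generic_filter

dem = np.random.rand(200, 200) * 5.0             # placeholder DEM (elevations in metres)
roughness = generic_filter(dem, np.std, size=5)  # 5x5 moving-window standard deviation

print("max local roughness:", roughness.max())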
Building Energy Monitoring and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Tianzhen; Feng, Wei; Lu, Alison
This project aimed to develop a standard methodology for building energy data definition, collection, presentation, and analysis; apply the developed methods to a standardized energy monitoring platform, including hardware and software, to collect and analyze building energy use data; and compile offline statistical data and online real-time data in both countries for fully understanding the current status of building energy use. This helps decode the driving forces behind the discrepancy of building energy use between the two countries; identify gaps and deficiencies of current building energy monitoring, data collection, and analysis; and create knowledge and tools to collect and analyze good building energy data to provide valuable and actionable information for key stakeholders.
NASA Astrophysics Data System (ADS)
Krehbiel, C.; Maiersperger, T.; Friesz, A.; Harriman, L.; Quenzer, R.; Impecoven, K.
2016-12-01
Three major obstacles facing big Earth data users include data storage, management, and analysis. As the amount of satellite remote sensing data increases, so does the need for better data storage and management strategies to exploit the plethora of data now available. Standard GIS tools can help big Earth data users who interact with and analyze increasingly large and diverse datasets. In this presentation we highlight how NASA's Land Processes Distributed Active Archive Center (LP DAAC) is tackling these big Earth data challenges. We provide a real-life use case to describe three tools and services provided by the LP DAAC to more efficiently exploit big Earth data in a GIS environment. First, we describe the Open-source Project for a Network Data Access Protocol (OPeNDAP), which allows calls for specific subsets of data, minimizing the amount of data that a user downloads and improving the efficiency of data downloading and processing. Next, we cover the LP DAAC's Application for Extracting and Exploring Analysis Ready Samples (AppEEARS), a web application interface for extracting and analyzing land remote sensing data. From there, we review an ArcPython toolbox that was developed to provide quality control services to land remote sensing data products. Locating and extracting specific subsets of larger big Earth datasets improves data storage and management efficiency for the end user, and quality control services provide a straightforward interpretation of big Earth data. These tools and services are beneficial to the GIS user community in terms of standardizing workflows and improving data storage, management, and analysis tactics.
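The benefit of OPeNDAP, requesting only the subset you need, can be sketched with xarray; the dataset URL, variable name and coordinate names below are hypothetical placeholders, not an actual LP DAAC endpoint.

# Sketch of server-side subsetting over OPeNDAP with xarray (URL, variable and coordinates are placeholders).
import xarray as xr

url = "https://example.gov/opendap/MOD13_EXAMPLE.nc"   # hypothetical OPeNDAP endpoint
ds = xr.open_dataset(url)                              # lazy open: no full download

# Only the requested spatial/temporal slice is transferred to the client.
subset = ds["NDVI"].sel(lat=slice(44, 42), lon=slice(-104, -96)).isel(time=0)
subset.to_netcdf("ndvi_subset.nc")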
Szentiks, C A; Tsangaras, K; Abendroth, B; Scheuch, M; Stenglein, M D; Wohlsein, P; Heeger, F; Höveler, R; Chen, W; Sun, W; Damiani, A; Nikolin, V; Gruber, A D; Grobbel, M; Kalthoff, D; Höper, D; Czirják, G Á; Derisi, J; Mazzoni, C J; Schüle, A; Aue, A; East, M L; Hofer, H; Beer, M; Osterrieder, N; Greenwood, A D
2014-05-01
This report describes three possibly related incidences of encephalitis, two of them lethal, in captive polar bears (Ursus maritimus). Standard diagnostic methods failed to identify pathogens in any of these cases. A comprehensive, three-stage diagnostic 'pipeline' employing both standard serological methods and new DNA microarray and next generation sequencing-based diagnostics was developed, in part as a consequence of this initial failure. This pipeline approach illustrates the strengths, weaknesses and limitations of these tools in determining pathogen caused deaths in non-model organisms such as wildlife species and why the use of a limited number of diagnostic tools may fail to uncover important wildlife pathogens. Copyright © 2013 Elsevier Ltd. All rights reserved.
Clarke, Callisia N; Patel, Sameer H; Day, Ryan W; George, Sobha; Sweeney, Colin; Monetes De Oca, Georgina Avaloa; Aiss, Mohamed Ait; Grubbs, Elizabeth G; Bednarski, Brian K; Lee, Jeffery E; Bodurka, Diane C; Skibber, John M; Aloia, Thomas A
2017-03-01
Duty-hour regulations have increased the frequency of trainee-trainee patient handoffs. Each handoff creates a potential source for communication errors that can lead to near-miss and patient-harm events. We investigated the utility, efficacy, and trainee experience associated with implementation of a novel, standardized, electronic handoff system. We conducted a prospective intervention study of trainee-trainee handoffs of inpatients undergoing complex general surgical oncology procedures at a large tertiary institution. Preimplementation data were measured using trainee surveys and direct observation and by tracking delinquencies in charting. A standardized electronic handoff tool was created in a research electronic data capture (REDCap) database using the previously validated I-PASS methodology (illness severity, patient summary, action list, situational awareness and contingency planning, and synthesis). Electronic handoff was augmented by direct communication via phone or face-to-face interaction for inpatients deemed "watcher" or "unstable." Postimplementation handoff compliance, communication errors, and trainee work flow were measured and compared to preimplementation values using standard statistical analysis. A total of 474 handoffs (203 preintervention and 271 postintervention) were observed over the study period; 86 handoffs involved patients admitted to the surgical intensive care unit, 344 patients admitted to the surgical stepdown unit, and 44 patients on the surgery ward. Implementation of the structured electronic tool resulted in an increase in trainee handoff compliance from 73% to 96% (P < .001) and decreased errors in communication by 50% (P = .044) while improving trainee efficiency and workflow. A standardized electronic tool augmented by direct communication for higher acuity patients can improve compliance, accuracy, and efficiency of handoff communication between surgery trainees. Copyright © 2016 Elsevier Inc. All rights reserved.
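The I-PASS elements enumerated above map naturally onto a simple record; a hedged sketch follows, with field names assumed for illustration rather than taken from the project's actual REDCap instrument.

# Sketch of an I-PASS handoff record; field names are assumed, not the actual REDCap form.
from dataclasses import dataclass, field
from typing import List

@dataclass
class IPassHandoff:
    illness_severity: str              # e.g. "stable", "watcher", "unstable"
    patient_summary: str
    action_list: List[str] = field(default_factory=list)
    situational_awareness: str = ""    # contingency planning notes
    synthesis_confirmed: bool = False  # receiver read-back completed

    def needs_verbal_handoff(self) -> bool:
        # Higher-acuity patients were also handed off by phone or face to face.
        return self.illness_severity in {"watcher", "unstable"}

h = IPassHandoff("watcher", "POD 2 after hepatectomy", ["check drain output overnight"])
print(h.needs_verbal_handoff())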
Lähdesmäki, Harri; Hautaniemi, Sampsa; Shmulevich, Ilya; Yli-Harja, Olli
2006-01-01
A significant amount of attention has recently been focused on modeling of gene regulatory networks. Two frequently used large-scale modeling frameworks are Bayesian networks (BNs) and Boolean networks, the latter one being a special case of its recent stochastic extension, probabilistic Boolean networks (PBNs). PBN is a promising model class that generalizes the standard rule-based interactions of Boolean networks into the stochastic setting. Dynamic Bayesian networks (DBNs) are a general and versatile model class that is able to represent complex temporal stochastic processes and has also been proposed as a model for gene regulatory systems. In this paper, we concentrate on these two model classes and demonstrate that PBNs and a certain subclass of DBNs can represent the same joint probability distribution over their common variables. The major benefit of introducing the relationships between the models is that it opens up the possibility of applying the standard tools of DBNs to PBNs and vice versa. Hence, the standard learning tools of DBNs can be applied in the context of PBNs, and the inference methods give a natural way of handling the missing values in PBNs which are often present in gene expression measurements. Conversely, the tools for controlling the stationary behavior of the networks, tools for projecting networks onto sub-networks, and efficient learning schemes can be used for DBNs. In other words, the introduced relationships between the models extend the collection of analysis tools for both model classes. PMID:17415411
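For orientation (a standard DBN factorization stated here as background, not the paper's specific construction), a dynamic Bayesian network over variables $X_1,\dots,X_n$ observed at time slices $t = 1,\dots,T$ defines the joint distribution

$$P(X^{1:T}) \;=\; P(X^{1}) \prod_{t=2}^{T} \prod_{i=1}^{n} P\!\left(X_i^{t} \mid \mathrm{Pa}(X_i^{t})\right),$$

where $\mathrm{Pa}(X_i^{t})$ denotes the parents of $X_i^{t}$ in the preceding slice; the paper's result identifies a subclass of such models whose joint distribution over the common variables coincides with that of a PBN.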
2013-01-01
Background The harmonization of European health systems brings with it a need for tools to allow the standardized collection of information about medical care. A common coding system and standards for the description of services are needed to allow local data to be incorporated into evidence-informed policy, and to permit equity and mobility to be assessed. The aim of this project has been to design such a classification and a related tool for the coding of services for Long Term Care (DESDE-LTC), based on the European Service Mapping Schedule (ESMS). Methods The development of DESDE-LTC followed an iterative process using nominal groups in 6 European countries. 54 researchers and stakeholders in health and social services contributed to this process. In order to classify services, we use the minimal organization unit or “Basic Stable Input of Care” (BSIC), coded by its principal function or “Main Type of Care” (MTC). The evaluation of the tool included an analysis of feasibility, consistency, ontology, inter-rater reliability, Boolean Factor Analysis, and a preliminary impact analysis (screening, scoping and appraisal). Results DESDE-LTC includes an alpha-numerical coding system, a glossary and an assessment instrument for mapping and counting LTC. It shows high feasibility, consistency, inter-rater reliability and face, content and construct validity. DESDE-LTC is ontologically consistent. It is regarded by experts as useful and relevant for evidence-informed decision making. Conclusion DESDE-LTC contributes to establishing a common terminology, taxonomy and coding of LTC services in a European context, and a standard procedure for data collection and international comparison. PMID:23768163
Predicting SPE Fluxes: Coupled Simulations and Analysis Tools
NASA Astrophysics Data System (ADS)
Gorby, M.; Schwadron, N.; Linker, J.; Caplan, R. M.; Wijaya, J.; Downs, C.; Lionello, R.
2017-12-01
Presented here is a nuts-and-bolts look at the coupled framework of Predictive Science Inc's Magnetohydrodynamics Around a Sphere (MAS) code and the Energetic Particle Radiation Environment Module (EPREM). MAS-simulated coronal mass ejection output from a variety of events can be selected as the MHD input to EPREM and a variety of parameters can be set to run against: background seed particle spectra, mean free path, perpendicular diffusion efficiency, etc. A standard set of visualizations is produced as well as a library of analysis tools for deeper inquiries. All steps will be covered end-to-end as well as the framework's user interface and availability.
Han, Seong Kyu; Lee, Dongyeop; Lee, Heetak; Kim, Donghyo; Son, Heehwa G; Yang, Jae-Seong; Lee, Seung-Jae V; Kim, Sanguk
2016-08-30
Online application for survival analysis (OASIS) has served as a popular and convenient platform for the statistical analysis of various survival data, particularly in the field of aging research. With the recent advances in the fields of aging research that deal with complex survival data, we noticed a need for updates to the current version of OASIS. Here, we report OASIS 2 (http://sbi.postech.ac.kr/oasis2), which provides extended statistical tools for survival data and an enhanced user interface. In particular, OASIS 2 enables the statistical comparison of maximal lifespans, which is potentially useful for determining key factors that limit the lifespan of a population. Furthermore, OASIS 2 provides statistical and graphical tools that compare values in different conditions and times. That feature is useful for comparing age-associated changes in physiological activities, which can be used as indicators of "healthspan." We believe that OASIS 2 will serve as a standard platform for survival analysis with advanced and user-friendly statistical tools for experimental biologists in the field of aging research.
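For readers who prefer a scriptable route, the kind of two-group survival comparison OASIS 2 performs in the browser can be sketched with the lifelines package; the lifespan data below are made up for illustration and this is not the OASIS 2 implementation.

# Sketch of a two-group survival comparison (illustrative data; not the OASIS 2 implementation).
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

control = [18, 20, 21, 22, 25, 26, 28]        # lifespans in days (made up)
treated = [22, 24, 27, 29, 30, 31, 35]
events_c = [1] * len(control)                 # 1 = death observed (no censoring here)
events_t = [1] * len(treated)

result = logrank_test(control, treated, event_observed_A=events_c, event_observed_B=events_t)
print("log-rank p-value:", result.p_value)

kmf = KaplanMeierFitter()
kmf.fit(treated, events_t, label="treated")
print("median lifespan (treated):", kmf.median_survival_time_)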
A Tool for Model-Based Generation of Scenario-driven Electric Power Load Profiles
NASA Technical Reports Server (NTRS)
Rozek, Matthew L.; Donahue, Kenneth M.; Ingham, Michel D.; Kaderka, Justin D.
2015-01-01
Power consumption during all phases of spacecraft flight is of great interest to the aerospace community. As a result, significant analysis effort is exerted to understand the rates of electrical energy generation and consumption under many operational scenarios of the system. Previously, no standard tool existed for creating and maintaining a power equipment list (PEL) of spacecraft components that consume power, and no standard tool existed for generating power load profiles based on this PEL information during mission design phases. This paper presents the Scenario Power Load Analysis Tool (SPLAT) as a model-based systems engineering tool aiming to solve those problems. SPLAT is a plugin for MagicDraw (No Magic, Inc.) that aids in creating and maintaining a PEL, and also generates a power and temporal variable constraint set, in Maple language syntax, based on specified operational scenarios. The constraint set can be solved in Maple to show electric load profiles (i.e. power consumption from loads over time). SPLAT creates these load profiles from three modeled inputs: 1) a list of system components and their respective power modes, 2) a decomposition hierarchy of the system into these components, and 3) the specification of at least one scenario, which consists of temporal constraints on component power modes. In order to demonstrate how this information is represented in a system model, a notional example of a spacecraft planetary flyby is introduced. This example is also used to explain the overall functionality of SPLAT, and how this is used to generate electric power load profiles. Lastly, a cursory review of the usage of SPLAT on the Cold Atom Laboratory project is presented to show how the tool was used in an actual space hardware design application.
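The three modeled inputs listed above (components with power modes, a decomposition, and a scenario of timed mode changes) can be represented as plain data with a small profile calculation; the numbers and component names below are invented, and this is a sketch of the idea rather than the SPLAT implementation.

# Sketch: compute an electric load profile from a power equipment list and a scenario
# of timed mode changes (values invented; not the SPLAT/MagicDraw implementation).
pel = {  # component -> power draw (W) per mode
    "transponder": {"off": 0.0, "standby": 3.0, "transmit": 45.0},
    "camera":      {"off": 0.0, "imaging": 12.0},
}

scenario = [  # (time in minutes, component, mode) - temporal constraints on power modes
    (0, "transponder", "standby"), (0, "camera", "off"),
    (10, "camera", "imaging"), (25, "transponder", "transmit"),
]

def load_profile(pel, scenario, end_time, step=1):
    state = {c: "off" for c in pel}
    changes = sorted(scenario)
    profile = []
    for t in range(0, end_time, step):
        while changes and changes[0][0] <= t:
            _, comp, mode = changes.pop(0)
            state[comp] = mode
        profile.append((t, sum(pel[c][m] for c, m in state.items())))
    return profile

for t, watts in load_profile(pel, scenario, end_time=40, step=5):
    print(f"t={t:3d} min  load={watts:5.1f} W")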
Language-Agnostic Reproducible Data Analysis Using Literate Programming
Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa
2016-01-01
A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir. PMID:27711123
An overview of the web-based Google Earth coincident imaging tool
Chander, Gyanesh; Kilough, B.; Gowda, S.
2010-01-01
The Committee on Earth Observing Satellites (CEOS) Visualization Environment (COVE) tool is a browser-based application that leverages Google Earth web to display satellite sensor coverage areas. The analysis tool can also be used to identify near simultaneous surface observation locations for two or more satellites. The National Aeronautics and Space Administration (NASA) CEOS System Engineering Office (SEO) worked with the CEOS Working Group on Calibration and Validation (WGCV) to develop the COVE tool. The CEOS member organizations are currently operating and planning hundreds of Earth Observation (EO) satellites. Standard cross-comparison exercises between multiple sensors to compare near-simultaneous surface observations and to identify corresponding image pairs are time-consuming and labor-intensive. COVE is a suite of tools that have been developed to make such tasks easier.
HACCP: Integrating Science and Management through ASTM Standards
From a technical perspective, hazard analysis-critical control point (HACCP) evaluation may be considered a risk management tool suited to a wide range of applications. As one outcome of a symposium convened by American Society for Testing and Materials (ASTM) in August, 2005, th...
Hard Choices for Individual Situations.
ERIC Educational Resources Information Center
Landon, Bruce
This paper focuses on faculty use of a decision-making process for complex situations. The analysis part of the process describes and compares course management software focusing on: technical specifications, instructional design values,tools and features, ease of use, and standards compliance. The extensive comparisons provide faculty with…
Issues in Biomedical Research Data Management and Analysis: Needs and Barriers
Anderson, Nicholas R.; Lee, E. Sally; Brockenbrough, J. Scott; Minie, Mark E.; Fuller, Sherrilynne; Brinkley, James; Tarczy-Hornoch, Peter
2007-01-01
Objectives A. Identify the current state of data management needs of academic biomedical researchers. B. Explore their anticipated data management and analysis needs. C. Identify barriers to addressing those needs. Design A multimodal needs analysis was conducted using a combination of an online survey and in-depth one-on-one semi-structured interviews. Subjects were recruited via an e-mail list representing a wide range of academic biomedical researchers in the Pacific Northwest. Measurements The results from 286 survey respondents were used to provide triangulation of the qualitative analysis of data gathered from 15 semi-structured in-depth interviews. Results Three major themes were identified: 1) there continues to be widespread use of basic general-purpose applications for core data management; 2) there is broad perceived need for additional support in managing and analyzing large datasets; and 3) the barriers to acquiring currently available tools are most commonly related to financial burdens on small labs and unmet expectations of institutional support. Conclusion Themes identified in this study suggest that at least some common data management needs will best be served by improving access to basic level tools such that researchers can solve their own problems. Additionally, institutions and informaticians should focus on three components: 1) facilitate and encourage the use of modern data exchange models and standards, enabling researchers to leverage a common layer of interoperability and analysis; 2) improve the ability of researchers to maintain provenance of data and models as they evolve over time though tools and the leveraging of standards; and 3) develop and support information management service cores that could assist in these previous components while providing researchers with unique data analysis and information design support within a spectrum of informatics capabilities. PMID:17460139
HTSeq--a Python framework to work with high-throughput sequencing data.
Anders, Simon; Pyl, Paul Theodor; Huber, Wolfgang
2015-01-15
A large choice of tools exists for many standard tasks in the analysis of high-throughput sequencing (HTS) data. However, once a project deviates from standard workflows, custom scripts are needed. We present HTSeq, a Python library to facilitate the rapid development of such scripts. HTSeq offers parsers for many common data formats in HTS projects, as well as classes to represent data, such as genomic coordinates, sequences, sequencing reads, alignments, gene model information and variant calls, and provides data structures that allow for querying via genomic coordinates. We also present htseq-count, a tool developed with HTSeq that preprocesses RNA-Seq data for differential expression analysis by counting the overlap of reads with genes. HTSeq is released as an open-source software under the GNU General Public Licence and available from http://www-huber.embl.de/HTSeq or from the Python Package Index at https://pypi.python.org/pypi/HTSeq. © The Author 2014. Published by Oxford University Press.
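A minimal counting sketch in the spirit of htseq-count, closely following HTSeq's documented usage, is shown below; the GTF and BAM file names are placeholders.

# Count reads overlapping exactly one gene, following HTSeq's documented pattern (file names are placeholders).
import collections
import HTSeq

exons = HTSeq.GenomicArrayOfSets("auto", stranded=False)
for feature in HTSeq.GFF_Reader("genes.gtf"):
    if feature.type == "exon":
        exons[feature.iv] += feature.attr["gene_id"]

counts = collections.Counter()
for aln in HTSeq.BAM_Reader("alignments.bam"):
    if not aln.aligned:
        continue
    gene_ids = set()
    for iv, step_set in exons[aln.iv].steps():
        gene_ids |= step_set
    if len(gene_ids) == 1:          # unambiguous assignment only
        counts[next(iter(gene_ids))] += 1

for gene, n in counts.most_common(5):
    print(gene, n)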
Bioelectrical impedance analysis: A new tool for assessing fish condition
Hartman, Kyle J.; Margraf, F. Joseph; Hafs, Andrew W.; Cox, M. Keith
2015-01-01
Bioelectrical impedance analysis (BIA) is commonly used in human health and nutrition fields but has only recently been considered as a potential tool for assessing fish condition. Once BIA is calibrated, it estimates fat/moisture levels and energy content without the need to kill fish. Despite the promise held by BIA, published studies have been divided on whether BIA can provide accurate estimates of body composition in fish. In cases where BIA was not successful, the models lacked the range of fat levels or sample sizes we determined were needed for model success (range of dry fat levels of 29%, n = 60, yielding an R2 of 0.8). Reduced range of fat levels requires an increased sample size to achieve that benchmark; therefore, standardization of methods is needed. Here we discuss standardized methods based on a decade of research, identify sources of error, discuss where BIA is headed, and suggest areas for future research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Conlan
Over the project, Sighten built a comprehensive software-as-a-service (SaaS) platform to automate and streamline the residential solar financing workflow. Before the project period, significant time and money were spent by companies on front-end tools related to system design and proposal creation, but comparatively few resources were available to support the many back-end calculations and data management processes that underpin third-party financing. Without a tool like Sighten, the solar financing processes involved passing information from the homeowner prospect into separate tools for system design, financing, and then later to reporting tools including Microsoft Excel, CRM software, in-house software, outside software, and offline, manual processes. Passing data between tools and attempting to connect disparate systems results in inefficiency and inaccuracy for the industry. Sighten was built to consolidate all financial and solar-related calculations in a single software platform. It significantly improves upon the accuracy of these calculations and exposes sophisticated new analysis tools, resulting in a rigorous, efficient and cost-effective toolset for scaling residential solar. Widely deploying a platform like Sighten’s significantly and immediately impacts the residential solar space in several important ways: 1) standardizing and improving the quality of all quantitative calculations involved in the residential financing process, most notably project finance, system production and reporting calculations; 2) representing a true step change in terms of reporting and analysis capabilities by maintaining more accurate data and exposing sophisticated tools around simulation, tranching, and financial reporting, among others, to all stakeholders in the space; 3) allowing a broader group of developers/installers/finance companies to access the capital markets by providing an out-of-the-box toolset that handles the execution of running investor capital through a rooftop solar financing program. Standardizing and improving all calculations, improving data quality, and exposing new analysis tools previously unavailable affect investment in the residential space in several important ways: 1) lowering the cost of capital for existing capital providers by mitigating uncertainty and de-risking the solar asset class; 2) attracting new, lower-cost investors to the solar asset class as reporting and data quality resemble standards of more mature asset classes; 3) increasing the prevalence of liquidity options for investors through back leverage, securitization, or secondary sale by providing the tools necessary for lenders, ratings agencies, etc. to properly understand a portfolio of residential solar assets. During the project period, Sighten successfully built and scaled a commercially ready tool for the residential solar market. The software solution built by Sighten has been deployed with key target customer segments identified in the award deliverables: solar installers, solar developers/channel managers, and solar financiers, including lenders. Each of these segments greatly benefits from the availability of the Sighten toolset.
Extending the XNAT archive tool for image and analysis management in ophthalmology research
NASA Astrophysics Data System (ADS)
Wahle, Andreas; Lee, Kyungmoo; Harding, Adam T.; Garvin, Mona K.; Niemeijer, Meindert; Sonka, Milan; Abràmoff, Michael D.
2013-03-01
In ophthalmology, various modalities and tests are utilized to obtain vital information on the eye's structure and function. For example, optical coherence tomography (OCT) is utilized to diagnose, screen, and aid treatment of eye diseases like macular degeneration or glaucoma. Such data are complemented by photographic retinal fundus images and functional tests on the visual field. DICOM isn't widely used yet, though, and frequently images are encoded in proprietary formats. The eXtensible Neuroimaging Archive Tool (XNAT) is an open-source NIH-funded framework for research PACS and is in use at the University of Iowa for neurological research applications. Its use for ophthalmology was hence desirable but posed new challenges due to data types thus far not considered and the lack of standardized formats. We developed custom tools for data types not natively recognized by XNAT itself using XNAT's low-level REST API. Vendor-provided tools can be included as necessary to convert proprietary data sets into valid DICOM. Clients can access the data in a standardized format while still retaining the original format if needed by specific analysis tools. With respective project-specific permissions, results like segmentations or quantitative evaluations can be stored as additional resources to previously uploaded datasets. Applications can use our abstract-level Python or C/C++ API to communicate with the XNAT instance. This paper describes concepts and details of the designed upload script templates, which can be customized to the needs of specific projects, and the novel client-side communication API which allows integration into new or existing research applications.
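Access through XNAT's REST API, on which the upload scripts and client-side API described above build, can be sketched with plain HTTP; the server URL, credentials and project ID below are placeholders.

# Sketch: list projects and one project's sessions via XNAT's REST API (server, credentials, project ID are placeholders).
import requests

server = "https://xnat.example.edu"   # hypothetical XNAT instance
auth = ("username", "password")

projects = requests.get(f"{server}/data/projects", params={"format": "json"}, auth=auth)
for p in projects.json()["ResultSet"]["Result"]:
    print(p["ID"], p["name"])

# Experiments (e.g., imaging sessions) within one project:
exps = requests.get(f"{server}/data/projects/RETINA01/experiments",
                    params={"format": "json"}, auth=auth)
print(len(exps.json()["ResultSet"]["Result"]), "sessions")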
Sud, Manish; Fahy, Eoin; Cotter, Dawn; Azam, Kenan; Vadivelu, Ilango; Burant, Charles; Edison, Arthur; Fiehn, Oliver; Higashi, Richard; Nair, K Sreekumaran; Sumner, Susan; Subramaniam, Shankar
2016-01-04
The Metabolomics Workbench, available at www.metabolomicsworkbench.org, is a public repository for metabolomics metadata and experimental data spanning various species and experimental platforms, metabolite standards, metabolite structures, protocols, tutorials, and training material and other educational resources. It provides a computational platform to integrate, analyze, track, deposit and disseminate large volumes of heterogeneous data from a wide variety of metabolomics studies including mass spectrometry (MS) and nuclear magnetic resonance spectrometry (NMR) data spanning over 20 different species covering all the major taxonomic categories including humans and other mammals, plants, insects, invertebrates and microorganisms. Additionally, a number of protocols are provided for a range of metabolite classes, sample types, and both MS and NMR-based studies, along with a metabolite structure database. The metabolites characterized in the studies available on the Metabolomics Workbench are linked to chemical structures in the metabolite structure database to facilitate comparative analysis across studies. The Metabolomics Workbench, part of the data coordinating effort of the National Institute of Health (NIH) Common Fund's Metabolomics Program, provides data from the Common Fund's Metabolomics Resource Cores, metabolite standards, and analysis tools to the wider metabolomics community and seeks data depositions from metabolomics researchers across the world. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Habitat Design Optimization and Analysis
NASA Technical Reports Server (NTRS)
SanSoucie, Michael P.; Hull, Patrick V.; Tinker, Michael L.
2006-01-01
Long-duration surface missions to the Moon and Mars will require habitats for the astronauts. The materials chosen for the habitat walls play a direct role in the protection against the harsh environments found on the surface. Choosing the best materials, their configuration, and the amount required is extremely difficult due to the immense size of the design region. Advanced optimization techniques are necessary for habitat wall design. Standard optimization techniques are not suitable for problems with such large search spaces; therefore, a habitat design optimization tool utilizing genetic algorithms has been developed. Genetic algorithms use a "survival of the fittest" philosophy, where the most fit individuals are more likely to survive and reproduce. This habitat design optimization tool is a multi-objective formulation of structural analysis, heat loss, radiation protection, and meteoroid protection. This paper presents the research and development of this tool.
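As a generic illustration of the "survival of the fittest" mechanics described above (a toy single-objective skeleton, not the multi-objective habitat tool itself), a minimal genetic algorithm loop looks like this:

# Minimal genetic algorithm skeleton (toy fitness; not the habitat design optimization tool).
import random

def fitness(wall):                       # toy objective: layer thicknesses summing to a target
    return -abs(sum(wall) - 10.0)

def make_individual(n=4):
    return [random.uniform(0.0, 5.0) for _ in range(n)]

def evolve(pop_size=50, generations=100):
    pop = [make_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)       # fittest first
        survivors = pop[: pop_size // 2]          # "survival of the fittest"
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + b[cut:]             # crossover
            if random.random() < 0.1:             # mutation
                child[random.randrange(len(child))] = random.uniform(0.0, 5.0)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

print(evolve())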
CWA 15793 2011 Planning and Implementation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gross, Alan; Nail, George
This software, built on an open source platform called Electron (runs on Chromium and Node.js), is designed to assist organizations in the implementation of a biorisk management system consistent with the requirements of the international, publicly available guidance document CEN Workshop Agreement 15793:2011 (CWA 15793). The software includes tools for conducting organizational gap analysis against CWA 15793 requirements, planning tools to support the implementation of CWA 15793 requirements, and performance monitoring support. The gap analysis questions are based on the text of CWA 15793, and its associated guidance document, CEN Workshop Agreement 16393:2012. The authors have secured permission from the publisher of CWA 15793, the European Committee for Standardization (CEN), to use language from the document in the software, with the understanding that the software will be made available freely, without charge.
NASA Astrophysics Data System (ADS)
Xin, YANG; Si-qi, WU; Qi, ZHANG
2018-05-01
Beijing, London, Paris, and New York are representative world cities, so a comparative study of their green-space patterns is important for identifying gaps and advantages and for mutual learning. The paper provides a basis and new ideas for the development of metropolises in China. Against the background of big data, API (Application Programming Interface) services can provide extensive and accurate base data for studying urban green patterns across different geographical environments, both domestic and foreign. On this basis, the Average Nearest Neighbor, Kernel Density, and Standard Ellipse tools on the ArcGIS platform can process and summarize the data and support quantitative analysis of the green pattern. Based on this numerical comparison, the paper summarizes the distinctive features of the four cities' green patterns and the reasons for their formation.
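A minimal arcpy sketch of the three analyses named above follows, assuming a point feature class of green spaces and that the standard ArcGIS geoprocessing tool names (Average Nearest Neighbor, Kernel Density, Directional Distribution) apply; the workspace and layer names are hypothetical.

# Minimal arcpy sketch (workspace and feature class names are hypothetical).
import arcpy
from arcpy.sa import KernelDensity

arcpy.env.workspace = r"C:\data\green.gdb"   # hypothetical workspace
green = "green_points"                       # hypothetical point feature class of green spaces

# Clustering/dispersion of green spaces (Average Nearest Neighbor).
ann = arcpy.AverageNearestNeighbor_stats(green, "EUCLIDEAN_DISTANCE", "NO_REPORT")
print("nearest-neighbor ratio:", ann.getOutput(0), "z-score:", ann.getOutput(1))

# Density surface of green spaces (Kernel Density; requires Spatial Analyst).
arcpy.CheckOutExtension("Spatial")
density = KernelDensity(green, "NONE", 100, 1000)
density.save("green_density")

# Directional trend of the distribution (standard deviational ellipse).
arcpy.DirectionalDistribution_stats(green, "green_ellipse", "1_STANDARD_DEVIATION")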
LAMPAT and LAMPATNL User’s Manual
2012-09-01
nonlinearity. These tools are implemented as subroutines in the finite element software ABAQUS. This user’s manual provides information on the proper ... model either through the General tab of the Edit Job dialog box in Abaqus/CAE or the command line with user=(subroutine filename). Table 1 ... Selection of software product and subroutine: Static Analysis with Abaqus/Standard; Dynamic Analysis with Abaqus/Explicit; Linear, uncoupled ...
Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project
NASA Astrophysics Data System (ADS)
van Eck, T.; Giardini, D.
2010-12-01
The EC Research Infrastructure (RI) project, Network of Research Infrastructures for European Seismology (NERIES), implemented a comprehensive European integrated RI for earthquake seismological data that is scalable and sustainable. NERIES opened a significant amount of additional seismological data, integrated different distributed data archives, and implemented and produced advanced analysis tools and software packages. A single seismic data portal provides a single access point and overview for European seismological data available for the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include among others: - The expanded Virtual European Broadband Seismic Network (VEBSN) with real-time access to more than 500 stations from > 53 observatories. This data is continuously monitored, quality controlled and archived in the European Integrated Distributed waveform Archive (EIDA). - A unique integration of acceleration datasets from seven networks in seven European or associated countries centrally accessible in a homogeneous format, thus forming the core comprehensive European acceleration database. Standardized parameter analysis and actual software are included in the database. - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000 - 1963, M ≥5.8), including analysis tools. - Data from 3 one-year OBS deployments at three sites, Atlantic, Ionian and Ligurian Sea within the general SEED format, thus creating the core integrated database for ocean, sea and land-based seismological observatories. Tools to facilitate analysis and data mining of the RI datasets are: - A comprehensive set of European seismological velocity reference models, including a standardized model description with several visualisation tools currently adapted on a global scale. - An integrated approach to seismic hazard modelling and forecasting, a community-accepted forecasting testing and model validation approach and the core hazard portal developed along the same technologies as the NERIES data portal. - Implemented homogeneous shakemap estimation tools at several large European observatories and a complementary new loss estimation software tool. - A comprehensive set of new techniques for geotechnical site characterization with relevant software packages documented and maintained (www.geopsy.org). - A set of software packages for data mining, data reduction, data exchange and information management in seismology as research and observatory analysis tools. NERIES has a long-term impact and is coordinated with related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010 - 2014), is funded and will integrate the seismological and the earthquake engineering infrastructures. NERIES further provided the proof of concept for the ESFRI2008 initiative: the European Plate Observing System (EPOS). Its preparatory phase (2010 - 2014) is also funded by the EC.
Tardivo, S; Moretti, F; Nobile, M; Agodi, A; Appignanesi, R; Arrigoni, C; Baldovin, T; Brusaferro, S; Canino, R; Carli, A; Chiesa, R; D'Alessandro, D; D'Errico, M M; Giuliani, G; Montagna, M T; Moro, M; Mura, I I; Novati, R; Orsi, G B; Pasquarella, C; Privitera, G; Ripabelli, G; Rossini, A; Saia, M; Sodano, L; Torregrossa, M V; Torri, E; Zarrilli, R; Auxilia, F; SItI, Gisio
2017-01-01
Healthcare-associated infections (HAIs) are an important issue in terms of quality of care. HAIs impact patient safety by contributing to higher rates of preventable mortality and prolonged hospitalizations. In Italy, analysis of the currently available accreditation systems shows a substantial heterogeneity of approaches for the prevention and surveillance of HAIs in hospitals. The aim of the present study is to develop and propose the use of a synthetic assessment tool that could be implemented homogenously throughout the nation. An analysis of nine international and of the 21 Italian regional accreditation systems was conducted in order to identify requirements and indicators implemented for HAI prevention and control. Two relevant reviews on this topic were further analyzed to identify additional evidence-based criteria. The project team evaluated all the requirements and indicators with consensus meeting methodology, then those applicable to the Italian context were grouped into a set of "focus areas". The analysis of international systems and Italian regional accreditation manuals led to the identification respectively of 19 and 14 main requirements, with relevant heterogeneity in their application. Additional evidence-based criteria were included from the reviews analysis. From the consensus among the project team members all the standards were compared and 20 different thematic areas were identified, with a total of 96 requirements and indicators for preventing and monitoring HAIs. The study reveals a great heterogeneity in the definition of accreditation criteria between the Italian regions. The introduction of a uniform, synthetic assessment instrument, based on the review of national and international standards, may serve as a self-assessment tool to evaluate the achievement of a minimum standards set for HAIs prevention and control in healthcare facilities. This may be used as an assessment tool by the Italian institutional accreditation system, also useful to reduce regional disparities.
DOT National Transportation Integrated Search
2010-01-01
The initial objective of this research was to develop procedures and standards for applying GPC as an analytical tool to define the percentage amounts of polymer modifiers in polymer modified asphalt cements soluble in eluting GPC solvents. Quantific...
Global Situational Awareness with Free Tools
2015-01-15
Presentation describing a global situational awareness capability built from free tools: data are gathered from multiple sources (Snort with Snorby on Security Onion, Nagios, SharePoint RSS, network flow, and others) and combined by leveraging standard data formats such as Keyhole Markup Language (KML).
Thrust reverser analysis for implementation in the Aviation Environmental Design Tool (AEDT)
DOT National Transportation Integrated Search
2007-06-01
This letter report presents an updated implementation for thrust reversers in AEDT. Currently, thrust reverser is applied to all STANDARD approach profiles in the Integrated Noise Model (INM) as 60% of the max rated thrust for jets and 40% for props o...
EuroFlow standardization of flow cytometer instrument settings and immunophenotyping protocols
Kalina, T; Flores-Montero, J; van der Velden, V H J; Martin-Ayuso, M; Böttcher, S; Ritgen, M; Almeida, J; Lhermitte, L; Asnafi, V; Mendonça, A; de Tute, R; Cullen, M; Sedek, L; Vidriales, M B; Pérez, J J; te Marvelde, J G; Mejstrikova, E; Hrusak, O; Szczepański, T; van Dongen, J J M; Orfao, A
2012-01-01
The EU-supported EuroFlow Consortium aimed at innovation and standardization of immunophenotyping for diagnosis and classification of hematological malignancies by introducing 8-color flow cytometry with fully standardized laboratory procedures and antibody panels in order to achieve maximally comparable results among different laboratories. This required the selection of optimal combinations of compatible fluorochromes and the design and evaluation of adequate standard operating procedures (SOPs) for instrument setup, fluorescence compensation and sample preparation. Additionally, we developed software tools for the evaluation of individual antibody reagents and antibody panels. Each section describes what has been evaluated experimentally versus adopted based on existing data and experience. Multicentric evaluation demonstrated high levels of reproducibility based on strict implementation of the EuroFlow SOPs and antibody panels. Overall, the 6 years of extensive collaborative experiments and the analysis of hundreds of cell samples of patients and healthy controls in the EuroFlow centers have provided for the first time laboratory protocols and software tools for fully standardized 8-color flow cytometric immunophenotyping of normal and malignant leukocytes in bone marrow and blood; this has yielded highly comparable data sets, which can be integrated in a single database. PMID:22948490
qcML: An Exchange Format for Quality Control Metrics from Mass Spectrometry Experiments*
Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W. P.; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A.; Kelstrup, Christian D.; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S.; Olsen, Jesper V.; Heck, Albert J. R.; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart
2014-01-01
Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. PMID:24760958
Topography measurements and applications in ballistics and tool mark identifications*
Vorburger, T V; Song, J; Petraco, N
2016-01-01
The application of surface topography measurement methods to the field of firearm and toolmark analysis is fairly new. The field has been boosted by the development of a number of competing optical methods, which has improved the speed and accuracy of surface topography acquisitions. We describe here some of these measurement methods as well as several analytical methods for assessing similarities and differences among pairs of surfaces. We also provide a few examples of research results to identify cartridge cases originating from the same firearm or tool marks produced by the same tool. Physical standards and issues of traceability are also discussed. PMID:27182440
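A minimal sketch of one similarity measure commonly used when comparing striation patterns, a normalized cross-correlation maximum between two surface profiles. This is an independent illustration on synthetic profiles, not the authors' specific metrics or data.

```python
# Simple normalized cross-correlation of two striation profiles, the kind of
# similarity measure often used when comparing tool-mark or cartridge-case
# topographies. Profiles are synthetic; this is not the authors' method.
import numpy as np

rng = np.random.default_rng(6)
base = np.convolve(rng.normal(size=600), np.ones(9) / 9, mode="same")       # shared striation pattern
profile_a = base + rng.normal(0, 0.05, 600)
profile_b = np.roll(base, 12) + rng.normal(0, 0.05, 600)                     # same tool, shifted
profile_c = np.convolve(rng.normal(size=600), np.ones(9) / 9, mode="same")   # different tool

def max_ccf(p, q):
    """Maximum of the normalized cross-correlation over all lags."""
    p = (p - p.mean()) / p.std()
    q = (q - q.mean()) / q.std()
    return np.correlate(p, q, mode="full").max() / p.size

print("same tool, shifted:  %.2f" % max_ccf(profile_a, profile_b))
print("different tools:     %.2f" % max_ccf(profile_a, profile_c))
```

Profiles from the same tool score near 1 despite the lateral shift, while unrelated profiles score near 0.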
Using SWE Standards for Ubiquitous Environmental Sensing: A Performance Analysis
Tamayo, Alain; Granell, Carlos; Huerta, Joaquín
2012-01-01
Although smartphone applications represent the most typical data consumer tool from the citizen perspective in environmental applications, they can also be used for in-situ data collection and production in varied scenarios, such as geological sciences and biodiversity. The use of standard protocols, such as SWE, to exchange information between smartphones and sensor infrastructures brings benefits such as interoperability and scalability, but their reliance on XML is a potential problem when large volumes of data are transferred, due to limited bandwidth and processing capabilities on mobile phones. In this article we present a performance analysis about the use of SWE standards in smartphone applications to consume and produce environmental sensor data, analysing to what extent the performance problems related to XML can be alleviated by using alternative uncompressed and compressed formats.
Predictive Inference Using Latent Variables with Covariates*
Schofield, Lynne Steuerle; Junker, Brian; Taylor, Lowell J.; Black, Dan A.
2014-01-01
Plausible Values (PVs) are a standard multiple imputation tool for analysis of large education survey data that measures latent proficiency variables. When latent proficiency is the dependent variable, we reconsider the standard institutionally-generated PV methodology and find it applies with greater generality than shown previously. When latent proficiency is an independent variable, we show that the standard institutional PV methodology produces biased inference because the institutional conditioning model places restrictions on the form of the secondary analysts’ model. We offer an alternative approach that avoids these biases based on the mixed effects structural equations (MESE) model of Schofield (2008). PMID:25231627
Carroll, Adam J; Badger, Murray R; Harvey Millar, A
2010-07-14
Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to result in rapid community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas-chromatography/mass-spectrometry (GC/MS) makes it an obvious target for such development. While a number of web-tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems are currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web-tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (e.g., metabolite, species, organ/biofluid). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-test, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. MetabolomeExpress https://www.metabolome-express.org provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. Transparent sharing of these data will allow researchers to assess data quality and draw their own insights from published metabolomics datasets.
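As a hedged, generic illustration of the kind of statistical re-analysis named in the abstract (t-tests and principal components analysis on a metabolite response matrix), the Python sketch below uses invented data; it is not MetabolomeExpress code.

```python
# Illustrative re-analysis of a metabolite response matrix (samples x metabolites).
# Not part of MetabolomeExpress; it only sketches the t-test / PCA analyses the
# abstract describes, on made-up data.
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
control = rng.normal(loc=1.0, scale=0.2, size=(8, 50))   # 8 control samples, 50 metabolites
treated = rng.normal(loc=1.2, scale=0.2, size=(8, 50))   # 8 treated samples

# Per-metabolite Welch's t-test on log2-transformed responses
t, p = stats.ttest_ind(np.log2(treated), np.log2(control), equal_var=False, axis=0)
print("metabolites with p < 0.05:", int((p < 0.05).sum()))

# PCA on the combined, mean-centred matrix to visualise sample groupings
X = np.vstack([control, treated])
scores = PCA(n_components=2).fit_transform(X - X.mean(axis=0))
print("first sample scores on PC1/PC2:", scores[0])
```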
Java web tools for PCR, in silico PCR, and oligonucleotide assembly and analysis.
Kalendar, Ruslan; Lee, David; Schulman, Alan H
2011-08-01
The polymerase chain reaction is fundamental to molecular biology and is the most important practical molecular technique for the research laboratory. We have developed and tested efficient tools for PCR primer and probe design, which also predict oligonucleotide properties based on experimental studies of PCR efficiency. The tools provide comprehensive facilities for designing primers for most PCR applications and their combinations, including standard, multiplex, long-distance, inverse, real-time, unique, group-specific, bisulphite modification assays, Overlap-Extension PCR Multi-Fragment Assembly, as well as a programme to design oligonucleotide sets for long sequence assembly by ligase chain reaction. The in silico PCR primer or probe search includes comprehensive analyses of individual primers and primer pairs. It calculates the melting temperature for standard and degenerate oligonucleotides including LNA and other modifications, provides analyses for a set of primers with prediction of oligonucleotide properties, dimer and G-quadruplex detection, linguistic complexity, and provides a dilution and resuspension calculator. Copyright © 2011 Elsevier Inc. All rights reserved.
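The tool above uses its own experimentally informed models; purely as an illustration of the kind of melting-temperature calculation mentioned, the following sketch estimates a primer Tm with Biopython's Wallace-rule and nearest-neighbour functions (salt and primer concentrations are arbitrary example values).

```python
# Generic primer melting-temperature estimates with Biopython. This is only an
# illustration of the type of calculation discussed; the tool described above
# implements its own models.
from Bio.SeqUtils import MeltingTemp as mt

primer = "AGCGGATAACAATTTCACACAGGA"
print("Wallace rule Tm:       %.1f C" % mt.Tm_Wallace(primer))
print("Nearest-neighbour Tm:  %.1f C" % mt.Tm_NN(primer, Na=50, dnac1=250, dnac2=250))
```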
Byrne, Marion; White, Ben; McDonald, Fiona
Since the introduction of the Convention on the Rights of Persons with Disabilities (2006) (CRPD), there have been calls to establish standards to measure the compliance of domestic mental health laws with the human rights outlined in the CRPD. This article aims to address this gap by proposing a tool: the Analysis Instrument for Mental health (AIM). In particular, the tool's purpose is to enable states and civil society to assess the compliance of non-forensic domestic mental health laws with Article 12 of the CRPD. It responds to Dawson's (2015) call for a mechanism designed to provide clear and measurable standards with which to undertake this exercise. The content of AIM draws directly from the authoritative interpretation of Article 12 provided by the United Nations Committee on the Rights of Persons with Disabilities (the Committee) in its General Comment, as well as the substantial body of academic and other literature about Article 12. Copyright © 2018 Elsevier Ltd. All rights reserved.
Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.
Barre, Arnaud; Armand, Stéphane
2014-04-01
The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using the standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Overview of 'Omics Technologies for Military Occupational Health Surveillance and Medicine.
Bradburne, Christopher; Graham, David; Kingston, H M; Brenner, Ruth; Pamuku, Matt; Carruth, Lucy
2015-10-01
Systems biology ('omics) technologies are emerging as tools for the comprehensive analysis and monitoring of human health. In order for these tools to be used in military medicine, clinical sampling and biobanking will need to be optimized to be compatible with downstream processing and analysis for each class of molecule measured. This article provides an overview of 'omics technologies, including instrumentation, tools, and methods, and their potential application for warfighter exposure monitoring. We discuss the current state and the potential utility of personalized data from a variety of 'omics sources including genomics, epigenomics, transcriptomics, metabolomics, proteomics, lipidomics, and efforts to combine their use. Issues in the "sample-to-answer" workflow, including collection and biobanking are discussed, as well as national efforts for standardization and clinical interpretation. Establishment of these emerging capabilities, along with accurate xenobiotic monitoring, for the Department of Defense could provide new and effective tools for environmental health monitoring at all duty stations, including deployed locations. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
CASAS: Cancer Survival Analysis Suite, a web based application
Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne
2017-01-01
We present CASAS, a shiny R based tool for interactive survival analysis and visualization of results. The tool provides a web-based one stop shop to perform the following types of survival analysis: quantile, landmark and competing risks, in addition to standard survival analysis. The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots. Univariate analysis can be performed on one or several user specified variable(s) simultaneously, the results of which are displayed in a single table that includes log rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/. PMID:28928946
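CASAS itself is an R/shiny application; as a hedged illustration of the standard survival analysis underlying such tools, the Python sketch below fits Kaplan-Meier curves and a log-rank test with the lifelines package on invented data.

```python
# Minimal Kaplan-Meier and log-rank example with the lifelines package, shown
# only to illustrate the standard survival estimates that web tools like the
# one above build on. Data are fabricated.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "time":  [5, 8, 12, 20, 21, 30, 33, 40, 45, 50],
    "event": [1, 1, 0, 1, 1, 0, 1, 1, 0, 1],      # 1 = event observed, 0 = censored
    "group": ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"],
})

km = KaplanMeierFitter()
for name, g in df.groupby("group"):
    km.fit(g["time"], g["event"], label=name)
    print(name, "median survival:", km.median_survival_time_)

a, b = df[df.group == "A"], df[df.group == "B"]
res = logrank_test(a["time"], b["time"], a["event"], b["event"])
print("log-rank p-value: %.3f" % res.p_value)
```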
Ensuring Patient Safety in Care Transitions: An Empirical Evaluation of a Handoff Intervention Tool
Abraham, Joanna; Kannampallil, Thomas; Patel, Bela; Almoosa, Khalid; Patel, Vimla L.
2012-01-01
Successful handoffs ensure smooth, efficient and safe patient care transitions. Tools and systems designed for the standardization of clinician handoffs often focus on supporting the communication activity during transitions, with limited support for preparatory activities such as information seeking and organization. We designed and evaluated a Handoff Intervention Tool (HAND-IT) based on a checklist-inspired, body-system format allowing structured information organization, and a problem-case narrative format allowing temporal description of patient care events. Based on a pre-post prospective study using a multi-method analysis, we evaluated the effectiveness of HAND-IT as a documentation tool. We found that the use of HAND-IT led to fewer transition breakdowns and greater tool resilience, and likely led to better learning outcomes for less-experienced clinicians when compared to the current tool. We discuss the implications of our results for improving patient safety with a continuity-of-care-based approach. PMID:23304268
Energy evaluation of protection effectiveness of anti-vibration gloves.
Hermann, Tomasz; Dobry, Marian Witalis
2017-09-01
This article describes an energy method of assessing protection effectiveness of anti-vibration gloves on the human dynamic structure. The study uses dynamic models of the human and the glove specified in Standard No. ISO 10068:2012. The physical models of human-tool systems were developed by combining human physical models with a power tool model. The combined human-tool models were then transformed into mathematical models from which energy models were finally derived. Comparative energy analysis was conducted in the domain of rms powers. The energy models of the human-tool systems were solved using numerical simulation implemented in the MATLAB/Simulink environment. The simulation procedure demonstrated the effectiveness of the anti-vibration glove as a method of protecting human operators of hand-held power tools against vibration. The desirable effect is achieved by lowering the flow of energy in the human-tool system when the anti-vibration glove is employed.
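The study above uses the multi-degree-of-freedom hand-arm models of ISO 10068:2012 in MATLAB/Simulink; as a much-reduced, hedged sketch of the same idea (power flow at the hand-tool coupling evaluated by numerical simulation), the following Python toy model uses a single degree of freedom and arbitrary parameter values.

```python
# Toy single-DOF sketch of the energy approach: a hand mass coupled to a
# vibrating tool handle through a spring and damper, with the power flow at
# the coupling point evaluated numerically. Not the ISO 10068 models used in
# the study; all parameter values are arbitrary.
import numpy as np
from scipy.integrate import solve_ivp

m, k, c = 1.0, 2.0e4, 50.0           # hand mass [kg], stiffness [N/m], damping [N s/m]
A, f = 1.0e-3, 50.0                   # handle vibration amplitude [m] and frequency [Hz]
w = 2 * np.pi * f

def xb(t): return A * np.sin(w * t)   # handle displacement
def vb(t): return A * w * np.cos(w * t)

def rhs(t, y):
    x, v = y
    a = (-c * (v - vb(t)) - k * (x - xb(t))) / m
    return [v, a]

t_eval = np.linspace(0.0, 1.0, 20000)
sol = solve_ivp(rhs, (0.0, 1.0), [0.0, 0.0], t_eval=t_eval, max_step=1e-4)
x, v = sol.y
F = c * (vb(t_eval) - v) + k * (xb(t_eval) - x)   # force the handle exerts on the hand
P = F * vb(t_eval)                                # instantaneous power at the coupling point

steady = t_eval > 0.5                             # discard the start-up transient
print("mean power into hand system: %.3f W" % P[steady].mean())
print("rms power:                   %.3f W" % np.sqrt((P[steady] ** 2).mean()))
```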
A reliability analysis tool for SpaceWire network
NASA Astrophysics Data System (ADS)
Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou
2017-04-01
SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft; therefore, it is very important to analyze and improve the reliability performance of SpaceWire networks. This paper deals with the problem of reliability modeling and analysis for SpaceWire networks. Following the functional decomposition of the distributed network, a task-based reliability analysis method is proposed: the reliability analysis of every task yields a system reliability matrix, and the reliability of the network system is then deduced by integrating all the reliability indexes in the matrix. With this method, we developed a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and for multi-path task reliability are also implemented. Using this tool, we analyzed several cases on typical architectures. The analytic results indicate that a redundant architecture has better reliability performance than a basic one; in practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. This reliability analysis tool is expected to have a direct influence on both task division and topology selection in the design phase of SpaceWire network systems.
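As a hedged sketch of the kind of task-based reliability bookkeeping described (not the authors' tool), the snippet below combines series elements along a task path and dual-redundant elements, using invented per-component reliabilities.

```python
# Minimal task-based reliability sketch: a task succeeds if every element on
# its path works; a dual-redundant element works if at least one replica works.
# Component reliabilities are invented for illustration.
from math import prod

def series(reliabilities):
    """Reliability of elements that must all work."""
    return prod(reliabilities)

def parallel(reliabilities):
    """Reliability of redundant replicas where one working replica suffices."""
    return 1.0 - prod(1.0 - r for r in reliabilities)

node, link, router = 0.995, 0.990, 0.985

basic_path = series([node, link, router, link, node])
redundant_path = series([node, parallel([link, link]), parallel([router, router]),
                         parallel([link, link]), node])
print("basic architecture task reliability:     %.4f" % basic_path)
print("dual-redundant architecture reliability: %.4f" % redundant_path)
```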
Ganalyzer: A tool for automatic galaxy image analysis
NASA Astrophysics Data System (ADS)
Shamir, Lior
2011-05-01
Ganalyzer is a model-based tool that automatically analyzes and classifies galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large datasets of galaxy images collected by autonomous sky surveys such as SDSS, LSST or DES.
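One step of the pipeline described above is the radial intensity plot; the sketch below computes such a profile about an intensity-weighted centroid on a synthetic image. It is an independent illustration, not Ganalyzer's implementation.

```python
# Radial intensity profile of a (synthetic) galaxy image about its centroid.
# Independent illustration only; not Ganalyzer code.
import numpy as np

rng = np.random.default_rng(1)
img = rng.random((128, 128)) * 0.05
yy, xx = np.mgrid[0:128, 0:128]
img += np.exp(-((xx - 64) ** 2 + (yy - 64) ** 2) / (2 * 15.0 ** 2))   # fake galaxy

# Intensity-weighted centre and distance of every pixel from it
cy = (yy * img).sum() / img.sum()
cx = (xx * img).sum() / img.sum()
r = np.sqrt((xx - cx) ** 2 + (yy - cy) ** 2).astype(int)

# Mean intensity in one-pixel-wide annuli -> radial intensity profile
profile = np.bincount(r.ravel(), weights=img.ravel()) / np.bincount(r.ravel())
print("intensity at r = 0, 15, 40:", profile[[0, 15, 40]])
```

The slopes of the peaks detected in such a profile are what the tool uses to quantify spirality.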
A document centric metadata registration tool constructing earth environmental data infrastructure
NASA Astrophysics Data System (ADS)
Ichino, M.; Kinutani, H.; Ono, M.; Shimizu, T.; Yoshikawa, M.; Masuda, K.; Fukuda, K.; Kawamoto, H.
2009-12-01
DIAS (Data Integration and Analysis System) is one of the GEOSS activities in Japan. It is also a leading part of the GEOSS task of the same name defined in the GEOSS Ten Year Implementation Plan. The main mission of DIAS is to construct a data infrastructure that can effectively integrate earth environmental data such as observation data, numerical model outputs, and socio-economic data provided from the fields of climate, water cycle, ecosystem, ocean, biodiversity and agriculture. Some of DIAS's data products are available at http://www.jamstec.go.jp/e/medid/dias. Most earth environmental data have spatial and temporal attributes, such as the geographic coverage or the creation date. Metadata standards covering these common attributes are published by the geographic information technical committee (TC211) of ISO (the International Organization for Standardization) as ISO 19115:2003 and ISO 19139:2007. Accordingly, DIAS metadata are based on the ISO/TC211 metadata standards. From the viewpoint of data users, metadata are useful not only for data retrieval and analysis but also for interoperability and information sharing among experts, beginners and nonprofessionals. From the viewpoint of data providers, however, two problems were pointed out in discussions. One is that data providers prefer to minimize additional tasks and the time spent creating metadata. The other is that data providers want to manage and publish documents that explain their data sets more comprehensively. To solve these problems, we have been developing a document-centric metadata registration tool. Its key features are that the generated documents are available instantly and that metadata generation imposes no extra cost on data providers. The tool is implemented as a web application, so data providers need no additional software beyond a web browser. The interface presents the section titles of the documents; by filling out the content of each section, data providers automatically publish the documents for their data sets in PDF and HTML format. At the same moment, a metadata XML file compliant with ISO 19115 and ISO 19139 is created. The generated metadata are managed in the metadata database of the DIAS project, and will be used in various ISO 19139-compliant metadata management tools, such as GeoNetwork.
NASA Astrophysics Data System (ADS)
Barber, Jeffrey; Greca, Joseph; Yam, Kevin; Weatherall, James C.; Smith, Peter R.; Smith, Barry T.
2017-05-01
In 2016, the millimeter wave (MMW) imaging community initiated the formation of a standard for millimeter wave image quality metrics. This new standard, American National Standards Institute (ANSI) N42.59, will apply to active MMW systems for security screening of humans. The Electromagnetic Signatures of Explosives Laboratory at the Transportation Security Laboratory is supporting the ANSI standards process via the creation of initial prototypes for round-robin testing with MMW imaging system manufacturers and experts. Results obtained for these prototypes will be used to inform the community and lead to consensus objective standards amongst stakeholders. Images collected with laboratory systems are presented along with results of preliminary image analysis. Future directions for object design, data collection and image processing are discussed.
SAMPA: A free software tool for skin and membrane permeation data analysis.
Bezrouk, Aleš; Fiala, Zdeněk; Kotingová, Lenka; Krulichová, Iva Selke; Kopečná, Monika; Vávrová, Kateřina
2017-10-01
Skin and membrane permeation experiments comprise an important step in the development of a transdermal or topical formulation or in toxicological risk assessment. The standard method for analyzing these data relies on the linear part of a permeation profile. However, it is difficult to objectively determine when the profile becomes linear, or the experiment duration may be insufficient to reach a maximum or steady state. Here, we present a software tool for Skin And Membrane Permeation data Analysis, SAMPA, that is easy to use and overcomes several of these difficulties. The SAMPA method and software have been validated on in vitro and in vivo permeation data on human, pig and rat skin and on model stratum corneum lipid membranes, using compounds that range from highly lipophilic polycyclic aromatic hydrocarbons to a highly hydrophilic antiviral drug, with and without two permeation enhancers. The SAMPA performance was compared with the standard method using the linear part of the permeation profile and with a complex mathematical model. SAMPA is a user-friendly, open-source software tool for analyzing the data obtained from skin and membrane permeation experiments. It runs on a Microsoft Windows platform and is freely available as a Supporting file to this article. Copyright © 2017 Elsevier Ltd. All rights reserved.
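For context, the conventional analysis that SAMPA is compared against fits a line to the terminal, approximately linear part of the cumulative permeation profile; the slope estimates the steady-state flux and the x-intercept the lag time. The sketch below illustrates that standard calculation on fabricated data, with a deliberately naive choice of the "linear" points.

```python
# Conventional linear-regression analysis of a cumulative permeation profile
# (synthetic data; not SAMPA's algorithm). Slope = steady-state flux,
# x-intercept = lag time.
import numpy as np
from scipy.stats import linregress

t = np.array([0, 2, 4, 6, 8, 12, 16, 20, 24], dtype=float)           # time [h]
q = np.array([0, 0.1, 0.5, 1.4, 2.6, 5.3, 8.1, 10.9, 13.7])           # cumulative amount [ug/cm^2]

late = t >= 8                              # assume the profile is linear from 8 h on
fit = linregress(t[late], q[late])
flux = fit.slope                           # ug cm^-2 h^-1
lag = -fit.intercept / fit.slope           # h
print("steady-state flux: %.2f ug/cm2/h, lag time: %.1f h, r^2 = %.3f"
      % (flux, lag, fit.rvalue ** 2))
```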
ERIC Educational Resources Information Center
Texas State Technical Coll., Waco.
This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…
Coyne, Katherine; Mandalia, Sundhiya; McCullough, Sonya; Catalan, Jose; Noestlinger, Christiana; Colebunders, Robert; Asboe, David
2010-02-01
Erectile dysfunction is common in HIV-positive men who have sex with men (MSM). A standardized scale is needed to assess erectile function in clinical practice and research studies. The International Index of Erectile Function (IIEF) is a widely accepted tool for assessing erectile function designed for heterosexual men. We modified the tool for MSM. We present an analysis of internal consistency of the questionnaire in an HIV-positive cohort. The adapted questionnaire included modified questions within each of the five domains of the IIEF: (i) erectile function, (ii) intercourse satisfaction, (iii) orgasmic function, (iv) sexual desire, and (v) overall satisfaction with sex. MSM at seven European HIV treatment centers completed the questionnaire. Responses were analyzed for internal consistency using standardized Cronbach's alpha values within each of the five domains. A factor analysis was performed to confirm the domain structure of the questionnaire. Data from 486 MSM were analyzed. The factor analysis supported the domain structure described. Questions about erectile function, orgasmic function, and sexual desire performed well, with Cronbach's alpha values of 0.82, 0.83, and 0.89, respectively. Questions concerning intercourse satisfaction were less consistent (Cronbach's alpha 0.55) because frequency of attempts at sexual intercourse did not correlate with other responses. Responses about satisfaction with sex with a regular partner diverged from satisfaction with overall sex life. Frequency of morning erections diverged from other aspects of erectile function, whereas erections with masturbation correlated better. Internal consistency was high overall. This tool is suitable for HIV-positive MSM and can be used in screening, research, and monitoring treatment response.
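The internal-consistency statistic used above is Cronbach's alpha; as a hedged illustration, the function below computes the raw (unstandardized) form of alpha on a toy items-by-respondents matrix, not the study's data.

```python
# Raw Cronbach's alpha from an items-by-respondents score matrix (toy data).
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array of shape (n_respondents, n_items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

scores = np.array([[4, 5, 4, 5],
                   [2, 3, 2, 2],
                   [5, 5, 4, 5],
                   [3, 3, 3, 4],
                   [1, 2, 1, 2]])
print("alpha = %.2f" % cronbach_alpha(scores))
```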
NASA Astrophysics Data System (ADS)
Han, Xu; Xie, Guangping; Laflen, Brandon; Jia, Ming; Song, Guiju; Harding, Kevin G.
2015-05-01
In the real application environment of field engineering, a large variety of metrology tools is required by the technician to inspect part profile features. However, some of these tools are burdensome and address only a single application or measurement. In other cases, standard tools lack the capability of accessing irregular profile features. Customers of field engineering want the next generation of metrology devices to be able to replace the many current tools with one single device. This paper describes a method based on the ring optical gage concept for the measurement of numerous kinds of profile features useful to the field technician. The ring optical system is composed of a collimated laser, a conical mirror and a CCD camera. To be useful for a wide range of applications, the ring optical system requires profile feature extraction algorithms and data manipulation directed toward real-world applications in field operation. The paper discusses such practical applications as measuring a non-ideal round hole with both off-centered and oblique axes. The algorithms needed to analyze other features, such as the width of gaps, the radius of transition fillets, the fall of step surfaces, and surface parallelism, are also discussed. With the assistance of image processing and geometric algorithms, these features can be extracted with reasonable performance. Tailoring the feature extraction analysis to this specific gage offers the potential for a wider application base beyond simple inner diameter measurements. The paper presents experimental results that are compared with standard gages to prove the performance and feasibility of the analysis in real-world field engineering. Potential accuracy improvement methods, a new dual-ring design and future work are discussed at the end of the paper.
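A basic geometric step needed for hole measurements of this kind is recovering a centre and radius from extracted profile points; the sketch below shows an algebraic (Kasa) least-squares circle fit on noisy synthetic points. It is an independent illustration, not the authors' algorithm.

```python
# Algebraic (Kasa) least-squares circle fit to noisy synthetic profile points.
import numpy as np

rng = np.random.default_rng(3)
theta = rng.uniform(0, 2 * np.pi, 200)
x = 4.0 + 12.5 * np.cos(theta) + rng.normal(0, 0.02, theta.size)
y = -1.5 + 12.5 * np.sin(theta) + rng.normal(0, 0.02, theta.size)

# Solve x^2 + y^2 = 2*a*x + 2*b*y + c in the least-squares sense,
# where (a, b) is the centre and radius = sqrt(c + a^2 + b^2).
A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
a, b, c = np.linalg.lstsq(A, x ** 2 + y ** 2, rcond=None)[0]
radius = np.sqrt(c + a ** 2 + b ** 2)
print("centre = (%.3f, %.3f), radius = %.3f" % (a, b, radius))
```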
Kalendar, Ruslan; Tselykh, Timofey V; Khassenov, Bekbolat; Ramanculov, Erlan M
2017-01-01
This chapter introduces the FastPCR software as an integrated tool environment for PCR primer and probe design, which predicts the properties of oligonucleotides based on experimental studies of PCR efficiency. The software provides comprehensive facilities for designing primers for most PCR applications and their combinations. These include standard PCR as well as multiplex, long-distance, inverse, real-time, group-specific, unique, and overlap-extension PCR for multi-fragment assembly cloning, and loop-mediated isothermal amplification (LAMP). It also contains a built-in program to design oligonucleotide sets both for long sequence assembly by ligase chain reaction and for the design of amplicons that tile across a region(s) of interest. The software calculates the melting temperature for standard and degenerate oligonucleotides, including locked nucleic acid (LNA) and other modifications. It also provides analyses for a set of primers with prediction of oligonucleotide properties, dimer and G/C-quadruplex detection, and linguistic complexity, as well as a primer dilution and resuspension calculator. The program includes various bioinformatics tools for sequence analysis, such as GC or AT skew, CG% and GA% content, and the purine-pyrimidine skew. It also analyzes linguistic sequence complexity, generates random DNA sequences, and performs restriction endonuclease analysis. The program can find or create restriction enzyme recognition sites in coding sequences and supports the clustering of sequences. It performs efficient and complete detection of various repeat types with visual display. FastPCR supports batch processing of sequence files, which is essential for automation. The program is available for download at http://primerdigital.com/fastpcr.html, and its online version is located at http://primerdigital.com/tools/pcr.html.
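Among the sequence analyses listed is GC skew; the following sketch shows a simple sliding-window cumulative (G-C)/(G+C) calculation on an invented sequence, purely as an illustration of that statistic and not FastPCR's implementation.

```python
# Sliding-window GC-skew calculation, (G - C) / (G + C) per window.
# Illustrative only; not FastPCR code. The demo sequence is fabricated.
def gc_skew(seq, window=1000, step=200):
    """Return a list of (window start position, GC skew) pairs."""
    seq = seq.upper()
    out = []
    for start in range(0, max(len(seq) - window, 0) + 1, step):
        win = seq[start:start + window]
        g, c = win.count("G"), win.count("C")
        skew = (g - c) / (g + c) if (g + c) else 0.0
        out.append((start, skew))
    return out

demo = ("ATGGCGCGGATCCGGGCCCTTAAGGGCGC" * 50) + ("ATCCTTCCAATTCCGATCGATTGCA" * 50)
for pos, s in gc_skew(demo, window=500, step=250)[:4]:
    print(pos, round(s, 3))
```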
NASA System-Level Design, Analysis and Simulation Tools Research on NextGen
NASA Technical Reports Server (NTRS)
Bardina, Jorge
2011-01-01
A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) element of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program, to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes the accomplishments on baseline research and the advancements on design studies and system-level assessment, including cluster analysis as an annualization standard for air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.
Karasik, Avshalom; Rahimi, Oshrit; David, Michal; Weiss, Ehud; Drori, Elyashiv
2018-04-25
Grapevine (Vitis vinifera L.) is one of the classical fruits of the Old World. Among the thousands of domesticated grapevine varieties and variable wild sylvestris populations, the range of variation in pip morphology is very wide. In this study we scanned representative samples of grape pip populations in an attempt to probe the possibility of using 3D scanning as a tool for grape variety identification. The scanning was followed by mathematical and statistical analysis using innovative algorithms from the field of computer science. Using selected Fourier coefficients, a very clear separation was obtained between most of the varieties, with only very few overlaps. These results show that this method enables the separation of different Vitis vinifera varieties. Interestingly, when using the 3D approach to analyze pairs of varieties considered synonyms by the standard 22-SSR analysis approach, we found that the varieties in two of the considered synonym pairs were clearly separated by the morphological analysis. This work therefore suggests a new systematic tool for high-resolution variety discrimination.
Development of a site analysis tool for distributed wind projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaw, Shawn
The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.
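As a hedged back-of-the-envelope illustration of the kind of energy estimate such a site tool produces, the sketch below combines a Rayleigh wind-speed distribution with a crude turbine power curve; all numbers are invented and do not reflect DSAT's methodology.

```python
# Toy annual energy production (AEP) estimate: Rayleigh (Weibull k = 2) wind
# distribution x simple 10 kW power curve. Values are invented; not DSAT's model.
import numpy as np

mean_speed = 5.5                                   # long-term mean hub-height wind speed [m/s]
v = np.linspace(0.0, 25.0, 500)                    # wind speed bins [m/s]
k, c = 2.0, 2 * mean_speed / np.sqrt(np.pi)        # Rayleigh shape and scale
pdf = (k / c) * (v / c) ** (k - 1) * np.exp(-(v / c) ** k)

def power_kw(v):                                   # crude 10 kW turbine power curve
    return np.where(v < 3, 0.0,
           np.where(v < 11, 10.0 * ((v - 3) / 8) ** 3,
           np.where(v < 25, 10.0, 0.0)))

expected_power_kw = float((power_kw(v) * pdf).sum() * (v[1] - v[0]))
print("estimated annual energy production: %.0f kWh" % (expected_power_kw * 8760))
```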
Proceedings of the 11th Thermal and Fluids Analysis Workshop
NASA Astrophysics Data System (ADS)
Sakowski, Barbara
2002-07-01
The Eleventh Thermal & Fluids Analysis WorkShop (TFAWS 2000) was held the week of August 21-25 at The Forum in downtown Cleveland. This year's annual event focused on building stronger links between the research community and the engineering design/application world and celebrated the theme "Bridging the Gap Between Research and Design". Dr. Simon Ostrach delivered the keynote address "Research for Design (R4D)" and encouraged a more deliberate approach to performing research with near-term engineering design applications in mind. Over 100 persons attended TFAWS 2000, including participants from five different countries. This year's conference devoted a full-day seminar to the discussion of analysis and design tools associated with aeropropulsion research at the Glenn Research Center. As in previous years, the workshop also included hands-on instruction in state-of-the-art analysis tools, paper sessions on selected topics, short courses and application software demonstrations. TFAWS 2000 was co-hosted by the Thermal/Fluids Systems Design and Analysis Branch of NASA GRC and by the Ohio Aerospace Institute and was co-chaired by Barbara A. Sakowski and James R. Yuko. The annual NASA delegates meeting is a standard component of TFAWS, where the civil servants of the various centers represented discuss current and future events which affect the Community of Applied Thermal and Fluid ANalystS (CATFANS). At this year's delegates meeting, the following goals (among others) were set by the collective body of delegates: (1) participation of all Centers in the update of the NASA material properties database (TPSX); (2) developing and collaboratively supporting multi-center proposals; (3) expanding the scope of TFAWS to include other federal laboratories; (4) initiating white papers on thermal tools and standards; and (5) forming an Agency-wide TFAWS steering committee.
Erdal, Barbaros Selnur; Yildiz, Vedat; King, Mark A.; Patterson, Andrew T.; Knopp, Michael V.; Clymer, Bradley D.
2012-01-01
Background: Chest CT scans are commonly used to clinically assess disease severity in patients presenting with pulmonary sarcoidosis. Despite their ability to reliably detect subtle changes in lung disease, the utility of chest CT scans for guiding therapy is limited by the fact that image interpretation by radiologists is qualitative and highly variable. We sought to create a computerized CT image analysis tool that would provide quantitative and clinically relevant information. Methods: We established that a two-point correlation analysis approach reduced the background signal attendant to normal lung structures, such as blood vessels, airways, and lymphatics while highlighting diseased tissue. This approach was applied to multiple lung fields to generate an overall lung texture score (LTS) representing the quantity of diseased lung parenchyma. Using deidentified lung CT scan and pulmonary function test (PFT) data from The Ohio State University Medical Center’s Information Warehouse, we analyzed 71 consecutive CT scans from patients with sarcoidosis for whom simultaneous matching PFTs were available to determine whether the LTS correlated with standard PFT results. Results: We found a high correlation between LTS and FVC, total lung capacity, and diffusing capacity of the lung for carbon monoxide (P < .0001 for all comparisons). Moreover, LTS was equivalent to PFTs for the detection of active lung disease. The image analysis protocol was conducted quickly (< 1 min per study) on a standard laptop computer connected to a publicly available National Institutes of Health ImageJ toolkit. Conclusions: The two-point image analysis tool is highly practical and appears to reliably assess lung disease severity. We predict that this tool will be useful for clinical and research applications. PMID:22628487
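The statistic named in the abstract is a two-point correlation; as a hedged, generic illustration (not the published LTS pipeline), the sketch below computes a radially averaged two-point correlation function S2(r) for a binary "diseased tissue" mask via FFT autocorrelation.

```python
# Two-point correlation S2(r) of a binary mask, computed by FFT autocorrelation
# and radial averaging. Toy segmentation; not the authors' LTS implementation.
import numpy as np

rng = np.random.default_rng(2)
mask = (rng.random((256, 256)) < 0.2).astype(float)        # toy mask, 20% "diseased"

F = np.fft.fft2(mask)
autocorr = np.fft.ifft2(F * np.conj(F)).real / mask.size    # P(both points in phase) vs lag
autocorr = np.fft.fftshift(autocorr)

# Radially average to get S2 as a function of separation distance r
cy, cx = np.array(autocorr.shape) // 2
yy, xx = np.indices(autocorr.shape)
r = np.hypot(yy - cy, xx - cx).astype(int)
counts = np.bincount(r.ravel())
s2 = np.bincount(r.ravel(), weights=autocorr.ravel()) / np.maximum(counts, 1)
print("S2 at r=0 (area fraction): %.3f, S2 at r=20: %.3f" % (s2[0], s2[20]))
```

For an uncorrelated mask, S2 drops from the area fraction at r = 0 toward its square at large r; structured (diseased) regions change that decay, which is what a texture score can exploit.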
Directional Dependence in Developmental Research
ERIC Educational Resources Information Center
von Eye, Alexander; DeShon, Richard P.
2012-01-01
In this article, we discuss and propose methods that may be of use to determine direction of dependence in non-normally distributed variables. First, it is shown that standard regression analysis is unable to distinguish between explanatory and response variables. Then, skewness and kurtosis are discussed as tools to assess deviation from…
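As a hedged illustration consistent with the (truncated) abstract, one direction-of-dependence heuristic compares the non-normality of residuals from the two competing regressions; the sketch below does this with skewness on synthetic data in which x truly causes y. The specific heuristic is an assumption drawn from the broader direction-dependence literature, not necessarily the authors' exact procedure.

```python
# Compare residual skewness of y ~ x versus x ~ y as a direction-of-dependence
# heuristic. Synthetic data; illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
x = rng.exponential(scale=1.0, size=2000)          # non-normal explanatory variable
y = 0.8 * x + rng.normal(0, 0.5, size=2000)        # y generated from x

def residual_skew(a, b):
    slope, intercept, *_ = stats.linregress(a, b)
    return stats.skew(b - (intercept + slope * a))

print("skewness of residuals, model y ~ x: %.2f" % residual_skew(x, y))
print("skewness of residuals, model x ~ y: %.2f" % residual_skew(y, x))
# Under this heuristic, the direction with the more nearly symmetric residuals
# (here y ~ x) is the more plausible causal direction.
```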
Jacques, Eveline; Wells, Darren M; Bennett, Malcolm J; Vissenberg, Kris
2015-01-01
High-resolution imaging of cytoskeletal structures paves the way for standardized methods to quantify cytoskeletal organization. Here we provide a detailed description of the analysis performed to determine the microtubule patterns in gravistimulated roots, using the recently developed software tool MicroFilament Analyzer.
Merging Quality Processes & Tools with DACUM.
ERIC Educational Resources Information Center
McLennan, Krystyna S.
This paper explains how merging DACUM (Developing a Curriculum) analysis with quality initiatives can reduce waste, increase job efficiency, assist in development of standard operating procedures, and involve employees in positive job improvement methods. In the first half of the paper, the following principles of total quality management (TQM)…
Strategies for Teaching Fractions: Using Error Analysis for Intervention and Assessment
ERIC Educational Resources Information Center
Spangler, David B.
2011-01-01
Many students struggle with fractions and must understand them before learning higher-level math. Veteran educator David B. Spangler provides research-based tools that are aligned with NCTM and Common Core State Standards. He outlines powerful diagnostic methods for analyzing student work and providing timely, specific, and meaningful…
ERIC Educational Resources Information Center
Ivancevich, Daniel M.; And Others
1996-01-01
Points out that political and economic pressures have sometimes caused the Financial Accounting Standards Board to alter standards. Presents a spreadsheet tool that demonstrates the economic consequences of adopting accounting standards. (SK)
Performance management of multiple access communication networks
NASA Astrophysics Data System (ADS)
Lee, Suk; Ray, Asok
1993-12-01
This paper focuses on the conceptual design, development, and implementation of a performance management tool for computer communication networks that serve large-scale integrated systems. The objective is to improve network performance in handling various types of messages by on-line adjustment of protocol parameters. The techniques of perturbation analysis of Discrete Event Dynamic Systems (DEDS), stochastic approximation (SA), and learning automata have been used in formulating the performance management algorithm. The efficacy of the performance management tool has been demonstrated on a network testbed. The conceptual design presented in this paper offers a step toward bridging the gap between management standards and users' demands for efficient network operations, since most standards, such as those of ISO (the International Organization for Standardization) and IEEE, address only the architecture, services, and interfaces for network management. The proposed concept of performance management can also be used as a general framework to assist the design, operation, and management of various DEDS such as computer-integrated manufacturing and battlefield C3 (Command, Control, and Communications) systems.
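As a hedged, much-simplified sketch of the stochastic-approximation idea named above (not the authors' algorithm), the snippet below uses a Robbins-Monro loop with decreasing gains to tune a hypothetical protocol parameter so that a noisy performance measurement approaches a target; the "network" here is a stand-in function.

```python
# Robbins-Monro stochastic approximation: tune a hypothetical retransmission
# timeout so that noisy measured delay approaches a target. Purely illustrative;
# the delay model below is an invented stand-in for on-line measurements.
import numpy as np

rng = np.random.default_rng(4)
target_delay = 10.0                                 # desired mean message delay [ms]

def measure_delay(timeout):
    """Stand-in for a noisy on-line measurement; relationship unknown to the tuner."""
    true_delay = 4.0 + 80.0 / timeout               # assumed hidden relationship
    return true_delay + rng.normal(0, 0.5)

timeout = 5.0
for n in range(1, 201):
    gain = 2.0 / n                                  # decreasing step size (Robbins-Monro)
    error = measure_delay(timeout) - target_delay
    timeout = max(1.0, timeout + gain * error)      # delay falls as timeout grows, so add

print("tuned timeout: %.2f  (hidden model gives mean delay %.1f ms)"
      % (timeout, 4.0 + 80.0 / timeout))
```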
IDD Info: a software to manage surveillance data of Iodine Deficiency Disorders.
Liu, Peng; Teng, Bai-Jun; Zhang, Shu-Bin; Su, Xiao-Hui; Yu, Jun; Liu, Shou-Jun
2011-08-01
IDD Info, a new software package for managing survey data on Iodine Deficiency Disorders (IDD), is presented in this paper. IDD Info aims to create IDD project databases, to process and analyze various national or regional surveillance data, and to produce final reports. It provides functions for choosing a database from existing ones, revising it, choosing indicators from a pool to establish a database, and adding indicators to the pool. It also provides simple tools to scan one database and compare two databases, to set IDD standard parameters, to analyze data by a single indicator or by multiple indicators, and finally to produce a typeset report with customized content. IDD Info was developed using the Chinese national IDD surveillance data of 2005. Its validity was evaluated by comparison with the survey report produced by the China CDC. IDD Info is a professional analysis tool that speeds up IDD data analysis by about 14.28% with respect to standard reference routines. It consequently enhances analysis performance and user compliance. IDD Info is a practical and accurate means of managing the multifarious IDD surveillance data and can be widely used by non-statisticians in national and regional IDD surveillance. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Brain tumor classification using AFM in combination with data mining techniques.
Huml, Marlene; Silye, René; Zauner, Gerald; Hutterer, Stephan; Schilcher, Kurt
2013-01-01
Although the classification of astrocytic tumors is standardized by the WHO grading system, which is mainly based on microscopy-derived, histomorphological features, there is great interobserver variability. The main causes are thought to be the complexity of morphological details varying from tumor to tumor and from patient to patient, variations in technical histopathological procedures such as staining protocols, and finally the individual experience of the diagnosing pathologist. Thus, to raise astrocytoma grading to a more objective standard, this paper proposes a methodology based on atomic force microscopy (AFM)-derived images of histopathological samples in combination with data mining techniques. By comparing AFM images with corresponding light microscopy images of the same area, the progressive formation of cavities due to cell necrosis was identified as a typical morphological marker for computer-assisted analysis. Using genetic programming as a tool for feature analysis, a best model was created that achieved 94.74% classification accuracy in distinguishing grade II tumors from grade IV ones. Combined with modern image analysis techniques, AFM may become an important tool in astrocytic tumor diagnosis. In this way, patients suffering from grade II tumors, which carry a lower risk of malignant transformation, are identified unambiguously. They would benefit from early adjuvant therapies.
Interactive cutting path analysis programs
NASA Technical Reports Server (NTRS)
Weiner, J. M.; Williams, D. S.; Colley, S. R.
1975-01-01
The operation of numerically controlled machine tools is interactively simulated. Four programs were developed to graphically display the cutting paths for a Monarch lathe, a Cintimatic mill, and a Strippit sheet metal punch, and the wiring path for a Standard wire wrap machine. These programs run on an IMLAC PDS-ID graphic display system under the DOS-3 disk operating system. The cutting path analysis programs accept input via both paper tape and disk file.
Phyllis C. Adams; Glenn A. Christensen
2012-01-01
A rigorous quality assurance (QA) process assures that the data and information provided by the Forest Inventory and Analysis (FIA) program meet the highest possible standards of precision, completeness, representativeness, comparability, and accuracy. FIA relies on its analysts to check the final data quality prior to release of a State's data to the national FIA...
MPHASYS: a mouse phenotype analysis system
Calder, R Brent; Beems, Rudolf B; van Steeg, Harry; Mian, I Saira; Lohman, Paul HM; Vijg, Jan
2007-01-01
Background Systematic, high-throughput studies of mouse phenotypes have been hampered by the inability to analyze individual animal data from a multitude of sources in an integrated manner. Studies generally make comparisons at the level of genotype or treatment thereby excluding associations that may be subtle or involve compound phenotypes. Additionally, the lack of integrated, standardized ontologies and methodologies for data exchange has inhibited scientific collaboration and discovery. Results Here we introduce a Mouse Phenotype Analysis System (MPHASYS), a platform for integrating data generated by studies of mouse models of human biology and disease such as aging and cancer. This computational platform is designed to provide a standardized methodology for working with animal data; a framework for data entry, analysis and sharing; and ontologies and methodologies for ensuring accurate data capture. We describe the tools that currently comprise MPHASYS, primarily ones related to mouse pathology, and outline its use in a study of individual animal-specific patterns of multiple pathology in mice harboring a specific germline mutation in the DNA repair and transcription-specific gene Xpd. Conclusion MPHASYS is a system for analyzing multiple data types from individual animals. It provides a framework for developing data analysis applications, and tools for collecting and distributing high-quality data. The software is platform independent and freely available under an open-source license [1]. PMID:17553167
Validity of the Kinect for Gait Assessment: A Focused Review
Springer, Shmuel; Yogev Seligmann, Galit
2016-01-01
Gait analysis may enhance clinical practice. However, its use is limited due to the need for expensive equipment which is not always available in clinical settings. Recent evidence suggests that Microsoft Kinect may provide a low cost gait analysis method. The purpose of this report is to critically evaluate the literature describing the concurrent validity of using the Kinect as a gait analysis instrument. An online search of PubMed, CINAHL, and ProQuest databases was performed. Included were studies in which walking was assessed with the Kinect and another gold standard device, and consisted of at least one numerical finding of spatiotemporal or kinematic measures. Our search identified 366 papers, from which 12 relevant studies were retrieved. The results demonstrate that the Kinect is valid only for some spatiotemporal gait parameters. Although the kinematic parameters measured by the Kinect followed the trend of the joint trajectories, they showed poor validity and large errors. In conclusion, the Kinect may have the potential to be used as a tool for measuring spatiotemporal aspects of gait, yet standardized methods should be established, and future examinations with both healthy subjects and clinical participants are required in order to integrate the Kinect as a clinical gait analysis tool. PMID:26861323
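A typical concurrent-validity statistic used in studies like those reviewed above is Bland-Altman agreement between the Kinect and a gold-standard system; the sketch below computes the bias and 95% limits of agreement on fabricated step-time values, purely for illustration.

```python
# Bland-Altman agreement between Kinect-derived and gold-standard step times.
# Values are fabricated for illustration.
import numpy as np

kinect   = np.array([0.52, 0.55, 0.49, 0.60, 0.58, 0.51, 0.54, 0.57])  # s
standard = np.array([0.53, 0.54, 0.50, 0.62, 0.57, 0.52, 0.55, 0.59])  # s

diff = kinect - standard
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print("bias = %.3f s, 95%% limits of agreement = [%.3f, %.3f] s"
      % (bias, bias - loa, bias + loa))
```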
NASA Astrophysics Data System (ADS)
Kadow, C.; Illing, S.; Kunst, O.; Cubasch, U.
2014-12-01
The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard using the meta information of the self-describing model, reanalysis and observational data sets. Apache Solr is used for indexing the different data projects into one common search environment. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their tools in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via the shell or the web system. Therefore, plugged-in tools gain automatically from transparency and reproducibility. Furthermore, when configurations match while starting an evaluation tool, the system suggests using results already produced by other users, saving CPU time, I/O and disk space. This study presents the different techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. website: www-miklip.dkrz.de visitor-login: guest password: miklip
NASA Astrophysics Data System (ADS)
Witherly, Jeffre
Research on student achievement indicates the U.S. K-12 education system is not adequately preparing American students to compete in the 21st century global economy in the areas of science and mathematics. Congress has asked the scientific entities of the federal government to help increase K-12 science learning by creating standards-based learning tools for science classrooms as part of a "voluntary curriculum." One problem facing federal entities, such as the National Institutes of Health (NIH), is the need to create science-learning tools that conform to the National Science Education Standards (NSES) for curriculum materials and, therefore, are standards-based and applicable to the K-12 curriculum. This case study sought to better understand the change process at one federal agency as it went from producing K-12 learning tools that were educational in nature to a program that produced K-12 standards-based learning tools: the NIH Science Curriculum Supplement Program (NIH SCSP). The NIH SCSP was studied to gain insight into how this change in educational approach occurred, what factors enabled or inhibited the change process, and what the long-term benefits of the NIH SCSP are to the NIH. Kurt Lewin's three-step theory of change guided data gathering and data analysis. Semi-structured interviews and programmatic document review served as the major data gathering sources. Details describing the process of organizational change at the NIH were revealed during analysis of these data following the coding of interview transcripts and written record documents. The study found the process of change at the NIH proceeded in a manner generally predicted by the Lewinian change model. Enablers to the change were cost-sharing with individual institutes, support of senior leadership, and crediting the role of individual institutes prominently in each supplement. The cost of creating a supplement was reported as the single inhibitor to the program. This case study yielded a detailed description of the process of change at this federal institution that may offer valuable insights to similar federal organizations confronting educational change. The study may also contribute to the existing body of knowledge regarding the process of organizational change in a federal setting.
fluff: exploratory analysis and visualization of high-throughput sequencing data
Georgiou, Georgios
2016-01-01
Summary. In this article we describe fluff, a software package that allows for simple exploration, clustering and visualization of high-throughput sequencing data mapped to a reference genome. The package contains three command-line tools to generate publication-quality figures in an uncomplicated manner using sensible defaults. Genome-wide data can be aggregated, clustered and visualized in a heatmap, according to different clustering methods. This includes a predefined setting to identify dynamic clusters between different conditions or developmental stages. Alternatively, clustered data can be visualized in a bandplot. Finally, fluff includes a tool to generate genomic profiles. As command-line tools, the fluff programs can easily be integrated into standard analysis pipelines. The installation is straightforward and documentation is available at http://fluff.readthedocs.org. Availability. fluff is implemented in Python and runs on Linux. The source code is freely available for download at https://github.com/simonvh/fluff. PMID:27547532
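The following sketch is not fluff's own code; it only illustrates, under assumed inputs (a matrix of read counts per genomic region across samples), the kind of clustering-plus-heatmap analysis the package automates, here with scikit-learn and matplotlib.

```python
# Illustration only (not fluff's implementation): cluster genome-wide signal
# across samples with k-means and display the clustered matrix as a heatmap.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
signal = rng.poisson(lam=5, size=(500, 4)).astype(float)   # 500 regions x 4 samples

# Row-normalize so clustering reflects the shape of the profile, not its height
norm = (signal - signal.mean(axis=1, keepdims=True)) / (signal.std(axis=1, keepdims=True) + 1e-9)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(norm)

order = np.argsort(labels)                                  # group rows by cluster
plt.imshow(norm[order], aspect="auto", cmap="RdBu_r")
plt.xlabel("sample"); plt.ylabel("genomic region (clustered)")
plt.colorbar(label="normalized signal")
plt.savefig("heatmap.png", dpi=150)
```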
Burau, Viola; Fenton, Laura
2009-01-01
This paper aims to identify variation in the introduction of New Public Management reforms in healthcare and how this variation is related to country-specific healthcare states. The analysis uses the introduction of clinical standards in Britain and Germany as cases. The two countries are characterised by interesting differences in the institutional set-up of their healthcare states and as such present ideal cases for exploring the specific ways in which healthcare states filter clinical standards as tools of a generic managerialism. Both countries have introduced clinical standards but, importantly, the substantive nature of the clinical standards differs, reflecting differences in initial institutional conditions. More specifically, in Britain clinical standards have taken the form of two parallel policies, which strengthen hierarchy-based governing and redefine professional self-regulation. In Germany, by contrast, clinical standards come in one single policy, which strengthens the hybrid of network- and hierarchy-based governing and, to some extent, also pure hierarchy-based forms of governing. First, with its cross-country comparative focus, the analysis is able to identify systematic variations across healthcare states and the specific ways in which they impact on the introduction of New Public Management. Second, with its focus on clinical standards, the analysis deals with the governance of medical practice as one of the central areas of healthcare states.
Evaluation of digital real-time PCR assay as a molecular diagnostic tool for single-cell analysis.
Chang, Chia-Hao; Mau-Hsu, Daxen; Chen, Ke-Cheng; Wei, Cheng-Wey; Chiu, Chiung-Ying; Young, Tai-Horng
2018-02-21
In a single-cell study, isolating and identifying single cells are essential, but these processes often require a large investment of time or money. The aim of this study was to isolate and analyse single cells using a novel platform, the PanelChip™ Analysis System, which includes a 2500-microwell chip and a digital real-time polymerase chain reaction (dqPCR) assay, in comparison with a standard real-time PCR (qPCR) assay. Through serial dilution of a standard of known concentration, namely pUC19, the accuracy and sensitivity levels of the two methodologies were compared. The two systems were tested on the basis of expression levels of the genetic markers vimentin, E-cadherin, N-cadherin and GAPDH in A549 lung carcinoma cells at two known concentrations. Furthermore, the influence of heparin, a known PCR inhibitor commonly found in blood samples, was evaluated in both methodologies. Finally, mathematical models were proposed and the single-cell separation method was verified; moreover, gene expression levels during the epithelial-mesenchymal transition of single cells under TGFβ1 treatment were measured. The conclusion drawn is that dqPCR performed using PanelChip™ is superior to standard qPCR in terms of sensitivity, precision, and heparin tolerance. The dqPCR assay is a potential tool for clinical diagnosis and single-cell applications.
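As a hedged illustration of chip-based digital PCR quantification (the generic Poisson correction, not necessarily the exact computation in the PanelChip software), the number of template copies loaded onto a 2500-well chip can be estimated from the fraction of positive wells:

```python
# Generic Poisson correction for digital PCR: with n partitions of which k are
# positive, the mean copies per partition is lambda = -ln(1 - k/n), so the
# total copies loaded is approximately lambda * n.
import math

def digital_pcr_copies(positive_wells: int, total_wells: int = 2500) -> float:
    """Estimate total template copies loaded onto the chip."""
    p = positive_wells / total_wells
    if p >= 1.0:
        raise ValueError("all partitions positive; sample too concentrated to quantify")
    lam = -math.log(1.0 - p)        # mean copies per partition (Poisson)
    return lam * total_wells

# Example: 900 of 2500 microwells positive
print(round(digital_pcr_copies(900), 1))   # ~1116 copies
```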
Carpal tunnel syndrome: Analysis of online patient information with the EQIP tool.
Frueh, F S; Palma, A F; Raptis, D A; Graf, C P; Giovanoli, P; Calcagni, M
2015-06-01
Patients suffering from carpal tunnel syndrome (CTS) actively search for medical information on the Internet, and the World Wide Web represents their main source of patient information. The aim of this study was to systematically assess the quality of patient information about CTS on the Internet. A qualitative and quantitative assessment of websites was performed with the modified Ensuring Quality Information for Patients (EQIP) tool, which contains 36 standardized items. Five hundred websites with information on CTS treatment options were identified through Google, Bing, Yahoo, Ask.com and AOL. After exclusion of duplicates and irrelevant websites, 110 websites were included. A median of 15 EQIP items was addressed, with the top website addressing 26 of 36 items; only five websites addressed more than 20 items, and quality scores were not significantly different between the various provider groups. Major complications such as median nerve injury were reported in 27% of the websites and their treatment in only 3%. This analysis revealed several critical shortcomings in the quality of the information provided to patients suffering from CTS. There is a collective need to provide interactive, informative and educational websites for standard procedures in hand surgery; these websites should be compatible with international quality standards for hand surgery procedures. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
MBS Measurement Tool for Swallow Impairment—MBSImp: Establishing a Standard
Martin-Harris, Bonnie; Brodsky, Martin B.; Michel, Yvonne; Castell, Donald O.; Schleicher, Melanie; Sandidge, John; Maxwell, Rebekah; Blair, Julie
2014-01-01
The aim of this study was to test reliability, content, construct, and external validity of a new modified barium swallowing study (MBSS) tool (MBSImp) that is used to quantify swallowing impairment. Multiple regression, confirmatory factor, and correlation analyses were used to analyze 300 in- and outpatients with heterogeneous medical and surgical diagnoses who were sequentially referred for MBS exams at a university medical center and private tertiary care community hospital. Main outcome measures were the MBSImp and index scores of aspiration, health status, and quality of life. Inter- and intrarater concordance were 80% or greater for blinded scoring of MBSSs. Regression analysis revealed contributions of eight of nine swallow types to impressions of overall swallowing impairment (p ≤ 0.05). Factor analysis revealed 13 significant components (loadings ≥ 0.5) that formed two impairment groupings (oral and pharyngeal). Significant correlations were found between Oral and Pharyngeal Impairment scores and Penetration-Aspiration Scale scores, and indexes of intake status, nutrition, health status, and quality of life. The MBSImp demonstrated clinical practicality, favorable inter- and intrarater reliability following standardized training, content, and external validity. This study reflects potential for establishment of a new standard for quantification and comparison of oropharyngeal swallowing impairment across patient diagnoses as measured on MBSS. PMID:18855050
Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)
NASA Technical Reports Server (NTRS)
Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.
2007-01-01
An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for sizing Extravehicular Activity System (EVAS) architectures and supporting related studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, familiarity with the subsystems is assumed on the part of the intended user group, both in operating the tool and in analyzing its results. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.
Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang
2018-01-01
Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks (proton exchange) in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as a suitable internal standard. Experimental conditions, including relaxation delay time, number of scans, and pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high-performance liquid chromatography, the method provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and a nonparametric test at the 95% confidence level indicated no significant difference between the two methods. qNMR is a simple and accurate quantitative tool with no need for specific corresponding reference standards. It has potential for the quantitative analysis of other peptide drugs and for the standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.
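For context, the generic internal-standard relation commonly used in 1H-qNMR assays is shown below; this is the textbook form, not a reproduction of the study's exact working.

```latex
% Generic internal-standard relation for 1H-qNMR (illustrative):
%   I = integrated signal area, N = number of protons behind that signal,
%   M = molar mass, m = weighed-in mass, P = purity (assay value).
\[
P_{\mathrm{analyte}} \;=\;
\frac{I_{\mathrm{analyte}}}{I_{\mathrm{std}}}\cdot
\frac{N_{\mathrm{std}}}{N_{\mathrm{analyte}}}\cdot
\frac{M_{\mathrm{analyte}}}{M_{\mathrm{std}}}\cdot
\frac{m_{\mathrm{std}}}{m_{\mathrm{analyte}}}\cdot
P_{\mathrm{std}}
\]
```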
Ashley, Dennis W; Mullins, Robert F; Dente, Christopher J; Garlow, Laura; Medeiros, Regina S; Atkins, Elizabeth V; Solomon, Gina; Abston, Dena; Ferdinand, Colville H
2017-09-01
Trauma center readiness costs are incurred to maintain essential infrastructure and capacity to provide emergent services on a 24/7 basis. These costs are not captured by traditional hospital cost accounting, and no national consensus exists on appropriate definitions for each cost. Therefore, in 2010, stakeholders from all Level I and II trauma centers developed a survey tool standardizing and defining trauma center readiness costs. The survey tool underwent minor revisions to provide further clarity, and the survey was repeated in 2013. The purpose of this study was to provide a follow-up analysis of readiness costs for Georgia's Level I and Level II trauma centers. Using the American College of Surgeons Resources for Optimal Care of the Injured Patient guidelines, four readiness cost categories were identified: Administrative, Clinical Medical Staff, Operating Room, and Education/Outreach. Through conference calls, webinars and face-to-face meetings with financial officers, trauma medical directors, and program managers from all trauma centers, standardized definitions for reporting readiness costs within each category were developed. This resulted in a survey tool for centers to report their individual readiness costs for one year. The total readiness cost for all Level I trauma centers was $34,105,318 (avg $6,821,064) and all Level II trauma centers was $20,998,019 (avg $2,333,113). Methodology to standardize and define readiness costs for all trauma centers within the state was developed. Average costs for Level I and Level II trauma centers were identified. This model may be used to help other states define and standardize their trauma readiness costs.
Application of Standards-Based Quantitative SEM-EDS Analysis to Oxide Minerals
NASA Astrophysics Data System (ADS)
Mengason, M. J.; Ritchie, N. W.; Newbury, D. E.
2016-12-01
SEM and EPMA analyses are powerful tools for documenting and evaluating the relationships between minerals in thin sections and for determining chemical compositions in situ. The time and costs associated with determining major, minor, and some trace element concentrations in geologic materials can be reduced thanks to advances in EDS spectrometer performance and the availability of software tools such as NIST DTSA II, which performs multiple linear least squares (MLLS) fitting of energy spectra from standards to the spectra from samples recorded under the same analytical conditions. MLLS fitting is able to overcome the spectral peak overlaps among the transition-metal elements that commonly occur in oxide minerals, previously seen as too difficult for EDS analysis, allowing rapid and accurate determination of concentrations. The quantitative use of EDS is demonstrated in the chemical analysis of magnetite (NMNH 114887) and ilmenite (NMNH 96189) from the Smithsonian Natural History Museum Microbeam Standards Collection. Average concentrations from nine total spots over three grains are given in mass %, listed as (recommended; measured concentration ± one standard deviation). Spectra were collected for sixty seconds live time at 15 kV and 10 nA over a 12 micrometer wide scan area. Analysis of magnetite yielded Magnesium (0.03; 0.04 ± 0.01), Aluminum (none given; 0.040 ± 0.006), Titanium (0.10; 0.11 ± 0.02), Vanadium (none given; 0.16 ± 0.01), Chromium (0.17; 0.14 ± 0.02), and Iron (70.71; 71.4 ± 0.2). Analysis of ilmenite yielded Magnesium (0.19; 0.183 ± 0.008), Aluminum (none given; 0.04 ± 0.02), Titanium (27.4; 28.1 ± 0.1), Chromium (none given; 0.04 ± 0.01), Manganese (3.69; 3.73 ± 0.03), Iron (36.18; 35.8 ± 0.1), and Niobium (0.64; 0.68 ± 0.03). The analysis of geologic materials by standards-based quantitative EDS is further illustrated with chemical analyses of oxides from ocean island basalts representing several locations globally, demonstrating the suitability of the method for evaluating trends in major and minor element concentrations and variability among locations. The shorter collection times of EDS, compared to WDS, allow greater sampling of the populations of oxides present as fine-grained quench products, in addition to sampling larger inclusions hosted by silicate minerals.
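A minimal sketch of the MLLS idea, using invented spectra and scipy's non-negative least squares rather than NIST DTSA II itself: the unknown spectrum is expressed as a linear combination of reference spectra recorded from standards under the same conditions.

```python
# Illustrative MLLS fit of an EDS spectrum (generic formulation, not DTSA-II code).
import numpy as np
from scipy.optimize import nnls

# Hypothetical inputs: each column of `standards` is a reference spectrum
# (counts per energy channel) from a standard measured under identical conditions.
n_channels = 2048
rng = np.random.default_rng(1)
standards = rng.random((n_channels, 6))          # e.g. Mg, Al, Ti, V, Cr, Fe references
true_weights = np.array([0.0, 0.0, 0.1, 0.2, 0.1, 0.6])
sample = standards @ true_weights + rng.normal(0, 0.01, n_channels)  # measured spectrum

weights, residual = nnls(standards, sample)      # MLLS fit with non-negativity
print("fitted relative intensities:", np.round(weights, 3))
# In a real quantification these fitted intensity ratios would then be converted
# to concentrations via matrix (ZAF or phi-rho-z) corrections.
```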
Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools
Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.
2014-01-01
Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303
Data standards for clinical research data collection forms: current status and challenges.
Richesson, Rachel L; Nadkarni, Prakash
2011-05-01
Case report forms (CRFs) are used for structured-data collection in clinical research studies. Existing CRF-related standards encompass structural features of forms and data items, content standards, and specifications for using terminologies. This paper reviews existing standards and discusses their current limitations. Because clinical research is highly protocol-specific, forms-development processes are more easily standardized than is CRF content. Tools that support retrieval and reuse of existing items will enable standards adoption in clinical research applications. Such tools will depend upon formal relationships between items and terminological standards. Future standards adoption will depend upon standardized approaches for bridging generic structural standards and domain-specific content standards. Clinical research informatics can help define tools requirements in terms of workflow support for research activities, reconcile the perspectives of varied clinical research stakeholders, and coordinate standards efforts toward interoperability across healthcare and research data collection.
Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.
Robles-Rubio, Carlos Alejandro; Bertolizio, Gianluca; Brown, Karen A; Kearney, Robert E
2015-01-01
Infants recovering from anesthesia are at risk of life threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorer ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential for longitudinal and multicenter studies.
ERIC Educational Resources Information Center
Texas State Technical Coll., Waco.
This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…
ERIC Educational Resources Information Center
Texas State Technical Coll., Waco.
The Machine Tool Advanced Skills Technology (MAST) consortium was formed to address the shortage of skilled workers for the machine tools and metals-related industries. Featuring six of the nation's leading advanced technology centers, the MAST consortium developed, tested, and disseminated industry-specific skill standards and model curricula for…
Coding gestural behavior with the NEUROGES-ELAN system.
Lausberg, Hedda; Sloetjes, Han
2009-08-01
We present a coding system combined with an annotation tool for the analysis of gestural behavior. The NEUROGES coding system consists of three modules that progress from gesture kinetics to gesture function. Grounded on empirical neuropsychological and psychological studies, the theoretical assumption behind NEUROGES is that its main kinetic and functional movement categories are differentially associated with specific cognitive, emotional, and interactive functions. ELAN is a free, multimodal annotation tool for digital audio and video media. It supports multileveled transcription and complies with such standards as XML and Unicode. ELAN allows gesture categories to be stored with associated vocabularies that are reusable by means of template files. The combination of the NEUROGES coding system and the annotation tool ELAN creates an effective tool for empirical research on gestural behavior.
Interoperability science cases with the CDPP tools
NASA Astrophysics Data System (ADS)
Nathanaël, J.; Cecconi, B.; André, N.; Bouchemit, M.; Gangloff, M.; Budnik, E.; Jacquey, C.; Pitout, F.; Durand, J.; Rouillard, A.; Lavraud, B.; Genot, V. N.; Popescu, D.; Beigbeder, L.; Toniutti, J. P.; Caussarieu, S.
2017-12-01
Data exchange protocols are never as efficient as when they are invisible to the end user, who is then able to discover data, to cross-compare observations and modeled data, and finally to perform in-depth analysis. Over the years these protocols, including SAMP from the IVOA and EPN-TAP from the Europlanet 2020 RI community, backed by standard web services, have been deployed in tools designed by the French Centre de Données de la Physique des Plasmas (CDPP), including AMDA, the Propagation Tool, 3DView, and others. This presentation will focus on science cases which show the capability of interoperability in the planetary and heliophysics contexts, involving both CDPP and companion tools. Europlanet 2020 RI has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208.
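A small, hedged example of the kind of interoperability SAMP enables, using astropy's SAMP client; it assumes a SAMP hub (e.g. one started by TOPCAT or another VO tool) is already running, and the table URL is a placeholder.

```python
# Broadcast a VOTable to every SAMP-connected tool via a running hub.
from astropy.samp import SAMPIntegratedClient

client = SAMPIntegratedClient()
client.connect()                       # register with the running hub

message = {
    "samp.mtype": "table.load.votable",         # ask listeners to load a VOTable
    "samp.params": {
        "url": "file:///tmp/observations.xml",  # placeholder path
        "name": "example time series",
    },
}
client.notify_all(message)             # broadcast to every connected tool
client.disconnect()
```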
NASA Technical Reports Server (NTRS)
Smith, Jeffrey H.; Drews, Michael
1990-01-01
The results of an effort to establish commonality and standardization of the generic crew extravehicular (crew-EVA) and telerobotic task analysis primitives used for the study of spaceborne operations are described. Although direct crew-EVA plans are the most visible output of spaceborne operations, significant ongoing efforts by a wide variety of projects and organizations also require tools for estimating crew-EVA and telerobotic times. Task analysis tools provide estimates for input to technical and cost tradeoff studies. A workshop was convened to identify the issues and needs involved in establishing a common language and syntax for task analysis primitives. In addition, establishing such a syntax was shown to take precedence over the level at which the syntax is applied. The syntax, lists of crew-EVA and telerobotic primitives, and the database in diskette form are presented.
Recipe for Success: Digital Viewables
NASA Technical Reports Server (NTRS)
LaPha, Steven; Gaydos, Frank
2014-01-01
The Engineering Services Contract (ESC) and Information Management Communication Support (IMCS) contract at Kennedy Space Center (KSC) provide services to NASA with respect to flight and ground systems design and development. These groups provide the tools, assistance, and best-practice methodologies required for efficient, optimized design and process development. The team is responsible for configuring and implementing systems and software, along with training, documentation, and the administration of standards. The team supports over 200 engineers and design specialists in the use of Windchill, Creo Parametric, NX, AutoCAD, and a variety of other design and analysis tools.
NASA Astrophysics Data System (ADS)
Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Schartner, Thomas; Kirchner, Ingo; Rust, Henning W.; Cubasch, Ulrich; Ulbrich, Uwe
2016-04-01
The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science. Freva runs on high performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the metadata of the self-describing model, reanalysis and observational data sets in a database. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their plugins in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools to the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and use of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. The integrated web shell (shellinabox) adds a degree of freedom in the choice of the working environment and can be used as a gateway to the research project's HPC. Plugins are able to integrate, for example, their post-processed results into the user's database. This allows post-processing plugins to feed statistical analysis plugins, which fosters an active exchange between the plugin developers of a research project. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a database. Configurations and results of the tools can be shared among scientists via the shell or web system. Plugged-in tools therefore benefit from transparency and reproducibility. Furthermore, if configurations match while starting an evaluation plugin, the system suggests using results already produced by other users, saving CPU hours, I/O, disk space and time. The efficient interaction between different technologies improves the Earth system modeling science framed by Freva.
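To make the plugin idea concrete, here is a purely illustrative sketch of a language-agnostic plugin contract; it is not Freva's actual API, and every class, field, and path name below is invented.

```python
# Hypothetical plugin contract: a plugin declares its parameters and a run entry
# point; a framework like the one described above would handle configuration,
# history logging and the web/shell front ends.
from dataclasses import dataclass, field
from typing import Callable, Dict, Any, List

@dataclass
class AnalysisPlugin:
    name: str
    parameters: Dict[str, Any]                  # defaults, shown in a web form or CLI
    run: Callable[[Dict[str, Any]], str]        # returns path to produced output
    tags: List[str] = field(default_factory=list)

def make_anomaly_plugin() -> AnalysisPlugin:
    def run(cfg: Dict[str, Any]) -> str:
        # A real plugin could call any executable (Python, R, shell, Fortran...);
        # here we only pretend and return where results would be written.
        return f"/work/results/{cfg['variable']}_{cfg['experiment']}_anomaly.nc"
    return AnalysisPlugin(
        name="temperature-anomaly",
        parameters={"variable": "tas", "experiment": "decadal1971", "season": "DJF"},
        run=run,
        tags=["decadal", "evaluation"],
    )

plugin = make_anomaly_plugin()
print(plugin.run({**plugin.parameters, "experiment": "decadal1981"}))
```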
VisIVO: A Tool for the Virtual Observatory and Grid Environment
NASA Astrophysics Data System (ADS)
Becciani, U.; Comparato, M.; Costa, A.; Larsson, B.; Gheller, C.; Pasian, F.; Smareglia, R.
2007-10-01
We present the new features of VisIVO, software for the visualization and analysis of astrophysical data, which can be retrieved from the Virtual Observatory framework, and of cosmological simulations; it runs on both Windows and GNU/Linux platforms. VisIVO is VO standards compliant and supports the most important astronomical data formats such as FITS, HDF5 and VOTables. It is free software and can be downloaded from the web site http://visivo.cineca.it. VisIVO can interoperate with other astronomical VO compliant tools through PLASTIC (PLatform for AStronomical Tool InterConnection). This feature allows VisIVO to share data with many other astronomical packages to further analyze the loaded data.
Enriching and improving the quality of linked data with GIS
NASA Astrophysics Data System (ADS)
Iwaniak, Adam; Kaczmarek, Iwona; Strzelecki, Marek; Lukowicz, Jaromar; Jankowski, Piotr
2016-06-01
Standardization of methods for data exchange in GIS has a long history predating the creation of the World Wide Web. The advent of the World Wide Web brought the emergence of new solutions for data exchange and sharing, including, more recently, standards proposed by the W3C for data exchange involving Semantic Web technologies and linked data. Despite the growing interest in integration, GIS and linked data are still two separate paradigms for describing and publishing spatial data on the Web. At the same time, both paradigms offer complementary ways of representing real-world phenomena and means of analysis using different processing functions. The complementarity of linked data and GIS can be leveraged to synergize both paradigms, resulting in richer data content and more powerful inferencing. The article presents an approach aimed at integrating linked data with GIS. The approach relies on the use of GIS tools for the integration, verification and enrichment of linked data. The GIS tools are employed to enrich linked data by furnishing access to collections of data resources, defining relationships between data resources, and subsequently facilitating GIS data integration with linked data. The proposed approach is demonstrated with examples using data from DBpedia, OSM, and tools developed by the authors for standard GIS software.
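As a hedged example of the enrichment step, the snippet below queries DBpedia for a place's published coordinates with SPARQLWrapper; the availability of the public endpoint and the exact properties on this resource are assumptions, and the returned point would still need verification against authoritative GIS data.

```python
# Fetch coordinates for a DBpedia resource via SPARQL and print them; a GIS
# layer could then be enriched or cross-checked with this linked-data statement.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery("""
    PREFIX geo: <http://www.w3.org/2003/01/geo/wgs84_pos#>
    SELECT ?lat ?long WHERE {
      <http://dbpedia.org/resource/Wroc%C5%82aw> geo:lat ?lat ; geo:long ?long .
    }
""")
sparql.setReturnFormat(JSON)
results = sparql.query().convert()

for row in results["results"]["bindings"]:
    print("lat:", row["lat"]["value"], "long:", row["long"]["value"])
```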
High preservation of DNA standards diluted in 50% glycerol.
Schaudien, Dirk; Baumgärtner, Wolfgang; Herden, Christiane
2007-09-01
Standard curves are important tools in real-time quantitative polymerase chain reaction (PCR) to precisely analyze gene expression patterns under physiologic and pathologic conditions. Handling of DNA standards often implies multiple cycles of freezing and thawing, which might affect DNA stability and integrity. This in turn might influence the reliability and reproducibility of quantitative measurements in real-time PCR assays. In this study, three DNA standards, for murine tumor necrosis factor (TNF) alpha, interferon (IFN) gamma, and the kainate-1 receptor, were diluted in 50% glycerol or water, subjected to 1, 4, or 16 cycles of freezing and thawing, and the amplified copy numbers after real-time PCR were compared. The standards diluted in water showed a reduction to 83%, 55%, and 50% after 4 cycles, and to 24%, 5%, and 4% after 16 cycles for the kainate-1 receptor, TNFalpha, and IFNgamma standards, respectively, when compared with a single cycle of freezing and thawing. Interestingly, all cDNA samples diluted in 50% glycerol were amplified in comparable copy numbers even after 16 cycles of freezing and thawing. The effect of standards that had undergone different numbers of freeze-thaw cycles on sample values was demonstrated by amplifying cDNA obtained from the brains of Borna disease virus-infected and noninfected TNF-transgenic mice. This revealed significant differences in measured cDNA copy numbers when using water-diluted DNA standards. In contrast, sample values did not vary when using glycerol-diluted standards that had been frozen and thawed 16 times. In conclusion, glycerol storage of DNA standards represents a suitable tool for the accurate and reproducible quantification of cDNA samples in real-time PCR analysis.
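To illustrate why stable standards matter, a generic qPCR standard-curve calculation is sketched below with made-up Cq values: the curve is fit against log10 copy number, and the amplification efficiency follows from its slope.

```python
# Generic qPCR standard-curve fit (values invented for the example).
import numpy as np

copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])          # serial dilution of the standard
cq     = np.array([15.1, 18.5, 21.9, 25.4, 28.8])     # measured quantification cycles

slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1                  # ideal slope ~ -3.32 -> E ~ 100%
print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")

# An unknown sample is then quantified from its Cq via the fitted line:
cq_sample = 23.0
log_copies = (cq_sample - intercept) / slope
print(f"estimated copies = {10 ** log_copies:.0f}")
```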
Gold-standard for computer-assisted morphological sperm analysis.
Chang, Violeta; Garcia, Alejandra; Hitschfeld, Nancy; Härtel, Steffen
2017-04-01
Published algorithms for the classification of human sperm heads are based on relatively small image databases that are not open to the public, so no direct comparison is available for competing methods. We describe a gold standard for morphological sperm analysis (SCIAN-MorphoSpermGS), a dataset of sperm head images with expert classification labels in one of the following classes: normal, tapered, pyriform, small or amorphous. This gold standard is intended for evaluating and comparing known techniques and future improvements to present approaches for the classification of human sperm heads in semen analysis. Although this paper does not provide a computational tool for morphological sperm analysis, we present a set of experiments comparing common sperm head description and classification techniques. This classification baseline is intended to serve as a reference for future improvements to present approaches for human sperm head classification. The gold standard provides a label for each sperm head, achieved by majority voting among experts. The classification baseline compares four supervised learning methods (1-Nearest Neighbor, naive Bayes, decision trees, and Support Vector Machine (SVM)) and three shape-based descriptors (Hu moments, Zernike moments and Fourier descriptors), reporting the accuracy and the true positive rate for each experiment. We used Fleiss' Kappa Coefficient to evaluate the inter-expert agreement and Fisher's exact test for inter-expert variability and statistically significant differences between descriptors and learning techniques. Our results confirm the high degree of inter-expert variability in morphological sperm analysis. Regarding the classification baseline, we show that none of the standard descriptors or classification approaches is well suited to the problem of sperm head classification. We found that the correct classification rate was highly variable when trying to discriminate among non-normal sperm heads. By using the Fourier descriptor and SVM, we achieved the best mean correct classification rate: only 49%. We conclude that the SCIAN-MorphoSpermGS will provide a standard tool for the evaluation of characterization and classification approaches for human sperm heads. Indeed, there is a clear need for a specific shape-based descriptor for human sperm heads and a specific classification approach to tackle the high variability within subcategories of abnormal sperm cells. Copyright © 2017 Elsevier Ltd. All rights reserved.
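A minimal sketch of one descriptor/classifier pair from the baseline (Hu moments with an SVM); the segmented head masks and expert labels are stubbed with random stand-ins, so this shows only the plumbing, not the reported results.

```python
# Hu-moment features + SVM classification over stubbed sperm-head masks.
import cv2
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def hu_features(binary_head_mask: np.ndarray) -> np.ndarray:
    """7 Hu moment invariants, log-scaled as is customary for classification."""
    hu = cv2.HuMoments(cv2.moments(binary_head_mask)).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

# Hypothetical stand-ins for the segmented head masks and expert labels
masks  = [np.random.default_rng(i).integers(0, 2, (60, 60)).astype(np.uint8) for i in range(100)]
labels = np.random.default_rng(0).integers(0, 5, 100)   # normal/tapered/pyriform/small/amorphous

X = np.array([hu_features(m) for m in masks])
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```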
Determination of diagnostic standards on saturated soil extracts for cut roses grown in greenhouses.
Franco-Hermida, John Jairo; Quintero, María Fernanda; Cabrera, Raúl Iskander; Guzman, José Miguel
2017-01-01
This work comprises the theoretical determination and validation of diagnostic standards for the analysis of saturated soil extracts for cut rose flower crops (Rosa spp.) growing in the Bogota Plateau, Colombia. The data included 684 plant tissue analyses and 684 corresponding analyses of saturated soil extracts, all collected between January 2009 and June 2013. The tissue and soil samples were selected from 13 rose farms and from cultivars grafted on the 'Natal Briar' rootstock. These concurrent samples of soil and plant tissues represented 251 production units (locations) of approximately 10,000 m2 distributed across the study area. The standards were conceived as a tool to improve the nutritional balance in the leaf tissue of rose plants and thereby define the norms for expressing optimum productive potential relative to nutritional conditions in the soil. To this end, previously determined diagnostic standards for rose leaf tissues were employed to obtain indices of foliar nutritional balance at each analyzed location and as criteria for determining the diagnostic norms for saturated soil extracts. Applying this methodology to foliar analysis showed a significantly higher correlation for the diagnostic indices; a similar behavior was observed in the analysis of saturated soil extracts, which thus becomes a powerful tool for integrated nutritional diagnosis. Leaf analyses determine the nutrients that most limit high yield, and analyses of saturated soil extracts make it possible to correct the fertigation formulations applied to soils or substrates. Recommendations are proposed to improve the balance in the soil-plant system, making a yield increase more probable. The main recommendations to increase and improve rose crop flower yields would be: continuously check the pH values of the saturated soil extracts (SSE), reduce the amounts of P, Fe, Zn and Cu in fertigation solutions, and carefully analyze the status of Mn in the soil-plant system.
Porcupine: A visual pipeline tool for neuroimaging analysis
Snoek, Lukas; Knapen, Tomas
2018-01-01
The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one’s analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one’s analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0. PMID:29746461
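For readers unfamiliar with Nipype, a minimal hand-written workflow of the kind Porcupine generates is sketched below; the interface choices, parameters, and file paths are illustrative assumptions, not Porcupine output, and FSL must be installed for the nodes to actually run.

```python
# Minimal Nipype workflow: skull-strip an anatomical image, then register it
# to a standard template (paths are placeholders).
from nipype import Node, Workflow
from nipype.interfaces import fsl

bet = Node(fsl.BET(frac=0.5), name="skullstrip")           # brain extraction
flirt = Node(fsl.FLIRT(dof=6), name="register")            # rigid-body registration
flirt.inputs.reference = "/usr/share/fsl/data/standard/MNI152_T1_2mm_brain.nii.gz"

wf = Workflow(name="preproc", base_dir="/tmp/nipype_work")
wf.connect(bet, "out_file", flirt, "in_file")               # BET output feeds FLIRT

bet.inputs.in_file = "/data/sub-01/anat/T1w.nii.gz"         # placeholder input path
wf.run()
```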
ESML for Earth Science Data Sets and Analysis
NASA Technical Reports Server (NTRS)
Graves, Sara; Ramachandran, Rahul
2003-01-01
The primary objective of this research project was to transition ESML from design to application. The resulting schema and prototype software will foster community acceptance for the 'define once, use anywhere' concept central to ESML. Supporting goals include: 1) Refinement of the ESML schema and software libraries in cooperation with the user community; 2) Application of the ESML schema and software to a variety of Earth science data sets and analysis tools; 3) Development of supporting prototype software for enhanced ease of use; 4) Cooperation with standards bodies in order to assure ESML is aligned with related metadata standards as appropriate; and 5) Widespread publication of the ESML approach, schema, and software.
Quality assurance and management in microelectronics companies: ISO 9000 versus Six Sigma
NASA Astrophysics Data System (ADS)
Lupan, Razvan; Kobi, Abdessamad; Robledo, Christian; Bacivarov, Ioan; Bacivarov, Angelica
2009-01-01
A strategy for the implementation of the Six Sigma method as an improvement solution for the ISO 9000:2000 quality standard is proposed. Our approach focuses on integrating the DMAIC cycle of the Six Sigma method with the PDCA process approach highly recommended by the ISO 9000:2000 standard. The Six Sigma steps applied to each part of the PDCA cycle are presented in detail, together with some tools and training examples. Based on this analysis, the authors conclude that applying the Six Sigma philosophy to the quality standard implementation process is the best way to achieve optimal results in quality progress and therefore in customer satisfaction.
Validating data analysis of broadband laser ranging
NASA Astrophysics Data System (ADS)
Rhodes, M.; Catenacci, J.; Howard, M.; La Lone, B.; Kostinski, N.; Perry, D.; Bennett, C.; Patterson, J.
2018-03-01
Broadband laser ranging combines spectral interferometry and a dispersive Fourier transform to achieve high-repetition-rate measurements of the position of a moving surface. Telecommunications fiber is a convenient tool for generating the large linear dispersions required for a dispersive Fourier transform, but standard fiber also has higher-order dispersion that distorts the Fourier transform. Imperfections in the dispersive Fourier transform significantly complicate the ranging signal and must be dealt with to make high-precision measurements. We describe in detail an analysis process for interpreting ranging data when standard telecommunications fiber is used to perform an imperfect dispersive Fourier transform. This analysis process is experimentally validated over a 27-cm scan of static positions, showing an accuracy of 50 μm and a root-mean-square precision of 4.7 μm.
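For orientation, the generic wavelength-to-time mapping behind a dispersive Fourier transform can be written as below; this is a textbook expansion, not the paper's exact treatment of the distortion.

```latex
% Relative group delay tau after a fiber of length L with dispersion parameter
% D(lambda), expanded about the center wavelength lambda_0 (illustrative form):
\[
\tau(\lambda) \;\approx\;
D(\lambda_0)\,L\,(\lambda-\lambda_0)
\;+\; \tfrac{1}{2}\,\frac{dD}{d\lambda}\Big|_{\lambda_0} L\,(\lambda-\lambda_0)^{2}
\]
% The first term is the ideal linear wavelength-to-time mapping; the second,
% higher-order term is the kind of contribution from standard telecom fiber
% that distorts the transform and must be corrected in the analysis.
```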
Standardized Curriculum for Machine Tool Operation/Machine Shop.
ERIC Educational Resources Information Center
Mississippi State Dept. of Education, Jackson. Office of Vocational, Technical and Adult Education.
Standardized vocational education course titles and core contents for two courses in Mississippi are provided: machine tool operation/machine shop I and II. The first course contains the following units: (1) orientation; (2) shop safety; (3) shop math; (4) measuring tools and instruments; (5) hand and bench tools; (6) blueprint reading; (7)…
Mitchell, Alex J; Meader, Nick; Davies, Evan; Clover, Kerrie; Carter, Gregory L; Loscalzo, Matthew J; Linden, Wolfgang; Grassi, Luigi; Johansen, Christoffer; Carlson, Linda E; Zabora, James
2012-10-01
To examine the validity of screening and case-finding tools used in the identification of depression as defined by an ICD-10/DSM-IV criterion standard. We identified 63 studies involving 19 tools (in 33 publications) designed to help clinicians identify depression in cancer settings. We used a standardized rating system. We excluded 11 tools without at least two independent studies, leaving 8 tools for comparison. Across all cancer stages there were 56 diagnostic validity studies (n=10,009). For case-finding, one stem question, two stem questions and the BDI-II all had level 2 evidence (2a, 2b and 2c, respectively); given their better acceptability, we gave the stem questions a grade B recommendation. For screening, two stem questions had level 1b evidence (with high acceptability) and the BDI-II had level 2c evidence. For every 100 people screened in advanced cancer, the two questions would accurately detect 18 cases, while missing only 1, and correctly reassure 74, with 7 falsely identified. For every 100 people screened in non-palliative settings, the BDI-II would accurately detect 17 cases, missing 2 and correctly reassuring 70, with 11 falsely identified as cases. The main cautions are the reliance on DSM-IV definitions of major depression, the large number of small studies and the paucity of data for many tools in specific settings. Although no single tool could be offered unqualified support, several tools are likely to improve upon unassisted clinical recognition. In clinical practice, all tools should form part of an integrated approach involving further follow-up, clinical assessment and evidence-based therapy. Copyright © 2012 Elsevier B.V. All rights reserved.
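The per-100-screened figures quoted for the two stem questions in advanced cancer translate into the following operating characteristics (a simple check of the numbers above):

```python
# Worked check of the per-100-screened figures for the two stem questions in
# advanced cancer: TP=18, FN=1, TN=74, FP=7.
tp, fn, tn, fp = 18, 1, 74, 7

sensitivity = tp / (tp + fn)        # 0.947 -- cases correctly detected
specificity = tn / (tn + fp)        # 0.914 -- non-cases correctly reassured
ppv = tp / (tp + fp)                # 0.72  -- chance a positive screen is a true case
npv = tn / (tn + fn)                # 0.987 -- chance a negative screen rules depression out

print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, "
      f"PPV={ppv:.2f}, NPV={npv:.2f}")
```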
French, Anna; Bravery, Christopher; Smith, James; Chandra, Amit; Archibald, Peter; Gold, Joseph D; Artzi, Natalie; Kim, Hae-Won; Barker, Richard W; Meissner, Alexander; Wu, Joseph C; Knowles, Jonathan C; Williams, David; García-Cardeña, Guillermo; Sipp, Doug; Oh, Steve; Loring, Jeanne F; Rao, Mahendra S; Reeve, Brock; Wall, Ivan; Carr, Andrew J; Bure, Kim; Stacey, Glyn; Karp, Jeffrey M; Snyder, Evan Y; Brindley, David A
2015-03-01
There is a need for physical standards (reference materials) to ensure both reproducibility and consistency in the production of somatic cell types from human pluripotent stem cell (hPSC) sources. We have outlined the need for reference materials (RMs) in relation to the unique properties and concerns surrounding hPSC-derived products and suggest in-house approaches to RM generation relevant to basic research, drug screening, and therapeutic applications. hPSCs have an unparalleled potential as a source of somatic cells for drug screening, disease modeling, and therapeutic application. Undefined variation and product variability after differentiation to the lineage or cell type of interest impede efficient translation and can obscure the evaluation of clinical safety and efficacy. Moreover, in the absence of a consistent population, data generated from in vitro studies could be unreliable and irreproducible. Efforts to devise approaches and tools that facilitate improved consistency of hPSC-derived products, both as development tools and therapeutic products, will aid translation. Standards exist in both written and physical form; however, because many unknown factors persist in the field, premature written standards could inhibit rather than promote innovation and translation. We focused on the derivation of physical standard RMs. We outline the need for RMs and assess the approaches to in-house RM generation for hPSC-derived products, a critical tool for the analysis and control of product variation that can be applied by researchers and developers. We then explore potential routes for the generation of RMs, including both cellular and noncellular materials and novel methods that might provide valuable tools to measure and account for variation. Multiparametric techniques to identify "signatures" for therapeutically relevant cell types, such as neurons and cardiomyocytes that can be derived from hPSCs, would be of significant utility, although physical RMs will be required for clinical purposes. ©AlphaMed Press.
IgSimulator: a versatile immunosequencing simulator.
Safonova, Yana; Lapidus, Alla; Lill, Jennie
2015-10-01
The recent introduction of next-generation sequencing technologies to antibody studies has resulted in a growing number of immunoinformatics tools for antibody repertoire analysis. However, benchmarking these newly emerging tools remains problematic, since the gold standard datasets needed to validate them are typically not available. Because simulating antibody repertoires is often the only feasible way to benchmark new immunoinformatics tools, we developed the IgSimulator tool, which addresses various complications in generating realistic antibody repertoires. IgSimulator's code has a modular structure and can be easily adapted to new simulation requirements. IgSimulator is open source and freely available as a C++ and Python program running on all Unix-compatible platforms. The source code is available from yana-safonova.github.io/ig_simulator. safonova.yana@gmail.com Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
ERIC Educational Resources Information Center
Sheehan, Kathleen M.; Kostin, Irene; Napolitano, Diane; Flor, Michael
2014-01-01
This article describes TextEvaluator, a comprehensive text-analysis system designed to help teachers, textbook publishers, test developers, and literacy researchers select reading materials that are consistent with the text complexity goals outlined in the Common Core State Standards. Three particular aspects of the TextEvaluator measurement…
Assessing Customer Satisfaction at the NIST Research Library: Essential Tool for Future Planning
ERIC Educational Resources Information Center
Liu, Rosa; Allmang, Nancy
2008-01-01
This article describes a campus-wide customer satisfaction survey undertaken by the National Institute of Standards and Technology (NIST) Research Library in 2007. The methodology, survey instrument, data analysis, results, and actions taken in response to the survey are described. The outcome and recommendations will guide the library both…
Assessment and Educational Policy.
ERIC Educational Resources Information Center
Smith, Virginia B.
1975-01-01
Because of increased access of postsecondary education in the 1950's and 1960's, higher education cost analysis gained importance. Attempts have been made to develop a standard unit cost, but it is hard to see unit cost accounting by itself as a valuable tool for public accountability or policy making. For these purposes a cost-effectiveness ratio…
DACUM: Bridging the Gap between Work and High Performance.
ERIC Educational Resources Information Center
Norton, Robert E.; McLennan, Krystyna S.
The DACUM (Developing A Curriculum) occupational analysis process provides a systematic way to look at worker duties and tasks so that important knowledge, skills, standards, tools, and attitudes can be handed on to the next generation of workers. Revamped by The Ohio State University's Center on Education and Training for Employment, DACUM…
Experiencing the Progress Report: An Analysis of Gender and Administration in Doctoral Candidature
ERIC Educational Resources Information Center
Mewburn, Inger; Cuthbert, Denise; Tokareva, Ekaterina
2014-01-01
Most universities around the world put in place administrative processes and systems to manage student progress. These processes usually involve filling out standardised forms and instruments: managerial tools intended to increase transparency, promote efficiency and ensure fairness by applying the same standards to all. The progress report is a…
Methodology to assess clinical liver safety data.
Merz, Michael; Lee, Kwan R; Kullak-Ublick, Gerd A; Brueckner, Andreas; Watkins, Paul B
2014-11-01
Analysis of liver safety data is by nature multivariate and needs to take into account the time dependency of observations. Current standard tools for liver safety assessment, such as summary tables, individual data listings, and narratives, address these requirements to a limited extent only. Using graphics in the context of a systematic workflow, including predefined graph templates, is a valuable addition to the standard instruments, helping to ensure completeness of evaluation and supporting both hypothesis generation and testing. Employing graphical workflows interactively allows analysis in a team-based setting and facilitates identification of the most suitable graphics for publishing and regulatory reporting. Another important tool is statistical outlier detection, accounting for the fact that, for the assessment of drug-induced liver injury, identification and thorough evaluation of extreme values have much more relevance than measures of central tendency in the data. Taken together, systematic graphical data exploration and statistical outlier detection may have the potential to significantly improve the assessment and interpretation of clinical liver safety data. A workshop was convened to discuss best practices for the assessment of drug-induced liver injury (DILI) in clinical trials.
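As one concrete example of rule-based outlier flagging in liver safety review, the sketch below applies the widely used Hy's-law/eDISH-style thresholds; this is a generic illustration, not the workshop's specific workflow, and the data frame is hypothetical.

```python
# Flag visits with ALT > 3x ULN, and potential Hy's-law cases with concurrent
# bilirubin > 2x ULN, for narrative review (values expressed as multiples of ULN).
import pandas as pd

labs = pd.DataFrame({
    "subject":   ["001", "001", "002", "003"],
    "alt_xuln":  [1.2, 5.4, 2.9, 8.1],
    "bili_xuln": [0.8, 2.3, 0.9, 1.1],
})

labs["alt_elevated"] = labs["alt_xuln"] > 3
labs["possible_hys_law"] = (labs["alt_xuln"] > 3) & (labs["bili_xuln"] > 2)

print(labs[labs["possible_hys_law"]])   # candidates for individual case review
```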
LLIMAS: Revolutionizing integrated modeling and analysis at MIT Lincoln Laboratory
NASA Astrophysics Data System (ADS)
Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.
2017-08-01
MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.
Developing tools for digital radar image data evaluation
NASA Technical Reports Server (NTRS)
Domik, G.; Leberl, F.; Raggam, J.
1986-01-01
The refinement of radar image analysis methods has led to a need for a systems approach to radar image processing software. Developments stimulated by satellite radar are combined with standard image processing techniques to create a user environment for manipulating and analyzing airborne and satellite radar images. One aim is to derive radar products from the original digital images that make their content easier for the user to understand; these are called secondary image products. Another aim is to support interactive SAR image analysis. Software methods permit use of a digital height model to create ortho images, synthetic images, stereo-ortho images, radar maps, or color combinations of different component products. Efforts are ongoing to integrate the individual tools into a combined hardware/software environment for interactive radar image analysis.
Sipes, Carolyn; Hunter, Kathleen; McGonigle, Dee; West, Karen; Hill, Taryn; Hebda, Toni
2017-12-01
Information technology use in healthcare delivery mandates a prepared workforce. The initial Health Information Technology Competencies tool resulted from a 2-year transatlantic effort by experts from the US and European Union to identify approaches to develop the skills and knowledge needed by healthcare workers. It was determined that competencies must be identified before strategies are established, resulting in a searchable database of more than 1000 competencies representing five domains, five skill levels, and more than 250 roles. Health Information Technology Competencies is available at no cost and supports role- or competency-based queries. Health Information Technology Competencies developers suggest its use for curriculum planning, job descriptions, and professional development. The Chamberlain College of Nursing informatics research team examined Health Information Technology Competencies for its possible application to our research and curricular development, initially comparing it with the TIGER-based Assessment of Nursing Informatics Competencies and the Nursing Informatics Competency Assessment of Level 3 and Level 4 tools, which examine informatics competencies at four levels of nursing practice. Additional analysis involved the 2015 Nursing Informatics: Scope and Standards of Practice. Informatics is a Health Information Technology Competencies domain, so clear delineation of nursing-informatics competencies was expected. Researchers found that the TIGER-based Assessment of Nursing Informatics Competencies and the Nursing Informatics Competency Assessment of Level 3 and Level 4 differed from Health Information Technology Competencies 2016 in focus, definitions, ascribed competencies, and defined levels of expertise. When Health Information Technology Competencies 2017 was compared against the nursing informatics scope and standards, researchers found an increase in the number of informatics competencies, but not to a significant degree. This is not surprising, given that Health Information Technology Competencies includes all healthcare workers, while the TIGER-based Assessment of Nursing Informatics Competencies and Nursing Informatics Competency Assessment of Level 3 and Level 4 tools and the American Nurses Association Nursing Informatics: Scope and Standards of Practice are nurse specific. No clear cross mapping across these tools and the standards of nursing informatics practice exists. Further examination and review are needed to translate Health Information Technology Competencies into a viable tool for nursing informatics use in the US.
Augmenting Traditional Static Analysis With Commonly Available Metadata
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Devin
Developers and security analysts have been using static analysis for a long time to analyze programs for defects and vulnerabilities, with some success. Generally a static analysis tool is run on the source code for a given program, flagging areas of code that need to be further inspected by a human analyst. These areas may be obvious bugs like potential buffer overflows, information leakage flaws, or the use of uninitialized variables. These tools tend to work fairly well - every year they find many important bugs. They are all the more impressive considering that they only examine the source code, which may be very complex. Now consider the amount of data available that these tools do not analyze. There are many pieces of information that would prove invaluable for finding bugs in code, such as a history of bug reports, a history of all changes to the code, information about committers, etc. By leveraging all this additional data, it is possible to find more bugs with less user interaction, as well as track useful metrics such as the number and type of defects injected by each committer. This dissertation provides a method for leveraging development metadata to find bugs that would otherwise be difficult to find using standard static analysis tools. We showcase two case studies that demonstrate the ability to find 0-day vulnerabilities in large and small software projects by finding new vulnerabilities in the cpython and Roundup open source projects.
A reference guide for tree analysis and visualization
2010-01-01
The quantities of data obtained by new high-throughput technologies, such as microarrays or ChIP-chip arrays, and by large-scale OMICS approaches, such as genomics, proteomics and transcriptomics, are becoming vast. Sequencing technologies are becoming cheaper and easier to use, and thus large-scale evolutionary studies of the origins of life and the evolution of all species become more and more challenging. Databases holding information about how data are related and how they are hierarchically organized are expanding rapidly. Clustering analysis is becoming more and more difficult to apply to very large amounts of data, since the results of these algorithms cannot be efficiently visualized. Most of the available visualization tools that are able to represent such hierarchies project data in 2D and often lack the necessary user-friendliness and interactivity. For example, current phylogenetic tree visualization tools are not able to display easy-to-understand large-scale trees with more than a few thousand nodes. In this study, we review tools for the visualization and analysis of biological trees that are currently available, mainly developed during the last decade. We describe the uniform, standard, computer-readable formats used to represent tree hierarchies and comment on the functionality and limitations of these tools. We also discuss how these tools can be developed further and how they should become integrated with various data sources. Here we focus on freely available software that offers users various tree-representation methodologies for biological data analysis. PMID:20175922
Yu, Mu Xue; Jiang, Xiao Yun; Li, Yi Juan; Shen, Zhen Yu; Zhuang, Si Qi; Gu, Yu Fen
2018-02-01
The effect of standardized parent history-taking training on the quality of medical records and the communication skills of pediatric interns was determined. Fifth-year interns who were undertaking a pediatric clinical practice rotation were randomized to intervention and control groups. All of the pediatric interns received history-taking training by lecture and bedside teaching. The pediatric interns in the intervention group also received standardized parent history-taking training. The following two outcome measures were used: the scores of medical records written by the pediatric interns after history-taking from real parents of pediatric patients, and the communication assessment tool (CAT), assessed by real parents. The scores for general information, history of present illness (HPI), past medical history, personal history, family history, diagnosis, diagnostic analysis, and differential diagnosis in the intervention group were significantly higher than in the control group (p < 0.05). Assessment with the CAT indicated that the real parents were more satisfied with the pediatric interns in the intervention group. Standardized parent history-taking training is effective in improving the quality of medical records written by pediatric interns, and is a superior teaching tool for clinical reasoning as well as communication skills in clinical pediatric practice.
Slow speed—fast motion: time-lapse recordings in physics education
NASA Astrophysics Data System (ADS)
Vollmer, Michael; Möllmann, Klaus-Peter
2018-05-01
Video analysis with a 30 Hz frame rate is a standard tool in physics education. The development of affordable high-speed cameras has extended the capabilities of the tool to much smaller time scales, down to the 1 ms range, using frame rates of typically up to 1000 frames per second, allowing us to study transient physics phenomena that happen too fast for the naked eye. Here we want to extend the range of phenomena that may be studied by video analysis in the opposite direction, by focusing on much longer time scales ranging from minutes and hours to many days or even months. We discuss this time-lapse method and the equipment needed, and give a few hints on how to produce such recordings for two specific experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustoni, Arnold L.
A laser hazard analysis and safety assessment was performed for the LH-40 IR Laser Rangefinder based on the 2000 version of the American National Standards Institute's Standard Z136.1, for the Safe Use of Lasers, and Z136.6, for the Safe Use of Lasers Outdoors. The LH-40 IR Laser is central to the Long Range Reconnaissance and Observation System (LORROS). The LORROS is being evaluated by the Department 4149 Group to determine its capability as a long-range assessment tool. The manufacturer lists the laser rangefinder as 'eye safe' (a Class 1 laser classified under the CDRH Compliance Guide for Laser Products and the 21 CFR 1040 Laser Product Performance Standard). It was necessary that SNL validate this prior to its use involving the general public. A formal laser hazard analysis is presented for the typical mode of operation.
Knowledge Representation Standards and Interchange Formats for Causal Graphs
NASA Technical Reports Server (NTRS)
Throop, David R.; Malin, Jane T.; Fleming, Land
2005-01-01
In many domains, automated reasoning tools must represent graphs of causally linked events. These include fault-tree analysis, probabilistic risk assessment (PRA), planning, procedures, medical reasoning about disease progression, and functional architectures. Each of these fields has its own requirements for the representation of causation, events, actors and conditions. The representations include ontologies of function and cause, data dictionaries for causal dependency, failure and hazard, and interchange formats between some existing tools. In none of the domains has a generally accepted interchange format emerged. The paper makes progress towards interoperability across the wide range of causal analysis methodologies. We survey existing practice and emerging interchange formats in each of these fields. Setting forth a set of terms and concepts that are broadly shared across the domains, we examine the several ways in which current practice represents them. Some phenomena are difficult to represent or to analyze in several domains. These include mode transitions, reachability analysis, positive and negative feedback loops, conditions correlated but not causally linked and bimodal probability distributions. We work through examples and contrast the differing methods for addressing them. We detail recent work in knowledge interchange formats for causal trees in aerospace analysis applications in early design, safety and reliability. Several examples are discussed, with a particular focus on reachability analysis and mode transitions. We generalize the aerospace analysis work across the several other domains. We also recommend features and capabilities for the next generation of causal knowledge representation standards.
quantGenius: implementation of a decision support system for qPCR-based gene quantification.
Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina
2017-05-25
Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for controlling errors in the final results. Because several factors can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" (http://quantgenius.nib.si), an open-access web application for reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of a user-guided, QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays, as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of a proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists obtain reliable results as the basis for biologically meaningful data interpretation.
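The standard-curve quantification with reference-gene normalization that quantGenius performs can be illustrated with a short, generic Python sketch; the dilution series, Cq values and the single shared curve below are hypothetical and do not reflect how the quantGenius application itself is implemented.

```python
import numpy as np

def fit_standard_curve(copies, cq):
    """Fit Cq = slope * log10(copies) + intercept from a dilution series of
    qPCR standards; also return the implied amplification efficiency."""
    slope, intercept = np.polyfit(np.log10(copies), cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0   # 1.0 corresponds to 100 %
    return slope, intercept, efficiency

def quantify(cq, slope, intercept):
    """Convert sample Cq values back to copy numbers via the fitted curve."""
    return 10 ** ((np.asarray(cq) - intercept) / slope)

# Hypothetical dilution series (copies per reaction) and measured Cq values.
std_copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
std_cq     = np.array([15.1, 18.4, 21.8, 25.1, 28.5])

s, b, eff = fit_standard_curve(std_copies, std_cq)

# Normalize a target gene to a reference gene measured in the same sample;
# in practice each assay would have its own standard curve.
target_copies    = quantify([22.3], s, b)
reference_copies = quantify([19.0], s, b)
normalized = target_copies / reference_copies
print(round(eff, 2), normalized)
```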
NASA Astrophysics Data System (ADS)
Weltzin, J. F.; Walls, R.; Guralnick, R. P.; Rosemartin, A.; Deck, J.; Powers, L. A.
2014-12-01
There is a wealth of biodiversity and environmental data that can provide the basis for addressing global scale questions of societal concern. However, our ability to discover, access and integrate these data for use in broader analyses is hampered by the lack of standardized languages and systems. New tools (e.g. ontologies, data standards, integration tools, unique identifiers) are being developed that enable establishment of a framework for linked and open data. Relative to other domains, these tools are nascent in biodiversity and environmental sciences and will require effort to develop, though work can capitalize on lessons learned from previous efforts. Here we discuss needed next steps to provide consistently described and formatted ecological data for immediate application in ecological analysis, focusing on integrating phenology, trait and environmental data to understand local to continental-scale biophysical processes and inform natural resource management practices. As more sources of data become available at finer spatial and temporal resolution, e.g., from national standardized earth observing systems (e.g., NEON, LTER and LTAR Networks, USA NPN), these challenges will become more acute. Here we provide an overview of the standards and ontology development landscape specifically related to phenological and trait data, and identify requirements to overcome current challenges. Second, we outline a workflow for formatting and integrating existing datasets to address key scientific and resource management questions such as: "What traits determine differential phenological responses to changing environmental conditions?" or "What is the role of granularity of observation, and of spatiotemporal scale, in controlling phenological responses to different driving variables?" Third, we discuss methods to semantically annotate datasets to greatly decrease time needed to assemble heterogeneous data for use in ecological analyses on varying spatial scales. We close by making a call to interested community members for a working group to model phenology, trait and environmental data products from continental-scale efforts (e.g. NEON, USA-NPN and others) focusing on ways to assure discoverability and interoperability.
Nekolaichuk, Cheryl; Huot, Ann; Gratton, Valérie; Bush, Shirley H; Tarumi, Yoko; Watanabe, Sharon M
2017-09-01
The Edmonton Symptom Assessment System-revised (ESAS-r) is a nine-item self-report symptom intensity tool developed for palliative care patients, with the option of adding a 10th patient-specific symptom. Due to growing international uptake, the ESAS-r has been translated into different languages. There has not been agreement, however, regarding a standard process for translation into multiple languages that also includes patients' perspectives. The purpose of this study was to develop a French version of the ESAS-r, using a standardized translation protocol, and to obtain palliative care patients' perspectives regarding the translated tool. We developed the French version using a standard translation method involving both professional translators (n = 2) and bilingual palliative care experts (n = 3). Fifteen Francophone participants recruited from palliative care sites in two urban centers in Canada completed the ESAS-r and provided feedback on the translation in the presence of a trained interviewer. Descriptive statistics and thematic analysis were used to analyze the quantitative and qualitative data, respectively. Based on participants' concerns, translations for four of the nine symptoms were revised: drowsiness, nausea, lack of appetite, and shortness of breath. Concerns expressed for three additional symptoms (depression, anxiety, and well-being) related to the overall difficulty of rating these symptoms, not to the translation. The French version of the ESAS-r is a credible tool for symptom assessment in Francophone patients. The study findings provide a vital step in the development of a standardized translation protocol, including patients' perspectives, that can be applied to other languages.
NASA Astrophysics Data System (ADS)
Mendoza, A. M. M.; Rastaetter, L.; Kuznetsova, M. M.; Mays, M. L.; Chulaki, A.; Shim, J. S.; MacNeice, P. J.; Taktakishvili, A.; Collado-Vega, Y. M.; Weigand, C.; Zheng, Y.; Mullinix, R.; Patel, K.; Pembroke, A. D.; Pulkkinen, A. A.; Boblitt, J. M.; Bakshi, S. S.; Tsui, T.
2017-12-01
The Community Coordinated Modeling Center (CCMC), with the fundamental goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research, has served as an integral hub for over 15 years, providing invaluable resources to both the space weather scientific and operational communities. CCMC has developed and provided innovative web-based points of access to tools ranging from the Runs-On-Request System, which provides unprecedented global access to the largest collection of state-of-the-art solar and space physics models; Integrated Space Weather Analysis (iSWA), a powerful dissemination system for space weather information; advanced online visualization and analysis tools for more accurate interpretation of model results; standard data formats for simulation data downloads; and mobile apps that let the scientific community view space weather data anywhere. In addition to supporting research and performing model evaluations, CCMC also supports space science education by hosting summer students through local universities. In this poster, we will showcase CCMC's latest innovative tools and services, including those that have revolutionized the way we do research and improved our operational space weather capabilities. CCMC's free tools and resources are all publicly available online (http://ccmc.gsfc.nasa.gov).
A new pathway to product standardization.
Whitcomb, J
2000-06-01
As the benefits of product standardization become more evident in improved financial, managerial, and clinical outcomes, tools to make the process easier will be in demand. Once a standardization program is established, e-commerce offers tools to keep it on track.
Biblio-MetReS: A bibliometric network reconstruction application and server
2011-01-01
Background Reconstruction of gene and/or protein networks from automated analysis of the literature is one of the current targets of text mining in biomedical research. Some user-friendly tools already perform this analysis on precompiled databases of abstracts of scientific papers. Other tools allow expert users to elaborate and analyze the full content of a corpus of scientific documents. However, to our knowledge, no user-friendly tool is available that simultaneously analyzes the latest set of scientific documents available online and reconstructs the set of genes referenced in those documents. Results This article presents such a tool, Biblio-MetReS, and compares its functioning and results to those of other widely used, user-friendly applications (iHOP, STRING). Under similar conditions, Biblio-MetReS creates networks that are comparable to those of other user-friendly tools. Furthermore, analysis of full-text documents provides more complete reconstructions than those that result from using only the abstract of the document. Conclusions Literature-based automated network reconstruction is still far from providing complete reconstructions of molecular networks. However, its value as an auxiliary tool is high, and it will increase as standards for reporting biological entities and relationships become more widely accepted and enforced. Biblio-MetReS is an application that can be downloaded from http://metres.udl.cat/. It provides an easy-to-use environment for researchers to reconstruct their networks of interest from an always up-to-date set of scientific documents. PMID:21975133
ERIC Educational Resources Information Center
Texas State Technical Coll., Waco.
This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…
Use of Electronic Health Record Tools to Facilitate and Audit Infliximab Prescribing.
Sharpless, Bethany R; Del Rosario, Fernando; Molle-Rios, Zarela; Hilmas, Elora
2018-01-01
The objective of this project was to assess a pediatric institution's use of infliximab and develop and evaluate electronic health record tools to improve safety and efficiency of infliximab ordering through auditing and improved communication. Best use of infliximab was defined through a literature review, analysis of baseline use of infliximab at our institution, and distribution and analysis of a national survey. Auditing and order communication were optimized through implementation of mandatory indications in the infliximab orderable and creation of an interactive flowsheet that collects discrete and free-text data. The value of the implemented electronic health record tools was assessed at the conclusion of the project. Baseline analysis determined that 93.8% of orders were dosed appropriately according to the findings of a literature review. After implementation of the flowsheet and indications, the time to perform an audit of use was reduced from 60 minutes to 5 minutes per month. Four months post implementation, data were entered by 60% of the pediatric gastroenterologists at our institution on 15.3% of all encounters for infliximab. Users were surveyed on the value of the tools, with 100% planning to continue using the workflow, and 82% stating the tools frequently improve the efficiency and safety of infliximab prescribing. Creation of a standard workflow by using an interactive flowsheet has improved auditing ability and facilitated the communication of important order information surrounding infliximab. Providers and pharmacists feel these tools improve the safety and efficiency of infliximab ordering, and auditing data reveal that the tools are being used.
Weather forecasting with open source software
NASA Astrophysics Data System (ADS)
Rautenhaus, Marc; Dörnbrack, Andreas
2013-04-01
To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.
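As a small, self-contained illustration of handling CF-annotated NetCDF forecast fields in Python, the sketch below builds a tiny dataset in memory with the open-source xarray package and selects a regional subset; the variable and coordinate names are hypothetical and the code is not taken from the Mission Support System.

```python
import numpy as np
import xarray as xr

# Build a tiny CF-style forecast field in memory; a real model file would
# instead be opened with xr.open_dataset("forecast.nc"), which decodes the
# CF time and unit metadata automatically.
lon = np.arange(0, 30, 5.0)
lat = np.arange(40, 61, 5.0)
time = np.array(["2013-04-01T00:00", "2013-04-01T06:00"], dtype="datetime64[ns]")
temp = 270 + 10 * np.random.default_rng(0).random((time.size, lat.size, lon.size))

ds = xr.Dataset(
    {"air_temperature": (("time", "lat", "lon"), temp, {"units": "K"})},
    coords={"time": time, "lat": lat, "lon": lon},
)

# Select the first forecast step over a sub-region and inspect it.
field = ds["air_temperature"].isel(time=0).sel(lon=slice(0, 15), lat=slice(40, 55))
print(float(field.mean()))   # quick sanity check of the selected region
```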
Developing a uniformed assessment tool to evaluate care service needs for disabled persons in Japan.
Takei, Teiji; Takahashi, Hiroshi; Nakatani, Hiroki
2008-05-01
In Japan, care services for disabled persons were until recently under rigid public-sector control in terms of both provision and funding. A reform introduced in 2003 brought a rapid increase in the utilization of services and a serious shortage of financial resources. Under these circumstances, the "Services and Supports for Persons with Disabilities Act" was enacted in 2005, requiring that the care service provision process be transparent, fair and standardized. The purpose of this study is to develop an objective tool for assessing the need for disability care. In the present study we evaluate 1423 cases of patients receiving care services in 60 municipalities, covering all three categories of disability (physical, intellectual and mental). Using the data on the total of 106 items, we conducted factor analysis and regression analysis to develop an assessment tool for people with disabilities. The data revealed that instrumental activities of daily living (IADL) play an essential role in assessing disability levels. The resulting uniformed assessment tool has been utilized to guide the types and quantity of care services throughout Japan.
Balikuddembe, Michael S; Wakholi, Peter K; Tumwesigye, Nazarius M; Tylleskär, Thorkild
2018-01-01
A third of women in childbirth are inadequately monitored, partly due to the tools used. Some stakeholders assert that the current labour monitoring tools are not efficient and need improvement to become more relevant to childbirth attendants. The study objective was to explore the expectations of maternity service providers for a mobile childbirth monitoring tool in maternity facilities in a low-income country like Uganda. Semi-structured interviews of purposively selected midwives and doctors in rural and urban childbirth facilities in Uganda were conducted before thematic data analysis. The childbirth providers expected a tool that enabled fast and secure childbirth record storage and sharing. They desired a tool that would automatically and conveniently register patient clinical findings, and actively provide interactive clinical decision support on a busy ward. The tool ought to support agreed-upon standards for good pregnancy outcomes but also be adaptable to the patient and to the providers' difficult working conditions. The tool's functionality should include clinical data management and real-time decision support for the midwives, while its non-functional attributes include versatility and security.
Using Performance Tools to Support Experiments in HPC Resilience
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naughton, III, Thomas J; Boehm, Swen; Engelmann, Christian
2014-01-01
The high performance computing (HPC) community is working to address fault tolerance and resilience concerns for current and future large-scale computing platforms. This is driving enhancements in programming environments, specifically research on enhancing message passing libraries to support fault-tolerant computing capabilities. The community has also recognized that tools for resilience experimentation are greatly lacking. However, we argue that there are several parallels between performance tools and resilience tools. As such, we believe the rich set of HPC performance-focused tools can be extended (repurposed) to benefit the resilience community. In this paper, we describe the initial motivation to leverage standard HPC performance analysis techniques to aid in developing diagnostic tools to assist fault tolerance experiments for HPC applications. These diagnosis procedures help to provide context for the system when errors (failures) occur. We describe our initial work in leveraging an MPI performance trace tool to assist in providing global context during fault injection experiments. Such tools will assist the HPC resilience community as it extends existing and new application codes to support fault tolerance.
Electrophoresis gel image processing and analysis using the KODAK 1D software.
Pizzonia, J
2001-06-01
The present article reports on the performance of the KODAK 1D Image Analysis Software for the acquisition of information from electrophoresis experiments and highlights the utility of several mathematical functions for subsequent image processing, analysis, and presentation. Digital images of Coomassie-stained polyacrylamide protein gels containing molecular weight standards and ethidium bromide stained agarose gels containing DNA mass standards are acquired using the KODAK Electrophoresis Documentation and Analysis System 290 (EDAS 290). The KODAK 1D software is used to optimize lane and band identification using features such as isomolecular weight lines. Mathematical functions for mass standard representation are presented, and two methods for estimation of unknown band mass are compared. Given the progressive transition of electrophoresis data acquisition and daily reporting in peer-reviewed journals to digital formats ranging from 8-bit systems such as EDAS 290 to more expensive 16-bit systems, the utility of algorithms such as Gaussian modeling, which can correct geometric aberrations such as clipping due to signal saturation common at lower bit depth levels, is discussed. Finally, image-processing tools that can facilitate image preparation for presentation are demonstrated.
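Two generic ways of estimating an unknown band's molecular mass from a ladder of standards, a global log-linear fit and piecewise interpolation of the log mass, can be sketched in a few lines of Python; the ladder values and mobilities are hypothetical and the code is not a reconstruction of the KODAK 1D software's own algorithms.

```python
import numpy as np

# Hypothetical protein ladder: masses (kDa) and relative mobilities (Rf).
std_mass_kda = np.array([250, 150, 100, 75, 50, 37, 25])
std_rf       = np.array([0.10, 0.20, 0.30, 0.38, 0.50, 0.60, 0.75])

def mass_loglinear(rf, std_rf, std_mass):
    """Global least-squares fit of log10(mass) against relative mobility."""
    slope, intercept = np.polyfit(std_rf, np.log10(std_mass), 1)
    return 10 ** (slope * rf + intercept)

def mass_pointwise(rf, std_rf, std_mass):
    """Piecewise-linear interpolation of log10(mass) between flanking standards."""
    return 10 ** np.interp(rf, std_rf, np.log10(std_mass))

unknown_rf = 0.44
print(mass_loglinear(unknown_rf, std_rf, std_mass_kda))   # regression estimate
print(mass_pointwise(unknown_rf, std_rf, std_mass_kda))   # interpolation estimate
```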
Benham, Brian; Hawley, Diane
2015-05-15
Students leave healthcare academic programs for a variety of reasons. When they attrite, it is disappointing for the student as well as their faculty. Advanced practice nursing and other healthcare professions require not only extensive academic preparation, but also the ability to critically evaluate patient care situations. The ability to critically evaluate a situation is not innate. Critical decision making skills are high-level skills that are difficult to assess. For the purpose of this review, critical decision making and critical thinking skills refer to the same constructs and are referred to globally as critical decision making skills. The objective of this review was to identify the effectiveness of tools used to evaluate critical decision making skills for applicants to healthcare graduate educational programs. Participants were adult (18 years of age or older) applicants to, students enrolled in, and/or recent graduates (within one year of completion) of healthcare graduate educational programs. Types of interventions: This review considered studies that evaluated the utilization of unique tools, as well as standard tools such as the Graduate Record Examination or grade point average, to evaluate critical decision making skills in graduate healthcare program applicants. Types of studies: Experimental and non-experimental studies were considered for inclusion. Types of outcomes: Successful quantitative evaluations based on specific field-of-study standards. The search strategy aimed to find both published and unpublished studies. Studies published in English after 1969 were considered for inclusion. Databases that include both published and unpublished (grey) literature were searched, and the reference lists of all retrieved articles were examined for further articles for inclusion. Selected papers were assessed by two independent reviewers using standardized critical appraisal instruments from the Joanna Briggs Institute; any disagreement between reviewers was resolved through discussion or with a third reviewer. Data were extracted independently by each reviewer using a Microsoft Excel spreadsheet and included study type, 'r' values, number of subjects and reported 'p' values, indexed by author, year and study title. The meta-analysis was performed using the Hunter and Schmidt method for effect size analysis; the equations were transposed into a Microsoft Excel spreadsheet for data entry, analysis and graph creation. No articles or papers addressing unique tools for ascertaining critical decision making skills met the inclusion criteria. Standard tools, which were represented in the literature, assess critical decision making skills via prediction of academic and clinical success, which indicates the presence of critical decision making skills in graduate healthcare students. A total of 16 studies addressing standard tools were included in this review; all were retrospective case series studies, dating from 1970 to 2009. The strongest relationship was the correlation of undergraduate grade point average with graduate grade point average (small effect size, r = 0.27, credibility interval 0.18-0.37). The second strongest relationship was between the Graduate Record Examination's verbal section and graduate grade point average (small effect size, r = 0.24, credibility interval 0.11-0.37).
An applicant's undergraduate GPA therefore has the strongest correlation with graduate healthcare program success of the indicators analyzed (r = 0.27, small effect size), followed by the GRE Verbal score (r = 0.24, small effect size); all of the variables examined carried positive correlations with graduate success, just with smaller effect sizes. This review supports the continued use of traditional indicators of graduate school potential, namely the undergraduate grade point average and the various sections of the Graduate Record Examination, for the selection of graduate healthcare applicants. Primary studies should be funded and performed to assess the use of unique tools in assessing critical thinking in graduate healthcare students. The Joanna Briggs Institute.
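For readers unfamiliar with the Hunter and Schmidt approach mentioned above, a bare-bones Python sketch of the calculation (sample-size-weighted mean correlation, observed and sampling-error variance, and an 80% credibility interval) is given below; the input correlations are hypothetical and the review's actual Excel implementation may differ in detail.

```python
import numpy as np

def hunter_schmidt_bare_bones(r, n, z_cred=1.28):
    """Bare-bones Hunter & Schmidt meta-analysis of correlations:
    sample-size-weighted mean r, observed variance, sampling-error variance,
    and an 80% credibility interval (z = 1.28)."""
    r, n = np.asarray(r, float), np.asarray(n, float)
    n_total = n.sum()
    r_bar = np.sum(n * r) / n_total                    # weighted mean r
    var_obs = np.sum(n * (r - r_bar) ** 2) / n_total   # observed variance
    n_bar = n_total / len(n)
    var_err = (1 - r_bar ** 2) ** 2 / (n_bar - 1)      # sampling-error variance
    var_rho = max(var_obs - var_err, 0.0)              # residual (true) variance
    half = z_cred * np.sqrt(var_rho)
    return r_bar, (r_bar - half, r_bar + half)

# Hypothetical correlations between undergraduate GPA and graduate GPA
# from three studies with different sample sizes.
r_bar, cred_int = hunter_schmidt_bare_bones([0.31, 0.22, 0.28], [120, 85, 200])
print(round(r_bar, 2), [round(x, 2) for x in cred_int])
```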
ERIC Educational Resources Information Center
Levy, Roy; Xu, Yuning; Yel, Nedim; Svetina, Dubravka
2015-01-01
The standardized generalized dimensionality discrepancy measure and the standardized model-based covariance are introduced as tools to critique dimensionality assumptions in multidimensional item response models. These tools are grounded in a covariance theory perspective and associated connections between dimensionality and local independence.…
Squires, Janet E.; Hayduk, Leslie; Hutchinson, Alison M.; Mallick, Ranjeeta; Norton, Peter G.; Cummings, Greta G.; Estabrooks, Carole A.
2015-01-01
Although organizational context is central to evidence-based practice, underdeveloped measurement hinders its assessment. The Alberta Context Tool, comprised of 59 items that tap 10 modifiable contextual concepts, was developed to address this gap. The purpose of this study was to examine the reliability and validity of scores obtained when the Alberta Context Tool is completed by professional nurses across different healthcare settings. Five separate studies (N = 2361 nurses across different care settings) comprised the study sample. Reliability and validity were assessed. Cronbach's alpha exceeded 0.70 for 9/10 Alberta Context Tool concepts. Item-total correlations exceeded acceptable standards for 56/59 items. Confirmatory factor analyses coordinated acceptably with the Alberta Context Tool's proposed latent structure. The mean values for each Alberta Context Tool concept increased from low to high levels of research utilization (as hypothesized), further supporting its validity. This study provides robust evidence for the reliability and validity of scores obtained with the Alberta Context Tool when administered to professional nurses. PMID:26098857
D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco
2016-02-01
Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separated and undifferentiated waste. The methodology allows cost-efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. It also supports variance analysis that can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently, and it can be implemented by companies lacking a sophisticated management accounting system. Copyright © 2015 Elsevier Ltd. All rights reserved.
Flow Cytometry Data Preparation Guidelines for Improved Automated Phenotypic Analysis.
Jimenez-Carretero, Daniel; Ligos, José M; Martínez-López, María; Sancho, David; Montoya, María C
2018-05-15
Advances in flow cytometry (FCM) increasingly demand adoption of computational analysis tools to tackle the ever-growing data dimensionality. In this study, we tested different data input modes to evaluate how cytometry acquisition configuration and data compensation procedures affect the performance of unsupervised phenotyping tools. An analysis workflow was set up and tested for the detection of changes in reference bead subsets and in a rare subpopulation of murine lymph node CD103 + dendritic cells acquired by conventional or spectral cytometry. Raw spectral data or pseudospectral data acquired with the full set of available detectors by conventional cytometry consistently outperformed datasets acquired and compensated according to FCM standards. Our results thus challenge the paradigm of one-fluorochrome/one-parameter acquisition in FCM for unsupervised cluster-based analysis. Instead, we propose to configure instrument acquisition to use all available fluorescence detectors and to avoid integration and compensation procedures, thereby using raw spectral or pseudospectral data for improved automated phenotypic analysis. Copyright © 2018 by The American Association of Immunologists, Inc.
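A minimal sketch of the kind of unsupervised analysis described, feeding a full (uncompensated) detector matrix into a generic clustering step after an arcsinh transform, is shown below; the simulated events, the cofactor and the use of k-means are illustrative assumptions, not the study's actual phenotyping pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans

# Simulated events x detectors matrix standing in for raw (uncompensated)
# intensities from all available fluorescence detectors.
rng = np.random.default_rng(0)
events = rng.lognormal(mean=4.0, sigma=1.0, size=(5000, 24))   # 24 detectors

cofactor = 150.0
transformed = np.arcsinh(events / cofactor)   # common cytometry transform

# Generic unsupervised clustering of the full detector matrix.
labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(transformed)
print(np.bincount(labels))                    # events per candidate phenotype
```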
Chi, Bryan; DeLeeuw, Ronald J; Coe, Bradley P; MacAulay, Calum; Lam, Wan L
2004-02-09
Array comparative genomic hybridization (CGH) is a technique which detects copy number differences in DNA segments. Complete sequencing of the human genome and the development of an array representing a tiling set of tens of thousands of DNA segments spanning the entire human genome has made high resolution copy number analysis throughout the genome possible. Since array CGH provides signal ratio for each DNA segment, visualization would require the reassembly of individual data points into chromosome profiles. We have developed a visualization tool for displaying whole genome array CGH data in the context of chromosomal location. SeeGH is an application that translates spot signal ratio data from array CGH experiments to displays of high resolution chromosome profiles. Data is imported from a simple tab delimited text file obtained from standard microarray image analysis software. SeeGH processes the signal ratio data and graphically displays it in a conventional CGH karyotype diagram with the added features of magnification and DNA segment annotation. In this process, SeeGH imports the data into a database, calculates the average ratio and standard deviation for each replicate spot, and links them to chromosome regions for graphical display. Once the data is displayed, users have the option of hiding or flagging DNA segments based on user defined criteria, and retrieve annotation information such as clone name, NCBI sequence accession number, ratio, base pair position on the chromosome, and standard deviation. SeeGH represents a novel software tool used to view and analyze array CGH data. The software gives users the ability to view the data in an overall genomic view as well as magnify specific chromosomal regions facilitating the precise localization of genetic alterations. SeeGH is easily installed and runs on Microsoft Windows 2000 or later environments.
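The per-clone summarization described above (mean ratio and standard deviation over replicate spots, followed by user-defined flagging) can be sketched in a few lines of Python; the column names and the flagging threshold are hypothetical and do not reflect SeeGH's actual input format.

```python
import pandas as pd

# Hypothetical spot-level array CGH measurements with replicate spots per clone.
spots = pd.DataFrame({
    "clone":      ["RP11-1", "RP11-1", "RP11-2", "RP11-2", "RP11-3", "RP11-3"],
    "chrom":      ["1", "1", "1", "1", "2", "2"],
    "start_bp":   [100000, 100000, 650000, 650000, 42000, 42000],
    "log2_ratio": [0.05, 0.11, 0.92, 0.88, -0.63, -0.10],
})

# Mean log2 ratio and standard deviation over replicate spots for each clone.
summary = (spots.groupby(["clone", "chrom", "start_bp"])["log2_ratio"]
                .agg(mean="mean", sd="std")
                .reset_index())

# User-defined criterion, e.g. hide or flag noisy replicate sets.
summary["flagged"] = summary["sd"] > 0.3
print(summary)
```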
Pritchard, Caroline; O'Connor, Gavin; Ashcroft, Alison E
2013-08-06
To achieve comparability of measurement results of protein amount of substance content between clinical laboratories, suitable reference materials are required. The impact on measurement comparability of potential differences in the tertiary and quaternary structure of protein reference standards is as yet not well understood. With the use of human growth hormone as a model protein, the potential of ion mobility spectrometry-mass spectrometry as a tool to assess differences in the structure of protein reference materials and their interactions with antibodies has been investigated here.
Systematic review and meta-analysis: tools for the information age.
Weatherall, Mark
2017-11-01
The amount of available biomedical information is vast and growing. Natural limitations of the way clinicians and researchers approach this treasure trove of information comprise difficulties locating the information, and once located, cognitive biases may lead to inappropriate use of the information. Systematic reviews and meta-analyses represent important tools in the information age to improve knowledge and action. Systematic reviews represent a census approach to identifying literature to avoid non-response bias. They are a necessary prelude to producing combined quantitative summaries of associations or treatment effects. Meta-analysis comprises the arithmetical techniques for producing combined summaries from individual study reports. Careful, thoughtful and rigorous use of these tools is likely to enhance knowledge and action. Use of standard guidelines, such as the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, or embedding these activities within collaborative groups such as the Cochrane Collaboration, are likely to lead to more useful systematic review and meta-analysis reporting. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Prioritizing Health: A Systematic Approach to Scoping Determinants in Health Impact Assessment.
McCallum, Lindsay C; Ollson, Christopher A; Stefanovic, Ingrid L
2016-01-01
The determinants of health are those factors that have the potential to affect health, either positively or negatively, and include a range of personal, social, economic, and environmental factors. In the practice of health impact assessment (HIA), the stage at which the determinants of health are considered for inclusion is during the scoping step. The scoping step is intended to identify how the HIA will be carried out and to set the boundaries (e.g., temporal and geographical) for the assessment. There are several factors that can help to inform the scoping process, many of which are considered in existing HIA tools and guidance; however, a systematic method of prioritizing determinants was found to be lacking. In order to analyze existing HIA scoping tools that are available, a systematic literature review was conducted, including both primary and gray literature. A total of 10 HIA scoping tools met the inclusion/exclusion criteria and were carried forward for comparative analysis. The analysis focused on minimum elements and practice standards of HIA scoping that have been established in the field. The analysis determined that existing approaches lack a clear, systematic method of prioritization of health determinants for inclusion in HIA. This finding led to the development of a Systematic HIA Scoping tool that addressed this gap. The decision matrix tool uses factors, such as impact, public concern, and data availability, to prioritize health determinants. Additionally, the tool allows for identification of data gaps and provides a transparent method for budget allocation and assessment planning. In order to increase efficiency and improve utility, the tool was programmed into Microsoft Excel. Future work in the area of HIA methodology development is vital to the ongoing success of the practice and utilization of HIA as a reliable decision-making tool.
The Management Standards Indicator Tool and evaluation of burnout.
Ravalier, J M; McVicar, A; Munn-Giddings, C
2013-03-01
Psychosocial hazards in the workplace can impact upon employee health. The UK Health and Safety Executive's (HSE) Management Standards Indicator Tool (MSIT) appears to have utility in relation to health impacts but we were unable to find studies relating it to burnout. To explore the utility of the MSIT in evaluating risk of burnout assessed by the Maslach Burnout Inventory-General Survey (MBI-GS). This was a cross-sectional survey of 128 borough council employees. MSIT data were analysed according to MSIT and MBI-GS threshold scores and by using multivariate linear regression with MBI-GS factors as dependent variables. MSIT factor scores were gradated according to categories of risk of burnout according to published MBI-GS thresholds, and identified priority workplace concerns as demands, relationships, role and change. These factors also featured as significant independent variables, with control, in outcomes of the regression analysis. Exhaustion was associated with demands and control (adjusted R^2 = 0.331); cynicism was associated with change, role and demands (adjusted R^2 = 0.429); and professional efficacy was associated with managerial support, role, control and demands (adjusted R^2 = 0.413). MSIT analysis generally has congruence with MBI-GS assessment of burnout. The identification of control within regression models but not as a priority concern in the MSIT analysis could suggest an issue of the setting of the MSIT thresholds for this factor, but verification requires a much larger study. Incorporation of relationship, role and change into the MSIT, missing from other conventional tools, appeared to add to its validity.
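A regression of the kind reported above can be reproduced in outline with the statsmodels package; the simulated survey data and column names below are fabricated purely for illustration, so the sketch only shows how adjusted R^2 values such as those quoted would be obtained.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated stand-in for the survey data (n = 128, as above); the factor
# names mirror the MSIT factors discussed in the abstract.
rng = np.random.default_rng(1)
n = 128
factors = ["demands", "control", "managerial_support", "role", "change"]
df = pd.DataFrame(rng.normal(size=(n, len(factors))), columns=factors)
df["exhaustion"] = (0.5 * df["demands"] - 0.3 * df["control"]
                    + rng.normal(scale=0.8, size=n))   # illustrative outcome

# Ordinary least squares with an intercept; exhaustion regressed on factors.
X = sm.add_constant(df[factors])
model = sm.OLS(df["exhaustion"], X).fit()

print(round(model.rsquared_adj, 3))   # analogous to the adjusted R^2 above
print(model.params)                    # direction and size of each factor
```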
A standard-enabled workflow for synthetic biology.
Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach
2017-06-15
A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce these types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.
Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.
Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N
2009-10-27
The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined, rational algorithmic analysis workflows or batch standardized processing that incorporate all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided heavily depend on the programming skills of the user, whereas GUI-embedded solutions do not provide direct support for various raw image analysis formats or a versatile and simultaneously flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB-implemented platform with a graphical user interface. This suite integrates all steps of microarray data analysis, including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports two-colour cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma-separated values, or tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools, plus customizable export data formats for seamless integration with other analysis tools or MATLAB for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with the MATLAB Component Runtime. Gene ARMADA provides a highly adaptable, integrative, yet flexible tool which can be used for automated quality control, analysis, annotation and visualization of microarray data, constituting a starting point for further data interpretation and integration with numerous other tools.
NASA Astrophysics Data System (ADS)
Telang, Aparna S.; Bedekar, P. P.
2017-09-01
Load flow analysis is the initial and essential step for any power system computation. It is required for choosing better options for power system expansion to meet ever-increasing load demand. Implementation of a Flexible AC Transmission System (FACTS) device such as the STATCOM, which offers fast and very flexible control, in the load flow is one of the important tasks for power system researchers. This paper presents a simple and systematic approach for steady-state power flow calculations with the FACTS controller, static synchronous compensator (STATCOM), using the command-line usage of the MATLAB-based power system analysis toolbox (PSAT). The complexity of MATLAB programming increases when a STATCOM is incorporated in an existing Newton-Raphson load flow algorithm. Thus, the main contribution of this paper is to show how command-line usage of the user-friendly MATLAB tool PSAT can be used extensively for quicker and wider interpretation of the results of load flow with a STATCOM. The novelty of this paper lies in the method of applying the load increase pattern, where the active and reactive loads are changed simultaneously at all the load buses under consideration to create stressed conditions for load flow analysis with the STATCOM. The performance has been evaluated on many standard IEEE test systems, and the results for the standard IEEE 30-bus, IEEE 57-bus, and IEEE 118-bus systems are presented.
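The load increase pattern described above can be illustrated independently of PSAT; the sketch below uses the open-source pandapower package (an assumption made only to keep the example self-contained and runnable) and scales active and reactive power by the same factor at every load bus before each load-flow solution, without modelling the STATCOM itself.

```python
import pandapower as pp
import pandapower.networks as pn

# IEEE 30-bus test case shipped with pandapower (a stand-in for the IEEE
# systems mentioned above).
net = pn.case30()
base_p = net.load.p_mw.copy()
base_q = net.load.q_mvar.copy()

# Uniform stress levels: scale P and Q simultaneously at all load buses.
for lam in (1.0, 1.2, 1.4):
    net.load.p_mw = base_p * lam
    net.load.q_mvar = base_q * lam
    pp.runpp(net)                                  # Newton-Raphson load flow
    print(lam, round(float(net.res_bus.vm_pu.min()), 4))   # lowest bus voltage (p.u.)
```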
Neu, Thomas R; Kuhlicke, Ute
2017-02-10
Microbial biofilm systems are defined as interface-associated microorganisms embedded into a self-produced matrix. The extracellular matrix represents a continuous challenge in terms of characterization and analysis. The tools applied in more detailed studies comprise extraction/chemical analysis, molecular characterization, and visualisation using various techniques. Imaging by laser microscopy became a standard tool for biofilm analysis, and, in combination with fluorescently labelled lectins, the glycoconjugates of the matrix can be assessed. By employing this approach a wide range of pure culture biofilms from different habitats were examined using the commercially available lectins. From the results, a binary barcode pattern of lectin binding can be generated. Furthermore, the results can be fine-tuned and transferred into a heat map according to signal intensity. The lectin barcode approach is suggested as a useful tool for investigating the biofilm matrix characteristics and dynamics at various levels, e.g. bacterial cell surfaces, adhesive footprints, individual microcolonies, and the gross biofilm or bio-aggregate. Hence fluorescence lectin bar-coding (FLBC) serves as a basis for a subsequent tailor-made fluorescence lectin-binding analysis (FLBA) of a particular biofilm. So far, the lectin approach represents the only tool for in situ characterization of the glycoconjugate makeup in biofilm systems. Furthermore, lectin staining lends itself to other fluorescence techniques in order to correlate it with cellular biofilm constituents in general and glycoconjugate producers in particular.
IKOS: A Framework for Static Analysis based on Abstract Interpretation (Tool Paper)
NASA Technical Reports Server (NTRS)
Brat, Guillaume P.; Laserna, Jorge A.; Shi, Nija; Venet, Arnaud Jean
2014-01-01
The RTCA standard (DO-178C) for developing avionic software and getting certification credits includes an extension (DO-333) that describes how developers can use static analysis in certification. In this paper, we give an overview of the IKOS static analysis framework, which helps in developing static analyses that are both precise and scalable. IKOS harnesses the power of Abstract Interpretation and makes it accessible to a larger class of static analysis developers by separating concerns such as code parsing, model development, abstract domain management, results management, and analysis strategy. The benefits of the approach are demonstrated by a buffer overflow analysis applied to flight control systems.
Validation of Computerized Automatic Calculation of the Sequential Organ Failure Assessment Score
Harrison, Andrew M.; Pickering, Brian W.; Herasevich, Vitaly
2013-01-01
Purpose. To validate the use of a computer program for the automatic calculation of the sequential organ failure assessment (SOFA) score, as compared to the gold standard of manual chart review. Materials and Methods. Adult admissions (age > 18 years) to the medical ICU with a length of stay greater than 24 hours were studied in the setting of an academic tertiary referral center. A retrospective cross-sectional analysis was performed using a derivation cohort to compare automatic calculation of the SOFA score to the gold standard of manual chart review. After critical appraisal of sources of disagreement, another analysis was performed using an independent validation cohort. Then, a prospective observational analysis was performed using an implementation of this computer program in AWARE Dashboard, which is an existing real-time patient EMR system for use in the ICU. Results. Good agreement between the manual and automatic SOFA calculations was observed for both the derivation (N=94) and validation (N=268) cohorts: 0.02 ± 2.33 and 0.29 ± 1.75 points, respectively. These results were validated in AWARE (N=60). Conclusion. This EMR-based automatic tool accurately calculates SOFA scores and can facilitate ICU decisions without the need for manual data collection. This tool can also be employed in a real-time electronic environment. PMID:23936639
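As a hedged illustration of the kind of rule-based scoring such a tool automates, the sketch below maps two of the six SOFA components (platelet count and serum creatinine) to sub-scores using the standard published cut-offs; it is not the authors' implementation and omits the other organ systems.

```python
def sofa_platelets(platelets_k_per_uL: float) -> int:
    """Coagulation sub-score (platelets x10^3/uL), standard SOFA cut-offs."""
    if platelets_k_per_uL < 20:  return 4
    if platelets_k_per_uL < 50:  return 3
    if platelets_k_per_uL < 100: return 2
    if platelets_k_per_uL < 150: return 1
    return 0

def sofa_creatinine(creat_mg_dL: float) -> int:
    """Renal sub-score (serum creatinine, mg/dL), standard SOFA cut-offs."""
    if creat_mg_dL >= 5.0: return 4
    if creat_mg_dL >= 3.5: return 3
    if creat_mg_dL >= 2.0: return 2
    if creat_mg_dL >= 1.2: return 1
    return 0

# Example patient values (illustrative only)
print(sofa_platelets(85) + sofa_creatinine(2.4))   # 2 + 2 = 4
```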
ERIC Educational Resources Information Center
O'Droma, Mairtin S.; Ganchev, Ivan; McDonnell, Fergal
2003-01-01
Presents a comparative analysis, from the Institute of Electrical and Electronics Engineers (IEEE) Learning Technology Standards Committee (LTSC), of the architectural and functional design of e-learning delivery platforms and applications, e-learning course authoring tools, and learning management systems (LMSs), with a view of assessing how…
2005-06-01
test, the entire turbulence model was changed from standard k-epsilon to Spalart-Allmaras. Using these different tools of turbulence models, a few...this research, leaving only pre-existing finite element models to be used. At some point a NASTRAN model was developed for vibrations analysis but
2013-12-01
each satellite's field of view, 24 hrs a day. Confirmed by analysis using industry-standard Satellite Tool Kit (STK). Operationally verified...
Evaluating Classified MODIS Satellite Imagery as a Stratification Tool
Greg C. Liknes; Mark D. Nelson; Ronald E. McRoberts
2004-01-01
The Forest Inventory and Analysis (FIA) program of the USDA Forest Service collects forest attribute data on permanent plots arranged on a hexagonal network across all 50 states and Puerto Rico. Due to budget constraints, sample sizes sufficient to satisfy national FIA precision standards are seldom achieved for most inventory variables unless the estimation process is...
Videos Determine the Moon's "g"
ERIC Educational Resources Information Center
Persson, J. R.; Hagen, J. E.
2011-01-01
Determining the acceleration of a free-falling object due to gravity is a standard experiment in physics. Different methods to do this have been developed over the years. This article discusses the use of video-analysis tools as another method. If there is a video available and a known scale it is possible to analyse the motion. The use of video…
Nursing informatics, outcomes, and quality improvement.
Charters, Kathleen G
2003-08-01
Nursing informatics actively supports nursing by providing standard language systems, databases, decision support, readily accessible research results, and technology assessments. Through normalized datasets spanning an entire enterprise or other large demographic, nursing informatics tools support improvement of healthcare by answering questions about patient outcomes and quality improvement on an enterprise scale, and by providing documentation for business process definition, business process engineering, and strategic planning. Nursing informatics tools provide a way for advanced practice nurses to examine their practice and the effect of their actions on patient outcomes. Analysis of patient outcomes may lead to initiatives for quality improvement. Supported by nursing informatics tools, successful advanced practice nurses leverage their quality improvement initiatives against the enterprise strategic plan to gain leadership support and resources.
Determining absolute protein numbers by quantitative fluorescence microscopy.
Verdaasdonk, Jolien Suzanne; Lawrimore, Josh; Bloom, Kerry
2014-01-01
Biological questions are increasingly being addressed using a wide range of quantitative analytical tools to examine protein complex composition. Knowledge of the absolute number of proteins present provides insights into organization, function, and maintenance and is used in mathematical modeling of complex cellular dynamics. In this chapter, we outline and describe three microscopy-based methods for determining absolute protein numbers--fluorescence correlation spectroscopy, stepwise photobleaching, and ratiometric comparison of fluorescence intensity to known standards. In addition, we discuss the various fluorescently labeled proteins that have been used as standards for both stepwise photobleaching and ratiometric comparison analysis. A detailed procedure for determining absolute protein number by ratiometric comparison is outlined in the second half of this chapter. Counting proteins by quantitative microscopy is a relatively simple yet very powerful analytical tool that will increase our understanding of protein complex composition. © 2014 Elsevier Inc. All rights reserved.
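A minimal sketch of the ratiometric comparison idea: the unknown copy number is estimated from the ratio of its background-corrected fluorescence intensity to that of a standard of known copy number imaged under identical conditions. The numbers below are illustrative, not measured values.

```python
def copies_by_ratio(intensity_unknown, intensity_standard, copies_standard,
                    background=0.0):
    """Estimate protein copy number by ratiometric comparison to a known standard."""
    ratio = (intensity_unknown - background) / (intensity_standard - background)
    return ratio * copies_standard

# Example: an unknown cluster compared to a standard of known stoichiometry (made-up values)
print(copies_by_ratio(intensity_unknown=5400.0,
                      intensity_standard=1800.0,
                      copies_standard=32,
                      background=200.0))
```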
Open Source Tools for Seismicity Analysis
NASA Astrophysics Data System (ADS)
Powers, P.
2010-12-01
The spatio-temporal analysis of seismicity plays an important role in earthquake forecasting and is integral to research on earthquake interactions and triggering. For instance, the third version of the Uniform California Earthquake Rupture Forecast (UCERF), currently under development, will use Epidemic Type Aftershock Sequences (ETAS) as a model for earthquake triggering. UCERF will be a "living" model and therefore requires robust, tested, and well-documented ETAS algorithms to ensure transparency and reproducibility. Likewise, as earthquake aftershock sequences unfold, real-time access to high-quality hypocenter data makes it possible to monitor the temporal variability of statistical properties such as the parameters of the Omori Law and the Gutenberg-Richter b-value. Such statistical properties are valuable as they provide a measure of how much a particular sequence deviates from expected behavior and can be used when assigning probabilities of aftershock occurrence. To address these demands and provide public access to standard methods employed in statistical seismology, we present well-documented, open-source JavaScript and Java software libraries for the on- and off-line analysis of seismicity. The JavaScript classes facilitate web-based asynchronous access to earthquake catalog data and provide a framework for in-browser display, analysis, and manipulation of catalog statistics; implementations of this framework will be made available on the USGS Earthquake Hazards website. The Java classes, in addition to providing tools for seismicity analysis, provide tools for modeling seismicity and generating synthetic catalogs. These tools are extensible and will be released as part of the open-source OpenSHA Commons library.
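One of the statistical properties mentioned above, the Gutenberg-Richter b-value, is commonly estimated with the Aki/Utsu maximum-likelihood formula; the sketch below applies it to a synthetic catalog. The completeness magnitude and bin width are assumed inputs, and the code is independent of the libraries described in the abstract.

```python
import numpy as np

def b_value_ml(mags, m_complete, bin_width=0.1):
    """Aki/Utsu maximum-likelihood b-value for magnitudes >= m_complete."""
    m = np.asarray(mags)
    m = m[m >= m_complete]
    # The half-bin correction accounts for magnitude binning
    return np.log10(np.e) / (m.mean() - (m_complete - bin_width / 2.0))

# Synthetic Gutenberg-Richter catalog with true b = 1.0 above Mc = 2.0,
# rounded to 0.1-magnitude bins as in typical catalogs
rng = np.random.default_rng(42)
mags = np.round(2.0 + rng.exponential(scale=np.log10(np.e), size=5000), 1)
print(f"estimated b-value: {b_value_ml(mags, m_complete=2.0):.2f}")
```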
Thermal neutral format based on the STEP technology
NASA Technical Reports Server (NTRS)
Almazan, P. Planas; Legal, J. L.
1995-01-01
The exchange of models is one of the most serious problems currently encountered in the practice of spacecraft thermal analysis. Essentially, the problem originates in the diversity of computing environments that are used across different sites, and the consequent proliferation of native tool formats. Furthermore, increasing pressure to reduce the development life-cycle time has generated a growing interest in so-called spacecraft concurrent engineering. In this context, the realization of the interdependencies between different disciplines and the proper communication between them become critical issues. The use of a neutral format represents a step forward in addressing these problems. Such a means of communication is adopted by consensus. A neutral format is not directly tied to any specific tool and it is kept under stringent change control. Currently, most of the groups promoting exchange formats are contributing with their experience to STEP, the Standard for the Exchange of Product Model Data, which is being developed under the auspices of the International Standards Organization (ISO 10303). This paper presents the different efforts made in Europe to provide the spacecraft thermal analysis community with a Thermal Neutral Format (TNF) based on STEP. Following an introduction with some background information, the paper presents the characteristics of the STEP standard. Later, the first efforts to produce a STEP Spacecraft Thermal Application Protocol are described. Finally, the paper presents the currently harmonized European activities that follow up and extend earlier work in the area.
The HDF Product Designer - Interoperability in the First Mile
NASA Astrophysics Data System (ADS)
Lee, H.; Jelenak, A.; Habermann, T.
2014-12-01
Interoperable data have been a long-time goal in many scientific communities. The recent growth in analysis, visualization and mash-up applications that expect data stored in a standardized manner has brought the interoperability issue to the fore. On the other hand, producing interoperable data is often regarded as a sideline task in a typical research team for which resources are not readily available. The HDF Group is developing a software tool aimed at lessening the burden of creating data in standards-compliant, interoperable HDF5 files. The tool, named HDF Product Designer, lowers the threshold needed to design such files by providing a user interface that combines the rich HDF5 feature set with applicable metadata conventions. Users can quickly devise new HDF5 files while at the same time seamlessly incorporating the latest best practices and conventions from their community. That is what the term interoperability in the first mile means: enabling generation of interoperable data in HDF5 files from the onset of their production. The tool also incorporates collaborative features, allowing a team approach to file design, as well as easy transfer of best practices as they are being developed. The current state of the tool and the plans for future development will be presented. Constructive input from interested parties is always welcome.
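The sketch below is independent of HDF Product Designer itself and assumes the h5py package; it shows the kind of standards-minded HDF5 file a design produced with such a tool might translate into, a dataset annotated with convention-style attributes. Dataset and attribute names are illustrative assumptions.

```python
import numpy as np
import h5py

# Write a small HDF5 file with convention-style metadata (illustrative names)
with h5py.File("example_product.h5", "w") as f:
    f.attrs["Conventions"] = "CF-1.6"          # assumed convention identifier
    f.attrs["institution"] = "Example Science Team"

    temp = f.create_dataset("sea_surface_temperature",
                            data=np.random.rand(180, 360).astype("f4"))
    temp.attrs["units"] = "kelvin"
    temp.attrs["long_name"] = "sea surface temperature"
    temp.attrs["_FillValue"] = np.float32(-9999.0)

# Read it back to confirm the structure
with h5py.File("example_product.h5", "r") as f:
    ds = f["sea_surface_temperature"]
    print(ds.shape, dict(ds.attrs)["units"])
```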
Use of Virtual Reality Tools for Vestibular Disorders Rehabilitation: A Comprehensive Analysis
Bergeron, Mathieu; Lortie, Catherine L.; Guitton, Matthieu J.
2015-01-01
Classical peripheral vestibular disorders rehabilitation is a long and costly process. While virtual reality settings have been repeatedly suggested to represent possible tools to help the rehabilitation process, no systematic study has been conducted so far. We systematically reviewed the current literature to analyze the published protocols documenting the use of virtual reality settings for peripheral vestibular disorders rehabilitation. There is an important diversity of settings and protocols involving virtual reality settings for the treatment of this pathology. Evaluation of the symptoms is often not standardized. However, our results unveil a clear effect of virtual reality settings-based rehabilitation of the patients' symptoms, assessed by objective tools such as the DHI (mean decrease of 27 points), changing the perceived symptom handicap from a moderate to a mild impact on life. Furthermore, we detected a relationship between the duration of the exposure to virtual reality environments and the magnitude of the therapeutic effects, suggesting that virtual reality treatments should last at least 150 minutes of cumulated exposure to ensure positive outcomes. Virtual reality offers a pleasant and safe environment for the patient. Future studies should standardize evaluation tools, document putative side effects further, compare virtual reality to conventional physical therapy, and evaluate economic costs/benefits of such strategies. PMID:26556560
The Development and Validation of a Rapid Assessment Tool of Primary Care in China
Mei, Jie; Liang, Yuan; Shi, LeiYu; Zhao, JingGe; Wang, YuTan; Kuang, Li
2016-01-01
Introduction. With Chinese health care reform increasingly emphasizing the importance of primary care, the need for a tool to evaluate primary care performance and service delivery is clear. This study presents a methodology for a rapid assessment of primary care organizations and service delivery in China. Methods. The study translated and adapted the Primary Care Assessment Tool-Adult Edition (PCAT-AE) into a Chinese version to measure core dimensions of primary care, namely, first contact, continuity, comprehensiveness, and coordination. A cross-sectional survey was conducted to assess the validity and reliability of the Chinese Rapid Primary Care Assessment Tool (CR-PCAT). Eight community health centers in Guangdong province were selected to participate in the survey. Results. A total of 1465 valid samples were included for data analysis. Eight items were eliminated following principal component analysis and reliability testing. The principal component analysis extracted five multiple-item scales (first contact utilization, first contact accessibility, ongoing care, comprehensiveness, and coordination). The tests of scaling assumptions were largely met. Conclusion. The standard psychometric evaluation indicates that the scales have achieved relatively good reliability and validity. The CR-PCAT provides a rapid and reliable measure of four core dimensions of primary care, which could be applied in various scenarios. PMID:26885509
Integrated flexible manufacturing program for manufacturing automation and rapid prototyping
NASA Technical Reports Server (NTRS)
Brooks, S. L.; Brown, C. W.; King, M. S.; Simons, W. R.; Zimmerman, J. J.
1993-01-01
The Kansas City Division of Allied Signal Inc., as part of the Integrated Flexible Manufacturing Program (IFMP), is developing an integrated manufacturing environment. Several systems are being developed to produce standards and automation tools for specific activities within the manufacturing environment. The Advanced Manufacturing Development System (AMDS) is concentrating on information standards (STEP) and product data transfer; the Expert Cut Planner system (XCUT) is concentrating on machining operation process planning standards and automation capabilities; the Advanced Numerical Control system (ANC) is concentrating on NC data preparation standards and NC data generation tools; the Inspection Planning and Programming Expert system (IPPEX) is concentrating on inspection process planning, coordinate measuring machine (CMM) inspection standards and CMM part program generation tools; and the Intelligent Scheduling and Planning System (ISAPS) is concentrating on planning and scheduling tools for a flexible manufacturing system environment. All of these projects are working together to address information exchange, standardization, and information sharing to support rapid prototyping in a Flexible Manufacturing System (FMS) environment.
Error modelling of quantum Hall array resistance standards
NASA Astrophysics Data System (ADS)
Marzano, Martina; Oe, Takehiko; Ortolano, Massimo; Callegaro, Luca; Kaneko, Nobu-Hisa
2018-04-01
Quantum Hall array resistance standards (QHARSs) are integrated circuits composed of interconnected quantum Hall effect elements that allow the realization of virtually arbitrary resistance values. In recent years, techniques were presented to efficiently design QHARS networks. An open problem is that of the evaluation of the accuracy of a QHARS, which is affected by contact and wire resistances. In this work, we present a general and systematic procedure for the error modelling of QHARSs, which is based on modern circuit analysis techniques and Monte Carlo evaluation of the uncertainty. As a practical example, this method of analysis is applied to the characterization of a 1 MΩ QHARS developed by the National Metrology Institute of Japan. Software tools are provided to apply the procedure to other arrays.
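A toy Monte Carlo sketch of the uncertainty-evaluation idea (not the paper's network model): N quantized Hall elements in series, each nominally half the von Klitzing constant, with small random wire and contact resistances added, so the spread of the relative deviation approximates the array error. The element count and parasitic-resistance distribution are assumptions.

```python
import numpy as np

R_K = 25812.8074593045      # von Klitzing constant, ohms
r_element = R_K / 2.0       # each element on the i = 2 plateau
n_elements = 78             # assumed series array (~1 MOhm nominal)
nominal = n_elements * r_element

rng = np.random.default_rng(1)
n_trials = 100_000
# Assumed: each element contributes two contacts/wires of ~1 mOhm +/- 0.5 mOhm
parasitic = rng.normal(1e-3, 0.5e-3, size=(n_trials, 2 * n_elements)).sum(axis=1)
realized = nominal + parasitic

rel_dev = (realized - nominal) / nominal
print(f"mean deviation: {rel_dev.mean():.3e}, std: {rel_dev.std():.3e}")
```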
A total variation diminishing finite difference algorithm for sonic boom propagation models
NASA Technical Reports Server (NTRS)
Sparrow, Victor W.
1993-01-01
It is difficult to accurately model the rise phases of sonic boom waveforms with traditional finite difference algorithms because of finite difference phase dispersion. This paper introduces the concept of a total variation diminishing (TVD) finite difference method as a tool for accurately modeling the rise phases of sonic booms. A standard second order finite difference algorithm and its TVD modified counterpart are both applied to the one-way propagation of a square pulse. The TVD method clearly outperforms the non-TVD method, showing great potential as a new computational tool in the analysis of sonic boom propagation.
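To make the idea concrete, here is a hedged sketch (not the paper's algorithm) of a minmod-limited, second-order TVD update for linear advection of a square pulse; switching the limiter off recovers an unlimited Lax-Wendroff-like update that overshoots at the discontinuities. Grid size, CFL number and pulse shape are assumptions.

```python
import numpy as np

def minmod(a, b):
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def advect(u0, c, n_steps, limited=True):
    """Advance u_t + a u_x = 0 (a > 0) on a periodic grid; c is the CFL number a*dt/dx."""
    u = u0.copy()
    for _ in range(n_steps):
        du_minus = u - np.roll(u, 1)          # u_i - u_{i-1}
        du_plus = np.roll(u, -1) - u          # u_{i+1} - u_i
        slope = minmod(du_minus, du_plus) if limited else du_plus
        u_face = u + 0.5 * (1.0 - c) * slope  # reconstructed value at i+1/2 (upwind side)
        flux = c * u_face                      # a*dt/dx folded into c
        u = u - (flux - np.roll(flux, 1))
    return u

x = np.linspace(0.0, 1.0, 200, endpoint=False)
u0 = np.where((x > 0.2) & (x < 0.4), 1.0, 0.0)   # square pulse
u_tvd = advect(u0, c=0.5, n_steps=200, limited=True)
print("min/max after transport:", u_tvd.min(), u_tvd.max())  # stays within [0, 1]
```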
FAST Modularization Framework for Wind Turbine Simulation: Full-System Linearization
Jonkman, Jason M.; Jonkman, Bonnie J.
2016-10-03
The wind engineering community relies on multiphysics engineering software to run nonlinear time-domain simulations, e.g., for design-standards-based loads analysis. Although most physics involved in wind energy are nonlinear, linearization of the underlying nonlinear system equations is often advantageous to understand the system response and exploit well-established methods and tools for analyzing linear systems. This paper presents the development and verification of the new linearization functionality of the open-source engineering tool FAST v8 for land-based wind turbines, as well as the concepts and mathematical background needed to understand and apply it correctly.
FAST modularization framework for wind turbine simulation: full-system linearization
NASA Astrophysics Data System (ADS)
Jonkman, J. M.; Jonkman, B. J.
2016-09-01
The wind engineering community relies on multiphysics engineering software to run nonlinear time-domain simulations, e.g., for design-standards-based loads analysis. Although most physics involved in wind energy are nonlinear, linearization of the underlying nonlinear system equations is often advantageous to understand the system response and exploit well-established methods and tools for analyzing linear systems. This paper presents the development and verification of the new linearization functionality of the open-source engineering tool FAST v8 for land-based wind turbines, as well as the concepts and mathematical background needed to understand and apply it correctly.
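Independent of FAST itself, the core linearization step can be sketched generically: perturb each state and input of a nonlinear model dx/dt = f(x, u) about an operating point and assemble the A and B matrices by finite differences. The example model (a damped pendulum with a torque input) is an assumption chosen only for illustration.

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Finite-difference A = df/dx and B = df/du about an operating point (x0, u0)."""
    f0 = np.asarray(f(x0, u0))
    n, m = len(x0), len(u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for i in range(n):
        dx = np.zeros(n); dx[i] = eps
        A[:, i] = (np.asarray(f(x0 + dx, u0)) - f0) / eps
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (np.asarray(f(x0, u0 + du)) - f0) / eps
    return A, B

# Example nonlinear model: damped pendulum with torque input (illustrative)
def pendulum(x, u, g=9.81, L=1.0, d=0.1):
    theta, omega = x
    return [omega, -(g / L) * np.sin(theta) - d * omega + u[0]]

A, B = linearize(pendulum, x0=np.array([0.0, 0.0]), u0=np.array([0.0]))
print(A)   # approx [[0, 1], [-9.81, -0.1]]
print(B)   # approx [[0], [1]]
```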
Next-generation genotype imputation service and methods.
Das, Sayantan; Forer, Lukas; Schönherr, Sebastian; Sidore, Carlo; Locke, Adam E; Kwong, Alan; Vrieze, Scott I; Chew, Emily Y; Levy, Shawn; McGue, Matt; Schlessinger, David; Stambolian, Dwight; Loh, Po-Ru; Iacono, William G; Swaroop, Anand; Scott, Laura J; Cucca, Francesco; Kronenberg, Florian; Boehnke, Michael; Abecasis, Gonçalo R; Fuchsberger, Christian
2016-10-01
Genotype imputation is a key component of genetic association studies, where it increases power, facilitates meta-analysis, and aids interpretation of signals. Genotype imputation is computationally demanding and, with current tools, typically requires access to a high-performance computing cluster and to a reference panel of sequenced genomes. Here we describe improvements to imputation machinery that reduce computational requirements by more than an order of magnitude with no loss of accuracy in comparison to standard imputation tools. We also describe a new web-based service for imputation that facilitates access to new reference panels and greatly improves user experience and productivity.
Computing Linear Mathematical Models Of Aircraft
NASA Technical Reports Server (NTRS)
Duke, Eugene L.; Antoniewicz, Robert F.; Krambeer, Keith D.
1991-01-01
The Derivation and Definition of a Linear Aircraft Model (LINEAR) computer program provides the user with a powerful, flexible, standard, documented, and verified software tool for linearization of mathematical models of the aerodynamics of aircraft. It is intended for use as a software tool to drive linear analysis of stability and the design of control laws for aircraft. LINEAR is capable of both extracting such linearized engine effects as net thrust, torque, and gyroscopic effects, and including these effects in the linear model of the system. It is designed to provide easy selection of the state, control, and observation variables used in a particular model, and also provides the flexibility of allowing alternative formulations of both state and observation equations. Written in FORTRAN.
Integrated Procedures for Flight and Ground Operations Using International Standards
NASA Technical Reports Server (NTRS)
Ingalls, John
2011-01-01
Imagine astronauts using the same Interactive Electronic Technical Manuals (IETMs) as the ground personnel who assemble or maintain their flight hardware, and having all of that data interoperable with design, logistics, reliability analysis, and training. Modern international standards and their corresponding COTS tools already used in other industries provide a good foundation for streamlined technical publications in the space industry. These standards cover everything from data exchange to product breakdown structure to business rules flexibility. Full Product Lifecycle Support (PLCS) is supported. The concept is to organize, build once, reuse many ways, and integrate. This should apply to all future and some current launch vehicles, payloads, space stations/habitats, spacecraft, facilities, support equipment, and retrieval ships.
Hanny and the Mystery of the Voorwerp: Citizen Science in the Classroom
NASA Astrophysics Data System (ADS)
Costello, K.; Reilly, E.; Bracey, G.; Gay, P.
2012-08-01
The highly engaging graphic comic Hanny and the Mystery of the Voorwerp is the focus of an eight-day educational unit geared to middle level students. Activities in the unit link national astronomy standards to the citizen science Zooniverse website through tutorials that lead to analysis of real data online. NASA resources are also included in the unit. The content of the session focused on the terminology and concepts - galaxy formation, types and characteristics of galaxies, use of spectral analysis - needed to classify galaxies. Use of citizen science projects as tools to teach inquiry in the classroom was the primary focus of the workshop. The session included a hands-on experiment taken from the unit, including a NASA spectral analysis activity called "What's the Frequency, Roy G Biv?" In addition, presenters demonstrated the galaxy classification tools found in the "Galaxy Zoo" project at the Zooniverse citizen science website.
Open source tools for the information theoretic analysis of neural data.
Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano
2010-01-01
The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
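A minimal sketch (not taken from any of the reviewed toolboxes) of the central quantity involved: the mutual information between a discrete stimulus and a discretized neural response, estimated directly from a joint count table. The counts below are made up.

```python
import numpy as np

def mutual_information(joint_counts):
    """I(S;R) in bits from a stimulus-by-response table of trial counts."""
    p = joint_counts / joint_counts.sum()
    ps = p.sum(axis=1, keepdims=True)   # stimulus marginal
    pr = p.sum(axis=0, keepdims=True)   # response marginal
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (ps @ pr)[nz])).sum())

# Rows: stimuli, columns: binned spike counts (illustrative trial counts)
counts = np.array([[30,  8,  2],
                   [ 5, 25, 10],
                   [ 2,  6, 32]])
print(f"I(S;R) = {mutual_information(counts):.3f} bits")
```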
XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework
NASA Astrophysics Data System (ADS)
Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò
2017-08-01
We present a new simulation framework, XIMPOL, based on the Python programming language and the SciPy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems, but also for developing and testing end-to-end analysis chains.
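A toy illustration (not XIMPOL code) of the polarimetric ingredient mentioned above: photon azimuthal angles drawn from the standard modulation curve proportional to 1 + mu*P*cos(2*(phi - phi0)) by rejection sampling, with the modulation amplitude and angle read back from simple trigonometric averages. All parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
mu, P, phi0 = 0.3, 0.5, np.radians(30.0)   # modulation factor, polarization, angle

def sample_azimuths(n):
    """Rejection-sample photon azimuths from 1 + mu*P*cos(2*(phi - phi0))."""
    out = np.empty(0)
    while out.size < n:
        phi = rng.uniform(0.0, 2 * np.pi, size=2 * n)
        accept = rng.uniform(0.0, 1.0 + mu * P, size=phi.size)
        keep = accept < 1.0 + mu * P * np.cos(2.0 * (phi - phi0))
        out = np.concatenate([out, phi[keep]])
    return out[:n]

phi = sample_azimuths(200_000)
# Recover mu*P and phi0 from the averages of cos(2*phi) and sin(2*phi)
q, u = 2.0 * np.cos(2 * phi).mean(), 2.0 * np.sin(2 * phi).mean()
print(f"recovered mu*P = {np.hypot(q, u):.3f} (true {mu * P})")
print(f"recovered angle = {0.5 * np.degrees(np.arctan2(u, q)):.1f} deg (true 30)")
```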
Generating a Magellanic star cluster catalog with ASteCA
NASA Astrophysics Data System (ADS)
Perren, G. I.; Piatti, A. E.; Vázquez, R. A.
2016-08-01
An increasing number of software tools have been employed in recent years for the automated or semi-automated processing of astronomical data. The main advantages of using these tools over a standard by-eye analysis include speed (particularly for large databases), homogeneity, reproducibility, and precision. At the same time, they enable a statistically correct study of the uncertainties associated with the analysis, in contrast with manually set errors, or the still widespread practice of simply not assigning errors. We present a catalog comprising 210 star clusters located in the Large and Small Magellanic Clouds, observed with Washington photometry. Their fundamental parameters were estimated through a homogeneous, automated and completely unassisted process, via the Automated Stellar Cluster Analysis package (ASteCA). Our results are compared with two types of studies on these clusters: one where the photometry is the same, and another where the photometric system is different from that employed by ASteCA.
Accidents at Work and Costs Analysis: A Field Study in a Large Italian Company
BATTAGLIA, Massimo; FREY, Marco; PASSETTI, Emilio
2014-01-01
Accidents at work are still a heavy burden in social and economic terms, and action to improve health and safety standards at work offers great potential gains not only to employers, but also to individuals and society as a whole. However, companies are often not interested in measuring the costs of accidents, even though cost information may facilitate preventive occupational health and safety management initiatives. The field study, carried out in a large Italian company, illustrates technical and organisational aspects associated with the implementation of an accident costs analysis tool. The results indicate that the implementation (and the use) of the tool requires a considerable commitment by the company, that accident costs analysis should serve to reinforce the importance of health and safety prevention and that the economic dimension of accidents is substantial. The study also suggests practical ways to facilitate the implementation and the moral acceptance of the accounting technology. PMID:24869894
OpenMS: a flexible open-source software platform for mass spectrometry data analysis.
Röst, Hannes L; Sachsenberg, Timo; Aiche, Stephan; Bielow, Chris; Weisser, Hendrik; Aicheler, Fabian; Andreotti, Sandro; Ehrlich, Hans-Christian; Gutenbrunner, Petra; Kenar, Erhan; Liang, Xiao; Nahnsen, Sven; Nilse, Lars; Pfeuffer, Julianus; Rosenberger, George; Rurik, Marc; Schmitt, Uwe; Veit, Johannes; Walzer, Mathias; Wojnar, David; Wolski, Witold E; Schilling, Oliver; Choudhary, Jyoti S; Malmström, Lars; Aebersold, Ruedi; Reinert, Knut; Kohlbacher, Oliver
2016-08-30
High-resolution mass spectrometry (MS) has become an important tool in the life sciences, contributing to the diagnosis and understanding of human diseases, elucidating biomolecular structural information and characterizing cellular signaling networks. However, the rapid growth in the volume and complexity of MS data makes transparent, accurate and reproducible analysis difficult. We present OpenMS 2.0 (http://www.openms.de), a robust, open-source, cross-platform software specifically designed for the flexible and reproducible analysis of high-throughput MS data. The extensible OpenMS software implements common mass spectrometric data processing tasks through a well-defined application programming interface in C++ and Python and through standardized open data formats. OpenMS additionally provides a set of 185 tools and ready-made workflows for common mass spectrometric data processing tasks, which enable users to perform complex quantitative mass spectrometric analyses with ease.
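A brief sketch of the Python side of the API described above, assuming the pyopenms package is installed and that an mzML file is available on disk (the filename here is hypothetical); it only loads an experiment and reports basic per-spectrum information, following the commonly documented loading pattern.

```python
import pyopenms as oms

# Load a (hypothetical) mzML file into an in-memory experiment
exp = oms.MSExperiment()
oms.MzMLFile().load("sample.mzML", exp)

print("number of spectra:", exp.getNrSpectra())
for spectrum in exp:
    if spectrum.getMSLevel() == 1:            # survey scans only
        mz, intensity = spectrum.get_peaks()
        print(spectrum.getRT(), mz.size, float(intensity.sum()))
```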
MO-PIS-Exhibit Hall-01: Tools for TG-142 Linac Imaging QA I
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clements, M; Wiesmeyer, M
2014-06-15
Partners in Solutions is an exciting new program in which AAPM partners with our vendors to present practical “hands-on” information about the equipment and software systems that we use in our clinics. The therapy topic this year is solutions for TG-142 recommendations for linear accelerator imaging QA. Note that the sessions are being held in a special purpose room built on the Exhibit Hall Floor, to encourage further interaction with the vendors. Automated Imaging QA for TG-142 with RIT Presentation Time: 2:45 – 3:15 PM This presentation will discuss software tools for automated imaging QA and phantom analysis for TG-142. All modalities used in radiation oncology will be discussed, including CBCT, planar kV imaging, planar MV imaging, and imaging and treatment coordinate coincidence. Vendor supplied phantoms as well as a variety of third-party phantoms will be shown, along with appropriate analyses, proper phantom setup procedures and scanning settings, and a discussion of image quality metrics. Tools for process automation will be discussed which include: RIT Cognition (machine learning for phantom image identification), RIT Cerberus (automated file system monitoring and searching), and RunQueueC (batch processing of multiple images). In addition to phantom analysis, tools for statistical tracking, trending, and reporting will be discussed. This discussion will include an introduction to statistical process control, a valuable tool in analyzing data and determining appropriate tolerances. An Introduction to TG-142 Imaging QA Using Standard Imaging Products Presentation Time: 3:15 – 3:45 PM Medical Physicists want to understand the logic behind TG-142 Imaging QA. What is often missing is a firm understanding of the connections between the EPID and OBI phantom imaging, the software “algorithms” that calculate the QA metrics, the establishment of baselines, and the analysis and interpretation of the results. The goal of our brief presentation will be to establish and solidify these connections. Our talk will be motivated by the Standard Imaging, Inc. phantom and software solutions. We will present and explain each of the image quality metrics in TG-142 in terms of the theory, mathematics, and algorithms used to implement them in the Standard Imaging PIPSpro software. In the process, we will identify the regions of phantom images that are analyzed by each algorithm. We then will discuss the process of the creation of baselines and typical ranges of acceptable values for each imaging quality metric.
Roets-Merken, Lieve M; Zuidema, Sytse U; Vernooij-Dassen, Myrra J F J; Kempen, Gertrudis I J M
2014-11-01
This study investigated the psychometric properties of the Severe Dual Sensory Loss screening tool, a tool designed to help nurses and care assistants to identify hearing, visual and dual sensory impairment in older adults. Construct validity of the Severe Dual Sensory Loss screening tool was evaluated using Cronbach's alpha and factor analysis. Interrater reliability was calculated using Kappa statistics. To evaluate the predictive validity, sensitivity and specificity were calculated by comparison with the criterion standard assessment for hearing and vision. The criterion used for hearing impairment was a hearing loss of ≥40 decibel measured by pure-tone audiometry, and the criterion for visual impairment was a visual acuity of ≤0.3 diopter or a visual field of ≤0.3°. Feasibility was evaluated by the time needed to fill in the screening tool and the clarity of the instruction and items. Prevalence of dual sensory impairment was calculated. A total of 56 older adults receiving aged care and 12 of their nurses and care assistants participated in the study. Cronbach's alpha was 0.81 for the hearing subscale and 0.84 for the visual subscale. Factor analysis showed two constructs for hearing and two for vision. Kappa was 0.71 for the hearing subscale and 0.74 for the visual subscale. The predictive validity showed a sensitivity of 0.71 and a specificity of 0.72 for the hearing subscale; and a sensitivity of 0.69 and a specificity of 0.78 for the visual subscale. The optimum cut-off point for each subscale was score 1. The nurses and care assistants reported that the Severe Dual Sensory Loss screening tool was easy to use. The prevalence of hearing and vision impairment was 55% and 29%, respectively, and that of dual sensory impairment was 20%. The Severe Dual Sensory Loss screening tool was compared with the criterion standards for hearing and visual impairment and was found to be a valid and reliable tool, enabling nurses and care assistants to identify hearing, visual and dual sensory impairment among older adults. Copyright © 2014 Elsevier Ltd. All rights reserved.
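Two of the statistics reported above are simple to reproduce in principle; the sketch below computes Cronbach's alpha from a respondents-by-items matrix and sensitivity/specificity against a reference standard, using made-up data rather than the study's.

```python
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

def sensitivity_specificity(screen_positive, reference_positive):
    s, r = np.asarray(screen_positive, bool), np.asarray(reference_positive, bool)
    sens = (s & r).sum() / r.sum()
    spec = (~s & ~r).sum() / (~r).sum()
    return sens, spec

rng = np.random.default_rng(3)
responses = rng.integers(0, 4, size=(56, 6))           # 56 respondents, 6 items (random)
print("alpha:", round(cronbach_alpha(responses), 2))
sens, spec = sensitivity_specificity([1, 1, 0, 0, 1, 0], [1, 0, 0, 1, 1, 0])
print("sensitivity:", sens, "specificity:", spec)
```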
wft4galaxy: a workflow testing tool for galaxy.
Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi
2017-12-01
Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool to bring automated testing to Galaxy workflows, making it feasible to bring continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container; the latter reduces installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it. © The Author 2017. Published by Oxford University Press.
AITSO: A Tool for Spatial Optimization Based on Artificial Immune Systems
Zhao, Xiang; Liu, Yaolin; Liu, Dianfeng; Ma, Xiaoya
2015-01-01
A great challenge facing geocomputation and spatial analysis is spatial optimization, given that it involves various high-dimensional, nonlinear, and complicated relationships. Many efforts have been made with regard to this specific issue, and the strong ability of artificial immune system algorithms has been proven in previous studies. However, user-friendly professional software is still unavailable, which is a great impediment to the popularity of artificial immune systems. This paper describes a free, universal tool, named AITSO, which is capable of solving various optimization problems. It provides a series of standard application programming interfaces (APIs) which can (1) assist researchers in the development of their own problem-specific application plugins to solve practical problems and (2) allow the implementation of some advanced immune operators into the platform to improve the performance of an algorithm. As an integrated, flexible, and convenient tool, AITSO contributes to knowledge sharing and practical problem solving. It is therefore believed that it will advance the development and popularity of spatial optimization in geocomputation and spatial analysis. PMID:25678911
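For readers unfamiliar with the algorithmic family, here is a hedged, generic clonal-selection sketch (not AITSO's implementation, which is a plugin-based platform): candidate solutions are cloned, mutated with a step size that shrinks for fitter candidates, and the best survivors replace the population. The test function and all parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def affinity(x):
    """Higher is better; here the negated sphere function (optimum at the origin)."""
    return -np.sum(x ** 2, axis=-1)

def clonal_selection(dim=2, pop=20, clones_per=5, generations=100):
    population = rng.uniform(-5, 5, size=(pop, dim))
    for _ in range(generations):
        aff = affinity(population)
        order = np.argsort(-aff)                  # best first
        ranks = np.empty(pop); ranks[order] = np.arange(pop)
        # Mutation step grows with affinity rank (better candidates mutate less)
        sigma = 0.05 + 0.5 * (ranks / (pop - 1))
        clones = np.repeat(population, clones_per, axis=0)
        clone_sigma = np.repeat(sigma, clones_per)[:, None]
        clones += rng.normal(0.0, 1.0, clones.shape) * clone_sigma
        merged = np.vstack([population, clones])
        best = np.argsort(-affinity(merged))[:pop]
        population = merged[best]
    return population[np.argmax(affinity(population))]

print("best solution found:", clonal_selection())
```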
Medication Reconciliation: Work Domain Ontology, Prototype Development, and a Predictive Model
Markowitz, Eliz; Bernstam, Elmer V.; Herskovic, Jorge; Zhang, Jiajie; Shneiderman, Ben; Plaisant, Catherine; Johnson, Todd R.
2011-01-01
Medication errors can result from administration inaccuracies at any point of care and are a major cause for concern. To develop a successful Medication Reconciliation (MR) tool, we believe it necessary to build a Work Domain Ontology (WDO) for the MR process. A WDO defines the explicit, abstract, implementation-independent description of the task by separating the task from work context, application technology, and cognitive architecture. We developed a prototype based upon the WDO and designed to adhere to standard principles of interface design. The prototype was compared to Legacy Health System’s and Pre-Admission Medication List Builder MR tools via a Keystroke-Level Model analysis for three MR tasks. The analysis found the prototype requires the fewest mental operations, completes tasks in the fewest steps, and completes tasks in the least amount of time. Accordingly, we believe that developing a MR tool, based upon the WDO and user interface guidelines, improves user efficiency and reduces cognitive load. PMID:22195146
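The Keystroke-Level Model comparison mentioned above can be sketched generically: a task is written as a sequence of primitive operators, each assigned a nominal time from the classic KLM literature, and the estimated completion time is their sum. The operator sequence below is invented for illustration and is not one of the study's MR tasks; the operator times are the commonly quoted nominal values.

```python
# Nominal KLM operator times in seconds (classic Card/Moran/Newell-style values)
KLM_TIMES = {
    "K": 0.20,   # keystroke (skilled typist)
    "P": 1.10,   # point with mouse
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation
    "B": 0.10,   # mouse button press or release
}

def klm_estimate(operators):
    """Estimated task time for a sequence of KLM operator codes."""
    return sum(KLM_TIMES[op] for op in operators)

# Hypothetical 'confirm one medication' interaction: think, point, click, type 3 characters
task = ["M", "H", "P", "B", "B", "H", "K", "K", "K"]
print(f"estimated time: {klm_estimate(task):.2f} s")
```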
Using Galaxy to Perform Large-Scale Interactive Data Analyses
Hillman-Jackson, Jennifer; Clements, Dave; Blankenberg, Daniel; Taylor, James; Nekrutenko, Anton
2012-01-01
Innovations in biomedical research technologies continue to provide experimental biologists with novel and increasingly large genomic and high-throughput data resources to be analyzed. As creating and obtaining data has become easier, the key decision faced by many researchers is a practical one: where and how should an analysis be performed? Datasets are large and analysis tool set-up and use is riddled with complexities outside of the scope of core research activities. The authors believe that Galaxy (galaxyproject.org) provides a powerful solution that simplifies data acquisition and analysis in an intuitive web-application, granting all researchers access to key informatics tools previously only available to computational specialists working in Unix-based environments. We will demonstrate through a series of biomedically relevant protocols how Galaxy specifically brings together 1) data retrieval from public and private sources, for example, UCSC’s Eukaryote and Microbial Genome Browsers (genome.ucsc.edu), 2) custom tools (wrapped Unix functions, format standardization/conversions, interval operations) and 3rd party analysis tools, for example, Bowtie/Tuxedo Suite (bowtie-bio.sourceforge.net), Lastz (www.bx.psu.edu/~rsharris/lastz/), SAMTools (samtools.sourceforge.net), FASTX-toolkit (hannonlab.cshl.edu/fastx_toolkit), and MACS (liulab.dfci.harvard.edu/MACS), and creates results formatted for visualization in tools such as the Galaxy Track Browser (GTB, galaxyproject.org/wiki/Learn/Visualization), UCSC Genome Browser (genome.ucsc.edu), Ensembl (www.ensembl.org), and GeneTrack (genetrack.bx.psu.edu). Galaxy rapidly has become the most popular choice for integrated next generation sequencing (NGS) analytics and collaboration, where users can perform, document, and share complex analysis within a single interface in an unprecedented number of ways. PMID:18428782
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prooijen, Monique van; Breen, Stephen
Purpose: Our treatment for choroidal melanoma utilizes the GTC frame. The patient looks at a small LED to stabilize target position. The LED is attached to a metal arm attached to the GTC frame. A camera on the arm allows therapists to monitor patient compliance. To move to mask-based immobilization we need a new LED/camera attachment mechanism. We used a Hazard-Risk Analysis (HRA) to guide the design of the new tool. Method: A pre-clinical model was built with input from therapy and machine shop personnel. It consisted of an aluminum frame placed in aluminum guide posts attached to the couch top. Further development was guided by the Department of Defense Standard Practice - System Safety hazard risk analysis technique. Results: An Orfit mask was selected because it allowed access to indexes on the couch top which assist with setup reproducibility. The first HRA table was created considering mechanical failure modes of the device. Discussions with operators and manufacturers identified other failure modes and solutions. HRA directed the design towards a safe clinical device. Conclusion: A new immobilization tool has been designed using hazard-risk analysis which resulted in an easier-to-use and safer tool compared to the initial design. The remaining risks are all low probability events and not dissimilar from those currently faced with the GTC setup. Given the gains in ease of use for therapists and patients as well as the lower costs for the hospital, we will implement this new tool.
GOATS Image Projection Component
NASA Technical Reports Server (NTRS)
Haber, Benjamin M.; Green, Joseph J.
2011-01-01
When doing mission analysis and design of an imaging system in orbit around the Earth, answering the fundamental question of imaging performance requires an understanding of the image products that will be produced by the imaging system. The GOATS software comprises a series of MATLAB functions that provide geometric image projections. Unique features of the software include function modularity, a standard MATLAB interface, easy-to-understand first-principles-based analysis, and the ability to perform geometric image projections of framing-type imaging systems. The software modules are created for maximum analysis utility, and can all be used independently for many varied analysis tasks, or used in conjunction with other orbit analysis tools.
A Rapid Assessment Tool for affirming good practice in midwifery education programming.
Fullerton, Judith T; Johnson, Peter; Lobe, Erika; Myint, Khine Haymar; Aung, Nan Nan; Moe, Thida; Linn, Nay Aung
2016-03-01
to design a criterion-referenced assessment tool that could be used globally in a rapid assessment of good practices and bottlenecks in midwifery education programs. a standard tool development process was followed, to generate standards and reference criteria; followed by external review and field testing to document psychometric properties. review of standards and scoring criteria were conducted by stakeholders around the globe. Field testing of the tool was conducted in Myanmar. eleven of Myanmar׳s 22 midwifery education programs participated in the assessment. the clinimetric tool was demonstrated to have content validity and high inter-rater reliability in use. a globally validated tool, and accompanying user guide and handbook are now available for conducting rapid assessments of compliance with good practice criteria in midwifery education programming. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
Engine dynamic analysis with general nonlinear finite element codes
NASA Technical Reports Server (NTRS)
Adams, M. L.; Padovan, J.; Fertis, D. G.
1991-01-01
A general engine dynamic analysis as a standard design study computational tool is described for the prediction and understanding of complex engine dynamic behavior. Improved definition of engine dynamic response provides valuable information and insights leading to reduced maintenance and overhaul costs on existing engine configurations. Application of advanced engine dynamic simulation methods provides a considerable cost reduction in the development of new engine designs by eliminating some of the trial and error process done with engine hardware development.
1988-11-01
system, using graphic techniques which enable users, analysts, and designers to get a clear and common picture of the system and how its parts fit...boxes into hierarchies suitable for computer implementation. Structured Design uses tools, especially graphic ones, to render systems readily...LSA, PROCESSES, DATA FLOWS, DATA STORES, EXTERNAL ENTITIES, OVERALL SYSTEMS DESIGN PROCESS
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-25
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Notice of Vitamin D Standardization Program (VDSP) Symposium: Tools To Improve Laboratory Measurement SUMMARY: The National Institutes of Health, Office of Dietary Supplements (ODS), and the National Institute of Standards and...
Geena 2, improved automated analysis of MALDI/TOF mass spectra.
Romano, Paolo; Profumo, Aldo; Rocco, Mattia; Mangerini, Rosa; Ferri, Fabio; Facchiano, Angelo
2016-03-02
Mass spectrometry (MS) is producing high volumes of data supporting oncological sciences, especially for translational research. Most of the related elaborations can be carried out by combining existing tools at different levels, but little is currently available for the automation of the fundamental steps. For the analysis of MALDI/TOF spectra, a number of pre-processing steps are required, including joining of isotopic abundances for a given molecular species, normalization of signals against an internal standard, background noise removal, averaging multiple spectra from the same sample, and aligning spectra from different samples. In this paper, we present Geena 2, a public software tool for the automated execution of these pre-processing steps for MALDI/TOF spectra. Geena 2 has been developed in a Linux-Apache-MySQL-PHP web development environment, with scripts in PHP and Perl. Input and output are managed as simple formats that can be consumed by any database system and spreadsheet software. Input data may also be stored in a MySQL database. Processing methods are based on original heuristic algorithms which are introduced in the paper. Three simple and intuitive web interfaces are available: the Standard Search Interface, which allows complete control over all parameters; the Bright Search Interface, which leaves to the user the possibility to tune parameters for the alignment of spectra; and the Quick Search Interface, which limits the number of parameters to a minimum by using default values for the majority of parameters. Geena 2 has been utilized, in conjunction with a statistical analysis tool, in three published experimental works: a proteomic study on the effects of long-term cryopreservation on the low molecular weight fraction of the serum proteome, and two retrospective serum proteomic studies, one on the risk of developing breast cancer in patients affected by gross cystic disease of the breast (GCDB) and the other on the identification of a predictor of breast cancer mortality following breast cancer surgery, whose results were validated by ELISA, an entirely different method. Geena 2 is a public tool for the automated pre-processing of MS data originated by MALDI/TOF instruments, with a simple and intuitive web interface. It is now under active development for the inclusion of further filtering options and for the adoption of standard formats for MS spectra.
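Two of the pre-processing steps listed above, normalization against an internal standard and averaging of replicate spectra, can be illustrated with a generic sketch (not Geena 2's heuristics) on spectra already reduced to aligned (m/z, intensity) arrays; the internal-standard m/z window and the synthetic data are assumptions.

```python
import numpy as np

def normalize_to_internal_standard(mz, intensity, std_mz, tol=0.5):
    """Scale a spectrum so the internal-standard peak near std_mz has unit intensity."""
    window = np.abs(mz - std_mz) < tol
    if not window.any():
        raise ValueError("internal standard peak not found")
    return intensity / intensity[window].max()

def average_replicates(spectra):
    """Average replicate spectra sampled on a common m/z axis."""
    return np.mean(np.vstack(spectra), axis=0)

# Illustrative data: a common m/z axis and two replicate spectra with one peak
mz = np.linspace(1000, 5000, 4001)
rep1 = np.exp(-((mz - 2465.0) / 2.0) ** 2) * 800 + np.random.rand(mz.size)
rep2 = np.exp(-((mz - 2465.0) / 2.0) ** 2) * 650 + np.random.rand(mz.size)

norm = [normalize_to_internal_standard(mz, r, std_mz=2465.0) for r in (rep1, rep2)]
avg = average_replicates(norm)
print("peak intensity after normalisation/averaging:", avg.max())
```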
NASA Technical Reports Server (NTRS)
Falke, Stefan; Husar, Rudolf
2011-01-01
The goal of this REASoN applications and technology project is to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US States and others. Project goals were achieved through access to satellite and ground observation data, web services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project will develop access to distributed data, build Web infrastructure, and create tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self-describing and modular data access and processing tools, and service-oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions within the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications, and established community-oriented infrastructures were to: develop access to distributed data (surface and satellite); build Web infrastructure to support data access, processing and analysis; create tools for data processing and analysis; and foster air quality community collaboration and interoperability.
The current role of high-resolution mass spectrometry in food analysis.
Kaufmann, Anton
2012-05-01
High-resolution mass spectrometry (HRMS), which is used for residue analysis in food, has gained wider acceptance in the last few years. This development is due to the availability of more rugged, sensitive, and selective instrumentation. The benefits provided by HRMS over classical unit-mass-resolution tandem mass spectrometry are considerable. These benefits include the collection of full-scan spectra, which provides greater insight into the composition of a sample. Consequently, the analyst has the freedom to measure compounds without previous compound-specific tuning, the possibility of retrospective data analysis, and the capability of performing structural elucidations of unknown or suspected compounds. HRMS strongly competes with classical tandem mass spectrometry in the field of quantitative multiresidue methods (e.g., pesticides and veterinary drugs). It is one of the most promising tools when moving towards nontargeted approaches. Certain hardware and software issues still have to be addressed by the instrument manufacturers for it to dislodge tandem mass spectrometry from its position as the standard trace analysis tool.
Fast and straightforward analysis approach of charge transport data in single molecule junctions.
Zhang, Qian; Liu, Chenguang; Tao, Shuhui; Yi, Ruowei; Su, Weitao; Zhao, Cezhou; Zhao, Chun; Dappe, Yannick J; Nichols, Richard J; Yang, Li
2018-08-10
In this study, we introduce an efficient data sorting algorithm that includes filters for noisy signals and conductance mapping for analyzing the most dominant conductance group and sub-population groups. The capacity of our data analysis process has also been corroborated on real experimental data sets of Au-1,6-hexanedithiol-Au and Au-1,8-octanedithiol-Au molecular junctions. The fully automated and unsupervised program requires less than one minute on a standard PC to sort the data and generate histograms. The resulting one-dimensional and two-dimensional log histograms give conductance values in good agreement with previous studies. Our algorithm is a straightforward, fast and user-friendly tool for single molecule charge transport data analysis. We also analyze the data in the form of a conductance map, which can offer evidence for diversity in molecular conductance. The code for automatic data analysis is openly available, well-documented and ready to use, thereby offering a useful new tool for single molecule electronics.
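As a rough illustration of the kind of histogram analysis described, the sketch below pools synthetic conductance traces, builds a one-dimensional log-conductance histogram, and picks out the dominant conductance group. It is not the authors' published code; the trace generator, noise floor, and bin settings are assumptions made only for the example.

```python
# Sketch: build a 1-D log-conductance histogram from break-junction traces and
# locate the dominant conductance group. Synthetic data and thresholds are
# illustrative only, not the published analysis code.
import numpy as np

rng = np.random.default_rng(0)

def synthetic_trace(g_molecule=3e-4, noise_floor=1e-7, n=500):
    """Fake conductance trace (units of G0): tunnelling background plus a plateau."""
    tunnel = 10 ** (-rng.uniform(0.5, 6.5, n))
    plateau = g_molecule * 10 ** rng.normal(0, 0.1, n // 5)
    return np.concatenate([tunnel, plateau, np.full(n // 10, noise_floor)])

traces = [synthetic_trace() for _ in range(200)]

# Pool all points, discard values at the noise floor, histogram log10(G/G0).
log_g = np.log10(np.concatenate(traces))
log_g = log_g[log_g > -6.5]
counts, edges = np.histogram(log_g, bins=200, range=(-6.5, 0))
centers = 0.5 * (edges[:-1] + edges[1:])

# Take the dominant conductance group as the highest peak above the tunnelling
# background region (here, conductances above 1e-5 G0).
peak_bin = np.argmax(np.where(centers > -5, counts, 0))
print(f"dominant conductance ~ 10^{centers[peak_bin]:.2f} G0")
```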
Investigation of priorities in water quality management based on correlations and variations.
Boyacıoğlu, Hülya; Gündogdu, Vildan; Boyacıoğlu, Hayal
2013-04-15
The development of water quality assessment strategies investigating spatial and temporal changes caused by natural and anthropogenic phenomena is an important tool in management practices. This paper used cluster analysis, the water quality index method, sensitivity analysis and canonical correlation analysis to investigate priorities in pollution control activities. Data sets representing 22 surface water quality parameters were subject to analysis. Results revealed that organic pollution was a serious threat to overall water quality in the region. In addition, oil and grease, lead and mercury were the critical variables violating the standard. In contrast to inorganic variables, organic and physical-inorganic chemical parameters were influenced by variations in physical conditions (discharge, temperature). This study showed that information produced from the variations and correlations in water quality data sets can be helpful in investigating priorities in water management activities. Moreover, statistical techniques and index methods are useful tools in the data-to-information transformation process. Copyright © 2013 Elsevier Ltd. All rights reserved.
Modeling Zone-3 Protection with Generic Relay Models for Dynamic Contingency Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Qiuhua; Vyakaranam, Bharat GNVSR; Diao, Ruisheng
This paper presents a cohesive approach for calculating and coordinating the settings of multiple zone-3 protections for dynamic contingency analysis. The zone-3 protections are represented by generic distance relay models. A two-step approach for determining zone-3 relay settings is proposed. The first step is to calculate the settings, particularly the reach, of each zone-3 relay individually by iteratively running line open-end fault short circuit analysis; the blinder is also employed and properly set to meet the industry standard under extreme loading conditions. The second step is to systematically coordinate the protection settings of the zone-3 relays. The main objective of this coordination step is to address the over-reaching issues. We have developed a tool to automate the proposed approach and generate the settings of all distance relays in a PSS/E dyr format file. The calculated zone-3 settings have been tested on a modified IEEE 300 system using a dynamic contingency analysis tool (DCAT).
Interim Draft: Biological Sampling and Analysis Plan Outline ...
Standard Operation Procedures. This interim sampling and analysis plan (SAP) outline was developed specifically as an outline of the output that will be generated by a developing on-line tool called the MicroSAP. The goal of the MicroSAP tool is to assist users with the development of SAPs needed for the site characterization, verification sampling, and post-decontamination sampling stages of biological sampling and analysis activities in which the EPA would be responsible for conducting sampling. These activities could include sampling and analysis for a biological contamination incident, a research study, or an exercise. The development of this SAP outline did not consider the initial response to an incident, as it is assumed that the initial response would have been completed by another agency during the response, or the clearance phase, as it is assumed that a separate committee would be established to make decisions regarding clearing a site. This outline also includes considerations for capturing the associated data quality objectives in the SAP.
Gallotti, Rosalia; Mussi, Margherita
2015-01-01
The Oldowan Industrial Complex has long been thought to have been static, with limited internal variability, embracing techno-complexes essentially focused on small-to-medium flake production. The flakes were rarely modified by retouch to produce small tools, which do not show any standardized pattern. Usually, the manufacture of small standardized tools has been interpreted as a more complex behavior emerging with the Acheulean technology. Here we report on the ~1.7 Ma Oldowan assemblages from Garba IVE-F at Melka Kunture in the Ethiopian highland. This industry is structured by technical criteria shared by the other East African Oldowan assemblages. However, there is also evidence of a specific technical process never recorded before, i.e. the systematic production of standardized small pointed tools strictly linked to the obsidian exploitation. Standardization and raw material selection in the manufacture of small tools disappear at Melka Kunture during the Lower Pleistocene Acheulean. This proves that 1) the emergence of a certain degree of standardization in tool-kits does not reflect in itself a major step in cultural evolution; and that 2) the Oldowan knappers, when driven by functional needs and supported by a highly suitable raw material, were occasionally able to develop specific technical solutions. The small tool production at ~1.7 Ma, at a time when the Acheulean was already emerging elsewhere in East Africa, adds to the growing amount of evidence of Oldowan techno-economic variability and flexibility, further challenging the view that early stone knapping was static over hundreds of thousands of years.
Svečko, Rajko; Kusić, Dragan; Kek, Tomaž; Sarjaš, Andrej; Hančič, Aleš; Grum, Janez
2013-05-14
This paper presents an improved monitoring system for the failure detection of engraving tool steel inserts during the injection molding cycle. This system uses acoustic emission PZT sensors mounted through acoustic waveguides on the engraving insert. We were thus able to clearly distinguish the defect through measured AE signals. Two engraving tool steel inserts were tested during the production of standard test specimens, each under the same processing conditions. By closely comparing the captured AE signals on both engraving inserts during the filling and packing stages, we were able to detect the presence of macro-cracks on one engraving insert. Gabor wavelet analysis was used for closer examination of the captured AE signals' peak amplitudes during the filling and packing stages. The obtained results revealed that such a system could be used successfully as an improved tool for monitoring the integrity of an injection molding process.
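The Gabor wavelet examination mentioned above is, in essence, a time-frequency decomposition of the acoustic emission bursts. The sketch below applies a Gaussian-windowed complex-exponential (Gabor/Morlet-style) transform to a synthetic burst; the sampling rate, burst shape, and frequency band are assumptions chosen only to illustrate the technique, not the study's actual signals or parameters.

```python
# Sketch of a Gabor-wavelet transform of a synthetic acoustic-emission burst,
# illustrating the time-frequency analysis used above. All parameters are assumed.
import numpy as np

fs = 1_000_000.0                      # 1 MHz sampling (assumed)
t = np.arange(0, 2e-3, 1 / fs)
burst = np.exp(-((t - 5e-4) / 1e-4) ** 2) * np.sin(2 * np.pi * 150e3 * t)
signal = burst + 0.05 * np.random.default_rng(1).normal(size=t.size)

def gabor_cwt(x, freqs, fs, sigma_cycles=6.0):
    """Return |coefficients| with shape (len(freqs), len(x))."""
    out = np.empty((len(freqs), len(x)))
    for i, f in enumerate(freqs):
        sigma_t = sigma_cycles / (2 * np.pi * f)
        half = int(4 * sigma_t * fs)
        tt = np.arange(-half, half + 1) / fs
        wavelet = np.exp(-0.5 * (tt / sigma_t) ** 2) * np.exp(2j * np.pi * f * tt)
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))
        out[i] = np.abs(np.convolve(x, np.conj(wavelet), mode="same"))
    return out

freqs = np.linspace(50e3, 400e3, 36)
coeffs = gabor_cwt(signal, freqs, fs)
peak_f, peak_t = np.unravel_index(np.argmax(coeffs), coeffs.shape)
print(f"peak energy near {freqs[peak_f]/1e3:.0f} kHz at t = {t[peak_t]*1e6:.0f} us")
```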
Applying open source data visualization tools to standard based medical data.
Kopanitsa, Georgy; Taranik, Maxim
2014-01-01
Presentation of medical data in personal health records (PHRs) requires flexible, platform-independent tools to ensure easy access to the information. The different backgrounds of patients, especially elderly people, require simple graphical presentation of the data. Data in PHRs can be collected from heterogeneous sources. The use of standard-based medical data allows the development of generic visualization methods. Focusing on the deployment of open source tools, in this paper we applied JavaScript libraries to create data presentations for standard-based medical data.
Modeling languages for biochemical network simulation: reaction vs equation based approaches.
Wiechert, Wolfgang; Noack, Stephan; Elsheikh, Atya
2010-01-01
Biochemical network modeling and simulation is an essential task in any systems biology project. The systems biology markup language (SBML) was established as a standardized model exchange language for mechanistic models. A specific strength of SBML is that numerous tools for formulating, processing, simulating and analyzing models are freely available. Interestingly, in the field of multidisciplinary simulation, the problem of model exchange between different simulation tools arose much earlier. Several general modeling languages, like Modelica, were developed in the 1990s. Modelica enables an equation-based, modular specification of arbitrary hierarchical differential algebraic equation models. Moreover, libraries for special application domains can be rapidly developed. This contribution compares the reaction-based approach of SBML with the equation-based approach of Modelica and explains the specific strengths of both tools. Several biological examples illustrating essential SBML and Modelica concepts are given. The chosen criteria for tool comparison are flexibility for constraint specification, different modeling flavors, and hierarchical, modular and multidisciplinary modeling. Additionally, support for spatially distributed systems, event handling and network analysis features is discussed. As a major result it is shown that the choice of the modeling tool has a strong impact on the expressivity of the specified models but also strongly depends on the requirements of the application context.
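The contrast between the two modeling styles can be seen on a toy network A -> B -> C: a reaction-based description lists reactions and derives the rate equations mechanically from stoichiometry, whereas an equation-based description states the differential equations directly. The sketch below renders both views in Python for illustration only; it is neither SBML nor Modelica syntax, and the rate constants are invented.

```python
# Reaction-based vs equation-based specification of the same toy network A -> B -> C.
import numpy as np
from scipy.integrate import solve_ivp

# Reaction-based view: (reactants, products, rate constant); ODEs derived from
# stoichiometry and mass-action kinetics, as an SBML-style tool would do.
reactions = [({"A": 1}, {"B": 1}, 0.5),
             ({"B": 1}, {"C": 1}, 0.2)]
species = ["A", "B", "C"]

def rhs_from_reactions(t, y):
    conc = dict(zip(species, y))
    dy = dict.fromkeys(species, 0.0)
    for reactants, products, k in reactions:
        flux = k * np.prod([conc[s] ** n for s, n in reactants.items()])
        for s, n in reactants.items():
            dy[s] -= n * flux
        for s, n in products.items():
            dy[s] += n * flux
    return [dy[s] for s in species]

# Equation-based view: the modeller writes the differential equations directly,
# which is more flexible but hides the reaction structure (Modelica-style).
def rhs_equations(t, y):
    a, b, c = y
    return [-0.5 * a, 0.5 * a - 0.2 * b, 0.2 * b]

y0 = [1.0, 0.0, 0.0]
for rhs in (rhs_from_reactions, rhs_equations):
    sol = solve_ivp(rhs, (0, 20), y0, t_eval=[20])
    print(rhs.__name__, np.round(sol.y[:, -1], 4))   # identical results
```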
Climate tools in mainstream Linux distributions
NASA Astrophysics Data System (ADS)
McKinstry, Alastair
2015-04-01
Debian/meteorology is a project to integrate climate tools and analysis software into the mainstream Debian/Ubuntu Linux distributions. This work describes lessons learnt and recommends practices for scientific software to be adopted and maintained in OS distributions. In addition to standard analysis tools (cdo, grads, ferret, metview, ncl, etc.), software used by the Earth System Grid Federation was chosen for integration, to enable ESGF portals to be built on this base; however, exposing scientific codes via web APIs exposes security weaknesses that are normally ignorable. How tools are hardened, and what changes are required to handle security upgrades, are described. Secondly, enabling libraries and components (e.g. Python modules) to be integrated requires planning by writers: it is not sufficient to assume users can upgrade their code when you make incompatible changes. Here, practices are recommended to enable upgrades and co-installability of C, C++, Fortran and Python codes. Finally, software packages such as NetCDF and HDF5 can be built in multiple configurations. Tools may then expect incompatible versions of these libraries (e.g. serial and parallel) to be simultaneously available; how this was solved in Debian using "pkg-config" and shared library interfaces is described, and best practices for software writers to enable this are summarised.
A health app developer's guide to law and policy: a multi-sector policy analysis.
Parker, Lisa; Karliychuk, Tanya; Gillies, Donna; Mintzes, Barbara; Raven, Melissa; Grundy, Quinn
2017-10-02
Apps targeted at health and wellbeing sit in a rapidly growing industry associated with widespread optimism about their potential to deliver accessible and cost-effective healthcare. App developers might not be aware of all the regulatory requirements and best practice principles are emergent. Health apps are regulated in order to minimise their potential for harm due to, for example, loss of personal health privacy, financial costs, and health harms from delayed or unnecessary diagnosis, monitoring and treatment. We aimed to produce a comprehensive guide to assist app developers in producing health apps that are legally compliant and in keeping with high professional standards of user protection. We conducted a case study analysis of the Australian and related international policy environment for mental health apps to identify relevant sectors, policy actors, and policy solutions. We identified 29 policies produced by governments and non-government organisations that provide oversight of health apps. In consultation with stakeholders, we developed an interactive tool targeted at app developers, summarising key features of the policy environment and highlighting legislative, industry and professional standards around seven relevant domains: privacy, security, content, promotion and advertising, consumer finances, medical device efficacy and safety, and professional ethics. We annotated this developer guidance tool with information about: the relevance of each domain; existing legislative and non-legislative guidance; critiques of existing policy; recommendations for developers; and suggestions for other key stakeholders. We anticipate that mental health apps developed in accordance with this tool will be more likely to conform to regulatory requirements, protect consumer privacy, protect consumer finances, and deliver health benefit; and less likely to attract regulatory penalties, offend consumers and communities, mislead consumers, or deliver health harms. We encourage government, industry and consumer organisations to use and publicise the tool.
2011-01-01
Background Current measures of antenatal care use are limited to initiation of care and number of visits. This study aimed to describe the development and application of a tool to assess the adequacy of the content and timing of antenatal care. Methods The Content and Timing of care in Pregnancy (CTP) tool was developed based on clinical relevance for ongoing antenatal care and recommendations in national and international guidelines. The tool reflects minimal care recommended in every pregnancy, regardless of parity or risk status. CTP measures timing of initiation of care, content of care (number of blood pressure readings, blood tests and ultrasound scans) and whether the interventions were received at an appropriate time. Antenatal care trajectories for 333 pregnant women were then described using a standard tool (the APNCU index), which measures the quantity of care only, and the new CTP tool. Both tools categorise care into 4 categories, from 'Inadequate' (both tools) to 'Adequate plus' (APNCU) or 'Appropriate' (CTP). Participants recorded the timing and content of their antenatal care prospectively using diaries. Analysis included an examination of similarities and differences in categorisation of care episodes between the tools. Results According to the CTP tool, the care trajectory of 10.2% of the women was classified as inadequate, 8.4% as intermediate, 36% as sufficient and 45.3% as appropriate. The assessment of quality of care differed significantly between the two tools. Seventeen care trajectories classified as 'Adequate' or 'Adequate plus' by the APNCU were deemed 'Inadequate' by the CTP. This suggests that, despite a high number of visits, these women did not receive the minimal recommended content and timing of care. Conclusions The CTP tool provides a more detailed assessment of the adequacy of antenatal care than the current standard index. However, guidelines for the content of antenatal care vary, and the tool does not at the moment grade over-use of interventions as 'Inappropriate'. Further work needs to be done to refine the content items prior to larger scale testing of the impact of the new measure. PMID:21896201
ERIC Educational Resources Information Center
Fish, Laurel J.; Halcoussis, Dennis; Phillips, G. Michael
2017-01-01
The Monte Carlo method and related multiple imputation methods are traditionally used in math, physics and science to estimate and analyze data and are now becoming standard tools in analyzing business and financial problems. However, few sources explain the application of the Monte Carlo method for individuals and business professionals who are…
Data needs for X-ray astronomy satellites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kallman, T.
I review the current status of atomic data for X-ray astronomy satellites. This includes some of the astrophysical issues which can be addressed, current modeling and analysis techniques, computational tools, the limitations imposed by currently available atomic data, and the validity of standard assumptions. I also discuss the future: challenges associated with future missions and goals for atomic data collection.
Identifying g: A Review of Current Factor Analytic Practices in the Science of Mental Abilities
ERIC Educational Resources Information Center
Reeve, Charlie L.; Blacksmith, Nikki
2009-01-01
Factor analysis is arguably one of the most important tools in the science of mental abilities. While many studies have been conducted to make recommendations regarding "best practices" concerning its use, the degree to which contemporary ability researchers abide by those standards is unknown. The current study sought to evaluate the typical…
Functional genomics (FG) screens, using RNAi or CRISPR technology, have become a standard tool for systematic, genome-wide loss-of-function studies for therapeutic target discovery. As in many large-scale assays, however, off-target effects, variable reagent potency and experimental noise must be accounted for to appropriately control for false positives.
How can my research paper be useful for future meta-analyses on forest restoration practices?
Enrique Andivia; Pedro Villar‑Salvador; Juan A. Oliet; Jaime Puertolas; R. Kasten Dumroese
2018-01-01
Statistical meta-analysis is a powerful and useful tool to quantitatively synthesize the information conveyed in published studies on a particular topic. It allows identifying and quantifying overall patterns and exploring causes of variation. The inclusion of published works in meta-analyses requires, however, a minimum quality standard of the reported data and...
The CEOS WGISS Atmospheric Composition Portal
NASA Technical Reports Server (NTRS)
Lynnes, Chris
2010-01-01
Goal: Demonstrate the feasibility of connecting distributed atmospheric composition data and analysis tools into a common and shared web framework. Initial effort focused on: a) collaboratively creating a web application within WDC-RSAT for comparison of satellite-derived atmospheric composition datasets accessed from distributed data sources; b) implementation of data access and interoperability standards; c) soliciting feedback from users, especially from ACC participants.
Effects of Cognitive Load on Trust
2013-10-01
…that may be affected by load; build a parsing tool to extract relevant features; statistical analysis of results (by load components). Achieved... for a business application. Participants assessed potential job candidates and reviewed the applicants’ virtual resume, which included standard... substantially different from each other that would make any confounding problems or other issues. Some statistics of the Australian data collection are
Algorithms and programming tools for image processing on the MPP:3
NASA Technical Reports Server (NTRS)
Reeves, Anthony P.
1987-01-01
This is the third and final report on the work done for NASA Grant 5-403 on Algorithms and Programming Tools for Image Processing on the MPP:3. All the work done for this grant is summarized in the introduction. Work done since August 1986 is reported in detail. Research for this grant falls under the following headings: (1) fundamental algorithms for the MPP; (2) programming utilities for the MPP; (3) the Parallel Pascal Development System; and (4) performance analysis. In this report, the results of two efforts are reported: region growing, and performance analysis of important characteristic algorithms. In each case, timing results from MPP implementations are included. A paper is included in which parallel algorithms for region growing on the MPP are discussed. These algorithms permit different sized regions to be merged in parallel. Details on the implementation and performance of several important MPP algorithms are given. These include a number of standard permutations, the FFT, convolution, arbitrary data mappings, image warping, and pyramid operations, all of which have been implemented on the MPP. The permutation and image warping functions have been included in the standard development system library.
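For readers unfamiliar with region growing, the sketch below shows the idea in its simplest serial form: starting from a seed pixel, neighbouring pixels are absorbed while they stay close to the running region mean. The report's MPP algorithms merge differently sized regions in parallel; this flood-fill style version, with arbitrary seed and threshold, is only an illustration of the underlying operation.

```python
# Serial sketch of region growing (the MPP version runs the merging in parallel).
from collections import deque
import numpy as np

def region_grow(image, seed, tol=10):
    """Grow a region from `seed`, admitting 4-neighbours whose grey value
    differs from the running region mean by at most `tol`."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    total, count = float(image[seed]), 1
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc]:
                if abs(float(image[nr, nc]) - total / count) <= tol:
                    mask[nr, nc] = True
                    total += float(image[nr, nc])
                    count += 1
                    queue.append((nr, nc))
    return mask

img = np.zeros((64, 64), dtype=np.uint8)
img[20:40, 20:40] = 200                       # bright square on a dark background
print(region_grow(img, seed=(30, 30)).sum())  # 400 pixels in the grown region
```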
Gearbox Reliability Collaborative Phase 3 Gearbox 2 Test Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Link, H.; Keller, J.; Guo, Y.
2013-04-01
Gearboxes in wind turbines have not been achieving their expected design life even though they commonly meet or exceed the design criteria specified in current design standards. One of the basic premises of the National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) is that the low gearbox reliability results from the absence of critical elements in the design process or insufficient design tools. Key goals of the GRC are to improve design approaches and analysis tools and to recommend practices and test methods resulting in improved design standards for wind turbine gearboxes that lower the cost of energy (COE) through improved reliability. The GRC uses a combined gearbox testing, modeling and analysis approach, along with a database of information from gearbox failures collected from overhauls and investigation of gearbox condition monitoring techniques to improve wind turbine operations and maintenance practices. This test plan covers testing of Gearbox 2 (GB2) using the two-speed turbine controller that has been used in prior testing. This test series will investigate non-torque loads, high-speed shaft misalignment, and reproduction of field conditions in the dynamometer. This test series will also include vibration testing using an eddy-current brake on the gearbox's high speed shaft.
NASA Astrophysics Data System (ADS)
Krohn, Olivia; Armbruster, Aaron; Gao, Yongsheng; Atlas Collaboration
2017-01-01
Software tools developed for the purpose of modeling CERN LHC pp collision data to aid in its interpretation are presented. Some measurements are not adequately described by a Gaussian distribution; thus an interpretation assuming Gaussian uncertainties will inevitably introduce bias, necessitating analytical tools to recreate and evaluate non-Gaussian features. One example is the measurements of Higgs boson production rates in different decay channels, and the interpretation of these measurements. The ratios of data to Standard Model expectations (μ) for five arbitrary signals were modeled by building five Poisson distributions with mixed signal contributions such that the measured values of μ are correlated. Algorithms were designed to recreate probability distribution functions of μ as multi-variate Gaussians, where the standard deviation (σ) and correlation coefficients (ρ) are parametrized. There was good success in modeling 1-D likelihood contours of μ, and the multi-dimensional distributions were well modeled within 1σ, but the model began to diverge beyond 2σ due to unwarranted assumptions made in developing ρ. Future plans to improve the algorithms and develop a user-friendly analysis package will also be discussed. NSF International Research Experiences for Students
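The construction described above can be illustrated with a toy setup: five Poisson channels whose expected yields mix contributions from five signals, maximum-likelihood estimates of the signal strengths μ, and a multivariate-Gaussian approximation whose σ and ρ are estimated from pseudo-experiments. The mixing matrix and yields below are invented for illustration; this is not the authors' software.

```python
# Toy version of approximating correlated signal-strength estimators with a
# multivariate Gaussian parametrized by sigma and rho. All numbers are invented.
import numpy as np

rng = np.random.default_rng(42)
background = np.array([50.0, 80.0, 30.0, 60.0, 40.0])
# mixing[i, j] = expected events in channel i from signal j at mu_j = 1
mixing = np.array([[20, 5, 0, 0, 0],
                   [4, 25, 6, 0, 0],
                   [0, 5, 15, 3, 0],
                   [0, 0, 4, 18, 5],
                   [0, 0, 0, 6, 22]], dtype=float)
mu_true = np.ones(5)

def ml_estimate(observed):
    # With as many channels as signals and means linear in mu, the Poisson ML
    # solution sets expected = observed, i.e. solve mixing @ mu = observed - background.
    return np.linalg.solve(mixing, observed - background)

# Pseudo-experiments to estimate sigma and the correlation matrix of the estimators.
toys = np.array([ml_estimate(rng.poisson(background + mixing @ mu_true))
                 for _ in range(20000)])
sigma = toys.std(axis=0)
rho = np.corrcoef(toys, rowvar=False)
print("sigma:", np.round(sigma, 3))
print("rho[0,1]:", round(rho[0, 1], 3))
```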
2017-01-01
The current cytomegalovirus (CMV) prevention strategies in solid organ transplantation (SOT) recipients have contributed towards overcoming the detrimental effects caused by CMV lytic infection, and improving the long-term success rate of graft survival. Although the quantification of CMV in peripheral blood is the standard method, and an excellent end-point for diagnosing CMV replication and modulating the anti-CMV prevention strategies in SOT recipients, a novel biomarker mimicking the CMV control mechanism is required. CMV-specific immune monitoring can be employed as a basic tool predicting CMV infection or disease after SOT, since uncontrolled CMV replication mostly originates from the impairment of immune responses against CMV under immunosuppressive conditions in SOT recipients. Several studies conducted during the past few decades have indicated the possibility of measuring the CMV-specific cell-mediated immune response in clinical situations. Among several analytical assays, the most advanced standardized tool is the QuantiFERON®-CMV assay. The T-Track® CMV kit, which uses a standardized enzyme-linked immunospot assay, is also widely employed. In addition to these assays, immunophenotyping and intracellular cytokine analysis using flow cytometry (with fluorescence-labeled monoclonal antibodies or peptide-major histocompatibility complex multimers) need to be adequately standardized and validated for potential clinical applications. PMID:29027383
Neo: an object model for handling electrophysiology data in multiple formats
Garcia, Samuel; Guarino, Domenico; Jaillet, Florent; Jennings, Todd; Pröpper, Robert; Rautenberg, Philipp L.; Rodgers, Chris C.; Sobolev, Andrey; Wachtler, Thomas; Yger, Pierre; Davison, Andrew P.
2014-01-01
Neuroscientists use many different software tools to acquire, analyze and visualize electrophysiological signals. However, incompatible data models and file formats make it difficult to exchange data between these tools. This reduces scientific productivity, renders potentially useful analysis methods inaccessible and impedes collaboration between labs. A common representation of the core data would improve interoperability and facilitate data-sharing. To that end, we propose here a language-independent object model, named “Neo,” suitable for representing data acquired from electroencephalographic, intracellular, or extracellular recordings, or generated from simulations. As a concrete instantiation of this object model we have developed an open source implementation in the Python programming language. In addition to representing electrophysiology data in memory for the purposes of analysis and visualization, the Python implementation provides a set of input/output (IO) modules for reading/writing the data from/to a variety of commonly used file formats. Support is included for formats produced by most of the major manufacturers of electrophysiology recording equipment and also for more generic formats such as MATLAB. Data representation and data analysis are conceptually separate: it is easier to write robust analysis code if it is focused on analysis and relies on an underlying package to handle data representation. For that reason, and also to be as lightweight as possible, the Neo object model and the associated Python package are deliberately limited to representation of data, with no functions for data analysis or visualization. Software for neurophysiology data analysis and visualization built on top of Neo automatically gains the benefits of interoperability, easier data sharing and automatic format conversion; there is already a burgeoning ecosystem of such tools. We intend that Neo should become the standard basis for Python tools in neurophysiology. PMID:24600386
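The container hierarchy described above (a block of recordings holding segments, each holding analog signals) can be sketched in a few lines. The classes below mirror the concept only; they are not the Neo package's actual API, and the units, sampling rate, and names are placeholder values. For real work, the open source `neo` Python package should be used.

```python
# Illustrative mini object model in the spirit of Neo's Block -> Segment ->
# AnalogSignal hierarchy. NOT the Neo library's API; for illustration only.
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class AnalogSignal:
    data: np.ndarray          # samples
    sampling_rate: float      # Hz
    units: str                # e.g. "mV"
    name: str = ""

@dataclass
class Segment:                # one recording episode / trial
    analogsignals: List[AnalogSignal] = field(default_factory=list)
    name: str = ""

@dataclass
class Block:                  # top-level container, e.g. one experimental session
    segments: List[Segment] = field(default_factory=list)
    name: str = ""

# Representation is kept separate from analysis: analysis code only walks the
# containers, regardless of which file format the data originally came from.
block = Block(name="session-01")
trial = Segment(name="trial-1")
trial.analogsignals.append(
    AnalogSignal(np.random.randn(10_000), sampling_rate=20_000.0,
                 units="mV", name="membrane potential"))
block.segments.append(trial)

for seg in block.segments:
    for sig in seg.analogsignals:
        print(seg.name, sig.name, round(float(sig.data.mean()), 4))
```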
Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B
2015-01-01
Objectives To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Materials and methods Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Results Transformation to the CDM resulted in minimal information loss across all 6 databases. Patients and observations were excluded due to identified data quality issues in the source systems; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. Discussion The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of the protocol's inclusion criteria and identified differences in patient characteristics and coding practices across databases. Conclusion Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. PMID:25670757
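The mapping rates quoted above come from the ETL step that translates source codes into standard vocabulary concepts. The sketch below shows that step in toy form; the codes, concept identifiers, and mapping table are fabricated for illustration, whereas a real OMOP ETL would use the published standardized vocabularies.

```python
# Toy sketch of mapping source drug codes to standard concept IDs, in the spirit
# of the OMOP CDM vocabulary mapping step. All codes and IDs are fabricated.
source_drug_records = [
    {"person_id": 1, "source_code": "NDC:00093-7180"},
    {"person_id": 2, "source_code": "NDC:00071-0155"},
    {"person_id": 3, "source_code": "LOCAL:ASPIRIN-81"},   # no standard mapping
]

# source vocabulary code -> standard concept_id (hypothetical values)
source_to_concept = {
    "NDC:00093-7180": 1539403,
    "NDC:00071-0155": 1545958,
}

mapped, unmapped = [], []
for rec in source_drug_records:
    concept_id = source_to_concept.get(rec["source_code"])
    if concept_id is None:
        unmapped.append(rec)                    # retained for data-quality review
    else:
        mapped.append({**rec, "drug_concept_id": concept_id})

rate = 100.0 * len(mapped) / len(source_drug_records)
print(f"{rate:.0f}% of drug records mapped to standard concepts")
```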
Cleft audit protocol for speech (CAPS-A): a comprehensive training package for speech analysis.
Sell, D; John, A; Harding-Bell, A; Sweeney, T; Hegarty, F; Freeman, J
2009-01-01
The previous literature has largely focused on speech analysis systems and ignored process issues, such as the nature of adequate speech samples, data acquisition, recording and playback. Although there has been recognition of the need for training on tools used in speech analysis associated with cleft palate, little attention has been paid to this issue. To design, execute, and evaluate a training programme for speech and language therapists on the systematic and reliable use of the Cleft Audit Protocol for Speech-Augmented (CAPS-A), addressing issues of standardized speech samples, data acquisition, recording, playback, and listening guidelines. Thirty-six specialist speech and language therapists undertook the training programme over four days. This consisted of two days' training on the CAPS-A tool followed by a third day, making independent ratings and transcriptions on ten new cases which had been previously recorded during routine audit data collection. This task was repeated on day 4, a minimum of one month later. Ratings were made using the CAPS-A record form with the CAPS-A definition table. An analysis was made of the speech and language therapists' CAPS-A ratings at occasion 1 and occasion 2 and the intra- and inter-rater reliability calculated. Trained therapists showed consistency in individual judgements on specific sections of the tool. Intraclass correlation coefficients were calculated for each section with good agreement on eight of 13 sections. There were only fair levels of agreement on anterior oral cleft speech characteristics, non-cleft errors/immaturities and voice. This was explained, at least in part, by their low prevalence which affects the calculation of the intraclass correlation coefficient statistic. Speech and language therapists benefited from training on the CAPS-A, focusing on specific aspects of speech using definitions of parameters and scalar points, in order to apply the tool systematically and reliably. Ratings are enhanced by ensuring a high degree of attention to the nature of the data, standardizing the speech sample, data acquisition, the listening process together with the use of high-quality recording and playback equipment. In addition, a method is proposed for maintaining listening skills following training as part of an individual's continuing education.
What's it worth? A general manager's guide to valuation.
Luehrman, T A
1997-01-01
Behind every major resource-allocation decision a company makes lies some calculation of what that move is worth. So it is not surprising that valuation is the financial analytical skill general managers want to learn more than any other. Managers whose formal training is more than a few years old, however, are likely to have learned approaches that are becoming obsolete. What do generalists need in an updated valuation tool kit? In the 1970s, discounted-cash-flow analysis (DCF) emerged as best practice for valuing corporate assets. And one version of DCF, using the weighted-average cost of capital (WACC), became the standard. Over the years, WACC has been used by most companies as a one-size-fits-all valuation tool. Today the WACC standard is insufficient. Improvements in computers and new theoretical insights have given rise to tools that outperform WACC in the three basic types of valuation problems managers face. Timothy Luehrman presents an overview of the three tools, explaining how they work and when to use them. For valuing operations, the DCF methodology of adjusted present value allows managers to break a problem into pieces that make managerial sense. For valuing opportunities, option pricing captures the contingent nature of investments in areas such as R&D and marketing. And for valuing ownership claims, the tool of equity cash flows helps managers value their company's stake in a joint venture, a strategic alliance, or an investment that uses project financing.
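A small worked comparison may help fix the two DCF flavours mentioned above: WACC discounts the free cash flows at one blended rate, while adjusted present value (APV) values the unlevered cash flows and the debt tax shield separately. All inputs below (cash flows, rates, capital structure) are invented for illustration and do not come from the article.

```python
# Toy comparison of WACC-based NPV and adjusted present value (APV).
free_cash_flows = [120.0, 130.0, 140.0, 150.0, 160.0]   # end-of-year, in $m
r_equity, r_debt, tax = 0.12, 0.06, 0.30
equity_value, debt_value = 600.0, 400.0                 # market values, in $m

# WACC approach: one blended discount rate applied to the free cash flows.
v = equity_value + debt_value
wacc = (equity_value / v) * r_equity + (debt_value / v) * r_debt * (1 - tax)
npv_wacc = sum(cf / (1 + wacc) ** t for t, cf in enumerate(free_cash_flows, 1))

# APV approach: unlevered value plus the present value of interest tax shields.
r_unlevered = 0.10                                       # assumed all-equity rate
base_value = sum(cf / (1 + r_unlevered) ** t
                 for t, cf in enumerate(free_cash_flows, 1))
interest = debt_value * r_debt                           # constant debt assumed
tax_shields = sum(tax * interest / (1 + r_debt) ** t for t in range(1, 6))
apv = base_value + tax_shields

print(f"WACC = {wacc:.2%}, NPV(WACC) = {npv_wacc:.1f}, APV = {apv:.1f}")
```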
Analysis of laparoscopy in trauma.
Villavicencio, R T; Aucar, J A
1999-07-01
The optimum roles for laparoscopy in trauma have yet to be established. To date, reviews of laparoscopy in trauma have been primarily descriptive rather than analytic. This article analyzes the results of laparoscopy in trauma. Outcome analysis was done by reviewing 37 studies with more than 1,900 trauma patients, and laparoscopy was analyzed as a screening, diagnostic, or therapeutic tool. Laparoscopy was regarded as a screening tool if it was used to detect or exclude a positive finding (eg, hemoperitoneum, organ injury, gastrointestinal spillage, peritoneal penetration) that required operative exploration or repair. Laparoscopy was regarded as a diagnostic tool when it was used to identify all injuries, rather than as a screening tool to identify the first indication for a laparotomy. It was regarded as a diagnostic tool only in studies that mandated a laparotomy (gold standard) after laparoscopy to confirm the diagnostic accuracy of laparoscopic findings. Costs and charges for using laparoscopy in trauma were analyzed when feasible. As a screening tool, laparoscopy missed 1% of injuries and helped prevent 63% of patients from having a trauma laparotomy. When used as a diagnostic tool, laparoscopy had a 41% to 77% missed injury rate per patient. Overall, laparoscopy carried a 1% procedure-related complication rate. Cost-effectiveness has not been uniformly proved in studies comparing laparoscopy and laparotomy. Laparoscopy has been applied safely and effectively as a screening tool in stable patients with acute trauma. Because of the large number of missed injuries when used as a diagnostic tool, its value in this context is limited. Laparoscopy has been reported infrequently as a therapeutic tool in selected patients, and its use in this context requires further study.
Huysentruyt, Koen; Devreker, Thierry; Dejonckheere, Joachim; De Schepper, Jean; Vandenplas, Yvan; Cools, Filip
2015-08-01
The aim of the present study was to evaluate the predictive accuracy of screening tools for assessing nutritional risk in hospitalized children in developed countries. The study involved a systematic review of literature (MEDLINE, EMBASE, and Cochrane Central databases up to January 17, 2014) of studies on the diagnostic performance of pediatric nutritional screening tools. Methodological quality was assessed using a modified QUADAS tool. Sensitivity and specificity were calculated for each screening tool per validation method. A meta-analysis was performed to estimate the risk ratio of different screening result categories of being truly at nutritional risk. A total of 11 studies were included on ≥1 of the following screening tools: Pediatric Nutritional Risk Score, Screening Tool for the Assessment of Malnutrition in Paediatrics, Paediatric Yorkhill Malnutrition Score, and Screening Tool for Risk on Nutritional Status and Growth. Because of variation in reference standards, a direct comparison of the predictive accuracy of the screening tools was not possible. A meta-analysis was performed on 1629 children from 7 different studies. The risk ratio of being truly at nutritional risk was 0.349 (95% confidence interval [CI] 0.16-0.78) for children in the low versus moderate screening category and 0.292 (95% CI 0.19-0.44) in the moderate versus high screening category. There is insufficient evidence to choose 1 nutritional screening tool over another based on their predictive accuracy. The estimated risk of being at "true nutritional risk" increases with each category of screening test result. Each screening category should be linked to a specific course of action, although further research is needed.
Birch, Ivan; Vernon, Wesley; Walker, Jeremy; Saxelby, Jai
2013-10-01
Gait analysis from closed circuit camera footage is now commonly used as evidence in criminal trials. The biomechanical analysis of human gait is a well established science in both clinical and laboratory settings. However, closed circuit camera footage is rarely of the quality of that taken in the more controlled clinical and laboratory environments. The less than ideal quality of much of this footage for use in gait analysis is associated with a range of issues, the combination of which can often render the footage unsuitable for use in gait analysis. The aim of this piece of work was to develop a tool for assessing the suitability of closed circuit camera footage for the purpose of forensic gait analysis. A Delphi technique was employed with a small sample of expert forensic gait analysis practitioners, to identify key quality elements of CCTV footage used in legal proceedings. Five elements of the footage were identified and then subdivided into 15 contributing sub-elements, each of which was scored using a 5-point Likert scale. A Microsoft Excel worksheet was developed to calculate automatically an overall score from the fifteen sub-element scores. Five expert witnesses experienced in using CCTV footage for gait analysis then trialled the prototype tool on current case footage. A repeatability study was also undertaken using standardized CCTV footage. The results showed the tool to be a simple and repeatable means of assessing the suitability of closed circuit camera footage for use in forensic gait analysis. The inappropriate use of poor quality footage could lead to challenges to the practice of forensic gait analysis. All parties involved in criminal proceedings must therefore understand the fitness for purpose of any footage used. The development of this tool could offer a method of achieving this goal, and help to assure the continued role of forensic gait analysis as an aid to the identification process. Copyright © 2013 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
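The scoring logic described above (fifteen Likert-scaled sub-elements rolled up into an overall suitability score) can be sketched as follows. The grouping into five elements of three sub-elements each, the equal weighting, and the suitability threshold are assumptions for illustration; the published tool's Excel worksheet may aggregate the scores differently.

```python
# Sketch of rolling 15 five-point Likert sub-element scores into an overall
# footage-suitability score. Grouping, weights, and threshold are assumed.
sub_element_scores = {
    "picture quality":          [4, 3, 5],
    "camera position":          [2, 4, 3],
    "subject visibility":       [5, 4, 4],
    "footage duration":         [3, 3, 4],
    "frame rate and playback":  [4, 2, 3],
}
assert sum(len(v) for v in sub_element_scores.values()) == 15

max_score = 15 * 5
total = sum(score for scores in sub_element_scores.values() for score in scores)
percent = 100.0 * total / max_score
verdict = "suitable for gait analysis" if percent >= 60 else "unsuitable"
print(f"{total}/{max_score} ({percent:.0f}%): {verdict}")
```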
Castañeda-Orjuela, Carlos; Romero, Martin; Arce, Patricia; Resch, Stephen; Janusz, Cara B; Toscano, Cristiana M; De la Hoz-Restrepo, Fernando
2013-07-02
The cost of Expanded Programs on Immunization (EPI) is an important aspect of the economic and financial analysis needed for planning purposes. Costs also are needed for cost-effectiveness analysis of introducing new vaccines. We describe a costing tool that improves the speed, accuracy, and availability of EPI costs and that was piloted in Colombia. The ProVac CostVac Tool is a spreadsheet-based tool that estimates overall EPI costs considering program inputs (personnel, cold chain, vaccines, supplies, etc.) at three administrative levels (central, departmental, and municipal) and one service delivery level (health facilities). It uses various costing methods. The tool was evaluated through a pilot exercise in Colombia. In addition to the costs obtained from the central and intermediate administrative levels, a survey of 112 local health facilities was conducted to collect vaccination costs. Total cost of the EPI, cost per dose of vaccine delivered, and cost per fully vaccinated child with the recommended immunization schedule in Colombia in 2009 were estimated. The ProVac CostVac Tool is a novel, user-friendly tool, which allows users to conduct an EPI costing study following guidelines for cost studies. The total costs of the Colombian EPI were estimated at US$ 107.8 million in 2009. The cost for a fully immunized child with the recommended schedule was estimated at US$ 153.62. Vaccines and vaccination supplies accounted for 58% of total costs, personnel for 21%, cold chain for 18%, and transportation for 2%. Most EPI costs are incurred at the central level (62%). The major cost driver at the department and municipal levels is personnel costs. The ProVac CostVac Tool proved to be a comprehensive and useful tool that will allow researchers and health officials to estimate the actual cost for national immunization programs. The present analysis shows that personnel, cold chain, and transportation are important components of EPI and should be carefully estimated in the cost analysis, particularly when evaluating new vaccine introduction. Copyright © 2013 Elsevier Ltd. All rights reserved.
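The core arithmetic of the costing exercise described above is the aggregation of input costs across administrative levels and the derivation of unit costs. The sketch below illustrates that logic with invented numbers; it is not the ProVac CostVac Tool, and none of the figures reproduce the Colombian results.

```python
# Toy sketch of EPI costing: sum input costs across levels, derive unit costs.
costs_by_level = {                      # US$, by input category (all invented)
    "central":      {"vaccines": 40e6, "supplies": 6e6, "personnel": 8e6,
                     "cold_chain": 9e6, "transport": 1e6},
    "departmental": {"personnel": 6e6, "cold_chain": 5e6, "transport": 0.6e6},
    "municipal":    {"personnel": 9e6, "cold_chain": 4e6, "transport": 0.4e6},
    "facility":     {"personnel": 10e6, "supplies": 2e6},
}

total = sum(sum(items.values()) for items in costs_by_level.values())
doses_delivered = 14_000_000
fully_vaccinated_children = 620_000

print(f"total EPI cost:             US$ {total/1e6:.1f} million")
print(f"cost per dose delivered:    US$ {total/doses_delivered:.2f}")
print(f"cost per fully vaccinated:  US$ {total/fully_vaccinated_children:.2f}")
for level, items in costs_by_level.items():
    share = 100.0 * sum(items.values()) / total
    print(f"  {level:<13} {share:.0f}% of total")
```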
BladeCAD: An Interactive Geometric Design Tool for Turbomachinery Blades
NASA Technical Reports Server (NTRS)
Miller, Perry L., IV; Oliver, James H.; Miller, David P.; Tweedt, Daniel L.
1996-01-01
A new methodology for interactive design of turbomachinery blades is presented. Software implementation of the methods provides a user interface that is intuitive to aero-designers while operating with standardized geometric forms. The primary contribution is that blade sections may be defined with respect to general surfaces of revolution, which may be defined to represent the path of fluid flow through the turbomachine. The completed blade design is represented as a non-uniform rational B-spline (NURBS) surface and is written to a standard IGES file which is portable to most design, analysis, and manufacturing applications.
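The NURBS representation mentioned above builds on B-spline basis functions. The sketch below evaluates a point on a B-spline curve with the standard Cox-de Boor recursion; a full NURBS additionally carries per-control-point weights (the rational form), and the control polygon and knot vector here are arbitrary examples rather than anything from BladeCAD.

```python
# Evaluating a B-spline curve point via the Cox-de Boor recursion (non-rational
# illustration of the geometry underlying a NURBS blade-section representation).
import numpy as np

def basis(i, p, u, knots):
    """i-th B-spline basis function of degree p at parameter u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) * basis(i, p - 1, u, knots)
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * basis(i + 1, p - 1, u, knots))
    return left + right

def bspline_point(u, control_points, degree, knots):
    pts = np.asarray(control_points, dtype=float)
    coeffs = np.array([basis(i, degree, u, knots) for i in range(len(pts))])
    return coeffs @ pts

# Cubic curve over a simple 2-D control polygon with a clamped knot vector.
ctrl = [(0, 0), (1, 2), (3, 3), (5, 1), (6, 0)]
knots = [0, 0, 0, 0, 0.5, 1, 1, 1, 1]
for u in (0.0, 0.25, 0.5, 0.75, 0.999):
    print(u, np.round(bspline_point(u, ctrl, 3, knots), 3))
```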
NASA Astrophysics Data System (ADS)
Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Wei, Y.
2010-12-01
Terrestrial ecology data sets are produced from diverse data sources such as model output, field data collection, laboratory analysis and remote sensing observation. These data sets can be created, distributed, and consumed in diverse ways as well. However, this diversity can hinder the usability of the data, and limit data users’ abilities to validate and reuse data for science and application purposes. Geospatial web services, such as those described in this paper, are an important means of reducing this burden. Terrestrial ecology researchers generally create the data sets in diverse file formats, with file and data structures tailored to the specific needs of their project, possibly as tabular data, geospatial images, or documentation in a report. Data centers may reformat the data to an archive-stable format and distribute the data sets through one or more protocols, such as FTP, email, and WWW. Because of the diverse data preparation, delivery, and usage patterns, users have to invest time and resources to bring the data into the format and structure most useful for their analysis. This time-consuming data preparation process shifts valuable resources from data analysis to data assembly. To address these issues, the ORNL DAAC, a NASA-sponsored terrestrial ecology data center, has utilized geospatial Web service technology, such as Open Geospatial Consortium (OGC) Web Map Service (WMS) and OGC Web Coverage Service (WCS) standards, to increase the usability and availability of terrestrial ecology data sets. Data sets are standardized into non-proprietary file formats and distributed through OGC Web Service standards. OGC Web services allow the ORNL DAAC to store data sets in a single format and distribute them in multiple ways and formats. Registering the OGC Web services through search catalogues and other spatial data tools allows for publicizing the data sets and makes them more available across the Internet. The ORNL DAAC has also created a Web-based graphical user interface called Spatial Data Access Tool (SDAT) that utilizes OGC Web services standards and allows data distribution and consumption for users not familiar with OGC standards. SDAT also allows users to visualize the data set prior to download. Google Earth visualizations of the data set are also provided through SDAT. The use of OGC Web service standards at the ORNL DAAC has enabled an increase in data consumption. In one case, a data set had a ~10-fold increase in downloads through OGC Web services in comparison to the conventional FTP and WWW methods of access. The increase in downloads suggests that users are not only finding the data sets they need but are also able to consume them readily in the format they need.
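For readers unfamiliar with OGC Web services, a WMS GetMap request is simply an HTTP query whose parameters follow the WMS specification. The sketch below assembles such a request against a placeholder endpoint and layer name; the parameter set follows WMS 1.1.1, while the URL and layer are assumptions for illustration, not the ORNL DAAC's actual service addresses.

```python
# Sketch of building a WMS 1.1.1 GetMap request. Endpoint and layer are placeholders.
from urllib.parse import urlencode
from urllib.request import urlopen

endpoint = "https://example.org/wms"          # hypothetical service URL
params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "example_layer",                # hypothetical layer name
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "-180,-90,180,90",
    "WIDTH": 720,
    "HEIGHT": 360,
    "FORMAT": "image/png",
}
url = f"{endpoint}?{urlencode(params)}"
print(url)

# Uncomment to fetch and save the rendered map (requires a live endpoint):
# with urlopen(url) as resp, open("map.png", "wb") as out:
#     out.write(resp.read())
```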
Analyzing huge pathology images with open source software.
Deroulers, Christophe; Ameisen, David; Badoual, Mathilde; Gerin, Chloé; Granier, Alexandre; Lartaud, Marc
2013-06-06
Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights in systems biology. However, such virtual slides build up a technical challenge since the images occupy often several gigabytes and cannot be fully opened in a computer's memory. Moreover, there is no standard format. Therefore, most common open source tools such as ImageJ fail at treating them, and the others require expensive hardware while still being prohibitively slow. We have developed several cross-platform open source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Our open source software enables dealing with huge images with standard software on average computers. They are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre who do image analysis of many slides on a computer cluster. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5955513929846272.
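The mosaicking operation described above amounts to covering a very large image with small, optionally overlapping tiles that can be processed one at a time. The sketch below shows that windowing logic on an in-memory array; the NDPITools/LargeTIFFTools themselves stream tiles from disk precisely because real virtual slides do not fit in RAM, and the tile size and overlap here are arbitrary.

```python
# Sketch of dividing a large image into overlapping tiles for per-tile analysis.
import numpy as np

def tiles(shape, tile=1024, overlap=64):
    """Yield (row_slice, col_slice) windows covering an image of `shape`."""
    h, w = shape
    step = tile - overlap
    for top in range(0, h, step):
        for left in range(0, w, step):
            yield slice(top, min(top + tile, h)), slice(left, min(left + tile, w))

image = np.random.randint(0, 255, size=(3000, 4000), dtype=np.uint8)
for k, (rs, cs) in enumerate(tiles(image.shape)):
    patch = image[rs, cs]
    # ... run per-tile analysis here (e.g. cell counting), then aggregate ...
    if k < 3:
        print(f"tile {k}: rows {rs.start}-{rs.stop}, cols {cs.start}-{cs.stop}, "
              f"mean={patch.mean():.1f}")
```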
Lung ultrasound in diagnosing pneumonia in childhood: a systematic review and meta-analysis.
Orso, Daniele; Ban, Alessio; Guglielmo, Nicola
2018-06-21
Pneumonia is the third leading cause of death in children under 5 years of age worldwide. In pediatrics, both the accuracy and safety of diagnostic tools are important. Lung ultrasound (LUS) could be a safe diagnostic tool for this reason. We searched the literature for diagnostic studies of LUS for predicting pneumonia in pediatric patients, using systematic review and meta-analysis. The Medline, CINAHL, Cochrane Library, Embase, SPORTDiscus, ScienceDirect, and Web of Science databases from inception to September 2017 were searched. All studies that evaluated the diagnostic accuracy of LUS in determining the presence of pneumonia in patients under 18 years of age were included. 1042 articles were found by the systematic search. 76 articles were assessed for eligibility. Seventeen studies were included in the systematic review. We included 2612 pooled cases. The age of the pooled sample population ranged from 0 to about 21 years old. Summary sensitivity, specificity, and AUC were 0.94 (IQR: 0.89-0.97), 0.93 (IQR: 0.86-0.98), and 0.98 (IQR: 0.94-0.99), respectively. No agreement on a reference standard was detected: nine studies used chest X-rays, while four studies considered the clinical diagnosis. Only one study used computed tomography. LUS seems to be a promising tool for diagnosing pneumonia in children. However, the high heterogeneity found across the individual studies, and the absence of a reliable reference standard, make the finding questionable. More methodologically rigorous studies are needed.
NASA's Planetary Data System: Support for the Delivery of Derived Data Sets at the Atmospheres Node
NASA Astrophysics Data System (ADS)
Chanover, Nancy J.; Beebe, Reta; Neakrase, Lynn; Huber, Lyle; Rees, Shannon; Hornung, Danae
2015-11-01
NASA’s Planetary Data System is charged with archiving electronic data products from NASA planetary missions that are sponsored by NASA’s Science Mission Directorate. This archive, currently organized by science disciplines, uses standards for describing and storing data that are designed to enable future scientists who are unfamiliar with the original experiments to analyze the data, and to do this using a variety of computer platforms, with no additional support. These standards address the data structure, description contents, and media design. The new requirement in the NASA ROSES-2015 Research Announcement to include a Data Management Plan will result in an increase in the number of derived data sets that are being delivered to the PDS. These data sets may come from the Planetary Data Archiving, Restoration and Tools (PDART) program, other Data Analysis Programs (DAPs) or be volunteered by individuals who are publishing the results of their analysis. In response to this increase, the PDS Atmospheres Node is developing a set of guidelines and user tools to make the process of archiving these derived data products more efficient. Here we provide a description of Atmospheres Node resources, including a letter of support for the proposal stage, a communication schedule for the planned archive effort, product label samples and templates in extensible markup language (XML), documentation templates, and validation tools necessary for producing a PDS4-compliant derived data bundle(s) efficiently and accurately.
Modular modelling with Physiome standards
Nickerson, David P.; Nielsen, Poul M. F.; Hunter, Peter J.
2016-01-01
Key points The complexity of computational models is increasing, supported by research in modelling tools and frameworks. But relatively little thought has gone into design principles for complex models. We propose a set of design principles for complex model construction with the Physiome standard modelling protocol CellML. By following the principles, models are generated that are extensible and are themselves suitable for reuse in larger models of increasing complexity. We illustrate these principles with examples including an architectural prototype linking, for the first time, electrophysiology, thermodynamically compliant metabolism, signal transduction, gene regulation and synthetic biology. The design principles complement other Physiome research projects, facilitating the application of virtual experiment protocols and model analysis techniques to assist the modelling community in creating libraries of composable, characterised and simulatable quantitative descriptions of physiology. Abstract The ability to produce and customise complex computational models has great potential to have a positive impact on human health. As the field develops towards whole‐cell models and linking such models in multi‐scale frameworks to encompass tissue, organ, or organism levels, reuse of previous modelling efforts will become increasingly necessary. Any modelling group wishing to reuse existing computational models as modules for their own work faces many challenges in the context of construction, storage, retrieval, documentation and analysis of such modules. Physiome standards, frameworks and tools seek to address several of these challenges, especially for models expressed in the modular protocol CellML. Aside from providing a general ability to produce modules, there has been relatively little research work on architectural principles of CellML models that will enable reuse at larger scales. To complement and support the existing tools and frameworks, we develop a set of principles to address this consideration. The principles are illustrated with examples that couple electrophysiology, signalling, metabolism, gene regulation and synthetic biology, together forming an architectural prototype for whole‐cell modelling (including human intervention) in CellML. Such models illustrate how testable units of quantitative biophysical simulation can be constructed. Finally, future relationships between modular models so constructed and Physiome frameworks and tools are discussed, with particular reference to how such frameworks and tools can in turn be extended to complement and gain more benefit from the results of applying the principles. PMID:27353233
BIAS: Bioinformatics Integrated Application Software.
Finak, G; Godin, N; Hallett, M; Pepin, F; Rajabi, Z; Srivastava, V; Tang, Z
2005-04-15
We introduce a development platform especially tailored to Bioinformatics research and software development. BIAS (Bioinformatics Integrated Application Software) provides the tools necessary for carrying out integrative Bioinformatics research requiring multiple datasets and analysis tools. It follows an object-relational strategy for providing persistent objects, allows third-party tools to be easily incorporated within the system and supports standards and data-exchange protocols common to Bioinformatics. BIAS is an OpenSource project and is freely available to all interested users at http://www.mcb.mcgill.ca/~bias/. This website also contains a paper containing a more detailed description of BIAS and a sample implementation of a Bayesian network approach for the simultaneous prediction of gene regulation events and of mRNA expression from combinations of gene regulation events. hallett@mcb.mcgill.ca.
On the Development of a Hospital-Patient Web-Based Communication Tool: A Case Study From Norway.
Granja, Conceição; Dyb, Kari; Bolle, Stein Roald; Hartvigsen, Gunnar
2015-01-01
Surgery cancellations are undesirable in hospital settings as they increase costs, reduce productivity and efficiency, and directly affect the patient. The problem of elective surgery cancellations in a North Norwegian University Hospital is addressed. Based on a three-step methodology conducted at the hospital, the preoperative planning process was modeled taking into consideration the narratives from different health professions. From the analysis of the generated process models, it is concluded that in order to develop a useful patient centered web-based communication tool, it is necessary to fully understand how hospitals plan and organize surgeries today. Moreover, process reengineering is required to generate a standard process that can serve as a tool for health ICT designers to define the requirements for a robust and useful system.
Grape RNA-Seq analysis pipeline environment
Knowles, David G.; Röder, Maik; Merkel, Angelika; Guigó, Roderic
2013-01-01
Motivation: The avalanche of data arriving since the development of NGS technologies has prompted the need for fast, accurate and easily automated bioinformatic tools capable of dealing with massive datasets. Among the most productive applications of NGS technologies is the sequencing of cellular RNA, known as RNA-Seq. Although RNA-Seq provides a dynamic range similar to or greater than that of microarrays at similar or lower cost, the lack of standard and user-friendly pipelines is a bottleneck preventing RNA-Seq from becoming the standard for transcriptome analysis. Results: In this work we present a pipeline for processing and analyzing RNA-Seq data, which we have named Grape (Grape RNA-Seq Analysis Pipeline Environment). Grape supports raw sequencing reads produced by a variety of technologies, either in FASTA or FASTQ format, or as prealigned reads in SAM/BAM format. A minimal Grape configuration consists of the file location of the raw sequencing reads, the genome of the species and the corresponding gene and transcript annotation. Grape first runs a set of quality control steps, and then aligns the reads to the genome, a step that is omitted for prealigned read formats. Grape next estimates gene and transcript expression levels, calculates exon inclusion levels and identifies novel transcripts. Grape can be run on a single computer or in parallel on a computer cluster. It is distributed with specific mapping and quantification tools, but given its modular design, any tool supporting popular data interchange formats can be integrated. Availability: Grape can be obtained from the Bioinformatics and Genomics website at: http://big.crg.cat/services/grape. Contact: david.gonzalez@crg.eu or roderic.guigo@crg.eu PMID:23329413
Real-time MRI guidance of cardiac interventions.
Campbell-Washburn, Adrienne E; Tavallaei, Mohammad A; Pop, Mihaela; Grant, Elena K; Chubb, Henry; Rhode, Kawal; Wright, Graham A
2017-10-01
Cardiac magnetic resonance imaging (MRI) is appealing to guide complex cardiac procedures because it is ionizing radiation-free and offers flexible soft-tissue contrast. Interventional cardiac MR promises to improve existing procedures and enable new ones for complex arrhythmias, as well as congenital and structural heart disease. Guiding invasive procedures demands faster image acquisition, reconstruction and analysis, as well as intuitive intraprocedural display of imaging data. Standard cardiac MR techniques such as 3D anatomical imaging, cardiac function and flow, parameter mapping, and late-gadolinium enhancement can be used to gather valuable clinical data at various procedural stages. Rapid intraprocedural image analysis can extract and highlight critical information about interventional targets and outcomes. In some cases, real-time interactive imaging is used to provide a continuous stream of images displayed to interventionalists for dynamic device navigation. Alternatively, devices are navigated relative to a roadmap of major cardiac structures generated through fast segmentation and registration. Interventional devices can be visualized and tracked throughout a procedure with specialized imaging methods. In a clinical setting, advanced imaging must be integrated with other clinical tools and patient data. In order to perform these complex procedures, interventional cardiac MR relies on customized equipment, such as interactive imaging environments, in-room image display, audio communication, hemodynamic monitoring and recording systems, and electroanatomical mapping and ablation systems. Operating in this sophisticated environment requires coordination and planning. This review provides an overview of the imaging technology used in MRI-guided cardiac interventions. Specifically, this review outlines clinical targets, standard image acquisition and analysis tools, and the integration of these tools into clinical workflow. 1 Technical Efficacy: Stage 5 J. Magn. Reson. Imaging 2017;46:935-950. © 2017 International Society for Magnetic Resonance in Medicine.
Iannetti, S; Savini, L; Palma, D; Calistri, P; Natale, F; Di Lorenzo, A; Cerella, A; Giovannini, A
2014-03-01
The management of public health emergencies is improved by a quick, exhaustive and standardized flow of data on disease outbreaks, using specific tools for data collection, registration and analysis. In this context, the National Information System for the Notification of Outbreaks of Animal Diseases (SIMAN) has been developed in Italy to collect and share data on the notifications of outbreaks of animal diseases. SIMAN is connected through web services to the national database of animals and holdings (BDN) and has been integrated with tools for the management of epidemic emergencies. The website has been updated with a section dedicated to contingency planning in case of epidemic emergency. EpiTrace is one such useful tool, also integrated in the BDN and based on Social Network Analysis (SNA) and on network epidemiological models. This tool makes it possible to assess the risk associated with holdings and animals on the basis of their trade, in order to support the veterinary services in tracing animals backward and forward in case of outbreaks of infectious diseases. Copyright © 2014 Elsevier B.V. All rights reserved.
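The trace-back/trace-forward idea that underlies such network-based tools can be sketched in a few lines. The example below is an illustration only, not the SIMAN or EpiTrace code; the movement records and holding identifiers are hypothetical, and the networkx package is assumed.

```python
# Minimal sketch of network-based trace-back and trace-forward from an outbreak
# holding, using a directed graph of (hypothetical) animal movement records.
import networkx as nx

movements = [                      # (origin holding, destination holding)
    ("farm_A", "market_1"),
    ("market_1", "farm_B"),
    ("market_1", "farm_C"),
    ("farm_D", "farm_A"),
]

G = nx.DiGraph(movements)          # directed contact network of animal trade

outbreak = "farm_A"
trace_forward = nx.descendants(G, outbreak)   # holdings possibly infected downstream
trace_back = nx.ancestors(G, outbreak)        # holdings that may be the source

print("forward:", sorted(trace_forward))      # farm_B, farm_C, market_1
print("backward:", sorted(trace_back))        # farm_D
```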
Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-02-01
New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.
An efficient framework for Java data processing systems in HPC environments
NASA Astrophysics Data System (ADS)
Fries, Aidan; Castañeda, Javier; Isasi, Yago; Taboada, Guillermo L.; Portell de Mora, Jordi; Sirvent, Raül
2011-11-01
Java is a commonly used programming language, although its use in High Performance Computing (HPC) remains relatively low. One of the reasons is a lack of libraries offering specific HPC functions to Java applications. In this paper we present a Java-based framework, called DpcbTools, designed to provide a set of functions that fill this gap. It includes a set of efficient data communication functions based on message-passing, thus providing, when a low latency network such as Myrinet is available, higher throughputs and lower latencies than standard solutions used by Java. DpcbTools also includes routines for the launching, monitoring and management of Java applications on several computing nodes by making use of JMX to communicate with remote Java VMs. The Gaia Data Processing and Analysis Consortium (DPAC) is a real case where scientific data from the ESA Gaia astrometric satellite will be entirely processed using Java. In this paper we describe the main elements of DPAC and its usage of the DpcbTools framework. We also assess the usefulness and performance of DpcbTools through its performance evaluation and the analysis of its impact on some DPAC systems deployed in the MareNostrum supercomputer (Barcelona Supercomputing Center).
Brower, Stewart M
2004-10-01
The analysis included forty-one academic health sciences library (HSL) Websites as captured in the first two weeks of January 2001. Home pages and persistent navigational tools (PNTs) were analyzed for layout, technology, and links, and other general site metrics were taken. Websites were selected based on rank in the National Network of Libraries of Medicine, with regional and resource libraries given preference on the basis that these libraries are recognized as leaders in their regions and would be the most reasonable source of standards for best practice. A three-page evaluation tool was developed based on previous similar studies. All forty-one sites were evaluated in four specific areas: library general information, Website aids and tools, library services, and electronic resources. Metrics taken for electronic resources included orientation of bibliographic databases alphabetically by title or by subject area and with links to specifically named databases. Based on the results, a formula for determining obligatory links was developed, listing items that should appear on all academic HSL Web home pages and PNTs. These obligatory links demonstrate a series of best practices that may be followed in the design and construction of academic HSL Websites.
Piazza, Rocco; Magistroni, Vera; Pirola, Alessandra; Redaelli, Sara; Spinelli, Roberta; Redaelli, Serena; Galbiati, Marta; Valletta, Simona; Giudici, Giovanni; Cazzaniga, Giovanni; Gambacorti-Passerini, Carlo
2013-01-01
Copy number alterations (CNA) are common events occurring in leukaemias and solid tumors. Comparative Genome Hybridization (CGH) is currently the gold standard technique to analyze CNAs; however, CGH analysis requires dedicated instruments and is able to perform only low resolution Loss of Heterozygosity (LOH) analyses. Here we present CEQer (Comparative Exome Quantification analyzer), a new graphical, event-driven tool for CNA/allelic-imbalance (AI) coupled analysis of exome sequencing data. By using case-control matched exome data, CEQer performs a comparative digital exonic quantification to generate CNA data and couples this information with exome-wide LOH and allelic imbalance detection. These data are used to build mixed statistical/heuristic models allowing the identification of CNA/AI events. To test our tool, we initially used in silico generated data; we then performed whole-exome sequencing on 20 leukemic specimens and corresponding matched controls and analyzed the results using CEQer. Taken globally, these analyses showed that the combined use of comparative digital exon quantification and LOH/AI detection allows the generation of very accurate CNA data. We therefore propose CEQer as an efficient, robust and user-friendly graphical tool for the identification of CNA/AI in the context of whole-exome sequencing data.
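The core of comparative digital exon quantification, case versus matched-control read counts normalized and compared per exon, can be sketched as below. This is an illustration under assumed thresholds and counts, not CEQer's mixed statistical/heuristic model.

```python
# Minimal sketch of comparative digital exon quantification (not CEQer itself).
# Read counts, the normalisation and the log2-ratio thresholds are hypothetical.
import numpy as np

case_counts = np.array([120, 250, 40, 300, 15], dtype=float)   # reads per exon, tumour
ctrl_counts = np.array([110, 240, 90, 140, 35], dtype=float)   # reads per exon, matched control

# library-size normalisation so the two exomes are comparable
case_norm = case_counts / case_counts.sum()
ctrl_norm = ctrl_counts / ctrl_counts.sum()

log2_ratio = np.log2(case_norm / ctrl_norm)

for exon, lr in enumerate(log2_ratio):
    call = "gain" if lr > 0.58 else "loss" if lr < -0.58 else "neutral"
    print(f"exon {exon}: log2 ratio {lr:+.2f} -> {call}")
```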
Nardelli, Mimma; Valenza, Gaetano; Cristea, Ioana A.; Gentili, Claudio; Cotet, Carmen; David, Daniel; Lanata, Antonio; Scilingo, Enzo P.
2015-01-01
The objective assessment of psychological traits of healthy subjects and psychiatric patients has attracted growing interest in clinical and bioengineering research fields during the last decade. Several lines of experimental evidence strongly suggest that a link between Autonomic Nervous System (ANS) dynamics and specific dimensions such as anxiety, social phobia, stress, and emotional regulation might exist. Nevertheless, an extensive investigation on a wide range of psycho-cognitive scales and ANS non-invasive markers gathered from standard and non-linear analysis is still needed. In this study, we analyzed the discerning and correlation capabilities of a comprehensive set of ANS features and psycho-cognitive scales in 29 non-pathological subjects monitored during resting conditions. In particular, state-of-the-art standard and non-linear analysis was performed on Heart Rate Variability, InterBreath Interval series, and InterBeat Respiration series, which were considered as monovariate and multivariate measurements. Experimental results show that each ANS feature is linked to specific psychological traits. Moreover, non-linear analysis outperforms standard analysis in the assessment of psychological dimensions. Considering that the current clinical practice relies only on subjective scores from interviews and questionnaires, this study provides objective tools for the assessment of psychological dimensions. PMID:25859212
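A few of the standard time-domain Heart Rate Variability features referred to above can be computed directly from an inter-beat interval series, as in the minimal sketch below. The RR values are hypothetical, and this covers only a small subset of the standard analysis, none of the study's non-linear features.

```python
# Minimal sketch of standard time-domain HRV features from a (hypothetical)
# RR-interval series; not the study's full standard plus non-linear feature set.
import numpy as np

rr_ms = np.array([810, 795, 830, 845, 800, 790, 815, 825], dtype=float)  # inter-beat intervals (ms)

mean_rr = rr_ms.mean()
sdnn = rr_ms.std(ddof=1)                       # overall variability
rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))  # short-term beat-to-beat variability

print(f"mean RR = {mean_rr:.1f} ms, SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```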
Application Program Interface for the Orion Aerodynamics Database
NASA Technical Reports Server (NTRS)
Robinson, Philip E.; Thompson, James
2013-01-01
The Application Programming Interface (API) for the Crew Exploration Vehicle (CEV) Aerodynamic Database has been developed to provide the developers of software an easily implemented, fully self-contained method of accessing the CEV Aerodynamic Database for use in their analysis and simulation tools. The API is programmed in C and provides a series of functions to interact with the database, such as initialization, selecting various options, and calculating the aerodynamic data. No special functions (file read/write, table lookup) are required on the host system other than those included with a standard ANSI C installation. It reads one or more files of aero data tables. Previous releases of aerodynamic databases for space vehicles have only included data tables and a document of the algorithm and equations to combine them for the total aerodynamic forces and moments. This process required each software tool to have a unique implementation of the database code. Errors or omissions in the documentation, or errors in the implementation, led to a lengthy and burdensome process of having to debug each instance of the code. Additionally, input file formats differ for each space vehicle simulation tool, requiring the aero database tables to be reformatted to meet each tool's input file structure requirements. Finally, the capabilities for built-in table lookup routines vary for each simulation tool. Implementation of a new database may require an update to and verification of the table lookup routines. This may be required if the number of dimensions of a data table exceeds the capability of the simulation tool's built-in lookup routines. A single software solution was created to provide an aerodynamics software model that could be integrated into other simulation and analysis tools. The highly complex Orion aerodynamics model can then be quickly included in a wide variety of tools. The API code is written in ANSI C for ease of portability to a wide variety of systems. The input data files are in standard formatted ASCII, also for improved portability. The API contains its own implementation of multidimensional table reading and lookup routines. The same aerodynamics input file can be used without modification on all implementations. The turnaround time from aerodynamics model release to a working implementation is significantly reduced.
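The multidimensional table lookup that such an API encapsulates amounts to interpolating a gridded coefficient table at a query point. The sketch below illustrates the concept in Python with scipy; it is not the ANSI C API itself, and the breakpoints and coefficient values are hypothetical.

```python
# Conceptual sketch of a multidimensional aerodynamic table lookup with linear
# interpolation (illustration only; the actual API is written in ANSI C).
import numpy as np
from scipy.interpolate import RegularGridInterpolator

mach = np.array([0.3, 0.6, 0.9, 1.2])          # table breakpoints (hypothetical)
alpha = np.array([-4.0, 0.0, 4.0, 8.0])        # angle of attack, deg
cd_table = np.array([                          # drag coefficient at each grid point
    [0.30, 0.28, 0.30, 0.34],
    [0.32, 0.30, 0.32, 0.36],
    [0.40, 0.38, 0.41, 0.46],
    [0.55, 0.52, 0.56, 0.62],
])

cd_lookup = RegularGridInterpolator((mach, alpha), cd_table)
print(cd_lookup([[0.75, 2.0]]))                # interpolated CD at Mach 0.75, alpha 2 deg
```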
Michaud, Ginette Y
2005-01-01
In the field of clinical laboratory medicine, standardization is aimed at increasing the trueness and reliability of measured values. Standardization relies on the use of written standards, reference measurement procedures and reference materials. These are important tools for the design and validation of new tests, and for establishing the metrological traceability of diagnostic assays. Their use supports the translation of research technologies into new diagnostic assays and leads to more rapid advances in science and medicine, as well as improvements in the quality of patient care. The various standardization tools are described, as are the procedures by which written standards, reference procedures and reference materials are developed. Recent efforts to develop standards for use in the field of molecular diagnostics are discussed. The recognition of standardization tools by the FDA and other regulatory authorities is noted as evidence of their important role in ensuring the safety and performance of in vitro diagnostic devices.
NASA Astrophysics Data System (ADS)
Wray, Richard B.
1991-12-01
A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.
COMAN: a web server for comprehensive metatranscriptomics analysis.
Ni, Yueqiong; Li, Jun; Panagiotou, Gianni
2016-08-11
Microbiota-oriented studies based on metagenomic or metatranscriptomic sequencing have revolutionised our understanding of microbial ecology and the roles of both clinical and environmental microbes. The analysis of massive metatranscriptomic data requires extensive computational resources, a collection of bioinformatics tools and expertise in programming. We developed COMAN (Comprehensive Metatranscriptomics Analysis), a web-based tool dedicated to automatically and comprehensively analysing metatranscriptomic data. The COMAN pipeline includes quality control of raw reads and removal of reads derived from non-coding RNA, followed by functional annotation, comparative statistical analysis, pathway enrichment analysis, co-expression network analysis and high-quality visualisation. The essential data generated by COMAN are also provided in tabular format for additional analysis and integration with other software. The web server has an easy-to-use interface and detailed instructions, and is freely available at http://sbb.hku.hk/COMAN/ CONCLUSIONS: COMAN is an integrated web server dedicated to comprehensive functional analysis of metatranscriptomic data, translating massive amounts of reads into data tables and high-standard figures. It is expected to help researchers with less expertise in bioinformatics answer microbiota-related biological questions and to increase the accessibility and interpretation of microbiota RNA-Seq data.
NASA Astrophysics Data System (ADS)
Spahr, K.; Hogue, T. S.
2016-12-01
Selecting the most appropriate green, gray, and/or hybrid system for stormwater treatment and conveyance can prove challenging to decision makers across all scales, from site managers to large municipalities. To help streamline the selection process, a multi-disciplinary team of academics and professionals is developing an industry standard for selecting and evaluating the most appropriate stormwater management technology for different regions. To make the tool more robust and comprehensive, life-cycle cost assessment and optimization modules will be included to evaluate non-monetized and ecosystem benefits of selected technologies. Initial work includes surveying advisory board members based in cities that use existing decision support tools in their infrastructure planning process. These surveys will characterize the decisions currently being made and identify challenges within the current planning process across a range of hydroclimatic regions and city sizes. Analysis of social and other non-technical barriers to adoption of the existing tools is also being performed, with identification of regional differences and institutional challenges. Surveys will also gauge the regional appropriateness of certain stormwater technologies based on experiences in implementing stormwater treatment and conveyance plans. In addition to compiling qualitative data on existing decision support tools, a technical review of the components of the decision support tools in use will be performed. Gaps in each tool's analysis, such as the lack of certain critical functionalities, will be identified, and ease of use will be evaluated. Conclusions drawn from both the qualitative and quantitative analyses will be used to inform the development of the new decision support tool and its eventual dissemination.
[Handbook for the preparation of evidence-based documents. Tools derived from scientific knowledge].
Carrión-Camacho, M R; Martínez-Brocca, M A; Paneque-Sánchez-Toscano, I; Valencia-Martín, R; Palomino-García, A; Muñoz-Durán, C; Tamayo-López, M J; González-Eiris-Delgado, C; Otero-Candelera, R; Ortega-Ruiz, F; Sobrino-Márquez, J M; Jiménez-García-Bóveda, R; Fernández-Quero, M; Campos-Pareja, A M
2013-01-01
This handbook is intended to be an accessible, easy-to-consult guide to help professionals produce or adapt Evidence-Based Documents. Such documents will help standardize both clinical practice and decision-making, with quality monitored continuously so that established references are complied with. The Evidence-Based Health Care Committee, part of the "Virgen del Rocío" University Hospital quality structure, proposed the preparation of a handbook for producing Evidence-Based Documents, including a description of the products, their characteristics, qualities, uses, methodology of production, and the application scope of each of them. The handbook consists of seven Evidence-Based tools, one chapter on critical analysis methodology of scientific literature, one chapter with internet resources, and some appendices with different assessment tools. This handbook provides practitioners with a great opportunity to improve quality and a guideline to standardize clinical healthcare, offers managers a strategy to promote and encourage the development of documents in an effort to reduce clinical practice variability, and gives patients the opportunity to take part in planning their own care. Copyright © 2011 SECA. Published by Elsevier España. All rights reserved.
KungFQ: a simple and powerful approach to compress fastq files.
Grassi, Elena; Di Gregorio, Federico; Molineris, Ivan
2012-01-01
Nowadays, storing data derived from deep sequencing experiments has become pivotal, and standard compression algorithms do not exploit their structure in a satisfying manner. A number of reference-based compression algorithms have been developed, but they are less adequate when approaching new species without fully sequenced genomes or non-genomic data. We developed a tool that takes advantage of fastq characteristics and encodes them in a binary format optimized for further compression with standard tools (such as gzip or lzma). The algorithm is straightforward and does not need any external reference file; it scans the fastq only once and has a constant memory requirement. Moreover, we added the possibility to perform lossy compression, losing some of the original information (IDs and/or qualities) but resulting in smaller files; it is also possible to define a quality cutoff under which corresponding base calls are converted to N. We achieve compression ratios of 2.82 to 7.77 on various fastq files without losing information, and of 5.37 to 8.77 when losing IDs, which are often not used in common analysis pipelines. In this paper, we compare the algorithm performance with known tools, usually obtaining higher compression levels.
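The lossy options described above (dropping read IDs and masking base calls whose quality falls below a cutoff with N) can be sketched as a simple preprocessing step before handing the stream to a standard compressor. This is an illustration only, not KungFQ's binary encoding; it assumes Phred+33 quality encoding, a hypothetical cutoff of 20 and hypothetical file names.

```python
# Minimal sketch of lossy FASTQ preprocessing: drop read IDs and convert base
# calls below a quality cutoff to N, then gzip the result (not KungFQ itself).
import gzip

CUTOFF = 20  # hypothetical Phred quality cutoff

def mask_low_quality(seq, qual, cutoff=CUTOFF):
    # assumes Phred+33 encoded quality strings
    return "".join(base if (ord(q) - 33) >= cutoff else "N"
                   for base, q in zip(seq, qual))

with open("reads.fastq") as fin, gzip.open("reads.lossy.fastq.gz", "wt") as fout:
    while True:
        header = fin.readline().rstrip()
        if not header:
            break
        seq = fin.readline().rstrip()
        _plus = fin.readline().rstrip()     # separator line, ignored
        qual = fin.readline().rstrip()
        fout.write("@\n")                   # read IDs dropped (lossy)
        fout.write(mask_low_quality(seq, qual) + "\n")
        fout.write("+\n")
        fout.write(qual + "\n")
```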
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Dean N.
The climate and weather data science community gathered December 3–5, 2013, at Lawrence Livermore National Laboratory, in Livermore, California, for the third annual Earth System Grid Federation (ESGF) and Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Meeting, which was hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT are global collaborations designed to develop a new generation of open-source software infrastructure that provides distributed access and analysis to observed and simulated data from the climate and weather communities. The tools and infrastructure developed under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change, while the F2F meetings help to build a stronger climate and weather data science community and stronger federated software infrastructure. The 2013 F2F meeting determined requirements for existing and impending national and international community projects; enhancements needed for data distribution, analysis, and visualization infrastructure; and standards and resources needed for better collaborations.
Nakamura, Shinichiro; Kondo, Yasushi; Matsubae, Kazuyo; Nakajima, Kenichi; Nagasaka, Tetsuya
2011-02-01
Identification of the flow of materials and substances associated with a product system provides useful information for Life Cycle Analysis (LCA), and contributes to extending the scope of complementarity between LCA and Materials Flow Analysis/Substances Flow Analysis (MFA/SFA), the two major tools of industrial ecology. This paper proposes a new methodology based on input-output analysis for identifying the physical input-output flow of individual materials that is associated with the production of a unit of a given product, the unit physical input-output by materials (UPIOM). While the Sankey diagram has been a standard tool for the visualization of MFA/SFA, with an increase in the complexity of the flows under consideration, which will be the case when economy-wide intersectoral flows of materials are involved, the Sankey diagram may become too complex for effective visualization. An alternative way to visually represent material flows is proposed which makes use of triangulation of the flow matrix based on degrees of fabrication. The proposed methodology is applied to the flow of pig iron and of iron and steel scrap associated with the production of a passenger car in Japan. Its usefulness in identifying a specific MFA pattern from the original IO table is demonstrated.
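The standard input-output step underlying such analyses computes the sectoral outputs induced by one unit of final demand for a product, x = (I - A)^-1 f, and then scales them by material-use intensities. The sketch below illustrates that step with hypothetical coefficients; it is not the UPIOM derivation itself.

```python
# Minimal numpy sketch of the Leontief step behind unit-based material flow
# accounting: output induced in every sector by one unit of final demand,
# then scaled by (hypothetical) material-use intensities.
import numpy as np

A = np.array([            # technical coefficients (hypothetical)
    [0.10, 0.05, 0.00],   # steel sector
    [0.20, 0.10, 0.30],   # parts sector
    [0.00, 0.05, 0.10],   # car assembly sector
])
f = np.array([0.0, 0.0, 1.0])          # one unit of final demand for a passenger car

x = np.linalg.solve(np.eye(3) - A, f)  # total output required in every sector

pig_iron_per_unit = np.array([0.8, 0.1, 0.0])   # tonnes of pig iron per unit of sectoral output
print("pig iron embodied per car:", pig_iron_per_unit @ x)
```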
Impact of novel shift handle laparoscopic tool on wrist ergonomics and task performance
Yu, Denny; Lowndes, Bethany; Morrow, Missy; Kaufman, Kenton; Bingener, Juliane; Hallbeck, Susan
2015-01-01
Background Laparoscopic tool handles causing wrist flexion and extension more than 15° from neutral are considered “at-risk” for musculoskeletal strain. Therefore this study measured the impact of laparoscopic tool handle angles on wrist postures and task performance. Methods Eight surgeons performed standard and modified Fundamentals of Laparoscopic Surgery (FLS) tasks with laparoscopic tools. Tool A had three adjustable handle angle configurations, i.e., in-line 0° (A0), 30° (A30), and pistol-grip 70° (A70). Tool B was a fixed pistol-grip grasper. Participants performed FLS peg transfer, inverted peg transfer, and inverted circle-cut with each tool and handle angle. Inverted tasks were adapted from standard FLS tasks to simulate advanced tasks observed during abdominal wall surgeries, e.g., ventral hernia. Motion tracking, video-analysis, and modified NASA-TLX workload questionnaires were used to measure postures, performance (e.g., completion time and errors), and workload. Results Task performance did not differ among tools. For FLS peg transfer, self-reported physical workload was lower for B than A70, and mean wrist postures showed significantly higher flexion for in-line than pistol-grip tools (B and A70). For inverted peg transfer, workload was higher for all configurations. However, less time was spent in at-risk wrist postures for in-line (47%) than pistol-grip (93-94%), and most participants preferred Tool A. For inverted circle cut, workload did not vary across configurations, mean wrist posture was 10° closer to neutral for A0 than B, and median time in at-risk wrist postures was significantly less for A0 (43%) than B (87%). Conclusion The best ergonomic wrist positions for FLS (floor) tasks are provided by pistol-grip tools and for tasks on the abdominal wall (ventral surface) by in-line handles. Adjustable handle angle laparoscopic tools can reduce ergonomic risks for musculoskeletal strain and allow versatility for tasks alternating between the floor and ceiling positions in a surgical trainer without impacting performance. PMID:26541720
Impact of novel shift handle laparoscopic tool on wrist ergonomics and task performance.
Yu, Denny; Lowndes, Bethany; Morrow, Missy; Kaufman, Kenton; Bingener, Juliane; Hallbeck, Susan
2016-08-01
Laparoscopic tool handles causing wrist flexion and extension more than 15° from neutral are considered "at risk" for musculoskeletal strain. Therefore, this study measured the impact of laparoscopic tool handle angles on wrist postures and task performance. Eight surgeons performed standard and modified Fundamentals of Laparoscopic Surgery (FLS) tasks with laparoscopic tools. Tool A had three adjustable handle angle configurations, i.e., in-line 0° (A0), 30° (A30), and pistol-grip 70° (A70). Tool B was a fixed pistol-grip grasper. Participants performed FLS peg transfer, inverted peg transfer, and inverted circle cut with each tool and handle angle. Inverted tasks were adapted from standard FLS tasks to simulate advanced tasks observed during abdominal wall surgeries, e.g., ventral hernia. Motion tracking, video analysis, and modified NASA-TLX workload questionnaires were used to measure postures, performance (e.g., completion time and errors), and workload. Task performance did not differ between tools. For FLS peg transfer, self-reported physical workload was lower for B than for A70, and mean wrist postures showed significantly higher flexion for in-line than for pistol-grip tools (B and A70). For inverted peg transfer, workload was higher for all configurations. However, less time was spent in at-risk wrist postures for in-line (47 %) than for pistol-grip (93-94 %), and most participants preferred Tool A. For inverted circle cut, workload did not vary across configurations, mean wrist posture was 10° closer to neutral for A0 than B, and median time in at-risk wrist postures was significantly less for A0 (43 %) than for B (87 %). The best ergonomic wrist positions for FLS (floor) tasks are provided by pistol-grip tools and for tasks on the abdominal wall (ventral surface) by in-line handles. Adjustable handle angle laparoscopic tools can reduce ergonomic risks of musculoskeletal strain and allow versatility for tasks alternating between the floor and ceiling positions in a surgical trainer without impacting performance.
Gosselin, Emilie; Bourgault, Patricia; Lavoie, Stephan; Coleman, Robin-Marie; Méziat-Burdin, Anne
2014-12-01
Pain management in the intensive care unit is often inadequate. There is no tool available to assess nursing pain management practices. The aim of this study was to develop and validate a measuring tool to assess nursing pain management in the intensive care unit during standardized clinical simulation. A literature review was performed to identify relevant components demonstrating optimal pain management in adult intensive care units and to integrate them into an observation tool. This tool was submitted to an expert panel and pretested. It was then used to assess pain management practice during 26 discrete standardized clinical simulation sessions with intensive care nurses. The Nursing Observation Tool for Pain Management (NOTPaM) contains 28 statements grouped into 8 categories, which are grouped into 4 dimensions: subjective assessment, objective assessment, interventions, and reassessment. The tool's internal consistency was calculated at a Cronbach's alpha of 0.436 for the whole tool; the alpha varied from 0.328 to 0.518 across dimensions. To evaluate the inter-rater reliability, the intra-class correlation coefficient was used, which was calculated at 0.751 (p < .001) for the whole tool, with variations from 0.619 to 0.920 (p < .01) between dimensions. The expert panel was satisfied with the content and face validity of the tool. The psychometric qualities of the NOTPaM developed in this study are satisfactory. However, the tool could be improved with slight modifications. Nevertheless, it was useful in assessing intensive care nurses' pain management in a standardized clinical simulation. The NOTPaM is the first tool created for this purpose. Copyright © 2014 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.
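The internal-consistency statistic reported above, Cronbach's alpha, is computed from an observations-by-items score matrix as in the minimal sketch below. The scores are hypothetical and are not the NOTPaM data.

```python
# Minimal sketch of a Cronbach's alpha computation on an observations-by-items
# score matrix (hypothetical scores, not the NOTPaM data).
import numpy as np

scores = np.array([        # rows = observed simulation sessions, columns = tool items
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
], dtype=float)

k = scores.shape[1]
item_variances = scores.var(axis=0, ddof=1)
total_variance = scores.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha = {alpha:.3f}")
```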
EUnetHTA information management system: development and lessons learned.
Chalon, Patrice X; Kraemer, Peter
2014-11-01
The aim of this study was to describe the techniques used to achieve consensus on common standards to be implemented in the EUnetHTA Information Management System (IMS), and to describe how interoperability between tools was explored. Three face-to-face meetings were organized to identify and agree on common standards for the development of online tools. Two tools were created to demonstrate the added value of implementing interoperability standards at local levels. Developers of tools outside EUnetHTA were identified and contacted. Four common standards were agreed on by consensus, and consequently all EUnetHTA tools have been modified or designed accordingly. RDF Site Summary (RSS) has demonstrated good potential to support rapid dissemination of HTA information. Contacts outside EUnetHTA resulted in direct collaboration (HTA glossary, HTAi Vortal), evaluation of options for interoperability between tools (CRD HTA database), or a formal framework to prepare cooperation on concrete projects (INAHTA projects database). Although entitled a project on IT infrastructure, the work program was also about people. When agreement must be reached on complex topics, fostering a cohesive group dynamic and hosting face-to-face meetings bring added value and enhance understanding between partners. The adoption of widespread standards enhanced the homogeneity of the EUnetHTA tools and should thus contribute to their wider use and, therefore, to the general objective of EUnetHTA. The initiatives on interoperability of systems need to be developed further to support a general interoperable information system that could benefit the whole HTA community.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Astaf'ev, S. B., E-mail: bard@ns.crys.ras.ru; Shchedrin, B. M.; Yanusova, L. G.
2012-01-15
The main principles of developing the Basic Analysis of Reflectometry Data (BARD) software package, which is aimed at obtaining a unified (standardized) tool for analyzing the structure of thin multilayer films and nanostructures of different nature based on reflectometry data, are considered. This software package contains both traditionally used procedures for processing reflectometry data and the authors' original developments on the basis of new methods for carrying out and analyzing reflectometry experiments. The structure of the package, its functional possibilities, examples of application, and prospects of development are reviewed.
Santos, Eduardo Jose Melos Dos; McCabe, Antony; Gonzalez-Galarza, Faviel F; Jones, Andrew R; Middleton, Derek
2016-03-01
The Allele Frequencies Net Database (AFND) is a freely accessible database which stores population frequencies for alleles or genes of the immune system in worldwide populations. Herein we introduce two new tools. We have defined new classifications of data (gold, silver and bronze) to assist users in identifying the most suitable populations for their tasks. The gold standard datasets are defined by allele frequencies summing to 1, sample sizes >50 and high resolution genotyping, while silver standard datasets do not meet gold standard genotyping resolution and/or sample size criteria. The bronze standard datasets are those that could not be classified under the silver or gold standards. The gold standard includes >500 datasets covering over 3 million individuals from >100 countries at one or more of the following loci: HLA-A, -B, -C, -DPA1, -DPB1, -DQA1, -DQB1 and -DRB1 - with all loci except DPA1 present in more than 220 datasets. Three out of 12 geographic regions have low representation (the majority of their countries having less than five datasets) and the Central Asia region has no representation. There are 18 countries that are not represented by any gold standard datasets but are represented by at least one dataset that is either silver or bronze standard. We also briefly summarize the data held by AFND for KIR genes, alleles and their ligands. Our second new component is a data submission tool to assist users in the collection of the genotypes of the individuals (raw data), facilitating submission of short population reports to Human Immunology, as well as simplifying the submission of population demographics and frequency data. Copyright © 2015 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
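The gold/silver/bronze classification described above can be expressed as a simple rule over the stated criteria (allele frequencies summing to 1, sample size greater than 50, high-resolution typing). The sketch below is an illustration; the tolerance on the frequency sum and the way high resolution is flagged are assumptions, not AFND's exact rules.

```python
# Minimal sketch of the gold/silver/bronze dataset classification described in
# the abstract (tolerance and resolution flag are assumptions, not AFND rules).
def classify_dataset(allele_freqs, sample_size, high_resolution, tol=0.015):
    freqs_sum_to_one = abs(sum(allele_freqs) - 1.0) <= tol
    if freqs_sum_to_one and sample_size > 50 and high_resolution:
        return "gold"
    if freqs_sum_to_one:
        return "silver"   # frequencies complete, but resolution and/or sample size below gold
    return "bronze"

print(classify_dataset([0.42, 0.31, 0.27], sample_size=120, high_resolution=True))   # gold
print(classify_dataset([0.42, 0.31, 0.27], sample_size=30, high_resolution=False))   # silver
```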
NASA Astrophysics Data System (ADS)
Canu I, Guseva; C, Ducros; S, Ducamp; L, Delabre; S, Audignon-Durand; C, Durand; Y, Iwatsubo; D, Jezewski-Serra; Bihan O, Le; S, Malard; A, Radauceanu; M, Reynier; M, Ricaud; O, Witschger
2015-05-01
The French national epidemiological surveillance program EpiNano aims at surveying mid- and long-term health effects possibly related to occupational exposure to either carbon nanotubes or titanium dioxide nanoparticles (TiO2). EpiNano is limited to workers potentially exposed to these nanomaterials, including their aggregates and agglomerates. In order to identify those workers during in-field industrial hygiene visits, a standardized non-instrumental method is necessary, especially for epidemiologists and occupational physicians unfamiliar with nanoparticle and nanomaterial exposure metrology. A working group, Quintet ExpoNano, including national experts in nanomaterial metrology and occupational hygiene, reviewed available methods, resources and their practice in order to develop a standardized tool for conducting company industrial hygiene visits and collecting the necessary information. This tool, entitled “Onsite technical logbook”, includes three parts: company, workplace, and workstation, allowing a detailed description of each task, process and the surrounding exposure conditions. The logbook is intended to be completed during the company industrial hygiene visit. Each visit is conducted jointly by an industrial hygienist and an epidemiologist of the program and lasts one or two days depending on the company size. Once all collected information is computerized using user-friendly software, it is possible to classify workstations with respect to their potential direct and/or indirect exposure. Workers assigned to workstations classified as involving exposure are considered eligible for the EpiNano program and invited to participate. Since January 2014, the Onsite technical logbook has been used in ten company visits. The companies visited were mostly involved in research and development. A total of 53 workstations with potential exposure to nanomaterials were pre-selected and observed: 5 with TiO2, 16 with single-walled carbon nanotubes, and 27 with multi-walled carbon nanotubes. The tasks observed included nanomaterial characterisation analysis (8), weighing (7), synthesis (6), functionalization (5), and transfer (5). The quantities handled were usually very small. After analysis of the data gathered in the logbooks, 30 workstations were classified as involving exposure to carbon nanotubes or TiO2. Additional studies of tool validity as well as of inter- and intra-evaluator reproducibility are ongoing. The first results are promising.
Ganalyzer: A Tool for Automatic Galaxy Image Analysis
NASA Astrophysics Data System (ADS)
Shamir, Lior
2011-08-01
We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
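One of the steps described above, the radial intensity plot of a galaxy image around a known centre, can be sketched as the mean pixel value per one-pixel-wide annulus. The example below is an illustration only; Ganalyzer's subsequent peak detection and slope-based spirality measurement are not reproduced, and the input image is synthetic.

```python
# Minimal sketch of a radial intensity profile (mean pixel value per integer
# radius) around a known centre; not Ganalyzer's full spirality analysis.
import numpy as np

def radial_intensity_profile(image, cx, cy):
    ys, xs = np.indices(image.shape)
    r = np.sqrt((xs - cx) ** 2 + (ys - cy) ** 2).astype(int)   # integer radius of each pixel
    counts = np.bincount(r.ravel())
    sums = np.bincount(r.ravel(), weights=image.ravel())
    return sums / np.maximum(counts, 1)                        # mean intensity at each radius

rng = np.random.default_rng(0)
fake_galaxy = rng.random((64, 64))              # synthetic stand-in for a galaxy image
print(radial_intensity_profile(fake_galaxy, cx=32, cy=32)[:5])
```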
Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric
2011-01-01
Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit – a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/ PMID:21713110
TopoMS: Comprehensive topological exploration for molecular and condensed-matter systems.
Bhatia, Harsh; Gyulassy, Attila G; Lordi, Vincenzo; Pask, John E; Pascucci, Valerio; Bremer, Peer-Timo
2018-06-15
We introduce TopoMS, a computational tool enabling detailed topological analysis of molecular and condensed-matter systems, including the computation of atomic volumes and charges through the quantum theory of atoms in molecules, as well as the complete molecular graph. With roots in techniques from computational topology, and using a shared-memory parallel approach, TopoMS provides scalable, numerically robust, and topologically consistent analysis. TopoMS can be used as a command-line tool or with a GUI (graphical user interface), where the latter also enables an interactive exploration of the molecular graph. This paper presents algorithmic details of TopoMS and compares it with state-of-the-art tools: Bader charge analysis v1.0 (Arnaldsson et al., 01/11/17) and molecular graph extraction using Critic2 (Otero-de-la-Roza et al., Comput. Phys. Commun. 2014, 185, 1007). TopoMS not only combines the functionality of these individual codes but also demonstrates up to 4× performance gain on a standard laptop, faster convergence to fine-grid solution, robustness against lattice bias, and topological consistency. TopoMS is released publicly under BSD License. © 2018 Wiley Periodicals, Inc.
Determination of diagnostic standards on saturated soil extracts for cut roses grown in greenhouses
Cabrera, Raúl Iskander
2017-01-01
This work comprises the theoretical determination and validation of diagnostic standards for the analysis of saturated soil extracts for cut rose flower crops (Rosa spp.) growing in the Bogota Plateau, Colombia. The data included 684 plant tissue analyses and 684 corresponding analyses of saturated soil extracts, all collected between January 2009 and June 2013. The tissue and soil samples were selected from 13 rose farms, and from cultivars grafted on the 'Natal Briar' rootstock. These concurrent samples of soil and plant tissues represented 251 production units (locations) of approximately 10,000 m2 distributed across the study area. The standards were conceived as a tool to improve the nutritional balance in the leaf tissue of rose plants and thereby define the norms for expressing optimum productive potential relative to nutritional conditions in the soil. To this end, previously determined diagnostic standards for rose leaf tissues were employed to obtain rates of foliar nutritional balance at each analyzed location and as criteria for determining the diagnostic norms for saturated soil extracts. Applying this methodology to foliar analysis showed a higher, significant correlation for the diagnostic indices. A similar behavior was observed in the analysis of saturated soil extracts, making it a powerful tool for integrated nutritional diagnosis. Leaf analyses determine the nutrients most limiting for high yield, and analyses of saturated soil extracts make it possible to correct the fertigation formulations applied to soils or substrates. Recommendations are proposed to improve the balance in the soil-plant system, making yield increases more probable. The main recommendations to increase and improve rose crop flower yields are: continuously check the pH values of the saturated soil extracts, reduce the amounts of P, Fe, Zn and Cu in fertigation solutions, and carefully analyze the situation of Mn in the soil-plant system. PMID:28542547
Imaging mass spectrometry data reduction: automated feature identification and extraction.
McDonnell, Liam A; van Remoortere, Alexandra; de Velde, Nico; van Zeijl, René J M; Deelder, André M
2010-12-01
Imaging MS now enables the parallel analysis of hundreds of biomolecules, spanning multiple molecular classes, which allows tissues to be described by their molecular content and distribution. When combined with advanced data analysis routines, tissues can be analyzed and classified based solely on their molecular content. Such molecular histology techniques have been used to distinguish regions with differential molecular signatures that could not be distinguished using established histologic tools. However, its potential to provide an independent, complementary analysis of clinical tissues has been limited by the very large file sizes and large number of discrete variables associated with imaging MS experiments. Here we demonstrate data reduction tools, based on automated feature identification and extraction, for peptide, protein, and lipid imaging MS, using multiple imaging MS technologies, that reduce data loads and the number of variables by >100×, and that highlight highly-localized features that can be missed using standard data analysis strategies. It is then demonstrated how these capabilities enable multivariate analysis on large imaging MS datasets spanning multiple tissues. Copyright © 2010 American Society for Mass Spectrometry. Published by Elsevier Inc. All rights reserved.
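The data reduction described above rests on automated feature (peak) identification. As a minimal, generic sketch of that idea (not the authors' pipeline), the following uses SciPy peak detection on the mean spectrum to define a reduced set of m/z features and then keeps per-pixel intensities only at those features; the synthetic data, array shapes, and prominence threshold are illustrative assumptions.

```python
# Generic sketch of imaging-MS data reduction by peak picking on the mean
# spectrum: only intensities at detected peaks are kept, shrinking the number
# of variables per pixel. Thresholds and shapes are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks

def reduce_dataset(intensities, mz):
    """intensities: (n_pixels, n_channels) array; mz: (n_channels,) axis."""
    mean_spectrum = intensities.mean(axis=0)
    peaks, _ = find_peaks(mean_spectrum,
                          prominence=0.05 * mean_spectrum.max())
    return mz[peaks], intensities[:, peaks]

# Synthetic example: 200 pixels x 20,000 channels reduced to a few features.
rng = np.random.default_rng(0)
mz = np.linspace(500, 3500, 20_000)
intensities = rng.random((200, 20_000))
intensities[:, 5_000] += 50          # a strong, spatially uniform "peptide" peak
feature_mz, reduced = reduce_dataset(intensities, mz)
print(reduced.shape)                  # (200, n_detected_features)
```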
Rodriguez, A Noel; DeWitt, Peter; Fisher, Jennifer; Broadfoot, Kirsten; Hurt, K Joseph
2016-06-11
To characterize the psychometric properties of a novel Obstetric Communication Assessment Tool (OCAT) in a pilot study of standardized difficult OB communication scenarios appropriate for undergraduate medical evaluation. We developed and piloted four challenging OB Standardized Patient (SP) scenarios in a sample of twenty-one third year OB/GYN clerkship students: Religious Beliefs (RB), Angry Father (AF), Maternal Smoking (MS), and Intimate Partner Violence (IPV). Five trained Standardized Patient Reviewers (SPRs) independently scored twenty-four randomized video-recorded encounters using the OCAT. Cronbach's alpha and Intraclass Correlation Coefficient-2 (ICC-2) were used to estimate internal consistency (IC) and inter-rater reliability (IRR), respectively. Systematic variation in reviewer scoring was assessed using the Stuart-Maxwell test. IC was acceptable to excellent with Cronbach's alpha values (and 95% Confidence Intervals [CI]): RB 0.91 (0.86, 0.95), AF 0.76 (0.62, 0.87), MS 0.91 (0.86, 0.95), and IPV 0.94 (0.91, 0.97). IRR was unacceptable to poor with ICC-2 values: RB 0.46 (0.40, 0.53), AF 0.48 (0.41, 0.54), MS 0.52 (0.45, 0.58), and IPV 0.67 (0.61, 0.72). Stuart-Maxwell analysis indicated systematic differences in reviewer stringency. Our initial characterization of the OCAT demonstrates important issues in communications assessment. We identify scoring inconsistencies due to differences in SPR rigor that require enhanced training to improve assessment reliability. We outline a rational process for initial communication tool validation that may be useful in undergraduate curriculum development, and acknowledge that rigorous validation of OCAT training and implementation is needed to create a valuable OB communication assessment tool.
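For reference, internal consistency of the kind reported above can be computed from a respondents-by-items score matrix. The following sketch computes Cronbach's alpha from its standard definition; the rating matrix is made up for illustration and is not the OCAT data.

```python
# Cronbach's alpha from its standard formula:
# alpha = k/(k-1) * (1 - sum of item variances / variance of total scores).
# The score matrix below is illustrative only; it is not the OCAT data.
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_respondents, n_items) matrix of item ratings."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

ratings = [[4, 5, 4, 3],   # one row per encounter, one column per item
           [3, 4, 3, 3],
           [5, 5, 4, 4],
           [2, 3, 2, 2],
           [4, 4, 5, 4]]
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```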
Public health component in building information modeling
NASA Astrophysics Data System (ADS)
Trufanov, A. I.; Rossodivita, A.; Tikhomirov, A. A.; Berestneva, O. G.; Marukhina, O. V.
2018-05-01
The building information modelling (BIM) concept has established itself as an effective and practical approach to planning, designing, constructing, and managing buildings and infrastructure. Analysis of the governance literature has shown that existing BIM tools do not fully take into account the growing demands of the ecology and health fields. In this connection, such tools can be adapted to account for the sanitary and hygienic specifications of the materials used in the construction industry. This is proposed to be achieved by introducing assessments that meet the requirements of national sanitary standards. The approach is demonstrated in a case study using the Revit® program.
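As a schematic illustration of the proposed adaptation, a BIM material record could be screened against sanitary-standard thresholds as in the sketch below. The property names and limit values are placeholders and the code does not use the Revit API or any actual national standard.

```python
# Schematic screening of BIM material records against sanitary-standard limits.
# Property names and limit values are illustrative placeholders only.
LIMITS = {"formaldehyde_emission_mg_m3": 0.01, "radon_activity_bq_kg": 370}

materials = [
    {"name": "Particle board A", "formaldehyde_emission_mg_m3": 0.02,
     "radon_activity_bq_kg": 40},
    {"name": "Gypsum board B", "formaldehyde_emission_mg_m3": 0.003,
     "radon_activity_bq_kg": 25},
]

for material in materials:
    violations = [prop for prop, limit in LIMITS.items()
                  if material.get(prop, 0) > limit]
    status = "fails: " + ", ".join(violations) if violations else "meets limits"
    print(f"{material['name']}: {status}")
```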
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damiani, Rick
This manual summarizes the theory and preliminary verifications of the JacketSE module, which is an offshore jacket sizing tool that is part of the Wind-Plant Integrated System Design & Engineering Model toolbox. JacketSE is based on a finite-element formulation and on user-prescribed inputs and design standards' criteria (constraints). The physics are highly simplified, with a primary focus on satisfying ultimate limit states and modal performance requirements. Preliminary validation work included comparisons against industry data and verification against ANSYS, a commercial finite-element analysis package. The results are encouraging, and future improvements to the code are recommended in this manual.
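To illustrate the kind of design-standard constraint such a sizing tool evaluates, the sketch below performs a generic axial-stress utilization check for a tubular member. The dimensions, load, and allowable stress are assumed example values, not JacketSE inputs or a specific design-standard clause.

```python
# Generic ultimate-limit-state style check for a tubular jacket member:
# utilization = axial stress / allowable stress, acceptable if <= 1.0.
# All numbers below are assumed example values, not JacketSE defaults.
import math

def axial_utilization(axial_force_n, outer_diameter_m, wall_thickness_m,
                      allowable_stress_pa):
    inner_diameter = outer_diameter_m - 2 * wall_thickness_m
    area = math.pi / 4 * (outer_diameter_m**2 - inner_diameter**2)
    stress = axial_force_n / area
    return stress / allowable_stress_pa

u = axial_utilization(axial_force_n=2.5e6, outer_diameter_m=1.2,
                      wall_thickness_m=0.03, allowable_stress_pa=200e6)
print(f"utilization = {u:.2f}  ->  {'OK' if u <= 1.0 else 'constraint violated'}")
```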
Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)
NASA Technical Reports Server (NTRS)
Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.
2003-01-01
A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next-generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status-meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided that demonstrate the substantial benefits of employing these practices. Also included are a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.
Chimera: a Bioconductor package for secondary analysis of fusion products.
Beccuti, Marco; Carrara, Matteo; Cordero, Francesca; Lazzarato, Fulvio; Donatelli, Susanna; Nadalin, Francesca; Policriti, Alberto; Calogero, Raffaele A
2014-12-15
Chimera is a Bioconductor package that organizes, annotates, analyses, and validates fusions reported by different fusion detection tools; the current implementation can handle output from bellerophontes, chimeraScan, deFuse, fusionCatcher, FusionFinder, FusionHunter, FusionMap, mapSplice, Rsubread, tophat-fusion, and STAR. The core of Chimera is a fusion data structure that can store fusion events detected with any of the aforementioned tools. Fusions can then be manipulated with standard R functions or through the functionalities developed specifically in Chimera to support the user in managing fusions and discriminating false-positive results. © The Author 2014. Published by Oxford University Press.
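The value of Chimera lies in a tool-agnostic fusion data structure. As a language-neutral illustration of that idea only, the Python sketch below defines a unified fusion record and a simple supporting-read filter; the field names and threshold are made up, and this is not Chimera's R data structure or API.

```python
# Generic illustration of a tool-agnostic fusion record and a simple
# false-positive filter on supporting reads. Field names and the threshold
# are made up; this is not Chimera's R data structure or API.
from dataclasses import dataclass

@dataclass
class FusionEvent:
    gene5p: str            # 5' partner gene
    gene3p: str            # 3' partner gene
    supporting_reads: int  # spanning + encompassing reads
    detected_by: str       # name of the caller that reported the event

def filter_fusions(events, min_reads=5):
    """Keep events supported by at least min_reads reads."""
    return [e for e in events if e.supporting_reads >= min_reads]

events = [
    FusionEvent("BCR", "ABL1", 42, "deFuse"),
    FusionEvent("GENEX", "GENEY", 2, "FusionMap"),
]
print(filter_fusions(events))
```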