Sample records for level analysis tool

  1. Investigation of effects of process parameters on properties of friction stir welded joints

    NASA Astrophysics Data System (ADS)

    Chauhan, Atul; Soota, Tarun; Rajput, S. K.

    2018-03-01

    This work deals with the application of friction stir welding (FSW) using a Taguchi orthogonal array. FSW is used to join aluminium alloy AA6063-T0 plates in a butt configuration with an orthogonal combination of factors and their levels. Three factors, namely tool rotation speed, tool travel speed and tool pin profile, are each varied over three levels. Grey relational analysis (GRA) is applied to select the optimum level of each factor for optimising the ultimate tensile strength (UTS), ductility and hardness of the joint. Experiments have been conducted with two different tool materials (HSS and HCHCr steel) over various factor-level combinations for joining AA6063-T0. On the basis of the grey relational grades at the different factor levels and analysis of variance (ANOVA), the ideal combination of factors is determined. The influence of tool material is also studied.
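    As a hedged illustration of the grey relational analysis step described above, the sketch below normalises three responses from a hypothetical L9 layout, computes grey relational coefficients and grades, and averages the grade by factor level. The factor names, response values and run layout are invented, not taken from the paper.

```python
# Illustrative grey relational analysis (GRA) for multi-response Taguchi data.
# All numbers below are hypothetical placeholders, not results from the study.
import numpy as np
import pandas as pd

# Hypothetical L9 results: three factors at three levels, three responses.
runs = pd.DataFrame({
    "rotation_level": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "travel_level":   [1, 2, 3, 1, 2, 3, 1, 2, 3],
    "pin_level":      [1, 2, 3, 2, 3, 1, 3, 1, 2],
    "uts_mpa":        [118, 125, 121, 132, 128, 124, 130, 127, 122],
    "elongation_pct": [8.1, 9.0, 8.4, 9.6, 9.2, 8.8, 9.4, 9.1, 8.3],
    "hardness_hv":    [52, 55, 53, 58, 56, 54, 57, 55, 53],
})
responses = ["uts_mpa", "elongation_pct", "hardness_hv"]

# Step 1: larger-the-better normalisation of each response to [0, 1].
norm = (runs[responses] - runs[responses].min()) / (runs[responses].max() - runs[responses].min())

# Step 2: grey relational coefficients with distinguishing coefficient zeta = 0.5.
delta = 1.0 - norm                      # deviation from the ideal sequence (all ones)
zeta = 0.5
grc = (delta.min().min() + zeta * delta.max().max()) / (delta + zeta * delta.max().max())

# Step 3: grey relational grade = mean coefficient per run; higher is better.
runs["grade"] = grc.mean(axis=1)

# Step 4: mean grade per factor level points to the preferred level of each factor.
for factor in ["rotation_level", "travel_level", "pin_level"]:
    print(runs.groupby(factor)["grade"].mean().round(3))
```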

  2. Designed tools for analysis of lithography patterns and nanostructures

    NASA Astrophysics Data System (ADS)

    Dervillé, Alexandre; Baderot, Julien; Bernard, Guilhem; Foucher, Johann; Grönqvist, Hanna; Labrosse, Aurélien; Martinez, Sergio; Zimmermann, Yann

    2017-03-01

    We introduce a set of designed tools for the analysis of lithography patterns and nanostructures. The classical metrological analysis of these objects has the drawbacks of being time-consuming, requiring manual tuning, and lacking robustness and user-friendliness. With the goal of improving the current situation, we propose new image processing tools at different levels: semi-automatic, automatic and machine-learning-enhanced tools. The complete set of tools has been integrated into a software platform designed to transform the lab into a virtual fab. The underlying idea is to master nano processes at the research and development level by accelerating access to knowledge and hence speed up implementation in product lines.

  3. A comparative analysis of Patient-Reported Expanded Disability Status Scale tools.

    PubMed

    Collins, Christian DE; Ivry, Ben; Bowen, James D; Cheng, Eric M; Dobson, Ruth; Goodin, Douglas S; Lechner-Scott, Jeannette; Kappos, Ludwig; Galea, Ian

    2016-09-01

    Patient-Reported Expanded Disability Status Scale (PREDSS) tools are an attractive alternative to the Expanded Disability Status Scale (EDSS) during long-term or geographically challenging studies, or in pressured clinical service environments. Because the studies reporting these tools have used different metrics to compare the PREDSS and EDSS, we undertook an individual patient data level analysis of all available tools. Spearman's rho and the Bland-Altman method were used to assess correlation and agreement, respectively. A systematic search for validated PREDSS tools covering the full EDSS range identified eight such tools. Individual patient data were available for five PREDSS tools. Excellent correlation was observed between EDSS and PREDSS with all tools. A higher level of agreement was observed with increasing levels of disability. In all tools, the 95% limits of agreement were greater than the minimum EDSS difference considered to be clinically significant. However, the intra-class coefficient was greater than that reported for EDSS raters of mixed seniority. The visual functional system was identified as the most significant predictor of the PREDSS-EDSS difference. This analysis will (1) enable researchers and service providers to make an informed choice of PREDSS tool, depending on their individual requirements, and (2) facilitate improvement of current PREDSS tools. © The Author(s), 2015.
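    For readers unfamiliar with the two statistics named above, the following is a minimal sketch of Spearman's rho and Bland-Altman 95% limits of agreement on made-up paired EDSS/PREDSS scores; it is not the authors' analysis code.

```python
# Spearman correlation and Bland-Altman agreement on illustrative paired scores.
import numpy as np
from scipy.stats import spearmanr

edss   = np.array([1.5, 2.0, 3.5, 4.0, 4.5, 6.0, 6.5, 7.0])   # invented values
predss = np.array([2.0, 2.0, 3.0, 4.5, 4.0, 6.0, 7.0, 6.5])   # invented values

rho, p_value = spearmanr(edss, predss)

diff = predss - edss
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)          # half-width of the 95% limits of agreement

print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
print(f"Bland-Altman bias = {bias:.2f}, limits of agreement = "
      f"[{bias - loa:.2f}, {bias + loa:.2f}]")
```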

  4. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, provides scalable computation and storage, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.

  5. Effects-based strategy development through center of gravity and target system analysis

    NASA Astrophysics Data System (ADS)

    White, Christopher M.; Prendergast, Michael; Pioch, Nicholas; Jones, Eric K.; Graham, Stephen

    2003-09-01

    This paper describes an approach to effects-based planning in which a strategic-theater-level mission is refined into operational-level and ultimately tactical-level tasks and desired effects, informed by models of the expected enemy response at each level of abstraction. We describe a strategy development system that implements this approach and supports human-in-the-loop development of an effects-based plan. This system consists of plan authoring tools tightly integrated with a suite of center of gravity (COG) and target system analysis tools. A human planner employs the plan authoring tools to develop a hierarchy of tasks and desired effects. Upon invocation, the target system analysis tools use reduced-order models of enemy centers of gravity to select appropriate target set options for the achievement of desired effects, together with associated indicators for each option. The COG analysis tools also provide explicit models of the causal mechanisms linking tasks and desired effects to one another, and suggest appropriate observable indicators to guide ISR planning, execution monitoring, and campaign assessment. We are currently implementing the system described here as part of the AFRL-sponsored Effects Based Operations program.

  6. Effects of machining parameters on tool life and its optimization in turning mild steel with brazed carbide cutting tool

    NASA Astrophysics Data System (ADS)

    Dasgupta, S.; Mukherjee, S.

    2016-09-01

    One of the most significant factors in metal cutting is tool life. In this research work, the effects of machining parameters on tool life under a wet machining environment were studied. Tool life characteristics of a brazed carbide cutting tool turning mild steel and the optimization of machining parameters based on a Taguchi design of experiments were examined. The experiments were conducted using three factors, spindle speed, feed rate and depth of cut, each having three levels. Nine experiments were performed on a high-speed semi-automatic precision central lathe. ANOVA was used to determine the level of importance of the machining parameters on tool life. The optimum machining parameter combination was obtained by analysis of the S/N ratio. A mathematical model based on multiple regression analysis was developed to predict the tool life. Taguchi's orthogonal array analysis revealed the optimal combination of parameters at the lower levels of spindle speed, feed rate and depth of cut, which are 550 rpm, 0.2 mm/rev and 0.5 mm respectively. The main effects plot reiterated the same. The variation of tool life with different process parameters has been plotted. Feed rate has the most significant effect on tool life, followed by spindle speed and depth of cut.
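    The larger-the-better S/N ratio that drives this kind of Taguchi ranking can be sketched as below. The L9 layout and tool-life values are hypothetical; only the optimum levels quoted in the abstract (550 rpm, 0.2 mm/rev, 0.5 mm) come from the paper.

```python
# Larger-the-better Taguchi S/N ratio for ranking machining parameter levels.
# Tool-life values and the non-optimum level settings are invented placeholders.
import numpy as np
import pandas as pd

runs = pd.DataFrame({
    "speed_rpm":     [550, 550, 550, 810, 810, 810, 1000, 1000, 1000],
    "feed_mm_rev":   [0.2, 0.4, 0.6, 0.2, 0.4, 0.6, 0.2, 0.4, 0.6],
    "doc_mm":        [0.5, 1.0, 1.5, 1.0, 1.5, 0.5, 1.5, 0.5, 1.0],
    "tool_life_min": [46, 38, 31, 42, 35, 33, 39, 34, 28],
})

# Larger-the-better S/N ratio: -10 * log10(mean(1 / y^2)); one replicate per run here.
runs["sn_db"] = -10 * np.log10(1.0 / runs["tool_life_min"] ** 2)

# Mean S/N per level of each factor; the level with the highest mean S/N is preferred.
for factor in ["speed_rpm", "feed_mm_rev", "doc_mm"]:
    print(runs.groupby(factor)["sn_db"].mean().round(2))
```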

  7. NASA Instrument Cost/Schedule Model

    NASA Technical Reports Server (NTRS)

    Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George

    2011-01-01

    NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model (NICM). NICM is a cost and schedule estimator that contains: a system-level cost estimation tool; a subsystem-level cost estimation tool; a database of cost and technical parameters of over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, data mining methods (cluster analysis, principal components analysis, regression analysis and bootstrap cross-validation), the estimating equations themselves and a demonstration of the NICM tool suite.

  8. PolNet: A Tool to Quantify Network-Level Cell Polarity and Blood Flow in Vascular Remodeling.

    PubMed

    Bernabeu, Miguel O; Jones, Martin L; Nash, Rupert W; Pezzarossa, Anna; Coveney, Peter V; Gerhardt, Holger; Franco, Claudio A

    2018-05-08

    In this article, we present PolNet, an open-source software tool for the study of blood flow and cell-level biological activity during vessel morphogenesis. We provide an image acquisition, segmentation, and analysis protocol to quantify endothelial cell polarity in entire in vivo vascular networks. In combination, we use computational fluid dynamics to characterize the hemodynamics of the vascular networks under study. The tool enables, to our knowledge for the first time, a network-level analysis of polarity and flow for individual endothelial cells. To date, PolNet has proven invaluable for the study of endothelial cell polarization and migration during vascular patterning, as demonstrated by two recent publications. Additionally, the tool can be easily extended to correlate blood flow with other experimental observations at the cellular/molecular level. We release the source code of our tool under the Lesser General Public License. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  9. Analysis of Sea Level Rise in Action

    NASA Astrophysics Data System (ADS)

    Gill, K. M.; Huang, T.; Quach, N. T.; Boening, C.

    2016-12-01

    NASA's Sea Level Change Portal provides scientists and the general public with a "one-stop" source for current sea level change information and data. Sea level rise research is multidisciplinary, and in order to understand its causes, scientists must be able to access different measurements and compare them. The portal includes an interactive tool, called the Data Analysis Tool (DAT), for accessing, visualizing, and analyzing observations and models relevant to the study of sea level rise. Using NEXUS, an open-source big data analytic technology developed at the Jet Propulsion Laboratory, the DAT is able to provide users with on-the-fly analysis of all relevant parameters. The DAT is composed of three major components: a dedicated instance of OnEarth (a WMTS service), the NEXUS deep data analytic platform, and the JPL Common Mapping Client (CMC) for the web browser based user interface (UI). Utilizing the global imagery, a user can browse the data visually and isolate areas of interest for further study. The interface's "Analysis" component provides tools for area or point selection, single and/or comparative dataset selection, and a range of options, algorithms, and plotting. This analysis component utilizes the NEXUS cloud computing platform to provide on-demand processing of the data within the user-selected parameters and immediate display of the results. A RESTful web API is exposed for users comfortable with other interfaces who may want to take advantage of the cloud computing capabilities. This talk discusses how the DAT enables on-the-fly sea level research. The talk will introduce the DAT with an end-to-end tour of the tool, with exploration and animation of available imagery, a demonstration of comparative analysis and plotting, and how to share and export data along with images for use in publications and presentations. The session will cover what kind of data is available, what kind of analysis is possible, and what the outputs are.

  10. Separation analysis, a tool for analyzing multigrid algorithms

    NASA Technical Reports Server (NTRS)

    Costiner, Sorin; Taasan, Shlomo

    1995-01-01

    The separation of vectors by multigrid (MG) algorithms is applied to the study of convergence and to the prediction of the performance of MG algorithms. The separation operator for a two-level cycle algorithm is derived. It is used to analyze the efficiency of the cycle when mixing of eigenvectors occurs. In particular cases the separation analysis reduces to Fourier-type analysis. The separation operator of a two-level cycle for a Schrödinger eigenvalue problem is derived and analyzed in a Fourier basis. Separation analysis gives information on how to choose relaxations and inter-level transfers to improve performance. Separation analysis is a tool for analyzing and designing algorithms, and for optimizing their performance.
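    As background for the two-level cycle mentioned above, the standard two-grid error-propagation operator (textbook multigrid notation, not the separation operator derived in the paper) is:

```latex
% Two-grid error propagation: smoother S applied \nu_1 times before and \nu_2 times
% after a coarse-grid correction with restriction R, prolongation P, fine operator
% A_h, and coarse operator A_{2h}.
E_{2g} \;=\; S^{\nu_2}\,\bigl(I - P\,A_{2h}^{-1}\,R\,A_h\bigr)\,S^{\nu_1}
```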

  11. Smart roadside initiative macro benefit analysis : user’s guide for the benefit-cost analysis tool.

    DOT National Transportation Integrated Search

    2015-03-01

    Through the Smart Roadside Initiative (SRI), a Benefit-Cost Analysis (BCA) tool was developed for the evaluation of various new transportation technologies at a State level and to provide results that could support technology adoption by a State Depa...

  12. Using standardized tools to improve immunization costing data for program planning: the cost of the Colombian Expanded Program on Immunization.

    PubMed

    Castañeda-Orjuela, Carlos; Romero, Martin; Arce, Patricia; Resch, Stephen; Janusz, Cara B; Toscano, Cristiana M; De la Hoz-Restrepo, Fernando

    2013-07-02

    The cost of Expanded Programs on Immunization (EPI) is an important aspect of the economic and financial analysis needed for planning purposes. Costs also are needed for cost-effectiveness analysis of introducing new vaccines. We describe a costing tool that improves the speed, accuracy, and availability of EPI costs and that was piloted in Colombia. The ProVac CostVac Tool is a spreadsheet-based tool that estimates overall EPI costs considering program inputs (personnel, cold chain, vaccines, supplies, etc.) at three administrative levels (central, departmental, and municipal) and one service delivery level (health facilities). It uses various costing methods. The tool was evaluated through a pilot exercise in Colombia. In addition to the costs obtained from the central and intermediate administrative levels, a survey of 112 local health facilities was conducted to collect vaccination costs. Total cost of the EPI, cost per dose of vaccine delivered, and cost per fully vaccinated child with the recommended immunization schedule in Colombia in 2009 were estimated. The ProVac CostVac Tool is a novel, user-friendly tool, which allows users to conduct an EPI costing study following guidelines for cost studies. The total costs of the Colombian EPI were estimated at US$ 107.8 million in 2009. The cost for a fully immunized child with the recommended schedule was estimated at US$ 153.62. Vaccines and vaccination supplies accounted for 58% of total costs, personnel for 21%, cold chain for 18%, and transportation for 2%. Most EPI costs are incurred at the central level (62%). The major cost driver at the department and municipal levels is personnel costs. The ProVac CostVac Tool proved to be a comprehensive and useful tool that will allow researchers and health officials to estimate the actual cost for national immunization programs. The present analysis shows that personnel, cold chain, and transportation are important components of EPI and should be carefully estimated in the cost analysis, particularly when evaluating new vaccine introduction. Copyright © 2013 Elsevier Ltd. All rights reserved.
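    A back-of-the-envelope check of the unit-cost arithmetic reported above can be written as below; the implied number of fully immunized children is derived from the two reported figures and is not stated in the abstract.

```python
# Reproduce the reported unit-cost arithmetic: total programme cost divided by the
# number of fully immunized children gives the cost per child.
total_cost_usd = 107.8e6          # total EPI cost, Colombia, 2009 (from the abstract)
cost_per_fic   = 153.62           # reported cost per fully immunized child

implied_children = total_cost_usd / cost_per_fic   # inferred, not reported
print(f"Implied fully immunized children: {implied_children:,.0f}")

# Reported shares of total cost by input category.
shares = {"vaccines and supplies": 0.58, "personnel": 0.21,
          "cold chain": 0.18, "transportation": 0.02}
for category, share in shares.items():
    print(f"{category}: US$ {share * total_cost_usd / 1e6:.1f} million")
```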

  13. Correspondence analysis

    USDA-ARS?s Scientific Manuscript database

    Correspondence analysis is a powerful exploratory multivariate technique for categorical variables with many levels. It is a data analysis tool that characterizes associations between levels of 2 or more categorical variables using graphical representations of the information in a contingency table...
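    A minimal sketch of the computation behind correspondence analysis (an SVD of the standardized residuals of a contingency table) is shown below; the table values are invented.

```python
# Correspondence analysis of a small, invented contingency table via SVD.
import numpy as np

N = np.array([[20, 35, 10],
              [30, 15, 25],
              [10, 20, 35]], dtype=float)

P = N / N.sum()                       # correspondence matrix
r = P.sum(axis=1)                     # row masses
c = P.sum(axis=0)                     # column masses

# Standardized residuals and their SVD give the principal axes.
S = np.diag(1 / np.sqrt(r)) @ (P - np.outer(r, c)) @ np.diag(1 / np.sqrt(c))
U, sing, Vt = np.linalg.svd(S, full_matrices=False)

# Principal coordinates of rows and columns (first two dimensions).
row_coords = (np.diag(1 / np.sqrt(r)) @ U) * sing
col_coords = (np.diag(1 / np.sqrt(c)) @ Vt.T) * sing
print("row coordinates:\n", np.round(row_coords[:, :2], 3))
print("column coordinates:\n", np.round(col_coords[:, :2], 3))
print("inertia explained:", np.round(sing**2 / (sing**2).sum(), 3))
```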

  14. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, B.; Penev, M.; Melaina, M.

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  15. NASA System-Level Design, Analysis and Simulation Tools Research on NextGen

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge

    2011-01-01

    A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) research thrust of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes the accomplishments on baseline research and the advancements on design studies and system-level assessment, including the cluster analysis as an annualization standard of the air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.

  16. The prevention of mother-to-child transmission of HIV cascade analysis tool: supporting health managers to improve facility-level service delivery.

    PubMed

    Gimbel, Sarah; Voss, Joachim; Mercer, Mary Anne; Zierler, Brenda; Gloyd, Stephen; Coutinho, Maria de Joana; Floriano, Florencia; Cuembelo, Maria de Fatima; Einberg, Jennifer; Sherr, Kenneth

    2014-10-21

    The objective of the prevention of Mother-to-Child Transmission (pMTCT) cascade analysis tool is to provide frontline health managers at the facility level with the means to rapidly, independently and quantitatively track patient flows through the pMTCT cascade, and readily identify priority areas for clinic-level improvement interventions. Over a period of six months, five experienced maternal-child health managers and researchers iteratively adapted and tested this systems analysis tool for pMTCT services. They prioritized components of the pMTCT cascade for inclusion, disseminated multiple versions to 27 health managers and piloted it in five facilities. Process mapping techniques were used to chart PMTCT cascade steps in these five facilities, to document antenatal care attendance, HIV testing and counseling, provision of prophylactic anti-retrovirals, safe delivery, safe infant feeding, infant follow-up including HIV testing, and family planning, in order to obtain site-specific knowledge of service delivery. Seven pMTCT cascade steps were included in the Excel-based final tool. Prevalence calculations were incorporated as sub-headings under relevant steps. Cells not requiring data inputs were locked, wording was simplified and stepwise drop-offs and maximization functions were included at key steps along the cascade. While the drop off function allows health workers to rapidly assess how many patients were lost at each step, the maximization function details the additional people served if only one step improves to 100% capacity while others stay constant. Our experience suggests that adaptation of a cascade analysis tool for facility-level pMTCT services is feasible and appropriate as a starting point for discussions of where to implement improvement strategies. The resulting tool facilitates the engagement of frontline health workers and managers who fill out, interpret, apply the tool, and then follow up with quality improvement activities. Research on adoption, interpretation, and sustainability of this pMTCT cascade analysis tool by frontline health managers is needed. ClinicalTrials.gov NCT02023658, December 9, 2013.

  17. XMI2USE: A Tool for Transforming XMI to USE Specifications

    NASA Astrophysics Data System (ADS)

    Sun, Wuliang; Song, Eunjee; Grabow, Paul C.; Simmonds, Devon M.

    The UML-based Specification Environment (USE) tool supports syntactic analysis, type checking, consistency checking, and dynamic validation of invariants and pre-/post conditions specified in the Object Constraint Language (OCL). Due to its animation and analysis power, it is useful when checking critical non-functional properties such as security policies. However, the USE tool requires one to specify (i.e., "write") a model using its own textual language and does not allow one to import any model specification files created by other UML modeling tools. Hence, to make the best use of existing UML tools, we often create a model with OCL constraints using a modeling tool such as the IBM Rational Software Architect (RSA) and then use the USE tool for model validation. This approach, however, requires a manual transformation between the specifications of two different tool formats, which is error-prone and diminishes the benefit of automated model-level validations. In this paper, we describe our own implementation of a specification transformation engine that is based on the Model Driven Architecture (MDA) framework and currently supports automatic tool-level transformations from RSA to USE.

  18. ASTEC and MODEL: Controls software development at Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Surber, Jeffrey L.

    1993-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at the Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. ASTEC has been under development for the last three years and is meant to be an integrated collection of controls analysis tools for use at the desktop level. MODEL (Multi-Optimal Differential Equation Language) is a translator that converts programs written in the MODEL language to FORTRAN. An upgraded version of the MODEL program will be merged into ASTEC. MODEL has not been modified since 1981 and has not kept pace with changes in computers or user interface techniques. This paper describes the changes made to MODEL in order to make it useful in the 1990s and how it relates to ASTEC.

  19. Diamond tool wear detection method using cutting force and its power spectrum analysis in ultra-precision fly cutting

    NASA Astrophysics Data System (ADS)

    Zhang, G. Q.; To, S.

    2014-08-01

    Cutting force and its power spectrum analysis is considered an effective method for monitoring tool wear in many cutting processes, and a significant body of research has been conducted in this area. However, relatively little comparable research has addressed ultra-precision fly cutting. In this paper, a group of experiments was carried out to investigate the cutting forces and their power spectrum characteristics under different tool wear stages. The results reveal that the cutting force increases as tool wear progresses. The cutting force signals under different tool wear stages were analyzed using power spectrum analysis. The analysis indicates that a characteristic frequency does exist in the power spectrum of the cutting force, whose power spectral density increases with the level of tool wear; this characteristic frequency could be adopted to monitor diamond tool wear in ultra-precision fly cutting.
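    The power-spectrum check described above can be sketched as follows; the sampling rate, characteristic frequency and synthetic force signal are assumptions standing in for measured data.

```python
# Estimate the power spectral density of a (synthetic) cutting-force signal and
# read off the power near an assumed characteristic frequency.
import numpy as np
from scipy.signal import welch

fs = 20_000                                   # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
char_freq = 3_000                             # hypothetical characteristic frequency, Hz
wear_amplitude = 0.15                         # grows as the diamond tool wears (assumed)

force = (1.0 * np.sin(2 * np.pi * 66.7 * t)            # spindle-related component
         + wear_amplitude * np.sin(2 * np.pi * char_freq * t)
         + 0.05 * np.random.default_rng(0).standard_normal(t.size))

freqs, psd = welch(force, fs=fs, nperseg=4096)
idx = np.argmin(np.abs(freqs - char_freq))
print(f"PSD near {char_freq} Hz: {psd[idx]:.2e} N^2/Hz")
```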

  20. Factors influencing exemplary science teachers' levels of computer use

    NASA Astrophysics Data System (ADS)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use is dependent on perceived abilities at using computers. The teachers' use of computer-related applications/tools during class and their personal self-efficacy, age, and gender are highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and their gender were related to their use of computer-related applications/tools during class and to the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.

  1. RADC SCAT automated sneak circuit analysis tool

    NASA Astrophysics Data System (ADS)

    Depalma, Edward L.

    The sneak circuit analysis tool (SCAT) provides a PC-based system for real-time identification (during the design phase) of sneak paths and design concerns. The tool utilizes an expert system shell to assist the analyst, so that prior experience with sneak analysis is not necessary to perform the analysis. Both sneak circuits and design concerns are targeted by this tool, and both digital and analog circuits are examined. SCAT focuses the analysis at the assembly level, rather than the entire system, so that most sneak problems can be identified and corrected by the responsible design engineer in a timely manner. The SCAT program identifies the sneak circuits to the designer, who then decides what course of action is necessary.

  2. Multi-level factors influence the implementation and use of complex innovations in cancer care: a multiple case study of synoptic reporting.

    PubMed

    Urquhart, Robin; Porter, Geoffrey A; Sargeant, Joan; Jackson, Lois; Grunfeld, Eva

    2014-09-16

    The implementation of innovations (i.e., new tools and practices) in healthcare organizations remains a significant challenge. The objective of this study was to examine the key interpersonal, organizational, and system level factors that influenced the implementation and use of synoptic reporting tools in three specific areas of cancer care. Using case study methodology, we studied three cases in Nova Scotia, Canada, wherein synoptic reporting tools were implemented within clinical departments/programs. Synoptic reporting tools capture and present information about a medical or surgical procedure in a structured, checklist-like format and typically report only items critical for understanding the disease and subsequent impacts on patient care. Data were collected through semi-structured interviews with key informants, document analysis, nonparticipant observation, and tool use/examination. Analysis involved production of case histories, in-depth analysis of each case, and a cross-case analysis. Numerous techniques were used during the research design, data collection, and data analysis stages to increase the rigour of this study. The analysis revealed five common factors that were particularly influential to implementation and use of synoptic reporting tools across the three cases: stakeholder involvement, managing the change process (e.g., building demand, communication, training and support), champions and respected colleagues, administrative and managerial support, and innovation attributes (e.g., complexity, compatibility with interests and values). The direction of influence (facilitating or impeding) of each of these factors differed across and within cases. The findings demonstrate the importance of a multi-level contextual analysis to gaining both breadth and depth to our understanding of innovation implementation and use in health care. They also provide new insights into several important issues under-reported in the literature on moving innovations into healthcare practice, including the role of middle managers in implementation efforts and the importance of attending to the interpersonal aspects of implementation.

  3. SLIVISU, an Interactive Visualisation Framework for Analysis of Geological Sea-Level Indicators

    NASA Astrophysics Data System (ADS)

    Klemann, V.; Schulte, S.; Unger, A.; Dransch, D.

    2011-12-01

    Flanking data analysis in the earth system sciences with advanced visualisation tools is becoming increasingly important due to the rising complexity, amount and variety of available data. With respect to sea-level indicators (SLIs), their analysis in earth-system applications, such as modelling and simulation on regional or global scales, demands the consideration of large amounts of data (thousands of SLIs) and therefore requires going beyond the analysis of single sea-level curves. On the other hand, a gross analysis by means of statistical methods is hindered by the often heterogeneous and individual character of the single SLIs; that is, the spatio-temporal context and the often heterogeneous information are difficult to handle or to represent in an objective way. Therefore a concept integrating automated analysis and visualisation is mandatory. This is provided by visual analytics. As an implementation of this concept, we present the visualisation framework SLIVISU, developed at GFZ, which is based on multiple linked views and provides a synoptic analysis of observational data, model configurations, model outputs and results of automated analysis in glacial isostatic adjustment. Starting as a visualisation tool for an existing database of SLIs, it now serves as an analysis tool for the evaluation of model simulations in studies of glacial-isostatic adjustment.

  4. Sequence Alignment to Predict Across Species Susceptibility ...

    EPA Pesticide Factsheets

    Conservation of a molecular target across species can be used as a line-of-evidence to predict the likelihood of chemical susceptibility. The web-based Sequence Alignment to Predict Across Species Susceptibility (SeqAPASS) tool was developed to simplify, streamline, and quantitatively assess protein sequence/structural similarity across taxonomic groups as a means to predict relative intrinsic susceptibility. The intent of the tool is to allow for evaluation of any potential protein target, so it is amenable to variable degrees of protein characterization, depending on available information about the chemical/protein interaction and the molecular target itself. To allow for flexibility in the analysis, a layered strategy was adopted for the tool. The first level of the SeqAPASS analysis compares primary amino acid sequences to a query sequence, calculating a metric for sequence similarity (including detection of candidate orthologs), the second level evaluates sequence similarity within selected domains (e.g., ligand-binding domain, DNA binding domain), and the third level of analysis compares individual amino acid residue positions identified as being of importance for protein conformation and/or ligand binding upon chemical perturbation. Each level of the SeqAPASS analysis provides increasing evidence to apply toward rapid, screening-level assessments of probable cross species susceptibility. Such analyses can support prioritization of chemicals for further ev
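    A rough, illustrative analogue of the first SeqAPASS level (primary amino acid sequence comparison) is sketched below using Biopython's pairwise aligner; the sequences, species labels and score normalisation are placeholders, and this is not the SeqAPASS implementation.

```python
# Globally align a query protein to candidate sequences and report alignment
# scores normalized by the query's self-alignment score. Requires Biopython.
from Bio import Align
from Bio.Align import substitution_matrices

aligner = Align.PairwiseAligner()
aligner.mode = "global"
aligner.substitution_matrix = substitution_matrices.load("BLOSUM62")
aligner.open_gap_score = -10
aligner.extend_gap_score = -0.5

# Placeholder sequences; real analyses would pull full-length proteins from a database.
query = "MTEYKLVVVGAGGVGKSALTIQLIQNHFVDEYDPTIEDSYRKQVVIDGETCLLDILDTAGQEEY"
candidates = {
    "species_A": "MTEYKLVVVGAGGVGKSALTIQLIQNHFVDEYDPTIEDSYRKQVVIDGETCLLDILDTAGQEEY",
    "species_B": "MTEYKLVVVGACGVGKSALTIQLIQNHFIDEYDPTIEDSYRKQVVIDGETCLLDILDTAGHEEY",
}

self_score = aligner.score(query, query)
for name, seq in candidates.items():
    ratio = aligner.score(query, seq) / self_score
    print(f"{name}: normalized alignment score = {ratio:.3f}")
```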

  5. Rural Mental Health

    MedlinePlus

    ... Toolkits Economic Impact Analysis Tool Community Health Gateway Sustainability Planning Tools Testing New Approaches Rural Health IT ... Mental Health Professional Shortage in the United States reports that higher levels of unmet need for mental ...

  6. Decoupled 1D/3D analysis of a hydraulic valve

    NASA Astrophysics Data System (ADS)

    Mehring, Carsten; Zopeya, Ashok; Latham, Matt; Ihde, Thomas; Massie, Dan

    2014-10-01

    Analysis approaches during product development of fluid valves and other aircraft fluid delivery components vary greatly depending on the development stage. Traditionally, empirical or simplistic one-dimensional tools are deployed during preliminary design, whereas detailed analysis tools such as CFD (Computational Fluid Dynamics) are used to refine a selected design during the detailed design stage. In recent years, combined 1D/3D co-simulation has been deployed specifically for system-level simulations requiring an increased level of analysis detail for one or more components. This paper presents a decoupled 1D/3D analysis approach in which 3D CFD analysis results are utilized to enhance the fidelity of a dynamic 1D model, in the context of an aircraft fuel valve.

  7. A hardware acceleration based on high-level synthesis approach for glucose-insulin analysis

    NASA Astrophysics Data System (ADS)

    Daud, Nur Atikah Mohd; Mahmud, Farhanahani; Jabbar, Muhamad Hairol

    2017-01-01

    In this paper, the research focuses on Type 1 Diabetes Mellitus (T1DM). Since this disease requires close attention to the blood glucose concentration, managed with the help of insulin injections, it is important to have a tool that is able to predict the glucose level when a certain amount of carbohydrate is consumed at meal time. Therefore, the Hovorka model, which is aimed at T1DM, is chosen for this research. A high-level language, C++, is used to construct the mathematical model of the Hovorka model. This code is then converted into an intellectual property (IP) block, also known as a hardware accelerator, using a high-level synthesis (HLS) approach, which is able to improve the design and performance of the glucose-insulin analysis tool, as explained further in this paper. This is the first step in this research before implementing the design in a system-on-chip (SoC) to achieve a high-performance system for the glucose-insulin analysis tool.

  8. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision-making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper will present an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.

  9. Impact of a decision-support tool on decision making at the district level in Kenya

    PubMed Central

    2013-01-01

    Background In many countries, the responsibility for planning and delivery of health services is devolved to the subnational level. Health programs, however, often fall short of efficient use of data to inform decisions. As a result, programs are not as effective as they can be at meeting the health needs of the populations they serve. In Kenya, a decision-support tool, the District Health Profile (DHP) tool was developed to integrate data from health programs, primarily HIV, at the district level and to enable district health management teams to review and monitor program progress for specific health issues to make informed service delivery decisions. Methods Thirteen in-depth interviews were conducted with ten tool users and three non-users in six districts to qualitatively assess the process of implementing the tool and its effect on data-informed decision making at the district level. The factors that affected use or non-use of the tool were also investigated. Respondents were selected via convenience sample from among those that had been trained to use the DHP tool except for one user who was self-taught to use the tool. Selection criteria also included respondents from urban districts with significant resources as well as respondents from more remote, under-resourced districts. Results Findings from the in-depth interviews suggest that among those who used it, the DHP tool had a positive effect on data analysis, review, interpretation, and sharing at the district level. The automated function of the tool allowed for faster data sharing and immediate observation of trends that facilitated data-informed decision making. All respondents stated that the DHP tool assisted them to better target existing services in need of improvement and to plan future services, thus positively influencing program improvement. Conclusions This paper stresses the central role that a targeted decision-support tool can play in making data aggregation, analysis, and presentation easier and faster. The visual synthesis of data facilitates the use of information in health decision making at the district level of a health system and promotes program improvement. The experience in Kenya can be applied to other countries that face challenges making district-level, data-informed decisions with data from fragmented information systems. PMID:24011028

  10. Tools for observational gait analysis in patients with stroke: a systematic review.

    PubMed

    Ferrarello, Francesco; Bianchi, Valeria Anna Maria; Baccini, Marco; Rubbieri, Gaia; Mossello, Enrico; Cavallini, Maria Chiara; Marchionni, Niccolò; Di Bari, Mauro

    2013-12-01

    Stroke severely affects walking ability, and assessment of gait kinematics is important in defining diagnosis, planning treatment, and evaluating interventions in stroke rehabilitation. Although observational gait analysis is the most common approach to evaluate gait kinematics, tools useful for this purpose have received little attention in the scientific literature and have not been thoroughly reviewed. The aims of this systematic review were to identify tools proposed to conduct observational gait analysis in adults with a stroke, to summarize evidence concerning their quality, and to assess their implementation in rehabilitation research and clinical practice. An extensive search was performed of original articles reporting on visual/observational tools developed to investigate gait kinematics in adults with a stroke. Two reviewers independently selected studies, extracted data, assessed quality of the included studies, and scored the metric properties and clinical utility of each tool. Rigor in reporting metric properties and dissemination of the tools also was evaluated. Five tools were identified, not all of which had been tested adequately for their metric properties. Evaluation of content validity was partially satisfactory. Reliability was poorly investigated in all but one tool. Concurrent validity and sensitivity to change were shown for 3 and 2 tools, respectively. Overall, adequate levels of quality were rarely reached. The dissemination of the tools was poor. Based on critical appraisal, the Gait Assessment and Intervention Tool shows a good level of quality, and its use in stroke rehabilitation is recommended. Rigorous studies are needed for the other tools in order to establish their usefulness.

  11. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  12. A web-based tool for groundwater mapping and drought analysis

    NASA Astrophysics Data System (ADS)

    Christensen, S.; Burns, M.; Jones, N.; Strassberg, G.

    2012-12-01

    In 2011-2012, the state of Texas saw the worst one-year drought on record. Fluctuations in gravity measured by GRACE satellites indicate that as much as 100 cubic kilometers of water was lost during this period. Much of this came from reservoirs and shallow soil moisture, but a significant amount came from aquifers. In response to this crisis, a Texas Drought Technology Steering Committee (TDTSC) consisting of academics and water managers was formed to develop new tools and strategies to assist the state in monitoring, predicting, and responding to drought events. In this presentation, we describe one of the tools that was developed as part of this effort. When analyzing the impact of drought on groundwater levels, it is fairly common to examine time series data at selected monitoring wells. However, accurately assessing impacts and trends requires both spatial and temporal analysis involving the development of detailed water level maps at various scales. Creating such maps in a flexible and rapid fashion is critical for effective drought analysis, but can be challenging due to the massive amounts of data involved and the processing required to generate such maps. Furthermore, wells are typically not sampled at the same points in time, and so developing a water table map for a particular date requires both spatial and temporal interpolation of water elevations. To address this challenge, a Cloud-based water level mapping system was developed for the state of Texas. The system is based on the Texas Water Development Board (TWDB) groundwater database, but can be adapted to use other databases as well. The system involves a set of ArcGIS workflows running on a server with a web-based front end and a Google Earth plug-in. A temporal interpolation geoprocessing tool was developed to estimate the piezometric heads for all wells in a given region at a specific date using a regression analysis. This interpolation tool is coupled with other geoprocessing tools to filter data and interpolate point elevations spatially to produce water level, drawdown, and depth to groundwater maps. The web interface allows for users to generate these maps at locations and times of interest. A sequence of maps can be generated over a period of time and animated to visualize how water levels are changing. The time series regression analysis can also be used to do short-term predictions of future water levels.
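    The temporal-interpolation step described above can be illustrated with a small regression sketch; the well dates, head values, and the linear-trend assumption are invented for illustration.

```python
# Fit a regression to a well's irregularly timed water-level measurements and
# evaluate it at a common target date (the temporal interpolation step that
# precedes spatial interpolation). All values below are placeholders.
import numpy as np
import pandas as pd

obs = pd.DataFrame({
    "date": pd.to_datetime(["2010-03-14", "2010-11-02", "2011-06-21", "2012-01-09"]),
    "head_ft": [412.3, 409.8, 403.1, 398.6],        # piezometric head (invented)
})
target_date = pd.Timestamp("2011-09-01")

# Express time in days since the first observation and fit a linear trend.
t = (obs["date"] - obs["date"].min()).dt.days.to_numpy(dtype=float)
coeffs = np.polyfit(t, obs["head_ft"].to_numpy(), deg=1)

t_target = (target_date - obs["date"].min()).days
head_at_target = np.polyval(coeffs, t_target)
print(f"Estimated head on {target_date.date()}: {head_at_target:.1f} ft")
```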

  13. Novel features and enhancements in BioBin, a tool for the biologically inspired binning and association analysis of rare variants

    PubMed Central

    Byrska-Bishop, Marta; Wallace, John; Frase, Alexander T; Ritchie, Marylyn D

    2018-01-01

    Motivation: BioBin is an automated bioinformatics tool for the multi-level biological binning of sequence variants. Herein, we present a significant update to BioBin which expands the software to facilitate a comprehensive rare variant analysis and incorporates novel features and analysis enhancements. Results: In BioBin 2.3, we extend our software tool by implementing statistical association testing, updating the binning algorithm, as well as incorporating novel analysis features providing for a robust, highly customizable, and unified rare variant analysis tool. Availability and implementation: The BioBin software package is open source and freely available to users at http://www.ritchielab.com/software/biobin-download. Contact: mdritchie@geisinger.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28968757
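    A toy sketch of the general bin-then-test idea behind rare-variant binning is given below (a generic burden-style collapse plus Fisher's exact test); it is not BioBin's algorithm, and the genotype data are simulated.

```python
# Collapse rare variants into a gene-level bin (carrier yes/no) and test
# case/control association with Fisher's exact test. Simulated toy data.
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(1)
n_samples, n_variants = 200, 8
is_case = np.repeat([1, 0], n_samples // 2)

# Hypothetical rare-variant genotype matrix for one gene bin (0 = ref, 1 = carrier),
# with a slightly higher carrier rate simulated in cases.
freq = np.where(is_case[:, None] == 1, 0.03, 0.015)
genotypes = rng.binomial(1, freq, size=(n_samples, n_variants))

carrier = genotypes.any(axis=1).astype(int)     # collapse the bin to carrier status

table = [[int(((carrier == 1) & (is_case == 1)).sum()),
          int(((carrier == 1) & (is_case == 0)).sum())],
         [int(((carrier == 0) & (is_case == 1)).sum()),
          int(((carrier == 0) & (is_case == 0)).sum())]]
odds_ratio, p_value = fisher_exact(table)
print(f"bin carriers (cases/controls): {table[0]}, OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```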

  14. Low Thrust Orbital Maneuvers Using Ion Propulsion

    NASA Astrophysics Data System (ADS)

    Ramesh, Eric

    2011-10-01

    Low-thrust maneuver options, such as electric propulsion, pose specific challenges within mission-level Modeling, Simulation, and Analysis (MS&A) tools. This project seeks to transition techniques for simulating low-thrust maneuvers from detailed engineering-level simulations such as AGI's Satellite ToolKit (STK) Astrogator to mission-level simulations such as the System Effectiveness Analysis Simulation (SEAS). Our project goals are as follows: A) Assess different low-thrust options to achieve various orbital changes; B) Compare such approaches to more conventional, high-thrust profiles; C) Compare the computational cost and accuracy of various approaches to calculate and simulate low-thrust maneuvers; D) Recommend methods for implementing low-thrust maneuvers in high-level mission simulations; E) Prototype recommended solutions.

  15. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    NASA Technical Reports Server (NTRS)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  16. Electromagnetic pulse (EMP) coupling codes for use with the vulnerability/lethality (VIL) taxonomy. Final report, June-October 1984

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mar, M.H.

    1995-07-01

    Based on the Vulnerability/Lethality (V/L) taxonomy developed by the Ballistic Vulnerability Lethality Division (BVLD) of the Survivability Lethality Analysis Directorate (SLAD), a nuclear electromagnetic pulse (EMP) coupling V/L analysis taxonomy has been developed. A nuclear EMP threat to a military system can be divided into two levels: (1) coupling to a system level through a cable, antenna, or aperture; and (2) the component level. This report focuses on the initial condition, which includes threat definition and target description, as well as the mapping process from the initial condition to the damaged-components state. EMP coupling analysis at a system level is used to accomplish this. This report introduces the nature of the EMP threat, the interaction between the threat and target, and how the output of EMP coupling analysis at a system level becomes the input to the component-level analysis. Many different tools (EMP coupling codes) are discussed for the mapping process, which corresponds to the physics of the phenomenology. This EMP coupling V/L taxonomy and the models identified in this report will provide the tools necessary to conduct basic V/L analysis of EMP coupling.

  17. Seeking unique and common biological themes in multiple gene lists or datasets: pathway pattern extraction pipeline for pathway-level comparative analysis.

    PubMed

    Yi, Ming; Mudunuri, Uma; Che, Anney; Stephens, Robert M

    2009-06-29

    One of the challenges in the analysis of microarray data is to integrate and compare the selected (e.g., differential) gene lists from multiple experiments for common or unique underlying biological themes. A common way to approach this problem is to extract common genes from these gene lists and then subject these genes to enrichment analysis to reveal the underlying biology. However, the capacity of this approach is largely restricted by the limited number of common genes shared by datasets from multiple experiments, which could be caused by the complexity of the biological system itself. We now introduce a new Pathway Pattern Extraction Pipeline (PPEP), which extends the existing WPS application by providing a new pathway-level comparative analysis scheme. To facilitate comparing and correlating results from different studies and sources, PPEP contains new interfaces that allow evaluation of the pathway-level enrichment patterns across multiple gene lists. As an exploratory tool, this analysis pipeline may help reveal the underlying biological themes at both the pathway and gene levels. The analysis scheme provided by PPEP begins with multiple gene lists, which may be derived from different studies in terms of the biological contexts, applied technologies, or methodologies. These lists are then subjected to pathway-level comparative analysis for extraction of pathway-level patterns. This analysis pipeline helps to explore the commonality or uniqueness of these lists at the level of pathways or biological processes from different but relevant biological systems using a combination of statistical enrichment measurements, pathway-level pattern extraction, and graphical display of the relationships of genes and their associated pathways as Gene-Term Association Networks (GTANs) within the WPS platform. As a proof of concept, we have used the new method to analyze many datasets from our collaborators as well as some public microarray datasets. This tool provides a new pathway-level analysis scheme for integrative and comparative analysis of data derived from different but relevant systems. The tool is freely available as a Pathway Pattern Extraction Pipeline implemented in our existing software package WPS, which can be obtained at http://www.abcc.ncifcrf.gov/wps/wps_index.php.
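    The per-pathway enrichment measurement that this kind of pipeline relies on can be illustrated with a hypergeometric test; all gene counts below are assumed values.

```python
# Hypergeometric (one-sided Fisher) enrichment of a gene list against one pathway.
from scipy.stats import hypergeom

background_genes = 20000        # genes in the annotation universe (assumed)
pathway_genes = 150             # genes annotated to the pathway (assumed)
list_genes = 400                # genes in the selected (e.g., differential) list (assumed)
overlap = 12                    # list genes that fall in the pathway (assumed)

# P(X >= overlap) when drawing list_genes from the background without replacement.
p_enrich = hypergeom.sf(overlap - 1, background_genes, pathway_genes, list_genes)
print(f"enrichment p-value = {p_enrich:.3g}")
```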

  18. Perspective Tools of the Strategic Management of VFR Tourism Development at the Regional Level

    ERIC Educational Resources Information Center

    Gorbunov, Aleksandr P.; Efimova, Ekaterina V.; Kobets, Margarita V.; Kilinkarova, Sofiya G.

    2016-01-01

    This study is aimed at identifying the perspective tools of strategic management in general and strategic planning of VFR tourism (for the purpose of visiting friends and relatives) at the regional level in particular. It is based on dialectical and logical methods, analysis and synthesis, induction and deduction, the concrete historical and…

  19. 49 CFR 228.407 - Analysis of work schedules; submissions; FRA review and approval of submissions; fatigue...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... TRANSPORTATION HOURS OF SERVICE OF RAILROAD EMPLOYEES; RECORDKEEPING AND REPORTING; SLEEPING QUARTERS Substantive... fatigue mitigation tools to reduce the risk for fatigue to a level that does not violate the fatigue... mitigation tools so as to present a risk for a level of fatigue that does not violate the applicable fatigue...

  20. 49 CFR 228.407 - Analysis of work schedules; submissions; FRA review and approval of submissions; fatigue...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... TRANSPORTATION HOURS OF SERVICE OF RAILROAD EMPLOYEES; RECORDKEEPING AND REPORTING; SLEEPING QUARTERS Substantive... fatigue mitigation tools to reduce the risk for fatigue to a level that does not violate the fatigue... mitigation tools so as to present a risk for a level of fatigue that does not violate the applicable fatigue...

  1. 49 CFR 228.407 - Analysis of work schedules; submissions; FRA review and approval of submissions; fatigue...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... TRANSPORTATION HOURS OF SERVICE OF RAILROAD EMPLOYEES; RECORDKEEPING AND REPORTING; SLEEPING QUARTERS Substantive... fatigue mitigation tools to reduce the risk for fatigue to a level that does not violate the fatigue... mitigation tools so as to present a risk for a level of fatigue that does not violate the applicable fatigue...

  2. Problems in Choosing Tools and Methods for Teaching Programming

    ERIC Educational Resources Information Center

    Vitkute-Adžgauskiene, Davia; Vidžiunas, Antanas

    2012-01-01

    The paper analyses the problems in selecting and integrating tools for delivering basic programming knowledge at the university level. Discussion and analysis of teaching the programming disciplines, the main principles of study programme design, requirements for teaching tools, methods and corresponding languages is presented, based on literature…

  3. "Development Radar": The Co-Configuration of a Tool in a Learning Network

    ERIC Educational Resources Information Center

    Toiviainen, Hanna; Kerosuo, Hannele; Syrjala, Tuula

    2009-01-01

    Purpose: The paper aims to argue that new tools are needed for operating, developing and learning in work-life networks where academic and practice knowledge are intertwined in multiple levels of and in boundary-crossing across activities. At best, tools for learning are designed in a process of co-configuration, as the analysis of one tool,…

  4. Sustainability Tools Inventory - Initial Gaps Analysis

    EPA Pesticide Factsheets

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities to achieve sustainability. It contributes to SHC 1.61.4

  5. CloudMan as a platform for tool, data, and analysis distribution.

    PubMed

    Afgan, Enis; Chapman, Brad; Taylor, James

    2012-11-27

    Cloud computing provides an infrastructure that facilitates large-scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions.

  6. Application of Frameworks in the Analysis and (Re)design of Interactive Visual Learning Tools

    ERIC Educational Resources Information Center

    Liang, Hai-Ning; Sedig, Kamran

    2009-01-01

    Interactive visual learning tools (IVLTs) are software environments that encode and display information visually and allow learners to interact with the visual information. This article examines the application and utility of frameworks in the analysis and design of IVLTs at the micro level. Frameworks play an important role in any design. They…

  7. FMAP: Functional Mapping and Analysis Pipeline for metagenomics and metatranscriptomics studies.

    PubMed

    Kim, Jiwoong; Kim, Min Soo; Koh, Andrew Y; Xie, Yang; Zhan, Xiaowei

    2016-10-10

    Given the lack of a complete and comprehensive library of microbial reference genomes, determining the functional profile of diverse microbial communities is challenging. The available functional analysis pipelines lack several key features: (i) an integrated alignment tool, (ii) operon-level analysis, and (iii) the ability to process large datasets. Here we introduce our open-sourced, stand-alone functional analysis pipeline for analyzing whole metagenomic and metatranscriptomic sequencing data, FMAP (Functional Mapping and Analysis Pipeline). FMAP performs alignment, gene family abundance calculations, and statistical analysis (three levels of analyses are provided: differentially-abundant genes, operons and pathways). The resulting output can be easily visualized with heatmaps and functional pathway diagrams. FMAP functional predictions are consistent with currently available functional analysis pipelines. FMAP is a comprehensive tool for providing functional analysis of metagenomic/metatranscriptomic sequencing data. With the added features of integrated alignment, operon-level analysis, and the ability to process large datasets, FMAP will be a valuable addition to the currently available functional analysis toolbox. We believe that this software will be of great value to the wider biology and bioinformatics communities.
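
    As a rough illustration of the gene-family-level statistics FMAP reports, the hedged sketch below applies a per-family Mann-Whitney U test to hypothetical normalized abundances from two sample groups; it is not FMAP's actual code, and the family identifiers and sample names are invented.

    ```python
    # Hedged sketch of gene-family differential abundance testing (not FMAP itself).
    import numpy as np
    from scipy.stats import mannwhitneyu

    def differential_families(abundance, group_a, group_b):
        """abundance: dict of family -> dict of sample -> normalized abundance."""
        results = {}
        for family, samples in abundance.items():
            a = [samples[s] for s in group_a]
            b = [samples[s] for s in group_b]
            stat, p = mannwhitneyu(a, b, alternative="two-sided")
            fold = (np.mean(a) + 1e-9) / (np.mean(b) + 1e-9)
            results[family] = {"log2_fc": np.log2(fold), "p_value": p}
        return results

    # Toy usage with invented KEGG-style family IDs and sample labels
    abundance = {"K00001": {"s1": 10.0, "s2": 12.0, "s3": 1.0, "s4": 0.5}}
    print(differential_families(abundance, ["s1", "s2"], ["s3", "s4"]))
    ```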

  8. Everglades Depth Estimation Network (EDEN) Applications: Tools to View, Extract, Plot, and Manipulate EDEN Data

    USGS Publications Warehouse

    Telis, Pamela A.; Henkel, Heather

    2009-01-01

    The Everglades Depth Estimation Network (EDEN) is an integrated system of real-time water-level monitoring, ground-elevation data, and water-surface elevation modeling to provide scientists and water managers with current on-line water-depth information for the entire freshwater part of the greater Everglades. To assist users in applying the EDEN data to their particular needs, a series of five EDEN tools, or applications (EDENapps), were developed. Using EDEN's tools, scientists can view the EDEN datasets of daily water-level and ground elevations, compute and view daily water depth and hydroperiod surfaces, extract data for user-specified locations, plot transects of water level, and animate water-level transects over time. Also, users can retrieve data from the EDEN datasets for analysis and display in other analysis software programs. As scientists and managers attempt to restore the natural volume, timing, and distribution of sheetflow in the wetlands, such information is invaluable. Information analyzed and presented with these tools is used to advise policy makers, planners, and decision makers of the potential effects of water management and restoration scenarios on the natural resources of the Everglades.
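
    The core depth and hydroperiod arithmetic behind tools like these is simple to show; the sketch below uses made-up elevation grids and is only a schematic of the calculation (daily depth = water-surface elevation minus ground elevation; hydroperiod = count of inundated days), not EDEN's software.

    ```python
    # Schematic water-depth and hydroperiod calculation on hypothetical grids.
    import numpy as np

    rng = np.random.default_rng(0)
    ground = np.array([[1.2, 1.5], [1.8, 2.0]])                # ground elevation (m)
    water_surface = rng.uniform(1.0, 2.5, size=(365, 2, 2))    # daily surfaces (m)

    depth = np.clip(water_surface - ground, 0.0, None)   # negative depths -> dry (0 m)
    hydroperiod = (depth > 0).sum(axis=0)                 # days per year each cell is wet
    print(hydroperiod)
    ```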

  9. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

    This paper introduces new trade analysis software called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
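
    Why correlated failures blunt the benefit of redundancy can be shown with a minimal Monte Carlo sketch using a common-cause failure model; this is an illustration only, not SMART's propositional-logic method, and all probabilities are hypothetical.

    ```python
    # Illustrative Monte Carlo: mission succeeds if at least one of two redundant
    # subsystems survives; a shared common-cause failure correlates the duplicates.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    p_independent = 0.10   # independent failure probability per subsystem
    p_common = 0.05        # failure mode that takes out both subsystems at once

    common = rng.random(n) < p_common
    fail_a = common | (rng.random(n) < p_independent)
    fail_b = common | (rng.random(n) < p_independent)

    p_marginal = p_common + (1 - p_common) * p_independent
    print("P(success) with correlated duplicates :", np.mean(~(fail_a & fail_b)))
    print("P(success) if failures were independent:", 1 - p_marginal**2)
    ```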

  10. Update on HCDstruct - A Tool for Hybrid Wing Body Conceptual Design and Structural Optimization

    NASA Technical Reports Server (NTRS)

    Gern, Frank H.

    2015-01-01

    HCDstruct is a Matlab® based software tool to rapidly build a finite element model for structural optimization of hybrid wing body (HWB) aircraft at the conceptual design level. The tool uses outputs from a Flight Optimization System (FLOPS) performance analysis together with a conceptual outer mold line of the vehicle, e.g. created by Vehicle Sketch Pad (VSP), to generate a set of MSC Nastran® bulk data files. These files can readily be used to perform a structural optimization and weight estimation using Nastran’s® Solution 200 multidisciplinary optimization solver. Initially developed at NASA Langley Research Center to perform increased fidelity conceptual level HWB centerbody structural analyses, HCDstruct has grown into a complete HWB structural sizing and weight estimation tool, including a fully flexible aeroelastic loads analysis. Recent upgrades to the tool include the expansion to a full wing tip-to-wing tip model for asymmetric analyses like engine out conditions and dynamic overswings, as well as a fully actuated trailing edge, featuring up to 15 independently actuated control surfaces and twin tails. Several example applications of the HCDstruct tool are presented.

  11. Discrimination of surface wear on obsidian tools using LSCM and RelA: pilot study results (area-scale analysis of obsidian tool surfaces).

    PubMed

    Stemp, W James; Chung, Steven

    2011-01-01

    This pilot study tests the reliability of laser scanning confocal microscopy (LSCM) to quantitatively measure wear on experimental obsidian tools. To our knowledge, this is the first use of confocal microscopy to study wear on stone flakes made from an amorphous silicate like obsidian. Three-dimensional surface roughness or texture area scans on three obsidian flakes used on different contact materials (hide, shell, wood) were documented using the LSCM to determine whether the worn surfaces could be discriminated using area-scale analysis, specifically relative area (RelA). When coupled with the F-test, this scale-sensitive fractal analysis could not only discriminate the used from unused surfaces on individual tools, but was also capable of discriminating the wear histories of tools used on different contact materials. Results indicate that such discriminations occur at different scales. Confidence levels for the discriminations at different scales were established using the F-test (mean square ratios or MSRs). In instances where discrimination of surface roughness or texture was not possible above the established confidence level based on MSRs, photomicrographs and RelA assisted in hypothesizing why this was so. Copyright © 2011 Wiley Periodicals, Inc.

  12. Radiation Mitigation and Power Optimization Design Tools for Reconfigurable Hardware in Orbit

    NASA Technical Reports Server (NTRS)

    French, Matthew; Graham, Paul; Wirthlin, Michael; Wang, Li; Larchev, Gregory

    2005-01-01

    The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable-gate-array (FPGA) technology. In the second year of the project, design tools that leverage an established FPGA design environment have been created to visualize and analyze an FPGA circuit for radiation weaknesses and power inefficiencies. For radiation, a single-event upset (SEU) emulator, persistence analysis tool, and a half-latch removal tool for Xilinx/Virtex-II devices have been created. Research is underway on a persistence mitigation tool and multiple-bit upset (MBU) studies. For power, synthesis-level dynamic power visualization and analysis tools have been completed. Power optimization tools are under development and preliminary test results are positive.

  13. CRIE: An automated analyzer for Chinese texts.

    PubMed

    Sung, Yao-Ting; Chang, Tao-Hsing; Lin, Wei-Chun; Hsieh, Kuan-Sheng; Chang, Kuo-En

    2016-12-01

    Textual analysis has been applied to various fields, such as discourse analysis, corpus studies, text leveling, and automated essay evaluation. Several tools have been developed for analyzing texts written in alphabetic languages such as English and Spanish. However, currently there is no tool available for analyzing Chinese-language texts. This article introduces a tool for the automated analysis of simplified and traditional Chinese texts, called the Chinese Readability Index Explorer (CRIE). Composed of four subsystems and incorporating 82 multilevel linguistic features, CRIE is able to conduct the major tasks of segmentation, syntactic parsing, and feature extraction. Furthermore, the integration of linguistic features with machine learning models enables CRIE to provide leveling and diagnostic information for texts in language arts, texts for learning Chinese as a foreign language, and texts with domain knowledge. The usage and validation of the functions provided by CRIE are also introduced.
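
    The "linguistic features plus machine learning" leveling step can be sketched generically; the features and grade labels below are hypothetical stand-ins, not CRIE's 82 features or its trained models.

    ```python
    # Hedged sketch of feature-based text leveling (not CRIE's features or model).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Hypothetical per-text features: [avg sentence length, avg strokes per character,
    # type-token ratio, share of low-frequency words]
    X = np.array([[8, 6.1, 0.55, 0.05],
                  [15, 7.8, 0.62, 0.12],
                  [24, 9.0, 0.71, 0.25],
                  [31, 9.6, 0.74, 0.33]])
    y = np.array([1, 2, 3, 4])   # readability levels (toy labels)

    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    model.fit(X, y)
    print(model.predict([[20, 8.5, 0.68, 0.20]]))
    ```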

  14. Assessment and application of national environmental databases and mapping tools at the local level to two community case studies.

    PubMed

    Hammond, Davyda; Conlon, Kathryn; Barzyk, Timothy; Chahine, Teresa; Zartarian, Valerie; Schultz, Brad

    2011-03-01

    Communities are concerned over pollution levels and seek methods to systematically identify and prioritize the environmental stressors in their communities. Geographic information system (GIS) maps of environmental information can be useful tools for communities in their assessment of environmental-pollution-related risks. Databases and mapping tools that supply community-level estimates of ambient concentrations of hazardous pollutants, risk, and potential health impacts can provide relevant information for communities to understand, identify, and prioritize potential exposures and risk from multiple sources. An assessment of existing databases and mapping tools was conducted as part of this study to explore the utility of publicly available databases, and three of these databases were selected for use in a community-level GIS mapping application. Queried data from the U.S. EPA's National-Scale Air Toxics Assessment, Air Quality System, and National Emissions Inventory were mapped at the appropriate spatial and temporal resolutions for identifying risks of exposure to air pollutants in two communities. The maps combine monitored and model-simulated pollutant and health risk estimates, along with local survey results, to assist communities with the identification of potential exposure sources and pollution hot spots. Findings from this case study analysis will provide information to advance the development of new tools to assist communities with environmental risk assessments and hazard prioritization. © 2010 Society for Risk Analysis.

  15. Bioinformatics tools for quantitative and functional metagenome and metatranscriptome data analysis in microbes.

    PubMed

    Niu, Sheng-Yong; Yang, Jinyu; McDermaid, Adam; Zhao, Jing; Kang, Yu; Ma, Qin

    2017-05-08

    Metagenomic and metatranscriptomic sequencing approaches are more frequently being used to link microbiota to important diseases and ecological changes. Many analyses have been used to compare the taxonomic and functional profiles of microbiota across habitats or individuals. While a large portion of metagenomic analyses focus on species-level profiling, some studies use strain-level metagenomic analyses to investigate the relationship between specific strains and certain circumstances. Metatranscriptomic analysis provides another important insight into activities of genes by examining gene expression levels of microbiota. Hence, combining metagenomic and metatranscriptomic analyses will help understand the activity or enrichment of a given gene set, such as drug-resistant genes among microbiome samples. Here, we summarize existing bioinformatics tools of metagenomic and metatranscriptomic data analysis, the purpose of which is to assist researchers in deciding the appropriate tools for their microbiome studies. Additionally, we propose an Integrated Meta-Function mapping pipeline to incorporate various reference databases and accelerate functional gene mapping procedures for both metagenomic and metatranscriptomic analyses. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. CloudMan as a platform for tool, data, and analysis distribution

    PubMed Central

    2012-01-01

    Background Cloud computing provides an infrastructure that facilitates large-scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. Results CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. Conclusions With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions. PMID:23181507

  17. Open access to high-level data and analysis tools in the CMS experiment at the LHC

    DOE PAGES

    Calderon, A.; Colling, D.; Huffman, A.; ...

    2015-12-23

    The CMS experiment, in recognition of its commitment to data preservation and open access as well as to education and outreach, has made its first public release of high-level data under the CC0 waiver: up to half of the proton-proton collision data (by volume) at 7 TeV from 2010 in CMS Analysis Object Data format. CMS has prepared, in collaboration with CERN and the other LHC experiments, an open-data web portal based on Invenio. The portal provides access to CMS public data as well as to analysis tools and documentation for the public. The tools include an event display and a histogram application that run in the browser. In addition, a virtual machine containing a CMS software environment along with XRootD access to the data is available. Within the virtual machine the public can analyse CMS data; example code is provided. We describe the accompanying tools and documentation and discuss the first experiences of data use.

  18. HYPATIA--An Online Tool for ATLAS Event Visualization

    ERIC Educational Resources Information Center

    Kourkoumelis, C.; Vourakis, S.

    2014-01-01

    This paper describes an interactive tool for analysis of data from the ATLAS experiment taking place at the world's highest energy particle collider at CERN. The tool, called HYPATIA/applet, enables students of various levels to become acquainted with particle physics and look for discoveries in a similar way to that of real research.

  19. A Thermal Management Systems Model for the NASA GTX RBCC Concept

    NASA Technical Reports Server (NTRS)

    Traci, Richard M.; Farr, John L., Jr.; Laganelli, Tony; Walker, James (Technical Monitor)

    2002-01-01

    The Vehicle Integrated Thermal Management Analysis Code (VITMAC) was further developed to aid the analysis, design, and optimization of propellant and thermal management concepts for advanced propulsion systems. The computational tool is based on engineering level principles and models. A graphical user interface (GUI) provides a simple and straightforward method to assess and evaluate multiple concepts before undertaking more rigorous analysis of candidate systems. The tool incorporates the Chemical Equilibrium and Applications (CEA) program and the RJPA code to permit heat transfer analysis of both rocket and air breathing propulsion systems. Key parts of the code have been validated with experimental data. The tool was specifically tailored to analyze rocket-based combined-cycle (RBCC) propulsion systems being considered for space transportation applications. This report describes the computational tool and its development and verification for NASA GTX RBCC propulsion system applications.

  20. Enhancement of the FDOT's project level and network level bridge management analysis tools

    DOT National Transportation Integrated Search

    2011-02-01

    Over several years, the Florida Department of Transportation (FDOT) has been implementing the AASHTO Pontis Bridge Management System to support network-level and project-level decision making in the headquarters and district offices. Pontis is an int...

  1. SYNCSA--R tool for analysis of metacommunities based on functional traits and phylogeny of the community components.

    PubMed

    Debastiani, Vanderlei J; Pillar, Valério D

    2012-08-01

    SYNCSA is an R package for the analysis of metacommunities based on functional traits and phylogeny of the community components. It offers tools to calculate several matrix correlations that express trait-convergence assembly patterns, trait-divergence assembly patterns and phylogenetic signal in functional traits at the species pool level and at the metacommunity level. SYNCSA is a package for the R environment, under a GPL-2 open-source license and freely available on CRAN official web server for R (http://cran.r-project.org). vanderleidebastiani@yahoo.com.br.

  2. Forensic surface metrology: tool mark evidence.

    PubMed

    Gambino, Carol; McLaughlin, Patrick; Kuo, Loretta; Kammerman, Frani; Shenkin, Peter; Diaczuk, Peter; Petraco, Nicholas; Hamby, James; Petraco, Nicholas D K

    2011-01-01

    Over the last several decades, forensic examiners of impression evidence have come under scrutiny in the courtroom due to analysis methods that rely heavily on subjective morphological comparisons. Currently, there is no universally accepted system that generates numerical data to independently corroborate visual comparisons. Our research attempts to develop such a system for tool mark evidence, proposing a methodology that objectively evaluates the association of striated tool marks with the tools that generated them. In our study, 58 primer shear marks on 9 mm cartridge cases, fired from four Glock model 19 pistols, were collected using high-resolution white light confocal microscopy. The resulting three-dimensional surface topographies were filtered to extract all "waviness surfaces"-the essential "line" information that firearm and tool mark examiners view under a microscope. Extracted waviness profiles were processed with principal component analysis (PCA) for dimension reduction. Support vector machines (SVM) were used to make the profile-gun associations, and conformal prediction theory (CPT) for establishing confidence levels. At the 95% confidence level, CPT coupled with PCA-SVM yielded an empirical error rate of 3.5%. Complementary, bootstrap-based computations for estimated error rates were 0%, indicating that the error rate for the algorithmic procedure is likely to remain low on larger data sets. Finally, suggestions are made for practical courtroom application of CPT for assigning levels of confidence to SVM identifications of tool marks recorded with confocal microscopy. Copyright © 2011 Wiley Periodicals, Inc.
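
    The dimension-reduction and classification stage described above (PCA followed by an SVM) can be sketched with scikit-learn on synthetic stand-in profiles; conformal prediction is omitted here, and none of the data below are the study's measurements.

    ```python
    # Sketch of the PCA + SVM profile classification step on synthetic profiles.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    profiles = rng.normal(size=(58, 500))   # stand-in 1-D waviness profiles
    guns = np.arange(58) % 4                # labels for four source firearms

    clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="linear"))
    print("cross-validated accuracy:", cross_val_score(clf, profiles, guns, cv=3).mean())
    ```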

  3. Tools for Genomic and Transcriptomic Analysis of Microbes at Single-Cell Level

    PubMed Central

    Chen, Zixi; Chen, Lei; Zhang, Weiwen

    2017-01-01

    Microbiologists traditionally study populations rather than individual cells, as it is generally assumed that the status of individual cells will be similar to that observed in the population. However, recent studies have shown that the individual behavior of each single cell could be quite different from that of the whole population, suggesting the importance of extending traditional microbiology studies to the single-cell level. With recent technological advances, such as flow cytometry, next-generation sequencing (NGS), and microspectroscopy, single-cell microbiology has greatly enhanced the understanding of individuality and heterogeneity of microbes in many biological systems. Notably, the application of multiple ‘omics’ in single-cell analysis has shed light on how individual cells perceive, respond, and adapt to the environment, how heterogeneity arises under external stress and finally determines the fate of the whole population, and how microbes survive under natural conditions. As single-cell analysis involves no axenic cultivation of the target microorganism, it has also been demonstrated as a valuable tool for dissecting the microbial ‘dark matter.’ In this review, current state-of-the-art tools and methods for genomic and transcriptomic analysis of microbes at the single-cell level were critically summarized, including single-cell isolation methods and experimental strategies of single-cell analysis with NGS. In addition, perspectives on the future trends of technology development in the field of single-cell analysis were also presented. PMID:28979258

  4. miRanalyzer: a microRNA detection and analysis tool for next-generation sequencing experiments.

    PubMed

    Hackenberg, Michael; Sturm, Martin; Langenberger, David; Falcón-Pérez, Juan Manuel; Aransay, Ana M

    2009-07-01

    Next-generation sequencing now allows the sequencing of small RNA molecules and the estimation of their expression levels. Consequently, there will be a high demand for bioinformatics tools to cope with the several gigabytes of sequence data generated in each single deep-sequencing experiment. In this context, we developed miRanalyzer, a web server tool for the analysis of deep-sequencing experiments for small RNAs. The web server tool requires a simple input file containing a list of unique reads and their copy numbers (expression levels). Using these data, miRanalyzer (i) detects all known microRNA sequences annotated in miRBase, (ii) finds all perfect matches against other libraries of transcribed sequences, and (iii) predicts new microRNAs. The prediction of new microRNAs is an especially important point as there are many species with very few known microRNAs. Therefore, we implemented a highly accurate machine learning algorithm for the prediction of new microRNAs that reaches AUC values of 97.9% and recall values of up to 75% on unseen data. The web tool summarizes all the described steps in a single output page, which provides a comprehensive overview of the analysis, adding links to more detailed output pages for each analysis module. miRanalyzer is available at http://web.bioinformatics.cicbiogune.es/microRNA/.
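
    Preparing the kind of input file described (unique reads with copy numbers) amounts to collapsing a FASTQ file; the short sketch below shows one plausible way to do that and is not miRanalyzer's own preprocessing code (the file name is hypothetical).

    ```python
    # Collapse raw small-RNA reads into unique sequences with read counts.
    from collections import Counter

    def collapse_reads(fastq_path):
        counts = Counter()
        with open(fastq_path) as handle:
            for i, line in enumerate(handle):
                if i % 4 == 1:               # the sequence line of each FASTQ record
                    counts[line.strip()] += 1
        return counts

    # for seq, n in collapse_reads("small_rna.fastq").most_common():
    #     print(f"{seq}\t{n}")
    ```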

  5. A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis.

    PubMed

    Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier

    2018-05-01

    The evaluation of the level of clinical competence acquired by the student is a complex process that must meet various requirements to ensure its quality. The psychometric analysis of the data collected by the assessment tools used is a fundamental aspect to guarantee the student's competence level. To conduct a psychometric analysis of an instrument which assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly-created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), which was comprised of 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis was performed considering face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance. These three factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. The assessment tool has proven to be sufficiently reliable and valid for the assessment of the clinical competence of nursing students using standardized patients. This tool has three main components: the nursing process, communication skills, and safety management. Copyright © 2018 Elsevier Ltd. All rights reserved.
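
    One of the reliability statistics reported above, internal consistency, is commonly computed as Cronbach's alpha; the sketch below shows the standard formula on random toy scores (so the value will be near zero) and is not the study's analysis code.

    ```python
    # Cronbach's alpha for a respondents-by-items score matrix (toy data).
    import numpy as np

    def cronbach_alpha(scores):
        scores = np.asarray(scores, dtype=float)   # rows = respondents, cols = items
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1).sum()
        total_var = scores.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars / total_var)

    toy = np.random.default_rng(0).integers(0, 3, size=(50, 27))  # 27 items scored 0-2
    print(round(cronbach_alpha(toy), 3))   # near zero for random, uncorrelated items
    ```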

  6. EEGNET: An Open Source Tool for Analyzing and Visualizing M/EEG Connectome.

    PubMed

    Hassan, Mahmoud; Shamas, Mohamad; Khalil, Mohamad; El Falou, Wassim; Wendling, Fabrice

    2015-01-01

    The brain is a large-scale complex network often referred to as the "connectome". Exploring the dynamic behavior of the connectome is a challenging issue as both excellent time and space resolution is required. In this context, Magneto/Electroencephalography (M/EEG) are effective neuroimaging techniques allowing for analysis of the dynamics of functional brain networks at scalp level and/or at reconstructed sources. However, a tool that can cover all the processing steps of identifying brain networks from M/EEG data is still missing. In this paper, we report a novel software package, called EEGNET, running under MATLAB (MathWorks, Inc.), and allowing for analysis and visualization of functional brain networks from M/EEG recordings. EEGNET is developed to analyze networks either at the level of scalp electrodes or at the level of reconstructed cortical sources. It includes i) basic steps in preprocessing M/EEG signals, ii) the solution of the inverse problem to localize/reconstruct the cortical sources, iii) the computation of functional connectivity among signals collected at surface electrodes and/or time courses of reconstructed sources, and iv) the computation of the network measures based on graph theory analysis. EEGNET is the only tool that combines M/EEG functional connectivity analysis and the computation of network measures derived from graph theory. The first version of EEGNET is easy to use, flexible and user-friendly. EEGNET is an open-source tool and can be freely downloaded from this webpage: https://sites.google.com/site/eegnetworks/.
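
    The last two steps in that pipeline (a connectivity matrix, then graph measures) can be sketched on simulated channel data; this uses plain correlation and an arbitrary threshold rather than EEGNET's connectivity estimators, and assumes a recent networkx release.

    ```python
    # Hedged sketch: correlation-based connectivity matrix + graph-theory measures.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(0)
    signals = rng.normal(size=(32, 500))          # 32 channels x 500 samples (simulated)

    conn = np.corrcoef(signals)                   # functional connectivity (correlation)
    np.fill_diagonal(conn, 0.0)
    adjacency = (np.abs(conn) > 0.1).astype(int)  # arbitrary threshold -> binary graph

    G = nx.from_numpy_array(adjacency)
    print("density:", nx.density(G))
    print("mean clustering coefficient:", nx.average_clustering(G))
    ```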

  7. EEGNET: An Open Source Tool for Analyzing and Visualizing M/EEG Connectome

    PubMed Central

    Hassan, Mahmoud; Shamas, Mohamad; Khalil, Mohamad; El Falou, Wassim; Wendling, Fabrice

    2015-01-01

    The brain is a large-scale complex network often referred to as the “connectome”. Exploring the dynamic behavior of the connectome is a challenging issue as both excellent time and space resolution is required. In this context, Magneto/Electroencephalography (M/EEG) are effective neuroimaging techniques allowing for analysis of the dynamics of functional brain networks at scalp level and/or at reconstructed sources. However, a tool that can cover all the processing steps of identifying brain networks from M/EEG data is still missing. In this paper, we report a novel software package, called EEGNET, running under MATLAB (MathWorks, Inc.), and allowing for analysis and visualization of functional brain networks from M/EEG recordings. EEGNET is developed to analyze networks either at the level of scalp electrodes or at the level of reconstructed cortical sources. It includes i) basic steps in preprocessing M/EEG signals, ii) the solution of the inverse problem to localize/reconstruct the cortical sources, iii) the computation of functional connectivity among signals collected at surface electrodes and/or time courses of reconstructed sources, and iv) the computation of the network measures based on graph theory analysis. EEGNET is the only tool that combines M/EEG functional connectivity analysis and the computation of network measures derived from graph theory. The first version of EEGNET is easy to use, flexible and user-friendly. EEGNET is an open-source tool and can be freely downloaded from this webpage: https://sites.google.com/site/eegnetworks/. PMID:26379232

  8. Knowledge Mapping: A Multipurpose Task Analysis Tool.

    ERIC Educational Resources Information Center

    Esque, Timm J.

    1988-01-01

    Describes knowledge mapping, a tool developed to increase the objectivity and accuracy of task difficulty ratings for job design. Application in a semiconductor manufacturing environment is discussed, including identifying prerequisite knowledge for a given task; establishing training development priorities; defining knowledge levels; identifying…

  9. Using Enabling Technologies to Advance Data Intensive Analysis Tools in the JPL Tropical Cyclone Information System

    NASA Astrophysics Data System (ADS)

    Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.

    2014-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing it to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been working on developing data analysis tools to let the user actively interrogate areas of Level 2 swath and two-dimensional plots they see on their screen. As expected, these observation and model data are quite voluminous, and bottlenecks in the system architecture can occur when the databases try to run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how to best store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints being stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and Automated Rotational Center Hurricane Eye Retrieval (ARCHER) tools. In this presentation, we will compare the enabling technologies we tested and discuss which ones we selected for integration into the TCIS' data analysis tool architecture. We will also show how these techniques have been automated to provide access to NRT data through our analysis tools.

  10. Mapping healthcare systems: a policy relevant analytic tool

    PubMed Central

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L.V.

    2017-01-01

    Background In the past decade, an international consensus on the value of well-functioning systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool, the University of California, San Francisco mapping tool (the Tool), which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Methods Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and amount spent through each source, purchasers, populations covered, provider categories, and the relationship between these entities. Results We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. Conclusions As part of the systems strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare system structural designs, using a common framework that fosters multi-country comparative analyses. PMID:28541518

  11. Whole-Genome Thermodynamic Analysis Reduces siRNA Off-Target Effects

    PubMed Central

    Chen, Xi; Liu, Peng; Chou, Hui-Hsien

    2013-01-01

    Small interfering RNAs (siRNAs) are important tools for knocking down targeted genes, and have been widely applied to biological and biomedical research. To design siRNAs, two important aspects must be considered: the potency in knocking down target genes and the off-target effect on any nontarget genes. Although many studies have produced useful tools to design potent siRNAs, off-target prevention has mostly been delegated to sequence-level alignment tools such as BLAST. We hypothesize that whole-genome thermodynamic analysis can identify potential off-targets with higher precision and help us avoid siRNAs that may have strong off-target effects. To validate this hypothesis, two siRNA sets were designed to target three human genes IDH1, ITPR2 and TRIM28. They were selected from the output of two popular siRNA design tools, siDirect and siDesign. Both siRNA design tools have incorporated sequence-level screening to avoid off-targets, thus their output is believed to be optimal. However, one of the sets we tested has off-target genes predicted by Picky, a whole-genome thermodynamic analysis tool. Picky can identify off-target genes that may hybridize to a siRNA within a user-specified melting temperature range. Our experiments validated that some off-target genes predicted by Picky can indeed be inhibited by siRNAs. Similar experiments were performed using commercially available siRNAs and a few off-target genes were also found to be inhibited as predicted by Picky. In summary, we demonstrate that whole-genome thermodynamic analysis can identify off-target genes that are missed in sequence-level screening. Because Picky prediction is deterministic according to thermodynamics, if a siRNA candidate has no Picky predicted off-targets, it is unlikely to cause off-target effects. Therefore, we recommend including Picky as an additional screening step in siRNA design. PMID:23484018
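
    A toy version of thermodynamics-flavored off-target screening helps make the idea concrete: scan a transcript for near-complementary sites and rank them by a crude duplex-stability proxy. This is emphatically not Picky's nearest-neighbor calculation; the Wallace-rule estimate and the 19-nt sequences below are illustrative assumptions.

    ```python
    # Toy off-target screen: find near-matching windows and rank by a crude Tm proxy.
    def wallace_tm(seq):
        # Wallace rule, a rough Tm estimate: 2*(A+T) + 4*(G+C), in degrees C
        return 2 * (seq.count("A") + seq.count("T")) + 4 * (seq.count("G") + seq.count("C"))

    def off_target_windows(target_site, transcript, max_mismatches=3):
        n = len(target_site)
        hits = []
        for i in range(len(transcript) - n + 1):
            window = transcript[i:i + n]
            mismatches = sum(a != b for a, b in zip(window, target_site))
            if 0 < mismatches <= max_mismatches:       # imperfect but close match
                matched = "".join(a for a, b in zip(window, target_site) if a == b)
                hits.append((i, mismatches, wallace_tm(matched)))
        return sorted(hits, key=lambda hit: -hit[2])

    site = "GCTGACCTGAAGTCTAACT"                 # hypothetical 19-nt siRNA target site
    transcript = "AAGCTGACCTGAAGACTAACTTTGGC"    # hypothetical off-target transcript
    print(off_target_windows(site, transcript))
    ```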

  12. Guiding Users to Sea Level Change Data Through Content

    NASA Astrophysics Data System (ADS)

    Quach, N.; Abercrombie, S. P.; Boening, C.; Brennan, H. P.; Gill, K. M.; Greguska, F. R., III; Huang, T.; Jackson, R.; Larour, E. Y.; Shaftel, H.; Tenenbaum, L. F.; Zlotnicki, V.; Boeck, A.; Moore, B.; Moore, J.

    2017-12-01

    The NASA Sea Level Change Portal (https://sealevel.nasa.gov) is an immersive and innovative web portal for sea level change research that addresses the needs of diverse audiences, from scientists across disparate disciplines to the general public to policy makers and businesses. Since sea level change research involves vast amounts of data from multiple fields, it becomes increasingly important to come up with novel and effective ways to guide users to the data they need. News articles published on the portal contain links to relevant data. The Missions section highlights missions and projects as well as provides a logical grouping of the data. Tools available on the portal, such as the Data Analysis Tool, a data visualization and high-performance environment for sea level analysis, and the Virtual Earth System Laboratory, a 3D simulation application, describe and link to the source data. With over 30K Facebook followers and over 23K Twitter followers, the portal outreach team also leverages social media to guide users to relevant data. This presentation focuses on how the portal uses news articles, mission and project pages, tools, and social media to connect users to the data.

  13. Real-Time Visualization of Network Behaviors for Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.

    Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.

  14. Operations and Modeling Analysis

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles

    2005-01-01

    The Reliability and Maintainability Analysis Tool (RMAT) provides NASA the capability to estimate reliability and maintainability (R&M) parameters and operational support requirements for proposed space vehicles based upon relationships established from both aircraft and Shuttle R&M data. RMAT has matured both in its underlying database and in its level of sophistication in extrapolating this historical data to satisfy proposed mission requirements, maintenance concepts and policies, and type of vehicle (i.e., ranging from aircraft-like to Shuttle-like). However, a companion analysis tool, the Logistics Cost Model (LCM), has not reached the same level of maturity as RMAT due, in large part, to nonexistent or outdated cost estimating relationships and underlying cost databases, and its almost exclusive dependence on Shuttle operations and logistics cost input parameters. As a result, the full capability of the RMAT/LCM suite of analysis tools to take a conceptual vehicle and derive its operations and support requirements along with the resulting operating and support costs has not been realized.

  15. Comparison of CATs, CURB-65 and PMEWS as triage tools in pandemic influenza admissions to UK hospitals: case control analysis using retrospective data.

    PubMed

    Myles, Puja R; Nguyen-Van-Tam, Jonathan S; Lim, Wei Shen; Nicholson, Karl G; Brett, Stephen J; Enstone, Joanne E; McMenamin, James; Openshaw, Peter J M; Read, Robert C; Taylor, Bruce L; Bannister, Barbara; Semple, Malcolm G

    2012-01-01

    Triage tools have an important role in pandemics to identify those most likely to benefit from higher levels of care. We compared Community Assessment Tools (CATs), the CURB-65 score, and the Pandemic Medical Early Warning Score (PMEWS) to predict higher levels of care (high dependency--Level 2 or intensive care--Level 3) and/or death in patients at or shortly after admission to hospital with A/H1N1 2009 pandemic influenza. This was a case-control analysis using retrospectively collected data from the FLU-CIN cohort (1040 adults, 480 children) with PCR-confirmed A/H1N1 2009 influenza. Area under the receiver operating characteristic curve (AUROC) values, sensitivity, specificity, positive predictive values and negative predictive values were calculated. CATs best predicted Level 2/3 admissions in both adults [AUROC (95% CI): CATs 0.77 (0.73, 0.80); CURB-65 0.68 (0.64, 0.72); PMEWS 0.68 (0.64, 0.73), p<0.001] and children [AUROC: CATs 0.74 (0.68, 0.80); CURB-65 0.52 (0.46, 0.59); PMEWS 0.69 (0.62, 0.75), p<0.001]. CURB-65 and CATs were similar in predicting death in adults, with both performing better than PMEWS, and CATs best predicted death in children. CATs were the best predictor of Level 2/3 care and/or death for both adults and children. CATs are potentially useful triage tools for predicting need for higher levels of care and/or mortality in patients of all ages.
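
    The discrimination metrics quoted above are straightforward to reproduce on toy data; the sketch below computes AUROC plus sensitivity and specificity at an arbitrary cut-off using scikit-learn, and does not use the FLU-CIN data.

    ```python
    # AUROC, sensitivity, and specificity for a toy triage score.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    outcome = rng.integers(0, 2, size=200)              # 1 = Level 2/3 care or death
    score = outcome * 1.5 + rng.normal(size=200)        # hypothetical triage score

    print("AUROC:", round(roc_auc_score(outcome, score), 3))

    cutoff = 1.0
    predicted = score >= cutoff
    sensitivity = (predicted & (outcome == 1)).sum() / (outcome == 1).sum()
    specificity = (~predicted & (outcome == 0)).sum() / (outcome == 0).sum()
    print("sensitivity:", round(sensitivity, 3), "specificity:", round(specificity, 3))
    ```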

  16. ATLAS (Automatic Tool for Local Assembly Structures) - A Comprehensive Infrastructure for Assembly, Annotation, and Genomic Binning of Metagenomic and Metatranscriptomic Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Richard A.; Brown, Joseph M.; Colby, Sean M.

    ATLAS (Automatic Tool for Local Assembly Structures) is a comprehensive multiomics data analysis pipeline that is massively parallel and scalable. ATLAS contains a modular analysis pipeline for assembly, annotation, quantification and genome binning of metagenomics and metatranscriptomics data and a framework for reference metaproteomic database construction. ATLAS transforms raw sequence data into functional and taxonomic data at the microbial population level and provides genome-centric resolution through genome binning. ATLAS provides robust taxonomy based on majority voting of protein-coding open reading frames rolled up at the contig level using modified lowest common ancestor (LCA) analysis. ATLAS is user-friendly, easy to install through Bioconda, maintained as open source on GitHub, and implemented in Snakemake for modular, customizable workflows.
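
    The contig-level "majority vote" idea can be sketched independently of ATLAS: each predicted ORF carries a lineage, and the contig keeps the deepest rank still supported by a majority of its ORFs. The lineages and threshold below are illustrative, not ATLAS's actual LCA implementation.

    ```python
    # Hedged sketch of majority-vote taxonomy for a contig from its ORF lineages.
    from collections import Counter

    def contig_taxonomy(orf_lineages, majority=0.5):
        consensus = []
        for rank in range(max(len(lin) for lin in orf_lineages)):
            names = Counter(lin[rank] for lin in orf_lineages if len(lin) > rank)
            name, count = names.most_common(1)[0]
            if count / len(orf_lineages) > majority:
                consensus.append(name)
            else:
                break                      # stop at the first unsupported rank
        return tuple(consensus)

    orfs = [("Bacteria", "Firmicutes", "Bacilli"),
            ("Bacteria", "Firmicutes", "Clostridia"),
            ("Bacteria", "Proteobacteria")]
    print(contig_taxonomy(orfs))   # ('Bacteria', 'Firmicutes')
    ```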

  17. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios

    The current status of food safety in the worldwide food supply has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and is used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total ‘failure’ that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to food safety management. Predictive modelling is the basis of exposure assessment, and the development of stochastic and kinetic models, which are available as web-based applications (e.g., COMBASE and Microbial Responses Viewer) or within user-friendly software (e.g., Seafood Spoilage Predictor), has advanced the use of information systems in food safety management. Such tools are updateable with new food-pathogen specific models containing cardinal parameters and multiple dependent variables, including plate counts, concentration of metabolic products, or even expression levels of certain genes. These tools may further serve as decision-support tools that assist in product logistics, based on the scientifically derived, “momentary” spoilage and safety level they express.

  18. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    NASA Astrophysics Data System (ADS)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of food safety in the worldwide food supply has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and is used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to food safety management. Predictive modelling is the basis of exposure assessment, and the development of stochastic and kinetic models, which are available as web-based applications (e.g., COMBASE and Microbial Responses Viewer) or within user-friendly software (e.g., Seafood Spoilage Predictor), has advanced the use of information systems in food safety management. Such tools are updateable with new food-pathogen specific models containing cardinal parameters and multiple dependent variables, including plate counts, concentration of metabolic products, or even expression levels of certain genes. These tools may further serve as decision-support tools that assist in product logistics, based on the scientifically derived, "momentary" spoilage and safety level they express.
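
    The FSO concept above is often summarized (e.g., by ICMSF) with the relation H0 - sum(R) + sum(I) <= FSO, where H0 is the initial hazard level and R and I are log reductions and increases along the chain; the snippet below is a minimal check of that inequality with hypothetical numbers, not a tool described in these records.

    ```python
    # Minimal check of a process design against a Food Safety Objective (FSO),
    # using the widely cited relation H0 - sum(R) + sum(I) <= FSO (log10 CFU/g).
    H0 = 2.0                   # initial hazard level
    reductions = [5.0]         # e.g., a thermal inactivation step
    increases = [0.5, 1.0]     # e.g., recontamination and growth during storage
    FSO = -2.0                 # maximum tolerable level at consumption

    level_at_consumption = H0 - sum(reductions) + sum(increases)
    print("level at consumption (log10 CFU/g):", level_at_consumption)
    print("meets FSO:", level_at_consumption <= FSO)
    ```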

  19. Realist Ontology and Natural Processes: A Semantic Tool to Analyze the Presentation of the Osmosis Concept in Science Texts

    ERIC Educational Resources Information Center

    Spinelli Barria, Michele; Morales, Cecilia; Merino, Cristian; Quiroz, Waldo

    2016-01-01

    In this work, we developed an ontological tool, based on the scientific realism of Mario Bunge, for the analysis of the presentation of natural processes in science textbooks. This tool was applied to analyze the presentation of the concept of osmosis in 16 chemistry and biology books at different educational levels. The results showed that more…

  20. Development of Advanced Modeling Tools for Hotspot Analysis of Transportation Emissions

    DOT National Transportation Integrated Search

    2009-07-29

    Hot-spot analysis, also known as project-level analysis, assesses impacts of transportation emissions on local air pollution of carbon monoxide (CO), air toxics and particulate matter (PM). It is required for regional transportation plans (RTP), tran...

  1. HydroClimATe: hydrologic and climatic analysis toolkit

    USGS Publications Warehouse

    Dickinson, Jesse; Hanson, Randall T.; Predmore, Steven K.

    2014-01-01

    The potential consequences of climate variability and climate change have been identified as major issues for the sustainability and availability of the worldwide water resources. Unlike global climate change, climate variability represents deviations from the long-term state of the climate over periods of a few years to several decades. Currently, rich hydrologic time-series data are available, but the combination of data preparation and statistical methods developed by the U.S. Geological Survey as part of the Groundwater Resources Program is relatively unavailable to hydrologists and engineers who could benefit from estimates of climate variability and its effects on periodic recharge and water-resource availability. This report documents HydroClimATe, a computer program for assessing the relations between variable climatic and hydrologic time-series data. HydroClimATe was developed for a Windows operating system. The software includes statistical tools for (1) time-series preprocessing, (2) spectral analysis, (3) spatial and temporal analysis, (4) correlation analysis, and (5) projections. The time-series preprocessing tools include spline fitting, standardization using a normal or gamma distribution, and transformation by a cumulative departure. The spectral analysis tools include discrete Fourier transform, maximum entropy method, and singular spectrum analysis. The spatial and temporal analysis tool is empirical orthogonal function analysis. The correlation analysis tools are linear regression and lag correlation. The projection tools include autoregressive time-series modeling and generation of many realizations. These tools are demonstrated in four examples that use stream-flow discharge data, groundwater-level records, gridded time series of precipitation data, and the Multivariate ENSO Index.
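
    Two of the steps listed above, standardization and discrete Fourier analysis, are easy to illustrate on a synthetic record; the sketch below is a generic example with numpy, not HydroClimATe's code, and the series is invented.

    ```python
    # Standardize a toy hydrologic series and locate its dominant spectral period.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(600)                                                  # 50 years, monthly
    series = 2 * np.sin(2 * np.pi * t / 72) + rng.normal(size=t.size)   # ~6-yr cycle + noise

    z = (series - series.mean()) / series.std()                  # standardization
    power = np.abs(np.fft.rfft(z)) ** 2                          # DFT power spectrum
    freq = np.fft.rfftfreq(z.size, d=1.0)                        # cycles per month

    dominant = freq[np.argmax(power[1:]) + 1]                    # skip the zero frequency
    print("dominant period (months):", round(1 / dominant, 1))
    ```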

  2. Work with Us | Energy Analysis | NREL

    Science.gov Websites

  3. A methodology and decision support tool for informing state-level bioenergy policymaking: New Jersey biofuels as a case study

    NASA Astrophysics Data System (ADS)

    Brennan-Tonetta, Margaret

    This dissertation seeks to provide key information and a decision support tool that states can use to support long-term goals of fossil fuel displacement and greenhouse gas reductions. The research yields three outcomes: (1) A methodology that allows for a comprehensive and consistent inventory and assessment of bioenergy feedstocks in terms of type, quantity, and energy potential. Development of a standardized methodology for consistent inventorying of biomass resources fosters research and business development of promising technologies that are compatible with the state's biomass resource base. (2) A unique interactive decision support tool that allows for systematic bioenergy analysis and evaluation of policy alternatives through the generation of biomass inventory and energy potential data for a wide variety of feedstocks and applicable technologies, using New Jersey as a case study. Development of a database that can assess the major components of a bioenergy system in one tool allows for easy evaluation of technology, feedstock and policy options. The methodology and decision support tool is applicable to other states and regions (with location specific modifications), thus contributing to the achievement of state and federal goals of renewable energy utilization. (3) Development of policy recommendations based on the results of the decision support tool that will help to guide New Jersey into a sustainable renewable energy future. The database developed in this research represents the first ever assessment of bioenergy potential for New Jersey. It can serve as a foundation for future research and modifications that could increase its power as a more robust policy analysis tool. As such, the current database is not able to perform analysis of tradeoffs across broad policy objectives such as economic development vs. CO2 emissions, or energy independence vs. source reduction of solid waste. Instead, it operates one level below that with comparisons of kWh or GGE generated by different feedstock/technology combinations at the state and county level. Modification of the model to incorporate factors that will enable the analysis of broader energy policy issues as those mentioned above, are recommended for future research efforts.

  4. Coupled rotor/airframe vibration analysis

    NASA Technical Reports Server (NTRS)

    Sopher, R.; Studwell, R. E.; Cassarino, S.; Kottapalli, S. B. R.

    1982-01-01

A coupled rotor/airframe vibration analysis, developed as a design tool for predicting helicopter vibrations and as a research tool to quantify the effects of structural properties, aerodynamic interactions, and vibration reduction devices on vehicle vibration levels, is described. The analysis consists of a base program utilizing an impedance matching technique to represent the coupled rotor/airframe dynamics of the system, supported by inputs from several external programs that supply sophisticated rotor and airframe aerodynamic and structural dynamic representations. The theoretical background, computer program capabilities, and limited correlation results are presented in this report. Correlation with scale-model wind tunnel data shows that the analysis can adequately predict trends of vibration variation with airspeed and higher harmonic control effects. Predictions of absolute vibration levels were found to be very sensitive to modal characteristics, and results were not representative of measured values.

  5. The Blue DRAGON--a system for monitoring the kinematics and the dynamics of endoscopic tools in minimally invasive surgery for objective laparoscopic skill assessment.

    PubMed

    Rosen, Jacob; Brown, Jeffrey D; Barreca, Marco; Chang, Lily; Hannaford, Blake; Sinanan, Mika

    2002-01-01

Minimally invasive surgery (MIS) involves a multi-dimensional series of tasks requiring a synthesis between visual information and the kinematics and dynamics of the surgical tools. Analysis of these sources of information is a key step in mastering MIS surgery but may also be used to define objective criteria for characterizing surgical performance. The BlueDRAGON is a new system for acquiring the kinematics and the dynamics of two endoscopic tools along with the visual view of the surgical scene. It includes two four-bar mechanisms equipped with position and force/torque sensors for measuring the positions and orientations (P/O) of two endoscopic tools along with the forces and torques (F/T) applied by the surgeon's hands. The methodology of decomposing the surgical task is based on a fully connected, finite-state (28 states) Markov model in which each state corresponds to a fundamental tool/tissue interaction based on the tool kinematics and associated with unique F/T signatures. The experimental protocol included seven MIS tasks performed on an animal model (pig) by 30 surgeons at different levels of their residency training. Preliminary analysis of these data showed that the major differences between residents at different skill levels were: (i) the types of tool/tissue interactions being used, (ii) the transitions between tool/tissue interactions being applied by each hand, (iii) the time spent performing each tool/tissue interaction, (iv) the overall completion time, and (v) the variable F/T magnitudes being applied by the subjects through the endoscopic tools. Systems like surgical robots or virtual reality simulators that inherently measure the kinematics and the dynamics of the surgical tool may benefit from inclusion of the proposed methodology for analysis of efficacy and objective evaluation of surgical skills during training.
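
    The state-decomposition idea above lends itself to a short illustration: given observed sequences of tool/tissue interaction states, a Markov transition matrix can be estimated by counting transitions and normalizing each row. The sketch below uses a hypothetical three-state alphabet and made-up sequences, not the paper's 28-state model or data:

        import numpy as np

        # Hypothetical tool/tissue interaction states (the paper uses 28 states).
        states = ["idle", "grasp", "dissect"]
        index = {s: i for i, s in enumerate(states)}

        # Example observed state sequences for one hand (illustrative only).
        sequences = [
            ["idle", "grasp", "grasp", "dissect", "idle"],
            ["idle", "grasp", "dissect", "dissect", "idle"],
        ]

        # Count transitions and normalize rows to obtain transition probabilities.
        counts = np.zeros((len(states), len(states)))
        for seq in sequences:
            for a, b in zip(seq, seq[1:]):
                counts[index[a], index[b]] += 1
        row_sums = counts.sum(axis=1, keepdims=True)
        transition = np.divide(counts, row_sums, out=np.zeros_like(counts),
                               where=row_sums > 0)
        print(transition)

    Comparing such matrices across residency levels is one way to quantify the differences in transition behavior noted in item (ii) above.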

  6. NCLOS program 2010 update.

    DOT National Transportation Integrated Search

    2013-06-01

The North Carolina Level of Service (NCLOS) program is a planning-level highway capacity analysis tool developed for NCDOT under a previous project. The program uses the operational methodologies in the 2010 Highway Capacity Manual (HCM), along w...

  7. SpirPro: A Spirulina proteome database and web-based tools for the analysis of protein-protein interactions at the metabolic level in Spirulina (Arthrospira) platensis C1.

    PubMed

    Senachak, Jittisak; Cheevadhanarak, Supapon; Hongsthong, Apiradee

    2015-07-29

Spirulina (Arthrospira) platensis is the only cyanobacterium that, in addition to being studied at the molecular level and subjected to gene manipulation, can also be mass cultivated in outdoor ponds for commercial use as a food supplement. Thus, encountering environmental changes, including temperature stresses, is common during the mass production of Spirulina. The use of cyanobacteria as an experimental platform, especially for photosynthetic gene manipulation in plants and bacteria, is becoming increasingly important. Understanding the mechanisms and protein-protein interaction networks that underlie low- and high-temperature responses is relevant to Spirulina mass production. To accomplish this goal, high-throughput techniques such as OMICs analyses are used. Thus, large datasets must be collected, managed and subjected to information extraction. Therefore, databases including (i) proteomic analysis and protein-protein interaction (PPI) data and (ii) domain/motif visualization tools are required for potential use in temperature response models for plant chloroplasts and photosynthetic bacteria. A web-based repository was developed including an embedded database, SpirPro, and tools for network visualization. Proteome data were analyzed and integrated with protein-protein interactions and/or metabolic pathways from KEGG. The repository provides information ranging from raw data (2D-gel images) to associated results, such as data from interaction and/or pathway analyses. This integration allows in silico analyses of protein-protein interactions affected at the metabolic level and, particularly, analyses of interactions between and within the affected metabolic pathways under temperature stresses for comparative proteomic analysis. The developed tool, which is coded in HTML with CSS/JavaScript and depicted in Scalable Vector Graphics (SVG), is designed for interactive analysis and exploration of the constructed network. SpirPro is publicly available on the web at http://spirpro.sbi.kmutt.ac.th . SpirPro is an analysis platform containing an integrated proteome and PPI database that provides the most comprehensive data on this cyanobacterium at the systematic level. As an integrated database, SpirPro can be applied in various analyses, such as temperature stress response network analysis in cyanobacterial models and interacting domain-domain analysis between proteins of interest.

  8. Public Domain Generic Tools: An Overview.

    ERIC Educational Resources Information Center

    Erjavec, Tomaz

    This paper presents an introduction to language engineering software, especially for computerized language and text corpora. The focus of the paper is on small and relatively independent pieces of software designed for specific, often low-level language analysis tasks, and on tools in the public domain. Discussion begins with the application of…

  9. Twenty-four hour peaking relationship to level of service and other measures of effectiveness.

    DOT National Transportation Integrated Search

    2015-06-01

Transportation planners and traffic engineers are increasingly interested in traffic analysis tools that analyze demand profiles and performance that go beyond analysis of the traditional peak hours and extend the analysis to other hours of the d...

  10. Exploratory Climate Data Visualization and Analysis Using DV3D and UVCDAT

    NASA Technical Reports Server (NTRS)

    Maxwell, Thomas

    2012-01-01

Earth system scientists are being inundated by an explosion of data generated by ever-increasing resolution in both global models and remote sensors. Advanced tools for accessing, analyzing, and visualizing very large and complex climate data are required to maintain rapid progress in Earth system research. To meet this need, NASA, in collaboration with the Ultra-scale Visualization Climate Data Analysis Tools (UVCDAT) consortium, is developing exploratory climate data analysis and visualization tools which provide data analysis capabilities for the Earth System Grid (ESG). This paper describes DV3D, a UVCDAT package that enables exploratory analysis of climate simulation and observation datasets. DV3D provides user-friendly interfaces for visualization and analysis of climate data at a level appropriate for scientists. It features workflow interfaces, interactive 4D data exploration, hyperwall and stereo visualization, automated provenance generation, and parallel task execution. DV3D's integration with CDAT's climate data management system (CDMS) and other climate data analysis tools provides a wide range of high performance climate data analysis operations. DV3D expands the scientists' toolbox by incorporating a suite of rich new exploratory visualization and analysis methods for addressing the complexity of climate datasets.

  11. Advantages of Integrative Data Analysis for Developmental Research

    ERIC Educational Resources Information Center

    Bainter, Sierra A.; Curran, Patrick J.

    2015-01-01

    Amid recent progress in cognitive development research, high-quality data resources are accumulating, and data sharing and secondary data analysis are becoming increasingly valuable tools. Integrative data analysis (IDA) is an exciting analytical framework that can enhance secondary data analysis in powerful ways. IDA pools item-level data across…

  12. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a means to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem chosen for the evaluation is a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by being able to rapidly identify the design input variables whose variability has the most influence on response output parameters.
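
    The Monte Carlo idea described above — sample scattered inputs, evaluate the response, and rank which inputs drive the output — can be sketched without a finite element solver. Everything below (the stand-in response function, variable names, and scatter ranges) is illustrative and is not the MSC.Robust Design API:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 5000

        # Scattered design inputs (illustrative: panel thickness, modulus, load).
        thickness = rng.normal(2.0, 0.05, n)   # mm
        modulus = rng.normal(70e3, 2e3, n)     # MPa
        load = rng.normal(1.0, 0.1, n)         # kN

        # Stand-in response: notional panel deflection (not an FEA call).
        deflection = load / (modulus * thickness**3)

        # Rank inputs by the magnitude of their correlation with the response.
        inputs = {"thickness": thickness, "modulus": modulus, "load": load}
        ranking = sorted(
            ((name, abs(np.corrcoef(x, deflection)[0, 1])) for name, x in inputs.items()),
            key=lambda item: item[1], reverse=True)
        for name, r in ranking:
            print(f"{name}: |correlation| = {r:.2f}")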

  13. Surging Seas Risk Finder: A Simple Search-Based Web Tool for Local Sea Level Rise Projections, Coastal Flood Risk Forecasts, and Inundation Exposure Analysis

    NASA Astrophysics Data System (ADS)

    Strauss, B.; Dodson, D.; Kulp, S. A.; Rizza, D. H.

    2016-12-01

    Surging Seas Risk Finder (riskfinder.org) is an online tool for accessing extensive local projections and analysis of sea level rise; coastal floods; and land, populations, contamination sources, and infrastructure and other assets that may be exposed to inundation. Risk Finder was first published in 2013 for Florida, New York and New Jersey, expanding to all states in the contiguous U.S. by 2016, when a major new version of the tool was released with a completely new interface. The revised tool was informed by hundreds of survey responses from and conversations with planners, local officials and other coastal stakeholders, plus consideration of modern best practices for responsive web design and user interfaces, and social science-based principles for science communication. Overarching design principles include simplicity and ease of navigation, leading to a landing page with Google-like sparsity and focus on search, and to an architecture based on search, so that each coastal zip code, city, county, state or other place type has its own webpage gathering all relevant analysis in modular, scrollable units. Millions of users have visited the Surging Seas suite of tools to date, and downloaded thousands of files, for stated purposes ranging from planning to business to education to personal decisions; and from institutions ranging from local to federal government agencies, to businesses, to NGOs, and to academia.

  14. Mapping healthcare systems: a policy relevant analytic tool.

    PubMed

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L V

    2017-07-01

    In the past decade, an international consensus on the value of well-functioning systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool - the University of California, San Francisco mapping tool (the Tool) - which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and amount spent through each source, purchasers, populations covered, provider categories; and the relationship between these entities. We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. As part of the systems strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare systems structural designs, using a common framework that fosters multi-country comparative analyses. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  15. On analyzing free-response data on location level

    NASA Astrophysics Data System (ADS)

    Bandos, Andriy I.; Obuchowski, Nancy A.

    2017-03-01

Free-response ROC (FROC) data are typically collected when the primary question of interest is focused on the proportion of correct detection-localizations of known targets and the frequency of false-positive responses, which can be multiple per subject (image). These studies are particularly relevant for CAD and related applications. The fundamental tool of location-level FROC analysis is the FROC curve. Although there are many methods of FROC analysis, as we describe in this work, some of the standard and popular approaches, while important, are not suitable for analyzing specifically the location-level FROC performance as summarized by the FROC curve. Analysis of the FROC curve, on the other hand, might not be straightforward. Recently, we developed an approach for location-level analysis of FROC data using the well-known tools for clustered ROC analysis. In the current work, based on previously developed concepts and using specific examples, we demonstrate the key reasons why location-level FROC performance cannot be fully addressed by the common approaches, and we illustrate the proposed solution. Specifically, we consider the two most salient FROC approaches, namely JAFROC and the area under the exponentially transformed FROC curve (AFE), and show that clearly superior FROC curves can have lower values for these indices. We describe the specific features that make these approaches inconsistent with FROC curves. This work illustrates some caveats of using the common approaches for location-level FROC analysis and provides guidelines for the appropriate assessment or comparison of FROC systems.
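
    As a concrete reference point for the discussion above, an empirical FROC curve plots the lesion localization fraction (LLF) against the mean number of false positives per image (NLF) as a confidence threshold is swept. The sketch below computes these operating points from hypothetical mark ratings; it illustrates the FROC curve itself, not JAFROC or the AFE index:

        import numpy as np

        # Hypothetical mark ratings (higher = more suspicious), pooled over images.
        tp_ratings = np.array([0.9, 0.8, 0.75, 0.6, 0.4])        # correctly localized lesions
        fp_ratings = np.array([0.85, 0.7, 0.5, 0.45, 0.3, 0.2])  # false-positive marks
        n_lesions = 8     # total lesions in the image set
        n_images = 5      # total images read

        # Sweep thresholds over all observed ratings (descending).
        thresholds = np.sort(np.concatenate([tp_ratings, fp_ratings]))[::-1]
        for thr in thresholds:
            llf = (tp_ratings >= thr).sum() / n_lesions   # lesion localization fraction
            nlf = (fp_ratings >= thr).sum() / n_images    # false positives per image
            print(f"threshold {thr:.2f}: NLF = {nlf:.2f}, LLF = {llf:.2f}")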

  16. Application of Risk Assessment Tools in the Continuous Risk Management (CRM) Process

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    2002-01-01

Marshall Space Flight Center (MSFC) of the National Aeronautics and Space Administration (NASA) is currently implementing the Continuous Risk Management (CRM) Program developed by Carnegie Mellon University and recommended by NASA as the Risk Management (RM) implementation approach. The four most frequently used risk assessment tools at the center are: (a) Failure Modes and Effects Analysis (FMEA), (b) Hazard Analysis (HA), (c) Fault Tree Analysis (FTA), and (d) Probabilistic Risk Analysis (PRA). There are some guidelines for selecting the type of risk assessment tools during the formulation phase of a project, but there is not enough guidance on how to apply these tools in the CRM process. Yet the way safety and risk assessment tools are used makes a significant difference in the effectiveness of the risk management function. Decisions regarding what events are to be included in the analysis and to what level of detail the analysis should be continued make a significant difference in the effectiveness of a risk management program. The choice of risk analysis tools also depends on the phase of a project; for example, at the initial phase of a project, when not much data are available on hardware, a standard FMEA cannot be applied; instead, a functional FMEA may be appropriate. This study attempted to provide some directives to alleviate the difficulty in applying FTA, PRA, and FMEA in the CRM process. Hazard Analysis was not included in the scope of the study due to the short duration of the summer research project.

  17. Multivariate Classification of Original and Fake Perfumes by Ion Analysis and Ethanol Content.

    PubMed

    Gomes, Clêrton L; de Lima, Ari Clecius A; Loiola, Adonay R; da Silva, Abel B R; Cândido, Manuela C L; Nascimento, Ronaldo F

    2016-07-01

    The increased marketing of fake perfumes has encouraged us to investigate how to identify such products by their chemical characteristics and multivariate analysis. The aim of this study was to present an alternative approach to distinguish original from fake perfumes by means of the investigation of sodium, potassium, chloride ions, and ethanol contents by chemometric tools. For this, 50 perfumes were used (25 original and 25 counterfeit) for the analysis of ions (ion chromatography) and ethanol (gas chromatography). The results demonstrated that the fake perfume had low levels of ethanol and high levels of chloride compared to the original product. The data were treated by chemometric tools such as principal component analysis and linear discriminant analysis. This study proved that the analysis of ethanol is an effective method of distinguishing original from the fake products, and it may potentially be used to assist legal authorities in such cases. © 2016 American Academy of Forensic Sciences.
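
    The chemometric workflow described above (PCA for exploration, LDA for classification) can be sketched with scikit-learn. The feature matrix below — sodium, potassium, chloride, and ethanol measurements — is synthetic and only stands in for the paper's 50 perfume samples:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(1)

        # Synthetic measurements: columns are Na+, K+, Cl- (mg/L) and ethanol (% v/v).
        originals = rng.normal([5, 3, 10, 80], [1, 1, 3, 4], size=(25, 4))
        fakes = rng.normal([6, 3, 40, 60], [1, 1, 8, 6], size=(25, 4))  # more Cl-, less ethanol
        X = np.vstack([originals, fakes])
        y = np.array([0] * 25 + [1] * 25)   # 0 = original, 1 = counterfeit

        Xs = StandardScaler().fit_transform(X)

        # Exploratory PCA: how much variance do two components capture?
        pca = PCA(n_components=2).fit(Xs)
        print("PCA explained variance:", pca.explained_variance_ratio_)

        # Supervised LDA classification accuracy on the training set (illustration only).
        lda = LinearDiscriminantAnalysis().fit(Xs, y)
        print("LDA training accuracy:", lda.score(Xs, y))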

  18. Towards Context-Aware and User-Centered Analysis in Assistive Environments: A Methodology and a Software Tool.

    PubMed

    Fontecha, Jesús; Hervás, Ramón; Mondéjar, Tania; González, Iván; Bravo, José

    2015-10-01

One of the main challenges in Ambient Assisted Living (AAL) is to reach an appropriate level of acceptance of assistive systems, as well as to analyze and monitor end-user tasks in a feasible and efficient way. The development and evaluation of AAL solutions from a user-centered perspective help to achieve these goals. In this work, we have designed a methodology to integrate and develop analytic, user-centered tools into assistive systems. An analysis software tool gathers information about end users from adapted psychological questionnaires and naturalistic observation of their own context. The aim is to enable an in-depth analysis focused on improving the quality of life of elderly people and their caregivers.

  19. TH-C-18A-08: A Management Tool for CT Dose Monitoring, Analysis, and Protocol Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, J; Chan, F; Newman, B

    2014-06-15

Purpose: To develop a customizable tool for enterprise-wide management of CT protocols and analysis of radiation dose information from CT exams for a variety of quality control applications. Methods: All clinical CT protocols implemented on the 11 CT scanners at our institution were extracted in digital format. The original protocols had been preset by our CT management team. A commercial CT dose tracking software (DoseWatch, GE Healthcare, WI) was used to collect exam information (exam date, patient age, etc.), scanning parameters, and radiation doses for all CT exams. We developed a Matlab-based program (MathWorks, MA) with a graphical user interface which allows the user to analyze the scanning protocols together with the actual dose estimates, and to compare the data to national (ACR, AAPM) and internal reference values for CT quality control. Results: The CT protocol review portion of our tool allows the user to look up the scanning and image reconstruction parameters of any protocol on any of the installed CT systems, among about 120 protocols per scanner. In the dose analysis tool, dose information of all CT exams (from 05/2013 to 02/2014) was stratified at the protocol level, and within a protocol down to the series level, i.e., each individual exposure event. This allows numerical and graphical review of dose information for any combination of scanner models, protocols, and series. The key functions of the tool include: statistics of CTDI, DLP, and SSDE; dose monitoring using user-set CTDI/DLP/SSDE thresholds; look-up of any CT exam dose data; and CT protocol review. Conclusion: Our in-house CT management tool provides radiologists, technologists, and administration first-hand, near real-time, enterprise-wide knowledge of CT dose levels for different exam types. Medical physicists use this tool to manage CT protocols and to compare and optimize dose levels across different scanner models. It provides technologists feedback on CT scanning operation, and knowledge of important dose baselines and thresholds.
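
    A minimal version of the dose-monitoring logic described above — stratify exam dose records by protocol and flag exposures above a user-set threshold — can be sketched with pandas. The column names and threshold values here are hypothetical and are not those of DoseWatch or the authors' MATLAB tool:

        import pandas as pd

        # Hypothetical exam-level dose records (columns are illustrative).
        exams = pd.DataFrame({
            "protocol": ["Head", "Head", "Chest", "Chest", "Abdomen"],
            "ctdi_vol": [55.0, 78.0, 10.5, 14.2, 22.0],   # mGy
            "dlp": [900.0, 1300.0, 420.0, 510.0, 880.0],  # mGy*cm
        })

        # User-set CTDIvol thresholds per protocol (illustrative values).
        thresholds = {"Head": 75.0, "Chest": 15.0, "Abdomen": 25.0}

        # Summarize dose statistics per protocol and flag exams above threshold.
        summary = exams.groupby("protocol")[["ctdi_vol", "dlp"]].agg(["median", "max"])
        exams["over_threshold"] = exams.apply(
            lambda row: row["ctdi_vol"] > thresholds[row["protocol"]], axis=1)

        print(summary)
        print(exams[exams["over_threshold"]])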

  20. ampliMethProfiler: a pipeline for the analysis of CpG methylation profiles of targeted deep bisulfite sequenced amplicons.

    PubMed

    Scala, Giovanni; Affinito, Ornella; Palumbo, Domenico; Florio, Ermanno; Monticelli, Antonella; Miele, Gennaro; Chiariotti, Lorenzo; Cocozza, Sergio

    2016-11-25

CpG sites in an individual molecule may exist in a binary state (methylated or unmethylated), and each individual DNA molecule, containing a certain number of CpGs, is a combination of these states defining an epihaplotype. Classic quantification-based approaches to studying DNA methylation are intrinsically unable to fully represent the complexity of the underlying methylation substrate. Epihaplotype-based approaches, on the other hand, allow methylation profiles of cell populations to be studied at the single-molecule level. For such investigations, next-generation sequencing techniques can be used, both for quantitative and for epihaplotype analysis. Currently available tools for methylation analysis lack output formats that explicitly report CpG methylation profiles at the single-molecule level, and they lack suitable statistical tools for their interpretation. Here we present ampliMethProfiler, a python-based pipeline for the extraction and statistical epihaplotype analysis of amplicons from targeted deep bisulfite sequencing of multiple DNA regions. The ampliMethProfiler tool provides an easy and user-friendly way to extract and analyze the epihaplotype composition of reads from targeted bisulfite sequencing experiments. ampliMethProfiler is written in the python language and requires a local installation of BLAST and (optionally) QIIME tools. It can be run on Linux and OS X platforms. The software is open source and freely available at http://amplimethprofiler.sourceforge.net .
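
    The epihaplotype idea above — each sequenced molecule is a string of binary CpG states — can be illustrated in a few lines of Python. The reads below are made up, and the counting is far simpler than what ampliMethProfiler actually does with aligned bisulfite reads; it only shows why molecule-level profiles carry more information than per-CpG averages:

        from collections import Counter

        # Hypothetical per-molecule CpG states: 1 = methylated, 0 = unmethylated.
        reads = ["1101", "1101", "0000", "1001", "1101", "0000"]

        # Epihaplotype profile: frequency of each distinct methylation pattern.
        profile = Counter(reads)
        total = sum(profile.values())
        for epihaplotype, count in profile.most_common():
            print(f"{epihaplotype}: {count / total:.2f}")

        # Classic quantification collapses this to per-CpG methylation levels,
        # losing the molecule-level combinations counted above.
        per_cpg = [sum(int(r[i]) for r in reads) / len(reads) for i in range(4)]
        print("per-CpG methylation:", per_cpg)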

  1. Rich Language Analysis for Counterterrorism

    NASA Astrophysics Data System (ADS)

    Guidère, Mathieu; Howard, Newton; Argamon, Shlomo

    Accurate and relevant intelligence is critical for effective counterterrorism. Too much irrelevant information is as bad or worse than not enough information. Modern computational tools promise to provide better search and summarization capabilities to help analysts filter and select relevant and key information. However, to do this task effectively, such tools must have access to levels of meaning beyond the literal. Terrorists operating in context-rich cultures like fundamentalist Islam use messages with multiple levels of interpretation, which are easily misunderstood by non-insiders. This chapter discusses several kinds of such “encryption” used by terrorists and insurgents in the Arabic language, and how knowledge of such methods can be used to enhance computational text analysis techniques for use in counterterrorism.

  2. PyCoTools: A Python Toolbox for COPASI.

    PubMed

    Welsh, Ciaran M; Fullard, Nicola; Proctor, Carole J; Martinez-Guimera, Alvaro; Isfort, Robert J; Bascom, Charles C; Tasseff, Ryan; Przyborski, Stefan A; Shanley, Daryl P

    2018-05-22

COPASI is an open source software package for constructing, simulating and analysing dynamic models of biochemical networks. COPASI is primarily intended to be used with a graphical user interface, but it is often desirable to be able to access COPASI features programmatically through a high-level interface. PyCoTools is a Python package aimed at providing a high-level interface to COPASI tasks with an emphasis on model calibration. PyCoTools enables the construction of COPASI models and the execution of a subset of COPASI tasks including time courses, parameter scans and parameter estimations. Additional 'composite' tasks which use COPASI tasks as building blocks are available for increasing parameter estimation throughput, performing identifiability analysis and performing model selection. PyCoTools supports exploratory data analysis on parameter estimation data to assist with troubleshooting model calibrations. We demonstrate PyCoTools by posing a model selection problem designed to showcase PyCoTools within a realistic scenario. The aim of the model selection problem is to test the feasibility of three alternative hypotheses in explaining experimental data derived from neonatal dermal fibroblasts in response to TGF-β over time. PyCoTools is used to critically analyse the parameter estimations and propose strategies for model improvement. PyCoTools can be downloaded from the Python Package Index (PyPI) using the command 'pip install pycotools' or directly from GitHub (https://github.com/CiaranWelsh/pycotools). Documentation at http://pycotools.readthedocs.io. Supplementary data are available at Bioinformatics.

  3. PySCeSToolbox: a collection of metabolic pathway analysis tools.

    PubMed

    Christensen, Carl D; Hofmeyr, Jan-Hendrik S; Rohwer, Johann M

    2018-01-01

    PySCeSToolbox is an extension to the Python Simulator for Cellular Systems (PySCeS) that includes tools for performing generalized supply-demand analysis, symbolic metabolic control analysis, and a framework for investigating the kinetic and thermodynamic aspects of enzyme-catalyzed reactions. Each tool addresses a different aspect of metabolic behaviour, control, and regulation; the tools complement each other and can be used in conjunction to better understand higher level system behaviour. PySCeSToolbox is available on Linux, Mac OS X and Windows. It is licensed under the BSD 3-clause licence. Code, setup instructions and a link to documentation can be found at https://github.com/PySCeS/PyscesToolbox. jr@sun.ac.za. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  4. Analysis of laser therapy and assessment methods in the rehabilitation of temporomandibular disorder: a systematic review of the literature

    PubMed Central

    Herpich, Carolina Marciela; Amaral, Ana Paula; Leal-Junior, Ernesto Cesar Pinto; Tosato, Juliana de Paiva; Gomes, Cid Andre Fidelis de Paula; Arruda, Éric Edmur Camargo; Glória, Igor Phillip dos Santos; Garcia, Marilia Barbosa Santos; Barbosa, Bruno Roberto Borges; Rodrigues, Monique Sampaio; Silva, Katiane Lima; El Hage, Yasmin; Politti, Fabiano; Gonzalez, Tabajara de Oliveira; Bussadori, Sandra Kalil; Biasotto-Gonzalez, Daniela Aparecida

    2015-01-01

    The aim of the present study was to perform a systematic review of the literature on the effects of low-level laser therapy in the treatment of TMD, and to analyze the use of different assessment tools. [Subjects and Methods] Searches were carried out of the BIREME, MEDLINE, PubMed and SciELO electronic databases by two independent researchers for papers published in English and Portuguese using the terms: “temporomandibular joint laser therapy” and “TMJ laser treatment”. [Results] Following the application of the eligibility criteria, 11 papers were selected for in-depth analysis. The papers analyzed exhibited considerable methodological differences, especially with regard to the number of sessions, anatomic site and duration of low-level laser therapy irradiation, as well as irradiation parameters, diagnostic criteria and assessment tools. [Conclusion] Further studies are needed, especially randomized clinical trials, to establish the exact dose and ideal parameters for low-level laser therapy and define the best assessment tools in this promising field of research that may benefit individuals with signs and symptoms of TMD. PMID:25642095

  5. A Tool for Estimating Variability in Wood Preservative Treatment Retention

    Treesearch

    Patricia K. Lebow; Adam M. Taylor; Timothy M. Young

    2015-01-01

    Composite sampling is standard practice for evaluation of preservative retention levels in preservative-treated wood. Current protocols provide an average retention value but no estimate of uncertainty. Here we describe a statistical method for calculating uncertainty estimates using the standard sampling regime with minimal additional chemical analysis. This tool can...

  6. Tools for Knowledge Analysis, Synthesis, and Sharing

    ERIC Educational Resources Information Center

    Medland, Michael B.

    2007-01-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own…

  7. Numerical Propulsion System Simulation: A Common Tool for Aerospace Propulsion Being Developed

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia G.

    2001-01-01

    The NASA Glenn Research Center is developing an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). This simulation is initially being used to support aeropropulsion in the analysis and design of aircraft engines. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the Aviation Safety Program and Advanced Space Transportation. NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes using the Common Object Request Broker Architecture (CORBA) in the NPSS Developer's Kit to facilitate collaborative engineering. The NPSS Developer's Kit will provide the tools to develop custom components and to use the CORBA capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities will extend NPSS from a zero-dimensional simulation tool to a multifidelity, multidiscipline system-level simulation tool for the full life cycle of an engine.

  8. Teaching Content Analysis through "Harry Potter"

    ERIC Educational Resources Information Center

    Messinger, Adam M.

    2012-01-01

    Content analysis is a valuable research tool for social scientists that unfortunately can prove challenging to teach to undergraduate students. Published classroom exercises designed to teach content analysis have thus far been predominantly envisioned as lengthy projects for upper-level courses. A brief and engaging exercise may be more…

  9. Effects of a Network-Centric Multi-Modal Communication Tool on a Communication Monitoring Task

    DTIC Science & Technology

    2012-03-01

replaced (Nelson, Bolia, Vidulich, & Langhorne, 2004). Communication will continue to be the central tool for Command and Control (C2) operators. However...Nelson, Bolia, Vidulich, & Langhorne, 2004). The two highest ratings for most potential technologies were data capture/replay tools and chat...analysis of variance (ANOVA). A significant main effect was found for Difficulty, F(1, 13) = 21.11, p < .05; the overall level of detections was

  10. SECIMTools: a suite of metabolomics data analysis tools.

    PubMed

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of identified features. Galaxy is an open-access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares - discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
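
    As an illustration of the kind of building block listed above (not the SECIMTools command-line interface itself), the sketch below runs a feature-wise Kruskal-Wallis test and a PCA on a small synthetic feature-by-sample matrix:

        import numpy as np
        from scipy.stats import kruskal
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(7)

        # Synthetic metabolomics matrix: 20 features x 12 samples, two groups of 6.
        group_a = rng.normal(10, 1, size=(20, 6))
        group_b = rng.normal(10, 1, size=(20, 6))
        group_b[:3] += 2.5                  # three differential features
        data = np.hstack([group_a, group_b])

        # Feature-wise Kruskal-Wallis test between the two groups.
        pvals = np.array([kruskal(data[i, :6], data[i, 6:]).pvalue for i in range(20)])
        print("features with p < 0.05:", np.where(pvals < 0.05)[0])

        # PCA on samples (samples as rows) for a quick quality-control view.
        scores = PCA(n_components=2).fit_transform(data.T)
        print("PC1 scores:", np.round(scores[:, 0], 2))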

  11. Smart roadside initiative macro benefit analysis project report.

    DOT National Transportation Integrated Search

    2015-03-31

    Through the Smart Roadside Initiative (SRI), a Benefit-Cost Analysis (BCA) tool was developed for the evaluation of various new transportation technologies at a State level and to provide results that could support technology adoption by a State Depa...

  12. Levels of intra-specific AFLP diversity in tuber-bearing potato species with different breeding systems and ploidy levels

    USDA-ARS?s Scientific Manuscript database

    DNA-based marker analysis of plant genebank material has become a useful tool in the evaluation of levels of genetic diversity and for the informed use and maintenance of germplasm. In this study we quantify levels of Amplified Fragment Length Polymorphism (AFLP) in representative accessions of wild...

  13. Progressive Damage and Failure Analysis of Composite Laminates

    NASA Astrophysics Data System (ADS)

    Joseph, Ashith P. K.

Composite materials are widely used in various industries for making structural parts due to their higher strength-to-weight ratio, better fatigue life, corrosion resistance, and material property tailorability. To fully exploit the capability of composites, it is necessary to know the load-carrying capacity of the parts made from them. Unlike metals, composites are orthotropic in nature and fail in a complex manner under various loading conditions, which makes them hard to analyze. The lack of reliable and efficient failure analysis tools for composites has led industries to rely more on coupon- and component-level testing to estimate the design space. Due to the complex failure mechanisms, composite materials require a very large number of coupon-level tests to fully characterize their behavior. This makes the entire testing process very time consuming and costly. The alternative is to use virtual testing tools which can predict the complex failure mechanisms accurately. This reduces the cost to the associated computational expenses, yielding significant savings. Some of the most desired features in a virtual testing tool are: (1) Accurate representation of failure mechanisms: the failure progression predicted by the virtual tool must be the same as that observed in experiments. A tool has to be assessed based on the mechanisms it can capture. (2) Computational efficiency: the greatest advantages of virtual tools are the savings in time and money, and hence computational efficiency is one of the most needed features. (3) Applicability to a wide range of problems: structural parts are subjected to a variety of loading conditions, including static, dynamic, and fatigue conditions. A good virtual testing tool should be able to make good predictions for all these different loading conditions. The aim of this PhD thesis is to develop a computational tool which can model the progressive failure of composite laminates under different quasi-static loading conditions. The analysis tool is validated by comparing the simulations against experiments for a selected number of quasi-static loading cases.

  14. Evaluation of a New Digital Automated Glycemic Pattern Detection Tool.

    PubMed

    Comellas, María José; Albiñana, Emma; Artes, Maite; Corcoy, Rosa; Fernández-García, Diego; García-Alemán, Jorge; García-Cuartero, Beatriz; González, Cintia; Rivero, María Teresa; Casamira, Núria; Weissmann, Jörg

    2017-11-01

Blood glucose meters are reliable devices for data collection, providing electronic logs of historical data that are easier to interpret than handwritten logbooks. Automated tools to analyze these data are necessary to facilitate glucose pattern detection and support treatment adjustment. Such tools are emerging in a broad variety in a more or less nonevaluated manner. The aim of this study was to compare eDetecta, a new automated pattern detection tool, to nonautomated pattern analysis in terms of time investment, data interpretation, and clinical utility, with the overarching goal of identifying areas of improvement and potential safety risks early in the tool's development and implementation. Multicenter web-based evaluation in which 37 endocrinologists were asked to assess glycemic patterns of 4 real reports (2 continuous subcutaneous insulin infusion [CSII] and 2 multiple daily injection [MDI]). Endocrinologist and eDetecta analyses were compared on the time spent to analyze each report and on agreement on the presence or absence of defined patterns. Compared with nonautomated review of the emminens eConecta reports (CSII: 18 min; MDI: 12.5 min per case), the automatic eDetecta analysis markedly reduced the time required. Agreement between endocrinologists and eDetecta varied depending on the patterns, with a high level of agreement for patterns of glycemic variability. Further analysis of cases with a low level of agreement led to identifying areas where the algorithms used could be improved to optimize trend pattern identification. eDetecta was a useful tool for glycemic pattern detection, helping clinicians to reduce the time required to review emminens eConecta glycemic reports. No safety risks were identified during the study.

  15. Full Life Cycle of Data Analysis with Climate Model Diagnostic Analyzer (CMDA)

    NASA Astrophysics Data System (ADS)

    Lee, S.; Zhai, C.; Pan, L.; Tang, B.; Zhang, J.; Bao, Q.; Malarout, N.

    2017-12-01

    We have developed a system that supports the full life cycle of a data analysis process, from data discovery, to data customization, to analysis, to reanalysis, to publication, and to reproduction. The system called Climate Model Diagnostic Analyzer (CMDA) is designed to demonstrate that the full life cycle of data analysis can be supported within one integrated system for climate model diagnostic evaluation with global observational and reanalysis datasets. CMDA has four subsystems that are highly integrated to support the analysis life cycle. Data System manages datasets used by CMDA analysis tools, Analysis System manages CMDA analysis tools which are all web services, Provenance System manages the meta data of CMDA datasets and the provenance of CMDA analysis history, and Recommendation System extracts knowledge from CMDA usage history and recommends datasets/analysis tools to users. These four subsystems are not only highly integrated but also easily expandable. New datasets can be easily added to Data System and scanned to be visible to the other subsystems. New analysis tools can be easily registered to be available in the Analysis System and Provenance System. With CMDA, a user can start a data analysis process by discovering datasets of relevance to their research topic using the Recommendation System. Next, the user can customize the discovered datasets for their scientific use (e.g. anomaly calculation, regridding, etc) with tools in the Analysis System. Next, the user can do their analysis with the tools (e.g. conditional sampling, time averaging, spatial averaging) in the Analysis System. Next, the user can reanalyze the datasets based on the previously stored analysis provenance in the Provenance System. Further, they can publish their analysis process and result to the Provenance System to share with other users. Finally, any user can reproduce the published analysis process and results. By supporting the full life cycle of climate data analysis, CMDA improves the research productivity and collaboration level of its user.

  16. Segmentation and Quantitative Analysis of Epithelial Tissues.

    PubMed

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis largely prevented tissue scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  17. Taverna: a tool for building and running workflows of services

    PubMed Central

    Hull, Duncan; Wolstencroft, Katy; Stevens, Robert; Goble, Carole; Pocock, Mathew R.; Li, Peter; Oinn, Tom

    2006-01-01

    Taverna is an application that eases the use and integration of the growing number of molecular biology tools and databases available on the web, especially web services. It allows bioinformaticians to construct workflows or pipelines of services to perform a range of different analyses, such as sequence analysis and genome annotation. These high-level workflows can integrate many different resources into a single analysis. Taverna is available freely under the terms of the GNU Lesser General Public License (LGPL) from . PMID:16845108

  18. Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package

    NASA Astrophysics Data System (ADS)

    Cheng, L.; AghaKouchak, A.; Gilleland, E.

    2013-12-01

Numerous studies show that climatic extremes have increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and nonstationary assumptions. The Nonstationary Extreme Value Analysis (hereafter, NEVA) package provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC), which combines the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with the Markov Chain Monte Carlo (MCMC) approach, and which has the advantages of simplicity, speed of calculation, and convergence over conventional MCMC. NEVA also provides confidence intervals and uncertainty bounds of estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization, and visualization, explicitly designed to facilitate analysis of extremes in the geosciences. The generalized input and output files of this software package make it attractive for users from across different fields. Both the stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
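
    For orientation, the stationary half of such an analysis — fit a generalized extreme value (GEV) distribution to block maxima and read off a return level — can be sketched with SciPy. NEVA itself goes further, using Bayesian DE-MC sampling and nonstationary parameters; the annual-maximum series below is synthetic:

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(3)

        # Synthetic annual maxima (e.g., yearly peak daily precipitation, mm).
        annual_max = genextreme.rvs(c=-0.1, loc=50, scale=10, size=60, random_state=rng)

        # Fit a stationary GEV to the block maxima.
        shape, loc, scale = genextreme.fit(annual_max)

        # T-year return level = quantile with annual exceedance probability 1/T.
        T = 100
        return_level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
        print(f"Estimated {T}-year return level: {return_level:.1f} mm")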

  19. Longitudinal Aerodynamic Modeling of the Adaptive Compliant Trailing Edge Flaps on a GIII Airplane and Comparisons to Flight Data

    NASA Technical Reports Server (NTRS)

    Smith, Mark S.; Bui, Trong T.; Garcia, Christian A.; Cumming, Stephen B.

    2016-01-01

    A pair of compliant trailing edge flaps was flown on a modified GIII airplane. Prior to flight test, multiple analysis tools of various levels of complexity were used to predict the aerodynamic effects of the flaps. Vortex lattice, full potential flow, and full Navier-Stokes aerodynamic analysis software programs were used for prediction, in addition to another program that used empirical data. After the flight-test series, lift and pitching moment coefficient increments due to the flaps were estimated from flight data and compared to the results of the predictive tools. The predicted lift increments matched flight data well for all predictive tools for small flap deflections. All tools over-predicted lift increments for large flap deflections. The potential flow and Navier-Stokes programs predicted pitching moment coefficient increments better than the other tools.

  20. Traffic Data for Integrated Project-Level PM2.5 Conformity Analysis.

    DOT National Transportation Integrated Search

    2014-08-01

As required by the U.S. Environmental Protection Agency (EPA), the MOVES model is the mandatory emission tool for new PM hot-spot analyses for project-level conformity determinations that began after December 20, 2012. Localized traffic data inpu...

  1. Reliability and validity of the workplace harassment questionnaire for Korean finance and service workers.

    PubMed

    Lee, Myeongjun; Kim, Hyunjung; Shin, Donghee; Lee, Sangyun

    2016-01-01

Harassment means systemic and repeated unethical acts. Research on workplace harassment has been conducted widely, and the NAQ-R has been widely used for such research. This tool, however, has limitations in revealing differences in sub-factors depending on the culture and in reflecting the unique characteristics of Korean society. Therefore, the workplace harassment questionnaire for Korean finance and service workers has been developed to assess the level of personal harassment at work. This study aims to develop a tool to assess the level of personal harassment at work and to test its validity and reliability while examining specific characteristics of workplace harassment against finance and service workers in Korea. The framework of the survey was established based on a literature review and focus-group interviews with Korean finance and service workers. To verify its reliability, Cronbach's alpha coefficient was calculated; and to verify its validity, items and factors of the tool were analyzed. The correlation matrix analysis was examined to verify the tool's convergent validity and discriminant validity. Structural validity was verified by checking statistical significance in relation to the BDI-K. Cronbach's alpha coefficient of this survey was 0.93, which indicates a quite high level of reliability. To verify the appropriateness of this survey tool, its construct validity was examined through factor analysis. As a result of the factor analysis, 3 factors were extracted, explaining 56.5% of the total variance. The loading values and communalities of the 20 items were 0.85 to 0.48 and 0.71 to 0.46, respectively. The convergent validity and discriminant validity were analyzed, and the rate of item discriminant validity was 100%. Finally, for the concurrent validity, we examined the relationship between the WHI-KFSW and psychosocial stress by examining the correlation with the BDI-K. The results of the chi-square test and multiple logistic analysis indicated that the correlation with the BDI-K was statistically significant. Workplace harassment in actual workplaces was investigated based on interviews, and the statistical analysis contributed to systematizing the types of actual workplace harassment. Using these statistical methods, we developed the questionnaire, with 20 items in 3 categories.
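
    The reliability figure reported above (Cronbach's alpha of 0.93) comes from a standard formula over item variances, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A small NumPy sketch is below, using made-up item responses rather than the study's 20-item data:

        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            """items: respondents x items matrix of scores."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

        # Made-up responses: 6 respondents answering 4 Likert-type items.
        responses = np.array([
            [1, 2, 2, 1],
            [4, 4, 5, 4],
            [3, 3, 3, 2],
            [5, 4, 5, 5],
            [2, 2, 1, 2],
            [4, 5, 4, 4],
        ])
        print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")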

  2. Topographica: Building and Analyzing Map-Level Simulations from Python, C/C++, MATLAB, NEST, or NEURON Components

    PubMed Central

    Bednar, James A.

    2008-01-01

    Many neural regions are arranged into two-dimensional topographic maps, such as the retinotopic maps in mammalian visual cortex. Computational simulations have led to valuable insights about how cortical topography develops and functions, but further progress has been hindered by the lack of appropriate tools. It has been particularly difficult to bridge across levels of detail, because simulators are typically geared to a specific level, while interfacing between simulators has been a major technical challenge. In this paper, we show that the Python-based Topographica simulator makes it straightforward to build systems that cross levels of analysis, as well as providing a common framework for evaluating and comparing models implemented in other simulators. These results rely on the general-purpose abstractions around which Topographica is designed, along with the Python interfaces becoming available for many simulators. In particular, we present a detailed, general-purpose example of how to wrap an external spiking PyNN/NEST simulation as a Topographica component using only a dozen lines of Python code, making it possible to use any of the extensive input presentation, analysis, and plotting tools of Topographica. Additional examples show how to interface easily with models in other types of simulators. Researchers simulating topographic maps externally should consider using Topographica's analysis tools (such as preference map, receptive field, or tuning curve measurement) to compare results consistently, and for connecting models at different levels. This seamless interoperability will help neuroscientists and computational scientists to work together to understand how neurons in topographic maps organize and operate. PMID:19352443

  3. Quantitative Imaging In Pathology (QUIP) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

This site hosts web accessible applications, tools and data designed to support analysis, management, and exploration of whole slide tissue images for cancer research. The following tools are included: caMicroscope: A digital pathology data management and visualization platform that enables interactive viewing of whole slide tissue images and segmentation results. caMicroscope can also be used independently of QUIP. FeatureExplorer: An interactive tool to allow patient-level feature exploration across multiple dimensions.

  4. A self-report critical incident assessment tool for army night vision goggle helicopter operations.

    PubMed

    Renshaw, Peter F; Wiggins, Mark W

    2007-04-01

    The present study sought to examine the utility of a self-report tool that was designed as a partial substitute for a face-to-face cognitive interview for critical incidents involving night vision goggles (NVGs). The use of NVGs remains problematic within the military environment, as these devices have been identified as a factor in a significant proportion of aircraft accidents and incidents. The self-report tool was structured to identify some of the cognitive features of human performance that were associated with critical incidents involving NVGs. The tool incorporated a number of different levels of analysis, ranging from specific behavioral responses to broader cognitive constructs. Reports were received from 30 active pilots within the Australian Army using the NVG Critical Incident Assessment Tool (NVGCIAT). The results revealed a correspondence between specific types of NVG-related errors and elements of the Human Factors Analysis and Classification System (HFACS). In addition, uncertainty emerged as a significant factor associated with the critical incidents that were recalled by operators. These results were broadly consistent with previous research and provide some support for the utility of subjective assessment tools as a means of extracting critical incident-related data when face-to-face cognitive interviews are not possible. In some circumstances, the NVGCIAT might be regarded as a substitute cognitive interview protocol with some level of diagnosticity.

  5. How to support forest management in a world of change: results of some regional studies.

    PubMed

    Fürst, C; Lorz, C; Vacik, H; Potocic, N; Makeschin, F

    2010-12-01

This article presents results of several studies in Middle, Eastern and Southeastern Europe on the needs and application areas, desirable attributes, and marketing potential of forest management support tools. Comparing present and future application areas reveals a trend from sectoral planning towards landscape planning and the integration of multiple stakeholder needs. In terms of conflicts where management support tools might provide benefit, no clear tendencies were found at either the local or the regional level. In contrast, at the national and European levels, support for the implementation of laws, directives, and regulations was found to be of highest importance. Following the user-requirements analysis, electronic tools supporting communication are preferred over paper-based instruments. The users identified the most important attributes of optimized management support tools: (i) broad accessibility for all users at any time should be guaranteed, (ii) it should be possible to iteratively integrate experiences from case studies and from regional experts into the knowledge base (learning system), and (iii) a self-explanatory user interface is demanded, which is also suitable for users rather inexperienced with electronic tools. However, a market potential analysis revealed that the willingness to pay for management tools is very limited, although the participants specified realistic ranges of maximal amounts of money that would be invested if the products were suitable and payment inevitable. To bridge the discrepancy between the unwillingness to pay and the need to use management support tools, optimized financing or cooperation models between practice and science must be found.

  6. How to Support Forest Management in a World of Change: Results of Some Regional Studies

    NASA Astrophysics Data System (ADS)

    Fürst, C.; Lorz, C.; Vacik, H.; Potocic, N.; Makeschin, F.

    2010-12-01

    This article presents results of several studies in Middle, Eastern and Southeastern Europe on needs and application areas, desirable attributes and marketing potentials of forest management support tools. By comparing present and future application areas, a trend from sectoral planning towards landscape planning and integration of multiple stakeholder needs is emerging. In terms of conflicts where management support tools might provide benefit, no clear tendencies were found at either the local or the regional level. In contrast, at the national and European levels, support for the implementation of laws, directives, and regulations was found to be of highest importance. Following the user-requirements analysis, electronic tools supporting communication are preferred over paper-based instruments. The users identified the most important attributes of optimized management support tools: (i) broad accessibility for all users at any time should be guaranteed, (ii) the possibility to integrate iteratively experiences from case studies and from regional experts into the knowledge base (learning system) should be given, and (iii) a self-explanatory user interface is demanded, which is also suitable for users rather inexperienced with electronic tools. However, a market potential analysis revealed that the willingness to pay for management tools is very limited, although the participants specified realistic ranges of maximal amounts of money that would be invested if the products were suitable and payment inevitable. To bridge the discrepancy between unwillingness to pay and the need to use management support tools, optimized financing or cooperation models between practice and science must be found.

  7. Comparative Analysis of Nursing Students' Perspectives toward Avatar Learning Modality: Gain Pre-Clinical Experience via Self-Paced Cognitive Tool

    ERIC Educational Resources Information Center

    Commendador, Kathleen; Chi, Robert

    2013-01-01

    This study was undertaken to better understand the nature of nursing students' perspectives toward a simulative learning modality for gaining pre-clinical experience via a self-paced cognitive tool--Avatar. Findings indicate that participants engaged in a synchronous Avatar learning environment had higher levels of appreciation toward Avatar learning…

  8. Toward Intelligent Software Defect Detection

    NASA Technical Reports Server (NTRS)

    Benson, Markland J.

    2011-01-01

    Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.

  9. Open source cardiology electronic health record development for DIGICARDIAC implementation

    NASA Astrophysics Data System (ADS)

    Dugarte, Nelson; Medina, Rubén; Huiracocha, Lourdes; Rojas, Rubén

    2015-12-01

    This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under the Health Level-7 (HL7) international standards. The novelty of the system is the integration of high resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools and telecardiology tools. The acquisition tools manage and control the DIGICARDIAC electrocardiograph functions. The processing tools support HRECG signal analysis in search of patterns indicative of cardiovascular pathologies. The incorporation of telecardiology tools allows the system to communicate with other health care centers, decreasing the time needed to access patient information. The CEHR system was developed entirely with open source software. Preliminary results of process validation showed the system to be efficient.

  10. Reliability Analysis for AFTI-F16 SRFCS Using ASSIST and SURE

    NASA Technical Reports Server (NTRS)

    Wu, N. Eva

    2001-01-01

    This paper reports the results of a study on reliability analysis of an AFTI-F16 Self-Repairing Flight Control System (SRFCS) using the software tools SURE (Semi-Markov Unreliability Range Evaluator) and ASSIST (Abstract Semi-Markov Specification Interface to the SURE Tool). The purpose of the study is to investigate the potential utility of the software tools in the ongoing effort of the NASA Aviation Safety Program, where the class of systems must be extended beyond the originally intended class of electronic digital processors. The study concludes that SURE and ASSIST are applicable to reliability analysis of flight control systems. They are especially efficient for sensitivity analysis that quantifies the dependence of system reliability on model parameters. The study also confirms an earlier finding on the dominant role of a parameter called failure coverage. The paper also remarks on issues related to the improvement of coverage and the optimization of the redundancy level.
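
    The sensitivity of system reliability to the failure-coverage parameter can be illustrated with a minimal Markov model that is not taken from the paper: a duplex system with component failure rate lambda and reconfiguration coverage c. The sketch below uses the closed-form state probabilities of that three-state model; the failure rate, mission time, and coverage values are invented for the example.

```python
import numpy as np

def duplex_unreliability(lam, c, t):
    """Unreliability of a duplex system with per-unit failure rate lam (1/h)
    and reconfiguration coverage c, evaluated at mission time t (hours).
    States: both units up -> one unit up (covered failure) -> system failed."""
    p0 = np.exp(-2 * lam * t)                                # both units operational
    p1 = 2 * c * (np.exp(-lam * t) - np.exp(-2 * lam * t))   # one unit left after a covered failure
    return 1.0 - p0 - p1                                     # probability the system has failed

lam, t = 1e-4, 10.0   # illustrative values, not from the study
for c in (0.90, 0.99, 0.999, 1.0):
    print(f"coverage {c:6.3f} -> unreliability {duplex_unreliability(lam, c, t):.3e}")
```

    Even at this toy scale, the uncovered-failure term (roughly 2*lambda*(1-c)*t) dominates the unreliability as soon as coverage drops below unity, which is the qualitative effect the study attributes to the coverage parameter.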

  11. Tools for quantifying isotopic niche space and dietary variation at the individual and population level.

    USGS Publications Warehouse

    Newsome, Seth D.; Yeakel, Justin D.; Wheatley, Patrick V.; Tinker, M. Tim

    2012-01-01

    Ecologists are increasingly using stable isotope analysis to inform questions about variation in resource and habitat use from the individual to community level. In this study we investigate data sets from 2 California sea otter (Enhydra lutris nereis) populations to illustrate the advantages and potential pitfalls of applying various statistical and quantitative approaches to isotopic data. We have subdivided these tools, or metrics, into 3 categories: IsoSpace metrics, stable isotope mixing models, and DietSpace metrics. IsoSpace metrics are used to quantify the spatial attributes of isotopic data that are typically presented in bivariate (e.g., δ13C versus δ15N) 2-dimensional space. We review IsoSpace metrics currently in use and present a technique by which uncertainty can be included to calculate the convex hull area of consumers or prey, or both. We then apply a Bayesian-based mixing model to quantify the proportion of potential dietary sources to the diet of each sea otter population and compare this to observational foraging data. Finally, we assess individual dietary specialization by comparing a previously published technique, variance components analysis, to 2 novel DietSpace metrics that are based on mixing model output. As the use of stable isotope analysis in ecology continues to grow, the field will need a set of quantitative tools for assessing isotopic variance at the individual to community level. Along with recent advances in Bayesian-based mixing models, we hope that the IsoSpace and DietSpace metrics described here will provide another set of interpretive tools for ecologists.
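
    As a concrete illustration of the IsoSpace idea, the sketch below computes the convex hull area of a simulated population in δ13C-δ15N space and propagates an assumed analytical uncertainty by resampling. The isotope values, sample size, and 0.2 per mil measurement error are hypothetical placeholders, not sea otter data.

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(1)

# Hypothetical d13C (x) and d15N (y) values for one population (per mil)
iso = np.column_stack([rng.normal(-14, 1.0, 40), rng.normal(12, 0.8, 40)])

def hull_area(points):
    # For 2-D input, ConvexHull.volume is the enclosed area (ConvexHull.area is the perimeter)
    return ConvexHull(points).volume

# Propagate an assumed 0.2 per mil analytical SD on both axes by resampling
areas = [hull_area(iso + rng.normal(0, 0.2, iso.shape)) for _ in range(1000)]
print(f"convex hull area: {hull_area(iso):.2f}")
print(f"95% interval under measurement error: "
      f"{np.percentile(areas, 2.5):.2f}-{np.percentile(areas, 97.5):.2f}")
```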

  12. NASA Subsonic Rotary Wing Project-Multidisciplinary Analysis and Technology Development: Overview

    NASA Technical Reports Server (NTRS)

    Yamauchi, Gloria K.

    2009-01-01

    This slide presentation reviews the objectives of the Multidisciplinary Analysis and Technology Development (MDATD) in the Subsonic Rotary Wing project. The objectives are to integrate technologies and analyses to enable advanced rotorcraft and provide a roadmap to guide Level 1 and 2 research. The MDATD objectives will be met by conducting assessments of advanced technology benefits, developing new or enhanced design tools, and integrating Level 2 discipline technologies to develop and enable system-level analyses and demonstrations.

  13. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1992-01-01

    The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand alone analysis codes that result in the streamlining of the exchange of data between programs, reducing errors and improving efficiency.

  14. Bioelectrical impedance analysis: A new tool for assessing fish condition

    USGS Publications Warehouse

    Hartman, Kyle J.; Margraf, F. Joseph; Hafs, Andrew W.; Cox, M. Keith

    2015-01-01

    Bioelectrical impedance analysis (BIA) is commonly used in human health and nutrition fields but has only recently been considered as a potential tool for assessing fish condition. Once BIA is calibrated, it estimates fat/moisture levels and energy content without the need to kill fish. Despite the promise held by BIA, published studies have been divided on whether BIA can provide accurate estimates of body composition in fish. In cases where BIA was not successful, the models lacked the range of fat levels or sample sizes we determined were needed for model success (range of dry fat levels of 29%, n = 60, yielding an R2 of 0.8). Reduced range of fat levels requires an increased sample size to achieve that benchmark; therefore, standardization of methods is needed. Here we discuss standardized methods based on a decade of research, identify sources of error, discuss where BIA is headed, and suggest areas for future research.
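
    The calibration step described above amounts to regressing known (proximate-analysis) fat levels on impedance-derived predictors and checking the fit against a benchmark such as R2 of about 0.8. The sketch below shows that step on simulated data; the predictor names, coefficients, and noise level are assumptions, not values from the BIA literature.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60                                   # sample size suggested by the abstract's benchmark

# Hypothetical predictors: resistance (ohms), reactance (ohms), fish length (mm)
X = np.column_stack([rng.uniform(300, 900, n),
                     rng.uniform(30, 120, n),
                     rng.uniform(150, 400, n)])
true_beta = np.array([-0.02, 0.10, 0.05])
pct_dry_fat = 10 + X @ true_beta + rng.normal(0, 2.5, n)   # simulated "known" fat levels

# Ordinary least squares calibration: percent dry fat ~ intercept + predictors
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, pct_dry_fat, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((pct_dry_fat - pred) ** 2) / np.sum((pct_dry_fat - pct_dry_fat.mean()) ** 2)
print(f"calibration R^2 = {r2:.2f}")     # with these simulated settings this lands near 0.8
```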

  15. Global Nanotribology Research Output (1996–2010): A Scientometric Analysis

    PubMed Central

    Elango, Bakthavachalam; Rajendran, Periyaswamy; Bornmann, Lutz

    2013-01-01

    This study aims to assess the nanotribology research output at global level using scientometric tools. The SCOPUS database was used to retrieve records related to the nanotribology research for the period 1996–2010. Publications were counted on a fractional basis. The level of collaboration and its citation impact were examined. The performance of the most productive countries, institutes and most preferred journals is assessed. Various visualization tools such as the Sci2 tool and Ucinet were employed. The USA ranked top in terms of number of publications, citations per paper and h-index, while Switzerland published a higher percentage of international collaborative papers. The most productive institution was Tsinghua University followed by Ohio State University and Lanzhou Institute of Chemical Physics, CAS. The most preferred journals were Tribology Letters, Wear and Journal of Japanese Society of Tribologists. The result of author keywords analysis reveals that Molecular Dynamics, MEMS, Hard Disk and Diamond like Carbon are major research topics. PMID:24339900
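
    Two of the bibliometric quantities used above, fractional publication counts and the h-index, are simple to compute once the records are in hand. The sketch below applies them to a handful of invented records; the countries and citation counts are placeholders, and the fractional scheme assumed here splits each paper equally among its listed authors and credits each share to that author's country.

```python
from collections import defaultdict

# Hypothetical records: (author countries in author order, citation count)
records = [(["USA", "USA", "Switzerland"], 42),
           (["China"], 17),
           (["USA", "Japan"], 8),
           (["Switzerland", "USA"], 25)]

# Fractional counting: each paper contributes 1/k per author to that author's country
frac = defaultdict(float)
for countries, _ in records:
    for c in countries:
        frac[c] += 1.0 / len(countries)

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    return max((i + 1 for i, c in enumerate(cites) if c >= i + 1), default=0)

print(dict(frac))
print("h-index over all records:", h_index([c for _, c in records]))
```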

  16. Applications of Parallel Process HiMAP for Large Scale Multidisciplinary Problems

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Potsdam, Mark; Rodriguez, David; Kwak, Dochay (Technical Monitor)

    2000-01-01

    HiMAP is a three level parallel middleware that can be interfaced to a large scale global design environment for code independent, multidisciplinary analysis using high fidelity equations. Aerospace technology needs are rapidly changing. Computational tools compatible with the requirements of national programs such as space transportation are needed. Conventional computation tools are inadequate for modern aerospace design needs. Advanced, modular computational tools are needed, such as those that incorporate the technology of massively parallel processors (MPP).

  17. Evaluation of a New Digital Automated Glycemic Pattern Detection Tool

    PubMed Central

    Albiñana, Emma; Artes, Maite; Corcoy, Rosa; Fernández-García, Diego; García-Alemán, Jorge; García-Cuartero, Beatriz; González, Cintia; Rivero, María Teresa; Casamira, Núria; Weissmann, Jörg

    2017-01-01

    Abstract Background: Blood glucose meters are reliable devices for data collection, providing electronic logs of historical data that are easier to interpret than handwritten logbooks. Automated tools to analyze these data are necessary to facilitate glucose pattern detection and support treatment adjustment. These tools emerge in a broad variety in a more or less nonevaluated manner. The aim of this study was to compare eDetecta, a new automated pattern detection tool, to nonautomated pattern analysis in terms of time investment, data interpretation, and clinical utility, with the overarching goal of identifying, early in the development and implementation of the tool, areas of improvement and potential safety risks. Methods: Multicenter web-based evaluation in which 37 endocrinologists were asked to assess glycemic patterns in 4 real reports (2 continuous subcutaneous insulin infusion [CSII] and 2 multiple daily injection [MDI]). Endocrinologist and eDetecta analyses were compared on the time spent analyzing each report and on agreement about the presence or absence of defined patterns. Results: The eDetecta module markedly reduced the time taken to analyze each case compared with manual analysis of the emminens eConecta reports (CSII: 18 min; MDI: 12.5 min). Agreement between endocrinologists and eDetecta varied depending on the pattern, with a high level of agreement for patterns of glycemic variability. Further analysis of areas of low agreement identified where the algorithms used could be improved to optimize trend pattern identification. Conclusion: eDetecta was a useful tool for glycemic pattern detection, helping clinicians to reduce the time required to review emminens eConecta glycemic reports. No safety risks were identified during the study. PMID:29091477

  18. ASTEC: Controls analysis for personal computers

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  19. Assessing hospital disaster preparedness: a comparison of an on-site survey, directly observed drill performance, and video analysis of teamwork.

    PubMed

    Kaji, Amy H; Langford, Vinette; Lewis, Roger J

    2008-09-01

    There is currently no validated method for assessing hospital disaster preparedness. We determine the degree of correlation between the results of 3 methods for assessing hospital disaster preparedness: administration of an on-site survey, drill observation using a structured evaluation tool, and video analysis of team performance in the hospital incident command center. This was a prospective, observational study conducted during a regional disaster drill, comparing the results from an on-site survey, a structured disaster drill evaluation tool, and a video analysis of teamwork, performed at six 911-receiving hospitals in Los Angeles County, CA. The on-site survey was conducted separately from the drill and assessed hospital disaster plan structure, vendor agreements, modes of communication, medical and surgical supplies, involvement of law enforcement, mutual aid agreements with other facilities, drills and training, surge capacity, decontamination capability, and pharmaceutical stockpiles. The drill evaluation tool, developed by Johns Hopkins University under contract from the Agency for Healthcare Research and Quality, was used to assess various aspects of drill performance, such as the availability of the hospital disaster plan, the geographic configuration of the incident command center, whether drill participants were identifiable, whether the noise level interfered with effective communication, and how often key information (eg, number of available staffed floor, intensive care, and isolation beds; number of arriving victims; expected triage level of victims; number of potential discharges) was received by the incident command center. Teamwork behaviors in the incident command center were quantitatively assessed, using the MedTeams analysis of the video recordings obtained during the disaster drill. Spearman rank correlations of the results between pair-wise groupings of the 3 assessment methods were calculated. The 3 evaluation methods demonstrated qualitatively different results with respect to each hospital's level of disaster preparedness. The Spearman rank correlation coefficient between the results of the on-site survey and the video analysis of teamwork was -0.34; between the results of the on-site survey and the structured drill evaluation tool, 0.15; and between the results of the video analysis and the drill evaluation tool, 0.82. The disparate results obtained from the 3 methods suggest that each measures distinct aspects of disaster preparedness, and perhaps no single method adequately characterizes overall hospital preparedness.
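
    The comparison itself reduces to pairwise Spearman rank correlations across the participating hospitals. The sketch below reproduces that computation on invented preparedness scores for six hospitals; the numbers are placeholders and are not the study's data, so the resulting coefficients will not match those reported above.

```python
from scipy.stats import spearmanr

# Hypothetical preparedness scores for six hospitals from each assessment method
survey = [72, 65, 80, 58, 90, 77]          # on-site survey
drill  = [68, 70, 75, 60, 88, 80]          # structured drill evaluation tool
video  = [55, 74, 62, 71, 69, 58]          # video analysis of teamwork

pairs = {"survey vs video": (survey, video),
         "survey vs drill": (survey, drill),
         "video vs drill":  (video, drill)}
for name, (a, b) in pairs.items():
    rho, p = spearmanr(a, b)
    print(f"{name}: rho = {rho:+.2f} (p = {p:.2f})")
```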

  20. SHAPA: An interactive software tool for protocol analysis applied to aircrew communications and workload

    NASA Technical Reports Server (NTRS)

    James, Jeffrey M.; Sanderson, Penelope M.; Seidler, Karen S.

    1990-01-01

    As modern transport environments become increasingly complex, issues such as crew communication, interaction with automation, and workload management have become crucial. Much research is being focused on holistic aspects of social and cognitive behavior, such as the strategies used to handle workload, the flow of information, the scheduling of tasks, and the verbal and non-verbal interactions between crew members. Traditional laboratory performance measures no longer sufficiently meet the needs of researchers addressing these issues. However, observational techniques are better equipped to capture the type of data needed and to build models of the requisite level of sophistication. Presented here is SHAPA, an interactive software tool for performing both verbal and non-verbal protocol analysis. It has been developed with the idea of affording the researchers the closest possible degree of engagement with protocol data. The researcher can configure SHAPA to encode protocols using any theoretical framework or encoding vocabulary that is desired. SHAPA allows protocol analysis to be performed at any level of analysis, and it supplies a wide variety of tools for data aggregation and manipulation. The output generated by SHAPA can be used alone or in combination with other performance variables to get a rich picture of the influences on sequences of verbal or nonverbal behavior.

  1. Use of stable isotope analysis in determining aquatic food webs

    EPA Science Inventory

    Stable isotope analysis is a useful tool for describing resource-consumer dynamics in ecosystems. In general, organisms of a given trophic level or functional feeding group will have a stable isotope ratio identifiably different from that of their prey because of preferential use of one ...

  2. Multi-focus and multi-level techniques for visualization and analysis of networks with thematic data

    NASA Astrophysics Data System (ADS)

    Cossalter, Michele; Mengshoel, Ole J.; Selker, Ted

    2013-01-01

    Information-rich data sets bring several challenges in the areas of visualization and analysis, even when associated with node-link network visualizations. This paper presents an integration of multi-focus and multi-level techniques that enable interactive, multi-step comparisons in node-link networks. We describe NetEx, a visualization tool that enables users to simultaneously explore different parts of a network and its thematic data, such as time series or conditional probability tables. NetEx, implemented as a Cytoscape plug-in, has been applied to the analysis of electrical power networks, Bayesian networks, and the Enron e-mail repository. In this paper we briefly discuss visualization and analysis of the Enron social network, but focus on data from an electrical power network. Specifically, we demonstrate how NetEx supports the analytical task of electrical power system fault diagnosis. Results from a user study with 25 subjects suggest that NetEx enables more accurate isolation of complex faults compared to an especially designed software tool.

  3. An Experience of Social Rising of Logical Tools in a Primary School Classroom: The Role of Language

    ERIC Educational Resources Information Center

    Coppola, Cristina; Mollo, Monica; Pacelli, Tiziana

    2011-01-01

    In this paper we explore the relationship between language and developmental processes of logical tools through the analysis at different levels of some "linguistic-manipulative" activities in a primary school classroom. We believe that this kind of activities can spur in the children a reflection and a change in their language…

  4. US forest carbon calculation tool: forest-land carbon stocks and net annual stock change

    Treesearch

    James E. Smith; Linda S. Heath; Michael C. Nichols

    2007-01-01

    The Carbon Calculation Tool 4.0, CCTv40.exe, is a computer application that reads publicly available forest inventory data collected by the U.S. Forest Service's Forest Inventory and Analysis Program (FIA) and generates state-level annualized estimates of carbon stocks on forest land based on FORCARB2 estimators. Estimates can be recalculated as...

  5. Work-related musculoskeletal disorders (WMDs) risk assessment at core assembly production of electronic components manufacturing company

    NASA Astrophysics Data System (ADS)

    Yahya, N. M.; Zahid, M. N. O.

    2018-03-01

    This study was conducted to assess work-related musculoskeletal disorders (WMDs) among workers at the core assembly production of an electronic components manufacturing company located in Pekan, Pahang, Malaysia. The aims were to identify the WMD risk factors and risk levels. A questionnaire survey based on the modified Nordic Musculoskeletal Disorders Questionnaire was distributed to the workers to identify WMD risk factors. Postural analysis was then conducted to measure the corresponding WMD risk levels; the analyses were based on two ergonomics assessment tools, Rapid Upper Limb Assessment (RULA) and Rapid Entire Body Assessment (REBA). The study found that 30 of 36 respondents suffered from WMDs, especially at the shoulders, wrists and lower back. WMD risks were identified in the unloading, pressing and winding processes. In terms of risk level, the REBA and RULA assessment tools indicated a high risk level for the unloading and pressing processes. Thus, this study established the WMD risk factors and risk levels of core assembly production in an electronic components manufacturing company in Malaysia.

  6. Evaluation and Analysis of F-16XL Wind Tunnel Data From Static and Dynamic Tests

    NASA Technical Reports Server (NTRS)

    Kim, Sungwan; Murphy, Patrick C.; Klein, Vladislav

    2004-01-01

    A series of wind tunnel tests were conducted in the NASA Langley Research Center as part of an ongoing effort to develop and test mathematical models for aircraft rigid-body aerodynamics in nonlinear unsteady flight regimes. Analysis of measurement accuracy, especially for nonlinear dynamic systems that may exhibit complicated behaviors, is an essential component of this ongoing effort. In this report, tools for harmonic analysis of dynamic data and assessing measurement accuracy are presented. A linear aerodynamic model is assumed that is appropriate for conventional forced-oscillation experiments, although more general models can be used with these tools. Application of the tools to experimental data is demonstrated and results indicate the levels of uncertainty in output measurements that can arise from experimental setup, calibration procedures, mechanical limitations, and input errors.

  7. Northwest Trajectory Analysis Capability: A Platform for Enhancing Computational Biophysics Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Elena S.; Stephan, Eric G.; Corrigan, Abigail L.

    2008-07-30

    As computational resources continue to increase, the ability of computational simulations to effectively complement, and in some cases replace, experimentation in scientific exploration also increases. Today, large-scale simulations are recognized as an effective tool for scientific exploration in many disciplines including chemistry and biology. A natural side effect of this trend has been the need for an increasingly complex analytical environment. In this paper, we describe Northwest Trajectory Analysis Capability (NTRAC), an analytical software suite developed to enhance the efficiency of computational biophysics analyses. Our strategy is to layer higher-level services and introduce improved tools within the user’s familiar environment without preventing researchers from using traditional tools and methods. Our desire is to share these experiences to serve as an example for effectively analyzing data-intensive, large-scale simulation data.

  8. Small Launch Vehicle Trade Space Definition: Development of a Zero Level Mass Estimation Tool with Trajectory Validation

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.

    2013-01-01

    Recent high-level interest in the capability of small launch vehicles has placed significant demand on determining the trade space these vehicles occupy. This has led to the development of a zero level analysis tool that can quickly determine the minimum expected vehicle gross liftoff weight (GLOW) in terms of vehicle stage specific impulse (Isp) and propellant mass fraction (pmf) for any given payload value. Drawing on extensive Earth-to-orbit trajectory experience, the total delta-v the vehicle must achieve, including relevant loss terms, can be estimated. This foresight into expected losses allows for more specific assumptions in the initial estimates of thrust-to-weight values for each stage. The tool was further validated against a trajectory model, in this case the Program to Optimize Simulated Trajectories (POST), to determine whether the initial sizing delta-v was adequate to meet payload expectations. Presented here is a description of how the tool is set up and the approach the analyst must take when using it. Expected outputs, which depend on the type of small launch vehicle being sized, are also shown. The method of validation is discussed, as well as where the sizing tool fits into the vehicle design process.
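
    A zero-level sizing pass of this kind can be sketched with the ideal rocket equation, working from the top stage down: each stage must supply its share of the loss-inclusive delta-v, and its propellant and dry masses follow from the stage Isp and pmf. The payload, delta-v budget, stage split, Isp, and pmf values below are illustrative assumptions, not the tool's actual inputs or outputs.

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def stage_masses(m_above, dv, isp, pmf):
    """Propellant and dry mass of one stage that must impart dv (m/s) to everything
    above it (m_above, kg), given specific impulse isp (s) and pmf = mp / (mp + m_dry)."""
    mr = math.exp(dv / (G0 * isp))                           # required mass ratio m0/mf
    mp = m_above * (mr - 1.0) * pmf / (1.0 - mr * (1.0 - pmf))
    if mp <= 0:
        raise ValueError("stage cannot close: raise pmf/Isp or lower dv")
    m_dry = mp * (1.0 - pmf) / pmf
    return mp, m_dry

payload = 150.0                             # kg to orbit (illustrative)
dv_total = 9300.0                           # ideal delta-v plus loss terms (illustrative)
stages = [(0.45 * dv_total, 285.0, 0.90),   # stage 1: (dv split m/s, Isp s, pmf)
          (0.55 * dv_total, 320.0, 0.88)]   # stage 2

glow = payload                              # running total of everything sized so far
for dv, isp, pmf in reversed(stages):       # size from the top stage down
    mp, m_dry = stage_masses(glow, dv, isp, pmf)
    glow += mp + m_dry
print(f"estimated GLOW: {glow:,.0f} kg")
```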

  9. Comparing 2 National Organization-Level Workplace Health Promotion and Improvement Tools, 2013–2015

    PubMed Central

    Lang, Jason E.; Davis, Whitney D.; Jones-Jack, Nkenge H.; Mukhtar, Qaiser; Lu, Hua; Acharya, Sushama D.; Molloy, Meg E.

    2016-01-01

    Creating healthy workplaces is becoming more common. Half of employers that have more than 50 employees offer some type of workplace health promotion program. Few employers implement comprehensive evidence-based interventions that reach all employees and achieve desired health and cost outcomes. A few organization-level assessment and benchmarking tools have emerged to help employers evaluate the comprehensiveness and rigor of their health promotion offerings. Even fewer tools exist that combine assessment with technical assistance and guidance to implement evidence-based practices. Our descriptive analysis compares 2 such tools, the Centers for Disease Control and Prevention’s Worksite Health ScoreCard and Prevention Partners’ WorkHealthy America, and presents data from both to describe workplace health promotion practices across the United States. These tools are reaching employers of all types (N = 1,797), and many employers are using a comprehensive approach (85% of those using WorkHealthy America and 45% of those using the ScoreCard), increasing program effectiveness and impact. PMID:27685429

  10. Design and evaluation of a web-based decision support tool for district-level disease surveillance in a low-resource setting

    PubMed Central

    Pore, Meenal; Sengeh, David M.; Mugambi, Purity; Purswani, Nuri V.; Sesay, Tom; Arnold, Anna Lena; Tran, Anh-Minh A.; Myers, Ralph

    2017-01-01

    During the 2014 West African Ebola Virus outbreak it became apparent that the initial response to the outbreak was hampered by limitations in the collection, aggregation, analysis and use of data for intervention planning. As part of the post-Ebola recovery phase, IBM Research Africa partnered with the Port Loko District Health Management Team (DHMT) in Sierra Leone and GOAL Global, to design, implement and deploy a web-based decision support tool for district-level disease surveillance. This paper discusses the design process and the functionality of the first version of the system. The paper presents evaluation results prior to a pilot deployment and identifies features for future iterations. A qualitative assessment of the tool prior to pilot deployment indicates that it improves the timeliness and ease of using data for making decisions at the DHMT level. PMID:29854209

  11. GES DAAC HDF Data Processing and Visualization Tools

    NASA Astrophysics Data System (ADS)

    Ouzounov, D.; Cho, S.; Johnson, J.; Li, J.; Liu, Z.; Lu, L.; Pollack, N.; Qin, J.; Savtchenko, A.; Teng, B.

    2002-12-01

    The Goddard Earth Sciences (GES) Distributed Active Archive Center (DAAC) plays a major role in enabling basic scientific research and providing access to scientific data to the general user community. Several GES DAAC Data Support Teams provide expert assistance to users in accessing data, including information on visualization tools and documentation for data products. To provide easy access to the science data, the data support teams have additionally developed many online and desktop tools for data processing and visualization. This presentation is an overview of major HDF tools implemented at the GES DAAC and aimed at optimizing access to EOS data for the Earth Sciences community. GES DAAC ONLINE TOOLS: MODIS and AIRS on-demand Channel/Variable Subsetter are web-based, on-the-fly/on-demand subsetters that perform channel/variable subsetting and restructuring for Level 1B and Level 2 data products. Users can specify criteria to subset data files with desired channels and variables and then download the subsetted file. AIRS QuickLook is a CGI/IDL combo package that allows users to view AIRS/HSB/AMSU Level-1B data online by specifying a channel prior to obtaining data. A global map is also provided along with the image to show geographic coverage of the granule and flight direction of the spacecraft. OASIS (Online data AnalySIS) is an IDL-based HTML/CGI interface for search, selection, and simple analysis of earth science data. It supports binary and GRIB formatted data, such as TOVS, Data Assimilation products, and some NCEP operational products. TRMM Online Analysis System is designed for quick exploration, analyses, and visualization of TRMM Level-3 and other precipitation products. The products consist of the daily (3B42), monthly (3B43), near-real-time (3B42RT), and Willmott's climate data. The system is also designed to be simple and easy to use - users can plot the average or accumulated rainfall over their region of interest for a given time period, or plot the time series of regional rainfall average. WebGIS is an online web software that implements the Open GIS Consortium (OGC) standards for mapping requests and rendering. It allows users access to TRMM, MODIS, SeaWiFS, and AVHRR data from several DAAC map servers, as well as externally served data such as political boundaries, population centers, lakes, rivers, and elevation. GES DAAC DESKTOP TOOLS: HDFLook-MODIS is a new, multifunctional, data processing and visualization tool for Radiometric and Geolocation, Atmosphere, Ocean, and Land MODIS HDF-EOS data. Features include (1) accessing and visualization of all swath (Levels 1 and 2) MODIS and AIRS products, and gridded (Levels 3 and 4) MODIS products; (2) re-mapping of swath data to world map; (3) geo-projection conversion; (4) interactive and batch mode capabilities; (5) subsetting and multi-granule processing; and (6) data conversion. SIMAP is an IDL-based script that is designed to read and map MODIS Level 1B (L1B) and Level 2 (L2) Ocean and Atmosphere products. It is a non-interactive, command line executed tool. The resulting maps are scaled to physical units (e.g., radiances, concentrations, brightness temperatures) and saved in binary files. TRMM HDF (in C and Fortran) reads in TRMM HDF data files and writes out user-selected SDS arrays and Vdata tables as separate flat binary files.

  12. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems II. Extension to the thermal infrared: equations and methods

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Lomheim, Terrence S.; Florio, Christopher J.; Harbold, Jeffrey M.; Muto, B. Michael; Schoolar, Richard B.; Wintz, Daniel T.; Keller, Robert A.

    2011-10-01

    In a previous paper in this series, we described how The Aerospace Corporation's Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) tool may be used to model space and airborne imaging systems operating in the visible to near-infrared (VISNIR). PICASSO is a systems-level tool, representative of a class of such tools used throughout the remote sensing community. It is capable of modeling systems over a wide range of fidelity, anywhere from conceptual design level (where it can serve as an integral part of the systems engineering process) to as-built hardware (where it can serve as part of the verification process). In the present paper, we extend the discussion of PICASSO to the modeling of Thermal Infrared (TIR) remote sensing systems, presenting the equations and methods necessary for modeling in that regime.
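
    In the thermal infrared the at-aperture signal is dominated by thermal emission, so any TIR radiometric chain of this kind needs the Planck blackbody radiance as a building block. The sketch below evaluates it for an assumed 300 K scene over an 8-12 micron band; it is a generic calculation, not PICASSO's actual equation set.

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance B_lambda(T) in W m^-2 sr^-1 m^-1."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.expm1(H * C / (wavelength_m * K * temp_k))
    return a / b

# Band-averaged radiance over an 8-12 micron LWIR band for a 300 K scene (simple average)
lams = [8e-6 + i * 0.1e-6 for i in range(41)]
band_avg = sum(planck_radiance(lam, 300.0) for lam in lams) / len(lams)
print(f"10 um, 300 K: {planck_radiance(10e-6, 300.0):.3e} W m^-2 sr^-1 m^-1")
print(f"8-12 um band average: {band_avg:.3e} W m^-2 sr^-1 m^-1")
```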

  13. A Productivity Analysis of Nonprocedural Languages.

    DTIC Science & Technology

    1982-12-01

    abstracts. The tools they work with are up-to-date, well documented, and from acceptable/reliable sources. With their Maket - 4- 1 a nd teeoo in enced... file inversion is possible at any level. Additionally, any field can be indexed at any level. b. Online operation with interactive error-correc...

  14. Analytical framework and tool kit for SEA follow-up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran

    2009-04-15

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  15. GOATS - Orbitology Component

    NASA Technical Reports Server (NTRS)

    Haber, Benjamin M.; Green, Joseph J.

    2010-01-01

    The GOATS Orbitology Component software was developed to specifically address the concerns presented by orbit analysis tools that are often written as stand-alone applications. These applications do not easily interface with standard JPL first-principles analysis tools, and have a steep learning curve due to their complicated nature. This toolset is written as a series of MATLAB functions, allowing seamless integration into existing JPL optical systems engineering modeling and analysis modules. The functions are completely open, and allow for advanced users to delve into and modify the underlying physics being modeled. Additionally, this software module fills an analysis gap, allowing for quick, high-level mission analysis trades without the need for detailed and complicated orbit analysis using commercial stand-alone tools. This software consists of a series of MATLAB functions to provide for geometric orbit-related analysis. This includes propagation of orbits to varying levels of generalization. In the simplest case, geosynchronous orbits can be modeled by specifying a subset of three orbit elements. The next case is a circular orbit, which can be specified by a subset of four orbit elements. The most general case is an arbitrary elliptical orbit specified by all six orbit elements. These orbits are all solved geometrically, under the basic problem of an object in circular (or elliptical) orbit around a rotating spheroid. The orbit functions output time series ground tracks, which serve as the basis for more detailed orbit analysis. This software module also includes functions to track the positions of the Sun, Moon, and arbitrary celestial bodies specified by right ascension and declination. Also included are functions to calculate line-of-sight geometries to ground-based targets, angular rotations and decompositions, and other line-of-sight calculations. The toolset allows for the rapid execution of orbit trade studies at the level of detail required for the early stage of mission concept development.
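
    The circular-orbit ground-track calculation described above is purely geometric. The sketch below reproduces it for a spherical, uniformly rotating Earth, assuming the ascending node sits at the longitude given by the RAAN at t = 0; it is written in Python rather than the toolset's MATLAB, the altitude, inclination, and sampling interval are illustrative, and the function is not part of GOATS.

```python
import math

MU = 3.986004418e14        # Earth gravitational parameter, m^3/s^2
W_EARTH = 7.2921159e-5     # Earth rotation rate, rad/s
R_EARTH = 6378137.0        # equatorial radius, m

def ground_track(alt_m, inc_deg, raan_deg, duration_s, step_s=60.0):
    """Latitude/longitude (deg) time series for a circular orbit over a rotating sphere."""
    a = R_EARTH + alt_m
    n = math.sqrt(MU / a**3)                 # mean motion, rad/s
    inc, raan = math.radians(inc_deg), math.radians(raan_deg)
    track, t = [], 0.0
    while t <= duration_s:
        u = n * t                            # argument of latitude (circular orbit)
        lat = math.asin(math.sin(inc) * math.sin(u))
        lon = math.atan2(math.cos(inc) * math.sin(u), math.cos(u)) + raan - W_EARTH * t
        lon = (math.degrees(lon) + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
        track.append((t, math.degrees(lat), lon))
        t += step_s
    return track

for t, lat, lon in ground_track(700e3, 98.2, 0.0, 3600.0, 600.0):
    print(f"t={t:6.0f} s  lat={lat:+7.2f}  lon={lon:+8.2f}")
```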

  16. Eulerian frequency analysis of structural vibrations from high-speed video

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venanzoni, Andrea; Siemens Industry Software NV, Interleuvenlaan 68, B-3001 Leuven; De Ryck, Laurent

    An approach for the analysis of the frequency content of structural vibrations from high-speed video recordings is proposed. The techniques and tools proposed rely on an Eulerian approach, that is, using the time history of pixels independently to analyse structural motion, as opposed to Lagrangian approaches, where the motion of the structure is tracked in time. The starting point is an existing Eulerian motion magnification method, which consists in decomposing the video frames into a set of spatial scales through a so-called Laplacian pyramid [1]. Each scale — or level — can be amplified independently to reconstruct a magnified motion of the observed structure. The approach proposed here provides two analysis tools or pre-amplification steps. The first tool provides a representation of the global frequency content of a video per pyramid level. This may be further enhanced by applying an angular filter in the spatial frequency domain to each frame of the video before the Laplacian pyramid decomposition, which allows for the identification of the frequency content of the structural vibrations in a particular direction of space. This proposed tool complements the existing Eulerian magnification method by amplifying selectively the levels containing relevant motion information with respect to their frequency content. This magnifies the displacement while limiting the noise contribution. The second tool is a holographic representation of the frequency content of a vibrating structure, yielding a map of the predominant frequency components across the structure. In contrast to the global frequency content representation of the video, this tool provides a local analysis of the periodic gray scale intensity changes of the frame in order to identify the vibrating parts of the structure and their main frequencies. Validation cases are provided and the advantages and limits of the approaches are discussed. The first validation case consists of the frequency content retrieval of the tip of a shaker, excited at selected fixed frequencies. The goal of this setup is to retrieve the frequencies at which the tip is excited. The second validation case consists of two thin metal beams connected to a randomly excited bar. It is shown that the holographic representation visually highlights the predominant frequency content of each pixel and locates the global frequencies of the motion, thus retrieving the natural frequencies for each beam.
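
    The per-pixel ("holographic") frequency map can be sketched directly: treat each pixel's gray-level time history independently, take its FFT, and keep the frequency of the largest peak. The example below does this on a synthetic 32 x 32 video in which the two halves of the frame vibrate at different frequencies; the frame rate and frequencies are assumptions standing in for real high-speed footage, and no Laplacian pyramid or magnification step is included.

```python
import numpy as np

fps = 2000.0                                  # assumed frame rate of the high-speed camera
t = np.arange(400) / fps                      # 400 frames
frames = np.zeros((400, 32, 32))

# Synthetic scene: left half vibrates at 120 Hz, right half at 310 Hz, plus noise
frames[:, :, :16] = np.sin(2 * np.pi * 120.0 * t)[:, None, None]
frames[:, :, 16:] = np.sin(2 * np.pi * 310.0 * t)[:, None, None]
frames += 0.2 * np.random.default_rng(0).normal(size=frames.shape)

# Eulerian analysis: FFT of each pixel's intensity history along the time axis
spec = np.abs(np.fft.rfft(frames - frames.mean(axis=0), axis=0))
freqs = np.fft.rfftfreq(frames.shape[0], d=1.0 / fps)
dominant = freqs[np.argmax(spec, axis=0)]     # per-pixel predominant frequency map

print("left-half median frequency :", np.median(dominant[:, :16]), "Hz")
print("right-half median frequency:", np.median(dominant[:, 16:]), "Hz")
```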

  17. An Analysis of the Text Complexity of Leveled Passages in Four Popular Classroom Reading Assessments

    ERIC Educational Resources Information Center

    Toyama, Yukie; Hiebert, Elfrieda H.; Pearson, P. David

    2017-01-01

    This study investigated the complexity of leveled passages used in four classroom reading assessments. A total of 167 passages leveled for Grades 1-6 from these assessments were analyzed using four analytical tools of text complexity. More traditional, two-factor measures of text complexity found a general trend of fairly consistent across-grade…

  18. Analysis of Interactive Conflict Resolution Tool Usage in a Mixed Equipage Environment

    NASA Technical Reports Server (NTRS)

    Homola, Jeffrey; Morey, Susan; Cabrall, Christopher; Martin, Lynne; Mercer, Joey; Prevot, Thomas

    2013-01-01

    A human-in-the-loop simulation was conducted that examined separation assurance concepts in varying levels of traffic density with mixtures of aircraft equipage and automation. This paper's analysis focuses on one of the experimental conditions in which traffic levels were approximately fifty percent higher than today, and approximately fifty percent of the traffic within the test area was equipped with data communications (data comm) capabilities. The other fifty percent of the aircraft required control by voice much like today. Within this environment, the air traffic controller participants were provided access to tools and automation designed to support the primary task of separation assurance that are currently unavailable. Two tools were selected for analysis in this paper: 1) a pre-probed altitude fly-out menu that provided instant feedback of conflict probe results for a range of altitudes, and 2) an interactive auto resolver that provided on-demand access to an automation-generated conflict resolution trajectory. Although encouraged, use of the support tools was not required; the participants were free to use the tools as they saw fit, and they were also free to accept, reject, or modify the resolutions offered by the automation. This mode of interaction provided a unique opportunity to examine exactly when and how these tools were used, as well as how acceptable the resolutions were. Results showed that the participants used the pre-probed altitude fly-out menu in 14% of conflict cases and preferred to use it in a strategic timeframe on data comm equipped and level flight aircraft. The interactive auto resolver was also used in a primarily strategic timeframe, on 22% of conflicts, and participants likewise preferred to use it on conflicts involving data comm equipped aircraft. Of the 258 resolutions displayed, 46% were implemented and 54% were not. The auto resolver was rated highly by participants in terms of confidence and preference. Factors such as aircraft equipage, ownership, and location of predicted separation loss appeared to play a role in the decision of controllers to accept or reject the auto resolver's resolutions.

  19. Relevance of Item Analysis in Standardizing an Achievement Test in Teaching of Physical Science in B.Ed Syllabus

    ERIC Educational Resources Information Center

    Marie, S. Maria Josephine Arokia; Edannur, Sreekala

    2015-01-01

    This paper focuses on the analysis of test items constructed for the Teaching of Physical Science paper in the B.Ed. syllabus. It involves the analysis of the difficulty level and discrimination power of each test item. Item analysis allows selecting or omitting items from the test, but more importantly item analysis is a tool to help the item writer improve…
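
    The two quantities at the heart of such an item analysis are the difficulty index (the proportion of examinees answering an item correctly) and the discrimination index (the difference in that proportion between upper and lower scoring groups, conventionally the top and bottom roughly 27%). The sketch below computes both for a small invented score matrix; the responses are hypothetical, not the paper's data.

```python
import numpy as np

# Hypothetical 0/1 scores: rows = examinees, columns = test items
scores = np.array([[1, 1, 0, 1],
                   [1, 0, 0, 1],
                   [1, 1, 1, 1],
                   [0, 0, 0, 1],
                   [1, 1, 0, 0],
                   [0, 0, 0, 0],
                   [1, 1, 1, 1],
                   [0, 1, 0, 1]])

totals = scores.sum(axis=1)
order = np.argsort(totals)
k = max(1, len(scores) * 27 // 100)        # size of the upper/lower groups (~27% each)
lower, upper = scores[order[:k]], scores[order[-k:]]

difficulty = scores.mean(axis=0)                          # p: proportion answering correctly
discrimination = upper.mean(axis=0) - lower.mean(axis=0)  # D: upper-group minus lower-group p

for i, (p, d) in enumerate(zip(difficulty, discrimination), 1):
    print(f"item {i}: difficulty p = {p:.2f}, discrimination D = {d:+.2f}")
```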

  20. Towards Accurate Application Characterization for Exascale (APEX)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammond, Simon David

    Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines and many large capability resources including ASCI Red and RedStorm were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily, the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia’s production users/developers.

  1. Bark analysis as a guide to cassava nutrition in Sierra Leone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Godfrey-Sam-Aggrey, W.; Garber, M.J.

    1979-01-01

    Cassava main stem barks from two experiments in which similar fertilizers were applied directly in a 2^5 confounded factorial design were analyzed and the bark nutrients used as a guide to cassava nutrition. The application of multiple regression analysis to the respective root yields and bark nutrient concentrations enabled nutrient levels and optimum adjusted root yields to be derived. Differences in bark nutrient concentrations reflected soil fertility levels. Bark analysis and the application of multiple regression analysis to root yields and bark nutrients appear to be useful tools for predicting fertilizer recommendations for cassava production.

  2. Leadership in Doctoral Dissertations of Educational Sciences in Turkey

    ERIC Educational Resources Information Center

    Yardibi, Nursel

    2014-01-01

    The purpose of the study is to determine tendencies in educational sciences doctoral dissertations in Turkey according to divisions, research methods and designs, data collection tools, data analysis techniques, and leadership levels. This content analysis study has been designed with qualitative research methods. This research has been limited by…

  3. Treating technology as a luxury? 10 necessary tools.

    PubMed

    Berger, Steven H

    2007-02-01

    Technology and techniques that every hospital should acquire and use for effective financial management include: Daily dashboards. Balanced scorecards. Benchmarking. Flexible budgeting and monitoring. Labor management systems. Nonlabor management analysis. Service-line, physician, and patient-level reporting and analysis. Cost accounting technology. Contract management technology. Denials management software.

  4. A comprehensive comparison of tools for differential ChIP-seq analysis

    PubMed Central

    Steinhauser, Sebastian; Kurzawa, Nils; Eils, Roland

    2016-01-01

    ChIP-seq has become a widely adopted genomic assay in recent years to determine binding sites for transcription factors or enrichments for specific histone modifications. Besides detection of enriched or bound regions, an important question is to determine differences between conditions. While this is a common analysis for gene expression, for which a large number of computational approaches have been validated, the same question for ChIP-seq is particularly challenging owing to the complexity of ChIP-seq data in terms of noisiness and variability. Many different tools have been developed and published in recent years. However, a comprehensive comparison and review of these tools is still missing. Here, we have reviewed 14 tools, which have been developed to determine differential enrichment between two conditions. They differ in their algorithmic setups, and also in the range of applicability. Hence, we have benchmarked these tools on real data sets for transcription factors and histone modifications, as well as on simulated data sets to quantitatively evaluate their performance. Overall, there is a great variety in the type of signal detected by these tools with a surprisingly low level of agreement. Depending on the type of analysis performed, the choice of method will crucially impact the outcome. PMID:26764273

  5. Public data and open source tools for multi-assay genomic investigation of disease.

    PubMed

    Kannan, Lavanya; Ramos, Marcel; Re, Angela; El-Hachem, Nehme; Safikhani, Zhaleh; Gendoo, Deena M A; Davis, Sean; Gomez-Cabrero, David; Castelo, Robert; Hansen, Kasper D; Carey, Vincent J; Morgan, Martin; Culhane, Aedín C; Haibe-Kains, Benjamin; Waldron, Levi

    2016-07-01

    Molecular interrogation of a biological sample through DNA sequencing, RNA and microRNA profiling, proteomics and other assays, has the potential to provide a systems level approach to predicting treatment response and disease progression, and to developing precision therapies. Large publicly funded projects have generated extensive and freely available multi-assay data resources; however, bioinformatic and statistical methods for the analysis of such experiments are still nascent. We review multi-assay genomic data resources in the areas of clinical oncology, pharmacogenomics and other perturbation experiments, population genomics and regulatory genomics and other areas, and tools for data acquisition. Finally, we review bioinformatic tools that are explicitly geared toward integrative genomic data visualization and analysis. This review provides starting points for accessing publicly available data and tools to support development of needed integrative methods. © The Author 2015. Published by Oxford University Press.

  6. Investigation of tool wear and surface roughness on machining of titanium alloy with MT-CVD cutting tool

    NASA Astrophysics Data System (ADS)

    Maity, Kalipada; Pradhan, Swastik

    2018-04-01

    In this study, machining of titanium alloy (grade 5) was carried out using an MT-CVD coated cutting tool. Titanium alloys possess a superior strength-to-weight ratio with good corrosion resistance, and many industries use them to manufacture various types of lightweight components; parts made from Ti-6Al-4V are widely used in the aerospace, biomedical, automotive and marine sectors. Conventional machining of this material is very difficult owing to its low thermal conductivity and high chemical reactivity. To achieve a good surface finish with minimum tool wear, the machining was carried out using the MT-CVD coated cutting tool. The experiment was designed using a Taguchi L27 orthogonal array with three cutting variables, each at three levels. The desirability function analysis (DFA) approach was used to find the optimum parametric setting, and analysis of variance was applied to determine the percentage contribution of each cutting variable. The optimum parametric setting calculated from DFA was validated through a confirmation test.
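
    The core of the DFA step is to map each measured response onto a 0-1 desirability scale and combine the individual desirabilities into a composite value (their geometric mean), which is then used to rank the parameter settings. The sketch below illustrates this with invented roughness and flank-wear values and assumed acceptance bounds; it does not reproduce the paper's measurements.

```python
import numpy as np

def d_larger_is_better(y, low, high, w=1.0):
    """Individual desirability for a response to be maximized."""
    return np.clip((y - low) / (high - low), 0.0, 1.0) ** w

def d_smaller_is_better(y, low, high, w=1.0):
    """Individual desirability for a response to be minimized (e.g., roughness, tool wear)."""
    return np.clip((high - y) / (high - low), 0.0, 1.0) ** w

# Hypothetical responses for three trial runs: (surface roughness um, flank wear mm)
runs = {"run 1": (1.8, 0.21), "run 2": (1.2, 0.28), "run 3": (0.9, 0.17)}

for name, (ra, vb) in runs.items():
    d1 = d_smaller_is_better(ra, 0.8, 2.0)    # assumed roughness bounds
    d2 = d_smaller_is_better(vb, 0.15, 0.30)  # assumed wear bounds
    composite = float(np.sqrt(d1 * d2))       # geometric mean of the individual desirabilities
    print(f"{name}: composite desirability = {composite:.3f}")
```

    The run with the highest composite desirability is taken as the preferred parametric setting, which is the role the DFA grade plays in the study above.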

  7. XML-based data model and architecture for a knowledge-based grid-enabled problem-solving environment for high-throughput biological imaging.

    PubMed

    Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif

    2008-03-01

    High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services. The first level provides tools for extracting spatiotemporal knowledge from image sets and the second level provides high-level knowledge management and reasoning services. We then present cellular imaging markup language, an extensible markup language-based language for modeling of biological images and representation of spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.

  8. A Web-Based Decision Tool to Improve Contraceptive Counseling for Women With Chronic Medical Conditions: Protocol For a Mixed Methods Implementation Study

    PubMed Central

    Damschroder, Laura J; Fetters, Michael D; Zikmund-Fisher, Brian J; Crabtree, Benjamin F; Hudson, Shawna V; Ruffin IV, Mack T; Fucinari, Juliana; Kang, Minji; Taichman, L Susan; Creswell, John W

    2018-01-01

    Background Women with chronic medical conditions, such as diabetes and hypertension, have a higher risk of pregnancy-related complications compared with women without medical conditions and should be offered contraception if desired. Although evidence based guidelines for contraceptive selection in the presence of medical conditions are available via the United States Medical Eligibility Criteria (US MEC), these guidelines are underutilized. Research also supports the use of decision tools to promote shared decision making between patients and providers during contraceptive counseling. Objective The overall goal of the MiHealth, MiChoice project is to design and implement a theory-driven, Web-based tool that incorporates the US MEC (provider-level intervention) within the vehicle of a contraceptive decision tool for women with chronic medical conditions (patient-level intervention) in community-based primary care settings (practice-level intervention). This will be a 3-phase study that includes a predesign phase, a design phase, and a testing phase in a randomized controlled trial. This study protocol describes phase 1 and aim 1, which is to determine patient-, provider-, and practice-level factors that are relevant to the design and implementation of the contraceptive decision tool. Methods This is a mixed methods implementation study. To customize the delivery of the US MEC in the decision tool, we selected high-priority constructs from the Consolidated Framework for Implementation Research and the Theoretical Domains Framework to drive data collection and analysis at the practice and provider level, respectively. A conceptual model that incorporates constructs from the transtheoretical model and the health beliefs model undergirds patient-level data collection and analysis and will inform customization of the decision tool for this population. We will recruit 6 community-based primary care practices and conduct quantitative surveys and semistructured qualitative interviews with women who have chronic medical conditions, their primary care providers (PCPs), and clinic staff, as well as field observations of practice activities. Quantitative survey data will be summarized with simple descriptive statistics and relationships between participant characteristics and contraceptive recommendations (for PCPs), and current contraceptive use (for patients) will be examined using Fisher exact test. We will conduct thematic analysis of qualitative data from interviews and field observations. The integration of data will occur by comparing, contrasting, and synthesizing qualitative and quantitative findings to inform the future development and implementation of the intervention. Results We are currently enrolling practices and anticipate study completion in 15 months. Conclusions This protocol describes the first phase of a multiphase mixed methods study to develop and implement a Web-based decision tool that is customized to meet the needs of women with chronic medical conditions in primary care settings. Study findings will promote contraceptive counseling via shared decision making and reflect evidence-based guidelines for contraceptive selection. Trial Registration ClinicalTrials.gov NCT03153644; https://clinicaltrials.gov/ct2/show/NCT03153644 (Archived by WebCite at http://www.webcitation.org/6yUkA5lK8) PMID:29669707

  9. Extension of a System Level Tool for Component Level Analysis

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok; Schallhorn, Paul

    2002-01-01

    This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one-dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear driven flow in a rectangular cavity) are presented as benchmarks for the verification of the numerical scheme.
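
    One of the cited benchmarks, plane Poiseuille flow, has the closed-form laminar profile u(y) = (-dp/dx) y (H - y) / (2 mu), so verification reduces to comparing a computed profile against that parabola. The sketch below is only an illustration of such a benchmark comparison using a simple finite-difference solve; it is not the network flow analysis code described in the paper.

        # Illustrative benchmark: laminar plane Poiseuille flow between parallel plates.
        # Solve mu * d2u/dy2 = dp/dx with u(0) = u(H) = 0 and compare with the analytic parabola.
        import numpy as np

        mu, dpdx, H, n = 1.0e-3, -1.0, 0.02, 41          # viscosity, pressure gradient, gap, grid points
        y = np.linspace(0.0, H, n)
        dy = y[1] - y[0]

        # Assemble the 1-D Laplacian for interior nodes (Dirichlet walls).
        A = np.zeros((n - 2, n - 2))
        np.fill_diagonal(A, -2.0)
        np.fill_diagonal(A[1:], 1.0)        # subdiagonal
        np.fill_diagonal(A[:, 1:], 1.0)     # superdiagonal
        b = np.full(n - 2, dpdx / mu * dy**2)

        u = np.zeros(n)
        u[1:-1] = np.linalg.solve(A, b)

        u_exact = (-dpdx) / (2.0 * mu) * y * (H - y)
        print("max relative error:", np.max(np.abs(u - u_exact)) / u_exact.max())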

  10. Extension of a System Level Tool for Component Level Analysis

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok; Schallhorn, Paul; McConnaughey, Paul K. (Technical Monitor)

    2001-01-01

    This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one-dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow, and shear driven flow in a rectangular cavity) are presented as benchmarks for the verification of the numerical scheme.

  11. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
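
    The component-level idea, in which each component is a state of a discrete-time Markov chain and tool reliability is the probability of reaching a correct-termination state rather than a failure state, can be sketched with an absorbing-chain calculation in the style of architecture-based reliability models. The per-component reliabilities and transfer probabilities below are hypothetical placeholders, not values from the Forensic Toolkit Imager case study.

        # Sketch of an architecture-based reliability estimate: components are transient
        # Markov states; "Correct" and "Fail" are absorbing states (hypothetical values).
        import numpy as np

        R = np.array([0.99, 0.97, 0.98])          # acquisition, parsing, reporting (placeholder reliabilities)
        P = np.array([[0.0, 1.0, 0.0],            # component 1 hands off to component 2
                      [0.0, 0.0, 1.0],            # component 2 hands off to component 3
                      [0.0, 0.0, 0.0]])           # component 3 is the exit component

        Q = R[:, None] * P                        # transitions that survive component execution
        to_correct = np.zeros(len(R))
        to_correct[-1] = R[-1]                    # the exit component terminates correctly if it works

        N = np.linalg.inv(np.eye(len(R)) - Q)     # fundamental matrix of the absorbing chain
        reliability = (N @ to_correct)[0]         # execution starts in component 1
        print(f"estimated tool reliability: {reliability:.4f}")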

  12. Empowering districts to target priorities for improving child health service in Uganda using change management and rapid assessment methods.

    PubMed

    Odaga, John; Henriksson, Dorcus K; Nkolo, Charles; Tibeihaho, Hector; Musabe, Richard; Katusiime, Margaret; Sinabulya, Zaccheus; Mucunguzi, Stephen; Mbonye, Anthony K; Valadez, Joseph J

    2016-01-01

    Local health system managers in low- and middle-income countries have the responsibility to set health priorities and allocate resources accordingly. Although tools exist to aid this process, they are not widely applied for various reasons including non-availability, poor knowledge of the tools, and poor adaptability into the local context. In Uganda, delivery of basic services is devolved to the District Local Governments through the District Health Teams (DHTs). The Community and District Empowerment for Scale-up (CODES) project aims to provide a set of management tools that aid contextualised priority setting, fund allocation, and problem-solving in a systematic way to improve effective coverage and quality of child survival interventions. Although the various tools have previously been used at the national level, the project aims to combine them in an integral way for implementation at the district level. These tools include Lot Quality Assurance Sampling (LQAS) surveys to generate local evidence, Bottleneck analysis and Causal analysis as analytical tools, Continuous Quality Improvement, and Community Dialogues based on Citizen Report Cards and U reports. The tools enable identification of gaps, prioritisation of possible solutions, and allocation of resources accordingly. This paper presents some of the tools used by the project in five districts in Uganda during the proof-of-concept phase of the project. All five districts were trained and participated in LQAS surveys and readily adopted the tools for priority setting and resource allocation. All districts developed health operational work plans, which were based on the evidence and each of the districts implemented more than three of the priority activities which were included in their work plans. In the five districts, the CODES project demonstrated that DHTs can adopt and integrate these tools in the planning process by systematically identifying gaps and setting priority interventions for child survival.

  13. Empowering districts to target priorities for improving child health service in Uganda using change management and rapid assessment methods

    PubMed Central

    Odaga, John; Henriksson, Dorcus K.; Nkolo, Charles; Tibeihaho, Hector; Musabe, Richard; Katusiime, Margaret; Sinabulya, Zaccheus; Mucunguzi, Stephen; Mbonye, Anthony K.; Valadez, Joseph J.

    2016-01-01

    Background Local health system managers in low- and middle-income countries have the responsibility to set health priorities and allocate resources accordingly. Although tools exist to aid this process, they are not widely applied for various reasons including non-availability, poor knowledge of the tools, and poor adaptability into the local context. In Uganda, delivery of basic services is devolved to the District Local Governments through the District Health Teams (DHTs). The Community and District Empowerment for Scale-up (CODES) project aims to provide a set of management tools that aid contextualised priority setting, fund allocation, and problem-solving in a systematic way to improve effective coverage and quality of child survival interventions. Design Although the various tools have previously been used at the national level, the project aims to combine them in an integral way for implementation at the district level. These tools include Lot Quality Assurance Sampling (LQAS) surveys to generate local evidence, Bottleneck analysis and Causal analysis as analytical tools, Continuous Quality Improvement, and Community Dialogues based on Citizen Report Cards and U reports. The tools enable identification of gaps, prioritisation of possible solutions, and allocation of resources accordingly. This paper presents some of the tools used by the project in five districts in Uganda during the proof-of-concept phase of the project. Results All five districts were trained and participated in LQAS surveys and readily adopted the tools for priority setting and resource allocation. All districts developed health operational work plans, which were based on the evidence and each of the districts implemented more than three of the priority activities which were included in their work plans. Conclusions In the five districts, the CODES project demonstrated that DHTs can adopt and integrate these tools in the planning process by systematically identifying gaps and setting priority interventions for child survival. PMID:27225791

  14. Verification of S&D Solutions for Network Communications and Devices

    NASA Astrophysics Data System (ADS)

    Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen

    This chapter describes the tool-supported verification of S&D Solutions on the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH Verification Tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of network and device S&D Patterns, relevant complementary approaches exist and can be used.

  15. Leveraging Python Interoperability Tools to Improve Sapphire's Usability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gezahegne, A; Love, N S

    2007-12-10

    The Sapphire project at the Center for Applied Scientific Computing (CASC) develops and applies an extensive set of data mining algorithms for the analysis of large data sets. Sapphire's algorithms are currently available as a set of C++ libraries. However many users prefer higher level scripting languages such as Python for their ease of use and flexibility. In this report, we evaluate four interoperability tools for the purpose of wrapping Sapphire's core functionality with Python. Exposing Sapphire's functionality through a Python interface would increase its usability and connect its algorithms to existing Python tools.

  16. Evaluation of Correlation of Blood Glucose and Salivary Glucose Level in Known Diabetic Patients.

    PubMed

    Gupta, Anjali; Singh, Siddharth Kumar; Padmavathi, B N; Rajan, S Y; Mamatha, G P; Kumar, Sandeep; Roy, Sayak; Sareen, Mohit

    2015-05-01

    Diabetes mellitus is a chronic, heterogeneous disease in which there is dysregulation of carbohydrate, protein, and lipid metabolism, leading to elevated blood glucose levels. The present study was conducted to evaluate the correlation between blood glucose and salivary glucose levels in known diabetic patients and a control group and also to evaluate salivary glucose level as a diagnostic tool in diabetic patients. A total number of 250 patients were studied, out of which 212 formed the study group and 38 formed the control group. Among the 250 patients, the correlation between blood glucose and salivary glucose values was evaluated, which on analysis revealed a Pearson correlation of 0.073. The p-value was 0.247, which was statistically non-significant. Salivary glucose values cannot be considered a diagnostic tool for diabetic individuals.
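
    The reported result (Pearson correlation 0.073, p = 0.247) is the output of a routine correlation test; a minimal sketch with synthetic glucose values, not the study data, is:

        # Minimal sketch of the correlation analysis (synthetic values, not the study data).
        import numpy as np
        from scipy.stats import pearsonr

        rng = np.random.default_rng(0)
        blood_glucose = rng.normal(160, 40, size=212)         # mg/dL, hypothetical
        salivary_glucose = rng.normal(1.5, 0.5, size=212)     # mg/dL, hypothetical, uncorrelated

        r, p = pearsonr(blood_glucose, salivary_glucose)
        print(f"Pearson r = {r:.3f}, p = {p:.3f}")            # a weak, non-significant r is expected here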

  17. Sequential sentinel SNP Regional Association Plots (SSS-RAP): an approach for testing independence of SNP association signals using meta-analysis data.

    PubMed

    Zheng, Jie; Gaunt, Tom R; Day, Ian N M

    2013-01-01

    Genome-Wide Association Studies (GWAS) frequently incorporate meta-analysis within their framework. However, conditional analysis of individual-level data, which is an established approach for fine mapping of causal sites, is often precluded where only group-level summary data are available for analysis. Here, we present a numerical and graphical approach, "sequential sentinel SNP regional association plot" (SSS-RAP), which estimates regression coefficients (beta) with their standard errors using the meta-analysis summary results directly. Under an additive model, typical for genes with small effect, the effect for a sentinel SNP can be transformed to the predicted effect for a possibly dependent SNP through a 2×2 2-SNP haplotypes table. The approach assumes Hardy-Weinberg equilibrium for test SNPs. SSS-RAP is available as a Web-tool (http://apps.biocompute.org.uk/sssrap/sssrap.cgi). To develop and illustrate SSS-RAP we analyzed lipid and ECG traits data from the British Women's Heart and Health Study (BWHHS), evaluated a meta-analysis for ECG trait and presented several simulations. We compared results with existing approaches such as model selection methods and conditional analysis. Generally findings were consistent. SSS-RAP represents a tool for testing independence of SNP association signals using meta-analysis data, and is also a convenient approach based on biological principles for fine mapping in group level summary data. © 2012 Blackwell Publishing Ltd/University College London.
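
    Under the additive model described, the sentinel SNP's effect can be mapped onto a correlated test SNP from summary data alone: with 0/1/2 genotype coding and Hardy-Weinberg equilibrium, cov(x_A, x_B) = 2D and var(x_B) = 2 p_B (1 - p_B), so the predicted marginal effect is beta_B = beta_A * D / (p_B (1 - p_B)). The sketch below implements only that textbook transformation with illustrative numbers; it is not the SSS-RAP web tool itself.

        # Sketch of predicting a dependent SNP's effect from a sentinel SNP's summary statistics.
        # Assumes additive 0/1/2 coding, Hardy-Weinberg equilibrium, and known pairwise LD (r).
        import math

        def predicted_beta(beta_sentinel, p_sentinel, p_test, r_ld):
            """Expected marginal effect at the test SNP implied by the sentinel SNP's effect."""
            D = r_ld * math.sqrt(p_sentinel * (1 - p_sentinel) * p_test * (1 - p_test))
            return beta_sentinel * D / (p_test * (1 - p_test))

        # Illustrative values only (not from BWHHS or any published meta-analysis).
        print(predicted_beta(beta_sentinel=0.15, p_sentinel=0.30, p_test=0.25, r_ld=0.80))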

  18. Determination and representation of electric charge distributions associated with adverse weather conditions

    NASA Technical Reports Server (NTRS)

    Rompala, John T.

    1992-01-01

    Algorithms are presented for determining the size and location of electric charges which model storm systems and lightning strikes. The analysis utilizes readings from a grid of ground level field mills and geometric constraints on parameters to arrive at a representative set of charges. This set is used to generate three-dimensional graphical depictions of the set as well as contour maps of the ground level electrical environment over the grid. The composite, analytic and graphic package is demonstrated and evaluated using controlled input data and archived data from a storm system. The results demonstrate the package's utility as: an operational tool in appraising adverse weather conditions; a research tool in studies of topics such as storm structure, storm dynamics, and lightning; and a tool in designing and evaluating grid systems.
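
    The inverse problem described, recovering charge magnitude and location from a grid of ground-level field-mill readings, can be illustrated with a least-squares fit of a single point charge above a conducting ground plane, for which the vertical field at horizontal distance r is E_z = 2 k q h / (h^2 + r^2)^(3/2). The mill layout, noise level, and optimizer below are illustrative choices, not the algorithms of the report.

        # Illustrative inverse fit: recover one point charge (q, x, y, h) from ground-level field readings.
        import numpy as np
        from scipy.optimize import least_squares

        K = 8.99e9                                         # Coulomb constant, N*m^2/C^2

        def ez(params, mx, my):
            q, x, y, h = params                            # charge (C), horizontal position (m), height (m)
            r2 = (mx - x) ** 2 + (my - y) ** 2
            return 2.0 * K * q * h / (r2 + h ** 2) ** 1.5  # image-charge result over a conducting ground

        # Hypothetical field-mill grid and synthetic "measurements" from a known charge.
        mx, my = np.meshgrid(np.linspace(-5e3, 5e3, 5), np.linspace(-5e3, 5e3, 5))
        mx, my = mx.ravel(), my.ravel()
        truth = np.array([-10.0, 800.0, -400.0, 3000.0])   # -10 C at 3 km altitude
        readings = ez(truth, mx, my) * (1 + 0.02 * np.random.default_rng(1).standard_normal(mx.size))

        fit = least_squares(lambda p: ez(p, mx, my) - readings,
                            x0=[-1.0, 0.0, 0.0, 2000.0],
                            x_scale=[1.0, 1e3, 1e3, 1e3])
        print("recovered [q, x, y, h]:", fit.x)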

  19. Preparing Laboratory and Real-World EEG Data for Large-Scale Analysis: A Containerized Approach

    PubMed Central

    Bigdely-Shamlo, Nima; Makeig, Scott; Robbins, Kay A.

    2016-01-01

    Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain–computer interface models. However, the absence of standardized vocabularies for annotating events in a machine understandable manner, the welter of collection-specific data organizations, the difficulty in moving data across processing platforms, and the unavailability of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a “containerized” approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining and (meta-)analysis. The EEG Study Schema (ESS) comprises three data “Levels,” each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data using a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. ESS schema and tools are freely available at www.eegstudy.org and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org). PMID:27014048

  20. An analysis of the concept of competence in individuals and social systems.

    PubMed

    Adler, P T

    1982-01-01

    This paper has attempted to present a unified conceptual model of positive mental health or competence from the perspective of individuals and from the perspective of social systems of varying degrees of complexity, such as families, organizations, and entire communities. It has provided a taxonomy of the elements of competence which allows the application of a common framework to the analysis of competence and to the planning and evaluation of competence building interventions at any level of social organization. Community Mental Health Centers can apply the model which has been presented in a number of different ways. At whatever level(s) the CMHCs' efforts are directed, the competence model presents a framework for analysis, intervention, and evaluation which enriches and expands upon more typical disorder-based formulations. By providing a framework which encompasses all levels of social organization, the model provides the conceptual tools for going beyond the individual and microsystem levels which have often constituted the boundaries of CMHC concern, and allows the CMHC to approach the organizational and community levels which must be encompassed by a competently comprehensive center. Application of the concept of competence to social organizations and to communities allows the CMHC to analyze and intervene at these levels. Finally, the concept of organizational competence separated into its various elements provides the CMHC with a tool for analyzing and evaluating its own environment and the competence of various aspects of its own functioning within that environment.

  1. Understanding Data Needs for Vulnerability Assessment and Decision Making to Manage Vulnerability of Department of Defense Installations to Climate Change

    DTIC Science & Technology

    2016-02-01

    Return period analysis at Sewell’s Point (across the mouth of the James River from both Langley AFB and Fort Eustis) with sea level rise projections ... The fill tool takes a digital elevation model as an input and calculates the water level necessary to fill each grid cell.

  2. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jack Dongarra; Shirley Moore; Bart Miller; Jeffrey Hollingsworth

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.

  3. Raising Sociocultural Awareness through Contextual Analysis: Some Tools for Teachers

    ERIC Educational Resources Information Center

    McConachy, Troy

    2009-01-01

    Despite long-standing recognition of the importance of sociocultural context in meaning making, criticisms have been levelled at communicative language teaching (CLT) for failing to effectively address this at the level of classroom practice. In fact, it has been argued that the way CLT presents content reveals a fundamentally reductionist view of…

  4. RE Data Explorer: Informing Variable Renewable Energy Grid Integration for Low Emission Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Sarah L

    The RE Data Explorer, developed by the National Renewable Energy Laboratory, is an innovative web-based analysis tool that utilizes geospatial and spatiotemporal renewable energy data to visualize, execute, and support analysis of renewable energy potential under various user-defined scenarios. This analysis can inform high-level prospecting, integrated planning, and policy making to enable low emission development.

  5. An adverse event screening tool based on routinely collected hospital-acquired diagnoses.

    PubMed

    Brand, Caroline; Tropea, Joanne; Gorelik, Alexandra; Jolley, Damien; Scott, Ian; Sundararajan, Vijaya

    2012-06-01

    The aim was to develop an electronic adverse event (AE) screening tool applicable to acute care hospital episodes for patients admitted with chronic heart failure (CHF) and pneumonia. Consensus building using a modified Delphi method and descriptive analysis of hospital discharge data. Consultant physicians in general medicine (n = 38). In-hospital acquired (C-prefix) diagnoses associated with CHF and pneumonia admissions to 230 hospitals in Victoria, Australia, were extracted from the Victorian Admitted Episodes Data Set between July 2004 and June 2007. A 9-point rating scale was used to prioritize diagnoses acquired during hospitalization (routinely coded as a 'C-prefix' diagnosis to distinguish from diagnoses present on admission) for inclusion within an AE screening tool. Diagnoses rated a group median score between 7 and 9 by the physician panel were included. Selection of C-prefix diagnoses with a group median rating of 7-9 in a screening tool, and the level of physician agreement, as assessed using the Interpercentile Range Adjusted for Symmetry. Of 697 initial C-prefix diagnoses, there were high levels of agreement to include 113 (16.2%) in the AE screening tool. Using these selected diagnoses, a potential AE was flagged in 14% of all admissions for the two index conditions. Intra-rater reliability for each clinician ranged from kappa 0.482 to 1.0. A high level of physician agreement was obtained in selecting in-hospital diagnoses for inclusion in an AE screening tool based on routinely collected data. These results support further tool validation.
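
    The inclusion rule itself, retaining a hospital-acquired diagnosis when the panel's group median rating falls between 7 and 9 on the 9-point scale, is simple to apply programmatically; the ratings below are invented solely to show the mechanics.

        # Sketch of the Delphi inclusion rule: keep diagnoses whose group median rating is 7-9.
        from statistics import median

        panel_ratings = {                                   # hypothetical ratings from the physician panel
            "C-prefix: acute renal failure": [8, 9, 7, 8, 9, 8],
            "C-prefix: constipation": [3, 4, 2, 5, 3, 4],
            "C-prefix: hospital-acquired pneumonia": [9, 8, 9, 9, 7, 8],
        }

        included = [dx for dx, ratings in panel_ratings.items() if 7 <= median(ratings) <= 9]
        print(included)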

  6. Scientist-Centered Workflow Abstractions via Generic Actors, Workflow Templates, and Context-Awareness for Groundwater Modeling and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Sivaramakrishnan, Chandrika; Critchlow, Terence J.

    2011-07-04

    A drawback of existing scientific workflow systems is the lack of support to domain scientists in designing and executing their own scientific workflows. Many domain scientists avoid developing and using workflows because the basic objects of workflows are too low-level and high-level tools and mechanisms to aid in workflow construction and use are largely unavailable. In our research, we are prototyping higher-level abstractions and tools to better support scientists in their workflow activities. Specifically, we are developing generic actors that provide abstract interfaces to specific functionality, workflow templates that encapsulate workflow and data patterns that can be reused and adapted by scientists, and context-awareness mechanisms to gather contextual information from the workflow environment on behalf of the scientist. To evaluate these scientist-centered abstractions on real problems, we apply them to construct and execute scientific workflows in the specific domain area of groundwater modeling and analysis.

  7. Functional Analysis of OMICs Data and Small Molecule Compounds in an Integrated "Knowledge-Based" Platform.

    PubMed

    Dubovenko, Alexey; Nikolsky, Yuri; Rakhmatulin, Eugene; Nikolskaya, Tatiana

    2017-01-01

    Analysis of NGS and other sequencing data, gene variants, gene expression, proteomics, and other high-throughput (OMICs) data is challenging because of its biological complexity and high level of technical and biological noise. One way to deal with both problems is to perform analysis with a high fidelity annotated knowledgebase of protein interactions, pathways, and functional ontologies. This knowledgebase has to be structured in a computer-readable format and must include software tools for managing experimental data, analysis, and reporting. Here, we present MetaCore™ and Key Pathway Advisor (KPA), an integrated platform for functional data analysis. On the content side, MetaCore and KPA encompass a comprehensive database of molecular interactions of different types, pathways, network models, and ten functional ontologies covering human, mouse, and rat genes. The analytical toolkit includes tools for gene/protein list enrichment analysis, statistical "interactome" tool for the identification of over- and under-connected proteins in the dataset, and a biological network analysis module made up of network generation algorithms and filters. The suite also features Advanced Search, an application for combinatorial search of the database content, as well as a Java-based tool called Pathway Map Creator for drawing and editing custom pathway maps. Applications of MetaCore and KPA include molecular mode of action of disease research, identification of potential biomarkers and drug targets, pathway hypothesis generation, analysis of biological effects for novel small molecule compounds and clinical applications (analysis of large cohorts of patients, and translational and personalized medicine).
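
    Gene-list enrichment of the kind described is, at its core, an over-representation test; a generic hypergeometric sketch (not the MetaCore or KPA API, and with assumed background and overlap counts) looks like this:

        # Generic over-representation test for a gene list against one ontology term (not the MetaCore API).
        from scipy.stats import hypergeom

        genome_size = 20000        # background genes (assumption)
        term_genes = 150           # genes annotated to the ontology term (assumption)
        list_size = 300            # genes in the experimental list (assumption)
        overlap = 12               # list genes carrying the annotation (assumption)

        # P(X >= overlap) under sampling without replacement.
        p_enrichment = hypergeom.sf(overlap - 1, genome_size, term_genes, list_size)
        print(f"enrichment p-value: {p_enrichment:.3g}")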

  8. [Assessing work-related stress: an Italian adaptation of the HSE Management Standards Work-Related Stress Indicator Tool].

    PubMed

    Marcatto, Francesco; D'Errico, Giuseppe; Di Blas, Lisa; Ferrante, Donatella

    2011-01-01

    The aim of this paper is to present a preliminary validation of an Italian adaptation of the HSE Management Standards Work-Related Stress Indicator Tool (IT), an instrument for assessing work-related stress at the organizational level, originally developed in Britain by the Health and Safety Executive. A scale that assesses the physical work environment has been added to the original version of the IT. 190 employees of the University of Trieste were enrolled in the study. A confirmatory factor analysis showed a satisfactory fit of the eight-factor structure of the instrument. Further psychometric analysis showed adequate internal consistency of the IT scales and good criterion validity, as evidenced by the correlations with self-perception of stress, work satisfaction and motivation. In conclusion, the Indicator Tool proved to be a valid and reliable instrument for the assessment of work-related stress at the organizational level, and it is also compatible with the instructions provided by the Ministry of Labour and Social Policy (Circular letter 18/11/2010).
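
    The internal-consistency figures reported for the IT scales are typically Cronbach's alpha; a short sketch on a simulated item-response matrix (respondents by items of one scale, values invented) is:

        # Cronbach's alpha for one scale (rows = respondents, columns = items); data are simulated.
        import numpy as np

        def cronbach_alpha(items):
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_var / total_var)

        rng = np.random.default_rng(42)
        latent = rng.normal(size=(190, 1))                   # a common factor, as in a coherent scale
        responses = np.clip(np.round(3 + latent + rng.normal(scale=0.7, size=(190, 5))), 1, 5)
        print(f"alpha = {cronbach_alpha(responses):.2f}")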

  9. A Practical Tutorial on Modified Condition/Decision Coverage

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Veerhusen, Dan S.; Chilenski, John J.; Rierson, Leanna K.

    2001-01-01

    This tutorial provides a practical approach to assessing modified condition/decision coverage (MC/DC) for aviation software products that must comply with regulatory guidance for DO-178B level A software. The tutorial's approach to MC/DC is a 5-step process that allows a certification authority or verification analyst to evaluate MC/DC claims without the aid of a coverage tool. In addition to the MC/DC approach, the tutorial addresses factors to consider in selecting and qualifying a structural coverage analysis tool, tips for reviewing life cycle data related to MC/DC, and pitfalls common to structural coverage analysis.
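
    MC/DC requires, for each condition in a decision, a pair of test cases in which only that condition changes and the decision outcome changes with it. The brute-force sketch below, which is not part of the tutorial's 5-step process, finds such independence pairs for the example decision (A and B) or C:

        # Brute-force search for MC/DC independence pairs for the decision (A and B) or C.
        from itertools import product

        def decision(a, b, c):
            return (a and b) or c

        cases = list(product([False, True], repeat=3))
        for i, name in enumerate("ABC"):
            pairs = [(u, v) for u in cases for v in cases if u < v
                     and u[i] != v[i]                                    # the target condition toggles,
                     and all(u[j] == v[j] for j in range(3) if j != i)   # every other condition is held fixed,
                     and decision(*u) != decision(*v)]                   # and the decision outcome changes.
            print(name, "independence pairs:", pairs)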

  10. Modeling the Multi-Body System Dynamics of a Flexible Solar Sail Spacecraft

    NASA Technical Reports Server (NTRS)

    Kim, Young; Stough, Robert; Whorton, Mark

    2005-01-01

    Solar sail propulsion systems enable a wide range of space missions that are not feasible with current propulsion technology. Hardware concepts and analytical methods have matured through ground development to the point that a flight validation mission is now realizable. Much attention has been given to modeling the structural dynamics of the constituent elements, but to date an integrated system level dynamics analysis has been lacking. Using a multi-body dynamics and control analysis tool called TREETOPS, the coupled dynamics of the sailcraft bus, sail membranes, flexible booms, and control system sensors and actuators of a representative solar sail spacecraft are investigated to assess system level dynamics and control issues. With this tool, scaling issues and parametric trade studies can be performed to study achievable performance, control authority requirements, and control/structure interaction assessments.

  11. Consumption value theory and the marketing of public health: an effective formative research tool.

    PubMed

    Nelson, Douglas G; Byus, Kent

    2002-01-01

    Contemporary public health requires the support and participation of its constituency. This study assesses the capacity of consumption value theory to identify the basis of this support. A telephone survey design used simple random sampling of adult residents of Cherokee County, Oklahoma. Factor analysis and stepwise discriminant analysis were used to identify and classify personal and societal level support variables. Most residents base societal level support on epistemic values. Direct services clientele base their support on positive emotional values derived from personal contact and attractive programs. Residents are curious about public health and want to know more about the health department. Where marketing the effectiveness of public health programs would yield relatively little support, marketing health promotion activities may attract public opposition. This formative research tool suggests a marketing strategy for public health practitioners.

  12. Advanced manufacturing development of a composite empennage component for l-1011 aircraft

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Tooling concepts were developed which would permit co-curing of the hat stiffeners to the skin to form the cover assembly in a single autoclave cycle. These tooling concepts include the use of solid rubber mandrels, foam mandrels, and formed elastomeric bladders. A simplification of the root end design of the cover hat stiffeners was accomplished in order to facilitate fabrication. The conversion of the 3D NASTRAN model from level 15 to level 16 was completed and a successful check run accomplished. A detailed analysis of the thermal load requirement for the environmental chambers was carried out. Based on the thermal analysis, best function requirements, load inputs and ease of access, a system involving four chambers, two for the covers containing 6 and 4 specimens, respectively, and two for the spars containing 6 and 4 specimens, respectively, evolved.

  13. Systems biology: A tool for charting the antiviral landscape.

    PubMed

    Bowen, James R; Ferris, Martin T; Suthar, Mehul S

    2016-06-15

    The host antiviral programs that are initiated following viral infection form a dynamic and complex web of responses that we have collectively termed as "the antiviral landscape". Conventional approaches to studying antiviral responses have primarily used reductionist systems to assess the function of a single or a limited subset of molecules. Systems biology is a holistic approach that considers the entire system as a whole, rather than individual components or molecules. Systems biology based approaches facilitate an unbiased and comprehensive analysis of the antiviral landscape, while allowing for the discovery of emergent properties that are missed by conventional approaches. The antiviral landscape can be viewed as a hierarchy of complexity, beginning at the whole organism level and progressing downward to isolated tissues, populations of cells, and single cells. In this review, we will discuss how systems biology has been applied to better understand the antiviral landscape at each of these layers. At the organismal level, the Collaborative Cross is an invaluable genetic resource for assessing how genetic diversity influences the antiviral response. Whole tissue and isolated bulk cell transcriptomics serves as a critical tool for the comprehensive analysis of antiviral responses at both the tissue and cellular levels of complexity. Finally, new techniques in single cell analysis are emerging tools that will revolutionize our understanding of how individual cells within a bulk infected cell population contribute to the overall antiviral landscape. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Ares I-X Flight Test Validation of Control Design Tools in the Frequency-Domain

    NASA Technical Reports Server (NTRS)

    Johnson, Matthew; Hannan, Mike; Brandon, Jay; Derry, Stephen

    2011-01-01

    A major motivation of the Ares I-X flight test program was to Design for Data, in order to maximize the usefulness of the data recorded in support of Ares I modeling and validation of design and analysis tools. The Design for Data effort was intended to enable good post-flight characterizations of the flight control system, the vehicle structural dynamics, and also the aerodynamic characteristics of the vehicle. To extract the necessary data from the system during flight, a set of small predetermined Programmed Test Inputs (PTIs) was injected directly into the TVC signal. These PTIs were designed to excite the necessary vehicle dynamics while exhibiting a minimal impact on loads. The method is similar to common approaches in aircraft flight test programs, but with unique launch vehicle challenges due to rapidly changing states, short duration of flight, a tight flight envelope, and an inability to repeat any test. This paper documents the validation effort of the stability analysis tools to the flight data which was performed by comparing the post-flight calculated frequency response of the vehicle to the frequency response calculated by the stability analysis tools used to design and analyze the preflight models during the control design effort. The comparison between flight day frequency response and stability tool analysis for flight of the simulated vehicle shows good agreement and provides a high level of confidence in the stability analysis tools for use in any future program. This is true both for a nominal model and for dispersed analysis, which shows that the flight day frequency response is enveloped by the vehicle's preflight uncertainty models.
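
    Post-flight frequency responses of this kind are commonly estimated from the injected test input and the measured response via cross- and auto-spectral densities, H(f) = Pxy(f) / Pxx(f). The sketch below applies that generic estimator to synthetic signals; it is not the Ares I-X data or the program's stability analysis tools.

        # Generic frequency-response estimate H(f) = Pxy / Pxx from a test input and a measured output.
        import numpy as np
        from scipy.signal import bilinear, csd, lfilter, welch

        fs = 200.0                                         # sample rate in Hz (assumed)
        t = np.arange(0, 60, 1 / fs)
        rng = np.random.default_rng(7)
        pti = rng.standard_normal(t.size)                  # stand-in for the programmed test inputs

        # Synthetic "vehicle": a lightly damped 2 Hz mode, discretized with the bilinear transform.
        wn, zeta = 2 * np.pi * 2.0, 0.05
        b, a = bilinear([wn ** 2], [1.0, 2 * zeta * wn, wn ** 2], fs=fs)
        response = lfilter(b, a, pti) + 0.01 * rng.standard_normal(t.size)

        f, Pxy = csd(pti, response, fs=fs, nperseg=2048)   # cross-spectral density
        _, Pxx = welch(pti, fs=fs, nperseg=2048)           # input auto-spectral density
        H = Pxy / Pxx
        print("estimated modal peak near", f[np.abs(H).argmax()], "Hz")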

  15. Extended Testability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system s ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system s ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  16. Screening tool to evaluate the vulnerability of down-gradient receptors to groundwater contaminants from uncapped landfills

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Ronald J.; Reilly, Timothy J.; Lopez, Anthony

    2015-09-15

    Highlights: • A spreadsheet-based risk screening tool for groundwater affected by landfills is presented. • Domenico solute transport equations are used to estimate downgradient contaminant concentrations. • Landfills are categorized as presenting high, moderate or low risks. • Analysis of parameter sensitivity and examples of the method’s application are given. • The method has value to regulators and those considering redeveloping closed landfills. - Abstract: A screening tool for quantifying levels of concern for contaminants detected in monitoring wells on or near landfills to down-gradient receptors (streams, wetlands and residential lots) was developed and evaluated. The tool uses Quick Domenico Multi-scenario (QDM), a spreadsheet implementation of Domenico-based solute transport, to estimate concentrations of contaminants reaching receptors under steady-state conditions from a constant-strength source. Unlike most other available Domenico-based model applications, QDM calculates the time for down-gradient contaminant concentrations to approach steady state and appropriate dispersivity values, and allows for up to fifty simulations on a single spreadsheet. Sensitivity of QDM solutions to critical model parameters was quantified. The screening tool uses QDM results to categorize landfills as having high, moderate and low levels of concern, based on contaminant concentrations reaching receptors relative to regulatory concentrations. The application of this tool was demonstrated by assessing levels of concern (as defined by the New Jersey Pinelands Commission) for thirty closed, uncapped landfills in the New Jersey Pinelands National Reserve, using historic water-quality data from monitoring wells on and near landfills and hydraulic parameters from regional flow models. Twelve of these landfills are categorized as having high levels of concern, indicating a need for further assessment. This tool is not a replacement for conventional numerically-based transport models or other available Domenico-based applications, but is suitable for quickly assessing the level of concern posed by a landfill or other contaminant point source before expensive and lengthy monitoring or remediation measures are taken. In addition to quantifying the level of concern using historic groundwater-monitoring data, the tool allows for archiving model scenarios and adding refinements as new data become available.
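
    QDM implements Domenico-type analytical transport. One commonly quoted steady-state centerline form, for a planar source of width Y and depth Z with seepage velocity v, dispersivities ax, ay, az, and first-order decay lam, is C(x) = C0 * exp[(x / (2 ax)) (1 - sqrt(1 + 4 lam ax / v))] * erf(Y / (4 sqrt(ay x))) * erf(Z / (2 sqrt(az x))). The sketch below only evaluates that textbook expression with placeholder parameters; it is not the QDM spreadsheet.

        # Steady-state centerline concentration from a textbook Domenico-type solution
        # (illustrative only; not the QDM spreadsheet, and all inputs are placeholders).
        import numpy as np
        from scipy.special import erf

        def domenico_centerline(x, C0, v, ax, ay, az, lam, Y, Z):
            decay = np.exp((x / (2 * ax)) * (1 - np.sqrt(1 + 4 * lam * ax / v)))
            return C0 * decay * erf(Y / (4 * np.sqrt(ay * x))) * erf(Z / (2 * np.sqrt(az * x)))

        # Placeholder inputs: 1 mg/L source, 30 m/yr seepage velocity, 10/1/0.1 m dispersivities,
        # 0.2 1/yr decay, a 30 m wide by 3 m deep source, and a receptor 150 m down-gradient.
        print(domenico_centerline(x=150.0, C0=1.0, v=30.0, ax=10.0, ay=1.0, az=0.1,
                                  lam=0.2, Y=30.0, Z=3.0), "mg/L at the receptor")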

  17. Top-level modeling of an als system utilizing object-oriented techniques

    NASA Astrophysics Data System (ADS)

    Rodriguez, L. F.; Kang, S.; Ting, K. C.

    The possible configuration of an Advanced Life Support (ALS) System capable of supporting human life for long-term space missions continues to evolve as researchers investigate potential technologies and configurations. To facilitate the decision process, the development of acceptable, flexible, and dynamic mathematical computer modeling tools capable of system level analysis is desirable. Object-oriented techniques have been adopted to develop a dynamic top-level model of an ALS system. This approach has several advantages; among these, object-oriented abstractions of systems are inherently modular in architecture. Thus, models can initially be somewhat simplistic, while allowing for adjustments and improvements. In addition, by coding the model in Java, the model can be implemented via the World Wide Web, greatly encouraging the utilization of the model. Systems analysis is further enabled with the utilization of a readily available backend database containing information supporting the model. The subsystem models of the ALS system model include Crew, Biomass Production, Waste Processing and Resource Recovery, Food Processing and Nutrition, and the Interconnecting Space. Each subsystem model and an overall model have been developed. Presented here is the procedure utilized to develop the modeling tool, the vision of the modeling tool, and the current focus for each of the subsystem models.
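
    The modular, object-oriented structure described (independent subsystem models advanced by a top-level system) can be sketched in a few lines; the class names, state variables, and rates below are illustrative stand-ins, not the project's Java implementation.

        # Illustrative object-oriented skeleton of a top-level ALS system model (not the project's Java code).
        class Subsystem:
            def step(self, dt, system_state):
                raise NotImplementedError

        class Crew(Subsystem):
            def step(self, dt, system_state):
                system_state["o2"] -= 0.84 * dt      # crew O2 use, kg/day (placeholder rate)

        class BiomassProduction(Subsystem):
            def step(self, dt, system_state):
                system_state["o2"] += 0.9 * dt       # photosynthetic O2 production, kg/day (placeholder rate)

        class ALSSystem:
            def __init__(self, subsystems, state):
                self.subsystems, self.state = subsystems, state

            def run(self, days, dt=1.0):
                for _ in range(int(days / dt)):
                    for s in self.subsystems:        # each subsystem updates the shared state in turn
                        s.step(dt, self.state)
                return self.state

        print(ALSSystem([Crew(), BiomassProduction()], {"o2": 100.0}).run(days=30))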

  18. Climate Impact and GIS Education Using Realistic Applications of Data.gov Thematic Datasets in a Structured Lesson-Based Workbook

    NASA Astrophysics Data System (ADS)

    Amirazodi, S.; Griffin, R.; Bugbee, K.; Ramachandran, R.; Weigel, A. M.

    2016-12-01

    This project created a workbook which teaches Earth Science to undergraduate and graduate students through guided in-class activities and take-home assignments organized around climate topics which use GIS to teach key geospatial analysis techniques and cartography skills. The workbook is structured to the White House's Data.gov climate change themes, which include Coastal Flooding, Ecosystem Vulnerability, Energy Infrastructure, Arctic, Food Resilience, Human Health, Transportation, Tribal Nations, and Water. Each theme provides access to framing questions, associated data, interactive tools, and further reading (e.g. the US Climate Resilience Toolkit and National Climate Assessment). Lessons make use of the respective theme's available resources. The structured thematic approach is designed to encourage independent exploration. The goal is to teach climate concepts and concerns, GIS techniques and approaches, and effective cartographic representation and communication of results; and foster a greater awareness of publicly available resources and datasets. To reach more audiences more effectively, a two level approach was used. Level 1 serves as an introductory study and relies on only freely available interactive tools to reach audiences with fewer resources and less familiarity. Level 2 presents a more advanced case study, and focuses on supporting common commercially available tool use and real-world analysis techniques.

  19. Climate Impact and GIS Education Using Realistic Applications of Data.gov Thematic Datasets in a Structured Lesson-Based Workbook

    NASA Technical Reports Server (NTRS)

    Amirazodi, Sara; Griffin, Robert; Bugbee, Kaylin; Ramachandran, Rahul; Weigel, Amanda

    2016-01-01

    This project created a workbook which teaches Earth Science to undergraduate and graduate students through guided in-class activities and take-home assignments organized around climate topics which use GIS to teach key geospatial analysis techniques and cartography skills. The workbook is structured to the White House's Data.gov climate change themes, which include Coastal Flooding, Ecosystem Vulnerability, Energy Infrastructure, Arctic, Food Resilience, Human Health, Transportation, Tribal Nations, and Water. Each theme provides access to framing questions, associated data, interactive tools, and further reading (e.g. the US Climate Resilience Toolkit and National Climate Assessment). Lessons make use of the respective theme's available resources. The structured thematic approach is designed to encourage independent exploration. The goal is to teach climate concepts and concerns, GIS techniques and approaches, and effective cartographic representation and communication of results; and foster a greater awareness of publicly available resources and datasets. To reach more audiences more effectively, a two level approach was used. Level 1 serves as an introductory study and relies on only freely available interactive tools to reach audiences with fewer resources and less familiarity. Level 2 presents a more advanced case study, and focuses on supporting common commercially available tool use and real-world analysis techniques.

  20. A New Approach in Applying Systems Engineering Tools and Analysis to Determine Hepatocyte Toxicogenomics Risk Levels to Human Health.

    PubMed

    Gigrich, James; Sarkani, Shahryar; Holzer, Thomas

    2017-03-01

    There is an increasing backlog of potentially toxic compounds that cannot be evaluated with current animal-based approaches in a cost-effective and expeditious manner, thus putting human health at risk. Extrapolation of animal-based test results for human risk assessment often leads to different physiological outcomes. This article introduces the use of quantitative tools and methods from systems engineering to evaluate the risk of toxic compounds by analyzing the amount of stress that human hepatocytes undergo in vitro when metabolizing GW7647 over extended times and concentrations. Hepatocytes are highly interconnected systems, which makes it challenging to interpret the high-dimensional genomics data needed to determine risk of exposure. Gene expression data on peroxisome proliferator-activated receptor-α (PPARα) binding were measured over multiple concentrations and exposure times of GW7647, and Mahalanobis distance was leveraged to establish toxicity threshold risk levels. The application of these novel systems engineering tools provides new insight into the intricate workings of human hepatocytes to determine risk threshold levels from exposure. This approach is beneficial to decision makers and scientists, and it can help reduce the backlog of untested chemical compounds due to the high cost and inefficiency of animal-based models.
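
    The thresholding idea, flagging exposures whose expression profiles lie far from the untreated distribution in Mahalanobis distance, can be sketched generically; the data dimensions, synthetic values, and chi-square cutoff below are assumptions for illustration, not the study's.

        # Generic sketch: flag treated samples whose Mahalanobis distance from the control
        # expression distribution exceeds a chi-square-based threshold (synthetic data only).
        import numpy as np
        from scipy.stats import chi2

        rng = np.random.default_rng(3)
        n_genes = 10
        controls = rng.normal(size=(60, n_genes))             # untreated profiles (synthetic)
        treated = rng.normal(loc=1.5, size=(12, n_genes))     # exposed profiles, shifted (synthetic)

        mu = controls.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(controls, rowvar=False))

        d = treated - mu
        d2 = np.einsum("ij,jk,ik->i", d, cov_inv, d)          # squared Mahalanobis distances
        threshold = chi2.ppf(0.99, df=n_genes)                # risk threshold (assumed 99th percentile)
        print("samples above threshold:", int((d2 > threshold).sum()), "of", len(treated))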

  1. A Critical and Comparative Review of Fluorescent Tools for Live-Cell Imaging.

    PubMed

    Specht, Elizabeth A; Braselmann, Esther; Palmer, Amy E

    2017-02-10

    Fluorescent tools have revolutionized our ability to probe biological dynamics, particularly at the cellular level. Fluorescent sensors have been developed on several platforms, utilizing either small-molecule dyes or fluorescent proteins, to monitor proteins, RNA, DNA, small molecules, and even cellular properties, such as pH and membrane potential. We briefly summarize the impressive history of tool development for these various applications and then discuss the most recent noteworthy developments in more detail. Particular emphasis is placed on tools suitable for single-cell analysis and especially live-cell imaging applications. Finally, we discuss prominent areas of need in future fluorescent tool development-specifically, advancing our capability to analyze and integrate the plethora of high-content data generated by fluorescence imaging.

  2. Neurocognitive inefficacy of the strategy process.

    PubMed

    Klein, Harold E; D'Esposito, Mark

    2007-11-01

    The most widely used (and taught) protocols for strategic analysis, namely Strengths, Weaknesses, Opportunities, and Threats (SWOT) and Porter's (1980) Five Forces Framework for industry analysis, have been found to be insufficient as stimuli for strategy creation or even as a basis for further strategy development. We approach this problem from a neurocognitive perspective. We see profound incompatibilities between the cognitive process (deductive reasoning) channeled into the collective mind of strategists within the formal planning process through its tools of strategic analysis (i.e., rational technologies) and the essentially inductive reasoning process actually needed to address ill-defined, complex strategic situations. Thus, strategic analysis protocols that may appear to be and, indeed, are entirely rational and logical are not interpretable as such at the neuronal substrate level where thinking takes place. The analytical structure (or propositional representation) of these tools results in a mental dead end, the phenomenon known in cognitive psychology as functional fixedness. The difficulty lies with the inability of the brain to make out meaningful (i.e., strategy-provoking) stimuli from the mental images (or depictive representations) generated by strategic analysis tools. We propose decreasing dependence on these tools and conducting further research employing brain imaging technology to explore complex data handling protocols with richer mental representation and greater potential for strategy creation.

  3. CFD Multiphysics Tool

    NASA Technical Reports Server (NTRS)

    Perrell, Eric R.

    2005-01-01

    The recent bold initiatives to expand the human presence in space require innovative approaches to the design of propulsion systems whose underlying technology is not yet mature. The space propulsion community has identified a number of candidate concepts. A short list includes solar sails, high-energy-density chemical propellants, electric and electromagnetic accelerators, solar-thermal and nuclear-thermal expanders. For each of these, the underlying physics are relatively well understood. One could easily cite authoritative texts, addressing both the governing equations, and practical solution methods for, e.g. electromagnetic fields, heat transfer, radiation, thermophysics, structural dynamics, particulate kinematics, nuclear energy, power conversion, and fluid dynamics. One could also easily cite scholarly works in which complete equation sets for any one of these physical processes have been accurately solved relative to complex engineered systems. The Advanced Concepts and Analysis Office (ACAO), Space Transportation Directorate, NASA Marshall Space Flight Center, has recently released the first alpha version of a set of computer utilities for performing the applicable physical analyses relative to candidate deep-space propulsion systems such as those listed above. PARSEC, Preliminary Analysis of Revolutionary in-Space Engineering Concepts, enables rapid iterative calculations using several physics tools developed in-house. A complete cycle of the entire tool set takes about twenty minutes. PARSEC is a level-zero/level-one design tool. For PARSEC's proof-of-concept, and preliminary design decision-making, assumptions that significantly simplify the governing equation sets are necessary. To proceed to level-two, one wishes to retain modeling of the underlying physics as close as practical to known applicable first principles. This report describes results of collaboration between ACAO, and Embry-Riddle Aeronautical University (ERAU), to begin building a set of level-two design tools for PARSEC. The "CFD Multiphysics Tool" will be the propulsive element of the tool set. The name acknowledges that space propulsion performance assessment is primarily a fluid mechanics problem. At the core of the CFD Multiphysics Tool is an open-source CFD code, HYP, under development at ERAU. ERAU is renowned for its undergraduate degree program in Aerospace Engineering, the largest in the nation. The strength of the program is its applications-oriented curriculum, which culminates in one of three two-course Engineering Design sequences: Aerospace Propulsion, Spacecraft, or Aircraft. This same philosophy applies to the HYP Project, albeit with fluid physics modeling commensurate with graduate research. HYP's purpose, like the Multiphysics Tool's, is to enable calculations of real (three-dimensional; geometrically complex; intended for hardware development) applications of high speed and propulsive fluid flows.

  4. Tools, information sources, and methods used in deciding on drug availability in HMOs.

    PubMed

    Barner, J C; Thomas, J

    1998-01-01

    The use and importance of specific decision-making tools, information sources, and drug-use management methods in determining drug availability and use in HMOs were studied. A questionnaire was sent to 303 randomly selected HMOs. Respondents were asked to rate their use of each of four formal decision-making tools and its relative importance, as well as the use and importance of eight information sources and 11 methods for managing drug availability and use, on a 5-point scale. The survey response rate was 28%. Approximately half of the respondents reported that their HMOs used decision analysis or multiattribute analysis in deciding on drug availability. If used, these tools were rated as very important. There were significant differences in levels of use by HMO type, membership size, and age. Journal articles and reference books were reported most often as information sources. Retrospective drug-use review was used very often and perceived to be very important in managing drug use. Other management methods were used only occasionally, but the importance placed on these tools when used ranged from moderately to very important. Older organizations used most of the management methods more often than did other HMOs. Decision analysis and multiattribute analysis were the most commonly used tools for deciding on which drugs to make available to HMO members, and reference books and journal articles were the most commonly used information sources. Retrospective and prospective drug-use reviews were the most commonly applied methods for managing HMO members' access to drugs.

  5. PATIKA: an integrated visual environment for collaborative construction and analysis of cellular pathways.

    PubMed

    Demir, E; Babur, O; Dogrusoz, U; Gursoy, A; Nisanci, G; Cetin-Atalay, R; Ozturk, M

    2002-07-01

    Availability of the sequences of entire genomes shifts the scientific curiosity towards the identification of function of the genomes in large scale as in genome studies. In the near future, data produced about cellular processes at molecular level will accumulate with an accelerating rate as a result of proteomics studies. In this regard, it is essential to develop tools for storing, integrating, accessing, and analyzing this data effectively. We define an ontology for a comprehensive representation of cellular events. The ontology presented here enables integration of fragmented or incomplete pathway information and supports manipulation and incorporation of the stored data, as well as multiple levels of abstraction. Based on this ontology, we present the architecture of an integrated environment named Patika (Pathway Analysis Tool for Integration and Knowledge Acquisition). Patika is composed of a server-side, scalable, object-oriented database and client-side editors to provide an integrated, multi-user environment for visualizing and manipulating networks of cellular events. This tool features automated pathway layout, functional computation support, advanced querying and a user-friendly graphical interface. We expect that Patika will be a valuable tool for rapid knowledge acquisition, microarray generated large-scale data interpretation, disease gene identification, and drug development. A prototype of Patika is available upon request from the authors.

  6. Development of Multidisciplinary, Multifidelity Analysis, Integration, and Optimization of Aerospace Vehicles

    DTIC Science & Technology

    2010-02-27

    investigated in more detail. The intermediate level of fidelity, though more expensive, is then used to refine the analysis, add geometric detail, and ... design stage is used to further refine the analysis, narrowing the design to a handful of options. Figure 1. Integrated Hierarchical Framework. In ... computational structural and computational fluid modeling. For the structural analysis tool we used McIntosh Structural Dynamics’ finite element code CNEVAL

  7. A Comparative Analysis of Life-Cycle Assessment Tools for ...

    EPA Pesticide Factsheets

    We identified and evaluated five life-cycle assessment tools that community decision makers can use to assess the environmental and economic impacts of end-of-life (EOL) materials management options. The tools evaluated in this report are waste reduction model (WARM), municipal solid waste-decision support tool (MSW-DST), solid waste optimization life-cycle framework (SWOLF), environmental assessment system for environmental technologies (EASETECH), and waste and resources assessment for the environment (WRATE). WARM, MSW-DST, and SWOLF were developed for US-specific materials management strategies, while WRATE and EASETECH were developed for European-specific conditions. All of the tools (with the exception of WARM) allow specification of a wide variety of parameters (e.g., materials composition and energy mix) to a varying degree, thus allowing users to model specific EOL materials management methods even outside the geographical domain they are originally intended for. The flexibility to accept user-specified input for a large number of parameters increases the level of complexity and the skill set needed for using these tools. The tools were evaluated and compared based on a series of criteria, including general tool features, the scope of the analysis (e.g., materials and processes included), and the impact categories analyzed (e.g., climate change, acidification). A series of scenarios representing materials management problems currently relevant to c

  8. Selection and validation of endogenous reference genes for qRT-PCR analysis in leafy spurge (Euphorbia esula)

    USDA-ARS?s Scientific Manuscript database

    Quantitative real-time polymerase chain reaction (qRT-PCR) is the most important tool in measuring levels of gene expression due to its accuracy, specificity, and sensitivity. However, the accuracy of qRT-PCR analysis strongly depends on transcript normalization using stably expressed reference gene...

  9. Microcomputers, Software and Foreign Languages for Special Purposes: An Analysis of TXTPRO.

    ERIC Educational Resources Information Center

    Tang, Michael S.

    TXTPRO, a computer program developed as a graduate-level research tool for descriptive linguistic analysis, produces simple alphabetic and word frequency lists, analyzes word combinations, and develops concordances. With modifications, a teacher could enter the program into a mainframe or a microcomputer and use it for text analyses to develop…

  10. Meta-analysis of screening and case finding tools for depression in cancer: evidence based recommendations for clinical practice on behalf of the Depression in Cancer Care consensus group.

    PubMed

    Mitchell, Alex J; Meader, Nick; Davies, Evan; Clover, Kerrie; Carter, Gregory L; Loscalzo, Matthew J; Linden, Wolfgang; Grassi, Luigi; Johansen, Christoffer; Carlson, Linda E; Zabora, James

    2012-10-01

    To examine the validity of screening and case-finding tools used in the identification of depression as defined by an ICD10/DSM-IV criterion standard. We identified 63 studies involving 19 tools (in 33 publications) designed to help clinicians identify depression in cancer settings. We used a standardized rating system. We excluded 11 tools without at least two independent studies, leaving 8 tools for comparison. Across all cancer stages there were 56 diagnostic validity studies (n=10,009). For case-finding, one stem question, two stem questions and the BDI-II all had level 2 evidence (2a, 2b and 2c respectively) and given their better acceptability we gave the stem questions a grade B recommendation. For screening, two stem questions had level 1b evidence (with high acceptability) and the BDI-II had level 2c evidence. For every 100 people screened in advanced cancer, the two questions would accurately detect 18 cases, while missing only 1 and correctly reassure 74 with 7 falsely identified. For every 100 people screened in non-palliative settings the BDI-II would accurately detect 17 cases, missing 2 and correctly reassure 70, with 11 falsely identified as cases. The main cautions are the reliance on DSM-IV definitions of major depression, the large number of small studies and the paucity of data for many tools in specific settings. Although no single tool could be offered unqualified support, several tools are likely to improve upon unassisted clinical recognition. In clinical practice, all tools should form part of an integrated approach involving further follow-up, clinical assessment and evidence based therapy. Copyright © 2012 Elsevier B.V. All rights reserved.
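
    The per-100-patients figures quoted above imply the underlying operating characteristics of the two stem questions in advanced cancer; the short worked computation below recovers the approximate sensitivity, specificity and predictive values using only the numbers reported in the abstract.

      # Recover approximate test characteristics from the reported per-100 screening outcomes
      # (advanced-cancer setting, two stem questions, figures as quoted in the abstract).
      tp, fn, tn, fp = 18, 1, 74, 7    # detected, missed, correctly reassured, falsely identified

      sensitivity = tp / (tp + fn)     # 18/19 ~ 0.95
      specificity = tn / (tn + fp)     # 74/81 ~ 0.91
      ppv = tp / (tp + fp)             # 18/25 = 0.72
      npv = tn / (tn + fn)             # 74/75 ~ 0.99

      print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
            f"PPV={ppv:.2f} NPV={npv:.2f}")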

  11. Teaching bioinformatics and neuroinformatics by using free web-based tools.

    PubMed

    Grisham, William; Schottler, Natalie A; Valli-Marill, Joanne; Beck, Lisa; Beatty, Jackson

    2010-01-01

    This completely computer-based module's purpose is to introduce students to bioinformatics resources. We present an easy-to-adopt module that weaves together several important bioinformatic tools so students can grasp how these tools are used in answering research questions. Students integrate information gathered from websites dealing with anatomy (Mouse Brain Library), quantitative trait locus analysis (WebQTL from GeneNetwork), bioinformatics and gene expression analyses (University of California, Santa Cruz Genome Browser, National Center for Biotechnology Information's Entrez Gene, and the Allen Brain Atlas), and information resources (PubMed). Instructors can use these various websites in concert to teach genetics from the phenotypic level to the molecular level, aspects of neuroanatomy and histology, statistics, quantitative trait locus analysis, and molecular biology (including in situ hybridization and microarray analysis), and to introduce bioinformatic resources. Students use these resources to discover 1) the region(s) of chromosome(s) influencing the phenotypic trait, 2) a list of candidate genes-narrowed by expression data, 3) the in situ pattern of a given gene in the region of interest, 4) the nucleotide sequence of the candidate gene, and 5) articles describing the gene. Teaching materials such as a detailed student/instructor's manual, PowerPoints, sample exams, and links to free Web resources can be found at http://mdcune.psych.ucla.edu/modules/bioinformatics.

  12. The Health Information Technology Competencies Tool: Does It Translate for Nursing Informatics in the United States?

    PubMed

    Sipes, Carolyn; Hunter, Kathleen; McGonigle, Dee; West, Karen; Hill, Taryn; Hebda, Toni

    2017-12-01

    Information technology use in healthcare delivery mandates a prepared workforce. The initial Health Information Technology Competencies tool resulted from a 2-year transatlantic effort by experts from the US and European Union to identify approaches to develop skills and knowledge needed by healthcare workers. It was determined that competencies must be identified before strategies are established, resulting in a searchable database of more than 1000 competencies representing five domains, five skill levels, and more than 250 roles. Health Information Technology Competencies is available at no cost and supports role- or competency-based queries. Health Information Technology Competencies developers suggest its use for curriculum planning, job descriptions, and professional development. The Chamberlain College of Nursing informatics research team examined Health Information Technology Competencies for its possible application to our research and our curricular development, comparing it originally with the TIGER-based Assessment of Nursing Informatics Competencies and Nursing Informatics Competency Assessment of Level 3 and Level 4 tools, which examine informatics competencies at four levels of nursing practice. Additional analysis involved the 2015 Nursing Informatics: Scope and Standards of Practice. Informatics is a Health Information Technology Competencies domain, so clear delineation of nursing-informatics competencies was expected. Researchers found TIGER-based Assessment of Nursing Informatics Competencies and Nursing Informatics Competency Assessment of Level 3 and Level 4 differed from Health Information Technology Competencies 2016 in focus, definitions, ascribed competencies, and defined levels of expertise. When Health Information Technology Competencies 2017 was compared against the nursing informatics scope and standards, researchers found an increase in the number of informatics competencies but not to a significant degree. This is not surprising, given that Health Information Technology Competencies includes all healthcare workers, while the TIGER-based Assessment of Nursing Informatics Competencies and Nursing Informatics Competency Assessment of Level 3 and Level 4 tools and the American Nurses Association Nursing Informatics: Scope and Standards of Practice are nurse specific. No clear cross mapping across these tools and the standards of nursing informatics practice exists. Further examination and review are needed to translate Health Information Technology Competencies as a viable tool for nursing informatics use in the US.

  13. Surging Seas Risk Finder: A Tool for Local-Scale Flood Risk Assessments in Coastal Cities

    NASA Astrophysics Data System (ADS)

    Kulp, S. A.; Strauss, B.

    2015-12-01

    Local decision makers in coastal cities require accurate, accessible, and thorough assessments of flood exposure risk within their individual municipality, in their efforts to mitigate against damage due to future sea level rise. To fill this need, we have developed Climate Central's Surging Seas Risk Finder, an interactive data toolkit which presents our sea level rise and storm surge analysis for every coastal town, city, county, and state within the USA. Using this tool, policy makers can easily zoom in on their local place of interest to receive a detailed flood risk assessment, which synthesizes a wide range of features including total population, socially vulnerable population, housing, property value, road miles, power plants, schools, hospitals, and many other critical facilities. Risk Finder can also be used to identify specific points of interest in danger of exposure at different flood levels. Additionally, this tool provides localized storm surge probabilities and sea level rise projections at tidal gauges along the coast, so that users can quickly understand the risk of flooding in their area over the coming decades.

  14. System Risk Assessment and Allocation in Conceptual Design

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Smith, Natasha L.; Zang, Thomas A. (Technical Monitor)

    2003-01-01

    As aerospace systems continue to evolve in addressing newer challenges in air and space transportation, there exists a heightened priority for significant improvement in system performance, cost effectiveness, reliability, and safety. Tools, which synthesize multidisciplinary integration, probabilistic analysis, and optimization, are needed to facilitate design decisions allowing trade-offs between cost and reliability. This study investigates tools for probabilistic analysis and probabilistic optimization in the multidisciplinary design of aerospace systems. A probabilistic optimization methodology is demonstrated for the low-fidelity design of a reusable launch vehicle at two levels, a global geometry design and a local tank design. Probabilistic analysis is performed on a high fidelity analysis of a Navy missile system. Furthermore, decoupling strategies are introduced to reduce the computational effort required for multidisciplinary systems with feedback coupling.

  15. Towards understanding the breast cancer epigenome: a comparison of genome-wide DNA methylation and gene expression data

    PubMed Central

    Michiels, Stefan; Metzger-Filho, Otto; Saini, Kamal S.

    2016-01-01

    Until recently, an elevated disease risk has been ascribed to a genetic predisposition; however, exciting progress over the past years has uncovered alternate elements of inheritance that involve epigenetic regulation. Epigenetic changes are heritably stable alterations that include DNA methylation, histone modifications and RNA-mediated silencing. Aberrant DNA methylation is a common molecular basis for a number of important human diseases, including breast cancer. Changes in DNA methylation profoundly affect global gene expression patterns. What is emerging is a more dynamic and complex association between DNA methylation and gene expression than previously believed. Although many tools have already been developed for analyzing genome-wide gene expression data, tools for analyzing genome-wide DNA methylation have not yet reached the same level of refinement. Here we provide an in-depth analysis of DNA methylation in parallel with gene expression data characteristics and describe the particularities of low-level and high-level analyses of DNA methylation data. Low-level analysis refers to pre-processing of methylation data (i.e. normalization, transformation and filtering), whereas high-level analysis is focused on illustrating the application of the widely used class comparison, class prediction and class discovery methods to DNA methylation data. Furthermore, we investigate the influence of DNA methylation on gene expression by measuring the correlation between the degree of CpG methylation and the level of expression, and by exploring the pattern of methylation across the promoter region. PMID:26657508

  16. Towards understanding the breast cancer epigenome: a comparison of genome-wide DNA methylation and gene expression data.

    PubMed

    Singhal, Sandeep K; Usmani, Nawaid; Michiels, Stefan; Metzger-Filho, Otto; Saini, Kamal S; Kovalchuk, Olga; Parliament, Matthew

    2016-01-19

    Until recently, an elevated disease risk has been ascribed to a genetic predisposition; however, exciting progress over the past years has uncovered alternate elements of inheritance that involve epigenetic regulation. Epigenetic changes are heritably stable alterations that include DNA methylation, histone modifications and RNA-mediated silencing. Aberrant DNA methylation is a common molecular basis for a number of important human diseases, including breast cancer. Changes in DNA methylation profoundly affect global gene expression patterns. What is emerging is a more dynamic and complex association between DNA methylation and gene expression than previously believed. Although many tools have already been developed for analyzing genome-wide gene expression data, tools for analyzing genome-wide DNA methylation have not yet reached the same level of refinement. Here we provide an in-depth analysis of DNA methylation in parallel with gene expression data characteristics and describe the particularities of low-level and high-level analyses of DNA methylation data. Low-level analysis refers to pre-processing of methylation data (i.e. normalization, transformation and filtering), whereas high-level analysis is focused on illustrating the application of the widely used class comparison, class prediction and class discovery methods to DNA methylation data. Furthermore, we investigate the influence of DNA methylation on gene expression by measuring the correlation between the degree of CpG methylation and the level of expression, and by exploring the pattern of methylation across the promoter region.
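
    As a minimal sketch of the kind of high-level analysis described here (correlating CpG methylation with expression), the snippet below computes a per-gene Spearman correlation between methylation beta values and expression levels across samples. The array shapes and simulated data are illustrative assumptions, not the authors' pipeline.

      # Sketch: per-gene Spearman correlation between promoter methylation and expression.
      # Assumes two aligned matrices of shape (n_genes, n_samples); data are simulated.
      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(0)
      beta = rng.uniform(0, 1, size=(5, 30))              # methylation beta values per gene/sample
      expr = 10 - 6 * beta + rng.normal(0, 1, (5, 30))    # expression, inversely related here

      for g in range(beta.shape[0]):
          rho, p = spearmanr(beta[g], expr[g])
          print(f"gene {g}: rho={rho:.2f}, p={p:.3g}")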

  17. A high-level 3D visualization API for Java and ImageJ.

    PubMed

    Schmid, Benjamin; Schindelin, Johannes; Cardona, Albert; Longair, Mark; Heisenberg, Martin

    2010-05-21

    Current imaging methods such as Magnetic Resonance Imaging (MRI), Confocal microscopy, Electron Microscopy (EM) or Selective Plane Illumination Microscopy (SPIM) yield three-dimensional (3D) data sets in need of appropriate computational methods for their analysis. The reconstruction, segmentation and registration are best approached from the 3D representation of the data set. Here we present a platform-independent framework based on Java and Java 3D for accelerated rendering of biological images. Our framework is seamlessly integrated into ImageJ, a free image processing package with a vast collection of community-developed biological image analysis tools. Our framework enriches the ImageJ software libraries with methods that greatly reduce the complexity of developing image analysis tools in an interactive 3D visualization environment. In particular, we provide high-level access to volume rendering, volume editing, surface extraction, and image annotation. The ability to rely on a library that removes the low-level details enables concentrating software development efforts on the algorithm implementation parts. Our framework enables biomedical image software development to be built with 3D visualization capabilities with very little effort. We offer the source code and convenient binary packages along with extensive documentation at http://3dviewer.neurofly.de.

  18. Learning Photogrammetry with Interactive Software Tool PhoX

    NASA Astrophysics Data System (ADS)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology where high quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach of teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals in new topics and provide them with more insight into what happens behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. During the last years the software package PhoX has been developed; it is part of a new didactic concept in photogrammetry and related subjects and also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features, and 3D objects. It supports almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises where they have the opportunity to analyse results at a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and the direct view of the resulting effect in image or object space.

  19. Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.

    PubMed

    Robles-Rubio, Carlos Alejandro; Bertolizio, Gianluca; Brown, Karen A; Kearney, Robert E

    2015-01-01

    Infants recovering from anesthesia are at risk of life threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorers' ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential for longitudinal and multicenter studies.
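
    The abstract does not state which agreement statistic was used; one common way to quantify the intra- and inter-scorer repeatability it emphasizes is Cohen's kappa over the categorical pattern labels assigned to the same segments. The sketch below uses generic, made-up pattern labels and scores, purely for illustration.

      # Sketch: quantifying inter-scorer repeatability of categorical pattern labels with
      # Cohen's kappa (the abstract does not specify which statistic was actually used).
      from sklearn.metrics import cohen_kappa_score

      # Pattern labels assigned to the same 12 segments by two trained scorers (made-up data).
      scorer_a = ["pause", "sync", "async", "sync", "movement", "sync",
                  "pause", "sigh", "sync", "async", "sync", "movement"]
      scorer_b = ["pause", "sync", "async", "sync", "sync", "sync",
                  "pause", "sigh", "sync", "async", "pause", "movement"]

      kappa = cohen_kappa_score(scorer_a, scorer_b)
      print(f"Cohen's kappa = {kappa:.2f}")   # 1.0 = perfect agreement, 0 = chance level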

  20. Combining multiple tools outperforms individual methods in gene set enrichment analyses.

    PubMed

    Alhamdoosh, Monther; Ng, Milica; Wilson, Nicholas J; Sheridan, Julie M; Huynh, Huy; Wilson, Michael J; Ritchie, Matthew E

    2017-02-01

    Gene set enrichment (GSE) analysis allows researchers to efficiently extract biological insight from long lists of differentially expressed genes by interrogating them at a systems level. In recent years, there has been a proliferation of GSE analysis methods and hence it has become increasingly difficult for researchers to select an optimal GSE tool based on their particular dataset. Moreover, the majority of GSE analysis methods do not allow researchers to simultaneously compare gene set level results between multiple experimental conditions. The ensemble of gene set enrichment analyses (EGSEA) is a method developed for RNA-sequencing data that combines results from twelve algorithms and calculates collective gene set scores to improve the biological relevance of the highest ranked gene sets. EGSEA's gene set database contains around 25 000 gene sets from sixteen collections. It has multiple visualization capabilities that allow researchers to view gene sets at various levels of granularity. EGSEA has been tested on simulated data and on a number of human and mouse datasets and, based on biologists' feedback, consistently outperforms the individual tools that have been combined. Our evaluation demonstrates the superiority of the ensemble approach for GSE analysis, and its utility to effectively and efficiently extrapolate biological functions and potential involvement in disease processes from lists of differentially regulated genes. EGSEA is available as an R package at http://www.bioconductor.org/packages/EGSEA/ . The gene sets collections are available in the R package EGSEAdata from http://www.bioconductor.org/packages/EGSEAdata/ . monther.alhamdoosh@csl.com.au mritchie@wehi.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
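
    EGSEA's collective scoring is only summarized above; as a minimal sketch of the general ensemble idea (aggregating the ranks that several GSE methods assign to each gene set), the snippet below uses simple average-rank aggregation with invented method ranks. This is not EGSEA's exact scoring formula, which is implemented in the Bioconductor package itself.

      # Sketch: combine gene-set rankings from several GSE methods by average rank.
      import numpy as np

      gene_sets = ["APOPTOSIS", "CELL_CYCLE", "INFLAMMATION", "WNT_SIGNALING"]
      # Rank of each gene set under three hypothetical GSE methods (1 = most enriched).
      ranks = np.array([
          [1, 2, 1],
          [3, 1, 2],
          [2, 3, 4],
          [4, 4, 3],
      ])

      avg_rank = ranks.mean(axis=1)
      for i in np.argsort(avg_rank):                 # best (lowest) average rank first
          print(f"{gene_sets[i]:15s} average rank = {avg_rank[i]:.2f}")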

  1. Scalability Analysis of Gleipnir: A Memory Tracing and Profiling Tool, on Titan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janjusic, Tommy; Kartsaklis, Christos; Wang, Dali

    2013-01-01

    Application performance is hindered by a variety of factors but most notably driven by the well-known CPU-memory speed gap (also known as the memory wall). Understanding an application's memory behavior is key when trying to optimize performance. Understanding application performance properties is facilitated by various performance profiling tools. The scope of profiling tools varies in complexity, ease of deployment, profiling performance, and the detail of profiled information. Specifically, using profiling tools for performance analysis is a common task when optimizing and understanding scientific applications on complex and large-scale systems such as Cray's XK7. This paper describes the performance characteristics of using Gleipnir, a memory tracing tool, on the Titan Cray XK7 system when instrumenting large applications such as the Community Earth System Model. Gleipnir is a memory tracing tool built as a plug-in for the Valgrind instrumentation framework. The goal of Gleipnir is to provide fine-grained trace information. The generated traces are a stream of executed memory transactions mapped to internal structures per process, thread, function, and finally the data structure or variable. Our focus was to expose tool performance characteristics when using Gleipnir in combination with external tools, such as the cache simulator Gl CSim, to characterize the tool's overall performance. In this paper we describe our experience with deploying Gleipnir on the Titan Cray XK7 system, report on the tool's ease of use, and analyze run-time performance characteristics under various workloads. While all performance aspects are important, we mainly focus on I/O characteristics analysis due to the emphasis on the tool's output, which consists of trace files. Moreover, the tool depends on the run-time system to provide the necessary infrastructure to expose low-level system detail; therefore, we also discuss theoretical benefits that could be achieved if such modules were present.

  2. Analysing the Theme of Pollution in Portuguese Geography and Biology Textbooks

    ERIC Educational Resources Information Center

    Tracana, Rosa Branca; Ferreira, Claudia; Ferreira, Maria Eduarda; Carvalho, Graca S.

    2008-01-01

    Environmental education has been seen as a basic tool to contribute to the change of conceptions, values and attitudes. Textbook analysis is a major element in the evaluation of how the educational goals (at the legislative level of national programmes) are implemented at the school level. The aim of the present study was to analyse the…

  3. Integrated tools for control-system analysis

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user friendly software package (MATRIXx) are used to provide a high level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and at selected points, such as in the plotting section, by inputting data. There are five evaluations: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations. A time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
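
    MATRIXx itself is proprietary, but two of the evaluations listed above (Bode frequency response and closed-loop eigenvalues) can be illustrated with open tools. The sketch below uses SciPy on an arbitrary second-order example plant; the plant and feedback structure are assumptions for illustration, not taken from the report.

      # Sketch: Bode response and closed-loop eigenvalues for an example plant,
      # using SciPy/NumPy rather than MATRIXx.
      import numpy as np
      from scipy import signal

      # Example open-loop plant G(s) = 10 / (s^2 + 2s + 10)
      plant = signal.TransferFunction([10], [1, 2, 10])

      # Bode frequency response (magnitude in dB, phase in degrees)
      w, mag_db, phase_deg = signal.bode(plant)
      print("magnitude at lowest frequency: %.1f dB" % mag_db[0])

      # Closed-loop eigenvalues under unity negative feedback:
      # the closed-loop denominator of G/(1+G) is den(G) + num(G)
      closed_loop_den = np.polyadd(plant.den, plant.num)
      print("closed-loop eigenvalues:", np.roots(closed_loop_den))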

  4. Automatic analysis of online image data for law enforcement agencies by concept detection and instance search

    NASA Astrophysics Data System (ADS)

    de Boer, Maaike H. T.; Bouma, Henri; Kruithof, Maarten C.; ter Haar, Frank B.; Fischer, Noëlle M.; Hagendoorn, Laurens K.; Joosten, Bart; Raaijmakers, Stephan

    2017-10-01

    The information available on-line and off-line, from open as well as from private sources, is growing at an exponential rate and places an increasing demand on the limited resources of Law Enforcement Agencies (LEAs). The absence of appropriate tools and techniques to collect, process, and analyze the volumes of complex and heterogeneous data has created a severe information overload. If a solution is not found, the impact on law enforcement will be dramatic, e.g. because important evidence is missed or the investigation time is too long. Furthermore, there is an uneven level of capabilities to deal with the large volumes of complex and heterogeneous data that come from multiple open and private sources at national level across the EU, which hinders cooperation and information sharing. Consequently, there is a pertinent need to develop tools, systems and processes which expedite online investigations. In this paper, we describe a suite of analysis tools to identify and localize generic concepts, instances of objects and logos in images, which constitutes a significant portion of everyday law enforcement data. We describe how incremental learning based on only a few examples and large-scale indexing are addressed in both concept detection and instance search. Our search technology allows querying of the database by visual examples and by keywords. Our tools are packaged in a Docker container to guarantee easy deployment on a system and our tools exploit possibilities provided by open source toolboxes, contributing to the technical autonomy of LEAs.

  5. Connecting Architecture and Implementation

    NASA Astrophysics Data System (ADS)

    Buchgeher, Georg; Weinreich, Rainer

    Software architectures are still typically defined and described independently from implementation. To avoid architectural erosion and drift, architectural representation needs to be continuously updated and synchronized with system implementation. Existing approaches for architecture representation like informal architecture documentation, UML diagrams, and Architecture Description Languages (ADLs) provide only limited support for connecting architecture descriptions and implementations. Architecture management tools like Lattix, SonarJ, and Sotoarc and UML-tools tackle this problem by extracting architecture information directly from code. This approach works for low-level architectural abstractions like classes and interfaces in object-oriented systems but fails to support architectural abstractions not found in programming languages. In this paper we present an approach for linking and continuously synchronizing a formalized architecture representation to an implementation. The approach is a synthesis of functionality provided by code-centric architecture management and UML tools and higher-level architecture analysis approaches like ADLs.

  6. Using integrated models to minimize environmentally induced wavefront error in optomechanical design and analysis

    NASA Astrophysics Data System (ADS)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate design goal of an optical system subjected to dynamic loads is to minimize system level wavefront error (WFE). In random response analysis, system WFE is difficult to predict from finite element results due to the loss of phase information. In the past, the use of system WFE was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for determining system level WFE using a linear optics model is presented. An error estimate is included in the analysis output based on fitting errors of mode shapes. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.

  7. Designing workload analysis questionnaire to evaluate needs of employees

    NASA Astrophysics Data System (ADS)

    Astuti, Rahmaniyah Dwi; Navi, Muhammad Abdu Haq

    2018-02-01

    A mismatch between workload and work capacity is one of the main obstacles to achieving optimal results. In office settings, workload is difficult to determine because the work is non-repetitive. Employees work towards targets set for a working period, and at the end of the period an evaluation of employee performance is usually carried out to assess staffing needs. The aim of this study is to design a workload analysis questionnaire to evaluate the efficiency level of each position, used as an indicator of staffing needs, based on the Indonesian State Employment Agency regulation on workload analysis. The approach is applied to the state-owned enterprise PT. X, with 3 positions selected as a pilot project. Position A is held by 2 employees, position B by 7 employees, and position C by 6 employees. From the calculation results, position A has an efficiency level of 1.33 ("very good"), position B has an efficiency level of 1.71 ("enough"), and position C has an efficiency level of 1.03 ("very good"). Applying the tool suggests staffing needs of 3 people for position A, 5 for position B, and 6 for position C. The differences between the current number of employees and the calculated results were then analyzed by interviewing the employees to gather additional data on personal perceptions. It can be concluded that this workload evaluation tool can be used as an alternative way to evaluate staffing needs in an office.
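
    The abstract does not reproduce the regulation's formula, so the sketch below only illustrates one plausible reading of "efficiency level" and suggested headcount (annual workload hours divided by effective working hours per employee). The constant, the function, and the numbers are assumptions for illustration, not the authors' calculation.

      # Rough sketch of a workload-analysis calculation (assumed formula, not the
      # regulation's exact wording): headcount need and efficiency level of a position.
      import math

      EFFECTIVE_HOURS_PER_EMPLOYEE = 1250    # assumed effective annual working hours

      def evaluate_position(total_workload_hours, current_employees):
          needed = math.ceil(total_workload_hours / EFFECTIVE_HOURS_PER_EMPLOYEE)
          efficiency = total_workload_hours / (current_employees * EFFECTIVE_HOURS_PER_EMPLOYEE)
          return needed, efficiency

      # Hypothetical position: 3,300 workload hours per year, currently 2 employees.
      needed, eff = evaluate_position(3300, 2)
      print(f"suggested headcount = {needed}, efficiency level = {eff:.2f}")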

  8. Operational Analysis in the Launch Environment

    NASA Technical Reports Server (NTRS)

    James, George; Kaouk, Mo; Cao, Tim; Fogt, Vince; Rocha, Rodney; Schultz, Ken; Tucker, Jon-Michael; Rayos, Eli; Bell, Jeff; Alldredge, David

    2012-01-01

    The launch environment is a challenging regime to work in due to changing system dynamics, changing environmental loading, joint compression loads that cannot be easily applied on the ground, and control effects. Operational testing is one of the few feasible approaches to capture system level dynamics since ground testing cannot reproduce all of these conditions easily. However, the most successful applications of Operational Modal Testing involve systems with good stationarity and long data acquisition times. This paper covers an ongoing effort to understand the launch environment and the utility of current operational modal tools. This work is expected to produce a collection of operational tools that can be applied to the non-stationary launch environment, experience dealing with launch data, and an expanding database of flight parameters such as damping. This paper reports on recent efforts to build a software framework for data processing that utilizes existing and specialty tools; understand the limits of current tools; assess a wider variety of current tools; and expand the experience with additional datasets, as well as to begin to address issues raised in earlier launch analysis studies.

  9. A Web-Based Decision Tool to Improve Contraceptive Counseling for Women With Chronic Medical Conditions: Protocol For a Mixed Methods Implementation Study.

    PubMed

    Wu, Justine P; Damschroder, Laura J; Fetters, Michael D; Zikmund-Fisher, Brian J; Crabtree, Benjamin F; Hudson, Shawna V; Ruffin, Mack T; Fucinari, Juliana; Kang, Minji; Taichman, L Susan; Creswell, John W

    2018-04-18

    Women with chronic medical conditions, such as diabetes and hypertension, have a higher risk of pregnancy-related complications compared with women without medical conditions and should be offered contraception if desired. Although evidence based guidelines for contraceptive selection in the presence of medical conditions are available via the United States Medical Eligibility Criteria (US MEC), these guidelines are underutilized. Research also supports the use of decision tools to promote shared decision making between patients and providers during contraceptive counseling. The overall goal of the MiHealth, MiChoice project is to design and implement a theory-driven, Web-based tool that incorporates the US MEC (provider-level intervention) within the vehicle of a contraceptive decision tool for women with chronic medical conditions (patient-level intervention) in community-based primary care settings (practice-level intervention). This will be a 3-phase study that includes a predesign phase, a design phase, and a testing phase in a randomized controlled trial. This study protocol describes phase 1 and aim 1, which is to determine patient-, provider-, and practice-level factors that are relevant to the design and implementation of the contraceptive decision tool. This is a mixed methods implementation study. To customize the delivery of the US MEC in the decision tool, we selected high-priority constructs from the Consolidated Framework for Implementation Research and the Theoretical Domains Framework to drive data collection and analysis at the practice and provider level, respectively. A conceptual model that incorporates constructs from the transtheoretical model and the health beliefs model undergirds patient-level data collection and analysis and will inform customization of the decision tool for this population. We will recruit 6 community-based primary care practices and conduct quantitative surveys and semistructured qualitative interviews with women who have chronic medical conditions, their primary care providers (PCPs), and clinic staff, as well as field observations of practice activities. Quantitative survey data will be summarized with simple descriptive statistics and relationships between participant characteristics and contraceptive recommendations (for PCPs), and current contraceptive use (for patients) will be examined using Fisher exact test. We will conduct thematic analysis of qualitative data from interviews and field observations. The integration of data will occur by comparing, contrasting, and synthesizing qualitative and quantitative findings to inform the future development and implementation of the intervention. We are currently enrolling practices and anticipate study completion in 15 months. This protocol describes the first phase of a multiphase mixed methods study to develop and implement a Web-based decision tool that is customized to meet the needs of women with chronic medical conditions in primary care settings. Study findings will promote contraceptive counseling via shared decision making and reflect evidence-based guidelines for contraceptive selection. ClinicalTrials.gov NCT03153644; https://clinicaltrials.gov/ct2/show/NCT03153644 (Archived by WebCite at http://www.webcitation.org/6yUkA5lK8). ©Justine P Wu, Laura J Damschroder, Michael D Fetters, Brian J Zikmund-Fisher, Benjamin F Crabtree, Shawna V Hudson, Mack T Ruffin IV, Juliana Fucinari, Minji Kang, L Susan Taichman, John W Creswell. 
Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 18.04.2018.
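
    The protocol plans Fisher exact tests on the survey data; a minimal sketch of such a test on a hypothetical 2x2 table (for example, provider familiarity with the US MEC versus guideline-concordant recommendation, with invented counts) is shown below. The table contents are assumptions, not study results.

      # Sketch: Fisher exact test on a hypothetical 2x2 table, as planned in the protocol.
      from scipy.stats import fisher_exact

      #                      concordant   not concordant
      table = [[14, 6],    # providers familiar with US MEC
               [5, 15]]    # providers not familiar with US MEC

      odds_ratio, p_value = fisher_exact(table)
      print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")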

  10. Investigation of machinability characteristics on EN47 steel for cutting force and tool wear using optimization technique

    NASA Astrophysics Data System (ADS)

    M, Vasu; Shivananda Nayaka, H.

    2018-06-01

    In this experimental work, a dry turning process was carried out on EN47 spring steel using a coated tungsten carbide tool insert with a 0.8 mm nose radius, and the process parameters were optimized using statistical techniques. Experiments were conducted at three different cutting speeds (625, 796 and 1250 rpm) with three different feed rates (0.046, 0.062 and 0.093 mm/rev) and depths of cut (0.2, 0.3 and 0.4 mm). Experiments were based on a full factorial design (FFD) with three factors at three levels (3³). Analysis of variance was used to identify the significant factors for each output response. The results reveal that feed rate is the most significant factor influencing cutting force, followed by depth of cut, with cutting speed having the least significance. The optimum machining condition for cutting force was obtained from the statistical technique. Tool wear measurements were performed at the optimum condition of Vc = 796 rpm, ap = 0.2 mm, f = 0.046 mm/rev. The minimum tool wear observed was 0.086 mm after 5 min of machining. Analysis of tool wear with a confocal microscope showed that tool wear increases with increasing cutting time.
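
    A 3x3x3 full factorial over the stated factor levels enumerates 27 runs; the sketch below generates that design matrix using the levels reported in the abstract. The response data and the ANOVA itself are not reproduced here.

      # Sketch: enumerate the 3x3x3 full factorial design over the reported factor levels.
      from itertools import product

      cutting_speed_rpm = [625, 796, 1250]
      feed_mm_per_rev   = [0.046, 0.062, 0.093]
      depth_of_cut_mm   = [0.2, 0.3, 0.4]

      design = list(product(cutting_speed_rpm, feed_mm_per_rev, depth_of_cut_mm))
      print(len(design), "runs")           # 27
      for run, (vc, f, ap) in enumerate(design[:3], start=1):
          print(f"run {run}: Vc={vc} rpm, f={f} mm/rev, ap={ap} mm")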

  11. "ATLAS" Advanced Technology Life-cycle Analysis System

    NASA Technical Reports Server (NTRS)

    Lollar, Louis F.; Mankins, John C.; ONeil, Daniel A.

    2004-01-01

    Making good decisions concerning research and development portfolios, and concerning the best systems concepts to pursue, as early as possible in the life cycle of advanced technologies is a key goal of R&D management. This goal depends upon the effective integration of information from a wide variety of sources as well as focused, high-level analyses intended to inform such decisions. The presentation provides a summary of the Advanced Technology Life-cycle Analysis System (ATLAS) methodology and tool kit. ATLAS encompasses a wide range of methods and tools. A key foundation for ATLAS is the NASA-created Technology Readiness Level (TRL) system. The toolkit is largely spreadsheet based (as of August 2003). This product is being funded by the Human and Robotics Technology Program Office, Office of Exploration Systems, NASA Headquarters, Washington D.C. and is being integrated by Dan O'Neil of the Advanced Projects Office, NASA/MSFC, Huntsville, AL.

  12. Fungal genome resources at NCBI.

    PubMed

    Robbertse, B; Tatusova, T

    2011-09-01

    The National Center for Biotechnology Information (NCBI) is well known for the nucleotide sequence archive, GenBank, and the sequence analysis tool BLAST. However, NCBI integrates many types of biomolecular data from a variety of sources and makes it available to the scientific community as interactive web resources as well as organized releases of bulk data. These tools are available to explore and compare fungal genomes. Searching all databases with Fungi [organism] at http://www.ncbi.nlm.nih.gov/ is the quickest way to find resources of interest with fungal entries. Some tools, though, are resource specific and can be accessed indirectly from a particular database in the Entrez system. These include graphical viewers and comparative analysis tools such as TaxPlot, TaxMap and UniGene DDD (found via the UniGene Homepage). Gene and BioProject pages also serve as portals to external data such as community annotation websites, BioGrid and UniProt. There are many different ways of accessing genomic data at NCBI. Depending on the focus and goal of research projects or the level of interest, a user would select a particular route for accessing genomic databases and resources. This review article describes methods of accessing fungal genome data and provides examples that illustrate the use of analysis tools.
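
    The quickest programmatic analogue of the "Fungi [organism]" search described above is NCBI's public E-utilities; a minimal sketch using the documented esearch endpoint follows. The choice of Entrez database and search term is illustrative, not the review's own workflow.

      # Sketch: query NCBI E-utilities (esearch) for fungal records, mirroring the
      # "Fungi [organism]" search described above.
      import json
      import urllib.parse
      import urllib.request

      params = urllib.parse.urlencode({
          "db": "genome",                  # illustrative choice of Entrez database
          "term": "Fungi[Organism]",
          "retmode": "json",
      })
      url = f"https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?{params}"

      with urllib.request.urlopen(url) as resp:
          result = json.load(resp)["esearchresult"]

      print("records found:", result["count"])
      print("first IDs:", result["idlist"][:5])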

  13. The development of plant food processing in the Levant: insights from use-wear analysis of Early Epipalaeolithic ground stone tools

    PubMed Central

    Dubreuil, Laure; Nadel, Dani

    2015-01-01

    In recent years, the study of percussive, pounding and grinding tools has provided new insights into human evolution, more particularly regarding the development of technology enabling the processing and exploitation of plant resources. Some of these studies focus on early evidence for flour production, an activity frequently perceived as an important step in the evolution of plant exploitation. The present paper investigates plant food preparation in mobile hunter-gatherer societies from the Southern Levant. The analysis consists of a use-wear study of 18 tools recovered from Ohalo II, a 23 000-year-old site in Israel showing an exceptional level of preservation. Our sample includes a slab previously interpreted as a lower implement used for producing flour, based on the presence of cereal starch residues. The use-wear data we have obtained provide crucial information about the function of this and other percussive tools at Ohalo II, as well as on investment in tool manufacture, discard strategies and evidence for plant processing in the Late Pleistocene. The use-wear analysis indicates that the production of flour was a sporadic activity at Ohalo II, predating by thousands of years the onset of routine processing of plant foods. PMID:26483535

  14. The development of plant food processing in the Levant: insights from use-wear analysis of Early Epipalaeolithic ground stone tools.

    PubMed

    Dubreuil, Laure; Nadel, Dani

    2015-11-19

    In recent years, the study of percussive, pounding and grinding tools has provided new insights into human evolution, more particularly regarding the development of technology enabling the processing and exploitation of plant resources. Some of these studies focus on early evidence for flour production, an activity frequently perceived as an important step in the evolution of plant exploitation. The present paper investigates plant food preparation in mobile hunter-gatherer societies from the Southern Levant. The analysis consists of a use-wear study of 18 tools recovered from Ohalo II, a 23 000-year-old site in Israel showing an exceptional level of preservation. Our sample includes a slab previously interpreted as a lower implement used for producing flour, based on the presence of cereal starch residues. The use-wear data we have obtained provide crucial information about the function of this and other percussive tools at Ohalo II, as well as on investment in tool manufacture, discard strategies and evidence for plant processing in the Late Pleistocene. The use-wear analysis indicates that the production of flour was a sporadic activity at Ohalo II, predating by thousands of years the onset of routine processing of plant foods. © 2015 The Author(s).

  15. Captive chimpanzees' manual laterality in tool use context: Influence of communication and of sociodemographic factors.

    PubMed

    Prieur, Jacques; Pika, Simone; Blois-Heulin, Catherine; Barbu, Stéphanie

    2018-04-14

    Understanding variations of apes' laterality between activities is a central issue when investigating the evolutionary origins of human hemispheric specialization of manual functions and language. We assessed laterality of 39 chimpanzees in a non-communication action similar to termite fishing that we compared with data on five frequent conspecific-directed gestures involving a tool previously exploited in the same subjects. We evaluated, first, population-level manual laterality for tool-use in non-communication actions; second, the influence of sociodemographic factors (age, sex, group, and hierarchy) on manual laterality in both non-communication actions and gestures. No significant right-hand bias at the population level was found for non-communication tool use, contrary to our previous findings for gestures involving a tool. A multifactorial analysis revealed that hierarchy and age particularly modulated manual laterality. Dominants and immatures were more right-handed when using a tool in gestures than in non-communication actions. On the contrary, subordinates, adolescents, young and mature adults as well as males were more right-handed when using a tool in non-communication actions than in gestures. Our findings support the hypothesis that some primate species may have a specific left-hemisphere system for processing gestures, distinct from the cerebral system that processes non-communication manual actions, and partly support the tool-use hypothesis. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Finite-element analysis of NiTi wire deflection during orthodontic levelling treatment

    NASA Astrophysics Data System (ADS)

    Razali, M. F.; Mahmud, A. S.; Mokhtar, N.; Abdullah, J.

    2016-02-01

    Finite-element analysis is an important product development tool in the medical devices industry for design and failure analysis of devices. This tool helps device designers quickly explore various design options, optimize specific designs, and gain deeper insight into how a device actually performs. In this study, three-dimensional finite-element models of a superelastic nickel-titanium arch wire engaged in a three-bracket system were developed. The aim was to measure the effect of binding friction developed at the wire-bracket interaction on the remaining recovery force available for tooth movement. Uniaxial and three-bracket bending tests were modelled and validated against experimental work. The predictions of the three-bracket bending model show good agreement with the experimental results.

  17. Situational Awareness Geospatial Application (iSAGA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sher, Benjamin

    Situational Awareness Geospatial Application (iSAGA) is a geospatial situational awareness software tool that uses an algorithm to extract location data from nearly any internet-based, or custom data source and display it geospatially; allows user-friendly conduct of spatial analysis using custom-developed tools; searches complex Geographic Information System (GIS) databases and accesses high resolution imagery. iSAGA has application at the federal, state and local levels of emergency response, consequence management, law enforcement, emergency operations and other decision makers as a tool to provide complete, visual, situational awareness using data feeds and tools selected by the individual agency or organization. Feeds may be layered and custom tools developed to uniquely suit each subscribing agency or organization. iSAGA may similarly be applied to international agencies and organizations.

  18. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.

  19. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action.

    PubMed

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels-from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures.
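
    The published MdRQA implementation is the authors' MATLAB code in the appendix; as a rough Python sketch of the core idea (a recurrence matrix over multidimensional states and the recurrence-rate measure), under the assumption of a simple Euclidean-distance threshold:

      # Sketch of the core MdRQA idea: build a recurrence matrix over multidimensional
      # time-series states and compute the recurrence rate. This is only a simplified
      # illustration, not the authors' MATLAB implementation.
      import numpy as np

      def recurrence_rate(series, radius):
          """series: array of shape (n_timepoints, n_dimensions)."""
          # Pairwise Euclidean distances between multidimensional states
          diffs = series[:, None, :] - series[None, :, :]
          dists = np.linalg.norm(diffs, axis=-1)
          rec = dists <= radius                    # recurrence matrix (boolean)
          n = len(series)
          # Exclude the main diagonal (self-recurrences) from the rate
          return (rec.sum() - n) / (n * n - n)

      # Example: a 3-dimensional "group" signal with 200 time points
      rng = np.random.default_rng(1)
      signal_3d = np.cumsum(rng.normal(size=(200, 3)), axis=0)
      print(f"recurrence rate = {recurrence_rate(signal_3d, radius=2.0):.3f}")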

  20. Socio-economic inequity in demand for insecticide-treated nets, in-door residual house spraying, larviciding and fogging in Sudan.

    PubMed

    Onwujekwe, Obinna; Malik, El-Fatih Mohamed; Mustafa, Sara Hassan; Mnzava, Abraham

    2005-12-15

    In order to optimally prioritize and use public and private budgets for equitable malaria vector control, there is a need to determine the level and determinants of consumer demand for different vector control tools. To determine the demand from people of different socio-economic groups for indoor residual house-spraying (IRHS), insecticide-treated nets (ITNs), larviciding with chemicals (LWC), and space spraying/fogging (SS) and the disease control implications of the result. Ratings and levels of willingness-to-pay (WTP) for the vector control tools were determined using a random cross-sectional sample of 720 households drawn from two states. WTP was elicited using the bidding game. An asset-based socio-economic status (SES) index was used to explore whether WTP was related to SES of the respondents. IRHS received the highest proportion of highest preferred rating (41.0%) followed by ITNs (23.1%). However, ITNs had the highest mean WTP followed by IRHS, while LWC had the least. The regression analysis showed that SES was positively and statistically significantly related to WTP across the four vector control tools and that the respondents' rating of IRHS and ITNs significantly explained their levels of WTP for the two tools. People were willing to pay for all the vector-control tools, but the demand for the vector control tools was related to the SES of the respondents. Hence, it is vital that there are public policies and financing mechanisms to ensure equitable provision and utilisation of vector control tools, as well as protecting the poor from cost-sharing arrangements.

  1. Contribution of suppression difficulty and lessons learned in forecasting fire suppression operations productivity: A methodological approach

    Treesearch

    Francisco Rodríguez y Silva; Armando González-Cabán

    2016-01-01

    We propose an economic analysis using utility, productivity, and efficiency theories to provide fire managers a decision support tool to determine the most efficient fire management program levels. By incorporating managers’ accumulated fire suppression experience (capitalized experience) in the analysis we help fire managers...

  2. Using Miscue Analysis to Assess Comprehension in Deaf College Readers

    ERIC Educational Resources Information Center

    Albertini, John; Mayer, Connie

    2011-01-01

    For over 30 years, teachers have used miscue analysis as a tool to assess and evaluate the reading abilities of hearing students in elementary and middle schools and to design effective literacy programs. More recently, teachers of deaf and hard-of-hearing students have also reported its usefulness for diagnosing word- and phrase-level reading…

  3. Accounting for Student Success: An Empirical Analysis of the Origins and Spread of State Student Unit-Record Systems

    ERIC Educational Resources Information Center

    Hearn, James C.; McLendon, Michael K.; Mokher, Christine G.

    2008-01-01

    This event history analysis explores factors driving the emergence over recent decades of comprehensive state-level student unit-record [SUR] systems, a potentially powerful tool for increasing student success. Findings suggest that the adoption of these systems is rooted in demand and ideological factors. Larger states, states with high…

  4. Web-Based Analysis for Student-Generated Complex Genetic Profiles

    ERIC Educational Resources Information Center

    Kass, David H.; LaRoe, Robert

    2007-01-01

    A simple, rapid method for generating complex genetic profiles using Alu-based markers was recently developed for students primarily at the undergraduate level to learn more about forensics and paternity analysis. On the basis of the Cold Spring Harbor Allele Server, which provides an excellent tool for analyzing a single Alu variant, we present a…

  5. Using Molecular Visualization to Explore Protein Structure and Function and Enhance Student Facility with Computational Tools

    ERIC Educational Resources Information Center

    Terrell, Cassidy R.; Listenberger, Laura L.

    2017-01-01

    Recognizing that undergraduate students can benefit from analysis of 3D protein structure and function, we have developed a multiweek, inquiry-based molecular visualization project for Biochemistry I students. This project uses a virtual model of cyclooxygenase-1 (COX-1) to guide students through multiple levels of protein structure analysis. The…

  6. ICT Capacity Building: A Critical Discourse Analysis of Rwandan Policies from Higher Education Perspective

    ERIC Educational Resources Information Center

    Byungura, Jean Claude; Hansson, Henrik; Masengesho, Kamuzinzi; Karunaratne, Thashmee

    2016-01-01

    With the development of technology in the 21st Century, education systems attempt to integrate technology-based tools to improve experiences in pedagogy and administration. It is becoming increasingly prominent to build human and ICT infrastructure capacities at universities from policy to implementation level. Using a critical discourse analysis,…

  7. Decision-Making, Information Communication Technology, and Data Analysis by School Leaders about Student Achievement

    ERIC Educational Resources Information Center

    Akoma, Ahunna Margaux

    2012-01-01

    This case study of one school district examined how school leaders use student performance data and technology-based data analysis tools to engage in data-informed decision-making for continuous improvement. School leaders in this context included leaders at the district, school, and classroom levels. An extensive literature review provided the…

  8. Anvil Forecast Tool in the Advanced Weather Interactive Processing System (AWIPS)

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Launch Weather Officers (LWOs) from the 45th Weather Squadron (45 WS) and forecasters from the National Weather Service (NWS) Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violating the Lightning Launch Commit Criteria (LLCC) (Krider et al. 2006; Space Shuttle Flight Rules (FR), NASA/JSC 2004). As a result, the Applied Meteorology Unit (AMU) developed a tool that creates an anvil threat corridor graphic that can be overlaid on satellite imagery using the Meteorological Interactive Data Display System (MIDDS, Short and Wheeler, 2002). The tool helps forecasters estimate the locations of thunderstorm anvils at one, two, and three hours into the future. It has been used extensively in launch and landing operations by both the 45 WS and SMG. The Advanced Weather Interactive Processing System (AWIPS) is now used along with MIDDS for weather analysis and display at SMG. In Phase I of this task, SMG tasked the AMU to transition the tool from MIDDS to AWIPS (Barrett et al., 2007). For Phase II, SMG requested the AMU make the Anvil Forecast Tool in AWIPS more configurable by creating the capability to read model gridded data from user-defined model files instead of hard-coded files. An NWS local AWIPS application called AGRID was used to accomplish this. In addition, SMG needed to be able to define the pressure levels for the model data, instead of hard-coding the bottom level as 300 mb and the top level as 150 mb. This paper describes the initial development of the Anvil Forecast Tool for MIDDS, followed by the migration of the tool to AWIPS in Phase I. It then gives a detailed presentation of the Phase II improvements to the AWIPS tool.

  9. Porcupine: A visual pipeline tool for neuroimaging analysis

    PubMed Central

    Snoek, Lukas; Knapen, Tomas

    2018-01-01

    The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one’s analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one’s analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0. PMID:29746461
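
    Porcupine generates Nipype-based pipeline code; the sketch below is a minimal hand-written Nipype workflow of the kind such a tool targets, assuming Nipype and FSL are installed. The node names and the input file path are illustrative and are not Porcupine output.

        # Minimal Nipype workflow of the kind a visual pipeline tool might generate.
        # Assumes Nipype and FSL are available; file path and node names are illustrative.
        from nipype import Node, Workflow
        from nipype.interfaces import fsl

        skullstrip = Node(fsl.BET(frac=0.5), name="skullstrip")        # brain extraction
        smooth = Node(fsl.IsotropicSmooth(fwhm=4), name="smooth")      # spatial smoothing

        wf = Workflow(name="minimal_preproc", base_dir="work")
        wf.connect(skullstrip, "out_file", smooth, "in_file")          # wire outputs to inputs

        skullstrip.inputs.in_file = "sub-01_T1w.nii.gz"                # hypothetical input file
        wf.run()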

  10. Geographic information systems: introduction.

    PubMed

    Calistri, Paolo; Conte, Annamaria; Freier, Jerome E; Ward, Michael P

    2007-01-01

    The recent exponential growth of the science and technology of geographic information systems (GIS) has made a tremendous contribution to epidemiological analysis and has led to the development of new powerful tools for the surveillance of animal diseases. GIS, spatial analysis and remote sensing provide valuable methods to collect and manage information for epidemiological surveys. Spatial patterns and trends of disease can be correlated with climatic and environmental information, thus contributing to a better understanding of the links between disease processes and explanatory spatial variables. Until recently, these tools were underexploited in the field of veterinary public health, due to the prohibitive cost of hardware and the complexity of GIS software that required a high level of expertise. The revolutionary developments in computer performance of the last decade have not only reduced the costs of equipment but have made available easy-to-use Web-based software which in turn have meant that GIS are more widely accessible by veterinary services at all levels. At the same time, the increased awareness of the possibilities offered by these tools has created new opportunities for decision-makers to enhance their planning, analysis and monitoring capabilities. These technologies offer a new way of sharing and accessing spatial and non-spatial data across groups and institutions. The series of papers included in this compilation aim to: - define the state of the art in the use of GIS in veterinary activities - identify priority needs in the development of new GIS tools at the international level for the surveillance of animal diseases and zoonoses - define practical proposals for their implementation. The topics addressed are presented in the following order in this book: - importance of GIS for the monitoring of animal diseases and zoonoses - GIS application in surveillance activities - spatial analysis in veterinary epidemiology - data collection and remote sensing applications - Web - GIS as a tool for data and knowledge sharing. All 43 manuscripts selected for this book have been peer-reviewed. These contributions were originally commissioned for the First international conference on the use of GIS in veterinary activities organised by the Istituto Zooprofilattico Sperimentale dell'Abruzzo e del Molise 'G. Caporale', Teramo, Italy, and the World Organisation for Animal Health (OIE: Office International des Epizooties) that was held in Silvi Marina, Italy, from 8 to 11 October 2006. The editors would like to thank all authors for their valuable contributions.

  11. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    PubMed

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
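
    The sketch below illustrates reading a C3D file through BTK's Python bindings; the class and method names follow the BTK API as commonly documented but should be treated as assumptions here, and the file name is hypothetical.

        # Sketch of reading a C3D acquisition with BTK's Python bindings.
        # Class/method names are assumed from the BTK documentation.
        import btk

        reader = btk.btkAcquisitionFileReader()
        reader.SetFilename("gait_trial.c3d")      # hypothetical file
        reader.Update()
        acq = reader.GetOutput()

        print("Point frequency:", acq.GetPointFrequency())
        print("Number of frames:", acq.GetPointFrameNumber())
        for i in range(acq.GetPointNumber()):     # iterate over marker trajectories
            print(acq.GetPoint(i).GetLabel())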

  12. gHRV: Heart rate variability analysis made easy.

    PubMed

    Rodríguez-Liñares, L; Lado, M J; Vila, X A; Méndez, A J; Cuesta, P

    2014-08-01

    In this paper, the gHRV software tool is presented. It is a simple, free and portable tool developed in Python for analysing heart rate variability. It includes a graphical user interface and it can import files in multiple formats, analyse time intervals in the signal, test statistical significance and export the results. This paper also contains, as an example of use, a clinical analysis performed with the gHRV tool, namely to determine whether the heart rate variability indices change across different stages of sleep. Results from tests completed by researchers who have tried gHRV are also explained: in general, the application was positively valued and the results reflect a high level of satisfaction. gHRV is in continuous development and new versions will include suggestions made by testers. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
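
    As a minimal illustration of the time-domain indices such a tool reports, the sketch below computes SDNN, RMSSD, and pNN50 from a series of RR intervals; the RR values are made up and this is not gHRV code.

        import numpy as np

        def time_domain_hrv(rr_ms):
            """Basic time-domain HRV indices from a series of RR intervals in ms."""
            rr = np.asarray(rr_ms, dtype=float)
            diff = np.diff(rr)
            return {
                "mean_rr": rr.mean(),                           # mean RR interval
                "sdnn": rr.std(ddof=1),                         # overall variability
                "rmssd": np.sqrt(np.mean(diff ** 2)),           # short-term variability
                "pnn50": 100.0 * np.mean(np.abs(diff) > 50.0),  # % successive diffs > 50 ms
            }

        # Hypothetical RR series (ms)
        print(time_domain_hrv([812, 790, 805, 822, 799, 810, 830, 795]))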

  13. Tools for the functional interpretation of metabolomic experiments.

    PubMed

    Chagoyen, Monica; Pazos, Florencio

    2013-11-01

    The so-called 'omics' approaches used in modern biology aim at massively characterizing the molecular repertoires of living systems at different levels. Metabolomics is one of the latest additions to the 'omics' family and it deals with the characterization of the set of metabolites in a given biological system. As metabolomic techniques become more massive and allow the characterization of larger sets of metabolites, automatic methods for analyzing these sets in order to obtain meaningful biological information are required. Only recently have the first tools specifically designed for this task in metabolomics appeared. They are based on approaches previously used in transcriptomics and other 'omics', such as annotation enrichment analysis. These, together with generic tools for metabolic analysis and visualization not specifically designed for metabolomics, will surely be in the toolbox of researchers doing metabolomic experiments in the near future.

  14. Metallothionein--a promising tool for cancer diagnostics.

    PubMed

    Krizkova, S; Fabrik, I; Adam, V; Hrabeta, J; Eckschlager, T; Kizek, R

    2009-01-01

    The latest research outcomes indicate that metallothionein (MT) levels in peripheral blood and serum from cancer patients can provide much useful information about the type or clinical stage of the disease, or the response to therapy. MT plays a key role in the transport of essential heavy metals, detoxification of toxic metals and protection of cells against oxidative stress. Serum MT levels of cancer patients are three times higher than those of control patients (0.5 microM). The elevated MT levels in cancer cells are probably related to their increased proliferation and protection against apoptosis. Automated electrochemical detection of MT allows its serial analysis in a very small volume with excellent sensitivity, reliability and reproducibility, and therefore it can be considered a new tool for cancer diagnosis (Fig. 4, Ref. 55).

  15. Stochastic analysis of motor-control stability, polymer based force sensing, and optical stimulation as a preventive measure for falls

    NASA Astrophysics Data System (ADS)

    Landrock, Clinton K.

    Falls are the leading cause of all external injuries. Outcomes of falls include the leading cause of traumatic brain injury and bone fractures, and high direct medical costs in the billions of dollars. This work focused on developing three areas of enabling component technology to be used in postural control monitoring tools targeting the mitigation of falls. The first was an analysis tool based on stochastic fractal analysis to reliably measure levels of motor control. The second focus was on thin film wearable pressure sensors capable of relaying data for the first tool. The third was new thin film advanced optics for improving phototherapy devices targeting postural control disorders. Two populations, athletes and elderly, were studied against control groups. The results of these studies clearly show that monitoring postural stability in at-risk groups can be achieved reliably, and an integrated wearable system can be envisioned for both monitoring and treatment purposes. Keywords: electro-active polymer, ionic polymer-metal composite, postural control, motor control, fall prevention, sports medicine, fractal analysis, physiological signals, wearable sensors, phototherapy, photobiomodulation, nano-optics.

  16. Setting up a proper power spectral density (PSD) and autocorrelation analysis for material and process characterization

    NASA Astrophysics Data System (ADS)

    Rutigliani, Vito; Lorusso, Gian Francesco; De Simone, Danilo; Lazzarino, Frederic; Rispens, Gijsbert; Papavieros, George; Gogolides, Evangelos; Constantoudis, Vassilios; Mack, Chris A.

    2018-03-01

    Power spectral density (PSD) analysis is playing an increasingly critical role in the understanding of line-edge roughness (LER) and linewidth roughness (LWR) in a variety of applications across the industry. It is an essential step in obtaining an unbiased LWR estimate, as well as an extremely useful tool for process and material characterization. However, the PSD estimate can be affected by both random and systematic artifacts caused by image acquisition and measurement settings, which can irremediably alter its information content. In this paper, we report on the impact of various setting parameters (smoothing image processing filters, pixel size, and SEM noise levels) on the PSD estimate. We also discuss the use of the PSD analysis tool in a variety of cases. Looking beyond the basic roughness estimate, we use PSD and autocorrelation analysis to characterize resist blur [1], as well as low- and high-frequency roughness content, and we apply this technique to guide the EUV material stack selection. Our results clearly indicate that, if properly used, the PSD methodology is a very sensitive tool for investigating material and process variations.
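
    The sketch below shows a bare-bones periodogram-style PSD of a sampled edge profile and a 3-sigma roughness estimate derived from it; the profile is synthetic, the normalization is illustrative, and it does not implement the noise-bias correction or the smoothing-filter analysis discussed in the abstract.

        import numpy as np

        # Hypothetical line-edge profile: edge deviations sampled at a fixed pixel size.
        rng = np.random.default_rng(0)
        pixel_nm = 1.0                          # sampling step along the line (nm)
        edge = rng.normal(0.0, 1.5, 2048)       # edge position deviations (nm)

        # One-sided PSD via the FFT; the roughness variance is the integral of the PSD.
        n = edge.size
        spectrum = np.fft.rfft(edge - edge.mean())
        freq = np.fft.rfftfreq(n, d=pixel_nm)               # spatial frequency (1/nm)
        psd = (np.abs(spectrum) ** 2) * pixel_nm / n        # illustrative normalization

        df = freq[1] - freq[0]
        sigma2 = 2.0 * np.sum(psd[1:]) * df                 # approximate variance (DC skipped)
        print("3-sigma LER estimate (nm):", 3.0 * np.sqrt(sigma2))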

  17. Fungal Genomics Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grigoriev, Igor

    The JGI Fungal Genomics Program aims to scale up sequencing and analysis of fungal genomes to explore the diversity of fungi important for energy and the environment, and to promote functional studies on a system level. Combining new sequencing technologies and comparative genomics tools, JGI is now leading the world in fungal genome sequencing and analysis. Over 120 sequenced fungal genomes with analytical tools are available via MycoCosm (www.jgi.doe.gov/fungi), a web-portal for fungal biologists. Our model of interacting with user communities, unique among other sequencing centers, helps organize these communities, improves genome annotation and analysis work, and facilitates new larger-scale genomic projects. This resulted in 20 high-profile papers published in 2011 alone and contributing to the Genomics Encyclopedia of Fungi, which targets fungi related to plant health (symbionts, pathogens, and biocontrol agents) and biorefinery processes (cellulose degradation, sugar fermentation, industrial hosts). Our next grand challenges include larger scale exploration of fungal diversity (1000 fungal genomes), developing molecular tools for DOE-relevant model organisms, and analysis of complex systems and metagenomes.

  18. Data and Tools | Energy Analysis | NREL

    Science.gov Websites

    NREL develops energy analysis data and tools. Its collections include Data Products, Technology and Performance Analysis Tools, Energy Systems Analysis Tools, and Economic and Financial Analysis Tools.

  19. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.; Olariu, Stephen

    1995-01-01

    The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies. These include two major software systems currently used in the conceptual level design of aerospace vehicles. These tools are SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front-end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes. This results in a streamlined exchange of data between programs, reducing errors and improving efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.

  20. Advanced Usage of Vehicle Sketch Pad for CFD-Based Conceptual Design

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Li, Wu

    2013-01-01

    Conceptual design is the most fluid phase of aircraft design. It is important to be able to perform large scale design space exploration of candidate concepts that can achieve the design intent to avoid more costly configuration changes in later stages of design. This also means that conceptual design is highly dependent on the disciplinary analysis tools to capture the underlying physics accurately. The required level of analysis fidelity can vary greatly depending on the application. Vehicle Sketch Pad (VSP) allows the designer to easily construct aircraft concepts and make changes as the design matures. More recent development efforts have enabled VSP to bridge the gap to high-fidelity analysis disciplines such as computational fluid dynamics and structural modeling for finite element analysis. This paper focuses on the current state-of-the-art geometry modeling for the automated process of analysis and design of low-boom supersonic concepts using VSP and several capability-enhancing design tools.

  1. Integrated corridor management modeling results report : Dallas, Minneapolis, and San Diego.

    DOT National Transportation Integrated Search

    2012-02-01

    This executive summary documents the analysis methodologies, tools, and performance measures used to analyze Integrated Corridor Management (ICM) strategies; and presents high-level results for the successful implementation of ICM at three Stage 2 Pi...

  2. Taking the business continuity programme to a corporate leadership role.

    PubMed

    Messer, Ira

    2009-11-01

    The paper discusses a process to raise the awareness and value of your continuity programme to higher levels by leveraging existing data. It gives examples of how to utilise the structure of the Business Impact Analysis tool to develop an enterprise level of information capture, and then utilise that data to generate an enterprise level 'group think' which results in incorporation of business continuity as a part of the business-as-usual model.

  3. Rationalization and the Individuals with Disabilities Education Act: Exploring the Characteristics of Multi-Level Performance Monitoring and Improvement

    ERIC Educational Resources Information Center

    Mahu, Robert J.

    2017-01-01

    Performance measurement has emerged as a management tool that, accompanied by advances in technology and data analysis, has allowed public officials to control public policy at multiple levels of government. In the United States, the federal government has used performance measurement as part of an accountability strategy that enables Congress and…

  4. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling: GEOSTATISTICAL SENSITIVITY ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Chen, Xingyuan; Ye, Ming

    Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally as driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed parameters.
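
    As a rough illustration of variance-based sensitivity indices, the sketch below estimates first-order indices for a toy three-parameter model by binning Monte Carlo samples and computing the variance of conditional means; it is a crude stand-in, not the hierarchical geostatistical method of the cited study.

        import numpy as np

        rng = np.random.default_rng(1)

        def model(x1, x2, x3):
            # Toy stand-in for a model output (e.g., a simulated head value).
            return 2.0 * x1 + 0.5 * x2 ** 2 + 0.1 * x3

        n = 100_000
        x = rng.uniform(-1.0, 1.0, size=(n, 3))
        y = model(x[:, 0], x[:, 1], x[:, 2])

        def first_order_index(xi, y, bins=50):
            """Crude first-order sensitivity: Var(E[Y | x_i]) / Var(Y) via quantile binning."""
            edges = np.quantile(xi, np.linspace(0, 1, bins + 1))
            which = np.clip(np.digitize(xi, edges[1:-1]), 0, bins - 1)
            cond_means = np.array([y[which == b].mean() for b in range(bins)])
            return cond_means.var() / y.var()

        for i in range(3):
            print(f"S{i + 1} ~ {first_order_index(x[:, i], y):.2f}")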

  5. B-MIC: An Ultrafast Three-Level Parallel Sequence Aligner Using MIC.

    PubMed

    Cui, Yingbo; Liao, Xiangke; Zhu, Xiaoqian; Wang, Bingqiang; Peng, Shaoliang

    2016-03-01

    Sequence alignment, in which raw sequencing data are mapped to a reference genome, is the central process in sequence analysis. The large amount of data generated by NGS is far beyond the processing capabilities of existing alignment tools. Consequently, sequence alignment becomes the bottleneck of sequence analysis. Intensive computing power is required to address this challenge. Intel recently announced the MIC coprocessor, which can provide massive computing power. Tianhe-2, currently the world's fastest supercomputer, is equipped with three MIC coprocessors in each compute node. A key feature of sequence alignment is that different reads are independent. Considering this property, we proposed a MIC-oriented three-level parallelization strategy to speed up BWA, a widely used sequence alignment tool, and developed our ultrafast parallel sequence aligner: B-MIC. B-MIC contains three levels of parallelization: firstly, parallelization of data IO and read alignment by a three-stage parallel pipeline; secondly, parallelization enabled by MIC coprocessor technology; thirdly, inter-node parallelization implemented by MPI. In this paper, we demonstrate that B-MIC outperforms BWA by a combination of those techniques using an Inspur NF5280M server and the Tianhe-2 supercomputer. To the best of our knowledge, B-MIC is the first sequence alignment tool to run on the Intel MIC, and it can achieve a more than fivefold speedup over the original BWA while maintaining the alignment precision.
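
    The sketch below illustrates the pipeline level of parallelism (read, align, and write stages connected by queues) on a toy scale; the "alignment" stage is a stub, and none of this is B-MIC or BWA code.

        # Toy three-stage pipeline (read -> align -> write) illustrating pipelined
        # parallelism of the kind described for B-MIC; the alignment step is a stub.
        from multiprocessing import Process, Queue

        SENTINEL = None

        def reader(out_q):
            for i in range(10):                      # pretend these are read batches
                out_q.put(f"read_batch_{i}")
            out_q.put(SENTINEL)

        def aligner(in_q, out_q):
            while (batch := in_q.get()) is not SENTINEL:
                out_q.put(batch + "_aligned")        # stand-in for the real alignment kernel
            out_q.put(SENTINEL)

        def writer(in_q):
            while (item := in_q.get()) is not SENTINEL:
                print("writing", item)

        if __name__ == "__main__":
            q1, q2 = Queue(), Queue()
            stages = [Process(target=reader, args=(q1,)),
                      Process(target=aligner, args=(q1, q2)),
                      Process(target=writer, args=(q2,))]
            for p in stages:
                p.start()
            for p in stages:
                p.join()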

  6. Study of the Effect of Lubricant Emulsion Percentage and Tool Material on Surface Roughness in Machining of EN-AC 48000 Alloy

    NASA Astrophysics Data System (ADS)

    Soltani, E.; Shahali, H.; Zarepour, H.

    2011-01-01

    In this paper, the effect of machining parameters, namely lubricant emulsion percentage and tool material, on surface roughness has been studied in the machining of EN-AC 48000 aluminum alloy. EN-AC 48000 is an important industrial aluminum alloy, and its machining is challenging due to built-up edge formation and tool wear. An L9 Taguchi standard orthogonal array has been applied as the experimental design to investigate the effect of the factors and their interaction. Nine machining tests have been carried out with three random replications, resulting in 27 experiments. Three types of cutting tools, namely coated carbide (CD1810), uncoated carbide (H10), and polycrystalline diamond (CD10), have been used in this research. The lubricant emulsion percentage is set at three levels: 3%, 5%, and 10%. Statistical analysis has been employed to study the effect of the factors and their interactions using the ANOVA method. Moreover, the optimal factor levels have been determined through signal-to-noise (S/N) ratio analysis. Also, a regression model has been provided to predict the surface roughness. Finally, the results of the confirmation tests have been presented to verify the adequacy of the predictive model. In this research, surface quality was improved by 9% using lubricant and the statistical optimization method.

  7. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  8. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion (Li-ion) batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge (SOC) histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory (NREL) has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite. This suite of tools pairs NREL’s high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
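
    The sketch below is a toy capacity-fade model combining a square-root-of-time calendar term with a throughput-proportional cycling term; all coefficients are invented for illustration and are unrelated to NREL's degradation model.

        import numpy as np

        def capacity_fade(days, cycles_per_day, dod, temp_c,
                          k_cal=0.0005, k_cyc=1e-5):
            """Toy capacity-fade model: calendar fade ~ sqrt(time), cycle fade ~ throughput.

            All coefficients are illustrative placeholders, not NREL's degradation model.
            """
            arrhenius = np.exp(0.06 * (temp_c - 25.0))          # crude temperature acceleration
            calendar = k_cal * arrhenius * np.sqrt(days)        # storage-driven fade
            cycling = k_cyc * dod * cycles_per_day * days       # use-driven fade
            return max(0.0, 1.0 - calendar - cycling)           # remaining capacity fraction

        # Ten years of daily cycling at 80% depth of discharge in a warm climate.
        print(f"Remaining capacity: {capacity_fade(3650, 1.0, 0.8, 30.0):.2%}")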

  9. Pulse Shape Discrimination in the MAJORANA DEMONSTRATOR

    NASA Astrophysics Data System (ADS)

    Haufe, Christopher; Majorana Collaboration

    2017-09-01

    The MAJORANA DEMONSTRATOR is an experiment constructed to search for neutrinoless double-beta decays in germanium-76 and to demonstrate the feasibility of deploying a large-scale experiment in a phased and modular fashion. It consists of two modular arrays of natural and 76Ge-enriched germanium p-type point contact detectors totaling 44.1 kg, located at the 4850' level of the Sanford Underground Research Facility in Lead, South Dakota, USA. A large effort is underway to analyze the data currently being taken by the DEMONSTRATOR. Key components of this effort are analysis tools that allow for pulse shape discrimination: techniques that significantly reduce background levels in the neutrinoless double-beta decay region of interest. These tools are able to identify and reject multi-site events from Compton scattering as well as events from alpha particle interactions. This work serves as an overview of these analysis tools and highlights the unique advantages that the HPGe p-type point contact detector provides to pulse shape discrimination. This material is supported by the U.S. Department of Energy, Office of Science, Office of Nuclear Physics, the Particle Astrophysics and Nuclear Physics Programs of the National Science Foundation, and the Sanford Underground Research Facility.

  10. Design and implementation of highly parallel pipelined VLSI systems

    NASA Astrophysics Data System (ADS)

    Delange, Alphonsus Anthonius Jozef

    A methodology and its realization as a prototype CAD (Computer Aided Design) system for the design and analysis of complex multiprocessor systems are presented. The design is an iterative process in which the behavioral specifications of the system components are refined into structural descriptions consisting of interconnections and lower-level components. A model for the representation and analysis of multiprocessor systems at several levels of abstraction and an implementation of a CAD system based on this model are described. Also described are a high-level design language, an object-oriented development kit for tool design, a design data management system, and design and analysis tools, such as a high-level simulator and a graphics design interface, which are integrated into the prototype system. Procedures are described for the synthesis of semiregular processor arrays; for computing the switching of input/output signals, memory management, and control of the processor array; and for the sequencing and segmentation of input/output data streams due to partitioning and clustering of the processor array during the subsequent synthesis steps. The architecture and control of a parallel system are designed and each component is mapped to a module or module generator in a symbolic layout library, compacted for the design rules of VLSI (Very Large Scale Integration) technology. An example is given of the design of a processor that is a useful building block for highly parallel pipelined systems in the signal/image processing domains.

  11. Analysis technique for controlling system wavefront error with active/adaptive optics

    NASA Astrophysics Data System (ADS)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
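
    As a small illustration of the linear-optics-model idea, the sketch below solves for actuator commands that minimize residual RMS wavefront error by least squares against a random influence matrix; the matrices are synthetic and the example omits the error-estimate and state-space extensions described in the abstract.

        import numpy as np

        # Hypothetical linear optics model: columns of A are actuator influence functions
        # sampled as wavefront error (WFE) over m pupil points; w is the disturbance WFE.
        rng = np.random.default_rng(2)
        m, n_act = 500, 12
        A = rng.normal(size=(m, n_act))          # influence matrix (e.g., from FEA/optical model)
        w = rng.normal(size=m)                   # disturbance WFE to be corrected

        # Least-squares actuator commands that minimize residual RMS WFE: A x ~ -w
        x, *_ = np.linalg.lstsq(A, -w, rcond=None)
        residual = w + A @ x
        print("RMS WFE before:", np.sqrt(np.mean(w ** 2)))
        print("RMS WFE after :", np.sqrt(np.mean(residual ** 2)))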

  12. CHESS (CgHExpreSS): a comprehensive analysis tool for the analysis of genomic alterations and their effects on the expression profile of the genome.

    PubMed

    Lee, Mikyung; Kim, Yangseok

    2009-12-16

    Genomic alterations frequently occur in many cancer patients and play important mechanistic roles in the pathogenesis of cancer. Furthermore, they can modify the expression level of genes due to altered copy number in the corresponding region of the chromosome. An accumulating body of evidence supports the possibility that strong genome-wide correlation exists between DNA content and gene expression. Therefore, more comprehensive analysis is needed to quantify the relationship between genomic alteration and gene expression. A well-designed bioinformatics tool is essential to perform this kind of integrative analysis. A few programs have already been introduced for integrative analysis. However, the published software has substantial limitations for comprehensive integrated analysis because of the algorithms and visualization modules it implements. To address this issue, we have implemented the Java-based program CHESS to allow integrative analysis of two experimental data sets: genomic alteration and genome-wide expression profile. CHESS is composed of a genomic alteration analysis module and an integrative analysis module. The genomic alteration analysis module detects genomic alteration by applying a threshold-based method or the SW-ARRAY algorithm and investigates whether the detected alteration is phenotype specific or not. On the other hand, the integrative analysis module measures the genomic alteration's influence on gene expression. It is divided into two separate parts. The first part calculates overall correlation between comparative genomic hybridization ratio and gene expression level by applying the following three statistical methods: simple linear regression, Spearman rank correlation and Pearson's correlation. In the second part, CHESS detects the genes that are differentially expressed according to the genomic alteration pattern with three alternative statistical approaches: Student's t-test, Fisher's exact test and the chi-square test. By successive operation of the two modules, users can clarify how gene expression levels are affected by phenotype-specific genomic alterations. As CHESS was developed in both Java application and web environments, it can be run on a web browser or a local machine. It also supports all experimental platforms if a properly formatted text file is provided to include the chromosomal position of probes and their gene identifiers. CHESS is a user-friendly tool for investigating disease-specific genomic alterations and quantitative relationships between those genomic alterations and genome-wide gene expression profiling.
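
    The sketch below reproduces, on invented numbers, the two kinds of tests the abstract describes: an overall copy-number/expression correlation (Pearson and Spearman) and a t-test comparing expression between altered and unaltered samples; it is not CHESS code.

        import numpy as np
        from scipy import stats

        # Hypothetical matched measurements for one gene across samples:
        # aCGH log2 ratio (copy number) and expression level.
        cgh_ratio = np.array([-0.8, -0.2, 0.0, 0.1, 0.4, 0.6, 0.9, 1.1])
        expression = np.array([5.1, 6.0, 6.2, 6.4, 7.0, 7.3, 8.1, 8.6])

        # Overall copy-number/expression association, as in the first module.
        print("Pearson :", stats.pearsonr(cgh_ratio, expression))
        print("Spearman:", stats.spearmanr(cgh_ratio, expression))

        # Differential expression between altered (gain) and unaltered samples (t-test).
        gain = cgh_ratio > 0.3
        print("t-test  :", stats.ttest_ind(expression[gain], expression[~gain]))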

  13. Applications of temporal kernel canonical correlation analysis in adherence studies.

    PubMed

    John, Majnu; Lencz, Todd; Ferbinteanu, Janina; Gallego, Juan A; Robinson, Delbert G

    2017-10-01

    Adherence to medication is often measured as a continuous outcome but analyzed as a dichotomous outcome due to lack of appropriate tools. In this paper, we illustrate the use of the temporal kernel canonical correlation analysis (tkCCA) as a method to analyze adherence measurements and symptom levels on a continuous scale. The tkCCA is a novel method developed for studying the relationship between neural signals and hemodynamic response detected by functional MRI during spontaneous activity. Although the tkCCA is a powerful tool, it has not been utilized outside the application that it was originally developed for. In this paper, we simulate time series of symptoms and adherence levels for patients with a hypothetical brain disorder and show how the tkCCA can be used to understand the relationship between them. We also examine, via simulations, the behavior of the tkCCA under various missing value mechanisms and imputation methods. Finally, we apply the tkCCA to a real data example of psychotic symptoms and adherence levels obtained from a study based on subjects with a first episode of schizophrenia, schizophreniform or schizoaffective disorder.

  14. ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence.

    PubMed

    Di Cataldo, Santa; Tonti, Simone; Bottino, Andrea; Ficarra, Elisa

    2016-05-01

    The automated analysis of indirect immunofluorescence images for Anti-Nuclear Autoantibody (ANA) testing is a fairly recent field that is receiving ever-growing interest from the research community. ANA testing leverages the categorization of intensity level and fluorescent pattern of IIF images of HEp-2 cells to perform a differential diagnosis of important autoimmune diseases. Nevertheless, it suffers from a tremendous lack of repeatability due to subjectivity in the visual interpretation of the images. The automatization of the analysis is seen as the only valid solution to this problem. Several works in the literature address individual steps of the workflow; nonetheless, integrating such steps and assessing their effectiveness as a whole is still an open challenge. We present a modular tool, ANAlyte, able to characterize an IIF image in terms of fluorescent intensity level and fluorescent pattern without any user interaction. For this purpose, ANAlyte integrates the following: (i) an Intensity Classifier module, which categorizes the intensity level of the input slide based on multi-scale contrast assessment; (ii) a Cell Segmenter module, which splits the input slide into individual HEp-2 cells; (iii) a Pattern Classifier module, which determines the fluorescent pattern of the slide based on the pattern of the individual cells. To demonstrate the accuracy and robustness of our tool, we experimentally validated ANAlyte on two different public benchmarks of IIF HEp-2 images with a rigorous leave-one-out cross-validation strategy. We obtained overall accuracies for fluorescent intensity and pattern classification of around 85% and above 90%, respectively. We assessed all results by comparisons with some of the most representative state-of-the-art works. Unlike most of the other works in the recent literature, ANAlyte aims at the automatization of all the major steps of ANA image analysis. Results on public benchmarks demonstrate that the tool can characterize HEp-2 slides in terms of intensity and fluorescent pattern with accuracy better than or comparable to state-of-the-art techniques, even when such techniques are run on manually segmented cells. Hence, ANAlyte can be proposed as a valid solution to the problem of ANA testing automatization. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  15. A GIS-based methodology for the estimation of potential volcanic damage and its application to Tenerife Island, Spain

    NASA Astrophysics Data System (ADS)

    Scaini, C.; Felpeto, A.; Martí, J.; Carniel, R.

    2014-05-01

    This paper presents a GIS-based methodology to estimate damages produced by volcanic eruptions. The methodology is constituted by four parts: definition and simulation of eruptive scenarios, exposure analysis, vulnerability assessment and estimation of expected damages. Multi-hazard eruptive scenarios are defined for the Teide-Pico Viejo active volcanic complex, and simulated through the VORIS tool. The exposure analysis identifies the elements exposed to the hazard at stake and focuses on the relevant assets for the study area. The vulnerability analysis is based on previous studies on the built environment and complemented with the analysis of transportation and urban infrastructures. Damage assessment is performed associating a qualitative damage rating to each combination of hazard and vulnerability. This operation consists in a GIS-based overlap, performed for each hazardous phenomenon considered and for each element. The methodology is then automated into a GIS-based tool using an ArcGIS® program. Given the eruptive scenarios and the characteristics of the exposed elements, the tool produces expected damage maps. The tool is applied to the Icod Valley (North of Tenerife Island) which is likely to be affected by volcanic phenomena in case of eruption from both the Teide-Pico Viejo volcanic complex and North-West basaltic rift. Results are thematic maps of vulnerability and damage that can be displayed at different levels of detail, depending on the user preferences. The aim of the tool is to facilitate territorial planning and risk management in active volcanic areas.

  16. ProphTools: general prioritization tools for heterogeneous biological networks.

    PubMed

    Navarro, Carmen; Martínez, Victor; Blanco, Armando; Cano, Carlos

    2017-12-01

    Networks have been proven effective representations for the analysis of biological data. As such, there exist multiple methods to extract knowledge from biological networks. However, these approaches usually limit their scope to a single biological entity type of interest or they lack the flexibility to analyze user-defined data. We developed ProphTools, a flexible open-source command-line tool that performs prioritization on a heterogeneous network. ProphTools prioritization combines a Flow Propagation algorithm similar to a Random Walk with Restarts and a weighted propagation method. A flexible model for the representation of a heterogeneous network allows the user to define a prioritization problem involving an arbitrary number of entity types and their interconnections. Furthermore, ProphTools provides functionality to perform cross-validation tests, allowing users to select the best network configuration for a given problem. ProphTools core prioritization methodology has already been proven effective in gene-disease prioritization and drug repositioning. Here we make ProphTools available to the scientific community as flexible, open-source software and perform a new proof-of-concept case study on long noncoding RNAs (lncRNAs) to disease prioritization. ProphTools is robust prioritization software that provides the flexibility not present in other state-of-the-art network analysis approaches, enabling researchers to perform prioritization tasks on any user-defined heterogeneous network. Furthermore, the application to lncRNA-disease prioritization shows that ProphTools can reach the performance levels of ad hoc prioritization tools without losing its generality. © The Authors 2017. Published by Oxford University Press.
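
    The sketch below implements a plain random walk with restart on a small single-type network, the core propagation step that methods of this family build on; the adjacency matrix is a toy example and the code does not handle the heterogeneous multi-network case ProphTools addresses.

        import numpy as np

        def random_walk_with_restart(adj, seeds, restart=0.3, tol=1e-8, max_iter=1000):
            """Prioritize nodes by propagating from seed nodes over a (single-type) network.

            adj: symmetric adjacency matrix; seeds: indices of query nodes.
            """
            # Column-normalize so each step redistributes a node's score to its neighbors.
            col_sums = adj.sum(axis=0)
            W = adj / np.where(col_sums == 0, 1, col_sums)
            p0 = np.zeros(adj.shape[0])
            p0[seeds] = 1.0 / len(seeds)
            p = p0.copy()
            for _ in range(max_iter):
                p_next = (1 - restart) * W @ p + restart * p0
                if np.abs(p_next - p).sum() < tol:
                    break
                p = p_next
            return p

        # Tiny chain network 0-1-2-3: scores decay with distance from the seed node 0.
        adj = np.array([[0, 1, 0, 0],
                        [1, 0, 1, 0],
                        [0, 1, 0, 1],
                        [0, 0, 1, 0]], dtype=float)
        print(random_walk_with_restart(adj, seeds=[0]))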

  17. Student Task Analysis for the Development of E-Learning Lectural System in Basic Chemistry Courses in FKIP UMMY Solok

    NASA Astrophysics Data System (ADS)

    Afrahamiryano, A.; Ariani, D.

    2018-04-01

    Student task analysis is one part of the define stage in development research using the 4-D development model. This analysis is useful for determining students' level of understanding of the lecture materials that have been given. The results of the task analysis serve as a measure of the success of learning and as a basis for developing the lecture system. The analysis is carried out through observation and a documentation study of the tasks undertaken by students. The results are then described, and triangulation is performed to draw conclusions. The analysis indicates that the students' level of understanding is high for theoretical material and low for material involving calculation. Based on these results, it can be concluded that the e-learning lecture system being developed should improve students' understanding of basic chemistry material that involves calculation.

  18. Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia

    2006-01-01

    The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. The NPSS will provide improved tools to develop custom components and to use capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline system-level simulation tool for the full development life cycle.

  19. Vibration reduction of pneumatic percussive rivet tools: mechanical and ergonomic re-design approaches.

    PubMed

    Cherng, John G; Eksioglu, Mahmut; Kizilaslan, Kemal

    2009-03-01

    This paper presents a systematic design approach, which is the result of years of research effort, to the ergonomic re-design of rivet tools, i.e. rivet hammers and bucking bars. The investigation was carried out using both an ergonomic approach and a mechanical analysis of the rivet tools' dynamic behavior. The optimal mechanical design parameters of the re-designed rivet tools were determined by the Taguchi method. Two ergonomically re-designed rivet tools with vibration damping/isolation mechanisms were tested against two conventional rivet tools in both laboratory and field tests. Vibration characteristics of both types of tools were measured by laboratory tests using a custom-made test fixture. The subjective field evaluations of the tools were performed by six experienced riveters at an aircraft repair shop. Results indicate that the isolation spring and polymer damper are very effective in reducing the overall level of vibration under both unweighted and weighted acceleration conditions. The mass of the dolly head and the housing played a significant role in the vibration absorption of the bucking bars. Another important result was that ductile iron has better vibration-reducing capability than steel and aluminum for bucking bars. Mathematical simulation results were also consistent with the experimental results. The overall conclusion of the study was that by applying the design principles of ergonomics and by adding vibration damping/isolation mechanisms to the rivet tools, the vibration level can be significantly reduced and the tools become safer and more user friendly. The details of the lessons learned, design modifications, test methods, mathematical models and the results are included in the paper.

  20. Designing Real-time Decision Support for Trauma Resuscitations

    PubMed Central

    Yadav, Kabir; Chamberlain, James M.; Lewis, Vicki R.; Abts, Natalie; Chawla, Shawn; Hernandez, Angie; Johnson, Justin; Tuveson, Genevieve; Burd, Randall S.

    2016-01-01

    Background Use of electronic clinical decision support (eCDS) has been recommended to improve implementation of clinical decision rules. Many eCDS tools, however, are designed and implemented without taking into account the context in which clinical work is performed. Implementation of the pediatric traumatic brain injury (TBI) clinical decision rule at one Level I pediatric emergency department includes an electronic questionnaire triggered when ordering a head computed tomography using computerized physician order entry (CPOE). Providers use this CPOE tool in less than 20% of trauma resuscitation cases. A human factors engineering approach could identify the implementation barriers that are limiting the use of this tool. Objectives The objective was to design a pediatric TBI eCDS tool for trauma resuscitation using a human factors approach. The hypothesis was that clinical experts will rate a usability-enhanced eCDS tool better than the existing CPOE tool for user interface design and suitability for clinical use. Methods This mixed-methods study followed usability evaluation principles. Pediatric emergency physicians were surveyed to identify barriers to using the existing eCDS tool. Using standard trauma resuscitation protocols, a hierarchical task analysis of pediatric TBI evaluation was developed. Five clinical experts, all board-certified pediatric emergency medicine faculty members, then iteratively modified the hierarchical task analysis until reaching consensus. The software team developed a prototype eCDS display using the hierarchical task analysis. Three human factors engineers provided feedback on the prototype through a heuristic evaluation, and the software team refined the eCDS tool using a rapid prototyping process. The eCDS tool then underwent iterative usability evaluations by the five clinical experts using video review of 50 trauma resuscitation cases. A final eCDS tool was created based on their feedback, with content analysis of the evaluations performed to ensure all concerns were identified and addressed. Results Among 26 EPs (76% response rate), the main barriers to using the existing tool were that the information displayed is redundant and does not fit clinical workflow. After the prototype eCDS tool was developed based on the trauma resuscitation hierarchical task analysis, the human factors engineers rated it to be better than the CPOE tool for nine of 10 standard user interface design heuristics on a three-point scale. The eCDS tool was also rated better for clinical use on the same scale, in 84% of 50 expert–video pairs, and was rated equivalent in the remainder. Clinical experts also rated barriers to use of the eCDS tool as being low. Conclusions An eCDS tool for diagnostic imaging designed using human factors engineering methods has improved perceived usability among pediatric emergency physicians. PMID:26300010

  1. Specification and Error Pattern Based Program Monitoring

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Johnson, Scott; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2001-01-01

    We briefly present Java PathExplorer (JPaX), a tool developed at NASA Ames for monitoring the execution of Java programs. JPaX can be used not only during program testing to reveal subtle errors, but also during operation to survey safety-critical systems. The tool facilitates automated instrumentation of a program in order to properly observe its execution. The instrumentation can be at either the bytecode level or the source level when the source code is available. JPaX is an instance of a more general project, called PathExplorer (PAX), which is a basis for experiments rather than a fixed system, capable of monitoring various programming languages and experimenting with other logics and analysis techniques.
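
    As a toy illustration of trace-based monitoring in the spirit of such tools, the sketch below checks a simple lock-discipline safety property over an observed event trace; the property, event format, and trace are invented, and this is not JPaX code (which targets Java programs).

        # Toy runtime monitor: check that every acquire is eventually released and
        # that no lock is released without a matching acquire, over an event trace.
        def monitor(trace):
            held = set()
            for step, (event, lock) in enumerate(trace):
                if event == "acquire":
                    held.add(lock)
                elif event == "release":
                    if lock not in held:
                        return f"violation at step {step}: release of unheld lock {lock!r}"
                    held.remove(lock)
            if held:
                return f"violation at end of trace: locks never released: {sorted(held)}"
            return "property satisfied"

        # Hypothetical trace emitted by an instrumented program.
        trace = [("acquire", "A"), ("acquire", "B"), ("release", "B"), ("release", "A")]
        print(monitor(trace))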

  2. Bridging the gap between individual-level risk for HIV and structural determinants: using root cause analysis in strategic planning.

    PubMed

    Willard, Nancy; Chutuape, Kate; Stines, Stephanie; Ellen, Jonathan M

    2012-01-01

    HIV prevention efforts have expanded beyond individual-level interventions to address structural determinants of risk. Coalitions have been an important vehicle for addressing similar intractable and deeply rooted health-related issues. A root cause analysis process may aid coalitions in identifying fundamental, structural-level contributors to risk and in identifying appropriate solutions. For this article, strategic plans for 13 coalitions were analyzed both before and after a root cause analysis approach was applied to determine the coalitions' strategic plans potential impact and comprehensiveness. After root cause analysis, strategic plans trended toward targeting policies and practices rather than on single agency programmatic changes. Plans expanded to target multiple sectors and several changes within sectors to penetrate deeply into a sector or system. Findings suggest that root cause analysis may be a viable tool to assist coalitions in identifying structural determinants and possible solutions for HIV risk.

  3. Gender Mainstreaming in Education at the Level of Field Operations: The Case of CARE USA's Indicator Framework

    ERIC Educational Resources Information Center

    Miske, Shirley; Meagher, Margaret; DeJaeghere, Joan

    2010-01-01

    Following the adoption of gender mainstreaming at the Beijing Conference for Women in 1995 as a major strategy to promote gender equality and the recognition of gender analysis as central to this process, Gender and Development (GAD) frameworks have provided tools for gender analysis in various sectors. Gender mainstreaming in basic education has…

  4. "I Have a Love-Hate Relationship with ATLAS.ti"™: Integrating Qualitative Data Analysis Software into a Graduate Research Methods Course

    ERIC Educational Resources Information Center

    Paulus, Trena M.; Bennett, Ann M.

    2017-01-01

    While research on teaching qualitative methods in education has increased, few studies explore teaching qualitative data analysis software within graduate-level methods courses. During 2013, we required students in several such courses to use ATLAS.ti™ as a project management tool for their assignments. By supporting students' early experiences…

  5. Examining Students' Reflective Thinking from Keywords Tagged to Blogs: Using Map Analysis as a Content Analysis Method

    ERIC Educational Resources Information Center

    Xie, Ying; Sharma, Priya

    2013-01-01

    Reflective learning refers to a learner's purposeful and conscious manipulation of ideas toward meaningful learning. Blogs have been used to support reflective thinking, but the commonly seen blog software usually does not provide overt mechanisms for students' high-level reflections. A new tool was designed to support the reflective…

  6. Analyzing Human-Landscape Interactions: Tools That Integrate

    NASA Astrophysics Data System (ADS)

    Zvoleff, Alex; An, Li

    2014-01-01

    Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature—in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool, and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for analysts to make the broadest possible array of information available to researchers, environmental managers, and policymakers. Further development of new approaches of data fusion and integration across sites or disciplines pose an important challenge for future work in integrating human and landscape components.

  7. A survey of social media data analysis for physical activity surveillance.

    PubMed

    Liu, Sam; Young, Sean D

    2018-07-01

    Social media data can provide valuable information regarding people's behaviors and health outcomes. Previous studies have shown that social media data can be extracted to monitor and predict infectious disease outbreaks. These same approaches can be applied to other fields including physical activity research and forensic science. Social media data have the potential to provide real-time monitoring and prediction of physical activity level in a given region. This tool can be valuable to public health organizations as it can overcome the time lag in the reporting of physical activity epidemiology data faced by traditional research methods (e.g. surveys, observational studies). As a result, this tool could help public health organizations better mobilize and target physical activity interventions. The first part of this paper aims to describe current approaches (e.g. topic modeling, sentiment analysis and social network analysis) that could be used to analyze social media data to provide real-time monitoring of physical activity level. The second part discusses ways to apply social media analysis to other fields such as forensic sciences and provides recommendations to further social media research. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  8. Exploring Valid Reference Genes for Quantitative Real-time PCR Analysis in Plutella xylostella (Lepidoptera: Plutellidae)

    PubMed Central

    Fu, Wei; Xie, Wen; Zhang, Zhuo; Wang, Shaoli; Wu, Qingjun; Liu, Yong; Zhou, Xiaomao; Zhou, Xuguo; Zhang, Youjun

    2013-01-01

    Quantitative real-time PCR (qRT-PCR), a primary tool in gene expression analysis, requires an appropriate normalization strategy to control for variation among samples. The best option is to compare the mRNA level of a target gene with that of reference gene(s) whose expression level is stable across various experimental conditions. In this study, expression profiles of eight candidate reference genes from the diamondback moth, Plutella xylostella, were evaluated under diverse experimental conditions. RefFinder, a web-based analysis tool that integrates four major computational programs (geNorm, NormFinder, BestKeeper, and the comparative ΔCt method), was used to comprehensively rank the tested candidate genes. Elongation factor 1 (EF1) was the most suited reference gene for the biotic factors (development stage, tissue, and strain). In contrast, although appropriate reference gene(s) do exist for several abiotic factors (temperature, photoperiod, insecticide, and mechanical injury), we were not able to identify a single universal reference gene. Nevertheless, a suite of candidate reference genes were specifically recommended for selected experimental conditions. Our finding is the first step toward establishing a standardized qRT-PCR analysis of this agriculturally important insect pest. PMID:23983612
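
    As an illustration of the comparative ΔCt method that RefFinder draws on, the short Python sketch below ranks candidate reference genes by the mean standard deviation of their pairwise ΔCt values (lower means more stable). The gene names and Ct values are invented for illustration and are not data from this study.

      # Minimal sketch of the comparative delta-Ct stability ranking (one of the
      # four approaches RefFinder integrates); the Ct values below are hypothetical.
      import numpy as np

      # candidate reference genes -> Ct values across samples (invented numbers)
      ct = {
          "EF1":   np.array([18.1, 18.3, 18.0, 18.2, 18.4]),
          "ACTB":  np.array([18.0, 19.1, 17.6, 18.8, 19.5]),
          "GAPDH": np.array([20.2, 21.0, 19.8, 20.9, 21.6]),
      }

      stability = {}
      for gene in ct:
          # standard deviation of delta-Ct against every other candidate gene
          sds = [np.std(ct[gene] - ct[other], ddof=1)
                 for other in ct if other != gene]
          stability[gene] = np.mean(sds)   # lower mean SD = more stable expression

      for gene, score in sorted(stability.items(), key=lambda kv: kv[1]):
          print(f"{gene}: mean pairwise delta-Ct SD = {score:.3f}")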

  9. Vibration syndrome in chipping and grinding workers.

    PubMed

    1984-10-01

    A clear conclusion from these studies is that vibration syndrome occurs in chipping and grinding workers in this country and that earlier reports that it may not exist were probably inaccurate. The careful selection of exposed and control groups for analysis strengthens the observed association between vibration syndrome and the occupational use of pneumatic chipping hammers and grinding tools. In the foundry populations studied the vibration syndrome was severe, with short latencies and high prevalences of the advanced stages. The shipyard population did not display this pattern. This difference can be attributed to variations in work practices but the more important factor seems to be the effect of incentive work schedules. Comparisons of groups of hourly and incentive workers from the shipyard and within foundry populations consistently demonstrated that incentive work was associated with increased severity of vibration syndrome. Excessive vibration levels were measured on chipping and grinding tools. Of the factors studied, reduction of throttle level decreased the vibration levels measured on chipping hammers. For grinders, the working condition of the tool affected the measured vibration acceleration levels. Grinders receiving average to poor maintenance showed higher vibration levels. The results of objective clinical testing did not yield tests with diagnostic properties. To date, the clinical judgment of the physician remains the primary focus of the diagnosis of vibration syndrome. A number of actions can be taken to prevent vibration syndrome. Preplacement medical examinations can identify workers predisposed to or experiencing Raynaud's phenomenon or disease. Informing employees and employers about the signs, symptoms, and consequences of vibration syndrome can encourage workers to report the condition to their physicians promptly. Engineering approaches to preventing vibration syndrome include increased quality control on castings to reduce finishing time and automation of the finishing process. Tool manufacturers can contribute by modifying or redesigning tools to reduce vibration. The technology to reduce vibration from hand tools exists but the engineering application is difficult. Vibration from chain saws has been reduced through changes in design and some companies have begun to redesign jackhammers, scalers, grinders, and chipping hammers. As these become available, purchasers can encourage manufacturers by selecting tools with antivibration characteristics. Vibration from tools currently in use can be controlled by periodically scheduled inspection and maintenance programs for vibrating tools.(ABSTRACT TRUNCATED AT 400 WORDS)

  10. The JASMIN Analysis Platform - bridging the gap between traditional climate data practices and data-centric analysis paradigms

    NASA Astrophysics Data System (ADS)

    Pascoe, Stephen; Iwi, Alan; Kershaw, Philip; Stephens, Ag; Lawrence, Bryan

    2014-05-01

    The advent of large-scale data and the consequential analysis problems have led to two new challenges for the research community: how to share such data to get the maximum value and how to carry out efficient analysis. Solving both challenges requires a form of parallelisation: the first is social parallelisation (involving trust and information sharing), the second data parallelisation (involving new algorithms and tools). The JASMIN infrastructure supports both kinds of parallelism by providing a multi-tenant environment with petabyte-scale storage, VM provisioning and batch cluster facilities. The JASMIN Analysis Platform (JAP) is an analysis software layer for JASMIN which emphasises ease of transition from a researcher's local environment to JASMIN. JAP brings together tools traditionally used by multiple communities and configures them to work together, enabling users to move analysis from their local environment to JASMIN without rewriting code. JAP also provides facilities to exploit JASMIN's parallel capabilities whilst maintaining a familiar analysis environment wherever possible. Modern open-source analysis tools typically have multiple dependent packages, increasing the installation burden on system administrators. For a suite of tools, often with both common and conflicting dependencies, analysis pipelines can become locked to a particular installation simply because of the effort required to reconstruct the dependency tree. JAP addresses this problem by providing a consistent suite of RPMs compatible with Red Hat Enterprise Linux and CentOS 6.4. Researchers can install JAP locally, either as RPMs or through a pre-built VM image, giving them confidence that moving analysis to JASMIN will not disrupt their environment. Analysis parallelisation is in its infancy in the climate sciences, with few tools capable of exploiting any parallel environment beyond manual scripting of the use of multiple processors. JAP begins to bridge this gap through a variety of higher-level tools for parallelisation and job scheduling, such as IPython-parallel and MPI support for interactive analysis languages. We find that enabling even simple parallelisation of workflows, together with the state-of-the-art I/O performance of JASMIN storage, provides many users with the large increases in efficiency they need to scale their analyses to contemporary data volumes and tackle new, previously inaccessible problems.
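
    For readers unfamiliar with the kind of simple workflow parallelisation mentioned above, the sketch below distributes a per-file analysis over an IPython-parallel (ipyparallel) cluster. It assumes a cluster has already been started (for example with "ipcluster start -n 8"), and the file pattern and analyse_file function are hypothetical placeholders rather than JAP components.

      # Illustrative sketch only: distributing a per-file analysis over an
      # ipyparallel cluster. The dataset path and analysis function are made up.
      import glob
      import ipyparallel as ipp

      def analyse_file(path):
          # stand-in for a real per-file computation (e.g. a simple summary)
          import os
          return path, os.path.getsize(path)

      rc = ipp.Client()                    # connect to the running cluster
      view = rc.load_balanced_view()       # schedule tasks on whichever engines are free

      files = sorted(glob.glob("/path/to/data/*.nc"))   # hypothetical dataset
      results = view.map_sync(analyse_file, files)      # parallel map over files

      for path, size in results:
          print(path, size)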

  11. Query2Question: Translating Visualization Interaction into Natural Language.

    PubMed

    Nafari, Maryam; Weaver, Chris

    2015-06-01

    Richly interactive visualization tools are increasingly popular for data exploration and analysis in a wide variety of domains. Existing systems and techniques for recording provenance of interaction focus either on comprehensive automated recording of low-level interaction events or on idiosyncratic manual transcription of high-level analysis activities. In this paper, we present the architecture and translation design of a query-to-question (Q2Q) system that automatically records user interactions and presents them semantically using natural language (written English). Q2Q takes advantage of domain knowledge and uses natural language generation (NLG) techniques to translate and transcribe a progression of interactive visualization states into a visual log of styled text that complements and effectively extends the functionality of visualization tools. We present Q2Q as a means to support a cross-examination process in which questions rather than interactions are the focus of analytic reasoning and action. We describe the architecture and implementation of the Q2Q system, discuss key design factors and variations that affect question generation, and present several visualizations that incorporate Q2Q for analysis in a variety of knowledge domains.

  12. Next generation data systems and knowledge products to support agricultural producers and science-based policy decision making.

    PubMed

    Capalbo, Susan M; Antle, John M; Seavert, Clark

    2017-07-01

    Research on next generation agricultural systems models shows that the most important current limitation is data, both for on-farm decision support and for research investment and policy decision making. One of the greatest data challenges is to obtain reliable data on farm management decision making, both for current conditions and under scenarios of changed bio-physical and socio-economic conditions. This paper presents a framework for the use of farm-level and landscape-scale models and data to provide analysis that could be used in NextGen knowledge products, such as mobile applications or personal computer data analysis and visualization software. We describe two analytical tools - AgBiz Logic and TOA-MD - that demonstrate the current capability of farm-level and landscape-scale models. The use of these tools is explored with a case study of an oilseed crop, Camelina sativa, which could be used to produce jet aviation fuel. We conclude with a discussion of innovations needed to facilitate the use of farm and policy-level models to generate data and analysis for improved knowledge products.

  13. Portfolio Analysis Tool

    NASA Technical Reports Server (NTRS)

    Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug

    2005-01-01

    Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers to make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers can use in analyzing the costs and benefits of all projects, but they cannot modify the criteria for analyzing projects: access for modifying criteria is limited to managers according to levels of administrative privilege. PAT affords flexibility for modifying criteria for particular "focus areas" so as to enable standardization of criteria among similar projects, thereby making it possible to improve assessments without the need to rewrite computer code or to rehire experts, and thereby further reducing the cost of maintaining and upgrading computer code. Information in the PAT database and results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.

  14. Extending the XNAT archive tool for image and analysis management in ophthalmology research

    NASA Astrophysics Data System (ADS)

    Wahle, Andreas; Lee, Kyungmoo; Harding, Adam T.; Garvin, Mona K.; Niemeijer, Meindert; Sonka, Milan; Abràmoff, Michael D.

    2013-03-01

    In ophthalmology, various modalities and tests are utilized to obtain vital information on the eye's structure and function. For example, optical coherence tomography (OCT) is utilized to diagnose, screen, and aid treatment of eye diseases like macular degeneration or glaucoma. Such data are complemented by photographic retinal fundus images and functional tests on the visual field. DICOM isn't widely used yet, though, and frequently images are encoded in proprietary formats. The eXtensible Neuroimaging Archive Tool (XNAT) is an open-source NIH-funded framework for research PACS and is in use at the University of Iowa for neurological research applications. Its use for ophthalmology was hence desirable but posed new challenges due to data types thus far not considered and the lack of standardized formats. We developed custom tools for data types not natively recognized by XNAT itself using XNAT's low-level REST API. Vendor-provided tools can be included as necessary to convert proprietary data sets into valid DICOM. Clients can access the data in a standardized format while still retaining the original format if needed by specific analysis tools. With respective project-specific permissions, results like segmentations or quantitative evaluations can be stored as additional resources to previously uploaded datasets. Applications can use our abstract-level Python or C/C++ API to communicate with the XNAT instance. This paper describes concepts and details of the designed upload script templates, which can be customized to the needs of specific projects, and the novel client-side communication API which allows integration into new or existing research applications.
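
    As a concrete illustration of the kind of REST-based upload the custom tools perform, the Python sketch below pushes a derived result file to an XNAT resource using the requests library. The server URL, credentials, session identifier, resource label and file name are hypothetical, and the endpoint paths are only illustrative of the general /data/... pattern; they should be checked against the XNAT REST API documentation before use.

      # Hypothetical sketch of uploading an analysis result to an XNAT resource
      # via its REST API. Server URL, credentials and identifiers are placeholders;
      # verify the exact endpoint paths against the XNAT REST documentation.
      import requests

      XNAT = "https://xnat.example.org"        # hypothetical server
      AUTH = ("username", "password")          # prefer a token/alias in practice

      session_id = "OCT_SESSION_001"           # hypothetical imaging session
      resource = "SEGMENTATION"                # label for derived results
      local_file = "layer_thickness.csv"

      with requests.Session() as s:
          s.auth = AUTH
          # create the resource container on the session (idempotent PUT)
          s.put(f"{XNAT}/data/experiments/{session_id}/resources/{resource}")
          # upload the derived file into that resource
          with open(local_file, "rb") as fh:
              r = s.put(
                  f"{XNAT}/data/experiments/{session_id}/resources/{resource}"
                  f"/files/{local_file}",
                  data=fh,
              )
          r.raise_for_status()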

  15. SigTree: A Microbial Community Analysis Tool to Identify and Visualize Significantly Responsive Branches in a Phylogenetic Tree.

    PubMed

    Stevens, John R; Jones, Todd R; Lefevre, Michael; Ganesan, Balasubramanian; Weimer, Bart C

    2017-01-01

    Microbial community analysis experiments to assess the effect of a treatment intervention (or environmental change) on the relative abundance levels of multiple related microbial species (or operational taxonomic units) simultaneously using high throughput genomics are becoming increasingly common. Within the framework of the evolutionary phylogeny of all species considered in the experiment, this translates to a statistical need to identify the phylogenetic branches that exhibit a significant consensus response (in terms of operational taxonomic unit abundance) to the intervention. We present the R software package SigTree, a collection of flexible tools that make use of meta-analysis methods and regular expressions to identify and visualize significantly responsive branches in a phylogenetic tree, while appropriately adjusting for multiple comparisons.

  16. Optimization of Maghemite (γ-Fe2O3) Nano-Powder Mixed micro-EDM of CoCrMo with Multiple Responses Using Gray Relational Analysis (GRA)

    NASA Astrophysics Data System (ADS)

    Mejid Elsiti, Nagwa; Noordin, M. Y.; Idris, Ani; Saed Majeed, Faraj

    2017-10-01

    This paper presents an optimization of process parameters of the micro-electrical discharge machining (micro-EDM) process with (γ-Fe2O3) nano-powder mixed dielectric, using the multi-response Grey Relational Analysis (GRA) method instead of single-response optimization. These parameters were optimized based on a 2-level factorial design combined with Grey Relational Analysis. The machining parameters peak current, gap voltage, and pulse on time were chosen for experimentation. The performance characteristics chosen for this study are material removal rate (MRR), tool wear rate (TWR), taper, and overcut. Experiments were conducted using electrolytic copper as the tool and CoCrMo as the workpiece. The experimental results were improved through this approach.
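
    To make the GRA procedure concrete, the Python sketch below normalises several responses, computes grey relational coefficients against the ideal sequence, and averages them into a grade per experimental run. The response values, the larger-/smaller-the-better choices and the distinguishing coefficient of 0.5 are illustrative assumptions, not the paper's data.

      # Sketch of Grey Relational Analysis for multiple responses; the response
      # values below are invented for illustration, not the paper's measurements.
      import numpy as np

      # rows = experimental runs; columns = MRR, TWR, taper, overcut
      responses = np.array([
          [0.80, 0.12, 0.030, 0.045],
          [1.10, 0.20, 0.025, 0.050],
          [0.95, 0.15, 0.040, 0.035],
      ])
      larger_is_better = [True, False, False, False]   # MRR maximised, rest minimised
      zeta = 0.5                                       # distinguishing coefficient

      # 1) normalise each response to [0, 1]
      norm = np.empty_like(responses, dtype=float)
      for j, larger in enumerate(larger_is_better):
          col = responses[:, j]
          if larger:
              norm[:, j] = (col - col.min()) / (col.max() - col.min())
          else:
              norm[:, j] = (col.max() - col) / (col.max() - col.min())

      # 2) grey relational coefficients against the ideal sequence (all ones)
      delta = np.abs(1.0 - norm)
      coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

      # 3) grey relational grade = mean coefficient per run; highest grade wins
      grade = coeff.mean(axis=1)
      print("grades:", np.round(grade, 3), "best run:", int(grade.argmax()) + 1)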

  17. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
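
    The following toy Monte Carlo sketch shows, in Python, how an OR-gate fault tree combines uncertain basic-event probabilities into a top-event probability of the kind reported here (no water delivered, or water delivered out of specification). The event structure and the beta distributions expressing uncertainty are invented for illustration and are not the study's model or data.

      # Toy Monte Carlo sketch of an OR-gate fault tree with two top branches
      # (quantity failure, quality failure); all probabilities are invented.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000

      # uncertain failure probabilities of basic events (beta distributions)
      p_source  = rng.beta(2, 50, n)    # raw-water source unavailable
      p_treat   = rng.beta(3, 40, n)    # treatment failure -> quality failure
      p_network = rng.beta(2, 60, n)    # distribution failure -> quantity failure

      # intermediate events (OR gates: 1 - product of complements)
      p_quantity = 1 - (1 - p_source) * (1 - p_network)
      p_quality  = p_treat

      # top event: water not delivered OR delivered out of specification
      p_top = 1 - (1 - p_quantity) * (1 - p_quality)

      print("mean P(top event) =", p_top.mean())
      print("90% interval      =", np.percentile(p_top, [5, 95]))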

  18. Test-Analysis Correlation for Space Shuttle External Tank Foam Impacting RCC Wing Leading Edge Component Panels

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.

    2008-01-01

    The Space Shuttle Columbia Accident Investigation Board recommended that NASA develop, validate, and maintain a modeling tool capable of predicting the damage threshold for debris impacts on the Space Shuttle Reinforced Carbon-Carbon (RCC) wing leading edge and nosecap assembly. The results presented in this paper are one part of a multi-level approach that supported the development of the predictive tool used to recertify the shuttle for flight following the Columbia Accident. The assessment of predictive capability was largely based on test analysis comparisons for simpler component structures. This paper provides comparisons of finite element simulations with test data for external tank foam debris impacts onto 6-in. square RCC flat panels. Both quantitative displacement and qualitative damage assessment correlations are provided. The comparisons show good agreement and provided the Space Shuttle Program with confidence in the predictive tool.

  19. Metabolomics combined with chemometric tools (PCA, HCA, PLS-DA and SVM) for screening cassava (Manihot esculenta Crantz) roots during postharvest physiological deterioration.

    PubMed

    Uarrota, Virgílio Gavicho; Moresco, Rodolfo; Coelho, Bianca; Nunes, Eduardo da Costa; Peruch, Luiz Augusto Martins; Neubert, Enilto de Oliveira; Rocha, Miguel; Maraschin, Marcelo

    2014-10-15

    Cassava roots are an important source of dietary and industrial carbohydrates and suffer markedly from postharvest physiological deterioration (PPD). This paper deals with metabolomics combined with chemometric tools for screening the chemical and enzymatic composition in several genotypes of cassava roots during PPD. Metabolome analyses showed increases in carotenoids, flavonoids, anthocyanins, phenolics, reactive scavenging species, and enzymes (superoxide dismutase family, hydrogen peroxide, and catalase) until 3-5 days postharvest. PPD correlated negatively with phenolics and carotenoids and positively with anthocyanins and flavonoids. Chemometric tools such as principal component analysis, partial least squares discriminant analysis, and support vector machines discriminated cassava samples well and enabled good prediction of the samples. Hierarchical clustering analyses grouped samples according to their levels of PPD and chemical compositions. Copyright © 2014 Elsevier Ltd. All rights reserved.
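
    The scikit-learn sketch below illustrates the supervised part of such a chemometric workflow (standardisation, PCA for dimension reduction, and an SVM classifier with cross-validation). The data matrix and PPD class labels are synthetic stand-ins, not the cassava measurements.

      # Sketch of the PCA/SVM portion of a chemometric screening workflow;
      # the feature matrix is synthetic, not the cassava metabolite data.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 20))        # 60 samples x 20 metabolite features
      y = np.repeat([0, 1, 2], 20)         # three PPD levels (synthetic labels)
      X[y == 1] += 0.8                     # inject some class structure
      X[y == 2] += 1.6

      model = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="rbf"))
      scores = cross_val_score(model, X, y, cv=5)
      print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))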

  20. Socio-economic inequity in demand for insecticide-treated nets, in-door residual house spraying, larviciding and fogging in Sudan

    PubMed Central

    Onwujekwe, Obinna; Malik, El-Fatih Mohamed; Mustafa, Sara Hassan; Mnzava, Abraham

    2005-01-01

    Background In order to optimally prioritize and use public and private budgets for equitable malaria vector control, there is a need to determine the level and determinants of consumer demand for different vector control tools. Objectives To determine the demand from people of different socio-economic groups for indoor residual house-spraying (IRHS), insecticide-treated nets (ITNs), larviciding with chemicals (LWC), and space spraying/fogging (SS) and the disease control implications of the result. Methods Ratings and levels of willingness-to-pay (WTP) for the vector control tools were determined using a random cross-sectional sample of 720 households drawn from two states. WTP was elicited using the bidding game. An asset-based socio-economic status (SES) index was used to explore whether WTP was related to SES of the respondents. Results IRHS received the highest proportion of highest preferred rating (41.0%) followed by ITNs (23.1%). However, ITNs had the highest mean WTP followed by IRHS, while LWC had the least. The regression analysis showed that SES was positively and statistically significantly related to WTP across the four vector control tools and that the respondents' rating of IRHS and ITNs significantly explained their levels of WTP for the two tools. Conclusion People were willing to pay for all the vector-control tools, but the demand for the vector control tools was related to the SES of the respondents. Hence, it is vital that there are public policies and financing mechanisms to ensure equitable provision and utilisation of vector control tools, as well as protecting the poor from cost-sharing arrangements. PMID:16356177

  1. Reducing maintenance costs in agreement with CNC machine tools reliability

    NASA Astrophysics Data System (ADS)

    Ungureanu, A. L.; Stan, G.; Butunoi, P. A.

    2016-08-01

    Aligning maintenance strategy with reliability is a challenge due to the need to find an optimal balance between them. Because the various methods described in the relevant literature involve laborious calculations or use of software that can be costly, this paper proposes a method that is easier to implement on CNC machine tools. The new method, called the Consequence of Failure Analysis (CFA) is based on technical and economic optimization, aimed at obtaining a level of required performance with minimum investment and maintenance costs.

  2. Integrated modeling environment for systems-level performance analysis of the Next-Generation Space Telescope

    NASA Astrophysics Data System (ADS)

    Mosier, Gary E.; Femiano, Michael; Ha, Kong; Bely, Pierre Y.; Burg, Richard; Redding, David C.; Kissil, Andrew; Rakoczy, John; Craig, Larry

    1998-08-01

    All current concepts for the NGST are innovative designs which present unique systems-level challenges. The goals are to outperform existing observatories at a fraction of the current price/performance ratio. Standard practices for developing systems error budgets, such as the 'root-sum-of-squares' error tree, are insufficient for designs of this complexity. Simulation and optimization are the tools needed for this project; in particular tools that integrate controls, optics, thermal and structural analysis, and design optimization. This paper describes such an environment which allows sub-system performance specifications to be analyzed parametrically, and includes optimizing metrics that capture the science requirements. The resulting systems-level design trades are greatly facilitated, and significant cost savings can be realized. This modeling environment, built around a tightly integrated combination of commercial off-the-shelf and in-house-developed codes, provides the foundation for linear and non-linear analysis in both the time and frequency domains, statistical analysis, and design optimization. It features an interactive user interface and integrated graphics that allow highly-effective, real-time work to be done by multidisciplinary design teams. For the NGST, it has been applied to issues such as pointing control, dynamic isolation of spacecraft disturbances, wavefront sensing and control, on-orbit thermal stability of the optics, and development of systems-level error budgets. In this paper, results are presented from parametric trade studies that assess requirements for pointing control, structural dynamics, reaction wheel dynamic disturbances, and vibration isolation. These studies attempt to define requirements bounds such that the resulting design is optimized at the systems level, without attempting to optimize each subsystem individually. The performance metrics are defined in terms of image quality, specifically centroiding error and RMS wavefront error, which directly link to science requirements.
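
    For context, the 'root-sum-of-squares' error tree that the abstract calls insufficient simply combines independent error contributors as the square root of the sum of their squares; the short Python example below shows the arithmetic with invented wavefront-error terms. The point of the integrated modeling environment is precisely that coupled controls, optics, thermal and structural effects are not captured by such an independent-terms budget.

      # What a 'root-sum-of-squares' error budget computes: independent error terms
      # combined as the square root of the sum of squares. Values are invented.
      import math

      wavefront_terms_nm = {     # hypothetical contributors to RMS wavefront error
          "figure_error": 40.0,
          "thermal_drift": 25.0,
          "alignment": 20.0,
          "jitter": 15.0,
      }
      rss = math.sqrt(sum(v**2 for v in wavefront_terms_nm.values()))
      print(f"RSS wavefront error budget: {rss:.1f} nm")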

  3. Use of the SONET score to evaluate Urgent Care Center overcrowding: a prospective pilot study

    PubMed Central

    Wang, Hao; Robinson, Richard D; Cowden, Chad D; Gorman, Violet A; Cook, Christopher D; Gicheru, Eugene K; Schrader, Chet D; Jayswal, Rani D; Zenarosa, Nestor R

    2015-01-01

    Objectives To derive a tool to determine Urgent Care Center (UCC) crowding and investigate the association between different levels of UCC overcrowding and negative patient care outcomes. Design Prospective pilot study. Setting Single centre study in the USA. Participants 3565 patients who registered at UCC during the 21-day study period were included. Patients who had no overcrowding statuses estimated due to incomplete collection of operational variables at the time of registration were excluded from this study. 3139 patients were enrolled in the final data analysis. Primary and secondary outcome measures A crowding estimation tool (SONET: Severely overcrowded, Overcrowded and Not overcrowded Estimation Tool) was derived using linear regression analysis. The average length of stay (LOS) in UCC patients and the number of left without being seen (LWBS) patients were calculated and compared under the three different levels of UCC crowding. Results Four independent operational variables affected the UCC overcrowding score: the total number of patients, the number of results pending for patients, the number of patients in the waiting room, and the longest time a patient remained in the waiting room. In addition, UCC overcrowding was associated with longer average LOS (not overcrowded: 133±76 min, overcrowded: 169±79 min, and severely overcrowded: 196±87 min, p<0.001) and an increased number of LWBS patients (not overcrowded: 0.28±0.69 patients, overcrowded: 0.64±0.98, and severely overcrowded: 1.00±0.97). Conclusions The overcrowding estimation tool (SONET) derived in this study might be used to determine different levels of crowding in a high volume UCC setting. It also showed that UCC overcrowding might be associated with negative patient care outcomes. PMID:25872940
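
    The sketch below shows how a regression-derived crowding score of this kind is applied in practice: a weighted sum of the four operational variables is compared against cut-offs for the three crowding levels. The coefficients, intercept and thresholds are invented placeholders and are not the published SONET weights.

      # Illustration only: applying a regression-derived crowding score.
      # Coefficients, intercept and cut-offs are invented placeholders,
      # NOT the published SONET weights.
      def crowding_score(total_patients, results_pending, waiting_room,
                         longest_wait_min,
                         coef=(0.05, 0.08, 0.10, 0.02), intercept=0.0):
          b1, b2, b3, b4 = coef
          return (intercept + b1 * total_patients + b2 * results_pending
                  + b3 * waiting_room + b4 * longest_wait_min)

      def crowding_level(score, overcrowded_cut=5.0, severe_cut=8.0):
          if score >= severe_cut:
              return "severely overcrowded"
          if score >= overcrowded_cut:
              return "overcrowded"
          return "not overcrowded"

      s = crowding_score(total_patients=38, results_pending=12,
                         waiting_room=15, longest_wait_min=95)
      print(round(s, 2), crowding_level(s))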

  4. Using EMIS to Identify Top Opportunities for Commercial Building Efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Guanjing; Singla, Rupam; Granderson, Jessica

    Energy Management and Information Systems (EMIS) comprise a broad family of tools and services to manage commercial building energy use. These technologies offer a mix of capabilities to store, display, and analyze energy use and system data, and in some cases, provide control. EMIS technologies enable 10–20 percent site energy savings in best practice implementations. Energy Information System (EIS) and Fault Detection and Diagnosis (FDD) systems are two key technologies in the EMIS family. Energy Information Systems are broadly defined as the web-based software, data acquisition hardware, and communication systems used to analyze and display building energy performance. At a minimum, an EIS provides daily, hourly or sub-hourly interval meter data at the whole-building level, with graphical and analytical capability. Fault Detection and Diagnosis systems automatically identify heating, ventilation, and air-conditioning (HVAC) system or equipment-level performance issues, and in some cases are able to isolate the root causes of the problem. They use computer algorithms to continuously analyze system-level operational data to detect faults and diagnose their causes. Many FDD tools integrate the trend log data from a Building Automation System (BAS) but otherwise are stand-alone software packages; other types of FDD tools are implemented as "on-board" equipment-embedded diagnostics. (This document focuses on the former.) Analysis approaches adopted in FDD technologies span a variety of techniques from rule-based methods to process history-based approaches. FDD tools automate investigations that can be conducted via manual data inspection by someone with expert knowledge, thereby expanding accessibility and breadth of analysis opportunity, and also reducing complexity.
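
    As a minimal illustration of the rule-based end of the FDD spectrum described above, the Python sketch below applies a single expert rule (heating and cooling valves open at the same time) to hypothetical BAS trend-log samples; the point list, threshold and data are invented.

      # Sketch of a rule-based fault detection check of the kind described above,
      # applied to hypothetical BAS trend-log samples for one air-handling unit.
      samples = [
          # timestamp, heating_valve_%, cooling_valve_%, supply_air_temp_C
          ("2024-01-01 08:00", 35.0,  0.0, 32.0),
          ("2024-01-01 08:15", 40.0, 25.0, 18.0),   # heating and cooling together
          ("2024-01-01 08:30",  0.0, 60.0, 12.0),
      ]

      def simultaneous_heat_cool(heating, cooling, threshold=5.0):
          """Rule: both valves open beyond a small threshold suggests a fault."""
          return heating > threshold and cooling > threshold

      for ts, heat, cool, sat in samples:
          if simultaneous_heat_cool(heat, cool):
              print(f"{ts}: possible simultaneous heating/cooling fault "
                    f"(heating={heat}%, cooling={cool}%)")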

  5. Automated condition-invariable neurite segmentation and synapse classification using textural analysis-based machine-learning algorithms

    PubMed Central

    Kandaswamy, Umasankar; Rotman, Ziv; Watt, Dana; Schillebeeckx, Ian; Cavalli, Valeria; Klyachko, Vitaly

    2013-01-01

    High-resolution live-cell imaging studies of neuronal structure and function are characterized by large variability in image acquisition conditions due to background and sample variations as well as low signal-to-noise ratio. The lack of automated image analysis tools that can be generalized for varying image acquisition conditions represents one of the main challenges in the field of biomedical image analysis. Specifically, segmentation of the axonal/dendritic arborizations in brightfield or fluorescence imaging studies is extremely labor-intensive and still performed mostly manually. Here we describe a fully automated machine-learning approach based on textural analysis algorithms for segmenting neuronal arborizations in high-resolution brightfield images of live cultured neurons. We compare the performance of our algorithm to manual segmentation and show that it achieves 90% accuracy, with similarly high levels of specificity and sensitivity. Moreover, the algorithm maintains high performance levels under a wide range of image acquisition conditions, indicating that it is largely condition-invariable. We further describe an application of this algorithm to fully automated synapse localization and classification in fluorescence imaging studies based on synaptic activity. This textural analysis-based machine-learning approach thus offers a high-performance, condition-invariable tool for automated neurite segmentation. PMID:23261652
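
    The general approach (texture features per pixel feeding a supervised classifier) can be sketched as follows with scipy and scikit-learn; this is not the authors' algorithm, and the image, local-statistics features and labels are synthetic stand-ins.

      # Generic sketch of texture-feature pixel classification (not the authors'
      # exact method): local mean/variance features feed a random forest that
      # labels pixels as neurite vs background. Image and labels are synthetic.
      import numpy as np
      from scipy.ndimage import uniform_filter
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      img = rng.normal(0, 1, (128, 128))
      img[60:68, :] += 3.0                      # synthetic bright 'neurite'

      def texture_features(image, size=7):
          mean = uniform_filter(image, size)
          sq_mean = uniform_filter(image**2, size)
          var = np.clip(sq_mean - mean**2, 0, None)   # local variance
          return np.stack([image, mean, var], axis=-1).reshape(-1, 3)

      X = texture_features(img)
      y = np.zeros(img.shape, dtype=int)
      y[60:68, :] = 1                           # synthetic ground-truth labels
      clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y.ravel())
      pred = clf.predict(X).reshape(img.shape)
      print("pixel accuracy:", (pred == y).mean())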

  6. Reliability of the ECHOWS Tool for Assessment of Patient Interviewing Skills.

    PubMed

    Boissonnault, Jill S; Evans, Kerrie; Tuttle, Neil; Hetzel, Scott J; Boissonnault, William G

    2016-04-01

    History taking is an important component of patient/client management. Assessment of student history-taking competency can be achieved via a standardized tool. The ECHOWS tool has been shown to be valid with modest intrarater reliability in a previous study but did not demonstrate sufficient power to definitively prove its stability. The purposes of this study were: (1) to assess the reliability of the ECHOWS tool for student assessment of patient interviewing skills and (2) to determine whether the tool discerns between novice and experienced skill levels. A reliability and construct validity assessment was conducted. Three faculty members from the United States and Australia scored videotaped histories from standardized patients taken by students and experienced clinicians from each of these countries. The tapes were scored twice, 3 to 6 weeks apart. Reliability was assessed using intraclass correlation coefficients (ICCs) and repeated measures. Analysis of variance models assessed the ability of the tool to discern between novice and experienced skill levels. The ECHOWS tool showed excellent intrarater reliability (ICC [3,1]=.74-.89) and good interrater reliability (ICC [2,1]=.55) as a whole. The summary of performance (S) section showed poor interrater reliability (ICC [2,1]=.27). There was no statistical difference in performance on the tool between novice and experienced clinicians. A possible ceiling effect may occur when standardized patients are not coached to provide complex and obtuse responses to interviewer questions. Variation in familiarity with the ECHOWS tool and in use of the online training may have influenced scoring of the S section. The ECHOWS tool demonstrates excellent intrarater reliability and moderate interrater reliability. Sufficient training with the tool prior to student assessment is recommended. The S section must evolve in order to provide a more discerning measure of interviewing skills. © 2016 American Physical Therapy Association.

  7. Is the Job Satisfaction Survey a good tool to measure job satisfaction amongst health workers in Nepal? Results of a validation analysis.

    PubMed

    Batura, Neha; Skordis-Worrall, Jolene; Thapa, Rita; Basnyat, Regina; Morrison, Joanna

    2016-07-27

    Job satisfaction is an important predictor of an individual's intention to leave the workplace. It is increasingly being used to consider the retention of health workers in low-income countries. However, the determinants of job satisfaction vary in different contexts, and it is important to use measurement methods that are contextually appropriate. We identified a measurement tool developed by Paul Spector, and used mixed methods to assess its validity and reliability in measuring job satisfaction among maternal and newborn health workers (MNHWs) in government facilities in rural Nepal. We administered the tool to 137 MNHWs and collected qualitative data from 78 MNHWs, and district and central level stakeholders to explore definitions of job satisfaction and factors that affected it. We calculated a job satisfaction index for all MNHWs using quantitative data and tested for validity, reliability and sensitivity. We conducted qualitative content analysis and compared the job satisfaction indices with qualitative data. Results from the internal consistency tests offer encouraging evidence of the validity, reliability and sensitivity of the tool. Overall, the job satisfaction indices reflected the qualitative data. The tool was able to distinguish levels of job satisfaction among MNHWs. However, the work environment and promotion dimensions of the tool did not adequately reflect local conditions. Further, community fit was found to impact job satisfaction but was not captured by the tool. The relatively high incidence of missing responses may suggest that responding to some statements was perceived as risky. Our findings indicate that the adapted job satisfaction survey was able to measure job satisfaction in Nepal. However, it did not include key contextual factors affecting job satisfaction of MNHWs, and as such may have been less sensitive than a more inclusive measure. The findings suggest that this tool can be used in similar settings and populations, with the addition of statements reflecting the nature of the work environment and structure of the local health system. Qualitative data on job satisfaction should be collected before using the tool in a new context, to highlight any locally relevant dimensions of job satisfaction not already captured in the standard survey.

  8. Web-based automation of green building rating index and life cycle cost analysis

    NASA Astrophysics Data System (ADS)

    Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul

    2018-04-01

    A sudden decline in financial markets and the ensuing economic meltdown have slowed adoption and lowered investor interest in green-certified buildings because of their higher initial costs. It is therefore essential to attract investors to further green building development through automated tools for construction projects. However, there is a historical dearth of work on automating green building rating tools, which leaves an essential gap that an automated, computerized tool could fill. This paper presents proposed research aimed at developing an integrated, web-based automated tool that applies a green building rating assessment, green technology, and life cycle cost (LCC) analysis. It also identifies the variables of MyCrest and LCC to be integrated and developed in a framework and then transformed into an automated tool. A mixed methodology of qualitative and quantitative surveys is planned to carry the MyCrest-LCC integration to an automated level. In this study, a preliminary literature review provides a better understanding of the integration of Green Building Rating Tools (GBRT) with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute to green buildings and future agendas.
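
    The LCC side of the proposed integration reduces to a discounted-cost calculation; the sketch below shows the arithmetic with invented capital costs, annual running costs and discount rate, comparing a conventional and a green-certified design over a 30-year study period. Under these made-up numbers the higher initial cost of the green design is offset by lower discounted running costs, which is the kind of result an automated tool would surface for investors.

      # Sketch of a discounted life cycle cost (LCC) calculation of the kind the
      # proposed tool would automate; costs and discount rate are invented.
      def life_cycle_cost(initial_cost, annual_costs, discount_rate):
          """LCC = initial cost + sum of annual costs discounted to present value."""
          return initial_cost + sum(
              cost / (1 + discount_rate) ** year
              for year, cost in enumerate(annual_costs, start=1)
          )

      conventional = life_cycle_cost(1_000_000, [80_000] * 30, 0.05)
      green        = life_cycle_cost(1_150_000, [55_000] * 30, 0.05)  # higher capex, lower running cost
      print(f"conventional LCC: {conventional:,.0f}")
      print(f"green LCC:        {green:,.0f}")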

  9. The PathoYeastract database: an information system for the analysis of gene and genomic transcription regulation in pathogenic yeasts.

    PubMed

    Monteiro, Pedro Tiago; Pais, Pedro; Costa, Catarina; Manna, Sauvagya; Sá-Correia, Isabel; Teixeira, Miguel Cacho

    2017-01-04

    We present the PATHOgenic YEAst Search for Transcriptional Regulators And Consensus Tracking (PathoYeastract - http://pathoyeastract.org) database, a tool for the analysis and prediction of transcription regulatory associations at the gene and genomic levels in the pathogenic yeasts Candida albicans and C. glabrata. Upon data retrieval from hundreds of publications, followed by curation, the database currently includes 28 000 unique documented regulatory associations between transcription factors (TF) and target genes and 107 DNA binding sites, considering 134 TFs in both species. Following the structure used for the YEASTRACT database, PathoYeastract makes available bioinformatics tools that enable the user to exploit the existing information to predict the TFs involved in the regulation of a gene or genome-wide transcriptional response, while ranking those TFs in order of their relative importance. Each search can be filtered based on the selection of specific environmental conditions, experimental evidence or positive/negative regulatory effect. Promoter analysis tools and interactive visualization tools for the representation of TF regulatory networks are also provided. The PathoYeastract database further provides simple tools for the prediction of gene and genomic regulation based on orthologous regulatory associations described for other yeast species, a comparative genomics setup for the study of cross-species evolution of regulatory networks. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  10. The MG-RAST Metagenomics Database and Portal in 2015

    DOE PAGES

    Wilke, Andreas; Bischof, Jared; Gerlach, Wolfgang; ...

    2015-12-09

    MG-RAST (http://metagenomics.anl.gov) is an open-submission data portal for processing, analyzing, sharing and disseminating metagenomic datasets. Currently, the system hosts over 200 000 datasets and is continuously updated. The volume of submissions has increased 4-fold over the past 24 months, now averaging 4 terabase pairs per month. In addition to several new features, we report changes to the analysis workflow and the technologies used to scale the pipeline up to the required throughput levels. Lastly, to show possible uses for the data from MG-RAST, we present several examples integrating data and analyses from MG-RAST into popular third-party analysis tools or sequence alignment tools.

  11. Evaluation of Cross-Protocol Stability of a Fully Automated Brain Multi-Atlas Parcellation Tool.

    PubMed

    Liang, Zifei; He, Xiaohai; Ceritoglu, Can; Tang, Xiaoying; Li, Yue; Kutten, Kwame S; Oishi, Kenichi; Miller, Michael I; Mori, Susumu; Faria, Andreia V

    2015-01-01

    Brain parcellation tools based on multiple-atlas algorithms have recently emerged as a promising method with which to accurately define brain structures. When dealing with data from various sources, it is crucial that these tools are robust for many different imaging protocols. In this study, we tested the robustness of a multiple-atlas, likelihood fusion algorithm using Alzheimer's Disease Neuroimaging Initiative (ADNI) data with six different protocols, comprising three manufacturers and two magnetic field strengths. The entire brain was parceled into five different levels of granularity. In each level, which defines a set of brain structures, ranging from eight to 286 regions, we evaluated the variability of brain volumes related to the protocol, age, and diagnosis (healthy or Alzheimer's disease). Our results indicated that, with proper pre-processing steps, the impact of different protocols is minor compared to biological effects, such as age and pathology. A precise knowledge of the sources of data variation enables sufficient statistical power and ensures the reliability of an anatomical analysis when using this automated brain parcellation tool on datasets from various imaging protocols, such as clinical databases.

  12. Use of statistical tools to evaluate the reductive dechlorination of high levels of TCE in microcosm studies.

    PubMed

    Harkness, Mark; Fisher, Angela; Lee, Michael D; Mack, E Erin; Payne, Jo Ann; Dworatzek, Sandra; Roberts, Jeff; Acheson, Carolyn; Herrmann, Ronald; Possolo, Antonio

    2012-04-01

    A large, multi-laboratory microcosm study was performed to select amendments for supporting reductive dechlorination of high levels of trichloroethylene (TCE) found at an industrial site in the United Kingdom (UK) containing dense non-aqueous phase liquid (DNAPL) TCE. The study was designed as a fractional factorial experiment involving 177 bottles distributed between four industrial laboratories and was used to assess the impact of six electron donors, bioaugmentation, addition of supplemental nutrients, and two TCE levels (0.57 and 1.90 mM or 75 and 250 mg/L in the aqueous phase) on TCE dechlorination. Performance was assessed based on the concentration changes of TCE and reductive dechlorination degradation products. The chemical data was evaluated using analysis of variance (ANOVA) and survival analysis techniques to determine both main effects and important interactions for all the experimental variables during the 203-day study. The statistically based design and analysis provided powerful tools that aided decision-making for field application of this technology. The analysis showed that emulsified vegetable oil (EVO), lactate, and methanol were the most effective electron donors, promoting rapid and complete dechlorination of TCE to ethene. Bioaugmentation and nutrient addition also had a statistically significant positive impact on TCE dechlorination. In addition, the microbial community was measured using phospholipid fatty acid analysis (PLFA) for quantification of total biomass and characterization of the community structure and quantitative polymerase chain reaction (qPCR) for enumeration of Dehalococcoides organisms (Dhc) and the vinyl chloride reductase (vcrA) gene. The highest increase in levels of total biomass and Dhc was observed in the EVO microcosms, which correlated well with the dechlorination results. Copyright © 2012 Elsevier B.V. All rights reserved.
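
    A main-effects-plus-interaction ANOVA of the kind used in the study can be sketched with statsmodels as below; the factors mirror two of the experimental variables (electron donor and bioaugmentation), but the data frame is generated at random and is not the study's data.

      # Sketch of a main-effects-plus-interaction ANOVA of the kind described,
      # using statsmodels on a small invented data set (not the study's data).
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      from statsmodels.formula.api import ols

      rng = np.random.default_rng(2)
      df = pd.DataFrame({
          "donor": np.repeat(["EVO", "lactate", "methanol", "none"], 8),
          "bioaug": np.tile(["yes", "no"], 16),
      })
      effect = df["donor"].map({"EVO": 2.0, "lactate": 1.5, "methanol": 1.2, "none": 0.0})
      df["ethene_mM"] = effect + (df["bioaug"] == "yes") * 0.5 + rng.normal(0, 0.3, len(df))

      model = ols("ethene_mM ~ C(donor) * C(bioaug)", data=df).fit()
      print(sm.stats.anova_lm(model, typ=2))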

  13. Virtual Beach v2.2 User Guide

    EPA Science Inventory

    Virtual Beach version 2.2 (VB 2.2) is a decision support tool. It is designed to construct site-specific Multi-Linear Regression (MLR) models to predict pathogen indicator levels (or fecal indicator bacteria, FIB) at recreational beaches. MLR analysis has outperformed persisten...

  14. Hand-independent representation of tool-use pantomimes in the left anterior intraparietal cortex.

    PubMed

    Ogawa, Kenji; Imai, Fumihito

    2016-12-01

    Previous neuropsychological studies of ideomotor apraxia (IMA) indicated impairments in pantomime actions for tool use for both right and left hands following lesions of parieto-premotor cortices in the left hemisphere. Using functional magnetic resonance imaging (fMRI) with multi-voxel pattern analysis (MVPA), we tested the hypothesis that the left parieto-premotor cortices are involved in the storage or retrieval of hand-independent representation of tool-use actions. In the fMRI scanner, one of three kinds of tools was displayed in pictures or letters, and the participants made pantomimes of the use of these tools using the right hand for the picture stimuli or with the left hand for the letters. We then used MVPA to classify which kind of tool the subjects were pantomiming. Whole-brain searchlight analysis revealed successful decoding using the activities largely in the contralateral primary sensorimotor region, ipsilateral cerebellum, and bilateral early visual area, which may reflect differences in low-level sensorimotor components for three types of pantomimes. Furthermore, a successful cross-classification between the right and left hands was possible using the activities of the left inferior parietal lobule (IPL) near the junction of the anterior intraparietal sulcus. Our finding indicates that the left anterior intraparietal cortex plays an important role in the production of tool-use pantomimes in a hand-independent manner, and independent of stimuli modality.
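
    The cross-classification logic (train a decoder on right-hand trials, test it on left-hand trials) can be sketched with scikit-learn as follows; the voxel patterns are simulated with a shared, hand-independent tool code plus hand-specific offsets, and nothing here reproduces the study's data or searchlight implementation.

      # Sketch of cross-classification between effectors: train on right-hand
      # trials, test on left-hand trials. Voxel patterns are synthetic.
      import numpy as np
      from sklearn.svm import LinearSVC

      rng = np.random.default_rng(0)
      n_trials, n_voxels = 60, 100
      tools = rng.integers(0, 3, n_trials)             # 3 tool pantomimes
      prototypes = rng.normal(0, 1, (3, n_voxels))     # shared tool-specific code

      def simulate(hand_offset):
          # hand-independent tool signal plus hand-specific offset and noise
          return prototypes[tools] + hand_offset + rng.normal(0, 1.5, (n_trials, n_voxels))

      X_right = simulate(hand_offset=0.3)
      X_left  = simulate(hand_offset=-0.3)

      clf = LinearSVC(max_iter=5000).fit(X_right, tools)  # train on right hand
      acc = clf.score(X_left, tools)                      # test on left hand
      print(f"cross-hand decoding accuracy: {acc:.2f} (chance ~0.33)")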

  15. Generic extravehicular (EVA) and telerobot task primitives for analysis, design, and integration. Version 1.0: Reference compilation for the EVA and telerobotics communities

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey H.; Drews, Michael

    1990-01-01

    The results are described of an effort to establish commonality and standardization of generic crew extravehicular (crew-EVA) and telerobotic task analysis primitives used for the study of spaceborne operations. Although direct crew-EVA plans are the most visible output of spaceborne operations, significant ongoing efforts by a wide variety of projects and organizations also require tools for estimation of crew-EVA and telerobotic times. Task analysis tools provide estimates for input to technical and cost tradeoff studies. A workshop was convened to identify the issues and needs to establish a common language and syntax for task analysis primitives. In addition, the importance of such a syntax was shown to have precedence over the level to which such a syntax is applied. The syntax, lists of crew-EVA and telerobotic primitives, and the data base in diskette form are presented.

  16. Automotive manufacturing assessment system. Volume IV: engine manufacturing analysis. Final report Jun 77-Aug 78

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, T. Jr

    Volume IV represents the results of one of four major study areas under the Automotive Manufacturing Assessment System (AMAS) sponsored by the DOT/Transportation Systems Center. AMAS was designed to assist in the evaluation of industry's capability to produce fuel efficient vehicles. An analysis of automotive engine manufacturing was conducted in order to determine the impact of regulatory changes on tooling costs and the production process. The 351W CID V-8 engine at Ford's Windsor No. 1 Plant was the subject of the analysis. A review of plant history and its product is presented along with an analysis of manufacturing operations, including material and production flow, plant layout, machining and assembly processes, tooling, supporting facilities, inspection, service and repair. Four levels of product change intensity showing the impact on manufacturing methods and cost are also presented.

  17. SlideJ: An ImageJ plugin for automated processing of whole slide images.

    PubMed

    Della Mea, Vincenzo; Baroni, Giulia L; Pilutti, David; Di Loreto, Carla

    2017-01-01

    The digital slide, or Whole Slide Image, is a digital image, acquired with specific scanners, that represents a complete tissue sample or cytological specimen at microscopic level. While Whole Slide image analysis is recognized among the most interesting opportunities, the typical size of such images (up to gigapixels) can be very demanding in terms of memory requirements. Thus, while algorithms and tools for processing and analysis of single microscopic field images are available, the size of Whole Slide Images makes the direct use of such tools prohibitive or impossible. In this work a plugin for ImageJ, named SlideJ, is proposed with the objective of seamlessly extending the application of image analysis algorithms implemented in ImageJ for single microscopic field images to a whole digital slide analysis. The plugin has been complemented by examples of macros in the ImageJ scripting language to demonstrate its use in concrete situations.

  18. SlideJ: An ImageJ plugin for automated processing of whole slide images

    PubMed Central

    Baroni, Giulia L.; Pilutti, David; Di Loreto, Carla

    2017-01-01

    The digital slide, or Whole Slide Image, is a digital image, acquired with specific scanners, that represents a complete tissue sample or cytological specimen at microscopic level. While Whole Slide image analysis is recognized among the most interesting opportunities, the typical size of such images (up to gigapixels) can be very demanding in terms of memory requirements. Thus, while algorithms and tools for processing and analysis of single microscopic field images are available, the size of Whole Slide Images makes the direct use of such tools prohibitive or impossible. In this work a plugin for ImageJ, named SlideJ, is proposed with the objective of seamlessly extending the application of image analysis algorithms implemented in ImageJ for single microscopic field images to a whole digital slide analysis. The plugin has been complemented by examples of macros in the ImageJ scripting language to demonstrate its use in concrete situations. PMID:28683129

  19. Validating a work group climate assessment tool for improving the performance of public health organizations

    PubMed Central

    Perry, Cary; LeMay, Nancy; Rodway, Greg; Tracy, Allison; Galer, Joan

    2005-01-01

    Background This article describes the validation of an instrument to measure work group climate in public health organizations in developing countries. The instrument, the Work Group Climate Assessment Tool (WCA), was applied in Brazil, Mozambique, and Guinea to assess the intermediate outcomes of a program to develop leadership for performance improvement. Data were collected from 305 individuals in 42 work groups, who completed a self-administered questionnaire. Methods The WCA was initially validated using Cronbach's alpha reliability coefficient and exploratory factor analysis. This article presents the results of a second validation study to refine the initial analyses to account for nested data, to provide item-level psychometrics, and to establish construct validity. Analyses included eigenvalue decomposition analysis, confirmatory factor analysis, and validity and reliability analyses. Results This study confirmed the validity and reliability of the WCA across work groups with different demographic characteristics (gender, education, management level, and geographical location). The study showed that there is agreement between the theoretical construct of work climate and the items in the WCA tool across different populations. The WCA captures a single perception of climate rather than individual sub-scales of clarity, support, and challenge. Conclusion The WCA is useful for comparing the climates of different work groups, tracking the changes in climate in a single work group over time, or examining differences among individuals' perceptions of their work group climate. Application of the WCA before and after a leadership development process can help work groups hold a discussion about current climate and select a target for improvement. The WCA provides work groups with a tool to take ownership of their own group climate through a process that is simple and objective and that protects individual confidentiality. PMID:16223447

  20. Forecasting weed distributions using climate data: a GIS early warning tool

    USGS Publications Warehouse

    Jarnevich, Catherine S.; Holcombe, Tracy R.; Barnett, David T.; Stohlgren, Thomas J.; Kartesz, John T.

    2010-01-01

    The number of invasive exotic plant species establishing in the United States is continuing to rise. When prevention of exotic species from entering into a country fails at the national level and the species establishes, reproduces, spreads, and becomes invasive, the most successful action at a local level is early detection followed by eradication. We have developed a simple geographic information system (GIS) analysis for developing watch lists for early detection of invasive exotic plants that relies upon currently available species distribution data coupled with environmental data to aid in describing coarse-scale potential distributions. This GIS analysis tool develops environmental envelopes for species based upon the known distribution of a species thought to be invasive and represents the first approximation of its potential habitat while the necessary data are collected to perform more in-depth analyses. To validate this method we looked at a time series of species distributions for 66 species in Pacific Northwest and northern Rocky Mountain counties. The time series analysis presented here did select counties that the invasive exotic weeds invaded in subsequent years, showing that this technique could be useful in developing watch lists for the spread of particular exotic species. We applied this same habitat-matching model based upon bioclimatic envelopes to 100 invasive exotics with various levels of known distributions within continental U.S. counties. For species with climatically limited distributions, county watch lists describe county-specific vulnerability to invasion. Species with matching habitats in a county would be added to that county's list. These watch lists can influence management decisions for early warning, control prioritization, and targeted research to determine specific locations within vulnerable counties. This tool provides useful information for rapid assessment of the potential distribution based upon climate envelopes of current distributions for new invasive exotic species.
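
    The envelope-matching step itself is simple enough to sketch in a few lines of Python: take the minima and maxima of the climate variables over counties with known occurrences, then flag unoccupied counties whose climate falls inside that range. The county table and occurrence list below are invented examples.

      # Sketch of the bioclimatic-envelope matching step described above; the
      # county climate table and occurrence list are invented examples.
      import pandas as pd

      counties = pd.DataFrame({
          "county":        ["A", "B", "C", "D", "E"],
          "mean_temp_C":   [8.5, 10.2, 12.0, 6.1, 9.4],
          "annual_ppt_mm": [400, 650, 300, 800, 500],
      }).set_index("county")

      occupied = ["A", "B"]                  # counties with known occurrences

      env = counties.loc[occupied]
      lo, hi = env.min(), env.max()          # coarse climate envelope

      inside = ((counties >= lo) & (counties <= hi)).all(axis=1)
      watch_list = counties.index[inside & ~counties.index.isin(occupied)]
      print("counties to add to the watch list:", list(watch_list))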

  1. Can we trust the calculation of texture indices of CT images? A phantom study.

    PubMed

    Caramella, Caroline; Allorant, Adrien; Orlhac, Fanny; Bidault, Francois; Asselain, Bernard; Ammari, Samy; Jaranowski, Patricia; Moussier, Aurelie; Balleyguier, Corinne; Lassau, Nathalie; Pitre-Champagnat, Stephanie

    2018-04-01

    Texture analysis is an emerging tool in the field of medical imaging analysis. However, many issues have been raised in terms of its use in assessing patient images and it is crucial to harmonize and standardize this new imaging measurement tool. This study was designed to evaluate the reliability of texture indices of CT images on a phantom, including a reproducibility study, to assess the discriminatory capacity of indices potentially relevant in CT medical images and to determine their redundancy. For the reproducibility and discriminatory analysis, eight identical CT acquisitions were performed on a phantom including one homogeneous insert and two close heterogeneous inserts. Texture indices were selected for their high reproducibility and capability of discriminating different textures. For the redundancy analysis, 39 acquisitions of the same phantom were performed using varying acquisition parameters and a correlation matrix was used to explore the 2 × 2 relationships. LIFEx software was used to explore 34 different parameters including first order and texture indices. Only eight indices of 34 exhibited high reproducibility and discriminated textures from each other. Skewness and kurtosis from the histogram were independent of the six other indices but were intercorrelated with each other; the other six indices were correlated to varying degrees (entropy, dissimilarity, and contrast of the co-occurrence matrix, contrast of the Neighborhood Gray Level difference matrix, SZE, ZLNU of the Gray-Level Size Zone Matrix). Care should be taken when using texture analysis as a tool to characterize CT images because changes in quantitation may be primarily due to internal variability rather than real physio-pathological effects. Some textural indices appear to be sufficiently reliable and capable of discriminating close textures on CT images. © 2018 American Association of Physicists in Medicine.

  2. Data fusion for CD metrology: heterogeneous hybridization of scatterometry, CDSEM, and AFM data

    NASA Astrophysics Data System (ADS)

    Hazart, J.; Chesneau, N.; Evin, G.; Largent, A.; Derville, A.; Thérèse, R.; Bos, S.; Bouyssou, R.; Dezauzier, C.; Foucher, J.

    2014-04-01

    The manufacturing of next generation semiconductor devices forces metrology tool providers to make an exceptional effort in order to meet the requirements for precision, accuracy and throughput stated in the ITRS. In the past years, hybrid metrology (based on data fusion theories) has been investigated as a new methodology for advanced metrology [1][2][3]. This paper provides a new point of view of data fusion for metrology through some experiments and simulations. The techniques are presented concretely in terms of the equations to be solved. The first point of view is High Level Fusion, which post-processes simple numbers with their associated uncertainties from each tool. In this paper, it is divided into two stages: one for calibration to reach accuracy, the second to reach precision thanks to Bayesian Fusion. From our perspective, the first stage is mandatory before applying the second stage, which is the one commonly presented [1]. However, a reference metrology system is necessary for this fusion. Thus, precision can be improved if and only if the tools to be fused are perfectly matched, at least for some parameters. We provide a methodology, similar to a multidimensional TMU, able to perform this matching exercise. It is demonstrated on a 28 nm node backend lithography case. The second point of view is Deep Level Fusion, which instead works with raw data and their combination. In the approach presented here, the analysis of each raw dataset is based on a parametric model and on connections between the parameters of each tool. In order to allow OCD/SEM Deep Level Fusion, a SEM Compact Model derived from [4] has been developed and compared to AFM. As far as we know, this is the first time such techniques have been coupled at the Deep Level. A numerical study on the case of a simple stack for lithography is performed. We show strict equivalence of Deep Level Fusion and High Level Fusion when tools are sensitive and models are perfect. When one of the tools can be considered as a reference and the second is biased, High Level Fusion is far superior to standard Deep Level Fusion. Otherwise, only the second stage of High Level Fusion (Bayesian Fusion) is possible and does not provide a substantial advantage. Finally, when OCD is equipped with methods for bias detection [5], Deep Level Fusion outclasses the two-stage High Level Fusion and will benefit the industry for production of the most advanced nodes.
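    As a minimal sketch of the second High Level Fusion stage, assuming the generic inverse-variance (Bayesian) estimator rather than the paper's exact formulation, two calibrated CD measurements with stated uncertainties can be combined as follows; the numbers are hypothetical.

        # Hedged sketch of high-level fusion of CD measurements: after calibration
        # to a reference, combine tool values weighted by the inverse of their
        # variances (generic estimator, not necessarily the paper's exact one).
        def fuse(measurements, sigmas):
            """measurements: calibrated CD values (nm); sigmas: 1-sigma uncertainties (nm)."""
            weights = [1.0 / s**2 for s in sigmas]
            fused = sum(w * m for w, m in zip(weights, measurements)) / sum(weights)
            fused_sigma = (1.0 / sum(weights)) ** 0.5
            return fused, fused_sigma

        # Example: OCD reports 32.1 +/- 0.4 nm, CD-SEM reports 32.6 +/- 0.8 nm
        # fuse([32.1, 32.6], [0.4, 0.8])  -> (~32.2 nm, ~0.36 nm)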

  3. Fluorescence Lectin Bar-Coding of Glycoconjugates in the Extracellular Matrix of Biofilm and Bioaggregate Forming Microorganisms.

    PubMed

    Neu, Thomas R; Kuhlicke, Ute

    2017-02-10

    Microbial biofilm systems are defined as interface-associated microorganisms embedded in a self-produced matrix. The extracellular matrix represents a continuous challenge in terms of characterization and analysis. The tools applied in more detailed studies comprise extraction/chemical analysis, molecular characterization, and visualisation using various techniques. Imaging by laser microscopy has become a standard tool for biofilm analysis and, in combination with fluorescently labelled lectins, the glycoconjugates of the matrix can be assessed. Employing this approach, a wide range of pure-culture biofilms from different habitats was examined using commercially available lectins. From the results, a binary barcode pattern of lectin binding can be generated. Furthermore, the results can be fine-tuned and transferred into a heat map according to signal intensity. The lectin barcode approach is suggested as a useful tool for investigating biofilm matrix characteristics and dynamics at various levels, e.g. bacterial cell surfaces, adhesive footprints, individual microcolonies, and the gross biofilm or bio-aggregate. Hence fluorescence lectin bar-coding (FLBC) serves as a basis for a subsequent tailor-made fluorescence lectin-binding analysis (FLBA) of a particular biofilm. So far, the lectin approach represents the only tool for in situ characterization of the glycoconjugate makeup of biofilm systems. Furthermore, lectin staining lends itself to combination with other fluorescence techniques in order to correlate it with cellular biofilm constituents in general and glycoconjugate producers in particular.

  4. WholePathwayScope: a comprehensive pathway-based analysis tool for high-throughput data

    PubMed Central

    Yi, Ming; Horton, Jay D; Cohen, Jonathan C; Hobbs, Helen H; Stephens, Robert M

    2006-01-01

    Background Analysis of High Throughput (HTP) Data such as microarray and proteomics data has provided a powerful methodology to study patterns of gene regulation at genome scale. A major unresolved problem in the post-genomic era is to assemble the large amounts of data generated into a meaningful biological context. We have developed a comprehensive software tool, WholePathwayScope (WPS), for deriving biological insights from analysis of HTP data. Result WPS extracts gene lists with shared biological themes through color cue templates. WPS statistically evaluates global functional category enrichment of gene lists and pathway-level pattern enrichment of data. WPS incorporates well-known biological pathways from KEGG (Kyoto Encyclopedia of Genes and Genomes) and Biocarta, GO (Gene Ontology) terms as well as user-defined pathways or relevant gene clusters or groups, and explores gene-term relationships within the derived gene-term association networks (GTANs). WPS simultaneously compares multiple datasets within biological contexts either as pathways or as association networks. WPS also integrates Genetic Association Database and Partial MedGene Database for disease-association information. We have used this program to analyze and compare microarray and proteomics datasets derived from a variety of biological systems. Application examples demonstrated the capacity of WPS to significantly facilitate the analysis of HTP data for integrative discovery. Conclusion This tool represents a pathway-based platform for discovery integration to maximize analysis power. The tool is freely available at . PMID:16423281
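    The category-enrichment step that WPS performs is commonly computed with a hypergeometric test; the sketch below shows that generic calculation (scipy assumed, not WPS's own code), with made-up counts.

        # Minimal sketch of a functional-category enrichment test of the kind WPS
        # reports (generic hypergeometric test, not WPS code).
        from scipy.stats import hypergeom

        def enrichment_p(n_genome, n_category, n_list, n_hits):
            """P(observing >= n_hits category genes in a list of n_list genes)."""
            # sf(k-1) gives P(X >= k) for the hypergeometric distribution
            return hypergeom.sf(n_hits - 1, n_genome, n_category, n_list)

        # Example: 20000 genes, 150 in the pathway, 300 in the gene list, 12 hits
        # enrichment_p(20000, 150, 300, 12)  -> small p-value indicating enrichment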

  5. PASTA: splice junction identification from RNA-Sequencing data

    PubMed Central

    2013-01-01

    Background Next generation transcriptome sequencing (RNA-Seq) is emerging as a powerful experimental tool for the study of alternative splicing and its regulation, but requires ad-hoc analysis methods and tools. PASTA (Patterned Alignments for Splicing and Transcriptome Analysis) is a splice junction detection algorithm specifically designed for RNA-Seq data, relying on a highly accurate alignment strategy and on a combination of heuristic and statistical methods to identify exon-intron junctions with high accuracy. Results Comparisons against TopHat and other splice junction prediction software on real and simulated datasets show that PASTA exhibits high specificity and sensitivity, especially at lower coverage levels. Moreover, PASTA is highly configurable and flexible, and can therefore be applied in a wide range of analysis scenarios: it is able to handle both single-end and paired-end reads, it does not rely on the presence of canonical splicing signals, and it uses organism-specific regression models to accurately identify junctions. Conclusions PASTA is a highly efficient and sensitive tool to identify splicing junctions from RNA-Seq data. Compared to similar programs, it has the ability to identify a higher number of real splicing junctions, and provides highly annotated output files containing detailed information about their location and characteristics. Accurate junction data in turn facilitates the reconstruction of the splicing isoforms and the analysis of their expression levels, which will be performed by the remaining modules of the PASTA pipeline, still under development. Use of PASTA can therefore enable the large-scale investigation of transcription and alternative splicing. PMID:23557086

  6. Combining multi-criteria decision analysis and mini-health technology assessment: A funding decision-support tool for medical devices in a university hospital setting.

    PubMed

    Martelli, Nicolas; Hansen, Paul; van den Brink, Hélène; Boudard, Aurélie; Cordonnier, Anne-Laure; Devaux, Capucine; Pineau, Judith; Prognon, Patrice; Borget, Isabelle

    2016-02-01

    At the hospital level, decisions about purchasing new and oftentimes expensive medical devices must take into account multiple criteria simultaneously. Multi-criteria decision analysis (MCDA) is increasingly used for health technology assessment (HTA). One of the most successful hospital-based HTA approaches is mini-HTA, of which a notable example is the Matrix4value model. The objective was to develop a funding decision-support tool combining MCDA and mini-HTA, based on Matrix4value, suitable for medical devices for individual patient use in French university hospitals, known as the IDA tool, short for 'innovative device assessment'. Criteria for assessing medical devices were identified from a literature review and a survey of 18 French university hospitals. Weights for the criteria, representing their relative importance, were derived from a survey of 25 members of a medical devices committee using an elicitation technique involving pairwise comparisons. As a test of its usefulness, the IDA tool was applied to two new drug-eluting beads (DEBs) for transcatheter arterial chemoembolization. The IDA tool comprises five criteria and weights for each of two over-arching categories: risk and value. The tool revealed that the two new DEBs conferred no additional value relative to DEBs currently available. Feedback from participating decision-makers about the IDA tool was very positive. The tool could help to promote a more structured and transparent approach to HTA decision-making in French university hospitals. Copyright © 2015 Elsevier Inc. All rights reserved.
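    A weighted-sum scoring step of the kind implied by criteria plus elicited weights could look like the sketch below; the criterion names, weights and ratings are hypothetical, not the IDA tool's actual values.

        # Illustrative weighted-sum scoring in the spirit of the IDA tool: each
        # device is rated on the criteria of a category and the ratings are
        # combined with the elicited weights (all numbers are made up).
        def category_score(ratings: dict, weights: dict) -> float:
            """Weighted sum of criterion ratings; weights are assumed to sum to 1."""
            return sum(weights[c] * ratings[c] for c in weights)

        value_weights = {"clinical_benefit": 0.4, "organisational_impact": 0.2,
                         "evidence_quality": 0.2, "patient_demand": 0.1, "innovation": 0.1}
        device_ratings = {"clinical_benefit": 0.5, "organisational_impact": 0.7,
                          "evidence_quality": 0.3, "patient_demand": 0.6, "innovation": 0.8}
        print(category_score(device_ratings, value_weights))   # ~0.54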

  7. Identification of human operator performance models utilizing time series analysis

    NASA Technical Reports Server (NTRS)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.

  8. Large Deployable Reflector (LDR) thermal characteristics

    NASA Technical Reports Server (NTRS)

    Miyake, R. N.; Wu, Y. C.

    1988-01-01

    The thermal support group, which is part of the lightweight composite reflector panel program, developed thermal test and analysis evaluation tools necessary to support the integrated interdisciplinary analysis (IIDA) capability. A detailed thermal mathematical model and a simplified spacecraft thermal math model were written. These models determine the orbital temperature level and variation, and the thermally induced gradients through and across a panel, for inclusion in the IIDA.

  9. Application of the CO2-PENS risk analysis tool to the Rock Springs Uplift, Wyoming

    USGS Publications Warehouse

    Stauffer, P.H.; Pawar, R.J.; Surdam, R.C.; Jiao, Z.; Deng, H.; Lettelier, B.C.; Viswanathan, H.S.; Sanzo, D.L.; Keating, G.N.

    2011-01-01

    We describe preliminary application of the CO2-PENS performance and risk analysis tool to a planned geologic CO2 sequestration demonstration project in the Rock Springs Uplift (RSU), located in southwestern Wyoming. We use data from the RSU to populate CO2-PENS, an evolving system-level modeling tool developed at Los Alamos National Laboratory. This tool has been designed to generate performance and risk assessment calculations for the geologic sequestration of carbon dioxide. Our approach follows Systems Analysis logic and includes estimates of uncertainty in model parameters and Monte-Carlo simulations that lead to probabilistic results. Probabilistic results provide decision makers with a range in the likelihood of different outcomes. Herein we present results from a newly implemented approach in CO2-PENS that captures site-specific spatially coherent details such as topography on the reservoir/cap-rock interface, changes in saturation and pressure during injection, and dip on overlying aquifers that may be impacted by leakage upward through wellbores and faults. We present simulations of CO2 injection under different uncertainty distributions for hypothetical leaking wells and faults. Although results are preliminary and to be used only for demonstration of the approach, future results of the risk analysis will form the basis for a discussion on methods to reduce uncertainty in the risk calculations. Additionally, we present ideas on using the model to help locate monitoring equipment to detect potential leaks. By maintaining site-specific details in the CO2-PENS analysis we provide a tool that allows more logical presentations to stakeholders in the region. © 2011 Published by Elsevier Ltd.
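    The probabilistic workflow can be illustrated with a generic Monte Carlo sketch: sample uncertain parameters, run a (here deliberately trivial) leakage proxy, and report the distribution of outcomes. The distributions, the Darcy-like proxy and the threshold are placeholders, not the actual CO2-PENS models.

        # Generic Monte Carlo sketch of parameter-uncertainty propagation
        # (toy leakage proxy, not CO2-PENS).
        import numpy as np

        rng = np.random.default_rng(42)
        n = 10_000
        perm = rng.lognormal(mean=-30.0, sigma=1.0, size=n)   # wellbore permeability (m^2), placeholder
        dp   = rng.uniform(1e6, 5e6, size=n)                  # pressure difference (Pa), placeholder
        mu   = 1e-5                                           # viscosity-like scaling (Pa*s), placeholder
        leak_rate = perm * dp / mu                            # toy Darcy-like leakage proxy

        threshold = 1e-2
        print("P(leak proxy > threshold):", np.mean(leak_rate > threshold))
        print("median leak proxy:", np.median(leak_rate))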

  10. Objective laparoscopic skills assessments of surgical residents using Hidden Markov Models based on haptic information and tool/tissue interactions.

    PubMed

    Rosen, J; Solazzo, M; Hannaford, B; Sinanan, M

    2001-01-01

    Laparoscopic surgical skills evaluation of surgery residents is usually a subjective process, carried out in the operating room by senior surgeons. By its nature, this process is performed using fuzzy criteria. The objective of the current study was to develop and assess an objective laparoscopic surgical skill scale using Hidden Markov Models (HMM) based on haptic information, tool/tissue interactions and visual task decomposition. Eight subjects (six surgical trainees: first-year surgical residents, 2 x R1; third-year surgical residents, 2 x R3; fifth-year surgical residents, 2 x R5; and two expert laparoscopic surgeons, 2 x ES) performed laparoscopic cholecystectomy on a pig following a specific 7-step protocol. An instrumented laparoscopic grasper equipped with a three-axis force/torque sensor located at the proximal end, with an additional force sensor located on the handle, was used to measure the forces and torques. The hand/tool interface force/torque data were synchronized with a video of the tool's operative maneuvers. A synthesis of frame-by-frame video analysis was used to define 14 different types of tool/tissue interactions, each one associated with unique force/torque (F/T) signatures. HMMs were developed for each subject, representing the surgical skills by defining the various tool/tissue interactions as states and the associated F/T signatures as observations. The statistical distances between the HMMs representing residents at different levels of their training and the HMMs of expert surgeons were calculated in order to generate a learning curve of selected steps during laparoscopic cholecystectomy. Comparison of HMMs between groups showed significant differences between all skill levels, supporting the objective definition of a learning curve. The major differences between skill levels were: (i) the magnitudes of F/T applied, (ii) the types of tool/tissue interactions used and the transitions between them, and (iii) the time intervals spent in each tool/tissue interaction and the overall completion time. The objective HMM analysis showed that the greatest difference in performance was between the R1 and R3 groups and then decreased as the level of expertise increased, suggesting that significant laparoscopic surgical capability develops between the first and the third years of residency training. The power of the methodology using HMMs for objective surgical skill assessment arises from the fact that it compiles an enormous amount of data regarding different aspects of surgical skill into a very compact model that can be translated into a single number representing the distance from expert performance. Moreover, the methodology is not limited to in-vivo conditions, as demonstrated in the current study. It can be extended to other modalities such as measuring performance in surgical simulators and robotic systems.
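    A hedged sketch of the model-distance idea follows, using the third-party hmmlearn package and a Rabiner-style log-likelihood distance; this is an assumed stand-in, not the study's own HMM implementation, and the observation arrays are hypothetical force/torque samples.

        # Sketch: compare two force/torque HMMs by an asymmetric log-likelihood
        # distance (after Rabiner's HMM tutorial). hmmlearn is an assumption.
        from hmmlearn import hmm

        def fit_hmm(X, n_states=5, seed=0):
            """X: hypothetical (n_samples, n_features) force/torque observations."""
            model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                                    n_iter=200, random_state=seed)
            model.fit(X)
            return model

        def hmm_distance(model_a, model_b, X_b):
            """How much worse model_a explains data fitted by model_b, per sample."""
            return (model_b.score(X_b) - model_a.score(X_b)) / len(X_b)

        # expert = fit_hmm(X_expert); resident = fit_hmm(X_resident)
        # d = hmm_distance(expert, resident, X_resident)   # larger = farther from expert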

  11. Agreement analysis between three different short geriatric screening scales in patients undergoing chemotherapy for solid tumors.

    PubMed

    Joshi, Amit; Tandon, Nidhi; Patil, Vijay M; Noronha, Vanita; Gupta, Sudeep; Bhattacharjee, Atanu; Prabhash, Kumar

    2017-01-01

    Comprehensive geriatric assessment (CGA) in routine practice is not logistically feasible. Short geriatric screening tools are available for selecting patients for CGA; however, none of them has been validated in India. In this analysis we aimed to compare the level of agreement between three commonly used short screening tools: the Flemish version of TRST (fTRST), G8, and VES-13. Patients ≥65 years with a solid tumor malignancy undergoing cancer-directed treatment were interviewed between March 2013 and July 2014. Geriatric screening with the G8, fTRST and VES-13 tools was performed in these patients. A G8 score ≤14, fTRST score ≥1 and VES-13 score ≥3 were taken as indicators of a high-risk geriatric profile, respectively. R version 3.1.2 was used for analysis. Cohen's kappa statistic was used to compare the agreement between the 3 tools, and a p value of 0.05 was taken as significant. The kappa values for agreement between G8 and fTRST, between VES-13 and fTRST, and between VES-13 and G8 were 0.12 (P = 0.04), 0.16 (P = 0.07) and 0.05 (P = 0.45), respectively. Maximum agreement was observed between VES-13 and fTRST. The agreement between VES-13 and fTRST was 59.44% (39.63% for the high-risk profile and 19.81% for the low-risk profile). The agreement between G8 and fTRST was 39.62% (only 2.83% for the high-risk profile and 36.79% for the low-risk profile). The lowest agreement was between G8 and VES-13, at 35.84% (7.54% for high-risk detection and 28.30% for low-risk detection). There was poor agreement (kappa values below 0.2) between the 3 short geriatric screening tools. Research needs to be directed at comparing the level of agreement between these 3 scales and CGA, so that the appropriate short screening tool can be selected for routine use.
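    The agreement calculation itself is standard; a minimal sketch with scikit-learn (assumed) and made-up high-risk flags is shown below.

        # Minimal sketch of the pairwise agreement computation: binary high-risk
        # flags from two screening tools compared with Cohen's kappa.
        # The flags below are made up, not the study data.
        from sklearn.metrics import cohen_kappa_score

        g8_high_risk    = [1, 1, 0, 1, 0, 0, 1, 0]   # G8 <= 14
        ftrst_high_risk = [1, 0, 0, 1, 0, 1, 1, 0]   # fTRST >= 1

        kappa = cohen_kappa_score(g8_high_risk, ftrst_high_risk)
        observed = sum(a == b for a, b in zip(g8_high_risk, ftrst_high_risk)) / len(g8_high_risk)
        print(kappa, observed)   # chance-corrected vs raw percent agreement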

  12. The Art of Athlete Leadership: Identifying High-Quality Athlete Leadership at the Individual and Team Level Through Social Network Analysis.

    PubMed

    Fransen, Katrien; Van Puyenbroeck, Stef; Loughead, Todd M; Vanbeselaere, Norbert; De Cuyper, Bert; Vande Broek, Gert; Boen, Filip

    2015-06-01

    This research aimed to introduce social network analysis as a novel technique for identifying the attributes of high-quality athlete leadership in sports teams, both at the individual and at the team level. Study 1 included 25 sports teams (N = 308 athletes) and focused on athletes' general leadership quality. Study 2 comprised 21 sports teams (N = 267 athletes) and focused on athletes' specific leadership quality as a task, motivational, social, and external leader. The extent to which athletes felt connected with their leader proved to be most predictive of athletes' perceptions of that leader's quality in each leadership role. Also at the team level, teams with higher athlete leadership quality were more strongly connected. We conclude that social network analysis constitutes a valuable tool for providing more insight into the attributes of high-quality leadership both at the individual and at the team level.
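    One way to picture the network step is the sketch below: each athlete rates every teammate as a leader, and a leader's weighted in-degree summarises perceived leadership quality. It uses networkx as an assumed tool, with made-up names and ratings rather than the study's measures.

        # Illustrative leadership-perception network (networkx assumed; toy data).
        import networkx as nx

        G = nx.DiGraph()
        ratings = [("ann", "kim", 4), ("bob", "kim", 5), ("cara", "kim", 3),
                   ("ann", "bob", 2), ("kim", "bob", 3)]
        G.add_weighted_edges_from(ratings)   # edge u -> v: u's rating of v as a leader

        leadership = {v: sum(d["weight"] for _, _, d in G.in_edges(v, data=True))
                      for v in G.nodes}
        print(sorted(leadership.items(), key=lambda kv: -kv[1]))   # highest-rated leaders first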

  13. Development of a multilevel health and safety climate survey tool within a mining setting.

    PubMed

    Parker, Anthony W; Tones, Megan J; Ritchie, Gabrielle E

    2017-09-01

    This study aimed to design, implement and evaluate the reliability and validity of a multifactorial and multilevel health and safety climate survey (HSCS) tool with utility in the Australian mining setting. An 84-item questionnaire was developed and pilot tested on a sample of 302 Australian miners across two open cut sites. A 67-item, 10 factor solution was obtained via exploratory factor analysis (EFA) representing prioritization and attitudes to health and safety across multiple domains and organizational levels. Each factor demonstrated a high level of internal reliability, and a series of ANOVAs determined a high level of consistency in responses across the workforce, and generally irrespective of age, experience or job category. Participants tended to hold favorable views of occupational health and safety (OH&S) climate at the management, supervisor, workgroup and individual level. The survey tool demonstrated reliability and validity for use within an open cut Australian mining setting and supports a multilevel, industry specific approach to OH&S climate. Findings suggested a need for mining companies to maintain high OH&S standards to minimize risks to employee health and safety. Future research is required to determine the ability of this measure to predict OH&S outcomes and its utility within other mine settings. As this tool integrates health and safety, it may have benefits for assessment, monitoring and evaluation in the industry, and improving the understanding of how health and safety climate interact at multiple levels to influence OH&S outcomes. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.
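    The internal-reliability check reported for each factor is typically Cronbach's alpha; a plain-numpy sketch is given below, with only the data shape assumed (respondents by items).

        # Minimal Cronbach's alpha sketch for one factor's items.
        # `items` is a hypothetical (n_respondents, n_items) array of Likert responses.
        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()      # sum of item variances
            total_var = items.sum(axis=1).var(ddof=1)        # variance of scale totals
            return (k / (k - 1)) * (1 - item_vars / total_var)

        # alpha = cronbach_alpha(responses)   # e.g. responses.shape == (302, 7)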

  14. Technology Combination Analysis Tool (TCAT) for Active Debris Removal

    NASA Astrophysics Data System (ADS)

    Chamot, B.; Richard, M.; Salmon, T.; Pisseloup, A.; Cougnet, C.; Axthelm, R.; Saunder, C.; Dupont, C.; Lequette, L.

    2013-08-01

    This paper presents the work of the Swiss Space Center EPFL within the CNES-funded OTV-2 study. In order to find the most performant Active Debris Removal (ADR) mission architectures and technologies, a tool was developed to design and compare ADR spacecraft and to plan ADR campaigns to remove large debris. Two types of architectures are considered to be efficient: the Chaser (a single-debris spacecraft) and the Mothership/Kits (a multiple-debris spacecraft). Both are able to perform controlled re-entry. The tool includes modules to optimise the launch dates and the order of capture, to design missions and spacecraft, and to select launch vehicles. The propulsion, power and structure subsystems are sized by the tool using high-level parametric models, whilst the other subsystems are defined by their mass and power consumption. Final results are still under investigation by the consortium, but two concrete examples of the tool's outputs are presented in the paper.

  15. Minimizing tooth bending stress in spur gears with simplified shapes of fillet and tool shape determination

    NASA Astrophysics Data System (ADS)

    Pedersen, N. L.

    2015-06-01

    The strength of a gear is typically defined relative to durability (pitting) and load capacity (tooth-breakage). Tooth-breakage is controlled by the root shape and this gear part can be designed because there is no contact between gear pairs here. The shape of gears is generally defined by different standards, with the ISO standard probably being the most common one. Gears are manufactured using two principally different tools: rack tools and gear tools. In this work, the bending stress of involute teeth is minimized by shape optimization made directly on the final gear. This optimized shape is then used to find the cutting tool (the gear envelope) that can create this optimized gear shape. A simple but sufficiently flexible root parameterization is applied and emphasis is put on the importance of separating the shape parameterization from the finite element analysis of stresses. Large improvements in the stress level are found.

  16. Analysis instruments for the performance of Advanced Practice Nursing.

    PubMed

    Sevilla-Guerra, Sonia; Zabalegui, Adelaida

    2017-11-29

    Advanced Practice Nursing has been a reality in the international context for several decades, and new nursing profiles following this model have recently been developed in Spain as well. The consolidation of these advanced practice roles has also led to the creation of tools that attempt to define and evaluate their functions. This study aims to identify and explore the existing instruments that enable the domains of Advanced Practice Nursing to be defined. A review of existing international questionnaires and instruments was undertaken, including an analysis of the design process, the domains/dimensions defined, the main results and an exploration of clinimetric properties. Seven studies were analysed, but not all proved to be valid, stable or reliable tools. One included tool was able to differentiate between the functions of the general nurse and the advanced practice nurse by the level of activities undertaken within the five domains described. These tools are necessary to evaluate the scope of advanced practice in new nursing roles that correspond to other international models of competencies and practice domains. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  17. Developing a uniformed assessment tool to evaluate care service needs for disabled persons in Japan.

    PubMed

    Takei, Teiji; Takahashi, Hiroshi; Nakatani, Hiroki

    2008-05-01

    Until recently, the care services for disabled persons have been under rigid control by public sectors in terms of provision and funding in Japan. A reform was introduced in 2003 that brought a rapid increase of utilization of services and serious shortage of financial resources. Under these circumstances, the "Services and Supports for Persons with Disabilities Act" was enacted in 2005, requiring that the care service provision process should be transparent, fair and standardized. The purpose of this study is to develop an objective tool for assessing the need for disability care. In the present study we evaluate 1423 cases of patients receiving care services in 60 municipalities, including all three categories of disabilities (physical, intellectual and mental). Using the data of the total 106 items, we conducted factor analysis and regression analysis to develop an assessment tool for people with disabilities. The data revealed that instrumental activities of daily living (IADL) played an essential role in assessing disability levels. We have developed the uniformed assessment tool that has been utilized to guide the types and quantity of care services throughout Japan.

  18. The adenosine triphosphate method as a quality control tool to assess 'cleanliness' of frequently touched hospital surfaces.

    PubMed

    Knape, L; Hambraeus, A; Lytsy, B

    2015-10-01

    The adenosine triphosphate (ATP) method is widely accepted as a quality control method to complement visual assessment, in the specifications of requirements, when purchasing cleaning contractors in Swedish hospitals. To examine whether the amount of biological load, as measured by ATP on frequently touched near-patient surfaces, had been reduced after an intervention; to evaluate the correlation between visual assessment and ATP levels on the same surfaces; to identify aspects of the performance of the ATP method as a tool in evaluating hospital cleanliness. A prospective intervention study in three phases was carried out in a medical ward and an intensive care unit (ICU) at a regional hospital in mid-Sweden between 2012 and 2013. Existing cleaning procedures were defined and baseline tests were sampled by visual inspection and ATP measurements of ten frequently touched surfaces in patients' rooms before and after intervention. The intervention consisted of educating nursing staff about the importance of hospital cleaning and direct feedback of ATP levels before and after cleaning. The mixed model showed a significant decrease in ATP levels after the intervention (P < 0.001). Relative light unit values were lower in the ICU. Cleanliness as judged by visual assessments improved. In the logistic regression analysis, there was a significant association between visual assessments and ATP levels. Direct feedback of ATP levels, together with education and introduction of written cleaning protocols, were effective tools to improve cleanliness. Visual assessment correlated with the level of ATP but the correlation was not absolute. The ATP method could serve as an educational tool for staff, but is not enough to assess hospital cleanliness in general as only a limited part of a large area is covered. Copyright © 2015 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  19. Tools used to assess medical students competence in procedural skills at the end of a primary medical degree: a systematic review.

    PubMed

    Morris, Marie C; Gallagher, Tom K; Ridgway, Paul F

    2012-01-01

    The objective was to systematically review the literature to identify and grade tools used for the end-point assessment of procedural skills (e.g., phlebotomy, IV cannulation, suturing) competence in medical students prior to certification. The authors searched eight bibliographic databases electronically - ERIC, Medline, CINAHL, EMBASE, Psychinfo, PsychLIT, EBM Reviews and the Cochrane databases. Two reviewers independently reviewed the literature to identify procedural assessment tools used specifically for assessing medical students within the PRISMA framework, the inclusion/exclusion criteria and search period. Papers on OSATS and DOPS were excluded as they focused on post-registration assessment and clinical rather than simulated competence. Of 659 abstracted articles, 56 identified procedural assessment tools. Only 11 specifically assessed medical students. The final 11 studies consisted of 1 randomised controlled trial, 4 comparative and 6 descriptive studies, yielding 12 heterogeneous procedural assessment tools for analysis. Seven tools addressed four discrete pre-certification skills: basic suture (3), airway management (2), nasogastric tube insertion (1) and intravenous cannulation (1). One tool used a generic assessment of procedural skills. Two tools focused on postgraduate laparoscopic skills and one on osteopathic students and thus were not included in this review. The levels of evidence are low with regard to reliability (κ = 0.65-0.71) and only minimum validity (face and content) is achieved. In conclusion, there are no tools designed specifically to assess competence in procedural skills in a final certification examination. There is a need to develop standardised tools with proven reliability and validity for assessment of procedural skills competence at the end of medical training. Medical graduates must have comparable levels of procedural skills acquisition when entering the clinical workforce, irrespective of the country of training.

  20. The Development of Sport Expertise: Mapping the Tactical Domain.

    ERIC Educational Resources Information Center

    McPherson, Sue L.

    1994-01-01

    Explores issues and research relevant to sport tactical knowledge development and expertise. The paper discusses controversies concerning methodological tools, possible levels of analysis in sport research, sport tactical knowledge and expertise, a protocol structure model for sport, and expert-novice sport research. (SM)

  1. Automatic Single Event Effects Sensitivity Analysis of a 13-Bit Successive Approximation ADC

    NASA Astrophysics Data System (ADS)

    Márquez, F.; Muñoz, F.; Palomo, F. R.; Sanz, L.; López-Morillo, E.; Aguirre, M. A.; Jiménez, A.

    2015-08-01

    This paper presents the Analog Fault Tolerant University of Seville Debugging System (AFTU), a tool to evaluate the Single-Event Effect (SEE) sensitivity of analog/mixed-signal microelectronic circuits at the transistor level. As analog cells can behave in an unpredictable way when particles strike critical areas, designers need a software tool that allows an automatic and exhaustive analysis of Single-Event Effects influence. AFTU takes the test-bench SPECTRE design, emulates radiation conditions and automatically evaluates vulnerabilities using user-defined heuristics. To illustrate the utility of the tool, the SEE sensitivity of a 13-bit Successive Approximation Analog-to-Digital Converter (ADC) has been analysed. This circuit was selected not only because it was designed for space applications, but also because a manual SEE sensitivity analysis would be too time-consuming. After a user-defined test campaign, it was detected that some voltage transients were propagated to a node where a parasitic diode was activated, affecting the offset cancelation and therefore the whole resolution of the ADC. A simple modification of the scheme solved the problem, as was verified with another automatic SEE sensitivity analysis.

  2. [Model of Analysis and Prevention of Accidents - MAPA: tool for operational health surveillance].

    PubMed

    de Almeida, Ildeberto Muniz; Vilela, Rodolfo Andrade de Gouveia; da Silva, Alessandro José Nunes; Beltran, Sandra Lorena

    2014-12-01

    The analysis of work-related accidents is important for accident surveillance and prevention. Current methods of analysis seek to overcome reductionist views that see these occurrences as simple events explained by operator error. The objective of this paper is to analyze the Model of Analysis and Prevention of Accidents (MAPA) and its use in monitoring interventions, duly highlighting aspects experienced in the use of the tool. The descriptive analytical method was used, introducing the steps of the model. To illustrate contributions and or difficulties, cases where the tool was used in the context of service were selected. MAPA integrates theoretical approaches that have already been tried in studies of accidents by providing useful conceptual support from the data collection stage until conclusion and intervention stages. Besides revealing weaknesses of the traditional approach, it helps identify organizational determinants, such as management failings, system design and safety management involved in the accident. The main challenges lie in the grasp of concepts by users, in exploring organizational aspects upstream in the chain of decisions or at higher levels of the hierarchy, as well as the intervention to change the determinants of these events.

  3. Naturalistic Decision Making for Power System Operators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greitzer, Frank L.; Podmore, Robin; Robinson, Marck

    2010-02-01

    Motivation – Investigations of large-scale outages in the North American interconnected electric system often attribute the causes to three T's: Trees, Training and Tools. To document and understand the mental processes used by expert operators when making critical decisions, a naturalistic decision making (NDM) model was developed. Transcripts of conversations were analyzed to reveal and assess NDM-based performance criteria. Findings/Design – An item analysis indicated that the operators' Situation Awareness Levels, mental models, and mental simulations can be mapped at different points in the training scenario. This may identify improved training methods or analytical/visualization tools. Originality/Value – This study applies, for the first time, the concepts of Recognition Primed Decision Making, Situation Awareness Levels and Cognitive Task Analysis to training of electric power system operators. Take away message – The NDM approach provides a viable framework for systematic training management to accelerate learning in simulator-based training scenarios for power system operators and teams.

  4. The application of data mining techniques to oral cancer prognosis.

    PubMed

    Tseng, Wan-Ting; Chiang, Wei-Fan; Liu, Shyun-Yeu; Roan, Jinsheng; Lin, Chun-Nan

    2015-05-01

    This study adopted an integrated procedure that combines the clustering and classification features of data mining technology to determine the differences between the symptoms shown in past cases where patients died from or survived oral cancer. Two data mining tools, namely decision tree and artificial neural network, were used to analyze the historical cases of oral cancer, and their performance was compared with that of logistic regression, the popular statistical analysis tool. Both the decision tree and artificial neural network models showed superiority to the traditional statistical model. However, for clinicians, the trees created by the decision tree models are easier to interpret than the artificial neural network models. Cluster analysis also revealed that stage 4 patients who also possess the following four characteristics have an extremely low survival rate: pN is N2b, the level of RLNM is level I-III, AJCC-T is T4, and the cell mutation status (G) is moderate.
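    A hedged sketch of this kind of model comparison is shown below using scikit-learn; the study used its own tools and data, so the stand-in dataset and hyperparameters here are illustrative assumptions only.

        # Generic comparison of decision tree, neural network and logistic
        # regression classifiers (scikit-learn assumed; stand-in data).
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.neural_network import MLPClassifier
        from sklearn.linear_model import LogisticRegression

        X, y = make_classification(n_samples=300, n_features=10, random_state=0)  # placeholder data

        models = {
            "decision tree": DecisionTreeClassifier(max_depth=4, random_state=0),
            "neural network": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
            "logistic regression": LogisticRegression(max_iter=1000),
        }
        for name, model in models.items():
            scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
            print(name, round(scores.mean(), 3))   # cross-validated AUC per model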

  5. CytoSpectre: a tool for spectral analysis of oriented structures on cellular and subcellular levels.

    PubMed

    Kartasalo, Kimmo; Pölönen, Risto-Pekka; Ojala, Marisa; Rasku, Jyrki; Lekkala, Jukka; Aalto-Setälä, Katriina; Kallio, Pasi

    2015-10-26

    Orientation and the degree of isotropy are important in many biological systems such as the sarcomeres of cardiomyocytes and other fibrillar structures of the cytoskeleton. Image based analysis of such structures is often limited to qualitative evaluation by human experts, hampering the throughput, repeatability and reliability of the analyses. Software tools are not readily available for this purpose and the existing methods typically rely at least partly on manual operation. We developed CytoSpectre, an automated tool based on spectral analysis, allowing the quantification of orientation and also size distributions of structures in microscopy images. CytoSpectre utilizes the Fourier transform to estimate the power spectrum of an image and based on the spectrum, computes parameter values describing, among others, the mean orientation, isotropy and size of target structures. The analysis can be further tuned to focus on targets of particular size at cellular or subcellular scales. The software can be operated via a graphical user interface without any programming expertise. We analyzed the performance of CytoSpectre by extensive simulations using artificial images, by benchmarking against FibrilTool and by comparisons with manual measurements performed for real images by a panel of human experts. The software was found to be tolerant against noise and blurring and superior to FibrilTool when analyzing realistic targets with degraded image quality. The analysis of real images indicated general good agreement between computational and manual results while also revealing notable expert-to-expert variation. Moreover, the experiment showed that CytoSpectre can handle images obtained of different cell types using different microscopy techniques. Finally, we studied the effect of mechanical stretching on cardiomyocytes to demonstrate the software in an actual experiment and observed changes in cellular orientation in response to stretching. CytoSpectre, a versatile, easy-to-use software tool for spectral analysis of microscopy images was developed. The tool is compatible with most 2D images and can be used to analyze targets at different scales. We expect the tool to be useful in diverse applications dealing with structures whose orientation and size distributions are of interest. While designed for the biological field, the software could also be useful in non-biological applications.
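    The spectral idea can be illustrated with a toy version (not CytoSpectre's actual code): the 2D power spectrum of an image concentrates energy perpendicular to oriented structures, so a dominant orientation can be read from the angular distribution of spectral power.

        # Toy dominant-orientation estimate from the 2D power spectrum.
        # `image` is assumed to be a 2D grayscale numpy array.
        import numpy as np

        def dominant_orientation(image: np.ndarray, n_bins: int = 180) -> float:
            f = np.fft.fftshift(np.fft.fft2(image - image.mean()))
            power = np.abs(f) ** 2
            h, w = image.shape
            y, x = np.indices((h, w))
            angles = np.rad2deg(np.arctan2(y - h // 2, x - w // 2)) % 180
            hist = np.zeros(n_bins)
            np.add.at(hist, (angles * n_bins / 180).astype(int) % n_bins, power)
            spectral_angle = np.argmax(hist) * (180 / n_bins)
            return (spectral_angle + 90) % 180   # structures lie perpendicular to the spectral peak

        # orientation_deg = dominant_orientation(gray_image)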

  6. Data analysis of the benefits of an electronic registry of information in a neonatal intensive care unit in Greece.

    PubMed

    Skouroliakou, Maria; Soloupis, George; Gounaris, Antonis; Charitou, Antonia; Papasarantopoulos, Petros; Markantonis, Sophia L; Golna, Christina; Souliotis, Kyriakos

    2008-07-28

    This study assesses the results of implementation of a software program that allows for input of admission/discharge summary data (including cost) in a neonatal intensive care unit (NICU) in Greece, based on the establishment of a baseline statistical database for infants treated in a NICU and the statistical analysis of epidemiological and resource utilization data thus collected. A software tool was designed, developed, and implemented between April 2004 and March 2005 in the NICU of the LITO private maternity hospital in Athens, Greece, to allow for the first time for step-by-step collection and management of summary treatment data. Data collected over this period were subsequently analyzed using defined indicators as a basis to extract results related to treatment options, treatment duration, and relative resource utilization. Data for 499 babies were entered in the tool and processed. Information on medical costs (e.g., mean total cost ± SD of treatment was €310.44 ± 249.17 and €6704.27 ± 4079.53 for babies weighing more than 2500 g and 1000-1500 g respectively), incidence of complications or disease (e.g., 4.3 percent and 14.3 percent of study babies weighing 1,000 to 1,500 g suffered from cerebral bleeding [grade I] and bronchopulmonary dysplasia, respectively, while overall 6.0 percent had microbial infections), and medical statistics (e.g., perinatal mortality was 6.8 percent) was obtained in a quick and robust manner. The software tool allowed for collection and analysis of data traditionally maintained in paper medical records in the NICU with greater ease and accuracy. Data codification and analysis led to significant findings at the epidemiological, medical resource utilization, and respective hospital cost levels that allowed comparisons with literature findings for the first time in Greece. The tool thus contributed to a clearer understanding of treatment practices in the NICU and set the baseline for the assessment of the impact of future interventions at the policy or hospital level.

  7. Bootstrap position analysis for forecasting low flow frequency

    USGS Publications Warehouse

    Tasker, Gary D.; Dunne, P.

    1997-01-01

    A method of random resampling of residuals from stochastic models is used to generate a large number of 12-month-long traces of natural monthly runoff to be used in a position analysis model for a water-supply storage and delivery system. Position analysis uses the traces to forecast the likelihood of specified outcomes such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows conditioned on the current reservoir levels and streamflows. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality, fewer parameters need to be estimated directly from the data, and accounting for parameter uncertainty is easily done. For a given set of operating rules and water-use requirements for a system, water managers can use such a model as a decision-making tool to evaluate different operating rules. © ASCE.
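    A simplified sketch of the resampling step follows: fit a seasonal-free AR(1)-type model to log monthly flows, then resample its residuals with replacement to build many 12-month synthetic traces. The model form is an illustrative assumption, not the operational model described in the paper.

        # Bootstrap-resampled 12-month flow traces (toy AR(1) model).
        import numpy as np

        def bootstrap_traces(log_flows, n_traces=1000, horizon=12, seed=1):
            rng = np.random.default_rng(seed)
            q = np.asarray(log_flows)                       # log monthly flows
            phi = np.corrcoef(q[:-1], q[1:])[0, 1]          # lag-1 autocorrelation
            mu = q.mean()
            resid = q[1:] - (mu + phi * (q[:-1] - mu))      # model residuals
            traces = np.empty((n_traces, horizon))
            for i in range(n_traces):
                x = q[-1]
                for t in range(horizon):
                    x = mu + phi * (x - mu) + rng.choice(resid)   # resample a residual
                    traces[i, t] = x
            return np.exp(traces)                           # back-transform to flows

        # traces = bootstrap_traces(np.log(monthly_flows))
        # p_shortfall = (traces.min(axis=1) < passing_flow).mean()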

  8. Teamwork in the operating room: frontline perspectives among hospitals and operating room personnel.

    PubMed

    Sexton, J Bryan; Makary, Martin A; Tersigni, Anthony R; Pryor, David; Hendrich, Ann; Thomas, Eric J; Holzmueller, Christine G; Knight, Andrew P; Wu, Yun; Pronovost, Peter J

    2006-11-01

    The Joint Commission on Accreditation of Healthcare Organizations is proposing that hospitals measure culture beginning in 2007. However, a reliable and widely used measurement tool for the operating room (OR) setting does not currently exist. OR personnel in 60 US hospitals were surveyed using the Safety Attitudes Questionnaire. The teamwork climate domain of the survey uses six items about difficulty speaking up, conflict resolution, physician-nurse collaboration, feeling supported by others, asking questions, and heeding nurse input. To justify grouping individual-level responses to a single score at each hospital OR level, the authors used a multilevel confirmatory factor analysis, intraclass correlations, within-group interrater reliability, and Cronbach's alpha. To detect differences at the hospital OR level and by caregiver type, the authors used multivariate analysis of variance (items) and analysis of variance (scale). The response rate was 77.1%. There was robust evidence for grouping individual-level respondents to the hospital OR level using the diverse set of statistical tests, e.g., Comparative Fit Index = 0.99, root mean squared error of approximation = 0.05, and acceptable intraclass correlations, within-group interrater reliability values, and Cronbach's alpha = 0.79. Teamwork climate differed significantly by hospital (F(59, 1911) = 4.06, P < 0.001) and OR caregiver type (F(4, 1911) = 9.96, P < 0.001). Rigorous assessment of teamwork climate is possible using this psychometrically sound teamwork climate scale. This tool and initial benchmarks allow others to compare their teamwork climate to national means, in an effort to focus more on what excellent surgical teams do well.

  9. Insights into teaching quantum mechanics in secondary and lower undergraduate education

    NASA Astrophysics Data System (ADS)

    Krijtenburg-Lewerissa, K.; Pol, H. J.; Brinkman, A.; van Joolingen, W. R.

    2017-06-01

    This study presents a review of the current state of research on teaching quantum mechanics in secondary and lower undergraduate education. A conceptual approach to quantum mechanics is being implemented in more and more introductory physics courses around the world. Because of the differences between the conceptual nature of quantum mechanics and classical physics, research on misconceptions, testing, and teaching strategies for introductory quantum mechanics is needed. For this review, 74 articles were selected and analyzed for the misconceptions, research tools, teaching strategies, and multimedia applications investigated. Outcomes were categorized according to their contribution to the various subtopics of quantum mechanics. Analysis shows that students have difficulty relating quantum physics to physical reality. It also shows that the teaching of complex quantum behavior, such as time dependence, superposition, and the measurement problem, has barely been investigated for the secondary and lower undergraduate level. At the secondary school level, this article shows a need to investigate student difficulties concerning wave functions and potential wells. Investigation of research tools shows the necessity for the development of assessment tools for secondary and lower undergraduate education, which cover all major topics and are suitable for statistical analysis. Furthermore, this article shows the existence of very diverse ideas concerning teaching strategies for quantum mechanics and a lack of research into which strategies promote understanding. This article underlines the need for more empirical research into student difficulties, teaching strategies, activities, and research tools intended for a conceptual approach for quantum mechanics.

  10. Metabolic pathways for the whole community.

    PubMed

    Hanson, Niels W; Konwar, Kishori M; Hawley, Alyse K; Altman, Tomer; Karp, Peter D; Hallam, Steven J

    2014-07-22

    A convergence of high-throughput sequencing and computational power is transforming biology into information science. Despite these technological advances, converting bits and bytes of sequence information into meaningful insights remains a challenging enterprise. Biological systems operate on multiple hierarchical levels from genomes to biomes. Holistic understanding of biological systems requires agile software tools that permit comparative analyses across multiple information levels (DNA, RNA, protein, and metabolites) to identify emergent properties, diagnose system states, or predict responses to environmental change. Here we adopt the MetaPathways annotation and analysis pipeline and Pathway Tools to construct environmental pathway/genome databases (ePGDBs) that describe microbial community metabolism using MetaCyc, a highly curated database of metabolic pathways and components covering all domains of life. We evaluate Pathway Tools' performance on three datasets with different complexity and coding potential, including simulated metagenomes, a symbiotic system, and the Hawaii Ocean Time-series. We define accuracy and sensitivity relationships between read length, coverage and pathway recovery and evaluate the impact of taxonomic pruning on ePGDB construction and interpretation. Resulting ePGDBs provide interactive metabolic maps, predict emergent metabolic pathways associated with biosynthesis and energy production and differentiate between genomic potential and phenotypic expression across defined environmental gradients. This multi-tiered analysis provides the user community with specific operating guidelines, performance metrics and prediction hazards for more reliable ePGDB construction and interpretation. Moreover, it demonstrates the power of Pathway Tools in predicting metabolic interactions in natural and engineered ecosystems.

  11. Orion Entry, Descent, and Landing Simulation

    NASA Technical Reports Server (NTRS)

    Hoelscher, Brian R.

    2007-01-01

    The Orion Entry, Descent, and Landing simulation was created over the past two years to serve as the primary Crew Exploration Vehicle guidance, navigation, and control (GN&C) design and analysis tool at the National Aeronautics and Space Administration (NASA). The Advanced NASA Technology Architecture for Exploration Studies (ANTARES) simulation is a six degree-of-freedom tool with a unique design architecture which has a high level of flexibility. This paper describes the decision history and motivations that guided the creation of this simulation tool. The capabilities of the models within ANTARES are presented in detail. Special attention is given to features of the highly flexible GN&C architecture and the details of the implemented GN&C algorithms. ANTARES provides a foundation simulation for the Orion Project that has already been successfully used for requirements analysis, system definition analysis, and preliminary GN&C design analysis. ANTARES will find useful application in engineering analysis, mission operations, crew training, avionics-in-the-loop testing, etc. This paper focuses on the entry simulation aspect of ANTARES, which is part of a bigger simulation package supporting the entire mission profile of the Orion vehicle. The unique aspects of entry GN&C design are covered, including how the simulation is being used for Monte Carlo dispersion analysis and for support of linear stability analysis. Sample simulation output from ANTARES is presented in an appendix.

  12. Different methods of image segmentation in the process of meat marbling evaluation

    NASA Astrophysics Data System (ADS)

    Ludwiczak, A.; Ślósarz, P.; Lisiak, D.; Przybylak, A.; Boniecki, P.; Stanisz, M.; Koszela, K.; Zaborowicz, M.; Przybył, K.; Wojcieszak, D.; Janczak, D.; Bykowska, M.

    2015-07-01

    Assessment of the level of marbling in meat based on digital images is becoming very popular, as computer vision tools become more and more advanced. However, when muscle cross-sections are used as the data source for marbling level evaluation, there are still a few problems to cope with. There is a need for an accurate method which would facilitate this evaluation procedure and increase its accuracy. The presented research was conducted in order to compare the effect of different image segmentation tools considering their usefulness in meat marbling evaluation on anatomical muscle cross-sections. This study is considered an initial trial in the presented field of research and an introduction to ultrasonic image processing and analysis.

  13. Estimating learning outcomes from pre- and posttest student self-assessments: a longitudinal study.

    PubMed

    Schiekirka, Sarah; Reinhardt, Deborah; Beißbarth, Tim; Anders, Sven; Pukrop, Tobias; Raupach, Tobias

    2013-03-01

    Learning outcome is an important measure for overall teaching quality and should be addressed by comprehensive evaluation tools. The authors evaluated the validity of a novel evaluation tool based on student self-assessments, which may help identify specific strengths and weaknesses of a particular course. In 2011, the authors asked 145 fourth-year students at Göttingen Medical School to self-assess their knowledge on 33 specific learning objectives in a pretest and posttest as part of a cardiorespiratory module. The authors compared performance gain calculated from self-assessments with performance gain derived from formative examinations that were closely matched to these 33 learning objectives. Eighty-three students (57.2%) completed the assessment. There was good agreement between performance gain derived from subjective data and performance gain derived from objective examinations (Pearson r=0.78; P<.0001) on the group level. The association between the two measures was much weaker when data were analyzed on the individual level. Further analysis determined a quality cutoff for performance gain derived from aggregated student self-assessments. When using this cutoff, the evaluation tool was highly sensitive in identifying specific learning objectives with favorable or suboptimal objective performance gains. The tool is easy to implement, takes initial performance levels into account, and does not require extensive pre-post testing. By providing valid estimates of actual performance gain obtained during a teaching module, it may assist medical teachers in identifying strengths and weaknesses of a particular course on the level of specific learning objectives.

  14. Simplified tools for evaluating domestic ventilation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maansson, L.G.; Orme, M.

    1999-07-01

    Within an International Energy Agency (IEA) project, Annex 27, experts from 8 countries (Canada, France, Italy, Japan, The Netherlands, Sweden, UK and USA) have developed simplified tools for evaluating domestic ventilation systems during the heating season. Tools for building and user aspects, thermal comfort, noise, energy, life cycle cost, reliability and indoor air quality (IAQ) have been devised. The results can be used both for dwellings at the design stage and after construction. The tools lead to immediate answers and indications about the consequences of different choices that may arise during discussion with clients. This paper presents an introduction to these tools. Example applications of the indoor air quality and energy simplified tools are also provided. The IAQ tool accounts for constant emission sources, CO2, cooking products, tobacco smoke, condensation risks, humidity levels (i.e., for judging the risk of mould and house dust mites), and pressure difference (for identifying the risk of radon or landfill spillage entering the dwelling or problems with indoor combustion appliances). An elaborated set of design parameters was worked out, resulting in about 17,000 combinations. By using multi-variate analysis it was possible to reduce this to 174 combinations for IAQ. In addition, a sensitivity analysis was made using 990 combinations. The results from all the runs were used to develop a simplified tool, as well as quantifying equations relying on the design parameters. A computerized energy tool has also been developed within this project, which takes into account air tightness, climate, window airing pattern, outdoor air flow rate and heat exchange efficiency.

  15. rpoB-Based Identification of Nonpigmented and Late-Pigmenting Rapidly Growing Mycobacteria

    PubMed Central

    Adékambi, Toïdi; Colson, Philippe; Drancourt, Michel

    2003-01-01

    Nonpigmented and late-pigmenting rapidly growing mycobacteria (RGM) are increasingly isolated in clinical microbiology laboratories. Their accurate identification remains problematic because classification is labor intensive work and because new taxa are not often incorporated into classification databases. Also, 16S rRNA gene sequence analysis underestimates RGM diversity and does not distinguish between all taxa. We determined the complete nucleotide sequence of the rpoB gene, which encodes the bacterial β subunit of the RNA polymerase, for 20 RGM type strains. After using in-house software which analyzes and graphically represents variability stretches of 60 bp along the nucleotide sequence, our analysis focused on a 723-bp variable region exhibiting 83.9 to 97% interspecies similarity and 0 to 1.7% intraspecific divergence. Primer pair Myco-F-Myco-R was designed as a tool for both PCR amplification and sequencing of this region for molecular identification of RGM. This tool was used for identification of 63 RGM clinical isolates previously identified at the species level on the basis of phenotypic characteristics and by 16S rRNA gene sequence analysis. Of 63 clinical isolates, 59 (94%) exhibited <2% partial rpoB gene sequence divergence from 1 of 20 species under study and were regarded as correctly identified at the species level. Mycobacterium abscessus and Mycobacterium mucogenicum isolates were clearly distinguished from Mycobacterium chelonae; Mycobacterium mageritense isolates were clearly distinguished from “Mycobacterium houstonense.” Four isolates were not identified at the species level because they exhibited >3% partial rpoB gene sequence divergence from the corresponding type strain; they belonged to three taxa related to M. mucogenicum, Mycobacterium smegmatis, and Mycobacterium porcinum. For M. abscessus and M. mucogenicum, this partial sequence yielded a high genetic heterogeneity within the clinical isolates. We conclude that molecular identification by analysis of the 723-bp rpoB sequence is a rapid and accurate tool for identification of RGM. PMID:14662964
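    The identification rule can be sketched as a simple percent-divergence comparison against type-strain sequences; the sketch below collapses the 2%/3% thresholds described above into a single cutoff for brevity, assumes pre-aligned sequences of equal length, and uses a placeholder dictionary of type strains.

        # Minimal sketch of partial rpoB sequence-based species assignment.
        def divergence(seq_a: str, seq_b: str) -> float:
            """Percent mismatch between two equal-length, pre-aligned sequences."""
            diffs = sum(a != b for a, b in zip(seq_a, seq_b))
            return 100.0 * diffs / len(seq_a)

        def identify(query: str, type_strains: dict, cutoff: float = 2.0) -> str:
            best_species, best_div = min(
                ((sp, divergence(query, seq)) for sp, seq in type_strains.items()),
                key=lambda item: item[1])
            return best_species if best_div < cutoff else "unidentified (divergence above cutoff)"

        # identify(query_723bp, {"M. abscessus": ref1, "M. mucogenicum": ref2, ...})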

  16. Using Enabling Technologies to Facilitate the Comparison of Satellite Observations with the Model Forecasts for Hurricane Study

    NASA Astrophysics Data System (ADS)

    Li, P.; Knosp, B.; Hristova-Veleva, S. M.; Niamsuwan, N.; Johnson, M. P.; Shen, T. P. J.; Tanelli, S.; Turk, J.; Vu, Q. A.

    2014-12-01

    Due to their complexity and volume, the satellite data are underutilized in today's hurricane research and operations. To better utilize these data, we developed the JPL Tropical Cyclone Information System (TCIS) - an Interactive Data Portal providing fusion between Near-Real-Time satellite observations and model forecasts to facilitate model evaluation and improvement. We have collected satellite observations and model forecasts in the Atlantic Basin and the East Pacific for the hurricane seasons since 2010 and supported the NASA Airborne Campaigns for Hurricane Study such as the Genesis and Rapid Intensification Processes (GRIP) in 2010 and the Hurricane and Severe Storm Sentinel (HS3) from 2012 to 2014. To enable the direct inter-comparisons of the satellite observations and the model forecasts, the TCIS was integrated with the NASA Earth Observing System Simulator Suite (NEOS3) to produce synthetic observations (e.g. simulated passive microwave brightness temperatures) from a number of operational hurricane forecast models (HWRF and GFS). An automated process was developed to trigger NEOS3 simulations via web services given the location and time of satellite observations, monitor the progress of the NEOS3 simulations, display the synthetic observation and ingest them into the TCIS database when they are done. In addition, three analysis tools, the joint PDF analysis of the brightness temperatures, ARCHER for finding the storm-center and the storm organization and the Wave Number Analysis tool for storm asymmetry and morphology analysis were integrated into TCIS to provide statistical and structural analysis on both observed and synthetic data. Interactive tools were built in the TCIS visualization system to allow the spatial and temporal selections of the datasets, the invocation of the tools with user specified parameters, and the display and the delivery of the results. In this presentation, we will describe the key enabling technologies behind the design of the TCIS interactive data portal and analysis tools, including the spatial database technology for the representation and query of the level 2 satellite data, the automatic process flow using web services, the interactive user interface using the Google Earth API, and a common and expandable Python wrapper to invoke the analysis tools.
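
    The abstract's "common and expandable Python wrapper" suggests a simple pattern worth sketching: submit a job to a remote analysis service, poll its status, and collect the result. The endpoint paths, parameter names, and tool names below are hypothetical stand-ins, not the actual TCIS/NEOS3 API.

    import time
    import requests

    class AnalysisToolWrapper:
        """Generic job-submission wrapper; assumes a hypothetical REST-style service."""

        def __init__(self, base_url: str):
            self.base_url = base_url.rstrip("/")

        def submit(self, tool_name: str, params: dict) -> str:
            # Start a run and return the service-assigned job id (hypothetical response schema).
            resp = requests.post(f"{self.base_url}/{tool_name}/run", json=params, timeout=30)
            resp.raise_for_status()
            return resp.json()["job_id"]

        def wait(self, tool_name: str, job_id: str, poll_seconds: int = 60) -> dict:
            # Poll until the job reaches a terminal state, then return its metadata.
            while True:
                resp = requests.get(f"{self.base_url}/{tool_name}/status/{job_id}", timeout=30)
                resp.raise_for_status()
                status = resp.json()
                if status.get("state") in ("done", "failed"):
                    return status
                time.sleep(poll_seconds)

    # Hypothetical usage: trigger a storm-structure analysis for a given observation time.
    # wrapper = AnalysisToolWrapper("https://example.org/tcis-tools")
    # job_id = wrapper.submit("wave_number_analysis", {"storm_id": "AL092010", "time": "2010-09-01T12:00Z"})
    # result = wrapper.wait("wave_number_analysis", job_id)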

  17. Evaluation of an inpatient fall risk screening tool to identify the most critical fall risk factors in inpatients.

    PubMed

    Hou, Wen-Hsuan; Kang, Chun-Mei; Ho, Mu-Hsing; Kuo, Jessie Ming-Chuan; Chen, Hsiao-Lien; Chang, Wen-Yin

    2017-03-01

    To evaluate the accuracy of the inpatient fall risk screening tool and to identify the most critical fall risk factors in inpatients. Variations exist in several screening tools applied in acute care hospitals for examining risk factors for falls and identifying high-risk inpatients. Secondary data analysis. A subset of inpatient data for the period from June 2011-June 2014 was extracted from the nursing information system and adverse event reporting system of an 818-bed teaching medical centre in Taipei. Data were analysed using descriptive statistics, receiver operating characteristic curve analysis and logistic regression analysis. During the study period, 205 fallers and 37,232 nonfallers were identified. The results revealed that the inpatient fall risk screening tool (cut-off point of ≥3) had a low sensitivity level (60%), satisfactory specificity (87%), a positive predictive value of 2·0% and a negative predictive value of 99%. The receiver operating characteristic curve analysis revealed an area under the curve of 0·805 (sensitivity, 71·8%; specificity, 78%). To increase the sensitivity values, the Youden index suggests at least 1·5 points to be the most suitable cut-off point for the inpatient fall risk screening tool. Multivariate logistic regression analysis revealed a considerably increased fall risk in patients with impaired balance and impaired elimination. The fall risk factor was also significantly associated with days of hospital stay and with admission to surgical wards. The findings can raise awareness about the two most critical risk factors for falls among future clinical nurses and other healthcare professionals and thus facilitate the development of fall prevention interventions. This study highlights the needs for redefining the cut-off points of the inpatient fall risk screening tool to effectively identify inpatients at a high risk of falls. Furthermore, inpatients with impaired balance and impaired elimination should be closely monitored by nurses to prevent falling during hospitalisations. © 2016 John Wiley & Sons Ltd.
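
    For readers unfamiliar with the screening statistics quoted above, the short sketch below computes sensitivity, specificity, predictive values, and the Youden index from a 2x2 confusion table; the counts are invented for illustration and are not the study data.

    def screening_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
        """Standard screening-performance measures for one cut-off of a risk score."""
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)          # positive predictive value
        npv = tn / (tn + fn)          # negative predictive value
        youden_j = sensitivity + specificity - 1.0
        return {"sensitivity": sensitivity, "specificity": specificity,
                "PPV": ppv, "NPV": npv, "Youden J": youden_j}

    # Choosing a cut-off: evaluate each candidate threshold and keep the one with the largest Youden J.
    example = screening_metrics(tp=123, fp=4800, fn=82, tn=32400)
    for name, value in example.items():
        print(f"{name}: {value:.3f}")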

  18. Single Cell and Population Level Analysis of HCA Data.

    PubMed

    Novo, David; Ghosh, Kaya; Burke, Sean

    2018-01-01

    High Content Analysis instrumentation has undergone tremendous hardware advances in recent years. It is now possible to obtain images of hundreds of thousands to millions of individual objects, across multiple wells, channels, and plates, in a reasonable amount of time. In addition, it is possible to extract dozens, or hundreds, of features per object using commonly available software tools. Analyzing these data presents new challenges to scientists. The magnitude of these numbers is reminiscent of flow cytometry, where practitioners have for decades been taking what effectively amount to very low resolution, multi-parametric measurements from individual cells. Flow cytometrists have developed a wide range of tools to effectively analyze and interpret these types of data. This chapter will review the techniques used in flow cytometry and show how they can easily and effectively be applied to High Content Analysis.
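
    One flow-cytometry technique that transfers directly to per-object HCA features is gating: selecting the sub-population of objects whose feature values fall inside a region, then summarising only that sub-population. The sketch below applies a rectangular gate to simulated per-cell intensities; the feature names, thresholds, and data are illustrative only.

    import numpy as np

    rng = np.random.default_rng(0)
    # Simulated per-cell features: columns are (nuclear_intensity, marker_intensity)
    cells = rng.normal(loc=[500.0, 200.0], scale=[80.0, 60.0], size=(100_000, 2))

    def rectangular_gate(data: np.ndarray, x_range: tuple, y_range: tuple) -> np.ndarray:
        """Boolean mask of objects inside the gate (inclusive bounds)."""
        x, y = data[:, 0], data[:, 1]
        return (x >= x_range[0]) & (x <= x_range[1]) & (y >= y_range[0]) & (y <= y_range[1])

    mask = rectangular_gate(cells, x_range=(400, 700), y_range=(150, 400))
    gated = cells[mask]
    print(f"{mask.mean():.1%} of objects fall inside the gate")
    print("median marker intensity within the gate:", round(float(np.median(gated[:, 1])), 1))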

  19. Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Mcmanus, John William

    1992-01-01

    Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.
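
    As a concrete (if deliberately tiny) illustration of the blackboard pattern described above, in which independent knowledge sources contribute partial results to a shared data store under a simple controller, the Python sketch below is a generic example, not the dissertation's formal model or its concurrent implementation.

    class Blackboard:
        def __init__(self, data):
            self.data = data          # shared problem state
            self.solutions = {}       # partial results contributed by knowledge sources

    class KnowledgeSource:
        def can_contribute(self, bb):
            raise NotImplementedError
        def contribute(self, bb):
            raise NotImplementedError

    class Tokenizer(KnowledgeSource):
        def can_contribute(self, bb):
            return "tokens" not in bb.solutions
        def contribute(self, bb):
            bb.solutions["tokens"] = bb.data.split()

    class Counter(KnowledgeSource):
        def can_contribute(self, bb):
            return "tokens" in bb.solutions and "count" not in bb.solutions
        def contribute(self, bb):
            bb.solutions["count"] = len(bb.solutions["tokens"])

    def control_loop(bb, sources):
        """Simple controller: keep activating any source that can contribute until none can."""
        progress = True
        while progress:
            progress = False
            for ks in sources:
                if ks.can_contribute(bb):
                    ks.contribute(bb)
                    progress = True

    bb = Blackboard("blackboard systems let specialized knowledge sources cooperate")
    control_loop(bb, [Tokenizer(), Counter()])
    print(bb.solutions)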

  20. Multisite Evaluation of a Data Quality Tool for Patient-Level Clinical Data Sets

    PubMed Central

    Huser, Vojtech; DeFalco, Frank J.; Schuemie, Martijn; Ryan, Patrick B.; Shang, Ning; Velez, Mark; Park, Rae Woong; Boyce, Richard D.; Duke, Jon; Khare, Ritu; Utidjian, Levon; Bailey, Charles

    2016-01-01

    Introduction: Data quality and fitness for analysis are crucial if outputs of analyses of electronic health record data or administrative claims data are to be trusted by the public and the research community. Methods: We describe a data quality analysis tool (called Achilles Heel) developed by the Observational Health Data Sciences and Informatics Collaborative (OHDSI) and compare outputs from this tool as it was applied to 24 large healthcare datasets across seven different organizations. Results: We highlight 12 data quality rules that identified issues in at least 10 of the 24 datasets and provide a full set of 71 rules identified in at least one dataset. Achilles Heel is freely available software that provides a useful starter set of data quality rules with the ability to add additional rules. We also present results of a structured email-based interview of all participating sites that collected qualitative comments about the value of Achilles Heel for data quality evaluation. Discussion: Our analysis represents the first comparison of outputs from a data quality tool that implements a fixed (but extensible) set of data quality rules. Thanks to a common data model, we were able to quickly compare multiple datasets originating from several countries in America, Europe and Asia. PMID:28154833
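
    To make the idea of a "fixed but extensible" rule set concrete, the sketch below runs two invented data quality rules over toy patient-level rows; the rule logic and column names are hypothetical examples in the spirit of such checks, not the actual OHDSI Achilles Heel rules.

    from datetime import date

    patients = [
        {"person_id": 1, "year_of_birth": 1950, "gender": "F"},
        {"person_id": 2, "year_of_birth": 2090, "gender": "M"},   # implausible birth year
        {"person_id": 3, "year_of_birth": 1988, "gender": None},  # missing gender
    ]

    def rule_birth_year_plausible(rows):
        this_year = date.today().year
        bad = [r["person_id"] for r in rows if not (1900 <= r["year_of_birth"] <= this_year)]
        return "implausible year_of_birth", bad

    def rule_gender_present(rows):
        bad = [r["person_id"] for r in rows if r["gender"] not in ("M", "F")]
        return "missing or unknown gender", bad

    RULES = [rule_birth_year_plausible, rule_gender_present]   # extending = appending a function

    for rule in RULES:
        description, offending_ids = rule(patients)
        if offending_ids:
            print(f"FAIL: {description}: persons {offending_ids}")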

  1. Comprehensive benchmarking and ensemble approaches for metagenomic classifiers.

    PubMed

    McIntyre, Alexa B R; Ounit, Rachid; Afshinnekoo, Ebrahim; Prill, Robert J; Hénaff, Elizabeth; Alexander, Noah; Minot, Samuel S; Danko, David; Foox, Jonathan; Ahsanuddin, Sofia; Tighe, Scott; Hasan, Nur A; Subramanian, Poorani; Moffat, Kelly; Levy, Shawn; Lonardi, Stefano; Greenfield, Nick; Colwell, Rita R; Rosen, Gail L; Mason, Christopher E

    2017-09-21

    One of the main challenges in metagenomics is the identification of microorganisms in clinical and environmental samples. While an extensive and heterogeneous set of computational tools is available to classify microorganisms using whole-genome shotgun sequencing data, comprehensive comparisons of these methods are limited. In this study, we use the largest-to-date set of laboratory-generated and simulated controls across 846 species to evaluate the performance of 11 metagenomic classifiers. Tools were characterized on the basis of their ability to identify taxa at the genus, species, and strain levels, quantify relative abundances of taxa, and classify individual reads to the species level. Strikingly, the number of species identified by the 11 tools can differ by over three orders of magnitude on the same datasets. Various strategies can ameliorate taxonomic misclassification, including abundance filtering, ensemble approaches, and tool intersection. Nevertheless, these strategies were often insufficient to completely eliminate false positives from environmental samples, which are especially important where they concern medically relevant species. Overall, pairing tools with different classification strategies (k-mer, alignment, marker) can combine their respective advantages. This study provides positive and negative controls, titrated standards, and a guide for selecting tools for metagenomic analyses by comparing ranges of precision, accuracy, and recall. We show that proper experimental design and analysis parameters can reduce false positives, provide greater resolution of species in complex metagenomic samples, and improve the interpretation of results.
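
    Two of the false-positive mitigation strategies named above, abundance filtering and tool intersection, are simple enough to sketch directly; the classifier outputs below are invented {species: relative abundance} tables used only to show the mechanics.

    reports = {
        "tool_A": {"E. coli": 0.42, "S. aureus": 0.30, "M. leprae": 0.0004},
        "tool_B": {"E. coli": 0.40, "S. aureus": 0.28, "B. subtilis": 0.0002},
        "tool_C": {"E. coli": 0.45, "S. aureus": 0.33},
    }

    def abundance_filter(report, min_abundance=0.001):
        """Drop calls whose relative abundance falls below a threshold."""
        return {sp for sp, ab in report.items() if ab >= min_abundance}

    def tool_intersection(reports, min_tools=2):
        """Keep species called (after filtering) by at least `min_tools` classifiers."""
        filtered = [abundance_filter(r) for r in reports.values()]
        all_species = set().union(*filtered)
        return {sp for sp in all_species if sum(sp in calls for calls in filtered) >= min_tools}

    print(tool_intersection(reports))   # {'E. coli', 'S. aureus'}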

  2. Impact of the Use of a Standardized Guidance Tool on the Development of a Teaching Philosophy in a Pharmacy Residency Teaching and Learning Curriculum Program

    PubMed Central

    Wesner, Amber R.; Jones, Ryan; Schultz, Karen; Johnson, Mark

    2016-01-01

    The purpose of this study was to evaluate the impact of a standardized reflection tool on the development of a teaching philosophy statement in a pharmacy residency teaching and learning curriculum program (RTLCP). Pharmacy residents participating in the RTLCP over a two-year period were surveyed using a pre/post method to assess perceptions of teaching philosophy development before and after using the tool. Responses were assessed using a 5-point Likert scale to indicate level of agreement with each statement. For analysis, responses were divided into high (strongly agree/agree) and low (neutral/disagree/strongly disagree) agreement. The level of agreement increased significantly for all items surveyed (p < 0.05), with the exception of one area pertaining to the ability to describe characteristics of outstanding teachers, which was noted to be strong before and after using the tool (p = 0.5027). Overall results were positive, with 81% of participants responding that the reflection tool was helpful in developing a teaching philosophy, and 96% responding that the resulting teaching philosophy statement fully reflected their views on teaching and learning. The standardized reflection tool developed at Shenandoah University assisted pharmacy residents enrolled in a teaching and learning curriculum program to draft a comprehensive teaching philosophy statement, and was well received by participants. PMID:28970382

  3. Assessment of the Asian Neurogastroenterology and Motility Association Chronic Constipation Criteria: An Asian Multicenter Cross-sectional Study

    PubMed Central

    Gwee, Kok-Ann; Bergmans, Paul; Kim, JinYong; Coudsy, Bogdana; Sim, Angelia; Chen, Minhu; Lin, Lin; Hou, Xiaohua; Wang, Huahong; Goh, Khean-Lee; Pangilinan, John A; Kim, Nayoung; des Varannes, Stanislas Bruley

    2017-01-01

    Background/Aims There is a need for a simple and practical tool adapted for the diagnosis of chronic constipation (CC) in the Asian population. This study compared the Asian Neurogastroenterology and Motility Association (ANMA) CC tool and Rome III criteria for the diagnosis of CC in Asian subjects. Methods This multicenter, cross-sectional study included subjects presenting at outpatient gastrointestinal clinics across Asia. Subjects with CC alert symptoms completed a combination Diagnosis Questionnaire to obtain a diagnosis based on 4 different diagnostic methods: self-defined, investigator’s judgment, ANMA CC tool, and Rome III criteria. The primary endpoint was the level of agreement/disagreement between the ANMA CC diagnostic tool and Rome III criteria for the diagnosis of CC. Results The primary analysis comprised 449 subjects, 414 of whom had a positive diagnosis according to the ANMA CC tool. Rome III positive/ANMA positive and Rome III negative/ANMA negative diagnoses were reported in 76.8% and 7.8% of subjects, respectively, resulting in an overall percentage agreement of 84.6% between the 2 diagnostic methods. The overall percentage disagreement between these 2 diagnostic methods was 15.4%. A higher level of agreement was seen between the ANMA CC tool and self-defined (374 subjects [90.3%]) or investigator’s judgment criteria (388 subjects [93.7%]) compared with the Rome III criteria. Conclusion This study demonstrates that the ANMA CC tool can be a useful tool for Asian patients with CC. PMID:27764907

  4. Measuring the style of innovative thinking among engineering students

    NASA Astrophysics Data System (ADS)

    Passig, David; Cohen, Lizi

    2014-01-01

    Background: Many tools have been developed to measure the ability of workers to innovate. However, all of them are based on self-reporting questionnaires, which raises questions about their validity. Purpose: The aim was to develop and validate a tool, called Ideas Generation Implementation (IGI), to objectively measure the style and potential of engineering students in generating innovative technological ideas. The cognitive framework of IGI is based on the Architectural Innovation Model (AIM). Tool description: The IGI tool was designed to measure the level of innovation in generating technological ideas and their potential to be implemented. These variables rely on the definition of innovation as 'creativity, implemented in a high degree of success'. The levels of innovative thinking are based on the AIM and consist of four levels: incremental innovation, modular innovation, architectural innovation and radical innovation. Sample: Sixty experts in technological innovation developed the tool. We checked its face validity and calculated its reliability in a pilot study (kappa = 0.73). Then, 145 undergraduate students were sampled at random from the seven Israeli universities offering engineering programs and asked to complete the questionnaire. Design and methods: We examined the construct validity of the tool by conducting a variance analysis and measuring the correlations between the innovator's style of each student, as suggested by the AIM, and the three subscale factors of creative styles (efficient, conformist and original), as suggested by the Kirton Adaptors and Innovators (KAI) questionnaire. Results: Students with a radical innovator's style inclined more than those with an incremental innovator's style towards the three creative cognitive styles. Students with an architectural innovator's style inclined moderately, but not significantly, towards the three creative styles. Conclusions: The IGI tool objectively measures innovative thinking among students, thus allowing screening of potential employees at an early stage, during their undergraduate studies. The tool was found to be reliable and valid in measuring the style and potential of technological innovation among engineering students.

  5. Optimization of Compressor Mounting Bracket of a Passenger Car

    NASA Astrophysics Data System (ADS)

    Kalsi, Sachin; Singh, Daljeet; Saini, J. S.

    2018-05-01

    In the present work, CAE tools are used for the optimization of the compressor mounting bracket used in an automobile. Both static and dynamic analyses are performed for the bracket. With the objective of minimizing the mass and increasing the stiffness of the bracket, the new design is optimized using shape and topology optimization techniques. The optimized design given by the CAE tool is then validated experimentally. The new design results in lower vibration levels, lower mass and lower cost, which benefits both the air conditioning system and the overall efficiency of the vehicle. The results given by the CAE tool showed very good correlation with the experimental results.

  6. Development of risk assessment tool for foundry workers.

    PubMed

    Mohan, G Madhan; Prasad, P S S; Mokkapati, Anil Kumar; Venkataraman, G

    2008-01-01

    Occupational ill-health and work-related disorders are predominant in manufacturing industries due to the inevitable presence of manual work even after several waves of industrial automation and technological advancements. Ergonomic risk factors and musculoskeletal disorders like low-back symptoms have been noted amongst foundry workers. The purpose of this study was to formulate and develop a Physical Effort Index to assess these risk factors. A questionnaire tool applicable to the foundry environment was designed and validated. The data recorded through a survey across the foundries were subjected to regression analysis to correlate the proposed physical effort index with the standard Borg Ratings of Perceived Exertion (RPE) scale. The physical efforts of sixty-seven workers in various foundry shop floors were assessed subjectively. 'Job factors' and 'work environment' were the two major parameters considered in assessing worker discomfort level at the workplace. A relation between the Borg RPE scale and the above two parameters was arrived at through regression analysis. The study demonstrates the prevalence of risk factors amongst foundry workers and the effectiveness of the proposed index in estimating risk factor levels. RELEVANCE TO THE INDUSTRY: The proposed tool will assist foundry supervisors and managers in assessing risk factors and help in better understanding of the workplace to avoid work-related disorders, ensuring better output.

  7. A method to identify the main mode of machine tool under operating conditions

    NASA Astrophysics Data System (ADS)

    Wang, Daming; Pan, Yabing

    2017-04-01

    The identification of modal parameters under experimental conditions is the most common procedure when solving problems of machine tool structure vibration. However, the influence of each mode on machine tool vibration in real working conditions remains unknown. In fact, the contribution each mode makes to machine tool vibration during the machining process is different. In this article, an active excitation modal analysis is applied to identify the modal parameters under operating conditions, and the Operating Deflection Shapes (ODS) at the frequencies of high-level vibration that affect machining quality in real working conditions are obtained. The ODS is then decomposed onto the mode shapes identified under operating conditions, so the contribution each mode makes to machine tool vibration during the machining process is obtained from the decomposition coefficients. From these steps, the main modes that most significantly affect the machine tool in working conditions can be identified, as illustrated in the sketch below. The effectiveness of this method was also verified by experiments.
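
    A minimal sketch of the decomposition step, assuming the ODS and the identified mode shapes are sampled at the same measurement points: express the ODS as a least-squares combination of mode shapes and treat the coefficient magnitudes as each mode's contribution. The numbers below are synthetic.

    import numpy as np

    phi = np.array([                 # columns = two identified mode shapes at six points
        [0.1, 0.9],
        [0.3, 0.7],
        [0.5, 0.4],
        [0.7, 0.1],
        [0.9, -0.2],
        [1.0, -0.5],
    ])
    rng = np.random.default_rng(1)
    ods = 2.0 * phi[:, 0] - 0.5 * phi[:, 1] + 0.01 * rng.normal(size=phi.shape[0])

    coeffs, *_ = np.linalg.lstsq(phi, ods, rcond=None)    # decomposition coefficients
    contribution = np.abs(coeffs) / np.abs(coeffs).sum()
    for i, (c, frac) in enumerate(zip(coeffs, contribution), start=1):
        print(f"mode {i}: coefficient {c:+.3f}, relative contribution {frac:.1%}")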

  8. Reliability and Validity of the Alberta Context Tool (ACT) with Professional Nurses: Findings from a Multi-Study Analysis

    PubMed Central

    Squires, Janet E.; Hayduk, Leslie; Hutchinson, Alison M.; Mallick, Ranjeeta; Norton, Peter G.; Cummings, Greta G.; Estabrooks, Carole A.

    2015-01-01

    Although organizational context is central to evidence-based practice, underdeveloped measurement hinders its assessment. The Alberta Context Tool, comprised of 59 items that tap 10 modifiable contextual concepts, was developed to address this gap. The purpose of this study was to examine the reliability and validity of scores obtained when the Alberta Context Tool is completed by professional nurses across different healthcare settings. Five separate studies (N = 2361 nurses across different care settings) comprised the study sample. Reliability and validity were assessed. Cronbach’s alpha exceeded 0.70 for 9 of 10 Alberta Context Tool concepts. Item-total correlations exceeded acceptable standards for 56 of 59 items. Confirmatory factor analyses accorded acceptably with the Alberta Context Tool’s proposed latent structure. The mean values for each Alberta Context Tool concept increased from low to high levels of research utilization (as hypothesized), further supporting its validity. This study provides robust evidence for the reliability and validity of scores obtained with the Alberta Context Tool when administered to professional nurses. PMID:26098857
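
    For reference, Cronbach's alpha for one concept's items can be computed as in the short sketch below; the simulated responses stand in for survey data and are not the study's.

    import numpy as np

    def cronbach_alpha(item_scores):
        """item_scores: respondents x items matrix; alpha = (k/(k-1)) * (1 - sum(item variances)/variance(total score))."""
        k = item_scores.shape[1]
        item_vars = item_scores.var(axis=0, ddof=1)
        total_var = item_scores.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    rng = np.random.default_rng(42)
    latent = rng.normal(size=(200, 1))                  # shared trait across respondents
    items = latent + 0.6 * rng.normal(size=(200, 6))    # six items loading on the trait
    print(f"alpha = {cronbach_alpha(items):.2f}")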

  9. IBiSA_Tools: A Computational Toolkit for Ion-Binding State Analysis in Molecular Dynamics Trajectories of Ion Channels.

    PubMed

    Kasahara, Kota; Kinoshita, Kengo

    2016-01-01

    Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., the complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.

  10. Translating Vision into Design: A Method for Conceptual Design Development

    NASA Technical Reports Server (NTRS)

    Carpenter, Joyce E.

    2003-01-01

    One of the most challenging tasks for engineers is the definition of design solutions that will satisfy high-level strategic visions and objectives. Even more challenging is the need to demonstrate how a particular design solution supports the high-level vision. This paper describes a process and set of system engineering tools that have been used at the Johnson Space Center to analyze and decompose high-level objectives for future human missions into design requirements that can be used to develop alternative concepts for vehicles, habitats, and other systems. Analysis and design studies of alternative concepts and approaches are used to develop recommendations for strategic investments in research and technology that support the NASA Integrated Space Plan. In addition to a description of system engineering tools, this paper includes a discussion of collaborative design practices for human exploration mission architecture studies used at the Johnson Space Center.

  11. Measurement of obesity prevention in childcare settings: A systematic review of current instruments.

    PubMed

    Stanhope, Kaitlyn K; Kay, Christi; Stevenson, Beth; Gazmararian, Julie A

    The incidence of childhood obesity is highest among children entering kindergarten. Overweight and obesity in early childhood track through adulthood. Programs increasingly target children in early life for obesity prevention. However, the published literature lacks a review on tools available for measuring behaviour and environmental level change in child care. The objective is to describe measurement tools currently in use in evaluating obesity-prevention in preschool-aged children. Literature searches were conducted in PubMed using the keywords "early childhood obesity," "early childhood measurement," "early childhood nutrition" and "early childhood physical activity." Inclusion criteria included a discussion of: (1) obesity prevention, risk assessment or treatment in children ages 1-5 years; and (2) measurement of nutrition or physical activity. One hundred thirty-four publications were selected for analysis. Data on measurement tools, population and outcomes were abstracted into tables. Tables are divided by individual and environmental level measures and further divided into physical activity, diet and physical health outcomes. Recommendations are made for weighing advantages and disadvantages of tools. Despite rising numbers of interventions targeting obesity-prevention and treatment in preschool-aged children, there is no consensus for which tools represent a gold standard or threshold of accuracy. Copyright © 2016 Asia Oceania Association for the Study of Obesity. Published by Elsevier Ltd. All rights reserved.

  12. Students' Strategies for Exception Handling

    ERIC Educational Resources Information Center

    Rashkovits, Rami; Lavy, Ilana

    2011-01-01

    This study discusses and presents various strategies employed by novice programmers concerning exception handling. The main contributions of this paper are as follows: we provide an analysis tool to measure the level of assimilation of exception handling mechanism; we present and analyse strategies to handle exceptions; we present and analyse…

  13. The State of Sustainability Reporting in Universities

    ERIC Educational Resources Information Center

    Lozano, Rodrigo

    2011-01-01

    Purpose: The purpose of this paper is to review and assess the state of sustainability reporting in universities. Design/methodology/approach: Analysis of the performance level of 12 universities sustainability reports using the Graphical Assessment of Sustainability in Universities tool. Findings: The results show that sustainability reporting in…

  14. Community Near-Port Modeling System (C-PORT): Briefing for Environmental Defense Fund

    EPA Science Inventory

    What C-PORT is: Screening level tool for assessing port activities and exploring the range of potential impacts that changes to port operations might have on local air quality; Analysis of decision alternatives through mapping of the likely pattern of potential pollutant dispersi...

  15. INTEGRATING LANDSCAPE AND HYDROLOGIC ANALYSIS FOR WATERSHED ASSESSMENT IN AN AMERICAN SEMI-ARID BIOREGION

    EPA Science Inventory

    The objective of this study is to demonstrate the application of operational hydrologic modeling and landscape assessment tools to investigate the temporal and spatial effects of varying levels of anthropogenic disturbance in a semi-arid catchment and examine the consequences of ...

  16. The use of MALDI-TOF ICMS as an alternative tool for Trichophyton rubrum identification and typing.

    PubMed

    Pereira, Leonel; Dias, Nicolina; Santos, Cledir; Lima, Nelson

    2014-01-01

    In this study, the potential of matrix-assisted laser desorption/ionization time-of-flight intact cell mass spectrometry (MALDI-TOF ICMS) was investigated for the identification of clinical isolates. The isolates were analyzed at the species and strain level. Spectral identification by MALDI-TOF ICMS was performed for all strains, and compared with the results of sequencing of the internal transcribed spacers (ITS1 and ITS2), and the 5.8S rDNA region. PCR fingerprinting analysis using primers M13, (GACA)4, and (AC)10 was performed in order to assess the intra-specific variability of Trichophyton rubrum strains. The identification of strains at species level by MALDI-TOF ICMS was in agreement with the previously performed morphological and biochemical analysis. Sequence data confirmed spectral mass identification at species level. Intra-specific variability was assessed. Within the T. rubrum cluster, strains were distributed into smaller highly related sub-groups with similarity values above 85%. MALDI-TOF ICMS was shown to be a rapid, low-cost and accurate alternative tool for the identification and strain typing of T. rubrum. Copyright © 2012 Elsevier España, S.L. All rights reserved.

  17. Rasch Analysis of the Power as Knowing Participation in Change Tool--the Brazilian version.

    PubMed

    Guedes, Erika de Souza; Orozco-Vargas, Luiz Carlos; Turrini, Ruth Natália Teresa; de Sousa, Regina Márcia Cardoso; dos Santos, Mariana Alvina; da Cruz, Diná de Almeida Lopes Monteiro

    2013-01-01

    The objective of this study was to evaluate the items of the Brazilian version of the Power as Knowing Participation in Change Tool (PKPCT) by investigating the psychometric properties of the questionnaire through Rasch analysis. Data from 952 nursing assistants and 627 baccalaureate nurses were analyzed (average age 44.1 (SD=9.5); 13.0% men). The subscales Choices, Awareness, Freedom and Involvement were tested separately and presented unidimensionality. The response categories of the items were collapsed from 7 to 3 levels, and the items fit the model well, except for the following/leading item, for which the infit and outfit values were above 1.4; this item also presented Differential Item Functioning (DIF) according to the participant's role. The reliability of the items was 0.99 and the reliability of the participants ranged from 0.80 to 0.84 across the subscales. Items with extremely high levels of difficulty were not identified. The authors conclude that the PKPCT should not be viewed as unidimensional, that items with extremely high levels of difficulty need to be created for the scale, and that the differential functioning of some items has to be further investigated.

  18. Verification and Validation in a Rapid Software Development Process

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing of the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  19. The association between health literacy and preventable hospitalizations in Missouri: implications in an era of reform.

    PubMed

    Cimasi, Robert J; Sharamitaro, Anne R; Seiler, Rachel L

    2013-01-01

    To evaluate the association between health literacy and preventable hospitalizations on a population level in Missouri, and the extent to which differing levels of health literacy are associated with county preventable hospitalization rates and associated charges. Secondary data from the 2008 Missouri Information for Community Assessment and Missouri Health Literacy Mapping Tool was used to determine health literacy and preventable hospitalization rates for the 114 counties and city of St. Louis comprising Missouri. Using correlation analysis, simple hierarchical regression models and nonparametric analysis, we investigated whether lower health literacy rates were associated with increased levels of preventable hospitalizations and charges, by county. Health literacy was found to be inversely associated with preventable hospitalization rates on a population level, accounting for 21 percent of the variation in preventable hospitalization rates. Preventable hospitalization rates significantly differed for counties with the highest and lowest health literacy levels. Lower levels of health literacy are significantly associated with increased rates of preventable hospitalizations and charges in a population-level analysis of Missouri counties. Additional research is needed to quantify the effects of successful community health literacy interventions.

  20. Toward designing for trust in database automation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duez, P. P.; Jamieson, G. A.

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process- and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly-calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process. The connection between an AH for an automated tool and a list of information elements at the three levels of attributional abstraction is then direct, providing a method for satisfying information requirements for appropriate trust in automation. In this paper, we will present our method for developing specific information requirements for an automated tool, based on a formal analysis of that tool and the models presented by Lee and See. We will show an example of the application of the AH to automation, in the domain of relational database automation, and the resulting set of specific information elements for appropriate trust in the automated tool. Finally, we will comment on the applicability of this approach to the domain of nuclear plant instrumentation. (authors)

  1. Analysis of Volatile Organic Compounds in a Controlled Environment: Ethylene Gas Measurement Studies on Radish

    NASA Technical Reports Server (NTRS)

    Kong, Suk Bin

    2001-01-01

    The volatile organic compound (VOC) ethylene was characterized and quantified by GC/FID. Levels of 20-50 ppb were detected during the growth stages of radish. SPME could be a good analytical tool for this purpose. A low-temperature trapping method using dry ice/diethyl ether and liquid nitrogen baths was recommended for the sampling process for GC/PID and GC/MS analysis.

  2. An Interactive Tool For Semi-automated Statistical Prediction Using Earth Observations and Models

    NASA Astrophysics Data System (ADS)

    Zaitchik, B. F.; Berhane, F.; Tadesse, T.

    2015-12-01

    We developed a semi-automated statistical prediction tool applicable to concurrent analysis or seasonal prediction of any time series variable in any geographic location. The tool was developed using Shiny, JavaScript, HTML and CSS. A user can extract a predictand by drawing a polygon over a region of interest on the provided user interface (global map). The user can select the Climatic Research Unit (CRU) precipitation or Climate Hazards Group InfraRed Precipitation with Station data (CHIRPS) as predictand. They can also upload their own predictand time series. Predictors can be extracted from sea surface temperature, sea level pressure, winds at different pressure levels, air temperature at various pressure levels, and geopotential height at different pressure levels. By default, reanalysis fields are applied as predictors, but the user can also upload their own predictors, including a wide range of compatible satellite-derived datasets. The package generates correlations of the variables selected with the predictand. The user also has the option to generate composites of the variables based on the predictand. Next, the user can extract predictors by drawing polygons over the regions that show strong correlations (composites). Then, the user can select some or all of the statistical prediction models provided. Provided models include Linear Regression models (GLM, SGLM), Tree-based models (bagging, random forest, boosting), Artificial Neural Network, and other non-linear models such as Generalized Additive Model (GAM) and Multivariate Adaptive Regression Splines (MARS). Finally, the user can download the analysis steps they used, such as the region they selected, the time period they specified, the predictand and predictors they chose and preprocessing options they used, and the model results in PDF or HTML format. Key words: Semi-automated prediction, Shiny, R, GLM, ANN, RF, GAM, MARS
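
    The core workflow described above (correlate a predictand with gridded fields, average the high-correlation region into a predictor, then fit a statistical model) can be sketched compactly; the data below are synthetic and the embedded "teleconnection" is artificial, so this only illustrates the mechanics rather than the tool itself.

    import numpy as np

    rng = np.random.default_rng(0)
    years = 35
    predictand = rng.normal(size=years)                        # e.g. a seasonal rainfall index
    field = rng.normal(size=(years, 20, 30))                   # e.g. SST anomalies (year, lat, lon)
    field[:, 5:10, 10:15] += 0.8 * predictand[:, None, None]   # embed an artificial signal

    # Point-wise correlation map between the predictand and every grid cell
    ay = predictand - predictand.mean()
    af = field - field.mean(axis=0)
    corr_map = (af * ay[:, None, None]).sum(axis=0) / (
        np.sqrt((af ** 2).sum(axis=0)) * np.sqrt((ay ** 2).sum())
    )

    # "Draw a polygon" over the high-correlation region and average it into one predictor series
    predictor = field[:, 5:10, 10:15].mean(axis=(1, 2))

    # Fit the simplest of the offered models: ordinary linear regression
    slope, intercept = np.polyfit(predictor, predictand, deg=1)
    print(f"max |r| = {np.abs(corr_map).max():.2f}; fit: y = {slope:.2f} * x {intercept:+.2f}")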

  3. Are joint health plans effective for coordination of health services? An analysis based on theory and Danish pre-reform results

    PubMed Central

    Strandberg-Larsen, Martin; Bernt Nielsen, Mikkel; Krasnik, Allan

    2007-01-01

    Background: Since 1994, formal health plans have been used for coordination of health care services between the regional and local level in Denmark. From 2007 a substantial reform has changed the administrative boundaries of the system and a new tool for coordination has been introduced. Purpose: To assess the use of the pre-reform health plans as a tool for strengthening coordination, quality and preventive efforts between the regional and local level of health care. Methods: A survey addressed to all counties (n=15), all municipalities (n=271) and a randomly selected sample of general practitioners (n=700). Results: The stakeholders at the administrative level agree that health plans have not been effective as a tool for coordination. The development of health plans is dominated by the regional level. At the functional level, 27 percent of the general practitioners are not familiar with health plans. Among those familiar with health plans, 61 percent report that health plans influence their work only to a lesser degree or not at all. Conclusion: Joint health planning is needed to achieve coordination of care. Efforts must be made to overcome barriers hampering efficient whole-system planning. Active policies emphasising the necessity of health planning, despite the costs involved, are warranted to ensure delivery of care that benefits the health of the population. PMID:17925882

  4. Reliability culture at La Silla Paranal Observatory

    NASA Astrophysics Data System (ADS)

    Gonzalez, Sergio

    2010-07-01

    The Maintenance Department at the La Silla - Paranal Observatory has been an important base for keeping the operations of the observatory at a good level of reliability and availability. Several strategies have been implemented and improved in order to meet these requirements and keep systems and equipment working properly when required. One of the latest improvements has been the introduction of the concept of reliability, which involves much more than simply speaking about reliability concepts: it involves the use of technologies, data collection, data analysis, decision making, committees concentrating on the analysis of failure modes and how they can be eliminated, aligning the results with the requirements of our internal partners, and establishing steps to achieve success. Some of these steps have already been implemented: data collection, use of technologies, analysis of data, development of priority tools, committees dedicated to analyzing data and people dedicated to reliability analysis. This has permitted us to optimize our processes, analyze where we can improve, avoid functional failures and reduce the range of failures in several systems and subsystems; all this has had a positive impact on the results for our Observatory. All these tools are part of the reliability culture that allows our system to operate with a high level of reliability and availability.

  5. RED Alert – Early warning or detection of global re-emerging infectious disease (RED)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deshpande, Alina

    This is the PDF of a presentation for a webinar given by Los Alamos National Laboratory (LANL) on the early warning or detection of global re-emerging infectious disease (RED). First, there is an overview of LANL biosurveillance tools. Then, information is given about RED Alert. Next, a demonstration is given of a component prototype. RED Alert is an analysis tool that can provide early warning or detection of the re-emergence of an infectious disease at the global level, but through a local lens.

  6. Fault-Tree Compiler

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Boerschlein, David P.

    1993-01-01

    The Fault-Tree Compiler (FTC) program is a software tool used to calculate the probability of the top event in a fault tree. Five different gate types are allowed in a fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. The high-level input language is easy to understand and use. In addition, the program supports a hierarchical fault-tree definition feature, which simplifies the tree-description process and reduces execution time. A set of programs has been created that forms the basis for a reliability-analysis workstation: SURE, ASSIST, PAWS/STEM, and the FTC fault-tree tool (LAR-14586). The program is written in PASCAL, ANSI-compliant C language, and FORTRAN 77. Other versions are available upon request.
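
    The underlying calculation is easy to illustrate: with independent basic events, gate probabilities propagate upward through the tree. The sketch below shows the idea for AND, OR, and M-of-N gates on a toy tree; it is not the FTC algorithm or its input language, just the standard probability arithmetic.

    from itertools import combinations

    def and_gate(probs):                       # all inputs fail
        p = 1.0
        for q in probs:
            p *= q
        return p

    def or_gate(probs):                        # at least one input fails
        p_none = 1.0
        for q in probs:
            p_none *= (1.0 - q)
        return 1.0 - p_none

    def m_of_n_gate(m, probs):                 # at least m of the n inputs fail
        n = len(probs)
        total = 0.0
        for k in range(m, n + 1):
            for failed in combinations(range(n), k):
                term = 1.0
                for i in range(n):
                    term *= probs[i] if i in failed else (1.0 - probs[i])
                total += term
        return total

    # Toy tree: TOP = OR( AND(A, B), 2-of-3(C, D, E) )
    A, B, C, D, E = 1e-3, 2e-3, 5e-2, 5e-2, 5e-2
    top = or_gate([and_gate([A, B]), m_of_n_gate(2, [C, D, E])])
    print(f"P(top event) = {top:.3e}")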

  7. Secure FAST: Security Enhancement in the NATO Time Sensitive Targeting Tool

    DTIC Science & Technology

    2010-11-01

    The FAST tool is a tracking and collaboration tool designed to aid in the tracking and prosecuting of Time Sensitive Targets. In terms of security, it provides user level authentication and authorisation; it uses operating system level security but does not provide application level security for ...

  8. Methodological approach and tools for systems thinking in health systems research: technical assistants' support of health administration reform in the Democratic Republic of Congo as an application.

    PubMed

    Ribesse, Nathalie; Bossyns, Paul; Marchal, Bruno; Karemere, Hermes; Burman, Christopher J; Macq, Jean

    2017-03-01

    In the field of development cooperation, interest in systems thinking and complex systems theories as a methodological approach is increasingly recognised. And so it is in health systems research, which informs health development aid interventions. However, practical applications remain scarce to date. The objective of this article is to contribute to the body of knowledge by presenting the tools inspired by systems thinking and complexity theories and methodological lessons learned from their application. These tools were used in a case study. Detailed results of this study are in process for publication in additional articles. Applying a complexity 'lens', the subject of the case study is the role of long-term international technical assistance in supporting health administration reform at the provincial level in the Democratic Republic of Congo. The Methods section presents the guiding principles of systems thinking and complex systems, their relevance and implication for the subject under study, and the existing tools associated with those theories which inspired us in the design of the data collection and analysis process. The tools and their application processes are presented in the results section, and followed in the discussion section by the critical analysis of their innovative potential and emergent challenges. The overall methodology provides a coherent whole, each tool bringing a different and complementary perspective on the system.

  9. Fully automated analysis of multi-resolution four-channel micro-array genotyping data

    NASA Astrophysics Data System (ADS)

    Abbaspour, Mohsen; Abugharbieh, Rafeef; Podder, Mohua; Tebbutt, Scott J.

    2006-03-01

    We present a fully-automated and robust microarray image analysis system for handling multi-resolution images (down to 3-micron, with sizes up to 80 MB per channel). The system is developed to provide rapid and accurate data extraction for our recently developed microarray analysis and quality control tool (SNP Chart). Currently available commercial microarray image analysis applications are inefficient, due to the considerable user interaction typically required. Four-channel DNA microarray technology is a robust and accurate tool for determining genotypes of multiple genetic markers in individuals. It plays an important role in the trend toward replacing traditional medical treatments with personalized genetic medicine, i.e., individualized therapy based on the patient's genetic heritage. However, fast, robust, and precise image processing tools are required for the prospective practical use of microarray-based genetic testing for predicting disease susceptibilities and drug effects in clinical practice, which requires a turn-around timeline compatible with clinical decision-making. In this paper we have developed a fully-automated image analysis platform for the rapid investigation of hundreds of genetic variations across multiple genes. Validation tests indicate very high accuracy levels for genotyping results. Our method achieves a significant reduction in analysis time, from several hours to just a few minutes, and is completely automated requiring no manual interaction or guidance.

  10. Development of Response Surface Models for Rapid Analysis & Multidisciplinary Optimization of Launch Vehicle Design Concepts

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1999-01-01

    Multidisciplinary design optimization (MDO) is an important step in the design and evaluation of launch vehicles, since it has a significant impact on performance and lifecycle cost. The objective in MDO is to search the design space to determine the values of design parameters that optimize the performance characteristics subject to system constraints. The Vehicle Analysis Branch (VAB) at NASA Langley Research Center has computerized analysis tools in many of the disciplines required for the design and analysis of launch vehicles. Vehicle performance characteristics can be determined by the use of these computerized analysis tools. The next step is to optimize the system performance characteristics subject to multidisciplinary constraints. However, most of the complex sizing and performance evaluation codes used for launch vehicle design are stand-alone tools, operated by disciplinary experts. They are, in general, difficult to integrate and use directly for MDO. An alternative has been to utilize response surface methodology (RSM) to obtain polynomial models that approximate the functional relationships between performance characteristics and design variables. These approximation models, called response surface models, are then used to integrate the disciplines using mathematical programming methods for efficient system-level design analysis, MDO and fast sensitivity simulations. A second-order response surface model of the form given below has been commonly used in RSM, since in many cases it can provide an adequate approximation, especially if the region of interest is sufficiently limited.
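
    For completeness, the second-order response surface form referred to above is the usual quadratic polynomial in the k design variables x_i (this is the generic textbook form, not a NASA-specific equation):

    \begin{equation}
      y = \beta_0 + \sum_{i=1}^{k} \beta_i x_i
            + \sum_{i=1}^{k} \beta_{ii} x_i^{2}
            + \sum_{i < j} \beta_{ij} x_i x_j + \varepsilon
    \end{equation}

    where y is the performance characteristic being approximated, the coefficients \beta are estimated by least squares from a designed set of analysis runs, and \varepsilon is the approximation error.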

  11. Application of the Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The aircraft engine design process seeks to achieve the best overall system-level performance, weight, and cost for a given engine design. This is achieved by a complex process known as systems analysis, where steady-state simulations are used to identify trade-offs that should be balanced to optimize the system. The steady-state simulations and data on which systems analysis relies may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic Systems Analysis provides the capability for assessing these trade-offs at an earlier stage of the engine design process. The concept of dynamic systems analysis and the type of information available from this analysis are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed. This tool aids a user in the design of a power management controller to regulate thrust, and a transient limiter to protect the engine model from surge at a single flight condition (defined by an altitude and Mach number). Results from simulation of the closed-loop system may be used to estimate the dynamic performance of the model. This enables evaluation of the trade-off between performance and operability, or safety, in the engine, which could not be done with steady-state data alone. A design study is presented to compare the dynamic performance of two different engine models integrated with the TTECTrA software.

  12. Estimating the Diets of Animals Using Stable Isotopes and a Comprehensive Bayesian Mixing Model

    PubMed Central

    Hopkins, John B.; Ferguson, Jake M.

    2012-01-01

    Using stable isotope mixing models (SIMMs) as a tool to investigate the foraging ecology of animals is gaining popularity among researchers. As a result, statistical methods are rapidly evolving and numerous models have been produced to estimate the diets of animals—each with their benefits and their limitations. Deciding which SIMM to use is contingent on factors such as the consumer of interest, its food sources, sample size, the familiarity a user has with a particular framework for statistical analysis, or the level of inference the researcher desires to make (e.g., population- or individual-level). In this paper, we provide a review of commonly used SIMM models and describe a comprehensive SIMM that includes all features commonly used in SIMM analysis and two new features. We used data collected in Yosemite National Park to demonstrate IsotopeR's ability to estimate dietary parameters. We then examined the importance of each feature in the model and compared our results to inferences from commonly used SIMMs. IsotopeR's user interface (in R) will provide researchers a user-friendly tool for SIMM analysis. The model is also applicable for use in paleontology, archaeology, and forensic studies as well as estimating pollution inputs. PMID:22235246
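
    The deterministic core of such a mixing model is just isotope mass balance; the sketch below solves the exactly determined case of two tracers and three sources for the diet proportions (full SIMMs such as IsotopeR do this in a Bayesian framework with error and concentration terms). The isotope values and source labels are invented for illustration.

    import numpy as np

    # Source signatures: rows are the two tracers (d13C, d15N), columns are three food sources
    sources = np.array([
        [-26.0, -21.0, -12.0],   # d13C
        [  2.0,   6.0,  10.0],   # d15N
    ])
    mixture = np.array([-19.0, 6.0])    # consumer tissue signature

    # Mass balance: sources @ p = mixture, with the proportions p summing to 1
    A = np.vstack([sources, np.ones(3)])
    b = np.append(mixture, 1.0)
    proportions = np.linalg.solve(A, b)
    print({name: round(float(p), 2)
           for name, p in zip(["source 1", "source 2", "source 3"], proportions)})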

  13. Reusable Social Networking Capabilities for an Earth Science Collaboratory

    NASA Astrophysics Data System (ADS)

    Lynnes, C.; Da Silva, D.; Leptoukh, G. G.; Ramachandran, R.

    2011-12-01

    A vast untapped resource of data, tools, information and knowledge lies within the Earth science community. This is due to the fact that it is difficult to share the full spectrum of these entities, particularly their full context. As a result, most knowledge exchange is through person-to-person contact at meetings, email and journal articles, each of which can support only a limited level of detail. We propose the creation of an Earth Science Collaboratory (ESC): a framework that would enable sharing of data, tools, workflows, results and the contextual knowledge about these information entities. The Drupal platform is well positioned to provide the key social networking capabilities to the ESC. As a proof of concept of a rich collaboration mechanism, we have developed a Drupal-based mechanism for graphically annotating and commenting on results images from analysis workflows in the online Giovanni analysis system for remote sensing data. The annotations can be tagged and shared with others in the community. These capabilities are further supplemented by a Research Notebook capability reused from another online analysis system named Talkoot. The goal is a reusable set of modules that can integrate with variety of other applications either within Drupal web frameworks or at a machine level.

  14. Computer-aided modelling and analysis of PV systems: a comparative study.

    PubMed

    Koukouvaos, Charalambos; Kandris, Dionisis; Samarakou, Maria

    2014-01-01

    Modern scientific advances have enabled remarkable efficacy for photovoltaic systems with regard to the exploitation of solar energy, boosting them into having a rapidly growing position among the systems developed for the production of renewable energy. However, in many cases the design, analysis, and control of photovoltaic systems are tasks which are quite complex and thus difficult to be carried out. In order to cope with these kinds of problems, appropriate software tools have been developed either as standalone products or parts of general purpose software platforms used to model and simulate the generation, transmission, and distribution of solar energy. The utilization of such software tools may be extremely helpful to the successful performance evaluation of energy systems with maximum accuracy and minimum cost in time and effort. The work presented in this paper aims, on a first level, at the performance analysis of various configurations of photovoltaic systems through computer-aided modelling. On a second level, it provides a comparative evaluation of the credibility of two of the most advanced graphical programming environments, namely, Simulink and LabVIEW, with regard to their application in photovoltaic systems.

  15. Spectral Properties and Dynamics of Gold Nanorods Revealed by EMCCD Based Spectral-Phasor Method

    PubMed Central

    Chen, Hongtao; Digman, Michelle A.

    2015-01-01

    Gold nanorods (NRs) with tunable plasmon-resonant absorption in the near-infrared region have considerable advantages over organic fluorophores as imaging agents. However, the luminescence spectral properties of NRs have not been fully explored at the single-particle level in bulk due to the lack of proper analytical tools. Here we present a global spectral phasor analysis method which allows investigation of NR spectra at the single-particle level, together with their statistical behavior and spatial information during imaging. The wide phasor distribution obtained by the spectral phasor analysis indicates that the spectra of NRs differ from particle to particle. NRs with different spectra can be identified graphically in corresponding spatial images with high spectral resolution. Furthermore, the spectral behaviors of NRs under different imaging conditions, e.g. different excitation powers and wavelengths, were carefully examined with our laser-scanning multiphoton microscope with spectral imaging capability. Our results prove that the spectral phasor method is an easy and efficient tool in hyperspectral imaging analysis for unravelling subtle changes of the emission spectrum. Moreover, we applied this method to study the spectral dynamics of NRs during direct optical trapping and by optothermal trapping. Interestingly, spectral shifts were observed in both trapping phenomena. PMID:25684346
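
    The phasor transform underlying this kind of analysis maps each measured spectrum to a single point (G, S) given by the intensity-weighted cosine and sine of the first harmonic across the wavelength channels. The following Python sketch shows that generic transform on two invented single-particle spectra; it is not the authors' EMCCD acquisition pipeline, only the coordinate mapping that lets spectrally distinct particles separate on the phasor plot.

      import numpy as np

      def spectral_phasor(spectrum, harmonic=1):
          """Map one emission spectrum to phasor coordinates (G, S)."""
          spectrum = np.asarray(spectrum, dtype=float)
          k = np.arange(spectrum.size)
          phase = 2.0 * np.pi * harmonic * k / spectrum.size
          total = spectrum.sum()
          return (np.sum(spectrum * np.cos(phase)) / total,
                  np.sum(spectrum * np.sin(phase)) / total)

      # Two hypothetical nanorod spectra sampled on the same 64 wavelength channels.
      channels = np.linspace(550, 750, 64)                      # nm, illustrative
      nr_a = np.exp(-0.5 * ((channels - 630) / 15.0) ** 2)      # narrower emitter
      nr_b = np.exp(-0.5 * ((channels - 660) / 25.0) ** 2)      # red-shifted, broader emitter
      print(spectral_phasor(nr_a), spectral_phasor(nr_b))       # two distinct phasor points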

  16. Computer-Aided Modelling and Analysis of PV Systems: A Comparative Study

    PubMed Central

    Koukouvaos, Charalambos

    2014-01-01

    Modern scientific advances have enabled remarkable efficacy for photovoltaic systems with regard to the exploitation of solar energy, giving them a rapidly growing position among the systems developed for the production of renewable energy. However, in many cases the design, analysis, and control of photovoltaic systems are tasks which are quite complex and thus difficult to carry out. To cope with this kind of problem, appropriate software tools have been developed, either as standalone products or as parts of general-purpose software platforms used to model and simulate the generation, transmission, and distribution of solar energy. The utilization of this kind of software tool may be extremely helpful for the successful performance evaluation of energy systems with maximum accuracy and minimum cost in time and effort. The work presented in this paper aims, on a first level, at the performance analysis of various configurations of photovoltaic systems through computer-aided modelling. On a second level, it provides a comparative evaluation of the credibility of two of the most advanced graphical programming environments, namely, Simulink and LabVIEW, with regard to their application in photovoltaic systems. PMID:24772007

  17. MatSeis and the GNEM R&E regional seismic analysis tools.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chael, Eric Paul; Hart, Darren M.; Young, Christopher John

    2003-08-01

    To improve the nuclear event monitoring capability of the U.S., the NNSA Ground-based Nuclear Explosion Monitoring Research & Engineering (GNEM R&E) program has been developing a collection of products known as the Knowledge Base (KB). Though much of the focus for the KB has been on the development of calibration data, we have also developed numerous software tools for various purposes. The Matlab-based MatSeis package and the associated suite of regional seismic analysis tools were developed to aid in the testing and evaluation of some Knowledge Base products for which existing applications were either not available or ill-suited. This presentation will provide brief overviews of MatSeis and each of the tools, emphasizing features added in the last year. MatSeis was begun in 1996 and is now a fairly mature product. It is a highly flexible seismic analysis package that provides interfaces to read data from either flatfiles or an Oracle database. All of the standard seismic analysis tasks are supported (e.g. filtering, 3 component rotation, phase picking, event location, magnitude calculation), as well as a variety of array processing algorithms (beaming, FK, coherency analysis, vespagrams). The simplicity of Matlab coding and the tremendous number of available functions make MatSeis/Matlab an ideal environment for developing new monitoring research tools (see the regional seismic analysis tools below). New MatSeis features include: addition of evid information to events in MatSeis, options to screen picks by author, input and output of origerr information, improved performance in reading flatfiles, improved speed in FK calculations, and significant improvements to Measure Tool (filtering, multiple phase display), Free Plot (filtering, phase display and alignment), Mag Tool (maximum likelihood options), and Infra Tool (improved calculation speed, display of an F statistic stream). Work on the regional seismic analysis tools (CodaMag, EventID, PhaseMatch, and Dendro) began in 1999 and the tools vary in their level of maturity. All rely on MatSeis to provide necessary data (waveforms, arrivals, origins, and travel time curves). CodaMag Tool implements magnitude calculation by scaling to fit the envelope shape of the coda for a selected phase type (Mayeda, 1993; Mayeda and Walter, 1996). New tool features include: calculation of a yield estimate based on the source spectrum, display of a filtered version of the seismogram based on the selected band, and the output of codamag data records for processed events. EventID Tool implements event discrimination using phase ratios of regional arrivals (Hartse et al., 1997; Walter et al., 1999). New features include: bandpass filtering of displayed waveforms, screening of reference events based on SNR, multivariate discriminants, use of libcgi to access correction surfaces, and the output of discrim_data records for processed events. PhaseMatch Tool implements match filtering to isolate surface waves (Herrin and Goforth, 1977). New features include: display of the signal's observed dispersion and an option to use a station-based dispersion surface. Dendro Tool implements agglomerative hierarchical clustering using dendrograms to identify similar events based on waveform correlation (Everitt, 1993). New features include: modifications to include arrival information within the tool, and the capability to automatically add/re-pick arrivals based on the picked arrivals for similar events.
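
    Of the array-processing operations listed above, beaming is the simplest to illustrate: each trace is advanced by the plane-wave delay implied by a trial slowness vector, and the aligned traces are stacked. The sketch below is a generic Python/numpy illustration of that delay-and-sum idea on synthetic data; MatSeis's own implementations are in Matlab and include the FK, coherency, and vespagram processing that this does not cover.

      import numpy as np

      def delay_and_sum_beam(traces, x, y, sx, sy, dt):
          """Stack array traces after removing the plane-wave delay s . r per station.

          traces : (n_stations, n_samples) waveform array
          x, y   : station offsets from the array reference point (km)
          sx, sy : trial horizontal slowness components (s/km)
          dt     : sample interval (s)
          """
          beam = np.zeros(traces.shape[1])
          for trace, xi, yi in zip(traces, x, y):
              shift = int(round((sx * xi + sy * yi) / dt))
              beam += np.roll(trace, -shift)          # align, then stack
          return beam / len(traces)

      # Synthetic plane wave crossing a five-station array, plus noise.
      rng = np.random.default_rng(0)
      dt, n = 0.025, 2000
      x = np.array([0.0, 3.0, -2.0, 1.5, -3.5]); y = np.array([0.0, 1.0, 2.5, -3.0, -1.0])
      sx_true, sy_true = 0.08, 0.05                   # s/km
      wavelet = np.exp(-0.5 * ((np.arange(n) - 1000) * dt / 0.5) ** 2)
      traces = np.array([np.roll(wavelet, int(round((sx_true * xi + sy_true * yi) / dt)))
                         for xi, yi in zip(x, y)]) + 0.1 * rng.standard_normal((5, n))
      print(delay_and_sum_beam(traces, x, y, sx_true, sy_true, dt).max())  # near 1 when aligned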

  18. International Space Station Execution Replanning Process: Trends and Implications

    NASA Technical Reports Server (NTRS)

    McCormick, Robet J.

    2007-01-01

    The International Space Station (ISS) is a joint venture. Because of this, ISS execution planning (planning within the week for the ISS) requires coordination across multiple partners, along with the associated processes and tools to allow this coordination to occur. These processes and tools are currently defined and are extensively used. This paper summarizes these processes and documents the current data trends associated with these processes and tools, with a focus on the metrics provided from the ISS Planning Product Change Request (PPCR) tool. As NASA's Vision for Space Exploration and general human spaceflight trends are implemented, the probability of joint-venture, long-duration programs such as the ISS, with varying levels of intergovernmental and/or corporate partnership, will increase. Therefore, the results of this PPCR analysis serve as current lessons learned for the ISS and for future similar ventures.

  19. QPROT: Statistical method for testing differential expression using protein-level intensity data in label-free quantitative proteomics.

    PubMed

    Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I

    2015-11-03

    We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for analysis using continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Determination of differential expression for each protein is based on a standardized Z-statistic computed from the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated by a well-known empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework with an accompanying computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics data, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
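
    The decision statistic described above is, in essence, a log fold change standardized by its uncertainty and then thresholded by an FDR estimate. The Python sketch below reproduces only that outline on simulated log2 intensities, using a plain two-group Z-statistic and Benjamini-Hochberg adjustment as stand-ins; QPROT's actual implementation uses the posterior distribution of the fold change and an empirical Bayes FDR, and handles missing intensities, which this toy example does not.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      n_prot, n_a, n_b = 500, 4, 4
      log_a = rng.normal(20.0, 1.0, size=(n_prot, n_a))   # simulated log2 intensities, group A
      log_b = rng.normal(20.0, 1.0, size=(n_prot, n_b))
      log_b[:25] += 1.5                                   # 25 truly changed proteins

      log_fc = log_b.mean(axis=1) - log_a.mean(axis=1)
      se = np.sqrt(log_a.var(axis=1, ddof=1) / n_a + log_b.var(axis=1, ddof=1) / n_b)
      z = log_fc / se                                     # standardized statistic
      p = 2.0 * stats.norm.sf(np.abs(z))

      # Benjamini-Hochberg adjustment as a simple stand-in for the empirical Bayes FDR.
      order = np.argsort(p)
      bh = np.minimum.accumulate((p[order] * n_prot / np.arange(1, n_prot + 1))[::-1])[::-1]
      q = np.empty_like(p); q[order] = np.clip(bh, 0.0, 1.0)
      print("proteins called at q < 0.05:", int((q < 0.05).sum()))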

  20. A formative evaluation of the implementation of a medication safety data collection tool in English healthcare settings: A qualitative interview study using normalisation process theory.

    PubMed

    Rostami, Paryaneh; Ashcroft, Darren M; Tully, Mary P

    2018-01-01

    Reducing medication-related harm is a global priority; however, impetus for improvement is impeded as routine medication safety data are seldom available. Therefore, the Medication Safety Thermometer was developed within England's National Health Service. This study aimed to explore the implementation of the tool into routine practice from users' perspectives. Fifteen semi-structured interviews were conducted with purposely sampled National Health Service staff from primary and secondary care settings. Interview data were analysed using an initial thematic analysis, and subsequent analysis using Normalisation Process Theory. Secondary care staff understood that the Medication Safety Thermometer's purpose was to measure medication safety and improvement. However, other uses were reported, such as pinpointing poor practice. Confusion about its purpose existed in primary care, despite further training, suggesting unsuitability of the tool. Decreased engagement was displayed by staff less involved with medication use, who displayed less ownership. Nonetheless, these advocates often lacked support from management and frontline levels, leading to an overall lack of engagement. Many participants reported efforts to drive scale-up of the use of the tool, for example, by securing funding, despite uncertainty around how to use data. Successful improvement was often at ward-level and went unrecognised within the wider organisation. There was mixed feedback regarding the value of the tool, often due to a perceived lack of "capacity". However, participants demonstrated interest in learning how to use their data and unexpected applications of data were reported. Routine medication safety data collection is complex, but achievable and facilitates improvements. However, collected data must be analysed, understood and used for further work to achieve improvement, which often does not happen. The national roll-out of the tool has accelerated shared learning; however, a number of difficulties still exist, particularly in primary care settings, where a different approach is likely to be required.

  1. A formative evaluation of the implementation of a medication safety data collection tool in English healthcare settings: A qualitative interview study using normalisation process theory

    PubMed Central

    Ashcroft, Darren M.; Tully, Mary P.

    2018-01-01

    Background Reducing medication-related harm is a global priority; however, impetus for improvement is impeded as routine medication safety data are seldom available. Therefore, the Medication Safety Thermometer was developed within England’s National Health Service. This study aimed to explore the implementation of the tool into routine practice from users’ perspectives. Method Fifteen semi-structured interviews were conducted with purposely sampled National Health Service staff from primary and secondary care settings. Interview data were analysed using an initial thematic analysis, and subsequent analysis using Normalisation Process Theory. Results Secondary care staff understood that the Medication Safety Thermometer’s purpose was to measure medication safety and improvement. However, other uses were reported, such as pinpointing poor practice. Confusion about its purpose existed in primary care, despite further training, suggesting unsuitability of the tool. Decreased engagement was displayed by staff less involved with medication use, who displayed less ownership. Nonetheless, these advocates often lacked support from management and frontline levels, leading to an overall lack of engagement. Many participants reported efforts to drive scale-up of the use of the tool, for example, by securing funding, despite uncertainty around how to use data. Successful improvement was often at ward-level and went unrecognised within the wider organisation. There was mixed feedback regarding the value of the tool, often due to a perceived lack of “capacity”. However, participants demonstrated interest in learning how to use their data and unexpected applications of data were reported. Conclusion Routine medication safety data collection is complex, but achievable and facilitates improvements. However, collected data must be analysed, understood and used for further work to achieve improvement, which often does not happen. The national roll-out of the tool has accelerated shared learning; however, a number of difficulties still exist, particularly in primary care settings, where a different approach is likely to be required. PMID:29489842

  2. [Model-based biofuels system analysis: a review].

    PubMed

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  3. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action

    PubMed Central

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels—from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures. PMID:27920748
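
    MdRQA treats the multivariate signal (for example, one column per group member) as a trajectory in a shared phase space and counts how often that trajectory revisits earlier states. A minimal Python version of that step is sketched below on an invented three-person signal: build the pairwise distance matrix, threshold it at a radius, and report the recurrence rate. The published MATLAB implementation additionally computes determinism, entropy, and the other standard RQA measures.

      import numpy as np

      def mdrqa_recurrence(data, radius):
          """Recurrence matrix and recurrence rate for an (n_samples, n_dims) series."""
          d = np.linalg.norm(data[:, None, :] - data[None, :, :], axis=-1)
          rec = (d <= radius).astype(int)
          n = len(data)
          rr = (rec.sum() - n) / (n * n - n)      # exclude the trivial main diagonal
          return rec, rr

      # Hypothetical "group" signal: three loosely synchronized oscillators.
      t = np.linspace(0, 20, 400)
      group = np.column_stack([np.sin(t), np.sin(t + 0.2), np.sin(t + 0.4)])
      group += 0.05 * np.random.default_rng(2).standard_normal(group.shape)
      _, rr = mdrqa_recurrence(group, radius=0.3)
      print(f"recurrence rate: {rr:.3f}")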

  4. Joint Analysis: QDR 2001 and Beyond Mini-Symposium Held in Fairfax, Virginia on 1-3 February 2000

    DTIC Science & Technology

    2001-04-11

    have done better in: * Articulating a high-level, understandable story that was credible to Congress. * Documenting, archiving assessments performed ...to (1) examine DoD assessment capabilities for performing QDR 2001, (2) provide a non-confrontational environment in which OSD, the Joint Staff...example. [Slide residue: key issues and tools/databases defined for three levels (low, medium, high) across scenarios emphasizing modernization, etc.]

  5. Characterization of Sleep Using Bispectral Analysis

    DTIC Science & Technology

    2001-10-25

    approved the Bispectral Index (BIS) - developed by Aspect Medical Systems (Natick, MA) - as a tool for monitoring the depth of anesthesia based on...on the brain [2]. The A-1000 BIS monitor quantifies the level of hypnosis based on the frequency, amplitude, and coherence of the EEG. The BIS index...on a scale of 0-100 (100 reflecting the fully conscious or awake state), is a single-number indicator of the level of induced hypnosis. Furthermore

  6. Planning level assessment of greenhouse gas emissions for alternative transportation construction projects : carbon footprint estimator, phase II, volume I - GASCAP model.

    DOT National Transportation Integrated Search

    2014-03-01

    The GASCAP model was developed to provide a software tool for analysis of the life-cycle GHG emissions associated with the construction and maintenance of transportation projects. This phase of development included techniques for estimating emiss...

  7. CHEMICAL MARKERS OF HUMAN WASTE CONTAMINATION: ANALYSIS OF UROBILIN AND PHARMACEUTICALS IN SOURCE WATERS

    EPA Science Inventory

    Giving public water authorities another tool to monitor and measure levels of human waste contamination of waters simply and rapidly would enhance public protection. Most of the methods used today detect such contamination by quantifying microbes occurring in feces in high enough...

  8. Strategies for Teaching Fractions: Using Error Analysis for Intervention and Assessment

    ERIC Educational Resources Information Center

    Spangler, David B.

    2011-01-01

    Many students struggle with fractions and must understand them before learning higher-level math. Veteran educator David B. Spangler provides research-based tools that are aligned with NCTM and Common Core State Standards. He outlines powerful diagnostic methods for analyzing student work and providing timely, specific, and meaningful…

  9. Conceptual Level of Understanding about Sound Concept: Sample of Fifth Grade Students

    ERIC Educational Resources Information Center

    Bostan Sarioglan, Ayberk

    2016-01-01

    In this study, students' conceptual change processes related to the sound concept were examined. The study group comprised 325 fifth-grade middle school students. Three multiple-choice questions were used as the data collection tool. In the data analysis process, "scientific response", "scientifically unacceptable response"…

  10. Analysis methodology and development of a statistical tool for biodistribution data from internal contamination with actinides.

    PubMed

    Lamart, Stephanie; Griffiths, Nina M; Tchitchek, Nicolas; Angulo, Jaime F; Van der Meeren, Anne

    2017-03-01

    The aim of this work was to develop a computational tool that integrates several statistical analysis features for biodistribution data from internal contamination experiments. These data represent actinide levels in biological compartments as a function of time and are derived from activity measurements in tissues and excreta. These experiments aim at assessing the influence of different contamination conditions (e.g. intake route or radioelement) on the biological behavior of the contaminant. The ever increasing number of datasets and diversity of experimental conditions make the handling and analysis of biodistribution data difficult. This work sought to facilitate the statistical analysis of a large number of datasets and the comparison of results from diverse experimental conditions. Functional modules were developed using the open-source programming language R to facilitate specific operations: descriptive statistics, visual comparison, curve fitting, and implementation of biokinetic models. In addition, the structure of the datasets was harmonized using the same table format. Analysis outputs can be written in text files and updated data can be written in the consistent table format. Hence, a data repository is built progressively, which is essential for the optimal use of animal data. Graphical representations can be automatically generated and saved as image files. The resulting computational tool was applied using data derived from wound contamination experiments conducted under different conditions. In facilitating biodistribution data handling and statistical analyses, this computational tool ensures faster analyses and a better reproducibility compared with the use of multiple office software applications. Furthermore, re-analysis of archival data and comparison of data from different sources is made much easier. Hence this tool will help to understand better the influence of contamination characteristics on actinide biokinetics. Our approach can aid the optimization of treatment protocols and therefore contribute to the improvement of the medical response after internal contamination with actinides.
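
    One of the module types mentioned above, curve fitting of retention data, reduces to fitting a sum of exponentials to the fraction of activity measured in a compartment over time. The sketch below does this with scipy's curve_fit on invented retention values; it illustrates the kind of operation the tool automates, and is not the authors' R code or a validated biokinetic model.

      import numpy as np
      from scipy.optimize import curve_fit

      def retention(t, a, lam1, lam2):
          """Two-component exponential retention: fraction of activity remaining at time t."""
          return a * np.exp(-lam1 * t) + (1.0 - a) * np.exp(-lam2 * t)

      # Hypothetical retention data (days, fraction of deposited activity).
      t_days = np.array([1, 3, 7, 14, 28, 56, 90], dtype=float)
      measured = np.array([0.92, 0.80, 0.66, 0.52, 0.38, 0.27, 0.21])

      (a, lam1, lam2), _ = curve_fit(retention, t_days, measured,
                                     p0=[0.5, 0.1, 0.01], bounds=([0, 0, 0], [1, 1, 1]))
      print(f"fast fraction {a:.2f}, half-times {np.log(2)/lam1:.1f} d and {np.log(2)/lam2:.1f} d")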

  11. In silico prediction of splice-altering single nucleotide variants in the human genome.

    PubMed

    Jian, Xueqiu; Boerwinkle, Eric; Liu, Xiaoming

    2014-12-16

    In silico tools have been developed to predict variants that may have an impact on pre-mRNA splicing. The major limitation of the application of these tools to basic research and clinical practice is the difficulty in interpreting the output. Most tools only predict potential splice sites given a DNA sequence without measuring splicing signal changes caused by a variant. Another limitation is the lack of large-scale evaluation studies of these tools. We compared eight in silico tools on 2959 single nucleotide variants within splicing consensus regions (scSNVs) using receiver operating characteristic analysis. The Position Weight Matrix model and MaxEntScan outperformed other methods. Two ensemble learning methods, adaptive boosting and random forests, were used to construct models that take advantage of individual methods. Both models further improved prediction, with outputs of directly interpretable prediction scores. We applied our ensemble scores to scSNVs from the Catalogue of Somatic Mutations in Cancer database. Analysis showed that predicted splice-altering scSNVs are enriched in recurrent scSNVs and known cancer genes. We pre-computed our ensemble scores for all potential scSNVs across the human genome, providing a whole genome level resource for identifying splice-altering scSNVs discovered from large-scale sequencing studies.
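
    The two ensemble learners named above are available off the shelf in scikit-learn, so the training-and-evaluation loop is short. The sketch below runs both with cross-validated ROC AUC on a synthetic feature matrix standing in for the per-variant scores of the individual splicing predictors; the real study trained on the outputs of eight tools over 2959 scSNVs, which are not reproduced here.

      from sklearn.datasets import make_classification
      from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      # Synthetic stand-in for per-variant features (one column per splicing predictor).
      X, y = make_classification(n_samples=2000, n_features=8, n_informative=5,
                                 weights=[0.7, 0.3], random_state=0)

      models = {"adaboost": AdaBoostClassifier(n_estimators=200, random_state=0),
                "random_forest": RandomForestClassifier(n_estimators=300, random_state=0)}
      for name, clf in models.items():
          auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
          print(f"{name}: mean ROC AUC = {auc.mean():.3f}")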

  12. [Validation of a dietary habits questionnaire related to fats and sugars intake].

    PubMed

    Aráuz Hernández, Ana Gladys; Roselló Araya, Marlene; Guzmán Padilla, Sonia; Padilla Vargas, Gioconda

    2008-12-01

    The objective of this study was to design and validate a psychometric tool to measure dietary practices related to the intake of fats and sugars in a sample of overweight and obese adults. Classical test theory was applied. The validated construct was dietary habits, and the following theoretical dimensions were utilized: exclusion, modification, substitution and replacement. These had been previously defined in similar studies conducted in other countries. The tool was validated with 139 adults, males and females, with body mass indexes equal to or higher than 25. Construct validity for each section of the tool was obtained through factor analysis. The final tool was made up of 47 items. Cronbach's alpha reliability coefficient was 0.948, which indicates a highly satisfactory internal consistency. Based on the scree plot and factor analysis of the four proposed theoretical dimensions of behavior, items were fused into two dimensions with a cumulative variance of 58%. These were renamed "elimination" and "modification". Cronbach's alphas were 0.906 and 0.873, respectively, indicating a high level of reliability for construct measurement. Results show the need to adapt foreign tools to our socio-cultural context before utilizing them in interventions intended to modify dietary patterns, since these are interrelated with other aspects of the culture itself.
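
    Cronbach's alpha, the internal-consistency coefficient reported above, is k/(k-1) times one minus the ratio of summed item variances to the variance of the total score. A small Python implementation on made-up Likert responses is sketched below; the numbers are illustrative and unrelated to the questionnaire data in the study.

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1)
          total_var = items.sum(axis=1).var(ddof=1)
          return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

      # Hypothetical 4-point responses from eight participants on five items.
      responses = np.array([[3, 3, 4, 3, 4], [2, 2, 2, 3, 2], [4, 4, 4, 4, 3],
                            [1, 2, 1, 1, 2], [3, 4, 3, 3, 3], [2, 1, 2, 2, 1],
                            [4, 3, 4, 4, 4], [1, 1, 2, 1, 1]])
      print(f"alpha = {cronbach_alpha(responses):.3f}")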

  13. PageMan: an interactive ontology tool to generate, display, and annotate overview graphs for profiling experiments.

    PubMed

    Usadel, Björn; Nagel, Axel; Steinhauser, Dirk; Gibon, Yves; Bläsing, Oliver E; Redestig, Henning; Sreenivasulu, Nese; Krall, Leonard; Hannah, Matthew A; Poree, Fabien; Fernie, Alisdair R; Stitt, Mark

    2006-12-18

    Microarray technology has become a widely accepted and standardized tool in biology. The first microarray data analysis programs were developed to support pair-wise comparison. However, as microarray experiments have become more routine, large scale experiments have become more common, which investigate multiple time points or sets of mutants or transgenics. To extract biological information from such high-throughput expression data, it is necessary to develop efficient analytical platforms, which combine manually curated gene ontologies with efficient visualization and navigation tools. Currently, most tools focus on a few limited biological aspects, rather than offering a holistic, integrated analysis. Here we introduce PageMan, a multiplatform, user-friendly, and stand-alone software tool that annotates, investigates, and condenses high-throughput microarray data in the context of functional ontologies. It includes a GUI tool to transform different ontologies into a suitable format, enabling the user to compare and choose between different ontologies. It is equipped with several statistical modules for data analysis, including over-representation analysis and Wilcoxon statistical testing. Results are exported in a graphical format for direct use, or for further editing in graphics programs. PageMan provides a fast overview of single treatments, allows genome-level responses to be compared across several microarray experiments covering, for example, stress responses at multiple time points. This aids in searching for trait-specific changes in pathways using mutants or transgenics, analyzing development time-courses, and comparison between species. In a case study, we analyze the results of publicly available microarrays of multiple cold stress experiments using PageMan, and compare the results to a previously published meta-analysis. PageMan offers a complete user's guide, a web-based over-representation analysis as well as a tutorial, and is freely available at http://mapman.mpimp-golm.mpg.de/pageman/. PageMan allows multiple microarray experiments to be efficiently condensed into a single page graphical display. The flexible interface allows data to be quickly and easily visualized, facilitating comparisons within experiments and to published experiments, thus enabling researchers to gain a rapid overview of the biological responses in the experiments.
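
    Over-representation analysis of the kind mentioned above asks whether a functional category appears among the responding genes more often than chance would predict, which is commonly answered with a hypergeometric (one-sided Fisher) test. The Python sketch below shows that calculation with invented counts; it illustrates the statistic, not PageMan's exact implementation or its Wilcoxon option.

      from scipy.stats import hypergeom

      def over_representation_p(n_genome, n_category, n_selected, n_overlap):
          """P(overlap >= observed) when n_selected genes are drawn from n_genome genes,
          of which n_category belong to the ontology category of interest."""
          return hypergeom.sf(n_overlap - 1, n_genome, n_category, n_selected)

      # Hypothetical counts: 22,000 genes on the array, 180 annotated to the category,
      # 400 differentially expressed genes, 15 of them in the category.
      print(f"over-representation p-value = {over_representation_p(22000, 180, 400, 15):.2e}")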

  14. Implementing multiresolution models and families of models: from entity-level simulation to desktop stochastic models and "repro" models

    NASA Astrophysics Data System (ADS)

    McEver, Jimmie; Davis, Paul K.; Bigelow, James H.

    2000-06-01

    We have developed and used families of multiresolution and multiple-perspective models (MRM and MRMPM), both in our substantive analytic work for the Department of Defense and to learn more about how such models can be designed and implemented. This paper is a brief case history of our experience with a particular family of models addressing the use of precision fires in interdicting and halting an invading army. Our models were implemented as closed-form analytic solutions, in spreadsheets, and in the more sophisticated AnalyticaTM environment. We also drew on an entity-level simulation for data. The paper reviews the importance of certain key attributes of development environments (visual modeling, interactive languages, friendly use of array mathematics, facilities for experimental design and configuration control, statistical analysis tools, graphical visualization tools, interactive post-processing, and relational database tools). These can go a long way towards facilitating MRMPM work, but many of these attributes are not yet widely available (or available at all) in commercial model-development tools--especially for use with personal computers. We conclude with some lessons learned from our experience.

  15. Machine Learning Meta-analysis of Large Metagenomic Datasets: Tools and Biological Insights.

    PubMed

    Pasolli, Edoardo; Truong, Duy Tin; Malik, Faizan; Waldron, Levi; Segata, Nicola

    2016-07-01

    Shotgun metagenomic analysis of the human associated microbiome provides a rich set of microbial features for prediction and biomarker discovery in the context of human diseases and health conditions. However, the use of such high-resolution microbial features presents new challenges, and validated computational tools for learning tasks are lacking. Moreover, classification rules have scarcely been validated in independent studies, posing questions about the generality and generalization of disease-predictive models across cohorts. In this paper, we comprehensively assess approaches to metagenomics-based prediction tasks and for quantitative assessment of the strength of potential microbiome-phenotype associations. We develop a computational framework for prediction tasks using quantitative microbiome profiles, including species-level relative abundances and presence of strain-specific markers. A comprehensive meta-analysis, with particular emphasis on generalization across cohorts, was performed in a collection of 2424 publicly available metagenomic samples from eight large-scale studies. Cross-validation revealed good disease-prediction capabilities, which were in general improved by feature selection and use of strain-specific markers instead of species-level taxonomic abundance. In cross-study analysis, models transferred between studies were in some cases less accurate than models tested by within-study cross-validation. Interestingly, the addition of healthy (control) samples from other studies to training sets improved disease prediction capabilities. Some microbial species (most notably Streptococcus anginosus) seem to characterize general dysbiotic states of the microbiome rather than connections with a specific disease. Our results in modelling features of the "healthy" microbiome can be considered a first step toward defining general microbial dysbiosis. The software framework, microbiome profiles, and metadata for thousands of samples are publicly available at http://segatalab.cibio.unitn.it/tools/metaml.
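
    The cross-study question raised above, whether a classifier trained on one cohort transfers to another, can be framed as fitting on one study's abundance matrix and scoring on a second. The sketch below does this in Python with a random forest (one of the classifiers typically applied to such profiles) on synthetic relative-abundance data whose disease effect differs slightly between cohorts; all data are simulated and the settings are not those of the published framework.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(3)

      def synth_cohort(n, effect):
          """Synthetic species-level relative abundances (columns = taxa) with labels."""
          X = rng.lognormal(0.0, 1.0, size=(n, 60))
          y = rng.integers(0, 2, size=n)
          X[y == 1, :5] *= effect                     # cohort-specific disease signal
          X /= X.sum(axis=1, keepdims=True)           # convert to relative abundances
          return X, y

      X_train, y_train = synth_cohort(300, effect=1.8)    # "discovery" study
      X_test, y_test = synth_cohort(200, effect=2.2)      # independent validation study

      clf = RandomForestClassifier(n_estimators=400, random_state=0).fit(X_train, y_train)
      print(f"cross-study ROC AUC: {roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]):.3f}")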

  16. Metabolic Network Modeling of Microbial Communities

    PubMed Central

    Biggs, Matthew B.; Medlock, Gregory L.; Kolling, Glynis L.

    2015-01-01

    Genome-scale metabolic network reconstructions and constraint-based analysis are powerful methods that have the potential to make functional predictions about microbial communities. Current use of genome-scale metabolic networks to characterize the metabolic functions of microbial communities includes species compartmentalization, separating species-level and community-level objectives, dynamic analysis, the “enzyme-soup” approach, multi-scale modeling, and others. There are many challenges inherent to the field, including a need for tools that accurately assign high-level omics signals to individual community members, new automated reconstruction methods that rival manual curation, and novel algorithms for integrating omics data and engineering communities. As technologies and modeling frameworks improve, we expect that there will be proportional advances in the fields of ecology, health science, and microbial community engineering. PMID:26109480
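
    Constraint-based analysis of a reconstruction boils down to a linear program: maximize a biomass (or other) flux subject to steady-state mass balance S v = 0 and flux bounds. The Python sketch below runs flux balance analysis on a deliberately tiny three-reaction toy network with scipy's linprog; real community models have thousands of reactions per member and are handled by dedicated packages such as COBRApy.

      import numpy as np
      from scipy.optimize import linprog

      # Toy network: uptake (v1), conversion (v2), biomass formation (v3).
      # Rows are the internal metabolites A and B; steady state requires S @ v = 0.
      S = np.array([[1, -1,  0],     # A: produced by v1, consumed by v2
                    [0,  1, -1]])    # B: produced by v2, consumed by v3
      bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 mmol/gDW/h
      c = [0, 0, -1]                             # maximize v3 (linprog minimizes, hence the sign)

      res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
      print("optimal fluxes:", res.x, "-> biomass flux:", -res.fun)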

  17. wft4galaxy: a workflow testing tool for galaxy.

    PubMed

    Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi

    2017-12-01

    Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool to bring automated testing to Galaxy workflows, making it feasible to bring continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container, with the latter reducing installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it. © The Author 2017. Published by Oxford University Press.

  18. New generation of exploration tools: interactive modeling software and microcomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  19. Verification and Validation of the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build/test process. In addition, we created approximately 3000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software, with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  20. Applied Meteorology Unit (AMU) Quarterly Report Fourth Quarter FY-13

    NASA Technical Reports Server (NTRS)

    Bauman, William; Crawford, Winifred; Watson, Leela; Shafer, Jaclyn; Huddleston, Lisa

    2013-01-01

    Ms. Shafer completed the task to determine relationships between pressure gradients and peak winds at Vandenberg Air Force Base (VAFB), and began developing a climatology for the VAFB wind towers; Dr. Huddleston completed the task to develop a tool to help forecast the time of the first lightning strike of the day in the Kennedy Space Center (KSC)/Cape Canaveral Air Force Station (CCAFS) area; Dr. Bauman completed work on a severe weather forecast tool focused on the Eastern Range (ER), and also developed upper-winds analysis tools for VAFB and Wallops Flight Facility (WFF); Ms. Crawford processed and displayed radar data in the software she will use to create a dual-Doppler analysis over the east-central Florida and KSC/CCAFS areas; Mr. Decker completed developing a wind pairs database for the Launch Services Program to use when evaluating upper-level winds for launch vehicles; Dr. Watson continued work to assimilate observational data into the high-resolution model configurations she created for WFF and the ER.

  1. Medication Reconciliation: Work Domain Ontology, prototype development, and a predictive model.

    PubMed

    Markowitz, Eliz; Bernstam, Elmer V; Herskovic, Jorge; Zhang, Jiajie; Shneiderman, Ben; Plaisant, Catherine; Johnson, Todd R

    2011-01-01

    Medication errors can result from administration inaccuracies at any point of care and are a major cause for concern. To develop a successful Medication Reconciliation (MR) tool, we believe it necessary to build a Work Domain Ontology (WDO) for the MR process. A WDO defines the explicit, abstract, implementation-independent description of the task by separating the task from work context, application technology, and cognitive architecture. We developed a prototype based upon the WDO and designed to adhere to standard principles of interface design. The prototype was compared to Legacy Health System's and Pre-Admission Medication List Builder MR tools via a Keystroke-Level Model analysis for three MR tasks. The analysis found the prototype requires the fewest mental operations, completes tasks in the fewest steps, and completes tasks in the least amount of time. Accordingly, we believe that developing a MR tool, based upon the WDO and user interface guidelines, improves user efficiency and reduces cognitive load.
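
    A Keystroke-Level Model estimate like the one used above is simply the sum of published operator times over the sequence of physical and mental steps a task requires. The Python sketch below uses commonly cited operator values and two invented operator sequences; the sequences and times are illustrative and are not the task models or results reported in the study.

      # Commonly cited Keystroke-Level Model operator times (seconds).
      KLM_TIMES = {"K": 0.20,   # keystroke or button press
                   "P": 1.10,   # point with the mouse
                   "H": 0.40,   # home hands between keyboard and mouse
                   "M": 1.35}   # mental preparation

      def klm_estimate(sequence):
          """Predicted expert task time for a string of KLM operators, e.g. 'MPK'."""
          return sum(KLM_TIMES[op] for op in sequence)

      # Hypothetical operator sequences for reconciling one medication in two interfaces.
      prototype = "M" + "PK"            # decide, point at the drug row, click once
      legacy = "M" + "PK" * 3 + "H"     # decide, three point-and-click steps, re-home hands
      print(f"prototype: {klm_estimate(prototype):.2f} s, legacy: {klm_estimate(legacy):.2f} s")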

  2. Medication Reconciliation: Work Domain Ontology, Prototype Development, and a Predictive Model

    PubMed Central

    Markowitz, Eliz; Bernstam, Elmer V.; Herskovic, Jorge; Zhang, Jiajie; Shneiderman, Ben; Plaisant, Catherine; Johnson, Todd R.

    2011-01-01

    Medication errors can result from administration inaccuracies at any point of care and are a major cause for concern. To develop a successful Medication Reconciliation (MR) tool, we believe it necessary to build a Work Domain Ontology (WDO) for the MR process. A WDO defines the explicit, abstract, implementation-independent description of the task by separating the task from work context, application technology, and cognitive architecture. We developed a prototype based upon the WDO and designed to adhere to standard principles of interface design. The prototype was compared to Legacy Health System’s and Pre-Admission Medication List Builder MR tools via a Keystroke-Level Model analysis for three MR tasks. The analysis found the prototype requires the fewest mental operations, completes tasks in the fewest steps, and completes tasks in the least amount of time. Accordingly, we believe that developing a MR tool, based upon the WDO and user interface guidelines, improves user efficiency and reduces cognitive load. PMID:22195146

  3. Ecological hierarchies and self-organisation - Pattern analysis, modelling and process integration across scales

    USGS Publications Warehouse

    Reuter, H.; Jopp, F.; Blanco-Moreno, J. M.; Damgaard, C.; Matsinos, Y.; DeAngelis, D.L.

    2010-01-01

    A continuing discussion in applied and theoretical ecology focuses on the relationship of different organisational levels and on how ecological systems interact across scales. We address principal approaches to cope with complex across-level issues in ecology by applying elements of hierarchy theory and the theory of complex adaptive systems. A top-down approach, often characterised by the use of statistical techniques, can be applied to analyse large-scale dynamics and identify constraints exerted on lower levels. Current developments are illustrated with examples from the analysis of within-community spatial patterns and large-scale vegetation patterns. A bottom-up approach allows one to elucidate how interactions of individuals shape dynamics at higher levels in a self-organisation process; e.g., population development and community composition. This may be facilitated by various modelling tools, which provide the distinction between focal levels and resulting properties. For instance, resilience in grassland communities has been analysed with a cellular automaton approach, and the driving forces in rodent population oscillations have been identified with an agent-based model. Both modelling tools illustrate the principles of analysing higher level processes by representing the interactions of basic components. The focus of most ecological investigations on either top-down or bottom-up approaches may not be appropriate if strong cross-scale relationships predominate. Here, we propose an 'across-scale-approach', closely interweaving the inherent potentials of both approaches. This combination of analytical and synthesising approaches will enable ecologists to establish a more coherent access to cross-level interactions in ecological systems. © 2010 Gesellschaft für Ökologie.

  4. Hanny and the Mystery of the Voorwerp: Citizen Science in the Classroom

    NASA Astrophysics Data System (ADS)

    Costello, K.; Reilly, E.; Bracey, G.; Gay, P.

    2012-08-01

    The highly engaging graphic comic Hanny and the Mystery of the Voorwerp is the focus of an eight-day educational unit geared to middle level students. Activities in the unit link national astronomy standards to the citizen science Zooniverse website through tutorials that lead to analysis of real data online. NASA resources are also included in the unit. The content of the session focused on the terminology and concepts - galaxy formation, types and characteristics of galaxies, use of spectral analysis - needed to classify galaxies. Use of citizen science projects as tools to teach inquiry in the classroom was the primary focus of the workshop. The session included a hands-on experiment taken from the unit, including a NASA spectral analysis activity called "What's the Frequency, Roy G Biv?" In addition, presenters demonstrated the galaxy classification tools found in the "Galaxy Zoo" project at the Zooniverse citizen science website.

  5. Optimization of an Advanced Hybrid Wing Body Concept Using HCDstruct Version 1.2

    NASA Technical Reports Server (NTRS)

    Quinlan, Jesse R.; Gern, Frank H.

    2016-01-01

    Hybrid Wing Body (HWB) aircraft concepts continue to be promising candidates for achieving the simultaneous fuel consumption and noise reduction goals set forth by NASA's Environmentally Responsible Aviation (ERA) project. In order to evaluate the projected benefits, improvements in structural analysis at the conceptual design level were necessary; thus, NASA researchers developed the Hybrid wing body Conceptual Design and structural optimization (HCDstruct) tool to perform aeroservoelastic structural optimizations of advanced HWB concepts. In this paper, the authors present substantial updates to the HCDstruct tool and related analysis, including: the addition of four inboard and eight outboard control surfaces and two all-movable tail/rudder assemblies, providing a full aeroservoelastic analysis capability; the implementation of asymmetric load cases for structural sizing applications; and a methodology for minimizing control surface actuation power using NASTRAN SOL 200 and HCDstruct's aeroservoelastic finite-element model (FEM).

  6. ArrayNinja: An Open Source Platform for Unified Planning and Analysis of Microarray Experiments.

    PubMed

    Dickson, B M; Cornett, E M; Ramjan, Z; Rothbart, S B

    2016-01-01

    Microarray-based proteomic platforms have emerged as valuable tools for studying various aspects of protein function, particularly in the field of chromatin biochemistry. Microarray technology itself is largely unrestricted in regard to printable material and platform design, and efficient multidimensional optimization of assay parameters requires fluidity in the design and analysis of custom print layouts. This motivates the need for streamlined software infrastructure that facilitates the combined planning and analysis of custom microarray experiments. To this end, we have developed ArrayNinja as a portable, open source, and interactive application that unifies the planning and visualization of microarray experiments and provides maximum flexibility to end users. Array experiments can be planned, stored to a private database, and merged with the imaged results for a level of data interaction and centralization that is not currently attainable with available microarray informatics tools. © 2016 Elsevier Inc. All rights reserved.

  7. MASS SPECTROMETRY-BASED METABOLOMICS

    PubMed Central

    Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.

    2007-01-01

    This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry, and mass spectrometry in particular has vast potential as a tool for this type of investigation. Metabolomics requires special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475

  8. Risk analysis and bovine tuberculosis, a re-emerging zoonosis.

    PubMed

    Etter, Eric; Donado, Pilar; Jori, Ferran; Caron, Alexandre; Goutard, Flavie; Roger, François

    2006-10-01

    The spread of immunodeficiency associated with AIDS, together with the consequences of poverty for sanitary protection and information at both the individual and state levels, has made control of tuberculosis (TB) one of the priorities of World Health Organization programs. The impact of bovine tuberculosis (BTB) on humans is poorly documented. However, BTB remains a major problem for livestock in developing countries, particularly in Africa, and wildlife is responsible for the failure of TB eradication programs. In Africa, the consumption of raw milk and raw meat, and the development of bushmeat consumption as a cheap source of protein, are among the principal routes of human contamination with BTB. The exploration of these different pathways using tools such as participatory epidemiology allows a risk analysis of the impact of BTB on human health in Africa. This analysis represents a management-support and decision tool for the study and control of zoonotic BTB.

  9. A Gap Analysis Needs Assessment Tool to Drive a Care Delivery and Research Agenda for Integration of Care and Sharing of Best Practices Across a Health System.

    PubMed

    Golden, Sherita Hill; Hager, Daniel; Gould, Lois J; Mathioudakis, Nestoras; Pronovost, Peter J

    2017-01-01

    In a complex health system, it is important to establish a systematic and data-driven approach to identifying needs. The Diabetes Clinical Community (DCC) of Johns Hopkins Medicine's Armstrong Institute for Patient Safety and Quality developed a gap analysis tool and process to establish the system's current state of inpatient diabetes care. The collectively developed tool assessed the following areas: program infrastructure; protocols, policies, and order sets; patient and health care professional education; and automated data access. For the purposes of this analysis, gaps were defined as those instances in which local resources, infrastructure, or processes demonstrated a variance against the current national evidence base or institutionally defined best practices. Following the gap analysis, members of the DCC, in collaboration with health system leadership, met to identify priority areas in order to integrate and synergize diabetes care resources and efforts to enhance quality and reduce disparities in care across the system. Key gaps in care identified included lack of standardized glucose management policies, lack of standardized training of health care professionals in inpatient diabetes management, and lack of access to automated data collection and analysis. These results were used to gain resources to support collaborative diabetes health system initiatives and to successfully obtain federal research funding to develop and pilot a pragmatic diabetes educational intervention. At a health system level, the summary format of this gap analysis tool is an effective method to clearly identify disparities in care to focus efforts and resources to improve care delivery. Copyright © 2016 The Joint Commission. Published by Elsevier Inc. All rights reserved.

  10. Weighing up the weighted case mix tool (WCMT): a psychometric investigation using confirmatory factor analysis.

    PubMed

    Duane, B G; Humphris, G; Richards, D; Okeefe, E J; Gordon, K; Freeman, R

    2014-12-01

    To assess the use of the WCMT in two Scottish health boards and to consider the impact of simplifying the tool to improve efficiency of use. A retrospective analysis of routine WCMT data (47,276 cases). Public Dental Service (PDS) within NHS Lothian and Highland. The WCMT consists of six criteria. Each criterion is measured independently on a four-point scale to assess patient complexity and the dental care needs of disabled/impaired patients. Psychometric analyses of the data-set were conducted. Conventional internal consistency coefficients were calculated. Latent variable modelling was performed to assess the 'fit' of the raw data to a pre-specified measurement model. A Confirmatory Factor Analysis (CFA) was used to test three potential changes to the existing WCMT: the removal of the oral risk factor question, the removal of the original weightings for scoring the tool, and collapsing the 4-point rating scale to three categories. The removal of the oral risk factor question had little impact on the ability of the proposed simplified CMT to discriminate reliably between levels of patient complexity. The removal of weighting and the collapsing of each item's rating scale to three categories had limited impact on the reliability of the revised tool. The CFA provided strong evidence that a new, proposed simplified Case Mix Tool (sCMT) would operate closely to the pre-specified measurement model (the WCMT). A modified sCMT can provide, without reduced reliability, a useful measure of the complexity of patient care. The proposed sCMT may be implemented within primary care dentistry to record patient complexity as part of an oral health assessment.

  11. Using the missed opportunity tool as an application of the Lives Saved Tool (LiST) for intervention prioritization.

    PubMed

    Tam, Yvonne; Pearson, Luwei

    2017-11-07

    The Missed Opportunity tool was developed as an application in the Lives Saved Tool (LiST) to allow users to quickly compare the relative impact of interventions. Global Financing Facility (GFF) investment cases have been identified as a potential application of the Missed Opportunity analyses in the Democratic Republic of the Congo (DRC), Ethiopia, Kenya, and Tanzania, using 'lives saved' as a normative factor to set priorities. The Missed Opportunity analysis draws on data and methods in LiST to project maternal, stillbirth, and child deaths averted based on changes in interventions' coverage. Coverage of each individual intervention in LiST was automated to be scaled up from current coverage to 90% in the next year, to simulate a scenario where almost every mother and child receives the proven interventions that they need. The main outcome of the Missed Opportunity analysis is deaths averted due to each intervention. When reducing unmet need for contraception is included in the analysis, it ranks as the top missed opportunity across the four countries. When it is not included in the analysis, the top interventions with the most total deaths averted are hospital-based interventions such as labor and delivery management at the CEmOC and BEmOC levels, and full treatment and supportive care for premature babies and for sepsis/pneumonia. The Missed Opportunity tool can be used to provide a quick, first look at missed opportunities in a country or geographic region, and help identify interventions for prioritization. While it is a useful tool for advocating evidence-based priority setting, decision makers need to consider other factors that influence decision making, and also discuss how to implement, deliver, and sustain programs to achieve high coverage.
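
    The arithmetic behind "scale each intervention to 90% and count deaths averted" is, at its simplest, current cause-specific deaths multiplied by intervention effectiveness, the affected fraction, and the coverage increase. The Python sketch below shows only that skeleton with invented inputs; the actual LiST engine additionally handles multiple causes, stillbirths, herd effects, and interactions between interventions, none of which is captured here.

      def deaths_averted(cause_deaths, effectiveness, affected_fraction, cov_now, cov_target):
          """Toy coverage-scale-up impact: deaths averted by one intervention."""
          return cause_deaths * effectiveness * affected_fraction * (cov_target - cov_now)

      # Hypothetical inputs for one intervention in one country (not LiST defaults).
      averted = deaths_averted(cause_deaths=24_000, effectiveness=0.65,
                               affected_fraction=0.80, cov_now=0.45, cov_target=0.90)
      print(f"estimated deaths averted: {averted:,.0f}")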

  12. Consequences of theory level choice evaluated with new tools from QTAIM and the stress tensor for a dipeptide conformer

    NASA Astrophysics Data System (ADS)

    Li, Jiahui; Xu, Tianlv; Ping, Yang; van Mourik, Tanja; Früchtl, Herbert; Kirk, Steven R.; Jenkins, Samantha

    2018-03-01

    QTAIM and the stress tensor were used to provide a detailed analysis of the topology of the molecular graph and of the BCP and bond-path properties, including the newly introduced helicity length H, of a Tyr-Gly dipeptide conformer subjected to a torsion, using four levels of theory (MP2, M06-2X, B3LYP-D3, and B3LYP) and a modest-sized basis set, 6-31+G(d). Structural effects and bonding properties are quantified and reflect differences in the BSSE and the lack of inclusion of dispersion effects in the B3LYP calculations. The helicity length H demonstrated that MP2 produced a unique response to the torsion, suggesting future use as a diagnostic tool.

  13. Morphometric Assessment of Convergent Tool Technology and Function during the Early Middle Palaeolithic: The Case of Payre, France

    PubMed Central

    Détroit, Florent; Coudenneau, Aude; Moncel, Marie-Hélène

    2016-01-01

    There appears to be little doubt as to the existence of an intentional technological resolve to produce convergent tools during the Middle Palaeolithic. However, the use of these pieces as pointed tools is still subject to debate: i.e., handheld tool vs. hafted tool. Present-day technological analysis has begun to apply new methodologies in order to quantify shape variability and to decipher the role of the morphology of these pieces in relation to function; for instance, geometric morphometric analyses have recently been applied with successful results. This paper presents a study of this type of analysis on 37 convergent tools from level Ga of Payre site (France), dated to MIS 8–7. These pieces are non-standardized knapping products produced by discoidal and orthogonal core technologies. Moreover, macro-wear studies attest to various activities on diverse materials with no evidence of hafting or projectile use. The aim of this paper is to test the geometric morphometric approach on non-standardized artefacts applying the Elliptical Fourier analysis (EFA) to 3D contours and to assess the potential relationship between size and shape, technology and function. This study is innovative in that it is the first time that this method, considered to be a valuable complement for describing technological and functional attributes, is applied to 3D contours of lithic products. Our results show that this methodology ensures a very good degree of accuracy in describing shape variations of the sharp edges of technologically non-standardized convergent tools. EFA on 3D contours indicates variations in deviations of the outline along the third dimension (i.e., dorso-ventrally) and yields quantitative and insightful information on the actual shape variations of tools. Several statistically significant relationships are found between shape variation and use-wear attributes, though the results emphasize the large variability of the shape of the convergent tools, which, in general, does not show a strong direct association with technological features and function. This is in good agreement with the technological context of this chronological period, characterized by a wide diversity of non-standardized tools adapted to multipurpose functions for varied subsistence activities. PMID:27191164

  14. Morphometric Assessment of Convergent Tool Technology and Function during the Early Middle Palaeolithic: The Case of Payre, France.

    PubMed

    Chacón, M Gema; Détroit, Florent; Coudenneau, Aude; Moncel, Marie-Hélène

    2016-01-01

    There appears to be little doubt as to the existence of an intentional technological resolve to produce convergent tools during the Middle Palaeolithic. However, the use of these pieces as pointed tools is still subject to debate: i.e., handheld tool vs. hafted tool. Present-day technological analysis has begun to apply new methodologies in order to quantify shape variability and to decipher the role of the morphology of these pieces in relation to function; for instance, geometric morphometric analyses have recently been applied with successful results. This paper presents a study of this type of analysis on 37 convergent tools from level Ga of Payre site (France), dated to MIS 8-7. These pieces are non-standardized knapping products produced by discoidal and orthogonal core technologies. Moreover, macro-wear studies attest to various activities on diverse materials with no evidence of hafting or projectile use. The aim of this paper is to test the geometric morphometric approach on non-standardized artefacts applying the Elliptical Fourier analysis (EFA) to 3D contours and to assess the potential relationship between size and shape, technology and function. This study is innovative in that it is the first time that this method, considered to be a valuable complement for describing technological and functional attributes, is applied to 3D contours of lithic products. Our results show that this methodology ensures a very good degree of accuracy in describing shape variations of the sharp edges of technologically non-standardized convergent tools. EFA on 3D contours indicates variations in deviations of the outline along the third dimension (i.e., dorso-ventrally) and yields quantitative and insightful information on the actual shape variations of tools. Several statistically significant relationships are found between shape variation and use-wear attributes, though the results emphasize the large variability of the shape of the convergent tools, which, in general, does not show a strong direct association with technological features and function. This is in good agreement with the technological context of this chronological period, characterized by a wide diversity of non-standardized tools adapted to multipurpose functions for varied subsistence activities.

  15. Microbial community analysis using MEGAN.

    PubMed

    Huson, Daniel H; Weber, Nico

    2013-01-01

    Metagenomics, the study of microbes in the environment using DNA sequencing, depends upon dedicated software tools for processing and analyzing very large sequencing datasets. One such tool is MEGAN (MEtaGenome ANalyzer), which can be used to interactively analyze and compare metagenomic and metatranscriptomic data, both taxonomically and functionally. To perform a taxonomic analysis, the program places the reads onto the NCBI taxonomy, while functional analysis is performed by mapping reads to the SEED, COG, and KEGG classifications. Samples can be compared taxonomically and functionally, using a wide range of different charting and visualization techniques. PCoA analysis and clustering methods allow high-level comparison of large numbers of samples. Different attributes of the samples can be captured and used within analysis. The program supports various input formats for loading data and can export analysis results in different text-based and graphical formats. The program is designed to work with very large samples containing many millions of reads. It is written in Java and installers for the three major computer operating systems are available from http://www-ab.informatik.uni-tuebingen.de. © 2013 Elsevier Inc. All rights reserved.
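
    MEGAN's taxonomic binning is built around a lowest-common-ancestor (LCA) assignment of each read's significant hits. The sketch below shows the naive LCA idea on a toy taxonomy; it is an illustration of the principle, not MEGAN's actual implementation, which adds score filters, minimum-support thresholds, and the full NCBI taxonomy.

    ```python
    # Naive LCA read placement, in the spirit of MEGAN's taxonomic binning.
    # Toy taxonomy given as child -> parent; real use would load the NCBI taxonomy.

    PARENT = {
        "E. coli": "Escherichia", "Escherichia": "Enterobacteriaceae",
        "Salmonella enterica": "Salmonella", "Salmonella": "Enterobacteriaceae",
        "Enterobacteriaceae": "Bacteria", "Bacteria": "root",
    }

    def lineage(taxon):
        """Path from a taxon up to the root, taxon first."""
        path = [taxon]
        while path[-1] in PARENT:
            path.append(PARENT[path[-1]])
        return path

    def lowest_common_ancestor(taxa):
        """Deepest node shared by the lineages of all hit taxa."""
        paths = [lineage(t) for t in taxa]
        common = set(paths[0]).intersection(*map(set, paths[1:]))
        # The first shared node met while walking up from any hit is the LCA.
        return next(node for node in paths[0] if node in common)

    # A read whose significant hits span two genera is assigned at the family level.
    print(lowest_common_ancestor(["E. coli", "Salmonella enterica"]))  # Enterobacteriaceae
    ```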

  16. Dynamic Systems Analysis for Turbine Based Aero Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.

    2016-01-01

    The aircraft engine design process seeks to optimize the overall system-level performance, weight, and cost for a given concept. Steady-state simulations and data are used to identify trade-offs that should be balanced to optimize the system, in a process known as systems analysis. These systems analysis simulations and data may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic systems analysis provides the capability to assess these dynamic trade-offs at an earlier stage of the engine design process. The dynamic systems analysis concept, the tools developed for it, and its potential benefits are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed to give the user an estimate of the closed-loop performance (response time) and operability (high-pressure compressor surge margin) for a given engine design and set of control design requirements. TTECTrA, along with engine deterioration information, can be used to develop a more generic relationship between performance and operability that can inform the engine design constraints and potentially lead to a more efficient engine.
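
    TTECTrA's algorithms are not described in this record. As a loose, hypothetical illustration of what a closed-loop "response time" metric means, the sketch below simulates a PI controller driving a first-order lag (a crude stand-in for engine dynamics) and reports the time to reach 95% of the commanded output; gains and time constant are arbitrary.

    ```python
    # Hypothetical illustration of a closed-loop response-time metric.
    # This is not TTECTrA; it only shows the kind of transient metric such a tool estimates.

    def response_time(kp=0.8, ki=2.0, tau=1.5, dt=0.001, t_end=10.0, threshold=0.95):
        y, integ, t = 0.0, 0.0, 0.0
        while t < t_end:
            err = 1.0 - y                      # unit step command
            integ += err * dt
            u = kp * err + ki * integ          # PI control law
            y += dt * (u - y) / tau            # first-order plant: tau*dy/dt = u - y
            t += dt
            if y >= threshold:
                return t
        return None

    print(f"Time to reach 95% of command: {response_time():.2f} s")
    ```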

  17. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory.

    PubMed

    Kumar, B Vinodh; Mohan, Thuthi

    2018-01-01

    Six Sigma is one of the most popular quality management system tools employed for process improvement. Six Sigma methods are usually applied when the outcome of a process can be measured. This study was done to assess the performance of individual biochemical parameters on a sigma scale by calculating the sigma metric for each parameter, and to follow the Westgard guidelines in selecting the appropriate Westgard rules and the internal quality control (IQC) levels that need to be processed to improve target analyte performance based on the sigma metric. This is a retrospective study; the required data were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study were the IQC coefficient of variation (CV%) and the External Quality Assurance Scheme (EQAS) bias% for 16 biochemical parameters. For level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma, and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma; for level 2 IQC, the same four analytes showed a performance of ≥6 sigma, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma. For all analytes below 6 sigma, the quality goal index (QGI) was <0.8, indicating that the area requiring improvement was imprecision, except for cholesterol, whose QGI of >1.2 indicated inaccuracy. This study shows that the sigma metric is a good quality tool for assessing the analytical performance of a clinical chemistry laboratory. Sigma metric analysis thus provides a benchmark for the laboratory to design an IQC protocol, address poor assay performance, and assess the efficiency of existing laboratory processes.
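
    The record does not spell out the formulas; the sketch below uses the definitions commonly cited in the laboratory-medicine literature, sigma = (TEa% − bias%) / CV% and QGI = bias% / (1.5 × CV%), with illustrative (hypothetical) numbers for a single analyte rather than the study's data.

    ```python
    # Sigma metric and quality goal index (QGI) as commonly defined in the literature
    # (assumed here; the record itself does not give formulas):
    #   sigma = (TEa% - bias%) / CV%          QGI = bias% / (1.5 * CV%)

    def sigma_metric(tea_pct, bias_pct, cv_pct):
        return (tea_pct - bias_pct) / cv_pct

    def quality_goal_index(bias_pct, cv_pct):
        return bias_pct / (1.5 * cv_pct)

    # Hypothetical level 1 IQC data for one analyte (not the study's actual values).
    tea, bias, cv = 10.0, 2.0, 3.5     # total allowable error, EQAS bias, IQC CV (%)
    sigma = sigma_metric(tea, bias, cv)
    qgi = quality_goal_index(bias, cv)

    print(f"sigma = {sigma:.1f}")
    if sigma < 6:
        problem = "imprecision" if qgi < 0.8 else ("inaccuracy" if qgi > 1.2 else "both")
        print(f"QGI = {qgi:.2f} -> area needing improvement: {problem}")
    ```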

  18. What Are the Costs of Trauma Center Readiness? Defining and Standardizing Readiness Costs for Trauma Centers Statewide.

    PubMed

    Ashley, Dennis W; Mullins, Robert F; Dente, Christopher J; Garlow, Laura; Medeiros, Regina S; Atkins, Elizabeth V; Solomon, Gina; Abston, Dena; Ferdinand, Colville H

    2017-09-01

    Trauma center readiness costs are incurred to maintain the essential infrastructure and capacity to provide emergent services on a 24/7 basis. These costs are not captured by traditional hospital cost accounting, and no national consensus exists on appropriate definitions for each cost. Therefore, in 2010, stakeholders from all Level I and II trauma centers developed a survey tool standardizing and defining trauma center readiness costs. The survey tool underwent minor revisions to provide further clarity, and the survey was repeated in 2013. The purpose of this study was to provide a follow-up analysis of readiness costs for Georgia's Level I and Level II trauma centers. Using the American College of Surgeons Resources for Optimal Care of the Injured Patient guidelines, four readiness cost categories were identified: Administrative, Clinical Medical Staff, Operating Room, and Education/Outreach. Through conference calls, webinars, and face-to-face meetings with financial officers, trauma medical directors, and program managers from all trauma centers, standardized definitions for reporting readiness costs within each category were developed. This resulted in a survey tool for centers to report their individual readiness costs for one year. The total readiness cost for all Level I trauma centers was $34,105,318 (average $6,821,064 per center), and for all Level II trauma centers it was $20,998,019 (average $2,333,113 per center). A methodology to standardize and define readiness costs for all trauma centers within the state was developed, and average costs for Level I and Level II trauma centers were identified. This model may be used to help other states define and standardize their trauma readiness costs.

  19. Applying reliability analysis to design electric power systems for More-electric aircraft

    NASA Astrophysics Data System (ADS)

    Zhang, Baozhu

    The More-Electric Aircraft (MEA) is a type of aircraft that replaces conventional hydraulic and pneumatic systems with electrically powered components. These changes have significantly challenged aircraft electric power system design. This thesis investigates how reliability analysis can be applied to automatically generate system topologies for the MEA electric power system. We first use the traditional method of reliability block diagrams to analyze the reliability level of different system topologies. We next propose a new methodology in which system topologies, constrained by a specified reliability level, are automatically generated; the path-set method is used for this analysis. Finally, we interface these sets of system topologies with control synthesis tools to automatically create correct-by-construction control logic for the electric power system.
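
    The thesis's path-set formulation is not detailed in this record; the sketch below shows one standard way to evaluate the reliability of a candidate topology from its minimal path sets by Monte Carlo simulation. The example topology and component reliabilities are arbitrary assumptions, not from the thesis.

    ```python
    import random

    # Monte Carlo evaluation of system reliability from minimal path sets:
    # the system works if every component of at least one minimal path works.
    # The topology and component reliabilities below are arbitrary illustrations.

    COMPONENT_RELIABILITY = {"gen1": 0.995, "gen2": 0.995, "busA": 0.999,
                             "busB": 0.999, "load_ctrl": 0.998}
    MINIMAL_PATH_SETS = [("gen1", "busA", "load_ctrl"),
                         ("gen2", "busB", "load_ctrl")]

    def system_reliability(n_trials=200_000, seed=1):
        rng = random.Random(seed)
        ok = 0
        for _ in range(n_trials):
            up = {c: rng.random() < r for c, r in COMPONENT_RELIABILITY.items()}
            if any(all(up[c] for c in path) for path in MINIMAL_PATH_SETS):
                ok += 1
        return ok / n_trials

    print(f"Estimated system reliability: {system_reliability():.4f}")
    ```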

  20. Multi-level characterization of balanced inhibitory-excitatory cortical neuron network derived from human pluripotent stem cells.

    PubMed

    Nadadhur, Aishwarya G; Emperador Melero, Javier; Meijer, Marieke; Schut, Desiree; Jacobs, Gerbren; Li, Ka Wan; Hjorth, J J Johannes; Meredith, Rhiannon M; Toonen, Ruud F; Van Kesteren, Ronald E; Smit, August B; Verhage, Matthijs; Heine, Vivi M

    2017-01-01

    The generation of neuronal cultures from human induced pluripotent stem cells (hiPSCs) serves the study of human brain disorders. However, we lack neuronal networks with balanced excitatory-inhibitory activity that are suitable for single-cell analysis. We generated low-density networks of hPSC-derived GABAergic and glutamatergic cortical neurons, using two different co-culture models with astrocytes. Using confocal microscopy, electrophysiological recordings, calcium imaging, and mRNA analysis, we show that these cultures have balanced excitatory-inhibitory synaptic identities. These simple and robust protocols offer the opportunity for single-cell to multi-level analysis of patient hiPSC-derived cortical excitatory-inhibitory networks, thereby creating advanced tools to study the disease mechanisms underlying neurodevelopmental disorders.

  1. GECKO: a complete large-scale gene expression analysis platform.

    PubMed

    Theilhaber, Joachim; Ulyanov, Anatoly; Malanthara, Anish; Cole, Jack; Xu, Dapeng; Nahf, Robert; Heuer, Michael; Brockel, Christoph; Bushnell, Steven

    2004-12-10

    Gecko (Gene Expression: Computation and Knowledge Organization) is a complete, high-capacity, centralized gene expression analysis system developed in response to the needs of a distributed user community. Based on a client-server architecture, with a centralized repository of typically many tens of thousands of Affymetrix scans, Gecko includes automatic processing pipelines for uploading data from remote sites, a database, a computational engine implementing approximately 50 different analysis tools, and a client application. Among the available analysis tools are clustering methods, principal component analysis, supervised classification including feature selection and cross-validation, multi-factorial ANOVA, statistical contrast calculations, and various post-processing tools for extracting data at given error rates or significance levels. On account of its open architecture, Gecko also allows for the integration of new algorithms. The Gecko framework is very general: non-Affymetrix and non-gene-expression data can be analyzed as well. A unique feature of the Gecko architecture is the concept of the Analysis Tree (actually a directed acyclic graph), in which all successive results in ongoing analyses are saved. This approach has proven invaluable in allowing a large (approximately 100 users) and distributed community to share results, and to return repeatedly over a span of years to older and potentially very complex analyses of gene expression data. The Gecko system is being made publicly available as free software at http://sourceforge.net/projects/geckoe. In totality or in parts, the Gecko framework should prove useful to users and system developers with a broad range of analysis needs.
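
    The Analysis Tree is described as a directed acyclic graph of saved results. The sketch below is a toy data structure illustrating that idea (result nodes carrying links to the parent analyses they derive from, so earlier steps can be revisited); it is a hypothetical illustration, not Gecko's actual implementation, and all names are invented.

    ```python
    from dataclasses import dataclass, field
    from typing import Any, List

    # Toy "analysis DAG" in the spirit of the Analysis Tree described above: every result
    # node records the tool that produced it and the parent results it derived from,
    # so any prior step in a long-running analysis can be revisited or branched from.

    @dataclass
    class AnalysisNode:
        name: str
        tool: str
        result: Any = None
        parents: List["AnalysisNode"] = field(default_factory=list)

        def provenance(self):
            """All upstream steps that led to this result (deduplicated)."""
            seen, stack, order = set(), list(self.parents), []
            while stack:
                node = stack.pop()
                if id(node) not in seen:
                    seen.add(id(node))
                    order.append(node)
                    stack.extend(node.parents)
            return order

    raw = AnalysisNode("scans_batch_7", tool="upload")
    norm = AnalysisNode("normalized", tool="rma", parents=[raw])
    anova = AnalysisNode("two_factor_anova", tool="anova", parents=[norm])
    clusters = AnalysisNode("kmeans_clusters", tool="clustering", parents=[norm])

    print([n.name for n in anova.provenance()])      # ['normalized', 'scans_batch_7']
    ```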

  2. Comprehensive helicopter analysis: A state of the art review

    NASA Technical Reports Server (NTRS)

    Johnson, W.

    1978-01-01

    An assessment of the status of helicopter theory and analysis is presented. The technology level embodied in available design tools (computer programs) is examined, considering the problem areas of performance, loads and vibration, handling qualities and simulation, and aeroelastic stability. The effectiveness of the present analyses is discussed. The characteristics of the technology in the analyses are reviewed, including the aerodynamics technology, induced velocity and wake geometry, dynamics technology, and machine limitations.

  3. Mitochondrial reactive oxygen species and complex II levels are associated with the outcome of hepatocellular carcinoma

    PubMed Central

    WU, JIANHUA; ZHAO, FEI; ZHAO, YUFEI; GUO, ZHANJUN

    2015-01-01

    In the present study, two oxidative stress parameters, reactive oxygen species (ROS) and mitochondrial respiratory complex II, were evaluated in the mitochondria of hepatocellular carcinoma (HCC) cells to determine the association between these parameters and the carcinogenesis and clinical outcome of HCC. High levels of ROS and low levels of complex II were found to be associated with reduced post-operative survival in HCC patients using the log-rank test. Furthermore, multivariate analysis confirmed that the levels of ROS [relative risk (RR)=2.867; 95% confidence interval (CI), 1.062–7.737; P=0.038] and complex II (RR=5.422; 95% CI, 1.273–23.088; P=0.022) were independent predictors for the survival of patients with HCC. Therefore, the analysis of ROS and complex II levels may provide a useful research and therapeutic tool for the prediction of HCC prognosis and treatment. PMID:26622849

  4. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    USGS Publications Warehouse

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest. Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales from the region to the stream segment and include the ability to predict species specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large and small scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools. 
Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.

  5. OISI dynamic end-to-end modeling tool

    NASA Astrophysics Data System (ADS)

    Kersten, Michael; Weidler, Alexander; Wilhelm, Rainer; Johann, Ulrich A.; Szerdahelyi, Laszlo

    2000-07-01

    The OISI Dynamic end-to-end modeling tool is tailored to end-to-end modeling and dynamic simulation of Earth- and space-based actively controlled optical instruments, such as optical stellar interferometers. 'End-to-end modeling' denotes the feature that the overall model comprises, besides optical sub-models, also structural, sensor, actuator, controller, and disturbance sub-models influencing the optical transmission, so that the system-level instrument performance under disturbances and active optics can be simulated. This tool has been developed to support performance analysis and prediction as well as control loop design and fine-tuning for OISI, Germany's preparatory program for optical/infrared spaceborne interferometry initiated in 1994 by Dornier Satellitensysteme GmbH in Friedrichshafen.

  6. Using Habitat Equivalency Analysis to Assess the Cost Effectiveness of Restoration Outcomes in Four Institutional Contexts

    NASA Astrophysics Data System (ADS)

    Scemama, Pierre; Levrel, Harold

    2016-01-01

    At the national level, with a fixed amount of resources available for public investment in the restoration of biodiversity, it is difficult to prioritize alternative restoration projects. One way to do this is to assess the level of ecosystem services delivered by these projects and to compare them with their costs. The challenge is to derive a common unit of measurement for ecosystem services in order to compare projects which are carried out in different institutional contexts having different goals (application of environmental laws, management of natural reserves, etc.). This paper assesses the use of habitat equivalency analysis (HEA) as a tool to evaluate ecosystem services provided by restoration projects developed in different institutional contexts. This tool was initially developed to quantify the level of ecosystem services required to compensate for non-market impacts coming from accidental pollution in the US. In this paper, HEA is used to assess the cost effectiveness of several restoration projects in relation to different environmental policies, using case studies based in France. Four case studies were used: the creation of a market for wetlands, public acceptance of a port development project, the rehabilitation of marshes to mitigate nitrate loading to the sea, and the restoration of streams in a protected area. Our main conclusion is that HEA can provide a simple tool to clarify the objectives of restoration projects, to compare the cost and effectiveness of these projects, and to carry out trade-offs, without requiring significant amounts of human or technical resources.
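
    HEA's core arithmetic is a standard discounted service-acre-year calculation: a compensatory project is sized so that discounted service gains equal discounted losses. The sketch below illustrates that calculation with arbitrary service trajectories, acreages, and a 3% discount rate; none of the numbers come from the case studies.

    ```python
    # Habitat equivalency analysis in its standard form: size a restoration project so
    # that discounted service-acre-years gained equal discounted service-acre-years lost.
    # All inputs below are arbitrary illustrations.

    def discounted_say(service_by_year, acres, rate=0.03, base_year=0):
        """Discounted service-acre-years for a per-acre service trajectory."""
        return sum(s * acres / (1 + rate) ** (year - base_year)
                   for year, s in service_by_year.items())

    # Injury: 10 acres lose 60% of their services for 20 years.
    debit = discounted_say({y: 0.6 for y in range(1, 21)}, acres=10)

    # Restoration: each restored acre ramps from 0% to 80% services over 5 years, then holds.
    gain_per_acre = {y: min(0.16 * y, 0.8) for y in range(1, 51)}
    credit_per_acre = discounted_say(gain_per_acre, acres=1)

    print(f"Required restoration: {debit / credit_per_acre:.1f} acres")
    ```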

  7. Using Habitat Equivalency Analysis to Assess the Cost Effectiveness of Restoration Outcomes in Four Institutional Contexts.

    PubMed

    Scemama, Pierre; Levrel, Harold

    2016-01-01

    At the national level, with a fixed amount of resources available for public investment in the restoration of biodiversity, it is difficult to prioritize alternative restoration projects. One way to do this is to assess the level of ecosystem services delivered by these projects and to compare them with their costs. The challenge is to derive a common unit of measurement for ecosystem services in order to compare projects which are carried out in different institutional contexts having different goals (application of environmental laws, management of natural reserves, etc.). This paper assesses the use of habitat equivalency analysis (HEA) as a tool to evaluate ecosystem services provided by restoration projects developed in different institutional contexts. This tool was initially developed to quantify the level of ecosystem services required to compensate for non-market impacts coming from accidental pollution in the US. In this paper, HEA is used to assess the cost effectiveness of several restoration projects in relation to different environmental policies, using case studies based in France. Four case studies were used: the creation of a market for wetlands, public acceptance of a port development project, the rehabilitation of marshes to mitigate nitrate loading to the sea, and the restoration of streams in a protected area. Our main conclusion is that HEA can provide a simple tool to clarify the objectives of restoration projects, to compare the cost and effectiveness of these projects, and to carry out trade-offs, without requiring significant amounts of human or technical resources.

  8. An automated genotyping tool for enteroviruses and noroviruses.

    PubMed

    Kroneman, A; Vennema, H; Deforche, K; v d Avoort, H; Peñaranda, S; Oberste, M S; Vinjé, J; Koopmans, M

    2011-06-01

    Molecular techniques are established as routine in virological laboratories and virus typing through (partial) sequence analysis is increasingly common. Quality assurance for the use of typing data requires harmonization of genotype nomenclature, and agreement on target genes, depending on the level of resolution required, and robustness of methods. To develop and validate web-based open-access typing-tools for enteroviruses and noroviruses. An automated web-based typing algorithm was developed, starting with BLAST analysis of the query sequence against a reference set of sequences from viruses in the family Picornaviridae or Caliciviridae. The second step is phylogenetic analysis of the query sequence and a sub-set of the reference sequences, to assign the enterovirus type or norovirus genotype and/or variant, with profile alignment, construction of phylogenetic trees and bootstrap validation. Typing is performed on VP1 sequences of Human enterovirus A to D, and ORF1 and ORF2 sequences of genogroup I and II noroviruses. For validation, we used the tools to automatically type sequences in the RIVM and CDC enterovirus databases and the FBVE norovirus database. Using the typing-tools, 785 (99%) of 795 Enterovirus VP1 sequences, and 8154 (98.5%) of 8342 norovirus sequences were typed in accordance with previously used methods. Subtyping into variants was achieved for 4439 (78.4%) of 5838 NoV GII.4 sequences. The online typing-tools reliably assign genotypes for enteroviruses and noroviruses. The use of phylogenetic methods makes these tools robust to ongoing evolution. This should facilitate standardized genotyping and nomenclature in clinical and public health laboratories, thus supporting inter-laboratory comparisons. Copyright © 2011 Elsevier B.V. All rights reserved.
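
    The real tool runs BLAST against curated references and then confirms assignments by phylogenetic analysis with bootstrap support; neither step fits in a few lines. As a loosely analogous, hypothetical stand-in only, the sketch below assigns a query to the reference with the highest shared k-mer (Jaccard) similarity and accepts the call only above a threshold. The reference sequences and threshold are invented and the method is plainly not the published algorithm.

    ```python
    # Hypothetical stand-in for the two-step typing workflow described above:
    # step 1 (BLAST against references) is approximated by shared k-mer similarity,
    # and step 2 (phylogenetic confirmation) by a simple acceptance threshold.
    # Reference sequences and the threshold are illustrative, not from the real tool.

    def kmers(seq, k=6):
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    def assign_type(query, references, k=6, min_similarity=0.5):
        scores = {}
        q = kmers(query, k)
        for name, ref in references.items():
            r = kmers(ref, k)
            scores[name] = len(q & r) / max(len(q | r), 1)   # Jaccard similarity
        best = max(scores, key=scores.get)
        return (best, scores[best]) if scores[best] >= min_similarity else (None, scores[best])

    REFERENCES = {"EV-A71_VP1": "ATGGGAAGCCAGGTTTCAACACAACGATCC",
                  "CVA16_VP1":  "ATGGGGAGTCAAGTCTCAACGCAGCGCTCT"}
    print(assign_type("ATGGGAAGCCAGGTTTCAACACAACGTTCC", REFERENCES))
    ```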

  9. A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis

    PubMed Central

    Down, Thomas A.; Rakyan, Vardhman K.; Turner, Daniel J.; Flicek, Paul; Li, Heng; Kulesha, Eugene; Gräf, Stefan; Johnson, Nathan; Herrero, Javier; Tomazou, Eleni M.; Thorne, Natalie P.; Bäckdahl, Liselotte; Herberth, Marlis; Howe, Kevin L.; Jackson, David K.; Miretti, Marcos M.; Marioni, John C.; Birney, Ewan; Hubbard, Tim J. P.; Durbin, Richard; Tavaré, Simon; Beck, Stephan

    2009-01-01

    DNA methylation is an indispensable epigenetic modification of mammalian genomes. Consequently there is great interest in strategies for genome-wide/whole-genome DNA methylation analysis, and immunoprecipitation-based methods have proven to be a powerful option. Such methods are rapidly shifting the bottleneck from data generation to data analysis, necessitating the development of better analytical tools. Until now, a major analytical difficulty associated with immunoprecipitation-based DNA methylation profiling has been the inability to estimate absolute methylation levels. Here we report the development of a novel cross-platform algorithm – Bayesian Tool for Methylation Analysis (Batman) – for analyzing Methylated DNA Immunoprecipitation (MeDIP) profiles generated using arrays (MeDIP-chip) or next-generation sequencing (MeDIP-seq). The latter is an approach we have developed to elucidate the first high-resolution whole-genome DNA methylation profile (DNA methylome) of any mammalian genome. MeDIP-seq/MeDIP-chip combined with Batman represent robust, quantitative, and cost-effective functional genomic strategies for elucidating the function of DNA methylation. PMID:18612301

  10. Use of LEED, Auger emission spectroscopy and field ion microscopy in microstructural studies

    NASA Technical Reports Server (NTRS)

    Ferrante, J.; Buckley, D. H.; Pepper, S. V.; Brainard, W. A.

    1972-01-01

    Surface research tools such as LEED, Auger emission spectroscopy analysis, and field ion microscopy are discussed, and examples of their use in studying adhesion, friction, wear, and lubrication are presented. These tools have provided considerable insight into the basic nature of solid surface interactions. The transfer of metals from one surface to another at the atomic level has been observed and studied with each of these devices. The field ion microscope has been used to study polymer-metal interactions, and Auger analysis to study the mechanism of polymer adhesion to metals. LEED and Auger analysis have identified surface segregation of alloying elements and indicated the influence of these elements in metallic adhesion. LEED and Auger analysis have also assisted in adsorption studies by determining the structural arrangement and quantity of adsorbed species present, making it possible to understand the influence of these species on adhesion. These devices are furthering the understanding of the fundamental mechanisms involved in adhesion, friction, wear, and lubrication processes.

  11. Sentiments analysis at conceptual level making use of the Narrative Knowledge Representation Language.

    PubMed

    Zarri, Gian Piero

    2014-10-01

    This paper illustrates some of the knowledge representation structures and inference procedures proper to a high-level, fully implemented conceptual language, NKRL (Narrative Knowledge Representation Language). The aim is to show how these tools can be used to deal, in a sentiment analysis/opinion mining context, with some common types of human (and non-human) "behaviors". These behaviors correspond, in particular, to the concrete, mutual relationships among human and non-human characters that can be expressed under the form of non-fictional and real-time "narratives" (i.e., as logically and temporally structured sequences of "elementary events"). Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Identifying poor performance among doctors in NHS organizations.

    PubMed

    Locke, Rachel; Scallan, Samantha; Leach, Camilla; Rickenbach, Mark

    2013-10-01

    This study sought to account for the means by which poor performance among career doctors is identified by National Health Service organizations, whether the tools used are considered effective, and how these processes may be strengthened in the light of revalidation and the requirement for doctors to demonstrate their fitness to practice. The study looked beyond the 'doctor as individual': as well as considering the typical approaches to managing the practice of an individual, the systems within which the doctor works were reviewed, as these are also relevant to standards of performance. A qualitative review was undertaken, consisting of a literature review of current practice, a policy review of current documentation from 15 trusts in one deanery locality, and 14 semi-structured interviews with respondents with an overview of the processes in use. The framework for the analysis of the data considered tools at three levels: individual, team, and organizational. Tools are, in the main, reactive, with an individual focus. They rely on colleagues and others to speak out, so their effectiveness is hindered by a reluctance to do so. Tools can lack an evidence base for their use, and there is limited linking of data across contexts and tools. There is more work to be done in evaluating current tools and developing stronger processes. Linkage between data sources needs to be improved, and proactive tools at the organizational level need further development to help with the early identification of performance issues. This would also assist in balancing a wider systems approach with the current overemphasis on individual doctors. © 2012 John Wiley & Sons Ltd.

  13. Human error analysis of commercial aviation accidents: application of the Human Factors Analysis and Classification system (HFACS).

    PubMed

    Wiegmann, D A; Shappell, S A

    2001-11-01

    The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based on Reason's (1990) model of latent and active failures, HFACS addresses human error at all levels of the system, including the condition of aircrew and organizational factors. The purpose of the present study was to assess the utility of the HFACS framework as an error analysis and classification tool outside the military. The HFACS framework was used to analyze human error data associated with aircrew-related commercial aviation accidents that occurred between January 1990 and December 1996 using database records maintained by the NTSB and the FAA. Investigators were able to reliably accommodate all the human causal factors associated with the commercial aviation accidents examined in this study using the HFACS system. In addition, the classification of data using HFACS highlighted several critical safety issues in need of intervention research. These results demonstrate that the HFACS framework can be a viable tool for use within the civil aviation arena. However, additional research is needed to examine its applicability to areas outside the flight deck, such as aircraft maintenance and air traffic control domains.

  14. A guide to understanding meta-analysis.

    PubMed

    Israel, Heidi; Richter, Randy R

    2011-07-01

    With the focus on evidence-based practice in healthcare, a well-conducted systematic review that includes a meta-analysis where indicated represents a high level of evidence for treatment effectiveness. The purpose of this commentary is to assist clinicians in understanding meta-analysis as a statistical tool, using both published articles and explanations of the components of the technique. We describe what meta-analysis is; what heterogeneity is and how it affects meta-analysis; effect size; the modeling techniques of meta-analysis; and the strengths and weaknesses of meta-analysis. Also included are common components such as forest plot interpretation and the software that may be used; special cases of meta-analysis, such as subgroup analysis, individual patient data, and meta-regression; and a discussion of criticisms.
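
    As a worked illustration of the pooling models such a guide covers, the sketch below applies inverse-variance fixed-effect pooling and the DerSimonian-Laird random-effects model to hypothetical study effect sizes; all numbers are made up.

    ```python
    import math

    # Inverse-variance fixed-effect pooling and DerSimonian-Laird random effects,
    # applied to hypothetical effect sizes (e.g., mean differences) and standard errors.

    effects = [0.30, 0.10, 0.45, 0.22]        # made-up study effect sizes
    ses     = [0.12, 0.15, 0.20, 0.10]        # made-up standard errors

    w = [1 / se ** 2 for se in ses]                               # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)

    # Heterogeneity: Cochran's Q and the DerSimonian-Laird estimate of tau^2.
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)

    w_re = [1 / (se ** 2 + tau2) for se in ses]                   # random-effects weights
    random_eff = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se_re = math.sqrt(1 / sum(w_re))

    print(f"Fixed effect:  {fixed:.3f}")
    print(f"Random effect: {random_eff:.3f} (95% CI {random_eff - 1.96*se_re:.3f} "
          f"to {random_eff + 1.96*se_re:.3f}), tau^2 = {tau2:.3f}")
    ```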

  15. Family-Based Benchmarking of Copy Number Variation Detection Software.

    PubMed

    Nutsua, Marcel Elie; Fischer, Annegret; Nebel, Almut; Hofmann, Sylvia; Schreiber, Stefan; Krawczak, Michael; Nothnagel, Michael

    2015-01-01

    The analysis of structural variants, in particular of copy-number variations (CNVs), has proven valuable in unraveling the genetic basis of human diseases. Hence, a large number of algorithms have been developed for the detection of CNVs in SNP array signal intensity data. Using the European and African HapMap trio data, we undertook a comparative evaluation of six commonly used CNV detection software tools, namely Affymetrix Power Tools (APT), QuantiSNP, PennCNV, GLAD, R-gada and VEGA, and assessed their level of pair-wise prediction concordance. The tool-specific CNV prediction accuracy was assessed in silico by way of intra-familial validation. Software tools differed greatly in terms of the number and length of the CNVs predicted as well as the number of markers included in a CNV. All software tools predicted substantially more deletions than duplications. Intra-familial validation revealed consistently low levels of prediction accuracy as measured by the proportion of validated CNVs (34-60%). Moreover, up to 20% of apparent family-based validations were found to be due to chance alone. Software using Hidden Markov models (HMM) showed a trend to predict fewer CNVs than segmentation-based algorithms albeit with greater validity. PennCNV yielded the highest prediction accuracy (60.9%). Finally, the pairwise concordance of CNV prediction was found to vary widely with the software tools involved. We recommend HMM-based software, in particular PennCNV, rather than segmentation-based algorithms when validity is the primary concern of CNV detection. QuantiSNP may be used as an additional tool to detect sets of CNVs not detectable by the other tools. Our study also reemphasizes the need for laboratory-based validation, such as qPCR, of CNVs predicted in silico.

  16. Evaluating biomarkers for prognostic enrichment of clinical trials.

    PubMed

    Kerr, Kathleen F; Roth, Jeremy; Zhu, Kehao; Thiessen-Philbrook, Heather; Meisner, Allison; Wilson, Francis Perry; Coca, Steven; Parikh, Chirag R

    2017-12-01

    A potential use of biomarkers is to assist in the prognostic enrichment of clinical trials, in which only patients at relatively higher risk for an outcome of interest are eligible for the trial. We investigated methods for evaluating biomarkers for prognostic enrichment. We identified five key considerations when evaluating a biomarker and a screening threshold for prognostic enrichment: (1) clinical trial sample size, (2) calendar time to enroll the trial, (3) total patient screening costs and total per-patient trial costs, (4) generalizability of trial results, and (5) ethical evaluation of trial eligibility criteria. Items (1)-(3) are amenable to quantitative analysis. We developed the Biomarker Prognostic Enrichment Tool (BPET) for evaluating biomarkers for prognostic enrichment at varying levels of screening stringency, and we demonstrate that both modestly prognostic and strongly prognostic biomarkers can improve trial metrics using this tool. BPET is available as a webtool at http://prognosticenrichment.com and as a package for the R statistical computing platform. In some clinical settings, even biomarkers with modest prognostic performance can be useful for prognostic enrichment. In addition to the quantitative analysis provided by BPET, investigators must consider the generalizability of trial results and evaluate the ethics of trial eligibility criteria.
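
    The tool's exact calculations live in the cited R package and webtool. As a simplified, hypothetical illustration of why enrichment shrinks a trial, the sketch below computes the per-arm sample size for comparing two event proportions before and after restricting enrollment to a higher-risk subgroup; the event rates, effect size, alpha, and power are arbitrary assumptions.

    ```python
    import math
    from statistics import NormalDist

    # Simplified illustration of prognostic enrichment (not the published tool itself):
    # restricting enrollment to higher-risk patients raises the control event rate,
    # which shrinks the per-arm sample size needed to detect the same relative effect.

    def n_per_arm(p_control, relative_risk_reduction, alpha=0.05, power=0.9):
        """Per-arm sample size for comparing two proportions (normal approximation)."""
        p_treat = p_control * (1 - relative_risk_reduction)
        z_a = NormalDist().inv_cdf(1 - alpha / 2)
        z_b = NormalDist().inv_cdf(power)
        var = p_control * (1 - p_control) + p_treat * (1 - p_treat)
        return math.ceil((z_a + z_b) ** 2 * var / (p_control - p_treat) ** 2)

    print("Unselected population (10% event rate): n/arm =", n_per_arm(0.10, 0.25))
    print("Enriched population (25% event rate):   n/arm =", n_per_arm(0.25, 0.25))
    ```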

  17. Validation of the Italian version of the Stanford Presenteeism Scale in nurses.

    PubMed

    Cicolini, Giancarlo; Della Pelle, Carlo; Cerratti, Francesca; Franza, Marcello; Flacco, Maria E

    2016-07-01

    The aim of this study was to ascertain the validity and reliability of the Italian version of the Stanford Presenteeism Scale (SPS-6). Presenteeism has been associated with reduced work productivity, lower quality of work, and an increased risk of developing health disorders; it is particularly high among nurses, and valid tools are needed to assess it. A validation study was carried out from July to September 2014. A three-section tool, made up of a demographic form, the Stanford Presenteeism Scale (SPS-6), and the Perceived Stress Scale (PSS-10), was administered to a sample of nurses enrolled in three Italian hospitals. Cronbach's α for the entire sample (229 nurses) was found to be 0.72. A significant negative correlation between SPS and perceived stress scores evidenced external validity. Factor analysis showed a two-component solution accounting for 71.2% of the variance, and confirmatory factor analysis showed an adequate fit. The Italian SPS-6 is a valid and reliable tool for workplace surveys. Since the validity and reliability of the Italian SPS-6 have been confirmed, we now have a valid tool to measure the levels of presenteeism among Italian nurses. © 2016 John Wiley & Sons Ltd.
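
    As a minimal illustration of the internal-consistency statistic reported here, the sketch below computes Cronbach's alpha from an item-response matrix; the responses are fabricated for demonstration and are not the study's data.

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars / total_var)

    # Fabricated 5-point responses from 6 respondents on 6 items (illustrative only).
    responses = [[4, 3, 4, 5, 4, 4],
                 [2, 2, 3, 2, 3, 2],
                 [5, 4, 4, 5, 5, 4],
                 [3, 3, 2, 3, 3, 3],
                 [4, 4, 5, 4, 4, 5],
                 [1, 2, 2, 1, 2, 2]]
    print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
    ```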

  18. Improvements in analysis techniques for segmented mirror arrays

    NASA Astrophysics Data System (ADS)

    Michels, Gregory J.; Genberg, Victor L.; Bisson, Gary R.

    2016-08-01

    The employment of actively controlled segmented mirror architectures has become increasingly common in the development of current astronomical telescopes. Optomechanical analysis of such hardware presents unique issues compared to that of monolithic mirror designs. The work presented here is a review of current capabilities and improvements in the methodology for analyzing mechanically induced surface deformation of such systems. The recent improvements include the capability to differentiate surface deformation at the array and segment levels. This differentiation, which allows surface deformation analysis at the level of each individual segment, offers useful insight into the mechanical behavior of the segments that is unavailable from analysis solely at the parent array level. In addition, the capability to characterize the full displacement-vector deformation of collections of points allows prediction of mechanical disturbances at assembly interfaces relative to other assembly interfaces. This capability, called racking analysis, allows engineers to develop designs for segment-to-segment phasing performance in assembly integration, 0g release, and thermal stability of operation. The performance predicted by racking analysis has the advantage of being comparable to the measurements used in assembly of the hardware. Approaches to all of the above issues are presented and demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
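
    SigFit's algorithms are not described here; the sketch below only illustrates the generic idea behind separating array-level rigid-body motion from segment-level elastic deformation, by least-squares fitting and removing a plane (piston/tip/tilt) from one segment's surface displacements. The sample data are synthetic.

    ```python
    import numpy as np

    def remove_piston_tip_tilt(x, y, dz):
        """Least-squares fit of a plane dz ~ p + a*x + b*y and the residual after removal.

        The fitted plane represents segment rigid-body motion (piston/tip/tilt);
        the residual approximates the segment-level elastic surface deformation.
        """
        A = np.column_stack([np.ones_like(x), x, y])
        coeffs, *_ = np.linalg.lstsq(A, dz, rcond=None)
        return dz - A @ coeffs, coeffs

    # Synthetic segment: a tilted plane (rigid-body motion) plus a small bump (elastic).
    rng = np.random.default_rng(0)
    x, y = rng.uniform(-1, 1, 500), rng.uniform(-1, 1, 500)
    dz = 2e-6 + 5e-6 * x - 3e-6 * y + 1e-7 * np.exp(-(x**2 + y**2) / 0.1)

    residual, (piston, tip, tilt) = remove_piston_tip_tilt(x, y, dz)
    print(f"piston={piston:.2e}, tip={tip:.2e}, tilt={tilt:.2e}")
    print(f"RMS segment-level deformation: {residual.std():.2e}")
    ```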

  19. A modelling tool for policy analysis to support the design of efficient and effective policy responses for complex public health problems.

    PubMed

    Atkinson, Jo-An; Page, Andrew; Wells, Robert; Milat, Andrew; Wilson, Andrew

    2015-03-03

    In the design of public health policy, a broader understanding of risk factors for disease across the life course, and an increasing awareness of the social determinants of health, has led to the development of more comprehensive, cross-sectoral strategies to tackle complex problems. However, comprehensive strategies may not represent the most efficient or effective approach to reducing disease burden at the population level. Rather, they may act to spread finite resources less intensively over a greater number of programs and initiatives, diluting the potential impact of the investment. While analytic tools are available that use research evidence to help identify and prioritise disease risk factors for public health action, they are inadequate to support more targeted and effective policy responses for complex public health problems. This paper discusses the limitations of analytic tools that are commonly used to support evidence-informed policy decisions for complex problems. It proposes an alternative policy analysis tool which can integrate diverse evidence sources and provide a platform for virtual testing of policy alternatives in order to design solutions that are efficient, effective, and equitable. The case of suicide prevention in Australia is presented to demonstrate the limitations of current tools to adequately inform prevention policy and discusses the utility of the new policy analysis tool. In contrast to popular belief, a systems approach takes a step beyond comprehensive thinking and seeks to identify where best to target public health action and resources for optimal impact. It is concerned primarily with what can be reasonably left out of strategies for prevention and can be used to explore where disinvestment may occur without adversely affecting population health (or equity). Simulation modelling used for policy analysis offers promise in being able to better operationalise research evidence to support decision making for complex problems, improve targeting of public health policy, and offers a foundation for strengthening relationships between policy makers, stakeholders, and researchers.

  20. Using Microsoft PowerPoint as an Astronomical Image Analysis Tool

    NASA Astrophysics Data System (ADS)

    Beck-Winchatz, Bernhard

    2006-12-01

    Engaging students in the analysis of authentic scientific data is an effective way to teach them about the scientific process and to develop their problem solving, teamwork and communication skills. In astronomy several image processing and analysis software tools have been developed for use in school environments. However, the practical implementation in the classroom is often difficult because the teachers may not have the comfort level with computers necessary to install and use these tools, they may not have adequate computer privileges and/or support, and they may not have the time to learn how to use specialized astronomy software. To address this problem, we have developed a set of activities in which students analyze astronomical images using basic tools provided in PowerPoint. These include measuring sizes, distances, and angles, and blinking images. In contrast to specialized software, PowerPoint is broadly available on school computers. Many teachers are already familiar with PowerPoint, and the skills developed while learning how to analyze astronomical images are highly transferable. We will discuss several practical examples of measurements, including the following: variations in the distances to the Sun and Moon from their angular sizes; magnetic declination from images of shadows; the diameter of the Moon from lunar eclipse images; sizes of lunar craters; orbital radii of the Jovian moons and the mass of Jupiter; supernova and comet searches; and the expansion rate of the universe from images of distant galaxies.
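
    As a worked example of the kind of measurement these activities target, the sketch below recovers a distance from an object's known physical size and its measured angular size using the small-angle approximation; the measured angle is an illustrative classroom value, not data from the abstract.

    ```python
    import math

    # Distance from angular size: for small angles, distance ~= diameter / angle(radians).
    # The angular size below is an illustrative classroom measurement, not real data.

    def distance_from_angular_size(diameter_km, angle_deg):
        return diameter_km / math.radians(angle_deg)

    moon_diameter_km = 3474.8
    measured_angle_deg = 0.52          # roughly what students measure from an image
    print(f"Estimated Earth-Moon distance: "
          f"{distance_from_angular_size(moon_diameter_km, measured_angle_deg):,.0f} km")
    ```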

  1. Exploratory Analysis of Carbon Dioxide Levels, Ultrasound and Optical Coherence Tomography Measures of the Eye During ISS Missions

    NASA Technical Reports Server (NTRS)

    Schaefer, C.; Coble, C.; Mason, S.; Young, M.; Wear, M. L.; Sargsyan, A.; Garcia, K.; Patel, N.; Gibson, C.; Alexander, D.

    2017-01-01

    Carbon dioxide (CO2) levels on board the International Space Station (ISS) have typically averaged 2.3 to 5.3 mmHg, with large fluctuations occurring over periods of hours and days. CO2 has effects on cerebral vascular tone, resulting in vasodilation and alteration of cerebral blood flow (CBF). Increased CBF leads to elevated intracranial pressure (ICP), a factor leading to visual disturbances, headaches, and other central nervous system symptoms. Ultrasound of the optic nerve and optical coherence tomography (OCT) provide surrogate measurements of ICP; in-flight measurements of both were implemented as enhanced screening tools for the Visual Impairment/Intracranial Pressure (VIIP) syndrome. This analysis examines the relationships between ambient CO2 levels on ISS, ultrasound and OCT measures of the eye in an effort to understand how CO2 may possibly be associated with VIIP and to inform future analysis of in-flight VIIP data.

  2. Quantitative molecular analysis in mantle cell lymphoma.

    PubMed

    Brízová, H; Hilská, I; Mrhalová, M; Kodet, R

    2011-07-01

    Molecular analysis has three major roles in modern oncopathology: as an aid in differential diagnosis, in molecular monitoring of disease, and in estimation of the potential prognosis. In this report we review the application of molecular analysis in a group of patients with mantle cell lymphoma (MCL). We demonstrate that detection of the cyclin D1 mRNA level is a molecular marker in 98% of patients with MCL. Quantitative monitoring of cyclin D1 is specific and sensitive both for the differential diagnosis and for molecular monitoring of the disease in the bone marrow; moreover, the dynamics of cyclin D1 in bone marrow reflects the development of the disease and predicts its clinical course. We also employed molecular analysis for precise quantitative detection of the proliferation markers Ki-67, topoisomerase IIalpha, and TPX2, which are described as effective prognostic factors. Using the molecular approach it is possible to measure the proliferation rate in a reproducible, standardized way, which is an essential prerequisite for using proliferation activity as a routine clinical tool. Compared with immunophenotyping, the quantitative PCR-based analysis is a useful, reliable, rapid, reproducible, sensitive, and specific method that broadens our diagnostic tools in hematopathology. In comparison to interphase FISH in paraffin sections, quantitative PCR is less technically demanding, less time-consuming, and more sensitive in detecting small changes in the mRNA level. Moreover, quantitative PCR is the only technology that provides precise and reproducible quantitative information about the expression level. Therefore it may be used to demonstrate a decrease or increase of a tumor-specific marker in bone marrow in comparison with a previously aspirated specimen, and thus has a powerful potential to monitor the course of the disease in correlation with clinical data.
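
    The record does not give the quantification math. A common approach for relative mRNA quantification by real-time PCR is the 2^(-ddCt) method, sketched below with fabricated Ct values; the study's actual strategy may differ (for example, standard-curve-based absolute quantification).

    ```python
    # Relative quantification of a target transcript (e.g., cyclin D1) by the common
    # 2^(-ddCt) method. Ct values below are fabricated for illustration; the study's
    # actual quantification strategy may differ (e.g., standard curves).

    def relative_expression(ct_target, ct_reference, ct_target_cal, ct_reference_cal):
        d_ct_sample = ct_target - ct_reference          # normalize to a housekeeping gene
        d_ct_calibrator = ct_target_cal - ct_reference_cal
        return 2 ** -(d_ct_sample - d_ct_calibrator)    # fold change vs. calibrator

    # Follow-up bone marrow vs. diagnostic (calibrator) specimen, fabricated Ct values.
    fold_change = relative_expression(ct_target=24.1, ct_reference=18.0,
                                      ct_target_cal=22.0, ct_reference_cal=18.2)
    print(f"Cyclin D1 level relative to diagnosis: {fold_change:.2f}x")
    ```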

  3. Stream Splitting in Support of Intrusion Detection

    DTIC Science & Technology

    2003-06-01

    ... increased. Every computer on the Internet has no need to see the traffic of every other computer on the Internet. Indeed, if this was so, nothing would get ... distinguishes the stream splitter from other network analysis tools. B. HIGH LEVEL DESIGN: To get the desired level of performance, a multi-threaded ... of greater concern than added accuracy of a Bayesian model. This is a case where close is good enough. b. Passive Sensors: Though similar to active ...

  4. In vitro and in vivo analysis and characterization of engineered spinal neural implants (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Shor, Erez; Shoham, Shy; Levenberg, Shulamit

    2016-03-01

    Spinal cord injury is a devastating medical condition. Recent developments in pre-clinical and clinical research have started to yield neural implants that induce functional recovery after spinal cord transection injury. However, the functional performance of the transplants has been assessed using histology and behavioral experiments, which cannot capture cell dynamics or the therapeutic response. Here, we use neurophotonic tools and optogenetic probes to investigate the morphology and activity characteristics of neural implants over time at the cellular level. These methods were used in vitro and in vivo, in a mouse spinal cord injury implant model. Following previous attempts to induce recovery after spinal cord injury, we engineered a pre-vascularized implant to obtain better functional performance. To image the network activity of a construct implanted in a mouse spinal cord, we transfected the implant to express GCaMP6 calcium activity indicators and implanted these constructs under a spinal cord chamber enabling chronic two-photon in vivo neural activity imaging. Image processing software for activity and morphology analysis was developed to automatically quantify the behavior of the neural and vascular networks. Our experimental results and analyses demonstrate that vascularized and non-vascularized constructs exhibit very different morphologic and activity patterns at the cellular level. This work enables further optimization of neural implants and also provides valuable tools for continuous cellular-level monitoring and evaluation of transplants designed for various neurodegenerative disease models.

  5. Internet-based Modeling, Mapping, and Analysis for the Greater Everglades (IMMAGE; Version 1.0): web-based tools to assess the impact of sea level rise in south Florida

    USGS Publications Warehouse

    Hearn, Paul; Strong, David; Swain, Eric; Decker, Jeremy

    2013-01-01

    South Florida's Greater Everglades area is particularly vulnerable to sea level rise, due to its rich endowment of animal and plant species and its heavily populated urban areas along the coast. Rising sea levels are expected to have substantial impacts on inland flooding, the depth and extent of surge from coastal storms, the degradation of water supplies by saltwater intrusion, and the integrity of plant and animal habitats. Planners and managers responsible for mitigating these impacts require advanced tools to help them more effectively identify areas at risk. The U.S. Geological Survey's (USGS) Internet-based Modeling, Mapping, and Analysis for the Greater Everglades (IMMAGE) Web site has been developed to address these needs by providing more convenient access to projections from models that forecast the effects of sea level rise on surface water and groundwater, the extent of surge and resulting economic losses from coastal storms, and the distribution of habitats. IMMAGE not only provides an advanced geographic information system (GIS) interface to support decision making, but also includes topic-based modules that explain and illustrate key concepts for nontechnical users. The purpose of this report is to familiarize both technical and nontechnical users with the IMMAGE Web site and its various applications.

  6. Geostatistical applications in environmental remediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, R.N.; Purucker, S.T.; Lyon, B.F.

    1995-02-01

    Geostatistical analysis refers to a collection of statistical methods for addressing data that vary in space. By incorporating spatial information into the analysis, geostatistics has advantages over traditional statistical analysis for problems with a spatial context. Geostatistics has a history of success in earth science applications, and its popularity is increasing in other areas, including environmental remediation. Due to recent advances in computer technology, geostatistical algorithms can be executed at a speed comparable to many standard statistical software packages. When used responsibly, geostatistics is a systematic and defensible tool that can be used in various decision frameworks, such as the Data Quality Objectives (DQO) process. At every point in the site, geostatistics can estimate both the concentration level and the probability or risk of exceeding a given value. Using these probability maps can assist in identifying clean-up zones. Given any decision threshold and an acceptable level of risk, the probability maps identify those areas that are estimated to be above or below the acceptable risk. Those areas that are above the threshold are of the most concern with regard to remediation. In addition to estimating clean-up zones, geostatistics can assist in designing cost-effective secondary sampling schemes: those areas of the probability map with high levels of estimated uncertainty are areas where more secondary sampling should occur. In addition, geostatistics has the ability to incorporate soft data directly into the analysis; such data include historical records, a highly correlated secondary contaminant, or expert judgment. Geostatistics is thus a tool that, in conjunction with other methods, can provide a common forum for building consensus in environmental remediation.
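
    As a minimal sketch of the probability-of-exceedance maps described above, the snippet below converts a kriging estimate and kriging variance at each location into the probability of exceeding a clean-up threshold, assuming Gaussian estimation errors; the grid values and threshold are synthetic stand-ins for real kriging output.

    ```python
    from statistics import NormalDist

    # Probability of exceedance from kriging output, assuming Gaussian estimation error:
    # P(Z > threshold) = 1 - Phi((threshold - estimate) / sqrt(kriging variance)).
    # Estimates and variances below are synthetic stand-ins for real kriging results.

    def exceedance_probability(estimate, kriging_variance, threshold):
        return 1 - NormalDist(mu=estimate, sigma=kriging_variance ** 0.5).cdf(threshold)

    threshold = 50.0          # hypothetical clean-up level (e.g., mg/kg)
    grid = [  # (x, y, kriged estimate, kriging variance)
        (0, 0, 62.0, 40.0), (0, 1, 48.0, 25.0), (1, 0, 35.0, 90.0), (1, 1, 51.0, 10.0),
    ]
    for x, y, est, var in grid:
        p = exceedance_probability(est, var, threshold)
        flag = "remediate" if p > 0.5 else "ok"
        print(f"cell ({x},{y}): P(exceed) = {p:.2f} -> {flag}")
    ```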

  7. Quantifying Individual Brain Connectivity with Functional Principal Component Analysis for Networks.

    PubMed

    Petersen, Alexander; Zhao, Jianyang; Carmichael, Owen; Müller, Hans-Georg

    2016-09-01

    In typical functional connectivity studies, connections between voxels or regions in the brain are represented as edges in a network. Networks for different subjects are constructed at a given graph density and are summarized by some network measure such as path length. Examining these summary measures for many density values yields samples of connectivity curves, one for each individual. This has led to the adoption of basic tools of functional data analysis, most commonly to compare control and disease groups through the average curves in each group. Such group comparisons, however, neglect the variability in the sample of connectivity curves. In this article, the use of functional principal component analysis (FPCA) is demonstrated to enrich functional connectivity studies by providing increased power and flexibility for statistical inference. Specifically, individual connectivity curves are related to individual characteristics such as age and measures of cognitive function, thus providing a tool to relate brain connectivity with these variables at the individual level. This individual-level analysis opens a new perspective that goes beyond previous group-level comparisons. Using a large data set of resting-state functional magnetic resonance imaging scans, relationships between connectivity and two measures of cognitive function, episodic memory and executive function, were investigated. The group-based approach was implemented by dichotomizing the continuous cognitive variable and testing for group differences, resulting in no statistically significant findings. To demonstrate the new approach, FPCA was implemented, followed by linear regression models with cognitive scores as responses, identifying significant associations of connectivity in the right middle temporal region with both cognitive scores.
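
    The study's pipeline is more involved, but the core step can be sketched: FPCA on connectivity curves (one network summary per graph density, per subject) followed by regression of the first FPC score on a cognitive score. Everything below is simulated and the discretized SVD-based FPCA is a simplification of the authors' method.

    ```python
    import numpy as np

    # FPCA on simulated connectivity curves, then a simple regression of the first
    # FPC score on a cognitive score. All data are simulated for illustration.

    rng = np.random.default_rng(42)
    n_subjects, densities = 100, np.linspace(0.05, 0.5, 30)

    # Simulate curves whose overall level varies across subjects.
    subject_level = rng.normal(0, 1, n_subjects)
    curves = (2.0 - densities[None, :]                       # common mean shape
              + subject_level[:, None] * 0.3                 # subject-specific shift
              + rng.normal(0, 0.05, (n_subjects, len(densities))))

    # FPCA via SVD of the centered curves (discretized functional PCA).
    centered = curves - curves.mean(axis=0)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    scores = u * s                                           # FPC scores per subject
    explained = s ** 2 / np.sum(s ** 2)

    # A cognitive score correlated with the subject-level shift, plus noise.
    cognition = 0.8 * subject_level + rng.normal(0, 0.5, n_subjects)
    slope, intercept = np.polyfit(scores[:, 0], cognition, 1)
    r = np.corrcoef(scores[:, 0], cognition)[0, 1]

    print(f"Variance explained by FPC1: {explained[0]:.2f}")
    print(f"Cognition ~ FPC1: slope={slope:.2f}, r={r:.2f}")
    ```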

  8. Early visual analysis tool using magnetoencephalography for treatment and recovery of neuronal dysfunction.

    PubMed

    Rasheed, Waqas; Neoh, Yee Yik; Bin Hamid, Nor Hisham; Reza, Faruque; Idris, Zamzuri; Tang, Tong Boon

    2017-10-01

    Functional neuroimaging modalities play an important role in deciding the diagnosis and course of treatment of neuronal dysfunction and degeneration. This article presents an analytical tool with visualization by exploiting the strengths of the MEG (magnetoencephalographic) neuroimaging technique. The tool automates MEG data import (in tSSS format), channel information extraction, time/frequency decomposition, and circular graph visualization (connectogram) for simple result inspection. For advanced users, the tool also provides magnitude squared coherence (MSC) values allowing personalized threshold levels, and the computation of a default model from MEG data of a control population. The default model obtained from healthy population data serves as a useful benchmark to diagnose and monitor neuronal recovery during treatment. The proposed tool further provides optional labels with international 10-10 system nomenclature in order to facilitate comparison studies with EEG (electroencephalography) sensor space. Potential applications in epilepsy and traumatic brain injury studies are also discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
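
    A minimal sketch of the magnitude squared coherence (MSC) step described above: band-averaged MSC is computed for each sensor pair with SciPy and thresholded into an adjacency matrix of the kind a connectogram would display. The sampling rate, channel count, frequency band, and threshold are hypothetical placeholders, and the code is not the published tool.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(1)
fs, n_samples, n_ch = 1000.0, 10_000, 6      # hypothetical sampling rate / channel count
data = rng.normal(size=(n_ch, n_samples))    # stand-in for MEG sensor time series

band = (8.0, 13.0)                           # e.g. the alpha band
threshold = 0.5                              # user-chosen (personalized) MSC threshold

adj = np.zeros((n_ch, n_ch))
for i in range(n_ch):
    for j in range(i + 1, n_ch):
        f, Cxy = coherence(data[i], data[j], fs=fs, nperseg=1024)
        in_band = (f >= band[0]) & (f <= band[1])
        msc = Cxy[in_band].mean()            # band-averaged magnitude squared coherence
        adj[i, j] = adj[j, i] = msc

edges = np.argwhere(np.triu(adj, 1) > threshold)   # connections kept for a connectogram
print(edges)
```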

  9. Single-cell analysis by ICP-MS/MS as a fast tool for cellular bioavailability studies of arsenite.

    PubMed

    Meyer, S; López-Serrano, A; Mitze, H; Jakubowski, N; Schwerdtle, T

    2018-01-24

    Single-cell inductively coupled plasma mass spectrometry (SC-ICP-MS) has become a powerful and fast tool to evaluate the elemental composition at the single-cell level. In this study, the cellular bioavailability of arsenite (incubation of 25 and 50 μM for 0-48 h) has been successfully assessed by SC-ICP-MS/MS for the first time directly after re-suspending the cells in water. This procedure avoids the cell membrane permeabilization that normally arises from cell fixation methods (e.g. methanol fixation). The reliability and feasibility of this SC-ICP-MS/MS approach, with a limit of detection of 0.35 fg per cell, was validated by conventional bulk ICP-MS/MS analysis after cell digestion and parallel measurement of sulfur and phosphorus.

  10. TetrUSS Capabilities for S and C Applications

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Parikh, Paresh

    2004-01-01

    TetrUSS is a suite of loosely coupled computational fluid dynamics software that is packaged into a complete flow analysis system. The system components consist of tools for geometry setup, grid generation, flow solution, visualization, and various utilities. Development began in 1990, and the system has evolved into a proven and stable platform for Euler and Navier-Stokes analysis and design of unconventional configurations. It is 1) well developed and validated, 2) supported by a broad user base, and 3) presently a workhorse code because of the level of confidence that has been established through wide use. The entire system can now run on Linux or Mac architectures. In the following slides, I will highlight more of the features of the VGRID and USM3D codes.

  11. Ares Project Technology Assessment: Approach and Tools

    NASA Technical Reports Server (NTRS)

    Hueter, Uwe; Tyson, Richard

    2010-01-01

    Technology assessments provide a status of the development maturity of specific technologies. Along with benefit analysis, the risks the project assumes can be quantified. Normally, due to budget constraints, the competing technologies are prioritized and decisions are made about which ones to fund. A detailed technology development plan is produced for the selected technologies to provide a roadmap to reach the desired maturity by the project's critical design review. Technology assessments can be conducted both for technology-only tasks and for product development programs; this paper is primarily oriented toward the latter. The paper discusses the Ares Project's approach to technology assessment. System benefit analysis, risk assessment, technology prioritization, and technology readiness assessment are addressed. A description of the technology readiness level tool being used is provided.

  12. JIMM: the next step for mission-level models

    NASA Astrophysics Data System (ADS)

    Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.

    2001-09-01

    The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for doing SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic nature of representing simulation entities, its data analysis capability, and its robust configuration management process, JIMM can be used to support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). An MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.

  13. Midwifery education and technology enhanced learning: Evaluating online story telling in preregistration midwifery education.

    PubMed

    Scamell, Mandie; Hanley, Thomas

    2018-03-01

    A major issue regarding the implementation of blended learning for preregistration health programmes is the analysis of students' perceptions and attitudes towards their learning. It is the extent of the embedding of Technology Enhanced Learning (TEL) into the higher education curriculum that makes this analysis so vital. This paper reports on the quantitative results of a UK based study that was set up to respond to the apparent disconnect between technology enhanced education provision and reliable student evaluation of this mode of learning. Employing a mixed methods research design, the research described here was carried out to develop a reliable and valid evaluation tool to measure acceptability of and satisfaction with a blended learning approach, specifically designed for a preregistration midwifery module offered at level 4. Feasibility testing of 46 completed blended learning evaluation questionnaires (the Student Midwife Evaluation of Online Learning Effectiveness, SMEOLE) was carried out using descriptive statistics and tests of reliability and internal consistency. Standard deviations and mean scores all followed the predicted pattern. Results from the reliability and internal consistency testing confirm the feasibility of SMEOLE as an effective tool for measuring student satisfaction with a blended learning approach to preregistration learning. The analysis presented in this paper suggests that we have been successful in our aim to produce an evaluation tool capable of assessing the quality of technology enhanced, University level learning in Midwifery. This work can provide future benchmarking against which midwifery, and other health, blended learning curriculum planning could be structured and evaluated. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Mutant KRAS Circulating Tumor DNA Is an Accurate Tool for Pancreatic Cancer Monitoring.

    PubMed

    Perets, Ruth; Greenberg, Orli; Shentzer, Talia; Semenisty, Valeria; Epelbaum, Ron; Bick, Tova; Sarji, Shada; Ben-Izhak, Ofer; Sabo, Edmond; Hershkovitz, Dov

    2018-05-01

    Many new pancreatic cancer treatment combinations have been discovered in recent years, yet the prognosis of pancreatic ductal adenocarcinoma (PDAC) remains grim. The advent of new treatments highlights the need for better monitoring tools for treatment response, to allow a timely switch between different therapeutic regimens. Circulating tumor DNA (ctDNA) is a tool for cancer detection and characterization with growing clinical use. However, currently, ctDNA is not used for monitoring treatment response. The high prevalence of KRAS hotspot mutations in PDAC suggests that mutant KRAS can be an efficient ctDNA marker for PDAC monitoring. Seventeen metastatic PDAC patients were recruited and serial plasma samples were collected. CtDNA was extracted from the plasma, and KRAS mutation analysis was performed using next-generation sequencing and correlated with serum CA19-9 levels, imaging, and survival. Plasma KRAS mutations were detected in 5/17 (29.4%) patients. KRAS ctDNA detection was associated with shorter survival (8 vs. 37.5 months). Our results show that, in ctDNA positive patients, ctDNA is at least comparable to CA19-9 as a marker for monitoring treatment response. Furthermore, the rate of ctDNA change was inversely correlated with survival. Our results confirm that mutant KRAS ctDNA detection in metastatic PDAC patients is a poor prognostic marker. Additionally, we were able to show that mutant KRAS ctDNA analysis can be used to monitor treatment response in PDAC patients and that ctDNA dynamics is associated with survival. We suggest that ctDNA analysis in metastatic PDAC patients is a readily available tool for disease monitoring. Avoiding futile chemotherapy in metastatic pancreatic ductal adenocarcinoma (PDAC) patients by monitoring response to treatment is of utmost importance. A novel biomarker for monitoring treatment response in PDAC, using mutant KRAS circulating tumor DNA (ctDNA), is proposed. Results, although limited by small sample numbers, suggest that ctDNA can be an effective marker for disease monitoring and that ctDNA level over time is a better predictor of survival than the dynamics of the commonly used biomarker CA19-9. Therefore, ctDNA analysis can be a useful tool for monitoring PDAC treatment response. These results should be further validated in larger sample numbers. © AlphaMed Press 2018.

  15. A Liver-centric Multiscale Modeling Framework for Xenobiotics ...

    EPA Pesticide Factsheets

    We describe a multi-scale framework for modeling acetaminophen-induced liver toxicity. Acetaminophen is a widely used analgesic. Overdose of acetaminophen can result in liver injury via its biotransformation into a toxic product, which further induces massive necrosis. Our study focuses on developing a multi-scale computational model to characterize both phase I and phase II metabolism of acetaminophen, by bridging Physiologically Based Pharmacokinetic (PBPK) modeling at the whole-body level, cell movement and blood flow at the tissue level, and cell signaling and drug metabolism at the sub-cellular level. To validate the model, we estimated our model parameters by fitting serum concentrations of acetaminophen and its glucuronide and sulfate metabolites to experiments, and carried out sensitivity analysis on 35 parameters selected from three modules. This multiscale model bridges the CompuCell3D tool used by the Virtual Tissue project with the httk tool developed by the Rapid Exposure and Dosimetry project.
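
    To make the whole-body PBPK layer of such a framework concrete, here is a deliberately simplified, hypothetical one-compartment sketch with two parallel phase II clearance routes (glucuronidation and sulfation), solved with SciPy. All parameter values are invented for illustration; the actual multiscale model couples far more detailed PBPK, tissue, and sub-cellular modules.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy one-compartment PK model with first-order absorption and two parallel
# phase II clearance routes. All parameter values below are hypothetical.
ka, Vd = 1.5, 60.0            # 1/h absorption rate, L apparent volume
k_gluc, k_sulf = 0.25, 0.15   # 1/h first-order elimination constants
dose_mg = 1000.0

def rhs(t, y):
    gut, parent, gluc, sulf = y
    absorption = ka * gut
    return [
        -absorption,                              # drug remaining in the gut
        absorption - (k_gluc + k_sulf) * parent,  # parent drug in the body
        k_gluc * parent,                          # cumulative glucuronide formed
        k_sulf * parent,                          # cumulative sulfate formed
    ]

sol = solve_ivp(rhs, (0.0, 24.0), [dose_mg, 0.0, 0.0, 0.0], dense_output=True)
t = np.linspace(0, 24, 97)
gut, parent, gluc, sulf = sol.sol(t)
print("peak parent plasma conc (mg/L):", round(parent.max() / Vd, 2))
```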

  16. The Strength of Ethical Matrixes as a Tool for Normative Analysis Related to Technological Choices: The Case of Geological Disposal for Radioactive Waste.

    PubMed

    Kermisch, Céline; Depaus, Christophe

    2018-02-01

    The ethical matrix is a participatory tool designed to structure ethical reflection about the design, the introduction, the development or the use of technologies. Its collective implementation, in the context of participatory decision-making, has shown its potential usefulness. By contrast, its implementation by a single researcher has not been thoroughly analyzed. The aim of this paper is precisely to assess the strength of ethical matrixes implemented by a single researcher as a tool for conceptual normative analysis related to technological choices. Therefore, the ethical matrix framework is applied to the management of high-level radioactive waste, more specifically to retrievable and non-retrievable geological disposal. The results of this analysis show that the usefulness of ethical matrixes is twofold and that they provide a valuable input for further decision-making. Indeed, by using ethical matrixes, implicit ethically relevant issues were revealed, namely issues of equity associated with health impacts and differences between close and remote future generations regarding ethical impacts. Moreover, the ethical matrix framework was helpful in synthesizing and comparing systematically the ethical impacts of the technologies under scrutiny, and hence in highlighting the potential ethical conflicts.

  17. Sustainable development induction in organizations: a convergence analysis of ISO standards management tools' parameters.

    PubMed

    Merlin, Fabrício Kurman; Pereira, Vera Lúcia Duarte do Valle; Pacheco, Waldemar

    2012-01-01

    Organizations are part of an environment in which they are pressured to meet society's demands and to act in a sustainable way. In an attempt to meet such demands, organizations make use of various management tools, among which ISO standards are used. Although there is evidence of the contributions provided by these standards, it is questionable whether their parameters converge toward a possible induction of sustainable development in organizations. This work presents a theoretical study, based on a structuralist world view and a descriptive, deductive method, which aims to analyze the convergence of management tool parameters in ISO standards. In order to support the analysis, a generic framework for possible convergence was developed, based on a systems approach, linking five ISO standards (ISO 9001, ISO 14001, OHSAS 18001, ISO 31000 and ISO 26000) with sustainable development and positioning them according to organization levels (strategic, tactical and operational). The framework was designed based on the Brundtland report's concept of sustainable development. The analysis was performed by exploring the generic framework for possible convergence based on the Nadler and Tushman model. The results indicate that the standards can contribute to a possible induction of sustainable development in organizations, as long as they meet certain minimum conditions related to their strategic alignment.

  18. Use of the SONET score to evaluate Urgent Care Center overcrowding: a prospective pilot study.

    PubMed

    Wang, Hao; Robinson, Richard D; Cowden, Chad D; Gorman, Violet A; Cook, Christopher D; Gicheru, Eugene K; Schrader, Chet D; Jayswal, Rani D; Zenarosa, Nestor R

    2015-04-14

    To derive a tool to determine Urgent Care Center (UCC) crowding and investigate the association between different levels of UCC overcrowding and negative patient care outcomes. Prospective pilot study. Single centre study in the USA. 3565 patients who registered at the UCC during the 21-day study period were included. Patients who had no overcrowding status estimated, due to incomplete collection of operational variables at the time of registration, were excluded from this study. 3139 patients were enrolled in the final data analysis. A crowding estimation tool (SONET: Severely overcrowded, Overcrowded and Not overcrowded Estimation Tool) was derived using linear regression analysis. The average length of stay (LOS) of UCC patients and the number of left without being seen (LWBS) patients were calculated and compared under the three different levels of UCC crowding. Four independent operational variables could affect the UCC overcrowding score, including the total number of patients, the number of results pending for patients, the number of patients in the waiting room and the longest time a patient was stationed in the waiting room. In addition, UCC overcrowding was associated with longer average LOS (not overcrowded: 133±76 min, overcrowded: 169±79 min, and severely overcrowded: 196±87 min, p<0.001) and an increased number of LWBS patients (not overcrowded: 0.28±0.69 patients, overcrowded: 0.64±0.98, and severely overcrowded: 1.00±0.97). The overcrowding estimation tool (SONET) derived in this study might be used to determine different levels of crowding in a high volume UCC setting. It also showed that UCC overcrowding might be associated with negative patient care outcomes. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
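
    The derivation step can be illustrated schematically: fit a linear model of a reference crowding measure on the four operational predictors named in the abstract, use the fitted linear combination as the score, and cut it into three crowding levels. The data, coefficients, and cut points below are simulated placeholders, not the published SONET weights.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500   # synthetic registration snapshots

# The four operational predictors named in the abstract (values are simulated).
total_patients  = rng.integers(5, 60, n)
results_pending = rng.integers(0, 30, n)
in_waiting_room = rng.integers(0, 25, n)
longest_wait    = rng.uniform(0, 180, n)            # minutes

# A stand-in "reference" crowding measure to regress against (e.g. expert rating).
reference = (0.4 * total_patients + 0.3 * results_pending
             + 0.5 * in_waiting_room + 0.1 * longest_wait
             + rng.normal(0, 3, n))

X = np.column_stack([np.ones(n), total_patients, results_pending,
                     in_waiting_room, longest_wait])
coef, *_ = np.linalg.lstsq(X, reference, rcond=None)
score = X @ coef

# Hypothetical cut points that split the score into three crowding levels.
levels = np.digitize(score, np.quantile(score, [0.5, 0.85]))
print(dict(zip(["not", "over", "severe"], np.bincount(levels))))
```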

  19. Exploring the single-cell RNA-seq analysis landscape with the scRNA-tools database.

    PubMed

    Zappia, Luke; Phipson, Belinda; Oshlack, Alicia

    2018-06-25

    As single-cell RNA-sequencing (scRNA-seq) datasets have become more widespread, the number of tools designed to analyse these data has dramatically increased. Navigating the vast sea of tools now available is becoming increasingly challenging for researchers. In order to better facilitate selection of appropriate analysis tools we have created the scRNA-tools database (www.scRNA-tools.org) to catalogue and curate analysis tools as they become available. Our database collects a range of information on each scRNA-seq analysis tool and categorises them according to the analysis tasks they perform. Exploration of this database gives insights into the areas of rapid development of analysis methods for scRNA-seq data. We see that many tools perform tasks specific to scRNA-seq analysis, particularly clustering and ordering of cells. We also find that the scRNA-seq community embraces an open-source and open-science approach, with most tools available under open-source licenses and preprints being extensively used as a means to describe methods. The scRNA-tools database provides a valuable resource for researchers embarking on scRNA-seq analysis and records the growth of the field over time.

  20. Model-Driven Useware Engineering

    NASA Astrophysics Data System (ADS)

    Meixner, Gerrit; Seissler, Marc; Breiner, Kai

    User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.

  1. Multi-objective spatial tools to inform maritime spatial planning in the Adriatic Sea.

    PubMed

    Depellegrin, Daniel; Menegon, Stefano; Farella, Giulio; Ghezzo, Michol; Gissi, Elena; Sarretta, Alessandro; Venier, Chiara; Barbanti, Andrea

    2017-12-31

    This research presents a set of multi-objective spatial tools for sea planning and environmental management in the Adriatic Sea Basin. The tools address four objectives: 1) assessment of cumulative impacts from anthropogenic sea uses on the environmental components of marine areas; 2) analysis of sea use conflicts; 3) 3-D hydrodynamic modelling of nutrient dispersion (nitrogen and phosphorus) from riverine sources in the Adriatic Sea Basin; and 4) assessment of the marine ecosystem services capacity of seabed habitats based on an ES matrix approach. Geospatial modelling results were illustrated, analysed and compared at the country level and for three biogeographic subdivisions (Northern, Central and Southern Adriatic Sea). The paper discusses the model results in terms of their spatial implications, relevance for sea planning and limitations, and concludes with an outlook on the need for the development of more integrated, multi-functional tools for sea planning. Copyright © 2017. Published by Elsevier B.V.
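
    A compact sketch of a cumulative-impact calculation of the kind described in objective 1: per-cell impact is the sum, over sea uses and environmental components, of use intensity times component presence times an expert sensitivity weight. The rasters and weights below are random placeholders, not the Adriatic datasets or the authors' tool.

```python
import numpy as np

rng = np.random.default_rng(4)
ny, nx = 100, 120                 # raster grid (cells)
n_uses, n_env = 3, 4              # e.g. shipping/fishing/aquaculture x habitat layers

uses = rng.random((n_uses, ny, nx))          # normalised intensity of each sea use
env  = (rng.random((n_env, ny, nx)) > 0.5)   # presence of each environmental component
sens = rng.random((n_uses, n_env))           # sensitivity weights (expert-based)

# Cumulative impact per cell: sum over uses i and components j of U_i * E_j * w_ij
impact = np.einsum("iyx,jyx,ij->yx", uses, env.astype(float), sens)
print("max cumulative impact score:", round(float(impact.max()), 2))
```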

  2. The Evolutionary Ecology of Plant Disease: A Phylogenetic Perspective.

    PubMed

    Gilbert, Gregory S; Parker, Ingrid M

    2016-08-04

    An explicit phylogenetic perspective provides useful tools for phytopathology and plant disease ecology because the traits of both plants and microbes are shaped by their evolutionary histories. We present brief primers on phylogenetic signal and the analytical tools of phylogenetic ecology. We review the literature and find abundant evidence of phylogenetic signal in pathogens and plants for most traits involved in disease interactions. Plant nonhost resistance mechanisms and pathogen housekeeping functions are conserved at deeper phylogenetic levels, whereas molecular traits associated with rapid coevolutionary dynamics are more labile at branch tips. Horizontal gene transfer disrupts the phylogenetic signal for some microbial traits. Emergent traits, such as host range and disease severity, show clear phylogenetic signals. Therefore pathogen spread and disease impact are influenced by the phylogenetic structure of host assemblages. Phylogenetically rare species escape disease pressure. Phylogenetic tools could be used to develop predictive tools for phytosanitary risk analysis and reduce disease pressure in multispecies cropping systems.

  3. Ramping and Uncertainty Prediction Tool - Analysis and Visualization of Wind Generation Impact on Electrical Grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel; Makarov, Yuri; Subbarao, Kris

    RUT software is designed for use by Balancing Authorities to predict and display additional requirements caused by the variability and uncertainty in load and generation. The prediction is made for the next operating hours as well as for the next day. The tool predicts possible deficiencies in generation capability and ramping capability. This deficiency of balancing resources can cause serious risks to power system stability and also impact real-time market energy prices. The tool dynamically and adaptively correlates changing system conditions with the additional balancing needs triggered by the interplay between forecasted and actual load and output of variable resources. The assessment is performed using a specially developed probabilistic algorithm incorporating multiple sources of uncertainty, including wind, solar and load forecast errors. The tool evaluates the required generation for a worst-case scenario, with a user-specified confidence level.
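
    The probabilistic balancing-requirement idea can be sketched as follows: draw (here, synthetic) samples of load, wind, and solar forecast errors, combine them into a net additional balancing need, and report the range that covers a user-specified confidence level. This is an illustrative outline only, not the RUT algorithm or its data.

```python
import numpy as np

rng = np.random.default_rng(5)
n_scenarios = 10_000
confidence = 0.95                 # user-specified confidence level

# Hypothetical forecast-error samples (MW) for the next operating hour.
load_err  = rng.normal(0, 120, n_scenarios)
wind_err  = rng.normal(0, 200, n_scenarios)
solar_err = rng.normal(0,  80, n_scenarios)

# Additional balancing need = load error minus variable-generation errors.
net_need = load_err - (wind_err + solar_err)

lo, hi = np.quantile(net_need, [(1 - confidence) / 2, (1 + confidence) / 2])
print(f"capacity range to cover {confidence:.0%} of cases: {lo:.0f} to {hi:.0f} MW")
```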

  4. Forward impact extrusion of surface textured steel blanks using coated tooling

    NASA Astrophysics Data System (ADS)

    Hild, Rafael; Feuerhack, Andreas; Trauth, Daniel; Arghavani, Mostafa; Kruppe, Nathan C.; Brögelmann, Tobias; Bobzin, Kirsten; Klocke, Fritz

    2017-10-01

    A method to enable dry metal forming by means of a self-lubricating coating and surface textures was researched using an innovative Pin-On-Cylinder Tribometer. The experimental analysis was complemented by a numerical model of the complex contact conditions between coated tools and the surface-textured specimen at the micro level. Building on these results, the objective of this work is to explain the tribological interactions between surface-textured specimens and the tool in dry full forward extrusion. Therefore, experimental dry extrusion tests were performed using a tool system. The extruded specimens were evaluated with regard to their geometry as well as the required punch force. Thereby, the effectiveness and feasibility of dry metal forming were evaluated using full forward extrusion as an example. Thus, one more step towards the technical realization of dry metal forming of low alloy steels under industrial conditions was taken.

  5. Techniques and Tools for Estimating Ionospheric Effects in Interferometric and Polarimetric SAR Data

    NASA Technical Reports Server (NTRS)

    Rosen, P.; Lavalle, M.; Pi, X.; Buckley, S.; Szeliga, W.; Zebker, H.; Gurrola, E.

    2011-01-01

    The InSAR Scientific Computing Environment (ISCE) is a flexible, extensible software tool designed for the end-to-end processing and analysis of synthetic aperture radar data. ISCE inherits the core of the ROI_PAC interferometric tool, but contains improvements at all levels of the radar processing chain, including a modular and extensible architecture, a new focusing approach, better geocoding of the data, handling of multi-polarization data, radiometric calibration, and estimation and correction of ionospheric effects. In this paper we describe the characteristics of ISCE with emphasis on the ionospheric modules. To detect ionospheric anomalies, ISCE implements the Faraday rotation method using quad-polarimetric images, and the split-spectrum technique using interferometric single-, dual- and quad-polarimetric images. The ability to generate co-registered time series of quad-polarimetric images also makes ISCE an ideal tool for polarimetric-interferometric radar applications.
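
    The split-spectrum technique mentioned above separates the dispersive (ionospheric) phase at the centre frequency from two sub-band interferometric phases. A generic sketch of that standard relation is given below with synthetic phases; the sub-band frequencies are hypothetical and this is not ISCE's actual code.

```python
import numpy as np

def split_spectrum_iono(phi_low, phi_high, f_low, f_high, f0):
    """Dispersive (ionospheric) phase at the centre frequency f0, from two
    sub-band interferogram phases, using the standard split-spectrum relation."""
    return (f_low * f_high / (f0 * (f_high**2 - f_low**2))
            * (f_high * phi_low - f_low * phi_high))

# Hypothetical L-band sub-bands (Hz) and synthetic sub-band phases (rad).
f0, f_low, f_high = 1.2575e9, 1.2375e9, 1.2775e9
rng = np.random.default_rng(6)
phi_iono_true = rng.normal(0, 1, (64, 64))
phi_nondisp   = rng.normal(0, 2, (64, 64))
# Non-dispersive phase scales with frequency; ionospheric phase scales inversely.
phi_l = phi_nondisp * f_low / f0 + phi_iono_true * f0 / f_low
phi_h = phi_nondisp * f_high / f0 + phi_iono_true * f0 / f_high

est = split_spectrum_iono(phi_l, phi_h, f_low, f_high, f0)
print("max reconstruction error:", float(np.abs(est - phi_iono_true).max()))
```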

  6. Robust detection, isolation and accommodation for sensor failures

    NASA Technical Reports Server (NTRS)

    Emami-Naeini, A.; Akhter, M. M.; Rock, S. M.

    1986-01-01

    The objective is to extend the recent advances in robust control system design of multivariable systems to sensor failure detection, isolation, and accommodation (DIA), and estimator design. This effort provides analysis tools to quantify the trade-off between performance robustness and DIA sensitivity, which are to be used to achieve higher levels of performance robustness for given levels of DIA sensitivity. An innovations-based DIA scheme is used. Estimators, which depend upon a model of the process and process inputs and outputs, are used to generate these innovations. Thresholds used to determine failure detection are computed based on bounds on modeling errors, noise properties, and the class of failures. The applicability of the newly developed tools is demonstrated on a multivariable aircraft turbojet engine example. A new concept called the threshold selector was developed; it represents a significant and innovative tool for the analysis and synthesis of DIA algorithms. The estimators were made robust by the introduction of an internal model and by frequency shaping. The internal model provides asymptotically unbiased filter estimates. The incorporation of frequency shaping of the Linear Quadratic Gaussian cost functional modifies the estimator design to make it suitable for sensor failure DIA. The results are compared with previous studies, which used thresholds that were selected empirically. Comparison of these two techniques on a nonlinear dynamic engine simulation shows improved performance of the new method compared to previous techniques.
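
    A toy sketch of an innovations-based detection test of the general kind described above: a scalar Kalman-style filter generates innovations, each innovation is normalised by its predicted standard deviation, and a threshold flags a sensor failure. The system, noise levels, injected bias, and threshold are all hypothetical and far simpler than the turbojet engine application in the record.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 300
A, H = 1.0, 1.0
Q, R = 1e-4, 1e-2          # process / measurement noise variances (assumed known)
threshold = 4.0            # flag if the normalised innovation exceeds 4 sigma

# Simulate a slowly varying signal; inject a sensor bias failure at step 200.
x_true = np.cumsum(rng.normal(0, np.sqrt(Q), n))
z = x_true + rng.normal(0, np.sqrt(R), n)
z[200:] += 0.8             # hypothetical sensor bias failure

x_est, P = 0.0, 1.0
flags = []
for k in range(n):
    # Predict
    x_pred, P_pred = A * x_est, A * P * A + Q
    # Innovation and its predicted variance
    nu = z[k] - H * x_pred
    S = H * P_pred * H + R
    flags.append(abs(nu) / np.sqrt(S) > threshold)
    # Update (skip the update when the measurement is flagged as failed)
    if not flags[-1]:
        K = P_pred * H / S
        x_est = x_pred + K * nu
        P = (1 - K * H) * P_pred
    else:
        x_est, P = x_pred, P_pred

print("first flagged sample:", int(np.argmax(flags)))
```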

  7. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    NASA Technical Reports Server (NTRS)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Because it is parametrically driven and offers user-programmable features, the need for software modifications when configuring it for a particular spacecraft is reduced or even eliminated. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.

  8. Construction and Validation of a Questionnaire about Heart Failure Patients' Knowledge of Their Disease

    PubMed Central

    Bonin, Christiani Decker Batista; dos Santos, Rafaella Zulianello; Ghisi, Gabriela Lima de Melo; Vieira, Ariany Marques; Amboni, Ricardo; Benetti, Magnus

    2014-01-01

    Background: The lack of tools to measure heart failure patients' knowledge about their syndrome when participating in rehabilitation programs demonstrates the need for specific recommendations regarding the amount or content of information required. Objectives: To develop and validate a questionnaire to assess heart failure patients' knowledge about their syndrome when participating in cardiac rehabilitation programs. Methods: The tool was developed based on the Coronary Artery Disease Education Questionnaire and applied to 96 patients with heart failure, with a mean age of 60.22 ± 11.6 years, 64% being men. Reproducibility was obtained via the intraclass correlation coefficient, using the test-retest method. Internal consistency was assessed by use of Cronbach's alpha, and construct validity, by use of exploratory factor analysis. Results: The final version of the tool had 19 questions arranged in ten areas of importance for patient education. The proposed questionnaire had a clarity index of 8.94 ± 0.83. The intraclass correlation coefficient was 0.856, and Cronbach's alpha, 0.749. Factor analysis revealed five factors associated with the knowledge areas. Comparing the final scores with the characteristics of the population showed that low educational level and low income are significantly associated with low levels of knowledge. Conclusion: The instrument has satisfactory clarity and validity indices, and can be used to assess heart failure patients' knowledge about their syndrome when participating in cardiac rehabilitation programs. PMID:24652054
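
    Cronbach's alpha, used above for internal consistency, has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score), for k items. A small self-contained sketch with synthetic 19-item responses is shown below; the data are invented and this is not the authors' analysis.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

rng = np.random.default_rng(2)
latent = rng.normal(size=(96, 1))                       # 96 synthetic respondents
responses = latent + rng.normal(0, 0.7, size=(96, 19))  # 19 synthetic questionnaire items
print(round(cronbach_alpha(responses), 3))
```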

  9. Karl Marx and the Environment

    ERIC Educational Resources Information Center

    Shifferd, K. D.

    1972-01-01

    Implications from Karl Marx's concept of nature are explored. Serving as a frame of reference for the fight against pollution, the Marxian philosophy provides a kind of systems analysis of the origins and dynamics of pollution at the level of society and a set of conceptual tools and attitudes for unmasking the claims of industry. (BL)

  10. Circular Dichroism Spectroscopy: Enhancing a Traditional Undergraduate Biochemistry Laboratory Experience

    ERIC Educational Resources Information Center

    Lewis, Russell L.; Seal, Erin L.; Lorts, Aimee R.; Stewart, Amanda L.

    2017-01-01

    The undergraduate biochemistry laboratory curriculum is designed to provide students with experience in protein isolation and purification protocols as well as various data analysis techniques, which enhance the biochemistry lecture course and give students a broad range of tools upon which to build in graduate level laboratories or once they…

  11. Introducing Graduate Students to High-Resolution Mass Spectrometry (HRMS) Using a Hands-On Approach

    ERIC Educational Resources Information Center

    Stock, Naomi L.

    2017-01-01

    High-resolution mass spectrometry (HRMS) features both high resolution and high mass accuracy and is a powerful tool for the analysis and quantitation of compounds, determination of elemental compositions, and identification of unknowns. A hands-on laboratory experiment for upper-level undergraduate and graduate students to investigate HRMS is…

  12. Spatial modeling of potential woody biomass flow

    Treesearch

    Woodam Chung; Nathaniel Anderson

    2012-01-01

    The flow of woody biomass to end users is determined by economic factors, especially the amount available across a landscape and the costs of delivery to bioenergy facilities. The objective of this study was to develop a methodology to quantify landscape-level stocks and potential biomass flows using currently available spatial databases and road network analysis tools. We applied this...

  13. Benchmarking and beyond. Information trends in home care.

    PubMed

    Twiss, Amanda; Rooney, Heather; Lang, Christine

    2002-11-01

    With today's benchmarking concepts and tools, agencies have the unprecedented opportunity to use information as a strategic advantage. Because agencies are demanding more and better information, benchmark functionality has grown increasingly sophisticated. Agencies now require a new type of analysis, focused on high-level executive summaries while reducing the current "data overload."

  14. Research-Informed Curriculum Design for a Master's-Level Program in Project Management

    ERIC Educational Resources Information Center

    Bentley, Yongmei; Richardson, Diane; Duan, Yanqing; Philpott, Elly; Ong, Vincent; Owen, David

    2013-01-01

    This article reports on the application of Research-Informed Curriculum Design (RICD) for the development and implementation of an MSc Program in Project Management. The research focused on contemporary issues in project management and provided an analysis of project management approaches, tools, and techniques currently used in organizations.…

  15. Technology in the College Classroom.

    ERIC Educational Resources Information Center

    Earl, Archie W., Sr.

    An analysis was made of the use of computing tools at the graduate and undergraduate levels in colleges and universities in the United States. Topics ranged from hand-held calculators to the use of main-frame computers and the assessment of the SPSSX, SPSS, LINDO, and MINITAB computer software packages. Hand-held calculators are being increasingly…

  16. Economic Evaluation of New Technologies in Higher Education. N.I.E. Report Phase 1, Volume 6 of 7.

    ERIC Educational Resources Information Center

    Heriot-Watt Univ., Edinburgh (Scotland). Esmee Fairbairn Economics Research Centre.

    Part of a series of instructional packages for use in college level economics courses, the document contains nine microeconomics chapters. Chapter I, "Economic Concepts, Issues, and Tools," discusses scarcity and choice; preferences, resources, exchange, and economic efficiency; marginal analysis and opportunity cost; and different economic…

  17. Analysis of Traditional versus Three-Dimensional Augmented Curriculum on Anatomical Learning Outcome Measures

    ERIC Educational Resources Information Center

    Peterson, Diana Coomes; Mlynarczyk, Gregory S.A.

    2016-01-01

    This study examined whether student learning outcome measures are influenced by the addition of three-dimensional and digital teaching tools to a traditional dissection and lecture learning format curricula. The study was performed in a semester long graduate level course that incorporated both gross anatomy and neuroanatomy curricula. Methods…

  18. Commercial Supersonics Technology Project - Status of Airport Noise

    NASA Technical Reports Server (NTRS)

    Bridges, James

    2016-01-01

    The Commercial Supersonic Technology Project has been developing databases, computational tools, and system models to prepare for a level 1 milestone, the Low Noise Propulsion Tech Challenge, to be delivered Sept 2016. Steps taken to prepare for the final validation test are given, including system analysis, code validation, and risk reduction testing.

  19. A Methodology for Teaching Afro-American Literature.

    ERIC Educational Resources Information Center

    Kittrell, Jean

    This paper outlines a system of methods for teaching Afro-American Literature at the secondary and college level. Seven goals of the methodology are presented for the course, including making the students familiar with various definitions of black literature, helping the students use the tools of literary analysis in the discussion of black…

  20. The Power of 'Evidence': Reliable Science or a Set of Blunt Tools?

    ERIC Educational Resources Information Center

    Wrigley, Terry

    2018-01-01

    In response to the increasing emphasis on 'evidence-based teaching', this article examines the privileging of randomised controlled trials and their statistical synthesis (meta-analysis). It also pays particular attention to two third-level statistical syntheses: John Hattie's "Visible learning" project and the EEF's "Teaching and…
