Sample records for comprehensive modeling tools

  1. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE PAGES

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-08-03

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
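
    The MetaboTools themselves are MATLAB code built on the COBRA Toolbox. Purely to illustrate the kind of constraint-based step the protocol automates, here is a minimal sketch using the Python cobrapy package; the model file and exchange-reaction bounds are hypothetical:

```python
# Illustrative only (MetaboTools is MATLAB): constraining a genome-scale model
# with measured extracellular fluxes, then running flux balance analysis.
import cobra

model = cobra.io.read_sbml_model("cell_line_model.xml")  # hypothetical SBML file

# Measured uptake/secretion rates (mmol/gDW/h); negative values mean uptake.
measured = {"EX_glc__D_e": (-12.0, -8.0), "EX_lac__L_e": (15.0, 22.0)}

for rxn_id, (lb, ub) in measured.items():
    rxn = model.reactions.get_by_id(rxn_id)
    rxn.lower_bound, rxn.upper_bound = lb, ub   # clamp to the measured range

solution = model.optimize()   # FBA under the data-derived constraints
print(f"Predicted growth rate: {solution.objective_value:.3f} 1/h")
```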

  2. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    ERIC Educational Resources Information Center

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-01-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…

  4. A new computer code for discrete fracture network modelling

    NASA Astrophysics Data System (ADS)

    Xu, Chaoshui; Dowd, Peter

    2010-03-01

    The authors describe a comprehensive software package for two- and three-dimensional stochastic rock fracture simulation using marked point processes. Fracture locations can be modelled by a Poisson, a non-homogeneous, a cluster or a Cox point process; fracture geometries and properties are modelled by their respective probability distributions. Virtual sampling tools such as plane, window and scanline sampling are included in the software together with a comprehensive set of statistical tools including histogram analysis, probability plots, rose diagrams and hemispherical projections. The paper describes in detail the theoretical basis of the implementation and provides a case study in rock fracture modelling to demonstrate the application of the software.
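
    As a rough illustration of the marked-point-process idea described here (not the authors' code), fracture centres can be drawn from a homogeneous Poisson process with orientation and size attached as random marks; every parameter below is invented:

```python
# Sketch of a stochastic discrete fracture network: Poisson-distributed count,
# uniform centres, and distributional marks for orientation and disc radius.
import numpy as np

rng = np.random.default_rng(42)
intensity = 1e-4                          # fractures per m^3 (assumed)
region = np.array([100.0, 100.0, 50.0])   # simulation box dimensions (m)

n = rng.poisson(intensity * region.prod())              # number of fractures
centres = rng.uniform(0.0, region, size=(n, 3))         # homogeneous locations
dips = np.degrees(np.arccos(rng.uniform(0.0, 1.0, n)))  # dip-angle marks (deg)
radii = rng.lognormal(mean=1.0, sigma=0.5, size=n)      # disc-radius marks (m)

print(f"Simulated {n} fractures; mean radius {radii.mean():.2f} m")
```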

  5. Assessing Comprehension During Reading with the Reading Strategy Assessment Tool (RSAT)

    PubMed Central

    Magliano, Joseph P.; Millis, Keith K.; Levinstein, Irwin

    2011-01-01

    Comprehension emerges as the result of inference and strategic processes that support the construction of a coherent mental model for a text. However, the vast majority of comprehension skills tests adopt a format that does not afford an assessment of these processes as they operate during reading. This study assessed the viability of the Reading Strategy Assessment Tool (RSAT), an automated computer-based reading assessment designed to measure readers’ comprehension and spontaneous use of reading strategies while reading texts. In the tool, readers comprehend passages one sentence at a time and are asked either an indirect (“What are your thoughts regarding your understanding of the sentence in the context of the passage?”) or a direct (e.g., why X?) question after reading each pre-selected target sentence. The answers to the indirect questions are analyzed for the extent to which they contain words associated with comprehension processes. The answers to direct questions are coded for the number of content words in common with an ideal answer, which is intended to be an assessment of emerging comprehension. In the study, the RSAT approach was shown to predict measures of comprehension at levels comparable to standardized tests. The RSAT variables were also shown to correlate with human ratings. The results of this study constitute a “proof of concept” and demonstrate that it is possible to develop a comprehension skills assessment tool that assesses both comprehension and comprehension strategies. PMID:23901332
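
    A minimal sketch of the direct-question scoring idea as the abstract describes it (content-word overlap with an ideal answer); the tokenizer and stopword list here are simplifications, not RSAT's actual implementation:

```python
# Toy content-word overlap score; the real RSAT scoring is more sophisticated.
STOPWORDS = {"the", "a", "an", "of", "to", "in", "and", "is", "was", "it"}

def content_words(text: str) -> set[str]:
    """Lowercased tokens minus punctuation and stopwords."""
    return {w.strip(".,!?;:").lower() for w in text.split()} - STOPWORDS

def direct_question_score(answer: str, ideal: str) -> int:
    """Number of content words the answer shares with the ideal answer."""
    return len(content_words(answer) & content_words(ideal))

print(direct_question_score(
    "Because the ice melted and raised the sea level",
    "Melting ice raises global sea level"))   # -> 3 ("ice", "sea", "level")
```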

  6. Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool

    NASA Astrophysics Data System (ADS)

    Torlapati, Jagadish; Prabhakar Clement, T.

    2013-01-01

    We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
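
    RT1D itself is Excel/VBA; the following Python sketch merely illustrates the kind of one-dimensional advection-dispersion-reaction step such a code integrates, with invented grid and rate parameters:

```python
# Explicit finite-difference step for 1-D advection-dispersion with first-order
# decay (a Python re-expression of the concept, not the RT1D implementation).
import numpy as np

nx, dx, dt = 100, 0.01, 1.0     # cells, cell size (m), time step (s)
v, D, k = 1e-4, 1e-6, 1e-5      # velocity (m/s), dispersion (m^2/s), decay (1/s)
c = np.zeros(nx)
c[0] = 1.0                      # continuous source at the column inlet

for _ in range(3600):           # one hour of simulated transport
    adv = -v * (c[1:-1] - c[:-2]) / dx                   # upwind advection
    disp = D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2  # central dispersion
    c[1:-1] += dt * (adv + disp - k * c[1:-1])           # first-order reaction
    c[0], c[-1] = 1.0, c[-2]    # inlet and free-outflow boundaries

print(f"Concentration at mid-column after 1 h: {c[nx // 2]:.4f}")
```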

  7. The National Association of School Psychologists' Self-Assessment Tool for School Psychologists: Factor Structure and Relationship to the National Association of School Psychologists' Practice Model

    ERIC Educational Resources Information Center

    Eklund, Katie; Rossen, Eric; Charvat, Jeff; Meyer, Lauren; Tanner, Nick

    2016-01-01

    The National Association of School Psychologists' Model for Comprehensive and Integrated School Psychological Services (2010a), often referred to as the National Association of School Psychologists' Practice Model, describes the comprehensive range of professional skills and competencies available from school psychologists across 10 domains. The…

  8. Neurolinguistically constrained simulation of sentence comprehension: integrating artificial intelligence and brain theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gigley, H.M.

    1982-01-01

    An artificial intelligence approach to the simulation of neurolinguistically constrained processes in sentence comprehension is developed using control strategies for simulation of cooperative computation in associative networks. The desirability of this control strategy, in contrast to ATN and production system strategies, is explained. A first-pass implementation of HOPE, an artificial intelligence simulation model of sentence comprehension constrained by studies of aphasic performance, psycholinguistics, neurolinguistics, and linguistic theory, is described. Claims that the model could serve as a basis for sentence production simulation and for a model of language acquisition as associative learning are discussed. HOPE is a model that performs in a normal state and includes a lesion simulation facility. HOPE is also a research tool; its modifiability and use as a tool to investigate hypothesized causes of degradation in comprehension performance by aphasic patients are described. Issues of using behavioral constraints in modelling and obtaining appropriate data for simulated process modelling are discussed. Finally, problems of validating the simulation results are raised, and issues of how to interpret clinical results to define the evolution of the model are discussed. Conclusions with respect to the feasibility of artificial intelligence simulation process modelling are presented based on the current state of research.

  9. Logic Modeling as a Tool to Prepare to Evaluate Disaster and Emergency Preparedness, Response, and Recovery in Schools

    ERIC Educational Resources Information Center

    Zantal-Wiener, Kathy; Horwood, Thomas J.

    2010-01-01

    The authors propose a comprehensive evaluation framework to prepare for evaluating school emergency management programs. This framework involves a logic model that incorporates Government Performance and Results Act (GPRA) measures as a foundation for comprehensive evaluation that complements performance monitoring used by the U.S. Department of…

  10. Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project

    NASA Astrophysics Data System (ADS)

    van Eck, T.; Giardini, D.

    2010-12-01

    The EC Research Infrastructure (RI) project, Network of Research Infrastructures for European Seismology (NERIES), implemented a comprehensive, scalable, and sustainable European integrated RI for earthquake seismological data. NERIES opened a significant amount of additional seismological data, integrated different distributed data archives, and implemented advanced analysis tools and software packages. A seismic data portal provides a single access point and overview for European seismological data available to the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include, among others: - The expanded Virtual European Broadband Seismic Network (VEBSN), with real-time access to more than 500 stations from more than 53 observatories. These data are continuously monitored, quality controlled, and archived in the European Integrated Distributed waveform Archive (EIDA). - A unique integration of acceleration datasets from seven networks in seven European or associated countries, centrally accessible in a homogeneous format, thus forming the core of a comprehensive European acceleration database. Standardized parameter analysis and the associated software are included in the database. - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000-1963, M ≥ 5.8), including analysis tools. - Data from three one-year OBS deployments (Atlantic, Ionian, and Ligurian Sea sites) in the standard SEED format, thus creating the core integrated database for ocean-, sea-, and land-based seismological observatories. Tools to facilitate analysis and data mining of the RI datasets are: - A comprehensive set of European seismological velocity reference models, including a standardized model description, with several visualisation tools currently being adapted to a global scale. - An integrated approach to seismic hazard modelling and forecasting, a community-accepted forecasting testing and model validation approach, and the core hazard portal developed with the same technologies as the NERIES data portal. - Homogeneous shakemap estimation tools implemented at several large European observatories and a complementary new loss estimation software tool. - A comprehensive set of new techniques for geotechnical site characterization, with relevant software packages documented and maintained (www.geopsy.org). - A set of software packages for data mining, data reduction, data exchange, and information management in seismology, serving as research and observatory analysis tools. NERIES has a long-term impact and is coordinated with the related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010-2014), is funded and will integrate the seismological and earthquake engineering infrastructures. NERIES also provided the proof of concept for the ESFRI2008 initiative, the European Plate Observing System (EPOS), whose preparatory phase (2010-2014) is likewise funded by the EC.
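
    Access of the kind NERIES pioneered is now commonly reached through FDSN web services at data centres such as ORFEUS/EIDA. A hedged example using ObsPy (assuming the obspy package and its ORFEUS endpoint; the station code is only illustrative):

```python
# Fetch ten minutes of vertical broadband data from the ORFEUS FDSN service.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("ORFEUS")                 # European data centre (EIDA lineage)
t0 = UTCDateTime("2010-01-01T00:00:00")
stream = client.get_waveforms(network="NL", station="HGN", location="*",
                              channel="BHZ", starttime=t0, endtime=t0 + 600)
print(stream)                             # summary of the retrieved trace(s)
```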

  11. Model Modules to Assist Assessing and Controlling SCC

    DOT National Transportation Integrated Search

    2008-04-04

    This project developed and validated tools to assist in the integrity assessment and management of both forms of SCC. Because the understanding that underlies integrity management tools was most comprehensive for high-pH SCC, development targeted NN-pH SCC,...

  12. Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment

    NASA Technical Reports Server (NTRS)

    Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.

    2007-01-01

    Space science models are an essential component of an integrated data environment. They are indispensable tools for facilitating effective use of a wide variety of distributed scientific sources and for placing multi-point local measurements into a global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at the CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data file generation tool that automatically downloads data from data providers and transforms them into the required format. The CCMC provides a tailored web-based visualization interface for model output, the capability to download simulation output in a portable standard format with comprehensive metadata, and a user-friendly library of model output analysis routines that can be called from any C-supporting language. The CCMC is developing data interpolation tools that make it possible to present model output in the same format as observations. The CCMC invites community comments and suggestions to better address science needs for the integrated data environment.

  13. DATA FOR ENVIRONMENTAL MODELING: AN OVERVIEW

    EPA Science Inventory

    The objective of the project described here, entitled Data for Environmental Modeling (D4EM), is the development of a comprehensive set of software tools that allow an environmental model developer to automatically populate model input files with environmental data available from...

  14. Improving text comprehension: scaffolding adolescents into strategic reading.

    PubMed

    Ukrainetz, Teresa A

    2015-02-01

    Understanding and learning from academic texts involves purposeful, strategic reading. Adolescent readers, particularly poor readers, benefit from explicit instruction in text comprehension strategies, such as text preview, summarization, and comprehension monitoring, as part of a comprehensive reading program. However, strategies are difficult to teach within subject area lessons where content instruction must take primacy. Speech-language pathologists (SLPs) have the expertise and service delivery options to support middle and high school students in learning to use comprehension strategies in their academic reading and learning. This article presents the research evidence on what strategies to teach and how best to teach them, including the use of explicit instruction, spoken interactions around text, cognitive modeling, peer learning, classroom connections, and disciplinary literacy. The article focuses on how to move comprehension strategies from being teaching tools of the SLP to becoming learning tools of the student. SLPs can provide the instruction and support needed for students to learn and apply this important component of academic reading.

  15. SWAT: Model use, calibration, and validation

    USDA-ARS?s Scientific Manuscript database

    SWAT (Soil and Water Assessment Tool) is a comprehensive, semi-distributed river basin model that requires a large number of input parameters which complicates model parameterization and calibration. Several calibration techniques have been developed for SWAT including manual calibration procedures...

  16. R-SWAT-FME user's guide

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shu-Guang

    2012-01-01

    R program language-Soil and Water Assessment Tool-Flexible Modeling Environment (R-SWAT-FME) (Wu and Liu, 2012) is a comprehensive modeling framework that adopts an R package, Flexible Modeling Environment (FME) (Soetaert and Petzoldt, 2010), for the Soil and Water Assessment Tool (SWAT) model (Arnold and others, 1998; Neitsch and others, 2005). This framework provides the functionalities of parameter identifiability, model calibration, and sensitivity and uncertainty analysis with instant visualization. This user's guide shows how to apply this framework for a customized SWAT project.

  17. Comprehensive Modeling and Analysis of Rotorcraft Variable Speed Propulsion System With Coupled Engine/Transmission/Rotor Dynamics

    NASA Technical Reports Server (NTRS)

    DeSmidt, Hans A.; Smith, Edward C.; Bill, Robert C.; Wang, Kon-Well

    2013-01-01

    This project develops comprehensive modeling and simulation tools for analysis of variable rotor speed helicopter propulsion system dynamics. The Comprehensive Variable-Speed Rotorcraft Propulsion Modeling (CVSRPM) tool developed in this research is used to investigate coupled rotor/engine/fuel control/gearbox/shaft/clutch/flight control system dynamic interactions for several variable rotor speed mission scenarios. In this investigation, a prototypical two-speed Dual-Clutch Transmission (DCT) is proposed and designed to achieve 50 percent rotor speed variation. The comprehensive modeling tool developed in this study is utilized to analyze the two-speed shift response of both a conventional single rotor helicopter and a tiltrotor drive system. In the tiltrotor system, both a Parallel Shift Control (PSC) strategy and a Sequential Shift Control (SSC) strategy for constant and variable forward speed mission profiles are analyzed. Under the PSC strategy, selecting clutch shift-rate results in a design tradeoff between transient engine surge margins and clutch frictional power dissipation. In the case of SSC, clutch power dissipation is drastically reduced in exchange for the necessity to disengage one engine at a time which requires a multi-DCT drive system topology. In addition to comprehensive simulations, several sections are dedicated to detailed analysis of driveline subsystem components under variable speed operation. In particular an aeroelastic simulation of a stiff in-plane rotor using nonlinear quasi-steady blade element theory was conducted to investigate variable speed rotor dynamics. It was found that 2/rev and 4/rev flap and lag vibrations were significant during resonance crossings with 4/rev lagwise loads being directly transferred into drive-system torque disturbances. To capture the clutch engagement dynamics, a nonlinear stick-slip clutch torque model is developed. Also, a transient gas-turbine engine model based on first principles mean-line compressor and turbine approximations is developed. Finally an analysis of high frequency gear dynamics including the effect of tooth mesh stiffness variation under variable speed operation is conducted including experimental validation. Through exploring the interactions between the various subsystems, this investigation provides important insights into the continuing development of variable-speed rotorcraft propulsion systems.
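
    As one example of the subsystem models listed above, a generic stick-slip friction torque law for clutch engagement (of the same family the abstract mentions; the coefficients are invented, not the paper's) can be sketched as:

```python
# Toy stick-slip clutch torque: the static (stuck) branch transmits the applied
# torque up to breakaway; the kinetic (slipping) branch opposes slip direction.
def clutch_torque(slip_speed, applied_torque, normal_force,
                  mu_static=0.35, mu_kinetic=0.25, r_eff=0.1, eps=1e-3):
    """Transmitted torque (N*m) across a single friction interface."""
    t_breakaway = mu_static * normal_force * r_eff
    if abs(slip_speed) < eps:   # stuck: clamp to the breakaway limit
        return max(-t_breakaway, min(t_breakaway, applied_torque))
    sign = 1.0 if slip_speed > 0 else -1.0
    return mu_kinetic * normal_force * r_eff * sign

print(clutch_torque(slip_speed=0.0, applied_torque=20.0, normal_force=800.0))
print(clutch_torque(slip_speed=5.0, applied_torque=20.0, normal_force=800.0))
```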

  18. Subsonic Wing Optimization for Handling Qualities Using ACSYNT

    NASA Technical Reports Server (NTRS)

    Soban, Danielle Suzanne

    1996-01-01

    The capability to accurately and rapidly predict aircraft stability derivatives using one comprehensive analysis tool has been created. The PREDAVOR tool has the following capabilities: rapid estimation of stability derivatives using a vortex lattice method, calculation of a longitudinal handling qualities metric, and an inherent methodology to optimize a given aircraft configuration for longitudinal handling qualities, all behind an intuitive graphical interface. The PREDAVOR tool may be applied to both subsonic and supersonic designs, as well as conventional and unconventional, symmetric and asymmetric configurations. The workstation-based tool takes as its input a three-dimensional model of the configuration generated with a computer-aided design (CAD) package. The PREDAVOR tool was applied to a Lear Jet Model 23 and the North American XB-70 Valkyrie.

  19. Planform: an application and database of graph-encoded planarian regenerative experiments.

    PubMed

    Lobo, Daniel; Malone, Taylor J; Levin, Michael

    2013-04-15

    Understanding the mechanisms governing the regeneration capabilities of many organisms is a fundamental interest in biology and medicine. An ever-increasing number of manipulation and molecular experiments are attempting to discover a comprehensive model for regeneration, with the planarian flatworm being one of the most important model species. Despite much effort, no comprehensive, constructive, mechanistic model exists yet, and it is now clear that computational tools are needed to mine this huge dataset. However, until now, there has been no database of regenerative experiments, and the current genotype-phenotype ontologies and databases are based on textual descriptions, which are not understandable by computers. To overcome these difficulties, we present here Planform (Planarian formalization), a manually curated database and software tool for planarian regenerative experiments, based on a mathematical graph formalism. The database contains more than a thousand experiments from the main publications in the planarian literature. The software tool provides the user with a graphical interface to easily interact with and mine the database. The presented system is a valuable resource for the regeneration community and, more importantly, will pave the way for the application of novel artificial intelligence tools to extract knowledge from this dataset. The database and software tool are freely available at http://planform.daniel-lobo.com.

  20. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  1. The Health Impact Assessment (HIA) Resource and Tool ...

    EPA Pesticide Factsheets

    Health Impact Assessment (HIA) is a relatively new and rapidly emerging field in the U.S. An inventory of available HIA resources and tools was conducted, with a primary focus on resources developed in the U.S. The resources and tools available to HIA practitioners in the conduct of their work were identified through multiple methods and compiled into a comprehensive list. The compilation includes tools and resources related to the HIA process itself and those that can be used to collect and analyze data, establish a baseline profile, assess potential health impacts, and establish benchmarks and indicators for monitoring and evaluation. These resources include literature and evidence bases, data and statistics, guidelines, benchmarks, decision and economic analysis tools, scientific models, methods, frameworks, indices, mapping, and various data collection tools. Understanding the data, tools, models, methods, and other resources available to perform HIAs will help to advance the HIA community of practice in the U.S., improve the quality and rigor of assessments upon which stakeholder and policy decisions are based, and potentially improve the overall effectiveness of HIA to promote healthy and sustainable communities. The Health Impact Assessment (HIA) Resource and Tool Compilation is a comprehensive list of resources and tools that can be utilized by HIA practitioners with all levels of HIA experience to guide them throughout the HIA process. The HIA Resource

  2. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also enabled with two novel features; the first one being a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
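
    To make the variogram idea concrete (this is not VARS-TOOL itself, which its authors distribute in MATLAB and Python), the core quantity gamma(h) = 0.5 E[(y(x+h) - y(x))^2] can be estimated along a single parameter axis from evenly spaced model runs; the toy response below stands in for an EESM:

```python
# Directional variogram of a model response, the building block of IVARS.
import numpy as np

def model(x):                        # toy one-parameter response
    return np.sin(3.0 * x) + 0.3 * x

x = np.linspace(0.0, 1.0, 201)       # evenly spaced runs along one parameter
y = model(x)

def variogram(y, x, h):
    """gamma(h) estimated from all pairs separated by lag h."""
    step = int(round(h / (x[1] - x[0])))
    return 0.5 * np.mean((y[step:] - y[:-step]) ** 2)

for h in (0.05, 0.1, 0.3):           # IVARS integrates gamma across such scales
    print(f"gamma({h:.2f}) = {variogram(y, x, h):.4f}")
```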

  3. Model-Driven Useware Engineering

    NASA Astrophysics Data System (ADS)

    Meixner, Gerrit; Seissler, Marc; Breiner, Kai

    User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.

  4. A comprehensive surface-groundwater flow model

    NASA Astrophysics Data System (ADS)

    Arnold, Jeffrey G.; Allen, Peter M.; Bernhardt, Gilbert

    1993-02-01

    In this study, a simple groundwater flow and height model was added to an existing basin-scale surface water model. The linked model is: (1) watershed scale, allowing the basin to be subdivided; (2) designed to accept readily available inputs to allow general use over large regions; (3) continuous in time to allow simulation of land management, including such factors as climate and vegetation changes, pond and reservoir management, groundwater withdrawals, and stream and reservoir withdrawals. The model is described, and is validated on a 471 km² watershed near Waco, Texas. This linked model should provide a comprehensive tool for water resource managers in development and planning.
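
    A minimal sketch of the linkage concept (not the published code): daily recharge from the surface-water model feeds a linear-reservoir shallow aquifer whose outflow returns to the stream as baseflow; all coefficients are invented:

```python
# Linear-reservoir groundwater component driven by surface-model recharge.
recharge = [2.0, 0.0, 5.0, 1.0, 0.0, 0.0, 3.0]  # mm/day from the surface model
alpha = 0.05        # baseflow recession constant (1/day), assumed
storage = 50.0      # initial shallow-aquifer storage (mm), assumed

for day, r in enumerate(recharge, start=1):
    storage += r                    # percolation from the root zone
    baseflow = alpha * storage      # groundwater return flow to the stream
    storage -= baseflow
    print(f"day {day}: baseflow {baseflow:.2f} mm, storage {storage:.2f} mm")
```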

  5. MFV-class: a multi-faceted visualization tool of object classes.

    PubMed

    Zhang, Zhi-meng; Pan, Yun-he; Zhuang, Yue-ting

    2004-11-01

    Classes are key software components in an object-oriented software system. In many industrial OO software systems, there are classes that have a complicated structure and complicated relationships. In the processes of software maintenance, testing, reengineering, reuse, and restructuring, it is therefore a challenge for software engineers to understand these classes thoroughly. This paper proposes a class comprehension model based on constructivist learning theory and implements a software visualization tool (MFV-Class) to help in the comprehension of a class. The tool provides multiple views of a class to uncover manifold facets of its contents. It enables visualizing three object-oriented metrics of classes to help users focus on the understanding process. A case study was conducted to evaluate our approach and the toolkit.

  6. An open-source Java-based Toolbox for environmental model evaluation: The MOUSE Software Application

    USDA-ARS?s Scientific Manuscript database

    A consequence of environmental model complexity is that the task of understanding how environmental models work and identifying their sensitivities/uncertainties, etc. becomes progressively more difficult. Comprehensive numerical and visual evaluation tools have been developed such as the Monte Carl...

  7. Cognitive Interviewing: A Qualitative Tool for Improving Questionnaires in Sport Science

    ERIC Educational Resources Information Center

    Dietrich, Hanno; Ehrlenspiel, Felix

    2010-01-01

    Cognitive models postulate that respondents to a questionnaire follow a four-stage process when answering a question: comprehension, memory retrieval, decision, and response. Cognitive interviewing is a qualitative tool to gain insight into this process by means of letting respondents think aloud or asking them specific questions (Willis, 2005).…

  8. A systematic review on popularity, application and characteristics of protein secondary structure prediction tools.

    PubMed

    Kashani-Amin, Elaheh; Tabatabaei-Malazy, Ozra; Sakhteman, Amirhossein; Larijani, Bagher; Ebrahim-Habibi, Azadeh

    2018-02-27

    Prediction of proteins' secondary structure is one of the major steps in the generation of homology models. These models provide structural information which is used to design suitable ligands for potential medicinal targets. However, selecting a proper tool among multiple secondary structure prediction (SSP) options is challenging. The current study provides insight into currently favored methods and tools within various contexts. A systematic review was performed for comprehensive access to recent (2013-2016) studies which used or recommended protein SSP tools. Three databases, Web of Science, PubMed, and Scopus, were systematically searched, and 99 of 209 studies were ultimately found eligible for data extraction. Four categories of applications for the 59 retrieved SSP tools were: (I) prediction of structural features of a given sequence, (II) evaluation of a method, (III) providing input for a new SSP method, and (IV) integrating an SSP tool as a component of a program. PSIPRED was found to be the most popular tool in all four categories. JPred and tools utilizing the PHD (Profile network from HeiDelberg) method occupied the second and third places of popularity in categories I and II. JPred was only found in the first two categories, while PHD was present in three. This study provides a comprehensive overview of recent usage of SSP tools, which could be helpful for selecting a proper tool.

  9. Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model

    Treesearch

    Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance

    2014-01-01

    Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...

  10. Data for Environmental Modeling (D4EM): Background and Applications of Data Automation

    EPA Science Inventory

    The Data for Environmental Modeling (D4EM) project demonstrates the development of a comprehensive set of open source software tools that overcome obstacles to accessing data needed by automating the process of populating model input data sets with environmental data available fr...

  11. Model Educational Specifications for Technology in Schools.

    ERIC Educational Resources Information Center

    Maryland State Dept. of Education, College Park. Office of Administration and Finance.

    This description of the Model Edspec, which can be used by itself or in conjunction with the "Format Guide of Educational Specifications," serves as a comprehensive planning tool for the selection and application of technology. The model is designed to assist schools in implementing the facilities development process, thereby making…

  12. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.

  13. A survey on annotation tools for the biomedical literature.

    PubMed

    Neves, Mariana; Leser, Ulf

    2014-03-01

    New approaches to biomedical text mining crucially depend on the existence of comprehensive annotated corpora. Such corpora, commonly called gold standards, are important for learning patterns or models during the training phase, for evaluating and comparing the performance of algorithms, and for better understanding the information sought, by means of examples. Gold standards depend on human understanding and manual annotation of natural language text. This process is very time-consuming and expensive because it requires high intellectual effort from domain experts. Accordingly, the lack of gold standards is considered one of the main bottlenecks for developing novel text mining methods. This situation led to the development of tools that support humans in annotating texts. Such tools should be intuitive to use, should support a range of different input formats, should include visualization of annotated texts, and should generate an easy-to-parse output format. Today, a range of tools which implement some of these functionalities are available. Here, we present a comprehensive survey of tools for supporting annotation of biomedical texts. Altogether, we considered almost 30 tools, 13 of which were selected for an in-depth comparison. The comparison was performed using predefined criteria and was accompanied by hands-on experience whenever possible. Our survey shows that current tools can support many of the tasks in biomedical text annotation in a satisfying manner, but also that no tool can be considered a true comprehensive solution.

  14. Climate Model Diagnostic Analyzer

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    Comprehensive and innovative evaluation of climate models with newly available global observations is critically needed to improve climate models' current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. Given the exploratory nature of climate data analyses and the explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone share them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called the Climate Model Diagnostic Analyzer (CMDA). CMDA enables physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share it with others. CMDA is empowered by many state-of-the-art software packages in web services, provenance, and semantic search.

  15. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand.

    PubMed

    Chung, Beom Sun; Chung, Min Suk; Shin, Byeong Seok; Kwon, Koojoo

    2018-02-19

    The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of stereoscopic morphology of the hand. On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. All of the software tools were downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand could be adequately employed according to individual needs. These new tools involving the realistic images of a cadaver and the diverse functions are expected to improve comprehensive knowledge of the hand shape.

  17. A Constitutive Relationship between Crack Propagation and Specific Damping Capacity in Steel

    DTIC Science & Technology

    1990-10-01

    ... diagnostic tool for detecting crack growth in structures. The model must be simple to act as a tool, but it must be comprehensive to provide accuracy... [The remainder of the extracted record is a garbled nomenclature list defining strain symbols: the critical strain for static fracture, the critical strain above which plastic strain occurs, the average value of the cyclic plastic-strain range, and the true strain ln(A0/Af).]

  18. Developing Cost Accounting and Decision Support Software for Comprehensive Community-Based Support Systems: An Analysis of Needs, Interest, and Readiness in the Field.

    ERIC Educational Resources Information Center

    Harrington, Robert; Jenkins, Peter; Marzke, Carolyn; Cohen, Carol

    Prominent among the new models of social service delivery are organizations providing comprehensive, community-based supports and services (CCBSS) to children and their families. A needs analysis explored CCBSS sites' interest in and readiness to use a software tool designed to help them make more effective internal resource allocation decisions…

  19. Development and testing of a physically based model of streambank erosion for coupling with a basin-scale hydrologic model SWAT

    USDA-ARS?s Scientific Manuscript database

    A comprehensive stream bank erosion model based on excess shear stress has been developed and incorporated in the hydrological model Soil and Water Assessment Tool (SWAT). It takes into account processes such as weathering, vegetative cover, and channel meanders to adjust critical and effective str...

  20. Force Project Technology Presentation to the NRCC

    DTIC Science & Technology

    2014-02-04

    [Briefing-slide fragments] The recoverable content lists force projection technology efforts (functional bridge components, smart odometer, advanced pretreatment, smart bridge, multi-functional gap crossing, fuel automated tracking system) and describes a comprehensive matrix of candidate composite material systems and textile reinforcement architectures evaluated via modeling/analyses and testing. The stated product is a validated dynamic modeling tool, based on a parametric study using material models, to reliably predict the textile mechanics of the hose.

  1. Comprehensive Aspectual UML approach to support AspectJ.

    PubMed

    Magableh, Aws; Shukur, Zarina; Ali, Noorazean Mohd

    2014-01-01

    Unified Modeling Language is the most popular and widely used Object-Oriented modelling language in the IT industry. This study focuses on investigating the ability to expand UML to some extent to model crosscutting concerns (Aspects) to support AspectJ. Through a comprehensive literature review, we identify and extensively examine all the available Aspect-Oriented UML modelling approaches and find that the existing Aspect-Oriented Design Modelling approaches using UML cannot be considered to provide a framework for a comprehensive Aspectual UML modelling approach and also that there is a lack of adequate Aspect-Oriented tool support. This study also proposes a set of Aspectual UML semantic rules and attempts to generate AspectJ pseudocode from UML diagrams. The proposed Aspectual UML modelling approach is formally evaluated using a focus group to test six hypotheses regarding performance; a "good design" criteria-based evaluation to assess the quality of the design; and an AspectJ-based evaluation as a reference measurement-based evaluation. The results of the focus group evaluation confirm all the hypotheses put forward regarding the proposed approach. The proposed approach provides a comprehensive set of Aspectual UML structural and behavioral diagrams, which are designed and implemented based on a comprehensive and detailed set of AspectJ programming constructs.

  3. Information security system quality assessment through the intelligent tools

    NASA Astrophysics Data System (ADS)

    Trapeznikov, E. V.

    2018-04-01

    The development of technology has made comprehensive analysis of automated-system information security necessary, and analysis of the subject area indicates the relevance of this study. The research objective is to develop a methodology for assessing the quality of an information security system using intelligent tools. The basis of the methodology is a model that assesses information security in the information system through a neural network. The paper presents the security assessment model and its algorithm. The practical implementation of the methodology is presented in the form of a software flow diagram. The conclusions note the practical significance of the model being developed.
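
    The paper's model itself is not reproduced in this record, so the following is only a generic sketch of the stated idea (scoring information-security quality with a neural network) using scikit-learn; the features, labels, and sizes are entirely synthetic:

```python
# Synthetic stand-in: classify security quality from audit-style feature scores.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 6))  # e.g. scores for 6 security controls
y = (X.mean(axis=1) > 0.5).astype(int)    # 1 = "adequate" (invented rule)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)
print("Assessed class:", clf.predict([[0.9, 0.8, 0.7, 0.6, 0.9, 0.8]]))
```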

  4. Solid Modeling Aerospace Research Tool (SMART) user's guide, version 2.0

    NASA Technical Reports Server (NTRS)

    Mcmillin, Mark L.; Spangler, Jan L.; Dahmen, Stephen M.; Rehder, John J.

    1993-01-01

    The Solid Modeling Aerospace Research Tool (SMART) software package is used in the conceptual design of aerospace vehicles. It provides a highly interactive and dynamic capability for generating geometries with Bezier cubic patches. Features include automatic generation of commonly used aerospace constructs (e.g., wings and multilobed tanks); cross-section skinning; wireframe and shaded presentation; area, volume, inertia, and center-of-gravity calculations; and interfaces to various aerodynamic and structural analysis programs. A comprehensive description of SMART and how to use it is provided.

  5. Presenting an Evaluation Model for the Cancer Registry Software.

    PubMed

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer is increasingly prevalent, cancer registries are of great importance as the main core of cancer control programs, and many different software packages have been designed for this purpose. Establishing a comprehensive evaluation model is therefore essential for evaluating and comparing a wide range of such software. In this study, the criteria for cancer registry software were determined by studying the documentation and two functional software packages in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the results of validation, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises an evaluation tool and an evaluation method. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. Based on the findings, a criteria-based evaluation method was chosen. The model of this study encompasses various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation between general criteria and specific ones, while trying to fulfill the comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.

  6. HSPF Toolkit: a New Tool for Stormwater Management at the Watershed Scale

    EPA Science Inventory

    The Hydrological Simulation Program - FORTRAN (HSPF) is a comprehensive watershed model endorsed by US EPA for simulating point and nonpoint source pollutants. The model is used for developing total maximum daily load (TMDL) plans for impaired water bodies; as such, HSPF is the c...

  7. A computational approach to climate science education with CLIMLAB

    NASA Astrophysics Data System (ADS)

    Rose, B. E. J.

    2017-12-01

    CLIMLAB is a Python-based software toolkit for interactive, process-oriented climate modeling for use in education and research. It is motivated by the need for simpler tools and more reproducible workflows with which to "fill in the gaps" between blackboard-level theory and the results of comprehensive climate models. With CLIMLAB you can interactively mix and match physical model components, or combine simpler process models together into a more comprehensive model. I use CLIMLAB in the classroom to put models in the hands of students (undergraduate and graduate), and emphasize a hierarchical, process-oriented approach to understanding the key emergent properties of the climate system. CLIMLAB is equally a tool for climate research, where the same needs exist for more robust, process-based understanding and reproducible computational results. I will give an overview of CLIMLAB and an update on recent developments, including: a full-featured, well-documented, interactive implementation of a widely used radiation model (RRTM); packaging with conda-forge for compiler-free (and hassle-free!) installation on Mac, Windows, and Linux; interfacing with xarray for I/O and graphics with gridded model data; and a rich and growing collection of examples and self-computing lecture notes in Jupyter notebook format.
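
    A short usage sketch consistent with this description, assuming the climlab package as distributed on conda-forge (class and helper names may differ between versions):

```python
# Hedged example of CLIMLAB's process-oriented workflow (names assumed from
# the package's public API, not taken from the abstract itself).
import climlab

ebm = climlab.EBM_annual(num_lat=90)   # 1-D annual-mean energy balance model
print(ebm)                             # lists the coupled subprocess components
ebm.integrate_years(5)                 # step the coupled model toward equilibrium
print("Global mean surface temperature:", climlab.global_mean(ebm.Ts))
```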

  8. The database management system: A topic and a tool

    NASA Technical Reports Server (NTRS)

    Plummer, O. R.

    1984-01-01

    Data structures and data base management systems are common tools employed to deal with the administrative information of a university. An understanding of these topics is needed by a much wider audience, ranging from those interested in computer aided design and manufacturing to those using microcomputers. These tools are becoming increasingly valuable to academic programs as they develop comprehensive computer support systems. The wide use of these tools relies upon the relational data model as a foundation. Experience with the use of the IPAD RIM5.0 program is described.

  9. The Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES)

    ERIC Educational Resources Information Center

    Karolcík, Štefan; Cipková, Elena; Hrušecký, Roman; Veselský, Milan

    2015-01-01

    Despite the fact that digital technologies are more and more used in the learning and education process, there is still lack of professional evaluation tools capable of assessing the quality of used digital teaching aids in a comprehensive and objective manner. Construction of the Comprehensive Evaluation of Electronic Learning Tools and…

  10. Construct Validation of the Louisiana School Analysis Model (SAM) Instructional Staff Questionnaire

    ERIC Educational Resources Information Center

    Bray-Clark, Nikki; Bates, Reid

    2005-01-01

    The purpose of this study was to validate the Louisiana SAM Instructional Staff Questionnaire (SISQ), a key component of the Louisiana School Analysis Model. The model was designed as a comprehensive evaluation tool for schools. Principal axis factoring with oblique rotation was used to uncover the underlying structure of the SISQ. (Contains 1 table.)

  11. Creating and Using Interactive, 3D-Printed Models to Improve Student Comprehension of the Bohr Model of the Atom, Bond Polarity, and Hybridization

    ERIC Educational Resources Information Center

    Smiar, Karen; Mendez, J. D.

    2016-01-01

    Molecular model kits have been used in chemistry classrooms for decades but have seen very little recent innovation. Using 3D printing, three sets of physical models were created for a first semester, introductory chemistry course. Students manipulated these interactive models during class activities as a supplement to existing teaching tools for…

  12. Comprehensive Analysis Modeling of Small-Scale UAS Rotors

    NASA Technical Reports Server (NTRS)

    Russell, Carl R.; Sekula, Martin K.

    2017-01-01

    Multicopter unmanned aircraft systems (UAS), or drones, have continued their explosive growth in recent years. With this growth comes demand for increased performance as the limits of existing technologies are reached. In order to better design multicopter UAS aircraft, better performance prediction tools are needed. This paper presents the results of a study aimed at using the rotorcraft comprehensive analysis code CAMRAD II to model a multicopter UAS rotor in hover. Parametric studies were performed to determine the level of fidelity needed in the analysis code inputs to achieve results that match test data. Overall, the results show that CAMRAD II is well suited to model small-scale UAS rotors in hover. This paper presents the results of the parametric studies as well as recommendations for the application of comprehensive analysis codes to multicopter UAS rotors.

  13. Patient-centered medical home model: do school-based health centers fit the model?

    PubMed

    Larson, Satu A; Chapman, Susan A

    2013-01-01

    School-based health centers (SBHCs) are an important component of health care reform. The SBHC model of care offers accessible, continuous, comprehensive, family-centered, coordinated, and compassionate care to infants, children, and adolescents. These same elements comprise the patient-centered medical home (PCMH) model of care being promoted by the Affordable Care Act with the hope of lowering health care costs by rewarding clinicians for primary care services. PCMH survey tools have been developed to help payers determine whether a clinician/site serves as a PCMH. Our concern is that current survey tools will be unable to capture how a SBHC may provide a medical home and therefore be denied needed funding. This article describes how SBHCs might meet the requirements of one PCMH tool. SBHC stakeholders need to advocate for the creation or modification of existing survey tools that allow the unique characteristics of SBHCs to qualify as PCMHs.

  14. Sagebrush ecosystem conservation and management: Ecoregional assessment tools and models for the Wyoming Basins

    USGS Publications Warehouse

    Hanser, S.E.; Leu, M.; Knick, S.T.; Aldridge, Cameron L.

    2011-01-01

    The Wyoming Basins are one of the remaining strongholds of the sagebrush ecosystem. However, as in most sagebrush habitats, threats to this region are numerous. This book adds to current knowledge about the regional status of the sagebrush ecosystem, the distribution of habitats, the threats to the ecosystem, and the influence of threats and habitat conditions on the occurrence and abundance of sagebrush-associated fauna and flora in the Wyoming Basins. Comprehensive methods are outlined for use in data collection and monitoring of wildlife and plant populations. Field and spatial data are integrated into a spatially explicit analytical framework to develop models of species occurrence and abundance for the region. This book provides significant new information on distributions, abundances, and habitat relationships for a number of species of conservation concern that depend on sagebrush in the region. The tools and models presented in this book increase our understanding of impacts from land uses and can contribute to the development of comprehensive management and conservation strategies.

  15. AgBase: supporting functional modeling in agricultural organisms

    PubMed Central

    McCarthy, Fiona M.; Gresham, Cathy R.; Buza, Teresia J.; Chouvarine, Philippe; Pillai, Lakshmi R.; Kumar, Ranjit; Ozkan, Seval; Wang, Hui; Manda, Prashanti; Arick, Tony; Bridges, Susan M.; Burgess, Shane C.

    2011-01-01

    AgBase (http://www.agbase.msstate.edu/) provides resources to facilitate modeling of functional genomics data and structural and functional annotation of agriculturally important animal, plant, microbe and parasite genomes. The website is redesigned to improve accessibility and ease of use, including improved search capabilities. Expanded capabilities include new dedicated pages for horse, cat, dog, cotton, rice and soybean. We currently provide 590 240 Gene Ontology (GO) annotations to 105 454 gene products in 64 different species, including GO annotations linked to transcripts represented on agricultural microarrays. For many of these arrays, this provides the only functional annotation available. GO annotations are available for download and we provide comprehensive, species-specific GO annotation files for 18 different organisms. The tools available at AgBase have been expanded and several existing tools improved based upon user feedback. One of seven new tools available at AgBase, GOModeler, supports hypothesis testing from functional genomics data. We host several associated databases and provide genome browsers for three agricultural pathogens. Moreover, we provide comprehensive training resources (including worked examples and tutorials) via links to Educational Resources at the AgBase website. PMID:21075795

  16. Identifying and tracking attacks on networks: C3I displays and related technologies

    NASA Astrophysics Data System (ADS)

    Manes, Gavin W.; Dawkins, J.; Shenoi, Sujeet; Hale, John C.

    2003-09-01

    Converged network security is extremely challenging for several reasons: expanded system and technology perimeters, unexpected feature interaction, and complex interfaces all conspire to provide hackers with greater opportunities for compromising large networks. Preventive security services and architectures are essential, but in and of themselves do not eliminate all threat of compromise. Attack management systems mitigate this residual risk by facilitating incident detection, analysis and response. There is a wealth of attack detection and response tools for IP networks, but a dearth of such tools for wireless and public telephone networks. Moreover, methodologies and formalisms have yet to be identified that can yield a common model for vulnerabilities and attacks in converged networks. A comprehensive attack management system must coordinate detection tools for converged networks, derive fully-integrated attack and network models, perform vulnerability and multi-stage attack analysis, support large-scale attack visualization, and orchestrate strategic responses to cyber attacks that cross network boundaries. We present an architecture that embodies these principles for attack management. The attack management system described engages a suite of detection tools for various networking domains, feeding real-time attack data to a comprehensive modeling, analysis and visualization subsystem. The resulting early warning system not only provides network administrators with a heads-up cockpit display of their entire network, it also supports guided response and predictive capabilities for multi-stage attacks in converged networks.
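
    The integration pattern described, heterogeneous detectors feeding a single analysis subsystem, can be sketched as adapters emitting one normalized event record onto a shared queue. The sketch below is illustrative only; all field and domain names are hypothetical and are not drawn from the paper:

        # Domain-specific detection tools normalize their alerts into a common
        # schema consumed by the modeling/analysis/visualization subsystem.
        import queue
        from dataclasses import dataclass

        @dataclass
        class AttackEvent:
            domain: str    # e.g. "ip", "wireless", or "pstn"
            source: str    # reporting detection tool
            target: str    # affected network element
            severity: int  # normalized 1-5 scale

        event_bus: "queue.Queue[AttackEvent]" = queue.Queue()

        # An IP-network detector and a telephone-network detector publish the
        # same record type, so the analysis layer stays domain-neutral.
        event_bus.put(AttackEvent("ip", "nids-3", "10.0.0.7", 4))
        event_bus.put(AttackEvent("pstn", "ss7-monitor", "trunk-12", 3))

        while not event_bus.empty():
            print(event_bus.get())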

  17. Ecosystem Risk Assessment Using the Comprehensive Assessment of Risk to Ecosystems (CARE) Tool

    NASA Astrophysics Data System (ADS)

    Battista, W.; Fujita, R.; Karr, K.

    2016-12-01

    Effective Ecosystem Based Management requires a localized understanding of the health and functioning of a given system as well as of the various factors that may threaten the ongoing ability of the system to support the provision of valued services. Several risk assessment models are available that can provide a scientific basis for understanding these factors and for guiding management action, but these models focus mainly on single species and evaluate only the impacts of fishing in detail. We have developed a new ecosystem risk assessment model - the Comprehensive Assessment of Risk to Ecosystems (CARE) - that allows analysts to consider the cumulative impact of multiple threats, interactions among multiple threats that may result in synergistic or antagonistic impacts, and the impacts of a suite of threats on whole-ecosystem productivity and functioning, as well as on specific ecosystem services. The CARE model was designed to be completed in as little as two hours, and uses local and expert knowledge where data are lacking. The CARE tool can be used to evaluate risks facing a single site; to compare multiple sites for the suitability or necessity of different management options; or to evaluate the effects of a proposed management action aimed at reducing one or more risks. This analysis can help users identify which threats are the most important at a given site, and therefore where limited management resources should be targeted. CARE can be applied to virtually any system, and can be modified as knowledge is gained or to better match different site characteristics. CARE builds on previous ecosystem risk assessment tools to provide a comprehensive assessment of fishing and non-fishing threats that can be used to inform environmental management decisions across a broad range of systems.
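
    The abstract does not reproduce the CARE scoring rules; the sketch below is only a generic illustration of cumulative multi-threat scoring with synergistic/antagonistic interaction modifiers, with every threat name, score, and factor invented:

        # Generic cumulative risk score with pairwise interaction modifiers
        # (>1 synergistic, <1 antagonistic). All values are hypothetical.
        threat_scores = {"fishing": 3.0, "nutrient_runoff": 2.0, "warming": 2.5}
        interactions = {("fishing", "warming"): 1.3,
                        ("nutrient_runoff", "warming"): 0.9}

        def cumulative_risk(scores, interactions):
            total = sum(scores.values())
            for (a, b), factor in interactions.items():
                # adjust the joint contribution of each interacting pair
                total += (scores[a] + scores[b]) * (factor - 1.0)
            return total

        print(cumulative_risk(threat_scores, interactions))  # 8.7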

  18. MoCha: Molecular Characterization of Unknown Pathways.

    PubMed

    Lobo, Daniel; Hammelman, Jennifer; Levin, Michael

    2016-04-01

    Automated methods for the reverse-engineering of complex regulatory networks are paving the way for the inference of mechanistic comprehensive models directly from experimental data. These novel methods can infer not only the relations and parameters of the known molecules defined in their input datasets, but also unknown components and pathways identified as necessary by the automated algorithms. Identifying the molecular nature of these unknown components is a crucial step for making testable predictions and experimentally validating the models, yet no specific and efficient tools exist to aid in this process. To this end, we present here MoCha (Molecular Characterization), a tool optimized for the search of unknown proteins and their pathways from a given set of known interacting proteins. MoCha uses the comprehensive dataset of protein-protein interactions provided by the STRING database, which currently includes more than a billion interactions from over 2,000 organisms. MoCha is highly optimized, performing typical searches within seconds. We demonstrate the use of MoCha with the characterization of unknown components from reverse-engineered models from the literature. MoCha is useful for working on network models by hand or as a downstream step of a model inference engine workflow and represents a valuable and efficient tool for the characterization of unknown pathways using known data from thousands of organisms. MoCha and its source code are freely available online under the GPLv3 license.
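
    MoCha itself is a dedicated tool, but the underlying STRING resource it draws on can also be queried directly over STRING's public REST API; a sketch follows, where the endpoint and parameters are taken from STRING's documented API and should be verified against the current version:

        # Look up interaction partners of a known protein in STRING, the kind
        # of query MoCha automates and ranks when characterizing unknowns.
        import requests

        resp = requests.get(
            "https://string-db.org/api/tsv/interaction_partners",
            params={"identifiers": "TP53", "species": 9606, "limit": 10},
            timeout=30,
        )
        resp.raise_for_status()
        for line in resp.text.splitlines()[1:]:  # skip the TSV header row
            print(line.split("\t"))  # partner identifiers and scores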

  19. The Environment-Power System Analysis Tool development program. [for spacecraft power supplies

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Wilcox, Katherine G.; Stevens, N. John; Putnam, Rand M.; Roche, James C.

    1989-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide engineers with the ability to assess the effects of a broad range of environmental interactions on space power systems. A unique user-interface-data-dictionary code architecture oversees a collection of existing and future environmental modeling codes (e.g., neutral density) and physical interaction models (e.g., sheath ionization). The user-interface presents the engineer with tables, graphs, and plots which, under supervision of the data dictionary, are automatically updated in response to parameter change. EPSAT thus provides the engineer with a comprehensive and responsive environmental assessment tool and the scientist with a framework into which new environmental or physical models can be easily incorporated.

  1. A new assessment model and tool for pediatric nurse practitioners.

    PubMed

    Burns, C

    1992-01-01

    This article presents a comprehensive assessment model for pediatric nurse practitioner (PNP) practice that integrates familiar elements of the classical medical history, Gordon's Functional Health Patterns, and developmental fields into one system. This model drives the diagnostic reasoning process toward consideration of a broad range of disease, daily living (nursing diagnosis), and developmental diagnoses, which represents PNP practice better than the medical model does.

  2. Mixed-Dimensionality VLSI-Type Configurable Tools for Virtual Prototyping of Biomicrofluidic Devices and Integrated Systems

    NASA Astrophysics Data System (ADS)

    Makhijani, Vinod B.; Przekwas, Andrzej J.

    2002-10-01

    This report presents the results of a DARPA/MTO Composite CAD project aimed at developing a comprehensive microsystem CAD environment, CFD-ACE+ Multiphysics, for bio- and microfluidic devices and complete microsystems. The project began in July 1998, and was a three-year team effort between CFD Research Corporation, California Institute of Technology (CalTech), University of California, Berkeley (UCB), and Tanner Research, with Mr. Don Verlee from Abbott Labs participating as a consultant on the project. The overall objective of this project was to develop, validate and demonstrate several applications of a user-configurable VLSI-type mixed-dimensionality software tool for the design of biomicrofluidic devices and integrated systems. The developed tool would provide high-fidelity 3-D multiphysics modeling capability, 1-D fluidic circuit modeling, a SPICE interface for system-level simulations, and mixed-dimensionality design. It would combine tools for layouts and process fabrication, geometric modeling, and automated grid generation, and interfaces to EDA tools (e.g. Cadence) and MCAD tools (e.g. ProE).

  3. CARD 2017: expansion and model-centric curation of the comprehensive antibiotic resistance database

    PubMed Central

    Jia, Baofeng; Raphenya, Amogelang R.; Alcock, Brian; Waglechner, Nicholas; Guo, Peiyao; Tsang, Kara K.; Lago, Briony A.; Dave, Biren M.; Pereira, Sheldon; Sharma, Arjun N.; Doshi, Sachin; Courtot, Mélanie; Lo, Raymond; Williams, Laura E.; Frye, Jonathan G.; Elsayegh, Tariq; Sardar, Daim; Westman, Erin L.; Pawlowski, Andrew C.; Johnson, Timothy A.; Brinkman, Fiona S.L.; Wright, Gerard D.; McArthur, Andrew G.

    2017-01-01

    The Comprehensive Antibiotic Resistance Database (CARD; http://arpcard.mcmaster.ca) is a manually curated resource containing high quality reference data on the molecular basis of antimicrobial resistance (AMR), with an emphasis on the genes, proteins and mutations involved in AMR. CARD is ontologically structured, model centric, and spans the breadth of AMR drug classes and resistance mechanisms, including intrinsic, mutation-driven and acquired resistance. It is built upon the Antibiotic Resistance Ontology (ARO), a custom built, interconnected and hierarchical controlled vocabulary allowing advanced data sharing and organization. Its design allows the development of novel genome analysis tools, such as the Resistance Gene Identifier (RGI) for resistome prediction from raw genome sequence. Recent improvements include extensive curation of additional reference sequences and mutations, development of a unique Model Ontology and accompanying AMR detection models to power sequence analysis, new visualization tools, and expansion of the RGI for detection of emergent AMR threats. CARD curation is updated monthly based on an interplay of manual literature curation, computational text mining, and genome analysis. PMID:27789705

  4. The development of a model of culturally responsive science and mathematics teaching

    NASA Astrophysics Data System (ADS)

    Hernandez, Cecilia M.; Morales, Amanda R.; Shroyer, M. Gail

    2013-12-01

    This qualitative theoretical study was conducted in response to the current need for an inclusive and comprehensive model to guide the preparation and assessment of teacher candidates for culturally responsive teaching. The process of developing a model of culturally responsive teaching involved three steps: a comprehensive review of the literature; a synthesis of the literature into thematic categories to capture the dispositions and behaviors of culturally responsive teaching; and the piloting of these thematic categories with teacher candidates to validate the usefulness of the categories and to generate specific exemplars of behavior to represent each category. The model of culturally responsive teaching contains five thematic categories: (1) content integration, (2) facilitating knowledge construction, (3) prejudice reduction, (4) social justice, and (5) academic development. The current model is a promising tool for comprehensively defining culturally responsive teaching in the context of teacher education as well as to guide curriculum and assessment changes aimed to increase candidates' culturally responsive knowledge and skills in science and mathematics teaching.

  5. A modelling tool for policy analysis to support the design of efficient and effective policy responses for complex public health problems.

    PubMed

    Atkinson, Jo-An; Page, Andrew; Wells, Robert; Milat, Andrew; Wilson, Andrew

    2015-03-03

    In the design of public health policy, a broader understanding of risk factors for disease across the life course, and an increasing awareness of the social determinants of health, have led to the development of more comprehensive, cross-sectoral strategies to tackle complex problems. However, comprehensive strategies may not represent the most efficient or effective approach to reducing disease burden at the population level. Rather, they may act to spread finite resources less intensively over a greater number of programs and initiatives, diluting the potential impact of the investment. While analytic tools are available that use research evidence to help identify and prioritise disease risk factors for public health action, they are inadequate to support more targeted and effective policy responses for complex public health problems. This paper discusses the limitations of analytic tools that are commonly used to support evidence-informed policy decisions for complex problems. It proposes an alternative policy analysis tool which can integrate diverse evidence sources and provide a platform for virtual testing of policy alternatives in order to design solutions that are efficient, effective, and equitable. The case of suicide prevention in Australia is presented to demonstrate the limitations of current tools to adequately inform prevention policy and to discuss the utility of the new policy analysis tool. In contrast to popular belief, a systems approach takes a step beyond comprehensive thinking and seeks to identify where best to target public health action and resources for optimal impact. It is concerned primarily with what can reasonably be left out of strategies for prevention and can be used to explore where disinvestment may occur without adversely affecting population health (or equity). Simulation modelling used for policy analysis offers promise in being able to better operationalise research evidence to support decision making for complex problems and improve the targeting of public health policy, and it offers a foundation for strengthening relationships between policy makers, stakeholders, and researchers.

  6. Estimating the Diets of Animals Using Stable Isotopes and a Comprehensive Bayesian Mixing Model

    PubMed Central

    Hopkins, John B.; Ferguson, Jake M.

    2012-01-01

    Using stable isotope mixing models (SIMMs) as a tool to investigate the foraging ecology of animals is gaining popularity among researchers. As a result, statistical methods are rapidly evolving and numerous models have been produced to estimate the diets of animals, each with its benefits and limitations. Deciding which SIMM to use is contingent on factors such as the consumer of interest, its food sources, sample size, the familiarity a user has with a particular framework for statistical analysis, or the level of inference the researcher desires to make (e.g., population- or individual-level). In this paper, we provide a review of commonly used SIMMs and describe a comprehensive SIMM, IsotopeR, that includes all features commonly used in SIMM analysis and two new features. We used data collected in Yosemite National Park to demonstrate IsotopeR's ability to estimate dietary parameters. We then examined the importance of each feature in the model and compared our results to inferences from commonly used SIMMs. IsotopeR's user interface (in R) will provide researchers a user-friendly tool for SIMM analysis. The model is also applicable for use in paleontology, archaeology, and forensic studies, as well as for estimating pollution inputs. PMID:22235246
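
    For the simplest two-source, one-isotope case, the diet fraction follows directly from the linear mixing equation d_mix = f*d_A + (1-f)*d_B; a worked sketch with invented delta-13C values is given below (a full Bayesian SIMM such as IsotopeR additionally propagates source variability, discrimination factors, and measurement error):

        # Two-source, one-isotope linear mixing; all delta values invented.
        d_mix = -22.0  # consumer tissue delta13C (per mil)
        d_a = -26.0    # source A (e.g., plants)
        d_b = -18.0    # source B (e.g., animal prey)

        f_a = (d_mix - d_b) / (d_a - d_b)  # fraction of diet from source A
        print(f_a)  # 0.5 -> half of the diet comes from source A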

  7. Implementation of a novel communication tool and its effect on patient comprehension of care and satisfaction.

    PubMed

    Simmons, Stefanie Anne; Sharp, Brian; Fowler, Jennifer; Singal, Bonita

    2013-05-01

    Emergency department (ED) communication has been demonstrated as requiring improvement and ED patients have repeatedly demonstrated poor comprehension of the care they receive. Through patient focus groups, the authors developed a novel tool designed to improve communication and patient comprehension. This is a prospective, randomised controlled clinical trial to test the efficacy of a novel, patient-centred communication tool. Patients in a small community hospital ED were randomised to receive the instrument, which was utilised by the entire ED care team and served as a checklist or guide to the patients' ED stay. At the end of the ED stay, patients completed a survey of their comprehension of the care and a communication assessment tool-team survey (a validated instrument to assess satisfaction with communication). Three blinded chart reviewers scored patients' comprehension of their ED care as concordant, partially concordant or discordant with charted care. The authors tested whether there was a difference in satisfaction using a two-sample t test and a difference in comprehension using ordinal logistic regression analysis. 146 patients were enrolled in the study with 72 randomised to receive the communication instrument. There was no significant difference between groups in comprehension (OR=0.65, 95% CI 0.34 to 1.23, p=0.18) or communication assessment tool-team scores (difference=0.2, 95% CI: -3.4 to 3.8, p=0.91). Using their novel communication tool, the authors were not able to show a statistically significant improvement in either comprehension or satisfaction, though a tendency towards improved comprehension was seen.

  8. Toward A Simulation-Based Tool for the Treatment of Vocal Fold Paralysis

    PubMed Central

    Mittal, Rajat; Zheng, Xudong; Bhardwaj, Rajneesh; Seo, Jung Hee; Xue, Qian; Bielamowicz, Steven

    2011-01-01

    Advances in high-performance computing are enabling a new generation of software tools that employ computational modeling for surgical planning. Surgical management of laryngeal paralysis is one area where such computational tools could have a significant impact. The current paper describes a comprehensive effort to develop a software tool for planning medialization laryngoplasty where a prosthetic implant is inserted into the larynx in order to medialize the paralyzed vocal fold (VF). While this is one of the most common procedures used to restore voice in patients with VF paralysis, it has a relatively high revision rate, and the tool being developed is expected to improve surgical outcomes. This software tool models the biomechanics of airflow-induced vibration in the human larynx and incorporates sophisticated approaches for modeling the turbulent laryngeal flow, the complex dynamics of the VFs, as well as the production of voiced sound. The current paper describes the key elements of the modeling approach, presents computational results that demonstrate the utility of the approach and also describes some of the limitations and challenges. PMID:21556320

  9. Ball Bearing Analysis with the ORBIS Tool

    NASA Technical Reports Server (NTRS)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life, all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code correlates closely with established predictions of bearing internal load distributions, stiffness, deflection and stresses.

  10. Hydro-economic modeling of the role of forests on water resources production in Andalusia, Spain

    NASA Astrophysics Data System (ADS)

    Beguería, Santiago; Serrano-Notivoli, Roberto; Álvarez-Palomino, Alejandro; Campos, Pablo

    2014-05-01

    The development of more refined information tools is a pre-requisite for supporting decision making in the context of integrated water resources management. Among these tools, hydro-economic models are favoured because they allow integrating the ecological, hydrological, infrastructure and economic aspects into a coherent, scientifically-informed framework. We present a case study that physically assesses the water resources of forest lands of the Andalusia region in Spain and conducts an economic environmental income and asset valuation of the forest surface water yield. We show how, based on available hydrologic and economic data, we can develop a comprehensive water account for all the forest lands at the regional scale. This forest water environmental valuation is part of the larger RECAMAN project, which aims at providing a robust and easily replicable accounting tool to evaluate yearly the total income and capital generated by the forest land, encompassing all measurable sources of private and public incomes (timber and cork production, auto-consumption, recreational activities, biodiversity conservation, carbon sequestration, water production, etc.). Only a comprehensive integrated tool such as the one built within the RECAMAN project may serve as a basis for the development of integrated policies such as those internationally agreed and recommended for the management of water resources.

  11. Polar bear encephalitis: establishment of a comprehensive next-generation pathogen analysis pipeline for captive and free-living wildlife.

    PubMed

    Szentiks, C A; Tsangaras, K; Abendroth, B; Scheuch, M; Stenglein, M D; Wohlsein, P; Heeger, F; Höveler, R; Chen, W; Sun, W; Damiani, A; Nikolin, V; Gruber, A D; Grobbel, M; Kalthoff, D; Höper, D; Czirják, G Á; Derisi, J; Mazzoni, C J; Schüle, A; Aue, A; East, M L; Hofer, H; Beer, M; Osterrieder, N; Greenwood, A D

    2014-05-01

    This report describes three possibly related incidents of encephalitis, two of them lethal, in captive polar bears (Ursus maritimus). Standard diagnostic methods failed to identify pathogens in any of these cases. A comprehensive, three-stage diagnostic 'pipeline' employing both standard serological methods and new DNA microarray and next-generation sequencing-based diagnostics was developed, in part as a consequence of this initial failure. This pipeline approach illustrates the strengths, weaknesses and limitations of these tools in determining pathogen-caused deaths in non-model organisms such as wildlife species, and why the use of a limited number of diagnostic tools may fail to uncover important wildlife pathogens.

  12. NEW GIS WATERSHED ANALYSIS TOOLS FOR SOIL CHARACTERIZATION AND EROSION AND SEDIMENTATION MODELING

    EPA Science Inventory

    A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed which utilizes a suite of automated scripts and a pair of processing-intensive executable programs operating on a personal computer platform.

  13. Computational nanoelectronics towards: design, analysis, synthesis, and fundamental limits

    NASA Technical Reports Server (NTRS)

    Klimeck, G.

    2003-01-01

    This seminar will review the development of a comprehensive nanoelectronic modeling tool (NEMO 1-D and NEMO 3-D) and its application to high-speed electronics (resonant tunneling diodes) and IR detectors and lasers (quantum dots and 1-D heterostructures).

  14. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
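
    One check such tools can automate is illustrated below: in a directed failure-propagation graph, every failure mode should reach at least one observable node, or diagnostics cannot see it. The graph and node names are hypothetical, not drawn from the NASA models:

        # Flag failure modes whose effects never reach an observable node.
        import networkx as nx

        ffm = nx.DiGraph()
        ffm.add_edges_from([
            ("valve_stuck", "low_flow"),
            ("pump_degraded", "low_flow"),
            ("low_flow", "pressure_sensor_A"),
            ("seal_leak", "cavity_pressure"),  # no sensor downstream: a model gap
        ])
        failure_modes = {"valve_stuck", "pump_degraded", "seal_leak"}
        observables = {"pressure_sensor_A"}

        for mode in sorted(failure_modes):
            if not (nx.descendants(ffm, mode) & observables):
                print(f"unobservable failure mode: {mode}")  # flags 'seal_leak'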

  15. Modeling and Visualizing Flow of Chemical Agents Across Complex Terrain

    NASA Technical Reports Server (NTRS)

    Kao, David; Kramer, Marc; Chaderjian, Neal

    2005-01-01

    Release of chemical agents across complex terrain presents a real threat to homeland security. Modeling and visualization tools are being developed that capture fluid-terrain flow interaction as well as point-dispersal downstream flow paths. These analytic tools, when coupled with UAV atmospheric observations, provide predictive capabilities to allow for rapid emergency response as well as developing a comprehensive preemptive counter-threat evacuation plan. The visualization tools involve high-end computing and massive parallel processing combined with texture mapping. We demonstrate our approach across a mountainous portion of northern California under two contrasting meteorological conditions. Animations depicting flow over this geographical location provide immediate assistance in decision support and crisis management.

  16. Modelling the role of forests on water provision services: a hydro-economic valuation approach

    NASA Astrophysics Data System (ADS)

    Beguería, S.; Campos, P.

    2015-12-01

    Hydro-economic models that allow integrating the ecological, hydrological, infrastructure, economic and social aspects into a coherent, scientifically-informed framework constitute preferred tools for supporting decision making in the context of integrated water resources management. We present a case study of water regulation and provision services of forests in the Andalusia region of Spain. Our model computes the physical water flows and conducts an economic environmental income and asset valuation of forest surface and underground water yield. Based on available hydrologic and economic data, we develop a comprehensive water account for all the forest lands at the regional scale. This forest water environmental valuation is integrated within a much larger project aiming at providing a robust and easily replicable accounting tool to evaluate yearly the total income and capital of forests, encompassing all measurable sources of private and public incomes (timber and cork production, auto-consumption, recreational activities, biodiversity conservation, carbon sequestration, water production, etc.). We also force our simulation with future socio-economic scenarios to quantify the physical and economic effects of expected trends or simulated public and private policies on future water resources. Only a comprehensive integrated tool may serve as a basis for the development of integrated policies, such as those internationally agreed and recommended for the management of water resources.

  17. Creating a Double-Spring Model to Teach Chromosome Movement during Mitosis & Meiosis

    ERIC Educational Resources Information Center

    Luo, Peigao

    2012-01-01

    The comprehension of chromosome movement during mitosis and meiosis is essential for understanding genetic transmission, but students often find this process difficult to grasp in a classroom setting. I propose a "double-spring model" that incorporates a physical demonstration and can be used as a teaching tool to help students understand this…

  18. Modeling regional-scale wildland fire emissions with the wildland fire emissions information system

    Treesearch

    Nancy H.F. French; Donald McKenzie; Tyler Erickson; Benjamin Koziol; Michael Billmire; K. Endsley; Naomi K.Y. Scheinerman; Liza Jenkins; Mary E. Miller; Roger Ottmar; Susan Prichard

    2014-01-01

    As carbon modeling tools become more comprehensive, spatial data are needed to improve quantitative maps of carbon emissions from fire. The Wildland Fire Emissions Information System (WFEIS) provides mapped estimates of carbon emissions from historical forest fires in the United States through a web browser. WFEIS improves access to data and provides a consistent...

  19. Modeling small cell lung cancer (SCLC) biology through deterministic and stochastic mathematical models.

    PubMed

    Salgia, Ravi; Mambetsariev, Isa; Hewelt, Blake; Achuthan, Srisairam; Li, Haiqing; Poroyko, Valeriy; Wang, Yingyu; Sattler, Martin

    2018-05-25

    Mathematical cancer models are immensely powerful tools that are based in part on the fractal nature of biological structures, such as the geometry of the lung. Cancers of the lung provide an opportune model to develop and apply algorithms that capture changes and disease phenotypes. We reviewed mathematical models that have been developed for the biological sciences and applied them in the context of small cell lung cancer (SCLC) growth, mutational heterogeneity, and mechanisms of metastasis. The ultimate goal is to capture the stochastic and deterministic nature of this disease, to link this comprehensive set of tools back to its fractal nature, and to provide a platform for accurate biomarker development. These techniques may be particularly useful in the context of drug development research, such as in combination with existing omics approaches. The integration of these tools will be important to further understand the biology of SCLC and ultimately develop novel therapeutics.

  20. Models as Artefacts of a Dual Nature: A Philosophical Contribution to Teaching about Models Designed and Used in Engineering Practice

    ERIC Educational Resources Information Center

    Nia, Mahdi G.; de Vries, Marc J.

    2017-01-01

    Although "models" play a significant role in engineering activities, not much has yet been developed to enhance the technological literacy of students in this regard. This contribution intends to help fill this gap and deliver a comprehensive account as to the nature and various properties of these engineering tools. It begins by…

  1. Irena : tool suite for modeling and analysis of small-angle scattering.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilavsky, J.; Jemian, P.

    2009-04-01

    Irena, a tool suite for analysis of both X-ray and neutron small-angle scattering (SAS) data within the commercial Igor Pro application, brings together a comprehensive suite of tools useful for investigations in materials science, physics, chemistry, polymer science and other fields. In addition to Guinier and Porod fits, the suite combines a variety of advanced SAS data evaluation tools for the modeling of size distribution in the dilute limit using maximum entropy and other methods, dilute limit small-angle scattering from multiple non-interacting populations of scatterers, the pair-distance distribution function, a unified fit, the Debye-Bueche model, reflectivity (X-ray and neutron) using Parratt's formalism, and small-angle diffraction. There are also a number of support tools, such as a data import/export tool supporting a broad sampling of common data formats, a data modification tool, a presentation-quality graphics tool optimized for small-angle scattering data, and a neutron and X-ray scattering contrast calculator. These tools are brought together into one suite with consistent interfaces and functionality. The suite allows robust automated note recording and saving of parameters during export.

  2. Research on the spatial analysis method of seismic hazard for island

    NASA Astrophysics Data System (ADS)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering in the exploitation and construction of both inhabited and uninhabited islands: microscopically, its results provide parameters for seismic design, and macroscopically it is requisite work for the earthquake and comprehensive disaster prevention components of island conservation planning. The existing seismic hazard analysis methods are compared in their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed in ArcGIS's Model Builder platform.
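
    The composition step of a standard fuzzy comprehensive evaluation is B = W . R, where W holds the index weights and R the membership of each index in each hazard grade; a minimal numpy sketch with three hypothetical indices and grades follows (the actual SAMSHI indices and weights are not given in the abstract):

        # Fuzzy comprehensive evaluation: compose weights with memberships.
        # Weights, grades, and membership values below are hypothetical.
        import numpy as np

        W = np.array([0.5, 0.3, 0.2])  # weights for three example indices
        R = np.array([                 # rows: indices; cols: low/medium/high
            [0.1, 0.3, 0.6],
            [0.2, 0.5, 0.3],
            [0.6, 0.3, 0.1],
        ])
        B = W @ R                      # site membership in each hazard grade
        grades = ["low", "medium", "high"]
        print(grades[int(np.argmax(B))])  # grade with maximum membership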

  3. A comprehensive dairy valorization model.

    PubMed

    Banaszewska, A; Cruijssen, F; van der Vorst, J G A J; Claassen, G D H; Kampman, J L

    2013-02-01

    Dairy processors face numerous challenges resulting from both unsteady dairy markets and some specific characteristics of dairy supply chains. To maintain a competitive position on the market, companies must look beyond standard solutions currently used in practice. This paper presents a comprehensive dairy valorization model that serves as a decision support tool for mid-term allocation of raw milk to end products and production planning. The developed model was used to identify the optimal product portfolio composition. The model allocates raw milk to the most profitable dairy products while accounting for important constraints (i.e., recipes, composition variations, dairy production interdependencies, seasonality, demand, supply, capacities, and transportation flows). The inclusion of all relevant constraints and the ease of understanding dairy production dynamics make the model comprehensive. The developed model was tested at the international dairy processor FrieslandCampina (Amersfoort, the Netherlands). The structure of the model and its output were discussed in multiple sessions with and approved by relevant FrieslandCampina employees. The elements included in the model were considered necessary to optimally valorize raw milk. To illustrate the comprehensiveness and functionality of the model, we analyzed the effect of seasonality on milk valorization. A large difference in profit and a shift in the allocation of milk showed that seasonality has a considerable impact on the valorization of raw milk.
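
    At its core this kind of valorization problem is a constrained allocation that can be prototyped as a linear program; a toy sketch with scipy follows, in which every product, price, and recipe coefficient is invented, and the real model adds recipes, seasonality, interdependencies, and transport:

        # Toy milk-allocation LP: maximize profit subject to raw-milk supply
        # and plant capacity. All numbers are invented for illustration.
        from scipy.optimize import linprog

        profit = [0.30, 0.45, 0.25]       # profit per kg: cheese, butter, powder
        milk_per_kg = [10.0, 20.0, 8.0]   # kg raw milk per kg of product
        milk_supply = 100_000             # kg raw milk available
        capacity = [6_000, 2_500, 7_000]  # plant capacity per product (kg)

        res = linprog(
            c=[-p for p in profit],                  # linprog minimizes
            A_ub=[milk_per_kg], b_ub=[milk_supply],  # raw-milk balance
            bounds=list(zip([0, 0, 0], capacity)),
        )
        print(res.x, -res.fun)  # optimal product mix and total profit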

  4. Comprehending emergent systems phenomena through direct-manipulation animation

    NASA Astrophysics Data System (ADS)

    Aguirre, Priscilla Abel

    This study seeks to understand the type of interaction mode that best supports learning and comprehension of emergent systems phenomena. Given that the literature has established that students hold robust misconceptions of such phenomena, this study investigates the influence of using three types of interaction: speed-manipulation animation (SMA), post-manipulation animation (PMA) and direct-manipulation animation (DMA) for increasing comprehension and testing transfer of the phenomena, by looking at the effect of simultaneous interaction of haptic and visual channels on long-term and working memories when seeking to comprehend emergent phenomena. The questions asked were: (1) Does the teaching of emergent phenomena, with the aid of a dynamic interactive modeling tool (i.e., SMA, PMA or DMA), improve students' mental model construction of systems, thus increasing comprehension of this scientific concept? And (2) does the teaching of emergent phenomena, with the aid of a dynamic interactive modeling tool, give the students the necessary complex cognitive skill which can then be applied to similar (near transfer) and/or novel, but different, (far transfer) scenarios? In an empirical study, undergraduate and graduate students were asked to participate in one of three experimental conditions: SMA, PMA, or DMA. The results of the study found that it was the participants of the SMA treatment condition that had the most improvement in post-test scores. Students' understanding of the phenomena increased most when they used a dynamic model with few interactive elements (i.e., start, stop, and speed) that allowed for real-time visualization of one's interaction on the phenomena. Furthermore, no indication was found that the learning of emergent phenomena, with the aid of a dynamic interactive modeling tool, gave the students the necessary complex cognitive skill which could then be applied to similar (near transfer) and/or novel, but different, (far transfer) scenarios. Finally, besides treatment condition, gender and age were also shown to be predictors of score differences; overall, males did better than females, and younger students did better than older students.

  5. IPMP 2013 - A comprehensive data analysis tool for predictive microbiology

    USDA-ARS?s Scientific Manuscript database

    Predictive microbiology is an area of applied research in food science that uses mathematical models to predict the changes in the population of pathogenic or spoilage microorganisms in foods undergoing complex environmental changes during processing, transportation, distribution, and storage. It f...

  6. Generating community-built tools for data sharing and analysis in environmental networks

    USGS Publications Warehouse

    Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David

    2016-01-01

    Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.

  7. The business process management software for successful quality management and organization: A case study from the University of Split School of Medicine.

    PubMed

    Sapunar, Damir; Grković, Ivica; Lukšić, Davor; Marušić, Matko

    2016-05-01

    Our aim was to describe a comprehensive model of internal quality management (QM) at a medical school founded on a business process analysis (BPA) software tool. The BPA software tool was used as the core element for describing all working processes in our medical school, and the resulting system served as the comprehensive model of internal QM. The quality management system at the University of Split School of Medicine included the documentation and analysis of all business processes within the School. The analysis revealed 80 weak points related to one or several business processes. A precise analysis of medical school business processes allows identification of unfinished, unclear and inadequate points in these processes, and subsequently the respective improvements and an increase of the QM level, and ultimately a rationalization of the institution's work. Our approach offers a potential reference model for the development of a common QM framework allowing continuous quality control, i.e. adjustment and adaptation to the contemporary educational needs of medical students.

  8. Tools for building a comprehensive modeling system for virtual screening under real biological conditions: The Computational Titration algorithm.

    PubMed

    Kellogg, Glen E; Fornabaio, Micaela; Chen, Deliang L; Abraham, Donald J; Spyrakis, Francesca; Cozzini, Pietro; Mozzarelli, Andrea

    2006-05-01

    Computational tools utilizing a unique empirical modeling system based on the hydrophobic effect and the measurement of logP(o/w) (the partition coefficient for solvent transfer between 1-octanol and water) are described. The associated force field, Hydropathic INTeractions (HINT), contains much rich information about non-covalent interactions in the biological environment because of its basis in an experiment that measures interactions in solution. HINT is shown to be the core of an evolving virtual screening system that is capable of taking into account a number of factors often ignored such as entropy, effects of solvent molecules at the active site, and the ionization states of acidic and basic residues and ligand functional groups. The outline of a comprehensive modeling system for virtual screening that incorporates these features is described. In addition, a detailed description of the Computational Titration algorithm is provided. As an example, three complexes of dihydrofolate reductase (DHFR) are analyzed with our system and these results are compared with the experimental free energies of binding.

  9. Evaluating the integration of cultural competence skills into health and physical assessment tools: a survey of Canadian schools of nursing.

    PubMed

    Chircop, Andrea; Edgecombe, Nancy; Hayward, Kathryn; Ducey-Gilbert, Cherie; Sheppard-Lemoine, Debbie

    2013-04-01

    Audiovisual (AV) teaching tools currently used to teach health and physical assessment reflect a Eurocentric bias rooted in the biomedical model. The purpose of our study was to (a) identify commonly used AV teaching tools in Canadian schools of nursing and (b) evaluate the identified tools. A two-part descriptive quantitative method design was used. First, we surveyed schools of nursing across Canada. Second, the identified AV teaching tools were evaluated for content and modeling of cultural competence. The majority of the schools (67%) used publisher-produced videos associated with a physical assessment textbook. Major findings included minimal demonstration of negotiation with a client around cultural aspects of the interview, including the need for an interpreter, modesty, and inclusion of support persons. Identification of culturally specific examples given during the videos was superficial and did not provide students with a comprehensive understanding of necessary culturally competent skills.

  10. Comprehension Tools for Teachers: Reading for Understanding from Prekindergarten through Fourth Grade

    ERIC Educational Resources Information Center

    Connor, Carol McDonald; Phillips, Beth M.; Kaschak, Michael; Apel, Kenn; Kim, Young-Suk; Al Otaiba, Stephanie; Crowe, Elizabeth C.; Thomas-Tate, Shurita; Johnson, Lakeisha Cooper; Lonigan, Christopher J.

    2014-01-01

    This paper describes the theoretical framework, as well as the development and testing of the intervention, "Comprehension Tools for Teachers" (CTT), which is composed of eight component interventions targeting malleable language and reading comprehension skills that emerging research indicates contribute to proficient reading for…

  11. Comprehensive data resources and analytical tools for pathological association of aminoacyl tRNA synthetases with cancer

    PubMed Central

    Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon

    2015-01-01

    Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze aminoacylation of tRNAs during protein synthesis. Despite their housekeeping functions in protein synthesis, ARSs and ARS-interacting multifunctional proteins (AIMPs) have recently been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, there is a lack of data resources and analytical tools that can be used to examine disease associations of ARS/AIMPs. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data of ARS/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation and phosphorylation data of ARS/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploration of disease associations of ARS/AIMPs, identification of disease-associated ARS/AIMP interactors and reconstruction of ARS-dependent disease-perturbed network models. Therefore, IDA provides both comprehensive data resources and analytical tools for understanding potential roles of ARS/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651

  12. A comprehensive review on biosorption of heavy metals by algal biomass: materials, performances, chemistry, and modeling simulation tools.

    PubMed

    He, Jinsong; Chen, J Paul

    2014-05-01

    Heavy metals contamination has become a global issue of concern due to the metals' high toxicity, non-biodegradability, strong bioaccumulation in the human body and food chain, and carcinogenicity to humans. A series of studies demonstrates that biosorption is a promising technology for the removal of heavy metals from aqueous solutions. Algae serve as good biosorbents due to their abundance in seawater and fresh water, cost-effectiveness, reusability and high metal sorption capacities. This article provides a comprehensive review of recent findings on the performance, applications and chemistry of algae (e.g., brown, green and red algae, modified algae and their derivatives) for the sequestration of heavy metals. Biosorption kinetics and equilibrium models are reviewed, and the mechanisms of biosorption are presented. Biosorption is a complicated process involving ion exchange, complexation and coordination. Finally, theoretical simulation tools for biosorption equilibrium and kinetics are presented so that readers can use them for further studies.
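
    The equilibrium side of such studies is commonly fit with the Langmuir isotherm q_e = q_max*K*C_e / (1 + K*C_e); a sketch of the fit on invented equilibrium data:

        # Fit a Langmuir isotherm, a standard biosorption equilibrium model,
        # to illustrative (invented) metal-uptake data.
        import numpy as np
        from scipy.optimize import curve_fit

        def langmuir(c_e, q_max, k):
            return q_max * k * c_e / (1.0 + k * c_e)

        c_e = np.array([5.0, 10.0, 25.0, 50.0, 100.0])  # mg/L at equilibrium
        q_e = np.array([12.0, 20.0, 33.0, 41.0, 46.0])  # mg metal per g algae

        (q_max, k), _ = curve_fit(langmuir, c_e, q_e, p0=(50.0, 0.05))
        print(f"q_max = {q_max:.1f} mg/g, K = {k:.3f} L/mg")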

  13. Developing a planning model to estimate future cash flows.

    PubMed

    Barenbaum, L; Monahan, T F

    1988-03-01

    Financial managers are discovering that net income and other traditional measures of cash flow may not provide them with the flexibility needed for comprehensive internal planning and control. By using a discretionary cash flow model, financial managers have a forecasting tool that can help them measure anticipated cash flows, and make better decisions concerning financing alternatives, capital expansion, and performance appraisal.

  14. PomBase: a comprehensive online resource for fission yeast

    PubMed Central

    Wood, Valerie; Harris, Midori A.; McDowall, Mark D.; Rutherford, Kim; Vaughan, Brendan W.; Staines, Daniel M.; Aslett, Martin; Lock, Antonia; Bähler, Jürg; Kersey, Paul J.; Oliver, Stephen G.

    2012-01-01

    PomBase (www.pombase.org) is a new model organism database established to provide access to comprehensive, accurate, and up-to-date molecular data and biological information for the fission yeast Schizosaccharomyces pombe to effectively support both exploratory and hypothesis-driven research. PomBase encompasses annotation of genomic sequence and features, comprehensive manual literature curation and genome-wide data sets, and supports sophisticated user-defined queries. The implementation of PomBase integrates a Chado relational database that houses manually curated data with Ensembl software that supports sequence-based annotation and web access. PomBase will provide user-friendly tools to promote curation by experts within the fission yeast community. This will make a key contribution to shaping its content and ensuring its comprehensiveness and long-term relevance. PMID:22039153

  15. Validation of Multiple Tools for Flat Plate Photovoltaic Modeling Against Measured Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, J.; Whitmore, J.; Blair, N.

    2014-08-01

    This report expands upon a previous work by the same authors, published in the 40th IEEE Photovoltaic Specialists Conference. In this validation study, comprehensive analysis is performed on nine photovoltaic systems for which NREL could obtain detailed performance data and specifications, including three utility-scale systems and six commercial-scale systems. Multiple photovoltaic performance modeling tools were used to model these nine systems, and the error of each tool was analyzed against quality-controlled measured performance data. This study shows that, excluding identified outliers, all tools achieve annual errors within +/-8% and hourly root mean squared errors less than 7% for all systems. It is further shown using SAM that module model and irradiance input choices can change the annual error with respect to measured data by as much as 6.6% for these nine systems, although all combinations examined still fall within an annual error range of +/-8.5%. Additionally, a seasonal variation in monthly error is shown for all tools. Finally, the effects of irradiance data uncertainty and the use of default loss assumptions on annual error are explored, and two approaches to reduce the error inherent in photovoltaic modeling are proposed.
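
    The two headline metrics reduce to simple formulas on paired hourly series; a sketch follows, with stand-in arrays, and with normalization of the hourly RMSE by the mean of the measured data being an assumption rather than the study's exact definition:

        # Annual error and normalized hourly RMSE of modeled vs measured
        # generation. The arrays are stand-ins for real hourly data.
        import numpy as np

        measured = np.random.default_rng(0).uniform(0, 5, 8760)  # kW, stand-in
        modeled = measured * 1.03                                # stand-in bias

        annual_error = (modeled.sum() - measured.sum()) / measured.sum() * 100
        rmse = np.sqrt(np.mean((modeled - measured) ** 2))
        nrmse = rmse / measured.mean() * 100  # percent of mean (assumed basis)

        print(f"annual error: {annual_error:+.1f}%, hourly NRMSE: {nrmse:.1f}%")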

  16. An Evaluation Tool for CONUS-Scale Estimates of Components of the Water Balance

    NASA Astrophysics Data System (ADS)

    Saxe, S.; Hay, L.; Farmer, W. H.; Markstrom, S. L.; Kiang, J. E.

    2016-12-01

    Numerous research groups are independently developing data products to represent various components of the water balance (e.g., runoff, evapotranspiration, recharge, snow water equivalent, soil moisture, and climate) at the scale of the conterminous United States. These data products are derived from a range of sources, including direct measurement, remotely sensed measurement, and statistical and deterministic model simulations. An evaluation tool is needed to compare these data products and the components of the water balance they contain in order to identify gaps in the understanding and representation of continental-scale hydrologic processes. An ideal tool would provide an objective, universally agreed-upon framework for addressing questions related to closing the water balance. This type of generic, model-agnostic evaluation tool would facilitate collaboration among different hydrologic research groups and improve modeling capabilities with respect to continental-scale water resources. By adopting a comprehensive framework that considers hydrologic modeling in the context of a complete water balance, it is possible to identify weaknesses in process modeling, data product representation, and regional hydrologic variation. As part of its National Water Census initiative, the U.S. Geological Survey is facilitating this dialogue by developing prototype evaluation tools.
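
    A minimal sketch of the kind of closure check such an evaluation tool would perform, assuming a simple annual balance P = ET + Q + R + dS; the component breakdown and the illustrative values are assumptions, not from the abstract.

```python
# Sketch of a basic water-balance closure check: precipitation P should balance
# evapotranspiration ET, runoff Q, recharge R, and storage change dS.
# Values are illustrative (mm/yr) for a hypothetical grid cell.
def closure_residual(p, et, q, recharge, d_storage):
    """Residual of the water balance P - (ET + Q + R + dS); ~0 means closure."""
    return p - (et + q + recharge + d_storage)

# Hypothetical estimates drawn from independent data products:
p, et, q, recharge, d_storage = 900.0, 550.0, 260.0, 60.0, 10.0
residual = closure_residual(p, et, q, recharge, d_storage)
print(f"residual: {residual:+.1f} mm/yr ({100 * residual / p:.1f}% of P)")
```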

  17. Exploring the usefulness of comprehensive care plans for children with medical complexity (CMC): a qualitative study.

    PubMed

    Adams, Sherri; Cohen, Eyal; Mahant, Sanjay; Friedman, Jeremy N; Macculloch, Radha; Nicholas, David B

    2013-01-19

    The Medical Home model recommends that Children with Special Health Care Needs (CSHCN) receive a medical care plan, outlining the child's major medical issues and care needs to assist with care coordination. While care plans are a primary component of effective care coordination, the creation and maintenance of care plans is time, labor, and cost intensive, and the desired content of the care plan has not been studied. The purpose of this qualitative study was to understand the usefulness and desired content of comprehensive care plans by exploring the perceptions of parents and health care providers (HCPs) of children with medical complexity (CMC). This qualitative study utilized in-depth semi-structured interviews and focus groups. HCPs (n = 15) and parents (n = 15) of CMC who had all used a comprehensive care plan were recruited from a tertiary pediatric academic health sciences center. Themes were identified through grounded theory analysis of interview and focus group data. A multi-dimensional model of perceived care plan usefulness emerged. The model highlights three integral aspects of the care plan: care plan characteristics, activating factors and perceived outcomes of using a care plan. Care plans were perceived as a useful tool that centralized and focused the care of the child. Care plans were reported to flatten the hierarchical relationship between HCPs and parents, resulting in enhanced reciprocal information exchange and strengthened relationships. Participants expressed that a standardized template that is family-centered and includes content relevant to both the medical and social needs of the child is beneficial when integrated into overall care planning and delivery for CMC. Care plans are perceived to be a useful tool to both health care providers and parents of CMC. These findings inform the utility and development of a comprehensive care plan template as well as a model of how and when to best utilize care plans within family-centered models of care.

  18. GAMES II Project: a general architecture for medical knowledge-based systems.

    PubMed

    Bruno, F; Kindler, H; Leaning, M; Moustakis, V; Scherrer, J R; Schreiber, G; Stefanelli, M

    1994-10-01

    GAMES II aims to develop a comprehensive and commercially viable methodology that avoids problems ordinarily encountered in KBS development. The GAMES II methodology proposes to design a KBS starting from an epistemological model of medical reasoning (the Select and Test Model); design is viewed as a process of adding symbol-level information to the epistemological model. The architectural framework provided by GAMES II integrates different formalisms and techniques, providing a large set of tools from which the user can select the most suitable one for representing a piece of knowledge after careful analysis of its epistemological characteristics. Special attention is devoted to tools for knowledge acquisition (both manual and automatic). A panel of practicing physicians is assessing the medical value of this framework and its related tools by using it in a practical application.

  19. The Handbook of Leadership Development Evaluation

    ERIC Educational Resources Information Center

    Hannum, Kelly M., Ed.; Martineau, Jennifer W., Ed.; Reinelt, Claire, Ed.

    2007-01-01

    With the increase in the number of organizational leadership development programs, there is a pressing need for evaluation to answer important questions, improve practice, and inform decisions. The Handbook is a comprehensive resource filled with examples, tools, and the most innovative models and approaches designed to evaluate leadership…

  20. Comprehensive Model of Single Particle Pulverized Coal Combustion Extended to Oxy-Coal Conditions

    DOE PAGES

    Holland, Troy; Fletcher, Thomas H.

    2017-02-22

    Oxy-fired coal combustion is a promising potential carbon capture technology. Predictive CFD simulations are valuable tools in evaluating and deploying oxy-fuel and other carbon capture technologies, whether as retrofits or in new construction. However, accurate predictive simulations require physically realistic submodels with low computational requirements. In particular, comprehensive char oxidation and gasification models have been developed that describe multiple reaction and diffusion processes. Our work extends a comprehensive char conversion code (CCK), which treats surface oxidation and gasification reactions as well as processes such as film diffusion, pore diffusion, ash encapsulation, and annealing. In this work several submodels in the CCK code were updated with more realistic physics or otherwise extended to function in oxy-coal conditions. Improved submodels include the annealing model, the swelling model, the mode-of-burning parameter, and the kinetic model, as well as the addition of the chemical percolation devolatilization (CPD) model. We compare the results of the char combustion model to oxy-coal data and to parallel data sets obtained near conventional conditions. A potential method for applying the detailed code in CFD work is given.
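
    The sketch below is a greatly simplified, hypothetical illustration of single-particle char burnout, with global Arrhenius kinetics and a mode-of-burning parameter; it is not the CCK model, and all parameter values are invented.

```python
# Greatly simplified single-particle char burnout: global nth-order oxidation
# with Arrhenius kinetics and a mode-of-burning parameter alpha relating mass
# loss to density change. All parameter values are hypothetical.
import numpy as np

A, E, R = 2.0e3, 8.0e4, 8.314     # pre-exponential, activation energy (J/mol), gas constant
alpha = 0.25                      # mode of burning: rho/rho0 = (m/m0)**alpha
p_o2, T, n = 0.20, 1800.0, 0.5    # O2 partial pressure (atm), particle temp (K), order

m, m0, dt = 1.0, 1.0, 1.0e-3      # normalized char mass, time step (s)
t = 0.0
while m > 0.01 * m0:
    rho_ratio = (m / m0) ** alpha               # density evolution
    d_ratio = (m / m0 / rho_ratio) ** (1 / 3)   # diameter evolution
    surface = d_ratio ** 2                      # normalized external surface
    rate = A * np.exp(-E / (R * T)) * p_o2 ** n * surface
    m -= rate * dt
    t += dt
print(f"99% burnout at t = {t:.2f} s (illustrative parameters)")
```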

  1. Comprehensive Model of Single Particle Pulverized Coal Combustion Extended to Oxy-Coal Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Troy; Fletcher, Thomas H.

    Oxy-fired coal combustion is a promising potential carbon capture technology. Predictive CFD simulations are valuable tools in evaluating and deploying oxy-fuel and other carbon capture technologies, whether as retrofits or in new construction. However, accurate predictive simulations require physically realistic submodels with low computational requirements. In particular, comprehensive char oxidation and gasification models have been developed that describe multiple reaction and diffusion processes. Our work extends a comprehensive char conversion code (CCK), which treats surface oxidation and gasification reactions as well as processes such as film diffusion, pore diffusion, ash encapsulation, and annealing. In this work several submodels in the CCK code were updated with more realistic physics or otherwise extended to function in oxy-coal conditions. Improved submodels include the annealing model, the swelling model, the mode-of-burning parameter, and the kinetic model, as well as the addition of the chemical percolation devolatilization (CPD) model. We compare the results of the char combustion model to oxy-coal data and to parallel data sets obtained near conventional conditions. A potential method for applying the detailed code in CFD work is given.

  2. The use of music therapy within the SCERTS model for children with Autism Spectrum Disorder.

    PubMed

    Walworth, Darcy DeLoach

    2007-01-01

    The SCERTS model is a new, comprehensive curriculum designed to assess and identify treatment goals and objectives within a multidisciplinary team of clinicians and educators for children with Autism Spectrum Disorders (ASD). The model is an ongoing assessment tool from which goals and objectives are derived. Because music therapy offers a unique interaction setting in which children with ASD can elicit communication skills, music therapists will need to be an integral part of the multidisciplinary assessment team using the SCERTS model, which is projected to become the primary nationwide curriculum for children with ASD. The purpose of this paper is to assist music therapists in transitioning to this model by providing an overview and explanation of the SCERTS model and by identifying how music therapists currently provide clinical services incorporated in the SCERTS model for children with ASD. In order to formulate comprehensive transitional suggestions, a national survey of music therapists working with clients at risk for or diagnosed with ASD was conducted to: (a) identify the areas of the SCERTS assessment model that music therapists currently address within their written goals for clients with ASD, (b) identify current music therapy activities that address various SCERTS goals and objectives, and (c) provide demographic information about the settings, lengths, and tools used in music therapy interventions for clients with ASD.

  3. 2D Hydrodynamic Based Logic Modeling Tool for River Restoration Decision Analysis: A Quantitative Approach to Project Prioritization

    NASA Astrophysics Data System (ADS)

    Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.

    2014-12-01

    In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years, and with the availability of robust data sets and computing technology it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40-mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM), has recently been developed to analyze this complex river system. This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors in determining the highest-priority locations within the river corridor for implementing restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land use limiting factors may constrain restoration options (Beechie et al. 2008). Applying natural resources management actions, like restoration prioritization, is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend but the scientific strategy that management must embrace and apply in its decision framework.
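
    A toy sketch of the prioritization step: combining habitat metrics into weighted site scores and ranking candidate sites. The metric names and weights are hypothetical, not those used by 2D-HBLM.

```python
# Sketch of the prioritization idea: combine hydraulic and habitat metrics
# into weighted site scores and rank candidate project sites.
import numpy as np

metrics = np.array([          # rows: sites; cols: depth suitability,
    [0.8, 0.6, 0.4],          # velocity suitability, connectivity
    [0.5, 0.9, 0.7],
    [0.3, 0.4, 0.9],
])
weights = np.array([0.5, 0.3, 0.2])   # assumed relative importance of each metric

scores = metrics @ weights
for rank, site in enumerate(np.argsort(scores)[::-1], start=1):
    print(f"priority {rank}: site {site} (score {scores[site]:.2f})")
```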

  4. The Effects of Literacy Support Tools on the Comprehension of Informational e-Books and Print-Based Text

    ERIC Educational Resources Information Center

    Herman, Heather A.

    2017-01-01

    This mixed methods research explores the effects of literacy support tools to support comprehension strategies when reading informational e-books and print-based text with 14 first-grade students. This study focused on the following comprehension strategies: annotating connections, annotating "I wonders," and looking back in the text.…

  5. Features of the Drag-Free-Simulator demonstrated for the Microscope-mission

    NASA Astrophysics Data System (ADS)

    List, Meike; Bremer, Stefanie; Dittus, Hansjoerg; Selig, Hanns

    The ZARM Drag-Free-Simulator is being developed as a tool for comprehensive mission modeling. Environmental disturbances such as solar radiation pressure, atmospheric drag, and interactions between the satellite and the Earth's magnetic field can be taken into account via several models. Besides the gravitational field of the Earth, the influence of the Sun, the Moon, and the planets including Pluto can also be considered for targeted simulations. Methods of modeling and implementation will be presented. At present, effort is being made to adapt this simulation tool to the French mission MICROSCOPE, which is designed to test the equivalence principle to an accuracy of η = 10^-15. Additionally, detailed modeling of the on-board capacitive sensors is necessary for a better understanding of the real system. The current status of mission modeling will be reported.

  6. Final Report (2010-2015) for the Topical Collaboration on Quantitative Jet and Electromagnetic Tomography (JET) of Extreme Phases of Matter in Heavy-ion Collisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gyulassy, Miklos; Romatschke, Paul; Bass, Steffen

    2015-08-31

    During the 5-year funding period (2010-2015), the JET Collaboration carried out a comprehensive research program with coordinated efforts involving all PI members and external associated members according to the plan and milestones outlined in the approved JET proposal. We identified important issues in the study of parton energy loss and made significant progress toward NLO calculations; advanced event-by-event hydrodynamic simulations of bulk matter evolution; developed Monte Carlo tools that combine different parton energy loss approaches, hydrodynamic models and parton recombination model for jet hadronization; and carried out the first comprehensive phenomenological study to extract the jet transport parameter.

  7. ReMatch: a web-based tool to construct, store and share stoichiometric metabolic models with carbon maps for metabolic flux analysis.

    PubMed

    Pitkänen, Esa; Akerlund, Arto; Rantanen, Ari; Jouhten, Paula; Ukkonen, Esko

    2008-08-25

    ReMatch is a web-based, user-friendly tool that constructs stoichiometric network models for metabolic flux analysis, integrating user-developed models into a database collected from several comprehensive metabolic data resources, including KEGG, MetaCyc and ChEBI. In particular, ReMatch augments the metabolic reactions of the model with carbon mappings to facilitate (13)C metabolic flux analysis. The construction of a network model consisting of biochemical reactions is the first step in most metabolic modelling tasks. This model construction can be a tedious task, as the required information is usually scattered across many separate databases whose interoperability is suboptimal, owing to the heterogeneous naming conventions for metabolites in different databases. Another, particularly severe data integration problem is faced in (13)C metabolic flux analysis, where the mappings of carbon atoms from substrates into products in the model are required. ReMatch has been developed to solve the above data integration problems. First, ReMatch matches the imported user-developed model against the internal ReMatch database while considering a comprehensive metabolite name thesaurus. This, together with wild card support, allows the user to specify the model quickly without having to look the names up manually. Second, ReMatch is able to augment reactions of the model with carbon mappings, obtained either from the internal database or given by the user with an easy-to-use tool. The constructed models can be exported into 13C-FLUX and SBML file formats. Further, a stoichiometric matrix and visualizations of the network model can be generated. The constructed models of metabolic networks can optionally be made available to the other users of ReMatch. Thus, ReMatch provides a common repository for metabolic network models with carbon mappings for the needs of the metabolic flux analysis community. ReMatch is freely available for academic use at http://www.cs.helsinki.fi/group/sysfys/software/rematch/.
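
    The first modelling step ReMatch supports, assembling a stoichiometric matrix from reaction definitions, can be sketched as follows; the toy reactions are illustrative and not drawn from the ReMatch database.

```python
# Sketch: assembling a stoichiometric matrix S (metabolites x reactions)
# from reaction definitions. Negative entries consume, positive produce.
reactions = {
    "v1": {"glucose": -1, "g6p": +1},      # hexokinase-like step
    "v2": {"g6p": -1, "f6p": +1},          # isomerase-like step
    "v3": {"f6p": -1, "pyruvate": +2},     # lumped lower glycolysis
}

metabolites = sorted({m for stoich in reactions.values() for m in stoich})
S = [[reactions[r].get(m, 0) for r in reactions] for m in metabolites]

print("reactions:", list(reactions))
for met, row in zip(metabolites, S):
    print(f"{met:>10}: {row}")
```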

  8. Assessment of Institutional Strategic Goal Realization: A Case Study

    ERIC Educational Resources Information Center

    Holwick, Jana W.

    2009-01-01

    Strategic planning is a common tool utilized at colleges and universities to assist in achieving institutional goals. Leaders in higher education have taken best practices from corporate management and adapted them in an effort to develop comprehensive approaches to institutional planning, assessment and accountability. Various models for planning…

  9. AUTOMATED GIS WATERSHED ANALYSIS TOOLS FOR RUSLE/SEDMOD SOIL EROSION AND SEDIMENTATION MODELING

    EPA Science Inventory

    A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed using a suite of automated Arc Macro Language (AML) scripts and a pair of processing-intensive ANSI C++ executable programs operating on an ESRI ArcGIS 8.x Workstation platform...
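
    The core RUSLE computation that such a toolchain automates is the factor product A = R * K * LS * C * P; a minimal sketch with hypothetical factor values follows.

```python
# Sketch of the RUSLE average-annual soil loss equation: A = R * K * LS * C * P.
# Factor values below are illustrative, not from the EPA procedure.
def rusle_soil_loss(r, k, ls, c, p):
    """RUSLE: rainfall erosivity R, soil erodibility K, slope length/steepness
    LS, cover management C, and support practice P."""
    return r * k * ls * c * p

# Hypothetical watershed cell: moderate rainfall, silty soil, gentle slope
a = rusle_soil_loss(r=170.0, k=0.32, ls=1.4, c=0.12, p=0.9)
print(f"estimated soil loss: {a:.1f} tons/acre/yr")
```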

  10. Connected vehicle impacts on transportation planning : analysis of the need for new and enhanced analysis tools, techniques and data, briefing for traffic simulation models.

    DOT National Transportation Integrated Search

    2016-03-11

    The principal objective of this project, Connected Vehicle Impacts on Transportation Planning, is to comprehensively assess how connected vehicles should be considered across the range of transportation planning processes and products developed...

  11. Connected vehicle impacts on transportation planning : analysis of the need for new and enhanced analysis tools, techniques and data—briefing for traffic simulation models.

    DOT National Transportation Integrated Search

    2016-03-11

    The principal objective of this project, Connected Vehicle Impacts on Transportation Planning, is to comprehensively assess how connected vehicles should be considered across the range of transportation planning processes and products developed...

  12. NREL: Renewable Resource Data Center - Biomass Resource Related Links

    Science.gov Websites

    Comprehensive biomass resource information is available from the NREL Renewable Resource Data Center (RReDC), which provides biomass data, models and tools, publications, and related links, alongside geothermal, solar, and wind resource information.

  13. MATREX: A Unifying Modeling and Simulation Architecture for Live-Virtual-Constructive Applications

    DTIC Science & Technology

    2007-05-23

    Excerpt from a systems-acquisition life-cycle briefing (Pre-Systems Acquisition through Operations & Support, including the Critical Design Review, LRIP/IOT&E, FRP Decision Review, and FOC milestones). Recoverable glossary entries: CMS2 – Comprehensive Munitions & Sensor Server; CSAT – C4ISR Static Analysis Tool; C4ISR – Command & Control, Communications, Computers...

  14. Rethinking Student Loan Debt: Tools and Strategies for Debt Management.

    ERIC Educational Resources Information Center

    Mason, Susan G.

    2001-01-01

    Analyzes student loan debt at the University of Missouri-St. Louis School of Optometry, showing the need for a comprehensive debt management program. Presents a model for determining manageable amounts of student loan debt developed from conventional lending criteria and data on earnings for optometrists. (EV)

  15. Towards the Integration of APECS with VE-Suite to Create a Comprehensive Virtual Engineering Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCorkle, D.; Yang, C.; Jordan, T.

    2007-06-01

    Modeling and simulation tools are becoming pervasive in the process engineering practice of designing advanced power generation facilities. These tools enable engineers to explore many what-if scenarios before cutting metal or constructing a pilot-scale facility. While such tools enable investigation of crucial plant design aspects, typical commercial process simulation tools such as Aspen Plus®, gPROMS®, and HYSYS® still do not capture some plant design information, including computational fluid dynamics (CFD) models for complex thermal and fluid flow phenomena, economics models for policy decisions, operational data after the plant is constructed, and as-built information for use in as-designed models. Software tools must be created that allow disparate sources of information to be integrated if environments are to be constructed where process simulation information can be accessed. At the Department of Energy's (DOE) National Energy Technology Laboratory (NETL), the Advanced Process Engineering Co-Simulator (APECS) has been developed as an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulation (e.g., Fluent® CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper, we discuss the initial phases of integrating APECS with the immersive and interactive virtual engineering software VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite utilizes the ActiveX (OLE Automation) controls in Aspen Plus, wrapped by the CASI library developed by Reaction Engineering International, to run the process simulation and query for unit operation results. This integration permits any application that uses the VE-Open interface to integrate with APECS co-simulations, enabling construction of the comprehensive virtual engineering environment needed for the rapid engineering of advanced power generation facilities.

  16. Musite, a tool for global prediction of general and kinase-specific phosphorylation sites.

    PubMed

    Gao, Jianjiong; Thelen, Jay J; Dunker, A Keith; Xu, Dong

    2010-12-01

    Reversible protein phosphorylation is one of the most pervasive post-translational modifications, regulating diverse cellular processes in various organisms. High throughput experimental studies using mass spectrometry have identified many phosphorylation sites, primarily from eukaryotes. However, the vast majority of phosphorylation sites remain undiscovered, even in well studied systems. Because mass spectrometry-based experimental approaches for identifying phosphorylation events are costly, time-consuming, and biased toward abundant proteins and proteotypic peptides, in silico prediction of phosphorylation sites is potentially a useful alternative strategy for whole proteome annotation. Because of various limitations, current phosphorylation site prediction tools were not well designed for comprehensive assessment of proteomes. Here, we present a novel software tool, Musite, specifically designed for large scale predictions of both general and kinase-specific phosphorylation sites. We collected phosphoproteomics data in multiple organisms from several reliable sources and used them to train prediction models by a comprehensive machine-learning approach that integrates local sequence similarities to known phosphorylation sites, protein disorder scores, and amino acid frequencies. Application of Musite on several proteomes yielded tens of thousands of phosphorylation site predictions at a high stringency level. Cross-validation tests show that Musite achieves some improvement over existing tools in predicting general phosphorylation sites, and it is at least comparable with those for predicting kinase-specific phosphorylation sites. In Musite V1.0, we have trained general prediction models for six organisms and kinase-specific prediction models for 13 kinases or kinase families. Although the current pretrained models were not correlated with any particular cellular conditions, Musite provides a unique functionality for training customized prediction models (including condition-specific models) from users' own data. In addition, with its easily extensible open source application programming interface, Musite is aimed at being an open platform for community-based development of machine learning-based phosphorylation site prediction applications. Musite is available at http://musite.sourceforge.net/.
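
    A toy sketch of the general pattern Musite describes, windowed sequence features feeding a standard machine-learning classifier; the feature set (window amino acid frequencies only), training data, and model choice are stand-ins for Musite's actual combination of similarity, disorder, and frequency features.

```python
# Toy phosphorylation-site classifier: amino acid frequencies in a window
# around a candidate S/T/Y residue feed a random forest. Training data are
# invented; this is the pattern, not Musite's method.
from sklearn.ensemble import RandomForestClassifier

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def window_features(seq, center, half=7):
    """Amino acid frequencies in a +/-7 residue window around the candidate site."""
    window = seq[max(0, center - half): center + half + 1]
    return [window.count(aa) / len(window) for aa in AMINO_ACIDS]

# Toy training set: (sequence, candidate position, is_phosphosite)
train = [("MKRSRSPSPRRR", 5, 1), ("MKKLAVLLIGAG", 5, 0),
         ("RRRSDSEDEEEK", 4, 1), ("GAVLIPFWMGAV", 6, 0)]
X = [window_features(s, i) for s, i, _ in train]
y = [label for _, _, label in train]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict_proba([window_features("AKRSRSYSPKRK", 6)])[:, 1])
```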

  17. Discover Space Weather and Sun's Superpowers: Using CCMC's innovative tools and applications

    NASA Astrophysics Data System (ADS)

    Mendoza, A. M. M.; Maddox, M. M.; Kuznetsova, M. M.; Chulaki, A.; Rastaetter, L.; Mullinix, R.; Weigand, C.; Boblitt, J.; Taktakishvili, A.; MacNeice, P. J.; Pulkkinen, A. A.; Pembroke, A. D.; Mays, M. L.; Zheng, Y.; Shim, J. S.

    2015-12-01

    The Community Coordinated Modeling Center (CCMC) has developed a comprehensive set of tools and applications that are directly applicable to space weather and space science education. These tools, some of which were developed by our student interns, are capable of serving a wide range of student audiences, from middle school to postgraduate research. They include a web-based point of access to sophisticated space physics models and visualizations, and a powerful space weather information dissemination system, available on the web and as a mobile app. In this demonstration, we will use CCMC's innovative tools to engage the audience in real-time space weather analysis and forecasting and will share some of our interns' hands-on experiences while being trained as junior space weather forecasters. The main portals to CCMC's educational material are ccmc.gsfc.nasa.gov and iswa.gsfc.nasa.gov.

  18. Simulation of atmospheric and terrestrial background signatures for detection and tracking scenarios

    NASA Astrophysics Data System (ADS)

    Schweitzer, Caroline; Stein, Karin

    2015-10-01

    The field of early warning depends on reliable image exploitation: only if the applied detection and tracking algorithms work efficiently can a threat-approach alert be given fast enough to ensure automatic initiation of the countermeasure. In order to evaluate the performance of those algorithms for a given electro-optical (EO) sensor system, test sequences need to be created that are as realistic and comprehensive as possible. Since both the background and the target signature depend on the environmental conditions, detailed knowledge of the meteorology and climatology is necessary. Trials measuring these environmental characteristics serve as a solid basis, but may represent the conditions of only a rather short period of time. To represent the entire variation of meteorology and climatology that the future system will be exposed to, the application of comprehensive atmospheric modelling tools is essential. This paper gives an introduction to the atmospheric modelling tools currently used at Fraunhofer IOSB to simulate spectral background signatures in the infrared (IR) range. It also demonstrates how those signatures are affected by changing atmospheric and climatic conditions. In conclusion, and with a special focus on the modelling of different cloud types, sources of error and limits are discussed.

  19. Characteristics of 3-D transport simulations of the stratosphere and mesosphere

    NASA Technical Reports Server (NTRS)

    Fairlie, T. D. A.; Siskind, D. E.; Turner, R. E.; Fisher, M.

    1992-01-01

    A 3D mechanistic, primitive-equation model of the stratosphere and mesosphere is coupled to an offline spectral transport model. The dynamics model is initialized with and forced by observations so that the coupled models may be used to study specific episodes. Results are compared with those obtained by transport online in the dynamics model. Although some differences are apparent, the results suggest that coupling of the models to a comprehensive photochemical package will provide a useful tool for studying the evolution of constituents in the middle atmosphere during specific episodes.

  20. NONMEMory: a run management tool for NONMEM.

    PubMed

    Wilkins, Justin J

    2005-06-01

    NONMEM is an extremely powerful tool for nonlinear mixed-effect modelling and simulation of pharmacokinetic and pharmacodynamic data. However, it is a console-based application whose output does not lend itself to rapid interpretation or efficient management. NONMEMory has been created to be a comprehensive project manager for NONMEM, providing detailed summary, comparison and overview of the runs comprising a given project, including the display of output data, simple post-run processing, fast diagnostic plots and run output management, complementary to other available modelling aids. Analysis time ought not to be spent on trivial tasks, and NONMEMory's role is to eliminate these as far as possible by increasing the efficiency of the modelling process. NONMEMory is freely available from http://www.uct.ac.za/depts/pha/nonmemory.php.

  1. A Comprehensive Rehabilitation Approach in a Patient With Serious Neuropsychiatric Systemic Lupus Erythematosus.

    PubMed

    Ko, Yong Jae; Lee, Yang Gyun; Park, Ji Woong; Ahn, Sung Ho; Kwak, Jin Myoung; Choi, Yoon-Hee

    2016-08-01

    Neuropsychiatric systemic lupus erythematosus (NPSLE) involves the central and peripheral nervous systems in patients with systemic lupus erythematosus (SLE). It is essential to specify the problems faced by patients with NPSLE because the condition causes diverse disabilities and impairs quality of life. After performing a comprehensive evaluation, tailored management should be provided for the patient's specific problems. We report here the case of a 30-year-old female with SLE who experienced serious neuropsychiatric symptoms: cerebral infarction followed by posterior reversible encephalopathy syndrome and peripheral polyneuropathy. We systematically assessed the patient using the International Classification of Functioning, Disability and Health model as a clinical problem-solving tool and provided comprehensive rehabilitation focused on her problems.

  2. An Introduction to the Transition to Work Simulator.

    ERIC Educational Resources Information Center

    Conroy, William G., Jr.

    The transition to work simulator (TWS) was developed as a policy tool for education and manpower planning. It was designed in anticipation of the Education Amendments of 1976 and the Youth Employment and Demonstration Projects Act of 1977 to facilitate comprehensive planning. TWS is a computerized simulation model of the stream of important…

  3. Growing a Faculty Writing Group on a Traditionally Teaching-Focused Campus: A Model for Faculty Development

    ERIC Educational Resources Information Center

    Hampton-Farmer, Cheri; Laverick, Erin; Denecker, Christine; Tulley, Christine E.; Diederich, Nicole; Wilgus, Anthony

    2013-01-01

    When expectations for scholarly productivity increase at comprehensive universities, faculty writing groups can provide the tools, motivation, and support necessary to achieve both administrative and faculty goals. Narratives from members of a faculty writing group experiencing a shift in institutional expectations for scholarship reveal tangible…

  4. Electronic Reverse Auctions: Integrating an E-Sourcing Tool into a Sales and Purchasing Cross-Course Negotiation Project

    ERIC Educational Resources Information Center

    Williams, Jacqueline A.; Dobie, Kathryn

    2011-01-01

    Electronic reverse auctions are increasingly being used by firms to improve firm financial and operational performance. The described teaching innovation serves as a model for introducing electronic reverse auctions as a central element in a comprehensive negotiation exercise involving sales management and purchasing management students. Results…

  5. A Neuro-Oncology Workstation for Structuring, Modeling, and Visualizing Patient Records

    PubMed Central

    Hsu, William; Arnold, Corey W.; Taira, Ricky K.

    2016-01-01

    The patient medical record contains a wealth of information consisting of prior observations, interpretations, and interventions that need to be interpreted and applied towards decisions regarding current patient care. Given the time constraints and the large—often extraneous—amount of data available, clinicians are tasked with the challenge of performing a comprehensive review of how a disease progresses in individual patients. To facilitate this process, we demonstrate a neuro-oncology workstation that assists in structuring and visualizing medical data to promote an evidence-based approach for understanding a patient’s record. The workstation consists of three components: 1) a structuring tool that incorporates natural language processing to assist with the extraction of problems, findings, and attributes for structuring observations, events, and inferences stated within medical reports; 2) a data modeling tool that provides a comprehensive and consistent representation of concepts for the disease-specific domain; and 3) a visual workbench for visualizing, navigating, and querying the structured data to enable retrieval of relevant portions of the patient record. We discuss this workstation in the context of reviewing cases of glioblastoma multiforme patients. PMID:27583308

  6. A Neuro-Oncology Workstation for Structuring, Modeling, and Visualizing Patient Records.

    PubMed

    Hsu, William; Arnold, Corey W; Taira, Ricky K

    2010-11-01

    The patient medical record contains a wealth of information consisting of prior observations, interpretations, and interventions that need to be interpreted and applied towards decisions regarding current patient care. Given the time constraints and the large (often extraneous) amount of data available, clinicians are tasked with the challenge of performing a comprehensive review of how a disease progresses in individual patients. To facilitate this process, we demonstrate a neuro-oncology workstation that assists in structuring and visualizing medical data to promote an evidence-based approach for understanding a patient's record. The workstation consists of three components: 1) a structuring tool that incorporates natural language processing to assist with the extraction of problems, findings, and attributes for structuring observations, events, and inferences stated within medical reports; 2) a data modeling tool that provides a comprehensive and consistent representation of concepts for the disease-specific domain; and 3) a visual workbench for visualizing, navigating, and querying the structured data to enable retrieval of relevant portions of the patient record. We discuss this workstation in the context of reviewing cases of glioblastoma multiforme patients.

  7. The FaceBase Consortium: A comprehensive program to facilitate craniofacial research

    PubMed Central

    Hochheiser, Harry; Aronow, Bruce J.; Artinger, Kristin; Beaty, Terri H.; Brinkley, James F.; Chai, Yang; Clouthier, David; Cunningham, Michael L.; Dixon, Michael; Donahue, Leah Rae; Fraser, Scott E.; Hallgrimsson, Benedikt; Iwata, Junichi; Klein, Ophir; Marazita, Mary L.; Murray, Jeffrey C.; Murray, Stephen; de Villena, Fernando Pardo-Manuel; Postlethwait, John; Potter, Steven; Shapiro, Linda; Spritz, Richard; Visel, Axel; Weinberg, Seth M.; Trainor, Paul A.

    2012-01-01

    The FaceBase Consortium consists of ten interlinked research and technology projects whose goal is to generate craniofacial research data and technology for use by the research community through a central data management and integrated bioinformatics hub. Funded by the National Institute of Dental and Craniofacial Research (NIDCR) and currently focused on studying the development of the middle region of the face, the Consortium will produce comprehensive datasets of global gene expression patterns, regulatory elements, and sequencing; will generate anatomical and molecular atlases; will provide human normative facial data and other phenotypes; will conduct follow-up studies of a completed genome-wide association study; will generate independent data on the genetics of craniofacial development; will build repositories of animal models and of human samples and data for community access and analysis; and will develop software tools and animal models for analyzing, functionally testing, and integrating these data. The FaceBase website (http://www.facebase.org) will serve as a web home for these efforts, providing interactive tools for exploring these datasets, together with discussion forums and other services to support and foster collaboration within the craniofacial research community. PMID:21458441

  8. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2013-12-01

    The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA is planned to be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. The requirements of the educational tool are defined with the interaction with the school organizers, and CMDA is customized to meet the requirements accordingly. The tool needs to be production quality for 30+ simultaneous users. The summer school will thus serve as a valuable testbed for the tool development, preparing CMDA to serve the Earth-science modeling and model-analysis community at the end of the project. This work was funded by the NASA Earth Science Program called Computational Modeling Algorithms and Cyberinfrastructure (CMAC).
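
    The wrapping pattern described above, exposing an existing analysis routine as a web service via a Python framework, might look like the sketch below; the endpoint, parameters, and data are hypothetical and do not reflect CMDA's actual API.

```python
# Sketch: wrapping an analysis routine (an area mean over a latitude band)
# as a small Flask web service. Endpoint and parameters are hypothetical.
import numpy as np
from flask import Flask, jsonify, request

app = Flask(__name__)

def regional_mean(field, lats, lat_min, lat_max):
    """Mean of a (lat, lon) field over a latitude band (area weights omitted)."""
    mask = (lats >= lat_min) & (lats <= lat_max)
    return float(field[mask, :].mean())

LATS = np.linspace(-90, 90, 181)
FIELD = np.random.default_rng(0).normal(288, 10, size=(181, 360))  # stand-in data

@app.route("/mean")
def mean_endpoint():
    lat_min = float(request.args.get("lat_min", -90))
    lat_max = float(request.args.get("lat_max", 90))
    return jsonify(mean=regional_mean(FIELD, LATS, lat_min, lat_max))

if __name__ == "__main__":
    app.run()  # e.g., GET /mean?lat_min=-30&lat_max=30
```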

  9. New tools for sculpting cranial implants in a shared haptic augmented reality environment.

    PubMed

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2006-01-01

    New volumetric tools were developed for the design and fabrication of high quality cranial implants from patient CT data. These virtual tools replace time consuming physical sculpting, mold making and casting steps. The implant is designed by medical professionals in tele-immersive collaboration. Virtual clay is added in the virtual defect area on the CT data using the adding tool. With force feedback the modeler can feel the edge of the defect and fill only the space where no bone is present. A carving tool and a smoothing tool are then used to sculpt and refine the implant. To make a physical evaluation, the skull with simulated defect and the implant are fabricated via stereolithography to allow neurosurgeons to evaluate the quality of the implant. Initial tests demonstrate a very high quality fit. These new haptic volumetric sculpting tools are a critical component of a comprehensive tele-immersive system.

  10. The Comprehensive Inner Magnetosphere-Ionosphere Model

    NASA Technical Reports Server (NTRS)

    Fok, M.-C.; Buzulukova, N. Y.; Chen, S.-H.; Glocer, A.; Nagai, T.; Valek, P.; Perez, J. D.

    2014-01-01

    Simulation studies of the Earth's radiation belts and ring current are very useful in understanding the acceleration, transport, and loss of energetic particles. Recently, the Comprehensive Ring Current Model (CRCM) and the Radiation Belt Environment (RBE) model were merged to form a Comprehensive Inner Magnetosphere-Ionosphere (CIMI) model. CIMI solves for many essential quantities in the inner magnetosphere, including ion and electron distributions in the ring current and radiation belts, plasmaspheric density, Region 2 currents, convection potential, and precipitation in the ionosphere. It incorporates whistler mode chorus and hiss wave diffusion of energetic electrons in energy, pitch angle, and cross terms. CIMI thus represents a comprehensive model that considers the effects of the ring current and plasmasphere on the radiation belts. We have performed a CIMI simulation for the storm on 5-9 April 2010 and then compared our results with data from the Two Wide-angle Imaging Neutral-atom Spectrometers and Akebono satellites. We identify the dominant energization and loss processes for the ring current and radiation belts. We find that the interactions with the whistler mode chorus waves are the main cause of the flux increase of MeV electrons during the recovery phase of this particular storm. When a self-consistent electric field from the CRCM is used, the enhancement of MeV electrons is higher than when an empirical convection model is applied. We also demonstrate how CIMI can be a powerful tool for analyzing and interpreting data from the new Van Allen Probes mission.

  11. Sensitivities of Greenland ice sheet volume inferred from an ice sheet adjoint model

    NASA Astrophysics Data System (ADS)

    Heimbach, P.; Bugnion, V.

    2009-04-01

    We present a new and original approach to understanding the sensitivity of the Greenland ice sheet to key model parameters and environmental conditions. At the heart of this approach is the use of an adjoint ice sheet model. Since its introduction by MacAyeal (1992), the adjoint method has become a widespread means of fitting ice stream models to the increasing number and diversity of satellite observations, and of estimating uncertain model parameters such as basal conditions. However, no attempt had been made to extend this method to comprehensive ice sheet models. As a first step toward the use of adjoints of comprehensive three-dimensional ice sheet models, we have generated an adjoint of the ice sheet model SICOPOLIS of Greve (1997). The adjoint was generated by means of the automatic differentiation (AD) tool TAF. The AD tool generates exact source code representing the tangent linear and adjoint model of the nonlinear parent model provided. Model sensitivities are given by the partial derivatives of a scalar-valued model diagnostic with respect to the controls, and can be calculated efficiently via the adjoint. By way of example, we determine the sensitivity of the total Greenland ice volume to various control variables, such as spatial fields of basal flow parameters, surface and basal forcings, and initial conditions. Reliability of the adjoint was tested through finite-difference perturbation calculations for various control variables and perturbation regions. Besides confirming qualitative aspects of ice sheet sensitivities, such as expected regional variations, we detect regions where model sensitivities are seemingly unexpected or counter-intuitive, albeit "real" in the sense of actual model behavior. One example is regions where the inferred sensitivity of ice sheet volume to the basal sliding coefficient is positive, i.e., where a local increase in the basal sliding parameter increases the ice sheet volume. Similarly, positive ice temperature sensitivities are found in certain parts of the ice sheet (in most regions the sensitivity is negative, i.e., an increase in temperature decreases ice sheet volume); detecting these would have been highly unlikely with conventional perturbation experiments alone. An effort to generate an efficient adjoint with the newly developed open-source AD tool OpenAD is also under way. Available adjoint code generation tools now open up a variety of novel model applications, notably with regard to sensitivity and uncertainty analyses and ice sheet state estimation or data assimilation.
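
    The finite-difference verification of adjoint sensitivities described above can be illustrated on a toy scalar diagnostic; the forward model below is invented, and only the verification pattern mirrors the study.

```python
# Toy adjoint-versus-finite-difference check: a scalar diagnostic J (stand-in
# for ice volume) depends on a control vector c (stand-in for basal fields).
import numpy as np

def model_diagnostic(c):
    """Forward model: nonlinear map from controls to a scalar diagnostic."""
    state = np.tanh(c) + 0.1 * c ** 2
    return float(np.sum(state))

def adjoint_gradient(c):
    """Hand-derived exact gradient of the toy forward model (the 'adjoint')."""
    return 1.0 / np.cosh(c) ** 2 + 0.2 * c

c = np.array([0.3, -1.2, 0.7])
grad = adjoint_gradient(c)

eps = 1e-6   # central finite differences, one control at a time
for i in range(len(c)):
    dc = np.zeros_like(c); dc[i] = eps
    fd = (model_diagnostic(c + dc) - model_diagnostic(c - dc)) / (2 * eps)
    print(f"control {i}: adjoint {grad[i]:+.6f}  finite-diff {fd:+.6f}")
```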

  12. The role of emotion in decision-making: a cognitive neuroeconomic approach towards understanding sexual risk behavior.

    PubMed

    Gutnik, Lily A; Hakimzada, A Forogh; Yoskowitz, Nicole A; Patel, Vimla L

    2006-12-01

    Models of decision-making usually focus on cognitive, situational, and socio-cultural variables in accounting for human performance. However, the emotional component is rarely addressed within these models. This paper reviews evidence for the emotional aspect of decision-making and its role within a new framework of investigation, called neuroeconomics. The new approach aims to build a comprehensive theory of decision-making, through the unification of theories and methods from economics, psychology, and neuroscience. In this paper, we review these integrative research methods and their applications to issues of public health, with illustrative examples from our research on young adults' safe sex practices. This approach promises to be valuable as a comprehensively descriptive and possibly, better predictive model for construction and customization of decision support tools for health professionals and consumers.

  13. The development of a multimedia online language assessment tool for young children with autism.

    PubMed

    Lin, Chu-Sui; Chang, Shu-Hui; Liou, Wen-Ying; Tsai, Yu-Show

    2013-10-01

    This study aimed to provide early childhood special education professionals with a standardized and comprehensive language assessment tool for the early identification of language learning characteristics (e.g., hyperlexia) of young children with autism. In this study, we used computer technology to develop a multimedia online language assessment tool that presents auditory or visual stimuli. This online comprehensive language assessment consists of six subtests: decoding, homographs, auditory vocabulary comprehension, visual vocabulary comprehension, auditory sentence comprehension, and visual sentence comprehension. Three hundred typically developing children and 35 children with autism aged 4-6 from Tao-Yuan County in Taiwan participated in this study. The Cronbach α values of the six subtests ranged from .64 to .97. The variance explained by the six subtests ranged from 14% to 56%, the concurrent validity of each subtest with the Peabody Picture Vocabulary Test-Revised ranged from .21 to .45, and the predictive validity of each subtest with the WISC-III ranged from .47 to .75. The assessment tool was also found to differentiate children with autism with up to 92% accuracy. These results indicate that the assessment tool has adequate reliability and validity. Additionally, the 35 children with autism completed the entire assessment in this study without exhibiting any extremely troubling behaviors. However, future research is needed to increase the sample size of both typically developing children and young children with autism and to overcome the technical challenges associated with internet issues. Copyright © 2013 Elsevier Ltd. All rights reserved.
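
    The internal-consistency statistic reported for the subtests, Cronbach's α, is computed as in the sketch below; the score matrix is simulated, not the study's data.

```python
# Sketch of Cronbach's alpha from an item-by-respondent score matrix:
# alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores).
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_respondents, k_items) array of item scores."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
ability = rng.normal(0, 1, size=(100, 1))             # latent trait
items = ability + rng.normal(0, 0.8, size=(100, 6))   # six correlated items
print(f"alpha = {cronbach_alpha(items):.2f}")
```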

  14. PinAPL-Py: A comprehensive web-application for the analysis of CRISPR/Cas9 screens.

    PubMed

    Spahn, Philipp N; Bath, Tyler; Weiss, Ryan J; Kim, Jihoon; Esko, Jeffrey D; Lewis, Nathan E; Harismendy, Olivier

    2017-11-20

    Large-scale genetic screens using CRISPR/Cas9 technology have emerged as a major tool for functional genomics. With its increased popularity, experimental biologists frequently acquire large sequencing datasets for which they often do not have an easy analysis option. While a few bioinformatic tools have been developed for this purpose, their utility is still hindered either due to limited functionality or the requirement of bioinformatic expertise. To make sequencing data analysis of CRISPR/Cas9 screens more accessible to a wide range of scientists, we developed a Platform-independent Analysis of Pooled Screens using Python (PinAPL-Py), which is operated as an intuitive web-service. PinAPL-Py implements state-of-the-art tools and statistical models, assembled in a comprehensive workflow covering sequence quality control, automated sgRNA sequence extraction, alignment, sgRNA enrichment/depletion analysis and gene ranking. The workflow is set up to use a variety of popular sgRNA libraries as well as custom libraries that can be easily uploaded. Various analysis options are offered, suitable to analyze a large variety of CRISPR/Cas9 screening experiments. Analysis output includes ranked lists of sgRNAs and genes, and publication-ready plots. PinAPL-Py helps to advance genome-wide screening efforts by combining comprehensive functionality with user-friendly implementation. PinAPL-Py is freely accessible at http://pinapl-py.ucsd.edu with instructions and test datasets.
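
    A minimal sketch of the enrichment/depletion step at the heart of such screen analyses, ranking sgRNAs by normalized log2 fold change; the counts are invented, and PinAPL-Py's actual statistical models are more sophisticated.

```python
# Sketch: normalize sgRNA read counts by library size and rank by log2 fold
# change between treatment and control. Counts are illustrative.
import math

counts = {            # sgRNA: (control reads, treatment reads)
    "sgGENE1_a": (500, 2100),
    "sgGENE1_b": (450, 1800),
    "sgGENE2_a": (600, 580),
    "sgGENE3_a": (700, 90),
}
ctrl_total = sum(c for c, _ in counts.values())
trt_total = sum(t for _, t in counts.values())

def log2_fold_change(ctrl, trt, pseudo=1.0):
    """log2 of the normalized (reads per total) ratio, with a pseudocount."""
    return math.log2(((trt + pseudo) / trt_total) / ((ctrl + pseudo) / ctrl_total))

for sg, (c, t) in sorted(counts.items(),
                         key=lambda kv: -log2_fold_change(*kv[1])):
    print(f"{sg}: log2FC = {log2_fold_change(c, t):+.2f}")
```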

  15. Simulation Tool for Dielectric Barrier Discharge Plasma Actuators

    NASA Technical Reports Server (NTRS)

    Likhanskii, Alexander

    2014-01-01

    Traditional approaches for active flow separation control using dielectric barrier discharge (DBD) plasma actuators are limited to relatively low speed flows and atmospheric conditions. This results in low feasibility of the DBDs for aerospace applications. For active flow control at turbine blades, fixed wings, and rotary wings and on hypersonic vehicles, DBD plasma actuators must perform at a wide range of conditions, including rarified flows and combustion mixtures. An efficient, comprehensive, physically based DBD simulation tool can optimize DBD plasma actuators for different operation conditions. Researchers are developing a DBD plasma actuator simulation tool for a wide range of ambient gas pressures. The tool will treat DBD using either kinetic, fluid, or hybrid models, depending on the DBD operational condition.

  16. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David; Agarwal, Deborah A.; Sun, Xin

    2011-09-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.
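
    A toy sketch of the ROM-plus-UQ pattern mentioned above: fit a cheap surrogate to a few runs of an expensive model, then propagate input uncertainty through it by Monte Carlo. The stand-in model and distributions are hypothetical, not CCSI components.

```python
# Sketch: a polynomial reduced-order model (ROM) fitted to a few "expensive"
# runs, then Monte Carlo uncertainty propagation through the cheap surrogate.
import numpy as np

def expensive_simulation(x):
    """Stand-in for a detailed device simulation (e.g., capture efficiency)."""
    return 0.9 - 0.5 * (x - 0.6) ** 2

train_x = np.linspace(0.2, 1.0, 7)                    # a handful of design runs
train_y = expensive_simulation(train_x)
rom = np.poly1d(np.polyfit(train_x, train_y, deg=2))  # the reduced-order model

rng = np.random.default_rng(2)
samples = rng.normal(0.6, 0.1, size=100_000)          # uncertain operating input
outputs = rom(samples)
print(f"mean = {outputs.mean():.3f}, 95% interval = "
      f"({np.percentile(outputs, 2.5):.3f}, {np.percentile(outputs, 97.5):.3f})")
```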

  17. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.; Agarwal, D.; Sun, X.

    2011-01-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  18. The enhanced Software Life Cycle Support Environment (ProSLCSE): Automation for enterprise and process modeling

    NASA Technical Reports Server (NTRS)

    Milligan, James R.; Dutton, James E.

    1993-01-01

    In this paper, we have introduced a comprehensive method for enterprise modeling that addresses the three important aspects of how an organization goes about its business. FirstEP includes infrastructure modeling, information modeling, and process modeling notations that are intended to be easy to learn and use. The notations stress the use of straightforward visual languages that are intuitive, syntactically simple, and semantically rich. ProSLCSE will be developed with automated tools and services to facilitate enterprise modeling and process enactment. In the spirit of FirstEP, ProSLCSE tools will also be seductively easy to use. Achieving fully managed, optimized software development and support processes will be long and arduous for most software organizations, and many serious problems will have to be solved along the way. ProSLCSE will provide the ability to document, communicate, and modify existing processes, which is the necessary first step.

  19. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system (version 7.0). Volume 1: HARP introduction and user's guide

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or non-workstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of reliable fault-tolerant system architectures; it is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.
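
    A minimal sketch of the kind of Markov reliability/availability model such engines solve analytically, assuming a two-state system with illustrative failure and repair rates.

```python
# Two-state Markov availability model: state 0 = up, state 1 = down, with
# failure rate lam and repair rate mu. Rates are illustrative.
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-4, 1e-2              # failures/hr, repairs/hr
Q = np.array([[-lam, lam],        # CTMC generator matrix (rows sum to zero)
              [mu, -mu]])

p0 = np.array([1.0, 0.0])         # system starts in the up state
for t in (10.0, 100.0, 1000.0):
    p_t = p0 @ expm(Q * t)        # transient state probabilities at time t
    print(f"A({t:>6.0f} h) = {p_t[0]:.6f}")

print(f"A(inf)      = {mu / (lam + mu):.6f}")  # steady-state availability
```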

  20. Predictive models and prognostic factors for upper tract urothelial carcinoma: a comprehensive review of the literature.

    PubMed

    Mbeutcha, Aurélie; Mathieu, Romain; Rouprêt, Morgan; Gust, Kilian M; Briganti, Alberto; Karakiewicz, Pierre I; Shariat, Shahrokh F

    2016-10-01

    In the context of customized patient care for upper tract urothelial carcinoma (UTUC), decision-making could be facilitated by risk assessment and prediction tools. The aim of this study was to provide a critical overview of existing predictive models and to review emerging promising prognostic factors for UTUC. A literature search of articles published in English from January 2000 to June 2016 was performed using PubMed. Studies on risk group stratification models and predictive tools in UTUC were selected, together with studies on predictive factors and biomarkers associated with advanced-stage UTUC and oncological outcomes after surgery. Various predictive tools have been described for advanced-stage UTUC assessment, disease recurrence and cancer-specific survival (CSS). Most of these models are based on well-established prognostic factors such as tumor stage, grade and lymph node (LN) metastasis, but some also integrate newly described prognostic factors and biomarkers. These new prediction tools seem to reach a high level of accuracy, but they lack external validation and decision-making analysis. The combinations of patient-, pathology- and surgery-related factors together with novel biomarkers have led to promising predictive tools for oncological outcomes in UTUC. However, external validation of these predictive models is a prerequisite before their introduction into daily practice. New models predicting response to therapy are urgently needed to allow accurate and safe individualized management in this heterogeneous disease.

  1. A network control theory approach to modeling and optimal control of zoonoses: case study of brucellosis transmission in sub-Saharan Africa.

    PubMed

    Roy, Sandip; McElwain, Terry F; Wan, Yan

    2011-10-01

    Developing control policies for zoonotic diseases is challenging, both because of the complex spread dynamics exhibited by these diseases, and because of the need for implementing complex multi-species surveillance and control efforts using limited resources. Mathematical models, and in particular network models, of disease spread are promising as tools for control-policy design, because they can provide comprehensive quantitative representations of disease transmission. A layered dynamical network model for the transmission and control of zoonotic diseases is introduced as a tool for analyzing disease spread and designing cost-effective surveillance and control. The model development is achieved using brucellosis transmission among wildlife, cattle herds, and human sub-populations in an agricultural system as a case study. Specifically, a model that tracks infection counts in interacting animal herds of multiple species (e.g., cattle herds and groups of wildlife for brucellosis) and in human subpopulations is introduced. The model is then abstracted to a form that permits comprehensive targeted design of multiple control capabilities as well as model identification from data. Next, techniques are developed for such quantitative design of control policies (that are directed to both the animal and human populations), and for model identification from snapshot and time-course data, by drawing on recent results in the network control community. The modeling approach is shown to provide quantitative insight into comprehensive control policies for zoonotic diseases, and in turn to permit policy design for mitigation of these diseases. For the brucellosis-transmission example in particular, numerous insights are obtained regarding the optimal distribution of resources among available control capabilities (e.g., vaccination, surveillance and culling, pasteurization of milk) and points in the spread network (e.g., transhumance vs. sedentary herds). In addition, a preliminary identification of the network model for brucellosis is achieved using historical data, and the robustness of the obtained model is demonstrated. As a whole, our results indicate that network modeling can aid in designing control policies for zoonotic diseases.

  2. A Network Control Theory Approach to Modeling and Optimal Control of Zoonoses: Case Study of Brucellosis Transmission in Sub-Saharan Africa

    PubMed Central

    Roy, Sandip; McElwain, Terry F.; Wan, Yan

    2011-01-01

    Background Developing control policies for zoonotic diseases is challenging, both because of the complex spread dynamics exhibited by these diseases, and because of the need for implementing complex multi-species surveillance and control efforts using limited resources. Mathematical models, and in particular network models, of disease spread are promising as tools for control-policy design, because they can provide comprehensive quantitative representations of disease transmission. Methodology/Principal Findings A layered dynamical network model for the transmission and control of zoonotic diseases is introduced as a tool for analyzing disease spread and designing cost-effective surveillance and control. The model development is achieved using brucellosis transmission among wildlife, cattle herds, and human sub-populations in an agricultural system as a case study. Specifically, a model that tracks infection counts in interacting animal herds of multiple species (e.g., cattle herds and groups of wildlife for brucellosis) and in human subpopulations is introduced. The model is then abstracted to a form that permits comprehensive targeted design of multiple control capabilities as well as model identification from data. Next, techniques are developed for such quantitative design of control policies (that are directed to both the animal and human populations), and for model identification from snapshot and time-course data, by drawing on recent results in the network control community. Conclusions/Significance The modeling approach is shown to provide quantitative insight into comprehensive control policies for zoonotic diseases, and in turn to permit policy design for mitigation of these diseases. For the brucellosis-transmission example in particular, numerous insights are obtained regarding the optimal distribution of resources among available control capabilities (e.g., vaccination, surveillance and culling, pasteurization of milk) and points in the spread network (e.g., transhumance vs. sedentary herds). In addition, a preliminary identification of the network model for brucellosis is achieved using historical data, and the robustness of the obtained model is demonstrated. As a whole, our results indicate that network modeling can aid in designing control policies for zoonotic diseases. PMID:22022621
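
    As a toy illustration of the layered-network idea described in these two records (a generic sketch under assumed parameter values, not the authors' actual model), the following tracks infected fractions in wildlife, cattle and human layers coupled through a cross-layer transmission matrix:

        # Minimal layered spread sketch: three layers coupled by an assumed
        # transmission matrix beta; no human-to-human transmission.
        import numpy as np

        layers = ["wildlife", "cattle", "humans"]
        beta = np.array([[0.30, 0.05, 0.00],   # transmission into wildlife
                         [0.10, 0.25, 0.00],   # transmission into cattle
                         [0.01, 0.02, 0.00]])  # spillover into humans
        gamma = np.array([0.05, 0.10, 0.20])   # recovery/removal rates (assumed)

        x = np.array([0.01, 0.0, 0.0])         # initial infected fraction per layer
        for week in range(52):
            # Discrete-time SIS-style update; new infections limited by susceptibles.
            x = np.clip(x + (1 - x) * (beta @ x) - gamma * x, 0.0, 1.0)

        for name, xi in zip(layers, x):
            print(f"{name}: {xi:.3f} infected after one year")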

  3. V-FOR-WaTer - a new virtual research environment for environmental research

    NASA Astrophysics Data System (ADS)

    Strobl, Marcus; Azmi, Elnaz; Hassler, Sibylle; Mälicke, Mirko; Meyer, Jörg; Zehe, Erwin

    2017-04-01

    The preparation of heterogeneous datasets for scientific analysis is still a demanding task. Data preprocessing for hydrological models typically involves gathering datasets from different sources, extensive work within geoinformation systems, data transformation, the generation of computational grids and the definition of initial and boundary conditions. V-FOR-WaTer, a standardized and scalable data hub with compatible analysis tools, will ease comprehensive studies and significantly reduce data preparation time. The idea behind V-FOR-WaTer is to bring together various datasets (e.g. point measurements, 2D/3D data, time series data) from different sources (e.g. gathered in research projects, or as part of regular monitoring by state offices) and to provide common as well as innovative scaling tools in space and time to generate a coherent data grid. Each dataset holds detailed standardized metadata to ensure usability of the data, offer a comprehensive search function and provide reference information for appropriate citation of the dataset creators. V-FOR-WaTer starts with a basis of data and tools, but it is designed to grow as users extend the virtual research environment with their own tools and research data. Researchers who upload new data or tools can receive a digital object identifier, or protect their data and tools from others until publication. Access to the data and tools provided by V-FOR-WaTer is via an easy-to-use web portal. Thanks to its modular architecture, the portal is ready to be extended with new tools and features, and it also offers interfaces to Matlab, Python and R.

  4. Engine System Model Development for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Nelson, Karl W.; Simpson, Steven P.

    2006-01-01

    In order to design, analyze, and evaluate conceptual Nuclear Thermal Propulsion (NTP) engine systems, an improved NTP design and analysis tool has been developed. The NTP tool utilizes the Rocket Engine Transient Simulation (ROCETS) system tool and many of the routines from the Enabler reactor model found in the Nuclear Engine System Simulation (NESS). Improved non-nuclear component models and an external shield model were added to the tool. With the addition of a nearly complete system reliability model, the tool will provide performance, sizing, and reliability data for NERVA-derived NTP engine systems. A new detailed reactor model is also being developed and will replace Enabler. The new model will allow more flexibility in reactor geometry and include detailed thermal hydraulics and neutronics models. A description of the reactor, component, and reliability models is provided. Another key feature of the modeling process is the use of comprehensive spreadsheets for each engine case. The spreadsheets include individual worksheets for each subsystem with data, plots, and scaled figures, making the output very useful to each engineering discipline. Sample performance and sizing results with the Enabler reactor model are provided, including sensitivities. Before selecting an engine design, all figures of merit must be considered, including the overall impacts on the vehicle and mission. Evaluations of these results, and of results obtained with the new reactor model, will be performed on the basis of key figures of merit. The impacts of clustering and external shielding will also be addressed. Over time, the reactor model will be upgraded to design and analyze other NTP concepts with CERMET and carbide fuel cores.

  5. Command Center Library Model Document. Comprehensive Approach to Reusable Defense Software (CARDS)

    DTIC Science & Technology

    1992-05-31

    …system, and functionality for specifying the layout of the document. 3.7.16.1 FrameMaker: FrameMaker is a Commercial Off The Shelf (COTS) component … facilitating WYSIWYG creation of formatted reports with embedded graphics. FrameMaker is an advanced publishing tool that integrates word processing … available for the component FrameMaker: • product evaluation reports in ASCII and PostScript formats • product assessment on-line in the model • product …

  6. Assessment of uncertainty in the numerical simulation of solar irradiance over inclined PV panels: New algorithms using measurements and modeling tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Yu; Sengupta, Manajit; Dooraghi, Mike

    Development of accurate transposition models to simulate plane-of-array (POA) irradiance from horizontal measurements or simulations is a complex process mainly because of the anisotropic distribution of diffuse solar radiation in the atmosphere. The limited availability of reliable POA measurements at large temporal and spatial scales leads to difficulties in the comprehensive evaluation of transposition models. This paper proposes new algorithms to assess the uncertainty of transposition models using both surface-based observations and modeling tools. We reviewed the analytical derivation of POA irradiance and the approximation of isotropic diffuse radiation that simplifies the computation. Two transposition models are evaluated against the computation by the rigorous analytical solution. We proposed a new algorithm to evaluate transposition models using the clear-sky measurements at the National Renewable Energy Laboratory's (NREL's) Solar Radiation Research Laboratory (SRRL) and a radiative transfer model that integrates diffuse radiances of various sky-viewing angles. We found that the radiative transfer model and a transposition model based on empirical regressions are superior to the isotropic models when compared to measurements. We further compared the radiative transfer model to the transposition models under an extensive range of idealized conditions. Our results suggest that the empirical transposition model has slightly higher cloudy-sky POA irradiance than the radiative transfer model, but performs better than the isotropic models under clear-sky conditions. Significantly smaller POA irradiances computed by the transposition models are observed when the photovoltaic (PV) panel deviates from the azimuthal direction of the sun. The new algorithms developed in the current study have opened the door to a more comprehensive evaluation of transposition models for various atmospheric conditions and solar and PV orientations.

  7. Assessment of uncertainty in the numerical simulation of solar irradiance over inclined PV panels: New algorithms using measurements and modeling tools

    DOE PAGES

    Xie, Yu; Sengupta, Manajit; Dooraghi, Mike

    2018-03-20

    Development of accurate transposition models to simulate plane-of-array (POA) irradiance from horizontal measurements or simulations is a complex process mainly because of the anisotropic distribution of diffuse solar radiation in the atmosphere. The limited availability of reliable POA measurements at large temporal and spatial scales leads to difficulties in the comprehensive evaluation of transposition models. This paper proposes new algorithms to assess the uncertainty of transposition models using both surface-based observations and modeling tools. We reviewed the analytical derivation of POA irradiance and the approximation of isotropic diffuse radiation that simplifies the computation. Two transposition models are evaluated against the computation by the rigorous analytical solution. We proposed a new algorithm to evaluate transposition models using the clear-sky measurements at the National Renewable Energy Laboratory's (NREL's) Solar Radiation Research Laboratory (SRRL) and a radiative transfer model that integrates diffuse radiances of various sky-viewing angles. We found that the radiative transfer model and a transposition model based on empirical regressions are superior to the isotropic models when compared to measurements. We further compared the radiative transfer model to the transposition models under an extensive range of idealized conditions. Our results suggest that the empirical transposition model has slightly higher cloudy-sky POA irradiance than the radiative transfer model, but performs better than the isotropic models under clear-sky conditions. Significantly smaller POA irradiances computed by the transposition models are observed when the photovoltaic (PV) panel deviates from the azimuthal direction of the sun. The new algorithms developed in the current study have opened the door to a more comprehensive evaluation of transposition models for various atmospheric conditions and solar and PV orientations.
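
    The isotropic-sky model referred to in these two records has a standard closed form (the Liu-Jordan transposition). A minimal sketch, with illustrative input values, of how POA irradiance is assembled from its beam, sky-diffuse and ground-reflected components:

        # Isotropic-sky transposition (Liu-Jordan form), the simplest member of
        # the model class evaluated in the paper; inputs are illustrative.
        import numpy as np

        def poa_isotropic(dni, dhi, ghi, aoi_deg, tilt_deg, albedo=0.2):
            """Plane-of-array irradiance (W/m^2) under the isotropic-sky assumption."""
            aoi, tilt = np.radians(aoi_deg), np.radians(tilt_deg)
            beam = dni * max(np.cos(aoi), 0.0)              # direct component
            sky = dhi * (1 + np.cos(tilt)) / 2              # isotropic sky diffuse
            ground = ghi * albedo * (1 - np.cos(tilt)) / 2  # ground-reflected
            return beam + sky + ground

        print(poa_isotropic(dni=800.0, dhi=100.0, ghi=600.0, aoi_deg=30.0, tilt_deg=35.0))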

  8. Modeling Criminal Activity in Urban Landscapes

    NASA Astrophysics Data System (ADS)

    Brantingham, Patricia; Glässer, Uwe; Jackson, Piper; Vajihollahi, Mona

    Computational and mathematical methods arguably have an enormous potential for serving practical needs in crime analysis and prevention by offering novel tools for crime investigations and experimental platforms for evidence-based policy making. We present a comprehensive formal framework and tool support for mathematical and computational modeling of criminal behavior to facilitate systematic experimental studies of a wide range of criminal activities in urban environments. The focus is on spatial and temporal aspects of different forms of crime, including opportunistic and serial violent crimes. However, the proposed framework provides a basis to push beyond conventional empirical research and engage the use of computational thinking and social simulations in the analysis of terrorism and counter-terrorism.

  9. Implementing health promotion tools in Australian Indigenous primary health care.

    PubMed

    Percival, Nikki A; McCalman, Janya; Armit, Christine; O'Donoghue, Lynette; Bainbridge, Roxanne; Rowley, Kevin; Doyle, Joyce; Tsey, Komla

    2018-02-01

    In Australia, significant resources have been invested in producing health promotion best practice guidelines, frameworks and tools (herein referred to as health promotion tools) as a strategy to improve Indigenous health promotion programmes. Yet, there has been very little rigorous implementation research about whether or how health promotion tools are implemented. This paper theorizes the complex processes of health promotion tool implementation in Indigenous comprehensive primary healthcare services. Data were derived from published and grey literature about the development and the implementation of four Indigenous health promotion tools. Tools were theoretically sampled to account for the key implementation types described in the literature. Data were analysed using the grounded-theory methods of coding and constant comparison to construct a theoretical implementation model. An Indigenous Health Promotion Tool Implementation Model was developed. Implementation is a social process whereby researchers, practitioners and community members collectively interact to create culturally responsive health promotion, with the common purpose of facilitating empowerment. The implementation of health promotion tools was influenced by the presence of change agents, a commitment to reciprocity, and organizational governance and resourcing. The Indigenous Health Promotion Tool Implementation Model assists in explaining how health promotion tools are implemented and the conditions that influence these actions. Rather than simply developing more health promotion tools, our study suggests that continuous investment in developing the conditions that support empowering implementation processes is required to maximize the beneficial impacts and effectiveness of health promotion tools.

  10. Methods and Practices of Investigators for Determining Participants’ Decisional Capacity and Comprehension of Protocols

    PubMed Central

    Kon, Alexander A.; Klug, Michael

    2010-01-01

    Ethicists recommend that investigators assess subjects' comprehension prior to accepting their consent as valid. Because children represent an at-risk population, ensuring adequate comprehension in pediatric research is vital. We surveyed all corresponding authors of research articles published over a six-month period in five leading adult and pediatric journals. Our goal was to assess how often subjects' comprehension or decisional capacity was assessed in the consent process, whether there was any difference between adult and pediatric research projects, and the rate at which investigators use formal or validated tools to assess capacity. Responses from 102 authors were analyzed (response rate 56%). Approximately two-thirds of respondents stated that they assessed comprehension or decisional capacity prior to accepting consent, and we found no difference between adult and pediatric researchers. Nine investigators used a formal questionnaire, and three used a validated tool. These findings suggest that fewer investigators than expected assess comprehension and decisional capacity, and that the use of standardized and validated tools is the exception rather than the rule. PMID:19385838

  11. Software Tools for Developing and Simulating the NASA LaRC CMF Motion Base

    NASA Technical Reports Server (NTRS)

    Bryant, Richard B., Jr.; Carrelli, David J.

    2006-01-01

    The NASA Langley Research Center (LaRC) Cockpit Motion Facility (CMF) motion base has provided many design and analysis challenges. In the process of addressing these challenges, a comprehensive suite of software tools was developed. The software tool development began with a detailed MATLAB/Simulink model of the motion base, which was used primarily for safety loads prediction, design of the closed-loop compensator, and development of the motion base safety systems. A Simulink model of the digital control law, from which a portion of the embedded code is directly generated, was later added to this model to form a closed-loop system model. Concurrently, software that runs on a PC was created to display and record motion base parameters. It includes a user interface for controlling time-history displays, strip-chart displays, data storage, and initialization of the function generators used during motion base testing. Finally, a software tool was developed for kinematic analysis and prediction of mechanical clearances for the motion system. These tools work together in an integrated package to support normal operations of the motion base and to simulate the end-to-end operation of the motion base system, providing facilities for software-in-the-loop testing, mechanical geometry and sensor data visualization, and function generator setup and evaluation.

  12. Beginning Korean. Yale Linguistic Series.

    ERIC Educational Resources Information Center

    Martin, Samuel E.; Lee, Young-Sook C.

    A "model of structural linguistic analysis as well as a teaching tool," this text is designed to give the student a comprehensive grasp of the essentials of modern Korean in 25 lessons, with 5 review lessons, leading to advanced levels of proficiency. It is intended to be used by adult students working either in classes or by themselves,…

  13. An Investigation of the Effectiveness of Computer Simulation Programs as Tutorial Tools for Teaching Population Ecology at University.

    ERIC Educational Resources Information Center

    Korfiatis, K.; Papatheodorou, E.; Paraskevopoulous, S.; Stamou, G. P.

    1999-01-01

    Describes a study of the effectiveness of computer-simulation programs in enhancing biology students' familiarity with ecological modeling and concepts. Finds that computer simulations improved student comprehension of ecological processes expressed in mathematical form, but did not allow a full understanding of ecological concepts. Contains 28…

  14. Constructing a Model of Lottery Tax Incidence Measurement: Revisiting the Illinois Lottery Tax for Education

    ERIC Educational Resources Information Center

    Daberkow, Kevin S.; Lin, Wei

    2012-01-01

    Nearly half a century of lottery scholarship has measured lottery tax incidence predominantly through either the Suits Index or regression analysis. The present study builds on historic lottery tax burden measurement to present a comprehensive set of tools to determine the tax incidence of individual games in addition to determining which lottery…

  15. Factors influencing subjects' comprehension of a set of medicine package inserts.

    PubMed

    Pires, Carla; Vigário, Marina; Cavaco, Afonso

    2016-08-01

    Background: Package inserts (PIs) should promote the safe and effective use of medicines. The comprehension of PIs is related to socio-demographic features, such as education. Objectives: To evaluate the participants' comprehension of a sample of PIs and to build an explanatory model of subjects' understanding of the content of these documents. Setting: The data were collected from municipalities, city halls, firefighters, the military, schools and charities in two Portuguese regions. Methods: Cross-sectional descriptive survey of 503 participants, homogeneously distributed by education and gender. The self-administered tool comprised questions on socio-demographic data, literacy tasks and comprehension evaluation of 12 purposively selected PIs. A logistic regression analysis was used. Main outcome measures: Scores of numeracy tasks and comprehension. Results: The average comprehension score for the PIs was 63% (±32%), with 48% (n = 239) of the participants scoring <75%. The most important predictors of a comprehension score ≥75% were having >12 years of education and correctly performing a numeracy task [respectively, OR 49.6 (95% CI: 22.8-108) and OR 2.48 (95% CI: 1.5-4.2)]. Conclusion: An explanatory model of subjects' knowledge about the content of the tested PIs was built. Given that a high level of education and literacy were found to be the most relevant predictors of acceptable comprehension rates, PIs should be clearly written to ensure that they are understood by all potential users, including the less educated. The evaluated PIs may thus need to be simplified.
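
    A sketch of the kind of logistic-regression analysis this abstract reports, using synthetic data and assumed variable codings (the odds ratios printed here will not reproduce the study's values):

        # Predicting comprehension >= 75% from education and numeracy with
        # statsmodels; data are synthetic, codings assumed.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 503
        educ_gt12 = rng.integers(0, 2, n)   # >12 years of education (assumed coding)
        numeracy = rng.integers(0, 2, n)    # numeracy task answered correctly
        logit = -1.0 + 2.5 * educ_gt12 + 0.9 * numeracy
        y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

        X = sm.add_constant(np.column_stack([educ_gt12, numeracy]))
        res = sm.Logit(y, X).fit(disp=False)

        print(np.exp(res.params))      # odds ratios
        print(np.exp(res.conf_int()))  # 95% CIs on the OR scale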

  16. Clinical governance is "ACE"--using the EFQM excellence model to support baseline assessment.

    PubMed

    Holland, K; Fennell, S

    2000-01-01

    The introduction of clinical governance in the "new NHS" means that National Health Service (NHS) organisations are now accountable for the quality of the services they provide to their local communities. As part of the implementation of clinical governance in the NHS, Trusts and health authorities had to complete a baseline assessment of their capability and capacity by September 1999. Describes one Trust's approach to developing and implementing its baseline assessment tool, based upon its existing use of the European Foundation for Quality Management (EFQM) Excellence Model. An initial review of the process suggests that the model provides an adaptable framework for the development of a comprehensive and practical assessment tool and that self-assessment ensures ownership of action plans at service level.

  17. Tools for visually exploring biological networks.

    PubMed

    Suderman, Matthew; Hallett, Michael

    2007-10-15

    Many tools exist for visually exploring biological networks, including well-known examples such as Cytoscape, VisANT, Pathway Studio and Patika. These systems play a key role in the development of integrative biology, systems biology and integrative bioinformatics. The trend in the development of these tools is to go beyond 'static' representations of cellular state, towards a more dynamic model of cellular processes through the incorporation of gene expression data, subcellular localization information and time-dependent behavior. We provide a comprehensive review of the relative advantages and disadvantages of existing systems with two goals in mind: to aid researchers in efficiently identifying the appropriate existing tools for data visualization, and to describe the necessary and realistic goals for the next generation of visualization tools. In view of the first goal, we provide in the Supplementary Material a systematic comparison of more than 35 existing tools in terms of over 25 different features. Supplementary data are available at Bioinformatics online.

  18. Modeling and Prediction of Fan Noise

    NASA Technical Reports Server (NTRS)

    Envia, Ed

    2008-01-01

    Fan noise is a significant contributor to the total noise signature of a modern high bypass ratio aircraft engine and, with the advent of ultra high bypass ratio engines like the geared turbofan, it is likely to remain so in the future. As such, accurate modeling and prediction of the basic characteristics of fan noise are necessary ingredients in designing quieter aircraft engines in order to ensure compliance with ever more stringent aviation noise regulations. In this paper, results from a comprehensive study aimed at establishing the utility of current tools for modeling and predicting fan noise will be summarized. It should be emphasized that these tools exemplify the present state of practice and embody what is currently used at NASA and in industry for predicting fan noise. The ability of these tools to model and predict fan noise is assessed against a set of benchmark fan noise databases obtained for a range of representative fan cycles and operating conditions. Detailed comparisons between the predicted and measured narrowband spectral and directivity characteristics of fan noise will be presented in the full paper. General conclusions regarding the utility of current tools and recommendations for future improvements will also be given.

  19. Classification of processes involved in sharing individual participant data from clinical trials.

    PubMed

    Ohmann, Christian; Canham, Steve; Banzi, Rita; Kuchinke, Wolfgang; Battaglia, Serena

    2018-01-01

    Background: In recent years, a cultural change in the handling of data from research has resulted in the strong promotion of a culture of openness and increased sharing of data. In the area of clinical trials, sharing of individual participant data involves a complex set of processes and the interaction of many actors and actions. Individual services/tools to support data sharing are available, but what is missing is a detailed, structured and comprehensive list of processes/subprocesses involved and tools/services needed. Methods: Principles and recommendations from a published data sharing consensus document are analysed in detail by a small expert group. Processes/subprocesses involved in data sharing are identified and linked to actors and possible services/tools. Definitions are adapted from the business process model and notation (BPMN) and applied in the analysis. Results: A detailed and comprehensive list of individual processes/subprocesses involved in data sharing, structured according to 9 main processes, is provided. Possible tools/services to support these processes/subprocesses are identified and grouped according to major type of support. Conclusions: The list of individual processes/subprocesses and tools/services identified is a first step towards development of a generic framework or architecture for sharing of data from clinical trials. Such a framework is strongly needed to give an overview of how various actors, research processes and services could form an interoperable system for data sharing.

  20. Classification of processes involved in sharing individual participant data from clinical trials

    PubMed Central

    Ohmann, Christian; Canham, Steve; Banzi, Rita; Kuchinke, Wolfgang; Battaglia, Serena

    2018-01-01

    Background: In recent years, a cultural change in the handling of data from research has resulted in the strong promotion of a culture of openness and increased sharing of data. In the area of clinical trials, sharing of individual participant data involves a complex set of processes and the interaction of many actors and actions. Individual services/tools to support data sharing are available, but what is missing is a detailed, structured and comprehensive list of processes/subprocesses involved and tools/services needed. Methods: Principles and recommendations from a published data sharing consensus document are analysed in detail by a small expert group. Processes/subprocesses involved in data sharing are identified and linked to actors and possible services/tools. Definitions are adapted from the business process model and notation (BPMN) and applied in the analysis. Results: A detailed and comprehensive list of individual processes/subprocesses involved in data sharing, structured according to 9 main processes, is provided. Possible tools/services to support these processes/subprocesses are identified and grouped according to major type of support. Conclusions: The list of individual processes/subprocesses and tools/services identified is a first step towards development of a generic framework or architecture for sharing of data from clinical trials. Such a framework is strongly needed to give an overview of how various actors, research processes and services could form an interoperable system for data sharing. PMID:29623192

  1. Prognostic and Prediction Tools in Bladder Cancer: A Comprehensive Review of the Literature.

    PubMed

    Kluth, Luis A; Black, Peter C; Bochner, Bernard H; Catto, James; Lerner, Seth P; Stenzl, Arnulf; Sylvester, Richard; Vickers, Andrew J; Xylinas, Evanguelos; Shariat, Shahrokh F

    2015-08-01

    This review focuses on risk assessment and prediction tools for bladder cancer (BCa). To review the current knowledge on risk assessment and prediction tools to enhance clinical decision making and counseling of patients with BCa. A literature search in English was performed using PubMed in July 2013. Relevant risk assessment and prediction tools for BCa were selected. More than 1600 publications were retrieved. Special attention was given to studies that investigated the clinical benefit of a prediction tool. Most prediction tools for BCa focus on the prediction of disease recurrence and progression in non-muscle-invasive bladder cancer or disease recurrence and survival after radical cystectomy. Although these tools are helpful, recent prediction tools aim to address a specific clinical problem, such as the prediction of organ-confined disease and lymph node metastasis to help identify patients who might benefit from neoadjuvant chemotherapy. Although a large number of prediction tools have been reported in recent years, many of them lack external validation. Few studies have investigated the clinical utility of any given model as measured by its ability to improve clinical decision making. There is a need for novel biomarkers to improve the accuracy and utility of prediction tools for BCa. Decision tools hold the promise of facilitating the shared decision process, potentially improving clinical outcomes for BCa patients. Prediction models need external validation and assessment of clinical utility before they can be incorporated into routine clinical care. We looked at models that aim to predict outcomes for patients with bladder cancer (BCa). We found a large number of prediction models that hold the promise of facilitating treatment decisions for patients with BCa. However, many models are missing confirmation in a different patient cohort, and only a few studies have tested the clinical utility of any given model as measured by its ability to improve clinical decision making.

  2. Validation of an instrument to measure inter-organisational linkages in general practice.

    PubMed

    Amoroso, Cheryl; Proudfoot, Judith; Bubner, Tanya; Jayasinghe, Upali W; Holton, Christine; Winstanley, Julie; Beilby, Justin; Harris, Mark F

    2007-12-03

    Linkages between general medical practices and external services are important for high quality chronic disease care. The purpose of this research is to describe the development, evaluation and use of a brief tool that measures the comprehensiveness and quality of a general practice's linkages with external providers for the management of patients with chronic disease. In this study, clinical linkages are defined as the communication, support, and referral arrangements between services for the care and assistance of patients with chronic disease. An interview to measure surgery-level (rather than individual clinician-level) clinical linkages was developed, piloted, reviewed, and evaluated with 97 Australian general practices. Two validated survey instruments were posted to patients, and a survey of locally available services was developed and posted to participating Divisions of General Practice (support organisations). Hypotheses regarding internal validity, association with local services, and patient satisfaction were tested using factor analysis, logistic regression and multilevel regression models. The resulting General Practice Clinical Linkages Interview (GP-CLI) is a nine-item tool with three underlying factors: referral and advice linkages, shared care and care planning linkages, and community access and awareness linkages. Local availability of chronic disease services has no effect on the comprehensiveness of services with which practices link; however, comprehensiveness of clinical linkages is associated with patient assessment of access, receptionist services, and continuity of care in their general practice. The GP-CLI may be useful to researchers examining comparable health care systems for measuring the comprehensiveness and quality of linkages at the general practice level with related services, possessing both internal and external validity. The tool can be used with large samples exploring the impact, outcomes, and facilitators of high quality clinical linkages in general practice.
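
    The three-factor structure reported for the nine-item GP-CLI is the kind of result an exploratory factor analysis yields. A minimal sketch on synthetic item responses (the loadings here are illustrative only, not the instrument's published values):

        # Extracting three latent factors from nine linkage items.
        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(1)
        n_practices, n_items = 97, 9
        latent = rng.normal(size=(n_practices, 3))            # three hypothesized factors
        loadings = rng.normal(scale=0.8, size=(3, n_items))
        items = latent @ loadings + rng.normal(scale=0.5, size=(n_practices, n_items))

        fa = FactorAnalysis(n_components=3, random_state=0).fit(items)
        print(np.round(fa.components_, 2))  # estimated item loadings per factor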

  3. Improving diabetic foot care in a nurse-managed safety-net clinic.

    PubMed

    Peterson, Joann M; Virden, Mary D

    2013-05-01

    This article is a description of the development and implementation of a Comprehensive Diabetic Foot Care Program and assessment tool in an academically affiliated nurse-managed, multidisciplinary, safety-net clinic. The assessment tool parallels parameters identified in the Task Force Foot Care Interest Group of the American Diabetes Association's report published in 2008, "Comprehensive Foot Examination and Risk Assessment." Review of literature, Silver City Health Center's (SCHC) 2009 Annual Report, retrospective chart review. Since the full implementation of SCHC's Comprehensive Diabetic Foot Care Program, there have been no hospitalizations of clinic patients for foot-related complications. The development of the Comprehensive Diabetic Foot Assessment tool and the implementation of the Comprehensive Diabetic Foot Care Program have resulted in positive outcomes for the patients in a nurse-managed safety-net clinic. This article demonstrates that quality healthcare services can successfully be developed and implemented in a safety-net clinic setting.

  4. The use of typed lambda calculus for comprehension and construction of simulation models in the domain of ecology

    NASA Technical Reports Server (NTRS)

    Uschold, Michael

    1992-01-01

    We are concerned with two important issues in simulation modelling: model comprehension and model construction. Model comprehension is limited because many important choices taken during the modelling process are not documented. This makes it difficult for models to be modified or used by others. A key factor hindering model construction is the vast modelling search space which must be navigated. This is exacerbated by the fact that many modellers are unfamiliar with the terms and concepts catered to by current tools. The root of both problems is the lack of facilities for representing or reasoning about domain concepts in current simulation technology. The basis for our achievements in both of these areas is the development of a language with two distinct levels: one for representing domain information, and the other for representing the simulation model. Of equal importance is the fact that we make formal connections between these two levels. The domain we are concerned with is ecological modelling. This language, called Elklogic, is based on the typed lambda calculus. Important features include a rich type structure, the use of various higher-order functions, and semantics. This enables complex expressions to be constructed from relatively few primitives. The meaning of each expression can be determined in terms of the domain, the simulation model, or the relationship between the two. We describe a novel representation for sets and substructure, and a variety of other general concepts that are especially useful in the ecological domain. We use the type structure in a novel way: for controlling the modelling search space, rather than a proof search space. We facilitate model comprehension by representing modelling decisions that are embodied in the simulation model. We represent the simulation model separately from, but in terms of, a domain model. The explicit links between the two models constitute the modelling decisions. The semantics of Elklogic enables English text to be generated to explain the simulation model in domain terms.
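
    For readers unfamiliar with the underlying formalism, the simply typed lambda calculus assigns types by rules such as the standard abstraction and application rules below (illustrative textbook forms; Elklogic's actual notation may differ):

        % Abstraction and application rules of the simply typed lambda calculus.
        \[
        \frac{\Gamma,\; x : \sigma \;\vdash\; e : \tau}
             {\Gamma \;\vdash\; \lambda x{:}\sigma.\, e \;:\; \sigma \to \tau}
        \qquad
        \frac{\Gamma \vdash f : \sigma \to \tau \qquad \Gamma \vdash a : \sigma}
             {\Gamma \vdash f\, a \;:\; \tau}
        \]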

  5. Determinants of the Pace of Global Innovation in Energy Technologies

    DTIC Science & Technology

    2013-10-14

    …quality (see Figures S1 and S2 in File S1), a comprehensive patent database is a powerful tool for investigating the determinants of innovative … model in order to avoid overfitting the data and to maximize predictive power. We develop a model that explains the observed trends in energy … patents. [Figure residue: (A) world map of cumulative patents in photovoltaics (solar); Japan is the leading nation in terms of patent numbers, followed by the US and China.]

  6. Technical Description of Urban Microscale Modeling System: Component 1 of CRTI Project 02-0093RD

    DTIC Science & Technology

    2007-03-01

    …0093RD which involved (1) development and implementation of a computational fluid dynamics model for the simulation of urban flow in an arbitrary … resource will serve as a nation-wide general problem-solving tool for first-responders involved with CBR incidents in the urban environment and … predictions with experimental data obtained from a comprehensive full-scale urban field experiment conducted in Oklahoma City, Oklahoma in July 2003 (Joint …

  7. A multimedia consent tool for research participants in the Gambia: a randomized controlled trial.

    PubMed

    Afolabi, Muhammed Olanrewaju; McGrath, Nuala; D'Alessandro, Umberto; Kampmann, Beate; Imoukhuede, Egeruan B; Ravinetto, Raffaella M; Alexander, Neal; Larson, Heidi J; Chandramohan, Daniel; Bojang, Kalifa

    2015-05-01

    To assess the effectiveness of a multimedia informed consent tool for adults participating in a clinical trial in the Gambia. Adults eligible for inclusion in a malaria treatment trial (n = 311) were randomized to receive information needed for informed consent using either a multimedia tool (intervention arm) or a standard procedure (control arm). A computerized, audio questionnaire was used to assess participants' comprehension of informed consent. This was done immediately after consent had been obtained (at day 0) and at subsequent follow-up visits (days 7, 14, 21 and 28). The acceptability and ease of use of the multimedia tool were assessed in focus groups. On day 0, the median comprehension score in the intervention arm was 64% compared with 40% in the control arm (P = 0.042). The difference remained significant at all follow-up visits. Poorer comprehension was independently associated with female sex (odds ratio, OR: 0.29; 95% confidence interval, CI: 0.12-0.70) and residing in Jahaly rather than Basse province (OR: 0.33; 95% CI: 0.13-0.82). There was no significant independent association with educational level. The risk that a participant's comprehension score would drop to half of the initial value was lower in the intervention arm (hazard ratio 0.22, 95% CI: 0.16-0.31). Overall, 70% (42/60) of focus group participants from the intervention arm found the multimedia tool clear and easy to understand. A multimedia informed consent tool significantly improved comprehension and retention of consent information by research participants with low levels of literacy.

  8. Machine Learning Meta-analysis of Large Metagenomic Datasets: Tools and Biological Insights.

    PubMed

    Pasolli, Edoardo; Truong, Duy Tin; Malik, Faizan; Waldron, Levi; Segata, Nicola

    2016-07-01

    Shotgun metagenomic analysis of the human-associated microbiome provides a rich set of microbial features for prediction and biomarker discovery in the context of human diseases and health conditions. However, the use of such high-resolution microbial features presents new challenges, and validated computational tools for learning tasks are lacking. Moreover, classification rules have scarcely been validated in independent studies, posing questions about the generality and generalization of disease-predictive models across cohorts. In this paper, we comprehensively assess approaches to metagenomics-based prediction tasks and for quantitative assessment of the strength of potential microbiome-phenotype associations. We develop a computational framework for prediction tasks using quantitative microbiome profiles, including species-level relative abundances and presence of strain-specific markers. A comprehensive meta-analysis, with particular emphasis on generalization across cohorts, was performed in a collection of 2424 publicly available metagenomic samples from eight large-scale studies. Cross-validation revealed good disease-prediction capabilities, which were in general improved by feature selection and use of strain-specific markers instead of species-level taxonomic abundance. In cross-study analysis, models transferred between studies were in some cases less accurate than models tested by within-study cross-validation. Interestingly, the addition of healthy (control) samples from other studies to training sets improved disease prediction capabilities. Some microbial species (most notably Streptococcus anginosus) seem to characterize general dysbiotic states of the microbiome rather than connections with a specific disease. Our results in modelling features of the "healthy" microbiome can be considered a first step toward defining general microbial dysbiosis. The software framework, microbiome profiles, and metadata for thousands of samples are publicly available at http://segatalab.cibio.unitn.it/tools/metaml.
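
    A sketch of the within-study cross-validation setup this abstract describes, on a synthetic species-abundance matrix (the paper's actual framework, available at the URL above, adds feature selection, strain-level markers and cross-study transfer):

        # Within-study cross-validation of a disease classifier on
        # relative-abundance profiles (synthetic data, so AUC will be ~0.5).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(2)
        n_samples, n_species = 200, 300
        X = rng.dirichlet(np.ones(n_species), size=n_samples)  # relative abundances
        y = rng.integers(0, 2, n_samples)                      # disease vs control

        clf = RandomForestClassifier(n_estimators=500, random_state=0)
        scores = cross_val_score(clf, X, y, cv=10, scoring="roc_auc")
        print(f"10-fold AUC: {scores.mean():.2f} +/- {scores.std():.2f}")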

  9. Shuttle Debris Impact Tool Assessment Using the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    DeLoach, R.; Rayos, E. M.; Campbell, C. H.; Rickman, S. L.

    2006-01-01

    Computational tools have been developed to estimate thermal and mechanical reentry loads experienced by the Space Shuttle Orbiter as the result of cavities in the Thermal Protection System (TPS). Such cavities can be caused by impact from ice or insulating foam debris shed from the External Tank (ET) on liftoff. The reentry loads depend on cavity geometry and certain Shuttle state variables, among other factors. Certain simplifying assumptions have been made in the tool development about the cavity geometry variables. For example, the cavities are all modeled as "shoeboxes", with rectangular cross-sections and planar walls. So an actual cavity is typically approximated with an idealized cavity described in terms of its length, width, and depth, as well as its entry angle, exit angle, and side angles (assumed to be the same for both sides). As part of a comprehensive assessment of the uncertainty in reentry loads estimated by the debris impact assessment tools, an effort has been initiated to quantify the component of the uncertainty that is due to imperfect geometry specifications for the debris impact cavities. The approach is to compute predicted loads for a set of geometry factor combinations sufficient to develop polynomial approximations to the complex, nonparametric underlying computational models. Such polynomial models are continuous and feature estimable, continuous derivatives, conditions that facilitate the propagation of independent variable errors. As an additional benefit, once the polynomial models have been developed, they require fewer computational resources to execute than the underlying finite element and computational fluid dynamics codes, and can generate reentry loads estimates in significantly less time. This provides a practical screening capability, in which a large number of debris impact cavities can be quickly classified either as harmless, or subject to additional analysis with the more comprehensive underlying computational tools. The polynomial models also provide useful insights into the sensitivity of reentry loads to various cavity geometry variables, and reveal complex interactions among those variables that indicate how the sensitivity of one variable depends on the level of one or more other variables. For example, the effect of cavity length on certain reentry loads depends on the depth of the cavity. Such interactions are clearly displayed in the polynomial response models.
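
    A minimal sketch of the surrogate-modeling idea described above: fit a low-order polynomial to a handful of geometry-factor combinations from an expensive model, then propagate geometry uncertainty through the cheap surrogate (all functions and values below are stand-ins, not the actual debris-impact tools):

        # Quadratic response-surface surrogate plus first-order error propagation.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.preprocessing import PolynomialFeatures

        def expensive_model(length, depth):
            # Stand-in for the CFD/thermal codes; returns a notional reentry load.
            return 1.0 + 0.8 * length + 0.5 * depth + 0.3 * length * depth

        grid = np.array([(l, d) for l in np.linspace(1, 4, 5)
                                for d in np.linspace(0.2, 1.0, 5)])
        loads = np.array([expensive_model(l, d) for l, d in grid])

        poly = PolynomialFeatures(degree=2, include_bias=False)
        surrogate = LinearRegression().fit(poly.fit_transform(grid), loads)

        # Propagate geometry uncertainty via finite differences on the surrogate.
        x0, sigma = np.array([2.5, 0.6]), np.array([0.1, 0.05])
        f = lambda x: surrogate.predict(poly.transform(x.reshape(1, -1)))[0]
        grad = np.array([(f(x0 + h) - f(x0 - h)) / 2e-4
                         for h in np.eye(2) * 1e-4])
        print("load:", f(x0), "+/-", float(np.sqrt(np.sum((grad * sigma) ** 2))))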

  10. Informed consent comprehension in African research settings.

    PubMed

    Afolabi, Muhammed O; Okebe, Joseph U; McGrath, Nuala; Larson, Heidi J; Bojang, Kalifa; Chandramohan, Daniel

    2014-06-01

    Previous reviews on participants' comprehension of informed consent information have focused on developed countries. Experience has shown that ethical standards developed on Western values may not be appropriate for African settings where research concepts are unfamiliar. We undertook this review to describe how informed consent comprehension is defined and measured in African research settings. We conducted a comprehensive search involving five electronic databases: Medline, Embase, Global Health, EthxWeb and Bioethics Literature Database (BELIT). We also examined African Index Medicus and Google Scholar for relevant publications on informed consent comprehension in clinical studies conducted in sub-Saharan Africa. 29 studies satisfied the inclusion criteria; meta-analysis was possible in 21 studies. We further conducted a direct comparison of participants' comprehension on domains of informed consent in all eligible studies. Comprehension of key concepts of informed consent varies considerably from country to country and depends on the nature and complexity of the study. Meta-analysis showed that 47% of a total of 1633 participants across four studies demonstrated comprehension about randomisation (95% CI 13.9-80.9%). Similarly, 48% of 3946 participants in six studies had understanding about placebo (95% CI 19.0-77.5%), while only 30% of 753 participants in five studies understood the concept of therapeutic misconception (95% CI 4.6-66.7%). Measurement tools for informed consent comprehension were developed with little or no validation. Assessment of comprehension was carried out at variable times after disclosure of study information. No uniform definition of informed consent comprehension exists to form the basis for development of an appropriate tool to measure comprehension in African participants. Comprehension of key concepts of informed consent is poor among study participants across Africa. There is a vital need to develop a uniform definition for informed consent comprehension in low literacy research settings in Africa. This will be an essential step towards developing appropriate tools that can adequately measure informed consent comprehension. This may consequently suggest adequate measures to improve the informed consent procedure.

  11. CalFitter: a web server for analysis of protein thermal denaturation data.

    PubMed

    Mazurenko, Stanislav; Stourac, Jan; Kunka, Antonin; Nedeljkovic, Sava; Bednar, David; Prokop, Zbynek; Damborsky, Jiri

    2018-05-14

    Despite significant advances in the understanding of protein structure-function relationships, revealing protein folding pathways still poses a challenge due to a limited number of relevant experimental tools. Widely-used experimental techniques, such as calorimetry or spectroscopy, critically depend on a proper data analysis. Currently, there are only separate data analysis tools available for each type of experiment with a limited model selection. To address this problem, we have developed the CalFitter web server to be a unified platform for comprehensive data fitting and analysis of protein thermal denaturation data. The server allows simultaneous global data fitting using any combination of input data types and offers 12 protein unfolding pathway models for selection, including irreversible transitions often missing from other tools. The data fitting produces optimal parameter values, their confidence intervals, and statistical information to define unfolding pathways. The server provides an interactive and easy-to-use interface that allows users to directly analyse input datasets and simulate modelled output based on the model parameters. CalFitter web server is available free at https://loschmidt.chemi.muni.cz/calfitter/.
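
    The simplest pathway such tools fit is reversible two-state unfolding. A minimal sketch (not CalFitter's code) that recovers a melting temperature and van 't Hoff enthalpy from a synthetic melting curve:

        # Two-state equilibrium unfolding: K(T) = exp(-dH/R * (1/T - 1/Tm)).
        import numpy as np
        from scipy.optimize import curve_fit

        R = 8.314  # J/(mol K)

        def frac_unfolded(T, Tm, dH):
            K = np.exp(-dH / R * (1.0 / T - 1.0 / Tm))
            return K / (1.0 + K)

        # Synthetic melting curve around an assumed Tm of 330 K.
        T = np.linspace(300.0, 360.0, 61)
        rng = np.random.default_rng(3)
        data = frac_unfolded(T, 330.0, 4.0e5) + rng.normal(0.0, 0.02, T.size)

        popt, pcov = curve_fit(frac_unfolded, T, data, p0=(325.0, 3.0e5))
        print(f"Tm = {popt[0]:.1f} K, dH = {popt[1]:.2e} J/mol")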

  12. Theoretical analysis of microwave propagation

    NASA Astrophysics Data System (ADS)

    Parl, S.; Malaga, A.

    1984-04-01

    This report documents a comprehensive investigation of microwave propagation. The structure of line-of-sight multipath is determined and the impact on practical diversity is discussed. A new model of diffraction propagation for multiple rounded obstacles is developed. A troposcatter model valid at microwave frequencies is described. New results for the power impulse response, delay spread and Doppler spread are developed. A two-component model separating large- and small-scale scatter effects is proposed. The prediction techniques for diffraction and troposcatter have been implemented in a computer program intended as a tool to analyze propagation experiments.

  13. Foreign Language Analysis and Recognition (FLARe) Initial Progress

    DTIC Science & Technology

    2012-11-29

    … University Language Modeling ToolKit; CoMMA, Count Mediated Morphological Analysis; CRUD, Create, Read, Update & Delete; CPAN, Comprehensive Perl Archive … [Report documentation residue: dates covered 1 October 2010 – 30 September 2012; title: Foreign Language Analysis and Recognition (FLARe) Initial Progress; report number AFRL-RH-WP-TR-2012-0165; author Brian M. Ore.]

  14. The CEBAF Element Database and Related Operational Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larrieu, Theodore; Slominski, Christopher; Keesee, Marie

    The newly commissioned 12 GeV CEBAF accelerator relies on a flexible, scalable and comprehensive database to define the accelerator. This database delivers the configuration for CEBAF operational tools, including hardware checkout, the downloadable optics model, control screens, and much more. The presentation will describe the flexible design of the CEBAF Element Database (CED), its features and assorted use case examples.

  15. Lessons in Media Literacy and Students' Comprehension of Television and Text Advertisements.

    ERIC Educational Resources Information Center

    Verkaik, Nan; Gathercoal, Paul

    A Media Studies program enhances the goals of formal schooling by providing every student with knowledge and skills to wisely select, access and use the communications and information tools they will need to be responsible citizens in a free society. All students deserve a good media education. This paper provides a model to address this need…

  16. Lotus Base: An integrated information portal for the model legume Lotus japonicus

    PubMed Central

    Mun, Terry; Bachmann, Asger; Gupta, Vikas; Stougaard, Jens; Andersen, Stig U.

    2016-01-01

    Lotus japonicus is a well-characterized model legume widely used in the study of plant-microbe interactions. However, datasets from various Lotus studies are poorly integrated and lack interoperability. We recognize the need for a comprehensive repository that allows dynamic exploration of Lotus genomic and transcriptomic data. Equally important are user-friendly in-browser tools designed for data visualization and interpretation. Here, we present Lotus Base, which opens to the research community a large, established LORE1 insertion mutant population containing in excess of 120,000 lines, and serves the end user tightly integrated data from Lotus, such as the reference genome, annotated proteins, and expression profiling data. We report the integration of expression data from the L. japonicus gene expression atlas project, and the development of tools to cluster and export such data, allowing users to construct, visualize, and annotate co-expression gene networks. Lotus Base takes advantage of modern advances in browser technology to deliver powerful data interpretation for biologists. Its modular construction and publicly available application programming interface enable developers to tap into the wealth of integrated Lotus data. Lotus Base is freely accessible at: https://lotus.au.dk. PMID:28008948

  17. Bayesian Network Webserver: a comprehensive tool for biological network modeling.

    PubMed

    Ziebarth, Jesse D; Bhattacharya, Anindya; Cui, Yan

    2013-11-01

    The Bayesian Network Webserver (BNW) is a platform for comprehensive network modeling of systems genetics and other biological datasets. It allows users to quickly and seamlessly upload a dataset, learn the structure of the network model that best explains the data and use the model to understand relationships between network variables. Many datasets, including those used to create genetic network models, contain both discrete (e.g. genotype) and continuous (e.g. gene expression traits) variables, and BNW allows for modeling hybrid datasets. Users of BNW can incorporate prior knowledge during structure learning through an easy-to-use structural constraint interface. After structure learning, users are immediately presented with an interactive network model, which can be used to make testable hypotheses about network relationships. BNW, including a downloadable structure learning package, is available at http://compbio.uthsc.edu/BNW. (The BNW interface for adding structural constraints uses HTML5 features that are not supported by current versions of Internet Explorer; we suggest using other browsers, e.g. Google Chrome or Mozilla Firefox, when accessing BNW.) ycui2@uthsc.edu. Supplementary data are available at Bioinformatics online.
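
    A sketch of score-based structure learning of the kind BNW performs, here with the pgmpy library on a discretized synthetic genotype-expression-trait dataset (pgmpy's API varies across versions; this is illustrative, not BNW's own code):

        # Hill-climbing structure search scored by BIC.
        import numpy as np
        import pandas as pd
        from pgmpy.estimators import HillClimbSearch, BicScore

        rng = np.random.default_rng(4)
        n = 500
        genotype = rng.integers(0, 2, n)               # discrete variable
        expr = 0.8 * genotype + rng.normal(0, 0.3, n)  # continuous, genotype-driven
        trait = 0.6 * expr + rng.normal(0, 0.3, n)     # continuous, expression-driven

        # Discretize continuous columns for a purely discrete learner.
        df = pd.DataFrame({
            "genotype": genotype,
            "expr": pd.qcut(expr, 3, labels=False),
            "trait": pd.qcut(trait, 3, labels=False),
        })

        dag = HillClimbSearch(df).estimate(scoring_method=BicScore(df))
        print(dag.edges())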

  18. Application of enhanced modern structured analysis techniques to Space Station Freedom electric power system requirements

    NASA Technical Reports Server (NTRS)

    Biernacki, John; Juhasz, John; Sadler, Gerald

    1991-01-01

    A team of Space Station Freedom (SSF) system engineers is in the process of an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.

  19. Browsing Space Weather Data and Models with the Integrated Space Weather Analysis (iSWA) System

    NASA Technical Reports Server (NTRS)

    Maddox, Marlo M.; Mullinix, Richard E.; Berrios, David H.; Hesse, Michael; Rastaetter, Lutz; Pulkkinen, Antti; Hourcle, Joseph A.; Thompson, Barbara J.

    2011-01-01

    The Integrated Space Weather Analysis (iSWA) System is a comprehensive web-based platform for space weather information that combines data from solar, heliospheric and geospace observatories with forecasts based on the most advanced space weather models. The iSWA system collects, generates, and presents a wide array of space weather resources in an intuitive, user-configurable, and adaptable format - thus enabling users to respond to current and future space weather impacts as well as enabling post-impact analysis. iSWA currently provides over 200 data and modeling products, and features a variety of tools that allow the user to browse, combine, and examine data and models from various sources. This presentation will consist of a summary of the iSWA products and an overview of the customizable user interfaces, and will feature several tutorial demonstrations highlighting the interactive tools and advanced capabilities.

  20. Source-term development for a contaminant plume for use by multimedia risk assessment models

    NASA Astrophysics Data System (ADS)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.

    2000-02-01

    Multimedia modelers from the US Environmental Protection Agency (EPA) and US Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: MEPAS, MMSOILS, PRESTO, and RESRAD. These models represent typical analytically based tools that are used in human-risk and endangerment assessments at installations containing radioactive and hazardous contaminants. The objective is to demonstrate an approach for developing an adequate source term by simplifying an existing, real-world 90Sr plume at DOE's Hanford installation in Richland, WA, for use in a multimedia benchmarking exercise between the four models. Source characteristics and a release mechanism are developed and described; also described is the typical process and procedure that an analyst would follow in developing a source term for use with this class of analytical tool in a preliminary assessment.

  1. A description of the new 3D electron gun and collector modeling tool: MICHELLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petillo, J.; Mondelli, A.; Krueger, W.

    1999-07-01

    A new 3D finite-element gun and collector modeling code is under development at SAIC in collaboration with industrial partners and national laboratories. This development program has been designed specifically to address the shortcomings of current simulation and modeling tools. In particular, although 3D gun codes exist today, their ability to resolve fine-scale features is limited by the disparate length scales of certain classes of devices. Additionally, features such as advanced emission rules, including thermionic Child's law, and comprehensive secondary-emission models also need attention. The program specifically targets problem classes including gridded guns, sheet-beam guns, multi-beam devices, and anisotropic collectors. The presentation will provide an overview of the program objectives, the approach to be taken by the development team, and the status of the project.
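    For readers unfamiliar with the emission rules mentioned above, the following sketch evaluates the two classical limits an emission model must reconcile: the space-charge-limited Child-Langmuir current and the temperature-limited Richardson current, with the cathode supplying whichever is lower. This is a textbook planar-diode illustration with invented operating numbers, not MICHELLE's implementation.

    import numpy as np

    EPS0 = 8.854e-12      # vacuum permittivity, F/m
    Q_E = 1.602e-19       # elementary charge, C
    M_E = 9.109e-31       # electron mass, kg
    K_B_EV = 8.617e-5     # Boltzmann constant, eV/K
    A_RD = 1.202e6        # Richardson constant, A/(m^2 K^2)

    def child_langmuir(voltage, gap):
        # Space-charge-limited current density of a planar diode, A/m^2.
        return (4 * EPS0 / 9) * np.sqrt(2 * Q_E / M_E) * voltage**1.5 / gap**2

    def richardson(temp_k, work_fn_ev):
        # Temperature-limited thermionic current density, A/m^2.
        return A_RD * temp_k**2 * np.exp(-work_fn_ev / (K_B_EV * temp_k))

    v, d = 10e3, 5e-3                  # 10 kV across a 5 mm gap (illustrative)
    j_scl = child_langmuir(v, d)
    j_th = richardson(1300.0, 2.0)     # illustrative temperature / work function
    print(f"Child-Langmuir limit: {j_scl / 1e4:.1f} A/cm^2")
    print(f"Richardson limit:     {j_th / 1e4:.1f} A/cm^2")
    print(f"emitted:              {min(j_scl, j_th) / 1e4:.1f} A/cm^2")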

  2. A comprehensive linear programming tool to optimize formulations of ready-to-use therapeutic foods: An application to Ethiopia

    USDA-ARS?s Scientific Manuscript database

    Ready-to-use therapeutic food (RUTF) is the standard of care for children suffering from noncomplicated severe acute malnutrition (SAM). The objective was to develop a comprehensive linear programming (LP) tool to create novel RUTF formulations for Ethiopia. A systematic approach that surveyed inter...
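    The core of such a tool is an ordinary linear program: minimize formulation cost subject to nutrient bounds. A minimal sketch with scipy follows; every ingredient, cost, and nutrient value below is an invented placeholder, not data from the study.

    import numpy as np
    from scipy.optimize import linprog

    # Columns: peanut paste, milk powder, sugar, oil (all values invented).
    cost    = np.array([1.8, 3.5, 0.6, 1.2])            # USD per 100 g
    energy  = np.array([567.0, 496.0, 387.0, 884.0])    # kcal per 100 g
    protein = np.array([26.0, 26.3, 0.0, 0.0])          # g per 100 g

    # Constraints per 100 g of product: energy >= 520 kcal,
    # 10 <= protein <= 16, and ingredient fractions summing to 1.
    res = linprog(cost,
                  A_ub=np.array([-energy, -protein, protein]),
                  b_ub=np.array([-520.0, -10.0, 16.0]),
                  A_eq=np.ones((1, 4)), b_eq=np.array([1.0]),
                  bounds=[(0, 1)] * 4)

    for name, frac in zip(["peanut", "milk", "sugar", "oil"], res.x):
        print(f"{name:7s} {100 * frac:5.1f} g per 100 g")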

  3. Analysis of the comprehensibility of chemical hazard communication tools at the industrial workplace.

    PubMed

    Ta, Goh Choo; Mokhtar, Mazlin Bin; Mohd Mokhtar, Hj Anuar Bin; Ismail, Azmir Bin; Abu Yazid, Mohd Fadhil Bin Hj

    2010-01-01

    Chemical classification and labelling systems may be roughly similar from one country to another, but there are significant differences too. In order to harmonize various chemical classification systems and ultimately provide consistent chemical hazard communication tools worldwide, the Globally Harmonized System of Classification and Labelling of Chemicals (GHS) was endorsed by the United Nations Economic and Social Council (ECOSOC). Several countries, including Japan, Taiwan, Korea and Malaysia, are now in the process of implementing GHS. It is essential to ascertain the comprehensibility of the chemical hazard communication tools that are described in the GHS documents, namely chemical labels and Safety Data Sheets (SDS). Comprehensibility Testing (CT) was carried out with a mixed group of industrial workers in Malaysia (n=150), and factors that influence comprehensibility were analysed using one-way ANOVA. The ability of the respondents to retrieve information from the SDS was also tested in this study. The findings show that almost all the GHS pictograms meet the ISO comprehension criteria. It is concluded that training and education are the underlying core elements that enhance comprehension of GHS pictograms and that are essential in developing persons competent in the use of SDS.

  4. Comparing 2 National Organization-Level Workplace Health Promotion and Improvement Tools, 2013–2015

    PubMed Central

    Lang, Jason E.; Davis, Whitney D.; Jones-Jack, Nkenge H.; Mukhtar, Qaiser; Lu, Hua; Acharya, Sushama D.; Molloy, Meg E.

    2016-01-01

    Creating healthy workplaces is becoming more common. Half of employers that have more than 50 employees offer some type of workplace health promotion program. Few employers implement comprehensive evidence-based interventions that reach all employees and achieve desired health and cost outcomes. A few organization-level assessment and benchmarking tools have emerged to help employers evaluate the comprehensiveness and rigor of their health promotion offerings. Even fewer tools exist that combine assessment with technical assistance and guidance to implement evidence-based practices. Our descriptive analysis compares 2 such tools, the Centers for Disease Control and Prevention’s Worksite Health ScoreCard and Prevention Partners’ WorkHealthy America, and presents data from both to describe workplace health promotion practices across the United States. These tools are reaching employers of all types (N = 1,797), and many employers are using a comprehensive approach (85% of those using WorkHealthy America and 45% of those using the ScoreCard), increasing program effectiveness and impact. PMID:27685429

  5. Understanding Kidney Disease: Toward the Integration of Regulatory Networks Across Species

    PubMed Central

    Ju, Wenjun; Brosius, Frank C.

    2010-01-01

    Animal models have long been useful in investigating both normal and abnormal human physiology. Systems biology provides a relatively new set of approaches to identify similarities and differences between animal models and humans that may lead to a more comprehensive understanding of human kidney pathophysiology. In this review, we briefly describe how genome-wide analyses of mouse models have helped elucidate features of human kidney diseases, discuss strategies to achieve effective network integration, and summarize currently available web-based tools that may facilitate integration of data across species. The rapid progress in systems biology and orthology, as well as the advent of web-based tools to facilitate these processes, now make it possible to take advantage of knowledge from distant animal species in targeted identification of regulatory networks that may have clinical relevance for human kidney diseases. PMID:21044762

  6. Systems biology of embryonic development: Prospects for a complete understanding of the Caenorhabditis elegans embryo.

    PubMed

    Murray, John Isaac

    2018-05-01

    The convergence of developmental biology and modern genomics tools brings the potential for a comprehensive understanding of developmental systems. This is especially true for the Caenorhabditis elegans embryo because its small size, invariant developmental lineage, and powerful genetic and genomic tools provide the prospect of a cellular-resolution understanding of messenger RNA (mRNA) expression and regulation across the organism. We describe here how a systems biology framework might allow large-scale determination of the embryonic regulatory relationships encoded in the C. elegans genome. This framework consists of two broad steps: (a) defining the "parts list", all genes expressed in all cells at each time during development, and (b) iterative steps of computational modeling and refinement of these models by experimental perturbation. Substantial progress has been made towards defining the parts list through imaging methods such as large-scale green fluorescent protein (GFP) reporter analysis. Imaging results are now being augmented by high-resolution transcriptome methods such as single-cell RNA sequencing, and it is likely the complete expression patterns of all genes across the embryo will be known within the next few years. In contrast, the modeling and perturbation experiments performed so far have focused largely on individual cell types or genes, and improved methods will be needed to expand them to the full genome and organism. This emerging comprehensive map of embryonic expression and regulatory function will provide a powerful resource for developmental biologists, and would also allow scientists to ask questions not accessible without a comprehensive picture. This article is categorized under: Invertebrate Organogenesis > Worms; Technologies > Analysis of the Transcriptome; Gene Expression and Transcriptional Hierarchies > Gene Networks and Genomics. © 2018 Wiley Periodicals, Inc.

  7. New EVSE Analytical Tools/Models: Electric Vehicle Infrastructure Projection Tool (EVI-Pro)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Eric W; Rames, Clement L; Muratori, Matteo

    This presentation addresses the fundamental question of how much charging infrastructure is needed in the United States to support PEVs. It complements ongoing EVSE initiatives by providing a comprehensive analysis of national PEV charging infrastructure requirements. The result is a quantitative estimate for a U.S. network of non-residential (public and workplace) EVSE that would be needed to support broader PEV adoption. The analysis provides guidance to public and private stakeholders who are seeking to provide nationwide charging coverage, improve the EVSE business case by maximizing station utilization, and promote effective use of private/public infrastructure investments.

  8. Automated eukaryotic gene structure annotation using EVidenceModeler and the Program to Assemble Spliced Alignments

    PubMed Central

    Haas, Brian J; Salzberg, Steven L; Zhu, Wei; Pertea, Mihaela; Allen, Jonathan E; Orvis, Joshua; White, Owen; Buell, C Robin; Wortman, Jennifer R

    2008-01-01

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation. PMID:18190707
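    The weighted-consensus idea at the heart of EVM can be shown in a few lines: each evidence type carries a user-configured weight, and the candidate gene structure with the greatest total supporting weight wins. The weights and candidates below are invented for illustration, not EVM's actual configuration format.

    # Invented evidence weights, in the spirit of an EVM weights file.
    weights = {"ab_initio": 1.0, "protein_alignment": 5.0, "est_alignment": 10.0}

    # Which evidence types support which candidate gene structure (invented).
    support = {
        "structure_A": ["ab_initio"],
        "structure_B": ["protein_alignment", "est_alignment"],
    }

    scores = {s: sum(weights[e] for e in ev) for s, ev in support.items()}
    print(scores)                            # structure_B wins, 15.0 vs 1.0
    print("consensus:", max(scores, key=scores.get))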

  9. Role of Knowledge Management in Development and Lifecycle Management of Biopharmaceuticals.

    PubMed

    Rathore, Anurag S; Garcia-Aponte, Oscar Fabián; Golabgir, Aydin; Vallejo-Diaz, Bibiana Margarita; Herwig, Christoph

    2017-02-01

    Knowledge Management (KM) is a key enabler for achieving quality in a lifecycle approach to the production of biopharmaceuticals. Due to the important role that it plays in the successful implementation of Quality by Design (QbD), an analysis of KM solutions is needed. This work provides a comprehensive review of the interface between KM and QbD-driven biopharmaceutical production systems from both academic and industrial viewpoints. A comprehensive set of 356 publications addressing the applications of KM tools to QbD-related tasks was screened, and a query to gather industrial inputs from 17 major biopharmaceutical organizations was performed. Three KM tool classes were identified as having high relevance for biopharmaceutical production systems and were further explored: knowledge indicators, ontologies, and process modeling. A proposed categorization of 16 distinct KM tool classes allowed for the identification of holistic technologies supporting QbD. In addition, the classification allowed for addressing the disparity between industrial and academic expectations regarding the application of KM methodologies. This is a first-of-its-kind attempt, and we think this paper will be of considerable interest to those in academia and industry who are engaged in accelerating the development and commercialization of biopharmaceuticals.

  10. Comprehension Tools for Teachers: Reading for Understanding from Prekindergarten through Fourth Grade

    PubMed Central

    Connor, Carol McDonald; Phillips, Beth M.; Kaschak, Michael; Apel, Kenn; Kim, Young-Suk; Al Otaiba, Stephanie; Crowe, Elizabeth C.; Thomas-Tate, Shurita; Johnson, Lakeisha Cooper; Lonigan, Christopher J.

    2015-01-01

    This paper describes the theoretical framework, as well as the development and testing of the intervention, Comprehension Tools for Teachers (CTT), which is composed of eight component interventions targeting malleable language and reading comprehension skills that emerging research indicates contribute to proficient reading for understanding for prekindergarteners through fourth graders. Component interventions target processes considered largely automatic as well as more reflective processes, with interacting and reciprocal effects. Specifically, we present component interventions targeting cognitive, linguistic, and text-specific processes, including morphological awareness, syntax, mental-state verbs, comprehension monitoring, narrative and expository text structure, enacted comprehension, academic knowledge, and reading to learn from informational text. Our aim was to develop a tool set composed of intensive meaningful individualized small group interventions. We improved feasibility in regular classrooms through the use of design-based iterative research methods including careful lesson planning, targeted scripting, pre- and postintervention proximal assessments, and technology. In addition to the overall framework, we discuss seven of the component interventions and general results of design and efficacy studies. PMID:26500420

  11. A multimedia consent tool for research participants in the Gambia: a randomized controlled trial

    PubMed Central

    McGrath, Nuala; D’Alessandro, Umberto; Kampmann, Beate; Imoukhuede, Egeruan B; Ravinetto, Raffaella M; Alexander, Neal; Larson, Heidi J; Chandramohan, Daniel; Bojang, Kalifa

    2015-01-01

    Abstract Objective To assess the effectiveness of a multimedia informed consent tool for adults participating in a clinical trial in the Gambia. Methods Adults eligible for inclusion in a malaria treatment trial (n = 311) were randomized to receive information needed for informed consent using either a multimedia tool (intervention arm) or a standard procedure (control arm). A computerized, audio questionnaire was used to assess participants’ comprehension of informed consent. This was done immediately after consent had been obtained (at day 0) and at subsequent follow-up visits (days 7, 14, 21 and 28). The acceptability and ease of use of the multimedia tool were assessed in focus groups. Findings On day 0, the median comprehension score in the intervention arm was 64% compared with 40% in the control arm (P = 0.042). The difference remained significant at all follow-up visits. Poorer comprehension was independently associated with female sex (odds ratio, OR: 0.29; 95% confidence interval, CI: 0.12–0.70) and residing in Jahaly rather than Basse province (OR: 0.33; 95% CI: 0.13–0.82). There was no significant independent association with educational level. The risk that a participant’s comprehension score would drop to half of the initial value was lower in the intervention arm (hazard ratio 0.22, 95% CI: 0.16–0.31). Overall, 70% (42/60) of focus group participants from the intervention arm found the multimedia tool clear and easy to understand. Conclusion A multimedia informed consent tool significantly improved comprehension and retention of consent information by research participants with low levels of literacy. PMID:26229203

  12. Models for residential- and commercial-sector energy conservation analysis: Applications, limitations, and future potential

    NASA Astrophysics Data System (ADS)

    Cole, H. E.; Fuller, R. E.

    1980-09-01

    Four of the major models used by DOE for energy conservation analyses in the residential and commercial building sectors are reviewed and critically analyzed to determine how these models can serve as tools for DOE and its Conservation Policy Office in evaluating and quantifying their policy and program requirements. The most effective role for each model in addressing future issues of buildings energy conservation policy and analysis is assessed. The four models covered are: Oak Ridge Residential Energy Model; Micro Analysis of Transfers to Households/Comprehensive Human Resources Data System (MATH/CHRDS) Model; Oak Ridge Commercial Energy Model; and Brookhaven Buildings Energy Conservation Optimization Model (BECOM).

  13. Development of Sustainability Assessment Tool for Malaysian hydropower industry: A case study

    NASA Astrophysics Data System (ADS)

    Turan, Faiz Mohd; Johan, Kartina; Abu Sofian, Muhammad Irfan

    2018-04-01

    This research deals with the development of a sustainability assessment tool as a medium to assess a hydropower project's compliance with sustainability practice. Given the increasing need to implement sustainability practice, developed countries are utilizing sustainability tools to achieve sustainable development goals, but adoption within ASEAN countries, including Malaysia, is still low. The problem with most tools developed in other countries is that they are not very comprehensive, and their implementation factors are neither quantified nor suited to the local environment. Hence, there is a need to develop a suitable sustainability assessment tool for the Malaysian hydropower industry to comply with the sustainable development goals, bridging the gap between the governing body and the practitioner. The work toward this goal is separated into several parts. The first part is to identify sustainability parameters from established tools as a model for comparison and to derive new parameters. The second stage is to convert equivalent quantification values from the model to the newly developed tool. The last stage is to develop a software program for gathering energy company feedback through systematic sustainability reporting from the surveyor, so as to integrate sustainability assessment, monitoring, and self-improved reporting.

  14. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package RotorCraft Optimization Tools (RCOTOOLS) is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to locate the text strings that identify specific variables as optimization inputs and response variables. This paper provides an overview of RCOTOOLS and its use.
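    The wrapper pattern described, rewriting a text input file, running the code, and scraping a response from its output, can be sketched generically as below. The placeholder convention, file names, and output pattern are invented; they do not reflect NDARC or CAMRAD II file formats.

    import re
    import subprocess

    def run_case(template, input_path, design_vars, exe, output_path, pattern):
        # Substitute design-variable values into an input file, run the
        # analysis code, and extract one response value from its output.
        text = open(template).read()
        for name, value in design_vars.items():
            text = text.replace(f"@{name}@", str(value))  # invented @NAME@ markers
        open(input_path, "w").write(text)

        subprocess.run([exe, input_path], check=True)

        match = re.search(pattern, open(output_path).read())
        return float(match.group(1))

    # Hypothetical call from inside an OpenMDAO component's compute():
    # weight = run_case("rotor.tmpl", "rotor.in", {"RADIUS": 8.2},
    #                   "./sizing_code", "rotor.out",
    #                   r"GROSS WEIGHT\s*=\s*([\d.]+)")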

  15. OPUS: A Comprehensive Search Tool for Remote Sensing Observations of the Outer Planets. Now with Enhanced Geometric Metadata for Cassini and New Horizons Optical Remote Sensing Instruments.

    NASA Astrophysics Data System (ADS)

    Gordon, M. K.; Showalter, M. R.; Ballard, L.; Tiscareno, M.; French, R. S.; Olson, D.

    2017-06-01

    The PDS RMS Node hosts OPUS - an accurate, comprehensive search tool for spacecraft remote sensing observations. OPUS supports Cassini: CIRS, ISS, UVIS, VIMS; New Horizons: LORRI, MVIC; Galileo SSI; Voyager ISS; and Hubble: ACS, STIS, WFC3, WFPC2.

  16. Measuring New Media Literacies: Towards the Development of a Comprehensive Assessment Tool

    ERIC Educational Resources Information Center

    Literat, Ioana

    2014-01-01

    This study assesses the psychometric properties of a newly tested self-report assessment tool for media literacy, based on the twelve new media literacy skills (NMLs) developed by Jenkins et al. (2006). The sample (N = 327) consisted of normal volunteers who completed a comprehensive online survey that measured their NML skills, media exposure,…

  17. Freiburg RNA tools: a central online resource for RNA-focused research and teaching.

    PubMed

    Raden, Martin; Ali, Syed M; Alkhnbashi, Omer S; Busch, Anke; Costa, Fabrizio; Davis, Jason A; Eggenhofer, Florian; Gelhausen, Rick; Georg, Jens; Heyne, Steffen; Hiller, Michael; Kundu, Kousik; Kleinkauf, Robert; Lott, Steffen C; Mohamed, Mostafa M; Mattheis, Alexander; Miladi, Milad; Richter, Andreas S; Will, Sebastian; Wolff, Joachim; Wright, Patrick R; Backofen, Rolf

    2018-05-21

    The Freiburg RNA tools webserver is a well-established online resource for RNA-focused research. It provides a unified user interface and comprehensive result visualization for efficient command line tools. The webserver includes RNA-RNA interaction prediction (IntaRNA, CopraRNA, metaMIR), sRNA homology search (GLASSgo), sequence-structure alignments (LocARNA, MARNA, CARNA, ExpaRNA), CRISPR repeat classification (CRISPRmap), sequence design (antaRNA, INFO-RNA, SECISDesign), structure aberration evaluation of point mutations (RaSE), and RNA/protein-family model visualization (CMV), among other methods. Open education resources offer interactive visualizations of RNA structure and RNA-RNA interaction prediction as well as basic and advanced sequence alignment algorithms. The services are freely available at http://rna.informatik.uni-freiburg.de.

  18. Enhanced semantic interoperability by profiling health informatics standards.

    PubMed

    López, Diego M; Blobel, Bernd

    2009-01-01

    Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method and suggest the necessary tooling for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIM specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.

  19. The Guideline Implementability Decision Excellence Model (GUIDE-M): a mixed methods approach to create an international resource to advance the practice guideline field.

    PubMed

    Brouwers, Melissa C; Makarski, Julie; Kastner, Monika; Hayden, Leigh; Bhattacharyya, Onil

    2015-03-15

    Practice guideline (PG) implementability refers to PG features that promote their use. While there are tools and resources to promote PG implementability, none are based on an evidence-informed and multidisciplinary perspective. Our objectives were to (i) create a comprehensive and evidence-informed model of PG implementability, (ii) seek support for the model from the international PG community, (iii) map existing implementability tools onto the model, (iv) prioritize areas for further investigation, and (v) describe how the model can be used by PG developers, users, and researchers. A mixed methods approach was used. Using our completed realist review of the literature of seven different disciplines as the foundation, an iterative consensus process was used to create the beta version of the model. This was followed by (i) a survey of international stakeholders (guideline developers and users) to gather feedback and refine the model, (ii) a content analysis comparing the model to existing PG tools, and (iii) a strategy to prioritize areas of the model for further research by members of the research team. The Guideline Implementability Decision Excellence Model (GUIDE-M) comprises 3 core tactics, 7 domains, 9 subdomains, 44 attributes, and 40 subattributes and elements. Feedback on the beta version was received from 248 stakeholders from 34 countries. The model was rated as logical, relevant, and appropriate. Seven PG tools were selected and compared to the GUIDE-M: very few tools targeted the Contextualization and Deliberations domain. Also, fewer of the tools addressed PG appraisal than PG development and reporting functions. These findings informed the research priorities identified by the team. The GUIDE-M provides an evidence-informed, international, and multidisciplinary conceptualization of PG implementability. The model can be used by PG developers to help them create more implementable recommendations, by clinicians and other users to help them be better consumers of PGs, and by the research community to identify priorities for further investigation.

  20. Model-based reasoning for system and software engineering: The Knowledge From Pictures (KFP) environment

    NASA Technical Reports Server (NTRS)

    Bailin, Sydney; Paterra, Frank; Henderson, Scott; Truszkowski, Walt

    1993-01-01

    This paper presents a discussion of current work in the area of graphical modeling and model-based reasoning being undertaken by the Automation Technology Section, Code 522.3, at Goddard. The work was initially motivated by the growing realization that the knowledge acquisition process was a major bottleneck in the generation of fault detection, isolation, and repair (FDIR) systems for application in automated Mission Operations. As with most research activities, this work started out with a simple objective: to develop a proof-of-concept system demonstrating that a draft rule-base for an FDIR system could be automatically realized by reasoning from a graphical representation of the system to be monitored. This work was called Knowledge From Pictures (KFP) (Truszkowski et al. 1992). As the work has successfully progressed, the KFP tool has become an environment populated by a set of tools that support a more comprehensive approach to model-based reasoning. This paper continues by giving an overview of the graphical modeling objectives of the work, describing the three tools that now populate the KFP environment, briefly presenting a discussion of related work in the field, and indicating future directions for the KFP environment.

  1. Developing Parametric Models for the Assembly of Machine Fixtures for Virtual Multiaxial CNC Machining Centers

    NASA Astrophysics Data System (ADS)

    Balaykin, A. V.; Bezsonov, K. A.; Nekhoroshev, M. V.; Shulepov, A. P.

    2018-01-01

    This paper dwells upon a variance parameterization method. Variance or dimensional parameterization is based on sketching, with various parametric links superimposed on the sketch objects and user-imposed constraints in the form of an equation system that determines the parametric dependencies. This method is fully integrated in a top-down design methodology to enable the creation of multi-variant and flexible fixture assembly models, as all the modeling operations are hierarchically linked in the build tree. In this research the authors consider a parameterization method for machine tooling used in manufacturing parts on multiaxial CNC machining centers in a real manufacturing process. The developed method significantly reduces tooling design time when a part's geometric parameters change. The method can also reduce the time needed for design and engineering preproduction, in particular for developing control programs for CNC equipment and coordinate measuring machines, and can automate the release of design and engineering documentation. Variance parameterization helps to optimize the construction of parts as well as machine tooling using integrated CAE systems. In the framework of this study, the authors demonstrate a comprehensive approach to parametric modeling of machine tooling in the CAD package used in the real manufacturing process of aircraft engines.

  2. Method Development for Clinical Comprehensive Evaluation of Pediatric Drugs Based on Multi-Criteria Decision Analysis: Application to Inhaled Corticosteroids for Children with Asthma.

    PubMed

    Yu, Yuncui; Jia, Lulu; Meng, Yao; Hu, Lihua; Liu, Yiwei; Nie, Xiaolu; Zhang, Meng; Zhang, Xuan; Han, Sheng; Peng, Xiaoxia; Wang, Xiaoling

    2018-04-01

    Establishing a comprehensive clinical evaluation system is critical in enacting national drug policy and promoting rational drug use. In China, the 'Clinical Comprehensive Evaluation System for Pediatric Drugs' (CCES-P) project, which aims to compare drugs based on clinical efficacy and cost effectiveness to help decision makers, was recently proposed; therefore, a systematic and objective method is required to guide the process. An evidence-based multi-criteria decision analysis model that involves an analytic hierarchy process (AHP) was developed, consisting of nine steps: (1) select the drugs to be reviewed; (2) establish the evaluation criterion system; (3) determine the criterion weights based on the AHP; (4) construct the evidence body for each drug under evaluation; (5) select comparative measures and calculate the original utility score; (6) place scores on a common utility scale and calculate the standardized utility score; (7) calculate the comprehensive utility score; (8) rank the drugs; and (9) perform a sensitivity analysis. The model was applied to the evaluation of three different inhaled corticosteroids (ICSs) used for asthma management in children (a total of 16 drugs with different dosage forms and strengths or different manufacturers). By applying the drug analysis model, the 16 ICSs under review were successfully scored and evaluated. Budesonide suspension for inhalation (drug ID number: 7) ranked the highest, with a comprehensive utility score of 80.23, followed by fluticasone propionate inhaled aerosol (drug ID number: 16), with a score of 79.59, and budesonide inhalation powder (drug ID number: 6), with a score of 78.98. In the sensitivity analysis, the rankings of the top five and lowest five drugs remained unchanged, suggesting the model is generally robust. An evidence-based drug evaluation model based on the AHP was successfully developed. The model incorporates sufficient utility and flexibility for aiding the decision-making process, and can be a useful tool for the CCES-P.
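    Step (3), deriving criterion weights with the AHP, reduces to the principal eigenvector of a pairwise comparison matrix plus a consistency check. A minimal sketch with an invented 3x3 matrix (say, efficacy vs. safety vs. cost) follows; it illustrates the standard AHP calculation, not the study's actual criteria or judgments.

    import numpy as np

    # Invented pairwise judgments: entry [i, j] = importance of i relative to j.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 3.0],
                  [1/5, 1/3, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    # Saaty consistency ratio; the random index RI is 0.58 for 3x3 matrices.
    n = A.shape[0]
    cr = ((eigvals.real[k] - n) / (n - 1)) / 0.58

    print(np.round(weights, 3))             # approx. [0.64, 0.26, 0.10]
    print(f"consistency ratio = {cr:.3f}")  # should be below 0.10

    The resulting weights then multiply the standardized utility scores in steps (5) through (7) to give each drug's comprehensive utility score.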

  3. Lexical development of noun and predicate comprehension and production in isiZulu

    PubMed Central

    Ahmed, Saaliha

    2016-01-01

    This study seeks to investigate the development of noun and predicate comprehension and production in isiZulu-speaking children between the ages of 25 and 36 months. It compares lexical comprehension and production in isiZulu using an Italian-developed and validated vocabulary assessment tool: the Picture Naming Game (PiNG) developed by Bello, Giannantoni, Pettenati, Stefanini and Caselli (2012). The PiNG tool includes four subtests, one each for noun comprehension (NC), noun production (NP), predicate comprehension (PC), and predicate production (PP). Children are shown these lexical items and then asked to show comprehension and produce certain lexical items. After adaptation into the South African context, the adapted version of PiNG was used to directly assess the lexical development of isiZulu with three main objectives: (1) to test the efficiency of the adaptation of a vocabulary tool to measure isiZulu comprehension and production development, (2) to test previous findings from many cross-linguistic comparisons that both comprehension and production performance increase with age, here for a lesser-studied language, and (3) to present our findings on the comprehension and production of the linguistic categories of nouns and predicates. An analysis of the results reported in this study shows an age effect throughout the entire sample. Across all the age groups, the noun and predicate comprehension subtests were better performed than the noun and predicate production subtests. With regard to lexical items, the responses of children showed the influence of various factors, including the late acquisition of items, possible problems with the stimuli presented to them, and the input received by the children from their home environment. PMID:27542416

  4. Lexical development of noun and predicate comprehension and production in isiZulu.

    PubMed

    Nicolas, Ramona Kunene; Ahmed, Saaliha

    2016-07-28

    This study seeks to investigate the development of noun and predicate comprehension and production in isiZulu-speaking children between the ages of 25 and 36 months. It compares lexical comprehension and production in isiZulu using an Italian-developed and validated vocabulary assessment tool: the Picture Naming Game (PiNG) developed by Bello, Giannantoni, Pettenati, Stefanini and Caselli (2012). The PiNG tool includes four subtests, one each for noun comprehension (NC), noun production (NP), predicate comprehension (PC), and predicate production (PP). Children are shown these lexical items and then asked to show comprehension and produce certain lexical items. After adaptation into the South African context, the adapted version of PiNG was used to directly assess the lexical development of isiZulu with three main objectives: (1) to test the efficiency of the adaptation of a vocabulary tool to measure isiZulu comprehension and production development, (2) to test previous findings from many cross-linguistic comparisons that both comprehension and production performance increase with age, here for a lesser-studied language, and (3) to present our findings on the comprehension and production of the linguistic categories of nouns and predicates. An analysis of the results reported in this study shows an age effect throughout the entire sample. Across all the age groups, the noun and predicate comprehension subtests were better performed than the noun and predicate production subtests. With regard to lexical items, the responses of children showed the influence of various factors, including the late acquisition of items, possible problems with the stimuli presented to them, and the input received by the children from their home environment.

  5. A new spatial multi-criteria decision support tool for site selection for implementation of managed aquifer recharge.

    PubMed

    Rahman, M Azizur; Rusteberg, Bernd; Gogu, R C; Lobo Ferreira, J P; Sauter, Martin

    2012-05-30

    This study reports the development of a new spatial multi-criteria decision analysis (SMCDA) software tool for selecting suitable sites for Managed Aquifer Recharge (MAR) systems. The new SMCDA software tool combines existing multi-criteria evaluation methods with modern decision analysis techniques. More specifically, non-compensatory screening, criteria standardization and weighting, and the Analytical Hierarchy Process (AHP) have been combined with Weighted Linear Combination (WLC) and Ordered Weighted Averaging (OWA). This SMCDA tool may be implemented with a wide range of decision makers' preferences. The tool's user-friendly interface helps guide the decision maker through the sequential steps for site selection, namely constraint mapping, criteria hierarchy, criteria standardization and weighting, and criteria overlay. The tool offers some predetermined default criteria and standard methods to improve the trade-off between ease of use and efficiency. Integrated into ArcGIS, the tool has the advantage of using GIS tools for spatial analysis, and herein data may be processed and displayed. The tool is non-site-specific, adaptive, and comprehensive, and may be applied to any type of site-selection problem. To demonstrate the robustness of the new tool, a case study was planned and executed in the Algarve Region, Portugal. The efficiency of the SMCDA tool in the decision-making process for selecting suitable sites for MAR was also demonstrated. Specific aspects of the tool, such as built-in default criteria, explicit decision steps, and flexibility in choosing different options, were key features that benefited the study. The new SMCDA tool can be augmented by groundwater flow and transport modeling so as to achieve a more comprehensive approach to selecting the best locations for the MAR infiltration basins, as well as the locations of recovery wells and areas of groundwater protection. The new spatial multi-criteria analysis tool has already been implemented within the GIS-based Gabardine decision support system as an innovative MAR planning tool. Copyright © 2012 Elsevier Ltd. All rights reserved.
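    The two overlay operators combined in the tool differ in one step: WLC is a weighted average of the standardized criterion layers, while OWA first sorts each cell's scores and applies weights by rank, which lets the analyst tune risk attitude. A small numpy sketch with invented layers:

    import numpy as np

    # Three standardized suitability layers on a toy 2x2 grid, values in [0, 1]
    # (invented stand-ins for criteria such as depth to groundwater or soil type).
    layers = np.array([[[0.9, 0.2], [0.5, 0.7]],
                       [[0.4, 0.8], [0.6, 0.1]],
                       [[0.7, 0.6], [0.3, 0.9]]])

    criterion_weights = np.array([0.5, 0.3, 0.2])
    wlc = np.tensordot(criterion_weights, layers, axes=1)  # compensatory average

    # OWA: sort each cell's scores ascending, then weight by rank; loading the
    # low end (0.6 on the worst score) yields a risk-averse suitability map.
    order_weights = np.array([0.6, 0.3, 0.1])
    owa = np.tensordot(order_weights, np.sort(layers, axis=0), axes=1)

    print(np.round(wlc, 2))
    print(np.round(owa, 2))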

  6. Development of a Comprehensive Digital Avionics Curriculum for the Aeronautical Engineer

    DTIC Science & Technology

    2006-03-01

    ... able to analyze and design aircraft and missile guidance and control systems, including feedback stabilization schemes and stochastic processes, using ... uncertainty modeling for robust control; robust closed-loop stability and performance; robust H-infinity control; robustness checks using mu-analysis ... controlled feedback (reduces noise); statistical group response (reduces pressure toward conformity) when used as a tool to study a complex problem ...

  7. Fostering the Development of Critical Thinking Skills, and Reading Comprehension of Undergraduates Using a Web 2.0 Tool Coupled with a Learning System

    ERIC Educational Resources Information Center

    Mendenhall, Anne; Johnson, Tristan E.

    2010-01-01

    A social annotation model learning system (SAM-LS) was created using multiple instructional strategies, thereby supporting students in improving critical thinking, critical writing, and related literacy. There are four mechanisms by which the SAM-LS methodology is believed to improve learning and performance. These mechanisms include providing…

  8. A Tabletop Tool for Modeling Life Support Systems

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.; Majumdar, A.; McDaniels, D.; Stewart, E.

    2003-01-01

    This paper describes the development plan for a comprehensive research and diagnostic tool for aspects of advanced life support systems in space-based laboratories. Specifically, it aims to build a high-fidelity tabletop model that can be used for risk mitigation, failure mode analysis, contamination tracking, and reliability testing. We envision a comprehensive approach involving experimental work coupled with numerical simulation to develop this diagnostic tool. The plan envisions a 10%-scale transparent model of a space platform such as the International Space Station that operates with water, or a specific matched-index-of-refraction liquid, as the working fluid. This allows a 10 ft x 10 ft x 10 ft room with air flow to be scaled to a 1 ft x 1 ft x 1 ft tabletop model with water/liquid flow. Dynamic similitude for this length scale dictates model velocities of 67% of full scale, and thereby a model time scale of 15% of the full-scale system, meaning identical processes in the model are completed in 15% of the full-scale time. The use of an index-matching fluid (a fluid that matches the refractive index of cast acrylic, the model material) makes the entire model, with its complex internal geometry, transparent and hence conducive to non-intrusive optical diagnostics. Using such a system, one can test environment-control parameters such as core (axial) flows and cross flows (from registers and diffusers), probe potential problem areas such as flow short circuits, inadequate oxygen content, and build-up of other gases beyond desirable levels, test mixing processes within the system at local nodes or compartments, and assess overall system performance. The system allows quantitative measurements of contaminants introduced into the system and allows testing and optimizing of the contaminant tracking and removal processes. The envisaged system will be modular and hence flexible for quick configuration changes and subsequent testing. The data and inferences from the tests will allow for improvements in the development and design of next-generation life support systems and configurations.
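    The 67% and 15% figures follow directly from matching the Reynolds number between air at full scale and water in the 10% model; a quick check, assuming textbook kinematic viscosities:

    # Re matching: V_model / V_full = (nu_water / nu_air) * (L_full / L_model)
    nu_air = 1.5e-5      # kinematic viscosity of air, m^2/s (approx.)
    nu_water = 1.0e-6    # kinematic viscosity of water, m^2/s (approx.)
    length_ratio = 0.1   # 10%-scale model

    velocity_ratio = (nu_water / nu_air) / length_ratio
    time_ratio = length_ratio / velocity_ratio          # t ~ L / V

    print(f"model velocity: {velocity_ratio:.0%} of full scale")  # ~67%
    print(f"model time:     {time_ratio:.0%} of full scale")      # ~15%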

  9. Learning to Estimate Slide Comprehension in Classrooms with Support Vector Machines

    ERIC Educational Resources Information Center

    Pattanasri, N.; Mukunoki, M.; Minoh, M.

    2012-01-01

    Comprehension assessment is an essential tool in classroom learning. However, the judgment often relies on the experience of an instructor who observes students' behavior during lessons. We argue that students should report their own comprehension explicitly in a classroom. With students' comprehension made available at the slide…
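    As a sketch of the classification setup, suppose each slide is reduced to a feature vector and labeled with the comprehension students reported; an SVM then predicts the label for new slides. The features and labels below are invented, not the paper's feature set.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    # Invented per-slide features: [seconds displayed, words on slide, quiz score].
    X = np.array([[45, 80, 0.9], [20, 150, 0.4], [60, 60, 0.8],
                  [15, 200, 0.3], [50, 90, 0.7], [10, 180, 0.2],
                  [55, 70, 0.9], [25, 160, 0.5]])
    y = np.array([1, 0, 1, 0, 1, 0, 1, 0])   # 1 = slide reported as understood

    clf = SVC(kernel="rbf", gamma="scale")
    print(cross_val_score(clf, X, y, cv=4).mean())   # estimated accuracy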

  10. Scientific, technological, and economic aspects of rapid tooling by electric arc spray forming

    NASA Astrophysics Data System (ADS)

    Grant, P. S.; Duncan, S. R.; Roche, A.; Johnson, C. F.

    2006-12-01

    For the last seven years, Oxford University and Ford Motor Company personnel have been jointly researching the development of large-scale spray forming of steel tooling for use in mass production, particularly for the pressing of sheet metal in automotive applications. These investigations have involved comprehensive microstructure and property studies, modeling of shape evolution and heat flow, real-time feedback control of tool temperature to eliminate tool distortion, high-speed imaging and particle image velocimetry of droplet deposition on three-dimensional (3D) shapes, testing of full-scale tools for different applications in the production environment, and detailed studies of the cost and time savings realized for different tooling applications. This paper provides an overview of the scientific and technical progress to date, presents the latest results, and describes the current state of the art. Many of the insights described have relevance and applicability across the family of thermal spray processes and applications.

  11. Integrated Exoplanet Modeling with the GSFC Exoplanet Modeling & Analysis Center (EMAC)

    NASA Astrophysics Data System (ADS)

    Mandell, Avi M.; Hostetter, Carl; Pulkkinen, Antti; Domagal-Goldman, Shawn David

    2018-01-01

    Our ability to characterize the atmospheres of extrasolar planets will be revolutionized by JWST, WFIRST and future ground- and space-based telescopes. In preparation, the exoplanet community must develop an integrated suite of tools with which we can comprehensively predict and analyze observations of exoplanets, in order to characterize the planetary environments and ultimately search them for signs of habitability and life. The GSFC Exoplanet Modeling and Analysis Center (EMAC) will be a web-accessible high-performance computing platform with science support for modelers and software developers to host and integrate their scientific software tools, with the goal of leveraging the scientific contributions from the entire exoplanet community to improve our interpretations of future exoplanet discoveries. Our suite of models will include stellar models, models for star-planet interactions, atmospheric models, planet system science models, telescope models, instrument models, and finally models for retrieving signals from observational data. By integrating this suite of models, the community will be able to self-consistently calculate the emergent spectra from the planet, whether in emission, scattering, or transmission, and use these simulations to model the performance of current and new telescopes and their instrumentation. The EMAC infrastructure will not only provide a repository for planetary and exoplanetary community models, modeling tools and intermodel comparisons, but will also include a “run-on-demand” portal with each software tool hosted on a separate virtual machine. The EMAC system will eventually include a means of running or “checking in” new model simulations that are in accordance with the community-derived standards. Additionally, the results of intermodel comparisons will be used to produce open source publications that quantify the model comparisons and provide an overview of community consensus on model uncertainties on the climates of various planetary targets.

  12. A roadmap for improving healthcare service quality.

    PubMed

    Kennedy, Denise M; Caselli, Richard J; Berry, Leonard L

    2011-01-01

    A data-driven, comprehensive model for improving service and creating long-term value was developed and implemented at Mayo Clinic Arizona (MCA). Healthcare organizations can use this model to prepare for value-based purchasing, a payment system in which quality and patient experience measures will influence reimbursement. Surviving and thriving in such a system will require a comprehensive approach to sustaining excellent service performance from physicians and allied health staff (e.g., nurses, technicians, nonclinical staff). The seven prongs in MCA's service quality improvement model are (1) multiple data sources to drive improvement, (2) accountability for service quality, (3) service consultation and improvement tools, (4) service values and behaviors, (5) education and training, (6) ongoing monitoring and control, and (7) recognition and reward. The model was fully implemented and tested in five departments in which patient perception of provider-specific service attributes and/or overall quality of care were below the 90th percentile for patient satisfaction in the vendor's database. Extent of the implementation was at the discretion of department leadership. Perception data rating various service attributes were collected from randomly selected patients and monitored over a 24-month period. The largest increases in patient perception of excellence over the pilot period were realized when all seven prongs of the model were implemented as a comprehensive improvement approach. The results of this pilot may help other healthcare organizations prepare for value-based purchasing.

  13. Hybrid LCA model for assessing the embodied environmental impacts of buildings in South Korea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jang, Minho, E-mail: minmin40@hanmail.net; Hong, Taehoon, E-mail: hong7@yonsei.ac.kr; Ji, Changyoon, E-mail: chnagyoon@yonsei.ac.kr

    2015-01-15

    The assessment of the embodied environmental impacts of buildings can help decision-makers plan environment-friendly buildings and reduce environmental impacts. For a more comprehensive assessment of the embodied environmental impacts of buildings, a hybrid life cycle assessment model was developed in this study. The developed model can assess the embodied environmental impacts (global warming, ozone layer depletion, acidification, eutrophication, photochemical ozone creation, abiotic depletion, and human toxicity) generated directly and indirectly in the material manufacturing, transportation, and construction phases. To demonstrate the application and validity of the developed model, the environmental impacts of an elementary school building were assessed using the developed model and compared with the results of a previous model used in a case study. The embodied environmental impacts from the previous model were lower than those from the developed model by 4.6–25.2%. In particular, the human toxicity potential calculated by the previous model (13 kg C₆H₆ eq.) was much lower than that calculated by the developed model (1965 kg C₆H₆ eq.). The results indicated that the developed model can quantify the embodied environmental impacts of buildings more comprehensively, and can be used by decision-makers as a tool for selecting environment-friendly buildings. - Highlights: • The model was developed to assess the embodied environmental impacts of buildings. • The model evaluates GWP, ODP, AP, EP, POCP, ADP, and HTP as environmental impacts. • The model presents more comprehensive results than the previous model by 4.6–100%. • The model can present the HTP of buildings, which the previous models cannot do. • Decision-makers can use the model for selecting environment-friendly buildings.
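    The indirect share that distinguishes a hybrid LCA is typically propagated through an input-output model via the Leontief inverse. A generic sketch with an invented three-sector economy follows; it shows the standard calculation, not the paper's Korean input-output tables.

    import numpy as np

    # Invented technology matrix A: inputs needed per unit of sector output.
    A = np.array([[0.10, 0.05, 0.02],
                  [0.20, 0.10, 0.10],
                  [0.05, 0.30, 0.05]])
    f = np.array([0.8, 2.5, 0.3])   # direct emissions per unit output (invented)
    y = np.array([1.0, 0.5, 0.2])   # demand induced by the building (invented)

    # Total output needed, direct plus indirect: x = (I - A)^-1 y
    x = np.linalg.solve(np.eye(3) - A, y)
    print(f"embodied impact: {f @ x:.2f} kg CO2 eq.")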

  14. Improved design method of a rotating spool compressor using a comprehensive model and comparison to experimental results

    NASA Astrophysics Data System (ADS)

    Bradshaw, Craig R.; Kemp, Greg; Orosz, Joe; Groll, Eckhard A.

    2017-08-01

    An improvement to the design process of the rotating spool compressor is presented. This improvement utilizes a comprehensive model to explore two working fluids (R410A and R134a) and various displaced volumes across a range of geometric parameters. The geometric parameters explored consist of the eccentricity ratio and the length-to-diameter ratio. The eccentricity ratio is varied between 0.81 and 0.92 and the length-to-diameter ratio between 0.4 and 3. The key tradeoffs are evaluated, and the results show that there is an optimum eccentricity ratio and length-to-diameter ratio that maximizes the model-predicted performance and is unique to a particular fluid and displaced volume. For R410A, the modeling tool predicts that the overall isentropic efficiency is maximized at a lower length-to-diameter ratio than for R134a. Additionally, the tool predicts that as the displaced volume increases, the overall isentropic efficiency increases and the ideal length-to-diameter ratio shifts. The results from this study were utilized to develop a basic design for a 141 kW (40 tons refrigeration) capacity prototype spool compressor for light-commercial air-conditioning applications. Results from a prototype compressor constructed based on these efforts are presented. The volumetric efficiency predictions are found to be very accurate, while the overall isentropic efficiency predictions are shown to be slightly over-predicted.

  15. Development and validation of a Malawian version of the primary care assessment tool.

    PubMed

    Dullie, Luckson; Meland, Eivind; Hetlevik, Øystein; Mildestvedt, Thomas; Gjesdal, Sturla

    2018-05-16

    Malawi does not have validated tools for assessing primary care performance from patients' experience. The aim of this study was to develop a Malawian version of the Primary Care Assessment Tool (PCAT-Mw) and to evaluate its reliability and validity in the assessment of the core primary care dimensions from adult patients' perspective in Malawi. A team of experts assessed the South African version of the primary care assessment tool (ZA-PCAT) for face and content validity. The adapted questionnaire underwent forward and backward translation and a pilot study. The tool was then used in an interviewer-administered cross-sectional survey in Neno district, Malawi, to test validity and reliability. Exploratory factor analysis was performed on a random half of the sample to evaluate internal consistency, reliability and construct validity of items and scales. The identified constructs were then tested with confirmatory factor analysis. Likert scale assumption testing and descriptive statistics were done on the final factor structure. The PCAT-Mw was further tested for intra-rater and inter-rater reliability. From the responses of 631 patients, a 29-item PCAT-Mw was constructed comprising seven multi-item scales, representing five primary care dimensions (first contact, continuity, comprehensiveness, coordination and community orientation). All seven scales achieved good internal consistency, item-total correlations and construct validity. Cronbach's alpha coefficients ranged from 0.66 to 0.91. A satisfactory goodness of fit was achieved for the model (GFI = 0.90, CFI = 0.91, RMSEA = 0.05, PCLOSE = 0.65). The full range of possible scores was observed for all scales. Scaling assumption tests were passed for all except the two comprehensiveness scales. The intra-class correlation coefficient (ICC) was 0.90 (n = 44, 95% CI 0.81-0.94, p < 0.001) for intra-rater reliability and 0.84 (n = 42, 95% CI 0.71-0.96, p < 0.001) for inter-rater reliability. Comprehensive metric analyses supported the reliability and validity of the PCAT-Mw in assessing the core concepts of primary care from adult patients' experience. This tool could be used for health service research in primary care in Malawi.
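    The internal-consistency figures reported above are Cronbach's alpha values computed per scale; the statistic is simple enough to sketch self-contained. The responses below are invented.

    import numpy as np

    def cronbach_alpha(items):
        # items: (n_respondents, n_items) array of scores for one scale.
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_var / total_var)

    # Invented 1-4 Likert responses from six patients to one 4-item scale.
    scale = [[4, 3, 4, 4], [2, 2, 3, 2], [3, 3, 3, 4],
             [1, 2, 1, 2], [4, 4, 3, 4], [2, 1, 2, 2]]
    print(f"alpha = {cronbach_alpha(scale):.2f}")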

  16. The Use of Behavior Models for Predicting Complex Operations

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2010-01-01

    Modeling and simulation (M&S) plays an important role when complex human-system notions are being proposed, developed and tested within the system design process. National Aeronautics and Space Administration (NASA) as an agency uses many different types of M&S approaches for predicting human-system interactions, especially when it is early in the development phase of a conceptual design. NASA Ames Research Center possesses a number of M&S capabilities ranging from airflow, flight path models, aircraft models, scheduling models, human performance models (HPMs), and bioinformatics models among a host of other kinds of M&S capabilities that are used for predicting whether the proposed designs will benefit the specific mission criteria. The Man-Machine Integration Design and Analysis System (MIDAS) is a NASA ARC HPM software tool that integrates many models of human behavior with environment models, equipment models, and procedural / task models. The challenge to model comprehensibility is heightened as the number of models that are integrated and the requisite fidelity of the procedural sets are increased. Model transparency is needed for some of the more complex HPMs to maintain comprehensibility of the integrated model performance. This will be exemplified in a recent MIDAS v5 application model and plans for future model refinements will be presented.

  17. Environmental Camp as a Comprehensive Communication Tool to Promote the RRR Concept to Elementary Education Students at Koh Si Chang School

    ERIC Educational Resources Information Center

    Supakata, Nuta; Puangthongthub, Sitthichok; Srithongouthai, Sarawut; Kanokkantapong, Vorapot; Chaikaew, Pasicha

    2016-01-01

    The objective of this study was to develop and implement a Reduce-Reuse-Recycle (RRR) communication strategy through environmental camp as a comprehensive communication tool to promote the RRR concept to elementary school students. Various activities from five learning bases including the folding milk carton game, waste separation relay, recycling…

  18. A Comprehensive Mixture of Tobacco Smoke Components Retards Orthodontic Tooth Movement via the Inhibition of Osteoclastogenesis in a Rat Model

    PubMed Central

    Nagaie, Maya; Nishiura, Aki; Honda, Yoshitomo; Fujiwara, Shin-Ichi; Matsumoto, Naoyuki

    2014-01-01

    Tobacco smoke is a complex mixture of numerous components. Nevertheless, most experiments have examined the effects of individual chemicals in tobacco smoke. The comprehensive effects of components on tooth movement and bone resorption remain unexplored. Here, we have shown that a comprehensive mixture of tobacco smoke components (TSCs) attenuated bone resorption through osteoclastogenesis inhibition, thereby retarding experimental tooth movement in a rat model. An elastic power chain (PC) inserted between the first and second maxillary molars robustly yielded experimental tooth movement within 10 days. TSC administration effectively retarded tooth movement from day 4 onward. Histological evaluation disclosed that tooth movement induced bone resorption at two sites: in the bone marrow and the peripheral bone near the root. TSC administration significantly reduced the number of tartrate-resistant acid phosphatase (TRAP)-positive osteoclastic cells in the bone marrow cavity of the PC-treated dentition. An in vitro study indicated that the inhibitory effects of TSCs on osteoclastogenesis seemed directed more toward preosteoclasts than osteoblasts. These results indicate that the comprehensive mixture of TSCs might be a useful tool for detailed verification of the adverse effects of tobacco smoke, possibly contributing to the development of reliable treatments in various fields associated with bone resorption. PMID:25322153

  19. Development of a Peer Teaching-Assessment Program and a Peer Observation and Evaluation Tool

    PubMed Central

    Trujillo, Jennifer M.; Barr, Judith; Gonyeau, Michael; Van Amburgh, Jenny A.; Matthews, S. James; Qualters, Donna

    2008-01-01

    Objectives To develop a formalized, comprehensive, peer-driven teaching assessment program and a valid and reliable assessment tool. Methods A volunteer taskforce was formed and a peer-assessment program was developed using a multistep, sequential approach and the Peer Observation and Evaluation Tool (POET). A pilot study was conducted to evaluate the efficiency and practicality of the process and to establish interrater reliability of the tool. Intra-class correlation coefficients (ICC) were calculated. Results ICCs for 8 separate lectures evaluated by 2-3 observers ranged from 0.66 to 0.97, indicating good interrater reliability of the tool. Conclusion Our peer assessment program for large classroom teaching, which includes a valid and reliable evaluation tool, is comprehensive, feasible, and can be adopted by other schools of pharmacy. PMID:19325963

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.

    Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.
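
    A sketch of what one entry in such a unit-test library could look like, written here in Python's unittest against a hypothetical depletion routine (the function and its interface are illustrative assumptions, not part of any named simulator). The test checks one essential function: conservation of heavy-metal mass across a depletion step.

      import unittest

      def deplete(mass_in_kg: float, burnup_fraction: float) -> dict:
          """Hypothetical stand-in for a fuel cycle simulator's depletion step."""
          fissioned = mass_in_kg * burnup_fraction
          return {"discharged": mass_in_kg - fissioned, "fissioned": fissioned}

      class TestMassConservation(unittest.TestCase):
          def test_mass_balance_closes(self):
              out = deplete(1000.0, 0.05)
              # Mass in must equal mass out plus mass destroyed, to tight tolerance.
              self.assertAlmostEqual(out["discharged"] + out["fissioned"],
                                     1000.0, places=9)

      if __name__ == "__main__":
          unittest.main()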

  1. Greenhouse gases from wastewater treatment - A review of modelling tools.

    PubMed

    Mannina, Giorgio; Ekama, George; Caniani, Donatella; Cosenza, Alida; Esposito, Giovanni; Gori, Riccardo; Garrido-Baserba, Manel; Rosso, Diego; Olsson, Gustaf

    2016-05-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state-of-the-art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge on the processes related to N2O formation, especially due to autotrophic biomass, is still incomplete. The literature review also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs. Indeed, several studies have confirmed that a wide view of WWTPs has to be taken in order to make them as sustainable as possible. Mechanistic dynamic models were demonstrated to be the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs due to the considerable difficulties related to data availability and model complexity. For further improvement in plant-wide GHG modelling, and to favour its use at full scale, knowledge of the mechanisms involved in GHG formation and release, and data acquisition, must be enhanced. Copyright © 2016 Elsevier B.V. All rights reserved.
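
    As a toy illustration of the mechanistic dynamic modelling approach the review favours, the sketch below integrates a deliberately simplified two-state model in which a fixed fraction of nitrified nitrogen leaves as N2O. The rate constant and emission fraction are arbitrary placeholders, not values from any reviewed model.

      from scipy.integrate import solve_ivp

      k_n, f = 0.8, 0.01        # 1/d nitrification rate; N2O emission fraction

      def rhs(t, y):
          nh4, n2o = y
          d_nh4 = -k_n * nh4                  # first-order ammonium removal
          d_n2o = f * k_n * nh4               # fraction f emitted as N2O-N
          return [d_nh4, d_n2o]

      sol = solve_ivp(rhs, (0.0, 5.0), [30.0, 0.0])   # gN/m3 over 5 days
      print(f"N2O-N after 5 d: {sol.y[1, -1]:.3f} gN/m3")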

  2. The BonaRes Centre - A virtual institute for soil research in the context of a sustainable bio-economy

    NASA Astrophysics Data System (ADS)

    Wollschläger, Ute; Helming, Katharina; Heinrich, Uwe; Bartke, Stephan; Kögel-Knabner, Ingrid; Russell, David; Eberhardt, Einar; Vogel, Hans-Jörg

    2016-04-01

    Fertile soils are central resources for the production of biomass and provision of food and energy. A growing world population and latest climate targets lead to an increasing demand for both, food and bio-energy, which require preserving and improving the long-term productivity of soils as a bio-economic resource. At the same time, other soil functions and ecosystem services need to be maintained. To render soil management sustainable, we need to establish a scientific knowledge base about complex soil system processes that allows for the development of model tools to quantitatively predict the impact of a multitude of management measures on soil functions. This, finally, will allow for the provision of site-specific options for sustainable soil management. To face this challenge, the German Federal Ministry of Education and Research recently launched the funding program "Soil as a Natural Resource for the Bio-Economy - BonaRes". In a joint effort, ten collaborative projects and the coordinating BonaRes Centre are engaged to close existing knowledge gaps for a profound and systemic understanding of soil functions and their sensitivity to soil management. This presentation provides an overview of the concept of the BonaRes Centre which is responsible for i) setting up a comprehensive data base for soil-related information, ii) the development of model tools aiming to estimate the impact of different management measures on soil functions, and iii) establishing a web-based portal providing decision support tools for a sustainable soil management. A specific focus of the presentation will be laid on the so-called "knowledge-portal" providing the infrastructure for a community effort towards a comprehensive meta-analysis on soil functions as a basis for future model developments.

  3. Data-driven multi-scale multi-physics models to derive process-structure-property relationships for additive manufacturing

    NASA Astrophysics Data System (ADS)

    Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Lian, Yanping; Yu, Cheng; Liu, Zeliang; Yan, Jinhui; Wolff, Sarah; Wu, Hao; Ndip-Agbor, Ebot; Mozaffar, Mojtaba; Ehmann, Kornel; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam

    2018-05-01

    Additive manufacturing (AM) possesses appealing potential for manipulating material compositions, structures and properties in end-use products with arbitrary shapes without the need for specialized tooling. Since the physical process is difficult to experimentally measure, numerical modeling is a powerful tool to understand the underlying physical mechanisms. This paper presents our latest work in this regard based on comprehensive material modeling of process-structure-property relationships for AM materials. The numerous influencing factors that emerge from the AM process motivate the need for novel rapid design and optimization approaches. For this, we propose data-mining as an effective solution. Such methods—used in the process-structure, structure-properties and the design phase that connects them—would allow for a design loop for AM processing and materials. We hope this article will provide a road map to enable AM fundamental understanding for the monitoring and advanced diagnostics of AM processing.
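
    A minimal sketch of the data-mining idea described above: fit a surrogate model on (hypothetical) process-parameter versus structure data so that new process settings can be screened cheaply. The process window, response, and coefficients are synthetic placeholders.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(1)
      # Hypothetical process-structure dataset: laser power (W) and scan speed
      # (mm/s) versus a synthetic melt-pool depth response (um).
      X = rng.uniform([150, 500], [400, 2000], size=(200, 2))
      depth = 0.3 * X[:, 0] - 0.02 * X[:, 1] + rng.normal(0, 2, 200)

      surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, depth)
      print(surrogate.predict([[300.0, 1000.0]]))   # predicted depth, new setting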

  4. Development and validation of risk models and molecular diagnostics to permit personalized management of cancer.

    PubMed

    Pu, Xia; Ye, Yuanqing; Wu, Xifeng

    2014-01-01

    Despite the advances made in cancer management over the past few decades, improvements in cancer diagnosis and prognosis are still poor, highlighting the need for individualized strategies. Toward this goal, risk prediction models and molecular diagnostic tools have been developed, tailoring each step of risk assessment from diagnosis to treatment and clinical outcomes based on the individual's clinical, epidemiological, and molecular profiles. These approaches hold increasing promise for delivering a new paradigm to maximize the efficiency of cancer surveillance and efficacy of treatment. However, they require stringent study design, methodology development, comprehensive assessment of biomarkers and risk factors, and extensive validation to ensure their overall usefulness for clinical translation. In the current study, the authors conducted a systematic review using breast cancer as an example and provide general guidelines for risk prediction models and molecular diagnostic tools, including development, assessment, and validation. © 2013 American Cancer Society.
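
    A compressed sketch of the develop-and-validate workflow the review describes: fit a risk model on a training cohort and check discrimination on held-out data. The cohort here is synthetic and the predictors are illustrative assumptions.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(2)
      # Synthetic cohort: age, BMI, and a binary biomarker as toy predictors.
      X = np.column_stack([rng.normal(55, 10, 1000), rng.normal(27, 4, 1000),
                           rng.integers(0, 2, 1000)])
      logit = -6 + 0.05 * X[:, 0] + 0.05 * X[:, 1] + 0.8 * X[:, 2]
      y = rng.random(1000) < 1 / (1 + np.exp(-logit))

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
      auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
      print(f"validation AUC = {auc:.2f}")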

  5. Influence of Wake Models on Calculated Tiltrotor Aerodynamics

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2001-01-01

    The tiltrotor aircraft configuration has the potential to revolutionize air transportation by providing an economical combination of vertical take-off and landing capability with efficient, high-speed cruise flight. To achieve this potential it is necessary to have validated analytical tools that will support future tiltrotor aircraft development. These analytical tools must calculate tiltrotor aeromechanical behavior, including performance, structural loads, vibration, and aeroelastic stability, with an accuracy established by correlation with measured tiltrotor data. The recent test of the Tilt Rotor Aeroacoustic Model (TRAM) with a single, 1/4-scale V-22 rotor in the German-Dutch Wind Tunnel (DNW) provides an extensive set of aeroacoustic, performance, and structural loads data. This paper will examine the influence of wake models on calculated tiltrotor aerodynamics, comparing calculations of performance and airloads with TRAM DNW measurements. The calculations will be performed using the comprehensive analysis CAMRAD II.

  7. MetaDP: a comprehensive web server for disease prediction of 16S rRNA metagenomic datasets.

    PubMed

    Xu, Xilin; Wu, Aiping; Zhang, Xinlei; Su, Mingming; Jiang, Taijiao; Yuan, Zhe-Ming

    2016-01-01

    High-throughput sequencing-based metagenomics has garnered considerable interest in recent years. Numerous methods and tools have been developed for the analysis of metagenomic data. However, it is still a daunting task to install a large number of tools and complete a complicated analysis, especially for researchers with minimal bioinformatics backgrounds. To address this problem, we constructed an automated software package named MetaDP for 16S rRNA sequencing data analysis, including data quality control, operational taxonomic unit clustering, diversity analysis, and disease risk prediction modeling. Furthermore, a support vector machine-based prediction model for irritable bowel syndrome (IBS) was built by applying MetaDP to microbial 16S sequencing data from 108 children. The success of the IBS prediction model suggests that the platform may also be applied to other diseases related to gut microbes, such as obesity, metabolic syndrome, or intestinal cancer, among others (http://metadp.cn:7001/).
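
    A minimal sketch, in Python rather than the MetaDP pipeline itself, of the final modeling step described above: cross-validating a support vector machine on a synthetic OTU relative-abundance table. With random labels the score should hover near chance; it is the workflow, not the number, that is illustrated.

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(3)
      # Hypothetical OTU table: 108 samples x 50 taxa, with binary disease labels.
      otu = rng.poisson(5, size=(108, 50)).astype(float)
      otu /= otu.sum(axis=1, keepdims=True)      # convert to relative abundances
      labels = rng.integers(0, 2, 108)

      clf = SVC(kernel="rbf", C=1.0)
      print(cross_val_score(clf, otu, labels, cv=5).mean())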

  8. Mediator Effect of TPM between TQM and Business Performance in Malaysia Automotive Industry

    NASA Astrophysics Data System (ADS)

    Ahmad, M. F.; Zakuan, N.; Rasi, Raja Zuraidah R. M.; Hisyamudin, M. N. N.

    2015-05-01

    Total Quality Management (TQM) is a vital management tool for ensuring that a company can succeed in the continuously growing competition in the global market. In order to survive in the global market, with intense competition amongst regions and enterprises, the adoption of tools and techniques is essential for improving business performance. However, only a few previous studies have examined the mediators and moderators between TQM and business performance. This research proposes a TQM performance model with the mediator effect of TPM, using structural equation modelling, which is a more comprehensive model for developing countries, specifically for Malaysia. A questionnaire was prepared and sent to 1500 companies from the automotive industry and related vendors in Malaysia, giving a 21.3 per cent response rate. The results conclude that TPM is a partial mediator between TQM and business performance, with an indirect effect (IE) of 0.25, which can be categorised as a high mediation effect.
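
    A minimal sketch of the product-of-coefficients logic behind a mediation claim like the one above (the study itself used full structural equation modelling). The synthetic path coefficients are chosen so the indirect effect lands near the reported 0.25.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(4)
      # Synthetic standardized survey scores: TQM -> TPM -> performance,
      # plus a direct TQM -> performance path.
      tqm = rng.normal(0, 1, 320)
      tpm = 0.5 * tqm + rng.normal(0, 1, 320)
      perf = 0.3 * tqm + 0.5 * tpm + rng.normal(0, 1, 320)

      a = sm.OLS(tpm, sm.add_constant(tqm)).fit().params[1]       # TQM -> TPM
      b = sm.OLS(perf, sm.add_constant(np.column_stack([tqm, tpm]))).fit().params[2]
      print(f"indirect effect a*b = {a * b:.2f}")                 # approx. 0.25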

  9. The Genomic and Genetic Toolbox of the Teleost Medaka (Oryzias latipes)

    PubMed Central

    Kirchmaier, Stephan; Naruse, Kiyoshi; Wittbrodt, Joachim; Loosli, Felix

    2015-01-01

    The Japanese medaka, Oryzias latipes, is a vertebrate teleost model with a long history of genetic research. A number of unique features and established resources distinguish medaka from other vertebrate model systems. A large number of laboratory strains from different locations are available. Due to a high tolerance to inbreeding, many highly inbred strains have been established, thus providing a rich resource for genetic studies. Furthermore, closely related species native to different habitats in Southeast Asia permit comparative evolutionary studies. The transparency of embryos, larvae, and juveniles allows a detailed in vivo analysis of development. New tools to study diverse aspects of medaka biology are constantly being generated. Thus, medaka has become an important vertebrate model organism to study development, behavior, and physiology. In this review, we provide a comprehensive overview of established genetic and molecular-genetic tools that render medaka fish a full-fledged vertebrate system. PMID:25855651

  10. A Review and Analysis of Remote Sensing Capability for Air Quality Measurements as a Potential Decision Support Tool Conducted by the NASA DEVELOP Program

    NASA Technical Reports Server (NTRS)

    Ross, A.; Richards, A.; Keith, K.; Frew, C.; Boseck, J.; Sutton, S.; Watts, C.; Rickman, D.

    2007-01-01

    This project focused on a comprehensive utilization of air quality model products as decision support tools (DST) needed for public health applications. A review of past and future air quality measurement methods and their uncertainty, along with the relationship of air quality to national and global public health, is vital. This project described current and future NASA satellite remote sensing and ground sensing capabilities and the potential for using these sensors to enhance the prediction, prevention, and control of public health effects that result from poor air quality. The qualitative uncertainty of current satellite remotely sensed air quality, the ground-based remotely sensed air quality, the air quality/public health model, and the decision-making process is evaluated in this study. Current peer-reviewed literature suggests that remotely sensed air quality parameters correlate well with ground-based sensor data. A complement of satellite and ground-based remote sensing data is needed to enhance the models and tools used by policy makers for the protection of national and global public health communities.

  11. PROACT: Iterative Design of a Patient-Centered Visualization for Effective Prostate Cancer Health Risk Communication.

    PubMed

    Hakone, Anzu; Harrison, Lane; Ottley, Alvitta; Winters, Nathan; Gutheil, Caitlin; Han, Paul K J; Chang, Remco

    2017-01-01

    Prostate cancer is the most common cancer among men in the US, and yet most cases represent localized cancer for which the optimal treatment is unclear. Accumulating evidence suggests that the available treatment options, including surgery and conservative treatment, result in a similar prognosis for most men with localized prostate cancer. However, approximately 90% of patients choose surgery over conservative treatment, despite the risk of severe side effects like erectile dysfunction and incontinence. Recent medical research suggests that a key reason is the lack of patient-centered tools that can effectively communicate personalized risk information and enable them to make better health decisions. In this paper, we report the iterative design process and results of developing the PROgnosis Assessment for Conservative Treatment (PROACT) tool, a personalized health risk communication tool for localized prostate cancer patients. PROACT utilizes two published clinical prediction models to communicate the patients' personalized risk estimates and compare treatment options. In collaboration with the Maine Medical Center, we conducted two rounds of evaluations with prostate cancer survivors and urologists to identify the design elements and narrative structure that effectively facilitate patient comprehension under emotional distress. Our results indicate that visualization can be an effective means to communicate complex risk information to patients with low numeracy and visual literacy. However, the visualizations need to be carefully chosen to balance readability with ease of comprehension. In addition, due to patients' charged emotional state, an intuitive narrative structure that considers the patients' information need is critical to aid the patients' comprehension of their risk information.

  12. PeTTSy: a computational tool for perturbation analysis of complex systems biology models.

    PubMed

    Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A

    2016-03-10

    Over the last decade, sensitivity analysis techniques have been shown to be very useful for analysing complex and high-dimensional systems biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses analysis of oscillatory systems. It examines sensitivity analysis of the models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting, namely, the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters. PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and signalling systems. It allows for simulation and analysis of models under a variety of environmental conditions and for experimental optimisation of complex combined experiments. With its unique set of tools it makes a valuable addition to the current library of sensitivity analysis toolboxes. We believe that this software will be of great use to the wider biological, systems biology and modelling communities.
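
    PeTTSy itself is a MATLAB package; the sketch below only illustrates, in Python, the flavour of one core task it performs: the sensitivity of an oscillator's period to a parameter perturbation, using a van der Pol oscillator as a stand-in model.

      import numpy as np
      from scipy.integrate import solve_ivp

      def van_der_pol(t, y, mu):
          return [y[1], mu * (1 - y[0] ** 2) * y[1] - y[0]]

      def peak_times(mu):
          sol = solve_ivp(van_der_pol, (0, 100), [2.0, 0.0], args=(mu,),
                          max_step=0.01)
          y = sol.y[0]
          peaks = (y[1:-1] > y[:-2]) & (y[1:-1] > y[2:])   # local maxima
          return sol.t[1:-1][peaks]

      # Finite-difference sensitivity of the mean period to the parameter mu.
      base = np.diff(peak_times(1.00)).mean()
      pert = np.diff(peak_times(1.01)).mean()
      print(f"dPeriod/dmu ~ {(pert - base) / 0.01:.2f}")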

  13. A Science Plan for a Comprehensive Regional Assessment of the Atlantic Coastal Plain Aquifer System in Maryland

    USGS Publications Warehouse

    Shedlock, Robert J.; Bolton, David W.; Cleaves, Emery T.; Gerhart, James M.; Nardi, Mark R.

    2007-01-01

    The Maryland Coastal Plain region is, at present, largely dependent upon ground water for its water supply. Decades of increasing pumpage have caused ground-water levels in parts of the Maryland Coastal Plain to decline by as much as 2 feet per year in some areas of southern Maryland. Continued declines at this rate could affect the long-term sustainability of ground-water resources in Maryland's heavily populated Coastal Plain communities and the agricultural industry of the Eastern Shore. In response to a recommendation in 2004 by the Advisory Committee on the Management and Protection of the State's Water Resources, the Maryland Geological Survey and the U.S. Geological Survey have developed a science plan for a comprehensive assessment that will provide new scientific information and new data management and analysis tools for the State to use in allocating ground water in the Coastal Plain. The comprehensive assessment has five goals aimed at improving the current information and tools used to understand the resource potential of the aquifer system: (1) document the geologic and hydrologic characteristics of the aquifer system in the Maryland Coastal Plain and appropriate areas of adjacent states; (2) conduct detailed studies of the regional ground-water-flow system and water budget for the aquifer system; (3) improve documentation of patterns of water quality in all Coastal Plain aquifers, including the distribution of saltwater; (4) enhance ground-water-level, streamflow, and water-quality-monitoring networks in the Maryland Coastal Plain; and (5) develop science-based tools to facilitate sound management of the ground-water resources in the Maryland Coastal Plain. The assessment, as designed, will be conducted in three phases and if fully implemented, is expected to take 7 to 8 years to complete. Phase I, which was initiated in January 2006, is an effort to assemble all the information and investigation tools needed to do a more comprehensive assessment of the aquifer system. The work will include updating the hydrogeologic framework, developing a Geographic Information System-based aquifer information system, refinement of water-use information, assessment of existing water-quality data, and development of detailed plans for ground-water-flow and management models. Phase II is an intensive study phase during which a regional ground-water-flow model will be developed and calibrated for the entire region of Maryland in the Atlantic Coastal Plain as well as appropriate areas of Delaware and Virginia. The model will be used to simulate flow and water levels in the aquifer system and to study the water budget of the system. The model analysis will be based on published information but will be supplemented with field investigations of recharge and leakage in the aquifer system. Localized and finely discretized ground-water-flow models that are embedded in the regional model will be developed for selected areas of heavy withdrawals. Other modeling studies will be conducted to better understand flow in the unconfined parts of the aquifer system and to support the recharge studies. Phase II will also include selected water-quality studies and a study to determine how hydrologic and water-quality-monitoring networks need to be enhanced to appropriately assess the sustainability of the Coastal Plain aquifer system. Phase III will be largely devoted to the development and application of a ground-water optimization model. This model will be linked to the ground-water-flow model to create a model package that can be used to test different water-management scenarios. The management criteria that will be used to develop these scenarios will be determined in consultation with a variety of state and local stakeholders and policy makers in Phases I and II of the assessment. The development of the aquifer information system is a key component of the assessment. The system will store all relevant aquifer data

  14. Sustainability Tools Inventory Initial Gap Analysis

    EPA Science Inventory

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  15. [Comprehension of hazard pictograms of chemical products among cleaning workers].

    PubMed

    Martí Fernández, Francesc; van der Haar, Rudolf; López López, Juan Carlos; Portell, Mariona; Torner Solé, Anna

    2015-01-01

    To assess the comprehension among cleaning workers of the hazard pictograms as defined by the Globally Harmonized System (GHS) of the United Nations, concerning the classification, labeling and packaging of substances and mixtures. A sample of 118 workers was surveyed on their perception of the GHS hazard pictograms. Comprehensibility was measured by the percentage of correct answers and the degree to which they reflected International Organization for Standardization and American National Standards Institute standards for minimum level of comprehension. The influence of different variables to predict comprehension capacity was assessed using a logistic regression model. Three groups of pictograms could be distinguished which were statistically differentiated by their comprehensibility. Pictograms reflecting "acute toxicity" and "flammable", were described correctly by 94% and 95% of the surveyed population, respectively. For pictograms reflecting "systemic toxicity", "corrosive", "warning", "environment" and "explosive" the frequency of correct answers ranged from 48% to 64%, whereas those for pictograms "oxidizing" and "compressed gas" were interpreted correctly by only 7% of respondents. Prognostic factors for poor comprehension included: not being familiar with the pictograms, not having received training on safe use of chemical products, being an immigrant and being 54 years of age or older. Only two pictograms exceeded minimum standards for comprehension. Training, a tool proven to be effective to improve the correct interpretation of danger symbols, should be encouraged, especially in those groups with greater comprehension difficulties. Copyright belongs to the Societat Catalana de Salut Laboral.

  16. Comprehensive assessment and performance improvement of effector protein predictors for bacterial secretion systems III, IV and VI.

    PubMed

    An, Yi; Wang, Jiawei; Li, Chen; Leier, André; Marquez-Lago, Tatiana; Wilksch, Jonathan; Zhang, Yang; Webb, Geoffrey I; Song, Jiangning; Lithgow, Trevor

    2018-01-01

    Bacterial effector proteins secreted by various protein secretion systems play crucial roles in host-pathogen interactions. In this context, computational tools capable of accurately predicting effector proteins of the various types of bacterial secretion systems are highly desirable. Existing computational approaches use different machine learning (ML) techniques and heterogeneous features derived from protein sequences and/or structural information. These predictors differ not only in terms of the used ML methods but also with respect to the used curated data sets, the features selection and their prediction performance. Here, we provide a comprehensive survey and benchmarking of currently available tools for the prediction of effector proteins of bacterial types III, IV and VI secretion systems (T3SS, T4SS and T6SS, respectively). We review core algorithms, feature selection techniques, tool availability and applicability and evaluate the prediction performance based on carefully curated independent test data sets. In an effort to improve predictive performance, we constructed three ensemble models based on ML algorithms by integrating the output of all individual predictors reviewed. Our benchmarks demonstrate that these ensemble models outperform all the reviewed tools for the prediction of effector proteins of T3SS and T4SS. The webserver of the proposed ensemble methods for T3SS and T4SS effector protein prediction is freely available at http://tbooster.erc.monash.edu/index.jsp. We anticipate that this survey will serve as a useful guide for interested users and that the new ensemble predictors will stimulate research into host-pathogen relationships and inspiration for the development of new bioinformatics tools for predicting effector proteins of T3SS, T4SS and T6SS. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. A six-parameter Iwan model and its application

    NASA Astrophysics Data System (ADS)

    Li, Yikun; Hao, Zhiming

    2016-02-01

    Iwan models are a practical tool to describe the constitutive behavior of joints. In this paper, a six-parameter Iwan model based on a truncated power-law distribution with two Dirac delta functions is proposed, which gives a more comprehensive description of joints than previous Iwan models. Its analytical expressions, including the backbone curve, unloading curves and energy dissipation, are deduced. Parameter identification procedures and the discretization method are also provided. A model application based on Segalman et al.'s experimental work on bolted joints is carried out. Simulation effects of different numbers of Jenkins elements are discussed. The results indicate that the six-parameter Iwan model can be used to accurately reproduce the experimental phenomena of joints.
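
    To make the discretization concrete, here is a minimal parallel-Jenkins-element sketch of an Iwan-type joint model. It is not the six-parameter formulation of the paper; the slider-strength distribution and all numerical values are illustrative.

      import numpy as np

      # N elastic-slip (Jenkins) elements in parallel: each is a spring k/N in
      # series with a Coulomb slider, so the summed force is hysteretic.
      N, k = 200, 1.0e4
      f_slip = np.linspace(1.0, 50.0, N) / N      # illustrative slider strengths

      def iwan_force(x_history):
          w = np.zeros(N)                          # slider positions
          forces = []
          for x in x_history:
              f_el = k / N * (x - w)               # elastic force per element
              stuck = np.abs(f_el) <= f_slip
              # Sliding elements relocate so their force sits on the slip limit.
              w = np.where(stuck, w, x - np.sign(f_el) * f_slip * N / k)
              forces.append((k / N * (x - w)).sum())
          return np.array(forces)

      x = 0.01 * np.sin(np.linspace(0, 4 * np.pi, 400))   # cyclic displacement
      print(iwan_force(x)[:5])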

  18. Relevance of graph literacy in the development of patient-centered communication tools.

    PubMed

    Nayak, Jasmir G; Hartzler, Andrea L; Macleod, Liam C; Izard, Jason P; Dalkin, Bruce M; Gore, John L

    2016-03-01

    To determine the literacy skill sets of patients in the context of graphical interpretation of interactive dashboards. We assessed literacy characteristics of prostate cancer patients and assessed comprehension of quality of life dashboards. Health literacy, numeracy and graph literacy were assessed with validated tools. We divided patients into low vs. high numeracy and graph literacy. We report descriptive statistics on literacy, dashboard comprehension, and relationships between groups. We used correlation and multiple linear regressions to examine factors associated with dashboard comprehension. Despite high health literacy in educated patients (78% college educated), there was variation in numeracy and graph literacy. Numeracy and graph literacy scores were correlated (r=0.37). In those with low literacy, graph literacy scores most strongly correlated with dashboard comprehension (r=0.59-0.90). On multivariate analysis, graph literacy was independently associated with dashboard comprehension, adjusting for age, education, and numeracy level. Even among highly educated patients, variation in the ability to comprehend graphs exists. Clinicians must be aware of these differential proficiencies when counseling patients. Tools for patient-centered communication that employ visual displays need to account for literacy capabilities to ensure that patients can effectively engage these resources. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. Correlates of lower comprehension of informed consent among participants enrolled in a cohort study in Pune, India.

    PubMed

    Joglekar, Neelam S; Deshpande, Swapna S; Sahay, Seema; Ghate, Manisha V; Bollinger, Robert C; Mehendale, Sanjay M

    2013-03-01

    Optimum comprehension of informed consent by research participants is essential yet challenging. This study explored correlates of lower comprehension of informed consent among 1334 participants of a cohort study aimed at estimating HIV incidence in Pune, India. As part of the informed consent process, a structured comprehension tool was administered to study participants. Participants scoring ≥90% were categorised into the 'optimal comprehension group', whilst those scoring 80-89% were categorised into the 'lower comprehension group'. Data were analysed to identify sociodemographic and behavioural correlates of lower consent comprehension. The mean ± SD comprehension score was 94.4 ± 5.00%. Information pertaining to study-related risks was not comprehended by 61.7% of participants. HIV-negative men (adjusted OR [AOR] = 4.36, 95% CI 1.71-11.05) or HIV-negative women (AOR = 13.54, 95% CI 6.42-28.55), illiteracy (AOR= 1.65, 95% CI 1.19-2.30), those with a history of multiple partners (AOR = 1.73, 95% CI 1.12-2.66) and those never using condoms (AOR = 1.35, 95% CI 1.01-1.82) were more likely to have lower consent comprehension. We recommend exploration of domains of lower consent comprehension using a validated consent comprehension tool. Improved education in these specific domains would optimise consent comprehension among research participants.

  20. Parameter optimization, sensitivity, and uncertainty analysis of an ecosystem model at a forest flux tower site in the United States

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende

    2014-01-01

    Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes, from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY, which considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporates a comprehensive R package, the Flexible Modeling Framework (FME), and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the plant production-related parameters (e.g., PPDF1 and PRDX) are most sensitive to the model cost function. Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of target variables. Global sensitivity and uncertainty analysis indicate that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R functions such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.
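
    The calibration step described above (there driven by FME and SCE) reduces, in its simplest form, to minimizing a cost function of model-observation residuals. A minimal Python analogue with a toy flux model and synthetic observations:

      import numpy as np
      from scipy.optimize import least_squares

      rng = np.random.default_rng(5)
      t = np.arange(365.0)                              # days

      def nee_model(params, t):
          """Toy net-flux model: seasonal uptake minus baseline respiration."""
          p_max, r0 = params
          gpp = p_max * np.maximum(np.sin(2 * np.pi * t / 365), 0)
          return r0 - gpp

      # Synthetic "eddy covariance" observations from known parameters plus noise.
      obs = nee_model([6.0, 2.0], t) + rng.normal(0, 0.5, t.size)

      fit = least_squares(lambda p: nee_model(p, t) - obs, x0=[1.0, 1.0])
      print(fit.x)      # recovered [p_max, r0], close to [6.0, 2.0]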

  1. Cost-effectiveness modelling in diagnostic imaging: a stepwise approach.

    PubMed

    Sailer, Anna M; van Zwam, Wim H; Wildberger, Joachim E; Grutters, Janneke P C

    2015-12-01

    Diagnostic imaging (DI) is the fastest growing sector in medical expenditures and takes a central role in medical decision-making. The increasing number of various and new imaging technologies induces a growing demand for cost-effectiveness analysis (CEA) in imaging technology assessment. In this article we provide a comprehensive framework of direct and indirect effects that should be considered for CEA in DI, suitable for all imaging modalities. We describe and explain the methodology of decision analytic modelling in six steps aiming to transfer theory of CEA to clinical research by demonstrating key principles of CEA in a practical approach. We thereby provide radiologists with an introduction to the tools necessary to perform and interpret CEA as part of their research and clinical practice. • DI influences medical decision making, affecting both costs and health outcome. • This article provides a comprehensive framework for CEA in DI. • A six-step methodology for conducting and interpreting cost-effectiveness modelling is proposed.
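
    The arithmetic at the heart of stepwise CEA is the incremental cost-effectiveness ratio. A minimal sketch with placeholder numbers, not figures from the article:

      # ICER of a new imaging strategy versus standard care; values illustrative.
      cost_new, qaly_new = 1200.0, 8.35   # expected cost (EUR) and QALYs/patient
      cost_std, qaly_std = 800.0, 8.30

      icer = (cost_new - cost_std) / (qaly_new - qaly_std)
      print(f"ICER = {icer:.0f} EUR per QALY gained")      # 8000 EUR/QALY

      threshold = 20000.0                 # assumed willingness-to-pay per QALY
      print("cost-effective" if icer < threshold else "not cost-effective")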

  2. Machinability of titanium metal matrix composites (Ti-MMCs)

    NASA Astrophysics Data System (ADS)

    Aramesh, Maryam

    Titanium metal matrix composites (Ti-MMCs), as a new generation of materials, have various potential applications in the aerospace and automotive industries. The presence of ceramic particles enhances the physical and mechanical properties of the alloy matrix. However, the hard and abrasive nature of these particles causes various issues in the field of their machinability. Severe tool wear and short tool life are the most important drawbacks of machining this class of materials. There is very limited work in the literature regarding the machinability of this class of materials, especially in the areas of tool life estimation and tool wear. By far, polycrystalline diamond (PCD) tools appear to be the best choice for machining MMCs from the researchers' point of view. However, due to their high cost, economical alternatives are sought. Cubic boron nitride (CBN) inserts, as the second hardest available tools, show superior characteristics such as great wear resistance, high hardness at elevated temperatures, a low coefficient of friction and a high melting point. Yet, so far, CBN tools have not been studied during machining of Ti-MMCs. In this work, a comprehensive study has been performed to explore the tool wear mechanisms of CBN inserts during turning of Ti-MMCs. The unique morphology of the worn faces of the tools was investigated for the first time, which led to new insights into the identification of chemical wear mechanisms during machining of Ti-MMCs. Utilizing the full tool life capacity of cutting tools is also crucial, due to the considerable costs associated with suboptimal replacement of tools. This strongly motivates the development of a reliable model for tool life estimation under any cutting conditions. In this study, a novel model based on survival analysis methodology is developed to estimate the progressive states of tool wear under any cutting conditions during machining of Ti-MMCs. This statistical model takes into account the machining time in addition to the effect of cutting parameters. Promising results were obtained which showed very good agreement with the experimental results. Moreover, a more advanced model was constructed by adding tool wear as another variable to the previous model. A new model was thus proposed for estimating the remaining life of worn inserts under different cutting conditions, using the current tool wear data as an input. The results of this model were validated against the experimental results, and the estimates were closely consistent with those obtained from the experiments.
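
    The thesis's own survival model is not reproduced here, but the general shape of survival-based tool-life estimation can be sketched with a Weibull fit on right-censored life data, using the lifelines library. All data below are synthetic.

      import numpy as np
      from lifelines import WeibullFitter

      rng = np.random.default_rng(6)
      # Synthetic tool lives (minutes of cutting until the wear criterion), with
      # some tests stopped before failure: right-censored observations.
      life = rng.weibull(2.0, 40) * 30.0
      observed = life < 45.0                 # tests truncated at 45 min
      durations = np.minimum(life, 45.0)

      wf = WeibullFitter().fit(durations, event_observed=observed)
      print(wf.lambda_, wf.rho_)             # fitted Weibull scale and shape
      print(wf.survival_function_at_times(20.0))   # P(tool survives 20 min)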

  3. The GENIE Neutrino Monte Carlo Generator: Physics and User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andreopoulos, Costas; Barry, Christopher; Dytman, Steve

    2015-10-20

    GENIE is a suite of products for the experimental neutrino physics community. This suite includes i) a modern software framework for implementing neutrino event generators, a state-of-the-art comprehensive physics model and tools to support neutrino interaction simulation for realistic experimental setups (the Generator product), ii) extensive archives of neutrino, charged-lepton and hadron scattering data and software to produce a comprehensive set of data/MC comparisons (the Comparisons product), and iii) a generator tuning framework and fitting applications (the Tuning product). This book provides the definitive guide for the GENIE Generator: it presents the software architecture and a detailed description of its physics model and official tunes. In addition, it provides a rich set of data/MC comparisons that characterise the physics performance of GENIE. Detailed step-by-step instructions on how to install and configure the Generator, run its applications and analyze its outputs are also included.

  4. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    PubMed Central

    Hoare, Alexander; Regan, David G; Wilson, David P

    2008-01-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
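
    A Python sketch of two tasks SaSAT automates (SaSAT itself is built in Matlab): Latin hypercube sampling of parameter space and a rank-correlation sensitivity measure, applied to a toy epidemic quantity. Parameter names and ranges are illustrative.

      import numpy as np
      from scipy.stats import qmc, spearmanr

      # Latin hypercube sample of a 3-parameter space.
      sampler = qmc.LatinHypercube(d=3, seed=0)
      unit = sampler.random(n=500)
      params = qmc.scale(unit, l_bounds=[0.1, 0.01, 1.0], u_bounds=[0.5, 0.1, 10.0])

      def model_output(p):
          beta, gamma, n0 = p
          return n0 * beta / gamma         # toy R0-like epidemic quantity

      y = np.apply_along_axis(model_output, 1, params)
      for i, name in enumerate(["beta", "gamma", "n0"]):
          rho, _ = spearmanr(params[:, i], y)
          print(f"{name}: rank correlation {rho:+.2f}")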

  5. Risk Prediction Models for Acute Kidney Injury in Critically Ill Patients: Opus in Progressu.

    PubMed

    Neyra, Javier A; Leaf, David E

    2018-05-31

    Acute kidney injury (AKI) is a complex systemic syndrome associated with high morbidity and mortality. Among critically ill patients admitted to intensive care units (ICUs), the incidence of AKI is as high as 50% and is associated with dismal outcomes. Thus, the development and validation of clinical risk prediction tools that accurately identify patients at high risk for AKI in the ICU is of paramount importance. We provide a comprehensive review of 3 clinical risk prediction tools that have been developed for incident AKI occurring in the first few hours or days following admission to the ICU. We found substantial heterogeneity among the clinical variables that were examined and included as significant predictors of AKI in the final models. The area under the receiver operating characteristic curves was ∼0.8 for all 3 models, indicating satisfactory model performance, though positive predictive values ranged from only 23 to 38%. Hence, further research is needed to develop more accurate and reproducible clinical risk prediction tools. Strategies for improved assessment of AKI susceptibility in the ICU include the incorporation of dynamic (time-varying) clinical parameters, as well as biomarker, functional, imaging, and genomic data. © 2018 S. Karger AG, Basel.
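
    The gap between an AUC of ~0.8 and positive predictive values of 23-38% is largely a matter of prevalence: PPV falls as the event becomes rarer even when sensitivity and specificity are fixed. A minimal worked example with assumed operating characteristics:

      # PPV as a function of prevalence; sensitivity/specificity are assumptions.
      sens, spec = 0.80, 0.75
      for prev in (0.10, 0.30, 0.50):
          ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
          print(f"prevalence {prev:.0%}: PPV = {ppv:.0%}")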

  6. Efficient exploration of pan-cancer networks by generalized covariance selection and interactive web content

    PubMed Central

    Kling, Teresia; Johansson, Patrik; Sanchez, José; Marinescu, Voichita D.; Jörnsten, Rebecka; Nelander, Sven

    2015-01-01

    Statistical network modeling techniques are increasingly important tools to analyze cancer genomics data. However, current tools and resources are not designed to work across multiple diagnoses and technical platforms, thus limiting their applicability to comprehensive pan-cancer datasets such as The Cancer Genome Atlas (TCGA). To address this, we describe a new data driven modeling method, based on generalized Sparse Inverse Covariance Selection (SICS). The method integrates genetic, epigenetic and transcriptional data from multiple cancers, to define links that are present in multiple cancers, a subset of cancers, or a single cancer. It is shown to be statistically robust and effective at detecting direct pathway links in data from TCGA. To facilitate interpretation of the results, we introduce a publicly accessible tool (cancerlandscapes.org), in which the derived networks are explored as interactive web content, linked to several pathway and pharmacological databases. To evaluate the performance of the method, we constructed a model for eight TCGA cancers, using data from 3900 patients. The model rediscovered known mechanisms and contained interesting predictions. Possible applications include prediction of regulatory relationships, comparison of network modules across multiple forms of cancer and identification of drug targets. PMID:25953855
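
    The paper's generalized SICS is not reproduced here, but standard sparse inverse covariance selection conveys the core idea of network inference: non-zero off-diagonal entries of the estimated precision matrix are the inferred links. A minimal sketch with scikit-learn on synthetic expression data:

      import numpy as np
      from sklearn.covariance import GraphicalLasso

      rng = np.random.default_rng(7)
      # Synthetic expression matrix: 300 samples x 6 genes, one induced link.
      X = rng.normal(size=(300, 6))
      X[:, 1] += 0.8 * X[:, 0]             # gene 1 co-varies with gene 0

      gl = GraphicalLasso(alpha=0.05).fit(X)
      precision = gl.precision_
      links = np.argwhere(np.triu(np.abs(precision) > 0.05, k=1))
      print(links)                          # expect the (0, 1) pair to appear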

  7. Methods, Tools and Current Perspectives in Proteogenomics *

    PubMed Central

    Ruggles, Kelly V.; Krug, Karsten; Wang, Xiaojing; Clauser, Karl R.; Wang, Jing; Payne, Samuel H.; Fenyö, David; Zhang, Bing; Mani, D. R.

    2017-01-01

    With combined technological advancements in high-throughput next-generation sequencing and deep mass spectrometry-based proteomics, proteogenomics, i.e. the integrative analysis of proteomic and genomic data, has emerged as a new research field. Early efforts in the field were focused on improving protein identification using sample-specific genomic and transcriptomic sequencing data. More recently, integrative analysis of quantitative measurements from genomic and proteomic studies have identified novel insights into gene expression regulation, cell signaling, and disease. Many methods and tools have been developed or adapted to enable an array of integrative proteogenomic approaches and in this article, we systematically classify published methods and tools into four major categories, (1) Sequence-centric proteogenomics; (2) Analysis of proteogenomic relationships; (3) Integrative modeling of proteogenomic data; and (4) Data sharing and visualization. We provide a comprehensive review of methods and available tools in each category and highlight their typical applications. PMID:28456751

  8. Performance and Sizing Tool for Quadrotor Biplane Tailsitter UAS

    NASA Astrophysics Data System (ADS)

    Strom, Eric

    The Quadrotor-Biplane-Tailsitter (QBT) configuration is the basis for a mechanically simplistic rotorcraft capable of both long-range, high-speed cruise as well as hovering flight. This work presents the development and validation of a set of preliminary design tools built specifically for this aircraft to enable its further development, including: a QBT weight model, preliminary sizing framework, and vehicle analysis tools. The preliminary sizing tool presented here shows the advantage afforded by QBT designs in missions with aggressive cruise requirements, such as offshore wind turbine inspections, wherein transition from a quadcopter configuration to a QBT allows for a 5:1 trade of battery weight for wing weight. A 3D, unsteady panel method utilizing a nonlinear implementation of the Kutta-Joukowsky condition is also presented as a means of computing aerodynamic interference effects and, through the implementation of rotor, body, and wing geometry generators, is prepared for coupling with a comprehensive rotor analysis package.

  9. Navigating complex patients using an innovative tool: the MTM Spider Web.

    PubMed

    Morello, Candis M; Hirsch, Jan D; Lee, Kelly C

    2013-01-01

    To introduce a teaching tool that can be used to assess the complexity of medication therapy management (MTM) patients, prioritize appropriate interventions, and design patient-centered care plans for each encounter. MTM patients are complex as a result of multiple comorbidities, medications, and socioeconomic and behavioral issues. Pharmacists who provide MTM services are required to synthesize a plethora of information (medical and nonmedical), evaluate and prioritize the clinical problems, and design a comprehensive patient-centered care plan. The MTM Spider Web is a visual tool to facilitate this process. A description is provided regarding how to build the MTM Spider Web using case-based scenarios. This model can be used to teach pharmacists, health professional students, and patients. The MTM Spider Web is an innovative teaching tool that can be used to teach pharmacists and students how to assess complex patients and design a patient-centered care plan to deliver the most appropriate medication therapy.

  10. Intersymbol Interference Investigations Using a 3D Time-Dependent Traveling Wave Tube Model

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.; Andro, Monty

    2002-01-01

    For the first time, a time-dependent, physics-based computational model has been used to provide a direct description of the effects of the traveling wave tube amplifier (TWTA) on modulated digital signals. The TWT model comprehensively takes into account the effects of frequency dependent AM/AM and AM/PM conversion; gain and phase ripple; drive-induced oscillations; harmonic generation; intermodulation products; and backward waves. Thus, signal integrity can be investigated in the presence of these sources of potential distortion as a function of the physical geometry and operating characteristics of the high power amplifier and the operational digital signal. This method promises superior predictive fidelity compared to methods using TWT models based on swept-amplitude and/or swept-frequency data. First, the TWT model using the three dimensional (3D) electromagnetic code MAFIA is presented. Then, this comprehensive model is used to investigate approximations made in conventional TWT black-box models used in communication system level simulations. To quantitatively demonstrate the effects these approximations have on digital signal performance predictions, including intersymbol interference (ISI), the MAFIA results are compared to the system level analysis tool, Signal Processing Workstation (SPW), using high order modulation schemes including 16 and 64-QAM.

  11. EMPIRE: A code for nuclear astrophysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palumbo, A.

    The nuclear reaction code EMPIRE is presented as a useful tool for nuclear astrophysics. EMPIRE combines a variety of reaction models with a comprehensive library of input parameters, providing a diversity of options for the user. With the exclusion of direct-semidirect capture, all reaction mechanisms relevant to the nuclear astrophysics energy range of interest are implemented in the code. Comparisons to experimental data show consistent agreement for all relevant channels.

  12. A Comprehensive Study of a Micro-Channel Heat Sink Using Integrated Thin-Film Temperature Sensors

    PubMed Central

    Wang, Tao; Wang, Jiejun; He, Jian; Wu, Chuangui; Luo, Wenbo; Shuai, Yao; Zhang, Wanli; Chen, Xiancai; Zhang, Jian; Lin, Jia

    2018-01-01

    A micro-channel heat sink is a promising cooling method for high power integrated circuits (IC). However, the understanding of such a micro-channel device is not sufficient, because the tools for studying it are very limited. The details inside the micro-channels are not readily available. In this letter, a micro-channel heat sink is comprehensively studied using integrated temperature sensors. The highly sensitive thin film temperature sensors can accurately monitor the temperature change in the micro-channel in real time. The outstanding heat dissipation performance of the micro-channel heat sink is proven in terms of maximum temperature, cooling speed and heat resistance. The temperature profile along the micro-channel is extracted, and even small temperature perturbations can be detected. The temperature peak formed by the heat source shifts in the flow direction as the flow rate increases. However, the temperature non-uniformity is independent of flow rate and depends solely on the heating power. Specific designs for minimizing the temperature non-uniformity are necessary. In addition, the experimental results from the integrated temperature sensors match the simulation results well. This can be used to directly verify the modeling results, helping to build a convincing simulation model. The integrated sensor could be a powerful tool for studying micro-channel based heat sinks. PMID:29351248

  13. SCALE Code System 6.2.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules, including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.

  14. Interprofessional partnerships in chronic illness care: a conceptual model for measuring partnership effectiveness

    PubMed Central

    Butt, Gail; Markle-Reid, Maureen; Browne, Gina

    2008-01-01

    Introduction Interprofessional health and social service partnerships (IHSSP) are internationally acknowledged as integral for comprehensive chronic illness care. However, the evidence-base for partnership effectiveness is lacking. This paper aims to clarify partnership measurement issues, conceptualize IHSSP at the front-line staff level, and identify tools valid for group process measurement. Theory and methods A systematic literature review utilizing three interrelated searches was conducted. Thematic analysis techniques were supported by NVivo 7 software. Complexity theory was used to guide the analysis, ground the new conceptualization and validate the selected measures. Other properties of the measures were critiqued using established criteria. Results There is a need for a convergent view of what constitutes a partnership and its measurement. The salient attributes of IHSSP and their interorganizational context were described and grounded within complexity theory. Two measures were selected and validated for measurement of proximal group outcomes. Conclusion This paper depicts a novel complexity theory-based conceptual model for IHSSP of front-line staff who provide chronic illness care. The conceptualization provides the underpinnings for a comprehensive evaluative framework for partnerships. Two partnership process measurement tools, the PSAT and TCI are valid for IHSSP process measurement with consideration of their strengths and limitations. PMID:18493591

  15. A Comprehensive Study of a Micro-Channel Heat Sink Using Integrated Thin-Film Temperature Sensors.

    PubMed

    Wang, Tao; Wang, Jiejun; He, Jian; Wu, Chuangui; Luo, Wenbo; Shuai, Yao; Zhang, Wanli; Chen, Xiancai; Zhang, Jian; Lin, Jia

    2018-01-19

    A micro-channel heat sink is a promising cooling method for high-power integrated circuits (ICs). However, the understanding of such micro-channel devices is not sufficient, because the tools for studying them are very limited and the details inside the micro-channels are not readily accessible. In this letter, a micro-channel heat sink is comprehensively studied using integrated temperature sensors. The highly sensitive thin-film temperature sensors can accurately monitor the temperature change in the micro-channel in real time. The outstanding heat dissipation performance of the micro-channel heat sink is demonstrated in terms of maximum temperature, cooling speed and thermal resistance. The temperature profile along the micro-channel is extracted, and even small temperature perturbations can be detected. The temperature peak formed by the heat source shifts in the flow direction as the flow rate increases. However, the temperature non-uniformity is independent of flow rate and depends solely on the heating power. Specific designs for minimizing the temperature non-uniformity are necessary. In addition, the experimental results from the integrated temperature sensors match the simulation results well; they can be used to directly verify modeling results, helping to build a convincing simulation model. The integrated sensor could be a powerful tool for studying micro-channel based heat sinks.

  16. Quantitative Evaluation of Heavy Duty Machine Tools Remanufacturing Based on Modified Catastrophe Progression Method

    NASA Astrophysics Data System (ADS)

    shunhe, Li; jianhua, Rao; lin, Gui; weimin, Zhang; degang, Liu

    2017-11-01

    The result of remanufacturing evaluation is the basis for judging whether a heavy duty machine tool can be remanufactured at the EOL stage of the machine tool lifecycle. The objectivity and accuracy of the evaluation are the key to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy duty machine tool remanufacturing, and the results are modified by the comprehensive adjustment method, which brings the evaluation results into line with conventional human judgment. The catastrophe progression method is used to establish a quantitative evaluation model for heavy duty machine tools and to evaluate the remanufacturing of a retired TK6916 CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the results are objective.
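
    For orientation, a minimal sketch of the standard catastrophe progression scoring formulas (a textbook form under stated assumptions, not the paper's exact model; the comprehensive adjustment step is noted but not reproduced):

        # Catastrophe progression scoring for normalized indicators in [0, 1].
        # The roots 1/2, 1/3, 1/4, ... are the standard cusp/swallowtail/
        # butterfly normalization formulas, ordered by indicator importance.
        def catastrophe_score(indicators, complementary=True):
            levels = [u ** (1.0 / (k + 2)) for k, u in enumerate(indicators)]
            # Complementary indicators are averaged; otherwise the minimum
            # ("bucket principle") is taken. The paper's comprehensive
            # adjustment method rescales the result toward conventional
            # scoring ranges and is not reproduced here.
            return sum(levels) / len(levels) if complementary else min(levels)

        # Hypothetical normalized sub-criteria for a retired machine tool:
        print(round(catastrophe_score([0.7, 0.5, 0.6]), 3))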

  17. Integrated modelling of nitrate loads to coastal waters and land rent applied to catchment-scale water management.

    PubMed

    Refsgaard, A; Jacobsen, T; Jacobsen, B; Ørum, J-E

    2007-01-01

    The EU Water Framework Directive (WFD) requires an integrated approach to river basin management in order to meet environmental and ecological objectives. This paper presents concepts and a full-scale application of an integrated modelling framework. The Ringkoebing Fjord basin is characterized by intensive agricultural production, and nitrate leaching constitutes a major pollution problem with respect to groundwater aquifers (drinking water), fresh surface water systems (lake water quality) and coastal receiving waters (eutrophication). The case study presented illustrates an advanced modelling approach applied in river basin management. Point sources (e.g. sewage treatment plant discharges) and distributed diffuse sources (nitrate leaching) are included to provide a modelling tool capable of simulating pollution transport from source to recipient, in order to analyse the effects of specific, localized basin water management plans. The paper also includes a land rent modelling approach which can be used to choose the most cost-effective measures and the locations of these measures. As a forerunner to the use of basin-scale models in WFD basin water management plans, this project demonstrates the potential and limitations of comprehensive, integrated modelling tools.

  18. Global Analysis, Interpretation and Modelling: An Earth Systems Modelling Program

    NASA Technical Reports Server (NTRS)

    Moore, Berrien, III; Sahagian, Dork

    1997-01-01

    The Goal of the GAIM is: To advance the study of the coupled dynamics of the Earth system using as tools both data and models; to develop a strategy for the rapid development, evaluation, and application of comprehensive prognostic models of the Global Biogeochemical Subsystem which could eventually be linked with models of the Physical-Climate Subsystem; to propose, promote, and facilitate experiments with existing models or by linking subcomponent models, especially those associated with IGBP Core Projects and with WCRP efforts. Such experiments would be focused upon resolving interface issues and questions associated with developing an understanding of the prognostic behavior of key processes; to clarify key scientific issues facing the development of Global Biogeochemical Models and the coupling of these models to General Circulation Models; to assist the Intergovernmental Panel on Climate Change (IPCC) process by conducting timely studies that focus upon elucidating important unresolved scientific issues associated with the changing biogeochemical cycles of the planet and upon the role of the biosphere in the physical-climate subsystem, particularly its role in the global hydrological cycle; and to advise the SC-IGBP on progress in developing comprehensive Global Biogeochemical Models and to maintain scientific liaison with the WCRP Steering Group on Global Climate Modelling.

  19. General Pressurization Model in Simscape

    NASA Technical Reports Server (NTRS)

    Servin, Mario; Garcia, Vicky

    2010-01-01

    System integration is an essential part of the engineering design process. The Ares I Upper Stage (US) is a complex system made up of thousands of components assembled into subsystems including a J-2X engine, liquid hydrogen (LH2) and liquid oxygen (LO2) tanks, avionics, thrust vector control, motors, etc. System integration is the task of connecting together all of the subsystems into one large system. To ensure safety and quality, and that all the components will "fit together", integration analysis is required. Integration analysis verifies that, as an integrated system, the system will behave as designed. Models that represent the actual subsystems are built for more comprehensive analysis. Matlab has been an instrument widely used by engineers to construct mathematical models of systems. Simulink, one of the tools offered by Matlab, provides a multi-domain graphical environment to simulate and design time-varying systems, and is a powerful tool for analyzing the dynamic behavior of systems over time. Furthermore, Simscape, a tool provided with Simulink, allows users to model physical systems (such as mechanical, thermal and hydraulic systems) using physical networks. Using Simscape, a model representing an inflow of gas to a pressurized tank was created, in which the temperature and pressure of the tank are measured over time to show the behavior of the gas. By further incorporating Simscape into model building, the full potential of this software can be realized and it can become a more widely utilized tool.
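
    As an illustration of the kind of model described, a minimal sketch (not the authors' Simscape model; gas properties and tank parameters are assumed) of adiabatic charging of a rigid tank with an ideal gas:

        # Adiabatic filling of a rigid tank with an ideal gas (air), integrated
        # with SciPy. Energy balance: m*cv*dT/dt = mdot*(cp*T_in - cv*T).
        from scipy.integrate import solve_ivp

        R, cv = 287.0, 718.0               # air: gas constant, cv  [J/(kg K)]
        cp = R + cv
        V, T_in, mdot = 0.1, 300.0, 0.01   # tank m^3, inflow K, kg/s (assumed)

        def rhs(t, y):
            m, T = y
            return [mdot, mdot * (cp * T_in - cv * T) / (m * cv)]

        sol = solve_ivp(rhs, (0.0, 60.0), [0.12, 300.0])  # start part-filled
        m, T = sol.y[:, -1]
        print(f"after 60 s: T = {T:.1f} K, P = {m * R * T / V / 1e5:.2f} bar")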

  20. Unconventional Tools for an Unconventional Resource: Community and Landscape Planning for Shale in the Marcellus Region

    NASA Astrophysics Data System (ADS)

    Murtha, T., Jr.; Orland, B.; Goldberg, L.; Hammond, R.

    2014-12-01

    Deep shale natural gas deposits made accessible by new technologies are quickly becoming a considerable share of North America's energy portfolio. Unlike traditional deposits and extraction footprints, shale gas poses dispersed and complex landscape and community challenges, both cultural and environmental. This paper describes the development and application of creative geospatial tools as a means to engage communities in the northern tier counties of Pennsylvania, which are experiencing Marcellus shale drilling, in design and planning. Uniquely combining physical landscape models with predictive models of exploration activities, including drilling, pipeline construction and road reconstruction, the tools quantify the potential impacts of drilling activities for communities and landscapes in the commonwealth of Pennsylvania. Dividing the state into 9836 watershed sub-basins, we first describe the current state of Marcellus related activities through 2014. We then describe and report the results of three scaled predictive models designed to investigate probable sub-basins where future activities will be focused. Finally, the core of the paper reports on the second level of tools we have now developed to engage communities in planning for unconventional gas extraction in Pennsylvania. Using a geodesign approach we are working with communities to transfer information for comprehensive landscape planning and informed decision making. These tools not only quantify physical landscape impacts, but also quantify potential visual, aesthetic and cultural resource implications.

  1. The Assessment of Reading Comprehension Difficulties for Reading Intervention

    ERIC Educational Resources Information Center

    Woolley, Gary

    2008-01-01

    There are many environmental and personal factors that contribute to reading success. Reading comprehension is a complex interaction of language, sensory perception, memory, and motivational aspects. However, most existing assessment tools have not adequately reflected the complex nature of reading comprehension. Good assessment requires a…

  2. Prediction of the wear and evolution of cutting tools in a carbide / titanium-aluminum-vanadium machining tribosystem by volumetric tool wear characterization and modeling

    NASA Astrophysics Data System (ADS)

    Kuttolamadom, Mathew Abraham

    The objective of this research work is to create a comprehensive microstructural wear mechanism-based predictive model of tool wear in the tungsten carbide / Ti-6Al-4V machining tribosystem, and to develop a new topology characterization method for worn cutting tools in order to validate the model predictions. This is accomplished by blending first principle wear mechanism models using a weighting scheme derived from scanning electron microscopy (SEM) imaging and energy dispersive x-ray spectroscopy (EDS) analysis of tools worn under different operational conditions. In addition, the topology of worn tools is characterized through scanning by white light interferometry (WLI), and then application of an algorithm to stitch and solidify data sets to calculate the volume of the tool worn away. The methodology was to first combine and weight dominant microstructural wear mechanism models, to be able to effectively predict the tool volume worn away. Then, by developing a new metrology method for accurately quantifying the bulk-3D wear, the model-predicted wear was validated against worn tool volumes obtained from corresponding machining experiments. On analyzing worn crater faces using SEM/EDS, adhesion was found dominant at lower surface speeds, while dissolution wear dominated with increasing speeds -- this is in conformance with the lower relative surface speed requirement for micro welds to form and rupture, essentially defining the mechanical load limit of the tool material. It also conforms to the known dominance of high temperature-controlled wear mechanisms with increasing surface speed, which is known to exponentially increase temperatures especially when machining Ti-6Al-4V due to its low thermal conductivity. Thus, straight tungsten carbide wear when machining Ti-6Al-4V is mechanically-driven at low surface speeds and thermally-driven at high surface speeds. Further, at high surface speeds, craters were formed due to carbon diffusing to the tool surface and being carried away by the rubbing action of the chips -- this left behind a smooth crater surface predominantly of tungsten and cobalt as observed from EDS analysis. Also, at high surface speeds, carbon from the tool was found diffused into the adhered titanium layer to form a titanium carbide (TiC) boundary layer -- this was observed as instances of TiC build-up on the tool edge from EDS analysis. A complex wear mechanism interaction was thus observed, i.e., titanium adhered on top of an earlier worn out crater trough, additional carbon diffused into this adhered titanium layer to create a more stable boundary layer (which could limit diffusion-rates on saturation), and then all were further worn away by dissolution wear as temperatures increased. At low and medium feeds, notch discoloration was observed -- this was detected to be carbon from EDS analysis, suggesting that it was deposited from the edges of the passing chips. Mapping the dominant wear mechanisms showed the increasing dominance of dissolution wear relative to adhesion, with increasing grain size -- this is because a 13% larger sub-micron grain results in a larger surface area of cobalt exposed to chemical action. On the macro-scale, wear quantification through topology characterization elevated wear from a 1D to 3D concept. 
From investigation, a second order dependence of volumetric tool wear (VTW) and VTW rate on the material removal rate (MRR) emerged, suggesting that MRR is a more consistent wear-controlling factor than the traditionally used cutting speed. A predictive model for VTW was developed which showed its exponential dependence on workpiece stock volume removed. Also, both VTW and VTW rate were found to depend on the cumulative wear accumulated on the tool. Further, a ratio metric of stock material removed to tool volume lost is now possible as a tool efficiency quantifier and energy-based productivity parameter, which was found to depend inversely on MRR -- this led to a more comprehensive tool wear definition based on cutting tool efficiency. (Abstract shortened by UMI.)
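
    A minimal sketch of fitting such an exponential VTW law (synthetic data; the functional form only mirrors the abstract's qualitative finding):

        # Fit volumetric tool wear (VTW) as an exponential function of stock
        # volume removed; data points below are synthetic placeholders.
        import numpy as np
        from scipy.optimize import curve_fit

        def vtw_model(v_removed, a, b):
            return a * np.exp(b * v_removed)

        v = np.array([10, 20, 30, 40, 50.0])          # stock removed, cm^3
        w = np.array([0.02, 0.05, 0.11, 0.24, 0.52])  # worn tool volume, mm^3

        (a, b), _ = curve_fit(vtw_model, v, w, p0=(0.01, 0.05))
        print(f"VTW ~ {a:.3f} * exp({b:.3f} * V)")
        # The abstract's efficiency metric: stock removed per tool volume lost.
        print("efficiency at V = 50:", 50.0 / vtw_model(50.0, a, b))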

  3. A tool for urban soundscape evaluation applying Support Vector Machines for developing a soundscape classification model.

    PubMed

    Torija, Antonio J; Ruiz, Diego P; Ramos-Ridao, Angel F

    2014-06-01

    To ensure appropriate soundscape management in urban environments, urban-planning authorities need a range of tools that enable such a task to be performed. An essential step in the management of urban areas from a sound standpoint is the evaluation of the soundscape in the area. It has been widely acknowledged that a subjective and acoustical categorization of a soundscape is the first step in evaluating it, providing a basis for designing or adapting it to match people's expectations as well. To this end, this work proposes a model for automatic classification of urban soundscapes based on underlying acoustical and perceptual criteria, intended as a tool for comprehensive urban soundscape evaluation. Because of the great complexity associated with the problem, two machine learning techniques, Support Vector Machines (SVM) and Support Vector Machines trained with Sequential Minimal Optimization (SMO), are implemented in developing the classification model. The results indicate that the SMO model outperforms the SVM model in the specific task of soundscape classification. With the implementation of the SMO algorithm, the classification model achieves an outstanding performance (91.3% of instances correctly classified). © 2013 Elsevier B.V. All rights reserved.
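
    A minimal sketch of such a classification setup (synthetic features, not the authors' data; scikit-learn's SVC is itself trained with an SMO-type solver, so it stands in for both techniques):

        # SVM classification of soundscape feature vectors, cross-validated.
        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 4))  # e.g. level, loudness, sharpness, tonality
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # two synthetic classes

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())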

  4. Analysis of Advanced Rotorcraft Configurations

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2000-01-01

    Advanced rotorcraft configurations are being investigated with the objectives of identifying vehicles that are larger, quieter, and faster than current-generation rotorcraft. A large rotorcraft, carrying perhaps 150 passengers, could do much to alleviate airport capacity limitations, and a quiet rotorcraft is essential for community acceptance of the benefits of VTOL operations. A fast, long-range, long-endurance rotorcraft, notably the tilt-rotor configuration, will improve rotorcraft economics through productivity increases. A major part of the investigation of advanced rotorcraft configurations consists of conducting comprehensive analyses of vehicle behavior for the purpose of assessing vehicle potential and feasibility, as well as to establish the analytical models required to support the vehicle development. The analytical work of FY99 included applications to tilt-rotor aircraft. Tilt Rotor Aeroacoustic Model (TRAM) wind tunnel measurements are being compared with calculations performed by using the comprehensive analysis tool CAMRAD II (Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics). The objective is to establish the wing and wake aerodynamic models that are required for tilt-rotor analysis and design. The TRAM test in the German-Dutch Wind Tunnel (DNW) produced extensive measurements. This is the first test to encompass air loads, performance, and structural load measurements on tilt rotors, as well as acoustic and flow visualization data. The correlation of measurements and calculations includes helicopter-mode operation (performance, air loads, and blade structural loads), hover (performance and air loads), and airplane-mode operation (performance).

  5. Software development: Stratosphere modeling

    NASA Technical Reports Server (NTRS)

    Chen, H. C.

    1977-01-01

    A more comprehensive model for stratospheric chemistry and transport theory was developed for the purpose of aiding predictions of changes in the stratospheric ozone content as a consequence of natural and anthropogenic processes. This new and more advanced stratospheric model is time dependent, and the dependent variables are zonal means of the relevant meteorological quantities as functions of latitude and height. The model was constructed using the best available mathematical approach on a large IBM S/360 in American National Standard FORTRAN. It will be both a scientific tool and an assessment device used to evaluate other models. The interactions of dynamics, photochemistry and radiation in the stratosphere are governed by a set of fundamental dynamical equations.

  6. The Value of Reliable Data: Interactive Data Tools from the National Comprehensive Center for Teacher Quality. Policy-to-Practice Brief. Number 1

    ERIC Educational Resources Information Center

    National Comprehensive Center for Teacher Quality, 2008

    2008-01-01

    The National Comprehensive Center for Teacher Quality (TQ Center) designed the Interactive Data Tools to provide users with access to state and national data that can be helpful in assessing the qualifications of teachers in the states and the extent to which a state's teacher policy climate generally supports teacher quality. The Interactive Data…

  7. Pathology economic model tool: a novel approach to workflow and budget cost analysis in an anatomic pathology laboratory.

    PubMed

    Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens

    2010-08-01

    The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.
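
    A minimal sketch of the top-down costing idea (all figures are hypothetical):

        # Roll a subarea's labor, overhead and consumables into a per-slide
        # cost, then compare with a relative-value-unit (RVU) estimate.
        subarea = {
            "labor": 42_000.0,        # monthly salaries + benefits, USD
            "overhead": 18_000.0,     # space, depreciation, administration, USD
            "consumables": 9_000.0,   # reagents, glass, USD
        }
        slides_per_month = 15_000
        cost_per_slide = sum(subarea.values()) / slides_per_month
        rvu_estimate = 3.10           # hypothetical RVU-based per-slide cost
        print(f"top-down ${cost_per_slide:.2f} vs RVU ${rvu_estimate:.2f} per slide")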

  8. Higher Order Modulation Intersymbol Interference Caused by Traveling-wave Tube Amplifiers

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.; Andro, Monty; Williams, W. D. (Technical Monitor)

    2002-01-01

    For the first time, a time-dependent, physics-based computational model has been used to provide a direct description of the effects of the traveling wave tube amplifier (TWTA) on modulated digital signals. The TWT model comprehensively takes into account the effects of frequency dependent AM/AM and AM/PM conversion; gain and phase ripple; drive-induced oscillations; harmonic generation; intermodulation products; and backward waves. Thus, signal integrity can be investigated in the presence of these sources of potential distortion as a function of the physical geometry and operating characteristics of the high power amplifier and the operational digital signal. This method promises superior predictive fidelity compared to methods using TWT models based on swept-amplitude and/or swept-frequency data. First, the TWT model using the three dimensional (3D) electromagnetic code MAFIA is presented. Then, this comprehensive model is used to investigate approximations made in conventional TWT black-box models used in communication system level simulations. To quantitatively demonstrate the effects these approximations have on digital signal performance predictions, including intersymbol interference (ISI), the MAFIA results are compared to the system-level analysis tool, Signal Processing Workstation (SPW), using high order modulation schemes including 16- and 64-QAM.

  9. Benefits and applications of interdisciplinary digital tools for environmental meta-reviews and analyses

    NASA Astrophysics Data System (ADS)

    Grubert, Emily; Siders, Anne

    2016-09-01

    Digitally-aided reviews of large bodies of text-based information, such as academic literature, are growing in capability but are not yet common in environmental fields. Environmental sciences and studies can benefit from application of digital tools to create comprehensive, replicable, interdisciplinary reviews that provide rapid, up-to-date, and policy-relevant reports of existing work. This work reviews the potential for applications of computational text mining and analysis tools originating in the humanities to environmental science and policy questions. Two process-oriented case studies of digitally-aided environmental literature reviews and meta-analyses illustrate potential benefits and limitations. A medium-sized, medium-resolution review (∼8000 journal abstracts and titles) focuses on topic modeling as a rapid way to identify thematic changes over time. A small, high-resolution review (∼300 full text journal articles) combines collocation and network analysis with manual coding to synthesize and question empirical field work. We note that even small digitally-aided analyses are close to the upper limit of what can be done manually. Established computational methods developed in humanities disciplines and refined by humanities and social science scholars to interrogate large bodies of textual data are applicable and useful in environmental sciences but have not yet been widely applied. Two case studies provide evidence that digital tools can enhance insight. Two major conclusions emerge. First, digital tools enable scholars to engage large literatures rapidly and, in some cases, more comprehensively than is possible manually. Digital tools can confirm manually identified patterns or identify additional patterns visible only at a large scale. Second, digital tools allow for more replicable and transparent conclusions to be drawn from literature reviews and meta-analyses. The methodological subfields of digital humanities and computational social sciences will likely continue to create innovative tools for analyzing large bodies of text, providing opportunities for interdisciplinary collaboration with the environmental fields.
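
    A minimal sketch of the medium-resolution approach (toy corpus, not the authors' pipeline), using LDA topic modeling over abstracts:

        # Topic modeling over a (toy) corpus of abstracts with scikit-learn.
        from sklearn.decomposition import LatentDirichletAllocation
        from sklearn.feature_extraction.text import CountVectorizer

        abstracts = [
            "groundwater nitrate leaching under irrigated agriculture",
            "nitrate transport modelling in river basins",
            "life cycle assessment of shale gas extraction",
            "community perceptions of hydraulic fracturing and shale gas",
        ]
        vec = CountVectorizer(stop_words="english")
        X = vec.fit_transform(abstracts)
        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

        terms = vec.get_feature_names_out()
        for k, comp in enumerate(lda.components_):
            top = [terms[i] for i in comp.argsort()[::-1][:3]]
            print(f"topic {k}:", ", ".join(top))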

  10. A business case evaluation of workplace engineering noise control: a net-cost model.

    PubMed

    Lahiri, Supriya; Low, Colleen; Barry, Michael

    2011-03-01

    This article provides a convenient tool for companies to determine the costs and benefits of alternative interventions to prevent noise-induced hearing loss (NIHL). Contextualized for Singapore and in collaboration with Singapore's Ministry of Manpower, the Net-Cost model evaluates costs of intervention for equipment and labor, avoided costs of productivity losses and medical care, and productivity gains from the employer's economic perspective. To pilot this approach, four case studies are presented, with varying degrees of economic benefits to the employer, including one in which multifactor productivity is the main driver. Although compliance agencies may not require economic analysis of NIHL, given scarce resources in a market-driven economy, this tool enables stakeholders to understand and compare the costs and benefits of NIHL interventions comprehensively and helps in determining risk management strategies.

  11. Technical Advances and Fifth Grade Reading Comprehension: Do Students Benefit?

    ERIC Educational Resources Information Center

    Fountaine, Drew

    This paper takes a look at some recent studies on utilization of technical tools, primarily personal computers and software, for improving fifth-grade students' reading comprehension. Specifically, the paper asks what benefits an educator can expect students to derive from closed-captioning and computer-assisted reading comprehension products. It…

  12. A Low Vision Reading Comprehension Test.

    ERIC Educational Resources Information Center

    Watson, G. R.; And Others

    1996-01-01

    Fifty adults (ages 28-86) with macular degeneration were given the Low Vision Reading Comprehension Assessment (LVRCA) to test its reliability and validity in evaluating the reading comprehension of those with vision impairments. The LVRCA was found to take only nine minutes to administer and was a valid and reliable tool. (CR)

  13. Space Laboratory on a Table Top: A Next Generative ECLSS design and diagnostic tool

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.

    2005-01-01

    This paper describes the development plan for a comprehensive research and diagnostic tool for aspects of advanced life support systems in space-based laboratories. Specifically it aims to build a high fidelity tabletop model that can be used for the purpose of risk mitigation, failure mode analysis, contamination tracking, and testing reliability. We envision a comprehensive approach involving experimental work coupled with numerical simulation to develop this diagnostic tool. It envisions a 10% scale transparent model of a space platform such as the International Space Station that operates with water or a specific matched index of refraction liquid as the working fluid. This allows the scaling of a 10 ft x 10 ft x 10 ft room with air flow to a 1 ft x 1 ft x 1 ft tabletop model with water/liquid flow. Dynamic similitude for this length scale dictates model velocities to be 67% of full scale, and thereby the time scale of the model to represent 15% of the full-scale system; meaning identical processes in the model are completed in 15% of the full-scale time. The use of an index matching fluid (fluid that matches the refractive index of cast acrylic, the model material) allows making the entire model (with complex internal geometry) transparent and hence conducive to non-intrusive optical diagnostics. Using such a system one can test environment control parameters such as core flows (axial flows) and cross flows (from registers and diffusers), identify potential problem areas such as flow short circuits, inadequate oxygen content, or build up of other gases beyond desirable levels, test mixing processes within the system at local nodes or compartments, and assess the overall system performance. The system allows quantitative measurements of contaminants introduced in the system and allows testing and optimizing the tracking process and removal of contaminants. The envisaged system will be modular and hence flexible for quick configuration change and subsequent testing. The data and inferences from the tests will allow for improvements in the development and design of next generation life support systems and configurations. Preliminary experimental and modeling work in this area will be presented. This involves testing of a single inlet-exit model with detailed 3-D flow visualization and quantitative diagnostics and computational modeling of the system.
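
    A worked check of the scaling quoted above, assuming Reynolds-number similitude and nominal textbook viscosities for air and water:

        # Reynolds similitude between a full-scale air-filled cabin and a
        # 10% scale water-filled acrylic model.
        nu_air, nu_water = 1.5e-5, 1.0e-6  # kinematic viscosities, m^2/s (nominal)
        scale = 0.1                        # model length / full-scale length

        # Matching Re = V*L/nu gives V_model/V_full = (nu_water/nu_air)/scale.
        v_ratio = (nu_water / nu_air) / scale
        # Convective time goes as L/V, so t_model/t_full = scale/v_ratio.
        t_ratio = scale / v_ratio

        print(f"model velocity = {v_ratio:.0%} of full scale")  # ~67%
        print(f"model time     = {t_ratio:.0%} of full scale")  # ~15%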

  14. Correlates of lower comprehension of informed consent among participants enrolled in a cohort study in Pune, India

    PubMed Central

    Joglekar, Neelam S.; Deshpande, Swapna S.; Sahay, Seema; Ghate, Manisha V.; Bollinger, Robert C.; Mehendale, Sanjay M.

    2013-01-01

    Background Optimum comprehension of informed consent by research participants is essential yet challenging. This study explored correlates of lower comprehension of informed consent among 1334 participants of a cohort study aimed at estimating HIV incidence in Pune, India. Methods As part of the informed consent process, a structured comprehension tool was administered to study participants. Participants scoring ≥90% were categorised into the ‘optimal comprehension group’, whilst those scoring 80–89% were categorised into the ‘lower comprehension group’. Data were analysed to identify sociodemographic and behavioural correlates of lower consent comprehension. Results The mean ± SD comprehension score was 94.4 ± 5.00%. Information pertaining to study-related risks was not comprehended by 61.7% of participants. HIV-negative men (adjusted OR [AOR] = 4.36, 95% CI 1.71–11.05) or HIV-negative women (AOR = 13.54, 95% CI 6.42–28.55), illiteracy (AOR = 1.65, 95% CI 1.19–2.30), those with a history of multiple partners (AOR = 1.73, 95% CI 1.12–2.66) and those never using condoms (AOR = 1.35, 95% CI 1.01–1.82) were more likely to have lower consent comprehension. Conclusions We recommend exploration of domains of lower consent comprehension using a validated consent comprehension tool. Improved education in these specific domains would optimise consent comprehension among research participants. PMID:24029848
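
    A minimal sketch of the kind of analysis behind the quoted AORs (synthetic data and hypothetical covariate names, not the study data):

        # Adjusted odds ratios for lower consent comprehension via logistic
        # regression with statsmodels.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 500
        illiterate = rng.integers(0, 2, n)
        never_condom = rng.integers(0, 2, n)
        logit = -1.5 + 0.5 * illiterate + 0.3 * never_condom
        lower_comp = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

        X = sm.add_constant(np.column_stack([illiterate, never_condom]))
        fit = sm.Logit(lower_comp, X).fit(disp=False)
        print("AORs:", np.exp(fit.params[1:]))  # odds ratios for both covariates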

  15. A comprehensive model to evaluate implementation of the world health organization framework convention of tobacco control

    PubMed Central

    Sarrafzadegan, Nizal; Kelishad, Roya; Rabiei, Katayoun; Abedi, Heidarali; Mohaseli, Khadijeh Fereydoun; Masooleh, Hasan Azaripour; Alavi, Mousa; Heidari, Gholamreza; Ghaffari, Mostafa; O’Loughlin, Jennifer

    2012-01-01

    Background: Iran is one of the countries that has ratified the World Health Organization Framework Convention on Tobacco Control (WHO-FCTC), and has implemented a series of tobacco control interventions including the Comprehensive Tobacco Control Law. Enforcement of this legislation and assessment of its outcome require a dedicated evaluation system. This study aimed to develop a generic model to evaluate the implementation of the Comprehensive Tobacco Control Law in Iran, based on the WHO-FCTC articles. Materials and Methods: Using a grounded theory approach, qualitative data were collected from 265 subjects in individual interviews and focus group discussions with policymakers who designed the legislation, key stakeholders, and members of the target community. In addition, field observation data were collected in supermarkets/shops, restaurants, teahouses and coffee shops. Data were analyzed in two stages through conceptual theoretical coding. Findings: Overall, 617 open codes were extracted from the data into tables; 72 level-3 codes were retained from the level-2 code series. Using a paradigm model, the relationships between the components of each paradigm were depicted graphically. The evaluation model entailed three levels, namely: short-term results, process evaluation and long-term results. Conclusions: The central concept of the evaluation process is that enforcing the law influences a variety of internal and environmental factors, including legislative changes. These factors are examined during the process evaluation and context evaluation. The current model can be applied to provide FCTC evaluation tools across other jurisdictions. PMID:23833621

  16. Screening for adolescents' internalizing symptoms in primary care: item response theory analysis of the behavior health screen depression, anxiety, and suicidal risk scales.

    PubMed

    Bevans, Katherine B; Diamond, Guy; Levy, Suzanne

    2012-05-01

    To apply a modern psychometric approach to validate the Behavioral Health Screen (BHS) Depression, Anxiety, and Suicidal Risk Scales among adolescents in primary care. Psychometric analyses were conducted using data collected from 426 adolescents aged 12 to 21 years (mean = 15.8, SD = 2.2). Rasch-Masters partial credit models were fit to the data to determine whether items supported the comprehensive measurement of internalizing symptoms with minimal gaps and redundancies. Scales were reduced to ensure that they measured singular dimensions of generalized anxiety, depressed affect, and suicidal risk both comprehensively and efficiently. Although gender bias was observed for some depression and anxiety items, differential item functioning did not impact overall subscale scores. Future revisions to the BHS should include additional items that assess low-level internalizing symptoms. The BHS is an accurate and efficient tool for identifying adolescents with internalizing symptoms in primary care settings. Access to psychometrically sound and cost-effective behavioral health screening tools is essential for meeting the increasing demands for adolescent behavioral health screening in primary/ambulatory care.
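
    A minimal sketch of Masters' partial credit model underlying the analysis (illustrative step difficulties):

        # P(X = k) for a polytomous item under the partial credit model, given
        # latent severity theta and step difficulties delta_1..delta_m.
        import numpy as np

        def pcm_probs(theta, deltas):
            # Cumulative sums of (theta - delta_j); the empty sum for k=0 is 0.
            psi = np.concatenate([[0.0], np.cumsum(theta - np.asarray(deltas))])
            expo = np.exp(psi - psi.max())  # stabilized softmax
            return expo / expo.sum()

        # A 4-category item (3 steps) at two severity levels:
        print(pcm_probs(-1.0, [-0.5, 0.2, 1.0]).round(3))
        print(pcm_probs(1.5, [-0.5, 0.2, 1.0]).round(3))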

  17. Measuring health literacy in populations: illuminating the design and development process of the European Health Literacy Survey Questionnaire (HLS-EU-Q)

    PubMed Central

    2013-01-01

    Background Several measurement tools have been developed to measure health literacy. The tools vary in their approach and design, but few have focused on comprehensive health literacy in populations. This paper describes the design and development of the European Health Literacy Survey Questionnaire (HLS-EU-Q), an innovative, comprehensive tool to measure health literacy in populations. Methods Based on a conceptual model and definition, the process involved item development, pre-testing, field-testing, external consultation, a plain language check, and translation from English to Bulgarian, Dutch, German, Greek, Polish, and Spanish. Results The development process resulted in the HLS-EU-Q, which entailed two sections, a core health literacy section and a section on determinants and outcomes associated with health literacy. The health literacy section included 47 items addressing self-reported difficulties in accessing, understanding, appraising and applying information in tasks concerning decision making in healthcare, disease prevention, and health promotion. The second section included items related to health behaviour, health status, health service use, community participation, and socio-demographic and socio-economic factors. Conclusions By illuminating the detailed steps in the design and development process of the HLS-EU-Q, the aim is to provide a deeper understanding of its purpose, its capabilities and its limitations for others using the tool. By stimulating wide application, the vision is that the HLS-EU-Q will be validated in more countries to enhance the understanding of health literacy in different populations. PMID:24112855

  18. MetaGenyo: a web tool for meta-analysis of genetic association studies.

    PubMed

    Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro

    2017-12-16

    Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible due to experimental design issues, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools for combining results across studies to increase statistical power and to resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool for conducting comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
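
    A minimal sketch of the pooling step such a workflow automates (synthetic studies, not MetaGenyo's code), with fixed- and random-effects estimates:

        # Inverse-variance pooling of per-study odds ratios with a
        # DerSimonian-Laird heterogeneity estimate.
        import numpy as np

        or_i = np.array([1.4, 1.1, 1.8, 0.9])      # per-study ORs (synthetic)
        se_i = np.array([0.20, 0.15, 0.30, 0.25])  # standard errors of log(OR)

        y, w = np.log(or_i), 1 / se_i**2
        y_fixed = (w * y).sum() / w.sum()          # fixed-effect pooled log OR
        q = (w * (y - y_fixed) ** 2).sum()         # Cochran's Q
        tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - (w**2).sum() / w.sum()))
        w_re = 1 / (se_i**2 + tau2)                # random-effects weights
        y_random = (w_re * y).sum() / w_re.sum()

        print(f"fixed OR {np.exp(y_fixed):.2f}, "
              f"random OR {np.exp(y_random):.2f}, tau2 {tau2:.3f}")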

  19. [The application of two occupation health risk assessment models in a wooden furniture manufacturing industry].

    PubMed

    Wang, A H; Leng, P B; Bian, G L; Li, X H; Mao, G C; Zhang, M B

    2016-10-20

    Objective: To explore the applicability of two different occupational health risk assessment models in the wooden furniture manufacturing industry. Methods: The American EPA inhalation risk model and the ICMM occupational health risk assessment model were each applied to assess occupational health risk in a small wooden furniture enterprise. Results: Protective measures and equipment against occupational disease in the plant were poor. The concentration of wood dust in the air of two workshops exceeded the occupational exposure limit (OEL), with C-TWA values of 8.9 mg/m³ and 3.6 mg/m³, respectively. According to the EPA model, workers in this plant exposed to benzene had a high risk (9.7×10⁻⁶ to 34.3×10⁻⁶) of leukemia, and those exposed to formaldehyde had a high risk (11.4×10⁻⁶) of squamous cell carcinoma. The two ICMM tools, the standard-based matrix and the calculated risk rating, gave inconsistent results: for workers exposed to wood dust, the risk of rhinocarcinoma was very high by the calculated risk rating but high by the standard-based matrix; for workers exposed to noise, the risk of noise-induced deafness was unacceptable and medium by the two tools, respectively. Conclusion: Both the EPA model and the ICMM model can appropriately predict and assess occupational health risk in wooden furniture manufacturing; the ICMM model, with its relatively simple operation, easily obtained evaluation parameters and comprehensive assessment of occupational disease risk factors, is more suitable for wooden furniture production enterprises.
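
    A minimal sketch of the EPA inhalation risk arithmetic the study applied (parameter values are illustrative, not the study's measurements; the benzene unit risk is the IRIS upper-bound value):

        # Lifetime excess cancer risk: EC = CA*ET*EF*ED/AT, risk = IUR * EC.
        CA = 100.0                    # benzene in air, ug/m^3 (hypothetical)
        ET, EF, ED = 8.0, 250.0, 25.0 # h/day, days/yr, years exposed
        AT = 70.0 * 365.0 * 24.0      # carcinogen averaging time, hours
        IUR = 7.8e-6                  # benzene unit risk per ug/m^3 (IRIS upper bound)

        EC = CA * ET * EF * ED / AT
        print(f"EC = {EC:.2f} ug/m^3, risk = {IUR * EC:.1e}")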

  20. Integration of PKPD relationships into benefit–risk analysis

    PubMed Central

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-01-01

    Aim Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods A comprehensive literature search was performed using MeSH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions Benefit–risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. PMID:25940398

  1. Integration of PKPD relationships into benefit-risk analysis.

    PubMed

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-11-01

    Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit-risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit-risk assessment. In addition, we propose the use of pharmacokinetic-pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. A comprehensive literature search was performed using MeSH terms in PubMed, in which articles describing benefit-risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit-risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit-risk balance before extensive evidence is generated in clinical practice. Benefit-risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. © 2015 The British Pharmacological Society.
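
    A minimal sketch of MCDA-style scoring (hypothetical criteria, weights and normalized scores):

        # Weighted-sum benefit-risk score: favourable effects add, unfavourable
        # effects subtract; all scores are normalized to [0, 1].
        criteria = {                      # name: (weight, score)
            "efficacy": (0.40, 0.75),
            "symptom relief": (0.20, 0.60),
            "serious adverse events": (0.25, 0.30),
            "tolerability": (0.15, 0.50),
        }
        benefits = {"efficacy", "symptom relief", "tolerability"}

        score = sum((w if name in benefits else -w) * s
                    for name, (w, s) in criteria.items())
        print(f"net benefit-risk score: {score:.3f}")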

  2. BiGG Models: A platform for integrating, standardizing and sharing genome-scale models

    DOE PAGES

    King, Zachary A.; Lu, Justin; Drager, Andreas; ...

    2015-10-17

    In this study, genome-scale metabolic models are mathematically structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data.

  3. BiGG Models: A platform for integrating, standardizing and sharing genome-scale models

    PubMed Central

    King, Zachary A.; Lu, Justin; Dräger, Andreas; Miller, Philip; Federowicz, Stephen; Lerman, Joshua A.; Ebrahim, Ali; Palsson, Bernhard O.; Lewis, Nathan E.

    2016-01-01

    Genome-scale metabolic models are mathematically-structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data. PMID:26476456
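
    A minimal sketch of querying the programming interface mentioned above (paths follow the site's /api/v2 scheme; the exact response fields are an assumption and may differ between releases):

        # Query the BiGG Models web API for the model list and one model's
        # metadata. Field names ("results", "organism", ...) are assumptions.
        import requests

        base = "http://bigg.ucsd.edu/api/v2"
        models = requests.get(f"{base}/models", timeout=30).json()
        print("models listed:", len(models.get("results", [])))

        core = requests.get(f"{base}/models/e_coli_core", timeout=30).json()
        print(core.get("organism"), "-", core.get("reaction_count"), "reactions")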

  4. Improved Analysis of Earth System Models and Observations using Simple Climate Models

    NASA Astrophysics Data System (ADS)

    Nadiga, B. T.; Urban, N. M.

    2016-12-01

    Earth system models (ESM) are the most comprehensive tools we have to study climate change and develop climate projections. However, the computational infrastructure required and the cost incurred in running such ESMs preclude direct use of such models in conjunction with a wide variety of tools that can further our understanding of climate. Here we are referring to tools that range from dynamical systems tools that give insight into underlying flow structure and topology, to tools from various applied mathematical and statistical techniques that are central to quantifying stability, sensitivity, uncertainty and predictability, to machine learning tools that are now being rapidly developed or improved. Our approach to facilitate the use of such models is to analyze the output of ESM experiments (cf. CMIP) using a range of simpler models that consider integral balances of important quantities such as mass and/or energy, in a Bayesian framework. We highlight the use of this approach in the context of the uptake of heat by the world oceans in the ongoing global warming. Indeed, since in excess of 90% of the anomalous radiative forcing due to greenhouse gas emissions is sequestered in the world oceans, the nature of ocean heat uptake crucially determines the surface warming that is realized (cf. climate sensitivity). Nevertheless, ESMs themselves are never run long enough to directly assess climate sensitivity. So, we consider a range of models based on integral balances--balances that have to be realized in all first-principles based models of the climate system, including the most detailed state-of-the-art climate simulations. The models range from simple models of energy balance to those that consider dynamically important ocean processes such as the conveyor-belt circulation (Meridional Overturning Circulation, MOC), North Atlantic Deep Water (NADW) formation, the Antarctic Circumpolar Current (ACC) and eddy mixing. Results from Bayesian analysis of such models using both ESM experiments and actual observations are presented. One such result points to the importance of direct sequestration of heat below 700 m, a process that is not allowed for in the simple models that have been traditionally used to deduce climate sensitivity.
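
    A minimal sketch of the simplest such integral-balance emulator, a one-box energy balance model (textbook form with illustrative parameters, not the authors' models):

        # One-box energy balance: C*dT/dt = F - lambda*T, run as an abrupt
        # CO2-doubling experiment. Parameters are illustrative.
        import numpy as np

        C = 8.0     # effective heat capacity, W yr m^-2 K^-1
        lam = 1.2   # climate feedback parameter, W m^-2 K^-1
        F2x = 3.7   # forcing from CO2 doubling, W m^-2

        dt, years = 1.0, 200
        T = np.zeros(years)
        for t in range(1, years):
            T[t] = T[t - 1] + dt * (F2x - lam * T[t - 1]) / C

        print(f"T(200 yr) = {T[-1]:.2f} K; equilibrium F2x/lam = {F2x / lam:.2f} K")

    Calibrating C and lam against ESM output, for example in a Bayesian framework, then becomes a parameter-estimation problem on this balance.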

  5. Graphical Modeling Meets Systems Pharmacology.

    PubMed

    Lombardo, Rosario; Priami, Corrado

    2017-01-01

    A main source of failures in systems projects (including systems pharmacology) is poor communication and differing expectations among the stakeholders. A common, unambiguous language that is naturally comprehensible by all the players involved is a boost to success. We present bStyle, a modeling tool that adopts a graphical language close enough to cartoons to be a common medium for exchanging ideas and data, and at the same time formal enough to enable modeling, analysis, and dynamic simulation of a system. Data analysis and simulation integrated in the same application are fundamental to understanding the mechanisms of action of drugs: a core aspect of systems pharmacology.

  6. Graphical Modeling Meets Systems Pharmacology

    PubMed Central

    Lombardo, Rosario; Priami, Corrado

    2017-01-01

    A main source of failures in systems projects (including systems pharmacology) is poor communication and differing expectations among the stakeholders. A common, unambiguous language that is naturally comprehensible by all the players involved is a boost to success. We present bStyle, a modeling tool that adopts a graphical language close enough to cartoons to be a common medium for exchanging ideas and data, and at the same time formal enough to enable modeling, analysis, and dynamic simulation of a system. Data analysis and simulation integrated in the same application are fundamental to understanding the mechanisms of action of drugs: a core aspect of systems pharmacology. PMID:28469411

  7. Cross-cultural adaptation and psychometric assessment of the Chinese version of the comprehensive needs assessment tool for cancer caregivers (CNAT-C).

    PubMed

    Zhang, Yin-Ping; Zhao, Xin-Shuang; Zhang, Bei; Zhang, Lu-Lu; Ni, Chun-Ping; Hao, Nan; Shi, Chang-Bei; Porr, Caroline

    2015-07-01

    The comprehensive needs assessment tool for cancer caregivers (CNAT-C) is a systematic and comprehensive instrument for assessing the needs of family caregivers of cancer patients. The purpose of this project was twofold: (1) to adapt the CNAT-C to Mainland China's cultural context and (2) to evaluate the psychometric properties of the newly adapted Chinese CNAT-C. Cross-cultural adaptation of the original CNAT-C was performed according to published guidelines. A pilot study was conducted in Mainland China with 30 Chinese family cancer caregivers. A subsequent validation study was conducted with 205 Chinese cancer caregivers from Mainland China. Construct validity was determined through exploratory and confirmatory factor analyses. Reliability was determined using internal consistency and test-retest reliability. The split-half coefficient for the overall Chinese CNAT-C scale was 0.77. Principal component analysis resulted in an eight-factor structure explaining 68.11% of the total variance. The comparative fit index (CFI) was 0.91 for the modified confirmatory factor analysis model. The chi-square divided by degrees of freedom was 1.98, and the root mean squared error of approximation (RMSEA) was 0.079. In relation to the known-group validation, significant differences were found in the Chinese CNAT-C scale according to various caregiver characteristics. Internal consistency was high for the Chinese CNAT-C, reaching a Cronbach α value of 0.94. Test-retest reliability was 0.85. The newly adapted Chinese CNAT-C scale possesses adequate validity, test-retest reliability, and internal consistency, and therefore may be used to ascertain the holistic health and support needs of family caregivers of cancer patients in Mainland China.
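
    A minimal sketch of the internal-consistency statistic reported above (synthetic item responses):

        # Cronbach's alpha over k item scores.
        import numpy as np

        rng = np.random.default_rng(2)
        common = rng.normal(size=(205, 1))                 # shared "needs" factor
        items = common + 0.8 * rng.normal(size=(205, 10))  # 10 correlated items

        k = items.shape[1]
        alpha = k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                               / items.sum(axis=1).var(ddof=1))
        print(f"Cronbach's alpha = {alpha:.2f}")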

  8. Does Use of Text-to-Speech and Related Read-Aloud Tools Improve Reading Comprehension for Students with Reading Disabilities? A Meta-Analysis

    ERIC Educational Resources Information Center

    Wood, Sarah G.; Moxley, Jerad H.; Tighe, Elizabeth L.; Wagner, Richard K.

    2018-01-01

    Text-to-speech and related read-aloud tools are being widely implemented in an attempt to assist students' reading comprehension skills. Read-aloud software, including text-to-speech, is used to translate written text into spoken text, enabling one to listen to written text while reading along. It is not clear how effective text-to-speech is at…

  9. A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery

    DTIC Science & Technology

    2014-10-20

    [Abstract not recoverable: only fragments of the tool's interface documentation survive in the source. Recoverable details: the tool plots data over attributes such as mouse Strain (AKR, B6, BALB_B) and MUP Protein state (Intact, Denatured). Report AFRL-RH-WP-TR-2014-0131; Distribution A, approved for public release.]

  10. The Comprehensive, Powerful, Academic Database (CPAD): An Evaluative Study of a Predictive Tool Designed for Elementary School Personnel in Identifying At-Risk Students through Progress, Curriculum, and Performance Monitoring

    ERIC Educational Resources Information Center

    Chavez-Gibson, Sarah

    2013-01-01

    The purpose of this study is to examine, in depth, the Comprehensive, Powerful, Academic Database (CPAD), a data-driven decision-making tool that identifies students at risk of dropping out of school, and how the CPAD assists administrators and teachers at an elementary campus in monitoring progress, curriculum, and performance to improve student…

  11. Towards a National Space Weather Predictive Capability

    NASA Astrophysics Data System (ADS)

    Fox, N. J.; Lindstrom, K. L.; Ryschkewitsch, M. G.; Anderson, B. J.; Gjerloev, J. W.; Merkin, V. G.; Kelly, M. A.; Miller, E. S.; Sitnov, M. I.; Ukhorskiy, A. Y.; Erlandson, R. E.; Barnes, R. J.; Paxton, L. J.; Sotirelis, T.; Stephens, G.; Comberiate, J.

    2014-12-01

    National needs in the area of space weather informational and predictive tools are growing rapidly. Adverse conditions in the space environment can cause disruption of satellite operations, communications, navigation, and electric power distribution grids, leading to a variety of socio-economic losses and impacts on our security. Future space exploration and most modern human endeavors will require major advances in physical understanding and improved transition of space research to operations. At present, only a small fraction of the latest research and development results from NASA, NOAA, NSF and DoD investments are being used to improve space weather forecasting and to develop operational tools. The power of modern research and space weather model development needs to be better utilized to enable comprehensive, timely, and accurate operational space weather tools. The mere production of space weather information is not sufficient to address the needs of those who are affected by space weather. A coordinated effort is required to support research-to-applications transition efforts and to develop the tools required by those who rely on this information. In this presentation we will review datasets, tools, and models that have resulted from research by scientists at JHU/APL, and examine how they could be applied to support space weather applications in coordination with other community assets and capabilities.

  12. Identification of fuel cycle simulator functionalities for analysis of transition to a new fuel cycle

    DOE PAGES

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.; ...

    2016-06-09

    Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.
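
    To make the proposed unit-test library concrete, here is a minimal sketch of the kind of essential-function test such a library might contain, checking conservation of mass across a separation step; `separate` is a hypothetical stand-in, not an API from any existing fuel cycle simulator.

```python
# Sketch of an essential-function unit test: mass must be conserved across
# a reprocessing/separation step. `separate` is a hypothetical routine.

def separate(feed_kg: float, recovery_fraction: float):
    """Split a feed stream into recovered product and waste."""
    product = feed_kg * recovery_fraction
    waste = feed_kg - product
    return product, waste

def test_mass_balance():
    feed = 1000.0                                   # kg heavy metal fed in
    product, waste = separate(feed, recovery_fraction=0.995)
    assert abs((product + waste) - feed) < 1e-9     # conservation of mass
    assert 0.0 <= product <= feed                   # physically sensible split

test_mass_balance()
print("mass-balance unit test passed")
```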

  13. Mammalian synthetic biology for studying the cell

    PubMed Central

    Mathur, Melina; Xiang, Joy S.

    2017-01-01

    Synthetic biology is advancing the design of genetic devices that enable the study of cellular and molecular biology in mammalian cells. These genetic devices use diverse regulatory mechanisms to both examine cellular processes and achieve precise and dynamic control of cellular phenotype. Synthetic biology tools provide novel functionality to complement the examination of natural cell systems, including engineered molecules with specific activities and model systems that mimic complex regulatory processes. Continued development of quantitative standards and computational tools will expand capacities to probe cellular mechanisms with genetic devices to achieve a more comprehensive understanding of the cell. In this study, we review synthetic biology tools that are being applied to effectively investigate diverse cellular processes, regulatory networks, and multicellular interactions. We also discuss current challenges and future developments in the field that may transform the types of investigation possible in cell biology. PMID:27932576

  14. Rapid Prototyping of Hydrologic Model Interfaces with IPython

    NASA Astrophysics Data System (ADS)

    Farthing, M. W.; Winters, K. D.; Ahmadia, A. J.; Hesser, T.; Howington, S. E.; Johnson, B. D.; Tate, J.; Kees, C. E.

    2014-12-01

    A significant gulf still exists between the state of practice and state of the art in hydrologic modeling. Part of this gulf is due to the lack of adequate pre- and post-processing tools for newly developed computational models. The development of user interfaces has traditionally lagged several years behind the development of a particular computational model or suite of models. As a result, models with mature interfaces often lack key advancements in model formulation, solution methods, and/or software design and technology. Part of the problem has been a focus on developing monolithic tools to provide comprehensive interfaces for the entire suite of model capabilities. Such efforts require expertise in software libraries and frameworks for creating user interfaces (e.g., Tcl/Tk, Qt, and MFC). These tools are complex and require significant investment in project resources (time and/or money) to use. Moreover, providing the required features for the entire range of possible applications and analyses creates a cumbersome interface. For a particular site or application, the modeling requirements may be simplified or at least narrowed, which can greatly reduce the number and complexity of options that need to be accessible to the user. However, monolithic tools usually are not adept at dynamically exposing specific workflows. Our approach is to deliver highly tailored interfaces to users. These interfaces may be site and/or process specific. As a result, we end up with many, customized interfaces rather than a single, general-use tool. For this approach to be successful, it must be efficient to create these tailored interfaces. We need technology for creating quality user interfaces that is accessible and has a low barrier for integration into model development efforts. Here, we present efforts to leverage IPython notebooks as tools for rapid prototyping of site and application-specific user interfaces. We provide specific examples from applications in near-shore environments as well as levee analysis. We discuss our design decisions and methodology for developing customized interfaces, strategies for delivery of the interfaces to users in various computing environments, as well as implications for the design/implementation of simulation models.
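
    As a sketch of the notebook-based prototyping pattern described above, the example below exposes a single parameter of a hypothetical model through an ipywidgets slider in a Jupyter/IPython environment; `run_model` and its stage-discharge placeholder relation are invented for illustration.

```python
# Minimal notebook-style interface sketch: one slider drives one model run.
import numpy as np
import matplotlib.pyplot as plt
from ipywidgets import interact, FloatSlider

def run_model(roughness: float):
    """Toy stage-discharge curve; a real interface would call the simulator."""
    discharge = np.linspace(1, 100, 200)
    stage = (roughness * discharge) ** 0.6   # placeholder relationship
    plt.plot(discharge, stage)
    plt.xlabel("discharge (m$^3$/s)")
    plt.ylabel("stage (m)")
    plt.show()

# One line turns the function into a tailored, site-specific interface.
interact(run_model, roughness=FloatSlider(min=0.01, max=0.1, step=0.005, value=0.03))
```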

  15. Modelling the exposure to chemicals for risk assessment: a comprehensive library of multimedia and PBPK models for integration, prediction, uncertainty and sensitivity analysis - the MERLIN-Expo tool.

    PubMed

    Ciffroy, P; Alfonso, B; Altenpohl, A; Banjac, Z; Bierkens, J; Brochot, C; Critto, A; De Wilde, T; Fait, G; Fierens, T; Garratt, J; Giubilato, E; Grange, E; Johansson, E; Radomyski, A; Reschwann, K; Suciu, N; Tanaka, T; Tediosi, A; Van Holderbeke, M; Verdonck, F

    2016-10-15

    MERLIN-Expo is a library of models that was developed within the FP7 EU project 4FUN in order to provide an integrated assessment tool for state-of-the-art exposure assessment for the environment, biota, and humans, allowing the detection of scientific uncertainties at each step of the exposure process. This paper describes the main features of the MERLIN-Expo tool. The main challenges in exposure modelling that MERLIN-Expo has tackled are: (i) the integration of multimedia (MM) models simulating the fate of chemicals in environmental media with physiologically based pharmacokinetic (PBPK) models simulating the fate of chemicals in the human body; MERLIN-Expo thus allows the determination of internal effective chemical concentrations; (ii) the incorporation of a set of functionalities for uncertainty/sensitivity analysis, from screening to variance-based approaches, intended to facilitate the incorporation of such issues in future decision making; (iii) the integration of human and wildlife biota targets with common fate modelling in the environment. MERLIN-Expo is composed of a library of fate models dedicated to non-biological receptor media (surface waters, soils, outdoor air), biological media of concern for humans (several cultivated crops, mammals, milk, fish), wildlife biota (primary producers in rivers, invertebrates, fish), and humans. These models can be linked together to create flexible scenarios relevant for both human and wildlife biota exposure. Standardized documentation for each model and training material were prepared to support accurate use of the tool by end-users. One of the objectives of the 4FUN project was also to increase confidence in the applicability of the MERLIN-Expo tool through targeted realistic case studies. In particular, we aimed at demonstrating the feasibility of building complex realistic exposure scenarios and the accuracy of the modelling predictions through a comparison with actual measurements.

  16. Entry one: striving for best practice in professional assessment.

    PubMed

    English, Justin M; Mykyta, Lu

    2002-01-01

    The aim of this study was to develop a best practice model of professional assessment to ensure efficient and effective delivery of home-based services to frail and disabled elders. In 2000, an innovative model of professional assessment was introduced by one of Australia's largest providers of home-based care in order to reduce multiple assessments and to reduce the utilisation of assessment as a gatekeeping tool for limiting access to services. Data were analysed from a random sample of 1500 clients drawn from a population of 5000, as well as from a survey tool administered to the Organisation's assessment staff and other key stakeholders. Results revealed that, contrary to popular belief, carer advocacy plays a significant role in the professional assessment process, to the point that clients with carers received significantly more services and service time than clients without such support. However, if not monitored, assessment can also be used as a gate-keeping tool as opposed to one that can provide significant benefits to consumers through comprehensive need articulation. We argue that the "professional" approach does not preclude empowerment and that assessment should not be used as a gate-keeping tool.

  17. Head-twitch response in rodents induced by the hallucinogen 2,5-dimethoxy-4-iodoamphetamine: a comprehensive history, a re-evaluation of mechanisms, and its utility as a model

    PubMed Central

    Canal, Clint E.; Morgan, Drake

    2013-01-01

    Two primary animal models persist for assessing hallucinogenic potential of novel compounds and for examining the pharmacological and neurobiological substrates underlying the actions of classical hallucinogens, the two-lever drug discrimination procedure and the drug-induced head-twitch response (HTR) in rodents. The substituted amphetamine hallucinogen, serotonin 2 (5-HT2) receptor agonist, 2,5-dimethoxy-4-iodoamphetamine (DOI) has emerged as the most popular pharmacological tool used in HTR studies of hallucinogens. Synthesizing classic, recent, and relatively overlooked findings, addressing ostensibly conflicting observations, and considering contemporary theories in receptor and behavioural pharmacology, this review provides an up-to-date and comprehensive synopsis of DOI and the HTR model, from neural mechanisms to utility for understanding psychiatric diseases. Also presented is support for the argument that, although both the two-lever drug discrimination and the HTR models in rodents are useful for uncovering receptors, interacting proteins, intracellular signalling pathways, and neurochemical processes affected by DOI and related classical hallucinogens, results from both models suggest they are not reporting hallucinogenic experiences in animals. PMID:22517680

  18. Methodology and application of combined watershed and ground-water models in Kansas

    USGS Publications Warehouse

    Sophocleous, M.; Perkins, S.P.

    2000-01-01

    Increased irrigation in Kansas and other regions during the last several decades has caused serious water depletion, making the development of comprehensive strategies and tools to resolve such problems increasingly important. This paper makes the case for an intermediate complexity, quasi-distributed, comprehensive, large-watershed model, which falls between the fully distributed, physically based hydrological modeling system of the type of the SHE model and the lumped, conceptual rainfall-runoff modeling system of the type of the Stanford watershed model. This is achieved by integrating the quasi-distributed watershed model SWAT with the fully-distributed ground-water model MODFLOW. The advantage of this approach is the appreciably smaller input data requirements and the use of readily available data (compared to the fully distributed, physically based models), the statistical handling of watershed heterogeneities by employing the hydrologic-response-unit concept, and the significantly increased flexibility in handling stream-aquifer interactions, distributed well withdrawals, and multiple land uses. The mechanics of integrating the component watershed and ground-water models are outlined, and three real-world management applications of the integrated model from Kansas are briefly presented. Three different aspects of the integrated model are emphasized: (1) management applications of a Decision Support System for the integrated model (Rattlesnake Creek subbasin); (2) alternative conceptual models of spatial heterogeneity related to the presence or absence of an underlying aquifer with shallow or deep water table (Lower Republican River basin); and (3) the general nature of the integrated model linkage by employing a watershed simulator other than SWAT (Wet Walnut Creek basin). These applications demonstrate the practicality and versatility of this relatively simple and conceptually clear approach, making public acceptance of the integrated watershed modeling system much easier. This approach also enhances model calibration and thus the reliability of model results.
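
    A minimal sketch, not the authors' code, of the per-time-step exchange such an integration implies: the watershed side produces recharge for each hydrologic response unit (HRU), and the groundwater side updates heads and returns baseflow. All functions and coefficients are hypothetical.

```python
# Toy coupling loop between an HRU-based watershed model and a
# head-based groundwater model, exchanging recharge and baseflow.

def watershed_step(rain_mm: float) -> dict:
    """Hypothetical SWAT-like step: recharge per HRU (mm)."""
    return {"hru_1": 0.2 * rain_mm, "hru_2": 0.35 * rain_mm}

def groundwater_step(recharge: dict, heads: dict) -> tuple:
    """Hypothetical MODFLOW-like step: update heads, return baseflow (mm)."""
    baseflow = 0.0
    for hru, r in recharge.items():
        heads[hru] += 0.001 * r          # crude storage update from recharge
        baseflow += 0.1 * heads[hru]     # head-dependent discharge to stream
    return heads, baseflow

heads = {"hru_1": 10.0, "hru_2": 12.0}
for day, rain in enumerate([5.0, 0.0, 12.0]):
    recharge = watershed_step(rain)
    heads, baseflow = groundwater_step(recharge, heads)
    print(f"day {day}: baseflow contribution {baseflow:.2f} mm")
```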

  19. Development and evaluation of a comprehensive clinical decision support taxonomy: comparison of front-end tools in commercial and internally developed electronic health record systems

    PubMed Central

    Sittig, Dean F; Ash, Joan S; Feblowitz, Joshua; Meltzer, Seth; McMullen, Carmit; Guappone, Ken; Carpenter, Jim; Richardson, Joshua; Simonaitis, Linas; Evans, R Scott; Nichol, W Paul; Middleton, Blackford

    2011-01-01

    Background Clinical decision support (CDS) is a valuable tool for improving healthcare quality and lowering costs. However, there is no comprehensive taxonomy of types of CDS and there has been limited research on the availability of various CDS tools across current electronic health record (EHR) systems. Objective To develop and validate a taxonomy of front-end CDS tools and to assess support for these tools in major commercial and internally developed EHRs. Study design and methods We used a modified Delphi approach with a panel of 11 decision support experts to develop a taxonomy of 53 front-end CDS tools. Based on this taxonomy, a survey on CDS tools was sent to a purposive sample of commercial EHR vendors (n=9) and leading healthcare institutions with internally developed state-of-the-art EHRs (n=4). Results Responses were received from all healthcare institutions and 7 of 9 EHR vendors (response rate: 85%). All 53 types of CDS tools identified in the taxonomy were found in at least one surveyed EHR system, but only 8 functions were present in all EHRs. Medication dosing support and order facilitators were the most commonly available classes of decision support, while expert systems (eg, diagnostic decision support, ventilator management suggestions) were the least common. Conclusion We developed and validated a comprehensive taxonomy of front-end CDS tools. A subsequent survey of commercial EHR vendors and leading healthcare institutions revealed a small core set of common CDS tools, but identified significant variability in the remainder of clinical decision support content. PMID:21415065

  20. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of the hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort and reduce opportunities for error. Processes automated so far include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
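
    A minimal sketch of the modular-assembly idea, assuming one interchangeable component per stage of the hydrologic cycle; the function names and formulas are illustrative placeholders, not the package's actual API.

```python
# Each stage is a swappable component with a shared numeric interface.

def pet_hamon(temp_c: float) -> float:
    """One of several interchangeable potential-ET options (mm/day)."""
    return max(0.0, 0.55 * (temp_c / 10.0))

def snow_degree_day(temp_c: float, swe_mm: float) -> tuple:
    """Degree-day melt: returns (melt, remaining snow water equivalent)."""
    melt = min(swe_mm, max(0.0, 3.0 * temp_c))
    return melt, swe_mm - melt

def soil_bucket(storage_mm, inflow_mm, pet_mm, capacity_mm=120.0):
    """Single-bucket soil moisture accounting: returns (runoff, new storage)."""
    storage = storage_mm + inflow_mm - min(pet_mm, storage_mm)
    runoff = max(0.0, storage - capacity_mm)
    return runoff, min(storage, capacity_mm)

# Assemble a model by choosing one component per stage, then step it.
swe, storage = 20.0, 100.0
for temp, precip in [(-2.0, 4.0), (3.0, 0.0), (8.0, 10.0)]:
    melt, swe = snow_degree_day(temp, swe)
    runoff, storage = soil_bucket(storage, precip + melt, pet_hamon(temp))
    print(f"runoff: {runoff:.1f} mm")
```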

  1. Innovative Stormwater Quality Tools by SARA for Holistic Watershed Master Planning

    NASA Astrophysics Data System (ADS)

    Thomas, S. M.; Su, Y. C.; Hummel, P. R.

    2016-12-01

    Stormwater management strategies such as Best Management Practices (BMP) and Low-Impact Development (LID) have increasingly gained attention in urban runoff control, becoming vital to holistic watershed master plans. These strategies can help address existing water quality impairments and support regulatory compliance, as well as guide planning and management of future development when substantial population growth and urbanization are projected to occur. However, past efforts have been limited to qualitative planning due to the lack of suitable tools for quantitative assessment. The San Antonio River Authority (SARA), with the assistance of Lockwood, Andrews & Newnam, Inc. (LAN) and AQUA TERRA Consultants (a division of RESPEC), developed comprehensive hydrodynamic and water quality models using the Hydrological Simulation Program-FORTRAN (HSPF) for several urban watersheds in the San Antonio River Basin. These models enabled watershed managers to look at water quality issues on a more refined temporal and spatial scale than the limited monitoring data allowed. They also provided a means to locate and quantify potential water quality impairments and evaluate the effects of mitigation measures. To support the models, a suite of software tools was developed, including: 1) the SARA Timeseries Utility Tool for managing and processing large model timeseries files, 2) the SARA Load Reduction Tool to determine the load reductions needed to achieve screening levels for each modeled constituent on a sub-basin basis, and 3) the SARA Enhanced BMP Tool to determine the optimal combination of BMP types and units needed to achieve the required load reductions. Using these SARA models and tools, water quality agencies and stormwater professionals can determine the optimal combinations of BMP/LID to accomplish their goals and save substantial stormwater infrastructure and management costs. The tools can also help regulators and permittees evaluate the feasibility of achieving compliance using BMP/LID. The project has gained national attention, being showcased in multiple newsletters, professional magazines, and conference presentations. The project also won the Texas American Council of Engineering Companies (ACEC) Gold Medal Award and the ACEC National Recognition Award in 2016.
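
    As an illustration of the load-reduction arithmetic a tool like the SARA Load Reduction Tool performs on a sub-basin basis, the sketch below compares modeled constituent loads against screening levels; the constituents, loads, and screening levels are invented for illustration.

```python
# Per-sub-basin reduction needed to bring each modeled constituent load
# down to its screening level. All values are illustrative.

screening_level = {"bacteria": 1000.0, "TSS": 250.0}   # allowable load units
modeled_load = {
    "subbasin_A": {"bacteria": 1800.0, "TSS": 200.0},
    "subbasin_B": {"bacteria": 900.0,  "TSS": 400.0},
}

for basin, loads in modeled_load.items():
    for constituent, load in loads.items():
        needed = max(0.0, load - screening_level[constituent])
        pct = 100.0 * needed / load
        print(f"{basin} {constituent}: reduce by {needed:.0f} ({pct:.0f}%)")
```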

  2. DengueTools: innovative tools and strategies for the surveillance and control of dengue.

    PubMed

    Wilder-Smith, Annelies; Renhorn, Karl-Erik; Tissera, Hasitha; Abu Bakar, Sazaly; Alphey, Luke; Kittayapong, Pattamaporn; Lindsay, Steve; Logan, James; Hatz, Christoph; Reiter, Paul; Rocklöv, Joacim; Byass, Peter; Louis, Valérie R; Tozan, Yesim; Massad, Eduardo; Tenorio, Antonio; Lagneau, Christophe; L'Ambert, Grégory; Brooks, David; Wegerdt, Johannah; Gubler, Duane

    2012-01-01

    Dengue fever is a mosquito-borne viral disease estimated to cause about 230 million infections worldwide every year, of which 25,000 are fatal. Global incidence has risen rapidly in recent decades with some 3.6 billion people, over half of the world's population, now at risk, mainly in urban centres of the tropics and subtropics. Demographic and societal changes, in particular urbanization, globalization, and increased international travel, are major contributors to the rise in incidence and geographic expansion of dengue infections. Major research gaps continue to hamper the control of dengue. The European Commission launched a call under the 7th Framework Programme with the title of 'Comprehensive control of Dengue fever under changing climatic conditions'. Fourteen partners from several countries in Europe, Asia, and South America formed a consortium named 'DengueTools' to respond to the call to achieve better diagnosis, surveillance, prevention, and predictive models and improve our understanding of the spread of dengue to previously uninfected regions (including Europe) in the context of globalization and climate change. The consortium comprises 12 work packages to address a set of research questions in three areas. Research area 1: Develop a comprehensive early warning and surveillance system that has predictive capability for epidemic dengue and benefits from novel tools for laboratory diagnosis and vector monitoring. Research area 2: Develop novel strategies to prevent dengue in children. Research area 3: Understand and predict the risk of global spread of dengue, in particular the risk of introduction and establishment in Europe, within the context of parameters of vectorial capacity, global mobility, and climate change. In this paper, we report on the rationale and specific study objectives of 'DengueTools'. DengueTools is funded under the Health theme of the Seventh Framework Programme of the European Community, Grant Agreement Number: 282589 Dengue Tools.

  3. Techno-economic analysis and decision making for PHEV benefits to society, consumers, policymakers and automakers

    NASA Astrophysics Data System (ADS)

    Al-Alawi, Baha Mohammed

    Plug-in hybrid electric vehicles (PHEVs) are an emerging automotive technology with the capability to reduce transportation environmental impacts, but at an increased production cost. PHEVs can draw and store energy from an electric grid and consequently show reductions in petroleum consumption, air emissions, ownership costs, regulation compliance costs, and various other externalities. Decision makers in the policy, consumer, and industry spheres would like to understand the impact of HEV and PHEV technologies on the U.S. vehicle fleet, but to date, only the disciplinary characteristics of PHEVs have been considered. The multidisciplinary tradeoffs between vehicle energy sources, policy requirements, market conditions, consumer preferences, and technology improvements are not well understood. For example, the results of recent studies have posited the importance of PHEVs to the future US vehicle fleet, yet no studies have considered the value of PHEVs to automakers and policy makers as a tool for achieving US corporate average fuel economy (CAFE) standards, which are planned to double by 2030. Previous studies have demonstrated the costs and benefits of PHEVs, but no study comprehensively accounts for the costs and benefits of PHEVs to consumers. The diffusion rate of hybrid electric vehicle (HEV) and PHEV technology into the marketplace has been estimated by existing studies using various tools and scenarios, but results show wide variations between studies. There is no comprehensive modeling study that combines policy, consumers, society, and automakers in the analysis of the costs and benefits of U.S. new vehicle sales. The aim of this research is to build a framework that can simulate and optimize the benefits of PHEVs for a multiplicity of stakeholders. This dissertation describes the results of modeling that integrates the effects of PHEV market penetration on the policy, consumer, and economic spheres. A model of fleet fuel economy and CAFE compliance for a large US automaker will be developed. A comprehensive total cost of ownership model will be constructed to calculate and compare the costs and benefits of PHEVs, conventional vehicles (CVs), and HEVs. A comprehensive literature review of PHEV penetration rate studies will then review and analyze the primary purposes, methods, and results of studies of PHEV market penetration. Finally, a multi-criteria modeling system will incorporate the results of the supporting models. In this project, the models, analysis, and results will provide a broader understanding of the benefits and costs of PHEV technology and the parties to whom those benefits accrue. The findings will provide important information for consumers, automakers, and policy makers to understand and define HEV and PHEV costs, benefits, expected penetration rates, and the preferred vehicle design and technology scenario to meet the requirements of policy, society, industry, and consumers.

  4. Development of modelling method selection tool for health services management: from problem structuring methods to modelling and simulation methods.

    PubMed

    Jun, Gyuchan T; Morris, Zoe; Eldabi, Tillal; Harper, Paul; Naseer, Aisha; Patel, Brijesh; Clarkson, John P

    2011-05-19

    There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies, and operations. However, current use is limited, and answers to questions such as which methods to use, and when, remain somewhat underdeveloped. The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them, or better commission relevant modelling and simulation work. This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise, and inputs from potential users. Twenty-eight different methods were identified and characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and the four input resources required (time, money, knowledge, and data). The characterisation is presented in matrix form to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills, let alone money and time, are scarce. A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision-making processes. In particular, it addresses the issues of which method is most appropriate to which specific health services management problem, what the user might expect to obtain from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on method comparison and selection.
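
    The matrix characterisation lends itself to simple filtering queries. The sketch below illustrates the idea with a three-method toy matrix; the entries are invented and do not reproduce the paper's characterisation of its 28 methods.

```python
# Toy comparison matrix: each method is tagged with a life-cycle stage and
# an input-resource level, and a query filters on both.

methods = {
    "discrete-event simulation": {"stage": "operation", "data": "high"},
    "system dynamics":           {"stage": "strategy",  "data": "low"},
    "soft systems methodology":  {"stage": "problem structuring", "data": "low"},
}

def select(stage: str, max_data: str) -> list:
    order = {"low": 0, "medium": 1, "high": 2}
    return [name for name, c in methods.items()
            if c["stage"] == stage and order[c["data"]] <= order[max_data]]

print(select(stage="strategy", max_data="medium"))   # ['system dynamics']
```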

  5. Power processing methodology. [computerized design of spacecraft electric power systems

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hansen, I. G.; Hayden, J. H.

    1974-01-01

    Discussion of the interim results of a program to investigate the feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems. The object of the total program is to develop a flexible engineering tool which will allow the power processor designer to effectively and rapidly assess and analyze the tradeoffs available by providing, in one comprehensive program, a mathematical model, an analysis of expected performance, simulation, and a comparative evaluation with alternative designs. This requires an understanding of electrical power source characteristics and the effects of load control, protection, and total system interaction.

  6. Petrophysical evaluation of subterranean formations

    DOEpatents

    Klein, James D; Schoderbek, David A; Mailloux, Jason M

    2013-05-28

    Methods and systems are provided for evaluating petrophysical properties of subterranean formations and comprehensively evaluating hydrate presence through a combination of computer-implemented log modeling and analysis. Certain embodiments include the steps of running a number of logging tools in a wellbore to obtain a variety of wellbore data and logs, and evaluating and modeling the log data to ascertain various petrophysical properties. Examples of suitable logging techniques that may be used in combination with the present invention include, but are not limited to, sonic logs, electrical resistivity logs, gamma ray logs, neutron porosity logs, density logs, NMR logs, or any combination or subset thereof.

  7. A Prototype for the Support of Integrated Software Process Development and Improvement

    NASA Astrophysics Data System (ADS)

    Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian

    An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process can result in efficiency. This paper therefore proposes a software process maintenance framework consisting of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.

  8. Tailored and Integrated Web-Based Tools for Improving Psychosocial Outcomes of Cancer Patients: The DoTTI Development Framework

    PubMed Central

    Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William

    2014-01-01

    Background Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. Objective The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. Methods The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. Results The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. Conclusions This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases. PMID:24641991

  9. Tailored and integrated Web-based tools for improving psychosocial outcomes of cancer patients: the DoTTI development framework.

    PubMed

    Smits, Rochelle; Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William

    2014-03-14

    Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases.

  10. Toward Identifying Needed Investments in Modeling and Simulation Tools for NEO Deflection Planning

    NASA Technical Reports Server (NTRS)

    Adams, Robert B.

    2009-01-01

    It's time: a) To bring planetary scientists, deflection system investigators, and vehicle designers together on the characterization/mitigation problem. b) To develop a comprehensive trade space of options. c) To trade options under a common set of assumptions and see what comparisons of effectiveness can be made. d) To explore the synergy to be had with proposed scientific and exploration architectures while interest in NEOs is at an all-time high.

  11. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability

    PubMed Central

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping

    2015-01-01

    Background: Biospecimens are essential resources for advancing basic and translational research. However, there are few data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. Methods: To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. Results: A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. Conclusion: A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, and allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting. PMID:26697911
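
    As an illustration of the cost-recovery step in such a cost profile, the sketch below derives a per-specimen fee from direct costs, an indirect-cost rate, and a target percent cost recovered; the formula and figures are simplifying assumptions for illustration, not BEMT's internal model.

```python
# Toy cost-recovery calculation: load indirect costs onto direct costs,
# apply a target recovery fraction, and spread over annual throughput.

def cost_recovery_fee(direct: float, indirect_rate: float,
                      specimens_per_year: int, percent_recovered: float) -> float:
    total_cost = direct * (1 + indirect_rate)       # direct + indirect costs
    recoverable = total_cost * percent_recovered    # e.g. 0.6 = 60% recovery
    return recoverable / specimens_per_year

fee = cost_recovery_fee(direct=250_000.0, indirect_rate=0.3,
                        specimens_per_year=4_000, percent_recovered=0.6)
print(f"fee per specimen: ${fee:.2f}")              # fee per specimen: $48.75
```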

  12. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability.

    PubMed

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping; Moore, Helen M

    2015-12-01

    Biospecimens are essential resources for advancing basic and translational research. However, there are few data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, and allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting.

  13. Comprehensive Design Reliability Activities for Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Whitley, M. R.; Knight, K. C.

    2000-01-01

    This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on propulsion system mechanical design reliability. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources are listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.

  14. Applying the Land Use Portfolio Model with Hazus to analyse risk from natural hazard events

    USGS Publications Warehouse

    Dinitz, Laura B.; Taketa, Richard A.

    2013-01-01

    This paper describes and demonstrates the integration of two geospatial decision-support systems for natural-hazard risk assessment and management. Hazus is a risk-assessment tool developed by the Federal Emergency Management Agency to identify risks and estimate the severity of risk from natural hazards. The Land Use Portfolio Model (LUPM) is a risk-management tool developed by the U.S. Geological Survey to evaluate plans or actions intended to reduce risk from natural hazards. We analysed three mitigation policies for one earthquake scenario in the San Francisco Bay area to demonstrate the added value of using Hazus and the LUPM together. The demonstration showed that Hazus loss estimates can be input to the LUPM to obtain estimates of losses avoided through mitigation, rates of return on mitigation investment, and measures of uncertainty. Together, they offer a more comprehensive approach to help with decisions for reducing risk from natural hazards.
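
    The two headline quantities, losses avoided and return on mitigation investment, reduce to simple arithmetic once Hazus-style loss estimates are available; the dollar figures below are invented purely for illustration.

```python
# Losses avoided and return on mitigation investment from scenario loss
# estimates. All dollar figures are illustrative placeholders.

baseline_loss = 120e6     # estimated scenario loss with no mitigation ($)
mitigated_loss = 85e6     # estimated loss after the mitigation policy ($)
mitigation_cost = 20e6    # up-front cost of the policy ($)

losses_avoided = baseline_loss - mitigated_loss
roi = (losses_avoided - mitigation_cost) / mitigation_cost

print(f"losses avoided: ${losses_avoided / 1e6:.0f}M")   # $35M
print(f"return on mitigation investment: {roi:.2f}")     # 0.75
```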

  15. TASI: A software tool for spatial-temporal quantification of tumor spheroid dynamics.

    PubMed

    Hou, Yue; Konen, Jessica; Brat, Daniel J; Marcus, Adam I; Cooper, Lee A D

    2018-05-08

    Spheroid cultures derived from explanted cancer specimens are an increasingly utilized resource for studying complex biological processes like tumor cell invasion and metastasis, representing an important bridge between the simplicity and practicality of 2-dimensional monolayer cultures and the complexity and realism of in vivo animal models. Temporal imaging of spheroids can capture the dynamics of cell behaviors and microenvironments, and when combined with quantitative image analysis methods, enables deep interrogation of biological mechanisms. This paper presents a comprehensive open-source software framework for Temporal Analysis of Spheroid Imaging (TASI) that allows investigators to objectively characterize spheroid growth and invasion dynamics. TASI performs spatiotemporal segmentation of spheroid cultures, extraction of features describing spheroid morpho-phenotypes, mathematical modeling of spheroid dynamics, and statistical comparisons of experimental conditions. We demonstrate the utility of this tool in an analysis of non-small cell lung cancer spheroids that exhibit variability in metastatic and proliferative behaviors.
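
    One analysis step named above, mathematical modeling of spheroid dynamics, can be illustrated by fitting a growth curve to a segmented spheroid's area over time; the sketch below uses simulated measurements and an assumed exponential form, not TASI code or data.

```python
# Fit an exponential growth model to a spheroid's segmented area over time.
import numpy as np
from scipy.optimize import curve_fit

def exp_growth(t, a0, r):
    return a0 * np.exp(r * t)

hours = np.array([0, 12, 24, 36, 48, 60], dtype=float)
area = np.array([1.00, 1.22, 1.49, 1.83, 2.24, 2.72]) * 1e4  # px^2 from segmentation

(a0, r), _ = curve_fit(exp_growth, hours, area, p0=(1e4, 0.01))
print(f"fitted growth rate: {r:.4f} per hour")   # ~0.0167/h (doubling ~41 h)
```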

  16. Interpreter of maladies: redescription mining applied to biomedical data analysis.

    PubMed

    Waltman, Peter; Pearlman, Alex; Mishra, Bud

    2006-04-01

    Comprehensive, systematic and integrated data-centric statistical approaches to disease modeling can provide powerful frameworks for understanding disease etiology. Here, one such computational framework based on redescription mining in both its incarnations, static and dynamic, is discussed. The static framework provides bioinformatic tools applicable to multifaceted datasets, containing genetic, transcriptomic, proteomic, and clinical data for diseased patients and normal subjects. The dynamic redescription framework provides systems biology tools to model complex sets of regulatory, metabolic and signaling pathways in the initiation and progression of a disease. As an example, the case of chronic fatigue syndrome (CFS) is considered, which has so far remained intractable and unpredictable in its etiology and nosology. The redescription mining approaches can be applied to the Centers for Disease Control and Prevention's Wichita (KS, USA) dataset, integrating transcriptomic, epidemiological and clinical data, and can also be used to study how pathways in the hypothalamic-pituitary-adrenal axis affect CFS patients.

  17. Multidisciplinary model-based-engineering for laser weapon systems: recent progress

    NASA Astrophysics Data System (ADS)

    Coy, Steve; Panthaki, Malcolm

    2013-09-01

    We are working to develop a comprehensive, integrated software framework and toolset to support model-based engineering (MBE) of laser weapons systems. MBE has been identified by the Office of the Director, Defense Science and Engineering as one of four potentially "game-changing" technologies that could bring about revolutionary advances across the entire DoD research and development and procurement cycle. To be effective, however, MBE requires robust underlying modeling and simulation technologies capable of modeling all the pertinent systems, subsystems, components, effects, and interactions at any level of fidelity that may be required in order to support crucial design decisions at any point in the system development lifecycle. Very often the greatest technical challenges are posed by systems involving interactions that cut across two or more distinct scientific or engineering domains; even in cases where there are excellent tools available for modeling each individual domain, generally none of these domain-specific tools can be used to model the cross-domain interactions. In the case of laser weapons systems R&D these tools need to be able to support modeling of systems involving combined interactions among structures, thermal, and optical effects, including both ray optics and wave optics, controls, atmospheric effects, target interaction, computational fluid dynamics, and spatiotemporal interactions between lasing light and the laser gain medium. To address this problem we are working to extend Comet™, to add the addition modeling and simulation capabilities required for this particular application area. In this paper we will describe our progress to date.

  18. A Comprehensive Look at Polypharmacy and Medication Screening Tools for the Older Cancer Patient

    PubMed Central

    DeGregory, Kathlene A.; Morris, Amy L.; Ramsdale, Erika E.

    2016-01-01

    Inappropriate medication use and polypharmacy are extremely common among older adults. Numerous studies have discussed the importance of a comprehensive medication assessment in the general geriatric population. However, only a handful of studies have evaluated inappropriate medication use in the geriatric oncology patient. Almost a dozen medication screening tools exist for the older adult. Each available tool has the potential to improve aspects of the care of older cancer patients, but no single tool has been developed for this population. We extensively reviewed the literature (MEDLINE, PubMed) to evaluate and summarize the most relevant medication screening tools for older patients with cancer. Findings of this review support the use of several screening tools concurrently for the elderly patient with cancer. A deprescribing tool should be developed and included in a comprehensive geriatric oncology assessment. Finally, prospective studies are needed to evaluate such a tool to determine its feasibility and impact in older patients with cancer. Implications for Practice: The prevalence of polypharmacy increases with advancing age. Older adults are more susceptible to adverse effects of medications. “Prescribing cascades” are common, whereas “deprescribing” remains uncommon; thus, older patients tend to accumulate medications over time. Older patients with cancer are at high risk for adverse drug events, in part because of the complexity and intensity of cancer treatment. Additionally, a cancer diagnosis often alters assessments of life expectancy, clinical status, and competing risk. Screening for polypharmacy and potentially inappropriate medications could reduce the risk for adverse drug events, enhance quality of life, and reduce health care spending for older cancer patients. PMID:27151653

  19. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms.

    PubMed

    Jaffe, Jacob D; Feeney, Caitlin M; Patel, Jinal; Lu, Xiaodong; Mani, D R

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
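
    A minimal sketch of the core idea on simulated data: a genetic algorithm searches over binary masks of fragment-ion transitions, scoring each mask by how well the summed comprehensive-mode intensities correlate across samples with the targeted-assay quantification. This illustrates the technique, not the authors' implementation.

```python
# GA over binary transition masks; fitness = correlation with targeted quant.
import numpy as np

rng = np.random.default_rng(1)
n_transitions, n_samples = 8, 30
true_quant = rng.lognormal(mean=2, sigma=0.5, size=n_samples)  # targeted reference
# Half the transitions track the reference; half are interfered/noisy.
clean = np.outer(rng.uniform(0.5, 1.5, 4), true_quant)
noisy = rng.lognormal(mean=2, sigma=0.8, size=(4, n_samples))
intensities = np.vstack([clean, noisy])

def fitness(mask: np.ndarray) -> float:
    if mask.sum() == 0:
        return -1.0
    summed = intensities[mask.astype(bool)].sum(axis=0)
    return np.corrcoef(summed, true_quant)[0, 1]   # agreement with targeted

pop = rng.integers(0, 2, size=(40, n_transitions))
for generation in range(60):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-20:]]                    # selection
    children = parents[rng.permutation(20)].copy()
    cut = rng.integers(1, n_transitions)
    children[:, cut:] = parents[rng.permutation(20)][:, cut:]  # crossover
    flip = rng.random(children.shape) < 0.05                   # mutation
    children = np.where(flip, 1 - children, children)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected transitions:", np.flatnonzero(best), "r =", round(fitness(best), 3))
```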

  20. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Jaffe, Jacob D.; Feeney, Caitlin M.; Patel, Jinal; Lu, Xiaodong; Mani, D. R.

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.

  1. Enhancing Literacy Skills of Students with Congenital and Profound Hearing Impairment in Nigeria Using Babudoh's Comprehension Therapy

    ERIC Educational Resources Information Center

    Babudoh, Gladys B.

    2014-01-01

    This study reports the effect of a treatment tool called "Babudoh's comprehension therapy" in enhancing the comprehension and writing skills of 10 junior secondary school students with congenital and profound hearing impairment in Plateau State, Nigeria. The study adopted the single group pretest-posttest quasi-experimental research…

  2. NDARC - NASA Design and Analysis of Rotorcraft Validation and Demonstration

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2010-01-01

    Validation and demonstration results from the development of the conceptual design tool NDARC (NASA Design and Analysis of Rotorcraft) are presented. The principal tasks of NDARC are to design a rotorcraft to satisfy specified design conditions and missions, and then analyze the performance of the aircraft for a set of off-design missions and point operating conditions. The aircraft chosen as NDARC development test cases are the UH-60A single main-rotor and tail-rotor helicopter, the CH-47D tandem helicopter, the XH-59A coaxial lift-offset helicopter, and the XV-15 tiltrotor. These aircraft were selected because flight performance data, a weight statement, detailed geometry information, and a correlated comprehensive analysis model are available for each. Validation consists of developing the NDARC models for these aircraft by using geometry and weight information, airframe wind tunnel test data, engine decks, rotor performance tests, and comprehensive analysis results; and then comparing the NDARC results for aircraft and component performance with flight test data. Based on the calibrated models, the capability of the code to size rotorcraft is explored.

  3. Minimally Disruptive Medicine: A Pragmatically Comprehensive Model for Delivering Care to Patients with Multiple Chronic Conditions

    PubMed Central

    Leppin, Aaron L.; Montori, Victor M.; Gionfriddo, Michael R.

    2015-01-01

    An increasing proportion of healthcare resources in the United States are directed toward an expanding group of complex and multimorbid patients. Federal stakeholders have called for new models of care to meet the needs of these patients. Minimally Disruptive Medicine (MDM) is a theory-based, patient-centered, and context-sensitive approach to care that focuses on achieving patient goals for life and health while imposing the smallest possible treatment burden on patients’ lives. The MDM Care Model is designed to be pragmatically comprehensive, meaning that it aims to address any and all factors that impact the implementation and effectiveness of care for patients with multiple chronic conditions. It comprises core activities that map to an underlying and testable theoretical framework. This encourages refinement and future study. Here, we present the conceptual rationale for and a practical approach to minimally disruptive care for patients with multiple chronic conditions. We introduce some of the specific tools and strategies that can be used to identify the right care for these patients and to put it into practice. PMID:27417747

  4. Social Network Analysis of Biomedical Research Collaboration Networks in a CTSA Institution

    PubMed Central

    Bian, Jiang; Xie, Mengjun; Topaloglu, Umit; Hudson, Teresa; Eswaran, Hari; Hogan, William

    2014-01-01

    BACKGROUND The popularity of social networks has triggered a number of research efforts on network analyses of research collaborations in the Clinical and Translational Science Award (CTSA) community. Those studies mainly focus on the general understanding of collaboration networks by measuring common network metrics. More fundamental questions about collaborations still remain unanswered, such as recognizing “influential” nodes and identifying potential new collaborations that are most rewarding. METHODS We analyzed biomedical research collaboration networks (RCNs) constructed from a dataset of research grants collected at a CTSA institution (i.e. University of Arkansas for Medical Sciences (UAMS)) in a comprehensive and systematic manner. First, our analysis covers the full spectrum of a RCN study: from network modeling to network characteristics measurement, from key nodes recognition to potential links (collaborations) suggestion. Second, our analysis employs non-conventional models and techniques including a weighted network model for representing collaboration strength, rank aggregation for detecting important nodes, and Random Walk with Restart (RWR) for suggesting new research collaborations. RESULTS By applying our models and techniques to RCNs at UAMS prior to and after the CTSA, we have gained valuable insights that not only reveal the temporal evolution of the network dynamics but also assess the effectiveness of the CTSA and its impact on a research institution. We find that collaboration networks at UAMS are not scale-free but small-world. Quantitative measures show that the RCNs at UAMS are moving towards favoring multidisciplinary research. Moreover, our link prediction model forms the basis of collaboration recommendations with impressive accuracy (AUC: 0.990, MAP@3: 1.48 and MAP@5: 1.522). Last but not least, an open-source visual analytical tool for RCNs is being developed and released through GitHub. CONCLUSIONS Through this study, we have developed a set of techniques and tools for analyzing research collaboration networks and conducted a comprehensive case study focusing on a CTSA institution. Our findings demonstrate the promising future of these techniques and tools in understanding the generative mechanisms of research collaborations and helping identify beneficial collaborations for members of the research community. PMID:24560679
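
    The Random Walk with Restart step lends itself to a compact sketch. The Python below is a generic implementation of RWR on a weighted adjacency matrix (illustrative data and names, not the authors' code): scores satisfy r = (1 - c)·W·r + c·e for a column-normalized W, and steady-state scores over current non-collaborators rank candidate new collaborations.

      import numpy as np

      def rwr(adj, seed, restart=0.3, tol=1e-10, max_iter=1000):
          """Random Walk with Restart scores from a seed node.

          adj: symmetric weighted adjacency matrix (collaboration strengths).
          """
          w = adj / adj.sum(axis=0, keepdims=True)   # column-stochastic transition matrix
          e = np.zeros(adj.shape[0]); e[seed] = 1.0  # restart distribution
          r = e.copy()
          for _ in range(max_iter):
              r_next = (1 - restart) * w @ r + restart * e
              if np.abs(r_next - r).sum() < tol:
                  break
              r = r_next
          return r

      # Toy 5-researcher network; suggest partners for node 0 among non-collaborators
      adj = np.array([[0, 2, 1, 0, 0],
                      [2, 0, 3, 1, 0],
                      [1, 3, 0, 0, 2],
                      [0, 1, 0, 0, 1],
                      [0, 0, 2, 1, 0]], dtype=float)
      scores = rwr(adj, seed=0)
      candidates = [j for j in range(5) if adj[0, j] == 0 and j != 0]
      print(sorted(candidates, key=lambda j: -scores[j]))  # ranked suggestions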

  5. Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application

    NASA Technical Reports Server (NTRS)

    Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond

    2018-01-01

    The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbofan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.

  6. Adapting Web content for low-literacy readers by using lexical elaboration and named entities labeling

    NASA Astrophysics Data System (ADS)

    Watanabe, W. M.; Candido, A.; Amâncio, M. A.; De Oliveira, M.; Pardo, T. A. S.; Fortes, R. P. M.; Aluísio, S. M.

    2010-12-01

    This paper presents an approach for assisting low-literacy readers in accessing Web online information. The "Educational FACILITA" tool is a Web content adaptation tool that provides innovative features and follows more intuitive interaction models regarding accessibility concerns. In particular, we propose an interaction model and a Web application that explore the natural language processing tasks of lexical elaboration and named entity labeling for improving Web accessibility. We report on the results obtained from a pilot study on usability analysis carried out with low-literacy users. The preliminary results show that "Educational FACILITA" improves the comprehension of text elements, although the assistance mechanisms might also confuse users when word sense ambiguity is introduced, by gathering, for a complex word, a list of synonyms with multiple meanings. This points to a future solution in which the correct sense of a complex word in a sentence is identified, addressing this pervasive characteristic of natural languages. The pilot study also identified that experienced computer users find the tool to be more useful than novice computer users do.

  7. On the road to a stronger public health workforce: visual tools to address complex challenges.

    PubMed

    Drehobl, Patricia; Stover, Beth H; Koo, Denise

    2014-11-01

    The public health workforce is vital to protecting the health and safety of the public, yet for years, state and local governmental public health agencies have reported substantial workforce losses and other challenges to the workforce that threaten the public's health. These challenges are complex, often involve multiple influencing or related causal factors, and demand comprehensive solutions. However, proposed solutions often focus on selected factors and might be fragmented rather than comprehensive. This paper describes approaches to characterizing the situation more comprehensively and includes two visual tools: (1) a fishbone, or Ishikawa, diagram that depicts multiple factors affecting the public health workforce; and (2) a roadmap that displays key elements (goals and strategies) to strengthen the public health workforce, thus moving from the problems depicted in the fishbone toward solutions. The visual tools aid thinking about ways to strengthen the public health workforce through collective solutions and to help leverage resources and build on each other's work. The strategic roadmap is intended to serve as a dynamic tool for partnership, prioritization, and gap assessment. These tools reflect and support CDC's commitment to working with partners on the highest priorities for strengthening the workforce to improve the public's health. Published by Elsevier Inc.

  8. Patient Education and Support During CKD Transitions: When the Possible Becomes Probable.

    PubMed

    Green, Jamie A; Boulware, L Ebony

    2016-07-01

    Patients transitioning from kidney disease to kidney failure require comprehensive patient-centered education and support. Efforts to prepare patients for this transition often fail to meet patients' needs due to uncertainty about which patients will progress to kidney failure, nonindividualized patient education programs, inadequate psychosocial support, or lack of assistance to guide patients through complex treatment plans. Resources are available to help overcome barriers to providing optimal care during this time, including prognostic tools, educational lesson plans, decision aids, communication skills training, peer support, and patient navigation programs. New models are being studied to comprehensively address patients' needs and improve the lives of kidney patients during this high-risk time. Copyright © 2016 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  9. Development of a new generation of high-resolution anatomical models for medical device evaluation: the Virtual Population 3.0

    NASA Astrophysics Data System (ADS)

    Gosselin, Marie-Christine; Neufeld, Esra; Moser, Heidi; Huber, Eveline; Farcito, Silvia; Gerber, Livia; Jedensjö, Maria; Hilber, Isabel; Di Gennaro, Fabienne; Lloyd, Bryn; Cherubini, Emilio; Szczerba, Dominik; Kainz, Wolfgang; Kuster, Niels

    2014-09-01

    The Virtual Family computational whole-body anatomical human models were originally developed for electromagnetic (EM) exposure evaluations, in particular to study how absorption of radiofrequency radiation from external sources depends on anatomy. However, the models immediately garnered much broader interest and are now applied by over 300 research groups, many from medical applications research fields. In a first step, the Virtual Family was expanded to the Virtual Population to provide considerably broader population coverage with the inclusion of models of both sexes ranging in age from 5 to 84 years old. Although these models have proven to be invaluable for EM dosimetry, it became evident that significantly enhanced models are needed for reliable effectiveness and safety evaluations of diagnostic and therapeutic applications, including medical implants safety. This paper describes the research and development performed to obtain anatomical models that meet the requirements necessary for medical implant safety assessment applications. These include implementation of quality control procedures, re-segmentation at higher resolution, more-consistent tissue assignments, enhanced surface processing and numerous anatomical refinements. Several tools were developed to enhance the functionality of the models, including discretization tools, posing tools to expand the posture space covered, and multiple morphing tools, e.g., to develop pathological models or variations of existing ones. A comprehensive tissue properties database was compiled to complement the library of models. The results are a set of anatomically independent, accurate, and detailed models with smooth, yet feature-rich and topologically conforming surfaces. The models are therefore suited for the creation of unstructured meshes, and the possible applications of the models are extended to a wider range of solvers and physics. The impact of these improvements is shown for the MRI exposure of an adult woman with an orthopedic spinal implant. Future developments include the functionalization of the models for specific physical and physiological modeling tasks.

  10. Introduction of blended learning in a master program: Developing an integrative mixed method evaluation framework.

    PubMed

    Chmiel, Aviva S; Shaha, Maya; Schneider, Daniel K

    2017-01-01

    The aim of this research is to develop a comprehensive evaluation framework involving all actors in a higher education blended learning (BL) program. BL evaluation usually focuses on students, faculty, technological or institutional aspects alone. Currently, no validated comprehensive monitoring tool exists that can support the introduction and further implementation of BL in a higher education context. Starting from established evaluation principles and standards, the concepts to be evaluated were first identified and grouped. In a second step, related BL evaluation tools referring to the student, faculty and institutional levels were selected. This allowed setting up and implementing an evaluation framework to monitor the introduction of BL during two successive runs of the program. The results of the evaluation documented strengths and weaknesses of the BL format in a comprehensive way, involving all actors, and led to improvements at program, faculty and course level. The evaluation process and the reporting of the results proved to be demanding in time and personal resources. The evaluation framework allows measuring the most significant dimensions influencing the success of a BL implementation at program level. However, this comprehensive evaluation is resource intensive. Further steps will be to refine the framework towards a sustainable and transferable BL monitoring tool that finds a balance between comprehensiveness and efficiency. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Understanding the Physical Optics Phenomena by Using a Digital Application for Light Propagation

    NASA Astrophysics Data System (ADS)

    Sierra-Sosa, Daniel-Esteban; Ángel-Toro, Luciano

    2011-01-01

    Understanding light propagation on the basis of the Huygens-Fresnel principle is fundamental to a deeper comprehension of different physical optics phenomena such as diffraction, self-imaging, image formation, Fourier analysis and spatial filtering. This constitutes the physical approach of Fourier optics, whose principles and applications have been developed since the 1950s. Both for analytical and digital application purposes, light propagation can be formulated in terms of the Fresnel Integral Transform. In this work, a digital optics application based on the implementation of the Discrete Fresnel Transform (DFT), and intended to serve as a tool for the didactics of optics, is presented. This tool allows, at a basic and intermediate learning level, exercising with the identification of basic phenomena, and observing changes associated with modifications of physical parameters. This is achieved by using a friendly graphic user interface (GUI). It also assists users in developing their capacity for abstracting and predicting the characteristics of more complicated phenomena. At an upper level of learning, the application can be used to favor a deeper comprehension of the physics and models involved, and to experiment with new models and configurations. To achieve this, two characteristics of the didactic tool were taken into account when designing it. First, all physical operations, ranging from simple diffraction experiments to digital holography and interferometry, were developed on the basis of the more fundamental concept of light propagation. Second, the algorithm was conceived to be easily upgradable due to its modular architecture based on the MATLAB® software environment. Typical results are presented and briefly discussed in connection with the didactics of optics.
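
    The propagation engine behind such a tool is compact enough to sketch. The authors' application is MATLAB-based; the following is a generic transfer-function Fresnel propagator in Python/NumPy (names and parameters are illustrative), with a square-aperture diffraction example:

      import numpy as np

      def fresnel_propagate(u0, wavelength, z, dx):
          """Propagate a sampled complex field u0 a distance z (transfer-function method)."""
          n = u0.shape[0]
          fx = np.fft.fftfreq(n, d=dx)                    # spatial frequency grid
          fxx, fyy = np.meshgrid(fx, fx)
          h = np.exp(-1j * np.pi * wavelength * z * (fxx**2 + fyy**2))  # Fresnel kernel
          return np.fft.ifft2(np.fft.fft2(u0) * h)

      # Example: diffraction of a 1 mm square aperture, 633 nm light, 0.5 m propagation
      n, dx = 512, 10e-6                                  # 512x512 grid, 10 um sampling
      x = (np.arange(n) - n // 2) * dx
      xx, yy = np.meshgrid(x, x)
      aperture = ((np.abs(xx) < 0.5e-3) & (np.abs(yy) < 0.5e-3)).astype(complex)
      field = fresnel_propagate(aperture, 633e-9, 0.5, dx)
      intensity = np.abs(field) ** 2                      # observable diffraction pattern
      print(intensity.max())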

  12. One multi-media environmental system with linkage between meteorology/ hydrology/ air quality models and water quality model

    NASA Astrophysics Data System (ADS)

    Tang, C.; Lynch, J. A.; Dennis, R. L.

    2016-12-01

    The biogeochemical processing of nitrogen and associated pollutants is driven by meteorological and hydrological processes in conjunction with pollutant loading. There are feedbacks between meteorology and hydrology that will be affected by land-use change and climate change. Changes in meteorology will affect pollutant deposition. It is important to account for those feedbacks and produce internally consistent simulations of meteorology, hydrology, and pollutant loading to drive the (watershed/water quality) biogeochemical models. In this study, the ecological response to emission reductions in streams in the Potomac watershed was evaluated. First, we simulated the deposition by using the fully coupled Weather Research & Forecasting (WRF) model and the Community Multiscale Air Quality (CMAQ) model; second, we created the hydrological data with the offline linked Variable Infiltration Capacity (VIC) model and the WRF model. Lastly, we investigated the water quality with one comprehensive environmental model, namely the linkage of CMAQ, WRF, VIC and the Model of Acidification of Groundwater In Catchments (MAGIC) from 2002 to 2010. The simulated results (such as NO3, SO4, and SBC) fit the observed values well. The linkage provides a generally accurate, well-tested tool for evaluating sensitivities to varying meteorology and environmental changes on acidification and other biogeochemical processes, with the capability to comprehensively explore strategic policy and management design.

  13. Financing Alternatives Comparison Tool

    EPA Pesticide Factsheets

    FACT is a financial analysis tool that helps identify the most cost-effective method to fund a wastewater or drinking water management project. It produces a comprehensive analysis that compares various financing options.

  14. Dynamic visualization techniques for high consequence software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-02-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.

  15. A procedural model for planning and evaluating behavioral interventions.

    PubMed

    Hyner, G C

    2005-01-01

    A model for planning, implementing and evaluating health behavior change strategies is proposed. Variables are presented which can be used in the model or serve as examples for how the model is utilized once a theory of health behavior is adopted. Examples of three innovative strategies designed to influence behavior change are presented so that the proposed model can be modified for use following comprehensive screening and baseline measurements. Three measurement priorities: clients, methods and agency are subjected to three phases of assessment: goals, implementation and effects. Lifestyles account for the majority of variability in quality-of-life and premature morbidity and mortality. Interventions designed to influence healthy behavior changes must be driven by theory and carefully planned and evaluated. The proposed model is offered as a useful tool for the behavior change strategist.

  16. Recognizing and exploring the right questions with climate data: An example of better understanding ENSO in climate projections

    NASA Astrophysics Data System (ADS)

    Ammann, C. M.; Brown, B.; Kalb, C. P.; Bullock, R.; Buja, L.; Gutowski, W. J., Jr.; Halley-Gotway, J.; Kaatz, L.; Yates, D. N.

    2017-12-01

    Coordinated, multi-model climate change projection archives have already led to a flourishing of new climate impact applications. Collections and online tools for the computation of derived indicators have attracted many non-specialist users and decision-makers and facilitated their exploration of potential future weather and climate changes on their systems. Guided by a set of standardized steps and analyses, many can now use model output and determine basic model-based changes. But because each application and decision context is different, the question remains whether such a small collection of standardized tools can faithfully and comprehensively represent the critical physical context of change. We use the example of the El Niño-Southern Oscillation, the largest and most broadly recognized mode of variability in the climate system, to explore the difference in impact contexts between a quasi-blind, protocol-bound use of climate information and a flexible, scientifically guided one. More use-oriented diagnostics of the model data, as well as different strategies for getting data into decision environments, are explored.

  17. MVIAeval: a web tool for comprehensively evaluating the performance of a new missing value imputation algorithm.

    PubMed

    Wu, Wei-Sheng; Jhou, Meng-Jhun

    2017-01-13

    Missing value imputation is important for microarray data analyses because microarray data with missing values would significantly degrade the performance of the downstream analyses. Although many microarray missing value imputation algorithms have been developed, an objective and comprehensive performance comparison framework is still lacking. To solve this problem, we previously proposed a framework which can perform a comprehensive performance comparison of different existing algorithms. The performance of a new algorithm can also be evaluated by this framework. However, constructing the framework is not an easy task for interested researchers. To save researchers' time and effort, here we present an easy-to-use web tool named MVIAeval (Missing Value Imputation Algorithm evaluator) which implements our performance comparison framework. MVIAeval provides a user-friendly interface allowing users to upload the R code of their new algorithm and to select (i) the test datasets among 20 benchmark microarray (time series and non-time series) datasets, (ii) the compared algorithms among 12 existing algorithms, (iii) the performance indices among three existing ones, (iv) the comprehensive performance scores from two possible choices, and (v) the number of simulation runs. The comprehensive performance comparison results are then generated and shown as both figures and tables. MVIAeval is a useful tool for researchers to easily conduct a comprehensive and objective performance evaluation of their newly developed missing value imputation algorithm for microarray data or any data which can be represented in matrix form (e.g. NGS data or proteomics data). Thus, MVIAeval will greatly expedite progress in the research of missing value imputation algorithms.
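
    The core loop of such a comparison framework can be sketched as follows (a generic outline in Python, not MVIAeval's actual code): entries of a complete data matrix are masked at random, the candidate algorithm imputes them, and a performance index such as normalized RMSE on the masked entries is averaged over simulation runs.

      import numpy as np

      rng = np.random.default_rng(1)

      def nrmse(imputed, truth, mask):
          """Normalized RMSE on the artificially missing entries only."""
          err = imputed[mask] - truth[mask]
          return np.sqrt(np.mean(err**2)) / np.std(truth[mask])

      def evaluate(impute_fn, data, missing_rate=0.1, runs=20):
          scores = []
          for _ in range(runs):
              mask = rng.random(data.shape) < missing_rate   # simulate missing entries
              corrupted = data.copy()
              corrupted[mask] = np.nan
              scores.append(nrmse(impute_fn(corrupted), data, mask))
          return float(np.mean(scores))

      def row_mean_impute(x):
          """Baseline competitor: replace NaNs with their row means."""
          filled = x.copy()
          row_means = np.nanmean(x, axis=1)
          idx = np.where(np.isnan(x))
          filled[idx] = row_means[idx[0]]
          return filled

      expression = rng.normal(size=(200, 30))               # toy 'microarray' matrix
      print("row-mean NRMSE:", evaluate(row_mean_impute, expression))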

  18. Use of a screening tool and primary health care gerontology nurse specialist for high-needs older people.

    PubMed

    King, Anna; Boyd, Michal; Dagley, Lynelle

    2017-02-01

    To describe implementation of an innovative gerontology nurse specialist role within one primary health organisation in Auckland, New Zealand. Quantitative outcomes of the screening tool as well as the nurse specialist assessment will be presented. The intervention involved use of the Brief Risk Identification for Geriatric Health Tool (BRIGHT) to identify high-needs older people, with subsequent comprehensive geriatric assessment (CGA) performed by the gerontology nurse specialist. A total of 384 of the 416 BRIGHTs were completed (92% response rate) and 15% of these were identified as high risk (n = 57). The BRIGHTs for high-risk older people revealed that the highest scoring question was 'needing help with housework' (26%). The most frequent intervention by the gerontology nurse specialist was education (30%). The primary health care gerontology nurse specialist model delivers proactive case finding and specialist gerontology intervention for older people at high risk of functional or health decline.

  19. Mammalian synthetic biology for studying the cell.

    PubMed

    Mathur, Melina; Xiang, Joy S; Smolke, Christina D

    2017-01-02

    Synthetic biology is advancing the design of genetic devices that enable the study of cellular and molecular biology in mammalian cells. These genetic devices use diverse regulatory mechanisms to both examine cellular processes and achieve precise and dynamic control of cellular phenotype. Synthetic biology tools provide novel functionality to complement the examination of natural cell systems, including engineered molecules with specific activities and model systems that mimic complex regulatory processes. Continued development of quantitative standards and computational tools will expand capacities to probe cellular mechanisms with genetic devices to achieve a more comprehensive understanding of the cell. In this study, we review synthetic biology tools that are being applied to effectively investigate diverse cellular processes, regulatory networks, and multicellular interactions. We also discuss current challenges and future developments in the field that may transform the types of investigation possible in cell biology. © 2017 Mathur et al.

  20. A web-based 3D visualisation and assessment system for urban precinct scenario modelling

    NASA Astrophysics Data System (ADS)

    Trubka, Roman; Glackin, Stephen; Lade, Oliver; Pettit, Chris

    2016-07-01

    Recent years have seen an increasing number of spatial tools and technologies for enabling better decision-making in the urban environment. They have largely arisen because of the need for cities to be more efficiently planned to accommodate growing populations while mitigating urban sprawl, and also because of innovations in rendering data in 3D being well suited for visualising the urban built environment. In this paper we review a number of systems that are better known and more commonly used in the field of urban planning. We then introduce Envision Scenario Planner (ESP), a web-based 3D precinct geodesign, visualisation and assessment tool, developed using Agile and Co-design methods. We provide a comprehensive account of the tool, beginning with a discussion of its design and development process and concluding with an example use case and a discussion of the lessons learned in its development.

  1. Cardiac resynchronization therapy (CRT) in heart failure--a model to assess the economic value of this new medical technology.

    PubMed

    Banz, Kurt

    2005-01-01

    This article describes the framework of a comprehensive European model developed to assess clinical and economic outcomes of cardiac resynchronization therapy (CRT) versus optimal pharmacological therapy (OPT) alone in patients with heart failure. The model structure is based on information obtained from the literature, expert opinion, and a European CRT Steering Committee. The decision-analysis tool allows a consideration of direct medical and indirect costs, and computes outcomes for distinct periods of time up to 5 years. Qualitative data can also be entered for cost-utility analysis. Model input data for a preliminary appraisal of the economic value of CRT in Germany were obtained from clinical trials, experts, health statistics, and medical tariff lists. The model offers comprehensive analysis capabilities and high flexibility so that it can easily be adapted to any European country or special setting. The illustrative analysis for Germany indicates that CRT is a cost-effective intervention. Although CRT is associated with average direct medical net costs of 5880 Euros per patient, 22% of its upfront implantation cost is already recouped within 1 year because of significantly decreased hospitalizations. At 36,600 Euros, the incremental cost per quality-adjusted life-year (QALY) gained is below the euro equivalent (41,300 Euros, at 1 Euro = US$1.21) of the commonly used threshold of US$50,000 considered to represent cost-effectiveness. The sensitivity analysis showed these preliminary results to be fairly robust to changes in key assumptions. The European CRT model is an important tool to assess the economic value of CRT in patients with moderate to severe heart failure. In the light of the planned introduction of Diagnosis Related Group (DRG) based reimbursement in various European countries, the economic data generated by the model can play an important role in the decision-making process.
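
    The headline figures imply a standard incremental cost-effectiveness calculation. A minimal sketch using the abstract's rounded numbers (the implied QALY gain is back-calculated here purely for illustration, not taken from the paper):

      # Incremental cost-effectiveness ratio (ICER) = incremental cost / incremental QALYs.
      # Figures below are the rounded values quoted in the abstract; the implied
      # QALY gain is back-calculated for illustration only.
      net_cost_eur = 5880.0            # direct medical net cost of CRT per patient
      icer_eur_per_qaly = 36600.0      # reported incremental cost per QALY gained
      qaly_gain = net_cost_eur / icer_eur_per_qaly
      threshold_eur = 41300.0          # euro equivalent of the US$50,000 threshold
      print(f"implied QALY gain: {qaly_gain:.2f}")
      print("cost-effective:", icer_eur_per_qaly < threshold_eur)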

  2. Comparison of Spatial Correlation Parameters between Full and Model Scale Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Kenny, Jeremy; Giacomoni, Clothilde

    2016-01-01

    The current vibro-acoustic analysis tools require specific spatial correlation parameters as input to define the liftoff acoustic environment experienced by the launch vehicle. Until recently these parameters have not been very well defined. A comprehensive set of spatial correlation data were obtained during a scale model acoustic test conducted in 2014. From these spatial correlation data, several parameters were calculated: the decay coefficient, the diffuse to propagating ratio, and the angle of incidence. Spatial correlation data were also collected on the EFT-1 flight of the Delta IV vehicle which launched on December 5th, 2014. A comparison of the spatial correlation parameters from full scale and model scale data will be presented.

  3. Introduction to multiresolution modeling (MRM) with an example involving precision fires

    NASA Astrophysics Data System (ADS)

    Davis, Paul K.; Bigelow, James H.

    1998-08-01

    In this paper we review motivations for multilevel resolution modeling (MRM) within a single model, an integrated hierarchical family of models, or both. We then present a new depiction of consistency criteria for models at different levels. After describing our hypotheses for studying the process of MRM with examples, we define a simple but policy-relevant problem involving the use of precision fires to halt an invading army. We then illustrate MRM with a sequence of abstractions suggested by formal theory, visual representation, and approximation. We milk the example for insights about why MRM is different and often difficult, and how it might be accomplished more routinely. It should be feasible even in complex systems such as JWARS and JSIMS, but it is by no means easy. Comprehensive MRM designs are unlikely. It is useful to take the view that some MRM is a great deal better than none and that approximate MRM relationships are often quite adequate. Overall, we conclude that high-quality MRM requires new theory, design practices, modeling tools, and software tools, all of which will take some years to develop. Current object-oriented programming practices may actually be a hindrance.

  4. Can Early Years Professionals Determine Which Preschoolers Have Comprehension Delays? A Comparison of Two Screening Tools

    ERIC Educational Resources Information Center

    Seager, Emily; Abbot-Smith, Kirsten

    2017-01-01

    Language comprehension delays in pre-schoolers are predictive of difficulties in a range of developmental domains. In England, early years practitioners are required to assess the language comprehension of 2-year-olds in their care. Many use a format based on the Early Years Foundation Stage Unique Child Communication Sheet (EYFS:UCCS) in which…

  5. Stakeholders' Perspectives towards the Use of the Comprehensive Health Assessment Program (CHAP) for Adults with Intellectual Disabilities in Manitoba

    ERIC Educational Resources Information Center

    Shooshtari, Shahin; Temple, Beverley; Waldman, Celeste; Abraham, Sneha; Ouellette-Kuntz, Héléne; Lennox, Nicholas

    2017-01-01

    Background: No standardized tool is used in Canada for comprehensive health assessments of adults with intellectual disabilities. This study was conducted to determine the feasibility of implementing the Comprehensive Health Assessment Program (CHAP) in Manitoba, Canada. Method: This was a qualitative study using a purposive sample of physicians,…

  6. A Comprehensive Automated 3D Approach for Building Extraction, Reconstruction, and Regularization from Airborne Laser Scanning Point Clouds

    PubMed Central

    Dorninger, Peter; Pfeifer, Norbert

    2008-01-01

    Three-dimensional city models are necessary for supporting numerous management applications. For the determination of city models for visualization purposes, several standardized workflows exist. They are based either on photogrammetry or on LiDAR, or on a combination of both data acquisition techniques. However, the automated determination of reliable and highly accurate city models is still a challenging task, requiring a workflow comprising several processing steps. The most relevant are building detection, building outline generation, building modeling, and finally, building quality analysis. Commercial software tools for building modeling generally require a high degree of human interaction, and most automated approaches described in the literature stress the steps of such a workflow individually. In this article, we propose a comprehensive approach for the automated determination of 3D city models from airborne acquired point cloud data. It is based on the assumption that individual buildings can be modeled properly by a composition of a set of planar faces. Hence, it is based on a reliable 3D segmentation algorithm, detecting planar faces in a point cloud. This segmentation is of crucial importance for the outline detection and for the modeling approach. We describe the theoretical background, the segmentation algorithm, the outline detection, and the modeling approach, and we present and discuss several actual projects. PMID:27873931
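
    The planar-face detection at the core of the approach can be illustrated with a minimal RANSAC-style plane detector (a generic sketch on toy data; the authors' segmentation algorithm is considerably more robust, and all names here are illustrative):

      import numpy as np

      rng = np.random.default_rng(2)

      def fit_plane(pts):
          """Least-squares plane through points: returns unit normal and centroid."""
          centroid = pts.mean(axis=0)
          _, _, vt = np.linalg.svd(pts - centroid)
          return vt[-1], centroid          # smallest singular vector = plane normal

      def ransac_plane(points, n_iter=200, threshold=0.05):
          best_inliers = np.zeros(len(points), dtype=bool)
          for _ in range(n_iter):
              sample = points[rng.choice(len(points), 3, replace=False)]
              normal, centroid = fit_plane(sample)
              dist = np.abs((points - centroid) @ normal)   # point-to-plane distance
              inliers = dist < threshold
              if inliers.sum() > best_inliers.sum():
                  best_inliers = inliers
          return fit_plane(points[best_inliers]), best_inliers

      # Toy roof patch: noisy points on a tilted plane, plus 50 outliers
      xy = rng.uniform(0, 10, (300, 2))
      z = 0.3 * xy[:, 0] + 0.1 * xy[:, 1] + rng.normal(0, 0.02, 300)
      cloud = np.vstack([np.column_stack([xy, z]),
                         rng.uniform(0, 10, (50, 3))])
      (_, _), inliers = ransac_plane(cloud)
      print("inliers found:", inliers.sum())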

  7. Engineering the fitness of older patients for chemotherapy: an exploration of Comprehensive Geriatric Assessment in practice.

    PubMed

    McCarthy, Alexandra L; Cook, Peta S; Yates, Patsy

    2014-03-01

    Clinicians often report that currently available methods to assess older patients, including standard clinical consultations, do not elicit the information necessary to make an appropriate cancer treatment recommendation for older cancer patients. An increasingly popular way of assessing the potential of older patients to cope with chemotherapy is a Comprehensive Geriatric Assessment. What constitutes Comprehensive Geriatric Assessment, however, is open to interpretation and varies from one setting to another. Furthermore, Comprehensive Geriatric Assessment's usefulness as a predictor of fitness for chemotherapy and as a determinant of actual treatment is not well understood. In this article, we analyse how Comprehensive Geriatric Assessment was developed for use in a large cancer service in an Australian capital city. Drawing upon Actor-Network Theory, our findings reveal how, during its development, Comprehensive Geriatric Assessment was made both a tool and a science. Furthermore, we briefly explore the tensions that we experienced as scholars who analyse medico-scientific practices and as practitioner-designers charged with improving the very tools we critique. Our study contributes towards geriatric oncology by scrutinising the medicalisation of ageing, unravelling the practices of standardisation and illuminating the multiplicity of 'fitness for chemotherapy'.

  8. Intervention Model for Contaminated Consumer Products: A Multifaceted Tool for Protecting Public Health

    PubMed Central

    Ahmed, Munerah; Nagin, Deborah; Clark, Nancy

    2014-01-01

    Lead-based paint and occupational lead hazards remain the primary exposure sources of lead in New York City (NYC) children and men, respectively. Lead poisoning has also been associated with the use of certain consumer products in NYC. The NYC Department of Health and Mental Hygiene developed the Intervention Model for Contaminated Consumer Products, a comprehensive approach to identify and reduce exposure to lead and other hazards in consumer products. The model identifies hazardous consumer products, determines their availability in NYC, takes enforcement action against these products, and provides risk communication and public education. Implementation of the model has resulted in the removal of thousands of contaminated products from local businesses and continues to raise awareness of these hazardous products. PMID:24922141

  9. Behavioural phenotyping assays for mouse models of autism

    PubMed Central

    Silverman, Jill L.; Yang, Mu; Lord, Catherine; Crawley, Jacqueline N.

    2011-01-01

    Autism is a heterogeneous neurodevelopmental disorder of unknown aetiology that affects 1 in 100–150 individuals. Diagnosis is based on three categories of behavioural criteria: abnormal social interactions, communication deficits and repetitive behaviours. Strong evidence for a genetic basis has prompted the development of mouse models with targeted mutations in candidate genes for autism. As the diagnostic criteria for autism are behavioural, phenotyping these mouse models requires behavioural assays with high relevance to each category of the diagnostic symptoms. Behavioural neuroscientists are generating a comprehensive set of assays for social interaction, communication and repetitive behaviours to test hypotheses about the causes of autism. Robust phenotypes in mouse models hold great promise as translational tools for discovering effective treatments for components of autism spectrum disorders. PMID:20559336

  10. Exploring model based engineering for large telescopes: getting started with descriptive models

    NASA Astrophysics Data System (ADS)

    Karban, R.; Zamparelli, M.; Bauvir, B.; Koehler, B.; Noethe, L.; Balestra, A.

    2008-07-01

    Large telescopes pose a continuous challenge to systems engineering due to their complexity in terms of requirements, operational modes, long duty lifetime, interfaces and number of components. A multitude of decisions must be taken throughout the life cycle of a new system, and a prime means of coping with complexity and uncertainty is using models as one decision aid. The potential of descriptive models based on the OMG Systems Modeling Language (OMG SysML™) is examined in different areas: building a comprehensive model serves as the basis for subsequent activities of soliciting and review for requirements, analysis and design alike. Furthermore a model is an effective communication instrument against misinterpretation pitfalls which are typical of cross disciplinary activities when using natural language only or free-format diagrams. Modeling the essential characteristics of the system, like interfaces, system structure and its behavior, are important system level issues which are addressed. Also shown is how to use a model as an analysis tool to describe the relationships among disturbances, opto-mechanical effects and control decisions and to refine the control use cases. Considerations on the scalability of the model structure and organization, its impact on the development process, the relation to document-centric structures, style and usage guidelines and the required tool chain are presented.

  11. SpacePy - a Python-based library of tools for the space sciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morley, Steven K; Welling, Daniel T; Koller, Josef

    Space science deals with the bodies within the solar system and the interplanetary medium; the primary focus is on atmospheres and above - at Earth the short timescale variation in the geomagnetic field, the Van Allen radiation belts and the deposition of energy into the upper atmosphere are key areas of investigation. SpacePy is a package for Python, targeted at the space sciences, that aims to make basic data analysis, modeling and visualization easier. It builds on the capabilities of the well-known NumPy and MatPlotLib packages. Publication quality output direct from analyses is emphasized. The SpacePy project seeks to promote accurate and open research standards by providing an open environment for code development. In the space physics community there has long been a significant reliance on proprietary languages that restrict free transfer of data and reproducibility of results. By providing a comprehensive, open-source library of widely used analysis and visualization tools in a free, modern and intuitive language, we hope that this reliance will be diminished. SpacePy includes implementations of widely used empirical models, statistical techniques used frequently in space science (e.g. superposed epoch analysis), and interfaces to advanced tools such as electron drift shell calculations for radiation belt studies. SpacePy also provides analysis and visualization tools for components of the Space Weather Modeling Framework - currently this only includes the BATS-R-US 3-D magnetohydrodynamic model and the RAM ring current model - including streamline tracing in vector fields. Further development is currently underway. External libraries, which include well-known magnetic field models, high-precision time conversions and coordinate transformations, are wrapped for access from Python using SWIG and f2py. The rest of the tools have been implemented directly in Python. The provision of open-source tools to perform common tasks will provide openness in the analysis methods employed in scientific studies and will give access to advanced tools to all space scientists regardless of affiliation or circumstance.
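
    One of the named techniques, superposed epoch analysis, is simple to express. The sketch below is a generic NumPy implementation (it deliberately avoids guessing SpacePy's actual API): windows of a uniformly sampled series are aligned on a list of event times and averaged across events.

      import numpy as np

      def superposed_epoch(series, event_indices, window):
          """Stack windows of `series` centred on each event and average them.

          series: 1-D array sampled at uniform cadence.
          event_indices: sample indices of the zero epochs.
          window: half-width of the epoch window, in samples.
          """
          segments = [series[i - window: i + window + 1]
                      for i in event_indices
                      if i - window >= 0 and i + window < len(series)]
          stack = np.vstack(segments)
          return stack.mean(axis=0), stack.std(axis=0)   # superposed mean and spread

      # Toy example: a recurring dip buried in noise
      signal = np.random.default_rng(3).normal(size=5000)
      events = np.arange(200, 4800, 300)
      for i in events:
          signal[i - 10: i + 10] -= 2.0                  # inject the epoch signature
      mean, spread = superposed_epoch(signal, events, window=50)
      print("dip recovered at zero epoch:", mean[40:60].mean() < -1.0)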

  12. A meta-model for computer executable dynamic clinical safety checklists.

    PubMed

    Nan, Shan; Van Gorp, Pieter; Lu, Xudong; Kaymak, Uzay; Korsten, Hendrikus; Vdovjak, Richard; Duan, Huilong

    2017-12-12

    A safety checklist is a cognitive tool that reinforces the short-term memory of medical workers, with the purpose of reducing medical errors caused by oversight and omission. To facilitate the daily use of safety checklists, computerized systems embedded in the clinical workflow and adapted to patient context are increasingly developed. However, the current hard-coded approach to implementing checklists in these systems increases the cognitive effort of clinical experts and the coding effort of informaticists. This is due to the lack of a formal representation format that is both understandable by clinical experts and executable by computer programs. We developed a dynamic checklist meta-model with a three-step approach. Dynamic checklist modeling requirements were extracted by performing a domain analysis. Then, existing modeling approaches and tools were investigated with the purpose of reusing these languages. Finally, the meta-model was developed by eliciting domain concepts and their hierarchies. The feasibility of using the meta-model was validated by two case studies. The meta-model was mapped to specific modeling languages according to the requirements of hospitals. Using the proposed meta-model, a comprehensive coronary artery bypass graft peri-operative checklist set and a percutaneous coronary intervention peri-operative checklist set have been developed in a Dutch hospital and a Chinese hospital, respectively. The results show that it is feasible to use the meta-model to facilitate the modeling and execution of dynamic checklists. We propose this novel meta-model to facilitate the creation of dynamic checklists: it is a framework for reusing existing modeling languages and tools, and its feasibility was further validated by implementing a use case in the system.
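
    A minimal illustration of what a computer-executable, context-adaptive checklist might look like is sketched below (purely illustrative Python class names and fields; the paper's meta-model is considerably richer): each item carries a condition over the patient context that decides whether it is instantiated.

      from dataclasses import dataclass, field
      from typing import Callable, List

      @dataclass
      class ChecklistItem:
          text: str
          # Condition over the patient context decides whether the item is shown
          applies: Callable[[dict], bool] = lambda ctx: True
          done: bool = False

      @dataclass
      class DynamicChecklist:
          name: str
          items: List[ChecklistItem] = field(default_factory=list)

          def active_items(self, context: dict) -> List[ChecklistItem]:
              """Instantiate the checklist for one patient context."""
              return [i for i in self.items if i.applies(context)]

      cabg = DynamicChecklist("CABG peri-operative", [
          ChecklistItem("Confirm patient identity"),
          ChecklistItem("Check insulin protocol",
                        applies=lambda ctx: ctx.get("diabetic", False)),
      ])
      print([i.text for i in cabg.active_items({"diabetic": False})])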

  13. A flexible object-oriented software framework for developing complex multimedia simulations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sydelko, P. J.; Dolph, J. E.; Christiansen, J. H.

    Decision makers involved in brownfields redevelopment and long-term stewardship must consider environmental conditions, future-use potential, site ownership, area infrastructure, funding resources, cost recovery, regulations, risk and liability management, community relations, and expected return on investment in a comprehensive and integrated fashion to achieve desired results. Successful brownfields redevelopment requires the ability to assess the impacts of redevelopment options on multiple interrelated aspects of the ecosystem, both natural and societal. Computer-based tools, such as simulation models, databases, and geographical information systems (GISs) can be used to address brownfields planning and project execution. The transparent integration of these tools into a comprehensive and dynamic decision support system would greatly enhance the brownfields assessment process. Such a system needs to be able to adapt to shifting and expanding analytical requirements and contexts. The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-oriented framework for developing and maintaining complex multidisciplinary simulations of a wide variety of application domains. The modeling domain of a specific DIAS-based simulation is determined by (1) software objects that represent the real-world entities that comprise the problem space (atmosphere, watershed, human), and (2) simulation models and other data processing applications that express the dynamic behaviors of the domain entities. Models and applications used to express dynamic behaviors can be either internal or external to DIAS, including existing legacy models written in various languages (FORTRAN, C, etc.). The flexible design framework of DIAS makes the objects adjustable to the context of the problem without a great deal of recoding. The DIAS Spatial Data Set facility allows parameters to vary spatially depending on the simulation context according to any of a number of 1-D, 2-D, or 3-D topologies. DIAS is also capable of interacting with other GIS packages and can import many standard spatial data formats. DIAS simulation capabilities can also be extended by including societal process models. Models that implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations allow for interaction and feedback among natural and societal processes. The ability to simulate the complex interplay of multimedia processes makes DIAS a promising tool for constructing applications for comprehensive community planning, including the assessment of multiple development and redevelopment scenarios.

  14. Earth-Science Data Co-Locating Tool

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Pan, Lei; Block, Gary L.

    2012-01-01

    This software is used to locate Earth-science satellite data and climate-model analysis outputs in space and time. This enables the direct comparison of any set of data with different spatial and temporal resolutions. It is written in three modules that are clearly separated by functionality and by their interfaces with the other modules, which enables fast development of support for any new data set. In this updated version of the tool, several new front ends have been developed for new products. This software finds co-locatable data pairs for given sets of data products and creates new data products that share the same spatial and temporal coordinates. This facilitates the direct comparison of two heterogeneous datasets and the comprehensive and synergistic use of the datasets.
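
    The matching step can be sketched as a space-time nearest-neighbour search. The Python below is a generic outline using SciPy's cKDTree with a crude equirectangular distance approximation; the tolerances, names, and data are illustrative, not the tool's actual modules.

      import numpy as np
      from scipy.spatial import cKDTree

      def colocate(obs_a, obs_b, max_km=50.0, max_hours=3.0):
          """Pair records from two datasets within space and time tolerances.

          obs_a, obs_b: arrays of (lat_deg, lon_deg, time_hours) rows.
          Uses an equirectangular approximation, adequate for small separations.
          """
          km_per_deg = 111.0
          # Scale each axis so one tolerance unit is comparable in every dimension
          def scaled(obs):
              return np.column_stack([obs[:, 0] * km_per_deg / max_km,
                                      obs[:, 1] * km_per_deg *
                                      np.cos(np.radians(obs[:, 0])) / max_km,
                                      obs[:, 2] / max_hours])
          tree = cKDTree(scaled(obs_b))
          dist, idx = tree.query(scaled(obs_a), distance_upper_bound=np.sqrt(3.0))
          return [(i, j) for i, (d, j) in enumerate(zip(dist, idx)) if np.isfinite(d)]

      a = np.array([[10.0, 20.0, 0.0], [40.0, -100.0, 5.0]])
      b = np.array([[10.1, 20.2, 1.0], [-5.0, 60.0, 2.0]])
      print(colocate(a, b))   # -> [(0, 0)]: only the first pair co-locates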

  15. A survey of program slicing for software engineering

    NASA Technical Reports Server (NTRS)

    Beck, Jon

    1993-01-01

    This research concerns program slicing, which is used as a tool for program maintenance of software systems. Program slicing decreases the level of effort required to understand and maintain complex software systems. It was first designed as a debugging aid, but it has since been generalized into various tools and extended to include program comprehension, module cohesion estimation, requirements verification, dead code elimination, and maintenance of several software systems, including reverse engineering, parallelization, portability, and reuse component generation. This paper seeks to address and define terminology, theoretical concepts, program representation, different program graphs, developments in static slicing, dynamic slicing, and semantics and mathematical models. Applications for conventional slicing are presented, along with a prognosis of future work in this field.
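
    The central idea is easy to demonstrate on straight-line code. The sketch below computes a backward static slice over a toy program represented as (assigned-variable, used-variables) pairs; this is an illustrative reduction, since practical slicers operate on control-flow and dependence graphs.

      def backward_slice(stmts, target):
          """stmts: list of (lhs, set_of_rhs_vars); returns statement indices in the slice."""
          needed, sl = {target}, set()
          for i in reversed(range(len(stmts))):
              lhs, uses = stmts[i]
              if lhs in needed:
                  sl.add(i)                        # statement affects the criterion
                  needed = (needed - {lhs}) | uses # trace its data dependences backward
          return sorted(sl)

      prog = [("a", set()),            # a = 1
              ("b", {"a"}),            # b = a + 1
              ("c", set()),            # c = 2
              ("d", {"b", "c"})]       # d = b * c
      print(backward_slice(prog, "d"))  # -> [0, 1, 2, 3]
      print(backward_slice(prog, "b"))  # -> [0, 1]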

  16. Virtual acoustic environments for comprehensive evaluation of model-based hearing devices.

    PubMed

    Grimm, Giso; Luberadzka, Joanna; Hohmann, Volker

    2018-06-01

    Create virtual acoustic environments (VAEs) with interactive dynamic rendering for applications in audiology. A toolbox for creation and rendering of dynamic virtual acoustic environments (TASCAR) that allows direct user interaction was developed for application in hearing aid research and audiology. The software architecture and the simulation methods used to produce VAEs are outlined. Example environments are described and analysed. With the proposed software, a tool for simulation of VAEs is available. A set of VAEs rendered with the proposed software was described.

  17. Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)

    NASA Astrophysics Data System (ADS)

    Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.

    2014-04-01

    A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimations are based on numerical model results, which provide an appropriate spatio-temporal framework of analysis to guarantee an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and its conceptual approaches as a comprehensive and practical management tool.

  18. International Neural Network Society Annual Meeting (1994) Held in San Diego, California on 5-9 June 1994. Volume 1

    DTIC Science & Technology

    1994-06-09

    Scanned table-of-contents fragments (only partially recoverable): "Ethics and the Soul" (P. Werbos); "A Net Program for Natural Language Comprehension" (J. Weiss); "Controlling Nonlinear Dynamic Systems Using Neuro-Fuzzy Networks" (E. Teixera, G. Laforga, H. Azevedo); "Neural Fuzzy Logics as a Tool for Design Ecological…"; "…Discrete Neural Network" (Z. Cheng-fu); "A Theory of Mathematical Modeling" (J. Cristofano); "An Ecological Approach to…".

  19. Comprehensive feedback on trainee surgeons’ non-technical skills

    PubMed Central

    Dieckmann, Peter; Beier-Holgersen, Randi; Rosenberg, Jacob; Oestergaard, Doris

    2015-01-01

    Objectives This study aimed to explore the content of conversations, feedback style, and perceived usefulness of feedback to trainee surgeons when conversations were stimulated by a tool for assessing surgeons' non-technical skills. Methods Trainee surgeons and their supervisors used the Non-Technical Skills for Surgeons in Denmark tool to stimulate feedback conversations. Audio recordings of post-operation feedback conversations were collected. Trainees and supervisors provided questionnaire responses on the usefulness and comprehensiveness of the feedback. The feedback conversations were qualitatively analyzed for content and feedback style. Usefulness was investigated using a scale from 1 to 5, and written comments were qualitatively analyzed. Results Six trainees and six supervisors participated in eight feedback conversations. Eighty questionnaires (response rate 83 percent) were collected from 13 trainees and 12 supervisors. Conversations lasted a median of eight (range 2-15) minutes. Supervisors used the elements and categories in the tool to structure the content of the conversations. Supervisors tended to talk about the trainees' actions and their own frames rather than attempting to understand the trainees' perceptions. Supervisors and trainees welcomed the feedback opportunity and agreed that the conversations were useful and comprehensive. Conclusions The content of the feedback conversations reflected the contents of the tool, and the feedback was considered useful and comprehensive. However, supervisors talked primarily about their own frames, so for the feedback to reach its full potential, supervisors may benefit from training in techniques that stimulate deeper reflection among trainees. PMID:25602262

  20. Multimedia Informed Consent Tool for a Low Literacy African Research Population: Development and Pilot-Testing.

    PubMed

    Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D'Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel

    2014-04-05

    International guidelines recommend the use of appropriate informed consent procedures in low literacy research settings because written information is not known to guarantee comprehension of study information. This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants' comprehension of the study information was measured by using a validated digitised audio questionnaire. The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation, and study procedures, while the lowest scores were recorded on the question items on randomisation. The differences in mean scores for participants' 'recall' and 'understanding' between the first and second visits were statistically significant (F(1,41) = 25.38, p < 0.00001 and F(1,41) = 31.61, p < 0.00001, respectively). Our locally developed multimedia tool was acceptable and easy to administer among low literacy participants in The Gambia. It also proved to be effective in delivering and sustaining comprehension of study information across a diverse group of participants. Additional research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings.

  1. A Tool for the Automated Design and Evaluation of Habitat Interior Layouts

    NASA Technical Reports Server (NTRS)

    Simon, Matthew A.; Wilhite, Alan W.

    2013-01-01

    The objective of space habitat design is to minimize mass and system size while providing adequate space for all necessary equipment and a functional layout that supports crew health and productivity. Unfortunately, development and evaluation of interior layouts is often ignored during conceptual design because of the subjectivity and long times required using current evaluation methods (e.g., human-in-the-loop mockup tests and in-depth CAD evaluations). Early, more objective assessment could prevent expensive design changes that may increase vehicle mass and compromise functionality. This paper describes a new interior design evaluation method to enable early, structured consideration of habitat interior layouts. This interior layout evaluation method features a comprehensive list of quantifiable habitat layout evaluation criteria, automatic methods to measure these criteria from a geometry model, and application of systems engineering tools and numerical methods to construct a multi-objective value function measuring the overall habitat layout performance. In addition to a detailed description of this method, a C++/OpenGL software tool which has been developed to implement this method is also discussed. This tool leverages geometry modeling coupled with collision detection techniques to identify favorable layouts subject to multiple constraints and objectives (e.g., minimize mass, maximize contiguous habitable volume, maximize task performance, and minimize crew safety risks). Finally, a few habitat layout evaluation examples are described to demonstrate the effectiveness of this method and tool to influence habitat design.
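
    As a rough illustration of the scoring idea described in this record (a weighted multi-objective value function gated by collision checks), the following Python sketch is offered; it is not the paper's C++/OpenGL implementation, and all objective names, weights and geometry are invented for illustration.

        # Hypothetical sketch, not the paper's tool: a weighted-sum value
        # function over normalized objectives, gated by an axis-aligned
        # bounding-box (AABB) collision check between equipment footprints.

        def boxes_overlap(a, b):
            # a, b = (xmin, ymin, xmax, ymax)
            return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

        def layout_value(boxes, scores, weights):
            # Infeasible layouts (colliding equipment volumes) score zero.
            if any(boxes_overlap(p, q)
                   for i, p in enumerate(boxes)
                   for q in boxes[i + 1:]):
                return 0.0
            return sum(weights[k] * scores[k] for k in weights)

        boxes = [(0.0, 0.0, 2.0, 1.0), (2.5, 0.0, 4.0, 1.0)]  # no overlap
        scores = {"habitable_volume": 0.8, "task_performance": 0.6, "safety": 0.9}
        weights = {"habitable_volume": 0.4, "task_performance": 0.3, "safety": 0.3}
        print(layout_value(boxes, scores, weights))  # 0.77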

  2. Simulating Activities: Relating Motives, Deliberation and Attentive Coordination

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Activities are located behaviors, taking time, conceived as socially meaningful, and usually involving interaction with tools and the environment. In modeling human cognition as a form of problem solving (goal-directed search and operator sequencing), cognitive science researchers have not adequately studied "off-task" activities (e.g., waiting), non-intellectual motives (e.g., hunger), sustaining a goal state (e.g., playful interaction), and coupled perceptual-motor dynamics (e.g., following someone). These aspects of human behavior have been considered in bits and pieces in past research, identified as scripts, human factors, behavior settings, ensemble, flow experience, and situated action. More broadly, activity theory provides a comprehensive framework relating motives, goals, and operations. This paper ties these ideas together, using examples from work life in a Canadian High Arctic research station. The emphasis is on simulating human behavior as it naturally occurs, such that "working" is understood as an aspect of living. The result is a synthesis of previously unrelated analytic perspectives and a broader appreciation of the nature of human cognition. Simulating activities in this comprehensive way is useful for understanding work practice, promoting learning, and designing better tools, including human-robot systems.

  3. The FaceBase Consortium: a comprehensive resource for craniofacial researchers

    PubMed Central

    Brinkley, James F.; Fisher, Shannon; Harris, Matthew P.; Holmes, Greg; Hooper, Joan E.; Wang Jabs, Ethylin; Jones, Kenneth L.; Kesselman, Carl; Klein, Ophir D.; Maas, Richard L.; Marazita, Mary L.; Selleri, Licia; Spritz, Richard A.; van Bakel, Harm; Visel, Axel; Williams, Trevor J.; Wysocka, Joanna

    2016-01-01

    The FaceBase Consortium, funded by the National Institute of Dental and Craniofacial Research, National Institutes of Health, is designed to accelerate understanding of craniofacial developmental biology by generating comprehensive data resources to empower the research community, exploring high-throughput technology, fostering new scientific collaborations among researchers and human/computer interactions, facilitating hypothesis-driven research and translating science into improved health care to benefit patients. The resources generated by the FaceBase projects include a number of dynamic imaging modalities, genome-wide association studies, software tools for analyzing human facial abnormalities, detailed phenotyping, anatomical and molecular atlases, global and specific gene expression patterns, and transcriptional profiling over the course of embryonic and postnatal development in animal models and humans. The integrated data visualization tools, faceted search infrastructure, and curation provided by the FaceBase Hub offer flexible and intuitive ways to interact with these multidisciplinary data. In parallel, the datasets also offer unique opportunities for new collaborations and training for researchers coming into the field of craniofacial studies. Here, we highlight the focus of each spoke project and the integration of datasets contributed by the spokes to facilitate craniofacial research. PMID:27287806

  4. Envelope analysis of rotating machine vibrations in variable speed conditions: A comprehensive treatment

    NASA Astrophysics Data System (ADS)

    Abboud, D.; Antoni, J.; Sieg-Zieba, S.; Eltabach, M.

    2017-02-01

    Nowadays, the vibration analysis of rotating machine signals is a well-established methodology, rooted in powerful tools offered, in particular, by the theory of cyclostationary (CS) processes. Among them, the squared envelope spectrum (SES) is probably the most popular for detecting random CS components, which are typical symptoms, for instance, of rolling element bearing faults. Recent research has shifted towards the extension of existing CS tools - originally devised for constant speed conditions - to the case of variable speed conditions. Many of these works combine the SES with computed order tracking after some preprocessing steps. The principal objective of this paper is to organize these dispersed researches into a structured comprehensive framework. Three original contributions are provided. First, a model of rotating machine signals is introduced which sheds light on the various components to be expected in the SES. Second, a critical comparison is made of three sophisticated methods, namely the improved synchronous average, the cepstrum prewhitening, and the generalized synchronous average, used for suppressing the deterministic part. Also, a general envelope enhancement methodology which combines the latter two techniques with a time-domain filtering operation is revisited. All theoretical findings are experimentally validated on simulated and real-world vibration signals.
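
    The SES computation at the core of this methodology can be sketched in a few lines; the following Python example uses the analytic signal from scipy and a simulated 60 Hz fault modulation, with all signal parameters chosen purely for illustration.

        # Squared envelope spectrum (SES) sketch; all parameters illustrative.
        import numpy as np
        from scipy.signal import hilbert

        fs = 20000                                   # sampling rate [Hz]
        t = np.arange(0, 1, 1 / fs)
        carrier = np.sin(2 * np.pi * 3000 * t)       # structural resonance
        impacts = 0.5 * (1 + np.sign(np.sin(2 * np.pi * 60 * t)))  # 60 Hz fault
        x = impacts * carrier + 0.1 * np.random.randn(t.size)

        env_sq = np.abs(hilbert(x)) ** 2             # squared envelope
        env_sq -= env_sq.mean()                      # drop the DC component
        ses = np.abs(np.fft.rfft(env_sq)) / t.size   # squared envelope spectrum
        freqs = np.fft.rfftfreq(t.size, 1 / fs)
        print(freqs[np.argmax(ses)])                 # peak at/near 60 Hz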

  5. Enhanced terahertz imaging system performance analysis and design tool for concealed weapon identification

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Franck, Charmaine C.; Espinola, Richard L.; Petkie, Douglas T.; De Lucia, Frank C.; Jacobs, Eddie L.

    2011-11-01

    The U.S. Army Research Laboratory (ARL) and the U.S. Army Night Vision and Electronic Sensors Directorate (NVESD) have developed a terahertz-band imaging system performance model/tool for detection and identification of concealed weaponry. The details of the MATLAB-based model which accounts for the effects of all critical sensor and display components, and for the effects of atmospheric attenuation, concealment material attenuation, and active illumination, were reported on at the 2005 SPIE Europe Security & Defence Symposium (Brugge). An advanced version of the base model that accounts for both the dramatic impact that target and background orientation can have on target observability as related to specular and Lambertian reflections captured by an active-illumination-based imaging system, and for the impact of target and background thermal emission, was reported on at the 2007 SPIE Defense and Security Symposium (Orlando). This paper will provide a comprehensive review of an enhanced, user-friendly, Windows-executable, terahertz-band imaging system performance analysis and design tool that now includes additional features such as a MODTRAN-based atmospheric attenuation calculator and advanced system architecture configuration inputs that allow for straightforward performance analysis of active or passive systems based on scanning (single- or line-array detector element(s)) or staring (focal-plane-array detector elements) imaging architectures. This newly enhanced THz imaging system design tool is an extension of the advanced THz imaging system performance model that was developed under the Defense Advanced Research Project Agency's (DARPA) Terahertz Imaging Focal-Plane Technology (TIFT) program. This paper will also provide example system component (active-illumination source and detector) trade-study analyses using the new features of this user-friendly THz imaging system performance analysis and design tool.
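
    To make the attenuation terms in such a performance model concrete, the following minimal Python sketch applies one-way Beer-Lambert attenuation through the atmosphere and a concealment layer; the absorption coefficients are placeholders, not MODTRAN outputs or values from the ARL/NVESD tool.

        # One-way Beer-Lambert attenuation; coefficients are placeholders.
        import math

        def transmitted_power(p0_w, alpha_atm, path_m, alpha_cloth, cloth_m):
            # alpha values are absorption coefficients in 1/m
            return p0_w * math.exp(-alpha_atm * path_m - alpha_cloth * cloth_m)

        # 1 W source, 20 m path, 2 mm clothing layer (illustrative values)
        print(transmitted_power(1.0, 0.05, 20.0, 60.0, 0.002))  # ~0.33 W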

  6. User Guide for the Financing Alternatives Comparison Tool

    EPA Pesticide Factsheets

    FACT is a financial analysis tool that helps identify the most cost-effective method to fund a wastewater or drinking water management project. It creates a comprehensive analysis that compares various financing options.

  7. Exposure Assessment Tools by Approaches - Exposure Reconstruction (Biomonitoring and Reverse Dosimetry)

    EPA Pesticide Factsheets

    This page provides access to a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  8. Comprehensive metabolomic profiling and incident cardiovascular disease: a systematic review

    USDA-ARS?s Scientific Manuscript database

    Background: Metabolomics is a promising tool of cardiovascular biomarker discovery. We systematically reviewed the literature on comprehensive metabolomic profiling in association with incident cardiovascular disease (CVD). Methods and Results: We searched MEDLINE and EMBASE from inception to Janua...

  9. The UEA sRNA Workbench (version 4.4): a comprehensive suite of tools for analyzing miRNAs and sRNAs.

    PubMed

    Stocks, Matthew B; Mohorianu, Irina; Beckers, Matthew; Paicu, Claudia; Moxon, Simon; Thody, Joshua; Dalmay, Tamas; Moulton, Vincent

    2018-05-02

    RNA interference, a highly conserved regulatory mechanism, is mediated via small RNAs. Recent technical advances have enabled the analysis of larger, more complex datasets and the investigation of microRNAs and the less well known small interfering RNAs. However, the size and intricacy of current data require a comprehensive set of tools able to discriminate patterns from low-level, noise-like variation; numerous and varied suggestions from the community represent an invaluable source of ideas for future tools, so the ability of the community to contribute to this software is essential. We present a new version of the UEA sRNA Workbench, reconfigured to allow easy insertion of new tools/workflows. In its released form, it comprises a suite of tools in a user-friendly environment, with enhanced capabilities for comprehensive processing of sRNA-seq data, e.g. tools for accurate prediction of sRNA loci (CoLIde) and miRNA loci (miRCat2), as well as workflows that guide users through the common first steps of sRNA-seq analyses, such as quality checking of the input data, normalization of abundances, and detection of differential expression. The UEA sRNA Workbench is available at: http://srna-workbench.cmp.uea.ac.uk The source code is available at: https://github.com/sRNAworkbenchuea/UEA_sRNA_Workbench. Contact: v.moulton@uea.ac.uk.

  10. Automated visualization of rule-based models

    PubMed Central

    Tapia, Jose-Juan; Faeder, James R.

    2017-01-01

    Frameworks such as BioNetGen, Kappa and Simmune use “reaction rules” to specify biochemical interactions compactly, where each rule specifies a mechanism such as binding or phosphorylation and its structural requirements. Current rule-based models of signaling pathways have tens to hundreds of rules, and these numbers are expected to increase as more molecule types and pathways are added. Visual representations are critical for conveying rule-based models, but current approaches to show rules and interactions between rules scale poorly with model size. Also, inferring design motifs that emerge from biochemical interactions is an open problem, so current approaches to visualize model architecture rely on manual interpretation of the model. Here, we present three new visualization tools that constitute an automated visualization framework for rule-based models: (i) a compact rule visualization that efficiently displays each rule, (ii) the atom-rule graph that conveys regulatory interactions in the model as a bipartite network, and (iii) a tunable compression pipeline that incorporates expert knowledge and produces compact diagrams of model architecture when applied to the atom-rule graph. The compressed graphs convey network motifs and architectural features useful for understanding both small and large rule-based models, as we show by application to specific examples. Our tools also produce more readable diagrams than current approaches, as we show by comparing visualizations of 27 published models using standard graph metrics. We provide an implementation in the open source and freely available BioNetGen framework, but the underlying methods are general and can be applied to rule-based models from the Kappa and Simmune frameworks also. We expect that these tools will promote communication and analysis of rule-based models and their eventual integration into comprehensive whole-cell models. PMID:29131816
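
    The atom-rule graph described above is, at its core, a bipartite network linking rules to the molecular features they read or produce. The following toy Python sketch (using networkx, with an invented two-rule pathway fragment) shows the data structure; it is not the BioNetGen implementation.

        # Toy atom-rule style bipartite graph; the pathway fragment is invented.
        import networkx as nx

        G = nx.DiGraph()
        G.add_nodes_from(["R1_bind", "R2_phos"], kind="rule")
        G.add_nodes_from(["L.R_bond", "R.Y~P"], kind="atom")
        G.add_edge("R1_bind", "L.R_bond")  # rule R1 produces the bond feature
        G.add_edge("L.R_bond", "R2_phos")  # the bond is context for rule R2
        G.add_edge("R2_phos", "R.Y~P")     # rule R2 sets the phospho-state

        rules = [n for n, d in G.nodes(data=True) if d["kind"] == "rule"]
        print(rules, G.number_of_edges())  # ['R1_bind', 'R2_phos'] 3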

  11. Integration and Validation of the Genome-Scale Metabolic Models of Pichia pastoris: A Comprehensive Update of Protein Glycosylation Pathways, Lipid and Energy Metabolism

    PubMed Central

    Tomàs-Gamisans, Màrius; Ferrer, Pau; Albiol, Joan

    2016-01-01

    Motivation: Genome-scale metabolic models (GEMs) are tools that allow predicting a phenotype from a genotype under certain environmental conditions. GEMs have been developed in the last ten years for a broad range of organisms, and are used for multiple purposes such as discovering new properties of metabolic networks, predicting new targets for metabolic engineering, and optimizing the cultivation conditions for biochemical or recombinant protein production. Pichia pastoris is one of the most widely used organisms for heterologous protein expression. There are different GEMs for this methylotrophic yeast, of which the most relevant and complete in the published literature are iPP668, PpaMBEL1254 and iLC915. However, these three models differ regarding certain pathways, terminology for metabolites and reactions, and annotations. Moreover, GEMs for some species are typically built based on the reconstructed models of related model organisms. In these cases, some organism-specific pathways could be missing or misrepresented. Results: In order to provide an updated and more comprehensive GEM for P. pastoris, we have reconstructed and validated a consensus model integrating and merging all three existing models. In this step a comprehensive review and integration of the metabolic pathways included in each of these three versions was performed. In addition, the resulting iMT1026 model includes a new description of some metabolic processes. In particular, new information described in recently published literature is included, mainly related to fatty acid and sphingolipid metabolism, glycosylation and cell energetics. Finally, the reconstructed model was tested and validated by comparing the results of simulations with available empirical physiological datasets obtained from a wide range of experimental conditions, such as different carbon sources and distinct oxygen availability conditions, as well as the production of two different recombinant proteins. In these simulations, the iMT1026 model showed better performance than the previous models. PMID:26812499

  12. Integration and Validation of the Genome-Scale Metabolic Models of Pichia pastoris: A Comprehensive Update of Protein Glycosylation Pathways, Lipid and Energy Metabolism.

    PubMed

    Tomàs-Gamisans, Màrius; Ferrer, Pau; Albiol, Joan

    2016-01-01

    Genome-scale metabolic models (GEMs) are tools that allow predicting a phenotype from a genotype under certain environmental conditions. GEMs have been developed in the last ten years for a broad range of organisms, and are used for multiple purposes such as discovering new properties of metabolic networks, predicting new targets for metabolic engineering, and optimizing the cultivation conditions for biochemical or recombinant protein production. Pichia pastoris is one of the most widely used organisms for heterologous protein expression. There are different GEMs for this methylotrophic yeast, of which the most relevant and complete in the published literature are iPP668, PpaMBEL1254 and iLC915. However, these three models differ regarding certain pathways, terminology for metabolites and reactions, and annotations. Moreover, GEMs for some species are typically built based on the reconstructed models of related model organisms. In these cases, some organism-specific pathways could be missing or misrepresented. In order to provide an updated and more comprehensive GEM for P. pastoris, we have reconstructed and validated a consensus model integrating and merging all three existing models. In this step a comprehensive review and integration of the metabolic pathways included in each of these three versions was performed. In addition, the resulting iMT1026 model includes a new description of some metabolic processes. In particular, new information described in recently published literature is included, mainly related to fatty acid and sphingolipid metabolism, glycosylation and cell energetics. Finally, the reconstructed model was tested and validated by comparing the results of simulations with available empirical physiological datasets obtained from a wide range of experimental conditions, such as different carbon sources and distinct oxygen availability conditions, as well as the production of two different recombinant proteins. In these simulations, the iMT1026 model showed better performance than the previous models.
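
    For readers unfamiliar with how such GEMs are used in practice, the following is a generic constraint-based simulation sketch in Python with COBRApy; it is not tied to iMT1026, and the SBML filename and exchange-reaction IDs are assumptions.

        # Generic FBA sketch with COBRApy; "model.xml" and the exchange
        # reaction IDs (BiGG-style) are assumptions, not part of this record.
        import cobra

        model = cobra.io.read_sbml_model("model.xml")
        model.reactions.get_by_id("EX_glc__D_e").lower_bound = -10.0  # glucose
        model.reactions.get_by_id("EX_o2_e").lower_bound = -15.0      # oxygen

        solution = model.optimize()        # flux balance analysis
        print(solution.objective_value)    # e.g., predicted growth rate [1/h]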

  13. Tracing catchment fine sediment sources using the new SIFT (SedIment Fingerprinting Tool) open source software.

    PubMed

    Pulley, S; Collins, A L

    2018-09-01

    The mitigation of diffuse sediment pollution requires reliable provenance information so that measures can be targeted. Sediment source fingerprinting represents one approach for supporting these needs, but recent methodological developments have resulted in an increasing complexity of data processing methods, rendering the approach less accessible to non-specialists. A comprehensive new software programme (SIFT; SedIment Fingerprinting Tool) has therefore been developed which guides the user through critical data analysis decisions and automates all calculations. Multiple source group configurations and composite fingerprints are identified and tested using multiple methods of uncertainty analysis. This aims to explore the sediment provenance information provided by the tracers more comprehensively than a single model, and allows model configurations with high uncertainties to be rejected. This paper provides an overview of its application to an agricultural catchment in the UK to determine whether the approach used can provide a reduction in uncertainty and an increase in precision. Five source group classifications were used: three formed using a k-means cluster analysis containing 2, 3 and 4 clusters, and two a priori groups based upon catchment geology. Three different composite fingerprints were used for each classification, and bi-plots, range tests, tracer variability ratios and virtual mixtures tested the reliability of each model configuration. Some model configurations performed poorly when apportioning the composition of virtual mixtures, and different model configurations could produce different sediment provenance results despite using composite fingerprints able to discriminate robustly between the source groups. Despite this uncertainty, dominant sediment sources were identified, and those in close proximity to each sediment sampling location were found to be of greatest importance. This new software, by integrating recent methodological developments in tracer data processing, guides users through key steps. Critically, by applying multiple model configurations and uncertainty assessment, it delivers more robust solutions for informing catchment management of the sediment problem than many previously used approaches. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
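
    The un-mixing step at the heart of sediment fingerprinting can be illustrated with a small constrained least-squares example; the tracer values below are invented, and SIFT itself layers source-group selection and uncertainty analysis on top of this core calculation.

        # Un-mixing sketch: estimate source proportions from tracers by
        # non-negative least squares, then renormalize to sum to one.
        import numpy as np
        from scipy.optimize import nnls

        sources = np.array([[12.0, 3.0],     # rows = tracers, cols = sources
                            [ 5.0, 9.0],
                            [ 1.0, 4.0]])
        mixture = np.array([8.4, 6.6, 2.2])  # tracer values in the sample

        props, _ = nnls(sources, mixture)
        props /= props.sum()                 # enforce the sum-to-one constraint
        print(props)                         # [0.6, 0.4] for this toy data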

  14. Zebrafish as tools for drug discovery.

    PubMed

    MacRae, Calum A; Peterson, Randall T

    2015-10-01

    The zebrafish has become a prominent vertebrate model for disease and has already contributed to several examples of successful phenotype-based drug discovery. For the zebrafish to become useful in drug development more broadly, key hurdles must be overcome, including a more comprehensive elucidation of the similarities and differences between human and zebrafish biology. Recent studies have begun to establish the capabilities and limitations of zebrafish for disease modelling, drug screening, target identification, pharmacology, and toxicology. As our understanding increases and as the technologies for manipulating zebrafish improve, it is hoped that the zebrafish will have a key role in accelerating the emergence of precision medicine.

  15. NASA's Aviation Safety and Modeling Project

    NASA Technical Reports Server (NTRS)

    Chidester, Thomas R.; Statler, Irving C.

    2006-01-01

    The Aviation Safety Monitoring and Modeling (ASMM) Project of NASA's Aviation Safety program is cultivating sources of data and developing automated computer hardware and software to facilitate efficient, comprehensive, and accurate analyses of the data collected from large, heterogeneous databases throughout the national aviation system. The ASMM addresses the need to provide means for increasing safety by enabling the identification and correction of predisposing conditions that could lead to accidents or to incidents that pose aviation risks. A major component of the ASMM Project is the Aviation Performance Measuring System (APMS), which is developing the next generation of software tools for analyzing and interpreting flight data.

  16. [Adaptations of psychotropic drugs in patients aged 75 years and older in a department of geriatric internal medicine: report of 100 cases].

    PubMed

    Couderc, Anne-Laure; Bailly-Agaledes, Cindy; Camalet, Joëlle; Capriz-Ribière, Françoise; Gary, André; Robert, Philippe; Brocker, Patrice; Guérin, Olivier

    2011-06-01

    The elderly, who often have multiple diseases, are particularly at risk of adverse drug reactions. Nearly half of iatrogenic drug events in the elderly are preventable. Some medications, such as psychotropic drugs, are particularly involved in iatrogenic accidents. We wanted to know whether the tools of the comprehensive geriatric assessment, or other factors, could influence changes of psychotropic drugs in a geriatric department. Our four-month prospective study of 100 patients aged 75 years and older hospitalized in the Geriatric Internal Medicine Department of the University Hospital of Nice investigated the clinical or biological reasons for, and the tools used during, changes of psychotropic drugs. We compared these changes according to the comprehensive geriatric assessment tools, and we analyzed the changes against the list of potentially inappropriate medications by Laroche et al. and the STOPP/START instrument. The Mini Mental State Examination (MMSE) was the tool that most influenced the changes in psychotropics, including a tendency to increase psychotropics and to introduce anxiolytics when MMSE < 20 (p = 0.007), while neuroleptics were instead stopped or decreased (p = 0.012). The comprehensive geriatric assessment has its place in decision support for potentially iatrogenic prescriptions such as psychotropic drugs, and new tools such as STOPP/START can also help inform the prescriber.

  17. Addressing Curse of Dimensionality in Sensitivity Analysis: How Can We Handle High-Dimensional Problems?

    NASA Astrophysics Data System (ADS)

    Safaei, S.; Haghnegahdar, A.; Razavi, S.

    2016-12-01

    Complex environmental models are now the primary tools used to inform decision makers about the current or future management of environmental resources under climate and environmental change. These complex models often contain a large number of parameters that need to be determined by a computationally intensive calibration procedure. Sensitivity analysis (SA) is a very useful tool that not only allows for understanding the model behavior, but also helps in reducing the number of calibration parameters by identifying unimportant ones. The issue is that most global sensitivity techniques are themselves highly computationally demanding when generating robust and stable sensitivity metrics over the entire model response surface. Recently, a novel global sensitivity analysis method, Variogram Analysis of Response Surfaces (VARS), was introduced that can efficiently provide a comprehensive assessment of global sensitivity using the variogram concept. In this work, we aim to evaluate the effectiveness of this highly efficient GSA method in reducing computational burden when applied to systems with a very large number of input factors (on the order of 100). We use a test function and a hydrological modelling case study to demonstrate the capability of the VARS method to reduce problem dimensionality by identifying important versus unimportant input factors.
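
    The building block of VARS is the directional variogram of the model response, gamma(h) = 0.5 E[(y(x+h) - y(x))^2]. The following Python sketch computes this quantity for a toy one-dimensional model; it illustrates only the variogram concept, not the full VARS algorithm.

        # Directional variogram of a toy model response along one factor.
        import numpy as np

        def model(x):                      # stand-in black-box model
            return np.sin(3 * x) + 0.5 * x

        x = np.linspace(0.0, 1.0, 201)     # grid spacing h0 = 0.005
        y = model(x)

        def variogram(y, lag):             # gamma(lag) = 0.5 * mean sq. diff
            d = y[lag:] - y[:-lag]
            return 0.5 * np.mean(d ** 2)

        for lag in (1, 5, 20):
            print(lag * 0.005, variogram(y, lag))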

  18. PIMMS tools for capturing metadata about simulations

    NASA Astrophysics Data System (ADS)

    Pascoe, Charlotte; Devine, Gerard; Tourte, Gregory; Pascoe, Stephen; Lawrence, Bryan; Barjat, Hannah

    2013-04-01

    PIMMS (Portable Infrastructure for the Metafor Metadata System) provides a method for consistent and comprehensive documentation of modelling activities that enables the sharing of simulation data and model configuration information. The aim of PIMMS is to package the metadata infrastructure developed by Metafor for CMIP5 so that it can be used by climate modelling groups in UK universities. PIMMS tools capture information about simulations from the design of experiments to the implementation of experiments via simulations that run models. PIMMS uses the Metafor methodology, which consists of a Common Information Model (CIM), Controlled Vocabularies (CV) and software tools. PIMMS software tools provide for the creation and consumption of CIM content via a web services infrastructure and portal developed by the ES-DOC community. PIMMS metadata integrates with the ESGF data infrastructure via the mapping of vocabularies onto ESGF facets. There are three paradigms of PIMMS metadata collection: Model Intercomparison Projects (MIPs), where a standard set of questions is asked of all models, which perform standard sets of experiments; disciplinary-level metadata collection, where a standard set of questions is asked of all models but experiments are specified by users; and bespoke metadata creation, where users define questions about both models and experiments. Examples will be shown of how PIMMS has been configured to suit each of these three paradigms. In each case PIMMS allows users to provide additional metadata beyond that which is asked for in an initial deployment. The primary target for PIMMS is the UK climate modelling community, where it is common practice to reuse model configurations from other researchers. This culture of collaboration exists in part because climate models are very complex, with many variables that can be modified. Therefore it has become common practice to begin a series of experiments by using another climate model configuration as a starting point. Usually this other configuration is provided by a researcher in the same research group or by a previous collaborator with whom there is an existing scientific relationship. Some efforts have been made at the university department level to create documentation, but there is a wide diversity in the scope and purpose of this information. The consistent and comprehensive documentation enabled by PIMMS will enable the wider sharing of climate model data and configuration information. The PIMMS methodology assumes an initial effort to document standard model configurations. Once these descriptions have been created, users need only describe the specific way in which their model configuration differs from the standard. Thus the documentation burden on the user is specific to the experiment they are performing and fits easily into the workflow of doing their science. PIMMS metadata is independent of data and as such is ideally suited to documenting model development. PIMMS provides a framework for sharing information about failed model configurations for which data are not kept - the negative results that do not appear in the scientific literature. PIMMS is a UK project funded by JISC, The University of Reading, The University of Bristol and STFC.

  19. Phases of ERA - Tools

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  20. Mediator effect of statistical process control between Total Quality Management (TQM) and business performance in Malaysian Automotive Industry

    NASA Astrophysics Data System (ADS)

    Ahmad, M. F.; Rasi, R. Z.; Zakuan, N.; Hisyamudin, M. N. N.

    2015-12-01

    In today's highly competitive market, Total Quality Management (TQM) is a vital management tool for ensuring that a company can succeed in its business. In order to survive in the global market, with intense competition amongst regions and enterprises, the adoption of tools and techniques is essential for improving business performance. Findings on the relationship between TQM and business performance have been consistent. However, only a few previous studies have examined the mediating effect of statistical process control (SPC) between TQM and business performance. A mediator is a third variable that changes the association between an independent variable and an outcome variable. This study proposes a TQM performance model with SPC as a mediator, estimated with structural equation modelling, which is a more comprehensive model for developing countries, specifically for Malaysia. A questionnaire was prepared and sent to 1500 companies from the automotive industry and related vendors in Malaysia, giving a 21.8 per cent response rate. The findings show a significant mediating effect between TQM practices and business performance, indicating that SPC is an important tool and technique in TQM implementation. The results conclude that SPC partially mediates the relationship between TQM and business performance, with an indirect effect (IE) of 0.25, which can be categorised as a high mediation effect.
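
    The product-of-coefficients logic behind the reported mediation effect can be sketched on simulated data as follows; the path coefficients below are invented (chosen so that a x b = 0.25), and the paper itself estimates the effect with structural equation modelling rather than this simplified regression approach.

        # Product-of-coefficients mediation on simulated data (a*b ~ 0.25).
        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        tqm = rng.normal(size=n)
        spc = 0.5 * tqm + rng.normal(scale=0.5, size=n)              # path a
        bp = 0.3 * tqm + 0.5 * spc + rng.normal(scale=0.5, size=n)   # b and c'

        a = np.polyfit(tqm, spc, 1)[0]               # SPC regressed on TQM
        X = np.column_stack([np.ones(n), spc, tqm])  # BP on SPC + TQM
        coef, *_ = np.linalg.lstsq(X, bp, rcond=None)
        indirect = a * coef[1]                       # indirect effect a * b
        print(round(indirect, 2))                    # ~0.25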

  1. Status of Cognitive Testing of Adults in India

    PubMed Central

    Porrselvi, A. P.; Shankar, V.

    2017-01-01

    The assessment of cognitive function is a challenging yet integral component of psychological, psychiatric, and neurological evaluation. Cognitive assessment tools can either be administered quickly to screen for neurocognitive disorders or can be comprehensive and detailed to identify cognitive deficits for the purposes of localization, diagnosis, and rehabilitation. This article is a comprehensive review of published research that discusses the current challenges for cognitive testing in India, available tools used for the assessment of cognitive function in India, and future directions for cognitive testing in India. PMID:29184333

  2. Integrated Medical Model Verification, Validation, and Credibility

    NASA Technical Reports Server (NTRS)

    Walton, Marlei; Kerstman, Eric; Foy, Millennia; Shah, Ronak; Saile, Lynn; Boley, Lynn; Butler, Doug; Myers, Jerry

    2014-01-01

    The Integrated Medical Model (IMM) was designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic (stochastic process) model based on historical data, cohort data, and subject matter expert opinion. A probabilistic approach is taken since exact (deterministic) results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation. METHODS: In 2008, the IMM team developed a comprehensive verification and validation (VV) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) was published, the IMM team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. RESULTS: IMM VVC updates are compiled recurrently and include 7009 Compliance and Credibility matrices, IMM VV Plan status, and a synopsis of any changes or updates to the IMM during the reporting period. Reporting tools have evolved over the lifetime of the IMM project to better communicate VVC status. This has included refining original 7009 methodology with augmentation from the NASA-STD-7009 Guidance Document. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and simulation tool, and completion of service requests from a broad end user consortium including Operations, Science and Technology Planning, and Exploration Planning. CONCLUSIONS: The VVC approach established by the IMM project of combining the IMM VV Plan with 7009 requirements is comprehensive and includes the involvement of end users at every stage in IMM evolution. Methods and techniques used to quantify the VVC status of the IMM have not only received approval from the local NASA community but have also garnered recognition by other federal agencies seeking to develop similar guidelines in the medical modeling community.

  3. A database and tool for boundary conditions for regional air quality modeling: description and evaluation

    NASA Astrophysics Data System (ADS)

    Henderson, B. H.; Akhtar, F.; Pye, H. O. T.; Napelenok, S. L.; Hutzell, W. T.

    2013-09-01

    Transported air pollutants receive increasing attention as regulations tighten and global concentrations increase. The need to represent international transport in regional air quality assessments requires improved representation of boundary concentrations. Currently available observations are too sparse vertically to provide boundary information, particularly for ozone precursors, but global simulations can be used to generate spatially and temporally varying Lateral Boundary Conditions (LBC). This study presents a public database of global simulations designed and evaluated for use as LBC for air quality models (AQMs). The database covers the contiguous United States (CONUS) for the years 2000-2010 and contains hourly varying concentrations of ozone, aerosols, and their precursors. The database is complemented by a tool for configuring the global results as inputs to regional-scale models (e.g., Community Multiscale Air Quality or Comprehensive Air quality Model with extensions). This study also presents an example application based on the CONUS domain, which is evaluated against satellite-retrieved ozone vertical profiles. The results show performance is largely within uncertainty estimates for the Tropospheric Emission Spectrometer (TES), with some exceptions. The major difference is a high bias in the upper troposphere along the southern boundary in January. This publication documents the global simulation database, the tool for conversion to LBC, and the fidelity of concentrations on the boundaries. This documentation is intended to support applications that require representation of long-range transport of air pollutants.

  4. Causal Influence of Articulatory Motor Cortex on Comprehending Single Spoken Words: TMS Evidence.

    PubMed

    Schomers, Malte R; Kirilina, Evgeniya; Weigand, Anne; Bajbouj, Malek; Pulvermüller, Friedemann

    2015-10-01

    Classic wisdom had been that motor and premotor cortex contribute to motor execution but not to higher cognition and language comprehension. In contrast, mounting evidence from neuroimaging, patient research, and transcranial magnetic stimulation (TMS) suggests sensorimotor interaction and, specifically, that the articulatory motor cortex is important for classifying meaningless speech sounds into phonemic categories. However, whether these findings speak to the comprehension issue is unclear, because language comprehension does not require explicit phonemic classification, and previous results may therefore relate to factors alien to semantic understanding. We here used the standard psycholinguistic test of spoken word comprehension, the word-to-picture-matching task, and concordant TMS to articulatory motor cortex. TMS pulses were applied to primary motor cortex controlling either the lips or the tongue as subjects heard critical word stimuli starting with bilabial lip-related or alveolar tongue-related stop consonants (e.g., "pool" or "tool"). A significant cross-over interaction showed that articulatory motor cortex stimulation delayed comprehension responses for phonologically incongruent words relative to congruent ones (i.e., lip area TMS delayed "tool" relative to "pool" responses). As local TMS to articulatory motor areas differentially delays the comprehension of phonologically incongruent spoken words, we conclude that motor systems can take a causal role in semantic comprehension and, hence, higher cognition. © The Author 2014. Published by Oxford University Press.

  5. Development of the Sydney Falls Risk Screening Tool in brain injury rehabilitation: A multisite prospective cohort study.

    PubMed

    McKechnie, Duncan; Fisher, Murray J; Pryor, Julie; Bonser, Melissa; Jesus, Jhoven De

    2018-03-01

    To develop a falls risk screening tool (FRST) sensitive to the traumatic brain injury rehabilitation population. Falls are the most frequently recorded patient safety incident within the hospital context. The inpatient traumatic brain injury rehabilitation population is one particular population that has been identified as at high risk of falls. However, no FRST has been developed for this patient population. Consequently, in the traumatic brain injury rehabilitation population there is the real possibility that nurses are using falls risk screening tools that have poor clinical utility. Multisite prospective cohort study. Univariate and multiple logistic regression modelling techniques (backward elimination, elastic net and hierarchical) were used to examine each variable's association with patients who fell. The resulting FRST's clinical validity was examined. Of the 140 patients in the study, 41 (29%) fell. Through multiple logistic regression modelling, 11 variables were identified as predictors of falls. Using hierarchical logistic regression, five of these were identified for inclusion in the resulting falls risk screening tool: prescribed mobility aid (such as a wheelchair or frame), a fall since admission to hospital, impulsive behaviour, impaired orientation, and bladder and/or bowel incontinence. The resulting FRST has good clinical validity (sensitivity = 0.9; specificity = 0.62; area under the curve = 0.87; Youden index = 0.54). The tool was significantly more accurate (p = .037 on DeLong test) in discriminating fallers from nonfallers than the Ontario Modified STRATIFY FRST. An FRST has been developed using a comprehensive statistical framework, and evidence has been provided of this tool's clinical validity. The developed tool, the Sydney Falls Risk Screening Tool, should be considered for use in brain injury rehabilitation populations. © 2017 John Wiley & Sons Ltd.
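
    The kind of screening-tool evaluation reported here (a logistic model over binary risk items, summarized by AUC and the Youden index) can be sketched as follows in Python; the data are simulated and the item effects are invented, so the numbers will not reproduce the study's results.

        # Simulated screening-tool evaluation: logistic model, AUC, Youden J.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score, roc_curve

        rng = np.random.default_rng(1)
        n = 140                                   # mirrors the study size only
        X = rng.integers(0, 2, size=(n, 5))       # five binary risk items
        logit = -2.0 + X @ np.array([1.2, 1.0, 0.9, 0.8, 0.7])
        y = rng.random(n) < 1 / (1 + np.exp(-logit))

        clf = LogisticRegression().fit(X, y)
        p = clf.predict_proba(X)[:, 1]
        fpr, tpr, _ = roc_curve(y, p)
        print(roc_auc_score(y, p), (tpr - fpr).max())  # AUC, Youden index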

  6. Metabolomic fingerprinting employing DART-TOFMS for authentication of tomatoes and peppers from organic and conventional farming.

    PubMed

    Novotná, H; Kmiecik, O; Gałązka, M; Krtková, V; Hurajová, A; Schulzová, V; Hallmann, E; Rembiałkowska, E; Hajšlová, J

    2012-01-01

    The rapidly growing demand for organic food requires the availability of analytical tools enabling its authentication. Recently, metabolomic fingerprinting/profiling has been demonstrated to be a promising option for comprehensive characterisation of the small molecules occurring in plants, since their pattern may reflect the impact of various external factors. In a two-year pilot study concerned with the classification of organic versus conventional crops, ambient mass spectrometry employing a direct analysis in real time (DART) ion source coupled to a time-of-flight mass spectrometer (TOFMS) was used. This novel methodology was tested on 40 tomato and 24 pepper samples grown under specified conditions. To calculate statistical models, the obtained data (mass spectra) were processed by principal component analysis (PCA) followed by linear discriminant analysis (LDA). The results from the positive ionisation mode enabled better differentiation between organic and conventional samples than the results from the negative mode. In this case, the recognition ability obtained by LDA was 97.5% for tomato and 100% for pepper samples, and the prediction abilities were above 80% for both sample sets. The results suggest that the year of production had a stronger influence on the metabolomic fingerprints than the type of farming (organic versus conventional). In any case, DART-TOFMS is a promising tool for rapid screening of samples. Establishing comprehensive (multi-sample) long-term databases may further help to improve the quality of statistical classification models.
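
    The PCA-followed-by-LDA classification step described above can be sketched with scikit-learn as follows; random stand-in spectra replace the real DART-TOFMS fingerprints, and the injected class difference is purely illustrative.

        # PCA-then-LDA sketch on random stand-in spectra (not real data).
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(42)
        X = rng.normal(size=(40, 500))        # 40 samples x 500 m/z bins
        y = np.repeat([0, 1], 20)             # 0 = organic, 1 = conventional
        X[y == 1] += 0.2                      # inject a small class difference

        clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
        print(cross_val_score(clf, X, y, cv=5).mean())  # recognition ability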

  7. Biota Modeling in EPA's Preliminary Remediation Goal and Dose Compliance Concentration Calculators for Use in EPA Superfund Risk Assessment: Explanation of Intake Rate Derivation, Transfer Factor Compilation, and Mass Loading Factor Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manning, Karessa L.; Dolislager, Fredrick G.; Bellamy, Michael B.

    The Preliminary Remediation Goal (PRG) and Dose Compliance Concentration (DCC) calculators are screening-level tools that set forth the Environmental Protection Agency's (EPA) recommended approaches, based upon currently available information with respect to risk assessment, for response actions at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites, commonly known as Superfund. The screening levels derived by the PRG and DCC calculators are used to identify isotopes contributing the highest risk and dose, as well as to establish preliminary remediation goals. Each calculator has a residential gardening scenario and subsistence farmer exposure scenarios that require modeling of the transfer of contaminants from soil and water into various types of biota (crops and animal products). New publications of human intake rates of biota; farm animal intakes of water, soil, and fodder; and soil-to-plant interactions require that updates be implemented in the PRG and DCC exposure scenarios. Recent improvements have been made in the biota modeling for these calculators, including newly derived biota intake rates, more comprehensive soil mass loading factors (MLFs), and more comprehensive soil-to-tissue transfer factors (TFs) for animals and soil-to-plant transfer factors (BVs). New biota have been added in both the produce and animal products categories that greatly improve the accuracy and utility of the PRG and DCC calculators and encompass greater geographic diversity on a national and international scale.

  8. Comprehensive Safety Analysis 2010 Safety Measurement System (SMS) Methodology, Version 2.1 Revised December 2010

    DOT National Transportation Integrated Search

    2010-12-01

    This report documents the Safety Measurement System (SMS) methodology developed to support the Comprehensive Safety Analysis 2010 (CSA 2010) Initiative for the Federal Motor Carrier Safety Administration (FMCSA). The SMS is one of the major tools for...

  9. The Development of a Visual-Perceptual Chemistry Specific (VPCS) Assessment Tool

    ERIC Educational Resources Information Center

    Oliver-Hoyo, Maria; Sloan, Caroline

    2014-01-01

    The development of the Visual-Perceptual Chemistry Specific (VPCS) assessment tool is based on items that align to eight visual-perceptual skills considered as needed by chemistry students. This tool includes a comprehensive range of visual operations and presents items within a chemistry context without requiring content knowledge to solve…

  10. Interactive and Authentic e-Learning Tools for Criminal Justice Education

    ERIC Educational Resources Information Center

    Miner-Romanoff, Karen; McCombs, Jonathan; Chongwony, Lewis

    2017-01-01

    This mixed-method study tested the effectiveness of two experiential e-learning tools for criminal justice courses. The first tool was a comprehensive video series, including a criminal trial and interviews with the judge, defense counsel, prosecution, investigators and court director (virtual trial), in order to enhance course and learning…

  11. Navigating freely-available software tools for metabolomics analysis.

    PubMed

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas and with regard to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. To compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by having either ≥ 50 citations on Web of Science (as of 08/09/16) or their use reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks, e.g. peak picking.

  12. Comprehensiveness of care from the patient perspective: comparison of primary healthcare evaluation instruments.

    PubMed

    Haggerty, Jeannie L; Beaulieu, Marie-Dominique; Pineault, Raynald; Burge, Frederick; Lévesque, Jean-Frédéric; Santor, Darcy A; Bouharaoui, Fatima; Beaulieu, Christine

    2011-12-01

    Comprehensiveness relates both to scope of services offered and to a whole-person clinical approach. Comprehensive services are defined as "the provision, either directly or indirectly, of a full range of services to meet most patients' healthcare needs"; whole-person care is "the extent to which a provider elicits and considers the physical, emotional and social aspects of a patient's health and considers the community context in their care." Among instruments that evaluate primary healthcare, two had subscales that mapped to comprehensive services and to the community component of whole-person care: the Primary Care Assessment Tool - Short Form (PCAT-S) and the Components of Primary Care Index (CPCI, a limited measure of whole-person care). To examine how well comprehensiveness is captured in validated instruments that evaluate primary healthcare from the patient's perspective. 645 adults with at least one healthcare contact in the previous 12 months responded to six instruments that evaluate primary healthcare. Scores were normalized for descriptive comparison. Exploratory and confirmatory (structural equation modelling) factor analysis examined fit to operational definition, and item response theory analysis examined item performance on common constructs. Over one-quarter of respondents had missing responses on services offered or doctor's knowledge of the community. The subscales did not load on a single factor; comprehensive services and community orientation were examined separately. The community orientation subscales did not perform satisfactorily. The three comprehensive services subscales fit very modestly onto two factors: (1) most healthcare needs (from one provider) (CPCI Comprehensive Care, PCAT-S First-Contact Utilization) and (2) range of services (PCAT-S Comprehensive Services Available). Individual item performance revealed several problems. Measurement of comprehensiveness is problematic, making this attribute a priority for measure development. Range of services offered is best obtained from providers. Whole-person care is not addressed as a separate construct, but some dimensions are covered by attributes such as interpersonal communication and relational continuity.

  13. Exposure Assessment Tools by Media - Air

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  14. MVP-CA Methodology for the Expert System Advocate's Advisor (ESAA)

    DOT National Transportation Integrated Search

    1997-11-01

    The Multi-Viewpoint Clustering Analysis (MVP-CA) tool is a semi-automated tool that provides a valuable aid for comprehension, verification, validation, maintenance, integration, and evolution of complex knowledge-based software systems. In this report,...

  15. Aviation Environmental Design Tool (AEDT) System Architecture

    DOT National Transportation Integrated Search

    2007-01-29

    The Federal Aviation Administration's Office of Environment and Energy (FAA-AEE) is developing a comprehensive suite of software tools that will allow for thorough assessment of the environmental effects of aviation. The main goal of the effort is ...

  16. Exposure Assessment Tools by Routes - Inhalation

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  17. Exposure Assessment Tools by Chemical Classes

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  18. Exposure Assessment Tools by Routes

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  19. Exposure Assessment Tools by Media - Food

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  20. Exposure Assessment Tools by Media

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  1. Exposure Assessment Tools by Routes - Ingestion

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  2. Exposure Assessment Tools by Approaches

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  3. A Multi-Pronged Plan

    ERIC Educational Resources Information Center

    Starkman, Neal

    2007-01-01

    As schools adopt new and varied technologies to protect the campus community, the need to look at security tools in terms of a comprehensive, layered, and integrated strategy, becomes clear. This article discusses how schools are using these security tools.

  4. Assessment of the Aviation Environmental Design Tool

    DOT National Transportation Integrated Search

    2009-06-29

    A comprehensive Tools Suite to allow for thorough evaluation of the environmental effects and impacts of aviation is currently being developed by the U.S. This suite consists of the Environmental Design Space (EDS), the Aviation Environmental...

  5. EPA EcoBox Tools by Stressors - Biological

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  6. Exposure Assessment Tools by Routes - Dermal

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  7. EPA EcoBox Tools by Stressors - Physical

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  8. EPA EcoBox Tools by Stressors

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  9. EPA EcoBox Tools by Effects - Aquatic

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  10. EPA EcoBox Tools by Exposure Pathways

    EPA Pesticide Factsheets

    EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  11. EPA EcoBox Tools by Effects - References

    EPA Pesticide Factsheets

    EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  12. EPA EcoBox Tools by Stressors - References

    EPA Pesticide Factsheets

    EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  13. EPA EcoBox Tools by Stressors - Chemical

    EPA Pesticide Factsheets

    EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  14. EPA EcoBox Tools by Effects

    EPA Pesticide Factsheets

    EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  15. EPA EcoBox Tools by Receptors - Biota

    EPA Pesticide Factsheets

    EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  16. EPA EcoBox Tools by Receptors

    EPA Pesticide Factsheets

    EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  17. EPA EcoBox Tools by Receptors - References

    EPA Pesticide Factsheets

    EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  18. EPA EcoBox Tools by Effects - Terrestrial

    EPA Pesticide Factsheets

    EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  19. The Comprehensive Antibiotic Resistance Database

    PubMed Central

    McArthur, Andrew G.; Waglechner, Nicholas; Nizam, Fazmin; Yan, Austin; Azad, Marisa A.; Baylay, Alison J.; Bhullar, Kirandeep; Canova, Marc J.; De Pascale, Gianfranco; Ejim, Linda; Kalan, Lindsay; King, Andrew M.; Koteva, Kalinka; Morar, Mariya; Mulvey, Michael R.; O'Brien, Jonathan S.; Pawlowski, Andrew C.; Piddock, Laura J. V.; Spanogiannopoulos, Peter; Sutherland, Arlene D.; Tang, Irene; Taylor, Patricia L.; Thaker, Maulik; Wang, Wenliang; Yan, Marie; Yu, Tennison

    2013-01-01

    The field of antibiotic drug discovery and the monitoring of new antibiotic resistance elements have yet to fully exploit the power of the genome revolution. Despite the fact that the first genomes sequenced of free-living organisms were those of bacteria, there have been few specialized bioinformatic tools developed to mine the growing amount of genomic data associated with pathogens. In particular, there are few tools to study the genetics and genomics of antibiotic resistance and how it impacts bacterial populations, ecology, and the clinic. We have initiated development of such tools in the form of the Comprehensive Antibiotic Resistance Database (CARD; http://arpcard.mcmaster.ca). The CARD integrates disparate molecular and sequence data, provides a unique organizing principle in the form of the Antibiotic Resistance Ontology (ARO), and can quickly identify putative antibiotic resistance genes in new unannotated genome sequences. This unique platform provides an informatic tool that bridges antibiotic resistance concerns in health care, agriculture, and the environment. PMID:23650175

  20. Developing a Tool for Increasing the Awareness about Gendered and Intersectional Processes in the Clinical Assessment of Patients--A Study of Pain Rehabilitation.

    PubMed

    Hammarström, Anne; Wiklund, Maria; Stålnacke, Britt-Marie; Lehti, Arja; Haukenes, Inger; Fjellman-Wiklund, Anncristine

    2016-01-01

    There is a need for tools addressing gender inequality in everyday clinical work in health care. The aim of our paper was to develop a tool for increasing awareness of gendered and intersectional processes in the clinical assessment of patients, based on a study of pain rehabilitation. In the overarching project, named "Equal care in rehabilitation", we used multiple methods (both quantitative and qualitative) in five sub-studies. With a novel approach, we used Grounded Theory to synthesize the results from our sub-studies and develop the gender equality tool. The gender equality tool described and developed in this article is thus based on results from sub-studies about the processes of assessment and selection of patients in pain rehabilitation. Inspired by some questions in earlier tools, we posed open-ended questions and inductively searched for findings and concepts relating to gendered and social selection processes in pain rehabilitation in each of our sub-studies. Through this process, the gender equality tool took shape as 15 questions about the process of assessing and selecting patients for pain rehabilitation. To reach a more comprehensive understanding of the tool, we performed a final step of the GT analyses, synthesizing the results into a comprehensive model with two dimensions in relation to several possible axes of discrimination. The process of assessing and selecting patients was visualized as a funnel, a top-down process governed by gendered attitudes, rules and structures. We found that clinicians judged the inner and outer characteristics and status of patients in a gendered and intersectional way during clinical decision-making, which can thus be regarded as (potentially) biased with regard to gender, socio-economic status, ethnicity and age. The clinical implications are that the tool can be included in the systematic routine of clinical assessment of patients, both to raise awareness and as a basis for avoiding gender bias in clinical decision-making. The tool could also be used in team education for health professionals as an instrument for critical reflection on gender bias. Thus, tools for clinical assessment can be developed from empirical studies in various clinical settings. However, such a micro-level approach must be understood from a broader societal perspective that includes gender relations at both the macro and meso levels.

  1. Developing a Tool for Increasing the Awareness about Gendered and Intersectional Processes in the Clinical Assessment of Patients – A Study of Pain Rehabilitation

    PubMed Central

    Hammarström, Anne; Wiklund, Maria; Stålnacke, Britt-Marie; Lehti, Arja; Haukenes, Inger; Fjellman-Wiklund, Anncristine

    2016-01-01

    Objective: There is a need for tools addressing gender inequality in everyday clinical work in health care. The aim of our paper was to develop a tool for increasing awareness of gendered and intersectional processes in the clinical assessment of patients, based on a study of pain rehabilitation. Methods: In the overarching project, named “Equal care in rehabilitation”, we used multiple methods (both quantitative and qualitative) in five sub-studies. With a novel approach, we used Grounded Theory to synthesize the results from our sub-studies and develop the gender equality tool. The gender equality tool described and developed in this article is thus based on results from sub-studies about the processes of assessment and selection of patients in pain rehabilitation. Inspired by some questions in earlier tools, we posed open-ended questions and inductively searched for findings and concepts relating to gendered and social selection processes in pain rehabilitation in each of our sub-studies. Through this process, the gender equality tool took shape as 15 questions about the process of assessing and selecting patients for pain rehabilitation. To reach a more comprehensive understanding of the tool, we performed a final step of the GT analyses, synthesizing the results into a comprehensive model with two dimensions in relation to several possible axes of discrimination. Results: The process of assessing and selecting patients was visualized as a funnel, a top-down process governed by gendered attitudes, rules and structures. We found that clinicians judged the inner and outer characteristics and status of patients in a gendered and intersectional way during clinical decision-making, which can thus be regarded as (potentially) biased with regard to gender, socio-economic status, ethnicity and age. Implications: The tool can be included in the systematic routine of clinical assessment of patients, both to raise awareness and as a basis for avoiding gender bias in clinical decision-making. It could also be used in team education for health professionals as an instrument for critical reflection on gender bias. Conclusions: Tools for clinical assessment can thus be developed from empirical studies in various clinical settings. However, such a micro-level approach must be understood from a broader societal perspective that includes gender relations at both the macro and meso levels. PMID:27055029

  2. In silico environmental chemical science: properties and processes from statistical and computational modelling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tratnyek, Paul G.; Bylaska, Eric J.; Weber, Eric J.

    2017-01-01

    Quantitative structure–activity relationships (QSARs) have long been used in the environmental sciences. More recently, molecular modeling and chemoinformatic methods have become widespread. These methods have the potential to expand and accelerate advances in environmental chemistry because they complement observational and experimental data with “in silico” results and analysis. The opportunities and challenges that arise at the intersection between statistical and theoretical in silico methods are most apparent in the context of properties that determine the environmental fate and effects of chemical contaminants (degradation rate constants, partition coefficients, toxicities, etc.). The main example of this is the calibration of QSARs using descriptor variable data calculated from molecular modeling, which can make QSARs more useful for predicting property data that are unavailable, but also can make them more powerful tools for diagnosis of fate determining pathways and mechanisms. Emerging opportunities for “in silico environmental chemical science” are to move beyond the calculation of specific chemical properties using statistical models and toward more fully in silico models, prediction of transformation pathways and products, incorporation of environmental factors into model predictions, integration of databases and predictive models into more comprehensive and efficient tools for exposure assessment, and extending the applicability of all the above from chemicals to biologicals and materials.
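
    A minimal sketch of the QSAR-calibration step described above, written in Python with scikit-learn; the descriptors, property values and chemical set are invented for illustration and are not taken from the study:

        # Fit a linear QSAR relating computed molecular descriptors to an
        # observed property; all data below are hypothetical placeholders.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Hypothetical descriptors (e.g. LUMO energy in eV, logP) for six chemicals
        X = np.array([[-0.95, 2.1], [-1.20, 3.4], [-0.40, 1.2],
                      [-1.65, 2.8], [-0.70, 0.9], [-1.10, 1.8]])
        # Hypothetical log rate constants for a degradation reaction
        y = np.array([-2.3, -1.1, -3.6, -0.4, -3.1, -1.8])

        qsar = LinearRegression().fit(X, y)
        print("coefficients:", qsar.coef_, "intercept:", qsar.intercept_)

        # Predict the property of a chemical lacking experimental data,
        # from descriptors calculated by molecular modelling
        print("predicted log k:", qsar.predict([[-1.00, 2.5]])[0])

    In practice the descriptor matrix would come from molecular modeling calculations rather than being typed in, which is exactly the coupling of statistical and theoretical methods the abstract describes.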

  3. Modeling Tool Advances Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  4. How To Steal From Nature

    DTIC Science & Technology

    2004-07-23

    Principal TRIZ Tools: TRIZ offers a comprehensive series of creativity and innovation tools, methods and strategies. The main tools include... Algorithm for Inventive Problem Solving). The tools shown in red can use information from nature. Hence TRIZ can drive biomimetics by organising and... targeting information. Biomimetics can drive TRIZ with new "patents". Lessons: it is possible to learn from nature; huge changes in context are...

  5. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VV&C)

    NASA Technical Reports Server (NTRS)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation, as well as to lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting the VVC status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VVC updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. Reporting tools have evolved over the lifetime of the IMM Project to better communicate VVC status. This has included refining the original 7009 methodology with augmentation from the HRP NASA-STD-7009 Guidance Document working group and the NASA-HDBK-7009 [2]. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and simulation tool, and completion of service requests from a broad end user consortium including operations, science and technology planning, and exploration planning. IMM v4.0 is slated for operational release in FY15, and current VVC assessments illustrate the expected VVC status prior to the completion of customer-led external review efforts. CONCLUSIONS: The VVC approach established by the IMM Project of incorporating Project-specific recommended practices and guidelines for implementing the 7009 requirements is comprehensive and includes the involvement of end users at every stage in IMM evolution. Methods and techniques used to quantify the VVC status of the IMM Project represented a critical communication tool in providing clear and concise suitability assessments to IMM customers. These processes have not only received approval from the local NASA community but have also garnered recognition by other federal agencies seeking to develop similar guidelines in the medical modeling community.
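
    The stochastic approach can be illustrated generically with a small Monte Carlo sketch; the condition rates, evacuation probabilities and mission length below are invented placeholders, not IMM data:

        # Monte Carlo sketch of a probabilistic medical-risk forecast.
        # All rates and probabilities are hypothetical placeholders.
        import numpy as np

        rng = np.random.default_rng(0)
        n_trials, mission_days = 100_000, 180
        conditions = {
            "condition_a": {"daily_rate": 1e-4, "p_evac": 0.05},
            "condition_b": {"daily_rate": 5e-5, "p_evac": 0.20},
        }

        evac = np.zeros(n_trials, dtype=bool)
        for p in conditions.values():
            # Probability the condition occurs at least once during the mission
            p_occurs = 1.0 - (1.0 - p["daily_rate"]) ** mission_days
            occurs = rng.random(n_trials) < p_occurs
            evac |= occurs & (rng.random(n_trials) < p["p_evac"])

        print(f"P(medical evacuation) ~ {evac.mean():.4f}")

    Repeating the trials many times, as here, yields a distribution over the risk metric rather than a single deterministic value, which is the point of the stochastic design.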

  6. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
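
    A toy illustration of the reduced-form idea described above: a stand-in function plays the role of the CGE model, synthetic scenarios are run through it, and a single regression equation is then estimated so that approximate consequences can be read off instantly. The function and all coefficients are hypothetical:

        # Reduced-form sketch: regress synthetic "model" outputs on key
        # threat characteristics. toy_cge() is a hypothetical stand-in
        # for a computable general equilibrium model run.
        import numpy as np

        rng = np.random.default_rng(42)

        def toy_cge(duration_days, severity, resilience):
            loss = 0.8 * duration_days * severity * (1.0 - 0.5 * resilience)
            return loss + rng.normal(scale=0.1 * abs(loss) + 1e-6)

        X = np.column_stack([
            rng.uniform(1, 90, 500),     # event duration (days)
            rng.uniform(0.1, 1.0, 500),  # severity index
            rng.uniform(0.0, 1.0, 500),  # resilience factor
        ])
        y = np.array([toy_cge(*row) for row in X])

        # Estimate the single reduced-form regression equation by OLS
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        print("reduced-form coefficients:", coef.round(3))

    The fitted equation can then be evaluated instantly for any new scenario, which is the "rapid turnaround" property the abstract emphasizes.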

  7. Integrated multidisciplinary CAD/CAE environment for micro-electro-mechanical systems (MEMS)

    NASA Astrophysics Data System (ADS)

    Przekwas, Andrzej J.

    1999-03-01

    Computational design of MEMS involves several strongly coupled physical disciplines, including fluid mechanics, heat transfer, stress/deformation dynamics, electronics, electro/magneto statics, calorics, biochemistry and others. CFDRC is developing a new generation of multi-disciplinary CAD systems for MEMS using high-fidelity field solvers on unstructured, solution-adaptive grids for a full range of disciplines. The software system, ACE + MEMS, includes all essential CAD tools: geometry/grid generation for multi-discipline, multi-equation solvers, a GUI, tightly coupled configurable 3D field solvers for FVM, FEM and BEM, and a 3D visualization/animation tool. The flow/heat transfer/calorics/chemistry equations are solved with an unstructured adaptive FVM solver, stress/deformation is computed with a FEM STRESS solver, and a FAST BEM solver is used to solve linear heat transfer, electro/magnetostatics and elastostatics equations on adaptive polygonal surface grids. Tight multidisciplinary coupling and automatic interoperability between the tools were achieved by designing a comprehensive database structure and APIs for complete model definition. The virtual model definition is implemented in the data transfer facility, a publicly available tool described in this paper. The paper presents an overall description of the software architecture and the MEMS design flow in ACE + MEMS. It describes the current status, ongoing effort and future plans for the software. The paper also discusses new concepts of mixed-level and mixed-dimensionality capability in which 1D microfluidic networks are simulated concurrently with 3D high-fidelity models of discrete components.

  8. PCTFPeval: a web tool for benchmarking newly developed algorithms for predicting cooperative transcription factor pairs in yeast.

    PubMed

    Lai, Fu-Jou; Chang, Hong-Tsun; Wu, Wei-Sheng

    2015-01-01

    Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because of the lack of sufficient performance indices and adequate overall performance scores. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers first have to put in a lot of effort to construct it. To save researchers time and effort, here we develop a web tool to implement our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in the PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results of each compared algorithm and each selected performance index can be downloaded as text files for further analyses. Allowing users to select eight existing performance indices and 15 existing algorithms for comparison, our web tool benefits researchers who are eager to comprehensively and objectively evaluate the performance of their newly developed algorithm. Thus, our tool greatly expedites the progress in the research of computational identification of cooperative TF pairs.
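
    The abstract does not give the scoring formulas, but one plausible overall performance score of the kind described, an average rank across indices, can be sketched as follows; the algorithm names and index values are invented:

        # Rank each algorithm under every performance index, then average
        # the ranks into an overall score (lower is better). All values
        # below are invented for illustration.
        from statistics import mean

        indices = {  # index -> {algorithm: score}, higher score is better
            "index_1": {"alg_A": 0.81, "alg_B": 0.75, "new_alg": 0.79},
            "index_2": {"alg_A": 0.40, "alg_B": 0.55, "new_alg": 0.61},
            "index_3": {"alg_A": 0.92, "alg_B": 0.88, "new_alg": 0.90},
        }

        def ranks(scores):
            ordered = sorted(scores, key=scores.get, reverse=True)
            return {alg: pos + 1 for pos, alg in enumerate(ordered)}

        per_index = [ranks(s) for s in indices.values()]
        overall = {alg: mean(r[alg] for r in per_index) for alg in per_index[0]}
        for alg, score in sorted(overall.items(), key=lambda kv: kv[1]):
            print(f"{alg}: mean rank {score:.2f}")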

  9. PCTFPeval: a web tool for benchmarking newly developed algorithms for predicting cooperative transcription factor pairs in yeast

    PubMed Central

    2015-01-01

    Background: Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because of the lack of sufficient performance indices and adequate overall performance scores. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers first have to put in a lot of effort to construct it. To save researchers time and effort, here we develop a web tool to implement our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. Results: The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in the PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results of each compared algorithm and each selected performance index can be downloaded as text files for further analyses. Conclusions: Allowing users to select eight existing performance indices and 15 existing algorithms for comparison, our web tool benefits researchers who are eager to comprehensively and objectively evaluate the performance of their newly developed algorithm. Thus, our tool greatly expedites the progress in the research of computational identification of cooperative TF pairs. PMID:26677932

  10. GenoBase: comprehensive resource database of Escherichia coli K-12

    PubMed Central

    Otsuka, Yuta; Muto, Ai; Takeuchi, Rikiya; Okada, Chihiro; Ishikawa, Motokazu; Nakamura, Koichiro; Yamamoto, Natsuko; Dose, Hitomi; Nakahigashi, Kenji; Tanishima, Shigeki; Suharnan, Sivasundaram; Nomura, Wataru; Nakayashiki, Toru; Aref, Walid G.; Bochner, Barry R.; Conway, Tyrrell; Gribskov, Michael; Kihara, Daisuke; Rudd, Kenneth E.; Tohsato, Yukako; Wanner, Barry L.; Mori, Hirotada

    2015-01-01

    Comprehensive experimental resources, such as ORFeome clone libraries and deletion mutant collections, are fundamental tools for the elucidation of gene function. Data sets from omics analyses using these resources provide key information for functional analysis, modeling and simulation in both individual and systematic approaches. With the long-term goal of complete understanding of a cell, we have over the past decade created a variety of clone and mutant sets for functional genomics studies of Escherichia coli K-12. We have made these experimental resources freely available to the academic community worldwide. Accordingly, these resources have now been used in numerous investigations of a multitude of cell processes. Quality control is extremely important for evaluating results generated by these resources. Because the genome annotation that we originally used for their construction has changed since 2005, we have updated these genomic resources accordingly. Here, we describe GenoBase (http://ecoli.naist.jp/GB/), which contains key information about comprehensive experimental resources of E. coli K-12, their quality control and several omics data sets generated using these resources. PMID:25399415

  11. A comprehensive map of the mTOR signaling network

    PubMed Central

    Caron, Etienne; Ghosh, Samik; Matsuoka, Yukiko; Ashton-Beaucage, Dariel; Therrien, Marc; Lemieux, Sébastien; Perreault, Claude; Roux, Philippe P; Kitano, Hiroaki

    2010-01-01

    The mammalian target of rapamycin (mTOR) is a central regulator of cell growth and proliferation. mTOR signaling is frequently dysregulated in oncogenic cells, and thus an attractive target for anticancer therapy. Using CellDesigner, a modeling-support software tool for graphical notation, we present herein a comprehensive map of the mTOR signaling network, which includes 964 species connected by 777 reactions. The map complies with both the Systems Biology Markup Language (SBML) and the Systems Biology Graphical Notation (SBGN) for computational analysis and graphical representation, respectively. As captured in the mTOR map, we review and discuss our current understanding of the mTOR signaling network and highlight the impact of mTOR feedback and crosstalk regulations on drug-based cancer therapy. This map is available on the Payao platform, a Web 2.0-based community-wide interactive process for creating more accurate and information-rich databases. Thus, this comprehensive map of the mTOR network will serve as a tool to facilitate systems-level study of up-to-date mTOR network components and signaling events toward the discovery of novel regulatory processes and therapeutic strategies for cancer. PMID:21179025
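
    Because the map complies with SBML, its contents can be inspected programmatically. A minimal sketch using the python-libsbml package, with a placeholder file name:

        # Load an SBML file and report its size, e.g. to check the
        # species/reaction counts quoted above (file name is a placeholder).
        import libsbml

        doc = libsbml.readSBML("mtor_map.sbml")
        if doc.getNumErrors(libsbml.LIBSBML_SEV_ERROR) > 0:
            doc.printErrors()
            raise SystemExit("invalid SBML")

        model = doc.getModel()
        print("species:  ", model.getNumSpecies())
        print("reactions:", model.getNumReactions())

        # Show the first few reactions with reactants and products
        for i in range(min(5, model.getNumReactions())):
            rxn = model.getReaction(i)
            lhs = [rxn.getReactant(j).getSpecies() for j in range(rxn.getNumReactants())]
            rhs = [rxn.getProduct(j).getSpecies() for j in range(rxn.getNumProducts())]
            print(rxn.getId(), ":", " + ".join(lhs), "->", " + ".join(rhs))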

  12. The relationship between quality management practices and organisational performance: A structural equation modelling approach

    NASA Astrophysics Data System (ADS)

    Jamaluddin, Z.; Razali, A. M.; Mustafa, Z.

    2015-02-01

    The purpose of this paper is to examine the relationship between quality management practices (QMPs) and organisational performance in the Malaysian manufacturing industry. In this study, a QMPs and organisational performance framework is developed from a comprehensive literature review covering hard and soft quality factors in the manufacturing process environment. A total of 11 hypotheses were put forward to test the relationships amongst six constructs: management commitment, training, process management, quality tools, continuous improvement and organisational performance. The model is analysed using Structural Equation Modeling (SEM) with AMOS software version 18.0 and Maximum Likelihood (ML) estimation. A total of 480 questionnaires were distributed, and 210 were valid for analysis. The results of the modeling analysis using ML estimation indicate that the fit statistics of the QMPs and organisational performance model for the manufacturing industry are admissible. The results show that management commitment has a significant impact on training and process management. Similarly, training has a significant effect on quality tools, process management and continuous improvement. Furthermore, quality tools have a significant influence on process management and continuous improvement. Likewise, process management has a significant impact on continuous improvement, and continuous improvement has a significant influence on organisational performance. However, the results also show no significant relationship between management commitment and quality tools, or between management commitment and continuous improvement. The results can be used by managers to prioritize the implementation of QMPs: practices found to have a positive impact on organisational performance can be recommended to managers so that they can allocate resources to improving them for better performance.
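
    A hedged sketch of how a structural model of this shape could be specified and fitted by maximum likelihood in Python using the semopy package; the construct column names and the data file are placeholders, and this is not the authors' AMOS setup:

        # Specify the hypothesized paths in lavaan-style syntax and fit by
        # maximum likelihood. Column names and the CSV file are hypothetical.
        import pandas as pd
        import semopy

        model_desc = """
        training ~ commitment
        quality_tools ~ training
        process_mgmt ~ commitment + training + quality_tools
        improvement ~ training + quality_tools + process_mgmt
        performance ~ improvement
        """

        data = pd.read_csv("qmp_survey.csv")   # one column per construct score
        model = semopy.Model(model_desc)
        model.fit(data)                        # ML estimation by default
        print(model.inspect())                 # path coefficients, p-values
        print(semopy.calc_stats(model))        # fit statistics (CFI, RMSEA, ...)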

  13. Modelling entomological-climatic interactions of Plasmodium falciparum malaria transmission in two Colombian endemic-regions: contributions to a National Malaria Early Warning System

    PubMed Central

    Ruiz, Daniel; Poveda, Germán; Vélez, Iván D; Quiñones, Martha L; Rúa, Guillermo L; Velásquez, Luz E; Zuluaga, Juan S

    2006-01-01

    Background: Malaria has recently re-emerged as a public health burden in Colombia. Although the problem seems to be climate-driven, significant gaps remain in our understanding of the complexity of malaria transmission, which have motivated attempts to develop a comprehensive model. Methods: The mathematical tool was applied to represent Plasmodium falciparum malaria transmission in two endemic areas. Entomological exogenous variables were estimated through field campaigns and laboratory experiments. The availability of breeding places was included to represent fluctuations in vector densities. Diverse scenarios, sensitivity analyses and instability cases were considered during the experimentation-validation process. Results: Correlation coefficients and mean square errors between observed and modelled incidences reached 0.897-0.668 (P > 0.95) and 0.0002-0.0005, respectively. Temperature was the most relevant climatic parameter driving the final incidence. Accordingly, malaria outbreaks are possible during the favourable epochs following the onset of El Niño warm events. The sporogonic and gonotrophic cycles proved to be the key entomological variables controlling the transmission potential of the mosquito population. Simulation results also showed that the seasonality of vector density is an important factor for understanding disease transmission. Conclusion: The model constitutes a promising tool for deepening the understanding of the multiple interactions related to malaria transmission conducive to outbreaks. In the foreseeable future it could be implemented as a tool to diagnose possible dynamical patterns of malaria incidence under several scenarios, as well as a decision-making tool for the early detection and control of outbreaks. It will also be possible to merge the model with forecasts of El Niño events to provide a National Malaria Early Warning System. PMID:16882349

  14. Modelling entomological-climatic interactions of Plasmodium falciparum malaria transmission in two Colombian endemic-regions: contributions to a National Malaria Early Warning System.

    PubMed

    Ruiz, Daniel; Poveda, Germán; Vélez, Iván D; Quiñones, Martha L; Rúa, Guillermo L; Velásquez, Luz E; Zuluaga, Juan S

    2006-08-01

    Malaria has recently re-emerged as a public health burden in Colombia. Although the problem seems to be climate-driven, significant gaps remain in our understanding of the complexity of malaria transmission, which have motivated attempts to develop a comprehensive model. The mathematical tool was applied to represent Plasmodium falciparum malaria transmission in two endemic areas. Entomological exogenous variables were estimated through field campaigns and laboratory experiments. The availability of breeding places was included to represent fluctuations in vector densities. Diverse scenarios, sensitivity analyses and instability cases were considered during the experimentation-validation process. Correlation coefficients and mean square errors between observed and modelled incidences reached 0.897-0.668 (P > 0.95) and 0.0002-0.0005, respectively. Temperature was the most relevant climatic parameter driving the final incidence. Accordingly, malaria outbreaks are possible during the favourable epochs following the onset of El Niño warm events. The sporogonic and gonotrophic cycles proved to be the key entomological variables controlling the transmission potential of the mosquito population. Simulation results also showed that the seasonality of vector density is an important factor for understanding disease transmission. The model constitutes a promising tool for deepening the understanding of the multiple interactions related to malaria transmission conducive to outbreaks. In the foreseeable future it could be implemented as a tool to diagnose possible dynamical patterns of malaria incidence under several scenarios, as well as a decision-making tool for the early detection and control of outbreaks. It will also be possible to merge the model with forecasts of El Niño events to provide a National Malaria Early Warning System.
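
    A minimal illustration of the entomological-climatic coupling described above: a Ross-Macdonald-style system in which the sporogonic cycle length follows the classical degree-day approximation n = 111/(T - 16) days for P. falciparum. All other parameter values are invented, not those of the Colombian study:

        # Temperature-dependent Ross-Macdonald sketch: X = infected human
        # fraction, Z = infectious mosquito fraction. Parameters invented.
        import numpy as np
        from scipy.integrate import odeint

        def model(y, t, T):
            X, Z = y
            m, a, b, c = 10.0, 0.3, 0.1, 0.5   # density, biting, transmission
            r, g = 1 / 30.0, 0.12              # human recovery, mosquito death
            n = 111.0 / max(T - 16.0, 0.1)     # sporogonic cycle length (days)
            dX = m * a * b * Z * (1 - X) - r * X
            dZ = a * c * X * np.exp(-g * n) * (1 - Z) - g * Z
            return [dX, dZ]

        t = np.linspace(0, 365, 366)
        for T in (22.0, 26.0):   # e.g. a normal year vs an El Niño year
            X = odeint(model, [0.01, 0.001], t, args=(T,))[:, 0]
            print(f"T={T:.0f} C -> human prevalence after 1 year: {X[-1]:.3f}")

    Even in this toy version, the shorter sporogonic cycle at higher temperature raises the fraction of mosquitoes surviving to infectiousness, reproducing the qualitative temperature sensitivity the study reports.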

  15. Multimedia Informed Consent Tool for a Low Literacy African Research Population: Development and Pilot-Testing

    PubMed Central

    Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D’Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel

    2014-01-01

    Background: International guidelines recommend the use of appropriate informed consent procedures in low literacy research settings because written information is not known to guarantee comprehension of study information. Objectives: This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. Methods: We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants’ comprehension of the study information was measured by using a validated digitised audio questionnaire. Results: The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation and study procedures, while the lowest scores were recorded on the randomisation items. The differences in mean scores for participants’ ‘recall’ and ‘understanding’ between the first and second visits were statistically significant (F(1,41) = 25.38, p < 0.00001 and F(1,41) = 31.61, p < 0.00001, respectively). Conclusions: Our locally developed multimedia tool was acceptable and easy to administer among low literacy participants in The Gambia. It also proved to be effective in delivering and sustaining comprehension of study information across a diverse group of participants. Additional research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings. PMID:25133065

  16. NCPP's Use of Standard Metadata to Promote Open and Transparent Climate Modeling

    NASA Astrophysics Data System (ADS)

    Treshansky, A.; Barsugli, J. J.; Guentchev, G.; Rood, R. B.; DeLuca, C.

    2012-12-01

    The National Climate Predictions and Projections (NCPP) Platform is developing comprehensive regional and local information about the evolving climate to inform decision making and adaptation planning. This includes both creating metadata about the models and processes used to produce its derived data products and providing tools for creating such metadata. NCPP is using the Common Information Model (CIM), an ontology developed by a broad set of international partners in climate research, as its metadata language. This use of a standard ensures interoperability within the climate community as well as permitting access to the ecosystem of tools and services emerging alongside the CIM. The CIM itself is divided into a general-purpose (UML & XML) schema which structures metadata documents, and a project- or community-specific (XML) Controlled Vocabulary (CV) which constrains the content of metadata documents. NCPP has already modified the CIM Schema to accommodate downscaling models, simulations, and experiments. NCPP is currently developing a CV for use by the downscaling community. Incorporating downscaling into the CIM will lead to several benefits: easy access to the existing CIM Documents describing CMIP5 models and simulations that are being downscaled, access to software tools that have been developed in order to search, manipulate, and visualize CIM metadata, and coordination with national and international efforts such as ES-DOC that are working to make climate model descriptions and datasets interoperable. Providing detailed metadata descriptions which include the full provenance of derived data products will contribute to making that data (and the models and processes which generated that data) more open and transparent to the user community.
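
    The schema-plus-controlled-vocabulary pattern can be illustrated generically; the XML layout and the vocabulary below are invented stand-ins, not the actual CIM schema or the downscaling CV under development:

        # Check a toy metadata document against a toy controlled vocabulary.
        # Element names and allowed terms are invented for illustration.
        import xml.etree.ElementTree as ET

        DOWNSCALING_METHODS = {"statistical", "dynamical", "hybrid"}

        doc = ET.fromstring("""
        <simulation>
          <model>toy_downscaler</model>
          <downscalingMethod>statistical</downscalingMethod>
          <parent>CMIP5:some_gcm_run</parent>
        </simulation>
        """)

        method = doc.findtext("downscalingMethod")
        if method not in DOWNSCALING_METHODS:
            raise ValueError(f"'{method}' is not in the controlled vocabulary")
        print("metadata document passes the CV check:", method)

    The schema fixes where information lives in the document, while the CV fixes which terms are allowed, so tools built against the standard can rely on both structure and content.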

  17. TethysCluster: A comprehensive approach for harnessing cloud resources for hydrologic modeling

    NASA Astrophysics Data System (ADS)

    Nelson, J.; Jones, N.; Ames, D. P.

    2015-12-01

    Advances in water resources modeling are improving the information that can be supplied to support decisions affecting the safety and sustainability of society. However, as water resources models become more sophisticated and data-intensive they require more computational power to run. Purchasing and maintaining the computing facilities needed to support certain modeling tasks has been cost-prohibitive for many organizations. With the advent of the cloud, the computing resources needed to address this challenge are now available and cost-effective, yet a significant technical barrier to leveraging these resources remains. This barrier inhibits many decision makers and even trained engineers from taking advantage of the best science and tools available. Here we present the Python tools TethysCluster and CondorPy, which have been developed to lower the barrier to model computation in the cloud by providing (1) programmatic access to dynamically scalable computing resources, (2) a batch scheduling system to queue and dispatch the jobs to the computing resources, (3) data management for job inputs and outputs, and (4) the ability to dynamically create, submit, and monitor computing jobs. These Python tools leverage the open-source computing-resource and job-management software HTCondor to offer a flexible and scalable distributed-computing environment. While TethysCluster and CondorPy can be used independently to provision computing resources and perform large modeling tasks, they have also been integrated into Tethys Platform, a development platform for water resources web apps, to enable computing support for modeling workflows and decision-support systems deployed as web apps.
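
    A hedged sketch of dispatching a single model run through CondorPy, assuming its documented Job and Templates interface; the executable and argument are placeholders for a real modelling workflow:

        # Queue one model run with HTCondor via CondorPy (interface as
        # documented by the condorpy project; names are placeholders).
        from condorpy import Job, Templates

        job = Job("hydro_model_run", Templates.vanilla_transfer_files)
        job.executable = "run_model.py"    # hypothetical model driver
        job.arguments = "scenario_42.cfg"  # hypothetical input file
        job.submit()                       # hand the job to the scheduler
        print(job.status)                  # poll HTCondor for job state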

  18. The advancement of the built environment research through employment of structural equation modeling (SEM)

    NASA Astrophysics Data System (ADS)

    Wasilah, S.; Fahmyddin, T.

    2018-03-01

    The employment of structural equation modeling (SEM) in research has attracted increasing attention among researchers in the built environment. There is a gap in understanding the attributes, application, and importance of this approach to data analysis in built environment studies. This paper intends to provide a fundamental comprehension of the SEM method in data analysis, unveiling its attributes, employment and significance, and to present cases assessing associations amongst variables and constructs. The study uses key literature to grasp the essence of SEM with regard to built environment research. Better acknowledgment of this analytical tool may assist researchers in the built environment in analyzing data under complex research questions and in testing multivariate models in a single study.

  19. PyRhO: A Multiscale Optogenetics Simulation Platform

    PubMed Central

    Evans, Benjamin D.; Jarvis, Sarah; Schultz, Simon R.; Nikolic, Konstantin

    2016-01-01

    Optogenetics has become a key tool for understanding the function of neural circuits and controlling their behavior. An array of directly light-driven opsins has been genetically isolated from several families of organisms, with a wide range of temporal and spectral properties. In order to characterize, understand and apply these opsins, we present an integrated suite of open-source, multi-scale computational tools called PyRhO. The purpose of developing PyRhO is three-fold: (i) to characterize new (and existing) opsins by automatically fitting a minimal set of experimental data to three-, four-, or six-state kinetic models, (ii) to simulate these models at the channel, neuron and network levels, and (iii) to provide functional insights through model selection and virtual experiments in silico. The module is written in Python with an additional IPython/Jupyter notebook-based GUI, allowing models to be fit, simulations to be run and results to be shared through simply interacting with a webpage. The seamless integration of model fitting algorithms with simulation environments (including NEURON and Brian2) for these virtual opsins will enable neuroscientists to gain a comprehensive understanding of their behavior and rapidly identify the most suitable variant for application in a particular biological system. This process may thereby guide not only experimental design and opsin choice but also alterations of the opsin genetic code in a neuro-engineering feedback loop. In this way, we expect PyRhO will help to significantly advance optogenetics as a tool for transforming biological sciences. PMID:27148037
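
    The three-state kinetic scheme mentioned above (closed to open to desensitized and back to closed) can be written as a small ODE system. The rate constants and light pulse below are illustrative, not fitted PyRhO parameters, and the photocurrent is taken as proportional to the open-state occupancy:

        # Illustrative three-state opsin model C -> O -> D -> C driven by a
        # light pulse; all rate constants are invented for illustration.
        import numpy as np
        from scipy.integrate import odeint

        def rhodopsin(y, t, k_a, G_d, G_r, pulse):
            C, O, D = y
            k = k_a if pulse[0] <= t <= pulse[1] else 0.0  # light-driven rate
            return [G_r * D - k * C,        # closed
                    k * C - G_d * O,        # open (carries photocurrent)
                    G_d * O - G_r * D]      # desensitized

        t = np.linspace(0.0, 2.0, 2001)                     # seconds
        sol = odeint(rhodopsin, [1.0, 0.0, 0.0], t,
                     args=(5.0, 10.0, 0.5, (0.5, 1.0)))     # pulse 0.5-1.0 s
        print(f"peak open-state fraction: {sol[:, 1].max():.3f}")

    Fitting such a model, as PyRhO does automatically, amounts to choosing the rate constants that best reproduce measured photocurrent traces.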

  20. Theoretical Tools and Software for Modeling, Simulation and Control Design of Rocket Test Facilities

    NASA Technical Reports Server (NTRS)

    Richter, Hanz

    2004-01-01

    A rocket test stand and associated subsystems are complex devices whose operation requires that certain preparatory calculations be carried out before a test. In addition, real-time control calculations must be performed during the test, and further calculations are carried out after a test is completed. The latter may be required in order to evaluate whether a particular test conformed to specifications. These calculations are used to set valve positions, pressure setpoints, control gains and other operating parameters so that a desired system behavior is obtained and the test can be successfully carried out. Currently, calculations are made in an ad-hoc fashion and rely on trial-and-error procedures that may require activating the system with the sole purpose of finding the correct parameter settings. The goals of this project are to develop mathematical models, control methodologies and associated simulation environments to provide a systematic and comprehensive prediction and real-time control capability. The models and controller designs are expected to be useful in two respects: 1) as a design tool, a model is the only way to determine the effects of design choices without building a prototype, which is, in the context of rocket test stands, impracticable; 2) as a prediction and tuning tool, a good model allows system parameters to be set off-line, so that the expected system response conforms to specifications. This includes the setting of physical parameters, such as valve positions, and the configuration and tuning of any feedback controllers in the loop.
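
    As a generic illustration of the off-line prediction-and-tuning use described above, the following sketch simulates a hypothetical first-order pressure plant under a PI controller, so a setpoint response can be checked before any hardware is activated; the plant constants and gains are invented:

        # First-order pressure plant with a PI controller: verify off-line
        # that the chosen gains reach the setpoint. All values hypothetical.
        dt, tau, gain = 0.01, 2.0, 1.5    # time step (s), plant constants
        kp, ki = 2.0, 1.0                 # controller gains
        setpoint = 100.0                  # desired pressure (psi)

        p, integral = 0.0, 0.0
        for _ in range(int(10 / dt)):     # simulate 10 seconds
            error = setpoint - p
            integral += error * dt
            u = min(max(kp * error + ki * integral, 0.0), 100.0)  # valve %
            p += dt * (gain * u - p) / tau                        # plant update
        print(f"pressure after 10 s: {p:.1f} psi")

    Sweeping the gains in such a simulation is the off-line substitute for the trial-and-error hardware activation the abstract describes.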

  1. PyRhO: A Multiscale Optogenetics Simulation Platform.

    PubMed

    Evans, Benjamin D; Jarvis, Sarah; Schultz, Simon R; Nikolic, Konstantin

    2016-01-01

    Optogenetics has become a key tool for understanding the function of neural circuits and controlling their behavior. An array of directly light-driven opsins has been genetically isolated from several families of organisms, with a wide range of temporal and spectral properties. In order to characterize, understand and apply these opsins, we present an integrated suite of open-source, multi-scale computational tools called PyRhO. The purpose of developing PyRhO is three-fold: (i) to characterize new (and existing) opsins by automatically fitting a minimal set of experimental data to three-, four-, or six-state kinetic models, (ii) to simulate these models at the channel, neuron and network levels, and (iii) to provide functional insights through model selection and virtual experiments in silico. The module is written in Python with an additional IPython/Jupyter notebook-based GUI, allowing models to be fit, simulations to be run and results to be shared through simply interacting with a webpage. The seamless integration of model fitting algorithms with simulation environments (including NEURON and Brian2) for these virtual opsins will enable neuroscientists to gain a comprehensive understanding of their behavior and rapidly identify the most suitable variant for application in a particular biological system. This process may thereby guide not only experimental design and opsin choice but also alterations of the opsin genetic code in a neuro-engineering feedback loop. In this way, we expect PyRhO will help to significantly advance optogenetics as a tool for transforming biological sciences.

  2. Recent experience with the CQE™

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, C.D.; Kehoe, D.B.; O'Connor, D.C.

    1997-12-31

    CQE (the Coal Quality Expert) is a software tool that brings a new level of sophistication to fuel decisions by seamlessly integrating the system-wide effects of fuel purchase decisions on power plant performance, emissions, and power generation costs. The CQE technology, which addresses fuel quality from the coal mine to the busbar and the stack, is an integration and improvement of predecessor software tools including: EPRI's Coal Quality Information System, EPRI's Coal Cleaning Cost Model, EPRI's Coal Quality Impact Model, and EPRI and DOE models to predict slagging and fouling. CQE can be used as a stand-alone workstation or as a network application for utilities, coal producers, and equipment manufacturers to perform detailed analyses of the impacts of coal quality, capital improvements, operational changes, and/or environmental compliance alternatives on power plant emissions, performance and production costs. It can be used as a comprehensive, precise and organized methodology for systematically evaluating all such impacts or it may be used in pieces with some default data to perform more strategic or comparative studies.

  3. Toward a renewed Galactic Cepheid distance scale from Gaia and optical interferometry

    NASA Astrophysics Data System (ADS)

    Kervella, Pierre; Mérand, Antoine; Gallenne, Alexandre; Trahin, Boris; Nardetto, Nicolas; Anderson, Richard I.; Breitfelder, Joanne; Szabados, Laszlo; Bond, Howard E.; Borgniet, Simon; Gieren, Wolfgang; Pietrzyński, Grzegorz

    2017-09-01

    Through an innovative combination of multiple observing techniques and modeling, we are assembling a comprehensive understanding of the pulsation and close environment of Cepheids. We developed the SPIPS modeling tool, which combines all observables (radial velocimetry, photometry, angular diameters from interferometry) to derive the relevant physical parameters of the star (effective temperature, infrared excess, reddening, …) and the ratio of the distance to the projection factor, d/p. We present the application of SPIPS to the long-period Cepheid RS Pup, for which we derive p = 1.25±0.06. The addition of this massive Cepheid consolidates the existing sample of p-factor measurements towards long-period pulsators. This allows us to conclude that p is constant or mildly variable around p = 1.29±0.04 (±3%) as a function of the pulsation period. The forthcoming Gaia DR2 will provide a considerable improvement in the quantity and accuracy of the trigonometric parallaxes of Cepheids. From this sample, the SPIPS modeling tool will enable a robust calibration of the Cepheid distance scale.
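
    The role of the projection factor in this kind of modelling can be stated compactly with the standard parallax-of-pulsation relations (not a formula specific to SPIPS): the p-factor converts the observed radial velocity into the photospheric pulsation velocity, and matching the resulting linear radius variation to the interferometric angular diameter variation yields the distance, which is why d and p are degenerate and only their ratio is directly constrained:

        % Radius variation from the integrated radial velocity, and the
        % corresponding angular diameter seen at distance d:
        \Delta R(t) = -p \int_{t_0}^{t} v_{\mathrm{rad}}(t')\,\mathrm{d}t'
        \qquad
        \theta(t) = \theta_0 + \frac{2\,\Delta R(t)}{d}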

  4. Participant comprehension of research for which they volunteer: a systematic review.

    PubMed

    Montalvo, Wanda; Larson, Elaine

    2014-11-01

    Evidence indicates that research participants often do not fully understand the studies for which they have volunteered. The aim of this systematic review was to examine the relationship between the process of obtaining informed consent for research and participant comprehension and satisfaction with the research. Systematic review of published research on informed consent and participant comprehension of research for which they volunteer using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement as a guide. PubMed, Cumulative Index for Nursing and Allied Health Literature, Cochrane Central Register of Controlled Trials, and Cochrane Database of Systematic Reviews were used to search the literature for studies meeting the following inclusion criteria: (a) published between January 1, 2006, and December 31, 2013, (b) interventional or descriptive quantitative design, (c) published in a peer-reviewed journal, (d) written in English, and (e) assessed participant comprehension or satisfaction with the research process. Studies were assessed for quality using seven indicators: sampling method, use of controls or comparison groups, response rate, description of intervention, description of outcome, statistical method, and health literacy assessment. Of 176 studies identified, 27 met inclusion criteria: 13 (48%) were randomized interventional designs and 14 (52%) were descriptive. Three categories of studies included projects assessing (a) enhanced consent process or form, (b) multimedia methods, and (c) education to improve participant understanding. Most (78%) used investigator-developed tools to assess participant comprehension, did not assess participant health literacy (74%), or did not assess the readability level of the consent form (89%). Researchers found participants lacked basic understanding of research elements: randomization, placebo, risks, and therapeutic misconception. Findings indicate (a) inconsistent assessment of participant reading or health literacy level, (b) measurement variation associated with use of nonstandardized tools, and (c) continued therapeutic misconception and lack of understanding among research participants of randomization, placebo, benefit, and risk. While the Agency for Healthcare Research and Quality and the National Quality Forum have published informed consent and authorization toolkits, previously published validated tools are underutilized. Informed consent requires the assessment of health literacy, reading level, and comprehension of research participants using validated assessment tools and methods. © 2014 Sigma Theta Tau International.

  5. Tool use, aye-ayes, and sensorimotor intelligence.

    PubMed

    Sterling, E J; Povinelli, D J

    1999-01-01

    Humans, chimpanzees, capuchins and aye-ayes all display an unusually high degree of encephalization and diverse omnivorous extractive foraging. It has been suggested that the high degree of encephalization in aye-ayes may be the result of their diverse, omnivorous extractive foraging behaviors. In combination with certain forms of tool use, omnivorous extractive foraging has been hypothesized to be linked to higher levels of sensorimotor intelligence (stages 5 or 6). Although free-ranging aye-ayes have not been observed to use tools directly in the context of their extractive foraging activities, they have recently been reported to use lianas as tools in a manner that independently suggests that they may possess stage 5 or 6 sensorimotor intelligence. Although other primate species which display diverse, omnivorous extractive foraging have been tested for sensorimotor intelligence, aye-ayes have not. We report a test of captive aye-ayes' comprehension of tool use in a situation designed to simulate natural conditions. The results support the view that aye-ayes do not achieve stage 6 comprehension of tool use, but rather may use trial-and-error learning to develop tool-use behaviors. Other theories for aye-aye encephalization are considered.

  6. Novel combined patient instruction and discharge summary tool improves timeliness of documentation and outpatient provider satisfaction

    PubMed Central

    Gilliam, Meredith; Krein, Sarah L; Belanger, Karen; Fowler, Karen E; Dimcheff, Derek E; Solomon, Gabriel

    2017-01-01

    Background: Incomplete or delayed access to discharge information by outpatient providers and patients contributes to discontinuity of care and poor outcomes. Objective: To evaluate the effect of a new electronic discharge summary tool on the timeliness of documentation and communication with outpatient providers. Methods: In June 2012, we implemented an electronic discharge summary tool at our 145-bed university-affiliated Veterans Affairs hospital. The tool facilitates completion of a comprehensive discharge summary note that is available for patients and outpatient medical providers at the time of hospital discharge. Discharge summary note availability, outpatient provider satisfaction, and time between the decision to discharge a patient and discharge note completion were all evaluated before and after implementation of the tool. Results: The percentage of discharge summary notes completed by the time of first post-discharge clinical contact improved from 43% in February 2012 to 100% in September 2012 and was maintained at 100% in 2014. A survey of 22 outpatient providers showed that 90% preferred the new summary and 86% found it comprehensive. Despite increasing required documentation, the time required to discharge a patient, from physician decision to discharge note completion, improved from 5.6 h in 2010 to 4.1 h in 2012 (p = 0.04), and to 2.8 h in 2015 (p < 0.001). Conclusion: The implementation of a novel discharge summary tool improved the timeliness and comprehensiveness of discharge information as needed for the delivery of appropriate, high-quality follow-up care, without adversely affecting the efficiency of the discharge process. PMID:28491308

  7. Novel combined patient instruction and discharge summary tool improves timeliness of documentation and outpatient provider satisfaction.

    PubMed

    Gilliam, Meredith; Krein, Sarah L; Belanger, Karen; Fowler, Karen E; Dimcheff, Derek E; Solomon, Gabriel

    2017-01-01

    Incomplete or delayed access to discharge information by outpatient providers and patients contributes to discontinuity of care and poor outcomes. To evaluate the effect of a new electronic discharge summary tool on the timeliness of documentation and communication with outpatient providers. In June 2012, we implemented an electronic discharge summary tool at our 145-bed university-affiliated Veterans Affairs hospital. The tool facilitates completion of a comprehensive discharge summary note that is available for patients and outpatient medical providers at the time of hospital discharge. Discharge summary note availability, outpatient provider satisfaction, and time between the decision to discharge a patient and discharge note completion were all evaluated before and after implementation of the tool. The percentage of discharge summary notes completed by the time of first post-discharge clinical contact improved from 43% in February 2012 to 100% in September 2012 and was maintained at 100% in 2014. A survey of 22 outpatient providers showed that 90% preferred the new summary and 86% found it comprehensive. Despite increasing required documentation, the time required to discharge a patient, from physician decision to discharge note completion, improved from 5.6 h in 2010 to 4.1 h in 2012 (p = 0.04), and to 2.8 h in 2015 (p < 0.001). The implementation of a novel discharge summary tool improved the timeliness and comprehensiveness of discharge information as needed for the delivery of appropriate, high-quality follow-up care, without adversely affecting the efficiency of the discharge process.

  8. A Haptic-Enhanced System for Molecular Sensing

    NASA Astrophysics Data System (ADS)

    Comai, Sara; Mazza, Davide

    The science of haptics has received enormous attention in the last decade. One of the major application trends of haptics technology is data visualization and training. In this paper, we present a haptically enhanced system for the manipulation and tactile exploration of molecules. The geometrical models of molecules are extracted from either theoretical or empirical data, using file formats widely adopted in the chemical and biological fields. The addition of information computed with computational chemistry tools allows users to feel the interaction forces between an explored molecule and a charge associated with the haptic device, and to visualize a large amount of numerical data in a more comprehensible way. The developed tool can be used for either teaching or research purposes due to its reliance on both theoretical and experimental data.

  9. Measuring infrastructure: A key step in program evaluation and planning.

    PubMed

    Schmitt, Carol L; Glasgow, LaShawn; Lavinghouze, S Rene; Rieker, Patricia P; Fulmer, Erika; McAleer, Kelly; Rogers, Todd

    2016-06-01

    State tobacco prevention and control programs (TCPs) require a fully functioning infrastructure to respond effectively to the Surgeon General's call for accelerating the national reduction in tobacco use. The literature describes common elements of infrastructure; however, a lack of valid and reliable measures has made it difficult for program planners to monitor relevant infrastructure indicators and address observed deficiencies, or for evaluators to determine the association among infrastructure, program efforts, and program outcomes. The Component Model of Infrastructure (CMI) is a comprehensive, evidence-based framework that facilitates TCP program planning efforts to develop and maintain their infrastructure. Measures of CMI components were needed to evaluate the model's utility and predictive capability for assessing infrastructure. This paper describes the development of CMI measures and results of a pilot test with nine state TCP managers. Pilot test findings indicate that the tool has good face validity and is clear and easy to follow. The CMI tool yields data that can enhance public health efforts in a funding-constrained environment and provides insight into program sustainability. Ultimately, the CMI measurement tool could facilitate better evaluation and program planning across public health programs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Falls screening and assessment tools used in acute mental health settings: a review of policies in England and Wales

    PubMed Central

    Narayanan, V.; Dickinson, A.; Victor, C.; Griffiths, C.; Humphrey, D.

    2016-01-01

    Objectives There is an urgent need to improve the care of older people at risk of falls or who experience falls in mental health settings. The aims of this study were to evaluate the individual falls risk assessment tools adopted by National Health Service (NHS) mental health trusts in England and healthcare boards in Wales, to evaluate the comprehensiveness of these tools and to review their predictive validity. Methods All NHS mental health trusts in England (n = 56) and healthcare boards in Wales (n = 6) were invited to supply their falls policies and other relevant documentation (e.g. local falls audits). In order to check the comprehensiveness of tools listed in policy documents, the risk variables of the tools adopted by the mental health trusts' policies were compared with the 2004 National Institute for Health and Care Excellence (NICE) falls prevention guidelines. A comprehensive analytical literature review was undertaken to evaluate the predictive validity of the tools used in these settings. Results Falls policies were obtained from 46 mental health trusts. Thirty-five policies met the study inclusion criteria and were included in the analysis. The main falls assessment tools used were the St. Thomas' Risk Assessment Tool in Falling Elderly Inpatients (STRATIFY), Falls Risk Assessment Scale for the Elderly, Morse Falls Scale (MFS) and Falls Risk Assessment Tool (FRAT). On detailed examination, a number of different versions of the FRAT were evident; validated tools had inconsistent predictive validity and none of them had been validated in mental health settings. Conclusions Falls risk assessment is the most commonly used component of risk prevention strategies, but most policies included unvalidated tools, and even well-validated tools such as STRATIFY and the MFS are reported to have inconsistent predictive accuracy. This raises questions about operational usefulness, as none of these tools have been tested in acute mental health settings. The falls risk assessment tools from only four mental health trusts met all the recommendations of the NICE falls guidelines on multifactorial assessment for prevention of falls. The recent NICE (2013) guidance states that tools predicting risk using numeric scales should no longer be used; instead, multifactorial risk assessment and interventions tailored to patient needs are recommended. Trusts will need to update their policies in response to this guidance. PMID:26395210
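
    For illustration of how a numeric falls screen of this kind works, the sketch below totals five binary items against a cutoff of 2, in the style of STRATIFY; the item wording is paraphrased for illustration and should not be taken as the validated instrument.

        # Illustrative scoring of a STRATIFY-style falls screen: five yes/no
        # items, one point each; a total of 2 or more flags high risk.
        ITEMS = [
            "admitted with a fall or has fallen since admission",
            "agitated",
            "visually impaired to the extent that daily function is affected",
            "in need of especially frequent toileting",
            "impaired transfer and mobility",
        ]

        def stratify_like_score(answers):
            """answers: dict mapping item text to True/False."""
            score = sum(1 for item in ITEMS if answers.get(item, False))
            return score, score >= 2  # (total, high-risk flag)

        answers = {ITEMS[0]: True, ITEMS[1]: False, ITEMS[3]: True}
        print(stratify_like_score(answers))  # -> (2, True)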

  11. Exposure Assessment Tools by Chemical Classes - Nanomaterials

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  12. Exposure Assessment Tools by Chemical Classes - Other Organics

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  13. Exposure Assessment Tools by Lifestages and Populations - Lifestages

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  14. Exposure Assessment Tools by Lifestages and Populations

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  15. Exposure Assessment Tools by Media - Consumer Products

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  16. Exposure Assessment Tools by Media - Water and Sediment

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  17. Exposure Assessment Tools by Tiers and Types

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  18. Exposure Assessment Tools by Media - Soil and Dust

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  19. Exposure Assessment Tools by Chemical Classes - Pesticides

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  20. Exposure Assessment Tools by Media - Aquatic Biota

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  1. Exposure Assessment Tools by Media - Soil and Dust

    EPA Pesticide Factsheets

    2017-02-13

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  2. Exposure Assessment Tools by Chemical Classes - Other Organics

    EPA Pesticide Factsheets

    2017-02-13

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  3. EPA EcoBox Tools by Exposure Pathways - Soil

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  4. EPA EcoBox Tools by Exposure Pathways - Food Chains

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  5. EPA EcoBox Tools by Exposure Pathways - References

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  6. EPA EcoBox Tools by Receptors - Habitats and Ecosystems

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  7. EPA EcoBox Tools by Effects - Effects In ERA

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  8. EPA EcoBox Tools by Stressors - Stressors in ERA

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  9. EPA EcoBox Tools by Receptors - Receptors in ERA

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  10. EPA EcoBox Tools by Exposure Pathways - Air

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  11. EPA ExpoBox: Submit Tool Information

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases

  12. Conceptual Modeling in Systems Biology Fosters Empirical Findings: The mRNA Lifecycle

    PubMed Central

    Dori, Dov; Choder, Mordechai

    2007-01-01

    One of the main obstacles to understanding complex biological systems is the extent and rapid evolution of information, far beyond the capacity of individuals to manage and comprehend. Current modeling approaches and tools lack adequate capacity to concurrently model the structure and behavior of biological systems. Here we propose Object-Process Methodology (OPM), a holistic conceptual modeling paradigm, as a means to model biological systems, both diagrammatically and textually, formally and intuitively, at any desired number of levels of detail. OPM combines objects, e.g., proteins, and processes, e.g., transcription, in a way that is simple and easily comprehensible to researchers and scholars. As a case in point, we modeled the yeast mRNA lifecycle. The mRNA lifecycle involves mRNA synthesis in the nucleus, mRNA transport to the cytoplasm, and its subsequent translation and degradation therein. Recent studies have identified specific cytoplasmic foci, termed processing bodies, which contain large complexes of mRNAs and decay factors. Our OPM model of this cellular subsystem, presented here, led to the discovery of a new constituent of these complexes, the translation termination factor eRF3. Association of eRF3 with processing bodies is observed after a long-term starvation period. We suggest that OPM can eventually serve as a comprehensive evolvable model of the entire living cell system. The model would serve as a research and communication platform, highlighting unknown and uncertain aspects that can be addressed empirically and updated consequently while maintaining consistency. PMID:17849002
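
    As a toy analogue of the object-process idea (not the OPM formalism itself), the following sketch encodes objects and the processes that transform them as a small directed structure for the mRNA lifecycle described above, and computes what is reachable downstream of a given object.

        # Toy object-process graph: each process links input objects to outputs.
        processes = {
            "transcription": ({"DNA"}, {"nuclear mRNA"}),
            "export":        ({"nuclear mRNA"}, {"cytoplasmic mRNA"}),
            "translation":   ({"cytoplasmic mRNA"}, {"protein"}),
            "degradation":   ({"cytoplasmic mRNA"}, {"decay products"}),
        }

        def downstream(obj):
            """Objects reachable from `obj` through any chain of processes."""
            seen, frontier = set(), {obj}
            while frontier:
                nxt = set()
                for inputs, outputs in processes.values():
                    if frontier & inputs:
                        nxt |= outputs
                seen |= frontier
                frontier = nxt - seen
            return seen - {obj}

        print(sorted(downstream("DNA")))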

  13. HiC-bench: comprehensive and reproducible Hi-C data analysis designed for parameter exploration and benchmarking.

    PubMed

    Lazaris, Charalampos; Kelly, Stephen; Ntziachristos, Panagiotis; Aifantis, Iannis; Tsirigos, Aristotelis

    2017-01-05

    Chromatin conformation capture techniques have evolved rapidly over the last few years and have provided new insights into genome organization at an unprecedented resolution. Analysis of Hi-C data is complex and computationally intensive, involving multiple tasks and requiring robust quality assessment. This has led to the development of several tools and methods for processing Hi-C data. However, most of the existing tools do not cover all aspects of the analysis and offer only a few quality assessment options. Additionally, the availability of a multitude of tools leaves scientists wondering how these tools and associated parameters can be optimally used, and how potential discrepancies can be interpreted and resolved. Most importantly, investigators need assurance that slight changes in parameters and/or methods do not affect the conclusions of their studies. To address these issues (compare, explore and reproduce), we introduce HiC-bench, a configurable computational platform for comprehensive and reproducible analysis of Hi-C sequencing data. HiC-bench performs all common Hi-C analysis tasks, such as alignment, filtering, contact matrix generation and normalization, identification of topological domains, and scoring and annotation of specific interactions, using both published tools and our own. We have also embedded various tasks that perform quality assessment and visualization. HiC-bench is implemented as a data flow platform with an emphasis on analysis reproducibility. Additionally, the user can readily perform parameter exploration and comparison of different tools in a combinatorial manner that takes into account all desired parameter settings in each pipeline task. This unique feature facilitates the design and execution of complex benchmark studies that may involve combinations of multiple tool/parameter choices in each step of the analysis. To demonstrate the usefulness of our platform, we performed a comprehensive benchmark of existing and new TAD callers, exploring different matrix correction methods, parameter settings and sequencing depths. Users can extend our pipeline by adding more tools as they become available. HiC-bench is an easy-to-use and extensible platform for comprehensive analysis of Hi-C datasets. We expect that it will facilitate current analyses and help scientists formulate and test new hypotheses in the field of three-dimensional genome organization.
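
    The combinatorial parameter exploration described here can be pictured as enumerating the Cartesian product of per-step tool and parameter choices. The sketch below uses illustrative names, not HiC-bench's actual configuration keys.

        import itertools

        # Hypothetical per-step choices; real pipelines read these from configs.
        grid = {
            "aligner":           ["bowtie2", "bwa"],
            "matrix_correction": ["iterative", "naive"],
            "tad_caller":        ["caller_A", "caller_B"],
            "resolution_kb":     [40, 100],
        }

        def runs(grid):
            keys = sorted(grid)
            for combo in itertools.product(*(grid[k] for k in keys)):
                yield dict(zip(keys, combo))

        for cfg in runs(grid):
            print(cfg)      # each dict is one pipeline run to execute
        # 2 * 2 * 2 * 2 = 16 runs in total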

  14. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    NASA Astrophysics Data System (ADS)

    Luo, Keqin

    1999-11-01

    The electroplating industry, comprising over 10,000 plating plants nationwide, is one of the nation's major waste generators. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which cost the industry tremendously in waste treatment and disposal and hinder the further development of the industry. There is therefore an urgent need for the industry to identify the most technically effective and economically attractive methodologies and technologies to minimize waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. These form the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments on a typical plating line of an industrial partner. The unique contribution of this research is that, for the first time, the electroplating industry can (i) systematically use available WM strategies, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work forms a solid foundation for the further development of powerful technologies for comprehensive WM in the following decade.
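
    As an example of the kind of first-principles model involved, a single well-mixed rinse tank obeys the mass balance V dc/dt = Q (c_in - c) + drag-in. A minimal Euler-integration sketch with invented parameter values:

        # Well-mixed rinse tank: V * dc/dt = Q * (c_in - c) + dragin
        V, Q = 500.0, 20.0        # tank volume (L), rinse water flow (L/min)
        c_in, dragin = 0.0, 5.0   # feed concentration (mg/L), drag-in (mg/min)
        c, dt = 200.0, 0.1        # initial concentration (mg/L), step (min)

        for step in range(int(600 / dt)):      # simulate ten hours
            dcdt = (Q * (c_in - c) + dragin) / V
            c += dt * dcdt
        # Converges to the steady state c_in + dragin / Q = 0.25 mg/L.
        print(round(c, 2))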

  15. Do online prognostication tools represent a valid alternative to genomic profiling in the context of adjuvant treatment of early breast cancer? A systematic review of the literature.

    PubMed

    El Hage Chehade, Hiba; Wazir, Umar; Mokbel, Kinan; Kasem, Abdul; Mokbel, Kefah

    2018-01-01

    Decision-making regarding adjuvant chemotherapy has been based on clinical and pathological features. However, such decisions are seldom consistent. Web-based predictive models have been developed using data from cancer registries to help determine the need for adjuvant therapy. More recently, with the recognition of the heterogeneous nature of breast cancer, genomic assays have been developed to aid in therapeutic decision-making. We have carried out a comprehensive literature review regarding online prognostication tools and genomic assays to assess whether online tools could be used as valid alternatives to genomic profiling in decision-making regarding adjuvant therapy in early breast cancer. Breast cancer has recently been recognized as a heterogeneous disease based on variations in molecular characteristics. Online tools are valuable in guiding adjuvant treatment, especially in resource-constrained countries. However, in the era of personalized therapy, molecular profiling appears to be superior in predicting clinical outcome and guiding therapy. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. C3 System Performance Simulation and User Manual. Getting Started: Guidelines for Users

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This document is a User's Manual describing the C3 Simulation capabilities. The subject work was designed to simulate the communications involved in the flight of a Remotely Operated Aircraft (ROA) using the Opnet software. Opnet provides a comprehensive development environment supporting the modeling of communication networks and distributed systems. It has tools for model design, simulation, data collection, and data analysis. Opnet models are hierarchical -- consisting of a project which contains node models which in turn contain process models. Nodes can be fixed, mobile, or satellite. Links between nodes can be physical or wireless. Communications are packet based. The model is very generic in its current form. Attributes such as frequency and bandwidth can easily be modified to better reflect a specific platform. The model is not fully developed at this stage -- there are still more enhancements to be added. Current issues are documented throughout this guide.
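
    Opnet's internals are proprietary, but the heart of any packet-based simulator is a time-ordered event queue. The generic stand-in below (plain Python, no Opnet API) delivers packets across a fixed-latency link, loosely in the spirit of the ROA communications model.

        import heapq

        events = []        # (time, description) tuples ordered by time
        LATENCY = 0.250    # seconds; illustrative satellite-hop delay

        def send(t, src, dst, pkt):
            heapq.heappush(events, (t + LATENCY, f"{dst} received {pkt} from {src}"))

        # Ground station uplinks two commands to the aircraft.
        send(0.0, "ground", "ROA", "cmd-1")
        send(1.0, "ground", "ROA", "cmd-2")

        while events:
            t, what = heapq.heappop(events)
            print(f"t={t:.3f}s  {what}")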

  17. cellPACK: A Virtual Mesoscope to Model and Visualize Structural Systems Biology

    PubMed Central

    Johnson, Graham T.; Autin, Ludovic; Al-Alusi, Mostafa; Goodsell, David S.; Sanner, Michel F.; Olson, Arthur J.

    2014-01-01

    cellPACK assembles computational models of the biological mesoscale, an intermediate scale (10^-7 to 10^-8 m) between molecular and cellular biology. cellPACK's modular architecture unites existing and novel packing algorithms to generate, visualize and analyze comprehensive 3D models of complex biological environments that integrate data from multiple experimental systems biology and structural biology sources. cellPACK is currently available as open source code, with tools for validation of models and with recipes and models for five biological systems: blood plasma, cytoplasm, synaptic vesicles, HIV and a mycoplasma cell. We have applied cellPACK to model distributions of HIV envelope protein to test several hypotheses for consistency with experimental observations. Biologists, educators, and outreach specialists can interact with cellPACK models, develop new recipes and perform packing experiments through scripting and graphical user interfaces at http://cellPACK.org. PMID:25437435
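
    The packing problem at the core of cellPACK can be illustrated with the simplest possible analogue: rejection-sampled placement of non-overlapping spheres in a box. cellPACK's actual algorithms are far more sophisticated; this is only a sketch of the idea.

        import random

        def pack_spheres(n, box=100.0, radius=3.0, max_tries=10000):
            """Place up to n non-overlapping spheres in a cubic box."""
            placed = []
            for _ in range(max_tries):
                if len(placed) == n:
                    break
                p = [random.uniform(radius, box - radius) for _ in range(3)]
                # Accept only if the center is >= 2*radius from all others.
                if all(sum((a - b) ** 2 for a, b in zip(p, q)) >= (2 * radius) ** 2
                       for q in placed):
                    placed.append(p)
            return placed

        print(len(pack_spheres(200)))  # how many fit before the tries run out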

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Yan, Da; D'Oca, Simona

    Occupant behavior has significant impacts on building energy performance and occupant comfort. However, occupant behavior is not well understood and is often oversimplified in the building life cycle, due to its stochastic, diverse, complex, and interdisciplinary nature. The use of simplified methods or tools to quantify the impacts of occupant behavior in building performance simulations significantly contributes to performance gaps between simulated models and actual building energy consumption. Therefore, it is crucial to understand occupant behavior in a comprehensive way, integrating qualitative approaches and data- and model-driven quantitative approaches, and employing appropriate tools to guide the design and operation of low-energy residential and commercial buildings that integrate technological and human dimensions. This paper presents ten questions, highlighting some of the most important issues regarding concepts, applications, and methodologies in occupant behavior research. The proposed questions and answers aim to provide insights into occupant behavior for current and future researchers, designers, and policy makers, and most importantly, to inspire innovative research and applications to increase energy efficiency and reduce energy use in buildings.

  19. Detrusor underactivity: Pathophysiological considerations, models and proposals for future research. ICI-RS 2013.

    PubMed

    van Koeveringe, Gommert A; Rademakers, Kevin L J; Birder, Lori A; Korstanje, Cees; Daneshgari, Firouz; Ruggieri, Michael R; Igawa, Yasuhiko; Fry, Christopher; Wagg, Adrian

    2014-06-01

    Detrusor underactivity, resulting in either prolonged or inefficient voiding, is a common clinical problem for which treatment options are currently limited. The aim of this report is to summarize current understanding of the clinical observation and its underlying pathophysiological entities. This report results from presentations and subsequent discussion at the International Consultation on Incontinence Research Society (ICI-RS) in Bristol, 2013. The recommendations made by the ICI-RS panel include: development of study tools based on a systems pathophysiological approach, correlation of in vitro and in vivo data in experimental animals and humans, and development of more comprehensive translational animal models. In addition, there is a need for longitudinal patient data to define risk groups and for the development of screening tools. In the near future these recommendations should lead to a better understanding of detrusor underactivity and its pathophysiological background. Neurourol. Urodynam. 33:591-596, 2014. © 2014 Wiley Periodicals, Inc.

  20. Anthropometric characteristics of female smallholder farmers of Uganda--Toward design of labor-saving tools.

    PubMed

    Mugisa, Dana J; Katimbo, Abia; Sempiira, John E; Kisaalita, William S

    2016-05-01

    Sub-Saharan African women on small-acreage farms carry a disproportionately higher labor burden, which is one of the main reasons they are unable to produce for both home and the market and realize higher incomes. Labor-saving interventions such as hand-tools are needed to save time and/or increase productivity in, for example, land preparation for crop and animal agriculture, post-harvest processing, and meeting daily energy and water needs. Development of such tools requires comprehensive and context-specific anthropometric data on body dimensions, and existing databases based on Western women may be less relevant. We conducted measurements on 89 women to provide preliminary results toward answering two questions. First, how well existing databases apply to the design of hand-tools for sub-Saharan African women. Second, how universal body dimension predictive models are among ethnic groups. Our results show that body dimensions of the Bantu and Nilotic ethnolinguistic groups differ from each other, and both differ from those of American women. These results strongly support the need for establishing anthropometric databases for sub-Saharan African women, toward hand-tool design. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Set-up of a decision support system to support sustainable development of the Laguna de Bay, Philippines.

    PubMed

    Nauta, Tjitte A; Bongco, Alicia E; Santos-Borja, Adelina C

    2003-01-01

    Over recent decades, population expansion, deforestation, land conversion, urbanisation, intense fisheries and industrialisation have produced massive changes in the Laguna de Bay catchment, Philippines. The resulting problems include rapid siltation of the lake, eutrophication, inputs of toxics, flooding problems and loss of biodiversity. Rational and systematic resolution of conflicting water use and water allocation interests is now urgently needed in order to ensure sustainable use of the water resources. With respect to the competing and conflicting pressures on the water resources, the Laguna Lake Development Authority (LLDA) needs to achieve comprehensive management and development of the area. In view of these problems and needs, the Government of the Netherlands funded a two-year project entitled 'Sustainable Development of the Laguna de Bay Environment'. A comprehensive tool has been developed to support decision-making at catchment level. This consists of an ArcView GIS database linked to a state-of-the-art modelling suite, including hydrological and waste load models for the catchment area and a three-dimensional hydrodynamic and water quality model (Delft3D) linked to a habitat evaluation module for the lake. In addition, MS Office-based tools to support a stakeholder analysis and financial and economic assessments have been developed. The project also focused on technical studies relating to dredging, drinking water supply and infrastructure works. These aimed to produce technically and economically feasible solutions to water quantity and quality problems. The paper also presents the findings of a study on the development of polder islands in the Laguna de Bay, addressing the water quantity and quality problems and focusing on the application of the decision support system.

  2. Parametric Testing of Launch Vehicle FDDR Models

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Bajwa, Anupa; Berg, Peter; Thirumalainambi, Rajkumar

    2011-01-01

    For the safe operation of a complex system like a (manned) launch vehicle, real-time information about the state of the system and potential faults is extremely important. The on-board FDDR (Failure Detection, Diagnostics, and Response) system is a software system to detect and identify failures, provide real-time diagnostics, and initiate fault recovery and mitigation. The ERIS (Evaluation of Rocket Integrated Subsystems) failure simulation is a unified Matlab/Simulink model of the Ares I Launch Vehicle with modular, hierarchical subsystems and components. With this model, the nominal flight performance characteristics can be studied. Additionally, failures can be injected to see their effects on vehicle state and on vehicle behavior. A comprehensive test and analysis of such a complicated model is virtually impossible. In this paper, we describe how parametric testing (PT) can be used to support testing and analysis of the ERIS failure simulation. PT uses a combination of Monte Carlo techniques with n-factor combinatorial exploration to generate a small, yet comprehensive set of parameters for the test runs. For the analysis of the high-dimensional simulation data, we use multivariate clustering to automatically find structure in this high-dimensional data space. Our tools can generate detailed HTML reports that facilitate the analysis.
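
    The flavor of this test-case generation can be sketched as a full factorial over a few discrete fault switches, with continuous dispersions drawn by Monte Carlo for each case (illustrative names, not the actual ERIS interface); with many factors, n-factor covering arrays replace the full product to keep the case count small.

        import itertools, random

        random.seed(1)
        fault_factors = {"engine_ok": [True, False], "sensor_ok": [True, False],
                         "actuator_ok": [True, False]}

        keys = list(fault_factors)
        cases = []
        for combo in itertools.product(*(fault_factors[k] for k in keys)):
            case = dict(zip(keys, combo))
            # Monte Carlo draws for continuous dispersions:
            case["thrust_bias"] = random.gauss(0.0, 0.02)
            case["wind_mps"] = random.uniform(0.0, 30.0)
            cases.append(case)
        print(len(cases), "test cases")   # 2^3 = 8 discrete combinations here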

  3. Comprehensive Modeling and Visualization of Cardiac Anatomy and Physiology from CT Imaging and Computer Simulations

    PubMed Central

    Sun, Peng; Zhou, Haoyin; Ha, Seongmin; Hartaigh, Bríain ó; Truong, Quynh A.; Min, James K.

    2016-01-01

    In clinical cardiology, both anatomy and physiology are needed to diagnose cardiac pathologies. CT imaging and computer simulations provide valuable and complementary data for this purpose. However, it remains challenging to gain useful information from the large amount of high-dimensional diverse data. The current tools are not adequately integrated to visualize anatomic and physiologic data from a complete yet focused perspective. We introduce a new computer-aided diagnosis framework, which allows for comprehensive modeling and visualization of cardiac anatomy and physiology from CT imaging data and computer simulations, with a primary focus on ischemic heart disease. The following visual information is presented: (1) Anatomy from CT imaging: geometric modeling and visualization of cardiac anatomy, including four heart chambers, left and right ventricular outflow tracts, and coronary arteries; (2) Function from CT imaging: motion modeling, strain calculation, and visualization of four heart chambers; (3) Physiology from CT imaging: quantification and visualization of myocardial perfusion and contextual integration with coronary artery anatomy; (4) Physiology from computer simulation: computation and visualization of hemodynamics (e.g., coronary blood velocity, pressure, shear stress, and fluid forces on the vessel wall). Substantially, feedback from cardiologists have confirmed the practical utility of integrating these features for the purpose of computer-aided diagnosis of ischemic heart disease. PMID:26863663

  4. Helping School Leaders Help New Teachers: A Tool for Transforming School-Based Induction

    ERIC Educational Resources Information Center

    Birkeland, Sarah; Feiman-Nemser, Sharon

    2012-01-01

    Ample research demonstrates the power of comprehensive induction to develop and retain new teachers. Education scholars generally agree on what powerful systems of induction include, yet few tools exist for guiding schools in creating such systems. Drawing on theory and practice, we have created such a tool. This article introduces the "Continuum…

  5. Development of an informatics infrastructure for data exchange of biomolecular simulations: architecture, data models and ontology

    PubMed Central

    Thibault, J. C.; Roe, D. R.; Eilbeck, K.; Cheatham, T. E.; Facelli, J. C.

    2015-01-01

    Biomolecular simulations aim to simulate structure, dynamics, interactions, and energetics of complex biomolecular systems. With the recent advances in hardware, it is now possible to use more complex and accurate models, but also reach time scales that are biologically significant. Molecular simulations have become a standard tool for toxicology and pharmacology research, but organizing and sharing data – both within the same organization and among different ones – remains a substantial challenge. In this paper we review our recent work leading to the development of a comprehensive informatics infrastructure to facilitate the organization and exchange of biomolecular simulations data. Our efforts include the design of data models and dictionary tools that allow the standardization of the metadata used to describe the biomedical simulations, the development of a thesaurus and ontology for computational reasoning when searching for biomolecular simulations in distributed environments, and the development of systems based on these models to manage and share the data at a large scale (iBIOMES), and within smaller groups of researchers at laboratory scale (iBIOMES Lite), that take advantage of the standardization of the metadata used to describe biomolecular simulations. PMID:26387907

  6. Development of an informatics infrastructure for data exchange of biomolecular simulations: Architecture, data models and ontology.

    PubMed

    Thibault, J C; Roe, D R; Eilbeck, K; Cheatham, T E; Facelli, J C

    2015-01-01

    Biomolecular simulations aim to simulate structure, dynamics, interactions, and energetics of complex biomolecular systems. With the recent advances in hardware, it is now possible to use more complex and accurate models, but also reach time scales that are biologically significant. Molecular simulations have become a standard tool for toxicology and pharmacology research, but organizing and sharing data - both within the same organization and among different ones - remains a substantial challenge. In this paper we review our recent work leading to the development of a comprehensive informatics infrastructure to facilitate the organization and exchange of biomolecular simulations data. Our efforts include the design of data models and dictionary tools that allow the standardization of the metadata used to describe the biomedical simulations, the development of a thesaurus and ontology for computational reasoning when searching for biomolecular simulations in distributed environments, and the development of systems based on these models to manage and share the data at a large scale (iBIOMES), and within smaller groups of researchers at laboratory scale (iBIOMES Lite), that take advantage of the standardization of the metadata used to describe biomolecular simulations.
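
    The value of standardized metadata is easiest to see in miniature. Below is a hypothetical record for one simulation with invented field names; the actual iBIOMES dictionaries define their own controlled vocabulary.

        import json

        record = {
            "id": "sim-000123",
            "method": "molecular dynamics",   # controlled-vocabulary term
            "force_field": "ff14SB",
            "solvent_model": "TIP3P",
            "temperature_K": 300,
            "length_ns": 100,
            "system": {"molecule": "protein", "residues": 129},
        }

        # Standardized keys make cross-laboratory search trivial:
        def matches(rec, **criteria):
            return all(rec.get(k) == v for k, v in criteria.items())

        print(matches(record, solvent_model="TIP3P"))   # True
        print(json.dumps(record, indent=2))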

  7. Design and implementation of ergonomic performance measurement system at a steel plant in India.

    PubMed

    Ray, Pradip Kumar; Tewari, V K

    2012-01-01

    Management of Tata Steel, the largest steel-making company in India's private sector, felt the need to develop a framework to determine the levels of ergonomic performance at its different workplaces. The objectives of the study are manifold: to identify and characterize the ergonomic variables for a given worksystem with regard to work efficiency, operator safety, and working conditions, and to design a comprehensive Ergonomic Performance Indicator (EPI) for quantitative determination of the ergonomic status and maturity of a given worksystem. The IIT Kharagpur study team consisted of three faculty members, and the management of Tata Steel formed an eleven-member team to implement the EPI model. In order to design and develop the EPI model with the full participation and understanding of the personnel concerned at Tata Steel, a three-phase action plan was prepared. The project consists of three phases: preparation and data collection, detailed structuring, and validation of the EPI model. Identification of ergonomic performance factors, development of an interaction matrix, design of an assessment tool, and testing and validation of the assessment tool (EPI) in varied situations are the major steps in these phases. The case study discusses the EPI model and its applications in detail.

  8. SLUG - stochastically lighting up galaxies - III. A suite of tools for simulated photometry, spectroscopy, and Bayesian inference with stochastic stellar populations

    NASA Astrophysics Data System (ADS)

    Krumholz, Mark R.; Fumagalli, Michele; da Silva, Robert L.; Rendahl, Theodore; Parra, Jonathan

    2015-09-01

    Stellar population synthesis techniques for predicting the observable light emitted by a stellar population have extensive applications in numerous areas of astronomy. However, accurate predictions for small populations of young stars, such as those found in individual star clusters, star-forming dwarf galaxies, and small segments of spiral galaxies, require that the population be treated stochastically. Conversely, accurate deductions of the properties of such objects also require consideration of stochasticity. Here we describe a comprehensive suite of modular, open-source software tools for tackling these related problems. These include the following: a greatly enhanced version of the SLUG code introduced by da Silva et al., which computes spectra and photometry for stochastically or deterministically sampled stellar populations with nearly arbitrary star formation histories, clustering properties, and initial mass functions; CLOUDY_SLUG, a tool that automatically couples SLUG-computed spectra with the CLOUDY radiative transfer code in order to predict stochastic nebular emission; BAYESPHOT, a general-purpose tool for performing Bayesian inference on the physical properties of stellar systems based on unresolved photometry; and CLUSTER_SLUG and SFR_SLUG, a pair of tools that use BAYESPHOT on a library of SLUG models to compute the mass, age, and extinction of mono-age star clusters, and the star formation rate of galaxies, respectively. The latter two tools make use of an extensive library of pre-computed stellar population models, which are included in the software. The complete package is available at http://www.slugsps.com.
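
    The root of the stochasticity is easy to demonstrate: drawing individual stars from an initial mass function until a target cluster mass is reached yields a different mass spectrum, and hence different light, on every draw. A minimal sketch with a Salpeter-like power-law IMF (not SLUG's actual sampler):

        import random

        def draw_star(m_min=0.08, m_max=120.0, alpha=2.35):
            """Inverse-CDF draw from a Salpeter-like power-law IMF (solar masses)."""
            u = random.random()
            a = 1.0 - alpha
            return (m_min**a + u * (m_max**a - m_min**a)) ** (1.0 / a)

        def sample_cluster(target_mass):
            """Add stars until the target mass is exceeded."""
            stars = []
            while sum(stars) < target_mass:
                stars.append(draw_star())
            return stars

        random.seed(0)
        for _ in range(3):
            c = sample_cluster(500.0)   # a 500 solar-mass cluster
            # Star count and most massive star vary from draw to draw.
            print(len(c), round(max(c), 1))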

  9. Story Map Instruction: A Road Map for Reading Comprehension.

    ERIC Educational Resources Information Center

    Davis, Zephaniah, T.; McPherson, Michael D.

    1989-01-01

    Introduces teachers to the development and use of story maps as a tool for promoting reading comprehension. Presents a definition and review of story map research. Explains how to construct story maps, and offers suggestions for starting story map instruction. Provides variations on the use of story maps. (MG)

  10. Teacher Logs: A Tool for Gaining a Comprehensive Understanding of Classroom Practices

    ERIC Educational Resources Information Center

    Glennie, Elizabeth J.; Charles, Karen J.; Rice, Olivia N.

    2017-01-01

    Examining repeated classroom encounters over time provides a comprehensive picture of activities. Studies of instructional practices in classrooms have traditionally relied on two methods: classroom observations, which are expensive, and surveys, which are limited in scope and accuracy. Teacher logs provide a "real-time" method for…

  11. Enhancing Comprehension through Graphic Organizers.

    ERIC Educational Resources Information Center

    Ben-David, Renee

    The purpose of this study was to determine whether graphic organizers serve as a better tool for comprehension assessment than traditional tests. Subjects, 16 seventh-grade learning disabled students, were given 8 weeks of instruction and assessments using both graphic organizer and linear note forms. Tests were graded, compared and contrasted to…

  12. Evaluating California Campus Tobacco Policies Using the American College Health Association Guidelines and the Institutional Grammar Tool

    ERIC Educational Resources Information Center

    Roditis, Maria L.; Wang, Donna; Glantz, Stanton A.; Fallin, Amanda

    2015-01-01

    Objective: To measure comprehensiveness of California campus tobacco policies. Participants: Sixteen campuses representing different regions, institution types, and policies. Research occurred June-August 2013. Methods: Comprehensiveness was scored using American College Health Association's (ACHA) "Position Statement on Tobacco." The…

  13. Measuring Reading Comprehension with the Lexile Framework.

    ERIC Educational Resources Information Center

    Stenner, A. Jackson

    This paper shows how the concept of general objectivity can be used to improve behavioral science measurement, particularly as it applies to the Lexile Framework, a tool for objectively measuring reading comprehension. It begins with a dialogue between a physicist and a psychometrician that details some of the differences between physical science…

  14. Towards a National Space Weather Predictive Capability

    NASA Astrophysics Data System (ADS)

    Fox, N. J.; Ryschkewitsch, M. G.; Merkin, V. G.; Stephens, G. K.; Gjerloev, J. W.; Barnes, R. J.; Anderson, B. J.; Paxton, L. J.; Ukhorskiy, A. Y.; Kelly, M. A.; Berger, T. E.; Bonadonna, L. C. M. F.; Hesse, M.; Sharma, S.

    2015-12-01

    National needs in the area of space weather informational and predictive tools are growing rapidly. Adverse conditions in the space environment can cause disruption of satellite operations, communications, navigation, and electric power distribution grids, leading to a variety of socio-economic losses and impacts on our security. Future space exploration and most modern human endeavors will require major advances in physical understanding and improved transition of space research to operations. At present, only a small fraction of the latest research and development results from NASA, NOAA, NSF and DoD investments are being used to improve space weather forecasting and to develop operational tools. The power of modern research and space weather model development needs to be better utilized to enable comprehensive, timely, and accurate operational space weather tools. The mere production of space weather information is not sufficient to address the needs of those who are affected by space weather. A coordinated effort is required to support research-to-applications transition efforts and to develop the tools required by those who rely on this information. In this presentation we will review the space weather system developed for the Van Allen Probes mission, together with other datasets, tools and models that have resulted from research by scientists at JHU/APL. We will look at how these, and results from future missions such as Solar Probe Plus, could be applied to support space weather applications in coordination with other community assets and capabilities.

  15. Are your students ready for anatomy and physiology? Developing tools to identify students at risk for failure.

    PubMed

    Gultice, Amy; Witham, Ann; Kallmeyer, Robert

    2015-06-01

    High failure rates in introductory college science courses, including anatomy and physiology, are common at institutions across the country, and determining the specific factors that contribute to this problem is challenging. To identify students at risk for failure in introductory physiology courses at our open-enrollment institution, an online pilot survey was administered to 200 biology students. The survey results revealed several predictive factors related to academic preparation and prompted a comprehensive analysis of college records of >2,000 biology students over a 5-yr period. Using these historical data, a model that was 91% successful in predicting student success in these courses was developed. The results of the present study support the use of surveys and similar models to identify at-risk students and to provide guidance in the development of evidence-based advising programs and pedagogies. This comprehensive approach may be a tangible step in improving student success for students from a wide variety of backgrounds in anatomy and physiology courses. Copyright © 2015 The American Physiological Society.
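
    A model of the kind described, predicting pass/fail from academic-preparation variables, is commonly a logistic regression. The sketch below fits one on synthetic data with invented feature names; it is not the study's actual model or variables.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 2000
        # Synthetic stand-ins: college GPA, prior science courses, placement score.
        X = np.column_stack([rng.normal(2.8, 0.6, n),
                             rng.integers(0, 5, n),
                             rng.normal(70, 12, n)])
        logit = -8.0 + 1.5 * X[:, 0] + 0.3 * X[:, 1] + 0.04 * X[:, 2]
        y = rng.random(n) < 1 / (1 + np.exp(-logit))   # simulated pass/fail

        Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
        model = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
        print(round(model.score(Xte, yte), 2))         # held-out accuracy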

  16. SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects

    NASA Technical Reports Server (NTRS)

    Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M

    1998-01-01

    SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programming. SoftLab addresses the inadequacies of existing soft-computing software by supporting comprehensive multidisciplinary functionalities, from management tools to engineering systems. Furthermore, the built-in features help the user process and analyze information more efficiently through a friendly yet powerful interface, and allow the user to specify user-specific processing modules, thereby extending the standard configuration of the software environment.

  17. Analytical aspects of plant metabolite profiling platforms: current standings and future aims.

    PubMed

    Seger, Christoph; Sturm, Sonja

    2007-02-01

    Over the past years, metabolic profiling has been established as a comprehensive systems biology tool. Mass spectrometry or NMR spectroscopy-based technology platforms combined with unsupervised or supervised multivariate statistical methodologies allow a deep insight into the complex metabolite patterns of plant-derived samples. Within this review, we provide a thorough introduction to the analytical hard- and software requirements of metabolic profiling platforms. Methodological limitations are addressed, and the metabolic profiling workflow is exemplified by summarizing recent applications ranging from model systems to more applied topics.
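
    The unsupervised workhorse in such platforms is typically principal component analysis. A self-contained sketch via SVD on mean-centered intensities (random stand-ins for real spectra):

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(30, 500))      # 30 samples x 500 spectral bins
        X[:15] += 0.5                        # crude "treatment group" offset

        Xc = X - X.mean(axis=0)              # mean-center each bin
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        scores = U * s                       # sample coordinates (PC scores)
        explained = s**2 / np.sum(s**2)
        print(np.round(explained[:3], 3))    # variance share of first 3 PCs
        print(np.round(scores[:3, :2], 2))   # first samples on PC1/PC2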

  18. Grid Modernization Laboratory Consortium - Testing and Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroposki, Benjamin; Skare, Paul; Pratt, Rob

    This paper highlights some of the unique testing capabilities and projects being performed at several national laboratories as part of the U.S. Department of Energy Grid Modernization Laboratory Consortium. As part of this effort, the Grid Modernization Laboratory Consortium Testing Network is being developed to accelerate grid modernization by enabling access to a comprehensive testing infrastructure and creating a repository of validated models and simulation tools that will be publicly available. This work is key to accelerating the development, validation, standardization, adoption, and deployment of new grid technologies to help meet U.S. energy goals.

  19. At-TAX: a whole genome tiling array resource for developmental expression analysis and transcript identification in Arabidopsis thaliana

    PubMed Central

    Laubinger, Sascha; Zeller, Georg; Henz, Stefan R; Sachsenberg, Timo; Widmer, Christian K; Naouar, Naïra; Vuylsteke, Marnik; Schölkopf, Bernhard; Rätsch, Gunnar; Weigel, Detlef

    2008-01-01

    Gene expression maps for model organisms, including Arabidopsis thaliana, have typically been created using gene-centric expression arrays. Here, we describe a comprehensive expression atlas, Arabidopsis thaliana Tiling Array Express (At-TAX), which is based on whole-genome tiling arrays. We demonstrate that tiling arrays are accurate tools for gene expression analysis and identified more than 1,000 unannotated transcribed regions. Visualizations of gene expression estimates, transcribed regions, and tiling probe measurements are accessible online at the At-TAX homepage. PMID:18613972

  20. Participatory data collection and monitoring of agricultural pest dynamics for climate-resilient coffee production using Tiko'n, a generic tool to develop agroecological food web models

    NASA Astrophysics Data System (ADS)

    Rojas, M.; Malard, J. J.; Adamowski, J. F.; Tuy, H.

    2016-12-01

    Climate variability impacts agricultural processes through many mechanisms. For example, the proliferation of pests and diseases increases with warmer climate and altered wind patterns, as longer growing seasons allow pest species to complete more reproductive cycles and changes in the weather patterns alter the stages and rates of development of pests and pathogens. Several studies suggest that enhancing plant diversity and complexity in farming systems, such as in agroforestry systems, reduces the vulnerability of farms to extreme climatic events. On the other hand, other authors have argued that vegetation diversity does not necessarily reduce the incidence of pests and diseases, highlighting the importance of understanding how, where and when it is recommendable to diversify vegetation to improve pest and disease control, and emphasising the need for tools to develop, monitor and evaluate agroecosystems. In order to understand how biodiversity can enhance ecosystem services provided by the agroecosystem in the context of climatic variability, it is important to develop comprehensive models that include the role of trophic chains in the regulation of pests, which can be achieved by integrating crop models with pest-predator models, also known as agroecosystem network (AEN) models. Here we present a methodology for the participatory data collection and monitoring necessary for running Tiko'n, an AEN model that can also be coupled to a crop model such as DSSAT. This methodology aims to combine the local and practical knowledge of farmers with the scientific knowledge of entomologists and agronomists, allowing for the simplification of complex ecological networks of plant and insect interactions. This also increases the acceptability, credibility, and comprehension of the model by farmers, allowing them to understand their relationship with the local agroecosystem and their potential to use key agroecosystem principles such as functional diversity to mitigate climate variability impacts. Preliminary results of a study currently being conducted in a coffee agroforestry system in El Quebracho, Guatemala, will be presented, where the data were collected directly by farmers over eight consecutive months. Finally, future recommendations from lessons learnt during this study will be discussed.
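
    A pest-predator module of the kind such food web models chain together can be illustrated with the classic Lotka-Volterra pair, integrated with a simple Euler step; the coefficients below are invented for illustration and are not calibrated to any coffee system.

        # dP/dt = r*P - a*P*N ; dN/dt = b*P*N - m*N  (P = pest, N = predator)
        r, a, b, m = 0.8, 0.04, 0.01, 0.4
        P, N, dt = 40.0, 9.0, 0.01

        for step in range(int(120 / dt)):     # simulate 120 days
            dP = r * P - a * P * N
            dN = b * P * N - m * N
            P += dt * dP
            N += dt * dN
        print(round(P, 1), round(N, 1))       # the two populations oscillate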
