Sample records for current modeling tools

  1. Limitations Of The Current State Space Modelling Approach In Multistage Machining Processes Due To Operation Variations

    NASA Astrophysics Data System (ADS)

    Abellán-Nebot, J. V.; Liu, J.; Romero, F.

    2009-11-01

    The State Space modelling approach has been recently proposed as an engineering-driven technique for part quality prediction in Multistage Machining Processes (MMP). Current State Space models incorporate fixture and datum variations in the multi-stage variation propagation, without explicitly considering common operation variations such as machine-tool thermal distortions, cutting-tool wear, cutting-tool deflections, etc. This paper shows the limitations of the current State Space model through an experimental case study where the effects of spindle thermal expansion, cutting-tool flank wear and locator errors are introduced. The paper also discusses the extension of the current State Space model to include operation variations and its potential benefits.
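
    For context, a minimal sketch of the generic state space (stream-of-variation) form that such MMP models take; the notation follows the common convention in this literature and is an assumption here, not reproduced from this record:

    ```latex
    % Generic stream-of-variation state space form (standard literature
    % convention; the symbols are assumptions, not taken from this record).
    % x_k : part deviation state after stage k
    % u_k : fixture/datum error inputs at stage k
    % w_k : unmodeled disturbances (the "operation variations" at issue)
    % y_k : measured deviations, v_k : measurement noise
    \begin{align}
      \mathbf{x}_k &= \mathbf{A}_k \mathbf{x}_{k-1} + \mathbf{B}_k \mathbf{u}_k + \mathbf{w}_k,\\
      \mathbf{y}_k &= \mathbf{C}_k \mathbf{x}_k + \mathbf{v}_k .
    \end{align}
    ```

    In these terms, the paper's argument is that effects such as spindle thermal expansion and flank wear currently end up lumped into w_k rather than being modeled explicitly.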

  2. THE USEPA'S METAL FINISHING FACILITY RISK SCREENING TOOL (MFFRST) AND POLLUTION PREVENTION TOOL (MFFP2T)

    EPA Science Inventory

    This presentation will provide an overview of the USEPA's Metal Finishing Facility Risk Screening Tool, including a discussion of the models used and outputs. The tool is currently being expanded to include pollution prevention considerations as part of the model. The current st...

  3. THE ATMOSPHERIC MODEL EVALUATION TOOL

    EPA Science Inventory

    This poster describes a model evaluation tool that is currently being developed and applied for meteorological and air quality model evaluation. The poster outlines the framework and provides examples of statistical evaluations that can be performed with the model evaluation tool...

  4. COUNCIL FOR REGULATORY ENVIRONMENTAL MODELING (CREM) PILOT WATER QUALITY MODEL SELECTION TOOL

    EPA Science Inventory

    EPA's Council for Regulatory Environmental Modeling (CREM) is currently supporting the development of a pilot model selection tool that is intended to help the states and the regions implement the total maximum daily load (TMDL) program. This tool will be implemented within the ...

  5. Examination of the low frequency limit for helicopter noise data in the Federal Aviation Administration's Aviation Environmental Design Tool and Integrated Noise Model

    DOT National Transportation Integrated Search

    2010-04-19

    The Federal Aviation Administration (FAA) aircraft noise modeling tools Aviation Environmental Design Tool (AEDT) and Integrated Noise Model (INM) do not currently consider noise below 50 Hz in their computations. This paper describes a preliminary ...

  6. Watershed Health Assessment Tools Investigating Fisheries WHAT IF Version 2: A Manager’s Guide to New Features

    EPA Pesticide Factsheets

    The CVI Watershed Health Assessment Tool Investigating Fisheries, WHAT IF version 2, currently contains five components: the Regional Prioritization Tool, Hydrologic Tool, Clustering Tool, Habitat Suitability Tool, and the BASS model.

  7. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools suited for Space Physics post processing. Building on the work from the Center for Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The tool-kit plugin currently provides tools for reading LFM and Enlil data sets, and provides automated tools for data comparison with NASA's CDAweb database. As work progresses, many additional tools will be added, and through open-source collaboration we hope to add readers for additional model types, as well as any additional tools deemed necessary by the scientific community. The ultimate goal of this work is to provide a complete Sun-to-Earth model analysis toolset.

  8. Modelling the urban water cycle as an integrated part of the city: a review.

    PubMed

    Urich, Christian; Rauch, Wolfgang

    2014-01-01

    In contrast to common perceptions, the urban water infrastructure system is a complex and dynamic system that is constantly evolving and adapting to changes in the urban environment, to sustain existing services and provide additional ones. Instead of simplifying urban water infrastructure to a static system that is decoupled from its urban context, new management strategies use the complexity of the system to their advantage by integrating centralised with decentralised solutions and explicitly embedding water systems into their urban form. However, to understand and test possible adaptation strategies, urban water modelling tools are required to support exploration of their effectiveness as the human-technology-environment system coevolves under different future scenarios. The urban water modelling community has taken first steps to developing these new modelling tools. This paper critically reviews the historical development of urban water modelling tools and provides a summary of the current state of integrated modelling approaches. It reflects on the challenges that arise through the current practice of coupling urban water management tools with urban development models and discusses a potential pathway towards a new generation of modelling tools.

  9. Status of the AIAA Modeling and Simulation Format Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2008-01-01

    The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.
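
    To illustrate the kind of "static element" such an exchange standard encodes (a tabulated aerodynamic coefficient evaluated by interpolation at run time), here is a minimal NumPy sketch; the table values and names are hypothetical, and this is a generic illustration, not the DAVE-ML format or API:

    ```python
    import numpy as np

    # Hypothetical 1-D lift-coefficient table, CL as a function of angle
    # of attack (deg): the kind of static breakpoint table a flight
    # simulation exchange format encodes.
    ALPHA_BREAKPOINTS = np.array([-4.0, 0.0, 4.0, 8.0, 12.0, 16.0])
    CL_VALUES         = np.array([-0.2, 0.2, 0.6, 1.0, 1.3, 1.4])

    def lift_coefficient(alpha_deg: float) -> float:
        """Linearly interpolate CL at the given angle of attack.

        np.interp clamps to the end values outside the breakpoint range,
        mirroring common table-lookup behavior in simulators.
        """
        return float(np.interp(alpha_deg, ALPHA_BREAKPOINTS, CL_VALUES))

    print(lift_coefficient(6.0))  # 0.8, halfway between the 4 and 8 deg rows
    ```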

  10. Verification of a Multiphysics Toolkit against the Magnetized Target Fusion Concept

    NASA Technical Reports Server (NTRS)

    Thomas, Scott; Perrell, Eric; Liron, Caroline; Chiroux, Robert; Cassibry, Jason; Adams, Robert B.

    2005-01-01

    In the spring of 2004 the Advanced Concepts team at MSFC embarked on an ambitious project to develop a suite of modeling routines that would interact with one another. The tools would each numerically model a portion of any advanced propulsion system. The tools were divided by physics categories, hence the name multiphysics toolkit. Currently, most of the anticipated modeling tools have been created and integrated. Results are given in this paper for both a quarter nozzle with chemically reacting flow and the interaction of two plasma jets representative of a Magnetized Target Fusion device. The results have not yet been calibrated against real data, but this paper demonstrates the current capability of the multiphysics tool and planned future enhancements.

  11. Software Engineering Tools for Scientific Models

    NASA Technical Reports Server (NTRS)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  12. Toward A Simulation-Based Tool for the Treatment of Vocal Fold Paralysis

    PubMed Central

    Mittal, Rajat; Zheng, Xudong; Bhardwaj, Rajneesh; Seo, Jung Hee; Xue, Qian; Bielamowicz, Steven

    2011-01-01

    Advances in high-performance computing are enabling a new generation of software tools that employ computational modeling for surgical planning. Surgical management of laryngeal paralysis is one area where such computational tools could have a significant impact. The current paper describes a comprehensive effort to develop a software tool for planning medialization laryngoplasty where a prosthetic implant is inserted into the larynx in order to medialize the paralyzed vocal fold (VF). While this is one of the most common procedures used to restore voice in patients with VF paralysis, it has a relatively high revision rate, and the tool being developed is expected to improve surgical outcomes. This software tool models the biomechanics of airflow-induced vibration in the human larynx and incorporates sophisticated approaches for modeling the turbulent laryngeal flow, the complex dynamics of the VFs, as well as the production of voiced sound. The current paper describes the key elements of the modeling approach, presents computational results that demonstrate the utility of the approach and also describes some of the limitations and challenges. PMID:21556320

  13. Automated MRI segmentation for individualized modeling of current flow in the human head.

    PubMed

    Huang, Yu; Dmochowski, Jacek P; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C

    2013-12-01

    High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Fully automated individualized modeling may now be feasible for large-sample EEG research studies and tDCS clinical trials.

  14. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool

    PubMed Central

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-01-01

    Background It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080

  15. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool.

    PubMed

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-08-15

    It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes.
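
    As an illustration of one of the global methods listed above, partial rank correlation coefficients (PRCC), here is a minimal NumPy sketch applied to a toy model; the model function, parameter ranges, and sample size are hypothetical, and this is not SBML-SAT's implementation:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def toy_model(params):
        """Hypothetical scalar output: y = k1 * x / (k2 + x)."""
        k1, k2, x = params.T
        return k1 * x / (k2 + x)

    def ranks(a):
        """Rank-transform each column (1..n); ties are negligible here."""
        return np.argsort(np.argsort(a, axis=0), axis=0) + 1.0

    def prcc(X, y):
        """PRCC of each parameter with the output: correlate the residuals
        of parameter-i ranks and output ranks after regressing out the
        ranks of all remaining parameters."""
        Rx, ry = ranks(X), ranks(y[:, None]).ravel()
        coeffs = []
        for i in range(X.shape[1]):
            A = np.column_stack([np.delete(Rx, i, axis=1), np.ones(len(ry))])
            res_x = Rx[:, i] - A @ np.linalg.lstsq(A, Rx[:, i], rcond=None)[0]
            res_y = ry - A @ np.linalg.lstsq(A, ry, rcond=None)[0]
            coeffs.append(np.corrcoef(res_x, res_y)[0, 1])
        return np.array(coeffs)

    X = rng.uniform(0.1, 10.0, size=(2000, 3))  # Monte Carlo parameter sample
    print(prcc(X, toy_model(X)))                # one PRCC per parameter
    ```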

  16. Modeling and Prediction of Fan Noise

    NASA Technical Reports Server (NTRS)

    Envia, Ed

    2008-01-01

    Fan noise is a significant contributor to the total noise signature of a modern high bypass ratio aircraft engine and with the advent of ultra high bypass ratio engines like the geared turbofan, it is likely to remain so in the future. As such, accurate modeling and prediction of the basic characteristics of fan noise are necessary ingredients in designing quieter aircraft engines in order to ensure compliance with ever more stringent aviation noise regulations. In this paper, results from a comprehensive study aimed at establishing the utility of current tools for modeling and predicting fan noise will be summarized. It should be emphasized that these tools exemplify the present state of the practice and embody what is currently used at NASA and in industry for predicting fan noise. The ability of these tools to model and predict fan noise is assessed against a set of benchmark fan noise databases obtained for a range of representative fan cycles and operating conditions. Detailed comparisons between the predicted and measured narrowband spectral and directivity characteristics of fan noise will be presented in the full paper. General conclusions regarding the utility of current tools and recommendations for future improvements will also be given.

  17. IDSE Version 1 User's Manual

    NASA Technical Reports Server (NTRS)

    Mayer, Richard

    1988-01-01

    The integrated development support environment (IDSE) is a suite of integrated software tools that provide intelligent support for information modeling. These tools assist in function, information, and process modeling. Additional tools exist to assist in gathering and analyzing information to be modeled. This is a user's guide to application of the IDSE. Sections covering the requirements and design of each of the tools are presented. There are currently three integrated computer aided manufacturing definition (IDEF) modeling methodologies: IDEF0, IDEF1, and IDEF2. Also, four appendices exist to describe hardware and software requirements, installation procedures, and basic hardware usage.

  18. XMI2USE: A Tool for Transforming XMI to USE Specifications

    NASA Astrophysics Data System (ADS)

    Sun, Wuliang; Song, Eunjee; Grabow, Paul C.; Simmonds, Devon M.

    The UML-based Specification Environment (USE) tool supports syntactic analysis, type checking, consistency checking, and dynamic validation of invariants and pre-/post conditions specified in the Object Constraint Language (OCL). Due to its animation and analysis power, it is useful when checking critical non-functional properties such as security policies. However, the USE tool requires one to specify (i.e., "write") a model using its own textual language and does not allow one to import any model specification files created by other UML modeling tools. Hence, to make the best use of existing UML tools, we often create a model with OCL constraints using a modeling tool such as the IBM Rational Software Architect (RSA) and then use the USE tool for model validation. This approach, however, requires a manual transformation between the specifications of two different tool formats, which is error-prone and diminishes the benefit of automated model-level validations. In this paper, we describe our own implementation of a specification transformation engine that is based on the Model Driven Architecture (MDA) framework and currently supports automatic tool-level transformations from RSA to USE.

  19. Comparison of four modeling tools for the prediction of potential distribution for non-indigenous weeds in the United States

    USGS Publications Warehouse

    Magarey, Roger; Newton, Leslie; Hong, Seung C.; Takeuchi, Yu; Christie, Dave; Jarnevich, Catherine S.; Kohl, Lisa; Damus, Martin; Higgins, Steven I.; Miller, Leah; Castro, Karen; West, Amanda; Hastings, John; Cook, Gericke; Kartesz, John; Koop, Anthony

    2018-01-01

    This study compares four models for predicting the potential distribution of non-indigenous weed species in the conterminous U.S. The comparison focused on evaluating modeling tools and protocols as currently used for weed risk assessment or for predicting the potential distribution of invasive weeds. We used six weed species (three highly invasive and three less invasive non-indigenous species) that have been established in the U.S. for more than 75 years. The experiment involved providing non-U.S. location data to users familiar with one of the four evaluated techniques, who then developed predictive models that were applied to the United States without knowing the identity of the species or its U.S. distribution. We compared a simple GIS climate matching technique known as Proto3, the simple climate matching tool CLIMEX Match Climates, the correlative model MaxEnt, and a process model known as the Thornley Transport Resistance (TTR) model. Two experienced users ran each modeling tool except TTR, which had one user. Models were trained with global species distribution data excluding any U.S. data, and then were evaluated using the current known U.S. distribution. The influence of weed species identity and modeling tool on prevalence and sensitivity effects was compared using a generalized linear mixed model. The choice of modeling tool itself had low statistical significance, while weed species alone accounted for 69.1% and 48.5% of the variance for prevalence and sensitivity, respectively. These results suggest that simple modeling tools might perform as well as complex ones in the case of predicting potential distribution for a weed not yet present in the United States. Considerations of model accuracy should also be balanced with those of reproducibility and ease of use. More important than the choice of modeling tool is the construction of robust protocols and testing both new and experienced users under blind test conditions that approximate operational conditions.
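
    As a toy illustration of the "simple climate matching" idea that the study found competitive, here is a NumPy sketch of a climate-envelope score; the variables and values are hypothetical, and this is not the Proto3 or CLIMEX algorithm:

    ```python
    import numpy as np

    def envelope_match(train_climate, target_climate):
        """Score each target cell by the fraction of climate variables
        falling inside the min-max envelope of the training occurrences.

        train_climate  : (n_occurrences, n_vars) climate at known locations
        target_climate : (n_cells, n_vars) climate over the prediction grid
        Returns scores in [0, 1]; 1.0 means every variable lies inside
        the envelope (a crude "climatically suitable" call).
        """
        lo = train_climate.min(axis=0)
        hi = train_climate.max(axis=0)
        inside = (target_climate >= lo) & (target_climate <= hi)
        return inside.mean(axis=1)

    # Hypothetical example: two variables (mean temp degC, annual precip mm).
    train  = np.array([[12.0, 600.0], [15.0, 800.0], [18.0, 700.0]])
    target = np.array([[14.0, 650.0], [25.0, 100.0]])
    print(envelope_match(train, target))  # [1.0, 0.0]
    ```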

  20. Fundamental CRISPR-Cas9 tools and current applications in microbial systems.

    PubMed

    Tian, Pingfang; Wang, Jia; Shen, Xiaolin; Rey, Justin Forrest; Yuan, Qipeng; Yan, Yajun

    2017-09-01

    Derived from the bacterial adaptive immune system, CRISPR technology has revolutionized conventional genetic engineering methods and unprecedentedly facilitated strain engineering. In this review, we outline the fundamental CRISPR tools that have been employed for strain optimization. These tools include CRISPR editing, CRISPR interference, CRISPR activation and protein imaging. To further characterize the CRISPR technology, we present current applications of these tools in microbial systems, including model- and non-model industrial microorganisms. Specifically, we point out the major challenges of the CRISPR tools when utilized for multiplex genome editing and sophisticated expression regulation. To address these challenges, we propose strategies that place emphasis on improving DNA repair efficiency through CRISPR-Cas9-assisted recombineering. Lastly, we propose multiple promising research directions, mainly focusing on CRISPR-based construction of microbial ecosystems toward high production of desired chemicals.

  1. Visualization, documentation, analysis, and communication of large scale gene regulatory networks

    PubMed Central

    Longabaugh, William J.R.; Davidson, Eric H.; Bolouri, Hamid

    2009-01-01

    Summary Genetic regulatory networks (GRNs) are complex, large-scale, and spatially and temporally distributed. These characteristics impose challenging demands on computational GRN modeling tools, and there is a need for custom modeling tools. In this paper, we report on our ongoing development of BioTapestry, an open source, freely available computational tool designed specifically for GRN modeling. We also outline our future development plans, and give some examples of current applications of BioTapestry. PMID:18757046

  2. Automated MRI Segmentation for Individualized Modeling of Current Flow in the Human Head

    PubMed Central

    Huang, Yu; Dmochowski, Jacek P.; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C.

    2013-01-01

    Objective High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography (HD-EEG) require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images (MRI) requires labor-intensive manual segmentation, even when leveraging available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. Approach A fully automated segmentation technique based on Statistical Parametric Mapping 8 (SPM8), including an improved tissue probability map (TPM) and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on 4 healthy subjects and 7 stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. Main results The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view (FOV) extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Significance Fully automated individualized modeling may now be feasible for large-sample EEG research studies and tDCS clinical trials. PMID:24099977

  3. Automated MRI segmentation for individualized modeling of current flow in the human head

    NASA Astrophysics Data System (ADS)

    Huang, Yu; Dmochowski, Jacek P.; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C.

    2013-12-01

    Objective. High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. Approach. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. Main results. The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Significance. Fully automated individualized modeling may now be feasible for large-sample EEG research studies and tDCS clinical trials.

  4. Reliability, Validity, and Factor Structure of the Current Assessment Practice Evaluation-Revised (CAPER) in a National Sample.

    PubMed

    Lyon, Aaron R; Pullmann, Michael D; Dorsey, Shannon; Martin, Prerna; Grigore, Alexandra A; Becker, Emily M; Jensen-Doss, Amanda

    2018-05-11

    Measurement-based care (MBC) is an increasingly popular, evidence-based practice, but there are no tools with established psychometrics to evaluate clinician use of MBC practices in mental health service delivery. The current study evaluated the reliability, validity, and factor structure of scores generated from a brief, standardized tool to measure MBC practices, the Current Assessment Practice Evaluation-Revised (CAPER). Survey data from a national sample of 479 mental health clinicians were used to conduct exploratory and confirmatory factor analyses, as well as reliability and validity analyses (e.g., relationships between CAPER subscales and clinician MBC attitudes). Analyses revealed competing two- and three-factor models. Regardless of the model used, scores from CAPER subscales demonstrated good reliability and convergent and divergent validity with MBC attitudes in the expected directions. The CAPER appears to be a psychometrically sound tool for assessing clinician MBC practices. Future directions for development and application of the tool are discussed.

  5. Modeling Electrostatic Fields Generated by Internal Charging of Materials in Space Radiation Environments

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.

    2011-01-01

    Internal charging is a risk to spacecraft in energetic electron environments. The DICTAT and NUMIT computational codes are the most widely used engineering tools for evaluating internal charging of insulator materials exposed to these environments. Engineering tools are designed for rapid evaluation of ESD threats, but there is a need for more physics-based models for investigating the science of materials interactions with energetic electron environments. Current tools are limited by the physics included in the models and by ease of user implementation; additional development work is needed to improve the models.
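
    A minimal sketch of the slab physics that such engineering codes approximate: a constant deposited current density charges a grounded dielectric whose leakage is set by its conductivity, so the internal field relaxes toward j/sigma with time constant eps/sigma. The material numbers below are hypothetical, and this is not the DICTAT or NUMIT implementation:

    ```python
    import numpy as np

    EPS0 = 8.854e-12  # vacuum permittivity, F/m

    def internal_charging_field(j_dep, sigma, eps_r, t):
        """Electric field in a grounded dielectric slab under a constant
        deposited current density j_dep (A/m^2).

        Steady state is E = j_dep / sigma; the field approaches it with
        the dielectric relaxation time tau = eps_r * EPS0 / sigma.
        """
        tau = eps_r * EPS0 / sigma
        return (j_dep / sigma) * (1.0 - np.exp(-t / tau)), tau

    # Hypothetical numbers: 1 pA/cm^2 deposition into a resistive polymer.
    j_dep = 1e-8  # A/m^2 (= 1 pA/cm^2)
    E, tau = internal_charging_field(j_dep, sigma=1e-16, eps_r=2.5,
                                     t=3 * 24 * 3600.0)
    print(f"tau = {tau / 3600:.0f} h, E after 3 days = {E:.1e} V/m")
    ```

    Fields approaching material breakdown strength in this kind of estimate are what flag an ESD threat.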

  6. Current Development Status of an Integrated Tool for Modeling Quasi-static Deformation in the Solid Earth

    NASA Astrophysics Data System (ADS)

    Williams, C. A.; Dicaprio, C.; Simons, M.

    2003-12-01

    With the advent of projects such as the Plate Boundary Observatory and future InSAR missions, spatially dense geodetic data of high quality will provide an increasingly detailed picture of the movement of the earth's surface. To interpret such information, powerful and easily accessible modeling tools are required. We are presently developing such a tool that we feel will meet many of the needs for evaluating quasi-static earth deformation. As a starting point, we begin with a modified version of the finite element code TECTON, which has been specifically designed to solve tectonic problems involving faulting and viscoelastic/plastic earth behavior. As our first priority, we are integrating the code into the GeoFramework, which is an extension of the Python-based Pyre modeling framework. The goal of this framework is to provide simplified user interfaces for powerful modeling codes, to provide easy access to utilities such as meshers and visualization tools, and to provide a tight integration between different modeling tools so they can interact with each other. The initial integration of the code into this framework is essentially complete, and a more thorough integration, where Python-based drivers control the entire solution, will be completed in the near future. We have an evolving set of priorities that we expect to solidify as we receive more input from the modeling community. Current priorities include the development of linear and quadratic tetrahedral elements, the development of a parallelized version of the code using the PETSc libraries, the addition of more complex rheologies, realistic fault friction models, adaptive time stepping, and spherical geometries. In this presentation we describe current progress toward our various priorities, briefly describe the structure of the code within the GeoFramework, and demonstrate some sample applications.

  7. Applications and issues of GIS as tool for civil engineering modeling

    USGS Publications Warehouse

    Miles, S.B.; Ho, C.L.

    1999-01-01

    A tool that has proliferated within civil engineering in recent years is geographic information systems (GIS). The goal of a tool is to supplement ability and knowledge that already exists, not to serve as a replacement for that which is lacking. To secure the benefits and avoid misuse of a burgeoning tool, engineers must understand the limitations, alternatives, and context of the tool. The common benefits of using GIS as a supplement to engineering modeling are summarized. Several brief case studies of GIS modeling applications are taken from popular civil engineering literature to demonstrate the wide use and varied implementation of GIS across the discipline. Drawing from the case studies, limitations regarding traditional GIS data models and the implementation of civil engineering models within current GIS are identified and countered by discussing the direction of the next generation of GIS. The paper concludes by highlighting the potential for the misuse of GIS in the context of engineering modeling and suggests that this potential can be reduced through education and awareness. The goal of this paper is to promote awareness of the issues related to GIS-based modeling and to assist in the formulation of questions regarding the application of current GIS. The technology has experienced much publicity of late, with many engineers being perhaps too excited about the usefulness of current GIS. An undoubtedly beneficial side effect of this, however, is that engineers are becoming more aware of GIS and, hopefully, the associated subtleties. Civil engineers must stay informed of GIS issues and progress, but more importantly, civil engineers must inform the GIS community to direct the technology development optimally.

  8. OLTARIS - Overview and Recent Updates

    NASA Technical Reports Server (NTRS)

    Sandridge, C. A.

    2015-01-01

    The On-Line Tool for the Assessment of Radiation in Space (OLTARIS) is a web-based set of tools and models that allow engineers and scientists to assess the effects of space radiation in spacecraft, habitats, rovers, and spacesuits. The site is intended to be a design tool for those studying the effects of space radiation for current and future missions as well as a research tool for those developing advanced material and shielding concepts. The tools and models are built around the HZETRN radiation transport code and are currently focused on human-related responses. OLTARIS was deployed in 2008. Since that time, many improvements and additional capabilities have been added to the site. The purpose of this poster/presentation is to give an overview of the current capabilities of OLTARIS and focus on the updates to the site since the last workshop presentation in 2014. OLTARIS currently has 240 active accounts - 87 accounts are government (including NASA, ORNL, JPL, AFRL, and FAA), 76 are university professors/researchers/students, and 51 are industry (including Boeing, SpaceX, Lockheed Martin, ATK, Northrop Grumman, and Bigelow Aerospace). There have been 14,000 jobs run through OLTARIS since counting began in November 2009. ITAR restrictions were recently reversed, so the site is now available to registered users worldwide.

  9. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    NASA Technical Reports Server (NTRS)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important part of validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementations and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To satisfy this, a number of tree and graph visualization tools were researched and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  10. PathCase-SB architecture and database design

    PubMed Central

    2011-01-01

    Background Integrating metabolic pathway resources and regulatory metabolic network models, and deploying new tools on the integrated platform, can help researchers perform more effective and more efficient systems biology research on understanding regulation in metabolic networks. Therefore, the tasks of (a) integrating under a single database environment regulatory metabolic networks and existing models, and (b) building tools to help with modeling and analysis are desirable and intellectually challenging computational tasks. Description PathCase Systems Biology (PathCase-SB) has been built and released. The PathCase-SB database provides data and API for multiple user interfaces and software tools. The current PathCase-SB system provides a database-enabled framework and web-based computational tools towards facilitating the development of kinetic models for biological systems. PathCase-SB aims to integrate data of selected biological data sources on the web (currently, BioModels database and KEGG), and to provide more powerful and/or new capabilities via the new web-based integrative framework. This paper describes architecture and database design issues encountered in PathCase-SB's design and implementation, and presents the current design of PathCase-SB's architecture and database. Conclusions PathCase-SB architecture and database provide a highly extensible and scalable environment with easy and fast (real-time) access to the data in the database. PathCase-SB itself is already being used by researchers across the world. PMID:22070889

  11. Automated visualization of rule-based models

    PubMed Central

    Tapia, Jose-Juan; Faeder, James R.

    2017-01-01

    Frameworks such as BioNetGen, Kappa and Simmune use “reaction rules” to specify biochemical interactions compactly, where each rule specifies a mechanism such as binding or phosphorylation and its structural requirements. Current rule-based models of signaling pathways have tens to hundreds of rules, and these numbers are expected to increase as more molecule types and pathways are added. Visual representations are critical for conveying rule-based models, but current approaches to show rules and interactions between rules scale poorly with model size. Also, inferring design motifs that emerge from biochemical interactions is an open problem, so current approaches to visualize model architecture rely on manual interpretation of the model. Here, we present three new visualization tools that constitute an automated visualization framework for rule-based models: (i) a compact rule visualization that efficiently displays each rule, (ii) the atom-rule graph that conveys regulatory interactions in the model as a bipartite network, and (iii) a tunable compression pipeline that incorporates expert knowledge and produces compact diagrams of model architecture when applied to the atom-rule graph. The compressed graphs convey network motifs and architectural features useful for understanding both small and large rule-based models, as we show by application to specific examples. Our tools also produce more readable diagrams than current approaches, as we show by comparing visualizations of 27 published models using standard graph metrics. We provide an implementation in the open source and freely available BioNetGen framework, but the underlying methods are general and can be applied to rule-based models from the Kappa and Simmune frameworks also. We expect that these tools will promote communication and analysis of rule-based models and their eventual integration into comprehensive whole-cell models. PMID:29131816

  12. Integrated Wind Power Planning Tool

    NASA Astrophysics Data System (ADS)

    Rosgaard, M. H.; Giebel, G.; Nielsen, T. S.; Hahmann, A.; Sørensen, P.; Madsen, H.

    2012-04-01

    This poster presents the current state of the public service obligation (PSO) funded project PSO 10464, with the working title "Integrated Wind Power Planning Tool". The project commenced October 1, 2011, and the goal is to integrate a numerical weather prediction (NWP) model with purely statistical tools in order to assess wind power fluctuations, with focus on long term power system planning for future wind farms as well as short term forecasting for existing wind farms. Currently, wind power fluctuation models are either purely statistical or integrated with NWP models of limited resolution. With regard to the latter, one such simulation tool has been developed at the Wind Energy Division, Risø DTU, intended for long term power system planning. As part of the PSO project the inferior NWP model used at present will be replaced by the state-of-the-art Weather Research & Forecasting (WRF) model. Furthermore, the integrated simulation tool will be improved so it can simultaneously handle 10-50 times more turbines than the present ~300, and additional atmospheric parameters will be included in the model. The WRF data will also be input for a statistical short term prediction model to be developed in collaboration with ENFOR A/S, a Danish company that specialises in forecasting and optimisation for the energy sector. This integrated prediction model will allow for the description of the expected variability in wind power production in the coming hours to days, accounting for its spatio-temporal dependencies, and depending on the prevailing weather conditions defined by the WRF output. The output from the integrated prediction tool constitutes scenario forecasts for the coming period, which can then be fed into any type of system model or decision making problem to be solved. The high resolution of the WRF results loaded into the integrated prediction model will ensure that a high-accuracy data basis is available for use in the decision making process of the Danish transmission system operator, and the need for high-accuracy predictions will only increase over the next decade as Denmark approaches the goal of 50% wind power based electricity in 2020, from the current 20%.

  13. Human Behavior Based Exploratory Model for Successful Implementation of Lean Enterprise in Industry

    ERIC Educational Resources Information Center

    Sawhney, Rupy; Chason, Stewart

    2005-01-01

    Currently available Lean tools such as Lean Assessments, Value Stream Mapping, and Process Flow Charting focus on system requirements and overlook human behavior. A need is felt for a tool that allows one to baseline personnel, determine personnel requirements and align system requirements with personnel requirements. Our exploratory model--The…

  14. Some Observations on the Current Status of Performing Finite Element Analyses

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Knight, Norman F., Jr; Shivakumar, Kunigal N.

    2015-01-01

    Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analysis rapidly. Many of the early career engineers of today are very proficient in the usage of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient in building complex 3D models of complicated aerospace components. However, current trends demonstrate blind acceptance of finite element analysis results. This paper is aimed at raising an awareness of this situation. Examples of commonly encountered problems are presented. To overcome the current trends, some guidelines and suggestions for analysts, senior engineers, and educators are offered.

  15. U.S. Army Workshop on Solid-Propellant Ignition and Combustion Modeling.

    DTIC Science & Technology

    1997-07-01

    Only OCR fragments of this record survive: "...saving tool in the design, development, testing, and evaluation of future gun-propulsion systems, and that, under current funding constraints, research..." Recoverable table-of-contents headings include "7.1 What systems are currently being addressed" and "7.5 What model systems might be valuable for..."

  16. Operational Models Supporting Manned Space Flight

    NASA Astrophysics Data System (ADS)

    Johnson, A. S.; Weyland, M. D.; Lin, T. C.; Zapp, E. N.

    2006-12-01

    The Space Radiation Analysis Group (SRAG) at Johnson Space Center (JSC) has the primary responsibility to provide real-time radiation health operational support for manned space flight. Forecasts from NOAA SEC, real-time space environment data and radiation models are used to infer changes in the radiation environment due to space weather. Unlike current operations in low earth orbit which are afforded substantial protection from the geomagnetic field, exploration missions will have little protection and require improved operational tools for mission support. The current state of operational models and their limitations will be presented as well as an examination of needed tools to support exploration missions.

  17. DEVELOPMENT OF THE U.S. EPA'S METAL FINISHING FACILITY POLLUTION PREVENTION TOOL

    EPA Science Inventory

    Metal finishing processes are a type of chemical process and can be modeled using Computer Aided Process Engineering (CAPE). Currently, the U.S. EPA is developing the Metal Finishing Facility Pollution Prevention Tool (MFFP2T), a pollution prevention software tool for the meta...

  18. Integrated Wind Power Planning Tool

    NASA Astrophysics Data System (ADS)

    Rosgaard, Martin; Giebel, Gregor; Skov Nielsen, Torben; Hahmann, Andrea; Sørensen, Poul; Madsen, Henrik

    2013-04-01

    This poster presents the current state of the public service obligation (PSO) funded project PSO 10464, with the title "Integrated Wind Power Planning Tool". The goal is to integrate a mesoscale numerical weather prediction (NWP) model with purely statistical tools in order to assess wind power fluctuations, with focus on long term power system planning for future wind farms as well as short term forecasting for existing wind farms. Currently, wind power fluctuation models are either purely statistical or integrated with NWP models of limited resolution. Using the state-of-the-art mesoscale Weather Research & Forecasting (WRF) NWP model, the forecast error is quantified as a function of the time scale involved. This task constitutes a preparatory study for later implementation of features accounting for NWP forecast errors in the DTU Wind Energy maintained Corwind code - a long term wind power planning tool. Within the framework of PSO 10464, research related to operational short term wind power prediction will be carried out, including a comparison of forecast quality at different mesoscale NWP model resolutions and development of a statistical wind power prediction tool taking input from WRF. The short term prediction part of the project is carried out in collaboration with ENFOR A/S, a Danish company that specialises in forecasting and optimisation for the energy sector. The integrated prediction model will allow for the description of the expected variability in wind power production in the coming hours to days, accounting for its spatio-temporal dependencies, and depending on the prevailing weather conditions defined by the WRF output. The output from the integrated short term prediction tool constitutes scenario forecasts for the coming period, which can then be fed into any type of system model or decision making problem to be solved. The high resolution of the WRF results loaded into the integrated prediction model will ensure that a high-accuracy data basis is available for use in the decision making process of the Danish transmission system operator. The need for high-accuracy predictions will only increase over the next decade as Denmark approaches the goal of 50% wind power based electricity in 2025, from the current 20%.

  19. Angular approach combined to mechanical model for tool breakage detection by eddy current sensors

    NASA Astrophysics Data System (ADS)

    Ritou, M.; Garnier, S.; Furet, B.; Hascoet, J. Y.

    2014-02-01

    The paper presents a new complete approach for Tool Condition Monitoring (TCM) in milling. The aim is the early detection of small damage so that catastrophic tool failures are prevented. A versatile in-process monitoring system is introduced for reliability concerns. The tool condition is determined by estimates of the radial eccentricity of the teeth. An adequate criterion is proposed, combining a mechanical model of milling with an angular approach. Then, a new solution is proposed for estimating cutting force using eddy current sensors implemented close to the spindle nose. Signals are analysed in the angular domain, notably by a synchronous averaging technique. Phase shifts induced by changes of machining direction are compensated. Results are compared with cutting forces measured with a dynamometer table. The proposed method is implemented in an industrial case of a pocket machining operation. One of the cutting edges has been slightly damaged during the machining, as shown by a direct measurement of the tool. A control chart is established with the estimates of cutter eccentricity obtained during the machining from the eddy current sensor signals. The efficiency and reliability of the method are demonstrated by successful detection of the damage.
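
    A minimal NumPy sketch of the angular-domain synchronous averaging step described above: with the signal already resampled onto a uniform angle grid, averaging whole revolutions suppresses asynchronous noise so that tooth-periodic content (here a deliberately weakened third tooth) stands out. The signal, tooth count, and sizes are hypothetical:

    ```python
    import numpy as np

    def synchronous_average(signal, samples_per_rev):
        """Average an angularly resampled signal over whole revolutions;
        any partial final revolution is discarded."""
        n_rev = len(signal) // samples_per_rev
        revs = signal[: n_rev * samples_per_rev].reshape(n_rev, samples_per_rev)
        return revs.mean(axis=0)

    rng = np.random.default_rng(1)
    spr, n_rev, teeth = 360, 50, 4                            # samples/rev, revs, teeth
    angle = np.arange(spr) * 2 * np.pi / spr
    tooth_force = np.clip(np.sin(teeth * angle), 0.0, None)   # 4 engagements per rev
    tooth_force[2 * spr // teeth : 3 * spr // teeth] *= 0.6   # damaged 3rd tooth
    raw = np.tile(tooth_force, n_rev) + rng.normal(0.0, 0.3, spr * n_rev)

    avg = synchronous_average(raw, spr)
    per_tooth_peak = avg.reshape(teeth, spr // teeth).max(axis=1)
    print(per_tooth_peak.round(2))  # the damaged tooth shows a reduced peak
    ```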

  20. Model-Driven Useware Engineering

    NASA Astrophysics Data System (ADS)

    Meixner, Gerrit; Seissler, Marc; Breiner, Kai

    User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.

  1. High-resolution Modeling Assisted Design of Customized and Individualized Transcranial Direct Current Stimulation Protocols

    PubMed Central

    Bikson, Marom; Rahman, Asif; Datta, Abhishek; Fregni, Felipe; Merabet, Lotfi

    2012-01-01

    Objectives Transcranial direct current stimulation (tDCS) is a neuromodulatory technique that delivers low-intensity currents facilitating or inhibiting spontaneous neuronal activity. tDCS is attractive since dose is readily adjustable by simply changing electrode number, position, size, shape, and current. In the recent past, computational models have been developed with increased precision with the goal to help customize tDCS dose. The aim of this review is to discuss the incorporation of high-resolution patient-specific computer modeling to guide and optimize tDCS. Methods In this review, we discuss the following topics: (i) The clinical motivation and rationale for models of transcranial stimulation is considered pivotal in order to leverage the flexibility of neuromodulation; (ii) The protocols and the workflow for developing high-resolution models; (iii) The technical challenges and limitations of interpreting modeling predictions, and (iv) Real cases merging modeling and clinical data illustrating the impact of computational models on the rational design of rehabilitative electrotherapy. Conclusions Though modeling for non-invasive brain stimulation is still in its development phase, it is predicted that with increased validation, dissemination, simplification and democratization of modeling tools, computational forward models of neuromodulation will become useful tools to guide the optimization of clinical electrotherapy. PMID:22780230

  2. Regional 3-D Modeling of Ground Geoelectric Field for the Northeast United States due to Realistic Geomagnetic Disturbances

    NASA Astrophysics Data System (ADS)

    Ivannikova, E.; Kruglyakov, M.; Kuvshinov, A. V.; Rastaetter, L.; Pulkkinen, A. A.; Ngwira, C. M.

    2017-12-01

    During extreme space weather events, electric currents in the Earth's magnetosphere and ionosphere experience large variations, which leads to dramatic intensification of the fluctuating magnetic field at the surface of the Earth. According to Faraday's law of induction, the fluctuating geomagnetic field in turn induces an electric field that generates harmful currents (so-called "geomagnetically induced currents"; GICs) in grounded technological systems. Understanding (via modeling) the spatio-temporal evolution of the geoelectric field during enhanced geomagnetic activity is a key consideration in estimating the hazard to technological systems from space weather. We present the results of ground geoelectric field modeling for the Northeast United States, which is performed with the use of our novel numerical tool based on an integral equation approach. The tool exploits realistic regional three-dimensional (3-D) models of the Earth's electrical conductivity and realistic global models of the spatio-temporal evolution of the magnetospheric and ionospheric current systems responsible for geomagnetic disturbances. We also explore in detail the manifestation of the coastal effect (anomalous intensification of the geoelectric field near the coasts) in this region.
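
    For orientation, the 1-D baseline that such 3-D modeling refines: under the plane-wave approximation, the horizontal geoelectric field follows from the geomagnetic variation through a surface impedance determined by the local conductivity structure. This standard textbook relation is given here for context only; it is not the paper's 3-D integral-equation formulation:

    ```latex
    % Plane-wave (1-D) relation between horizontal geoelectric and
    % geomagnetic fields in the frequency domain; Z(\omega) is the
    % surface impedance of the local conductivity structure.
    \begin{align}
      E_x(\omega) &= \frac{Z(\omega)}{\mu_0}\, B_y(\omega), &
      E_y(\omega) &= -\frac{Z(\omega)}{\mu_0}\, B_x(\omega).
    \end{align}
    ```

    The coastal effect arises precisely where this 1-D picture breaks down, since the land-sea conductivity contrast makes the effective impedance strongly position dependent.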

  3. Cathodic Protection Measurement Through Inline Inspection Technology Uses and Observations

    NASA Astrophysics Data System (ADS)

    Ferguson, Briana Ley

    This research supports the evaluation of an impressed current cathodic protection (CP) system of a buried coated steel pipeline through alternative technology and methods, via an inline inspection device (ILI, CP ILI tool, or tool), in order to prevent and mitigate external corrosion. This thesis investigates the ability to measure the current density of a pipeline's CP system from inside the pipeline rather than manually from outside, and then convert that CP ILI tool reading into a pipe-to-soil potential as required by regulations and standards. This was demonstrated through a mathematical model that utilizes applications of Ohm's Law, circuit concepts, and attenuation principles in order to match the results of the ILI sample data by varying parameters of the model (i.e., values for overpotential and coating resistivity). Such research has not been conducted previously to determine whether the protected potential range can be achieved with respect to the current density predicted by the CP ILI device. Kirchhoff's method was explored, but certain principles could not be used in the model as manual measurements were required. This research was based on circuit concepts, which indirectly affect electrochemical processes. Through Ohm's law, the results show that a constant current density is possible in the protected potential range; this indicates polarization of the pipeline, which leads to calcareous deposit development with respect to electrochemistry. Calcareous deposits are desirable in industry since they increase the resistance of the pipeline coating and lower current, thus slowing the oxygen diffusion process. This research shows that an alternative method for CP evaluation from inside the pipeline is possible, where the pipe-to-soil potential can be estimated (as required by regulations) from the ILI tool's current density measurement.
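
    A minimal NumPy sketch of the kind of transmission-line relation such a model builds on: longitudinal pipe resistance and coating leakage set an attenuation constant for the CP potential shift, and Ohm's law across the coating links the local potential to a leakage current density that an inline tool could in principle sense. All parameter values are hypothetical, and this is not the thesis's calibrated model:

    ```python
    import numpy as np

    def cp_attenuation(v0, r_series, g_leak, x):
        """Attenuation of the CP potential shift along a long coated pipe.

        v0       : potential shift at the drain point (V)
        r_series : longitudinal pipe resistance per metre (ohm/m)
        g_leak   : coating leakage conductance per metre (S/m)
        x        : distances from the drain point (m)
        """
        alpha = np.sqrt(r_series * g_leak)  # attenuation constant, 1/m
        return v0 * np.exp(-alpha * x)

    def current_density_from_potential(v, g_leak, diameter):
        """Ohm's law across the coating: leakage current per metre is
        g_leak * v; dividing by the circumference gives a density (A/m^2)."""
        return g_leak * v / (np.pi * diameter)

    x = np.linspace(0.0, 20000.0, 5)  # 0 .. 20 km from the drain point
    v = cp_attenuation(v0=0.85, r_series=2e-5, g_leak=5e-4, x=x)
    j = current_density_from_potential(v, g_leak=5e-4, diameter=0.5)
    for xi, vi, ji in zip(x, v, j):
        print(f"{xi / 1000:5.1f} km  V = {vi:.3f} V  j = {ji:.2e} A/m^2")
    ```

    Inverting this forward relation, from a measured current density back to a pipe-to-soil potential, is the direction the thesis pursues.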

  4. SpacePy - a Python-based library of tools for the space sciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morley, Steven K; Welling, Daniel T; Koller, Josef

    Space science deals with the bodies within the solar system and the interplanetary medium; the primary focus is on atmospheres and above - at Earth the short-timescale variation in the geomagnetic field, the Van Allen radiation belts and the deposition of energy into the upper atmosphere are key areas of investigation. SpacePy is a package for Python, targeted at the space sciences, that aims to make basic data analysis, modeling and visualization easier. It builds on the capabilities of the well-known NumPy and matplotlib packages. Publication-quality output direct from analyses is emphasized. The SpacePy project seeks to promote accurate and open research standards by providing an open environment for code development. In the space physics community there has long been a significant reliance on proprietary languages that restrict free transfer of data and reproducibility of results. By providing a comprehensive, open-source library of widely used analysis and visualization tools in a free, modern and intuitive language, we hope that this reliance will be diminished. SpacePy includes implementations of widely used empirical models, statistical techniques used frequently in space science (e.g. superposed epoch analysis), and interfaces to advanced tools such as electron drift shell calculations for radiation belt studies. SpacePy also provides analysis and visualization tools for components of the Space Weather Modeling Framework - currently this only includes the BATS-R-US 3-D magnetohydrodynamic model and the RAM ring current model - including streamline tracing in vector fields. Further development is currently underway. External libraries, which include well-known magnetic field models, high-precision time conversions and coordinate transformations, are wrapped for access from Python using SWIG and f2py. The rest of the tools have been implemented directly in Python. The provision of open-source tools to perform common tasks will provide openness in the analysis methods employed in scientific studies and will give access to advanced tools to all space scientists regardless of affiliation or circumstance.
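
    One of the statistical techniques named above, superposed epoch analysis, is simple enough to illustrate without any particular library. The sketch below is a generic NumPy implementation of the technique (not SpacePy's own API): fixed-width windows of a signal are extracted around a list of epoch times and averaged, so a repeatable signature emerges from the noise.

    ```python
    import numpy as np

    def superposed_epoch(signal, epoch_indices, half_width):
        """Average windows of `signal` centred on each epoch index.
        Returns the mean profile over all epochs that fit in the array."""
        windows = [
            signal[i - half_width : i + half_width + 1]
            for i in epoch_indices
            if half_width <= i < len(signal) - half_width
        ]
        return np.mean(windows, axis=0)

    # Illustrative data: noise with a dip repeated at known event times
    rng = np.random.default_rng(0)
    sig = rng.normal(size=5000)
    events = [500, 1500, 2500, 3500, 4500]
    for i in events:
        sig[i - 10 : i + 10] -= 3.0        # bury a repeatable signature
    profile = superposed_epoch(sig, events, half_width=50)
    print(profile.argmin())                # dip emerges near the window centre
    ```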

  5. A Review and Analysis of Remote Sensing Capability for Air Quality Measurements as a Potential Decision Support Tool Conducted by the NASA DEVELOP Program

    NASA Technical Reports Server (NTRS)

    Ross, A.; Richards, A.; Keith, K.; Frew, C.; Boseck, J.; Sutton, S.; Watts, C.; Rickman, D.

    2007-01-01

    This project focused on a comprehensive utilization of air quality model products as decision support tools (DST) needed for public health applications. A review of past and future air quality measurement methods and their uncertainty, along with the relationship of air quality to national and global public health, is vital. This project described current and future NASA satellite remote sensing and ground sensing capabilities and the potential for using these sensors to enhance the prediction, prevention, and control of public health effects that result from poor air quality. The qualitative uncertainty of current satellite remotely sensed air quality, the ground-based remotely sensed air quality, the air quality/public health model, and the decision-making process is evaluated in this study. Current peer-reviewed literature suggests that remotely sensed air quality parameters correlate well with ground-based sensor data. A complement of satellite-sensed and ground-sensed data is needed to enhance the models/tools used by policy makers for the protection of national and global public health communities.

  6. The evolution of diagnosis-related groups (DRGs): from its beginnings in case-mix and resource use theory, to its implementation for payment and now for its current utilization for quality within and outside the hospital.

    PubMed

    Goldfield, Norbert

    2010-01-01

    Policymakers are searching for ways to control health care costs and improve quality. Diagnosis-related groups (DRGs) are by far the most important cost control and quality improvement tool that governments and private payers have implemented. This article reviews why DRGs have had this singular success both in the hospital sector and, over the past 10 years, in ambulatory and managed care settings. Last, the author reviews current trends in the development and implementation of tools that have the key ingredients of DRG success: a categorical clinical model, separation of the clinical model from payment weights, separate payment adjustments for nonclinical factors, and outlier payments. Virtually all current tools used to manage health care costs and improve quality lack these characteristics. This absence is a key reason for the failure of, for example, the Medicare Advantage program to control health care costs. The article concludes with a discussion of future developments for DRG-type models outside the hospital sector.

  7. Chapter 13: Tools for analysis

    Treesearch

    William Elliot; Kevin Hyde; Lee MacDonald; James McKean

    2007-01-01

    This chapter presents a synthesis of current computer modeling tools that are, or could be, adopted for use in evaluating the cumulative watershed effects of fuel management. The chapter focuses on runoff, soil erosion and slope stability predictive tools. Readers should refer to chapters on soil erosion and stability for more detailed information on the physical...

  8. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models are drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.
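
    As a hedged illustration of what "a technology database keyed to current and future timeframes" can look like, the sketch below models TTB-style records as a keyed lookup. The record fields, names and values are invented for illustration and are not ATLAS's actual schema.

    ```python
    from dataclasses import dataclass

    @dataclass
    class TechRecord:
        """One hypothetical Technology Tool Box entry (illustrative fields)."""
        name: str
        timeframe: int          # availability year
        trl: int                # technology readiness level
        specific_power: float   # e.g. W/kg for a power technology

    # A toy TTB keyed by (technology name, timeframe), as the text describes
    ttb = {
        ("solar_array", 2005): TechRecord("solar_array", 2005, 9, 80.0),
        ("solar_array", 2015): TechRecord("solar_array", 2015, 4, 175.0),
    }

    def lookup(name: str, year: int) -> TechRecord:
        """Return the newest record available by `year` for a technology."""
        candidates = [r for (n, y), r in ttb.items() if n == name and y <= year]
        return max(candidates, key=lambda r: r.timeframe)

    print(lookup("solar_array", 2012).specific_power)   # -> 80.0
    ```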

  9. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed-wing and rotary-wing aircraft, a variety of modeling and simulation tools have been developed to give designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that differ from heavier-than-air (HTA) vehicles. To account for these differences, modifications to the standard design tools are required to fully characterize LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the National laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools would mitigate the reliance on proprietary LTA design tools in use today. A set of well-researched, open-source, high-fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. A survey of currently available tools will be conducted to identify the gaps between their capabilities and the LTA industry's needs. Options for developing new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  10. MOD Tool (Microwave Optics Design Tool)

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Borgioli, Andrea; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.

    1999-01-01

    The Jet Propulsion Laboratory (JPL) is currently designing and building a number of instruments that operate in the microwave and millimeter-wave bands. These include MIRO (Microwave Instrument for the Rosetta Orbiter), MLS (Microwave Limb Sounder), and IMAS (Integrated Multispectral Atmospheric Sounder). These instruments must be designed and built to meet key design criteria (e.g., beamwidth, gain, pointing) obtained from the scientific goals for the instrument. These criteria are frequently functions of the operating environment (both thermal and mechanical). To design and build instruments which meet these criteria, it is essential to be able to model the instrument in its environments. Currently, a number of modeling tools exist. Commonly used tools at JPL include: FEMAP (meshing), NASTRAN (structural modeling), TRASYS and SINDA (thermal modeling), MACOS/IMOS (optical modeling), and POPO (physical optics modeling). Each of these tools is used by an analyst, who models the instrument in one discipline. The analyst then provides the results of this modeling to another analyst, who continues the overall modeling in another discipline. There is a large reengineering task in place at JPL to automate and speed up the structural and thermal modeling disciplines, which does not include MOD Tool. The focus of MOD Tool (and of this paper) is on the fields unique to microwave and millimeter-wave instrument design. These include initial design and analysis of the instrument without thermal or structural loads, the automation of the transfer of this design to a high-end CAD tool, and the analysis of the structurally deformed instrument (due to structural and/or thermal loads). MOD Tool is a distributed tool, with a database of design information residing on a server, physical optics analysis being performed on a variety of supercomputer platforms, and a graphical user interface (GUI) residing on the user's desktop computer. The MOD Tool client is being developed using Tcl/Tk, which allows the user to work on a choice of platforms (PC, Mac, or Unix) after downloading the Tcl/Tk binary, which is readily available on the web. The MOD Tool server is written using Expect, and it resides on a Sun workstation. Client/server communications are performed over a socket; upon a connection from a client to the server, the server spawns a child process dedicated to communicating with that client. The server communicates with other machines, such as supercomputers, using Expect, with the username and password provided by the user via the client.
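
    The client/server pattern described here (a server that spawns a dedicated child per connecting client) is easy to sketch. The example below uses Python and a plain echo reply purely to illustrate the pattern; the real MOD Tool server is written in Expect, and the port number and messages here are invented.

    ```python
    import os
    import socket

    HOST, PORT = "", 5050          # illustrative port, not MOD Tool's

    def serve():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind((HOST, PORT))
            srv.listen()
            while True:
                conn, addr = srv.accept()
                if os.fork() == 0:              # child: dedicated to one client
                    srv.close()
                    with conn:
                        conn.sendall(b"ready\n")
                        while data := conn.recv(4096):
                            conn.sendall(data)  # echo stands in for real work
                    os._exit(0)
                conn.close()                    # parent: back to accepting

    if __name__ == "__main__":
        serve()                                 # POSIX only, due to os.fork()
    ```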

  11. A GIS Tool for evaluating and improving NEXRAD and its application in distributed hydrologic modeling

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Srinivasan, R.

    2008-12-01

    In this study, a user-friendly GIS tool was developed for evaluating and improving NEXRAD precipitation estimates using raingauge data. The tool can automatically read in raingauge and NEXRAD data, evaluate the accuracy of NEXRAD for each time unit, implement several geostatistical methods to improve the accuracy of NEXRAD using raingauge data, and output spatial precipitation maps for distributed hydrologic models. The geostatistical methods incorporated in this tool include Simple Kriging with varying local means, Kriging with External Drift, Regression Kriging, Co-Kriging, and a geostatistical method newly developed by Li et al. (2008). The tool was applied in two test watersheds at hourly and daily temporal scales. The preliminary cross-validation results show that incorporating raingauge data to calibrate NEXRAD can markedly change the spatial pattern of NEXRAD and improve its accuracy. Using different geostatistical methods, the GIS tool was applied to produce long-term precipitation input for a distributed hydrologic model - the Soil and Water Assessment Tool (SWAT). An animated video was generated to vividly illustrate the effect of using different precipitation input data on distributed hydrologic modeling. Currently, this GIS tool is implemented as an extension of SWAT, which is used as a water quantity and quality modeling tool by USDA and EPA. The flexible module-based design of this tool also makes it easy to adapt to other hydrologic models for hydrological modeling and water resources management.
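
    The simplest member of the family of gauge-radar merging methods listed above is a mean-field bias correction; the kriging variants add spatially varying weights on top of this idea. The sketch below shows only that baseline step, with invented arrays standing in for NEXRAD pixels and raingauge readings.

    ```python
    import numpy as np

    def mean_field_bias_correct(radar, gauge_vals, gauge_pixels):
        """Scale a radar rainfall field so its mean at gauge locations
        matches the gauge mean for the same time unit.

        radar        -- 2-D array of radar rainfall estimates (mm)
        gauge_vals   -- 1-D array of gauge accumulations (mm)
        gauge_pixels -- list of (row, col) radar pixels containing gauges
        """
        radar_at_gauges = np.array([radar[r, c] for r, c in gauge_pixels])
        bias = gauge_vals.sum() / max(radar_at_gauges.sum(), 1e-9)
        return radar * bias

    radar = np.random.default_rng(1).gamma(2.0, 1.5, size=(50, 50))
    gauges = np.array([4.2, 3.1, 5.0])
    pixels = [(10, 12), (25, 30), (40, 8)]
    corrected = mean_field_bias_correct(radar, gauges, pixels)
    ```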

  12. The Role of Wakes in Modelling Tidal Current Turbines

    NASA Astrophysics Data System (ADS)

    Conley, Daniel; Roc, Thomas; Greaves, Deborah

    2010-05-01

    The eventual proper development of arrays of Tidal Current Turbines (TCT) will require a balance that maximizes power extraction while minimizing environmental impacts. Idealized analytical analogues and simple 2-D models are useful tools for investigating questions of a general nature but do not represent a practical tool for application to realistic cases. Some form of 3-D numerical simulation will be required for such applications, and the current project is designed to develop a numerical decision-making tool for use in planning large-scale TCT projects. The project is predicated on the use of an existing regional ocean modelling framework (the Regional Ocean Modelling System - ROMS), which is modified to enable the user to account for the effects of TCTs. In such a framework, where mixing processes are highly parametrized, the fidelity of the quantitative results is critically dependent on the parameter values utilized. In light of the early stage of TCT development and the lack of field-scale measurements, the calibration of such a model is problematic. In the absence of explicit calibration data sets, the device wake structure has been identified as an efficient feature for model calibration. This presentation will discuss efforts to design an appropriate calibration scheme focused on wake decay; the motivation for this approach, the techniques applied, validation results from simple test cases, and limitations will be presented.
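
    As a hedged illustration of the kind of wake-decay curve such a calibration would be tuned against, the sketch below implements a Jensen-style top-hat wake borrowed from wind energy, in which the centreline velocity deficit decays downstream through a single wake-expansion coefficient k. This is not the ROMS parameterization described above.

    ```python
    def wake_velocity(u0, ct, d_rotor, x, k=0.04):
        """Jensen-style centreline velocity (m/s) a distance x (m) behind
        a turbine of diameter d_rotor with thrust coefficient ct, in an
        ambient current u0. k is the tunable wake-expansion coefficient."""
        deficit = (1 - (1 - ct) ** 0.5) / (1 + 2 * k * x / d_rotor) ** 2
        return u0 * (1 - deficit)

    # Deficit decay 2-10 diameters downstream of a 20 m rotor in 2.5 m/s flow
    for nd in (2, 4, 6, 8, 10):
        print(nd, round(wake_velocity(2.5, 0.8, 20.0, nd * 20.0), 2))
    ```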

  13. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    PubMed

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria in the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results.

  14. Development of a Spot-Application Tool for Rapid, High-Resolution Simulation of Wave-Driven Nearshore Hydrodynamics

    DTIC Science & Technology

    2013-09-30

    flow models, such as Delft3D, with our developed Boussinesq-type model. The vision of this project is to develop an operational tool for the...situ measurements or large-scale wave models. This information will be used to drive the offshore wave boundary condition. • Execute the Boussinesq...model to match with the Boussinesq-type theory would be one which can simulate sheared and stratified currents due to large-scale (non-wave) forcings

  15. NATIONAL URBAN DATABASE AND ACCESS PORTAL TOOL

    EPA Science Inventory

    Current mesoscale weather prediction and microscale dispersion models are limited in their ability to perform accurate assessments in urban areas. A project called the National Urban Database with Access Portal Tool (NUDAPT) is beginning to provide urban data and improve the para...

  16. Supporting cognition in systems biology analysis: findings on users' processes and design implications.

    PubMed

    Mirel, Barbara

    2009-02-13

    Current usability studies of bioinformatics tools suggest that tools for exploratory analysis support some tasks related to finding relationships of interest, but not the deep causal insights necessary for formulating plausible and credible hypotheses. To better understand design requirements for gaining these causal insights in systems biology analyses, a longitudinal field study of 15 biomedical researchers was conducted. Researchers interacted with the same protein-protein interaction tools to discover possible disease mechanisms for further experimentation. Findings reveal patterns in scientists' exploratory and explanatory analysis and show that the tools positively supported a number of well-structured query and analysis tasks. But for several of scientists' more complex, higher-order ways of knowing and reasoning, the tools did not offer adequate support. Results show that, for a better fit with scientists' cognition in exploratory analysis, systems biology tools need to better match scientists' processes for validating, for making the transition from classification to model-based reasoning, and for engaging in causal mental modelling. As the next great frontier in bioinformatics usability, tool designs for exploratory systems biology analysis need to move beyond the successes already achieved in supporting formulaic query and analysis tasks and reduce current mismatches with several of scientists' higher-order analytical practices. The implications of the results for tool designs are discussed.

  17. Modeling of Photoionized Plasmas

    NASA Technical Reports Server (NTRS)

    Kallman, Timothy R.

    2010-01-01

    In this paper I review the motivation and current status of modeling of plasmas exposed to strong radiation fields, as it applies to the study of cosmic X-ray sources. This includes some of the astrophysical issues which can be addressed, the ingredients for the models, the current computational tools, the limitations imposed by currently available atomic data, and the validity of some of the standard assumptions. I will also discuss ideas for the future: challenges associated with future missions, opportunities presented by improved computers, and goals for atomic data collection.

  18. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...

    EPA Pesticide Factsheets

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil and Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA's current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) based on the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available: AGWA 1.5 for

  19. A new software tool for computing Earth's atmospheric transmission of near- and far-infrared radiation

    NASA Technical Reports Server (NTRS)

    Lord, Steven D.

    1992-01-01

    This report describes a new software tool, ATRAN, which computes the transmittance of Earth's atmosphere at near- and far-infrared wavelengths. We compare the capabilities of this program with others currently available and demonstrate its utility for observational data calibration and reduction. The program employs current water-vapor and ozone models to produce fast and accurate transmittance spectra for wavelengths ranging from 0.8 microns to 10 mm.
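
    The core quantity such a program computes can be illustrated with the Beer-Lambert law: transmittance falls off exponentially with the summed optical depth of the absorbers along the line of sight. The toy absorption bands below are invented and far simpler than ATRAN's layered water-vapor and ozone models.

    ```python
    import numpy as np

    def transmittance(optical_depths):
        """Beer-Lambert transmittance for stacked absorbers: T = exp(-sum(tau))."""
        return np.exp(-np.sum(optical_depths, axis=0))

    wavelengths = np.linspace(0.8, 30.0, 500)                        # microns
    tau_h2o = 3.0 * np.exp(-((wavelengths - 6.3) / 0.8) ** 2)        # toy band
    tau_o3 = 1.0 * np.exp(-((wavelengths - 9.6) / 0.5) ** 2)         # toy band
    T = transmittance(np.stack([tau_h2o, tau_o3]))
    print(T.min(), T.max())                       # opaque in bands, clear outside
    ```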

  20. Integrating Visualizations into Modeling NEST Simulations

    PubMed Central

    Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W.

    2015-01-01

    Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge of integrating these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus, which requires switching the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that result from simulations, researchers want to relate data sets in order to investigate them effectively. Since a monolithic application - one that processes and visualizes all data modalities and reflects all combinations of possible workflows in a holistic way - is most likely impossible to develop and maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach with common use cases we encountered in our collaborative work. PMID:26733860

  1. An Innovative Interactive Modeling Tool to Analyze Scenario-Based Physician Workforce Supply and Demand.

    PubMed

    Gupta, Saurabh; Black-Schaffer, W Stephen; Crawford, James M; Gross, David; Karcher, Donald S; Kaufman, Jill; Knapman, Doug; Prystowsky, Michael B; Wheeler, Thomas M; Bean, Sarah; Kumar, Paramhans; Sharma, Raghav; Chamoli, Vaibhav; Ghai, Vikrant; Gogia, Vineet; Weintraub, Sally; Cohen, Michael B; Robboy, Stanley J

    2015-01-01

    Effective physician workforce management requires that the various organizations comprising the House of Medicine be able to assess their current and future workforce supply. This information has direct relevance to funding of graduate medical education. We describe a dynamic modeling tool that examines how individual factors and practice variables can be used to measure and forecast the supply and demand for existing and new physician services. The system we describe, while built to analyze the pathologist workforce, is sufficiently broad and robust for use in any medical specialty. Our design provides a computer-based software model populated with data from surveys and best estimates by specialty experts about current and new activities in the scope of practice. The model describes the steps needed and data required for analysis of supply and demand. Our modeling tool allows educators and policy makers, in addition to physician specialty organizations, to assess how various factors may affect demand (and supply) of current and emerging services. Examples of factors evaluated include types of professional services (3 categories with 16 subcategories), service locations, elements related to the Patient Protection and Affordable Care Act, new technologies, aging population, and changing roles in capitated, value-based, and team-based systems of care. The model also helps identify where physicians in a given specialty will likely need to assume new roles, develop new expertise, and become more efficient in practice to accommodate new value-based payment models.

  2. An Innovative Interactive Modeling Tool to Analyze Scenario-Based Physician Workforce Supply and Demand

    PubMed Central

    Gupta, Saurabh; Black-Schaffer, W. Stephen; Crawford, James M.; Gross, David; Karcher, Donald S.; Kaufman, Jill; Knapman, Doug; Prystowsky, Michael B.; Wheeler, Thomas M.; Bean, Sarah; Kumar, Paramhans; Sharma, Raghav; Chamoli, Vaibhav; Ghai, Vikrant; Gogia, Vineet; Weintraub, Sally; Cohen, Michael B.

    2015-01-01

    Effective physician workforce management requires that the various organizations comprising the House of Medicine be able to assess their current and future workforce supply. This information has direct relevance to funding of graduate medical education. We describe a dynamic modeling tool that examines how individual factors and practice variables can be used to measure and forecast the supply and demand for existing and new physician services. The system we describe, while built to analyze the pathologist workforce, is sufficiently broad and robust for use in any medical specialty. Our design provides a computer-based software model populated with data from surveys and best estimates by specialty experts about current and new activities in the scope of practice. The model describes the steps needed and data required for analysis of supply and demand. Our modeling tool allows educators and policy makers, in addition to physician specialty organizations, to assess how various factors may affect demand (and supply) of current and emerging services. Examples of factors evaluated include types of professional services (3 categories with 16 subcategories), service locations, elements related to the Patient Protection and Affordable Care Act, new technologies, aging population, and changing roles in capitated, value-based, and team-based systems of care. The model also helps identify where physicians in a given specialty will likely need to assume new roles, develop new expertise, and become more efficient in practice to accommodate new value-based payment models. PMID:28725751

  3. Digital Sketch Modelling: Integrating Digital Sketching as a Transition between Sketching and CAD in Industrial Design Education

    ERIC Educational Resources Information Center

    Ranscombe, Charlie; Bissett-Johnson, Katherine

    2017-01-01

    Literature on the use of design tools in educational settings notes an uneasy relationship between student use of traditional hand sketching and digital modelling tools (CAD) during the industrial design process. This is often manifested in the transition from sketching to CAD and exacerbated by a preference of current students to use CAD. In this…

  4. Hybrid Wing Body Planform Design with Vehicle Sketch Pad

    NASA Technical Reports Server (NTRS)

    Wells, Douglas P.; Olson, Erik D.

    2011-01-01

    The objective of this paper was to provide an update on NASA's current tools for design and analysis of hybrid wing body (HWB) aircraft, with an emphasis on Vehicle Sketch Pad (VSP). NASA started HWB analysis using the Flight Optimization System (FLOPS). That capability is enhanced using Phoenix Integration's ModelCenter(Registered Trademark). ModelCenter enables multifidelity analysis tools to be linked as an integrated structure. Two major components are linked to FLOPS as an example: a planform discretization tool and VSP. The planform discretization tool ensures the planform is smooth and continuous. VSP is used to display the output geometry. This example shows that a smooth and continuous HWB planform can be displayed as a three-dimensional model and rapidly sized and analyzed.

  5. MAGIC: A Tool for Combining, Interpolating, and Processing Magnetograms

    NASA Technical Reports Server (NTRS)

    Allred, Joel

    2012-01-01

    Transients in the solar coronal magnetic field are ultimately the source of space weather. Models which seek to track the evolution of the coronal field require magnetogram images to be used as boundary conditions. These magnetograms are obtained by numerous instruments with different cadences and resolutions. A tool is required which allows modelers to find all available data and use them to craft accurate and physically consistent boundary conditions for their models. We have developed a software tool, MAGIC (MAGnetogram Interpolation and Composition), to perform exactly this function. MAGIC can manage the acquisition of magnetogram data, cast it into a source-independent format, and then perform the necessary spatial and temporal interpolation to provide magnetic field values as requested onto model-defined grids. MAGIC has the ability to patch magnetograms from different sources together, providing a more complete picture of the Sun's field than is possible from single magnetograms. In doing this, care must be taken so as not to introduce nonphysical current densities along the seam between magnetograms. We have designed a method which minimizes these spurious current densities. MAGIC also includes a number of post-processing tools which can provide additional information to models. For example, MAGIC includes an interface to the DAVE4VM tool, which derives surface flow velocities from the time evolution of the surface magnetic field. MAGIC has been developed as an application of the KAMELEON data formatting toolkit, which has been developed by the CCMC.
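
    A hedged sketch of the simplest operation such a tool performs - linear interpolation in time between two co-registered magnetograms onto a model-requested instant. The compositing and seam-current minimization MAGIC adds are well beyond this illustration; the arrays and times below are invented.

    ```python
    import numpy as np

    def interp_magnetogram(b1, t1, b2, t2, t):
        """Linearly interpolate two co-registered magnetograms (2-D arrays of
        line-of-sight field, gauss) taken at times t1 < t2 to time t."""
        w = (t - t1) / (t2 - t1)
        return (1.0 - w) * b1 + w * b2

    rng = np.random.default_rng(2)
    b_0600 = rng.normal(0.0, 50.0, size=(256, 256))   # toy 06:00 UT magnetogram
    b_1200 = rng.normal(0.0, 50.0, size=(256, 256))   # toy 12:00 UT magnetogram
    b_0930 = interp_magnetogram(b_0600, 6.0, b_1200, 12.0, 9.5)
    ```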

  6. Conceptual FOM design tool

    NASA Astrophysics Data System (ADS)

    Krause, Lee S.; Burns, Carla L.

    2000-06-01

    This paper discusses the research currently in progress to develop the Conceptual Federation Object Model Design Tool. The objective of the Conceptual FOM (C-FOM) Design Tool effort is to provide domain and subject matter experts, such as scenario developers, with automated support for understanding and utilizing available HLA simulation and other simulation assets during HLA Federation development. The C-FOM Design Tool will import Simulation Object Models from HLA reuse repositories, such as the MSSR, to populate the domain space that will contain all the objects and their supported interactions. In addition, the C-FOM tool will support the conversion of non-HLA legacy models into HLA- compliant models by applying proven abstraction techniques against the legacy models. Domain experts will be able to build scenarios based on the domain objects and interactions in both a text and graphical form and export a minimal FOM. The ability for domain and subject matter experts to effectively access HLA and non-HLA assets is critical to the long-term acceptance of the HLA initiative.

  7. A Multiple-Regression Model for Monitoring Tool Wear with a Dynamometer in Milling Operations

    ERIC Educational Resources Information Center

    Chen, Jacob C.; Chen, Joseph C.

    2004-01-01

    A major goal of the manufacturing industry is increasing product quality. The quality of a product is strongly associated with the condition of the cutting tool that produced it. Catching poor tool conditions early in the production will help reduce defects. However, with current CNC technology, manufacturers still rely mainly on the operator's…
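
    Although the abstract is truncated, the technique named in the title - a multiple-regression model relating dynamometer force signals to tool wear - is straightforward to sketch. The features, data and coefficients below are hypothetical illustrations, not the authors' model.

    ```python
    import numpy as np

    # Hypothetical training data: per-pass average cutting force (N),
    # force variance, and spindle time (min) vs. measured flank wear (mm)
    X = np.array([
        [220.0, 12.0,  5.0],
        [245.0, 18.0, 10.0],
        [270.0, 25.0, 15.0],
        [300.0, 34.0, 20.0],
        [335.0, 45.0, 25.0],
    ])
    y = np.array([0.05, 0.09, 0.14, 0.20, 0.27])   # flank wear VB (mm)

    # Ordinary least squares with an intercept column
    A = np.hstack([np.ones((len(X), 1)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def predict_wear(force, force_var, minutes):
        return coef @ np.array([1.0, force, force_var, minutes])

    print(round(predict_wear(310.0, 38.0, 22.0), 3))  # predicted VB in mm
    ```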

  8. Current State and Model for Development of Technology-Based Care for Attention Deficit Hyperactivity Disorder.

    PubMed

    Benyakorn, Songpoom; Riley, Steven J; Calub, Catrina A; Schweitzer, Julie B

    2016-09-01

    Care (i.e., evaluation and intervention) delivered through technology is used in many areas of mental health services, including for persons with attention deficit hyperactivity disorder (ADHD). Technology can facilitate care for individuals with ADHD, their parents, and their care providers. The adoption of technological tools for ADHD care requires evidence-based studies to support the transition from development to integration into use in the home, school, or work for persons with the disorder. The initial phase, which is development of technological tools, has begun in earnest; however, the evidence base for many of these tools is lacking. In some instances, the uptake of a piece of technology into home use or clinical practice may be further along than the research to support its use. In this study, we review the current evidence regarding technology for ADHD and also propose a model to evaluate the support for other tools that have yet to be tested. We propose using the Research Domain Criteria as a framework for evaluating the tools' relationships to dimensions related to ADHD. This article concludes with recommendations for testing new tools that may have promise in improving the evaluation or treatment of persons with ADHD.

  9. BioSPICE: access to the most current computational tools for biologists.

    PubMed

    Garvey, Thomas D; Lincoln, Patrick; Pedersen, Charles John; Martin, David; Johnson, Mark

    2003-01-01

    The goal of the BioSPICE program is to create a framework that provides biologists access to the most current computational tools. At the program midpoint, the BioSPICE member community has produced a software system that comprises contributions from approximately 20 participating laboratories integrated under the BioSPICE Dashboard, and a methodology for continued software integration. Central among these contributions is the BioSPICE Dashboard, a graphical environment that combines Open Agent Architecture and NetBeans software technologies in a coherent, biologist-friendly user interface. The current Dashboard permits data sources, models, simulation engines, and output displays provided by different investigators and running on different machines to work together across a distributed, heterogeneous network. Among several other features, the Dashboard enables users to create graphical workflows by configuring and connecting available BioSPICE components. Anticipated future enhancements to BioSPICE include a notebook capability that will permit researchers to browse and compile data to support model building, a biological model repository, and tools to support the development, control, and data reduction of wet-lab experiments. In addition to the BioSPICE software products, a project website supports information exchange and community building.

  10. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.

  11. Estimating potential habitat for 134 eastern US tree species under six climate scenarios

    Treesearch

    Louis R. Iverson; Anantha M. Prasad; Stephen N. Matthews; Matthew Peters

    2008-01-01

    We modeled and mapped, using the predictive data mining tool Random Forests, 134 tree species from the eastern United States for potential response to several scenarios of climate change. Each species was modeled individually to show current and potential future habitats according to two emission scenarios (high emissions on current trajectory and reasonable...
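
    A hedged sketch of the modeling pattern named here: fit a random forest on current climate predictors at known presence/absence locations, then apply it to a future-scenario climate. The predictor names, synthetic data and scikit-learn usage are illustrative assumptions, not the authors' configuration.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(3)

    # Hypothetical plots: mean annual temp (C), annual precip (mm), elevation (m)
    X_current = rng.uniform([2, 600, 0], [18, 1600, 900], size=(500, 3))
    present = (X_current[:, 0] > 8) & (X_current[:, 1] > 900)   # toy "species"

    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_current, present)

    # Same plots under a warmer, drier scenario
    X_future = X_current + np.array([3.0, -150.0, 0.0])
    p_habitat = model.predict_proba(X_future)[:, 1]   # suitability per plot
    print(round(p_habitat.mean(), 2))
    ```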

  12. TEMPORALLY-RESOLVED AMMONIA EMISSION INVENTORIES: CURRENT ESTIMATES, EVALUATION TOOLS, AND MEASUREMENT NEEDS

    EPA Science Inventory

    In this study, we evaluate the suitability of a three-dimensional chemical transport model (CTM) as a tool for assessing ammonia emission inventories, calculate the improvement in CTM performance owing to recent advances in temporally-varying ammonia emission estimates, and ident...

  13. Requirements for clinical information modelling tools.

    PubMed

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study focuses on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; if a certified tool list were one day established, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product, or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, the level of agreement was high enough for the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. The list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions.

  14. NASA Data for Water Resources Applications

    NASA Technical Reports Server (NTRS)

    Toll, David; Houser, Paul; Arsenault, Kristi; Entin, Jared

    2004-01-01

    Water Management Applications is one of twelve elements in the Earth Science Enterprise's National Applications Program. NASA Goddard Space Flight Center is supporting the Applications Program through partnering with other organizations to use NASA project results, such as results from satellite instruments and Earth system models, to enhance the organizations' critical needs. The focus thus far has been: 1) estimating water storage, including snowpack and soil moisture; 2) modeling and predicting water fluxes such as evapotranspiration (ET), precipitation, and river runoff; and 3) remote sensing of water quality, including both point source (e.g., turbidity and productivity) and non-point source (e.g., land cover conversion such as forest to agriculture yielding higher nutrient runoff). The objectives of the partnering cover three steps: 1) Evaluation, 2) Verification and Validation, and 3) Benchmark Report. We are working with U.S. federal agencies including the Environmental Protection Agency (EPA), the Bureau of Reclamation (USBR), and the Department of Agriculture (USDA). We are using several of their Decision Support Systems (DSS) tools, including BASINS used by EPA, RiverWare and the AWARDS ET ToolBox by USBR, and SWAT by USDA and EPA. Regional application sites using NASA data across the U.S. are currently being evaluated for the DSS tools. The current NASA data emphasized thus far are from the Land Data Assimilation Systems (LDAS) and MODIS satellite products. We are currently in the first two steps of evaluation and verification and validation.

  15. Validation of a Mechanistic Model for Non-Invasive Study of Ecological Energetics in an Endangered Wading Bird with Counter-Current Heat Exchange in its Legs.

    PubMed

    Fitzpatrick, Megan J; Mathewson, Paul D; Porter, Warren P

    2015-01-01

    Mechanistic models provide a powerful, minimally invasive tool for gaining a deeper understanding of the ecology of animals across geographic space and time. In this paper, we modified and validated the accuracy of the mechanistic model Niche Mapper for simulating heat exchanges of animals with counter-current heat exchange mechanisms in their legs and animals that wade in water. We then used Niche Mapper to explore the effects of wading and counter-current heat exchange on the energy expenditures of Whooping Cranes, a long-legged wading bird. We validated model accuracy against the energy expenditure of two captive Whooping Cranes measured using the doubly-labeled water method and time energy budgets. Energy expenditure values modeled by Niche Mapper were similar to values measured by the doubly-labeled water method and values estimated from time-energy budgets. Future studies will be able to use Niche Mapper as a non-invasive tool to explore energy-based limits to the fundamental niche of Whooping Cranes and apply this knowledge to management decisions. Basic questions about the importance of counter-current exchange and wading to animal physiological tolerances can also now be explored with the model.

  16. Validation of a Mechanistic Model for Non-Invasive Study of Ecological Energetics in an Endangered Wading Bird with Counter-Current Heat Exchange in its Legs

    PubMed Central

    Fitzpatrick, Megan J.; Mathewson, Paul D.; Porter, Warren P.

    2015-01-01

    Mechanistic models provide a powerful, minimally invasive tool for gaining a deeper understanding of the ecology of animals across geographic space and time. In this paper, we modified and validated the accuracy of the mechanistic model Niche Mapper for simulating heat exchanges of animals with counter-current heat exchange mechanisms in their legs and animals that wade in water. We then used Niche Mapper to explore the effects of wading and counter-current heat exchange on the energy expenditures of Whooping Cranes, a long-legged wading bird. We validated model accuracy against the energy expenditure of two captive Whooping Cranes measured using the doubly-labeled water method and time energy budgets. Energy expenditure values modeled by Niche Mapper were similar to values measured by the doubly-labeled water method and values estimated from time-energy budgets. Future studies will be able to use Niche Mapper as a non-invasive tool to explore energy-based limits to the fundamental niche of Whooping Cranes and apply this knowledge to management decisions. Basic questions about the importance of counter-current exchange and wading to animal physiological tolerances can also now be explored with the model. PMID:26308207

  17. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Trial Calculation. Work Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-02-01

    The overall objective of the SFR Regulatory Technology Development Plan (RTDP) effort is to identify and address potential impediments to the SFR regulatory licensing process. In FY14, an analysis by Argonne identified the development of an SFR-specific MST methodology as an existing licensing gap with high regulatory importance and a potentially long lead time to closure. This work was followed by an initial examination of the current state of knowledge regarding SFR source term development (ANL-ART-3), which reported several potential gaps. Among these were the potential inadequacies of current computational tools to properly model and assess the transport and retention of radionuclides during a metal fuel pool-type SFR core damage incident. The objective of the current work is to determine the adequacy of existing computational tools, and the associated knowledge database, for the calculation of an SFR MST. To accomplish this task, a trial MST calculation will be performed using available computational tools to establish their limitations with regard to relevant radionuclide release/retention/transport phenomena. The application of existing modeling tools will provide a definitive test to assess their suitability for an SFR MST calculation, while also identifying potential gaps in the current knowledge base and providing insight into open issues regarding regulatory criteria/requirements. The findings of this analysis will assist in determining future research and development needs.

  18. Cranial electrotherapy stimulation and transcranial pulsed current stimulation: a computer based high-resolution modeling study.

    PubMed

    Datta, Abhishek; Dmochowski, Jacek P; Guleyupoglu, Berkan; Bikson, Marom; Fregni, Felipe

    2013-01-15

    The field of non-invasive brain stimulation has developed significantly over the last two decades. Though two techniques of noninvasive brain stimulation - transcranial direct current stimulation (tDCS) and transcranial magnetic stimulation (TMS) - are becoming established tools for research in neuroscience and for some clinical applications, related techniques that also show some promising clinical results have not been developed at the same pace. One of these related techniques is cranial electrotherapy stimulation (CES), a class of transcranial pulsed current stimulation (tPCS). In order to further understand the mechanisms of CES, we aimed to model CES using a magnetic resonance imaging (MRI)-derived finite element head model including cortical and subcortical structures. Cortical electric field (current density) peak intensities and distributions were analyzed. We evaluated different electrode configurations of CES, including in-ear and over-ear montages. Our results confirm that significant amounts of current pass the skull and reach cortical and subcortical structures. In addition, depending on the montage, induced currents at subcortical areas, such as the midbrain, pons, thalamus and hypothalamus, are of similar magnitude to those at cortical areas. Incremental variations of electrode position on the head surface also influence which cortical regions are modulated. The high-resolution modeling predictions suggest that details of the electrode montage influence current flow through superficial and deep structures. Finally, we present laptop-based methods for tPCS dose design using dominant frequency and spherical models. These modeling predictions and tools are the first step toward advancing rational and optimized use of tPCS and CES.

  19. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model

    PubMed Central

    2016-01-01

    The omnipresent need for optimisation requires constant improvements of companies’ business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and “what-if” scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria in the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results. PMID:26871694

  20. An Overview of NASA's Orbital Debris Engineering Model

    NASA Technical Reports Server (NTRS)

    Matney, Mark

    2010-01-01

    This slide presentation reviews the importance of orbital debris engineering models: mathematical tools to assess the orbital debris flux. It briefly reviews the history of orbital debris engineering models and the new features of the current model, ORDEM2010.

  1. Myokit: A simple interface to cardiac cellular electrophysiology.

    PubMed

    Clerx, Michael; Collins, Pieter; de Lange, Enno; Volders, Paul G A

    2016-01-01

    Myokit is a new, powerful and versatile software tool for modeling and simulation of cardiac cellular electrophysiology. Myokit consists of an easy-to-read modeling language, a graphical user interface, single- and multi-cell simulation engines and a library of advanced analysis tools accessible through a Python interface. Models can be loaded from Myokit's native file format or imported from CellML. Model export is provided to C, MATLAB, CellML, CUDA and OpenCL. Patch-clamp data can be imported and used to estimate model parameters. In this paper, we review existing tools to simulate the cardiac cellular action potential and find that current tools do not cater specifically to model development, and that there is a gap between easy-to-use but limited software and powerful tools that require strong programming skills from their users. We then describe Myokit's capabilities, focusing on its model description language, simulation engines and import/export facilities in detail. Using three examples, we show how Myokit can be used for clinically relevant investigations, multi-model testing and parameter estimation in Markov models, all with minimal programming effort from the user. This way, Myokit bridges the gap between performance, versatility and user-friendliness.
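
    A minimal usage sketch, assuming Myokit's documented Python entry points for loading an .mmt file and running a single-cell simulation; the file name and logged variable names follow Myokit's standard example and may differ for other models.

    ```python
    import matplotlib.pyplot as plt
    import myokit

    # Load a model, pacing protocol and embedded script from an .mmt file
    model, protocol, script = myokit.load('example.mmt')

    # Run a single-cell simulation for 1000 ms
    sim = myokit.Simulation(model, protocol)
    log = sim.run(1000)

    # Plot the simulated action potential
    plt.plot(log['engine.time'], log['membrane.V'])
    plt.xlabel('Time (ms)')
    plt.ylabel('Membrane potential (mV)')
    plt.show()
    ```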

  2. Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases

    PubMed Central

    Amos, Christopher I.; Bafna, Vineet; Hauser, Elizabeth R.; Hernandez, Ryan D.; Li, Chun; Liberles, David A.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Papanicolaou, George J.; Peng, Bo; Ritchie, Marylyn D.; Rosenfeld, Gabriel; Witte, John S.

    2014-01-01

    Genetic simulation programs are used to model data under specified assumptions to facilitate the understanding and study of complex genetic systems. Standardized data sets generated using genetic simulation are essential for the development and application of novel analytical tools in genetic epidemiology studies. With continuing advances in high-throughput genomic technologies and generation and analysis of larger, more complex data sets, there is a need for updating current approaches in genetic simulation modeling. To provide a forum to address current and emerging challenges in this area, the National Cancer Institute (NCI) sponsored a workshop, entitled “Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases” at the National Institutes of Health (NIH) in Bethesda, Maryland on March 11-12, 2014. The goals of the workshop were to: (i) identify opportunities, challenges and resource needs for the development and application of genetic simulation models; (ii) improve the integration of tools for modeling and analysis of simulated data; and (iii) foster collaborations to facilitate development and applications of genetic simulation. During the course of the meeting the group identified challenges and opportunities for the science of simulation, software and methods development, and collaboration. This paper summarizes key discussions at the meeting, and highlights important challenges and opportunities to advance the field of genetic simulation. PMID:25371374

  3. A systematic review on popularity, application and characteristics of protein secondary structure prediction tools.

    PubMed

    Kashani-Amin, Elaheh; Tabatabaei-Malazy, Ozra; Sakhteman, Amirhossein; Larijani, Bagher; Ebrahim-Habibi, Azadeh

    2018-02-27

    Prediction of proteins' secondary structure is one of the major steps in the generation of homology models. These models provide structural information used to design suitable ligands for potential medicinal targets. However, selecting a proper tool among multiple secondary structure prediction (SSP) options is challenging. The current study offers insight into currently favored methods and tools within various contexts. A systematic review was performed for comprehensive access to recent (2013-2016) studies that used or recommended protein SSP tools. Three databases, Web of Science, PubMed and Scopus, were systematically searched, and 99 of 209 studies were found eligible for data extraction. The 59 retrieved SSP tools fell into four categories of application: (I) prediction of structural features of a given sequence, (II) evaluation of a method, (III) providing input for a new SSP method and (IV) integration of an SSP tool as a component of a program. PSIPRED was found to be the most popular tool in all four categories. JPred and tools utilizing the PHD (Profile network from HeiDelberg) method occupied second and third places of popularity in categories I and II. JPred was only found in the first two categories, while PHD was present in three. This study provides a comprehensive overview of recent SSP tool usage, which could be helpful in selecting a proper tool.

  4. Multiscale Modeling in Computational Biomechanics: Determining Computational Priorities and Addressing Current Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tawhai, Merryn; Bischoff, Jeff; Einstein, Daniel R.

    2009-05-01

    In this article, we describe some current multiscale modeling issues in computational biomechanics from the perspective of the musculoskeletal and respiratory systems and mechanotransduction. First, we outline the necessity of multiscale simulations in these biological systems. Then we summarize challenges inherent to multiscale biomechanics modeling, regardless of the subdiscipline, followed by computational challenges that are system-specific. We discuss some of the current tools that have been utilized to aid research in multiscale mechanics simulations, and the priorities to further the field of multiscale biomechanics computation.

  5. Thermal Error Test and Intelligent Modeling Research on the Spindle of High Speed CNC Machine Tools

    NASA Astrophysics Data System (ADS)

    Luo, Zhonghui; Peng, Bin; Xiao, Qijun; Bai, Lu

    2018-03-01

    Thermal error is the main factor affecting the accuracy of precision machining. Reflecting the current research focus on machine-tool thermal error, this paper experimentally studies thermal error testing and intelligent modeling for the spindle of a vertical high-speed CNC machine tool. Several thermal error testing devices are designed, in which 7 temperature sensors measure the temperature of the machine-tool spindle system and 2 displacement sensors detect the thermal error displacement. A thermal error compensation model with good inversion-prediction ability is established by applying principal component analysis to optimize the temperature measuring points, extracting the characteristic values closely associated with the thermal error displacement, and using artificial neural network techniques.
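
    The abstract names the two key steps (PCA over the temperature sensors, then a neural network mapping to displacement) without giving the architecture. A minimal sketch of that pipeline on synthetic data follows; the sensor count matches the abstract, but the component count, layer sizes, and all numbers are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch of a PCA + neural-network thermal-error pipeline.
# All data are synthetic; sizes and coefficients are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
temps = rng.normal(25, 5, size=(200, 7))        # 7 temperature sensors
drift = 0.8 * temps[:, 0] + 0.3 * temps[:, 3]   # toy thermal displacement
drift += rng.normal(0, 0.5, size=200)           # measurement noise

# Step 1: PCA compresses the correlated temperature readings into a few
# characteristic values (optimized "temperature measuring points").
pca = PCA(n_components=2)
features = pca.fit_transform(temps)

# Step 2: an ANN maps the characteristic values to spindle thermal-error
# displacement, for use in compensation.
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(features, drift)
print("R^2 on training data:", model.score(features, drift))
```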

  6. Modeling Interoperable Information Systems with 3LGM² and IHE.

    PubMed

    Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A

    2015-01-01

    Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and to analyze IS models. A model of the planned architecture can be derived from an analysis of current-state IS models. Building an interoperable IS, i.e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the Healthcare Enterprise (IHE) is an initiative which targets interoperability by using established standards. Our objectives were: to link IHE concepts to 3LGM² concepts within the 3LGM² tool; to describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models; and to describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Conceptualization and concept mapping methods are used to assign IHE concepts such as domains, integration profiles, actors and transactions to the concepts of the three-layer graph-based meta-model (3LGM²). IHE concepts were successfully linked to 3LGM² concepts. An IHE-master-model, i.e. an abstract model for IHE concepts, was modeled with the help of the 3LGM² tool. Two IHE domains were modeled in detail (ITI, QRPH). We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE-master-model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE-master-model and functions of the 3LGM² tool. The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE-master-model. Thus information managers and IHE developers can use or develop IHE profiles systematically. In order to improve the usability and handling of the IHE-master-model and its usage as a reference model, some further refinements have to be done. Evaluating the use of the IHE-master-model by information managers and IHE developers is subject to further research.

  7. New tools for Content Innovation and data sharing: Enhancing reproducibility and rigor in biomechanics research.

    PubMed

    Guilak, Farshid

    2017-03-21

    We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. New V and V Tools for Diagnostic Modeling Environment (DME)

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Nelson, Stacy; Merriam, Marshall (Technical Monitor)

    2002-01-01

    The purpose of this report is to provide correctness and reliability criteria for verification and validation (V&V) of the Second Generation Reusable Launch Vehicle (RLV) Diagnostic Modeling Environment (DME), describe current NASA Ames Research Center tools for V&V of Model Based Reasoning systems, and discuss the applicability of Advanced V&V to DME. This report is divided into the following three sections: (1) correctness and reliability criteria; (2) tools for V&V of Model Based Reasoning; and (3) advanced V&V applicable to DME. The Executive Summary includes an overview of the main points from each section. Supporting details, diagrams, figures, and other information are included in subsequent sections. A glossary, acronym list, appendices, and references are included at the end of this report.

  9. GAMBIT: the global and modular beyond-the-standard-model inference tool

    NASA Astrophysics Data System (ADS)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-11-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the models and other specific components currently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.

  10. Modeling of screening currents in coated conductor magnets containing up to 40000 turns

    NASA Astrophysics Data System (ADS)

    Pardo, E.

    2016-08-01

    Screening currents caused by varying magnetic fields degrade the homogeneity and stability of the magnetic fields created by REBCO coated conductor coils. They are also responsible for the AC loss, which is important for other power applications containing windings, such as transformers, motors and generators. Since real magnets contain coils exceeding 10000 turns, accurate modeling tools for this number of turns or above are necessary for magnet design. This article presents a fast numerical method to model coils with no loss of accuracy. We model a 10400-turn coil for its real number of turns, and coils of up to 40000 turns with a continuous approximation, which introduces negligible errors. The screening currents, the screening-current-induced field (SCIF) and the AC loss are analyzed in detail. The SCIF reaches its maximum, a considerably large value, at the remanent state. The instantaneous AC loss for an anisotropic, magnetic-field-dependent Jc is qualitatively different than for a constant Jc, although the loss per cycle is similar. Saturation of the magnetization currents at the end pancakes causes the maximum AC loss at the first ramp to increase with Jc. The presented modeling tool can accurately calculate the SCIF and AC loss in practical computing times for coils with any number of turns used in real windings, enabling parameter optimization.
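
    The paper itself relies on a fast numerical method; as rough background for the hysteresis-loss quantities it discusses, the textbook Bean critical-state estimate for a superconducting slab gives the loss per cycle in closed form. This formula is standard background, not taken from the paper:

```latex
% Hysteresis loss per cycle and unit volume, Bean critical-state model,
% for a slab of half-width a in a parallel field of amplitude B_m:
Q =
\begin{cases}
  \dfrac{2 B_m^{3}}{3 \mu_0 B_p}, & B_m \le B_p,\\[1.5ex]
  \dfrac{2 B_p}{\mu_0}\left(B_m - \dfrac{2 B_p}{3}\right), & B_m \ge B_p,
\end{cases}
\qquad B_p = \mu_0 J_c a .
```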

  11. Challenges of NDE Simulation Tool

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.

    2015-01-01

    Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided-wave-based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large-scale, complex-geometry composite components/vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper will also discuss examples of the use of simulation tools at NASA to develop new damage characterization methods, and associated challenges of validating those methods.

  12. Making Better Re/Insurance Underwriting and Capital Management Decisions with Public-Private-Academic Partnerships

    NASA Astrophysics Data System (ADS)

    Michel, G.; Gunasekera, R.; Werner, A.; Galy, H.

    2012-04-01

    Similar to 2001, 2004, and 2005, 2011 was another year of unexpected international catastrophe events, in which insured losses were more than twice the expected long-term annual average catastrophe losses of USD 30 to 40bn. Key catastrophe events that significantly contributed to these losses included the Mw 9.0 Great Tohoku earthquake and tsunami, the January 2011 floods in Queensland, the October 2011 floods in Thailand, the Mw 6.1 Christchurch earthquake, and convective (tornado) systems in the United States. However, despite considerable progress in catastrophe modelling, the advent of global catastrophe models, and increasing risk model coverage and skill in detailed modelling, the above-mentioned events were not satisfactorily modelled by the current mainstream Re/Insurance catastrophe models. This presentation therefore addresses problems in models, and the incomplete understanding identified from recent catastrophic events, by considering: i) the current modelling environment, and ii) how current processes could be improved via a) the understanding of risk within science networks such as the Willis Research Network, and b) the integration of risk model results from available insurance catastrophe models and tools. This presentation aims to highlight needed improvements in decision making and market practices, thereby advancing the current management of risk in the Re/Insurance industry. This also increases the need for better integration of Public-Private-Academic partnerships and tools to provide better estimates of not only financial loss but also humanitarian and infrastructural losses.

  13. Building a Case for Blocks as Kindergarten Mathematics Learning Tools

    ERIC Educational Resources Information Center

    Kinzer, Cathy; Gerhardt, Kacie; Coca, Nicole

    2016-01-01

    Kindergarteners need access to blocks as thinking tools to develop, model, test, and articulate their mathematical ideas. In the current educational landscape, resources such as blocks are being pushed to the side and being replaced by procedural worksheets and academic "seat time" in order to address standards. Mathematics research…

  14. Mobile Adaptive Communication Support for Vocabulary Acquisition

    ERIC Educational Resources Information Center

    Epp, Carrie Demmans

    2014-01-01

    This work explores the use of an adaptive mobile tool for language learning. A school-based deployment study showed that the tool supported learning. A second study is being conducted in informal learning environments. Current work focuses on building models that increase our understanding of the relationship between application usage and learning.

  15. A coarse-grained model for DNA origami.

    PubMed

    Reshetnikov, Roman V; Stolyarova, Anastasia V; Zalevsky, Arthur O; Panteleev, Dmitry Y; Pavlova, Galina V; Klinov, Dmitry V; Golovin, Andrey V; Protopopova, Anna D

    2018-02-16

    Modeling tools provide valuable support for DNA origami design. However, current solutions have limited application for conformational analysis of the designs. In this work we present a tool for a thorough study of DNA origami structure and dynamics. The tool is based on a novel coarse-grained model dedicated to geometry optimization and conformational analysis of DNA origami. We explored the ability of the model to predict the dynamic behavior, global shapes, and fine details of two single-layer systems designed in hexagonal and square lattices, using atomic force microscopy, Förster resonance energy transfer spectroscopy, and all-atom molecular dynamics simulations for validation of the results. We also examined the performance of the model for multilayer systems by simulating DNA origami with published cryo-electron microscopy and atomic force microscopy structures. A good agreement between the simulated and experimental data makes the model suitable for conformational analysis of DNA origami objects. The tool is available at http://vsb.fbb.msu.ru/cosm as a web service and as a standalone version.

  16. A coarse-grained model for DNA origami

    PubMed Central

    Stolyarova, Anastasia V; Zalevsky, Arthur O; Panteleev, Dmitry Y; Pavlova, Galina V; Klinov, Dmitry V; Golovin, Andrey V; Protopopova, Anna D

    2018-01-01

    Modeling tools provide valuable support for DNA origami design. However, current solutions have limited application for conformational analysis of the designs. In this work we present a tool for a thorough study of DNA origami structure and dynamics. The tool is based on a novel coarse-grained model dedicated to geometry optimization and conformational analysis of DNA origami. We explored the ability of the model to predict the dynamic behavior, global shapes, and fine details of two single-layer systems designed in hexagonal and square lattices, using atomic force microscopy, Förster resonance energy transfer spectroscopy, and all-atom molecular dynamics simulations for validation of the results. We also examined the performance of the model for multilayer systems by simulating DNA origami with published cryo-electron microscopy and atomic force microscopy structures. A good agreement between the simulated and experimental data makes the model suitable for conformational analysis of DNA origami objects. The tool is available at http://vsb.fbb.msu.ru/cosm as a web service and as a standalone version. PMID:29267876

  17. The National energy modeling system

    NASA Astrophysics Data System (ADS)

    The DOE uses a variety of energy and economic models to forecast energy supply and demand. It also uses a variety of more narrowly focused analytical tools to examine energy policy options. For the purposes of this work, this set of models and analytical tools is called the National Energy Modeling System (NEMS). The NEMS is the result of many years of development of energy modeling and analysis tools, many of which were developed for different applications and under different assumptions. As such, NEMS is considered less than satisfactory in certain areas. For example, NEMS is difficult to keep updated and expensive to use. Various outputs are often difficult to reconcile. Products were not required to interface, but were designed to stand alone. Because different developers were involved, the inner workings of the NEMS are often not easily or fully understood. Even with these difficulties, however, NEMS comprises the best tools currently identified to deal with our global, national and regional energy modeling and analysis needs.

  18. On the importance of incorporating sampling weights in occupancy model estimation

    EPA Science Inventory

    Occupancy models are used extensively to assess wildlife-habitat associations and to predict species distributions across large geographic regions. Occupancy models were developed as a tool to properly account for imperfect detection of a species. Current guidelines on survey des...
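
    The abstract is truncated before the model details, but the standard single-season occupancy model it refers to has a simple likelihood, and sampling weights can enter as a weighted pseudo-likelihood. The sketch below is generic background under that assumption, not the survey-design code behind this record; all data are invented.

```python
# Weighted pseudo-likelihood for a single-season occupancy model:
# psi = occupancy probability, p = per-visit detection probability.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def neg_log_lik(params, y, J, w):
    psi, p = expit(params)                 # logit-scale parameters
    # sites with at least one detection are certainly occupied
    ll_detected = np.log(psi) + y * np.log(p) + (J - y) * np.log(1 - p)
    # sites never detected: occupied-but-missed, or truly unoccupied
    ll_never = np.log(psi * (1 - p) ** J + (1 - psi))
    ll = np.where(y > 0, ll_detected, ll_never)
    return -np.sum(w * ll)                 # sampling weights scale each site

y = np.array([2, 0, 1, 0, 3])              # detections out of J visits per site
J = 4
w = np.array([1.5, 0.8, 1.0, 2.0, 0.7])    # survey-design sampling weights
fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(y, J, w))
print("psi, p =", expit(fit.x))
```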

  19. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  20. Web tools for predictive toxicology model building.

    PubMed

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry have accumulated more than 15 years of history already. Powered by the advances in Internet technologies, the current generation of web systems is starting to expand into areas traditionally reserved for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges for the management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  1. Patient-centered medical home model: do school-based health centers fit the model?

    PubMed

    Larson, Satu A; Chapman, Susan A

    2013-01-01

    School-based health centers (SBHCs) are an important component of health care reform. The SBHC model of care offers accessible, continuous, comprehensive, family-centered, coordinated, and compassionate care to infants, children, and adolescents. These same elements comprise the patient-centered medical home (PCMH) model of care being promoted by the Affordable Care Act with the hope of lowering health care costs by rewarding clinicians for primary care services. PCMH survey tools have been developed to help payers determine whether a clinician/site serves as a PCMH. Our concern is that current survey tools will be unable to capture how an SBHC may provide a medical home, and SBHCs may therefore be denied needed funding. This article describes how SBHCs might meet the requirements of one PCMH tool. SBHC stakeholders need to advocate for the creation or modification of existing survey tools that allow the unique characteristics of SBHCs to qualify as PCMHs.

  2. An Overview of Tools for Creating, Validating and Using PDS Metadata

    NASA Astrophysics Data System (ADS)

    King, T. A.; Hardman, S. H.; Padams, J.; Mafi, J. N.; Cecconi, B.

    2017-12-01

    NASA's Planetary Data System (PDS) has defined information models for creating metadata to describe bundles, collections and products for all the assets acquired by planetary science projects. Version 3 of the PDS Information Model (commonly known as "PDS3") is widely used and describes most of the existing planetary archive. Recently PDS has released version 4 of the Information Model (commonly known as "PDS4"), which is designed to improve the consistency, efficiency and discoverability of information. To aid in creating, validating and using PDS4 metadata, the PDS and a few associated groups have developed a variety of tools. In addition, some commercial tools, both free and for a fee, can be used to create and work with PDS4 metadata. We present an overview of these tools, describe those tools currently under development, and provide guidance as to which tools may be most useful for missions, instrument teams and the individual researcher.

  3. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.

  4. A quality-based cost model for new electronic systems and products

    NASA Astrophysics Data System (ADS)

    Shina, Sammy G.; Saigal, Anil

    1998-04-01

    This article outlines a method for developing a quality-based cost model for the design of new electronic systems and products. The model incorporates a methodology for determining a cost-effective design margin allocation for electronic products and systems and its impact on manufacturing quality and cost. A spreadsheet-based cost estimating tool was developed to help implement this methodology in order for the system design engineers to quickly estimate the effect of design decisions and tradeoffs on the quality and cost of new products. The tool was developed with automatic spreadsheet connectivity to current process capability and with provisions to consider the impact of capital equipment and tooling purchases to reduce the product cost.
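
    The spreadsheet tool itself is not public, but the core idea of trading design margin against manufacturing quality and cost can be illustrated with a standard process-capability calculation. The sketch below is a generic illustration under that assumption; the function names, unit counts, and costs are invented, not taken from the article.

```python
# Toy link between design margin (expressed as process capability Cpk)
# and expected quality cost. All numbers are hypothetical.
from scipy.stats import norm

def defect_rate(cpk):
    """Fraction nonconforming for a centered normal process with given Cpk."""
    return 2 * norm.cdf(-3 * cpk)

def expected_quality_cost(units, cpk, cost_per_defect):
    """Expected rework/scrap cost over a production run."""
    return units * defect_rate(cpk) * cost_per_defect

for cpk in (1.0, 1.33, 1.67):
    print(cpk, f"{defect_rate(cpk) * 1e6:.1f} ppm",
          f"${expected_quality_cost(100_000, cpk, 25.0):,.0f}")
```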

  5. Browsing Space Weather Data and Models with the Integrated Space Weather Analysis (iSWA) System

    NASA Technical Reports Server (NTRS)

    Maddox, Marlo M.; Mullinix, Richard E.; Berrios, David H.; Hesse, Michael; Rastaetter, Lutz; Pulkkinen, Antti; Hourcle, Joseph A.; Thompson, Barbara J.

    2011-01-01

    The Integrated Space Weather Analysis (iSWA) System is a comprehensive web-based platform for space weather information that combines data from solar, heliospheric and geospace observatories with forecasts based on the most advanced space weather models. The iSWA system collects, generates, and presents a wide array of space weather resources in an intuitive, user-configurable, and adaptable format - thus enabling users to respond to current and future space weather impacts as well as enabling post-impact analysis. iSWA currently provides over 200 data and modeling products, and features a variety of tools that allow the user to browse, combine, and examine data and models from various sources. This presentation will consist of a summary of the iSWA products and an overview of the customizable user interfaces, and will feature several tutorial demonstrations highlighting the interactive tools and advanced capabilities.

  6. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2004-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  7. An overview of quantitative approaches in Gestalt perception.

    PubMed

    Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H

    2016-09-01

    Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. A Formal Approach to Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.

  9. Current State and Model for Development of Technology-Based Care for Attention Deficit Hyperactivity Disorder

    PubMed Central

    Riley, Steven J.; Calub, Catrina A.; Schweitzer, Julie B.

    2016-01-01

    Introduction: Care (i.e., evaluation and intervention) delivered through technology is used in many areas of mental health services, including for persons with attention deficit hyperactivity disorder (ADHD). Technology can facilitate care for individuals with ADHD, their parents, and their care providers. The adoption of technological tools for ADHD care requires evidence-based studies to support the transition from development to integration into use in the home, school, or work for persons with the disorder. The initial phase, which is development of technological tools, has begun in earnest; however, the evidence base for many of these tools is lacking. In some instances, the uptake of a piece of technology into home use or clinical practice may be further along than the research to support its use. Methods: In this study, we review the current evidence regarding technology for ADHD and also propose a model to evaluate the support for other tools that have yet to be tested. Results: We propose using the Research Domain Criteria as a framework for evaluating the tools' relationships to dimensions related to ADHD. Conclusion: This article concludes with recommendations for testing new tools that may have promise in improving the evaluation or treatment of persons with ADHD. PMID:26985703

  10. AIR QUALITY SIMULATION MODEL PERFORMANCE FOR ONE-HOUR AVERAGES

    EPA Science Inventory

    If a one-hour standard for sulfur dioxide were promulgated, air quality dispersion modeling in the vicinity of major point sources would be an important air quality management tool. Would currently available dispersion models be suitable for use in demonstrating attainment of suc...

  11. Next generation data systems and knowledge products to support agricultural producers and science-based policy decision making.

    PubMed

    Capalbo, Susan M; Antle, John M; Seavert, Clark

    2017-07-01

    Research on next generation agricultural systems models shows that the most important current limitation is data, both for on-farm decision support and for research investment and policy decision making. One of the greatest data challenges is to obtain reliable data on farm management decision making, both for current conditions and under scenarios of changed bio-physical and socio-economic conditions. This paper presents a framework for the use of farm-level and landscape-scale models and data to provide analysis that could be used in NextGen knowledge products, such as mobile applications or personal computer data analysis and visualization software. We describe two analytical tools - AgBiz Logic and TOA-MD - that demonstrate the current capability of farm-level and landscape-scale models. The use of these tools is explored with a case study of an oilseed crop, Camelina sativa, which could be used to produce jet aviation fuel. We conclude with a discussion of innovations needed to facilitate the use of farm and policy-level models to generate data and analysis for improved knowledge products.

  12. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.
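
    The abstract describes the traffic-flow model only at a high level. A toy version of the underlying ideas (a runway service queue plus "windowing" congestion control that caps the number of aircraft taxiing at once) fits in a few lines; every parameter below is invented for illustration and is not from the actual tools.

```python
# Toy departure-queue model with N-control "windowing": aircraft i may not
# start taxiing until aircraft i - N_CONTROL has departed. All values invented.
import random

random.seed(1)
SERVICE = 1.0        # minimum runway spacing between departures, minutes
UNIMPEDED = 12.0     # unimpeded taxi-out time, minutes
N_CONTROL = 8        # max aircraft taxiing at once under windowing

pushback = sorted(random.uniform(0, 60) for _ in range(40))
release, takeoff = [], []
for i, t in enumerate(pushback):
    # gate-hold until the surface count drops below the window
    r = t if i < N_CONTROL else max(t, takeoff[i - N_CONTROL])
    wheels_up = r + UNIMPEDED
    if takeoff:                         # FIFO runway service constraint
        wheels_up = max(wheels_up, takeoff[-1] + SERVICE)
    release.append(r)
    takeoff.append(wheels_up)

taxi = [to - r for r, to in zip(release, takeoff)]
print(f"mean taxi-out time: {sum(taxi) / len(taxi):.1f} min")
```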

  13. Modeling Post-Accident Vehicle Egress

    DTIC Science & Technology

    2013-01-01

    interest for military situations may involve rolled-over vehicles for which detailed movement data are not available. In the current design process...test trials. These evaluations are expensive and time-consuming, and are often performed late in the design process when it is too difficult to...alter the design if weaknesses are discovered. Yet, due to the limitations of current software tools, digital human models (DHMs) are not yet widely

  14. Rendering the Intractable More Tractable: Tools from Caenorhabditis elegans Ripe for Import into Parasitic Nematodes

    PubMed Central

    Ward, Jordan D.

    2015-01-01

    Recent and rapid advances in genetic and molecular tools have brought spectacular tractability to Caenorhabditis elegans, a model that was initially prized because of its simple design and ease of imaging. C. elegans has long been a powerful model in biomedical research, and tools such as RNAi and the CRISPR/Cas9 system allow facile knockdown of genes and genome editing, respectively. These developments have created an additional opportunity to tackle one of the most debilitating burdens on global health and food security: parasitic nematodes. I review how development of nonparasitic nematodes as genetic models informs efforts to import tools into parasitic nematodes. Current tools in three commonly studied parasites (Strongyloides spp., Brugia malayi, and Ascaris suum) are described, as are tools from C. elegans that are ripe for adaptation and the benefits and barriers to doing so. These tools will enable dissection of a huge array of questions that have been all but completely impenetrable to date, allowing investigation into host–parasite and parasite–vector interactions, and the genetic basis of parasitism. PMID:26644478

  15. A model for flexible tools used in minimally invasive medical virtual environments.

    PubMed

    Soler, Francisco; Luzon, M Victoria; Pop, Serban R; Hughes, Chris J; John, Nigel W; Torres, Juan Carlos

    2011-01-01

    Within the limits of current technology, many applications of a virtual environment will trade off accuracy for speed. This is not an acceptable compromise in a medical training application, where both are essential. Efficient algorithms must therefore be developed. The purpose of this project is the development and validation of a novel physics-based real-time tool manipulation model, which is easy to integrate into any medical virtual environment that requires support for the insertion of long flexible tools into complex geometries. This encompasses medical specialities such as vascular interventional radiology, endoscopy, and laparoscopy, where training, prototyping of new instruments/tools and mission rehearsal can all be facilitated by using an immersive medical virtual environment. Our model accurately incorporates patient-specific data and adapts to the geometrical complexity of the vessel in real time.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutqvist, Jonny; Blanco Martin, Laura; Mukhopadhyay, Sumit

    In this report, we present FY2014 progress by Lawrence Berkeley National Laboratory (LBNL) related to modeling of coupled thermal-hydrological-mechanical-chemical (THMC) processes in salt and their effect on brine migration at high temperatures. LBNL's work on the modeling of coupled THMC processes in salt was initiated in FY2012, focusing on exploring and demonstrating the capabilities of an existing LBNL modeling tool (TOUGH-FLAC) for simulating temperature-driven coupled flow and geomechanical processes in salt. This work includes development related to, and implementation of, essential capabilities, as well as testing the model against relevant information and published experimental data related to the fate and transport of water. We provide more details on the FY2014 work, first presenting updated tools and improvements made to the TOUGH-FLAC simulator, and the use of this updated tool in a new model simulation of long-term THM behavior within a generic repository in a salt formation. This is followed by a description of current benchmarking and validation efforts, including the TSDE experiment. We then present the current status in the development of constitutive relationships and the dual-continuum model for brine migration. We conclude with an outlook for FY2015, which will be focused largely on model validation against field experiments and on the use of the model for design studies related to a proposed heater experiment.

  17. Web-based applications for building, managing and analysing kinetic models of biological systems.

    PubMed

    Lee, Dong-Yup; Saha, Rajib; Yusufi, Faraaz Noor Khan; Park, Wonjun; Karimi, Iftekhar A

    2009-01-01

    Mathematical modelling and computational analysis play an essential role in improving our capability to elucidate the functions and characteristics of complex biological systems such as metabolic, regulatory and cell signalling pathways. The modelling and concomitant simulation render it possible to predict the cellular behaviour of systems under various genetically and/or environmentally perturbed conditions. This motivates systems biologists/bioengineers/bioinformaticians to develop new tools and applications, allowing non-experts to easily conduct such modelling and analysis. However, among a multitude of systems biology tools developed to date, only a handful of projects have adopted a web-based approach to kinetic modelling. In this report, we evaluate the capabilities and characteristics of current web-based tools in systems biology and identify desirable features, limitations and bottlenecks for further improvements in terms of usability and functionality. A short discussion on software architecture issues involved in web-based applications and the approaches taken by existing tools is included for those interested in developing their own simulation applications.
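
    As a concrete example of the kind of kinetic model these web applications build and simulate, the sketch below integrates an irreversible Michaelis-Menten reaction as a small ODE system. It is generic background, not code from any of the reviewed tools, and the parameter values are arbitrary.

```python
# Irreversible Michaelis-Menten conversion of substrate S to product P,
# simulated as a two-variable ODE system. Parameters are arbitrary.
import numpy as np
from scipy.integrate import solve_ivp

Vmax, Km = 1.0, 0.5            # kinetic parameters (arbitrary units)

def rhs(t, y):
    S, P = y
    v = Vmax * S / (Km + S)    # Michaelis-Menten rate law
    return [-v, v]             # substrate consumed, product formed

sol = solve_ivp(rhs, (0.0, 20.0), [2.0, 0.0], dense_output=True)
t = np.linspace(0, 20, 5)
print(np.round(sol.sol(t).T, 3))   # [S, P] at selected times
```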

  18. Obs4MIPS: Satellite Observations for Model Evaluation

    NASA Astrophysics Data System (ADS)

    Ferraro, R.; Waliser, D. E.; Gleckler, P. J.

    2017-12-01

    This poster will review the current status of the obs4MIPs project, whose purpose is to provide a limited collection of well-established and documented datasets for comparison with Earth system models (https://www.earthsystemcog.org/projects/obs4mips/). These datasets have been reformatted to correspond with the CMIP5 model output requirements, and include technical documentation specifically targeted for their use in model output evaluation. The project holdings now exceed 120 datasets with observations that directly correspond to CMIP5 model output variables, with new additions in response to the CMIP6 experiments. With the growth in climate model output data volume, it is increasingly difficult to bring the model output and the observations together to do evaluations. The positioning of the obs4MIPs datasets within the Earth System Grid Federation (ESGF) allows for the use of currently available and planned online tools within the ESGF to perform analysis using model output and observational datasets without necessarily downloading everything to a local workstation. This past year, obs4MIPs has updated its submission guidelines to closely align with changes in the CMIP6 experiments, and is implementing additional indicators and ancillary data to allow users to more easily determine the efficacy of an obs4MIPs dataset for specific evaluation purposes. This poster will present the new guidelines and indicators, update the list of current obs4MIPs holdings, and describe their connection to the ESGF evaluation and analysis tools currently available and being developed for the CMIP6 experiments.

  19. Bio-AIMS Collection of Chemoinformatics Web Tools based on Molecular Graph Information and Artificial Intelligence Models.

    PubMed

    Munteanu, Cristian R; Gonzalez-Diaz, Humberto; Garcia, Rafael; Loza, Mabel; Pazos, Alejandro

    2015-01-01

    Encoding molecular information into molecular descriptors is the first step in in silico chemoinformatics methods in drug design. Machine Learning methods offer a sophisticated way to find prediction models for specific biological properties of molecules. These models connect molecular structure information, such as atom connectivity (molecular graphs) or physical-chemical properties of an atom or group of atoms, to the molecular activity (Quantitative Structure-Activity Relationship, QSAR). Due to the complexity of proteins, the prediction of their activity is a complicated task, and the interpretation of the models is more difficult. The current review presents a series of 11 prediction models for proteins, implemented as free web tools on an Artificial Intelligence Model Server in Biosciences, Bio-AIMS (http://bio-aims.udc.es/TargetPred.php). Six tools predict protein activity, two models evaluate drug-protein target interactions, and the other three calculate protein-protein interactions. The input information is based on the protein 3D structure for nine models, the 1D peptide amino acid sequence for three tools, and drug SMILES formulas for two servers. The molecular-graph-descriptor-based Machine Learning models could be useful tools for in silico screening of new peptides/proteins as future drug targets for specific treatments.

  20. Leveraging Observation Tools for Instructional Improvement: Exploring Variability in Uptake of Ambitious Instructional Practices

    ERIC Educational Resources Information Center

    Cohen, Julie; Schuldt, Lorien Chambers; Brown, Lindsay; Grossman, Pamela

    2016-01-01

    Background/Context: Current efforts to build rigorous teacher evaluation systems have increased interest in standardized classroom observation tools as reliable measures for assessing teaching. However, many argue these instruments can also be used to effect change in classroom practice. This study investigates a model of professional development…

  1. Alexander Meets Michotte: A Simulation Tool Based on Pattern Programming and Phenomenology

    ERIC Educational Resources Information Center

    Basawapatna, Ashok

    2016-01-01

    Simulation and modeling activities, a key element of computational thinking, are currently not being integrated into the science classroom. This paper describes a new visual programming tool entitled the Simulation Creation Toolkit. The Simulation Creation Toolkit is a high-level, pattern-based phenomenological approach to bringing rapid simulation…

  2. Uncertainty analysis of an irrigation scheduling model for water management in crop production

    USDA-ARS?s Scientific Manuscript database

    Irrigation scheduling tools are critical to allow producers to manage water resources for crop production in an accurate and timely manner. To be useful, these tools need to be accurate, complete, and relatively reliable. The current work presents the uncertainty analysis and its results for the Mis...

  3. Model Uncertainty and Bayesian Model Averaged Benchmark Dose Estimation for Continuous Data

    EPA Science Inventory

    The benchmark dose (BMD) approach has gained acceptance as a valuable risk assessment tool, but risk assessors still face significant challenges associated with selecting an appropriate BMD/BMDL estimate from the results of a set of acceptable dose-response models. Current approa...

  4. Development of a regional groundwater flow model for the area of the Idaho National Engineering Laboratory, Eastern Snake River Plain Aquifer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCarthy, J.M.; Arnett, R.C.; Neupauer, R.M.

    This report documents a study conducted to develop a regional groundwater flow model for the Eastern Snake River Plain Aquifer in the area of the Idaho National Engineering Laboratory (INEL). The model was developed to support Waste Area Group (WAG) 10, Operable Unit 10-04 groundwater flow and transport studies. The products of this study are this report and a set of computational tools designed to numerically model the regional groundwater flow in the Eastern Snake River Plain Aquifer. The objective of developing the current model was to create a tool for defining the regional groundwater flow at the INEL. The model was developed to (a) support future transport modeling for WAG 10-04 by providing the regional groundwater flow information needed for the WAG 10-04 risk assessment, (b) define the regional groundwater flow setting for modeling groundwater contaminant transport at the scale of the individual WAGs, (c) provide a tool for improving the understanding of the groundwater flow system below the INEL, and (d) consolidate the existing regional groundwater modeling information into one usable model. The current model is appropriate for defining the regional flow setting for flow submodels as well as for hypothesis testing to better understand the regional groundwater flow in the area of the INEL. The scale of the submodels must be chosen based on the accuracy required for the study.
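
    The report's specific formulation is not reproduced here, but regional groundwater flow models of this kind conventionally solve the saturated flow equation used in MODFLOW-type codes; steady-state models drop the storage term on the right-hand side. This is standard background, not a quotation from the report:

```latex
% Saturated groundwater flow equation (steady state: right-hand side = 0):
\frac{\partial}{\partial x}\!\left(K_x \frac{\partial h}{\partial x}\right)
+ \frac{\partial}{\partial y}\!\left(K_y \frac{\partial h}{\partial y}\right)
+ \frac{\partial}{\partial z}\!\left(K_z \frac{\partial h}{\partial z}\right)
+ W = S_s \frac{\partial h}{\partial t}
```

    Here h is hydraulic head, Kx, Ky, Kz are hydraulic conductivities, W is a volumetric source/sink term, and Ss is specific storage.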

  5. Wind Sensing, Analysis, and Modeling

    NASA Technical Reports Server (NTRS)

    Corvin, Michael A.

    1995-01-01

    The purpose of this task was to begin development of a unified approach to the sensing, analysis, and modeling of the wind environments in which launch systems operate. The initial activity was to examine the current usage and requirements for wind modeling for the Titan 4 vehicle. This was to be followed by joint technical efforts with NASA Langley Research Center to develop applicable analysis methods. This work was to be performed in, and demonstrate the use of, prototype tools implementing an environment in which to realize a unified system. At the convenience of the customer, due to resource limitations, the task was descoped. The survey of Titan 4 processes was accomplished and is reported in this document. A summary of general requirements is provided. Current versions of prototype Process Management Environment tools are being provided to the customer.

  6. Wind sensing, analysis, and modeling

    NASA Technical Reports Server (NTRS)

    Corvin, Michael A.

    1995-01-01

    The purpose of this task was to begin development of a unified approach to the sensing, analysis, and modeling of the wind environments in which launch systems operate. The initial activity was to examine the current usage and requirements for wind modeling for the Titan 4 vehicle. This was to be followed by joint technical efforts with NASA Langley Research Center to develop applicable analysis methods. This work was to be performed in and demonstrate the use of prototype tools implementing an environment in which to realize a unified system. At the convenience of the customer, due to resource limitations, the task was descoped. The survey of Titan 4 processes was accomplished and is reported in this document. A summary of general requirements is provided. Current versions of prototype Process Management Environment tools are being provided to the customer.

  7. NICMOS Temperature-specific Darks

    NASA Astrophysics Data System (ADS)

    Monroe, B.; Bergeron, E.

    1999-11-01

    The various components of NICMOS dark images have been modeled and combined to make synthetic dark calibration files which are intended for use with observations in a temperature range from 61 to ~75 K, currently available only for camera 2, with cameras 1 and 3 to follow in a few months. The amplifier glow and the true linear dark current have been constructed as temperature-independent quantities, while the “shading” component of the darks has been modeled as temperature-dependent. The data used to construct these models were taken with NIC 2, in a temperature range of 61 to 80 K, during the recent warm-up of NICMOS due to cryogen exhaustion. The resulting synthetic darks are available through a web-based tool on the STScI NICMOS website http://www.stsci.edu/instruments/nicmos/NICMOS_tools/syndark.html.
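
    The decomposition described above (temperature-independent amplifier glow and linear dark current, plus a temperature-dependent shading term) can be sketched as a simple composition of arrays. The functional forms and coefficients below are made-up placeholders to show the structure, not the STScI calibration model:

```python
# Toy composition of a synthetic dark frame: amp glow scales with the number
# of readouts, dark current scales linearly with exposure time, and shading
# depends on detector temperature. All coefficients are invented.
import numpy as np

def synthetic_dark(T_kelvin, exptime_s, n_reads, shape=(256, 256)):
    amp_glow = 0.1 * n_reads * np.ones(shape)          # per-readout glow (toy)
    dark_current = 0.05 * exptime_s * np.ones(shape)   # linear in time (toy)
    cols = np.arange(shape[1])
    # temperature-dependent column-wise shading (placeholder form)
    shading = 0.02 * (T_kelvin - 61.0) * np.exp(-cols / 64.0)
    return amp_glow + dark_current + shading[None, :]

dark = synthetic_dark(T_kelvin=66.5, exptime_s=128.0, n_reads=12)
print(dark.shape, float(dark.mean()))
```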

  8. High-Energy Electron-Ion and Photon-Ion Collisions: Status and Challenges

    NASA Technical Reports Server (NTRS)

    Kallman, Timothy R.

    2010-01-01

    Non-LTE plasmas are ubiquitous in objects studied in the UV and X-ray energy bands. Collisional and photoionization cross sections for atoms and ions are fundamental to our ability to model such plasmas. Modeling is key in the X-ray band, where detector properties and limited spectral resolution limit the ability to measure model-independent line strengths or other spectral features. Much of the motivation for studying such collisions, and many of the tools, are not new. However, the motivation for such studies and their applications have been affected by the advent of X-ray spectroscopy with the gratings on Chandra and XMM-Newton. In this talk I will review this motivation and describe the tools currently in use for such studies. I will also describe some current unresolved problems and the likely future needs for such data.

  9. A Micro-Grid Simulator Tool (SGridSim) using Effective Node-to-Node Complex Impedance (EN2NCI) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Udhay Ravishankar; Milos Manic

    2013-08-01

    This paper presents SGridSim, a micro-grid simulator tool useful for implementing and testing multi-agent controllers. As a common engineering practice, it is important to have a tool that simplifies the modeling of the salient features of a desired system. In electric micro-grids, these salient features are the voltage and power distributions within the micro-grid. Current simplified electric power grid simulator tools such as PowerWorld, PowerSim, Gridlab, etc., model only the power distribution features of a desired micro-grid. Other power grid simulators such as Simulink, Modelica, etc., use detailed modeling to accommodate the voltage distribution features. SGridSim simplifies the modeling of both the voltage and power distribution features in a desired micro-grid. It accomplishes this simplified modeling by using Effective Node-to-Node Complex Impedance (EN2NCI) models of components that typically make up a micro-grid. The term EN2NCI models means that the impedance-based components of a micro-grid are modeled as single impedances tied between their respective voltage nodes on the micro-grid. Hence the benefits of the presented SGridSim tool are: (1) simulation of a micro-grid is performed strictly in the complex domain; (2) faster simulation of a micro-grid by avoiding the simulation of detailed transients. An example micro-grid model was built using the SGridSim tool and tested to simulate both the voltage and power distribution features with a total absolute relative error of less than 6%.
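
    The EN2NCI idea (each component reduced to a single complex impedance between voltage nodes, with the whole simulation carried out with phasors) is essentially complex-domain nodal analysis. The sketch below solves a three-node toy network that way; the network topology and values are invented for illustration and have no connection to the SGridSim code.

```python
# Complex-domain nodal analysis: every component is one complex impedance
# between voltage nodes, and the grid is solved as Y V = I with phasors.
import numpy as np

# branch list: (node_a, node_b, impedance); node 0 is the reference/ground
branches = [(0, 1, 0.1 + 0.5j), (1, 2, 0.2 + 1.0j), (2, 0, 10.0 + 0.0j)]
n = 2                                 # number of non-reference nodes
Y = np.zeros((n, n), dtype=complex)   # nodal admittance matrix
for a, b, z in branches:
    y = 1.0 / z
    for p in (a, b):                  # self-admittance at each non-ground node
        if p:
            Y[p - 1, p - 1] += y
    if a and b:                       # mutual admittance between node pair
        Y[a - 1, b - 1] -= y
        Y[b - 1, a - 1] -= y

I = np.array([10.0 + 0.0j, 0.0])      # phasor current injection at node 1
V = np.linalg.solve(Y, I)             # complex node voltages
print(np.abs(V), np.angle(V, deg=True))
```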

  10. Simulation of networks of spiking neurons: A review of tools and strategies

    PubMed Central

    Brette, Romain; Rudolph, Michelle; Carnevale, Ted; Hines, Michael; Beeman, David; Bower, James M.; Diesmann, Markus; Morrison, Abigail; Goodman, Philip H.; Harris, Frederick C.; Zirpe, Milind; Natschläger, Thomas; Pecevski, Dejan; Ermentrout, Bard; Djurfeldt, Mikael; Lansner, Anders; Rochel, Olivier; Vieville, Thierry; Muller, Eilif; Davison, Andrew P.; El Boustani, Sami

    2009-01-01

    We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We overview different simulators and simulation environments presently available (restricted to those freely available, open source and documented). For each simulation tool, its advantages and pitfalls are reviewed, with an aim to allow the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin–Huxley type and integrate-and-fire models, interacting with current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given modeling problem related to spiking neural networks. PMID:17629781
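
    One of the simplest model classes the review benchmarks, a current-driven leaky integrate-and-fire neuron under clock-driven (fixed time step) integration, fits in a few lines. The parameters below are generic textbook values, not the benchmark configurations from the paper:

```python
# Clock-driven (fixed-step Euler) simulation of a leaky integrate-and-fire
# neuron driven by a constant current. Generic textbook parameters.
dt, T = 0.1e-3, 0.5                 # time step and duration, seconds
tau_m, v_rest, v_th, v_reset = 20e-3, -70e-3, -50e-3, -65e-3
R_m, I_ext = 1e8, 2.2e-10           # membrane resistance (ohm), input (A)

v, spikes = v_rest, []
for step in range(int(T / dt)):
    dv = (-(v - v_rest) + R_m * I_ext) / tau_m * dt   # Euler update
    v += dv
    if v >= v_th:                   # threshold crossing: spike and reset
        spikes.append(step * dt)
        v = v_reset

print(f"{len(spikes)} spikes, mean rate {len(spikes) / T:.1f} Hz")
```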

  11. Modeling fuels and fire effects in 3D: Model description and applications

    Treesearch

    Francois Pimont; Russell Parsons; Eric Rigolot; Francois de Coligny; Jean-Luc Dupuy; Philippe Dreyfus; Rodman R. Linn

    2016-01-01

    Scientists and managers critically need ways to assess how fuel treatments alter fire behavior, yet few tools currently exist for this purpose. We present a spatially explicit fuel-modeling system, FuelManager, which models fuels, vegetation growth, fire behavior (using a physics-based model, FIRETEC), and fire effects. FuelManager's flexible approach facilitates…

  12. A META-COMPOSITE SOFTWARE DEVELOPMENT APPROACH FOR TRANSLATIONAL RESEARCH

    PubMed Central

    Sadasivam, Rajani S.; Tanik, Murat M.

    2013-01-01

    Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users’ needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements. PMID:23504436

  13. A meta-composite software development approach for translational research.

    PubMed

    Sadasivam, Rajani S; Tanik, Murat M

    2013-06-01

    Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users' needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements.

  14. Biodiversity in environmental assessment-current practice and tools for prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gontier, Mikael; Balfors, Berit; Moertberg, Ulla

    Habitat loss and fragmentation are major threats to biodiversity. Environmental impact assessment and strategic environmental assessment are essential instruments used in physical planning to address such problems. Yet there are no well-developed methods for quantifying and predicting impacts of fragmentation on biodiversity. In this study, a literature review was conducted on GIS-based ecological models that have potential as prediction tools for biodiversity assessment. Further, a review of environmental impact statements for road and railway projects from four European countries was performed, to study how impact prediction concerning biodiversity issues was addressed. The results of the study showed the existing gap between research in GIS-based ecological modelling and current practice in biodiversity assessment within environmental assessment.

  15. Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education

    NASA Astrophysics Data System (ADS)

    Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki

    This paper presents browser-based, multimedia-rich software tools and e-learning curricula that support the design and modeling of power electronics circuits and help explain sometimes rather sophisticated phenomena. Two projects will be discussed. The so-called Inetele project is financed by the Leonardo da Vinci program of the European Union (EU). It is a collaborative project between numerous EU universities and institutes to develop a state-of-the-art curriculum in Electrical Engineering. Another cooperative project, with participation of Japanese, European, and Australian institutes, focuses on developing e-learning curricula and interactive design and modeling tools, as well as a virtual laboratory. Snapshots from these two projects will be presented.

  16. Thinking outside the channel: modeling nitrogen cycling in networked river ecosystems

    Treesearch

    Ashley M. Helton; Geoffrey C. Poole; Judy L. Meyer; Wilfred M. Wollheim; Bruce J. Peterson; Patrick J. Mulholland; Emily S. Bernhardt; Jack A. Stanford; Clay Arango; Linda R. Ashkenas; Lee W. Cooper; Walter K. Dodds; Stanley V. Gregory; Robert O. Hall; Stephen K. Hamilton; Sherri L. Johnson; William H. McDowell; Jody D. Potter; Jennifer L. Tank; Suzanne M. Thomas; H. Maurice Valett; Jackson R. Webster; Lydia Zeglin

    2011-01-01

    Agricultural and urban development alters nitrogen and other biogeochemical cycles in rivers worldwide. Because such biogeochemical processes cannot be measured empirically across whole river networks, simulation models are critical tools for understanding river-network biogeochemistry. However, limitations inherent in current models restrict our ability to simulate...

  17. INVERSE MODELING TO ESTIMATE NH3 EMISSION SEASONALLY AND THE SENSITIVITY TO UNCERTAINTY REPRESENTATIONS

    EPA Science Inventory

    Inverse modeling has been used extensively on the global scale to produce top-down estimates of emissions for chemicals such as CO and CH4. Regional scale air quality studies could also benefit from inverse modeling as a tool to evaluate current emission inventories; however, ...

  18. Scaling a Human Body Finite Element Model with Radial Basis Function Interpolation

    DTIC Science & Technology

    Human body models are currently used to evaluate the body's response to a variety of threats to the Soldier. The ability to adjust the size of human body models is currently limited because of the complex shape changes that are required. Here, a radial basis function interpolation method is used to morph the shape of an existing finite element mesh. Tools are developed and integrated into the Blender computer graphics software to assist with…
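
    The record names radial basis function interpolation as the morphing mechanism; the sketch below shows the generic RBF machinery (solve for kernel weights at known control-point displacements, then displace every mesh node), using a Gaussian kernel and invented control points rather than anything from the report itself.

      import numpy as np

      def rbf_morph(nodes, ctrl_pts, ctrl_disp, eps=1.0):
          """Displace mesh nodes from known control-point displacements."""
          def kernel(a, b):
              # Gaussian RBF; thin-plate splines are another common choice
              d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
              return np.exp(-eps * d2)
          phi = kernel(ctrl_pts, ctrl_pts)           # (m, m) interpolation matrix
          weights = np.linalg.solve(phi, ctrl_disp)  # one weight column per axis
          return nodes + kernel(nodes, ctrl_pts) @ weights

      # Example: stretch a unit-square patch by moving two corner control points
      nodes = np.array([[0.5, 0.5], [0.25, 0.75]])
      ctrl = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
      disp = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.0], [0.1, 0.0]])
      print(rbf_morph(nodes, ctrl, disp))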

  19. Modeling energy/economy interactions for conservation and renewable energy-policy analysis

    NASA Astrophysics Data System (ADS)

    Groncki, P. J.

    Energy policy, its implications for policy analysis, and the associated methodological tools are discussed. The evolution of one methodological approach is reported: the combined modeling system, its component models, their evolution in response to changing analytic needs, and the development of the integrated framework. The analyses performed over the past several years are summarized. The current philosophy behind energy policy is discussed and compared to recent history. Implications for current policy analysis and methodological approaches are drawn.

  20. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
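
    As a minimal illustration of the kind of Markov reliability model such tools construct and solve automatically, the sketch below computes the unreliability of a hypothetical triplex system as a continuous-time Markov chain; the rates and state structure are invented, and real models add fault-handling and coverage states.

      import numpy as np
      from scipy.linalg import expm

      lam = 1e-4                             # failures per hour per unit (assumed)
      Q = np.array([[-3*lam, 3*lam, 0.0],    # state 0: 3 units good
                    [0.0, -2*lam, 2*lam],    # state 1: 2 units good, still operational
                    [0.0, 0.0, 0.0]])        # state 2: system failed (absorbing)
      p0 = np.array([1.0, 0.0, 0.0])         # start with all units healthy
      t = 10.0                               # mission time (hours)
      p_t = p0 @ expm(Q * t)                 # transient state probabilities
      print(f"unreliability at t={t} h: {p_t[2]:.3e}")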

  1. Climate Model Diagnostic Analyzer

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. With an exploratory nature of climate data analyses and an explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone sharing them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables the physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.

  2. Data needs for X-ray astronomy satellites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kallman, T.

    I review the current status of atomic data for X-ray astronomy satellites. This includes some of the astrophysical issues which can be addressed, current modeling and analysis techniques, computational tools, the limitations imposed by currently available atomic data, and the validity of standard assumptions. I also discuss the future: challenges associated with future missions and goals for atomic data collection.

  3. Urothelial cancer of the upper urinary tract: emerging biomarkers and integrative models for risk stratification.

    PubMed

    Mathieu, Romain; Vartolomei, Mihai D; Mbeutcha, Aurélie; Karakiewicz, Pierre I; Briganti, Alberto; Roupret, Morgan; Shariat, Shahrokh F

    2016-08-01

    The aim of this review was to provide an overview of current biomarkers and risk stratification models in urothelial cancer of the upper urinary tract (UTUC). A non-systematic Medline/PubMed literature search was performed using the terms "biomarkers", "preoperative models", "postoperative models", "risk stratification", together with "upper tract urothelial carcinoma". Original articles published between January 2003 and August 2015 were included based on their clinical relevance. Additional references were collected by cross-referencing the bibliography of the selected articles. Various promising predictive and prognostic biomarkers have been identified in UTUC thanks to increasing knowledge of the different biological pathways involved in UTUC tumorigenesis. These biomarkers may help identify tumors with aggressive biology and worse outcomes. Current tools aim at predicting muscle-invasive or non-organ-confined disease, renal failure after radical nephroureterectomy, and survival outcomes. These models are still mainly based on imaging and clinicopathological features, and none has integrated biomarkers. Risk stratification in UTUC is still suboptimal, especially in the preoperative setting, due to current limitations in staging and grading. Identification of novel biomarkers and external validation of current prognostic models may help improve risk stratification to allow evidence-based counselling for kidney-sparing approaches, perioperative chemotherapy and/or risk-based surveillance. Despite growing understanding of the biology underlying UTUC, management of this disease remains difficult due to the lack of validated biomarkers and the limitations of current predictive and prognostic tools. Further efforts and collaborations are necessary to allow their integration in daily practice.

  4. You Are Your Words: Modeling Students' Vocabulary Knowledge with Natural Language Processing Tools

    ERIC Educational Resources Information Center

    Allen, Laura K.; McNamara, Danielle S.

    2015-01-01

    The current study investigates the degree to which the lexical properties of students' essays can inform stealth assessments of their vocabulary knowledge. In particular, we used indices calculated with the natural language processing tool, TAALES, to predict students' performance on a measure of vocabulary knowledge. To this end, two corpora were…

  5. Recent Contributions of Information Sciences Research at RAND to Modeling- and Simulation-Based Policy Analysis.

    ERIC Educational Resources Information Center

    Donohue, G. L.; And Others

    This report presents examples of Rand's current research in the information sciences and illustrates the application of information science tools to specific policy studies. The projects discussed depict Rand's success with using corporate seed money to bridge the gap between the research and development of new information science tools and…

  6. Quality Tools for Professional Higher Education Review and Improvement. PHExcel Report

    ERIC Educational Resources Information Center

    Jørgensen, Malene Dahl; Sparre Kristensen, Regitze; Wimpf, Alexandre; Delplace, Stefan

    2014-01-01

    The report is the project's first outcome, and provides an overview of quality tools, quality models and quality labels, currently in use in (professional) higher education. It is followed by a gap analysis as regards the Standards and Guidelines for quality assurance in the European Higher Education Area (ESG), and the identified characteristics…

  7. Re-defining and quantifying inorganic phosphate pools in the Soil and Water Assessment Tool

    USDA-ARS?s Scientific Manuscript database

    Abstract The Soil and Water Assessment Tool (SWAT), a large-scale hydrologic model, can be used to estimate the impact of land management practices on phosphate (P) loading in streams and water bodies. Three inorganic soil P pools (labile, active, and stable P) are currently defined in the SWAT mo...

  8. Digital fabrication of textiles: an analysis of electrical networks in 3D knitted functional fabrics

    NASA Astrophysics Data System (ADS)

    Vallett, Richard; Knittel, Chelsea; Christe, Daniel; Castaneda, Nestor; Kara, Christina D.; Mazur, Krzysztof; Liu, Dani; Kontsos, Antonios; Kim, Youngmoo; Dion, Genevieve

    2017-05-01

    Digital fabrication methods are reshaping design and manufacturing processes through the adoption of pre-production visualization and analysis tools, which help minimize waste of materials and time. Despite the increasingly widespread use of digital fabrication techniques, comparatively few of these advances have benefited the design and fabrication of textiles. The development of functional fabrics such as knitted touch sensors, antennas, capacitors, and other electronic textiles could benefit from the same advances in electrical network modeling that revolutionized the design of integrated circuits. In this paper, the efficacy of using current state-of-the-art digital fabrication tools over the more common trial-and-error methods currently used in textile design is demonstrated. Gaps are then identified in the current state-of-the-art tools that must be resolved to further develop and streamline the rapidly growing field of smart textiles and devices, bringing textile production into the realm of 21st century manufacturing.

  9. Evaluation of modeling as a tool to determine the potential impacts related to drilling wastes in the Brazilian offshore.

    PubMed

    Pivel, María Alejandra Gómez; Dal Sasso Freitas, Carla Maria

    2010-08-01

    Numerical models that predict the fate of drilling discharges at sea constitute a valuable tool for both the oil industry and regulatory agencies. In order to provide reliable estimates, models must be validated through the comparison of predictions with field or laboratory observations. In this paper, we used the Offshore Operators Committee Model to simulate the discharges from two wells drilled at Campos Basin, offshore SE Brazil, and compared the results with field observations obtained 3 months after drilling. The comparison showed that the model provided reasonable predictions, considering that data about currents were reconstructed and theoretical data were used to characterize the classes of solids. The model proved to be a valuable tool to determine the degree of potential impact associated with drilling activities. However, since the accuracy of the model is directly dependent on the quality of input data, different possible scenarios should be considered when used for forecast modeling.

  10. OLTARIS: On-Line Tool for the Assessment of Radiation in Space

    NASA Technical Reports Server (NTRS)

    Sandridge, Chris A.; Blattnig, Steve R.; Clowdsley, Martha S.; Norbury, John; Qualis, Garry D.; Simonsen, Lisa C.; Singleterry, Robert C.; Slaba, Tony C.; Walker, Steven A.; Badavi, Francis F.

    2009-01-01

    The effects of ionizing radiation on humans in space is a major technical challenge for exploration to the moon and beyond. The radiation shielding team at NASA Langley Research Center has been working for over 30 years to develop techniques that can efficiently assist the engineer throughout the entire design process. OLTARIS: On-Line Tool for the Assessment of Radiation in Space is a new NASA website (http://oltaris.larc.nasa.gov) that allows engineers and physicists to access a variety of tools and models to study the effects of ionizing space radiation on humans and shielding materials. The site is intended to be an analysis and design tool for those working radiation issues for current and future manned missions, as well as a research tool for developing advanced material and shielding concepts. The site, along with the analysis tools and models within, have been developed using strict software practices to ensure reliable and reproducible results in a production environment. They have also been developed as a modular system so that models and algorithms can be easily added or updated.

  11. Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools

    NASA Astrophysics Data System (ADS)

    Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.

    2011-12-01

    Currently available, and soon-to-be available, sophisticated 3D models of particle acceleration and transport in solar flares require a new level of user-friendly visualization and analysis tools allowing quick and easy adjustment of model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc.). We report the current state of these tools, which are under development and have already proved highly efficient for direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. Various tools provided allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating in user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/nonthermal particle distribution models. By default, the application integrates IDL-callable DLL and shared libraries containing fast GS emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows interchanging these default libraries with any user-defined IDL or external callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the tool's capacity and generality, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by the NORH and RHESSI instruments. We discuss further anticipated developments of the tools needed to accommodate temporal evolution of the magnetic field structure and/or the fast electron population implied by electron acceleration and transport. This work was supported in part by NSF grants AGS-0961867, AST-0908344, and NASA grants NNX10AF27G and NNX11AB49G to the New Jersey Institute of Technology, by a UK STFC rolling grant, an STFC/PPARC Advanced Fellowship, and the Leverhulme Trust, UK. Financial support by the European Commission through the SOLAIRE and HESPE Networks is gratefully acknowledged.
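
    The interchangeable libraries mentioned above all reduce to solving the radiative transfer equation dI/ds = j - kI along each line of sight. The sketch below shows the standard piecewise-constant analytic update per cell; the emissivity and absorption profiles are arbitrary placeholders, not values from any flare model.

      import numpy as np

      def integrate_los(j, k, ds):
          """Per-cell analytic update: exact when j and k are constant in the cell."""
          intensity = 0.0
          for jj, kk in zip(j, k):
              if kk > 0:
                  att = np.exp(-kk * ds)
                  intensity = intensity * att + (jj / kk) * (1.0 - att)
              else:
                  intensity += jj * ds   # optically thin limit
          return intensity

      j = np.linspace(1.0, 2.0, 50)   # emissivity per cell (arbitrary units)
      k = np.full(50, 0.02)           # absorption coefficient per cell
      print(integrate_los(j, k, ds=1.0))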

  12. The development of a plant risk evaluation (PRE) tool for assessing the invasive potential of ornamental plants.

    PubMed

    Conser, Christiana; Seebacher, Lizbeth; Fujino, David W; Reichard, Sarah; DiTomaso, Joseph M

    2015-01-01

    Weed Risk Assessment (WRA) methods for evaluating invasiveness in plants have evolved rapidly in the last two decades. Many WRA tools exist, but none were specifically designed to screen ornamental plants prior to being released into the environment. To be accepted as a tool to evaluate ornamental plants for the nursery industry, it is critical that a WRA tool accurately predicts non-invasiveness without falsely categorizing them as invasive. We developed a new Plant Risk Evaluation (PRE) tool for ornamental plants. The 19 questions in the final PRE tool were narrowed down from 56 original questions from existing WRA tools. We evaluated the 56 WRA questions by screening 21 known invasive and 14 known non-invasive ornamental plants. After statistically comparing the predictability of each question and the frequency the question could be answered for both invasive and non-invasive species, we eliminated questions that provided no predictive power, were irrelevant in our current model, or could not be answered reliably at a high enough percentage. We also combined many similar questions. The final 19 remaining PRE questions were further tested for accuracy using 56 additional known invasive plants and 36 known non-invasive ornamental species. The resulting evaluation demonstrated that when "needs further evaluation" classifications were not included, the accuracy of the model was 100% for both predicting invasiveness and non-invasiveness. When "needs further evaluation" classifications were included as either false positive or false negative, the model was still 93% accurate in predicting invasiveness and 97% accurate in predicting non-invasiveness, with an overall accuracy of 95%. We conclude that the PRE tool should not only provide growers with a method to accurately screen their current stock and potential new introductions, but also increase the probability of the tool being accepted for use by the industry as the basis for a nursery certification program.

  13. The Development of a Plant Risk Evaluation (PRE) Tool for Assessing the Invasive Potential of Ornamental Plants

    PubMed Central

    Conser, Christiana; Seebacher, Lizbeth; Fujino, David W.; Reichard, Sarah; DiTomaso, Joseph M.

    2015-01-01

    Weed Risk Assessment (WRA) methods for evaluating invasiveness in plants have evolved rapidly in the last two decades. Many WRA tools exist, but none were specifically designed to screen ornamental plants prior to being released into the environment. To be accepted as a tool to evaluate ornamental plants for the nursery industry, it is critical that a WRA tool accurately predicts non-invasiveness without falsely categorizing them as invasive. We developed a new Plant Risk Evaluation (PRE) tool for ornamental plants. The 19 questions in the final PRE tool were narrowed down from 56 original questions from existing WRA tools. We evaluated the 56 WRA questions by screening 21 known invasive and 14 known non-invasive ornamental plants. After statistically comparing the predictability of each question and the frequency the question could be answered for both invasive and non-invasive species, we eliminated questions that provided no predictive power, were irrelevant in our current model, or could not be answered reliably at a high enough percentage. We also combined many similar questions. The final 19 remaining PRE questions were further tested for accuracy using 56 additional known invasive plants and 36 known non-invasive ornamental species. The resulting evaluation demonstrated that when “needs further evaluation” classifications were not included, the accuracy of the model was 100% for both predicting invasiveness and non-invasiveness. When “needs further evaluation” classifications were included as either false positive or false negative, the model was still 93% accurate in predicting invasiveness and 97% accurate in predicting non-invasiveness, with an overall accuracy of 95%. We conclude that the PRE tool should not only provide growers with a method to accurately screen their current stock and potential new introductions, but also increase the probability of the tool being accepted for use by the industry as the basis for a nursery certification program. PMID:25803830

  14. Physics-based process model approach for detecting discontinuity during friction stir welding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shrivastava, Amber; Pfefferkorn, Frank E.; Duffie, Neil A.

    2015-02-12

    The goal of this work is to develop a method for detecting the creation of discontinuities during friction stir welding. This in situ weld monitoring method could significantly reduce the need for post-process inspection. A process force model and a discontinuity force model were created based on the state-of-the-art understanding of flow around a friction stir welding (FSW) tool. These models are used to predict the FSW forces and the size of discontinuities formed in the weld. Friction stir welds with discontinuities and welds without discontinuities were created, and the differences in force dynamics were observed. In this paper, discontinuities were generated by reducing the tool rotation frequency and increasing the tool traverse speed in order to create "cold" welds. Experimental force data for welds with discontinuities and welds without discontinuities compared favorably with the predicted forces. The model currently overpredicts the discontinuity size.

  15. SIRTF Tools for DIRT

    NASA Astrophysics Data System (ADS)

    Pound, M. W.; Wolfire, M. G.; Amarnath, N. S.

    2003-12-01

    The Dust InfraRed ToolBox (DIRT - a part of the Web Infrared ToolShed, or WITS, located at http://dustem.astro.umd.edu) is a Java applet for modeling astrophysical processes in circumstellar shells around young and evolved stars. DIRT has been used by the astrophysics community for about 5 years. Users can automatically and efficiently search grids of pre-calculated models to fit their data. A large set of physical parameters and dust types are included in the model database, which contains over 500,000 models. We are adding new functionality to DIRT to support new missions like SIRTF and SOFIA. A new Instrument module allows for plotting of the model points convolved with the spatial and spectral responses of the selected instrument. This lets users better fit data from specific instruments. Currently, we have implemented modules for the Infrared Array Camera (IRAC) and Multiband Imaging Photometer (MIPS) on SIRTF.
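
    The automated model-grid search described above amounts to a chi-square minimization over pre-calculated models. A schematic version, with a small synthetic stand-in for the 500,000-model database, might look like this; the grid, "observed" fluxes, and errors are all invented for illustration.

      import numpy as np

      rng = np.random.default_rng(1)
      grid = rng.random((500, 8))               # 500 precomputed model SEDs, 8 bands
      obs = grid[123] + rng.normal(0, 0.01, 8)  # synthetic observed fluxes with noise
      err = np.full(8, 0.01)                    # per-band flux uncertainties

      chi2 = (((grid - obs) / err) ** 2).sum(axis=1)
      best = int(np.argmin(chi2))
      print(f"best-fit model index: {best}, chi2 = {chi2[best]:.1f}")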

  16. Modeling tools to Account for Ethanol Impacts on BTEX Plumes

    EPA Science Inventory

    Widespread usage of ethanol in gasoline leads to impacts at leak sites which differ from those of non-ethanol gasolines. The presentation reviews current research results on the distribution of gasoline and ethanol, biodegradation, phase separation and cosolvancy. Model results f...

  17. The use of music therapy within the SCERTS model for children with Autism Spectrum Disorder.

    PubMed

    Walworth, Darcy DeLoach

    2007-01-01

    The SCERTS model is a new, comprehensive curriculum designed to assess and identify treatment goals and objectives within a multidisciplinary team of clinicians and educators for children with Autism Spectrum Disorders (ASD). This model is an ongoing assessment tool with resulting goals and objectives derived therefrom. Because music therapy offers a unique interaction setting for children with ASD to elicit communication skills, music therapists will need to be an integral part of the multidisciplinary assessment team using the SCERTS model, which is projected to become the primary nationwide curriculum for children with ASD. The purpose of this paper is to assist music therapists in transitioning to this model by providing an overview and explanation of the SCERTS model and by identifying how music therapists are currently providing clinical services incorporated in the SCERTS model for children with ASD. In order to formulate comprehensive transitional suggestions, a national survey of music therapists working with clients at risk or diagnosed with ASD was conducted to: (a) identify the areas of the SCERTS assessment model that music therapists are currently addressing within their written goals for clients with ASD, (b) identify current music therapy activities that address various SCERTS goals and objectives, and (c) provide demographic information about settings, length, and tools used in music therapy interventions for clients with ASD.

  18. The potential impact of integrated malaria transmission control on entomologic inoculation rate in highly endemic areas.

    PubMed

    Killeen, G F; McKenzie, F E; Foy, B D; Schieffelin, C; Billingsley, P F; Beier, J C

    2000-05-01

    We have used a relatively simple but accurate model for predicting the impact of integrated transmission control on the malaria entomologic inoculation rate (EIR) at four endemic sites from across sub-Saharan Africa and the southwest Pacific. The simulated campaign incorporated modestly effective vaccine coverage, bed net use, and larval control. The results indicate that such campaigns would reduce EIRs at all four sites by 30- to 50-fold. Even without the vaccine, 15- to 25-fold reductions of EIR were predicted, implying that integrated control with a few modestly effective tools can meaningfully reduce malaria transmission in a range of endemic settings. The model accurately predicts the effects of bed nets and indoor spraying and demonstrates that they are the most effective tools available for reducing EIR. However, the impact of domestic adult vector control is amplified by measures for reducing the rate of emergence of vectors or the level of infectiousness of the human reservoir. We conclude that available tools, including currently neglected methods for larval control, can reduce malaria transmission intensity enough to alleviate mortality. Integrated control programs should be implemented to the fullest extent possible, even in areas of intense transmission, using simple models as decision-making tools. However, we also conclude that to eliminate malaria in many areas of intense transmission is beyond the scope of methods which developing nations can currently afford. New, cost-effective, practical tools are needed if malaria is ever to be eliminated from highly endemic areas.
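
    Hypothetical back-of-envelope arithmetic (not the authors' actual model or parameterization) shows how modestly effective interventions can compound multiplicatively into reductions of the size reported; every factor below is invented for illustration.

      baseline_eir = 300.0          # infective bites per person per year (assumed)
      bednet_factor = 0.25          # domestic adult vector control (assumed)
      larval_factor = 0.4           # reduced vector emergence (assumed)
      vaccine_factor = 0.5          # reduced human infectiousness (assumed)

      eir = baseline_eir * bednet_factor * larval_factor * vaccine_factor
      print(f"EIR: {baseline_eir} -> {eir:.1f} "
            f"({baseline_eir / eir:.0f}-fold reduction)")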

  19. Safety Case Development as an Information Modelling Problem

    NASA Astrophysics Data System (ADS)

    Lewis, Robert

    This paper considers the benefits from applying information modelling as the basis for creating an electronically-based safety case. It highlights the current difficulties of developing and managing large document-based safety cases for complex systems such as those found in Air Traffic Control systems. After a review of current tools and related literature on this subject, the paper proceeds to examine the many relationships between entities that can exist within a large safety case. The paper considers the benefits to both safety case writers and readers from the future development of an ideal safety case tool that is able to exploit these information models. The paper also introduces the idea that the safety case has formal relationships between entities that directly support the safety case argument using a methodology such as GSN, and informal relationships that provide links to direct and backing evidence and to supporting information.

  20. Prediction of miRNA targets.

    PubMed

    Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis

    2015-01-01

    Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.

  1. Bridging groundwater models and decision support with a Bayesian network

    USGS Publications Warehouse

    Fienen, Michael N.; Masterson, John P.; Plant, Nathaniel G.; Gutierrez, Benjamin T.; Thieler, E. Robert

    2013-01-01

    Resource managers need to make decisions to plan for future environmental conditions, particularly sea level rise, in the face of substantial uncertainty. Many interacting processes factor into the decisions they face. Advances in process models and the quantification of uncertainty have made models a valuable tool for this purpose. Long simulation runtimes and, often, numerical instability make linking process models impractical in many cases. A method for emulating the important connections between model input and forecasts, while propagating uncertainty, has the potential to provide a bridge between complicated numerical process models and the efficiency and stability needed for decision making. We explore this using a Bayesian network (BN) to emulate a groundwater flow model. We expand on previous approaches to validating a BN by calculating forecasting skill using cross validation of a groundwater model of Assateague Island in Virginia and Maryland, USA. This BN emulation was shown to capture the important groundwater-flow characteristics and uncertainty of the groundwater system because of its connection to island morphology and sea level. Forecast power metrics associated with the validation of multiple alternative BN designs guided the selection of an optimal level of BN complexity. Assateague Island is an ideal test case for exploring a forecasting tool based on current conditions because the unique hydrogeomorphological variability of the island includes a range of settings indicative of past, current, and future conditions. The resulting BN is a valuable tool for exploring the response of groundwater conditions to sea level rise in decision support.
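
    A schematic of the cross-validated forecast-skill idea: hold out each observation, predict it, and score mean squared error against a climatology (mean-only) baseline. The linear "emulator" and synthetic data below are placeholders for the Bayesian network and groundwater model of the study.

      import numpy as np

      def loo_skill(x, y):
          """Leave-one-out skill: 1 = perfect, <= 0 = no better than climatology."""
          n = len(y)
          err_model, err_clim = [], []
          for i in range(n):
              mask = np.arange(n) != i
              slope, intercept = np.polyfit(x[mask], y[mask], 1)  # stand-in emulator
              err_model.append((y[i] - (slope * x[i] + intercept)) ** 2)
              err_clim.append((y[i] - y[mask].mean()) ** 2)
          return 1.0 - np.mean(err_model) / np.mean(err_clim)

      rng = np.random.default_rng(2)
      x = rng.random(50)
      y = 2.0 * x + rng.normal(0, 0.2, 50)   # synthetic forecast target
      print(f"LOO skill: {loo_skill(x, y):.2f}")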

  2. Predicting RNA 3D structure using a coarse-grain helix-centered model

    PubMed Central

    Kerpedjiev, Peter; Höner zu Siederdissen, Christian; Hofacker, Ivo L.

    2015-01-01

    A 3D model of RNA structure can provide information about its function and regulation that is not possible with just the sequence or secondary structure. Current models suffer from low accuracy and long running times and either neglect or presume knowledge of the long-range interactions which stabilize the tertiary structure. Our coarse-grained, helix-based, tertiary structure model operates with only a few degrees of freedom compared with all-atom models while preserving the ability to sample tertiary structures given a secondary structure. It strikes a balance between the precision of an all-atom tertiary structure model and the simplicity and effectiveness of a secondary structure representation. It provides a simplified tool for exploring global arrangements of helices and loops within RNA structures. We provide an example of a novel energy function relying only on the positions of stems and loops. We show that coupling our model to this energy function produces predictions as good as or better than the current state-of-the-art tools. We propose that given the wide range of conformational space that needs to be explored, a coarse-grain approach can explore more conformations in fewer iterations than an all-atom model coupled to a fine-grain energy function. Finally, we emphasize the overarching theme of providing an ensemble of predicted structures, something which our tool excels at, rather than providing a handful of the lowest energy structures. PMID:25904133

  3. An online tool for tracking soil nitrogen

    NASA Astrophysics Data System (ADS)

    Wang, J.; Umar, M.; Banger, K.; Pittelkow, C. M.; Nafziger, E. D.

    2016-12-01

    Near real-time crop models can be useful tools for optimizing agricultural management practices. For example, model simulations can potentially provide current estimates of nitrogen availability in soil, helping growers decide whether more nitrogen needs to be applied in a given season. Traditionally, crop models have been used at point locations (i.e. single fields) with homogeneous soil, climate, and initial conditions. However, nitrogen availability across fields with varied weather and soil conditions at a regional or national level is necessary to guide better management decisions. This study presents the development of a publicly available, online tool that automates the integration of high-spatial-resolution forecast and past weather and soil data in DSSAT to estimate nitrogen availability for individual fields in Illinois. The model has been calibrated with field experiments from the past year at six research corn fields across Illinois. These sites were treated with N fertilizer applications of different timings and amounts. The tool requires minimal management information from growers and yet has the capability to simulate nitrogen-water-crop interactions with calibrated parameters that are more appropriate for Illinois. The results from the tool will be combined with incoming field experiment data from 2016 for model validation and further improvement of the model's predictive accuracy. The tool has the potential to help guide better nitrogen management practices to maximize economic and environmental benefits.

  4. A comparative assessment of tools for ecosystem services quantification and valuation

    USGS Publications Warehouse

    Bagstad, Kenneth J.; Semmens, Darius; Waage, Sissel; Winthrop, Robert

    2013-01-01

    To enter widespread use, ecosystem service assessments need to be quantifiable, replicable, credible, flexible, and affordable. With recent growth in the field of ecosystem services, a variety of decision-support tools has emerged to support more systematic ecosystem services assessment. Despite the growing complexity of the tool landscape, thorough reviews of tools for identifying, assessing, modeling, and in some cases monetarily valuing ecosystem services have generally been lacking. In this study, we describe 17 ecosystem services tools and rate their performance against eight evaluative criteria that gauge their readiness for widespread application in public- and private-sector decision making. We describe each of the tools' intended uses, services modeled, analytical approaches, data requirements, and outputs, as well as the time required to run seven of the tools in a first comparative concurrent application of multiple tools to a common location – the San Pedro River watershed in southeast Arizona, USA, and northern Sonora, Mexico. Based on this work, we offer conclusions about these tools' current 'readiness' for widespread application within both public- and private-sector decision-making processes. Finally, we describe potential pathways forward to reduce the resource requirements for running ecosystem services models, which are essential to facilitate their more widespread use in environmental decision making.

  5. Lysimetric evaluation of the APEX Model to simulate daily ET for irrigated crops in the Texas High Plains

    USDA-ARS?s Scientific Manuscript database

    The NTT (Nutrient Tracking Tool) was designed to provide an opportunity for all users, including producers, to simulate the complex models, such as APEX (Agricultural Policy Environmental eXtender) and associated required databases. The APEX model currently nested within NTT provides estimates of th...

  6. Automation life-cycle cost model

    NASA Technical Reports Server (NTRS)

    Gathmann, Thomas P.; Reeves, Arlinda J.; Cline, Rick; Henrion, Max; Ruokangas, Corinne

    1992-01-01

    The problem domain being addressed by this contractual effort can be summarized by the following list: Automation and Robotics (A&R) technologies appear to be viable alternatives to current, manual operations; life-cycle cost models are typically judged with suspicion due to implicit assumptions and little associated documentation; and uncertainty is a reality for increasingly complex problems, yet few models explicitly account for its effect on the solution space. The objectives for this effort range from the near-term (1-2 years) to the far-term (3-5 years). In the near-term, the envisioned capabilities of the modeling tool are annotated. In addition, a framework is defined and developed in the Decision Modelling System (DEMOS) environment. Our approach is summarized as follows: assess desirable capabilities (structured into near- and far-term); identify useful existing models/data; identify parameters for utility analysis; define the tool framework; encode a scenario thread for model validation; and provide a transition path for tool development. This report contains all relevant technical progress made on this contractual effort.

  7. Model Identification of Integrated ARMA Processes

    ERIC Educational Resources Information Center

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
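
    SCAN and ESACF are SAS procedures; as a rough open-source analogue (an illustrative assumption of this sketch, not a reimplementation of either method), one can difference a series to stationarity and then select ARMA orders by an information criterion with statsmodels:

      import numpy as np
      from statsmodels.tsa.stattools import adfuller, arma_order_select_ic

      rng = np.random.default_rng(3)
      y = np.cumsum(rng.normal(size=400))   # synthetic integrated (random-walk) series

      d = 0
      while adfuller(y)[1] > 0.05:          # difference until the unit-root test rejects
          y = np.diff(y)
          d += 1

      res = arma_order_select_ic(y, max_ar=3, max_ma=3, ic="bic")
      p, q = res.bic_min_order
      print(f"identified ARIMA({p},{d},{q})")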

  8. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  9. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  10. The impact of sea surface currents in wave power potential modeling

    NASA Astrophysics Data System (ADS)

    Zodiatis, George; Galanis, George; Kallos, George; Nikolaidis, Andreas; Kalogeri, Christina; Liakatas, Aristotelis; Stylianou, Stavros

    2015-11-01

    The impact of sea surface currents on the estimation and modeling of wave energy potential over an area of increased economic interest, the Eastern Mediterranean Sea, is investigated in this work. High-resolution atmospheric, wave, and circulation models, the latter downscaled from the regional Mediterranean Forecasting System (MFS) of the Copernicus marine service (former MyOcean regional MFS system), are utilized towards this goal. The modeled data are analyzed by means of a variety of statistical tools measuring the potential changes not only in the main wave characteristics, but also in the general distribution of the wave energy and the wave parameters that mainly affect it, when sea surface currents are used as a forcing to the wave models. The results show that the impact of sea surface currents on wave energy modeling is significant and both temporally and spatially dependent, underscoring the need to incorporate sea surface current characteristics into renewable energy studies in conjunction with their meteo-ocean forecasting counterparts.
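
    For reference, the standard deep-water estimate of wave power flux and the Doppler-shifted dispersion relation through which a surface current modifies the wave field are given below; these are textbook results, not formulas quoted from the paper.

      % Deep-water wave power flux per unit crest length, with H_{m0} the
      % significant wave height and T_e the energy period:
      P \approx \frac{\rho g^{2}}{64\pi}\, H_{m0}^{2}\, T_{e}

      % A surface current \mathbf{U} enters through the Doppler-shifted dispersion
      % relation linking the intrinsic frequency \sigma to the absolute frequency \omega:
      \sigma = \omega - \mathbf{k}\cdot\mathbf{U}, \qquad \sigma^{2} = g k \tanh(kh)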

  11. Evaluating the integration of cultural competence skills into health and physical assessment tools: a survey of Canadian schools of nursing.

    PubMed

    Chircop, Andrea; Edgecombe, Nancy; Hayward, Kathryn; Ducey-Gilbert, Cherie; Sheppard-Lemoine, Debbie

    2013-04-01

    Audiovisual (AV) teaching tools currently used to teach health and physical assessment reflect a Eurocentric bias rooted in the biomedical model. The purpose of our study was to (a) identify the AV teaching tools commonly used by Canadian schools of nursing and (b) evaluate the identified tools. A two-part descriptive quantitative method design was used. First, we surveyed schools of nursing across Canada. Second, the identified AV teaching tools were evaluated for content and modeling of cultural competence. The majority of the schools (67%) used publisher-produced videos associated with a physical assessment textbook. Major findings included minimal demonstration of negotiation with a client around cultural aspects of the interview, including the need for an interpreter, modesty, and inclusion of support persons. Identification of culturally specific examples given during the videos was superficial and did not provide students with a comprehensive understanding of the necessary culturally competent skills.

  12. Advanced techniques in reliability model representation and solution

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Nicol, David M.

    1992-01-01

    The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.

  13. Proceedings of the Ninth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Experiences in measurement, utilization, and evaluation of software methodologies, models, and tools are discussed. NASA's involvement in ever larger and more complex systems, like the space station project, provides a motive for the support of software engineering research and the exchange of ideas in such forums. The topics of current SEL research are software error studies, experiments with software development, and software tools.

  14. Challenges and Advances for Genetic Engineering of Non-model Bacteria and Uses in Consolidated Bioprocessing

    PubMed Central

    Yan, Qiang; Fong, Stephen S.

    2017-01-01

    Metabolic diversity in microorganisms can provide the basis for creating novel biochemical products. However, most metabolic engineering projects utilize a handful of established model organisms and thus, a challenge for harnessing the potential of novel microbial functions is the ability to either heterologously express novel genes or directly utilize non-model organisms. Genetic manipulation of non-model microorganisms is still challenging due to organism-specific nuances that hinder universal molecular genetic tools and translatable knowledge of intracellular biochemical pathways and regulatory mechanisms. However, in the past several years, unprecedented progress has been made in synthetic biology, molecular genetics tools development, applications of omics data techniques, and computational tools that can aid in developing non-model hosts in a systematic manner. In this review, we focus on concerns and approaches related to working with non-model microorganisms including developing molecular genetics tools such as shuttle vectors, selectable markers, and expression systems. In addition, we will discuss: (1) current techniques in controlling gene expression (transcriptional/translational level), (2) advances in site-specific genome engineering tools [homologous recombination (HR) and clustered regularly interspaced short palindromic repeats (CRISPR)], and (3) advances in genome-scale metabolic models (GSMMs) in guiding design of non-model species. Application of these principles to metabolic engineering strategies for consolidated bioprocessing (CBP) will be discussed along with some brief comments on foreseeable future prospects. PMID:29123506

  15. A Generalized Decision Framework Using Multi-objective Optimization for Water Resources Planning

    NASA Astrophysics Data System (ADS)

    Basdekas, L.; Stewart, N.; Triana, E.

    2013-12-01

    Colorado Springs Utilities (CSU) is currently engaged in an Integrated Water Resource Plan (IWRP) to address the complex planning scenarios, across multiple time scales, currently faced by CSU. The modeling framework developed for the IWRP uses a flexible, data-centered Decision Support System (DSS) with a MODSIM-based modeling system to represent the operation of the current CSU raw water system, coupled with a state-of-the-art multi-objective optimization algorithm. Three basic components are required for the framework, which can be implemented for planning horizons ranging from seasonal to interdecadal. First, a water resources system model is required that is capable of reasonable system simulation to resolve performance metrics at the appropriate temporal and spatial scales of interest. The system model should be an existing simulation model, or one developed during the planning process with stakeholders, so that 'buy-in' has already been achieved. Second, a hydrologic scenario tool (or tools) capable of generating a range of plausible inflows for the planning period of interest is required. This may include paleo-informed or climate-change-informed sequences. Third, a multi-objective optimization model that can be wrapped around the system simulation model is required. The new generation of multi-objective optimization models does not require parameterization, which greatly reduces problem complexity. Bridging the gap between research and practice will be evident as we use a case study from CSU's planning process to demonstrate this framework with specific competing water management objectives. Careful formulation of objective functions, choice of decision variables, and system constraints will be discussed. Rather than treating results as theoretically Pareto optimal in a planning process, we use the powerful multi-objective optimization models as tools to more efficiently and effectively move out of the inferior decision space. The use of this framework will help CSU evaluate tradeoffs in a continually changing world.
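
    At the core of such multi-objective algorithms is non-dominated (Pareto) filtering. A minimal sketch with two invented, minimized objectives (say, cost and shortage risk) follows; the random data are placeholders, not CSU planning results.

      import numpy as np

      def pareto_front(points):
          """points: (n, m) array of objective values, all to be minimized."""
          keep = np.ones(len(points), dtype=bool)
          for i, p in enumerate(points):
              if not keep[i]:
                  continue
              # q is dominated by p if p <= q everywhere and p < q somewhere
              dominated = (p <= points).all(axis=1) & (p < points).any(axis=1)
              keep &= ~dominated
          return points[keep]

      rng = np.random.default_rng(4)
      objs = rng.random((200, 2))   # e.g. columns: [cost, shortage risk]
      print(len(pareto_front(objs)), "non-dominated solutions")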

  16. Prediction of Acoustic Loads Generated by Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Perez, Linamaria; Allgood, Daniel C.

    2011-01-01

    NASA Stennis Space Center is one of the nation's premier facilities for conducting large-scale rocket engine testing. As liquid rocket engines vary in size, so do the acoustic loads that they produce. When these acoustic loads reach very high levels they may cause damage both to humans and to structures surrounding the testing area. To prevent such damage, prediction tools are used to estimate the spectral content and levels of the acoustics generated by the rocket engine plumes and to model their propagation through the surrounding atmosphere. Prior to the current work, two different acoustic prediction tools were being implemented at Stennis Space Center, each having its own advantages and disadvantages depending on the application. Therefore, a new prediction tool was created, using the NASA SP-8072 handbook as a guide, which would replicate the same prediction methods as the previous codes but eliminate the drawbacks the individual codes had. Aside from replicating the previous modeling capability in a single framework, additional modeling functions were added, thereby expanding the current modeling capability. To verify that the new code could reproduce the same predictions as the previous codes, two verification test cases were defined. These verification test cases also served as validation cases, as the predicted results were compared to actual test data.
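
    The NASA SP-8072 methodology starts by allocating a fraction of the plume's mechanical power to acoustic power before distributing it along the plume and propagating it outward. A hedged sketch of that first step, with hypothetical engine numbers and an assumed acoustic efficiency (values near 0.5-1% are commonly quoted):

      import math

      thrust = 6.7e6        # N (hypothetical engine)
      v_exhaust = 2600.0    # m/s (hypothetical exhaust velocity)
      eta = 0.005           # assumed acoustic efficiency

      w_mech = 0.5 * thrust * v_exhaust            # mechanical (kinetic) power, W
      w_acoustic = eta * w_mech                    # radiated acoustic power, W
      l_w = 10.0 * math.log10(w_acoustic / 1e-12)  # sound power level re 1 pW
      print(f"acoustic power: {w_acoustic:.2e} W, Lw = {l_w:.0f} dB")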

  17. A Framework for Action: Intervening to Increase Adoption of Transformative Web 2.0 Learning Resources

    ERIC Educational Resources Information Center

    Hughes, Joan E.; Guion, James M.; Bruce, Kama A.; Horton, Lucas R.; Prescott, Amy

    2011-01-01

    Web 2.0 tools have emerged as conducive to innovative pedagogy and transformative learning opportunities for youth. Currently, Web 2.0 is often adopted into teachers' practice to simply replace or amplify traditional instructional approaches rather than to promote or facilitate transformative educational change. Current models of innovation…

  18. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1992-01-01

    The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual-level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer-aided design systems consisting of diverse, stand-alone analysis codes, streamlining the exchange of data between programs, reducing errors, and improving efficiency.

  19. CFD Methods and Tools for Multi-Element Airfoil Analysis

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.; George, Michael W. (Technical Monitor)

    1995-01-01

    This lecture will discuss the computational tools currently available for high-lift multi-element airfoil analysis. It will present an overview of a number of different numerical approaches, their current capabilities, shortcomings, and computational costs. The lecture will be limited to viscous methods, including inviscid/boundary-layer coupling methods and incompressible and compressible Reynolds-averaged Navier-Stokes methods. Both structured and unstructured grid generation approaches will be presented. Two different structured grid procedures are outlined: one uses multi-block patched grids, the other overset chimera grids. Turbulence and transition modeling will be discussed.

  20. Current progress in patient-specific modeling

    PubMed Central

    2010-01-01

    We present a survey of recent advancements in the emerging field of patient-specific modeling (PSM). Researchers in this field are currently simulating a wide variety of tissue and organ dynamics to address challenges in various clinical domains. The majority of this research employs three-dimensional, image-based modeling techniques. Recent PSM publications mostly represent feasibility or preliminary validation studies on modeling technologies, and these systems will require further clinical validation and usability testing before they can become a standard of care. We anticipate that with further testing and research, PSM-derived technologies will eventually become valuable, versatile clinical tools. PMID:19955236

  1. Automated Verification of Specifications with Typestates and Access Permissions

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Catano, Nestor

    2011-01-01

    We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, will provide model-checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).
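
    Typestate tracking, the core idea behind these specifications, can be illustrated compactly. The sketch below is a generic, hypothetical example (not the Plural tool's actual machinery): a finite-state machine encodes which method calls are legal in each state, and a call trace is checked against it.

```python
# Hypothetical typestate protocol for a file-like object:
# open() moves CLOSED -> OPEN; read() is legal only in OPEN;
# close() moves OPEN -> CLOSED. Any other call is a violation.
TRANSITIONS = {
    ("CLOSED", "open"): "OPEN",
    ("OPEN", "read"): "OPEN",
    ("OPEN", "close"): "CLOSED",
}

def check_trace(trace, start="CLOSED"):
    """Verify a sequence of method calls against the typestate machine."""
    state = start
    for call in trace:
        nxt = TRANSITIONS.get((state, call))
        if nxt is None:
            return False, f"illegal call '{call}' in state {state}"
        state = nxt
    return True, f"trace accepted, final state {state}"

print(check_trace(["open", "read", "read", "close"]))  # accepted
print(check_trace(["open", "close", "read"]))          # violation
```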

  2. Interoperability of Neuroscience Modeling Software

    PubMed Central

    Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik

    2009-01-01

    Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374

  3. Systems metabolic engineering: genome-scale models and beyond.

    PubMed

    Blazeck, John; Alper, Hal

    2010-07-01

    The advent of high-throughput genome-scale bioinformatics has led to an exponential increase in available cellular system data. Systems metabolic engineering attempts to use data-driven approaches, based on the data collected with high-throughput technologies, to identify gene targets and to optimize phenotypic properties on a systems level. Current systems metabolic engineering tools are limited for predicting and defining complex phenotypes such as chemical tolerances and other global, multigenic traits. The most pragmatic systems-based tool for metabolic engineering to arise is the in silico genome-scale metabolic reconstruction. This tool has seen wide adoption for modeling cell growth and predicting beneficial gene knockouts, and we examine here how this approach can be expanded for novel organisms. This review will highlight advances of the systems metabolic engineering approach with a focus on de novo development and use of genome-scale metabolic reconstructions for metabolic engineering applications. We will then discuss the challenges and prospects for this emerging field to enable model-based metabolic engineering. Specifically, we argue that current state-of-the-art systems metabolic engineering techniques represent a viable first step for improving product yield that still must be followed by combinatorial techniques or random strain mutagenesis to achieve optimal cellular systems.
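
    Genome-scale metabolic reconstructions are most commonly interrogated with flux balance analysis (FBA), a linear program over the stoichiometric matrix; the growth modeling and knockout prediction the abstract mentions follow this pattern. Below is a minimal, illustrative sketch with an invented three-reaction network (not a real reconstruction).

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions).
# R1: uptake -> A;  R2: A -> B;  R3: B -> biomass (the objective)
S = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])
bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 mmol/gDW/h
c = [0.0, 0.0, -1.0]                       # maximize v3 == minimize -v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("flux distribution:", res.x)          # expected: [10, 10, 10]

# A "knockout" is simulated by forcing a reaction's flux to zero:
bounds_ko = [(0, 10), (0, 0), (0, None)]
res_ko = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds_ko, method="highs")
print("growth after knocking out R2:", -res_ko.fun)   # expected: 0
```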

  4. A finite element analysis modeling tool for solid oxide fuel cell development: coupled electrochemistry, thermal and flow analysis in MARC®

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khaleel, Mohammad A.; Lin, Zijing; Singh, Prabhakar

    2004-05-03

    A 3D simulation tool for modeling solid oxide fuel cells is described. The tool combines the versatility and efficiency of a commercial finite element analysis code, MARC®, with an in-house developed robust and flexible electrochemical (EC) module. Based upon characteristic parameters obtained experimentally and assigned by the user, the EC module calculates the current density distribution, heat generation, and fuel and oxidant species concentration, taking the temperature profile provided by MARC® and operating conditions such as the fuel and oxidant flow rate and the total stack output voltage or current as the input. MARC® performs flow and thermal analyses based on the initial and boundary thermal and flow conditions and the heat generation calculated by the EC module. The main coupling between MARC® and the EC module is for MARC® to supply the temperature field to the EC module and for the EC module to give the heat generation profile to MARC®. The loosely coupled, iterative scheme is advantageous in terms of memory requirement, numerical stability, and computational efficiency. The coupling is iterated to self-consistency for a steady-state solution. Sample results for steady states as well as the startup process for stacks with different flow designs are presented to illustrate the modeling capability and numerical performance characteristics of the simulation tool.
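
    The loose coupling described here is a fixed-point iteration between two solvers. The sketch below, with invented single-cell surrogates standing in for both the EC module and the finite element thermal analysis, shows the exchange pattern (temperature out, heat generation back) iterated to self-consistency; all property values are illustrative.

```python
def ec_module(temperature, voltage):
    """Stand-in electrochemical module: higher temperature lowers cell
    resistance, so current density and Joule heat generation rise."""
    resistance = 1.0 / (1.0 + 0.002 * (temperature - 1000.0))  # ohm*cm^2 (illustrative)
    current_density = voltage / resistance                      # A/cm^2
    heat_generation = current_density**2 * resistance           # W/cm^2
    return current_density, heat_generation

def thermal_solver(heat_generation, t_ambient=1000.0, h=0.05):
    """Stand-in for the FE thermal analysis: lumped steady-state balance
    between generated heat and convective loss."""
    return t_ambient + heat_generation / h

# Fixed-point iteration to self-consistency, mirroring the loose coupling:
temperature, voltage = 1000.0, 0.7
for it in range(100):
    j, q = ec_module(temperature, voltage)
    t_new = thermal_solver(q)
    if abs(t_new - temperature) < 1e-6:
        break
    temperature = 0.5 * temperature + 0.5 * t_new   # relaxation for stability
print(f"converged in {it} iterations: T={temperature:.2f} K, j={j:.3f} A/cm^2")
```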

  5. Optimization of Microelectronic Devices for Sensor Applications

    NASA Technical Reports Server (NTRS)

    Cwik, Tom; Klimeck, Gerhard

    2000-01-01

    The NASA/JPL goal to reduce payload in future space missions while increasing mission capability demands miniaturization of active and passive sensors, analytical instruments, and communication systems, among others. Currently, typical system requirements include the detection of particular spectral lines, associated data processing, and communication of the acquired data to other systems. Advances in lithography and deposition methods result in more advanced devices for space application, while the sub-micron resolution currently available opens a vast design space. Though an experimental exploration of this widening design space (searching for optimized performance by repeated fabrication efforts) is infeasible, it does motivate the development of reliable software design tools. These tools necessitate models based on the fundamental physics and mathematics of the device to accurately model effects such as diffraction and scattering in opto-electronic devices, or bandstructure and scattering in heterostructure devices. The software tools must have convenient turn-around times and interfaces that allow effective usage. The first issue is addressed by the application of high-performance computers and the second by the development of graphical user interfaces driven by properly developed data structures. These tools can then be integrated into an optimization environment, and with the available memory capacity and computational speed of high-performance parallel platforms, simulation of optimized components can proceed. In this paper, specific applications of the electromagnetic modeling of infrared filtering, as well as heterostructure device design, will be presented using genetic algorithm global optimization methods.
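
    As a generic illustration of genetic-algorithm global optimization of the kind applied here, the sketch below evolves a small vector of filter layer thicknesses against an invented merit function; the fitness model, parameter ranges, and operator settings are all assumptions, not the paper's actual device models.

```python
import random

def fitness(layers):
    """Stand-in merit function for an infrared filter: reward stacks whose
    layer thicknesses approach an (invented) target profile."""
    target = [1.25, 0.85, 1.25, 0.85]   # illustrative thicknesses, microns
    return -sum((t - g) ** 2 for t, g in zip(layers, target))

def evolve(pop_size=40, n_gen=60, n_genes=4, mut=0.1):
    pop = [[random.uniform(0.2, 2.0) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                    # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_genes)            # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g + random.gauss(0, mut) for g in child]  # Gaussian mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("best layer stack:", [round(g, 3) for g in best], "fitness:", round(fitness(best), 5))
```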

  6. Selecting Tools to Model Integer and Binomial Multiplication

    ERIC Educational Resources Information Center

    Pratt, Sarah Smitherman; Eddy, Colleen M.

    2017-01-01

    Mathematics teachers frequently provide concrete manipulatives to students during instruction; however, the rationale for using certain manipulatives in conjunction with concepts may not be explored. This article focuses on area models that are currently used in classrooms to provide concrete examples of integer and binomial multiplication. The…

  7. Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.

    2005-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool that uses a matrix-based computational approach and can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous cross-product terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.
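
    For readers unfamiliar with LFTs, the standard upper-LFT evaluation F_u(M, Delta) = M22 + M21 Delta (I - M11 Delta)^(-1) M12 can be computed directly. The sketch below uses an invented scalar example in which an uncertain gain k = k0(1 + w*delta), |delta| <= 1, is pulled into LFT form; the numbers are illustrative, not from the paper.

```python
import numpy as np

def upper_lft(M, delta, n):
    """Evaluate the upper linear fractional transformation
    F_u(M, Delta) = M22 + M21 Delta (I - M11 Delta)^-1 M12,
    where the first n inputs/outputs of M close the uncertainty loop."""
    M11, M12 = M[:n, :n], M[:n, n:]
    M21, M22 = M[n:, :n], M[n:, n:]
    inner = np.linalg.inv(np.eye(n) - M11 @ delta)
    return M22 + M21 @ delta @ inner @ M12

# 1x1 uncertainty block: gain k = k0*(1 + w*delta) pulled out as an LFT.
k0, w = 2.0, 0.3
M = np.array([[0.0, k0 * w],
              [1.0, k0]])     # chosen so F_u(M, delta) = k0 + k0*w*delta
for d in (-1.0, 0.0, 1.0):
    print(d, upper_lft(M, np.array([[d]]), 1))   # 1.4, 2.0, 2.6
```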

  8. Linking Plasma Conditions in the Magnetosphere with Ionospheric Signatures

    NASA Technical Reports Server (NTRS)

    Rastaetter, Lutz; Kozyra, Janet; Kuznetsova, Maria M.; Berrios, David H.

    2012-01-01

    Modeling of the full magnetosphere, ring current, and ionosphere system has become an indispensable tool in analyzing the series of events that occur during geomagnetic storms. The CCMC has a full model suite available for the magnetosphere, together with visualization tools that allow a user to perform a large variety of analyses. The January 21, 2005 storm was a moderate-size storm that has been found to feature a large penetration electric field and unusually large polar caps (low-latitude precipitation patterns) that are otherwise found in super storms. Based on simulation runs at CCMC, we can outline the likely causes of this behavior. Using visualization tools available to the online user, we compare results from different magnetosphere models and present connections found between features in the magnetosphere and the ionosphere that are connected magnetically. The range of magnetic mappings found with different models can be compared with statistical models (Tsyganenko), and the models' fidelity can be verified with observations from low-Earth-orbiting satellites such as DMSP and TIMED.

  9. Simscape Modeling Verification in the Simulink Development Environment

    NASA Technical Reports Server (NTRS)

    Volle, Christopher E. E.

    2014-01-01

    The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using Mathworks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time consuming and costly due to the rigorous testing and peer reviews that must be conducted for each custom-built block. Using Mathworks Simscape tools, modeling time can be reduced since no custom code would be developed. After careful research, the group concluded that it is feasible to use Simscape blocks in MatLab's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.

  10. Experimental anti-GBM disease as a tool for studying spontaneous lupus nephritis.

    PubMed

    Fu, Yuyang; Du, Yong; Mohan, Chandra

    2007-08-01

    Lupus nephritis is an immune-mediated disease, where antibodies and T cells both play pathogenic roles. Since spontaneous lupus nephritis in mouse models takes 6-12 months to manifest, there is an urgent need for a mouse model that can be used to delineate the pathogenic processes that lead to immune nephritis over a quicker time frame. We propose that the experimental anti-glomerular basement membrane (GBM) disease model might be a suitable tool for uncovering some of the molecular steps underlying lupus nephritis. This article reviews the current evidence that supports the use of the experimental anti-GBM nephritis model for studying spontaneous lupus nephritis. Importantly, of the roughly 25 different molecules that have been specifically examined in both the experimental anti-GBM model and spontaneous lupus nephritis, all influence both diseases concordantly, suggesting that the experimental model might be a useful tool for unraveling the molecular basis of spontaneous lupus nephritis. This has important clinical implications, both from the perspective of genetic susceptibility and from that of clinical therapeutics.

  11. Benefits of a Unified LaSRS++ Simulation for NAS-Wide and High-Fidelity Modeling

    NASA Technical Reports Server (NTRS)

    Glaab, Patricia; Madden, Michael

    2014-01-01

    The LaSRS++ high-fidelity vehicle simulation was extended in 2012 to support a NAS-wide simulation mode. Since the initial proof-of-concept, the LaSRS++ NAS-wide simulation has been maturing into a research-ready tool. A primary benefit of this new capability is the consolidation of the two modeling paradigms under a single framework to save cost, facilitate iterative concept testing between the two tools, and promote communication and model sharing between user communities at Langley. Specific benefits of each type of modeling are discussed, along with the expected benefits of the unified framework. Current capability details of the LaSRS++ NAS-wide simulation are provided, including the visualization tool, live data interface, trajectory generators, terminal routing for arrivals and departures, maneuvering, re-routing, navigation, winds, and turbulence. The plan for future development is also described.

  12. Modeling tidal hydrodynamics of San Diego Bay, California

    USGS Publications Warehouse

    Wang, P.-F.; Cheng, R.T.; Richter, K.; Gross, E.S.; Sutton, D.; Gartner, J.W.

    1998-01-01

    In 1983, current data were collected by the National Oceanic and Atmospheric Administration using mechanical current meters. During 1992 through 1996, acoustic Doppler current profilers as well as mechanical current meters and tide gauges were used. These measurements not only document tides and tidal currents in San Diego Bay, but also provide independent data sets for model calibration and verification. A high-resolution (100-m grid), depth-averaged, numerical hydrodynamic model has been implemented for San Diego Bay to describe essential tidal hydrodynamic processes in the bay. The model is calibrated using the 1983 data set and verified using the more recent 1992-1996 data. Discrepancies between model predictions and field data in both model calibration and verification are on the order of the magnitude of uncertainties in the field data. The calibrated and verified numerical model has been used to quantify residence time and dilution and flushing of contaminant effluent into San Diego Bay. Furthermore, the numerical model has become an important research tool in ongoing hydrodynamic and water quality studies and in guiding future field data collection programs.

  13. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    NASA Astrophysics Data System (ADS)

    Pache, Charly

    2002-01-01

    One critical issue concerning distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between each group responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a design collaboration tool, the SSETI Design Model (SDM), specifically developed for enabling satellite distributed design. SDM is actually used in the ongoing Student Space Exploration & Technology (SSETI) initiative (www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe in the design, construction, and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then the SDM's functionalities, developed in VBA scripts (Visual Basic for Applications), are introduced, as well as the interactive features, user interfaces, and administration tools. The second part discusses the capabilities and limitations of SDM's current version. Taking into account these capabilities and limitations, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Tradeoff simulation capabilities, security, reliability, and hardware and software issues will also be thoroughly discussed.

  14. NASA's Lunar and Planetary Mapping and Modeling Program

    NASA Astrophysics Data System (ADS)

    Law, E.; Day, B. H.; Kim, R. M.; Bui, B.; Malhotra, S.; Chang, G.; Sadaqathullah, S.; Arevalo, E.; Vu, Q. A.

    2016-12-01

    NASA's Lunar and Planetary Mapping and Modeling Program produces a suite of online visualization and analysis tools. Originally designed for mission planning and science, these portals offer great benefits for education and public outreach (EPO), providing access to data from a wide range of instruments aboard a variety of past and current missions. As a component of NASA's Science EPO Infrastructure, they are available as resources for NASA STEM EPO programs, and to the greater EPO community. As new missions are planned to a variety of planetary bodies, these tools are facilitating the public's understanding of the missions and engaging the public in the process of identifying and selecting where these missions will land. There are currently three web portals in the program: the Lunar Mapping and Modeling Portal or LMMP (http://lmmp.nasa.gov), Vesta Trek (http://vestatrek.jpl.nasa.gov), and Mars Trek (http://marstrek.jpl.nasa.gov). Portals for additional planetary bodies are planned. As web-based toolsets, the portals do not require users to purchase or install any software beyond current web browsers. The portals provide analysis tools for measurement and study of planetary terrain. They allow data to be layered and adjusted to optimize visualization. Visualizations are easily stored and shared. The portals provide 3D visualization and give users the ability to mark terrain for generation of STL files that can be directed to 3D printers. Such 3D prints are valuable tools in museums, public exhibits, and classrooms - especially for the visually impaired. Along with the web portals, the program supports additional clients, web services, and APIs that facilitate dissemination of planetary data to a range of external applications and venues. NASA challenges and hackathons are also providing members of the software development community opportunities to participate in tool development and leverage data from the portals.

  15. Assessment of Spacecraft Systems Integration Using the Electric Propulsion Interactions Code (EPIC)

    NASA Technical Reports Server (NTRS)

    Mikellides, Ioannis G.; Kuharski, Robert A.; Mandell, Myron J.; Gardner, Barbara M.; Kauffman, William J. (Technical Monitor)

    2002-01-01

    SAIC is currently developing the Electric Propulsion Interactions Code 'EPIC', an interactive computer tool that allows the construction of a 3-D spacecraft model, and the assessment of interactions between its subsystems and the plume from an electric thruster. EPIC unites different computer tools to address the complexity associated with the interaction processes. This paper describes the overall architecture and capability of EPIC including the physics and algorithms that comprise its various components. Results from selected modeling efforts of different spacecraft-thruster systems are also presented.

  16. Prognostic and Prediction Tools in Bladder Cancer: A Comprehensive Review of the Literature.

    PubMed

    Kluth, Luis A; Black, Peter C; Bochner, Bernard H; Catto, James; Lerner, Seth P; Stenzl, Arnulf; Sylvester, Richard; Vickers, Andrew J; Xylinas, Evanguelos; Shariat, Shahrokh F

    2015-08-01

    This review focuses on risk assessment and prediction tools for bladder cancer (BCa). To review the current knowledge on risk assessment and prediction tools to enhance clinical decision making and counseling of patients with BCa. A literature search in English was performed using PubMed in July 2013. Relevant risk assessment and prediction tools for BCa were selected. More than 1600 publications were retrieved. Special attention was given to studies that investigated the clinical benefit of a prediction tool. Most prediction tools for BCa focus on the prediction of disease recurrence and progression in non-muscle-invasive bladder cancer or disease recurrence and survival after radical cystectomy. Although these tools are helpful, recent prediction tools aim to address a specific clinical problem, such as the prediction of organ-confined disease and lymph node metastasis to help identify patients who might benefit from neoadjuvant chemotherapy. Although a large number of prediction tools have been reported in recent years, many of them lack external validation. Few studies have investigated the clinical utility of any given model as measured by its ability to improve clinical decision making. There is a need for novel biomarkers to improve the accuracy and utility of prediction tools for BCa. Decision tools hold the promise of facilitating the shared decision process, potentially improving clinical outcomes for BCa patients. Prediction models need external validation and assessment of clinical utility before they can be incorporated into routine clinical care. We looked at models that aim to predict outcomes for patients with bladder cancer (BCa). We found a large number of prediction models that hold the promise of facilitating treatment decisions for patients with BCa. However, many models are missing confirmation in a different patient cohort, and only a few studies have tested the clinical utility of any given model as measured by its ability to improve clinical decision making.

  17. Current advances in mathematical modeling of anti-cancer drug penetration into tumor tissues.

    PubMed

    Kim, Munju; Gillies, Robert J; Rejniak, Katarzyna A

    2013-11-18

    Delivery of anti-cancer drugs to tumor tissues, including their interstitial transport and cellular uptake, is a complex process involving various biochemical, mechanical, and biophysical factors. Mathematical modeling provides a means through which to understand this complexity better, as well as to examine interactions between contributing components in a systematic way via computational simulations and quantitative analyses. In this review, we present the current state of mathematical modeling approaches that address phenomena related to drug delivery. We describe how various types of models were used to predict spatio-temporal distributions of drugs within the tumor tissue, to simulate different ways to overcome barriers to drug transport, or to optimize treatment schedules. Finally, we discuss how integration of mathematical modeling with experimental or clinical data can provide better tools to understand the drug delivery process, in particular to examine the specific tissue- or compound-related factors that limit drug penetration through tumors. Such tools will be important in designing new chemotherapy targets and optimal treatment strategies, as well as in developing non-invasive diagnosis to monitor treatment response and detect tumor recurrence.
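
    As a minimal example of the continuum models this review surveys, the sketch below integrates a one-dimensional diffusion-uptake equation, dC/dt = D d2C/dx2 - kC, from a vessel wall into tissue; all parameter values are illustrative assumptions, not data from the cited work.

```python
import numpy as np

# 1D reaction-diffusion sketch of drug penetration from a vessel wall into
# tumor tissue, with the vessel held at concentration C0 and a zero-flux
# boundary deep in the tissue. Parameter values are illustrative only.
D, k, C0 = 1e-6, 1e-3, 1.0          # cm^2/s, 1/s, normalized concentration
L, nx = 0.02, 101                    # 200-micron domain, grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D                 # respect the explicit stability limit
C = np.zeros(nx)

for _ in range(20000):
    C[0] = C0                                    # vessel boundary
    lap = (np.roll(C, -1) - 2 * C + np.roll(C, 1)) / dx**2
    lap[-1] = 2 * (C[-2] - C[-1]) / dx**2        # zero-flux at far end
    C[1:] += dt * (D * lap[1:] - k * C[1:])

depth_um = np.arange(nx) * dx * 1e4
for i in range(0, nx, 20):
    print(f"{depth_um[i]:6.0f} um : C/C0 = {C[i]:.3f}")
```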

  18. Tree injury and mortality in fires: developing process-based models

    Treesearch

    Bret W. Butler; Matthew B. Dickinson

    2010-01-01

    Wildland fire managers are often required to predict tree injury and mortality when planning a prescribed burn or when considering wildfire management options; and, currently, statistical models based on post-fire observations are the only tools available for this purpose. Implicit in the derivation of statistical models is the assumption that they are strictly...

  19. Techniques for Down-Sampling a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also while eliminating the existing measurement noise and measurement errors. The software tool implementing the current two new techniques can be used in all optical model validation processes involving large space optical surfaces.
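
    The abstract does not detail the two techniques; as a generic illustration of noise-suppressing down-sampling, the sketch below averages non-overlapping blocks of a synthetic height map (one common approach, not necessarily the one this software implements).

```python
import numpy as np

def downsample_block_mean(surface, factor):
    """Down-sample a surface height map by averaging non-overlapping
    factor x factor blocks. Averaging both reduces resolution and
    suppresses uncorrelated measurement noise by ~1/factor."""
    ny, nx = surface.shape
    ny, nx = ny - ny % factor, nx - nx % factor      # trim to a multiple
    blocks = surface[:ny, :nx].reshape(ny // factor, factor, nx // factor, factor)
    return blocks.mean(axis=(1, 3))

# Synthetic 512x512 map: smooth figure error plus white measurement noise.
y, x = np.mgrid[0:512, 0:512] / 512.0
surface = 50e-9 * np.sin(2 * np.pi * x) * np.cos(2 * np.pi * y)
noisy = surface + 5e-9 * np.random.randn(512, 512)

small = downsample_block_mean(noisy, 8)              # 64x64 output
print(small.shape)   # white-noise rms in the output drops ~8x for 8x8 blocks
```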

  20. Implementation of channel-routing routines in the Water Erosion Prediction Project (WEPP) model

    Treesearch

    Li Wang; Joan Q. Wu; William J. Elliott; Shuhui Dun; Sergey Lapin; Fritz R. Fiedler; Dennis C. Flanagan

    2010-01-01

    The Water Erosion Prediction Project (WEPP) model is a process-based, continuous-simulation, watershed hydrology and erosion model. It is an important tool for water erosion simulation owing to its unique functionality in representing diverse landuse and management conditions. Its applicability is limited to relatively small watersheds since its current version does...

  1. Collision detection and modeling of rigid and deformable objects in laparoscopic simulator

    NASA Astrophysics Data System (ADS)

    Dy, Mary-Clare; Tagawa, Kazuyoshi; Tanaka, Hiromi T.; Komori, Masaru

    2015-03-01

    Laparoscopic simulators are viable alternatives for surgical training and rehearsal. Haptic devices can also be incorporated with virtual reality simulators to provide additional cues to the users. However, to provide realistic feedback, the haptic device must be updated at 1 kHz. On the other hand, realistic visual cues, that is, the collision detection and deformation between interacting objects, must be rendered at a rate of at least 30 fps. Our current laparoscopic simulator detects collisions between a point on the tool tip and the organ surfaces; haptic devices are attached to actual tool tips for realistic tool manipulation. The triangular-mesh organ model is rendered using a mass-spring deformation model or finite element method-based models. In this paper, we investigated multi-point-based collision detection on the rigid tool rods. Based on the preliminary results, we propose a method to improve the collision detection scheme and speed up the organ deformation reaction. We discuss our proposal for an efficient method to compute simultaneous multiple collisions between rigid (laparoscopic tools) and deformable (organs) objects, and to perform the subsequent collision response, with haptic feedback, in real time.
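
    The multi-point rod test the authors investigate can be sketched generically. Below, the organ surface is approximated by bounding spheres (a common broad-phase simplification; the actual simulator uses triangular meshes), and several sample points along the rod are tested rather than the tip alone. All geometry is invented for illustration.

```python
import numpy as np

def rod_sphere_collisions(p0, p1, centers, radii, n_samples=16):
    """Multi-point collision test between a rigid tool rod (segment p0-p1)
    and an organ surface approximated by bounding spheres. Sampling several
    points along the rod, rather than the tip alone, catches shaft contact."""
    ts = np.linspace(0.0, 1.0, n_samples)
    points = p0[None, :] + ts[:, None] * (p1 - p0)[None, :]   # (n_samples, 3)
    # pairwise distances between rod samples and sphere centers
    d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    hits = np.argwhere(d < radii[None, :])
    return [(ts[i], int(j)) for i, j in hits]   # (rod parameter, sphere index)

p0, p1 = np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 10.0])
centers = np.array([[0.5, 0.0, 5.0], [3.0, 3.0, 3.0]])
radii = np.array([1.0, 0.5])
print(rod_sphere_collisions(p0, p1, centers, radii))   # shaft hits sphere 0
```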

  2. Improved Aerodynamic Analysis for Hybrid Wing Body Conceptual Design Optimization

    NASA Technical Reports Server (NTRS)

    Gern, Frank H.

    2012-01-01

    This paper provides an overview of ongoing efforts to develop, evaluate, and validate different tools for improved aerodynamic modeling and systems analysis of Hybrid Wing Body (HWB) aircraft configurations. Results are being presented for the evaluation of different aerodynamic tools including panel methods, enhanced panel methods with viscous drag prediction, and computational fluid dynamics. Emphasis is placed on proper prediction of aerodynamic loads for structural sizing as well as viscous drag prediction to develop drag polars for HWB conceptual design optimization. Data from transonic wind tunnel tests at the Arnold Engineering Development Center's 16-Foot Transonic Tunnel were used as a reference data set in order to evaluate the accuracy of the aerodynamic tools. Triangularized surface data and Vehicle Sketch Pad (VSP) models of an X-48B 2% scale wind tunnel model were used to generate input and model files for the different analysis tools. In support of ongoing HWB scaling studies within the NASA Environmentally Responsible Aviation (ERA) program, an improved finite element based structural analysis and weight estimation tool for HWB center bodies is currently under development. Aerodynamic results from these analyses are used to provide additional aerodynamic validation data.

  3. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results, while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools, and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, enables scalable computation and storage, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools, and data.

  4. A Short Review of Ablative-Material Response Models and Simulation Tools

    NASA Technical Reports Server (NTRS)

    Lachaud, Jean; Magin, Thierry E.; Cozmuta, Ioana; Mansour, Nagi N.

    2011-01-01

    A review of the governing equations and boundary conditions used to model the response of ablative materials subjected to a high-enthalpy flow is proposed. The heritage of model-development efforts undertaken in the 1960s is extremely clear: the bases of the models used in the community are mathematically equivalent. Most of the material-response codes implement a single model in which the equation parameters may be modified to model different materials or conditions. The level of fidelity of the models implemented in design tools varies only slightly. Research and development codes are generally more advanced but often not as robust. The capabilities of each of these codes are summarized in a color-coded table along with research and development efforts currently in progress.

  5. Foundations of the Bandera Abstraction Tools

    NASA Technical Reports Server (NTRS)

    Hatcliff, John; Dwyer, Matthew B.; Pasareanu, Corina S.; Robby

    2003-01-01

    Current research is demonstrating that model-checking and other forms of automated finite-state verification can be effective for checking properties of software systems. Due to the exponential costs associated with model-checking, multiple forms of abstraction are often necessary to obtain system models that are tractable for automated checking. The Bandera Tool Set provides multiple forms of automated support for compiling concurrent Java software systems to models that can be supplied to several different model-checking tools. In this paper, we describe the foundations of Bandera's data abstraction mechanism, which is used to reduce the cardinality (and the program's state-space) of data domains in software to be model-checked. From a technical standpoint, the form of data abstraction used in Bandera is simple, and it is based on classical presentations of abstract interpretation. We describe the mechanisms that Bandera provides for declaring abstractions, for attaching abstractions to programs, and for generating abstracted programs and properties. The contributions of this work are the design and implementation of various forms of tool support required for effective application of data abstraction to software components written in a programming language like Java, which has a rich set of linguistic features.
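
    Data abstraction in the style described here replaces a large concrete domain with a small abstract one. The classical "signs" domain is the textbook example from abstract interpretation; the sketch below is illustrative of the idea, not Bandera's implementation.

```python
# Integers abstracted to their sign, shrinking an unbounded domain to
# {NEG, ZERO, POS} so a model checker can enumerate states.
NEG, ZERO, POS = "NEG", "ZERO", "POS"

def alpha(n):
    """Abstraction function: concrete int -> sign."""
    return ZERO if n == 0 else (POS if n > 0 else NEG)

def add(a, b):
    """Abstract addition; returns the set of possible result signs."""
    table = {
        (NEG, NEG): {NEG}, (POS, POS): {POS},
        (ZERO, ZERO): {ZERO},
        (NEG, ZERO): {NEG}, (ZERO, NEG): {NEG},
        (POS, ZERO): {POS}, (ZERO, POS): {POS},
        (NEG, POS): {NEG, ZERO, POS}, (POS, NEG): {NEG, ZERO, POS},
    }
    return table[(a, b)]

print(alpha(7), alpha(-3))          # POS NEG
print(add(POS, POS))                # {'POS'}
print(add(POS, NEG))                # nondeterministic: all three signs
```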

  6. Open Source GIS based integrated watershed management

    NASA Astrophysics Data System (ADS)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution and process-based, with real-time capability to assess changing resource issues critical to short-, medium-, and long-term environmental management. The objective here is to merge two renowned, well-published resource modeling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate a much increased efficiency in land and water resource security, management, and planning. Following an 'open-source' philosophy, the tools will be computer-platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex-terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool for addressing challenging resource management issues in industry, government, and non-governmental agencies. Current research and analysis tools were developed to manage meteorological, climatological, and land and water resource data efficiently at high resolution in space and time. The deliverable for this work is a Whitebox-GENESYS open-source resource management capability with routines for GIS-based watershed management, including water in agriculture and food production. Urban water management routines are being added through GENESYS in 2013-15 with an engineering PhD candidate. Both Whitebox-GAT and GENESYS are already well-established tools. The proposed research will combine these products to create an open-source, geomatics-based water resource management tool that is revolutionary in both capacity and availability to a wide array of Canadian and global users.

  7. An Investigation of Software Scaffolds Supporting Modeling Practices

    NASA Astrophysics Data System (ADS)

    Fretz, Eric B.; Wu, Hsin-Kai; Zhang, Baohui; Davis, Elizabeth A.; Krajcik, Joseph S.; Soloway, Elliot

    2002-08-01

    Modeling of complex systems and phenomena is of value in science learning and is increasingly emphasised as an important component of science teaching and learning. Modeling engages learners in desired pedagogical activities. These activities include practices such as planning, building, testing, analysing, and critiquing. Designing realistic models is a difficult task. Computer environments allow the creation of dynamic and even more complex models. One way of bringing the design of models within reach is through the use of scaffolds. Scaffolds are intentional assistance provided to learners from a variety of sources, allowing them to complete tasks that would otherwise be out of reach. Currently, our understanding of how scaffolds in software tools assist learners is incomplete. In this paper the scaffolds designed into a dynamic modeling software tool called Model-It are assessed in terms of their ability to support learners' use of modeling practices. Four pairs of middle school students were video-taped as they used the modeling software for three hours, spread over a two week time frame. Detailed analysis of coded videotape transcripts provided evidence of the importance of scaffolds in supporting the use of modeling practices. Learners used a variety of modeling practices, the majority of which occurred in conjunction with scaffolds. The use of three tool scaffolds was assessed as directly as possible, and these scaffolds were seen to support a variety of modeling practices. An argument is made for the continued empirical validation of types and instances of tool scaffolds, and further investigation of the important role of teacher and peer scaffolding in the use of scaffolded tools.

  8. Emerging from the bottleneck: Benefits of the comparative approach to modern neuroscience

    PubMed Central

    Brenowitz, Eliot A.; Zakon, Harold H.

    2015-01-01

    Neuroscience historically exploited a wide diversity of animal taxa. Recently, however, research focused increasingly on a few model species. This trend accelerated with the genetic revolution, as genomic sequences and genetic tools became available for a few species, which formed a bottleneck. This coalescence on a small set of model species comes with several costs often not considered, especially in the current drive to use mice explicitly as models for human diseases. Comparative studies of strategically chosen non-model species can complement model species research and yield more rigorous studies. As genetic sequences and tools become available for many more species, we are poised to emerge from the bottleneck and once again exploit the rich biological diversity offered by comparative studies. PMID:25800324

  9. Application of Complex Adaptive Systems in Portfolio Management

    ERIC Educational Resources Information Center

    Su, Zheyuan

    2017-01-01

    Simulation-based methods are becoming a promising research tool in financial markets. A general Complex Adaptive System can be tailored to different application scenarios. Based on the current research, we built two models that would benefit portfolio management by utilizing Complex Adaptive Systems (CAS) in Agent-based Modeling (ABM) approach.…

  10. Maximum Likelihood as an Operational Tool in Socio-Economic Modeling : As Outlined in a Recent Thesis of D. W. Peterson

    DOT National Transportation Integrated Search

    1977-02-01

    The limitations of currently used estimation procedures in socio-economic modeling have been highlighted in the ongoing work of Senge, in which it is shown where more sophisticated estimation procedures may become necessary. One such advanced method ...

  11. Augmenting Literacy: The Role of Expertise in Digital Writing

    ERIC Educational Resources Information Center

    Van Ittersum, Derek

    2011-01-01

    This essay presents a model of reflective use of writing technologies, one that provides a means of more fully exploiting the possibilities of these tools for transforming writing activity. Derived from the work of computer designer Douglas Engelbart, the "bootstrapping" model of reflective use extends current arguments in the field…

  12. Modeling Spatial and Temporal Aspects of Visual Backward Masking

    ERIC Educational Resources Information Center

    Hermens, Frouke; Luksys, Gediminas; Gerstner, Wulfram; Herzog, Michael H.; Ernst, Udo

    2008-01-01

    Visual backward masking is a versatile tool for understanding principles and limitations of visual information processing in the human brain. However, the mechanisms underlying masking are still poorly understood. In the current contribution, the authors show that a structurally simple mathematical model can explain many spatial and temporal…

  13. System Architecture Modeling for Technology Portfolio Management using ATLAS

    NASA Technical Reports Server (NTRS)

    Thompson, Robert W.; O'Neil, Daniel A.

    2006-01-01

    Strategic planners and technology portfolio managers have traditionally relied on consensus-based tools, such as the Analytical Hierarchy Process (AHP) and Quality Function Deployment (QFD), in planning the funding of technology development. While useful to a certain extent, these tools are limited in their ability to fully quantify the impact of a technology choice on system mass, system reliability, project schedule, and lifecycle cost. The Advanced Technology Lifecycle Analysis System (ATLAS) aims to provide strategic planners a decision support tool for analyzing technology selections within a Space Exploration Architecture (SEA). Using ATLAS, strategic planners can select physics-based system models from a library, configure the systems with technologies and performance parameters, and plan the deployment of a SEA. Key parameters for current and future technologies have been collected from subject-matter experts and other documented sources in the Technology Tool Box (TTB). ATLAS can be used to compare the technical feasibility and economic viability of one set of technology choices for a SEA against another set of technology choices or another SEA. System architecture modeling in ATLAS is a multi-step process. First, the modeler defines the system-level requirements. Second, the modeler identifies the technologies of interest whose impact on the SEA is to be assessed. Third, the system modeling team creates models of architecture elements (e.g., launch vehicles, in-space transfer vehicles, crew vehicles) if they are not already in the model library. Finally, the architecture modeler develops a script for the ATLAS tool to run, and the results for comparison are generated.
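
    As an illustration of the kind of physics-based comparison described here, the sketch below configures a notional transfer-vehicle element with two alternative propulsion technologies and compares propellant mass (via the rocket equation) against development cost. The class names and every number are invented for illustration, not TTB data or ATLAS code.

```python
import math
from dataclasses import dataclass

@dataclass
class Technology:
    name: str
    specific_impulse_s: float     # performance parameter
    cost_M: float                 # development cost, $M (invented)

@dataclass
class TransferVehicle:
    dry_mass_kg: float
    delta_v_ms: float
    tech: Technology

    def propellant_mass(self):
        """Rocket equation: m_prop = m_dry * (exp(dv / (Isp * g0)) - 1)."""
        g0 = 9.80665
        return self.dry_mass_kg * (
            math.exp(self.delta_v_ms / (self.tech.specific_impulse_s * g0)) - 1)

options = [Technology("chemical", 450, 120.0), Technology("electric", 3000, 480.0)]
for tech in options:
    v = TransferVehicle(dry_mass_kg=8000, delta_v_ms=4000, tech=tech)
    print(f"{tech.name:9s} propellant={v.propellant_mass():8.0f} kg  cost=${tech.cost_M:.0f}M")
```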

  14. Application of the NASCAP Spacecraft Simulation Tool to Investigate Electrodynamic Tether Current Collection in LEO

    NASA Technical Reports Server (NTRS)

    Adams, Mitzi; HabashKrause, Linda

    2012-01-01

    Recent interest in using electrodynamic tethers (EDTs) for orbital maneuvering in Low Earth Orbit (LEO) has prompted the development of the Marshall ElectroDynamic Tether Orbit Propagator (MEDTOP) model. The model is comprised of several modules which address various aspects of EDT propulsion, including calculation of state vectors using a standard orbit propagator (e.g., J2), an atmospheric drag model, realistic ionospheric and magnetic field models, space weather effects, and tether librations. The natural electromotive force (EMF) developed along a radially aligned conductive tether results in electrons flowing down the tether and accumulating on the lower-altitude spacecraft. The energy that drives this EMF is sourced from the orbital energy of the system; thus, EDTs are often proposed as de-orbiting systems. However, when the current is reversed using satellite charged particle sources, propulsion is possible. One of the most difficult challenges of the modeling effort is to ascertain the equivalent circuit between the spacecraft and the ionospheric plasma. The present study investigates the use of the NASA Charging Analyzer Program (NASCAP) to calculate currents to and from the tethered satellites and the ionospheric plasma. NASCAP is a sophisticated set of computational tools for modeling the surface charging of three-dimensional (3D) spacecraft surfaces in a time-varying space environment. The model's surface is tessellated into a collection of facets, and NASCAP calculates currents and potentials for each one. Additionally, NASCAP provides for the construction of one or more nested grids to calculate space potential and time-varying electric fields. This provides the capability to track individual particle orbits, to model charged-particle wakes, and to incorporate external charged-particle sources. With this study, we have developed a model for calculating currents incident on an electrodynamic tethered satellite system, and first results are shown here.
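
    The motional EMF mentioned here follows from standard electrodynamics: for a straight tether, EMF = (v x B) . L. The sketch below plugs in rough LEO magnitudes; the values are illustrative, not MEDTOP or NASCAP output.

```python
import numpy as np

# Motional EMF across a straight conductive tether: EMF = (v x B) . L
v = np.array([7500.0, 0.0, 0.0])        # orbital velocity, m/s
B = np.array([0.0, 2.5e-5, 0.0])        # geomagnetic field, T (~25 uT)
L = np.array([0.0, 0.0, 5000.0])        # 5 km tether, radially aligned, m

emf = np.dot(np.cross(v, B), L)
print(f"motional EMF ~ {emf:.0f} V")    # ~0.19 V/m over 5 km, about 940 V
```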

  15. Human performance cognitive-behavioral modeling: a benefit for occupational safety.

    PubMed

    Gore, Brian F

    2002-01-01

    Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.

  16. Human performance cognitive-behavioral modeling: a benefit for occupational safety

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2002-01-01

    Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.

  17. Localized Overheating Phenomena and Optimization of Spark-Plasma Sintering Tooling Design

    PubMed Central

    Giuntini, Diletta; Olevsky, Eugene A.; Garcia-Cardona, Cristina; Maximenko, Andrey L.; Yurlova, Maria S.; Haines, Christopher D.; Martin, Darold G.; Kapoor, Deepak

    2013-01-01

    The present paper shows the application of a three-dimensional coupled electrical, thermal, and mechanical finite element macro-scale modeling framework for Spark Plasma Sintering (SPS) to an actual problem of SPS tooling overheating encountered during SPS experimentation. The overheating phenomenon is analyzed by varying the geometry of the tooling that exhibits the problem, namely by modeling various tooling configurations involving sequences of disk-shaped spacers with step-wise increasing radii. The analysis is conducted by means of finite element simulations, intended to obtain temperature spatial distributions in the graphite press-forms, including punches, dies, and spacers; to identify the temperature peaks and their respective timing; and to propose a more suitable SPS tooling configuration with the avoidance of overheating as the final aim. Joule heating driven by electric currents, heat transfer, mechanical conditions, and densification are embedded in the model, utilizing the finite-element software COMSOL™, which possesses a distinguishing ability to couple multiple physics. Thereby, the implementation of a finite element method applicable to a broad range of SPS procedures is carried out, together with the more specific optimization of the SPS tooling design when dealing with excessive heating phenomena. PMID:28811398
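
    The core electro-thermal coupling in such models is heat conduction with volumetric Joule heating. The sketch below is a one-dimensional stand-in for the full 3D finite element problem, with rough graphite-like property values; everything here is an illustrative assumption, not the paper's COMSOL model.

```python
import numpy as np

# 1D transient conduction with volumetric Joule heating:
# dT/dt = alpha * d2T/dx2 + q/(rho*cp),  q = rho_e * J^2
k, rho, cp = 100.0, 1800.0, 710.0        # W/m-K, kg/m^3, J/kg-K (graphite-like)
rho_e, J = 1e-5, 5e5                     # ohm-m, A/m^2
alpha = k / (rho * cp)
q = rho_e * J**2                         # W/m^3 Joule heat

nx, Lx = 101, 0.1                        # 10 cm punch/spacer column
dx = Lx / (nx - 1)
dt = 0.4 * dx**2 / alpha                 # explicit stability limit
T = np.full(nx, 300.0)

for _ in range(5000):
    lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
    T[1:-1] += dt * (alpha * lap[1:-1] + q / (rho * cp))
    T[0] = T[-1] = 300.0                 # water-cooled rams held at 300 K

print(f"peak temperature ~ {T.max():.0f} K at x ~ {np.argmax(T)*dx*100:.1f} cm")
```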

  18. THE POTENTIAL IMPACT OF INTEGRATED MALARIA TRANSMISSION CONTROL ON ENTOMOLOGIC INOCULATION RATE IN HIGHLY ENDEMIC AREAS

    PubMed Central

    KILLEEN, GERRY F.; McKENZIE, F. ELLIS; FOY, BRIAN D.; SCHIEFFELIN, CATHERINE; BILLINGSLEY, PETER F.; BEIER, JOHN C.

    2008-01-01

    We have used a relatively simple but accurate model for predicting the impact of integrated transmission control on the malaria entomologic inoculation rate (EIR) at four endemic sites from across sub-Saharan Africa and the southwest Pacific. The simulated campaign incorporated modestly effective vaccine coverage, bed net use, and larval control. The results indicate that such campaigns would reduce EIRs at all four sites by 30- to 50-fold. Even without the vaccine, 15- to 25-fold reductions of EIR were predicted, implying that integrated control with a few modestly effective tools can meaningfully reduce malaria transmission in a range of endemic settings. The model accurately predicts the effects of bed nets and indoor spraying and demonstrates that they are the most effective tools available for reducing EIR. However, the impact of domestic adult vector control is amplified by measures for reducing the rate of emergence of vectors or the level of infectiousness of the human reservoir. We conclude that available tools, including currently neglected methods for larval control, can reduce malaria transmission intensity enough to alleviate mortality. Integrated control programs should be implemented to the fullest extent possible, even in areas of intense transmission, using simple models as decision-making tools. However, we also conclude that to eliminate malaria in many areas of intense transmission is beyond the scope of methods which developing nations can currently afford. New, cost-effective, practical tools are needed if malaria is ever to be eliminated from highly endemic areas. PMID:11289662
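
    The compounding effect of several modestly effective tools can be illustrated with a toy multiplicative calculation. This is our sketch, not the authors' published model; the placement and form of each factor are assumptions made for illustration.

```python
# Each intervention scales a factor of the entomologic inoculation rate,
# so modest individual effects compound into a larger EIR reduction.
def eir_reduction(bednet_cov=0.5, bednet_eff=0.5,
                  larval_reduction=0.5, vaccine_cov=0.5, vaccine_eff=0.5):
    biting = 1 - bednet_cov * bednet_eff          # fewer infective bites received
    emergence = 1 - larval_reduction              # fewer vectors emerging
    reservoir = 1 - vaccine_cov * vaccine_eff     # less infectious human reservoir
    # In this simple sketch the biting factor enters twice: nets protect
    # humans from vectors and vectors from infectious humans.
    return biting * biting * emergence * reservoir

f = eir_reduction()
print(f"EIR scaled by {f:.3f}  (~{1/f:.0f}-fold reduction)")
f_novax = eir_reduction(vaccine_cov=0.0)
print(f"without vaccine: ~{1/f_novax:.0f}-fold reduction")
```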

  19. Pond and Irrigation Model (PIM): a tool for simultaneously evaluating pond water availability and crop irrigation demand

    Treesearch

    Ying Ouyang; Gary Feng; Theodor D. Leininger; John Read; Johnie N. Jenkins

    2018-01-01

    Agricultural ponds are an important alternative source of water for crop irrigation to conserve surface and ground water resources. In recent years more such ponds have been constructed in Mississippi and around the world. There is currently, however, a lack of a tool to simultaneously estimate crop irrigation demand and pond water availability. In this study, a Pond-...

  20. The Use of Individual Growth and Developmental Indicators for Progress Monitoring and Intervention Decision Making in Early Education

    ERIC Educational Resources Information Center

    Walker, Dale; Carta, Judith J.; Greenwood, Charles R.; Buzhardt, Joseph F.

    2008-01-01

    Progress monitoring tools have been shown to be essential elements in current approaches to intervention problem-solving models. Such tools have been valuable not only in marking individual children's level of performance relative to peers but also in measuring change in skill level in a way that can be attributed to intervention and development.…

  1. Development of wavelet analysis tools for turbulence

    NASA Technical Reports Server (NTRS)

    Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.

    1992-01-01

    Presented here are the general framework and the initial results of a joint effort to derive novel research tools and easy-to-use software to analyze and model turbulence and transition. Given here are a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.
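
    The Haar transform is the simplest wavelet decomposition and illustrates why wavelets suit intermittent turbulence signals: detail coefficients localize energy in both position and scale. The sketch below is a generic illustration, not the authors' software.

```python
import numpy as np

def haar_step(signal):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation) and differences (detail)."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2.0)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2.0)
    return approx, detail

def haar_decompose(signal, levels):
    """Multi-level decomposition; each level halves the resolution."""
    coeffs = []
    approx = signal
    for _ in range(levels):
        approx, detail = haar_step(approx)
        coeffs.append(detail)
    return approx, coeffs

# A smooth signal with a localized burst, loosely mimicking intermittency:
t = np.linspace(0, 1, 256)
u = np.sin(2 * np.pi * 4 * t)
u[120:136] += 2.0 * np.sin(2 * np.pi * 64 * t[120:136])

approx, details = haar_decompose(u, 4)
for lvl, d in enumerate(details, 1):
    print(f"level {lvl}: max |detail| = {np.abs(d).max():.2f} at index {np.abs(d).argmax()}")
```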

  2. Forecasting of wet snow avalanche activity: Proof of concept and operational implementation

    NASA Astrophysics Data System (ADS)

    Gobiet, Andreas; Jöbstl, Lisa; Rieder, Hannes; Bellaire, Sascha; Mitterer, Christoph

    2017-04-01

State-of-the-art tools for the operational assessment of avalanche danger include field observations, recordings from automatic weather stations, meteorological analyses and forecasts, and recently also indices derived from snowpack models. In particular, an index for identifying the onset of wet-snow avalanche cycles (LWCindex) has been demonstrated to be useful. However, its value for operational avalanche forecasting is currently limited, since detailed, physically based snowpack models are usually driven by meteorological data from automatic weather stations only and therefore have no prognostic ability. Since avalanche risk management heavily relies on timely information and early warnings, many avalanche services in Europe are now starting to issue forecasts for the following days instead of the traditional assessment of the current avalanche danger. In this context, the prognostic operation of detailed snowpack models has recently been the objective of extensive research. In this study, a new, observationally constrained setup for forecasting the onset of wet-snow avalanche cycles with the detailed snow cover model SNOWPACK is presented and evaluated. Based on data from weather stations and different numerical weather prediction models, we demonstrate that forecasts of the LWCindex as an indicator of wet-snow avalanche cycles can be useful for operational warning services, but are so far not reliable enough to be used as a single warning tool without considering other factors. Therefore, further development currently focuses on improving the forecasts by applying ensemble techniques and suitable post-processing approaches to the output of numerical weather prediction models. In parallel, the prognostic meteo-snow model chain has been in operational use by two regional avalanche warning services in Austria since winter 2016/2017. Experiences from the first operational season and first results from current model developments will be reported.

  3. Ion beam deposition system for depositing low defect density extreme ultraviolet mask blanks

    NASA Astrophysics Data System (ADS)

    Jindal, V.; Kearney, P.; Sohn, J.; Harris-Jones, J.; John, A.; Godwin, M.; Antohe, A.; Teki, R.; Ma, A.; Goodwin, F.; Weaver, A.; Teora, P.

    2012-03-01

Extreme ultraviolet lithography (EUVL) is the leading next-generation lithography (NGL) technology to succeed optical lithography at the 22 nm node and beyond. EUVL requires a low defect density reflective mask blank, which is considered to be one of the top two critical technology gaps for commercialization of the technology. At the SEMATECH Mask Blank Development Center (MBDC), research on defect reduction in EUV mask blanks is being pursued using the Veeco Nexus deposition tool. The defect performance of this tool is one of the factors limiting the availability of defect-free EUVL mask blanks. SEMATECH identified the key components in the ion beam deposition system that are currently impeding the reduction of defect density and the yield of EUV mask blanks. SEMATECH's current research is focused on in-house tool components to reduce their contributions to mask blank defects. SEMATECH is also working closely with the supplier to incorporate this learning into a next-generation deposition tool. This paper will describe requirements for the next-generation tool that are essential to realizing low defect density EUV mask blanks. The goal of our work is to enable model-based predictions of defect performance and defect improvement for targeted process improvement and component learning to feed into the new deposition tool design. This paper will also highlight the defect reduction resulting from process improvements and will outline the restrictions inherent in the current tool geometry and components that impede meeting HVM-quality EUV mask blanks.

  4. RSAT: regulatory sequence analysis tools.

    PubMed

    Thomas-Chollier, Morgane; Sand, Olivier; Turatsinze, Jean-Valéry; Janky, Rekin's; Defrance, Matthieu; Vervisch, Eric; Brohée, Sylvain; van Helden, Jacques

    2008-07-01

The regulatory sequence analysis tools (RSAT, http://rsat.ulb.ac.be/rsat/) is a software suite that integrates a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. The suite includes programs for sequence retrieval, pattern discovery, phylogenetic footprint detection, pattern matching, genome scanning and feature map drawing. Random controls can be performed with random gene selections or by generating random sequences according to a variety of background models (Bernoulli, Markov). Beyond the original word-based pattern-discovery tools (oligo-analysis and dyad-analysis), we recently added a battery of tools for matrix-based detection of cis-acting elements, with some original features (adaptive background models, Markov-chain estimation of P-values) that do not exist in other matrix-based scanning tools. The web server offers an intuitive interface, where each program can be accessed either separately or connected to the other tools. In addition, the tools are now available as web services, enabling their integration in programmatic workflows. Genomes are regularly updated from various genome repositories (NCBI and EnsEMBL) and 682 organisms are currently supported. Since 1998, the tools have been used by several hundred researchers from all over the world. Several predictions made with RSAT were validated experimentally and published.
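
    As a flavor of what a Markov background model involves, the sketch below estimates transition probabilities from a training sequence and scores a word against them. This illustrates the general technique only, not RSAT's implementation; treating the first k letters as uniform is our simplification.

        # Minimal sketch of a Markov-chain background model for motif statistics,
        # in the spirit of word-based analyses (not the actual RSAT code).
        from collections import defaultdict

        def train_markov(seq, k=1):
            """Estimate order-k Markov transition probabilities from a sequence."""
            counts = defaultdict(lambda: defaultdict(int))
            for i in range(len(seq) - k):
                prefix, nxt = seq[i:i+k], seq[i+k]
                counts[prefix][nxt] += 1
            return {p: {b: n / sum(succ.values()) for b, n in succ.items()}
                    for p, succ in counts.items()}

        def word_probability(word, model, k=1):
            """Probability of a word under the background model (order k)."""
            p = 1.0 / 4**k          # uniform start; an assumption for brevity
            for i in range(k, len(word)):
                p *= model.get(word[i-k:i], {}).get(word[i], 0.0)
            return p

        bg = train_markov("ACGTGCGTACGTTGCA" * 50, k=1)
        print(word_probability("ACGT", bg))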

  5. Regulatory assessment of chemical mixtures: Requirements, current approaches and future perspectives.

    PubMed

    Kienzler, Aude; Bopp, Stephanie K; van der Linden, Sander; Berggren, Elisabet; Worth, Andrew

    2016-10-01

This paper reviews regulatory requirements and recent case studies to illustrate how the risk assessment (RA) of chemical mixtures is conducted, considering both the effects on human health and on the environment. A broad range of chemicals, regulations and RA methodologies are covered, in order to identify mixtures of concern, gaps in the regulatory framework, data needs, and further work to be carried out. The current and potential future use of novel tools (Adverse Outcome Pathways, in silico tools, toxicokinetic modelling, etc.) in the RA of combined effects was also reviewed. The assumptions made in the RA, predictive model specifications and the choice of toxic reference values can greatly influence the assessment outcome, and should therefore be specifically justified. Novel tools could support mixture RA mainly by providing a better understanding of the underlying mechanisms of combined effects. Nevertheless, their use is currently limited because of a lack of guidance, data, and expertise. More guidance is needed to facilitate their application. As far as the authors are aware, no prospective RA concerning chemicals related to various regulatory sectors has been performed to date, even though numerous chemicals are registered under several regulatory frameworks. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  6. SDMtoolbox 2.0: the next generation Python-based GIS toolkit for landscape genetic, biogeographic and species distribution model analyses.

    PubMed

    Brown, Jason L; Bennett, Joseph R; French, Connor M

    2017-01-01

SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS software and MaxEnt software, and reduces the amount of time that would be spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet facilitates careful parameterization of species distribution models (SDMs) to maximize each model's discriminatory ability and minimize overfitting. This includes careful processing of occurrence data and environmental data, and careful model parameterization. This program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling software programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have 'universal' analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations of SDMs projected into future climates - to name only a few functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user.
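
    Spatial rarefaction of occurrence records is one of the core SDM pre-processing steps such a toolbox automates. The pure-Python sketch below shows the underlying idea under a greedy-thinning assumption; the real toolbox does this with ArcGIS geoprocessing, not this code.

        # Stand-in for SDMtoolbox-style spatial rarefaction of occurrence records
        # (pure Python; the real toolbox implements this via ArcGIS geoprocessing).
        import math

        def haversine_km(p, q):
            """Great-circle distance in km between (lat, lon) points in degrees."""
            lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
            a = (math.sin((lat2 - lat1) / 2) ** 2
                 + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
            return 2 * 6371.0 * math.asin(math.sqrt(a))

        def rarefy(points, min_km=10.0):
            """Greedily keep points so that no two are closer than min_km."""
            kept = []
            for p in points:
                if all(haversine_km(p, q) >= min_km for q in kept):
                    kept.append(p)
            return kept

        occurrences = [(40.0, -105.0), (40.05, -105.0), (41.0, -105.5)]
        print(rarefy(occurrences))   # drops the second, near-duplicate record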

  7. Cloud-Based Tools to Support High-Resolution Modeling (Invited)

    NASA Astrophysics Data System (ADS)

    Jones, N.; Nelson, J.; Swain, N.; Christensen, S.

    2013-12-01

The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HTCondor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HTCondor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archival, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482
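
    The sketch below illustrates, in plain Python, what running a high-resolution model as a geoprocessing workflow amounts to structurally: an ordered chain of steps threading scenario state through. All step names and config keys are hypothetical; the actual CI-WATER stack wires such steps through 52 North and CKAN rather than a list of functions.

        # Hypothetical sketch of a geoprocessing workflow of the kind described
        # above; step names and functions are illustrative, not the CI-WATER API.

        def clip_watershed(cfg):   return {**cfg, "clipped": True}
        def run_model(cfg):        return {**cfg, "result": "runoff.nc"}
        def publish_result(cfg):   return {**cfg, "published": True}

        WORKFLOW = [clip_watershed, run_model, publish_result]

        def run_scenario(config):
            """Run each geoprocessing step in order, threading state through."""
            state = dict(config)
            for step in WORKFLOW:
                state = step(state)
            return state

        print(run_scenario({"basin": "Provo River", "dem": "dem.tif"}))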

  8. The Cryosphere Model Comparison Tool (CmCt): Ice Sheet Model Validation and Comparison Tool for Greenland and Antarctica

    NASA Astrophysics Data System (ADS)

    Simon, E.; Nowicki, S.; Neumann, T.; Tyahla, L.; Saba, J. L.; Guerber, J. R.; Bonin, J. A.; DiMarzio, J. P.

    2017-12-01

The Cryosphere model Comparison tool (CmCt) is a web-based ice sheet model validation tool that is being developed by NASA to facilitate direct comparison between observational data and various ice sheet models. The CmCt allows the user to take advantage of several decades worth of observations from Greenland and Antarctica. Currently, the CmCt can be used to compare ice sheet models provided by the user with remotely sensed satellite data from ICESat (Ice, Cloud, and land Elevation Satellite) laser altimetry, the GRACE (Gravity Recovery and Climate Experiment) satellite, and radar altimetry (ERS-1, ERS-2, and Envisat). One or more models can be uploaded through the CmCt website and compared with observational data, or compared to each other or to other models. The CmCt calculates statistics on the differences between the model and observations, and other quantitative and qualitative metrics, which can be used to evaluate the different model simulations against the observations. The qualitative metrics consist of a range of visual outputs and the quantitative metrics consist of several whole-ice-sheet scalar values that can be used to assign an overall score to a particular simulation. The comparison results from CmCt are useful in quantifying improvements within a specific model (or within a class of models) as a result of differences in model dynamics (e.g., shallow vs. higher-order dynamics approximations), model physics (e.g., representations of ice sheet rheological or basal processes), or model resolution (mesh resolution and/or changes in the spatial resolution of input datasets). The framework and metrics could also be used as a model-to-model intercomparison tool, simply by substituting outputs from another model for the observational datasets. Future versions of the tool will include comparisons with other datasets that are of interest to the modeling community, such as ice velocity, ice thickness, and surface mass balance.
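
    A minimal sketch of the kind of model-minus-observation statistics such a tool reports is given below; the metric names are assumptions for illustration, not the CmCt's actual output schema.

        # Sketch of whole-ice-sheet comparison statistics of the kind a tool like
        # the CmCt reports (assumed metrics; not the CmCt source code).
        import numpy as np

        def comparison_stats(model, obs):
            """Scalar metrics on co-located model and observed elevation grids."""
            d = model - obs
            d = d[np.isfinite(d)]                 # ignore cells without data
            return {"mean_bias_m": float(d.mean()),
                    "rmse_m": float(np.sqrt((d ** 2).mean())),
                    "max_abs_m": float(np.abs(d).max())}

        obs = np.random.normal(0.0, 1.0, (100, 100)) + 2000.0
        model = obs + np.random.normal(0.1, 0.5, obs.shape)
        print(comparison_stats(model, obs))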

  9. An Engineering Tool for the Prediction of Internal Dielectric Charging

    NASA Astrophysics Data System (ADS)

    Rodgers, D. J.; Ryden, K. A.; Wrenn, G. L.; Latham, P. M.; Sorensen, J.; Levy, L.

    1998-11-01

    A practical internal charging tool has been developed. It provides an easy-to-use means for satellite engineers to predict whether on-board dielectrics are vulnerable to electrostatic discharge in the outer radiation belt. The tool is designed to simulate irradiation of single-dielectric planar or cylindrical structures with or without shielding. Analytical equations are used to describe current deposition in the dielectric. This is fast and gives charging currents to sufficient accuracy given the uncertainties in other aspects of the problem - particularly material characteristics. Time-dependent internal electric fields are calculated, taking into account the effect on conductivity of electric field, dose rate and temperature. A worst-case model of electron fluxes in the outer belt has been created specifically for the internal charging problem and is built into the code. For output, the tool gives a YES or NO decision on the susceptibility of the structure to internal electrostatic breakdown and if necessary, calculates the required changes to bring the system below the breakdown threshold. A complementary programme of laboratory irradiations has been carried out to validate the tool. The results for Epoxy-fibreglass samples show that the code models electric field realistically for a wide variety of shields, dielectric thicknesses and electron spectra. Results for Teflon samples indicate that some further experimentation is required and the radiation-induced conductivity aspects of the code have not been validated.
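
    The time-dependent field calculation described above reduces, in its simplest form, to a balance between deposited current and conduction leakage. The sketch below integrates that balance with forward Euler, holding conductivity constant (the real tool makes it depend on electric field, dose rate and temperature); all parameter values are illustrative, not the tool's worst-case environment.

        # Forward-Euler sketch of dielectric charging: the internal field E obeys
        # eps * dE/dt = J_dep - sigma * E  (deposited current vs. conduction leak).

        EPS = 2.1 * 8.854e-12     # permittivity of a Teflon-like dielectric, F/m
        SIGMA = 1e-16             # bulk conductivity, S/m (field/dose dependent in
                                  # the real tool; held constant here for brevity)
        J_DEP = 1e-9              # deposited current density, A/m^2

        def field_history(t_end=3e5, dt=100.0):
            e_field, t, out = 0.0, 0.0, []
            while t < t_end:
                e_field += dt * (J_DEP - SIGMA * e_field) / EPS
                t += dt
                out.append((t, e_field))
            return out

        final_t, final_e = field_history()[-1]
        print(f"E after {final_t:.0f} s: {final_e:.2e} V/m")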

  10. Digital modeling of end-mill cutting tools for FEM applications from the active cutting contour

    NASA Astrophysics Data System (ADS)

    Salguero, Jorge; Marcos, M.; Batista, M.; Gómez, A.; Mayuet, P.; Bienvenido, R.

    2012-04-01

A very current technique in the research field of machining by material removal is simulation using the Finite Element Method (FEM). Nevertheless, although it is widely used in processes that allow approximation to orthogonal cutting, such as shaping, it is scarcely used in more complex processes, such as milling. This is due principally to the complex geometry of the cutting tools in these processes and to the need to carry out the studies in an oblique cutting configuration. This paper presents a methodology for the geometrical characterization of commercial end-mill cutting tools, based on extracting the cutting tool contour by optical metrology and using this geometry to model the active cutting zone in 3D CAD software. The resulting model is easily exportable to different CAD formats, such as IGES or STEP, and importable into FEM software, where the in-service behavior of the tools can be studied.

  11. Pelagic Habitat Analysis Module (PHAM) for GIS Based Fisheries Decision Support

    NASA Technical Reports Server (NTRS)

    Kiefer, D. A.; Armstrong, Edward M.; Harrison, D. P.; Hinton, M. G.; Kohin, S.; Snyder, S.; O'Brien, F. J.

    2011-01-01

We have assembled a system that integrates satellite and model output with fisheries data. We have developed tools that allow analysis of the interaction between species and key environmental variables, and we have demonstrated the capacity to accurately map the habitat of the thresher sharks Alopias vulpinus and A. pelagicus. Their seasonal migration along the California Current is at least partly driven by the seasonal migration of sardine, a key prey of the sharks.

  12. Galen-In-Use: using artificial intelligence terminology tools to improve the linguistic coherence of a national coding system for surgical procedures.

    PubMed

    Rodrigues, J M; Trombert-Paviot, B; Baud, R; Wagner, J; Meusnier-Carriot, F

    1998-01-01

GALEN has developed a language-independent common reference model based on a medically oriented ontology, together with practical tools and techniques for managing healthcare terminology, including natural language processing. GALEN-IN-USE is the current phase, which applies the modelling and the tools to the development or updating of coding systems for surgical procedures in different national coding centres co-operating within the European Federation of Coding Centres (EFCC) to create a language-independent knowledge repository for multicultural Europe. We used an integrated set of artificial intelligence terminology tools, named the CLAssification Manager workbench, to process French professional medical language rubrics into intermediate dissections and into the GRAIL reference ontology model representation. From this language-independent concept model representation we generate controlled French natural language. The French national coding centre is then able to retrieve the initial professional rubrics with different categories of concepts, to compare the professional language proposed by expert clinicians to the French generated controlled vocabulary, and to finalize the linguistic labels of the coding system in relation to the meanings of the conceptual system structure.

  13. Integrated modeling of advanced optical systems

    NASA Astrophysics Data System (ADS)

    Briggs, Hugh C.; Needels, Laura; Levine, B. Martin

    1993-02-01

    This poster session paper describes an integrated modeling and analysis capability being developed at JPL under funding provided by the JPL Director's Discretionary Fund and the JPL Control/Structure Interaction Program (CSI). The posters briefly summarize the program capabilities and illustrate them with an example problem. The computer programs developed under this effort will provide an unprecedented capability for integrated modeling and design of high performance optical spacecraft. The engineering disciplines supported include structural dynamics, controls, optics and thermodynamics. Such tools are needed in order to evaluate the end-to-end system performance of spacecraft such as OSI, POINTS, and SMMM. This paper illustrates the proof-of-concept tools that have been developed to establish the technology requirements and demonstrate the new features of integrated modeling and design. The current program also includes implementation of a prototype tool based upon the CAESY environment being developed under the NASA Guidance and Control Research and Technology Computational Controls Program. This prototype will be available late in FY-92. The development plan proposes a major software production effort to fabricate, deliver, support and maintain a national-class tool from FY-93 through FY-95.

  14. Current state of the art for statistical modeling of species distributions [Chapter 16

    Treesearch

    Troy M. Hegel; Samuel A. Cushman; Jeffrey Evans; Falk Huettmann

    2010-01-01

    Over the past decade the number of statistical modelling tools available to ecologists to model species' distributions has increased at a rapid pace (e.g. Elith et al. 2006; Austin 2007), as have the number of species distribution models (SDM) published in the literature (e.g. Scott et al. 2002). Ten years ago, basic logistic regression (Hosmer and Lemeshow 2000)...

  15. Integration of eHealth Tools in the Process of Workplace Health Promotion: Proposal for Design and Implementation

    PubMed Central

    2018-01-01

Background: Electronic health (eHealth) and mobile health (mHealth) tools can support and improve the whole process of workplace health promotion (WHP) projects. However, several challenges and opportunities have to be considered while integrating these tools in WHP projects. Currently, a large number of eHealth tools are developed for changing health behavior, but these tools can support the whole WHP process, including group administration, information flow, assessment, intervention development process, or evaluation. Objective: To support a successful implementation of eHealth tools in the whole WHP process, we introduce a concept of WHP (life cycle model of WHP) with 7 steps and present critical and success factors for the implementation of eHealth tools in each step. Methods: We developed a life cycle model of WHP based on the World Health Organization (WHO) model of the healthy workplace continual improvement process. We suggest adaptations to the WHO model to demonstrate the large number of possibilities to implement eHealth tools in WHP as well as possible critical points in the implementation process. Results: eHealth tools can enhance the efficiency of WHP in each of the 7 steps of the presented life cycle model of WHP. Specifically, eHealth tools can support by offering easier administration, providing an information and communication platform, supporting assessments, presenting and discussing assessment results in a dashboard, and offering interventions to change individual health behavior. Important success factors include the possibility to give automatic feedback about health parameters, create incentive systems, or bring together a large number of health experts in one place. Critical factors such as data security, anonymity, or lack of management involvement have to be addressed carefully to prevent nonparticipation and dropouts. Conclusions: Using eHealth tools can support WHP, but clear regulations for the usage and implementation of these tools at the workplace are needed to secure quality and reach sustainable results. PMID:29475828

  16. USGS River Ecosystem Modeling: Where Are We, How Did We Get Here, and Where Are We Going?

    USGS Publications Warehouse

    Hanson, Leanne; Schrock, Robin; Waddle, Terry; Duda, Jeffrey J.; Lellis, Bill

    2009-01-01

    This report developed as an outcome of the USGS River Ecosystem Modeling Work Group, convened on February 11, 2008 as a preconference session to the second USGS Modeling Conference in Orange Beach, Ala. Work Group participants gained an understanding of the types of models currently being applied to river ecosystem studies within the USGS, learned how model outputs are being used by a Federal land management agency, and developed recommendations for advancing the state of the art in river ecosystem modeling within the USGS. During a break-out session, participants restated many of the recommendations developed at the first USGS Modeling Conference in 2006 and in previous USGS needs assessments. All Work Group recommendations require organization and coordination across USGS disciplines and regions, and include (1) enhancing communications, (2) increasing efficiency through better use of current human and technologic resources, and (3) providing a national infrastructure for river ecosystem modeling resources, making it easier to integrate modeling efforts. By implementing these recommendations, the USGS will benefit from enhanced multi-disciplinary, integrated models for river ecosystems that provide valuable risk assessment and decision support tools for adaptive management of natural and managed riverine ecosystems. These tools generate key information that resource managers need and can use in making decisions about river ecosystem resources.

  17. Data Model Management for Space Information Systems

    NASA Technical Reports Server (NTRS)

Hughes, J. Steven; Crichton, Daniel J.; Ramirez, Paul; Mattmann, Chris

    2006-01-01

    The Reference Architecture for Space Information Management (RASIM) suggests the separation of the data model from software components to promote the development of flexible information management systems. RASIM allows the data model to evolve independently from the software components and results in a robust implementation that remains viable as the domain changes. However, the development and management of data models within RASIM are difficult and time consuming tasks involving the choice of a notation, the capture of the model, its validation for consistency, and the export of the model for implementation. Current limitations to this approach include the lack of ability to capture comprehensive domain knowledge, the loss of significant modeling information during implementation, the lack of model visualization and documentation capabilities, and exports being limited to one or two schema types. The advent of the Semantic Web and its demand for sophisticated data models has addressed this situation by providing a new level of data model management in the form of ontology tools. In this paper we describe the use of a representative ontology tool to capture and manage a data model for a space information system. The resulting ontology is implementation independent. Novel on-line visualization and documentation capabilities are available automatically, and the ability to export to various schemas can be added through tool plug-ins. In addition, the ingestion of data instances into the ontology allows validation of the ontology and results in a domain knowledge base. Semantic browsers are easily configured for the knowledge base. For example the export of the knowledge base to RDF/XML and RDFS/XML and the use of open source metadata browsers provide ready-made user interfaces that support both text- and facet-based search. This paper will present the Planetary Data System (PDS) data model as a use case and describe the import of the data model into an ontology tool. We will also describe the current effort to provide interoperability with the European Space Agency (ESA)/Planetary Science Archive (PSA) which is critically dependent on a common data model.
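
    The sketch below illustrates the general pattern with the open-source rdflib library: capture a tiny data model as an ontology, ingest a data instance, and export RDF/XML. The class and property names are hypothetical stand-ins, not the actual PDS model, and rdflib is not necessarily the tool the authors used.

        # Sketch of capturing a small data model as an ontology and exporting it
        # to RDF/XML with rdflib; class and property names are hypothetical.
        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF, RDFS

        PDS = Namespace("http://example.org/pds#")   # placeholder namespace
        g = Graph()
        g.bind("pds", PDS)

        # Model: a DataProduct class with a targetName attribute.
        g.add((PDS.DataProduct, RDF.type, RDFS.Class))
        g.add((PDS.targetName, RDF.type, RDF.Property))
        g.add((PDS.targetName, RDFS.domain, PDS.DataProduct))

        # A data instance, ingested to validate the model.
        g.add((PDS.product42, RDF.type, PDS.DataProduct))
        g.add((PDS.product42, PDS.targetName, Literal("Mars")))

        print(g.serialize(format="xml"))   # RDF/XML export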

  18. System and Software Reliability (C103)

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

Within the last decade, better reliability models (hardware, software, system) than those currently used have been theorized and developed, but not implemented in practice. Previous research on software reliability has shown that while some existing software reliability models are practical, they are not accurate enough. New paradigms of development (e.g., OO) have appeared, and associated reliability models have been proposed but not investigated. Hardware models have been extensively investigated but not integrated into a system framework. System reliability modeling is the weakest of the three. NASA engineers need better methods and tools to demonstrate that products meet NASA requirements for reliability measurement. For the new models for the software component developed over the last decade, there is a great need to bring them into a form in which they can be used on software-intensive systems. The Statistical Modeling and Estimation of Reliability Functions for Systems (SMERFS'3) tool is an existing vehicle that may be used to incorporate these new modeling advances. Adapting some existing software reliability modeling changes to accommodate major changes in software development technology may also show substantial improvement in prediction accuracy. With some additional research, the next step is to identify and investigate system reliability. System reliability models could then be incorporated in a tool such as SMERFS'3. This tool with better models would greatly add value in assessing GSFC projects.
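
    For concreteness, the sketch below fits one classic software reliability growth model of the family SMERFS supports, the Goel-Okumoto NHPP with mean value function m(t) = a(1 - e^(-bt)). The failure counts are synthetic, and the model choice is ours for illustration.

        # Fit of the Goel-Okumoto software reliability growth model,
        # m(t) = a * (1 - exp(-b t)), to synthetic cumulative failure data.
        import numpy as np
        from scipy.optimize import curve_fit

        def goel_okumoto(t, a, b):
            """Expected cumulative failures by test time t."""
            return a * (1.0 - np.exp(-b * t))

        weeks = np.arange(1, 11, dtype=float)
        cum_failures = np.array([8, 15, 20, 25, 28, 31, 33, 34, 35, 36], float)

        (a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, cum_failures, p0=(40, 0.2))
        print(f"total expected defects a={a_hat:.1f}, detection rate b={b_hat:.2f}")
        print(f"predicted failures by week 15: {goel_okumoto(15, a_hat, b_hat):.1f}")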

  19. Membrane Packing Problems: A short Review on computational Membrane Modeling Methods and Tools

    PubMed Central

    Sommer, Björn

    2013-01-01

The use of model membranes is currently part of the daily workflow for many biochemical and biophysical disciplines. These membranes are used to analyze the behavior of small substances, to simulate transport processes, to study the structure of macromolecules or for illustrative purposes. But how can these membrane structures be generated? This mini review discusses a number of ways to obtain these structures. First, the problem will be formulated as the Membrane Packing Problem. It will be shown that the theoretical problems of placing proteins and of placing lipids onto a membrane area differ significantly. Thus, two sub-problems will be defined and discussed. Then, different – partly historical – membrane modeling methods will be introduced. Finally, membrane modeling tools will be evaluated which are able to semi-automatically generate these model membranes and thus drastically accelerate and simplify the membrane generation process. The mini review concludes with advice about which tool is appropriate for which application case. PMID:24688707

  20. An efficient current-based logic cell model for crosstalk delay analysis

    NASA Astrophysics Data System (ADS)

    Nazarian, Shahin; Das, Debasish

    2013-04-01

    Logic cell modelling is an important component in the analysis and design of CMOS integrated circuits, mostly due to nonlinear behaviour of CMOS cells with respect to the voltage signal at their input and output pins. A current-based model for CMOS logic cells is presented, which can be used for effective crosstalk noise and delta delay analysis in CMOS VLSI circuits. Existing current source models are expensive and need a new set of Spice-based characterisation, which is not compatible with typical EDA tools. In this article we present Imodel, a simple nonlinear logic cell model that can be derived from the typical cell libraries such as NLDM, with accuracy much higher than NLDM-based cell delay models. In fact, our experiments show an average error of 3% compared to Spice. This level of accuracy comes with a maximum runtime penalty of 19% compared to NLDM-based cell delay models on medium-sized industrial designs.
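
    The core mechanism of a current-source model can be illustrated compactly: the cell's output current is tabulated over input and output pin voltages and interpolated at analysis time. The table below is invented for illustration; deriving a realistic table from NLDM data is the substance of the paper and is not reproduced here.

        # Core idea of a current-source cell model: drive current tabulated over
        # (Vin, Vout) and interpolated at simulation time. Table values are
        # illustrative; an Imodel-style table would be derived from NLDM data.
        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        vin = np.linspace(0.0, 1.0, 5)            # input pin voltage, V
        vout = np.linspace(0.0, 1.0, 5)           # output pin voltage, V
        # I(vin, vout): toy surface, stronger drive as both voltages rise.
        i_table = np.array([[vi * vo * 1e-3 for vo in vout] for vi in vin])

        i_drive = RegularGridInterpolator((vin, vout), i_table)

        def output_current(v_in, v_out):
            """Interpolated cell output current at an arbitrary operating point."""
            return float(i_drive((v_in, v_out)))

        print(output_current(0.63, 0.80))   # A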

  1. On-Line, Self-Learning, Predictive Tool for Determining Payload Thermal Response

    NASA Technical Reports Server (NTRS)

    Jen, Chian-Li; Tilwick, Leon

    2000-01-01

This paper will present the results of a joint ManTech / Goddard R&D effort, currently under way, to develop and test a computer-based, on-line, predictive simulation model for use by facility operators to predict the thermal response of a payload during thermal vacuum testing. Thermal response was identified as an area that could benefit from the algorithms developed by Dr. Jen for complex computer simulations. Most thermal vacuum test setups are unique since no two payloads have the same thermal properties. This requires that the operators depend on their past experiences to conduct the test, which requires time for them to learn how the payload responds while at the same time limiting any risk of exceeding hot or cold temperature limits. The predictive tool being developed is intended to be used with the new Thermal Vacuum Data System (TVDS) developed at Goddard for the Thermal Vacuum Test Operations group. This model can learn the thermal response of the payload by reading a few data points from the TVDS, accepting the payload's current temperature as the initial condition for prediction. The model can then be used as a predictive tool to estimate future payload temperatures according to a predetermined shroud temperature profile. If the error of prediction is too big, the model can be asked to re-learn the new situation on-line in real time and give a new prediction. Based on some preliminary tests, we feel this predictive model can forecast the payload temperature of the entire test cycle within 5 degrees Celsius after it has learned 3 times during the beginning of the test. The tool will allow the operator to play "what-if" experiments to decide on the best shroud temperature set-point control strategy. This tool will save money by minimizing guesswork and optimizing transitions as well as making the testing process safer and easier to conduct.
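
    A heavily simplified sketch of the idea follows: treat the payload as a first-order lag toward the shroud temperature, re-estimate the time constant from a few observed data points, then forward-simulate a set-point profile. The actual learning algorithm in the tool is more sophisticated than this, and all numbers are illustrative.

        # Minimal sketch: first-order lag model of the payload with an online
        # re-estimated time constant (not the tool's actual learning algorithm).

        def predict(t_payload, shroud_profile, tau, dt=60.0):
            """Forward-simulate payload temperature for a shroud set-point profile."""
            out = []
            for t_shroud in shroud_profile:
                t_payload += (dt / tau) * (t_shroud - t_payload)
                out.append(t_payload)
            return out

        def learn_tau(measured, shroud, dt=60.0):
            """Re-estimate tau from observed steps: dT = (dt/tau) * (Ts - T)."""
            rates = [(shroud[i] - measured[i]) / (measured[i + 1] - measured[i])
                     for i in range(len(measured) - 1)
                     if abs(measured[i + 1] - measured[i]) > 1e-6]
            return dt * sum(rates) / len(rates)

        observed = [20.0, 14.2, 9.3, 5.2]           # degC, read from the data system
        tau = learn_tau(observed, [-60.0] * 4)      # shroud held at -60 degC
        print(predict(observed[-1], [-60.0] * 10, tau))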

  2. Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide

    NASA Astrophysics Data System (ADS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.

    1993-03-01

A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960's and early 1970's and are highly unique to that design or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS, presents an example problem, and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.

  3. Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide

    NASA Technical Reports Server (NTRS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.

    1993-01-01

A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960's and early 1970's and are highly unique to that design or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS, presents an example problem, and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.

  4. FACET: Future ATM Concepts Evaluation Tool

    NASA Technical Reports Server (NTRS)

    Bilmoria, Karl D.; Banavar, Sridhar; Chatterji, Gano B.; Sheth, Kapil S.; Grabbe, Shon

    2000-01-01

FACET (Future ATM Concepts Evaluation Tool) is an Air Traffic Management research tool being developed at the NASA Ames Research Center. This paper describes the design, architecture and functionalities of FACET. The purpose of FACET is to provide a simulation environment for exploration, development and evaluation of advanced ATM concepts. Examples of these concepts include new ATM paradigms such as Distributed Air-Ground Traffic Management, airspace redesign and new Decision Support Tools (DSTs) for controllers working within the operational procedures of the existing air traffic control system. FACET is currently capable of modeling system-wide en route airspace operations over the contiguous United States. Airspace models (e.g., Center/sector boundaries, airways, locations of navigation aids and airports) are available from databases. A core capability of FACET is the modeling of aircraft trajectories. Using round-earth kinematic equations, aircraft can be flown along flight plan routes or great circle routes as they climb, cruise and descend according to their individual aircraft-type performance models. Performance parameters (e.g., climb/descent rates and speeds, cruise speeds) are obtained from data table lookups. Heading, airspeed and altitude-rate dynamics are also modeled. Additional functionalities will be added as necessary for specific applications. FACET software is written in the Java and C programming languages. It is platform-independent, and can be run on a variety of computers. FACET has been designed with a modular software architecture to enable rapid integration of research prototype implementations of new ATM concepts. Several advanced ATM concepts are currently being implemented in FACET: airborne separation assurance, dynamic density predictions, airspace redesign (re-sectorization), benefits of a controller DST for direct routing, and the integration of commercial space transportation system operations into the U.S. National Airspace System (NAS).
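
    The trajectory core described above can be illustrated with the standard great-circle destination formula on a spherical earth; this is a generic sketch of the technique, not FACET's implementation (which the abstract notes is written in Java and C).

        # Round-earth kinematic step along a great-circle course (generic sketch).
        import math

        R_EARTH_NM = 3440.065     # mean earth radius, nautical miles

        def step(lat, lon, course_deg, ground_speed_kt, dt_s):
            """Advance (lat, lon) in degrees along a great-circle course."""
            d = ground_speed_kt * dt_s / 3600.0 / R_EARTH_NM   # angular distance, rad
            lat1, lon1, crs = map(math.radians, (lat, lon, course_deg))
            lat2 = math.asin(math.sin(lat1) * math.cos(d)
                             + math.cos(lat1) * math.sin(d) * math.cos(crs))
            lon2 = lon1 + math.atan2(math.sin(crs) * math.sin(d) * math.cos(lat1),
                                     math.cos(d) - math.sin(lat1) * math.sin(lat2))
            return math.degrees(lat2), math.degrees(lon2)

        # One-minute step for an aircraft at 450 kt on a 060-degree course.
        print(step(39.86, -104.67, 60.0, 450.0, 60.0))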

  5. Developing Flexible, Integrated Hydrologic Modeling Systems for Multiscale Analysis in the Midwest and Great Lakes Region

    NASA Astrophysics Data System (ADS)

    Hamlet, A. F.; Chiu, C. M.; Sharma, A.; Byun, K.; Hanson, Z.

    2016-12-01

    Physically based hydrologic modeling of surface and groundwater resources that can be flexibly and efficiently applied to support water resources policy/planning/management decisions at a wide range of spatial and temporal scales are greatly needed in the Midwest, where stakeholder access to such tools is currently a fundamental barrier to basic climate change assessment and adaptation efforts, and also the co-production of useful products to support detailed decision making. Based on earlier pilot studies in the Pacific Northwest Region, we are currently assembling a suite of end-to-end tools and resources to support various kinds of water resources planning and management applications across the region. One of the key aspects of these integrated tools is that the user community can access gridded products at any point along the end-to-end chain of models, looking backwards in time about 100 years (1915-2015), and forwards in time about 85 years using CMIP5 climate model projections. The integrated model is composed of historical and projected future meteorological data based on station observations and statistical and dynamically downscaled climate model output respectively. These gridded meteorological data sets serve as forcing data for the macro-scale VIC hydrologic model implemented over the Midwest at 1/16 degree resolution. High-resolution climate model (4km WRF) output provides inputs for the analyses of urban impacts, hydrologic extremes, agricultural impacts, and impacts to the Great Lakes. Groundwater recharge estimated by the surface water model provides input data for fine-scale and macro-scale groundwater models needed for specific applications. To highlight the multi-scale use of the integrated models in support of co-production of scientific information for decision making, we briefly describe three current case studies addressing different spatial scales of analysis: 1) Effects of climate change on the water balance of the Great Lakes, 2) Future hydropower resources in the St. Joseph River basin, 3) Effects of climate change on carbon cycling in small lakes in the Northern Highland Lakes District.

  6. Design Science Research toward Designing/Prototyping a Repeatable Model for Testing Location Management (LM) Algorithms for Wireless Networking

    ERIC Educational Resources Information Center

    Peacock, Christopher

    2012-01-01

    The purpose of this research effort was to develop a model that provides repeatable Location Management (LM) testing using a network simulation tool, QualNet version 5.1 (2011). The model will provide current and future protocol developers a framework to simulate stable protocol environments for development. This study used the Design Science…

  7. A Bayesian Model for the Estimation of Latent Interaction and Quadratic Effects When Latent Variables Are Non-Normally Distributed

    ERIC Educational Resources Information Center

    Kelava, Augustin; Nagengast, Benjamin

    2012-01-01

    Structural equation models with interaction and quadratic effects have become a standard tool for testing nonlinear hypotheses in the social sciences. Most of the current approaches assume normally distributed latent predictor variables. In this article, we present a Bayesian model for the estimation of latent nonlinear effects when the latent…
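
    For concreteness, the structural part of such a latent interaction/quadratic model is commonly written as follows (standard LISREL-type notation; the cited paper's specific Bayesian priors are not reproduced here):

        \eta = \alpha + \gamma_1 \xi_1 + \gamma_2 \xi_2
               + \omega_{12}\,\xi_1 \xi_2 + \omega_{11}\,\xi_1^2 + \zeta

    where \eta is the latent outcome, \xi_1 and \xi_2 are latent predictors, \omega_{12} and \omega_{11} are the interaction and quadratic effects, and \zeta is the structural residual. The contribution described above is to estimate the \gamma and \omega coefficients without assuming \xi \sim N(0, \Phi).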

  8. Comparative analysis for various redox flow batteries chemistries using a cost performance model

    NASA Astrophysics Data System (ADS)

    Crawford, Alasdair; Viswanathan, Vilayanur; Stephenson, David; Wang, Wei; Thomsen, Edwin; Reed, David; Li, Bin; Balducci, Patrick; Kintner-Meyer, Michael; Sprenkle, Vincent

    2015-10-01

The total energy storage system cost is determined by means of a robust performance-based cost model for multiple flow battery chemistries. Systems aspects such as shunt current losses, pumping losses and various flow patterns through electrodes are accounted for. The system-cost-minimizing objective function determines stack design by optimizing the state of charge operating range, along with current density and current-normalized flow. The model cost estimates are validated using 2-kW stack performance data for the same size electrodes and operating conditions. Using our validated tool, it has been demonstrated that an optimized all-vanadium system has an estimated system cost of < $350 kWh-1 for a 4-h application. With an anticipated decrease in component costs facilitated by economies of scale from larger production volumes, coupled with performance improvements enabled by technology development, the system cost is expected to decrease to $160 kWh-1 for a 4-h application, and to $100 kWh-1 for a 10-h application. This tool has been shared with the redox flow battery community to enable cost estimation using their stack data and guide future direction.
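
    The flavor of the cost-minimizing trade-off can be seen in a toy version: raising current density shrinks the costly stack but erodes voltage efficiency, so cost per kWh has an interior minimum. Every number below is an illustrative placeholder, not a validated model input.

        # Toy version of the cost-performance trade-off: higher current density
        # shrinks the (costly) stack but lowers voltage efficiency.

        def system_cost_per_kwh(j_ma_cm2, energy_kwh=1000.0, hours=4.0,
                                stack_cost_per_m2=1500.0, chem_cost_per_kwh=120.0,
                                v_oc=1.4, asr_ohm_cm2=2.0):
            """Estimate $/kWh for a flow battery at current density j (mA/cm^2)."""
            j_a_m2 = j_ma_cm2 * 10.0                       # convert to A/m^2
            v_cell = v_oc - asr_ohm_cm2 * j_a_m2 / 1e4     # ohmic drop from ASR
            efficiency = v_cell / v_oc
            power_w = energy_kwh * 1000.0 / hours
            area_m2 = power_w / (v_cell * j_a_m2)          # total electrode area
            cost = (area_m2 * stack_cost_per_m2
                    + energy_kwh * chem_cost_per_kwh / efficiency)
            return cost / energy_kwh

        best = min(range(20, 320, 10), key=system_cost_per_kwh)
        print(best, round(system_cost_per_kwh(best), 1))   # mA/cm^2, $/kWh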

  9. Artificial neural networks in gynaecological diseases: current and potential future applications.

    PubMed

    Siristatidis, Charalampos S; Chrelias, Charalampos; Pouliakis, Abraham; Katsimanis, Evangelos; Kassanos, Dimitrios

    2010-10-01

    Current (and probably future) practice of medicine is mostly associated with prediction and accurate diagnosis. Especially in clinical practice, there is an increasing interest in constructing and using valid models of diagnosis and prediction. Artificial neural networks (ANNs) are mathematical systems being used as a prospective tool for reliable, flexible and quick assessment. They demonstrate high power in evaluating multifactorial data, assimilating information from multiple sources and detecting subtle and complex patterns. Their capability and difference from other statistical techniques lies in performing nonlinear statistical modelling. They represent a new alternative to logistic regression, which is the most commonly used method for developing predictive models for outcomes resulting from partitioning in medicine. In combination with the other non-algorithmic artificial intelligence techniques, they provide useful software engineering tools for the development of systems in quantitative medicine. Our paper first presents a brief introduction to ANNs, then, using what we consider the best available evidence through paradigms, we evaluate the ability of these networks to serve as first-line detection and prediction techniques in some of the most crucial fields in gynaecology. Finally, through the analysis of their current application, we explore their dynamics for future use.
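
    The methodological contrast drawn above, a nonlinear ANN versus linear-in-the-logit regression, is easy to demonstrate on synthetic data; the sketch below makes no clinical claim and uses scikit-learn only for illustration.

        # Toy illustration: a neural network captures a nonlinear decision
        # boundary that plain logistic regression misses. Synthetic data only.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(400, 2))
        y = (X[:, 0] * X[:, 1] > 0).astype(int)    # XOR-like, nonlinear outcome

        lr = LogisticRegression().fit(X[:300], y[:300])
        nn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                           random_state=0).fit(X[:300], y[:300])

        print("logistic regression:", lr.score(X[300:], y[300:]))
        print("neural network:     ", nn.score(X[300:], y[300:]))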

  10. Mussel dynamics model: A hydroinformatics tool for analyzing the effects of different stressors on the dynamics of freshwater mussel communities

    USGS Publications Warehouse

    Morales, Y.; Weber, L.J.; Mynett, A.E.; Newton, T.J.

    2006-01-01

A model for simulating freshwater mussel population dynamics is presented. The model is a hydroinformatics tool that integrates principles from ecology, river hydraulics, fluid mechanics and sediment transport, and applies the individual-based modelling approach for simulating population dynamics. The general model layout, data requirements, and steps of the simulation process are discussed. As an illustration, simulation results from an application in a 10 km reach of the Upper Mississippi River are presented. The model was used to investigate the spatial distribution of mussels and the effects of food competition in native unionid mussel communities, and communities infested by Dreissena polymorpha, the zebra mussel. Simulation results were found to be realistic and coincided with data obtained from the literature. These results indicate that the model can be a useful tool for assessing the potential effects of different stressors on long-term population dynamics, and consequently, may improve the current understanding of cause and effect relationships in freshwater mussel communities. © 2006 Elsevier B.V. All rights reserved.
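
    An individual-based model of this kind tracks each mussel and lets shared food mediate competition. The sketch below is a drastically simplified caricature with invented rates; it shows the modelling approach, not the paper's hydroinformatics tool.

        # Caricature of an individual-based mussel model: shared food mediates
        # growth and starvation mortality. All rates are invented placeholders.
        import random

        def simulate(n_unionid=200, n_zebra=0, years=10):
            """Return (survivors, mean shell length) after `years` seasons."""
            lengths = [10.0] * n_unionid                 # unionid shell lengths, mm
            for _ in range(years):
                food = 1.0 / (1.0 + 0.002 * n_zebra)     # zebra mussels deplete food
                death = min(0.9, 0.1 / food)             # starvation mortality rate
                lengths = [l + 5.0 * food for l in lengths
                           if random.random() > death]   # survivors grow with food
            return len(lengths), sum(lengths) / max(len(lengths), 1)

        random.seed(1)
        print("no zebra mussels:", simulate())
        print("heavily infested:", simulate(n_zebra=5000))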

  11. AgMIP Training in Multiple Crop Models and Tools

    NASA Technical Reports Server (NTRS)

    Boote, Kenneth J.; Porter, Cheryl H.; Hargreaves, John; Hoogenboom, Gerrit; Thornburn, Peter; Mutter, Carolyn

    2015-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) has the goal of using multiple crop models to evaluate climate impacts on agricultural production and food security in developed and developing countries. There are several major limitations that must be overcome to achieve this goal, including the need to train AgMIP regional research team (RRT) crop modelers to use models other than the ones they are currently familiar with, plus the need to harmonize and interconvert the disparate input file formats used for the various models. Two activities were followed to address these shortcomings among AgMIP RRTs to enable them to use multiple models to evaluate climate impacts on crop production and food security. We designed and conducted courses in which participants trained on two different sets of crop models, with emphasis on the model of least experience. In a second activity, the AgMIP IT group created templates for inputting data on soils, management, weather, and crops into AgMIP harmonized databases, and developed translation tools for converting the harmonized data into files that are ready for multiple crop model simulations. The strategies for creating and conducting the multi-model course and developing entry and translation tools are reviewed in this chapter.
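
    The sketch below shows what a translation tool of this kind does structurally: harmonized records rendered into a model-specific text layout. The output format here is invented for illustration; actual AgMIP translators target real crop-model formats with their own conventions.

        # Flavor of an AgMIP-style translation step: harmonized weather records
        # rendered into a model-specific input file (invented layout).

        harmonized = [  # date, srad (MJ/m2), tmax (C), tmin (C), rain (mm)
            {"date": "2015-06-01", "srad": 22.1, "tmax": 31.2, "tmin": 18.4, "rain": 0.0},
            {"date": "2015-06-02", "srad": 18.7, "tmax": 29.8, "tmin": 17.9, "rain": 5.2},
        ]

        def to_model_weather(records):
            """Render harmonized records as fixed-width weather-file lines."""
            lines = ["*WEATHER: example station"]
            for r in records:
                tag = r["date"].replace("-", "")[2:]     # crude YYMMDD tag
                lines.append(f"{tag:>7}{r['srad']:6.1f}{r['tmax']:6.1f}"
                             f"{r['tmin']:6.1f}{r['rain']:6.1f}")
            return "\n".join(lines)

        print(to_model_weather(harmonized))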

  12. Analysis and numerical modelling of eddy current damper for vibration problems

    NASA Astrophysics Data System (ADS)

    Irazu, L.; Elejabarrieta, M. J.

    2018-07-01

This work discusses a contactless eddy current damper, which is used to attenuate structural vibration. Eddy currents can remove energy from dynamic systems without any contact and, thus, without adding mass or modifying the rigidity of the structure. An experimental modal analysis of a cantilever beam in the absence of and under a partial magnetic field is conducted in the bandwidth of 0-1 kHz. The results show that the eddy current phenomenon can attenuate the vibration of the entire structure without modifying the natural frequencies or the mode shapes of the structure itself. In this study, a new inverse method to numerically determine the dynamic properties of the contactless eddy current damper is proposed. The proposed inverse method and the eddy current model based on a linear viscous force are validated by a practical application. The numerically obtained transfer function correlates with the experimental one, thus showing good agreement in the entire bandwidth of 0-1 kHz. The proposed method provides an easy and quick tool to model and predict the dynamic behaviour of the contactless eddy current damper, thereby avoiding the use of complex analytical models.
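
    The linear viscous model mentioned above can be written compactly. A single-mode reduction (our assumption, for illustration) gives:

        F_e(t) = -c_e\,\dot{w}(t), \qquad
        H(\omega) = \frac{W(\omega)}{F(\omega)}
                  = \frac{1}{k - m\,\omega^2 + \mathrm{i}\,(c_s + c_e)\,\omega}

    where m, k and c_s are the modal mass, stiffness and structural damping of a beam mode, and c_e is the eddy current damping coefficient identified by the inverse method. Fitting c_e to the measured transfer function is what keeps the approach contactless: only input-output vibration data are needed.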

  13. Optimization of Surface Roughness Parameters of Al-6351 Alloy in EDC Process: A Taguchi Coupled Fuzzy Logic Approach

    NASA Astrophysics Data System (ADS)

    Kar, Siddhartha; Chakraborty, Sujoy; Dey, Vidyut; Ghosh, Subrata Kumar

    2017-10-01

This paper investigates the application of the Taguchi method with fuzzy logic for multi-objective optimization of roughness parameters in the electro-discharge coating process of Al-6351 alloy with a powder metallurgically compacted SiC/Cu tool. A Taguchi L16 orthogonal array was employed to investigate the roughness parameters by varying tool parameters, such as composition and compaction load, and electro-discharge machining parameters, such as pulse-on time and peak current. Crucial roughness parameters, such as centre-line average roughness, average maximum height of the profile, and mean spacing of local peaks of the profile, were measured on the coated specimen. The signal-to-noise ratios were fuzzified to optimize the roughness parameters through a single comprehensive output measure (COM). The best COM was obtained with lower values of compaction load, pulse-on time and peak current, and a 30:70 (SiC:Cu) tool composition. Analysis of variance was carried out, and a significant COM model was observed, with peak current yielding the highest contribution, followed by pulse-on time, compaction load and composition. The deposited layer was characterised by X-ray diffraction analysis, which confirmed the presence of tool materials on the workpiece surface.
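
    The signal-to-noise and COM computation can be sketched as follows; here the fuzzy-logic aggregation is reduced to a weighted average of S/N ratios, a simplification of the paper's rule-based fuzzification, and all measurements and weights are invented.

        # Sketch of the Taguchi pipeline: smaller-the-better S/N ratios for each
        # roughness parameter, combined into one comprehensive output measure.
        import math

        def sn_smaller_is_better(values):
            """Taguchi S/N ratio (dB) for a smaller-the-better response."""
            return -10.0 * math.log10(sum(v * v for v in values) / len(values))

        # Replicated roughness measurements for one trial (um) -- illustrative.
        trial = {"Ra": [2.1, 2.3], "Rz": [11.0, 12.4], "RSm": [180.0, 150.0]}
        weights = {"Ra": 0.5, "Rz": 0.3, "RSm": 0.2}   # assumed importance

        sn = {k: sn_smaller_is_better(v) for k, v in trial.items()}
        com = sum(weights[k] * sn[k] for k in sn)      # stand-in for fuzzy COM
        print(sn, round(com, 2))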

  14. Influx: A Tool and Framework for Reasoning under Uncertainty

    DTIC Science & Technology

    2015-09-01

Interfaces to external programs: Not all types of problems are naturally suited to being entirely modelled and implemented within Influx1. In general... development pertaining to the implementation of the reasoning tool and specific applications is not included in this document. ...in which case a probability is supposed to reflect the subjective belief of an agent for the problem at hand (based on its experience and/or current state...

  15. Transitioning Human, Social, Cultural Behavior (HSCB) Models and Simulations to the Operational User1

    DTIC Science & Technology

    2009-10-01

...current M&S covering support to operations, human behaviour representation, asymmetric warfare, defence against terrorism and... methods, tools, data, intellectual capital, and processes to address these capability requirements. Fourth, there is a need to compare capability... requirements to current capabilities to identify gaps that may be addressed with DoD HSCB methods, tools, data, intellectual capital, and process...

  16. Health information systems: failure, success and improvisation.

    PubMed

    Heeks, Richard

    2006-02-01

The generalised assumption of health information systems (HIS) success is questioned by a few commentators in the medical informatics field. They point to widespread HIS failure. The purpose of this paper was therefore to develop a better conceptual foundation for, and practical guidance on, health information systems failure (and success). The study used literature and case analysis plus pilot testing of the developed model. Defining HIS failure and success is complex, and the current evidence base on HIS success and failure rates was found to be weak. Nonetheless, the best current estimate is that HIS failure is an important problem. The paper therefore derives and explains the "design-reality gap" conceptual model. This is shown to be robust in explaining multiple cases of HIS success and failure, yet provides a contingency that encompasses the differences which exist in different HIS contexts. The design-reality gap model is piloted to demonstrate its value as a tool for risk assessment and mitigation on HIS projects. It also throws into question traditional, structured development methodologies, highlighting the importance of emergent change and improvisation in HIS. The design-reality gap model can be used to address the problem of HIS failure, both as a post hoc evaluative tool and as a pre hoc risk assessment and mitigation tool. It also validates a set of methods, techniques, roles and competencies needed to support the dynamic improvisations that are found to underpin cases of HIS success.

  17. Model Verification and Validation Using Graphical Information Systems Tools

    DTIC Science & Technology

    2013-07-31

...Geomorphic Measurements... to a model. Ocean flows, which are organized current systems, transport heat and salinity and cause water to pile up as a water surface...

  18. A coupled modeling approach to assess the impact of fuel treatments on post-wildfire runoff and erosion

    USDA-ARS?s Scientific Manuscript database

    The hydrological consequences of wildfires are some of the most significant and long-lasting effects. Since wildfire severity impacts post-fire hydrological response, fuel treatments can be a useful tool for land managers to moderate this response. However, current models focus on only one aspect of...

  19. Kinetic Simulation and Energetic Neutral Atom Imaging of the Magnetosphere

    NASA Technical Reports Server (NTRS)

    Fok, Mei-Ching H.

    2011-01-01

    Advanced simulation tools and measurement techniques have been developed to study the dynamic magnetosphere and its response to drivers in the solar wind. The Comprehensive Ring Current Model (CRCM) is a kinetic code that solves for the 3D distribution in space, energy, and pitch angle of energetic ions and electrons. Energetic Neutral Atom (ENA) imagers have been flown on past and current satellite missions, and the global morphology of energetic ions is revealed by the observed ENA images. We have combined simulation and ENA analysis techniques to study the development of ring current ions during magnetic storms and substorms. We identify the timing and location of particle injection and loss. We examine the evolution of ion energy and pitch-angle distribution during different phases of a storm. In this talk we will discuss the findings from our ring current studies and how our simulation and ENA analysis tools can be applied to the upcoming TRIO-CINAMA mission.

  20. Information Technology: A Tool to Cut Health Care Costs

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Maly, K. J.; Overstreet, C. M.; Foudriat, E. C.

    1996-01-01

    Old Dominion University embarked on a project to see how current computer technology could be applied to reduce the cost and/or improve the efficiency of health care services. We designed and built a prototype for an integrated medical record system (MRS). The MRS is written in Tool Command Language/Toolkit (Tcl/Tk). While the initial version of the prototype had patient information hard-coded into the system, later versions used an INGRES database for storing patient information. Currently, we have proposed an object-oriented model for implementing the MRS. These projects involve developing information systems for physicians and medical researchers to enhance their ability to provide improved treatment at reduced cost. The move to computerized patient records is well underway, several standards exist for laboratory records, and several groups are working on standards for other portions of the patient record.

  1. Kate's Model Verification Tools

    NASA Technical Reports Server (NTRS)

    Morgan, Steve

    1991-01-01

    Kennedy Space Center's Knowledge-based Autonomous Test Engineer (KATE) is capable of monitoring electromechanical systems, diagnosing their errors, and even repairing them when they crash. A survey of KATE's developer/modelers revealed that they were already using a sophisticated set of productivity enhancing tools. They did request five more, however, and those make up the body of the information presented here: (1) a transfer function code fitter; (2) a FORTRAN-Lisp translator; (3) three existing structural consistency checkers to aid in syntax checking their modeled device frames; (4) an automated procedure for calibrating knowledge base admittances to protect KATE's hardware mockups from inadvertent hand valve twiddling; and (5) three alternatives for the 'pseudo object', a programming patch that currently apprises KATE's modeling devices of their operational environments.

  2. Developing a Computational Environment for Coupling MOR Data, Maps, and Models: The Virtual Research Vessel (VRV) Prototype

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; O'Dea, E.; Cushing, J. B.; Cuny, J. E.; Toomey, D. R.; Hackett, K.; Tikekar, R.

    2001-12-01

    The East Pacific Rise (EPR) from 9-10 deg. N is currently our best-studied section of fast-spreading mid-ocean ridge. During several decades of investigation it has been explored by the full spectrum of ridge investigators, including chemists, biologists, geologists and geophysicists. These studies, and those that are ongoing, provide a wealth of observational data, results and data-driven theoretical (often numerical) studies that have not yet been fully utilized either by research scientists or by professional educators. While the situation is improving, a large amount of data, results, and related theoretical models still exist either in an inert, non-interactive form (e.g., journal publications) or as unlinked and currently incompatible computer data or algorithms. Infrastructure is needed not just for ready access to data, but for linkage of disparate data sets (data to data) as well as data to models, in order to quantitatively evaluate hypotheses, refine numerical simulations, and explore new relations between observables. The prototype of a computational environment and toolset, called the Virtual Research Vessel (VRV), is being developed to provide scientists and educators with ready access to data, results and numerical models. While this effort is focused on the EPR 9N region, the resulting software tools and infrastructure should be helpful in establishing similar systems for other sections of the global mid-ocean ridge. Work in progress includes efforts to develop: (1) a virtual database to incorporate diverse data types with domain-specific metadata into a global schema that allows web queries across different marine geology data sets, and an analogous declarative (database-available) description of tools and models; (2) the ability to move data between GIS and the above DBMS, and tools to encourage data submission to archives; (3) tools for finding and viewing archives, and translating between formats; (4) support for "computational steering" (tool composition) and model coupling (e.g., the ability to run tool compositions locally but access input data from the web, APIs to support coupling such as invoking programs that are running remotely, and help in writing data wrappers to publish programs); (5) support of migration paths for prototyped model coupling; and (6) export of marine geological data and data analysis to the undergraduate classroom (VRV-ET, "Educational Tool"). See the main VRV web site at http://oregonstate.edu/dept/vrv and the VRV-ET web site at: http://www.cs.uoregon.edu/research/vrv-et.

  3. Assessment of Hybrid Coordinate Model Velocity Fields During Agulhas Return Current 2012 Cruise

    DTIC Science & Technology

    2013-06-01

    Forecasts GDEM Generalized Digital Environmental Model GPS Global Positioning System HYCOM HYbrid Coordinate Ocean Model MICOM Miami Isopycnal...speed profiles was climatology from the Generalized Digital Environmental Model (GDEM; Teague et al. 1990). Made operational in 1999, the Modular... GDEM was the only tool a naval oceanographer had at his or her disposal to characterize ocean conditions where in-situ observations could not be

  4. Mathematical modelling and numerical simulation of forces in milling process

    NASA Astrophysics Data System (ADS)

    Turai, Bhanu Murthy; Satish, Cherukuvada; Prakash Marimuthu, K.

    2018-04-01

    Machining of material by milling induces forces that act on the workpiece and, in turn, on the machining tool. The forces involved in the milling process can be quantified; mathematical models help to predict them. A lot of research has been carried out in this area in the past few decades. The current research aims at developing a mathematical model to predict the forces that arise at different levels during machining of Aluminium 6061 alloy. Finite element analysis was used to develop an FE model to predict the cutting forces, and simulations were run for varying cutting conditions. Experiments were designed using the Taguchi method: an L9 orthogonal array was constructed and the output was measured for each experiment. These measurements were then used to develop the mathematical model.
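
    A minimal sketch of the design-to-model step described above, assuming hypothetical factors, levels, and force measurements (none of these values come from the paper):

      # Hedged sketch: fitting a linear cutting-force model to a Taguchi L9
      # design. Factor names, levels, and measured forces are placeholders.
      import numpy as np

      # L9 orthogonal array: 3 factors at 3 levels each (level indices 1-3).
      L9 = np.array([
          [1, 1, 1], [1, 2, 2], [1, 3, 3],
          [2, 1, 2], [2, 2, 3], [2, 3, 1],
          [3, 1, 3], [3, 2, 1], [3, 3, 2],
      ])
      levels = {
          "speed_m_min": [60, 90, 120],        # cutting speed (assumed)
          "feed_mm_tooth": [0.05, 0.10, 0.15], # feed per tooth (assumed)
          "depth_mm": [0.5, 1.0, 1.5],         # depth of cut (assumed)
      }
      X = np.column_stack([
          np.take(levels["speed_m_min"], L9[:, 0] - 1),
          np.take(levels["feed_mm_tooth"], L9[:, 1] - 1),
          np.take(levels["depth_mm"], L9[:, 2] - 1),
      ])
      # Measured cutting force (N) for each run; placeholder data.
      F = np.array([120., 210., 330., 150., 280., 190., 200., 180., 290.])

      # Least-squares fit of F = b0 + b1*speed + b2*feed + b3*depth.
      A = np.column_stack([np.ones(len(F)), X])
      coef, *_ = np.linalg.lstsq(A, F, rcond=None)
      print("fitted coefficients:", coef)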

  5. Analyzing Human-Landscape Interactions: Tools That Integrate

    NASA Astrophysics Data System (ADS)

    Zvoleff, Alex; An, Li

    2014-01-01

    Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature—in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool, and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for analysts to make the broadest possible array of information available to researchers, environmental managers, and policymakers. Further development of new approaches of data fusion and integration across sites or disciplines pose an important challenge for future work in integrating human and landscape components.

  6. Which benefits in the use of a modeling platform : The VSoil example.

    NASA Astrophysics Data System (ADS)

    Lafolie, François; Cousin, Isabelle; Mollier, Alain; Pot, Valérie; Maron, Pierre-Alain; Moitrier, Nicolas; Nouguier, Cedric; Moitrier, Nathalie; Beudez, Nicolas

    2015-04-01

    In the environmental community, the need to couple models and the associated knowledge has emerged recently. The development of a coupling tool or a modeling platform is mainly driven by the need to create models that account for multiple processes and for the feedback between these processes. Models focusing on a restricted number of processes exist, so coupling these numerical tools appears to be an efficient and rapid means of filling the identified gaps. Several tools have been proposed: OMS3 (David et al. 2013); the CSDMS framework (Peckham et al. 2013); and the Open MI project developed within the frame of the European Community (Open MI, 2011). However, what we should expect from a modeling platform could be more ambitious than merely coupling existing numerical codes. We believe we need to easily share not only our numerical representations but also the knowledge attached to them. We need to develop complex models rapidly and easily, so as to have tools that respond to current issues in soil functioning and soil evolution under global change. We also need to share, in a common frame, our visions of soil functioning at various scales: on the one hand to strengthen our collaborations, and on the other to make them visible to other communities working on environmental issues. The presentation will briefly introduce the VSoil platform, which manipulates both concepts of soil processes and their numerical representations. The tool helps in assembling modules to create a model and automatically generates executable code and a GUI. The tool's potential will be illustrated on a few selected cases.

  7. TokenPasser: A petri net specification tool. Thesis

    NASA Technical Reports Server (NTRS)

    Mittmann, Michael

    1991-01-01

    In computer program design it is essential to know the effectiveness of different design options in improving performance and dependability. This paper provides a description of a CAD tool for distributed hierarchical Petri nets. After a brief review of Petri nets, Petri net languages, and Petri net transducers, and descriptions of several current Petri net tools, the specifications and design of the TokenPasser tool are presented. TokenPasser is a tool that allows design of distributed hierarchical systems based on Petri nets. A case study for an intelligent robotic system is conducted: a coordination structure with one dispatcher controlling three coordinators is built to model a proposed robotic assembly system. The system is implemented using TokenPasser, and the results are analyzed to allow judgment of the tool.
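
    A minimal sketch of the token-firing semantics that Petri net tools such as TokenPasser build on (the toy net below is illustrative, not from the thesis):

      # Hedged sketch of basic Petri net semantics: a transition fires when
      # all its input places hold enough tokens, consuming and producing
      # tokens. Toy two-place net; not the thesis's model.
      marking = {"p_ready": 1, "p_busy": 0}
      transitions = {
          "start":  ({"p_ready": 1}, {"p_busy": 1}),   # (consumes, produces)
          "finish": ({"p_busy": 1}, {"p_ready": 1}),
      }

      def enabled(name):
          need, _ = transitions[name]
          return all(marking[p] >= n for p, n in need.items())

      def fire(name):
          need, make = transitions[name]
          assert enabled(name), f"{name} not enabled"
          for p, n in need.items():
              marking[p] -= n
          for p, n in make.items():
              marking[p] += n

      fire("start"); fire("finish")
      print(marking)   # back to the initial marking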

  8. 76 FR 13663 - Cooper Tools, Currently Known as Apex Tool Group, LLC, Hicksville, OH; Amended Certification...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-14

    ... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-71,652] Cooper Tools, Currently... Adjustment Assistance on April 27, 2010, applicable to workers of Cooper Tools, Hicksville, Ohio. The workers... purchased Cooper Tools and is currently known as Apex Tool Group, LLC. Some workers separated from...

  9. Current Challenges in Health Economic Modeling of Cancer Therapies: A Research Inquiry

    PubMed Central

    Miller, Jeffrey D.; Foley, Kathleen A.; Russell, Mason W.

    2014-01-01

    Background The demand for economic models that evaluate cancer treatments is increasing, as healthcare decision makers struggle for ways to manage their budgets while providing the best care possible to patients with cancer. Yet, after nearly 2 decades of cultivating and refining techniques for modeling the cost-effectiveness and budget impact of cancer therapies, serious methodologic and policy challenges have emerged that question the adequacy of economic modeling as a sound decision-making tool in oncology. Objectives We sought to explore some of the contentious issues associated with the development and use of oncology economic models as informative tools in current healthcare decision-making. Our objective was to draw attention to these complex pharmacoeconomic concerns and to promote discussion within the oncology and health economics research communities. Methods Using our combined expertise in health economics research and economic modeling, we structured our inquiry around the following 4 questions: (1) Are economic models adequately addressing questions relevant to oncology decision makers; (2) What are the methodologic limitations of oncology economic models; (3) What guidelines are followed for developing oncology economic models; and (4) Is the evolution of oncology economic modeling keeping pace with treatment innovation? Within the context of each of these questions, we discuss issues related to the technical limitations of oncology modeling, the availability of adequate data for developing models, and the problems with how modeling analyses and results are presented and interpreted. Discussion There is general acceptance that economic models are good, essential tools for decision-making, but the practice of oncology and its rapidly evolving technologies present unique challenges that make assessing and demonstrating value especially complex. There is wide latitude for improvement in oncology modeling methodologies and how model results are presented and interpreted. Conclusion Complex technical and data availability issues with oncology economic modeling pose serious concerns that need to be addressed. It is our hope that this article will provide a framework to guide future discourse on this important topic. PMID:24991399

  10. Current challenges in health economic modeling of cancer therapies: a research inquiry.

    PubMed

    Miller, Jeffrey D; Foley, Kathleen A; Russell, Mason W

    2014-05-01

    The demand for economic models that evaluate cancer treatments is increasing, as healthcare decision makers struggle for ways to manage their budgets while providing the best care possible to patients with cancer. Yet, after nearly 2 decades of cultivating and refining techniques for modeling the cost-effectiveness and budget impact of cancer therapies, serious methodologic and policy challenges have emerged that question the adequacy of economic modeling as a sound decision-making tool in oncology. We sought to explore some of the contentious issues associated with the development and use of oncology economic models as informative tools in current healthcare decision-making. Our objective was to draw attention to these complex pharmacoeconomic concerns and to promote discussion within the oncology and health economics research communities. Using our combined expertise in health economics research and economic modeling, we structured our inquiry around the following 4 questions: (1) Are economic models adequately addressing questions relevant to oncology decision makers; (2) What are the methodologic limitations of oncology economic models; (3) What guidelines are followed for developing oncology economic models; and (4) Is the evolution of oncology economic modeling keeping pace with treatment innovation? Within the context of each of these questions, we discuss issues related to the technical limitations of oncology modeling, the availability of adequate data for developing models, and the problems with how modeling analyses and results are presented and interpreted. There is general acceptance that economic models are good, essential tools for decision-making, but the practice of oncology and its rapidly evolving technologies present unique challenges that make assessing and demonstrating value especially complex. There is wide latitude for improvement in oncology modeling methodologies and how model results are presented and interpreted. Complex technical and data availability issues with oncology economic modeling pose serious concerns that need to be addressed. It is our hope that this article will provide a framework to guide future discourse on this important topic.

  11. Statistical methods for the forensic analysis of striated tool marks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoeksema, Amy Beth

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts, which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
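
    A minimal sketch of the likelihood-ratio comparison described above, assuming normal score distributions and placeholder similarity scores (the paper's statistic and data differ):

      # Hedged sketch: compare lab-vs-field similarity scores against
      # lab-vs-lab scores, testing "same distribution" vs "different"
      # with a normal-model likelihood ratio. All values are placeholders.
      import numpy as np
      from scipy import stats

      lab_lab   = np.array([0.82, 0.79, 0.85, 0.80, 0.83])
      lab_field = np.array([0.76, 0.74, 0.78, 0.73, 0.77])

      def loglik(x):
          mu, sd = x.mean(), x.std(ddof=0) + 1e-9   # MLE under normality
          return stats.norm.logpdf(x, mu, sd).sum()

      pooled = np.concatenate([lab_lab, lab_field])
      # LR statistic: 2 * (separate-model loglik - pooled-model loglik)
      lr = 2 * (loglik(lab_lab) + loglik(lab_field) - loglik(pooled))
      p = stats.chi2.sf(lr, df=2)   # two extra parameters when separate
      print(f"LR = {lr:.2f}, approx p = {p:.3f}")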

  12. Aeroservoelastic Modeling of Body Freedom Flutter for Control System Design

    NASA Technical Reports Server (NTRS)

    Ouellette, Jeffrey

    2017-01-01

    One of the most severe forms of coupling between aeroelasticity and flight dynamics is an instability called body freedom flutter. Existing tools often assume relatively weak coupling and are therefore unable to model body freedom flutter accurately. Because the existing tools were developed from traditional flutter analysis models, the resulting models contain inconsistencies that make them incompatible with control system design tools. To resolve these issues, a number of small but significant changes have been made to the existing approaches. A frequency-domain transformation is applied to the unsteady aerodynamics to ensure a more physically consistent stability-axis rational function approximation of the unsteady aerodynamic model. The aerodynamic model is augmented with additional terms to account for limitations of the baseline unsteady aerodynamic model and for gravity forces. An assumed-modes method is used for the structural model to ensure a consistent definition of the aircraft states across the flight envelope. X-56A stiff-wing flight-test data were used to validate the current modeling approach. The flight-test data do not show body freedom flutter, but do show coupling between the flight dynamics and the aeroelastic dynamics, as well as the effects of fuel weight.
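
    The abstract does not give the exact form, but a common rational function approximation of the unsteady aerodynamics (Roger's form, shown here only as context; the paper's stability-axis formulation may differ in its lag structure) is

      \[
      \mathbf{Q}(\bar{s}) \;\approx\; \mathbf{A}_0 + \mathbf{A}_1\,\bar{s} + \mathbf{A}_2\,\bar{s}^{2}
      \;+\; \sum_{i=1}^{n_\ell} \mathbf{A}_{i+2}\,\frac{\bar{s}}{\bar{s}+b_i},
      \qquad \bar{s} = \frac{s\,\bar{c}}{2V},
      \]

    where Q is the generalized aerodynamic force matrix, s-bar the nondimensional Laplace variable, c-bar the reference chord, V the airspeed, and the b_i fixed aerodynamic lag roots.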

  13. Top-level modeling of an ALS system utilizing object-oriented techniques

    NASA Astrophysics Data System (ADS)

    Rodriguez, L. F.; Kang, S.; Ting, K. C.

    The possible configuration of an Advanced Life Support (ALS) System capable of supporting human life for long-term space missions continues to evolve as researchers investigate potential technologies and configurations. To facilitate the decision process, the development of acceptable, flexible, and dynamic mathematical computer modeling tools capable of system-level analysis is desirable. Object-oriented techniques have been adopted to develop a dynamic top-level model of an ALS system. This approach has several advantages; among these, object-oriented abstractions of systems are inherently modular in architecture. Thus, models can initially be somewhat simplistic, while allowing for adjustments and improvements. In addition, by coding the model in Java, the model can be implemented via the World Wide Web, greatly encouraging its utilization. Systems analysis is further enabled with the utilization of a readily available backend database containing information supporting the model. The subsystem models of the ALS system model include Crew, Biomass Production, Waste Processing and Resource Recovery, Food Processing and Nutrition, and the Interconnecting Space. Each subsystem model and an overall model have been developed. Presented here is the procedure utilized to develop the modeling tool, the vision for the modeling tool, and the current focus of each of the subsystem models.
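
    A minimal sketch of the object-oriented, modular structure described above (the actual model is implemented in Java; the class names and empty update steps here are illustrative Python stand-ins):

      # Hedged sketch: each ALS subsystem is a class with a daily update
      # step, and a top-level model wires them together. Illustrative only.
      class Subsystem:
          def step(self, day: int) -> None:
              raise NotImplementedError

      class Crew(Subsystem):
          def step(self, day): pass      # consume food/O2, produce CO2/waste

      class BiomassProduction(Subsystem):
          def step(self, day): pass      # grow crops from CO2, water, light

      class WasteProcessing(Subsystem):
          def step(self, day): pass      # recover water and nutrients

      class ALSSystemModel:
          def __init__(self):
              self.subsystems = [Crew(), BiomassProduction(), WasteProcessing()]
          def run(self, days: int):
              for day in range(days):
                  for s in self.subsystems:
                      s.step(day)

      ALSSystemModel().run(30)   # simulate 30 days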

  14. The Ottawa Model of Research Use: a guide to clinical innovation in the NICU.

    PubMed

    Hogan, Debora L; Logan, Jo

    2004-01-01

    To improve performance of a neonatal transport team by implementing a research-based family assessment instrument. Objectives included providing a structure for evaluating families and fostering the healthcare relationship. Neonatal transports are associated with family crises. Transport teams require a comprehensive framework to accurately assess family responses to adversity and tools to guide their practice toward parental mastery of the event. Currently, there are no assessment tools that merge family nursing expertise with neonatal transport. A family assessment tool grounded in contemporary family nursing theory and research was developed by a clinical nurse specialist. The Ottawa Model of Research Use guided the process of piloting the innovation with members of a transport team. Focus groups, interviews, and surveys were conducted to create profiles of barriers and facilitators to research use by team members. Tailored research transfer strategies were enacted based on the profile results. Formative evaluations demonstrated improvements in team members' perceptions of their knowledge, family centeredness, and ability to assess and intervene with families. The family assessment tool is currently being incorporated into Clinical Practice Guidelines for Transport and thus will be considered standard care. Use of a family assessment tool is an effective way of appraising families and addressing suffering. The Ottawa Model of Research Use provided a framework for implementing the clinical innovation. A key role of the clinical nurse specialist is to influence nursing practice by fostering research use by practitioners. When developing and implementing a clinical innovation, input from end users and consumers is pivotal. Incorporating the innovation into a practice guideline provides a structure to imbed research evidence into practice.

  15. Past, present and prospect of an Artificial Intelligence (AI) based model for sediment transport prediction

    NASA Astrophysics Data System (ADS)

    Afan, Haitham Abdulmohsin; El-shafie, Ahmed; Mohtar, Wan Hanna Melini Wan; Yaseen, Zaher Mundher

    2016-10-01

    An accurate model for sediment prediction is a priority for all hydrological researchers. Many conventional methods have shown an inability to achieve an accurate prediction of suspended sediment. These methods are unable to capture the behaviour of sediment transport in rivers due to the complexity, noise, non-stationarity, and dynamism of the sediment pattern. In the past two decades, Artificial Intelligence (AI) and computational approaches have become a remarkable tool for developing accurate models. These approaches are considered a powerful tool for solving any non-linear model, as they can deal easily with large amounts of data and sophisticated models. This paper is a review of the AI approaches that have been applied in sediment modelling. The current research focuses on the development of AI applications in sediment transport. In addition, the review identifies major challenges and opportunities for prospective research. Throughout the literature, complementary models have proven superior to classical modelling.

  16. Results from a workshop on research needs for modeling aquifer thermal energy storage systems

    NASA Astrophysics Data System (ADS)

    Drost, M. K.

    1990-08-01

    A workshop on aquifer thermal energy storage (ATES) system modeling was conducted by Pacific Northwest Laboratory (PNL). The goal of the workshop was to develop a list of high-priority research activities that would facilitate the commercial success of ATES. During the workshop, participants reviewed currently available modeling tools for ATES systems and produced a list of significant issues related to modeling ATES systems. Participants assigned a priority to each issue on the list by voting and developed a list of research needs for each of four high-priority research areas: the need for a feasibility study model, the need for engineering design models, the need for aquifer characterization, and the need for an economic model. The workshop participants concluded that ATES commercialization can be accelerated by aggressive development of ATES modeling tools and made specific recommendations for that development.

  17. Predictive and Prognostic Models: Implications for Healthcare Decision-Making in a Modern Recession

    PubMed Central

    Vogenberg, F. Randy

    2009-01-01

    Various modeling tools have been developed to address the lack of standardized processes that incorporate the perspectives of all healthcare stakeholders. Such models can assist in the decision-making process aimed at achieving specific clinical outcomes, as well as guide the allocation of healthcare resources and reduce costs. The current efforts in Congress to change the way healthcare is financed, reimbursed, and delivered have rendered the incorporation of modeling tools into clinical decision-making all the more important. Prognostic and predictive models are particularly relevant to healthcare, especially in clinical decision-making, with implications for payers, patients, and providers. The use of these models is likely to increase as providers and patients seek to improve their clinical decision process to achieve better outcomes while reducing overall healthcare costs. PMID:25126292

  18. Machinability of titanium metal matrix composites (Ti-MMCs)

    NASA Astrophysics Data System (ADS)

    Aramesh, Maryam

    Titanium metal matrix composites (Ti-MMCs), as a new generation of materials, have various potential applications in the aerospace and automotive industries. The presence of ceramic particles enhances the physical and mechanical properties of the alloy matrix. However, the hard and abrasive nature of these particles causes various issues in the field of their machinability. Severe tool wear and short tool life are the most important drawbacks of machining this class of materials. There is very limited work in the literature regarding the machinability of this class of materials, especially in the areas of tool life estimation and tool wear. From researchers' point of view, polycrystalline diamond (PCD) tools are by far the best choice for machining MMCs. However, due to their high cost, economical alternatives are sought. Cubic boron nitride (CBN) inserts, as the second-hardest available tools, show superior characteristics such as great wear resistance, high hardness at elevated temperatures, a low coefficient of friction and a high melting point. Yet, so far, CBN tools have not been studied during machining of Ti-MMCs. In this study, a comprehensive investigation has been performed to explore the tool wear mechanisms of CBN inserts during turning of Ti-MMCs. The unique morphology of the worn faces of the tools was investigated for the first time, which led to new insights into the identification of chemical wear mechanisms during machining of Ti-MMCs. Utilizing the full tool-life capacity of cutting tools is also crucial, due to the considerable costs associated with suboptimal replacement of tools. This strongly motivates the development of a reliable model for tool life estimation under any cutting conditions. In this study, a novel model based on survival analysis methodology is developed to estimate the progressive states of tool wear under any cutting conditions during machining of Ti-MMCs. This statistical model takes into account the machining time in addition to the effect of cutting parameters, and promising results were obtained that showed very good agreement with the experimental results. Moreover, a more advanced model was constructed by adding tool wear as another variable to the previous model. A new model was thus proposed for estimating the remaining life of worn inserts under different cutting conditions, using the current tool wear data as an input. The results of this model were validated against experiments and showed good consistency.
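
    A minimal sketch in the spirit of the survival-analysis approach described above, fitting a Weibull tool-life distribution to placeholder data (the thesis's model additionally conditions on cutting parameters and the current wear state):

      # Hedged sketch: parametric Weibull fit to tool-life times at one
      # cutting condition. Data and distribution choice are illustrative.
      import numpy as np
      from scipy import stats

      life = np.array([12.5, 14.1, 9.8, 11.2, 13.0, 10.4])  # minutes

      shape, loc, scale = stats.weibull_min.fit(life, floc=0.0)
      print(f"Weibull shape k = {shape:.2f}, scale = {scale:.1f} min")

      # Probability the tool survives past 10 minutes under the fit.
      print("P(T > 10 min) =", stats.weibull_min.sf(10.0, shape, 0.0, scale))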

  19. Computational Analysis of Material Flow During Friction Stir Welding of AA5059 Aluminum Alloys

    NASA Astrophysics Data System (ADS)

    Grujicic, M.; Arakere, G.; Pandurangan, B.; Ochterbeck, J. M.; Yen, C.-F.; Cheeseman, B. A.; Reynolds, A. P.; Sutton, M. A.

    2012-09-01

    Workpiece material flow and stirring/mixing during the friction stir welding (FSW) process are investigated computationally. Within the numerical model of the FSW process, the FSW tool is treated as a Lagrangian component while the workpiece material is treated as an Eulerian component. The employed coupled Eulerian/Lagrangian computational analysis of the welding process was of a two-way thermo-mechanical character (i.e., frictional-sliding/plastic-work dissipation is taken to act as a heat source in the thermal-energy balance equation) while temperature is allowed to affect mechanical aspects of the model through temperature-dependent material properties. The workpiece material (AA5059, solid-solution strengthened and strain-hardened aluminum alloy) is represented using a modified version of the classical Johnson-Cook model (within which the strain-hardening term is augmented to account for the effect of dynamic recrystallization) while the FSW tool material (AISI H13 tool steel) is modeled as an isotropic linear-elastic material. Within the analysis, the effects of some of the FSW key process parameters are investigated (e.g., weld pitch, tool tilt-angle, and the tool pin-size). The results pertaining to the material flow during FSW are compared with their experimental counterparts. It is found that, for the most part, experimentally observed material-flow characteristics are reproduced within the current FSW-process model.
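
    For reference, the classical Johnson-Cook flow stress that the paper modifies has the standard form (the paper's augmented strain-hardening term is not reproduced here):

      \[
      \sigma \;=\; \left(A + B\,\varepsilon^{\,n}\right)
      \left(1 + C\,\ln\dot{\varepsilon}^{*}\right)
      \left(1 - T^{*m}\right),
      \qquad
      T^{*} = \frac{T - T_{\mathrm{room}}}{T_{\mathrm{melt}} - T_{\mathrm{room}}},
      \]

    where epsilon is the equivalent plastic strain, the starred strain rate is normalized by a reference rate, and A, B, C, n, m are material constants.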

  20. A Flexible Statechart-to-Model-Checker Translator

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas; Dunphy, Julia; Feather, Martin S.

    2000-01-01

    Many current-day software design tools offer some variant of statechart notation for system specification. We, like others, have built an automatic translator from (a subset of) statecharts to a model checker, for use in validating behavioral requirements. Our translator is designed to be flexible. This allows us to quickly adjust the translator to variants of statechart semantics, including problem-specific notational conventions that designers employ. Our system demonstration will be of interest to the following two communities: (1) Potential end-users: Our demonstration will show translation from statecharts created in a commercial UML tool (Rational Rose) to Promela, the input language of Holzmann's model checker SPIN. The translation is accomplished automatically. To accommodate the major variants of statechart semantics, our tool offers user-selectable choices among semantic alternatives. Options for customized semantic variants are also made available. The net result is an easy-to-use tool that operates on a wide range of statechart diagrams to automate the pathway to model-checking input. (2) Other researchers: Our translator embodies, in one tool, ideas and approaches drawn from several sources. Solutions to the major challenges of statechart-to-model-checker translation (e.g., determining which transition(s) will fire, handling of concurrent activities) are treated in a uniform, fully mechanized setting. The way in which the underlying architecture of the translator itself facilitates flexible and customizable translation will also be evident.
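
    A minimal sketch of the statechart-to-Promela idea (toy chart format and emitted Promela are illustrative; the real translator handles far richer UML semantics):

      # Hedged sketch: render a toy statechart (states + event-labeled
      # transitions) as a Promela process for SPIN. Illustrative only.
      chart = {
          "init": "idle",
          "transitions": [                      # (from, event, to)
              ("idle", "start", "running"),
              ("running", "stop", "idle"),
          ],
      }

      def to_promela(chart) -> str:
          states = sorted({s for t in chart["transitions"] for s in (t[0], t[2])})
          lines = [f"mtype = {{ {', '.join(states)} }};",
                   f"mtype state = {chart['init']};",
                   "active proctype Chart() {",
                   "  do"]
          for src, event, dst in chart["transitions"]:
              lines.append(f"  :: (state == {src}) -> /* on {event} */ state = {dst};")
          lines += ["  od", "}"]
          return "\n".join(lines)

      print(to_promela(chart))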

  1. Crops in silico: A community wide multi-scale computational modeling framework of plant canopies

    NASA Astrophysics Data System (ADS)

    Srinivasan, V.; Christensen, A.; Borkiewic, K.; Yiwen, X.; Ellis, A.; Panneerselvam, B.; Kannan, K.; Shrivastava, S.; Cox, D.; Hart, J.; Marshall-Colon, A.; Long, S.

    2016-12-01

    Current crop models predict a looming gap between supply and demand for primary foodstuffs over the next 100 years. While significant yield increases were achieved in major food crops during the early years of the green revolution, the current rates of yield increases are insufficient to meet future projected food demand. Furthermore, with projected reduction in arable land, decrease in water availability, and increasing impacts of climate change on future food production, innovative technologies are required to sustainably improve crop yield. To meet these challenges, we are developing Crops in silico (Cis), a biologically informed, multi-scale, computational modeling framework that can facilitate whole plant simulations of crop systems. The Cis framework is capable of linking models of gene networks, protein synthesis, metabolic pathways, physiology, growth, and development in order to investigate crop response to different climate scenarios and resource constraints. This modeling framework will provide the mechanistic details to generate testable hypotheses toward accelerating directed breeding and engineering efforts to increase future food security. A primary objective for building such a framework is to create synergy among an inter-connected community of biologists and modelers to create a realistic virtual plant. This framework advantageously casts the detailed mechanistic understanding of individual plant processes across various scales in a common scalable framework that makes use of current advances in high performance and parallel computing. We are currently designing a user friendly interface that will make this tool equally accessible to biologists and computer scientists. Critically, this framework will provide the community with much needed tools for guiding future crop breeding and engineering, understanding the emergent implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment.

  2. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2013-12-01

    The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA is planned to be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. The requirements of the educational tool are defined with the interaction with the school organizers, and CMDA is customized to meet the requirements accordingly. The tool needs to be production quality for 30+ simultaneous users. The summer school will thus serve as a valuable testbed for the tool development, preparing CMDA to serve the Earth-science modeling and model-analysis community at the end of the project. This work was funded by the NASA Earth Science Program called Computational Modeling Algorithms and Cyberinfrastructure (CMAC).
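
    A minimal sketch of the wrapping pattern the abstract describes, using Flask; the endpoint name, query parameters, and compute_seasonal_mean are hypothetical placeholders, not CMDA's actual API:

      # Hedged sketch: expose an existing analysis routine as a web service.
      from flask import Flask, jsonify, request

      app = Flask(__name__)

      def compute_seasonal_mean(variable: str, season: str) -> float:
          # Stand-in for the legacy science code being wrapped.
          return 0.0

      @app.route("/seasonal_mean")
      def seasonal_mean():
          var = request.args.get("variable", "tas")
          season = request.args.get("season", "DJF")
          return jsonify(variable=var, season=season,
                         mean=compute_seasonal_mean(var, season))

      if __name__ == "__main__":
          app.run(port=8080)   # in production, served via Gunicorn

    In a deployment like the one described, Gunicorn or Tornado would serve such an app to many simultaneous users.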

  3. Improved cyberinfrastructure for integrated hydrometeorological predictions within the fully-coupled WRF-Hydro modeling system

    NASA Astrophysics Data System (ADS)

    gochis, David; hooper, Rick; parodi, Antonio; Jha, Shantenu; Yu, Wei; Zaslavsky, Ilya; Ganapati, Dinesh

    2014-05-01

    The community WRF-Hydro system is currently being used in a variety of flood prediction and regional hydroclimate impacts assessment applications around the world. Despite its increasingly wide use certain cyberinfrastructure bottlenecks exist in the setup, execution and post-processing of WRF-Hydro model runs. These bottlenecks result in wasted time, labor, data transfer bandwidth and computational resource use. Appropriate development and use of cyberinfrastructure to setup and manage WRF-Hydro modeling applications will streamline the entire workflow of hydrologic model predictions. This talk will present recent advances in the development and use of new open-source cyberinfrastructure tools for the WRF-Hydro architecture. These tools include new web-accessible pre-processing applications, supercomputer job management applications and automated verification and visualization applications. The tools will be described successively and then demonstrated in a set of flash flood use cases for recent destructive flood events in the U.S. and in Europe. Throughout, an emphasis on the implementation and use of community data standards for data exchange is made.

  4. Resistance-surface-based wildlife conservation connectivity modeling: Summary of efforts in the United States and guide for practitioners

    Treesearch

    Alisa A. Wade; Kevin S. McKelvey; Michael K. Schwartz

    2015-01-01

    Resistance-surface-based connectivity modeling has become a widespread tool for conservation planning. The current ease with which connectivity models can be created, however, masks the numerous untested assumptions underlying both the rules that produce the resistance surface and the algorithms used to locate low-cost paths across the target landscape. Here we present...
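
    A minimal sketch of the low-cost path computation that such connectivity models rely on, assuming a simple 4-neighbor Dijkstra search on a toy resistance grid (illustrative, not the specific algorithms the paper evaluates):

      # Hedged sketch: least-cost path cost across a resistance surface.
      import heapq

      resistance = [
          [1, 1, 5, 5],
          [5, 1, 5, 1],
          [5, 1, 1, 1],
      ]

      def least_cost(grid, start, goal):
          rows, cols = len(grid), len(grid[0])
          dist = {start: grid[start[0]][start[1]]}
          pq = [(dist[start], start)]
          while pq:
              d, (r, c) = heapq.heappop(pq)
              if (r, c) == goal:
                  return d
              if d > dist.get((r, c), float("inf")):
                  continue
              for nr, nc in ((r+1, c), (r-1, c), (r, c+1), (r, c-1)):
                  if 0 <= nr < rows and 0 <= nc < cols:
                      nd = d + grid[nr][nc]
                      if nd < dist.get((nr, nc), float("inf")):
                          dist[(nr, nc)] = nd
                          heapq.heappush(pq, (nd, (nr, nc)))
          return float("inf")

      print(least_cost(resistance, (0, 0), (2, 3)))  # accumulated resistance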

  5. Using Collaborative Simulation Modeling to Develop a Web-Based Tool to Support Policy-Level Decision Making About Breast Cancer Screening Initiation Age

    PubMed Central

    Burnside, Elizabeth S.; Lee, Sandra J.; Bennette, Carrie; Near, Aimee M.; Alagoz, Oguzhan; Huang, Hui; van den Broek, Jeroen J.; Kim, Joo Yeon; Ergun, Mehmet A.; van Ravesteyn, Nicolien T.; Stout, Natasha K.; de Koning, Harry J.; Mandelblatt, Jeanne S.

    2017-01-01

    Background There are no publicly available tools designed specifically to assist policy makers to make informed decisions about the optimal ages of breast cancer screening initiation for different populations of US women. Objective To use three established simulation models to develop a web-based tool called Mammo OUTPuT. Methods The simulation models use the 1970 US birth cohort and common parameters for incidence, digital screening performance, and treatment effects. Outcomes include breast cancers diagnosed, breast cancer deaths averted, breast cancer mortality reduction, false-positive mammograms, benign biopsies, and overdiagnosis. The Mammo OUTPuT tool displays these outcomes for combinations of age at screening initiation (every year from 40 to 49), annual versus biennial interval, lifetime versus 10-year horizon, and breast density, compared to waiting to start biennial screening at age 50 and continuing to 74. The tool was piloted by decision makers (n = 16) who completed surveys. Results The tool demonstrates that benefits in the 40s increase linearly with earlier initiation age, without a specific threshold age. Likewise, the harms of screening increase monotonically with earlier ages of initiation in the 40s. The tool also shows users how the balance of benefits and harms varies with breast density. Surveys revealed that 100% of users (16/16) liked the appearance of the site; 94% (15/16) found the tool helpful; and 94% (15/16) would recommend the tool to a colleague. Conclusions This tool synthesizes a representative subset of the most current CISNET (Cancer Intervention and Surveillance Modeling Network) simulation model outcomes to provide policy makers with quantitative data on the benefits and harms of screening women in the 40s. Ultimate decisions will depend on program goals, the population served, and informed judgments about the weight of benefits and harms. PMID:29376135

  6. State-of-the-Art for Hygrothermal Simulation Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boudreaux, Philip R.; New, Joshua Ryan; Shrestha, Som S.

    2017-02-01

    The hygrothermal (heat and moisture) performance of buildings can be assessed by utilizing simulation tools. There are currently a number of available hygrothermal calculation tools available which vary in their degree of sophistication and runtime requirements. This report investigates three of the most commonly used models (WUFI, HAMT, and EMPD) to assess their limitations and potential to generate physically realistic results to prioritize improvements for EnergyPlus (which uses HAMT and EMPD). The outcome of the study shows that, out of these three tools, WUFI has the greatest hygrothermal capabilities. Limitations of these tools were also assessed including: WUFI’s inability tomore » properly account for air leakage and transfer at surface boundaries; HAMT’s inability to handle air leakage, precipitationrelated moisture problems, or condensation problems from high relative humidity; and multiple limitations for EMPD as a simplified method to estimate indoor temperature and humidity levels and generally not used to estimate the hygrothermal performance of the building envelope materials. In conclusion, out of the three investigated simulation tools, HAMT has the greatest modeling potential, is open source, and we have prioritized specific features that can enable EnergyPlus to model all relevant heat and moisture transfer mechanisms that impact the performance of building envelope components.« less

  7. Effects-based strategy development through center of gravity and target system analysis

    NASA Astrophysics Data System (ADS)

    White, Christopher M.; Prendergast, Michael; Pioch, Nicholas; Jones, Eric K.; Graham, Stephen

    2003-09-01

    This paper describes an approach to effects-based planning in which a strategic-theater-level mission is refined into operational-level and ultimately tactical-level tasks and desired effects, informed by models of the expected enemy response at each level of abstraction. We describe a strategy development system that implements this approach and supports human-in-the-loop development of an effects-based plan. This system consists of plan authoring tools tightly integrated with a suite of center of gravity (COG) and target system analysis tools. A human planner employs the plan authoring tools to develop a hierarchy of tasks and desired effects. Upon invocation, the target system analysis tools use reduced-order models of enemy centers of gravity to select appropriate target set options for the achievement of desired effects, together with associated indicators for each option. The COG analysis tools also provide explicit models of the causal mechanisms linking tasks and desired effects to one another, and suggest appropriate observable indicators to guide ISR planning, execution monitoring, and campaign assessment. We are currently implementing the system described here as part of the AFRL-sponsored Effects Based Operations program.

  8. Interactive Web Interface to the Global Strain Rate Map Project

    NASA Astrophysics Data System (ADS)

    Meertens, C. M.; Estey, L.; Kreemer, C.; Holt, W.

    2004-05-01

    An interactive web interface allows users to explore the results of a global strain rate and velocity model and to compare them to other geophysical observations. The most recent model, an updated version of Kreemer et al., 2003, has 25 independent rigid plate-like regions separated by deformable boundaries covered by about 25,000 grid areas. A least-squares fit was made to 4900 geodetic velocities from 79 different geodetic studies. In addition, Quaternary fault slip rate data are used to infer geologic strain rate estimates (currently only for central Asia). Information about the style and direction of expected strain rate is inferred from the principal axes of the seismic strain rate field. The current model, as well as source data, references and an interactive map tool, are located at the International Lithosphere Program (ILP) "A Global Strain Rate Map (ILP II-8)" project website: http://www-world-strain-map.org. The purpose of the ILP GSRM project is to provide new information from this, and other, investigations that will contribute to a better understanding of continental dynamics and to the quantification of seismic hazards. A unique aspect of the GSRM interactive Java map tool is that the user can zoom in and make custom views of the model grid and results for any area of the globe, selecting strain rate and style contour plots and principal axes, observed and model velocity fields in specified frames of reference, and geologic fault data. The results can be displayed with other data sets such as Harvard CMT earthquake focal mechanisms, stress directions from the ILP World Stress Map Project, and topography. With the GSRM Java map tool, the user views custom maps generated by a Generic Mapping Tool (GMT) server. These interactive capabilities greatly extend what is possible to present in a published paper. A JavaScript version, using pre-constructed maps, as well as a related information site, have also been created for broader education and outreach access. The GSRM map tool will be demonstrated, and the latest model GSRM 1.1 results, containing important new data for Asia, Iran, the western Pacific, and Southern California, will be presented.

  9. One ring to rule them all: storm time ring current and its influence on radiation belts, plasmasphere and global magnetosphere electrodynamics

    NASA Astrophysics Data System (ADS)

    Buzulukova, Natalia; Fok, Mei-Ching; Glocer, Alex; Moore, Thomas E.

    2013-04-01

    We report studies of the storm time ring current and its influence on the radiation belts, plasmasphere and global magnetospheric dynamics. The near-Earth space environment is described by multiscale physics that reflects a variety of processes and conditions that occur in magnetospheric plasma. For a successful description of such a plasma, a complex solution is needed which allows multiple physics domains to be described using multiple physical models. A key population of the inner magnetosphere is the ring current plasma. Ring current dynamics affects magnetic and electric fields in the entire magnetosphere, the distribution of cold ionospheric plasma (the plasmasphere), and radiation belt particles. To study the electrodynamics of the inner magnetosphere, we present an MHD model (the BATSRUS code) coupled with an ionospheric electric-field solver and with a ring current-radiation belt model (the CIMI code). The model will be used as a tool to reveal details of the coupling between different regions of the Earth's magnetosphere. A model validation will also be presented, based on comparison with data from the THEMIS, POLAR, GOES, and TWINS missions.

  10. Ring Current Pressure Estimation withRAM-SCB using Data Assimilation and VanAllen Probe Flux Data

    NASA Astrophysics Data System (ADS)

    Godinez, H. C.; Yu, Y.; Henderson, M. G.; Larsen, B.; Jordanova, V.

    2015-12-01

    Capturing and subsequently modeling the influence of tail plasma injections on the inner magnetosphere is particularly important for understanding the formation and evolution of Earth's ring current. In this study, the ring current distribution is estimated with the Ring Current-Atmosphere Interactions Model with Self-Consistent Magnetic field (RAM-SCB) using, for the first time, data assimilation techniques and particle flux data from the Van Allen Probes. The state of the ring current within RAM-SCB is corrected via an ensemble-based data assimilation technique using proton flux from one of the Van Allen Probes, to capture the enhancement of the ring current following an isolated substorm event on July 18, 2013. The results show significant improvement in the estimation of the ring current particle distributions in the RAM-SCB model, leading to better agreement with observations. This newly implemented data assimilation technique in global modeling of the ring current thus provides a promising tool to better characterize the effect of substorm injections in the near-Earth regions. The work is part of the Space Hazards Induced near Earth by Large, Dynamic Storms (SHIELDS) project at Los Alamos National Laboratory.
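
    A minimal sketch of a stochastic (perturbed-observation) ensemble Kalman update of the kind described, with illustrative dimensions and values; RAM-SCB's actual assimilation scheme differs in detail:

      # Hedged sketch: nudge an ensemble of model flux states toward an
      # observed flux with an ensemble Kalman filter analysis step.
      import numpy as np

      rng = np.random.default_rng(0)
      n_state, n_ens = 8, 20
      X = rng.normal(1.0, 0.2, (n_state, n_ens))   # ensemble of flux states
      H = np.zeros((1, n_state)); H[0, 0] = 1.0    # observe first element
      y, r = np.array([1.4]), 0.05**2              # observed flux, variance

      Xm = X.mean(axis=1, keepdims=True)
      A = X - Xm
      P = A @ A.T / (n_ens - 1)                    # ensemble covariance
      K = P @ H.T @ np.linalg.inv(H @ P @ H.T + r) # Kalman gain
      for i in range(n_ens):                       # perturbed-obs update
          yi = y + rng.normal(0.0, np.sqrt(r))
          X[:, i] += K @ (yi - H @ X[:, i])
      print("analysis mean:", X.mean(axis=1))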

  11. An Automated Method for High-Definition Transcranial Direct Current Stimulation Modeling*

    PubMed Central

    Huang, Yu; Su, Yuzhuo; Rorden, Christopher; Dmochowski, Jacek; Datta, Abhishek; Parra, Lucas C.

    2014-01-01

    Targeted transcranial stimulation with electric currents requires accurate models of the current flow from scalp electrodes to the human brain. Idiosyncratic anatomy of individual brains and heads leads to significant variability in such current flows across subjects, thus, necessitating accurate individualized head models. Here we report on an automated processing chain that computes current distributions in the head starting from a structural magnetic resonance image (MRI). The main purpose of automating this process is to reduce the substantial effort currently required for manual segmentation, electrode placement, and solving of finite element models. In doing so, several weeks of manual labor were reduced to no more than 4 hours of computation time and minimal user interaction, while current-flow results for the automated method deviated by less than 27.9% from the manual method. Key facilitating factors are the addition of three tissue types (skull, scalp and air) to a state-of-the-art automated segmentation process, morphological processing to correct small but important segmentation errors, and automated placement of small electrodes based on easily reproducible standard electrode configurations. We anticipate that such an automated processing will become an indispensable tool to individualize transcranial direct current stimulation (tDCS) therapy. PMID:23367144
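
    As context for the finite-element step, the quasi-static forward problem that such current-flow pipelines solve is the standard volume-conductor equation (stated here as background; the paper's exact boundary conditions may differ):

      \[
      \nabla \cdot \big(\sigma(\mathbf{r})\,\nabla V(\mathbf{r})\big) = 0
      \ \text{inside the head},
      \qquad
      \sigma \frac{\partial V}{\partial n} = J_{\mathrm{inj}}
      \ \text{on electrode pads},
      \qquad
      \sigma \frac{\partial V}{\partial n} = 0
      \ \text{elsewhere on the scalp},
      \]

    with tissue conductivity sigma and current density given by J = -sigma grad V.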

  12. Use of Influenza Risk Assessment Tool for Prepandemic Preparedness

    PubMed Central

    Trock, Susan C.

    2018-01-01

    In 2010, the Centers for Disease Control and Prevention began to develop an Influenza Risk Assessment Tool (IRAT) to methodically capture and assess information relating to influenza A viruses not currently circulating among humans. The IRAT uses a multiattribute, additive model to generate a summary risk score for each virus. Although the IRAT is not intended to predict the next pandemic influenza A virus, it has provided input into prepandemic preparedness decisions. PMID:29460739
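
    A minimal sketch of a multiattribute additive scoring model of the kind the IRAT uses; the element names, weights, and scores below are illustrative placeholders, not the CDC's actual attributes or values:

      # Hedged sketch: summary risk score as a weighted sum of 1-10
      # element scores. All names and numbers are placeholders.
      weights = {"human_infections": 0.3, "receptor_binding": 0.25,
                 "transmission_in_animals": 0.25, "population_immunity": 0.2}
      scores = {"human_infections": 4, "receptor_binding": 6,
                "transmission_in_animals": 5, "population_immunity": 7}

      summary = sum(weights[k] * scores[k] for k in weights)
      print(f"summary risk score: {summary:.2f} (scale 1-10)")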

  13. Solar Energetic Proton Nowcast for Low Earth Orbits

    NASA Astrophysics Data System (ADS)

    Winter, L. M.; Quinn, R. A.

    2013-12-01

    Solar energetic proton flux levels above 10 pfu can damage spacecraft and pose a hazard to astronauts as well as passengers and crew on polar commercial flights. While the GOES satellites provide real-time data on SEP levels in geosynchronous orbit, it is also important to determine the risk to objects in lower-altitude orbits. To assess this risk in real time, we created a web-based nowcast of SEP flux. The tool determines the current solar energetic proton flux level given an input position (latitude, longitude, and altitude) and proton energy (e.g., > 10 MeV). The effective cutoff energy is calculated for the location and current geomagnetic storm level (i.e., the Kp value from SWPC) using the Shea & Smart (e.g., Smart et al. 1999abc, 2000) geomagnetic cutoff model, which uses a trajectory-tracing technique through the Tsyganenko magnetospheric model for the geomagnetic field. With the cutoff energy and GOES proton flux measurements, a map of the current predicted proton flux level at the input energy is displayed, along with the calculated integral spectrum for the input position. This operational tool is a powerful new diagnostic for assessing the risk to spacecraft from current solar proton levels, with easy-to-read color-coded maps generated for all GOES integral proton flux energies and a range of altitudes (1000-35000 km). The figures show example maps for a 'quiet' (03-26-13) and an active (10-30-03) time, with high proton levels easily distinguishable at or above the NOAA warning level (yellow-orange-red). The tool also displays the current GOES integral spectrum and fit, and the estimated spectrum at a user-defined location and altitude.
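
    A minimal sketch of the final step described above, assuming a power-law fit to the GOES integral proton spectrum; the functional form and the parameter values are illustrative assumptions, not the tool's actual fit:

      # Hedged sketch: evaluate integral flux above a location's effective
      # geomagnetic cutoff energy, given a power-law spectral fit.
      def integral_flux(e_mev, j0=100.0, e0=10.0, gamma=2.0):
          """Integral flux (pfu) above e_mev for J(>E) = j0*(E/e0)**-gamma."""
          return j0 * (e_mev / e0) ** (-gamma)

      cutoff_mev = 25.0   # assumed effective cutoff at some lat/lon/alt
      print(f"predicted flux above cutoff: {integral_flux(cutoff_mev):.1f} pfu")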

  14. KNOW ESSENTIALS: a tool for informed decisions in the absence of formal HTA systems.

    PubMed

    Mathew, Joseph L

    2011-04-01

    Most developing countries and resource-limited settings lack robust health technology assessment (HTA) systems. Because the development of locally relevant HTA is not immediately viable, and the extrapolation of external HTA is inappropriate, a new model for evaluating health technologies is required. The aim of this study was to describe the development and application of KNOW ESSENTIALS, a tool facilitating evidence-based decisions on health technologies by stakeholders in settings lacking formal HTA systems. Current HTA methodology was examined through a literature search. Additional issues relevant to resource-limited settings, but not adequately addressed in current methodology, were identified through a further literature search, appraisal of contextually relevant issues, discussion with healthcare professionals familiar with the local context, and personal experience. A set of thirteen elements important for evidence-based decisions was identified, selected and combined into a tool with the mnemonic KNOW ESSENTIALS. Detailed definitions for each element, coding for the elements, and a system to evaluate a given health technology using the tool were developed. Developing countries and resource-limited settings face several challenges to informed decision making. Models that are relevant and applicable in high-income countries are unlikely to be suitable in such settings. KNOW ESSENTIALS is an alternative that facilitates evidence-based decision making by stakeholders without formal expertise in HTA. The tool could be particularly useful, as an interim measure, in healthcare systems that are developing HTA capacity. It could also be useful anywhere when rapid evidence-based decisions on health technologies are required.

  15. Current algebra, statistical mechanics and quantum models

    NASA Astrophysics Data System (ADS)

    Vilela Mendes, R.

    2017-11-01

    Results obtained in the past for free boson systems at zero and nonzero temperatures are revisited to clarify the physical meaning of current algebra reducible functionals which are associated with systems with density fluctuations, leading to observable effects on phase transitions. To use current algebra as a tool for the formulation of quantum statistical mechanics amounts to the construction of unitary representations of diffeomorphism groups. Two mathematically equivalent procedures exist for this purpose. One searches for quasi-invariant measures on configuration spaces, the other for a cyclic vector in Hilbert space. Here, one argues that the second approach is closer to the physical intuition when modelling complex systems. An example of application of the current algebra methodology to the pairing phenomenon in two-dimensional fermion systems is discussed.
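
    For readers unfamiliar with the formalism, the nonrelativistic current algebra referred to here is, up to sign and normalization conventions, the Lie algebra of the smeared density rho(f) and current J(g), with f a test function and g, h vector fields:

        \begin{aligned}
        [\rho(f),\,\rho(h)] &= 0, \\
        [\rho(f),\,J(\mathbf g)] &= i\hbar\,\rho(\mathbf g\cdot\nabla f), \\
        [J(\mathbf g),\,J(\mathbf h)] &= -i\hbar\,J\big([\mathbf g,\mathbf h]\big),
        \qquad [\mathbf g,\mathbf h] \equiv (\mathbf g\cdot\nabla)\mathbf h - (\mathbf h\cdot\nabla)\mathbf g.
        \end{aligned}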

  16. Design strategies and functionality of the Visual Interface for Virtual Interaction Development (VIVID) tool

    NASA Technical Reports Server (NTRS)

    Nguyen, Lac; Kenney, Patrick J.

    1993-01-01

    Development of interactive virtual environments (VE) has typically consisted of three primary activities: model (object) development, model relationship tree development, and environment behavior definition and coding. The model and relationship tree development activities are accomplished with a variety of well-established graphic library (GL) based programs - most utilizing graphical user interfaces (GUI) with point-and-click interactions. Because of this GUI format, little programming expertise on the part of the developer is necessary to create the 3D graphical models or to establish interrelationships between the models. However, the third VE development activity, environment behavior definition and coding, has generally required the greatest amount of time and programmer expertise. Behaviors, characteristics, and interactions between objects and the user within a VE must be defined via command line C coding prior to rendering the environment scenes. In an effort to simplify this environment behavior definition phase for non-programmers, and to provide easy access to model and tree tools, a graphical interface and development tool has been created. The principal thrust of this research is to effect rapid development and prototyping of virtual environments. This presentation will discuss the 'Visual Interface for Virtual Interaction Development' (VIVID) tool, an X-Windows-based system employing drop-down menus for user selection of program access, models and trees, behavior editing, and code generation. Examples of these selections will be highlighted in this presentation, as will the currently available program interfaces. The functionality of this tool allows non-programming users access to all facets of VE development while providing experienced programmers with a collection of pre-coded behaviors. In conjunction with its existing interfaces and predefined suite of behaviors, future development plans for VIVID will be described. These include incorporation of dual-user virtual environment enhancements, tool expansion, and additional behaviors.

  17. The Conservation Effects Assessment Project (CEAP): a national scale natural resources and conservation needs assessment and decision support tool

    NASA Astrophysics Data System (ADS)

    Johnson, M.-V. V.; Norfleet, M. L.; Atwood, J. D.; Behrman, K. D.; Kiniry, J. R.; Arnold, J. G.; White, M. J.; Williams, J.

    2015-07-01

    The Conservation Effects Assessment Project (CEAP) was initiated to quantify the impacts of agricultural conservation practices at the watershed, regional, and national scales across the United States. Representative cropland acres in all major U.S. watersheds were surveyed in 2003-2006 as part of the seminal CEAP Cropland National Assessment. Two process-based models, the Agricultural Policy Environmental eXtender (APEX) and the Soil and Water Assessment Tool (SWAT), were applied to the survey data to provide a quantitative assessment of current conservation practice impacts, establish a benchmark against which future conservation trends and efforts could be measured, and identify outstanding conservation concerns. The flexibility of these models and the unprecedented amount of data on current conservation practices across the country enabled Cropland CEAP to meet its Congressional mandate of quantifying the value of current conservation practices. It also enabled scientifically grounded exploration of a variety of conservation scenarios, empowering CEAP to not only inform on past successes and additional needs, but to also provide a decision support tool to help guide future policy development and conservation practice decision making. The CEAP effort will repeat the national survey in 2015-2016, enabling CEAP to provide analyses of emergent conservation trends, outstanding needs, and potential costs and benefits of pursuing various treatment scenarios for all agricultural watersheds across the United States.

  18. CmapTools: A Software Environment for Knowledge Modeling and Sharing

    NASA Technical Reports Server (NTRS)

    Canas, Alberto J.

    2004-01-01

    In an ongoing collaborative effort between a group of NASA Ames scientists and researchers at the Institute for Human and Machine Cognition (IHMC) of the University of West Florida, a new version of CmapTools has been developed that enables scientists to construct knowledge models of their domain of expertise, share them with other scientists, make them available to anybody on the Internet with access to a Web browser, and peer-review other scientists' models. These software tools have been successfully used at NASA to build a large-scale multimedia knowledge model on Mars and a knowledge model on Habitability Assessment. The new version of the software places emphasis on greater usability for experts constructing their own knowledge models, and support for the creation of large knowledge models with a large number of supporting resources in the form of images, videos, web pages, and other media. Additionally, the software currently allows scientists to cooperate with each other in the construction, sharing and criticizing of knowledge models. Scientists collaborating from remote distances, for example researchers at the Astrobiology Institute, can concurrently manipulate the knowledge models they are viewing without having to do this at a special videoconferencing facility.

  19. Inter-model analysis of tsunami-induced coastal currents

    NASA Astrophysics Data System (ADS)

    Lynett, Patrick J.; Gately, Kara; Wilson, Rick; Montoya, Luis; Arcas, Diego; Aytore, Betul; Bai, Yefei; Bricker, Jeremy D.; Castro, Manuel J.; Cheung, Kwok Fai; David, C. Gabriel; Dogan, Gozde Guney; Escalante, Cipriano; González-Vida, José Manuel; Grilli, Stephan T.; Heitmann, Troy W.; Horrillo, Juan; Kânoğlu, Utku; Kian, Rozita; Kirby, James T.; Li, Wenwen; Macías, Jorge; Nicolsky, Dmitry J.; Ortega, Sergio; Pampell-Manis, Alyssa; Park, Yong Sung; Roeber, Volker; Sharghivand, Naeimeh; Shelby, Michael; Shi, Fengyan; Tehranirad, Babak; Tolkova, Elena; Thio, Hong Kie; Velioğlu, Deniz; Yalçıner, Ahmet Cevdet; Yamazaki, Yoshiki; Zaytsev, Andrey; Zhang, Y. J.

    2017-06-01

    To help produce accurate and consistent maritime hazard products, the National Tsunami Hazard Mitigation Program organized a benchmarking workshop to evaluate the numerical modeling of tsunami currents. Thirteen teams of international researchers, using a set of tsunami models currently utilized for hazard mitigation studies, presented results for a series of benchmarking problems; these results are summarized in this paper. Comparisons focus on physical situations where the currents are shear and separation driven, and are thus de-coupled from the incident tsunami waveform. In general, we find that models of increasing physical complexity provide better accuracy, and that low-order three-dimensional models are superior to high-order two-dimensional models. Inside separation zones and in areas strongly affected by eddies, the magnitude of both model-data errors and inter-model differences can be the same as the magnitude of the mean flow. Thus, we make arguments for the need of an ensemble modeling approach for areas affected by large-scale turbulent eddies, where deterministic simulation may be misleading. As a result of the analyses presented herein, we expect that tsunami modelers now have a better awareness of their ability to accurately capture the physics of tsunami currents, and therefore a better understanding of how to use these simulation tools for hazard assessment and mitigation efforts.
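
    A minimal sketch of the ensemble bookkeeping the paper argues for: compute the inter-model spread alongside the ensemble mean and flag locations where the spread rivals the mean flow itself (the model outputs and the 0.5 threshold below are invented for illustration):

        import numpy as np

        def ensemble_summary(speeds):
            """speeds: (n_models, n_points) array of simulated current speed at
            common output points. Returns ensemble mean and inter-model spread."""
            return speeds.mean(axis=0), speeds.std(axis=0, ddof=1)

        # Toy example: 13 models, 5 gauge points (values invented)
        rng = np.random.default_rng(0)
        speeds = 1.0 + 0.8 * rng.random((13, 5))
        mean, spread = ensemble_summary(speeds)
        # Flag points where deterministic output would be misleading
        print(spread / mean > 0.5)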

  20. Assessment of the GECKO-A Modeling Tool and Simplified 3D Model Parameterizations for SOA Formation

    NASA Astrophysics Data System (ADS)

    Aumont, B.; Hodzic, A.; La, S.; Camredon, M.; Lannuque, V.; Lee-Taylor, J. M.; Madronich, S.

    2014-12-01

    Explicit chemical mechanisms aim to embody the current knowledge of the transformations occurring in the atmosphere during the oxidation of organic matter. These explicit mechanisms are therefore useful tools to explore the fate of organic matter during its tropospheric oxidation and examine how these chemical processes shape the composition and properties of the gaseous and the condensed phases. Furthermore, explicit mechanisms provide powerful benchmarks to design and assess simplified parameterizations to be included in 3D models. Nevertheless, the explicit mechanism describing the oxidation of hydrocarbons with backbones larger than a few carbon atoms involves millions of secondary organic compounds, far exceeding the size of chemical mechanisms that can be written manually. Data processing tools can however be designed to overcome these difficulties and automatically generate consistent and comprehensive chemical mechanisms on a systematic basis. The Generator for Explicit Chemistry and Kinetics of Organics in the Atmosphere (GECKO-A) has been developed for the automatic writing of explicit chemical schemes of organic species and their partitioning between the gas and condensed phases. GECKO-A can be viewed as an expert system that mimics the steps by which chemists might develop chemical schemes. GECKO-A generates chemical schemes according to a prescribed protocol assigning reaction pathways and kinetics data on the basis of experimental data and structure-activity relationships. In its current version, GECKO-A can generate the full atmospheric oxidation scheme for most linear, branched and cyclic precursors, including alkanes and alkenes up to C25. Assessments of the GECKO-A modeling tool based on chamber SOA observations will be presented. GECKO-A was recently used to design a parameterization for SOA formation based on a Volatility Basis Set (VBS) approach. First results will be presented.
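
    The core arithmetic of a VBS parameterization is absorptive partitioning iterated to self-consistency; a minimal sketch (bin values below are invented, and real VBS implementations add temperature dependence and aging):

        import numpy as np

        def vbs_partition(c_total, c_star, c_oa_seed=1.0, n_iter=50):
            """Absorptive-partitioning VBS: particle fraction of bin i is
            xi_i = 1 / (1 + C*_i / C_OA), iterated until C_OA is self-consistent.
            c_total and c_star are per-bin concentrations in ug/m^3."""
            c_oa = c_oa_seed
            for _ in range(n_iter):
                xi = 1.0 / (1.0 + c_star / c_oa)
                c_oa = np.sum(c_total * xi)
            return c_oa, xi

        c_star = np.array([0.1, 1.0, 10.0, 100.0])   # saturation concentrations
        c_total = np.array([0.5, 1.0, 2.0, 4.0])     # bin totals (illustrative)
        c_oa, xi = vbs_partition(c_total, c_star)
        print(f"C_OA = {c_oa:.2f} ug/m3, particle fractions = {np.round(xi, 2)}")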

  1. Formulation of advanced consumables management models: Executive summary. [modeling spacecraft environmental control, life support, and electric power supply systems

    NASA Technical Reports Server (NTRS)

    Daly, J. K.; Torian, J. G.

    1979-01-01

    An overview of studies conducted to establish the requirements for advanced subsystem analytical tools is presented. Modifications are defined for updating current computer programs used to analyze environmental control, life support, and electric power supply systems so that consumables for future advanced spacecraft may be managed.

  2. Public Library Site Evaluation and Location: Past and Present Market-Based Modelling Tools for the Future.

    ERIC Educational Resources Information Center

    Koontz, Christine M.

    1992-01-01

    Presents a methodology for construction of location modeling for public library facilities in diverse urban environments. Historical and current research in library location is reviewed; and data collected from a survey of six library systems are analyzed according to population, spatial, library use, and library attractiveness variables. (48…

  3. A Season of Change: How Science Librarians Can Remain Relevant with Open Access and Scholarly Communications Initiatives

    ERIC Educational Resources Information Center

    Brown, Elizabeth

    2009-01-01

    The current rate of change suggests that scholarly communications issues, such as new publication models and technologies to connect library and research tools, will continue into the foreseeable future. As models evolve, standards develop, and scientists evolve in their communication patterns, librarians will need to embrace transitional…

  4. The Pelagics Habitat Analysis Module (PHAM): Decision Support Tools for Pelagic Fisheries

    NASA Astrophysics Data System (ADS)

    Armstrong, E. M.; Harrison, D. P.; Kiefer, D.; O'Brien, F.; Hinton, M.; Kohin, S.; Snyder, S.

    2009-12-01

    PHAM is a project funded by NASA to integrate satellite imagery and circulation models into the management of commercial and threatened pelagic species. Specifically, the project merges data from fishery surveys and fisheries catch and effort data with satellite imagery and circulation models to define the habitat of each species. This new information on habitat will then be used to inform population distribution estimates and models of population dynamics that are used for management. During the first year of the project, we created two prototype modules. One module, which was developed for the Inter-American Tropical Tuna Commission, is designed to help improve information available to manage the tuna fisheries of the eastern Pacific Ocean. The other module, which was developed for the Coastal Pelagics Division of the Southwest Fishery Science Center, assists management of by-catch of mako, blue, and thresher sharks along the Californian coast. Both modules were built with the EASy marine geographic information system, which provides a four-dimensional (latitude, longitude, depth, and time) home for integration of the data. The projects currently provide tools for automated downloading and geo-referencing of satellite imagery of sea surface temperature, height, and chlorophyll concentrations; output from JPL’s ECCO2 global circulation model and its ROMS California Current model; and gridded data from fisheries and fishery surveys. It also provides statistical tools for defining species habitat from these and other types of environmental data. These tools include unbalanced ANOVA, EOF analysis of satellite imagery, and multivariate search routines for fitting fishery data to transforms of the environmental data. Output from the projects consists of dynamic maps of the distribution of the species that are driven by the time series of satellite imagery and output from the circulation models. It also includes relationships between environmental variables and recruitment. During the talk, we will briefly demonstrate features of the software and present the results of our analyses of habitats.
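
    EOF analysis, one of the statistical tools named above, reduces a time stack of imagery to its dominant spatial modes; a minimal SVD-based sketch (the SST stack here is random placeholder data, and the function is not PHAM's actual implementation):

        import numpy as np

        def eof_analysis(field, n_modes=3):
            """EOF of a (time, space) anomaly matrix via SVD: returns spatial
            patterns, principal-component time series, explained variance."""
            anomalies = field - field.mean(axis=0)
            u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
            var_frac = s**2 / np.sum(s**2)
            pcs = u[:, :n_modes] * s[:n_modes]
            return vt[:n_modes], pcs, var_frac[:n_modes]

        # Toy SST stack: 100 time steps over a 20x30 grid, flattened to space
        rng = np.random.default_rng(1)
        sst = rng.normal(size=(100, 20 * 30))
        eofs, pcs, var_frac = eof_analysis(sst)
        print(var_frac)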

  5. Musite, a tool for global prediction of general and kinase-specific phosphorylation sites.

    PubMed

    Gao, Jianjiong; Thelen, Jay J; Dunker, A Keith; Xu, Dong

    2010-12-01

    Reversible protein phosphorylation is one of the most pervasive post-translational modifications, regulating diverse cellular processes in various organisms. High throughput experimental studies using mass spectrometry have identified many phosphorylation sites, primarily from eukaryotes. However, the vast majority of phosphorylation sites remain undiscovered, even in well studied systems. Because mass spectrometry-based experimental approaches for identifying phosphorylation events are costly, time-consuming, and biased toward abundant proteins and proteotypic peptides, in silico prediction of phosphorylation sites is potentially a useful alternative strategy for whole proteome annotation. Because of various limitations, current phosphorylation site prediction tools were not well designed for comprehensive assessment of proteomes. Here, we present a novel software tool, Musite, specifically designed for large scale predictions of both general and kinase-specific phosphorylation sites. We collected phosphoproteomics data in multiple organisms from several reliable sources and used them to train prediction models by a comprehensive machine-learning approach that integrates local sequence similarities to known phosphorylation sites, protein disorder scores, and amino acid frequencies. Application of Musite on several proteomes yielded tens of thousands of phosphorylation site predictions at a high stringency level. Cross-validation tests show that Musite achieves some improvement over existing tools in predicting general phosphorylation sites, and it is at least comparable with those for predicting kinase-specific phosphorylation sites. In Musite V1.0, we have trained general prediction models for six organisms and kinase-specific prediction models for 13 kinases or kinase families. Although the current pretrained models were not correlated with any particular cellular conditions, Musite provides a unique functionality for training customized prediction models (including condition-specific models) from users' own data. In addition, with its easily extensible open source application programming interface, Musite is aimed at being an open platform for community-based development of machine learning-based phosphorylation site prediction applications. Musite is available at http://musite.sourceforge.net/.
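
    To make the prediction setup concrete, a toy feature-based phosphosite classifier in the same spirit (this is not Musite's actual feature set or learner; the windows and labels below are invented purely to show the shapes involved):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        AA = "ACDEFGHIKLMNPQRSTVWY"

        def encode_window(window):
            """One-hot encode a 15-residue window centered on a candidate S/T/Y;
            unknown residues map to all-zero columns."""
            x = np.zeros((len(window), len(AA)))
            for i, res in enumerate(window):
                if res in AA:
                    x[i, AA.index(res)] = 1.0
            return x.ravel()

        windows = ["RRASVAGNLYSPTSG", "LLDEGSDDEEEVQKM", "KKRPRSPSSTASSSS"]
        labels = [1, 0, 1]   # 1 = phosphorylated (invented)
        X = np.array([encode_window(w) for w in windows])
        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
        print(clf.predict_proba(X)[:, 1])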

  6. Measured Engine Installation Effects of Four Civil Transport Airplanes.

    DOT National Transportation Integrated Search

    2001-10-28

    The Federal Aviation Administration's Integrated Noise Model (INM) is one of the primary tools for land use planning around airports [1]. The INM currently calculates airplane noise lateral attenuation using the methods contained in the Society o...

  7. Optimizing Romanian maritime coastline using mathematical model Litpack

    NASA Astrophysics Data System (ADS)

    Anton, I. A.; Panaitescu, M.; Panaitescu, F. V.

    2017-08-01

    There are many methods and tools for studying shoreline change in coastal engineering. LITPACK is a numerical model included in the MIKE software developed by DHI (Danish Hydraulic Institute). With this mathematical model we can simulate coastline evolution and the profile along a beach. Research and methodology: the paper covers the location of the study area, the current status of the Midia-Mangalia shoreline, protection objectives, and the changes in the shoreline after protective structures were built. The paper presents numerical and graphical results obtained with this model for studying the Romanian maritime coastline in the Midia-Mangalia area: non-cohesive sediment transport, longshore current and littoral drift, coastline evolution, cross-shore profile evolution, and the development of the coastline position in time.
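
    LITPACK's coastline module is proprietary and considerably richer, but the classical simplification behind one-line coastline evolution (Pelnard-Considere) reduces small-angle longshore transport to a diffusion equation; a sketch with an invented diffusivity:

        import numpy as np

        def one_line_evolution(y, dx, dt, n_steps, diffusivity):
            """One-line shoreline model, dy/dt = D * d2y/dx2, explicit scheme;
            stable for D*dt/dx^2 <= 0.5. Endpoints held fixed."""
            r = diffusivity * dt / dx**2
            assert r <= 0.5, "explicit scheme unstable"
            for _ in range(n_steps):
                y[1:-1] += r * (y[2:] - 2 * y[1:-1] + y[:-2])
            return y

        x = np.linspace(0, 5000, 101)             # alongshore coordinate, m
        y = 50 * np.exp(-((x - 2500) / 300)**2)   # initial shoreline bump, m
        y = one_line_evolution(y, dx=50, dt=3600, n_steps=24 * 30,
                               diffusivity=0.05)  # m^2/s, invented value
        print(y.max())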

  8. A decision-support tool to inform Australian strategies for preventing suicide and suicidal behaviour.

    PubMed

    Page, Andrew; Atkinson, Jo-An; Heffernan, Mark; McDonnell, Geoff; Hickie, Ian

    2017-04-27

    Dynamic simulation modelling is increasingly being recognised as a valuable decision-support tool to help guide investments and actions to address complex public health issues such as suicide. In particular, participatory system dynamics (SD) modelling provides a useful tool for asking high-level 'what if' questions, and testing the likely impacts of different combinations of policies and interventions at an aggregate level before they are implemented in the real world. We developed an SD model for suicide prevention in Australia, and investigated the hypothesised impacts over the next 10 years (2015-2025) of a combination of current intervention strategies proposed for population interventions in Australia: 1) general practitioner (GP) training, 2) coordinated aftercare in those who have attempted suicide, 3) school-based mental health literacy programs, 4) brief-contact interventions in hospital settings, and 5) psychosocial treatment approaches. Findings suggest that the largest reductions in suicide were associated with GP training (6%) and coordinated aftercare approaches (4%), with total reductions of 12% for all interventions combined. This paper highlights the value of dynamic modelling methods for managing complexity and uncertainty, and demonstrates their potential use as a decision-support tool for policy makers and program planners for community suicide prevention actions.
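
    The mechanics of such a stock-and-flow simulation can be shown in a few lines; the stocks, rates, and intervention effect below are invented for illustration, and the actual Australian model is far richer (care pathways, service capacity, feedbacks):

        def simulate(years=10, dt=0.1, base_attempt_rate=0.004,
                     intervention_effect=0.12):
            """Euler-integrated toy stock-and-flow: an at-risk population stock
            with inflow and recovery, generating attempts at a reduced rate."""
            at_risk, cumulative_attempts = 100_000.0, 0.0
            inflow = 4_000.0           # people entering the at-risk stock per year
            recovery_rate = 0.05       # fraction leaving the stock per year
            rate = base_attempt_rate * (1 - intervention_effect)
            t = 0.0
            while t < years:
                cumulative_attempts += rate * at_risk * dt
                at_risk += (inflow - recovery_rate * at_risk) * dt
                t += dt
            return cumulative_attempts

        baseline = simulate(intervention_effect=0.0)
        combined = simulate(intervention_effect=0.12)
        print(f"modelled reduction: {100 * (1 - combined / baseline):.1f}%")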

  9. Morphogenic designer--an efficient tool to digitally design tooth forms.

    PubMed

    Hajtó, J; Marinescu, C; Silva, N R F A

    2014-01-01

    Different digital software tools are available today for the purpose of designing anatomically correct anterior and posterior restorations. The current concepts present weaknesses, which can potentially be addressed by more advanced modeling tools, such as the ones already available in professional CAD (Computer Aided Design) graphical software. This study describes the morphogenic designer (MGD) as an efficient and easy method for digitally designing tooth forms for the anterior and posterior dentition. Anterior and posterior tooth forms were selected from a collection of digitalized natural teeth and subjectively assessed as "average". The models in the form of STL files were filtered, cleaned, idealized, and re-meshed to match the specifications of the software used. The shapes were then imported as a Wavefront ".obj" model into Modo 701, software built for modeling, texturing, visualization, and animation. In order to create a parametric design system, intentional interactive deformations were performed on the average tooth shapes and then further defined as morph targets. By combining various such parameters, several tooth shapes were formed virtually and their images presented. MGD proved to be a versatile and powerful tool for the purpose of esthetic and functional digital crown designs.
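
    Morph targets combine by standard blend-shape arithmetic, v = base + sum_i w_i * (target_i - base) per vertex; a sketch with an invented four-vertex "mesh" (Modo's internals are not claimed here):

        import numpy as np

        def blend_morph_targets(base, targets, weights):
            """Blend-shape arithmetic: offset the base mesh by each weighted
            target delta, vertex by vertex."""
            v = base.copy()
            for t, w in zip(targets, weights):
                v += w * (t - base)
            return v

        # Toy tooth mesh: 4 vertices in 3D; one "wider crown" target (invented)
        base = np.array([[0., 0., 0.], [1., 0., 0.], [1., 1., 0.], [0., 1., 0.]])
        wider = base + np.array([[-.1, 0, 0], [.1, 0, 0], [.1, 0, 0], [-.1, 0, 0]])
        print(blend_morph_targets(base, [wider], [0.5]))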

  10. New approaches for real time decision support systems

    NASA Technical Reports Server (NTRS)

    Hair, D. Charles; Pickslay, Kent

    1994-01-01

    NCCOSC RDT&E Division (NRaD) is conducting research into ways of improving decision support systems (DSS) that are used in tactical Navy decision making situations. The research has focused on the incorporation of findings about naturalistic decision-making processes into the design of the DSS. As part of that research, two computer tools were developed that model the two primary naturalistic decision-making strategies used by Navy experts in tactical settings. Current work is exploring how best to incorporate the information produced by those tools into an existing simulation of current Navy decision support systems. This work has implications for any applications involving the need to make decisions under time constraints, based on incomplete or ambiguous data.

  11. The Durham Adaptive Optics Simulation Platform (DASP): Current status

    NASA Astrophysics Data System (ADS)

    Basden, A. G.; Bharmal, N. A.; Jenkins, D.; Morris, T. J.; Osborn, J.; Peng, J.; Staykov, L.

    2018-01-01

    The Durham Adaptive Optics Simulation Platform (DASP) is a Monte-Carlo modelling tool used for the simulation of astronomical and solar adaptive optics systems. In recent years, this tool has been used to predict the expected performance of the forthcoming extremely large telescope adaptive optics systems, and has seen the addition of several modules with new features, including Fresnel optics propagation and extended object wavefront sensing. Here, we provide an overview of the features of DASP and the situations in which it can be used. Additionally, the user tools for configuration and control are described.

  12. CASTEAUR: a simple tool to assess the transfer of radionuclides in waterways.

    PubMed

    Beaugelin-Seiller, K; Boyer, P; Garnier-Laplace, J; Adam, C

    2002-10-01

    The CASTEAUR project proposes a simplified tool to assess the transfer of radionuclides between and in the main biotic and abiotic components of the freshwater ecosystem. Built on phenomenological modeling, various hypotheses simplify the transfer equations, which, when programmed in Excel, can be readily dispatched and used. CASTEAUR can be used as an assessment tool for impact studies of accidental release as well as "routine" release. This code is currently being tested on the Rhone River, downstream from a nuclear reprocessing plant. The first results are reported to illustrate the possibilities offered by CASTEAUR.
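
    Simplified transfer equations of this kind typically reduce to first-order exchanges between compartments; a sketch (the compartments and rate constants below are invented, not CASTEAUR's actual formulation, and decay within sediment and biota is omitted for brevity):

        def simulate(c_water=100.0, k_sed=0.05, k_bio=0.01, k_decay=0.002,
                     days=365, dt=0.1):
            """First-order water -> sediment / water -> biota transfers plus
            radioactive decay of the dissolved phase, Euler-integrated."""
            c_sed = c_bio = 0.0
            t = 0.0
            while t < days:
                c_sed += k_sed * c_water * dt
                c_bio += k_bio * c_water * dt
                c_water -= (k_sed + k_bio + k_decay) * c_water * dt
                t += dt
            return c_water, c_sed, c_bio

        print("water, sediment, biota after 1 yr:",
              tuple(round(c, 2) for c in simulate()))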

  13. Mining Marketing Data

    NASA Technical Reports Server (NTRS)

    2002-01-01

    MarketMiner(R) Products, a line of automated marketing analysis tools manufactured by MarketMiner, Inc., can benefit organizations that perform significant amounts of direct marketing. MarketMiner received a Small Business Innovation Research (SBIR) contract from NASA's Johnson Space Center to develop the software as a data modeling tool for space mission applications. The technology was then built into the company's current products to provide decision support for business and marketing applications. With the tool, users gain valuable information about customers and prospects from existing data in order to increase sales and profitability. MarketMiner(R) is a registered trademark of MarketMiner, Inc.

  14. SDMtoolbox 2.0: the next generation Python-based GIS toolkit for landscape genetic, biogeographic and species distribution model analyses

    PubMed Central

    Bennett, Joseph R.; French, Connor M.

    2017-01-01

    SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS software and MaxEnt software, and reduces the amount of time that would be spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet is the careful parameterization of species distribution models (SDMs) to maximize each model’s discriminatory ability and minimize overfitting. This includes careful processing of occurrence data and environmental data, and careful model parameterization. This program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling software programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have ‘universal’ analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations of SDMs projected into future climates, to name only a few of the functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user. PMID:29230356

  15. Modeling biofilms with dual extracellular electron transfer mechanisms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renslow, Ryan S.; Babauta, Jerome T.; Kuprat, Andrew P.

    2013-11-28

    Electrochemically active biofilms have a unique form of respiration in which they utilize solid external materials as their terminal electron acceptor for metabolism. Currently, two primary mechanisms have been identified for long-range extracellular electron transfer (EET): a diffusion- and a conduction-based mechanism. Evidence in the literature suggests that some biofilms, particularly Shewanella oneidensis, produce components requisite for both mechanisms. In this study, a generic model is presented that incorporates both diffusion- and conduction-based mechanisms and allows electrochemically active biofilms to utilize both simultaneously. The model was applied to Shewanella oneidensis and Geobacter sulfurreducens biofilms using experimentally generated data found in the literature. Our simulation results showed that 1) biofilms having both mechanisms available, especially if they can interact, may have a metabolic advantage over biofilms that can use only a single mechanism; 2) the thickness of Geobacter sulfurreducens biofilms is likely not limited by conductivity; 3) accurate intrabiofilm diffusion coefficient values are critical for current generation predictions; and 4) the local biofilm potential and redox potential are two distinct measurements and cannot be assumed to have identical values. Finally, we determined that cyclic and square-wave voltammetry are currently not good tools to determine the specific percentage of extracellular electron transfer mechanisms used by biofilms. The developed model will be a critical tool in designing experiments to explain EET mechanisms.

  16. Integration of eHealth Tools in the Process of Workplace Health Promotion: Proposal for Design and Implementation.

    PubMed

    Jimenez, Paulino; Bregenzer, Anita

    2018-02-23

    Electronic health (eHealth) and mobile health (mHealth) tools can support and improve the whole process of workplace health promotion (WHP) projects. However, several challenges and opportunities have to be considered while integrating these tools in WHP projects. Currently, a large number of eHealth tools are developed for changing health behavior, but these tools can support the whole WHP process, including group administration, information flow, assessment, intervention development, and evaluation. To support a successful implementation of eHealth tools in the whole WHP process, we introduce a concept of WHP (life cycle model of WHP) with 7 steps and present critical and success factors for the implementation of eHealth tools in each step. We developed a life cycle model of WHP based on the World Health Organization (WHO) model of the healthy workplace continual improvement process. We suggest adaptations to the WHO model to demonstrate the large number of possibilities to implement eHealth tools in WHP as well as possible critical points in the implementation process. eHealth tools can enhance the efficiency of WHP in each of the 7 steps of the presented life cycle model of WHP. Specifically, eHealth tools can provide support by offering easier administration, providing an information and communication platform, supporting assessments, presenting and discussing assessment results in a dashboard, and offering interventions to change individual health behavior. Important success factors include the possibility of giving automatic feedback about health parameters, creating incentive systems, or bringing together a large number of health experts in one place. Critical factors such as data security, anonymity, or lack of management involvement have to be addressed carefully to prevent nonparticipation and dropouts. Using eHealth tools can support WHP, but clear regulations for the usage and implementation of these tools at the workplace are needed to secure quality and reach sustainable results.

  17. Material Protection, Accounting, and Control Technologies (MPACT): Modeling and Simulation Roadmap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cipiti, Benjamin; Dunn, Timothy; Durbin, Samual

    The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal. This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, a distributed test bed that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling. To aid in framing its long-term goal, during FY16, a modeling and simulation roadmap is being developed for three major areas of investigation: (1) radiation transport and sensors, (2) process and chemical models, and (3) shock physics and assessments. For each area, current modeling approaches are described, and gaps and needs are identified.

  18. Development of task network models of human performance in microgravity

    NASA Technical Reports Server (NTRS)

    Diaz, Manuel F.; Adam, Susan

    1992-01-01

    This paper discusses the utility of task-network modeling for quantifying human performance variability in microgravity. The data are gathered for: (1) improving current methodologies for assessing human performance and workload in the operational space environment; (2) developing tools for assessing alternative system designs; and (3) developing an integrated set of methodologies for the evaluation of performance degradation during extended duration spaceflight. The evaluation entailed an analysis of the Remote Manipulator System payload-grapple task performed on many shuttle missions. Task-network modeling can be used as a tool for assessing and enhancing human performance in man-machine systems, particularly for modeling long-duration manned spaceflight. Task-network modeling can be directed toward improving system efficiency by increasing the understanding of basic capabilities of the human component in the system and the factors that influence these capabilities.

  19. Emerging from the bottleneck: benefits of the comparative approach to modern neuroscience.

    PubMed

    Brenowitz, Eliot A; Zakon, Harold H

    2015-05-01

    Neuroscience has historically exploited a wide diversity of animal taxa. Recently, however, research has focused increasingly on a few model species. This trend has accelerated with the genetic revolution, as genomic sequences and genetic tools became available for a few species, which formed a bottleneck. This coalescence on a small set of model species comes with several costs that are often not considered, especially in the current drive to use mice explicitly as models for human diseases. Comparative studies of strategically chosen non-model species can complement model species research and yield more rigorous studies. As genetic sequences and tools become available for many more species, we are poised to emerge from the bottleneck and once again exploit the rich biological diversity offered by comparative studies.

  20. Patient-specific finite element modeling of bones.

    PubMed

    Poelert, Sander; Valstar, Edward; Weinans, Harrie; Zadpoor, Amir A

    2013-04-01

    Finite element modeling is an engineering tool for structural analysis that has been used for many years to assess the relationship between load transfer and bone morphology and to optimize the design and fixation of orthopedic implants. Due to recent developments in finite element model generation, for example, improved computed tomography imaging quality, improved segmentation algorithms, and faster computers, the accuracy of finite element modeling has increased vastly and finite element models simulating the anatomy and properties of an individual patient can be constructed. Such so-called patient-specific finite element models are potentially valuable tools for orthopedic surgeons in fracture risk assessment or pre- and intraoperative planning of implant placement. The aim of this article is to provide a critical overview of current themes in patient-specific finite element modeling of bones. In addition, the state-of-the-art in patient-specific modeling of bones is compared with the requirements for a clinically applicable patient-specific finite element method, and judgment is passed on the feasibility of application of patient-specific finite element modeling as a part of clinical orthopedic routine. It is concluded that further development in certain aspects of patient-specific finite element modeling is needed before finite element modeling can be used as a routine clinical tool.

  1. Integrated tokamak modeling: when physics informs engineering and research planning

    NASA Astrophysics Data System (ADS)

    Poli, Francesca

    2017-10-01

    Simulations that integrate virtually all the relevant engineering and physics aspects of a real tokamak experiment are a powerful tool for experimental interpretation, model validation and planning for both present and future devices. This tutorial will guide the audience through the building blocks of an "integrated" tokamak simulation, such as magnetic flux diffusion, thermal, momentum and particle transport, external heating and current drive sources, wall particle sources and sinks. Emphasis is given to the connection and interplay between external actuators and plasma response, between the slow time scales of the current diffusion and the fast time scales of transport, and how reduced and high-fidelity models can contribute to simulate a whole device. To illustrate the potential and limitations of integrated tokamak modeling for discharge prediction, a helium plasma scenario for the ITER pre-nuclear phase is taken as an example. This scenario presents challenges because it requires core-edge integration and advanced models for interaction between waves and fast-ions, which are subject to a limited experimental database for validation and guidance. Starting from a scenario obtained by re-scaling parameters from the demonstration inductive "ITER baseline", it is shown how self-consistent simulations that encompass both core and edge plasma regions, as well as high-fidelity heating and current drive source models, are needed to set constraints on the density, magnetic field and heating scheme. This tutorial aims at demonstrating how integrated modeling, when used with an adequate level of criticism, can not only support design of operational scenarios, but also help to assess the limitations and gaps in the available models, thus indicating where improved modeling tools are required and how present experiments can help their validation and inform research planning. Work supported by DOE under DE-AC02-09CH1146.

  2. Transient tracking of low and high-order eccentricity-related components in induction motors via TFD tools

    NASA Astrophysics Data System (ADS)

    Climente-Alarcon, V.; Antonino-Daviu, J.; Riera-Guasp, M.; Pons-Llinares, J.; Roger-Folch, J.; Jover-Rodriguez, P.; Arkkio, A.

    2011-02-01

    The present work is focused on the diagnosis of mixed eccentricity faults in induction motors via the study of currents demanded by the machine. Unlike traditional methods, based on the analysis of stationary currents (Motor Current Signature Analysis (MCSA)), this work provides new findings regarding the diagnosis approach proposed by the authors in recent years, which is mainly focused on the fault diagnosis based on the analysis of transient quantities, such as startup or plug stopping currents (Transient Motor Current Signature Analysis (TMCSA)), using suitable time-frequency decomposition (TFD) tools. The main novelty of this work is to prove the usefulness of tracking the transient evolution of high-order eccentricity-related harmonics in order to diagnose the condition of the machine, complementing the information obtained with the low-order components, whose transient evolution was well characterised in previous works. Tracking of high-order eccentricity-related harmonics during the transient, through their associated patterns in the time-frequency plane, may significantly increase the reliability of the diagnosis, since the set of fault-related patterns arising after application of the corresponding TFD tool is very unlikely to be caused by other faults or phenomena. Although there are different TFD tools which could be suitable for the transient extraction of these harmonics, this paper makes use of a Wigner-Ville distribution (WVD)-based algorithm in order to carry out the time-frequency decomposition of the startup current signal, since this is a tool showing an excellent trade-off between frequency resolution at both high and low frequencies. Several simulation results obtained with a finite element-based model and experimental results show the validity of this fault diagnosis approach under several faulty and operating conditions. Also, additional signals corresponding to the coexistence of the eccentricity and other non-fault related phenomena making difficult the diagnosis (fluctuating load torque) are included in the paper. Finally, a comparison with an alternative TFD tool - the discrete wavelet transform (DWT) - applied in previous papers, is also carried out in the contribution. The results are promising regarding the usefulness of the methodology for the reliable diagnosis of eccentricities and for their discrimination against other phenomena.
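
    For concreteness, a minimal discrete Wigner-Ville distribution in the spirit of the WVD-based algorithm used above (this is the textbook definition, not the authors' exact implementation; the chirp stands in for a startup current's sweeping fault harmonics):

        import numpy as np
        from scipy.signal import hilbert

        def wigner_ville(x, fs):
            """Discrete Wigner-Ville distribution of a real signal. The analytic
            signal suppresses negative-frequency interference terms. Frequency
            bins span 0..fs/2 because the kernel uses the doubled lag
            x[n+m] * conj(x[n-m])."""
            z = hilbert(x)
            n_samples = len(z)
            w = np.zeros((n_samples, n_samples))
            for n in range(n_samples):
                m_max = min(n, n_samples - 1 - n)
                m = np.arange(-m_max, m_max + 1)
                kernel = np.zeros(n_samples, dtype=complex)
                kernel[m % n_samples] = z[n + m] * np.conj(z[n - m])
                w[:, n] = np.fft.fft(kernel).real
            freqs = np.linspace(0, fs / 2, n_samples, endpoint=False)
            return w, freqs

        # Startup-like chirp: instantaneous frequency ramping from 10 to 90 Hz
        fs = 1000.0
        t = np.arange(0, 1, 1 / fs)
        x = np.cos(2 * np.pi * (10 * t + 40 * t**2))
        w, freqs = wigner_ville(x, fs)
        print(w.shape, freqs[np.argmax(w[:, 500])])  # ~50 Hz at mid-signal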

  3. A Final Approach Trajectory Model for Current Operations

    NASA Technical Reports Server (NTRS)

    Gong, Chester; Sadovsky, Alexander

    2010-01-01

    Predicting accurate trajectories with limited intent information is a challenge faced by air traffic management decision support tools in operation today. One such tool is the FAA's Terminal Proximity Alert system which is intended to assist controllers in maintaining safe separation of arrival aircraft during final approach. In an effort to improve the performance of such tools, two final approach trajectory models are proposed; one based on polynomial interpolation, the other on the Fourier transform. These models were tested against actual traffic data and used to study effects of the key final approach trajectory modeling parameters of wind, aircraft type, and weight class, on trajectory prediction accuracy. Using only the limited intent data available to today's ATM system, both the polynomial interpolation and Fourier transform models showed improved trajectory prediction accuracy over a baseline dead reckoning model. Analysis of actual arrival traffic showed that this improved trajectory prediction accuracy leads to improved inter-arrival separation prediction accuracy for longer look ahead times. The difference in mean inter-arrival separation prediction error between the Fourier transform and dead reckoning models was 0.2 nmi for a look ahead time of 120 sec, a 33 percent improvement, with a corresponding 32 percent improvement in standard deviation.
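
    A toy version of the polynomial-interpolation predictor compared against dead reckoning (degree, look-ahead, and the decelerating track below are invented for illustration, not the paper's exact formulation):

        import numpy as np

        def predict_position(track_times, track_pos, t_ahead, degree=2):
            """Fit recent track history with a polynomial and extrapolate;
            degree=1 on the last two points reduces to dead reckoning."""
            coeffs = np.polyfit(track_times, track_pos, degree)
            return np.polyval(coeffs, track_times[-1] + t_ahead)

        # Toy decelerating final-approach track (positions in nmi, times in s)
        t = np.arange(0, 60, 5.0)
        pos = 10 - 0.065 * t + 0.0002 * t**2   # slowing toward the threshold
        poly = predict_position(t, pos, t_ahead=120)
        dead_reckon = pos[-1] + (pos[-1] - pos[-2]) / 5.0 * 120
        print(f"polynomial: {poly:.2f} nmi, dead reckoning: {dead_reckon:.2f} nmi")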

  4. MHDL CAD tool with fault circuit handling

    NASA Astrophysics Data System (ADS)

    Espinosa Flores-Verdad, Guillermo; Altamirano Robles, Leopoldo; Osorio Roque, Leticia

    2003-04-01

    Behavioral modeling and simulation with Analog Hardware and Mixed Signal Description High Level Languages (MHDLs) has driven the development of diverse simulation tools that can handle the requirements of modern designs. These systems embed millions of transistors and are radically diverse from one another. This tendency in simulation tools is exemplified by the development of languages for modeling and simulation, whose applications are the re-use of complete systems, construction of virtual prototypes, and realization of tests and synthesis. This paper presents the general architecture of a Mixed Hardware Description Language, based on the standard 1076.1-1999 IEEE VHDL Analog and Mixed-Signal Extensions known as VHDL-AMS. This architecture is novel in that it considers the modeling and simulation of faults. The main modules of the CAD tool are briefly described in order to establish the information flow and its transformations, starting from the description of a circuit model, going through the lexical analysis, mathematical model generation and the simulation core, and ending with the collection of the circuit behavior as simulation data. In addition, the mechanisms incorporated into the simulation core are explained in order to realize the handling of faults in the circuit models. Currently, the CAD tool works with algebraic and differential descriptions for the circuit models; nevertheless, the language design is open, allowing it to handle different model types: fuzzy models, differential equations, transfer functions and tables. This applies to fault models too; in this sense the CAD tool considers the inclusion of mutants and saboteurs. To exemplify the results obtained so far, the simulated behavior of a circuit is shown when it is fault free and when it has been modified by the inclusion of a fault as a mutant or a saboteur. The obtained results allow the realization of a virtual diagnosis for mixed circuits. This language works on a UNIX system; it was developed with an object-oriented methodology and programmed in C++.

  5. Parameterization of the 3-PG model for Pinus elliottii stands using alternative methods to estimate fertility rating, biomass partitioning and canopy closure

    Treesearch

    Carlos A. Gonzalez-Benecke; Eric J. Jokela; Wendell P. Cropper; Rosvel Bracho; Daniel J. Leduc

    2014-01-01

    The forest simulation model, 3-PG, has been widely applied as a useful tool for predicting growth of forest species in many countries. The model has the capability to estimate the effects of management, climate and site characteristics on many stand attributes using easily available data. Currently, there is an increasing interest in estimating biomass and assessing...

  6. Development of TIF based figuring algorithm for deterministic pitch tool polishing

    NASA Astrophysics Data System (ADS)

    Yi, Hyun-Su; Kim, Sug-Whan; Yang, Ho-Soon; Lee, Yun-Woo

    2007-12-01

    Pitch is perhaps the oldest material used for optical polishing, leaving superior surface texture, and has been used widely on the optics shop floor. However, because of its unpredictable removal characteristics, pitch tool polishing has rarely been analysed quantitatively, and many optics shops rely heavily on the optician's "feel" even today. In order to bring a degree of process controllability to pitch tool polishing, we added motorized tool motions to a conventional Draper-type polishing machine and modelled the tool path in the absolute machine coordinate system. We then produced a number of Tool Influence Functions (TIFs), both from an analytical model and from a series of experimental polishing runs using the pitch tool. The theoretical TIFs agreed well with the experimental TIFs to a profile accuracy of 79% in terms of shape. The surface figuring algorithm was then developed in-house utilizing both theoretical and experimental TIFs. We are currently undertaking a series of trial figuring experiments to prove the performance of the polishing algorithm, and the early results indicate that highly deterministic material removal control with the pitch tool can be achieved to a certain level of form error. The machine renovation, TIF theory and experimental confirmation, and figuring simulation results are reported together with implications for deterministic polishing.
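
    At the heart of any TIF-based figuring algorithm is the prediction that material removal is the convolution of the TIF with the dwell-time map; a sketch (the Gaussian TIF, grid, and dwell values below are invented, not the authors' measured functions):

        import numpy as np
        from scipy.signal import fftconvolve

        def predicted_removal(dwell_map, tif):
            """Deterministic figuring in its simplest form: removal depth is the
            convolution of the tool influence function with dwell time."""
            return fftconvolve(dwell_map, tif, mode="same")

        # Gaussian TIF (removal rate per unit dwell) on a 1 mm grid, illustrative
        yy, xx = np.mgrid[-10:11, -10:11]
        tif = 0.5 * np.exp(-(xx**2 + yy**2) / (2 * 4.0**2))  # nm/s, sigma 4 mm
        dwell = np.zeros((101, 101))
        dwell[40:60, 40:60] = 2.0                            # seconds per point
        removal = predicted_removal(dwell, tif)
        print(f"peak predicted removal: {removal.max():.1f} nm")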

  7. Ulcerative colitis outpatient management: development and evaluation of tools to support primary care practitioners.

    PubMed

    Bennett, A L; Buckton, S; Lawrance, I; Leong, R W; Moore, G; Andrews, J M

    2015-12-01

    Current models of care for ulcerative colitis (UC) across healthcare systems are inconsistent, with a paucity of existing guidelines or supportive tools for outpatient management. This study aimed to produce and evaluate evidence-based outpatient management tools for UC to guide primary care practitioners and patients in clinical decision-making. Three tools were developed after identifying current gaps in the provision of healthcare services for patients with UC at a Clinical Insights Meeting in 2013. Draft designs were further refined through consultation and consolidation of feedback by the steering committee. Final drafts were developed following feasibility testing in three key stakeholder groups (gastroenterologists, general practitioners and patients) by questionnaire. The tools were officially launched into mainstream use in Australia in 2014. Three quarters of all respondents liked the layout and content of each tool. Minimal safety concerns were aired; these, along with pieces of information felt to have been omitted, were reviewed by the steering committee and incorporated into the final documents. The majority (over 80%) of respondents felt that the tools would be useful and would improve outpatient management of UC. Evidence-based outpatient clinical management tools for UC can be developed. The concept and end-product have been well received by all stakeholder groups. These tools should support non-specialist clinicians to optimise UC management and empower patients by facilitating them to safely self-manage and identify when medical support is needed.

  8. Experimental High-Resolution Land Surface Prediction System for the Vancouver 2010 Winter Olympic Games

    NASA Astrophysics Data System (ADS)

    Belair, S.; Bernier, N.; Tong, L.; Mailhot, J.

    2008-05-01

    The 2010 Winter Olympic and Paralympic Games will take place in Vancouver, Canada, from 12 to 28 February 2010 and from 12 to 21 March 2010, respectively. In order to provide the best possible guidance achievable with current state-of-the-art science and technology, Environment Canada is currently setting up an experimental numerical prediction system for these special events. This system consists of a 1-km limited-area atmospheric model that will be integrated for 16 h, twice a day, with improved microphysics compared with the system currently operational at the Canadian Meteorological Centre. In addition, several new and original tools will be used to adapt and refine predictions near and at the surface. Very high-resolution two-dimensional surface systems, with 100-m and 20-m grid size, will cover the Vancouver Olympic area. Using adaptation methods to improve the forcing from the lower-resolution atmospheric models, these 2D surface models better represent surface processes, and thus lead to better predictions of snow conditions and near-surface air temperature. Based on a similar strategy, a single-point model will be implemented to better predict surface characteristics at each station of an observing network especially installed for the 2010 events. The main advantage of this single-point system is that surface observations are used as forcing for the land surface models, and can even be assimilated (although this is not expected in the first version of this new tool) to improve initial conditions of surface variables such as snow depth and surface temperatures. Another adaptation tool, based on 2D stationary solutions of a simple dynamical system, will be used to produce near-surface winds on the 100-m grid, coherent with the high-resolution orography. The configuration of the experimental numerical prediction system will be presented at the conference, together with preliminary results for winter 2007-2008.

  9. Fostering Team Awareness in Earth System Modeling Communities

    NASA Astrophysics Data System (ADS)

    Easterbrook, S. M.; Lawson, A.; Strong, S.

    2009-12-01

    Existing Global Climate Models are typically managed and controlled at a single site, with varied levels of participation by scientists outside the core lab. As these models evolve to encompass a wider set of earth systems, this central control of the modeling effort becomes a bottleneck. But such models cannot evolve to become fully distributed open source projects unless they address the imbalance in the availability of communication channels: scientists at the core site have access to regular face-to-face communication with one another, while those at remote sites have access to only a subset of these conversations - e.g. formally scheduled teleconferences and user meetings. Because of this imbalance, critical decision making can be hidden from many participants, their code contributions can interact in unanticipated ways, and the community loses awareness of who knows what. We have documented some of these problems in a field study at one climate modeling centre, and started to develop tools to overcome these problems. We report on one such tool, TracSNAP, which analyzes the social network of the scientists contributing code to the model by extracting the data in an existing project code repository. The tool presents the results of this analysis to modelers and model users in a number of ways: recommendation for who has expertise on particular code modules, suggestions for code sections that are related to files being worked on, and visualizations of team communication patterns. The tool is currently available as a plugin for the Trac bug tracking system.
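
    The kind of repository mining a tool like TracSNAP performs can be sketched briefly (the abstract does not describe TracSNAP's internals; the commit rows and the co-modification heuristic below are invented for illustration):

        # Build a developer network where an edge links authors who touched
        # the same file, weighted by how many files they share.
        import itertools
        import networkx as nx

        log = [("alice", "ocean.f90"), ("bob", "ocean.f90"),
               ("bob", "atmos.f90"), ("carol", "atmos.f90"),
               ("alice", "coupler.f90"), ("carol", "coupler.f90")]

        files = {}
        for author, path in log:
            files.setdefault(path, set()).add(author)

        g = nx.Graph()
        for path, authors in files.items():
            for a, b in itertools.combinations(sorted(authors), 2):
                w = g.get_edge_data(a, b, {"weight": 0})["weight"]
                g.add_edge(a, b, weight=w + 1)

        # Who is central to the collaboration, hence a likely expertise contact?
        print(nx.degree_centrality(g))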

  10. Modelling and interpreting spectral energy distributions of galaxies with BEAGLE

    NASA Astrophysics Data System (ADS)

    Chevallard, Jacopo; Charlot, Stéphane

    2016-10-01

    We present a new-generation tool to model and interpret spectral energy distributions (SEDs) of galaxies, which incorporates in a consistent way the production of radiation and its transfer through the interstellar and intergalactic media. This flexible tool, named BEAGLE (for BayEsian Analysis of GaLaxy sEds), allows one to build mock galaxy catalogues as well as to interpret any combination of photometric and spectroscopic galaxy observations in terms of physical parameters. The current version of the tool includes versatile modelling of the emission from stars and photoionized gas, attenuation by dust and accounting for different instrumental effects, such as spectroscopic flux calibration and line spread function. We show a first application of the BEAGLE tool to the interpretation of broad-band SEDs of a published sample of ~10^4 galaxies at redshifts 0.1 ≲ z ≲ 8. We find that the constraints derived on photometric redshifts using this multipurpose tool are comparable to those obtained using public, dedicated photometric-redshift codes and quantify this result in a rigorous statistical way. We also show how the post-processing of BEAGLE output data with the PYTHON extension PYP-BEAGLE allows the characterization of systematic deviations between models and observations, in particular through posterior predictive checks. The modular design of the BEAGLE tool allows easy extensions to incorporate, for example, the absorption by neutral galactic and circumgalactic gas, and the emission from an active galactic nucleus, dust and shock-ionized gas. Information about public releases of the BEAGLE tool will be maintained on http://www.jacopochevallard.org/beagle.

  11. Development of a Three-Dimensional, Unstructured Material Response Design Tool

    NASA Technical Reports Server (NTRS)

    Schulz, Joseph C.; Stern, Eric C.; Muppidi, Suman; Palmer, Grant E.; Schroeder, Olivia

    2017-01-01

    A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material as a result of the radiative and convective surface heating imposed on the material from the surrounding high enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be quickly added. This extensibility is critical since thermal protection systems are becoming increasingly complex, e.g. woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries. In this paper, the mathematical and numerical formulation is presented, followed by a discussion of the software architecture and some preliminary verification and validation studies.
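
    For orientation only, here is a 1-D explicit finite-volume analogue of the in-depth energy balance such a code solves; Icarus itself is three-dimensional and unstructured, and the pyrolysis source terms are omitted here. All material properties and loads below are made up.

    ```python
    # 1-D explicit finite-volume heat conduction with an imposed surface flux:
    # a conceptual sketch of a material-response energy balance, not Icarus.
    import numpy as np

    L, n      = 0.05, 50                  # slab thickness [m], number of cells
    dx        = L / n
    k, rho, c = 0.5, 280.0, 1200.0        # conductivity, density, heat capacity (made up)
    alpha     = k / (rho * c)
    dt        = 0.4 * dx**2 / alpha       # explicit stability limit
    q_surf    = 2.0e4                     # net surface heating [W/m^2] (made up)

    T = np.full(n, 300.0)                 # initial temperature [K]
    for _ in range(1000):
        F = -k * np.diff(T) / dx          # interior face fluxes, +x direction
        dTdt = np.empty(n)
        dTdt[0]    = (q_surf - F[0]) / (rho * c * dx)    # heated front cell
        dTdt[1:-1] = (F[:-1] - F[1:]) / (rho * c * dx)   # interior cells
        dTdt[-1]   = F[-1] / (rho * c * dx)              # adiabatic back cell
        T += dt * dTdt

    print(f"front-face temperature after {1000 * dt:.0f} s: {T[0]:.0f} K")
    ```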

  12. Decision models in the evaluation of psychotropic drugs: useful tool or useless toy?

    PubMed

    Barbui, Corrado; Lintas, Camilla

    2006-09-01

    A current contribution in the European Journal of Health Economics employs a decision model to compare health care costs of olanzapine and risperidone treatment for schizophrenia. The model suggests that a treatment strategy of first-line olanzapine is cost-saving over a 1-year period, with additional clinical benefits in the form of avoided relapses in the long term. From a clinical perspective this finding is indubitably relevant, but can physicians and policy makers believe it? The study is presented in a balanced way, assumptions are based on data extracted from clinical trials published in major psychiatric journals, and the theoretical underpinnings of the model are reasonable. Despite these positive aspects, we believe that the methodology used in this study, the decision model approach, is an unsuitable and potentially misleading tool for evaluating psychotropic drugs. In this commentary, taking the olanzapine vs. risperidone model as an example, arguments are provided to support this statement.

  13. CURRENT CHALLENGES ON ENDOCRINE DISRUPTORS

    EPA Science Inventory

    For over ten years, major international efforts have been aimed at understanding the mechanism and extent of endocrine disruption in experimental models, wildlife, and people; its occurrence in the real world; and the development of tools for screening and prediction of risk. ...

  14. A Database and Tool for Boundary Conditions for Regional Air Quality Modeling: Description and Evaluation

    EPA Science Inventory

    Transported air pollutants receive increasing attention as regulations tighten and global concentrations increase. The need to represent international transport in regional air quality assessments requires improved representation of boundary concentrations. Currently available ob...

  15. Service quality assessment of workers compensation health care delivery programs in New York using SERVQUAL.

    PubMed

    Arunasalam, Mark; Paulson, Albert; Wallace, William

    2003-01-01

    Preferred provider organizations (PPOs) provide healthcare services to an expanding proportion of the U.S. population. This paper presents a programmatic assessment of service quality in the workers' compensation environment using two different models: the PPO program model and the fee-for-service (FFS) payor model. The methodology used here will augment currently available research in workers' compensation, which has been lacking in measuring service quality determinants and assessing programmatic success/failure of managed-care-type programs. Results indicated that SERVQUAL provided a reliable and valid service quality assessment and showed that PPO marketers should focus on promoting physician outreach (to show empathy) and accessibility (to show reliability) for injured workers.
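
    For readers unfamiliar with the instrument, SERVQUAL quantifies service quality as the gap between perception and expectation ratings across five standard dimensions; a minimal sketch with hypothetical survey means follows.

    ```python
    # SERVQUAL gap computation with hypothetical ratings: quality per
    # dimension is mean perception minus mean expectation; negative gaps
    # flag dimensions needing attention.
    import numpy as np

    dimensions   = ["tangibles", "reliability", "responsiveness", "assurance", "empathy"]
    expectations = np.array([6.1, 6.5, 6.3, 6.4, 6.2])   # 7-point Likert means (made up)
    perceptions  = np.array([5.8, 5.9, 6.0, 6.1, 5.2])   # (made up)

    for dim, gap in zip(dimensions, perceptions - expectations):
        print(f"{dim:>14}: gap = {gap:+.1f}")
    ```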

  16. An Object-Based Approach to Evaluation of Climate Variability Projections and Predictions

    NASA Astrophysics Data System (ADS)

    Ammann, C. M.; Brown, B.; Kalb, C. P.; Bullock, R.

    2017-12-01

    Evaluations of the performance of earth system model predictions and projections are of critical importance to enhance usefulness of these products. Such evaluations need to address specific concerns depending on the system and decisions of interest; hence, evaluation tools must be tailored to inform about specific issues. Traditional approaches that summarize grid-based comparisons of analyses and models, or between current and future climate, often do not reveal important information about the models' performance (e.g., spatial or temporal displacements; the reason behind a poor score) and are unable to accommodate these specific information needs. For example, summary statistics such as the correlation coefficient or the mean-squared error provide minimal information to developers, users, and decision makers regarding what is "right" and "wrong" with a model. New spatial and temporal-spatial object-based tools from the field of weather forecast verification (where comparisons typically focus on much finer temporal and spatial scales) have been adapted to more completely answer some of the important earth system model evaluation questions. In particular, the Method for Object-based Diagnostic Evaluation (MODE) tool and its temporal (three-dimensional) extension (MODE-TD) have been adapted for these evaluations. More specifically, these tools can be used to address spatial and temporal displacements in projections of El Niño-related precipitation and/or temperature anomalies, ITCZ-associated precipitation areas, atmospheric rivers, seasonal sea-ice extent, and other features of interest. Examples of several applications of these tools in a climate context will be presented, using output of the CESM large ensemble. In general, these tools provide diagnostic information about model performance (accounting for spatial, temporal, and intensity differences) that cannot be achieved using traditional (scalar) model comparison approaches. Thus, they can provide more meaningful information that can be used in decision-making and planning. Future extensions and applications of these tools in a climate context will be considered.
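
    The object-based idea can be sketched compactly (an illustration in the spirit of MODE, not the MODE code itself): smooth a gridded field, threshold it into objects, then compare object attributes such as centroids between model and reference; all fields and thresholds below are synthetic.

    ```python
    # Toy object-based comparison: smooth, threshold, label, compare centroids.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(1)

    def make_field(cx, cy):
        y, x = np.mgrid[0:100, 0:100]
        return np.exp(-((x - cx)**2 + (y - cy)**2) / 200.0) + 0.05 * rng.random((100, 100))

    reference = make_field(40, 50)   # "observed" anomaly
    model     = make_field(55, 52)   # same feature, spatially displaced

    for name, field in [("reference", reference), ("model", model)]:
        smooth = ndimage.uniform_filter(field, size=5)       # convolution smoothing
        labels, n = ndimage.label(smooth > 0.5)              # threshold into objects
        centroids = ndimage.center_of_mass(smooth, labels, range(1, n + 1))
        print(name, "objects:", n,
              "centroids:", [tuple(round(v, 1) for v in c) for c in centroids])

    # The centroid offset quantifies a spatial displacement that a grid-wise
    # correlation or RMSE score would hide.
    ```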

  17. Integrated Exoplanet Modeling with the GSFC Exoplanet Modeling & Analysis Center (EMAC)

    NASA Astrophysics Data System (ADS)

    Mandell, Avi M.; Hostetter, Carl; Pulkkinen, Antti; Domagal-Goldman, Shawn David

    2018-01-01

    Our ability to characterize the atmospheres of extrasolar planets will be revolutionized by JWST, WFIRST and future ground- and space-based telescopes. In preparation, the exoplanet community must develop an integrated suite of tools with which we can comprehensively predict and analyze observations of exoplanets, in order to characterize the planetary environments and ultimately search them for signs of habitability and life. The GSFC Exoplanet Modeling and Analysis Center (EMAC) will be a web-accessible high-performance computing platform with science support for modelers and software developers to host and integrate their scientific software tools, with the goal of leveraging the scientific contributions from the entire exoplanet community to improve our interpretations of future exoplanet discoveries. Our suite of models will include stellar models, models for star-planet interactions, atmospheric models, planet system science models, telescope models, instrument models, and finally models for retrieving signals from observational data. By integrating this suite of models, the community will be able to self-consistently calculate the emergent spectra from the planet, whether in emission, scattering, or transmission, and use these simulations to model the performance of current and new telescopes and their instrumentation. The EMAC infrastructure will not only provide a repository for planetary and exoplanetary community models, modeling tools, and inter-model comparisons, but it will include a "run-on-demand" portal with each software tool hosted on a separate virtual machine. The EMAC system will eventually include a means of running or "checking in" new model simulations that are in accordance with the community-derived standards. Additionally, the results of inter-model comparisons will be used to produce open source publications that quantify the model comparisons and provide an overview of community consensus on model uncertainties on the climates of various planetary targets.

  18. The Pittsburgh Cervical Cancer Screening Model: a risk assessment tool.

    PubMed

    Austin, R Marshall; Onisko, Agnieszka; Druzdzel, Marek J

    2010-05-01

    Evaluation of cervical cancer screening has grown increasingly complex with the introduction of human papillomavirus (HPV) vaccination and newer screening technologies approved by the US Food and Drug Administration. To create a unique Pittsburgh Cervical Cancer Screening Model (PCCSM) that quantifies risk for histopathologic cervical precancer (cervical intraepithelial neoplasia [CIN] 2, CIN3, and adenocarcinoma in situ) and cervical cancer in an environment predominantly using newer screening technologies. The PCCSM is a dynamic Bayesian network consisting of 19 variables available in the laboratory information system, including patient history data (most recent HPV vaccination data), Papanicolaou test results, high-risk HPV results, procedure data, and histopathologic results. The model's graphic structure was based on the published literature. Results from 375 441 patient records from 2005 through 2008 were used to build and train the model. Additional data from 45 930 patients were used to test the model. The PCCSM compares risk quantitatively over time for histopathologically verifiable CIN2, CIN3, adenocarcinoma in situ, and cervical cancer in screened patients for each current cytology result category and for each HPV result. For each current cytology result, HPV test results affect risk; however, the degree of cytologic abnormality remains the largest positive predictor of risk. Prior history also alters the CIN2, CIN3, adenocarcinoma in situ, and cervical cancer risk for patients with common current cytology and HPV test results. The PCCSM can also generate negative risk projections, estimating the likelihood of the absence of histopathologic CIN2, CIN3, adenocarcinoma in situ, and cervical cancer in screened patients. The PCCSM is a dynamic Bayesian network that computes quantitative cervical disease risk estimates for patients undergoing cervical screening. Continuously updatable with current system data, the PCCSM provides a new tool to monitor cervical disease risk in the evolving postvaccination era.
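
    A drastically simplified sketch of the kind of probabilistic risk update such a model performs: the likelihoods, the naive independence assumption, and the result below are all hypothetical, and the actual PCCSM is a 19-variable dynamic Bayesian network rather than this two-test toy.

    ```python
    # Schematic Bayesian risk update (hypothetical numbers, not the PCCSM):
    # combine a baseline precancer risk with cytology and HPV likelihood ratios.
    prior = 0.01                                  # baseline CIN2+ risk (made up)

    # Hypothetical P(result | disease) and P(result | no disease).
    likelihoods = {
        "cytology_abnormal": (0.60, 0.05),
        "HPV_positive":      (0.90, 0.10),
    }

    odds = prior / (1 - prior)
    for result, (p_d, p_nd) in likelihoods.items():
        odds *= p_d / p_nd                        # multiply in each likelihood ratio
    post = odds / (1 + odds)
    print(f"posterior CIN2+ risk: {post:.1%}")    # ~52% with these toy numbers
    ```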

  19. SpaceNet: Modeling and Simulating Space Logistics

    NASA Technical Reports Server (NTRS)

    Lee, Gene; Jordan, Elizabeth; Shishko, Robert; de Weck, Olivier; Armar, Nii; Siddiqi, Afreen

    2008-01-01

    This paper summarizes the current state of the art in interplanetary supply chain modeling and discusses SpaceNet as one particular method and tool to address space logistics modeling and simulation challenges. Fundamental upgrades to the interplanetary supply chain framework such as process groups, nested elements, and cargo sharing, enabled SpaceNet to model an integrated set of missions as a campaign. The capabilities and uses of SpaceNet are demonstrated by a step-by-step modeling and simulation of a lunar campaign.

  20. Understanding and managing fish populations: keeping the toolbox fit for purpose.

    PubMed

    Paris, J R; Sherman, K D; Bell, E; Boulenger, C; Delord, C; El-Mahdi, M B M; Fairfield, E A; Griffiths, A M; Gutmann Roberts, C; Hedger, R D; Holman, L E; Hooper, L H; Humphries, N E; Katsiadaki, I; King, R A; Lemopoulos, A; Payne, C J; Peirson, G; Richter, K K; Taylor, M I; Trueman, C N; Hayden, B; Stevens, J R

    2018-03-01

    Wild fish populations are currently experiencing unprecedented pressures, which are projected to intensify in the coming decades. Developing a thorough understanding of the influences of both biotic and abiotic factors on fish populations is a salient issue in contemporary fish conservation and management. During the 50th Anniversary Symposium of The Fisheries Society of the British Isles at the University of Exeter, UK, in July 2017, scientists from diverse research backgrounds gathered to discuss key topics under the broad umbrella of 'Understanding Fish Populations'. Below, the output of one such discussion group is detailed, focusing on tools used to investigate natural fish populations. Five main groups of approaches were identified: tagging and telemetry; molecular tools; survey tools; statistical and modelling tools; tissue analyses. The appraisal covered current challenges and potential solutions for each of these topics. In addition, three key themes were identified as applicable across all tool-based applications. These included data management, public engagement, and fisheries policy and governance. The continued innovation of tools and capacity to integrate interdisciplinary approaches into the future assessment and management of fish populations is highlighted as an important focus for the next 50 years of fisheries research. © 2018 The Fisheries Society of the British Isles.

  1. Off-target model based OPC

    NASA Astrophysics Data System (ADS)

    Lu, Mark; Liang, Curtis; King, Dion; Melvin, Lawrence S., III

    2005-11-01

    Model-based optical proximity correction (OPC) has become an indispensable tool for achieving wafer pattern to design fidelity at current manufacturing process nodes. Most model-based OPC is performed considering the nominal process condition, with limited consideration of through-process manufacturing robustness. This study examines the use of off-target process models - models that represent non-nominal process states such as would occur with a dose or focus variation - to understand and manipulate the final pattern correction toward a more process-robust configuration. The study will first examine and validate the process of generating an off-target model, then examine the quality of the off-target model. Once the off-target model is proven, it will be used to demonstrate methods of generating process-robust corrections. The concepts are demonstrated using a 0.13 μm logic gate process. Preliminary indications show success in both off-target model production and process-robust corrections. With these off-target models as tools, mask production cycle times can be reduced.

  2. Transport, Acceleration and Spatial Access of Solar Energetic Particles

    NASA Astrophysics Data System (ADS)

    Borovikov, D.; Sokolov, I.; Effenberger, F.; Jin, M.; Gombosi, T. I.

    2017-12-01

    Solar Energetic Particles (SEPs) are a major branch of space weather. Often driven by Coronal Mass Ejections (CMEs), SEPs have a very high destructive potential, which includes but is not limited to disrupting communication systems on Earth, inflicting harmful and potentially fatal radiation doses to crew members onboard spacecraft and, in extreme cases, to people aboard high altitude flights. However, the research community currently lacks efficient tools to predict such hazardous SEP events. Such a tool would serve as the first step towards improving humanity's preparedness for SEP events and ultimately its ability to mitigate their effects. The main goal of the presented research is to develop a computational tool that provides the said capabilities and meets the community's demand. Our model has forecasting capability and can be the basis for an operational system that will provide live information on the current potential threats posed by SEPs based on observations of the Sun. The tool comprises several numerical models, which are designed to simulate different physical aspects of SEPs. The background conditions in the interplanetary medium, in particular the Coronal Mass Ejection driving the particle acceleration, play a defining role and are simulated with the state-of-the-art MHD solver, the Block-Adaptive-Tree Solar-wind Roe-type Upwind Scheme (BATS-R-US). The newly developed particle code, the Multiple-Field-Line-Advection Model for Particle Acceleration (M-FLAMPA), simulates the actual transport and acceleration of SEPs and is coupled to the MHD code. The special property of SEPs, their tendency to follow magnetic lines of force, is fully exploited in the computational model, which replaces a complicated 3-D model with a multitude of 1-D models. This approach significantly simplifies computations and improves the time performance of the overall model. It also plays the important role of mapping the affected region by connecting it with the origin of SEPs at the solar surface. Our model incorporates the effects of near-Sun field line meandering, which affects the perpendicular transport of SEPs and can explain the large longitudinal spread observed even in the early phases of such events.
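
    The field-line decomposition can be illustrated with a toy 1-D step: along each line, SEP transport reduces to advection(-diffusion) of the particle distribution. The sketch below advances a pulse with a first-order upwind scheme; all values are hypothetical and diffusion and acceleration terms are omitted.

    ```python
    # Toy 1-D advection along a single field line (illustrative, not M-FLAMPA).
    import numpy as np

    n  = 200
    s  = np.linspace(0.0, 1.0, n)          # normalized distance along the field line
    u  = 0.4                               # effective propagation speed (u > 0)
    ds = s[1] - s[0]
    dt = 0.8 * ds / u                      # CFL-limited time step

    f = np.exp(-((s - 0.1) / 0.02) ** 2)   # particles injected near the Sun
    for _ in range(150):
        f[1:] -= u * dt / ds * (f[1:] - f[:-1])   # first-order upwind update
        f[0] = 0.0                                # no further injection

    print("pulse peak now at s =", round(s[np.argmax(f)], 2))   # ~0.7
    ```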

  3. A review of fracture mechanics life technology

    NASA Technical Reports Server (NTRS)

    Thomas, J. M.; Besuner, P. M.; Harris, D. O.

    1985-01-01

    Current lifetime prediction technology for structural components subjected to cyclic loads was reviewed. The central objectives of the project were to report the current state of and recommend future development of fracture mechanics-based analytical tools for modeling and forecasting subcritical fatigue crack growth in structures. Of special interest to NASA was the ability to apply these tools to practical engineering problems and the developmental steps necessary to bring vital technologies to this stage. A survey of published literature and numerous discussions with experts in the field of fracture mechanics life technology were conducted. One of the key points made is that fracture mechanics analyses of crack growth often involve consideration of fatigue and fracture under extreme conditions. Therefore, inaccuracies in predicting component lifetime will be dominated by inaccuracies in environment and fatigue crack growth relations, stress intensity factor solutions, and methods used to model given loads and stresses. Suggestions made for reducing these inaccuracies include: development of improved models of subcritical crack growth, research efforts aimed at better characterizing residual and assembly stresses that can be introduced during fabrication, and more widespread and uniform use of the best existing methods.
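
    A worked example of the kind of relation such analyses integrate is the Paris law for subcritical fatigue crack growth, da/dN = C(ΔK)^m with ΔK = YΔσ√(πa). The constants and loading below are illustrative, not taken from the report.

    ```python
    # Cycle-by-cycle integration of Paris-law crack growth (illustrative inputs).
    import math

    C, m   = 1.0e-11, 3.0      # Paris constants (SI units: m/cycle, MPa*sqrt(m))
    Y      = 1.12              # geometry factor, edge crack
    dS     = 100.0             # stress range [MPa]
    a      = 0.001             # initial crack length [m]
    a_crit = 0.025             # critical crack length [m]

    cycles = 0
    while a < a_crit:
        dK = Y * dS * math.sqrt(math.pi * a)   # stress intensity factor range
        a += C * dK**m                         # crack growth per cycle
        cycles += 1

    print(f"predicted life: {cycles:,} cycles")
    ```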

  4. The SCEC Community Modeling Environment (SCEC/CME) - An Overview of its Architecture and Current Capabilities

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Minster, B.; Moore, R.; Kesselman, C.; SCEC ITR Collaboration

    2004-12-01

    The Southern California Earthquake Center (SCEC), in collaboration with the San Diego Supercomputer Center, the USC Information Sciences Institute, the Incorporated Research Institutions for Seismology, and the U.S. Geological Survey, is developing the Southern California Earthquake Center Community Modeling Environment (CME) under a five-year grant from the National Science Foundation's Information Technology Research (ITR) Program jointly funded by the Geosciences and Computer and Information Science & Engineering Directorates. The CME system is an integrated geophysical simulation modeling framework that automates the process of selecting, configuring, and executing models of earthquake systems. During the Project's first three years, we have performed fundamental geophysical and information technology research and have also developed substantial system capabilities, software tools, and data collections that can help scientists perform systems-level earthquake science. The CME system provides collaborative tools to facilitate distributed research and development. These collaborative tools are primarily communication tools, providing researchers with access to information in ways that are convenient and useful. The CME system provides collaborators with access to significant computing and storage resources. The computing resources of the Project include in-house servers, Project allocations on the USC High Performance Computing Linux Cluster, as well as allocations on NPACI Supercomputers and the TeraGrid. The CME system provides access to SCEC community geophysical models such as the Community Velocity Model, Community Fault Model, Community Crustal Motion Model, and the Community Block Model. The organizations that develop these models often provide access to them so it is not necessary to use the CME system to access these models. However, in some cases, the CME system supplements the SCEC community models with utility codes that make it easier to use or access these models. In some cases, the CME system also provides alternatives to the SCEC community models. The CME system hosts a collection of community geophysical software codes. These codes include seismic hazard analysis (SHA) programs developed by the SCEC/USGS OpenSHA group. Also, the CME system hosts anelastic wave propagation codes including Kim Olsen's Finite Difference code and Carnegie Mellon's Hercules Finite Element tool chain. The CME system can execute a workflow, that is, a series of geophysical computations using the output of one processing step as the input to a subsequent step. Our workflow capability utilizes grid-based computing software that can submit calculations to a pool of computing resources as well as data management tools that help us maintain an association between data files and metadata descriptions of those files. The CME system maintains, and provides access to, a collection of valuable geophysical data sets. The current CME Digital Library holdings include a collection of 60 ground motion simulation results calculated by a SCEC/PEER working group and a collection of Green's functions calculated for 33 TriNet broadband receiver sites in the Los Angeles area.

  5. QuakeSim 2.0

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay W.; Lyzenga, Gregory A.; Granat, Robert A.; Norton, Charles D.; Rundle, John B.; Pierce, Marlon E.; Fox, Geoffrey C.; McLeod, Dennis; Ludwig, Lisa Grant

    2012-01-01

    QuakeSim 2.0 improves understanding of earthquake processes by providing modeling tools and integrating model applications and various heterogeneous data sources within a Web services environment. QuakeSim is a multisource, synergistic, data-intensive environment for modeling the behavior of earthquake faults individually, and as part of complex interacting systems. Remotely sensed geodetic data products may be explored, compared with faults and landscape features, mined by pattern analysis applications, and integrated with models and pattern analysis applications in a rich Web-based and visualization environment. Integration of heterogeneous data products with pattern informatics tools enables efficient development of models. Federated database components and visualization tools allow rapid exploration of large datasets, while pattern informatics enables identification of subtle, but important, features in large data sets. QuakeSim is valuable for earthquake investigations and modeling in its current state, and also serves as a prototype and nucleus for broader systems under development. The framework provides access to physics-based simulation tools that model the earthquake cycle and related crustal deformation. Spaceborne GPS and Interferometric Synthetic Aperture Radar (InSAR) data provide information on near-term crustal deformation, while paleoseismic geologic data provide longer-term information on earthquake fault processes. These data sources are integrated into QuakeSim's QuakeTables database system, and are accessible by users or various model applications. UAVSAR repeat pass interferometry data products are added to the QuakeTables database, and are available through a browseable map interface or Representational State Transfer (REST) interfaces. Model applications can retrieve data from QuakeTables, or from third-party GPS velocity data services; alternatively, users can manually input parameters into the models. Pattern analysis of GPS and seismicity data has proved useful for mid-term forecasting of earthquakes, and for detecting subtle changes in crustal deformation. The GPS time series analysis has also proved useful as a data-quality tool, enabling the discovery of station anomalies and data processing and distribution errors. Improved visualization tools enable more efficient data exploration and understanding. Tools provide flexibility to science users for exploring data in new ways through download links, but also facilitate standard, intuitive, and routine uses for science users and end users such as emergency responders.

  6. Are We Teaching Them Anything?: A Model for Measuring Methodology Skills in the Political Science Major

    ERIC Educational Resources Information Center

    Siver, Christi; Greenfest, Seth W.; Haeg, G. Claire

    2016-01-01

    While the literature emphasizes the importance of teaching political science students methods skills, there currently exists little guidance for how to assess student learning over the course of their time in the major. To address this gap, we develop a model set of assessment tools that may be adopted and adapted by political science departments…

  7. Community College Capital Analysis Model; A Report to the Washington State Legislature. Performance Audit Report No. 75-12.

    ERIC Educational Resources Information Center

    O'Brien, John E.

    This performance audit was conducted to provide the Legislature with an evaluation of the Capital Analysis Model (CAM) utilized in the development of the Washington State Community College System capital budget request to the Legislature. The CAM is a tool for measuring projected capital facilities needs in relation to current capital facilities,…

  8. Using the SAMR Model as a Framework for Evaluating mLearning Activities and Supporting a Transformation of Learning

    ERIC Educational Resources Information Center

    Pfaffe, Linda D.

    2017-01-01

    An examination of the perceptions of mLearning held by secondary teachers as well as mLearning best practices and professional development. Using the SAMR model (Puentedura, 2013) along with Romrell's (2014) definition of mLearning, this study evaluated mLearning activities (tools and applications) currently being used by secondary school…

  9. NASA Enterprise Architecture and Its Use in Transition of Research Results to Operations

    NASA Astrophysics Data System (ADS)

    Frisbie, T. E.; Hall, C. M.

    2006-12-01

    Enterprise architecture describes the design of the components of an enterprise, their relationships and how they support the objectives of that enterprise. NASA Stennis Space Center leads several projects involving enterprise architecture tools used to gather information on research assets within NASA's Earth Science Division. In the near future, enterprise architecture tools will link and display the relevant requirements, parameters, observatories, models, decision systems, and benefit/impact information relationships and map to the Federal Enterprise Architecture Reference Models. Components configured within the enterprise architecture serving the NASA Applied Sciences Program include the Earth Science Components Knowledge Base, the Systems Components database, and the Earth Science Architecture Tool. The Earth Science Components Knowledge Base systematically catalogues NASA missions, sensors, models, data products, model products, and network partners appropriate for consideration in NASA Earth Science applications projects. The Systems Components database is a centralized information warehouse of NASA's Earth Science research assets and a critical first link in the implementation of enterprise architecture. The Earth Science Architecture Tool is used to analyze potential NASA candidate systems that may be beneficial to decision-making capabilities of other Federal agencies. Use of the current configuration of NASA enterprise architecture (the Earth Science Components Knowledge Base, the Systems Components database, and the Earth Science Architecture Tool) has far exceeded its original intent and has tremendous potential for the transition of research results to operational entities.

  10. Monitoring Drought Conditions in the Navajo Nation Using NASA Earth Observations

    NASA Technical Reports Server (NTRS)

    Ly, Vickie; Gao, Michael; Cary, Cheryl; Turnbull-Appell, Sophie; Surunis, Anton

    2016-01-01

    The Navajo Nation, a 65,700 sq km Native American territory located in the southwestern United States, has been increasingly impacted by severe drought events and changes in climate. These events are coupled with a lack of domestic water infrastructure and economic resources, leaving approximately one-third of the population without access to potable water in their homes. Current methods of monitoring drought are dependent on state-based monthly Standardized Precipitation Index value maps calculated by the Western Regional Climate Center. However, these maps do not provide the spatial resolution needed to illustrate differences in drought severity across the vast Nation. To better understand and monitor drought events and drought regime changes in the Navajo Nation, this project created a geodatabase of historical climate information specific to the area, and a decision support tool to calculate average Standardized Precipitation Index values for user-specified areas. The tool and geodatabase use Tropical Rainfall Measuring Mission (TRMM) and Global Precipitation Measurement (GPM) observed precipitation data and Parameter-elevation Relationships on Independent Slopes Model (PRISM) modeled historical precipitation data, as well as deep soil moisture, evaporation, and transpiration data products modeled by NASA's Land Data Assimilation Systems. The geodatabase and decision support tool will allow resource managers in the Navajo Nation to utilize current and future NASA Earth observation data for increased decision-making capacity regarding future climate change impact on water resources.
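
    A minimal sketch of a Standardized Precipitation Index calculation for a single area-averaged series (the actual tool first aggregates gridded TRMM/GPM data over a user-specified area, and operational SPI additionally handles zero-precipitation months with a mixed distribution): fit a gamma distribution to the precipitation record, then map each value's cumulative probability onto a standard normal deviate.

    ```python
    # SPI sketch for one (synthetic) monthly precipitation record.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    monthly_precip = rng.gamma(shape=2.0, scale=15.0, size=360)  # 30 years, mm (synthetic)

    shape, loc, scale = stats.gamma.fit(monthly_precip, floc=0)  # fit gamma to the record
    cdf = stats.gamma.cdf(monthly_precip, shape, loc=loc, scale=scale)
    spi = stats.norm.ppf(cdf)                                    # equiprobability transform

    print("latest month SPI:", round(spi[-1], 2))  # <= -1 is commonly read as drought
    ```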

  11. Impact and Penetration Simulations for Composite Wing-like Structures

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project was to develop methodologies for the analysis of wing-like structures subjected to impact loadings. Low-speed impact causing either no damage or only minimal damage and high-speed impact causing severe laminate damage and possible penetration of the structure were to be considered during this research effort. To address this goal, an assessment of current analytical tools for impact analysis was performed. The assessment considered the accuracy, geometry modeling, and damage modeling of the available impact and penetration simulation tools, as well as their robustness, efficiency, and suitability for use in a wing design environment. Following the qualitative assessment, selected quantitative evaluations were to be performed using the leading simulation tools. Based on this assessment, future research thrusts for impact and penetration simulation of composite wing-like structures were identified.

  12. BDA special care case mix model.

    PubMed

    Bateman, P; Arnold, C; Brown, R; Foster, L V; Greening, S; Monaghan, N; Zoitopoulos, L

    2010-04-10

    Routine dental care provided in special care dentistry is complicated by patient-specific factors which increase the time taken and costs of treatment. The BDA have developed and conducted a field trial of a case mix tool to measure this complexity. For each episode of care, the case mix tool assesses the following on a four-point scale: 'ability to communicate', 'ability to cooperate', 'medical status', 'oral risk factors', 'access to oral care' and 'legal and ethical barriers to care'. The tool is reported to be easy to use and captures sufficient detail to discriminate between types of service and special care dentistry provided. It offers potential as a simple-to-use and clinically relevant source of performance management and commissioning data. This paper describes the model, demonstrates how it is currently being used, and considers future developments in its use.
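
    A minimal illustration of the case-mix idea follows; the 0-3 ratings stand in for the tool's four-point scale, and the BDA tool's actual rating anchors and any weighting or banding are not reproduced here.

    ```python
    # Hypothetical case-mix style score over the six published criteria.
    criteria = ["communication", "cooperation", "medical_status",
                "oral_risk_factors", "access_to_care", "legal_ethical_barriers"]

    def case_mix_score(ratings):
        """Sum of six criteria, each rated 0-3 (0 = standard care)."""
        assert set(ratings) == set(criteria)
        assert all(0 <= ratings[c] <= 3 for c in criteria)
        return sum(ratings.values())

    episode = dict.fromkeys(criteria, 0) | {"cooperation": 2, "medical_status": 1}
    print("episode complexity score:", case_mix_score(episode))   # 3
    ```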

  13. Simulated Design Strategies for SPECT Collimators to Reduce the Eddy Currents Induced by MRI Gradient Fields

    NASA Astrophysics Data System (ADS)

    Samoudi, Amine M.; Van Audenhaege, Karen; Vermeeren, Günter; Verhoyen, Gregory; Martens, Luc; Van Holen, Roel; Joseph, Wout

    2015-10-01

    Combining single photon emission computed tomography (SPECT) with magnetic resonance imaging (MRI) requires the insertion of highly conductive SPECT collimators inside the MRI scanner, resulting in induced eddy currents that disturb the combined system. We reduced the eddy currents due to the insertion of a novel tungsten collimator inside transverse and longitudinal gradient coils. The collimator, produced with metal additive manufacturing, is part of a microSPECT insert for a preclinical SPECT/MRI scanner. We characterized the induced magnetic field due to the gradient field and adapted the collimators to reduce the induced eddy currents. We modeled the x-, y-, and z-gradient coils and the different collimator designs and simulated them with FEKO, a three-dimensional method of moments/finite element method (MoM/FEM) full-wave simulation tool. We used a time-analysis approach to generate the pulsed magnetic field gradient. Simulation results show that the maximum induced field can be reduced by 50.82% in the final design, bringing the maximum induced magnetic field to less than 2% of the applied gradient for all the gradient coils. The numerical model was validated with measurements and is proposed as a tool for studying the effect of a SPECT collimator within the MRI gradient coils.

  14. Online and Certifiable Spectroscopy Courses Using Information and Communication Tools. a Model for Classrooms and Beyond

    NASA Astrophysics Data System (ADS)

    Krishnan, Mangala Sunder

    2015-06-01

    Online education tools, flipped (reverse) class models for teaching and learning, and pedagogic and andragogic approaches to self-learning have become quite mature in the last few years because of the revolution in video, interactive software and social learning tools. Open Educational Resources of dependable quality and variety are also becoming available throughout the world, making the current era truly a renaissance period for higher education using the Internet. In my presentation, I shall highlight structured online course content preparation in several areas of spectroscopy, and also the design and development of virtual lab tools and kits for studying optical spectroscopy. Both elementary and advanced courses on molecular spectroscopy are currently under development jointly with researchers in other institutions in India. I would like to explore participation from teachers throughout the world in the teaching-learning process using flipped class methods for topics such as experimental and theoretical microwave spectroscopy of semi-rigid and non-rigid molecules, molecular complexes and aggregates. In addition, courses in Raman and infrared spectroscopy experimentation, as well as advanced electronic spectroscopy courses, are envisaged for free online access. The National Programme on Technology Enhanced Learning (NPTEL) and the National Mission on Education through Information and Communication Technology (NMEICT) are two large Government of India funded initiatives for producing certified and self-learning courses, with financial support for moderated discussion forums. The learning tools and interactive presentations so developed can be used in classrooms throughout the world in a flipped mode of teaching. They are very much sought after by learners and researchers who are in other areas of learning but want to contribute to research and development through interdisciplinary learning. NPTEL is currently experimenting with a Massive Open Online Course (MOOC) strategy, but with proctored and certified examination processes for large numbers in some of the above courses. I would like to present a summary of developments in these areas to help focus classroom (online and offline) learning of molecular spectroscopy.

  15. DarkBit: a GAMBIT module for computing dark matter observables and likelihoods

    NASA Astrophysics Data System (ADS)

    Bringmann, Torsten; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Edsjö, Joakim; Farmer, Ben; Kahlhoefer, Felix; Kvellestad, Anders; Putze, Antje; Savage, Christopher; Scott, Pat; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-12-01

    We introduce DarkBit, an advanced software code for computing dark matter constraints on various extensions to the Standard Model of particle physics, comprising both new native code and interfaces to external packages. This release includes a dedicated signal yield calculator for gamma-ray observations, which significantly extends current tools by implementing a cascade-decay Monte Carlo, as well as a dedicated likelihood calculator for current and future experiments (gamLike). This provides a general solution for studying complex particle physics models that predict dark matter annihilation to a multitude of final states. We also supply a direct detection package that models a large range of direct detection experiments (DDCalc), and that provides the corresponding likelihoods for arbitrary combinations of spin-independent and spin-dependent scattering processes. Finally, we provide custom relic density routines along with interfaces to DarkSUSY, micrOMEGAs, and the neutrino telescope likelihood package nulike. DarkBit is written in the framework of the Global And Modular Beyond the Standard Model Inference Tool (GAMBIT), providing seamless integration into a comprehensive statistical fitting framework that allows users to explore new models with both particle and astrophysics constraints, and a consistent treatment of systematic uncertainties. In this paper we describe its main functionality, provide a guide to getting started quickly, and show illustrative examples for results obtained with DarkBit (both as a stand-alone tool and as a GAMBIT module). This includes a quantitative comparison between two of the main dark matter codes (DarkSUSY and micrOMEGAs), and application of DarkBit's advanced direct and indirect detection routines to a simple effective dark matter model.

  16. An ontology-driven, diagnostic modeling system.

    PubMed

    Haug, Peter J; Ferraro, Jeffrey P; Holmen, John; Wu, Xinzi; Mynam, Kumar; Ebert, Matthew; Dean, Nathan; Jones, Jason

    2013-06-01

    To present a system that uses knowledge stored in a medical ontology to automate the development of diagnostic decision support systems. To illustrate its function through an example focused on the development of a tool for diagnosing pneumonia. We developed a system that automates the creation of diagnostic decision-support applications. It relies on a medical ontology to direct the acquisition of clinical data from a clinical data warehouse and uses an automated analytic system to apply a sequence of machine learning algorithms that create applications for diagnostic screening. We refer to this system as the ontology-driven diagnostic modeling system (ODMS). We tested this system using samples of patient data collected in Salt Lake City emergency rooms and stored in Intermountain Healthcare's enterprise data warehouse. The system was used in the preliminary development steps of a tool to identify patients with pneumonia in the emergency department. This tool was compared with a manually created diagnostic tool derived from a curated dataset. The manually created tool is currently in clinical use. The automatically created tool had an area under the receiver operating characteristic curve of 0.920 (95% CI 0.916 to 0.924), compared with 0.944 (95% CI 0.942 to 0.947) for the manually created tool. Initial testing of the ODMS demonstrates promising accuracy for the highly automated results and illustrates the route to model improvement. The use of medical knowledge, embedded in ontologies, to direct the initial development of diagnostic computing systems appears feasible.

  17. Development and preliminary validation of the EASE: a tool to measure perceived singing voice function.

    PubMed

    Phyland, Debra J; Pallant, Julie F; Benninger, Michael S; Thibeault, Susan L; Greenwood, Ken M; Smith, Julian A; Vallance, Neil

    2013-07-01

    Most voice self-rating tools are disease-specific measures and are not suitable for use with healthy voice users. There is a need for a tool that is sensitive to the subtleties of a singer's voice and to perceived physical changes in the singing voice mechanism as a function of load. The aim of this study was to devise and validate a scale to assess singers' perceptions of the current status of their singing voice. Ninety-five vocal health descriptors were collected from focus group interviews of singers. These were reviewed by 25 currently performing music theater (MT) singers. Based on a consensus technique, the number of descriptors was decreased to 42 items. These were administered to a sample of 284 professional MT singers using an online survey to evaluate their perception of current singing voice status. Principal component analysis identified two subsets of items. Rasch analysis was used to evaluate and refine these sets of items to form two 10-item subscales. Both subscales demonstrated good overall fit to the Rasch model, no differential item functioning by sex or age, and good internal consistency reliability. The two subscales were strongly correlated and subsequent Rasch analysis supported their combination to form a single 20-item scale with good psychometric properties. The Evaluation of the Ability to Sing Easily (EASE) is a concise clinical tool to assess singers' perceptions of the current status of their singing voice with good measurement properties. EASE may prove a useful tool to measure changes in the singing voice as indicators of the effect of vocal load. Furthermore, it may offer a valuable means for the prediction or screening of singers "at risk" of developing voice disorders. Copyright © 2013 The Voice Foundation. All rights reserved.
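
    For context, the dichotomous Rasch model underlying such analyses can be stated in a few lines: the probability that a person of ability theta endorses an item of difficulty b is logistic in (theta - b). EASE items are polytomous, which uses a rating-scale extension of this same form; the sketch below shows only the basic dichotomous case.

    ```python
    # Dichotomous Rasch model probability (illustrative values).
    import math

    def rasch_p(theta, b):
        """P(response = 1 | person ability theta, item difficulty b)."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    for theta in (-1.0, 0.0, 1.0):
        print(theta, [round(rasch_p(theta, b), 2) for b in (-0.5, 0.0, 0.5)])
    ```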

  18. Planform: an application and database of graph-encoded planarian regenerative experiments.

    PubMed

    Lobo, Daniel; Malone, Taylor J; Levin, Michael

    2013-04-15

    Understanding the mechanisms governing the regeneration capabilities of many organisms is a fundamental interest in biology and medicine. An ever-increasing number of manipulation and molecular experiments are attempting to discover a comprehensive model for regeneration, with the planarian flatworm being one of the most important model species. Despite much effort, no comprehensive, constructive, mechanistic models exist yet, and it is now clear that computational tools are needed to mine this huge dataset. However, until now, there has been no database of regenerative experiments, and the current genotype-phenotype ontologies and databases are based on textual descriptions, which are not understandable by computers. To overcome these difficulties, we present here Planform (Planarian formalization), a manually curated database and software tool for planarian regenerative experiments, based on a mathematical graph formalism. The database contains more than a thousand experiments from the main publications in the planarian literature. The software tool provides the user with a graphical interface to easily interact with and mine the database. The presented system is a valuable resource for the regeneration community and, more importantly, will pave the way for the application of novel artificial intelligence tools to extract knowledge from this dataset. The database and software tool are freely available at http://planform.daniel-lobo.com.

  19. Estimation of tool wear during CNC milling using neural network-based sensor fusion

    NASA Astrophysics Data System (ADS)

    Ghosh, N.; Ravi, Y. B.; Patra, A.; Mukhopadhyay, S.; Paul, S.; Mohanty, A. R.; Chattopadhyay, A. B.

    2007-01-01

    Cutting tool wear degrades the product quality in manufacturing processes. Monitoring the tool wear value online is therefore needed to prevent degradation in machining quality. Unfortunately there is no direct way of measuring the tool wear online. Therefore one has to adopt an indirect method wherein the tool wear is estimated from several sensors measuring related process variables. In this work, a neural network-based sensor fusion model has been developed for tool condition monitoring (TCM). Features extracted from a number of machining zone signals, namely cutting forces, spindle vibration, spindle current, and sound pressure level, have been fused to estimate the average flank wear of the main cutting edge. Novel strategies such as signal-level segmentation for temporal registration, feature-space filtering, outlier removal, and estimation-space filtering have been proposed. The proposed approach has been validated by both laboratory and industrial implementations.
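
    A minimal stand-in for the sensor-fusion step: several signal features feed a small neural network that regresses flank wear. The data and the wear relation below are synthetic; the original work used features extracted from measured cutting-force, vibration, spindle-current, and sound-pressure signals.

    ```python
    # Toy neural-network sensor fusion for wear estimation (synthetic data).
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(7)
    X = rng.random((300, 4))                   # [force, vibration, current, sound] features
    wear = (0.4 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2] + 0.1 * X[:, 3]
            + 0.02 * rng.normal(size=300))     # hypothetical wear relation, mm

    X_tr, X_te, y_tr, y_te = train_test_split(X, wear, random_state=0)
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
    model.fit(X_tr, y_tr)
    print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))
    ```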

  20. Transposons As Tools for Functional Genomics in Vertebrate Models.

    PubMed

    Kawakami, Koichi; Largaespada, David A; Ivics, Zoltán

    2017-11-01

    Genetic tools and mutagenesis strategies based on transposable elements are currently under development with a vision to link primary DNA sequence information to gene functions in vertebrate models. By virtue of their inherent capacity to insert into DNA, transposons can be developed into powerful tools for chromosomal manipulations. Transposon-based forward mutagenesis screens have numerous advantages including high throughput, easy identification of mutated alleles, and providing insight into genetic networks and pathways based on phenotypes. For example, the Sleeping Beauty transposon has become highly instrumental to induce tumors in experimental animals in a tissue-specific manner with the aim of uncovering the genetic basis of diverse cancers. Here, we describe a battery of mutagenic cassettes that can be applied in conjunction with transposon vectors to mutagenize genes, and highlight versatile experimental strategies for the generation of engineered chromosomes for loss-of-function as well as gain-of-function mutagenesis for functional gene annotation in vertebrate models, including zebrafish, mice, and rats. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. A flexible tool for diagnosing water, energy, and entropy budgets in climate models

    NASA Astrophysics Data System (ADS)

    Lembo, Valerio; Lucarini, Valerio

    2017-04-01

    We have developed a new, flexible software package for studying the global energy budget, the hydrological cycle, and the material entropy production of global climate models. The program receives as input radiative, latent and sensible energy fluxes, with the requirement that the variable names are in agreement with the Climate and Forecast (CF) conventions for the production of NetCDF datasets. Annual mean maps, meridional sections and time series are computed by means of the Climate Data Operators (CDO) collection of command-line operators developed at the Max Planck Institute for Meteorology (MPI-M). If a land-sea mask is provided, the program also computes the required quantities separately over the continents and oceans. Depending on the user's choice, the program also calls MATLAB to compute meridional heat transports and the locations and intensities of their peaks in the two hemispheres. We are currently planning to adapt the program for inclusion in the Earth System Model eValuation Tool (ESMValTool) community diagnostics.
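
    The core arithmetic of such a budget diagnostic is an area-weighted global mean. A sketch on a synthetic net top-of-atmosphere flux field follows; the actual tool operates on CF-compliant NetCDF fields via CDO rather than on in-memory arrays.

    ```python
    # Area-weighted global mean of a (synthetic) net TOA flux field.
    import numpy as np

    lats = np.linspace(-89.5, 89.5, 180)
    rng = np.random.default_rng(3)
    net_toa = 0.8 + 10 * rng.normal(size=(180, 360))   # W/m^2, synthetic

    w = np.cos(np.deg2rad(lats))                 # grid-cell area weight ~ cos(latitude)
    zonal_mean = net_toa.mean(axis=1)
    global_mean = np.sum(zonal_mean * w) / np.sum(w)
    print(f"global-mean TOA imbalance: {global_mean:.2f} W/m^2")
    ```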

  2. RHydro - Hydrological models and tools to represent and analyze hydrological data in R

    NASA Astrophysics Data System (ADS)

    Reusser, Dominik; Buytaert, Wouter

    2010-05-01

    In hydrology, basic equations and procedures keep being implemented from scratch by scientists, with the potential for errors and inefficiency. The use of libraries can overcome these problems. Other scientific disciplines such as mathematics and physics have benefited significantly from such an approach, with freely available implementations of many routines. As an example, hydrological libraries could contain: major representations of hydrological processes such as infiltration, sub-surface runoff and routing algorithms; scaling functions, for instance to combine remote sensing precipitation fields with rain gauge data; data consistency checks; and performance measures. Here we present a beginning for such a library, implemented in the high-level data programming language R. Currently, TOPMODEL, data import routines for WaSiM-ETH, as well as basic visualization and evaluation tools are implemented. The design is such that defining import scripts for additional models is sufficient to gain access to the full set of evaluation and visualization tools.

  3. Modeling the Energy Use of a Connected and Automated Transportation System (Poster)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonder, J.; Brown, A.

    Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use: dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.

  4. Induced Pluripotent Stem Cell Models to Enable In Vitro Models for Screening in the Central Nervous System.

    PubMed

    Hunsberger, Joshua G; Efthymiou, Anastasia G; Malik, Nasir; Behl, Mamta; Mead, Ivy L; Zeng, Xianmin; Simeonov, Anton; Rao, Mahendra

    2015-08-15

    There is great need to develop more predictive drug discovery tools to identify new therapies to treat diseases of the central nervous system (CNS). Current nonpluripotent stem cell-based models often utilize non-CNS immortalized cell lines and do not enable the development of personalized models of disease. In this review, we discuss why in vitro models are necessary for translational research and outline the unique advantages of induced pluripotent stem cell (iPSC)-based models over those of current systems. We suggest that iPSC-based models can be patient specific and isogenic lines can be differentiated into many neural cell types for detailed comparisons. iPSC-derived cells can be combined to form small organoids, or large panels of lines can be developed that enable new forms of analysis. iPSC and embryonic stem cell-derived cells can be readily engineered to develop reporters for lineage studies or mechanism of action experiments further extending the utility of iPSC-based systems. We conclude by describing novel technologies that include strategies for the development of diversity panels, novel genomic engineering tools, new three-dimensional organoid systems, and modified high-content screens that may bring toxicology into the 21st century. The strategic integration of these technologies with the advantages of iPSC-derived cell technology, we believe, will be a paradigm shift for toxicology and drug discovery efforts.

  5. Advanced computer techniques for inverse modeling of electric current in cardiac tissue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hutchinson, S.A.; Romero, L.A.; Diegert, C.F.

    1996-08-01

    For many years, ECGs and vector cardiograms have been the tools of choice for non-invasive diagnosis of cardiac conduction problems, such as found in reentrant tachycardia or Wolff-Parkinson-White (WPW) syndrome. Through skillful analysis of these skin-surface measurements of cardiac generated electric currents, a physician can deduce the general location of heart conduction irregularities. Using a combination of high-fidelity geometry modeling, advanced mathematical algorithms and massively parallel computing, Sandia's approach would provide much more accurate information and thus allow the physician to pinpoint the source of an arrhythmia or abnormal conduction pathway.

  6. On simulation of local fluxes in molecular junctions

    NASA Astrophysics Data System (ADS)

    Cabra, Gabriel; Jensen, Anders; Galperin, Michael

    2018-05-01

    We present a pedagogical review of current density simulation in molecular junction models, indicating its advantages and deficiencies in the analysis of local junction transport characteristics. In particular, we argue that current density is a universal tool which provides more information than traditionally simulated bond currents, especially when discussing inelastic processes. However, current density simulations are sensitive to the choice of basis and electronic structure method. We note that when discussing local current conservation in junctions, one has to account for the source term caused by the open character of the system and intra-molecular interactions. Our considerations are illustrated with numerical simulations of a benzenedithiol molecular junction.

  7. In-silico wear prediction for knee replacements--methodology and corroboration.

    PubMed

    Strickland, M A; Taylor, M

    2009-07-22

    The capability to predict in-vivo wear of knee replacements is a valuable pre-clinical analysis tool for implant designers. Traditionally, time-consuming experimental tests provided the principal means of investigating wear. Today, computational models offer an alternative. However, the validity of these models has not been demonstrated across a range of designs and test conditions, and several different formulas are in contention for estimating wear rates, limiting confidence in the predictive power of these in-silico models. This study collates and retrospectively simulates a wide range of experimental wear tests using fast rigid-body computational models with extant wear prediction algorithms, to assess the performance of current in-silico wear prediction tools. The number of tests corroborated gives a broader, more general assessment of the performance of these wear-prediction tools, and provides better estimates of the wear 'constants' used in computational models. High-speed rigid-body modelling allows a range of alternative algorithms to be evaluated. Whilst most cross-shear (CS)-based models perform comparably, the 'A/A+B' wear model appears to offer the best predictive power amongst existing wear algorithms. However, the range and variability of experimental data leaves considerable uncertainty in the results. More experimental data with reduced variability and more detailed reporting of studies will be necessary to corroborate these models with greater confidence. With simulation times reduced to only a few minutes, these models are ideally suited to large-volume 'design of experiment' or probabilistic studies (which are essential if pre-clinical assessment tools are to begin addressing the degree of variation observed clinically and in explanted components).
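
    For orientation, the classical Archard-type wear estimate is the baseline that the cross-shear formulations discussed here modify: wear volume proportional to the wear factor times load times sliding distance. All inputs below are illustrative only, not taken from the study.

    ```python
    # Archard-type wear volume estimate, V = k * L * s (illustrative inputs).
    k_wear  = 1.0e-7      # mm^3 per N*m, hypothetical wear factor
    load    = 2000.0      # N, representative knee joint load (illustrative)
    s_cycle = 0.02        # m of sliding per gait cycle (illustrative)
    cycles  = 5.0e6       # ~5 years of walking

    V = k_wear * load * s_cycle * cycles
    print(f"estimated wear volume: {V:.1f} mm^3")   # 20.0 mm^3 with these inputs
    ```

    Cross-shear-based models replace the single wear factor with a dependence on the degree of multidirectional sliding at each contact point, which is why they can discriminate between kinematic test conditions that an Archard estimate treats identically.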

  8. Experimental Evaluation of Acoustic Engine Liner Models Developed with COMSOL Multiphysics

    NASA Technical Reports Server (NTRS)

    Schiller, Noah H.; Jones, Michael G.; Bertolucci, Brandon

    2017-01-01

    Accurate modeling tools are needed to design new engine liners capable of reducing aircraft noise. The purpose of this study is to determine if a commercially-available finite element package, COMSOL Multiphysics, can be used to accurately model a range of different acoustic engine liner designs, and in the process, collect and document a benchmark dataset that can be used in both current and future code evaluation activities. To achieve these goals, a variety of liner samples, ranging from conventional perforate-over-honeycomb to extended-reaction designs, were installed in one wall of the grazing flow impedance tube at the NASA Langley Research Center. The liners were exposed to high sound pressure levels and grazing flow, and the effect of the liner on the sound field in the flow duct was measured. These measurements were then compared with predictions. While this report only includes comparisons for a subset of the configurations, the full database of all measurements and predictions is available in electronic format upon request. The results demonstrate that both conventional perforate-over-honeycomb and extended-reaction liners can be accurately modeled using COMSOL. Therefore, this modeling tool can be used with confidence to supplement the current suite of acoustic propagation codes, and ultimately develop new acoustic engine liners designed to reduce aircraft noise.

  9. A simple simulation model as a tool to assess alternative health care provider payment reform options in Vietnam.

    PubMed

    Cashin, Cheryl; Phuong, Nguyen Khanh; Shain, Ryan; Oanh, Tran Thi Mai; Thuy, Nguyen Thi

    2015-01-01

    Vietnam is currently considering a revision of its 2008 Health Insurance Law, including the regulation of provider payment methods. This study uses a simple spreadsheet-based, micro-simulation model to analyse the potential impacts of different provider payment reform scenarios on resource allocation across health care providers in three provinces in Vietnam, as well as on the total expenditure of the provincial branches of the public health insurance agency (Provincial Social Security [PSS]). The results show that currently more than 50% of PSS spending is concentrated at the provincial level, with less than half at the district level. The current fund-holding arrangement also places a high degree of financial risk on district hospitals. Results of the simulation model show that several alternative scenarios for provider payment reform could improve the current payment system by reducing the high financial risk borne by district hospitals without dramatically shifting the current level and distribution of PSS expenditure. The results of the simulation analysis provided an empirical basis for health policy-makers in Vietnam to assess different provider payment reform options and make decisions about new models to support health system objectives.
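
    A minimal sketch of the kind of spreadsheet-style reallocation such a micro-simulation performs is shown below. The provider levels, expenditure shares and budget are hypothetical illustrations, not the study's data.

        # Hypothetical shares of insurance spending by provider level
        baseline = {"provincial": 0.55, "district": 0.35, "commune": 0.10}
        reform = {"provincial": 0.45, "district": 0.43, "commune": 0.12}
        total_pss_spending = 100.0  # arbitrary budget units

        def simulate(shares, total):
            """Allocate total expenditure across provider levels."""
            return {level: share * total for level, share in shares.items()}

        for name, shares in (("baseline", baseline), ("reform", reform)):
            alloc = simulate(shares, total_pss_spending)
            print(name, {k: round(v, 1) for k, v in alloc.items()})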

  10. Assessment of the National Combustion Code

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Iannetti, Anthony; Shih, Tsan-Hsing

    2007-01-01

    The advancements made during the last decade in the areas of combustion modeling, numerical simulation, and computing platforms have greatly facilitated the use of CFD-based tools in the development of combustion technology. Further development of verification, validation and uncertainty quantification will have a profound impact on the reliability and utility of these CFD-based tools. The objectives of the present effort are to establish a baseline for the National Combustion Code (NCC) and the experimental data, as well as to document current capabilities and identify gaps for further improvements.

  11. Authoring Tools and Methods for Adaptive Training and Education in Support of the US Army Learning Model: Research Outline

    DTIC Science & Technology

    2015-10-01

    While human tutoring and mentoring are common teaching tools, current US Army standards for training and education are group instruction and … Some interventions (e.g., computer trainers, human tutors, group learning) show higher effect sizes than others when compared to a control.

  12. Batch mode grid generation: An endangered species

    NASA Technical Reports Server (NTRS)

    Schuster, David M.

    1992-01-01

    Non-interactive grid generation schemes should thrive as emphasis shifts from development of numerical analysis and design methods to application of these tools to real engineering problems. A strong case is presented for the continued development and application of non-interactive geometry modeling methods. Guidelines, strategies, and techniques for developing and implementing these tools are presented using current non-interactive grid generation methods as examples. These schemes play an important role in the development of multidisciplinary analysis methods and some of these applications are also discussed.

  13. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use.
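
    The file-wrapping pattern described above (update an input file, run the external code, parse the output file) maps onto OpenMDAO's ExternalCodeComp. The sketch below is a minimal illustration of that pattern, not RCOTOOLS itself; the file names, the 'run_analysis' command, the variable names and the output-file format are all hypothetical.

        import openmdao.api as om

        class FileWrappedAnalysis(om.ExternalCodeComp):
            """Sketch of a file wrapper: write design variables to an input
            file, execute an external code, then parse responses back out.
            All names and formats here are hypothetical placeholders."""

            def setup(self):
                self.add_input('rotor_radius', val=5.0)
                self.add_output('gross_weight', val=0.0)
                self.options['command'] = ['run_analysis', 'case.in', 'case.out']

            def compute(self, inputs, outputs):
                # 1) write the design variable into the code's input file
                with open('case.in', 'w') as f:
                    f.write(f"ROTOR_RADIUS = {float(inputs['rotor_radius'])}\n")
                # 2) execute the external code via the parent class
                super().compute(inputs, outputs)
                # 3) scan the output file for the tagged response value
                with open('case.out') as f:
                    for line in f:
                        if line.startswith('GROSS_WEIGHT'):
                            outputs['gross_weight'] = float(line.split('=')[1])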

  14. Tool Steel Heat Treatment Optimization Using Neural Network Modeling

    NASA Astrophysics Data System (ADS)

    Podgornik, Bojan; Belič, Igor; Leskovšek, Vojteh; Godec, Matjaz

    2016-11-01

    Optimization of tool steel properties and the corresponding heat treatment is mainly based on a trial and error approach, which requires tremendous experimental work and resources. Therefore, there is a huge need for tools allowing prediction of mechanical properties of tool steels as a function of composition and heat treatment process variables. The aim of the present work was to explore the potential and possibilities of artificial neural network-based modeling to select and optimize vacuum heat treatment conditions depending on the hot work tool steel composition and required properties. In the current case, training of a four-layer (8-20-20-2) feedforward neural network with an error-backpropagation training scheme was based on the experimentally obtained tempering diagrams for ten different hot work tool steel compositions and at least two austenitizing temperatures. Results show that this type of modeling can be successfully used for detailed and multifunctional analysis of different influential parameters as well as to optimize the heat treatment process of hot work tool steels depending on the composition. In terms of composition, V was found to be the most beneficial alloying element, increasing hardness and fracture toughness of hot work tool steel; Si, Mn, and Cr increase hardness but lead to reduced fracture toughness, while Mo has the opposite effect. The optimum concentration providing high KIc/HRC ratios would include 0.75 pct Si, 0.4 pct Mn, 5.1 pct Cr, 1.5 pct Mo, and 0.5 pct V, with the optimum heat treatment performed at lower austenitizing and intermediate tempering temperatures.
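
    A network with the 8-20-20-2 topology described above can be sketched with scikit-learn as shown below (the paper used its own backpropagation implementation, not scikit-learn). The eight inputs would correspond to composition and process variables and the two outputs to hardness and fracture toughness; the training data here are random placeholders.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        X = rng.uniform(size=(200, 8))   # composition + process variables
        y = rng.uniform(size=(200, 2))   # hardness, fracture toughness

        # Two hidden layers of 20 neurons reproduce the 8-20-20-2 scheme
        net = MLPRegressor(hidden_layer_sizes=(20, 20), activation='logistic',
                           solver='adam', max_iter=5000, random_state=0)
        net.fit(X, y)
        print(net.predict(X[:3]))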

  15. Current State of the Art Historic Building Information Modelling

    NASA Astrophysics Data System (ADS)

    Dore, C.; Murphy, M.

    2017-08-01

    In an extensive review of existing literature a number of observations were made in relation to the current approaches for recording and modelling existing buildings and environments: Data collection and pre-processing techniques are becoming increasingly automated to allow for near real-time data capture and fast processing of this data for later modelling applications. Current BIM software is almost completely focused on new buildings and has very limited tools and pre-defined libraries for modelling existing and historic buildings. The development of reusable parametric library objects for existing and historic buildings supports modelling with high levels of detail while decreasing the modelling time. Mapping these parametric objects to survey data, however, is still a time-consuming task that requires further research. Promising developments have been made towards automatic object recognition and feature extraction from point clouds for as-built BIM. However, results are currently limited to simple and planar features. Further work is required for automatic accurate and reliable reconstruction of complex geometries from point cloud data. Procedural modelling can provide an automated solution for generating 3D geometries but lacks the detail and accuracy required for most as-built applications in AEC and heritage fields.

  16. Evaluation of the probability of arrester failure in a high-voltage transmission line using a Q learning artificial neural network model

    NASA Astrophysics Data System (ADS)

    Ekonomou, L.; Karampelas, P.; Vita, V.; Chatzarakis, G. E.

    2011-04-01

    One of the most popular methods of protecting high voltage transmission lines against lightning strikes and internal overvoltages is the use of arresters. The installation of arresters in high voltage transmission lines can prevent or even reduce the lines' failure rate. Several studies based on simulation tools have been presented in order to estimate the critical currents that exceed the arresters' rated energy stress and to specify the arresters' installation interval. In this work artificial intelligence, and more specifically a Q-learning artificial neural network (ANN) model, is employed to evaluate the arresters' failure probability. The aims of the paper are to describe in detail the developed Q-learning ANN model and to compare the results obtained by its application to operating 150 kV Greek transmission lines with those produced using a simulation tool. The satisfactory and accurate results of the proposed ANN model can make it a valuable tool for designers of electrical power systems seeking more effective lightning protection, reduced operational costs and better continuity of service.
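
    The paper couples Q-learning with an ANN; the sketch below shows only the underlying tabular Q-learning value update on a toy discretised state/action space, as a generic illustration of the technique rather than the authors' model.

        import numpy as np

        n_states, n_actions = 10, 4
        Q = np.zeros((n_states, n_actions))
        alpha, gamma = 0.1, 0.9          # learning rate, discount factor

        def q_update(s, a, reward, s_next):
            """One step of the Q-learning rule:
            Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))"""
            td_target = reward + gamma * Q[s_next].max()
            Q[s, a] += alpha * (td_target - Q[s, a])

        q_update(s=0, a=2, reward=1.0, s_next=3)
        print(Q[0])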

  17. Optimization in Cardiovascular Modeling

    NASA Astrophysics Data System (ADS)

    Marsden, Alison L.

    2014-01-01

    Fluid mechanics plays a key role in the development, progression, and treatment of cardiovascular disease. Advances in imaging methods and patient-specific modeling now reveal increasingly detailed information about blood flow patterns in health and disease. Building on these tools, there is now an opportunity to couple blood flow simulation with optimization algorithms to improve the design of surgeries and devices, incorporating more information about the flow physics in the design process to augment current medical knowledge. In doing so, a major challenge is the need for efficient optimization tools that are appropriate for unsteady fluid mechanics problems, particularly for the optimization of complex patient-specific models in the presence of uncertainty. This article reviews the state of the art in optimization tools for virtual surgery, device design, and model parameter identification in cardiovascular flow and mechanobiology applications. In particular, it reviews trade-offs between traditional gradient-based methods and derivative-free approaches, as well as the need to incorporate uncertainties. Key future challenges are outlined, which extend to the incorporation of biological response and the customization of surgeries and devices for individual patients.

  18. CalFitter: a web server for analysis of protein thermal denaturation data.

    PubMed

    Mazurenko, Stanislav; Stourac, Jan; Kunka, Antonin; Nedeljkovic, Sava; Bednar, David; Prokop, Zbynek; Damborsky, Jiri

    2018-05-14

    Despite significant advances in the understanding of protein structure-function relationships, revealing protein folding pathways still poses a challenge due to a limited number of relevant experimental tools. Widely-used experimental techniques, such as calorimetry or spectroscopy, critically depend on a proper data analysis. Currently, there are only separate data analysis tools available for each type of experiment with a limited model selection. To address this problem, we have developed the CalFitter web server to be a unified platform for comprehensive data fitting and analysis of protein thermal denaturation data. The server allows simultaneous global data fitting using any combination of input data types and offers 12 protein unfolding pathway models for selection, including irreversible transitions often missing from other tools. The data fitting produces optimal parameter values, their confidence intervals, and statistical information to define unfolding pathways. The server provides an interactive and easy-to-use interface that allows users to directly analyse input datasets and simulate modelled output based on the model parameters. CalFitter web server is available free at https://loschmidt.chemi.muni.cz/calfitter/.
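
    For orientation, the sketch below fits the simplest member of the family of models such a server offers: a reversible two-state unfolding curve (van't Hoff form, heat-capacity change neglected). It is a generic illustration with synthetic data, not CalFitter's implementation or one of its 12 specific models.

        import numpy as np
        from scipy.optimize import curve_fit

        R = 8.314  # gas constant, J/(mol K)

        def two_state(T, dH, Tm, yN, yU):
            """Two-state unfolding: signal mixes native (yN) and unfolded
            (yU) baselines by the fraction unfolded at temperature T."""
            dG = dH * (1.0 - T / Tm)                  # unfolding free energy
            fU = 1.0 / (1.0 + np.exp(dG / (R * T)))   # fraction unfolded
            return yN + (yU - yN) * fU

        # Synthetic "melting curve" standing in for a real experiment
        T = np.linspace(290, 360, 60)
        y = two_state(T, 3.0e5, 330.0, 0.1, 0.9) \
            + np.random.default_rng(1).normal(0, 0.01, T.size)

        popt, pcov = curve_fit(two_state, T, y, p0=[2.0e5, 325.0, 0.0, 1.0])
        print("dH = %.3g J/mol, Tm = %.1f K" % (popt[0], popt[1]))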

  19. Federation Development and Execution Process (FEDEP) Tools in Support of NATO Modelling & Simulation (M&S) Programmes (Des outils d'aide au processus de développement et d'exécution de fédérations (FEDEP))

    DTIC Science & Technology

    2004-05-01

    The tool list currently contains 79 tools, and others should be added as they become known. The Task Group has recommended that the tool list be made available … Conclusions and recommendations are contained in Chapter 5. FEDEP Version 1.5 [A.3-1] was created in December 1999 and contained only minor editorial changes.

  20. OCSEGen: Open Components and Systems Environment Generator

    NASA Technical Reports Server (NTRS)

    Tkachuk, Oksana

    2014-01-01

    To analyze a large system, one often needs to break it into smaller components. To analyze a component or unit under analysis, one needs to model its context of execution, called the environment, which represents the components with which the unit interacts. Environment generation is a challenging problem, because the environment needs to be general enough to uncover unit errors, yet precise enough to make the analysis tractable. In this paper, we present a tool for automated environment generation for open components and systems. The tool, called OCSEGen, is implemented on top of the Soot framework. We present the tool's current support and discuss its possible future extensions.

  1. A combined observational and modeling approach to the study of coastal areas: the case of the Gulf of Trieste

    NASA Astrophysics Data System (ADS)

    Cosoli, Simone; Licer, Matjaz; Malacic, Vlado; Papapostolou, Alexandros; Axaopoulos, Panagiotis

    2015-04-01

    During the last decade high-frequency (HF) radar systems have been installed operationally throughout the world, and extensive validation efforts have proven their reliability in mapping near-surface currents at high spatial and temporal resolutions. Nowadays, they are considered a reliable benchmark for the validation of numerical circulation models and of tidal current models. Similarly to HFR data, ocean circulation models are now considered reliable tools that are routinely put into operational use to provide a wide range of products of public interest. To ensure scientific integrity, assessing the skill of the model products is crucial, especially in coastal areas where tidal processes (such as currents or mixing) are important, and where bathymetry and changes in the vertical and horizontal structure of temperature, salinity, and density (due either to seasonal variations or to impulsive freshwater input) are also critical. Here we present the case of the Gulf of Trieste, northern Adriatic Sea, a complex coastal region in which circulation is controlled by a number of processes, including tides, wind, waves and variations in river discharge, with significant temporal variability. By comparing radar observations, data from moorings and coastal tide gauges, with the output of different circulation models (NAPOM, an operational version of the Princeton Ocean Model (POM) for the Northern Adriatic; and OTPS, a barotropic tidal model for the Northern Adriatic), we show that: HFR observations and model simulations are complementary tools in complex coastal regions, in the sense that they reciprocally help account for each other's intrinsic limitations (i.e., lack of vertical resolution in HFR data; areas with significant topographic gradients for models); tidal models accurately describe tidal features in the region; and existing intrinsic data-model discrepancies can be interpreted and used to propose corrections to the models.

  2. Conversion of HSPF Legacy Model to a Platform-Independent, Open-Source Language

    NASA Astrophysics Data System (ADS)

    Heaphy, R. T.; Burke, M. P.; Love, J. T.

    2015-12-01

    Since its initial development over 30 years ago, the Hydrologic Simulation Program - FORTRAN (HSPF) model has been used worldwide to support water quality planning and management. In the United States, HSPF receives widespread endorsement as a regulatory tool at all levels of government and is a core component of the EPA's Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) system, which was developed to support nationwide Total Maximum Daily Load (TMDL) analysis. However, the model's legacy code and data management systems have limitations in their ability to integrate with modern software and hardware and to leverage parallel computing, which have left voids in optimization, pre-, and post-processing tools. Advances in technology and in our scientific understanding of environmental processes that have occurred over the last 30 years mandate that upgrades be made to HSPF to allow it to evolve and continue to be a premier tool for water resource planners. This work aims to mitigate the challenges currently facing HSPF through two primary tasks: (1) convert the code to a modern, widely accepted, open-source, high-performance computing (HPC) language; and (2) convert model input and output files to a modern, widely accepted, open-source data model, library, and binary file format. Python was chosen as the new language for the code conversion. It is an interpreted, object-oriented language with dynamic semantics that has become one of the most popular open-source languages. While Python code execution can be slow compared to compiled, statically typed programming languages, such as C and FORTRAN, the integration of Numba (a just-in-time specializing compiler) has allowed this challenge to be overcome. For the legacy model data management conversion, HDF5 was chosen to store the model input and output. The code conversion for HSPF's hydrologic and hydraulic modules has been completed. The converted code has been tested against HSPF's suite of "test" runs and has shown good agreement and similar execution times while using the Numba compiler. Continued verification of the accuracy of the converted code against more complex legacy applications, and improvement of execution times by incorporating an intelligent network change detection tool, are currently underway, and preliminary results will be presented.
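
    The two conversion choices named above (Numba for speed, HDF5 for storage) combine as in the sketch below: a toy storage/runoff loop standing in for an HSPF hydrologic routine, compiled with Numba and written to HDF5 with h5py. The water-balance logic is a placeholder, not HSPF's algorithm.

        import numpy as np
        import h5py
        from numba import njit

        @njit
        def simple_water_balance(precip, pet, capacity):
            """Toy per-timestep storage/spill loop; @njit compiles the
            Python loop to machine code, similar in spirit to how the
            conversion recovers FORTRAN-like speed."""
            storage = 0.0
            runoff = np.empty(precip.size)
            for t in range(precip.size):
                storage = max(storage + precip[t] - pet[t], 0.0)
                runoff[t] = max(storage - capacity, 0.0)  # spill over capacity
                storage -= runoff[t]
            return runoff

        rng = np.random.default_rng(0)
        q = simple_water_balance(rng.exponential(2.0, 1000),
                                 np.full(1000, 1.5), 50.0)

        # Store model output in HDF5, the format chosen for the conversion
        with h5py.File('hspf_demo.h5', 'w') as f:
            f.create_dataset('results/runoff', data=q)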

  3. The Soil and Water Assessment Tool (SWAT) Ecohydrological Model Circa 2015: Global Application Trends, Insights and Issues

    NASA Astrophysics Data System (ADS)

    Gassman, P. W.; Arnold, J. G.; Srinivasan, R.

    2015-12-01

    The Soil and Water Assessment Tool (SWAT) is one of the most widely used watershed-scale water quality models in the world. Over 2,000 peer-reviewed SWAT-related journal articles have been published and hundreds of other studies have been published in conference proceedings and other formats. The use of SWAT was initially concentrated in North America and Europe but has also expanded dramatically in other countries and regions during the past decade including Brazil, China, India, Iran, South Korea, Southeast Asia and eastern Africa. The SWAT model has proven to be a very flexible tool for investigating a broad range of hydrologic and water quality problems at different watershed scales and environmental conditions, and has proven very adaptable for applications requiring improved hydrologic and other enhanced simulation needs. We investigate here the various technological, networking, and other factors that have supported the expanded use of SWAT, and also highlight current worldwide simulation trends and possible impediments to future increased usage of the model. Examples of technological advances include easy access to web-based documentation, user-support groups, and SWAT literature, a variety of Geographic Information System (GIS) interface tools, pre- and post-processing calibration software and other software, and an open source code which has served as a model development catalyst for multiple user groups. Extensive networking regarding the use of SWAT has further occurred via internet-based user support groups, model training workshops, regional working groups, regional and international conferences, and targeted development workshops. We further highlight several important model development trends that have emerged during the past decade including improved hydrologic, cropping system, best management practice (BMP) and pollutant transport simulation methods. In addition, several current SWAT weaknesses will be addressed and key development needs will be described including the ability to represent landscapes and practices with more spatial definition, the incorporation of a module specifically designed to simulate rice paddy systems and algorithms that can capture plant competition dynamics such as occur in complex tree/crop systems and interactions between crops and weeds.

  4. Some issues in data model mapping

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Alsabbagh, Jamal R.

    1985-01-01

    Numerous data models have been reported in the literature since the early 1970's. They have been used as database interfaces and as conceptual design tools. The mapping between schemas expressed according to the same data model or according to different models is interesting for theoretical and practical purposes. This paper addresses some of the issues involved in such a mapping. Of special interest are the identification of the mapping parameters and some current approaches for handling the various situations that require a mapping.

  5. On the design of computer-based models for integrated environmental science.

    PubMed

    McIntosh, Brian S; Jeffrey, Paul; Lemon, Mark; Winder, Nick

    2005-06-01

    The current research agenda in environmental science is dominated by calls to integrate science and policy to better understand and manage links between social (human) and natural (nonhuman) processes. Freshwater resource management is one area where such calls can be heard. Designing computer-based models for integrated environmental science poses special challenges to the research community. At present it is not clear whether such tools, or their outputs, receive much practical policy or planning application. It is argued that this is a result of (1) a lack of appreciation within the research modeling community of the characteristics of different decision-making processes, including policy, planning, and participation, (2) a lack of appreciation of the characteristics of different decision-making contexts, (3) the technical difficulties in implementing the necessary support tool functionality, and (4) the socio-technical demands of designing tools to be of practical use. This article presents a critical synthesis of ideas from each of these areas and interprets them in terms of design requirements for computer-based models being developed to provide scientific information support for policy and planning. Illustrative examples are given from the field of freshwater resources management. Although computer-based diagramming and modeling tools can facilitate processes of dialogue, they lack adequate simulation capabilities. Component-based models and modeling frameworks provide such functionality and may be suited to supporting problematic or messy decision contexts. However, significant technical (implementation) and socio-technical (use) challenges need to be addressed before such ambition can be realized.

  6. A Comparison of Career-Related Assessment Tools/Models. Final [Report].

    ERIC Educational Resources Information Center

    WestEd, San Francisco, CA.

    This document contains charts that evaluate career related assessment items. Chart categories include: Purpose/Current Uses/Format; Intended Population; Oregon Career Related Learning Standards Addressed; Relationship to the Standards; Relationship to Endorsement Area Frameworks; Evidence of Validity; Evidence of Reliability; Evidence of Fairness…

  7. So Many Chemicals, So Little Time... Evolution of Computational Toxicology (NCSU Toxicology Lecture Series)

    EPA Science Inventory

    Current testing is limited by traditional testing models and regulatory systems. An overview is given of high throughput screening approaches to provide broader chemical and biological coverage, toxicokinetics and molecular pathway data and tools to facilitate utilization for reg...

  8. Energy Economics of Farm Biogas in Cold Climates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pillay, Pragasen; Grimberg, Stefan; Powers, Susan E

    Anaerobic digestion of farm and dairy waste has been shown to be capital intensive. One way to improve digester economics is to co-digest high-energy substrates together with the dairy manure. Cheese whey, for example, represents a high-energy substrate that is generated during cheese manufacture. There are currently no quantitative tools available that predict the performance of co-digestion farm systems. The goal of this project was to develop a mathematical tool that would (1) predict the impact of co-digestion and (2) determine the best use of the generated biogas for a cheese manufacturing plant. Two models were developed that separately could be used to meet both goals of the project. Given current pricing structures, the most economical use of the generated biogas at the cheese manufacturing plant was as a replacement for fuel oil to generate heat. The developed digester model accurately predicted the performance of 26 farm digesters operating in the northeastern U.S.

  9. Detrusor underactivity: Pathophysiological considerations, models and proposals for future research. ICI-RS 2013.

    PubMed

    van Koeveringe, Gommert A; Rademakers, Kevin L J; Birder, Lori A; Korstanje, Cees; Daneshgari, Firouz; Ruggieri, Michael R; Igawa, Yasuhiko; Fry, Christopher; Wagg, Adrian

    2014-06-01

    Detrusor underactivity, resulting in either prolonged or inefficient voiding, is a common clinical problem for which treatment options are currently limited. The aim of this report is to summarize current understanding of the clinical observation and its underlying pathophysiological entities. This report results from presentations and subsequent discussion at the International Consultation on Incontinence Research Society (ICI-RS) in Bristol, 2013. The recommendations made by the ICI-RS panel include: Development of study tools based on a system's pathophysiological approach, correlation of in vitro and in vivo data in experimental animals and humans, and development of more comprehensive translational animal models. In addition, there is a need for longitudinal patient data to define risk groups and for the development of screening tools. In the near-future these recommendations should lead to a better understanding of detrusor underactivity and its pathophysiological background. Neurourol. Urodynam. 33:591-596, 2014. © 2014 Wiley Periodicals, Inc.

  10. TEMPy: a Python library for assessment of three-dimensional electron microscopy density fits.

    PubMed

    Farabella, Irene; Vasishtan, Daven; Joseph, Agnel Praveen; Pandurangan, Arun Prasad; Sahota, Harpal; Topf, Maya

    2015-08-01

    Three-dimensional electron microscopy is currently one of the most promising techniques used to study macromolecular assemblies. Rigid and flexible fitting of atomic models into density maps is often essential to gain further insights into the assemblies they represent. Currently, tools that facilitate the assessment of fitted atomic models and maps are needed. TEMPy (template and electron microscopy comparison using Python) is a toolkit designed for this purpose. The library includes a set of methods to assess density fits in intermediate-to-low resolution maps, both globally and locally. It also provides procedures for single-fit assessment, ensemble generation of fits, clustering, and multiple and consensus scoring, as well as plots and output files for visualization purposes to help the user in analysing rigid and flexible fits. The modular nature of TEMPy helps the integration of scoring and assessment of fits into large pipelines, making it a tool suitable for both novice and expert structural biologists.

  11. High performance TWT development for the microwave power module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whaley, D.R.; Armstrong, C.M.; Groshart, G.

    1996-12-31

    Northrop Grumman's ongoing development of microwave power modules (MPM) provides microwave power at various power levels, frequencies, and bandwidths for a variety of applications. Present-day requirements for the vacuum power booster traveling wave tubes of the microwave power module are becoming increasingly demanding, necessitating further enhancement of tube performance. The MPM development program at Northrop Grumman is designed specifically to meet this need through the construction and test of a series of new tubes aimed at verifying computation and reaching high-efficiency design goals. Tubes under test incorporate several different helix designs, as well as varying electron gun and magnetic confinement configurations. Current efforts also include further development of state-of-the-art TWT modeling and computational methods at Northrop Grumman, incorporating new, more accurate models into existing design tools and developing new tools to be used in all aspects of traveling wave tube design. The current status of the Northrop Grumman MPM TWT development program will be presented.

  12. ePlant and the 3D data display initiative: integrative systems biology on the world wide web.

    PubMed

    Fucile, Geoffrey; Di Biase, David; Nahal, Hardeep; La, Garon; Khodabandeh, Shokoufeh; Chen, Yani; Easley, Kante; Christendat, Dinesh; Kelley, Lawrence; Provart, Nicholas J

    2011-01-10

    Visualization tools for biological data are often limited in their ability to interactively integrate data at multiple scales. These computational tools are also typically limited by two-dimensional displays and programmatic implementations that require separate configurations for each of the user's computing devices and recompilation for functional expansion. Towards overcoming these limitations we have developed "ePlant" (http://bar.utoronto.ca/eplant) - a suite of open-source world wide web-based tools for the visualization of large-scale data sets from the model organism Arabidopsis thaliana. These tools display data spanning multiple biological scales on interactive three-dimensional models. Currently, ePlant consists of the following modules: a sequence conservation explorer that includes homology relationships and single nucleotide polymorphism data, a protein structure model explorer, a molecular interaction network explorer, a gene product subcellular localization explorer, and a gene expression pattern explorer. The ePlant's protein structure explorer module represents experimentally determined and theoretical structures covering >70% of the Arabidopsis proteome. The ePlant framework is accessed entirely through a web browser, and is therefore platform-independent. It can be applied to any model organism. To facilitate the development of three-dimensional displays of biological data on the world wide web we have established the "3D Data Display Initiative" (http://3ddi.org).

  13. Improved Analysis of Earth System Models and Observations using Simple Climate Models

    NASA Astrophysics Data System (ADS)

    Nadiga, B. T.; Urban, N. M.

    2016-12-01

    Earth system models (ESM) are the most comprehensive tools we have to study climate change and develop climate projections. However, the computational infrastructure required and the cost incurred in running such ESMs preclude direct use of such models in conjunction with a wide variety of tools that can further our understanding of climate. Here we are referring to tools that range from dynamical systems tools that give insight into underlying flow structure and topology, to tools from various applied mathematical and statistical techniques that are central to quantifying stability, sensitivity, uncertainty and predictability, to machine learning tools that are now being rapidly developed or improved. Our approach to facilitate the use of such models is to analyze output of ESM experiments (cf. CMIP) using a range of simpler models that consider integral balances of important quantities such as mass and/or energy in a Bayesian framework. We highlight the use of this approach in the context of the uptake of heat by the world oceans in the ongoing global warming. Indeed, since in excess of 90% of the anomalous radiative forcing due to greenhouse gas emissions is sequestered in the world oceans, the nature of ocean heat uptake crucially determines the surface warming that is realized (cf. climate sensitivity). Nevertheless, ESMs themselves are never run long enough to directly assess climate sensitivity. So, we consider a range of models based on integral balances--balances that have to be realized in all first-principles based models of the climate system, including the most detailed state-of-the-art climate simulations. The models range from simple models of energy balance to those that consider dynamically important ocean processes such as the conveyor-belt circulation (Meridional Overturning Circulation, MOC), North Atlantic Deep Water (NADW) formation, Antarctic Circumpolar Current (ACC) and eddy mixing. Results from Bayesian analysis of such models using both ESM experiments and actual observations are presented. One such result points to the importance of direct sequestration of heat below 700 m, a process that is not allowed for in the simple models that have been traditionally used to deduce climate sensitivity.
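
    A minimal example of the approach described above is sketched below: a one-box energy balance model, C dT/dt = F - lambda*T, fit with a Metropolis sampler. The forcing ramp, heat capacity, priors and "observations" are all synthetic placeholders, far simpler than the multi-box ocean models used in the study.

        import numpy as np

        rng = np.random.default_rng(0)
        years = 150
        F = np.linspace(0.0, 3.0, years)     # ramped forcing (W/m^2)

        def ebm(lam, C=8.0, dt=1.0):
            """Forward-Euler one-box energy balance model."""
            T = np.zeros(years)
            for t in range(1, years):
                T[t] = T[t-1] + dt * (F[t-1] - lam * T[t-1]) / C
            return T

        T_obs = ebm(1.2) + rng.normal(0, 0.1, years)  # synthetic data

        def log_post(lam, sigma=0.1):
            if not 0.1 < lam < 5.0:           # flat prior on lambda
                return -np.inf
            return -0.5 * np.sum((ebm(lam) - T_obs) ** 2) / sigma**2

        # Metropolis random walk over the feedback parameter lambda
        lam, lp, chain = 1.0, log_post(1.0), []
        for _ in range(5000):
            prop = lam + rng.normal(0, 0.05)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                lam, lp = prop, lp_prop
            chain.append(lam)
        print("posterior mean lambda:", np.mean(chain[1000:]))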

  14. Acute care patient portals: a qualitative study of stakeholder perspectives on current practices.

    PubMed

    Collins, Sarah A; Rozenblum, Ronen; Leung, Wai Yin; Morrison, Constance Rc; Stade, Diana L; McNally, Kelly; Bourie, Patricia Q; Massaro, Anthony; Bokser, Seth; Dwyer, Cindy; Greysen, Ryan S; Agarwal, Priyanka; Thornton, Kevin; Dalal, Anuj K

    2017-04-01

    To describe current practices and stakeholder perspectives of patient portals in the acute care setting. We aimed to: (1) identify key features, (2) recognize challenges, (3) understand current practices for design, configuration, and use, and (4) propose new directions for investigation and innovation. Mixed methods including surveys, interviews, focus groups, and site visits with stakeholders at leading academic medical centers. Thematic analyses to inform development of an explanatory model and recommendations. Site surveys were administered to 5 institutions. Thirty interviews/focus groups were conducted at 4 site visits that included a total of 84 participants. Ten themes regarding content and functionality, engagement and culture, and access and security were identified, from which an explanatory model of current practices was developed. Key features included clinical data, messaging, glossary, patient education, patient personalization and family engagement tools, and tiered displays. Four actionable recommendations were identified by group consensus. Design, development, and implementation of acute care patient portals should consider: (1) providing a single integrated experience across care settings, (2) humanizing the patient-clinician relationship via personalization tools, (3) providing equitable access, and (4) creating a clear organizational mission and strategy to achieve outcomes of interest. Portals should provide a single integrated experience across the inpatient and ambulatory settings. Core functionality includes tools that facilitate communication, personalize the patient, and deliver education to advance safe, coordinated, and dignified patient-centered care. Our findings can be used to inform a "road map" for future work related to acute care patient portals. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  15. Current Challenges in Geothermal Reservoir Simulation

    NASA Astrophysics Data System (ADS)

    Driesner, T.

    2016-12-01

    Geothermal reservoir simulation has long been introduced as a valuable tool for geothermal reservoir management and research. Yet, the current generation of simulation tools faces a number of severe challenges, in particular in the application to novel types of geothermal resources such as supercritical reservoirs or hydraulic stimulation. This contribution reviews a number of key problems. (1) Representing the magmatic heat source of high-enthalpy resources in simulations: current practice is to represent the deeper parts of a high-enthalpy reservoir by a heat flux or temperature boundary condition. While this is sufficient for many reservoir management purposes, it precludes exploring the chances of very high-enthalpy resources in the deepest parts of such systems as well as the development of reliable conceptual models. Recent 2D simulations with the CSMP++ simulation platform demonstrate the potential of explicitly including the heat source, namely for understanding supercritical resources. (2) Geometrically realistic incorporation of discrete fracture networks in simulation: a growing number of simulation tools can, in principle, handle flow and heat transport in discrete fracture networks. However, solving the governing equations and representing the physical properties are often biased by strongly simplifying assumptions. Including proper fracture mechanics in complex fracture-network simulations remains an open challenge. (3) Improving the simulation of chemical fluid-rock interaction in geothermal reservoirs: major improvements have been made towards more stable and faster numerical solvers for multicomponent chemical fluid-rock interaction. However, the underlying thermodynamic models and databases are unable to correctly address a number of important regions in temperature-pressure-composition parameter space. Namely, there is currently no thermodynamic formalism to describe relevant chemical reactions in supercritical reservoirs. Overcoming this unsatisfactory situation requires fundamental research in high-temperature physical chemistry rather than further numerical development.

  16. Electrophysiological models of neural processing.

    PubMed

    Nelson, Mark E

    2011-01-01

    The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing, ranging from sub-millisecond temporal coding in the sound localization circuitry of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons.
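
    The first model class named above, Hodgkin-Huxley-type models of macroscopic membrane currents, can be illustrated with the classic squid-axon parameter set and simple forward-Euler integration, as in the sketch below (a textbook single-compartment example, not drawn from this review).

        import numpy as np

        # Classic Hodgkin-Huxley point neuron (squid axon parameters)
        C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3      # uF/cm^2, mS/cm^2
        ENa, EK, EL = 50.0, -77.0, -54.387          # mV

        a_n = lambda V: 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
        b_n = lambda V: 0.125 * np.exp(-(V + 65) / 80)
        a_m = lambda V: 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
        b_m = lambda V: 4.0 * np.exp(-(V + 65) / 18)
        a_h = lambda V: 0.07 * np.exp(-(V + 65) / 20)
        b_h = lambda V: 1.0 / (1 + np.exp(-(V + 35) / 10))

        dt, T = 0.01, 50.0                           # ms
        V, n, m, h = -65.0, 0.317, 0.053, 0.596      # resting state
        spikes = 0
        for i in range(int(T / dt)):
            I_ext = 10.0 if i * dt > 5.0 else 0.0    # step current, uA/cm^2
            I_ion = (gNa * m**3 * h * (V - ENa)
                     + gK * n**4 * (V - EK) + gL * (V - EL))
            V_new = V + dt * (I_ext - I_ion) / C
            n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
            m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
            h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
            if V < 0.0 <= V_new:                     # upward zero crossing
                spikes += 1
            V = V_new
        print("spikes in 50 ms:", spikes)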

  17. Identifying influential data points in hydrological model calibration and their impact on streamflow predictions

    NASA Astrophysics Data System (ADS)

    Wright, David; Thyer, Mark; Westra, Seth

    2015-04-01

    Highly influential data points are those that have a disproportionately large impact on model performance, parameters and predictions. However, in current hydrological modelling practice the relative influence of individual data points on hydrological model calibration is not commonly evaluated. This presentation illustrates and evaluates several influence diagnostics tools that hydrological modellers can use to assess the relative influence of data. The feasibility and importance of including influence detection diagnostics as a standard tool in hydrological model calibration is discussed. Two classes of influence diagnostics are evaluated: (1) computationally demanding numerical "case deletion" diagnostics; and (2) computationally efficient analytical diagnostics, based on Cook's distance. These diagnostics are compared against hydrologically orientated diagnostics that describe changes in the model parameters (measured through the Mahalanobis distance), performance (objective function displacement) and predictions (mean and maximum streamflow). These influence diagnostics are applied to two case studies: a stage/discharge rating curve model, and a conceptual rainfall-runoff model (GR4J). Removing a single data point from the calibration resulted in differences to mean flow predictions of up to 6% for the rating curve model, and differences to mean and maximum flow predictions of up to 10% and 17%, respectively, for the hydrological model. When using the Nash-Sutcliffe efficiency in calibration, the computationally cheaper Cook's distance metrics produce similar results to the case-deletion metrics at a fraction of the computational cost. However, Cook's distance is adapted from linear regression, with inherent assumptions about the data, and is therefore less flexible than case deletion. Influential point detection diagnostics show great potential to improve current hydrological modelling practices by identifying highly influential data points. The findings of this study establish the feasibility and importance of including influential point detection diagnostics as a standard tool in hydrological model calibration. They provide the hydrologist with important information on whether model calibration is susceptible to a small number of highly influential data points. This enables the hydrologist to make a more informed decision of whether to (1) remove/retain the calibration data; (2) adjust the calibration strategy and/or hydrological model to reduce the susceptibility of model predictions to a small number of influential observations.
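
    The analytical (Cook's distance) class of diagnostics discussed above can be computed directly for an ordinary least squares model with statsmodels, as in the sketch below; the data are synthetic, with one deliberately perturbed observation, and this is a generic regression illustration rather than the GR4J case study.

        import numpy as np
        import statsmodels.api as sm
        from statsmodels.stats.outliers_influence import OLSInfluence

        rng = np.random.default_rng(0)
        x = rng.uniform(0, 10, 50)
        y = 2.0 + 0.5 * x + rng.normal(0, 0.5, 50)
        y[10] += 5.0                     # plant one influential observation

        X = sm.add_constant(x)
        res = sm.OLS(y, X).fit()
        cooks_d, _ = OLSInfluence(res).cooks_distance
        print("most influential point:", np.argmax(cooks_d),
              "D =", cooks_d.max())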

  18. Quality of Big Data in Healthcare

    DOE PAGES

    Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay

    2015-01-01

    The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.

  19. Project Development Model | Integrated Energy Solutions | NREL

    Science.gov Websites

    The two-phase iterative model includes elements in project fundamentals and project development. The five elements of project fundamentals include Baseline: analyze the current situation for the site, for example using the State and Local Energy Data (SLED) tool, developed by NREL for the U.S. Department of Energy.

  20. Quality of Big Data in Healthcare

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay

    The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.

  1. A database for propagation models

    NASA Technical Reports Server (NTRS)

    Kantak, Anil V.; Suwitra, Krisjani; Le, Chuong

    1995-01-01

    A database of various propagation phenomena models that can be used by telecommunications systems engineers to obtain parameter values for systems design is presented. This is an easy-to-use tool and is currently available for either a PC running Excel under Windows or a Macintosh running Excel for the Macintosh. All the steps necessary to use the software are easy and often self-explanatory.

  2. Time series analysis of malaria in Afghanistan: using ARIMA models to predict future trends in incidence.

    PubMed

    Anwar, Mohammad Y; Lewnard, Joseph A; Parikh, Sunil; Pitzer, Virginia E

    2016-11-22

    Malaria remains endemic in Afghanistan. National control and prevention strategies would be greatly enhanced through a better ability to forecast future trends in disease incidence. It is, therefore, of interest to develop a predictive tool for malaria patterns based on the current passive and affordable surveillance system in this resource-limited region. This study employs data from Ministry of Public Health monthly reports from January 2005 to September 2015. Malaria incidence in Afghanistan was forecasted using autoregressive integrated moving average (ARIMA) models in order to build a predictive tool for malaria surveillance. Environmental and climate data were incorporated to assess whether they improve predictive power of models. Two models were identified, each appropriate for different time horizons. For near-term forecasts, malaria incidence can be predicted based on the number of cases in the four previous months and 12 months prior (Model 1); for longer-term prediction, malaria incidence can be predicted using the rates 1 and 12 months prior (Model 2). Next, climate and environmental variables were incorporated to assess whether the predictive power of proposed models could be improved. Enhanced vegetation index was found to have increased the predictive accuracy of longer-term forecasts. Results indicate ARIMA models can be applied to forecast malaria patterns in Afghanistan, complementing current surveillance systems. The models provide a means to better understand malaria dynamics in a resource-limited context with minimal data input, yielding forecasts that can be used for public health planning at the national level.
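
    The near-term model structure described above (the four previous months plus the month twelve months prior) can be approximated with a seasonal ARIMA fit in statsmodels, as sketched below using an AR(4) term with an annual seasonal AR term. The monthly series is synthetic, not the Afghan surveillance data, and the orders are illustrative rather than the paper's final specification.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(0)
        months = pd.date_range('2005-01', periods=129, freq='MS')
        season = 50 + 40 * np.sin(2 * np.pi * months.month / 12)
        cases = pd.Series(season + rng.normal(0, 5, months.size),
                          index=months)

        # AR lags 1-4 plus a 12-month seasonal AR term
        model = SARIMAX(cases, order=(4, 0, 0), seasonal_order=(1, 0, 0, 12))
        fit = model.fit(disp=False)
        print(fit.forecast(steps=6))   # six-month-ahead incidence forecast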

  3. Advances in Time-Distance Helioseismology

    NASA Technical Reports Server (NTRS)

    Duvall, Thomas L., Jr.; Beck, John G.; Gizon, Laurent; Kosovichev, Alexander F.; Oegerle, William (Technical Monitor)

    2002-01-01

    Time-distance helioseismology is a way to measure travel times between surface locations for waves traversing the solar interior. Coupling the travel times with an extensive modeling effort has proven to be a powerful tool for measuring flows and other wave-speed inhomogeneities in the solar interior. Problems receiving current attention include studying the time variation of the meridional circulation and torsional oscillation, and active region emergence and evolution; current results on these topics will be presented.

  4. Analytical drain current model for symmetric dual-gate amorphous indium gallium zinc oxide thin-film transistors

    NASA Astrophysics Data System (ADS)

    Qin, Ting; Liao, Congwei; Huang, Shengxiang; Yu, Tianbao; Deng, Lianwen

    2018-01-01

    An analytical drain current model based on the surface potential is proposed for amorphous indium gallium zinc oxide (a-InGaZnO) thin-film transistors (TFTs) with a synchronized symmetric dual-gate (DG) structure. Solving the electric field, surface potential (φS), and central potential (φ0) of the InGaZnO film using the Poisson equation with the Gaussian method and Lambert function is demonstrated in detail. The compact analytical model of current-voltage behavior, which consists of drift and diffusion components, is investigated by regional integration, and voltage-dependent effective mobility is taken into account. Comparison results demonstrate that the calculation results obtained using the derived models match well with the simulation results obtained using a technology computer-aided design (TCAD) tool. Furthermore, the proposed model is incorporated into SPICE simulations using Verilog-A to verify the feasibility of using DG InGaZnO TFTs for high-performance circuit designs.
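
    The Lambert function technique used in the paper turns implicit transcendental device equations into explicit expressions. As an analogous, much simpler example of that technique (not the paper's IGZO TFT model), the implicit diode equation with series resistance, I = Is*(exp((V - I*R)/(n*Vt)) - 1), has the well-known closed-form solution below.

        import numpy as np
        from scipy.special import lambertw

        Is, n, Vt, Rs = 1e-12, 1.5, 0.02585, 10.0   # A, -, V, ohm

        def diode_current(V):
            """Explicit Lambert-W solution of the implicit diode equation."""
            arg = (Is * Rs) / (n * Vt) * np.exp((V + Is * Rs) / (n * Vt))
            return (n * Vt / Rs) * lambertw(arg).real - Is

        for V in (0.4, 0.6, 0.8):
            print(V, diode_current(V))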

  5. A review on experimental design for pollutants removal in water treatment with the aid of artificial intelligence.

    PubMed

    Fan, Mingyi; Hu, Jiwei; Cao, Rensheng; Ruan, Wenqian; Wei, Xionghui

    2018-06-01

    Water pollution occurs mainly due to inorganic and organic pollutants, such as nutrients, heavy metals and persistent organic pollutants. For the modeling and optimization of pollutant removal, artificial intelligence (AI) has been used as a major tool in experimental design, since it can generate optimal operational variables and AI methods have recently advanced substantially. The present review describes the fundamentals, advantages and limitations of AI tools. Artificial neural networks (ANNs) are the AI tools most frequently adopted to predict pollutant removal processes because of their capabilities of self-learning and self-adapting, while genetic algorithm (GA) and particle swarm optimization (PSO) are also useful AI methodologies for efficient search for the global optima. This article summarizes the modeling and optimization of pollutant removal processes in water treatment using multilayer perceptron, fuzzy neural, radial basis function and self-organizing map networks. Furthermore, the results conclude that hybrid models of ANNs with GA and PSO can be successfully applied in water treatment with satisfactory accuracy. Finally, the limitations of current AI tools and their new developments are also highlighted for prospective applications in environmental protection. Copyright © 2018 Elsevier Ltd. All rights reserved.
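
    A minimal particle swarm optimization of the kind reviewed above is sketched below on a toy 2-D objective; in the water-treatment setting the objective would be, for example, the residual pollutant concentration predicted by a trained ANN. All weights and the objective are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)
        f = lambda p: np.sum((p - np.array([2.0, -1.0]))**2, axis=1)  # toy objective

        n, dims, iters = 30, 2, 200
        w, c1, c2 = 0.7, 1.5, 1.5        # inertia, cognitive, social weights
        pos = rng.uniform(-5, 5, (n, dims))
        vel = np.zeros((n, dims))
        pbest, pbest_val = pos.copy(), f(pos)
        gbest = pbest[np.argmin(pbest_val)]

        for _ in range(iters):
            r1, r2 = rng.uniform(size=(2, n, dims))
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos += vel
            val = f(pos)
            better = val < pbest_val
            pbest[better], pbest_val[better] = pos[better], val[better]
            gbest = pbest[np.argmin(pbest_val)]

        print("optimum found near:", gbest)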

  6. Development of a New Data Tool for Computing Launch and Landing Availability with Respect to Surface Weather

    NASA Technical Reports Server (NTRS)

    Burns, K. Lee; Altino, Karen

    2008-01-01

    The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Their existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability given an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to provide the Space Shuttle program the ability to estimate the impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing site weather to produce landing availability, and concurrent analysis of multiple sites, to assist in operational landing site selection. In addition, the Constellation program has also expressed interest in the APRA tool, and has requested several additional capabilities to address some Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun on a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the currently existing tool.
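
    The core climatological calculation described above can be sketched as below: the probability that a weather constraint is exceeded, computed per month from a historical record. The 30-kt peak-wind limit and the synthetic weather series are hypothetical stand-ins, not APRA's actual constraints or data.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(0)
        dates = pd.date_range('1990-01-01', '2009-12-31', freq='D')
        peak_wind = pd.Series(rng.gamma(shape=4.0, scale=5.0, size=dates.size),
                              index=dates)              # knots, synthetic

        LIMIT = 30.0                                     # vehicle constraint
        exceed = peak_wind > LIMIT
        # Monthly climatology of P(constraint violated) for one attempt
        p_violation = exceed.groupby(exceed.index.month).mean()
        print(p_violation.round(3))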

  7. Modeling of prepregs during automated draping sequences

    NASA Astrophysics Data System (ADS)

    Krogh, Christian; Glud, Jens A.; Jakobsen, Johnny

    2017-10-01

    The behavior of woven prepreg fabric during automated draping sequences is investigated. A drape tool under development with an arrangement of grippers facilitates the placement of a woven prepreg fabric in a mold. It is essential that the draped configuration is free from wrinkles and other defects. The present study aims at setting up a virtual draping framework capable of modeling the draping process from the initial flat fabric to the final double-curved shape, and at assisting the development of an automated drape tool. The virtual draping framework consists of a kinematic mapping algorithm used to generate target points on the mold, which are used as input to a draping sequence planner. The draping sequence planner prescribes the displacement history for each gripper in the drape tool, and these displacements are then applied to each gripper in a transient model of the draping sequence. The model is based on a transient finite element analysis with the material's constitutive behavior currently approximated as linear elastic orthotropic. In-plane tensile and bias-extension tests as well as bending tests are conducted and used as input for the model. The virtual draping framework shows good potential for obtaining a better understanding of the drape process and guiding the development of the drape tool. However, results obtained from using the framework on a simple test case indicate that the generation of draping sequences is non-trivial.

  8. Review of hardware cost estimation methods, models and tools applied to early phases of space mission planning

    NASA Astrophysics Data System (ADS)

    Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.

    2012-08-01

    The primary purpose of this paper is to review currently existing cost estimation methods, models, tools and resources applicable to the space sector. While key space sector methods are outlined, a specific focus is placed on hardware cost estimation at the system level, particularly for early mission phases during which specifications and requirements are not yet crystallised and information is limited. For the space industry, cost engineering within the systems engineering framework is an integral discipline. The cost of any space program now constitutes a stringent design criterion, which must be considered and carefully controlled during the entire program life cycle. A first step in any program budget is a representative cost estimate, which usually hinges on a particular estimation approach, or methodology. Appropriate selection of specific cost models, methods and tools is therefore paramount, a difficult task given the highly variable nature, scope, and scientific and technical requirements applicable to each program. Numerous methods, models and tools exist. However, new ways are needed to address very early, pre-Phase 0 cost estimation during the initial program research and establishment phase, when system specifications are limited but the available research budget needs to be established and defined. For vehicles such as reusable launchers with a manned capability, their specificity and the lack of historical data imply that both the classic heuristic approach, such as parametric cost estimation based on underlying CERs, and the analogy approach are, by definition, limited. This review identifies prominent cost estimation models applied to the space sector, and their underlying cost-driving parameters and factors. Strengths, weaknesses, and suitability to specific mission types and classes are also highlighted. Current approaches that strategically amalgamate various cost estimation strategies, both for formulation and validation of an estimate, and techniques and/or methods to attain representative and justifiable cost estimates, are consequently discussed. Ultimately, the aim of the paper is to establish a baseline for the development of a non-commercial, low-cost, transparent cost estimation methodology to be applied during very early program research phases at a complete vehicle system level, for largely unprecedented manned launch vehicles in the future. This paper takes the first step towards achieving this through the identification, analysis and understanding of established, existing techniques, models, tools and resources relevant within the space sector.

  9. Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian

    2011-01-01

    Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.

  10. Phasing via pure crystallographic least squares: an unexpected feature.

    PubMed

    Burla, Maria Cristina; Carrozzini, Benedetta; Cascarano, Giovanni Luca; Giacovazzo, Carmelo; Polidori, Giampiero

    2018-03-01

    Crystallographic least-squares techniques, the main tool for crystal structure refinement of small and medium-size molecules, are for the first time used for ab initio phasing. It is shown that the chief obstacle to such use, the severe convergence limits of least squares, may be overcome by a multi-solution procedure able to progressively recognize and discard model atoms in false positions and to include in the current model new atoms sufficiently close to correct positions. The applications show that the least-squares procedure is able to solve many small structures without the use of important ancillary tools: e.g. no electron-density map is calculated as a support for the least-squares procedure.
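
    For reference, the target being minimized is the standard crystallographic least-squares residual, written here in conventional notation (generic textbook notation, not copied from the paper):

        \min_{\{\mathbf{x}_j\}} \; \sum_{\mathbf{h}} w_{\mathbf{h}} \left( |F_{\mathrm{o}}(\mathbf{h})| - |F_{\mathrm{c}}(\mathbf{h})| \right)^2,
        \qquad
        F_{\mathrm{c}}(\mathbf{h}) = \sum_{j} f_j \, e^{2\pi i \, \mathbf{h} \cdot \mathbf{x}_j}

    where |F_o| and |F_c| are the observed and calculated structure-factor amplitudes, w_h are weights, f_j are atomic scattering factors (thermal factors omitted here), and x_j are the fractional atomic coordinates over which the multi-solution procedure iterates.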

  11. Moving university hydrology education forward with community-based geoinformatics, data and modeling resources

    NASA Astrophysics Data System (ADS)

    Merwade, V.; Ruddell, B. L.

    2012-08-01

    In this opinion paper, we review recent literature related to data and modeling driven instruction in hydrology, and present our findings from surveying the hydrology education community in the United States. This paper presents an argument that data and modeling driven geoscience cybereducation (DMDGC) approaches are essential for teaching the conceptual and applied aspects of hydrology, as a part of the broader effort to improve science, technology, engineering, and mathematics (STEM) education at the university level. The authors have undertaken a series of surveys and a workshop involving university hydrology educators to determine the state of the practice of DMDGC approaches to hydrology. We identify the most common tools and approaches currently utilized, quantify the extent of the adoption of DMDGC approaches in the university hydrology classroom, and explain the community's views on the challenges and barriers preventing DMDGC approaches from wider use. DMDGC approaches are currently emphasized at the graduate level of the curriculum, and only the most basic modeling and visualization tools are in widespread use. The community identifies the greatest barriers to greater adoption as a lack of access to easily adoptable curriculum materials and a lack of time and training to learn constantly changing tools and methods. The community's current consensus is that DMDGC approaches should emphasize conceptual learning, and should be used to complement rather than replace lecture-based pedagogies. Inadequate online material publication and sharing systems, and a lack of incentives for faculty to develop and publish materials via such systems, are also identified as challenges. Based on these findings, we suggest that a number of steps should be taken by the community to develop the potential of DMDGC in university hydrology education, including formal development and assessment of curriculum materials, integrating lecture-format and DMDGC approaches, incentivizing the publication by faculty of excellent DMDGC curriculum materials, and implementing the publication and dissemination cyberinfrastructure necessary to support the unique DMDGC digital curriculum materials.

  12. Moving university hydrology education forward with geoinformatics, data and modeling approaches

    NASA Astrophysics Data System (ADS)

    Merwade, V.; Ruddell, B. L.

    2012-02-01

    In this opinion paper, we review recent literature related to data and modeling driven instruction in hydrology, and present our findings from surveying the hydrology education community in the United States. This paper presents an argument that Data and Modeling Driven Geoscience Cybereducation (DMDGC) approaches are valuable for teaching the conceptual and applied aspects of hydrology, as a part of the broader effort to improve Science, Technology, Engineering, and Mathematics (STEM) education at the university level. The authors have undertaken a series of surveys and a workshop involving the community of university hydrology educators to determine the state of the practice of DMDGC approaches to hydrology. We identify the most common tools and approaches currently utilized, quantify the extent of the adoption of DMDGC approaches in the university hydrology classroom, and explain the community's views on the challenges and barriers preventing DMDGC approaches from wider use. DMDGC approaches are currently emphasized at the graduate level of the curriculum, and only the most basic modeling and visualization tools are in widespread use. The community identifies the greatest barriers to greater adoption as a lack of access to easily adoptable curriculum materials and a lack of time and training to learn constantly changing tools and methods. The community's current consensus is that DMDGC approaches should emphasize conceptual learning, and should be used to complement rather than replace lecture-based pedagogies. Inadequate online material-publication and sharing systems, and a lack of incentives for faculty to develop and publish materials via such systems, are also identified as challenges. Based on these findings, we suggest that a number of steps should be taken by the community to develop the potential of DMDGC in university hydrology education, including formal development and assessment of curriculum materials, integrating lecture-format and DMDGC approaches, incentivizing the publication by faculty of excellent DMDGC curriculum materials, and implementing the publication and dissemination cyberinfrastructure necessary to support the unique DMDGC digital curriculum materials.

  13. Off-Gas Adsorption Model Capabilities and Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, Kevin L.; Welty, Amy K.; Law, Jack

    2016-03-01

    Off-gas treatment is required to reduce emissions from aqueous fuel reprocessing. Evaluating the products of innovative gas adsorption research requires increased computational simulation capability to more effectively transition from fundamental research to operational design. Early modeling efforts produced the Off-Gas SeParation and REcoverY (OSPREY) model that, while efficient in terms of computation time, was of limited value for complex systems. However, the computational and programming lessons learned in development of the initial model were used to develop Discontinuous Galerkin OSPREY (DGOSPREY), a more effective model. Initial comparisons between OSPREY and DGOSPREY show that, while OSPREY does reasonably well to capture the initial breakthrough time, it displays far too much numerical dispersion to accurately capture the real shape of the breakthrough curves. DGOSPREY is a much better tool as it utilizes a more stable set of numerical methods. In addition, DGOSPREY has shown the capability to capture complex, multispecies adsorption behavior, while OSPREY currently only works for a single adsorbing species. This capability makes DGOSPREY ultimately a more practical tool for real world simulations involving many different gas species. While DGOSPREY has initially performed very well, there is still need for improvement. The current state of DGOSPREY does not include any micro-scale adsorption kinetics and therefore assumes instantaneous adsorption. This is a major source of error in predicting water vapor breakthrough because the kinetics of that adsorption mechanism are particularly slow. However, this deficiency can be remedied by building kinetic kernels into DGOSPREY. Another source of error in DGOSPREY stems from data gaps in single species, such as Kr and Xe, isotherms. Since isotherm data for each gas is currently available at a single temperature, the model is unable to predict adsorption at temperatures outside of the set of data currently available. Thus, in order to improve the predictive capabilities of the model, there is a need for more single-species adsorption isotherms at different temperatures, in addition to extending the model to include adsorption kinetics. This report provides background information about the modeling process and a path forward for further model improvement in terms of accuracy and user interface.
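
    The numerical-dispersion defect noted above is easy to reproduce: a first-order upwind discretization of pure advection smears a sharp breakthrough front, qualitatively as described for OSPREY. The following self-contained sketch (a generic demonstration, not OSPREY's actual discretization) illustrates the effect:

        import numpy as np

        def upwind_advection(n_cells=200, n_steps=280, cfl=0.5):
            """Advect a sharp concentration front with first-order upwind
            differencing. The scheme is stable for CFL <= 1 but introduces
            artificial (numerical) dispersion that smears the front -- the
            same qualitative defect described for the OSPREY model."""
            c = np.zeros(n_cells)
            c[:10] = 1.0                          # sharp inlet front
            for _ in range(n_steps):
                c[1:] -= cfl * (c[1:] - c[:-1])   # upwind update, u*dt/dx = cfl
            return c

        front = upwind_advection()
        # The exact solution is still a step near cell 150; the computed
        # front is smeared over many cells instead.
        print(front[140:160].round(3))

    Higher-order discontinuous Galerkin methods of the kind used in DGOSPREY control this smearing, which is why DGOSPREY reproduces breakthrough-curve shapes far more faithfully.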

  14. Development of an Unstructured, Three-Dimensional Material Response Design Tool

    NASA Technical Reports Server (NTRS)

    Schulz, Joseph; Stern, Eric; Palmer, Grant; Muppidi, Suman; Schroeder, Olivia

    2017-01-01

    A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material as a result of the radiative and convective surface heating imposed on the material from the surrounding high enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be quickly added. The extensibility is critical since thermal protection systems are becoming increasingly complex, e.g., woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries as well as multi-dimensional physics, which have been shown to be important in some scenarios and are not captured by one-dimensional models. In this paper, the mathematical and numerical formulation is presented followed by a discussion of the software architecture and some preliminary verification and validation studies.

  15. Measurement of W + bb and a search for MSSM Higgs bosons with the CMS detector at the LHC

    NASA Astrophysics Data System (ADS)

    O'Connor, Alexander Pinpin

    Tooling used to cure composite laminates in the aerospace and automotive industries must provide a dimensionally stable geometry throughout the thermal cycle applied during the part curing process. This requires that the Coefficient of Thermal Expansion (CTE) of the tooling materials match that of the composite being cured. The traditional tooling material for production applications is a nickel alloy. Poor machinability and high material costs increase the expense of metallic tooling made from nickel alloys such as 'Invar 36' or 'Invar 42'. Currently, metallic tooling is unable to meet the needs of applications requiring rapid, affordable tooling solutions. In applications where the tooling is not required to have the durability provided by metals, such as for small area repair, an opportunity exists for non-metallic tooling materials like graphite, carbon foams, composites, or ceramics and machinable glasses. Nevertheless, efficient machining of brittle, non-metallic materials is challenging due to low ductility, porosity, and high hardness. The machining of a layup tool comprises a large portion of the final cost. Achieving maximum process economy requires optimization of the machining process in the given tooling material. Therefore, machinability of the tooling material is a critical aspect of the overall cost of the tool. In this work, three commercially available, brittle/porous, non-metallic candidate tooling materials were selected, namely Autoclaved Aerated Concrete (AAC), CB1100 ceramic block and Cfoam carbon foam. Machining tests were conducted in order to evaluate the machinability of these materials using end milling. Chip formation, cutting forces, cutting tool wear, machining induced damage, surface quality and surface integrity were investigated using High Speed Steel (HSS), carbide, diamond abrasive and Polycrystalline Diamond (PCD) cutting tools. Cutting forces were found to be random in magnitude, which was a result of material porosity. The abrasive nature of Cfoam produced rapid tool wear when using HSS and PCD type cutting tools. However, tool wear was not significant in AAC or CB1100 regardless of the type of cutting edge. Machining induced damage was observed in the form of macro-scale chipping and fracture in combination with micro-scale cracking. Transverse rupture test results revealed significant reductions in residual strength and damage tolerance in CB1100. In contrast, AAC and Cfoam showed no correlation between machining induced damage and a reduction in surface integrity. Cutting forces in machining were modeled for all materials. Cutting force regression models were developed based on Design of Experiment and Analysis of Variance. A mechanistic cutting force model was proposed based upon conventional end milling force models and statistical distributions of material porosity. In order to validate the model, predicted cutting forces were compared to experimental results. Predicted cutting forces agreed well with experimental measurements. Furthermore, over the range of cutting conditions tested, the proposed model was shown to have comparable predictive accuracy to empirically produced regression models, greatly reducing the number of cutting tests required to simulate cutting forces. Further, this work demonstrates a key adaptation of metallic cutting force models to brittle porous materials, a vital step in the research into the machining of these materials using end milling.
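
    A mechanistic end-milling force model of the general kind described, with the tangential force proportional to the instantaneous chip area and modulated here by a random porosity factor, can be sketched as follows; the coefficients and the Bernoulli porosity model are illustrative assumptions, not the fitted values from this work:

        import numpy as np

        rng = np.random.default_rng(0)

        def tangential_force(theta, feed_per_tooth=0.05, depth_of_cut=2.0,
                             k_t=600.0, porosity=0.3):
            """Mechanistic end-milling force sketch: Ft = Kt * h(theta) * ap,
            with instantaneous chip thickness h = fz * sin(theta). Porosity
            is modeled as a Bernoulli thinning of the cutting coefficient --
            when the edge meets a pore it transmits no force -- which
            reproduces the randomly varying force magnitudes reported for
            porous tooling materials. Units: mm, N/mm^2, N."""
            h = feed_per_tooth * np.sin(theta)          # chip thickness, mm
            solid = rng.random(np.shape(theta)) > porosity
            return k_t * h * depth_of_cut * solid       # N

        theta = np.linspace(0.01, np.pi, 50)            # immersion angles
        print(tangential_force(theta).round(1))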

  16. Advanced process control framework initiative

    NASA Astrophysics Data System (ADS)

    Hill, Tom; Nettles, Steve

    1997-01-01

    The semiconductor industry, one of the world's most fiercely competitive industries, is driven by increasingly complex process technologies and global competition to improve cycle time, quality, and process flexibility. Due to the complexity of these problems, current process control techniques are generally nonautomated, time-consuming, reactive, nonadaptive, and focused on individual fabrication tools and processes. As the semiconductor industry moves into higher density processes, radical new approaches are required. To address the need for advanced factory-level process control in this environment, Honeywell, Advanced Micro Devices (AMD), and SEMATECH formed the Advanced Process Control Framework Initiative (APCFI) joint research project. The project defines and demonstrates an Advanced Process Control (APC) approach based on SEMATECH's Computer Integrated Manufacturing (CIM) Framework. Its scope includes the coordination of Manufacturing Execution Systems, process control tools, and wafer fabrication equipment to provide necessary process control capabilities. Moreover, it takes advantage of the CIM Framework to integrate and coordinate applications from other suppliers that provide services necessary for the overall system to function. This presentation discusses the key concept of model-based process control that differentiates the APC Framework. This major improvement over current methods enables new systematic process control by linking the knowledge of key process settings to desired product characteristics that reside in models created with commercial model development tools. The unique framework-based approach facilitates integration of commercial tools and reuse of their data by tying them together in an object-based structure. The presentation also explores the perspective of each organization's involvement in the APCFI project. Each has complementary goals and expertise to contribute; Honeywell represents the supplier viewpoint, AMD represents the user with 'real customer requirements', and SEMATECH provides a consensus-building organization that widely disseminates technology to suppliers and users in the semiconductor industry that face similar equipment and factory control systems challenges.

  17. Methods Developed by the Tools for Engine Diagnostics Task to Monitor and Predict Rotor Damage in Real Time

    NASA Technical Reports Server (NTRS)

    Baaklini, George Y.; Smith, Kevin; Raulerson, David; Gyekenyesi, Andrew L.; Sawicki, Jerzy T.; Brasche, Lisa

    2003-01-01

    Tools for Engine Diagnostics is a major task in the Propulsion System Health Management area of the Single Aircraft Accident Prevention project under NASA's Aviation Safety Program. The major goal of the Aviation Safety Program is to reduce fatal aircraft accidents by 80 percent within 10 years and by 90 percent within 25 years. The goal of the Propulsion System Health Management area is to eliminate propulsion system malfunctions as a primary or contributing factor to the cause of aircraft accidents. The purpose of Tools for Engine Diagnostics, a 2-yr-old task, is to establish and improve tools for engine diagnostics and prognostics that measure the deformation and damage of rotating engine components at the ground level and that perform intermittent or continuous on-wing monitoring. In this work, nondestructive-evaluation- (NDE-) based technology is combined with model-dependent disk spin experimental simulation systems, like finite element modeling (FEM) and modal norms, to monitor and predict rotor damage in real time. Fracture mechanics time-dependent fatigue crack growth and damage-mechanics-based life estimation are being developed, and their potential use investigated. In addition, wireless eddy current and advanced acoustics are being developed for on-wing and just-in-time NDE engine inspection to provide deeper access and higher sensitivity to extend on-wing capabilities and improve inspection readiness. In the long run, these methods could establish a base for prognostic sensing while an engine is running, without any overt actions, like inspections. This damage-detection strategy includes experimentally acquired vibration-, eddy-current- and capacitance-based displacement measurements and analytically computed FEM-, modal norms-, and conventional rotordynamics-based models of well-defined damages and critical mass imbalances in rotating disks and rotors.

  18. Development of efficient and cost-effective distributed hydrological modeling tool MWEasyDHM based on open-source MapWindow GIS

    NASA Astrophysics Data System (ADS)

    Lei, Xiaohui; Wang, Yuhui; Liao, Weihong; Jiang, Yunzhong; Tian, Yu; Wang, Hao

    2011-09-01

    Many regions in China are still threatened by frequent floods and water resource shortages. Consequently, reproducing and predicting the hydrological processes in watersheds is a difficult but unavoidable task for reducing the risks of damage and loss. Thus, it is necessary to develop an efficient and cost-effective hydrological tool in China, as many areas need to be modeled. Currently, developed hydrological tools such as Mike SHE and ArcSWAT (soil and water assessment tool based on ArcGIS) show significant power in improving the precision of hydrological modeling in China by considering spatial variability both in land cover and in soil type. However, adopting developed commercial tools in such a large developing country comes at a high cost. Commercial modeling tools usually contain large numbers of formulas, complicated data formats, and many preprocessing or postprocessing steps that may make it difficult for the user to carry out simulation, thus lowering the efficiency of the modeling process. Moreover, commercial hydrological models usually cannot be modified or improved to be suitable for some special hydrological conditions in China. Some other hydrological models are open source but are integrated into commercial GIS systems. Therefore, by integrating the hydrological simulation code EasyDHM, a hydrological simulation tool named MWEasyDHM was developed based on open-source MapWindow GIS, the purpose of which is to establish the first open-source GIS-based distributed hydrological model tool in China by integrating modules of preprocessing, model computation, parameter estimation, result display, and analysis. MWEasyDHM provides users with a friendly MapWindow GIS interface, selectable multifunctional hydrological processing modules, and, more importantly, an efficient and cost-effective hydrological simulation tool. The general construction of MWEasyDHM consists of four major parts: (1) a general GIS module for hydrological analysis, (2) a preprocessing module for modeling inputs, (3) a model calibration module, and (4) a postprocessing module. The general GIS module for hydrological analysis is developed on the basis of totally open-source GIS software, MapWindow, which contains basic GIS functions. The preprocessing module comprises three submodules: a DEM-based submodule for hydrological analysis, a submodule for default parameter calculation, and a submodule for the spatial interpolation of meteorological data. The calibration module supports parallel computation, real-time computation, and visualization. The postprocessing module includes model calibration and spatial visualization of model results in tabular form and on spatial grids. MWEasyDHM makes efficient modeling and calibration with EasyDHM possible and promises further development of cost-effective applications in various watersheds.
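
    DEM-based hydrological-analysis submodules of the kind mentioned above typically start from a D8 flow-direction computation; a minimal illustration of that standard step (generic, not code from MWEasyDHM) follows:

        import numpy as np

        def d8_flow_direction(dem):
            """Standard D8 step used by DEM-based preprocessing submodules:
            each interior cell drains toward the steepest-descent neighbor
            among its eight neighbors. Returns an index 0-7 into the offset
            list, or -1 for pits and edge cells. Real tools also handle pit
            filling and diagonal distance weighting, omitted here."""
            offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                       (0, 1), (1, -1), (1, 0), (1, 1)]
            dem = np.asarray(dem, dtype=float)
            out = -np.ones(dem.shape, dtype=int)
            for i in range(1, dem.shape[0] - 1):
                for j in range(1, dem.shape[1] - 1):
                    drops = [dem[i, j] - dem[i + di, j + dj] for di, dj in offsets]
                    k = int(np.argmax(drops))
                    if drops[k] > 0:
                        out[i, j] = k
            return out

        dem = [[5, 4, 3], [4, 3, 2], [3, 2, 1]]
        print(d8_flow_direction(dem))   # center cell drains to the SE corner (index 7)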

  19. Model-based reasoning for system and software engineering: The Knowledge From Pictures (KFP) environment

    NASA Technical Reports Server (NTRS)

    Bailin, Sydney; Paterra, Frank; Henderson, Scott; Truszkowski, Walt

    1993-01-01

    This paper presents a discussion of current work in the area of graphical modeling and model-based reasoning being undertaken by the Automation Technology Section, Code 522.3, at Goddard. The work was initially motivated by the growing realization that the knowledge acquisition process was a major bottleneck in the generation of fault detection, isolation, and repair (FDIR) systems for application in automated Mission Operations. As with most research activities, this work started out with a simple objective: to develop a proof-of-concept system demonstrating that a draft rule-base for an FDIR system could be automatically realized by reasoning from a graphical representation of the system to be monitored. This work was called Knowledge From Pictures (KFP) (Truszkowski et al. 1992). As the work has progressed, the KFP tool has grown into an environment populated by a set of tools that support a more comprehensive approach to model-based reasoning. The paper continues with an overview of the graphical modeling objectives of the work, describes the three tools that now populate the KFP environment, briefly discusses related work in the field, and indicates future directions for the KFP environment.

  20. Systems engineering medicine: engineering the inflammation response to infectious and traumatic challenges

    PubMed Central

    Parker, Robert S.; Clermont, Gilles

    2010-01-01

    The complexity of the systemic inflammatory response and the lack of a breakthrough in the treatment of pathogenic infection demand that advanced tools be brought to bear in the treatment of severe sepsis and trauma. Systems medicine, the translational science counterpart to basic science's systems biology, is the interface at which these tools may be constructed. Rapid initial strides in improving sepsis treatment are possible through the use of phenomenological modelling and optimization tools for process understanding and device design. Higher impact, and more generalizable, treatment designs are based on mechanistic understanding developed through the use of physiologically based models, characterization of population variability, and the use of control-theoretic systems engineering concepts. In this review we introduce acute inflammation and sepsis as an example of just one area that is currently underserved by the systems medicine community, and, therefore, an area in which contributions of all types can be made. PMID:20147315

  1. A practical six-degree of freedom solar sail dynamics model for optimizing solar sail trajectories with torque constraints

    NASA Technical Reports Server (NTRS)

    Lisano, Michael E.

    2004-01-01

    Controlled flight of a solar sail-propelled spacecraft ('sailcraft') is a six-degree-of-freedom dynamics problem. Current state-of-the-art tools that simulate and optimize the trajectories flown by sailcraft do not treat the full kinetic (i.e. force- and torque-constrained) motion, instead treating a discrete history of commanded sail attitudes, and either neglecting the sail attitude motion over an integration timestep, or treating the attitude evolution kinematically with a spline or similar treatment. The present paper discusses an aspect of developing a next-generation sailcraft trajectory design and optimization tool at JPL for NASA's Solar Sail Spaceflight Simulation Software (S5). The aspect discussed is an experimental approach to modeling the full six-degree-of-freedom kinetic motion of a solar sail in a trajectory propagator. Early results from implementing this approach in a new trajectory propagation tool are given.
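
    For the force-and-torque side of such a six-degree-of-freedom propagation, the ideal flat-sail solar radiation pressure model is the usual starting point; the sketch below uses that textbook model (an illustrative stand-in, not the actual S5 force model) to compute the body force and the torque about the center of mass:

        import numpy as np

        def ideal_sail_force_torque(n_hat, r_cp, area=40.0, p_srp=4.56e-6):
            """Ideal solar-sail SRP model: F = 2 P A (s.n)^2 n, with the Sun
            line s taken along +x here. Torque is r_cp x F, where r_cp is the
            center-of-pressure offset from the center of mass (m). p_srp is
            the solar radiation pressure at 1 AU (N/m^2), area in m^2. This
            is the textbook flat, perfectly reflecting sail."""
            s_hat = np.array([1.0, 0.0, 0.0])
            n_hat = np.asarray(n_hat, dtype=float)
            n_hat /= np.linalg.norm(n_hat)
            cos_a = np.dot(s_hat, n_hat)
            force = 2.0 * p_srp * area * cos_a**2 * n_hat   # N
            torque = np.cross(np.asarray(r_cp, dtype=float), force)  # N*m
            return force, torque

        f, t = ideal_sail_force_torque(n_hat=[1.0, 0.2, 0.0], r_cp=[0.0, 0.1, 0.0])
        print(f, t)

    In a kinetic propagator, this force and torque would feed the coupled translational and rotational equations of motion at every integration step rather than being applied to a commanded attitude history.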

  2. Systems engineering medicine: engineering the inflammation response to infectious and traumatic challenges.

    PubMed

    Parker, Robert S; Clermont, Gilles

    2010-07-06

    The complexity of the systemic inflammatory response and the lack of a breakthrough in the treatment of pathogenic infection demand that advanced tools be brought to bear in the treatment of severe sepsis and trauma. Systems medicine, the translational science counterpart to basic science's systems biology, is the interface at which these tools may be constructed. Rapid initial strides in improving sepsis treatment are possible through the use of phenomenological modelling and optimization tools for process understanding and device design. Higher impact, and more generalizable, treatment designs are based on mechanistic understanding developed through the use of physiologically based models, characterization of population variability, and the use of control-theoretic systems engineering concepts. In this review we introduce acute inflammation and sepsis as an example of just one area that is currently underserved by the systems medicine community, and, therefore, an area in which contributions of all types can be made.

  3. Parallel stochastic simulation of macroscopic calcium currents.

    PubMed

    González-Vélez, Virginia; González-Vélez, Horacio

    2007-06-01

    This work introduces MACACO, a macroscopic calcium currents simulator. It provides a parameter-sweep framework which computes macroscopic Ca(2+) currents from the individual aggregation of unitary currents, using a stochastic model for L-type Ca(2+) channels. MACACO uses a simplified 3-state Markov model to simulate the response of each Ca(2+) channel to different voltage inputs to the cell. In order to provide an accurate systematic view of the stochastic nature of the calcium channels, MACACO is composed of an experiment generator, a central simulation engine and a post-processing script component. Due to the computational complexity of the problem and the dimensions of the parameter space, the MACACO simulation engine employs a grid-enabled task farm. Having been designed as a computational biology tool, MACACO heavily borrows from the way cell physiologists conduct and report their experimental work.
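
    A minimal version of the channel-level computation described, with a 3-state Markov chain per channel and unitary currents summed into a macroscopic current, might look as follows; the state scheme, rate values, and unitary current are illustrative assumptions, not MACACO's calibrated parameters:

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical 3-state scheme: 0 = closed, 1 = open, 2 = inactivated.
        # Row-stochastic per-step transition matrix (illustrative values).
        P = np.array([[0.90, 0.10, 0.00],
                      [0.05, 0.80, 0.15],
                      [0.01, 0.00, 0.99]])

        def macroscopic_current(n_channels=1000, n_steps=200, i_unitary=-0.3):
            """Aggregate unitary currents from independent 3-state Markov
            channels: at each step every channel transitions according to P,
            and the macroscopic current is (channels open) * unitary current
            in pA."""
            states = np.zeros(n_channels, dtype=int)        # all start closed
            trace = np.empty(n_steps)
            for t in range(n_steps):
                u = rng.random(n_channels)
                cdf = np.cumsum(P[states], axis=1)          # per-channel row CDF
                states = (u[:, None] > cdf).sum(axis=1)     # sample next state
                trace[t] = i_unitary * np.count_nonzero(states == 1)
            return trace

        trace = macroscopic_current()
        print(trace[:5], trace[-5:])

    A voltage dependence would enter by making the entries of P functions of the membrane potential, which is the dimension MACACO sweeps in its parameter-sweep framework.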

  4. A database for propagation models

    NASA Technical Reports Server (NTRS)

    Kantak, Anil V.; Suwitra, Krisjani; Le, Choung

    1994-01-01

    A database of various propagation phenomena models that can be used by telecommunications systems engineers to obtain parameter values for systems design is presented. This easy-to-use tool is currently available for a PC running Excel under Windows or for a Macintosh running Excel for the Macintosh. All the steps necessary to use the software are simple and often self-explanatory; however, a sample run of the CCIR rain attenuation model is presented.
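
    Rain attenuation models of the CCIR family reduce to a power law in rain rate, gamma = k * R^alpha dB/km, with frequency- and polarization-dependent coefficients; the sketch below hard-codes one illustrative coefficient pair rather than looking it up from the CCIR tables:

        def rain_specific_attenuation(rain_rate_mm_h, k=0.0188, alpha=1.217):
            """CCIR-style specific rain attenuation, gamma = k * R^alpha
            (dB/km). The k and alpha here are illustrative values of the
            kind tabulated per frequency and polarization (roughly the
            12 GHz neighborhood); a real run would take them from the
            CCIR/ITU-R coefficient tables."""
            return k * rain_rate_mm_h ** alpha

        # Example: 25 mm/h rain over a 5 km effective path
        gamma = rain_specific_attenuation(25.0)
        print(f"{gamma:.2f} dB/km -> {gamma * 5.0:.1f} dB total")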

  5. Challenges of predicting the potential distribution of a slow-spreading invader: a habitat suitability map for an invasive riparian tree

    USGS Publications Warehouse

    Jarnevich, Catherine S.; Reynolds, Lindsay V.

    2011-01-01

    Understanding the potential spread of invasive species is essential for land managers to prevent their establishment and restore impacted habitat. Habitat suitability modeling provides a tool for researchers and managers to understand the potential extent of invasive species spread. Our goal was to use habitat suitability modeling to map potential habitat of the riparian plant invader, Russian olive (Elaeagnus angustifolia). Russian olive has invaded riparian habitat across North America and is continuing to expand its range. We compiled 11 disparate datasets for Russian olive presence locations (n = 1,051 points and 139 polygons) in the western US and used Maximum entropy (Maxent) modeling to develop two habitat suitability maps for Russian olive in the western United States: one with coarse-scale water data and one with fine-scale water data. Our models were able to accurately predict current suitable Russian olive habitat (Coarse model: training AUC = 0.938, test AUC = 0.907; Fine model: training AUC = 0.923, test AUC = 0.885). Distance to water was the most important predictor for Russian olive presence in our coarse-scale water model, but it was only the fifth most important variable in the fine-scale model, suggesting that when water bodies are considered on a fine scale, Russian olive does not necessarily rely on water. Our model predicted that Russian olive has suitable habitat farther west of its current distribution, extending toward the west coast and into central North America. Our methodology proves useful for identifying potential future areas of invasion. Model results may be influenced by locations of cultivated individuals and sampling bias. Further study is needed to examine the potential for Russian olive to invade beyond its current range. Habitat suitability modeling provides an essential tool for enhancing our understanding of invasive species spread.

  6. Proposals for enhanced health risk assessment and stratification in an integrated care scenario

    PubMed Central

    Dueñas-Espín, Ivan; Vela, Emili; Pauws, Steffen; Bescos, Cristina; Cano, Isaac; Cleries, Montserrat; Contel, Joan Carles; de Manuel Keenoy, Esteban; Garcia-Aymerich, Judith; Gomez-Cabrero, David; Kaye, Rachelle; Lahr, Maarten M H; Lluch-Ariet, Magí; Moharra, Montserrat; Monterde, David; Mora, Joana; Nalin, Marco; Pavlickova, Andrea; Piera, Jordi; Ponce, Sara; Santaeugenia, Sebastià; Schonenberg, Helen; Störk, Stefan; Tegner, Jesper; Velickovski, Filip; Westerteicher, Christoph; Roca, Josep

    2016-01-01

    Objectives: Population-based health risk assessment and stratification are considered highly relevant for large-scale implementation of integrated care by facilitating services design and case identification. The principal objective of the study was to analyse five health-risk assessment strategies and health indicators used in the five regions participating in the Advancing Care Coordination and Telehealth Deployment (ACT) programme (http://www.act-programme.eu). The second purpose was to elaborate on strategies toward enhanced health risk predictive modelling in the clinical scenario. Settings: The five ACT regions: Scotland (UK), Basque Country (ES), Catalonia (ES), Lombardy (I) and Groningen (NL). Participants: Responsible teams for regional data management in the five ACT regions. Primary and secondary outcome measures: We characterised and compared risk assessment strategies among ACT regions by analysing operational health risk predictive modelling tools for population-based stratification, as well as available health indicators at regional level. The analysis of the risk assessment tool deployed in Catalonia in 2015 (GMAs, Adjusted Morbidity Groups) was used as a basis to propose how population-based analytics could contribute to clinical risk prediction. Results: There was consensus on the need for a population health approach to generate health risk predictive modelling. However, this strategy was fully in place only in two ACT regions: Basque Country and Catalonia. We found marked differences among regions in health risk predictive modelling tools and health indicators, and identified key factors constraining their comparability. The research proposes means to overcome current limitations and the use of population-based health risk prediction for enhanced clinical risk assessment. Conclusions: The results indicate the need for further efforts to improve both comparability and flexibility of current population-based health risk predictive modelling approaches. Applicability and impact of the proposals for enhanced clinical risk assessment require prospective evaluation. PMID:27084274

  7. Overcoming redundancies in bedside nursing assessments by validating a parsimonious meta-tool: findings from a methodological exercise study.

    PubMed

    Palese, Alvisa; Marini, Eva; Guarnier, Annamaria; Barelli, Paolo; Zambiasi, Paola; Allegrini, Elisabetta; Bazoli, Letizia; Casson, Paola; Marin, Meri; Padovan, Marisa; Picogna, Michele; Taddia, Patrizia; Chiari, Paolo; Salmaso, Daniele; Marognolli, Oliva; Canzan, Federica; Ambrosi, Elisa; Saiani, Luisa; Grassetti, Luca

    2016-10-01

    There is growing interest in validating tools aimed at supporting the clinical decision-making process and research. However, an increased bureaucratization of clinical practice and redundancies in the measures collected have been reported by clinicians. Redundancies in clinical assessments negatively affect both patients and nurses. To validate a meta-tool measuring the risks/problems currently estimated by multiple tools used in daily practice. A secondary analysis of a database was performed, using cross-validation and longitudinal study designs. In total, 1464 patients admitted to 12 medical units in 2012 were assessed at admission with the Brass, Barthel, Conley and Braden tools. Pertinent outcomes such as the occurrence of post-discharge need for resources and functional decline at discharge, as well as falls and pressure sores, were measured. Explorative factor analysis of each tool, inter-tool correlations and a conceptual evaluation of the redundant/similar items across tools were performed. The validation of the meta-tool was then performed through explorative factor analysis, confirmatory factor analysis and structural equation modelling to establish the ability of the meta-tool to predict the outcomes estimated by the original tools. High correlations between the tools emerged (r from 0.428 to 0.867), with a common variance from 18.3% to 75.1%. Through a conceptual evaluation and explorative factor analysis, the items were reduced from 42 to 20, and the three factors that emerged were confirmed by confirmatory factor analysis. According to the structural equation model results, two of the three factors predicted the outcomes. From the initial 42 items, the meta-tool is composed of 20 items capable of predicting the outcomes as the original tools do. © 2016 John Wiley & Sons, Ltd.

  8. Genetically engineered mouse models of melanoma.

    PubMed

    Pérez-Guijarro, Eva; Day, Chi-Ping; Merlino, Glenn; Zaidi, M Raza

    2017-06-01

    Melanoma is a complex disease that exhibits highly heterogeneous etiological, histopathological, and genetic features, as well as therapeutic responses. Genetically engineered mouse (GEM) models provide powerful tools to unravel the molecular mechanisms critical for melanoma development and drug resistance. Here, we expound briefly the basis of the mouse modeling design, the available technology for genetic engineering, and the aspects influencing the use of GEMs to model melanoma. Furthermore, we describe in detail the currently available GEM models of melanoma. Cancer 2017;123:2089-103. © 2017 American Cancer Society.

  9. A Cognitive Model for Problem Solving in Computer Science

    ERIC Educational Resources Information Center

    Parham, Jennifer R.

    2009-01-01

    According to industry representatives, computer science education needs to emphasize the processes involved in solving computing problems rather than their solutions. Most of the current assessment tools used by universities and computer science departments analyze student answers to problems rather than investigating the processes involved in…

  10. Enhanced model of photovoltaic cell/panel/array considering the direct and reverse modes

    NASA Astrophysics Data System (ADS)

    Zegaoui, Abdallah; Boutoubat, Mohamed; Sawicki, Jean-Paul; Kessaissia, Fatma Zohra; Djahbar, Abdelkader; Aillerie, Michel

    2018-05-01

    This paper presents an improved generalized physical model for photovoltaic (PV) cells, panels and arrays, taking into account the behavior of these devices under both direct (forward) and reverse bias. Existing physical PV models are generally efficient at simulating the influence of irradiation changes on the short-circuit current, but they cannot capture the influence of temperature changes. The Enhanced Direct and Reverse Mode (EDRM) model captures the influence of both temperature and irradiation on the short-circuit current in the reverse mode of the considered PV devices. Due to its easy implementation, the proposed model can be a useful tool for the development of new photovoltaic systems, taking environmental conditions into account in a more exhaustive manner. The developed model was tested on a marketed PV panel and gives satisfactory results compared with the parameters given in the manufacturer's datasheet.
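
    The usual starting point for such physical PV models is the single-diode equation; the sketch below solves it for the direct mode by damped fixed-point iteration with illustrative parameters (this is the standard textbook model, not the EDRM formulation):

        import math

        def single_diode_current(v, i_ph=8.0, i_0=1e-8, r_s=0.02, r_sh=200.0,
                                 n=1.1, cells=60, t_cell=298.15):
            """Standard single-diode PV model (direct mode):
            I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rsh,
            solved for I by damped fixed-point iteration. All parameter
            values are illustrative, not taken from the EDRM paper."""
            vt = 1.380649e-23 * t_cell / 1.602176634e-19   # thermal voltage, V
            i = i_ph
            for _ in range(200):
                vd = v + i * r_s
                i_new = i_ph - i_0 * math.expm1(vd / (n * cells * vt)) - vd / r_sh
                i = 0.5 * i + 0.5 * i_new                  # damping for stability
            return i

        for v in (0.0, 20.0, 30.0):
            print(v, round(single_diode_current(v), 3))

    A reverse-mode extension of the kind the paper proposes adds a breakdown term for negative cell voltages, which this direct-mode sketch omits.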

  11. C3 System Performance Simulation and User Manual. Getting Started: Guidelines for Users

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This document is a User's Manual describing the C3 Simulation capabilities. The subject work was designed to simulate the communications involved in the flight of a Remotely Operated Aircraft (ROA) using the Opnet software. Opnet provides a comprehensive development environment supporting the modeling of communication networks and distributed systems. It has tools for model design, simulation, data collection, and data analysis. Opnet models are hierarchical, consisting of a project that contains node models, which in turn contain process models. Nodes can be fixed, mobile, or satellite. Links between nodes can be physical or wireless. Communications are packet based. The model is very generic in its current form; attributes such as frequency and bandwidth can easily be modified to better reflect a specific platform. The model is not fully developed at this stage, and there are still more enhancements to be added. Current issues are documented throughout this guide.

  12. TOPLA: A New Empirical Representation of the F-Region Topside and Plasmasphere for the International Reference Ionosphere

    NASA Technical Reports Server (NTRS)

    Bilitza, D.; Reinisch, B.; Gallagher, D.; Huang, X.; Truhlik, V.; Nsumei, P.

    2007-01-01

    The goal of this LWS tools effort is the development of a new data-based F-region TOpside and PLAsmasphere (TOPLA) model for the electron density (Ne) and temperature (Te) for inclusion in the International Reference Ionosphere (IRI) model using newly available satellite data and models for these regions. The IRI model is the de facto international standard for specification of ionospheric parameters and is currently being considered as an ISO Technical Specification for the ionosphere. Our effort is directed towards improving the topside part of the model and extending it into the plasmasphere. Specifically, we plan to overcome the following shortcomings of the current IRI topside model: (1) overestimation of densities above 700 km by a factor of 2 or more, (2) unrealistically steep density profiles at high latitudes during very high solar activities, (3) no solar cycle variations and no semi-annual variations for the electron temperature, and (4) discontinuities or unphysical gradients when merging with plasmaspheric models. We will report on first accomplishments and on the current status of the project.

  13. Use of physiologically relevant biopharmaceutics tools within the pharmaceutical industry and in regulatory sciences: Where are we now and what are the gaps?

    PubMed

    Flanagan, Talia; Van Peer, Achiel; Lindahl, Anders

    2016-08-25

    Regulatory interactions are an important part of the drug development and licensing process. A survey on the use of biopharmaceutics tools for regulatory purposes was carried out within the industry community of the EU project OrBiTo, part of the Innovative Medicines Initiative (IMI). The aim was to capture current practice and experience in using in vitro and in silico biopharmaceutics tools at various stages of development, what barriers exist or are perceived, and to understand the current gaps in regulatory biopharmaceutics. The survey indicated that biorelevant dissolution testing and physiologically based modelling and simulation are widely applied throughout development to address a number of biopharmaceutics issues. However, data from these in vitro and in silico predictive biopharmaceutics tools are submitted to regulatory authorities far less often than they are used for internal risk assessment and decision making. This may prevent regulators from becoming familiar with these tools and how they are applied in industry, and limits the opportunities for biopharmaceutics scientists working in industry to understand the acceptability of these tools in the regulatory environment. It is anticipated that the advanced biopharmaceutics tools and understanding delivered in the next years by OrBiTo and other initiatives in the area of predictive tools will also be of value in the regulatory setting, and provide a basis for more informed and confident biopharmaceutics risk assessment and regulatory decision making. To enable the regulatory potential of predictive biopharmaceutics tools to be realized, further scientific dialogue is needed between industry, regulators and scientists in academia, and more examples need to be published to demonstrate the applicability of these tools. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Toward improving hurricane forecasts using the JPL Tropical Cyclone Information System (TCIS): A framework to address the issues of Big Data

    NASA Astrophysics Data System (ADS)

    Hristova-Veleva, S. M.; Boothe, M.; Gopalakrishnan, S.; Haddad, Z. S.; Knosp, B.; Lambrigtsen, B.; Li, P.; montgomery, M. T.; Niamsuwan, N.; Tallapragada, V. S.; Tanelli, S.; Turk, J.; Vukicevic, T.

    2013-12-01

    Accurate forecasting of extreme weather requires the use of both regional models as well as global General Circulation Models (GCMs). The regional models have higher resolution and more accurate physics - two critical components needed for properly representing the key convective processes. GCMs, on the other hand, have better depiction of the large-scale environment and, thus, are necessary for properly capturing the important scale interactions. But how to evaluate the models, understand their shortcomings and improve them? Satellite observations can provide invaluable information. And this is where the issues of Big Data come in: satellite observations are very complex and have large variety, while model forecasts are very voluminous. We are developing a system - TCIS - that addresses the issues of model evaluation and process understanding with the goal of improving the accuracy of hurricane forecasts. This NASA/ESTO/AIST-funded project aims at bringing satellite/airborne observations and model forecasts into a common system and at developing on-line tools for joint analysis. To properly evaluate the models, we go beyond the comparison of the geophysical fields. We input the model fields into instrument simulators (NEOS3, CRTM, etc.) and compute synthetic observations for a more direct comparison to the observed parameters. In this presentation we will start by describing the scientific questions. We will then outline our current framework to provide fusion of models and observations. Next, we will illustrate how the system can be used to evaluate several models (HWRF, GFS, ECMWF) by applying a couple of our analysis tools to several hurricanes observed during the 2013 season. Finally, we will outline our future plans. Our goal is to go beyond the image comparison and point-by-point statistics, by focusing instead on understanding multi-parameter correlations and providing robust statistics. By developing on-line analysis tools, our framework will allow for consistent model evaluation, providing results that are much more robust than those produced by case studies - the current paradigm imposed by the Big Data issues (voluminous data and incompatible analysis tools). We believe that this collaborative approach, with contributions of models, observations and analysis approaches used by the research and operational communities, will help untangle the complex interactions that lead to hurricane genesis and rapid intensity changes - two processes that still pose many unanswered questions. The developed framework for evaluation of the global models will also have implications for the improvement of the climate models, which output only a limited amount of information making it difficult to evaluate them. Our TCIS will help by investigating the GCMs under current weather scenarios and with much more detailed model output, making it possible to compare the models to multiple observed parameters to help narrow down the uncertainty in their performance. This knowledge could then be transferred to the climate models to lower the uncertainty in their predictions. The work described here was performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.

  15. Defining Uncertainty and Error in Planktic Foraminiferal Oxygen Isotope Measurements

    NASA Astrophysics Data System (ADS)

    Fraass, A. J.; Lowery, C.

    2016-12-01

    Foraminifera are the backbone of paleoceanography, and planktic foraminifera are one of the leading tools for reconstructing water column structure. Currently, there are unconstrained variables when dealing with the reproducibility of oxygen isotope measurements. This study presents the first results from a simple model of foraminiferal calcification (Foraminiferal Isotope Reproducibility Model; FIRM), designed to estimate the precision and accuracy of oxygen isotope measurements. FIRM produces synthetic isotope data using parameters including location, depth habitat, season, number of individuals included in measurement, diagenesis, misidentification, size variation, and vital effects. Reproducibility is then tested using Monte Carlo simulations. The results from a series of experiments show that reproducibility is largely controlled by the number of individuals in each measurement, but is also strongly a function of local oceanography if the number of individuals is held constant. Parameters like diagenesis or misidentification have an impact on both the precision and the accuracy of the data. FIRM is currently best employed as a tool to estimate isotopic error values in the Holocene. It is also a tool to explore the impact of myriad factors on the fidelity of paleoceanographic records. FIRM was constructed in the open-source computing environment R and is freely available via GitHub. We invite modification and expansion, and plan to add benthic foraminiferal reproducibility and stratigraphic uncertainty.
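
    The core Monte Carlo idea is compact: draw N individuals from a distribution of per-shell isotope values (set by local oceanography, depth habitat, and season), average them as one 'measurement', and repeat to estimate the spread. The sketch below is a stripped-down illustration of that logic with illustrative parameters, not FIRM itself (FIRM is written in R):

        import numpy as np

        rng = np.random.default_rng(42)

        def measurement_spread(n_individuals, mu=-1.5, sigma=0.4,
                               analytical_sd=0.08, n_trials=10000):
            """Monte Carlo estimate of oxygen-isotope measurement
            reproducibility: each trial averages n_individuals drawn from a
            normal population of per-shell d18O values (sigma standing in
            for seasonal and depth-habitat variability) and adds analytical
            noise. Returns the standard deviation across trials (per mil)."""
            shells = rng.normal(mu, sigma, size=(n_trials, n_individuals))
            noise = rng.normal(0.0, analytical_sd, size=n_trials)
            return (shells.mean(axis=1) + noise).std()

        for n in (1, 5, 10, 30):
            print(n, round(measurement_spread(n), 3))
        # The spread shrinks roughly as sigma/sqrt(N): the number of
        # individuals per measurement dominates reproducibility, as the
        # abstract reports.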

  16. Are we ready for Taenia solium cysticercosis elimination in sub-Saharan Africa?

    PubMed

    Johansen, Maria Vang; Trevisan, Chiara; Gabriël, Sarah; Magnussen, Pascal; Braae, Uffe Christian

    2017-01-01

    The World Health Organization announced in November 2014, at the fourth international meeting on 'The control of neglected zoonotic diseases - from advocacy to action', that intervention tools for eliminating Taenia solium taeniosis/cysticercosis (TSTC) are in place. The aim of this work was to elucidate theoretical outcomes of various control options suggested for TSTC elimination in sub-Saharan Africa (SSA) over a 4-year period. Our current knowledge regarding T. solium epidemiology and control primarily builds on studies from Latin America. A simple transmission model - built on data from Latin America - has been used to predict the effect of various interventions such as mass treatment of humans, vaccination and treatment of pigs, and health education of communities, potentially leading to change in bad practices and reducing transmission risks. Based on simulations of the transmission model, even a 4-year integrated One Health approach fails to eliminate TSTC from a small community, and in all simulations the prevalence of human taeniosis and porcine cysticercosis starts to rise as soon as the programmes end. Our current knowledge regarding transmission and burden of TSTC in SSA is scarce, and while claiming to be tool-ready, the selection of diagnostic and surveillance tools, as well as the algorithms and stepwise approaches for control and elimination of TSTC, remain major challenges.

  17. Warthog: A MOOSE-Based Application for the Direct Code Coupling of BISON and PROTEUS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCaskey, Alexander J.; Slattery, Stuart; Billings, Jay Jay

    The Nuclear Energy Advanced Modeling and Simulation (NEAMS) program from the Department of Energy's Office of Nuclear Energy provides a robust toolkit for the modeling and simulation of current and future advanced nuclear reactor designs. This toolkit provides these technologies organized across product lines: two divisions targeted at fuels and end-to-end reactor modeling, and a third for integration, coupling, and high-level workflow management. The Fuels Product Line and the Reactor Product Line provide advanced computational technologies that serve each respective field well; however, their current lack of integration presents a major impediment to future improvements of simulation solution fidelity. There is a desire for the capability to mix and match tools across Product Lines in an effort to utilize the best from both to improve NEAMS modeling and simulation technologies. This report details a new effort to provide this Product Line interoperability through the development of a new application called Warthog. This application couples the BISON Fuel Performance application from the Fuels Product Line and the PROTEUS Core Neutronics application from the Reactors Product Line in an effort to utilize the best from all parts of the NEAMS toolkit and improve overall solution fidelity of nuclear fuel simulations. To achieve this, Warthog leverages as much prior work from the NEAMS program as possible, and in doing so, enables interoperability between the disparate MOOSE and SHARP frameworks, and the libMesh and MOAB mesh data formats. This report describes this work in full. We begin with a detailed look at the individual NEAMS framework technologies used and developed in the various Product Lines, and the current status of their interoperability. We then introduce the Warthog application: its overall architecture and the ways it leverages the best existing tools from across the NEAMS toolkit to enable BISON-PROTEUS integration. Furthermore, we show how Warthog leverages a tool known as DataTransferKit to seamlessly enable the transfer of solution data between disparate frameworks and mesh formats. Finally, we demonstrate tests for the direct software coupling of BISON and PROTEUS using Warthog, and discuss current impediments and solutions to the construction of physically realistic input models for this coupled BISON-PROTEUS system.

  18. Computational Analysis of Static and Dynamic Behaviour of Magnetic Suspensions and Magnetic Bearings

    NASA Technical Reports Server (NTRS)

    Britcher, Colin P. (Editor); Groom, Nelson J.

    1996-01-01

    Static modelling of magnetic bearings is often carried out using magnetic circuit theory. This theory cannot easily include nonlinear effects such as magnetic saturation or the fringing of flux in air-gaps. Modern computational tools are able to accurately model complex magnetic bearing geometries, provided some care is exercised. In magnetic suspension applications, the magnetic fields are highly three-dimensional and require computational tools for the solution of most problems of interest. The dynamics of a magnetic bearing or magnetic suspension system can be strongly affected by eddy currents. Eddy currents are present whenever a time-varying magnetic flux penetrates a conducting medium. The direction of flow of the eddy current is such as to reduce the rate-of-change of flux. Analytic solutions for eddy currents are available for some simplified geometries, but complex geometries must be solved by computation. It is only in recent years that such computations have been considered truly practical. At NASA Langley Research Center, state-of-the-art finite-element computer codes, 'OPERA', 'TOSCA' and 'ELEKTRA', have recently been installed and applied to the magnetostatic and eddy current problems. This paper reviews results of theoretical analyses which suggest general forms of mathematical models for eddy currents, together with computational results. A simplified circuit-based eddy current model proposed here appears to predict the observed trends in the case of large eddy current circuits in conducting non-magnetic material. A much more difficult case is seen to be that of eddy currents in magnetic material, or in non-magnetic material at higher frequencies, due to the lower skin depths. Even here, the dissipative behavior has been shown to yield at least somewhat to linear modelling. Magnetostatic and eddy current computations have been carried out relating to the Annular Suspension and Pointing System, a prototype for a space payload pointing and vibration isolation system, where the magnetic actuator geometry resembles a conventional magnetic bearing. Magnetostatic computations provide estimates of flux density within airgaps and the iron core material, fringing at the pole faces and the net force generated. Eddy current computations provide coil inductance, power dissipation and the phase lag in the magnetic field, all as functions of excitation frequency. Here, the dynamics of the magnetic bearings, notably the rise time of forces with changing currents, are found to be very strongly affected by eddy currents, even at quite low frequencies. Results are also compared to experimental measurements of the performance of a large-gap magnetic suspension system, the Large Angle Magnetic Suspension Test Fixture (LAMSTF). Eddy current effects are again shown to significantly affect the dynamics of the system. Some consideration is given to the ease and accuracy of computation, specifically relating to OPERA/TOSCA/ELEKTRA.
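
    The skin-depth dependence underlying these observations is the standard result; in conventional notation (not reproduced from the paper):

        \delta = \sqrt{\frac{2}{\mu \, \sigma \, \omega}}

    where mu is the permeability, sigma the conductivity and omega the angular frequency of the excitation. Large mu (magnetic material) or large omega (higher frequencies) gives a small skin depth, which is why those cases confine eddy currents to thin surface layers that are much harder to resolve computationally.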

  19. Air quality and future energy system planning

    NASA Astrophysics Data System (ADS)

    Sobral Mourao, Zenaida; Konadu, Dennis; Lupton, Rick

    2016-04-01

    Ambient air pollution has been linked to an increasing number of premature deaths throughout the world. Projected increases in demand for food, energy resources and manufactured products will likely contribute to exacerbate air pollution, with an increasing impact on human health, agricultural productivity and climate change. Current events such as the tampering of emissions tests by the VW car manufacturer, the failure of many EU countries to comply with EU Air Quality directives and WHO guidelines, the problem of smog in Chinese cities, and new industrial emissions regulations represent unique challenges but also opportunities for regulators, local authorities and industry. However, current models and practices of energy and resource use do not consider ambient air impacts as an integral part of the planning process. Furthermore, the analysis of drivers, sources and impacts of air pollution is often fragmented, difficult to understand and lacks effective visualization tools that bring all of these components together. This work aims to develop a model that links impacts of air quality on human health and ecosystems to current and future developments in the energy system, industrial and agricultural activity and patterns of land use. The model will be added to the ForeseerTM tool, which is an integrated resource analysis platform that has been developed at the University of Cambridge initially with funding from BP and more recently through the EPSRC funded Whole Systems Energy Modeling (WholeSEM) project. The basis of the tool is a set of linked physical models for energy, water and land, including the technologies that are used to transform these resources into final services such as housing, food, transport and household goods. The new air quality model will explore different feedback effects between energy, land and atmospheric systems with the overarching goal of supporting better communication about the drivers of air quality and of incorporating concerns about air quality into energy system planning. Some example applications of this work are: (1) to discover conflicts and synergies between air quality regulations and future developments in the energy system and land use change; (2) to show the drivers of air quality in a given spatial context; (3) to explore effective ways to visualize impacts of different energy, land use and emissions control policies on air quality. An initial test case for the Bay Area in California will be presented, extending the scope of the existing California ForeseerTM tool to identify impacts of different policies within the water-energy-land nexus on local air quality.

  20. Supervised learning of tools for content-based search of image databases

    NASA Astrophysics Data System (ADS)

    Delanoy, Richard L.

    1996-03-01

    A computer environment, called the Toolkit for Image Mining (TIM), is being developed with the goal of enabling users with diverse interests and varied computer skills to create search tools for content-based image retrieval and other pattern matching tasks. Search tools are generated using a simple paradigm of supervised learning that is based on the user pointing at mistakes of classification made by the current search tool. As mistakes are identified, a learning algorithm uses the identified mistakes to build up a model of the user's intentions, construct a new search tool, apply the search tool to a test image, display the match results as feedback to the user, and accept new inputs from the user. Search tools are constructed in the form of functional templates, which are generalized matched filters capable of knowledge-based image processing. The ability of this system to learn the user's intentions from experience contrasts with other existing approaches to content-based image retrieval that base searches on the characteristics of a single input example or on a predefined and semantically-constrained textual query. Currently, TIM is capable of learning spectral and textural patterns, but should be adaptable to the learning of shapes, as well. Possible applications of TIM include not only content-based image retrieval, but also quantitative image analysis, the generation of metadata for annotating images, data prioritization or data reduction in bandwidth-limited situations, and the construction of components for larger, more complex computer vision algorithms.
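
    As a rough illustration of TIM's mistake-driven learning loop, the Python sketch below retrains a toy nearest-centroid classifier whenever the user flags a misclassification. The centroid model is an assumption standing in for TIM's functional templates; the feature vectors and labels are invented.

        import numpy as np

        def retrain(examples, labels):
            """Fit one centroid per class; a toy stand-in for constructing a new search tool."""
            classes = sorted(set(labels))
            return {c: np.mean([x for x, l in zip(examples, labels) if l == c], axis=0)
                    for c in classes}

        def classify(model, x):
            """Assign x to the class with the nearest centroid."""
            return min(model, key=lambda c: np.linalg.norm(x - model[c]))

        # Mistake-driven loop: each user-flagged error is added and the tool is rebuilt.
        examples = [np.array([0.2, 0.1]), np.array([0.9, 0.8])]
        labels = ["background", "target"]
        model = retrain(examples, labels)
        for x, corrected in [(np.array([0.7, 0.2]), "background"), (np.array([0.6, 0.9]), "target")]:
            if classify(model, x) != corrected:      # the user points at a misclassification
                examples.append(x)
                labels.append(corrected)
                model = retrain(examples, labels)    # updated model of the user's intentions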

  1. Linking Science and Management in an Interactive Geospatial, Multi-Criterion, Structured Decision Support Framework: Use Case Studies of the "Future Forests Geo-visualization and Decision Support Tool"

    NASA Astrophysics Data System (ADS)

    Pontius, J.; Duncan, J.

    2017-12-01

    Land managers are often faced with balancing management activities to accomplish a diversity of management objectives, in systems subject to many stress agents. Advances in ecosystem modeling provide a rich source of information to inform management. Coupled with advances in decision support techniques and computing capabilities, interactive tools are now accessible for a broad audience of stakeholders. Here we present one such tool designed to capture information on how climate change may impact forested ecosystems, and how that impact varies spatially across the landscape. This tool integrates empirical models of current and future forest structure and function in a structured decision framework that allows users to customize weights for multiple management objectives and visualize suitability outcomes across the landscape. Combined with climate projections, the resulting products allow stakeholders to compare the relative success of various management objectives on a pixel-by-pixel basis and identify locations where management outcomes are most likely to be met. Here we demonstrate this approach with the integration of several of the preliminary models developed to map species distributions, sugar maple health, forest fragmentation risk and hemlock vulnerability to hemlock woolly adelgid under current and future climate scenarios. We compare three use case studies with objective weightings designed to: 1) Identify key parcels for sugarbush conservation and management, 2) Target state lands that may serve as hemlock refugia from hemlock woolly adelgid induced mortality, and 3) Examine how climate change may alter the success of managing for both sugarbush and hemlock across privately owned lands. This tool highlights the value of flexible models that can be easily run with customized weightings in a dynamic, integrated assessment that allows users to home in on their potentially complex management objectives, and to visualize and prioritize locations across the landscape. It also demonstrates the importance of including climate considerations for long-term management. This merging of scientific knowledge with the diversity of stakeholder needs is an important step towards using science to inform management and policy decisions.
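
    A minimal sketch of the pixel-wise weighted-objective scoring such a tool performs is given below; the layer names, weights, and min-max normalization are hypothetical stand-ins, not the actual Future Forests models.

        import numpy as np

        def weighted_suitability(layers, weights):
            """Pixel-wise weighted sum of min-max-normalized objective rasters (same shape)."""
            w = np.asarray(weights, dtype=float)
            w = w / w.sum()                                    # user weights normalized to 1
            stack = np.stack([(a - a.min()) / (np.ptp(a) or 1.0) for a in layers])
            return np.tensordot(w, stack, axes=1)

        rng = np.random.default_rng(0)
        sugar_maple, hemlock_refugia, frag_risk = (rng.random((4, 4)) for _ in range(3))
        score = weighted_suitability([sugar_maple, hemlock_refugia, 1 - frag_risk],
                                     [0.5, 0.3, 0.2])
        print(np.unravel_index(score.argmax(), score.shape))   # best pixel for these weights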

  2. First tier modeling of consumer dermal exposure to substances in consumer articles under REACH: a quantitative evaluation of the ECETOC TRA for consumers tool.

    PubMed

    Delmaar, J E; Bokkers, B G H; ter Burg, W; van Engelen, J G M

    2013-02-01

    The demonstration of safe use of chemicals in consumer products, as required under REACH, is proposed to follow a tiered process. In the first tier, simple conservative methods and assumptions should be used to quickly verify whether risks are expected for a particular use. The ECETOC TRA Consumer Exposure Tool was developed to assist in first tier risk assessments for substances in consumer products. The ECETOC TRA is not a prioritization tool, but is meant as a first screening. Therefore, the exposure assessment needs to cover all products/articles in a specific category. For the assessment of dermal exposure to substances in articles, ECETOC TRA uses the concept of a 'contact layer', a hypothetical layer that limits the exposure to a substance contained in the product. For each product/article category, ECETOC TRA proposes default values for the thickness of this contact layer. As relevant experimental exposure data are currently lacking, the default values are based on expert judgment alone. In this paper it is verified whether this concept meets the requirement of being a conservative exposure evaluation method. This is done by confronting the expert-judgment-based predictions of the ECETOC TRA with a mechanistic emission model based on the well-established theory of diffusion of substances in materials. Diffusion models have been applied and tested in many applications of emission modeling, and experimentally determined input data for a number of material and substance combinations are available. The estimated emissions provide information on the range of emissions that could occur in reality. First tier tools such as the ECETOC TRA are required to cover all products/articles in a category and to provide estimates that are at least as high as is expected on the basis of current scientific knowledge. Since this was not the case, it is concluded that the ECETOC TRA does not provide a proper conservative estimation method for dermal exposure to articles. An alternative method is proposed. Copyright © 2012 Elsevier Inc. All rights reserved.
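
    For orientation, the snippet below evaluates the standard short-time Fickian estimate of the fraction of substance emitted from a slab with one exposed face, M_t/M_inf ~ (2/L)*sqrt(D*t/pi). The diffusivity and article dimensions are hypothetical; this shows only the generic diffusion theory the paper draws on, not its specific model.

        import math

        def fraction_emitted(D, t, L):
            """Short-time Fickian estimate of the emitted fraction from a slab of
            thickness L (m) with one exposed face: M_t/M_inf ~ (2/L)*sqrt(D*t/pi)."""
            f = (2.0 / L) * math.sqrt(D * t / math.pi)
            return min(f, 1.0)  # the approximation only holds while f is well below 1

        # Hypothetical polymer-additive diffusivity, a 1 mm article, 1 hour of contact:
        print(fraction_emitted(D=1e-14, t=3600.0, L=1e-3))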

  3. The potential of induced pluripotent stem cells in models of neurological disorders: implications on future therapy.

    PubMed

    Crook, Jeremy Micah; Wallace, Gordon; Tomaskovic-Crook, Eva

    2015-03-01

    There is an urgent need for new and advanced approaches to modeling the pathological mechanisms of complex human neurological disorders. This is underscored by the decline in pharmaceutical research and development efficiency resulting in a relative decrease in new drug launches in the last several decades. Induced pluripotent stem cells represent a new tool to overcome many of the shortcomings of conventional methods, enabling live human neural cell modeling of complex conditions relating to aberrant neurodevelopment, such as schizophrenia, epilepsy and autism as well as age-associated neurodegeneration. This review considers the current status of induced pluripotent stem cell-based modeling of neurological disorders, canvassing proven and putative advantages, current constraints, and future prospects of next-generation culture systems for biomedical research and translation.

  4. Evaluation of the 3d Urban Modelling Capabilities in Geographical Information Systems

    NASA Astrophysics Data System (ADS)

    Dogru, A. O.; Seker, D. Z.

    2010-12-01

    Geographical Information System (GIS) technology, which provides successful solutions to basic spatial problems, is currently widely used in the 3-dimensional (3D) modeling of physical reality with its developing visualization tools. The modeling of large and complicated phenomena is a challenging problem in terms of the computer graphics currently in use. However, it is possible to visualize such phenomena in 3D by using computer systems. 3D models are used in developing computer games, military training, urban planning, tourism, etc. The use of 3D models for the planning and management of urban areas is a very popular issue for city administrations. In this context, 3D city models are produced and used for various purposes. However, the requirements of the models vary depending on the type and scope of the application. While high-level visualization, where photorealistic visualization techniques are widely used, is required for tourism and recreational purposes, an abstract visualization of the physical reality is generally sufficient for the communication of thematic information. The visual variables, which are the principal components of cartographic visualization, such as color, shape, pattern, orientation, size, position, and saturation, are used for communicating the thematic information. These kinds of 3D city models are called abstract models. Standardization of the technologies used for 3D modeling is now available through the use of CityGML. CityGML implements several novel concepts to support interoperability, consistency and functionality. For example, it supports different Levels-of-Detail (LoD), which may arise from independent data collection processes and are used for efficient visualization and efficient data analysis. In one CityGML data set, the same object may be represented in different LoD simultaneously, enabling the analysis and visualization of the same object with regard to different degrees of resolution. Furthermore, two CityGML data sets containing the same object in different LoD may be combined and integrated. In this study, GIS tools used for 3D modeling were examined; in this context, the availability of GIS tools for obtaining the different LoDs of the CityGML standard was evaluated. Additionally, a 3D GIS application covering a small part of the city of Istanbul was implemented for communicating thematic information rather than photorealistic visualization by using a 3D model. An abstract model was created by using the modeling tools of a commercial GIS software package, and the results of the implementation are also presented in the study.
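
    Since CityGML encodes the level of detail in element names such as lod1MultiSurface or lod2Solid, a short script can report which LoDs a data set actually carries. The sketch below is a minimal, namespace-agnostic scan; the input file name is hypothetical.

        import xml.etree.ElementTree as ET

        def lods_per_file(citygml_path):
            """Scan a CityGML file and collect the LoD representations (lod0..lod4)
            present, by matching element local names such as 'lod2Solid'."""
            found = {}
            for _, elem in ET.iterparse(citygml_path):
                local = elem.tag.rsplit('}', 1)[-1]            # strip the XML namespace
                if len(local) > 3 and local.startswith('lod') and local[3].isdigit():
                    found.setdefault('lod' + local[3], set()).add(local)
            return found

        # print(lods_per_file("istanbul_district.gml"))  # hypothetical input file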

  5. Tuberculosis vaccines: barriers and prospects on the quest for a transformative tool.

    PubMed

    Karp, Christopher L; Wilson, Christopher B; Stuart, Lynda M

    2015-03-01

    The road to a more efficacious vaccine that could be a truly transformative tool for decreasing tuberculosis morbidity and mortality, along with Mycobacterium tuberculosis transmission, is quite daunting. Despite this, there are reasons for optimism. Abetted by better conceptual clarity, clear acknowledgment of the degree of our current immunobiological ignorance, the availability of powerful new tools for dissecting the immunopathogenesis of human tuberculosis, the generation of more creative diversity in tuberculosis vaccine concepts, the development of better fit-for-purpose animal models, and the potential of more pragmatic approaches to the clinical testing of vaccine candidates, the field has promise for delivering novel tools for dealing with this worldwide scourge of poverty. © 2015 The Authors. Immunological Reviews Published by John Wiley & Sons Ltd.

  6. Exploring the architectural trade space of NASA's Space Communication and Navigation Program

    NASA Astrophysics Data System (ADS)

    Sanchez, M.; Selva, D.; Cameron, B.; Crawley, E.; Seas, A.; Seery, B.

    NASA's Space Communication and Navigation (SCaN) Program is responsible for providing communication and navigation services to space missions and other users in and beyond low Earth orbit. The current SCaN architecture consists of three independent networks: the Space Network (SN), which contains the TDRS relay satellites in GEO; the Near Earth Network (NEN), which consists of several NASA-owned and commercially operated ground stations; and the Deep Space Network (DSN), with three ground stations in Goldstone, Madrid, and Canberra. The first task of this study is the stakeholder analysis. The goal of the stakeholder analysis is to identify the main stakeholders of the SCaN system and their needs. Twenty-one main groups of stakeholders have been identified and put on a stakeholder map. Their needs are currently being elicited by means of interviews and an extensive literature review. The data will then be analyzed by applying Cameron and Crawley's stakeholder analysis theory, with a view to highlighting dominant and conflicting needs. The second task of this study is the architectural tradespace exploration of the next generation TDRSS. The space of possible architectures for SCaN is represented by a set of architectural decisions, each of which has a discrete set of options. A computational tool is used to automatically synthesize a very large number of possible architectures by enumerating different combinations of decisions and options. The same tool contains models to evaluate the architectures in terms of performance and cost. The performance model uses the stakeholder needs and requirements identified in the previous steps as inputs, and is based on the VASSAR methodology presented in a companion paper. This paper summarizes the current status of the MIT SCaN architecture study. It starts by motivating the need to perform tradespace exploration studies in the context of relay data systems through a description of the history of NASA's space communication networks. It then presents the generalities of possible architectures for future space communication and navigation networks. Finally, it describes the tools and methods being developed, clearly indicating the architectural decisions that have been taken into account as well as the systematic approach followed to model them. The purpose of this study is to explore the SCaN architectural tradespace by means of a computational tool. This paper describes the tool, while the tradespace exploration is underway.
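
    The enumeration step is easy to illustrate: given a dictionary of architectural decisions and their option sets, the Cartesian product yields every candidate architecture, which the evaluation models then score. The decisions below are invented placeholders, not those of the MIT SCaN study.

        from itertools import product

        # Hypothetical architectural decisions and option sets (illustrative only).
        decisions = {
            "relay_orbit":    ["GEO", "MEO"],
            "num_relays":     [2, 3, 4],
            "optical_links":  [False, True],
            "ground_segment": ["NASA-owned", "commercial", "hybrid"],
        }

        def enumerate_architectures(decisions):
            """Yield each combination of options as a decision -> option mapping."""
            names = list(decisions)
            for combo in product(*decisions.values()):
                yield dict(zip(names, combo))

        archs = list(enumerate_architectures(decisions))
        print(len(archs))  # 2 * 3 * 2 * 3 = 36 candidate architectures to score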

  7. Endotracheal intubation: application of virtual reality to emergency medical services education.

    PubMed

    Mayrose, James; Myers, Jeffrey W

    2007-01-01

    Virtual reality simulation has been identified as an emerging educational tool with significant potential to enhance the teaching of residents and students in emergency clinical encounters and procedures. Endotracheal intubation represents a critical procedure for emergency care providers. Current methods of training include working with cadavers and mannequins, which have limitations in their representation of reality, raise ethical concerns, and suffer from limited availability in terms of access, cost, and location of models. This paper presents a human airway simulation model designed for tracheal intubation and discusses the aspects that lend it to use as an educational tool. This realistic and dynamic model is used to teach routine intubations, while future models will include more difficult airway management scenarios. This work provides a solid foundation for future versions of the intubation simulator, which will incorporate two haptic devices to allow for simultaneous control of the laryngoscope blade and endotracheal tube.

  8. Software systems for modeling articulated figures

    NASA Technical Reports Server (NTRS)

    Phillips, Cary B.

    1989-01-01

    Research in computer animation and simulation of human task performance requires sophisticated geometric modeling and user interface tools. The software for a research environment should present the programmer with a powerful but flexible substrate of facilities for displaying and manipulating geometric objects, yet ensure that future tools have a consistent and friendly user interface. Jack is a system which provides a flexible and extensible programmer and user interface for displaying and manipulating complex geometric figures, particularly human figures in a 3D working environment. It is a basic software framework for high-performance Silicon Graphics IRIS workstations for modeling and manipulating geometric objects in a general but powerful way. It provides a consistent and user-friendly interface across various applications in computer animation and simulation of human task performance. Currently, Jack provides input and control for applications including lighting specification and image rendering, anthropometric modeling, figure positioning, inverse kinematics, dynamic simulation, and keyframe animation.

  9. A description of the new 3D electron gun and collector modeling tool: MICHELLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petillo, J.; Mondelli, A.; Krueger, W.

    1999-07-01

    A new 3D finite element gun and collector modeling code is under development at SAIC in collaboration with industrial partners and national laboratories. This development program has been designed specifically to address the shortcomings of current simulation and modeling tools. In particular, although 3D gun codes exist today, their ability to address fine-scale features is somewhat limited in 3D due to the disparate length scales of certain classes of devices. Additionally, features like advanced emission rules, including a thermionic Child's law and comprehensive secondary emission models, also need attention. The program specifically targets problem classes including gridded guns, sheet-beam guns, multi-beam devices, and anisotropic collectors. The presentation will provide an overview of the program objectives, the approach to be taken by the development team, and the status of the project.

  10. Towards an Automated Development Methodology for Dependable Systems with Application to Sensor Networks

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  11. MODELING OF HIGH SPEED FRICTION STIR SPOT WELDING USING A LAGRANGIAN FINITE ELEMENT APPROACH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miles, Michael; Karki, U.; Woodward, C.

    2013-09-03

    Friction stir spot welding (FSSW) has been shown to be capable of joining steels of very high strength, while also being very flexible in terms of controlling the heat of welding and the resulting microstructure of the joint. This makes FSSW a potential alternative to resistance spot welding (RSW) if tool life is sufficiently high, and if machine spindle loads are sufficiently low that the process can be implemented on an industrial robot. Robots for spot welding can typically sustain vertical loads of about 8 kN, but FSSW at tool speeds of less than 3000 rpm causes loads that are too high, in the range of 11-14 kN. Therefore, in the current work tool speeds of 3000 rpm and higher were employed, in order to generate heat more quickly and to reduce welding loads to acceptable levels. The FSSW process was modeled using a finite element approach with the Forge® software package. An updated Lagrangian scheme with explicit time integration was employed to model the flow of the sheet material, subjected to boundary conditions of a rotating tool and a fixed backing plate [3]. The modeling approach can be described as two-dimensional, axisymmetric, but with an aspect of three dimensions in terms of thermal boundary conditions. Material flow was calculated from a velocity field which was two-dimensional, but heat generated by friction was computed using a virtual rotational velocity component from the tool surface. An isotropic, viscoplastic Norton-Hoff law was used to model the evolution of material flow stress as a function of strain, strain rate, and temperature. The model predicted welding temperatures and the movement of the joint interface with reasonable accuracy for the welding of a dual phase 980 steel.
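
    For orientation, a Norton-Hoff-type flow stress can be sketched as below; the consistency, temperature-sensitivity and rate-sensitivity constants are illustrative placeholders, not the DP980 parameters fitted in the paper.

        import math

        def norton_hoff_flow_stress(strain_rate, T, K0=1.0e3, beta=3000.0, m=0.12):
            """Equivalent flow stress for a Norton-Hoff viscoplastic law,
            sigma_bar = sqrt(3)**(m+1) * K(T) * strain_rate**m,
            with a simple exponential temperature sensitivity K(T) = K0*exp(beta/T).
            All constants here are placeholders for illustration."""
            K = K0 * math.exp(beta / T)
            return math.sqrt(3.0) ** (m + 1.0) * K * strain_rate ** m

        # Flow stress falls with temperature and rises weakly with strain rate:
        for T in (900.0, 1100.0, 1300.0):  # kelvin
            print(T, round(norton_hoff_flow_stress(10.0, T), 1))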

  12. Novel Door-opening Method for Six-legged Robots Based on Only Force Sensing

    NASA Astrophysics Data System (ADS)

    Chen, Zhi-Jun; Gao, Feng; Pan, Yang

    2017-09-01

    Current door-opening methods are mainly developed on tracked, wheeled and biped robots by applying multi-DOF manipulators and vision systems. However, door-opening methods for six-legged robots are seldom studied, especially those using 0-DOF tools to operate and only force sensing to detect. A novel door-opening method for six-legged robots is developed and implemented on the six-parallel-legged robot. The kinematic model of the six-parallel-legged robot is established, and a model for measuring the positional relationship between the robot and the door is proposed. The measurement model is based entirely on force sensing. The real-time trajectory planning method and the control strategy are designed. The trajectory planning method allows the maximum angle between the sagittal axis of the robot body and the normal line of the door plane to be 45°. A 0-DOF tool mounted on the robot body is used to operate the door. By integrating with the body, the tool has 6 DOFs and enough workspace to operate. The loose grasp achieved by the tool helps release the inner force in the tool. Experiments are carried out to validate the method. The results show that the method is effective and robust in opening doors wider than 1 m. This paper proposes a novel door-opening method for six-legged robots, which notably uses a 0-DOF tool and only force sensing to detect and open the door.

  13. Model Development for EHR Interdisciplinary Information Exchange of ICU Common Goals

    PubMed Central

    Collins, Sarah A.; Bakken, Suzanne; Vawdrey, David K.; Coiera, Enrico; Currie, Leanne

    2010-01-01

    Purpose Effective interdisciplinary exchange of patient information is an essential component of safe, efficient, and patient–centered care in the intensive care unit (ICU). Frequent handoffs of patient care, high acuity of patient illness, and the increasing amount of available data complicate information exchange. Verbal communication can be affected by interruptions and time limitations. To supplement verbal communication, many ICUs rely on documentation in electronic health records (EHRs) to reduce errors of omission and information loss. The purpose of this study was to develop a model of EHR interdisciplinary information exchange of ICU common goals. Methods The theoretical frameworks of distributed cognition and the clinical communication space were integrated and a previously published categorization of verbal information exchange was used. 59.5 hours of interdisciplinary rounds in a Neurovascular ICU were observed and five interviews and one focus group with ICU nurses and physicians were conducted. Results Current documentation tools in the ICU were not sufficient to capture the nurses' and physicians' collaborative decision-making and verbal communication of goal-directed actions and interactions. Clinicians perceived the EHR to be inefficient for information retrieval, leading to a further reliance on verbal information exchange. Conclusion The model suggests that EHRs should support: 1) Information tools for the explicit documentation of goals, interventions, and assessments with synthesized and summarized information outputs of events and updates; and 2) Messaging tools that support collaborative decision-making and patient safety double checks that currently occur between nurses and physicians in the absence of EHR support. PMID:20974549

  14. Computational Modeling in Liver Surgery

    PubMed Central

    Christ, Bruno; Dahmen, Uta; Herrmann, Karl-Heinz; König, Matthias; Reichenbach, Jürgen R.; Ricken, Tim; Schleicher, Jana; Ole Schwen, Lars; Vlaic, Sebastian; Waschinsky, Navina

    2017-01-01

    The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery. PMID:29249974

  15. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and to predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain-specific event analysis abstractions. The prototype system defines events covering reports of natural phenomena such as solar flares, bursts, geomagnetic storms, and five others pertinent to space environmental analysis. With our preliminary event definitions we experimented with TAS's support for temporal pattern analysis using X-ray flare and geomagnetic storm forecasts as case studies. We are currently working on a framework for integrating advanced graphics and space environmental models into this analytical environment.

  16. Reduced Current Spread by Concentric Electrodes in Transcranial Electrical Stimulation (tES).

    PubMed

    Bortoletto, M; Rodella, C; Salvador, R; Miranda, P C; Miniussi, C

    2016-01-01

    We propose the use of a new montage for transcranial direct current stimulation (tDCS), called concentric electrodes tDCS (CE-tDCS), involving two concentric round electrodes that may improve stimulation focality. To test efficacy and focality of CE-tDCS, we modelled the current distribution and tested physiological effects on cortical excitability. Motor evoked potentials (MEPs) from first dorsal interosseous (FDI) and abductor digiti minimi (ADM) were recorded before and after the delivery of anodal, cathodal and sham stimulation on the FDI hotspot for 10 minutes. MEP amplitude of FDI increased after anodal-tDCS and decreased after cathodal-tDCS, supporting the efficacy of CE-tDCS in modulating cortical excitability. Moreover, modelled current distribution and no significant effects of stimulation on MEP amplitude of ADM suggest high focality of CE-tDCS. CE-tDCS may allow a better control of current distribution and may represent a novel tool for applying tDCS and other transcranial current stimulation approaches. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Extended behavioural device modelling and circuit simulation with Qucs-S

    NASA Astrophysics Data System (ADS)

    Brinson, M. E.; Kuznetsov, V.

    2018-03-01

    Current trends in circuit simulation suggest a growing interest in open source software that allows access to more than one simulation engine while simultaneously supporting schematic drawing tools, behavioural Verilog-A and XSPICE component modelling, and output data post-processing. This article introduces a number of new features recently implemented in the 'Quite universal circuit simulator - SPICE variant' (Qucs-S), including its structure and fundamental schematic capture algorithms, at the same time highlighting their use in behavioural semiconductor device modelling. Particular importance is placed on the interaction between Qucs-S schematics, equation-defined devices, SPICE B behavioural sources and hardware description language (HDL) scripts. The multi-simulator version of Qucs is a freely available tool that offers extended modelling and simulation features compared to those provided by legacy circuit simulators. The performance of a number of Qucs-S modelling extensions is demonstrated with a GaN HEMT compact device model and data obtained from tests using the Qucs-S/Ngspice/Xyce/SPICE OPUS multi-engine circuit simulator.

  18. Assessment of Technologies for the Space Shuttle External Tank Thermal Protection System and Recommendations for Technology Improvement. Part 2; Structural Analysis Technologies and Modeling Practices

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Nemeth, Michael P.; Hilburger, Mark W.

    2004-01-01

    A technology review and assessment of modeling and analysis efforts underway in support of a safe return to flight of the thermal protection system (TPS) for the Space Shuttle external tank (ET) are summarized. This review and assessment effort focuses on the structural modeling and analysis practices employed for ET TPS foam design and analysis and on identifying analysis capabilities needed in the short-term and long-term. The current understanding of the relationship between complex flight environments and ET TPS foam failure modes is reviewed as it relates to modeling and analysis. A literature review on modeling and analysis of TPS foam material systems is also presented. Finally, a review of modeling and analysis tools employed in the Space Shuttle Program is presented for the ET TPS acreage and close-out foam regions. This review includes existing simplified engineering analysis tools as well as finite element analysis procedures.

  19. Computational Modeling in Structural Materials Processing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    High temperature materials such as silicon carbide, a variety of nitrides, and ceramic matrix composites find use in aerospace, automotive, and machine tool industries and in high speed civil transport applications. Chemical vapor deposition (CVD) is widely used in processing such structural materials. Variations of CVD include deposition on substrates, coating of fibers, deposition inside cavities and on complex objects, and infiltration within preforms, called chemical vapor infiltration (CVI). Our current knowledge of the process mechanisms, ability to optimize processes, and capability to scale up for large scale manufacturing is limited. In this regard, computational modeling of the processes is valuable, since a validated model can be used as a design tool. The effort is similar to traditional chemically reacting flow modeling, with emphasis on multicomponent diffusion, thermal diffusion, large sets of homogeneous reactions, and surface chemistry. In the case of CVI, models for pore infiltration are needed. In the present talk, examples of SiC, nitride, and boron deposition from the author's past work will be used to illustrate the utility of computational process modeling.
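
    In the reaction-limited CVD regime, deposition kinetics are often summarized by an Arrhenius rate constant k = A*exp(-Ea/(R*T)). The pre-exponential factor and activation energy below are placeholders, not fitted SiC or nitride data.

        import math

        R = 8.314  # gas constant, J/(mol*K)

        def arrhenius_rate(T, A=1.0e6, Ea=1.9e5):
            """Surface-reaction-limited rate constant k = A*exp(-Ea/(R*T));
            A and Ea are illustrative placeholders."""
            return A * math.exp(-Ea / (R * T))

        # In this regime the rate roughly doubles over a few tens of kelvin:
        for T in (1200.0, 1300.0, 1400.0):
            print(T, f"{arrhenius_rate(T):.3e}")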

  20. Microsurgery Training for the Twenty-First Century

    PubMed Central

    Myers, Simon Richard; Froschauer, Stefan; Akelina, Yelena; Tos, Pierluigi; Kim, Jeong Tae

    2013-01-01

    Current educational interventions and training courses in microsurgery are often predicated on theories of skill acquisition and development that follow a 'practice makes perfect' model. Given the changing landscape of surgical training and advances in educational theories related to skill development, research is needed to assess current training tools in microsurgery education and devise alternative methods that would enhance training. Simulation is an increasingly important tool for educators because, whilst facilitating improved technical proficiency, it provides a way to reduce risks to both trainees and patients. The International Microsurgery Simulation Society was founded in 2012 in order to consolidate the global effort in promoting excellence in microsurgical training. The society's aim of achieving standardisation of microsurgical training worldwide could be realised through the development of evidence-based educational interventions and the sharing of best practices. PMID:23898422

  1. Additive Manufacturing of Tooling for Refrigeration Cabinet Foaming Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Post, Brian K; Nuttall, David; Cukier, Michael

    The primary objective of this project was to leverage the Big Area Additive Manufacturing (BAAM) process and materials into a long-term, quick-change tooling concept to drastically reduce product lead and development timelines and costs. Current refrigeration foam molds are complicated to manufacture, involving casting several aluminum parts in an approximate shape, machining components of the molds, and post-fitting and shimming of the parts in an articulated fixture. The total process timeline can take over 6 months. The foaming process is slower than required for production; therefore, multiple fixtures, 10 to 27, are required per refrigerator model. Molds are particular to a specific product configuration, making mixed model assembly challenging for sequencing, mold changes or auto changeover features. The initial goal was to create a tool leveraging the ORNL materials and additive process to build a tool in 4 to 6 weeks or less. A secondary goal was to create common fixture cores and provide lightweight fixture sections that could be revised in a very short time to increase equipment flexibility, reduce lead times, lower the barriers to first production trials, and reduce tooling costs.

  2. Using multi-scale entropy and principal component analysis to monitor gears degradation via the motor current signature analysis

    NASA Astrophysics Data System (ADS)

    Aouabdi, Salim; Taibi, Mahmoud; Bouras, Slimane; Boutasseta, Nadir

    2017-06-01

    This paper describes an approach for identifying localized gear tooth defects, such as pitting, using phase currents measured from an induction machine driving the gearbox. A new anomaly-detection tool is presented, based on the multi-scale entropy (MSE) algorithm SampEn, which allows correlations in signals to be identified over multiple time scales. The approach applies motor current signature analysis (MCSA) in conjunction with principal component analysis (PCA), comparing observed values with those predicted from a model built using nominally healthy data. Simulation results show that the proposed method is able to detect gear tooth pitting in current signals.
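
    A compact sketch of the MSE computation (coarse-graining followed by sample entropy at each scale) is given below, assuming the common SampEn conventions of Chebyshev distance and tolerance r equal to 0.2 times the signal's standard deviation; it is a generic implementation, not the authors' code.

        import numpy as np

        def sample_entropy(x, m=2, r_frac=0.2):
            """SampEn(m, r) = -ln(A/B), where B and A count template matches of
            length m and m+1 (Chebyshev distance <= r, self-matches excluded)."""
            x = np.asarray(x, float)
            r = r_frac * x.std()

            def match_count(mm):
                templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
                d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
                return np.sum(d <= r) - len(templ)   # exclude self-matches

            B, A = match_count(m), match_count(m + 1)
            return -np.log(A / B) if A > 0 and B > 0 else np.inf

        def multiscale_entropy(x, scales=range(1, 6)):
            """Coarse-grain the signal at each scale, then compute SampEn of each."""
            x = np.asarray(x, float)
            return [sample_entropy(x[: len(x) // s * s].reshape(-1, s).mean(axis=1))
                    for s in scales]

        rng = np.random.default_rng(1)
        print(multiscale_entropy(rng.standard_normal(1000)))  # white noise: falls with scale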

  3. 6 DOF Nonlinear AUV Simulation Toolbox

    DTIC Science & Technology

    1997-01-01

    is to supply a flexible 3D-simulation platform for motion visualization, in-lab debugging and testing of mission-specific strategies as well as those...Explorer are modularly designed [Smith] in order to cut time and cost for vehicle reconfiguration. A flexible 3D-simulation platform is desired to... 3D models. Currently implemented modules include a nonlinear dynamic model for the OEX, shared memory and semaphore manager tools, shared memory monitor

  4. On-Line Tool for the Assessment of Radiation in Space - Deep Space Mission Enhancements

    NASA Technical Reports Server (NTRS)

    Sandridge, Chris A.; Blattnig, Steve R.; Norman, Ryan B.; Slaba, Tony C.; Walker, Steve A.; Spangler, Jan L.

    2011-01-01

    The On-Line Tool for the Assessment of Radiation in Space (OLTARIS, https://oltaris.nasa.gov) is a web-based set of tools and models that allows engineers and scientists to assess the effects of space radiation on spacecraft, habitats, rovers, and spacesuits. The site is intended to be a design tool for those studying the effects of space radiation for current and future missions as well as a research tool for those developing advanced material and shielding concepts. The tools and models are built around the HZETRN radiation transport code and are primarily focused on human- and electronic-related responses. The focus of this paper is to highlight new capabilities that have been added to support deep space (outside Low Earth Orbit) missions. Specifically, the electron, proton, and heavy ion design environments for the Europa mission have been incorporated along with an efficient coupled electron-photon transport capability to enable the analysis of complicated geometries and slabs exposed to these environments. In addition, a neutron albedo lunar surface environment was also added, that will be of value for the analysis of surface habitats. These updates will be discussed in terms of their implementation and on how OLTARIS can be used by instrument vendors, mission designers, and researchers to analyze their specific requirements.

  5. Evaluating an immersive virtual environment prototyping and simulation system

    NASA Astrophysics Data System (ADS)

    Nemire, Kenneth

    1997-05-01

    An immersive virtual environment (IVE) modeling and simulation tool is being developed for designing advanced weapon and training systems. One unique feature of the tool is that the design itself, and not just visualization of the design, is accomplished with the IVE tool. Acceptance of IVE tools requires comparisons with current commercial applications. In this pilot study, expert users of a popular desktop 3D graphics application performed identical modeling and simulation tasks using both the desktop and IVE applications. The IVE tool consisted of a head-mounted display, 3D spatialized sound, spatial trackers on head and hands, instrumented gloves, and a simulated speech recognition system. The results are preliminary because performance from only four users has been examined. When using the IVE system, users completed the tasks to criteria in less time than when using the desktop application. Subjective ratings of the visual displays in each system were similar. Ratings for the desktop controls were higher than for the IVE controls. Ratings of immersion and user enjoyment were higher for the IVE than for the desktop application. These results are particularly remarkable because participants had used the desktop application regularly for three to five years and the prototype IVE tool for only three to six hours.

  6. Genome-wide prediction models that incorporate de novo GWAS are a powerful new tool for tropical rice improvement

    USDA-ARS?s Scientific Manuscript database

    To address the multiple challenges to food security posed by global climate change, population growth and rising incomes, plant breeders are developing new crop varieties that can enhance both agricultural productivity and environmental sustainability. Current breeding practices, however, are unable...

  7. Utilizing Peer Observation as a Professional Development Tool to Learn in Context

    ERIC Educational Resources Information Center

    Hirsch, Linda J.

    2011-01-01

    De-contextualized professional development is the common route taken by school districts to address pedagogical skills and to effect change within an educational organization. Research suggests that the current process of professional development activities is limited, if not ineffective. Research shows that another model of professional…

  8. Mentoring in the Art Classroom

    ERIC Educational Resources Information Center

    Green, Denise; Mitchell, Timothy; Taylor, Patrick

    2011-01-01

    Mentoring in classrooms allows teachers the opportunity to be motivational tools in the lives of students while operating as role models. The current research shows that mentoring in the art classroom provides stimulation and the momentum to students who are less motivated with creative assignments. The first part of this study looks at the…

  9. Understanding the Effects of Infrastructure Changes on Subpopulations: Survey of Current Methods, Models, and Tools

    DTIC Science & Technology

    2016-04-01

    key leaders, government services, and businesses, while the cultural/historical/religious focuses on specific cultural sites. These cards were...labeled "intensive," because these impacts were severe but localized (Figure 21). Environmental impacts (such as raw sewage dumping following

  10. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR LANDSCAPE ASSESSMENT AND WATERSHED MANAGEMENT

    EPA Science Inventory

    The assessment of land use and land cover is an extremely important activity for contemporary land management. A large body of current literature suggests that human land-use practice is the most important factor influencing natural resource management and environmental condition...

  11. A flexible system for the estimation of infiltration and hydraulic resistance parameters in surface irrigation

    USDA-ARS?s Scientific Manuscript database

    Critical to the use of modeling tools for the hydraulic analysis of surface irrigation systems is characterizing the infiltration and hydraulic resistance process. Since those processes are still not well understood, various formulations are currently used to represent them. A software component h...

  12. Artificial neural network models: A decision support tool for enhancing seedling selection in sugarcane

    USDA-ARS?s Scientific Manuscript database

    Currently, sugarcane selection begins at the seedling stage with visual selection for cane yield and other yield-related traits. Although subjective and inefficient, visual selection remains the primary method for selection. Visual selection is inefficient because of the confounding effect of genoty...

  13. Visualization of Learning Scenarios with UML4LD

    ERIC Educational Resources Information Center

    Laforcade, Pierre

    2007-01-01

    Present Educational Modelling Languages are used to formally specify abstract learning scenarios in a machine-interpretable format. Current tooling does not provide teachers/designers with graphical facilities to help them reuse existing scenarios. They need human-readable representations. This paper discusses the UML4LD experimental…

  14. Improving toxicity extrapolation using molecular sequence similarity: A case study of pyrethroids and the sodium ion channel

    EPA Science Inventory

    A significant challenge in ecotoxicology has been determining chemical hazards to species with limited or no toxicity data. Currently, extrapolation tools like U.S. EPA’s Web-based Interspecies Correlation Estimation (Web-ICE; www3.epa.gov/webice) models categorize toxicity...

  15. Besnoitia besnoiti lytic cycle in vitro and differences in invasion and intracellular proliferation among isolates

    USDA-ARS?s Scientific Manuscript database

    Background: Bovine besnoitiosis, caused by the protozoan Besnoitia besnoiti, reduces productivity and fertility of affected herds. Besnoitiosis continues to expand in Europe and no effective control tools are currently available. Experimental models are urgently needed. Herein, we describe for the f...

  16. New trends in species distribution modelling

    USGS Publications Warehouse

    Zimmermann, Niklaus E.; Edwards, Thomas C.; Graham, Catherine H.; Pearman, Peter B.; Svenning, Jens-Christian

    2010-01-01

    Species distribution modelling has its origin in the late 1970s when computing capacity was limited. Early work in the field concentrated mostly on the development of methods to model effectively the shape of a species' response to environmental gradients (Austin 1987, Austin et al. 1990). The methodology and its framework were summarized in reviews 10–15 yr ago (Franklin 1995, Guisan and Zimmermann 2000), and these syntheses are still widely used as reference landmarks in the current distribution modelling literature. However, enormous advancements have occurred over the last decade, with hundreds – if not thousands – of publications on species distribution model (SDM) methodologies and their application to a broad set of conservation, ecological and evolutionary questions. With this special issue, originating from the third of a set of specialized SDM workshops (2008 Riederalp) entitled 'The Utility of Species Distribution Models as Tools for Conservation Ecology', we reflect on current trends and the progress achieved over the last decade.

  17. Unconventional Tools for an Unconventional Resource: Community and Landscape Planning for Shale in the Marcellus Region

    NASA Astrophysics Data System (ADS)

    Murtha, T., Jr.; Orland, B.; Goldberg, L.; Hammond, R.

    2014-12-01

    Deep shale natural gas deposits made accessible by new technologies are quickly becoming a considerable share of North America's energy portfolio. Unlike traditional deposits and extraction footprints, shale gas presents dispersed and complex landscape and community challenges. These challenges are both cultural and environmental. This paper describes the development and application of creative geospatial tools as a means to engage communities in the northern tier counties of Pennsylvania, which are experiencing Marcellus shale drilling, in design and planning. Uniquely combining physical landscape models with predictive models of exploration activities, including drilling, pipeline construction and road reconstruction, the tools quantify the potential impacts of drilling activities for communities and landscapes in the Commonwealth of Pennsylvania. Dividing the state into 9836 watershed sub-basins, we first describe the current state of Marcellus-related activities through 2014. We then describe and report the results of three scaled predictive models designed to investigate probable sub-basins where future activities will be focused. Finally, the core of the paper reports on the second level of tools we have now developed to engage communities in planning for unconventional gas extraction in Pennsylvania. Using a geodesign approach, we are working with communities to transfer information for comprehensive landscape planning and informed decision making. These tools not only quantify physical landscape impacts, but also quantify potential visual, aesthetic and cultural resource implications.

  18. Modeling the milling tool wear by using an evolutionary SVM-based model from milling runs experimental data

    NASA Astrophysics Data System (ADS)

    Nieto, Paulino José García; García-Gonzalo, Esperanza; Vilán, José Antonio Vilán; Robleda, Abraham Segade

    2015-12-01

    The main aim of this research work is to build a new practical hybrid regression model to predict milling tool wear in a regular cut, as well as the entry and exit cuts, of a milling tool. The model was based on Particle Swarm Optimization (PSO) in combination with support vector machines (SVMs). This optimization mechanism involved setting the kernel parameters in the SVM training procedure, which significantly influences the regression accuracy. Bearing this in mind, a PSO-SVM-based model, grounded in statistical learning theory, was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. To accomplish the objective of this study, the experimental dataset represents experiments from runs on a milling machine under various operating conditions. In this way, data sampled by three different types of sensors (acoustic emission sensor, vibration sensor and current sensor) were acquired at several positions. A second aim is to determine the factors with the greatest bearing on the milling tool flank wear, with a view to proposing improvements to the milling machine. Firstly, this hybrid PSO-SVM-based regression model captures the main ideas of statistical learning theory in order to obtain a good prediction of the dependence between the flank wear (output variable) and the input variables (time, depth of cut, feed, etc.). Indeed, regression with optimal hyperparameters was performed and a coefficient of determination of 0.95 was obtained. The agreement of this model with experimental data confirmed its good performance. Secondly, the main advantages of this PSO-SVM-based model are its capacity to produce a simple, easy-to-interpret model, its ability to estimate the contributions of the input variables, and its computational efficiency. Finally, the main conclusions of this study are presented.
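
    The hybrid scheme can be sketched as a small particle swarm searching SVR hyperparameters against cross-validated accuracy, as below. The search bounds, swarm constants, and synthetic data are illustrative assumptions, not the paper's experimental setup.

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.model_selection import cross_val_score

        def pso_svr(X, y, n_particles=10, iters=20, seed=0):
            """Tiny PSO over (log10 C, log10 gamma), maximizing cross-validated R^2."""
            rng = np.random.default_rng(seed)
            lo, hi = np.array([-2.0, -4.0]), np.array([3.0, 1.0])   # illustrative bounds
            pos = rng.uniform(lo, hi, (n_particles, 2))
            vel = np.zeros_like(pos)

            def fitness(p):
                return cross_val_score(SVR(C=10 ** p[0], gamma=10 ** p[1]),
                                       X, y, cv=3, scoring="r2").mean()

            pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
            gbest = pbest[pbest_f.argmax()].copy()
            for _ in range(iters):
                r1, r2 = rng.random((2, n_particles, 1))
                vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
                pos = np.clip(pos + vel, lo, hi)
                f = np.array([fitness(p) for p in pos])
                better = f > pbest_f
                pbest[better], pbest_f[better] = pos[better], f[better]
                gbest = pbest[pbest_f.argmax()].copy()
            return SVR(C=10 ** gbest[0], gamma=10 ** gbest[1]).fit(X, y)

        # Synthetic stand-in for (time, depth of cut, feed) -> flank wear:
        rng = np.random.default_rng(2)
        X = rng.random((80, 3))
        y = 0.3 * X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.05 * rng.standard_normal(80)
        model = pso_svr(X, y)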

  19. The cardiovascular event reduction tool (CERT)--a simplified cardiac risk prediction model developed from the West of Scotland Coronary Prevention Study (WOSCOPS).

    PubMed

    L'Italien, G; Ford, I; Norrie, J; LaPuerta, P; Ehreth, J; Jackson, J; Shepherd, J

    2000-03-15

    The clinical decision to treat hypercholesterolemia is premised on an awareness of patient risk, and cardiac risk prediction models offer a practical means of determining such risk. However, these models are based on observational cohorts where estimates of the treatment benefit are largely inferred. The West of Scotland Coronary Prevention Study (WOSCOPS) provides an opportunity to develop a risk-benefit prediction model from the actual observed primary event reduction seen in the trial. Five-year Cox model risk estimates were derived from all WOSCOPS subjects (n = 6,595 men, aged 45 to 64 years at baseline) using factors previously shown to be predictive of definite fatal coronary heart disease or nonfatal myocardial infarction. Model risk factors included age, diastolic blood pressure, total cholesterol/high-density lipoprotein ratio (TC/HDL), current smoking, diabetes, family history of fatal coronary heart disease, nitrate use or angina, and treatment (placebo/40-mg pravastatin). All risk factors were expressed as categorical variables to facilitate risk assessment. Risk estimates were incorporated into a simple, hand-held slide rule or risk tool. Risk estimates were identified for 5-year age bands (45 to 65 years), 4 categories of TC/HDL ratio (<5.5, 5.5 to <6.5, 6.5 to <7.5, > or = 7.5), 2 levels of diastolic blood pressure (<90, > or = 90 mm Hg), from 0 to 3 additional risk factors (current smoking, diabetes, family history of premature fatal coronary heart disease, nitrate use or angina), and pravastatin treatment. Five-year risk estimates ranged from 2% in very low-risk subjects to 61% in very high-risk subjects. Risk reduction due to pravastatin treatment averaged 31%. Thus, the Cardiovascular Event Reduction Tool (CERT) is a risk prediction model derived from the WOSCOPS trial. Its use will help physicians identify patients who will benefit from cholesterol reduction.
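
    The general form of such a tool is easy to sketch: a Cox model turns categorical risk factors into a 5-year event probability via 1 - S0(5)**exp(beta . x). The baseline survival and coefficients below are invented for illustration (only the roughly 31% pravastatin hazard reduction echoes the abstract) and must not be used for actual risk assessment.

        import math

        def five_year_risk(x, betas, s0=0.95):
            """Cox-model 5-year event risk: 1 - S0(5)**exp(beta . x).
            s0 and betas are illustrative placeholders, not WOSCOPS estimates."""
            lp = sum(betas[k] * v for k, v in x.items())
            return 1.0 - s0 ** math.exp(lp)

        betas = {"age_55_59": 0.4, "tc_hdl_ge_7.5": 0.6, "dbp_ge_90": 0.3,
                 "smoker": 0.5, "pravastatin": -0.37}  # exp(-0.37) ~ 0.69, i.e. ~31% lower hazard
        patient = {"age_55_59": 1, "tc_hdl_ge_7.5": 1, "dbp_ge_90": 0,
                   "smoker": 1, "pravastatin": 1}
        print(round(five_year_risk(patient, betas), 3))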

  20. Risk stratification following acute myocardial infarction.

    PubMed

    Singh, Mandeep

    2007-07-01

    This article reviews the current risk assessment models available for patients presenting with myocardial infarction (MI). These practical tools enhance the health care provider's ability to rapidly and accurately assess patient risk from the event or revascularization therapy, and are of paramount importance in managing patients presenting with MI. This article highlights the models used for ST-elevation MI (STEMI) and non-ST elevation MI (NSTEMI) and provides an additional description of models used to assess risks after primary angioplasty (ie, angioplasty performed for STEMI).

  1. Tools and resources for neuroanatomy education: a systematic review.

    PubMed

    Arantes, M; Arantes, J; Ferreira, M A

    2018-05-03

    The aim of this review was to identify studies exploring neuroanatomy teaching tools and their impact on learning, as a basis towards the implementation of a neuroanatomy program in the context of a curricular reform in medical education. Computer-assisted searches were conducted through March 2017 in the PubMed, Web of Science, Medline, Current Contents Connect, KCI and Scielo Citation Index databases. Four sets of keywords were used, combining "neuroanatomy" with "education", "teaching", "learning" and "student*". Studies were reviewed independently by two readers, and the data collected were confirmed by a third reader. Of the 214 studies identified, 29 reported data on the impact of using specific neuroanatomy teaching tools. Most of them (83%) were published in the last 8 years, and most were conducted in the United States of America (65.52%). Regarding the participants, medical students were the most studied sample (37.93%), and the majority of the studies (65.52%) had fewer than 100 participants. Approximately half of the studies included in this review used digital teaching tools (e.g., 3D computer neuroanatomy models), whereas the remaining used non-digital learning tools (e.g., 3D physical models). Our work highlights the growing interest in the study of neuroanatomy teaching tools in recent years, as evidenced by the number of publications, and highlights the need to consider new tools that keep pace with technological development in medical education.

  2. A systematic review on in vitro 3D bone metastases models: A new horizon to recapitulate the native clinical scenario?

    PubMed

    Salamanna, Francesca; Contartese, Deyanira; Maglio, Melania; Fini, Milena

    2016-07-12

    While the skeleton is not the only organ where metastasis can occur, it is one of the preferred sites, with a significant impact on patients' quality of life. With the aim of delineating the cellular and molecular mechanisms of bone metastasis, numerous studies have been employed to identify any contributing factors that trigger cancer progression. One of the major limitations of studying cancer-bone metastasis is the multifaceted nature of the native bone environment and the lack of reliable, simple, and inexpensive models that strictly mimic the biological processes occurring in vivo, allowing a correct translation of results. Currently, with the growing acceptance of in vitro models as effective tools for studying cancer biology, three-dimensional (3D) models have emerged as a compromise between two-dimensional cultures of isolated cancer cells and the complexity of human cancer xenografts in immunocompromised animal hosts. This descriptive systematic literature review summarizes the current status of advanced and alternative 3D in vitro bone metastasis models. We have also reviewed the strategies employed by researchers to set up these models, with special reference to recent promising developments trying to better replicate the complexity and heterogeneity of a human metastasis in situ, with an outlook on their use in medicine. All these aspects will greatly contribute to the existing knowledge on bone metastases, providing a specific link to clinical scenarios and thus making 3D in vitro bone metastasis models an attractive tool for multidisciplinary experts.

  3. Space Mission Human Reliability Analysis (HRA) Project

    NASA Technical Reports Server (NTRS)

    Boyer, Roger

    2014-01-01

    The purpose of the Space Mission Human Reliability Analysis (HRA) Project is to extend current ground-based HRA risk prediction techniques to a long-duration, space-based tool. Ground-based HRA methodology has been shown to be a reasonable tool for short-duration space missions, such as Space Shuttle flights and lunar fly-bys. However, longer-duration deep-space missions, such as asteroid and Mars missions, will require the crew to be in space for 400 to 900 days, with periods of extended autonomy and self-sufficiency. Current indications are that higher risk due to fatigue, physiological effects of extended low-gravity environments, and other factors may impact HRA predictions. For this project, Safety & Mission Assurance (S&MA) will work with Human Health & Performance (HH&P) to establish what is currently used to assess human reliability for human space programs, identify human performance factors that may be sensitive to long-duration space flight, collect available historical data, and update current tools to account for performance shaping factors believed to be important to such missions. This effort will also contribute data to the Human Performance Data Repository and influence the Space Human Factors Engineering research risks and gaps (part of the HRP Program). An accurate risk predictor mitigates Loss of Crew (LOC) and Loss of Mission (LOM). The end result will be an updated HRA model that can effectively predict risk on long-duration missions.
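
    Ground-based HRA methods of the kind being extended here (e.g., SPAR-H) commonly scale a nominal human error probability by performance-shaping-factor multipliers. A minimal sketch of that pattern follows; the multiplier values are illustrative assumptions, not calibrated spaceflight data:

```python
# Sketch of a PSF-adjusted human error probability (HEP), in the spirit
# of SPAR-H style methods. All multiplier values are illustrative
# placeholders, not calibrated data for long-duration missions.

NOMINAL_HEP = 1e-3  # hypothetical nominal error probability per task

psf_multipliers = {
    "fatigue": 5.0,               # assumed degradation from sleep loss
    "low_gravity_physiology": 2.0,
    "extended_autonomy": 3.0,     # no real-time ground support
}

def adjusted_hep(nominal: float, psfs: dict[str, float]) -> float:
    hep = nominal
    for factor in psfs.values():
        hep *= factor
    return min(hep, 1.0)  # a probability cannot exceed 1

print(f"{adjusted_hep(NOMINAL_HEP, psf_multipliers):.2e}")  # 3.00e-02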

  4. Enhancement of Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various user communities, including energy, health, and others. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level, such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence to connect human and computer perceptions of how data and scientific techniques should be applied, and to handle multiple simultaneous users' tasks. Future development includes expanding the types of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus allow climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System Reanalysis (CFSR) data and NOAA model output, including output from the National Multi Model Ensemble Prediction System (NMME) and longer-term projection models. We will also describe plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to avoid redundancy in the development of tools that facilitate scientific advances and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open-sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).
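
    As a minimal example of the simplest analysis LCAT exposes, a least-squares trend can be fit to an annual station series; the data below are synthetic:

```python
# Least-squares linear trend on an annual temperature series, the
# simplest of the local analyses LCAT offers. Data here are synthetic.
import numpy as np

years = np.arange(1980, 2011)
noise = np.random.default_rng(0).normal(0.0, 0.3, years.size)
temps = 12.0 + 0.02 * (years - 1980) + noise  # degC, synthetic station

slope, intercept = np.polyfit(years, temps, deg=1)
print(f"trend: {slope * 10:.2f} degC per decade")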

  5. Framework Development Supporting the Safety Portal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prescott, Steven Ralph; Kvarfordt, Kellie Jean; Vang, Leng

    2015-07-01

    In a collaborative scientific research arena, it is important to have an environment where analysts have access to a shared repository of information, documents, and software tools, and can accurately maintain and track historical changes in models. The new Safety Portal cloud-based environment will be accessible remotely from anywhere, regardless of computing platform, provided the platform has Internet access and adequate browser capabilities. Information stored in this environment is restricted based on users' assigned credentials. This report discusses the current development of a cloud-based web portal for PRA tools.

  6. Aviation Safety Program Atmospheric Environment Safety Technologies (AEST) Project

    NASA Technical Reports Server (NTRS)

    Colantonio, Ron

    2011-01-01

    Engine Icing Characterization and Simulation Capability: Develop knowledge bases, analysis methods, and simulation tools needed to address the problem of engine icing, in particular ice-crystal icing.

    Airframe Icing Simulation and Engineering Tool Capability: Develop and demonstrate a 3-D capability to simulate and model airframe ice accretion and related aerodynamic performance degradation for current and future aircraft configurations in an expanded icing environment that includes freezing drizzle/rain.

    Atmospheric Hazard Sensing and Mitigation Technology Capability: Improve and expand remote sensing and mitigation of hazardous atmospheric environments and phenomena.

  7. Understanding Kidney Disease: Toward the Integration of Regulatory Networks Across Species

    PubMed Central

    Ju, Wenjun; Brosius, Frank C.

    2010-01-01

    Animal models have long been useful in investigating both normal and abnormal human physiology. Systems biology provides a relatively new set of approaches to identify similarities and differences between animal models and humans that may lead to a more comprehensive understanding of human kidney pathophysiology. In this review, we briefly describe how genome-wide analyses of mouse models have helped elucidate features of human kidney diseases, discuss strategies to achieve effective network integration, and summarize currently available web-based tools that may facilitate integration of data across species. The rapid progress in systems biology and orthology, as well as the advent of web-based tools to facilitate these processes, now make it possible to take advantage of knowledge from distant animal species in targeted identification of regulatory networks that may have clinical relevance for human kidney diseases. PMID:21044762

  8. SIRTF Tools for DIRT

    NASA Astrophysics Data System (ADS)

    Pound, M. W.; Wolfire, M. G.; Amarnath, N. S.

    2004-07-01

    The Dust InfraRed ToolBox (DIRT - a part of the Web Infrared ToolShed, or WITS {http://dustem.astro.umd.edu}) is a Java applet for modeling astrophysical processes in circumstellar shells around young and evolved stars. DIRT has been used by the astrophysics community for about 5 years. Users can automatically and efficiently search grids of pre-calculated models to fit their data. A large set of physical parameters and dust types are included in the model database, which contains over 500,000 models. We are adding new functionality to DIRT to support new missions like SIRTF and SOFIA. A new Instrument module allows for plotting of the model points convolved with the spatial and spectral responses of the selected instrument. This lets users better fit data from specific instruments. Currently, we have implemented modules for the Infrared Array Camera (IRAC) and Multiband Imaging Photometer (MIPS) on SIRTF. The models are based on the dust radiation transfer code of Wolfire & Cassinelli (1986) which accounts for multiple grain sizes and compositions. The model outputs are averaged over the instrument bands using the same weighting (νFν = constant) as the SIRTF data pipeline which allows the SIRTF data products to be compared directly with the model database. This work was supported in part by a NASA AISRP grant NAG 5-10751 and the SIRTF Legacy Science Program provided by NASA through an award issued by JPL under NASA contract 1407.
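
    The band-averaging step can be sketched compactly; the response curve, model spectrum, and the exact form of the νFν = constant weighting below are all assumptions for illustration:

```python
# Sketch of averaging a model spectrum over an instrument band with a
# nu*F_nu = const weighting convention. The response R(nu), the model
# F_nu, and this particular form of the weighting are all illustrative.
import numpy as np

def integrate(y, x):
    """Trapezoidal rule, avoiding version-specific numpy helpers."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

nu = np.linspace(3.0e13, 4.0e13, 200)              # Hz, synthetic band
response = np.exp(-((nu - 3.5e13) / 2.0e12) ** 2)  # synthetic R(nu)
f_nu = 1e-26 * (nu / 3.5e13) ** -1.5               # synthetic model SED

nu0 = 3.5e13  # nominal band frequency
# Quoted flux density for a nu*F_nu = const reference spectrum:
quoted = integrate(f_nu * response, nu) / integrate((nu0 / nu) * response, nu)
print(f"band-averaged flux density: {quoted:.3e}")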

  9. NetMOD v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merchant, Bion J

    2015-12-22

    NetMOD is a tool to model the performance of global ground-based explosion monitoring systems. Version 2.0 of the software supports the simulation of seismic, hydroacoustic, and infrasonic detection capability. The tool provides a user interface to execute simulations based upon a hypothetical definition of the monitoring system configuration, geophysical properties of the Earth, and detection analysis criteria. NetMOD will be distributed with a project file defining the basic performance characteristics of the International Monitoring System (IMS), a network of sensors operated by the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Network modeling is needed to assess and explain the potential effect of changes to the IMS, to prioritize station deployment and repair, and to assess the overall CTBTO monitoring capability both now and in the future. The CTBTO currently uses version 1.0 of NetMOD, provided to them in early 2014. NetMOD will provide a modern tool that covers all of the simulations currently available and allows for the development of additional simulation capabilities of the IMS in the future. NetMOD simulates the performance of monitoring networks by estimating the relative amplitudes of the signal and noise measured at each of the stations within the network, based upon known geophysical principles. From these signal and noise estimates, a probability of detection may be determined for each of the stations. The detection probabilities at each of the stations may then be combined to produce an estimate of the detection probability for the entire monitoring network.
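
    The final combination step lends itself to a compact sketch: assuming independent stations, the probability that at least k stations detect an event is a Poisson-binomial tail. The station probabilities and detection threshold below are illustrative, not NetMOD's actual algorithm:

```python
# Combine per-station detection probabilities into a network-level
# probability of detection, assuming independent stations. Requiring
# k detections (e.g., enough to locate an event) gives a
# Poisson-binomial tail, computed here by direct enumeration.
from itertools import combinations
from math import prod

def network_detection_prob(p_stations, k=3):
    """P(at least k of the stations detect the event)."""
    n = len(p_stations)
    total = 0.0
    for m in range(k, n + 1):
        for detected in combinations(range(n), m):
            s = set(detected)
            total += prod(p_stations[i] if i in s else 1 - p_stations[i]
                          for i in range(n))
    return total

p = [0.9, 0.7, 0.8, 0.4, 0.6]   # illustrative station probabilities
print(f"{network_detection_prob(p, k=3):.3f}")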

  10. Clinician accessible tools for GUI computational models of transcranial electrical stimulation: BONSAI and SPHERES.

    PubMed

    Truong, Dennis Q; Hüber, Mathias; Xie, Xihe; Datta, Abhishek; Rahman, Asif; Parra, Lucas C; Dmochowski, Jacek P; Bikson, Marom

    2014-01-01

    Computational models of brain current flow during transcranial electrical stimulation (tES), including transcranial direct current stimulation (tDCS) and transcranial alternating current stimulation (tACS), are increasingly used to understand and optimize clinical trials. We propose that broad dissemination requires simple graphical user interface (GUI) software that allows users to explore and design montages in real time, based on their own clinical/experimental experience and objectives. We introduce two complementary open-source platforms for this purpose: BONSAI and SPHERES. BONSAI is a web (cloud) based application (available at neuralengr.com/bonsai) that can be accessed through any flash-supported browser interface. SPHERES (available at neuralengr.com/spheres) is a stand-alone GUI application that allows consideration of arbitrary montages on a concentric sphere model by leveraging an analytical solution. These open-source tES modeling platforms are designed to be upgraded and enhanced. Trade-offs between open-access approaches that balance ease of access, speed, and flexibility are discussed.

  11. Validation of a coupled wave-flow model in a high-energy setting: the mouth of the Columbia River

    USGS Publications Warehouse

    Elias, Edwin P.L.; Gelfenbaum, Guy R.; van der Westhuysen, André J.

    2012-01-01

     A monthlong time series of wave, current, salinity, and suspended-sediment measurements was made at five sites on a transect across the Mouth of Columbia River (MCR). These data were used to calibrate and evaluate the performance of a coupled hydrodynamic and wave model for the MCR based on the Delft3D modeling system. The MCR is a dynamic estuary inlet in which tidal currents, river discharge, and wave-driven currents are all important. Model tuning consisted primarily of spatial adjustments to bottom drag coefficients. In combination with (near-) default parameter settings, the MCR model application is able to simulate the dominant features in the tidal flow, salinity and wavefields observed in field measurements. The wave-orbital averaged method for representing the current velocity profile in the wave model is considered the most realistic for the MCR. The hydrodynamic model is particularly effective in reproducing the observed vertical residual and temporal variations in current structure. Density gradients introduce the observed and modeled reversal of the mean flow at the bed and augment mean and peak flow in the upper half of the water column. This implies that sediment transport during calmer summer conditions is controlled by density stratification and is likely net landward due to the reversal of flow near the bed. The correspondence between observed and modeled hydrodynamics makes this application a tool to investigate hydrodynamics and associated sediment transport.
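
    Since model tuning centered on spatially adjusted bottom drag, the underlying quadratic drag law is worth making explicit; a minimal sketch with illustrative coefficient values follows:

```python
# Quadratic bottom-friction law of the kind used in Delft3D-type
# hydrodynamic models: tau_b = rho * Cd * |u| * u. Tuning the spatial
# field of Cd changes how strongly the bed retards the tidal flow.
# Coefficient and velocity values below are illustrative.
RHO_WATER = 1025.0  # kg/m^3, seawater

def bed_shear_stress(u: float, cd: float) -> float:
    """Bed shear stress (N/m^2) for depth-averaged velocity u (m/s)."""
    return RHO_WATER * cd * abs(u) * u

for cd in (0.002, 0.004):       # two candidate drag coefficients
    print(cd, bed_shear_stress(1.5, cd))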

  12. Validation of a coupled wave-flow model in a high-energy setting: The mouth of the Columbia River

    NASA Astrophysics Data System (ADS)

    Elias, Edwin P. L.; Gelfenbaum, Guy; Van der Westhuysen, André J.

    2012-09-01

    A monthlong time series of wave, current, salinity, and suspended-sediment measurements was made at five sites on a transect across the Mouth of Columbia River (MCR). These data were used to calibrate and evaluate the performance of a coupled hydrodynamic and wave model for the MCR based on the Delft3D modeling system. The MCR is a dynamic estuary inlet in which tidal currents, river discharge, and wave-driven currents are all important. Model tuning consisted primarily of spatial adjustments to bottom drag coefficients. In combination with (near-) default parameter settings, the MCR model application is able to simulate the dominant features in the tidal flow, salinity and wavefields observed in field measurements. The wave-orbital averaged method for representing the current velocity profile in the wave model is considered the most realistic for the MCR. The hydrodynamic model is particularly effective in reproducing the observed vertical residual and temporal variations in current structure. Density gradients introduce the observed and modeled reversal of the mean flow at the bed and augment mean and peak flow in the upper half of the water column. This implies that sediment transport during calmer summer conditions is controlled by density stratification and is likely net landward due to the reversal of flow near the bed. The correspondence between observed and modeled hydrodynamics makes this application a tool to investigate hydrodynamics and associated sediment transport.

  13. 2D simulations of orthogonal cutting of CFRP: Effect of tool angles on parameters of cut and chip morphology

    NASA Astrophysics Data System (ADS)

    Benhassine, Mehdi; Rivière-Lorphèvre, Edouard; Arrazola, Pedro-Jose; Gobin, Pierre; Dumas, David; Madhavan, Vinay; Aizpuru, Ohian; Ducobu, François

    2018-05-01

    Carbon-fiber reinforced composites (CFRP) are attractive materials for lightweight designs in applications requiring good mechanical properties. Machining such materials can be harder than machining metals because of their anisotropic behavior; besides the mechanical properties of the fibers and resin, the fiber orientation must also be taken into account. In orthogonal cutting, the tool inclination, rake angle and cutting angle usually influence the cutting process, but such a detailed investigation is currently lacking in a 2D configuration. To address this issue, a model including Hashin damage has been developed in Abaqus/Explicit and validated against experimental results from the literature. The effects of the tool parameters (rake angle, clearance angle) on the cutting forces, CFRP chip morphology and surface damage are studied here. It is shown that a 90° fiber orientation increases the surface damage, while the rake angle has a minimal effect on the cutting forces but modifies the chip formation times; the feed forces increase with increasing rake angle.
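
    The Hashin damage model referenced above evaluates separate fiber and matrix failure modes from the in-plane stresses. A minimal sketch of the fiber-tension initiation criterion follows; the strength values are placeholders, not the paper's material data:

```python
# Sketch of the Hashin fiber-tension initiation criterion used in
# Abaqus-style CFRP damage models:
#   F = (s11 / Xt)^2 + alpha * (t12 / Sl)^2 >= 1  -> damage initiates
# Strength values below are placeholders, not measured CFRP data.

def hashin_fiber_tension(s11: float, t12: float,
                         Xt: float = 2000.0,   # MPa, assumed tensile strength
                         Sl: float = 90.0,     # MPa, assumed shear strength
                         alpha: float = 1.0) -> float:
    if s11 < 0:
        raise ValueError("fiber-tension mode applies only for s11 >= 0")
    return (s11 / Xt) ** 2 + alpha * (t12 / Sl) ** 2

print(hashin_fiber_tension(s11=1500.0, t12=40.0))  # >= 1 means initiation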

  14. Characterizing Aerosols over Southeast Asia using the AERONET Data Synergy Tool

    NASA Technical Reports Server (NTRS)

    Giles, David M.; Holben, Brent N.; Eck, Thomas F.; Slutsker, Ilya; Slutsker, Ilya; Welton, Ellsworth, J.; Chin, Mian; Kucsera, Thomas; Schmaltz, Jeffery E.; Diehl, Thomas; hide

    2007-01-01

    Biomass burning, urban pollution and dust aerosols have significant impacts on the radiative forcing of the atmosphere over Asia. In order to better quantify these aerosol characteristics, the Aerosol Robotic Network (AERONET) has established over 200 sites worldwide, with an emphasis in recent years on the Asian continent - specifically Southeast Asia. A total of approximately 15 AERONET sun photometer instruments have been deployed to China, India, Pakistan, Thailand, and Vietnam. Sun photometer spectral aerosol optical depth measurements as well as microphysical and optical aerosol retrievals over Southeast Asia will be analyzed and discussed with supporting ground-based instrument, satellite, and model data sets, which are freely available via the AERONET Data Synergy tool at the AERONET web site (http://aeronet.gsfc.nasa.gov). This web-based data tool provides access to ground-based (AERONET and MPLNET), satellite (MODIS, SeaWiFS, TOMS, and OMI) and model (GOCART and back trajectory analyses) databases via one web portal. Future development of the AERONET Data Synergy Tool will include the expansion of current data sets as well as the implementation of other Earth Science data sets pertinent to advancing aerosol research.
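
    Spectral AOD measurements of the kind AERONET provides are often summarized by the Ångström exponent, which discriminates fine smoke/pollution particles from coarse dust. A minimal sketch of the standard two-wavelength formula, with illustrative values:

```python
# Angstrom exponent from spectral aerosol optical depth (AOD):
#   alpha = -ln(tau1 / tau2) / ln(lambda1 / lambda2)
# Larger alpha -> smaller particles (smoke, pollution); alpha near
# zero -> coarse particles (dust). AOD values below are illustrative.
from math import log

def angstrom_exponent(tau1, lam1, tau2, lam2):
    return -log(tau1 / tau2) / log(lam1 / lam2)

# Illustrative AODs at 440 nm and 870 nm:
print(round(angstrom_exponent(0.45, 440e-9, 0.18, 870e-9), 2))  # ~1.34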

  15. TRANSP: status and planning

    NASA Astrophysics Data System (ADS)

    Andre, R.; Carlsson, J.; Gorelenkova, M.; Jardin, S.; Kaye, S.; Poli, F.; Yuan, X.

    2016-10-01

    TRANSP is an integrated interpretive and predictive transport analysis tool that incorporates state-of-the-art heating/current drive sources and transport models. The treatments and transport solvers are becoming increasingly sophisticated and comprehensive. For instance, the ISOLVER component provides a free-boundary equilibrium solution, while the PT-SOLVER transport solver is especially suited for stiff transport models such as TGLF. TRANSP incorporates high-fidelity heating and current drive source models, such as NUBEAM for neutral beam injection, the beam tracing code TORBEAM for EC, TORIC for ICRF, and the ray tracing codes TORAY and GENRAY for EC. The implementation of selected components makes efficient use of MPI to speed up code calculations. Recently, the GENRAY-CQL3D solver for modeling LH heating and current drive has been implemented and is currently being extended to multiple antennas, to allow modeling of EAST discharges. GENRAY+CQL3D is also being extended to EC/EBW and to HHFW for NSTX-U. This poster will describe present uses of the code worldwide, as well as plans for upgrading the physics modules and code framework. Work supported by the US Department of Energy under DE-AC02-CH0911466.

  16. When everything is not everywhere but species evolve: an alternative method to model adaptive properties of marine ecosystems

    PubMed Central

    Sauterey, Boris; Ward, Ben A.; Follows, Michael J.; Bowler, Chris; Claessen, David

    2015-01-01

    The functional and taxonomic biogeography of marine microbial systems reflects the current state of an evolving system. Current models of marine microbial systems and biogeochemical cycles do not reflect this fundamental organizing principle. Here, we investigate the evolutionary adaptive potential of marine microbial systems under environmental change and introduce explicit Darwinian adaptation into an ocean modelling framework, simulating evolving phytoplankton communities in space and time. To this end, we adopt tools from adaptive dynamics theory, evaluating the fitness of invading mutants over annual timescales, replacing the resident if a fitter mutant arises. Using the evolutionary framework, we examine how community assembly, specifically the emergence of phytoplankton cell size diversity, reflects the combined effects of bottom-up and top-down controls. When compared with a species-selection approach, based on the paradigm that “Everything is everywhere, but the environment selects”, we show that (i) the selected optimal trait values are similar; (ii) the patterns emerging from the adaptive model are more robust, but (iii) the two methods lead to different predictions in terms of emergent diversity. We demonstrate that explicitly evolutionary approaches to modelling marine microbial populations and functionality are feasible and practical in time-varying, space-resolving settings and provide a new tool for exploring evolutionary interactions on a range of timescales in the ocean. PMID:25852217
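
    The resident-mutant scheme described above reduces to a simple loop: propose a nearby mutant trait, evaluate its invasion fitness against the resident, and replace the resident if the mutant is fitter. A toy sketch follows; the fitness function is a stand-in with an arbitrary optimum, not the paper's ocean model:

```python
# Toy adaptive-dynamics loop for a single trait (e.g., log cell size).
# The fitness function is a placeholder with an optimum at trait = 2.0;
# the actual study evaluates invasion fitness inside an ocean model
# over annual timescales.
import random

def invasion_fitness(mutant: float, resident: float) -> float:
    # Placeholder: mutant's growth advantage relative to the resident.
    opt = 2.0
    return (resident - opt) ** 2 - (mutant - opt) ** 2

random.seed(1)
resident = 0.0
for generation in range(200):
    mutant = resident + random.gauss(0.0, 0.05)  # small mutation
    if invasion_fitness(mutant, resident) > 0:   # fitter mutant invades
        resident = mutant

print(f"evolved trait ~ {resident:.2f}")  # drifts toward the optimum 2.0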

  17. When everything is not everywhere but species evolve: an alternative method to model adaptive properties of marine ecosystems.

    PubMed

    Sauterey, Boris; Ward, Ben A; Follows, Michael J; Bowler, Chris; Claessen, David

    2015-01-01

    The functional and taxonomic biogeography of marine microbial systems reflects the current state of an evolving system. Current models of marine microbial systems and biogeochemical cycles do not reflect this fundamental organizing principle. Here, we investigate the evolutionary adaptive potential of marine microbial systems under environmental change and introduce explicit Darwinian adaptation into an ocean modelling framework, simulating evolving phytoplankton communities in space and time. To this end, we adopt tools from adaptive dynamics theory, evaluating the fitness of invading mutants over annual timescales, replacing the resident if a fitter mutant arises. Using the evolutionary framework, we examine how community assembly, specifically the emergence of phytoplankton cell size diversity, reflects the combined effects of bottom-up and top-down controls. When compared with a species-selection approach, based on the paradigm that "Everything is everywhere, but the environment selects", we show that (i) the selected optimal trait values are similar; (ii) the patterns emerging from the adaptive model are more robust, but (iii) the two methods lead to different predictions in terms of emergent diversity. We demonstrate that explicitly evolutionary approaches to modelling marine microbial populations and functionality are feasible and practical in time-varying, space-resolving settings and provide a new tool for exploring evolutionary interactions on a range of timescales in the ocean.

  18. Effect of pulsed current GTA welding parameters on the fusion zone microstructure of AA 6061 aluminium alloy

    NASA Astrophysics Data System (ADS)

    Kumar, T. Senthil; Balasubramanian, V.; Babu, S.; Sanavullah, M. Y.

    2007-08-01

    AA6061 aluminium alloy (Al-Mg-Si alloy) has gathered wide acceptance in the fabrication of food processing equipment, chemical containers, passenger cars, road tankers, and railway transport systems. The preferred process for welding these aluminium alloys is frequently Gas Tungsten Arc (GTA) welding due to its comparatively easy applicability and lower cost. In the case of single pass GTA welding of thinner sections of this alloy, the pulsed current has been found beneficial due to its advantages over the conventional continuous current processes. The use of pulsed current parameters has been found to improve the mechanical properties of the welds compared to those of continuous current welds of this alloy due to grain refinement occurring in the fusion zone. In this investigation, an attempt has been made to develop a mathematical model to predict the fusion zone grain diameter incorporating pulsed current welding parameters. Statistical tools such as design of experiments, analysis of variance, and regression analysis are used to develop the mathematical model. The developed model can be effectively used to predict the fusion grain diameter at a 95% confidence level for the given pulsed current parameters. The effect of pulsed current GTA welding parameters on the fusion zone grain diameter of AA 6061 aluminium alloy welds is reported in this paper.
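
    The approach described above amounts to fitting a regression in the pulsed-current parameters to the measured grain diameter. A minimal sketch with synthetic data follows; the factor ranges and coefficients are placeholders, not the published model:

```python
# Least-squares fit of fusion-zone grain diameter to pulsed-current
# parameters, in the design-of-experiments spirit of the paper. All
# data below are synthetic; this is NOT the published model.
import numpy as np

rng = np.random.default_rng(42)
# Factors: peak current (A), base current (A), pulse frequency (Hz)
X = rng.uniform([140, 60, 2], [200, 100, 6], size=(30, 3))
# Synthetic response: grain diameter (micron)
y = 60 - 0.08 * X[:, 0] - 0.05 * X[:, 1] - 1.5 * X[:, 2] \
    + rng.normal(0, 0.5, 30)

A = np.column_stack([np.ones(len(X)), X])   # linear model with intercept
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted coefficients:", np.round(coef, 3))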

  19. Identification of nursing assessment models/tools validated in clinical practice for use with diverse ethno-cultural groups: an integrative review of the literature

    PubMed Central

    2011-01-01

    Background High-income nations currently exhibit increasing ethno-cultural diversity, which may present challenges for nursing practice. We performed an integrative review of literature published in North America and Europe between 1990 and 2007 to map the state of knowledge and to identify nursing assessment tools/models which have an associated research or empirical perspective in relation to ethno-cultural dimensions of nursing care. Methods Data were retrieved from a wide variety of sources, including key electronic bibliographic databases covering research in biomedical fields, nursing and allied health, and culture, e.g. CINAHL, MEDline, PUBmed, the Cochrane library, PsycINFO, Web of Science, and HAPI. We used the Critical Appraisal Skills Programme tools for quality assessment. We applied Torraco's definition and method of an integrative review, which aims to create new knowledge and perspectives on a given phenomenon. To add methodological rigor with respect to the search strategy and other key review components, we also used the principles established by the Centre for Reviews and Dissemination. Results Thirteen thousand and thirteen articles were retrieved, from which 53 full papers were assessed for inclusion. Eight papers met the inclusion criteria, describing research on a total of eight ethno-cultural assessment tools/models. The tools/models are described and synthesized. Conclusions While many ethno-cultural assessment tools exist to guide nursing practice, few are informed by research perspectives. With an increased focus on the efficiency and effectiveness of health services, patient safety, and risk management, the provision of culturally responsive and competent health services will inevitably become paramount. PMID:21812960

  20. Knowledge Management tools integration within DLR's concurrent engineering facility

    NASA Astrophysics Data System (ADS)

    Lopez, R. P.; Soragavi, G.; Deshmukh, M.; Ludtke, D.

    The complexity of space endeavors has increased the need for Knowledge Management (KM) tools. The concept of KM involves not only the electronic storage of knowledge, but also the process of making this knowledge available, reusable and traceable. Establishing a KM concept within the Concurrent Engineering Facility (CEF) has been a research topic of the German Aerospace Centre (DLR). This paper presents the current KM tools of the CEF: the Software Platform for Organizing and Capturing Knowledge (S.P.O.C.K.), the data model Virtual Satellite (VirSat), and the Simulation Model Library (SimMoLib), and how their usage has improved the Concurrent Engineering (CE) process. The paper also presents the lessons learned from the introduction of KM practices into the CEF and elaborates a roadmap for the further development of KM in CE activities at DLR. The results of applying the Knowledge Management tools have shown the potential of merging the three software platforms and their functionalities as the next step toward the full integration of KM practices into the CE process. VirSat will remain the main software platform used within a CE study, while S.P.O.C.K. and SimMoLib will be integrated into VirSat. These tools will support the data model as a reference and documentation source, and as an access point to simulation and calculation models. The use of KM tools in the CEF aims to become a standard practice during the CE process; establishing this practice will result in a much broader exchange of knowledge and experience within the Concurrent Engineering environment and, consequently, in higher-quality designs of space systems.
