Science.gov

Sample records for advanced modeling tools

  1. Modeling Tool Advances Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center and, more recently, through Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload capacity and cruise speed of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  2. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  3. Evaluation of reliability modeling tools for advanced fault tolerant systems

    NASA Technical Reports Server (NTRS)

    Baker, Robert; Scheper, Charlotte

    1986-01-01

    The Computer Aided Reliability Estimation (CARE III) and Automated Reliability Interactive Estimation System (ARIES 82) reliability tools were evaluated for application to advanced fault-tolerant aerospace systems. To determine reliability modeling requirements, the evaluation focused on the Draper Laboratories' Advanced Information Processing System (AIPS) architecture as an example architecture for fault-tolerant aerospace systems. Advantages and limitations were identified for each reliability evaluation tool. The CARE III program was designed primarily for analyzing ultrareliable flight control systems. The ARIES 82 program's primary use was to support university research and teaching. Neither CARE III nor ARIES 82 was suited for determining the reliability of the complex nodal networks used to interconnect processing sites in the AIPS architecture. It was concluded that ARIES was not suitable for modeling advanced fault-tolerant systems. It was further concluded that, subject to some limitations (difficulty in modeling systems with unpowered spare modules, systems where equipment maintenance must be considered, systems where failure depends on the sequence in which faults occurred, and systems where multiple faults beyond double near-coincident faults must be considered), CARE III is best suited for evaluating the reliability of advanced fault-tolerant systems for air transport.
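
    The kind of closed-form reliability evaluation such tools automate can be illustrated with a minimal sketch. The formula below is the textbook result for triple modular redundancy, shown only to indicate the flavor of the computation; it is not CARE III's or ARIES 82's actual model:

```python
import math

def tmr_reliability(lam, t):
    """Reliability of a triple modular redundant (TMR) system whose three
    modules fail independently at constant rate lam: the system survives
    while at least 2 of 3 modules work. A textbook closed form, shown only
    to illustrate the kind of computation that reliability tools such as
    CARE III automate for far larger fault-tolerant architectures."""
    r = math.exp(-lam * t)      # single-module reliability at time t
    return 3 * r**2 - 2 * r**3  # P(at least 2 of 3 modules survive)
```

    For short missions TMR beats a single module (e.g. at lam = 1e-4/h, t = 1000 h, the system reliability is about 0.975 versus 0.905 for one module), which is exactly the kind of trade such tools quantify.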

  4. ADVISOR: a systems analysis tool for advanced vehicle modeling

    NASA Astrophysics Data System (ADS)

    Markel, T.; Brooker, A.; Hendricks, T.; Johnson, V.; Kelly, K.; Kramer, B.; O'Keefe, M.; Sprik, S.; Wipke, K.

    This paper provides an overview of the Advanced Vehicle Simulator (ADVISOR), the US Department of Energy's (DOE's) vehicle analysis package written in the MATLAB/Simulink environment and developed by the National Renewable Energy Laboratory. ADVISOR provides the vehicle engineering community with an easy-to-use, flexible, yet robust and supported analysis package for advanced vehicle modeling. It is primarily used to quantify the fuel economy, performance, and emissions of vehicles that use alternative technologies, including fuel cells, batteries, electric motors, and internal combustion engines in hybrid (i.e. multiple power source) configurations. It excels at quantifying the relative change that can be expected from implementing a technology compared with a baseline scenario. ADVISOR's capabilities and limitations are presented, and the power source models included in ADVISOR are discussed. Finally, several applications of the tool are presented to highlight ADVISOR's functionality. The content of this paper is based on a presentation made at the 'Development of Advanced Battery Engineering Models' workshop held in Crystal City, Virginia, in August 2001.

  5. Advanced REACH Tool: a Bayesian model for occupational exposure assessment.

    PubMed

    McNally, Kevin; Warren, Nicholas; Fransman, Wouter; Entink, Rinke Klein; Schinkel, Jody; van Tongeren, Martie; Cherrie, John W; Kromhout, Hans; Schneider, Thomas; Tielemans, Erik

    2014-06-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate sources of information within a Bayesian statistical framework. The information is obtained from expert knowledge expressed in a calibrated mechanistic model of exposure assessment, data on inter- and intra-individual variability in exposures from the literature, and context-specific exposure measurements. The ART provides central estimates and credible intervals for different percentiles of the exposure distribution, for full-shift and long-term average exposures. The ART can produce exposure estimates in the absence of measurements, but the precision of the estimates improves as more data become available. The methodology presented in this paper can utilize partially analogous data, a novel approach designed to make efficient use of a sparsely populated measurement database, although some additional research is still required before practical implementation. The methodology is demonstrated using two worked examples: an exposure to copper pyrithione in the spraying of antifouling paints and an exposure to ethyl acetate in shoe repair. PMID:24665110
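
    The core Bayesian idea (a model-based prior on exposure that measurements then sharpen) can be sketched in a few lines. The conjugate normal update on the log scale below, and every parameter value in it, are illustrative assumptions for exposition, not ART's calibrated mechanistic model:

```python
import math

def posterior_log_exposure(prior_mean, prior_sd, log_measurements, meas_sd):
    """Conjugate normal update of the mean log-exposure.

    prior_mean/prior_sd: prior on the log of the exposure level (standing in
    for a calibrated mechanistic model); meas_sd: assumed within-scenario
    variability of measurements on the log scale. Returns the posterior
    mean and standard deviation."""
    if not log_measurements:
        return prior_mean, prior_sd  # model-only estimate, no data yet
    n = len(log_measurements)
    xbar = sum(log_measurements) / n
    prior_prec = 1.0 / prior_sd**2   # precision = 1 / variance
    data_prec = n / meas_sd**2
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * prior_mean + data_prec * xbar)
    return post_mean, math.sqrt(post_var)
```

    With no measurements the prior is returned unchanged; each added measurement increases the data precision, so the credible interval narrows, mirroring the abstract's point that estimates exist without data but improve as data accumulate.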

  6. Advanced REACH Tool: A Bayesian Model for Occupational Exposure Assessment

    PubMed Central

    McNally, Kevin; Warren, Nicholas; Fransman, Wouter; Entink, Rinke Klein; Schinkel, Jody; van Tongeren, Martie; Cherrie, John W.; Kromhout, Hans; Schneider, Thomas; Tielemans, Erik

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate sources of information within a Bayesian statistical framework. The information is obtained from expert knowledge expressed in a calibrated mechanistic model of exposure assessment, data on inter- and intra-individual variability in exposures from the literature, and context-specific exposure measurements. The ART provides central estimates and credible intervals for different percentiles of the exposure distribution, for full-shift and long-term average exposures. The ART can produce exposure estimates in the absence of measurements, but the precision of the estimates improves as more data become available. The methodology presented in this paper can utilize partially analogous data, a novel approach designed to make efficient use of a sparsely populated measurement database, although some additional research is still required before practical implementation. The methodology is demonstrated using two worked examples: an exposure to copper pyrithione in the spraying of antifouling paints and an exposure to ethyl acetate in shoe repair. PMID:24665110

  7. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  8. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  9. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  10. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database.

    SciTech Connect

    Quock, D. E. R.; Cianciarulo, M. B.; APS Engineering Support Division; Purdue Univ.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.
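
    The two-tier result display described above (hits grouped by control-system category, then individual matching rows) can be sketched with a simple in-memory search. The substring matching and record layout here are stand-ins for the real AJAX front end and relational queries:

```python
def global_search(records, query):
    """Search all IRMIS-style records at once and group the hits by
    category, mimicking a two-tier result display (first tier: category;
    second tier: matching rows). records is a list of (category, text)
    pairs; matching is a case-insensitive substring test, a deliberately
    simple stand-in for the production database search."""
    tiers = {}
    q = query.lower()
    for category, text in records:
        if q in text.lower():
            tiers.setdefault(category, []).append(text)
    return tiers
```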

  11. Predictive Modeling of Estrogen Receptor Binding Agents Using Advanced Cheminformatics Tools and Massive Public Data

    PubMed Central

    Ribay, Kathryn; Kim, Marlene T.; Wang, Wenyi; Pinolini, Daniel; Zhu, Hao

    2016-01-01

    Estrogen receptors (ERα) are a critical target for drug design as well as a potential source of toxicity when activated unintentionally. Thus, evaluating potential ERα binding agents is critical in both drug discovery and chemical toxicity areas. Computational tools, e.g., Quantitative Structure-Activity Relationship (QSAR) models, can predict potential ERα binding agents before chemical synthesis. The purpose of this project was to develop enhanced predictive models of ERα binding agents by utilizing advanced cheminformatics tools that can integrate publicly available bioassay data. The initial ERα binding agent data set, consisting of 446 binders and 8307 non-binders, was obtained from the Tox21 Challenge project organized by the NIH Chemical Genomics Center (NCGC). After removing duplicates and inorganic compounds, this data set was used to create a training set (259 binders and 259 non-binders). This training set was used to develop QSAR models using chemical descriptors. The resulting models were then used to predict the binding activity of 264 external compounds, which became available to us after the models were developed. The cross-validation results of the training set [Correct Classification Rate (CCR) = 0.72] were much higher than the external predictivity for the unknown compounds (CCR = 0.59). To improve the conventional QSAR models, all compounds in the training set were used to search PubChem and generate a profile of their biological responses across thousands of bioassays. The most important bioassays were prioritized to generate a similarity index that was used to calculate the biosimilarity score between each pair of compounds. The nearest neighbors for each compound within the set were then identified, and its ERα binding potential was predicted from its nearest neighbors in the training set. The hybrid model performance (CCR = 0.94 for cross-validation; CCR = 0.68 for external prediction) showed significant improvement over the original QSAR models.
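
    The hybrid step, predicting ERα binding from a compound's most biosimilar neighbors, can be sketched as follows. Jaccard similarity over active-assay sets is used here as a simple stand-in for the paper's prioritized similarity index, and the profiles are toy data, not PubChem responses:

```python
def jaccard_similarity(profile_a, profile_b):
    """Biosimilarity between two compounds, each represented by the set of
    bioassays in which it was active. A stand-in for the paper's
    prioritized similarity index."""
    if not profile_a and not profile_b:
        return 0.0
    return len(profile_a & profile_b) / len(profile_a | profile_b)

def predict_binding(query_profile, training, k=3):
    """Majority vote over the k most biosimilar training compounds.
    training is a list of (assay_profile_set, is_binder) pairs."""
    ranked = sorted(training,
                    key=lambda item: jaccard_similarity(query_profile, item[0]),
                    reverse=True)
    votes = [is_binder for _, is_binder in ranked[:k]]
    return sum(votes) > len(votes) / 2
```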

  12. Using explanatory crop models to develop simple tools for Advanced Life Support system studies

    NASA Technical Reports Server (NTRS)

    Cavazzoni, J.

    2004-01-01

    System-level analyses for Advanced Life Support require mathematical models for various processes, such as for biomass production and waste management, which would ideally be integrated into overall system models. Explanatory models (also referred to as mechanistic or process models) would provide the basis for a more robust system model, as these would be based on an understanding of specific processes. However, implementing such models at the system level may not always be practicable because of their complexity. For the area of biomass production, explanatory models were used to generate parameters and multivariable polynomial equations for basic models that are suitable for estimating the direction and magnitude of daily changes in canopy gas-exchange, harvest index, and production scheduling for both nominal and off-nominal growing conditions. c2004 COSPAR. Published by Elsevier Ltd. All rights reserved.
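
    The surrogate idea (replacing the explanatory crop model with a fitted multivariable polynomial at the system level) can be sketched generically. The monomial encoding, variable names, and coefficient values below are illustrative placeholders, not parameters from the Advanced Life Support models:

```python
def polynomial_surrogate(coeffs, **inputs):
    """Evaluate a multivariable polynomial surrogate of the kind fitted to
    an explanatory crop model. coeffs maps monomials, encoded as tuples of
    (variable_name, power) pairs, to fitted coefficients; the empty tuple
    is the constant term. All names and numbers here are illustrative."""
    total = 0.0
    for monomial, c in coeffs.items():
        term = c
        for var, power in monomial:
            term *= inputs[var] ** power
        total += term
    return total
```

    A system model can then call such a surrogate daily (e.g. canopy gas exchange as a function of light level and CO2) without carrying the full process model's complexity.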

  13. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on the fate of ash in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interactions of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning of the SEM image. A backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is
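
    The Newton-Raphson iteration mentioned for closing mass and energy balances is standard; a generic root finder of that kind looks like the sketch below. The sample equation in the test is illustrative only, not one of the EERC spreadsheet balances:

```python
def newton_raphson(f, dfdx, x0, tol=1e-10, max_iter=50):
    """Generic Newton-Raphson root finder of the kind used to close
    mass/energy/elemental balances in spreadsheet tools: repeatedly move
    x by -f(x)/f'(x) until the step is below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")
```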

  14. Advanced semi-active engine and transmission mounts: tools for modelling, analysis, design, and tuning

    NASA Astrophysics Data System (ADS)

    Farjoud, Alireza; Taylor, Russell; Schumann, Eric; Schlangen, Timothy

    2014-02-01

    This paper is focused on modelling, design, and testing of semi-active magneto-rheological (MR) engine and transmission mounts used in the automotive industry. The purpose is to develop a complete analysis, synthesis, design, and tuning tool that reduces the need for expensive and time-consuming laboratory and field tests. A detailed mathematical model of such devices is developed using multi-physics modelling techniques for physical systems with various energy domains. The model includes all major features of an MR mount including fluid dynamics, fluid track, elastic components, decoupler, rate-dip, gas-charged chamber, MR fluid rheology, magnetic circuit, electronic driver, and control algorithm. Conventional passive hydraulic mounts can also be studied using the same mathematical model. The model is validated using standard experimental procedures. It is used for design and parametric study of mounts; effects of various geometric and material parameters on dynamic response of mounts can be studied. Additionally, this model can be used to test various control strategies to obtain best vibration isolation performance by tuning control parameters. Another benefit of this work is that nonlinear interactions between sub-components of the mount can be observed and investigated. This is not possible by using simplified linear models currently available.
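
    A drastically reduced version of a semi-active mount can be simulated in a few lines: a mass-spring-damper whose damping switches between two levels under an on-off rule. This toy model, the switching rule, and every parameter value are illustrative assumptions, not the multi-physics MR mount model described in the paper:

```python
def simulate_mount(m, k, c_low, c_high, x0, dt=1e-3, steps=5000):
    """Toy semi-active mount: mass m on spring k with damping that
    switches to c_high while the mass moves away from equilibrium
    (x*v > 0) and back to c_low otherwise. Integrated with semi-implicit
    Euler; returns the final displacement. Purely illustrative."""
    x, v = x0, 0.0
    for _ in range(steps):
        c = c_high if x * v > 0 else c_low  # on-off switching rule
        v += dt * (-k * x - c * v) / m      # update velocity first...
        x += dt * v                          # ...then position (symplectic)
    return x
```

    Even this crude two-state switching dissipates the initial transient far faster than the low-damping setting alone, which is the basic payoff a tuned semi-active mount seeks.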

  15. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science.

    PubMed

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need to transfer the latest results in the field of machine learning to biomedical researchers. We propose web-based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open-access and user-friendly option to obtain discrete-time predictive survival models at the individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets. PMID:27532883
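
    The standard survival-analysis side of such a tool rests on estimators like Kaplan-Meier; a minimal implementation is sketched below (the textbook estimator, not OSA's ANN-based method):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. times: follow-up time per subject;
    events: 1 if the event occurred at that time, 0 if censored.
    Returns a list of (t, S(t)) pairs at each event time."""
    at_risk = len(times)
    data = sorted(zip(times, events))
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = leaving = 0
        while i < len(data) and data[i][0] == t:  # gather ties at time t
            deaths += data[i][1]
            leaving += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk  # multiply in conditional survival
            curve.append((t, surv))
        at_risk -= leaving  # events and censorings both leave the risk set
    return curve
```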

  16. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science

    PubMed Central

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need to transfer the latest results in the field of machine learning to biomedical researchers. We propose web-based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open-access and user-friendly option to obtain discrete-time predictive survival models at the individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets. PMID:27532883

  17. Propulsion Simulations Using Advanced Turbulence Models with the Unstructured Grid CFD Tool, TetrUSS

    NASA Technical Reports Server (NTRS)

    Abdol-Hamid, Khaled S.; Frink, Neal T.; Deere, Karen A.; Pandya, Mohangna J.

    2004-01-01

    A computational investigation has been completed to assess the capability of TetrUSS for exhaust nozzle flows. Three configurations were chosen for this study: (1) an axisymmetric supersonic jet, (2) a transonic axisymmetric boattail with a solid sting operated at different Reynolds and Mach numbers, and (3) an isolated non-axisymmetric nacelle with a supersonic cruise nozzle. These configurations were chosen because existing experimental data provided a means for measuring the ability of TetrUSS to simulate complex nozzle flows. The main objective of this paper is to validate the implementation of advanced two-equation turbulence models in the unstructured-grid CFD code USM3D for propulsion flow cases. USM3D is the flow solver of the TetrUSS system. Three different turbulence models, namely the Menter Shear Stress Transport (SST), the basic k-epsilon, and the Spalart-Allmaras (SA) models, are used in the present study. The results are generally in agreement with other implementations of these models in structured-grid CFD codes. Results indicate that USM3D provides accurate simulations for complex aerodynamic configurations with propulsion integration.

  18. FACILITATING ADVANCED URBAN METEOROLOGY AND AIR QUALITY MODELING CAPABILITIES WITH HIGH RESOLUTION URBAN DATABASE AND ACCESS PORTAL TOOLS

    EPA Science Inventory

    Information of urban morphological features at high resolution is needed to properly model and characterize the meteorological and air quality fields in urban areas. We describe a new project called National Urban Database with Access Portal Tool, (NUDAPT) that addresses this nee...

  19. Advanced genetic tools for plant biotechnology

    SciTech Connect

    Liu, WS; Yuan, JS; Stewart, CN

    2013-10-09

    Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.

  20. Exposure tool control for advanced semiconductor lithography

    NASA Astrophysics Data System (ADS)

    Matsuyama, Tomoyuki

    2015-08-01

    This is a review paper to show how we control exposure tool parameters in order to satisfy patterning performance and productivity requirements for advanced semiconductor lithography. In this paper, we will discuss how we control illumination source shape to satisfy required imaging performance, heat-induced lens aberration during exposure to minimize the aberration impact on imaging, dose and focus control to realize uniform patterning performance across the wafer and patterning position of circuit patterns on different layers. The contents are mainly about current Nikon immersion exposure tools.

  1. Load Model Data Tool

    SciTech Connect

    David Chassin, Pavel Etingov

    2013-04-30

    The LMDT software automates preparation of load composite model data in the formats supported by the major power system software vendors (GE and Siemens). Proper representation of the composite load model in power system dynamic analysis is very important. Power system simulation tools such as GE PSLF and Siemens PSSE already include algorithms for composite load modeling; however, these tools require that the input information on the composite load be provided in custom formats. Preparation of this data is time consuming and requires multiple manual operations, and the LMDT software automates this process. The software is designed to generate composite load model data, using default load composition data, motor information, and bus information as input. It processes the input information and produces a load composition model. The generated model can be stored in the .dyd format supported by the GE PSLF package or the .dyr format supported by the Siemens PSSE package.
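
    The conversion step (turning a load-composition table into a vendor-format record) can be sketched as below. The record layouts emitted here are illustrative placeholders only, not the exact GE PSLF .dyd or Siemens PSSE .dyr syntax:

```python
def make_composite_load_record(bus, load_id, fractions, fmt="dyd"):
    """Assemble one composite-load record from a load-composition dict.

    fractions maps component names (e.g. motor classes, static load) to
    their share of the bus load; the shares must sum to 1. The output
    layouts are hypothetical sketches of vendor record formats."""
    total = sum(fractions.values())
    if abs(total - 1.0) > 1e-6:
        raise ValueError(f"composition fractions sum to {total}, expected 1.0")
    fields = " ".join(f"{name}={share:.3f}"
                      for name, share in sorted(fractions.items()))
    if fmt == "dyd":
        return f'cmpldw {bus} "{load_id}" : {fields}'
    if fmt == "dyr":
        return f"{bus}, 'USRLOD', {load_id}, {fields}"
    raise ValueError(f"unknown format: {fmt}")
```

    Validating that the composition sums to one before emitting the record is exactly the kind of manual check the tool removes from the workflow.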

  2. Load Model Data Tool

    2013-04-30

    The LMDT software automates preparation of load composite model data in the formats supported by the major power system software vendors (GE and Siemens). Proper representation of the composite load model in power system dynamic analysis is very important. Power system simulation tools such as GE PSLF and Siemens PSSE already include algorithms for composite load modeling; however, these tools require that the input information on the composite load be provided in custom formats. Preparation of this data is time consuming and requires multiple manual operations, and the LMDT software automates this process. The software is designed to generate composite load model data, using default load composition data, motor information, and bus information as input. It processes the input information and produces a load composition model. The generated model can be stored in the .dyd format supported by the GE PSLF package or the .dyr format supported by the Siemens PSSE package.

  3. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.
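
    The TTB's key feature, technology data keyed to timeframes, can be sketched as a lookup that picks the most recent dataset applicable to a study year. The table structure, field names, and values below are illustrative placeholders, not actual TTB contents:

```python
def lookup_technology(ttb, tech, year):
    """Pick the parameter set for a technology at a given timeframe from a
    TTB-style table: each technology maps the first year a dataset applies
    to that dataset. Returns the latest dataset at or before `year`."""
    years = sorted(y for y in ttb[tech] if y <= year)
    if not years:
        raise KeyError(f"no {tech} data at or before {year}")
    return ttb[tech][years[-1]]
```

    System models drawing from one such table, rather than each carrying private technology numbers, is what keeps an ATLAS-style architecture study internally consistent.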

  4. Machine Tool Advanced Skills Technology Program (MAST). Overview and Methodology.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology Program (MAST) is a geographical partnership of six of the nation's best two-year colleges located in the six states that have about one-third of the density of metals-related industries in the United States. The purpose of the MAST grant is to develop and implement a national training model to overcome…

  5. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  6. Self-advancing step-tap tool

    NASA Technical Reports Server (NTRS)

    Pettit, Donald R. (Inventor); Penner, Ronald K. (Inventor); Franklin, Larry D. (Inventor); Camarda, Charles J. (Inventor)

    2008-01-01

    Methods and tool for simultaneously forming a bore in a work piece and forming a series of threads in said bore. In an embodiment, the tool has a predetermined axial length, a proximal end, and a distal end, said tool comprising: a shank located at said proximal end; a pilot drill portion located at said distal end; and a mill portion intermediately disposed between said shank and said pilot drill portion. The mill portion is comprised of at least two drill-tap sections of predetermined axial lengths and at least one transition section of predetermined axial length, wherein each of said at least one transition section is sandwiched between a distinct set of two of said at least two drill-tap sections. The at least two drill-tap sections are formed of one or more drill-tap cutting teeth spirally increasing along said at least two drill-tap sections, wherein said tool is self-advanced in said work piece along said formed threads, and wherein said tool simultaneously forms said bore and said series of threads along a substantially similar longitudinal axis.

  7. Regional Arctic System Model (RASM): A Tool to Advance Understanding and Prediction of Arctic Climate Change at Process Scales

    NASA Astrophysics Data System (ADS)

    Maslowski, W.; Roberts, A.; Osinski, R.; Brunke, M.; Cassano, J. J.; Clement Kinney, J. L.; Craig, A.; Duvivier, A.; Fisel, B. J.; Gutowski, W. J., Jr.; Hamman, J.; Hughes, M.; Nijssen, B.; Zeng, X.

    2014-12-01

    The Arctic is undergoing rapid climatic changes, which are among the most coordinated changes currently occurring anywhere on Earth. They are exemplified by the retreat of the perennial sea ice cover, which integrates forcing by, exchanges with, and feedbacks between the atmosphere, ocean, and land. While historical reconstructions from Global Climate and Global Earth System Models (GC/ESMs) are in broad agreement with these changes, the rate of change in the GC/ESMs remains outpaced by observations. The reasons stem from a combination of coarse model resolution, inadequate parameterizations, unrepresented processes, and limited knowledge of physical and other real-world interactions. We demonstrate the capability of the Regional Arctic System Model (RASM) to address some of these GC/ESM limitations in simulating observed seasonal-to-decadal variability and trends in the sea ice cover and climate. RASM is a high-resolution, fully coupled, pan-Arctic climate model built on the Community Earth System Model (CESM) framework. It uses the Los Alamos Sea Ice Model (CICE) and the Parallel Ocean Program (POP) configured at an eddy-permitting resolution of 1/12°, as well as the Weather Research and Forecasting (WRF) and Variable Infiltration Capacity (VIC) models at 50 km resolution. All RASM components are coupled via the CESM flux coupler (CPL7) at 20-minute intervals. RASM is an example of a limited-area, process-resolving, fully coupled Earth system model which, owing to the additional constraints from lateral boundary conditions and nudging within a regional model domain, facilitates detailed comparisons with observational statistics that are not possible with GC/ESMs. In this talk, we emphasize the utility of RASM for understanding sensitivity across the model parameter space and the importance of critical processes and coupled feedbacks, and ultimately for reducing uncertainty in Arctic climate change projections.
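
The coupling arrangement described above lends itself to a small illustration. The Python sketch below shows only the arithmetic of a fixed coupling cadence: components with different internal time steps all synchronize at the flux-coupler interval. Only the 20-minute interval comes from the abstract; the per-component internal time steps are hypothetical, and this is not RASM or CPL7 code.

```python
COUPLING_INTERVAL_MIN = 20  # RASM components exchange fluxes every 20 minutes (per the abstract)

def steps_per_coupling(internal_dt_min):
    """How many internal model steps fit in one coupler interval.

    The internal step must divide the coupling interval evenly so that
    every component arrives at the exchange time together.
    """
    if COUPLING_INTERVAL_MIN % internal_dt_min != 0:
        raise ValueError("internal step must divide the coupling interval")
    return COUPLING_INTERVAL_MIN // internal_dt_min

# Hypothetical internal time steps (minutes) for the four component models
for name, dt in [("atmosphere", 2), ("ocean", 5), ("sea ice", 10), ("land", 20)]:
    print(name, steps_per_coupling(dt))  # 10, 4, 2, and 1 steps per exchange
```

In a hub-and-spoke coupler of this kind, each component integrates its own steps between exchanges, and the coupler remaps and merges fluxes at every shared boundary.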

  8. Development of advanced composite ceramic tool material

    SciTech Connect

    Huang Chuanzhen; Ai Xing

    1996-08-01

    An advanced ceramic cutting tool material has been developed by means of silicon carbide whisker (SiCw) reinforcement and silicon carbide particle (SiCp) dispersion. The material combines high bending strength with high fracture toughness. A comparison of the mechanical properties of Al{sub 2}O{sub 3}/SiCp (AP), Al{sub 2}O{sub 3}/SiCw (JX-1), and Al{sub 2}O{sub 3}/SiCp/SiCw (JX-2-I) confirms that the JX-2-I composite exhibits clear additive effects of both reinforcing and toughening. The reinforcing and toughening mechanisms of the JX-2-I composite were studied based on an analysis of thermal expansion mismatch and observation of the microstructure. The cutting performance of the JX-2-I composite was also investigated in preliminary tests.

  9. Kate's Model Verification Tools

    NASA Technical Reports Server (NTRS)

    Morgan, Steve

    1991-01-01

    Kennedy Space Center's Knowledge-based Autonomous Test Engineer (KATE) is capable of monitoring electromechanical systems, diagnosing their errors, and even repairing them when they crash. A survey of KATE's developer/modelers revealed that they were already using a sophisticated set of productivity enhancing tools. They did request five more, however, and those make up the body of the information presented here: (1) a transfer function code fitter; (2) a FORTRAN-Lisp translator; (3) three existing structural consistency checkers to aid in syntax checking their modeled device frames; (4) an automated procedure for calibrating knowledge base admittances to protect KATE's hardware mockups from inadvertent hand valve twiddling; and (5) three alternatives for the 'pseudo object', a programming patch that currently apprises KATE's modeling devices of their operational environments.

  10. Regional Arctic System Model (RASM): A Tool to Address the U.S. Priorities and Advance Capabilities for Arctic Climate Modeling and Prediction

    NASA Astrophysics Data System (ADS)

    Maslowski, W.; Roberts, A.; Cassano, J. J.; Gutowski, W. J., Jr.; Nijssen, B.; Osinski, R.; Zeng, X.; Brunke, M.; Duvivier, A.; Hamman, J.; Hossainzadeh, S.; Hughes, M.; Seefeldt, M. W.

    2015-12-01

    The Arctic is undergoing some of the most coordinated rapid climatic changes currently occurring anywhere on Earth, including the retreat of the perennial sea ice cover, which integrates forcing by, exchanges with, and feedbacks between the atmosphere, ocean, and land. While historical reconstructions from Earth System Models (ESMs) are in broad agreement with these changes, the rate of change in ESMs generally remains outpaced by observations. The reasons relate to a combination of coarse resolution, inadequate parameterizations, under-represented processes, and limited knowledge of physical interactions. We demonstrate the capability of the Regional Arctic System Model (RASM) to address some of these ESM limitations in simulating observed variability and trends in Arctic surface climate. RASM is a high-resolution, pan-Arctic coupled climate model with the sea ice and ocean model components configured at an eddy-permitting resolution of 1/12° and the atmosphere and land hydrology model components at 50 km resolution, all coupled at 20-minute intervals. RASM is an example of a limited-area, process-resolving, fully coupled ESM which, owing to the constraints from boundary conditions, facilitates detailed comparisons with observational statistics that are not possible with global ESMs. The overall goal of RASM is to address key requirements published in the Navy Arctic Roadmap: 2014-2030 and in the Implementation Plan for the National Strategy for the Arctic Region regarding the need for advanced modeling capabilities for operational forecasting and strategic climate predictions through 2030. The main science objectives of RASM are to advance understanding and model representation of critical physical processes and feedbacks of importance to sea ice thickness and area distribution. 
RASM results are presented to quantify relative contributions by (i) resolved processes and feedbacks as well as (ii) sensitivity to space-dependent sub-grid parameterizations to better

  11. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the innovative computational techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  12. Anvil Forecast Tool in the Advanced Weather Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and National Weather Service Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) was tasked to create a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) that indicates the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. The tool creates a graphic depicting the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on the average of the upper level observed or forecasted winds. The graphic includes 10 and 20 n mi standoff circles centered at the location of interest, as well as one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30-degree sector width based on a previous AMU study that determined thunderstorm anvils move in a direction plus or minus 15 degrees of the upper-level wind direction. The AMU was then tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). SMG later requested the tool be updated to provide more flexibility and quicker access to model data. This presentation describes the work performed by the AMU to transition the tool into AWIPS, as well as the subsequent improvements made to the tool.
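
The arc geometry described above is simple enough to sketch. The Python below is a hypothetical reconstruction, not the AMU code: it returns the upwind distance an anvil would cover in a given number of hours and the bearings bounding the 30-degree sector (wind direction plus or minus 15 degrees).

```python
def anvil_arc(wind_from_deg, wind_speed_kt, hours):
    """Distance and sector of the anvil-threat arc upwind of a site.

    wind_from_deg -- upper-level wind direction (degrees, meteorological "from")
    wind_speed_kt -- upper-level wind speed in knots
    hours         -- forecast lead time (one, two, or three hours in the tool)
    Returns (distance_nm, (left_bearing_deg, right_bearing_deg)).
    """
    distance_nm = wind_speed_kt * hours  # knots x hours = nautical miles
    left = (wind_from_deg - 15) % 360    # the arcs span a 30-degree sector
    right = (wind_from_deg + 15) % 360
    return distance_nm, (left, right)

# Example: a 50 kt wind from 250 degrees puts the two-hour arc 100 n mi upwind
print(anvil_arc(250.0, 50.0, 2))  # (100.0, (235.0, 265.0))
```

The 10 and 20 n mi standoff circles mentioned in the abstract are drawn directly around the location of interest and need no wind input.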

  13. Development of Advanced Tools for Cryogenic Integration

    NASA Astrophysics Data System (ADS)

    Bugby, D. C.; Marland, B. C.; Stouffer, C. J.; Kroliczek, E. J.

    2004-06-01

    This paper describes four advanced devices (or tools) that were developed to help solve problems in cryogenic integration. The four devices are: (1) an across-gimbal nitrogen cryogenic loop heat pipe (CLHP); (2) a miniaturized neon CLHP; (3) a differential thermal expansion (DTE) cryogenic thermal switch (CTSW); and (4) a dual-volume nitrogen cryogenic thermal storage unit (CTSU). The across-gimbal CLHP provides a low torque, high conductance solution for gimbaled cryogenic systems wishing to position their cryocoolers off-gimbal. The miniaturized CLHP combines thermal transport, flexibility, and thermal switching (at 35 K) into one device that can be directly mounted to both the cooler cold head and the cooled component. The DTE-CTSW, designed and successfully tested in a previous program using a stainless steel tube and beryllium (Be) end-pieces, was redesigned with a polymer rod and high-purity aluminum (Al) end-pieces to improve performance and manufacturability while still providing a miniaturized design. Lastly, the CTSU was designed with a 6063 Al heat exchanger and integrally welded, segmented, high purity Al thermal straps for direct attachment to both a cooler cold head and a Be component whose peak heat load exceeds its average load by 2.5 times. For each device, the paper will describe its development objective, operating principles, heritage, requirements, design, test data and lessons learned.

  14. Advanced cryogenics for cutting tools. Final report

    SciTech Connect

    Lazarus, L.J.

    1996-10-01

    The purpose of the investigation was to determine whether cryogenic treatment improved the life and cost effectiveness of perishable cutting tools relative to other treatments or coatings. Test results showed that in five of the seven perishable cutting tools tested there was no improvement in tool life. The other two tools showed a small gain in tool life, but not as much as switching manufacturers of the cutting tool provided. The following conclusions were drawn from this study: (1) titanium nitride coatings are more effective than cryogenic treatment in increasing the life of perishable cutting tools made from all cutting tool materials, (2) cryogenic treatment may increase tool life if the cutting tool was improperly heat treated when originally manufactured, and (3) cryogenic treatment was only effective on tools made from less sophisticated high-speed tool steels. As part of a recent detailed investigation, four cutting tool manufacturers and two cutting tool laboratories were queried, and none could supply any data to substantiate cryogenic treatment of perishable cutting tools.

  15. Distillation Column Modeling Tools

    SciTech Connect

    2001-09-01

    Advanced Computational and Experimental Techniques will Optimize Distillation Column Operation. Distillation is a low thermal efficiency unit operation that currently consumes 4.8 quadrillion BTUs of energy...

  16. Advanced machine tools, loading systems viewed

    NASA Astrophysics Data System (ADS)

    Kharkov, V. I.

    1986-03-01

    The machine-tooling complex, built from a revolving lathe and a two-armed robot, is designed to machine short bodies of revolution, including parts with curvilinear and threaded surfaces, from piece blanks in either small-series or series multi-item production. The complex consists of: (1) a model 1V340F30 revolving lathe with a vertical axis of rotation, an 8-position revolving head on a cross carriage, and an Elektronika NTs-31 on-line control system; (2) a gantry-style two-armed M20-Ts robot with a 20-kilogram (20 x 2) load capacity; and (3) an 8-position indexable blank table, one of whose positions is reserved for unloading finished parts. Machined parts are subsequently set onto the position from which the blanks were unloaded. The complex's enclosure permits adjustment and process correction during maintenance, as well as convenient observation of the machining process.

  17. Anvil Tool in the Advanced Weather Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Barrett, Joe, III; Bauman, William, III; Keen, Jeremy

    2007-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) created a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) to indicate the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. In order for the Anvil Tool to remain available to the meteorologists, the AMU was tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). This report describes the work done by the AMU to develop the Anvil Tool for AWIPS to create a graphical overlay depicting the threat from thunderstorm anvil clouds. The AWIPS Anvil Tool is based on the previously deployed AMU MIDDS Anvil Tool. SMG and 45 WS forecasters have used the MIDDS Anvil Tool during launch and landing operations. SMG's primary weather analysis and display system is now AWIPS, and the 45 WS has plans to replace MIDDS with AWIPS. The Anvil Tool creates a graphic that users can overlay on satellite or radar imagery to depict the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on an average of the upper-level observed or forecasted winds. The graphic includes 10 and 20 n mi standoff circles centered at the location of interest, in addition to one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30-degree sector width based on a previous AMU study which determined thunderstorm anvils move in a direction plus or minus 15 degrees of the upper-level (300- to 150-mb) wind direction. This report briefly describes the history of the MIDDS Anvil Tool and then explains how the initial development of the AWIPS Anvil Tool was carried out. After testing was

  18. Advanced Concept Modeling

    NASA Technical Reports Server (NTRS)

    Chaput, Armand; Johns, Zachary; Hodges, Todd; Selfridge, Justin; Bevirt, Joeben; Ahuja, Vivek

    2015-01-01

    Advanced Concepts Modeling software validation, analysis, and design. This was a National Institute of Aerospace contract comprising many distinct efforts, ranging from software development and validation for structures and aerodynamics, through flight control development and aeropropulsive analysis, to UAV piloting services.

  19. Robotic-locomotor training as a tool to reduce neuromuscular abnormality in spinal cord injury: the application of system identification and advanced longitudinal modeling.

    PubMed

    Mirbagheri, Mehdi M; Kindig, Matthew; Niu, Xun; Varoqui, Deborah; Conaway, Petra

    2013-06-01

    In this study, the effect of the LOKOMAT, a robotic-assisted locomotor training system, on the reduction of neuromuscular abnormalities associated with spasticity was examined for the first time in the spinal cord injury (SCI) population. Twenty-three individuals with chronic incomplete SCI received 1-hour training sessions in the LOKOMAT three times per week, with up to 45 minutes of training per session; a matched control group received no intervention. The neuromuscular properties of the spastic ankle were evaluated prior to training and after 1, 2, and 4 weeks of training. A parallel-cascade system identification technique was used to determine the reflex and intrinsic stiffness of the ankle joint as a function of ankle position at each time point. The slope of the stiffness vs. joint angle curve, i.e., the modulation of stiffness with joint position, was then calculated and tracked over the four-week period. Growth Mixture Modeling (GMM), an advanced statistical method, was used to classify subjects into subgroups based on similar trends in the recovery pattern of the slope over time, and Random Coefficient Regression (RCR) was used to model the recovery patterns within each subgroup. All groups showed significant reductions in both reflex and intrinsic slope over time, but subjects in classes with higher baseline values of the slope showed larger improvements over the four weeks of training. These findings suggest that LOKOMAT training may also be useful for reducing the abnormal modulation of neuromuscular properties that arises as a secondary effect after SCI, and they can advise clinicians as to which patients stand to benefit the most from LOKOMAT training before it begins. Further, this study shows that system identification and GMM/RCR can serve as powerful tools to quantify and track spasticity over time in the SCI population. PMID:24187312
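
One quantity central to the study above, the slope of the stiffness-versus-angle curve, reduces per session to an ordinary least-squares fit. The sketch below uses invented stiffness numbers purely for illustration; it is not the authors' analysis pipeline.

```python
def slope(x, y):
    """Ordinary least-squares slope of y against x (pure Python)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Hypothetical reflex-stiffness values (Nm/rad) at five ankle angles (rad)
angles = [-0.2, -0.1, 0.0, 0.1, 0.2]
stiffness = [20.0, 28.0, 35.0, 44.0, 52.0]
print(slope(angles, stiffness))  # ~80.0 (Nm/rad per rad) for these invented data
```

Tracking this one number across sessions is what lets GMM group subjects by recovery trend and RCR model each group's trajectory.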

  20. ADVANCED CHEMISTRY BASINS MODEL

    SciTech Connect

    William Goddard III; Lawrence Cathles III; Mario Blanco; Paul Manhardt; Peter Meulbroek; Yongchun Tang

    2004-05-01

    The Advanced Chemistry Basin Model project has been operative for 48 months. During this period, about half of the project tasks have remained on the projected schedule; on average, the project is somewhat behind schedule (90%). Unanticipated issues are causing model integration to take longer than scheduled, delaying final debugging and development of the manual. It is anticipated that a short extension will be required to fulfill all contract obligations.

  1. Advanced tool kits for EPR security.

    PubMed

    Blobel, B

    2000-11-01

    Responding to the challenge of efficient, high-quality health care, the shared care paradigm must be established in health systems. In that context, information systems such as electronic patient records (EPR) have to support this paradigm, enabling communication and interoperation between the health care establishments (HCE) and health professionals (HP) involved. Due to the sensitivity of personal medical information, this co-operation must be provided in a trustworthy way. To accommodate the different views of HCE and HP, ranging from management, doctors, and nurses up to systems administrators and IT professionals, a set of models for the analysis, design, and implementation of secure distributed EPR has been developed and introduced. The approach is based on the popular UML methodology and the component paradigm for open, interoperable systems. Easy-to-use tool kits deal with application security services and communication security services, as well as with the security infrastructure needed. Regarding the requirements for distributed multi-user EPRs, the modelling and implementation of policy agreements, authorisation, and access control are especially considered. Current developments for a security infrastructure in health care based on cryptographic algorithms, such as health professional cards (HPC), security services employing digital signatures, and health-related TTP services, are discussed. CEN and ISO initiatives for health informatics standards in the context of secure and communicable EPR are especially mentioned. PMID:11154968

  2. SOURCE APPORTIONMENT RESULTS, UNCERTAINTIES, AND MODELING TOOLS

    EPA Science Inventory

    Advanced multivariate receptor modeling tools are available from the U.S. Environmental Protection Agency (EPA) that use only speciated sample data to identify and quantify sources of air pollution. EPA has developed both EPA Unmix and EPA Positive Matrix Factorization (PMF) and ...
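
EPA Unmix and PMF are specific implementations, but the idea they share, explaining a samples-by-species concentration matrix as non-negative source contributions times non-negative source profiles, can be sketched with plain non-negative matrix factorization. The code below uses the classic Lee-Seung multiplicative updates on invented data; it is not EPA code, and real PMF additionally weights each value by its measurement uncertainty, which this sketch omits.

```python
import random

def matmul(A, B):
    """Plain list-of-lists matrix product."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def frob_err(X, W, H):
    """Squared Frobenius reconstruction error ||X - WH||^2."""
    WH = matmul(W, H)
    return sum((X[i][j] - WH[i][j]) ** 2
               for i in range(len(X)) for j in range(len(X[0])))

def nmf(X, k, iters=300, eps=1e-9, seed=0):
    """Non-negative factorization X ~ W H via Lee-Seung multiplicative updates."""
    rng = random.Random(seed)
    n, m = len(X), len(X[0])
    W = [[rng.random() + 0.1 for _ in range(k)] for _ in range(n)]
    H = [[rng.random() + 0.1 for _ in range(m)] for _ in range(k)]
    for _ in range(iters):
        Wt = [list(r) for r in zip(*W)]
        num, den = matmul(Wt, X), matmul(matmul(Wt, W), H)
        H = [[H[a][j] * num[a][j] / (den[a][j] + eps) for j in range(m)] for a in range(k)]
        Ht = [list(r) for r in zip(*H)]
        num, den = matmul(X, Ht), matmul(W, matmul(H, Ht))
        W = [[W[i][a] * num[i][a] / (den[i][a] + eps) for a in range(k)] for i in range(n)]
    return W, H

# Invented data: 4 samples x 3 species, mixed from 2 hypothetical source profiles
profiles = [[1.0, 0.2, 0.0],   # e.g. a "traffic-like" profile
            [0.1, 0.5, 1.0]]   # e.g. a "soil-like" profile
contribs = [[2.0, 0.5], [1.0, 1.0], [0.2, 3.0], [1.5, 0.1]]
X = matmul(contribs, profiles)

W, H = nmf(X, k=2)
print(frob_err(X, W, H))  # small residual: the rank-2 data are well recovered
```

The updates keep every entry non-negative, which is what makes the recovered rows of H interpretable as chemical source profiles.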

  3. Interoperable mesh and geometry tools for advanced petascale simulations

    SciTech Connect

    Diachin, L; Bauer, A; Fix, B; Kraftcheck, J; Jansen, K; Luo, X; Miller, M; Ollivier-Gooch, C; Shephard, M; Tautges, T; Trease, H

    2007-07-04

    SciDAC applications have a demonstrated need for advanced software tools to manage the complexities associated with sophisticated geometry, mesh, and field manipulation tasks, particularly as computer architectures move toward the petascale. The Center for Interoperable Technologies for Advanced Petascale Simulations (ITAPS) will deliver interoperable and interchangeable mesh, geometry, and field manipulation services that are of direct use to SciDAC applications. The premise of our technology development goal is to provide such services as libraries that can be used with minimal intrusion into application codes. To develop these technologies, we focus on defining a common data model and data-structure-neutral interfaces that unify a number of different services such as mesh generation and improvement, front tracking, adaptive mesh refinement, shape optimization, and solution transfer operations. We highlight the use of several ITAPS services in SciDAC applications.

  4. Advanced Electric Submersible Pump Design Tool for Geothermal Applications

    SciTech Connect

    Xuele Qi; Norman Turnquist; Farshad Ghasripoor

    2012-05-31

    Electrical Submersible Pumps (ESPs) offer higher efficiency and larger production rates, and can be operated in deeper wells, than other geothermal artificial lift systems. Enhanced Geothermal Systems (EGS) applications call for lifting 300°C geothermal water at an 80 kg/s flow rate in a wellbore of at most 10-5/8-inch diameter to improve cost-effectiveness. In this paper, an advanced ESP design tool comprising a 1D theoretical model and a 3D CFD analysis has been developed to design ESPs for geothermal applications. Design of Experiments was also performed to optimize the geometry and performance. The designed mixed-flow centrifugal impeller and diffuser exhibit high efficiency and head rise under simulated EGS conditions. The design tool has been validated by comparing its predictions to experimental data from an existing ESP product.
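
As a back-of-the-envelope companion to the 1D model mentioned above, the hydraulic power an ESP must deliver follows directly from the duty point. The 80 kg/s mass flow comes from the abstract; the 1000 m dynamic head used below is a hypothetical value for illustration only.

```python
G = 9.81  # gravitational acceleration, m/s^2

def hydraulic_power_kw(mass_flow_kg_s, head_m, g=G):
    """Hydraulic power P = m_dot * g * H, in kW (mass-flow form, so density drops out)."""
    return mass_flow_kg_s * g * head_m / 1000.0

# EGS duty point from the abstract (80 kg/s) with a hypothetical 1000 m dynamic head
print(hydraulic_power_kw(80.0, 1000.0))  # ~784.8 kW of hydraulic power
```

Shaft power is the hydraulic power divided by pump efficiency, so a 70%-efficient pump would draw roughly 1.1 MW at this hypothetical duty point.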

  5. An online model composition tool for system biology models

    PubMed Central

    2013-01-01

    Background There are multiple representation formats for Systems Biology computational models, and the Systems Biology Markup Language (SBML) is one of the most widely used. SBML is used to capture, store, and distribute computational models by Systems Biology data sources (e.g., the BioModels Database) and researchers. Therefore, there is a need for all-in-one web-based solutions that support advanced SBML functionalities such as uploading, editing, composing, visualizing, simulating, querying, and browsing computational models. Results We present the design and implementation of the Model Composition Tool (Interface) within the PathCase-SB (PathCase Systems Biology) web portal. The tool helps users compose systems biology models to facilitate the complex process of merging systems biology models. We also present three tools that support the model composition tool, namely, (1) the Model Simulation Interface, which generates a visual plot of the simulation according to the user's input; (2) the iModel Tool, a platform for users to upload their own models to compose; and (3) the SimCom Tool, which provides a side-by-side comparison of models being composed in the same pathway. Finally, we provide a web site that hosts BioModels Database models and a separate web site that hosts SBML Test Suite models. Conclusions The model composition tool (and the other three tools) can be used with little or no knowledge of the SBML document structure. For this reason, students or anyone who wants to learn about systems biology will benefit from the described functionalities. The SBML Test Suite models will be a good starting point for beginners, and, for more advanced purposes, users will be able to access and employ models of the BioModels Database as well. PMID:24006914
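
To see why "little or no knowledge of the SBML document structure" is a selling point: even listing a model's species by hand requires namespace-aware XML handling. The fragment below is an invented minimal model, parsed with Python's standard library; production tools would use a dedicated SBML library rather than raw XML.

```python
import xml.etree.ElementTree as ET

# A hypothetical minimal SBML Level 3 fragment (real models come from e.g. BioModels)
SBML = """<?xml version="1.0" encoding="UTF-8"?>
<sbml xmlns="http://www.sbml.org/sbml/level3/version1/core" level="3" version="1">
  <model id="toy_model">
    <listOfSpecies>
      <species id="glucose" compartment="cytosol"/>
      <species id="atp" compartment="cytosol"/>
    </listOfSpecies>
  </model>
</sbml>"""

# Every element lives in the SBML namespace, so queries must say so explicitly
NS = {"sbml": "http://www.sbml.org/sbml/level3/version1/core"}
root = ET.fromstring(SBML)
species = [s.get("id") for s in root.findall(".//sbml:species", NS)]
print(species)  # ['glucose', 'atp']
```

Composition, which is what the tool above automates, amounts to combining such lists across models while resolving clashing `id` values, which is exactly the bookkeeping a web interface can hide.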

  6. Alternative Fuel and Advanced Vehicle Tools (AFAVT), AFDC (Fact Sheet)

    SciTech Connect

    Not Available

    2010-01-01

    The Alternative Fuels and Advanced Vehicles Web site offers a collection of calculators, interactive maps, and informational tools to assist fleets, fuel providers, and others looking to reduce petroleum consumption in the transportation sector.

  7. Visualization tool for advanced laser system development

    NASA Astrophysics Data System (ADS)

    Crockett, Gregg A.; Brunson, Richard L.

    2002-06-01

    Simulation development for Laser Weapon Systems design and system trade analyses has progressed to new levels with the advent of object-oriented software development tools and PC processor capabilities. These tools allow rapid visualization of upcoming laser weapon system architectures and the ability to rapidly respond to what-if scenario questions from potential user commands. These simulations can solve very intensive problems in short time periods to investigate the parameter space of a newly emerging weapon system concept, or can address user mission performance for many different scenario engagements. Equally important to the rapid solution of complex numerical problems is the ability to rapidly visualize the results of the simulation, and to effectively interact with visualized output to glean new insights into the complex interactions of a scenario. Boeing has applied these ideas to develop a tool called the Satellite Visualization and Signature Tool (SVST). This Windows application is based upon a series of C++ coded modules that have evolved from several programs at Boeing-SVS. The SVST structure, extensibility, and some recent results of applying the simulation to weapon system concepts and designs will be discussed in this paper.

  8. Innovative Tools Advance Revolutionary Weld Technique

    NASA Technical Reports Server (NTRS)

    2009-01-01

    The iconic, orange external tank of the space shuttle launch system not only contains the fuel used by the shuttle's main engines during liftoff but also comprises the shuttle's backbone, supporting the space shuttle orbiter and solid rocket boosters. Given the tank's structural importance and the extreme forces (7.8 million pounds of thrust load) and temperatures it encounters during launch, the welds used to construct the tank must be highly reliable. Variable polarity plasma arc welding, developed for manufacturing the external tank and later employed for building the International Space Station, was until 1994 the best process for joining the aluminum alloys used during construction. That year, Marshall Space Flight Center engineers began experimenting with a relatively new welding technique called friction stir welding (FSW), developed in 1991 by The Welding Institute, of Cambridge, England. FSW differs from traditional fusion welding in that it is a solid-state welding technique, using frictional heat and motion to join structural components without actually melting any of the material. The weld is created by a shouldered pin tool that is plunged into the seam of the materials to be joined. The tool traverses the line while rotating at high speeds, generating friction that heats and softens but does not melt the metal. (The heat produced approaches about 80 percent of the metal's melting temperature.) The pin tool's rotation crushes and stirs the plasticized metal, extruding it along the seam as the tool moves forward. The material cools and consolidates, resulting in a weld with mechanical properties superior to those of fusion welds. The innovative FSW technology promises a number of attractive benefits. Because the welded materials are not melted, many of the undesirable effects associated with fusion welding (porosity, cracking, shrinkage, and distortion of the weld) are minimized or avoided. 
The process is more energy efficient, safe

  9. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.

  10. Terahertz Tools Advance Imaging for Security, Industry

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Picometrix, a wholly owned subsidiary of Advanced Photonix Inc. (API), of Ann Arbor, Michigan, invented the world's first commercial terahertz system. The company improved the portability and capabilities of its systems through Small Business Innovation Research (SBIR) agreements with Langley Research Center to provide terahertz imaging capabilities for inspecting the space shuttle external tanks and orbiters. Now API's systems make use of the unique imaging capacity of terahertz radiation on manufacturing floors, for thickness measurements of coatings, pharmaceutical tablet production, and even art conservation.

  11. [Advance directives, a tool to humanize care].

    PubMed

    Olmari-Ebbing, M; Zumbach, C N; Forest, M I; Rapin, C H

    2000-07-01

    The relationship between the patient and a medical caregiver is complex, especially from human, legal, and practical points of view. It depends on legal and deontological considerations, but also on professional habits. Today, we are confronted with a fundamental modification of this relationship. Professional guidelines exist, but are rarely applied and rarely taught in universities. However, patients are eager to move from a paternalistic relationship to a true partnership, more harmonious and more respectful of individual values ("value based medicine"). Advance directives give us an opportunity to improve our practices and to provide care consistent with the needs and wishes of each patient. PMID:10967645

  12. Tools for advance directives. American Health Information Management Association.

    PubMed

    Schraffenberger, L A

    1992-02-01

Use these as a model as you develop your own presentation geared for your specific audience. Last, but not least, we include samples of "Living Will Declaration" and "Durable Power of Attorney for Health Care" forms reprinted with permission from the American Association of Retired Persons (AARP). We include them here so you can examine the language of each type of advance directive. Copies for your organization should be requested from AARP at 1909 K Street NW, Washington, DC 20049, (202) 662-4895. Forms specific to each state are available from The Society for the Right to Die/Concern for Dying at 250 W. 57th Street, New York, NY 10107, (212) 246-6973. The requirement under The Patient Self-Determination Act became effective December 1, 1991, but the educational requirements of the act are meant to be ongoing. These "tools" are to help you continue to be a regular contributor to the educational process in your organization. PMID:10145646

  13. Advancing representation of hydrologic processes in the Soil and Water Assessment Tool (SWAT) through integration of the TOPographic MODEL (TOPMODEL) features

    USGS Publications Warehouse

    Chen, J.; Wu, Y.

    2012-01-01

This paper presents a study of the integration of the Soil and Water Assessment Tool (SWAT) model and the TOPographic MODEL (TOPMODEL) features for enhancing the physical representation of hydrologic processes. In SWAT, four hydrologic processes (surface runoff, baseflow, groundwater re-evaporation, and deep aquifer percolation) are modeled using a group of empirical equations. These empirical equations constrain the capability to simulate the relevant processes. To replace these equations and to model the influences of topography and water table variation on streamflow generation, the TOPMODEL features are integrated into SWAT, and a new model, called SWAT-TOP, is developed. In the new model, the process of deep aquifer percolation is removed, the concept of groundwater re-evaporation is refined, and the processes of surface runoff and baseflow are remodeled. Consequently, three parameters in SWAT are discarded, and two new parameters reflecting the TOPMODEL features are introduced. SWAT-TOP and SWAT are applied to the East River basin in South China, and the results reveal that, compared with SWAT, the new model can provide a more reasonable simulation of the hydrologic processes of surface runoff, groundwater re-evaporation, and baseflow. This study demonstrates that an established hydrologic model can be further improved by integrating the features of another model, which is a possible way to enhance our understanding of the workings of catchments.
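The TOPMODEL features referred to here center on two standard relations: the topographic wetness index, which determines where the water table reaches the surface, and an exponential saturation-deficit store for baseflow. A sketch of both (illustrative, not the SWAT-TOP implementation):

```python
import numpy as np

def topographic_index(upslope_area, slope_rad):
    """TOPMODEL topographic wetness index, ln(a / tan(beta)): cells with
    a large contributing area and a gentle slope saturate first."""
    return np.log(upslope_area / np.tan(slope_rad))

def baseflow(q0, mean_deficit, m):
    """TOPMODEL exponential-store baseflow, Qb = Q0 * exp(-Sbar / m),
    where Sbar is the catchment mean saturation deficit."""
    return q0 * np.exp(-mean_deficit / m)

# A gentle, convergent valley cell vs. a steep hillslope cell.
ti_flat = topographic_index(upslope_area=1000.0, slope_rad=0.02)
ti_steep = topographic_index(upslope_area=50.0, slope_rad=0.35)
```

The valley cell gets the higher index, so under TOPMODEL's assumptions it contributes saturation-excess runoff before the hillslope cell does.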

  14. Advances in nanocrystallography as a proteomic tool.

    PubMed

    Pechkova, Eugenia; Bragazzi, Nicola Luigi; Nicolini, Claudio

    2014-01-01

In order to overcome the difficulties and hurdles so often encountered in crystallizing a protein with conventional techniques, our group has introduced the innovative Langmuir-Blodgett (LB)-based crystallization, a major advance in the field of both structural and functional proteomics, thus pioneering the emerging field of so-called nanocrystallography or nanobiocrystallography. This approach uniquely combines protein crystallography and nanotechnologies within an integrated, coherent framework that allows one to obtain highly stable protein crystals and to fully characterize them at the nano- and subnanoscale. A variety of experimental techniques and theoretical/semi-theoretical approaches, ranging from atomic force microscopy, circular dichroism, Raman spectroscopy and other spectroscopic methods, and microbeam grazing-incidence small-angle X-ray scattering to in silico simulations, bioinformatics, and molecular dynamics, has been exploited to study the LB films and to investigate the kinetics and main features of LB-grown crystals. When compared with classical hanging-drop crystallization, the LB technique appears strikingly superior and yields results comparable with crystallization in microgravity environments. Therefore, LB-based crystallography can have a tremendous impact on industrial and clinical/therapeutic applications, opening new perspectives for personalized medicine. These implications are envisaged and discussed in the present contribution. PMID:24985772

  15. Advanced CAN (Controller Area Network) Tool

    SciTech Connect

    Terry, D.J.

    2000-03-17

The CAN interface cards that are currently in use are PCMCIA based and use a microprocessor and CAN chip that are no longer in production. The long-term support of the SGT CAN interface is of concern due to this issue, along with performance inadequacies and technical support. The CAN bus is at the heart of the SGT trailer; if the CAN bus in the SGT trailer cannot be maintained adequately, then the trailer itself cannot be maintained adequately. These concerns led to the need for a CRADA to help develop a new product that would be called the "Gryphon" CAN tool. FM&T provided manufacturing expertise along with design criteria to ensure SGT compatibility and long-term support, and also provided resources for software support. Dearborn provided software and hardware design expertise to implement the necessary requirements. Both partners worked around heavy internal workloads to support completion of the project. This CRADA establishes a US source for an item that is very critical to the SGT project. The Dearborn Group had the same goal of providing a US alternative to German suppliers. The Dearborn Group was also interested in developing a CAN product with performance characteristics that place the Gryphon in a class by itself. This enhanced product not only meets and exceeds SGT requirements; it has opened up options that were not even considered before the project began. The cost of the product is also less than the European options.

  16. Advanced Production Planning Models

    SciTech Connect

    JONES,DEAN A.; LAWTON,CRAIG R.; KJELDGAARD,EDWIN A.; WRIGHT,STEPHEN TROY; TURNQUIST,MARK A.; NOZICK,LINDA K.; LIST,GEORGE F.

    2000-12-01

This report describes the innovative modeling approach developed as a result of a 3-year Laboratory Directed Research and Development project. The overall goal of this project was to provide an effective suite of solvers for advanced production planning at facilities in the nuclear weapons complex (NWC). We focused our development activities on problems related to operations at the DOE's Pantex Plant. These types of scheduling problems appear in many contexts other than Pantex--both within the NWC (e.g., Neutron Generators) and in other commercial manufacturing settings. We successfully developed an innovative and effective solution strategy for these types of problems. We have tested this approach on actual data from Pantex, and from Org. 14000 (Neutron Generator production). This report focuses on the mathematical representation of the modeling approach and presents three representative studies using Pantex data. Results associated with the Neutron Generator facility will be published in a subsequent SAND report. The approach to task-based scheduling described here represents a significant addition to the literature for large-scale, realistic scheduling problems in a variety of production settings.

  17. Advanced Chemistry Basins Model

    SciTech Connect

    William Goddard; Mario Blanco; Lawrence Cathles; Paul Manhardt; Peter Meulbroek; Yongchun Tang

    2002-11-10

The DOE-funded Advanced Chemistry Basin model project is intended to develop a public domain, user-friendly basin modeling software under PC or low end workstation environment that predicts hydrocarbon generation, expulsion, migration and chemistry. The main features of the software are that it will: (1) afford users the most flexible way to choose or enter kinetic parameters for different maturity indicators; (2) afford users the most flexible way to choose or enter compositional kinetic parameters to predict hydrocarbon composition (e.g., gas/oil ratio (GOR), wax content, API gravity, etc.) at different kerogen maturities; (3) calculate the chemistry, fluxes and physical properties of all hydrocarbon phases (gas, liquid and solid) along the primary and secondary migration pathways of the basin and predict the location and intensity of phase fractionation, mixing, gas washing, etc.; and (4) predict the location and intensity of de-asphaltene processes. The project has been operative for 36 months, and is on schedule for a successful completion at the end of FY 2003.

  18. The Application of the NASA Advanced Concepts Office, Launch Vehicle Team Design Process and Tools for Modeling Small Responsive Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Threet, Grady E.; Waters, Eric D.; Creech, Dennis M.

    2012-01-01

The Advanced Concepts Office (ACO) Launch Vehicle Team at the NASA Marshall Space Flight Center (MSFC) is recognized throughout NASA for launch vehicle conceptual definition and pre-phase A concept design evaluation. The Launch Vehicle Team has been instrumental in defining the vehicle trade space for many of NASA's high-level launch system studies, from the Exploration Systems Architecture Study (ESAS) through the Augustine Report, Constellation, and now the Space Launch System (SLS). The Launch Vehicle Team's approach to rapid turnaround and comparative analysis of multiple launch vehicle architectures has played a large role in narrowing the design options for future vehicle development. Recently the Launch Vehicle Team has adapted its vetted large-launch-vehicle tools, repackaging the process and capability to apply to smaller, more responsive launch vehicles. Along this development path the LV Team has evaluated trajectory tools and assumptions against sounding rocket trajectories and air launch systems, begun altering subsystem mass estimating relationships to handle smaller vehicle components, and, as an additional development driver, begun an in-house small launch vehicle study. With the recent interest in small responsive launch systems and the known capability and response time of the ACO LV Team, ACO's launch vehicle assessment capability can be utilized to rapidly evaluate the vast and opportune trade space that small launch vehicles currently encompass. This would provide a great benefit to the customer by reducing that large trade space to a select few alternatives that should best fit the customer's payload needs.

  19. Development and Integration of an Advanced Stirling Convertor Linear Alternator Model for a Tool Simulating Convertor Performance and Creating Phasor Diagrams

    NASA Technical Reports Server (NTRS)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2013-01-01

A simple model of the Advanced Stirling Convertor's (ASC) linear alternator and an AC bus controller has been developed and combined with a previously developed thermodynamic model of the convertor for a more complete simulation and analysis of system performance. The model was developed using Sage, a 1-D thermodynamic modeling program that now includes electromagnetic components. The convertor, consisting of a free-piston Stirling engine combined with a linear alternator, has sufficiently sinusoidal steady-state behavior to allow for phasor analysis of the forces and voltages acting in the system. A MATLAB graphical user interface (GUI) has been developed to interface with the Sage software for simplified use of the ASC model, calculation of forces, and automated creation of phasor diagrams. The GUI allows the user to vary convertor parameters while fixing different input or output parameters and observe the effect on the phasor diagrams or system performance. The new ASC model and GUI help create a better understanding of the relationship between the electrical component voltages and mechanical forces. This allows better insight into the overall convertor dynamics and performance.
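Because the convertor's steady state is close to sinusoidal, each force and voltage can be represented as a complex phasor (magnitude plus phase) before being drawn on a phasor diagram. A minimal sketch of estimating a phasor from a sampled waveform over an integer number of cycles (illustrative, not the Sage/MATLAB tool's actual code):

```python
import numpy as np

def phasor(samples, t, omega):
    """Estimate the complex phasor of a steady-state sinusoid
    x(t) = A*cos(omega*t + phi) by correlating with a complex
    exponential over an integer number of cycles.
    abs(result) = A, np.angle(result) = phi."""
    return 2.0 * np.mean(samples * np.exp(-1j * omega * t))

# Hypothetical 60 Hz waveform: amplitude 3, phase pi/4.
omega = 2 * np.pi * 60.0
t = np.arange(0, 1.0, 1e-4)          # exactly 60 full cycles
x = 3.0 * np.cos(omega * t + np.pi / 4)
z = phasor(x, t, omega)
```

With every force and voltage reduced to such a complex number, the phasor diagram is just the set of these values plotted in the complex plane.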

  20. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water and to process wastes, in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT provides a means of investigating combinations of such subsystem technologies and thereby assists in determining the most cost-effective technology combination available. ALSSAT can perform sizing analysis of ALS subsystems whether they are dynamic or steady-state in nature. Using Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies obtained by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.
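The Equivalent System Mass metric mentioned above folds volume, power, cooling, and crew time into a single mass-like figure of merit for ranking technology combinations. A minimal sketch of the standard ALS formulation (the equivalency factors in the example call are illustrative placeholders, not ALSSAT's actual mission values):

```python
def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw,
                           crewtime_hr_per_yr, duration_yr,
                           v_eq, p_eq, c_eq, ct_eq):
    """Equivalent System Mass as used in ALS trade studies:
    ESM = M + V*Veq + P*Peq + C*Ceq + CT*D*CTeq.
    The equivalency factors (v_eq [kg/m^3], p_eq [kg/kW], c_eq [kg/kW],
    ct_eq [kg per crew-hour]) are mission-specific inputs."""
    return (mass_kg + volume_m3 * v_eq + power_kw * p_eq
            + cooling_kw * c_eq + crewtime_hr_per_yr * duration_yr * ct_eq)

# Hypothetical subsystem with placeholder equivalency factors.
esm = equivalent_system_mass(500.0, 10.0, 2.0, 2.0, 100.0, 1.0,
                             v_eq=66.7, p_eq=237.0, c_eq=60.0, ct_eq=0.5)
```

Comparing ESM rather than launch mass alone is what lets a trade study penalize a technology that is light but power- or crew-time-hungry.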

  1. Expert Models and Modeling Processes Associated with a Computer-Modeling Tool

    ERIC Educational Resources Information Center

    Zhang, BaoHui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-01-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using "think aloud" technique…

  2. A storm modeling system as an advanced tool in prediction of well organized slowly moving convective cloud system and early warning of severe weather risk

    NASA Astrophysics Data System (ADS)

    Spiridonov, Vlado; Curic, Mladjen

    2015-02-01

Short-range prediction of precipitation is a critical input to flood prediction and hence to the accuracy of flood warnings. Since most of the intensive processes come from convective clouds, the primary aim is to forecast these small-scale atmospheric processes. One characteristic pattern of an organized group of convective clouds consists of a line of deep convection, resulting in the repeated passage of heavy-rain-producing convective cells over the NW part of Macedonia along the line. This slowly moving convective system produced extreme local rainfall and hailfall in the urban area of Skopje. A 3-D cloud model is used to simulate the main storm characteristics (e.g., structure, intensity, evolution) and the main physical processes responsible for the initiation of heavy rainfall and hailfall. The model showed good performance, producing significantly more realistic and spatially accurate forecasts of the convective rainfall event than is possible with the current operational system. The output results provide a good initial input for developing appropriate tools, such as flooding indices and potential-risk mapping, for interpreting and presenting the predictions so that they enhance the operational flood prediction capabilities and severe-weather warnings of weather services. The convective-scale model, even for the single case used, has demonstrated significant benefits in several aspects (initiation of convection, storm structure and evolution, and precipitation). The storm-scale model (grid spacing 1 km) is capable of producing significantly more realistic and spatially accurate forecasts of convective rainfall events than is possible with current operational systems based on a model with 15 km grid spacing.

  3. Microfield exposure tool enables advances in EUV lithography development

    SciTech Connect

    Naulleau, Patrick

    2009-09-07

With demonstrated resist resolution of 20 nm half pitch, the SEMATECH Berkeley EUV microfield exposure tool continues to push crucial advances in the areas of EUV resists and masks. The ever-progressing shrink in computer chip feature sizes has been fueled over the years by a continual reduction in the wavelength of light used to pattern the chips. Recently, this trend has been threatened by the unavailability of lens materials suitable for wavelengths shorter than 193 nm. To circumvent this roadblock, a reflective technology utilizing a significantly shorter extreme ultraviolet (EUV) wavelength (13.5 nm) has been under development for the past decade. The dramatic wavelength shrink was required to compensate for optical design limitations intrinsic in mirror-based systems compared to refractive lens systems. With this significant reduction in wavelength comes a variety of new challenges, including developing sources of adequate power; photoresists with suitable resolution, sensitivity, and line-edge roughness characteristics; and the fabrication of reflection masks with zero defects. While source development can proceed in the absence of available exposure tools, for progress to be made in the areas of resists and masks it is crucial to have access to advanced exposure tools with resolutions equal to or better than that expected from initial production tools. These advanced development tools, however, need not be full-field tools. Also, implementing such tools at synchrotron facilities allows them to be developed independent of the availability of reliable stand-alone EUV sources. One such tool is the SEMATECH Berkeley microfield exposure tool (MET). The most distinctive attribute of the SEMATECH Berkeley MET is its use of a custom-coherence illuminator made possible by its implementation on a synchrotron beamline. With only conventional illumination and conventional binary masks, the resolution limit of the 0.3-NA optic is approximately 25 nm, however
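The resolution figures quoted above can be tied together through the Rayleigh criterion, R = k1 * lambda / NA. A quick sanity check using the numbers from the abstract (13.5 nm wavelength, 0.3 NA, 25 nm half pitch):

```python
def k1_factor(half_pitch_nm, wavelength_nm, numerical_aperture):
    """Rayleigh criterion R = k1 * lambda / NA, solved for k1."""
    return half_pitch_nm * numerical_aperture / wavelength_nm

# The 0.3-NA MET optic at the 13.5 nm EUV wavelength, 25 nm half pitch.
k1 = k1_factor(25.0, 13.5, 0.3)
```

The resulting k1 of roughly 0.56 is close to the conventional-illumination limit of 0.5, which is consistent with the abstract's statement that 25 nm is the resolution floor without resolution-enhancement techniques.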

  4. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  5. Advances in scientific balloon thermal modeling

    NASA Astrophysics Data System (ADS)

    Bohaboj, T.; Cathey, H.

The National Aeronautics and Space Administration's Balloon Program Office has long acknowledged that the accurate modeling of balloon performance and flight prediction is dependent on how well the balloon is thermally modeled. This ongoing effort is focused on developing accurate balloon thermal models that can be used to quickly predict balloon temperatures and balloon performance. The ability to model parametric changes is also a driver for this effort. This paper will present the most recent advances made in this area. This research effort continues to utilize the "Thermal Desktop" add-on to AutoCAD for the modeling. Recent advances have been made by using this analytical tool. A number of analyses have been completed to test the applicability of this tool to the problem, with very positive results. Progressively detailed models have been developed to explore the capabilities of the tool as well as to provide guidance in model formulation. A number of parametric studies have been completed. These studies have varied the shape of the structure, material properties, environmental inputs, and model geometry. These studies have concentrated on spherical "proxy models" for the initial development stages and then transitioned to the natural-shaped zero-pressure and super-pressure balloons. An assessment of required model resolution has also been made. Model solutions have been cross-checked with known solutions via hand calculations, and the comparison of these cases will also be presented. One goal is to develop analysis guidelines and an approach for modeling balloons for both simple first-order estimates and detailed full models. This paper presents the step-by-step advances made as part of this effort, along with capabilities, limitations, and lessons learned. Also presented are the plans for further thermal modeling work.

  6. Advances in Scientific Balloon Thermal Modeling

    NASA Technical Reports Server (NTRS)

    Bohaboj, T.; Cathey, H. M., Jr.

    2004-01-01

The National Aeronautics and Space Administration's Balloon Program Office has long acknowledged that the accurate modeling of balloon performance and flight prediction is dependent on how well the balloon is thermally modeled. This ongoing effort is focused on developing accurate balloon thermal models that can be used to quickly predict balloon temperatures and balloon performance. The ability to model parametric changes is also a driver for this effort. This paper will present the most recent advances made in this area. This research effort continues to utilize the "Thermal Desktop" add-on to AutoCAD for the modeling. Recent advances have been made by using this analytical tool. A number of analyses have been completed to test the applicability of this tool to the problem, with very positive results. Progressively detailed models have been developed to explore the capabilities of the tool as well as to provide guidance in model formulation. A number of parametric studies have been completed. These studies have varied the shape of the structure, material properties, environmental inputs, and model geometry. These studies have concentrated on spherical "proxy models" for the initial development stages and then transitioned to the natural-shaped zero-pressure and super-pressure balloons. An assessment of required model resolution has also been made. Model solutions have been cross-checked with known solutions via hand calculations, and the comparison of these cases will also be presented. One goal is to develop analysis guidelines and an approach for modeling balloons for both simple first-order estimates and detailed full models. This paper presents the step-by-step advances made as part of this effort, along with capabilities, limitations, and lessons learned. Also presented are the plans for further thermal modeling work.

  7. Advanced Fuel Cycle Economic Tools, Algorithms, and Methodologies

    SciTech Connect

    David E. Shropshire

    2009-05-01

    The Advanced Fuel Cycle Initiative (AFCI) Systems Analysis supports engineering economic analyses and trade-studies, and requires a requisite reference cost basis to support adequate analysis rigor. In this regard, the AFCI program has created a reference set of economic documentation. The documentation consists of the “Advanced Fuel Cycle (AFC) Cost Basis” report (Shropshire, et al. 2007), “AFCI Economic Analysis” report, and the “AFCI Economic Tools, Algorithms, and Methodologies Report.” Together, these documents provide the reference cost basis, cost modeling basis, and methodologies needed to support AFCI economic analysis. The application of the reference cost data in the cost and econometric systems analysis models will be supported by this report. These methodologies include: the energy/environment/economic evaluation of nuclear technology penetration in the energy market—domestic and internationally—and impacts on AFCI facility deployment, uranium resource modeling to inform the front-end fuel cycle costs, facility first-of-a-kind to nth-of-a-kind learning with application to deployment of AFCI facilities, cost tradeoffs to meet nuclear non-proliferation requirements, and international nuclear facility supply/demand analysis. The economic analysis will be performed using two cost models. VISION.ECON will be used to evaluate and compare costs under dynamic conditions, consistent with the cases and analysis performed by the AFCI Systems Analysis team. Generation IV Excel Calculations of Nuclear Systems (G4-ECONS) will provide static (snapshot-in-time) cost analysis and will provide a check on the dynamic results. In future analysis, additional AFCI measures may be developed to show the value of AFCI in closing the fuel cycle. Comparisons can show AFCI in terms of reduced global proliferation (e.g., reduction in enrichment), greater sustainability through preservation of a natural resource (e.g., reduction in uranium ore depletion), value from
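The first-of-a-kind to nth-of-a-kind learning mentioned above is conventionally modeled with a log-linear learning curve, in which every doubling of cumulative units multiplies unit cost by a fixed learning rate. A sketch under that standard assumption (the 90% learning rate and costs below are illustrative, not AFCI values):

```python
import math

def nth_of_a_kind_cost(c1, n, learning_rate=0.9):
    """Classic learning-curve model: unit cost of the nth facility,
    C_n = C_1 * n**b with b = log2(learning_rate), so each doubling
    of cumulative units multiplies unit cost by `learning_rate`."""
    b = math.log(learning_rate, 2)
    return c1 * n ** b

# Hypothetical first-of-a-kind cost of 1000 (arbitrary units),
# on a 90% learning curve.
c2 = nth_of_a_kind_cost(1000.0, 2)   # second unit
c4 = nth_of_a_kind_cost(1000.0, 4)   # fourth unit
```

Under these placeholder numbers the second unit costs 90% of the first and the fourth costs 81%, which is the "each doubling cuts cost by the same fraction" behavior the model encodes.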

  8. Advancing computational methods for calibration of the Soil and Water Assessment Tool (SWAT): Application for modeling climate change impacts on water resources in the Upper Neuse Watershed of North Carolina

    NASA Astrophysics Data System (ADS)

    Ercan, Mehmet Bulent

Non-Dominated Sorting Genetic Algorithm II (NSGA-II). This tool was demonstrated through an application for the Upper Neuse Watershed in North Carolina, USA. The objective functions used for the calibration were Nash-Sutcliffe (E) and Percent Bias (PB), and the objective sites were the Flat, Little, and Eno watershed outlets. The results show that the use of multi-objective calibration algorithms for SWAT calibration improved model performance, especially in terms of minimizing PB, compared to single-objective model calibration. The third study builds upon the first two studies by leveraging the new calibration methods and tools to study future climate impacts on the Upper Neuse watershed. Statistically downscaled outputs from eight Global Circulation Models (GCMs) were used for both low and high emission scenarios to drive a well-calibrated SWAT model of the Upper Neuse watershed. The objective of the study was to understand the potential hydrologic response of the watershed, which serves as a public water supply for the growing Research Triangle Park region of North Carolina, under projected climate change scenarios. The future climate change scenarios, in general, indicate an increase in precipitation and temperature for the watershed in coming decades. The SWAT simulations using the future climate scenarios, in general, suggest an increase in soil water and water yield, and a decrease in evapotranspiration within the Upper Neuse watershed. In summary, this dissertation advances the field of watershed-scale hydrologic modeling by (i) providing some of the first work to apply cloud computing for the computationally demanding task of model calibration; (ii) providing a new, open source library that can be used by SWAT modelers to perform multi-objective calibration of their models; and (iii) advancing understanding of climate change impacts on water resources for an important watershed in the Research Triangle Park region of North Carolina.
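The two objective functions named in the abstract, Nash-Sutcliffe efficiency (E) and Percent Bias (PB), have standard definitions over observed and simulated flow series. A minimal sketch (not the dissertation's actual calibration library):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    1 is a perfect fit; 0 means no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def percent_bias(obs, sim):
    """PBIAS = 100 * sum(obs - sim) / sum(obs); 0 is unbiased,
    positive values indicate underestimation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

# Hypothetical daily flows at one outlet (arbitrary units).
obs = [10.0, 20.0, 30.0, 40.0]
sim = [12.0, 18.0, 29.0, 41.0]
nse = nash_sutcliffe(obs, sim)
pb = percent_bias(obs, sim)
```

In a multi-objective NSGA-II calibration, each candidate parameter set is scored on both functions at each objective site, and the algorithm keeps the non-dominated (Pareto) front rather than a single best compromise.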

  9. Advanced tools, multiple missions, flexible organizations, and education

    NASA Astrophysics Data System (ADS)

    Lucas, Ray A.; Koratkar, Anuradha

    2000-07-01

In this new era of modern astronomy, observations across multiple wavelengths are often required. This implies understanding many different costly and complex observatories. Yet the process for translating ideas into proposals is very similar for all of these observatories. If we had a new generation of uniform, common tools, writing proposals for the various observatories would be simpler for the observer because the learning curve would not be as steep. As observatory staffs struggle to meet the demands for higher scientific productivity with fewer resources, it is important to remember that another benefit of having such universal tools is that they enable much greater flexibility within an organization. The shifting manpower needs of multiple-instrument support or multiple-mission operations may be more readily met, since the expertise is built into the tools. The flexibility of an organization is critical to its ability to change, to plan ahead and respond to new opportunities and operating conditions on shorter time scales, and to achieve the goal of maximizing scientific returns. In this paper we discuss the role of a new generation of tools in relation to multiple missions and observatories. We also discuss how uniform, consistently familiar software tools can enhance the individual's expertise and the organization's flexibility. Finally, we discuss the relevance of advanced tools to higher education.

  10. Micromechanical Modeling Efforts for Advanced Composites

    NASA Technical Reports Server (NTRS)

    1997-01-01

Over the past two decades, NASA Lewis Research Center's in-house efforts in analytical modeling for advanced composites have yielded several computational predictive tools. These are, in general, based on simplified micromechanics equations. During the last 3 years, our efforts have been directed primarily toward developing prediction tools for high-temperature ceramic matrix composite (CMC) materials. These materials are being considered for High Speed Research program applications, specifically for combustor liners. In comparison to conventional materials, CMCs offer several advantages: high specific stiffness and strength, higher toughness and nonbrittle failure in comparison to monolithic ceramics, and environmental stability and wear resistance for both room-temperature and elevated-temperature applications. Under the sponsorship of the High Temperature Engine Materials Program (HITEMP), CMC analytical modeling has resulted in the computational tool Ceramic Matrix Composites Analyzer (CEMCAN).

  11. Advanced Chemistry Basins Model

    SciTech Connect

    Blanco, Mario; Cathles, Lawrence; Manhardt, Paul; Meulbroek, Peter; Tang, Yongchun

    2003-02-13

    The objective of this project is to: (1) Develop a database of additional and better maturity indicators for paleo-heat flow calibration; (2) Develop maturation models capable of predicting the chemical composition of hydrocarbons produced by a specific kerogen as a function of maturity, heating rate, etc.; assemble a compositional kinetic database of representative kerogens; (3) Develop a 4 phase equation of state-flash model that can define the physical properties (viscosity, density, etc.) of the products of kerogen maturation, and phase transitions that occur along secondary migration pathways; (4) Build a conventional basin model and incorporate new maturity indicators and data bases in a user-friendly way; (5) Develop an algorithm which combines the volume change and viscosities of the compositional maturation model to predict the chemistry of the hydrocarbons that will be expelled from the kerogen to the secondary migration pathways; (6) Develop an algorithm that predicts the flow of hydrocarbons along secondary migration pathways, accounts for mixing of miscible hydrocarbon components along the pathway, and calculates the phase fractionation that will occur as the hydrocarbons move upward down the geothermal and fluid pressure gradients in the basin; and (7) Integrate the above components into a functional model implemented on a PC or low cost workstation.

  12. Advanced Turbulence Modeling Concepts

    NASA Technical Reports Server (NTRS)

    Shih, Tsan-Hsing

    2005-01-01

    The ZCET program at NASA Glenn Research Center studies hydrogen/air injection concepts for aircraft gas turbine engines that meet conventional gas turbine performance levels while producing low levels of harmful NOx emissions. A CFD study for the ZCET program has been successfully carried out. It uses the most recently enhanced National Combustion Code (NCC) to perform CFD simulations for two hydrogen fuel injector configurations (the GRC and Sandia injectors). The results can be used to assist experimental studies in developing quick-mixing, low-emission, high-performance fuel injector designs. The work started with the single-hole injector configuration. The computational models were taken from the experimental designs. For example, the GRC single-hole injector consists of one air tube (0.78 inches long and 0.265 inches in diameter) and two hydrogen tubes (0.3 inches long and 0.0226 inches in diameter, opposed at 180 degrees). The hydrogen tubes are located 0.3 inches upstream from the exit of the air element (the inlet location for the combustor). For the simulation, the single-hole injector is connected to a combustor model (8.16 inches long and 0.5 inches in diameter). The inlet conditions for the air and hydrogen elements are defined according to the actual experimental designs. The two crossing hydrogen/air jets are simulated in detail in the injector. The cold flow, reacting flow, flame temperature, combustor pressure, and possible flashback phenomena are studied. Two grid resolutions of the numerical model were adopted: the first computational grid contains 0.52 million elements, the second over 1.3 million. The CFD results show only about a 5% difference between the two grid resolutions; therefore, the result obtained from the 1.3-million-element grid can be considered a grid-independent numerical solution. Turbulence models built into NCC are consolidated and well tested. They can handle both coarse and
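
    The grid-independence argument above reduces to a simple check: rerun the case on a finer grid and compare a monitored quantity. A minimal sketch (the 5% threshold and the temperature values are illustrative, not the actual NCC results):

```python
def relative_difference(coarse: float, fine: float) -> float:
    """Relative change in a monitored quantity between two grid resolutions."""
    return abs(fine - coarse) / abs(fine)

# Hypothetical monitored quantity, e.g. peak flame temperature (K)
# on the 0.52M-element and 1.3M-element grids
diff = relative_difference(2310.0, 2200.0)
grid_independent = diff <= 0.05  # the ~5% level cited in the abstract
```

    In practice several quantities (temperature, pressure, species fields) would be compared before declaring the solution grid-independent.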

  13. NATIONAL URBAN DATABASE AND ACCESS PORTAL TOOL (NUDAPT): FACILITATING ADVANCEMENTS IN URBAN METEOROLOGY AND CLIMATE MODELING WITH COMMUNITY-BASED URBAN DATABASES

    EPA Science Inventory

    We discuss the initial design and application of the National Urban Database and Access Portal Tool (NUDAPT). This new project is sponsored by the USEPA and involves collaborations and contributions from many groups from federal and state agencies, and from private and academic i...

  14. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Astrophysics Data System (ADS)

    Doyle, Monica M.; O'Neil, Daniel A.; Christensen, Carissa B.

    2005-02-01

    Forecasting technology capabilities requires a tool and a process for capturing state-of-the-art technology metrics and estimates for future metrics. A decision support tool, known as the Advanced Technology Lifecycle Analysis System (ATLAS), contains a Technology Tool Box (TTB) database designed to accomplish this goal. Sections of this database correspond to a Work Breakdown Structure (WBS) developed by NASA's Exploration Systems Research and Technology (ESRT) Program. These sections cover the waterfront of technologies required for human and robotic space exploration. Records in each section include technology performance, operations, and programmatic metrics. Timeframes in the database provide metric values for the state of the art (Timeframe 0) and forecasts for timeframes that correspond to spiral development milestones in NASA's Exploration Systems Mission Directorate (ESMD) development strategy. Collecting and vetting data for the TTB will involve technologists from across the agency, the aerospace industry and academia. Technologists will have opportunities to submit technology metrics and forecasts to the TTB development team. Semi-annual forums will facilitate discussions about the basis of forecast estimates. As the tool and process mature, the TTB will serve as a powerful communication and decision support tool for the ESRT program.
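
    The record structure described above (WBS section, metrics, values per timeframe) might be sketched as a small data class; the field names and the specific-impulse numbers are hypothetical, not the actual TTB schema:

```python
from dataclasses import dataclass, field

@dataclass
class TechnologyRecord:
    """Illustrative shape of a TTB entry: one metric within a WBS section,
    with a value per timeframe (0 = current state of the art, higher
    timeframes = forecasts tied to development milestones)."""
    wbs_section: str
    metric: str
    unit: str
    values_by_timeframe: dict = field(default_factory=dict)

# Hypothetical propulsion entry with a state-of-the-art value and two forecasts
rec = TechnologyRecord("Propulsion", "specific impulse", "s",
                       {0: 452.0, 1: 465.0, 2: 480.0})
```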

  15. STRING 3: An Advanced Groundwater Flow Visualization Tool

    NASA Astrophysics Data System (ADS)

    Schröder, Simon; Michel, Isabel; Biedert, Tim; Gräfe, Marius; Seidel, Torsten; König, Christoph

    2016-04-01

    The visualization of 3D groundwater flow is a challenging task. Previous versions of our software STRING [1] focused solely on intuitive visualization of complex flow scenarios for non-professional audiences. STRING, developed by Fraunhofer ITWM (Kaiserslautern, Germany) and delta h Ingenieurgesellschaft mbH (Witten, Germany), provides the necessary means for visualization of both 2D and 3D data on planar and curved surfaces. In this contribution, continuing Michel et al. [2], we discuss how to extend this approach to a full 3D tool and the challenges involved. This elevates STRING from a post-production tool to an exploration tool for experts. In STRING, moving pathlets provide an intuition of the velocity and direction of both steady-state and transient flows. The visualization concept is based on the Lagrangian view of the flow. To capture every detail of the flow, an advanced method for intelligent, time-dependent seeding is used, building on the Finite Pointset Method (FPM) developed by Fraunhofer ITWM. Lifting our visualization approach from 2D into 3D presents many new challenges. With the implementation of a seeding strategy for 3D, one of the major problems has already been solved (see Schröder et al. [3]). As pathlets only provide an overview of the velocity field, other means are required for the visualization of additional flow properties. We suggest the use of direct volume rendering and isosurfaces for scalar features. In this regard, we were able to develop an efficient approach for combining raytraced volume rendering with regular OpenGL geometries, achieved through the use of depth peeling or A-buffers for the rendering of transparent geometries. Animation of pathlets requires a strict boundary of the simulation domain; hence, STRING needs to extract the boundary, even from unstructured data, if it is not provided. In 3D we additionally need a good visualization of the boundary itself. For this, the silhouette based on the angle of

  16. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed-wing and rotary-wing aircraft, a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that differ from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications to the standard design tools are required to fully characterize LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. This limited LTA tool set could be expanded by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools would mitigate the reliance on the proprietary LTA design tools in use today. A set of well-researched, open-source, high-fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  17. Component Modeling Approach Software Tool

    2010-08-23

    The Component Modeling Approach Software Tool (CMAST) establishes a set of performance libraries of approved components (frames, glass, and spacers) which can be accessed for configuring fenestration products for a project and obtaining a U-factor, Solar Heat Gain Coefficient (SHGC), and Visible Transmittance (VT) rating for those products, which can then be reflected in a CMA Label Certificate for code compliance. CMAST is web-based as well as client-based. The completed CMA program and software tool will be useful in several ways for a vast array of stakeholders in the industry: generating performance ratings for bidding projects; ascertaining credible and accurate performance data; and obtaining third-party certification of overall product performance for code compliance.
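
    Whole-product ratings of the kind CMAST reports are commonly computed as area-weighted averages of component performance values. A minimal sketch of that idea (the component names and numbers below are hypothetical, and this is not CMAST's actual certified procedure):

```python
def area_weighted_u_factor(components):
    """Whole-product U-factor as the area-weighted average of component
    U-factors. `components` maps a component name to (U, area)."""
    total_area = sum(area for _, area in components.values())
    return sum(u * area for u, area in components.values()) / total_area

# Hypothetical fenestration product: (U in W/m^2K, area in m^2)
product = {
    "frame": (2.0, 0.4),
    "edge_of_glass": (1.8, 0.2),
    "center_of_glass": (1.2, 1.4),
}
u_total = area_weighted_u_factor(product)
```

    The same area-weighting pattern applies to SHGC and VT, which is what makes a component library approach practical: each approved component contributes its own certified values.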

  18. Review on advanced composite materials boring mechanism and tools

    NASA Astrophysics Data System (ADS)

    Shi, Runping; Wang, Chengyong

    2010-12-01

    With the rapid development of aviation and aerospace manufacturing technology, advanced composite materials, represented by carbon fibre reinforced plastics (CFRP) and super hybrid composites (fibre/metal plates), are more and more widely applied. The fibres are mainly carbon fibre, boron fibre, aramid fibre, and SiC fibre. The matrixes are resin matrix, metal matrix, and ceramic matrix. Advanced composite materials have higher specific strength and higher specific modulus than the first-generation glass fibre reinforced resin composites. They are widely used in the aviation and aerospace industry due to their high specific strength, high specific modulus, excellent ductility, corrosion resistance, heat insulation, sound insulation, shock absorption, and high- and low-temperature resistance. They are used for radomes, inlets, airfoils (fuel tanks included), flaps, ailerons, vertical tails, horizontal tails, air brakes, skins, baseboards, and tails, etc. Hardness is up to 62-65 HRC. When drilling unidirectional laminates, hole quality is greatly affected by the fibre laminate direction of the carbon fibre reinforced composite material because of its anisotropy. Burrs and splits appear at the exit because of stress concentration; in addition there is delamination, and the hole is prone to be undersized. Burrs are caused by poor sharpness of the cutting edge; delamination, tearing, and splitting are caused by high thrust force. Poorer cutting-edge sharpness leads to lower cutting performance and higher drilling force at the same time. The present research focuses on the interrelation between rotation speed, feed, drill geometry, drill life, cutting mode, tool material, etc., and thrust force. At the same time, the number of holes and the difficulty of hole-making in composites have also increased, requiring high-performance drills that do not introduce defects and have long tool life. It has become a trend to develop superhard-material tools and tools with special geometry for drilling

  20. ADVANCED MIXING MODELS

    SciTech Connect

    Lee, S; Dimenna, R; Tamburello, D

    2011-02-14

    Waste storage tanks at SRS contain settled sludge which varies in height from zero to 10 ft. The sludge has been characterized and modeled as micron-sized solids, typically 1 to 5 microns, at weight fractions as high as 20 to 30 wt%, specific gravities to 1.4, and viscosities up to 64 cp during motion. The sludge is suspended and mixed through the use of submersible slurry jet pumps. To suspend settled sludge, water is added to the tank as a slurry medium and stirred with the jet pump. Although there is considerable technical literature on mixing and solid suspension in agitated tanks, very little literature has been published on jet mixing in a large-scale tank. One of the main objectives in waste processing is to provide feed of a uniform slurry composition at a certain weight percentage (e.g., typically ~13 wt% at SRS) over an extended period of time. In preparation of the sludge for slurrying, several important questions have been raised with regard to sludge suspension and mixing of the solid suspension in the bulk of the tank: (1) How much time is required to prepare a slurry with a uniform solid composition? (2) How long will it take to suspend and mix the sludge for uniform composition in any particular waste tank? (3) What are good mixing indicators for answering these questions in a general fashion applicable to any waste tank/slurry pump geometry and fluid/sludge combination?

  1. ADVANCED MIXING MODELS

    SciTech Connect

    Lee, S; Dimenna, R; Tamburello, D

    2008-11-13

    The process of recovering the waste in storage tanks at the Savannah River Site (SRS) typically requires mixing the contents of the tank with one to four dual-nozzle jet mixers located within the tank. The typical criteria to establish a mixed condition in a tank are based on the number of pumps in operation and the time duration of operation. To ensure that a mixed condition is achieved, operating times are set conservatively long. This approach results in high operational costs because of the long mixing times and high maintenance and repair costs for the same reason. A significant reduction in both of these costs might be realized by reducing the required mixing time based on calculating a reliable indicator of mixing with a suitably validated computer code. The work described in this report establishes the basis for further development of the theory leading to the identified mixing indicators, the benchmark analyses demonstrating their consistency with widely accepted correlations, and the application of those indicators to SRS waste tanks to provide a better, physically based estimate of the required mixing time. Waste storage tanks at SRS contain settled sludge which varies in height from zero to 10 ft. The sludge has been characterized and modeled as micron-sized solids, typically 1 to 5 microns, at weight fractions as high as 20 to 30 wt%, specific gravities to 1.4, and viscosities up to 64 cp during motion. The sludge is suspended and mixed through the use of submersible slurry jet pumps. To suspend settled sludge, water is added to the tank as a slurry medium and stirred with the jet pump. Although there is considerable technical literature on mixing and solid suspension in agitated tanks, very little literature has been published on jet mixing in a large-scale tank. 
If shorter mixing times can be shown to support Defense Waste Processing Facility (DWPF) or other feed requirements, longer pump lifetimes can be achieved with associated operational cost and
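
    One family of mixing indicators raised in these reports is statistical uniformity of the solids concentration across the tank, for example the coefficient of variation of sampled concentrations. A minimal sketch (the probe values and the 5% criterion are hypothetical, not SRS acceptance criteria):

```python
import statistics

def mixing_uniformity_cv(samples):
    """Coefficient of variation (std/mean) of sampled solids concentration;
    one candidate scalar indicator of the approach to a uniform slurry."""
    return statistics.pstdev(samples) / statistics.fmean(samples)

# Hypothetical wt% solids at four probe locations, early vs. late in mixing
early = [2.0, 25.0, 8.0, 17.0]
late = [12.8, 13.1, 13.0, 13.1]
is_mixed = mixing_uniformity_cv(late) < 0.05  # e.g. a 5% uniformity criterion
```

    A validated CFD code would evaluate such an indicator over the whole computed concentration field rather than at a few probe points, which is what would let the required mixing time be estimated instead of set conservatively.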

  2. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    NASA Astrophysics Data System (ADS)

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-09-01

    Experimental research has become complex and thus a challenge to science education. Typically, only very few students can be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independently of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with the procedures and concepts of a modern research field. Evaluating the learning outcomes with semi-structured interviews in a pre/post design, we find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX). This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  3. Simulated Interactive Research Experiments as Educational Tools for Advanced Science.

    PubMed

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M; Hopf, Martin; Arndt, Markus

    2015-01-01

    Experimental research has become complex and thus a challenge to science education. Typically, only very few students can be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independently of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with the procedures and concepts of a modern research field. Evaluating the learning outcomes with semi-structured interviews in a pre/post design, we find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX). This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields. PMID:26370627

  5. Model Rocketry: University-Level Educational Tool

    ERIC Educational Resources Information Center

    Barrowman, James S.

    1974-01-01

    Describes how model rocketry can be a useful educational tool at the university level as a practical application of theoretical aerodynamic concepts and as a tool for students in experimental research. (BR)

  6. New advanced radio diagnostics tools for Space Weather Program

    NASA Astrophysics Data System (ADS)

    Krankowski, A.; Rothkaehl, H.; Atamaniuk, B.; Morawski, M.; Zakharenkova, I.; Cherniak, I.; Otmianowska-Mazur, K.

    2013-12-01

    To give a more detailed and complete understanding of the physical plasma processes that govern the solar-terrestrial space, and to develop qualitative and quantitative models of the magnetosphere-ionosphere-thermosphere coupling, it is necessary to design and build the next generation of instruments for space diagnostics and monitoring. Novel ground-based wide-area sensor networks, such as the LOFAR (Low Frequency Array) radar facility, comprising wide-band, vector-sensing radio receivers and multi-spacecraft plasma diagnostics, should help solve outstanding problems of space physics and describe long-term environmental changes. The LOw Frequency ARray (LOFAR) is a new fully digital radio telescope located in Europe and designed for frequencies between 30 MHz and 240 MHz. Three new LOFAR stations will be installed in Poland by summer 2015, distributed among three sites: Lazy (east of Krakow), Borowiec near Poznan, and Baldy near Olsztyn, all connected to Poznan via dedicated PIONIER links. Each site will host one LOFAR station (96 high-band + 96 low-band antennas). Most of the time they will work as part of the European network; however, when less heavily loaded, they can operate as a national network. The new digital radio frequency analyzer (RFA) on board the low-orbiting RELEC satellite was designed to monitor and investigate ionospheric plasma properties. This two-point diagnostic, combining ground-based measurements with instruments located in the topside ionosphere, can be a useful new tool for monitoring and diagnosing turbulent plasma properties. The RFA on board the RELEC satellite is the first in a series of experiments planned for launch into the near-Earth environment. To improve and validate the picture of large-scale and small-scale ionospheric structures, we will use GPS observations collected at the IGS/EPN network to reconstruct diurnal variations of TEC using all satellite passes over individual GPS stations and the
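
    The TEC reconstruction mentioned at the end relies on the standard dual-frequency geometry-free combination: the differential ionospheric delay between the two GPS carriers is proportional to the electron content along the ray. A minimal sketch, ignoring the receiver and satellite biases that a real IGS/EPN processing chain must estimate:

```python
# GPS L1/L2 carrier frequencies (Hz)
F1, F2 = 1_575.42e6, 1_227.60e6

def slant_tec(p1_m: float, p2_m: float) -> float:
    """Slant TEC in TECU from dual-frequency pseudoranges (metres) via the
    geometry-free combination; interfrequency biases are ignored here."""
    k = (F1**2 * F2**2) / (40.3 * (F1**2 - F2**2))
    return k * (p2_m - p1_m) / 1e16  # 1 TECU = 1e16 electrons / m^2
```

    Roughly one metre of differential code delay corresponds to about 9.5 TECU, which is why even modest multipath or bias errors matter for TEC mapping.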

  7. An Advanced Tool for Control System Design and Maintenance

    SciTech Connect

    Storm, Joachim; Lohmann, Heinz

    2006-07-01

    The detailed engineering for control systems is usually supported by CAD tools that create the relevant logic diagrams, including software parameters and signal cross-references. However, at this stage of the design, an early V and V process for checking the functional correctness of the design is not available. The article describes the scope and capabilities of an advanced control system design tool with an embedded capability for stand-alone simulation of complex logic structures. The tool provides the following features for constructing logic diagrams for control systems: - Drag-and-drop construction of logic diagrams using predefined symbol sets; - Cross-reference facility; - Data extraction facility; - Stand-alone simulation for logic diagrams featuring on-the-fly changes, signal line animation, value boxes, mini trends, etc.; - Creation and on-line animation of compound objects (handlers); - Code generation facility for simulation; - Code generation facility for several control systems. The results of the integrated simulation-based V and V process can be used further for initial control system configuration and life-cycle management, as well as for engineering test bed applications and, finally, in full-scope replica simulators for operator training. (authors)

  8. An Advanced Decision Support Tool for Electricity Infrastructure Operations

    SciTech Connect

    Chen, Yousu; Huang, Zhenyu; Wong, Pak C.; Mackey, Patrick S.; Allwardt, Craig H.; Ma, Jian; Greitzer, Frank L.

    2010-01-31

    Electricity infrastructure, as one of the most critical infrastructures in the U.S., plays an important role in modern societies. Its failure would lead to significant disruption of people's lives, industry, and commercial activities, and result in massive economic losses. Reliable operation of electricity infrastructure is an extremely challenging task because human operators need to consider thousands of possible configurations in near real time to choose the best option and operate the network effectively. In today's practice, electricity infrastructure operation is largely based on operators' experience with very limited real-time decision support, resulting in inadequate management of complex predictions and the inability to anticipate, recognize, and respond to situations caused by human errors, natural disasters, or cyber attacks. Therefore, a systematic approach is needed to manage the complex operational paradigms and choose the best option in a near-real-time manner. This paper proposes an advanced decision support tool for electricity infrastructure operations. The tool has the functions of turning large amounts of data into actionable information to help operators monitor power grid status in real time; performing trend analysis to identify system trends at the regional or system level to help operators foresee and discern emergencies; performing clustering analysis to assist operators in identifying the relationships between system configurations and affected assets; and interactively evaluating alternative remedial actions to aid operators in making effective and timely decisions. This tool can provide significant decision support on electricity infrastructure operations and lead to better reliability in power grids. This paper presents examples with actual electricity infrastructure data to demonstrate the capability of this tool.

  9. Bioinformatics Methods and Tools to Advance Clinical Care

    PubMed Central

    Lecroq, T.

    2015-01-01

    Summary. Objectives: To summarize excellent current research in the field of Bioinformatics and Translational Informatics with application in the health domain and clinical care. Method: We provide a synopsis of the articles selected for the IMIA Yearbook 2015, from which we attempt to derive a synthetic overview of current and future activities in the field. As last year, a first selection step was performed by querying MEDLINE with a list of MeSH descriptors completed by a list of terms adapted to the section. Each section editor evaluated the set of 1,594 articles separately, and the evaluation results were merged to retain 15 articles for peer review. Results: The selection and evaluation process of this Yearbook's section on Bioinformatics and Translational Informatics yielded four excellent articles regarding data management and genome medicine that are mainly tool-based papers. In the first article, the authors present PPISURV, a tool for uncovering the role of specific genes in cancer survival outcome. The second article describes the classifier PredictSNP, which combines six well-performing tools for predicting disease-related mutations. In the third article, by presenting a high-coverage map of the human proteome using high-resolution mass spectrometry, the authors highlight the need for using mass spectrometry to complement genome annotation. The fourth article is also related to patient survival and decision support; the authors present data-mining methods for large-scale datasets of past transplants, with the objective of identifying chances of survival. Conclusions: The current research activities still attest to the continuous convergence of Bioinformatics and Medical Informatics, with a focus this year on dedicated tools and methods to advance clinical care.
Indeed, there is a need for powerful tools for managing and interpreting complex, large-scale genomic and biological datasets, but also a need for user-friendly tools developed for the clinicians in their

  10. Clinical holistic health: advanced tools for holistic medicine.

    PubMed

    Ventegodt, Søren; Clausen, Birgitte; Nielsen, May Lyck; Merrick, Joav

    2006-01-01

    According to holistic medical theory, the patient will heal when old painful moments, the traumatic events of life that are often called "gestalts", are integrated in the present "now". The advanced holistic physician's expanded toolbox has many different tools to induce this healing, some of which are more dangerous and potentially traumatic than others. The more intense the therapeutic technique, the more emotional energy will be released and contained in the session, but the greater also the risk that the therapist will lose control of the session and lose the patient to his or her own dark side. Avoiding harm to the patient must be the highest priority in holistic existential therapy, making sufficient education and training an issue of the highest importance. The concept of "stepping up" the therapy by using more and more "dramatic" methods to gain access to repressed emotions and events has led us to a "therapeutic staircase" with ten steps: (1) establishing the relationship; (2) establishing intimacy, trust, and confidentiality; (3) giving support and holding; (4) taking the patient into the process of physical, emotional, and mental healing; (5) social healing of being in the family; (6) spiritual healing, returning to the abstract wholeness of the soul; (7) healing the informational layer of the body; (8) healing the three fundamental dimensions of existence, love, power, and sexuality, in a direct way using, among other techniques, "controlled violence" and "acupressure through the vagina"; (9) mind-expanding and consciousness-transformative techniques like psychotropic drugs; and (10) techniques transgressing the patient's borders and, therefore, often traumatizing (for instance, the use of force against the will of the patient). We believe that the systematic use of the staircase will greatly improve the power and efficiency of holistic medicine for the patient, and we invite broad cooperation in scientifically testing the efficiency of the advanced holistic

  11. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  12. Sandia Advanced MEMS Design Tools, Version 2.0

    2002-06-13

    Sandia Advanced MEMS Design Tools is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process b) Provide enabling educational information (including pictures, videos, technical information) c) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library) d) Facilitate the process of having MEMS fabricated at SNL e) Facilitate the process of having post-fabrication services performed. While there exist some files on the CD that are used in conjunction with the software AutoCAD, these files are not intended for use independent of the CD. NOTE: THE CUSTOMER MUST PURCHASE HIS/HER OWN COPY OF AutoCAD TO USE WITH THESE FILES.

  13. Towards Model Driven Tool Interoperability: Bridging Eclipse and Microsoft Modeling Tools

    NASA Astrophysics Data System (ADS)

    Brunelière, Hugo; Cabot, Jordi; Clasen, Cauê; Jouault, Frédéric; Bézivin, Jean

    Successful application of model-driven engineering approaches requires interchanging a lot of relevant data among the tool ecosystem employed by an engineering team (e.g., requirements elicitation tools, several kinds of modeling tools, reverse engineering tools, development platforms, and so on). Unfortunately, this is not a trivial task. Poor tool interoperability makes data interchange a challenge even among tools with a similar scope. This paper presents a model-based solution to overcome such interoperability issues. With our approach, the internal schema(s) (i.e., metamodel(s)) of each tool are made explicit and used as the basis for resolving syntactic and semantic differences between the tools. Once the corresponding metamodels are aligned, model-to-model transformations are (semi)automatically derived and executed to perform the actual data interchange. We illustrate our approach by bridging the Eclipse and Microsoft (DSL Tools and SQL Server Modeling) modeling tools.
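    The metamodel-alignment workflow described in this abstract can be illustrated with a toy sketch. The two schemas and the mapping rule below are invented for illustration and are far simpler than real Eclipse or Microsoft metamodels; the point is only that once each tool's internal schema is explicit, data interchange reduces to a derivable model-to-model transformation:

    ```python
    # Hypothetical sketch of metamodel-based data interchange. Assume tool A
    # stores classes as {"name", "fields"} and tool B as {"id", "attributes"}.
    # The function below plays the role of a model-to-model transformation
    # derived after aligning the two (toy) metamodels.

    def a_to_b(model_a):
        """Transform a tool-A model instance into tool B's schema."""
        return [
            {"id": cls["name"], "attributes": list(cls["fields"])}
            for cls in model_a
        ]

    model_a = [{"name": "Customer", "fields": ["id", "email"]}]
    model_b = a_to_b(model_a)
    ```

    In practice such mappings are expressed in a transformation language (e.g., ATL) rather than hand-written code, but the structure is the same: align schemas first, then derive the element-by-element mapping.
    
    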

  14. Evaluating modeling tools for the EDOS

    NASA Technical Reports Server (NTRS)

    Knoble, Gordon; Mccaleb, Frederick; Aslam, Tanweer; Nester, Paul

    1994-01-01

    The Earth Observing System (EOS) Data and Operations System (EDOS) Project is developing a functional, system performance model to support the system implementation phase of the EDOS, which is being designed and built by the Goddard Space Flight Center (GSFC). The EDOS Project will use modeling to meet two key objectives: (1) manage system design impacts introduced by unplanned changes in mission requirements; and (2) evaluate evolutionary technology insertions throughout the development of the EDOS. To select a suitable modeling tool, the EDOS modeling team developed an approach for evaluating modeling tools and languages by deriving evaluation criteria from both the EDOS modeling requirements and the development plan. Essential and optional features for an appropriate modeling tool were identified and compared with known capabilities of several modeling tools. Vendors were also provided the opportunity to model a representative EDOS processing function to demonstrate the applicability of their modeling tool to the EDOS modeling requirements. This paper emphasizes the importance of using a well-defined approach for evaluating tools to model complex systems like the EDOS. The results of this evaluation study do not in any way signify the superiority of any one modeling tool, since the results will vary with the specific modeling requirements of each project.

  15. Tools for the advancement of undergraduate statistics education

    NASA Astrophysics Data System (ADS)

    Schaffner, Andrew Alan

    To keep pace with advances in applied statistics and to maintain literate consumers of quantitative analyses, statistics educators stress the need for change in the classroom (Cobb, 1992; Garfield, 1993, 1995; Moore, 1991a; Snee, 1993; Steinhorst and Keeler, 1995). These authors stress a more concept oriented undergraduate introductory statistics course which emphasizes true understanding over mechanical skills. Drawing on recent educational research, this dissertation attempts to realize this vision by developing tools and pedagogy to assist statistics instructors. This dissertation describes statistical facets, pieces of statistical understanding that are building blocks of knowledge, and discusses DIANA, a World-Wide Web tool for diagnosing facets. Further, I show how facets may be incorporated into course design through the development of benchmark lessons based on the principles of collaborative learning (diSessa and Minstrell, 1995; Cohen, 1994; Reynolds et al., 1995; Bruer, 1993; von Glasersfeld, 1991) and activity based courses (Jones, 1991; Yackel, Cobb and Wood, 1991). To support benchmark lessons and collaborative learning in large classes I describe Virtual Benchmark Instruction, benchmark lessons which take place on a structured hypertext bulletin board using the technology of the World-Wide Web. Finally, I present randomized experiments which suggest that these educational developments are effective in a university introductory statistics course.

  16. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  17. Synthetic biology and molecular genetics in non-conventional yeasts: Current tools and future advances.

    PubMed

    Wagner, James M; Alper, Hal S

    2016-04-01

    Coupling the tools of synthetic biology with traditional molecular genetic techniques can enable the rapid prototyping and optimization of yeast strains. While the era of yeast synthetic biology began in the well-characterized model organism Saccharomyces cerevisiae, it is swiftly expanding to include non-conventional yeast production systems such as Hansenula polymorpha, Kluyveromyces lactis, Pichia pastoris, and Yarrowia lipolytica. These yeasts already have roles in the manufacture of vaccines, therapeutic proteins, food additives, and biorenewable chemicals, but recent synthetic biology advances have the potential to greatly expand and diversify their impact on biotechnology. In this review, we summarize the development of synthetic biological tools (including promoters and terminators) and enabling molecular genetics approaches that have been applied in these four promising alternative biomanufacturing platforms. An emphasis is placed on synthetic parts and genome editing tools. Finally, we discuss examples of synthetic tools developed in other organisms that can be adapted or optimized for these hosts in the near future. PMID:26701310

  18. Sandia Advanced MEMS Design Tools, Version 2.2.5

    2010-01-19

    The Sandia National Laboratories Advanced MEMS Design Tools, Version 2.2.5, is a collection of menus, prototype drawings, and executables that provide significant productivity enhancements when using AutoCAD to design MEMS components. This release is designed for AutoCAD 2000i, 2002, or 2004 and is supported under Windows NT 4.0, Windows 2000, or XP. SUMMiT V (Sandia Ultra-planar Multi-level MEMS Technology) is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process b) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library). New features in this version: AutoCAD 2004 support has been added. SafeExplode - a new feature that explodes blocks without affecting polylines (avoids exploding polylines into objects that are ignored by the DRC and Visualization tools). Layer control menu - a pull-down menu for selecting layers to isolate, freeze, or thaw. Updated tools: a check has been added to catch invalid block names. DRC features: added username/password validation and a method to update the user's password. SNL_DRC_WIDTH - a value to control the width of the DRC error lines. SNL_BIAS_VALUE - a value used to offset selected geometry. SNL_PROCESS_NAME - a value to specify the process name. Documentation changes: the documentation has been updated to include the new features. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  19. Sandia Advanced MEMS Design Tools, Version 2.2.5

    SciTech Connect

    Yarberry, Victor; Allen, James; Lantz, Jeffery; Priddy, Brian; Westling, Belinda

    2010-01-19

    The Sandia National Laboratories Advanced MEMS Design Tools, Version 2.2.5, is a collection of menus, prototype drawings, and executables that provide significant productivity enhancements when using AutoCAD to design MEMS components. This release is designed for AutoCAD 2000i, 2002, or 2004 and is supported under Windows NT 4.0, Windows 2000, or XP. SUMMiT V (Sandia Ultra-planar Multi-level MEMS Technology) is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process b) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library). New features in this version: AutoCAD 2004 support has been added. SafeExplode - a new feature that explodes blocks without affecting polylines (avoids exploding polylines into objects that are ignored by the DRC and Visualization tools). Layer control menu - a pull-down menu for selecting layers to isolate, freeze, or thaw. Updated tools: a check has been added to catch invalid block names. DRC features: added username/password validation and a method to update the user's password. SNL_DRC_WIDTH - a value to control the width of the DRC error lines. SNL_BIAS_VALUE - a value used to offset selected geometry. SNL_PROCESS_NAME - a value to specify the process name. Documentation changes: the documentation has been updated to include the new features. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  20. Introductory Tools for Radiative Transfer Models

    NASA Astrophysics Data System (ADS)

    Feldman, D.; Kuai, L.; Natraj, V.; Yung, Y.

    2006-12-01

    Satellite data are currently so voluminous that, despite their unprecedented quality and potential for scientific application, only a small fraction is analyzed due to two factors: researchers' computational constraints and the relatively small number of researchers actively utilizing the data. Ultimately it is hoped that the terabytes of unanalyzed data being archived can receive scientific scrutiny, but this will require a popularization of the associated analysis methods. Since a large portion of the complexity is associated with the proper implementation of the radiative transfer model, it is reasonable and appropriate to make the model as accessible as possible to general audiences. Unfortunately, the algorithmic and conceptual details that are necessary for state-of-the-art analysis also tend to frustrate accessibility for those new to remote sensing. Several efforts have been made to offer web-based radiative transfer calculations, and these are useful for limited calculations, but analysis of more than a few spectra requires the utilization of home- or server-based computing resources. We present a system that is designed to allow easier access to radiative transfer models, with implementation on a home computing platform, in the hope that this system can be used and expanded upon in advanced high school and introductory college settings. This learning-by-doing process is aided through the use of several powerful tools. The first is a wikipedia-style introduction to the salient features of radiative transfer that references the seminal works in the field and refers to more complicated calculations and algorithms sparingly. The second is a technical forum, commonly referred to as a tiki-wiki, that addresses technical and conceptual questions through public postings, private messages, and a ranked searching routine. Together, these tools may be able to facilitate greater interest in the field of remote sensing.

  1. Advanced Risk Reduction Tool (ARRT) Special Case Study Report: Science and Engineering Technical Assessments (SETA) Program

    NASA Technical Reports Server (NTRS)

    Kirsch, Paul J.; Hayes, Jane; Zelinski, Lillian

    2000-01-01

    This special case study report presents the Science and Engineering Technical Assessments (SETA) team's findings for exploring the correlation between the underlying models of Advanced Risk Reduction Tool (ARRT) relative to how it identifies, estimates, and integrates Independent Verification & Validation (IV&V) activities. The special case study was conducted under the provisions of SETA Contract Task Order (CTO) 15 and the approved technical approach documented in the CTO-15 Modification #1 Task Project Plan.

  2. Advancing Software Architecture Modeling for Large Scale Heterogeneous Systems

    SciTech Connect

    Gorton, Ian; Liu, Yan

    2010-11-07

    In this paper we describe how incorporating technology-specific modeling at the architecture level can help reduce risks and produce better designs for large, heterogeneous software applications. We draw an analogy with established modeling approaches in scientific domains, using groundwater modeling as an example, to help illustrate gaps in current software architecture modeling approaches. We then describe the advances in modeling, analysis and tooling that are required to bring sophisticated modeling and development methods within reach of software architects.

  3. Model Analysis ToolKit

    SciTech Connect

    Harp, Dylan R.

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes: defining parameters, defining observations, defining the model (a Python function), and defining samplesets (sets of parameter combinations). Currently supported functionality includes: forward model runs; Latin-Hypercube sampling of parameters; multi-dimensional parameter studies; parallel execution of parameter samples; model calibration using an internal Levenberg-Marquardt algorithm, the lmfit package, or the levmar package; and Markov Chain Monte Carlo using the pymc package. MATK facilitates model analysis using scipy (calibration via scipy.optimize) and rpy2 (a Python interface to R).
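    As a rough illustration of the kind of workflow MATK supports, here is a stdlib-only sketch of Latin Hypercube sampling over a user-defined forward model. The function names and the toy model are invented for illustration and do not reflect MATK's actual API:

    ```python
    import random

    def latin_hypercube(n_samples, bounds, seed=0):
        """Draw a Latin Hypercube sample: each parameter's range is split into
        n_samples equal strata and each stratum is used exactly once."""
        rng = random.Random(seed)
        columns = []
        for lo, hi in bounds:
            # one point per stratum, then shuffle the stratum order
            pts = [lo + (hi - lo) * (i + rng.random()) / n_samples
                   for i in range(n_samples)]
            rng.shuffle(pts)
            columns.append(pts)
        return list(zip(*columns))  # list of parameter tuples

    def forward_model(a, b):
        # hypothetical stand-in for a user-supplied model function
        return a * a + b

    samples = latin_hypercube(10, [(0.0, 1.0), (-1.0, 1.0)])
    outputs = [forward_model(a, b) for a, b in samples]
    ```

    The stratification guarantees that every tenth of each parameter's range is sampled exactly once, which is what distinguishes Latin Hypercube sampling from plain random sampling.
    
    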

  4. Advanced Modeling of Micromirror Devices

    NASA Technical Reports Server (NTRS)

    Michalicek, M. Adrian; Sene, Darren E.; Bright, Victor M.

    1995-01-01

    The flexure-beam micromirror device (FBMD) is a phase-only, piston-style spatial light modulator demonstrating properties that can be used for phase-adaptive corrective optics. This paper presents a complete study of a square FBMD, from advanced model development through final device testing and model verification. The model relates the electrical and mechanical properties of the device by equating the electrostatic force of a parallel-plate capacitor with the counteracting spring force of the device's support flexures. The capacitor solution is derived via the Schwartz-Christoffel transformation such that the final solution accounts for non-ideal electric fields. The complete model describes the behavior of any piston-style device, given its design geometry and material properties. It includes operational parameters such as drive frequency and temperature, as well as fringing effects, mirror surface deformations, and cross-talk from neighboring devices. The steps taken to develop this model can be applied to other micromirrors, such as the cantilever and torsion-beam designs, to produce an advanced model for any given device. The micromirror devices studied in this paper were commercially fabricated in a surface micromachining process. A microscope-based laser interferometer is used to test the device, in which a beam reflected from the device modulates a fixed reference beam. The mirror displacement is determined from the relative phase, which generates a continuous set of data for each selected position on the mirror surface. Plots of these data describe the localized deflection as a function of drive voltage.
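    The force balance at the heart of this model (electrostatic attraction of a parallel-plate capacitor against the flexure spring force) can be sketched numerically. The geometry, spring constant, and drive voltage below are illustrative assumptions, not values from the paper, and the ideal parallel-plate expression ignores the fringing-field corrections that the paper's Schwartz-Christoffel treatment provides:

    ```python
    EPS0 = 8.854e-12  # vacuum permittivity, F/m

    def equilibrium_deflection(V, gap, area, k, tol=1e-15):
        """Solve k*x = eps0*A*V^2 / (2*(gap - x)^2) by bisection on [0, gap/3].
        Stable equilibria exist only below one third of the gap (pull-in limit).
        Geometry and spring constant here are illustrative, not from the paper."""
        def residual(x):
            return k * x - EPS0 * area * V * V / (2.0 * (gap - x) ** 2)
        lo, hi = 0.0, gap / 3.0
        if residual(hi) < 0:
            raise ValueError("voltage beyond pull-in; no stable equilibrium")
        # residual is monotonically increasing on [0, gap/3], so bisect
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if residual(mid) < 0:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    x = equilibrium_deflection(V=3.0, gap=2e-6, area=(100e-6) ** 2, k=1.0)
    ```

    For voltages above the pull-in limit (deflection beyond one third of the gap), no stable equilibrium exists and the sketch raises an error rather than returning a value.
    
    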

  5. Model Analysis ToolKit

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes: defining parameters, defining observations, defining the model (a Python function), and defining samplesets (sets of parameter combinations). Currently supported functionality includes: forward model runs; Latin-Hypercube sampling of parameters; multi-dimensional parameter studies; parallel execution of parameter samples; model calibration using an internal Levenberg-Marquardt algorithm, the lmfit package, or the levmar package; and Markov Chain Monte Carlo using the pymc package. MATK facilitates model analysis using scipy (calibration via scipy.optimize) and rpy2 (a Python interface to R).

  6. Advanced Mirror & Modelling Technology Development

    NASA Technical Reports Server (NTRS)

    Effinger, Michael; Stahl, H. Philip; Abplanalp, Laura; Maffett, Steven; Egerman, Robert; Eng, Ron; Arnold, William; Mosier, Gary; Blaurock, Carl

    2014-01-01

    The 2020 Decadal technology survey is starting in 2018. Technology on the shelf at that time will help guide selection of future low-risk and low-cost missions. The Advanced Mirror Technology Development (AMTD) team has identified development priorities based on science goals and engineering requirements for Ultraviolet Optical near-Infrared (UVOIR) missions in order to contribute to the selection process. One key development identified was lightweight mirror fabrication and testing. A monolithic, stacked, deep core mirror was fused and replicated twice to achieve the desired radius of curvature. It was subsequently successfully polished and tested. A recently awarded second phase to the AMTD project will develop larger mirrors to demonstrate the lateral scaling of the deep core mirror technology. Another key development was rapid modeling for the mirror. One model focused on generating optical and structural model results in minutes instead of months. Many variables could be accounted for regarding the core, face plate, and back structure details. A portion of a spacecraft model was also developed. The spacecraft model incorporated direct integration to transform optical path difference to Point Spread Function (PSF) and from PSF to modulation transfer function. The second phase of the project will take the results of the rapid mirror modeler and integrate them into the rapid spacecraft modeler.

  7. Software Engineering Tools for Scientific Models

    NASA Technical Reports Server (NTRS)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  8. Functional toxicology: tools to advance the future of toxicity testing.

    PubMed

    Gaytán, Brandon D; Vulpe, Chris D

    2014-01-01

    The increased presence of chemical contaminants in the environment is an undeniable concern to human health and ecosystems. Historically, by relying heavily upon costly and laborious animal-based toxicity assays, the field of toxicology has often neglected examinations of the cellular and molecular mechanisms of toxicity for the majority of compounds-information that, if available, would strengthen risk assessment analyses. Functional toxicology, where cells or organisms with gene deletions or depleted proteins are used to assess genetic requirements for chemical tolerance, can advance the field of toxicity testing by contributing data regarding chemical mechanisms of toxicity. Functional toxicology can be accomplished using available genetic tools in yeasts, other fungi and bacteria, and eukaryotes of increased complexity, including zebrafish, fruit flies, rodents, and human cell lines. Underscored is the value of using less complex systems such as yeasts to direct further studies in more complex systems such as human cell lines. Functional techniques can yield (1) novel insights into chemical toxicity; (2) pathways and mechanisms deserving of further study; and (3) candidate human toxicant susceptibility or resistance genes. PMID:24847352

  9. Functional toxicology: tools to advance the future of toxicity testing

    PubMed Central

    Gaytán, Brandon D.; Vulpe, Chris D.

    2014-01-01

    The increased presence of chemical contaminants in the environment is an undeniable concern to human health and ecosystems. Historically, by relying heavily upon costly and laborious animal-based toxicity assays, the field of toxicology has often neglected examinations of the cellular and molecular mechanisms of toxicity for the majority of compounds—information that, if available, would strengthen risk assessment analyses. Functional toxicology, where cells or organisms with gene deletions or depleted proteins are used to assess genetic requirements for chemical tolerance, can advance the field of toxicity testing by contributing data regarding chemical mechanisms of toxicity. Functional toxicology can be accomplished using available genetic tools in yeasts, other fungi and bacteria, and eukaryotes of increased complexity, including zebrafish, fruit flies, rodents, and human cell lines. Underscored is the value of using less complex systems such as yeasts to direct further studies in more complex systems such as human cell lines. Functional techniques can yield (1) novel insights into chemical toxicity; (2) pathways and mechanisms deserving of further study; and (3) candidate human toxicant susceptibility or resistance genes. PMID:24847352

  10. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.

  11. New V and V Tools for Diagnostic Modeling Environment (DME)

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Nelson, Stacy; Merriam, Marshall (Technical Monitor)

    2002-01-01

    The purpose of this report is to provide correctness and reliability criteria for verification and validation (V&V) of the Second Generation Reusable Launch Vehicle (RLV) Diagnostic Modeling Environment (DME), describe current NASA Ames Research Center tools for V&V of Model Based Reasoning systems, and discuss the applicability of Advanced V&V to DME. This report is divided into the following three sections: (1) correctness and reliability criteria; (2) tools for V&V of Model Based Reasoning; and (3) advanced V&V applicable to DME. The Executive Summary includes an overview of the main points from each section. Supporting details, diagrams, figures, and other information are included in subsequent sections. A glossary, acronym list, appendices, and references are included at the end of this report.

  12. NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Craig, D. A.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The objective of this Technical Interchange Meeting was to increase the quantity and quality of technical, cost, and programmatic data used to model the impact of investing in different technologies. The focus of this meeting was the Technology Tool Box (TTB), a database of performance, operations, and programmatic parameters provided by technologists and used by systems engineers. The TTB is the data repository used by a system of models known as the Advanced Technology Lifecycle Analysis System (ATLAS). This report describes the result of the November meeting, and also provides background information on ATLAS and the TTB.

  13. ANSYS tools in modeling tires

    NASA Technical Reports Server (NTRS)

    Ali, Ashraf; Lovell, Michael

    1995-01-01

    This presentation summarizes the capabilities in the ANSYS program that relate to the computational modeling of tires. The power and the difficulties associated with modeling nearly incompressible rubber-like materials using hyperelastic constitutive relationships are highlighted from a developer's point of view. The topics covered include a hyperelastic material constitutive model for rubber-like materials, a general overview of contact-friction capabilities, and the acoustic fluid-structure interaction problem for noise prediction. Brief theoretical development and example problems are presented for each topic.

  14. Research on graphical workflow modeling tool

    NASA Astrophysics Data System (ADS)

    Gu, Hongjiu

    2013-07-01

    Through a technical analysis of existing modeling tools, combined with Web technology, this paper presents a design for a graphical workflow modeling tool with which designers can draw processes directly in the browser and automatically transform the drawn process into an XML description file, facilitating workflow engine analysis and barrier-free sharing of workflow data in a networked environment. The tool offers software reusability, cross-platform support, scalability, and strong practicality.
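    The draw-then-serialize idea can be sketched with Python's standard xml.etree.ElementTree. The element and attribute names below are invented for illustration and are not the schema the paper proposes:

    ```python
    import xml.etree.ElementTree as ET

    def process_to_xml(name, steps):
        """Serialize a drawn workflow (an ordered list of steps) into a simple
        XML description that a workflow engine could parse. Each activity gets
        a transition to its successor; the final activity has none."""
        root = ET.Element("process", {"name": name})
        for i, step in enumerate(steps):
            node = ET.SubElement(root, "activity", {"id": f"a{i}", "label": step})
            if i + 1 < len(steps):
                ET.SubElement(node, "transition", {"to": f"a{i + 1}"})
        return ET.tostring(root, encoding="unicode")

    xml_text = process_to_xml("leave-request", ["submit", "approve", "notify"])
    ```

    A browser-based editor would build the same element tree from the drawn diagram; once the process is XML, any engine or tool that understands the schema can consume it, which is the interoperability point the abstract makes.
    
    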

  15. Anvil Forecast Tool in the Advanced Weather Interactive Processing System (AWIPS)

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Launch Weather Officers (LWOs) from the 45th Weather Squadron (45 WS) and forecasters from the National Weather Service (NWS) Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violating the Lightning Launch Commit Criteria (LLCC) (Krider et al. 2006; Space Shuttle Flight Rules (FR), NASA/JSC 2004). As a result, the Applied Meteorology Unit (AMU) developed a tool that creates an anvil threat corridor graphic that can be overlaid on satellite imagery using the Meteorological Interactive Data Display System (MIDDS) (Short and Wheeler, 2002). The tool helps forecasters estimate the locations of thunderstorm anvils at one, two, and three hours into the future. It has been used extensively in launch and landing operations by both the 45 WS and SMG. The Advanced Weather Interactive Processing System (AWIPS) is now used along with MIDDS for weather analysis and display at SMG. In Phase I of this task, SMG tasked the AMU to transition the tool from MIDDS to AWIPS (Barrett et al., 2007). For Phase II, SMG requested the AMU make the Anvil Forecast Tool in AWIPS more configurable by creating the capability to read model gridded data from user-defined model files instead of hard-coded files. An NWS local AWIPS application called AGRID was used to accomplish this. In addition, SMG needed to be able to define the pressure levels for the model data, instead of hard-coding the bottom level as 300 mb and the top level as 150 mb. This paper describes the initial development of the Anvil Forecast Tool for MIDDS, followed by the migration of the tool to AWIPS in Phase I. It then gives a detailed presentation of the Phase II improvements to the AWIPS tool.

  16. Advanced Small Modular Reactor Economics Model Development

    SciTech Connect

    Harrison, Thomas J.

    2014-10-01

    The US Department of Energy Office of Nuclear Energy's Advanced Small Modular Reactor (SMR) research and development activities focus on four key areas: (1) developing assessment methods for evaluating advanced SMR technologies and characteristics; (2) developing and testing materials, fuels, and fabrication techniques; (3) resolving key regulatory issues identified by the US Nuclear Regulatory Commission and industry; and (4) developing advanced instrumentation and controls and human-machine interfaces. This report focuses on development of assessment methods to evaluate advanced SMR technologies and characteristics. Specifically, this report describes the expansion and application of the economic modeling effort at Oak Ridge National Laboratory. Analysis of the current modeling methods shows that one of the primary concerns for the modeling effort is the handling of uncertainty in cost estimates. Monte Carlo-based methods are commonly used to handle uncertainty, especially when implemented by a stand-alone script within a program such as Python or MATLAB. However, a script-based model requires each potential user to have access to a compiler and an executable capable of handling the script. Making the model accessible to multiple independent analysts is best accomplished by implementing the model in a common computing tool such as Microsoft Excel. Excel is readily available and accessible to most system analysts, but it is not designed for straightforward implementation of a Monte Carlo-based method. Using a Monte Carlo algorithm requires in-spreadsheet scripting and statistical analyses or the use of add-ons such as Crystal Ball. An alternative method uses propagation-of-error calculations in the existing Excel-based system to estimate system cost uncertainty. This method has the advantage of using Microsoft Excel as is, but it requires the use of simplifying assumptions. These assumptions do not necessarily bring into question the analytical results. In fact, the
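    The trade-off between a scripted Monte Carlo estimate and the propagation-of-error shortcut can be sketched as follows. The component cost figures are hypothetical, not real SMR data; for independent normally distributed cost terms the two approaches should agree, which is exactly the kind of simplifying assumption the Excel-based method relies on:

    ```python
    import random
    import statistics

    def mc_total_cost(components, n=20000, seed=1):
        """Monte Carlo total-cost estimate: each component cost is drawn from a
        normal distribution (mean, std) and the draws are summed."""
        rng = random.Random(seed)
        totals = [sum(rng.gauss(mu, sd) for mu, sd in components)
                  for _ in range(n)]
        return statistics.mean(totals), statistics.stdev(totals)

    def propagated_cost(components):
        """Propagation-of-error alternative: independent terms add their means,
        and their standard deviations add in quadrature."""
        mean = sum(mu for mu, _ in components)
        std = sum(sd * sd for _, sd in components) ** 0.5
        return mean, std

    parts = [(100.0, 10.0), (250.0, 25.0), (75.0, 5.0)]  # hypothetical ($M, std)
    mc = mc_total_cost(parts)
    pe = propagated_cost(parts)
    ```

    When cost terms are correlated or non-normal, the quadrature formula no longer holds and the Monte Carlo approach earns its extra complexity.
    
    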

  17. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  18. Comparing Simple and Advanced Video Tools as Supports for Complex Collaborative Design Processes

    ERIC Educational Resources Information Center

    Zahn, Carmen; Pea, Roy; Hesse, Friedrich W.; Rosen, Joe

    2010-01-01

    Working with digital video technologies, particularly advanced video tools with editing capabilities, offers new prospects for meaningful learning through design. However, it is also possible that the additional complexity of such tools does "not" advance learning. In an experiment, we compared the design processes and learning outcomes of 24…

  19. Cockpit System Situational Awareness Modeling Tool

    NASA Technical Reports Server (NTRS)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  20. Advanced gradient-index lens design tools to maximize system performance and reduce SWaP

    NASA Astrophysics Data System (ADS)

    Campbell, Sawyer D.; Nagar, Jogender; Brocker, Donovan E.; Easum, John A.; Turpin, Jeremiah P.; Werner, Douglas H.

    2016-05-01

    GRadient-INdex (GRIN) lenses have long been of interest due to their potential for providing levels of performance unachievable with traditional homogeneous lenses. While historically limited by a lack of suitable materials, rapid advancements in manufacturing techniques, including 3D printing, have recently kindled a renewed interest in GRIN optics. Further increasing the desire for GRIN devices has been the advent of Transformation Optics (TO), which provides the mathematical framework for representing the behavior of electromagnetic radiation in a given geometry by "transforming" it to an alternative, usually more desirable, geometry through an appropriate mapping of the constituent material parameters. Using TO, aspherical lenses can be transformed to simpler spherical and flat geometries or even rotationally-asymmetric shapes which result in true 3D GRIN profiles. Meanwhile, there is a critical lack of suitable design tools which can effectively evaluate the optical wave propagation through 3D GRIN profiles produced by TO. Current modeling software packages for optical lens systems also lack advanced multi-objective global optimization capability which allows the user to explicitly view the trade-offs between all design objectives such as focus quality, FOV, Δn, and focal drift due to chromatic aberrations. When coupled with advanced design methodologies such as TO, wavefront matching (WFM), and analytical achromatic GRIN theory, these tools provide a powerful framework for maximizing SWaP (Size, Weight and Power) reduction in GRIN-enabled optical systems. We provide an overview of our advanced GRIN design tools and examples which minimize the presence of mono- and polychromatic aberrations in the context of reducing SWaP.

  1. Vibration absorber modeling for handheld machine tool

    NASA Astrophysics Data System (ADS)

    Abdullah, Mohd Azman; Mustafa, Mohd Muhyiddin; Jamil, Jazli Firdaus; Salim, Mohd Azli; Ramli, Faiz Redza

    2015-05-01

    Handheld machine tools produce continuous vibration to the users during operation. This vibration causes harmful effects to the health of users for repeated operations over a long period of time. In this paper, a dynamic vibration absorber (DVA) is designed and modeled to reduce the vibration generated by the handheld machine tool. Several designs and models of vibration absorbers with various stiffness properties are simulated, tested and optimized in order to diminish the vibration. Ordinary differential equations are used to derive and formulate the vibration phenomena in the machine tool with and without the DVA. The final transfer function of the DVA is later analyzed using commercially available mathematical software. The DVA with optimum properties of mass and stiffness is developed and applied on the actual handheld machine tool. The performance of the DVA is experimentally tested and validated by the final result of vibration reduction.
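The tuning idea behind a DVA can be illustrated with the standard undamped two-degree-of-freedom transfer function: when the absorber's natural frequency matches the forcing frequency, the primary mass's steady-state response vanishes. The masses and stiffnesses below are invented for illustration, not taken from the paper.

```python
# Undamped two-DOF tuned-mass-absorber sketch (illustrative values).
def primary_amplitude(omega, m1, k1, m2=None, k2=None):
    """|X1/F| of the primary mass, with an optional absorber (m2, k2) attached."""
    if m2 is None:
        return abs(1.0 / (k1 - m1 * omega**2))
    num = k2 - m2 * omega**2
    den = (k1 + k2 - m1 * omega**2) * (k2 - m2 * omega**2) - k2**2
    return abs(num / den)

m1, k1 = 2.0, 8.0e4          # primary mass [kg] and stiffness [N/m]
m2 = 0.2                     # absorber mass [kg]
omega_f = 180.0              # dominant forcing frequency [rad/s]
k2 = m2 * omega_f**2         # tune the absorber so sqrt(k2/m2) == omega_f

bare = primary_amplitude(omega_f, m1, k1)
with_dva = primary_amplitude(omega_f, m1, k1, m2, k2)
print(bare, with_dva)        # the tuned absorber drives the primary response to 0
```

In practice the absorber has damping, so the response is reduced rather than nulled, which is why the paper optimizes the mass and stiffness properties rather than simply tuning to one frequency.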

  2. Software Tools for Weed Seed Germination Modeling

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The next generation of weed seed germination models will need to account for variable soil microclimate conditions. In order to predict this microclimate environment we have developed a suite of individual tools (models) that can be used in conjunction with the next generation of weed seed germinati...

  3. Performance and Architecture Lab Modeling Tool

    SciTech Connect

    2014-06-19

    Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into subproblems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules. Given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior.

  5. Advances in Watershed Models and Modeling

    NASA Astrophysics Data System (ADS)

    Yeh, G. T.; Zhang, F.

    2015-12-01

    The development of watershed models and their applications to real-world problems has evolved significantly since the 1960s. Watershed models can be classified based on what media are included, what processes are dealt with, and what approaches are taken. In terms of media, a watershed may include segregated overland regime, river-canal-open channel networks, ponds-reservoirs-small lakes, and subsurface media. It may also include integrated media of all these or a partial set of these as well as man-made control structures. In terms of processes, a watershed model may deal with coupled or decoupled hydrological and biogeochemical cycles. These processes include fluid flow, thermal transport, salinity transport, sediment transport, reactive transport, and biota and microbe kinetics. In terms of approaches, either a parametric or a physics-based approach can be taken. This talk discusses the evolution of watershed models in the past sixty years. The advances of watershed models center on their increasing capability to handle these segregated or integrated media and coupled or decoupled processes. Widely used models developed by academia, research institutes, government agencies, and private industries will be reviewed in terms of the media and processes included as well as approaches taken. Many types of potential benchmark problems can be proposed and will be discussed. This presentation will focus on three benchmark problems of biogeochemical cycles. These three problems, dealing with water quality transport, will be formulated in terms of reactive transport. Simulation results will be illustrated using WASH123D, a watershed model developed and continuously updated by the author and his PhD graduates. Keywords: Hydrological Cycles, Biogeochemical Cycles, Biota Kinetics, Parametric Approach, Physics-based Approach, Reactive Transport.

  6. Five levels of PACS modularity: integrating 3D and other advanced visualization tools.

    PubMed

    Wang, Kenneth C; Filice, Ross W; Philbin, James F; Siegel, Eliot L; Nagy, Paul G

    2011-12-01

    The current array of PACS products and 3D visualization tools presents a wide range of options for applying advanced visualization methods in clinical radiology. The emergence of server-based rendering techniques creates new opportunities for raising the level of clinical image review. However, best-of-breed implementations of core PACS technology, volumetric image navigation, and application-specific 3D packages will, in general, be supplied by different vendors. Integration issues should be carefully considered before deploying such systems. This work presents a classification scheme describing five tiers of PACS modularity and integration with advanced visualization tools, with the goals of characterizing current options for such integration, providing an approach for evaluating such systems, and discussing possible future architectures. These five levels of increasing PACS modularity begin with what was until recently the dominant model for integrating advanced visualization into the clinical radiologist's workflow, consisting of a dedicated stand-alone post-processing workstation in the reading room. Introduction of context-sharing, thin clients using server-based rendering, archive integration, and user-level application hosting at successive levels of the hierarchy lead to a modularized imaging architecture, which promotes user interface integration, resource efficiency, system performance, supportability, and flexibility. These technical factors and system metrics are discussed in the context of the proposed five-level classification scheme. PMID:21301923

  7. Integrated performance and dependability analysis using the advanced design environment prototype tool ADEPT

    SciTech Connect

    Rao, R.; Rahman, A.; Johnson, B.W.

    1995-09-01

    The Advanced Design Environment Prototype Tool (ADEPT) is an evolving integrated design environment which supports both performance and dependability analysis. ADEPT models are constructed using a collection of predefined library elements, called ADEPT modules. Each ADEPT module has an unambiguous mathematical definition in the form of a Colored Petri Net (CPN) and a corresponding Very High Speed Integrated Circuits (VHSIC) Hardware Description Language (VHDL) description. As a result, both simulation-based and analytical approaches for analysis can be employed. The focus of this paper is on dependability modeling and analysis using ADEPT. We present the simulation based approach to dependability analysis using ADEPT and an approach to integrating ADEPT and the Reliability Estimation System Testbed (REST) engine developed at NASA. We also present analytical techniques to extract the dependability characteristics of a system from the CPN definitions of the modules, in order to generate alternate models such as Markov models and fault trees.

  8. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  9. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    SciTech Connect

    Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.; Qualls, A. L.

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  10. Some mathematical tools for a modeller's workbench

    NASA Technical Reports Server (NTRS)

    Cohen, E.

    1984-01-01

    The development of mathematical software tools in a workbench environment, to make the modelling of related objects more straightforward, is outlined. The construction of a computer model of a helicopter from informal drawings and a plastic model is discussed. Lofting was the predominant, characteristic modelling technique. Ship and airplane designs use lofting as a technique because they have defined surfaces (hulls and fuselages) built from vertical station cuts perpendicular to the vertical center plane defining the major axis of reflective symmetry. A turbine blade from a jet engine was modelled in this way. The aerodynamic portion and the root come from different paradigms. The union of these two parts into a coherent model is shown.

  11. FMFilter: A fast model based variant filtering tool.

    PubMed

    Akgün, Mete; Faruk Gerdan, Ö; Görmez, Zeliha; Demirci, Hüseyin

    2016-04-01

    The availability of whole exome and genome sequencing has completely changed the structure of genetic disease studies. It is now possible to solve the disease causing mechanisms within shorter time and budgets. For this reason, mining out the valuable information from the huge amount of data produced by next generation techniques becomes a challenging task. Current tools analyze sequencing data in various ways. However, there is still a need for fast, easy to use and efficacious tools. Considering genetic disease studies, there is a lack of publicly available tools which support compound heterozygous and de novo models. Also, existing tools either require advanced IT expertise or are inefficient for handling large variant files. In this work, we provide FMFilter, an efficient sieving tool for next generation sequencing data produced by genetic disease studies. We developed software which allows the user to choose the inheritance model (recessive, dominant, compound heterozygous and de novo) and the affected and control individuals. The program provides a user friendly Graphical User Interface which eliminates the requirement of advanced computer skills. It has various filtering options which enable elimination of the majority of the false alarms. FMFilter requires negligible memory, so it can easily handle very large variant files, such as multiple whole genomes, on ordinary computers. We demonstrate the variant reduction capability and effectiveness of the proposed tool with public and in-house data for different inheritance models. We also compare FMFilter with existing filtering software. We conclude that FMFilter provides an effective and easy to use environment for analyzing next generation sequencing data from Mendelian diseases. PMID:26925517
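As a rough illustration of what inheritance-model filtering involves, the sketch below checks genotype patterns against recessive and dominant models. This is a hypothetical toy, not FMFilter's actual code or data format; genotypes are encoded 0 = hom-ref, 1 = het, 2 = hom-alt.

```python
# Toy inheritance-model filter (hypothetical data layout, for illustration only).
def passes_model(genotypes, affected, controls, model):
    """Return True if a variant's genotype pattern fits the chosen model."""
    if model == "recessive":   # affected are hom-alt; controls are not
        return (all(genotypes[s] == 2 for s in affected)
                and all(genotypes[s] < 2 for s in controls))
    if model == "dominant":    # affected carry the allele; controls do not
        return (all(genotypes[s] >= 1 for s in affected)
                and all(genotypes[s] == 0 for s in controls))
    raise ValueError(f"unknown model: {model}")

variants = {
    "chr1:1000A>G": {"child": 2, "mother": 1, "father": 1},
    "chr2:2000C>T": {"child": 1, "mother": 1, "father": 0},
}
hits = [v for v, g in variants.items()
        if passes_model(g, affected=["child"],
                        controls=["mother", "father"], model="recessive")]
print(hits)   # only the hom-alt-in-child variant survives the recessive filter
```

Compound heterozygous and de novo models additionally require gene-level grouping and parent-of-origin checks, which is part of what makes tools supporting them scarce.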

  12. Advanced PANIC quick-look tool using Python

    NASA Astrophysics Data System (ADS)

    Ibáñez, José-Miguel; García Segura, Antonio J.; Storz, Clemens; Fried, Josef W.; Fernández, Matilde; Rodríguez Gómez, Julio F.; Terrón, V.; Cárdenas, M. C.

    2012-09-01

    PANIC, the Panoramic Near Infrared Camera, is an instrument for the Calar Alto Observatory currently being integrated in laboratory and whose first light is foreseen for end 2012 or early 2013. We present here how the PANIC Quick-Look tool (PQL) and pipeline (PAPI) are being implemented, using existing rapid programming Python technologies and packages, together with well-known astronomical software suites (Astromatic, IRAF) and parallel processing techniques. We will briefly describe the structure of the PQL tool, whose main characteristics are the use of the SQLite database and PyQt, a Python binding of the GUI toolkit Qt.

  13. Numerical modeling tools for chemical vapor deposition

    NASA Technical Reports Server (NTRS)

    Jasinski, Thomas J.; Childs, Edward P.

    1992-01-01

    Development of general numerical simulation tools for chemical vapor deposition (CVD) was the objective of this study. Physical models of important CVD phenomena were developed and implemented into the commercial computational fluid dynamics software FLUENT. The resulting software can address general geometries as well as the most important phenomena occurring within CVD reactors: fluid flow patterns, temperature and chemical species distribution, gas phase reaction and surface deposition. The available physical models are documented, and examples of CVD simulation capabilities are provided.

  14. The Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    ERIC Educational Resources Information Center

    Knezek, Gerald; Christensen, Rhonda

    2015-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teachers' pedagogical style is proposed by the authors as a means of advancing the predictive power for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be targeted for…

  15. XML based tools for assessing potential impact of advanced technology space validation

    NASA Technical Reports Server (NTRS)

    Some, Raphael R.; Weisbin, Charles

    2004-01-01

    A hierarchical XML database and related analysis tools are being developed by the New Millennium Program to provide guidance on the relative impact, to future NASA missions, of advanced technologies under consideration for developmental funding.

  16. Human Factors Evaluation of Advanced Electric Power Grid Visualization Tools

    SciTech Connect

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin

    2009-04-01

    This report describes initial human factors evaluation of four visualization tools (Graphical Contingency Analysis, Force Directed Graphs, Phasor State Estimator and Mode Meter/ Mode Shapes) developed by PNNL, and proposed test plans that may be implemented to evaluate their utility in scenario-based experiments.

  17. Advanced terahertz imaging system performance model for concealed weapon identification

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Redman, Brian; Espinola, Richard L.; Franck, Charmaine C.; Petkie, Douglas T.; De Lucia, Frank C.; Jacobs, Eddie L.; Griffin, Steven T.; Halford, Carl E.; Reynolds, Joe

    2007-04-01

    The U.S. Army Night Vision and Electronic Sensors Directorate (NVESD) and the U.S. Army Research Laboratory (ARL) have developed a terahertz-band imaging system performance model for detection and identification of concealed weaponry. The details of this MATLAB-based model which accounts for the effects of all critical sensor and display components, and for the effects of atmospheric attenuation, concealment material attenuation, and active illumination, were reported on at the 2005 SPIE Europe Security and Defence Symposium. The focus of this paper is to report on recent advances to the base model which have been designed to more realistically account for the dramatic impact that target and background orientation can have on target observability as related to specular and Lambertian reflections captured by an active-illumination-based imaging system. The advanced terahertz-band imaging system performance model now also accounts for target and background thermal emission, and has been recast into a user-friendly, Windows-executable tool. This advanced THz model has been developed in support of the Defense Advanced Research Project Agency's (DARPA) Terahertz Imaging Focal-Plane Technology (TIFT) program. This paper will describe the advanced THz model and its new radiometric sub-model in detail, and provide modeling and experimental results on target observability as a function of target and background orientation.

  18. Collaboration tools and techniques for large model datasets

    USGS Publications Warehouse

    Signell, R.P.; Carniel, S.; Chiggiato, J.; Janekovic, I.; Pullen, J.; Sherwood, C.R.

    2008-01-01

    In MREA and many other marine applications, it is common to have multiple models running with different grids, run by different institutions. Techniques and tools are described for low-bandwidth delivery of data from large multidimensional datasets, such as those from meteorological and oceanographic models, directly into generic analysis and visualization tools. Output is stored using the NetCDF CF Metadata Conventions, and then delivered to collaborators over the web via OPeNDAP. OPeNDAP datasets served by different institutions are then organized via THREDDS catalogs. Tools and procedures are then used which enable scientists to explore data on the original model grids using tools they are familiar with. It is also low-bandwidth, enabling users to extract just the data they require, an important feature for access from ship or remote areas. The entire implementation is simple enough to be handled by modelers working with their webmasters - no advanced programming support is necessary. © 2007 Elsevier B.V. All rights reserved.
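The "extract just the data you require" capability rests on OPeNDAP constraint expressions, which append hyperslab index ranges to the dataset URL so the server returns only the requested slice. A sketch of building such a request follows; the server URL and variable name are hypothetical, while the `[start:stride:stop]` syntax is that of the DAP2 specification.

```python
# Build a DAP2 constraint-expression URL for a subset request (no network
# access here; this only constructs the request string).
def dap_subset_url(base, var, slices):
    """slices: list of (start, stride, stop) index triples, one per dimension."""
    hyperslab = "".join(f"[{a}:{b}:{c}]" for a, b, c in slices)
    return f"{base}?{var}{hyperslab}"

url = dap_subset_url(
    "http://server.example/thredds/dodsC/roms/output.nc",
    "temp",
    [(0, 1, 0), (0, 1, 0), (100, 1, 150), (200, 1, 260)],  # time, depth, y, x
)
print(url)
```

Clients such as OPeNDAP-aware NetCDF libraries generate these expressions automatically from ordinary array slicing, which is why only the selected region crosses the (possibly low-bandwidth) link.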

  19. An Advanced Time Averaging Modelling Technique for Power Electronic Circuits

    NASA Astrophysics Data System (ADS)

    Jankuloski, Goce

    For stable and efficient performance of power converters, a good mathematical model is needed. This thesis presents a new modelling technique for DC/DC and DC/AC Pulse Width Modulated (PWM) converters. The new model is more accurate than the existing modelling techniques such as State Space Averaging (SSA) and Discrete Time Modelling. Unlike the SSA model, the new modelling technique, the Advanced Time Averaging Model (ATAM) includes the averaging dynamics of the converter's output. In addition to offering enhanced model accuracy, application of linearization techniques to the ATAM enables the use of conventional linear control design tools. A controller design application demonstrates that a controller designed based on the ATAM outperforms one designed using the ubiquitous SSA model. Unlike the SSA model, ATAM for DC/AC augments the system's dynamics with the dynamics needed for subcycle fundamental contribution (SFC) calculation. This allows for controller design that is based on an exact model.
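The baseline that ATAM improves on, state-space averaging, can be sketched for an ideal buck converter in continuous conduction: the switched input is replaced by its duty-weighted mean, and the averaged LC dynamics settle at Vout = D * Vin. The component values below are illustrative and not from the thesis.

```python
# State-space-averaged ideal buck converter (minimal sketch, forward Euler).
L, C, R = 100e-6, 100e-6, 5.0   # inductor [H], capacitor [F], load [ohm]
Vin, D = 12.0, 0.5              # input voltage [V] and duty cycle

iL = vC = 0.0                   # averaged inductor current and capacitor voltage
dt = 1e-7                       # integration step [s]
for _ in range(200_000):        # 20 ms of averaged dynamics
    diL = (D * Vin - vC) / L    # switched source replaced by its average D*Vin
    dvC = (iL - vC / R) / C
    iL += diL * dt
    vC += dvC * dt

print(round(vC, 3))             # settles near D*Vin = 6.0 V
```

What this simple average cannot capture is exactly the within-cycle (subcycle) behaviour that the thesis's ATAM augments the model with, which is why SSA-based controllers can underperform on fast transients.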

  20. DEVELOPMENT OF THE ADVANCED UTILITY SIMULATION MODEL

    EPA Science Inventory

    The paper discusses the development of the Advanced Utility Simulation Model (AUSM), developed for the National Acid Precipitation Assessment Program (NAPAP), to forecast air emissions of pollutants from electric utilities. AUSM integrates generating unit engineering detail with d...

  1. Monitoring of seismic time-series with advanced parallel computational tools and complex networks

    NASA Astrophysics Data System (ADS)

    Kechaidou, M.; Sirakoulis, G. Ch.; Scordilis, E. M.

    2012-04-01

    Earthquakes have been a focus of human and research interest for several centuries due to their catastrophic effect on everyday life, as they occur almost all over the world and demonstrate unpredictable behaviour that is hard to model. On the other hand, their monitoring with more or less technologically updated instruments has been almost continuous, and thanks to this fact several mathematical models have been presented and proposed to describe possible connections and patterns found in the resulting seismological time-series. Especially in Greece, one of the most seismically active territories on earth, detailed instrumental seismological data are available from the beginning of the past century, providing researchers with valuable and distinctive knowledge about seismicity levels all over the country. Considering available powerful parallel computational tools, such as Cellular Automata, these data can be further successfully analysed and, most important, modelled to provide possible connections between different parameters of the seismic time-series under study. More specifically, Cellular Automata have proven very effective for composing and modelling nonlinear complex systems, resulting in the advancement of several corresponding models as possible analogues of earthquake fault dynamics. In this work, preliminary results of modelling the seismic time-series with the help of Cellular Automata, so as to compose and develop the corresponding complex networks, are presented. The proposed methodology will be able to reveal hidden relations found in the examined time-series and to distinguish the intrinsic time-series characteristics in an effort to transform the examined time-series into complex networks and graphically represent their evolution in time and space. Consequently, based on the presented results, the proposed model will eventually serve as a possible efficient flexible computational tool to provide a generic
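A minimal sketch of a cellular-automaton earthquake analogue, in the spirit of the well-known Olami-Feder-Christensen model, illustrates the kind of fault-dynamics modelling described above. All parameters are illustrative; this is not the authors' model.

```python
import random

# OFC-style cellular automaton: cells accumulate "stress"; a cell reaching the
# threshold topples, passing a fraction alpha of its stress to each neighbour.
N, threshold, alpha = 16, 1.0, 0.2   # grid size, failure threshold, coupling
random.seed(0)
grid = [[random.uniform(0.0, threshold) for _ in range(N)] for _ in range(N)]

def relax(grid):
    """Topple all over-threshold cells until stable; return avalanche size."""
    size = 0
    unstable = [(i, j) for i in range(N) for j in range(N)
                if grid[i][j] >= threshold]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < threshold:
            continue
        s, grid[i][j] = grid[i][j], 0.0
        size += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < N and 0 <= nj < N:
                grid[ni][nj] += alpha * s
                if grid[ni][nj] >= threshold:
                    unstable.append((ni, nj))
    return size

events = []
for _ in range(500):
    # Uniform driving: raise every cell until the most-loaded one fails.
    i, j = max(((i, j) for i in range(N) for j in range(N)),
               key=lambda ij: grid[ij[0]][ij[1]])
    bump = threshold - grid[i][j]
    for row in grid:
        for k in range(N):
            row[k] += bump
    events.append(relax(grid))
print(max(events), sum(events))   # avalanche sizes form the synthetic "catalogue"
```

The resulting sequence of avalanche sizes plays the role of a synthetic seismic time-series, to which the network-construction step described in the abstract could then be applied.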

  2. An advanced image analysis tool for the quantification and characterization of breast cancer in microscopy images.

    PubMed

    Goudas, Theodosios; Maglogiannis, Ilias

    2015-03-01

    The paper presents an advanced image analysis tool for the accurate and fast characterization and quantification of cancer and apoptotic cells in microscopy images. The proposed tool utilizes adaptive thresholding and a Support Vector Machines classifier. The segmentation results are enhanced through a Majority Voting and a Watershed technique, while an object labeling algorithm has been developed for the fast and accurate validation of the recognized cells. Expert pathologists evaluated the tool and the reported results are satisfying and reproducible. PMID:25681102
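The paper's full pipeline (adaptive thresholding, SVM classification, majority voting, watershed) is not reproduced here; as a minimal, hypothetical illustration of the first step only, a local-mean adaptive threshold can be sketched as follows (block size and offset are assumed parameters):

```python
import numpy as np

def adaptive_threshold(img, block=15, offset=0.0):
    """Local-mean adaptive threshold: a pixel is foreground when it
    exceeds the mean of its block x block neighbourhood by `offset`.
    A stand-in sketch, not the paper's actual segmentation code."""
    pad = block // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    # integral image gives each block sum in O(1)
    ii = np.cumsum(np.cumsum(padded, axis=0), axis=1)
    ii = np.pad(ii, ((1, 0), (1, 0)))
    h, w = img.shape
    sums = (ii[block:block + h, block:block + w] - ii[:h, block:block + w]
            - ii[block:block + h, :w] + ii[:h, :w])
    local_mean = sums / (block * block)
    return img > local_mean + offset

# toy usage: a bright square on a dark background
img = np.zeros((20, 20))
img[8:12, 8:12] = 10.0
mask = adaptive_threshold(img, block=7)
```

In the paper, a mask like this would feed the SVM classifier and the voting/watershed refinement stages.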

  3. Advances in Modelling of Valley Glaciers

    NASA Astrophysics Data System (ADS)

    Adhikari, Surendra

    For glaciological conditions typical of valley glaciers, the central idea of this research lies in understanding the effects of high-order mechanics and parameterizing these for simpler dynamical and statistical methods in glaciology. As an effective tool for this, I formulate a new brand of dynamical models that describes distinct physical processes of deformational flow. Through numerical simulations of idealized glacier domains, I calculate empirical correction factors to capture the effects of longitudinal stress gradients and lateral drag for simplified dynamical models in the plane-strain regime. To get some insights into real glacier dynamics, I simulate Haig Glacier in the Canadian Rocky Mountains. As geometric effects overshadow dynamical effects in glacier retreat scenarios, it appears that high-order physics are not very important for Haig Glacier, particularly for evaluating its fate. Indeed, high-order and reduced models all predict that Haig Glacier ceases to exist by about AD 2080 under ongoing climate warming. This finding regarding the minimal role of high-order physics may not be broadly valid, as it is not true in advance scenarios at Haig Glacier and it may not be representative of other glaciological settings. Through a 'bulk' parameterization of high-order physics, geometric and climatic settings, sliding conditions, and transient effects, I also provide new insights into the volume-area relation, a widely used statistical method for estimating glacier volume. I find a steady-state power-law exponent of 1.46, which declines systematically to 1.38 after 100 years of sustained retreat, in good accord with the observations. I recommend more accurate scaling relations through characterization of individual glacier morphology and degree of climatic disequilibrium. This motivates a revision of global glacier volume estimates, of some urgency in sea level rise assessments.
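The volume-area relation referred to above is the power law V = c·A^γ. The abstract reports only the exponents (1.46 in steady state, declining to 1.38 under sustained retreat); the scaling constant c used below is a commonly cited literature value, assumed here for illustration and not taken from this work:

```python
def glacier_volume(area_km2, c=0.034, gamma=1.46):
    """Volume-area scaling V = c * A**gamma, with V in km^3 and A in km^2.
    c = 0.034 is a commonly cited constant (an assumption here); gamma
    defaults to the steady-state exponent reported in the abstract."""
    return c * area_km2 ** gamma

# a hypothetical 10 km^2 valley glacier under the two reported exponents
steady = glacier_volume(10.0, gamma=1.46)   # steady-state estimate
retreat = glacier_volume(10.0, gamma=1.38)  # after 100 yr of retreat
```

The gap between the two estimates illustrates the abstract's point: applying a single fixed exponent to glaciers far from steady state biases volume (and hence sea-level) estimates.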

  4. SOFA 2015: Authoritative Tools & Standard Models

    NASA Astrophysics Data System (ADS)

    Hohenkerk, Catherine

    2015-08-01

    The International Astronomical Union's Standards of Fundamental Astronomy (SOFA) service has the responsibility of establishing and maintaining an accessible and authoritative set of algorithms and procedures that implement standard models used in fundamental astronomy. This poster not only gives a summary of usage and available algorithms, but also highlights tools for astrometry and galactic coordinates, which have been added since the last IAU General Assembly.

  5. Chemical Kinetic Modeling of Advanced Transportation Fuels

    SciTech Connect

    Pitz, W J; Westbrook, C K; Herbinet, O

    2009-01-20

    Development of detailed chemical kinetic models for advanced petroleum-based and non-petroleum-based fuels is a difficult challenge because of the hundreds to thousands of different components in these fuels and because some of these fuels contain components that have not been considered in the past. It is important to develop detailed chemical kinetic models for these fuels since the models can be put into engine simulation codes used for optimizing engine design for maximum efficiency and minimal pollutant emissions. For example, these chemistry-enabled engine codes can be used to optimize combustion chamber shape and fuel injection timing. They also allow insight into how the composition of advanced petroleum-based and non-petroleum-based fuels affects engine performance characteristics. Additionally, chemical kinetic models can be used separately to interpret important in-cylinder experimental data and gain insight into advanced engine combustion processes such as HCCI and lean-burn engines. The objectives are: (1) Develop detailed chemical kinetic reaction models for components of advanced petroleum-based and non-petroleum-based fuels. These fuel models include components from vegetable-oil-derived biodiesel, oil-sand-derived fuel, alcohol fuels and other advanced bio-based and alternative fuels. (2) Develop detailed chemical kinetic reaction models for mixtures of non-petroleum and petroleum-based components to represent real fuels and lead to efficient reduced combustion models needed for engine modeling codes. (3) Characterize the role of fuel composition on efficiency and pollutant emissions from practical automotive engines.

  6. Advancements in engineering turbulence modeling

    NASA Technical Reports Server (NTRS)

    Shih, T.-H.

    1991-01-01

    Some new developments in two-equation models and second-order closure models are presented. Two-equation (k-epsilon) models have been widely used in computational fluid dynamics (CFD) for engineering problems. Most low-Reynolds-number two-equation models contain wall-distance damping functions to account for the effect of the wall on turbulence. However, this often causes confusion and difficulty in computing flows with complex geometry, and it also requires ad hoc treatment near separation and reattachment points. A set of modified two-equation models is proposed to remove these shortcomings. Calculations using various two-equation models are compared with direct numerical simulations of channel flow and flat-plate boundary layers. Development of a second-order closure model is also discussed, with emphasis on the modeling of pressure-related correlation terms and dissipation rates in the second-moment equations. All existing models poorly predict the normal stresses near the wall and fail to predict the 3-D effect of mean flow on the turbulence (e.g., the decrease in shear stress caused by cross flow in the boundary layer). The newly developed second-order near-wall turbulence model is described; it captures the near-wall behavior of turbulence as well as the effect of 3-D mean flow on the turbulence.

  7. A tool box for implementing supersymmetric models

    NASA Astrophysics Data System (ADS)

    Staub, Florian; Ohl, Thorsten; Porod, Werner; Speckner, Christian

    2012-10-01

    We present a framework for performing a comprehensive analysis of a large class of supersymmetric models, including spectrum calculation, dark matter studies and collider phenomenology. To this end, the respective model is defined in an easy and straightforward way using the Mathematica package SARAH. SARAH then generates model files for CalcHep which can be used with micrOMEGAs as well as model files for WHIZARD and O'Mega. In addition, Fortran source code for SPheno is created which facilitates the determination of the particle spectrum using two-loop renormalization group equations and one-loop corrections to the masses. As an additional feature, the generated SPheno code can write out input files suitable for use with HiggsBounds to apply bounds coming from the Higgs searches to the model. Combining all programs provides a closed chain from model building to phenomenology. Program summary Program title: SUSY Phenomenology toolbox. Catalog identifier: AEMN_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMN_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 140206. No. of bytes in distributed program, including test data, etc.: 1319681. Distribution format: tar.gz. Programming language: Autoconf, Mathematica. Computer: PC running Linux, Mac. Operating system: Linux, Mac OS. Classification: 11.6. Nature of problem: Comprehensive studies of supersymmetric models beyond the MSSM is considerably complicated by the number of different tasks that have to be accomplished, including the calculation of the mass spectrum and the implementation of the model into tools for performing collider studies, calculating the dark matter density and checking the compatibility with existing collider bounds (in particular, from the Higgs searches). Solution method: The

  8. Laser vision: lidar as a transformative tool to advance critical zone science

    NASA Astrophysics Data System (ADS)

    Harpold, A. A.; Marshall, J. A.; Lyon, S. W.; Barnhart, T. B.; Fisher, B.; Donovan, M.; Brubaker, K. M.; Crosby, C. J.; Glenn, N. F.; Glennie, C. L.; Kirchner, P. B.; Lam, N.; Mankoff, K. D.; McCreight, J. L.; Molotch, N. P.; Musselman, K. N.; Pelletier, J.; Russo, T.; Sangireddy, H.; Sjöberg, Y.; Swetnam, T.; West, N.

    2015-01-01

    Laser vision: lidar as a transformative tool to advance critical zone science. Observation and quantification of the Earth surface is undergoing a revolutionary change due to the increased spatial resolution and extent afforded by light detection and ranging (lidar) technology. As a consequence, lidar-derived information has led to fundamental discoveries within the individual disciplines of geomorphology, hydrology, and ecology. These disciplines form the cornerstones of Critical Zone (CZ) science, where researchers study how interactions among the geosphere, hydrosphere, and ecosphere shape and maintain the "zone of life", extending from the groundwater to the vegetation canopy. Lidar holds promise as a transdisciplinary CZ research tool by simultaneously allowing for quantification of topographic, vegetative, and hydrological data. Researchers are just beginning to utilize lidar datasets to answer synergistic questions in CZ science, such as how landforms and soils develop in space and time as a function of the local climate, biota, hydrologic properties, and lithology. This review's objective is to demonstrate the transformative potential of lidar by critically assessing both challenges and opportunities for transdisciplinary lidar applications. A review of 147 peer-reviewed studies utilizing lidar showed that 38 % of the studies were focused in geomorphology, 18 % in hydrology, 32 % in ecology, and the remaining 12 % have an interdisciplinary focus. We find that using lidar to its full potential will require numerous advances across CZ applications, including new and more powerful open-source processing tools, exploiting new lidar acquisition technologies, and improved integration with physically-based models and complementary in situ and remote-sensing observations. We provide a five-year vision to utilize and advocate for the expanded use of lidar datasets to benefit CZ science applications.

  9. A Simple Tool to Predict ESRD Within 1 Year in Elderly Patients with Advanced CKD

    PubMed Central

    Drawz, Paul E.; Goswami, Puja; Azem, Reem; Babineau, Denise C.; Rahman, Mahboob

    2013-01-01

    BACKGROUND/OBJECTIVES: Chronic kidney disease (CKD) is common in older patients; currently, no tools are available to predict the risk of end-stage renal disease (ESRD) within 1 year. The goal of this study was to develop and validate a model to predict the 1-year risk of ESRD in elderly subjects with advanced CKD. DESIGN: Retrospective study. SETTING: Veterans Affairs Medical Center. PARTICIPANTS: Patients over 65 years of age with CKD and an estimated glomerular filtration rate (eGFR) less than 30 mL/min/1.73 m2. MEASUREMENTS: The outcome was ESRD within 1 year of the index eGFR. Cox regression was used to develop a predictive model (VA risk score), which was validated in a separate cohort. RESULTS: Of the 1,866 patients in the development cohort, 77 developed ESRD. Risk factors for ESRD in the final model were age, congestive heart failure, systolic blood pressure, eGFR, potassium, and albumin. In the validation cohort, the C-index for the VA risk score was 0.823. The risk of developing ESRD at 1 year from lowest to highest tertile was 0.08%, 2.7%, and 11.3% (P<0.001). The C-index for the recently published Tangri model in the validation cohort was 0.780. CONCLUSION: A new model using commonly available clinical measures shows excellent ability to predict the onset of ESRD within the next year in elderly subjects. Additionally, the Tangri model had very good predictive ability. Patients and physicians can use these risk models to inform decisions regarding preparation for renal replacement therapy in patients with advanced CKD. PMID:23617782

  10. Advanced Epi Tools for Gallium Nitride Light Emitting Diode Devices

    SciTech Connect

    Patibandla, Nag; Agrawal, Vivek

    2012-12-01

    Over the course of this program, Applied Materials, Inc., with generous support from the United States Department of Energy, developed a world-class three-chamber III-Nitride epi cluster tool for low-cost, high-volume GaN growth for the solid state lighting industry. One of the major achievements of the program was to design, build, and demonstrate the world's largest wafer-capacity HVPE chamber suitable for repeatable high-volume III-Nitride template and device manufacturing. Applied Materials' experience in developing deposition chambers for the silicon chip industry over many decades resulted in many orders of magnitude reductions in the price of transistors. That experience and understanding was used in developing this GaN epi deposition tool. The multi-chamber approach, which continues to be unique in the ability of each chamber to deposit a section of the full device structure, unlike other cluster tools, allows for extreme flexibility in the manufacturing process. This robust architecture is suitable for not just the LED industry, but GaN power devices as well, both horizontal and vertical designs. The new HVPE technology developed allows GaN to be grown at a rate unheard of with MOCVD, up to 20x the typical MOCVD rate of 3 µm per hour, with bulk crystal quality better than the highest-quality commercial GaN films grown by MOCVD at a much lower overall cost. This is a unique development, as the HVPE process has been known for decades but never successfully commercially developed for high-volume manufacturing. This research shows the potential of the first commercial-grade HVPE chamber, an elusive goal for III-V researchers and those wanting to capitalize on the promise of HVPE. Additionally, in the course of this program, Applied Materials built two MOCVD chambers, in addition to the HVPE chamber, and a robot that moves wafers between them. The MOCVD chambers demonstrated industry-leading wavelength yield for GaN-based LED wafers and industry

  11. CUAHSI's Hydrologic Measurement Facility: Putting Advanced Tools in Scientists' Hands

    NASA Astrophysics Data System (ADS)

    Hooper, R. P.; Robinson, D.; Selker, J.; Duncan, J.

    2006-05-01

    Like related environmental sciences, the hydrologic sciences community has been defining environmental observatories and the support components necessary for their successful implementation, such as informatics (cyberinfrastructure) and instrumentation. Unlike programs such as NEON and OOI, which have been pursuing large-scale capital funding through the Major Research Equipment program of the National Science Foundation, CUAHSI has been pursuing incremental development of observatories, which has allowed us to pilot different parts of these support functions, namely Hydrologic Information Systems and a Hydrologic Measurement Facility (HMF), the subject of this paper. This approach has allowed us to gain greater specificity about the requirements for these facilities and their operational challenges. The HMF is developing the foundation to support innovative research across the breadth of the hydrologic community, including classic PI-driven projects as well as over 20 grass-roots observatories that have been developing over the past 2 years. HMF is organized around three basic areas: water cycle instrumentation, biogeochemistry, and geophysics. Committees have been meeting to determine the most effective manner to deliver instrumentation, whether through special instrumentation packages proposed by host institutions, collaborative agreements with federal agencies, or contributions from industrial partners. These efforts are guided by the results of a community-wide survey conducted in Nov-Dec 2005 and a series of ongoing workshops. The survey helped identify the types of equipment that will advance the hydrological sciences and are often beyond the capabilities of individual PIs. Respondents to the survey indicated they were keen for HMF to focus on providing supported equipment such as atmospheric profilers like LIDAR, geophysical instrumentation ranging from airborne sensors to ground-penetrating radar, and field-deployed mass spectrophotometers.
A recently signed agreement

  12. Advances in National Capabilities for Consequence Assessment Modeling of Airborne Hazards

    SciTech Connect

    Nasstrom, J; Sugiyama, G; Foster, K; Larsen, S; Kosovic, B; Eme, B; Walker, H; Goldstein, P; Lundquist, J; Pobanz, B; Fulton, J

    2007-11-26

    This paper describes ongoing advancement of airborne hazard modeling capabilities in support of multiple agencies through the National Atmospheric Release Advisory Center (NARAC) and the Interagency Atmospheric Modeling and Atmospheric Assessment Center (IMAAC). A suite of software tools developed by Lawrence Livermore National Laboratory (LLNL) and collaborating organizations includes simple stand-alone, local-scale plume modeling tools for end user's computers, Web- and Internet-based software to access advanced 3-D flow and atmospheric dispersion modeling tools and expert analysis from the national center at LLNL, and state-of-the-science high-resolution urban models and event reconstruction capabilities.

  13. From bacterial genomics to metagenomics: concept, tools and recent advances.

    PubMed

    Sharma, Pooja; Kumari, Hansi; Kumar, Mukesh; Verma, Mansi; Kumari, Kirti; Malhotra, Shweta; Khurana, Jitendra; Lal, Rup

    2008-06-01

    In the last 20 years, the application of genomics tools has completely transformed the field of microbial research, primarily due to the revolution in sequencing technologies available today. This review therefore first describes the discovery, refinement, and automation of sequencing techniques in chronological order, followed by a brief discussion of microbial genomics. Some recently sequenced bacterial genomes are described to explain how complete genome data are now being used to derive interesting findings. Apart from the genomics of individual microbes, the study of unculturable microbiota from different environments is increasingly gaining importance. The second section is thus dedicated to the concept of metagenomics, describing environmental DNA isolation, metagenomic library construction, and screening methods used to look for novel and potentially important genes, enzymes, and biomolecules. It also covers pioneering studies in metagenomics that are offering new insights into the previously unappreciated microbial world. PMID:23100712

  14. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    SciTech Connect

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication, and antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings is collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard and recommends whether the proposed mitigation appears appropriate or whether additional controls should be implemented. Since the application is web-based, the information is captured in a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. The enhanced configuration control also created a readily available AHA library for research and reuse, while standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment

  15. Modeling of Spacecraft Advanced Chemical Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Benfield, Michael P. J.; Belcher, Jeremy A.

    2004-01-01

    This paper outlines the development of the Advanced Chemical Propulsion System (ACPS) model for Earth and Space Storable propellants. This model was developed by the System Technology Operation of SAIC-Huntsville for the NASA MSFC In-Space Propulsion Project Office. Each subsystem of the model is described. Selected model results will also be shown to demonstrate the model's ability to evaluate technology changes in chemical propulsion systems.

  16. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and their parameterizations. However, the resulting surface grids are close to, but not on, the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can easily be extended to support other unstructured grid generation systems. The data in GridTool are stored parametrically, so that once the problem is set up, one can modify the surfaces and the entire set of points, curves, and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI C; the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. Memory is allocated dynamically, so memory usage depends on the complexity of the geometry/grid. GridTool's data structure is based on a linked list, which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects, such as points, curves, patches, sources, and surfaces. At any given time there is always an active object, which is drawn in magenta, or in its highlighted color as defined by the resource file discussed later.
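The bi-linear patch representation above can be illustrated with a minimal evaluation routine (a hypothetical sketch, not GridTool's actual code): a patch interpolates its four corner points, so generated grid points lie on the patch, which is close to but generally not on the curved CAD surface it approximates.

```python
def bilinear_patch(p00, p10, p01, p11, u, v):
    """Evaluate a bi-linear patch with 3-D corner points p00..p11 at
    parameters (u, v) in [0, 1]^2; returns the interpolated point."""
    return tuple(
        (1 - u) * (1 - v) * a + u * (1 - v) * b + (1 - u) * v * c + u * v * d
        for a, b, c, d in zip(p00, p10, p01, p11)
    )

# corners of a unit square, with one corner lifted out of plane
corner = bilinear_patch((0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 1), 0, 0)
center = bilinear_patch((0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 1), 0.5, 0.5)
```

If the true CAD surface through these corners were curved, the patch center would generally sit off it, which is why GridTool's projection step back onto the CAD surface is needed.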

  17. Advanced Flow Control as a Management Tool in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Wugalter, S.

    1974-01-01

    Advanced Flow Control is closely related to Air Traffic Control. Air Traffic Control is the business of the Federal Aviation Administration. To formulate an understanding of advanced flow control and its use as a management tool in the National Airspace System, it becomes necessary to speak somewhat of air traffic control, the role of the FAA, and their relationship to advanced flow control. Also, this should dispel forever any notion that advanced flow control is the inspirational master-valve scheme to be used on the Alaskan Oil Pipeline.

  18. Scanning magnetoresistive microscopy: An advanced characterization tool for magnetic nanosystems.

    PubMed

    Mitin, D; Grobis, M; Albrecht, M

    2016-02-01

    An advanced scanning magnetoresistive microscopy (SMRM) - a robust magnetic imaging and probing technique - will be presented, which utilizes state-of-the-art recording heads of a hard disk drive as sensors. The spatial resolution of modern tunneling magnetoresistive sensors is nowadays comparable to the more commonly used magnetic force microscopes. Important advantages of SMRM are the ability to detect pure magnetic signals directly proportional to the out-of-plane magnetic stray field, negligible sensor stray fields, and the ability to apply local bipolar magnetic field pulses up to 10 kOe with bandwidths from DC up to 1 GHz. Moreover, the SMRM can be further equipped with a heating stage and external magnetic field units. The performance of this method and corresponding best practices are demonstrated by presenting various examples, including a temperature dependent recording study on hard magnetic L10 FeCuPt thin films, imaging of magnetic vortex states in an in-plane magnetic field, and their controlled manipulation by applying local field pulses. PMID:26931856

  19. Scanning magnetoresistive microscopy: An advanced characterization tool for magnetic nanosystems

    NASA Astrophysics Data System (ADS)

    Mitin, D.; Grobis, M.; Albrecht, M.

    2016-02-01

    An advanced scanning magnetoresistive microscopy (SMRM) — a robust magnetic imaging and probing technique — will be presented, which utilizes state-of-the-art recording heads of a hard disk drive as sensors. The spatial resolution of modern tunneling magnetoresistive sensors is nowadays comparable to the more commonly used magnetic force microscopes. Important advantages of SMRM are the ability to detect pure magnetic signals directly proportional to the out-of-plane magnetic stray field, negligible sensor stray fields, and the ability to apply local bipolar magnetic field pulses up to 10 kOe with bandwidths from DC up to 1 GHz. Moreover, the SMRM can be further equipped with a heating stage and external magnetic field units. The performance of this method and corresponding best practices are demonstrated by presenting various examples, including a temperature dependent recording study on hard magnetic L10 FeCuPt thin films, imaging of magnetic vortex states in an in-plane magnetic field, and their controlled manipulation by applying local field pulses.

  20. Model Standards Advance the Profession

    ERIC Educational Resources Information Center

    Journal of Staff Development, 2011

    2011-01-01

    Leadership by teachers is essential to serving the needs of students, schools, and the teaching profession. To that end, the Teacher Leadership Exploratory Consortium has developed Teacher Leader Model Standards to codify, promote, and support teacher leadership as a vehicle to transform schools for the needs of the 21st century. The Teacher…

  1. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    NASA Astrophysics Data System (ADS)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper-division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data, ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease of use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; multi-aperture analyses of variable stars over time using AstroImageJ; creating spectral energy distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen-alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. The students analyzed image data using the four software packages, imported the results into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at the West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed

  2. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

  3. Advanced Space Shuttle simulation model

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1982-01-01

    A non-recursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed. It provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center of gravity, and also for simulation of instantaneous gust gradients. Based on this model, time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes, entitled Shuttle Simulation Turbulence Tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 120,000 meters. A description of the turbulence generation procedure is provided, along with the results of validating the simulated turbulence, conclusions, and recommendations. One-dimensional von Karman spectra are tabulated, and the minimum frequency simulated is discussed. The results of spectral and statistical analyses of the SSTT are presented.
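A non-recursive spectral synthesis of this kind can be sketched as a sum of cosines whose amplitudes follow the longitudinal von Kármán gust spectrum and whose phases are random. The length scale, airspeed, mode count, and frequency band below are illustrative assumptions, not the SSTT parameters:

```python
import numpy as np

def von_karman_psd(omega, sigma=1.0, L=762.0, V=100.0):
    """One-sided von Karman PSD for the longitudinal gust component.
    sigma: gust RMS (m/s), L: turbulence length scale (m), V: airspeed
    (m/s); all values here are illustrative assumptions."""
    return (sigma**2 * 2 * L / (np.pi * V)) / (
        1 + (1.339 * L * omega / V) ** 2) ** (5 / 6)

def simulate_gusts(t, n_modes=256, omega_max=10.0, seed=0):
    """Non-recursive synthesis: sum cosines with amplitudes
    sqrt(2*S(w)*dw) and independent uniform random phases."""
    rng = np.random.default_rng(seed)
    domega = omega_max / n_modes
    omegas = (np.arange(n_modes) + 0.5) * domega       # band centers
    amps = np.sqrt(2 * von_karman_psd(omegas) * domega)
    phases = rng.uniform(0, 2 * np.pi, n_modes)
    return (amps[:, None]
            * np.cos(omegas[:, None] * t[None, :] + phases[:, None])
            ).sum(axis=0)

t = np.linspace(0, 60, 6000)   # 60 s of gust time history
u = simulate_gusts(t)
```

Because every sample is an explicit function of time, the series can be generated once and written to tape, as was done for the SSTT, rather than filtered recursively sample by sample.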

  4. Reservoir geology using 3D modelling tools

    SciTech Connect

    Dubrule, O.; Samson, P.; Segonds, D.

    1996-12-31

    The last decade has seen tremendous developments in the area of quantitative geological modelling. These developments have a significant impact on the current practice of constructing reservoir models. A structural model can first be constructed on the basis of depth-converted structural interpretations produced on a seismic interpretation workstation. Surfaces and faults can be represented as geological objects, and interactively modified. Once the tectonic framework has been obtained, intermediate stratigraphic surfaces can be constructed between the main structural surfaces. Within each layer, reservoir attributes can be represented using various techniques. Examples show how the distribution of different facies (i.e. from fine to coarse grain) can be represented, or how various depositional units (for instance channels, crevasses and lobes in a turbidite setting) can be modelled as geological "objects" with complex geometries. Elf Aquitaine, in close co-operation with the GOCAD project in Nancy (France), is investigating how geological models can be made more realistic by developing interactive functionalities. Examples show that, contrary to standard deterministic or geostatistical modelling techniques (which tend to be difficult to control), the use of new 3D tools allows the geologist to interactively modify geological surfaces (including faults) or volumetric properties. Thus, the sensitivity of various economic parameters (oil in place, connected volumes, reserves) to major geological uncertainties can be evaluated. It is argued that future breakthroughs in geological modelling techniques are likely to happen in the development of interactive approaches rather than in the research of new mathematical algorithms.

  6. An MCMC Circumstellar Disks Modeling Tool

    NASA Astrophysics Data System (ADS)

    Wolff, Schuyler; Perrin, Marshall D.; Mazoyer, Johan; Choquet, Elodie; Soummer, Remi; Ren, Bin; Pueyo, Laurent; Debes, John H.; Duchene, Gaspard; Pinte, Christophe; Menard, Francois

    2016-01-01

    We present an enhanced software framework for the Markov Chain Monte Carlo (MCMC) modeling of circumstellar disk observations, including spectral energy distributions and multi-wavelength images from a variety of instruments (e.g. GPI, NICI, HST, WFIRST). The goal is to self-consistently and simultaneously fit a wide variety of observables in order to place constraints on the physical properties of a given disk, while also rigorously assessing the uncertainties in the derived properties. This modular code is designed to work with a collection of existing modeling tools, ranging from simple scripts to define the geometry for optically thin debris disks, to full radiative transfer modeling of complex grain structures in protoplanetary disks (using the MCFOST radiative transfer modeling code). The MCMC chain relies on direct chi-squared comparison of model images/spectra to observations. We will include a discussion of how best to weight different observations in the modeling of a single disk and how to incorporate forward modeling from PCA PSF subtraction techniques. The code is open source, written in Python, and available on GitHub. Results for several disks at various evolutionary stages will be discussed.
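
    The core loop of chi-squared-driven MCMC fitting can be sketched with a minimal single-parameter Metropolis sampler. This is a toy illustration of the technique only; the framework described above handles many parameters, images, and radiative transfer models, and every name below is ours.

```python
import math
import random

def chi2(model, data, sigma):
    """Direct chi-squared comparison of model values to observations."""
    return sum((m - d) ** 2 / s ** 2 for m, d, s in zip(model, data, sigma))

def metropolis(lnprob, x0, steps=2000, scale=0.1, seed=1):
    """Minimal single-parameter Metropolis MCMC chain."""
    rng = random.Random(seed)
    x, lp = x0, lnprob(x0)
    chain = []
    for _ in range(steps):
        xp = x + rng.gauss(0.0, scale)      # random-walk proposal
        lpp = lnprob(xp)
        if math.log(rng.random()) < lpp - lp:
            x, lp = xp, lpp                 # accept
        chain.append(x)
    return chain

# toy fit: recover the constant level of four noisy measurements
data, sigma = [1.1, 0.9, 1.05, 0.95], [0.1] * 4
chain = metropolis(lambda x: -0.5 * chi2([x] * 4, data, sigma), x0=0.0)
```

    Replacing the lambda with a call into a disk model (and the list of data with images or spectra) gives the general structure of such a fitter; the chain's spread then quantifies the parameter uncertainties.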

  7. Modeling Advance Life Support Systems

    NASA Technical Reports Server (NTRS)

    Pitts, Marvin; Sager, John; Loader, Coleen; Drysdale, Alan

    1996-01-01

    Activities this summer consisted of two projects that involved computer simulation of bioregenerative life support systems for space habitats. Students in the Space Life Science Training Program (SLSTP) used the simulation, space station, to learn about relationships between humans, fish, plants, and microorganisms in a closed environment. One student completed a six-week project to modify the simulation by converting the microbes from anaerobic to aerobic, and then balancing the simulation's life support system. A detailed computer simulation of a closed lunar station using bioregenerative life support was attempted, but not enough was known about system constraints and constants in plant growth, bioreactor design for space habitats, and food preparation to develop an integrated model with any confidence. Instead of a complete detailed model resting on broad assumptions about the unknown system parameters, a framework for an integrated model was outlined and work begun on plant and bioreactor simulations. The NASA sponsors and the summer Fellow were satisfied with the progress made during the 10 weeks, and we have planned future cooperative work.

  8. Micromechanical modeling of advanced materials

    SciTech Connect

    Silling, S.A.; Taylor, P.A.; Wise, J.L.; Furnish, M.D.

    1994-04-01

    Funded as a laboratory-directed research and development (LDRD) project, the work reported here focuses on the development of a computational methodology to determine the dynamic response of heterogeneous solids on the basis of their composition and microstructural morphology. Using the solid dynamics wavecode CTH, material response is simulated on a scale sufficiently fine to explicitly represent the material's microstructure. Conducting "numerical experiments" on this scale, the authors explore the influence that the microstructure exerts on the material's overall response. These results are used in the development of constitutive models that take into account the effects of microstructure without explicit representation of its features. Applying this methodology to a glass-reinforced plastic (GRP) composite, the authors examined the influence of various aspects of the composite's microstructure on its response in a loading regime typical of impact and penetration. As a prerequisite to the microscale modeling effort, they conducted extensive materials testing on the constituents, S-2 glass and epoxy resin (UF-3283), obtaining the first Hugoniot and spall data for these materials. The results of this work are used in the development of constitutive models for GRP materials in transient-dynamics computer wavecodes.

  9. WMT: The CSDMS Web Modeling Tool

    NASA Astrophysics Data System (ADS)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can: design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse, and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged
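
    The idea of matching "uses" ports to "provides" ports from JSON-encoded component metadata can be sketched as below. The record schema here is hypothetical (the real WMT metadata format may differ), and all names are ours.

```python
import json

# Hypothetical component metadata records, not the actual WMT schema:
components = json.loads("""[
  {"name": "river", "provides": ["discharge"], "uses": []},
  {"name": "delta", "provides": [], "uses": ["discharge"]}
]""")

def couplings(components):
    """Pair each 'uses' port with a component that provides it,
    yielding (provider, user, port) triples for model coupling."""
    pairs = []
    for user in components:
        for port in user["uses"]:
            for provider in components:
                if port in provider["provides"]:
                    pairs.append((provider["name"], user["name"], port))
    return pairs
```

    A client can render these triples as candidate connections in the graphical model builder, leaving the user to confirm each link.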

  10. Accurate mask model for advanced nodes

    NASA Astrophysics Data System (ADS)

    Zine El Abidine, Nacer; Sundermann, Frank; Yesilada, Emek; Ndiaye, El Hadji Omar; Mishra, Kushlendra; Paninjath, Sankaranarayanan; Bork, Ingo; Buck, Peter; Toublan, Olivier; Schanen, Isabelle

    2014-07-01

    Standard OPC models consist of a physical optical model and an empirical resist model. The resist model compensates for the optical model's imprecision on top of modeling resist development. The optical model imprecision may result from mask topography effects and real mask information, including mask e-beam writing and mask process contributions. For advanced technology nodes, significant progress has been made in modeling mask topography to improve optical model accuracy. However, mask information is difficult to decorrelate from the standard OPC model. Our goal is to establish an accurate mask model through a dedicated calibration exercise. In this paper, we present a flow to calibrate an accurate mask model and enable its implementation. The study covers the different effects that should be embedded in the mask model as well as the experiments required to model them.

  11. Advances and Computational Tools towards Predictable Design in Biological Engineering

    PubMed Central

    2014-01-01

    The design process of complex systems in all fields of engineering requires a set of quantitatively characterized components and a method to predict the output of systems composed of such elements. This strategy relies on the modularity of the components used, or on the prediction of their context-dependent behaviour when part functioning depends on the specific context. Mathematical models usually support the whole process by guiding the selection of parts and by predicting the output of interconnected systems. Such a bottom-up design process cannot be trivially adopted for biological systems engineering, since part function is hard to predict when components are reused in different contexts. This issue and the intrinsic complexity of living systems limit the capability of synthetic biologists to predict the quantitative behaviour of biological systems. The high potential of synthetic biology strongly depends on the capability of mastering this issue. This review discusses the predictability issues of basic biological parts (promoters, ribosome binding sites, coding sequences, transcriptional terminators, and plasmids) when used to engineer simple and complex gene expression systems in Escherichia coli. A comparison between bottom-up and trial-and-error approaches is performed for all the discussed elements, and mathematical models supporting the prediction of part behaviour are illustrated. PMID:25161694
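
    The bottom-up prediction the review examines can be reduced to a deliberately naive sketch: under a full-modularity assumption, a circuit's output is the product of independently measured part activities, and predictions are scored by their fold error. This is our illustration of the concept, not a model from the review.

```python
def predicted_output(part_activities):
    """Bottom-up prediction assuming full modularity: expression level
    is the product of independently measured part activities
    (e.g. promoter, RBS, copy number). Real parts often violate this
    assumption, which is the review's central concern."""
    out = 1.0
    for a in part_activities:
        out *= a
    return out

def fold_error(predicted, observed):
    """Fold-change error, a common way to score such predictions."""
    return max(predicted, observed) / min(predicted, observed)
```

    Context effects show up precisely as large fold errors when a part characterized in one construct is reused in another.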

  12. Advances in Coupling of Kinetics and Molecular Scale Tools to Shed Light on Soil Biogeochemical Processes

    SciTech Connect

    Sparks, Donald

    2014-09-02

    Biogeochemical processes in soils such as sorption, precipitation, and redox play critical roles in the cycling and fate of nutrients, metal(loid)s and organic chemicals in soil and water environments. Advanced analytical tools enable soil scientists to track these processes in real-time and at the molecular scale. Our review focuses on recent research that has employed state-of-the-art molecular scale spectroscopy, coupled with kinetics, to elucidate the mechanisms of nutrient and metal(loid) reactivity and speciation in soils. We found that by coupling kinetics with advanced molecular and nano-scale tools major advances have been made in elucidating important soil chemical processes including sorption, precipitation, dissolution, and redox of metal(loids) and nutrients. Such advances will aid in better predicting the fate and mobility of nutrients and contaminants in soils and water and enhance environmental and agricultural sustainability.

  13. Advanced Write Tool Effects on 100-nm Node OPC

    NASA Astrophysics Data System (ADS)

    Buck, Peter D.; Green, Kent G.; Ibsen, Kent B.; Nakagawa, Kent H.; Hong, Dongsung; Krishnan, Prakash; Coburn, Dianna

    2002-12-01

    It has long been understood that there is an image fidelity difference between the integrated circuit design pattern and the photomask made from that pattern, largely due to the finite spot size of pattern generators. Furthermore, there are known differences in photomask image fidelity (rounding, jogs, etc.) between e-beam and laser pattern generators. Using a novel technique developed by DuPont Photomasks, Inc. (DPI), actual photomask fidelity has been simulated from design data to produce a more true-to-life representation of the mask. We have performed analytical simulations and printed-wafer measurements on Cypress 100-nm technology designs to determine the differences and effects on optical proximity correction (OPC) of two types of pattern generators: 50 keV e-beam and DUV laser. Both JEOL 9000MV-II+ and ETEC ALTA 4000 images were simulated and saved in GDSII format ("mask-GDSII"). These new mask images were processed through standard lithography simulation software to predict the effects each mask writer has on localized optical proximity effects. Simulations were compared to printed wafer results. A detailed comparison of the accuracy of the mask-GDSII and original design GDSII is performed. Furthermore, comparison of 50 keV e-beam and DUV laser image fidelity is completed, and recommendations are made on how to correct OPC models for each type of photomask generator. Lastly, conclusions are drawn about the use of DUV laser and 50 keV e-beam photomasks.

  14. Right approach to 3D modeling using CAD tools

    NASA Astrophysics Data System (ADS)

    Baddam, Mounica Reddy

    The thesis provides a step-by-step methodology to enable an instructor dealing with CAD tools to optimally guide his/her students through an understandable 3D modeling approach which will not only enhance their knowledge about the tool's usage but also enable them to achieve their desired result in comparatively lesser time. In the known practical field, there is particularly very little information available to apply CAD skills to formal beginners' training sessions. Additionally, advent of new software in 3D domain cumulates updating into a more difficult task. Keeping up to the industry's advanced requirements emphasizes the importance of more skilled hands in the field of CAD development, rather than just prioritizing manufacturing in terms of complex software features. The thesis analyses different 3D modeling approaches specified to the varieties of CAD tools currently available in the market. Utilizing performance-time databases, learning curves have been generated to measure their performance time, feature count etc. Based on the results, improvement parameters have also been provided for (Asperl, 2005).

  15. Advanced REACH Tool: development and application of the substance emission potential modifying factor.

    PubMed

    van Tongeren, Martie; Fransman, Wouter; Spankie, Sally; Tischer, Martin; Brouwer, Derk; Schinkel, Jody; Cherrie, John W; Tielemans, Erik

    2011-11-01

    The Advanced REACH Tool (ART) is an exposure assessment tool that combines mechanistically modelled inhalation exposure estimates with available exposure data using a Bayesian approach. The mechanistic model is based on nine independent principal modifying factors (MF). One of these MF is the substance emission potential, which addresses the intrinsic substance properties as determinants of the emission from a source. This paper describes the current knowledge and evidence on intrinsic characteristics of solids and liquids that determine the potential for their release into workplace air. The principal factor determining the release of aerosols from handling or processing powdered, granular, or pelletized materials is the dustiness of the material, as well as the weight fraction of the substance of interest in the powder and the moisture content. The partial vapour pressure is the main intrinsic factor determining the substance emission potential for emission of vapours. For generation of mist, the substance emission potential is determined by the viscosity of the liquid as well as the weight fraction of the substance of interest in the liquid. Within ART, release of vapours is considered for substances with a partial vapour pressure at the process temperature of 10 Pa or more, while mist formation is considered for substances with a vapour pressure ≤ 10 Pa. Relative multipliers are assigned for most of the intrinsic factors, with the exception of the weight fraction and the vapour pressure, which are applied as continuous variables in the estimation of the substance emission potential. Currently, estimation of substance emission potential is not available for fumes, fibres, and gases. The substance emission potential takes account of the latest thinking on emissions of dusts, mists, and vapours and in our view provides a good balance between theory and pragmatism. Expanding the knowledge base on substance emission potential will improve the predictive power of
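
    The 10 Pa decision rule described above can be written as a one-line classifier. This is a simplified sketch of that single rule only, not the ART substance emission potential model; the function name is ours.

```python
def emission_pathway(vapour_pressure_pa):
    """Classify the emission form of a liquid substance using the
    10 Pa partial-vapour-pressure threshold described in the paper
    (simplified: ART itself combines this with viscosity, weight
    fraction, and other modifying factors)."""
    if vapour_pressure_pa >= 10.0:
        return "vapour"
    return "mist"
```

    In the full model the chosen pathway then determines which intrinsic factors (viscosity for mists, the continuous vapour-pressure variable for vapours) feed the substance emission potential.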

  16. Recent advances in modeling stellar interiors (u)

    SciTech Connect

    Guzik, Joyce Ann

    2010-01-01

    Advances in stellar interior modeling are being driven by new data from large-scale surveys and high-precision photometric and spectroscopic observations. Here we focus on single stars in normal evolutionary phases; we will not discuss the many advances in modeling star formation, interacting binaries, supernovae, or neutron stars. We review briefly: (1) updates to the input physics of stellar models; (2) progress in two- and three-dimensional evolution and hydrodynamic models; (3) insights from oscillation data used to infer stellar interior structure and validate model predictions (asteroseismology). We close by highlighting a few outstanding problems, e.g., the driving mechanisms for hybrid γ Dor/δ Sct star pulsations, the cause of giant eruptions seen in luminous blue variables such as η Car and P Cyg, and the solar abundance problem.

  17. BASEMENT - a freeware simulation tool for hydro- and morphodynamic modelling

    NASA Astrophysics Data System (ADS)

    Vetsch, David; Rousselot, Patric; Volz, Christian; Vonwiller, Lukas; Siviglia, Annunziato; Peter, Samuel; Ehrbar, Daniel; Facchini, Matteo; Boes, Robert

    2014-05-01

    The application of numerical modelling tools to river engineering problems is a well-established methodology. In the present contribution, a numerical software for simulation of hydro- and morphodynamics is presented that is available free of charge, also for commercial use. The main motivation for development of the software is to provide a powerful, user-friendly tool that facilitates basic applications for practitioners as well as advanced model configuration for research. The underlying one- and two-dimensional models are based on the Saint-Venant equations for hydrodynamics, the Exner-Hirano equations for bed load, and an advection-diffusion approach with source terms for suspended sediment transport. Notable special features of the software are the arbitrary combination of 1-D and 2-D model domains, a PID controller for various monitoring values, and the use of an unstructured dual mesh to improve topographic accuracy. Besides the presentation of some appealing examples of use, the possibility of embedding the software into an open-source pre- and post-processing environment is highlighted.
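
    The advection-diffusion approach mentioned for suspended sediment can be illustrated with one explicit time step of a 1-D upwind scheme. This is a generic textbook sketch under our own naming, not BASEMENT's numerical scheme (which also handles unstructured 2-D meshes and source-term coupling).

```python
def advect_diffuse_step(c, u, D, dx, dt, source=0.0):
    """One explicit step of 1-D advection-diffusion with a constant
    source term, for a concentration field c on a uniform grid.
    Boundary cells are held fixed; stable only for small enough dt."""
    n = len(c)
    new = c[:]
    for i in range(1, n - 1):
        adv = -u * (c[i] - c[i - 1]) / dx               # upwind, u > 0
        dif = D * (c[i + 1] - 2.0 * c[i] + c[i - 1]) / dx ** 2
        new[i] = c[i] + dt * (adv + dif + source)
    return new
```

    Repeating the step advances the suspended-sediment field in time; the source term is where erosion and deposition exchanges with the bed would enter.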

  18. Modeling in the Classroom: An Evolving Learning Tool

    NASA Astrophysics Data System (ADS)

    Few, A. A.; Marlino, M. R.; Low, R.

    2006-12-01

    Among the early programs (early 1990s) focused on teaching Earth System Science were the Global Change Instruction Program (GCIP), funded by NSF through UCAR, and the Earth System Science Education Program (ESSE), funded by NASA through USRA. These two programs introduced modeling as a learning tool from the beginning, and they provided workshops, demonstrations, and lectures for their participating universities. These programs were aimed at university-level education. Recently, classroom modeling is experiencing a revival of interest. Drs. John Snow and Arthur Few conducted two workshops on modeling at the ESSE21 meeting in Fairbanks, Alaska, in August 2005. The Digital Library for Earth System Education (DLESE) at http://www.dlese.org provides web access to STELLA models and tutorials, and UCAR's Education and Outreach (EO) program holds workshops that include training in modeling. An important innovation to the STELLA modeling software from isee systems (http://www.iseesystems.com), called the "isee Player," is available as a free download. The Player allows users to view and run STELLA models, change model parameters, share models with colleagues and students, and make working models available on the web. This is important because the expert can create models, and the user can learn how the modeled system works. Another aspect of this innovation is that the educational benefits of modeling concepts can be extended throughout most of the curriculum. The procedure for building a working computer model of an Earth Science System follows this general format: (1) carefully define the question(s) for which you seek the answer(s); (2) identify the interacting system components and inputs contributing to the system's behavior; (3) collect the information and data that will be required to complete the conceptual model; (4) construct a system diagram (graphic) of the system that displays all of the system's central questions, components, relationships, and required inputs.
At this stage
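
    The four-step procedure above feeds a stock-and-flow simulation of the kind STELLA builds graphically. As a plain-code illustration of the same idea (ours, not STELLA's engine), a single stock can be advanced by Euler integration of its inflow and outflow:

```python
def simulate_stock(initial, inflow, outflow, dt, steps):
    """Euler integration of one stock with flow functions of the
    current stock level: the core of stock-and-flow modeling."""
    stock = initial
    history = [stock]
    for _ in range(steps):
        stock += dt * (inflow(stock) - outflow(stock))
        history.append(stock)
    return history

# example: a reservoir filling at a constant rate and draining in
# proportion to its level; equilibrium at 2.0 / 0.5 = 4.0
levels = simulate_stock(0.0, lambda s: 2.0, lambda s: 0.5 * s,
                        dt=0.1, steps=100)
```

    Coupling several such stocks through shared flow functions reproduces the system diagram of step (4) in executable form.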

  19. Collaborative Inquiry Learning: Models, tools, and challenges

    NASA Astrophysics Data System (ADS)

    Bell, Thorsten; Urhahne, Detlef; Schanze, Sascha; Ploetzner, Rolf

    2010-02-01

    Collaborative inquiry learning is one of the most challenging and exciting ventures for today's schools. It aims at bringing a new and promising culture of teaching and learning into the classroom where students in groups engage in self-regulated learning activities supported by the teacher. It is expected that this way of learning fosters students' motivation and interest in science, that they learn to perform steps of inquiry similar to scientists and that they gain knowledge on scientific processes. Starting from general pedagogical reflections and science standards, the article reviews some prominent models of inquiry learning. This comparison results in a set of inquiry processes being the basis for cooperation in the scientific network NetCoIL. Inquiry learning is conceived in several ways with emphasis on different processes. For an illustration of the spectrum, some main conceptions of inquiry and their focuses are described. In the next step, the article describes exemplary computer tools and environments from within and outside the NetCoIL network that were designed to support processes of collaborative inquiry learning. These tools are analysed by describing their functionalities as well as effects on student learning known from the literature. The article closes with challenges for further developments elaborated by the NetCoIL network.

  20. Laser vision: lidar as a transformative tool to advance critical zone science

    NASA Astrophysics Data System (ADS)

    Harpold, A. A.; Marshall, J. A.; Lyon, S. W.; Barnhart, T. B.; Fisher, B. A.; Donovan, M.; Brubaker, K. M.; Crosby, C. J.; Glenn, N. F.; Glennie, C. L.; Kirchner, P. B.; Lam, N.; Mankoff, K. D.; McCreight, J. L.; Molotch, N. P.; Musselman, K. N.; Pelletier, J.; Russo, T.; Sangireddy, H.; Sjöberg, Y.; Swetnam, T.; West, N.

    2015-06-01

    We propose that using lidar to its full potential will require numerous advances, including new and more powerful open-source processing tools, exploiting new lidar acquisition technologies, and improved integration with physically based models and complementary in situ and remote-sensing observations. We provide a 5-year vision that advocates for the expanded use of lidar data sets and highlights subsequent potential to advance the state of CZ science.

  1. Center for Advanced Modeling and Simulation Intern

    SciTech Connect

    Gertman, Vanessa

    2010-01-01

    Some interns just copy papers and seal envelopes. Not at INL! Check out how Vanessa Gertman, an INL intern working at the Center for Advanced Modeling and Simulation, spent her summer working with some intense visualization software. Lots more content like this is available at INL's facebook page http://www.facebook.com/idahonationallaboratory.

  3. Advances in a distributed approach for ocean model data interoperability

    USGS Publications Warehouse

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.

  4. Development of Advanced Light-Duty Powertrain and Hybrid Analysis Tool (SAE 2013-01-0808)

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis tool was created by the Environmental Protection Agency to evaluate greenhouse gas emissions and fuel efficiency of light-duty vehicles. It is a physics-based, forward-looking, full vehicle computer simulator, which is cap...

  5. Earthquake information products and tools from the Advanced National Seismic System (ANSS)

    USGS Publications Warehouse

    Wald, Lisa

    2006-01-01

    This Fact Sheet provides a brief description of postearthquake tools and products provided by the Advanced National Seismic System (ANSS) through the U.S. Geological Survey Earthquake Hazards Program. The focus is on products specifically aimed at providing situational awareness in the period immediately following significant earthquake events.

  6. THE AGWA – KINEROS2 SUITE OF MODELING TOOLS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A suite of modeling tools is described, ranging from the event-based KINEROS2 flash-flood forecasting tool to the continuous (K2-O2) KINEROS-OPUS biogeochemistry tool. The KINEROS2 flash-flood forecasting tool is being tested with the National Weather Service (NWS). The NWS version assimilates Dig...

  7. Air modeling: Air dispersion models; regulatory applications and technological advances

    SciTech Connect

    Miller, M.; Liles, R.

    1995-09-01

    Air dispersion models are a useful and practical tool for both industry and regulatory agencies. They serve as tools for engineering, permitting, and regulations development. Their cost effectiveness and ease of implementation compared to ambient monitoring is perhaps their most appealing trait. Based on the current momentum within the U.S. EPA to develop better models and contain regulatory burdens on industry, it is likely that air dispersion modeling will be a major player in future air regulatory initiatives.
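
    The workhorse of many regulatory dispersion models is the steady-state Gaussian plume formula, which gives the concentration downwind of a continuous point source with reflection at the ground. The sketch below is the textbook formula under our own naming, not the implementation of any specific EPA code.

```python
import math

def gaussian_plume(q, u, y, z, sigma_y, sigma_z, h):
    """Steady-state Gaussian plume concentration at a receptor:
    q emission rate (g/s), u wind speed (m/s), y crosswind and z
    vertical receptor coordinates (m), sigma_y/sigma_z dispersion
    parameters (m), h effective stack height (m). Ground reflection
    is included via the image-source term."""
    lateral = math.exp(-y ** 2 / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2.0 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2.0 * sigma_z ** 2)))
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

    In practice sigma_y and sigma_z grow with downwind distance according to stability-class curves, which is where much of the regulatory modeling effort lies.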

  8. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

    Increased mining production is continuing to challenge engineers and manufacturers to keep up. The pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. Also, it will discuss the systems engineering discipline and advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  9. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    SciTech Connect

    Franco, P.; Estrems, M.; Faura, F.

    2007-04-07

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on cutting tool surface in turning processes due to wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along cutting tool surface can be analysed, and the worn surface shape during the workpiece machining can be determined. The proposed model analyses the gradual degradation of cutting tool during turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained for description of material loss on cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from bibliography and experimental observation for AISI 4340 steel turning with WC-Co cutting tools.
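
    The abstract names wear-land width (VB) as a key wear descriptor but does not give the model's equations. Purely as an illustration of how a rate-based wear law can be integrated over cutting time, here is a sketch using a Usui-type law; the constants A and B are hypothetical, not taken from the paper:

    ```python
    import math

    # Illustrative Usui-type wear-rate law: dVB/dt = A * p * v * exp(-B / T),
    # with contact pressure p (MPa), sliding speed v (m/s), and absolute
    # temperature T (K). A and B are hypothetical placeholder constants.
    A, B = 1.0e-8, 2000.0

    def flank_wear(p, v, temp, t_end, dt=0.1):
        """Integrate wear-land width VB (mm) over cutting time with Euler steps."""
        vb, t = 0.0, 0.0
        while t < t_end:
            vb += A * p * v * math.exp(-B / temp) * dt
            t += dt
        return vb

    vb_60s = flank_wear(p=800.0, v=3.0, temp=1100.0, t_end=60.0)
    vb_120s = flank_wear(p=800.0, v=3.0, temp=1100.0, t_end=120.0)
    ```

    With a temperature field from a cutting simulation, the same integration yields VB as a function of cutting time, which is the kind of output the abstract describes.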

  10. Combustion modeling in advanced gas turbine systems

    SciTech Connect

    Smoot, L.D.; Hedman, P.O.; Fletcher, T.H.; Brewster, B.S.; Kramer, S.K.

    1995-12-31

    The goal of DOE's Advanced Turbine Systems program is to develop and commercialize ultra-high efficiency, environmentally superior, cost-competitive gas turbine systems for base-load applications in utility, independent power producer, and industrial markets. The primary objective of the program described here is to develop a comprehensive combustion model for advanced gas turbine combustion systems using natural gas (coal gasification or biomass fuels). The efforts included code evaluation (PCGC-3), coherent anti-Stokes Raman spectroscopy, laser Doppler anemometry, and laser-induced fluorescence.

  11. Advanced dynamic modelling for friction draft gears

    NASA Astrophysics Data System (ADS)

    Wu, Qing; Spiryagin, Maksym; Cole, Colin

    2015-04-01

    A white-box friction draft gear model has been developed with all components of the draft gear and their geometries considered. The conventional two-stage (loading and unloading) working process of the friction draft gear was detailed as a four-stage process. A preliminary work called the 'base model' was improved with regard to force-displacement characteristics, friction modelling and transitional characteristics. A set of impact test data were analysed; five types of draft gear behaviour were identified and modelled: hysteresis, stiffening, change of stage, locked unloading and softening. Simulated comparisons of three draft gear models were presented: a look-up table model, the base model and the advanced model.

  12. Integrated modeling tool for performance engineering of complex computer systems

    NASA Technical Reports Server (NTRS)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  13. Maturity Model for Advancing Smart Grid Interoperability

    SciTech Connect

    Knight, Mark; Widergren, Steven E.; Mater, J.; Montgomery, Austin

    2013-10-28

    Interoperability concerns the ability of devices and systems to connect and work together properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement on how things join at their interfaces. The quality of the agreements and the alignment of the parties involved present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC), sponsored by the United States Department of Energy, is supporting an effort to use concepts from the capability maturity models of the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted, and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experience gained with its use.

  14. Recombination hotspots: Models and tools for detection.

    PubMed

    Paul, Prosenjit; Nag, Debjyoti; Chakraborty, Supriyo

    2016-04-01

    Recombination hotspots are regions within the genome where the rate and frequency of recombination are optimal, with sizes varying from 1 to 2 kb. The recombination event is mediated by double-stranded break formation, guided by the combined enzymatic action of DNA topoisomerase and the Spo11 endonuclease. These regions are distributed non-uniformly throughout the human genome and cause distortions in the genetic map. Numerous lines of evidence suggest that the number of hotspots known in humans has increased manifold in recent years. A few facts about hotspot evolution have also been put forward, indicating differences in hotspot positions between chimpanzees and humans. In mice, recombination hotspots were found to be clustered within the major histocompatibility complex (MHC) region. Several models that help explain meiotic recombination have been proposed. Moreover, computational tools have been developed to locate hotspot positions and estimate recombination rates, which is of great interest to population and medical geneticists. Here we review the molecular mechanisms, models, and in silico prediction techniques for hotspot residues. PMID:26991854
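
    The detection tools reviewed are not specified in the abstract, but the simplest thresholding idea behind many of them can be sketched as follows; the window size, rates, and factor-of-ten threshold are illustrative assumptions:

    ```python
    def find_hotspots(rates, factor=10.0):
        """Flag windows whose local recombination rate exceeds `factor`
        times the background (median) rate, a simplified version of the
        thresholding idea behind hotspot-detection tools."""
        background = sorted(rates)[len(rates) // 2]  # median as background
        return [i for i, r in enumerate(rates) if r > factor * background]

    # cM/Mb over ten consecutive 2 kb windows; window 4 stands out
    rates = [0.4, 0.5, 0.3, 0.6, 9.2, 0.5, 0.4, 0.3, 0.5, 0.4]
    hot = find_hotspots(rates)  # → [4]
    ```

    Real tools estimate the local rate itself from population genetic data (e.g. linkage disequilibrium patterns) rather than taking it as given, which is where most of their complexity lies.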

  15. Integrated modeling of advanced optical systems

    NASA Astrophysics Data System (ADS)

    Briggs, Hugh C.; Needels, Laura; Levine, B. Martin

    1993-02-01

    This poster session paper describes an integrated modeling and analysis capability being developed at JPL under funding provided by the JPL Director's Discretionary Fund and the JPL Control/Structure Interaction Program (CSI). The posters briefly summarize the program capabilities and illustrate them with an example problem. The computer programs developed under this effort will provide an unprecedented capability for integrated modeling and design of high performance optical spacecraft. The engineering disciplines supported include structural dynamics, controls, optics and thermodynamics. Such tools are needed in order to evaluate the end-to-end system performance of spacecraft such as OSI, POINTS, and SMMM. This paper illustrates the proof-of-concept tools that have been developed to establish the technology requirements and demonstrate the new features of integrated modeling and design. The current program also includes implementation of a prototype tool based upon the CAESY environment being developed under the NASA Guidance and Control Research and Technology Computational Controls Program. This prototype will be available late in FY-92. The development plan proposes a major software production effort to fabricate, deliver, support and maintain a national-class tool from FY-93 through FY-95.

  16. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 11: Computer-Aided Manufacturing & Advanced CNC, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  17. Advances in Computationally Modeling Human Oral Bioavailability

    PubMed Central

    Wang, Junmei; Hou, Tingjun

    2015-01-01

    Although significant progress has been made in experimental high throughput screening (HTS) of ADME (absorption, distribution, metabolism, excretion) and pharmacokinetic properties, in silico modeling of ADME and Toxicity (ADME-Tox) remains indispensable in drug discovery, as it can guide the selection of drug candidates prior to expensive ADME screenings and clinical trials. Compared to other ADME-Tox properties, human oral bioavailability (HOBA) is particularly important but extremely difficult to predict. In this paper, advances in human oral bioavailability modeling are reviewed. Moreover, our insights into how to construct more accurate and reliable HOBA QSAR and classification models are also discussed. PMID:25582307

  18. Advances in computationally modeling human oral bioavailability.

    PubMed

    Wang, Junmei; Hou, Tingjun

    2015-06-23

    Although significant progress has been made in experimental high throughput screening (HTS) of ADME (absorption, distribution, metabolism, excretion) and pharmacokinetic properties, in silico modeling of ADME and Toxicity (ADME-Tox) remains indispensable in drug discovery, as it can guide the selection of drug candidates prior to expensive ADME screenings and clinical trials. Compared to other ADME-Tox properties, human oral bioavailability (HOBA) is particularly important but extremely difficult to predict. In this paper, advances in human oral bioavailability modeling are reviewed. Moreover, our insights into how to construct more accurate and reliable HOBA QSAR and classification models are also discussed. PMID:25582307

  19. Advanced Technology System Scheduling Governance Model

    SciTech Connect

    Ang, Jim; Carnes, Brian; Hoang, Thuc; Vigil, Manuel

    2015-06-11

    In the fall of 2005, the Advanced Simulation and Computing (ASC) Program appointed a team to formulate a governance model for allocating resources and scheduling the stockpile stewardship workload on ASC capability systems. This update to the original document takes into account the new technical challenges and roles for advanced technology (AT) systems and the new ASC Program workload categories that must be supported. The goal of this updated model is to effectively allocate and schedule AT computing resources among all three National Nuclear Security Administration (NNSA) laboratories for weapons deliverables that merit priority on this class of resource. The process outlined below describes how proposed work can be evaluated and approved for resource allocations while preserving high effective utilization of the systems. This approach will provide the broadest possible benefit to the Stockpile Stewardship Program (SSP).

  20. Combustion modeling in advanced gas turbine systems

    SciTech Connect

    Smoot, L.D.; Hedman, P.O.; Fletcher, T.H.

    1995-10-01

    The goal of the U.S. Department of Energy's Advanced Turbine Systems (ATS) program is to help develop and commercialize ultra-high efficiency, environmentally superior, and cost competitive gas turbine systems for base-load applications in the utility, independent power producer, and industrial markets. Combustion modeling, including emission characteristics, has been identified as a needed, high-priority technology by key professionals in the gas turbine industry.

  1. Advances in the identification of transfer function models using Prony analysis

    SciTech Connect

    Trudnowski, D.J.; Donnelly, M.K.; Hauer, J.F.

    1993-06-01

    This paper further advances the usefulness and understanding of Prony analysis as a tool for identification of models. The presented results allow more generality in the assumed model formulation. In addition, a comparison is made between Prony analysis and autoregressive moving-average (ARMA) modeling. Special attention is given to system conditions often encountered with power system electromechanical dynamics.
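
    Prony analysis identifies a signal's modes in two linear steps: fit a linear-prediction model to the samples, then take the roots of the resulting characteristic polynomial. A minimal two-mode, pure-real sketch of the classic method (not the generalized formulation the paper presents):

    ```python
    import math

    def prony2(y):
        """Recover the two modes z1, z2 of y[k] = c1*z1**k + c2*z2**k
        from four consecutive samples (classic two-step Prony method)."""
        y0, y1, y2, y3 = y[:4]
        # Step 1: solve the linear prediction y[k] = a1*y[k-1] + a2*y[k-2]
        # from the two equations at k = 2 and k = 3 (Cramer's rule)
        det = y1 * y1 - y0 * y2
        a1 = (y2 * y1 - y0 * y3) / det
        a2 = (y1 * y3 - y2 * y2) / det
        # Step 2: roots of the characteristic polynomial z^2 - a1*z - a2
        disc = math.sqrt(a1 * a1 + 4 * a2)
        return sorted([(a1 + disc) / 2, (a1 - disc) / 2])

    # Signal with modes 0.9 and 0.5 (two decaying envelopes)
    samples = [2 * 0.9**k + 0.5**k for k in range(4)]
    modes = prony2(samples)  # ≈ [0.5, 0.9]
    ```

    For power system electromechanical dynamics the modes are complex (damped oscillations), so practical implementations solve an overdetermined least-squares version of step 1 over many samples and take complex polynomial roots.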

  2. Modelling of Tool Wear and Residual Stress during Machining of AISI H13 Tool Steel

    NASA Astrophysics Data System (ADS)

    Outeiro, José C.; Umbrello, Domenico; Pina, José C.; Rizzuti, Stefania

    2007-05-01

    Residual stresses can enhance or impair the ability of a component to withstand loading conditions in service (fatigue, creep, stress corrosion cracking, etc.), depending on their nature: compressive or tensile, respectively. This poses enormous problems in structural assembly, as it affects the structural integrity of the whole part. In addition, tool wear issues are of critical importance in manufacturing, since they affect component quality, tool life and machining cost. Therefore, prediction and control of both tool wear and residual stresses in machining are absolutely necessary. In this work, a two-dimensional finite element model using an implicit Lagrangian formulation with automatic remeshing was applied to simulate the orthogonal cutting process of AISI H13 tool steel. To validate the model, the predicted and experimentally measured chip geometry, cutting forces, temperatures, tool wear and residual stresses on the machined affected layers were compared. The proposed FE model allowed us to investigate the influence of tool geometry, cutting regime parameters and tool wear on the residual stress distribution in the machined surface and subsurface of AISI H13 tool steel. The results indicate that, in order to reduce the magnitude of surface residual stresses, the cutting speed should be increased, the uncut chip thickness (or feed) should be reduced, and honed tools with large cutting-edge radii should be preferred over chamfered tools. Moreover, increasing tool wear increases the magnitude of surface residual stresses.

  3. Modelling of Tool Wear and Residual Stress during Machining of AISI H13 Tool Steel

    SciTech Connect

    Outeiro, Jose C.; Pina, Jose C.; Umbrello, Domenico; Rizzuti, Stefania

    2007-05-17

    Residual stresses can enhance or impair the ability of a component to withstand loading conditions in service (fatigue, creep, stress corrosion cracking, etc.), depending on their nature: compressive or tensile, respectively. This poses enormous problems in structural assembly, as it affects the structural integrity of the whole part. In addition, tool wear issues are of critical importance in manufacturing, since they affect component quality, tool life and machining cost. Therefore, prediction and control of both tool wear and residual stresses in machining are absolutely necessary. In this work, a two-dimensional finite element model using an implicit Lagrangian formulation with automatic remeshing was applied to simulate the orthogonal cutting process of AISI H13 tool steel. To validate the model, the predicted and experimentally measured chip geometry, cutting forces, temperatures, tool wear and residual stresses on the machined affected layers were compared. The proposed FE model allowed us to investigate the influence of tool geometry, cutting regime parameters and tool wear on the residual stress distribution in the machined surface and subsurface of AISI H13 tool steel. The results indicate that, in order to reduce the magnitude of surface residual stresses, the cutting speed should be increased, the uncut chip thickness (or feed) should be reduced, and honed tools with large cutting-edge radii should be preferred over chamfered tools. Moreover, increasing tool wear increases the magnitude of surface residual stresses.

  4. Simulation Tools Model Icing for Aircraft Design

    NASA Technical Reports Server (NTRS)

    2012-01-01

    the years from strictly a research tool to one used routinely by industry and other government agencies. Glenn contractor William Wright has been the architect of this development, supported by a team of researchers investigating icing physics, creating validation data, and ensuring development according to standard software engineering practices. The program provides a virtual simulation environment for determining where water droplets strike an airfoil in flight, what kind of ice would result, and what shape that ice would take. Users can enter geometries for specific, two-dimensional cross sections of an airfoil or other airframe surface and then apply a range of inputs - different droplet sizes, temperatures, airspeeds, and more - to model how ice would build up on the surface in various conditions. The program s versatility, ease of use, and speed - LEWICE can run through complex icing simulations in only a few minutes - have contributed to it becoming a popular resource in the aviation industry.

  5. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 9: Tool and Die, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  6. Accelerating advances in continental domain hydrologic modeling

    NASA Astrophysics Data System (ADS)

    Archfield, Stacey A.; Clark, Martyn; Arheimer, Berit; Hay, Lauren E.; McMillan, Hilary; Kiang, Julie E.; Seibert, Jan; Hakala, Kirsti; Bock, Andrew; Wagener, Thorsten; Farmer, William H.; Andréassian, Vazken; Attinger, Sabine; Viglione, Alberto; Knight, Rodney; Markstrom, Steven; Over, Thomas

    2015-12-01

    In the past, hydrologic modeling of surface water resources has mainly focused on simulating the hydrologic cycle at local to regional catchment modeling domains. There now exists a level of maturity among the catchment, global water security, and land surface modeling communities such that these communities are converging toward continental domain hydrologic models. This commentary, written from a catchment hydrology community perspective, provides a review of progress in each community toward this achievement, identifies common challenges the communities face, and details immediate and specific areas in which these communities can mutually benefit one another from the convergence of their research perspectives. Those include: (1) creating new incentives and infrastructure to report and share model inputs, outputs, and parameters in data services and open access, machine-independent formats for model replication or reanalysis; (2) ensuring that hydrologic models have: sufficient complexity to represent the dominant physical processes and adequate representation of anthropogenic impacts on the terrestrial water cycle, a process-based approach to model parameter estimation, and appropriate parameterizations to represent large-scale fluxes and scaling behavior; (3) maintaining a balance between model complexity and data availability as well as uncertainties; and (4) quantifying and communicating significant advancements toward these modeling goals.

  7. A National Strategy for Advancing Climate Modeling

    SciTech Connect

    Dunlea, Edward; Elfring, Chris

    2012-12-04

    Climate models are the foundation for understanding and projecting climate and climate-related changes and are thus critical tools for supporting climate-related decision making. This study developed a holistic strategy for improving the nation's capability to accurately simulate climate and related Earth system changes on decadal to centennial timescales. The committee's report is a high level analysis, providing a strategic framework to guide progress in the nation's climate modeling enterprise over the next 10-20 years. This study was supported by DOE, NSF, NASA, NOAA, and the intelligence community.

  8. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 1: Executive Summary, of a 15-Volume Set of Skills Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology (MAST) consortium was formed to address the shortage of skilled workers for the machine tools and metals-related industries. Featuring six of the nation's leading advanced technology centers, the MAST consortium developed, tested, and disseminated industry-specific skill standards and model curricula for…

  9. Multiscale and Multiphysics Modeling of Additive Manufacturing of Advanced Materials

    NASA Technical Reports Server (NTRS)

    Liou, Frank; Newkirk, Joseph; Fan, Zhiqiang; Sparks, Todd; Chen, Xueyang; Fletcher, Kenneth; Zhang, Jingwei; Zhang, Yunlu; Kumar, Kannan Suresh; Karnati, Sreekar

    2015-01-01

    The objective of this project is to research and develop a prediction tool for advanced additive manufacturing (AAM) processes for advanced materials, and to develop experimental methods to provide fundamental properties and establish validation data. Aircraft structures and engines demand materials that are stronger, usable at much higher temperatures, provide less acoustic transmission, and enable more aeroelastic tailoring than those currently used. Significant improvements in properties can only be achieved by processing the materials under nonequilibrium conditions, such as in AAM processes. AAM processes encompass a class of processes that use a focused heat source to create a melt pool on a substrate; examples include Electron Beam Freeform Fabrication and Direct Metal Deposition. These types of additive processes enable fabrication of parts directly from CAD drawings. To achieve the desired material properties and geometries of the final structure, it is necessary to assess the impact of process parameters and to predict optimized conditions with numerical modeling as an effective prediction tool. The processing targets are multiple and at different spatial scales, and the associated physical phenomena span multiple physics and length scales. In this project, AAM processes were modeled with a multiscale, multiphysics approach. A macroscale model was developed to investigate the residual stresses and distortion in AAM processes. A sequentially coupled, thermomechanical, finite element model was developed and validated experimentally. The results showed the temperature distribution, residual stress, and deformation within the formed deposits and substrates. A mesoscale model was developed to include heat transfer, phase change with a mushy zone, incompressible free surface flow, solute redistribution, and surface tension. Because of the excessive computing time required, a parallel computing approach was also tested. In addition ...

  10. Advancements in predictive plasma formation modeling

    NASA Astrophysics Data System (ADS)

    Purvis, Michael A.; Schafgans, Alexander; Brown, Daniel J. W.; Fomenkov, Igor; Rafac, Rob; Brown, Josh; Tao, Yezheng; Rokitski, Slava; Abraham, Mathew; Vargas, Mike; Rich, Spencer; Taylor, Ted; Brandt, David; Pirati, Alberto; Fisher, Aaron; Scott, Howard; Koniges, Alice; Eder, David; Wilks, Scott; Link, Anthony; Langer, Steven

    2016-03-01

    We present highlights from plasma simulations performed in collaboration with Lawrence Livermore National Labs. This modeling is performed to advance the rate of learning about optimal EUV generation for laser produced plasmas and to provide insights where experimental results are not currently available. The goal is to identify key physical processes necessary for an accurate and predictive model capable of simulating a wide range of conditions. This modeling will help to drive source performance scaling in support of the EUV Lithography roadmap. The model simulates pre-pulse laser interaction with the tin droplet and follows the droplet expansion into the main pulse target zone. Next, the interaction of the expanded droplet with the main laser pulse is simulated. We demonstrate the predictive nature of the code and provide comparison with experimental results.

  11. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

    This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high fidelity geometry based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., which facilitated rapid finite element analysis, sizing study and weight optimization. The high quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications to the structural design of a conventional aircraft and a high-altitude long-endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.

  12. Modeling and Tool Wear in Routing of CFRP

    SciTech Connect

    Iliescu, D.; Fernandez, A.; Gutierrez-Orrantia, M. E.; Lopez de Lacalle, L. N.

    2011-01-17

    This paper presents the prediction and evaluation of feed force in routing of carbon composite material. In order to extend tool life and improve quality of the machined surface, a better understanding of uncoated and coated tool behaviors is required. This work describes (1) the optimization of the geometry of multiple teeth tools minimizing the tool wear and the feed force, (2) the optimization of tool coating and (3) the development of a phenomenological model between the feed force, the routing parameters and the tool wear. The experimental results indicate that the feed rate, the cutting speed and the tool wear are the most significant factors affecting the feed force. In the case of multiple teeth tools, a particular geometry with 14 teeth right helix right cut and 11 teeth left helix right cut gives the best results. A thick AlTiN coating or a diamond coating can dramatically improve the tool life while minimizing the axial force, roughness and delamination. A wear model has then been developed based on an abrasive behavior of the tool. The model links the feed rate to the tool geometry parameters (tool diameter), to the process parameters (feed rate, cutting speed and depth of cut) and to the wear. The model presented has been verified by experimental tests.
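
    The abstract does not give the phenomenological model's functional form. One plausible shape, shown purely for illustration, is a power law in the process parameters with a linear wear amplification; all constants below are hypothetical placeholders, not fitted values:

    ```python
    def feed_force(f, v, ap, vb, k=50.0, a=0.8, b=-0.2, c=0.3, w=5.0):
        """Illustrative phenomenological feed-force model for routing CFRP:
        a power law in feed rate f (mm/rev), cutting speed v (m/min) and
        depth of cut ap (mm), amplified linearly by flank wear vb (mm).
        All constants are hypothetical placeholders."""
        return k * (f ** a) * (v ** b) * (ap ** c) * (1.0 + w * vb)

    fresh = feed_force(f=0.1, v=200.0, ap=2.0, vb=0.0)  # new tool
    worn = feed_force(f=0.1, v=200.0, ap=2.0, vb=0.2)   # worn tool
    ```

    In practice the exponents and the wear coefficient would be fitted by regression against measured forces across the experimental design, which is what a phenomenological model of this kind amounts to.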

  13. Thermochemical modelling of advanced CANDU reactor fuel

    NASA Astrophysics Data System (ADS)

    Corcoran, Emily Catherine

    2009-04-01

    With an aging fleet of nuclear generating facilities, the imperative to limit the use of non-renewable fossil fuels and the inevitable need for additional electricity to power Canada's economy, a renaissance in the use of nuclear technology in Canada is at hand. The experience and knowledge of over 40 years of CANDU research, development and operation in Ontario and elsewhere has been applied to a new generation of CANDU, the Advanced CANDU Reactor (ACR). Improved fuel design allows for an extended burnup, which is a significant improvement, enhancing the safety and the economics of the ACR. The use of a Burnable Neutron Absorber (BNA) material and Low Enriched Uranium (LEU) fuel has created a need to better understand these novel materials and fuel types. This thesis documents work to advance the scientific and technological knowledge of the ACR fuel design with respect to thermodynamic phase stability and fuel oxidation modelling. For the BNA material, a new BNA model is created based on the fundamental first principles of Gibbs energy minimization applied to material phase stability. For LEU fuel, the methodology used for the BNA model is applied to the oxidation of irradiated fuel. The pertinent knowledge base for uranium, oxygen and the major fission products is reviewed, updated and integrated to create a model that is applicable to current and future CANDU fuel designs. As part of this thesis, X-Ray Diffraction (XRD) and Coulometric Titration (CT) experiments are compared to the BNA and LEU models, respectively. From the analysis of the CT results, a number of improvements are proposed to enhance the LEU model and provide confidence in its application to ACR fuel. A number of applications for the potential use of these models are proposed and discussed. Keywords: CANDU Fuel, Gibbs Energy Minimization, Low Enriched Uranium (LEU) Fuel, Burnable Neutron Absorber (BNA) Material, Coulometric Titration, X-Ray Diffraction
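
    Gibbs energy minimization, the principle underlying both models, can be illustrated on a toy ideal A-B mixture, where the equilibrium composition found by brute-force minimization can be checked against the closed form. The chemical potentials and temperature below are arbitrary illustrative values, not CANDU fuel data:

    ```python
    import math

    R = 8.314  # gas constant, J/(mol*K)

    def gibbs(x, mu_a, mu_b, t):
        """Molar Gibbs energy of an ideal A-B mixture at mole fraction x of A:
        G = x*mu_A + (1-x)*mu_B + R*T*(x*ln x + (1-x)*ln(1-x))."""
        return (x * mu_a + (1 - x) * mu_b
                + R * t * (x * math.log(x) + (1 - x) * math.log(1 - x)))

    def minimize_gibbs(mu_a, mu_b, t, n=100000):
        """Brute-force grid search for the equilibrium composition."""
        xs = [(i + 1) / (n + 2) for i in range(n)]  # stay inside (0, 1)
        return min(xs, key=lambda x: gibbs(x, mu_a, mu_b, t))

    x_eq = minimize_gibbs(mu_a=-1000.0, mu_b=-2000.0, t=300.0)
    # Closed form for the ideal case: x = 1 / (1 + exp((mu_A - mu_B)/(R*T)))
    x_exact = 1.0 / (1.0 + math.exp(1000.0 / (R * 300.0)))
    ```

    Real thermochemical codes minimize over many phases and species subject to elemental mass balances, but the objective (total Gibbs energy) and the principle are the same.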

  14. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and presents the results visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values agree well with extensive experimental investigations and remained consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. A low relative error (RE = 0.09) and a high Willmott d-index (d = 0.981) reflect a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater. PMID:26856870
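    The abstract does not spell out its error definitions; a common formulation of the relative error and of Willmott's index of agreement, sketched here with made-up observation/prediction pairs, is:

```python
import numpy as np

def relative_error(obs, pred):
    # Mean relative deviation of predictions from observations.
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return float(np.mean(np.abs(pred - obs) / np.abs(obs)))

def willmott_d(obs, pred):
    # Willmott's index of agreement: 1 = perfect agreement, 0 = none.
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    num = np.sum((pred - obs) ** 2)
    den = np.sum((np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return float(1.0 - num / den)

obs = np.array([10.0, 12.0, 15.0, 11.0, 14.0])   # made-up measurements
pred = np.array([10.5, 11.8, 14.6, 11.3, 13.9])  # made-up model output
print("RE =", round(relative_error(obs, pred), 3),
      "d =", round(willmott_d(obs, pred), 3))
```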

  15. 76 FR 68011 - Medicare Program; Advanced Payment Model

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-02

    ... Medicare Program; Advanced Payment Model; Notice Federal Register / Vol. 76, No. 212 / Wednesday... Services Medicare Program; Advanced Payment Model AGENCY: Centers for Medicare & Medicaid Services (CMS), HHS. ACTION: Notice. SUMMARY: This notice announces the testing of the Advance Payment Model...

  16. A Tabletop Tool for Modeling Life Support Systems

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.; Majumdar, A.; McDaniels, D.; Stewart, E.

    2003-01-01

    This paper describes the development plan for a comprehensive research and diagnostic tool for aspects of advanced life support systems in space-based laboratories. Specifically, it aims to build a high-fidelity tabletop model that can be used for risk mitigation, failure mode analysis, contamination tracking, and reliability testing. We envision a comprehensive approach coupling experimental work with numerical simulation to develop this diagnostic tool. The plan calls for a 10%-scale transparent model of a space platform such as the International Space Station that operates with water or a specific matched-index-of-refraction liquid as the working fluid. This allows a 10 ft x 10 ft x 10 ft room with air flow to be scaled to a 1 ft x 1 ft x 1 ft tabletop model with water/liquid flow. Dynamic similitude at this length scale dictates model velocities of 67% of full scale, and thereby a model time scale of 15% of the full-scale system, meaning identical processes in the model are completed in 15% of the full-scale time. The use of an index-matching fluid (a fluid that matches the refractive index of cast acrylic, the model material) allows the entire model, with its complex internal geometry, to be made transparent and hence amenable to non-intrusive optical diagnostics. Using such a system, one can test environmental control parameters such as core flows (axial flows) and cross flows (from registers and diffusers); investigate potential problem areas such as flow short circuits, inadequate oxygen content, and build-up of other gases beyond desirable levels; test mixing processes within the system at local nodes or compartments; and assess overall system performance. The system allows quantitative measurement of contaminants introduced into the system and supports testing and optimization of contaminant tracking and removal. The envisaged system will be modular and hence flexible for quick configuration changes and subsequent testing. The data ...
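    The quoted 67% velocity scale and 15% time scale follow directly from Reynolds-number similitude for a water-filled 1:10 model of an air-filled room. A quick check, assuming standard textbook kinematic viscosities at room temperature:

```python
# Reynolds-number similitude for a 1:10 index-matched water model of an
# air-filled full-scale room (viscosity values are textbook approximations).
nu_air = 1.5e-5     # kinematic viscosity of air, m^2/s (~20 C)
nu_water = 1.0e-6   # kinematic viscosity of water, m^2/s (~20 C)
length_ratio = 0.1  # model length / full-scale length

# Matching Re = V*L/nu between model and full scale gives
# V_m / V_f = (L_f / L_m) * (nu_m / nu_f).
velocity_ratio = (1 / length_ratio) * (nu_water / nu_air)
# Time scales as L / V.
time_ratio = length_ratio / velocity_ratio

print(f"model velocity = {velocity_ratio:.0%} of full scale")
print(f"model time     = {time_ratio:.0%} of full scale")
```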

  17. Modeling: The Right Tool for the Job.

    ERIC Educational Resources Information Center

    Gavanasen, Varut; Hussain, S. Tariq

    1993-01-01

    Reviews the different types of models that can be used in groundwater modeling. Discusses the flow and contaminant transport models in the saturated zone, flow and contaminant transport in the variably saturated flow regime, vapor transport, biotransformation models, multiphase models, optimization algorithms, and potential pitfalls of using these…

  18. Advances in Sun-Earth Connection Modeling

    NASA Astrophysics Data System (ADS)

    Ganguli, S. B.; Gavrishchaka, V. V.

    2003-06-01

    Space weather forecasting is a focus of a multidisciplinary research effort motivated by a sensitive dependence of many modern technologies on geospace conditions. Adequate understanding of the physics of the Sun-Earth connection and associated multi-scale magnetospheric and ionospheric processes is an essential part of this effort. Modern physical simulation models such as multimoment multifluid models with effective coupling from small-scale kinetic processes can provide valuable insight into the role of various physical mechanisms operating during geomagnetic storm/substorm activity. However, due to necessary simplifying assumptions, physical models are still not well suited for accurate real-time forecasting. A complementary approach uses data-driven models capable of efficient processing of multi-scale spatio-temporal data. However, the majority of advanced nonlinear algorithms, including neural networks (NN), can encounter the so-called curse of dimensionality when applied to high-dimensional data. Forecasting of rare/extreme events such as large geomagnetic storms/substorms is of the greatest practical importance but is also very challenging for many existing models. A very promising algorithm that combines the power of the best nonlinear techniques with tolerance to high-dimensional and incomplete data is the support vector machine (SVM). We have summarized the advantages of the SVM and described a hybrid model based on SVM and extreme value theory (EVT) for rare event forecasting. Results of the SVM application to substorm forecasting and future directions are discussed.
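    The authors' hybrid SVM/EVT model is not detailed in the abstract. As a minimal illustration of the class-weighting idea behind rare-event SVM forecasting, here is a subgradient-trained linear SVM with an up-weighted rare class on synthetic data (the data, features, and parameters are all made up, and the kernelized SVM of the paper is replaced by a linear one for brevity):

```python
import numpy as np

def train_weighted_linear_svm(X, y, lam=0.01, lr=0.1, epochs=300, pos_weight=10.0):
    """Subgradient descent on a class-weighted hinge loss (linear SVM).
    y must be in {-1, +1}; pos_weight up-weights the rare +1 class so the
    margin is not dominated by the abundant quiet-time samples."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    c = np.where(y > 0, pos_weight, 1.0)  # per-sample class weights
    for _ in range(epochs):
        active = y * (X @ w + b) < 1      # margin violations
        grad_w = lam * w - (c[active, None] * y[active, None] * X[active]).sum(0) / n
        grad_b = -(c[active] * y[active]).sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))                  # synthetic "index" features
y = np.where(X[:, 0] + X[:, 1] > 2.2, 1, -1)   # rare "storm" class (~6%)
w, b = train_weighted_linear_svm(X, y)
pred = np.sign(X @ w + b)
recall = (pred[y == 1] == 1).mean()
print("event recall:", recall)
```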

  19. THE ATMOSPHERIC MODEL EVALUATION TOOL (AMET); AIR QUALITY MODULE

    EPA Science Inventory

    This presentation reviews the development of the Atmospheric Model Evaluation Tool (AMET) air quality module. The AMET tool is being developed to aid in the model evaluation. This presentation focuses on the air quality evaluation portion of AMET. Presented are examples of the...

  20. Storm Water Management Model Climate Adjustment Tool (SWMM-CAT)

    EPA Science Inventory

    The US EPA’s newest tool, the Stormwater Management Model (SWMM) – Climate Adjustment Tool (CAT), is meant to help municipal stormwater utilities better address potential climate change impacts affecting their operations. SWMM, first released in 1971, models hydrology and hydrauli...

  1. THE ATMOSPHERIC MODEL EVALUATION TOOL: METEOROLOGY MODULE

    EPA Science Inventory

    Air quality modeling is continuously expanding in sophistication and function. Currently, air quality models are being used for research, forecasting, regulatory related emission control strategies, and other applications. Results from air-quality model applications are closely ...

  2. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  3. Prospects for Advanced RF Theory and Modeling

    SciTech Connect

    Batchelor, D.B.

    1999-04-12

    This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focused on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need RF modeling to perform. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed.

  4. A Multi-layer, Data-driven Advanced Reasoning Tool for Intelligent Data Mining and Analysis for Smart Grids

    SciTech Connect

    Lu, Ning; Du, Pengwei; Greitzer, Frank L.; Guo, Xinxin; Hohimer, Ryan E.; Pomiak, Yekaterina G.

    2012-12-31

    This paper presents the multi-layer, data-driven advanced reasoning tool (M-DART), a proof-of-principle decision support tool for improved power system operation. M-DART will cross-correlate and examine different data sources to assess anomalies, infer root causes, and anneal data into actionable information. By performing higher-level reasoning “triage” of diverse data sources, M-DART focuses on early detection of emerging power system events and identifies highest priority actions for the human decision maker. M-DART represents a significant advancement over today’s grid monitoring technologies that apply offline analyses to derive model-based guidelines for online real-time operations and use isolated data processing mechanisms focusing on individual data domains. The development of the M-DART will bridge these gaps by reasoning about results obtained from multiple data sources that are enabled by the smart grid infrastructure. This hybrid approach integrates a knowledge base that is trained offline but tuned online to capture model-based relationships while revealing complex causal relationships among data from different domains.

  5. Information Model for Machine-Tool-Performance Tests

    PubMed Central

    Lee, Y. Tina; Soons, Johannes A.; Donmez, M. Alkan

    2001-01-01

    This report specifies an information model of machine-tool-performance tests in the EXPRESS [1] language. The information model provides a mechanism for describing the properties and results of machine-tool-performance tests. The objective of the information model is a standardized, computer-interpretable representation that allows for efficient archiving and exchange of performance test data throughout the life cycle of the machine. The report also demonstrates the implementation of the information model using three different implementation methods.

  6. Advancing an Information Model for Environmental Observations

    NASA Astrophysics Data System (ADS)

    Horsburgh, J. S.; Aufdenkampe, A. K.; Hooper, R. P.; Lehnert, K. A.; Schreuders, K.; Tarboton, D. G.; Valentine, D. W.; Zaslavsky, I.

    2011-12-01

    have been modified to support data management for the Critical Zone Observatories (CZOs). This paper will present limitations of the existing information model used by the CUAHSI HIS that have been uncovered through its deployment and use, as well as new advances to the information model, including: better representation of both in situ observations from field sensors and observations derived from environmental samples, extensibility in attributes used to describe observations, and observation provenance. These advances have been developed by the HIS team and the broader scientific community and will enable the information model to accommodate and better describe wider classes of environmental observations and to better meet the needs of the hydrologic science and CZO communities.
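    The observation-with-metadata pattern the abstract argues for (extensible attributes, sensor versus sample provenance) can be sketched as a simple record type. The fields below are hypothetical illustrations, not the actual CUAHSI HIS/ODM schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Observation:
    """Illustrative observation record (hypothetical fields): a value plus
    the metadata and provenance an information model must carry."""
    value: float
    variable: str
    units: str
    site: str
    timestamp: datetime
    method: str = "unknown"    # e.g. in situ sensor vs. lab-analyzed sample
    derived_from: list = field(default_factory=list)  # provenance links
    extras: dict = field(default_factory=dict)        # extensible attributes

raw = Observation(12.4, "water temperature", "degC", "Logan River",
                  datetime(2011, 7, 1, tzinfo=timezone.utc),
                  method="in situ sensor")
qc = Observation(12.3, "water temperature", "degC", "Logan River",
                 raw.timestamp, method="QC-corrected",
                 derived_from=[raw], extras={"qc_level": 1})
print(qc.derived_from[0].method)  # provenance chain back to the raw reading
```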

  7. Tampa Bay Water Clarity Model (TBWCM): As a Predictive Tool

    EPA Science Inventory

    The Tampa Bay Water Clarity Model was developed as a predictive tool for estimating the impact of changing nutrient loads on water clarity as measured by secchi depth. The model combines a physical mixing model with an irradiance model and nutrient cycling model. A 10 segment bi...

  8. Advanced Space Propulsion System Flowfield Modeling

    NASA Technical Reports Server (NTRS)

    Smith, Sheldon

    1998-01-01

    Solar thermal upper stage propulsion systems currently under development utilize small low-chamber-pressure/high-area-ratio nozzles. Consequently, the resulting flow in the nozzle is highly viscous, with the boundary layer flow comprising a significant fraction of the total nozzle flow area. Conventional uncoupled flow methods, which treat the nozzle boundary layer and inviscid flowfield separately and combine the two calculations via the influence of the boundary layer displacement thickness on the inviscid flowfield, are not accurate enough to adequately treat highly viscous nozzles. Navier-Stokes models such as VNAP2 can treat these flowfields but cannot perform a vacuum plume expansion for applications where the exhaust plume produces induced environments on adjacent structures. This study built upon recently developed artificial intelligence methods and user interface methodologies to couple the VNAP2 model for treating viscous nozzle flowfields with a vacuum plume flowfield model (RAMP2) that is currently a part of the Plume Environment Prediction (PEP) Model. This study integrated the VNAP2 code into the PEP model to produce an accurate, practical and user-friendly tool for calculating highly viscous nozzle and exhaust plume flowfields.

  9. Scratch as a computational modelling tool for teaching physics

    NASA Astrophysics Data System (ADS)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-05-01

    The Scratch online authoring tool, which features a simple programming language adapted to primary and secondary students, is increasingly used in schools because it offers students and teachers the opportunity to build scientific models and evaluate their behaviour, just as can be done with computational modelling programs. In this article, we briefly discuss why Scratch could be a useful tool for computational modelling in the primary or secondary physics classroom, and we present practical examples of how it can be used to build a model.
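    Scratch blocks cannot be reproduced in text here, but the modelling pattern the article describes, a "forever" loop that updates velocity and then position each tick, translates directly. Below is a generic free-fall model in Python (an assumed example, not one of the article's classroom models):

```python
# A falling-ball model equivalent to a typical Scratch forever loop:
# each tick, update velocity from gravity, then position from velocity.
g = -9.8          # gravitational acceleration, m/s^2
dt = 0.02         # seconds per tick
y, v = 10.0, 0.0  # initial height (m) and velocity (m/s)

t = 0.0
while y > 0:
    v += g * dt   # Scratch: change [v] by (g * dt)
    y += v * dt   # Scratch: change [y] by (v * dt)
    t += dt
print(f"hit the ground after ~{t:.2f} s")  # analytic: sqrt(2*10/9.8) ≈ 1.43 s
```

The same two-line update rule, with different forces, yields projectile, spring, and orbit models.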

  10. A Manually Operated, Advance Off-Stylet Insertion Tool for Minimally Invasive Cochlear Implantation Surgery

    PubMed Central

    Kratchman, Louis B.; Schurzig, Daniel; McRackan, Theodore R.; Balachandran, Ramya; Noble, Jack H.; Webster, Robert J.; Labadie, Robert F.

    2014-01-01

    The current technique for cochlear implantation (CI) surgery requires a mastoidectomy to gain access to the cochlea for electrode array insertion. It has been shown that microstereotactic frames can enable an image-guided, minimally invasive approach to CI surgery called percutaneous cochlear implantation (PCI) that uses a single drill hole for electrode array insertion, avoiding a more invasive mastoidectomy. Current clinical methods for electrode array insertion are not compatible with PCI surgery because they require a mastoidectomy to access the cochlea; thus, we have developed a manually operated electrode array insertion tool that can be deployed through a PCI drill hole. The tool can be adjusted using a preoperative CT scan for accurate execution of the advance off-stylet (AOS) insertion technique and requires less skill to operate than is currently required to implant electrode arrays. We performed three cadaver insertion experiments using the AOS technique and determined that all insertions were successful using CT and microdissection. PMID:22851233

  11. Applying Modeling Tools to Ground System Procedures

    NASA Technical Reports Server (NTRS)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  12. Tools and Equipment Modeling for Automobile Interactive Assembling Operating Simulation

    SciTech Connect

    Wu Dianliang; Zhu Hongmin

    2010-05-21

    Tools and equipment play an important role in the simulation of virtual assembly, especially in assembly process simulation and planning. Because of their variety in function and complexity in structure and manipulation, the simulation of tools and equipment remains a challenge for interactive assembly operations. Based on an analysis of the details and characteristics of interactive operations for automobile assembly, the functional requirements for tools and equipment of automobile assembly are given. Then, a unified modeling method for information expression and function realization of general tools and equipment is presented, and the handling methods for manual, semi-automatic, and automatic tools and equipment are discussed. Finally, an application to assembly simulation of the rear and front suspensions of the Roewe 750 automobile is given. The result shows that the modeling and handling methods are applicable to the interactive simulation of various tools and equipment, and can also be used to support assembly process planning in a virtual environment.

  13. Tools and Equipment Modeling for Automobile Interactive Assembling Operating Simulation

    NASA Astrophysics Data System (ADS)

    Wu, Dianliang; Zhu, Hongmin

    2010-05-01

    Tools and equipment play an important role in the simulation of virtual assembly, especially in assembly process simulation and planning. Because of their variety in function and complexity in structure and manipulation, the simulation of tools and equipment remains a challenge for interactive assembly operations. Based on an analysis of the details and characteristics of interactive operations for automobile assembly, the functional requirements for tools and equipment of automobile assembly are given. Then, a unified modeling method for information expression and function realization of general tools and equipment is presented, and the handling methods for manual, semi-automatic, and automatic tools and equipment are discussed. Finally, an application to assembly simulation of the rear and front suspensions of the Roewe 750 automobile is given. The result shows that the modeling and handling methods are applicable to the interactive simulation of various tools and equipment, and can also be used to support assembly process planning in a virtual environment.

  14. Advancing Cyberinfrastructure to support high resolution water resources modeling

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Ogden, F. L.; Jones, N.; Horsburgh, J. S.

    2012-12-01

    Addressing the problem of how the availability and quality of water resources at large scales are sensitive to climate variability, watershed alterations and management activities requires computational resources that combine data from multiple sources and support integrated modeling. Related cyberinfrastructure challenges include: 1) how can we best structure data and computer models to address this scientific problem through the use of high-performance and data-intensive computing, and 2) how can we do this in a way that discipline scientists without extensive computational and algorithmic knowledge and experience can take advantage of advances in cyberinfrastructure? This presentation will describe a new system called CI-WATER that is being developed to address these challenges and advance high resolution water resources modeling in the Western U.S. We are building on existing tools that enable collaboration to develop model and data interfaces that link integrated system models running within an HPC environment to multiple data sources. Our goal is to enhance the use of computational simulation and data-intensive modeling to better understand water resources. Addressing water resource problems in the Western U.S. requires simulation of natural and engineered systems, as well as representation of legal (water rights) and institutional constraints alongside the representation of physical processes. We are establishing data services to represent the engineered infrastructure and legal and institutional systems in a way that they can be used with high resolution multi-physics watershed modeling at high spatial resolution. These services will enable incorporation of location-specific information on water management infrastructure and systems into the assessment of regional water availability in the face of growing demands, uncertain future meteorological forcings, and existing prior-appropriations water rights. This presentation will discuss the informatics

  15. Novel Multiscale Modeling Tool Applied to Pseudomonas aeruginosa Biofilm Formation

    PubMed Central

    Biggs, Matthew B.; Papin, Jason A.

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool. PMID:24147108
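    MatNet itself couples MATLAB and NetLogo, which cannot be reproduced here. As a minimal stand-in for the agent-based half of such a hybrid, the sketch below (illustrative only; rules and parameters are invented, and the constraint-based metabolic layer is omitted) reproduces oxygen-limited biofilm growth qualitatively: oxygen enters at the surface, diffuses downward, and biomass divides only where local uptake stays above a threshold.

```python
# Minimal agent-based sketch of oxygen-limited biofilm growth.
depth = 20
oxygen = [1.0] + [0.0] * (depth - 1)  # relative O2 by layer (0 = surface)
cells = [1] + [0] * (depth - 1)       # biomass count by layer

for step in range(50):
    for i in range(1, depth):         # crude downward diffusion
        oxygen[i] += 0.5 * (oxygen[i - 1] - oxygen[i])
    for i in range(depth):
        uptake = min(oxygen[i], 0.1 * cells[i])
        oxygen[i] -= uptake           # consumption by local biomass
        if uptake > 0.05:             # enough oxygen: divide downward
            cells[min(i + 1, depth - 1)] += 1
    oxygen[0] = 1.0                   # surface boundary condition

print("biomass by layer (surface first):", cells[:8])
```

In a full hybrid model, the per-layer growth rule would be replaced by a call into a constraint-based metabolic simulation, which is the coupling MatNet provides.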

  16. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.

  17. An Analysis of Energy Savings Possible Through Advances in Automotive Tooling Technology

    SciTech Connect

    Rick Schmoyer, RLS

    2004-12-03

    The use of lightweight and highly formable advanced materials in automobile and truck manufacturing has the potential to save fuel. Advances in tooling technology would promote the use of these materials. This report describes an energy savings analysis performed to approximate the potential fuel savings and consequent carbon-emission reductions that would be possible because of advances in tooling in the manufacturing of, in particular, non-powertrain components of passenger cars and heavy trucks. Separate energy analyses are performed for cars and heavy trucks. Heavy trucks are considered to be Class 7 and 8 trucks (trucks rated over 26,000 lbs gross vehicle weight). A critical input to the analysis is a set of estimates of the percentage reductions in weight and drag that could be achieved by the implementation of advanced materials, as a consequence of improved tooling technology; these were obtained by surveying tooling industry experts who attended a DOE Workshop, Tooling Technology for Low-Volume Vehicle Production, held in Seattle and Detroit in October and November 2003. The analysis is also based on 2001 fuel consumption totals and on energy-audit component proportions of fuel use due to drag, rolling resistance, and braking. The consumption proportions are assumed constant over time, but an allowance is made for fleet growth. The savings estimate for a particular component is then the product of total fuel consumption, the percentage reduction for the component, and the energy-audit component proportion. Fuel savings estimates for trucks also account for weight-limited versus volume-limited operations. Energy savings are assumed to be of two types: (1) direct energy savings incurred through reduced forces that must be overcome to move the vehicle or to slow it down in braking, and (2) indirect energy savings through reductions in the required engine power, the production and transmission of which incur thermodynamic losses, internal friction, and other
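    The report's savings product (total fuel use × energy-audit share × fractional reduction) can be worked through directly. All numbers below are hypothetical placeholders, not the report's actual 2001 totals, survey estimates, or results:

```python
# Worked form of the savings product with hypothetical inputs.
total_fuel_gal = 73e9  # hypothetical annual passenger-car fuel use, gallons
audit_share = {        # hypothetical energy-audit shares of fuel use
    "aero drag": 0.11,
    "rolling resistance": 0.07,
    "braking": 0.06,
}
reduction = {          # hypothetical reductions from tooling-enabled materials
    "aero drag": 0.03,
    "rolling resistance": 0.10,
    "braking": 0.10,
}

# savings_component = total fuel * audit share * fractional reduction
savings = {k: total_fuel_gal * audit_share[k] * reduction[k] for k in audit_share}
total = sum(savings.values())
print(f"total estimated savings: {total / 1e9:.2f} billion gal/yr")
```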

  18. Advances in the identification of electrochemical transfer function models using Prony analysis

    SciTech Connect

    Trudnowski, D.J.; Donnelly, M.K.; Hauer, J.F.

    1993-02-01

    This paper further advances the usefulness and understanding of Prony analysis as a tool for identification of power system electromechanical oscillation models. These linear models are developed by analyzing power system ring-down data. The presented results allow more generality in the assumed model formulation. In addition, a comparison is made between Prony analysis and autoregressive moving-average (ARMA) modeling, which has also been proposed for analysis of system oscillations. Under the conditions investigated, the Prony algorithm performed more accurate identification.
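    How Prony analysis extracts damping and frequency from ring-down data can be shown with the classic three-step algorithm (a generic textbook sketch, not the paper's extended formulation; the test signal and its parameters are invented):

```python
import numpy as np

def prony(y, order):
    """Classic Prony fit of y[n] = sum_k A_k * z_k**n.
    Returns discrete-time poles z_k and complex amplitudes A_k."""
    N = len(y)
    # 1) Linear prediction: y[n] = a_0*y[n-1] + ... + a_{p-1}*y[n-p]
    A = np.column_stack([y[order - 1 - j:N - 1 - j] for j in range(order)])
    a, *_ = np.linalg.lstsq(A, y[order:], rcond=None)
    # 2) Poles are the roots of the prediction-error polynomial
    z = np.roots(np.concatenate(([1.0], -a)))
    # 3) Amplitudes from a Vandermonde least-squares fit
    V = np.vander(z, N, increasing=True).T
    amps, *_ = np.linalg.lstsq(V, y.astype(complex), rcond=None)
    return z, amps

# Synthetic ring-down: one damped oscillation, as in power-system data.
fs = 50.0                     # sample rate, Hz
t = np.arange(200) / fs
sigma, f = -0.3, 1.2          # damping (1/s) and frequency (Hz), invented
y = np.exp(sigma * t) * np.cos(2 * np.pi * f * t)

z, amps = prony(y, order=2)
lam = np.log(z) * fs          # continuous-time eigenvalues sigma +/- j*omega
print("damping:", lam[0].real, " freq (Hz):", abs(lam[0].imag) / (2 * np.pi))
```

On noiseless data of matching order the recovered damping and frequency are exact to numerical precision; ARMA-style fits, by contrast, model the signal statistically rather than as explicit damped exponentials.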

  19. Smallpox Models as Policy Tools

    PubMed Central

    2004-01-01

    Mathematical models can help prepare for and respond to bioterrorism attacks, provided that their strengths and weaknesses are clearly understood. A series of initiatives within the Department of Health and Human Services brought modelers together with biologists and epidemiologists who specialize in smallpox and experts in bioterrorism response and health policy and has led to the parallel development of models with different technical approaches but standardized scenarios, parameter ranges, and outcome measures. Cross-disciplinary interactions throughout the process supported the development of models focused on systematically comparing alternate intervention strategies, determining the most important issues in decision-making, and identifying gaps in current knowledge. PMID:15550219

  20. AFDM: An Advanced Fluid-Dynamics Model

    SciTech Connect

    Bohl, W.R.; Parker, F.R.; Wilhelm, D. (Inst. fuer Neutronenphysik und Reaktortechnik); Berthier, J.; Goutagny, L. (Inst. de Protection et de Surete Nucleaire); Ninokata,

    1990-09-01

    AFDM, or the Advanced Fluid-Dynamics Model, is a computer code that investigates new approaches to simulating the multiphase-flow fluid-dynamics aspects of severe accidents in fast reactors. The AFDM formalism starts with differential equations similar to those in the SIMMER-II code. These equations are modified to treat three velocity fields and supplemented with a variety of new models. The AFDM code has 12 topologies describing what material contacts are possible depending on the presence or absence of a given material in a computational cell, on the dominant liquid, and on the continuous phase. Single-phase, bubbly, churn-turbulent, cellular, and dispersed flow regimes are permitted for the pool situations modeled. Virtual mass terms are included for vapor in liquid-continuous flow. Interfacial areas between the continuous and discontinuous phases are convected to allow some tracking of phenomenological histories. Interfacial areas are also modified by models of nucleation, dynamic forces, turbulence, flashing, coalescence, and mass transfer. Heat transfer is generally treated using engineering correlations. Liquid-vapor phase transitions are handled with the nonequilibrium, heat-transfer-limited model, whereas melting and freezing processes are based on equilibrium considerations. Convection is treated using a fractional-step method of time integration, including a semi-implicit pressure iteration. A higher-order differencing option is provided to control numerical diffusion. The Los Alamos SESAME equation-of-state has been implemented using densities and temperatures as the independent variables. AFDM programming has vectorized all computational loops consistent with the objective of producing an exportable code. 24 refs., 4 figs.

  1. Nonlinear Dynamic Models in Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Jones, Harry

    2002-01-01

    To facilitate analysis, ALS systems are often assumed to be linear and time invariant, but they usually have important nonlinear and dynamic aspects. Nonlinear dynamic behavior can be caused by time varying inputs, changes in system parameters, nonlinear system functions, closed loop feedback delays, and limits on buffer storage or processing rates. Dynamic models are usually cataloged according to the number of state variables. The simplest dynamic models are linear, using only integration, multiplication, addition, and subtraction of the state variables. A general linear model with only two state variables can produce all the possible dynamic behavior of linear systems with many state variables, including stability, oscillation, or exponential growth and decay. Linear systems can be described using mathematical analysis. Nonlinear dynamics can be fully explored only by computer simulations of models. Unexpected behavior is produced by simple models having only two or three state variables with simple mathematical relations between them. Closed loop feedback delays are a major source of system instability. Exceeding limits on buffer storage or processing rates forces systems to change operating mode. Different equilibrium points may be reached from different initial conditions. Instead of one stable equilibrium point, the system may have several equilibrium points, oscillate at different frequencies, or even behave chaotically, depending on the system inputs and initial conditions. The frequency spectrum of an output oscillation may contain harmonics and the sums and differences of input frequencies, but it may also contain a stable limit cycle oscillation not related to input frequencies. We must investigate the nonlinear dynamic aspects of advanced life support systems to understand and counter undesirable behavior.
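    The claim that a two-state linear model already exhibits all of the qualitative behaviors of larger linear systems can be checked by inspecting eigenvalues of dx/dt = Ax. The sketch below (generic linear-systems mathematics, not an ALS model) classifies a few example matrices:

```python
import numpy as np

def behavior(A, tol=1e-9):
    """Classify dx/dt = A x by the eigenvalues of the 2x2 matrix A:
    any positive real part -> growth; imaginary parts -> oscillation,
    damped if every real part is negative; otherwise pure decay."""
    eig = np.linalg.eigvals(np.asarray(A, float))
    if np.any(eig.real > tol):
        return "exponential growth"
    if np.any(np.abs(eig.imag) > tol):
        return "damped oscillation" if np.all(eig.real < -tol) else "sustained oscillation"
    return "exponential decay"

print(behavior([[-1, 0], [0, -2]]))     # exponential decay
print(behavior([[0, 1], [-1, 0]]))      # sustained oscillation
print(behavior([[0, 1], [-1, -0.2]]))   # damped oscillation
print(behavior([[0.1, 0], [0, -1]]))    # exponential growth
```

The nonlinear effects the abstract lists (buffer limits, feedback delays, mode switching) are exactly what this eigenvalue picture cannot capture, which is why simulation is needed there.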

  2. Biochemical pathway modeling tools for drug target detection in cancer and other complex diseases.

    PubMed

    Marin-Sanguino, Alberto; Gupta, Shailendra K; Voit, Eberhard O; Vera, Julio

    2011-01-01

    In the near future, computational tools and methods based on the mathematical modeling of biomedically relevant networks and pathways will be necessary for the design of therapeutic strategies that fight complex multifactorial diseases. Beyond the use of pharmacokinetic and pharmacodynamic approaches, we propose here the use of dynamic modeling as a tool for describing and analyzing the structure and responses of signaling, genetic and metabolic networks involved in such diseases. Specifically, we discuss the design and construction of meaningful models of biochemical networks, as well as tools, concepts, and strategies for using these models in the search for potential drug targets. We describe three different families of computational tools: predictive model simulations as tools for designing optimal drug profiles and doses; sensitivity analysis as a method to detect key interactions that affect critical outcomes and other characteristics of the network; and other tools integrating mathematical modeling with advanced computation and optimization for detecting potential drug targets. Furthermore, we show how potential drug targets detected with these approaches can be used in a computer-aided context to design or select new drug molecules. All concepts are illustrated with simplified examples and with actual case studies extracted from the recent literature. PMID:21187230
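    The sensitivity-analysis family of tools can be illustrated with a finite-difference sketch on a toy pathway (the two-step kinetics, parameter values, and function names below are all hypothetical, not drawn from the paper): the larger the logarithmic sensitivity of the output flux to a rate constant, the better that enzyme ranks as a candidate target.

```python
# Finite-difference logarithmic sensitivity of a toy pathway's output flux.
def steady_flux(k1, k2, S=10.0):
    # Two irreversible steps in series; at steady state k1*S = k2*X.
    X = k1 * S / k2       # steady-state intermediate concentration
    return k2 * X         # output flux J

def log_sensitivity(f, params, name, h=1e-4):
    """Central-difference estimate of d ln J / d ln k for parameter `name`."""
    up = dict(params); up[name] *= (1 + h)
    dn = dict(params); dn[name] *= (1 - h)
    return (f(**up) - f(**dn)) / (2 * h * f(**params))

params = {"k1": 0.5, "k2": 2.0}   # hypothetical rate constants
for name in params:
    print(name, round(log_sensitivity(steady_flux, params, name), 3))
```

Here all control lies in the first step (sensitivity 1 for k1, 0 for k2), the classic rate-limiting-step result; in realistic kinetic models the same calculation distributes control across many reactions.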

  3. Modeling of advanced fossil fuel power plants

    NASA Astrophysics Data System (ADS)

    Zabihian, Farshid

    The first part of this thesis deals with greenhouse gas (GHG) emissions from fossil fuel-fired power stations. The GHG emission estimation for the fossil fuel power generation industry signifies that emissions from this industry can be significantly reduced by fuel switching and adoption of advanced power generation technologies. In the second part of the thesis, steady-state models of some of the advanced fossil fuel power generation technologies are presented. The impacts of various parameters on the solid oxide fuel cell (SOFC) overpotentials and outputs are investigated. The detailed analyses of the operation of the hybrid SOFC-gas turbine (GT) cycle when fuelled with methane and syngas demonstrate that the efficiencies of the cycles with and without anode exhaust recirculation are close, but the specific power of the former is much higher. The parametric analysis of the performance of the hybrid SOFC-GT cycle indicates that increasing the system operating pressure, SOFC operating temperature, and fuel utilization factor improves cycle efficiency, but the effects of increasing SOFC current density and turbine inlet temperature are not favourable. The analysis of the operation of the system when fuelled with a wide range of fuel types demonstrates that the hybrid SOFC-GT cycle efficiency can be between 59% and 75%, depending on the inlet fuel type. Then, the system performance is investigated when methane as a reference fuel is replaced with various species that can be found in the fuel, i.e., H2, CO2, CO, and N2. The results point out that the influence of various species can be significant and different for each case. The experimental and numerical analyses of a biodiesel fuelled micro gas turbine indicate that fuel switching from petrodiesel to biodiesel can influence operational parameters of the system. The modeling results of gas turbine-based power plants signify that relatively simple models can predict plant performance with acceptable accuracy. The unique
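
    As a flavour of the cell-level calculations inside such steady-state SOFC models, the sketch below computes the open-circuit (Nernst) voltage for hydrogen fuel. The standard potential E0 is an assumed round number for roughly 800 °C, not a value from the thesis; real models derive it from Gibbs energy data and then subtract activation, ohmic, and concentration overpotentials to get the operating voltage.

```python
# Hedged sketch: open-circuit (Nernst) voltage of a hydrogen-fuelled SOFC,
# one building block of steady-state cell models. E0 is an assumed standard
# potential near 1073 K, not a value taken from the thesis.

import math

R = 8.314      # J/(mol K)
F = 96485.0    # C/mol

def nernst_voltage(p_h2, p_o2, p_h2o, T=1073.0, E0=0.98):
    """E = E0 + (R T / 2F) ln( p_H2 * p_O2^0.5 / p_H2O ); pressures in atm."""
    return E0 + (R * T / (2 * F)) * math.log(p_h2 * math.sqrt(p_o2) / p_h2o)

if __name__ == "__main__":
    print(f"OCV: {nernst_voltage(0.97, 0.21, 0.03):.3f} V")
```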

  4. ARPENTEUR: a web-based photogrammetry tool for architectural modeling

    NASA Astrophysics Data System (ADS)

    Grussenmeyer, Pierre; Drap, Pierre

    2000-12-01

    ARPENTEUR is a web application for digital photogrammetry mainly dedicated to architecture. ARPENTEUR has been developed since 1998 by two French research teams: the 'Photogrammetry and Geomatics' group of ENSAIS-LERGEC's laboratory and the MAP-gamsau CNRS laboratory located in the school of Architecture of Marseille. The software package is a web-based tool, since photogrammetric concepts are embedded in Web technology and the Java programming language. The aim of this project is to propose a photogrammetric software package and 3D modeling methods available on the Internet as applets through a simple browser. The use of Java and the Web platform offers many advantages. Distributing software to any platform, at any place connected to the Internet, is of course very promising. The updating is done directly on the server, and the user always works with the latest release installed on the server. Three years ago, the first prototype of ARPENTEUR was based on the Java Development Kit, at the time available for only some browsers. Nowadays, we are working with the JDK 1.3 plug-in enriched by the Java Advanced Imaging library.

  5. Data Modeling & the Infrastructural Nature of Conceptual Tools

    ERIC Educational Resources Information Center

    Lesh, Richard; Caylor, Elizabeth; Gupta, Shweta

    2007-01-01

    The goal of this paper is to demonstrate the infrastructural nature of many modern conceptual technologies. The focus of this paper is on conceptual tools associated with elementary types of data modeling. We intend to show a variety of ways in which these conceptual tools not only express thinking, but also mold and shape thinking. And those ways…

  6. Scratch as a Computational Modelling Tool for Teaching Physics

    ERIC Educational Resources Information Center

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  7. Model Fusion Tool - the Open Environmental Modelling Platform Concept

    NASA Astrophysics Data System (ADS)

    Kessler, H.; Giles, J. R.

    2010-12-01

    The vision of an Open Environmental Modelling Platform - seamlessly linking geoscience data, concepts and models to aid decision making in times of environmental change. Governments and their executive agencies across the world are facing increasing pressure to make decisions about the management of resources in light of population growth and environmental change. In the UK for example, groundwater is becoming a scarce resource for large parts of its most densely populated areas. At the same time river and groundwater flooding resulting from high rainfall events are increasing in scale and frequency and sea level rise is threatening the defences of coastal cities. There is also a need for affordable housing, improved transport infrastructure and waste disposal as well as sources of renewable energy and sustainable food production. These challenges can only be resolved if solutions are based on sound scientific evidence. Although we have knowledge and understanding of many individual processes in the natural sciences, it is clear that a single science discipline is unable to address these questions and their inter-relationships. Modern science increasingly employs computer models to simulate the natural, economic and human systems. Management and planning require scenario modelling, forecasts and ‘predictions’. Although the outputs are often impressive in terms of apparent accuracy and visualisation, they are inherently not suited to simulate the response to feedbacks from other models of the earth system, such as the impact of human actions. Geological Survey Organisations (GSO) are increasingly employing advances in Information Technology to visualise and improve their understanding of geological systems. Instead of two-dimensional paper maps and reports, many GSOs now produce three-dimensional geological framework models and groundwater flow models as their standard output. Additionally, the British Geological Survey has developed standard routines to link geological

  8. EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing

    PubMed Central

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments. PMID:21687590

  9. Modeling and modification of medical 3D objects. The benefit of using a haptic modeling tool.

    PubMed

    Kling-Petersen, T; Rydmark, M

    2000-01-01

    The Computer Laboratory of the medical faculty in Goteborg (Mednet) has since the end of 1998 been one of a limited number of participants in the development of a new modeling tool together with SensAble Technologies Inc [http://www.sensable.com/]. The software, called SensAble FreeForm, was officially released at Siggraph in September 1999. Briefly, the software mimics the modeling techniques traditionally used by clay artists. An imported model or a user-defined block of "clay" can be modified using different tools such as a ball, square block, scraper, etc., via the use of a SensAble Technologies PHANToM haptic arm. The model deforms in 3D as a result of touching the "clay" with any selected tool, and the amount of deformation is proportional to the force applied. By getting instantaneous haptic as well as visual feedback, precise and intuitive changes are easily made. While SensAble FreeForm lacks several of the features normally associated with a 3D modeling program (such as text handling, application of surface and bumpmaps, high-end rendering engines, etc.), its strength lies in the ability to rapidly create non-geometric 3D models. For medical use, very few anatomically correct models are created from scratch. However, FreeForm's tools enable advanced modification of reconstructed or 3D-scanned models. One of the main problems with 3D laserscanning of medical specimens is that the technique usually leaves holes or gaps in the dataset corresponding to areas in shadow, such as orifices, deep grooves, etc. By using FreeForm's different tools, these defects are easily corrected and gaps are filled in. Similarly, traditional 3D reconstruction (based on serial sections etc.) often shows artifacts as a result of the triangulation and/or tessellation processes. These artifacts usually manifest as unnatural ridges or uneven areas ("the accordion effect"). FreeForm contains a smoothing algorithm that enables the user to select an area to be modified and subsequently apply

  10. Applying computer simulation models as learning tools in fishery management

    USGS Publications Warehouse

    Johnson, B.L.

    1995-01-01

    Computer models can be powerful tools for addressing many problems in fishery management, but uncertainty about how to apply models and how they should perform can lead to a cautious approach to modeling. Within this approach, we expect models to make quantitative predictions but only after all model inputs have been estimated from empirical data and after the model has been tested for agreement with an independent data set. I review the limitations to this approach and show how models can be more useful as tools for organizing data and concepts, learning about the system to be managed, and exploring management options. Fishery management requires deciding what actions to pursue to meet management objectives. Models do not make decisions for us but can provide valuable input to the decision-making process. When empirical data are lacking, preliminary modeling with parameters derived from other sources can help determine priorities for data collection. When evaluating models for management applications, we should attempt to define the conditions under which the model is a useful, analytical tool (its domain of applicability) and should focus on the decisions made using modeling results, rather than on quantitative model predictions. I describe an example of modeling used as a learning tool for the yellow perch Perca flavescens fishery in Green Bay, Lake Michigan.
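
    In the spirit of "preliminary modeling with parameters derived from other sources", even a minimal surplus-production (Schaefer) model can organize a discussion of harvest options. Everything below is a hypothetical sketch of the genre, not the Green Bay yellow perch model:

```python
# Illustrative sketch of modeling as a learning tool: a Schaefer
# surplus-production model used to compare harvest-rate options before
# detailed data exist. All parameter values are hypothetical.

def equilibrium_yield(harvest_rate, r=0.4, K=1000.0, B0=500.0, years=200):
    B = B0
    for _ in range(years):
        # logistic growth minus proportional harvest, floored at zero
        B = max(0.0, B + r * B * (1 - B / K) - harvest_rate * B)
    return harvest_rate * B  # equilibrium annual yield

if __name__ == "__main__":
    for h in (0.1, 0.2, 0.3):
        print(f"harvest rate {h:.1f}: equilibrium yield {equilibrium_yield(h):.0f}")
```

    Exploring the harvest rate immediately surfaces the management trade-off: yield peaks at an intermediate rate (here h = r/2) and falls off on either side, which is the kind of qualitative insight the author argues models should provide.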

  11. Irena : tool suite for modeling and analysis of small-angle scattering.

    SciTech Connect

    Ilavsky, J.; Jemian, P.

    2009-04-01

    Irena, a tool suite for analysis of both X-ray and neutron small-angle scattering (SAS) data within the commercial Igor Pro application, brings together a comprehensive suite of tools useful for investigations in materials science, physics, chemistry, polymer science and other fields. In addition to Guinier and Porod fits, the suite combines a variety of advanced SAS data evaluation tools for the modeling of size distribution in the dilute limit using maximum entropy and other methods, dilute limit small-angle scattering from multiple non-interacting populations of scatterers, the pair-distance distribution function, a unified fit, the Debye-Bueche model, the reflectivity (X-ray and neutron) using Parratt's formalism, and small-angle diffraction. There are also a number of support tools, such as a data import/export tool supporting a broad sampling of common data formats, a data modification tool, a presentation-quality graphics tool optimized for small-angle scattering data, and a neutron and X-ray scattering contrast calculator. These tools are brought together into one suite with consistent interfaces and functionality. The suite allows robust automated note recording and saving of parameters during export.
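
    As a hedged sketch of the simplest analysis such suites automate, the Guinier approximation says that at low q the scattered intensity follows I(q) ≈ I0 exp(-q² Rg² / 3), so a linear fit of ln I against q² yields the radius of gyration Rg. The data below are synthetic; Irena's own fitting machinery is of course far more general.

```python
# Hedged sketch of a Guinier analysis on synthetic small-angle scattering
# data: fit ln I = ln I0 - (Rg^2/3) q^2 by least squares and recover Rg.

import math

def guinier_fit(qs, intensities):
    """Least-squares fit of ln I vs q^2; returns (I0, Rg)."""
    xs = [q * q for q in qs]
    ys = [math.log(i) for i in intensities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return math.exp(intercept), math.sqrt(-3 * slope)

if __name__ == "__main__":
    rg_true, i0_true = 25.0, 100.0          # angstroms, arbitrary units
    qs = [0.001 * k for k in range(1, 40)]  # keep q*Rg below ~1.3
    data = [i0_true * math.exp(-(q * rg_true) ** 2 / 3) for q in qs]
    i0, rg = guinier_fit(qs, data)
    print(f"fitted I0 = {i0:.1f}, Rg = {rg:.1f} A")
```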

  12. A Repository for Beyond-the-Standard-Model Tools

    SciTech Connect

    Skands, P.; Richardson, P.; Allanach, B.C.; Baer, H.; Belanger, G.; El Kacimi, M.; Ellwanger, U.; Freitas, A.; Ghodbane, N.; Goujdami, D.; Hahn, T.; Heinemeyer, S.; Kneur, J.-L.; Landsberg, G.; Lee, J.S.; Muhlleitner, M.; Ohl, T.; Perez, E.; Peskin, M.; Pilaftsis, A.; Plehn, T.

    2005-05-01

    To aid phenomenological studies of Beyond-the-Standard-Model (BSM) physics scenarios, a web repository for BSM calculational tools has been created. We here present brief overviews of the relevant codes, ordered by topic as well as by alphabet.

  13. Research Tools Available at the Community Coordinated Modeling Center

    NASA Astrophysics Data System (ADS)

    Berrios, D. H.; Maddox, M.; Rastaetter, L.; Chulaki, A.; Hesse, M.

    2007-12-01

    The Community Coordinated Modeling Center (CCMC), located at NASA Goddard Space Flight Center, provides access to state-of-the-art space weather models to the research community. The majority of the models residing at the CCMC are comprehensive computationally intensive physics-based models. The CCMC also provides free services and tools to assist the research community in analyzing the results from the space weather model simulations. We present an overview of the available services at the CCMC: the Runs-On-Request system, the online visualizations, the Kameleon access and interpolation library, and the CCMC Space Weather Widget. Finally, we discuss the future services and tools in development.

  14. Agent Based Modeling as an Educational Tool

    NASA Astrophysics Data System (ADS)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model based curriculum in the classroom given current and anticipated core and content standards. [Figures: a simulation using California GIS data; a simulation of high school student lunch popularity using an aerial photograph over a terrain value map.]
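
    As a taste of what students can build, here is a minimal, deterministic agent-based model (an illustrative sketch of the genre, not one of the classroom simulations described above): agents on a grid pass a state to their four neighbours each step, and a global wavefront emerges from the purely local rule.

```python
# Minimal deterministic agent-based model: a state spreads from cell to
# cell via a local 4-neighbour rule, producing an emergent global wavefront.

def step(infected, n):
    new = set(infected)
    for (i, j) in infected:
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= i + di < n and 0 <= j + dj < n:
                new.add((i + di, j + dj))
    return new

def steps_to_saturation(n=10, seed_cell=(0, 0)):
    infected, t = {seed_cell}, 0
    while len(infected) < n * n:
        infected, t = step(infected, n), t + 1
    return t

if __name__ == "__main__":
    print(f"10x10 grid saturates in {steps_to_saturation()} steps")
```

    Students can predict the answer analytically (the largest Manhattan distance from the seed cell) and then check it against the simulation, which is exactly the concept-to-mathematics bridge the abstract describes.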

  15. Development of Experimental and Computational Aeroacoustic Tools for Advanced Liner Evaluation

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.; Watson, Willie R.; Nark, Douglas N.; Parrott, Tony L.; Gerhold, Carl H.; Brown, Martha C.

    2006-01-01

    Acoustic liners in aircraft engine nacelles suppress radiated noise. Therefore, as air travel increases, increasingly sophisticated tools are needed to maximize noise suppression. During the last 30 years, NASA has invested significant effort in development of experimental and computational acoustic liner evaluation tools. The Curved Duct Test Rig is a 152-mm by 381-mm curved duct that supports liner evaluation at Mach numbers up to 0.3 and source SPLs up to 140 dB, in the presence of user-selected modes. The Grazing Flow Impedance Tube is a 51-mm by 63-mm duct currently being fabricated to operate at Mach numbers up to 0.6 with source SPLs up to at least 140 dB, and will replace the existing 51-mm by 51-mm duct. Together, these test rigs allow evaluation of advanced acoustic liners over a range of conditions representative of those observed in aircraft engine nacelles. Data acquired with these test ducts are processed using three aeroacoustic propagation codes. Two are based on finite element solutions to convected Helmholtz and linearized Euler equations. The third is based on a parabolic approximation to the convected Helmholtz equation. The current status of these computational tools and their associated usage with the Langley test rigs is provided.
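
    For reference, the convected Helmholtz equation underlying two of the propagation codes can be written as follows for a uniform mean flow of Mach number M along the duct axis x, with time convention e^{iωt}. This is the standard textbook form, not necessarily the exact formulation used in the NASA codes:

```latex
(1 - M^2)\,\frac{\partial^2 p}{\partial x^2}
  + \frac{\partial^2 p}{\partial y^2}
  - 2\,i k M\,\frac{\partial p}{\partial x}
  + k^2 p = 0,
\qquad k = \frac{\omega}{c},
```

    where p is the acoustic pressure amplitude and c the speed of sound; setting M = 0 recovers the ordinary Helmholtz equation.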

  16. Modeling and analysis of advanced binary cycles

    SciTech Connect

    Gawlik, K.

    1997-12-31

    A computer model (Cycle Analysis Simulation Tool, CAST) and a methodology have been developed to perform value analysis for small, low- to moderate-temperature binary geothermal power plants. The value analysis method allows for incremental changes in the levelized electricity cost (LEC) to be determined between a baseline plant and a modified plant. Thermodynamic cycle analyses and component sizing are carried out in the model followed by economic analysis which provides LEC results. The emphasis of the present work is on evaluating the effect of mixed working fluids instead of pure fluids on the LEC of a geothermal binary plant that uses a simple Organic Rankine Cycle. Four resources were studied spanning the range of 265°F to 375°F. A variety of isobutane and propane based mixtures, in addition to pure fluids, were used as working fluids. This study shows that the use of propane mixtures at a 265°F resource can reduce the LEC by 24% when compared to a base case value that utilizes commercial isobutane as its working fluid. The cost savings drop to 6% for a 375°F resource, where an isobutane mixture is favored. Supercritical cycles were found to have the lowest cost at all resources.
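
    The value-analysis arithmetic reduces to comparing levelized electricity costs. The sketch below shows the standard LEC formula built from a capital recovery factor; all plant numbers are hypothetical placeholders, not CAST inputs or outputs:

```python
# Hedged sketch of levelized-electricity-cost (LEC) arithmetic for
# comparing a baseline and a modified plant. All numbers are hypothetical;
# CAST itself sizes and costs components in detail.

def capital_recovery_factor(rate, years):
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lec(capital, om_per_year, mwh_per_year, rate=0.08, years=30):
    """Levelized electricity cost in $/MWh."""
    return (capital * capital_recovery_factor(rate, years) + om_per_year) / mwh_per_year

if __name__ == "__main__":
    base = lec(capital=30e6, om_per_year=1.2e6, mwh_per_year=70_000)
    mod = lec(capital=31e6, om_per_year=1.2e6, mwh_per_year=85_000)
    print(f"baseline {base:.1f} $/MWh, modified {mod:.1f} $/MWh "
          f"({100 * (mod - base) / base:+.1f}%)")
```

    The incremental LEC change between the two cases is exactly the quantity the value analysis reports, e.g. a working-fluid change that raises capital cost slightly but increases generation can still lower the LEC.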

  17. Watershed modeling tools and data for prognostic and diagnostic

    NASA Astrophysics Data System (ADS)

    Chambel-Leitao, P.; Brito, D.; Neves, R.

    2009-04-01

    Trancoso, R., Braunschweig, F., Chambel-Leitão, P., Neves, R., Obermann, M. (2009) An advanced modelling tool for simulating complex river systems. Accepted for publication in Science of the Total Environment.
    Yarrow, M., Chambel-Leitão, P. (2006) Calibration of the SWAT model to the Aysén basin of the Chilean Patagonia: challenges and lessons. In: Proceedings of the Watershed Management to Meet Water Quality Standards and TMDLs (Total Maximum Daily Load), 10-14 March 2007, San Antonio, Texas, 701P0207.
    Yarrow, M., Chambel-Leitão, P. (2007) Simulating Nothofagus forests in the Chilean Patagonia: a test and analysis of tree growth and nutrient cycling in SWAT. Submitted to: Proceedings of the 4th International SWAT Conference, July 2-6, 2007.
    Yarrow, M., Chambel-Leitão, P. (2008) Estimation of loads in the Aysén Basin of the Chilean Patagonia: SWAT model and HARP-NUT guidelines. In: Perspectives on Integrated Coastal Zone Management in South America, R. Neves, J. Baretta & M. Mateus (eds.), IST Press, Lisbon, Portugal. (ISBN: 978-972-8469-74-0)

  18. Update on Small Modular Reactors Dynamics System Modeling Tool -- Molten Salt Cooled Architecture

    SciTech Connect

    Hale, Richard Edward; Cetiner, Sacit M.; Fugate, David L.; Qualls, A L.; Borum, Robert C.; Chaleff, Ethan S.; Rogerson, Doug W.; Batteh, John J.; Tiller, Michael M.

    2014-08-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the third year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled) concepts, including the use of multiple coupled reactors at a single site. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor SMR models, ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface (ICHMI) technical area, and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the Modular Dynamic SIMulation (MoDSIM) tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the program, (2) developing a library of baseline component modules that can be assembled into full plant models using existing geometry and thermal-hydraulic data, (3) defining modeling conventions for interconnecting component models, and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  19. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. Modeling and simulation (M&S) environments and infrastructure.

  20. Advanced modeling of high intensity accelerators

    SciTech Connect

    Ryne, R.D.; Habib, S.; Wangler, T.P.

    1998-11-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The goals of this project were three-fold: (1) to develop a new capability, based on high performance (parallel) computers, to perform large scale simulations of high intensity accelerators; (2) to apply this capability to modeling high intensity accelerators under design at LANL; and (3) to use this new capability to improve the understanding of the physics of intense charged-particle beams, especially in regard to the issue of beam halo formation. All of these goals were met. In particular, the authors introduced split-operator methods as a powerful and efficient means to simulate intense beams in the presence of rapidly varying accelerating and focusing fields. They then applied these methods to develop scalable, parallel beam dynamics codes for modeling intense beams in linacs, and in the process they implemented a new three-dimensional space charge algorithm. They also used the codes to study a number of beam dynamics issues related to the Accelerator Production of Tritium (APT) project, and in the process performed the largest simulations to date for any accelerator design project. Finally, they used the new modeling capability to provide direction and validation to beam physics studies, helping to identify beam mismatch as a major source of halo formation in high intensity accelerators. This LDRD project ultimately benefited not only LANL but also the US accelerator community since, by promoting expertise in high performance computing and advancing the state-of-the-art in accelerator simulation, its accomplishments helped lead to approval of a new DOE Grand Challenge in Computational Accelerator Physics.
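
    The split-operator idea can be illustrated in one dimension (this is a schematic sketch of the technique, not the LANL parallel codes): the dynamics are split into a drift map and kick maps, applied in a symmetric half-kick / drift / half-kick sequence, so an external focusing force and a space-charge-like force can each be applied as a simple update while the integration stays stable over very many steps.

```python
# Schematic 1D illustration of a split-operator (leapfrog) integrator:
# momentum half-kicks from the forces bracket a full position drift.
# The focusing and "space-charge" force terms are illustrative only.

def split_operator_step(x, p, dt, k_focus=1.0, k_sc=0.05):
    p += 0.5 * dt * (-k_focus * x + k_sc * x ** 3)  # half kick: focusing + space-charge-like force
    x += dt * p                                      # full drift
    p += 0.5 * dt * (-k_focus * x + k_sc * x ** 3)  # half kick
    return x, p

if __name__ == "__main__":
    x, p = 1.0, 0.0
    for _ in range(100_000):
        x, p = split_operator_step(x, p, dt=0.01)
    # the symplectic split-operator map keeps the oscillation bounded
    print(f"after 100k steps: x = {x:+.3f}, p = {p:+.3f}")
```

    Because each sub-map is cheap and local, this structure parallelizes naturally over particles, which is part of why split-operator methods suited the large-scale simulations described above.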

  1. Tools and Products of Real-Time Modeling: Opportunities for Space Weather Forecasting

    NASA Technical Reports Server (NTRS)

    Hesse, Michael

    2009-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity that conducts research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides researchers with access to space science models, even if they are not model owners themselves. The second CCMC activity is to support Space Weather forecasting at national Space Weather Forecasting Centers. This second activity involves model evaluations, model transitions to operations, and the development of draft Space Weather forecasting tools. This presentation will focus on the last element. Specifically, we will discuss present capabilities and the potential to derive further tools. These capabilities will be interpreted in the context of a broad-based, bootstrapping activity for modern Space Weather forecasting.

  2. Current advancements and challenges in soil-root interactions modelling

    NASA Astrophysics Data System (ADS)

    Schnepf, Andrea; Huber, Katrin; Abesha, Betiglu; Meunier, Felicien; Leitner, Daniel; Roose, Tiina; Javaux, Mathieu; Vanderborght, Jan; Vereecken, Harry

    2015-04-01

    Roots change their surrounding soil chemically, physically and biologically. This includes changes in soil moisture and solute concentration, the exudation of organic substances into the rhizosphere, increased growth of soil microorganisms, or changes in soil structure. The fate of water and solutes in the root zone is highly determined by these root-soil interactions. Mathematical models of soil-root systems in combination with non-invasive techniques able to characterize root systems are a promising tool to understand and predict the behaviour of water and solutes in the root zone. With respect to different fields of applications, predictive mathematical models can contribute to the solution of optimal control problems in plant resource efficiency. This may result in significant gains in productivity, efficiency and environmental sustainability in various land use activities. Major challenges include the coupling of model parameters of the relevant processes with the surrounding environment such as temperature, nutrient concentration or soil water content. A further challenge is the mathematical description of the different spatial and temporal scales involved. This includes in particular the branched structures formed by root systems or the external mycelium of mycorrhizal fungi. Here, reducing complexity as well as bridging between spatial scales is required. Furthermore, the combination of experimental and mathematical techniques may advance the field enormously. Here, the use of root system, soil and rhizosphere models is presented through a number of modelling case studies, including image based modelling of phosphate uptake by a root with hairs, model-based optimization of root architecture for phosphate uptake from soil, upscaling of rhizosphere models, modelling root growth in structured soil, and the effect of root hydraulic architecture on plant water uptake efficiency and drought resistance.

  3. Current Advancements and Challenges in Soil-Root Interactions Modelling

    NASA Astrophysics Data System (ADS)

    Schnepf, A.; Huber, K.; Abesha, B.; Meunier, F.; Leitner, D.; Roose, T.; Javaux, M.; Vanderborght, J.; Vereecken, H.

    2014-12-01

    Roots change their surrounding soil chemically, physically and biologically. This includes changes in soil moisture and solute concentration, the exudation of organic substances into the rhizosphere, increased growth of soil microorganisms, or changes in soil structure. The fate of water and solutes in the root zone is highly determined by these root-soil interactions. Mathematical models of soil-root systems in combination with non-invasive techniques able to characterize root systems are a promising tool to understand and predict the behaviour of water and solutes in the root zone. With respect to different fields of applications, predictive mathematical models can contribute to the solution of optimal control problems in plant resource efficiency. This may result in significant gains in productivity, efficiency and environmental sustainability in various land use activities. Major challenges include the coupling of model parameters of the relevant processes with the surrounding environment such as temperature, nutrient concentration or soil water content. A further challenge is the mathematical description of the different spatial and temporal scales involved. This includes in particular the branched structures formed by root systems or the external mycelium of mycorrhizal fungi. Here, reducing complexity as well as bridging between spatial scales is required. Furthermore, the combination of experimental and mathematical techniques may advance the field enormously. Here, the use of root system, soil and rhizosphere models is presented through a number of modelling case studies, including image based modelling of phosphate uptake by a root with hairs, model-based optimization of root architecture for phosphate uptake from soil, upscaling of rhizosphere models, modelling root growth in structured soil, and the effect of root hydraulic architecture on plant water uptake efficiency and drought resistance.
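
    One standard ingredient of such phosphate-uptake models is a Michaelis-Menten boundary condition for nutrient influx at the root surface. The symbols below are the conventional ones from the nutrient-uptake literature, not notation taken from this abstract:

```latex
F(c) \;=\; \frac{I_{\max}\, c}{K_m + c},
```

    where F is the influx per unit root surface area, c the nutrient concentration at the root surface, I_max the maximal influx, and K_m the concentration at which uptake runs at half its maximal rate. Coupling this local rule to soil transport and root architecture is what the case studies above do at the whole-root-system scale.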

  4. Rasp Tool on Phoenix Robotic Arm Model

    NASA Technical Reports Server (NTRS)

    2008-01-01

    This close-up photograph taken at the Payload Interoperability Testbed at the University of Arizona, Tucson, shows the motorized rasp protruding from the bottom of the scoop on the engineering model of NASA's Phoenix Mars Lander's Robotic Arm.

    The rasp will be placed against the hard Martian surface to cut into the hard material and acquire an icy soil sample for analysis by Phoenix's scientific instruments.

    The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is led by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  5. Tetrahymena as a Unicellular Model Eukaryote: Genetic and Genomic Tools.

    PubMed

    Ruehle, Marisa D; Orias, Eduardo; Pearson, Chad G

    2016-06-01

    Tetrahymena thermophila is a ciliate model organism whose study has led to important discoveries and insights into both conserved and divergent biological processes. In this review, we describe the tools for the use of Tetrahymena as a model eukaryote, including an overview of its life cycle, orientation to its evolutionary roots, and methodological approaches to forward and reverse genetics. Recent genomic tools have expanded Tetrahymena's utility as a genetic model system. With the unique advantages that Tetrahymena provides, we argue that it will continue to be a model organism of choice. PMID:27270699

  6. Fabric-based systems: model, tools, applications.

    SciTech Connect

    Wolinski, C.; Gokhale, M.; McCabe, K. P.

    2003-01-01

    A Fabric-Based System is a parameterized cellular architecture in which an array of computing cells communicates with an embedded processor through a global memory. This architecture is customizable to different classes of applications by functional unit, interconnect, and memory parameters, and can be instantiated efficiently on platform FPGAs. In previous work, we have demonstrated the advantage of reconfigurable fabrics for image and signal processing applications. Recently, we have built Fabric Generator (FG), a Java-based toolset that greatly accelerates construction of the fabrics presented in that work. A module-generation library is used to define, instantiate, and interconnect cells' datapaths. FG generates customized sequencers for individual cells or collections of cells. We describe the Fabric-Based System model, the FG toolset, and concrete realizations of fabric architectures generated by FG on the Altera Excalibur ARM that can deliver 4.5 GigaMACs/s (8/16-bit data, multiply-accumulate).

  7. Advanced Launch Technology Life Cycle Analysis Using the Architectural Comparison Tool (ACT)

    NASA Technical Reports Server (NTRS)

    McCleskey, Carey M.

    2015-01-01

    Life cycle technology impact comparisons for nanolauncher technology concepts were performed using an Affordability Comparison Tool (ACT) prototype. The analysis examines cost drivers and asks whether technology investments can dramatically affect life cycle characteristics. Primary among the selected applications was the prospect of improving nanolauncher systems. As a result, findings and conclusions are documented for ways of creating more productive and affordable nanolauncher systems; e.g., an Express Lane-Flex Lane concept is forwarded, and the beneficial effect of incorporating advanced integrated avionics is explored. Also, a Functional Systems Breakdown Structure (F-SBS) was developed to derive consistent definitions of the flight and ground systems for both system performance and life cycle analysis. Further, a comprehensive catalog of ground segment functions was created.

  8. A decision support tool for synchronizing technology advances with strategic mission objectives

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda S.; Willoughby, John K.

    1992-01-01

    Successful accomplishment of the objectives of many long-range future missions in areas such as space systems, land-use planning, and natural resource management requires significant technology developments. This paper describes the development of a decision-support data-derived tool called MisTec for helping strategic planners to determine technology development alternatives and to synchronize the technology development schedules with the performance schedules of future long-term missions. Special attention is given to the operations concept, design, and functional capabilities of MisTec. MisTec was initially designed for manned Mars missions but can be adapted to support other high-technology, long-range strategic planning situations, making it possible for a mission analyst, planner, or manager to describe a mission scenario, determine the technology alternatives for making the mission achievable, and plan the R&D activity necessary to achieve the required technology advances.

  9. Community-based participatory research as a tool to advance environmental health sciences.

    PubMed Central

    O'Fallon, Liam R; Dearry, Allen

    2002-01-01

    The past two decades have witnessed a rapid proliferation of community-based participatory research (CBPR) projects. CBPR methodology presents an alternative to traditional population-based biomedical research practices by encouraging active and equal partnerships between community members and academic investigators. The National Institute of Environmental Health Sciences (NIEHS), the premier biomedical research facility for environmental health, is a leader in promoting the use of CBPR in instances where community-university partnerships serve to advance our understanding of environmentally related disease. In this article, the authors highlight six key principles of CBPR and describe how these principles are met within specific NIEHS-supported research investigations. These projects demonstrate that community-based participatory research can be an effective tool to enhance our knowledge of the causes and mechanisms of disorders having an environmental etiology, reduce adverse health outcomes through innovative intervention strategies and policy change, and address the environmental health concerns of community residents. PMID:11929724

  10. Anvil Forecast Tool in the Advanced Weather Interactive Processing System, Phase II

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III

    2008-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and Spaceflight Meteorology Group have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) created a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) to indicate the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input.
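As a back-of-the-envelope sketch only (NOT the AMU tool's actual algorithm, whose details are not given here), anvil debris drifts downwind at roughly the mean upper-level wind, so a crude threat length is wind speed times an assumed anvil lifetime; both inputs below are hypothetical:

```python
# Hypothetical anvil-threat length: distance anvil material could be
# advected downwind, given a mean upper-level wind and an assumed lifetime.
def anvil_threat_length_km(mean_wind_ms, lifetime_hr):
    return mean_wind_ms * lifetime_hr * 3600.0 / 1000.0

print(anvil_threat_length_km(mean_wind_ms=20.0, lifetime_hr=2.0))  # 144.0
```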

  11. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    ERIC Educational Resources Information Center

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-01-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…

  12. Proposing "the burns suite" as a novel simulation tool for advancing the delivery of burns education.

    PubMed

    Sadideen, Hazim; Wilson, David; Moiemen, Naiem; Kneebone, Roger

    2014-01-01

    Educational theory highlights the importance of contextualized simulation for effective learning. We explored this concept in a burns scenario in a novel, low-cost, high-fidelity, portable, immersive simulation environment (referred to as distributed simulation). This contextualized simulation/distributed simulation combination was named "The Burns Suite" (TBS). A pediatric burn resuscitation scenario was selected in response to high trainee demand. It was designed on Advanced Trauma Life Support and Emergency Management of Severe Burns principles and refined using expert opinion through cognitive task analysis. TBS contained "realism" props, briefed nurses, and a simulated patient. Novices and experts were recruited. Five-point Likert-type questionnaires were developed for face and content validity. Cronbach's α was calculated for scale reliability. Semistructured interviews captured responses for qualitative thematic analysis, allowing for data triangulation. Twelve participants completed the TBS scenario. Mean face and content validity ratings were high (4.6 and 4.5, respectively; range, 4-5). The internal consistency of questions was high. Qualitative data analysis revealed that participants felt 1) the experience was "real" and they were "able to behave as if in a real resuscitation environment," and 2) TBS "addressed what Advanced Trauma and Life Support and Emergency Management of Severe Burns didn't" (including the efficacy of incorporating nontechnical skills). TBS provides a novel, effective simulation tool to significantly advance the delivery of burns education. Recreating clinical challenge is crucial to optimize simulation training. This low-cost approach also has major implications for surgical education, particularly during increasing financial austerity. Alternative scenarios and/or procedures can be recreated within TBS, providing a diverse educational immersive simulation experience. PMID:23877145
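The scale-reliability statistic the study reports, Cronbach's α, can be computed directly from per-item scores; the example scores below are made up for illustration, not study data:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(totals)).
# Population variances are used throughout. Example scores are invented.
def cronbach_alpha(items):
    """items: one list of scores per questionnaire item, aligned by respondent."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

scores = [[4, 5, 3, 5],   # item 1, four respondents
          [4, 4, 3, 5],   # item 2
          [5, 5, 3, 4]]   # item 3
print(round(cronbach_alpha(scores), 3))  # 0.838
```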

  13. Advanced modelling of the Planck-LFI radiometers

    NASA Astrophysics Data System (ADS)

    Battaglia, P.; Franceschet, C.; Zonca, A.; Bersanelli, M.; Butler, R. C.; D'Arcangelo, O.; Davis, R. J.; Galeotta, S.; Guzzi, P.; Hoyland, R.; Hughes, N.; Jukkala, P.; Kettle, D.; Laaninen, M.; Leonardi, R.; Maino, D.; Mandolesi, N.; Meinhold, P.; Mennella, A.; Platania, P.; Terenzi, L.; Tuovinen, J.; Varis, J.; Villa, F.; Wilkinson, A.

    2009-12-01

    The Low Frequency Instrument (LFI) is a radiometer array covering the 30-70 GHz spectral range on board the ESA Planck satellite, launched on May 14th, 2009 to observe the cosmic microwave background (CMB) with unprecedented precision. In this paper we describe the development and validation of a software model of the LFI pseudo-correlation receivers which makes it possible to reproduce and predict all the main system parameters of interest as measured at each of the 44 LFI detectors, including total system gain, noise temperature, band-pass response, and non-linear response. The LFI Advanced RF Model (LARFM) has been constructed using commercial software tools and data for each radiometer component as measured at the single-unit level. The LARFM has been successfully used to reproduce the LFI behavior observed during the ground-test campaign. The model is an essential element in the database of the LFI data processing center and will be available for any detailed study of radiometer behaviour during the survey.
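The system noise temperature and bandwidth that such a receiver model predicts set the instrument's sensitivity through the standard radiometer equation, which a balanced pseudo-correlation receiver approaches. A minimal sketch with illustrative numbers (not measured LFI values):

```python
import math

def radiometer_sensitivity(t_sys_k, bandwidth_hz, integration_s):
    """Ideal radiometer equation: delta_T = T_sys / sqrt(B * tau)."""
    return t_sys_k / math.sqrt(bandwidth_hz * integration_s)

# Illustrative numbers only, chosen for a ~20% bandwidth receiver near 30 GHz:
dt = radiometer_sensitivity(t_sys_k=20.0, bandwidth_hz=6e9, integration_s=1.0)
print(f"{dt * 1e6:.1f} microkelvin per 1 s sample")  # 258.2 microkelvin per 1 s sample
```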

  14. Using Enabling Technologies to Advance Data Intensive Analysis Tools in the JPL Tropical Cyclone Information System

    NASA Astrophysics Data System (ADS)

    Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.

    2014-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing it to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been working on developing data analysis tools to let the user actively interrogate areas of Level 2 swath and two-dimensional plots they see on their screen. As expected, these observational and model data are quite voluminous, and bottlenecks in the system architecture can occur when the databases try to run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how to best store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints being stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and
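The core operation behind such geospatial searches is testing whether a query point falls inside a swath footprint polygon. Real systems delegate this to database geospatial indexes, but the underlying test can be sketched in pure Python with ray casting; the footprint coordinates below are invented:

```python
def point_in_polygon(lon, lat, poly):
    """Ray-casting point-in-polygon test; poly is a list of (lon, lat)
    vertices. Adequate for small footprints away from the dateline/poles."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):           # edge crosses the query latitude
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:                  # crossing is to the east of the point
                inside = not inside
    return inside

# Invented rectangular swath footprint over the western Atlantic:
footprint = [(-80.0, 20.0), (-70.0, 20.0), (-70.0, 30.0), (-80.0, 30.0)]
print(point_in_polygon(-75.0, 25.0, footprint))  # True
print(point_in_polygon(-60.0, 25.0, footprint))  # False
```

Database engines accelerate exactly this kind of predicate with spatial indexes (e.g. R-trees), which is what the storage comparison above was probing.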

  15. Advancing swine models for human health and diseases.

    PubMed

    Walters, Eric M; Prather, Randall S

    2013-01-01

    Swine models are relatively new kids on the block for modeling human health and diseases when compared to rodents and dogs. Because of the similarity to humans in size, physiology, and genetics, the pig has made significant strides in advancing the understanding of the human condition, and is thus an excellent choice for an animal model. Recent technological advances in genetic engineering of the swine genome enhance the utility of swine as models of human genetic diseases. PMID:23829105

  16. 'Model' or 'tool'? New definitions for translational research.

    PubMed

    Sive, Hazel

    2011-03-01

    The term 'model' often describes non-human biological systems that are used to obtain a better understanding of human disorders. According to the most stringent definition, an animal 'model' would display exactly the same phenotype as seen in the relevant human disorder; however, this precise correspondence is often not present. In this Editorial, I propose the alternative, broader term 'tool' to describe a biological system that does not obviously (or precisely) recapitulate a human disorder, but that nonetheless provides useful insight into the etiology or treatment of that disorder. Applying the term 'tool' to biological systems used in disease-related studies will help to identify those systems that can most effectively address mechanisms underlying human disease. Conversely, differentiating 'models' from 'tools' will help to define more clearly the limitations of biological systems used in preclinical analyses. PMID:21357758

  17. Virtual Cell: computational tools for modeling in cell biology

    PubMed Central

    Resasco, Diana C.; Gao, Fei; Morgan, Frank; Novak, Igor L.; Schaff, James C.; Slepchenko, Boris M.

    2011-01-01

    The Virtual Cell (VCell) is a general computational framework for modeling physico-chemical and electrophysiological processes in living cells. Developed by the National Resource for Cell Analysis and Modeling at the University of Connecticut Health Center, it provides automated tools for simulating a wide range of cellular phenomena in space and time, both deterministically and stochastically. These computational tools allow one to couple electrophysiology and reaction kinetics with transport mechanisms, such as diffusion and directed transport, and map them onto spatial domains of various shapes, including irregular three-dimensional geometries derived from experimental images. In this article, we review new robust computational tools recently deployed in VCell for treating spatially resolved models. PMID:22139996
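The spatially resolved models described above couple reaction kinetics to diffusion. A toy 1D reaction-diffusion solver in that spirit (far simpler than what VCell treats, and not VCell code; all parameter values are illustrative):

```python
# Explicit finite-difference step for dc/dt = D * d2c/dx2 - k * c
# on a 1D domain with no-flux boundaries. Stable when D*dt/dx^2 <= 0.5.
def simulate(c, d_coeff, k_decay, dx, dt, steps):
    for _ in range(steps):
        new = c[:]
        for i in range(len(c)):
            left = c[i - 1] if i > 0 else c[i]             # no-flux boundary
            right = c[i + 1] if i < len(c) - 1 else c[i]   # no-flux boundary
            lap = (left - 2 * c[i] + right) / dx ** 2
            new[i] = c[i] + dt * (d_coeff * lap - k_decay * c[i])
        c = new
    return c

c0 = [0.0] * 20
c0[0] = 1.0   # localized initial concentration at one end
out = simulate(c0, d_coeff=1.0, k_decay=0.05, dx=1.0, dt=0.2, steps=200)
```

After 200 steps the pulse has spread down the domain while first-order decay removes total mass, giving the kind of concentration profile a gradient-sensing model would consume.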

  18. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 4: Manufacturing Engineering Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  19. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 6: Welding, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  20. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 10: Computer-Aided Drafting & Design, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  1. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 14: Automated Equipment Technician (CIM), of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  2. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 2: Career Development, General Education and Remediation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  3. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 5: Mold Making, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational speciality areas within the U.S. machine tool and metals-related…

  4. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 3: Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  5. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 7: Industrial Maintenance Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  6. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 13: Laser Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  7. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 8: Sheet Metal & Composites, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  8. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 12: Instrumentation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  9. Modeling and Simulation Tools: From Systems Biology to Systems Medicine.

    PubMed

    Olivier, Brett G; Swat, Maciej J; Moné, Martijn J

    2016-01-01

    Modeling is an integral component of modern biology. In this chapter we look into the role of the model, as it pertains to Systems Medicine, and the software that is required to instantiate and run it. We do this by comparing the development, implementation, and characteristics of tools that have been developed to work with two divergent methodologies: Systems Biology and Pharmacometrics. From the Systems Biology perspective we consider the concept of "Software as a Medical Device" and what this may imply for the migration of research-oriented simulation software into the domain of human health. In our second perspective, we see how in practice hundreds of computational tools already accompany drug discovery and development at every stage of the process. Standardized exchange formats are required to streamline the model exchange between tools, which would minimize translation errors and reduce the required time. With the emergence, almost 15 years ago, of the SBML standard, a large part of the domain of interest is already covered, and models can be shared and passed from software to software without recoding them. Until recently the last stage of the process, the pharmacometric analysis used in clinical studies carried out on subject populations, lacked such an exchange medium. We describe a new emerging exchange format in Pharmacometrics which covers non-linear mixed effects models, the standard statistical model type used in this area. By interfacing these two formats, the entire domain can be covered by complementary standards and, subsequently, by the corresponding tools. PMID:26677194
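To make the exchange-format idea concrete, here is a hand-written SBML-style fragment parsed with the Python standard library. The fragment is illustrative and deliberately minimal, not a complete valid SBML file; only the Level 3 Version 1 Core namespace is taken from the standard:

```python
import xml.etree.ElementTree as ET

SBML = """<sbml xmlns="http://www.sbml.org/sbml/level3/version1/core" level="3" version="1">
  <model id="decay">
    <listOfSpecies>
      <species id="A"/><species id="B"/>
    </listOfSpecies>
    <listOfReactions>
      <reaction id="A_to_B"/>
    </listOfReactions>
  </model>
</sbml>"""

# Any tool that understands the schema can recover the model structure
# without recoding it -- here just the species and reaction identifiers.
ns = {"s": "http://www.sbml.org/sbml/level3/version1/core"}
root = ET.fromstring(SBML)
species = [e.get("id") for e in root.findall(".//s:species", ns)]
reactions = [e.get("id") for e in root.findall(".//s:reaction", ns)]
print(species, reactions)  # ['A', 'B'] ['A_to_B']
```

Real tools would of course read kinetic laws, parameters, and units as well; the point is that the model travels as data, not as code.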

  10. Analytical Modelling Of Milling For Tool Design And Selection

    SciTech Connect

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-05-17

    This paper presents an efficient analytical model which makes it possible to simulate a wide range of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach of oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools.
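For orientation, a much simpler mechanistic force sketch than the paper's thermomechanical oblique-cutting model: tangential force proportional to the instantaneous chip cross-section, with chip thickness varying sinusoidally over tooth engagement. The specific cutting pressure value is an illustrative assumption for steel, not data from the paper:

```python
import math

def tangential_force(kc_mpa, feed_per_tooth_mm, axial_depth_mm, phi_deg):
    """Simplified mechanistic model: F_t = Kc * h(phi) * a_p,
    with instantaneous chip thickness h = f_z * sin(phi)."""
    h = feed_per_tooth_mm * math.sin(math.radians(phi_deg))
    return kc_mpa * h * axial_depth_mm   # N, since MPa * mm^2 = N

# Peak force at 90 degrees of engagement (illustrative Kc for steel):
print(tangential_force(kc_mpa=2000.0, feed_per_tooth_mm=0.1,
                       axial_depth_mm=5.0, phi_deg=90.0))  # 1000.0
```

The paper's model additionally accounts for helix geometry, edge position, and the thermomechanics of the shear zone, which this sketch deliberately omits.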

  11. The Advancement Value Chain: An Exploratory Model

    ERIC Educational Resources Information Center

    Leonard, Edward F., III

    2005-01-01

    Since the introduction of the value chain concept in 1985, several varying, yet virtually similar, value chains have been developed for the business enterprise. Shifting to higher education, can a value chain be found that links together the various activities of advancement so that an institution's leaders can actually look at the philanthropic…

  12. Predicting Career Advancement with Structural Equation Modelling

    ERIC Educational Resources Information Center

    Heimler, Ronald; Rosenberg, Stuart; Morote, Elsa-Sofia

    2012-01-01

    Purpose: The purpose of this paper is to use the authors' prior findings concerning basic employability skills in order to determine which skills best predict career advancement potential. Design/methodology/approach: Utilizing survey responses of human resource managers, the employability skills showing the largest relationships to career…

  13. Advanced Placement: Model Policy Components. Policy Analysis

    ERIC Educational Resources Information Center

    Zinth, Jennifer

    2016-01-01

    Advanced Placement (AP), launched in 1955 by the College Board as a program to offer gifted high school students the opportunity to complete entry-level college coursework, has since expanded to encourage a broader array of students to tackle challenging content. This Education Commission of the State's Policy Analysis identifies key components of…

  14. Microfluidic 3D cell culture: from tools to tissue models.

    PubMed

    van Duinen, Vincent; Trietsch, Sebastiaan J; Joore, Jos; Vulto, Paul; Hankemeier, Thomas

    2015-12-01

    The transition from 2D to 3D cell culture techniques is an important step in a trend towards better biomimetic tissue models. Microfluidics, which allows spatial control over fluids in micrometer-sized channels, has become a valuable tool to further increase the physiological relevance of 3D cell culture by enabling spatially controlled co-cultures, perfusion flow, and spatial control over signaling gradients. This paper reviews the most important developments in microfluidic 3D culture since 2012. Most efforts were exerted in the field of vasculature, both as a tissue on its own and as part of cancer models. We observe that the focus is shifting from tool building to the implementation of specific tissue models. The next big challenge for the field is the full validation of these models and subsequently their implementation in the drug development pipelines of the pharmaceutical industry and, ultimately, in personalized medicine applications. PMID:26094109

  15. New generation of exploration tools: interactive modeling software and microcomputers

    SciTech Connect

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing a broad range of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to the analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.
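A classic example of the kind of forward model such gravity packages evaluate interactively is the vertical gravity anomaly of a buried sphere; the body parameters below are invented for illustration:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sphere_gravity_mgal(x_m, depth_m, radius_m, drho_kgm3):
    """Vertical gravity anomaly of a buried sphere at horizontal offset x:
    g_z = G * M * z / (x^2 + z^2)^(3/2), with M = (4/3) * pi * R^3 * delta_rho.
    Returned in milligals (1 mGal = 1e-5 m/s^2)."""
    mass = 4.0 / 3.0 * math.pi * radius_m ** 3 * drho_kgm3
    gz = G * mass * depth_m / (x_m ** 2 + depth_m ** 2) ** 1.5
    return gz * 1e5

# Profile over a sphere of 100 m radius, 200 m deep, +500 kg/m^3 contrast;
# the anomaly peaks (~0.35 mGal) directly over the body and is symmetric.
profile = [round(sphere_gravity_mgal(x, 200.0, 100.0, 500.0), 3)
           for x in range(-400, 401, 200)]
print(profile)
```

Interactive packages let the user drag depth, radius, and density contrast and watch the profile update against observed data, which is exactly the fine-tuning workflow described above.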

  16. Modeling of cumulative tool wear in machining metal matrix composites

    SciTech Connect

    Hung, N.P.; Tan, V.K.; Oon, B.E.

    1995-12-31

    Metal matrix composites (MMCs) are notorious for their low machinability because of the abrasive and brittle reinforcement. Although a near-net-shape product can be produced, finish machining is still required for the final shape and dimension. The classical Taylor's tool life equation that relates tool life and cutting conditions has traditionally been used to study machinability. The turning operation is commonly used to investigate the machinability of a material; tedious and costly milling experiments have to be performed separately, while a facing test is not applicable to Taylor's model since the facing speed varies as the tool moves radially. Collecting intensive machining data for MMCs is often difficult because of constraints on size, the cost of the material, and the availability of sophisticated machine tools. A more flexible model and machinability testing technique are therefore sought. This study presents and verifies new models for turning, facing, and milling operations. Different cutting conditions were utilized to assess the machinability of MMCs reinforced with silicon carbide or alumina particles. Experimental data show that tool wear does not depend on the order of different cutting speeds, since abrasion is the main wear mechanism. Correlation between data for turning, milling, and facing is presented. It is more economical to rank machinability using data for facing and then to convert the data for turning and milling, if required. Subsurface damage, such as work-hardened and cracked matrix alloy and fractured and delaminated particles, is discussed.
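The classical Taylor relation referenced above, V * T^n = C, can be solved for tool life directly; the constants C and n are empirical fits per tool/work-material pair, and the values below are illustrative, not measured MMC data:

```python
def taylor_tool_life(v_cut, c_const, n_exp):
    """Taylor's tool life equation V * T^n = C, solved for life T:
    T = (C / V) ** (1 / n). Units of T follow the units used to fit C
    (conventionally minutes, with V in m/min)."""
    return (c_const / v_cut) ** (1.0 / n_exp)

# Illustrative constants (e.g. a carbide tool on steel): C = 400, n = 0.25
print(taylor_tool_life(v_cut=100.0, c_const=400.0, n_exp=0.25))  # 256.0
```

The exponent n makes life extremely sensitive to cutting speed, which is why a facing test, where speed varies continuously along the radius, cannot be fitted to this model directly, motivating the new facing model in the study.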

  17. Advanced AEM by Comprehensive Analysis and Modeling of System Drift

    NASA Astrophysics Data System (ADS)

    Schiller, Arnulf; Klune, Klaus; Schattauer, Ingrid

    2010-05-01

    The quality of the assessment of risks from environmental hazards strongly depends on the spatial and temporal distribution of the data collected in a survey area. Natural hazards generally emerge from wide areas, as in the case of volcanoes or landslides. Conventional surface measurements are restricted to a few lines or locations and often cannot be conducted in difficult terrain, so they give only a spatially and temporally limited data set and therefore limit the reliability of risk analysis. Aero-geophysical measurements potentially provide a valuable tool for completing the data set, as they can be performed over a wide area, even above difficult terrain, within a short time. A most desirable opportunity in the course of such measurements is the ascertainment of the dynamics of such potentially hazardous environmental processes. This necessitates repeated and reproducible measurements. Current HEM systems cannot accomplish this adequately due to their system-immanent drift and, in some cases, poor signal-to-noise ratio. To develop comprehensive concepts for advancing state-of-the-art HEM systems into a valuable tool for data acquisition in risk assessment or hydrological problems, different studies have been undertaken; these form the contents of the presented work, conducted in the course of the project HIRISK (Helicopter Based Electromagnetic System for Advanced Environmental Risk Assessment - FWF L-354 N10, supported by the Austrian Science Fund). The methodology is based upon two paths: A - comprehensive experimental testing on an existing HEM system serving as an experimental platform; B - the setup of a numerical model which is continuously refined according to the results of the experimental data. The model then serves to simulate the experimental as well as alternative configurations and to analyze them with respect to their drift behavior. Finally, concepts for minimizing the drift are derived and tested.
Different test series - stationary on ground as well
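The drift problem described above can be illustrated with a minimal sketch, assuming a simple linear instrument drift and repeated zero-level calibration readings (e.g. high-altitude passes where the ground response vanishes); the numbers and correction scheme are hypothetical, not the HIRISK system's:

```python
import numpy as np

# Synthetic flight record: a constant true ground response plus a slowly
# varying instrument offset (drift). The drift is estimated from zero-level
# calibration readings and removed by linear interpolation in time.
t = np.linspace(0.0, 3600.0, 200)              # flight time, s
signal = np.full_like(t, 50.0)                 # true response (arbitrary units)
drift = 0.01 * t                               # synthetic linear drift
measured = signal + drift

t_cal = np.array([0.0, 1800.0, 3600.0])        # calibration pass times
zero_readings = 0.01 * t_cal                   # offset observed at altitude

corrected = measured - np.interp(t, t_cal, zero_readings)
print(np.max(np.abs(corrected - signal)))      # residual drift after correction
```

For a perfectly linear drift the interpolated correction is exact; real HEM drift is nonlinear, which is why the abstract calls for repeated, reproducible measurements and model-based drift analysis.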

  18. Development, Implementation and Application of Micromechanical Analysis Tools for Advanced High Temperature Composites

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This document contains the final report to the NASA Glenn Research Center (GRC) for the research project entitled Development, Implementation, and Application of Micromechanical Analysis Tools for Advanced High-Temperature Composites. The research supporting this initiative has been conducted by Dr. Brett A. Bednarcyk, a Senior Scientist at OM in Brookpark, Ohio from the period of August 1998 to March 2005. Most of the work summarized herein involved development, implementation, and application of enhancements and new capabilities for NASA GRC's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package. When the project began, this software was at a low TRL (3-4) and at release version 2.0. Due to this project, the TRL of MAC/GMC has been raised to 7 and two new versions (3.0 and 4.0) have been released. The most important accomplishments with respect to MAC/GMC are: (1) A multi-scale framework has been built around the software, enabling coupled design and analysis from the global structure scale down to the micro fiber-matrix scale; (2) The software has been expanded to analyze smart materials; (3) State-of-the-art micromechanics theories have been implemented and validated within the code; (4) The damage, failure, and lifing capabilities of the code have been expanded from a very limited state to a vast degree of functionality and utility; and (5) The user flexibility of the code has been significantly enhanced. MAC/GMC is now the premier code for design and analysis of advanced composite and smart materials. It is a candidate for the 2005 NASA Software of the Year Award. The work completed over the course of the project is summarized below on a year by year basis. All publications resulting from the project are listed at the end of this report.

  19. Revel8or: Model Driven Capacity Planning Tool Suite

    SciTech Connect

    Zhu, Liming; Liu, Yan; Bui, Ngoc B.; Gorton, Ian

    2007-05-31

    Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8or. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.

  20. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING

    EPA Science Inventory

    The overall goal of the EPA-ORD NERL research program on Computational Toxicology (CompTox) is to provide the Agency with the tools of modern chemistry, biology, and computing to improve quantitative risk assessments and reduce uncertainties in the source-to-adverse outcome conti...

  1. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Creech, Dennis M.; Garcia, Jessica; Threet, Grady E., Jr.; Phillips, Alan

    2012-01-01

    The Earth-to-Orbit Team (ETO) of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the pre-eminent go-to group for pre-phase A and phase A concept definition. Over the past several years the ETO team has evaluated thousands of launch vehicle concept variations for a significant number of studies including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Augustine Report, Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). The ACO ETO Team is called upon to address many needs in NASA's design community; some of these are defining extremely large trade-spaces, evaluating advanced technology concepts which have not been addressed by a large majority of the aerospace community, and the rapid turn-around of highly time critical actions. It is the time critical actions, those often limited by schedule or little advanced warning, that have forced the five member ETO team to develop a design process robust enough to handle their current output level in order to meet their customer's needs. Based on the number of vehicle concepts evaluated over the past year this output level averages to four completed vehicle concepts per day. Each of these completed vehicle concepts includes a full mass breakdown of the vehicle to a tertiary level of subsystem components and a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. A structural analysis of the vehicle to determine flight loads based on the trajectory output, material properties, and geometry of the concept is also performed. Due to working in this fast-paced and sometimes rapidly changing environment, the ETO Team has developed a finely tuned process to maximize their delivery capabilities. The objective of this paper is to describe the interfaces

  2. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Garcia, Jessica; Threet, Grady E., Jr.; Phillips, Alan

    2013-01-01

    The Earth-to-Orbit Team (ETO) of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the pre-eminent "go-to" group for pre-phase A and phase A concept definition. Over the past several years the ETO team has evaluated thousands of launch vehicle concept variations for a significant number of studies including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Augustine Report, Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). The ACO ETO Team is called upon to address many needs in NASA's design community; some of these are defining extremely large trade-spaces, evaluating advanced technology concepts which have not been addressed by a large majority of the aerospace community, and the rapid turn-around of highly time critical actions. It is the time critical actions, those often limited by schedule or little advanced warning, that have forced the five member ETO team to develop a design process robust enough to handle their current output level in order to meet their customer's needs. Based on the number of vehicle concepts evaluated over the past year this output level averages to four completed vehicle concepts per day. Each of these completed vehicle concepts includes a full mass breakdown of the vehicle to a tertiary level of subsystem components and a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. A structural analysis of the vehicle to determine flight loads based on the trajectory output, material properties, and geometry of the concept is also performed. Due to working in this fast-paced and sometimes rapidly changing environment, the ETO Team has developed a finely tuned process to maximize their delivery capabilities. The objective of this paper is to describe the interfaces

  3. Development and application of modeling tools for sodium fast reactor inspection

    SciTech Connect

    Le Bourdais, Florian; Marchand, Benoît; Baronian, Vahan

    2014-02-18

    To support the development of in-service inspection methods for the Advanced Sodium Test Reactor for Industrial Demonstration (ASTRID) project led by the French Atomic Energy Commission (CEA), several tools that allow situations specific to Sodium cooled Fast Reactors (SFR) to be modeled have been implemented in the CIVA software and exploited. This paper details specific applications and results obtained. For instance, a new specular reflection model allows the calculation of complex echoes from scattering structures inside the reactor vessel. EMAT transducer simulation models have been implemented to develop new transducers for sodium visualization and imaging. Guided wave analysis tools have been developed to permit defect detection in the vessel shell. Application examples and comparisons with experimental data are presented.

  4. Development and application of modeling tools for sodium fast reactor inspection

    NASA Astrophysics Data System (ADS)

    Le Bourdais, Florian; Marchand, Benoît; Baronian, Vahan

    2014-02-01

    To support the development of in-service inspection methods for the Advanced Sodium Test Reactor for Industrial Demonstration (ASTRID) project led by the French Atomic Energy Commission (CEA), several tools that allow situations specific to Sodium cooled Fast Reactors (SFR) to be modeled have been implemented in the CIVA software and exploited. This paper details specific applications and results obtained. For instance, a new specular reflection model allows the calculation of complex echoes from scattering structures inside the reactor vessel. EMAT transducer simulation models have been implemented to develop new transducers for sodium visualization and imaging. Guided wave analysis tools have been developed to permit defect detection in the vessel shell. Application examples and comparisons with experimental data are presented.

  5. Cloud-Based Tools to Support High-Resolution Modeling (Invited)

    NASA Astrophysics Data System (ADS)

    Jones, N.; Nelson, J.; Swain, N.; Christensen, S.

    2013-12-01

    The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HTCondor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HTCondor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archiving, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482.

  6. Accessing Curriculum Through Technology Tools (ACTTT): A Model Development Project

    ERIC Educational Resources Information Center

    Daytner, Katrina M.; Johanson, Joyce; Clark, Letha; Robinson, Linda

    2012-01-01

    Accessing Curriculum Through Technology Tools (ACTTT), a project funded by the U.S. Office of Special Education Programs (OSEP), developed and tested a model designed to allow children in early elementary school, including those "at risk" and with disabilities, to better access, participate in, and benefit from the general curriculum. Children in…

  7. Models and Methods: The Tools of Library Networking

    ERIC Educational Resources Information Center

    Parker, Thomas F.

    1975-01-01

    Several models and methods applicable to university library networks are discussed as tools to increase understanding of cause and effect relationships in these complex organizations. Collection of data by unobtrusive measures and display of information by graphic techniques are briefly described. (Author)

  8. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on optimization have been successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulations for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.

  9. Advances in the genetic dissection of plant cell walls: tools and resources available in Miscanthus

    PubMed Central

    Slavov, Gancho; Allison, Gordon; Bosch, Maurice

    2013-01-01

    Tropical C4 grasses from the genus Miscanthus are believed to have great potential as biomass crops. However, Miscanthus species are essentially undomesticated, and genetic, molecular and bioinformatics tools are in very early stages of development. Furthermore, similar to other crops targeted as lignocellulosic feedstocks, the efficient utilization of biomass is hampered by our limited knowledge of the structural organization of the plant cell wall and the underlying genetic components that control this organization. The Institute of Biological, Environmental and Rural Sciences (IBERS) has assembled an extensive collection of germplasm for several species of Miscanthus. In addition, an integrated, multidisciplinary research programme at IBERS aims to inform accelerated breeding for biomass productivity and composition, while also generating fundamental knowledge. Here we review recent advances with respect to the genetic characterization of the cell wall in Miscanthus. First, we present a summary of recent and on-going biochemical studies, including prospects and limitations for the development of powerful phenotyping approaches. Second, we review current knowledge about genetic variation for cell wall characteristics of Miscanthus and illustrate how phenotypic data, combined with high-density arrays of single-nucleotide polymorphisms, are being used in genome-wide association studies to generate testable hypotheses and guide biological discovery. Finally, we provide an overview of the current knowledge about the molecular biology of cell wall biosynthesis in Miscanthus and closely related grasses, discuss the key conceptual and technological bottlenecks, and outline the short-term prospects for progress in this field. PMID:23847628

  10. Ares First Stage "Systemology" - Combining Advanced Systems Engineering and Planning Tools to Assure Mission Success

    NASA Technical Reports Server (NTRS)

    Seiler, James; Brasfield, Fred; Cannon, Scott

    2008-01-01

    Ares is an integral part of NASA's Constellation architecture that will provide crew and cargo access to the International Space Station as well as low earth orbit support for lunar missions. Ares replaces the Space Shuttle in the post 2010 time frame. Ares I is an in-line, two-stage rocket topped by the Orion Crew Exploration Vehicle, its service module, and a launch abort system. The Ares I first stage is a single, five-segment reusable solid rocket booster derived from the Space Shuttle Program's reusable solid rocket motor. The Ares second or upper stage is propelled by a J-2X main engine fueled with liquid oxygen and liquid hydrogen. This paper describes the advanced systems engineering and planning tools being utilized for the design, test, and qualification of the Ares I first stage element. Included are descriptions of the current first stage design, the milestone schedule requirements, and the marriage of systems engineering, detailed planning efforts, and roadmapping employed to achieve these goals.

  11. Bioassays as a tool for evaluating advanced oxidation processes in water and wastewater treatment.

    PubMed

    Rizzo, Luigi

    2011-10-01

    Advanced oxidation processes (AOPs) have been widely used in water and wastewater treatment for the removal of organic and inorganic contaminants as well as to improve the biodegradability of industrial wastewater. Unfortunately, the partial oxidation of organic contaminants may result in the formation of intermediates more toxic than the parent compounds. In order to avoid this drawback, AOPs are expected to be carefully operated and monitored, and toxicity tests have been used to evaluate whether effluent detoxification takes place. In the present work, the effect of AOPs on the toxicity of aqueous solutions of different classes of contaminants as well as actual aqueous matrices is critically reviewed. The dualism of toxicity versus biodegradability when AOPs are used as a pre-treatment step to improve industrial wastewater biodegradability is also discussed. The main conclusions/remarks include the following: (i) bioassays are a very useful tool to evaluate the dangerousness of AOPs as well as to set up the proper operative conditions, (ii) target organisms for bioassays should be chosen according to the final use of the treated water matrix, (iii) acute toxicity tests may not be suitable to evaluate toxicity in the presence of low/realistic concentrations of target contaminants, so studies on chronic effects should be further developed, (iv) some toxicity tests may not be useful to evaluate biodegradability potential; in this case more suitable tests should be applied (e.g., activated sludge bioassays, respirometry). PMID:21722938

  12. How Project Management Tools Aid in Association to Advance Collegiate Schools of Business (AACSB) International Maintenance of Accreditation

    ERIC Educational Resources Information Center

    Cann, Cynthia W.; Brumagim, Alan L.

    2008-01-01

    The authors present the case of one business college's use of project management techniques as tools for accomplishing Association to Advance Collegiate Schools of Business (AACSB) International maintenance of accreditation. Using these techniques provides an efficient and effective method of organizing maintenance efforts. In addition, using…

  13. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2014-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio and number of control surfaces. A doublet lattice approach is taken to compute generalized forces. A rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. However, all parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification and validation. This process is carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool. Therefore, the flutter speed and frequency for a clamped plate are computed using V-g and V-f analysis. The computational results are compared to a previously published computational analysis and wind tunnel results for the same structure. Finally, a case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to V-g and V-f analysis. This also includes the analysis of the model in response to a 1-cos gust.
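The rational function approximation step mentioned above can be sketched in a few lines. This is not the paper's code: it fits a Roger-form approximation to hypothetical scalar generalized-force samples Q(ik) at a set of reduced frequencies k, with lag roots fixed a priori and real coefficients found by linear least squares:

```python
import numpy as np

# Synthetic generalized-force samples generated from a known Roger-form model,
# so the least-squares fit should recover them to machine precision.
k = np.linspace(0.01, 1.0, 20)                 # reduced frequencies
s = 1j * k
Q = 1.0 + 0.5 * s - 0.2 * s**2 + 0.8 * s / (s + 0.2) + 0.3 * s / (s + 0.6)

b = np.array([0.2, 0.6])                       # assumed (fixed) lag roots
cols = [np.ones_like(s), s, s**2] + [s / (s + bj) for bj in b]
A = np.column_stack(cols)

# Coefficients are real, so stack real and imaginary parts into one real system.
M = np.vstack([A.real, A.imag])
rhs = np.concatenate([Q.real, Q.imag])
coef, *_ = np.linalg.lstsq(M, rhs, rcond=None)

Qfit = A @ coef
print(np.max(np.abs(Qfit - Q)))                # fit residual
```

In the full tool the same fit is applied matrix-wise to the doublet-lattice generalized forces, and the resulting polynomial and lag terms become the aerodynamic states of the state space model.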

  14. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2015-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. These processes are carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.

  15. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.

  16. Greenhouse gases from wastewater treatment - A review of modelling tools.

    PubMed

    Mannina, Giorgio; Ekama, George; Caniani, Donatella; Cosenza, Alida; Esposito, Giovanni; Gori, Riccardo; Garrido-Baserba, Manel; Rosso, Diego; Olsson, Gustaf

    2016-05-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state-of-the-art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge of the processes related to N2O formation, especially due to autotrophic biomass, is still incomplete. The literature review also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs. Indeed, several studies have confirmed that a plant-wide view of WWTPs has to be taken in order to make them as sustainable as possible. Mechanistic dynamic models were demonstrated to be the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs due to the huge difficulties related to data availability and model complexity. For further improvement in GHG plant-wide modelling and to favour its use at full scale, knowledge of the mechanisms involved in GHG formation and release, and data acquisition, must be enhanced. PMID:26878638

  17. Ground-water models as a management tool in Florida

    USGS Publications Warehouse

    Hutchinson, C.B.

    1984-01-01

    Highly sophisticated computer models provide powerful tools for analyzing historic data and for simulating future water levels, water movement, and water chemistry under stressed conditions throughout the ground-water system in Florida. Models that simulate the movement of heat and subsidence of land in response to aquifer pumping also have potential for application to hydrologic problems in the State. Florida, with 20 ground-water modeling studies reported since 1972, has applied computer modeling techniques to a variety of water-resources problems. Models in Florida generally have been used to provide insight to problems of water supply, contamination, and impact on the environment. The model applications range from site-specific studies, such as estimating contamination by wastewater injection at St. Petersburg, to a regional model of the entire State that may be used to assess broad-scale environmental impact of water-resources development. Recently, groundwater models have been used as management tools by the State regulatory authority to permit or deny development of water resources. As modeling precision, knowledge, and confidence increase, the use of ground-water models will shift more and more toward regulation of development and enforcement of environmental laws. (USGS)

  18. AgMIP Training in Multiple Crop Models and Tools

    NASA Technical Reports Server (NTRS)

    Boote, Kenneth J.; Porter, Cheryl H.; Hargreaves, John; Hoogenboom, Gerrit; Thornburn, Peter; Mutter, Carolyn

    2015-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) has the goal of using multiple crop models to evaluate climate impacts on agricultural production and food security in developed and developing countries. There are several major limitations that must be overcome to achieve this goal, including the need to train AgMIP regional research team (RRT) crop modelers to use models other than the ones they are currently familiar with, plus the need to harmonize and interconvert the disparate input file formats used for the various models. Two activities were followed to address these shortcomings among AgMIP RRTs to enable them to use multiple models to evaluate climate impacts on crop production and food security. We designed and conducted courses in which participants trained on two different sets of crop models, with emphasis on the model of least experience. In a second activity, the AgMIP IT group created templates for inputting data on soils, management, weather, and crops into AgMIP harmonized databases, and developed translation tools for converting the harmonized data into files that are ready for multiple crop model simulations. The strategies for creating and conducting the multi-model course and developing entry and translation tools are reviewed in this chapter.
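The harmonize-then-translate pattern described above can be sketched minimally. The field names and the two output layouts below are purely illustrative, not AgMIP's actual formats: one harmonized weather record is written out in two hypothetical model-specific layouts, which is the essence of such translation tools:

```python
# A harmonized daily weather record (field names are hypothetical)
record = {"date": "1995-06-01", "srad": 22.1, "tmax": 31.5, "tmin": 18.2, "rain": 0.0}

def to_model_a(r):
    """Fixed-width layout: compact date, then srad/tmax/tmin/rain columns."""
    return (f"{r['date'].replace('-', '')} {r['srad']:6.1f} "
            f"{r['tmax']:6.1f} {r['tmin']:6.1f} {r['rain']:6.1f}")

def to_model_b(r):
    """Comma-separated layout with a different field order."""
    return ",".join(str(r[k]) for k in ("date", "tmax", "tmin", "rain", "srad"))

print(to_model_a(record))
print(to_model_b(record))
```

Writing translators against one harmonized schema means each new crop model format costs one function, rather than one converter per pair of formats.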

  19. Translational research to develop a human PBPK models tool kit-volatile organic compounds (VOCs).

    PubMed

    Mumtaz, M Moiz; Ray, Meredith; Crowell, Susan R; Keys, Deborah; Fisher, Jeffrey; Ruiz, Patricia

    2012-01-01

    Toxicity and exposure evaluations remain two of the key components of human health assessment. While improvement in exposure assessment relies on a better understanding of human behavior patterns, toxicity assessment still relies to a great extent on animal toxicity testing and human epidemiological studies. Recent advances in computer modeling of the dose-response relationship and the distribution of xenobiotics to important target tissues in humans have advanced our abilities to assess toxicity. In particular, physiologically based pharmacokinetic (PBPK) models are among the tools that can enhance toxicity assessment accuracy. Many PBPK models are available to the health assessor, but most are so difficult to use that health assessors rarely use them. To encourage their use, these models need to have transparent and user-friendly formats. To this end the Agency for Toxic Substances and Disease Registry (ATSDR) is using translational research to increase PBPK model accessibility, understandability, and use in the site-specific health assessment arena. The agency has initiated development of a human PBPK tool kit for certain high priority pollutants. The tool kit comprises a series of suitable models. The models are recoded in a single computer simulation language and evaluated for use by health assessors. While not necessarily being state-of-the-art code for each chemical, the models will be sufficiently accurate to use for screening purposes. This article presents a generic, seven-compartment PBPK model for six priority volatile organic compounds (VOCs): benzene (BEN), carbon tetrachloride (CCl(4)), dichloromethane (DCM), perchloroethylene (PCE), trichloroethylene (TCE), and vinyl chloride (VC). Limited comparisons of the generic and original model predictions to published kinetic data were conducted. A goodness of fit was determined by calculating the means of the sum of the squared differences (MSSDs) for simulation vs. experimental kinetic data using the
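The MSSD goodness-of-fit measure mentioned above is simple to compute. The sketch below uses entirely hypothetical concentration-time data and a one-compartment surrogate in place of the seven-compartment PBPK model:

```python
import numpy as np

# Hypothetical experimental concentration-time data
t_exp = np.array([0.5, 1.0, 2.0, 4.0, 8.0])      # time, h
c_exp = np.array([4.8, 3.9, 2.6, 1.2, 0.3])      # concentration, mg/L

# Surrogate model prediction at the same time points
# (a one-compartment exponential decay, standing in for the PBPK model)
c_sim = 6.0 * np.exp(-0.35 * t_exp)

# Mean of the sum of squared differences between simulation and experiment
mssd = np.mean((c_sim - c_exp) ** 2)
print(mssd)
```

A smaller MSSD indicates closer agreement between simulated and experimental kinetics; comparing MSSDs across candidate models (e.g. generic vs. original) gives a simple screening-level ranking.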

  20. Continued development of modeling tools and theory for RF heating

    SciTech Connect

    1998-12-01

    Mission Research Corporation (MRC) is pleased to present the Department of Energy (DOE) with its renewal proposal to the Continued Development of Modeling Tools and Theory for RF Heating program. The objective of the program is to continue and extend the earlier work done by the proposed principal investigator in the field of modeling radio frequency (RF) heating experiments in large tokamak fusion devices, particularly the Tokamak Fusion Test Reactor (TFTR) located at Princeton Plasma Physics Laboratory (PPPL). An integral part of this work is the investigation and, in some cases, resolution of theoretical issues which pertain to accurate modeling. MRC is nearing the successful completion of the specified tasks of the Continued Development of Modeling Tools and Theory for RF Heating project. The following tasks are either completed or nearing completion: (1) anisotropic temperature and rotation upgrades; (2) modeling for relativistic ECRH; (3) further documentation of SHOOT and SPRUCE. As a result of the progress achieved under this project, MRC has been urged to continue this effort. Specifically, during the performance of this project two topics were identified by PPPL personnel as new applications of the existing RF modeling tools. These two topics concern (a) future fast-wave current drive experiments on the large tokamaks including TFTR and (b) the interpretation of existing and future RF probe data from TFTR. Addressing each of these topics requires some modification or enhancement of the existing modeling tools, and the first topic requires resolution of certain theoretical issues to produce self-consistent results. This work falls within the scope of the original project and is more suited to the project's renewal than to the initiation of a new project.

  1. Experiences & Tools from Modeling Instruction Applied to Earth Sciences

    NASA Astrophysics Data System (ADS)

    Cervenec, J.; Landis, C. E.

    2012-12-01

    The Framework for K-12 Science Education calls for stronger curricular connections within the sciences, greater depth in understanding, and tasks higher on Bloom's Taxonomy. Understanding atmospheric sciences draws on core knowledge traditionally taught in physics, chemistry, and in some cases, biology. If this core knowledge is not conceptually sound, well retained, and transferable to new settings, understanding the causes and consequences of climate change becomes, for a student, a task of memorizing seemingly disparate facts. Fortunately, experiences and conceptual tools have been developed and refined in the nationwide network of Physics Modeling and Chemistry Modeling teachers to build the necessary understanding of conservation of mass, conservation of energy, the particulate nature of matter, kinetic molecular theory, and the particle model of light. Context-rich experiences are first introduced for students to construct an understanding of these principles, and conceptual tools are then deployed for students to resolve misconceptions and deepen their understanding. Using these experiences and conceptual tools takes an investment of instructional time, teacher training, and in some cases, re-envisioning the format of a science classroom. There are few financial barriers to implementation, and students gain a greater understanding of the nature of science by going through successive cycles of investigation and refinement of their thinking. This presentation shows how these experiences and tools could be used in an Earth Science course to support students in developing a conceptually rich understanding of the atmosphere and the connections within it.

  2. Thermal Model Predictions of Advanced Stirling Radioisotope Generator Performance

    NASA Technical Reports Server (NTRS)

    Wang, Xiao-Yen J.; Fabanich, William Anthony; Schmitz, Paul C.

    2014-01-01

    This presentation describes the capabilities of a three-dimensional thermal power model of the Advanced Stirling Radioisotope Generator (ASRG). The performance of the ASRG is presented for different scenarios, such as a Venus flyby with or without the auxiliary cooling system.

  3. A Short Review of Ablative-Material Response Models and Simulation Tools

    NASA Technical Reports Server (NTRS)

    Lachaud, Jean; Magin, Thierry E.; Cozmuta, Ioana; Mansour, Nagi N.

    2011-01-01

    A review is presented of the governing equations and boundary conditions used to model the response of ablative materials subjected to a high-enthalpy flow. The heritage of model-development efforts undertaken in the 1960s is extremely clear: the bases of the models used in the community are mathematically equivalent. Most of the material-response codes implement a single model in which the equation parameters may be modified to model different materials or conditions. The level of fidelity of the models implemented in design tools varies only slightly. Research and development codes are generally more advanced but often not as robust. The capabilities of each of these codes are summarized in a color-coded table along with research and development efforts currently in progress.

  4. Modelling of tunnelling processes and rock cutting tool wear with the particle finite element method

    NASA Astrophysics Data System (ADS)

    Carbonell, Josep Maria; Oñate, Eugenio; Suárez, Benjamín

    2013-09-01

    Underground construction involves all sorts of challenges in the analysis, design, project and execution phases. The dimensions of tunnels and their structural requirements are growing, and so are safety and security demands. New engineering tools are needed for safer planning and design. This work presents advances in the particle finite element method (PFEM) for the modelling and analysis of tunnelling processes, including the wear of the cutting tools. The PFEM is founded on the Lagrangian description of the motion of a continuum built from a set of particles with known physical properties. The method uses a remeshing process combined with the alpha-shape technique to detect the contacting surfaces, and a finite element method for the mechanical computations. A contact procedure has been developed for the PFEM which is combined with a constitutive model for predicting the excavation front and the wear of cutting tools. The material parameters govern the coupling of frictional contact and wear between the interacting domains at the excavation front. The PFEM allows prediction of several parameters relevant for estimating the performance of a tunnel boring machine, such as wear of the cutting tools, the pressure distribution on the face of the boring machine, and the vibrations produced in the machinery and the adjacent soil/rock. The final aim is to help in the design of the excavating tools and in the planning of tunnelling operations. The applications presented show that the PFEM is a promising technique for the analysis of tunnelling problems.

  5. A communication tool to improve the patient journey modeling process.

    PubMed

    Curry, Joanne; McGregor, Carolyn; Tracy, Sally

    2006-01-01

    Quality improvement is high on the agenda of Health Care Organisations (HCO) worldwide. Patient journey modeling is a relatively recent innovation in healthcare quality improvement that models the patient's movement through the HCO from a patient-centric perspective. Critical to the success of redesigning the care process is the involvement of all stakeholders and their commitment to actively participate in the process. Tools which promote this type of communication are a critical enabler that can significantly affect the overall process redesign outcomes. Such a tool must also be able to incorporate additional factors such as relevant policies and procedures, staff roles, system usage and measurements such as process time and cost. This paper presents a graphically based communication tool that can be used as part of the patient journey modeling process to promote stakeholder involvement, commitment and ownership, as well as highlighting the relationship of other relevant variables that contribute to the patient's journey. Examples of how the tool has been used and the framework employed are demonstrated via a midwife-led primary care case study. A key contribution of this research is the provision of a graphical communication framework that is simple to use, is easily understood by a diverse range of stakeholders, and enables ready recognition of patient journey issues. Results include strong stakeholder buy-in and significant enhancement to the overall design of the future patient journey. Initial results indicate that the use of such a communication tool can improve the patient journey modeling process and the overall quality improvement outcomes. PMID:17945852

  6. Advances in modelling of condensation phenomena

    SciTech Connect

    Liu, W.S.; Zaltsgendler, E.; Hanna, B.

    1997-07-01

    The physical parameters in the modelling of condensation phenomena in the CANDU reactor system codes are discussed. The experimental programs used for thermal-hydraulic code validation in the Canadian nuclear industry are briefly described. The modelling of vapour generation and in particular condensation plays a key role in modelling of postulated reactor transients. The condensation models adopted in the current state-of-the-art two-fluid CANDU reactor thermal-hydraulic system codes (CATHENA and TUF) are described. As examples of the modelling challenges faced, the simulation of a cold water injection experiment by CATHENA and the simulation of a condensation induced water hammer experiment by TUF are described.

  7. Process models: analytical tools for managing industrial energy systems

    SciTech Connect

    Howe, S O; Pilati, D A; Balzer, C; Sparrow, F T

    1980-01-01

    How the process models developed at BNL are used to analyze industrial energy systems is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for managing industrial energy systems.

  8. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    SciTech Connect

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  9. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGESBeta

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  10. Advancing complementary and alternative medicine through social network analysis and agent-based modeling.

    PubMed

    Frantz, Terrill L

    2012-01-01

    This paper introduces the contemporary perspectives and techniques of social network analysis (SNA) and agent-based modeling (ABM) and advocates applying them to advance various aspects of complementary and alternative medicine (CAM). SNA and ABM are invaluable methods for representing, analyzing and projecting complex, relational, social phenomena; they provide both an insightful vantage point and a set of analytic tools that can be useful in a wide range of contexts. Applying these methods in the CAM context can aid the ongoing advances in the CAM field, in both its scientific aspects and in developing broader acceptance in associated stakeholder communities. PMID:22327550

  11. Development of tools for safety analysis of control software in advanced reactors

    SciTech Connect

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  12. Petri Nets as Modeling Tool for Emergent Agents

    NASA Technical Reports Server (NTRS)

    Bergman, Marto

    2004-01-01

    Emergent agents, those agents whose local interactions can cause unexpected global results, require a method of modeling that is both dynamic and structured. Petri Nets, a modeling tool developed for dynamic discrete-event systems of mainly functional agents, provide this and have the benefit of being an established tool. We present the details of the modeling method and discuss how to implement its use for modeling agent-based systems. Petri Nets have been used extensively in the modeling of functional agents, those agents who have defined purposes and whose actions should result in a known outcome. However, emergent agents, those agents who have a defined structure but whose interactions cause outcomes that are unpredictable, have not yet found a modeling style that suits them. A problem with formally modeling emergent agents is that a formal modeling style is usually expected to show the results of a problem, and the results of problems studied using emergent agents are not apparent from the initial construction. However, the study of emergent agents still requires a method to analyze the agents themselves and to have sensible conversations about the differences and similarities between types of emergent agents. We attempt to correct this problem by applying Petri Nets to the characterization of emergent agents. In doing so, the emergent properties of these agents can be highlighted, and conversation about the nature and compatibility of the differing methods of agent creation can begin.
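    The token-game semantics that Petri Nets rest on can be sketched in a few lines: places hold tokens, and a transition fires only when every input place is marked, consuming input tokens and producing output tokens. The class and the two-agent resource example below use hypothetical names for illustration; they are not the authors' agent models.

```python
class PetriNet:
    """Minimal place/transition net with single-token arc weights (a sketch)."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # transition name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (list(inputs), list(outputs))

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:                                # consume one token per input arc
            self.marking[p] -= 1
        for p in outputs:                               # produce one token per output arc
            self.marking[p] = self.marking.get(p, 0) + 1

# Two agents contending for one shared resource: firing one acquire
# transition disables the other, making the interaction explicit.
net = PetriNet({"idle_a": 1, "idle_b": 1, "resource": 1})
net.add_transition("acquire_a", ["idle_a", "resource"], ["busy_a"])
net.add_transition("acquire_b", ["idle_b", "resource"], ["busy_b"])
net.fire("acquire_a")
assert not net.enabled("acquire_b")  # resource token consumed; B must wait
```

Global behavior such as the mutual exclusion above is not stated anywhere in the net's structure; it emerges from local firing rules, which is the property the paper exploits for characterizing emergent agents.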

  13. Modeling a nonlinear water transfer between two reservoirs in a midterm hydroelectric scheduling tool

    NASA Astrophysics Data System (ADS)

    Moraga, RocíO.; GarcíA-GonzáLez, Javier; Parrilla, Ernesto; Nogales, Sergio

    2007-04-01

    In a competitive environment, operation and planning decisions of generating units are decentralized. Therefore the management of hydroelectric generation resources requires the development of advanced planning and scheduling tools adapted to the particular needs of each company. This paper presents a method for considering natural water transfers through a pipeline in the context of a midterm hydro scheduling model. The main complexity of gravitational transfer modeling resides in considering the nonlinear relation between the water levels in the connected reservoirs and the transfer flow. The methodology proposed consists first in simplifying the problem by means of a change of variables, subsequently using a piecewise linear approximation of the transfer flow equation in order to consider it within a mixed integer linear programming tool, and ultimately adjusting the final solution. The proposed methodology is currently being used to manage the Sil River hydro basin in the northwest of Spain, with satisfactory results, as shown in the case study.
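    The piecewise linear approximation step described above can be illustrated in miniature: sample the nonlinear transfer-flow relation at a set of breakpoints and interpolate linearly between them, so the relation can be represented inside a mixed integer linear program. The square-root flow relation and breakpoints below are assumptions for illustration, not the actual model used for the Sil River basin.

```python
import bisect
import math

def pwl_approx(f, breakpoints):
    """Return a piecewise linear interpolant of f over sorted breakpoints."""
    xs = sorted(breakpoints)
    ys = [f(x) for x in xs]

    def g(x):
        if not xs[0] <= x <= xs[-1]:
            raise ValueError("x outside approximation range")
        # Locate the segment containing x and interpolate linearly within it.
        i = min(max(1, bisect.bisect_left(xs, x)), len(xs) - 1)
        t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
        return ys[i - 1] + t * (ys[i] - ys[i - 1])

    return g

# Illustrative nonlinear relation: transfer flow growing with the square
# root of the head difference (a hypothetical free-discharge form).
flow = lambda head: 2.5 * math.sqrt(head)
approx = pwl_approx(flow, [0, 1, 4, 9, 16])

assert abs(approx(4) - flow(4)) < 1e-12   # exact at a breakpoint
assert abs(approx(6) - flow(6)) < 0.2     # close between breakpoints
```

In the MILP itself each segment would carry a binary selection variable; the sketch only shows the geometric approximation that those variables encode.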

  14. Advances in turbulence physics and modeling by direct numerical simulations

    NASA Technical Reports Server (NTRS)

    Reynolds, W. C.

    1987-01-01

    The advent of direct numerical simulations of turbulence has opened avenues for research on turbulence physics and turbulence modeling. Direct numerical simulation provides values for anything that the scientist or modeler would like to know about the flow. An overview of some recent advances in the physical understanding of turbulence and in turbulence modeling obtained through such simulations is presented.

  15. ADVANCED TECHNIQUES FOR RESERVOIR SIMULATION AND MODELING OF NONCONVENTIONAL WELLS

    SciTech Connect

    Louis J. Durlofsky; Khalid Aziz

    2004-08-20

    Nonconventional wells, which include horizontal, deviated, multilateral and ''smart'' wells, offer great potential for the efficient management of oil and gas reservoirs. These wells are able to contact larger regions of the reservoir than conventional wells and can also be used to target isolated hydrocarbon accumulations. The use of nonconventional wells instrumented with downhole inflow control devices allows for even greater flexibility in production. Because nonconventional wells can be very expensive to drill, complete and instrument, it is important to be able to optimize their deployment, which requires the accurate prediction of their performance. However, predictions of nonconventional well performance are often inaccurate. This is likely due to inadequacies in some of the reservoir engineering and reservoir simulation tools used to model and optimize nonconventional well performance. A number of new issues arise in the modeling and optimization of nonconventional wells. For example, the optimal use of downhole inflow control devices has not been addressed for practical problems. In addition, the impact of geological and engineering uncertainty (e.g., valve reliability) has not been previously considered. In order to model and optimize nonconventional wells in different settings, it is essential that the tools be implemented into a general reservoir simulator. This simulator must be sufficiently general and robust and must in addition be linked to a sophisticated well model. Our research under this five year project addressed all of the key areas indicated above. The overall project was divided into three main categories: (1) advanced reservoir simulation techniques for modeling nonconventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and for coupling the well to the simulator (which includes the accurate calculation of well index and the modeling of multiphase flow in the wellbore

  16. ISAC: A tool for aeroservoelastic modeling and analysis

    NASA Technical Reports Server (NTRS)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  17. Open Innovation at NASA: A New Business Model for Advancing Human Health and Performance Innovations

    NASA Technical Reports Server (NTRS)

    Davis, Jeffrey R.; Richard, Elizabeth E.; Keeton, Kathryn E.

    2014-01-01

    This paper describes a new business model for advancing NASA human health and performance innovations and demonstrates how open innovation shaped its development. A 45 percent research and technology development budget reduction drove formulation of a strategic plan grounded in collaboration. We describe the strategy execution, including adoption and results of open innovation initiatives, the challenges of cultural change, and the development of virtual centers and a knowledge management tool to educate and engage the workforce and promote cultural change.

  18. Advances on genetic rat models of epilepsy

    PubMed Central

    Serikawa, Tadao; Mashimo, Tomoji; Kuramoto, Takashi; Voigt, Birger; Ohno, Yukihiro; Sasa, Masashi

    2014-01-01

    Considering the suitability of laboratory rats in epilepsy research, we and other groups have been developing genetic models of epilepsy in this species. After epileptic rats or seizure-susceptible rats were sporadically found in outbred stocks, the epileptic traits were usually genetically-fixed by selective breeding. So far, the absence seizure models GAERS and WAG/Rij, audiogenic seizure models GEPR-3 and GEPR-9, generalized tonic-clonic seizure models IER, NER and WER, and Canavan-disease related epileptic models TRM and SER have been established. Dissection of the genetic bases including causative genes in these epileptic rat models would be a significant step toward understanding epileptogenesis. N-ethyl-N-nitrosourea (ENU) mutagenesis provides a systematic approach which allowed us to develop two novel epileptic rat models: heat-induced seizure susceptible (Hiss) rats with an Scn1a missense mutation and autosomal dominant lateral temporal epilepsy (ADLTE) model rats with an Lgi1 missense mutation. In addition, we have established episodic ataxia type 1 (EA1) model rats with a Kcna1 missense mutation derived from the ENU-induced rat mutant stock, and identified a Cacna1a missense mutation in a N-Methyl-N-nitrosourea (MNU)-induced mutant rat strain GRY, resulting in the discovery of episodic ataxia type 2 (EA2) model rats. Thus, epileptic rat models have been established on the two paths: ‘phenotype to gene’ and ‘gene to phenotype’. In the near future, development of novel epileptic rat models will be extensively promoted by the use of sophisticated genome editing technologies. PMID:25312505

  19. Advances on genetic rat models of epilepsy.

    PubMed

    Serikawa, Tadao; Mashimo, Tomoji; Kuramoto, Takashi; Voigt, Birger; Ohno, Yukihiro; Sasa, Masashi

    2015-01-01

    Considering the suitability of laboratory rats in epilepsy research, we and other groups have been developing genetic models of epilepsy in this species. After epileptic rats or seizure-susceptible rats were sporadically found in outbred stocks, the epileptic traits were usually genetically-fixed by selective breeding. So far, the absence seizure models GAERS and WAG/Rij, audiogenic seizure models GEPR-3 and GEPR-9, generalized tonic-clonic seizure models IER, NER and WER, and Canavan-disease related epileptic models TRM and SER have been established. Dissection of the genetic bases including causative genes in these epileptic rat models would be a significant step toward understanding epileptogenesis. N-ethyl-N-nitrosourea (ENU) mutagenesis provides a systematic approach which allowed us to develop two novel epileptic rat models: heat-induced seizure susceptible (Hiss) rats with an Scn1a missense mutation and autosomal dominant lateral temporal epilepsy (ADLTE) model rats with an Lgi1 missense mutation. In addition, we have established episodic ataxia type 1 (EA1) model rats with a Kcna1 missense mutation derived from the ENU-induced rat mutant stock, and identified a Cacna1a missense mutation in a N-Methyl-N-nitrosourea (MNU)-induced mutant rat strain GRY, resulting in the discovery of episodic ataxia type 2 (EA2) model rats. Thus, epileptic rat models have been established on the two paths: 'phenotype to gene' and 'gene to phenotype'. In the near future, development of novel epileptic rat models will be extensively promoted by the use of sophisticated genome editing technologies. PMID:25312505

  20. Recent Advances in the LEWICE Icing Model

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Addy, Gene; Struk, Peter; Bartkus, Tadas

    2015-01-01

    This paper will describe two recent modifications to the Glenn ICE software. First, a capability for modeling ice crystals and mixed-phase icing has been modified based on recent experimental data. Modifications have been made to the ice-particle bouncing and erosion model. This capability has been added as part of a larger effort to model ice crystal ingestion in aircraft engines. Comparisons have been made to ice-crystal ice accretions performed in the NRC Research Altitude Test Facility (RATFac). Second, modifications were made to the runback model based on data and observations from thermal scaling tests performed in the NRC Altitude Icing Tunnel.

  1. An Advanced Sea-Floor Spreading Model.

    ERIC Educational Resources Information Center

    Dutch, Steven I.

    1986-01-01

    Describes models which (1) illustrate spreading that varies in rate from place to place; (2) clearly show transform faults as arcs of small circles; and (3) illustrate what happens near a pole of rotation. The models are easy to construct and have been well received by students. (JN)

  2. Advances in Swine biomedical Model Genomics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This manuscript is a short update on the diversity of swine biomedical models and the importance of genomics in their continued development. The swine has been used as a major mammalian model for human studies because of the similarity in size and physiology, and in organ development and disease pro...

  3. Carbon export algorithm advancements in models

    NASA Astrophysics Data System (ADS)

    Çağlar Yumruktepe, Veli; Salihoğlu, Barış

    2015-04-01

    The rate at which anthropogenic CO2 is absorbed by the oceans remains a critical question under investigation by climate researchers. Construction of a complete carbon budget requires better understanding of air-sea exchanges and of the processes controlling the vertical and horizontal transport of carbon in the ocean, particularly the biological carbon pump. Improved parameterization of carbon sequestration within ecosystem models is vital to better understand and predict changes in the global carbon cycle. Because of the complexity of the processes controlling particle aggregation, sinking and decomposition, existing ecosystem models necessarily parameterize carbon sequestration using simple algorithms. Development of improved algorithms describing carbon export and sequestration, suitable for inclusion in numerical models, is ongoing work. Unique algorithms used in state-of-the-art ecosystem models, together with new experimental results from mesocosm experiments and open-ocean observations, have been inserted into a common 1D pelagic ecosystem model for testing purposes. The model was implemented at the time-series stations in the North Atlantic (BATS, PAP and ESTOC) and evaluated against datasets of carbon export. The algorithms targeted plankton functional types, grazing and vertical movement of zooplankton, and the remineralization, aggregation and ballasting dynamics of organic matter. Ultimately it is intended to feed the improved algorithms to the 3D modelling community for inclusion in coupled numerical models.
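    As a concrete example of the kind of simple export algorithm the abstract refers to, the widely used Martin-curve parameterization attenuates the sinking particulate-carbon flux as a power law of depth. This is a generic, commonly quoted formulation with its standard default exponent; it is not one of the specific algorithms tested in this work.

```python
def martin_flux(z, flux_ref=1.0, z_ref=100.0, b=0.858):
    """Particulate organic carbon flux at depth z (m), Martin-curve form.

    flux_ref is the export flux at the reference depth z_ref; b is the
    attenuation exponent (0.858 is the commonly quoted default).
    """
    if z < z_ref:
        raise ValueError("valid at or below the reference (export) depth only")
    return flux_ref * (z / z_ref) ** (-b)

# Flux decays strongly with depth: most exported carbon is remineralized
# in the upper mesopelagic, as the power law implies.
assert martin_flux(100.0) == 1.0
assert martin_flux(1000.0) < 0.2
```

A 1D testbed like the one described above would compare such closed-form attenuation against explicit aggregation/ballasting dynamics at each time-series station.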

  4. Recent modelling advances for ultrasonic TOFD inspections

    SciTech Connect

    Darmon, Michel; Ferrand, Adrien; Dorval, Vincent; Chatillon, Sylvain; Lonné, Sébastien

    2015-03-31

    The ultrasonic TOFD (Time of Flight Diffraction) technique is commonly used to detect and characterize disoriented cracks using their edge diffraction echoes. An overview is presented of the models integrated in the CIVA software platform and devoted to TOFD simulation. CIVA can predict diffraction echoes from complex 3D flaws using a model based on the PTD (Physical Theory of Diffraction). Other dedicated developments have been added to simulate lateral waves in 3D on planar entry surfaces and in 2D on irregular surfaces by a ray approach. Calibration echoes from side-drilled holes (SDHs), specimen echoes, and shadowing effects from flaws can also be modelled. Some examples of theoretical validation of the models are presented. In addition, experimental validations have been performed both on planar blocks containing calibration holes and various notches and on a specimen with an irregular entry surface, and they allow conclusions to be drawn on the validity of all the developed models.
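    The basic geometry such TOFD models build on is the standard time-of-flight relation: for a crack tip midway between a transmitter and receiver separated by 2*s, the diffraction echo arrives at t = 2*sqrt(s^2 + d^2)/c, so the tip depth d can be recovered from a measured time of flight. This is the textbook relation only, not a CIVA model; the probe spacing and wave speed below are illustrative numbers.

```python
import math

def tofd_depth(t, s, c):
    """Depth of a diffracting crack tip midway between the probes.

    t: measured time of flight, s: half the probe separation,
    c: longitudinal wave speed (consistent units assumed).
    """
    half_path = c * t / 2.0
    if half_path < s:
        raise ValueError("time of flight shorter than the lateral-wave path")
    return math.sqrt(half_path ** 2 - s ** 2)

# Round trip check: a tip at 10 mm depth, probes 40 mm apart,
# c = 5.9 mm/us (approximate longitudinal speed in steel).
c, s, d = 5.9, 20.0, 10.0
t = 2.0 * math.sqrt(s ** 2 + d ** 2) / c
assert abs(tofd_depth(t, s, c) - d) < 1e-9
```

The edge-diffraction models in the abstract predict the amplitude and shape of these echoes; the relation above only fixes when they arrive.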

  5. An advanced terrain modeler for an autonomous planetary rover

    NASA Technical Reports Server (NTRS)

    Hunter, E. L.

    1980-01-01

    A roving vehicle capable of autonomously exploring the surface of an alien world is under development, and an advanced terrain modeler that characterizes the possible paths of the rover as hazardous or safe is presented. This advanced terrain modeler offers several improvements over the Troiani modeler, including a crosspath analysis, better determination of hazards on slopes, and methods for dealing with missing returns at the extremities of the sensor field. The results from a package of programs that simulate the roving vehicle are then examined and compared to results from the Troiani modeler.

  6. Neural Networks for Hydrological Modeling Tool for Operational Purposes

    NASA Astrophysics Data System (ADS)

    Bhatt, Divya; Jain, Ashu

    2010-05-01

    Hydrological models are useful in many water resources applications such as flood control, irrigation and drainage, hydropower generation, water supply, and erosion and sediment control. Estimates of runoff are needed in many water resources planning, design, development, operation and maintenance activities. Runoff is generally computed using rainfall-runoff models. Computer-based hydrologic models have become popular for obtaining hydrological forecasts and for managing water systems. The Rainfall-Runoff Library (RRL) is computer software developed by the Cooperative Research Centre for Catchment Hydrology (CRCCH), Australia, consisting of five different conceptual rainfall-runoff models, and has been in operation in many water resources applications in Australia. Recently, soft artificial intelligence tools such as Artificial Neural Networks (ANNs) have become popular for research purposes but have not been adopted in operational hydrological forecasting. There is a strong need to develop ANN models based on real catchment data and compare them with the conceptual models actually in use in real catchments. In this paper, the results from an investigation on the use of the RRL and ANNs are presented. Out of the five conceptual models in the RRL toolkit, the SimHyd model has been used. A Genetic Algorithm (GA) has been used as the optimizer in the RRL to calibrate the SimHyd model. Trial-and-error procedures were employed to arrive at the best values of the various parameters involved in the GA optimizer. The results obtained from the best configuration of the SimHyd model are presented here. A feed-forward neural network structure trained by the back-propagation algorithm has been adopted to develop the ANN models. The daily rainfall and runoff data derived from the Bird Creek Basin, Oklahoma, USA have been employed to develop all the models included here. A wide range of error statistics has been used to evaluate the performance of all the models.
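    As an illustration of the feed-forward/back-propagation setup described above, a minimal sketch follows. The network size, learning rate, and the synthetic two-lag "rainfall" data are assumptions for illustration, not the actual Bird Creek configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_ann(X, y, hidden=4, lr=0.5, epochs=5000):
    """Train a one-hidden-layer sigmoid network by batch gradient descent.

    Inputs X play the role of lagged rainfall values and y the next-day
    runoff, both scaled to [0, 1] as is usual for sigmoid networks.
    """
    n, d = X.shape
    W1 = rng.normal(0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        h = sig(X @ W1 + b1)              # hidden activations
        yhat = sig(h @ W2 + b2)           # network output
        err = yhat - y.reshape(-1, 1)
        # Back-propagate the squared-error gradient through both layers.
        d2 = err * yhat * (1 - yhat)
        d1 = (d2 @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d2 / n; b2 -= lr * d2.mean(0)
        W1 -= lr * X.T @ d1 / n; b1 -= lr * d1.mean(0)
    return lambda Xn: sig(sig(Xn @ W1 + b1) @ W2 + b2).ravel()

# Synthetic catchment: runoff is a nonlinear function of two rain lags.
X = rng.uniform(size=(200, 2))
y = 0.2 + 0.5 * X[:, 0] * X[:, 1]
model = train_ann(X, y)
```

The nonlinear product term stands in for the kind of input-output relationship that makes conceptual models hard to interpret; the trained network should comfortably beat a constant (mean-runoff) predictor.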

  7. Visualization Skills: A Prerequisite to Advanced Solid Modeling

    ERIC Educational Resources Information Center

    Gow, George

    2007-01-01

    Many educators believe that solid modeling software has made teaching two- and three-dimensional visualization skills obsolete. They claim that the visual tools built into the solid modeling software serve as a replacement for the CAD operator's personal visualization skills. They also claim that because solid modeling software can produce…

  8. Community Coordinated Modeling Center (CCMC): Providing Access to Space Weather Models and Research Support Tools

    NASA Astrophysics Data System (ADS)

    Chulaki, A.; Bakshi, S. S.; Berrios, D.; Hesse, M.; Kuznetsova, M. M.; Lee, H.; MacNeice, P. J.; Mendoza, A. M.; Mullinix, R.; Patel, K. D.; Pulkkinen, A.; Rastaetter, L.; Shim, J.; Taktakishvili, A.; Zheng, Y.

    2011-12-01

    The Community Coordinated Modeling Center (CCMC) at NASA Goddard Space Flight Center provides access to state-of-the-art space weather models for the research community. The majority of the models residing at the CCMC are comprehensive, computationally intensive, physics-based models. The CCMC also provides free services and tools to assist the research community in analyzing the results of space weather model simulations. We present an overview of the tools and services available at the CCMC: the Runs-On-Request system, the online visualization tools, the Kameleon access and interpolation library, and the Metrics Challenge tool suite.

  9. COUNCIL FOR REGULATORY ENVIRONMENTAL MODELING (CREM) PILOT WATER QUALITY MODEL SELECTION TOOL

    EPA Science Inventory

    EPA's Council for Regulatory Environmental Modeling (CREM) is currently supporting the development of a pilot model selection tool that is intended to help the states and the regions implement the total maximum daily load (TMDL) program. This tool will be implemented within the ...

  10. Advances and applications of occupancy models

    USGS Publications Warehouse

    Bailey, Larissa; MacKenzie, Darry I.; Nichols, James D.

    2013-01-01

    Summary: The past decade has seen an explosion in the development and application of models aimed at estimating species occurrence and occupancy dynamics while accounting for possible non-detection or species misidentification. We discuss some recent occupancy estimation methods and the biological systems that motivated their development. Collectively, these models offer tremendous flexibility, but simultaneously place added demands on the investigator. Unlike many mark–recapture scenarios, investigators utilizing occupancy models have the ability, and responsibility, to define their sample units (i.e. sites), replicate sampling occasions, time period over which species occurrence is assumed to be static and even the criteria that constitute ‘detection’ of a target species. Subsequent biological inference and interpretation of model parameters depend on these definitions and the ability to meet model assumptions. We demonstrate the relevance of these definitions by highlighting applications from a single biological system (an amphibian–pathogen system) and discuss situations where the use of occupancy models has been criticized. Finally, we use these applications to suggest future research and model development.
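    The site/occasion definitions discussed above translate directly into the basic single-season occupancy likelihood, in which a site with zero detections may be either unoccupied or occupied-but-missed. A minimal sketch, assuming a constant occupancy probability psi and detection probability p (not any specific model from the paper), fit here by a coarse grid search for transparency:

```python
import numpy as np

rng = np.random.default_rng(1)

def neg_log_lik(psi, p, det, K):
    """Negative log-likelihood of the basic single-season occupancy model.

    det[i] = number of surveys (out of K) with a detection at site i.
    The binomial constant C(K, d) is dropped; it does not affect the
    maximum-likelihood estimates.
    """
    lik = psi * p**det * (1 - p)**(K - det) + (1 - psi) * (det == 0)
    return -np.sum(np.log(lik))

# Simulate 500 sites, K = 5 replicate surveys, true psi = 0.6, p = 0.4.
K, psi_true, p_true = 5, 0.6, 0.4
occupied = rng.uniform(size=500) < psi_true
det = rng.binomial(K, p_true, size=500) * occupied
# Coarse grid search for the MLE (a real analysis would use an optimizer).
grid = np.linspace(0.01, 0.99, 99)
_, psi_hat, p_hat = min((neg_log_lik(ps, pd, det, K), ps, pd)
                        for ps in grid for pd in grid)
```

The naive occupancy estimate (fraction of sites with any detection) would be biased low here, which is exactly the non-detection problem these models address.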

  11. Designing a training tool for imaging mental models

    NASA Technical Reports Server (NTRS)

    Dede, Christopher J.; Jayaram, Geetha

    1990-01-01

    The training process can be conceptualized as the student acquiring an evolutionary sequence of classification-problem-solving mental models. For example, a physician learns (1) classification systems for patient symptoms, diagnostic procedures, diseases, and therapeutic interventions and (2) interrelationships among these classifications (e.g., how to use diagnostic procedures to collect data about a patient's symptoms in order to identify the disease so that therapeutic measures can be taken). This project developed functional specifications for a computer-based tool, Mental Link, that allows the evaluative imaging of such mental models. The fundamental design approach underlying this representational medium is traversal of virtual cognition space. Typically intangible cognitive entities and the links among them become visible as a three-dimensional web that represents a knowledge structure. The tool has a high degree of flexibility and customizability to allow extension to other types of uses, such as a front-end to an intelligent tutoring system, a knowledge base, a hypermedia system, or a semantic network.

  12. ADVANCED UTILITY SIMULATION MODEL DESCRIPTION OF MODIFICATIONS TO THE STATE LEVEL MODEL (VERSION 3.0)

    EPA Science Inventory

    The report documents modifications to the state level model portion of the Advanced Utility Simulation Model (AUSM), one of four stationary source emission and control cost forecasting models developed for the National Acid Precipitation Assessment Program (NAPAP). The AUSM model...

  13. SMART (Shop floor Modeling, Analysis and Reporting Tool) Project

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.; Garcia, Maretys L.; Mendoza, Alicia C.; Molina, Louis A.; Correa, Daisy; Wint, Steve; Doice, Gregorie; Reyes, M. Florencia

    1999-01-01

    This document summarizes the design and prototype of the Shop floor Modeling, Analysis, and Reporting Tool (S.M.A.R.T.). A detailed description is found in the full documentation given to the NASA liaison. This documentation is also available on the A.R.I.S.E. Center web site, under a protected directory. Only authorized users can gain access to this site.

  14. Digraph reliability model processing advances and applications

    NASA Technical Reports Server (NTRS)

    Iverson, D. L.; Patterson-Hine, F. A.

    1993-01-01

    This paper describes a new algorithm, called SourceDoubls, which efficiently solves for singletons and doubletons of a digraph reliability model. Compared with previous methods, the SourceDoubls algorithm provides up to a two order of magnitude reduction in the amount of time required to solve large digraph models. This significant increase in model solution speed allows complex digraphs containing thousands of nodes to be used as knowledge bases for real time automated monitoring and diagnosis applications. Currently, an application to provide monitoring and diagnosis of the Space Station Freedom Data Management System is under development at NASA/Ames Research Center and NASA/Johnson Space Center. This paper contains an overview of this system and provides details of how it will use digraph models processed by the SourceDoubls algorithm to accomplish its task.
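    The abstract does not give the SourceDoubls algorithm itself, but the quantities it computes can be illustrated with a brute-force sketch: singletons are single nodes whose failure propagates to a target event, and doubletons are minimal pairs that bring it down jointly. The AND-node semantics and the toy model below are assumptions for illustration only.

```python
from itertools import combinations

def failure_set(initial, preds, and_nodes):
    """Propagate failures to a fixed point: an ordinary (OR) node fails
    when any predecessor has failed; an AND node (e.g. a redundant pair)
    fails only when all of its predecessors have failed."""
    failed = set(initial)
    changed = True
    while changed:
        changed = False
        for node, ps in preds.items():
            if node in failed or not ps:
                continue
            hit = (all(p in failed for p in ps) if node in and_nodes
                   else any(p in failed for p in ps))
            if hit:
                failed.add(node)
                changed = True
    return failed

def singletons(preds, and_nodes, target):
    return sorted(n for n in preds if n != target
                  and target in failure_set({n}, preds, and_nodes))

def doubletons(preds, and_nodes, target):
    """Minimal failure pairs: neither member is itself a singleton."""
    sing = set(singletons(preds, and_nodes, target))
    rest = [n for n in sorted(preds) if n != target and n not in sing]
    return [pair for pair in combinations(rest, 2)
            if target in failure_set(set(pair), preds, and_nodes)]

# Toy model: redundant components C1 and C2 feed AND node M; either M
# or C3 brings down the top event T.
preds = {"C1": [], "C2": [], "C3": [], "M": ["C1", "C2"], "T": ["M", "C3"]}
and_nodes = {"M"}
```

The brute-force pair enumeration is quadratic in the node count per candidate, which is exactly the cost the SourceDoubls algorithm was designed to avoid on thousand-node digraphs.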

  15. State of the art: diagnostic tools and innovative therapies for treatment of advanced thymoma and thymic carcinoma.

    PubMed

    Ried, Michael; Marx, Alexander; Götz, Andrea; Hamer, Okka; Schalke, Berthold; Hofmann, Hans-Stefan

    2016-06-01

    In this review article, state-of-the-art diagnostic tools and innovative treatments of thymoma and thymic carcinoma (TC) are described with special respect to advanced tumour stages. Complete surgical resection (R0) remains the standard therapeutic approach for almost all a priori resectable mediastinal tumours as defined by preoperative standard computed tomography (CT). If lymphoma or germ-cell tumours are differential diagnostic considerations, biopsy may be indicated. Resection status is the most important prognostic factor in thymoma and TC, followed by tumour stage. Advanced (Masaoka-Koga stage III and IVa) tumours require interdisciplinary therapy decisions based on distinctive findings of preoperative CT scan and ancillary investigations [magnetic resonance imaging (MRI)] to select cases for primary surgery or neoadjuvant strategies with optional secondary resection. In neoadjuvant settings, octreotide scans and histological evaluation of pretherapeutic needle biopsies may help to choose between somatostatin agonist/prednisolone regimens and neoadjuvant chemotherapy as first-line treatment. Finally, a multimodality treatment regime is recommended for advanced and unresectable thymic tumours. In conclusion, advanced stage thymoma and TC should preferably be treated in experienced centres in order to provide all modern diagnostic tools (imaging, histology) and innovative therapy techniques. Systemic and local (hyperthermic intrathoracic chemotherapy) medical treatments together with extended surgical resections have increased the therapeutic options in patients with advanced or recurrent thymoma and TC. PMID:26670806

  16. Advanced Concepts for Underwater Acoustic Channel Modeling

    NASA Astrophysics Data System (ADS)

    Etter, P. C.; Haas, C. H.; Ramani, D. V.

    2014-12-01

    This paper examines nearshore underwater-acoustic channel modeling concepts and compares channel-state information requirements against existing modeling capabilities. This process defines a subset of candidate acoustic models suitable for simulating signal propagation in underwater communications. Underwater-acoustic communications find many practical applications in coastal oceanography, and networking is the enabling technology for these applications. Such networks can be formed by establishing two-way acoustic links between autonomous underwater vehicles and moored oceanographic sensors. These networks can be connected to a surface unit for further data transfer to ships, satellites, or shore stations via a radio-frequency link. This configuration establishes an interactive environment in which researchers can extract real-time data from multiple, but distant, underwater instruments. After evaluating the obtained data, control messages can be sent back to individual instruments to adapt the networks to changing situations. Underwater networks can also be used to increase the operating ranges of autonomous underwater vehicles by hopping the control and data messages through networks that cover large areas. A model of the ocean medium between acoustic sources and receivers is called a channel model. In an oceanic channel, characteristics of the acoustic signals change as they travel from transmitters to receivers. These characteristics depend upon the acoustic frequency, the distances between sources and receivers, the paths followed by the signals, and the prevailing ocean environment in the vicinity of the paths. Properties of the received signals can be derived from those of the transmitted signals using these channel models. This study concludes that ray-theory models are best suited to the simulation of acoustic signal propagation in oceanic channels and identifies 33 such models that are eligible candidates.
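    As a sketch of the kind of building block such channel models contain, the following computes one-way transmission loss from geometric spreading plus Thorp's empirical absorption formula. The spreading exponent and the use of Thorp's fit are textbook assumptions, not a specific model from this study.

```python
import math

def thorp_alpha_db_per_km(f_khz):
    """Thorp's empirical seawater absorption coefficient (dB/km, f in kHz),
    a common textbook fit valid roughly from 100 Hz to 100 kHz."""
    f2 = f_khz ** 2
    return (0.11 * f2 / (1 + f2) + 44 * f2 / (4100 + f2)
            + 2.75e-4 * f2 + 0.003)

def transmission_loss_db(r_m, f_khz, spreading=20.0):
    """Spreading loss plus frequency-dependent absorption.

    spreading = 20 gives spherical spreading (20 log10 r); 10 gives
    cylindrical spreading for shallow, ducted propagation."""
    return spreading * math.log10(r_m) + thorp_alpha_db_per_km(f_khz) * r_m / 1000.0
```

The strong growth of absorption with frequency is why underwater acoustic links trade bandwidth for range, and it is one of the channel-state quantities the candidate ray-theory models must capture.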

  17. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    SciTech Connect

    Not Available

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to the rapid improvement of climate models. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes; that fully utilizes the hardware and software capabilities of new computer architectures; that probes the limits of climate predictability; and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  18. Advanced modeling techniques for micromagnetic systems.

    PubMed

    Jalil, M B A; Tan, S G; Cheng, X Z

    2007-01-01

    We present a review of micromagnetic and magnetotransport modeling methods which go beyond the standard model. We first give a brief overview of the standard micromagnetic model, which for (i) the steady-state (equilibrium) solution is based on the minimization of the free energy functional, and for (ii) the dynamical solution, relies on the numerical solution of the Landau-Lifshitz-Gilbert (LLG) equation. We present three complements to the standard model, i.e., (i) magnetotransport calculations based on ohmic conduction in the presence of the anisotropic magnetoresistance (AMR) effect, (ii) magnetotransport calculations based on spin-dependent tunneling in the presence of single charge tunneling (Coulomb blockade) effect, and (iii) stochastic micromagnetics, which incorporates the effects of thermal fluctuations via a white-noise thermal field in the LLG equation. All three complements are of practical importance: (i) magnetotransport model either in the ohmic or tunneling transport regimes, enables the conversion of the micromagnetic results to the measurable quantity of magnetoresistance ratio, while (ii) stochastic modeling is essential as the dimensions of the micromagnetic system reduces to the deep submicron regime and approaches the superparamagnetic limit. PMID:17455475
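    The dynamical part of the standard model can be sketched by integrating the LLG equation in its explicit Landau-Lifshitz form. A minimal deterministic example in reduced units follows; the thermal-field term of stochastic micromagnetics is omitted, and the single-spin setting and parameter values are assumptions for illustration.

```python
import numpy as np

def llg_rhs(m, H, alpha, gamma=1.0):
    """Landau-Lifshitz-Gilbert right-hand side in explicit (Landau-
    Lifshitz) form: precession about H plus damping toward H."""
    pre = gamma / (1.0 + alpha ** 2)
    mxH = np.cross(m, H)
    return -pre * (mxH + alpha * np.cross(m, mxH))

def integrate_llg(m0, H, alpha, dt=0.01, steps=20000):
    """Heun (predictor-corrector) time stepping with renormalization,
    as is typical in micromagnetic codes; |m| = 1 is a model constraint."""
    m = np.array(m0, dtype=float)
    H = np.asarray(H, dtype=float)
    for _ in range(steps):
        k1 = llg_rhs(m, H, alpha)
        k2 = llg_rhs(m + dt * k1, H, alpha)
        m = m + 0.5 * dt * (k1 + k2)
        m /= np.linalg.norm(m)
    return m
```

Starting perpendicular to the field, the magnetization spirals in and aligns with H on a timescale set by the damping constant alpha; the stochastic extension would add a white-noise field to H at each step.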

  19. The MineTool Software Suite: A Novel Data Mining Palette of Tools for Automated Modeling of Space Physics Data

    NASA Astrophysics Data System (ADS)

    Sipes, T.; Karimabadi, H.; Roberts, A.

    2009-12-01

    We present a new data mining software tool called MineTool for the analysis and modeling of space physics data. MineTool is a graphical user interface implementation that merges two data mining algorithms into an easy-to-use software tool: an algorithm for analysis and modeling of static data [Karimabadi et al., 2007] and MineTool-TS, an algorithm for data mining of time series data [Karimabadi et al., 2009]. By automating the modeling process and model evaluations, MineTool makes data mining and predictive modeling more accessible to non-experts. The software is written entirely in Java and is freeware. By ranking all inputs as predictors of the outcome before constructing a model, MineTool also restricts the model to the relevant variables. The technique aggregates the various stages of model building into a four-step process consisting of (i) data segmentation and sampling, (ii) variable pre-selection and transform generation, (iii) predictive model estimation and validation, and (iv) final model selection. Optimal strategies are chosen for each modeling step. A notable feature of the technique is that the final model is always in closed analytical form rather than the “black box” form characteristic of some other techniques. Having the analytical model makes it possible to decipher how strongly the various variables affect the outcome. The MineTool suite also provides capabilities for preparing data for mining and for visualizing the datasets. MineTool has successfully been used to develop models for automated detection of flux transfer events (FTEs) at Earth’s magnetopause in Cluster spacecraft time series data and for 3D magnetopause modeling. In this presentation, we demonstrate the ease of use of the software through examples, including how it was used in the FTE problem.

  20. Advancing the argument for validity of the Alberta Context Tool with healthcare aides in residential long-term care

    PubMed Central

    2011-01-01

    Background Organizational context has the potential to influence the use of new knowledge. However, despite advances in understanding the theoretical base of organizational context, its measurement has not been adequately addressed, limiting our ability to quantify and assess context in healthcare settings and, thus, to advance development of contextual interventions to improve patient care. We developed the Alberta Context Tool (the ACT) to address this concern. It consists of 58 items representing 10 modifiable contextual concepts. We reported the initial validation of the ACT in 2009. This paper presents the second stage of the psychometric validation of the ACT. Methods We used the Standards for Educational and Psychological Testing to frame our validity assessment. Data from 645 English-speaking healthcare aides from 25 urban residential long-term care facilities (nursing homes) in the three Canadian Prairie Provinces were used for this stage of validation. In this stage we focused on: (1) advanced aspects of internal structure (e.g., confirmatory factor analysis) and (2) relations-with-other-variables validity evidence. To assess the reliability and validity of scores obtained using the ACT we conducted: Cronbach's alpha, confirmatory factor analysis, analysis of variance, and tests of association. We also assessed the performance of the ACT when individual responses were aggregated to the care unit level, because the instrument was developed to obtain unit-level scores of context. Results Item-total correlations exceeded acceptable standards (> 0.3) for the majority of items (51 of 58). We ran three confirmatory factor models. Model 1 (all ACT items) displayed unacceptable fit overall and for five specific items (1 item on adequate space for resident care in the Organizational Slack-Space ACT concept and 4 items on use of electronic resources in the Structural and Electronic Resources ACT concept). This prompted specification of two additional models. Model 2 used

  1. Networking Sensor Observations, Forecast Models & Data Analysis Tools

    NASA Astrophysics Data System (ADS)

    Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

    2009-12-01

    This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service-oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service-oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service-oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of a wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed by an analysis tool that extracts areas of high particulate matter concentrations and by a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We describe the process for enabling the fire location, smoke forecast, smoke observation, and

  2. A new tool for accelerator system modeling and analysis

    SciTech Connect

    Gillespie, G.H.; Hill, B.W.; Jameson, R.A.

    1994-09-01

    A novel computer code is being developed to generate system level designs of radiofrequency ion accelerators. The goal of the Accelerator System Model (ASM) code is to create a modeling and analysis tool that is easy to use, automates many of the initial design calculations, supports trade studies used in assessing alternate designs, and yet is flexible enough to incorporate new technology concepts as they emerge. Hardware engineering parameters and beam dynamics are modeled at comparable levels of fidelity. Existing scaling models of accelerator subsystems were used to produce a prototype of ASM (version 1.0) working within the Shell for Particle Accelerator Related Codes (SPARC) graphical user interface. A small user group has been testing and evaluating the prototype for about a year. Several enhancements and improvements are now being developed. The current version (1.1) of ASM is briefly described and an example of the modeling and analysis capabilities is illustrated.

  3. Advances in NLTE modeling for integrated simulations

    NASA Astrophysics Data System (ADS)

    Scott, H. A.; Hansen, S. B.

    2010-01-01

    The last few years have seen significant progress in constructing the atomic models required for non-local thermodynamic equilibrium (NLTE) simulations. Along with this has come an increased understanding of the requirements for accurately modeling the ionization balance, energy content and radiative properties of different atomic species for a wide range of densities and temperatures. Much of this progress is the result of a series of workshops dedicated to comparing the results from different codes and computational approaches applied to a series of test problems. The results of these workshops emphasized the importance of atomic model completeness, especially in doubly-excited states and autoionization transitions, to calculating ionization balance, and the importance of accurate, detailed atomic data to producing reliable spectra. We describe a simple screened-hydrogenic model that calculates NLTE ionization balance with sufficient accuracy, at a low enough computational cost for routine use in radiation-hydrodynamics codes. The model incorporates term splitting, Δn = 0 transitions, and approximate UTA widths for spectral calculations, with results comparable to those of much more detailed codes. Simulations done with this model have been increasingly successful at matching experimental data for laser-driven systems and hohlraums. Accurate and efficient atomic models are just one requirement for integrated NLTE simulations. Coupling the atomic kinetics to hydrodynamics and radiation transport constrains both discretizations and algorithms to retain energy conservation, accuracy and stability. In particular, the strong coupling between radiation and populations can require either very short time steps or significantly modified radiation transport algorithms to account for NLTE material response. Considerations such as these continue to provide challenges for NLTE simulations.
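    The population balance at the heart of such kinetics codes reduces, for a toy atom, to solving a linear rate matrix for its steady state. A minimal sketch with made-up two-level rates (not real atomic data, and far simpler than a screened-hydrogenic model):

```python
import numpy as np

def steady_state_populations(rates):
    """Solve dn/dt = R n = 0 with sum(n) = 1 for steady-state level
    populations.

    rates[i][j] (i != j) is the transition rate j -> i; the diagonal is
    filled so that each column sums to zero (rate conservation)."""
    R = np.array(rates, dtype=float)
    n = R.shape[0]
    for j in range(n):
        R[j, j] = -(R[:, j].sum() - R[j, j])
    # Replace the last equation by the normalization constraint sum(n) = 1.
    A = R.copy(); A[-1, :] = 1.0
    b = np.zeros(n); b[-1] = 1.0
    return np.linalg.solve(A, b)

# Two-level toy atom: collisional excitation C01 and de-excitation C10,
# plus spontaneous radiative decay A10.  When A10 dominates (low density)
# the upper level empties -- the basic NLTE departure from equilibrium.
C01, C10, A10 = 1.0, 2.0, 100.0
pops = steady_state_populations([[0.0, C10 + A10],
                                 [C01, 0.0]])
```

In steady state the upper-level fraction is C01 / (C01 + C10 + A10); real NLTE codes solve the same kind of system with thousands of levels coupled to the radiation field.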

  4. Advanced Numerical Model for Irradiated Concrete

    SciTech Connect

    Giorla, Alain B.

    2015-03-01

    In this report, we establish a numerical model for concrete exposed to irradiation to address these three critical points. The model accounts for creep in the cement paste and its coupling with damage, temperature and relative humidity. The shift in failure mode with the loading rate is also properly represented. The numerical model for creep has been validated and calibrated against different experiments in the literature [Wittmann, 1970, Le Roy, 1995]. Results from a simplified model are shown to showcase the ability of numerical homogenization to simulate irradiation effects in concrete. In future works, the complete model will be applied to the analysis of the irradiation experiments of Elleuch et al. [1972] and Kelly et al. [1969]. This requires a careful examination of the experimental environmental conditions as in both cases certain critical information are missing, including the relative humidity history. A sensitivity analysis will be conducted to provide lower and upper bounds of the concrete expansion under irradiation, and check if the scatter in the simulated results matches the one found in experiments. The numerical and experimental results will be compared in terms of expansion and loss of mechanical stiffness and strength. Both effects should be captured accordingly by the model to validate it. Once the model has been validated on these two experiments, it can be applied to simulate concrete from nuclear power plants. To do so, the materials used in these concrete must be as well characterized as possible. The main parameters required are the mechanical properties of each constituent in the concrete (aggregates, cement paste), namely the elastic modulus, the creep properties, the tensile and compressive strength, the thermal expansion coefficient, and the drying shrinkage. These can be either measured experimentally, estimated from the initial composition in the case of cement paste, or back-calculated from mechanical tests on concrete. If some

  5. Phenomenological Modeling of Infrared Sources: Recent Advances

    NASA Technical Reports Server (NTRS)

    Leung, Chun Ming; Kwok, Sun (Editor)

    1993-01-01

    Infrared observations from planned space facilities (e.g., ISO (Infrared Space Observatory), SIRTF (Space Infrared Telescope Facility)) will yield a large and uniform sample of high-quality data from both photometric and spectroscopic measurements. To maximize the scientific returns of these space missions, complementary theoretical studies must be undertaken to interpret these observations. A crucial step in such studies is the construction of phenomenological models in which we parameterize the observed radiation characteristics in terms of the physical source properties. In the last decade, models with increasing degree of physical realism (in terms of grain properties, physical processes, and source geometry) have been constructed for infrared sources. Here we review current capabilities available in the phenomenological modeling of infrared sources and discuss briefly directions for future research in this area.

  6. Advances in Modeling Exploding Bridgewire Initiation

    SciTech Connect

    Hrousis, C A; Christensen, J S

    2010-03-10

    There is great interest in applying magnetohydrodynamic (MHD) simulation techniques to the designs of electrical high explosive (HE) initiators, for the purpose of better understanding a design's sensitivities, optimizing its performance, and/or predicting its useful lifetime. Two MHD-capable LLNL codes, CALE and ALE3D, are being used to simulate the process of ohmic heating, vaporization, and plasma formation in exploding bridgewires (EBW). Initiation of the HE is simulated using Ignition & Growth reactive flow models. 1-D, 2-D and 3-D models have been constructed and studied. The models provide some intuitive explanation of the initiation process and are useful for evaluating the potential impact of identified aging mechanisms (such as the growth of intermetallic compounds or powder sintering). The end product of this work is a simulation capability for evaluating margin in proposed, modified or aged initiation system designs.
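    Upstream of the full MHD treatment, the ohmic-heating stage can be caricatured by a zero-dimensional energy balance on the wire. The wire parameters below are illustrative placeholders, not data for any real EBW design, and vaporization and plasma formation are reduced to a single threshold temperature.

```python
def burst_time(current_a, wire):
    """0-D ohmic-heating estimate of the time for a bridgewire to reach
    its vaporization temperature -- a drastic simplification of the MHD
    treatment, showing only the basic I^2 R energy balance.

    wire: dict with mass (kg), specific heat c (J/kg/K), cold resistance
    R0 (ohm), temperature coefficient beta (1/K), T0 and T_vap (K)."""
    T, t, dt = wire["T0"], 0.0, 1e-9
    while T < wire["T_vap"]:
        R = wire["R0"] * (1 + wire["beta"] * (T - wire["T0"]))  # R rises with T
        T += current_a ** 2 * R / (wire["mass"] * wire["c"]) * dt
        t += dt
    return t

# Hypothetical gold-like bridgewire (all values illustrative only).
wire = dict(mass=2.0e-8, c=130.0, R0=0.02, beta=0.0034, T0=300.0, T_vap=3100.0)
```

Because resistance grows with temperature, heating accelerates toward burst; doubling the drive current should cut the burst time by well over a factor of two.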

  7. Evaluation of air pollution modelling tools as environmental engineering courseware.

    PubMed

    Souto González, J A; Bello Bugallo, P M; Casares Long, J J

    2004-01-01

    The study of phenomena related to the dispersion of pollutants usually takes advantage of mathematical models based on the description of the different processes involved. This educational approach is especially important in air pollution dispersion, where the processes follow non-linear behaviour, making it difficult to understand the relationships between inputs and outputs, and in a 3D context where it becomes hard to analyze alphanumeric results. In this work, three different software tools, serving as computer solvers for typical air pollution dispersion phenomena, are presented. Each software tool, developed to be implemented on PCs, follows an approach representing one of three generations of programming languages (Fortran 77, Visual Basic and Java), applied over three different environments: MS-DOS, MS-Windows and the world wide web. The software tools were tested by students of environmental engineering (undergraduate) and chemical engineering (postgraduate) in order to evaluate their ability to improve both theoretical and practical knowledge of the air pollution dispersion problem, and the impact of the different environments on the learning process in terms of content, ease of use and visualization of results. PMID:15193095
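    A typical courseware solver of this kind implements the steady-state Gaussian plume model; a minimal sketch (not one of the three tools described) follows. In real courseware the dispersion coefficients sy and sz would be looked up from stability-class curves at each downwind distance, which is assumed away here.

```python
import math

def gaussian_plume(q, u, y, z, h, sy, sz):
    """Ground-reflecting Gaussian plume concentration (g/m^3), the classic
    steady-state point-source dispersion model.

    q: emission rate (g/s); u: wind speed (m/s); h: effective stack
    height (m); y, z: crosswind and vertical receptor coordinates (m);
    sy, sz: dispersion coefficients (m) at the receptor's downwind
    distance."""
    lateral = math.exp(-y ** 2 / (2 * sy ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sz ** 2)) +
                math.exp(-(z + h) ** 2 / (2 * sz ** 2)))  # image source: ground reflection
    return q / (2 * math.pi * u * sy * sz) * lateral * vertical
```

The model's non-linear response to its inputs (exponential in y, z and h; inverse in u, sy, sz) is exactly the input-output behaviour the courseware aims to make tangible.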

  8. Model advanced for hydrocarbon microseepage, related alterations

    SciTech Connect

    Thompson, C.K. ); Saunders, D.F.; Burson, K.R. )

    1994-11-14

    Future significant petroleum fields will be found in subtle stratigraphic traps in addition to structural traps. Both may be detectable by measuring surface hydrocarbon microseepage and related alterations. Reasons these methods have not been commonly used include: (1) early over-selling by some contractors with consequent bad client experiences, and (2) lack of generally accepted scientific models to relate anomalies to subsurface hydrocarbon accumulations. This article is restricted primarily to the authors' specific experience, studies, and conclusions over some 38 years with particular emphasis on the last 15 years. The authors believe these findings have resulted in improved wildcat success rates and realistic scientific models.

  9. Modeling Innovations Advance Wind Energy Industry

    NASA Technical Reports Server (NTRS)

    2009-01-01

    In 1981, Glenn Research Center scientist Dr. Larry Viterna developed a model that predicted certain elements of wind turbine performance with far greater accuracy than previous methods. The model was met with derision from others in the wind energy industry, but years later, Viterna discovered it had become the most widely used method of its kind, enabling significant wind energy technologies-like the fixed pitch turbines produced by manufacturers like Aerostar Inc. of Westport, Massachusetts-that are providing sustainable, climate friendly energy sources today.
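    Viterna's method is usually presented as a post-stall extrapolation of airfoil lift and drag coefficients out to 90 degrees; the commonly cited Viterna-Corrigan form is sketched below. Treat the flat-plate CD_max fit and the sample stall-point inputs as assumptions rather than data from any particular turbine.

```python
import math

def viterna(alpha_deg, alpha_s_deg, cl_s, cd_s, aspect_ratio):
    """Viterna-Corrigan post-stall extrapolation (commonly cited form).

    alpha_s_deg, cl_s, cd_s: angle of attack, lift and drag coefficients
    at stall; valid for stall < alpha <= 90 degrees."""
    a = math.radians(alpha_deg)
    a_s = math.radians(alpha_s_deg)
    cd_max = 1.11 + 0.018 * aspect_ratio          # flat-plate-based fit
    # Constants chosen so CL and CD are continuous at the stall point.
    kd = (cd_s - cd_max * math.sin(a_s) ** 2) / math.cos(a_s)
    kl = (cl_s - cd_max * math.sin(a_s) * math.cos(a_s)) * math.sin(a_s) / math.cos(a_s) ** 2
    cl = cd_max / 2 * math.sin(2 * a) + kl * math.cos(a) ** 2 / math.sin(a)
    cd = cd_max * math.sin(a) ** 2 + kd * math.cos(a)
    return cl, cd
```

By construction the curves join the measured polars at stall and reach CL = 0, CD = CD_max at 90 degrees, which is what made the method practical for fixed-pitch turbines operating deep in stall.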

  10. Modeling of Passive Forces of Machine Tool Covers

    NASA Astrophysics Data System (ADS)

    Kolar, Petr; Hudec, Jan; Sulitka, Matej

    The passive forces acting against the drive force are phenomena that influence the dynamical properties and precision of linear axes equipped with feed drives. Covers are one of the important sources of passive forces in machine tools. The paper describes virtual evaluation of cover passive forces using a complex model of the cover. The model is able to compute the interaction between flexible cover segments and the sealing wiper. The result is the deformation of the cover segments and wipers, which is used together with a measured friction coefficient to compute the total passive force of the cover. This resulting passive force is dependent on cover position. A comparison of computational results and measurements on the real cover is presented in the paper.

  11. A Tool for Sharing Empirical Models of Climate Impacts

    NASA Astrophysics Data System (ADS)

    Rising, J.; Kopp, R. E.; Hsiang, S. M.

    2013-12-01

    Scientists, policy advisors, and the public struggle to synthesize the quickly evolving empirical work on climate change impacts. The Integrated Assessment Models (IAMs) used to estimate the impacts of climate change and the effects of adaptation and mitigation policies can also benefit greatly from recent empirical results (Kopp, Hsiang & Oppenheimer, Impacts World 2013 discussion paper). This paper details a new online tool for exploring, analyzing, combining, and communicating a wide range of impact results, and supporting their integration into IAMs. The tool uses a new database of statistical results, which researchers can expand both in depth (by providing additional results describing existing relationships) and breadth (by adding new relationships). Scientists can use the tool to quickly perform meta-analyses of related results, using Bayesian techniques to produce pooled and partially-pooled posterior distributions. Policy advisors can apply the statistical results to particular contexts, and combine different kinds of results in a cost-benefit framework. For example, models of the impact of temperature changes on agricultural yields can be first aggregated to build a best estimate of the effect under given assumptions, then compared across countries using different temperature scenarios, and finally combined to estimate a social cost of carbon. The general public can better understand the many estimates of climate impacts and their range of uncertainty by exploring these results dynamically, with maps, bar charts, and dose-response-style plots. [Figure captions: Front page of the climate impacts tool website; sample "collections" of models, within which all results are estimates of the same fundamental relationship, are shown on the right. Simple pooled result for Gelman's "8 schools" example; pooled results are calculated analytically, while partial pooling (Bayesian hierarchical estimation) uses posterior simulations.]
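The analytic pooling the abstract mentions can be illustrated with classical inverse-variance (fixed-effect) meta-analysis on Gelman's published "8 schools" data; this is a standard textbook calculation, not the tool's actual code:

```python
import numpy as np

def pooled_estimate(effects, std_errors):
    """Analytic fixed-effect (fully pooled) meta-analysis via
    inverse-variance weighting -- a standard technique, not the
    tool's actual implementation."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(std_errors, dtype=float) ** 2
    weights = 1.0 / variances
    mean = np.sum(weights * effects) / np.sum(weights)
    se = np.sqrt(1.0 / np.sum(weights))   # standard error of pooled mean
    return mean, se

# Gelman's "8 schools" data: estimated treatment effects and standard errors
effects = [28, 8, -3, 7, -1, 1, 18, 12]
ses = [15, 10, 16, 11, 9, 11, 10, 18]
mean, se = pooled_estimate(effects, ses)
print(round(mean, 2), round(se, 2))  # -> 7.69 4.07
```

Partial pooling (the Bayesian hierarchical case) has no closed form and is what requires the posterior simulations mentioned above.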

  12. Advances in Swine Biomedical Model Genomics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The swine has been a major biomedical model species, for transplantation, heart disease, allergies and asthma, as well as normal neonatal development and reproductive physiology. Swine have been used extensively for studies of infectious disease processes and analyses of preventative strategies, inc...

  13. Smart Engines Via Advanced Model Based Controls

    SciTech Connect

    Allain, Marc

    2000-08-20

    A "new" process for developing control systems: less engine testing; more robust control system; shorter development cycle time. A "smarter" approach to engine control: on-board models describe engine behavior; shorter, systematic calibration process; customer and legislative requirements designed-in.

  14. Measurement and modeling of advanced coal conversion processes

    SciTech Connect

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G.; Smoot, L.D.; Brewster, B.S. (Brigham Young Univ., Provo, UT)

    1991-01-01

    The overall objective of this program is the development of predictive capability for design, scale-up, simulation, control and feedstock evaluation in advanced coal conversion devices. The program will merge significant advances made in measuring and quantitatively describing the mechanisms of coal conversion behavior into comprehensive computer codes for mechanistic modeling of entrained-bed gasification. Additional capabilities in predicting pollutant formation will be implemented and the technology will be expanded to fixed-bed reactors.

  15. Community Surface Dynamics Modeling System and its CSDMS Modeling Tool to couple models and data (Invited)

    NASA Astrophysics Data System (ADS)

    Syvitski, J. P.; Csdms Scientific; Software Team

    2010-12-01

    CSDMS is the virtual home for a diverse community who foster and promote the modeling of earth surface processes, with emphasis on the movement of fluids, sediment and solutes through landscapes, seascapes and their sedimentary basins. CSDMS develops, integrates, disseminates & archives software (> 150 models and 3 million+ lines of code) that reflects and predicts earth surface processes over a broad range of time and space scales. CSDMS deals with the Earth's surface—the ever-changing, dynamic interface between lithosphere, hydrosphere, cryosphere, and atmosphere. CSDMS employs state-of-the-art architectures, interface standards and frameworks that make it possible to convert stand-alone models into flexible, "plug-and-play" components that can be assembled into larger applications. The CSDMS model-coupling environment offers language interoperability and structured and unstructured grids, and serves as a migration pathway for surface dynamics modelers towards High-Performance Computing (HPC). The CSDMS Modeling Tool (CMT) is a key product of the overall project, as it allows earth scientists with relatively modest computer coding experience to use the CSDMS modules for earth surface dynamics research and education. CMT is platform independent and can easily couple models that have followed the CSDMS protocols for model contribution: 1) open-source license; 2) availability; 3) vetting; 4) open-source language; 5) refactoring for componentization; 6) metadata & test files; 7) clean code documented using keywords.
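The "plug-and-play" component idea can be sketched with a minimal interface in the spirit of the Basic Model Interface (BMI) that CSDMS components implement; the method names and toy models below are simplified assumptions for illustration, not the real BMI specification:

```python
# Minimal illustration of a "plug-and-play" component protocol in the
# spirit of CSDMS coupling; simplified assumption, not the BMI standard.
class Component:
    def initialize(self, config): ...
    def update(self, dt): ...
    def get_value(self, name): ...
    def set_value(self, name, value): ...
    def finalize(self): ...

class RainfallModel(Component):
    def initialize(self, config):
        self.rate = config.get("rate_mm_per_hr", 5.0)
    def get_value(self, name):
        assert name == "rainfall_rate"
        return self.rate

class RunoffModel(Component):
    def initialize(self, config):
        self.storage = 0.0
        self.rainfall = 0.0
    def set_value(self, name, value):
        assert name == "rainfall_rate"
        self.rainfall = value
    def update(self, dt):
        self.storage += self.rainfall * dt   # accumulate rainfall depth
    def get_value(self, name):
        return self.storage

# The driver couples the two components; neither knows about the other.
rain, runoff = RainfallModel(), RunoffModel()
rain.initialize({}); runoff.initialize({})
for _ in range(3):                           # three 1-hour steps
    runoff.set_value("rainfall_rate", rain.get_value("rainfall_rate"))
    runoff.update(1.0)
print(runoff.get_value("storage"))           # -> 15.0
```

The point of such a protocol is exactly what the abstract describes: once every model exposes the same initialize/update/get/set interface, a generic framework like CMT can wire them together without model-specific glue code.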

  16. Community Coordinated Modeling Center tools for solar and space physics higher education

    NASA Astrophysics Data System (ADS)

    Pulkkinen, A.; Kuznetsova, M. M.; Hesse, M.; Berrios, D.; Maddox, M. M.; Rastaetter, L.

    2011-12-01

    The Community Coordinated Modeling Center (CCMC) has developed over the years a comprehensive set of tools that are directly applicable to higher education. The tools range from a standard runs-on-request system providing access to state-of-the-art solar and space physics models to online tutorials on space weather analysis. CCMC has also directly supported various educational activities, such as the CISM and Heliophysics Summer Schools, for example by means of tailored simulations and web interfaces. In this paper we provide a brief overview of CCMC tools that can benefit a variety of educational efforts. We will discuss, for example, the usage of the integrated Space Weather Analysis (iSWA) system, the CCMC runs-on-request system, advanced visualization tools such as Space Weather Explorer 2, and other publicly available CCMC material in higher education. We will give explicit examples of some of our success stories and outline our envisioned path forward for higher education. The main portals to CCMC's educational material are ccmc.gsfc.nasa.gov and iswa.gsfc.nasa.gov.

  17. Development of 3D multimedia with advanced computer animation tools for outreach activities related to Meteor Science and Meteoritics

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.

    2012-09-01

    Documentaries related to Astronomy and Planetary Sciences are a common and very attractive way to promote public interest in these areas. These educational tools can benefit from new advanced computer animation software and 3D technologies, which allow making such documentaries even more attractive. However, special care must be taken to guarantee that the information contained in them is rigorous and objective. In this sense, additional value is gained when the footage is produced by the researchers themselves. With this aim, a new documentary produced and directed by Prof. Madiedo has been developed. The documentary, which has been created entirely by means of advanced computer animation tools, is dedicated to several aspects of Meteor Science and Meteoritics. The main features of this outreach and education initiative are presented here.

  18. Advanced air revitalization system modeling and testing

    NASA Technical Reports Server (NTRS)

    Dall-Baumann, Liese; Jeng, Frank; Christian, Steve; Edeer, Marybeth; Lin, Chin

    1990-01-01

    To support manned lunar and Martian exploration, an extensive evaluation of air revitalization subsystems (ARS) is being conducted. The major operations under study include carbon dioxide removal and reduction; oxygen and nitrogen production, storage, and distribution; humidity and temperature control; and trace contaminant control. A comprehensive analysis program based on a generalized block flow model was developed to facilitate the evaluation of various processes and their interaction. ASPEN PLUS was used in modelling carbon dioxide removal and reduction. Several life support test stands were developed to test new and existing technologies for their potential applicability in space. The goal was to identify processes which use compact, lightweight equipment and maximize the recovery of oxygen and water. The carbon dioxide removal test stands include solid amine/vacuum desorption (SAVD), regenerative silver oxide chemisorption, and electrochemical carbon dioxide concentration (EDC). Membrane-based carbon dioxide removal and humidity control, catalytic reduction of carbon dioxide, and catalytic oxidation of trace contaminants were also investigated.

  19. Integrating Cache Performance Modeling and Tuning Support in Parallelization Tools

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    With the resurgence of distributed shared memory (DSM) systems based on cache-coherent Non-Uniform Memory Access (ccNUMA) architectures and the increasing disparity between memory and processor speeds, data locality overheads are becoming the greatest bottleneck in the way of realizing the potential high performance of these systems. While parallelization tools and compilers help users port their sequential applications to a DSM system, considerable time and effort are needed to tune the memory performance of these applications to achieve reasonable speedup. In this paper, we show that integrating cache performance modeling and tuning support within a parallelization environment can alleviate this problem. The Cache Performance Modeling and Prediction Tool (CPMP) employs trace-driven simulation techniques without the overhead of generating and managing detailed address traces. CPMP predicts the cache performance impact of source-code-level "what-if" modifications in a program to assist the user in the tuning process. CPMP is built on top of a customized version of the Computer Aided Parallelization Tools (CAPTools) environment. Finally, we demonstrate how CPMP can be applied to tune a real Computational Fluid Dynamics (CFD) application.
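The kind of trace-driven "what-if" analysis described above can be sketched with a toy direct-mapped cache simulator; this illustrates the general technique only, not CPMP's actual implementation, and the cache parameters are arbitrary assumptions:

```python
def hit_rate(addresses, cache_lines=64, line_size=8):
    """Toy trace-driven simulation of a direct-mapped cache.
    Illustrates the general technique behind tools like CPMP,
    not CPMP's actual implementation."""
    tags = [None] * cache_lines
    hits = 0
    for addr in addresses:
        block = addr // line_size          # which cache block
        index = block % cache_lines        # which cache line it maps to
        if tags[index] == block:
            hits += 1
        else:
            tags[index] = block            # miss: fill the line
    return hits / len(addresses)

# "What-if": compare two loop orderings over an N x N array
N = 128
row_major = [i * N + j for i in range(N) for j in range(N)]  # stride-1
col_major = [i * N + j for j in range(N) for i in range(N)]  # stride-N
print(hit_rate(row_major), hit_rate(col_major))
```

The stride-1 traversal hits on 7 of every 8 accesses within a cache line, while the strided traversal touches a new line almost every access; comparing the two rates is exactly the kind of source-level "what-if" prediction the abstract describes.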

  20. Advanced Numerical Modeling of Turbulent Atmospheric Flows

    NASA Astrophysics Data System (ADS)

    Kühnlein, Christian; Dörnbrack, Andreas; Gerz, Thomas

    The present chapter introduces the method of computational simulation to predict and study turbulent atmospheric flows. This includes a description of the fundamental approach to computational simulation and the practical implementation using the technique of large-eddy simulation. In addition, selected contributions from IPA scientists to computational model development and various examples for applications are given. These examples include homogeneous turbulence, convective boundary layers, heated forest canopy, buoyant thermals, and large-scale flows with baroclinic wave instability.

  1. Logic flowgraph methodology - A tool for modeling embedded systems

    NASA Technical Reports Server (NTRS)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of using such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  2. Computational science: shifting the focus from tools to models

    PubMed Central

    Hinsen, Konrad

    2014-01-01

    Computational techniques have revolutionized many aspects of scientific research over the last few decades. Experimentalists use computation for data analysis, processing ever bigger data sets. Theoreticians compute predictions from ever more complex models. However, traditional articles do not permit the publication of big data sets or complex models. As a consequence, these crucial pieces of information no longer enter the scientific record. Moreover, they have become prisoners of scientific software: many models exist only as software implementations, and the data are often stored in proprietary formats defined by the software. In this article, I argue that this emphasis on software tools over models and data is detrimental to science in the long term, and I propose a means by which this can be reversed. PMID:25309728

  3. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    PubMed Central

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science. PMID:26795950

  4. Diagnostic tools for mixing models of stream water chemistry

    USGS Publications Warehouse

    Hooper, R.P.

    2003-01-01

    Mixing models provide a useful null hypothesis against which to evaluate processes controlling stream water chemical data. Because conservative mixing of end-members with constant concentration is a linear process, a number of simple mathematical and multivariate statistical methods can be applied to this problem. Although mixing models have been most typically used in the context of mixing soil and groundwater end-members, an extension of the mathematics of mixing models is presented that assesses the "fit" of a multivariate data set to a lower dimensional mixing subspace without the need for explicitly identified end-members. Diagnostic tools are developed to determine the approximate rank of the data set and to assess lack of fit of the data. This permits identification of processes that violate the assumptions of the mixing model and can suggest the dominant processes controlling stream water chemical variation. These same diagnostic tools can be used to assess the fit of the chemistry of one site into the mixing subspace of a different site, thereby permitting an assessment of the consistency of controlling end-members across sites. This technique is applied to a number of sites at the Panola Mountain Research Watershed located near Atlanta, Georgia.
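The rank-and-residual diagnostics described can be sketched with a principal-component projection onto a lower-dimensional mixing subspace; the code below is a generic illustration on synthetic data, not Hooper's exact procedure:

```python
import numpy as np

def mixing_diagnostics(X, n_components):
    """Project stream chemistry data onto a lower-dimensional mixing
    subspace and return residuals -- a sketch of the general
    eigenvector technique, not the paper's exact procedure."""
    Xc = X - X.mean(axis=0)
    # Principal directions of the solute covariance structure
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T            # basis of the mixing subspace
    X_proj = Xc @ V @ V.T              # orthogonal projection onto it
    residuals = Xc - X_proj
    # Relative RMSE per solute; structure in these residuals suggests
    # a violated mixing assumption or a missing end-member
    rrmse = np.sqrt((residuals ** 2).mean(axis=0)) / X.std(axis=0)
    return residuals, rrmse

# Synthetic example: three "solutes" generated from two end-members,
# so the data should fit a rank-1 (two end-member) mixing subspace
rng = np.random.default_rng(0)
f = rng.uniform(0, 1, size=(200, 1))                  # mixing fraction
em = np.array([[10.0, 1.0, 5.0], [2.0, 8.0, 3.0]])    # end-member chemistry
X = f @ em[:1] + (1 - f) @ em[1:] + rng.normal(0, 0.01, (200, 3))
_, rrmse = mixing_diagnostics(X, n_components=1)
print(rrmse.max())   # near zero: conservative 2-end-member mixing fits
```

Because conservative mixing is linear, lack of fit at a chosen rank is direct evidence that something other than mixing of fixed-concentration end-members controls the chemistry, which is the null-hypothesis test the abstract describes.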

  5. Homology Modeling a Fast Tool for Drug Discovery: Current Perspectives

    PubMed Central

    Vyas, V. K.; Ukawala, R. D.; Ghate, M.; Chintha, C.

    2012-01-01

    A major goal of structural biology is the characterization of protein-ligand complexes, in which the protein molecules act energetically in the course of binding. An understanding of protein-ligand interaction is therefore very important for structure-based drug design. Lack of knowledge of 3D structures has hindered efforts to understand the binding specificities of ligands with proteins. With improvements in modeling software and the growing number of known protein structures, homology modeling is rapidly becoming the method of choice for obtaining the 3D coordinates of proteins. Homology modeling exploits the similarity of environmental residues at topologically corresponding positions in reference proteins. In the absence of experimental data, model building on the basis of a known 3D structure of a homologous protein is at present the only reliable method to obtain structural information. Knowledge of the 3D structures of proteins provides invaluable insights into the molecular basis of their functions. Recent advances in homology modeling, particularly in detecting and aligning sequences with template structures, modeling distant homologues, loops and side chains, and detecting errors in a model, have contributed to consistent prediction of protein structure, which was not possible even several years ago. This review focuses on the features and role of homology modeling in predicting protein structure and describes current developments in this field, with successful applications at different stages of drug design and discovery. PMID:23204616

  6. Recent advances in modeling of well hydraulics

    NASA Astrophysics Data System (ADS)

    Yeh, Hund-Der; Chang, Ya-Chi

    2013-01-01

    Well hydraulics is the discipline concerned with understanding the flow toward a well in an aquifer, the aquifer being regarded as a source of groundwater. A variety of analytical and numerical models have been developed over the last few decades to provide a framework for understanding and quantifying flow behavior in aquifer systems. In this review, we first briefly introduce the background of the theory of well hydraulics and the concepts, methodologies, and applications of analytical, semi-analytical, numerical and approximate methods for solving well-hydraulic problems. We then address subjects of current interest, such as the incorporation of the effects of finite well radius, wellbore storage, partial well penetration, and the presence of skin into various practical problems of groundwater flow. Furthermore, we summarize recent developments in flow modeling, such as flow in aquifers with horizontal or collector wells, capture zone delineation, and non-Darcian flow in porous media and fractured formations. Finally, we present a comprehensive review of the numerical calculation of five well functions frequently appearing in the well-hydraulic literature and suggest some topics in groundwater flow for future research.
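As a concrete example of the well functions discussed, the classical Theis well function W(u) (the exponential integral E1) can be evaluated from its convergent series and used to compute drawdown; the parameter values below are illustrative assumptions only:

```python
import math

def theis_W(u, terms=30):
    """Theis well function W(u) = E1(u), evaluated by its convergent
    series (accurate for small to moderate u); textbook formula."""
    gamma = 0.5772156649015329            # Euler-Mascheroni constant
    total = -gamma - math.log(u)
    sign, fact = 1.0, 1.0
    for n in range(1, terms + 1):
        fact *= n                          # n!
        total += sign * u ** n / (n * fact)
        sign = -sign
    return total

def theis_drawdown(Q, T, S, r, t):
    """Drawdown s = Q / (4*pi*T) * W(u), with u = r^2 * S / (4*T*t)."""
    u = r * r * S / (4.0 * T * t)
    return Q / (4.0 * math.pi * T) * theis_W(u)

# Illustrative (assumed) values: pumping rate Q = 0.01 m^3/s,
# transmissivity T = 1e-3 m^2/s, storativity S = 1e-4,
# observation radius r = 50 m, time t = 1 day
print(round(theis_drawdown(0.01, 1e-3, 1e-4, 50.0, 86400.0), 3))  # -> 5.296
```

Effects such as finite well radius, wellbore storage, and skin mentioned in the review replace this simple W(u) with more elaborate well functions, which is why their efficient numerical evaluation is a research topic in its own right.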

  7. FAST: A Fuel And Sheath Modeling Tool for CANDU Reactor Fuel

    NASA Astrophysics Data System (ADS)

    Prudil, Andrew Albert

    Understanding the behaviour of nuclear fuel during irradiation is a complicated multiphysics problem involving neutronics, chemistry, radiation physics, material science, solid mechanics, heat transfer and thermal-hydraulics. Due to the complexity and interdependence of the physics and models involved, fuel modeling is typically done with numerical models. Advancements in both computer hardware and software have made possible new, more complex and sophisticated fuel modeling codes. The Fuel And Sheath modelling Tool (FAST) is a fuel performance code that has been developed for modeling nuclear fuel behaviour under normal and transient conditions. The FAST code includes models for heat generation and transport, thermal expansion, elastic strain, densification, fission product swelling, pellet relocation, contact, grain growth, fission gas release, gas and coolant pressure and sheath creep. These models are coupled and solved numerically using the Comsol Multiphysics finite-element platform. The model utilizes a radial-axial geometry of a fuel pellet (including dishing and chamfering) and the accompanying fuel sheath, allowing the model to predict circumferential ridging. This model has evolved from previous treatments developed at the Royal Military College. The model has now been significantly advanced to include: a more detailed pellet geometry, localized pellet-to-sheath gap size and contact pressure, the ability to model cracked pellets, localized fuel burnup for material property models, improved UO2 densification behaviour, a fully 2-dimensional model for the sheath, additional creep models, additional material models, an FEM Booth-diffusion model for fission gas release (including the ability to model temperature and power changes), a capability for end-of-life predictions, the ability to utilize text files as model inputs, and a first-time integration of normal operating conditions (NOC) and transient fuel models into a single code (which has never been achieved

  8. Advances in Chimera Grid Tools for Multi-Body Dynamics Simulations and Script Creation

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    This viewgraph presentation contains information about: (1) a framework for multi-body dynamics, the Geometry Manipulation Protocol (GMP); (2) the simulation procedure using Chimera Grid Tools (CGT) and OVERFLOW-2; (3) further recent developments in Chimera Grid Tools (OVERGRID, grid modules, script library); and (4) future work.

  9. Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)

    2001-01-01

    The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Elastic-plastic fracture analysis is a regime that currently lacks analytical tools and is handled empirically for the Space Shuttle External Tank (ET) by simulated service testing of pre-cracked panels.

  10. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    ERIC Educational Resources Information Center

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  11. Synergia: an accelerator modeling tool with 3-D space charge

    SciTech Connect

    Amundson, James F.; Spentzouris, P.; Qiang, J.; Ryne, R. (LBL, Berkeley)

    2004-07-01

    High precision modeling of space-charge effects, together with accurate treatment of single-particle dynamics, is essential for designing future accelerators as well as optimizing the performance of existing machines. We describe Synergia, a high-fidelity parallel beam dynamics simulation package with fully three dimensional space-charge capabilities and a higher order optics implementation. We describe the computational techniques, the advanced human interface, and the parallel performance obtained using large numbers of macroparticles. We also perform code benchmarks comparing to semi-analytic results and other codes. Finally, we present initial results on particle tune spread, beam halo creation, and emittance growth in the Fermilab booster accelerator.

  12. Modeling of advanced ECLSS/ARS with ASPEN

    NASA Technical Reports Server (NTRS)

    Kolodney, M.; Lange, K. E.; Edeen, M. A.

    1991-01-01

    System-level ASPEN models were developed for the CO2 partial reduction subsystem and a bioregenerative life support subsystem (BRLSS). The individual component and subsystem models were integrated into three different system-level atmospheric revitalization subsystem (ARS) models: baseline physico-chemical, BRLSS, and partial reduction of Martian CO2. The Aspen models were based on FORTRAN interfaces necessary for integration with another program, G189A, to perform quasi-transient modeling. Detailed reactor models were prepared for the two CO2 reduction reactors (Bosch and Advanced Carbon Formation), and the low-temperature trace contaminant oxidation reactor.

  13. Advanced modeling and simulation to design and manufacture high performance and reliable advanced microelectronics and microsystems.

    SciTech Connect

    Nettleship, Ian (University of Pittsburgh, Pittsburgh, PA); Hinklin, Thomas; Holcomb, David Joseph; Tandon, Rajan; Arguello, Jose Guadalupe, Jr.; Dempsey, James Franklin; Ewsuk, Kevin Gregory; Neilsen, Michael K.; Lanagan, Michael (Pennsylvania State University, University Park, PA)

    2007-07-01

    An interdisciplinary team of scientists and engineers having broad expertise in materials processing and properties, materials characterization, and computational mechanics was assembled to develop science-based modeling/simulation technology to design and reproducibly manufacture high-performance, reliable, complex microelectronics and microsystems. The team's efforts focused on defining and developing a science-based infrastructure to enable predictive compaction, sintering, stress, and thermomechanical modeling in "real systems", including: (1) developing techniques to determine the materials properties and constitutive behavior required for modeling; (2) developing new, improved/updated models and modeling capabilities; (3) ensuring that models are representative of the physical phenomena being simulated; and (4) assessing existing modeling capabilities to identify advances necessary to facilitate the practical application of Sandia's predictive modeling technology.

  14. Psychiatric symptoms and disorders associated with reproductive cyclicity in women: advances in screening tools.

    PubMed

    Hall, Elise; Steiner, Meir

    2015-06-01

    Female-specific psychiatric illness including premenstrual dysphoria, perinatal depression, and psychopathology related to the perimenopausal period is often underdiagnosed and undertreated. These conditions can negatively affect the quality of life for women and their families. The development of screening tools has helped guide our understanding of these conditions. There is a wide disparity in the methods, definitions, and tools used in studies relevant to female-specific psychiatric illness. As a result, there is no consensus on one tool that is most appropriate for use in a research or clinical setting. In reviewing this topic, we hope to highlight the evolution of various tools as they have built on preexisting instruments and to identify the psychometric properties and clinical applicability of available tools. It would be valuable for researchers to reach a consensus on a core set of screening instruments specific to female psychopathology to gain consistency within and between clinical settings. PMID:26102476

  15. Dynamical aspects in modeling long cantilevering workpieces in tool grinding

    NASA Astrophysics Data System (ADS)

    de Payrebrune, K. M.; Kröger, M.

    2015-10-01

    Tool grinding is a complex process in which the temporal dynamics of the workpiece and grinding wheel, and the material removal process itself, affect the quality of the workpiece. Many existing models already provide the option to study the dynamics of the workpiece and grinding wheel, or cutting forces and material removal processes, but mostly do not combine these aspects. Here, workpiece dynamics are studied in relation to the workpiece's structural and geometrical properties, which change during machining, and are used to simulate the vibrations and deformation of the workpiece during grinding. In combination with models for the grinding wheel and the material removal process, the influence of workpiece dynamics on workpiece quality is studied, and results from this hybrid model are in excellent agreement with empirical measurements. Furthermore, the results demonstrate the significant effect of workpiece deformations on the final geometry.

  16. Fuzzy regression modeling for tool performance prediction and degradation detection.

    PubMed

    Li, X; Er, M J; Lim, B S; Zhou, J H; Gan, O P; Rutkowski, L

    2010-10-01

    In this paper, the viability of using the Fuzzy-Rule-Based Regression Modeling (FRM) algorithm for tool performance and degradation detection is investigated. The FRM is developed based on a multi-layered fuzzy-rule-based hybrid system with Multiple Regression Models (MRM) embedded into a fuzzy logic inference engine that employs Self-Organizing Maps (SOM) for clustering. The FRM converts a complex nonlinear problem to a simplified linear format in order to further increase the accuracy in prediction and the rate of convergence. The efficacy of the proposed FRM is tested through a case study, namely predicting the remaining useful life of a ball nose milling cutter during a dry machining process of hardened tool steel with a hardness of 52-54 HRC. A comparative study is further made between four predictive models using the same set of experimental data. It is shown that the FRM is superior to conventional MRM, Back Propagation Neural Networks (BPNN) and Radial Basis Function Networks (RBFN) in terms of prediction accuracy and learning speed. PMID:20945519
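The cluster-then-regress architecture behind FRM (SOM clustering feeding multiple local regression models) can be sketched in simplified form; the code below substitutes plain k-means for the SOM and uses synthetic two-regime wear data, so it is an illustration of the architecture, not the published algorithm:

```python
import numpy as np

def fit_clustered_regression(X, y, k=2, iters=20):
    """Cluster the input space (a stand-in for the SOM stage), then fit
    one linear regression per cluster -- a simplified sketch of the
    multi-model idea behind FRM, not the published algorithm."""
    centers = X[[0, -1]].astype(float)          # deterministic k=2 init
    for _ in range(iters):                      # plain k-means (Lloyd)
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    models = []
    for j in range(k):                          # least-squares fit per cluster
        A = np.c_[X[labels == j], np.ones((labels == j).sum())]
        coef, *_ = np.linalg.lstsq(A, y[labels == j], rcond=None)
        models.append(coef)
    return centers, models

def predict(x, centers, models):
    j = int(np.argmin(((centers - x) ** 2).sum(axis=1)))  # nearest cluster
    return float(np.r_[x, 1.0] @ models[j])               # its local model

# Synthetic tool-wear data: wear grows slowly, then fast (two regimes)
t = np.linspace(0, 10, 100)[:, None]
wear = np.where(t[:, 0] < 5, 0.5 * t[:, 0], 2.5 + 2.0 * (t[:, 0] - 5))
centers, models = fit_clustered_regression(t, wear, k=2)
print(round(predict(np.array([8.0]), centers, models), 2))  # -> 8.5
```

The design point this illustrates is the one the abstract claims for FRM: partitioning a nonlinear problem into regions where a simple linear model is adequate, trading one hard global fit for several easy local ones.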

  17. Lagrangian modelling tool for IAGOS database added-value products

    NASA Astrophysics Data System (ADS)

    Fontaine, Alain; Auby, Antoine; Petetin, Hervé; Sauvage, Bastien; Thouret, Valérie; Boulanger, Damien

    2015-04-01

    Since 1994, the IAGOS (In-Service Aircraft for a Global Observing System, http://www.iagos.fr) project has produced in-situ measurements of chemical species such as ozone, carbon monoxide and nitrogen oxides through more than 40000 commercial aircraft flights. In order to help analyse these observations, a tool which links the observed pollutants to their sources was developed, based on the Stohl et al. (2003) methodology. Built on the Lagrangian particle dispersion model FLEXPART coupled with ECMWF meteorological fields, this tool simulates the contributions of anthropogenic and biomass burning emissions from the ECCAD database to the measured carbon monoxide mixing ratio along each IAGOS flight. Through automated processes, 20-day backward simulations are run from each observation, separating the individual contributions from the different source regions. The main goal is to supply added-value products to the IAGOS database showing the geographical origin and emission type of pollutants, and to link trends in atmospheric composition to changes in transport pathways and in the evolution of emissions. This tool may also be used for statistical validation in intercomparisons of emission inventories, which can be compared against the in-situ observations from the IAGOS database.

  18. A tool for modeling concurrent real-time computation

    NASA Technical Reports Server (NTRS)

    Sharma, D. D.; Huang, Shie-Rei; Bhatt, Rahul; Sridharan, N. S.

    1990-01-01

    Real-time computation is a significant area of research in general, and in AI in particular. The complexity of practical real-time problems demands the use of knowledge-based problem-solving techniques while satisfying real-time performance constraints. Since the demands of a complex real-time problem cannot be predicted (owing to the dynamic nature of the environment), powerful dynamic resource control techniques are needed to monitor and control performance. A real-time computation model for a real-time tool, an implementation of the QP-Net simulator on a Symbolics machine, and an implementation on a Butterfly multiprocessor machine are briefly described.

  19. ADVANCED TOOLS FOR ASSESSING SELECTED PRESCRIPTION AND ILLICIT DRUGS IN TREATED SEWAGE EFFLUENTS AND SOURCE WATERS

    EPA Science Inventory

    The purpose of this poster is to present the application and assessment of advanced technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs (azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, and methylenedioxy...

  20. Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools

    NASA Astrophysics Data System (ADS)

    Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.

    2011-12-01

    Sophisticated 3D models of particle acceleration and transport in solar flares, both currently available and forthcoming, require a new level of user-friendly visualization and analysis tools that allow quick and easy adjustment of model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc.). We report the current state of the art of these tools in development, already proven highly efficient for direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. The various tools provided allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating at user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/nonthermal particle distribution models. By default, the application integrates IDL-callable DLLs and shared libraries containing fast GS emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows interchanging these default libraries with any user-defined IDL or external callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the tool's capacity and generality, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by the NORH and RHESSI instruments. We discuss further anticipated developments of the tools needed to accommodate

  1. Advanced techniques in reliability model representation and solution

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Nicol, David M.

    1992-01-01

    The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.
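    The kind of transient Markov analysis that tools like SURE/ASSURE automate can be illustrated on a toy chain. The model below (a triplex processor that fails once two of three units have failed) and its failure rate are invented for the example; the solution method is plain explicit Euler on the Chapman-Kolmogorov equations, not the semi-Markov SURE algorithm itself.

```python
import numpy as np

LAMBDA = 1e-4  # per-unit failure rate (1/hr), illustrative only

def unreliability(t_hours, steps=10000):
    """Probability the system has failed by time t.

    States: 0 = three units good, 1 = two good, 2 = failed (absorbing).
    Integrates dp/dt = p Q with explicit Euler.
    """
    # Generator matrix Q: row i holds the outflow/inflow rates for state i.
    Q = np.array([
        [-3 * LAMBDA, 3 * LAMBDA, 0.0],   # any of 3 units fails
        [0.0, -2 * LAMBDA, 2 * LAMBDA],   # any of 2 remaining fails -> system loss
        [0.0, 0.0, 0.0],                  # absorbing failure state
    ])
    p = np.array([1.0, 0.0, 0.0])  # start with all units good
    dt = t_hours / steps
    for _ in range(steps):
        p = p + dt * (p @ Q)
    return p[2]

u = unreliability(10.0)  # unreliability over a 10-hour mission
```

    For small rates the result matches the closed form 1 - 3e^(-2*lambda*t) + 2e^(-3*lambda*t), about 3e-6 here; real flight-control models have vastly larger state spaces, which is why the generation (RMG) and parallel solution (ASSURE) steps matter.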

  2. Advanced modeling to accelerate the scale up of carbon capture technologies

    SciTech Connect

    Miller, David C.; Sun, Xin; Storlie, Curtis B.; Bhattacharyya, Debangsu

    2015-06-01

    In order to help meet the goals of the DOE carbon capture program, the Carbon Capture Simulation Initiative (CCSI) was launched in early 2011 to develop, demonstrate, and deploy advanced computational tools and validated multi-scale models to reduce the time required to develop and scale-up new carbon capture technologies. This article focuses on essential elements related to the development and validation of multi-scale models in order to help minimize risk and maximize learning as new technologies progress from pilot to demonstration scale.

  3. Application of the GEM Inventory Data Capture Tools for Dynamic Vulnerability Assessment and Recovery Modelling

    NASA Astrophysics Data System (ADS)

    Verrucci, Enrica; Bevington, John; Vicini, Alessandro

    2014-05-01

    A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010-13 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate data derived from remotely-sensed imagery and statistically-sampled in-situ field data of buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment, and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes" - statistically-inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publicly available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project. These are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of 1) dynamic vulnerability assessment (pre-event) and 2) recovery monitoring and evaluation (post-event) are discussed, along with strategies for using the IDC Tools for these purposes. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Originally designed for seismic risk assessment, the IDCT tools clearly have relevance for multi-hazard risk assessment. When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated from the tools can reveal spatio-temporal patterns in the quality of recovery activities and resilience trends can be

  4. ADAPTATION OF THE ADVANCED STATISTICAL TRAJECTORY REGIONAL AIR POLLUTION (ASTRAP) MODEL TO THE EPA VAX COMPUTER - MODIFICATIONS AND TESTING

    EPA Science Inventory

    The Advanced Statistical Trajectory Regional Air Pollution (ASTRAP) model simulates long-term transport and deposition of oxides of sulfur and nitrogen. It is a potential screening tool for assessing long-term effects on regional visibility from sulfur emission sources. However, a rigorou...

  5. ISRU System Model Tool: From Excavation to Oxygen Production

    NASA Technical Reports Server (NTRS)

    Santiago-Maldonado, Edgardo; Linne, Diane L.

    2007-01-01

    In the late 80's, conceptual designs for an in situ oxygen production plant were documented in a study by Eagle Engineering [1]. In the "Summary of Findings" of this study, it is clearly pointed out that "reported process mass and power estimates lack a consistent basis to allow comparison." The study goes on to say: "A study to produce a set of process mass, power, and volume requirements on a consistent basis is recommended." Today, approximately twenty years later, as humans plan to return to the moon and venture beyond, the need for flexible, up-to-date models of the oxygen extraction and production process has become even clearer. Multiple processes for the production of oxygen from lunar regolith are being investigated by NASA, academia, and industry. Three processes that have shown technical merit are molten regolith electrolysis, hydrogen reduction, and carbothermal reduction. These processes have been selected by NASA as the basis for the development of the ISRU System Model Tool (ISMT). In developing up-to-date system models for these processes, NASA hopes to accomplish the following: (1) help in the evaluation process to select the most cost-effective and efficient process for further prototype development, (2) identify key parameters, (3) optimize the excavation and oxygen production processes, and (4) provide estimates of energy and power requirements, mass and volume of the system, oxygen production rate, mass of regolith required, mass of consumables, and other important parameters. Also, as confidence and high fidelity are achieved with each component's model, new techniques and processes can be introduced and analyzed at a fraction of the cost of traditional hardware development and test approaches. A first-generation ISRU System Model Tool has been used to provide inputs to the Lunar Architecture Team studies.

  6. Parameterized locally invariant manifolds: A tool for multiscale modeling

    NASA Astrophysics Data System (ADS)

    Sawant, Aarti

    In this thesis, two methods for coarse graining in nonlinear ODE systems are demonstrated through the analysis of model problems. The basic ideas of a method for model reduction and a method for non-asymptotic time-averaging are presented, using the idea of Parameterized Locally Invariant Manifolds (PLIM). New approximation techniques for carrying out this methodology are developed. The work is divided into four categories based on the type of coarse-graining used: reduction of degrees of freedom, spatial averaging, time averaging, and a combination of space and time averaging. Model problems showing complex dynamics are selected and various features of the PLIM method are elaborated. The quality and efficiency of the different coarse-graining approaches are evaluated. From the computational standpoint, it is shown that the method has the potential to serve as a subgrid modeling tool for problems in engineering. The developed ideas are evaluated on the following model problems: the Lorenz system; a 4D Hamiltonian system due to Hald; 1D elastodynamics in a strongly heterogeneous medium; the kinetics of a phase-transforming material with wiggly energy due to Abeyaratne, Chu and James; a 2D gradient system with wiggly energy due to Menon; and the macroscopic stress-strain behavior of an atomic chain based on the Frenkel-Kontorova model.

  7. The role of advanced engineering simulation in model-based design

    SciTech Connect

    Hommert, P.J.; Biffle, J.H.

    1995-03-01

    The agile manufacturing paradigm engenders many new concepts and work approaches for manufacturing operations. A technology often invoked in the concept of agility is modeling and simulation. Few would disagree that modeling and simulation hold the potential to substantially reduce the product development cycle and lead to improved product reliability and performance. Advanced engineering simulation can impact manufacturing in three areas: process design, product design, and process control. However, despite that promise, the routine utilization of modeling and simulation by industry within the design process is very limited. Advanced simulation is still used primarily in a troubleshooting mode, examining design or process problems after the fact. Sandia National Laboratories has been engaged in the development of advanced engineering simulation tools for many years and more recently has begun to focus on the application of such models to manufacturing processes important to the defense industry. These efforts involve considerable interaction and cooperative research with US industry. Based upon this experience, this presentation examines the elements that are necessary for advanced engineering simulation to become an integral part of the design process.

  8. Advancing lighting and daylighting simulation: The transition from analysis to design aid tools

    SciTech Connect

    Hitchcock, R.J.

    1995-05-01

    This paper explores three significant software development requirements for making the transition from stand-alone lighting simulation/analysis tools to simulation-based design aid tools. These requirements include specialized lighting simulation engines, facilitated methods for creating detailed simulatable building descriptions, and automated techniques for providing lighting design guidance. Initial computer implementations meant to address each of these requirements are discussed to further elaborate the requirements and to illustrate work in progress.

  9. Advanced repair solution of clear defects on HTPSM by using nanomachining tool

    NASA Astrophysics Data System (ADS)

    Lee, Hyemi; Kim, Munsik; Jung, Hoyong; Kim, Sangpyo; Yim, Donggyu

    2015-10-01

    As mask specifications become tighter for low-k1 lithography, more aggressive repair accuracy is required below the sub-20nm technology node. To meet tight defect specifications, many maskshops select repair tools according to defect type. Normally, pattern defects are repaired by the e-beam repair tool, and soft defects such as particles are repaired by the nanomachining tool. It is difficult for an e-beam repair tool to remove particle defects because it relies on a chemical reaction between gas and electrons, while a nanomachining tool, which uses physical contact between a nano-tip and defects, cannot be applied to repairing clear defects. Generally, a film deposition process is widely used for repairing clear defects. However, the deposited film has weak cleaning durability, so it is easily removed by accumulated cleaning processes. Although the deposited film adheres strongly to the MoSiN (or Qz) film, the adhesive strength between the deposited Cr film and the MoSiN (or Qz) film weakens progressively as energy accumulates while masks are exposed in a scanner tool, owing to the different coefficients of thermal expansion of the materials. Therefore, whenever a mask needs a re-pellicle process, every deposited repair point has to be inspected to confirm whether the deposited film is damaged; if a deposition point is damaged, the repair process must be performed again. This makes the overall process longer and more complex. In this paper, the basic theory and principle of recovering clear defects using a nanomachining tool are introduced, and the evaluated results are reviewed for dense line (L/S) patterns and contact hole (C/H) patterns. The results using nanomachining were also compared with those using an e-beam repair tool, including the cleaning durability evaluated by accumulated cleaning processes. In addition, we discuss the phase shift issue and the solution to the image placement error caused by phase error.

  10. The Aerosol Modeling Testbed: A community tool to objectively evaluate aerosol process modules

    SciTech Connect

    Fast, Jerome D.; Gustafson, William I.; Chapman, Elaine G.; Easter, Richard C.; Rishel, Jeremy P.; Zaveri, Rahul A.; Grell, Georg; Barth, Mary

    2011-03-02

    This study describes a new modeling paradigm that significantly advances how the third activity is conducted while also fully exploiting data and findings from the first two activities. The Aerosol Modeling Testbed (AMT) is a computational framework for the atmospheric sciences community that streamlines the process of testing and evaluating aerosol process modules over a wide range of spatial and temporal scales. The AMT consists of a fully-coupled meteorology-chemistry-aerosol model and a suite of tools to evaluate the performance of aerosol process modules via comparison with a wide range of field measurements. The philosophy of the AMT is to systematically and objectively evaluate aerosol process modules over local to regional spatial scales that are compatible with most field campaign measurement strategies. The performance of new treatments can then be quantified and compared with existing treatments before they are incorporated into regional and global climate models. Since the AMT is a community tool, it also provides a means of enhancing collaboration and coordination among aerosol modelers.

  11. Innovative Tools and Systems Addressing Space Weather Needs Developed By the Community Coordinated Modeling Center (CCMC)

    NASA Astrophysics Data System (ADS)

    Maddox, M. M.; Wiegand, C.; Mullinix, R.; Mays, M. L.; Chulaki, A.; Kuznetsova, M. M.; Pulkkinen, A. A.; Zheng, Y.

    2014-12-01

    The Community Coordinated Modeling Center (CCMC) at NASA Goddard Space Flight Center has always been a pioneer in utilizing and developing innovative systems and tools to address the needs of the space weather community. This paper introduces some of our cutting-edge systems and tools that are available to everyone in the community. An important objective of the CCMC is to prototype, validate, and compare various methods for CME arrival prediction. As such, CCMC has developed three web-based CME-specific tools with the goal of facilitating advanced analysis and collaboration within the space weather community. The three tools we highlight in this abstract are the Stereoscopic CME Analysis Tool (StereoCAT), WSA-ENLIL+Cone Fast Track, and the Space Weather Scoreboard. These three tools allow making CME measurements, executing space weather simulations in near real-time, and providing a systematic way for the scientific community to record and compare predictions both prior to and after CME arrivals near Earth. In order to address the space weather needs of NASA missions and encourage collaboration between various groups, CCMC has developed a web-based system called the Space Weather Database Of Notifications, Knowledge, Information (SW DONKI). SW DONKI serves as an archive of all space weather activities, including flares, CMEs (including simulations), SEPs, and geomagnetic storms. An innovative feature of the system is the ability to generate, modify, and store complex linkages between activities - creating a comprehensive network of relationships between activities and identifying potential cause-and-effect paradigms for each space weather "event". SW DONKI also provides public access to all human-generated event analysis and other notifications produced by the Space Weather Research Center (SWRC) forecasting team.

  12. Google Earth as a tool in 2-D hydrodynamic modeling

    NASA Astrophysics Data System (ADS)

    Chien, Nguyen Quang; Keat Tan, Soon

    2011-01-01

    A method for coupling virtual globes with geophysical hydrodynamic models is presented. Virtual globes such as Google Earth can be used as a visualization tool to help users create and enter input data. The authors discuss techniques for representing linear and areal geographical objects with KML (Keyhole Markup Language) files generated by computer codes (scripts). Although virtual globes offer very limited tools for data input, some data of categorical or vector type can be entered by users and then transformed into inputs for the hydrodynamic program using appropriate scripts. An application with the AnuGA hydrodynamic model illustrates the method. First, users draw polygons on the Google Earth screen. These features are then saved in a KML file, which is read by a script written in the Lua programming language. After the hydrodynamic simulation has been performed, another script converts the resulting output text file to a KML file for visualization, where inundation depths are represented by the color of discrete point icons. The visualization of a wind speed vector field is also included as a supplementary example.
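    The post-processing step the abstract describes (turning model output points into depth-colored KML placemarks) might look like the following sketch. The paper uses Lua scripts; Python is used here, and the style names, depth bins, colors, and coordinates are all illustrative, not taken from the paper.

```python
# Convert (lon, lat, depth) model output into a KML file of colored point
# placemarks for Google Earth. Bins and style ids are invented for the example.

def depth_style(depth):
    """Map inundation depth (m) to a KML style id (illustrative bins)."""
    if depth < 0.5:
        return "shallow"
    elif depth < 1.5:
        return "medium"
    return "deep"

def to_kml(points):
    """points: iterable of (lon, lat, depth) tuples; returns a KML string."""
    # KML colors are aabbggrr hex: blue, yellow, red here.
    styles = {"shallow": "ffff0000", "medium": "ff00ffff", "deep": "ff0000ff"}
    parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>']
    for name, color in styles.items():
        parts.append(f'<Style id="{name}"><IconStyle>'
                     f'<color>{color}</color></IconStyle></Style>')
    for lon, lat, depth in points:
        parts.append(
            f'<Placemark><styleUrl>#{depth_style(depth)}</styleUrl>'
            f'<Point><coordinates>{lon},{lat},0</coordinates></Point>'
            f'</Placemark>')
    parts.append('</Document></kml>')
    return "\n".join(parts)

kml = to_kml([(103.85, 1.29, 0.3), (103.86, 1.30, 2.1)])
```

    Writing the returned string to a `.kml` file and opening it in Google Earth would display one icon per model output point, colored by depth bin.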

  13. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    NASA Astrophysics Data System (ADS)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes, and equipment. A process simulator such as Aspen HYSYS, used for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander, and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum parameter values for maximizing the liquefaction yield of the plant under the constraints imposed by the other parameters. The analysis results give a clear basis for deciding parameter values before the actual plant is implemented in the field. They also indicate the productivity and profitability of the given plant configuration, leading to the design of an efficient, productive plant.
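    As a sanity check on such a simulation, the liquid yield of an ideal Linde-Hampson cycle follows from a steady-state enthalpy balance around the heat exchanger, Joule-Thomson valve, and liquid separator: y = (h1 - h2) / (h1 - hf). The enthalpy values below are illustrative round numbers for air, not HYSYS results.

```python
# Back-of-envelope energy balance for the Linde-Hampson liquefaction cycle.

def linde_yield(h1, h2, hf):
    """Fraction of circulating gas liquefied per pass.

    h1: enthalpy of low-pressure gas leaving the warm end (kJ/kg)
    h2: enthalpy of high-pressure gas entering the warm end (kJ/kg)
    hf: enthalpy of saturated liquid in the separator (kJ/kg)
    """
    return (h1 - h2) / (h1 - hf)

# Illustrative numbers for air at ambient temperature, ~200 bar high pressure.
y = linde_yield(h1=430.0, h2=395.0, hf=92.0)  # roughly 0.10 for these values
```

    A simulator like HYSYS refines this by using real-fluid enthalpies and finite heat-exchanger effectiveness, but the yield it reports should stay consistent with this balance.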

  14. Computational tool for modeling and simulation of mechanically ventilated patients.

    PubMed

    Serna, Leidy Y; Hernandez, Alher M; Mananas, Miguel A

    2010-01-01

    Setting a mechanical ventilator for patients with respiratory diseases such as chronic obstructive pulmonary disease (COPD) during episodes of acute respiratory failure (ARF) is not a simple task, and in most cases success depends on the experience of physicians. This paper describes an interactive tool based on mathematical models, developed to make it easier to study the interaction between a mechanical ventilator and a patient. It describes all stages of system development, including the simulated ventilatory modes, the pathologies of interest, and the interaction between the user and the system through a graphical interface developed in Matlab and Simulink. The computational tool allows the study of the most widely used ventilatory modes and their advantages in the treatment of different kinds of patients. The graphical interface displays all variables and parameters in the same way last-generation mechanical ventilators do, and it is fully interactive, making it usable by clinical personnel while hiding the complexity of the underlying mathematical models from the user. Evaluation in different simulated clinical scenarios agrees well with recent findings in the mechanical ventilation literature. PMID:21096101
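    A minimal building block of such a simulator is the single-compartment lung model, in which airway pressure during a constant-flow, volume-controlled breath is P = R*Q + V/C + PEEP. The sketch below uses illustrative "COPD-like" parameter values, not ones from the paper, whose models are more elaborate.

```python
# Single-compartment respiratory mechanics: resistance R in series with
# compliance C, plus the ventilator's PEEP. Values are illustrative only.

R = 15.0    # airway resistance, cmH2O / (L/s) (elevated, COPD-like)
C = 0.06    # respiratory system compliance, L / cmH2O
PEEP = 5.0  # positive end-expiratory pressure, cmH2O

def airway_pressure(flow_lps, volume_l):
    """Pressure at the airway opening for a given flow and inflated volume."""
    return R * flow_lps + volume_l / C + PEEP

# Peak inspiratory pressure at the end of a 0.5 L breath delivered at 0.5 L/s
p_peak = airway_pressure(0.5, 0.5)
```

    Sweeping R and C over disease-specific ranges, and driving the equation with each ventilatory mode's flow waveform, is essentially what lets such a tool show how settings interact with a given pathology.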

  15. Watershed modeling tools and data for prognostic and diagnostic

    NASA Astrophysics Data System (ADS)

    Chambel-Leitao, P.; Brito, D.; Neves, R.

    2009-04-01

    When eutrophication is considered an important process to control, this can be accomplished by reducing nitrogen and phosphorus losses from both point and nonpoint sources and by assessing the effectiveness of the pollution reduction strategy. The HARP-NUT guidelines (Guidelines on Harmonized Quantification and Reporting Procedures for Nutrients) (Borgvang & Selvik, 2000) are presented by OSPAR as the best common quantification and reporting procedures for calculating the reduction of nutrient inputs. In 2000, OSPAR introduced the HARP-NUT guidelines on a trial basis. They were intended to serve as a tool for OSPAR Contracting Parties to report, in a harmonized manner, their different commitments, present or future, with regard to nutrients under the OSPAR Convention, in particular the "Strategy to Combat Eutrophication". The HARP-NUT Guidelines (Borgvang and Selvik, 2000; Schoumans, 2003) were developed to quantify and report on the individual sources of nitrogen and phosphorus discharges/losses to surface waters (Source Orientated Approach). These results can be compared with the total riverine nitrogen and phosphorus loads measured at downstream monitoring points (Load Orientated Approach), as load reconciliation. Nitrogen and phosphorus retention in river systems represents the connecting link between the Source Orientated Approach and the Load Orientated Approach. Both approaches are necessary for verification purposes, and both may be needed to provide the information required for the various commitments. Guidelines 2, 3, 4 and 5 are mainly concerned with source estimation. They present a set of simple calculations that allow the origin of loads to be estimated. Guideline 6 is a particular case where the application of a model is advised, in order to estimate the nutrient loads from diffuse sources associated with land use/land cover. The model chosen for this was SWAT (Arnold & Fohrer, 2005), because it is suggested in Guideline 6 and because it

  16. Animal models as tools to study the pathophysiology of depression.

    PubMed

    Abelaira, Helena M; Réus, Gislaine Z; Quevedo, João

    2013-01-01

    The incidence of depressive illness is high worldwide, and the inadequacy of currently available drug treatments contributes to the significant health burden associated with depression. A basic understanding of the underlying disease processes in depression is lacking; therefore, recreating the disease in animal models is not possible. Popular current models of depression creatively merge ethologically valid behavioral assays with the latest technological advances in molecular biology. Within this context, this study aims to evaluate animal models of depression and determine which has the best face, construct, and predictive validity. These models differ in the degree to which they produce features that resemble a depressive-like state, and models that include stress exposure are widely used. Paradigms that employ acute or sub-chronic stress exposure include learned helplessness, the forced swimming test, the tail suspension test, maternal deprivation, chronic mild stress, and sleep deprivation, to name but a few, all of which employ relatively short-term exposure to inescapable or uncontrollable stress and can reliably detect antidepressant drug response. PMID:24271223

  17. Ceramic compaction models: Useful design tools or simple trend indicators?

    SciTech Connect

    Mahoney, F.M.; Readey, M.J.

    1995-08-01

    It is well-known that dry pressing of ceramic powders leads to density gradients in a ceramic compact resulting in non-uniform shrinkage during densification. This necessitates diamond grinding to final dimensions which, in addition to being an extra processing step, greatly increases the manufacturing cost of ceramic components. To develop methods to control and thus mitigate density variations in compacted powders, it has been an objective of researchers to better understand the mechanics of the compaction process and the underlying material and tooling effects on the formation of density gradients. This paper presents a review of models existing in the literature related to the compaction behavior of ceramic powders. In particular, this paper focuses on several well-known compaction models that predict pressure and density variations in powder compacts.

  18. Empirical flow parameters : a tool for hydraulic model validity

    USGS Publications Warehouse

    Asquith, William H.; Burley, Thomas E.; Cleveland, Theodore G.

    2013-01-01

    The objectives of this project were: (1) to determine and present, from existing data in Texas, relations between observed stream flow, topographic slope, mean section velocity, and other hydraulic factors, to produce charts such as Figure 1 and empirical distributions of the various flow parameters, providing a methodology to "check if model results are way off!"; (2) to produce a statistical regional tool to estimate mean velocity or other selected parameters for storm flows or other conditional discharges at ungauged locations (most bridge crossings) in Texas, providing a secondary way to compare such values to a conventional hydraulic modeling approach; and (3) to present ancillary values such as Froude number, stream power, Rosgen channel classification, sinuosity, and other selected characteristics (readily determinable from existing data) to provide additional information to engineers concerned with the hydraulic-soil-foundation component of transportation infrastructure.
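    Two of the ancillary values listed, Froude number and stream power, are standard open-channel quantities and can be computed directly. The sketch below uses the textbook definitions with made-up inputs; it is not code from the project.

```python
import math

G = 9.81      # gravitational acceleration, m/s^2
RHO = 1000.0  # water density, kg/m^3

def froude(velocity, depth):
    """Froude number Fr = V / sqrt(g * D) for mean velocity V (m/s), depth D (m)."""
    return velocity / math.sqrt(G * depth)

def stream_power(discharge, slope):
    """Total stream power per unit channel length, Omega = rho*g*Q*S (W/m)."""
    return RHO * G * discharge * slope

# Illustrative values: a modest flood flow in a mild-slope channel.
fr = froude(velocity=1.2, depth=2.0)              # Fr < 1 means subcritical flow
omega = stream_power(discharge=50.0, slope=0.001)
```

    Comparing values like these against the empirical distributions the report assembles is exactly the "way off" sanity check described in objective (1).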

  19. The Caenorhabditis elegans model as a reliable tool in neurotoxicology

    PubMed Central

    Avila, Daiana; Helmcke, Kirsten; Aschner, Michael

    2016-01-01

    Caenorhabditis elegans (C. elegans) offers an attractive experimental platform: it has a short life cycle, is inexpensive to maintain, and, most importantly, has a high degree of evolutionary conservation with higher eukaryotes. Understanding the contribution of the inherent genes that regulate neurotoxicity and antioxidant stress responses in the worm provides critical insight into mechanisms of mammalian neurotoxicity. The C. elegans model readily enables a multi-gene approach, allowing combinatorial genetic variation to be studied within the context of the influence of multigenic polymorphisms on environmental risk and vulnerability. This review provides a synopsis of recent studies on metal and pesticide toxicity in C. elegans, highlighting the utility of the model system in understanding the molecular mechanisms that underlie developmental, reproductive and neuronal damage. The continuation of these investigations, combining basic toxicological experimentation with novel genetic and high-throughput methods, will continue to make C. elegans an invaluable tool for future research, providing insight into molecular and cellular mechanisms of toxicity. PMID:21148196

  20. MTK: An AI tool for model-based reasoning

    NASA Technical Reports Server (NTRS)

    Erickson, William K.; Schwartz, Mary R.

    1987-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control and trend analysis of the space station Thermal Management System (TMS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined, along with examples from the thermal system to highlight the motivating factors behind them. An overview of the capabilities of MTK is given.

  1. Online and Certifiable Spectroscopy Courses Using Information and Communication Tools. a Model for Classrooms and Beyond

    NASA Astrophysics Data System (ADS)

    Krishnan, Mangala Sunder

    2015-06-01

    Online education tools, flipped (reverse) class models for teaching and learning, and pedagogic and andragogic approaches to self-learning have matured considerably in the last few years because of the revolution in video, interactive software and social learning tools. Open Educational Resources of dependable quality and variety are also becoming available throughout the world, making the current era truly a renaissance period for higher education using the Internet. In my presentation, I shall highlight structured online course content preparation in several areas of spectroscopy, as well as the design and development of virtual lab tools and kits for studying optical spectroscopy. Both elementary and advanced courses on molecular spectroscopy are currently under development jointly with researchers in other institutions in India. I would like to explore participation from teachers throughout the world in the teaching-learning process using flipped class methods for topics such as experimental and theoretical microwave spectroscopy of semi-rigid and non-rigid molecules, molecular complexes and aggregates. In addition, courses in Raman and infrared spectroscopy experimentation and advanced electronic spectroscopy courses are envisaged for free online access. The National Programme on Technology Enhanced Learning (NPTEL) and the National Mission on Education through Information and Communication Technology (NMEICT) are two large Government of India funded initiatives for producing certified and self-learning courses with financial support for moderated discussion forums. The learning tools and interactive presentations so developed can be used in classrooms throughout the world in a flipped mode of teaching. They are much sought after by learners and researchers who work in other areas of learning but want to contribute to research and development through inter-disciplinary learning. NPTEL is currently experimenting with Massive Open Online Course (MOOC

  2. Introducing Modeling Transition Diagrams as a Tool to Connect Mathematical Modeling to Mathematical Thinking

    ERIC Educational Resources Information Center

    Czocher, Jennifer A.

    2016-01-01

    This study contributes a methodological tool to reconstruct the cognitive processes and mathematical activities carried out by mathematical modelers. Represented as Modeling Transition Diagrams (MTDs), individual modeling routes were constructed for four engineering undergraduate students. Findings stress the importance and limitations of using…

  3. A Novel Bioluminescence Orthotopic Mouse Model for Advanced Lung Cancer

    PubMed Central

    Li, Bo; Torossian, Artour; Li, Wenyan; Schleicher, Stephen; Niu, Kathy; Giacalone, Nicholas J.; Kim, Sung June; Chen, Heidi; Gonzalez, Adriana; Moretti, Luigi; Lu, Bo

    2011-01-01

    Lung cancer is the leading cause of cancer-related death in the United States despite recent advances in our understanding of this challenging disease. An animal model for high-throughput screening of therapeutic agents for advanced lung cancer could help promote the development of more successful treatment interventions. To develop our orthotopic lung cancer model, luciferase-expressing A549 cancer cells were injected into the mediastinum of athymic nude mice. To determine whether the model would allow easy monitoring of response to therapeutic interventions, tumors were treated with 30 mg/kg Paclitaxel or were irradiated with 5 fractions of 2 Gy, and tumor burden was monitored using bioluminescence imaging. Evidence of radiation-induced lung injury was assessed using immunohistochemical staining for phospho-Smad2/3 and cleaved caspase-3. We found that tumor implantation recapitulated advanced human lung cancer as evidenced by tumor establishment and proliferation within the mediastinum. The tumor responded to Paclitaxel or radiation as shown by decreased tumor bioluminescence and improved overall survival. Immunohistochemistry revealed increased phospho-Smad2/3 and cleaved caspase-3 in irradiated lungs, consistent with radiation-induced lung injury. This orthotopic lung cancer model may help provide a method to assess therapeutic interventions in a preclinical setting that recapitulates locally advanced lung cancer. PMID:21663394

  4. Advances in Games Technology: Software, Models, and Intelligence

    ERIC Educational Resources Information Center

    Prakash, Edmond; Brindle, Geoff; Jones, Kevin; Zhou, Suiping; Chaudhari, Narendra S.; Wong, Kok-Wai

    2009-01-01

    Games technology has undergone tremendous development. In this article, the authors report the rapid advancement that has been observed in the way games software is being developed, as well as in the development of games content using game engines. One area that has gained special attention is modeling the game environment such as terrain and…

  5. QUEST FOR AN ADVANCED REGIONAL AIR QUALITY MODEL

    EPA Science Inventory

    Organizations interested in advancing the science and technology of regional air quality modeling on the "grand challenge" scale have joined to form CAMRAQ. They plan to leverage their research funds by collaborating on the development and evaluation of CMSs so ambitious in scope ...

  6. Advancing Space Weather Modeling Capabilities at the CCMC

    NASA Astrophysics Data System (ADS)

    Mays, M. Leila; Kuznetsova, Maria; Boblitt, Justin; Chulaki, Anna; MacNeice, Peter; Mendoza, Michelle; Mullinix, Richard; Pembroke, Asher; Pulkkinen, Antti; Rastaetter, Lutz; Shim, Ja Soon; Taktakishvili, Aleksandre; Wiegand, Chiu; Zheng, Yihua

    2016-04-01

    The Community Coordinated Modeling Center (CCMC, http://ccmc.gsfc.nasa.gov) serves as a community access point to an expanding collection of state-of-the-art space environment models and as a hub for collaborative development of the next generation of space weather forecasting systems. In partnership with model developers and the international research and operational communities, the CCMC integrates new data streams and models from diverse sources into end-to-end space weather predictive systems, identifies weak links in data-model and model-model coupling, and leads community efforts to fill those gaps. The presentation will focus on the latest model installations at the CCMC and advances in CCMC-led community-wide model validation projects.

  7. [Population surveys as management tools and health care models].

    PubMed

    Andrade, Flávia Reis de; Narvai, Paulo Capel

    2013-12-01

    The article briefly systematizes health care models, emphasizes the role of population surveys as a management tool, and analyzes the specific case of the Brazilian Oral Health Survey (SBBrasil 2010) and its contribution to the consolidation of health care models consistent with the principles of the Sistema Único de Saúde (SUS, Public Health Care System). While in legal terms SUS corresponds to a health care model, in the actual practice of public policy planning and health action the system gives rise to a care model that results not from legal texts or theoretical formulations but from the praxis of the personnel involved. Bearing in mind that the management of day-to-day health affairs is a privileged space for the production and consolidation of health care models, it is necessary to stimulate and support the development of technical and operational skills that differ from those required for the management of care related to individual demands. PMID:24626592

  8. Introducing BioSARN - an ecological niche model refinement tool.

    PubMed

    Heap, Marshall J

    2016-08-01

    Environmental niche modeling outputs a biological species' potential distribution; further work is needed to arrive at the species' realized distribution. The Biological Species Approximate Realized Niche (BioSARN) application provides the ecological modeler with a toolset for refining environmental niche models (ENMs). These tools include soil and land class filtering and niche area quantification, as well as novelties like enhanced temporal corridor definition and output to a high spatial resolution land class model. BioSARN is exemplified with a study on Fraser fir, a tree species with strong land class and edaphic correlations. Soil and land class filtering caused the potential distribution area to decline by 17%. Enhanced temporal corridor definition permitted distinction of current, continuing, and future niches, and thus of niche change and movement. Tile quantification analysis provided further corroboration of these trends. BioSARN does not substitute for other established ENM methods. Rather, it allows the experimenter to work with their preferred ENM, refining it using their knowledge and experience. Output from lower spatial resolution ENMs to a high spatial resolution land class model is a pseudo high-resolution result. Still, it may be the best that can be achieved until wide-range, high spatial resolution environmental data and accurate, high-precision species occurrence data become generally available. PMID:27547356
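The soil and land-class filtering step can be illustrated with a toy raster calculation. The grids, class codes, and `filter_potential_distribution` helper below are hypothetical stand-ins, not BioSARN's actual data model:

```python
import numpy as np

def filter_potential_distribution(potential, land_class, suitable_classes):
    """Mask a potential-distribution grid (1 = predicted present) to cells
    whose land class is suitable; a simplified stand-in for the filter step."""
    mask = np.isin(land_class, list(suitable_classes))
    return potential * mask

# Toy 4x4 grids: ENM potential distribution and a land-class map.
potential = np.array([[1, 1, 0, 0],
                      [1, 1, 1, 0],
                      [0, 1, 1, 1],
                      [0, 0, 1, 1]])
land = np.array([[2, 1, 1, 1],
                 [2, 1, 3, 1],
                 [1, 3, 2, 2],
                 [1, 1, 2, 3]])

realized = filter_potential_distribution(potential, land, {2, 3})
decline = 1 - realized.sum() / potential.sum()
print(f"area retained: {realized.sum()} cells, decline: {decline:.0%}")
```

Real inputs would be co-registered rasters at the land-class model's resolution, but the mask-and-quantify logic is the same.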

  9. Finite Element Modeling, Simulation, Tools, and Capabilities at Superform

    NASA Astrophysics Data System (ADS)

    Raman, Hari; Barnes, A. J.

    2010-06-01

    Over the past thirty years Superform has been a pioneer in the SPF arena, having developed a keen understanding of the process and a range of unique forming techniques to meet varying market needs. Superform’s high-profile list of customers includes Boeing, Airbus, Aston Martin, Ford, and Rolls Royce. One of the more recent additions to Superform’s technical know-how is finite element modeling and simulation. Finite element modeling is a powerful numerical technique which, when applied to SPF, provides a host of benefits, including accurate prediction of strain levels in a part, of the presence of wrinkles, and of pressure cycles optimized for time and part thickness. This paper outlines a brief history of finite element modeling applied to SPF and then reviews some of the modeling tools and techniques that Superform has applied, and continues to apply, to successfully form complex-shaped parts superplastically. The advantages of employing modeling at the design stage are discussed and illustrated with real-world examples.

  10. The STREON Recirculation Chamber: An Advanced Tool to Quantify Stream Ecosystem Metabolism in the Benthic Zone

    NASA Astrophysics Data System (ADS)

    Brock, J. T.; Utz, R.; McLaughlin, B.

    2013-12-01

    The STReam Experimental Observatory Network (STREON) is a large-scale experimental effort that will investigate the effects of eutrophication and loss of large consumers in stream ecosystems. STREON represents the first experimental effort undertaken and supported by the National Ecological Observatory Network (NEON). Two treatments will be applied at 10 NEON sites and maintained for 10 years in the STREON program: the addition of nitrate and phosphate to enrich concentrations to five times ambient levels, and electrical fields that exclude top consumers (i.e., fish or invertebrates) of the food web from the surface of buried sediment baskets. Following a 3-5 week period, the sediment baskets will be extracted and incubated in closed, recirculating metabolic chambers to measure rates of respiration, photosynthesis, and nutrient uptake. All STREON-generated data will be open access and available on the NEON web portal. The recirculation chamber represents a critical infrastructural component of STREON. Although researchers have applied such chambers for metabolic and nutrient uptake measurements in the past, the scope of STREON demands a novel design that addresses multiple processes often neglected by earlier models. The STREON recirculation chamber must be capable of: 1) incorporating hyporheic exchange into the flow field to ensure that measurements of respiration include the activity of subsurface biota, 2) operating consistently with heterogeneous sediments from sand to cobble, 3) minimizing heat exchange from the motor and external environment, 4) delivering a reproducible uniform flow field over the surface of the sediment basket, and 5) allowing efficient assembly/disassembly with minimal use of tools. The chamber also required a means of accommodating an optical dissolved oxygen probe and a means to inject/extract water. A prototype STREON chamber has been designed and thoroughly tested. The flow field within the chamber has been mapped using particle imaging velocimetry (PIV

  11. Potential for MERLIN-Expo, an advanced tool for higher tier exposure assessment, within the EU chemical legislative frameworks.

    PubMed

    Suciu, Nicoleta; Tediosi, Alice; Ciffroy, Philippe; Altenpohl, Annette; Brochot, Céline; Verdonck, Frederik; Ferrari, Federico; Giubilato, Elisa; Capri, Ettore; Fait, Gabriella

    2016-08-15

    MERLIN-Expo merges and integrates advanced exposure assessment methodologies, allowing complex scenarios involving several pollution sources and targets to be built. The assessment of exposure and risks to human health from chemicals is of major concern for policy and ultimately benefits all citizens. The development and operational fusion of the advanced exposure assessment methodologies envisaged in the MERLIN-Expo tool will have a significant long-term impact on several policies dealing with chemical safety management. There are more than 30 agencies in Europe concerned with the exposure and risk evaluation of chemicals, which play an important role in implementing EU policies and carry out tasks of a technical, scientific, operational and/or regulatory nature. The main purpose of the present paper is to introduce MERLIN-Expo and to highlight its potential for being effectively integrated within the group of tools available to assess the risk and exposure of chemicals for EU policy. The main results show that the tool is highly suitable for use in site-specific or local impact assessment; with minor modifications it can also be used for Plant Protection Products (PPPs), biocides and REACH, while major additions would be required for a comprehensive application in the field of consumer and worker exposure assessment. PMID:27107646

  12. Specification of advanced safety modeling requirements (Rev. 0).

    SciTech Connect

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond-design-basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represents an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models will

  13. GIS as an Integration Tool for Hydrologic Modeling: Spatial Data Management, Analysis and Visualization

    NASA Astrophysics Data System (ADS)

    Setegn, S. G.; Lawrence, A.; Mahmoudi, M.

    2015-12-01

    The Applied Research Center at Florida International University (ARC-FIU) is supporting the soil and groundwater remediation efforts of the U.S. Department of Energy (DOE) Savannah River Site (SRS) by developing a surface water model to simulate the hydrology and the fate and transport of contaminants and sediment in the Tims Branch watershed. The first phase of model development was initiated in 2014 using the MIKE SHE/MIKE 11 hydrological modeling package, which has a geographic information systems (GIS) user interface built into its system that can directly use spatial GIS databases (geodatabases) for model inputs. This study developed an ArcGIS geodatabase to support the hydrological modeling work for SRS. The coupling of a geodatabase with MIKE SHE/MIKE 11 numerical models can serve as an efficient tool that significantly reduces the time needed for data preparation. The geodatabase provides an advanced spatial data structure needed to address the management, processing, and analysis of large GIS and time-series datasets derived from multiple sources that are used for numerical model calibration, uncertainty analysis, and simulation of flow and contaminant fate and transport during extreme climatic events. The geodatabase developed is based on the ArcHydro and ArcGIS Base Map data models, with modifications made for project-specific input parameters. The significance of this approach was to ensure its replicability for potential application in other watersheds. This paper describes the process of development of the SRS geodatabase and the application of GIS tools to pre-process and analyze hydrological model data; automate repetitive geoprocessing tasks; and produce maps for visualization of the surface water hydrology of the Tims Branch watershed. Key Words: GIS, hydrological modeling, geodatabase, hydrology, MIKE SHE/MIKE 11

  14. Computer modeling for advanced life support system analysis.

    PubMed

    Drysdale, A

    1997-01-01

    This article discusses the equivalent mass approach to advanced life support system analysis, describes a computer model developed to use this approach, and presents early results from modeling the NASA JSC BioPlex. The model is built using an object-oriented approach and G2, a commercially available modeling package. Cost factor equivalencies are given for the Volosin scenarios. Plant data from NASA KSC and Utah State University (USU) are used, together with configuration data from the BioPlex design effort. Initial results focus on the importance of obtaining high plant productivity with a flight-like configuration. PMID:11540448
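The equivalent mass approach folds non-mass resources (volume, power, cooling, crew time) into a single mass-like figure of merit so that competing life support options can be compared. A minimal sketch, with a hypothetical `equivalent_system_mass` helper and illustrative equivalency factors that are not mission-validated values:

```python
def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw,
                           crew_time_hr_per_yr, duration_yr,
                           v_eq=66.7, p_eq=476.0, c_eq=60.0, ct_eq=1.5):
    """Equivalent mass = hardware mass plus each resource demand scaled by
    its mass-equivalency factor (kg per m^3, per kW, per crew-hour)."""
    return (mass_kg
            + volume_m3 * v_eq
            + power_kw * p_eq
            + cooling_kw * c_eq
            + crew_time_hr_per_yr * duration_yr * ct_eq)

# Toy comparison: a physico-chemical subsystem vs. a plant-growth chamber.
pc = equivalent_system_mass(500, 2.0, 3.0, 3.0, 50, 2)
bio = equivalent_system_mass(800, 10.0, 20.0, 20.0, 400, 2)
print(f"physico-chemical: {pc:.1f} kg-eq, biological: {bio:.1f} kg-eq")
```

With longer mission durations the crew-time and consumables terms dominate, which is why high plant productivity matters for the biological option.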

  15. Test model designs for advanced refractory ceramic materials

    NASA Technical Reports Server (NTRS)

    Tran, Huy Kim

    1993-01-01

    The next generation of space vehicles will be subjected to severe aerothermal loads and will require an improved thermal protection system (TPS) and other advanced vehicle components. In order to ensure the satisfactory performance of the TPS and other advanced vehicle materials and components, testing is to be performed in environments similar to those of space flight. The design and fabrication of the test models should be fairly simple yet still accomplish the test objectives. In the Advanced Refractory Ceramic Materials test series, the models and model holders will need to withstand heat fluxes of 340 to 817 W/sq cm, or surface temperatures in the range of 2700 K to 3000 K. The model holders should provide one-dimensional (1-D) heat transfer to the samples and the appropriate flow field without compromising the primary test objectives. Optical properties such as the effective emissivity, catalytic efficiency coefficients, thermal properties, and mass loss measurements are also taken into consideration in the design process. It is therefore the intent of this paper to demonstrate design schemes for the different models and model holders that accommodate these test requirements and ensure safe operation in a typical arc jet facility.

  16. Advanced practice in neurocritical care: an innovative orientation and competency model.

    PubMed

    Vicari-Christensen, Michele

    2014-02-01

    The advanced registered nurse practitioner (ARNP) role began in the 1960s as an alternative provider model intended to meet the demands of an escalating healthcare resource deficit. As the role evolved and ARNPs demonstrated safe and effective care, these providers began to appear in critical care settings. It is believed that in the specialty of Neurocritical Care, about half the providers are ARNPs. Hiring and training practitioners for this complex environment is daunting. At the University of Florida & Shands Jacksonville, an innovative orientation and competency model was developed and implemented for ARNPs hired for the newly opened Neurocritical Care unit. The program contains a roadmap for knowledge-base and skill acquisition as well as competency training and maintenance. Experience with appropriate hiring and screening standards, internally developed training tools, and identification of necessary advanced classes is discussed. This model may be used as a guideline for Neurocritical Care ARNP training and may be adapted for other critical care settings. PMID:24399169

  17. Model-free adaptive control of advanced power plants

    SciTech Connect

    Cheng, George Shu-Xing; Mulkey, Steven L.; Wang, Qiang

    2015-08-18

    A novel 3-Input-3-Output (3×3) Model-Free Adaptive (MFA) controller with a set of artificial neural networks as part of the controller is introduced. A 3×3 MFA control system using the inventive 3×3 MFA controller is described to control key process variables including Power, Steam Throttle Pressure, and Steam Temperature of boiler-turbine-generator (BTG) units in conventional and advanced power plants. Those advanced power plants may comprise Once-Through Supercritical (OTSC) Boilers, Circulating Fluidized-Bed (CFB) Boilers, and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.

  18. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    SciTech Connect

    Cetiner, Mustafa Sacit; Flanagan, George F.; Poore III, Willis P.; Muhlheim, Michael David

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C#, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.
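The fault-injection pathway can be sketched independently of the actual RWB/Dymola interfaces. The Python stand-ins below (hypothetical `ProbabilisticModel`, `MultiPhysicsModel`, and `inject` names) show the coupling idea: one injected component failure is broadcast to both the reliability model and the plant model so their states stay consistent:

```python
class ProbabilisticModel:
    """Stand-in for the reliability side: a two-pump parallel system whose
    top-event probability is the product of component failure probabilities."""
    def __init__(self):
        self.p_fail = {"pump_A": 1e-3, "pump_B": 1e-3}

    def inject_fault(self, component):
        self.p_fail[component] = 1.0   # component now failed with certainty

    def top_event_probability(self):
        p = 1.0
        for v in self.p_fail.values():
            p *= v
        return p

class MultiPhysicsModel:
    """Stand-in for the plant-physics side: total coolant flow of two pumps."""
    def __init__(self):
        self.flow = {"pump_A": 50.0, "pump_B": 50.0}   # kg/s each

    def inject_fault(self, component):
        self.flow[component] = 0.0     # failed pump delivers no flow

    def total_flow(self):
        return sum(self.flow.values())

def inject(component, *models):
    """The coupling layer: broadcast one fault event to every coupled model."""
    for m in models:
        m.inject_fault(component)

pra, physics = ProbabilisticModel(), MultiPhysicsModel()
inject("pump_A", pra, physics)
print(pra.top_event_probability(), physics.total_flow())
```

In the real platform the broadcast crosses process and language boundaries through the dll, but the contract (one fault event, two synchronized model updates) is the same.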

  19. Earth remote sensing as an effective tool for the development of advanced innovative educational technologies

    NASA Astrophysics Data System (ADS)

    Mayorova, Vera; Mayorov, Kirill

    2009-11-01

    The current educational system faces a contradiction between the fundamentality of engineering education and the need to extend applied learning, which requires new training methods that balance academic and practical knowledge. As a result, a number of innovations are being developed and implemented in the educational process, aimed at optimizing the quality of the entire educational system. Among the wide range of innovative educational technologies, an especially important subset involves learning through hands-on scientific and technical projects. The purpose of this paper is to describe the implementation of educational technologies based on small satellite development, as well as the usage of Earth remote sensing data acquired from these satellites. The increase in public attention to education through Earth remote sensing stems from the concern that, although there has been great progress in the development of new methods of Earth imagery and remote sensing data acquisition, a big question remains open on the practical applications of this kind of data. It is important to develop a new way of thinking in the new generation, so that people understand that they are the masters of their own planet and are responsible for its state. They should desire, and should be able, to use a powerful set of tools based on modern and prospective Earth remote sensing. For example, NASA sponsors the "Classroom of the Future" project. The Universities Space Research Association in the United States provides a mechanism through which US universities can cooperate effectively with one another, with the government, and with other organizations to further space science and technology, and to promote education in these areas. It also aims at understanding the Earth as a system and promoting the role of humankind in the destiny of their own planet. The Association has founded a Journal of Earth System

  20. Nuclear fuel cycle system simulation tool based on high-fidelity component modeling

    SciTech Connect

    Ames, David E.

    2014-02-01

    The DOE is currently directing extensive research into developing fuel cycle technologies that will enable the safe, secure, economic, and sustainable expansion of nuclear energy. The task is formidable considering the numerous fuel cycle options, the large dynamic system that each represents, and the necessity to accurately predict their behavior. The path to successfully develop and implement an advanced fuel cycle is highly dependent on the modeling capabilities and simulation tools available for performing useful, relevant analysis to assist stakeholders in decision making. Therefore, a high-fidelity fuel cycle simulation tool that performs system analysis, including uncertainty quantification and optimization, was developed. The resulting simulator also includes the capability to calculate environmental impact measures for individual components and for the system. An integrated system method and analysis approach that provides consistent and comprehensive evaluations of advanced fuel cycles was developed. A general approach was utilized, allowing the system to be modified in order to provide analysis for other systems with similar attributes. This approach provides the framework for simulating many different fuel cycle options. Two example fuel cycle configurations were developed to take advantage of used fuel recycling and transmutation capabilities in waste management scenarios leading to minimized waste inventories.
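Uncertainty quantification over a fuel-cycle mass balance can be sketched with a toy once-through-with-recycle model. The function names and numbers below are illustrative assumptions, not the simulator's actual models:

```python
import random

def waste_inventory(years, spent_fuel_per_yr, recovery_fraction):
    """Toy mass balance: each year's spent fuel (tonnes) is reprocessed and
    only the unrecovered fraction goes to the repository."""
    return years * spent_fuel_per_yr * (1.0 - recovery_fraction)

def monte_carlo_waste(n, years, spent_fuel_per_yr, rec_lo, rec_hi, seed=0):
    """Propagate uncertainty in the recovery fraction (sampled uniformly)
    through the mass balance; return mean and range of waste inventory."""
    rng = random.Random(seed)
    samples = [waste_inventory(years, spent_fuel_per_yr,
                               rng.uniform(rec_lo, rec_hi))
               for _ in range(n)]
    return sum(samples) / n, min(samples), max(samples)

# 40 years of operation, 20 t/yr spent fuel, recovery between 90% and 99%.
mean, lo, hi = monte_carlo_waste(10_000, 40, 20.0, 0.90, 0.99)
print(f"mean waste: {mean:.1f} t, range: [{lo:.1f}, {hi:.1f}] t")
```

A real fuel-cycle simulator would propagate many correlated parameters through coupled facility models, but the sample-evaluate-summarize loop is the core of the UQ capability described.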

  1. A Model-Driven Visualization Tool for Use with Model-Based Systems Engineering Projects

    NASA Technical Reports Server (NTRS)

    Trase, Kathryn; Fink, Eric

    2014-01-01

    Model-Based Systems Engineering (MBSE) promotes increased consistency between a system's design and its design documentation through the use of an object-oriented system model. The creation of this system model facilitates data presentation by providing a mechanism from which information can be extracted through automated manipulation of model content. Existing MBSE tools enable model creation but are often too complex for the unfamiliar model viewer to use easily, and they do not yet provide many opportunities for easing into the development and use of a system model when system design documentation already exists. This study creates a Systems Modeling Language (SysML) Document Traceability Framework (SDTF) for integrating design documentation with a system model, and develops an Interactive Visualization Engine for SysML Tools (InVEST), which exports consistent, clear, and concise views of SysML model data. These exported views are each meaningful to a variety of project stakeholders with differing subjects of concern and depth of technical involvement. InVEST allows a model user to generate multiple views and reports from a MBSE model, including wiki pages and interactive visualizations of data. System data can also be filtered to present only the information relevant to the particular stakeholder, resulting in a view that is both consistent with the larger system model and other model views. Viewing the relationships between system artifacts and documentation, and filtering through data to see specialized views, improves the value of the system as a whole, as data becomes information
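The stakeholder-filtering idea (many consistent views derived from one model) can be sketched with a flat export of model elements. The element schema and `view_for` helper below are hypothetical, not InVEST's actual data model:

```python
# A toy flat export of system-model elements, each tagged with the
# stakeholder concern it is relevant to.
ELEMENTS = [
    {"name": "Battery", "type": "Block", "concern": "power"},
    {"name": "maxVoltage", "type": "ValueProperty", "concern": "power"},
    {"name": "Downlink", "type": "Block", "concern": "comms"},
    {"name": "REQ-042 traces to Battery", "type": "Trace", "concern": "power"},
]

def view_for(concern, elements=ELEMENTS):
    """Return only the element names relevant to one stakeholder concern.
    Every view is derived from the same element list, so views stay
    consistent with the underlying model and with each other."""
    return [e["name"] for e in elements if e["concern"] == concern]

print(view_for("power"))
print(view_for("comms"))
```

Because every view is a projection of the same source data, an update to the model propagates to all stakeholder views automatically, which is the consistency benefit the abstract describes.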

  2. Unleashing spatially distributed ecohydrology modeling using Big Data tools

    NASA Astrophysics Data System (ADS)

    Miles, B.; Idaszak, R.

    2015-12-01

    Physically based spatially distributed ecohydrology models are useful for answering science and management questions related to the hydrology and biogeochemistry of prairie, savanna, forested, as well as urbanized ecosystems. However, these models can produce hundreds of gigabytes of spatial output for a single model run over decadal time scales when run at regional spatial scales and moderate spatial resolutions (~100-km2+ at 30-m spatial resolution) or when run for small watersheds at high spatial resolutions (~1-km2 at 3-m spatial resolution). Numerical data formats such as HDF5 can store arbitrarily large datasets. However even in HPC environments, there are practical limits on the size of single files that can be stored and reliably backed up. Even when such large datasets can be stored, querying and analyzing these data can suffer from poor performance due to memory limitations and I/O bottlenecks, for example on single workstations where memory and bandwidth are limited, or in HPC environments where data are stored separately from computational nodes. The difficulty of storing and analyzing spatial data from ecohydrology models limits our ability to harness these powerful tools. Big Data tools such as distributed databases have the potential to surmount the data storage and analysis challenges inherent to large spatial datasets. Distributed databases solve these problems by storing data close to computational nodes while enabling horizontal scalability and fault tolerance. Here we present the architecture of and preliminary results from PatchDB, a distributed datastore for managing spatial output from the Regional Hydro-Ecological Simulation System (RHESSys). The initial version of PatchDB uses message queueing to asynchronously write RHESSys model output to an Apache Cassandra cluster. Once stored in the cluster, these data can be efficiently queried to quickly produce both spatial visualizations for a particular variable (e.g. 
maps and animations), as well
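The asynchronous-write pattern the abstract describes (model output pushed onto a message queue, with a consumer draining it into the datastore) can be sketched as follows. This is a hypothetical, simplified stand-in: a Python queue and an in-memory dict replace the real message broker and Apache Cassandra cluster, and the class and method names are illustrative, not PatchDB's actual API.

```python
import queue
import threading

class AsyncPatchWriter:
    """Toy illustration: the model enqueues output; a worker persists it."""
    _STOP = object()  # sentinel used to shut the consumer down

    def __init__(self):
        self.store = {}   # stand-in for the distributed datastore
        self.q = queue.Queue()
        self.worker = threading.Thread(target=self._drain, daemon=True)
        self.worker.start()

    def write(self, patch_id, timestep, variables):
        # The model returns immediately; storage happens in the background.
        self.q.put((patch_id, timestep, variables))

    def _drain(self):
        while True:
            item = self.q.get()
            if item is self._STOP:
                break
            patch_id, timestep, variables = item
            self.store[(patch_id, timestep)] = variables

    def close(self):
        self.q.put(self._STOP)
        self.worker.join()

writer = AsyncPatchWriter()
writer.write(patch_id=42, timestep=0, variables={"soil_moisture": 0.31})
writer.write(patch_id=42, timestep=1, variables={"soil_moisture": 0.29})
writer.close()
print(writer.store[(42, 1)]["soil_moisture"])  # -> 0.29
```

The key design point is decoupling: the simulation is never blocked on storage latency, which is what makes writing hundreds of gigabytes of spatial output feasible.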

  3. Individual-based modelling: an essential tool for microbiology.

    PubMed

    Ferrer, Jordi; Prats, Clara; López, Daniel

    2008-04-01

    Micro-organisms play a central role in every ecosystem and in the global biomass cycle. They are strongly involved in many fields of human interest, from medicine to the food industry and waste control. Nevertheless, most micro-organisms remain almost unknown, and nearly 99% of them have not yet been successfully cultured in vitro. Therefore, new approaches and new tools must be developed in order to understand the collective behaviour of microbial communities in any natural or artificial setting. In particular, theoretical and practical methodologies to deal with such systems at a mesoscopic level of description (covering the range from 100 to 10(8) cells) are required. Individual-based modelling (IBM) has become a widely used tool for describing complex systems made up of autonomous entities, such as ecosystems and social networks. Individual-based models (IBMs) provide some advantages over the traditional whole-population models: (a) they are bottom-up approaches, so they describe the behaviour of a system as a whole by establishing procedural rules for the individuals and for their interactions, and thus allow more realistic assumptions for the model of the individuals than population models do; (b) they permit the introduction of randomness and individual variability, so they can reproduce the diversity found in real systems; and (c) they can account for individual adaptive behaviour to their environmental conditions, so the evolution of the whole system arises from the dynamics that govern individuals in their pursuit of optimal fitness. However, they also present some drawbacks: they lack the clarity of continuous models and may easily become rambling, which makes them difficult to analyse and communicate. All in all, IBMs supply a holistic description of microbial systems and their emerging properties. 
They are specifically appropriate to deal with microbial communities in non-steady states, and spatially explicit IBMs are particularly appropriate to study
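The bottom-up character of an IBM, as described above, can be illustrated with a toy model: each cell is an autonomous entity with its own division threshold (individual variability), and population growth emerges from per-cell rules rather than from a whole-population equation. All parameters here are invented for illustration.

```python
import random

def simulate_ibm(n0=10, steps=30, uptake=1.0, seed=1):
    """Toy individual-based model: stochastic uptake plus a division rule."""
    rng = random.Random(seed)
    # each cell: (mass, individual division threshold)
    cells = [(1.0, rng.uniform(1.8, 2.2)) for _ in range(n0)]
    counts = [len(cells)]
    for _ in range(steps):
        nxt = []
        for mass, threshold in cells:
            mass += uptake * rng.uniform(0.05, 0.15)   # stochastic uptake
            if mass >= threshold:                      # per-cell division rule
                nxt.append((mass / 2, rng.uniform(1.8, 2.2)))
                nxt.append((mass / 2, rng.uniform(1.8, 2.2)))
            else:
                nxt.append((mass, threshold))
        cells = nxt
        counts.append(len(cells))
    return counts

counts = simulate_ibm()
print(counts[0], counts[-1])   # population growth emerges from per-cell rules
```

Because division timing varies cell by cell, the population curve is not smooth exponential growth, which is exactly the kind of diversity a whole-population model averages away.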

  4. A Clinical Assessment Tool for Advanced Theory of Mind Performance in 5 to 12 Year Olds

    ERIC Educational Resources Information Center

    O'Hare, Anne E.; Bremner, Lynne; Nash, Marysia; Happe, Francesca; Pettigrew, Luisa M.

    2009-01-01

    One hundred forty typically developing 5- to 12-year-old children were assessed with a test of advanced theory of mind employing Happe's strange stories. There was no significant difference in performance between boys and girls. The stories discriminated performance across the different ages with the lowest performance being in the younger…

  5. Just-in-Time Teaching: A Tool for Enhancing Student Engagement in Advanced Foreign Language Learning

    ERIC Educational Resources Information Center

    Abreu, Laurel; Knouse, Stephanie

    2014-01-01

    Scholars have indicated a need for further research on effective pedagogical strategies designed for advanced foreign language courses in the postsecondary setting, especially in light of decreased enrollments at this level and the elimination of foreign language programs altogether in some institutions (Paesani & Allen, 2012). This article…

  6. Advanced Technologies as Educational Tools in Science: Concepts, Applications, and Issues. Monograph Series Number 8.

    ERIC Educational Resources Information Center

    Kumar, David D.; And Others

    Systems incorporating two advanced technologies, hypermedia systems and intelligent tutors, are examined with respect to their potential impact on science education. The conceptual framework underlying these systems is discussed first. Applications of systems are then presented with examples of each in operation within the context of science…

  7. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  8. ADVANCED TOOLS FOR ASSESSING SELECTED PRESCRIPTION AND ILLICIT DRUGS IN TREATED SEWAGE EFFLUENTS AND SOURCE WATERS

    EPA Science Inventory

    The purpose of this poster is to present the application and assessment of advanced state-of-the-art technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs [azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, m...

  9. Network Models: An Underutilized Tool in Wildlife Epidemiology?

    PubMed Central

    Craft, Meggan E.; Caillaud, Damien

    2011-01-01

    Although the approach of contact network epidemiology has been increasing in popularity for studying transmission of infectious diseases in human populations, it has generally been an underutilized approach for investigating disease outbreaks in wildlife populations. In this paper we explore the differences between the type of data that can be collected on human and wildlife populations, provide an update on recent advances that have been made in wildlife epidemiology by using a network approach, and discuss why networks might have been underutilized and why networks could and should be used more in the future. We conclude with ideas for future directions and a call for field biologists and network modelers to engage in more cross-disciplinary collaboration. PMID:21527981
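The core idea of contact network epidemiology discussed above is that transmission can only travel along observed contacts. A minimal sketch, with an invented graph and invented transmission/recovery rates (not from the paper), is a discrete-time SIR process on an adjacency list:

```python
import random

contacts = {                      # adjacency list of observed contacts
    "a": ["b", "c"], "b": ["a", "d"],
    "c": ["a", "d"], "d": ["b", "c", "e"], "e": ["d"],
}

def run_sir(seed, beta=0.5, gamma=0.3, steps=20, rng=None):
    """Discrete-time SIR on a contact network; infection moves along edges only."""
    rng = rng or random.Random(0)
    state = {n: "S" for n in contacts}
    state[seed] = "I"
    for _ in range(steps):
        nxt = dict(state)
        for n, s in state.items():
            if s == "I":
                for m in contacts[n]:          # transmission along edges
                    if state[m] == "S" and rng.random() < beta:
                        nxt[m] = "I"
                if rng.random() < gamma:       # recovery
                    nxt[n] = "R"
        state = nxt
    return state

final = run_sir("a")
print(sum(s != "S" for s in final.values()))   # number of nodes ever infected
```

In a well-mixed model every individual can infect every other; here the outbreak's reach depends on network structure, which is why network data from wildlife populations changes the predictions.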

  10. A Simple Evacuation Modeling and Simulation Tool for First Responders

    SciTech Connect

    Koch, Daniel B; Payne, Patricia W

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including most importantly time-to-evacuate, to help first responders choose the best course of action.
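The ingredients the abstract names (agents, collision avoidance, and a time-to-evacuate statistic) can be sketched in a deliberately tiny form. This is a hypothetical illustration, not IMPACT's actual simulation: agents queue along a one-dimensional corridor toward an exit, at most one agent per cell.

```python
def evacuate(start_positions):
    """Agents on a 1-D corridor of cells; the exit is cell 0 and absorbs agents."""
    positions = sorted(start_positions)          # nearest to the exit moves first
    t = 0
    while positions:
        t += 1
        occupied = set(positions)
        moved = []
        for p in positions:
            target = p - 1
            if target == 0:                      # reached the exit: evacuated
                occupied.discard(p)
                continue
            if target not in occupied:           # crude collision avoidance
                occupied.discard(p)
                occupied.add(target)
                moved.append(target)
            else:                                # blocked: wait this step
                moved.append(p)
        positions = sorted(moved)
    return t

print(evacuate([1, 2, 3]))  # -> 3, the time-to-evacuate for three queued agents
```

Even this toy version shows the congestion effect a real tool must capture: agents behind a blocked cell wait, so evacuation time is set by the queue, not by individual walking speed alone.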

  11. Tool/tissues interaction modeling for transluminal angioplasty simulation.

    PubMed

    Le Fol, T; Haigron, P; Lucas, A

    2007-01-01

    In this paper, a simulation environment is described for balloon dilation during percutaneous transluminal angioplasty. It means simulating the tool/tissue interactions involved in the inflation of a balloon by considering patient-specific data. In this context, three main behaviors have been identified: soft tissues, which crush completely under the effect of the balloon; calcified plaques, which do not admit any deformation but can move within deformable structures; and the blood vessel wall and organs, which tend to recover their original shapes. A deformable soft tissue model is proposed, based on the Enhanced ChainMail method, to take tissue deformation during dilation into account. We improved the original ChainMail method with a "forbidden zone" step to facilitate tool/tissue interactions. The simulation was implemented using five key steps: 1) initialization of balloon parameters; 2) definition of the data structure; 3) dilation of the balloon and displacement approximation; 4) final position estimation by an elastic relaxation; and 5) interpolation step for visualization. Preliminary results obtained from patient CT data are reported. PMID:18002311

  12. Using urban forest assessment tools to model bird habitat potential

    USGS Publications Warehouse

    Lerman, Susannah B.; Nislow, Keith H.; Nowak, David J.; Destefano, Stephen; King, David I.; Jones-Farrand, D. Todd

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure the urban forest provides habitat to sustain bird and other wildlife populations. The primary goal of this study was to integrate wildlife suitability indices into an existing national urban forest assessment tool, i-Tree. We quantified available habitat characteristics of urban forests for ten northeastern U.S. cities, and summarized bird habitat relationships from the literature in terms of variables that were represented in the i-Tree datasets. With these data, we generated habitat suitability equations for nine bird species, representing a range of life history traits and conservation statuses, that predict habitat suitability based on i-Tree data. We applied these equations to the urban forest datasets to calculate the overall habitat suitability for each city and the habitat suitability for different types of land use (e.g., residential, commercial, parkland) for each bird species. The proposed habitat models will help guide wildlife managers, urban planners, and landscape designers who require specific information, such as desirable habitat conditions within an urban management project, to help improve the suitability of urban forests for birds.
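A habitat suitability equation of the general kind the abstract describes maps measured urban-forest variables to a bounded score. The sketch below is hypothetical: the variables, weights, and species are invented for illustration and are not the study's actual equations.

```python
def habitat_suitability(metrics, weights):
    """Weighted linear suitability index, clamped to the [0, 1] range."""
    score = sum(weights[k] * metrics.get(k, 0.0) for k in weights)
    return max(0.0, min(1.0, score))

# Hypothetical equation for a canopy-nesting songbird; all values invented:
weights = {"canopy_cover": 0.6, "native_tree_fraction": 0.3, "snag_density": 0.1}
residential = {"canopy_cover": 0.45, "native_tree_fraction": 0.7, "snag_density": 0.2}
print(round(habitat_suitability(residential, weights), 3))  # -> 0.5
```

Applying one such equation per species to each land-use class gives exactly the kind of per-city, per-land-use suitability comparison the study reports.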

  13. Gasification CFD Modeling for Advanced Power Plant Simulations

    SciTech Connect

    Zitney, S.E.; Guenther, C.P.

    2005-09-01

    In this paper we have described recent progress on developing CFD models for two commercial-scale gasifiers, including a two-stage, coal slurry-fed, oxygen-blown, pressurized, entrained-flow gasifier and a scaled-up design of the PSDF transport gasifier. Also highlighted was NETL’s Advanced Process Engineering Co-Simulator for coupling high-fidelity equipment models with process simulation for the design, analysis, and optimization of advanced power plants. Using APECS, we have coupled the entrained-flow gasifier CFD model into a coal-fired, gasification-based FutureGen power and hydrogen production plant. The results for the FutureGen co-simulation illustrate how the APECS technology can help engineers better understand and optimize gasifier fluid dynamics and related phenomena that impact overall power plant performance.

  14. Recent advances in microbial production of fuels and chemicals using tools and strategies of systems metabolic engineering.

    PubMed

    Cho, Changhee; Choi, So Young; Luo, Zi Wei; Lee, Sang Yup

    2015-11-15

    The advent of various systems metabolic engineering tools and strategies has enabled more sophisticated engineering of microorganisms for the production of industrially useful fuels and chemicals. Advances in systems metabolic engineering have been made in overproducing natural chemicals and producing novel non-natural chemicals. In this paper, we review the tools and strategies of systems metabolic engineering employed for the development of microorganisms for the production of various industrially useful chemicals belonging to fuels, building block chemicals, and specialty chemicals, in particular focusing on those reported in the last three years. It was aimed at providing the current landscape of systems metabolic engineering and suggesting directions to address future challenges towards successfully establishing processes for the bio-based production of fuels and chemicals from renewable resources. PMID:25450194

  15. WIFIRE Data Model and Catalog for Wildfire Data and Tools

    NASA Astrophysics Data System (ADS)

    Altintas, I.; Crawl, D.; Cowart, C.; Gupta, A.; Block, J.; de Callafon, R.

    2014-12-01

    The WIFIRE project (wifire.ucsd.edu) is building an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. WIFIRE may be used by wildfire management authorities in the future to predict wildfire rate of spread and direction, and to assess the effectiveness of high-density sensor networks in improving fire and weather predictions. WIFIRE has created a data model for wildfire resources including sensed and archived data, sensors, satellites, cameras, modeling tools, workflows and social information including Twitter feeds. This data model and associated wildfire resource catalog include a detailed description of the HPWREN sensor network, SDG&E's Mesonet, and NASA MODIS. In addition, the WIFIRE data model describes how to integrate the data from multiple heterogeneous sources to provide detailed fire-related information. The data catalog describes 'Observables' captured by each instrument using multiple ontologies, including OGC SensorML and NASA SWEET. Observables include measurements such as wind speed, air temperature, and relative humidity, as well as their accuracy and resolution. We have implemented a REST service for publishing to and querying from the catalog using the Web Application Description Language (WADL). We are creating web-based user interfaces and mobile device apps that use the REST interface for dissemination to the wildfire modeling community and project partners across academic, private, and government laboratories, while generating value for emergency officials and the general public. Additionally, the Kepler scientific workflow system is instrumented to interact with this data catalog to access real-time streaming and archived wildfire data and stream it into dynamic data-driven wildfire models at scale.

  16. Continuous Symmetry and Chemistry Teachers: Learning Advanced Chemistry Content through Novel Visualization Tools

    ERIC Educational Resources Information Center

    Tuvi-Arad, Inbal; Blonder, Ron

    2010-01-01

    In this paper we describe the learning process of a group of experienced chemistry teachers in a specially designed workshop on molecular symmetry and continuous symmetry. The workshop was based on interactive visualization tools that allow molecules and their symmetry elements to be rotated in three dimensions. The topic of continuous symmetry is…

  17. Advanced Algorithms and Automation Tools for Discrete Ordinates Methods in Parallel Environments

    SciTech Connect

    Alireza Haghighat

    2003-05-07

    This final report discusses major accomplishments of a 3-year project under the DOE's NEER Program. The project has developed innovative and automated algorithms, codes, and tools for solving the discrete ordinates particle transport method efficiently in parallel environments. Using a number of benchmark and real-life problems, the performance and accuracy of the new algorithms have been measured and analyzed.

  18. The diffraction grating in the Ivory optomechanical modeling tools

    NASA Astrophysics Data System (ADS)

    Hatheway, Alson E.

    2013-09-01

    In imaging spectrometers it is important that both the image of the far-field object and the image of the slit be stable on the detector plane. Lenses and mirrors contribute to the motions of these images but motions of the diffraction grating also have their own influences on these image motions. This paper develops the vector equations for the images (spectra) of the diffraction grating and derives their optomechanical influence coefficients from them. The Ivory Optomechanical Modeling Tools integrates the diffraction grating into the larger optical imaging system and formats the whole system's influence coefficients suitably for both spreadsheet and finite element analysis methods. Their application is illustrated in an example of a spectrometer exposed to both static and dynamic disturbances.

  19. Modeling of tool path for the CNC sheet cutting machines

    NASA Astrophysics Data System (ADS)

    Petunin, Aleksandr A.

    2015-11-01

    In the paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of cutting techniques is offered. We also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for the standard cutting technique (Continuous Cutting Problem, CCP) and for one of the non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that the optimization tasks can be interpreted as a discrete optimization problem (a generalized travelling salesman problem with additional constraints, GTSP). Formalization of some constraints for these tasks is described. To solve the GTSP, we propose using the mathematical model of Prof. Chentsov, based on the concept of a megalopolis and dynamic programming.
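The abstract casts tool-path planning as a generalized travelling salesman problem solved by dynamic programming. As a simplified, illustrative stand-in (plain TSP, without the megalopolis structure or cutting constraints), here is the classic Held-Karp dynamic program over visited subsets:

```python
from itertools import combinations

def held_karp(dist):
    """Exact minimum closed-tour cost for a small (possibly asymmetric) TSP."""
    n = len(dist)
    # dp[(mask, j)] = cheapest cost to start at 0, visit node set `mask`, end at j
    dp = {(1 << 0, 0): 0.0}
    for size in range(2, n + 1):
        for subset in combinations(range(n), size):
            if 0 not in subset:
                continue
            mask = sum(1 << i for i in subset)
            for j in subset:
                if j == 0:
                    continue
                prev_mask = mask ^ (1 << j)
                dp[(mask, j)] = min(
                    dp[(prev_mask, k)] + dist[k][j]
                    for k in subset if k != j and (prev_mask, k) in dp
                )
    full = (1 << n) - 1
    return min(dp[(full, j)] + dist[j][0] for j in range(1, n))

dist = [[0, 2, 9, 10],
        [1, 0, 6, 4],
        [15, 7, 0, 8],
        [6, 3, 12, 0]]
print(held_karp(dist))  # -> 21.0 (tour 0 -> 2 -> 3 -> 1 -> 0)
```

The exponential state space is why exact DP only scales to modest instance sizes; practical nesting/cutting solvers layer heuristics and problem structure (such as the megalopolis formulation) on top of this idea.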

  20. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    PubMed

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry. PMID:26465079

  1. Modeling a Transient Pressurization with Active Cooling Sizing Tool

    NASA Technical Reports Server (NTRS)

    Guzik, Monica C.; Plachta, David W.; Elchert, Justin P.

    2011-01-01

    As interest in the area of in-space zero boil-off cryogenic propellant storage develops, the need to visualize and quantify cryogen behavior during ventless tank self-pressurization and subsequent cool-down with active thermal control has become apparent. During the course of a mission, such as the launch ascent phase, there are periods when power to the active cooling system will be unavailable. In addition, because it is not feasible to install vacuum jackets on large propellant tanks, as is typically done for in-space cryogenic applications for science payloads, instances like the launch ascent heating phase are important to study. Numerous efforts have been made to characterize cryogenic tank pressurization during ventless cryogen storage without active cooling, but few tools exist to model this behavior in a user-friendly environment for general use, and none exist that quantify the marginal active cooling system size needed for power-down periods to manage tank pressure response once active cooling is resumed. This paper describes the Transient Pressurization with Active Cooling Tool (TACT), which is based on a ventless three-lump homogeneous thermodynamic self-pressurization model coupled with an active cooling system estimator. TACT has been designed to estimate the pressurization of a heated but unvented cryogenic tank, assuming an unavailable power period followed by a given cryocooler heat removal rate. By receiving input data on the tank material and geometry, propellant initial conditions, and passive and transient heating rates, a pressurization and recovery profile can be found, which establishes the time needed to return to a designated pressure. This provides the ability to understand the effect that launch ascent and unpowered mission segments have on the size of an active cooling system. A sample of the trends found shows that an active cooling system sized for twice the steady state heating rate would result in a reasonable time for tank
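The pressurization-and-recovery profile described above can be caricatured with a heavily simplified, hypothetical lumped model (not TACT's actual three-lump formulation): pressure rises in proportion to net heat into the tank, the cryocooler is off during the unpowered segment, and we report the time to return to the starting pressure once cooling resumes. All coefficients are invented.

```python
def pressure_profile(p0, heat_in, cooling, t_unpowered, dt=1.0, k=1e-4):
    """Euler integration of a toy lumped model, dP/dt = k * (Q_in - Q_removed)."""
    t, p, history = 0.0, p0, []
    while True:
        q_removed = cooling if t >= t_unpowered else 0.0   # cryocooler off early
        p += k * (heat_in - q_removed) * dt
        t += dt
        history.append((t, p))
        if t > t_unpowered and p <= p0:                    # recovered to target
            return t, history

t_recover, _ = pressure_profile(p0=101.0, heat_in=50.0, cooling=100.0,
                                t_unpowered=600.0)
print(t_recover)   # time until tank pressure returns to its initial value
```

With the cryocooler sized at twice the heat leak, the recovery takes about as long as the unpowered segment itself, which is the kind of sizing trade TACT is built to quantify.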

  2. Using Drosophila models of Huntington's disease as a translatable tool.

    PubMed

    Lewis, Elizabeth A; Smith, Gaynor A

    2016-05-30

    The Huntingtin (Htt) protein is essential for a wealth of intracellular signaling cascades and, when mutated, causes multifactorial dysregulation of basic cellular processes. Understanding the contribution to each of these intracellular pathways is essential for the elucidation of mechanisms that drive pathophysiology. Using appropriate models of Huntington's disease (HD) is key to finding the molecular mechanisms that contribute to neurodegeneration. While mouse models and cell lines expressing mutant Htt have been instrumental to HD research, there has been a significant contribution to our understanding of the disease from studies utilizing Drosophila melanogaster. Flies have an Htt protein, so the endogenous pathways with which it interacts are likely conserved. Transgenic flies engineered to overexpress the human mutant HTT gene display protein aggregation, neurodegeneration, behavioral deficits and a reduced lifespan. The short life span of flies, low cost of maintaining stocks and genetic tools available for in vivo manipulation make them ideal for the discovery of new genes that are involved in HD pathology. It is possible to do rapid genome-wide screens for enhancers or suppressors of the mutant Htt-mediated phenotype, expressed in specific tissues or neuronal subtypes. However, there likely remain many yet-unknown genes that modify disease progression, which could be found through additional screening approaches using the fly. Importantly, there have been instances where genes discovered in Drosophila have been translated to HD mouse models. PMID:26241927

  3. Modelling of cutting tool - soil interaction - part I: contact behaviour

    NASA Astrophysics Data System (ADS)

    Nardin, A.; Zavarise, G.; Schrefler, B. A.

    The unknown interaction of cutting tools with geological settings represents an interesting problem for the excavation machinery industry. To simplify the non-linear aspects involved in the numerical analysis of such phenomena, a strategy for accurate soil modelling has to be defined. A possible approach is the discrete one, which considers the soil as an assembly of rigid spheres; in this work this strategy is adopted. The basic idea is to concentrate the real mechanical behaviour of the soil at the contact level between the spheres. For this purpose suitable contact models have been developed, in which specific elasto-plastic laws have been implemented in the node-to-segment contact formulation. The framework for the plastic behaviour consists of a failure criterion, a one-dimensional, rate-independent elasto-plastic flow rule for the normal and the tangential force, and a non-linear yield criterion. The final aim of this paper is to develop mechanical models to study the behaviour of stiff soils and rocks under different loading conditions.
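A one-dimensional, rate-independent elasto-plastic law of the general shape the abstract describes can be sketched with the standard predictor-corrector (return-mapping) update: an elastic trial force, a yield limit, and projection back onto the yield surface with accumulation of plastic displacement. The stiffness, yield force, and loading history below are invented for illustration and are not the paper's actual contact law.

```python
def contact_force_history(displacements, k=100.0, f_yield=5.0):
    """Return the contact force at each imposed relative displacement."""
    plastic = 0.0                            # accumulated plastic displacement
    forces = []
    for u in displacements:
        f_trial = k * (u - plastic)          # elastic predictor
        if abs(f_trial) > f_yield:           # plastic corrector (return map)
            sign = 1.0 if f_trial > 0 else -1.0
            plastic += (abs(f_trial) - f_yield) * sign / k
            f_trial = sign * f_yield
        forces.append(f_trial)
    return forces

forces = contact_force_history([0.02, 0.06, 0.10, 0.06])
print(forces)  # force caps at the yield limit, then unloads elastically
```

Note the hallmark of plasticity in the last step: on unloading, the force does not retrace the loading curve, because plastic displacement has accumulated at the contact.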

  4. Advances in modeling and simulation of vacuum electronic devices

    SciTech Connect

    Antonsen, T.M. Jr.; Mondelli, A.A.; Levush, B.; Verboncoeur, J.P.; Birdsall, C.K.

    1999-05-01

    Recent advances in the modeling and simulation of vacuum electronic devices are reviewed. Design of these devices makes use of a variety of physical models and numerical code types. Progress in the development of these models and codes is outlined and illustrated with specific examples. The state of the art in device simulation is evolving to the point such that devices can be designed on the computer, thereby eliminating many trial and error fabrication and test steps. The role of numerical simulation in the design process can be expected to grow further in the future.

  5. Advances in Omics and Bioinformatics Tools for Systems Analyses of Plant Functions

    PubMed Central

    Mochida, Keiichi; Shinozaki, Kazuo

    2011-01-01

    Omics and bioinformatics are essential to understanding the molecular systems that underlie various plant functions. Recent game-changing sequencing technologies have revitalized sequencing approaches in genomics and have produced opportunities for various emerging analytical applications. Driven by technological advances, several new omics layers such as the interactome, epigenome and hormonome have emerged. Furthermore, in several plant species, the development of omics resources has progressed to address particular biological properties of individual species. Integration of knowledge from omics-based research is an emerging issue as researchers seek to identify significance, gain biological insights and promote translational research. From these perspectives, we provide this review of the emerging aspects of plant systems research based on omics and bioinformatics analyses together with their associated resources and technological advances. PMID:22156726

  6. Portfolio use as a tool to demonstrate professional development in advanced nursing practice.

    PubMed

    Hespenheide, Molly; Cottingham, Talisha; Mueller, Gail

    2011-01-01

    A concrete way of recognizing and rewarding clinical leadership, excellence in practice, and personal and professional development of the advanced practice registered nurse (APRN) is lacking in the literature and healthcare institutions in the United States. This article presents the process of developing and evaluating a professional development program designed to address this gap. The program uses APRN Professional Performance Standards, Relationship-Based Care, and the Magnet Forces as a guide and theoretical base. A key tenet of the program is the creation of a professional portfolio. Narrative reflections are included that illustrate the convergence of theories. A crosswalk supports this structure, guides portfolio development, and operationalizes the convergence of theories as they specifically relate to professional development in advanced practice. Implementation of the program has proven to be challenging and rewarding. Feedback from APRNs involved in the program supports program participation as a meaningful method to recognize excellence in advanced practice and a clear means to foster ongoing professional growth and development. PMID:22016019

  7. Implementing an HL7 version 3 modeling tool from an Ecore model.

    PubMed

    Bánfai, Balázs; Ulrich, Brandon; Török, Zsolt; Natarajan, Ravi; Ireland, Tim

    2009-01-01

    One of the main challenges of achieving interoperability using the HL7 V3 healthcare standard is the lack of clear definition and supporting tools for modeling, testing, and conformance checking. Currently, the knowledge defining the modeling is scattered around in MIF schemas, tools and specifications or simply with the domain experts. Modeling core HL7 concepts, constraints, and semantic relationships in Ecore/EMF encapsulates the domain-specific knowledge in a transparent way while unifying Java, XML, and UML in an abstract, high-level representation. Moreover, persisting and versioning the core HL7 concepts as a single Ecore context allows modelers and implementers to create, edit and validate message models against a single modeling context. The solution discussed in this paper is implemented in the new HL7 Static Model Designer as an extensible toolset integrated as a standalone Eclipse RCP application. PMID:19745289

  8. A novel cell culture model as a tool for forensic biology experiments and validations.

    PubMed

    Feine, Ilan; Shpitzen, Moshe; Roth, Jonathan; Gafny, Ron

    2016-09-01

    To improve and advance DNA forensic casework investigation outcomes, extensive field and laboratory experiments are carried out in a broad range of relevant branches, such as touch and trace DNA, secondary DNA transfer and contamination confinement. Moreover, the development of new forensic tools, for example new sampling appliances, by commercial companies requires ongoing validation and assessment by forensic scientists. A frequent challenge in these kinds of experiments and validations is the lack of a stable, reproducible and flexible biological reference material. As a possible solution, we present here a cell culture model based on skin-derived human dermal fibroblasts. Cultured cells were harvested, quantified and dried on glass slides. These slides were used in adhesive tape-lifting experiments and tests of DNA crossover confinement by UV irradiation. The use of this model enabled a simple and concise comparison between four adhesive tapes, as well as a straightforward demonstration of the effect of UV irradiation intensities on DNA quantity and degradation. In conclusion, we believe this model has great potential to serve as an efficient research tool in forensic biology. PMID:27376694

  9. Detection and diagnosis of bearing and cutting tool faults using hidden Markov models

    NASA Astrophysics Data System (ADS)

    Boutros, Tony; Liang, Ming

    2011-08-01

    Over the last few decades, research into new fault detection and diagnosis techniques for machining processes and rotating machinery has attracted increasing interest worldwide. This development was mainly stimulated by the rapid advance in industrial technologies and the increase in complexity of machining and machinery systems. In this study, the discrete hidden Markov model (HMM) is applied to detect and diagnose mechanical faults. The technique is tested and validated successfully using two scenarios: tool wear/fracture and bearing faults. In the first case the model correctly detected the state of the tool (i.e., sharp, worn, or broken), whereas in the second application the model classified the severity of the fault seeded in two different engine bearings. The success rate obtained in our tests for fault severity classification was above 95%. In addition to the fault severity, a location index was developed to determine the fault location. This index has been applied to determine the location (inner race, ball, or outer race) of a bearing fault with an average success rate of 96%. The training time required to develop the HMMs was less than 5 s in both monitoring cases.
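A common way to use discrete HMMs for condition monitoring, consistent with the setup the abstract describes, is to keep one model per condition and label a new observation sequence by whichever model assigns it the highest likelihood via the forward algorithm. The sketch below is illustrative only: the two-state models, symbol alphabet, and parameters are invented, not the paper's.

```python
def forward_likelihood(obs, pi, A, B):
    """P(obs | model) for a discrete HMM (pi: initial, A: transition, B: emission)."""
    alpha = [pi[s] * B[s][obs[0]] for s in range(len(pi))]
    for o in obs[1:]:
        alpha = [sum(alpha[s] * A[s][t] for s in range(len(pi))) * B[t][o]
                 for t in range(len(pi))]
    return sum(alpha)

# Two hypothetical 2-state HMMs over a binary vibration feature (0=low, 1=high):
sharp = dict(pi=[0.9, 0.1],
             A=[[0.95, 0.05], [0.30, 0.70]],
             B=[[0.8, 0.2], [0.4, 0.6]])
worn  = dict(pi=[0.2, 0.8],
             A=[[0.50, 0.50], [0.10, 0.90]],
             B=[[0.3, 0.7], [0.1, 0.9]])

obs = [1, 1, 0, 1, 1]            # mostly high vibration
models = {"sharp": sharp, "worn": worn}
label = max(models, key=lambda m: forward_likelihood(obs, **models[m]))
print(label)  # -> worn
```

In practice the per-class models are trained with Baum-Welch on labeled vibration features, and long sequences use log-space or scaled forward recursions to avoid underflow.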

  10. Collaborative platform, tool-kit, and physical models for DfM

    NASA Astrophysics Data System (ADS)

    Neureuther, Andy; Poppe, Wojtek; Holwill, Juliet; Chin, Eric; Wang, Lynn; Yang, Jae-Seok; Miller, Marshal; Ceperley, Dan; Clifford, Chris; Kikuchi, Koji; Choi, Jihong; Dornfeld, Dave; Friedberg, Paul; Spanos, Costas; Hoang, John; Chang, Jane; Hsu, Jerry; Graves, David; Wu, Alan C. F.; Lieberman, Mike

    2007-03-01

    Exploratory prototype DfM tools, methodologies and emerging physical process models are described. The examples include new platforms for collaboration on process/device/circuits, visualization and quantification of manufacturing effects at the mask layout level, and advances toward fast-CAD models for lithography, CMP, etch and photomasks. The examples have evolved from research supported over the last several years by DARPA, SRC, industry, and the State of California U.C. Discovery Program. DfM tools must enable complexity management with very fast, first-cut-accurate models across process, device and circuit performance, with new modes of collaboration. Collaborations can be promoted by supporting simultaneous views in naturally intuitive parameters for each contributor. An important theme is to shift the viewpoint of the statistical variation in timing and power upstream, from gate-level CD distributions to a more deterministic set of sources of variation in characterized processes. Many of these nonidealities of manufacturing can be expressed at the mask plane in terms of lateral impact functions to capture effects not included in design rules. Pattern Matching and Perturbation Formulations are shown to be well suited for quantifying these sources of variation.

  11. The SMOKE-FIREPLUME model: a tool for eventual application to prescribed burns and wildland fires.

    SciTech Connect

    Brown, D. F.; Dunn, W. E.; Lazaro, M. A.; Policastro, A. J.

    1999-08-17

    Land managers are increasingly implementing strategies that employ fire in prescribed burns to sustain ecosystems, and they plan to sustain the rate of increase in its use over the next five years. In planning and executing expanded use of fire in wildland treatment, it is important to estimate the human health and safety consequences, property damage, and extent of visibility degradation from the pyrolysis gases, soot, and smoke generated during flaming, smoldering, and/or glowing fires. Traditional approaches have often relied on the analysis of weather observations and forecasts to determine whether a prescribed burn will affect populations, property, or protected Class I areas. However, the complexity of the problem lends itself to advanced, easy-to-use PC-based models for calculating both the emissions from the burning of wildland fuels and the downwind dispersion of smoke and other products of pyrolysis, distillation, and/or fuel combustion. Such models must address the effects of residual smoldering combustion, including plume dynamics and optical effects. In this paper, we discuss a suite of tools that can be applied for analyzing dispersion: the dispersion models FIREPLUME and SMOKE, together with the meteorological preprocessor SEBMET.

  12. Advanced computer techniques for inverse modeling of electric current in cardiac tissue

    SciTech Connect

    Hutchinson, S.A.; Romero, L.A.; Diegert, C.F.

    1996-08-01

    For many years, ECGs and vector cardiograms have been the tools of choice for non-invasive diagnosis of cardiac conduction problems, such as those found in reentrant tachycardia or Wolff-Parkinson-White (WPW) syndrome. Through skillful analysis of these skin-surface measurements of cardiac-generated electric currents, a physician can deduce the general location of heart conduction irregularities. Using a combination of high-fidelity geometry modeling, advanced mathematical algorithms, and massively parallel computing, Sandia's approach would provide much more accurate information and thus allow the physician to pinpoint the source of an arrhythmia or abnormal conduction pathway.
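    Mapping skin-surface potentials back to cardiac sources is a classic ill-posed inverse problem. A hedged sketch of one standard stabilization, Tikhonov-regularized least squares; the forward (lead-field) matrix K, the regularization weight, and the symbols are illustrative assumptions, since the abstract does not specify Sandia's actual algorithm.

```python
import numpy as np

def tikhonov_solve(K, b, lam):
    """Regularized least squares: minimize ||K x - b||^2 + lam^2 ||x||^2.

    Solving the normal equations (K^T K + lam^2 I) x = K^T b damps the
    noise amplification inherent in inverting an ill-conditioned forward map
    from sources x to measured surface potentials b."""
    n = K.shape[1]
    return np.linalg.solve(K.T @ K + lam**2 * np.eye(n), K.T @ b)
```

    With lam = 0 this reduces to ordinary least squares; increasing lam trades fidelity to the measurements for a smaller, more stable source estimate.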

  13. The TEF modeling and analysis approach to advance thermionic space power technology

    NASA Astrophysics Data System (ADS)

    Marshall, Albert C.

    1997-01-01

    Thermionic space power systems have been proposed as advanced power sources for future space missions that require electrical power levels significantly above the capabilities of current space power systems. The Defense Special Weapons Agency's (DSWA) Thermionic Evaluation Facility (TEF) is carrying out both experimental and analytical research to advance thermionic space power technology to meet this expected need. A Modeling and Analysis (M&A) project has been created at the TEF to develop analysis tools, evaluate concepts, and guide research. M&A activities are closely linked to the TEF experimental program, providing experiment support and using experimental data to validate models. A planning exercise has been completed for the M&A project, and a strategy for implementation was developed. All M&A activities will build on a framework provided by a system performance model for a baseline Thermionic Fuel Element (TFE) concept. The system model is composed of sub-models for each of the system components and sub-systems. Additional thermionic component options and model improvements will continue to be incorporated into the basic system model during the course of the program. All tasks are organized into four focus areas: 1) system models, 2) thermionic research, 3) alternative concepts, and 4) documentation and integration. The M&A project will provide a solid framework for future thermionic system development.

  14. Processors, Pipelines, and Protocols for Advanced Modeling Networks

    NASA Technical Reports Server (NTRS)

    Coughlan, Joseph; Komar, George (Technical Monitor)

    2001-01-01

    Predictive capabilities arise from our understanding of natural processes and our ability to construct models that accurately reproduce these processes. Although our modeling state-of-the-art is primarily limited by existing computational capabilities, other technical areas will soon present obstacles to the development and deployment of future predictive capabilities. Advancement of our modeling capabilities will require not only faster processors, but new processing algorithms, high-speed data pipelines, and a common software engineering framework that allows networking of diverse models that represent the many components of Earth's climate and weather system. Development and integration of these new capabilities will pose serious challenges to the Information Systems (IS) technology community. Designers of future IS infrastructures must deal with issues that include performance, reliability, interoperability, portability of data and software, and ultimately, the full integration of various ES model systems into a unified ES modeling network.

  15. ADVANCED ELECTRIC AND MAGNETIC MATERIAL MODELS FOR FDTD ELECTROMAGNETIC CODES

    SciTech Connect

    Poole, B R; Nelson, S D; Langdon, S

    2005-05-05

    The modeling of dielectric and magnetic materials in the time domain is required for pulse power applications, pulsed induction accelerators, and advanced transmission lines. For example, most induction accelerator modules require the use of magnetic materials to provide adequate Volt-sec during the acceleration pulse. These models require hysteresis and saturation to simulate the saturation wavefront in a multipulse environment. In high voltage transmission line applications such as shock or soliton lines the dielectric is operating in a highly nonlinear regime, which require nonlinear models. Simple 1-D models are developed for fast parameterization of transmission line structures. In the case of nonlinear dielectrics, a simple analytic model describing the permittivity in terms of electric field is used in a 3-D finite difference time domain code (FDTD). In the case of magnetic materials, both rate independent and rate dependent Hodgdon magnetic material models have been implemented into 3-D FDTD codes and 1-D codes.
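    The Volt-second requirement mentioned above follows from Faraday's law: an induction-cell core can absorb ∫V dt = N·A·ΔB of applied volt-seconds before its flux swing is exhausted. The sketch below uses a single-valued tanh saturation curve as a deliberately simplified stand-in for the Hodgdon hysteresis models named in the abstract; the saturation flux density and relative permeability are assumed values.

```python
import math

B_SAT = 1.5            # assumed saturation flux density, tesla
MU0 = 4e-7 * math.pi   # permeability of free space, H/m

def b_of_h(h, mu_r=5000.0):
    """Single-valued saturating B-H curve: linear (slope mu0*mu_r) at small H,
    flattening toward B_SAT. A stand-in, NOT the Hodgdon hysteresis model."""
    return B_SAT * math.tanh(MU0 * mu_r * h / B_SAT)

def volt_seconds(core_area_m2, n_turns=1, b_swing=2 * B_SAT):
    """Volt-second capacity of a core: integral of V dt = N * A * dB,
    taking the full flux swing from -B_SAT to +B_SAT by default."""
    return n_turns * core_area_m2 * b_swing
```

    A multipulse simulation would replace the single-valued curve with a state-carrying hysteresis update so that the saturation wavefront and reset between pulses are captured.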

  16. Testing and Implementation of Advanced Reynolds Stress Models

    NASA Technical Reports Server (NTRS)

    Speziale, Charles G.

    1997-01-01

    A research program was proposed for the testing and implementation of advanced turbulence models for non-equilibrium turbulent flows of aerodynamic importance to NASA. Turbulence models being developed in connection with the ONR ARI on non-equilibrium turbulence were provided for implementation and testing in aerodynamic flows at NASA Langley Research Center. Close interactions were established with researchers at NASA Langley, and refinements to the models were made based on the results of these tests. The models that have been considered include two-equation models with an anisotropic eddy viscosity as well as full second-order closures. Three types of non-equilibrium corrections to the models were considered in connection with the ARI on non-equilibrium turbulence conducted for ONR.

  17. Measurement and modeling of advanced coal conversion processes, Volume III

    SciTech Connect

    Ghani, M.U.; Hobbs, M.L.; Hamblen, D.G.

    1993-08-01

    A generalized one-dimensional, heterogeneous, steady-state, fixed-bed model for coal gasification and combustion is presented. The model, FBED-1, is a design and analysis tool that can be used to simulate a variety of gasification, devolatilization, and combustion processes. The model considers separate gas and solid temperatures; axially variable solid and gas flow rates; variable bed void fraction; coal drying; devolatilization based on chemical functional group composition, depolymerization, vaporization, and crosslinking; oxidation and gasification of char; and partial equilibrium in the gas phase.
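    As a minimal illustration of the kind of axial profile such a fixed-bed model produces, the toy below integrates first-order char conversion in plug flow with forward Euler; the rate constant, gas velocity, and bed length are invented numbers, and FBED-1's actual coupled heterogeneous balances are far richer.

```python
def char_conversion_profile(k=2.0, u=0.5, length=1.0, steps=1000):
    """Axial char conversion X(z) for first-order kinetics in plug flow:
    u dX/dz = k (1 - X), integrated with forward Euler from X(0) = 0.
    A toy stand-in for FBED-1's coupled mass and energy balances."""
    dz = length / steps
    x = 0.0
    profile = [x]
    for _ in range(steps):
        x += dz * k * (1.0 - x) / u
        profile.append(x)
    return profile
```

    The analytic solution is X(z) = 1 - exp(-k z / u), so with these numbers conversion at the bed exit approaches 1 - exp(-4) ≈ 0.98; the numerical profile should track that closely.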

  18. ADVANCEMENT OF NUCLEIC ACID-BASED TOOLS FOR MONITORING IN SITU REDUCTIVE DECHLORINATION

    SciTech Connect

    Vangelas, K.; Edwards, E.; Loffler, F.; Looney, B.

    2006-11-17

    Regulatory protocols generally recognize that destructive processes are the most effective mechanisms supporting natural attenuation of chlorinated solvents. In many cases, these destructive processes will be biological and, for chlorinated compounds, will often be reductive processes that occur under anaerobic conditions. The existing EPA guidance (EPA, 1998) provides a list of parameters that give indirect evidence of reductive dechlorination processes. In an effort to gather direct evidence of these processes, scientists have identified key microorganisms and are currently developing tools to measure the abundance and activity of these organisms in subsurface systems. Drs. Edwards and Loffler are two recognized leaders in this field. The research described herein continues their development efforts to provide a suite of tools enabling direct measurement of biological processes related to the reductive dechlorination of TCE and PCE. This study investigated the strengths and weaknesses of the 16S rRNA gene-based approach to characterizing the natural attenuation capabilities in samples. The results suggested that an approach based solely on 16S rRNA may not provide sufficient information to document the natural attenuation capabilities of a system because it does not distinguish between strains of organisms that have different biodegradation capabilities. The results of the investigations provided evidence that tools focusing on relevant enzymes for functionally desired characteristics may be useful adjuncts to the 16S rRNA methods.

  19. From beginners to trained users: an advanced tool to guide experimenters in basic applied fluorescence

    NASA Astrophysics Data System (ADS)

    Pingand, Philippe B.; Lerner, Dan A.

    1993-05-01

    UPY-F is a software package dedicated to resolving queries raised by end-users of spectrofluorimeters when they encounter a problem in the course of an experiment. Its main goal is to provide a diagnostic for non-pertinent use of a spectrofluorimeter. Many artifacts can mislead the operator, and except for experts, simple manipulation of a fluorimeter's controls produces effects that are not always fully appreciated. The solution retained is an association between a powerful hypermedia tool and an expert system. A straight expert system offers a number of well-known advantages, but it is not well accepted by users because of the many moves required between the spectrofluorimeter and the diagnostic tool. In our hypermedia tool, knowledge is displayed through visual concepts that the user can browse and navigate, so the user still perceives the problem as a whole, which may not be the case with a straight expert system. We demonstrate typical situations in which an event triggers a chain of reasoning leading to the debugging of the problem. The system is not only meant to help a beginner but can adapt itself to guide a well-trained experimenter. We think that its functionality and user-friendly interface are very attractive and open new vistas in the way future users may be trained, whether they work in research labs or industrial settings, notably by cutting down the time spent on their training.

  20. A crowdsourcing model for creating preclinical medical education study tools.

    PubMed

    Bow, Hansen C; Dattilo, Jonathan R; Jonas, Andrea M; Lehmann, Christoph U

    2013-06-01

    During their preclinical course work, medical students must memorize and recall substantial amounts of information. Recent trends in medical education emphasize collaboration through team-based learning. In the technology world, the trend toward collaboration has been characterized by the crowdsourcing movement. In 2011, the authors developed an innovative approach to team-based learning that combined students' use of flashcards to master large volumes of content with a crowdsourcing model, using a simple informatics system to enable those students to share in the effort of generating concise, high-yield study materials. The authors used Google Drive and developed a simple Java software program that enabled students to simultaneously access and edit sets of questions and answers in the form of flashcards. Through this crowdsourcing model, medical students in the class of 2014 at the Johns Hopkins University School of Medicine created a database of over 16,000 questions that corresponded to the Genes to Society basic science curriculum. An analysis of exam scores revealed that students in the class of 2014 outperformed those in the class of 2013, who did not have access to the flashcard system, and a survey of students demonstrated that users were generally satisfied with the system and found it a valuable study tool. In this article, the authors describe the development and implementation of their crowdsourcing model for creating study materials, emphasize its simplicity and user-friendliness, describe its impact on students' exam performance, and discuss how students in any educational discipline could implement a similar model of collaborative learning. PMID:23619061
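    At its core, the crowdsourcing mechanic reduces to merging card sets edited concurrently by many students. A minimal sketch; the deck representation and the last-write-wins conflict rule below are assumptions for illustration, whereas the actual system used Google Drive and a custom Java program.

```python
def merge_decks(*decks):
    """Union-merge flashcard decks edited by different students.

    Each deck maps question -> (answer, edit_timestamp). Cards unique to any
    deck are kept; for cards with the same question, the most recently
    edited answer wins (a simple last-write-wins conflict rule)."""
    merged = {}
    for deck in decks:
        for question, (answer, ts) in deck.items():
            if question not in merged or ts > merged[question][1]:
                merged[question] = (answer, ts)
    return merged
```

    Running this over each student's contributions yields one consolidated deck, which is essentially what the shared-document approach achieved interactively.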

  1. Advanced geothermal hydraulics model -- Phase 1 final report, Part 2

    SciTech Connect

    W. Zheng; J. Fu; W. C. Maurer

    1999-07-01

    An advanced geothermal well hydraulics model (GEODRIL) is being developed to accurately calculate bottom-hole conditions in hot geothermal wells. In Phase 1, real-time monitoring and other improvements were added to GEODRIL. In Phase 2, GEODRIL will be integrated into Marconi's Intelligent Drilling Monitor (IDM), which will use artificial intelligence to detect lost circulation, fluid influxes, and other circulation problems in geothermal wells. This software platform has the potential to significantly reduce geothermal drilling costs.

  2. Evaluation of ADAM/1 model for advanced coal extraction concepts

    NASA Technical Reports Server (NTRS)

    Deshpande, G. K.; Gangal, M. D.

    1982-01-01

    Several existing computer programs for estimating life cycle cost of mining systems were evaluated. A commercially available program, ADAM/1 was found to be satisfactory in relation to the needs of the advanced coal extraction project. Two test cases were run to confirm the ability of the program to handle nonconventional mining equipment and procedures. The results were satisfactory. The model, therefore, is recommended to the project team for evaluation of their conceptual designs.

  3. Tools and Models for Integrating Multiple Cellular Networks

    SciTech Connect

    Gerstein, Mark

    2015-11-06

    In this grant, we have systematically investigated the integrated networks responsible for coordinating activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open-source, available to download from GitHub, and can be incorporated into the Knowledgebase. We summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network, and the algorithm is generally applicable to regulatory networks in prokaryotes, yeast, and higher organisms. Integrated datasets are extremely beneficial for understanding the biology of a system in a compact manner because they conflate multiple layers of information. Therefore, for Aim 2 of this grant, we developed several tools and carried out analyses for integrating system-wide genomic information. To make use of structural data, we developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we organized the E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies and correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed
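    The hierarchy idea in Aim 1 can be sketched as a level assignment on a regulatory DAG: top regulators with no incoming edges sit at level 0, and each target sits one level below its highest-level regulator, so regulatory information flows downward. The level-assignment below is an illustrative sketch, not the published algorithm [1].

```python
def hierarchy_levels(edges):
    """Assign hierarchy levels to nodes of an acyclic regulatory network.

    edges: iterable of (regulator, target) pairs forming a DAG.
    Level 0 = nodes with no regulators; every other node sits at
    1 + the maximum level among its regulators."""
    nodes = {n for e in edges for n in e}
    preds = {n: set() for n in nodes}
    for src, dst in edges:
        preds[dst].add(src)
    levels = {}
    # repeatedly place nodes whose regulators have all been placed
    while len(levels) < len(nodes):
        for n in nodes:
            if n not in levels and all(p in levels for p in preds[n]):
                levels[n] = 1 + max((levels[p] for p in preds[n]), default=-1)
    return levels
```

    On a cascade such as tf1 → tf2 → g1 (with tf1 also regulating g1 directly), tf1 lands at level 0, tf2 at level 1, and g1 at level 2, matching the intuition of master regulators at the top and workhorse genes at the bottom.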

  4. Enabling Analytical and Modeling Tools for Enhanced Disease Surveillance

    SciTech Connect

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes it can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both the integration of multiple data sources and the development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application enforced need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack.
The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating between
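    The first outbreak-recognition idea, scanning counts for anomalous clusters, can be illustrated with a one-sided CUSUM detector over daily case counts. This is a simple temporal stand-in for the space-time cluster search described above, and the baseline, reference value k, and threshold h below are assumed tuning parameters.

```python
def cusum_alarm(counts, baseline, k=0.5, h=4.0):
    """One-sided CUSUM over daily case counts.

    Accumulates the excess of each day's count over (baseline + k),
    resetting at zero, and raises an alarm when the accumulated excess
    crosses threshold h. Returns the first alarm day index, or None."""
    s = 0.0
    for day, c in enumerate(counts):
        s = max(0.0, s + (c - baseline - k))
        if s > h:
            return day
    return None
```

    With a baseline of 2.5 cases/day, a run of days at 8-10 cases pushes the statistic over threshold within a day or two, while ordinary fluctuation around the baseline never accumulates.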

  5. Development of an innovative spacer grid model utilizing computational fluid dynamics within a subchannel analysis tool

    NASA Astrophysics Data System (ADS)

    Avramova, Maria

    In the past few decades the need for improved nuclear reactor safety analyses has led to a rapid development of advanced methods for multidimensional thermal-hydraulic analyses. These methods have become progressively more complex in order to account for the many physical phenomena anticipated during steady state and transient Light Water Reactor (LWR) conditions. The advanced thermal-hydraulic subchannel code COBRA-TF (Thurgood, M. J. et al., 1983) is used worldwide for best-estimate evaluations of the nuclear reactor safety margins. In the framework of a joint research project between the Pennsylvania State University (PSU) and AREVA NP GmbH, the theoretical models and numerics of COBRA-TF have been improved. Under the name F-COBRA-TF, the code has been subjected to an extensive verification and validation program and has been applied to variety of LWR steady state and transient simulations. To enable F-COBRA-TF for industrial applications, including safety margins evaluations and design analyses, the code spacer grid models were revised and substantially improved. The state-of-the-art in the modeling of the spacer grid effects on the flow thermal-hydraulic performance in rod bundles employs numerical experiments performed by computational fluid dynamics (CFD) calculations. Because of the involved computational cost, the CFD codes cannot be yet used for full bundle predictions, but their capabilities can be utilized for development of more advanced and sophisticated models for subchannel-level analyses. A subchannel code, equipped with improved physical models, can be then a powerful tool for LWR safety and design evaluations. The unique contributions of this PhD research are seen as development, implementation, and qualification of an innovative spacer grid model by utilizing CFD results within a framework of a subchannel analysis code. Usually, the spacer grid models are mostly related to modeling of the entrainment and deposition phenomena and the heat

  6. The Advanced Light Source: A new tool for research in atomic and molecular physics

    SciTech Connect

    Schlachter, F.; Robinson, A.

    1991-04-01

    The Advanced Light Source at the Lawrence Berkeley Laboratory will be the world's brightest synchrotron radiation source in the extreme ultraviolet and soft x-ray regions of the spectrum when it begins operation in 1993. It will be available as a national user facility to researchers in a broad range of disciplines, including materials science, atomic and molecular physics, chemistry, biology, imaging, and technology. The high brightness of the ALS will be particularly well suited to high-resolution studies of tenuous targets, such as excited atoms, ions, and clusters. 13 figs., 4 tabs.

  7. Biomorphodynamic modelling of inner bank advance in migrating meander bends

    NASA Astrophysics Data System (ADS)

    Zen, Simone; Zolezzi, Guido; Toffolon, Marco; Gurnell, Angela M.

    2016-07-01

    We propose a bio-morphodynamic model at bend cross-sectional scale for the lateral migration of river meander bends, where the two banks can migrate separately as a result of the mutual interaction between river flow, sediments and riparian vegetation, particularly at the interface between the permanently wet channel and the advancing floodplain. The model combines a non-linear analytical model for the morphodynamic evolution of the channel bed, a quasi-1D model to account for flow unsteadiness, and an ecological model describing riparian vegetation dynamics. Simplified closures are included to estimate the feedbacks among vegetation, hydrodynamics and sediment transport, which affect the morphology of the river-floodplain system. Model tests reveal the fundamental role of riparian plants in generating bio-morphological patterns at the advancing floodplain margin. Importantly, they provide insight into the biophysical controls of the 'bar push' mechanism and into its role in the lateral migration of meander bends and in the temporal variations of the active channel width.
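    A toy cross-sectional caricature of the coupled dynamics: vegetation establishes on the inner bar, the inner bank advances at a rate proportional to vegetation density (the 'bar push'), and the outer bank retreats by erosion. All equations and rates below are illustrative assumptions, not the paper's model.

```python
def meander_migration(years=50, dt=0.1, erosion=0.8, push=1.0, growth=0.5):
    """Toy bend cross-section model (illustrative assumptions only).

    Riparian vegetation density v (0..1) on the inner bar grows
    logistically; the inner bank advances at rate push*v ('bar push');
    the outer bank retreats at a fixed erosion rate. Positions are in
    metres per year units. Returns the channel-width history."""
    inner, outer, v = 0.0, 100.0, 0.05
    widths = []
    steps = round(years / dt)
    for _ in range(steps):
        v += dt * growth * v * (1.0 - v)   # vegetation colonization
        inner += dt * push * v             # inner bank advance (bar push)
        outer += dt * erosion              # outer bank retreat
        widths.append(outer - inner)
    return widths
```

    Even this caricature reproduces one qualitative behavior discussed in the paper: while vegetation is sparse the channel widens (erosion outpaces inner-bank advance), and once vegetation matures the advancing floodplain narrows the channel again.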

  8. SMOKE TOOL FOR MODELS-3 VERSION 4.1 STRUCTURE AND OPERATION DOCUMENTATION

    EPA Science Inventory

    The SMOKE Tool is a part of the Models-3 system, a flexible software system designed to simplify the development and use of air quality models and other environmental decision support tools. The SMOKE Tool is an input processor for SMOKE, (Sparse Matrix Operator Kernel Emissio...

  9. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    SciTech Connect

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  10. Tools for Model Building and Optimization into Near-Atomic Resolution Electron Cryo-Microscopy Density Maps.

    PubMed

    DiMaio, F; Chiu, W

    2016-01-01

    Electron cryo-microscopy (cryoEM) has advanced dramatically to become a viable tool for high-resolution structural biology research. The ultimate outcome of a cryoEM study is an atomic model of a macromolecule or its complex with interacting partners. This chapter describes a variety of algorithms and software to build a de novo model based on the cryoEM 3D density map, to optimize the model with the best stereochemistry restraints and finally to validate the model with proper protocols. The full process of atomic structure determination from a cryoEM map is described. The tools outlined in this chapter should prove extremely valuable in revealing atomic interactions guided by cryoEM data. PMID:27572730

  11. The DSET Tool Library: A software approach to enable data exchange between climate system models

    SciTech Connect

    McCormick, J.

    1994-12-01

    Climate modeling is a computationally intensive process. Until recently, computers were not powerful enough to perform the complex calculations required to simulate the earth's climate. As a result, standalone programs were created to represent components of the earth's climate (e.g., an atmospheric circulation model). However, recent advances in computing, including massively parallel computing, make it possible to couple the components to form a complete earth climate simulation. The ability to couple different climate model components will significantly improve our ability to predict climate accurately and reliably. Historically, each major component of the coupled earth simulation is a standalone program designed independently, with different coordinate systems and data representations. For two component models to be coupled, the data of one model must be mapped to the coordinate system of the second model. The focus of this project is to provide a general tool to facilitate the mapping of data between simulation components, with an emphasis on using object-oriented programming techniques to provide polynomial interpolation, line and area weighting, and aggregation services.
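    One of the mapping services described, area weighting, can be sketched in one dimension: a conservative remap averages source-cell values weighted by their overlap with each destination cell, so the integral of the field is preserved across grids. The grids and values below are illustrative; the DSET library itself is object-oriented and also provides polynomial interpolation and aggregation.

```python
def remap_area_weighted(src_edges, src_vals, dst_edges):
    """Conservative ('area-weighted') remap of a piecewise-constant field
    between two 1-D grids given by sorted cell-edge coordinates.

    Each destination cell takes the average of the source values weighted
    by overlap length, so total quantity is conserved when the grids span
    the same interval."""
    out = []
    for d in range(len(dst_edges) - 1):
        lo, hi = dst_edges[d], dst_edges[d + 1]
        acc = 0.0
        for s in range(len(src_edges) - 1):
            overlap = min(hi, src_edges[s + 1]) - max(lo, src_edges[s])
            if overlap > 0:
                acc += overlap * src_vals[s]
        out.append(acc / (hi - lo))
    return out
```

    Coupling an atmosphere grid to an ocean grid reduces, cell by cell, to exactly this kind of overlap bookkeeping, generalized to two dimensions and spherical geometry.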

  12. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Garcia, Jessica; Beers, Benjamin; Philips, Alan; Holt, James B.; Threet, Grady E., Jr.

    2013-01-01

    The Earth to Orbit (ETO) Team of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the preeminent group for pre-phase A and phase A concept definition. The ACO team has been at the forefront of a multitude of launch vehicle studies determining the future direction of the Agency as a whole, due in part to its rapid turnaround time in analyzing concepts and its ability to cover broad trade spaces of vehicles in that limited timeframe. Each completed vehicle concept includes a full mass breakdown of the vehicle down to tertiary subsystem components, along with a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta-v capability. Additionally, a structural analysis of the vehicle based on material properties and geometries is performed, as well as an analysis to determine the flight loads based on the trajectory outputs. As mentioned, the ACO Earth to Orbit Team prides itself on rapid turnaround and often must fulfill customer requests within a limited schedule or on little advance notice. Working in this fast-paced environment, the ETO team has developed finely honed skills and methods to maximize its delivery capability and meet customer needs. This paper describes the interfaces between the three primary disciplines used in the design process (weights and sizing, trajectory, and structural analysis) as well as the approach each discipline employs to streamline its piece of the design process.

  13. Advancing Collaboration through Hydrologic Data and Model Sharing

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Castronova, A. M.; Miles, B.; Li, Z.; Morsy, M. M.

    2015-12-01

    HydroShare is an online, collaborative system for open sharing of hydrologic data, analytical tools, and models. It supports the sharing of and collaboration around "resources" which are defined primarily by standardized metadata, content data models for each resource type, and an overarching resource data model based on the Open Archives Initiative's Object Reuse and Exchange (OAI-ORE) standard and a hierarchical file packaging system called "BagIt". HydroShare expands the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated to include geospatial and multidimensional space-time datasets commonly used in hydrology. HydroShare also includes new capability for sharing models, model components, and analytical tools and will take advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. It also supports web services and server/cloud based computation operating on resources for the execution of hydrologic models and analysis and visualization of hydrologic data. HydroShare uses iRODS as a network file system for underlying storage of datasets and models. Collaboration is enabled by casting datasets and models as "social objects". Social functions include both private and public sharing, formation of collaborative groups of users, and value-added annotation of shared datasets and models. The HydroShare web interface and social media functions were developed using the Django web application framework coupled to iRODS. Data visualization and analysis is supported through the Tethys Platform web GIS software stack. Links to external systems are supported by RESTful web service interfaces to HydroShare's content. This presentation will introduce the HydroShare functionality developed to date and describe ongoing development of functionality to support collaboration and integration of data and models.

  14. Reducing the power consumption in LTE-Advanced wireless access networks by a capacity based deployment tool

    NASA Astrophysics Data System (ADS)

    Deruyck, Margot; Joseph, Wout; Tanghe, Emmeric; Martens, Luc

    2014-09-01

    As both the bit rate required by applications on mobile devices and the number of those mobile devices are steadily growing, wireless access networks need to be expanded. Because wireless networks also consume a lot of energy, it is important to develop energy-efficient wireless access networks in the near future. In this study, a capacity-based deployment tool for the design of energy-efficient wireless access networks is proposed. Capacity-based means that the network responds to the instantaneous bit rate requirements of the users active in the selected area. To the best of our knowledge, such a deployment tool for energy-efficient wireless access networks has never been presented before. The tool is applied to a realistic case in Ghent, Belgium, to investigate three main functionalities of LTE-Advanced: carrier aggregation, heterogeneous deployments, and Multiple-Input Multiple-Output (MIMO). The results show that it is recommended to introduce femtocell base stations supporting both MIMO and carrier aggregation into the network (heterogeneous deployment) to reduce the network's power consumption. For the selected area and the assumptions made, this yields a power consumption reduction of up to 70%. Introducing femtocell base stations without MIMO and carrier aggregation already results in a significant power consumption reduction of 38%.
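    The capacity-based idea, dimension the network to instantaneous user demand and compare power budgets, can be caricatured in a few lines. Every number below is invented for illustration and is unrelated to the paper's 70% and 38% figures or its actual power model.

```python
import math

def network_power(user_demands_mbps, femto_capacity_mbps=50.0,
                  macro_power_w=1000.0, femto_power_w=10.0):
    """Compare a macro-only deployment against a femtocell deployment sized
    to the instantaneous aggregate bit-rate demand (all figures invented).

    Returns (macro_only_power, heterogeneous_power) in watts, where the
    heterogeneous option deploys just enough femtocells to carry the load."""
    n_femto = math.ceil(sum(user_demands_mbps) / femto_capacity_mbps)
    return macro_power_w, femto_power_w * n_femto
```

    With these invented numbers, 120 Mbps of demand is served by three femtocells at 30 W instead of one 1000 W macrocell; the real tool additionally accounts for coverage, MIMO, and carrier aggregation when sizing the network.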

  15. Modeling Ionosphere Environments: Creating an ISS Electron Density Tool

    NASA Technical Reports Server (NTRS)

    Gurgew, Danielle N.; Minow, Joseph I.

    2011-01-01

    The International Space Station (ISS) maintains an altitude typically between 300 km and 400 km in low Earth orbit (LEO), within the Earth's ionosphere. The ionosphere is a region of partially ionized gas (plasma) formed by the photoionization of neutral atoms and molecules in the upper atmosphere of Earth. It is important to understand the electron density environment in which the spacecraft operates because the ionized gas along the ISS orbit interacts with the electrical power system, resulting in charging of the vehicle. One instrument already operational onboard the ISS with a goal of monitoring electron density, electron temperature, and ISS floating potential is the Floating Potential Measurement Unit (FPMU). Although this tool is a valuable addition to the ISS, its data collection periods are limited. The FPMU uses the Ku band communication frequency to transmit data from orbit, and use of this band for FPMU data runs is often preempted by higher-priority Extravehicular Activities (EVAs) and other ISS operations. Thus, large gaps are present in FPMU data. The purpose of this study is to address these missing environmental data by implementing a secondary electron density data source, derived from the COSMIC satellite constellation, to create a model of the ISS orbital environment. Extrapolating data specific to ISS orbital altitudes, we model the ionospheric electron density along the ISS orbit track to supply a set of data when the FPMU is unavailable. This computer model also provides an additional source of electron density data that is used to confirm the FPMU is operating correctly and to supplement the original environmental data taken by the FPMU.
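
    The gap-filling idea — evaluating a COSMIC-derived vertical electron density profile at the ISS altitude — can be sketched with simple linear interpolation. The profile values below are invented for illustration; the actual model works with COSMIC radio-occultation retrievals:

```python
def interp_ne(profile, alt_km):
    """Linearly interpolate electron density (el/cm^3) at a given
    altitude from a vertical profile {altitude_km: Ne}."""
    pts = sorted(profile.items())
    for (a0, n0), (a1, n1) in zip(pts, pts[1:]):
        if a0 <= alt_km <= a1:
            frac = (alt_km - a0) / (a1 - a0)
            return n0 + frac * (n1 - n0)
    raise ValueError("altitude outside profile range")

# Invented COSMIC-style profile: altitude (km) -> electron density
profile = {250: 4.0e5, 300: 6.0e5, 350: 5.0e5, 400: 3.5e5, 450: 2.0e5}
print(interp_ne(profile, 410))  # Ne along a 410 km ISS orbit segment
```

    Repeating this lookup along the orbit track yields an electron density time series for periods when the FPMU is not recording.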

  16. Advances and Limitations of Disease Biogeography Using Ecological Niche Modeling.

    PubMed

    Escobar, Luis E; Craft, Meggan E

    2016-01-01

    Mapping disease transmission risk is crucial in public and animal health for evidence-based decision-making. Ecology and epidemiology are closely related disciplines that may contribute to improvements in disease mapping, which can be used to answer health-related questions. Ecological niche modeling is increasingly used for understanding the biogeography of diseases in plants, animals, and humans. However, epidemiological applications of niche modeling approaches for disease mapping can fail to generate robust study designs, producing incomplete or incorrect inferences. This manuscript is an overview of the history and conceptual bases of ecological niche modeling as applied to epidemiology and public health; it is not intended as an exhaustive, detailed description of the niche modeling literature and methods. Instead, this review covers selected state-of-the-science approaches and tools, providing a short guide to designing studies that accounts for the type and quality of the input data (i.e., occurrences and environmental variables) and for the identification and justification of the extent of the study area, and it encourages users to explore and test diverse algorithms to reach more informed conclusions. We provide a friendly introduction to the field of disease biogeography and present an updated guide for researchers looking to use ecological niche modeling for disease mapping. We anticipate that ecological niche modeling will soon be a critical tool for epidemiologists aiming to map disease transmission risk, forecast disease distribution under climate change scenarios, and identify landscape factors triggering outbreaks. PMID:27547199
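
    As a toy illustration of the underlying idea (not Maxent or any algorithm reviewed here), one can characterize the environmental conditions at known occurrence points and score candidate sites by standardized distance from them; all values are invented:

```python
from statistics import mean, stdev

def fit_niche(occurrences):
    """occurrences: list of (temperature, precipitation) pairs at
    known disease occurrence points."""
    temps, precs = zip(*occurrences)
    return (mean(temps), stdev(temps)), (mean(precs), stdev(precs))

def suitability(niche, site):
    """Standardized Euclidean distance from the niche centroid;
    smaller means more environmentally similar."""
    (mt, st), (mp, sp) = niche
    t, p = site
    return (((t - mt) / st) ** 2 + ((p - mp) / sp) ** 2) ** 0.5

niche = fit_niche([(24, 1100), (26, 1300), (25, 1200), (27, 1250)])
print(suitability(niche, (25.5, 1210)))  # near the niche centroid
print(suitability(niche, (5, 200)))      # far outside the niche
```

    Real studies replace this two-variable toy with gridded environmental layers, a justified study extent, and tested algorithms, exactly the design choices the review emphasizes.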

  18. GenSAA: A tool for advancing satellite monitoring with graphical expert systems

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M.; Luczak, Edward C.

    1993-01-01

    During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real time data for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable levels. At the NASA Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems that serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industries.
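
    The rule-based monitoring core of such an assistant can be sketched as a set of predicates over a telemetry frame. The rules, parameter names, and thresholds below are invented for illustration, not GenSAA's actual rule base:

```python
def check_telemetry(frame, rules):
    """Return the messages of all rules triggered by one telemetry frame."""
    return [msg for predicate, msg in rules if predicate(frame)]

# Each rule pairs a predicate over telemetry values with an analyst message.
rules = [
    (lambda f: f["bus_voltage"] < 26.0, "ALERT: bus undervoltage"),
    (lambda f: f["battery_temp"] > 40.0, "ALERT: battery overtemperature"),
    (lambda f: f["wheel_rpm"] > 5000 and f["bus_voltage"] < 27.0,
     "WARN: reaction wheel load pulling bus voltage down"),
]

frame = {"bus_voltage": 25.4, "battery_temp": 31.2, "wheel_rpm": 5200}
for alert in check_telemetry(frame, rules):
    print(alert)
```

    A production system evaluates thousands of such rules per contact and drives graphical displays rather than printed messages, but the detection logic reduces to this pattern.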

  19. Emerging tools for continuous nutrient monitoring networks: Sensors advancing science and water resources protection

    USGS Publications Warehouse

    Pellerin, Brian; Stauffer, Beth A; Young, Dwane A; Sullivan, Daniel J.; Bricker, Suzanne B.; Walbridge, Mark R; Clyde, Gerard A; Shaw, Denice M

    2016-01-01

    Sensors and enabling technologies are becoming increasingly important tools for water quality monitoring and associated water resource management decisions. In particular, nutrient sensors are of interest because of the well-known adverse effects of nutrient enrichment on coastal hypoxia, harmful algal blooms, and impacts to human health. Accurate and timely information on nutrient concentrations and loads is integral to strategies designed to minimize risk to humans and manage the underlying drivers of water quality impairment. Using nitrate sensors as an example, we highlight the types of applications in freshwater and coastal environments that are likely to benefit from continuous, real-time nutrient data. The concurrent emergence of new tools to integrate, manage and share large data sets is critical to the successful use of nutrient sensors and has made it possible for the field of continuous nutrient monitoring to rapidly move forward. We highlight several near-term opportunities for Federal agencies, as well as the broader scientific and management community, that will help accelerate sensor development, build and leverage sites within a national network, and develop open data standards and data management protocols that are key to realizing the benefits of a large-scale, integrated monitoring network. Investing in these opportunities will provide new information to guide management and policies designed to protect and restore our nation’s water resources.
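
    One routine use of continuous nitrate data is load estimation, pairing sensor concentrations with stream discharge. A minimal sketch with invented readings:

```python
def nitrate_load_kg(samples, interval_s=900):
    """samples: (concentration mg/L, discharge m^3/s) pairs recorded at a
    fixed interval (here 15 min). Returns the total nitrate load in kg."""
    total_mg = 0.0
    for conc_mg_per_l, q_m3_per_s in samples:
        # mg/L * 1000 L/m^3 * m^3/s * s = mg
        total_mg += conc_mg_per_l * 1000 * q_m3_per_s * interval_s
    return total_mg / 1e6  # mg -> kg

# Four 15-minute readings (one hour) from a hypothetical sensor site.
day = [(2.1, 14.0), (2.4, 15.5), (3.0, 18.2), (2.6, 16.0)]
print(f"{nitrate_load_kg(day):.1f} kg over one hour")
```

    Continuous sensors make this calculation possible at sub-hourly resolution, capturing storm pulses that weekly grab samples would miss.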

  20. Neuron-Miner: An Advanced Tool for Morphological Search and Retrieval in Neuroscientific Image Databases.

    PubMed

    Conjeti, Sailesh; Mesbah, Sepideh; Negahdar, Mohammadreza; Rautenberg, Philipp L; Zhang, Shaoting; Navab, Nassir; Katouzian, Amin

    2016-10-01

    The steadily growing amount of digital neuroscientific data demands reliable, systematic, and computationally efficient retrieval algorithms. In this paper, we present Neuron-Miner, a tool for fast and accurate reference-based retrieval within neuron image databases. The proposed algorithm is built on a hashing (search and retrieval) technique employing multiple unsupervised random trees, collectively called Hashing Forests (HF). The HF are trained to parse the neuromorphological space hierarchically and to preserve the inherent neuron neighborhoods while encoding them with compact binary codewords. We further introduce an inverse-coding formulation within HF that avoids exhaustive pairwise neuron similarity comparisons, allowing scalability to massive databases with little additional time overhead. The proposed hashing tool approximates the true neuromorphological neighborhood better than existing generalized hashing methods, with superior retrieval and ranking performance. This is exhaustively validated by quantifying the results over 31,266 neuron reconstructions from the Neuromorpho.org dataset, curated from 147 different archives. We envisage that finding and ranking similar neurons through reference-based querying via Neuron-Miner will assist neuroscientists in objectively understanding the relationship between neuronal structure and function, with applications in comparative anatomy and diagnosis. PMID:27155864
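
    A drastically simplified sketch of the hashing idea — encode feature vectors as binary codes, then rank candidates by Hamming distance. This uses plain random feature/threshold tests, not the paper's Hashing Forests algorithm with its trained tree hierarchies and inverse coding; all feature values are invented:

```python
import random

def make_hasher(dim, n_bits, lo=0.0, hi=1.0, seed=0):
    """Build an encoder mapping a feature vector to n_bits binary code
    via random (feature index, threshold) tests."""
    rng = random.Random(seed)
    tests = [(rng.randrange(dim), rng.uniform(lo, hi)) for _ in range(n_bits)]
    def encode(x):
        return tuple(int(x[i] > t) for i, t in tests)
    return encode

def hamming(a, b):
    """Number of differing bits between two codes."""
    return sum(u != v for u, v in zip(a, b))

encode = make_hasher(dim=4, n_bits=16)
query = [0.9, 0.2, 0.4, 0.7]          # morphology features of a query neuron
database = {"neuron_a": [0.88, 0.21, 0.41, 0.69],  # nearly identical
            "neuron_b": [0.1, 0.9, 0.8, 0.1]}      # very different
q = encode(query)
ranked = sorted(database, key=lambda k: hamming(q, encode(database[k])))
print(ranked)  # most morphologically similar reconstruction first
```

    Because codes are short and comparison is bitwise, retrieval stays fast as the database grows, which is the scalability property the paper pursues with its more sophisticated tree-based encoding.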