Science.gov

Sample records for advanced modeling tools

  1. Modeling Tool Advances Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of Small Business Innovation Research (SBIR) projects with NASA, including early rotorcraft work through Langley Research Center and, more recently, through Ames Research Center. NASA SBIR grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free-wake behavior, including distributed and integrated loads and performance prediction. Application of the software in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speed of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of that company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, airflow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  2. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  3. Evaluation of reliability modeling tools for advanced fault tolerant systems

    NASA Technical Reports Server (NTRS)

    Baker, Robert; Scheper, Charlotte

    1986-01-01

    The Computer Aided Reliability Estimation (CARE III) and Automated Reliability Interactive Estimation System (ARIES 82) reliability tools were evaluated for application to advanced fault-tolerant aerospace systems. To determine reliability modeling requirements, the evaluation focused on the Draper Laboratories' Advanced Information Processing System (AIPS) architecture as an example fault-tolerant aerospace architecture. Advantages and limitations were identified for each tool. The CARE III program was designed primarily for analyzing ultrareliable flight control systems. The ARIES 82 program's primary use was to support university research and teaching. Neither CARE III nor ARIES 82 was suited to determining the reliability of the complex nodal networks used to interconnect processing sites in the AIPS architecture. It was concluded that ARIES was not suitable for modeling advanced fault-tolerant systems. It was further concluded that, subject to some limitations (difficulty in modeling systems with unpowered spare modules, systems where equipment maintenance must be considered, systems where failure depends on the sequence in which faults occurred, and systems where multiple-fault conditions beyond double near-coincident faults must be considered), CARE III is best suited for evaluating the reliability of advanced fault-tolerant systems for air transport.
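The combinatorial arithmetic that reliability-estimation tools of this kind automate can be sketched for the simplest fault-tolerant configuration, triple modular redundancy (TMR). This is a generic textbook illustration, not CARE III's or ARIES 82's actual algorithm, and the failure rate is a hypothetical value:

```python
import math

def module_reliability(lam, t):
    """Reliability of one module with constant failure rate lam (per hour)
    over a mission of t hours: R = exp(-lam * t)."""
    return math.exp(-lam * t)

def tmr_reliability(r):
    """A triple-modular-redundant system survives if at least 2 of its
    3 identical modules survive: R_sys = 3R^2 - 2R^3."""
    return 3 * r**2 - 2 * r**3

# Hypothetical 10-hour mission with a 1e-4 failures/hour module.
r = module_reliability(1e-4, 10.0)
print(tmr_reliability(r))
```

Tools such as CARE III extend this kind of calculation to fault coverage, near-coincident faults, and repair, which is where closed-form expressions stop being tractable by hand.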

  4. Advanced Reach Tool (ART): development of the mechanistic model.

    PubMed

    Fransman, Wouter; Van Tongeren, Martie; Cherrie, John W; Tischer, Martin; Schneider, Thomas; Schinkel, Jody; Kromhout, Hans; Warren, Nick; Goede, Henk; Tielemans, Erik

    2011-11-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe. The ART mechanistic model is based on a conceptual framework that adopts a source-receptor approach, which describes the transport of a contaminant from the source to the receptor and defines seven independent principal modifying factors: substance emission potential, activity emission potential, localized controls, segregation, personal enclosure, surface contamination, and dispersion. ART currently differentiates between three exposure types: vapours, mists, and dust (fumes, fibres, and gases are presently excluded). Various sources were used to assign numerical values to the multipliers for each modifying factor. The evidence used to underpin this assessment procedure was based on chemical and physical laws, supplemented by empirical data obtained from the literature; where neither was available, expert elicitation was applied. Multipliers for all modifying factors were peer reviewed by leading experts from industry, research institutes, and public authorities across the globe, and several workshops with experts were organized to discuss the proposed exposure multipliers. The mechanistic model is a central part of the ART tool. As knowledge of exposure determinants advances, the model will require continuous updates and refinements, such as the effect of worker behaviour on personal exposure, 'best practice' values that describe the maximum achievable effectiveness of control measures, the intrinsic emission potential of various solid objects (e.g. metal, glass, plastics, etc.), and extension of the applicability domain to further exposure types (e.g. gas, fume, and fibre exposure).
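The source-receptor structure described above is multiplicative: each modifying factor contributes a multiplier, and the exposure score is their product. The sketch below shows that structure only; all numeric values are hypothetical placeholders, not ART's calibrated multipliers:

```python
# Illustrative source-receptor calculation in the ART style: one multiplier
# per modifying factor, combined multiplicatively into an exposure score.
# Every value here is hypothetical, not a calibrated ART multiplier.

modifying_factors = {
    "substance_emission_potential": 0.1,  # e.g. low-volatility liquid
    "activity_emission_potential": 3.0,   # e.g. spray application
    "localized_controls": 0.3,            # e.g. local exhaust ventilation
    "segregation": 1.0,                   # none
    "personal_enclosure": 1.0,            # none
    "surface_contamination": 1.0,         # negligible fugitive sources
    "dispersion": 0.5,                    # large, well-ventilated workroom
}

def exposure_score(factors):
    """Product of all modifying-factor multipliers."""
    score = 1.0
    for multiplier in factors.values():
        score *= multiplier
    return score

print(exposure_score(modifying_factors))
```

A control measure with multiplier 0.3 thus reduces the score by 70% regardless of the other factors, which is the independence assumption the paper's framework states.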

  5. Advanced REACH Tool: a Bayesian model for occupational exposure assessment.

    PubMed

    McNally, Kevin; Warren, Nicholas; Fransman, Wouter; Entink, Rinke Klein; Schinkel, Jody; van Tongeren, Martie; Cherrie, John W; Kromhout, Hans; Schneider, Thomas; Tielemans, Erik

    2014-06-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher-tier exposure tool that combines disparate sources of information within a Bayesian statistical framework. The information is obtained from expert knowledge expressed in a calibrated mechanistic model of exposure assessment, data on inter- and intra-individual variability in exposures from the literature, and context-specific exposure measurements. The ART provides central estimates and credible intervals for different percentiles of the exposure distribution, for full-shift and long-term average exposures. The ART can produce exposure estimates in the absence of measurements, but the precision of the estimates improves as more data become available. The methodology presented in this paper is able to utilize partially analogous data, a novel approach designed to make efficient use of a sparsely populated measurement database, although some additional research is still required before practical implementation. The methodology is demonstrated using two worked examples: an exposure to copper pyrithione in the spraying of antifouling paints and an exposure to ethyl acetate in shoe repair. PMID:24665110
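The core idea, a mechanistic-model prior on log-exposure updated by measurements, can be sketched with the simplest conjugate case: a normal prior on the mean of log-transformed exposures with a known sampling SD. This is a toy reduction of the ART framework (which also models inter- and intra-worker variability and partially analogous data); all numbers are hypothetical:

```python
import math
from statistics import NormalDist

def posterior_log_exposure(prior_mean, prior_sd, ln_measurements, sampling_sd):
    """Conjugate normal-normal update on the mean log-exposure: prior from
    the mechanistic model, likelihood from log-transformed measurements
    with a known between-measurement SD (e.g. taken from the literature)."""
    n = len(ln_measurements)
    prior_prec = 1.0 / prior_sd**2
    data_prec = n / sampling_sd**2
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * prior_mean +
                            data_prec * (sum(ln_measurements) / n))
    return post_mean, math.sqrt(post_var)

# Hypothetical inputs: mechanistic prior ln-exposure ~ N(0.5, 1.0^2),
# three measurements (mg/m3, log-transformed), sampling SD 0.8.
meas = [math.log(x) for x in (2.1, 3.5, 1.8)]
mu, sd = posterior_log_exposure(0.5, 1.0, meas, 0.8)

# 90% credible interval for the exposure median, back on the original scale.
z = NormalDist().inv_cdf(0.95)
ci90 = (math.exp(mu - z * sd), math.exp(mu + z * sd))
print(mu, sd, ci90)
```

With no measurements the posterior is the prior (estimates from the mechanistic model alone); each added measurement shrinks the credible interval, matching the paper's statement that precision improves as data become available.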

  6. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  7. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  8. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database.

    SciTech Connect

    Quock, D. E. R.; Cianciarulo, M. B.; APS Engineering Support Division; Purdue Univ.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  9. Predictive Modeling of Estrogen Receptor Binding Agents Using Advanced Cheminformatics Tools and Massive Public Data

    PubMed Central

    Ribay, Kathryn; Kim, Marlene T.; Wang, Wenyi; Pinolini, Daniel; Zhu, Hao

    2016-01-01

    Estrogen receptors (ERα) are a critical target for drug design as well as a potential source of toxicity when activated unintentionally. Thus, evaluating potential ERα binding agents is critical in both drug discovery and chemical toxicity areas. Computational tools, e.g., Quantitative Structure-Activity Relationship (QSAR) models, can predict potential ERα binding agents before chemical synthesis. The purpose of this project was to develop enhanced predictive models of ERα binding agents by utilizing advanced cheminformatics tools that can integrate publicly available bioassay data. The initial ERα binding agent data set, consisting of 446 binders and 8307 non-binders, was obtained from the Tox21 Challenge project organized by the NIH Chemical Genomics Center (NCGC). After removing duplicates and inorganic compounds, this data set was used to create a training set (259 binders and 259 non-binders). This training set was used to develop QSAR models based on chemical descriptors. The resulting models were then used to predict the binding activity of 264 external compounds, which became available to us after the models were developed. The cross-validation results of the training set [Correct Classification Rate (CCR) = 0.72] were much higher than the external predictivity for the unknown compounds (CCR = 0.59). To improve the conventional QSAR models, all compounds in the training set were used to search PubChem and generate a profile of their biological responses across thousands of bioassays. The most important bioassays were prioritized to generate a similarity index that was used to calculate the biosimilarity score between each pair of compounds. The nearest neighbors for each compound were then identified, and its ERα binding potential was predicted from its nearest neighbors in the training set. The hybrid model performance (CCR = 0.94 for cross-validation; CCR = 0.68 for external prediction) showed significant improvement over the original QSAR models.
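The read-across step of the hybrid model, scoring biosimilarity over bioassay response profiles and predicting from the nearest training-set neighbours, can be sketched as below. The similarity measure (Jaccard on binary activity profiles) and all profiles are illustrative assumptions, not the paper's exact similarity index or data:

```python
# Toy read-across: each compound is a binary response profile across
# prioritized bioassays (1 = active); predict ER-alpha binding by majority
# vote of the k most biosimilar training compounds. All data hypothetical.

def jaccard_similarity(a, b):
    """Similarity of two binary bioassay-response profiles."""
    both = sum(1 for x, y in zip(a, b) if x and y)
    either = sum(1 for x, y in zip(a, b) if x or y)
    return both / either if either else 0.0

def knn_predict(query, training, k=3):
    """Majority vote of the k most biosimilar training compounds."""
    ranked = sorted(training, key=lambda t: jaccard_similarity(query, t[0]),
                    reverse=True)[:k]
    votes = sum(label for _, label in ranked)
    return 1 if votes * 2 > k else 0

training = [([1, 1, 0, 1, 0], 1), ([1, 0, 0, 1, 0], 1),
            ([0, 0, 1, 0, 1], 0), ([0, 1, 1, 0, 1], 0)]
print(knn_predict([1, 1, 0, 0, 0], training))  # -> 1 (binder-like profile)
```

The appeal of this design is that the bioassay profile encodes observed biology rather than computed chemistry, which is why it can outperform descriptor-only QSAR in cross-validation.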

  11. Advanced semi-active engine and transmission mounts: tools for modelling, analysis, design, and tuning

    NASA Astrophysics Data System (ADS)

    Farjoud, Alireza; Taylor, Russell; Schumann, Eric; Schlangen, Timothy

    2014-02-01

    This paper is focused on modelling, design, and testing of semi-active magneto-rheological (MR) engine and transmission mounts used in the automotive industry. The purpose is to develop a complete analysis, synthesis, design, and tuning tool that reduces the need for expensive and time-consuming laboratory and field tests. A detailed mathematical model of such devices is developed using multi-physics modelling techniques for physical systems with various energy domains. The model includes all major features of an MR mount including fluid dynamics, fluid track, elastic components, decoupler, rate-dip, gas-charged chamber, MR fluid rheology, magnetic circuit, electronic driver, and control algorithm. Conventional passive hydraulic mounts can also be studied using the same mathematical model. The model is validated using standard experimental procedures. It is used for design and parametric study of mounts; effects of various geometric and material parameters on dynamic response of mounts can be studied. Additionally, this model can be used to test various control strategies to obtain best vibration isolation performance by tuning control parameters. Another benefit of this work is that nonlinear interactions between sub-components of the mount can be observed and investigated. This is not possible by using simplified linear models currently available.
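The "simplified linear models currently available" that the abstract contrasts with its multi-physics approach reduce a mount to a complex dynamic stiffness. A minimal such baseline is the Voigt spring-damper model, K(w) = k + i*w*c; parameter values below are illustrative, not from the paper:

```python
import math
import cmath

K_RUBBER = 2.0e5   # N/m, illustrative rubber-path stiffness
C_DAMP = 300.0     # N*s/m, illustrative viscous damping

def dynamic_stiffness(freq_hz, k=K_RUBBER, c=C_DAMP):
    """Complex dynamic stiffness of a linear spring-damper (Voigt) mount."""
    w = 2.0 * math.pi * freq_hz
    return complex(k, w * c)

for f in (1, 10, 100):  # Hz
    K = dynamic_stiffness(f)
    # Magnitude (N/m) and loss angle (degrees) versus frequency.
    print(f, abs(K), math.degrees(cmath.phase(K)))
```

In this linear baseline the storage stiffness is frequency-independent, so it cannot reproduce the rate-dip and notch behaviour of a real hydraulic mount with a fluid track; that is exactly the gap the paper's nonlinear multi-physics model addresses.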

  12. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information for optimizing advanced power system design and operating conditions for efficiency, minimal air pollutant emissions, and the ability to utilize a wide range of fossil fuel properties. This project was divided into four tasks: demonstration of the ash transformation model, upgrading of spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence, together with the heterogeneous and homogeneous interactions of the organically associated elements, must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate thermodynamic properties for fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations of mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity through grey-scale binning of the SEM image. A backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is
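The Newton-Raphson balance calculations mentioned above can be illustrated with a hypothetical adiabatic flame temperature solve, where the heat released must equal the sensible heat absorbed by products with a temperature-dependent heat capacity. The coefficients and heat release are made-up illustrative values, not EERC data:

```python
def newton_raphson(f, dfdx, x0, tol=1e-8, max_iter=50):
    """Generic Newton-Raphson root finder of the kind used for the
    spreadsheet mass/energy-balance and equilibrium calculations."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Hypothetical energy balance: heat released equals sensible heat of the
# product gases, with cp(T) = A + B*T (illustrative coefficients).
T_REF = 298.15       # K, reference temperature
Q_RELEASED = 5.0e4   # J per mol of products, illustrative
A, B = 30.0, 0.01    # J/(mol*K) and J/(mol*K^2), illustrative

def energy_balance(T):
    # Integral of cp dT from T_REF to T, minus the heat released.
    return A * (T - T_REF) + 0.5 * B * (T**2 - T_REF**2) - Q_RELEASED

def d_energy_balance(T):
    return A + B * T

T_ad = newton_raphson(energy_balance, d_energy_balance, 1500.0)
print(T_ad)  # adiabatic flame temperature, K
```

Because cp grows with T, the balance is mildly nonlinear, which is precisely why an iterative solver rather than a closed-form expression is used in such spreadsheets.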

  13. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science.

    PubMed

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need to transfer the latest results in the field of machine learning to biomedical researchers. We propose a web-based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open-access and user-friendly option to obtain discrete-time, predictive survival models at the individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets. PMID:27532883
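The standard survival analysis that such a tool performs rests on the Kaplan-Meier estimator, which can be sketched in a few lines. This is the generic textbook estimator with toy data, not OSA's implementation:

```python
# Minimal Kaplan-Meier estimator. Each record is (time, event), with
# event = 1 for an observed death and 0 for a censored observation.

def kaplan_meier(records):
    """Return [(time, survival probability)] at each observed event time."""
    records = sorted(records)
    n_at_risk = len(records)
    survival, curve = 1.0, []
    i = 0
    while i < len(records):
        t = records[i][0]
        deaths = at_t = 0
        while i < len(records) and records[i][0] == t:
            deaths += records[i][1]
            at_t += 1
            i += 1
        if deaths:
            # Multiply in the conditional survival past time t.
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= at_t
    return curve

data = [(2, 1), (3, 0), (4, 1), (4, 1), (5, 0), (7, 1)]
print(kaplan_meier(data))
```

Censored subjects leave the risk set without forcing a drop in the curve, which is the estimator's whole point; the discrete-time ANN models the paper proposes generalize this per-interval conditional-survival idea with covariates.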

  14. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science

    PubMed Central

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need to transfer the latest results in the field of machine learning to biomedical researchers. We propose a web-based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open-access and user-friendly option to obtain discrete-time, predictive survival models at the individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets. PMID:27532883

  15. Propulsion Simulations Using Advanced Turbulence Models with the Unstructured Grid CFD Tool, TetrUSS

    NASA Technical Reports Server (NTRS)

    Abdol-Hamid, Khaled S.; Frink, Neal T.; Deere, Karen A.; Pandya, Mohangna J.

    2004-01-01

    A computational investigation has been completed to assess the capability of TetrUSS for exhaust nozzle flows. Three configurations were chosen for this study: (1) an axisymmetric supersonic jet, (2) a transonic axisymmetric boattail with a solid sting operated at different Reynolds numbers and Mach numbers, and (3) an isolated non-axisymmetric nacelle with a supersonic cruise nozzle. These configurations were chosen because existing experimental data provided a means for measuring the ability of TetrUSS to simulate complex nozzle flows. The main objective of this paper is to validate the implementation of advanced two-equation turbulence models in the unstructured-grid CFD code USM3D for propulsion flow cases. USM3D is the flow solver of the TetrUSS system. Three different turbulence models, namely the Menter Shear Stress Transport (SST), the basic k-epsilon, and the Spalart-Allmaras (SA) models, are used in the present study. The results are generally in agreement with other implementations of these models in structured-grid CFD codes. Results indicate that USM3D provides accurate simulations for complex aerodynamic configurations with propulsion integration.

  16. Demonstrating Advancements in 3D Analysis and Prediction Tools for Space Weather Forecasting utilizing the Enlil Model

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2012-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Analysis and prediction tools for post-processing and visualizing simulation results greatly enhance the utility of these models in aiding space weather forecasters to predict the terrestrial consequences of these events. The Center for Integrated Space Weather Modeling (CISM) Knowledge Transfer (KT) group is making significant progress on an integrated post-processing, analysis, and prediction tool for space weather prediction based on the open-source ParaView visualization application. These tools will provide space weather forecasters with 3D situational awareness of the solar wind, CMEs, and eventually the geospace environment. Current work focuses on bringing new 3D analysis and prediction tools for the Enlil heliospheric model to space weather forecasters. In this effort we present a ParaView-based model interface that will provide forecasters with an interactive system for analyzing complete 3D datasets from modern space weather models.

  17. FACILITATING ADVANCED URBAN METEOROLOGY AND AIR QUALITY MODELING CAPABILITIES WITH HIGH RESOLUTION URBAN DATABASE AND ACCESS PORTAL TOOLS

    EPA Science Inventory

    Information on urban morphological features at high resolution is needed to properly model and characterize the meteorological and air quality fields in urban areas. We describe a new project called the National Urban Database with Access Portal Tool (NUDAPT) that addresses this need.

  18. Rapid medical advances challenge the tooling industry.

    PubMed

    Conley, B

    2008-01-01

    The requirement for greater performance in smaller spaces has increased demands for product and process innovation in tubing and other medical products. In turn, these developments have placed greater demands on the producers of the advanced tooling for these products. Tooling manufacturers must now continuously design equipment with much tighter tolerances for more sophisticated coextrusions and for newer generations of multilumen and multilayer tubing.

  19. Advanced genetic tools for plant biotechnology

    SciTech Connect

    Liu, WS; Yuan, JS; Stewart, CN

    2013-10-09

    Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.

  20. Load Model Data Tool

    2013-04-30

    The LMDT software automates the preparation of load composite model data in the formats supported by the major power system software vendors (GE and Siemens). Proper representation of the load composite model in power system dynamic analysis is very important. Software tools for power system simulation like GE PSLF and Siemens PSSE already include algorithms for load composite modeling. However, these tools require that the input information on the composite load be provided in custom formats. Preparation of this data is time consuming and requires multiple manual operations. The LMDT software automates this process. The software is designed to generate composite load model data. It uses default load composition data, motor information, and bus information as input. The software processes the input information and produces a load composition model. The generated model can be stored in the .dyd format supported by the GE PSLF package or the .dyr format supported by the Siemens PSSE package.

  1. Load Model Data Tool

    SciTech Connect

    David Chassin, Pavel Etingov

    2013-04-30

    The LMDT software automates the preparation of load composite model data in the formats supported by the major power system software vendors (GE and Siemens). Proper representation of the load composite model in power system dynamic analysis is very important. Software tools for power system simulation like GE PSLF and Siemens PSSE already include algorithms for load composite modeling. However, these tools require that the input information on the composite load be provided in custom formats. Preparation of this data is time consuming and requires multiple manual operations. The LMDT software automates this process. The software is designed to generate composite load model data. It uses default load composition data, motor information, and bus information as input. The software processes the input information and produces a load composition model. The generated model can be stored in the .dyd format supported by the GE PSLF package or the .dyr format supported by the Siemens PSSE package.
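The kind of transformation LMDT automates, merging default composition fractions with per-bus data and emitting a vendor-format record, can be sketched as below. The record layout, field order, and composition values here are made up for illustration; the real GE PSLF .dyd and Siemens PSSE .dyr record formats are vendor-specified and differ from this:

```python
# Hypothetical composite-load record generator (illustrative format only).

DEFAULT_COMPOSITION = {"motor_a": 0.25, "motor_b": 0.15,
                       "motor_c": 0.10, "static": 0.50}

def composite_load_record(bus_number, bus_name, composition=None):
    """Merge per-bus overrides into the default load composition and
    emit one text record; fractions must sum to 1."""
    comp = dict(DEFAULT_COMPOSITION, **(composition or {}))
    total = sum(comp.values())
    if abs(total - 1.0) > 1e-6:
        raise ValueError(f"fractions sum to {total}, expected 1.0")
    fields = " ".join(f"{comp[k]:.3f}" for k in sorted(comp))
    return f"CMPLDW {bus_number} '{bus_name}' : {fields}"

print(composite_load_record(101, "SUB_A", {"static": 0.45, "motor_c": 0.15}))
```

The validation step matters in practice: composition fractions that fail to sum to one are exactly the kind of manual-preparation error such a tool exists to catch.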

  2. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  3. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  4. Self-advancing step-tap tool

    NASA Technical Reports Server (NTRS)

    Pettit, Donald R. (Inventor); Penner, Ronald K. (Inventor); Franklin, Larry D. (Inventor); Camarda, Charles J. (Inventor)

    2008-01-01

    Methods and a tool for simultaneously forming a bore in a work piece and forming a series of threads in said bore. In an embodiment, the tool has a predetermined axial length, a proximal end, and a distal end, said tool comprising: a shank located at said proximal end; a pilot drill portion located at said distal end; and a mill portion intermediately disposed between said shank and said pilot drill portion. The mill portion is comprised of at least two drill-tap sections of predetermined axial lengths and at least one transition section of predetermined axial length, wherein each of said at least one transition section is sandwiched between a distinct set of two of said at least two drill-tap sections. The at least two drill-tap sections are formed of one or more drill-tap cutting teeth spirally increasing along said at least two drill-tap sections, wherein said tool is self-advanced in said work piece along said formed threads, and wherein said tool simultaneously forms said bore and said series of threads along a substantially similar longitudinal axis.

  5. Computational tools for protein modeling.

    PubMed

    Xu, D; Xu, Y; Uberbacher, E C

    2000-07-01

    Protein modeling is playing a more and more important role in protein and peptide sciences due to improvements in modeling methods, advances in computer technology, and the huge amount of biological data becoming available. Modeling tools can often predict the structure and shed some light on the function and its underlying mechanism. They can also provide insight to design experiments and suggest possible leads for drug design. This review attempts to provide a comprehensive introduction to major computer programs, especially on-line servers, for protein modeling. The review covers the following aspects: (1) protein sequence comparison, including sequence alignment/search, sequence-based protein family classification, domain parsing, and phylogenetic classification; (2) sequence annotation, including annotation/prediction of hydrophobic profiles, transmembrane regions, active sites, signaling sites, and secondary structures; (3) protein structure analysis, including visualization, geometry analysis, structure comparison/classification, dynamics, and electrostatics; (4) three-dimensional structure prediction, including homology modeling, fold recognition using threading, ab initio prediction, and docking. We will address what a user can expect from the computer tools in terms of their strengths and limitations. We will also discuss the major challenges and the future trends in the field. A collection of the links of tools can be found at http://compbio.ornl.gov/structure/resource/.
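The sequence alignment/search tools in category (1) are largely built on dynamic programming. A minimal sketch of the underlying idea, the Needleman-Wunsch global alignment score, is shown below; the scoring values are illustrative and not tied to any specific server in the review.

```python
# Illustrative sketch: Needleman-Wunsch global alignment, the dynamic-
# programming core behind many sequence alignment/search tools.

def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
    """Return the optimal global alignment score of sequences a and b."""
    n, m = len(a), len(b)
    # score[i][j] = best score aligning a[:i] with b[:j]
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap          # a[:i] aligned against all gaps
    for j in range(1, m + 1):
        score[0][j] = j * gap          # b[:j] aligned against all gaps
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag,
                              score[i - 1][j] + gap,   # gap in b
                              score[i][j - 1] + gap)   # gap in a
    return score[n][m]
```

Production servers use far richer substitution matrices (e.g., BLOSUM) and affine gap penalties, but the recurrence is the same.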

  6. Kate's Model Verification Tools

    NASA Technical Reports Server (NTRS)

    Morgan, Steve

    1991-01-01

    Kennedy Space Center's Knowledge-based Autonomous Test Engineer (KATE) is capable of monitoring electromechanical systems, diagnosing their errors, and even repairing them when they crash. A survey of KATE's developer/modelers revealed that they were already using a sophisticated set of productivity enhancing tools. They did request five more, however, and those make up the body of the information presented here: (1) a transfer function code fitter; (2) a FORTRAN-Lisp translator; (3) three existing structural consistency checkers to aid in syntax checking their modeled device frames; (4) an automated procedure for calibrating knowledge base admittances to protect KATE's hardware mockups from inadvertent hand valve twiddling; and (5) three alternatives for the 'pseudo object', a programming patch that currently apprises KATE's modeling devices of their operational environments.

  7. Regional Arctic System Model (RASM): A Tool to Address the U.S. Priorities and Advance Capabilities for Arctic Climate Modeling and Prediction

    NASA Astrophysics Data System (ADS)

    Maslowski, W.; Roberts, A.; Cassano, J. J.; Gutowski, W. J., Jr.; Nijssen, B.; Osinski, R.; Zeng, X.; Brunke, M.; Duvivier, A.; Hamman, J.; Hossainzadeh, S.; Hughes, M.; Seefeldt, M. W.

    2015-12-01

    The Arctic is undergoing some of the most coordinated rapid climatic changes currently occurring anywhere on Earth, including the retreat of the perennial sea ice cover, which integrates forcing by, exchanges with, and feedbacks between atmosphere, ocean, and land. While historical reconstructions from Earth System Models (ESMs) are in broad agreement with these changes, the rate of change in ESMs generally remains outpaced by observations. Reasons for this include a combination of coarse resolution, inadequate parameterizations, under-represented processes, and a limited knowledge of physical interactions. We demonstrate the capability of the Regional Arctic System Model (RASM) in addressing some of the ESM limitations in simulating observed variability and trends in Arctic surface climate. RASM is a high resolution, pan-Arctic coupled climate model with the sea ice and ocean model components configured at an eddy-permitting resolution of 1/12° and the atmosphere and land hydrology model components at 50 km resolution, all coupled at 20-minute intervals. RASM is an example of a limited-area, process-resolving, fully coupled ESM which, due to the constraints from boundary conditions, facilitates detailed comparisons with observational statistics that are not possible with ESMs. The overall goal of RASM is to address key requirements published in the Navy Arctic Roadmap: 2014-2030 and in the Implementation Plan for the National Strategy for the Arctic Region, regarding the need for advanced modeling capabilities for operational forecasting and strategic climate predictions through 2030. The main science objectives of RASM are to advance understanding and model representation of critical physical processes and feedbacks of importance to sea ice thickness and area distribution. 
RASM results are presented to quantify relative contributions by (i) resolved processes and feedbacks as well as (ii) sensitivity to space dependent sub-grid parameterizations to better

  8. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  9. Anvil Forecast Tool in the Advanced Weather Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and National Weather Service Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) was tasked to create a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) that indicates the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. The tool creates a graphic depicting the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on the average of the upper level observed or forecasted winds. The graphic includes 10 and 20 n mi standoff circles centered at the location of interest, as well as one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30-degree sector width based on a previous AMU study that determined thunderstorm anvils move in a direction plus or minus 15 degrees of the upper-level wind direction. The AMU was then tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). SMG later requested the tool be updated to provide more flexibility and quicker access to model data. This presentation describes the work performed by the AMU to transition the tool into AWIPS, as well as the subsequent improvements made to the tool.
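The arc construction described above follows directly from distance = wind speed x time along the upwind bearing. A hedged sketch of that geometry (function and parameter names are illustrative, not from the AMU tool):

```python
# Hedged sketch: locate points along the one-, two-, or three-hour anvil-
# threat arc upwind of a site, given an average upper-level wind. The arc
# radius is speed * time and the arc spans wind direction +/- 15 degrees.
import math

def threat_arc_points(wind_dir_deg, wind_speed_kt, hours,
                      half_width_deg=15, steps=5):
    """Return (x, y) offsets from the site in n mi (x east, y north)
    along the arc swept `hours` upwind at the given wind speed (knots)."""
    radius = wind_speed_kt * hours  # n mi traveled in `hours`
    points = []
    for k in range(steps):
        # Meteorological wind direction is the bearing the wind blows FROM,
        # which is exactly the upwind bearing from the site.
        bearing = (wind_dir_deg - half_width_deg
                   + k * (2 * half_width_deg) / (steps - 1))
        theta = math.radians(bearing)
        points.append((radius * math.sin(theta), radius * math.cos(theta)))
    return points
```

For a 20-knot north wind (direction 0 degrees), the one-hour arc sits 20 n mi due north of the site, spanning bearings 345 to 015.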

  10. Development of Advanced Tools for Cryogenic Integration

    NASA Astrophysics Data System (ADS)

    Bugby, D. C.; Marland, B. C.; Stouffer, C. J.; Kroliczek, E. J.

    2004-06-01

    This paper describes four advanced devices (or tools) that were developed to help solve problems in cryogenic integration. The four devices are: (1) an across-gimbal nitrogen cryogenic loop heat pipe (CLHP); (2) a miniaturized neon CLHP; (3) a differential thermal expansion (DTE) cryogenic thermal switch (CTSW); and (4) a dual-volume nitrogen cryogenic thermal storage unit (CTSU). The across-gimbal CLHP provides a low torque, high conductance solution for gimbaled cryogenic systems wishing to position their cryocoolers off-gimbal. The miniaturized CLHP combines thermal transport, flexibility, and thermal switching (at 35 K) into one device that can be directly mounted to both the cooler cold head and the cooled component. The DTE-CTSW, designed and successfully tested in a previous program using a stainless steel tube and beryllium (Be) end-pieces, was redesigned with a polymer rod and high-purity aluminum (Al) end-pieces to improve performance and manufacturability while still providing a miniaturized design. Lastly, the CTSU was designed with a 6063 Al heat exchanger and integrally welded, segmented, high purity Al thermal straps for direct attachment to both a cooler cold head and a Be component whose peak heat load exceeds its average load by 2.5 times. For each device, the paper will describe its development objective, operating principles, heritage, requirements, design, test data and lessons learned.

  11. Distillation Column Modeling Tools

    SciTech Connect

    2001-09-01

    Advanced Computational and Experimental Techniques will Optimize Distillation Column Operation. Distillation is a low thermal efficiency unit operation that currently consumes 4.8 quadrillion BTUs of energy...

  12. Advanced cryogenics for cutting tools. Final report

    SciTech Connect

    Lazarus, L.J.

    1996-10-01

    The purpose of the investigation was to determine if cryogenic treatment improved the life and cost effectiveness of perishable cutting tools over other treatments or coatings. Test results showed that in five of seven of the perishable cutting tools tested there was no improvement in tool life. The other two tools showed a small gain in tool life, but not as much as when switching manufacturers of the cutting tool. The following conclusions were drawn from this study: (1) titanium nitride coatings are more effective than cryogenic treatment in increasing the life of perishable cutting tools made from all cutting tool materials, (2) cryogenic treatment may increase tool life if the cutting tool is improperly heat treated during its origination, and (3) cryogenic treatment was only effective on those tools made from less sophisticated high speed tool steels. As a part of a recent detailed investigation, four cutting tool manufacturers and two cutting tool laboratories were queried and none could supply any data to substantiate cryogenic treatment of perishable cutting tools.

  13. Anvil Tool in the Advanced Weather Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Barrett, Joe, III; Bauman, William, III; Keen, Jeremy

    2007-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) created a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) to indicate the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. In order for the Anvil Tool to remain available to the meteorologists, the AMU was tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). This report describes the work done by the AMU to develop the Anvil Tool for AWIPS to create a graphical overlay depicting the threat from thunderstorm anvil clouds. The AWIPS Anvil Tool is based on the previously deployed AMU MIDDS Anvil Tool. SMG and 45 WS forecasters have used the MIDDS Anvil Tool during launch and landing operations. SMG's primary weather analysis and display system is now AWIPS and the 45 WS has plans to replace MIDDS with AWIPS. The Anvil Tool creates a graphic that users can overlay on satellite or radar imagery to depict the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on an average of the upper-level observed or forecasted winds. The graphic includes 10 and 20 n mi standoff circles centered at the location of interest, in addition to one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30 degree sector width based on a previous AMU study which determined thunderstorm anvils move in a direction plus or minus 15 degrees of the upper-level (300- to 150-mb) wind direction. This report briefly describes the history of the MIDDS Anvil Tool and then explains how the initial development of the AWIPS Anvil Tool was carried out. After testing was

  14. Advanced Concept Modeling

    NASA Technical Reports Server (NTRS)

    Chaput, Armand; Johns, Zachary; Hodges, Todd; Selfridge, Justin; Bevirt, Joeben; Ahuja, Vivek

    2015-01-01

    Advanced Concepts Modeling software validation, analysis, and design. This was a National Institute of Aerospace contract with many components. Efforts ranged from software development and validation for structures and aerodynamics, through flight control development and aeropropulsive analysis, to UAV piloting services.

  15. Advances of implementing NC machine tools discussed

    NASA Astrophysics Data System (ADS)

    Kukuyev, Y. P.; Trukhan, Y. V.

    1984-11-01

    Numerical control (NC) machine tools, one of the principal resources for the reequipment, mechanization, and automation of small-series and series production in machine building, are examined. The continually increasing volume of NC machine tools produced and introduced makes their commissioning and effective use economically significant. Organizational and technical measures were directed at solving these problems. To ensure the fastest introduction of NC machine tools into operation and their technical maintenance, a number of setup organizations were established. Setup services are also provided by the plants manufacturing the NC machine tools, with appropriate subdivisions created for this purpose.

  16. Advanced tool kits for EPR security.

    PubMed

    Blobel, B

    2000-11-01

    Responding to the challenge for efficient and high quality health care, the shared care paradigm must be established in health care. In that context, information systems such as electronic patient records (EPR) have to meet this paradigm, supporting communication and interoperation between the health care establishments (HCE) and health professionals (HP) involved. Due to the sensitivity of personal medical information, this co-operation must be provided in a trustworthy way. To enable the different views of HCEs and HPs, ranging from management, doctors, and nurses to systems administrators and IT professionals, a set of models for analysis, design, and implementation of secure distributed EPRs has been developed and introduced. The approach is based on the popular UML methodology and the component paradigm for open, interoperable systems. Easy-to-use tool kits deal with both application security services and communication security services, but also with the security infrastructure needed. Regarding the requirements for distributed multi-user EPRs, modelling and implementation of policy agreements, authorisation, and access control are especially considered. Current developments for a security infrastructure in health care based on cryptographic algorithms, such as health professional cards (HPC), security services employing digital signatures, and health-related TTP services, are discussed. CEN and ISO initiatives for health informatics standards in the context of secure and communicable EPR are especially mentioned. PMID:11154968

  17. Advanced Electric Submersible Pump Design Tool for Geothermal Applications

    SciTech Connect

    Xuele Qi; Norman Turnquist; Farshad Ghasripoor

    2012-05-31

    Electrical Submersible Pumps (ESPs) present higher efficiency and larger production rates, and can be operated in deeper wells than other geothermal artificial lifting systems. Enhanced Geothermal Systems (EGS) applications recommend lifting 300 °C geothermal water at an 80 kg/s flow rate in a maximum 10-5/8-inch diameter wellbore to improve cost-effectiveness. In this paper, an advanced ESP design tool comprising a 1D theoretical model and a 3D CFD analysis has been developed to design ESPs for geothermal applications. Design of Experiments was also performed to optimize the geometry and performance. The designed mixed-flow type centrifugal impeller and diffuser exhibit high efficiency and head rise under simulated EGS conditions. The design tool has been validated by comparing the prediction to experimental data of an existing ESP product.
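A 1D theoretical pump model of the kind described typically starts from the Euler turbomachinery equation for the ideal head rise of a centrifugal stage. A minimal mean-line sketch, purely illustrative and not the actual design tool:

```python
# Illustrative 1D mean-line estimate: the Euler turbomachinery equation,
# H = (u2*cu2 - u1*cu1) / g, for the ideal head rise of a centrifugal
# impeller, where u = omega*r is blade speed and cu the tangential (swirl)
# component of the absolute flow velocity at inlet (1) and outlet (2).
G = 9.81  # gravitational acceleration, m/s^2

def euler_head(r1, r2, omega, cu1, cu2):
    """Ideal head rise (m): radii in m, shaft speed omega in rad/s,
    swirl velocities cu1/cu2 in m/s."""
    u1, u2 = omega * r1, omega * r2
    return (u2 * cu2 - u1 * cu1) / G
```

Real head rise is lower due to slip, friction, and incidence losses, which is where the 3D CFD analysis refines the 1D estimate.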

  18. An online model composition tool for system biology models

    PubMed Central

    2013-01-01

    Background There are multiple representation formats for Systems Biology computational models, and the Systems Biology Markup Language (SBML) is one of the most widely used. SBML is used to capture, store, and distribute computational models by Systems Biology data sources (e.g., the BioModels Database) and researchers. Therefore, there is a need for all-in-one web-based solutions that support advanced SBML functionalities such as uploading, editing, composing, visualizing, simulating, querying, and browsing computational models. Results We present the design and implementation of the Model Composition Tool (Interface) within the PathCase-SB (PathCase Systems Biology) web portal. The tool helps users compose systems biology models to facilitate the complex process of merging systems biology models. We also present three tools that support the model composition tool, namely, (1) Model Simulation Interface that generates a visual plot of the simulation according to the user’s input, (2) iModel Tool as a platform for users to upload their own models to compose, and (3) SimCom Tool that provides a side-by-side comparison of models being composed in the same pathway. Finally, we provide a web site that hosts BioModels Database models and a separate web site that hosts SBML Test Suite models. Conclusions The model composition tool (and the other three tools) can be used with little or no knowledge of the SBML document structure. For this reason, students or anyone who wants to learn about systems biology will benefit from the described functionalities. SBML Test Suite models will be a nice starting point for beginners, and, for more advanced purposes, users will be able to access and employ models of the BioModels Database as well. PMID:24006914
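The de-duplication step at the heart of model composition can be illustrated on simplified SBML-like documents with Python's standard-library XML parser. This is only a sketch of the idea: a real tool such as PathCase-SB works with full SBML (e.g., via libSBML) and must also reconcile compartments, reactions, and unit definitions; the element and attribute names below are simplified.

```python
# Minimal sketch of the merge step behind a model composition tool:
# combine the species of two simplified SBML-like documents, de-duplicating
# by id (species shared between models, like "atp" below, appear once).
import xml.etree.ElementTree as ET

def merge_species(sbml_a, sbml_b):
    """Return the sorted unique species ids found in either model string."""
    ids = set()
    for doc in (sbml_a, sbml_b):
        root = ET.fromstring(doc)
        for sp in root.iter("species"):
            ids.add(sp.get("id"))
    return sorted(ids)

# Two toy "models" sharing one species (illustrative, not valid SBML):
model_a = ('<model><listOfSpecies><species id="glucose"/>'
           '<species id="atp"/></listOfSpecies></model>')
model_b = ('<model><listOfSpecies><species id="atp"/>'
           '<species id="adp"/></listOfSpecies></model>')
```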

  19. Alternative Fuel and Advanced Vehicle Tools (AFAVT), AFDC (Fact Sheet)

    SciTech Connect

    Not Available

    2010-01-01

    The Alternative Fuels and Advanced Vehicles Web site offers a collection of calculators, interactive maps, and informational tools to assist fleets, fuel providers, and others looking to reduce petroleum consumption in the transportation sector.

  20. Innovative Tools Advance Revolutionary Weld Technique

    NASA Technical Reports Server (NTRS)

    2009-01-01

    The iconic, orange external tank of the space shuttle launch system not only contains the fuel used by the shuttle's main engines during liftoff but also comprises the shuttle's backbone, supporting the space shuttle orbiter and solid rocket boosters. Given the tank's structural importance and the extreme forces (7.8 million pounds of thrust load) and temperatures it encounters during launch, the welds used to construct the tank must be highly reliable. Variable polarity plasma arc welding, developed for manufacturing the external tank and later employed for building the International Space Station, was until 1994 the best process for joining the aluminum alloys used during construction. That year, Marshall Space Flight Center engineers began experimenting with a relatively new welding technique called friction stir welding (FSW), developed in 1991 by The Welding Institute, of Cambridge, England. FSW differs from traditional fusion welding in that it is a solid-state welding technique, using frictional heat and motion to join structural components without actually melting any of the material. The weld is created by a shouldered pin tool that is plunged into the seam of the materials to be joined. The tool traverses the line while rotating at high speeds, generating friction that heats and softens but does not melt the metal. (The heat produced approaches about 80 percent of the metal's melting temperature.) The pin tool's rotation crushes and stirs the plasticized metal, extruding it along the seam as the tool moves forward. The material cools and consolidates, resulting in a weld with superior mechanical properties as compared to those weld properties of fusion welds. The innovative FSW technology promises a number of attractive benefits. Because the welded materials are not melted, many of the undesirables associated with fusion welding (porosity, cracking, shrinkage, and distortion of the weld) are minimized or avoided. 
The process is more energy efficient, safe

  1. Terahertz Tools Advance Imaging for Security, Industry

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Picometrix, a wholly owned subsidiary of Advanced Photonix Inc. (API), of Ann Arbor, Michigan, invented the world's first commercial terahertz system. The company improved the portability and capabilities of their systems through Small Business Innovation Research (SBIR) agreements with Langley Research Center to provide terahertz imaging capabilities for inspecting the space shuttle external tanks and orbiters. Now API's systems make use of the unique imaging capacity of terahertz radiation on manufacturing floors, for thickness measurements of coatings, pharmaceutical tablet production, and even art conservation.

  2. Advancing representation of hydrologic processes in the Soil and Water Assessment Tool (SWAT) through integration of the TOPographic MODEL (TOPMODEL) features

    NASA Astrophysics Data System (ADS)

    Chen, Ji; Wu, Yiping

    2012-02-01

    Summary: This paper presents a study of the integration of the Soil and Water Assessment Tool (SWAT) model and the TOPographic MODEL (TOPMODEL) features for enhancing the physical representation of hydrologic processes. In SWAT, four hydrologic processes, which are surface runoff, baseflow, groundwater re-evaporation and deep aquifer percolation, are modeled by using a group of empirical equations. The empirical equations usually constrain the simulation capability of relevant processes. To replace these equations and to model the influences of topography and water table variation on streamflow generation, the TOPMODEL features are integrated into SWAT, and a new model, the so-called SWAT-TOP, is developed. In the new model, the process of deep aquifer percolation is removed, the concept of groundwater re-evaporation is refined, and the processes of surface runoff and baseflow are remodeled. Consequently, three parameters in SWAT are discarded, and two new parameters to reflect the TOPMODEL features are introduced. SWAT-TOP and SWAT are applied to the East River basin in South China, and the results reveal that, compared with SWAT, the new model can provide a more reasonable simulation of the hydrologic processes of surface runoff, groundwater re-evaporation, and baseflow. This study evidences that an established hydrologic model can be further improved by integrating the features of another model, which is a possible way to enhance our understanding of the workings of catchments.
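The TOPMODEL feature central to this kind of integration is the topographic wetness index, ln(a / tan(beta)), which determines where the water table first reaches the surface and thus where saturation-excess runoff is generated. A hedged per-cell sketch (illustrative, not the SWAT-TOP code):

```python
# Hedged sketch: the TOPMODEL topographic wetness index ln(a / tan(beta)),
# where a is the upslope contributing area per unit contour length and beta
# the local slope angle. Cells with a high index (large drained area, gentle
# slope) saturate first and generate surface runoff.
import math

def topographic_index(upslope_area, contour_length, slope_rad):
    """ln(a / tan(beta)), with a = upslope_area / contour_length.
    Areas/lengths in consistent units (e.g., m^2 and m); slope in radians."""
    a = upslope_area / contour_length
    return math.log(a / math.tan(slope_rad))
```

For example, flattening the slope while holding drainage area fixed raises the index, reflecting a greater tendency to saturate.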

  3. Advancing representation of hydrologic processes in the Soil and Water Assessment Tool (SWAT) through integration of the TOPographic MODEL (TOPMODEL) features

    USGS Publications Warehouse

    Chen, J.; Wu, Y.

    2012-01-01

    This paper presents a study of the integration of the Soil and Water Assessment Tool (SWAT) model and the TOPographic MODEL (TOPMODEL) features for enhancing the physical representation of hydrologic processes. In SWAT, four hydrologic processes, which are surface runoff, baseflow, groundwater re-evaporation and deep aquifer percolation, are modeled by using a group of empirical equations. The empirical equations usually constrain the simulation capability of relevant processes. To replace these equations and to model the influences of topography and water table variation on streamflow generation, the TOPMODEL features are integrated into SWAT, and a new model, the so-called SWAT-TOP, is developed. In the new model, the process of deep aquifer percolation is removed, the concept of groundwater re-evaporation is refined, and the processes of surface runoff and baseflow are remodeled. Consequently, three parameters in SWAT are discarded, and two new parameters to reflect the TOPMODEL features are introduced. SWAT-TOP and SWAT are applied to the East River basin in South China, and the results reveal that, compared with SWAT, the new model can provide a more reasonable simulation of the hydrologic processes of surface runoff, groundwater re-evaporation, and baseflow. This study evidences that an established hydrologic model can be further improved by integrating the features of another model, which is a possible way to enhance our understanding of the workings of catchments.

  4. Advances in nanocrystallography as a proteomic tool.

    PubMed

    Pechkova, Eugenia; Bragazzi, Nicola Luigi; Nicolini, Claudio

    2014-01-01

    In order to overcome the difficulties and hurdles too often encountered in crystallizing a protein with conventional techniques, our group has introduced the innovative Langmuir-Blodgett (LB)-based crystallization, as a major advance in the field of both structural and functional proteomics, thus pioneering the emerging field of so-called nanocrystallography or nanobiocrystallography. This approach uniquely combines protein crystallography and nanotechnologies within an integrated, coherent framework that allows one to obtain highly stable protein crystals and to fully characterize them at a nano- and subnanoscale. A variety of experimental techniques and theoretical/semi-theoretical approaches, ranging from atomic force microscopy, circular dichroism, Raman spectroscopy and other spectroscopic methods, and microbeam grazing-incidence small-angle X-ray scattering to in silico simulations, bioinformatics, and molecular dynamics, has been exploited in order to study the LB films and to investigate the kinetics and the main features of LB-grown crystals. When compared to classical hanging-drop crystallization, the LB technique appears strikingly superior and yields results comparable with crystallization in microgravity environments. Therefore, the achievement of LB-based crystallography can have a tremendous impact in the field of industrial and clinical/therapeutic applications, opening new perspectives for personalized medicine. These implications are envisaged and discussed in the present contribution.

  5. Advanced CAN (Controller Area Network) Tool

    SciTech Connect

    Terry, D.J.

    2000-03-17

    The CAN interface cards that are currently in use are PCMCIA based and use a microprocessor and CAN chip that are no longer in production. The long-term support of the SGT CAN interface is of concern due to this issue, along with performance inadequacies and limited technical support. The CAN bus is at the heart of the SGT trailer. If the CAN bus in the SGT trailer cannot be maintained adequately, then the trailer itself cannot be maintained adequately. These concerns led to the need for a CRADA to help develop a new product that would be called the ''Gryphon'' CAN tool. FM&T provided manufacturing expertise along with design criteria to ensure SGT compatibility and long-term support. FM&T also provided resources for software support. Dearborn provided software and hardware design expertise to implement the necessary requirements. Both partners worked around heavy internal workloads to support completion of the project. This CRADA establishes a US source for an item that is very critical to support the SGT project. The Dearborn Group had the same goal to provide a US alternative to German suppliers. The Dearborn Group was also interested in developing a CAN product that has performance characteristics that place the Gryphon in a class by itself. This enhanced product not only meets and exceeds SGT requirements; it has opened up options that were not even considered before the project began. The cost of the product is also less than the European options.

  6. Advanced Chemistry Basins Model

    SciTech Connect

    William Goddard; Mario Blanco; Lawrence Cathles; Paul Manhardt; Peter Meulbroek; Yongchun Tang

    2002-11-10

    The DOE-funded Advanced Chemistry Basin model project is intended to develop a public domain, user-friendly basin modeling software under PC or low end workstation environment that predicts hydrocarbon generation, expulsion, migration and chemistry. The main features of the software are that it will: (1) afford users the most flexible way to choose or enter kinetic parameters for different maturity indicators; (2) afford users the most flexible way to choose or enter compositional kinetic parameters to predict hydrocarbon composition (e.g., gas/oil ratio (GOR), wax content, API gravity, etc.) at different kerogen maturities; (3) calculate the chemistry, fluxes and physical properties of all hydrocarbon phases (gas, liquid and solid) along the primary and secondary migration pathways of the basin and predict the location and intensity of phase fractionation, mixing, gas washing, etc.; and (4) predict the location and intensity of de-asphaltene processes. The project has been operative for 36 months, and is on schedule for a successful completion at the end of FY 2003.
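Kinetic parameters for maturity indicators, feature (1) above, are typically first-order Arrhenius rates. A minimal constant-temperature sketch of how such parameters translate into a transformation ratio (the activation energy and frequency factor below are illustrative placeholders, not the project's calibrated kinetics):

```python
# Hedged sketch: first-order Arrhenius kinetics of the kind used for kerogen
# maturity indicators. At constant temperature T, the fraction of kerogen
# converted is x = 1 - exp(-k t), with rate k = A exp(-Ea / (R T)).
import math

R = 8.314  # universal gas constant, J/(mol K)

def transformation_ratio(A, Ea, T_kelvin, t_seconds):
    """Fraction converted after t_seconds at T_kelvin, given frequency
    factor A (1/s) and activation energy Ea (J/mol)."""
    k = A * math.exp(-Ea / (R * T_kelvin))
    return 1.0 - math.exp(-k * t_seconds)
```

Basin models integrate this rate over a burial/temperature history and usually over a distribution of activation energies rather than a single Ea; the constant-temperature case just shows the building block.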

  7. The Application of the NASA Advanced Concepts Office, Launch Vehicle Team Design Process and Tools for Modeling Small Responsive Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Threet, Grady E.; Waters, Eric D.; Creech, Dennis M.

    2012-01-01

    The Advanced Concepts Office (ACO) Launch Vehicle Team at the NASA Marshall Space Flight Center (MSFC) is recognized throughout NASA for launch vehicle conceptual definition and pre-phase A concept design evaluation. The Launch Vehicle Team has been instrumental in defining the vehicle trade space for many of NASA's high-level launch system studies from the Exploration Systems Architecture Study (ESAS) through the Augustine Report, Constellation, and now Space Launch System (SLS). The Launch Vehicle Team's approach to rapid turn-around and comparative analysis of multiple launch vehicle architectures has played a large role in narrowing the design options for future vehicle development. Recently the Launch Vehicle Team has been developing versions of their vetted tools used on large launch vehicles and repackaged the process and capability to apply to smaller more responsive launch vehicles. Along this development path the LV Team has evaluated trajectory tools and assumptions against sounding rocket trajectories and air launch systems, begun altering subsystem mass estimating relationships to handle smaller vehicle components, and as an additional development driver, have begun an in-house small launch vehicle study. With the recent interest in small responsive launch systems and the known capability and response time of the ACO LV Team, ACO's launch vehicle assessment capability can be utilized to rapidly evaluate the vast and opportune trade space that small launch vehicles currently encompass. This would provide a great benefit to the customer in order to reduce that large trade space to a select few alternatives that should best fit the customer's payload needs.

  8. Development and Integration of an Advanced Stirling Convertor Linear Alternator Model for a Tool Simulating Convertor Performance and Creating Phasor Diagrams

    NASA Technical Reports Server (NTRS)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2013-01-01

    A simple model of the Advanced Stirling Convertor's (ASC) linear alternator and an AC bus controller has been developed and combined with a previously developed thermodynamic model of the convertor for a more complete simulation and analysis of system performance. The model was developed using Sage, a 1-D thermodynamic modeling program that now includes electromagnetic components. The convertor, consisting of a free-piston Stirling engine combined with a linear alternator, has sufficiently sinusoidal steady-state behavior to allow phasor analysis of the forces and voltages acting in the system. A MATLAB graphical user interface (GUI) has been developed to interface with the Sage software for simplified use of the ASC model, calculation of forces, and automated creation of phasor diagrams. The GUI allows the user to vary convertor parameters while fixing different input or output parameters and to observe the effect on the phasor diagrams or system performance. The new ASC model and GUI foster a better understanding of the relationship between the electrical component voltages and mechanical forces, allowing better insight into overall convertor dynamics and performance.
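Phasor analysis of the kind described here treats each steady-state sinusoid as a complex number, so forces and voltages can be summed algebraically instead of solving differential equations. A minimal illustrative sketch (not the Sage/MATLAB tool itself; the force terms and their values are invented for demonstration):

```python
import cmath
import math

def phasor(amplitude, phase_deg):
    """Represent a steady-state sinusoid A*cos(wt + phi) as a complex phasor."""
    return cmath.rect(amplitude, math.radians(phase_deg))

# Hypothetical force terms acting on the piston (amplitudes in N, phases in
# degrees) -- illustrative numbers only, not values from the ASC model.
pressure_force = phasor(120.0, 0.0)
damping_force = phasor(45.0, -90.0)   # lags displacement by 90 degrees
spring_force = phasor(80.0, 180.0)    # opposes displacement

# Phasors add like vectors, so the net steady-state force is a complex sum.
net = pressure_force + damping_force + spring_force
net_amplitude = abs(net)
net_phase_deg = math.degrees(cmath.phase(net))
```

Plotting each phasor as an arrow in the complex plane yields exactly the kind of phasor diagram the GUI automates.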

  9. Astonishing advances in mouse genetic tools for biomedical research.

    PubMed

    Kaczmarczyk, Lech; Jackson, Walker S

    2015-01-01

    The humble house mouse has long been a workhorse model system in biomedical research. The technology for introducing site-specific genome modifications led to Nobel Prizes for its pioneers and opened a new era of mouse genetics. However, this technology was very time-consuming and technically demanding. As a result, many investigators continued to employ easier genome manipulation methods, though resulting models can suffer from overlooked or underestimated consequences. Another breakthrough, invaluable for the molecular dissection of disease mechanisms, was the invention of high-throughput methods to measure the expression of a plethora of genes in parallel. However, the use of samples containing material from multiple cell types could obfuscate data, and thus interpretations. In this review we highlight some important issues in experimental approaches using mouse models for biomedical research. We then discuss recent technological advances in mouse genetics that are revolutionising human disease research. Mouse genomes are now easily manipulated at precise locations thanks to guided endonucleases, such as transcription activator-like effector nucleases (TALENs) or the CRISPR/Cas9 system, both also having the potential to turn the dream of human gene therapy into reality. Newly developed methods of cell type-specific isolation of transcriptomes from crude tissue homogenates, followed by detection with next generation sequencing (NGS), are vastly improving gene regulation studies. Taken together, these amazing tools simplify the creation of much more accurate mouse models of human disease, and enable the extraction of hitherto unobtainable data. PMID:26513700


  11. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of the Moon and Mars. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water and to process wastes in order to reduce the need for resource resupply. By assuming steady-state operation, ALSSAT provides a means of investigating combinations of such subsystem technologies and thereby assists in determining the most cost-effective technology combination available; it can perform sizing analysis of ALS subsystems whether their underlying operation is dynamic or steady-state in nature. Developed in Microsoft Excel with the Visual Basic programming language, ALSSAT performs multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies obtained by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.
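The Equivalent System Mass (ESM) metric mentioned here folds a subsystem's volume, power, cooling, and crew-time burdens into a single mass-like figure so competing technologies can be compared on one axis. A minimal sketch of the idea, with placeholder equivalency factors rather than ALSSAT's actual values:

```python
def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw,
                           crew_time_h_per_yr, duration_yr,
                           v_eq=9.16, p_eq=237.0, c_eq=60.0, ct_eq=1.0):
    """Equivalent System Mass (kg): hardware mass plus infrastructure
    penalties for volume, power, cooling, and crew time.

    The equivalency factors (kg per m^3, per kW, per crew-hour) are
    illustrative placeholders; real studies take them from mission-specific
    infrastructure estimates.
    """
    return (mass_kg
            + volume_m3 * v_eq
            + power_kw * p_eq
            + cooling_kw * c_eq
            + crew_time_h_per_yr * duration_yr * ct_eq)
```

A trade study then amounts to evaluating this figure for each candidate technology combination and ranking the totals.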

  12. Expert Models and Modeling Processes Associated with a Computer-Modeling Tool

    ERIC Educational Resources Information Center

    Zhang, BaoHui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-01-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using "think aloud" technique…

  13. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  14. Advances in scientific balloon thermal modeling

    NASA Astrophysics Data System (ADS)

    Bohaboj, T.; Cathey, H.

    The National Aeronautics and Space Administration's Balloon Program Office has long acknowledged that accurate modeling of balloon performance and flight prediction is dependent on how well the balloon is thermally modeled. This ongoing effort is focused on developing accurate balloon thermal models that can be used to quickly predict balloon temperatures and balloon performance. The ability to model parametric changes is also a driver for this effort. This paper presents the most recent advances made in this area. The research effort continues to utilize the "Thermal Desktop" extension to AutoCAD for the modeling, and recent advances have been made using this analytical tool. A number of analyses have been completed to test the applicability of the tool to the problem, with very positive results. Progressively detailed models have been developed to explore the capabilities of the tool as well as to provide guidance in model formulation. A number of parametric studies have been completed, varying the shape of the structure, material properties, environmental inputs, and model geometry. These studies have concentrated on spherical "proxy models" for the initial development stages, with a transition to the natural-shaped zero-pressure and superpressure balloons. An assessment of required model resolution has also been made. Model solutions have been cross-checked against known solutions via hand calculations, and the comparison of these cases is also presented. One goal is to develop analysis guidelines and an approach for modeling balloons both for simple first-order estimates and for detailed full models. This paper presents the step-by-step advances made as part of this effort, along with capabilities, limitations, lessons learned, and plans for further thermal modeling work.
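The hand calculations used to cross-check such models are typically zeroth-order energy balances. As a hedged sketch (assumed film properties and fluxes, not the Thermal Desktop model): a thin film element absorbs solar and upwelling infrared flux and emits from both faces, giving an equilibrium temperature of the form

```python
import math

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def film_equilibrium_temp(alpha_solar, eps_ir, solar_flux, earth_ir_flux):
    """First-order equilibrium temperature (K) of a thin balloon-film element:
    absorbed solar + absorbed upwelling IR = IR emitted from both faces,
    i.e. T = (absorbed / (2 * eps * sigma))**0.25.
    """
    absorbed = alpha_solar * solar_flux + eps_ir * earth_ir_flux
    return (absorbed / (2.0 * eps_ir * SIGMA)) ** 0.25

# Assumed daytime float values: weakly absorbing polyethylene-like film,
# full solar constant, and a typical Earth IR flux.
t_film = film_equilibrium_temp(0.02, 0.1, 1360.0, 240.0)
```

A detailed model replaces this single balance with per-element view factors, conduction, and convection, but the hand estimate bounds what the full model should report.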

  15. Advanced Fuel Cycle Economic Tools, Algorithms, and Methodologies

    SciTech Connect

    David E. Shropshire

    2009-05-01

    The Advanced Fuel Cycle Initiative (AFCI) Systems Analysis supports engineering economic analyses and trade studies, and requires a reference cost basis to support adequate analysis rigor. To this end, the AFCI program has created a reference set of economic documentation consisting of the "Advanced Fuel Cycle (AFC) Cost Basis" report (Shropshire, et al. 2007), the "AFCI Economic Analysis" report, and the "AFCI Economic Tools, Algorithms, and Methodologies Report." Together, these documents provide the reference cost basis, cost modeling basis, and methodologies needed to support AFCI economic analysis. The application of the reference cost data in the cost and econometric systems analysis models is supported by this report. The methodologies include: the energy/environment/economic evaluation of nuclear technology penetration in the energy market, domestically and internationally, and its impacts on AFCI facility deployment; uranium resource modeling to inform front-end fuel cycle costs; facility first-of-a-kind to nth-of-a-kind learning with application to deployment of AFCI facilities; cost tradeoffs to meet nuclear non-proliferation requirements; and international nuclear facility supply/demand analysis. The economic analysis will be performed using two cost models: VISION.ECON will be used to evaluate and compare costs under dynamic conditions, consistent with the cases and analysis performed by the AFCI Systems Analysis team, while Generation IV Excel Calculations of Nuclear Systems (G4-ECONS) will provide static (snapshot-in-time) cost analysis and a check on the dynamic results. In future analysis, additional AFCI measures may be developed to show the value of AFCI in closing the fuel cycle. Comparisons can show AFCI in terms of reduced global proliferation (e.g., reduction in enrichment), greater sustainability through preservation of a natural resource (e.g., reduction in uranium ore depletion), value from
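First-of-a-kind (FOAK) to nth-of-a-kind (NOAK) learning is commonly modeled with a unit-cost learning curve, where each doubling of cumulative units built multiplies the unit cost by a fixed learning rate. A minimal sketch of that standard formulation (the AFCI reports may use a different or more detailed one):

```python
import math

def nth_of_a_kind_cost(foak_cost, n, learning_rate=0.9):
    """Unit-cost learning curve: cost of the nth unit given the
    first-of-a-kind cost. A learning_rate of 0.9 means each doubling of
    cumulative units cuts unit cost by 10%. The 0.9 default is an
    illustrative assumption, not an AFCI parameter.
    """
    b = math.log(learning_rate) / math.log(2.0)  # learning exponent
    return foak_cost * n ** b
```

For example, with a 90% learning rate the fourth unit costs 0.9 * 0.9 = 81% of the first (two doublings).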

  16. Advancing computational methods for calibration of the Soil and Water Assessment Tool (SWAT): Application for modeling climate change impacts on water resources in the Upper Neuse Watershed of North Carolina

    NASA Astrophysics Data System (ADS)

    Ercan, Mehmet Bulent

    Non-Dominated Sorting Genetic Algorithm II (NSGA-II). This tool was demonstrated through an application for the Upper Neuse Watershed in North Carolina, USA. The objective functions used for the calibration were Nash-Sutcliffe efficiency (E) and Percent Bias (PB), and the objective sites were the Flat, Little, and Eno watershed outlets. The results show that the use of multi-objective calibration algorithms for SWAT calibration improved model performance, especially in terms of minimizing PB, compared to single-objective model calibration. The third study builds upon the first two by leveraging the new calibration methods and tools to study future climate impacts on the Upper Neuse watershed. Statistically downscaled outputs from eight Global Circulation Models (GCMs) were used, for both low- and high-emission scenarios, to drive a well-calibrated SWAT model of the Upper Neuse watershed. The objective of the study was to understand the potential hydrologic response of the watershed, which serves as a public water supply for the growing Research Triangle Park region of North Carolina, under projected climate change scenarios. The future climate change scenarios, in general, indicate an increase in precipitation and temperature for the watershed in coming decades. The SWAT simulations using the future climate scenarios, in general, suggest an increase in soil water and water yield, and a decrease in evapotranspiration, within the Upper Neuse watershed. In summary, this dissertation advances the field of watershed-scale hydrologic modeling by (i) providing some of the first work to apply cloud computing for the computationally demanding task of model calibration; (ii) providing a new, open-source library that can be used by SWAT modelers to perform multi-objective calibration of their models; and (iii) advancing understanding of climate change impacts on water resources for an important watershed in the Research Triangle Park region of North Carolina.
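The two objective functions named in this record have standard definitions in hydrologic model evaluation. A plain-Python sketch of both (not the dissertation's calibration library):

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; values <= 0 mean the
    model predicts no better than the mean of the observations."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def percent_bias(obs, sim):
    """Percent bias: 0 is ideal. Using the common convention
    PBIAS = 100 * sum(obs - sim) / sum(obs), positive values indicate
    model underestimation."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)
```

A multi-objective calibrator such as NSGA-II searches for parameter sets that trade off maximizing the first metric against driving the second toward zero, at each of the calibration outlets.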

  17. Advanced Chemistry Basins Model

    SciTech Connect

    Blanco, Mario; Cathles, Lawrence; Manhardt, Paul; Meulbroek, Peter; Tang, Yongchun

    2003-02-13

    The objectives of this project are to: (1) develop a database of additional and better maturity indicators for paleo-heat flow calibration; (2) develop maturation models capable of predicting the chemical composition of hydrocarbons produced by a specific kerogen as a function of maturity, heating rate, etc., and assemble a compositional kinetic database of representative kerogens; (3) develop a four-phase equation-of-state flash model that can define the physical properties (viscosity, density, etc.) of the products of kerogen maturation and the phase transitions that occur along secondary migration pathways; (4) build a conventional basin model and incorporate the new maturity indicators and databases in a user-friendly way; (5) develop an algorithm which combines the volume changes and viscosities from the compositional maturation model to predict the chemistry of the hydrocarbons that will be expelled from the kerogen to the secondary migration pathways; (6) develop an algorithm that predicts the flow of hydrocarbons along secondary migration pathways, accounts for mixing of miscible hydrocarbon components along the pathway, and calculates the phase fractionation that will occur as the hydrocarbons move upward, down the geothermal and fluid pressure gradients in the basin; and (7) integrate the above components into a functional model implemented on a PC or low-cost workstation.

  18. NATIONAL URBAN DATABASE AND ACCESS PORTAL TOOL (NUDAPT): FACILITATING ADVANCEMENTS IN URBAN METEOROLOGY AND CLIMATE MODELING WITH COMMUNITY-BASED URBAN DATABASES

    EPA Science Inventory

    We discuss the initial design and application of the National Urban Database and Access Portal Tool (NUDAPT). This new project is sponsored by the USEPA and involves collaborations and contributions from many groups from federal and state agencies, and from private and academic i...

  19. Advanced Turbulence Modeling Concepts

    NASA Technical Reports Server (NTRS)

    Shih, Tsan-Hsing

    2005-01-01

    The ZCET program at NASA Glenn Research Center studies hydrogen/air injection concepts for aircraft gas turbine engines that meet conventional gas turbine performance levels and provide low levels of harmful NOx emissions. A CFD study for the ZCET program has been successfully carried out, using the most recently enhanced National Combustion Code (NCC) to perform CFD simulations for two configurations of hydrogen fuel injectors (the GRC and Sandia injectors). The results can be used to assist experimental studies in producing quick-mixing, low-emission, and high-performance fuel injector designs. The work started with the configuration of the single-hole injector. The computational models were taken from the experimental designs: for example, the GRC single-hole injector consists of one air tube (0.78 inches long and 0.265 inches in diameter) and two hydrogen tubes (0.3 inches long and 0.0226 inches in diameter, opposed at 180 degrees). The hydrogen tubes are located 0.3 inches upstream from the exit of the air element (the inlet location for the combustor). For the simulation, the single-hole injector is connected to a combustor model (8.16 inches long and 0.5 inches in diameter). The inlet conditions for the air and hydrogen elements are defined according to the actual experimental designs. Two crossing jets of hydrogen/air are simulated in detail in the injector. The cold flow, reacting flow, flame temperature, combustor pressure, and possible flashback phenomena are studied. Two grid resolutions of the numerical model have been adopted: the first computational grid contains 0.52 million elements, the second over 1.3 million elements. The CFD results have shown only about a 5% difference between the two grid resolutions; therefore, the result obtained from the 1.3-million-element model can be considered a grid-independent numerical solution. Turbulence models built into NCC are consolidated and well tested. They can handle both coarse and

  20. STRING 3: An Advanced Groundwater Flow Visualization Tool

    NASA Astrophysics Data System (ADS)

    Schröder, Simon; Michel, Isabel; Biedert, Tim; Gräfe, Marius; Seidel, Torsten; König, Christoph

    2016-04-01

    The visualization of 3D groundwater flow is a challenging task. Previous versions of our software STRING [1] focused solely on intuitive visualization of complex flow scenarios for non-professional audiences. STRING, developed by Fraunhofer ITWM (Kaiserslautern, Germany) and delta h Ingenieurgesellschaft mbH (Witten, Germany), provides the necessary means for visualization of both 2D and 3D data on planar and curved surfaces. In this contribution we discuss how to extend this approach to a full 3D tool, and its challenges, in continuation of Michel et al. [2]. This elevates STRING from a post-production tool to an exploration tool for experts. In STRING, moving pathlets provide an intuition of the velocity and direction of both steady-state and transient flows. The visualization concept is based on the Lagrangian view of the flow. To capture every detail of the flow, an advanced method for intelligent, time-dependent seeding is used, building on the Finite Pointset Method (FPM) developed by Fraunhofer ITWM. Lifting our visualization approach from 2D into 3D brings many new challenges. With the implementation of a seeding strategy for 3D, one of the major problems has already been solved (see Schröder et al. [3]). As pathlets only provide an overview of the velocity field, other means are required for the visualization of additional flow properties. We suggest the use of Direct Volume Rendering and isosurfaces for scalar features. In this regard we were able to develop an efficient approach for combining rendering through raytracing of the volume with regular OpenGL geometries, achieved through the use of Depth Peeling or A-Buffers for the rendering of transparent geometries. Animation of pathlets requires a strict boundary of the simulation domain; hence, STRING needs to extract the boundary, even from unstructured data, if it is not provided. In 3D we additionally need a good visualization of the boundary itself. For this, the silhouette based on the angle of

  1. Component Modeling Approach Software Tool

    2010-08-23

    The Component Modeling Approach Software Tool (CMAST) establishes a set of performance libraries of approved components (frames, glass, and spacers) which can be accessed for configuring fenestration products for a project and obtaining a U-factor, Solar Heat Gain Coefficient (SHGC), and Visible Transmittance (VT) rating for those products, which can then be reflected in a CMA Label Certificate for code compliance. CMAST is web-based as well as client-based. The completed CMA program and software tool will be useful in several ways for a vast array of stakeholders in the industry: generating performance ratings for bidding projects; ascertaining credible and accurate performance data; and obtaining third-party certification of overall product performance for code compliance.
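Whole-product ratings of this kind combine component performance over the areas each component occupies. A simplified area-weighted sketch of the idea (the actual CMA/NFRC procedure is more detailed, modeling edge-of-glass regions and certified component data; the window geometry below is hypothetical):

```python
def area_weighted_rating(components):
    """Simplified whole-product rating (e.g., U-factor in W/m^2K) as an
    area-weighted average of component ratings.

    `components` is a list of (area_m2, rating) pairs, one per component
    region (frame, glazing, etc.).
    """
    total_area = sum(area for area, _ in components)
    return sum(area * rating for area, rating in components) / total_area

# Hypothetical window: 0.4 m^2 of frame at U=2.0, 1.6 m^2 of glazing at U=1.2.
u_whole_product = area_weighted_rating([(0.4, 2.0), (1.6, 1.2)])
```

The same weighting applies to SHGC and VT, with each component's solar or visible transmittance in place of its U-value.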

  2. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed-wing and rotary-wing aircraft, a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that differ from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools would mitigate the reliance on the proprietary LTA design tools in use today. A set of well-researched, open-source, high-fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. A survey of the tools currently available will be assessed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.
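The core difference from HTA design tools is that LTA sizing starts from static buoyancy rather than aerodynamic lift. As a minimal sketch of the governing relation (with assumed sea-level standard densities; not taken from any specific tool in the survey):

```python
def gross_buoyant_lift_kg(envelope_volume_m3, rho_air=1.225, rho_gas=0.1786):
    """Gross static lift (kg) of an LTA envelope: mass of displaced air
    minus mass of the lifting gas. Defaults assume sea-level standard air
    and helium densities (kg/m^3); both fall with altitude."""
    return envelope_volume_m3 * (rho_air - rho_gas)

# A 1000 m^3 helium envelope lifts roughly a tonne at sea level.
lift = gross_buoyant_lift_kg(1000.0)
```

An LTA design tool iterates this balance against structural, fuel, and payload masses over the flight envelope, which is what the proprietary tools mentioned above do at much higher fidelity.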

  3. Advances in Mass Spectrometric Tools for Probing Neuropeptides

    NASA Astrophysics Data System (ADS)

    Buchberger, Amanda; Yu, Qing; Li, Lingjun

    2015-07-01

    Neuropeptides are important mediators in the functionality of the brain and other neurological organs. Because neuropeptides exist in a wide range of concentrations, appropriate characterization methods are needed to provide dynamic, chemical, and spatial information. Mass spectrometry and compatible tools have been a popular choice for analyzing neuropeptides. There have been several advances and challenges, both of which are the focus of this review. Discussions range from sample collection to bioinformatic tools, including avenues such as quantitation and imaging. Further development of the presented methods for neuropeptidomic mass spectrometric analysis is inevitable, and will lead to a deeper understanding of the complex interplay of neuropeptides and other signaling molecules in the nervous system.

  4. Review on advanced composite materials boring mechanism and tools

    NASA Astrophysics Data System (ADS)

    Shi, Runping; Wang, Chengyong

    2010-12-01

    With the rapid development of aviation and aerospace manufacturing technology, advanced composite materials, represented by carbon fibre reinforced plastics (CFRP) and super hybrid composites (fibre/metal plates), are ever more widely applied. The fibres are mainly carbon, boron, aramid, and SiC; the matrices are resin, metal, and ceramic. Advanced composite materials have higher specific strength and specific modulus than the first-generation glass fibre reinforced resin composites. They are widely used in the aviation and aerospace industry for their high specific strength and modulus, excellent ductility, corrosion resistance, heat and sound insulation, shock absorption, and high- and low-temperature resistance, in components such as radomes, inlets, airfoils (fuel tanks included), flaps, ailerons, vertical and horizontal tails, air brakes, skins, baseboards, and tail sections. Hardness can reach 62-65 HRC. When drilling unidirectional laminates, holes are greatly affected by the fibre laminate direction of carbon fibre reinforced composite material owing to its anisotropy: stress concentration produces burrs and splits at the exit, delamination occurs, and the hole is prone to be undersized. Burrs are caused by poor sharpness of the cutting edge, while delamination, tearing, and splitting are caused by the high thrust force; a duller cutting edge also lowers cutting performance and raises drilling force. The present research focuses on the interrelation between rotation speed, feed, drill geometry, drill life, cutting mode, tool material, etc., and thrust force. At the same time, the number of holes required and the difficulty of making them in composites have both increased, demanding high-performance drills that avoid defects and have long tool life. It has become a trend to develop superhard-material tools and tools with special geometry for drilling

  5. Review on advanced composite materials boring mechanism and tools

    NASA Astrophysics Data System (ADS)

    Shi, Runping; Wang, Chengyong

    2011-05-01

    With the rapid development of aviation and aerospace manufacturing technology, advanced composite materials, represented by carbon fibre reinforced plastics (CFRP) and super hybrid composites (fibre/metal plates), are ever more widely applied. The fibres are mainly carbon, boron, aramid, and SiC; the matrices are resin, metal, and ceramic. Advanced composite materials have higher specific strength and specific modulus than the first-generation glass fibre reinforced resin composites. They are widely used in the aviation and aerospace industry for their high specific strength and modulus, excellent ductility, corrosion resistance, heat and sound insulation, shock absorption, and high- and low-temperature resistance, in components such as radomes, inlets, airfoils (fuel tanks included), flaps, ailerons, vertical and horizontal tails, air brakes, skins, baseboards, and tail sections. Hardness can reach 62-65 HRC. When drilling unidirectional laminates, holes are greatly affected by the fibre laminate direction of carbon fibre reinforced composite material owing to its anisotropy: stress concentration produces burrs and splits at the exit, delamination occurs, and the hole is prone to be undersized. Burrs are caused by poor sharpness of the cutting edge, while delamination, tearing, and splitting are caused by the high thrust force; a duller cutting edge also lowers cutting performance and raises drilling force. The present research focuses on the interrelation between rotation speed, feed, drill geometry, drill life, cutting mode, tool material, etc., and thrust force. At the same time, the number of holes required and the difficulty of making them in composites have both increased, demanding high-performance drills that avoid defects and have long tool life. It has become a trend to develop superhard-material tools and tools with special geometry for drilling

  6. ADVANCED MIXING MODELS

    SciTech Connect

    Lee, S; Richard Dimenna, R; David Tamburello, D

    2008-11-13

    The process of recovering the waste in storage tanks at the Savannah River Site (SRS) typically requires mixing the contents of the tank with one to four dual-nozzle jet mixers located within the tank. The typical criteria to establish a mixed condition in a tank are based on the number of pumps in operation and the time duration of operation. To ensure that a mixed condition is achieved, operating times are set conservatively long. This approach results in high operational costs because of the long mixing times and high maintenance and repair costs for the same reason. A significant reduction in both of these costs might be realized by reducing the required mixing time based on calculating a reliable indicator of mixing with a suitably validated computer code. The work described in this report establishes the basis for further development of the theory leading to the identified mixing indicators, the benchmark analyses demonstrating their consistency with widely accepted correlations, and the application of those indicators to SRS waste tanks to provide a better, physically based estimate of the required mixing time. Waste storage tanks at SRS contain settled sludge which varies in height from zero to 10 ft. The sludge has been characterized and modeled as micron-sized solids, typically 1 to 5 microns, at weight fractions as high as 20 to 30 wt%, specific gravities to 1.4, and viscosities up to 64 cp during motion. The sludge is suspended and mixed through the use of submersible slurry jet pumps. To suspend settled sludge, water is added to the tank as a slurry medium and stirred with the jet pump. Although there is considerable technical literature on mixing and solid suspension in agitated tanks, very little literature has been published on jet mixing in a large-scale tank. If shorter mixing times can be shown to support Defense Waste Processing Facility (DWPF) or other feed requirements, longer pump lifetimes can be achieved with associated operational cost and

  7. ADVANCED MIXING MODELS

    SciTech Connect

    Lee, S; Dimenna, R; Tamburello, D

    2011-02-14

    height from zero to 10 ft. The sludge has been characterized and modeled as micron-sized solids, typically 1 to 5 microns, at weight fractions as high as 20 to 30 wt%, specific gravities to 1.4, and viscosities up to 64 cp during motion. The sludge is suspended and mixed through the use of submersible slurry jet pumps. To suspend settled sludge, water is added to the tank as a slurry medium and stirred with the jet pump. Although there is considerable technical literature on mixing and solid suspension in agitated tanks, very little literature has been published on jet mixing in a large-scale tank. One of the main objectives in the waste processing is to provide feed of a uniform slurry composition at a certain weight percentage (e.g., typically ~13 wt% at SRS) over an extended period of time. In preparation of the sludge for slurrying, several important questions have been raised with regard to sludge suspension and mixing of the solid suspension in the bulk of the tank: (1) How much time is required to prepare a slurry with a uniform solid composition? (2) How long will it take to suspend and mix the sludge for uniform composition in any particular waste tank? (3) What are good mixing indicators to answer the questions concerning sludge mixing stated above in a general fashion applicable to any waste tank/slurry pump geometry and fluid/sludge combination?

  8. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    NASA Astrophysics Data System (ADS)

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-09-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independently of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with the procedures and concepts of a modern research field. Evaluating the learning outcomes with semi-structured interviews in a pre/post design, we find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX). This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  9. Simulated Interactive Research Experiments as Educational Tools for Advanced Science.

    PubMed

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M; Hopf, Martin; Arndt, Markus

    2015-09-15

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independently of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with the procedures and concepts of a modern research field. Evaluating the learning outcomes with semi-structured interviews in a pre/post design, we find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX). This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  10. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    PubMed Central

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-01-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independently of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with the procedures and concepts of a modern research field. Evaluating the learning outcomes with semi-structured interviews in a pre/post design, we find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX). This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields. PMID:26370627

  11. Model Rocketry: University-Level Educational Tool

    ERIC Educational Resources Information Center

    Barrowman, James S.

    1974-01-01

    Describes how model rocketry can be a useful educational tool at the university level as a practical application of theoretical aerodynamic concepts and as a tool for students in experimental research. (BR)

  12. New advanced radio diagnostics tools for Space Weather Program

    NASA Astrophysics Data System (ADS)

    Krankowski, A.; Rothkaehl, H.; Atamaniuk, B.; Morawski, M.; Zakharenkova, I.; Cherniak, I.; Otmianowska-Mazur, K.

    2013-12-01

    To give a more detailed and complete understanding of the physical plasma processes that govern the solar-terrestrial space, and to develop qualitative and quantitative models of the magnetosphere-ionosphere-thermosphere coupling, it is necessary to design and build the next generation of instruments for space diagnostics and monitoring. Novel ground-based wide-area sensor networks, such as the LOFAR (Low Frequency Array) radar facility, comprising wide-band, vector-sensing radio receivers, together with multi-spacecraft plasma diagnostics, should help solve outstanding problems of space physics and describe long-term environmental changes. The LOw Frequency ARray (LOFAR) is a new, fully digital radio telescope located in Europe and designed for frequencies between 30 MHz and 240 MHz. Three new LOFAR stations will be installed in Poland by summer 2015. The LOFAR facilities in Poland will be distributed among three sites: Lazy (east of Krakow), Borowiec near Poznan, and Baldy near Olsztyn. All of them will be connected to Poznan via dedicated PIONIER links. Each site will host one LOFAR station (96 high-band + 96 low-band antennas). Most of the time they will work as part of the European network; however, when less heavily loaded, they can operate as a national network. The new digital radio frequency analyzer (RFA) on board the low-orbiting RELEC satellite was designed to monitor and investigate the ionospheric plasma properties. This two-point diagnostic, ground-based and located in the topside ionosphere, can be a useful new tool for monitoring and diagnosing turbulent plasma properties. The RFA on board the RELEC satellite is the first in a series of experiments planned to be launched into the near-Earth environment.
In order to improve and validate the description of large-scale and small-scale ionospheric structures, we will use the GPS observations collected at the IGS/EPN network to reconstruct diurnal variations of TEC using all satellite passes over individual GPS stations and the
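    The TEC reconstruction mentioned above rests on the standard dual-frequency GPS relation: the ionosphere delays the two carriers differently, so the pseudorange difference gives the slant TEC along the receiver-satellite path. A minimal sketch, with an illustrative pseudorange difference rather than real IGS/EPN data:

```python
# Dual-frequency slant TEC from the GPS code pseudoranges P1, P2 (meters).
# The 1.6 m difference below is illustrative, not measured data.

F1 = 1575.42e6  # GPS L1 carrier frequency, Hz
F2 = 1227.60e6  # GPS L2 carrier frequency, Hz

def slant_tec(p1, p2):
    """Slant TEC in TEC units (1 TECU = 1e16 electrons/m^2)."""
    tec = (F1**2 * F2**2) / (40.3 * (F1**2 - F2**2)) * (p2 - p1)
    return tec / 1e16

print(f"{slant_tec(20_000_000.0, 20_000_001.6):.1f} TECU")
```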

  13. An Advanced Decision Support Tool for Electricity Infrastructure Operations

    SciTech Connect

    Chen, Yousu; Huang, Zhenyu; Wong, Pak C.; Mackey, Patrick S.; Allwardt, Craig H.; Ma, Jian; Greitzer, Frank L.

    2010-01-31

    Electricity infrastructure, as one of the most critical infrastructures in the U.S., plays an important role in modern societies. Its failure would lead to significant disruption of people’s lives, industry and commercial activities, and result in massive economic losses. Reliable operation of electricity infrastructure is an extremely challenging task because human operators need to consider thousands of possible configurations in near real-time to choose the best option and operate the network effectively. In today’s practice, electricity infrastructure operation is largely based on operators’ experience with very limited real-time decision support, resulting in inadequate management of complex predictions and the inability to anticipate, recognize, and respond to situations caused by human errors, natural disasters, or cyber attacks. Therefore, a systematic approach is needed to manage the complex operational paradigms and choose the best option in a near-real-time manner. This paper proposes an advanced decision support tool for electricity infrastructure operations. The tool turns large amounts of data into actionable information to help operators monitor power grid status in real time; performs trend analysis to identify system trends at the regional or system level, helping operators foresee and discern emergencies; performs clustering analysis to assist operators in identifying relationships between system configurations and affected assets; and interactively evaluates alternative remedial actions to aid operators in making effective and timely decisions. This tool can provide significant decision support on electricity infrastructure operations and lead to better reliability in power grids. This paper presents examples with actual electricity infrastructure data to demonstrate the capability of this tool.
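    The trend-analysis function described above can be sketched in minimal form as a least-squares slope over a sliding window of a monitored quantity. The series and alarm threshold below are invented for illustration.

```python
import numpy as np

# Minimal trend-analysis sketch: fit a linear trend to a window of a
# monitored quantity (here a hypothetical per-unit bus-voltage series)
# and flag a sustained decline.

def trend_slope(values):
    """Least-squares slope of a series, one sample per time step."""
    t = np.arange(len(values))
    slope, _ = np.polyfit(t, values, 1)
    return slope

voltage = np.array([1.02, 1.01, 1.00, 0.985, 0.97, 0.955])  # per-unit
slope = trend_slope(voltage)
alarm = slope < -0.005  # threshold is illustrative
print(f"slope = {slope:.4f} p.u./step, alarm = {bool(alarm)}")
```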

  14. Bioinformatics Methods and Tools to Advance Clinical Care

    PubMed Central

    Lecroq, T.

    2015-01-01

    Summary Objectives To summarize excellent current research in the field of Bioinformatics and Translational Informatics with application in the health domain and clinical care. Method We provide a synopsis of the articles selected for the IMIA Yearbook 2015, from which we attempt to derive a synthetic overview of current and future activities in the field. As last year, a first step of selection was performed by querying MEDLINE with a list of MeSH descriptors completed by a list of terms adapted to the section. Each section editor evaluated the set of 1,594 articles separately, and the evaluation results were merged to retain 15 articles for peer review. Results The selection and evaluation process of this Yearbook’s section on Bioinformatics and Translational Informatics yielded four excellent articles regarding data management and genome medicine that are mainly tool-based papers. In the first article, the authors present PPISURV, a tool for uncovering the role of specific genes in cancer survival outcome. The second article describes the classifier PredictSNP, which combines six tools for predicting disease-related mutations. In the third article, by presenting a high-coverage map of the human proteome using high-resolution mass spectrometry, the authors highlight the need for using mass spectrometry to complement genome annotation. The fourth article is also related to patient survival and decision support. The authors present data mining methods for large-scale datasets of past transplants. The objective is to identify chances of survival. Conclusions The current research activities still attest to the continuous convergence of Bioinformatics and Medical Informatics, with a focus this year on dedicated tools and methods to advance clinical care.
Indeed, there is a need for powerful tools for managing and interpreting complex, large-scale genomic and biological datasets, but also a need for user-friendly tools developed for the clinicians in their

  15. Clinical holistic health: advanced tools for holistic medicine.

    PubMed

    Ventegodt, Søren; Clausen, Birgitte; Nielsen, May Lyck; Merrick, Joav

    2006-02-24

    According to holistic medical theory, the patient will heal when old painful moments, the traumatic events of life that are often called "gestalts", are integrated in the present "now". The advanced holistic physician's expanded toolbox has many different tools to induce this healing, some that are more dangerous and potentially traumatic than others. The more intense the therapeutic technique, the more emotional energy will be released and contained in the session, but the higher also is the risk for the therapist to lose control of the session and lose the patient to his or her own dark side. To avoid harming the patient must be the highest priority in holistic existential therapy, making sufficient education and training an issue of highest importance. The concept of "stepping up" the therapy by using more and more "dramatic" methods to get access to repressed emotions and events has led us to a "therapeutic staircase" with ten steps: (1) establishing the relationship; (2) establishing intimacy, trust, and confidentiality; (3) giving support and holding; (4) taking the patient into the process of physical, emotional, and mental healing; (5) social healing of being in the family; (6) spiritual healing--returning to the abstract wholeness of the soul; (7) healing the informational layer of the body; (8) healing the three fundamental dimensions of existence: love, power, and sexuality in a direct way using, among other techniques, "controlled violence" and "acupressure through the vagina"; (9) mind-expanding and consciousness-transformative techniques like psychotropic drugs; and (10) techniques transgressing the patient's borders and, therefore, often traumatizing (for instance, the use of force against the will of the patient). We believe that the systematic use of the staircase will greatly improve the power and efficiency of holistic medicine for the patient and we invite a broad cooperation in scientifically testing the efficiency of the advanced holistic

  16. Towards Model Driven Tool Interoperability: Bridging Eclipse and Microsoft Modeling Tools

    NASA Astrophysics Data System (ADS)

    Brunelière, Hugo; Cabot, Jordi; Clasen, Cauê; Jouault, Frédéric; Bézivin, Jean

    Successful application of model-driven engineering approaches requires interchanging a lot of relevant data among the tool ecosystem employed by an engineering team (e.g., requirements elicitation tools, several kinds of modeling tools, reverse engineering tools, development platforms, and so on). Unfortunately, this is not a trivial task. Poor tool interoperability makes data interchange a challenge even among tools with a similar scope. This paper presents a model-based solution to overcome such interoperability issues. With our approach, the internal schema(s) (i.e., metamodel(s)) of each tool are made explicit and used as the basis for resolving syntactic and semantic differences between the tools. Once the corresponding metamodels are aligned, model-to-model transformations are (semi)automatically derived and executed to perform the actual data interchange. We illustrate our approach by bridging the Eclipse and Microsoft (DSL Tools and SQL Server Modeling) modeling tools.
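    The align-then-transform idea can be illustrated with a toy sketch: once the correspondence between two tools' schemas is written down declaratively, the data interchange itself becomes a mechanical transformation. All field names here are hypothetical, not the actual Eclipse or Microsoft schemas.

```python
# Toy metamodel alignment: a declarative mapping from tool A's field
# names to tool B's, from which the model-to-model transformation is
# derived mechanically. Field names are hypothetical.

ALIGNMENT = {            # tool A field -> tool B field
    "className": "name",
    "attrs": "properties",
}

def transform(model_a, alignment=ALIGNMENT):
    """Derive a tool-B model from a tool-A model via the alignment."""
    return {b_key: model_a[a_key] for a_key, b_key in alignment.items()}

eclipse_like = {"className": "Order", "attrs": ["id", "total"]}
ms_like = transform(eclipse_like)
print(ms_like)
```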

  17. Sandia Advanced MEMS Design Tools, Version 2.0

    2002-06-13

    Sandia Advanced MEMS Design Tools is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process b) Provide enabling educational information (including pictures, videos, technical information) c) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library) d) Facilitate the process of having MEMS fabricated at SNL e) Facilitate the process of having post-fabrication services performed. While there exist some files on the CD that are used in conjunction with the software AutoCAD, these files are not intended for use independent of the CD. NOTE: THE CUSTOMER MUST PURCHASE HIS/HER OWN COPY OF AutoCAD TO USE WITH THESE FILES.

  18. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  19. Evaluating modeling tools for the EDOS

    NASA Technical Reports Server (NTRS)

    Knoble, Gordon; Mccaleb, Frederick; Aslam, Tanweer; Nester, Paul

    1994-01-01

    The Earth Observing System (EOS) Data and Operations System (EDOS) Project is developing a functional, system performance model to support the system implementation phase of the EDOS, which is being designed and built by the Goddard Space Flight Center (GSFC). The EDOS Project will use modeling to meet two key objectives: (1) manage system design impacts introduced by unplanned changes in mission requirements; and (2) evaluate evolutionary technology insertions throughout the development of the EDOS. To select a suitable modeling tool, the EDOS modeling team developed an approach for evaluating modeling tools and languages by deriving evaluation criteria from both the EDOS modeling requirements and the development plan. Essential and optional features for an appropriate modeling tool were identified and compared with the known capabilities of several modeling tools. Vendors were also provided the opportunity to model a representative EDOS processing function to demonstrate the applicability of their modeling tool to the EDOS modeling requirements. This paper emphasizes the importance of using a well-defined approach for evaluating tools to model complex systems like the EDOS. The results of this evaluation study do not in any way signify the superiority of any one modeling tool, since the results will vary with the specific modeling requirements of each project.
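    The evaluation approach described above amounts to weighted scoring against criteria derived from requirements. A minimal sketch with invented criteria, weights, and scores:

```python
# Sketch of a weighted-criteria tool evaluation. The criteria, weights,
# and scores are invented for illustration; the actual EDOS evaluation
# derived its criteria from the project's modeling requirements and
# development plan.

WEIGHTS = {"essential_features": 0.5, "vendor_support": 0.2,
           "performance": 0.2, "cost": 0.1}

def weighted_score(scores, weights=WEIGHTS):
    """Weighted sum of per-criterion scores (0-10 scale assumed)."""
    return sum(weights[c] * scores[c] for c in weights)

candidates = {
    "ToolA": {"essential_features": 9, "vendor_support": 7,
              "performance": 6, "cost": 5},
    "ToolB": {"essential_features": 7, "vendor_support": 8,
              "performance": 7, "cost": 9},
}
ranked = sorted(candidates, key=lambda t: weighted_score(candidates[t]),
                reverse=True)
print(ranked)
```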

  20. Introductory Tools for Radiative Transfer Models

    NASA Astrophysics Data System (ADS)

    Feldman, D.; Kuai, L.; Natraj, V.; Yung, Y.

    2006-12-01

    Satellite data are currently so voluminous that, despite their unprecedented quality and potential for scientific application, only a small fraction is analyzed due to two factors: researchers' computational constraints and a relatively small number of researchers actively utilizing the data. Ultimately it is hoped that the terabytes of unanalyzed data being archived can receive scientific scrutiny, but this will require a popularization of the methods associated with the analysis. Since a large portion of the complexity is associated with the proper implementation of the radiative transfer model, it is reasonable and appropriate to make the model as accessible as possible to general audiences. Unfortunately, the algorithmic and conceptual details that are necessary for state-of-the-art analysis also tend to frustrate accessibility for those new to remote sensing. Several efforts have been made to provide web-based radiative transfer calculations, and these are useful for limited calculations, but analysis of more than a few spectra requires the utilization of home- or server-based computing resources. We present a system that is designed to allow easier access to radiative transfer models, with implementation on a home computing platform, in the hopes that this system can be utilized and expanded upon in advanced high school and introductory college settings. This learning-by-doing process is aided through the use of several powerful tools. The first is a Wikipedia-style introduction to the salient features of radiative transfer that references the seminal works in the field and refers to more complicated calculations and algorithms sparingly. The second feature is a technical forum, commonly referred to as a tiki-wiki, that addresses technical and conceptual questions through public postings, private messages, and a ranked searching routine. Together, these tools may be able to facilitate greater interest in the field of remote sensing.
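    A natural first exercise for introductory tools of this kind is the simplest radiative-transfer result, the Beer-Lambert law; a minimal sketch:

```python
import math

# Beer-Lambert transmittance through a non-scattering layer of optical
# depth tau, viewed at zenith angle theta: the surviving fraction is
# exp(-tau / cos(theta)), since a slant path lengthens the optical path.

def transmittance(tau, theta_deg=0.0):
    """Fraction of radiation surviving the layer."""
    mu = math.cos(math.radians(theta_deg))
    return math.exp(-tau / mu)

print(transmittance(0.5))        # nadir view
print(transmittance(0.5, 60.0))  # 60-degree view doubles the path
```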

  1. Tools for the advancement of undergraduate statistics education

    NASA Astrophysics Data System (ADS)

    Schaffner, Andrew Alan

    To keep pace with advances in applied statistics and to maintain literate consumers of quantitative analyses, statistics educators stress the need for change in the classroom (Cobb, 1992; Garfield, 1993, 1995; Moore, 1991a; Snee, 1993; Steinhorst and Keeler, 1995). These authors stress a more concept oriented undergraduate introductory statistics course which emphasizes true understanding over mechanical skills. Drawing on recent educational research, this dissertation attempts to realize this vision by developing tools and pedagogy to assist statistics instructors. This dissertation describes statistical facets, pieces of statistical understanding that are building blocks of knowledge, and discusses DIANA, a World-Wide Web tool for diagnosing facets. Further, I show how facets may be incorporated into course design through the development of benchmark lessons based on the principles of collaborative learning (diSessa and Minstrell, 1995; Cohen, 1994; Reynolds et al., 1995; Bruer, 1993; von Glasersfeld, 1991) and activity based courses (Jones, 1991; Yackel, Cobb and Wood, 1991). To support benchmark lessons and collaborative learning in large classes I describe Virtual Benchmark Instruction, benchmark lessons which take place on a structured hypertext bulletin board using the technology of the World-Wide Web. Finally, I present randomized experiments which suggest that these educational developments are effective in a university introductory statistics course.

  2. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.
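    The mistuning effect described above can be illustrated with a toy lumped-parameter model (a sketch, not the Turbo-Reduce code): identical blades give repeated natural frequencies, and small random stiffness scatter splits them.

```python
import numpy as np

# Toy cyclic mass-spring model of a bladed disk: N blades as unit masses
# on ground springs k_i, coupled to neighbors by springs kc. Random
# blade-to-blade stiffness scatter perturbs the tuned frequency spectrum.

def natural_freqs(k_blades, kc=0.1):
    """Natural frequencies (rad/s) of the cyclic chain, unit masses."""
    n = len(k_blades)
    K = np.zeros((n, n))
    for i in range(n):
        K[i, i] = k_blades[i] + 2.0 * kc
        K[i, (i + 1) % n] -= kc
        K[i, (i - 1) % n] -= kc
    return np.sqrt(np.linalg.eigvalsh(K))  # omega^2 = eig(K) for M = I

rng = np.random.default_rng(0)
tuned = natural_freqs(np.ones(12))                      # identical blades
mistuned = natural_freqs(1.0 + 0.02 * rng.standard_normal(12))
print("tuned frequency spread   :", tuned.max() - tuned.min())
print("mistuned frequency spread:", mistuned.max() - mistuned.min())
```

    For the tuned rotor the spread is exactly that of the circulant stiffness matrix; mistuning perturbs each frequency individually, which is the seed of the localized, amplified forced response the abstract describes.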

  3. Sandia Advanced MEMS Design Tools, Version 2.2.5

    SciTech Connect

    Yarberry, Victor; Allen, James; Lantz, Jeffery; Priddy, Brian; Westling, Belinda

    2010-01-19

    The Sandia National Laboratories Advanced MEMS Design Tools, Version 2.2.5, is a collection of menus, prototype drawings, and executables that provide significant productivity enhancements when using AutoCAD to design MEMS components. This release is designed for AutoCAD 2000i, 2002, or 2004 and is supported under Windows NT 4.0, Windows 2000, or XP. SUMMiT V (Sandia Ultra-planar Multi-level MEMS Technology) is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process b) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library). New features in this version: AutoCAD 2004 support has been added. SafeExplode - a new feature that explodes blocks without affecting polylines (avoids exploding polylines into objects that are ignored by the DRC and Visualization tools). Layer control menu - a pull-down menu for selecting layers to isolate, freeze, or thaw. Updated tools: A check has been added to catch invalid block names. DRC features: Added username/password validation and a method to update the user's password. SNL_DRC_WIDTH - a value to control the width of the DRC error lines. SNL_BIAS_VALUE - a value used to offset selected geometry. SNL_PROCESS_NAME - a value to specify the process name. Documentation changes: The documentation has been updated to include the new features. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  4. Sandia Advanced MEMS Design Tools, Version 2.2.5

    2010-01-19

    The Sandia National Laboratories Advanced MEMS Design Tools, Version 2.2.5, is a collection of menus, prototype drawings, and executables that provide significant productivity enhancements when using AutoCAD to design MEMS components. This release is designed for AutoCAD 2000i, 2002, or 2004 and is supported under Windows NT 4.0, Windows 2000, or XP. SUMMiT V (Sandia Ultra-planar Multi-level MEMS Technology) is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process b) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library). New features in this version: AutoCAD 2004 support has been added. SafeExplode - a new feature that explodes blocks without affecting polylines (avoids exploding polylines into objects that are ignored by the DRC and Visualization tools). Layer control menu - a pull-down menu for selecting layers to isolate, freeze, or thaw. Updated tools: A check has been added to catch invalid block names. DRC features: Added username/password validation and a method to update the user's password. SNL_DRC_WIDTH - a value to control the width of the DRC error lines. SNL_BIAS_VALUE - a value used to offset selected geometry. SNL_PROCESS_NAME - a value to specify the process name. Documentation changes: The documentation has been updated to include the new features. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  5. Advancing Material Models for Automotive Forming Simulations

    NASA Astrophysics Data System (ADS)

    Vegter, H.; An, Y.; ten Horn, C. H. L. J.; Atzema, E. H.; Roelofsen, M. E.

    2005-08-01

    Simulations in the automotive industry need more advanced material models to achieve highly reliable forming and springback predictions. Conventional material models implemented in FEM simulation models are not capable of describing the plastic material behaviour during monotonic strain paths with sufficient accuracy. Recently, ESI and Corus have co-operated on the implementation of an advanced material model in the FEM code PAMSTAMP 2G. This applies to the strain hardening model, the influence of strain rate, and the description of the yield locus in these models. A subsequent challenge is the description of the material after a change of strain path. The use of advanced high strength steels in the automotive industry requires a description of the plastic material behaviour of multiphase steels. The simplest variant is dual phase steel, consisting of a ferritic and a martensitic phase. Multiphase materials also contain a bainitic phase in addition to the ferritic and martensitic phases. More physical descriptions of strain hardening than simple fitted Ludwik/Nadai curves are necessary. Methods to predict the plastic behaviour of single-phase materials use a simple dislocation interaction model based on the formed cell structures only. At Corus, a new method is proposed to predict the plastic behaviour of multiphase materials; it has to take into account the hard phases, which deform less easily. The resulting deformation gradients create geometrically necessary dislocations. Additional micro-structural information, such as the morphology and size of hard-phase particles or grains, is necessary to derive the strain hardening models for this type of material. Measurements available from the Numisheet benchmarks allow these models to be validated. At Corus, additional measured values are available from cross-die tests. This laboratory test can attain critical deformations by large variations in blank size and processing conditions. The tests are a powerful tool in optimising forming simulations.
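    The abstract contrasts physically based hardening models with "simple fitted Ludwik/Nadai curves". A minimal sketch of the latter: fitting the Nadai (Hollomon) law sigma = K * eps^n to tensile data by linear least squares in log-log space. The data points are synthetic, for illustration only.

```python
import numpy as np

# Fit the Nadai (Hollomon) hardening law sigma = K * eps^n by linear
# regression of log(sigma) on log(eps). Synthetic data generated from
# K = 500 MPa, n = 0.20, so the fit should recover those values.

eps = np.array([0.02, 0.05, 0.10, 0.15, 0.20])  # true plastic strain
sigma = 500.0 * eps**0.20                        # flow stress, MPa

n, logK = np.polyfit(np.log(eps), np.log(sigma), 1)
K = np.exp(logK)
print(f"K = {K:.1f} MPa, n = {n:.3f}")
```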

  6. Synthetic biology and molecular genetics in non-conventional yeasts: Current tools and future advances.

    PubMed

    Wagner, James M; Alper, Hal S

    2016-04-01

    Coupling the tools of synthetic biology with traditional molecular genetic techniques can enable the rapid prototyping and optimization of yeast strains. While the era of yeast synthetic biology began in the well-characterized model organism Saccharomyces cerevisiae, it is swiftly expanding to include non-conventional yeast production systems such as Hansenula polymorpha, Kluyveromyces lactis, Pichia pastoris, and Yarrowia lipolytica. These yeasts already have roles in the manufacture of vaccines, therapeutic proteins, food additives, and biorenewable chemicals, but recent synthetic biology advances have the potential to greatly expand and diversify their impact on biotechnology. In this review, we summarize the development of synthetic biological tools (including promoters and terminators) and enabling molecular genetics approaches that have been applied in these four promising alternative biomanufacturing platforms. An emphasis is placed on synthetic parts and genome editing tools. Finally, we discuss examples of synthetic tools developed in other organisms that can be adapted or optimized for these hosts in the near future.

  7. Advanced Risk Reduction Tool (ARRT) Special Case Study Report: Science and Engineering Technical Assessments (SETA) Program

    NASA Technical Reports Server (NTRS)

    Kirsch, Paul J.; Hayes, Jane; Zelinski, Lillian

    2000-01-01

    This special case study report presents the Science and Engineering Technical Assessments (SETA) team's findings from exploring how the underlying models of the Advanced Risk Reduction Tool (ARRT) relate to the way it identifies, estimates, and integrates Independent Verification & Validation (IV&V) activities. The special case study was conducted under the provisions of SETA Contract Task Order (CTO) 15 and the approved technical approach documented in the CTO-15 Modification #1 Task Project Plan.

  8. Model Analysis ToolKit

    SciTech Connect

    Harp, Dylan R.

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes defining parameters, defining observations, defining the model (a Python function), and defining samplesets (sets of parameter combinations). Currently supported functionality includes forward model runs; Latin-Hypercube sampling of parameters; multi-dimensional parameter studies; parallel execution of parameter samples; model calibration using an internal Levenberg-Marquardt algorithm, the lmfit package, or the levmar package; and Markov Chain Monte Carlo using the pymc package. MATK facilitates model analysis using scipy for calibration (scipy.optimize) and rpy2 as a Python interface to R.
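    The workflow MATK wraps can be sketched generically with numpy and scipy (this is not MATK's own API): define a forward model, draw a Latin-Hypercube parameter sample, and calibrate from the best sample point with Levenberg-Marquardt.

```python
import numpy as np
from scipy.optimize import least_squares

# Generic model-analysis sketch: forward model, Latin-Hypercube sample,
# then Levenberg-Marquardt calibration against synthetic observations.

def model(p, t):
    a, b = p
    return a * np.exp(-b * t)

t = np.linspace(0.0, 5.0, 20)
obs = model([2.0, 0.7], t)                 # synthetic observations

# Latin-Hypercube sample: one stratum per sample, shuffled per dimension.
rng = np.random.default_rng(1)
n = 10
strata = (np.arange(n) + rng.random(n)) / n
lhs = np.column_stack([rng.permutation(strata) * 3.0,   # a in [0, 3]
                       rng.permutation(strata) * 2.0])  # b in [0, 2]

# Calibrate from the best-scoring LHS starting point.
best = min(lhs, key=lambda p: np.sum((model(p, t) - obs) ** 2))
fit = least_squares(lambda p: model(p, t) - obs, best, method="lm")
print("calibrated parameters:", fit.x)
```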

  10. Advancing Software Architecture Modeling for Large Scale Heterogeneous Systems

    SciTech Connect

    Gorton, Ian; Liu, Yan

    2010-11-07

    In this paper we describe how incorporating technology-specific modeling at the architecture level can help reduce risks and produce better designs for large, heterogeneous software applications. We draw an analogy with established modeling approaches in scientific domains, using groundwater modeling as an example, to help illustrate gaps in current software architecture modeling approaches. We then describe the advances in modeling, analysis and tooling that are required to bring sophisticated modeling and development methods within reach of software architects.

  11. Advanced Modeling of Micromirror Devices

    NASA Technical Reports Server (NTRS)

    Michalicek, M. Adrian; Sene, Darren E.; Bright, Victor M.

    1995-01-01

    The flexure-beam micromirror device (FBMD) is a phase-only, piston-style spatial light modulator demonstrating properties that can be used for phase-adaptive corrective optics. This paper presents a complete study of a square FBMD, from advanced model development through final device testing and model verification. The model relates the electrical and mechanical properties of the device by equating the electrostatic force of a parallel-plate capacitor with the counteracting spring force of the device's support flexures. The capacitor solution is derived via the Schwartz-Christoffel transformation such that the final solution accounts for non-ideal electric fields. The complete model describes the behavior of any piston-style device, given its design geometry and material properties. It includes operational parameters such as drive frequency and temperature, as well as fringing effects, mirror surface deformations, and cross-talk from neighboring devices. The steps taken to develop this model can be applied to other micromirrors, such as the cantilever and torsion-beam designs, to produce an advanced model for any given device. The micromirror devices studied in this paper were commercially fabricated in a surface micromachining process. A microscope-based laser interferometer is used to test the device: a beam reflected from the device modulates a fixed reference beam, and the mirror displacement is determined from the relative phase, which generates a continuous set of data for each selected position on the mirror surface. Plots of this data describe the localized deflection as a function of drive voltage.
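    The core force balance described above can be reproduced numerically. The sketch below, a simplified illustration rather than the paper's model, equates the flexure spring force k·x with the ideal parallel-plate electrostatic force and solves for the stable deflection by bisection; it omits the Schwartz-Christoffel fringing-field correction, and the device numbers are invented for the example.

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def deflection(V, k, A, d0, tol=1e-15):
    """Solve k*x = EPS0*A*V**2 / (2*(d0 - x)**2) for the stable
    deflection x by bisection on [0, d0/3]; the stable branch of a
    parallel-plate electrostatic actuator ends at pull-in, x = d0/3."""
    f = lambda x: k * x - EPS0 * A * V**2 / (2.0 * (d0 - x)**2)
    lo, hi = 0.0, d0 / 3.0
    if f(hi) < 0.0:
        raise ValueError("beyond pull-in voltage: no stable equilibrium")
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# invented example: 100x100 um mirror, 2 um gap, 1 N/m flexure stiffness
x = deflection(V=4.0, k=1.0, A=(100e-6)**2, d0=2e-6)
```

    Sweeping V upward, the returned deflection grows nonlinearly until the electrostatic force overwhelms the flexures at the pull-in voltage sqrt(8*k*d0**3 / (27*EPS0*A)), beyond which the solver reports that no stable equilibrium exists.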

  12. Querator: an advanced multi-archive data mining tool

    NASA Astrophysics Data System (ADS)

    Pierfederici, Francesco

    2001-11-01

    In recent years, the operation of large telescopes with wide-field detectors - such as the European Southern Observatory (ESO) Wide Field Imager (WFI) on the 2.2-meter telescope at La Silla, Chile - has dramatically increased the amount of astronomical data produced each year. The next survey telescopes, such as the ESO VST, will continue this trend, producing extremely large datasets. Astronomy has therefore become an incredibly data-rich field, requiring new tools and new strategies to efficiently handle huge archives and fully exploit their scientific content. At the Space Telescope European Coordinating Facility we are working on a new project, code-named Querator (http://archive.eso.org/querator/). Querator is an advanced multi-archive search engine built to address the needs of astronomers looking for multicolor imaging data across different astronomical data-centers. Querator returns sets of images of a given astronomical object or search region. A set contains exposures in a number of different wave bands. The user constrains the number of desired wave bands by selecting from a set of instruments and filters or by specifying actual physical units. As far as present-day data-centers are concerned, Querator points out the need for a uniform and standard description of archival data and a uniform and standard description of how the data was acquired (i.e., instrument and observation characteristics). Clearly, these pieces of information will constitute an intermediate layer between the data itself and the data mining tools operating on it. This layered structure is a prerequisite to real data-center inter-operability and, hence, to Virtual Observatories. A detailed description of Querator's design, of the required data structures, of the problems encountered so far, and of the proposed solutions is given in the following pages. Throughout this paper we favor the term data-center over archive to stress the need to look at raw-pixels' archives

  13. Advanced Mirror & Modelling Technology Development

    NASA Technical Reports Server (NTRS)

    Effinger, Michael; Stahl, H. Philip; Abplanalp, Laura; Maffett, Steven; Egerman, Robert; Eng, Ron; Arnold, William; Mosier, Gary; Blaurock, Carl

    2014-01-01

    The 2020 Decadal technology survey is starting in 2018. Technology on the shelf at that time will help guide selection of future low-risk and low-cost missions. The Advanced Mirror Technology Development (AMTD) team has identified development priorities based on science goals and engineering requirements for Ultraviolet Optical near-Infrared (UVOIR) missions in order to contribute to the selection process. One key development identified was lightweight mirror fabrication and testing. A monolithic, stacked, deep-core mirror was fused and replicated twice to achieve the desired radius of curvature. It was subsequently successfully polished and tested. A recently awarded second phase of the AMTD project will develop larger mirrors to demonstrate the lateral scaling of the deep-core mirror technology. Another key development was rapid modeling for the mirror. One model focused on generating optical and structural model results in minutes instead of months; many variables could be accounted for regarding the core, face plate, and back structure details. A portion of a spacecraft model was also developed. The spacecraft model incorporated direct integration to transform optical path difference to point spread function (PSF) and PSF to modulation transfer function. The second phase of the project will take the results of the rapid mirror modeler and integrate them into the rapid spacecraft modeler.

  14. Software Engineering Tools for Scientific Models

    NASA Technical Reports Server (NTRS)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed-format Fortran.

  15. DSA hole defectivity analysis using advanced optical inspection tool

    NASA Astrophysics Data System (ADS)

    Harukawa, Ryota; Aoki, Masami; Cross, Andrew; Nagaswami, Venkat; Tomita, Tadatoshi; Nagahara, Seiji; Muramatsu, Makoto; Kawakami, Shinichiro; Kosugi, Hitoshi; Rathsack, Benjamen; Kitano, Takahiro; Sweis, Jason; Mokhberi, Ali

    2013-04-01

    This paper discusses a defect density detection and analysis methodology using advanced optical wafer inspection capability to enable accelerated development of a DSA process and process tools, along with the inspection capability required to monitor such a process. The defectivity inspection methodologies are optimized for grapho-epitaxy directed self-assembly (DSA) contact holes with 25 nm sizes. A defect test reticle with programmed defects on guide patterns is designed for improved optimization of defectivity monitoring. Using this reticle, resist guide holes with a variety of sizes and shapes are patterned using an ArF immersion scanner. A negative tone development (NTD) type thermally stable resist guide is used for DSA of a polystyrene-b-poly(methyl methacrylate) (PS-b-PMMA) block copolymer (BCP). Using a variety of defects intentionally made by changing guide pattern sizes, the detection rates of each specific defect type have been analyzed. It is found in this work that, to maximize sensitivity, a two-pass scan with bright field (BF) and dark field (DF) modes provides the best overall defect type coverage and sensitivity. The performance of the two-pass scan with BF and DF modes is also demonstrated by defect analysis of baseline defectivity on a wafer processed with nominal process conditions.

  16. European regulatory tools for advanced therapy medicinal products.

    PubMed

    Flory, Egbert; Reinhardt, Jens

    2013-12-01

    Increasing scientific knowledge and technical innovations in the areas of cell biology, biotechnology and medicine have resulted in the development of promising therapeutic approaches for the prevention and treatment of human diseases. Advanced therapy medicinal products (ATMPs) are a complex and innovative class of biopharmaceuticals: these products are highly research-driven, characterised by innovative manufacturing processes, and heterogeneous with regard to their origin, type and complexity. The class comprises gene therapy medicinal products, somatic cell therapy medicinal products and tissue engineering products, which are often individualised and patient-specific. Multiple challenges arise from the nature of ATMPs, which are often developed by micro, small and medium-sized enterprises, universities and academia, for whom regulatory experience is limited and regulatory requirements are challenging. Regulatory guidance, such as the reflection paper on classification of ATMPs and guidelines highlighting product-specific issues, supports academic research groups and pharmaceutical companies in developing safe and effective ATMPs. This review provides an overview of the European regulatory aspects of ATMPs and highlights specific regulatory tools such as the ATMP classification procedure, together with a discussion of the hospital exemption for selected ATMPs and of borderline issues with transplant/transfusion products.

  17. New V and V Tools for Diagnostic Modeling Environment (DME)

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Nelson, Stacy; Merriam, Marshall (Technical Monitor)

    2002-01-01

    The purpose of this report is to provide correctness and reliability criteria for verification and validation (V&V) of the Second Generation Reusable Launch Vehicle (RLV) Diagnostic Modeling Environment (DME), to describe current NASA Ames Research Center tools for V&V of model-based reasoning systems, and to discuss the applicability of advanced V&V to DME. This report is divided into three sections: (1) correctness and reliability criteria; (2) tools for V&V of model-based reasoning; and (3) advanced V&V applicable to DME. The Executive Summary includes an overview of the main points from each section. Supporting details, diagrams, figures, and other information are included in subsequent sections. A glossary, acronym list, appendices, and references are included at the end of this report.

  18. ANSYS tools in modeling tires

    NASA Technical Reports Server (NTRS)

    Ali, Ashraf; Lovell, Michael

    1995-01-01

    This presentation summarizes the capabilities in the ANSYS program that relate to the computational modeling of tires. The power and the difficulties associated with modeling nearly incompressible rubber-like materials using hyperelastic constitutive relationships are highlighted from a developer's point of view. The topics covered include a hyperelastic material constitutive model for rubber-like materials, a general overview of contact-friction capabilities, and the acoustic fluid-structure interaction problem for noise prediction. Brief theoretical development and example problems are presented for each topic.
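    As a small worked example of the hyperelastic constitutive behavior discussed above (a generic incompressible neo-Hookean form, not ANSYS's material library), the uniaxial Cauchy stress at stretch ratio λ is σ = μ(λ² - 1/λ), which recovers the linear-elastic slope E = 3μ in the small-strain limit:

```python
def neo_hookean_uniaxial(stretch, mu):
    """Cauchy stress for an incompressible neo-Hookean solid in
    uniaxial tension: sigma = mu * (stretch**2 - 1/stretch)."""
    return mu * (stretch**2 - 1.0 / stretch)

# undeformed state carries no stress; small strains follow sigma ~ 3*mu*eps
sigma = neo_hookean_uniaxial(1.001, mu=1.0e6)  # ~3.0e3 Pa at 0.1% strain
```

    The strong departure of this curve from linearity at large stretch is one source of the numerical difficulty the presentation highlights; near-incompressibility (Poisson's ratio approaching 0.5) is the other.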

  19. ANSYS tools in modeling tires

    NASA Astrophysics Data System (ADS)

    Ali, Ashraf; Lovell, Michael

    1995-08-01

    This presentation summarizes the capabilities in the ANSYS program that relate to the computational modeling of tires. The power and the difficulties associated with modeling nearly incompressible rubber-like materials using hyperelastic constitutive relationships are highlighted from a developer's point of view. The topics covered include a hyperelastic material constitutive model for rubber-like materials, a general overview of contact-friction capabilities, and the acoustic fluid-structure interaction problem for noise prediction. Brief theoretical development and example problems are presented for each topic.

  20. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.

  1. NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Craig, D. A.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The objective of this Technical Interchange Meeting was to increase the quantity and quality of technical, cost, and programmatic data used to model the impact of investing in different technologies. The focus of this meeting was the Technology Tool Box (TTB), a database of performance, operations, and programmatic parameters provided by technologists and used by systems engineers. The TTB is the data repository used by a system of models known as the Advanced Technology Lifecycle Analysis System (ATLAS). This report describes the result of the November meeting, and also provides background information on ATLAS and the TTB.

  2. Water quality management of aquifer recharge using advanced tools.

    PubMed

    Lazarova, Valentina; Emsellem, Yves; Paille, Julie; Glucina, Karl; Gislette, Philippe

    2011-01-01

    Managed aquifer recharge (MAR) with recycled water or other alternative resources is one of the most rapidly growing techniques and is viewed as a necessity in water-short areas. In order to better control the health and environmental effects of MAR, this paper presents two case studies demonstrating how to improve water quality, enable reliable tracing of injected water, and better control and manage MAR operation in cases of indirect and direct aquifer recharge. The two water quality management strategies are illustrated on two full-scale case studies, including the results of combining non-conventional and advanced technologies for water quality improvement, comprehensive sampling and monitoring programs covering emerging pollutants, tracer studies using boron isotopes, and integrated 3D GIS hydraulic and hydrodispersive aquifer modelling.

  3. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  4. Model analysis tools in the Virtual Model Repository (VMR)

    NASA Astrophysics Data System (ADS)

    De Zeeuw, D.; Ridley, A. J.

    2013-12-01

    The Virtual Model Repository (VMR) provides scientific analysis tools for a wide variety of numerical models of the Earth's magnetosphere. Data discovery, visualization tools and data/model comparisons are provided in a consistent and intuitive format. A large collection of numerical model runs are available to analyze, including the large Earth magnetosphere event run library at the CCMC and many runs from the University of Michigan. Relevant data useful for data/model comparisons is found using various APIs and included in many of the visualization tools. Recent additions to the VMR include a comprehensive suite of tools for analysis of the Global Ionosphere Thermosphere Model (GITM).

  5. Anvil Forecast Tool in the Advanced Weather Interactive Processing System (AWIPS)

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Launch Weather Officers (LWOs) from the 45th Weather Squadron (45 WS) and forecasters from the National Weather Service (NWS) Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violating the Lightning Launch Commit Criteria (LLCC) (Krider et al. 2006; Space Shuttle Flight Rules (FR), NASA/JSC 2004). As a result, the Applied Meteorology Unit (AMU) developed a tool that creates an anvil threat corridor graphic that can be overlaid on satellite imagery using the Meteorological Interactive Data Display System (MIDDS; Short and Wheeler, 2002). The tool helps forecasters estimate the locations of thunderstorm anvils at one, two, and three hours into the future. It has been used extensively in launch and landing operations by both the 45 WS and SMG. The Advanced Weather Interactive Processing System (AWIPS) is now used along with MIDDS for weather analysis and display at SMG. In Phase I of this task, SMG tasked the AMU to transition the tool from MIDDS to AWIPS (Barrett et al., 2007). For Phase II, SMG requested that the AMU make the Anvil Forecast Tool in AWIPS more configurable by creating the capability to read model gridded data from user-defined model files instead of hard-coded files. An NWS local AWIPS application called AGRID was used to accomplish this. In addition, SMG needed to be able to define the pressure levels for the model data, instead of hard-coding the bottom level as 300 mb and the top level as 150 mb. This paper describes the initial development of the Anvil Forecast Tool for MIDDS, followed by the migration of the tool to AWIPS in Phase I. It then gives a detailed presentation of the Phase II improvements to the AWIPS tool.
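    At its core, the anvil threat corridor reduces to advecting a point by the mean upper-level (300-150 mb) wind. The sketch below is a hypothetical simplification for illustration only, not the AMU tool's algorithm: it projects a storm's anvil position at one, two, and three hours using a flat-earth approximation.

```python
import math

def anvil_positions(lat, lon, wind_speed_ms, wind_dir_deg, hours=(1, 2, 3)):
    """Project anvil drift assuming pure advection by the mean
    300-150 mb wind. wind_dir_deg uses the meteorological convention
    (direction the wind blows FROM)."""
    heading = math.radians((wind_dir_deg + 180.0) % 360.0)  # blows toward
    positions = []
    for h in hours:
        dist = wind_speed_ms * 3600.0 * h                   # meters
        dlat = dist * math.cos(heading) / 111_000.0         # ~111 km/deg
        dlon = dist * math.sin(heading) / (111_000.0 * math.cos(math.radians(lat)))
        positions.append((lat + dlat, lon + dlon))
    return positions

# a 20 m/s westerly (from 270 deg) over Cape Canaveral drifts the anvil east
pts = anvil_positions(28.5, -80.6, wind_speed_ms=20.0, wind_dir_deg=270.0)
```

    The actual tool draws a corridor (a swath with lateral width) rather than points, and reads its winds from model gridded data, but the hour-by-hour advection step is the same idea.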

  6. Advanced Small Modular Reactor Economics Model Development

    SciTech Connect

    Harrison, Thomas J.

    2014-10-01

    The US Department of Energy Office of Nuclear Energy's Advanced Small Modular Reactor (SMR) research and development activities focus on four key areas: developing assessment methods for evaluating advanced SMR technologies and characteristics; developing and testing materials, fuels, and fabrication techniques; resolving key regulatory issues identified by the US Nuclear Regulatory Commission and industry; and developing advanced instrumentation, controls, and human-machine interfaces. This report focuses on the development of assessment methods to evaluate advanced SMR technologies and characteristics. Specifically, it describes the expansion and application of the economic modeling effort at Oak Ridge National Laboratory. Analysis of the current modeling methods shows that one of the primary concerns for the modeling effort is the handling of uncertainty in cost estimates. Monte Carlo-based methods are commonly used to handle uncertainty, especially when implemented by a stand-alone script within a program such as Python or MATLAB. However, a script-based model requires each potential user to have access to a compiler and an executable capable of handling the script. Making the model accessible to multiple independent analysts is best accomplished by implementing the model in a common computing tool such as Microsoft Excel. Excel is readily available and accessible to most system analysts, but it is not designed for straightforward implementation of a Monte Carlo-based method. Using a Monte Carlo algorithm requires in-spreadsheet scripting and statistical analyses or the use of add-ons such as Crystal Ball. An alternative method uses propagation-of-error calculations in the existing Excel-based system to estimate system cost uncertainty. This method has the advantage of using Microsoft Excel as is, but it requires the use of simplifying assumptions. These assumptions do not necessarily call the analytical results into question. In fact, the
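    For the simplest case of a total cost that is a sum of independent accounts, the two approaches contrasted above can be compared directly: propagation of error adds the account uncertainties in quadrature, while Monte Carlo samples them. The sketch below uses invented cost numbers and assumes independent normal uncertainties; it illustrates the trade-off, not the ORNL model.

```python
import math
import random

def propagated_sigma(sigmas):
    """Propagation of error for a summed cost: independent account
    uncertainties add in quadrature."""
    return math.sqrt(sum(s * s for s in sigmas))

def monte_carlo_sigma(means, sigmas, n=200_000, seed=1):
    """Monte Carlo estimate of the same total-cost standard deviation."""
    rng = random.Random(seed)
    totals = [sum(rng.gauss(m, s) for m, s in zip(means, sigmas))
              for _ in range(n)]
    mean = sum(totals) / n
    return math.sqrt(sum((t - mean) ** 2 for t in totals) / (n - 1))

# invented cost accounts ($M): the two estimates should agree closely
means = [120.0, 45.0, 30.0]
sigmas = [18.0, 9.0, 6.0]
analytic = propagated_sigma(sigmas)          # sqrt(324 + 81 + 36) = 21.0
sampled = monte_carlo_sigma(means, sigmas)
```

    For nonlinear cost relationships or correlated inputs, the quadrature formula picks up cross terms and first-order derivative weights, which is exactly where the simplifying assumptions mentioned in the report enter.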

  7. Comparing Simple and Advanced Video Tools as Supports for Complex Collaborative Design Processes

    ERIC Educational Resources Information Center

    Zahn, Carmen; Pea, Roy; Hesse, Friedrich W.; Rosen, Joe

    2010-01-01

    Working with digital video technologies, particularly advanced video tools with editing capabilities, offers new prospects for meaningful learning through design. However, it is also possible that the additional complexity of such tools does "not" advance learning. In an experiment, we compared the design processes and learning outcomes of 24…

  8. Advanced Combustion Modeling for Complex Turbulent Flows

    NASA Technical Reports Server (NTRS)

    Ham, Frank Stanford

    2005-01-01

    The next generation of aircraft engines will need to pass stricter efficiency and emission tests. NASA's Ultra-Efficient Engine Technology (UEET) program has set an ambitious goal of a 70% reduction in NO(x) emissions and a 15% increase in the fuel efficiency of aircraft engines. We will demonstrate the state-of-the-art combustion tools developed at Stanford's Center for Turbulence Research (CTR) as part of this program. In the last decade, CTR has spearheaded a multi-physics-based combustion modeling program. Key technologies have been transferred to the aerospace industry and are currently being used for engine simulations. In this demo, we will showcase the next-generation combustion modeling tools that integrate a very high level of detailed physics into advanced flow simulation codes. Combustor flows involve multi-phase physics with liquid fuel jet breakup, evaporation, and eventual combustion. Individual components of the simulation are verified against complex test cases and show excellent agreement with experimental data.

  9. Advanced gradient-index lens design tools to maximize system performance and reduce SWaP

    NASA Astrophysics Data System (ADS)

    Campbell, Sawyer D.; Nagar, Jogender; Brocker, Donovan E.; Easum, John A.; Turpin, Jeremiah P.; Werner, Douglas H.

    2016-05-01

    GRadient-INdex (GRIN) lenses have long been of interest due to their potential for providing levels of performance unachievable with traditional homogeneous lenses. While historically limited by a lack of suitable materials, rapid advancements in manufacturing techniques, including 3D printing, have recently kindled a renewed interest in GRIN optics. Further increasing the desire for GRIN devices has been the advent of Transformation Optics (TO), which provides the mathematical framework for representing the behavior of electromagnetic radiation in a given geometry by "transforming" it to an alternative, usually more desirable, geometry through an appropriate mapping of the constituent material parameters. Using TO, aspherical lenses can be transformed to simpler spherical and flat geometries, or even to rotationally asymmetric shapes that result in true 3D GRIN profiles. Meanwhile, there is a critical lack of suitable design tools that can effectively evaluate optical wave propagation through the 3D GRIN profiles produced by TO. Current modeling software packages for optical lens systems also lack an advanced multi-objective global optimization capability that allows the user to explicitly view the trade-offs between all design objectives, such as focus quality, FOV, Δn, and focal drift due to chromatic aberrations. When coupled with advanced design methodologies such as TO, wavefront matching (WFM), and analytical achromatic GRIN theory, these tools provide a powerful framework for maximizing SWaP (Size, Weight and Power) reduction in GRIN-enabled optical systems. We provide an overview of our advanced GRIN design tools, along with examples that minimize the presence of mono- and polychromatic aberrations in the context of reducing SWaP.
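    One building block such design tools need is a ray trace through a continuous index profile. The sketch below, an illustrative toy rather than any of the tools described above, integrates the paraxial ray equation r'' = -a·r for a radial parabolic profile n(r) = n0(1 - a·r²/2), whose exact solution is sinusoidal with quarter pitch π/(2√a):

```python
import math

def trace_parabolic_grin(r0, a, z_end, dz=1e-6):
    """Paraxial ray trace through n(r) = n0 * (1 - a*r**2/2): the ray
    equation reduces to r'' = -a*r, integrated with a semi-implicit
    (symplectic) Euler step for good long-range behavior."""
    r, slope, z = r0, 0.0, 0.0
    while z < z_end:
        slope -= a * r * dz
        r += slope * dz
        z += dz
    return r

# a ray launched parallel to the axis crosses it after a quarter pitch
a = 1.0e6                                   # gradient constant, 1/m^2
quarter_pitch = math.pi / (2.0 * math.sqrt(a))
r_end = trace_parabolic_grin(r0=0.1e-3, a=a, z_end=quarter_pitch)
```

    Full GRIN design tools replace this paraxial step with exact ray equations in 3D and wrap the trace in the multi-objective optimizers discussed above.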

  10. Performance and Architecture Lab Modeling Tool

    SciTech Connect

    2014-06-19

    Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into subproblems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules: given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior.
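    The annotation idea above can be caricatured in a few lines of Python. This is a hypothetical toy in the spirit of Palm, not Palm's actual annotation language: each code block registers an analytical cost expression, and a parent's model is composed from its children's, mirroring Palm's hierarchical generation.

```python
_registry = {}

def model(cost):
    """Annotation: attach an analytical cost expression (a function
    of problem size n) to a block of code."""
    def wrap(fn):
        _registry[fn.__name__] = cost
        return fn
    return wrap

@model(cost=lambda n: n)          # annotated as a linear scan
def find_pivot(n):
    pass

@model(cost=lambda n: n * n)      # annotated as a quadratic sweep
def eliminate(n):
    pass

def predicted_cost(names, n):
    """Hierarchical composition: a parent's model is the sum of its
    children's models."""
    return sum(_registry[name](n) for name in names)

total = predicted_cost(["find_pivot", "eliminate"], n=100)  # 100 + 10000
```

    Because the cost expressions live next to the code they describe, regenerating the model after a refactor is mechanical, which is the reproducibility property the abstract emphasizes.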

  12. Some mathematical tools for a modeller's workbench

    NASA Technical Reports Server (NTRS)

    Cohen, E.

    1984-01-01

    The development of mathematical software tools in a workbench environment, intended to make modeling related objects more straightforward, is outlined. The construction of a computer model from informal drawings and a plastic model of a helicopter is discussed. Lofting was the predominant, characteristic modeling technique. Ship and airplane designs use lofting because their defining surfaces (hulls and fuselages) are built from vertical station cuts perpendicular to the vertical center plane, which defines the major axis of reflective symmetry. A turbine blade from a jet engine was modeled in this way. The aerodynamic portion and the root come from different paradigms, and the union of these two parts into a coherent model is shown.

  13. Advances in Watershed Models and Modeling

    NASA Astrophysics Data System (ADS)

    Yeh, G. T.; Zhang, F.

    2015-12-01

    The development of watershed models and their applications to real-world problems has evolved significantly since the 1960s. Watershed models can be classified based on what media are included, what processes are dealt with, and what approaches are taken. In terms of media, a watershed may include a segregated overland regime, river-canal-open channel networks, ponds-reservoirs-small lakes, and subsurface media. It may also include integrated media comprising all or a subset of these, as well as man-made control structures. In terms of processes, a watershed model may deal with coupled or decoupled hydrological and biogeochemical cycles. These processes include fluid flow, thermal transport, salinity transport, sediment transport, reactive transport, and biota and microbe kinetics. In terms of approaches, either a parametric or a physics-based approach can be taken. This talk discusses the evolution of watershed models in the past sixty years. The advances of watershed models center around their increasing design capability to accommodate these segregated or integrated media and coupled or decoupled processes. Widely used models developed by academia, research institutes, government agencies, and private industries will be reviewed in terms of the media and processes included as well as the approaches taken. Many types of potential benchmark problems can be proposed and will be discussed. This presentation will focus on three benchmark problems of biogeochemical cycles. These three problems, dealing with water quality transport, will be formulated in terms of reactive transport. Simulation results will be illustrated using WASH123D, a watershed model developed and continuously updated by the author and his PhD graduates. Keywords: Hydrological Cycles, Biogeochemical Cycles, Biota Kinetics, Parametric Approach, Physics-based Approach, Reactive Transport.
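The parametric end of the parametric-versus-physics-based spectrum mentioned above can be illustrated with the simplest lumped watershed element, a single linear reservoir with water balance dS/dt = P − Q and outflow Q = S/k. This is a generic textbook sketch, not WASH123D; the recession constant and rainfall series are illustrative values.

```python
# Minimal parametric (lumped) watershed sketch: one linear reservoir.
# dS/dt = P - Q, with outflow Q = S / k (k is a calibrated recession constant).

def linear_reservoir(rainfall, k=10.0, dt=1.0, storage=0.0):
    flows = []
    for p in rainfall:
        q = storage / k              # outflow proportional to current storage
        storage += dt * (p - q)      # explicit water-balance update
        flows.append(q)
    return flows

# One rainfall pulse followed by dry steps produces a classic recession curve.
hydrograph = linear_reservoir([5.0, 0.0, 0.0, 0.0])
```

A physics-based model would instead solve the governing flow equations (e.g., diffusion-wave overland flow and Richards' equation in the subsurface) on a discretized domain.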

  14. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performance-based energy codes, resulting in better, more efficient designs for our future built environment.

  15. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    SciTech Connect

    Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.; Qualls, A. L.

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  16. FMFilter: A fast model based variant filtering tool.

    PubMed

    Akgün, Mete; Faruk Gerdan, Ö; Görmez, Zeliha; Demirci, Hüseyin

    2016-04-01

    The availability of whole exome and genome sequencing has completely changed the structure of genetic disease studies. It is now possible to uncover disease-causing mechanisms in less time and on smaller budgets. For this reason, mining the valuable information from the huge amount of data produced by next generation techniques becomes a challenging task. Current tools analyze sequencing data using various methods. However, there is still a need for fast, easy-to-use and efficacious tools. Considering genetic disease studies, there is a lack of publicly available tools which support compound heterozygous and de novo models. Also, existing tools either require advanced IT expertise or are inefficient at handling large variant files. In this work, we provide FMFilter, an efficient sieving tool for next generation sequencing data produced by genetic disease studies. The software allows the user to choose the inheritance model (recessive, dominant, compound heterozygous and de novo) and the affected and control individuals. The program provides a user-friendly Graphical User Interface which eliminates the need for advanced computer skills. It has various filtering options which make it possible to eliminate the majority of false positives. FMFilter requires negligible memory; it can therefore easily handle very large variant files, such as multiple whole genomes, on ordinary computers. We demonstrate the variant reduction capability and effectiveness of the proposed tool with public and in-house data for different inheritance models. We also compare FMFilter with existing filtering software. We conclude that FMFilter provides an effective and easy-to-use environment for analyzing next generation sequencing data from Mendelian diseases. PMID:26925517
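The kind of inheritance-model filtering described above can be sketched with a simple genotype predicate per model. This is an illustrative toy, not FMFilter's actual implementation; the variant records, sample names, and predicates are assumptions for the example.

```python
# Toy sketch of Mendelian-model variant filtering. A variant is kept only if
# the genotypes of affected and control samples fit the chosen inheritance
# model. Genotypes are encoded as the number of alternate alleles (0/1/2).

def fits_model(variant, model, affected, controls):
    gt = variant["genotypes"]
    if model == "recessive":
        # affected must be homozygous alternate; controls must not be
        return all(gt[s] == 2 for s in affected) and all(gt[s] < 2 for s in controls)
    if model in ("dominant", "de_novo"):
        # genotypically, both require the variant in affected and absent in
        # controls; de novo additionally interprets the controls as the
        # biological parents (a distinction this sketch does not enforce)
        return all(gt[s] >= 1 for s in affected) and all(gt[s] == 0 for s in controls)
    raise ValueError("unknown model: " + model)

variants = [
    {"id": "v1", "genotypes": {"child": 2, "mother": 1, "father": 1}},
    {"id": "v2", "genotypes": {"child": 1, "mother": 1, "father": 0}},
]
hits = [v["id"] for v in variants
        if fits_model(v, "recessive", affected=["child"], controls=["mother", "father"])]
```

Here only `v1` survives the recessive filter: the child is homozygous for the alternate allele while both carrier parents are heterozygous.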

  18. Advancing alternate tools: why science education needs CRP and CRT

    NASA Astrophysics Data System (ADS)

    Dodo Seriki, Vanessa

    2016-09-01

    Ridgeway and Yerrick's paper, Whose banner are we waving?: exploring STEM partnerships for marginalized urban youth, unearthed the tensions that existed between a local community "expert" and a group of students and their facilitator in an afterschool program. Those of us who work with youth who are traditionally marginalized understand the importance of teaching in culturally relevant ways, but far too often—as Ridgeway and Yerrick shared—community partners have beliefs, motives, and ideologies that are incompatible with the program's mission and goals. Nevertheless, we often enter partnerships assuming that the other party understands the needs of the students or community; understands how in U.S. society White is normative while all others are deficient; and understands how to engage with students in culturally relevant ways. This forum addresses the underlying assumption, described in the Ridgeway and Yerrick article, that educators—despite their background and experiences—are able to teach in culturally relevant ways. Additionally, I assert, based on the findings in the article, that just as Ladson-Billings and Tate (Teach Coll Rec 97(1):47-68, 1995) argued that race in U.S. society was undertheorized as a scholarly pursuit, the same is true of science education: race in science education is undertheorized, and the use of culturally relevant pedagogy and critical race theory as a pedagogical model and analytical tool, respectively, is minimal. The increased use of both would impact our understanding of who does science, and how to broaden participation among people of color.

  19. Numerical modeling tools for chemical vapor deposition

    NASA Technical Reports Server (NTRS)

    Jasinski, Thomas J.; Childs, Edward P.

    1992-01-01

    Development of general numerical simulation tools for chemical vapor deposition (CVD) was the objective of this study. Physical models of important CVD phenomena were developed and implemented into the commercial computational fluid dynamics software FLUENT. The resulting software can address general geometries as well as the most important phenomena occurring within CVD reactors: fluid flow patterns, temperature and chemical species distributions, and gas-phase and surface deposition chemistry. The available physical models are documented, and examples of CVD simulation capabilities are provided.

  20. The Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    ERIC Educational Resources Information Center

    Knezek, Gerald; Christensen, Rhonda

    2015-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be targeted for…

  1. Collaboration tools and techniques for large model datasets

    USGS Publications Warehouse

    Signell, R.P.; Carniel, S.; Chiggiato, J.; Janekovic, I.; Pullen, J.; Sherwood, C.R.

    2008-01-01

    In MREA and many other marine applications, it is common to have multiple models running with different grids, run by different institutions. Techniques and tools are described for low-bandwidth delivery of data from large multidimensional datasets, such as those from meteorological and oceanographic models, directly into generic analysis and visualization tools. Output is stored using the NetCDF CF Metadata Conventions, and then delivered to collaborators over the web via OPeNDAP. OPeNDAP datasets served by different institutions are then organized via THREDDS catalogs. Tools and procedures are then used which enable scientists to explore data on the original model grids using tools they are familiar with. It is also low-bandwidth, enabling users to extract just the data they require, an important feature for access from ship or remote areas. The entire implementation is simple enough to be handled by modelers working with their webmasters - no advanced programming support is necessary. © 2007 Elsevier B.V. All rights reserved.
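The low-bandwidth extraction described above works because an OPeNDAP client requests only the index ranges it needs, expressed as a constraint appended to the dataset URL. A minimal sketch of building such a constrained request follows; the server address and variable name are hypothetical, and real clients (e.g., OPeNDAP-aware NetCDF libraries) construct these constraints automatically.

```python
# Sketch of an OPeNDAP-style hyperslab constraint: the subset is encoded in
# the URL, so the server ships only the requested indices, not the full file.

def opendap_subset_url(base_url, var, **ranges):
    # ranges maps dimension name -> (start, stop) inclusive indices; the
    # insertion order of the keyword arguments must match the variable's
    # dimension order (time, depth, lat, lon here).
    constraint = var + "".join("[%d:%d]" % (a, b) for a, b in ranges.values())
    return base_url + "?" + constraint

url = opendap_subset_url(
    "http://example.org/thredds/dodsC/ocean_model.nc",
    "salt",
    time=(0, 0), depth=(0, 19), lat=(100, 140), lon=(200, 260),
)
```

A request like this pulls a single time step of one variable over a small region - kilobytes rather than the gigabytes of the full model output, which is what makes shipboard access practical.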

  2. Human Factors Evaluation of Advanced Electric Power Grid Visualization Tools

    SciTech Connect

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin

    2009-04-01

    This report describes initial human factors evaluation of four visualization tools (Graphical Contingency Analysis, Force Directed Graphs, Phasor State Estimator and Mode Meter/ Mode Shapes) developed by PNNL, and proposed test plans that may be implemented to evaluate their utility in scenario-based experiments.

  3. Targeted mutagenesis tools for modelling psychiatric disorders.

    PubMed

    Deussing, Jan M

    2013-10-01

    In the 1980s, the basic principles of gene targeting were discovered and forged into sharp tools for efficient and precise engineering of the mouse genome. Since then, genetic mouse models have substantially contributed to our understanding of major neurobiological concepts and are of utmost importance for our comprehension of neuropsychiatric disorders. The "domestication" of site-specific recombinases and the continuous creative technological developments involving the implementation of previously identified biological principles such as transcriptional and posttranslational control now enable conditional mutagenesis with high spatial and temporal resolution. The initiation and successful accomplishment of large-scale efforts to annotate functionally the entire mouse genome and to build strategic resources for the research community have significantly accelerated the rapid proliferation and broad propagation of mouse genetic tools. Addressing neurobiological processes with the assistance of genetic mouse models is a routine procedure in psychiatric research and will be further extended in order to improve our understanding of disease mechanisms. In light of the highly complex nature of psychiatric disorders and the current lack of strong causal genetic variants, a major future challenge is to model psychiatric disorders more appropriately. Humanized mice, and the recently developed toolbox of site-specific nucleases for more efficient and simplified tailoring of the genome, offer the perspective of significantly improved models. Ultimately, these tools will push the limits of gene targeting beyond the mouse to allow genome engineering in any model organism of interest.

  4. An Advanced Time Averaging Modelling Technique for Power Electronic Circuits

    NASA Astrophysics Data System (ADS)

    Jankuloski, Goce

    For stable and efficient performance of power converters, a good mathematical model is needed. This thesis presents a new modelling technique for DC/DC and DC/AC Pulse Width Modulated (PWM) converters. The new model is more accurate than the existing modelling techniques such as State Space Averaging (SSA) and Discrete Time Modelling. Unlike the SSA model, the new modelling technique, the Advanced Time Averaging Model (ATAM) includes the averaging dynamics of the converter's output. In addition to offering enhanced model accuracy, application of linearization techniques to the ATAM enables the use of conventional linear control design tools. A controller design application demonstrates that a controller designed based on the ATAM outperforms one designed using the ubiquitous SSA model. Unlike the SSA model, ATAM for DC/AC augments the system's dynamics with the dynamics needed for subcycle fundamental contribution (SFC) calculation. This allows for controller design that is based on an exact model.
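The baseline that the ATAM improves upon, conventional state-space averaging, can be sketched concretely: the state matrices of the two switched topologies are weighted by the duty cycle, A = d·A1 + (1 − d)·A2. The matrices below are illustrative 2x2 values for a generic second-order converter, not taken from the thesis.

```python
# Sketch of conventional state-space averaging (SSA) for a PWM converter:
# the averaged model weights the switch-on and switch-off topologies by the
# duty cycle d. The ATAM described above augments this with the averaging
# dynamics of the output, which plain SSA omits.

def average(M1, M2, d):
    # elementwise d*M1 + (1-d)*M2 for equal-sized matrices (lists of rows)
    return [[d * a + (1.0 - d) * b for a, b in zip(r1, r2)]
            for r1, r2 in zip(M1, M2)]

A1 = [[0.0, -1.0], [1.0, -0.5]]   # state matrix with the switch on
A2 = [[0.0, -1.0], [1.0, -2.0]]   # state matrix with the switch off

A_avg = average(A1, A2, d=0.4)    # averaged small-signal state matrix
```

Because the averaged model is linear in the state, conventional linear control design tools apply to it directly, which is the same property the linearized ATAM retains.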

  5. Advanced Engineering Tools for Structural Analysis of Advanced Power Plants Application to the GE ESBWR Design

    SciTech Connect

    Gamble, R.E.; Fanning, A.; Diaz Llanos, M.; Moreno, A.; Carrasco, A.

    2002-07-01

    Experience in the design of nuclear reactors for power generation shows that the plant structures and buildings involved are one of the major contributors to plant capital investment. Consequently, the design of these elements must be optimised if cost reductions in future reactors are to be achieved. The benefits of using the 'Best Estimate Approach' are well known in the area of core and systems design. This consists of developing accurate models of a plant's phenomenology and behaviour, minimising the margins. Different safety margins have been applied in the past when performing structural analyses. Three of these margins can be identified: - increasing the value of the load by a factor that depends on the load frequency; - decreasing the structure's assumed resistance, and - safety margins introduced through two-step analysis. The first two types of margins are established in the applicable codes in order to provide design safety margins. The third one derives from limitations in tools which, in the past, did not allow one to obtain an accurate model in which both the dynamic and static loads could be evaluated simultaneously. Nowadays, improvements in hardware and software have eliminated the need for two-step calculations in structural analysis (dynamic plus static), allowing the creation of one-through finite element models in which all loads, both dynamic and static, are combined without the determination of the equivalent static loads from the dynamic loads. This paper summarizes how these models and methods have been applied to optimize the Reactor Building structural design of the General Electric (GE) ESBWR Passive Plant. The work has focused on three areas: - the design of the Gravity Driven Cooling System (GDCS) Pools as pressure boundary between the Drywell and the Wetwell; - the evaluation of the thickness of the Reactor Building foundation slab, and - the global structural evaluation of the Reactor Building.

  6. A tool box for implementing supersymmetric models

    NASA Astrophysics Data System (ADS)

    Staub, Florian; Ohl, Thorsten; Porod, Werner; Speckner, Christian

    2012-10-01

    We present a framework for performing a comprehensive analysis of a large class of supersymmetric models, including spectrum calculation, dark matter studies and collider phenomenology. To this end, the respective model is defined in an easy and straightforward way using the Mathematica package SARAH. SARAH then generates model files for CalcHep which can be used with micrOMEGAs as well as model files for WHIZARD and O'Mega. In addition, Fortran source code for SPheno is created which facilitates the determination of the particle spectrum using two-loop renormalization group equations and one-loop corrections to the masses. As an additional feature, the generated SPheno code can write out input files suitable for use with HiggsBounds to apply bounds coming from the Higgs searches to the model. Combining all programs provides a closed chain from model building to phenomenology. Program summary Program title: SUSY Phenomenology toolbox. Catalog identifier: AEMN_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMN_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 140206. No. of bytes in distributed program, including test data, etc.: 1319681. Distribution format: tar.gz. Programming language: Autoconf, Mathematica. Computer: PC running Linux, Mac. Operating system: Linux, Mac OS. Classification: 11.6. Nature of problem: Comprehensive studies of supersymmetric models beyond the MSSM is considerably complicated by the number of different tasks that have to be accomplished, including the calculation of the mass spectrum and the implementation of the model into tools for performing collider studies, calculating the dark matter density and checking the compatibility with existing collider bounds (in particular, from the Higgs searches). Solution method: The

  7. Advances in Modelling of Valley Glaciers

    NASA Astrophysics Data System (ADS)

    Adhikari, Surendra

    For glaciological conditions typical of valley glaciers, the central idea of this research lies in understanding the effects of high-order mechanics and parameterizing these for simpler dynamical and statistical methods in glaciology. As an effective tool for this, I formulate a new brand of dynamical models that describes distinct physical processes of deformational flow. Through numerical simulations of idealized glacier domains, I calculate empirical correction factors to capture the effects of longitudinal stress gradients and lateral drag for simplified dynamical models in the plane-strain regime. To get some insights into real glacier dynamics, I simulate Haig Glacier in the Canadian Rocky Mountains. As geometric effects overshadow dynamical effects in glacier retreat scenarios, it appears that high-order physics are not very important for Haig Glacier, particularly for evaluating its fate. Indeed, high-order and reduced models all predict that Haig Glacier ceases to exist by about AD2080 under ongoing climate warming. This finding regarding the minimal role of high-order physics may not be broadly valid, as it is not true in advance scenarios at Haig Glacier and it may not be representative of other glaciological settings. Through a 'bulk' parameterization of high-order physics, geometric and climatic settings, sliding conditions, and transient effects, I also provide new insights into the volume-area relation, a widely used statistical method for estimating glacier volume. I find a steady-state power-law exponent of 1.46, which declines systematically to 1.38 after 100 years of sustained retreat, in good accord with the observations. I recommend more accurate scaling relations through characterization of individual glacier morphology and degree of climatic disequilibrium. This motivates a revision of global glacier volume estimates, of some urgency in sea level rise assessments.
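The volume-area relation discussed above has the power-law form V = c·A^γ. A worked sketch with the abstract's exponents (1.46 at steady state, 1.38 after sustained retreat) follows; the scaling coefficient c is an illustrative value, not one reported in the abstract.

```python
# Volume-area scaling for glaciers: V = c * A**gamma, with V in km^3 and
# A in km^2. The coefficient c = 0.034 is illustrative; the exponents are
# the steady-state and post-retreat values quoted in the abstract.

def glacier_volume(area_km2, gamma, c=0.034):
    return c * area_km2 ** gamma

v_steady = glacier_volume(10.0, 1.46)    # steady-state exponent
v_retreat = glacier_volume(10.0, 1.38)   # after 100 years of retreat
```

For a fixed area, the lower post-retreat exponent yields a smaller inferred volume, which is why applying a steady-state exponent to retreating glaciers can bias global volume (and hence sea level rise) estimates.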

  8. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to but not on the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool is stored parametrically so that once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI 'C', the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX4.0 and above. The memory is allocated dynamically; therefore, memory size will depend on the complexity of the geometry/grid. GridTool's data structure is based on a linked-list structure which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects such as points, curves, patches, sources and surfaces. At any given time, there is always an active object which is drawn in magenta, or in their highlighted colors as defined by the resource file which will be discussed later.

  9. Chemical Kinetic Modeling of Advanced Transportation Fuels

    SciTech Connect

    PItz, W J; Westbrook, C K; Herbinet, O

    2009-01-20

    Development of detailed chemical kinetic models for advanced petroleum-based and nonpetroleum based fuels is a difficult challenge because of the hundreds to thousands of different components in these fuels and because some of these fuels contain components that have not been considered in the past. It is important to develop detailed chemical kinetic models for these fuels since the models can be put into engine simulation codes used for optimizing engine design for maximum efficiency and minimal pollutant emissions. For example, these chemistry-enabled engine codes can be used to optimize combustion chamber shape and fuel injection timing. They also allow insight into how the composition of advanced petroleum-based and non-petroleum based fuels affects engine performance characteristics. Additionally, chemical kinetic models can be used separately to interpret important in-cylinder experimental data and gain insight into advanced engine combustion processes such as HCCI and lean burn engines. The objectives are: (1) Develop detailed chemical kinetic reaction models for components of advanced petroleum-based and non-petroleum based fuels. These fuel models include components from vegetable-oil-derived biodiesel, oil-sand derived fuel, alcohol fuels and other advanced bio-based and alternative fuels. (2) Develop detailed chemical kinetic reaction models for mixtures of non-petroleum and petroleum-based components to represent real fuels and lead to efficient reduced combustion models needed for engine modeling codes. (3) Characterize the role of fuel composition on efficiency and pollutant emissions from practical automotive engines.
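The building block of any such detailed mechanism is the elementary reaction rate constant, conventionally expressed in modified Arrhenius form, k = A·T^n·exp(−Ea/(R·T)). The sketch below evaluates this form; the parameter values are illustrative, not taken from an actual fuel mechanism.

```python
import math

# Modified Arrhenius rate constant, the standard form used for each of the
# hundreds-to-thousands of elementary reactions in a detailed fuel mechanism.
# A, n, and Ea here are illustrative, not values from a published mechanism.

R = 8.314  # universal gas constant, J/(mol K)

def arrhenius(T, A, n, Ea):
    # T in K, A in mechanism units, Ea in J/mol
    return A * T ** n * math.exp(-Ea / (R * T))

k_800 = arrhenius(800.0, A=1.0e10, n=0.0, Ea=1.2e5)
k_1200 = arrhenius(1200.0, A=1.0e10, n=0.0, Ea=1.2e5)
# the strong temperature sensitivity (k_1200 >> k_800) is what makes
# in-cylinder phenomena like HCCI ignition timing so chemistry-dependent
```

Engine simulation codes evaluate thousands of such expressions per cell per time step, which is why the reduced mechanisms mentioned in objective (2) matter for practical use.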

  10. Advancements in engineering turbulence modeling

    NASA Technical Reports Server (NTRS)

    Shih, T.-H.

    1991-01-01

    Some new developments in two-equation models and second order closure models are presented. Two-equation models (k-epsilon models) have been widely used in computational fluid dynamics (CFD) for engineering problems. Most low-Reynolds-number two-equation models contain wall-distance damping functions to account for the effect of the wall on turbulence. However, this often causes confusion and difficulties in computing flows with complex geometry and also requires ad hoc treatment near separation and reattachment points. A set of modified two-equation models is proposed to remove the aforementioned shortcomings. The calculations using various two-equation models are compared with direct numerical simulations of channel flow and flat boundary layers. Development of a second order closure model is also discussed with emphasis on the modeling of pressure related correlation terms and dissipation rates in the second moment equations. All the existing models poorly predict the normal stresses near the wall and fail to predict the 3-D effect of mean flow on the turbulence (e.g. decrease in the shear stress caused by the cross flow in the boundary layer). The newly developed second order near-wall turbulence model is described and is capable of capturing the near-wall behavior of turbulence as well as the effect of 3-D mean flow on the turbulence.
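The wall-distance damping functions at issue enter the model through the eddy viscosity, nu_t = C_mu·f_mu·k²/ε. The sketch below uses a generic exponential f_mu to show the mechanism; it is an illustration of the damping-function concept, not any specific published low-Reynolds-number model.

```python
import math

# Generic illustration of a low-Reynolds-number damping function: the eddy
# viscosity nu_t = C_mu * f_mu * k**2 / eps is suppressed near the wall by
# f_mu, which depends on the wall distance in viscous units (y+). The
# exponential form below is illustrative, not a specific published model.

C_MU = 0.09  # standard k-epsilon constant

def eddy_viscosity(k, eps, y_plus):
    f_mu = 1.0 - math.exp(-y_plus / 26.0)   # damps nu_t toward the wall
    return C_MU * f_mu * k * k / eps

nu_near_wall = eddy_viscosity(k=0.5, eps=10.0, y_plus=5.0)
nu_far_wall = eddy_viscosity(k=0.5, eps=10.0, y_plus=200.0)
# far from the wall f_mu -> 1 and the damping vanishes
```

The dependence on wall distance (y+) is precisely what becomes ambiguous in complex geometries and near separation, motivating the modified models without such functions.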

  11. Advanced Epi Tools for Gallium Nitride Light Emitting Diode Devices

    SciTech Connect

    Patibandla, Nag; Agrawal, Vivek

    2012-12-01

    Over the course of this program, Applied Materials, Inc., with generous support from the United States Department of Energy, developed a world-class three chamber III-Nitride epi cluster tool for low-cost, high volume GaN growth for the solid state lighting industry. One of the major achievements of the program was to design, build, and demonstrate the world’s largest wafer capacity HVPE chamber suitable for repeatable high volume III-Nitride template and device manufacturing. Applied Materials’ experience in developing deposition chambers for the silicon chip industry over many decades resulted in many orders of magnitude reductions in the price of transistors. That experience and understanding was used in developing this GaN epi deposition tool. The multi-chamber approach, which continues to be unique in the ability of the each chamber to deposit a section of the full device structure, unlike other cluster tools, allows for extreme flexibility in the manufacturing process. This robust architecture is suitable for not just the LED industry, but GaN power devices as well, both horizontal and vertical designs. The new HVPE technology developed allows GaN to be grown at a rate unheard of with MOCVD, up to 20x the typical MOCVD rate of 3 μm per hour, with bulk crystal quality better than the highest-quality commercial GaN films grown by MOCVD at a much cheaper overall cost. This is a unique development as the HVPE process has been known for decades, but never successfully commercially developed for high volume manufacturing. This research shows the potential of the first commercial-grade HVPE chamber, an elusive goal for III-V researchers and those wanting to capitalize on the promise of HVPE. Additionally, in the course of this program, Applied Materials built two MOCVD chambers, in addition to the HVPE chamber, and a robot that moves wafers between them. The MOCVD chambers demonstrated industry-leading wavelength yield for GaN based LED wafers and industry

  12. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    SciTech Connect

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication, and the use of antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings is collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the proposed mitigation appears appropriate or whether additional controls should be implemented. Since the application is web-based, the information is captured in a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment

  13. Advances in National Capabilities for Consequence Assessment Modeling of Airborne Hazards

    SciTech Connect

    Nasstrom, J; Sugiyama, G; Foster, K; Larsen, S; Kosovic, B; Eme, B; Walker, H; Goldstein, P; Lundquist, J; Pobanz, B; Fulton, J

    2007-11-26

    This paper describes ongoing advancement of airborne hazard modeling capabilities in support of multiple agencies through the National Atmospheric Release Advisory Center (NARAC) and the Interagency Modeling and Atmospheric Assessment Center (IMAAC). A suite of software tools developed by Lawrence Livermore National Laboratory (LLNL) and collaborating organizations includes simple stand-alone, local-scale plume modeling tools for end users' computers; web- and Internet-based software to access advanced 3-D flow and atmospheric dispersion modeling tools and expert analysis from the national center at LLNL; and state-of-the-science high-resolution urban models and event reconstruction capabilities.

  14. Modeling of Spacecraft Advanced Chemical Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Benfield, Michael P. J.; Belcher, Jeremy A.

    2004-01-01

    This paper outlines the development of the Advanced Chemical Propulsion System (ACPS) model for Earth and Space Storable propellants. This model was developed by the System Technology Operation of SAIC-Huntsville for the NASA MSFC In-Space Propulsion Project Office. Each subsystem of the model is described. Selected model results will also be shown to demonstrate the model's ability to evaluate technology changes in chemical propulsion systems.

  15. Expert models and modeling processes associated with a computer-modeling tool

    NASA Astrophysics Data System (ADS)

    Zhang, Baohui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-07-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using a think-aloud technique and video recording, we captured their on-screen modeling activities and thinking processes. We also interviewed them the day following their modeling sessions to further probe the rationale behind their modeling practices. We analyzed both the audio-video transcripts and the experts' models. We found that the experts' modeling processes followed the linear sequence built into the modeling program, with few instances of moving back and forth. They specified their goals up front and spent a long time thinking through an entire model before acting. They specified relationships with accurate and convincing evidence. Factors (i.e., variables) in expert models were clustered and represented by specialized technical terms. Based on these findings, we make suggestions for improving model-based science teaching and learning using Model-It.

  16. Scanning magnetoresistive microscopy: An advanced characterization tool for magnetic nanosystems

    NASA Astrophysics Data System (ADS)

    Mitin, D.; Grobis, M.; Albrecht, M.

    2016-02-01

    We present an advanced scanning magnetoresistive microscopy (SMRM), a robust magnetic imaging and probing technique that utilizes state-of-the-art recording heads of hard disk drives as sensors. The spatial resolution of modern tunneling magnetoresistive sensors is now comparable to that of the more commonly used magnetic force microscope. Important advantages of SMRM are its ability to detect pure magnetic signals directly proportional to the out-of-plane magnetic stray field, its negligible sensor stray fields, and its ability to apply local bipolar magnetic field pulses up to 10 kOe with bandwidths from DC up to 1 GHz. Moreover, the SMRM can be further equipped with a heating stage and external magnetic field units. The performance of this method and corresponding best practices are demonstrated through various examples, including a temperature-dependent recording study on hard magnetic L10 FeCuPt thin films, imaging of magnetic vortex states in an in-plane magnetic field, and their controlled manipulation by applying local field pulses.

  17. Scanning magnetoresistive microscopy: An advanced characterization tool for magnetic nanosystems.

    PubMed

    Mitin, D; Grobis, M; Albrecht, M

    2016-02-01

    We present an advanced scanning magnetoresistive microscopy (SMRM), a robust magnetic imaging and probing technique that utilizes state-of-the-art recording heads of hard disk drives as sensors. The spatial resolution of modern tunneling magnetoresistive sensors is now comparable to that of the more commonly used magnetic force microscope. Important advantages of SMRM are its ability to detect pure magnetic signals directly proportional to the out-of-plane magnetic stray field, its negligible sensor stray fields, and its ability to apply local bipolar magnetic field pulses up to 10 kOe with bandwidths from DC up to 1 GHz. Moreover, the SMRM can be further equipped with a heating stage and external magnetic field units. The performance of this method and corresponding best practices are demonstrated through various examples, including a temperature-dependent recording study on hard magnetic L10 FeCuPt thin films, imaging of magnetic vortex states in an in-plane magnetic field, and their controlled manipulation by applying local field pulses. PMID:26931856

  18. Advanced Flow Control as a Management Tool in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Wugalter, S.

    1974-01-01

    Advanced Flow Control is closely related to Air Traffic Control, which is the business of the Federal Aviation Administration. To formulate an understanding of advanced flow control and its use as a management tool in the National Airspace System, it becomes necessary to speak somewhat of air traffic control, the role of the FAA, and their relationship to advanced flow control. This should also dispel forever any notion that advanced flow control is the inspirational master valve scheme to be used on the Alaskan Oil Pipeline.

  19. Recent advances in crop growth modeling

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Crop simulation models and model-based decision support systems are increasingly used to assist agricultural research and development. The systems approach and modelling tools have been linked down to scales of functional genomics and up to regional scales of natural resource management. Although cr...

  20. Advanced tools for astronomical time series and image analysis

    NASA Astrophysics Data System (ADS)

    Scargle, Jeffrey D.

    The algorithms described here, which I have developed for applications in X-ray and γ-ray astronomy, will hopefully be of use in other ways, perhaps aiding in the exploration of modern astronomy's data cornucopia. The goal is to describe principled approaches to some ubiquitous problems, such as detection and characterization of periodic and aperiodic signals, estimation of time delays between multiple time series, and source detection in noisy images with noisy backgrounds. The latter problem is related to detection of clusters in data spaces of various dimensions. A goal of this work is to achieve a unifying view of several related topics: signal detection and characterization, cluster identification, classification, density estimation, and multivariate regression. In addition to being useful for analysis of data from space-based and ground-based missions, these algorithms may be a basis for a future automatic science discovery facility, and in turn provide analysis tools for the Virtual Observatory. This chapter has ties to those by Larry Bretthorst, Tom Loredo, Alanna Connors, Fionn Murtagh, Jim Berger, David van Dyk, Vicent Martinez & Enn Saar.
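    The periodic-signal detection problem mentioned above is classically handled by the Lomb-Scargle periodogram for unevenly sampled time series, a method closely associated with Scargle's work. As a point of reference only (this is not code from the chapter), a minimal NumPy sketch of the classical formulation applied to synthetic data:

    ```python
    import numpy as np

    def lomb_scargle(t, y, freqs):
        """Classical Lomb-Scargle periodogram for unevenly sampled data."""
        y = y - y.mean()
        power = np.empty(len(freqs))
        for k, f in enumerate(freqs):
            w = 2.0 * np.pi * f
            # time offset tau makes the sinusoid fit invariant to time shifts
            tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                             np.sum(np.cos(2 * w * t))) / (2 * w)
            c = np.cos(w * (t - tau))
            s = np.sin(w * (t - tau))
            power[k] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
        return power

    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 100, 200))            # uneven sampling times
    y = np.sin(2 * np.pi * 0.2 * t) + 0.3 * rng.standard_normal(200)
    freqs = np.linspace(0.01, 0.5, 2000)
    best = freqs[np.argmax(lomb_scargle(t, y, freqs))]   # recovered frequency
    ```

    For real analyses, `astropy.timeseries.LombScargle` provides a production implementation with fast algorithms and false-alarm-probability estimates.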

  1. Preparation of tool mark standards with jewelry modeling waxes.

    PubMed

    Petraco, Nicholas; Petraco, Nicholas D K; Faber, Lisa; Pizzola, Peter A

    2009-03-01

    This paper presents how jewelry modeling waxes are used in the preparation of tool mark standards from exemplar tools. We have previously found that jewelry modeling waxes are ideal for preparing test tool marks from exemplar tools. In this study, simple methods and techniques are offered for the replication of accurate, highly detailed tool mark standards with jewelry modeling waxes. The techniques described here demonstrate the conditioning and proper use of jewelry modeling wax in the production of tool mark standards. The application of each test tool's working surface to a piece of the appropriate wax in a manner consistent with the tool's design is clearly illustrated. The resulting tool mark standards are exact, highly detailed, 1:1, negative impressions of the exemplar tool's working surface. These wax models have a long shelf life and are suitable for use in microscopic examination comparison of questioned and known tool marks. PMID:19187458

  2. An MCMC Circumstellar Disks Modeling Tool

    NASA Astrophysics Data System (ADS)

    Wolff, Schuyler; Perrin, Marshall D.; Mazoyer, Johan; Choquet, Elodie; Soummer, Remi; Ren, Bin; Pueyo, Laurent; Debes, John H.; Duchene, Gaspard; Pinte, Christophe; Menard, Francois

    2016-01-01

    We present an enhanced software framework for the Markov Chain Monte Carlo modeling of circumstellar disk observations, including spectral energy distributions and multi-wavelength images from a variety of instruments (e.g. GPI, NICI, HST, WFIRST). The goal is to self-consistently and simultaneously fit a wide variety of observables in order to place constraints on the physical properties of a given disk, while also rigorously assessing the uncertainties in the derived properties. This modular code is designed to work with a collection of existing modeling tools, ranging from simple scripts to define the geometry for optically thin debris disks, to full radiative transfer modeling of complex grain structures in protoplanetary disks (using the MCFOST radiative transfer modeling code). The MCMC chain relies on direct chi-squared comparison of model images/spectra to observations. We include a discussion of how best to weight different observations in the modeling of a single disk and how to incorporate forward modeling from PCA PSF subtraction techniques. The code is open source, written in Python, and available on GitHub. Results for several disks at various evolutionary stages will be discussed.
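    The direct chi-squared comparison that drives such an MCMC chain can be sketched with a bare-bones Metropolis-Hastings sampler. The linear model below is a hypothetical stand-in for a real disk image or SED computation; all names, data, and step sizes are illustrative, not taken from the actual framework:

    ```python
    import numpy as np

    def chi2(params, x, y, sigma):
        a, b = params
        model = a * x + b                  # stand-in for a disk model image/SED
        return np.sum(((y - model) / sigma) ** 2)

    def metropolis(x, y, sigma, n_steps=20000, step=0.05, seed=1):
        """Random-walk Metropolis on the likelihood exp(-chi2/2)."""
        rng = np.random.default_rng(seed)
        theta = np.array([0.0, 0.0])
        c = chi2(theta, x, y, sigma)
        chain = np.empty((n_steps, 2))
        for i in range(n_steps):
            prop = theta + step * rng.standard_normal(2)
            c_prop = chi2(prop, x, y, sigma)
            # accept with probability exp(-(chi2_new - chi2_old) / 2)
            if np.log(rng.random()) < -(c_prop - c) / 2.0:
                theta, c = prop, c_prop
            chain[i] = theta
        return chain

    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 50)
    y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(50)    # truth: a=2, b=1
    chain = metropolis(x, y, np.full(50, 0.1))
    a_est, b_est = chain[5000:].mean(axis=0)             # discard burn-in
    ```

    In practice the likelihood evaluation calls a radiative transfer code such as MCFOST, and samplers like emcee handle step tuning and parallel chains automatically.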

  3. WMT: The CSDMS Web Modeling Tool

    NASA Astrophysics Data System (ADS)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can:
    - design a model from a set of components
    - edit component parameters
    - save models to a web-accessible server
    - share saved models with the community
    - submit runs to an HPC system
    - download simulation results
    The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API:
    - wmt-db: database of component, model, and simulation metadata and output
    - wmt-api: configure and connect components
    - wmt-exe: launch simulations on remote execution servers
    The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged

  4. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    NASA Astrophysics Data System (ADS)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper-division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data, ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease of use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; a multi-aperture analysis of variable stars over time using AstroImageJ; creating spectral energy distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen-alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. The students analyzed image data using the four software packages, imported the results into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at its West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed

  5. Advanced Space Shuttle simulation model

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1982-01-01

    A non-recursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed. It provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model, time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes, entitled Shuttle Simulation Turbulence Tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 120,000 meters. A description of the turbulence generation procedure is provided, and the results of validating the simulated turbulence are described. Conclusions and recommendations are presented. One-dimensional von Karman spectra are tabulated, and the minimum frequency simulated is discussed. The results of spectral and statistical analyses of the SSTT are presented.
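    A non-recursive synthesis of this kind can be illustrated by shaping white noise in the frequency domain with a von Karman spectrum and inverse-transforming. The sketch below is generic, with illustrative gust intensity, length scale, and airspeed values; it is not the SSTT generation procedure itself:

    ```python
    import numpy as np

    def von_karman_gusts(n, dt, sigma=1.0, L=200.0, V=50.0, seed=0):
        """Shape complex white noise by the square root of a one-sided
        von Karman longitudinal spectrum, then inverse-FFT. sigma (gust
        intensity, m/s), L (length scale, m) and V (airspeed, m/s) are
        illustrative parameters only."""
        rng = np.random.default_rng(seed)
        omega = 2 * np.pi * np.fft.rfftfreq(n, dt)          # rad/s
        S = sigma**2 * (2 * L / (np.pi * V)) \
            / (1 + (1.339 * L * omega / V) ** 2) ** (5 / 6)
        noise = rng.standard_normal(len(omega)) \
            + 1j * rng.standard_normal(len(omega))
        spec = np.sqrt(S) * noise
        spec[0] = 0.0                       # zero-mean gust record
        u = np.fft.irfft(spec, n)
        return u * (sigma / u.std())        # rescale to target intensity

    gusts = von_karman_gusts(n=4096, dt=0.05)
    ```

    Rescaling after the transform sidesteps the spectral normalization bookkeeping; a production generator would instead scale the spectrum so that its integral equals the gust variance.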

  6. Advances and computational tools towards predictable design in biological engineering.

    PubMed

    Pasotti, Lorenzo; Zucca, Susanna

    2014-01-01

    The design process of complex systems in all the fields of engineering requires a set of quantitatively characterized components and a method to predict the output of systems composed by such elements. This strategy relies on the modularity of the used components or the prediction of their context-dependent behaviour, when parts functioning depends on the specific context. Mathematical models usually support the whole process by guiding the selection of parts and by predicting the output of interconnected systems. Such bottom-up design process cannot be trivially adopted for biological systems engineering, since parts function is hard to predict when components are reused in different contexts. This issue and the intrinsic complexity of living systems limit the capability of synthetic biologists to predict the quantitative behaviour of biological systems. The high potential of synthetic biology strongly depends on the capability of mastering this issue. This review discusses the predictability issues of basic biological parts (promoters, ribosome binding sites, coding sequences, transcriptional terminators, and plasmids) when used to engineer simple and complex gene expression systems in Escherichia coli. A comparison between bottom-up and trial-and-error approaches is performed for all the discussed elements and mathematical models supporting the prediction of parts behaviour are illustrated.

  7. Micromechanical modeling of advanced materials

    SciTech Connect

    Silling, S.A.; Taylor, P.A.; Wise, J.L.; Furnish, M.D.

    1994-04-01

    Funded as a laboratory-directed research and development (LDRD) project, the work reported here focuses on the development of a computational methodology to determine the dynamic response of heterogeneous solids on the basis of their composition and microstructural morphology. Using the solid dynamics wavecode CTH, material response is simulated on a scale sufficiently fine to explicitly represent the material's microstructure. Conducting "numerical experiments" on this scale, the authors explore the influence that the microstructure exerts on the material's overall response. These results are used in the development of constitutive models that take into account the effects of microstructure without explicit representation of its features. Applying this methodology to a glass-reinforced plastic (GRP) composite, the authors examined the influence of various aspects of the composite's microstructure on its response in a loading regime typical of impact and penetration. As a prerequisite to the microscale modeling effort, they conducted extensive materials testing on the constituents, S-2 glass and epoxy resin (UF-3283), obtaining the first Hugoniot and spall data for these materials. The results of this work are used in the development of constitutive models for GRP materials in transient-dynamics computer wavecodes.

  8. Modeling Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Pitts, Marvin; Sager, John; Loader, Coleen; Drysdale, Alan

    1996-01-01

    Activities this summer consisted of two projects that involved computer simulation of bioregenerative life support systems for space habitats. Students in the Space Life Science Training Program (SLSTP) used the simulation, space station, to learn about relationships between humans, fish, plants, and microorganisms in a closed environment. One student completed a six-week project to modify the simulation by converting the microbes from anaerobic to aerobic and then balancing the simulation's life support system. A detailed computer simulation of a closed lunar station using bioregenerative life support was attempted, but not enough was known about system constraints and constants in plant growth, bioreactor design for space habitats, and food preparation to develop an integrated model with any confidence. Instead of a complete detailed model with broad assumptions concerning the unknown system parameters, a framework for an integrated model was outlined and work begun on plant and bioreactor simulations. The NASA sponsors and the summer Fellow were satisfied with the progress made during the 10 weeks, and we have planned future cooperative work.

  9. Accurate mask model for advanced nodes

    NASA Astrophysics Data System (ADS)

    Zine El Abidine, Nacer; Sundermann, Frank; Yesilada, Emek; Ndiaye, El Hadji Omar; Mishra, Kushlendra; Paninjath, Sankaranarayanan; Bork, Ingo; Buck, Peter; Toublan, Olivier; Schanen, Isabelle

    2014-07-01

    Standard OPC models consist of a physical optical model and an empirical resist model. The resist model compensates for the optical model's imprecision on top of modeling resist development. The optical model's imprecision may result from mask topography effects and from real mask information, including mask e-beam writing and mask process contributions. For advanced technology nodes, significant progress has been made in modeling mask topography to improve optical model accuracy. However, mask information is difficult to decorrelate from the standard OPC model. Our goal is to establish an accurate mask model through a dedicated calibration exercise. In this paper, we present a flow to calibrate an accurate mask model and enable its implementation. The study covers the different effects that should be embedded in the mask model as well as the experiments required to model them.

  10. Advances in Coupling of Kinetics and Molecular Scale Tools to Shed Light on Soil Biogeochemical Processes

    SciTech Connect

    Sparks, Donald

    2014-09-02

    Biogeochemical processes in soils such as sorption, precipitation, and redox play critical roles in the cycling and fate of nutrients, metal(loid)s and organic chemicals in soil and water environments. Advanced analytical tools enable soil scientists to track these processes in real-time and at the molecular scale. Our review focuses on recent research that has employed state-of-the-art molecular scale spectroscopy, coupled with kinetics, to elucidate the mechanisms of nutrient and metal(loid) reactivity and speciation in soils. We found that by coupling kinetics with advanced molecular and nano-scale tools major advances have been made in elucidating important soil chemical processes including sorption, precipitation, dissolution, and redox of metal(loids) and nutrients. Such advances will aid in better predicting the fate and mobility of nutrients and contaminants in soils and water and enhance environmental and agricultural sustainability.

  11. Right approach to 3D modeling using CAD tools

    NASA Astrophysics Data System (ADS)

    Baddam, Mounica Reddy

    The thesis provides a step-by-step methodology to enable an instructor dealing with CAD tools to optimally guide students through an understandable 3D modeling approach, one that will not only enhance their knowledge of the tool's usage but also enable them to achieve their desired result in comparatively less time. In practice, very little information is available on applying CAD skills in formal beginners' training sessions. Additionally, the advent of new software in the 3D domain makes staying current a more difficult task. Keeping up with the industry's advanced requirements emphasizes the importance of more skilled hands in the field of CAD development, rather than just prioritizing manufacturing in terms of complex software features. The thesis analyses different 3D modeling approaches specific to the variety of CAD tools currently available in the market. Using performance-time databases, learning curves have been generated to measure performance time, feature count, etc. Based on the results, improvement parameters have also been provided (Asperl, 2005).

  12. Modeling in the Classroom: An Evolving Learning Tool

    NASA Astrophysics Data System (ADS)

    Few, A. A.; Marlino, M. R.; Low, R.

    2006-12-01

    Among the early programs (early 1990s) focused on teaching Earth System Science were the Global Change Instruction Program (GCIP), funded by NSF through UCAR, and the Earth System Science Education Program (ESSE), funded by NASA through USRA. These two programs introduced modeling as a learning tool from the beginning, and they provided workshops, demonstrations and lectures for their participating universities. These programs were aimed at university-level education. Recently, classroom modeling is experiencing a revival of interest. Drs. John Snow and Arthur Few conducted two workshops on modeling at the ESSE21 meeting in Fairbanks, Alaska, in August 2005. The Digital Library for Earth System Education (DLESE) at http://www.dlese.org provides web access to STELLA models and tutorials, and UCAR's Education and Outreach (EO) program holds workshops that include training in modeling. An important innovation to the STELLA modeling software by isee systems, http://www.iseesystems.com, called "isee Player", is available as a free download. The Player allows users to view and run STELLA models, change model parameters, share models with colleagues and students, and make working models available on the web. This is important because the expert can create models, and the user can learn how the modeled system works. Another aspect of this innovation is that the educational benefits of modeling concepts can be extended throughout most of the curriculum. The procedure for building a working computer model of an Earth science system follows this general format: (1) carefully define the question(s) for which you seek the answer(s); (2) identify the interacting system components and inputs contributing to the system's behavior; (3) collect the information and data that will be required to complete the conceptual model; (4) construct a system diagram (graphic) that displays all of the system's central questions, components, relationships and required inputs. At this stage
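    The procedure above maps directly onto the stock-and-flow simulations that STELLA builds: each stock integrates its inflows minus its outflows over time. A minimal Euler-integration sketch with a single hypothetical stock (all names and parameters are illustrative):

    ```python
    def simulate(stock=1000.0, inflow=50.0, outflow_frac=0.06,
                 dt=0.25, t_end=40.0):
        """Euler integration of one stock: constant inflow, outflow
        proportional to the stock (illustrative water-reservoir toy).
        d(stock)/dt = inflow - outflow_frac * stock."""
        t, history = 0.0, [stock]
        while t < t_end:
            stock += (inflow - outflow_frac * stock) * dt
            t += dt
            history.append(stock)
        return history

    traj = simulate()
    ```

    Starting above the equilibrium value inflow/outflow_frac ≈ 833, the stock decays toward it, which is exactly the kind of behavior a student can explore by varying parameters in the isee Player.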

  13. Collaborative Inquiry Learning: Models, tools, and challenges

    NASA Astrophysics Data System (ADS)

    Bell, Thorsten; Urhahne, Detlef; Schanze, Sascha; Ploetzner, Rolf

    2010-02-01

    Collaborative inquiry learning is one of the most challenging and exciting ventures for today's schools. It aims at bringing a new and promising culture of teaching and learning into the classroom where students in groups engage in self-regulated learning activities supported by the teacher. It is expected that this way of learning fosters students' motivation and interest in science, that they learn to perform steps of inquiry similar to scientists and that they gain knowledge on scientific processes. Starting from general pedagogical reflections and science standards, the article reviews some prominent models of inquiry learning. This comparison results in a set of inquiry processes being the basis for cooperation in the scientific network NetCoIL. Inquiry learning is conceived in several ways with emphasis on different processes. For an illustration of the spectrum, some main conceptions of inquiry and their focuses are described. In the next step, the article describes exemplary computer tools and environments from within and outside the NetCoIL network that were designed to support processes of collaborative inquiry learning. These tools are analysed by describing their functionalities as well as effects on student learning known from the literature. The article closes with challenges for further developments elaborated by the NetCoIL network.

  14. Recent advances in modeling stellar interiors (u)

    SciTech Connect

    Guzik, Joyce Ann

    2010-01-01

    Advances in stellar interior modeling are being driven by new data from large-scale surveys and high-precision photometric and spectroscopic observations. Here we focus on single stars in normal evolutionary phases; we will not discuss the many advances in modeling star formation, interacting binaries, supernovae, or neutron stars. We review briefly: (1) updates to the input physics of stellar models; (2) progress in two- and three-dimensional evolution and hydrodynamic models; (3) insights from oscillation data used to infer stellar interior structure and validate model predictions (asteroseismology). We close by highlighting a few outstanding problems, e.g., the driving mechanisms for hybrid γ Dor/δ Sct star pulsations, the cause of giant eruptions seen in luminous blue variables such as η Car and P Cyg, and the solar abundance problem.

  15. METC Gasifier Advanced Simulation (MGAS) model

    SciTech Connect

    Syamlal, M.; Bissett, L.A.

    1992-01-01

    Morgantown Energy Technology Center is developing an advanced moving-bed gasifier, which is the centerpiece of the Integrated Gasifier Combined-Cycle (IGCC) system, with the features of good efficiency, low cost, and minimal environmental impact. A mathematical model of the gasifier, the METC-Gasifier Advanced Simulation (MGAS) model, has been developed for the analysis and design of advanced gasifiers and other moving-bed gasifiers. This report contains the technical and the user manuals of the MGAS model. The MGAS model can describe the transient operation of coflow, counterflow, or fixed-bed gasifiers. It is a one-dimensional model and can simulate the addition and withdrawal of gas and solids at multiple locations in the bed, a feature essential for simulating beds with recycle. The model describes the reactor in terms of a gas phase and a solids (coal or char) phase. These phases may exist at different temperatures. The model considers several combustion, gasification, and initial stage reactions. The model consists of a set of mass balances for 14 gas species and three coal (pseudo-) species and energy balances for the gas and the solids phases. The resulting partial differential equations are solved using a finite difference technique.
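    The MGAS balances are coupled multi-species partial differential equations solved by finite differences; the idea can be seen in miniature with a single gas-species mass balance, discretized first-order upwind in space and explicit in time. Everything below is an illustrative toy, not the MGAS formulation:

    ```python
    import numpy as np

    def upwind_species(n=100, dx=0.01, v=0.5, k=2.0, c_in=1.0,
                       dt=0.005, steps=800):
        """Explicit upwind finite differences for dc/dt = -v dc/dx - k c:
        one gas species flowing through a bed with a first-order
        consumption reaction. Parameters are illustrative; stability
        requires dt * (v/dx + k) < 1."""
        c = np.zeros(n)                                # bed initially species-free
        for _ in range(steps):
            c_up = np.concatenate(([c_in], c[:-1]))    # upwind (inlet) neighbors
            c += dt * (-v * (c - c_up) / dx - k * c)
        return c

    profile = upwind_species()
    ```

    After enough steps the discrete profile settles into a geometric decay down the bed, approximating the continuous steady state c(x) = c_in · exp(-k x / v).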

  16. Open Source assimilation tool for distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Richard, Julien; Giangola-Murzyn, Agathe; Tchiguirinskaia, Ioulia; Schertzer, Daniel

    2013-04-01

    An advanced GIS data assimilation interface is a prerequisite for a distributed hydrological model that is both transportable from catchment to catchment and easily adaptable to data resolution. This tool handles the cartographic data as well as the linked information data. In the case of the Multi-Hydro-Version2 model (A. Giangola-Murzyn et al. 2012), several types of information are distributed on a regular grid. The grid cell size has to be chosen by the user and each cell has to be filled with information. To be as realistic as possible, the Multi-Hydro model takes several data sources into account, so the assimilation tool (MH-AssimTool) has to be able to import all of this information. The needed flexibility in study area and grid size requires that the GIS interface be easy to learn and practical to use. The solution chosen was a main window for geographical visualisation and hierarchical menus coupled with checkboxes. For example, geographical information such as topography or land use can be visualized in the main window. Other data, such as soil conductivity, geology or initial moisture, are requested through several pop-up windows. Once the needed information is imported, MH-AssimTool prepares the data automatically. For the topography data conversion, if the resolution is too small, an interpolation is done during processing. As a result, all the converted data are at a suitable resolution for modelling. Like Multi-Hydro, MH-AssimTool is open source. It is coded in Visual Basic coupled with a GIS library, and the interface is built in such a way that it can be used by a non-specialist. We will illustrate the efficiency of the tool with case studies of peri-urban catchments of widely different sizes and characteristics, and will also explain some parts of the coding of the interface.

  17. THE AGWA – KINEROS2 SUITE OF MODELING TOOLS

    Technology Transfer Automated Retrieval System (TEKTRAN)

A suite of modeling tools, ranging from the event-based KINEROS2 flash-flood forecasting tool to the continuous KINEROS-OPUS (K2-O2) biogeochemistry tool, is described. The KINEROS2 flash-flood forecasting tool is being tested with the National Weather Service (NWS). The NWS version assimilates Dig...

  18. Center for Advanced Modeling and Simulation Intern

    ScienceCinema

    Gertman, Vanessa

    2016-07-12

    Some interns just copy papers and seal envelopes. Not at INL! Check out how Vanessa Gertman, an INL intern working at the Center for Advanced Modeling and Simulation, spent her summer working with some intense visualization software. Lots more content like this is available at INL's facebook page http://www.facebook.com/idahonationallaboratory.

  19. Center for Advanced Modeling and Simulation Intern

    SciTech Connect

    Gertman, Vanessa

    2010-01-01

    Some interns just copy papers and seal envelopes. Not at INL! Check out how Vanessa Gertman, an INL intern working at the Center for Advanced Modeling and Simulation, spent her summer working with some intense visualization software. Lots more content like this is available at INL's facebook page http://www.facebook.com/idahonationallaboratory.

  20. Earthquake information products and tools from the Advanced National Seismic System (ANSS)

    USGS Publications Warehouse

    Wald, Lisa

    2006-01-01

    This Fact Sheet provides a brief description of postearthquake tools and products provided by the Advanced National Seismic System (ANSS) through the U.S. Geological Survey Earthquake Hazards Program. The focus is on products specifically aimed at providing situational awareness in the period immediately following significant earthquake events.

  1. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    SciTech Connect

    Franco, P.; Estrems, M.; Faura, F.

    2007-04-07

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on cutting tool surface in turning processes due to wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along cutting tool surface can be analysed, and the worn surface shape during the workpiece machining can be determined. The proposed model analyses the gradual degradation of cutting tool during turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained for description of material loss on cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from bibliography and experimental observation for AISI 4340 steel turning with WC-Co cutting tools.
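The abstract above describes a multi-mechanism wear model in which material loss is estimated as a function of cutting time. As an illustrative sketch only (a single Usui-type rate law with made-up constants, not the paper's adhesion/abrasion/fracture formulation), the growth of the wear-land width VB over cutting time can be integrated numerically, with contact temperature rising as the wear land widens, which accelerates the wear rate as in measured VB curves:

```python
import math

def flank_wear_history(t_end, dt=1.0, sigma_n=1.2e3, v_s=2.5,
                       C=4.0e-5, lam=5.0e3, T0=900.0, k_T=600.0):
    """Euler integration of a Usui-type law: dVB/dt = C * sigma_n * v_s * exp(-lam / T).

    All constants here are illustrative, not fitted values from the paper.
    """
    vb, t, hist = 0.0, 0.0, []
    while t < t_end:
        T = T0 + k_T * vb                  # hotter contact as VB grows
        vb += C * sigma_n * v_s * math.exp(-lam / T) * dt
        t += dt
        hist.append((t, vb))
    return hist

hist = flank_wear_history(600.0)   # 10 minutes of cutting
```

The temperature feedback term is what produces the familiar accelerating tail of a VB-versus-time curve; with it removed, the sketch degenerates to linear wear growth.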

  2. Advances in a distributed approach for ocean model data interoperability

    USGS Publications Warehouse

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
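The interoperability described above rests on datasets carrying standard metadata. As a minimal, hypothetical illustration (the attribute set below is a tiny subset invented for the sketch, not the full CF-1.6 requirement list), a provider-side sanity check of a dataset's global attributes before publishing might look like:

```python
# Hypothetical, reduced attribute set -- the real CF conventions require far more.
REQUIRED = {"Conventions", "title", "institution"}

def cf_check(attrs):
    """Return a list of problems found in a dict of global attributes."""
    problems = [f"missing attribute: {k}" for k in sorted(REQUIRED - attrs.keys())]
    conv = attrs.get("Conventions", "")
    if conv and not conv.startswith("CF-"):
        problems.append(f"Conventions should name a CF version, got {conv!r}")
    return problems

print(cf_check({"Conventions": "CF-1.6", "title": "demo", "institution": "USGS"}))  # prints []
```

Checks of this kind are what let aggregation and search services treat heterogeneous model output uniformly.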

  3. Integrated modeling tool for performance engineering of complex computer systems

    NASA Technical Reports Server (NTRS)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  4. Air modeling: Air dispersion models; regulatory applications and technological advances

    SciTech Connect

    Miller, M.; Liles, R.

    1995-09-01

    Air dispersion models are a useful and practical tool for both industry and regulatory agencies. They serve as tools for engineering, permitting, and regulations development. Their cost effectiveness and ease of implementation compared to ambient monitoring is perhaps their most-appealing trait. Based on the current momentum within the U.S. EPA to develop better models and contain regulatory burdens on industry, it is likely that air dispersion modeling will be a major player in future air regulatory initiatives.

  5. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

    Increased mining production is continuing to challenge engineers and manufacturers to keep up. The pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. Also, it will discuss the systems engineering discipline and advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  6. Combustion modeling in advanced gas turbine systems

    SciTech Connect

    Smoot, L.D.; Hedman, P.O.; Fletcher, T.H.; Brewster, B.S.; Kramer, S.K.

    1995-12-31

The goal of DOE's Advanced Turbine Systems program is to develop and commercialize ultra-high-efficiency, environmentally superior, cost-competitive gas turbine systems for base-load applications in the utility, independent power producer, and industrial markets. The primary objective of the program described here is to develop a comprehensive combustion model for advanced gas turbine combustion systems using natural gas (or coal gasification or biomass fuels). The efforts included code evaluation (PCGC-3), coherent anti-Stokes Raman spectroscopy, laser Doppler anemometry, and laser-induced fluorescence.

  7. Maturity Model for Advancing Smart Grid Interoperability

    SciTech Connect

    Knight, Mark; Widergren, Steven E.; Mater, J.; Montgomery, Austin

    2013-10-28

Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC) sponsored by the United States Department of Energy is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.

  8. General model for boring tool optimization

    NASA Astrophysics Data System (ADS)

Moraru, G. M.; Zerbes, M. V.; Popescu, L. G.

    2016-08-01

Optimizing a tool (and therefore a boring tool) consists of improving its performance by maximizing the objective functions chosen by the designer and/or the user. Numerous features and performance requirements demanded by tool users contribute to defining and implementing the proposed objective functions. Incorporating new features makes the cutting tool competitive in the market and able to meet user requirements.

  9. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 11: Computer-Aided Manufacturing & Advanced CNC, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  10. Integrated modeling of advanced optical systems

    NASA Astrophysics Data System (ADS)

    Briggs, Hugh C.; Needels, Laura; Levine, B. Martin

    1993-02-01

    This poster session paper describes an integrated modeling and analysis capability being developed at JPL under funding provided by the JPL Director's Discretionary Fund and the JPL Control/Structure Interaction Program (CSI). The posters briefly summarize the program capabilities and illustrate them with an example problem. The computer programs developed under this effort will provide an unprecedented capability for integrated modeling and design of high performance optical spacecraft. The engineering disciplines supported include structural dynamics, controls, optics and thermodynamics. Such tools are needed in order to evaluate the end-to-end system performance of spacecraft such as OSI, POINTS, and SMMM. This paper illustrates the proof-of-concept tools that have been developed to establish the technology requirements and demonstrate the new features of integrated modeling and design. The current program also includes implementation of a prototype tool based upon the CAESY environment being developed under the NASA Guidance and Control Research and Technology Computational Controls Program. This prototype will be available late in FY-92. The development plan proposes a major software production effort to fabricate, deliver, support and maintain a national-class tool from FY-93 through FY-95.

  11. Advances in Computationally Modeling Human Oral Bioavailability

    PubMed Central

    Wang, Junmei; Hou, Tingjun

    2015-01-01

Although significant progress has been made in experimental high-throughput screening (HTS) of ADME (absorption, distribution, metabolism, excretion) and pharmacokinetic properties, in silico modeling of ADME and toxicity (ADME-Tox) is still indispensable in drug discovery, as it can guide us to wisely select drug candidates prior to expensive ADME screenings and clinical trials. Compared to other ADME-Tox properties, human oral bioavailability (HOBA) is particularly important but extremely difficult to predict. In this paper, the advances in human oral bioavailability modeling will be reviewed. Moreover, our insights into how to construct more accurate and reliable HOBA QSAR and classification models will also be discussed. PMID:25582307

  12. Measuring political commitment and opportunities to advance food and nutrition security: piloting a rapid assessment tool.

    PubMed

    Fox, Ashley M; Balarajan, Yarlini; Cheng, Chloe; Reich, Michael R

    2015-06-01

    Lack of political commitment has been identified as a primary reason for the low priority that food and nutrition interventions receive from national governments relative to the high disease burden caused by malnutrition. Researchers have identified a number of factors that contribute to food and nutrition's 'low-priority cycle' on national policy agendas, but few tools exist to rapidly measure political commitment and identify opportunities to advance food and nutrition on the policy agenda. This article presents a theory-based rapid assessment approach to gauging countries' level of political commitment to food and nutrition security and identifying opportunities to advance food and nutrition on the policy agenda. The rapid assessment tool was piloted among food and nutrition policymakers and planners in 10 low- and middle-income countries in April to June 2013. Food and nutrition commitment and policy opportunity scores were calculated for each country and strategies to advance food and nutrition on policy agendas were designed for each country. The article finds that, in a majority of countries, political leaders had verbally and symbolically committed to addressing food and nutrition, but adequate financial resources were not allocated to implement specific programmes. In addition, whereas the low cohesion of the policy community has been viewed a major underlying cause of the low-priority status of food and nutrition, the analysis finds that policy community cohesion and having a well thought-out policy alternative were present in most countries. This tool may be useful to policymakers and planners providing information that can be used to benchmark and/or evaluate advocacy efforts to advance reforms in the food and nutrition sector; furthermore, the results can help identify specific strategies that can be employed to move the food and nutrition agenda forward. 
This tool complements others that have been recently developed to measure national commitment to

  14. Noetica: A Tool for Semantic Data Modelling.

    ERIC Educational Resources Information Center

    Greenhill, Stewart; Venkatesh, Svetha

    1998-01-01

    Discusses Noetica, a tool that uses a semantic network for structuring knowledge about concepts and the relationships between them. It differs from typical information systems in that the knowledge it represents is abstract, highly connected, and includes metaknowledge. Class hierarchy, visualization, and query tools are also discussed.…

  15. Modeling as a research tool in poultry science.

    PubMed

    Gous, R M

    2014-01-01

    The World's Poultry Science Association (WPSA) is a long-established and unique organization that strives to advance knowledge and understanding of all aspects of poultry science and the poultry industry. Its 3 main aims are education, organization, and research. The WPSA Keynote Lecture, titled "Modeling as a research tool in poultry science," addresses 2 of these aims, namely, the value of modeling in research and education. The role of scientists is to put forward and then to test theories. These theories, or models, may be simple or highly complex, but they are aimed at improving our understanding of a system or the interaction between systems. In developing a model, the scientist must take into account existing knowledge, and in this process gaps in our knowledge of a system are identified. Useful ideas for research are generated in this way, and experiments may be designed specifically to address these issues. The resultant models become more accurate and more useful, and can be used in education and extension as a means of explaining many of the complex issues that arise in poultry science.

  16. Modelling of Tool Wear and Residual Stress during Machining of AISI H13 Tool Steel

    NASA Astrophysics Data System (ADS)

    Outeiro, José C.; Umbrello, Domenico; Pina, José C.; Rizzuti, Stefania

    2007-05-01

Residual stresses can enhance or impair the ability of a component to withstand loading conditions in service (fatigue, creep, stress corrosion cracking, etc.), depending on whether they are compressive or tensile, respectively. This poses enormous problems in structural assembly, as it affects the structural integrity of the whole part. In addition, tool wear is of critical importance in manufacturing, since it affects component quality, tool life and machining cost. Therefore, prediction and control of both tool wear and residual stresses in machining are absolutely necessary. In this work, a two-dimensional finite element model using an implicit Lagrangian formulation with automatic remeshing was applied to simulate the orthogonal cutting process of AISI H13 tool steel. To validate this model, the predicted and experimentally measured chip geometry, cutting forces, temperatures, tool wear and residual stresses in the machined affected layers were compared. The proposed FE model allowed us to investigate the influence of tool geometry, cutting regime parameters and tool wear on the residual stress distribution in the machined surface and subsurface of AISI H13 tool steel. The results permit the conclusion that, in order to reduce the magnitude of surface residual stresses, the cutting speed should be increased, the uncut chip thickness (or feed) should be reduced, and machining with honed tools having large cutting-edge radii produces better results than with chamfered tools. Moreover, increasing tool wear increases the magnitude of surface residual stresses.

  17. Advanced Technology System Scheduling Governance Model

    SciTech Connect

    Ang, Jim; Carnes, Brian; Hoang, Thuc; Vigil, Manuel

    2015-06-11

    In the fall of 2005, the Advanced Simulation and Computing (ASC) Program appointed a team to formulate a governance model for allocating resources and scheduling the stockpile stewardship workload on ASC capability systems. This update to the original document takes into account the new technical challenges and roles for advanced technology (AT) systems and the new ASC Program workload categories that must be supported. The goal of this updated model is to effectively allocate and schedule AT computing resources among all three National Nuclear Security Administration (NNSA) laboratories for weapons deliverables that merit priority on this class of resource. The process outlined below describes how proposed work can be evaluated and approved for resource allocations while preserving high effective utilization of the systems. This approach will provide the broadest possible benefit to the Stockpile Stewardship Program (SSP).

  18. Simulation Tools Model Icing for Aircraft Design

    NASA Technical Reports Server (NTRS)

    2012-01-01

LEWICE has evolved over the years from strictly a research tool to one used routinely by industry and other government agencies. Glenn contractor William Wright has been the architect of this development, supported by a team of researchers investigating icing physics, creating validation data, and ensuring development according to standard software engineering practices. The program provides a virtual simulation environment for determining where water droplets strike an airfoil in flight, what kind of ice would result, and what shape that ice would take. Users can enter geometries for specific, two-dimensional cross sections of an airfoil or other airframe surface and then apply a range of inputs - different droplet sizes, temperatures, airspeeds, and more - to model how ice would build up on the surface in various conditions. The program's versatility, ease of use, and speed - LEWICE can run through complex icing simulations in only a few minutes - have contributed to it becoming a popular resource in the aviation industry.

  19. Combustion modeling in advanced gas turbine systems

    SciTech Connect

    Smoot, L.D.; Hedman, P.O.; Fletcher, T.H.

    1995-10-01

The goal of the U.S. Department of Energy's Advanced Turbine Systems (ATS) program is to help develop and commercialize ultra-high efficiency, environmentally superior, and cost competitive gas turbine systems for base-load applications in the utility, independent power producer, and industrial markets. Combustion modeling, including emission characteristics, has been identified as a needed, high-priority technology by key professionals in the gas turbine industry.

  20. Advanced Atmospheric Modeling for Emergency Response.

    NASA Astrophysics Data System (ADS)

    Fast, Jerome D.; O'Steen, B. Lance; Addis, Robert P.

    1995-03-01

    Atmospheric transport and diffusion models are an important part of emergency response systems for industrial facilities that have the potential to release significant quantities of toxic or radioactive material into the atmosphere. An advanced atmospheric transport and diffusion modeling system for emergency response and environmental applications, based upon a three-dimensional mesoscale model, has been developed for the U.S. Department of Energy's Savannah River Site so that complex, time-dependent flow fields not explicitly measured can be routinely simulated. To overcome some of the current computational demands of mesoscale models, two operational procedures for the advanced atmospheric transport and diffusion modeling system are described including 1) a semiprognostic calculation to produce high-resolution wind fields for local pollutant transport in the vicinity of the Savannah River Site and 2) a fully prognostic calculation to produce a regional wind field encompassing the southeastern United States for larger-scale pollutant problems. Local and regional observations and large-scale model output are used by the mesoscale model for the initial conditions, lateral boundary conditions, and four-dimensional data assimilation procedure. This paper describes the current status of the modeling system and presents two case studies demonstrating the capabilities of both modes of operation. While the results from the case studies shown in this paper are preliminary and certainly not definitive, they do suggest that the mesoscale model has the potential for improving the prognostic capabilities of atmospheric modeling for emergency response at the Savannah River Site. Long-term model evaluation will be required to determine under what conditions significant forecast errors exist.

  1. Advances in the identification of transfer function models using Prony analysis

    SciTech Connect

    Trudnowski, D.J.; Donnelly, M.K.; Hauer, J.F.

    1993-06-01

    This paper further advances the usefulness and understanding of Prony analysis as a tool for identification of models. The presented results allow more generality in the assumed model formulation. In addition, a comparison is made between Prony analysis and autoregressive moving-average (ARMA) modeling. Special attention is given to system conditions often encountered with power system electromechanical dynamics.
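For readers unfamiliar with the technique, a minimal classical Prony fit (a textbook sketch, not the paper's extended formulation) identifies damped sinusoidal modes from sampled data by solving a linear-prediction problem, rooting the characteristic polynomial, and recovering amplitudes by least squares:

```python
import numpy as np

def prony(y, dt, order):
    """Classical Prony fit: y[n] ~ sum_k A_k * exp(s_k * n * dt).

    Returns complex amplitudes A and continuous-time poles s
    (real part = damping, imaginary part = angular frequency).
    """
    N = len(y)
    # 1) Linear prediction: y[n] = -(a1*y[n-1] + ... + ap*y[n-p]) for n >= p
    T = np.column_stack([y[order - m:N - m] for m in range(1, order + 1)])
    a, *_ = np.linalg.lstsq(T, -y[order:], rcond=None)
    # 2) Discrete-time poles are the roots of the characteristic polynomial
    z = np.roots(np.concatenate(([1.0], a)))
    s = np.log(z) / dt
    # 3) Amplitudes via least squares against the pole Vandermonde matrix
    V = np.vander(z, N, increasing=True).T      # V[n, k] = z_k ** n
    A, *_ = np.linalg.lstsq(V, y.astype(complex), rcond=None)
    return A, s

# Recover a single damped mode (order 2: one conjugate pole pair)
dt = 0.01
t = dt * np.arange(300)
y = 2.0 * np.exp(-0.5 * t) * np.cos(2 * np.pi * 3.0 * t)
A, s = prony(y, dt, order=2)
print(s)   # poles near -0.5 +/- 18.85j  (3 Hz, damping 0.5)
```

For power system electromechanical dynamics, the poles map directly to mode frequency and damping, which is what makes the method attractive as an identification tool.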

  2. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 9: Tool and Die, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  3. Modeling and Tool Wear in Routing of CFRP

    SciTech Connect

    Iliescu, D.; Fernandez, A.; Gutierrez-Orrantia, M. E.; Lopez de Lacalle, L. N.

    2011-01-17

This paper presents the prediction and evaluation of feed force in routing of carbon composite material. In order to extend tool life and improve the quality of the machined surface, a better understanding of uncoated and coated tool behavior is required. This work describes (1) the optimization of the geometry of multiple-teeth tools, minimizing tool wear and feed force, (2) the optimization of the tool coating and (3) the development of a phenomenological model relating the feed force to the routing parameters and the tool wear. The experimental results indicate that the feed rate, the cutting speed and the tool wear are the most significant factors affecting the feed force. In the case of multiple-teeth tools, a particular geometry with 14 teeth right helix right cut and 11 teeth left helix right cut gives the best results. A thick AlTiN coating or a diamond coating can dramatically improve tool life while minimizing the axial force, roughness and delamination. A wear model has then been developed based on the abrasive behavior of the tool. The model links the feed force to the tool geometry parameters (tool diameter), to the process parameters (feed rate, cutting speed and depth of cut) and to the wear. The model presented has been verified by experimental tests.
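A phenomenological model of this kind is often fitted in log space. The sketch below uses an invented power-law-times-exponential form with synthetic data (the functional form and all coefficients are assumptions for illustration, not the paper's fitted model):

```python
import numpy as np

# Hypothetical log-linear form for the feed force:
#   F = k * f^a * Vc^b * exp(c * W)
# with f = feed rate, Vc = cutting speed, W = flank wear. Taking logs turns the
# fit into ordinary least squares; synthetic "measurements" stand in for data.
rng = np.random.default_rng(0)
f  = rng.uniform(0.05, 0.4, 40)     # feed rate (mm/tooth)
vc = rng.uniform(100.0, 400.0, 40)  # cutting speed (m/min)
w  = rng.uniform(0.0, 0.3, 40)      # flank wear (mm)
F  = 50.0 * f**0.8 * vc**0.2 * np.exp(2.0 * w)  # synthetic feed force (N)

# Design matrix for ln F = ln k + a*ln f + b*ln Vc + c*W
X = np.column_stack([np.ones_like(f), np.log(f), np.log(vc), w])
coef, *_ = np.linalg.lstsq(X, np.log(F), rcond=None)
k, a, b, c = np.exp(coef[0]), coef[1], coef[2], coef[3]
print(round(k, 3), round(a, 3), round(b, 3), round(c, 3))   # 50.0 0.8 0.2 2.0
```

With noise-free data the coefficients are recovered exactly; with real measurements the same fit yields the dominant sensitivities (feed rate, speed, wear) reported in the abstract.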

  4. Accelerating advances in continental domain hydrologic modeling

    USGS Publications Warehouse

    Archfield, Stacey A.; Clark, Martyn; Arheimer, Berit; Hay, Lauren E.; McMillan, Hilary; Kiang, Julie E.; Seibert, Jan; Hakala, Kirsti; Bock, Andrew R.; Wagener, Thorsten; Farmer, William H.; Andreassian, Vazken; Attinger, Sabine; Viglione, Alberto; Knight, Rodney; Markstrom, Steven; Over, Thomas M.

    2015-01-01

    In the past, hydrologic modeling of surface water resources has mainly focused on simulating the hydrologic cycle at local to regional catchment modeling domains. There now exists a level of maturity among the catchment, global water security, and land surface modeling communities such that these communities are converging toward continental domain hydrologic models. This commentary, written from a catchment hydrology community perspective, provides a review of progress in each community toward this achievement, identifies common challenges the communities face, and details immediate and specific areas in which these communities can mutually benefit one another from the convergence of their research perspectives. Those include: (1) creating new incentives and infrastructure to report and share model inputs, outputs, and parameters in data services and open access, machine-independent formats for model replication or reanalysis; (2) ensuring that hydrologic models have: sufficient complexity to represent the dominant physical processes and adequate representation of anthropogenic impacts on the terrestrial water cycle, a process-based approach to model parameter estimation, and appropriate parameterizations to represent large-scale fluxes and scaling behavior; (3) maintaining a balance between model complexity and data availability as well as uncertainties; and (4) quantifying and communicating significant advancements toward these modeling goals.

  5. Accelerating advances in continental domain hydrologic modeling

    NASA Astrophysics Data System (ADS)

    Archfield, Stacey A.; Clark, Martyn; Arheimer, Berit; Hay, Lauren E.; McMillan, Hilary; Kiang, Julie E.; Seibert, Jan; Hakala, Kirsti; Bock, Andrew; Wagener, Thorsten; Farmer, William H.; Andréassian, Vazken; Attinger, Sabine; Viglione, Alberto; Knight, Rodney; Markstrom, Steven; Over, Thomas

    2015-12-01

    In the past, hydrologic modeling of surface water resources has mainly focused on simulating the hydrologic cycle at local to regional catchment modeling domains. There now exists a level of maturity among the catchment, global water security, and land surface modeling communities such that these communities are converging toward continental domain hydrologic models. This commentary, written from a catchment hydrology community perspective, provides a review of progress in each community toward this achievement, identifies common challenges the communities face, and details immediate and specific areas in which these communities can mutually benefit one another from the convergence of their research perspectives. Those include: (1) creating new incentives and infrastructure to report and share model inputs, outputs, and parameters in data services and open access, machine-independent formats for model replication or reanalysis; (2) ensuring that hydrologic models have: sufficient complexity to represent the dominant physical processes and adequate representation of anthropogenic impacts on the terrestrial water cycle, a process-based approach to model parameter estimation, and appropriate parameterizations to represent large-scale fluxes and scaling behavior; (3) maintaining a balance between model complexity and data availability as well as uncertainties; and (4) quantifying and communicating significant advancements toward these modeling goals.

  6. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 1: Executive Summary of a 15-Volume Set of Skills Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology (MAST) consortium was formed to address the shortage of skilled workers for the machine tools and metals-related industries. Featuring six of the nation's leading advanced technology centers, the MAST consortium developed, tested, and disseminated industry-specific skill standards and model curricula for…

  7. A National Strategy for Advancing Climate Modeling

    SciTech Connect

    Dunlea, Edward; Elfring, Chris

    2012-12-04

    Climate models are the foundation for understanding and projecting climate and climate-related changes and are thus critical tools for supporting climate-related decision making. This study developed a holistic strategy for improving the nation's capability to accurately simulate climate and related Earth system changes on decadal to centennial timescales. The committee's report is a high level analysis, providing a strategic framework to guide progress in the nation's climate modeling enterprise over the next 10-20 years. This study was supported by DOE, NSF, NASA, NOAA, and the intelligence community.

  8. Multiscale and Multiphysics Modeling of Additive Manufacturing of Advanced Materials

    NASA Technical Reports Server (NTRS)

    Liou, Frank; Newkirk, Joseph; Fan, Zhiqiang; Sparks, Todd; Chen, Xueyang; Fletcher, Kenneth; Zhang, Jingwei; Zhang, Yunlu; Kumar, Kannan Suresh; Karnati, Sreekar

    2015-01-01

    The objective of this proposed project is to research and develop a prediction tool for advanced additive manufacturing (AAM) processes for advanced materials and develop experimental methods to provide fundamental properties and establish validation data. Aircraft structures and engines demand materials that are stronger, usable at much higher temperatures, provide less acoustic transmission, and enable more aeroelastic tailoring than those currently used. Significant improvements in properties can only be achieved by processing the materials under nonequilibrium conditions, such as AAM processes. AAM processes encompass a class of processes that use a focused heat source to create a melt pool on a substrate. Examples include Electron Beam Freeform Fabrication and Direct Metal Deposition. These types of additive processes enable fabrication of parts directly from CAD drawings. To achieve the desired material properties and geometries of the final structure, numerical modeling is necessary as an effective prediction tool for assessing the impact of process parameters and predicting optimized conditions. The processing targets are multiple and span different spatial scales, and the associated physical phenomena are multiphysics and multiscale in nature. In this project, the research work has been developed to model AAM processes with a multiscale and multiphysics approach. A macroscale model was developed to investigate the residual stresses and distortion in AAM processes. A sequentially coupled, thermomechanical, finite element model was developed and validated experimentally. The results showed the temperature distribution, residual stress, and deformation within the formed deposits and substrates. A mesoscale model was developed to include heat transfer, phase change with a mushy zone, incompressible free surface flow, solute redistribution, and surface tension. Because of the excessive computing time required, a parallel computing approach was also tested. In addition…

  9. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

    This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high-fidelity, geometry-based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line, and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., which facilitated rapid finite element analysis, sizing study, and weight optimization. The high-quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications for the structural design of a conventional aircraft and a high-altitude long-endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high-fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.

  10. An Introduction to the Advanced Tracking and Resource Tool for Archive Collections (ATRAC)

    NASA Astrophysics Data System (ADS)

    Roberts, K.; Ritchey, N. A.; Jones, P.; Brown, H.

    2011-12-01

    The National Climatic Data Center (NCDC) has stepped up to meet the demand of today's exponential growth of archive projects and datasets by creating a web-based tool for managing and tracking data archiving, the Advanced Tracking and Resource tool for Archive Collections (ATRAC). ATRAC allows users to enter, share, and display information for an archive project. User-friendly forms collect new input or use existing components of information in the system. The tool generates archive documents in various formats from the input and can automatically notify stakeholders of important project milestones. Current information on projects, tasks, and events is displayed in configurable timeframes, with viewing rights set by the project stakeholders. This presentation will demonstrate ATRAC's latest features and how the capabilities of ATRAC can improve project communication and workflow.

  11. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to corroborate well with the extensive experimental investigations, which were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. A low relative error (RE = 0.09) and a high Willmott d-index (d = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater. PMID:26856870
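    The two goodness-of-fit measures quoted in this abstract can be computed directly. The sketch below uses Willmott's standard index-of-agreement formula and a mean absolute relative error; the sample data are hypothetical placeholders, not values from the paper.

    ```python
    def willmott_d(observed, predicted):
        """Willmott's index of agreement: 1 = perfect match, 0 = no agreement."""
        o_mean = sum(observed) / len(observed)
        num = sum((p - o) ** 2 for o, p in zip(observed, predicted))
        den = sum((abs(p - o_mean) + abs(o - o_mean)) ** 2
                  for o, p in zip(observed, predicted))
        return 1.0 - num / den

    def relative_error(observed, predicted):
        """Mean absolute relative error of predictions against observations."""
        return sum(abs(p - o) / abs(o)
                   for o, p in zip(observed, predicted)) / len(observed)

    observed  = [10.2, 11.5, 9.8, 12.1, 10.9]   # e.g. measured permeate flux (illustrative)
    predicted = [10.0, 11.8, 9.5, 12.4, 11.0]   # model-predicted values (illustrative)

    print(f"d  = {willmott_d(observed, predicted):.3f}")
    print(f"RE = {relative_error(observed, predicted):.3f}")
    ```

    The index approaches 1 as predictions track observations, which is why a d-index of 0.981 is read as high model reliability.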

  13. Modeling: The Right Tool for the Job.

    ERIC Educational Resources Information Center

    Gavanasen, Varut; Hussain, S. Tariq

    1993-01-01

    Reviews the different types of models that can be used in groundwater modeling. Discusses flow and contaminant transport models in the saturated zone, flow and contaminant transport in the variably saturated flow regime, vapor transport, biotransformation models, multiphase models, optimization algorithms, and potential pitfalls of using these…

  14. Rapid implementation of advanced constitutive models

    NASA Astrophysics Data System (ADS)

    Starman, Bojan; Halilovič, Miroslav; Vrh, Marko; Štok, Boris

    2013-12-01

    This paper presents a methodology based on the NICE integration scheme [1, 2] for simple and rapid numerical implementation of a class of plasticity constitutive models. In this regard, an algorithm is purposely developed for the implementation of newly developed advanced constitutive models into an explicit finite element framework. The methodology organizes the problem state variables into an extended form, which allows the constitutive models' equations to be arranged in such a way that the algorithm can optionally be extended, with minimal effort, to also integrate evolution equations describing other specific phenomena, such as damage, distortional hardening, phase transitions, degradation, etc. To confirm the simplicity of the program implementation and the computational robustness, effectiveness, and improved accuracy of the implemented integration algorithm, a deep-drawing simulation of a cylindrical cup, performed in ABAQUS/Explicit, is considered as the case study. As a fairly complex model, the YLD2004-18p model [3, 4] is first implemented via the external subroutine VUMAT. Further, to give additional proof of the simplicity of the proposed methodology, a combination of the YLD2004-18p model and the Gurson-Tvergaard-Needleman (GTN) model is considered. As demonstrated, the implementation is indeed obtained in a very simple way.

  15. Storm Water Management Model Climate Adjustment Tool (SWMM-CAT)

    EPA Science Inventory

    The US EPA’s newest tool, the Stormwater Management Model (SWMM) – Climate Adjustment Tool (CAT), is meant to help municipal stormwater utilities better address potential climate change impacts affecting their operations. SWMM, first released in 1971, models hydrology and hydrauli...

  16. THE ATMOSPHERIC MODEL EVALUATION TOOL (AMET); AIR QUALITY MODULE

    EPA Science Inventory

    This presentation reviews the development of the Atmospheric Model Evaluation Tool (AMET) air quality module. The AMET tool is being developed to aid in the model evaluation. This presentation focuses on the air quality evaluation portion of AMET. Presented are examples of the...

  17. Information Model for Machine-Tool-Performance Tests

    PubMed Central

    Lee, Y. Tina; Soons, Johannes A.; Donmez, M. Alkan

    2001-01-01

    This report specifies an information model of machine-tool-performance tests in the EXPRESS [1] language. The information model provides a mechanism for describing the properties and results of machine-tool-performance tests. The objective of the information model is a standardized, computer-interpretable representation that allows for efficient archiving and exchange of performance test data throughout the life cycle of the machine. The report also demonstrates the implementation of the information model using three different implementation methods. PMID:27500031
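    The actual information model is specified in the EXPRESS schema language; as an illustrative analogue only, the sketch below shows the same archiving/exchange idea with Python dataclasses. All class and field names here are hypothetical inventions, not the report's schema, though ISO 230-2 is a real machine-tool positioning-accuracy test procedure of the kind such a record might reference.

    ```python
    from dataclasses import dataclass, field, asdict
    from typing import List

    @dataclass
    class MeasurementPoint:
        axis_position_mm: float
        error_um: float            # measured positioning error, micrometers

    @dataclass
    class PerformanceTest:
        machine_id: str
        test_name: str             # e.g. linear positioning accuracy
        standard: str              # test procedure followed
        points: List[MeasurementPoint] = field(default_factory=list)

    test = PerformanceTest("VMC-01", "linear positioning, X axis", "ISO 230-2")
    test.points.append(MeasurementPoint(0.0, 0.0))
    test.points.append(MeasurementPoint(100.0, 1.8))

    # asdict() yields a computer-interpretable record suitable for archiving
    # or exchange throughout the machine's life cycle
    record = asdict(test)
    print(record["machine_id"], len(record["points"]))
    ```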

  18. Advances in Sun-Earth Connection Modeling

    NASA Astrophysics Data System (ADS)

    Ganguli, S. B.; Gavrishchaka, V. V.

    2003-06-01

    Space weather forecasting is the focus of a multidisciplinary research effort motivated by the sensitive dependence of many modern technologies on geospace conditions. Adequate understanding of the physics of the Sun-Earth connection and the associated multi-scale magnetospheric and ionospheric processes is an essential part of this effort. Modern physical simulation models, such as multimoment multifluid models with effective coupling from small-scale kinetic processes, can provide valuable insight into the role of various physical mechanisms operating during geomagnetic storm/substorm activity. However, due to necessary simplifying assumptions, physical models are still not well suited for accurate real-time forecasting. A complementary approach includes data-driven models capable of efficient processing of multi-scale spatio-temporal data. However, the majority of advanced nonlinear algorithms, including neural networks (NNs), can encounter a set of problems known as the curse of dimensionality when applied to high-dimensional data. Forecasting of rare/extreme events such as large geomagnetic storms/substorms is of the greatest practical importance but is also very challenging for many existing models. A very promising algorithm that combines the power of the best nonlinear techniques with tolerance to high-dimensional and incomplete data is the support vector machine (SVM). We have summarized the advantages of the SVM and described a hybrid model based on SVM and extreme value theory (EVT) for rare-event forecasting. Results of the SVM application to substorm forecasting and future directions are discussed.
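    One reason SVM-type classifiers are attractive for rare-event data is that misclassifying the rare class can be penalized more heavily through class weights. The toy sketch below illustrates that idea with a weighted linear SVM trained by sub-gradient descent on the hinge loss (a Pegasos-style scheme); the data, weights, and parameters are illustrative inventions, not the authors' model.

    ```python
    import random

    def train_weighted_linear_svm(X, y, class_weight, lam=0.01, epochs=200, seed=0):
        """Toy linear SVM via sub-gradient descent on the weighted hinge loss.
        class_weight up-weights the rare class so its margin violations cost more,
        a common remedy for imbalanced (rare-event) classification."""
        rng = random.Random(seed)
        w, b, t = [0.0] * len(X[0]), 0.0, 0
        idx = list(range(len(X)))
        for _ in range(epochs):
            rng.shuffle(idx)
            for i in idx:
                t += 1
                eta = 1.0 / (lam * t)
                cw = class_weight[y[i]]
                margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
                # shrink w (regularization), then push toward violated margins
                w = [(1 - eta * lam) * wj for wj in w]
                if margin < 1:
                    w = [wj + eta * cw * y[i] * xj for wj, xj in zip(w, X[i])]
                    b += eta * cw * y[i]
        return w, b

    def predict(w, b, x):
        return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

    # Imbalanced toy data: "quiet" intervals (y = -1) far outnumber "storm" ones (y = +1)
    quiet  = [[0.1, 0.2], [0.3, 0.1], [0.2, 0.3], [0.4, 0.2],
              [0.1, 0.4], [0.3, 0.3], [0.2, 0.1], [0.4, 0.4]]
    storms = [[2.0, 2.2], [2.3, 1.9]]
    X = quiet + storms
    y = [-1] * len(quiet) + [1] * len(storms)

    w, b = train_weighted_linear_svm(X, y, class_weight={-1: 1.0, 1: 4.0})
    print(predict(w, b, [2.1, 2.0]))   # a storm-like point
    print(predict(w, b, [0.2, 0.2]))   # a quiet-interval point
    ```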

  19. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  20. Advanced modeling of prompt fission neutrons

    SciTech Connect

    Talou, Patrick

    2009-01-01

    Theoretical and numerical studies of prompt fission neutrons are presented. The main results of the Los Alamos model often used in nuclear data evaluation work are reviewed briefly, and a preliminary assessment of uncertainties associated with the evaluated prompt fission neutron spectrum for n (0.5 MeV) + ²³⁹Pu is discussed. Advanced modeling of prompt fission neutrons is done by Monte Carlo simulations of the evaporation process of the excited primary fission fragments. The successive emissions of neutrons are followed in the statistical formalism framework, and detailed information, beyond average quantities, can be inferred. This approach is applied to the following reactions: ²⁵²Cf (sf), thermal n + ²³⁹Pu, n (0.5 MeV) + ²³⁵U, and ²³⁶Pu (sf). A discussion on the merits and present limitations of this approach concludes this presentation.

  1. Prospects for Advanced RF Theory and Modeling

    SciTech Connect

    Batchelor, D.B.

    1999-04-12

    This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need RF to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed.

  2. Tampa Bay Water Clarity Model (TBWCM): As a Predictive Tool

    EPA Science Inventory

    The Tampa Bay Water Clarity Model was developed as a predictive tool for estimating the impact of changing nutrient loads on water clarity as measured by secchi depth. The model combines a physical mixing model with an irradiance model and nutrient cycling model. A 10 segment bi...

  3. Model-based advanced process control of coagulation.

    PubMed

    Baxter, C W; Shariff, R; Stanley, S J; Smith, D W; Zhang, Q; Saumer, E D

    2002-01-01

    The drinking water treatment industry has seen a recent increase in the use of artificial neural networks (ANNs) for process modelling and offline process control tools and applications. While conceptual frameworks for integrating ANN technology into the real-time control of complex treatment processes have been proposed, actual working systems have yet to be developed. This paper presents the development and application of an ANN model-based advanced process control system for the coagulation process at a pilot-scale water treatment facility in Edmonton, Alberta, Canada. The system was successfully used to maintain a user-defined set point for effluent quality by automatically varying operating conditions in response to changes in influent water quality. This new technology has the potential to realize significant operational cost savings for utilities when applied in full-scale applications.

  4. Advancing an Information Model for Environmental Observations

    NASA Astrophysics Data System (ADS)

    Horsburgh, J. S.; Aufdenkampe, A. K.; Hooper, R. P.; Lehnert, K. A.; Schreuders, K.; Tarboton, D. G.; Valentine, D. W.; Zaslavsky, I.

    2011-12-01

    have been modified to support data management for the Critical Zone Observatories (CZOs). This paper will present limitations of the existing information model used by the CUAHSI HIS that have been uncovered through its deployment and use, as well as new advances to the information model, including: better representation of both in situ observations from field sensors and observations derived from environmental samples, extensibility in attributes used to describe observations, and observation provenance. These advances have been developed by the HIS team and the broader scientific community and will enable the information model to accommodate and better describe wider classes of environmental observations and to better meet the needs of the hydrologic science and CZO communities.

  5. Scratch as a computational modelling tool for teaching physics

    NASA Astrophysics Data System (ADS)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-05-01

    The Scratch online authoring tool, which features a simple programming language adapted to primary and secondary students, is being used more and more in schools, as it offers students and teachers the opportunity to build scientific models and evaluate their behaviour, just as can be done with computational modelling programs. In this article, we briefly discuss why Scratch could be a useful tool for computational modelling in the primary or secondary physics classroom, and we present practical examples of how it can be used to build a model.
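    A typical classroom model of this kind updates velocity and position once per loop tick. The sketch below is a hypothetical example written in Python rather than Scratch, showing the same step-by-step logic a Scratch "forever" loop would implement; all values are illustrative and not taken from the article.

    ```python
    # Constant-acceleration free fall via simple per-tick updates,
    # mirroring Scratch blocks: "change v by g*dt", "change y by -v*dt".
    def simulate_fall(height, dt=0.01, g=9.81):
        """Euler-step free fall from rest; returns (fall time, impact speed)."""
        y, v, t = height, 0.0, 0.0
        while y > 0:
            v += g * dt      # velocity update
            y -= v * dt      # position update
            t += dt
        return t, v

    t, v = simulate_fall(20.0)
    print(f"hits the ground after ~{t:.2f} s at ~{v:.1f} m/s")
    ```

    Comparing the simulated fall time against the analytic result sqrt(2h/g) is itself a useful classroom exercise on time-step error.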

  6. Shape: A 3D Modeling Tool for Astrophysics.

    PubMed

    Steffen, Wolfgang; Koning, Nicholas; Wenger, Stephan; Morisset, Christophe; Magnor, Marcus

    2011-04-01

    We present a flexible interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data type, and amount that is required for a reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model prediction with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been successfully used in a number of astrophysical research projects.

  7. Advances in Homology Protein Structure Modeling

    PubMed Central

    Xiang, Zhexin

    2007-01-01

    Homology modeling plays a central role in determining protein structure in the structural genomics project. The importance of homology modeling has been steadily increasing because of the large gap that exists between the overwhelming number of available protein sequences and experimentally solved protein structures, and also, more importantly, because of the increasing reliability and accuracy of the method. In fact, a protein sequence with over 30% identity to a known structure can often be predicted with an accuracy equivalent to a low-resolution X-ray structure. The recent advances in homology modeling, especially in detecting distant homologues, aligning sequences with template structures, modeling of loops and side chains, as well as detecting errors in a model, have contributed to reliable prediction of protein structure, which was not possible even several years ago. The ongoing efforts in solving protein structures, which can be time-consuming and often difficult, will continue to spur the development of a host of new computational methods that can fill in the gap and further contribute to understanding the relationship between protein structure and function. PMID:16787261

  8. Advanced Space Propulsion System Flowfield Modeling

    NASA Technical Reports Server (NTRS)

    Smith, Sheldon

    1998-01-01

    Solar thermal upper stage propulsion systems currently under development utilize small low-chamber-pressure/high-area-ratio nozzles. Consequently, the resulting flow in the nozzle is highly viscous, with the boundary layer flow comprising a significant fraction of the total nozzle flow area. Conventional uncoupled flow methods, which treat the nozzle boundary layer and inviscid flowfield separately by combining the two calculations via the influence of the boundary layer displacement thickness on the inviscid flowfield, are not accurate enough to adequately treat highly viscous nozzles. Navier-Stokes models such as VNAP2 can treat these flowfields but cannot perform a vacuum plume expansion for applications where the exhaust plume produces induced environments on adjacent structures. This study built upon recently developed artificial intelligence methods and user interface methodologies to couple the VNAP2 model for treating viscous nozzle flowfields with a vacuum plume flowfield model (RAMP2) that is currently part of the Plume Environment Prediction (PEP) model. This study integrated the VNAP2 code into the PEP model to produce an accurate, practical, and user-friendly tool for calculating highly viscous nozzle and exhaust plume flowfields.

  9. Tools and Equipment Modeling for Automobile Interactive Assembling Operating Simulation

    SciTech Connect

    Wu Dianliang; Zhu Hongmin

    2010-05-21

    Tools and equipment play an important role in the simulation of virtual assembly, especially in assembly process simulation and planning. Because of their variety in function and complexity in structure and manipulation, the simulation of tools and equipment remains a challenge for interactive assembly operation. Based on an analysis of the details and characteristics of interactive operations for automobile assembly, the functional requirements for tools and equipment of automobile assembly are given. Then, a unified modeling method for information expression and function realization of general tools and equipment is presented, and the handling methods for manual, semi-automatic, and automatic tools and equipment are discussed. Finally, an application in the assembly simulation of the rear and front suspensions of the Roewe 750 automobile is given. The result shows that the modeling and handling methods are applicable in the interactive simulation of various tools and equipment, and can also be used to support assembly process planning in a virtual environment.

  10. Novel Multiscale Modeling Tool Applied to Pseudomonas aeruginosa Biofilm Formation

    PubMed Central

    Biggs, Matthew B.; Papin, Jason A.

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool. PMID:24147108
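    The oxygen/nitrate result described above can be illustrated with a deliberately tiny hybrid loop: an agent-based layer iterates over cells while a stand-in function plays the role of the constraint-based metabolic model. Every name, rate, and threshold below is a hypothetical sketch for illustration, not MatNet's API or the published model.

    ```python
    def metabolic_growth_rate(oxygen, nitrate=0.0):
        """Surrogate for a constraint-based model: aerobic growth scales with O2;
        anaerobic (nitrate) respiration adds a smaller contribution."""
        return 0.5 * oxygen + 0.2 * nitrate

    def step_biofilm(cells, oxygen_profile, nitrate=0.0):
        """One ABM tick: each cell grows per its local environment; O2 is consumed."""
        new_cells = []
        for depth in cells:
            rate = metabolic_growth_rate(oxygen_profile[depth], nitrate)
            oxygen_profile[depth] = max(0.0, oxygen_profile[depth] - 0.1 * rate)
            if rate > 0.2:          # enough metabolic flux to divide
                new_cells.append(depth)
        return cells + new_cells, oxygen_profile

    # Surface (depth 0) is oxygen-rich; depth 3 is oxygen-limited.
    oxygen = {0: 1.0, 1: 0.6, 2: 0.3, 3: 0.05}

    aerobic, _ = step_biofilm([0, 1, 2, 3], dict(oxygen), nitrate=0.0)
    print(len(aerobic))    # only well-oxygenated cells divided

    with_nitrate, _ = step_biofilm([0, 1, 2, 3], dict(oxygen), nitrate=1.0)
    print(len(with_nitrate))   # nitrate lets deeper cells grow anaerobically
    ```

    Even this toy shows the hybrid pattern: the agent layer handles spatial structure while a separate metabolic function determines per-agent growth.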

  11. Smallpox Models as Policy Tools

    PubMed Central

    2004-01-01

    Mathematical models can help prepare for and respond to bioterrorism attacks, provided that their strengths and weaknesses are clearly understood. A series of initiatives within the Department of Health and Human Services brought modelers together with biologists and epidemiologists who specialize in smallpox and experts in bioterrorism response and health policy and has led to the parallel development of models with different technical approaches but standardized scenarios, parameter ranges, and outcome measures. Cross-disciplinary interactions throughout the process supported the development of models focused on systematically comparing alternate intervention strategies, determining the most important issues in decision-making, and identifying gaps in current knowledge. PMID:15550219

  12. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.
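    Conservative remapping of the kind TempestRemap provides can be illustrated in one dimension: each target cell averages source values weighted by overlap length, so the domain integral is preserved exactly. The grids and data below are illustrative; the actual toolset operates on unstructured spherical meshes at arbitrary order.

    ```python
    def conservative_remap_1d(src_edges, src_vals, tgt_edges):
        """Remap cell-averaged src_vals from src_edges onto tgt_edges,
        weighting each source contribution by its overlap with the target cell."""
        tgt_vals = []
        for i in range(len(tgt_edges) - 1):
            t0, t1 = tgt_edges[i], tgt_edges[i + 1]
            acc = 0.0
            for j in range(len(src_edges) - 1):
                s0, s1 = src_edges[j], src_edges[j + 1]
                overlap = max(0.0, min(t1, s1) - max(t0, s0))
                acc += src_vals[j] * overlap
            tgt_vals.append(acc / (t1 - t0))
        return tgt_vals

    src_edges = [0.0, 1.0, 2.0, 3.0, 4.0]
    src_vals = [1.0, 3.0, 2.0, 4.0]       # cell averages on the source grid
    tgt_edges = [0.0, 2.0, 4.0]           # a coarser target grid

    tgt_vals = conservative_remap_1d(src_edges, src_vals, tgt_edges)

    # Conservation check: the integral (sum of value x cell width) is unchanged
    src_integral = sum(v * (src_edges[j + 1] - src_edges[j])
                       for j, v in enumerate(src_vals))
    tgt_integral = sum(v * (tgt_edges[i + 1] - tgt_edges[i])
                       for i, v in enumerate(tgt_vals))
    print(tgt_vals, src_integral, tgt_integral)
    ```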

  13. An Analysis of Energy Savings Possible Through Advances in Automotive Tooling Technology

    SciTech Connect

    Rick Schmoyer, RLS

    2004-12-03

    The use of lightweight and highly formable advanced materials in automobile and truck manufacturing has the potential to save fuel. Advances in tooling technology would promote the use of these materials. This report describes an energy savings analysis performed to approximate the potential fuel savings, and consequent carbon-emission reductions, that would be possible because of advances in tooling in the manufacturing of, in particular, non-powertrain components of passenger cars and heavy trucks. Separate energy analyses are performed for cars and heavy trucks. Heavy trucks are considered to be Class 7 and 8 trucks (trucks rated over 26,000 lbs gross vehicle weight). A critical input to the analysis is a set of estimates of the percentage reductions in weight and drag that could be achieved by the implementation of advanced materials as a consequence of improved tooling technology; these were obtained by surveying tooling industry experts who attended a DOE workshop, Tooling Technology for Low-Volume Vehicle Production, held in Seattle and Detroit in October and November 2003. The analysis is also based on 2001 fuel consumption totals and on energy-audit component proportions of fuel use due to drag, rolling resistance, and braking. The consumption proportions are assumed constant over time, but an allowance is made for fleet growth. The savings for a particular component is then the product of total fuel consumption, the percentage reduction of the component, and the energy-audit component proportion. Fuel savings estimates for trucks also account for weight-limited versus volume-limited operations. Energy savings are assumed to be of two types: (1) direct energy savings incurred through reduced forces that must be overcome to move the vehicle or to slow it down in braking, and (2) indirect energy savings through reductions in the required engine power, the production and transmission of which incur thermodynamic losses, internal friction, and other…
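    The savings arithmetic described in this abstract is a simple per-component product: total fuel use, times the component's energy-audit share of that fuel use, times the fractional reduction. The figures below are illustrative placeholders, not the report's estimates.

    ```python
    def component_savings(total_fuel_gal, audit_share, pct_reduction):
        """Fuel saved for one loss component (drag, rolling resistance, braking)."""
        return total_fuel_gal * audit_share * pct_reduction

    total_fuel = 25_000_000_000          # hypothetical annual fleet fuel use, gallons
    components = {
        # name: (share of fuel attributed to component, fractional reduction)
        "aerodynamic drag":   (0.21, 0.05),
        "rolling resistance": (0.13, 0.04),
        "braking":            (0.06, 0.10),
    }

    total_saved = sum(component_savings(total_fuel, share, reduction)
                      for share, reduction in components.values())
    print(f"estimated savings: {total_saved / 1e6:.1f} million gallons/year")
    ```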

  14. Advancing Cyberinfrastructure to support high resolution water resources modeling

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Ogden, F. L.; Jones, N.; Horsburgh, J. S.

    2012-12-01

    Addressing the problem of how the availability and quality of water resources at large scales are sensitive to climate variability, watershed alterations, and management activities requires computational resources that combine data from multiple sources and support integrated modeling. Related cyberinfrastructure challenges include: 1) how can we best structure data and computer models to address this scientific problem through the use of high-performance and data-intensive computing, and 2) how can we do this in a way that discipline scientists without extensive computational and algorithmic knowledge and experience can take advantage of advances in cyberinfrastructure? This presentation will describe a new system called CI-WATER that is being developed to address these challenges and advance high-resolution water resources modeling in the Western U.S. We are building on existing tools that enable collaboration to develop model and data interfaces that link integrated system models running within an HPC environment to multiple data sources. Our goal is to enhance the use of computational simulation and data-intensive modeling to better understand water resources. Addressing water resource problems in the Western U.S. requires simulation of natural and engineered systems, as well as representation of legal (water rights) and institutional constraints alongside the representation of physical processes. We are establishing data services to represent the engineered infrastructure and the legal and institutional systems in a way that allows them to be used with high-resolution, multi-physics watershed modeling. These services will enable incorporation of location-specific information on water management infrastructure and systems into the assessment of regional water availability in the face of growing demands, uncertain future meteorological forcings, and existing prior-appropriations water rights. This presentation will discuss the informatics

  15. ARPENTEUR: a web-based photogrammetry tool for architectural modeling

    NASA Astrophysics Data System (ADS)

    Grussenmeyer, Pierre; Drap, Pierre

    2000-12-01

    ARPENTEUR is a web application for digital photogrammetry mainly dedicated to architecture. ARPENTEUR has been developed since 1998 by two French research teams: the 'Photogrammetry and Geomatics' group of the ENSAIS-LERGEC laboratory and the MAP-gamsau CNRS laboratory located in the school of Architecture of Marseille. The software package is a web-based tool, since photogrammetric concepts are embedded in web technology and the Java programming language. The aim of this project is to propose a photogrammetric software package and 3D modeling methods available on the Internet as applets through a simple browser. The use of Java and the web platform offers many advantages. Distributing software to any platform, at any place connected to the Internet, is of course very promising. Updating is done directly on the server, and the user always works with the latest release installed on the server. Three years ago the first prototype of ARPENTEUR was based on the Java Development Kit, at the time only available for some browsers. Nowadays, we are working with the JDK 1.3 plug-in enriched by the Java Advanced Imaging library.

  16. Robustness of thermal error compensation model of CNC machine tool

    NASA Astrophysics Data System (ADS)

    Lang, Xianli; Miao, Enming; Gong, Yayun; Niu, Pengcheng; Xu, Zhishang

    2013-01-01

    Thermal error is the major factor restricting the accuracy of CNC machining. Modeling accuracy is the key to thermal error compensation, which can achieve precision machining on a CNC machine tool. Traditional thermal error compensation models mostly focus on fitting accuracy without considering the robustness of the models, which makes it difficult to put the research results into practice. In this paper, a model-robustness experiment is carried out at different spindle speeds on a Leaderway V-450 machine tool. Temperature-sensitive points for thermal error are selected by combining fuzzy clustering with grey relational analysis. A multiple linear regression (MLR) model and a distributed lag (DL) model are established from the multi-batch experimental data, followed by a robustness analysis that demonstrates the difference between fitting precision and prediction precision in engineering application and provides a reference method for choosing a thermal error compensation model for CNC machine tools in practical engineering applications.
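    As a rough illustration of the MLR form of such a compensation model, the sketch below fits thermal error as a linear function of readings from temperature-sensitive points and then checks prediction precision on a held-out batch. All data are synthetic, not the paper's measurements:

    ```python
    import numpy as np

    # Thermal error model: error = b0 + sum_i b_i * T_i, where the T_i are readings
    # from temperature-sensitive points. Synthetic "ground truth" for illustration.
    rng = np.random.default_rng(0)
    T = rng.uniform(20.0, 45.0, size=(60, 2))          # two temperature-sensitive points
    true_coef = np.array([5.0, 0.8, 0.3])              # intercept + sensitivities
    error = true_coef[0] + T @ true_coef[1:] + rng.normal(0, 0.2, 60)

    X = np.column_stack([np.ones(len(T)), T])          # design matrix with intercept
    coef, *_ = np.linalg.lstsq(X, error, rcond=None)   # least-squares fit (fitting step)

    # Robustness check: residuals on a second, held-out batch, mimicking the paper's
    # distinction between fitting precision and prediction precision.
    T2 = rng.uniform(20.0, 45.0, size=(30, 2))
    error2 = true_coef[0] + T2 @ true_coef[1:] + rng.normal(0, 0.2, 30)
    pred2 = np.column_stack([np.ones(len(T2)), T2]) @ coef
    rmse = float(np.sqrt(np.mean((pred2 - error2) ** 2)))
    ```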

  17. Model Fusion Tool - the Open Environmental Modelling Platform Concept

    NASA Astrophysics Data System (ADS)

    Kessler, H.; Giles, J. R.

    2010-12-01

    The vision of an Open Environmental Modelling Platform is to seamlessly link geoscience data, concepts, and models to aid decision making in times of environmental change. Governments and their executive agencies across the world are facing increasing pressure to make decisions about the management of resources in light of population growth and environmental change. In the UK, for example, groundwater is becoming a scarce resource for large parts of its most densely populated areas. At the same time, river and groundwater flooding resulting from high rainfall events is increasing in scale and frequency, and sea level rise is threatening the defences of coastal cities. There is also a need for affordable housing, improved transport infrastructure and waste disposal, as well as sources of renewable energy and sustainable food production. These challenges can only be resolved if solutions are based on sound scientific evidence. Although we have knowledge and understanding of many individual processes in the natural sciences, it is clear that no single scientific discipline can answer these questions and their inter-relationships. Modern science increasingly employs computer models to simulate the natural, economic, and human systems. Management and planning require scenario modelling, forecasts and ‘predictions’. Although the outputs are often impressive in terms of apparent accuracy and visualisation, they are inherently not suited to simulating the response to feedbacks from other models of the earth system, such as the impact of human actions. Geological Survey Organisations (GSOs) are increasingly employing advances in information technology to visualise and improve their understanding of geological systems. Instead of two-dimensional paper maps and reports, many GSOs now produce three-dimensional geological framework models and groundwater flow models as their standard output. Additionally, the British Geological Survey has developed standard routines to link geological

  18. Advances in the identification of electrochemical transfer function models using Prony analysis

    SciTech Connect

    Trudnowski, D.J.; Donnelly, M.K.; Hauer, J.F.

    1993-02-01

    This paper further advances the usefulness and understanding of Prony analysis as a tool for identifying power system electromechanical oscillation models. These linear models are developed by analyzing power system ring-down data. The results presented allow more generality in the assumed model formulation. In addition, a comparison is made between Prony analysis and autoregressive moving-average (ARMA) modeling, which has also been proposed for analysis of system oscillations. Under the conditions investigated, the Prony algorithm performed more accurate identification.
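    A textbook Prony fit (a minimal sketch of the general technique, not the paper's extended formulation) identifies damped sinusoid modes from ring-down samples in three steps: linear prediction, characteristic-polynomial rooting, and an amplitude fit:

    ```python
    import numpy as np

    def prony(y, p, dt):
        """Fit p damped-sinusoid modes to a uniformly sampled ring-down signal y.

        Returns continuous-time poles s_k (damping + j*2*pi*f) and complex amplitudes.
        Illustrative textbook implementation only.
        """
        N = len(y)
        # 1) Linear prediction: y[n] = sum_{k=1..p} a_k * y[n-k]
        A = np.column_stack([y[p - k - 1:N - k - 1] for k in range(p)])
        a, *_ = np.linalg.lstsq(A, y[p:N], rcond=None)
        # 2) Roots of z^p - a_1 z^{p-1} - ... - a_p give discrete-time poles z_k
        z = np.roots(np.concatenate(([1.0], -a)))
        s = np.log(z) / dt                       # map to continuous-time poles
        # 3) Amplitudes from a Vandermonde least-squares fit: y[n] = sum_k b_k z_k^n
        V = np.vander(z, N, increasing=True).T
        b, *_ = np.linalg.lstsq(V, y.astype(complex), rcond=None)
        return s, b

    # Synthetic ring-down: one 0.5 Hz mode with 0.2 /s damping (an electromechanical
    # oscillation range chosen for illustration).
    dt = 0.05
    t = np.arange(0, 10, dt)
    y = np.exp(-0.2 * t) * np.cos(2 * np.pi * 0.5 * t)
    s, b = prony(y, 2, dt)   # recovers poles near -0.2 +/- j*pi
    ```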

  19. Scratch as a Computational Modelling Tool for Teaching Physics

    ERIC Educational Resources Information Center

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  20. Model atmospheres - Tool for identifying interstellar features

    NASA Technical Reports Server (NTRS)

    Frisch, P. C.; Slojkowski, S. E.; Rodriguez-Bell, T.; York, D.

    1993-01-01

    Model atmosphere parameters are derived for 14 early A stars with rotation velocities, from optical spectra, in excess of 80 km/s. The models are compared with IUE observations of the stars in regions where interstellar lines are expected. In general, with the assumption of solar abundances, excellent fits are obtained in regions longward of 2580 A, and accurate interstellar equivalent widths can be derived using models to establish the continuum. The fits are poorer at shorter wavelengths, particularly at 2026-2062 A, where the stellar model parameters seem inadequate. Features indicating mass flows are evident in stars with known infrared excesses. In gamma TrA, variability in the Mg II lines is seen over the 5-year interval of these data, and also over timescales as short as 26 days. The present technique should be useful in systematic studies of episodic mass flows in A stars and for stellar abundance studies, as well as interstellar features.

  1. AFDM: An Advanced Fluid-Dynamics Model

    SciTech Connect

    Bohl, W.R.; Parker, F.R.; Wilhelm, D. (Inst. fuer Neutronenphysik und Reaktortechnik); Berthier, J.; Goutagny, L. (Inst. de Protection et de Surete Nucleaire); Ninokata,

    1990-09-01

    AFDM, or the Advanced Fluid-Dynamics Model, is a computer code that investigates new approaches to simulating the multiphase-flow fluid-dynamics aspects of severe accidents in fast reactors. The AFDM formalism starts with differential equations similar to those in the SIMMER-II code. These equations are modified to treat three velocity fields and supplemented with a variety of new models. The AFDM code has 12 topologies describing which material contacts are possible, depending on the presence or absence of a given material in a computational cell, on the dominant liquid, and on the continuous phase. Single-phase, bubbly, churn-turbulent, cellular, and dispersed flow regimes are permitted for the pool situations modeled. Virtual mass terms are included for vapor in liquid-continuous flow. Interfacial areas between the continuous and discontinuous phases are convected to allow some tracking of phenomenological histories. Interfacial areas are also modified by models of nucleation, dynamic forces, turbulence, flashing, coalescence, and mass transfer. Heat transfer is generally treated using engineering correlations. Liquid-vapor phase transitions are handled with a nonequilibrium, heat-transfer-limited model, whereas melting and freezing processes are based on equilibrium considerations. Convection is treated using a fractional-step method of time integration, including a semi-implicit pressure iteration. A higher-order differencing option is provided to control numerical diffusion. The Los Alamos SESAME equation of state has been implemented using densities and temperatures as the independent variables. All computational loops in AFDM are vectorized, consistent with the objective of producing an exportable code. 24 refs., 4 figs.

  2. Nonlinear Dynamic Models in Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Jones, Harry

    2002-01-01

    To facilitate analysis, ALS systems are often assumed to be linear and time invariant, but they usually have important nonlinear and dynamic aspects. Nonlinear dynamic behavior can be caused by time varying inputs, changes in system parameters, nonlinear system functions, closed loop feedback delays, and limits on buffer storage or processing rates. Dynamic models are usually cataloged according to the number of state variables. The simplest dynamic models are linear, using only integration, multiplication, addition, and subtraction of the state variables. A general linear model with only two state variables can produce all the possible dynamic behavior of linear systems with many state variables, including stability, oscillation, or exponential growth and decay. Linear systems can be described using mathematical analysis. Nonlinear dynamics can be fully explored only by computer simulations of models. Unexpected behavior is produced by simple models having only two or three state variables with simple mathematical relations between them. Closed loop feedback delays are a major source of system instability. Exceeding limits on buffer storage or processing rates forces systems to change operating mode. Different equilibrium points may be reached from different initial conditions. Instead of one stable equilibrium point, the system may have several equilibrium points, oscillate at different frequencies, or even behave chaotically, depending on the system inputs and initial conditions. The frequency spectrum of an output oscillation may contain harmonics and the sums and differences of input frequencies, but it may also contain a stable limit cycle oscillation not related to input frequencies. We must investigate the nonlinear dynamic aspects of advanced life support systems to understand and counter undesirable behavior.
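    The claim that simple two- or three-state models already exhibit rich behavior can be illustrated with the Van der Pol oscillator, a generic two-state nonlinear system (chosen here for illustration; it is not an ALS model): trajectories from very different initial conditions converge to the same stable limit-cycle oscillation.

    ```python
    import numpy as np

    def van_der_pol(state, mu=1.0):
        """Two-state nonlinear system: x' = v, v' = mu*(1 - x^2)*v - x."""
        x, v = state
        return np.array([v, mu * (1.0 - x * x) * v - x])

    def limit_cycle_amplitude(x0, v0, dt=0.01, steps=20000):
        """Integrate with RK4 and return the peak |x| after transients die out."""
        s = np.array([x0, v0], dtype=float)
        xs = []
        for _ in range(steps):
            k1 = van_der_pol(s)
            k2 = van_der_pol(s + 0.5 * dt * k1)
            k3 = van_der_pol(s + 0.5 * dt * k2)
            k4 = van_der_pol(s + dt * (k3))
            s = s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
            xs.append(abs(s[0]))
        return max(xs[-5000:])   # amplitude over the final stretch only

    # Two very different initial conditions reach the same limit cycle (amplitude ~2)
    a1 = limit_cycle_amplitude(0.1, 0.0)
    a2 = limit_cycle_amplitude(3.0, 0.0)
    ```

    The same skeleton, with a different right-hand side, is how buffer limits or feedback delays in a life support model would be explored numerically.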

  3. Applying computer simulation models as learning tools in fishery management

    USGS Publications Warehouse

    Johnson, B.L.

    1995-01-01

    Computer models can be powerful tools for addressing many problems in fishery management, but uncertainty about how to apply models and how they should perform can lead to a cautious approach to modeling. Within this approach, we expect models to make quantitative predictions but only after all model inputs have been estimated from empirical data and after the model has been tested for agreement with an independent data set. I review the limitations to this approach and show how models can be more useful as tools for organizing data and concepts, learning about the system to be managed, and exploring management options. Fishery management requires deciding what actions to pursue to meet management objectives. Models do not make decisions for us but can provide valuable input to the decision-making process. When empirical data are lacking, preliminary modeling with parameters derived from other sources can help determine priorities for data collection. When evaluating models for management applications, we should attempt to define the conditions under which the model is a useful, analytical tool (its domain of applicability) and should focus on the decisions made using modeling results, rather than on quantitative model predictions. I describe an example of modeling used as a learning tool for the yellow perch Perca flavescens fishery in Green Bay, Lake Michigan.

  4. Agent Based Modeling as an Educational Tool

    NASA Astrophysics Data System (ADS)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model based curriculum in the classroom given current and anticipated core and content standards. Example classroom models included a simulation using California GIS data and a simulation of high school student lunch popularity, using an aerial photograph overlaid on a terrain value map.

  5. Modelling of advanced structural materials for GEN IV reactors

    NASA Astrophysics Data System (ADS)

    Samaras, M.; Hoffelner, W.; Victoria, M.

    2007-09-01

    The choice of suitable materials and the assessment of long-term materials damage are key issues that need to be addressed for the safe and reliable performance of nuclear power plants. Operating conditions such as high temperatures, irradiation and a corrosive environment degrade materials properties, posing the risk of very expensive or even catastrophic plant damage. Materials scientists are faced with the scientific challenge to determine the long-term damage evolution of materials under service exposure in advanced plants. A higher confidence in life-time assessments of these materials requires an understanding of the related physical phenomena on a range of scales from the microscopic level of single defect damage effects all the way up to macroscopic effects. To overcome lengthy and expensive trial-and-error experiments, the multiscale modelling of materials behaviour is a promising tool, bringing new insights into the fundamental understanding of basic mechanisms. This paper presents the multiscale modelling methodology which is taking root internationally to address the issues of advanced structural materials for Gen IV reactors.

  6. Irena : tool suite for modeling and analysis of small-angle scattering.

    SciTech Connect

    Ilavsky, J.; Jemian, P.

    2009-04-01

    Irena, a tool suite for analysis of both X-ray and neutron small-angle scattering (SAS) data within the commercial Igor Pro application, brings together a comprehensive suite of tools useful for investigations in materials science, physics, chemistry, polymer science and other fields. In addition to Guinier and Porod fits, the suite combines a variety of advanced SAS data evaluation tools for the modeling of size distribution in the dilute limit using maximum entropy and other methods, dilute limit small-angle scattering from multiple non-interacting populations of scatterers, the pair-distance distribution function, a unified fit, the Debye-Bueche model, the reflectivity (X-ray and neutron) using Parratt's formalism, and small-angle diffraction. There are also a number of support tools, such as a data import/export tool supporting a broad sampling of common data formats, a data modification tool, a presentation-quality graphics tool optimized for small-angle scattering data, and a neutron and X-ray scattering contrast calculator. These tools are brought together into one suite with consistent interfaces and functionality. The suite allows robust automated note recording and saving of parameters during export.
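    As an illustration of the simplest analysis in the list, a Guinier fit reduces to a straight-line fit of ln I(q) against q², since ln I = ln I₀ − (Rg²/3) q² in the Guinier regime. The sketch below uses synthetic data in Python, whereas Irena itself runs inside Igor Pro:

    ```python
    import numpy as np

    # Synthetic small-angle scattering data following the Guinier law exactly.
    I0_true, Rg_true = 100.0, 25.0                 # forward intensity, radius of gyration
    q = np.linspace(0.005, 0.05, 40)               # chosen so q*Rg stays in the Guinier regime
    I = I0_true * np.exp(-(q * Rg_true) ** 2 / 3.0)

    # Linearize and fit: ln I = ln I0 - (Rg^2 / 3) * q^2
    slope, intercept = np.polyfit(q ** 2, np.log(I), 1)
    Rg_fit = np.sqrt(-3.0 * slope)                 # recover Rg from the slope
    I0_fit = np.exp(intercept)                     # recover I0 from the intercept
    ```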

  7. EPRI PEAC Corp.: Certification Model Program and Interconnection Agreement Tools

    SciTech Connect

    Not Available

    2003-10-01

    Summarizes the work of EPRI PEAC Corp., under contract to DOE's Distribution and Interconnection R&D, to develop a certification model program and interconnection agreement tools to support the interconnection of distributed energy resources.

  8. Watershed modeling tools and data for prognostic and diagnostic

    NASA Astrophysics Data System (ADS)

    Chambel-Leitao, P.; Brito, D.; Neves, R.

    2009-04-01

    Trancoso, R., Braunschweig, F., Chambel-Leitão, P., Neves, R., Obermann, M. (2009) An advanced modelling tool for simulating complex river systems. Accepted for publication in Science of the Total Environment. Yarrow, M., Chambel-Leitão, P. (2006) Calibration of the SWAT model to the Aysén basin of the Chilean Patagonia: challenges and lessons. Proceedings of the Watershed Management to Meet Water Quality Standards and TMDLs (Total Maximum Daily Load), 10-14 March 2007, San Antonio, Texas, 701P0207. Yarrow, M., Chambel-Leitão, P. (2007) Simulating Nothofagus forests in the Chilean Patagonia: a test and analysis of tree growth and nutrient cycling in SWAT. Submitted to the Proceedings of the 4th International SWAT Conference, July 2-6, 2007. Yarrow, M., Chambel-Leitão, P. (2008) Estimation of loads in the Aysén Basin of the Chilean Patagonia: SWAT model and HARP-NUT guidelines. In: Perspectives on Integrated Coastal Zone Management in South America, R. Neves, J. Baretta & M. Mateus (eds.), IST Press, Lisbon, Portugal. (ISBN: 978-972-8469-74-0)

  9. EPA MODELING TOOLS FOR CAPTURE ZONE DELINEATION

    EPA Science Inventory

    The EPA Office of Research and Development supports a step-wise modeling approach for design of wellhead protection areas for water supply wells. A web-based WellHEDSS (wellhead decision support system) is under development for determining when simple capture zones (e.g., centri...

  10. Update on Small Modular Reactors Dynamics System Modeling Tool -- Molten Salt Cooled Architecture

    SciTech Connect

    Hale, Richard Edward; Cetiner, Sacit M.; Fugate, David L.; Qualls, A L.; Borum, Robert C.; Chaleff, Ethan S.; Rogerson, Doug W.; Batteh, John J.; Tiller, Michael M.

    2014-08-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the third year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled) concepts, including the use of multiple coupled reactors at a single site. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor SMR models, ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface (ICHMI) technical area, and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the Modular Dynamic SIMulation (MoDSIM) tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the program, (2) developing a library of baseline component modules that can be assembled into full plant models using existing geometry and thermal-hydraulic data, (3) defining modeling conventions for interconnecting component models, and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  11. Contemporary molecular tools in microbial ecology and their application to advancing biotechnology.

    PubMed

    Rashid, Mamoon; Stingl, Ulrich

    2015-12-01

    Novel methods in microbial ecology are revolutionizing our understanding of the structure and function of microbes in the environment, but concomitant advances in applications of these tools to biotechnology are mostly lagging behind. After more than a century of efforts to improve microbial culturing techniques, about 70-80% of microbial diversity - recently called the "microbial dark matter" - remains uncultured. In early attempts to identify and sample these so far uncultured taxonomic lineages, methods that amplify and sequence ribosomal RNA genes were extensively used. Recent developments in cell separation techniques, DNA amplification, and high-throughput DNA sequencing platforms have now made the discovery of genes/genomes of uncultured microorganisms from different environments possible through the use of metagenomic techniques and single-cell genomics. When used synergistically, these metagenomic and single-cell techniques create a powerful tool to study microbial diversity. These genomics techniques have already been successfully exploited to identify sources for i) novel enzymes or natural products for biotechnology applications, ii) novel genes from extremophiles, and iii) whole genomes or operons from uncultured microbes. More can be done to utilize these tools more efficiently in biotechnology.

  12. Modeling and analysis of advanced binary cycles

    SciTech Connect

    Gawlik, K.

    1997-12-31

    A computer model (Cycle Analysis Simulation Tool, CAST) and a methodology have been developed to perform value analysis for small, low- to moderate-temperature binary geothermal power plants. The value analysis method allows incremental changes in the levelized electricity cost (LEC) to be determined between a baseline plant and a modified plant. Thermodynamic cycle analyses and component sizing are carried out in the model, followed by an economic analysis which provides LEC results. The emphasis of the present work is on evaluating the effect of mixed working fluids instead of pure fluids on the LEC of a geothermal binary plant that uses a simple organic Rankine cycle. Four resources were studied spanning the range of 265°F to 375°F. A variety of isobutane- and propane-based mixtures, in addition to pure fluids, were used as working fluids. This study shows that the use of propane mixtures at a 265°F resource can reduce the LEC by 24% when compared to a base case value that utilizes commercial isobutane as its working fluid. The cost savings drop to 6% for a 375°F resource, where an isobutane mixture is favored. Supercritical cycles were found to have the lowest cost at all resources.
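    The value-analysis idea, comparing levelized cost between a baseline and a modified plant, can be sketched with a generic textbook LEC formula. The formula and all figures below are illustrative assumptions, not CAST's detailed thermodynamic and economic models:

    ```python
    def lec(capital_cost, fixed_charge_rate, annual_om, annual_kwh):
        """Simplified levelized electricity cost in $/kWh (generic textbook form)."""
        return (capital_cost * fixed_charge_rate + annual_om) / annual_kwh

    # Incremental value analysis: a hypothetical mixed-working-fluid modification
    # slightly raises capital cost but increases annual generation.
    base = lec(30e6, 0.10, 1.0e6, 40e6)
    modified = lec(31e6, 0.10, 1.0e6, 52e6)
    pct_change = 100.0 * (modified - base) / base   # negative means the LEC fell
    ```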

  13. Advanced statistical methods for the definition of new staging models.

    PubMed

    Kates, Ronald; Schmitt, Manfred; Harbeck, Nadia

    2003-01-01

    Adequate staging procedures are the prerequisite for individualized therapy concepts in cancer, particularly in the adjuvant setting. Molecular staging markers tend to characterize specific, fundamental disease processes to a greater extent than conventional staging markers. At the biological level, the course of the disease will almost certainly involve interactions between multiple underlying processes. Since new therapeutic strategies tend to target specific processes as well, their impact will also involve interactions. Hence, assessment of the prognostic impact of new markers and their utilization for prediction of response to therapy will require increasingly sophisticated statistical tools that are capable of detecting and modeling complicated interactions. Because they are designed to model arbitrary interactions, neural networks offer a promising approach to improved staging. However, the typical clinical data environment poses severe challenges to high-performance survival modeling using neural nets, particularly the key problem of maintaining good generalization. Nonetheless, it turns out that by using newly developed methods to minimize unnecessary complexity in the neural network representation of disease course, it is possible to obtain models with high predictive performance. This performance has been validated on both simulated and real patient data sets. There are important applications for design of studies involving targeted therapy concepts and for identification of the improvement in decision support resulting from new staging markers. In this article, advantages of advanced statistical methods such as neural networks for definition of new staging models will be illustrated using breast cancer as an example.

  14. Tools and Products of Real-Time Modeling: Opportunities for Space Weather Forecasting

    NASA Technical Reports Server (NTRS)

    Hesse, Michael

    2009-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. The second CCMC activity is to support Space Weather forecasting at national Space Weather Forecasting Centers. This second activity involves model evaluations, model transitions to operations, and the development of draft Space Weather forecasting tools. This presentation will focus on the last element. Specifically, we will discuss present capabilities, and the potential to derive further tools. These capabilities will be interpreted in the context of a broad-based, bootstrapping activity for modern Space Weather forecasting.

  15. Pantograph catenary dynamic optimisation based on advanced multibody and finite element co-simulation tools

    NASA Astrophysics Data System (ADS)

    Massat, Jean-Pierre; Laurent, Christophe; Bianchi, Jean-Philippe; Balmès, Etienne

    2014-05-01

    This paper presents recent developments undertaken by the SNCF Innovation & Research Department on numerical modelling of pantograph-catenary interaction. It describes an efficient co-simulation process between finite element (FE) and multibody (MB) modelling methods. FE catenary models are coupled with a fully flexible MB representation of the pantograph, including pneumatic actuation. These advanced functionalities allow new kinds of numerical analysis, such as dynamic improvements based on innovative pneumatic suspensions or assessment of crash risks in crossing areas, that demonstrate the powerful capabilities of this computing approach.

  16. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M and S environments and infrastructure.

  17. Vlasiator: Global Kinetic Magnetospheric Modeling Tool

    NASA Astrophysics Data System (ADS)

    Sandroos, A.; von Alfthan, S.; Hoilijoki, S.; Honkonen, I.; Kempf, Y.; Pokhotelov, D.; Palmroth, M.

    2015-10-01

    We present Vlasiator, a novel code based on Vlasov's equation, developed for modeling magnetospheric plasma on a global scale. We have parallelized the code to petascale supercomputers with a hybrid OpenMP-MPI approach to address the high computational cost of propagating ion distribution functions in six dimensions. The accuracy of the numerical method is demonstrated by comparing simulated wave dispersion plots to analytical results. Simulations of Earth's bow shock region were able to reproduce many well-known plasma phenomena, such as compressional magnetosonic waves in the foreshock region and the mirror mode instability in the magnetosheath.

  18. Tetrahymena as a Unicellular Model Eukaryote: Genetic and Genomic Tools.

    PubMed

    Ruehle, Marisa D; Orias, Eduardo; Pearson, Chad G

    2016-06-01

    Tetrahymena thermophila is a ciliate model organism whose study has led to important discoveries and insights into both conserved and divergent biological processes. In this review, we describe the tools for the use of Tetrahymena as a model eukaryote, including an overview of its life cycle, orientation to its evolutionary roots, and methodological approaches to forward and reverse genetics. Recent genomic tools have expanded Tetrahymena's utility as a genetic model system. With the unique advantages that Tetrahymena provide, we argue that it will continue to be a model organism of choice.

  20. Fabric-based systems: model, tools, applications.

    SciTech Connect

    Wolinski, C.; Gokhale, M.; McCabe, K. P.

    2003-01-01

    A Fabric-Based System is a parameterized cellular architecture in which an array of computing cells communicates with an embedded processor through a global memory. This architecture is customizable to different classes of applications by functional unit, interconnect, and memory parameters, and can be instantiated efficiently on platform FPGAs. In previous work, we demonstrated the advantage of reconfigurable fabrics for image and signal processing applications. Recently, we have built the Fabric Generator (FG), a Java-based toolset that greatly accelerates construction of these fabrics. A module-generation library is used to define, instantiate, and interconnect cells' datapaths, and FG generates customized sequencers for individual cells or collections of cells. We describe the Fabric-Based System model, the FG toolset, and concrete realizations of fabric architectures generated by FG on the Altera Excalibur ARM that can deliver 4.5 GigaMACs/s (8/16-bit data, multiply-accumulate).

  1. Light-Weight Parallel Python Tools for Climate Model Workflows

    NASA Astrophysics Data System (ADS)

    Mickelson, S. A.; Paul, K.; Dennis, J.; Strand, G.

    2014-12-01

    It is expected that the data required for the next Intergovernmental Panel on Climate Change (IPCC) Assessment Report (AR6) will increase by more than a factor of 10, to an expected 25 terabytes per model. Experience from the last Coupled Model Intercomparison Project (CMIP5), which assembled the data used for the last IPCC Assessment Report (AR5), showed that the processing, archiving, and post-run diagnostic operations required on such large model output took almost as long to complete as the model runs themselves. As a result, we have been investigating and developing light-weight Python-based tools to parallelize the time-intensive post-run steps in the climate model workflow. In particular, we have developed a parallel Python tool for converting time-slice model output to time-series format, and we have more recently developed a parallel Python tool to perform fast time-averaging of time-series data, an operation needed for many diagnostic computations. These tools are designed to be light-weight and easy to install, with very few dependencies, so that they can be inserted into the climate model workflow with negligible disruption. In this work, we present the motivation, approach, and results of the two light-weight parallel Python tools that we have developed, as well as our plans for future research and development.
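    The parallel time-averaging step described in this abstract can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' actual tool: the variable names and array shapes are invented, and a thread pool stands in for their MPI-based parallelism (NumPy releases the GIL inside `.mean()`, so threads give real concurrency here).

    ```python
    # Illustrative sketch (not the published tool): time-average each variable's
    # time series in parallel across worker threads.
    from concurrent.futures import ThreadPoolExecutor

    import numpy as np

    def time_average(item):
        """Collapse the time axis (axis 0) of one variable's (time, lat, lon) array."""
        name, data = item
        return name, data.mean(axis=0)

    # Hypothetical model output: six variables, 120 monthly steps on a 4x8 grid.
    rng = np.random.default_rng(0)
    variables = {f"var{i}": rng.random((120, 4, 8)) for i in range(6)}

    with ThreadPoolExecutor(max_workers=4) as pool:
        averages = dict(pool.map(time_average, variables.items()))

    print(averages["var0"].shape)  # (4, 8): the time dimension is collapsed
    ```

    Because each variable is averaged independently, the work is embarrassingly parallel, which is what makes a light-weight tool with few dependencies feasible for this step.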

  2. Current Advancements and Challenges in Soil-Root Interactions Modelling

    NASA Astrophysics Data System (ADS)

    Schnepf, A.; Huber, K.; Abesha, B.; Meunier, F.; Leitner, D.; Roose, T.; Javaux, M.; Vanderborght, J.; Vereecken, H.

    2014-12-01

    Roots change their surrounding soil chemically, physically and biologically. This includes changes in soil moisture and solute concentration, the exudation of organic substances into the rhizosphere, increased growth of soil microorganisms, or changes in soil structure. The fate of water and solutes in the root zone is highly determined by these root-soil interactions. Mathematical models of soil-root systems in combination with non-invasive techniques able to characterize root systems are a promising tool to understand and predict the behaviour of water and solutes in the root zone. With respect to different fields of applications, predictive mathematical models can contribute to the solution of optimal control problems in plant resource efficiency. This may result in significant gains in productivity, efficiency and environmental sustainability in various land use activities. Major challenges include the coupling of model parameters of the relevant processes with the surrounding environment such as temperature, nutrient concentration or soil water content. A further challenge is the mathematical description of the different spatial and temporal scales involved. This includes in particular the branched structures formed by root systems or the external mycelium of mycorrhizal fungi. Here, reducing complexity as well as bridging between spatial scales is required. Furthermore, the combination of experimental and mathematical techniques may advance the field enormously. Here, the use of root system, soil and rhizosphere models is presented through a number of modelling case studies, including image based modelling of phosphate uptake by a root with hairs, model-based optimization of root architecture for phosphate uptake from soil, upscaling of rhizosphere models, modelling root growth in structured soil, and the effect of root hydraulic architecture on plant water uptake efficiency and drought resistance.

  3. Current advancements and challenges in soil-root interactions modelling

    NASA Astrophysics Data System (ADS)

    Schnepf, Andrea; Huber, Katrin; Abesha, Betiglu; Meunier, Felicien; Leitner, Daniel; Roose, Tiina; Javaux, Mathieu; Vanderborght, Jan; Vereecken, Harry

    2015-04-01

    Roots change their surrounding soil chemically, physically and biologically. This includes changes in soil moisture and solute concentration, the exudation of organic substances into the rhizosphere, increased growth of soil microorganisms, or changes in soil structure. The fate of water and solutes in the root zone is highly determined by these root-soil interactions. Mathematical models of soil-root systems in combination with non-invasive techniques able to characterize root systems are a promising tool to understand and predict the behaviour of water and solutes in the root zone. With respect to different fields of applications, predictive mathematical models can contribute to the solution of optimal control problems in plant resource efficiency. This may result in significant gains in productivity, efficiency and environmental sustainability in various land use activities. Major challenges include the coupling of model parameters of the relevant processes with the surrounding environment such as temperature, nutrient concentration or soil water content. A further challenge is the mathematical description of the different spatial and temporal scales involved. This includes in particular the branched structures formed by root systems or the external mycelium of mycorrhizal fungi. Here, reducing complexity as well as bridging between spatial scales is required. Furthermore, the combination of experimental and mathematical techniques may advance the field enormously. Here, the use of root system, soil and rhizosphere models is presented through a number of modelling case studies, including image based modelling of phosphate uptake by a root with hairs, model-based optimization of root architecture for phosphate uptake from soil, upscaling of rhizosphere models, modelling root growth in structured soil, and the effect of root hydraulic architecture on plant water uptake efficiency and drought resistance.

  4. A decision support tool for synchronizing technology advances with strategic mission objectives

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda S.; Willoughby, John K.

    1992-01-01

    Successful accomplishment of the objectives of many long-range future missions in areas such as space systems, land-use planning, and natural resource management requires significant technology developments. This paper describes the development of a data-derived decision-support tool called MisTec for helping strategic planners to determine technology development alternatives and to synchronize technology development schedules with the performance schedules of future long-term missions. Special attention is given to the operations concept, design, and functional capabilities of MisTec. MisTec was initially designed for a manned Mars mission, but it can be adapted to support other high-technology, long-range strategic planning situations, making it possible for a mission analyst, planner, or manager to describe a mission scenario, determine the technology alternatives for making the mission achievable, and plan the R&D activity necessary to achieve the required technology advances.

  5. Advanced Launch Technology Life Cycle Analysis Using the Architectural Comparison Tool (ACT)

    NASA Technical Reports Server (NTRS)

    McCleskey, Carey M.

    2015-01-01

    Life cycle technology impact comparisons for nanolauncher technology concepts were performed using an Affordability Comparison Tool (ACT) prototype. Examined are cost drivers and whether technology investments can dramatically affect the life cycle characteristics. Primary among the selected applications was the prospect of improving nanolauncher systems. As a result, findings and conclusions are documented for ways of creating more productive and affordable nanolauncher systems; e.g., an Express Lane-Flex Lane concept is forwarded, and the beneficial effect of incorporating advanced integrated avionics is explored. Also, a Functional Systems Breakdown Structure (F-SBS) was developed to derive consistent definitions of the flight and ground systems for both system performance and life cycle analysis. Further, a comprehensive catalog of ground segment functions was created.

  6. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    ERIC Educational Resources Information Center

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-01-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…

  7. Anvil Forecast Tool in the Advanced Weather Interactive Processing System, Phase II

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III

    2008-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and the Spaceflight Meteorology Group have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) created a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) to indicate the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input.

  8. DNA technological progress toward advanced diagnostic tools to support human hookworm control.

    PubMed

    Gasser, R B; Cantacessi, C; Loukas, A

    2008-01-01

    Blood-feeding hookworms are parasitic nematodes of major human health importance. Currently, it is estimated that 740 million people are infected worldwide, and more than 80 million of them are severely affected clinically by hookworm disease. In spite of the health problems caused and the advances toward the development of vaccines against some hookworms, limited attention has been paid to the need for improved, practical methods of diagnosis. Accurate diagnosis and genetic characterization of hookworms is central to their effective control. While traditional diagnostic methods have considerable limitations, there has been some progress toward the development of molecular-diagnostic tools. The present article provides a brief background on hookworm disease of humans, reviews the main methods that have been used for diagnosis and describes progress in establishing polymerase chain reaction (PCR)-based methods for the specific diagnosis of hookworm infection and the genetic characterisation of the causative agents. This progress provides a foundation for the rapid development of practical, highly sensitive and specific diagnostic and analytical tools to be used in improved hookworm prevention and control programmes.

  9. MATISSE: Multi-purpose Advanced Tool for Instruments for the Solar System Exploration .

    NASA Astrophysics Data System (ADS)

    Zinzi, A.; Capria, M. T.; Antonelli, L. A.

    In planetary sciences, designing, assembling, and launching onboard instruments are only preliminary steps toward the final aim of converting data into scientific knowledge, as the real challenge is data analysis and interpretation. Up to now, data have generally been stored in "old style" archives, i.e., common ftp servers where the user can manually search for data by browsing directories organized in a time-ordered manner. However, as the datasets to be stored and searched become particularly large, this latter task absorbs a great deal of time, subtracting it from the real scientific work. In order to reduce the time spent searching for and analyzing data, MATISSE (Multi-purpose Advanced Tool for Instruments for the Solar System Exploration), a new set of software tools developed together with the scientific teams of the instruments involved, is under development at ASDC (ASI Science Data Center), whose experience in space mission data management is well known (e.g., \citealt{verrecchia07,pittori09,giommi09,massaro11}); its features and aims will be presented here.

  10. Standards, databases, and modeling tools in systems biology.

    PubMed

    Kohl, Michael

    2011-01-01

    Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling due to the possibility of sharing models irrespective of the used software tools. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given.

  11. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    PubMed

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g., information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. Collectively, the practice of managing the information available in such a data repository is known as Business Intelligence (BI). This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.
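    The data-warehouse idea in this abstract can be sketched with an open-source stack as small as Python's built-in SQLite: load workflow events from several systems into one store, then build a composite view with a single aggregate query. The schema, table name, and sample events below are invented for illustration and are not from the authors' prototype.

    ```python
    # Minimal BI-style sketch (assumed schema, not the authors' prototype):
    # consolidate workflow events in SQLite and summarize dwell time per state.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE exam_events (
        exam_id TEXT, source_system TEXT, state TEXT, minutes_in_state REAL)""")
    conn.executemany(
        "INSERT INTO exam_events VALUES (?, ?, ?, ?)",
        [("E1", "RIS", "ordered", 12.0),
         ("E1", "PACS", "acquired", 30.0),
         ("E1", "RIS", "reported", 55.0),
         ("E2", "RIS", "ordered", 8.0),
         ("E2", "PACS", "acquired", 41.0)],
    )

    # Composite presentation view: average dwell time per workflow state.
    rows = conn.execute("""
        SELECT state, COUNT(*) AS n, AVG(minutes_in_state) AS avg_minutes
        FROM exam_events GROUP BY state ORDER BY state
    """).fetchall()
    for state, n, avg_minutes in rows:
        print(state, n, avg_minutes)
    ```

    A production warehouse would add an extract-transform-load layer per source system and a dashboarding front end, but the aggregate-over-consolidated-events pattern is the same.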

  12. Using Enabling Technologies to Advance Data Intensive Analysis Tools in the JPL Tropical Cyclone Information System

    NASA Astrophysics Data System (ADS)

    Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.

    2014-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing it to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been working on developing data analysis tools to let the user actively interrogate areas of Level 2 swath and two-dimensional plots they see on their screen. As expected, these observation and model data are quite voluminous, and bottlenecks in the system architecture can occur when the databases try to run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how to best store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints being stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and
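    The geospatial search being benchmarked above reduces, at its core, to asking which stored swath footprints intersect a search region. A hedged sketch of that core query, with footprints approximated as bounding boxes (the file names, boxes, and storm region below are invented; TCIS stores real swath polygons in a database, not a Python dict):

    ```python
    # Illustrative sketch (not the TCIS implementation): discover which swath
    # files intersect a storm region via bounding-box intersection tests.

    def intersects(a, b):
        """Axis-aligned (lon_min, lat_min, lon_max, lat_max) box intersection."""
        return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

    # Hypothetical footprint index: file name -> bounding box of the swath.
    footprints = {
        "swath_001.h5": (-80.0, 20.0, -70.0, 30.0),
        "swath_002.h5": (-60.0, 10.0, -50.0, 18.0),
        "swath_003.h5": (-75.0, 25.0, -65.0, 35.0),
    }

    storm_region = (-78.0, 24.0, -72.0, 28.0)
    hits = sorted(f for f, box in footprints.items() if intersects(box, storm_region))
    print(hits)  # ['swath_001.h5', 'swath_003.h5']
    ```

    The database comparison in the abstract is essentially about executing this test, with true polygon footprints and spatial indexes, at sub-second latency over millions of records.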

  13. Virtual Cell: computational tools for modeling in cell biology

    PubMed Central

    Resasco, Diana C.; Gao, Fei; Morgan, Frank; Novak, Igor L.; Schaff, James C.; Slepchenko, Boris M.

    2011-01-01

    The Virtual Cell (VCell) is a general computational framework for modeling physico-chemical and electrophysiological processes in living cells. Developed by the National Resource for Cell Analysis and Modeling at the University of Connecticut Health Center, it provides automated tools for simulating a wide range of cellular phenomena in space and time, both deterministically and stochastically. These computational tools allow one to couple electrophysiology and reaction kinetics with transport mechanisms, such as diffusion and directed transport, and map them onto spatial domains of various shapes, including irregular three-dimensional geometries derived from experimental images. In this article, we review new robust computational tools recently deployed in VCell for treating spatially resolved models. PMID:22139996

  14. Modeling the Europa Pathfinder avionics system with a model based avionics architecture tool

    NASA Technical Reports Server (NTRS)

    Chau, S.; Traylor, M.; Hall, R.; Whitfield, A.

    2002-01-01

    In order to shorten the avionics architecture development time, the Jet Propulsion Laboratory has developed a model-based architecture simulation tool called the Avionics System Architecture Tool (ASAT).

  15. Modeling and Simulation Tools: From Systems Biology to Systems Medicine.

    PubMed

    Olivier, Brett G; Swat, Maciej J; Moné, Martijn J

    2016-01-01

    Modeling is an integral component of modern biology. In this chapter we look into the role of the model, as it pertains to Systems Medicine, and the software that is required to instantiate and run it. We do this by comparing the development, implementation, and characteristics of tools that have been developed to work with two divergent methodologies: Systems Biology and Pharmacometrics. From the Systems Biology perspective we consider the concept of "Software as a Medical Device" and what this may imply for the migration of research-oriented simulation software into the domain of human health. In our second perspective, we see how in practice hundreds of computational tools already accompany drug discovery and development at every stage of the process. Standardized exchange formats are required to streamline model exchange between tools, which would minimize translation errors and reduce the time required. With the emergence, almost 15 years ago, of the SBML standard, a large part of the domain of interest is already covered, and models can be shared and passed from software to software without recoding them. Until recently, the last stage of the process, the pharmacometric analysis used in clinical studies carried out on subject populations, lacked such an exchange medium. We describe a new, emerging exchange format in Pharmacometrics which covers non-linear mixed-effects models, the standard statistical model type used in this area. By interfacing these two formats, the entire domain can be covered by complementary standards and, subsequently, by the corresponding tools.
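    The exchange-format idea is that a model serialized to a standard XML dialect such as SBML can be read back by any compliant tool without recoding. The sketch below builds and re-parses a deliberately minimal SBML-style fragment with Python's standard library; it illustrates the round-trip only and is not a complete, validated SBML document (a real tool would use a dedicated library such as libSBML).

    ```python
    # Schematic sketch of model exchange: serialize a model to SBML-style XML,
    # then parse it back as a receiving tool would. Minimal fragment, not valid
    # full SBML.
    import xml.etree.ElementTree as ET

    SBML_NS = "http://www.sbml.org/sbml/level3/version1/core"
    ET.register_namespace("", SBML_NS)

    sbml = ET.Element(f"{{{SBML_NS}}}sbml", level="3", version="1")
    model = ET.SubElement(sbml, f"{{{SBML_NS}}}model", id="simple_decay")
    species_list = ET.SubElement(model, f"{{{SBML_NS}}}listOfSpecies")
    ET.SubElement(species_list, f"{{{SBML_NS}}}species",
                  id="S1", initialAmount="10.0")

    document = ET.tostring(sbml, encoding="unicode")

    # A receiving tool recovers the model structure without any recoding:
    parsed = ET.fromstring(document)
    print(parsed.find(f"{{{SBML_NS}}}model").get("id"))  # simple_decay
    ```

    The pharmacometrics exchange format described in the abstract plays the same role for non-linear mixed-effects models that SBML plays for reaction-kinetics models here.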

  16. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 12: Instrumentation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  17. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 10: Computer-Aided Drafting & Design, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  18. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 2: Career Development, General Education and Remediation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  19. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 5: Mold Making, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  20. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 8: Sheet Metal & Composites, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  1. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 4: Manufacturing Engineering Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  2. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 3: Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  3. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 7: Industrial Maintenance Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  4. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 14: Automated Equipment Technician (CIM), of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  5. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 13: Laser Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  6. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 6: Welding, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  7. Development Life Cycle and Tools for XML Content Models

    SciTech Connect

    Kulvatunyou, Boonserm; Morris, Katherine; Buhwan, Jeong; Goyal, Puja

    2004-11-01

    Many integration projects today rely on shared semantic models based on standards represented using Extensible Markup Language (XML) technologies. Shared semantic models typically evolve and require maintenance. In addition, to promote interoperability and reduce integration costs, the shared semantics should be reused as much as possible. Semantic components must be consistent and valid in terms of agreed-upon standards and guidelines. In this paper, we describe an activity model for the creation, use, and maintenance of a shared semantic model that is coherent and supports efficient enterprise integration. We then use this activity model to frame our research and the development of tools to support those activities. We provide overviews of these tools, primarily in the context of the W3C XML Schema. At present, we focus our work on the W3C XML Schema as the representation of choice, due to its extensive adoption by industry.

  8. New generation of exploration tools: interactive modeling software and microcomputers

    SciTech Connect

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  9. Modeling of cumulative tool wear in machining metal matrix composites

    SciTech Connect

    Hung, N.P.; Tan, V.K.; Oon, B.E.

    1995-12-31

    Metal matrix composites (MMCs) are notoriously known for their low machinability because of their abrasive and brittle reinforcement. Although a near-net-shape product can be produced, finish machining is still required for the final shape and dimension. The classical Taylor's tool life equation, which relates tool life to cutting conditions, has traditionally been used to study machinability. The turning operation is commonly used to investigate the machinability of a material; tedious and costly milling experiments have to be performed separately, while a facing test is not applicable for Taylor's model, since the facing speed varies as the tool moves radially. Collecting intensive machining data for MMCs is often difficult because of the constraints on size, the cost of the material, and the availability of sophisticated machine tools. A more flexible model and machinability testing technique are, therefore, sought. This study presents and verifies new models for turning, facing, and milling operations. Different cutting conditions were utilized to assess the machinability of MMCs reinforced with silicon carbide or alumina particles. Experimental data show that tool wear does not depend on the order of different cutting speeds, since abrasion is the main wear mechanism. Correlation between data for turning, milling, and facing is presented. It is more economical to rank machinability using data for facing and then to convert the data for turning and milling, if required. Subsurface damage, such as a work-hardened and cracked matrix alloy and fractured and delaminated particles, is discussed.
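    The classical Taylor relation the abstract refers to is V * T^n = C, where V is cutting speed, T is tool life, and n and C are empirical constants. A small sketch of the relation, solved for tool life; the constants below are illustrative placeholders, not measured values for any MMC:

    ```python
    # Taylor's tool life equation, V * T**n = C, solved for tool life T.
    # n and C here are illustrative, not fitted to any real material.

    def taylor_tool_life(speed, n=0.25, c=300.0):
        """Tool life T (min) at cutting speed V (m/min), from V * T**n = C."""
        return (c / speed) ** (1.0 / n)

    # Doubling the cutting speed sharply reduces tool life:
    for v in (75.0, 150.0):
        print(v, taylor_tool_life(v))  # 75.0 -> 256.0 min, 150.0 -> 16.0 min
    ```

    The strong sensitivity of T to V (a power of 1/n) is why the varying speed of a facing cut breaks the model, as the abstract notes: no single V characterizes the pass.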

  10. Revel8or: Model Driven Capacity Planning Tool Suite

    SciTech Connect

    Zhu, Liming; Liu, Yan; Bui, Ngoc B.; Gorton, Ian

    2007-05-31

    Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8tor. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.

  11. Development, Implementation and Application of Micromechanical Analysis Tools for Advanced High Temperature Composites

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This document contains the final report to the NASA Glenn Research Center (GRC) for the research project entitled Development, Implementation, and Application of Micromechanical Analysis Tools for Advanced High-Temperature Composites. The research supporting this initiative was conducted by Dr. Brett A. Bednarcyk, a Senior Scientist at OM in Brookpark, Ohio, from August 1998 to March 2005. Most of the work summarized herein involved development, implementation, and application of enhancements and new capabilities for NASA GRC's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package. When the project began, this software was at a low TRL (3-4) and at release version 2.0. Due to this project, the TRL of MAC/GMC has been raised to 7 and two new versions (3.0 and 4.0) have been released. The most important accomplishments with respect to MAC/GMC are: (1) A multi-scale framework has been built around the software, enabling coupled design and analysis from the global structure scale down to the micro fiber-matrix scale; (2) The software has been expanded to analyze smart materials; (3) State-of-the-art micromechanics theories have been implemented and validated within the code; (4) The damage, failure, and lifing capabilities of the code have been expanded from a very limited state to a vast degree of functionality and utility; and (5) The user flexibility of the code has been significantly enhanced. MAC/GMC is now the premier code for design and analysis of advanced composite and smart materials. It is a candidate for the 2005 NASA Software of the Year Award. The work completed over the course of the project is summarized below on a year-by-year basis. All publications resulting from the project are listed at the end of this report.

  12. The Advancement Value Chain: An Exploratory Model

    ERIC Educational Resources Information Center

    Leonard, Edward F., III

    2005-01-01

    Since the introduction of the value chain concept in 1985, several varying, yet virtually similar, value chains have been developed for the business enterprise. Shifting to higher education, can a value chain be found that links together the various activities of advancement so that an institution's leaders can actually look at the philanthropic…

  13. Predicting Career Advancement with Structural Equation Modelling

    ERIC Educational Resources Information Center

    Heimler, Ronald; Rosenberg, Stuart; Morote, Elsa-Sofia

    2012-01-01

    Purpose: The purpose of this paper is to use the authors' prior findings concerning basic employability skills in order to determine which skills best predict career advancement potential. Design/methodology/approach: Utilizing survey responses of human resource managers, the employability skills showing the largest relationships to career…

  14. Advanced Placement: Model Policy Components. Policy Analysis

    ERIC Educational Resources Information Center

    Zinth, Jennifer

    2016-01-01

    Advanced Placement (AP), launched in 1955 by the College Board as a program to offer gifted high school students the opportunity to complete entry-level college coursework, has since expanded to encourage a broader array of students to tackle challenging content. This Education Commission of the State's Policy Analysis identifies key components of…

  15. Development and application of modeling tools for sodium fast reactor inspection

    SciTech Connect

    Le Bourdais, Florian; Marchand, Benoît; Baronian, Vahan

    2014-02-18

    To support the development of in-service inspection methods for the Advanced Sodium Test Reactor for Industrial Demonstration (ASTRID) project led by the French Atomic Energy Commission (CEA), several tools that allow situations specific to Sodium cooled Fast Reactors (SFR) to be modeled have been implemented in the CIVA software and exploited. This paper details specific applications and results obtained. For instance, a new specular reflection model allows the calculation of complex echoes from scattering structures inside the reactor vessel. EMAT transducer simulation models have been implemented to develop new transducers for sodium visualization and imaging. Guided wave analysis tools have been developed to permit defect detection in the vessel shell. Application examples and comparisons with experimental data are presented.

  16. Development and application of modeling tools for sodium fast reactor inspection

    NASA Astrophysics Data System (ADS)

    Le Bourdais, Florian; Marchand, Benoît; Baronian, Vahan

    2014-02-01

    To support the development of in-service inspection methods for the Advanced Sodium Test Reactor for Industrial Demonstration (ASTRID) project led by the French Atomic Energy Commission (CEA), several tools that allow situations specific to Sodium cooled Fast Reactors (SFR) to be modeled have been implemented in the CIVA software and exploited. This paper details specific applications and results obtained. For instance, a new specular reflection model allows the calculation of complex echoes from scattering structures inside the reactor vessel. EMAT transducer simulation models have been implemented to develop new transducers for sodium visualization and imaging. Guided wave analysis tools have been developed to permit defect detection in the vessel shell. Application examples and comparisons with experimental data are presented.

  17. Cloud-Based Tools to Support High-Resolution Modeling (Invited)

    NASA Astrophysics Data System (ADS)

    Jones, N.; Nelson, J.; Swain, N.; Christensen, S.

    2013-12-01

    The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include massive amounts of spatial data that must be processed for each new scenario and lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HTCondor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HTCondor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archival, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482.

  18. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Garcia, Jessica; Threet, Grady E., Jr.; Phillips, Alan

    2013-01-01

    The Earth-to-Orbit Team (ETO) of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the pre-eminent "go-to" group for pre-phase A and phase A concept definition. Over the past several years the ETO team has evaluated thousands of launch vehicle concept variations for a significant number of studies including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Augustine Report, Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). The ACO ETO Team is called upon to address many needs in NASA's design community; some of these are defining extremely large trade-spaces, evaluating advanced technology concepts which have not been addressed by a large majority of the aerospace community, and the rapid turn-around of highly time critical actions. It is the time critical actions, those often limited by schedule or little advanced warning, that have forced the five member ETO team to develop a design process robust enough to handle their current output level in order to meet their customer's needs. Based on the number of vehicle concepts evaluated over the past year this output level averages to four completed vehicle concepts per day. Each of these completed vehicle concepts includes a full mass breakdown of the vehicle to a tertiary level of subsystem components and a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. A structural analysis of the vehicle to determine flight loads based on the trajectory output, material properties, and geometry of the concept is also performed. Due to working in this fast-paced and sometimes rapidly changing environment, the ETO Team has developed a finely tuned process to maximize their delivery capabilities. The objective of this paper is to describe the interfaces

  19. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Creech, Dennis M.; Garcia, Jessica; Threet, Grady E., Jr.; Phillips, Alan

    2012-01-01

    The Earth-to-Orbit Team (ETO) of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the pre-eminent go-to group for pre-phase A and phase A concept definition. Over the past several years the ETO team has evaluated thousands of launch vehicle concept variations for a significant number of studies including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Augustine Report, Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). The ACO ETO Team is called upon to address many needs in NASA's design community; some of these are defining extremely large trade-spaces, evaluating advanced technology concepts which have not been addressed by a large majority of the aerospace community, and the rapid turn-around of highly time critical actions. It is the time critical actions, those often limited by schedule or little advanced warning, that have forced the five member ETO team to develop a design process robust enough to handle their current output level in order to meet their customer's needs. Based on the number of vehicle concepts evaluated over the past year this output level averages to four completed vehicle concepts per day. Each of these completed vehicle concepts includes a full mass breakdown of the vehicle to a tertiary level of subsystem components and a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. A structural analysis of the vehicle to determine flight loads based on the trajectory output, material properties, and geometry of the concept is also performed. Due to working in this fast-paced and sometimes rapidly changing environment, the ETO Team has developed a finely tuned process to maximize their delivery capabilities. The objective of this paper is to describe the interfaces

  20. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on an optimization problem have been successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulations for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.
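    Optimization-based model tuning of the kind the abstract mentions can be illustrated with a toy single-parameter example: adjust a stiffness so that a one-degree-of-freedom model's natural frequency matches a measured value. The mass, target frequency, and use of SciPy's scalar minimizer are all assumptions chosen for illustration, not the tool's actual formulation.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Toy optimization-based model tuning: pick stiffness k (N/m) so the model
    # frequency f = sqrt(k/m) / (2*pi) matches a "measured" frequency.
    # m and f_measured are hypothetical stand-ins for finite element model
    # properties and ground vibration test data.
    m = 2.0           # mass, kg (assumed)
    f_measured = 5.0  # measured natural frequency, Hz (assumed)

    def freq_error(k):
        """Squared mismatch between model and measured natural frequency."""
        f_model = np.sqrt(k / m) / (2.0 * np.pi)
        return (f_model - f_measured) ** 2

    res = minimize_scalar(freq_error, bounds=(1.0, 1.0e5), method="bounded")
    k_tuned = res.x  # closed form is m * (2*pi*f_measured)**2, about 1973.9 N/m
    print(k_tuned)
    ```

    In a real model-updating problem the same pattern scales up: many parameters, many measured modes, and a weighted least-squares objective in place of this scalar error.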

  1. Integrated Modeling Tools for Thermal Analysis and Applications

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Needels, Laura; Papalexandris, Miltiadis

    1999-01-01

    Integrated modeling of spacecraft systems is a rapidly evolving area in which multidisciplinary models are developed to design and analyze spacecraft configurations. These models are especially important in the early design stages where rapid trades between subsystems can substantially impact design decisions. Integrated modeling is one of the cornerstones of two of NASA's planned missions in the Origins Program -- the Next Generation Space Telescope (NGST) and the Space Interferometry Mission (SIM). Common modeling tools for control design and opto-mechanical analysis have recently emerged and are becoming increasingly widely used. A discipline that has been somewhat less integrated, but is nevertheless of critical concern for high precision optical instruments, is thermal analysis and design. A major factor contributing to this mild estrangement is that the modeling philosophies and objectives for structural and thermal systems typically do not coincide. Consequently the tools used in these disciplines suffer a degree of incompatibility, each having developed along its own evolutionary path. Although standard thermal tools have worked relatively well in the past, integration with other disciplines requires revisiting modeling assumptions and solution methods. Over the past several years we have been developing a MATLAB-based integrated modeling tool called IMOS (Integrated Modeling of Optical Systems) which integrates many aspects of the structural, optical, control and dynamical analysis disciplines. Recent efforts have included developing a thermal modeling and analysis capability, which is the subject of this article. Currently, the IMOS thermal suite contains steady state and transient heat equation solvers, and the ability to set up the linear conduction network from an IMOS finite element model. The IMOS code generates linear conduction elements associated with plates and beams/rods of the thermal network directly from the finite element structural

  2. Tool Steel Heat Treatment Optimization Using Neural Network Modeling

    NASA Astrophysics Data System (ADS)

    Podgornik, Bojan; Belič, Igor; Leskovšek, Vojteh; Godec, Matjaz

    2016-08-01

    Optimization of tool steel properties and the corresponding heat treatment is mainly based on a trial-and-error approach, which requires tremendous experimental work and resources. Therefore, there is a huge need for tools allowing prediction of mechanical properties of tool steels as a function of composition and heat treatment process variables. The aim of the present work was to explore the potential and possibilities of artificial neural network-based modeling to select and optimize vacuum heat treatment conditions depending on the hot work tool steel composition and required properties. In the current case, training of the feedforward neural network with an error backpropagation training scheme and four layers of neurons (8-20-20-2) was based on the experimentally obtained tempering diagrams for ten different hot work tool steel compositions and at least two austenitizing temperatures. Results show that this type of modeling can be successfully used for detailed and multifunctional analysis of different influential parameters as well as to optimize the heat treatment process of hot work tool steels depending on the composition. In terms of composition, V was found to be the most beneficial alloying element, increasing hardness and fracture toughness of hot work tool steel; Si, Mn, and Cr increase hardness but lead to reduced fracture toughness, while Mo has the opposite effect. The optimum concentration providing high KIc/HRC ratios would include 0.75 pct Si, 0.4 pct Mn, 5.1 pct Cr, 1.5 pct Mo, and 0.5 pct V, with the optimum heat treatment performed at lower austenitizing and intermediate tempering temperatures.
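    The 8-20-20-2 architecture described above is small enough to sketch directly. The weights below are random (i.e. the network is untrained), and the activation functions are assumptions, since the abstract does not specify them; in the study the weights were fitted by error backpropagation on measured tempering diagrams.

    ```python
    import numpy as np

    # Forward pass through a four-layer feedforward network with the 8-20-20-2
    # layout from the abstract: 8 inputs (composition and heat treatment
    # variables), two hidden layers of 20 neurons, and 2 outputs (e.g. hardness
    # and fracture toughness).
    rng = np.random.default_rng(0)
    sizes = [8, 20, 20, 2]
    weights = [0.1 * rng.standard_normal((m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
    biases = [np.zeros(n) for n in sizes[1:]]

    def forward(x):
        """tanh hidden layers, linear output layer (assumed activation choices)."""
        for w, b in zip(weights[:-1], biases[:-1]):
            x = np.tanh(x @ w + b)
        return x @ weights[-1] + biases[-1]

    x = rng.standard_normal(8)  # one hypothetical input vector
    print(forward(x).shape)     # (2,)
    ```

    Training such a network reduces to minimizing the squared error between these two outputs and the measured hardness and toughness across the tempering-diagram data.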

  3. Tool Steel Heat Treatment Optimization Using Neural Network Modeling

    NASA Astrophysics Data System (ADS)

    Podgornik, Bojan; Belič, Igor; Leskovšek, Vojteh; Godec, Matjaz

    2016-11-01

    Optimization of tool steel properties and the corresponding heat treatment is mainly based on a trial-and-error approach, which requires tremendous experimental work and resources. Therefore, there is a huge need for tools allowing prediction of mechanical properties of tool steels as a function of composition and heat treatment process variables. The aim of the present work was to explore the potential and possibilities of artificial neural network-based modeling to select and optimize vacuum heat treatment conditions depending on the hot work tool steel composition and required properties. In the current case, training of the feedforward neural network with an error backpropagation training scheme and four layers of neurons (8-20-20-2) was based on the experimentally obtained tempering diagrams for ten different hot work tool steel compositions and at least two austenitizing temperatures. Results show that this type of modeling can be successfully used for detailed and multifunctional analysis of different influential parameters as well as to optimize the heat treatment process of hot work tool steels depending on the composition. In terms of composition, V was found to be the most beneficial alloying element, increasing hardness and fracture toughness of hot work tool steel; Si, Mn, and Cr increase hardness but lead to reduced fracture toughness, while Mo has the opposite effect. The optimum concentration providing high KIc/HRC ratios would include 0.75 pct Si, 0.4 pct Mn, 5.1 pct Cr, 1.5 pct Mo, and 0.5 pct V, with the optimum heat treatment performed at lower austenitizing and intermediate tempering temperatures.

  4. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool; therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.
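    The 1-cos gust used to excite the state space model is a standard discrete gust profile; a minimal sketch follows, with an amplitude and duration chosen only for illustration (the report's actual gust parameters are not reproduced here).

    ```python
    import numpy as np

    # "1-cos" discrete gust: w(t) = 0.5 * w_max * (1 - cos(2*pi*t / t_g)) for
    # 0 <= t <= t_g, and zero outside that window. w_max (amplitude) and t_g
    # (duration) below are illustrative values, not taken from the report.

    def one_minus_cos_gust(t, w_max=10.0, t_g=0.5):
        """Vertical gust velocity at time t; zero outside the gust window."""
        t = np.asarray(t, dtype=float)
        inside = (t >= 0.0) & (t <= t_g)
        return np.where(inside, 0.5 * w_max * (1.0 - np.cos(2.0 * np.pi * t / t_g)), 0.0)

    t = np.linspace(0.0, 1.0, 201)
    w = one_minus_cos_gust(t)
    print(w.max())  # peak of w_max = 10.0, reached at t = t_g / 2
    ```

    Fed as the disturbance input to the state space model, this smooth one-sided pulse excites the flexible modes without the spectral artifacts of a step input.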

  5. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2015-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. These processes are carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool; therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.

  6. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2014-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. A doublet lattice approach is taken to compute generalized forces. A rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. This process is carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool; therefore, the flutter speed and frequency for a clamped plate are computed using V-g and V-f analysis. The computational results are compared to a previously published computational analysis and wind tunnel results for the same structure. Finally, a case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to V-g and V-f analysis. This also includes the analysis of the model in response to a 1-cos gust.

  7. Ares First Stage "Systemology" - Combining Advanced Systems Engineering and Planning Tools to Assure Mission Success

    NASA Technical Reports Server (NTRS)

    Seiler, James; Brasfield, Fred; Cannon, Scott

    2008-01-01

    Ares is an integral part of NASA's Constellation architecture that will provide crew and cargo access to the International Space Station as well as low Earth orbit support for lunar missions. Ares replaces the Space Shuttle in the post-2010 time frame. Ares I is an in-line, two-stage rocket topped by the Orion Crew Exploration Vehicle, its service module, and a launch abort system. The Ares I first stage is a single, five-segment reusable solid rocket booster derived from the Space Shuttle Program's reusable solid rocket motor. The Ares second or upper stage is propelled by a J-2X main engine fueled with liquid oxygen and liquid hydrogen. This paper describes the advanced systems engineering and planning tools being utilized for the design, test, and qualification of the Ares I first stage element. Included are descriptions of the current first stage design, the milestone schedule requirements, and the marriage of systems engineering, detailed planning efforts, and roadmapping employed to achieve these goals.

  8. Recent advances in i-Gene tools and analysis: microarrays, next generation sequencing and mass spectrometry.

    PubMed

    Moorhouse, Michael J; Sharma, Hari S

    2011-08-01

    Recent advances in technology and associated methodology have made the current period one of the most exciting in molecular biology and medicine. Underlying these advances is an appreciation that modern research is driven by increasingly large amounts of data interpreted by interdisciplinary collaborative teams that are often geographically dispersed. The availability of cheap computing power, high-speed informatics networks and high-quality analysis software has been essential to this, as has the application of modern quality assurance methodologies. In this review, we discuss the application of modern 'High-Throughput' molecular biological technologies such as 'Microarrays' and 'Next Generation Sequencing' to scientific and biomedical research as we have observed them. Furthermore, we offer guidance to help the reader understand key features of these technologies and new strategies, and to apply these i-Gene tools successfully in their own endeavours. Collectively, we term this 'i-Gene Analysis'. We also offer predictions as to the developments that are anticipated in the near and more distant future.

  9. Advances in the genetic dissection of plant cell walls: tools and resources available in Miscanthus

    PubMed Central

    Slavov, Gancho; Allison, Gordon; Bosch, Maurice

    2013-01-01

    Tropical C4 grasses from the genus Miscanthus are believed to have great potential as biomass crops. However, Miscanthus species are essentially undomesticated, and genetic, molecular and bioinformatics tools are in very early stages of development. Furthermore, similar to other crops targeted as lignocellulosic feedstocks, the efficient utilization of biomass is hampered by our limited knowledge of the structural organization of the plant cell wall and the underlying genetic components that control this organization. The Institute of Biological, Environmental and Rural Sciences (IBERS) has assembled an extensive collection of germplasm for several species of Miscanthus. In addition, an integrated, multidisciplinary research programme at IBERS aims to inform accelerated breeding for biomass productivity and composition, while also generating fundamental knowledge. Here we review recent advances with respect to the genetic characterization of the cell wall in Miscanthus. First, we present a summary of recent and on-going biochemical studies, including prospects and limitations for the development of powerful phenotyping approaches. Second, we review current knowledge about genetic variation for cell wall characteristics of Miscanthus and illustrate how phenotypic data, combined with high-density arrays of single-nucleotide polymorphisms, are being used in genome-wide association studies to generate testable hypotheses and guide biological discovery. Finally, we provide an overview of the current knowledge about the molecular biology of cell wall biosynthesis in Miscanthus and closely related grasses, discuss the key conceptual and technological bottlenecks, and outline the short-term prospects for progress in this field. PMID:23847628

  10. Advances in the genetic dissection of plant cell walls: tools and resources available in Miscanthus.

    PubMed

    Slavov, Gancho; Allison, Gordon; Bosch, Maurice

    2013-01-01

    Tropical C4 grasses from the genus Miscanthus are believed to have great potential as biomass crops. However, Miscanthus species are essentially undomesticated, and genetic, molecular and bioinformatics tools are in very early stages of development. Furthermore, similar to other crops targeted as lignocellulosic feedstocks, the efficient utilization of biomass is hampered by our limited knowledge of the structural organization of the plant cell wall and the underlying genetic components that control this organization. The Institute of Biological, Environmental and Rural Sciences (IBERS) has assembled an extensive collection of germplasm for several species of Miscanthus. In addition, an integrated, multidisciplinary research programme at IBERS aims to inform accelerated breeding for biomass productivity and composition, while also generating fundamental knowledge. Here we review recent advances with respect to the genetic characterization of the cell wall in Miscanthus. First, we present a summary of recent and on-going biochemical studies, including prospects and limitations for the development of powerful phenotyping approaches. Second, we review current knowledge about genetic variation for cell wall characteristics of Miscanthus and illustrate how phenotypic data, combined with high-density arrays of single-nucleotide polymorphisms, are being used in genome-wide association studies to generate testable hypotheses and guide biological discovery. Finally, we provide an overview of the current knowledge about the molecular biology of cell wall biosynthesis in Miscanthus and closely related grasses, discuss the key conceptual and technological bottlenecks, and outline the short-term prospects for progress in this field.

  11. Greenhouse gases from wastewater treatment - A review of modelling tools.

    PubMed

    Mannina, Giorgio; Ekama, George; Caniani, Donatella; Cosenza, Alida; Esposito, Giovanni; Gori, Riccardo; Garrido-Baserba, Manel; Rosso, Diego; Olsson, Gustaf

    2016-05-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state-of-the-art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge on the processes related to N2O formation, especially due to autotrophic biomass, is still incomplete. The literature review also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs. Indeed, several studies have confirmed that a broad view of WWTPs has to be taken in order to make them as sustainable as possible. Mechanistic dynamic models have been demonstrated to be the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs, owing to the considerable difficulties related to data availability and model complexity. For further improvement in plant-wide GHG modelling, and to favour its use at full scale, knowledge of the mechanisms involved in GHG formation and release must be deepened and data acquisition must be enhanced. PMID:26878638

  13. Ground-water models as a management tool in Florida

    USGS Publications Warehouse

    Hutchinson, C.B.

    1984-01-01

    Highly sophisticated computer models provide powerful tools for analyzing historic data and for simulating future water levels, water movement, and water chemistry under stressed conditions throughout the ground-water system in Florida. Models that simulate the movement of heat and subsidence of land in response to aquifer pumping also have potential for application to hydrologic problems in the State. Florida, with 20 ground-water modeling studies reported since 1972, has applied computer modeling techniques to a variety of water-resources problems. Models in Florida generally have been used to provide insight to problems of water supply, contamination, and impact on the environment. The model applications range from site-specific studies, such as estimating contamination by wastewater injection at St. Petersburg, to a regional model of the entire State that may be used to assess broad-scale environmental impact of water-resources development. Recently, groundwater models have been used as management tools by the State regulatory authority to permit or deny development of water resources. As modeling precision, knowledge, and confidence increase, the use of ground-water models will shift more and more toward regulation of development and enforcement of environmental laws. (USGS)
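    The kind of simulation the abstract describes can be reduced to a minimal sketch: in a 1-D aquifer with uniform transmissivity and no recharge, the steady-state head satisfies Laplace's equation between two fixed-head boundaries, which a simple finite-difference relaxation solves. This is a generic textbook scheme, not code from the USGS studies; all names and values are illustrative.

    ```python
    # Minimal 1-D steady-state ground-water flow sketch (illustrative only).
    # Head h satisfies d^2h/dx^2 = 0 between two fixed-head boundaries; a
    # Jacobi relaxation of the finite-difference equations converges to the
    # linear analytic profile.

    def steady_state_heads(h_left, h_right, n_nodes, tol=1e-10, max_iter=100000):
        """Relax interior heads until successive sweeps change by less than tol."""
        h = [h_left] + [(h_left + h_right) / 2.0] * (n_nodes - 2) + [h_right]
        for _ in range(max_iter):
            new = h[:]
            for i in range(1, n_nodes - 1):
                new[i] = 0.5 * (h[i - 1] + h[i + 1])  # discrete Laplace equation
            if max(abs(a - b) for a, b in zip(new, h)) < tol:
                return new
            h = new
        return h

    # Heads (in metres) between a 10 m and a 4 m boundary on a 7-node grid
    heads = steady_state_heads(10.0, 4.0, 7)
    ```

    The relaxed heads converge to a straight line between the boundary values; production models generalize this idea to three dimensions, heterogeneous properties, and transient stress periods.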

  14. AgMIP Training in Multiple Crop Models and Tools

    NASA Technical Reports Server (NTRS)

    Boote, Kenneth J.; Porter, Cheryl H.; Hargreaves, John; Hoogenboom, Gerrit; Thornburn, Peter; Mutter, Carolyn

    2015-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) has the goal of using multiple crop models to evaluate climate impacts on agricultural production and food security in developed and developing countries. There are several major limitations that must be overcome to achieve this goal, including the need to train AgMIP regional research team (RRT) crop modelers to use models other than the ones they are currently familiar with, plus the need to harmonize and interconvert the disparate input file formats used by the various models. Two activities were undertaken to address these shortcomings and to enable AgMIP RRTs to use multiple models to evaluate climate impacts on crop production and food security. First, we designed and conducted courses in which participants trained on two different sets of crop models, with emphasis on the model with which they had the least experience. Second, the AgMIP IT group created templates for entering data on soils, management, weather, and crops into AgMIP harmonized databases, and developed translation tools for converting the harmonized data into files that are ready for multiple crop model simulations. The strategies for creating and conducting the multi-model course and for developing the entry and translation tools are reviewed in this chapter.
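    The translation-tool idea can be sketched in a few lines: one harmonized daily-weather record converted into two model-specific input formats. The field names and both output formats below are invented for illustration; the actual AgMIP schemas and crop-model file formats differ.

    ```python
    # Hedged sketch of a format-translation tool: a harmonized weather record
    # is written out in two hypothetical crop-model input formats.
    import datetime

    harmonized = {"date": "2010-06-01", "srad": 22.1, "tmax": 31.2,
                  "tmin": 18.4, "rain": 0.0}

    def to_model_a(rec):
        """Hypothetical fixed-column format: YYYYDDD SRAD TMAX TMIN RAIN."""
        day = datetime.date.fromisoformat(rec["date"])
        return "%4d%03d %5.1f %5.1f %5.1f %5.1f" % (
            day.year, day.timetuple().tm_yday,
            rec["srad"], rec["tmax"], rec["tmin"], rec["rain"])

    def to_model_b(rec):
        """Hypothetical CSV format with a different field order."""
        return ",".join(str(rec[k]) for k in ("date", "tmax", "tmin", "rain", "srad"))
    ```

    Keeping a single harmonized schema and writing one small translator per model, as sketched here, avoids the quadratic explosion of pairwise format converters.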

  15. Practical use of advanced mouse models for lung cancer.

    PubMed

    Safari, Roghaiyeh; Meuwissen, Ralph

    2015-01-01

    To date a variety of non-small cell lung cancer (NSCLC) and small cell lung cancer (SCLC) mouse models have been developed that mimic human lung cancer. Chemically induced or spontaneous lung cancer in susceptible inbred strains has been widely used, but the more recent genetically engineered somatic mouse models recapitulate much better the genotype-phenotype correlations found in human lung cancer. Additionally, improved orthotopic transplantation of primary human cancer tissue fragments or cells into lungs of immune-compromised mice can be a valuable tool for preclinical research such as antitumor drug tests. Here we give a short overview of most somatic mouse models for lung cancer that are currently in use. We accompany each model with a description of its practical use and application for all major lung tumor types, as well as of the intratracheal or direct injection of fresh or freeze-thawed tumor cells or tumor cell lines into the lung parenchyma of recipient mice. All somatic mouse models presented here are based on the ability to (in)activate specific alleles at a time point, and in a tissue-specific cell type, of choice. This spatially and temporally controlled induction of genetic lesions allows the selective introduction, in an adult mouse lung, of the main genetic lesions found in human lung cancer. The resulting conditional somatic mouse models can be used as versatile and powerful tools in basic lung cancer research and preclinical translational studies alike. These distinctively advanced lung cancer models permit us to investigate the initiation (cell of origin) and progression of lung cancer, along with response and resistance to drug therapy. Cre/lox or FLP/frt recombinase-mediated methods are now well-established techniques for developing tissue-restricted lung cancer in mice with tumor-suppressor gene and/or oncogene (in)activation. Intranasal or intratracheal administration of engineered adenovirus-Cre or lentivirus-Cre has been optimized for introducing Cre

  16. How Project Management Tools Aid in Association to Advance Collegiate Schools of Business (AACSB) International Maintenance of Accreditation

    ERIC Educational Resources Information Center

    Cann, Cynthia W.; Brumagim, Alan L.

    2008-01-01

    The authors present the case of one business college's use of project management techniques as tools for accomplishing Association to Advance Collegiate Schools of Business (AACSB) International maintenance of accreditation. Using these techniques provides an efficient and effective method of organizing maintenance efforts. In addition, using…

  17. Continued development of modeling tools and theory for RF heating

    SciTech Connect

    1998-12-01

    Mission Research Corporation (MRC) is pleased to present the Department of Energy (DOE) with its renewal proposal to the Continued Development of Modeling Tools and Theory for RF Heating program. The objective of the program is to continue and extend the earlier work done by the proposed principal investigator in the field of modeling radio-frequency (RF) heating experiments in the large tokamak fusion experiments, particularly the Tokamak Fusion Test Reactor (TFTR) device located at Princeton Plasma Physics Laboratory (PPPL). An integral part of this work is the investigation and, in some cases, resolution of theoretical issues which pertain to accurate modeling. MRC is nearing the successful completion of the specified tasks of the Continued Development of Modeling Tools and Theory for RF Heating project. The following tasks are either completed or nearing completion: (1) anisotropic temperature and rotation upgrades; (2) modeling for relativistic ECRH; (3) further documentation of SHOOT and SPRUCE. As a result of the progress achieved under this project, MRC has been urged to continue this effort. Specifically, during the performance of this project two topics were identified by PPPL personnel as new applications of the existing RF modeling tools. These two topics concern (a) future fast-wave current drive experiments on the large tokamaks including TFTR and (b) the interpretation of existing and future RF probe data from TFTR. To address each of these topics requires some modification or enhancement of the existing modeling tools, and the first topic requires resolution of certain theoretical issues to produce self-consistent results. This work falls within the scope of the original project and is more suited to the project's renewal than to the initiation of a new project.

  18. Experiences & Tools from Modeling Instruction Applied to Earth Sciences

    NASA Astrophysics Data System (ADS)

    Cervenec, J.; Landis, C. E.

    2012-12-01

    The Framework for K-12 Science Education calls for stronger curricular connections within the sciences, greater depth of understanding, and tasks higher on Bloom's Taxonomy. Understanding atmospheric sciences draws on core knowledge traditionally taught in physics, chemistry, and in some cases, biology. If this core knowledge is not conceptually sound, well retained, and transferable to new settings, then understanding the causes and consequences of climate change becomes, for a student, a task of memorizing seemingly disparate facts. Fortunately, experiences and conceptual tools have been developed and refined in the nationwide network of Physics Modeling and Chemistry Modeling teachers to build the necessary understanding of conservation of mass, conservation of energy, the particulate nature of matter, kinetic molecular theory, and the particle model of light. Context-rich experiences are first introduced for students to construct an understanding of these principles, and conceptual tools are then deployed for students to resolve misconceptions and deepen their understanding. Using these experiences and conceptual tools takes an investment of instructional time, teacher training, and in some cases, a re-envisioning of the format of the science classroom. There are few financial barriers to implementation, and students gain a greater understanding of the nature of science by going through successive cycles of investigation and refinement of their thinking. This presentation shows how these experiences and tools could be used in an Earth Science course to support students in developing a conceptually rich understanding of the atmosphere and the connections within it.

  19. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    NASA Astrophysics Data System (ADS)

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-04-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers’ understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools within their own science investigations; discussed general technology issues; and explored, evaluated, and taught their peers about a particular modeling tool. Preservice teachers expanded their vision of the software available and the role that software can play in science teaching, but desired fun, easy-to-use software with scientifically accurate information within a clear, familiar learning task. Such conflict provided a fruitful platform for discussion and for potentially advancing preservice teachers’ pedagogical and epistemological understandings.

  20. A Short Review of Ablative-Material Response Models and Simulation Tools

    NASA Technical Reports Server (NTRS)

    Lachaud, Jean; Magin, Thierry E.; Cozmuta, Ioana; Mansour, Nagi N.

    2011-01-01

    A review of the governing equations and boundary conditions used to model the response of ablative materials submitted to a high-enthalpy flow is proposed. The heritage of model-development efforts undertaken in the 1960s is extremely clear: the bases of the models used in the community are mathematically equivalent. Most of the material-response codes implement a single model in which the equation parameters may be modified to model different materials or conditions. The level of fidelity of the models implemented in design tools only slightly varies. Research and development codes are generally more advanced but often not as robust. The capabilities of each of these codes are summarized in a color-coded table along with research and development efforts currently in progress.

  1. Modelling of tunnelling processes and rock cutting tool wear with the particle finite element method

    NASA Astrophysics Data System (ADS)

    Carbonell, Josep Maria; Oñate, Eugenio; Suárez, Benjamín

    2013-09-01

    Underground construction involves all sorts of challenges in the analysis, design, project and execution phases. The dimensions of tunnels and their structural requirements are growing, and so do safety and security demands. New engineering tools are needed to support safer planning and design. This work presents advances in the particle finite element method (PFEM) for the modelling and analysis of tunnelling processes, including the wear of the cutting tools. The PFEM has its foundation in the Lagrangian description of the motion of a continuum built from a set of particles with known physical properties. The method uses a remeshing process combined with the alpha-shape technique to detect the contacting surfaces, and a finite element method for the mechanical computations. A contact procedure has been developed for the PFEM which is combined with a constitutive model for predicting the excavation front and the wear of cutting tools. The material parameters govern the coupling of frictional contact and wear between the interacting domains at the excavation front. The PFEM allows predicting several parameters which are relevant for estimating the performance of a tunnel boring machine, such as wear in the cutting tools, the pressure distribution on the face of the boring machine, and the vibrations produced in the machinery and the adjacent soil/rock. The final aim is to help in the design of the excavating tools and in the planning of tunnelling operations. The applications presented show that the PFEM is a promising technique for the analysis of tunnelling problems.

  2. Process models: analytical tools for managing industrial energy systems

    SciTech Connect

    Howe, S O; Pilati, D A; Balzer, C; Sparrow, F T

    1980-01-01

    How the process models developed at BNL are used to analyze industrial energy systems is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for managing industrial energy systems.

  3. Thermal Model Predictions of Advanced Stirling Radioisotope Generator Performance

    NASA Technical Reports Server (NTRS)

    Wang, Xiao-Yen J.; Fabanich, William Anthony; Schmitz, Paul C.

    2014-01-01

    This presentation describes the capabilities of a three-dimensional thermal power model of the Advanced Stirling Radioisotope Generator (ASRG). The performance of the ASRG is presented for different scenarios, such as a Venus flyby with or without the auxiliary cooling system.

  4. Petri Nets as Modeling Tool for Emergent Agents

    NASA Technical Reports Server (NTRS)

    Bergman, Marto

    2004-01-01

    Emergent agents, those agents whose local interactions can cause unexpected global results, require a method of modeling that is both dynamic and structured. Petri Nets, a modeling tool developed for dynamic discrete event systems of mainly functional agents, provide this, and have the benefit of being an established tool. We present the details of the modeling method here and discuss how to implement its use for modeling agent-based systems. Petri Nets have been used extensively in the modeling of functional agents, those agents who have defined purposes and whose actions should result in a known outcome. However, emergent agents, those agents who have a defined structure but whose interactions cause outcomes that are unpredictable, have not yet found a modeling style that suits them. A problem with formally modeling emergent agents is that any formal modeling style usually expects to show the results of a problem, and the results of problems studied using emergent agents are not apparent from the initial construction. However, the study of emergent agents still requires a method to analyze the agents themselves and to have sensible conversations about the differences and similarities between types of emergent agents. We attempt to correct this problem by applying Petri Nets to the characterization of emergent agents. In doing so, the emergent properties of these agents can be highlighted, and conversation about the nature and compatibility of the differing methods of agent creation can begin.
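    The place/transition formalism the abstract relies on is compact enough to sketch directly. The following is a generic illustration of Petri-net firing semantics, not the paper's specific agent models: a transition is enabled when every input place holds a token, and firing consumes one token per input place and deposits one per output place.

    ```python
    # Minimal place/transition Petri net (illustrative sketch).

    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)   # place name -> token count
            self.transitions = {}          # transition name -> (inputs, outputs)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (list(inputs), list(outputs))

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) >= 1 for p in inputs)

        def fire(self, name):
            if not self.enabled(name):
                raise ValueError("transition %r is not enabled" % name)
            inputs, outputs = self.transitions[name]
            for p in inputs:
                self.marking[p] -= 1
            for p in outputs:
                self.marking[p] = self.marking.get(p, 0) + 1

    # A two-agent handshake: the interaction fires only when both agents are ready.
    net = PetriNet({"agent1_ready": 1, "agent2_ready": 1, "done": 0})
    net.add_transition("interact", ["agent1_ready", "agent2_ready"], ["done"])
    net.fire("interact")
    ```

    Even in this toy example the net captures the local-interaction structure explicitly: global behaviour emerges from which transitions happen to be enabled as the marking evolves.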

  5. Large animal models of atherosclerosis--new tools for persistent problems in cardiovascular medicine.

    PubMed

    Shim, J; Al-Mashhadi, R H; Sørensen, C B; Bentzon, J F

    2016-01-01

    Coronary heart disease and ischaemic stroke caused by atherosclerosis are leading causes of illness and death worldwide. Small animal models have provided insight into the fundamental mechanisms driving early atherosclerosis, but it is increasingly clear that new strategies and research tools are needed to translate these discoveries into improved prevention and treatment of symptomatic atherosclerosis in humans. Key challenges include better understanding of processes in late atherosclerosis, factors affecting atherosclerosis in the coronary bed, and the development of reliable imaging biomarker tools for risk stratification and monitoring of drug effects in humans. Efficient large animal models of atherosclerosis may help tackle these problems. Recent years have seen tremendous advances in gene-editing tools for large animals. This has made it possible to create gene-modified minipigs that develop atherosclerosis with many similarities to humans in terms of predilection for lesion sites and histopathology. Together with existing porcine models of atherosclerosis that are based on spontaneous mutations or severe diabetes, such models open new avenues for translational research in atherosclerosis. In this review, we discuss the merits of different animal models of atherosclerosis and give examples of important research problems where porcine models could prove pivotal for progress.

  6. ISAC: A tool for aeroservoelastic modeling and analysis

    NASA Technical Reports Server (NTRS)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  7. Advances in modelling of condensation phenomena

    SciTech Connect

    Liu, W.S.; Zaltsgendler, E.; Hanna, B.

    1997-07-01

    The physical parameters in the modelling of condensation phenomena in the CANDU reactor system codes are discussed. The experimental programs used for thermal-hydraulic code validation in the Canadian nuclear industry are briefly described. The modelling of vapour generation and in particular condensation plays a key role in modelling of postulated reactor transients. The condensation models adopted in the current state-of-the-art two-fluid CANDU reactor thermal-hydraulic system codes (CATHENA and TUF) are described. As examples of the modelling challenges faced, the simulation of a cold water injection experiment by CATHENA and the simulation of a condensation induced water hammer experiment by TUF are described.

  8. Development of tools for safety analysis of control software in advanced reactors

    SciTech Connect

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  9. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    SciTech Connect

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  10. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  11. Advancing complementary and alternative medicine through social network analysis and agent-based modeling.

    PubMed

    Frantz, Terrill L

    2012-01-01

    This paper introduces the contemporary perspectives and techniques of social network analysis (SNA) and agent-based modeling (ABM) and advocates applying them to advance various aspects of complementary and alternative medicine (CAM). SNA and ABM are invaluable methods for representing, analyzing and projecting complex, relational, social phenomena; they provide both an insightful vantage point and a set of analytic tools that can be useful in a wide range of contexts. Applying these methods in the CAM context can aid the ongoing advances in the CAM field, in both its scientific aspects and in developing broader acceptance in associated stakeholder communities.
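    The SNA side of the argument rests on simple relational measures. The sketch below computes degree centrality on a tiny hypothetical referral network among practitioners; the nodes and ties are invented for illustration and are not data from the paper.

    ```python
    # Degree centrality on a small undirected network (illustrative data).
    # Degree centrality of a node = its number of ties divided by (n - 1).

    edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C")]
    nodes = sorted({n for edge in edges for n in edge})

    def degree_centrality(nodes, edges):
        degree = {n: 0 for n in nodes}
        for u, v in edges:
            degree[u] += 1
            degree[v] += 1
        n = len(nodes)
        return {node: d / (n - 1) for node, d in degree.items()}

    centrality = degree_centrality(nodes, edges)  # "A" is the most central node
    ```

    In a CAM context the same computation could, for instance, flag the practitioners through whom most referrals flow, which is the kind of insight the paper argues SNA can contribute.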

  12. Propel: Tools and Methods for Practical Source Code Model Checking

    NASA Technical Reports Server (NTRS)

    Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; O'Malley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

    2003-01-01

    The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset is called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side-effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.
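    The core of explicit-state model checking, as performed by tools like JPF, can be sketched in a few lines: exhaustively explore the reachable states of a transition system and check a property at each one. This is a toy illustration of the general technique, not Propel's or JPF's actual design.

    ```python
    # Toy explicit-state model checker: breadth-first exploration of a
    # transition system, checking an invariant at every reachable state.
    from collections import deque

    def check_invariant(initial, successors, invariant):
        """Return a state violating the invariant, or None if none is reachable."""
        seen = {initial}
        frontier = deque([initial])
        while frontier:
            state = frontier.popleft()
            if not invariant(state):
                return state
            for nxt in successors(state):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        return None

    # Example: a counter that steps from 0 up to 3. The invariant s <= 3 holds
    # in every reachable state, while s < 3 is violated at state 3.
    succ = lambda s: [s + 1] if s < 3 else []
    ok = check_invariant(0, succ, lambda s: s <= 3)
    bad = check_invariant(0, succ, lambda s: s < 3)
    ```

    Real checkers differ mainly in how states are represented and reduced (hashing, partial-order reduction) and in supporting temporal properties beyond simple invariants, but the reachability core is the same.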

  13. MixSIAR: advanced stable isotope mixing models in R

    EPA Science Inventory

    Background/Question/Methods The development of stable isotope mixing models has coincided with modeling products (e.g. IsoSource, MixSIR, SIAR), where methodological advances are published in parity with software packages. However, while mixing model theory has recently been ex...
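    The arithmetic at the heart of the simplest members of this model family (the IsoSource/SIAR lineage) is worth showing: with two sources and one isotope, the mixture signature delta_mix = f*delta_A + (1 - f)*delta_B determines the source proportion f directly. This is a textbook sketch only; MixSIAR itself fits a Bayesian model with priors, discrimination factors, and error terms.

    ```python
    # Two-source, single-isotope mixing model (textbook sketch, values invented).

    def two_source_fraction(delta_mix, delta_a, delta_b):
        """Proportion of source A in the mixture, from one isotope signature."""
        if delta_a == delta_b:
            raise ValueError("sources are isotopically indistinguishable")
        return (delta_mix - delta_b) / (delta_a - delta_b)

    # e.g. a consumer tissue at -24.0 per mil between sources at -28.0 and -12.0
    f = two_source_fraction(-24.0, -28.0, -12.0)
    ```

    With more sources or isotopes the system becomes under- or over-determined, which is exactly why the Bayesian machinery of MixSIR/SIAR-style packages is needed.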

  14. ADVANCED TECHNIQUES FOR RESERVOIR SIMULATION AND MODELING OF NONCONVENTIONAL WELLS

    SciTech Connect

    Louis J. Durlofsky; Khalid Aziz

    2004-08-20

    Nonconventional wells, which include horizontal, deviated, multilateral and ''smart'' wells, offer great potential for the efficient management of oil and gas reservoirs. These wells are able to contact larger regions of the reservoir than conventional wells and can also be used to target isolated hydrocarbon accumulations. The use of nonconventional wells instrumented with downhole inflow control devices allows for even greater flexibility in production. Because nonconventional wells can be very expensive to drill, complete and instrument, it is important to be able to optimize their deployment, which requires the accurate prediction of their performance. However, predictions of nonconventional well performance are often inaccurate. This is likely due to inadequacies in some of the reservoir engineering and reservoir simulation tools used to model and optimize nonconventional well performance. A number of new issues arise in the modeling and optimization of nonconventional wells. For example, the optimal use of downhole inflow control devices has not been addressed for practical problems. In addition, the impact of geological and engineering uncertainty (e.g., valve reliability) has not been previously considered. In order to model and optimize nonconventional wells in different settings, it is essential that the tools be implemented into a general reservoir simulator. This simulator must be sufficiently general and robust and must in addition be linked to a sophisticated well model. Our research under this five year project addressed all of the key areas indicated above. The overall project was divided into three main categories: (1) advanced reservoir simulation techniques for modeling nonconventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and for coupling the well to the simulator (which includes the accurate calculation of well index and the modeling of multiphase flow in the wellbore

  15. Foot-ankle simulators: A tool to advance biomechanical understanding of a complex anatomical structure.

    PubMed

    Natsakis, Tassos; Burg, Josefien; Dereymaeker, Greta; Jonkers, Ilse; Vander Sloten, Jos

    2016-05-01

    In vitro gait simulations have been available to researchers for more than two decades and have become an invaluable tool for understanding fundamental foot-ankle biomechanics. This has been realised through several incremental technological and methodological developments, such as the actuation of muscle tendons, the increase in controlled degrees of freedom and the use of advanced control schemes. Furthermore, in vitro experimentation enabled performing highly repeatable and controllable simulations of gait during simultaneous measurement of several biomechanical signals (e.g. bone kinematics, intra-articular pressure distribution, bone strain). Such signals cannot always be captured in detail using in vivo techniques, and the importance of in vitro experimentation is therefore highlighted. The information provided by in vitro gait simulations enabled researchers to answer numerous clinical questions related to pathology, injury and surgery. In this article, first an overview of the developments in design and methodology of the various foot-ankle simulators is presented. Furthermore, an overview of the conducted studies is outlined and an example of a study aiming at understanding the differences in kinematics of the hindfoot, ankle and subtalar joints after total ankle arthroplasty is presented. Finally, the limitations and future perspectives of in vitro experimentation and in particular of foot-ankle gait simulators are discussed. It is expected that the biofidelic nature of the controllers will be improved in order to make them more subject-specific and to link foot motion to the simulated behaviour of the entire missing body, providing additional information for understanding the complex anatomical structure of the foot. PMID:27160562

  16. Advances on genetic rat models of epilepsy.

    PubMed

    Serikawa, Tadao; Mashimo, Tomoji; Kuramoto, Takashi; Voigt, Birger; Ohno, Yukihiro; Sasa, Masashi

    2015-01-01

    Considering the suitability of laboratory rats in epilepsy research, we and other groups have been developing genetic models of epilepsy in this species. After epileptic rats or seizure-susceptible rats were sporadically found in outbred stocks, the epileptic traits were usually genetically-fixed by selective breeding. So far, the absence seizure models GAERS and WAG/Rij, audiogenic seizure models GEPR-3 and GEPR-9, generalized tonic-clonic seizure models IER, NER and WER, and Canavan-disease related epileptic models TRM and SER have been established. Dissection of the genetic bases including causative genes in these epileptic rat models would be a significant step toward understanding epileptogenesis. N-ethyl-N-nitrosourea (ENU) mutagenesis provides a systematic approach which allowed us to develop two novel epileptic rat models: heat-induced seizure susceptible (Hiss) rats with an Scn1a missense mutation and autosomal dominant lateral temporal epilepsy (ADLTE) model rats with an Lgi1 missense mutation. In addition, we have established episodic ataxia type 1 (EA1) model rats with a Kcna1 missense mutation derived from the ENU-induced rat mutant stock, and identified a Cacna1a missense mutation in a N-Methyl-N-nitrosourea (MNU)-induced mutant rat strain GRY, resulting in the discovery of episodic ataxia type 2 (EA2) model rats. Thus, epileptic rat models have been established on the two paths: 'phenotype to gene' and 'gene to phenotype'. In the near future, development of novel epileptic rat models will be extensively promoted by the use of sophisticated genome editing technologies.

  17. Advances on genetic rat models of epilepsy

    PubMed Central

    Serikawa, Tadao; Mashimo, Tomoji; Kuramoto, Takashi; Voigt, Birger; Ohno, Yukihiro; Sasa, Masashi

    2014-01-01

    Considering the suitability of laboratory rats in epilepsy research, we and other groups have been developing genetic models of epilepsy in this species. After epileptic rats or seizure-susceptible rats were sporadically found in outbred stocks, the epileptic traits were usually genetically-fixed by selective breeding. So far, the absence seizure models GAERS and WAG/Rij, audiogenic seizure models GEPR-3 and GEPR-9, generalized tonic-clonic seizure models IER, NER and WER, and Canavan-disease related epileptic models TRM and SER have been established. Dissection of the genetic bases including causative genes in these epileptic rat models would be a significant step toward understanding epileptogenesis. N-ethyl-N-nitrosourea (ENU) mutagenesis provides a systematic approach which allowed us to develop two novel epileptic rat models: heat-induced seizure susceptible (Hiss) rats with an Scn1a missense mutation and autosomal dominant lateral temporal epilepsy (ADLTE) model rats with an Lgi1 missense mutation. In addition, we have established episodic ataxia type 1 (EA1) model rats with a Kcna1 missense mutation derived from the ENU-induced rat mutant stock, and identified a Cacna1a missense mutation in a N-Methyl-N-nitrosourea (MNU)-induced mutant rat strain GRY, resulting in the discovery of episodic ataxia type 2 (EA2) model rats. Thus, epileptic rat models have been established on the two paths: ‘phenotype to gene’ and ‘gene to phenotype’. In the near future, development of novel epileptic rat models will be extensively promoted by the use of sophisticated genome editing technologies. PMID:25312505

  18. Advanced Prediction of Tool Wear by Taking the Load History into Consideration

    NASA Astrophysics Data System (ADS)

    Ersoy, K.; Nuernberg, G.; Herrmann, G.; Hoffmann, H.

    2007-04-01

    A disadvantage of conventional methods for simulating the wear occurring in deep drawing processes is that the wear coefficient, and thus wear itself, is assumed to be constant over the loading duration, which, in the case of deep drawing, corresponds to sliding distance and number of punch strokes. In reality, however, wear development is known not to be constant over time. In former studies, the authors presented a method that makes it possible to consider the number of punch strokes in the simulation of wear. This paper introduces a further enhancement of that method: wear is considered as a function of wear work instead of the number of punch strokes. With this approach, the wear coefficients are implemented as a function of wear work and fully account for the load history of the respective node. This enhancement makes it possible to apply the variable wear coefficients to completely different geometries, in which one punch stroke involves different sliding distances or pressure values than the experiments from which the wear coefficients were determined. In this study, deep drawing experiments with a cylindrical cup geometry were carried out to determine the characteristic wear coefficient values as well as their gradients over the tool's life cycle. The die was produced via rapid tooling techniques. Tool wear is predicted with REDSY, wear simulation software developed at the Institute of Metal Forming and Casting, TU-Muenchen. The wear predictions made by this software are based on the results of a conventional deep drawing simulation. A modified Archard model was used for the wear modelling.
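    The incremental form of this approach can be sketched in a few lines. The following is a hypothetical illustration, not the REDSY implementation: it accumulates wear at a single tool node with an Archard-type law, dw = k * p * ds, where the coefficient k is looked up as a function of the wear work accumulated so far (the pressure, sliding distance, and coefficient curve below are invented for illustration).

```python
def archard_wear(history, k_of_work):
    """Accumulate wear depth at a single tool node.

    history   : list of (pressure, sliding_distance) pairs, one per punch stroke
    k_of_work : wear coefficient as a function of accumulated wear work,
                so the load history of the node is taken into account
    """
    wear_work = 0.0   # integral of p * ds over the load history
    depth = 0.0       # accumulated wear depth
    for p, ds in history:
        k = k_of_work(wear_work)   # coefficient depends on history, not on time
        depth += k * p * ds        # incremental Archard law: dw = k * p * ds
        wear_work += p * ds
    return depth

# Illustrative coefficient curve: high run-in wear, then a lower steady state
k = lambda w: 1e-7 if w < 500.0 else 2e-8
total = archard_wear([(50.0, 10.0)] * 20, k)
```

    Because k depends on accumulated wear work rather than stroke count, the same coefficient curve transfers to a geometry where each stroke contributes a different pressure or sliding distance, which is the point of the enhancement described above.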

  19. Open Innovation at NASA: A New Business Model for Advancing Human Health and Performance Innovations

    NASA Technical Reports Server (NTRS)

    Davis, Jeffrey R.; Richard, Elizabeth E.; Keeton, Kathryn E.

    2014-01-01

    This paper describes a new business model for advancing NASA human health and performance innovations and demonstrates how open innovation shaped its development. A 45 percent research and technology development budget reduction drove formulation of a strategic plan grounded in collaboration. We describe the strategy execution, including adoption and results of open innovation initiatives, the challenges of cultural change, and the development of virtual centers and a knowledge management tool to educate and engage the workforce and promote cultural change.

  20. ADAS tools for collisional-radiative modelling of molecules

    NASA Astrophysics Data System (ADS)

    Guzmán, F.; O'Mullane, M.; Summers, H. P.

    2013-07-01

    New theoretical and computational tools for molecular collisional-radiative models are presented. An application to the hydrogen molecule system has been made. At the same time, a structured database has been created where fundamental cross sections and rates for individual processes as well as derived data (effective coefficients) are stored. Relative populations for the vibrational states of the ground electronic state of H2 are presented, and this vibronic-resolution model is compared with an electronic-resolution model in which vibronic transitions are summed over vibrational sub-states. Some new reaction rates are calculated by means of the impact parameter approximation. Computational tools have been developed to automate the process and simplify data assembly. Effective (collisional-radiative) rate coefficients versus temperature and density are presented.
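    At their core, collisional-radiative models solve coupled rate equations dN/dt = R N for the level populations; in quasi-steady state this reduces to a singular linear system closed by a normalization condition. A minimal sketch follows (the two-level rate values are invented for illustration and do not come from the H2 database described above):

```python
def steady_state_populations(rates):
    """Solve dN/dt = R N = 0 with sum(N) = 1 for a small level system.

    rates[i][j] (i != j) is the rate for transitions j -> i (collisional +
    radiative); each diagonal entry is the negative total depopulation rate
    of that level. The singular system is closed by replacing the last rate
    equation with the normalization condition sum(N) = 1.
    """
    n = len(rates)
    a = [row[:] for row in rates]
    a[-1] = [1.0] * n                 # normalization row
    b = [0.0] * (n - 1) + [1.0]
    # Plain Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in reversed(range(n)):
        x[r] = (b[r] - sum(a[r][c] * x[c] for c in range(r + 1, n))) / a[r][r]
    return x

# Hypothetical 2-level system: excitation rate 1e3 s^-1, decay rate 1e8 s^-1
R = [[-1e3, 1e8],
     [1e3, -1e8]]
pops = steady_state_populations(R)
```

    For this toy system the upper-level population settles at the ratio of excitation to depopulation rates (1e-5), the balance a full vibronic-resolution model computes for every state simultaneously.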

  1. Designing a training tool for imaging mental models

    NASA Technical Reports Server (NTRS)

    Dede, Christopher J.; Jayaram, Geetha

    1990-01-01

    The training process can be conceptualized as the student acquiring an evolutionary sequence of classification-problem solving mental models. For example, a physician learns (1) classification systems for patient symptoms, diagnostic procedures, diseases, and therapeutic interventions and (2) interrelationships among these classifications (e.g., how to use diagnostic procedures to collect data about a patient's symptoms in order to identify the disease so that therapeutic measures can be taken). This project developed functional specifications for a computer-based tool, Mental Link, that allows the evaluative imaging of such mental models. The fundamental design approach underlying this representational medium is traversal of virtual cognition space. Typically intangible cognitive entities and links among them are visible as a three-dimensional web that represents a knowledge structure. The tool has a high degree of flexibility and customizability to allow extension to other types of uses, such as a front-end to an intelligent tutoring system, knowledge base, hypermedia system, or semantic network.

  2. SMART (Shop floor Modeling, Analysis and Reporting Tool) Project

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.; Garcia, Maretys L.; Mendoza, Alicia C.; Molina, Louis A.; Correa, Daisy; Wint, Steve; Doice, Gregorie; Reyes, M. Florencia

    1999-01-01

    This document summarizes the design and prototype of the Shop floor Modeling, Analysis, and Reporting Tool (S.M.A.R.T.). A detailed description is found in the full documentation given to the NASA liaison. This documentation is also found on the A.R.I.S.E. Center web site, under a protected directory. Only authorized users can gain access to this site.

  3. An Advanced Sea-Floor Spreading Model.

    ERIC Educational Resources Information Center

    Dutch, Steven I.

    1986-01-01

    Describes models which (1) illustrate spreading that varies in rate from place to place; (2) clearly show transform faults as arcs of small circles; and (3) illustrate what happens near a pole of rotation. The models are easy to construct and have been well received by students. (JN)

  4. Carbon export algorithm advancements in models

    NASA Astrophysics Data System (ADS)

    Çağlar Yumruktepe, Veli; Salihoğlu, Barış

    2015-04-01

    The rate at which anthropogenic CO2 is absorbed by the oceans remains a critical question under investigation by climate researchers. Construction of a complete carbon budget requires better understanding of air-sea exchanges and the processes controlling the vertical and horizontal transport of carbon in the ocean, particularly the biological carbon pump. Improved parameterization of carbon sequestration within ecosystem models is vital to better understand and predict changes in the global carbon cycle. Due to the complexity of the processes controlling particle aggregation, sinking and decomposition, existing ecosystem models necessarily parameterize carbon sequestration using simple algorithms. Development of improved algorithms describing carbon export and sequestration, suitable for inclusion in numerical models, is ongoing work. Unique algorithms from state-of-the-art ecosystem models, together with new experimental results from mesocosm experiments and open-ocean observations, have been inserted into a common 1D pelagic ecosystem model for testing purposes. The model was implemented at the time-series stations in the North Atlantic (BATS, PAP and ESTOC) and evaluated against datasets of carbon export. The algorithms targeted plankton functional types, grazing and vertical movement of zooplankton, and the remineralization, aggregation and ballasting dynamics of organic matter. Ultimately it is intended to feed improved algorithms to the 3D modelling community, for inclusion in coupled numerical models.
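    A common baseline for the export parameterizations discussed here is the Martin curve, which attenuates the sinking particulate organic carbon flux as a power law of depth. A minimal sketch (the reference flux and depths are placeholder values; the exponent b = 0.858 is the canonical open-ocean estimate from Martin et al., 1987, not a result of this study):

```python
def martin_flux(z, f_ref=100.0, z_ref=100.0, b=0.858):
    """Particulate organic carbon flux at depth z via the Martin curve.

    F(z) = F_ref * (z / z_ref) ** (-b), where F_ref is the export flux at
    the reference depth z_ref and b sets how quickly sinking particles are
    remineralized with depth.
    """
    return f_ref * (z / z_ref) ** (-b)

# Fraction of the 100 m export flux that survives to 1000 m depth
fraction_at_1km = martin_flux(1000.0) / martin_flux(100.0)
```

    With the canonical exponent, only about 14% of the flux leaving 100 m reaches 1000 m; algorithms that alter aggregation, ballasting or remineralization effectively reshape this profile.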

  5. Transport of tools and mental representation: is capuchin monkey tool behaviour a useful model of Plio-Pleistocene hominid technology?

    PubMed

    Jalles-Filho, E; Teixeira da Cunha, R G; Salm, R A

    2001-05-01

    Capuchin monkeys display greatly developed tool-using capacities, performing successfully a variety of tool-tasks. Impressed by their achievements in this respect, some investigators have suggested that capuchin tool-using behaviour could be used as a model of the tool behaviour of the first hominids. The transport of tools, a task requiring complex cognitive capabilities, is an essential ingredient in the technological behaviour of the first hominids. Thus, to qualify as a source for modelling hominid behavioural evolution, capuchins would have to exhibit proficiency in the transport of tools. We investigated this problem through experiments designed to elicit the transport of objects. The results showed that the monkeys were able to transport food to be processed with the use of tools, but failed when the tools themselves had to be transported. Our hypothesis is that a limited capacity for abstract representation, together with the lack of a regulatory system ensuring that the food would not be lost and consumed by another individual during the search for and transport of the tools, was responsible for this failure. We conclude that the tool-using behaviour of capuchins presents no functional analogy with the tool behaviour of the Plio-Pleistocene hominids, and that capuchin monkeys are a very inadequate source for modelling Plio-Pleistocene hominids' technological behaviour.

  6. Recent modelling advances for ultrasonic TOFD inspections

    SciTech Connect

    Darmon, Michel; Ferrand, Adrien; Dorval, Vincent; Chatillon, Sylvain; Lonné, Sébastien

    2015-03-31

    The ultrasonic TOFD (Time of Flight Diffraction) technique is commonly used to detect and characterize disoriented cracks using their edge diffraction echoes. An overview of the models integrated in the CIVA software platform and devoted to TOFD simulation is presented. CIVA can predict diffraction echoes from complex 3D flaws using a model based on the PTD (Physical Theory of Diffraction). Other dedicated developments have been added to simulate lateral waves in 3D on planar entry surfaces and in 2D on irregular surfaces by a ray approach. Calibration echoes from Side Drilled Holes (SDHs), specimen echoes and shadowing effects from flaws can also be modelled. Some examples of theoretical validation of the models are presented. In addition, experimental validations have been performed both on planar blocks containing calibration holes and various notches and on a specimen with an irregular entry surface, allowing conclusions to be drawn on the validity of all the developed models.

  7. An advanced terrain modeler for an autonomous planetary rover

    NASA Technical Reports Server (NTRS)

    Hunter, E. L.

    1980-01-01

    A roving vehicle capable of autonomously exploring the surface of an alien world is under development and an advanced terrain modeler to characterize the possible paths of the rover as hazardous or safe is presented. This advanced terrain modeler has several improvements over the Troiani modeler that include: a crosspath analysis, better determination of hazards on slopes, and methods for dealing with missing returns at the extremities of the sensor field. The results from a package of programs to simulate the roving vehicle are then examined and compared to results from the Troiani modeler.

  8. Networking Sensor Observations, Forecast Models & Data Analysis Tools

    NASA Astrophysics Data System (ADS)

    Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

    2009-12-01

    This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of the wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations and a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We describe the process for enabling the fire location, smoke forecast, smoke observation, and

  9. [Population surveys as management tools and health care models].

    PubMed

    Andrade, Flávia Reis de; Narvai, Paulo Capel

    2013-12-01

    The article briefly systematizes health care models, emphasizes the role of population surveys as a management tool and analyzes the specific case of the Brazilian Oral Health Survey (SBBrasil 2010) and its contribution to the process of consolidating health care models consistent with the principles of the Sistema Único de Saúde (SUS, Public Health Care System). While in legal terms SUS corresponds to a health care model, in the actual practice of public policy planning and health action the system gives rise to a care model that is the result not of legal texts or theoretical formulations but of the praxis of the personnel involved. Bearing in mind that the management of day-to-day health affairs is a privileged space for the production and consolidation of health care models, it is necessary to stimulate and support the development of technical and operational skills different from those required for the management of care related to individual demands.

  10. Visualization Skills: A Prerequisite to Advanced Solid Modeling

    ERIC Educational Resources Information Center

    Gow, George

    2007-01-01

    Many educators believe that solid modeling software has made teaching two- and three-dimensional visualization skills obsolete. They claim that the visual tools built into the solid modeling software serve as a replacement for the CAD operator's personal visualization skills. They also claim that because solid modeling software can produce…

  11. Advances and applications of occupancy models

    USGS Publications Warehouse

    Bailey, Larissa; MacKenzie, Darry I.; Nichols, James D.

    2013-01-01

    Summary: The past decade has seen an explosion in the development and application of models aimed at estimating species occurrence and occupancy dynamics while accounting for possible non-detection or species misidentification. We discuss some recent occupancy estimation methods and the biological systems that motivated their development. Collectively, these models offer tremendous flexibility, but simultaneously place added demands on the investigator. Unlike many mark–recapture scenarios, investigators utilizing occupancy models have the ability, and responsibility, to define their sample units (i.e. sites), replicate sampling occasions, time period over which species occurrence is assumed to be static and even the criteria that constitute ‘detection’ of a target species. Subsequent biological inference and interpretation of model parameters depend on these definitions and the ability to meet model assumptions. We demonstrate the relevance of these definitions by highlighting applications from a single biological system (an amphibian–pathogen system) and discuss situations where the use of occupancy models has been criticized. Finally, we use these applications to suggest future research and model development.
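    The core idea these models share can be illustrated with the single-season occupancy likelihood, in which an all-zero detection history is ambiguous between a site being unoccupied and being occupied but undetected. A minimal sketch with a crude grid-search maximum-likelihood fit (the detection histories and grid resolution below are invented for illustration):

```python
import math

def occupancy_nll(psi, p, histories):
    """Negative log-likelihood of a single-season occupancy model.

    histories : per-site detection histories (tuples of 0/1), each with
                K replicate survey occasions
    psi       : probability that a site is occupied
    p         : per-survey detection probability, given occupancy
    Sites with at least one detection are certainly occupied; all-zero
    sites mix "unoccupied" with "occupied but never detected".
    """
    nll = 0.0
    for h in histories:
        k, d = len(h), sum(h)
        if d > 0:
            lik = psi * p**d * (1 - p) ** (k - d)
        else:
            lik = psi * (1 - p) ** k + (1 - psi)
        nll -= math.log(lik)
    return nll

# Toy data: 3-survey histories at 6 sites; 4 sites have detections
data = [(1, 0, 1), (0, 0, 0), (1, 1, 1), (0, 1, 0), (0, 0, 0), (1, 0, 0)]
grid = [i / 100 for i in range(1, 100)]
psi_hat, p_hat = min(((a, b) for a in grid for b in grid),
                     key=lambda ab: occupancy_nll(ab[0], ab[1], data))
```

    The fitted psi exceeds the naive proportion of sites with detections (4/6), because the model attributes part of the all-zero histories to non-detection, which is exactly the correction the abstract describes.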

  12. State of the art: diagnostic tools and innovative therapies for treatment of advanced thymoma and thymic carcinoma.

    PubMed

    Ried, Michael; Marx, Alexander; Götz, Andrea; Hamer, Okka; Schalke, Berthold; Hofmann, Hans-Stefan

    2016-06-01

    In this review article, state-of-the-art diagnostic tools and innovative treatments of thymoma and thymic carcinoma (TC) are described with special respect to advanced tumour stages. Complete surgical resection (R0) remains the standard therapeutic approach for almost all a priori resectable mediastinal tumours as defined by preoperative standard computed tomography (CT). If lymphoma or germ-cell tumours are differential diagnostic considerations, biopsy may be indicated. Resection status is the most important prognostic factor in thymoma and TC, followed by tumour stage. Advanced (Masaoka-Koga stage III and IVa) tumours require interdisciplinary therapy decisions based on distinctive findings of preoperative CT scan and ancillary investigations [magnetic resonance imaging (MRI)] to select cases for primary surgery or neoadjuvant strategies with optional secondary resection. In neoadjuvant settings, octreotide scans and histological evaluation of pretherapeutic needle biopsies may help to choose between somatostatin agonist/prednisolone regimens and neoadjuvant chemotherapy as first-line treatment. Finally, a multimodality treatment regime is recommended for advanced and unresectable thymic tumours. In conclusion, advanced stage thymoma and TC should preferably be treated in experienced centres in order to provide all modern diagnostic tools (imaging, histology) and innovative therapy techniques. Systemic and local (hyperthermic intrathoracic chemotherapy) medical treatments together with extended surgical resections have increased the therapeutic options in patients with advanced or recurrent thymoma and TC.

  13. Evaluation of air pollution modelling tools as environmental engineering courseware.

    PubMed

    Souto González, J A; Bello Bugallo, P M; Casares Long, J J

    2004-01-01

    The study of phenomena related to the dispersion of pollutants usually takes advantage of mathematical models based on the description of the different processes involved. This educational approach is especially important in air pollution dispersion, where the processes follow non-linear behaviour, making it difficult to understand the relationships between inputs and outputs, and in a 3D context where it becomes hard to analyze alphanumeric results. In this work, three different software tools, each a computer solver for typical air pollution dispersion phenomena, are presented. Each tool, developed for PCs, follows an approach representing one of three generations of programming languages (Fortran 77, Visual Basic and Java), applied over three different environments: MS-DOS, MS-Windows and the world wide web. The software tools were tested by students of environmental engineering (undergraduate) and chemical engineering (postgraduate), in order to evaluate the ability of these software tools to improve both theoretical and practical knowledge of the air pollution dispersion problem, and the impact of the different environments on the learning process in terms of content, ease of use and visualization of results. PMID:15193095
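    The kind of dispersion problem such courseware solves can be illustrated with the steady-state Gaussian plume equation, the standard textbook model relating a point-source emission to downwind concentration. A minimal sketch (the emission rate, wind speed and dispersion parameters below are placeholder values, not taken from the courseware described):

```python
import math

def gaussian_plume(q, u, sigma_y, sigma_z, y, z, h):
    """Steady-state Gaussian plume concentration with ground reflection.

    q: emission rate (g/s), u: wind speed (m/s), h: effective stack height (m),
    sigma_y, sigma_z: dispersion parameters (m) at the receptor's downwind
    distance, y: crosswind offset (m), z: receptor height (m).
    """
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                math.exp(-(z + h)**2 / (2 * sigma_z**2)))  # image-source term
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Centerline, ground-level concentration (g/m^3) for a 50 m stack
c = gaussian_plume(q=100.0, u=5.0, sigma_y=80.0, sigma_z=40.0,
                   y=0.0, z=0.0, h=50.0)
```

    The non-linear coupling of stack height and the two dispersion parameters in the exponentials is precisely the input-output relationship that is hard to grasp from alphanumeric results alone, which motivates the visualization emphasis of the tools.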

  14. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    SciTech Connect

    Not Available

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to this rapid improvement. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes, that fully utilizes the hardware and software capabilities of new computer architectures, that probes the limits of climate predictability, and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  15. Advanced Concepts for Underwater Acoustic Channel Modeling

    NASA Astrophysics Data System (ADS)

    Etter, P. C.; Haas, C. H.; Ramani, D. V.

    2014-12-01

    This paper examines nearshore underwater-acoustic channel modeling concepts and compares channel-state information requirements against existing modeling capabilities. This process defines a subset of candidate acoustic models suitable for simulating signal propagation in underwater communications. Underwater-acoustic communications find many practical applications in coastal oceanography, and networking is the enabling technology for these applications. Such networks can be formed by establishing two-way acoustic links between autonomous underwater vehicles and moored oceanographic sensors. These networks can be connected to a surface unit for further data transfer to ships, satellites, or shore stations via a radio-frequency link. This configuration establishes an interactive environment in which researchers can extract real-time data from multiple, but distant, underwater instruments. After evaluating the obtained data, control messages can be sent back to individual instruments to adapt the networks to changing situations. Underwater networks can also be used to increase the operating ranges of autonomous underwater vehicles by hopping the control and data messages through networks that cover large areas. A model of the ocean medium between acoustic sources and receivers is called a channel model. In an oceanic channel, characteristics of the acoustic signals change as they travel from transmitters to receivers. These characteristics depend upon the acoustic frequency, the distances between sources and receivers, the paths followed by the signals, and the prevailing ocean environment in the vicinity of the paths. Properties of the received signals can be derived from those of the transmitted signals using these channel models. This study concludes that ray-theory models are best suited to the simulation of acoustic signal propagation in oceanic channels and identifies 33 such models that are eligible candidates.
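    Ray-theory channel models ultimately predict how strongly a transmitted signal is attenuated along each path. The dominant range-frequency trade-off can be sketched with spherical spreading plus Thorp's empirical absorption formula (the range and carrier frequency below are placeholders; real channel models add boundary interactions and multipath):

```python
import math

def thorp_absorption_db_per_km(f_khz):
    """Thorp's empirical seawater absorption coefficient (dB/km), f in kHz."""
    f2 = f_khz ** 2
    return (0.11 * f2 / (1 + f2) + 44 * f2 / (4100 + f2)
            + 2.75e-4 * f2 + 0.003)

def transmission_loss_db(range_m, f_khz):
    """Spherical spreading plus absorption: TL = 20 log10(r) + alpha * r."""
    return (20 * math.log10(range_m)
            + thorp_absorption_db_per_km(f_khz) * range_m / 1000.0)

# One-way loss over a 1 km acoustic link at a 10 kHz carrier
tl = transmission_loss_db(range_m=1000.0, f_khz=10.0)
```

    Because absorption grows rapidly with frequency, long-range links must use low carriers with little bandwidth, which is why multi-hop networking is attractive for covering large areas.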

  16. A Tool for Sharing Empirical Models of Climate Impacts

    NASA Astrophysics Data System (ADS)

    Rising, J.; Kopp, R. E.; Hsiang, S. M.

    2013-12-01

    Scientists, policy advisors, and the public struggle to synthesize the quickly evolving empirical work on climate change impacts. The Integrated Assessment Models (IAMs) used to estimate the impacts of climate change and the effects of adaptation and mitigation policies can also benefit greatly from recent empirical results (Kopp, Hsiang & Oppenheimer, Impacts World 2013 discussion paper). This paper details a new online tool for exploring, analyzing, combining, and communicating a wide range of impact results, and supporting their integration into IAMs. The tool uses a new database of statistical results, which researchers can expand both in depth (by providing additional results describing existing relationships) and breadth (by adding new relationships). Scientists can use the tool to quickly perform meta-analyses of related results, using Bayesian techniques to produce pooled and partially-pooled posterior distributions. Policy advisors can apply the statistical results to particular contexts, and combine different kinds of results in a cost-benefit framework. For example, models of the impact of temperature changes on agricultural yields can be first aggregated to build a best-estimate of the effect under given assumptions, then compared across countries using different temperature scenarios, and finally combined to estimate a social cost of carbon. The general public can better understand the many estimates of climate impacts and their range of uncertainty by exploring these results dynamically, with maps, bar charts, and dose-response-style plots. [Figure: Front page of the climate impacts tool website; sample "collections" of models, within which all results are estimates of the same fundamental relationship, are shown on the right.] [Figure: Simple pooled result for Gelman's "8 schools" example; pooled results are calculated analytically, while partial-pooling (Bayesian hierarchical estimation) uses posterior simulations.]
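    The "pooled results calculated analytically" for the 8-schools example amount to an inverse-variance weighted mean (complete pooling). A minimal sketch using the dataset as reported in Gelman et al., Bayesian Data Analysis (the partial-pooling step, which requires posterior simulation, is not shown):

```python
# Gelman's "8 schools" data: estimated treatment effects and standard errors
y = [28.0, 8.0, -3.0, 7.0, -1.0, 1.0, 18.0, 12.0]
sigma = [15.0, 10.0, 16.0, 11.0, 9.0, 11.0, 10.0, 18.0]

# Complete pooling: each study weighted by the inverse of its variance
weights = [1.0 / s**2 for s in sigma]
pooled_mean = sum(w * yi for w, yi in zip(weights, y)) / sum(weights)
pooled_se = (1.0 / sum(weights)) ** 0.5
```

    Partial pooling would shrink each school's estimate toward this pooled mean by an amount governed by the between-school variance, which is why the tool resorts to posterior simulation for that case.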

  17. Advances in NLTE Modeling for Integrated Simulations

    SciTech Connect

    Scott, H A; Hansen, S B

    2009-07-08

    The last few years have seen significant progress in constructing the atomic models required for non-local thermodynamic equilibrium (NLTE) simulations. Along with this has come an increased understanding of the requirements for accurately modeling the ionization balance, energy content and radiative properties of different elements for a wide range of densities and temperatures. Much of this progress is the result of a series of workshops dedicated to comparing the results from different codes and computational approaches applied to a series of test problems. The results of these workshops emphasized the importance of atomic model completeness, especially in doubly excited states and autoionization transitions, to calculating ionization balance, and the importance of accurate, detailed atomic data to producing reliable spectra. We describe a simple screened-hydrogenic model that calculates NLTE ionization balance with surprising accuracy, at a low enough computational cost for routine use in radiation-hydrodynamics codes. The model incorporates term splitting, Δn = 0 transitions, and approximate UTA widths for spectral calculations, with results comparable to those of much more detailed codes. Simulations done with this model have been increasingly successful at matching experimental data for laser-driven systems and hohlraums. Accurate and efficient atomic models are just one requirement for integrated NLTE simulations. Coupling the atomic kinetics to hydrodynamics and radiation transport constrains both discretizations and algorithms to retain energy conservation, accuracy and stability. In particular, the strong coupling between radiation and populations can require either very short timesteps or significantly modified radiation transport algorithms to account for NLTE material response. Considerations such as these continue to provide challenges for NLTE simulations.

  18. Community Surface Dynamics Modeling System and its CSDMS Modeling Tool to couple models and data (Invited)

    NASA Astrophysics Data System (ADS)

    Syvitski, J. P.; CSDMS Scientific and Software Team

    2010-12-01

    CSDMS is the virtual home for a diverse community who foster and promote the modeling of earth surface processes, with emphasis on the movement of fluids, sediment and solutes through landscapes, seascapes and through their sedimentary basins. CSDMS develops, integrates, disseminates & archives software (>150 models and more than 3 million lines of code) that reflects and predicts earth surface processes over a broad range of time and space scales. CSDMS deals with the Earth's surface—the ever-changing, dynamic interface between lithosphere, hydrosphere, cryosphere, and atmosphere. CSDMS employs state-of-the-art architectures, interface standards and frameworks that make it possible to convert stand-alone models into flexible, "plug-and-play" components that can be assembled into larger applications. The CSDMS model-coupling environment offers language interoperability and structured and unstructured grids, and serves as a migration pathway for surface dynamics modelers towards High-Performance Computing (HPC). The CSDMS Modeling Tool (CMT) is a key product of the overall project, as it allows earth scientists with relatively modest computer coding experience to use the CSDMS modules for earth surface dynamics research and education. CMT is platform-independent and can easily couple models that have followed the CSDMS protocols for model contribution: 1) Open-source license; 2) Available; 3) Vetted; 4) Open-source language; 5) Refactored for componentization; 6) Metadata & test files; 7) Clean and documented using keywords.

  19. Advanced Numerical Model for Irradiated Concrete

    SciTech Connect

    Giorla, Alain B.

    2015-03-01

    In this report, we establish a numerical model for concrete exposed to irradiation to address these three critical points. The model accounts for creep in the cement paste and its coupling with damage, temperature and relative humidity. The shift in failure mode with the loading rate is also properly represented. The numerical model for creep has been validated and calibrated against different experiments in the literature [Wittmann, 1970, Le Roy, 1995]. Results from a simplified model showcase the ability of numerical homogenization to simulate irradiation effects in concrete. In future work, the complete model will be applied to the analysis of the irradiation experiments of Elleuch et al. [1972] and Kelly et al. [1969]. This requires a careful examination of the experimental environmental conditions, as in both cases certain critical information is missing, including the relative humidity history. A sensitivity analysis will be conducted to provide lower and upper bounds of the concrete expansion under irradiation, and to check whether the scatter in the simulated results matches that found in experiments. The numerical and experimental results will be compared in terms of expansion and loss of mechanical stiffness and strength. Both effects should be captured accordingly by the model to validate it. Once the model has been validated on these two experiments, it can be applied to simulate concrete from nuclear power plants. To do so, the materials used in these concretes must be characterized as well as possible. The main parameters required are the mechanical properties of each constituent in the concrete (aggregates, cement paste), namely the elastic modulus, the creep properties, the tensile and compressive strength, the thermal expansion coefficient, and the drying shrinkage. These can be either measured experimentally, estimated from the initial composition in the case of cement paste, or back-calculated from mechanical tests on concrete. If some

  20. Advances in Modeling Exploding Bridgewire Initiation

    SciTech Connect

    Hrousis, C A; Christensen, J S

    2010-03-10

    There is great interest in applying magnetohydrodynamic (MHD) simulation techniques to the designs of electrical high explosive (HE) initiators, for the purpose of better understanding a design's sensitivities, optimizing its performance, and/or predicting its useful lifetime. Two MHD-capable LLNL codes, CALE and ALE3D, are being used to simulate the process of ohmic heating, vaporization, and plasma formation in exploding bridgewires (EBW). Initiation of the HE is simulated using Ignition & Growth reactive flow models. 1-D, 2-D and 3-D models have been constructed and studied. The models provide some intuitive explanation of the initiation process and are useful for evaluating the potential impact of identified aging mechanisms (such as the growth of intermetallic compounds or powder sintering). The end product of this work is a simulation capability for evaluating margin in proposed, modified or aged initiation system designs.

  1. Phenomenological Modeling of Infrared Sources: Recent Advances

    NASA Technical Reports Server (NTRS)

    Leung, Chun Ming; Kwok, Sun (Editor)

    1993-01-01

    Infrared observations from planned space facilities (e.g., ISO (Infrared Space Observatory), SIRTF (Space Infrared Telescope Facility)) will yield a large and uniform sample of high-quality data from both photometric and spectroscopic measurements. To maximize the scientific returns of these space missions, complementary theoretical studies must be undertaken to interpret these observations. A crucial step in such studies is the construction of phenomenological models in which we parameterize the observed radiation characteristics in terms of the physical source properties. In the last decade, models with increasing degree of physical realism (in terms of grain properties, physical processes, and source geometry) have been constructed for infrared sources. Here we review current capabilities available in the phenomenological modeling of infrared sources and discuss briefly directions for future research in this area.

  2. Integrating Cache Performance Modeling and Tuning Support in Parallelization Tools

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    With the resurgence of distributed shared memory (DSM) systems based on cache-coherent Non-Uniform Memory Access (ccNUMA) architectures and the increasing disparity between memory and processor speeds, data locality overheads are becoming the greatest bottleneck to realizing the potential high performance of these systems. While parallelization tools and compilers help users port their sequential applications to a DSM system, considerable time and effort are still needed to tune the memory performance of these applications to achieve reasonable speedup. In this paper, we show that integrating cache performance modeling and tuning support within a parallelization environment can alleviate this problem. The Cache Performance Modeling and Prediction Tool (CPMP) employs trace-driven simulation techniques without the overhead of generating and managing detailed address traces. CPMP predicts the cache performance impact of source-code-level "what-if" modifications in a program to assist the user in the tuning process. CPMP is built on top of a customized version of the Computer Aided Parallelization Tools (CAPTools) environment. Finally, we demonstrate how CPMP can be applied to tune a real Computational Fluid Dynamics (CFD) application.
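
    The kind of "what-if" question such a tool answers can be sketched with a toy trace-driven simulator of a direct-mapped cache (a stand-in illustration only; CPMP itself avoids generating full address traces, and the cache geometry and array sizes here are assumptions):

```python
def cache_hit_rate(addresses, line_bytes=64, num_lines=256):
    """Simulate a direct-mapped cache; return the hit rate for an address trace."""
    tags = [None] * num_lines
    hits = 0
    for addr in addresses:
        block = addr // line_bytes       # which cache line the address falls in
        idx = block % num_lines          # direct-mapped set index
        if tags[idx] == block:
            hits += 1
        else:
            tags[idx] = block            # miss: evict and fill
    return hits / len(addresses)

# "What-if": row-major vs column-major traversal of a 512x512 array of doubles
N, elem = 512, 8
row_major = [(i * N + j) * elem for i in range(N) for j in range(N)]
col_major = [(i * N + j) * elem for j in range(N) for i in range(N)]
print(cache_hit_rate(row_major), cache_hit_rate(col_major))
```

    Sequential (row-major) traversal hits on 7 of every 8 accesses to a 64-byte line, while the strided column-major traversal thrashes the cache, exactly the sort of source-level locality effect a tuning tool must expose.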

  3. Logic flowgraph methodology - A tool for modeling embedded systems

    NASA Technical Reports Server (NTRS)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of using such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  4. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    PubMed Central

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science. PMID:26795950

  5. Computational science: shifting the focus from tools to models

    PubMed Central

    Hinsen, Konrad

    2014-01-01

    Computational techniques have revolutionized many aspects of scientific research over the last few decades. Experimentalists use computation for data analysis, processing ever bigger data sets. Theoreticians compute predictions from ever more complex models. However, traditional articles do not permit the publication of big data sets or complex models. As a consequence, these crucial pieces of information no longer enter the scientific record. Moreover, they have become prisoners of scientific software: many models exist only as software implementations, and the data are often stored in proprietary formats defined by the software. In this article, I argue that this emphasis on software tools over models and data is detrimental to science in the long term, and I propose a means by which this can be reversed. PMID:25309728

  6. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    PubMed

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  7. Modeling Innovations Advance Wind Energy Industry

    NASA Technical Reports Server (NTRS)

    2009-01-01

    In 1981, Glenn Research Center scientist Dr. Larry Viterna developed a model that predicted certain elements of wind turbine performance with far greater accuracy than previous methods. The model was met with derision from others in the wind energy industry, but years later, Viterna discovered it had become the most widely used method of its kind, enabling significant wind energy technologies, like the fixed-pitch turbines produced by manufacturers such as Aerostar Inc. of Westport, Massachusetts, that are providing sustainable, climate-friendly energy sources today.
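
    The Viterna method extrapolates airfoil lift and drag coefficients from the stall angle out to 90 degrees, the regime fixed-pitch turbines operate in. A minimal sketch using the commonly published form of the equations; the stall-point values and aspect ratio below are illustrative assumptions, not data from any particular turbine:

```python
import math

def viterna_post_stall(alpha_deg, alpha_s_deg, cl_s, cd_s, aspect_ratio):
    """Viterna extrapolation of lift/drag coefficients beyond stall (alpha_s..90 deg)."""
    a, a_s = math.radians(alpha_deg), math.radians(alpha_s_deg)
    cd_max = 1.11 + 0.018 * aspect_ratio            # flat-plate limit for finite AR
    # Coefficients chosen so CL and CD are continuous at the stall point
    a2 = (cl_s - cd_max * math.sin(a_s) * math.cos(a_s)) * math.sin(a_s) / math.cos(a_s)**2
    b2 = (cd_s - cd_max * math.sin(a_s)**2) / math.cos(a_s)
    cl = (cd_max / 2.0) * math.sin(2 * a) + a2 * math.cos(a)**2 / math.sin(a)
    cd = cd_max * math.sin(a)**2 + b2 * math.cos(a)
    return cl, cd
```

    By construction the extrapolation matches the supplied stall-point coefficients and tends to the flat-plate drag limit at 90 degrees.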

  8. Development of 3D multimedia with advanced computer animation tools for outreach activities related to Meteor Science and Meteoritics

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.

    2012-09-01

    Documentaries related to Astronomy and Planetary Sciences are a common and very attractive way to promote public interest in these areas. These educational tools can benefit from new advanced computer animation software and 3D technologies, which make such documentaries even more attractive. However, special care must be taken to guarantee that the information contained in them is rigorous and objective. In this sense, additional value is gained when the footage is produced by the researchers themselves. With this aim, a new documentary produced and directed by Prof. Madiedo has been developed. The documentary, which has been created entirely by means of advanced computer animation tools, is dedicated to several aspects of Meteor Science and Meteoritics. The main features of this outreach and education initiative are presented here.

  9. Smart Engines Via Advanced Model Based Controls

    SciTech Connect

    Allain, Marc

    2000-08-20

    A "new" process for developing control systems:
    - Less engine testing
    - More robust control system
    - Shorter development cycle time
    A "smarter" approach to engine control:
    - On-board models describe engine behavior
    - Shorter, systematic calibration process
    - Customer and legislative requirements designed-in

  10. Advances in Swine Biomedical Model Genomics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The swine has been a major biomedical model species, for transplantation, heart disease, allergies and asthma, as well as normal neonatal development and reproductive physiology. Swine have been used extensively for studies of infectious disease processes and analyses of preventative strategies, inc...

  11. Measurement and modeling of advanced coal conversion processes

    SciTech Connect

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G.; Smoot, L.D.; Brewster, B.S. Brigham Young Univ., Provo, UT )

    1991-01-01

    The overall objective of this program is the development of predictive capability for design, scale-up, simulation, control and feedstock evaluation in advanced coal conversion devices. The program merges significant advances in measuring and quantitatively describing the mechanisms of coal conversion behavior into comprehensive computer codes for mechanistic modeling of entrained-bed gasification. Additional capabilities for predicting pollutant formation will be implemented, and the technology will be extended to fixed-bed reactors.

  12. Homology modeling a fast tool for drug discovery: current perspectives.

    PubMed

    Vyas, V K; Ukawala, R D; Ghate, M; Chintha, C

    2012-01-01

    A major goal of structural biology is the characterization of protein-ligand complexes, in which the protein molecules interact energetically with ligands in the course of binding. An understanding of protein-ligand interactions is therefore very important for structure-based drug design. Lack of knowledge of 3D structures has hindered efforts to understand the binding specificities of ligands with proteins. With improvements in modeling software and the growing number of known protein structures, homology modeling is rapidly becoming the method of choice for obtaining 3D coordinates of proteins. Homology modeling exploits the similarity of environmental residues at topologically corresponding positions in reference proteins. In the absence of experimental data, model building on the basis of the known 3D structure of a homologous protein is at present the only reliable method to obtain structural information. Knowledge of the 3D structures of proteins provides invaluable insights into the molecular basis of their functions. Recent advances in homology modeling, particularly in detecting and aligning sequences with template structures and distant homologues, modeling loops and side chains, and detecting errors in a model, have contributed to consistent prediction of protein structure, which was not possible even several years ago. This review focuses on the features and role of homology modeling in predicting protein structure and describes current developments in the field, with successful applications at different stages of drug design and discovery.

  13. Diagnostic tools for mixing models of stream water chemistry

    USGS Publications Warehouse

    Hooper, R.P.

    2003-01-01

    Mixing models provide a useful null hypothesis against which to evaluate processes controlling stream water chemical data. Because conservative mixing of end-members with constant concentration is a linear process, a number of simple mathematical and multivariate statistical methods can be applied to this problem. Although mixing models have been most typically used in the context of mixing soil and groundwater end-members, an extension of the mathematics of mixing models is presented that assesses the "fit" of a multivariate data set to a lower dimensional mixing subspace without the need for explicitly identified end-members. Diagnostic tools are developed to determine the approximate rank of the data set and to assess lack of fit of the data. This permits identification of processes that violate the assumptions of the mixing model and can suggest the dominant processes controlling stream water chemical variation. These same diagnostic tools can be used to assess the fit of the chemistry of one site into the mixing subspace of a different site, thereby permitting an assessment of the consistency of controlling end-members across sites. This technique is applied to a number of sites at the Panola Mountain Research Watershed located near Atlanta, Georgia.
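
    The rank and lack-of-fit diagnostics described above can be sketched with an SVD projection of the solute data onto a low-dimensional mixing subspace (the synthetic end-members and the simple relative-residual statistic below are assumptions for illustration, not Hooper's exact diagnostics):

```python
import numpy as np

def subspace_residuals(X, k):
    """Project solute data X (samples x solutes) onto its best k-dimensional
    mean-centered subspace and return the relative residual norm (lack of fit)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    Xhat = U[:, :k] * s[:k] @ Vt[:k]          # rank-k reconstruction
    return np.linalg.norm(Xc - Xhat) / np.linalg.norm(Xc)

# Synthetic stream chemistry: conservative mixing of two end-members, so the
# centered data should lie in a 1-dimensional subspace.
rng = np.random.default_rng(0)
e1 = np.array([10.0, 2.0, 0.5, 30.0])    # end-member 1 solute concentrations
e2 = np.array([1.0, 8.0, 4.0, 5.0])      # end-member 2
f = rng.uniform(0, 1, size=200)[:, None] # mixing fractions
X = f * e1 + (1 - f) * e2
```

    For data generated by conservative two-end-member mixing the k=1 residual is essentially zero; processes that violate the mixing assumptions show up as residual structure at the chosen rank.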

  14. Advanced air revitalization system modeling and testing

    NASA Technical Reports Server (NTRS)

    Dall-Baumann, Liese; Jeng, Frank; Christian, Steve; Edeer, Marybeth; Lin, Chin

    1990-01-01

    To support manned lunar and Martian exploration, an extensive evaluation of air revitalization subsystems (ARS) is being conducted. The major operations under study include carbon dioxide removal and reduction; oxygen and nitrogen production, storage, and distribution; humidity and temperature control; and trace contaminant control. A comprehensive analysis program based on a generalized block flow model was developed to facilitate the evaluation of various processes and their interaction. ASPEN PLUS was used in modelling carbon dioxide removal and reduction. Several life support test stands were developed to test new and existing technologies for their potential applicability in space. The goal was to identify processes which use compact, lightweight equipment and maximize the recovery of oxygen and water. The carbon dioxide removal test stands include solid amine/vacuum desorption (SAVD), regenerative silver oxide chemisorption, and electrochemical carbon dioxide concentration (EDC). Membrane-based carbon dioxide removal and humidity control, catalytic reduction of carbon dioxide, and catalytic oxidation of trace contaminants were also investigated.

  15. Advanced Numerical Modeling of Turbulent Atmospheric Flows

    NASA Astrophysics Data System (ADS)

    Kühnlein, Christian; Dörnbrack, Andreas; Gerz, Thomas

    The present chapter introduces the method of computational simulation to predict and study turbulent atmospheric flows. This includes a description of the fundamental approach to computational simulation and the practical implementation using the technique of large-eddy simulation. In addition, selected contributions from IPA scientists to computational model development and various examples for applications are given. These examples include homogeneous turbulence, convective boundary layers, heated forest canopy, buoyant thermals, and large-scale flows with baroclinic wave instability.

  16. FAST: A Fuel And Sheath Modeling Tool for CANDU Reactor Fuel

    NASA Astrophysics Data System (ADS)

    Prudil, Andrew Albert

    Understanding the behaviour of nuclear fuel during irradiation is a complicated multiphysics problem involving neutronics, chemistry, radiation physics, material science, solid mechanics, heat transfer and thermal-hydraulics. Due to the complexity and interdependence of the physics and models involved, fuel modeling is typically done with numerical models. Advancements in both computer hardware and software have made possible new, more complex and sophisticated fuel modeling codes. The Fuel And Sheath modelling Tool (FAST) is a fuel performance code that has been developed for modeling nuclear fuel behaviour under normal and transient conditions. The FAST code includes models for heat generation and transport, thermal expansion, elastic strain, densification, fission product swelling, pellet relocation, contact, grain growth, fission gas release, gas and coolant pressure and sheath creep. These models are coupled and solved numerically using the Comsol Multiphysics finite-element platform. The model utilizes a radial-axial geometry of a fuel pellet (including dishing and chamfering) and accompanying fuel sheath, allowing the model to predict circumferential ridging. This model has evolved from previous treatments developed at the Royal Military College. The model has now been significantly advanced to include: a more detailed pellet geometry, localized pellet-to-sheath gap size and contact pressure, the ability to model cracked pellets, localized fuel burnup for material property models, improved UO2 densification behaviour, a fully 2-dimensional model for the sheath, additional creep models, additional material models, an FEM Booth-diffusion model for fission gas release (including the ability to model temperature and power changes), a capability for end-of-life predictions, the ability to utilize text files as model inputs, and a first-time integration of normal operating conditions (NOC) and transient fuel models into a single code (which has never been achieved
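
    One ingredient of such a code, steady radial heat conduction in a pellet with uniform volumetric heating, can be sketched with a small finite-difference solve (illustrative material values; this is not the FAST/Comsol implementation, which couples many more physics):

```python
import numpy as np

def pellet_temps(R=0.006, k=3.0, q=3e8, T_s=700.0, n=50):
    """Steady 1-D radial conduction in a cylindrical fuel pellet:
    (k/r) d/dr (r dT/dr) + q = 0,  T(R) = T_s, symmetry at r = 0.
    R in m, k in W/m/K, q in W/m^3, temperatures in K."""
    h = R / n
    r = np.linspace(0.0, R, n + 1)
    A = np.zeros((n + 1, n + 1))
    rhs = np.full(n + 1, -q / k)
    A[0, 0], A[0, 1] = -4.0 / h**2, 4.0 / h**2    # symmetry at the centerline
    for j in range(1, n):                          # interior finite-volume stencil
        rm, rp = r[j] - h / 2, r[j] + h / 2
        A[j, j - 1] = rm / (r[j] * h**2)
        A[j, j]     = -(rm + rp) / (r[j] * h**2)
        A[j, j + 1] = rp / (r[j] * h**2)
    A[n, n], rhs[n] = 1.0, T_s                     # fixed surface temperature
    return r, np.linalg.solve(A, rhs)

r, T = pellet_temps()
# Analytic solution for comparison: T(r) = T_s + q (R^2 - r^2) / (4 k)
```

    Because the exact solution is quadratic in r, this stencil reproduces it to solver roundoff, a useful sanity check before coupling in burnup-dependent conductivity, gap conductance and the other FAST physics.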

  17. Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)

    2001-01-01

    The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Elastic-plastic fracture analysis, in particular, is a regime that is currently treated empirically for the Space Shuttle External Tank (ET), being handled by simulated service testing of pre-cracked panels.

  18. Advances in Chimera Grid Tools for Multi-Body Dynamics Simulations and Script Creation

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    This viewgraph presentation contains information about (1) the framework for multi-body dynamics, the Geometry Manipulation Protocol (GMP), (2) the simulation procedure using Chimera Grid Tools (CGT) and OVERFLOW-2, (3) further recent developments in Chimera Grid Tools (OVERGRID, grid modules, and the script library), and (4) future work.

  19. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    ERIC Educational Resources Information Center

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  20. Fuzzy regression modeling for tool performance prediction and degradation detection.

    PubMed

    Li, X; Er, M J; Lim, B S; Zhou, J H; Gan, O P; Rutkowski, L

    2010-10-01

    In this paper, the viability of using Fuzzy-Rule-Based Regression Modeling (FRM) algorithm for tool performance and degradation detection is investigated. The FRM is developed based on a multi-layered fuzzy-rule-based hybrid system with Multiple Regression Models (MRM) embedded into a fuzzy logic inference engine that employs Self Organizing Maps (SOM) for clustering. The FRM converts a complex nonlinear problem to a simplified linear format in order to further increase the accuracy in prediction and rate of convergence. The efficacy of the proposed FRM is tested through a case study - namely to predict the remaining useful life of a ball nose milling cutter during a dry machining process of hardened tool steel with a hardness of 52-54 HRc. A comparative study is further made between four predictive models using the same set of experimental data. It is shown that the FRM is superior as compared with conventional MRM, Back Propagation Neural Networks (BPNN) and Radial Basis Function Networks (RBFN) in terms of prediction accuracy and learning speed.
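
    The cluster-then-regress idea behind FRM (a nonlinear problem split into locally linear pieces) can be sketched in a few lines of numpy; here a plain 1-D k-means stands in for the SOM, and the data are a synthetic piecewise-linear curve rather than the paper's milling measurements:

```python
import numpy as np

def fit_clusterwise(x, y, n_clusters=2, iters=20):
    """Cluster the inputs (simple 1-D k-means standing in for the SOM),
    then fit one linear regression per cluster."""
    centroids = np.linspace(x.min(), x.max(), n_clusters)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centroids[None, :]), axis=1)
        for c in range(n_clusters):
            if np.any(labels == c):
                centroids[c] = x[labels == c].mean()
    coeffs = [np.polyfit(x[labels == c], y[labels == c], 1)
              for c in range(n_clusters)]
    return centroids, coeffs

def predict(x, centroids, coeffs):
    """Route each input to its nearest cluster's local linear model."""
    labels = np.argmin(np.abs(x[:, None] - centroids[None, :]), axis=1)
    return np.array([np.polyval(coeffs[c], xi) for c, xi in zip(labels, x)])

# A piecewise-linear curve: nonlinear globally, but linear within each cluster
x = np.linspace(-1.0, 1.0, 41)
y = np.abs(x)
centroids, coeffs = fit_clusterwise(x, y)
```

    The global model is nonlinear, yet each local regression is exactly linear, which is the convergence and accuracy advantage the FRM formulation exploits.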

  1. A tool for modeling concurrent real-time computation

    NASA Technical Reports Server (NTRS)

    Sharma, D. D.; Huang, Shie-Rei; Bhatt, Rahul; Sridharan, N. S.

    1990-01-01

    Real-time computation is a significant area of research in general, and in AI in particular. The complexity of practical real-time problems demands use of knowledge-based problem solving techniques while satisfying real-time performance constraints. Since the demands of a complex real-time problem cannot be predicted (owing to the dynamic nature of the environment) powerful dynamic resource control techniques are needed to monitor and control the performance. A real-time computation model for a real-time tool, an implementation of the QP-Net simulator on a Symbolics machine, and an implementation on a Butterfly multiprocessor machine are briefly described.

  2. Advanced modeling and simulation to design and manufacture high performance and reliable advanced microelectronics and microsystems.

    SciTech Connect

    Nettleship, Ian (University of Pittsburgh, Pittsburgh, PA); Hinklin, Thomas; Holcomb, David Joseph; Tandon, Rajan; Arguello, Jose Guadalupe, Jr.; Dempsey, James Franklin; Ewsuk, Kevin Gregory; Neilsen, Michael K.; Lanagan, Michael (Pennsylvania State University, University Park, PA)

    2007-07-01

    An interdisciplinary team of scientists and engineers having broad expertise in materials processing and properties, materials characterization, and computational mechanics was assembled to develop science-based modeling/simulation technology to design and reproducibly manufacture high performance and reliable, complex microelectronics and microsystems. The team's efforts focused on defining and developing a science-based infrastructure to enable predictive compaction, sintering, stress, and thermomechanical modeling in "real systems", including: (1) developing techniques to determine the materials properties and constitutive behavior required for modeling; (2) developing new and improved/updated models and modeling capabilities; (3) ensuring that models are representative of the physical phenomena being simulated; and (4) assessing existing modeling capabilities to identify advances necessary to facilitate the practical application of Sandia's predictive modeling technology.

  3. Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools

    NASA Astrophysics Data System (ADS)

    Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.

    2011-12-01

    Currently and soon-to-be available sophisticated 3D models of particle acceleration and transport in solar flares require a new level of user-friendly visualization and analysis tools allowing quick and easy adjustment of the model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc.). We report the current state of these tools, which are under development and have already proved highly efficient for direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. Various tools provided allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating in user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/non-thermal particle distribution models. By default, the application integrates IDL-callable DLL and shared libraries containing fast GS emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows interchanging these default libraries with any user-defined IDL or external callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the capacity and generality of these tools, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by the NORH and RHESSI instruments. We discuss further anticipated developments of the tools needed to accommodate

  4. ISRU System Model Tool: From Excavation to Oxygen Production

    NASA Technical Reports Server (NTRS)

    Santiago-Maldonado, Edgardo; Linne, Diane L.

    2007-01-01

    In the late 80's, conceptual designs for an in situ oxygen production plant were documented in a study by Eagle Engineering [1]. In the "Summary of Findings" of this study, it is clearly pointed out that: "reported process mass and power estimates lack a consistent basis to allow comparison." The study goes on to say: "A study to produce a set of process mass, power, and volume requirements on a consistent basis is recommended." Today, approximately twenty years later, as humans plan to return to the moon and venture beyond, the need for flexible, up-to-date models of the oxygen extraction and production process has become even clearer. Multiple processes for the production of oxygen from lunar regolith are being investigated by NASA, academia, and industry. Three processes that have shown technical merit are molten regolith electrolysis, hydrogen reduction, and carbothermal reduction. These processes have been selected by NASA as the basis for the development of the ISRU System Model Tool (ISMT). In working to develop up-to-date system models for these processes, NASA hopes to accomplish the following: (1) help in the evaluation process to select the most cost-effective and efficient process for further prototype development, (2) identify key parameters, (3) optimize the excavation and oxygen production processes, and (4) provide estimates of energy and power requirements, mass and volume of the system, oxygen production rate, mass of regolith required, mass of consumables, and other important parameters. Also, as confidence and high fidelity are achieved with each component's model, new techniques and processes can be introduced and analyzed at a fraction of the cost of traditional hardware development and test approaches. A first-generation ISRU System Model Tool has been used to provide inputs to the Lunar Architecture Team studies.
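
    A back-of-envelope mass balance for one of the candidate processes, hydrogen reduction, illustrates the kind of estimate a system model formalizes (the 10 wt% FeO figure and ideal extraction efficiency are assumptions for illustration, not ISMT outputs):

```python
# Hydrogen reduction of iron oxide in regolith:
#   FeO + H2 -> Fe + H2O, then electrolysis: 2 H2O -> 2 H2 + O2
# (the hydrogen is recycled, so the net effect extracts the oxygen bound in FeO)
M_FEO, M_O2 = 0.071844, 0.031998    # molar masses, kg/mol

def o2_from_regolith(regolith_kg, feo_wt_frac, extraction_eff=1.0):
    """Oxygen mass obtainable from a batch of regolith (ideal stoichiometry)."""
    mol_feo = regolith_kg * feo_wt_frac / M_FEO
    return 0.5 * mol_feo * M_O2 * extraction_eff   # one O atom per FeO reduced

# e.g. 1000 kg of regolith at an assumed 10 wt% FeO
o2 = o2_from_regolith(1000.0, 0.10)
```

    Chaining estimates like this one for excavation rate, reactor throughput and power is exactly the "consistent basis" for comparison that the Eagle Engineering study found lacking.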

  5. Application of the GEM Inventory Data Capture Tools for Dynamic Vulnerability Assessment and Recovery Modelling

    NASA Astrophysics Data System (ADS)

    Verrucci, Enrica; Bevington, John; Vicini, Alessandro

    2014-05-01

    A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010-13 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate data derived from remotely-sensed imagery with statistically-sampled in-situ field data of buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment, and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes" - statistically-inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publicly available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project, which are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of (1) dynamic vulnerability assessment (pre-event) and (2) recovery monitoring and evaluation (post-event) are discussed, along with strategies for using the IDC Tools for each purpose. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Originally designed for seismic risk assessment, the IDCT tools clearly have relevance for multi-hazard risk assessment. When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated from the tools can reveal spatio-temporal patterns in the quality of recovery activities and resilience trends can be

  6. ADVANCED TOOLS FOR ASSESSING SELECTED PRESCRIPTION AND ILLICIT DRUGS IN TREATED SEWAGE EFFLUENTS AND SOURCE WATERS

    EPA Science Inventory

    The purpose of this poster is to present the application and assessment of advanced technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs (azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, and methylenedioxy...

  7. Numerical Modeling and Inverse Scattering in Nondestructive Testing: Recent Applications and Advances

    NASA Astrophysics Data System (ADS)

    Marklein, R.; Langenberg, K. J.; Mayer, K.; Shlivinski, A.; Miao, J.; Zimmer, A.; Müller, W.; Schmitz, V.; Kohl, C.; Mletzko, U.

    2005-04-01

    This paper presents recent advances and future challenges of the application of different numerical modeling tools and linear and nonlinear inversion algorithms in ultrasonics and electromagnetics applied in NDE. The inversion methods considered in the presented work vary from linear schemes, e.g. SAFT/InASAFT and Diffraction Tomography/FT-SAFT, to nonlinear schemes, e.g. the Contrast Source Inversion. Inversion results are presented and compared for modeled and measured ultrasonic and electromagnetic data to locate voids and cracks as well as to locate aluminum tendon ducts in concrete, which is a typical GPR problem. Finite Integration Technique (FIT) and Domain Integral Equation (DIE) solvers are used as modeling tools.

  8. Watershed modeling tools and data for prognostic and diagnostic

    NASA Astrophysics Data System (ADS)

    Chambel-Leitao, P.; Brito, D.; Neves, R.

    2009-04-01

    When eutrophication is considered an important process to control, control can be accomplished by reducing nitrogen and phosphorus losses from both point and nonpoint sources and by assessing the effectiveness of the pollution reduction strategy. The HARP-NUT guidelines (Guidelines on Harmonized Quantification and Reporting Procedures for Nutrients) (Borgvang & Selvik, 2000) are presented by OSPAR as the best common quantification and reporting procedures for calculating the reduction of nutrient inputs. OSPAR adopted the HARP-NUT guidelines in 2000 on a trial basis. They were intended to serve as a tool for OSPAR Contracting Parties to report, in a harmonized manner, their different commitments, present or future, with regard to nutrients under the OSPAR Convention, in particular the "Strategy to Combat Eutrophication". The HARP-NUT Guidelines (Borgvang and Selvik, 2000; Schoumans, 2003) were developed to quantify and report on the individual sources of nitrogen and phosphorus discharges/losses to surface waters (Source Orientated Approach). These results can be compared, as a load reconciliation, with the total riverine loads measured at downstream monitoring points (Load Orientated Approach). Nitrogen and phosphorus retention in river systems represents the connecting link between the "Source Orientated Approach" and the "Load Orientated Approach". Both approaches are necessary for verification purposes, and both may be needed to provide the information required for the various commitments. Guidelines 2, 3, 4, and 5 are mainly concerned with source estimation; they present a set of simple calculations that allow the origin of loads to be estimated. Guideline 6 is a particular case where the application of a model is advised, in order to estimate the nutrient loads from diffuse sources associated with land use/land cover. The model chosen for this was the SWAT model (Arnold & Fohrer, 2005), because it is suggested in guideline 6 and because it
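
    The load reconciliation between the two approaches reduces to a simple balance: the sum of the source-orientated discharges/losses, less in-river retention, is compared with the load-orientated measurement downstream. A minimal sketch with invented numbers (not HARP-NUT figures):

```python
# Source-orientated estimate vs. load-orientated measurement (invented data).
def riverine_load(source_loads_t_per_yr, retention_fraction):
    """Load expected at the downstream monitoring point: the sum of the
    individual source discharges/losses reduced by in-river retention."""
    return sum(source_loads_t_per_yr.values()) * (1.0 - retention_fraction)

sources = {"point": 120.0, "agriculture": 310.0, "atmospheric": 45.0}
estimated = riverine_load(sources, retention_fraction=0.25)
measured = 371.0   # hypothetical load-orientated figure, t N/yr

discrepancy_pct = 100.0 * (estimated - measured) / measured
print(estimated, round(discrepancy_pct, 1))   # 356.25 -4.0
```

    The residual discrepancy is what motivates using a model such as SWAT for the diffuse-source terms rather than the simple per-source calculations alone.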

  9. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    NASA Astrophysics Data System (ADS)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes, and equipment. A process simulator such as Aspen HYSYS, designed for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander, and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum parameter values for maximizing the liquefaction yield of the plant subject to the constraints on other parameters. The analysis results give a clear idea of the various parameter values to choose before implementation of the actual plant in the field, as well as of the productivity and profitability of the given plant configuration, leading to the design of an efficient, productive plant.
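
    At the heart of such a study is the first-law liquid yield of the Linde-Hampson cycle. A minimal sketch of that balance; the enthalpy values are illustrative placeholders, not Aspen HYSYS property-package results:

```python
# Linde-Hampson liquid yield from an energy balance around the cold box
# (heat exchanger + J-T valve + separator). Enthalpies in kJ/kg are assumed.
def linde_liquid_yield(h1, h2, hf):
    """y = (h1 - h2) / (h1 - hf)

    h1: low-pressure gas leaving the warm end of the exchanger
    h2: high-pressure gas entering the warm end (after compression/cooling)
    hf: saturated liquid drawn off the separator
    """
    return (h1 - h2) / (h1 - hf)

# Illustrative values for air at ~300 K, 1 bar vs. 200 bar (assumed):
y = linde_liquid_yield(h1=290.0, h2=270.0, hf=-120.0)
print(round(y, 3))   # ~0.049: roughly 5% of the circulated air is liquefied
```

    A simulator earns its keep by supplying accurate h1, h2, and hf over the whole operating envelope, which is exactly what the optimization described above varies.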

  10. Animal models as tools to study the pathophysiology of depression.

    PubMed

    Abelaira, Helena M; Réus, Gislaine Z; Quevedo, João

    2013-01-01

    The incidence of depressive illness is high worldwide, and the inadequacy of currently available drug treatments contributes to the significant health burden associated with depression. A basic understanding of the underlying disease processes in depression is lacking; therefore, recreating the disease in animal models is not possible. Popular current models of depression creatively merge ethologically valid behavioral assays with the latest technological advances in molecular biology. Within this context, this study aims to evaluate animal models of depression and determine which has the best face, construct, and predictive validity. These models differ in the degree to which they produce features that resemble a depressive-like state, and models that include stress exposure are widely used. Paradigms that employ acute or sub-chronic stress exposure include learned helplessness, the forced swimming test, the tail suspension test, maternal deprivation, chronic mild stress, and sleep deprivation, to name but a few, all of which employ relatively short-term exposure to inescapable or uncontrollable stress and can reliably detect antidepressant drug response.

  11. Advanced techniques in reliability model representation and solution

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Nicol, David M.

    1992-01-01

    The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.
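
    The transient Markov solution at the core of tools like SURE/ASSURE can be illustrated on a toy model. The two-component parallel system below is a generic textbook example (not RMG-generated), solved by forward-Euler integration of the Chapman-Kolmogorov equations:

```python
import math

# States: 0 = both components up, 1 = one up, 2 = system failed (absorbing).
def unreliability(lam, t, steps=20000):
    """P(system failed by time t) for two parallel components, rate lam each."""
    Q = [[-2 * lam, 2 * lam, 0.0],    # generator matrix of the CTMC
         [0.0,     -lam,     lam],
         [0.0,      0.0,     0.0]]
    p, dt = [1.0, 0.0, 0.0], t / steps
    for _ in range(steps):            # forward Euler on dp/dt = p.Q
        p = [p[j] + dt * sum(p[i] * Q[i][j] for i in range(3))
             for j in range(3)]
    return p[2]

lam, t = 1e-3, 100.0                  # failures/hour, mission time (hours)
numeric = unreliability(lam, t)
analytic = (1.0 - math.exp(-lam * t)) ** 2   # closed form for this model
print(abs(numeric - analytic) < 1e-5)        # True
```

    Real fault-tolerant architectures generate state spaces far too large for hand solution, which is the motivation for automated model generation and parallel solvers described above.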

  12. MTK: An AI tool for model-based reasoning

    NASA Technical Reports Server (NTRS)

    Erickson, William K.; Schwartz, Mary R.

    1987-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control and trend analysis of the space station Thermal Management System (TMS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined, along with examples from the thermal system to highlight the motivating factors behind them. An overview of the capabilities of MTK is given.

  13. Empirical flow parameters : a tool for hydraulic model validity

    USGS Publications Warehouse

    Asquith, William H.; Burley, Thomas E.; Cleveland, Theodore G.

    2013-01-01

    The objectives of this project were (1) to determine and present, from existing data in Texas, relations between observed stream flow, topographic slope, mean section velocity, and other hydraulic factors; to produce charts such as Figure 1; and to produce empirical distributions of the various flow parameters to provide a methodology to "check if model results are way off!"; (2) to produce a statistical regional tool to estimate mean velocity or other selected parameters for storm flows or other conditional discharges at ungauged locations (most bridge crossings) in Texas, providing a secondary way to compare such values to a conventional hydraulic modeling approach; and (3) to present ancillary values such as Froude number, stream power, Rosgen channel classification, sinuosity, and other selected characteristics (readily determinable from existing data) to provide additional information to engineers concerned with the hydraulic-soil-foundation component of transportation infrastructure.
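
    The ancillary hydraulic values of this kind follow from standard open-channel formulas; a small sketch with invented channel numbers:

```python
import math

G = 9.81   # gravitational acceleration, m/s^2

def froude(velocity, depth):
    """Fr = v / sqrt(g*d); Fr < 1 is subcritical, Fr > 1 supercritical."""
    return velocity / math.sqrt(G * depth)

def unit_stream_power(discharge, slope, width, rho=1000.0):
    """omega = rho*g*Q*S / W, in W per m^2 of bed area."""
    return rho * G * discharge * slope / width

# Invented storm-flow values: 1.2 m/s mean section velocity, 2.0 m depth,
# 10 m^3/s discharge, slope 0.001, 5 m channel width.
fr = froude(1.2, 2.0)
omega = unit_stream_power(10.0, 0.001, 5.0)
print(round(fr, 3), round(omega, 2))   # 0.271 19.62
```

    A modeled velocity that implies, say, Fr > 1 for a gently sloping lowland reach is exactly the kind of "way off" result the empirical distributions are meant to flag.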

  14. The Aerosol Modeling Testbed: A community tool to objectively evaluate aerosol process modules

    SciTech Connect

    Fast, Jerome D.; Gustafson, William I.; Chapman, Elaine G.; Easter, Richard C.; Rishel, Jeremy P.; Zaveri, Rahul A.; Grell, Georg; Barth, Mary

    2011-03-02

    This study describes a new modeling paradigm that significantly advances how the third activity is conducted while also fully exploiting data and findings from the first two activities. The Aerosol Modeling Testbed (AMT) is a computational framework for the atmospheric sciences community that streamlines the process of testing and evaluating aerosol process modules over a wide range of spatial and temporal scales. The AMT consists of a fully-coupled meteorology-chemistry-aerosol model and a suite of tools to evaluate the performance of aerosol process modules via comparison with a wide range of field measurements. The philosophy of the AMT is to systematically and objectively evaluate aerosol process modules over local to regional spatial scales that are compatible with most field campaigns' measurement strategies. The performance of new treatments can then be quantified and compared with that of existing treatments before they are incorporated into regional and global climate models. Since the AMT is a community tool, it also provides a means of enhancing collaboration and coordination among aerosol modelers.
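
    The paired model-versus-observation scoring such a testbed automates can be sketched in a few lines (the metric definitions are standard; the aerosol optical depth values below are invented):

```python
import math

def bias_and_rmse(model, obs):
    """Mean bias and root-mean-square error of paired model/observation values."""
    n = len(obs)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    return bias, rmse

# Invented aerosol optical depth pairs (model vs. field measurement):
aod_model = [0.12, 0.18, 0.25, 0.30]
aod_obs   = [0.10, 0.20, 0.22, 0.35]
bias, rmse = bias_and_rmse(aod_model, aod_obs)
print(round(bias, 3), round(rmse, 3))   # -0.005 0.032
```

    Computing the same statistics for a new and an existing process module over identical field data is what makes the comparison objective.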

  15. Advanced modeling to accelerate the scale up of carbon capture technologies

    SciTech Connect

    Miller, David C.; Sun, XIN; Storlie, Curtis B.; Bhattacharyya, Debangsu

    2015-06-01

    In order to help meet the goals of the DOE carbon capture program, the Carbon Capture Simulation Initiative (CCSI) was launched in early 2011 to develop, demonstrate, and deploy advanced computational tools and validated multi-scale models to reduce the time required to develop and scale-up new carbon capture technologies. This article focuses on essential elements related to the development and validation of multi-scale models in order to help minimize risk and maximize learning as new technologies progress from pilot to demonstration scale.

  16. CRISPR/Cas9: an advanced tool for editing plant genomes.

    PubMed

    Samanta, Milan Kumar; Dey, Avishek; Gayen, Srimonta

    2016-10-01

    To meet current challenges in agriculture, genome editing using sequence-specific nucleases (SSNs) is a powerful tool for basic and applied plant biology research. Here, we describe the principle and application of available genome editing tools, including zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and the clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated protein 9 (Cas9) system. Among these SSNs, CRISPR/Cas9 is the most recently characterized and rapidly developing genome editing technology, and has been successfully utilized in a wide variety of organisms. This review specifically illustrates the power of CRISPR/Cas9 as a tool for plant genome engineering, and describes the strengths and weaknesses of the CRISPR/Cas9 technology compared to two well-established genome editing tools, ZFNs and TALENs. PMID:27012546

  17. Advancing lighting and daylighting simulation: The transition from analysis to design aid tools

    SciTech Connect

    Hitchcock, R.J.

    1995-05-01

    This paper explores three significant software development requirements for making the transition from stand-alone lighting simulation/analysis tools to simulation-based design aid tools. These requirements include specialized lighting simulation engines, facilitated methods for creating detailed simulatable building descriptions, and automated techniques for providing lighting design guidance. Initial computer implementations meant to address each of these requirements are discussed, to further elaborate the requirements and to illustrate work in progress.

  18. Advanced repair solution of clear defects on HTPSM by using nanomachining tool

    NASA Astrophysics Data System (ADS)

    Lee, Hyemi; Kim, Munsik; Jung, Hoyong; Kim, Sangpyo; Yim, Donggyu

    2015-10-01

    As mask specifications become tighter for low-k1 lithography, more aggressive repair accuracy is required below the 20 nm technology node. To meet tight defect specifications, many mask shops select repair tools according to defect type. Normally, pattern defects are repaired with an e-beam repair tool, while soft defects such as particles are repaired with a nanomachining tool. It is difficult for an e-beam repair tool to remove particle defects because it relies on a chemical reaction between gas and electrons, while a nanomachining tool, which uses the physical interaction between a nano-tip and the defect, cannot be applied to repairing clear defects. Generally, a film deposition process is widely used for repairing clear defects. However, the deposited film has weak cleaning durability, so it is easily removed by accumulated cleaning processes. Although the deposited film adheres strongly to the MoSiN (or Qz) film, the adhesion between the deposited Cr film and the MoSiN (or Qz) film becomes progressively weaker with the energy accumulated as masks are exposed in a scanner tool, due to the different coefficients of thermal expansion of the materials. Therefore, whenever a re-pellicle process is needed for a mask, every deposited repair point has to be inspected to confirm whether the deposited film is damaged; if it is, the repair process must be performed again, making the overall process longer and more complex. In this paper, the basic theory and principle of recovering clear defects using a nanomachining tool are introduced, and the evaluated results are reviewed for dense line/space (L/S) patterns and contact hole (C/H) patterns. The nanomachining results are also compared with those from an e-beam repair tool, including the cleaning durability evaluated through accumulated cleaning processes. In addition, we discuss the phase-shift issue and a solution to the image placement error caused by phase error.

  19. Introducing Modeling Transition Diagrams as a Tool to Connect Mathematical Modeling to Mathematical Thinking

    ERIC Educational Resources Information Center

    Czocher, Jennifer A.

    2016-01-01

    This study contributes a methodological tool to reconstruct the cognitive processes and mathematical activities carried out by mathematical modelers. Represented as Modeling Transition Diagrams (MTDs), individual modeling routes were constructed for four engineering undergraduate students. Findings stress the importance and limitations of using…

  20. Online and Certifiable Spectroscopy Courses Using Information and Communication Tools. a Model for Classrooms and Beyond

    NASA Astrophysics Data System (ADS)

    Krishnan, Mangala Sunder

    2015-06-01

    Online education tools, flipped (reverse) class models for teaching and learning, and pedagogic and andragogic approaches to self-learning have become quite mature in the last few years because of the revolution in video, interactive software, and social learning tools. Open Educational Resources of dependable quality and variety are also becoming available throughout the world, making the current era truly a renaissance period for higher education using the Internet. In my presentation, I shall highlight structured online course content preparation in several areas of spectroscopy, as well as the design and development of virtual lab tools and kits for studying optical spectroscopy. Both elementary and advanced courses on molecular spectroscopy are currently under development jointly with researchers in other institutions in India. I would like to explore participation from teachers throughout the world in the teaching-learning process using flipped class methods for topics such as experimental and theoretical microwave spectroscopy of semi-rigid and non-rigid molecules, molecular complexes and aggregates. In addition, courses in Raman and infrared spectroscopy experimentation and advanced electronic spectroscopy courses are also envisaged for free, online access. The National Programme on Technology Enhanced Learning (NPTEL) and the National Mission on Education through Information and Communication Technology (NMEICT) are two large Government of India funded initiatives for producing certified and self-learning courses with financial support for moderated discussion forums. The learning tools and interactive presentations so developed can be used in classrooms throughout the world using the flipped mode of teaching. They are very much sought after by learners and researchers who are in other areas of learning but want to contribute to research and development through inter-disciplinary learning. NPTEL is currently experimenting with Massive Open Online Course (MOOC

  1. Introducing BioSARN - an ecological niche model refinement tool.

    PubMed

    Heap, Marshall J

    2016-08-01

    Environmental niche modeling outputs a biological species' potential distribution. Further work is needed to arrive at a species' realized distribution. The Biological Species Approximate Realized Niche (BioSARN) application provides the ecological modeler with a toolset to refine environmental niche models (ENMs). These tools include soil and land class filtering, niche area quantification, and novelties like enhanced temporal corridor definition and output to a high spatial resolution land class model. BioSARN is exemplified with a study on Fraser fir, a tree species with strong land class and edaphic correlations. Soil and land class filtering caused the potential distribution area to decline 17%. Enhanced temporal corridor definition permitted distinction of current, continuing, and future niches, and thus niche change and movement. Tile quantification analysis provided further corroboration of these trends. BioSARN does not substitute for other established ENM methods. Rather, it allows the experimenter to work with their preferred ENM, refining it using their knowledge and experience. Output from lower spatial resolution ENMs to a high spatial resolution land class model is a pseudo high-resolution result. Still, it may be the best that can be achieved until wide-range, high spatial resolution environmental data and accurate, high-precision species occurrence data become generally available. PMID:27547356
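
    The soil/land class filtering step can be pictured as masking the ENM's suitability grid: suitable cells are kept only where the land class admits the species. A toy sketch (the grid and class codes are invented, not BioSARN data structures):

```python
# Keep ENM-suitable cells only where the land-class code is in `allowed`.
def filter_enm(suitability, land_class, allowed):
    return [[s if (s and c in allowed) else 0
             for s, c in zip(s_row, c_row)]
            for s_row, c_row in zip(suitability, land_class)]

enm = [[1, 1, 0],          # 1 = predicted suitable by the ENM
       [1, 1, 1],
       [0, 1, 1]]
lc  = [[4, 2, 4],          # invented land-class codes per cell
       [4, 4, 3],
       [2, 4, 4]]

kept = filter_enm(enm, lc, allowed={3, 4})   # e.g. forest classes only
before, after = sum(map(sum, enm)), sum(map(sum, kept))
print(before, after, round(100 * (before - after) / before))   # 7 6 14
```

    Summing kept cells before and after the mask is the same bookkeeping that yields figures like the 17% decline reported for Fraser fir.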

  3. Advances in Games Technology: Software, Models, and Intelligence

    ERIC Educational Resources Information Center

    Prakash, Edmond; Brindle, Geoff; Jones, Kevin; Zhou, Suiping; Chaudhari, Narendra S.; Wong, Kok-Wai

    2009-01-01

    Games technology has undergone tremendous development. In this article, the authors report the rapid advancement that has been observed in the way games software is being developed, as well as in the development of games content using game engines. One area that has gained special attention is modeling the game environment such as terrain and…

  4. The STREON Recirculation Chamber: An Advanced Tool to Quantify Stream Ecosystem Metabolism in the Benthic Zone

    NASA Astrophysics Data System (ADS)

    Brock, J. T.; Utz, R.; McLaughlin, B.

    2013-12-01

    The STReam Experimental Observatory Network (STREON) is a large-scale experimental effort that will investigate the effects of eutrophication and loss of large consumers in stream ecosystems. STREON represents the first experimental effort undertaken and supported by the National Ecological Observatory Network (NEON). Two treatments will be applied at 10 NEON sites and maintained for 10 years in the STREON program: the addition of nitrate and phosphate to enrich concentrations to five times ambient levels, and electrical fields that exclude top consumers (i.e., fish or invertebrates) of the food web from the surface of buried sediment baskets. Following a 3-5 week period, the sediment baskets will be extracted and incubated in closed, recirculating metabolic chambers to measure rates of respiration, photosynthesis, and nutrient uptake. All STREON-generated data will be open access and available on the NEON web portal. The recirculation chamber represents a critical infrastructural component of STREON. Although researchers have applied such chambers for metabolic and nutrient uptake measurements in the past, the scope of STREON demands a novel design that addresses multiple processes often neglected by earlier models. The STREON recirculation chamber must be capable of: 1) incorporating hyporheic exchange into the flow field to ensure that measurements of respiration include the activity of subsurface biota, 2) operating consistently with heterogeneous sediments from sand to cobble, 3) minimizing heat exchange from the motor and external environment, 4) delivering a reproducible, uniform flow field over the surface of the sediment basket, and 5) allowing efficient assembly/disassembly with minimal use of tools. The chamber also requires a means of accommodating an optical dissolved oxygen probe and a means of injecting/extracting water. A prototype STREON chamber has been designed and thoroughly tested. The flow field within the chamber has been mapped using particle imaging velocimetry (PIV

  5. GIS as an Integration Tool for Hydrologic Modeling: Spatial Data Management, Analysis and Visualization

    NASA Astrophysics Data System (ADS)

    Setegn, S. G.; Lawrence, A.; Mahmoudi, M.

    2015-12-01

    The Applied Research Center at Florida International University (ARC-FIU) is supporting the soil and groundwater remediation efforts of the U.S. Department of Energy (DOE) Savannah River Site (SRS) by developing a surface water model to simulate the hydrology and the fate and transport of contaminants and sediment in the Tims Branch watershed. The first phase of model development was initiated in 2014 using the MIKE SHE/MIKE 11 hydrological modeling package which has a geographic information systems (GIS) user interface built into its system that can directly use spatial GIS databases (geodatabases) for model inputs. This study developed an ArcGIS geodatabase to support the hydrological modeling work for SRS. The coupling of a geodatabase with MIKE SHE/MIKE 11 numerical models can serve as an efficient tool that significantly reduces the time needed for data preparation. The geodatabase provides an advanced spatial data structure needed to address the management, processing, and analysis of large GIS and timeseries datasets derived from multiple sources that are used for numerical model calibration, uncertainty analysis, and simulation of flow and contaminant fate and transport during extreme climatic events. The geodatabase developed is based on the ArcHydro and ArcGIS Base Map data models with modifications made for project specific input parameters. The significance of this approach was to ensure its replicability for potential application in other watersheds. This paper describes the process of development of the SRS geodatabase and the application of GIS tools to pre-process and analyze hydrological model data; automate repetitive geoprocessing tasks; and produce maps for visualization of the surface water hydrology of the Tims Branch watershed. Key Words: GIS, hydrological modeling, geodatabase, hydrology, MIKE SHE/MIKE 11

  6. Advancing Space Weather Modeling Capabilities at the CCMC

    NASA Astrophysics Data System (ADS)

    Mays, M. Leila; Kuznetsova, Maria; Boblitt, Justin; Chulaki, Anna; MacNeice, Peter; Mendoza, Michelle; Mullinix, Richard; Pembroke, Asher; Pulkkinen, Antti; Rastaetter, Lutz; Shim, Ja Soon; Taktakishvili, Aleksandre; Wiegand, Chiu; Zheng, Yihua

    2016-04-01

    The Community Coordinated Modeling Center (CCMC, http://ccmc.gsfc.nasa.gov) serves as a community access point to an expanding collection of state-of-the-art space environment models and as a hub for collaborative development of the next generation of space weather forecasting systems. In partnership with model developers and the international research and operational communities, the CCMC integrates new data streams and models from diverse sources into end-to-end space weather predictive systems, identifies weak links in data-model and model-model coupling, and leads community efforts to fill those gaps. The presentation will focus on the latest model installations at the CCMC and advances in CCMC-led community-wide model validation projects.

  7. Current Animal Models of Postoperative Spine Infection and Potential Future Advances

    PubMed Central

    Stavrakis, A. I.; Loftin, A. H.; Lord, E. L.; Hu, Y.; Manegold, J. E.; Dworsky, E. M.; Scaduto, A. A.; Bernthal, N. M.

    2015-01-01

    Implant related infection following spine surgery is a devastating complication for patients and can potentially lead to significant neurological compromise, disability, morbidity, and even mortality. This paper provides an overview of the existing animal models of postoperative spine infection and highlights the strengths and weaknesses of each model. In addition, there is discussion regarding potential modifications to these animal models to better evaluate preventative and treatment strategies for this challenging complication. Current models are effective in simulating surgical procedures but fail to evaluate infection longitudinally using multiple techniques. Potential future modifications to these models include using advanced imaging technologies to evaluate infection, use of bioluminescent bacterial species, and testing of novel treatment strategies against multiple bacterial strains. There is potential to establish a postoperative spine infection model using smaller animals, such as mice, as these would be a more cost-effective screening tool for potential therapeutic interventions. PMID:26131448

  8. The AFDM (advanced fluid dynamics model) program: Scope and significance

    SciTech Connect

    Bohl, W.R.; Parker, F.R.; Wilhelm, D.; Berthier, J.

    1990-01-01

    The origins and goals of the advanced fluid dynamics model (AFDM) program are described, and the models, algorithm, and coding used in the resulting AFDM computer program are summarized. A sample fuel-steel boiling pool calculation is presented and compared with a similar SIMMER-II calculation. A subjective assessment of the AFDM developments is given, and areas where future work is possible are detailed. 10 refs.

  9. Specification of advanced safety modeling requirements (Rev. 0).

    SciTech Connect

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represent an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models will …

  10. Unleashing spatially distributed ecohydrology modeling using Big Data tools

    NASA Astrophysics Data System (ADS)

    Miles, B.; Idaszak, R.

    2015-12-01

    Physically based spatially distributed ecohydrology models are useful for answering science and management questions related to the hydrology and biogeochemistry of prairie, savanna, forested, as well as urbanized ecosystems. However, these models can produce hundreds of gigabytes of spatial output for a single model run over decadal time scales when run at regional spatial scales and moderate spatial resolutions (~100-km2+ at 30-m spatial resolution) or when run for small watersheds at high spatial resolutions (~1-km2 at 3-m spatial resolution). Numerical data formats such as HDF5 can store arbitrarily large datasets. However even in HPC environments, there are practical limits on the size of single files that can be stored and reliably backed up. Even when such large datasets can be stored, querying and analyzing these data can suffer from poor performance due to memory limitations and I/O bottlenecks, for example on single workstations where memory and bandwidth are limited, or in HPC environments where data are stored separately from computational nodes. The difficulty of storing and analyzing spatial data from ecohydrology models limits our ability to harness these powerful tools. Big Data tools such as distributed databases have the potential to surmount the data storage and analysis challenges inherent to large spatial datasets. Distributed databases solve these problems by storing data close to computational nodes while enabling horizontal scalability and fault tolerance. Here we present the architecture of and preliminary results from PatchDB, a distributed datastore for managing spatial output from the Regional Hydro-Ecological Simulation System (RHESSys). The initial version of PatchDB uses message queueing to asynchronously write RHESSys model output to an Apache Cassandra cluster. Once stored in the cluster, these data can be efficiently queried to quickly produce both spatial visualizations for a particular variable (e.g. maps and animations), as well …
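The decoupling described above, in which the model enqueues output and a separate consumer persists it asynchronously, can be sketched with standard-library stand-ins. The `PatchStore` class and all names below are hypothetical placeholders; a real deployment would replace the store with an Apache Cassandra session rather than an in-memory dict.

```python
import queue
import threading

class PatchStore:
    """In-memory stand-in for the distributed datastore (e.g. a Cassandra cluster)."""
    def __init__(self):
        self.rows = {}
        self.lock = threading.Lock()

    def write(self, key, values):
        with self.lock:
            self.rows[key] = values

    def query(self, timestep, variable):
        """Spatial snapshot of one variable at one timestep."""
        with self.lock:
            return {patch: v[variable]
                    for (t, patch), v in self.rows.items() if t == timestep}

def writer(q, store):
    """Consumer thread: drains the queue and persists rows asynchronously."""
    while True:
        item = q.get()
        if item is None:          # sentinel: shut down
            break
        timestep, patch_id, values = item
        store.write((timestep, patch_id), values)
        q.task_done()

# The model loop only enqueues output, so the simulation never blocks on I/O.
q = queue.Queue()
store = PatchStore()
t = threading.Thread(target=writer, args=(q, store))
t.start()

for timestep in range(3):
    for patch_id in range(4):
        q.put((timestep, patch_id, {"et": 0.1 * patch_id, "lai": 2.0}))

q.put(None)     # no more output
t.join()

et_map = store.query(2, "et")   # per-patch map of one variable
print(sorted(et_map.items()))
```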

  11. Network Models: An Underutilized Tool in Wildlife Epidemiology?

    PubMed Central

    Craft, Meggan E.; Caillaud, Damien

    2011-01-01

    Although the approach of contact network epidemiology has been increasing in popularity for studying transmission of infectious diseases in human populations, it has generally been an underutilized approach for investigating disease outbreaks in wildlife populations. In this paper we explore the differences between the type of data that can be collected on human and wildlife populations, provide an update on recent advances that have been made in wildlife epidemiology by using a network approach, and discuss why networks might have been underutilized and why networks could and should be used more in the future. We conclude with ideas for future directions and a call for field biologists and network modelers to engage in more cross-disciplinary collaboration. PMID:21527981
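A minimal illustration of the contact-network approach the authors advocate: a discrete-time SIR process in which transmission can occur only along network edges, rather than by mass-action mixing. The network generator and all parameter values below are invented for illustration, not taken from the paper.

```python
import random

def random_contact_network(n, k, rng):
    """Undirected contact network: each node is linked to ~k random partners."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in rng.sample([x for x in range(n) if x != i], k):
            adj[i].add(j)
            adj[j].add(i)
    return adj

def run_sir(adj, beta, gamma, seed_node, rng):
    """Discrete-time SIR over the network; returns the final epidemic size."""
    state = {i: "S" for i in adj}
    state[seed_node] = "I"
    while any(s == "I" for s in state.values()):
        new_state = dict(state)
        for i, s in state.items():
            if s == "I":
                for j in adj[i]:                 # transmission along edges only
                    if state[j] == "S" and rng.random() < beta:
                        new_state[j] = "I"
                if rng.random() < gamma:         # recovery
                    new_state[i] = "R"
        state = new_state
    return sum(1 for s in state.values() if s == "R")

rng = random.Random(42)
net = random_contact_network(200, 4, rng)
size = run_sir(net, beta=0.1, gamma=0.2, seed_node=0, rng=rng)
print(f"final epidemic size: {size} of {len(net)}")
```

Because spread is confined to edges, sparsely connected individuals can shield whole regions of the network, an effect mass-action models cannot represent.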

  12. Nuclear fuel cycle system simulation tool based on high-fidelity component modeling

    SciTech Connect

    Ames, David E.

    2014-02-01

    The DOE is currently directing extensive research into developing fuel cycle technologies that will enable the safe, secure, economic, and sustainable expansion of nuclear energy. The task is formidable considering the numerous fuel cycle options, the large dynamic systems that each represent, and the necessity to accurately predict their behavior. The path to successfully develop and implement an advanced fuel cycle is highly dependent on the modeling capabilities and simulation tools available for performing useful relevant analysis to assist stakeholders in decision making. Therefore, a high-fidelity fuel cycle simulation tool that performs system analysis, including uncertainty quantification and optimization, was developed. The resulting simulator also includes the capability to calculate environmental impact measures for individual components and the system. An integrated system method and analysis approach that provides consistent and comprehensive evaluations of advanced fuel cycles was developed. A general approach was utilized, allowing the system to be modified in order to provide analysis for other systems with similar attributes. By utilizing this approach, the framework for simulating many different fuel cycle options is provided. Two example fuel cycle configurations were developed to take advantage of used fuel recycling and transmutation capabilities in waste management scenarios leading to minimized waste inventories.
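The waste-inventory benefit of recycling mentioned above reduces, in the simplest bookkeeping, to a mass balance on the discharged-fuel stream. The sketch below is that back-of-envelope balance only; the feed rate and single-pass recovery fraction are invented numbers, not results from the simulator.

```python
def repository_waste(feed_tonnes_per_year, years, recovery_fraction):
    """Toy mass balance: each year's discharged fuel is split into a
    recovered stream (recycled into fresh fuel) and a waste stream
    sent to the repository."""
    waste = 0.0
    for _ in range(years):
        waste += feed_tonnes_per_year * (1.0 - recovery_fraction)
    return waste

# Hypothetical 100 t/yr throughput over 40 years:
once_through = repository_waste(100.0, 40, recovery_fraction=0.0)
with_recycle = repository_waste(100.0, 40, recovery_fraction=0.95)
print(once_through, "t vs", with_recycle, "t")
```

Even this crude balance shows why recycle-and-transmute scenarios dominate waste-minimization studies: the repository burden scales directly with the unrecovered fraction.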

  13. A Model-Driven Visualization Tool for Use with Model-Based Systems Engineering Projects

    NASA Technical Reports Server (NTRS)

    Trase, Kathryn; Fink, Eric

    2014-01-01

    Model-Based Systems Engineering (MBSE) promotes increased consistency between a system's design and its design documentation through the use of an object-oriented system model. The creation of this system model facilitates data presentation by providing a mechanism from which information can be extracted by automated manipulation of model content. Existing MBSE tools enable model creation, but are often too complex for the unfamiliar model viewer to easily use. These tools do not yet provide many opportunities for easing into the development and use of a system model when system design documentation already exists. This study creates a Systems Modeling Language (SysML) Document Traceability Framework (SDTF) for integrating design documentation with a system model, and develops an Interactive Visualization Engine for SysML Tools (InVEST) that exports consistent, clear, and concise views of SysML model data. These exported views are each meaningful to a variety of project stakeholders with differing subjects of concern and depth of technical involvement. InVEST allows a model user to generate multiple views and reports from a MBSE model, including wiki pages and interactive visualizations of data. System data can also be filtered to present only the information relevant to the particular stakeholder, resulting in a view that is both consistent with the larger system model and other model views. Viewing the relationships between system artifacts and documentation, and filtering through data to see specialized views, improves the value of the system as a whole, as data becomes information.

  14. Tool/tissues interaction modeling for transluminal angioplasty simulation.

    PubMed

    Le Fol, T; Haigron, P; Lucas, A

    2007-01-01

    In this paper, a simulation environment is described for balloon dilation during percutaneous transluminal angioplasty. It simulates the tool/tissue interactions involved in the inflation of a balloon using patient-specific data. In this context, three main behaviors have been identified: soft tissues crush completely under the effect of the balloon; calcified plaques admit no deformation but can move within deformable structures; and the blood vessel wall and surrounding organs try to recover their original shapes. A deformable soft tissue model is proposed, based on the Enhanced ChainMail method, to take into account tissue deformation during dilatation. We improved the original ChainMail method with a "forbidden zone" step to facilitate tool/tissue interactions. The simulation was implemented using five key steps: 1) initialization of balloon parameters; 2) definition of the data structure; 3) dilatation of the balloon and displacement approximation; 4) final position estimation by an elastic relaxation; and 5) interpolation step for visualization. Preliminary results obtained from patient CT data are reported. PMID:18002311
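The ChainMail idea behind steps 3 and 4, a displacement propagated through neighbors that each enforce link-length constraints, followed by elastic relaxation toward rest spacing, can be sketched in one dimension. All geometry and parameters below are invented; the paper's Enhanced ChainMail, with its "forbidden zone" step, is more elaborate.

```python
def chainmail_1d(positions, moved_idx, new_pos, min_gap, max_gap):
    """Propagate a displacement through a 1-D chain: each neighbor moves
    only as much as needed to keep link lengths within [min_gap, max_gap]."""
    pos = list(positions)
    pos[moved_idx] = new_pos
    for i in range(moved_idx + 1, len(pos)):        # propagate right
        gap = pos[i] - pos[i - 1]
        if gap < min_gap:
            pos[i] = pos[i - 1] + min_gap
        elif gap > max_gap:
            pos[i] = pos[i - 1] + max_gap
        else:
            break          # constraints satisfied: disturbance stops here
    for i in range(moved_idx - 1, -1, -1):          # propagate left
        gap = pos[i + 1] - pos[i]
        if gap < min_gap:
            pos[i] = pos[i + 1] - min_gap
        elif gap > max_gap:
            pos[i] = pos[i + 1] - max_gap
        else:
            break
    return pos

def relax(pos, rest_gap, moved_idx, alpha=0.5, iters=50):
    """Elastic relaxation: free elements creep back toward rest spacing
    while the balloon-contact node stays fixed."""
    pos = list(pos)
    for _ in range(iters):
        for i in range(len(pos)):
            if i == moved_idx:
                continue
            target, n = 0.0, 0
            if i > 0:
                target += pos[i - 1] + rest_gap; n += 1
            if i < len(pos) - 1:
                target += pos[i + 1] - rest_gap; n += 1
            pos[i] += alpha * (target / n - pos[i])
    return pos

rest = [float(i) for i in range(6)]             # rest spacing of 1.0
pushed = chainmail_1d(rest, 2, 3.5, min_gap=0.5, max_gap=1.5)
settled = relax(pushed, rest_gap=1.0, moved_idx=2)
print(pushed)
print([round(x, 2) for x in settled])
```

The fast constraint pass gives an immediate, plausible shape (key to interactive rates), and the slower relaxation pass restores smoothness, mirroring the split between steps 3 and 4 above.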

  15. Using urban forest assessment tools to model bird habitat potential

    USGS Publications Warehouse

    Lerman, Susannah B.; Nislow, Keith H.; Nowak, David J.; Destefano, Stephen; King, David I.; Jones-Farrand, D. Todd

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure the urban forest provides habitat to sustain bird and other wildlife populations. The primary goal of this study was to integrate wildlife suitability indices to an existing national urban forest assessment tool, i-Tree. We quantified available habitat characteristics of urban forests for ten northeastern U.S. cities, and summarized bird habitat relationships from the literature in terms of variables that were represented in the i-Tree datasets. With these data, we generated habitat suitability equations for nine bird species representing a range of life history traits and conservation status that predicts the habitat suitability based on i-Tree data. We applied these equations to the urban forest datasets to calculate the overall habitat suitability for each city and the habitat suitability for different types of land-use (e.g., residential, commercial, parkland) for each bird species. The proposed habitat models will help guide wildlife managers, urban planners, and landscape designers who require specific information such as desirable habitat conditions within an urban management project to help improve the suitability of urban forests for birds.
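A habitat suitability equation of the kind described, a weighted combination of urban-forest structure variables, might be sketched as follows. The variable names, weights, and species profile are hypothetical placeholders, not the published models or i-Tree's actual outputs.

```python
def habitat_suitability(canopy_cover, large_trees, native_fraction, weights):
    """Toy habitat suitability index (0-1): a weighted combination of
    structural variables of the kind an i-Tree assessment reports."""
    values = {"canopy": canopy_cover,
              "large_trees": large_trees,
              "native": native_fraction}
    score = sum(weights[k] * values[k] for k in weights)
    return score / sum(weights.values())

# Hypothetical species profile: a canopy-nesting bird that favors
# large trees and native vegetation.
weights = {"canopy": 0.5, "large_trees": 0.3, "native": 0.2}

residential = habitat_suitability(0.35, 0.20, 0.60, weights)
parkland    = habitat_suitability(0.70, 0.55, 0.80, weights)
print(round(residential, 3), round(parkland, 3))
```

Applying one such equation per species across each land-use category's dataset yields the per-city, per-land-use suitability comparisons the study describes.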

  16. A Simple Evacuation Modeling and Simulation Tool for First Responders

    SciTech Connect

    Koch, Daniel B; Payne, Patricia W

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including most importantly time-to-evacuate, to help first responders choose the best course of action.
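The core of such an evacuation module, agents stepping toward an exit with a one-agent-per-cell rule standing in for collision avoidance, and time-to-evacuate as the headline statistic, can be sketched in one dimension. This is an illustration of the idea only, not IMPACT's multi-agent algorithm.

```python
def evacuate(width, start_positions, max_steps=10_000):
    """Minimal 1-D evacuation: agents step toward an exit past cell width-1,
    but each cell holds at most one agent (crude collision avoidance).
    Returns the number of time steps until everyone has left."""
    positions = set(start_positions)
    steps = 0
    while positions and steps < max_steps:
        steps += 1
        new_positions = set()
        for x in sorted(positions, reverse=True):   # nearest the exit move first
            nxt = x + 1
            if nxt == width:
                continue                  # stepped through the exit: evacuated
            if nxt in new_positions:
                new_positions.add(x)      # blocked by the agent ahead: wait
            else:
                new_positions.add(nxt)
        positions = new_positions
    return steps

# Five agents queued at the left end of a 10-cell corridor.
t = evacuate(10, {0, 1, 2, 3, 4})
print("time to evacuate:", t)
```

Comparing `t` across alternative geometries or exit placements is exactly the kind of quick what-if evaluation the abstract describes for table-top exercises.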

  17. Earth remote sensing as an effective tool for the development of advanced innovative educational technologies

    NASA Astrophysics Data System (ADS)

    Mayorova, Vera; Mayorov, Kirill

    2009-11-01

    The current educational system faces a tension between the fundamental nature of engineering education and the need for more applied learning, which requires new training methods that balance academic and practical knowledge. As a result, a number of innovations are being developed and introduced into the educational process, aimed at improving the quality of the entire educational system. Among the wide range of innovative educational technologies, an especially important subset involves learning through hands-on scientific and technical projects. The purpose of this paper is to describe the implementation of educational technologies based on small satellite development, as well as the use of Earth remote sensing data acquired from these satellites. The increase in public attention to education through Earth remote sensing stems from the concern that, although great progress has been made in developing new methods of Earth imagery and remote sensing data acquisition, the question of practical applications for this kind of data remains open. It is important to develop a new way of thinking in the next generation, so that they understand they are the masters of their own planet and are responsible for its state, and so that they want, and are able, to use a powerful set of tools based on modern and emerging Earth remote sensing. For example, NASA sponsors the "Classroom of the Future" project. The Universities Space Research Association in the United States provides a mechanism through which US universities can cooperate effectively with one another, with the government, and with other organizations to further space science and technology, and to promote education in these areas. It also aims at understanding the Earth as a system and promoting the role of humankind in the destiny of their own planet. The Association has founded a Journal of Earth System …

  18. Computer modeling for advanced life support system analysis.

    PubMed

    Drysdale, A

    1997-01-01

    This article discusses the equivalent mass approach to advanced life support system analysis, describes a computer model developed to use this approach, and presents early results from modeling the NASA JSC BioPlex. The model is built using an object oriented approach and G2, a commercially available modeling package. Cost factor equivalencies are given for the Volosin scenarios. Plant data from NASA KSC and Utah State University (USU) are used, together with configuration data from the BioPlex design effort. Initial results focus on the importance of obtaining high plant productivity with a flight-like configuration. PMID:11540448

  19. Test model designs for advanced refractory ceramic materials

    NASA Technical Reports Server (NTRS)

    Tran, Huy Kim

    1993-01-01

    The next generation of space vehicles will be subjected to severe aerothermal loads and will require an improved thermal protection system (TPS) and other advanced vehicle components. In order to ensure the satisfactory performance of the TPS and other advanced vehicle materials and components, testing must be performed in environments similar to those of space flight. The design and fabrication of the test models should be fairly simple but still accomplish test objectives. In the Advanced Refractory Ceramic Materials test series, the models and model holders will need to withstand the required heat fluxes of 340 to 817 W/sq cm or surface temperatures in the range of 2700 K to 3000 K. The model holders should provide one-dimensional (1-D) heat transfer to the samples and the appropriate flow field without compromising the primary test objectives. Optical properties such as the effective emissivity, catalytic efficiency coefficients, thermal properties, and mass loss measurements are also taken into consideration in the design process. Therefore, it is the intent of this paper to demonstrate the design schemes for different models and model holders that would accommodate these test requirements and ensure safe operation in a typical arc jet facility.

  20. Advanced practice in neurocritical care: an innovative orientation and competency model.

    PubMed

    Vicari-Christensen, Michele

    2014-02-01

    The advanced registered nurse practitioner (ARNP) began in the 1960s as an alternative provider to meet the demands of an escalating healthcare resource deficit. As the role evolved and ARNPs demonstrated safe and effective care, these providers began to appear in critical care settings. It is believed that in the specialty of Neurocritical Care, about half the providers are ARNPs. Hiring and training practitioners for this complex environment is daunting. At the University of Florida & Shands Jacksonville, an innovative orientation and competency model for ARNPs hired for the newly opened Neurocritical Care unit was developed and implemented. The program contains a roadmap for knowledge base and skill acquisition as well as competency training and maintenance. Experience with appropriate hiring and screening standards, internally developed training tools, and identification of necessary advanced classes are discussed. This model may be used as a guideline for Neurocritical Care ARNP training as well as adapted for all other critical care settings. PMID:24399169

  1. Model-free adaptive control of advanced power plants

    SciTech Connect

    Cheng, George Shu-Xing; Mulkey, Steven L.; Wang, Qiang

    2015-08-18

    A novel 3-Input-3-Output (3×3) Model-Free Adaptive (MFA) controller with a set of artificial neural networks as part of the controller is introduced. A 3×3 MFA control system using the inventive 3×3 MFA controller is described to control key process variables including Power, Steam Throttle Pressure, and Steam Temperature of boiler-turbine-generator (BTG) units in conventional and advanced power plants. Those advanced power plants may comprise Once-Through Supercritical (OTSC) Boilers, Circulating Fluidized-Bed (CFB) Boilers, and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.
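The flavor of a model-free adaptive loop, a small neural network adapting online from the tracking error with no explicit plant model, can be sketched on a single-loop toy plant. This is a schematic illustration only, not the patented 3×3 MFA algorithm; the structure, gains, and plant are all invented.

```python
import math
import random

class TinyMFA:
    """Schematic single-loop adaptive controller: a one-hidden-layer
    perceptron augments an incremental (integral-action) control law and
    adapts its weights online from the tracking error."""

    def __init__(self, n_hidden=4, lr=0.02, rng=None):
        rng = rng or random.Random(0)
        self.w1 = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]
        self.w2 = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]
        self.lr = lr
        self.u = 0.0

    def control(self, error):
        self.h = [math.tanh(w * error) for w in self.w1]
        net = sum(w2 * h for w2, h in zip(self.w2, self.h))
        self.u += 0.3 * error + 0.1 * net   # integral action + adaptive term
        return self.u

    def adapt(self, error):
        # gradient-flavored update nudging the net to reduce squared error
        for i, h in enumerate(self.h):
            self.w2[i] += self.lr * error * h
            self.w1[i] += self.lr * error * self.w2[i] * (1.0 - h * h)

# Toy first-order plant standing in for, e.g., a steam-temperature loop:
# y[k+1] = 0.9*y[k] + 0.1*u[k]
ctrl, y, setpoint = TinyMFA(), 0.0, 1.0
for _ in range(400):
    u = ctrl.control(setpoint - y)
    y = 0.9 * y + 0.1 * u
    ctrl.adapt(setpoint - y)
print("tracking error after 400 steps:", round(abs(setpoint - y), 4))
```

The point of the architecture is that nothing in the controller encodes the plant's gain or time constant; the network's online adaptation absorbs them, which is what makes the approach attractive for dissimilar boiler types.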

  2. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    SciTech Connect

    Cetiner, Mustafa Sacit; Flanagan, George F.; Poore III, Willis P.; Muhlheim, Michael David

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C#, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  3. WIFIRE Data Model and Catalog for Wildfire Data and Tools

    NASA Astrophysics Data System (ADS)

    Altintas, I.; Crawl, D.; Cowart, C.; Gupta, A.; Block, J.; de Callafon, R.

    2014-12-01

    The WIFIRE project (wifire.ucsd.edu) is building an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. WIFIRE may be used by wildfire management authorities in the future to predict wildfire rate of spread and direction, and assess the effectiveness of high-density sensor networks in improving fire and weather predictions. WIFIRE has created a data model for wildfire resources including sensed and archived data, sensors, satellites, cameras, modeling tools, workflows and social information including Twitter feeds. This data model and associated wildfire resource catalog includes a detailed description of the HPWREN sensor network, SDG&E's Mesonet, and NASA MODIS. In addition, the WIFIRE data model describes how to integrate the data from multiple heterogeneous sources to provide detailed fire-related information. The data catalog describes 'Observables' captured by each instrument using multiple ontologies including OGC SensorML and NASA SWEET. Observables include measurements such as wind speed, air temperature, and relative humidity, as well as their accuracy and resolution. We have implemented a REST service for publishing to and querying from the catalog using Web Application Description Language (WADL). We are creating web-based user interfaces and mobile device Apps that use the REST interface for dissemination to the wildfire modeling community and project partners covering academic, private, and government laboratories, while generating value to emergency officials and the general public. Additionally, the Kepler scientific workflow system is instrumented to interact with this data catalog to access real-time streaming and archived wildfire data and stream it into dynamic data-driven wildfire models at scale.
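The catalog's central query, which instruments report a given observable and with what units and resolution, can be mimicked with an in-memory structure. The entries and field names below are invented placeholders, not the real WIFIRE schema or its REST interface.

```python
# Hypothetical miniature of a resource catalog: each entry lists the
# observables an instrument reports, so clients can ask "who measures X?"
CATALOG = [
    {"id": "hpwren-sta-12", "network": "HPWREN",
     "observables": {"wind_speed": {"units": "m/s", "resolution": 0.1},
                     "air_temperature": {"units": "degC", "resolution": 0.5}}},
    {"id": "mesonet-sta-3", "network": "SDG&E Mesonet",
     "observables": {"relative_humidity": {"units": "%", "resolution": 1.0}}},
    {"id": "modis-aqua", "network": "NASA MODIS",
     "observables": {"fire_radiative_power": {"units": "MW", "resolution": 1.0}}},
]

def find_sensors(observable):
    """Return (station id, metadata) for every instrument reporting an observable."""
    return [(entry["id"], entry["observables"][observable])
            for entry in CATALOG if observable in entry["observables"]]

print(find_sensors("wind_speed"))
```

In the real system this lookup sits behind the REST service, so a workflow engine or mobile app can discover data sources without hard-coding any one network's format.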

  4. A Clinical Assessment Tool for Advanced Theory of Mind Performance in 5 to 12 Year Olds

    ERIC Educational Resources Information Center

    O'Hare, Anne E.; Bremner, Lynne; Nash, Marysia; Happe, Francesca; Pettigrew, Luisa M.

    2009-01-01

    One hundred forty typically developing 5- to 12-year-old children were assessed with a test of advanced theory of mind employing Happe's strange stories. There was no significant difference in performance between boys and girls. The stories discriminated performance across the different ages with the lowest performance being in the younger…

  5. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  6. ADVANCED TOOLS FOR ASSESSING SELECTED PRESCRIPTION AND ILLICIT DRUGS IN TREATED SEWAGE EFFLUENTS AND SOURCE WATERS

    EPA Science Inventory

    The purpose of this poster is to present the application and assessment of advanced state-of-the-art technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs [azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, m...

  7. Advanced Technologies as Educational Tools in Science: Concepts, Applications, and Issues. Monograph Series Number 8.

    ERIC Educational Resources Information Center

    Kumar, David D.; And Others

    Systems incorporating two advanced technologies, hypermedia systems and intelligent tutors, are examined with respect to their potential impact on science education. The conceptual framework underlying these systems is discussed first. Applications of systems are then presented with examples of each in operation within the context of science…

  8. Just-in-Time Teaching: A Tool for Enhancing Student Engagement in Advanced Foreign Language Learning

    ERIC Educational Resources Information Center

    Abreu, Laurel; Knouse, Stephanie

    2014-01-01

    Scholars have indicated a need for further research on effective pedagogical strategies designed for advanced foreign language courses in the postsecondary setting, especially in light of decreased enrollments at this level and the elimination of foreign language programs altogether in some institutions (Paesani & Allen, 2012). This article…

  9. The diffraction grating in the Ivory optomechanical modeling tools

    NASA Astrophysics Data System (ADS)

    Hatheway, Alson E.

    2013-09-01

    In imaging spectrometers it is important that both the image of the far-field object and the image of the slit be stable on the detector plane. Lenses and mirrors contribute to the motions of these images, but motions of the diffraction grating also have their own influences on these image motions. This paper develops the vector equations for the images (spectra) of the diffraction grating and derives their optomechanical influence coefficients from them. The Ivory Optomechanical Modeling Tools integrate the diffraction grating into the larger optical imaging system and format the whole system's influence coefficients suitably for both spreadsheet and finite element analysis methods. Their application is illustrated in an example of a spectrometer exposed to both static and dynamic disturbances.
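As a scalar illustration of the influence coefficients the paper derives in vector form, the classical grating equation and its differentials give the sensitivity of the diffracted beam to incidence-angle and grating-rotation perturbations (one common sign convention is assumed; the paper's vector treatment is more general):

```latex
% Grating equation for order m, groove spacing d, incidence angle \alpha,
% diffraction angle \beta:
m\lambda = d\,(\sin\alpha + \sin\beta)

% Differentiating at fixed wavelength gives the pointing influence coefficient:
\frac{d\beta}{d\alpha} = -\frac{\cos\alpha}{\cos\beta}

% A small grating rotation \delta about the ruling axis therefore moves the
% diffracted beam in the laboratory frame by approximately
\Delta\beta_{\mathrm{lab}} \approx -\left(1 + \frac{\cos\alpha}{\cos\beta}\right)\delta
```

The factor exceeding unity is why grating rotations are often the dominant contributor to spectral image motion compared with equal rotations of plane mirrors.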

  10. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    PubMed

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  11. Recent advances in microbial production of fuels and chemicals using tools and strategies of systems metabolic engineering.

    PubMed

    Cho, Changhee; Choi, So Young; Luo, Zi Wei; Lee, Sang Yup

    2015-11-15

    The advent of various systems metabolic engineering tools and strategies has enabled more sophisticated engineering of microorganisms for the production of industrially useful fuels and chemicals. Advances in systems metabolic engineering have been made in overproducing natural chemicals and producing novel non-natural chemicals. In this paper, we review the tools and strategies of systems metabolic engineering employed for the development of microorganisms for the production of various industrially useful chemicals, including fuels, building block chemicals, and specialty chemicals, focusing in particular on those reported in the last three years. We aim to provide the current landscape of systems metabolic engineering and to suggest directions for addressing future challenges toward successfully establishing processes for the bio-based production of fuels and chemicals from renewable resources.

  12. Modeling a Transient Pressurization with Active Cooling Sizing Tool

    NASA Technical Reports Server (NTRS)

    Guzik, Monica C.; Plachta, David W.; Elchert, Justin P.

    2011-01-01

    As interest in the area of in-space zero boil-off cryogenic propellant storage develops, the need to visualize and quantify cryogen behavior during ventless tank self-pressurization and subsequent cool-down with active thermal control has become apparent. During the course of a mission, such as the launch ascent phase, there are periods when power to the active cooling system is unavailable. In addition, because it is not feasible to install vacuum jackets on large propellant tanks, as is typically done for in-space cryogenic applications for science payloads, instances like the launch ascent heating phase are important to study. Numerous efforts have been made to characterize cryogenic tank pressurization during ventless cryogen storage without active cooling, but few tools exist to model this behavior in a user-friendly environment for general use, and none exist that quantify the marginal active cooling system size needed for power-down periods to manage tank pressure response once active cooling is resumed. This paper describes the Transient pressurization with Active Cooling Tool (TACT), which is based on a ventless three-lump homogeneous thermodynamic self-pressurization model coupled with an active cooling system estimator. TACT has been designed to estimate the pressurization of a heated but unvented cryogenic tank, assuming an unavailable power period followed by a given cryocooler heat removal rate. By receiving input data on the tank material and geometry, propellant initial conditions, and passive and transient heating rates, a pressurization and recovery profile can be found, which establishes the time needed to return to a designated pressure. This provides the ability to understand the effect that launch ascent and unpowered mission segments have on the size of an active cooling system. A sample of the trends found shows that an active cooling system sized for twice the steady state heating rate would result in a reasonable time for tank …
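The headline sizing question, how long a cryocooler with a given margin over the steady heat leak needs to remove the energy absorbed during an unpowered period, reduces to a simple energy balance. The sketch below is that back-of-envelope estimate only, not TACT's three-lump thermodynamic model, and all numbers are invented.

```python
def recovery_time_hr(heat_leak_W, cooler_lift_W, outage_hr, transient_heat_W):
    """Crude energy-balance estimate: energy absorbed by the propellant
    during an unpowered, unvented period must be removed by the cooler's
    margin over the steady heat leak before pressure returns to its
    starting value."""
    stored_J = (heat_leak_W + transient_heat_W) * outage_hr * 3600.0
    margin_W = cooler_lift_W - heat_leak_W
    if margin_W <= 0:
        raise ValueError("cooler cannot even hold steady-state pressure")
    return stored_J / margin_W / 3600.0

# Cooler sized at 2x the steady heat leak, 1 hr ascent with extra aeroheating:
t = recovery_time_hr(heat_leak_W=20.0, cooler_lift_W=40.0,
                     outage_hr=1.0, transient_heat_W=30.0)
print(round(t, 2), "hours to recover")
```

Even this crude balance shows the trade the abstract describes: the recovery time scales inversely with the cooler's margin, so sizing at twice the steady heating rate bounds the post-ascent recovery period.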

  13. Isolated heart models: cardiovascular system studies and technological advances.

    PubMed

    Olejnickova, Veronika; Novakova, Marie; Provaznik, Ivo

    2015-07-01

    The isolated heart model is a relevant tool for cardiovascular system studies. It represents a highly reproducible model for studying a broad spectrum of biochemical, physiological, morphological, and pharmaceutical parameters, including analysis of intrinsic heart mechanics, metabolism, and coronary vascular response. Results obtained with this model are free from the influence of other organ systems, plasma concentrations of hormones or ions, and the autonomic nervous system. The review describes various isolated heart models, the modes of heart perfusion, and the advantages and limitations of various experimental setups. It also reports the authors' improvements to the Langendorff perfusion setup.

  14. Advanced Algorithms and Automation Tools for Discrete Ordinates Methods in Parallel Environments

    SciTech Connect

    Alireza Haghighat

    2003-05-07

    This final report discusses the major accomplishments of a 3-year project under the DOE's NEER Program. The project developed innovative and automated algorithms, codes, and tools for efficiently solving particle transport problems with the discrete ordinates method in parallel environments. Using a number of benchmark and real-life problems, the performance and accuracy of the new algorithms have been measured and analyzed.

  15. Continuous Symmetry and Chemistry Teachers: Learning Advanced Chemistry Content through Novel Visualization Tools

    ERIC Educational Resources Information Center

    Tuvi-Arad, Inbal; Blonder, Ron

    2010-01-01

    In this paper we describe the learning process of a group of experienced chemistry teachers in a specially designed workshop on molecular symmetry and continuous symmetry. The workshop was based on interactive visualization tools that allow molecules and their symmetry elements to be rotated in three dimensions. The topic of continuous symmetry is…

  16. Advancements in Distributed Generation Issues: Interconnection, Modeling, and Tariffs

    SciTech Connect

    Thomas, H.; Kroposki, B.; Basso, T.; Treanton, B. G.

    2007-01-01

    The California Energy Commission is cost-sharing research with the Department of Energy through the National Renewable Energy Laboratory to address distributed energy resources (DER) topics. These efforts include developing interconnection and power management technologies, modeling the impacts of interconnecting DER with an area electric power system, and evaluating possible modifications to rate policies and tariffs. As a result, a DER interconnection device has been developed and tested. A workshop reviewed the status and issues of advanced power electronic devices. Software simulations used validated models of distribution circuits that incorporated DER, and tests and measurements of actual circuits with and without DER systems are being conducted to validate these models. Current policies affecting DER were reviewed and rate making policies to support deployment of DER through public utility rates and policies were identified. These advancements are expected to support the continued and expanded use of DER systems.

  17. Gasification CFD Modeling for Advanced Power Plant Simulations

    SciTech Connect

    Zitney, S.E.; Guenther, C.P.

    2005-09-01

    In this paper we have described recent progress on developing CFD models for two commercial-scale gasifiers, including a two-stage, coal slurry-fed, oxygen-blown, pressurized, entrained-flow gasifier and a scaled-up design of the PSDF transport gasifier. Also highlighted was NETL’s Advanced Process Engineering Co-Simulator for coupling high-fidelity equipment models with process simulation for the design, analysis, and optimization of advanced power plants. Using APECS, we have coupled the entrained-flow gasifier CFD model into a coal-fired, gasification-based FutureGen power and hydrogen production plant. The results for the FutureGen co-simulation illustrate how the APECS technology can help engineers better understand and optimize gasifier fluid dynamics and related phenomena that impact overall power plant performance.

  18. Implementing an HL7 version 3 modeling tool from an Ecore model.

    PubMed

    Bánfai, Balázs; Ulrich, Brandon; Török, Zsolt; Natarajan, Ravi; Ireland, Tim

    2009-01-01

    One of the main challenges of achieving interoperability using the HL7 V3 healthcare standard is the lack of clear definitions and supporting tools for modeling, testing, and conformance checking. Currently, the knowledge defining the modeling is scattered across MIF schemas, tools, and specifications, or resides only with domain experts. Modeling core HL7 concepts, constraints, and semantic relationships in Ecore/EMF encapsulates the domain-specific knowledge in a transparent way while unifying Java, XML, and UML in an abstract, high-level representation. Moreover, persisting and versioning the core HL7 concepts as a single Ecore context allows modelers and implementers to create, edit, and validate message models against a single modeling context. The solution discussed in this paper is implemented in the new HL7 Static Model Designer, an extensible toolset integrated as a standalone Eclipse RCP application.

  19. Advances in omics and bioinformatics tools for systems analyses of plant functions.

    PubMed

    Mochida, Keiichi; Shinozaki, Kazuo

    2011-12-01

    Omics and bioinformatics are essential to understanding the molecular systems that underlie various plant functions. Recent game-changing sequencing technologies have revitalized sequencing approaches in genomics and have produced opportunities for various emerging analytical applications. Driven by technological advances, several new omics layers such as the interactome, epigenome and hormonome have emerged. Furthermore, in several plant species, the development of omics resources has progressed to address particular biological properties of individual species. Integration of knowledge from omics-based research is an emerging issue as researchers seek to identify significance, gain biological insights and promote translational research. From these perspectives, we provide this review of the emerging aspects of plant systems research based on omics and bioinformatics analyses together with their associated resources and technological advances.

  20. A novel cell culture model as a tool for forensic biology experiments and validations.

    PubMed

    Feine, Ilan; Shpitzen, Moshe; Roth, Jonathan; Gafny, Ron

    2016-09-01

    To improve and advance DNA forensic casework investigation outcomes, extensive field and laboratory experiments are carried out in a broad range of relevant branches, such as touch and trace DNA, secondary DNA transfer and contamination confinement. Moreover, the development of new forensic tools, for example new sampling appliances, by commercial companies requires ongoing validation and assessment by forensic scientists. A frequent challenge in these kinds of experiments and validations is the lack of a stable, reproducible and flexible biological reference material. As a possible solution, we present here a cell culture model based on skin-derived human dermal fibroblasts. Cultured cells were harvested, quantified and dried on glass slides. These slides were used in adhesive tape-lifting experiments and tests of DNA crossover confinement by UV irradiation. The use of this model enabled a simple and concise comparison between four adhesive tapes, as well as a straightforward demonstration of the effect of UV irradiation intensities on DNA quantity and degradation. In conclusion, we believe this model has great potential to serve as an efficient research tool in forensic biology.

  1. Collaborative platform, tool-kit, and physical models for DfM

    NASA Astrophysics Data System (ADS)

    Neureuther, Andy; Poppe, Wojtek; Holwill, Juliet; Chin, Eric; Wang, Lynn; Yang, Jae-Seok; Miller, Marshal; Ceperley, Dan; Clifford, Chris; Kikuchi, Koji; Choi, Jihong; Dornfeld, Dave; Friedberg, Paul; Spanos, Costas; Hoang, John; Chang, Jane; Hsu, Jerry; Graves, David; Wu, Alan C. F.; Lieberman, Mike

    2007-03-01

    Exploratory prototype DfM tools, methodologies, and emerging physical process models are described. The examples include new platforms for collaboration on process/device/circuit issues, visualization and quantification of manufacturing effects at the mask layout level, and advances toward fast-CAD models for lithography, CMP, etch, and photomasks. The examples have evolved from research supported over the last several years by DARPA, SRC, industry, and the State of California U.C. Discovery Program. DfM tools must enable complexity management with very fast, first-cut-accurate models across process, device, and circuit performance, along with new modes of collaboration. Collaboration can be promoted by supporting simultaneous views in naturally intuitive parameters for each contributor. An important theme is to shift the viewpoint on statistical variation in timing and power upstream, from gate-level CD distributions to a more deterministic set of sources of variation in characterized processes. Many of these nonidealities of manufacturing can be expressed at the mask plane in terms of lateral impact functions that capture effects not included in design rules. Pattern Matching and Perturbation Formulations are shown to be well suited for quantifying these sources of variation.

  2. The SMOKE-FIREPLUME model: a tool for eventual application to prescribed burns and wildland fires.

    SciTech Connect

    Brown, D. F.; Dunn, W. E.; Lazaro, M. A.; Policastro, A. J.

    1999-08-17

    Land managers are increasingly implementing strategies that employ fire in prescribed burns to sustain ecosystems, and they plan to sustain the rate of increase in its use over the next five years. In planning and executing the expanded use of fire in wildland treatment, it is important to estimate the human health and safety consequences, property damage, and extent of visibility degradation from the resulting conflagration: the pyrolysis gases, soot, and smoke generated during flaming, smoldering, and/or glowing fires. Traditional approaches have often employed the analysis of weather observations and forecasts to determine whether a prescribed burn will affect populations, property, or protected Class I areas. However, the complexity of the problem lends itself to advanced PC-based models that are simple to use for both calculating the emissions from the burning of wildland fuels and the downwind dispersion of smoke and other products of pyrolysis, distillation, and/or fuel combustion. These models will need to address the effects of residual smoldering combustion, including plume dynamics and optical effects. In this paper, we discuss a suite of tools that can be applied to analyzing dispersion. These tools include the dispersion models FIREPLUME and SMOKE, together with the meteorological preprocessor SEBMET.
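As a rough illustration of the dispersion side of such tools, a textbook Gaussian plume estimate is sketched below. This is a generic sketch only: FIREPLUME and SMOKE use far more detailed physics, and the function and parameter names here are assumptions. In practice the dispersion sigmas grow with downwind distance and atmospheric stability class.

```python
import math

def gaussian_plume(q, u, y, z, sigma_y, sigma_z, h=0.0):
    """Textbook Gaussian plume concentration (g/m^3) from a continuous
    point source of strength q (g/s) in wind speed u (m/s), at crosswind
    offset y and height z, with lateral/vertical dispersion coefficients
    sigma_y and sigma_z (m) and effective release height h. The second
    vertical term is the standard ground-reflection image source."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                + math.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

For a ground-level release the centerline ground concentration reduces to q / (pi * u * sigma_y * sigma_z), and concentrations fall off with crosswind offset, which is a quick sanity check on the formula.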

  3. M4AST - A Tool for Asteroid Modelling

    NASA Astrophysics Data System (ADS)

    Birlan, Mirel; Popescu, Marcel; Irimiea, Lucian; Binzel, Richard

    2016-10-01

    M4AST (Modelling for Asteroids) is an online tool devoted to the analysis and interpretation of reflection spectra of asteroids in the visible and near-infrared spectral intervals. It consists of a spectral database of individual objects and a set of analysis routines addressing scientific aspects such as taxonomy, curve matching with laboratory spectra, space weathering models, and mineralogical diagnosis. Spectral data were obtained using ground-based facilities; part of these data are compiled from the literature [1]. The database is composed of permanent and temporary files. Each permanent file contains a header and two or three columns (wavelength, spectral reflectance, and the error on spectral reflectance). Temporary files can be uploaded anonymously and are periodically purged to protect the ownership of the submitted data. The computing routines are organized around several scientific objectives: visualizing spectra, computing the asteroid taxonomic class, comparing an asteroid spectrum with similar spectra of meteorites, and computing mineralogical parameters. A facility for using the Virtual Observatory protocols was also developed. A new version of the service was released in June 2016. This release contains a database and facilities to model more than 6,000 asteroid spectra, and a new web interface brings the new functionalities into a user-friendly environment. A bridge system for accessing and exploiting the SMASS-MIT database (http://smass.mit.edu) allows the treatment and analysis of these data in the framework of the M4AST environment. Reference: [1] M. Popescu, M. Birlan, and D.A. Nedelcu, "Modeling of asteroids: M4AST," Astronomy & Astrophysics 544, A130, 2012.
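The curve-matching idea can be illustrated by a minimal mean-squared-error comparison of an asteroid spectrum against a set of laboratory spectra sampled at the same wavelengths. This is a hedged sketch under stated assumptions; M4AST's actual matching metric and interfaces are not reproduced here.

```python
def curve_match(asteroid, candidates):
    """Rank candidate laboratory spectra by mean squared deviation from an
    asteroid reflectance spectrum sampled at the same wavelengths.
    asteroid: list of reflectance values; candidates: dict name -> list.
    Returns the best (lowest-error) candidate name and all scores."""
    def mse(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    scores = {name: mse(asteroid, spec) for name, spec in candidates.items()}
    best = min(scores, key=scores.get)
    return best, scores
```

Real services typically normalize both spectra and interpolate onto a common wavelength grid before scoring; those steps are omitted here for brevity.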

  4. ADVANCEMENT OF NUCLEIC ACID-BASED TOOLS FOR MONITORING IN SITU REDUCTIVE DECHLORINATION

    SciTech Connect

    Vangelas, K; Edwards, Elizabeth; Loffler, Frank; Looney, Brian

    2006-11-17

    Regulatory protocols generally recognize that destructive processes are the most effective mechanisms that support natural attenuation of chlorinated solvents. In many cases, these destructive processes will be biological processes and, for chlorinated compounds, will often be reductive processes that occur under anaerobic conditions. The existing EPA guidance (EPA, 1998) provides a list of parameters that provide indirect evidence of reductive dechlorination processes. In an effort to gather direct evidence of these processes, scientists have identified key microorganisms and are currently developing tools to measure the abundance and activity of these organisms in subsurface systems. Drs. Edwards and Loffler are two recognized leaders in this field. The research described herein continues their development efforts to provide a suite of tools that enable direct measures of biological processes related to the reductive dechlorination of TCE and PCE. This study investigated the strengths and weaknesses of the 16S rRNA gene-based approach to characterizing the natural attenuation capabilities in samples. The results suggested that an approach based solely on 16S rRNA may not provide sufficient information to document the natural attenuation capabilities in a system because it does not distinguish between strains of organisms that have different biodegradation capabilities. The results of the investigations provided evidence that tools focusing on enzymes relevant to functionally desired characteristics may be useful adjuncts to the 16S rRNA methods.

  5. From beginners to trained users: an advanced tool to guide experimenters in basic applied fluorescence

    NASA Astrophysics Data System (ADS)

    Pingand, Philippe B.; Lerner, Dan A.

    1993-05-01

    UPY-F is a software package dedicated to solving various queries issued by end users of spectrofluorimeters when they come across a problem in the course of an experiment. The main goal is to provide a diagnostic for nonpertinent use of a spectrofluorimeter. Many artifacts may lead the operator into trouble, and except for experts, the simple manipulation of the controls of a fluorimeter produces effects that are not always fully appreciated. The solution retained is an association between a powerful hypermedia tool and an expert system. A straight expert system offers a number of well-known advantages, but it is not well accepted by users because of the many moves between the spectrofluorimeter and the diagnostic tool. In our hypermedia tool, knowledge is displayed by means of visual concepts through which one can browse and navigate. The user still perceives his problem as a whole, which may not be the case with a straight expert system. We demonstrate typical situations in which an event triggers a chain of reasoning leading to the debugging of the problem. The system is not only meant to help a beginner but can adapt itself to guide a well-trained experimenter. We think that its functionality and user-friendly interface are very attractive and open new vistas in the way future users may be trained, whether they work in research labs or industrial settings, as it could notably cut down on the time spent on their training.

  6. Tools and Models for Integrating Multiple Cellular Networks

    SciTech Connect

    Gerstein, Mark

    2015-11-06

    In this grant, we have systematically investigated the integrated networks responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open source, available for download from GitHub, and can be incorporated into the Knowledgebase. We summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network, and the algorithm is generally applicable to regulatory networks in prokaryotes, yeast, and higher organisms. Integrated datasets are extremely beneficial for understanding the biology of a system in a compact manner because they conflate multiple layers of information. Therefore, for Aim 2 of this grant, we developed several tools and carried out analyses for integrating system-wide genomic information. To make use of structural data, we developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we organized the E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies and correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed
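One common way to assign hierarchy levels to an acyclic regulatory network, in the spirit of Aim 1, is by longest regulator-to-target path. This is an illustrative sketch only; the published algorithm [1] is more elaborate than this.

```python
from functools import lru_cache

def hierarchy_levels(edges):
    """Assign each node in an acyclic regulatory network a level equal to
    the longest regulator->target path reaching it, so that information
    flows strictly downward through the levels.
    edges: list of (regulator, target) pairs."""
    nodes = {n for e in edges for n in e}
    parents = {n: [] for n in nodes}
    for reg, tgt in edges:
        parents[tgt].append(reg)

    @lru_cache(maxsize=None)
    def level(n):
        # Nodes with no regulators sit at the top of the hierarchy (level 0)
        return 0 if not parents[n] else 1 + max(level(p) for p in parents[n])

    return {n: level(n) for n in nodes}
```

Master regulators (no incoming edges) land at level 0; deeply regulated targets land lower, giving a direction-of-information-flow picture of the network.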

  7. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    SciTech Connect

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. 
The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating between
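The first outbreak-recognition algorithm above searches for space-time data clusters. A purely temporal toy version of such a scan, flagging days whose counts are anomalously high against a trailing baseline, might look like the sketch below. The names, window size, and threshold are assumptions, not the project's implementation, and real space-time scan statistics also search over spatial regions.

```python
import statistics

def flag_outbreaks(daily_counts, window=7, z_thresh=3.0):
    """Crude temporal scan: flag the index of each day whose case count
    exceeds the mean of the preceding `window` days by more than z_thresh
    population standard deviations of that baseline."""
    alarms = []
    for i in range(window, len(daily_counts)):
        base = daily_counts[i - window:i]
        mu = statistics.mean(base)
        sd = statistics.pstdev(base) or 1.0  # guard against zero variance
        if (daily_counts[i] - mu) / sd > z_thresh:
            alarms.append(i)
    return alarms
```

A flat series of around ten cases a day followed by a spike to fifty triggers a single alarm on the spike day, which is the qualitative behavior a surveillance detector needs.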

  8. Advanced computer techniques for inverse modeling of electric current in cardiac tissue

    SciTech Connect

    Hutchinson, S.A.; Romero, L.A.; Diegert, C.F.

    1996-08-01

    For many years, ECGs and vector cardiograms have been the tools of choice for non-invasive diagnosis of cardiac conduction problems, such as those found in reentrant tachycardia or Wolff-Parkinson-White (WPW) syndrome. Through skillful analysis of these skin-surface measurements of cardiac-generated electric currents, a physician can deduce the general location of heart conduction irregularities. Using a combination of high-fidelity geometry modeling, advanced mathematical algorithms, and massively parallel computing, Sandia's approach would provide much more accurate information and thus allow the physician to pinpoint the source of an arrhythmia or abnormal conduction pathway.

  9. Decoding Advances in Psychiatric Genetics: A Focus on Neural Circuits in Rodent Models.

    PubMed

    Heckenast, Julia R; Wilkinson, Lawrence S; Jones, Matthew W

    2015-01-01

    Appropriately powered genome-wide association studies combined with deep-sequencing technologies offer the prospect of real progress in revealing the complex biological underpinnings of schizophrenia and other psychiatric disorders. Meanwhile, recent developments in genome engineering, including CRISPR, constitute better tools to move forward with investigating these genetic leads. This review aims to assess how these advances can inform the development of animal models for psychiatric disease, with a focus on schizophrenia and in vivo electrophysiological circuit-level measures with high potential as disease biomarkers.

  10. ADVANCED ELECTRIC AND MAGNETIC MATERIAL MODELS FOR FDTD ELECTROMAGNETIC CODES

    SciTech Connect

    Poole, B R; Nelson, S D; Langdon, S

    2005-05-05

    The modeling of dielectric and magnetic materials in the time domain is required for pulsed power applications, pulsed induction accelerators, and advanced transmission lines. For example, most induction accelerator modules require the use of magnetic materials to provide adequate volt-seconds during the acceleration pulse. These models require hysteresis and saturation to simulate the saturation wavefront in a multipulse environment. In high-voltage transmission line applications such as shock or soliton lines, the dielectric operates in a highly nonlinear regime, which requires nonlinear models. Simple 1-D models were developed for fast parameterization of transmission line structures. For nonlinear dielectrics, a simple analytic model describing the permittivity in terms of the electric field is used in a 3-D finite-difference time-domain (FDTD) code. For magnetic materials, both rate-independent and rate-dependent Hodgdon magnetic material models have been implemented in 3-D FDTD codes and 1-D codes.
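A minimal 1-D FDTD update with a field-dependent permittivity illustrates how an analytic nonlinear-dielectric model plugs into the time stepping. This is a toy in normalized units: the eps_r(E) = 1 + alpha*E^2 form and all names are assumptions for the sketch, not the paper's actual material model.

```python
import math

def fdtd_1d(steps, nz=200, dz=1.0, alpha=0.0):
    """Minimal 1-D Yee-style FDTD sketch with a field-dependent relative
    permittivity eps_r(E) = 1 + alpha*E^2, re-evaluated at every step
    (a toy analytic nonlinear-dielectric model). Normalized units with
    c = 1 and Courant number 0.5; hard sine source at the left edge."""
    coef = 0.5  # c*dt/dz, stable for eps_r >= 1
    ez = [0.0] * nz
    hy = [0.0] * nz
    for n in range(steps):
        for k in range(nz - 1):
            hy[k] += coef * (ez[k + 1] - ez[k])
        for k in range(1, nz):
            eps_r = 1.0 + alpha * ez[k] ** 2  # nonlinear dielectric response
            ez[k] += coef / eps_r * (hy[k] - hy[k - 1])
        ez[0] = math.sin(0.1 * n)  # hard source
    return ez
```

With alpha = 0 this reduces to the standard linear update; a positive alpha locally raises the permittivity where the field is strong, slowing the wave there, which is the mechanism behind the shock and soliton line behavior mentioned above.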

  11. The TEF modeling and analysis approach to advance thermionic space power technology

    NASA Astrophysics Data System (ADS)

    Marshall, Albert C.

    1997-01-01

    Thermionic space power systems have been proposed as advanced power sources for future space missions that require electrical power levels significantly above the capabilities of current space power systems. The Defense Special Weapons Agency's (DSWA) Thermionic Evaluation Facility (TEF) is carrying out both experimental and analytical research to advance thermionic space power technology to meet this expected need. A Modeling and Analysis (M&A) project has been created at the TEF to develop analysis tools, evaluate concepts, and guide research. M&A activities are closely linked to the TEF experimental program, providing experiment support and using experimental data to validate models. A planning exercise has been completed for the M&A project, and a strategy for implementation was developed. All M&A activities will build on a framework provided by a system performance model for a baseline Thermionic Fuel Element (TFE) concept. The system model is composed of sub-models for each of the system components and sub-systems. Additional thermionic component options and model improvements will continue to be incorporated into the basic system model during the course of the program. All tasks are organized into four focus areas: 1) system models, 2) thermionic research, 3) alternative concepts, and 4) documentation and integration. The M&A project will provide a solid framework for future thermionic system development.

  12. A US Perspective on Selected Biotechnological Advancements in Fish Health Part II: Genetic stock improvement, biosecurity tools and alternative protein sources in fish diets

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Remarkable biotechnological advancements have been made in the aquaculture industry in the past five years. Advancements, in areas such as fish vaccines, improved genetic stock, biosecurity tools and alternative protein sources in fish diets, are necessary to meet the rapid growth of the aquacultur...

  13. Measurement and modeling of advanced coal conversion processes, Volume III

    SciTech Connect

    Ghani, M.U.; Hobbs, M.L.; Hamblen, D.G.

    1993-08-01

    A generalized one-dimensional, heterogeneous, steady-state, fixed-bed model for coal gasification and combustion is presented. The model, FBED-1, is a design and analysis tool that can be used to simulate a variety of gasification, devolatilization, and combustion processes. The model considers separate gas and solid temperatures, axially variable solid and gas flow rates, variable bed void fraction, coal drying, devolatilization based on chemical functional group composition, depolymerization, vaporization and crosslinking, oxidation, and gasification of char, and partial equilibrium in the gas phase.

  14. Development of an innovative spacer grid model utilizing computational fluid dynamics within a subchannel analysis tool

    NASA Astrophysics Data System (ADS)

    Avramova, Maria

    In the past few decades, the need for improved nuclear reactor safety analyses has led to a rapid development of advanced methods for multidimensional thermal-hydraulic analyses. These methods have become progressively more complex in order to account for the many physical phenomena anticipated during steady-state and transient Light Water Reactor (LWR) conditions. The advanced thermal-hydraulic subchannel code COBRA-TF (Thurgood, M. J. et al., 1983) is used worldwide for best-estimate evaluations of nuclear reactor safety margins. In the framework of a joint research project between the Pennsylvania State University (PSU) and AREVA NP GmbH, the theoretical models and numerics of COBRA-TF have been improved. Under the name F-COBRA-TF, the code has been subjected to an extensive verification and validation program and has been applied to a variety of LWR steady-state and transient simulations. To prepare F-COBRA-TF for industrial applications, including safety margin evaluations and design analyses, the code's spacer grid models were revised and substantially improved. The state of the art in modeling spacer grid effects on the thermal-hydraulic performance of the flow in rod bundles employs numerical experiments performed with computational fluid dynamics (CFD) calculations. Because of the computational cost involved, CFD codes cannot yet be used for full-bundle predictions, but their capabilities can be utilized to develop more advanced and sophisticated models for subchannel-level analyses. A subchannel code equipped with improved physical models can then be a powerful tool for LWR safety and design evaluations. The unique contributions of this PhD research are seen as the development, implementation, and qualification of an innovative spacer grid model that utilizes CFD results within the framework of a subchannel analysis code. Usually, spacer grid models are mostly related to modeling of the entrainment and deposition phenomena and the heat

  15. SMOKE TOOL FOR MODELS-3 VERSION 4.1 STRUCTURE AND OPERATION DOCUMENTATION

    EPA Science Inventory

    The SMOKE Tool is a part of the Models-3 system, a flexible software system designed to simplify the development and use of air quality models and other environmental decision support tools. The SMOKE Tool is an input processor for SMOKE, (Sparse Matrix Operator Kernel Emissio...

  16. Evaluation of ADAM/1 model for advanced coal extraction concepts

    NASA Technical Reports Server (NTRS)

    Deshpande, G. K.; Gangal, M. D.

    1982-01-01

    Several existing computer programs for estimating life cycle cost of mining systems were evaluated. A commercially available program, ADAM/1 was found to be satisfactory in relation to the needs of the advanced coal extraction project. Two test cases were run to confirm the ability of the program to handle nonconventional mining equipment and procedures. The results were satisfactory. The model, therefore, is recommended to the project team for evaluation of their conceptual designs.

  17. Advanced geothermal hydraulics model -- Phase 1 final report, Part 2

    SciTech Connect

    W. Zheng; J. Fu; W. C. Maurer

    1999-07-01

    An advanced geothermal well hydraulics model (GEODRIL) is being developed to accurately calculate bottom-hole conditions in hot geothermal wells. In Phase 1, real-time monitoring and other improvements were added to GEODRIL. In Phase 2, GEODRIL will be integrated into Marconi's Intelligent Drilling Monitor (IDM), which will use artificial intelligence to detect lost circulation, fluid influxes, and other circulation problems in geothermal wells. This software platform has the potential to significantly reduce geothermal drilling costs.

  18. Advanced techniques in IR thermography as a tool for the pest management professional

    NASA Astrophysics Data System (ADS)

    Grossman, Jon L.

    2006-04-01

    Within the past five years, the pest management industry has become aware that IR thermography can aid in the detection of pest infestations and locate other conditions that are within the purview of the industry. This paper reviews the applications available to the pest management professional and discusses the advanced techniques that may be required in conjunction with thermal imaging to locate insect and other pest infestations and moisture within structures, verify data, and meet the special challenges associated with the inspection process.

  19. Tools for Model Building and Optimization into Near-Atomic Resolution Electron Cryo-Microscopy Density Maps.

    PubMed

    DiMaio, F; Chiu, W

    2016-01-01

    Electron cryo-microscopy (cryoEM) has advanced dramatically to become a viable tool for high-resolution structural biology research. The ultimate outcome of a cryoEM study is an atomic model of a macromolecule or its complex with interacting partners. This chapter describes a variety of algorithms and software to build a de novo model based on the cryoEM 3D density map, to optimize the model with the best stereochemistry restraints and finally to validate the model with proper protocols. The full process of atomic structure determination from a cryoEM map is described. The tools outlined in this chapter should prove extremely valuable in revealing atomic interactions guided by cryoEM data. PMID:27572730

  1. The DSET Tool Library: A software approach to enable data exchange between climate system models

    SciTech Connect

    McCormick, J.

    1994-12-01

    Climate modeling is a computationally intensive process. Until recently, computers were not powerful enough to perform the complex calculations required to simulate the Earth's climate. As a result, standalone programs were created that represent components of the Earth's climate (e.g., Atmospheric Circulation Model). However, recent advances in computing, including massively parallel computing, make it possible to couple the components to form a complete Earth climate simulation. The ability to couple different climate model components will significantly improve our ability to predict climate accurately and reliably. Historically, each major component of the coupled Earth simulation is a standalone program designed independently, with different coordinate systems and data representations. In order for two component models to be coupled, the data of one model must be mapped to the coordinate system of the second model. The focus of this project is to provide a general tool to facilitate the mapping of data between simulation components, with an emphasis on using object-oriented programming techniques to provide polynomial interpolation, line and area weighting, and aggregation services.
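
    The kind of interpolation service such a coupling library provides can be pictured with a minimal sketch: mapping a 2-D field from one regular latitude/longitude grid onto another by bilinear interpolation. The function name and grids below are hypothetical illustrations, not DSET's actual API, which also offers line/area weighting and aggregation.

    ```python
    import numpy as np

    def bilinear_regrid(src, src_lats, src_lons, dst_lats, dst_lons):
        """Map a 2-D field from one regular lat/lon grid to another."""
        out = np.empty((len(dst_lats), len(dst_lons)))
        for i, lat in enumerate(dst_lats):
            # locate the source cell containing this latitude (clamped to the grid)
            yi = int(np.clip(np.searchsorted(src_lats, lat) - 1, 0, len(src_lats) - 2))
            ty = (lat - src_lats[yi]) / (src_lats[yi + 1] - src_lats[yi])
            for j, lon in enumerate(dst_lons):
                xi = int(np.clip(np.searchsorted(src_lons, lon) - 1, 0, len(src_lons) - 2))
                tx = (lon - src_lons[xi]) / (src_lons[xi + 1] - src_lons[xi])
                # weighted average of the four surrounding source values
                out[i, j] = ((1 - ty) * (1 - tx) * src[yi, xi]
                             + (1 - ty) * tx * src[yi, xi + 1]
                             + ty * (1 - tx) * src[yi + 1, xi]
                             + ty * tx * src[yi + 1, xi + 1])
        return out
    ```

    A constant field regridded this way stays constant, which is a quick sanity check for any regridding scheme.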

  2. Advances in Plasma Process Equipment Development using Plasma and Electromagnetics Modeling

    NASA Astrophysics Data System (ADS)

    Agarwal, Ankur

    2013-10-01

    Plasma processing is widely used in the semiconductor industry for thin film etching and deposition, modification of near-surface material, and cleaning. In particular, the challenges for plasma etching have increased as the critical feature dimensions for advanced semiconductor devices have decreased to 20 nm and below. Critical scaling limitations are increasingly driving the transition to 3D solutions such as multi-gate MOSFETs and 3D NAND structures. These structures create significant challenges for dielectric and conductor etching, especially given the high aspect ratio (HAR) of the features. Plasma etching equipment must therefore be capable of exacting profile control across the entire wafer for feature aspect ratios up to 80:1, high throughput, and exceptionally high selectivity. The multiple challenges for advanced 3D structures are addressed by Applied Materials' plasma etching chambers, which provide highly sophisticated control of ion energy, wafer temperature and plasma chemistry. Given the costs associated with such complex designs and reduced development time-scales, many of these design innovations have been enabled by advanced computational plasma modeling tools. We have expended considerable effort to develop 3-dimensional coupled plasma and electromagnetic modeling tools in recent years. In this work, we report on this modeling software and its application to plasma processing system design and to the evaluation of strategies for hardware and process improvement. Several of these examples deal with process uniformity, which is one of the major challenges facing plasma processing equipment design on large substrates. Three-dimensional plasma modeling is used to understand the sources of plasma non-uniformity, including the radio-frequency (RF) current path, and to develop uniformity improvement techniques. Examples from coupled equipment and process models investigating the dynamics of pulsed plasmas and their impact on plasma chemistry will also be presented.

  3. Modeling Ionosphere Environments: Creating an ISS Electron Density Tool

    NASA Technical Reports Server (NTRS)

    Gurgew, Danielle N.; Minow, Joseph I.

    2011-01-01

    The International Space Station (ISS) typically maintains an altitude between 300 km and 400 km in low Earth orbit (LEO), which lies within the Earth's ionosphere. The ionosphere is a region of partially ionized gas (plasma) formed by the photoionization of neutral atoms and molecules in the upper atmosphere of Earth. It is important to understand the electron density environment in which the spacecraft operates, because the ionized gas along the ISS orbit interacts with the electrical power system, resulting in charging of the vehicle. One instrument already operational onboard the ISS with the goal of monitoring electron density, electron temperature, and ISS floating potential is the Floating Potential Measurement Unit (FPMU). Although this tool is a valuable addition to the ISS, there are limitations concerning the data collection periods. The FPMU uses the Ku-band communication frequency to transmit data from orbit. Use of this band for FPMU data runs is often terminated due to necessary observation of higher-priority Extravehicular Activities (EVAs) and other operations on ISS. Thus, large gaps are present in FPMU data. The purpose of this study is to solve the issue of missing environmental data by implementing a secondary electron density data source, derived from the COSMIC satellite constellation, to create a model of ISS orbital environments. Extrapolating data specific to ISS orbital altitudes, we model the ionospheric electron density along the ISS orbit track to supply a set of data when the FPMU is unavailable. This computer model also provides an additional new source of electron density data that is used to confirm the FPMU is operating correctly and to supplement the original environmental data taken by the FPMU.
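
    Schematically, the gap-filling step amounts to substituting a secondary model value wherever the primary record is missing. The sketch below is a deliberate simplification under assumed data shapes (NaN-marked gaps, co-sampled series); the actual study extrapolates COSMIC retrievals to ISS altitudes before substitution, and the function name is hypothetical.

    ```python
    import numpy as np

    def fill_density_gaps(fpmu_ne, cosmic_ne):
        """Where the primary (FPMU) electron-density record has gaps (NaN),
        substitute the secondary (COSMIC-derived) model value."""
        fpmu_ne = np.asarray(fpmu_ne, dtype=float)
        cosmic_ne = np.asarray(cosmic_ne, dtype=float)
        return np.where(np.isnan(fpmu_ne), cosmic_ne, fpmu_ne)
    ```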

  4. Report calls for measures to advance climate modeling

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2012-09-01

    While climate modeling has made enormous strides over the past several decades, a critical step toward making more rapid, efficient, and coordinated progress in modeling would require “an evolutionary change in U.S. climate modeling institutions away from developing multiple completely independent models toward a collaborative approach,” according to a 7 September report by a committee of the U.S. National Research Council's Board on Atmospheric Sciences and Climate (BASC). “The Committee believes that the best path forward is a strategy centered around the integration of the decentralized U.S. climate modeling enterprise—across modeling efforts, across a hierarchy of model types, across modeling communities focused on different space and timescales, and between model developers and model output users,” the report notes. “A diversity of approaches is necessary for progress in many areas of climate modeling and is vital for addressing the breadth of users' needs.” Entitled A National Strategy for Advancing Climate Modeling, the report states that, “If adopted, this strategy of increased unification amidst diversity will allow the United States to more effectively meet the climate information needs of the Nation in the coming decades and beyond.”

  5. Hydrogeological modelling as a tool for understanding rockslides evolution

    NASA Astrophysics Data System (ADS)

    Crosta, Giovanni B.; De Caro, Mattia; Frattini, Paolo; Volpi, Giorgio

    2015-04-01

    construction of the models, in particular the partition of the slope into different sectors with different hydraulic conductivities, is coherent with the geological, structural, hydrological and hydrogeological field and laboratory data. The sensitivity analysis shows that the hydraulic conductivity of some slope sectors (e.g. morphostructures, compressed or relaxed slope-toe, basal shear band) strongly influences the water table position and evolution. In transient models, the value of the specific storage coefficient exerts a major control on the amplitude of groundwater level fluctuations deriving from snowmelt or an induced reservoir level rise. The calibrated groundwater flow models are consistent with groundwater levels measured by the piezometers aligned along the sections. The two examples are important for a more advanced understanding of the evolution of rockslides and suggest the set of data and modelling approaches required for both seasonal and long-term slope stability analyses. The use of the results of such analyses is reported, for both case studies, in a companion abstract in session 3.7, where elasto-visco-plastic rheologies have been adopted for the shear band materials to replicate the available displacement time-series.

  6. Biomorphodynamic modelling of inner bank advance in migrating meander bends

    NASA Astrophysics Data System (ADS)

    Zen, Simone; Zolezzi, Guido; Toffolon, Marco; Gurnell, Angela M.

    2016-07-01

    We propose a bio-morphodynamic model at bend cross-sectional scale for the lateral migration of river meander bends, where the two banks can migrate separately as a result of the mutual interaction between river flow, sediments and riparian vegetation, particularly at the interface between the permanently wet channel and the advancing floodplain. The model combines a non-linear analytical model for the morphodynamic evolution of the channel bed, a quasi-1D model to account for flow unsteadiness, and an ecological model describing riparian vegetation dynamics. Simplified closures are included to estimate the feedbacks among vegetation, hydrodynamics and sediment transport, which affect the morphology of the river-floodplain system. Model tests reveal the fundamental role of riparian plants in generating bio-morphological patterns at the advancing floodplain margin. Importantly, they provide insight into the biophysical controls of the 'bar push' mechanism and into its role in the lateral migration of meander bends and in the temporal variations of the active channel width.

  7. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Garcia, Jessica; Beers, Benjamin; Philips, Alan; Holt, James B.; Threet, Grady E., Jr.

    2013-01-01

    The Earth to Orbit (ETO) Team of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the preeminent group for pre-Phase A and Phase A concept definition. The ACO team has been at the forefront of a multitude of launch vehicle studies determining the future direction of the Agency as a whole, due in part to its rapid turnaround time in analyzing concepts and its ability to cover broad trade spaces of vehicles in that limited timeframe. Each completed vehicle concept includes a full mass breakdown of the vehicle down to tertiary subsystem components, along with a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta-v capability. Additionally, a structural analysis of the vehicle based on material properties and geometries is performed, as well as an analysis to determine the flight loads based on the trajectory outputs. As mentioned, the ACO Earth to Orbit Team prides itself on rapid turnaround and often needs to fulfill customer requests on a limited schedule or with little advance notice. Working in this fast-paced environment, the ETO team has developed finely honed skills and methods to maximize its delivery capability and meet customer needs. This paper describes the interfaces between the three primary disciplines used in the design process (weights and sizing, trajectory, and structural analysis), as well as the approach each discipline employs to streamline its piece of the design process.

  8. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    SciTech Connect

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely: (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand its role as a NEAMS user facility.

  9. A New Climate Adjustment Tool: An update to EPA’s Storm Water Management Model

    EPA Science Inventory

    The US EPA’s newest tool, the Stormwater Management Model (SWMM) – Climate Adjustment Tool (CAT), is meant to help municipal stormwater utilities better address potential climate change impacts affecting their operations.

  10. Advanced simulation model for IPM motor drive with considering phase voltage and stator inductance

    NASA Astrophysics Data System (ADS)

    Lee, Dong-Myung; Park, Hyun-Jong; Lee, Ju

    2016-10-01

    This paper proposes an advanced simulation model of the drive system for Interior Permanent Magnet (IPM) BrushLess Direct Current (BLDC) motors driven by the 120-degree conduction method (two-phase conduction method, TPCM), which is widely used for sensorless control of BLDC motors. BLDC motors can be classified as SPM (Surface-mounted Permanent Magnet) and IPM motors. Simulation models of drive systems with SPM motors are simple because the stator inductance is constant regardless of the rotor position, and such models have been proposed in many studies. On the other hand, simulation models for IPM drive systems in graphic-based simulation tools such as Matlab/Simulink have not been proposed. Simulating IPM drive systems with TPCM is complex because the stator inductances of an IPM vary with the rotor position, as the permanent magnets are embedded in the rotor. To develop sensorless schemes or improve control performance, development of control algorithms through simulation is essential, and a simulation model that accurately reflects the characteristics of the IPM is required. Therefore, this paper presents an advanced simulation model of the IPM drive system that takes into account the position-dependent inductances unique to the IPM. The proposed simulation model is validated by comparison with experimental and simulation results using an IPM with the TPCM control scheme.
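
    The position dependence that complicates IPM models is commonly captured by a saliency term: each phase self-inductance varies sinusoidally with twice the electrical rotor angle, reducing to the constant SPM case when the saliency component vanishes. A minimal sketch of this standard textbook form (the numeric values are illustrative assumptions, not taken from the paper):

    ```python
    import math

    def phase_inductance(theta_e, L0=1.2e-3, L2=0.3e-3):
        """Self-inductance of one stator phase of a salient (IPM) machine.

        theta_e : electrical rotor angle in radians
        L0      : average inductance component (H)
        L2      : saliency component (H); L2 = 0 recovers the constant
                  inductance of an SPM machine
        """
        return L0 + L2 * math.cos(2.0 * theta_e)
    ```

    A drive-system simulation would evaluate this (for each phase, with the appropriate 120-degree phase shifts) at every time step, which is exactly the coupling between electrical and mechanical states that a constant-inductance SPM model avoids.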

  11. GenSAA: A tool for advancing satellite monitoring with graphical expert systems

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M.; Luczak, Edward C.

    1993-01-01

    During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real time data for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At the NASA Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industries.

  12. Recent advances in developing molecular tools for targeted genome engineering of mammalian cells.

    PubMed

    Lim, Kwang-il

    2015-01-01

    Various biological molecules naturally existing in diverse species, including fungi, bacteria, and bacteriophages, have functionalities for DNA binding and processing. These molecules have recently been actively engineered for use in customized genome editing of mammalian cells, as the DNA sequences encoding them and the underlying mechanisms by which they work are unveiled. Excitingly, multiple novel methods based on the newly constructed artificial molecular tools have enabled modification of specific endogenous genetic elements in the genome context at efficiencies much higher than those of conventional homologous recombination-based methods. This minireview introduces the most recently spotlighted molecular genome engineering tools, with their key features and ongoing modifications for better performance. Such efforts have mainly focused on removing the inherent DNA sequence recognition rigidity of the original molecular platforms, adding newly tailored targeting functions to the engineered molecules, and enhancing their targeting specificity. Effective targeted genome engineering of mammalian cells will enable not only sophisticated genetic studies in the context of the genome, but also widely applicable universal therapeutics based on the pinpointing and correction of disease-causing genetic elements within the genome in the near future. PMID:25104401

  13. Neuron-Miner: An Advanced Tool for Morphological Search and Retrieval in Neuroscientific Image Databases.

    PubMed

    Conjeti, Sailesh; Mesbah, Sepideh; Negahdar, Mohammadreza; Rautenberg, Philipp L; Zhang, Shaoting; Navab, Nassir; Katouzian, Amin

    2016-10-01

    The steadily growing amount of digital neuroscientific data demands a reliable, systematic, and computationally effective retrieval algorithm. In this paper, we present Neuron-Miner, a tool for fast and accurate reference-based retrieval within neuron image databases. The proposed algorithm is built upon a hashing (search and retrieval) technique employing multiple unsupervised random trees, collectively called Hashing Forests (HF). The HF are trained to parse the neuromorphological space hierarchically and preserve the inherent neuron neighborhoods while encoding with compact binary codewords. We further introduce an inverse-coding formulation within HF to effectively mitigate pairwise neuron similarity comparisons, thus allowing scalability to massive databases with little additional time overhead. The proposed hashing tool approximates the true neuromorphological neighborhood more closely, with better retrieval and ranking performance, than existing generalized hashing methods. This is exhaustively validated by quantifying the results over 31,266 neuron reconstructions from the Neuromorpho.org dataset, curated from 147 different archives. We envisage that finding and ranking similar neurons through reference-based querying via Neuron-Miner will assist neuroscientists in objectively understanding the relationship between neuronal structure and function, for applications in comparative anatomy or diagnosis. PMID:27155864
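
    The core idea of tree-based hashing can be sketched with random-hyperplane trees: each tree contributes one bit per level according to which side of a hyperplane a feature vector falls, so similar vectors receive binary codes at small Hamming distance and retrieval reduces to ranking by code distance. This is a generic simplification, not the paper's method (Hashing Forests are trained on neuromorphological data, not drawn at random), and every name below is hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def random_tree(depth, dim):
        # each level holds one random hyperplane (w, b)
        return [(rng.standard_normal(dim), rng.standard_normal())
                for _ in range(depth)]

    def encode(x, forest):
        # one bit per hyperplane: which side of it the vector falls on
        bits = [int(np.dot(x, w) + b > 0)
                for tree in forest for (w, b) in tree]
        return np.array(bits, dtype=np.uint8)

    def hamming(a, b):
        return int(np.count_nonzero(a != b))

    def retrieve(query, database, forest, k=3):
        """Rank database vectors by Hamming distance of their binary codes."""
        q = encode(query, forest)
        ranked = sorted(range(len(database)),
                        key=lambda i: hamming(q, encode(database[i], forest)))
        return ranked[:k]
    ```

    In a real system the database codes would be precomputed and indexed; re-encoding on every query, as above, is only for clarity.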

  15. Emerging tools for continuous nutrient monitoring networks: Sensors advancing science and water resources protection

    USGS Publications Warehouse

    Pellerin, Brian; Stauffer, Beth A; Young, Dwane A; Sullivan, Daniel J.; Bricker, Suzanne B.; Walbridge, Mark R; Clyde, Gerard A; Shaw, Denice M

    2016-01-01

    Sensors and enabling technologies are becoming increasingly important tools for water quality monitoring and associated water resource management decisions. In particular, nutrient sensors are of interest because of the well-known adverse effects of nutrient enrichment on coastal hypoxia, harmful algal blooms, and impacts to human health. Accurate and timely information on nutrient concentrations and loads is integral to strategies designed to minimize risk to humans and manage the underlying drivers of water quality impairment. Using nitrate sensors as an example, we highlight the types of applications in freshwater and coastal environments that are likely to benefit from continuous, real-time nutrient data. The concurrent emergence of new tools to integrate, manage and share large data sets is critical to the successful use of nutrient sensors and has made it possible for the field of continuous nutrient monitoring to rapidly move forward. We highlight several near-term opportunities for Federal agencies, as well as the broader scientific and management community, that will help accelerate sensor development, build and leverage sites within a national network, and develop open data standards and data management protocols that are key to realizing the benefits of a large-scale, integrated monitoring network. Investing in these opportunities will provide new information to guide management and policies designed to protect and restore our nation’s water resources.

  17. Advancing Collaboration through Hydrologic Data and Model Sharing

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Castronova, A. M.; Miles, B.; Li, Z.; Morsy, M. M.

    2015-12-01

    HydroShare is an online, collaborative system for open sharing of hydrologic data, analytical tools, and models. It supports the sharing of and collaboration around "resources" which are defined primarily by standardized metadata, content data models for each resource type, and an overarching resource data model based on the Open Archives Initiative's Object Reuse and Exchange (OAI-ORE) standard and a hierarchical file packaging system called "BagIt". HydroShare expands the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated to include geospatial and multidimensional space-time datasets commonly used in hydrology. HydroShare also includes new capability for sharing models, model components, and analytical tools and will take advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. It also supports web services and server/cloud based computation operating on resources for the execution of hydrologic models and analysis and visualization of hydrologic data. HydroShare uses iRODS as a network file system for underlying storage of datasets and models. Collaboration is enabled by casting datasets and models as "social objects". Social functions include both private and public sharing, formation of collaborative groups of users, and value-added annotation of shared datasets and models. The HydroShare web interface and social media functions were developed using the Django web application framework coupled to iRODS. Data visualization and analysis is supported through the Tethys Platform web GIS software stack. Links to external systems are supported by RESTful web service interfaces to HydroShare's content. This presentation will introduce the HydroShare functionality developed to date and describe ongoing development of functionality to support collaboration and integration of data and models.

  18. Reliability modelling system for analysis of advanced battery technologies

    NASA Astrophysics Data System (ADS)

    Imhoff, C. H.; Hostick, C. J.; Nakaoka, R. K.

    1985-05-01

    Key considerations in evaluating the reliability of advanced battery technologies include the impact of cell failures on battery performance and cost. Pacific Northwest Laboratory developed interactive, microcomputer-based simulation models to help battery developers use cell reliability data to calculate the expected performance of new battery technologies. Key benefits of this model include its capability to estimate the effect of cell failures upon: (1) battery system discharge performance, (2) system cycle life, and (3) system economic performance (tradeoffs between capital investment and lifetime operating costs).
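
    The effect of cell failures on deliverable capacity can be illustrated with a minimal Monte Carlo sketch for a series-parallel battery: an open-circuit cell failure disables its entire series string, while the surviving strings still deliver capacity. The failure model and all parameters here are illustrative assumptions, not the laboratory's actual simulation models.

    ```python
    import random

    def surviving_capacity(n_strings, cells_per_string, p_cell_fail,
                           rng=random.random):
        """Fraction of rated capacity remaining when each cell fails (open)
        independently with probability p_cell_fail; one failed cell
        disables its whole series string."""
        alive = sum(
            all(rng() > p_cell_fail for _ in range(cells_per_string))
            for _ in range(n_strings)
        )
        return alive / n_strings

    def expected_capacity(trials=5000, **battery):
        # Monte Carlo average over many simulated batteries
        return sum(surviving_capacity(**battery) for _ in range(trials)) / trials
    ```

    Sweeping `p_cell_fail` (or cycling it forward in time) is how such a model connects cell-level reliability data to system cycle life and the capital-versus-operating-cost tradeoff.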

  19. Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models

    SciTech Connect

    Diakov, Victor; Cole, Wesley; Sullivan, Patrick; Brinkman, Gregory; Margolis, Robert

    2015-11-01

    Capacity expansion models (CEM) provide a high-level, long-term view of the prospects of the evolving power system. In simulating the possibilities of long-term capacity expansion, it is important to maintain the viability of power system operation on short-term (daily, hourly and sub-hourly) scales. Production cost models (PCM) simulate routine power system operation on these shorter time scales using detailed load, transmission and generation fleet data by minimizing production costs and following reliability requirements. When based on CEM 'predictions' about generating unit retirements and buildup, PCM provide a more detailed simulation of short-term system operation and, consequently, may confirm the validity of capacity expansion predictions. Further, production cost model simulations of a system based on a capacity expansion model solution are evolutionarily sound: the generator mix is the result of a logical sequence of unit retirement and buildup resulting from policy and incentives. The above has motivated us to bridge CEM with PCM by building a capacity expansion - to - production cost model Linking Tool (CEPCoLT). The Linking Tool is built to map capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and the production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally-defined ReEDS scenarios.
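
    The linking step can be pictured as translating a CEM capacity solution (MW of each technology in each region) into the discrete generator records a PCM dispatches. The sketch below is a generic illustration under assumed data shapes; it is not the actual ReEDS-to-PLEXOS interface, and all names are hypothetical.

    ```python
    def cem_to_pcm(cem_buildout, typical_unit_mw):
        """cem_buildout    : {(region, tech): capacity_mw} from the CEM solution
        typical_unit_mw : {tech: nameplate MW of one representative unit}
        Returns a list of generator records sized near each technology's
        typical unit size, preserving total regional capacity."""
        generators = []
        for (region, tech), capacity_mw in sorted(cem_buildout.items()):
            n_units = max(1, round(capacity_mw / typical_unit_mw[tech]))
            for k in range(n_units):
                generators.append({
                    "name": f"{region}_{tech}_{k + 1}",
                    "region": region,
                    "tech": tech,
                    "max_mw": capacity_mw / n_units,  # split evenly so totals match
                })
        return generators
    ```

    Preserving the CEM totals while discretizing into dispatchable units is the essential invariant of any such linking step; the PCM then adds heat rates, ramp limits and outage data on top of these records.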

  20. Advances and Limitations of Disease Biogeography Using Ecological Niche Modeling.

    PubMed

    Escobar, Luis E; Craft, Meggan E

    2016-01-01

    Mapping disease transmission risk is crucial in public and animal health for evidence-based decision-making. Ecology and epidemiology are highly related disciplines that may contribute to improvements in mapping disease, which can be used to answer health-related questions. Ecological niche modeling is increasingly used for understanding the biogeography of diseases in plants, animals, and humans. However, epidemiological applications of niche modeling approaches for disease mapping can fail to generate robust study designs, producing incomplete or incorrect inferences. This manuscript is an overview of the history and conceptual bases behind ecological niche modeling, specifically as applied to epidemiology and public health; it does not pretend to be an exhaustive and detailed description of the ecological niche modeling literature and methods. Instead, this review includes selected state-of-the-science approaches and tools, providing a short guide to designing studies incorporating information on the type and quality of the input data (i.e., occurrences and environmental variables) and the identification and justification of the extent of the study area, and it encourages users to explore and test diverse algorithms for more informed conclusions. We provide a friendly introduction to the field of disease biogeography, presenting an updated guide for researchers looking to use ecological niche modeling for disease mapping. We anticipate that ecological niche modeling will soon be a critical tool for epidemiologists aiming to map disease transmission risk, forecast disease distribution under climate change scenarios, and identify landscape factors triggering outbreaks. PMID:27547199

  3. An advanced constitutive model in the sheet metal forming simulation: the Teodosiu microstructural model and the Cazacu Barlat yield criterion

    NASA Astrophysics Data System (ADS)

    Alves, J. L.; Oliveira, M. C.; Menezes, L. F.

    2004-06-01

    Two constitutive models used to describe the plastic behavior of sheet metals in the numerical simulation of sheet metal forming processes are studied: a recently proposed advanced constitutive model, based on the Teodosiu microstructural model and the Cazacu Barlat yield criterion, is compared with a more classical one, based on the Swift law and the Hill 1948 yield criterion. These constitutive models are implemented in DD3IMP, an in-house finite element code developed specifically to simulate sheet metal forming processes: a 3-D elastoplastic finite element code with an updated Lagrangian formulation and a fully implicit time integration scheme, accounting for large elastoplastic strains and rotations. Solid finite elements and parametric surfaces are used to model the blank sheet and the tool surfaces, respectively. Some details of the numerical implementation of the constitutive models are given. Finally, the theory is illustrated with the numerical simulation of the deep drawing of a cylindrical cup. The results show that the proposed advanced constitutive model predicts the final shape of the formed part (mean cup height and earing profile) more accurately, as can be concluded from the comparison with the experimental results.

  4. Space Weather Models, Tools and Services at the Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Kuznetsova, M. M.; Hesse, M.; Maddox, M.; Rastaetter, L.; Berrios, D.; Pulkkinen, A.; Zheng, Y.; MacNeice, P. J.; Shim, J.; Takakishvili, A.; Chulaki, A.

    2010-01-01

    The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership to support the research and development work necessary to substantially increase space weather modeling capabilities and to facilitate the deployment of advanced models in forecasting operations. The CCMC conducts unbiased model testing and validation and evaluates model readiness for the operational environment. Space weather models and coupled model chains hosted at the CCMC range from the solar corona to the Earth's upper atmosphere. CCMC has developed a number of real-time modeling systems, as well as a large number of modeling and data products tailored to address the space weather needs of NASA's robotic missions. The presentation will demonstrate rapid progress towards a system that allows products derived from space weather models to be used in applications addressing national space weather needs. The adaptable Integrated Space Weather Analysis (ISWA) system developed at CCMC for NASA-relevant space weather information combines forecasts based on advanced space weather models hosted at CCMC with concurrent space environment information. The system also enables post-impact analysis and flexible dissemination of space weather information.

  5. Educational tool for modeling and simulation of a closed regenerative life support system

    NASA Astrophysics Data System (ADS)

    Arai, Tatsuya; Fanchiang, Christine; Aoki, Hirofumi; Newman, Dava J.

    For long-term missions on the Moon and Mars, regenerative life support systems emerge as a promising key technology for sustaining successful explorations with reduced re-supply logistics and cost. The purpose of this study was to create a simple model of a regenerative life support system that allows preliminary investigation of system responses. A simplified regenerative life support system was built with MATLAB Simulink™. Mass flows in the system were simplified to carbon, water, oxygen, and carbon dioxide. The subsystems included crew members, animals, a plant module, and a waste processor, which exchanged mass into and out of mass reservoirs. Preliminary numerical simulations were carried out to observe system responses to perturbations such as an increased or decreased number of crew members. The model is simple and flexible enough to add new components, and can also numerically predict non-linear subsystem functions and responses. Future work includes practical issues such as energy efficiency, air leakage, nutrition, and plant growth modeling. The model functions as an effective teaching tool about how a regenerative advanced life support system works.
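A minimal mass-balance loop illustrates the kind of reservoir exchange such a model simulates: crew consume O2 and produce CO2, while a plant module fixes CO2 and releases O2. All rates and starting reservoir sizes below are assumed for illustration and are not taken from the Simulink model described in the abstract.

```python
# Toy closed-loop life support mass balance (illustrative rates only).
O2_PER_CREW = 0.84   # kg/day O2 consumed per crew member (assumed)
CO2_PER_CREW = 1.0   # kg/day CO2 produced per crew member (assumed)

def simulate(days, crew, plant_co2_uptake, o2=100.0, co2=5.0):
    """Step the O2/CO2 reservoirs one day at a time."""
    for _ in range(days):
        o2 -= O2_PER_CREW * crew
        co2 += CO2_PER_CREW * crew
        fixed = min(co2, plant_co2_uptake)   # plants can't fix more CO2 than exists
        co2 -= fixed
        o2 += fixed * (32.0 / 44.0)          # O2 mass released per kg CO2 fixed
        if o2 <= 0:
            raise RuntimeError("O2 reservoir depleted")
    return o2, co2

o2, co2 = simulate(days=30, crew=4, plant_co2_uptake=4.0)
print(round(o2, 2), round(co2, 2))
# 86.47 5.0
```

Even this toy loop shows the system-response behavior the abstract mentions: increasing `crew` or decreasing `plant_co2_uptake` drives the O2 reservoir down faster, so perturbations can be explored by changing one argument.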

  6. QCanvas: An Advanced Tool for Data Clustering and Visualization of Genomics Data.

    PubMed

    Kim, Nayoung; Park, Herin; He, Ningning; Lee, Hyeon Young; Yoon, Sukjoon

    2012-12-01

    We developed a user-friendly, interactive program to simultaneously cluster and visualize omics data, such as DNA and protein array profiles. This program provides diverse algorithms for the hierarchical clustering of two-dimensional data. The clustering results can be interactively visualized and optimized on a heatmap. The present tool does not require any prior knowledge of scripting languages to carry out the data clustering and visualization. Furthermore, the heatmaps allow the selective display of data points satisfying user-defined criteria. For example, a clustered heatmap of experimental values can be differentially visualized based on statistical values, such as p-values. With its diverse menu-based display options, QCanvas provides a convenient graphical user interface for pattern analysis and visualization with high-quality graphics.
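The row clustering underlying such a heatmap can be sketched in a few lines. The following is a pure-Python single-linkage agglomerative clusterer on a tiny synthetic expression matrix; it illustrates the technique, not QCanvas's own implementation (which offers several clustering algorithms).

```python
# Single-linkage hierarchical clustering of data rows (toy sketch).

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def single_linkage(rows, n_clusters):
    """Repeatedly merge the two closest clusters until n_clusters remain."""
    clusters = [[i] for i in range(len(rows))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: distance between closest members
                d = min(euclidean(rows[a], rows[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters

expr = [[0.1, 0.2], [0.12, 0.19], [5.0, 5.1], [5.2, 4.9]]
print(single_linkage(expr, 2))
# [[0, 1], [2, 3]]
```

In a heatmap tool, the resulting cluster order determines the row permutation displayed to the user; the interactive part is reordering and recoloring this permutation without recomputing the clustering.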

  7. Advances in ion trap mass spectrometry: Photodissociation as a tool for structural elucidation

    SciTech Connect

    Stephenson, J.L. Jr.; Booth, M.M.; Eyler, J.R.; Yost, R.A.

    1995-12-01

    Photo-induced dissociation (PID) is the most frequently used method, after collisional activation, for the activation of polyatomic ions in tandem mass spectrometry. The range of internal energies present after the photon absorption process is much narrower than that obtained with collisional energy transfer; therefore, the usefulness of PID for the study of ion structures is greatly enhanced. The long storage times and instrumental configuration of the ion trap mass spectrometer are ideally suited for photodissociation experiments. This presentation will focus on both the fundamental and analytical applications of CO{sub 2} lasers in conjunction with ion trap mass spectrometry. The first portion of the talk will examine the fundamental issues of wavelength dependence, chemical kinetics, photoabsorption cross section, and collisional effects on photodissociation efficiency. The second half will present novel instrumentation for electrospray/ion trap mass spectrometry, with the concurrent development of photodissociation as a tool for structural elucidation of organic compounds and antibiotics.

  8. Microfluidic chips with multi-junctions: an advanced tool in recovering proteins from inclusion bodies.

    PubMed

    Yamaguchi, Hiroshi; Miyazaki, Masaya

    2015-01-01

    Active recombinant proteins are used for studying the biological functions of genes and for the development of therapeutic drugs. Overexpression of recombinant proteins in bacteria often results in the formation of inclusion bodies, which are protein aggregates with non-native conformations. Protein refolding is an important process for obtaining active recombinant proteins from inclusion bodies. However, the conventional refolding method of dialysis or dilution is time-consuming and recovered active protein yields are often low, and a cumbersome trial-and-error process is required to achieve success. To circumvent these difficulties, we used controllable diffusion through laminar flow in microchannels to regulate the denaturant concentration. This method largely aims at reducing protein aggregation during the refolding procedure. This Commentary introduces the principles of the protein refolding method using microfluidic chips and the advantage of our results as a tool for rapid and efficient recovery of active recombinant proteins from inclusion bodies.

  9. Current themes and recent advances in modelling species occurrences

    PubMed Central

    2009-01-01

    Recent years have seen a huge expansion in the range of methods and approaches that are being used to predict species occurrences. This expansion has been accompanied by many improvements in statistical methods, including more accurate ways of comparing models, better null models, methods to cope with autocorrelation, and greater awareness of the importance of scale and prevalence. However, the field still suffers from problems with incorporating temporal variation, overfitted models and poor out-of-sample prediction, confusion between explanation and prediction, simplistic assumptions, and a focus on pattern over process. The greatest advances in recent years have come from integrative studies that have linked species occurrence models with other themes and topics in ecology, such as island biogeography, climate change, disease geography, and invasive species. PMID:20948597

  10. Elevated Temperature Testing and Modeling of Advanced Toughened Ceramic Materials

    NASA Technical Reports Server (NTRS)

    Keith, Theo G.

    2005-01-01

    The purpose of this report is to provide a final report for the period of 12/1/03 through 11/30/04 for NASA Cooperative Agreement NCC3-776, entitled "Elevated Temperature Testing and Modeling of Advanced Toughened Ceramic Materials." During this final period, major efforts were focused on both the determination of mechanical properties of advanced ceramic materials and the development of mechanical test methodologies under several different programs of NASA Glenn. The important research activities during this period were: (1) mechanical properties evaluation of two gas-turbine-grade silicon nitrides; (2) mechanical testing for fuel-cell seal materials; (3) mechanical properties evaluation of thermal barrier coatings and CFCCs; and (4) foreign object damage (FOD) testing.

  11. ModelMage: a tool for automatic model generation, selection and management.

    PubMed

    Flöttmann, Max; Schaber, Jörg; Hoops, Stephan; Klipp, Edda; Mendes, Pedro

    2008-01-01

    Mathematical modeling of biological systems usually involves implementing, simulating, and discriminating several candidate models that represent alternative hypotheses. Generating and managing these candidate models is a tedious and difficult task and can easily lead to errors. ModelMage is a tool that facilitates the management of candidate models. It is designed for the easy and rapid development, generation, simulation, and discrimination of candidate models. The main idea of the program is to automatically create a defined set of model alternatives from a single master model. The user provides only one SBML model and a set of directives from which the candidate models are created by leaving out species, modifiers, or reactions. After generating the models, the software can automatically fit them to data, when data are available, and provide a ranking for model selection. In contrast to other model generation programs, ModelMage aims at generating only a limited set of models that the user can precisely define. ModelMage uses COPASI as its simulation and optimization engine; thus, all simulation and optimization features of COPASI are readily incorporated. ModelMage can be downloaded from http://sysbio.molgen.mpg.de/modelmage and is distributed as free software. PMID:19425122
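The "defined set of model alternatives from a single master model" idea can be sketched by enumerating submodels with every subset of user-flagged reactions removed. This is an illustrative sketch of the combinatorial step only, with made-up reaction labels; ModelMage itself operates on SBML models and also handles species and modifiers.

```python
# Enumerate candidate models by optionally removing flagged reactions.
from itertools import combinations

def candidate_models(master_reactions, optional):
    """Yield reaction sets with every subset of `optional` removed."""
    optional = [r for r in optional if r in master_reactions]
    for k in range(len(optional) + 1):
        for removed in combinations(optional, k):
            yield sorted(set(master_reactions) - set(removed))

master = ["A->B", "B->C", "C->A", "B->D"]
models = list(candidate_models(master, optional=["C->A", "B->D"]))
print(len(models))
# 4  (remove neither, either one, or both flagged reactions)
```

With n flagged elements this produces 2^n candidates, which is why such tools let the user flag only the hypotheses under question rather than every reaction: the set stays small and each candidate maps to a specific biological hypothesis to be fitted and ranked.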

  12. Thermal Model Predictions of Advanced Stirling Radioisotope Generator Performance

    NASA Technical Reports Server (NTRS)

    Wang, Xiao-Yen J.; Fabanich, William Anthony; Schmitz, Paul C.

    2014-01-01

    This paper presents recent thermal model results of the Advanced Stirling Radioisotope Generator (ASRG). The three-dimensional (3D) ASRG thermal power model was built using the Thermal Desktop(trademark) thermal analyzer. The model was correlated with ASRG engineering unit test data and ASRG flight unit predictions from Lockheed Martin's (LM's) I-deas(trademark) TMG thermal model. The auxiliary cooling system (ACS) of the ASRG is also included in the ASRG thermal model. The ACS is designed to remove waste heat from the ASRG so that it can be used to heat spacecraft components. The performance of the ACS is reported under nominal conditions and during a Venus flyby scenario. The results for the nominal case are validated with data from Lockheed Martin. Transient thermal analysis results of ASRG for a Venus flyby with a representative trajectory are also presented. In addition, model results of an ASRG mounted on a Cassini-like spacecraft with a sunshade are presented to show a way to mitigate the high temperatures of a Venus flyby. It was predicted that the sunshade can lower the temperature of the ASRG alternator by 20 C for the representative Venus flyby trajectory. The 3D model also was modified to predict generator performance after a single Advanced Stirling Convertor failure. The geometry of the Microtherm HT insulation block on the outboard side was modified to match deformation and shrinkage observed during testing of a prototypic ASRG test fixture by LM. Test conditions and test data were used to correlate the model by adjusting the thermal conductivity of the deformed insulation to match the post-heat-dump steady state temperatures. Results for these conditions showed that the performance of the still-functioning inboard ACS was unaffected.

  13. THERMODYNAMIC AND KINETIC MODELING OF ADVANCED NUCLEAR FUELS - FINAL LDRD-ER REPORT

    SciTech Connect

    Turchi, P

    2011-11-28

    This project enhanced our theoretical capabilities geared towards establishing the basic science of a high-throughput protocol for the development of advanced nuclear fuel, coupling modern computational materials modeling and simulation tools, fabrication and characterization capabilities, and targeted high-throughput performance testing experiments. The successful conclusion of this ER project allowed us to upgrade state-of-the-art modeling codes; apply these modeling tools to ab initio energetics and thermodynamic assessments of phase diagrams of various mixtures of actinide alloys; propose a tool for optimizing the composition of complex alloys for specific properties; predict diffusion behavior in diffusion couples made of actinide and transition metals; include one new equation in the LLNL phase-field AMPE code; and predict microstructure evolution during alloy coring. In FY11, despite limited funding, the team also initiated an experimental activity in collaboration with Texas A&M University, preparing samples of nuclear fuels in bulk form, for diffusion couple studies, and in metallic matrices, and performing preliminary characterization.

  14. OCT corneal epithelial topographic asymmetry as a sensitive diagnostic tool for early and advancing keratoconus

    PubMed Central

    Kanellopoulos, Anastasios John; Asimellis, George

    2014-01-01

    Purpose To investigate epithelial thickness-distribution characteristics in a large group of keratoconic patients and their correlation to normal eyes employing anterior-segment optical coherence tomography (AS-OCT). Materials and methods The study group (n=160 eyes) consisted of clinically diagnosed keratoconus eyes; the control group (n=160) consisted of nonkeratoconic eyes. Three separate, three-dimensional epithelial thickness maps were obtained employing AS-OCT, enabling investigation of the pupil center, average, mid-peripheral, superior, inferior, maximum, minimum, and topographic epithelial thickness variability. Intraindividual repeatability of measurements was assessed. We introduced correlation of the epithelial data via newly defined indices. The epithelial thickness indices were then correlated with two Scheimpflug imaging-derived AS-irregularity indices: the index of height decentration, and the index of surface variance highly sensitive to early and advancing keratoconus diagnosis as validation. Results Intraindividual repeatability of epithelial thickness measurement in the keratoconic group was on average 1.67 μm. For the control group, repeatability was on average 1.13 μm. In the keratoconic group, pupil-center epithelial thickness was 51.75±7.02 μm, while maximum and minimum epithelial thickness were 63.54±8.85 μm and 40.73±8.51 μm. In the control group, epithelial thickness at the center was 52.54±3.23 μm, with maximum 55.33±3.27 μm and minimum 48.50±3.98 μm epithelial thickness. Topographic variability was 6.07±3.55 μm in the keratoconic group, while for the control group it was 1.59±0.79 μm. In keratoconus, topographic epithelial thickness change from normal, correlated tightly with the topometric asymmetry indices of IHD and ISV derived from Scheimpflug imaging. Conclusion Simple, OCT-derived epithelial mapping, appears to have critical potential in early and advancing keratoconus diagnosis, confirmed with its correlation
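The summary indices the abstract works with (center, average, maximum, minimum, and topographic thickness variability) reduce to simple statistics over the epithelial thickness map. Below is a toy computation on a synthetic 3x3 map in micrometers; the index names follow the abstract's usage, not any instrument's API.

```python
# Summary indices over a synthetic OCT epithelial thickness map (um).
import statistics

thickness = [52.1, 50.8, 53.0,
             49.5, 51.7, 54.2,
             47.9, 52.5, 50.3]

center = thickness[4]                        # pupil-center sample of the 3x3 map
avg = statistics.mean(thickness)
variability = statistics.stdev(thickness)    # "topographic thickness variability"
print(center, round(avg, 2), round(variability, 2), max(thickness), min(thickness))
```

The diagnostic logic in the abstract rests on the last of these: keratoconic maps showed variability around 6 um versus about 1.6 um in controls, so a single dispersion statistic over the map already separates the groups.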

  15. Advances in coupled safety modeling using systems analysis and high-fidelity methods.

    SciTech Connect

    Fanning, T. H.; Thomas, J. W.; Nuclear Engineering Division

    2010-05-31

    The potential for a sodium-cooled fast reactor to survive severe accident initiators with no damage has been demonstrated through whole-plant testing in EBR-II and FFTF. Analysis of the observed natural protective mechanisms suggests that they would be characteristic of a broad range of sodium-cooled fast reactors utilizing metal fuel. However, in order to demonstrate the degree to which new, advanced sodium-cooled fast reactor designs will possess these desired safety features, accurate, high-fidelity, whole-plant dynamics safety simulations will be required. One of the objectives of the advanced safety-modeling component of the Reactor IPSC is to develop a science-based advanced safety simulation capability by utilizing existing safety simulation tools coupled with emerging high-fidelity modeling capabilities in a multi-resolution approach. As part of this integration, an existing whole-plant systems analysis code has been coupled with a high-fidelity computational fluid dynamics code to assess the impact of high-fidelity simulations on safety-related performance. With the coupled capabilities, it is possible to identify critical safety-related phenomena in advanced reactor designs that cannot be resolved with existing tools. In this report, the impact of coupling is demonstrated by evaluating the conditions of outlet plenum thermal stratification during a protected loss-of-flow transient. Outlet plenum stratification was anticipated to alter core temperatures and flows predicted during natural circulation conditions. This effect was observed during the simulations. What was not anticipated, however, was the far-reaching impact that resolving thermal stratification has on the whole plant. The high temperatures predicted at the IHX inlet due to thermal stratification in the outlet plenum force heat into the intermediate system to the point that it eventually becomes a source of heat for the primary system. The results also suggest that flow stagnation in the

  16. Advances in parallel computer technology for desktop atmospheric dispersion models

    SciTech Connect

    Bian, X.; Ionescu-Niscov, S.; Fast, J.D.; Allwine, K.J.

    1996-12-31

    Desktop models are those models used by analysts with varied backgrounds, for performing, for example, air quality assessment and emergency response activities. These models must be robust, well documented, have minimal and well controlled user inputs, and have clear outputs. Existing coarse-grained parallel computers can provide significant increases in computation speed in desktop atmospheric dispersion modeling without considerable increases in hardware cost. This increased speed will allow for significant improvements to be made in the scientific foundations of these applied models, in the form of more advanced diffusion schemes and better representation of the wind and turbulence fields. This is especially attractive for emergency response applications where speed and accuracy are of utmost importance. This paper describes one particular application of coarse-grained parallel computer technology to a desktop complex terrain atmospheric dispersion modeling system. By comparing performance characteristics of the coarse-grained parallel version of the model with the single-processor version, we will demonstrate that applying coarse-grained parallel computer technology to desktop atmospheric dispersion modeling systems will allow us to address critical issues facing future requirements of this class of dispersion models.
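The coarse-grained parallelism described above amounts to domain decomposition: the receptor grid is split into independent chunks that workers evaluate concurrently. The sketch below farms rows of receptors out to a worker pool and evaluates a textbook ground-level Gaussian plume at each receptor; the dispersion coefficients are assumed power-law placeholders, and this is not the paper's modeling system.

```python
# Row-parallel evaluation of a ground-level Gaussian plume over a grid.
import math
from concurrent.futures import ThreadPoolExecutor

Q, U, H = 10.0, 5.0, 50.0   # emission rate (g/s), wind speed (m/s), stack height (m)

def sigma(x):
    """Crude power-law dispersion coefficients (assumed, not site-specific)."""
    return 0.08 * x ** 0.9, 0.06 * x ** 0.9

def concentration(x, y):
    """Ground-level concentration at downwind distance x, crosswind offset y."""
    sy, sz = sigma(x)
    return (Q / (math.pi * U * sy * sz)
            * math.exp(-y * y / (2 * sy * sy))
            * math.exp(-H * H / (2 * sz * sz)))

def row(args):
    x, ys = args
    return [concentration(x, y) for y in ys]

xs = [100.0 * i for i in range(1, 51)]            # 50 downwind rows
ys = [10.0 * j for j in range(-20, 21)]           # 41 crosswind receptors per row
with ThreadPoolExecutor(max_workers=4) as pool:   # a process pool gives true CPU parallelism
    grid = list(pool.map(row, [(x, ys) for x in xs]))

peak = max(max(r) for r in grid)
print(f"peak ground-level concentration: {peak:.2e} g/m^3")
```

Because each row is independent, the decomposition scales with the number of workers and leaves the physics routine untouched, which is exactly what makes coarse-grained parallelism attractive for desktop codes: the serial and parallel versions share the same `concentration` function.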

  17. Novel Diabetic Mouse Models as Tools for Investigating Diabetic Retinopathy

    PubMed Central

    Kador, Peter F.; Zhang, Peng; Makita, Jun; Zhang, Zifeng; Guo, Changmei; Randazzo, James; Kawada, Hiroyoshi; Haider, Neena; Blessing, Karen

    2012-01-01

    Objective Mouse models possessing green fluorescent protein (GFP) and/or human aldose reductase (hAR) in vascular tissues have been established and crossed with naturally diabetic Akita mice to produce new diabetic mouse models. Research Design and Methods Colonies of transgenic C57BL mice expressing GFP (SMAA-GFP), hAR (SMAA-hAR) or both (SMAA-GFP-hAR) in vascular tissues expressing smooth muscle actin were established and crossbred with C57BL/6-Ins2Akita/J (AK) mice to produce naturally diabetic offspring AK-SMAA-GFP and AK-SMAA-GFP-hAR. Aldose reductase inhibitor AL1576 (ARI) was administered in chow. Retinal and lenticular sorbitol levels were determined by HPLC. Retinal functions were evaluated by electroretinography (ERGs). Growth factor and signaling changes were determined by Western Blots using commercially available antibodies. Retinal vasculatures were isolated from the neural retina by enzymatic digestion. Flat mounts were stained with PAS-hematoxylin and analyzed. Results Akita transgenics developed DM by 8 weeks of age with blood glucose levels higher in males than females. Sorbitol levels were higher in neural retinas of AK-SMAA-GFP-hAR compared to AK-SMAA-GFP mice. AK-SMAA-GFP-hAR mice also had higher VEGF levels and reduced ERG scotopic b-wave function, both of which were normalized by AL1576. AK-SMAA-GFP-hAR mice showed induction of the retinal growth factors bFGF, IGF-1, and TGFβ, as well as signaling changes in P-Akt, P-SAPK/JNK and P-44/42 MAPK that were also reduced by ARI treatment. Quantitative analysis of flat mounts in 18 week AK-SMAA-GFP-hAR mice revealed increased loss of nuclei/capillary length and a significant increase in the percentage of acellular capillaries present which was not seen in AK-SMAA-GFP-hAR treated with ARI. Conclusions/Significance These new mouse models of early onset diabetes may be valuable tools for assessing both the role of hyperglycemia and AR in the development of retinal lesions associated with diabetic

  18. Detecting Surgical Tools by Modelling Local Appearance and Global Shape.

    PubMed

    Bouget, David; Benenson, Rodrigo; Omran, Mohamed; Riffaud, Laurent; Schiele, Bernt; Jannin, Pierre

    2015-12-01

    Detecting tools in surgical videos is an important ingredient for context-aware computer-assisted surgical systems. To this end, we present a new surgical tool detection dataset and a method for joint tool detection and pose estimation in 2D images. Our two-stage pipeline is data-driven and relaxes strong assumptions made by previous works regarding the geometry, number, and position of tools in the image. The first stage classifies each pixel based on local appearance only, while the second stage evaluates a tool-specific shape template to enforce global shape. Both local appearance and global shape are learned from training data. Our method is validated on a new surgical tool dataset of 2,476 images from neurosurgical microscopes, which is made freely available. It improves over existing datasets in size, diversity, and detail of annotation. We show that our method significantly improves over competitive baselines from the computer vision field. We achieve a 15% detection miss-rate at 10^-1 false positives per image (for the suction tube) on our surgical tool dataset. Results indicate that performing semantic labelling as an intermediate task is key for high-quality detection.
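A toy version of the two-stage idea makes the pipeline concrete: stage 1 labels each pixel from local appearance (here a bare intensity threshold stands in for a learned classifier), and stage 2 scores a tool-shape template at every offset to enforce global shape. This is an illustrative reduction, not the paper's learned method.

```python
# Two-stage sketch: per-pixel labelling, then global shape-template scoring.

def stage1_pixel_labels(img, thresh=128):
    """Stand-in for a learned per-pixel appearance classifier."""
    return [[1 if px > thresh else 0 for px in row] for row in img]

def stage2_best_match(labels, template):
    """Slide a binary shape template over the label map; keep the best score."""
    th, tw = len(template), len(template[0])
    best = (-1, None)
    for r in range(len(labels) - th + 1):
        for c in range(len(labels[0]) - tw + 1):
            score = sum(labels[r + i][c + j] * template[i][j]
                        for i in range(th) for j in range(tw))
            if score > best[0]:
                best = (score, (r, c))
    return best

img = [[10, 10, 200, 200],
       [10, 10, 200, 200],
       [10, 10, 200, 200],
       [10, 10,  10,  10]]
template = [[1, 1], [1, 1], [1, 1]]   # a 3x2 blob as a toy "tool" shape
print(stage2_best_match(stage1_pixel_labels(img), template))
# (6, (0, 2))
```

The separation is the point: stage 1 is cheap and purely local, so it can be wrong pixel-by-pixel, while stage 2 rejects appearance false positives that do not assemble into a plausible tool shape.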

  19. Transgenic Mouse Models of Alzheimer Disease: Developing a Better Model as a Tool for Therapeutic Interventions

    PubMed Central

    Kitazawa, Masashi; Medeiros, Rodrigo; LaFerla, Frank M.

    2015-01-01

    Alzheimer disease (AD) is the leading cause of dementia among elderly. Currently, no effective treatment is available for AD. Analysis of transgenic mouse models of AD has facilitated our understanding of disease mechanisms and provided valuable tools for evaluating potential therapeutic strategies. In this review, we will discuss the strengths and weaknesses of current mouse models of AD and the contribution towards understanding the pathological mechanisms and developing effective therapies. PMID:22288400

  20. Surrogate Model Development for Fuels for Advanced Combustion Engines

    SciTech Connect

    Anand, Krishnasamy; Ra, youngchul; Reitz, Rolf; Bunting, Bruce G

    2011-01-01

    The fuels used in internal-combustion engines are complex mixtures of a multitude of different types of hydrocarbon species. Attempting numerical simulations of combustion of real fuels with all of the hydrocarbon species included is highly unrealistic. Thus, a surrogate model approach is generally adopted, which involves choosing a few representative hydrocarbon species whose overall behavior mimics the characteristics of the target fuel. The present study proposes surrogate models for the nine fuels for advanced combustion engines (FACE) that have been developed for studying low-emission, high-efficiency advanced diesel engine concepts. The surrogate compositions for the fuels are arrived at by simulating their distillation profiles to within a maximum absolute error of 4% using a discrete multi-component (DMC) fuel model that has been incorporated in the multi-dimensional computational fluid dynamics (CFD) code, KIVA-ERC-CHEMKIN. The simulated surrogate compositions cover the range and measured concentrations of the various hydrocarbon classes present in the fuels. The fidelity of the surrogate fuel models is judged on the basis of matching their specific gravity, lower heating value, hydrogen/carbon (H/C) ratio, cetane number, and cetane index with the measured data for all nine FACE fuels.
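The acceptance criterion described above, matching the surrogate's simulated distillation profile to the target fuel's within a 4% maximum absolute error, reduces to a simple profile comparison. The profiles below are made-up numbers for illustration, not FACE fuel data.

```python
# Surrogate-acceptance check: maximum absolute error between profiles.

def max_abs_error(target, surrogate):
    return max(abs(t - s) for t, s in zip(target, surrogate))

# recovered volume fraction (%) at fixed temperature points (illustrative)
target    = [10.0, 30.0, 50.0, 70.0, 90.0]
surrogate = [ 8.5, 31.0, 52.0, 68.0, 91.5]

err = max_abs_error(target, surrogate)
print(err, "accepted" if err <= 4.0 else "rejected")
# 2.0 accepted
```

In the study this check is only one of several fidelity criteria; the accepted composition must also reproduce specific gravity, lower heating value, H/C ratio, cetane number, and cetane index.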

  1. Recent advances in transferable coarse-grained modeling of proteins.

    PubMed

    Kar, Parimal; Feig, Michael

    2014-01-01

    Computer simulations are indispensable tools for studying the structure and dynamics of biological macromolecules. Biochemical processes occur on different scales of length and time. Atomistic simulations cannot cover the relevant spatiotemporal scales at which the cellular processes occur. To address this challenge, coarse-grained (CG) modeling of the biological systems is employed. Over the last few years, many CG models for proteins continue to be developed. However, many of them are not transferable with respect to different systems and different environments. In this review, we discuss those CG protein models that are transferable and that retain chemical specificity. We restrict ourselves to CG models of soluble proteins only. We also briefly review recent progress made in the multiscale hybrid all-atom/CG simulations of proteins.

  2. GraphCrunch 2: Software tool for network modeling, alignment and clustering

    PubMed Central

    2011-01-01

    Background Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. Results We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other existing tool. Finally, Graph
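Two of the "easily computable network properties" mentioned above, the degree distribution and the clustering coefficient, can be computed in a few lines. This pure-Python sketch on a four-node graph illustrates the properties themselves; GraphCrunch 2 computes these and many more to compare model networks against data networks.

```python
# Degree distribution and average clustering coefficient of a small graph.
from collections import Counter

def degree_distribution(adj):
    return Counter(len(nbrs) for nbrs in adj.values())

def avg_clustering(adj):
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue                       # clustering undefined below degree 2
        links = sum(1 for a in nbrs for b in nbrs
                    if a < b and b in adj[a])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

# a triangle (1-2-3) plus a pendant node (4)
adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
print(degree_distribution(adj))
print(avg_clustering(adj))
```

Comparing such property vectors between a random-model network and a real PPI network is the heuristic stand-in for exact network comparison that the abstract notes is computationally intractable.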

  3. Myositis registries and biorepositories: powerful tools to advance clinical, epidemiologic and pathogenic research

    PubMed Central

    Rider, Lisa G.; Dankó, Katalin; Miller, Frederick W.

    2016-01-01

    Purpose of review Clinical registries and biorepositories have proven extremely useful in many studies of diseases, especially rare diseases. Given their rarity and diversity, the idiopathic inflammatory myopathies, or myositis syndromes, have benefited from individual researchers’ collections of cohorts of patients. Major efforts are being made to establish large registries and biorepositories that will allow many additional studies to be performed that were not possible before. Here we describe the registries developed by investigators and patient support groups that are currently available for collaborative research purposes. Recent findings We have identified 46 myositis research registries, including many with biorepositories, which have been developed for a wide variety of purposes and have resulted in great advances in understanding the range of phenotypes, clinical presentations, risk factors, pathogenic mechanisms, outcome assessment, therapeutic responses, and prognoses. These are now available for collaborative use to undertake additional studies. Two myositis patient registries have been developed for research, and myositis patient support groups maintain demographic registries with large numbers of patients available to be contacted for potential research participation. Summary Investigator-initiated myositis research registries and biorepositories have proven extremely useful in understanding many aspects of these rare and diverse autoimmune diseases. These registries and biorepositories, in addition to those developed by myositis patient support groups, deserve continued support to maintain the momentum in this field as they offer major opportunities to improve understanding of the pathogenesis and treatment of these diseases in cost-effective ways. PMID:25225838

  4. Modeling and analysis of hydrogen detonation events in the Advanced Neutron Source reactor containment

    SciTech Connect

    Taleyarkhan, R.P.; Georgevich, V.; Kim, S.H.; Valenti, S.N.; Simpson, D.B.; Sawruk, W.

    1994-07-01

    This paper describes salient aspects of the modeling, analyses, and evaluations for hydrogen detonation in selected regions of the Advanced Neutron Source (ANS) containment during hypothetical severe accident conditions. Shock wave generation and transport modeling and analyses were conducted for two stratified configurations in the dome region of the high bay. Principal tools utilized for these purposes were the CTH and CET89 computer codes. Dynamic pressure loading functions were generated for key locations and used for evaluating structural response behavior for which a finite-element model was developed using the ANSYS code. For the range of conditions analyzed in the two critical dome regions, it was revealed that the ANS containment would be able to withstand detonation loads without failure.

  5. Development of a VOR/DME model for an advanced concepts simulator

    NASA Technical Reports Server (NTRS)

    Steinmetz, G. G.; Bowles, R. L.

    1984-01-01

    The report presents the definition of a VOR/DME airborne and ground systems simulation model. This description was drafted in response to a need arising in the creation of an advanced concepts simulation in which flight station designs for the 1980s era can be postulated and examined. The simulation model described herein provides a reasonable representation of VOR/DME stations in the continental United States, including area coverage by type and noise errors. The level of detail in which the model has been cast provides the interested researcher with a moderate-fidelity simulation tool for conducting research and evaluation of navigation algorithms. The assumptions made within the development are listed; they place certain responsibilities (data bases, communication with other simulation modules, a uniform round earth, etc.) upon the researcher.

  6. SERS as an advanced tool for investigating chloroethyl nitrosourea derivatives complexation with DNA.

    PubMed

    Agarwal, Shweta; Ray, Bhumika; Mehrotra, Ranjana

    2015-11-01

    We report surface-enhanced Raman spectroscopic (SERS) studies on free calf thymus DNA and its complexes with the anti-tumor chloroethyl nitrosourea derivatives semustine and nimustine. Since the first report of SERS in 1974, the technique has rapidly developed into an analytical tool that can be used for the trace detection and characterization of analytes. Here we demonstrate yet another application of SERS, in the field of drug-DNA interaction, and thereby its promising role in the rational design of new chemotherapeutic agents. Vibrational spectral analysis has been performed in an attempt to delineate the anti-cancer action mechanism of the above-mentioned nitrosourea derivatives. Strong SERS bands associated with the complexation of DNA with semustine and nimustine have been observed, which reveal binding of the nitrosourea derivatives to the heterocyclic nitrogenous base pairs of the DNA duplex. Formation of dG-dC interstrand cross-links in DNA double helices is also suggested by the SERS spectra of the CENU-DNA adducts. The results demonstrated here reflect recent progress in the newly developing field of drug-DNA interaction analysis via SERS.

  7. New advances and validation of knowledge management tools for critical care using classifier techniques.

    PubMed Central

    Frize, M.; Wang, L.; Ennett, C. M.; Nickerson, B. G.; Solven, F. G.; Stevenson, M.

    1998-01-01

    An earlier version (2.0) of the case-based reasoning (CBR) tool, called IDEAS for ICU's, allowed users to compare the ten closest matching cases to the newest patient admission, using a large database of intensive care patient records and physician-selected matching weights [1,2]. The new version incorporates matching weights that have been determined quantitatively, as well as a faster CBR matching engine. In a second approach, a back-propagation, feed-forward artificial neural network (ANN) estimated two classes of the outcome "duration of artificial ventilation" for a subset of the database used for the CBR work. Weight elimination was successfully applied to reduce the number of input variables and speed up the estimation of outcomes. New experiments examined the impact of using different numbers of input variables on the performance of the ANN, measured by the correct classification rate (CCR) and the average squared error (ASE). PMID:9929280

  8. STED-FLCS: An Advanced Tool to Reveal Spatiotemporal Heterogeneity of Molecular Membrane Dynamics.

    PubMed

    Vicidomini, Giuseppe; Ta, Haisen; Honigmann, Alf; Mueller, Veronika; Clausen, Mathias P; Waithe, Dominic; Galiani, Silvia; Sezgin, Erdinc; Diaspro, Alberto; Hell, Stefan W; Eggeling, Christian

    2015-09-01

    Heterogeneous diffusion dynamics of molecules play an important role in many cellular signaling events, such as those of lipids in plasma membrane bioactivity. However, these dynamics can often only be visualized by single-molecule and super-resolution optical microscopy techniques. Using fluorescence lifetime correlation spectroscopy (FLCS, an extension of fluorescence correlation spectroscopy, FCS) on a super-resolution stimulated emission depletion (STED) microscope, we here extend previous observations of nanoscale lipid dynamics in the plasma membrane of living mammalian cells. STED-FLCS allows an improved determination of spatiotemporal heterogeneity in molecular diffusion and interaction dynamics via a novel gated detection scheme, as demonstrated by a comparison between STED-FLCS and previous conventional STED-FCS recordings on fluorescent phosphoglycerolipid and sphingolipid analogues in the plasma membrane of live mammalian cells. The STED-FLCS data indicate that biophysical and biochemical parameters such as the affinity for molecular complexes strongly change over space and time within a few seconds. Drug treatment for cholesterol depletion or actin cytoskeleton depolymerization not only results in the already previously observed decreased affinity for molecular interactions but also in a slight reduction of the spatiotemporal heterogeneity. STED-FLCS specifically demonstrates a significant improvement over previous gated STED-FCS experiments and with its improved spatial and temporal resolution is a novel tool for investigating how heterogeneities of the cellular plasma membrane may regulate biofunctionality. PMID:26235350
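    At its core, FCS (and its FLCS/STED-FCS variants) estimates the normalized intensity autocorrelation G(tau) = <dI(t) dI(t+tau)> / <I>^2. A hedged sketch on a synthetic intensity trace (this is not the gated STED-FLCS analysis itself; the signal shape and noise level are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic intensity trace: a slow fluctuation plus shot-like noise.
t = np.arange(10_000)
intensity = 100 + 10 * np.sin(2 * np.pi * t / 500) + rng.normal(0, 3, t.size)

def fcs_autocorrelation(trace, max_lag):
    """Normalized fluctuation autocorrelation G(tau) for lags 1..max_lag."""
    mean = trace.mean()
    d = trace - mean                      # intensity fluctuations dI(t)
    return np.array([
        np.mean(d[:-lag] * d[lag:]) / mean**2
        for lag in range(1, max_lag + 1)
    ])

g = fcs_autocorrelation(intensity, 50)
# For a fluctuating signal, G(tau) decays as the lag grows.
print(g[0] > g[-1])
```

    Fitting such decay curves with a diffusion model is what yields transit times and, from them, diffusion coefficients.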

  9. Reactive Transport Modeling: An Essential Tool and a New ResearchApproach for the Earth Sciences

    SciTech Connect

    Steefel, Carl I.; DePaolo, Donald J.; Lichtner, Peter C.

    2005-08-25

    Reactive transport modeling is an essential tool for the analysis of coupled physical, chemical, and biological processes in Earth systems, and has additional potential to better integrate the results from focused fundamental research on Earth materials. Appropriately designed models can describe the interactions of competing processes at a range of spatial and time scales, and hence are critical for connecting the advancing capabilities for materials characterization at the atomic scale with the macroscopic behavior of complex Earth systems. Reactive transport modeling has had a significant impact on the treatment of contaminant retardation in the subsurface, the description of elemental and nutrient fluxes between major Earth reservoirs, and the treatment of deep Earth processes such as metamorphism and magma transport. Active topics of research include the development of pore-scale and hybrid, or multiple-continua, models to capture the scale dependence of coupled reactive transport processes. Frontier research questions that are only now being addressed include the effects of chemical microenvironments, coupled thermal-mechanical-chemical processes, controls on mineral-fluid reaction rates in natural media, and the scaling of reactive transport processes from the microscopic to the pore to the field scale.
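    The coupling of transport and reaction that such models capture can be sketched in a few lines. A minimal illustration (not from the article; the grid, velocity, and rate constant are invented) of 1-D advection of a dissolved species with a first-order reactive sink, using an explicit upwind finite-difference scheme:

```python
import numpy as np

ncell, dx, dt = 100, 1.0, 0.4   # cells, cell size (m), time step (s)
v, k = 1.0, 0.05                # advection velocity (m/s), reaction rate (1/s)
assert v * dt / dx <= 1.0       # CFL stability condition for the upwind scheme

c = np.zeros(ncell)
c[0] = 1.0                      # fixed-concentration inflow boundary

for _ in range(200):
    # Upwind advection plus first-order reactive sink in each cell.
    c[1:] = c[1:] - v * dt / dx * (c[1:] - c[:-1]) - k * dt * c[1:]
    c[0] = 1.0

# Concentration decays downstream because reaction consumes the solute.
print(c[10] > c[50] > c[90] >= 0.0)
```

    Real reactive transport codes couple many such equations, with nonlinear rate laws and feedback between chemistry, flow, and (sometimes) mechanics, which is precisely why the scale-dependence questions above are hard.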

  10. The Use Of Computational Human Performance Modeling As Task Analysis Tool

    SciTech Connect

    Jacques Hugo; David Gertman

    2012-07-01

    During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.
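    The task-network approach described above can be sketched as a Monte Carlo simulation. This is an illustrative toy, not the study's actual discrete-event model: the task names, duration distributions, and per-step error probabilities are all invented:

```python
import random

random.seed(42)

# (task name, mean duration s, std dev s, per-step human error probability)
TASKS = [
    ("walk to canal",     30,  5, 0.001),
    ("position tool",     45, 10, 0.010),
    ("lift fuel element", 60, 15, 0.020),
    ("visual inspection", 90, 20, 0.005),
    ("return element",    60, 15, 0.020),
]

def simulate_once():
    """One pass through the task network: total time and any-error flag."""
    total, errored = 0.0, False
    for _name, mean, sd, p_err in TASKS:
        total += max(0.0, random.gauss(mean, sd))
        if random.random() < p_err:
            errored = True
    return total, errored

runs = [simulate_once() for _ in range(10_000)]
mean_time = sum(t for t, _ in runs) / len(runs)
p_any_error = sum(e for _, e in runs) / len(runs)
print(f"mean completion {mean_time:.0f} s, P(any error) {p_any_error:.3f}")
```

    Tools used in studies like this layer workload, posture, and cognitive-demand modifiers on top of this basic sampling loop.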

  11. Investigation of Alien Wavelength Quality in Live Multi-Domain, Multi-Vendor Link Using Advanced Simulation Tool

    NASA Astrophysics Data System (ADS)

    Nordal Petersen, Martin; Nuijts, Roeland; Lange Bjørn, Lars

    2014-05-01

    This article presents an advanced optical model for the simulation of alien wavelengths in multi-domain, multi-vendor dense wavelength-division multiplexing (DWDM) networks. The model aids optical network planners with a better understanding of the non-linear effects present in DWDM systems and better utilization of alien wavelengths in future applications. The limiting physical effects for alien wavelengths are investigated in relation to power levels, channel spacing, and other factors. The simulation results are verified through an experimental setup on a live multi-domain DWDM system between two national research networks: SURFnet in the Netherlands and NORDUnet in Denmark.

  12. Business Model Evaluation for an Advanced Multimedia Service Portfolio

    NASA Astrophysics Data System (ADS)

    Pisciella, Paolo; Zoric, Josip; Gaivoronski, Alexei A.

    In this paper we quantitatively analyze a business model for the collaborative provision of an advanced mobile data service portfolio composed of three multimedia services: Video on Demand, Internet Protocol Television, and User Generated Content. We describe the provision system, considering the relations between technical and business aspects for each agent providing a basic multimedia service. This techno-business analysis is then projected into a mathematical model addressing the definition of incentives among the different agents involved in collaborative service provision. Through the implementation of this model we aim at shaping the behaviour of each contributing agent by modifying the level of profitability that the service portfolio yields to each of them.

  13. Advanced optical position sensors for magnetically suspended wind tunnel models

    NASA Technical Reports Server (NTRS)

    Lafleur, S.

    1985-01-01

    A major concern to aerodynamicists has been the corruption of wind tunnel test data by model support structures, such as stings or struts. A technique for magnetically suspending wind tunnel models was considered by Tournier and Laurenceau (1957) in order to overcome this problem. This technique is now implemented with the aid of a Large Magnetic Suspension and Balance System (LMSBS) and advanced position sensors for measuring model attitude and position within the test section. Two different optical position sensors are discussed, taking into account a device based on the use of linear CCD arrays, and a device utilizing area CID cameras. Current techniques in image processing have been employed to develop target tracking algorithms capable of subpixel resolution for the sensors. The algorithms are discussed in detail, and some preliminary test results are reported.

  14. Development of a system model for advanced small modular reactors.

    SciTech Connect

    Lewis, Tom Goslee,; Holschuh, Thomas Vernon,

    2014-01-01

    This report describes a system model that can be used to analyze three advanced small modular reactor (SMR) designs through their lifetime. The neutronics of these reactor designs were evaluated using Monte Carlo N-Particle eXtended (MCNPX/6). The system models were developed in Matlab and Simulink. A major thrust of this research was the initial scoping analysis of Sandia's concept of a long-life fast reactor (LLFR). The inherent characteristic of this conceptual design is to minimize the change in reactivity over the lifetime of the reactor. This allows the reactor to operate substantially longer at full power than traditional light water reactors (LWRs) or other SMR designs (e.g., the high-temperature gas reactor (HTGR)). The system model has subroutines for lifetime reactor feedback and operation calculations, thermal-hydraulic effects, load demand changes, and a simplified supercritical CO2 (sCO2) Brayton cycle for power conversion.
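    The kind of reactor feedback and operation calculation such a system model performs can be illustrated with one-delayed-group point kinetics. This is a generic sketch, not the Sandia model; all parameter values are illustrative, not LLFR design values:

```python
BETA = 0.0065      # delayed neutron fraction (generic)
GEN_TIME = 1e-5    # neutron generation time, s (fast-spectrum-ish)
DECAY = 0.08       # one-group delayed precursor decay constant, 1/s

def point_kinetics(rho, t_end, dt=1e-5):
    """Relative power after t_end seconds at constant reactivity rho
    (one delayed group, forward-Euler integration)."""
    n = 1.0                              # relative power
    c = BETA * n / (GEN_TIME * DECAY)    # equilibrium precursor level
    for _ in range(int(t_end / dt)):
        dn = ((rho - BETA) / GEN_TIME * n + DECAY * c) * dt
        dc = (BETA / GEN_TIME * n - DECAY * c) * dt
        n += dn
        c += dc
    return n

# Zero reactivity holds power steady; a +10-cent step raises it promptly.
print(point_kinetics(0.1 * BETA, 0.5) > point_kinetics(0.0, 0.5))
```

    A full system model wraps equations like these with thermal-hydraulic feedback on rho, load-demand changes, and the power-conversion cycle.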

  15. Acoustic test and analyses of three advanced turboprop models

    NASA Technical Reports Server (NTRS)

    Brooks, B. M.; Metzger, F. B.

    1980-01-01

    Results of acoustic tests of three 62.2 cm (24.5 inch) diameter models of the prop-fan (a small-diameter, highly loaded, multi-bladed, variable-pitch advanced turboprop) are presented. Results show that there is little difference in the noise produced by the unswept and slightly swept designs. However, the model designed for noise reduction produces substantially less noise at test conditions simulating 0.8 Mach number cruise speed or at conditions simulating takeoff and landing. In the near field at cruise conditions and in the far field at takeoff and landing conditions, the acoustically designed model is 5 dB quieter than the unswept or slightly swept designs. Correlations between noise measurements and theoretical predictions, as well as comparisons between measured and predicted acoustic pressure pulses generated by the prop-fan blades, are discussed. The general characteristics of the pulses are predicted. Shadowgraph measurements were obtained which showed the locations of the bow and trailing waves.

  16. Advanced optical position sensors for magnetically suspended wind tunnel models

    NASA Astrophysics Data System (ADS)

    Lafleur, S.

    A major concern to aerodynamicists has been the corruption of wind tunnel test data by model support structures, such as stings or struts. A technique for magnetically suspending wind tunnel models was considered by Tournier and Laurenceau (1957) in order to overcome this problem. This technique is now implemented with the aid of a Large Magnetic Suspension and Balance System (LMSBS) and advanced position sensors for measuring model attitude and position within the test section. Two different optical position sensors are discussed, taking into account a device based on the use of linear CCD arrays, and a device utilizing area CID cameras. Current techniques in image processing have been employed to develop target tracking algorithms capable of subpixel resolution for the sensors. The algorithms are discussed in detail, and some preliminary test results are reported.

  17. Model-Based Reasoning: Using Visual Tools to Reveal Student Learning

    ERIC Educational Resources Information Center

    Luckie, Douglas; Harrison, Scott H.; Ebert-May, Diane

    2011-01-01

    Using visual models is common in science and should become more common in classrooms. Our research group has developed and completed studies on the use of a visual modeling tool, the Concept Connector. This modeling tool consists of an online concept mapping Java applet that has automatic scoring functions we refer to as Robograder. The Concept…

  18. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and... 33 Navigation and Navigable Waters 3 2011-07-01 2011-07-01 false Revisions to models and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In...

  19. An Overview of the Object Protocol Model (OPM) and the OPM Data Management Tools.

    ERIC Educational Resources Information Center

    Chen, I-Min A.; Markowitz, Victor M.

    1995-01-01

    Discussion of database management tools for scientific information focuses on the Object Protocol Model (OPM) and data management tools based on OPM. Topics include the need for new constructs for modeling scientific experiments, modeling object structures and experiments in OPM, queries and updates, and developing scientific database applications…

  20. ADVANCED PROTEOMICS AND BIOINFORMATICS TOOLS IN TOXICOLOGY RESEARCH: OVERCOMING CHALLENGES TO PROVIDE SIGNIFICANT RESULTS

    EPA Science Inventory

    This presentation specifically addresses the advantages and limitations of state-of-the-art gel, protein array, and peptide-based labeling proteomic approaches to assess the effects of a suite of model T4 inhibitors on the thyroid axis of Xenopus laevis.