Capability Maturity Model (CMM) for Software Process Improvements
NASA Technical Reports Server (NTRS)
Ling, Robert Y.
2000-01-01
This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process, covering the steps involved in implementing the model and the benefits of using CMM to improve software development.
NASA Astrophysics Data System (ADS)
Shin, Yung C.; Bailey, Neil; Katinas, Christopher; Tan, Wenda
2018-05-01
This paper presents an overview of vertically integrated, comprehensive predictive modeling capabilities for directed energy deposition processes, which have been developed at Purdue University. The overall predictive capability consists of several vertically integrated modules, including a powder flow model, a molten pool model, a microstructure prediction model, and a residual stress model, which can be used to predict the mechanical properties of parts additively manufactured by blown-powder directed energy deposition as well as by other additive manufacturing processes. The critical governing equations of each model and the connections between the various modules are illustrated. Illustrative results, along with corresponding experimental validation, are presented to demonstrate the capabilities and fidelity of the models. The good correlation with experimental results shows that the integrated models can be used to design metal additive manufacturing processes and to predict the resultant microstructure and mechanical properties.
Risk Quantification of Systems Engineering Documents Improves Probability of DOD Project Success
2009-09-01
Presents a comprehensive risk model for DoD milestone review documentation, as well as recommended changes to the Capability Maturity Model Integration (CMMI) Project Planning and Risk Management process areas. Keywords: Milestone Documentation, Project Planning, Rational Frame, Political Frame, CMMI Project Planning Process Area, CMMI Risk Management Process Area.
2004-06-01
Keywords: Situation Understanding, Common Operational Pictures, Planning & Decision Support Capabilities, Message & Order Processing, Common Languages & Data Models, Modeling & Simulation Domain.
Updraft Fixed Bed Gasification Aspen Plus Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
2007-09-27
The updraft fixed bed gasification model provides predictive modeling capabilities for updraft fixed bed gasifiers when devolatilization data are available. The fixed bed model is constructed using Aspen Plus process modeling software coupled with a FORTRAN user kinetic subroutine. Current updraft gasification models created in Aspen Plus have limited predictive capabilities and must be "tuned" to reflect a generalized gas composition as specified in the literature or by the gasifier manufacturer, which limits the applicability of the process model.
Leveraging People-Related Maturity Issues for Achieving Higher Maturity and Capability Levels
NASA Astrophysics Data System (ADS)
Buglione, Luigi
During the past 20 years, Maturity Models (MM) have become a buzzword in the ICT world. Since Crosby's initial idea in 1979, plenty of models have been created in the Software & Systems Engineering domains, addressing various perspectives. By analyzing the content of the Process Reference Models (PRM) in many of them, it can be noticed that people-related issues carry little weight in appraisals of the capabilities of organizations, while in practice they are considered significant contributors in traditional process and organizational performance appraisals, as stressed in well-known Performance Management models such as MBQA, EFQM and BSC. This paper proposes ways to leverage people-related maturity issues by merging HR practices from several types of maturity models into the organizational Business Process Model (BPM) in order to achieve higher organizational maturity and capability levels.
Microgrid Design Toolkit (MDT) Technical Documentation and Component Summaries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arguello, Bryan; Gearhart, Jared Lee; Jones, Katherine A.
2015-09-01
The Microgrid Design Toolkit (MDT) is a decision support software tool for microgrid designers to use during the microgrid design process. The models that support the two main capabilities in MDT are described. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new microgrid in the early stages of the design process. MSC is a mixed-integer linear program that is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on refining a microgrid design for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and the Performance Reliability Model (PRM). TMO uses a genetic algorithm to create and refine a collection of candidate microgrid designs. It uses PRM, a simulation-based reliability model, to assess the performance of these designs. TMO produces a collection of microgrid designs that perform well with respect to one or more performance metrics.
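To give a feel for what a sizing formulation of this kind can look like, the following is a minimal, hypothetical sketch of an economic microgrid-sizing MILP written with the open-source PuLP library; the technologies, costs, and constraints are illustrative placeholders and do not reproduce MSC's actual formulation.

```python
import pulp

# Hypothetical candidate technologies: annualized capital cost ($/kW-yr),
# operating cost ($/kWh), and maximum installable size (kW).
techs = {"pv":      {"cap_cost": 90.0,  "op_cost": 0.00, "max_kw": 500},
         "diesel":  {"cap_cost": 60.0,  "op_cost": 0.25, "max_kw": 800},
         "battery": {"cap_cost": 120.0, "op_cost": 0.02, "max_kw": 300}}
peak_load_kw = 600
annual_kwh = 2_500_000

prob = pulp.LpProblem("microgrid_sizing", pulp.LpMinimize)
build = {t: pulp.LpVariable(f"build_{t}", cat="Binary") for t in techs}
size = {t: pulp.LpVariable(f"size_{t}", lowBound=0) for t in techs}
energy = {t: pulp.LpVariable(f"energy_{t}", lowBound=0) for t in techs}

# Objective: annualized capital cost plus operating cost of delivered energy.
prob += pulp.lpSum(techs[t]["cap_cost"] * size[t] +
                   techs[t]["op_cost"] * energy[t] for t in techs)

for t in techs:
    prob += size[t] <= techs[t]["max_kw"] * build[t]   # capacity only if built
    prob += energy[t] <= 8760 * size[t]                # energy limited by capacity
prob += pulp.lpSum(size[t] for t in techs) >= peak_load_kw   # cover peak demand
prob += pulp.lpSum(energy[t] for t in techs) >= annual_kwh   # serve annual energy

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({t: (build[t].value(), round(size[t].value(), 1)) for t in techs})
```

A real sizing tool would add time-resolved dispatch, grid purchases and sales, and reliability constraints, but the build/size/dispatch structure shown here is the typical skeleton of such models.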
10 Steps to Building an Architecture for Space Surveillance Projects
NASA Astrophysics Data System (ADS)
Gyorko, E.; Barnhart, E.; Gans, H.
Space surveillance is an increasingly complex task, requiring the coordination of a multitude of organizations and systems, while dealing with competing capabilities, proprietary processes, differing standards, and compliance issues. In order to fully understand space surveillance operations, analysts and engineers need to analyze and break down their operations and systems using what are essentially enterprise architecture processes and techniques. These techniques can be daunting to the first-time architect. This paper provides a summary of simplified steps to analyze a space surveillance system at the enterprise level in order to determine capabilities, services, and systems. These steps form the core of an initial Model-Based Architecting process. For new systems, a well-defined, or well-architected, space surveillance enterprise leads to an easier transition from model-based architecture to model-based design and provides a greater likelihood that requirements are fulfilled the first time. Both new and existing systems benefit from being easier to manage, and can be sustained more easily using portfolio management techniques based around capabilities documented in the model repository. The resulting enterprise model helps an architect avoid 1) costly, faulty portfolio decisions; 2) wasteful technology refresh efforts; 3) upgrade and transition nightmares; and 4) non-compliance with DoDAF directives. The Model-Based Architecting steps are based on a process that Harris Corporation has developed from practical experience architecting space surveillance systems and ground systems. Examples are drawn from current work on documenting space situational awareness enterprises. The process is centered on DoDAF 2 and its corresponding meta-model so that terminology is standardized and communicable across any disciplines that know DoDAF architecting, including acquisition, engineering and sustainment disciplines. Each step provides a guideline for the type of data to collect, and also the appropriate views to generate. The steps include 1) determining the context of the enterprise, including active elements and high-level capabilities or goals; 2) determining the desired effects of the capabilities and mapping capabilities against the project plan; 3) determining operational performers and their inter-relationships; 4) building information and data dictionaries; 5) defining resources associated with capabilities; 6) determining the operational behavior necessary to achieve each capability; 7) analyzing existing or planned implementations to determine systems, services and software; 8) cross-referencing system behavior to operational behavior; 9) documenting system threads and functional implementations; and 10) creating any required textual documentation from the model.
Neural network for processing both spatial and temporal data with time based back-propagation
NASA Technical Reports Server (NTRS)
Villarreal, James A. (Inventor); Shelton, Robert O. (Inventor)
1993-01-01
Neural networks are computing systems modeled after the paradigm of the biological brain. For years, researchers using various forms of neural networks have attempted to model the brain's information processing and decision-making capabilities. Neural network algorithms have impressively demonstrated the capability of modeling spatial information. On the other hand, the application of parallel distributed models to the processing of temporal data has been severely restricted. The invention introduces a novel technique which adds the dimension of time to the well-known back-propagation neural network algorithm. In the space-time neural network disclosed herein, the synaptic weights between two artificial neurons (processing elements) are replaced with an adaptable, adjustable filter. Instead of a single synaptic weight, the invention provides a plurality of weights representing not only association, but also temporal dependencies. In this case, the synaptic weights are the coefficients of the adaptable digital filters. Novelty is believed to lie in the disclosure of a processing element, and a network of such processing elements, capable of processing temporal as well as spatial data.
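As a rough illustration of this idea (an assumption-laden sketch, not the patented implementation), each synapse can be modeled as a small FIR filter whose tap coefficients take the place of the usual single scalar weight:

```python
import numpy as np

def fir_synapse(inputs, taps):
    """Pass a time series of pre-synaptic activations through an adaptable
    FIR filter; the tap coefficients replace the single synaptic weight and
    encode temporal dependencies as well as association strength."""
    # Convolve along time and truncate to the input length so the synapse
    # only uses current and past samples (causal filtering).
    return np.convolve(inputs, taps, mode="full")[: len(inputs)]

# Example: a 3-tap synapse mixing the current and two previous inputs.
signal = np.array([0.0, 1.0, 0.5, -0.2, 0.0])
taps = np.array([0.6, 0.3, 0.1])   # hypothetical learned coefficients
print(fir_synapse(signal, taps))
```

Training such a network then amounts to back-propagating error gradients to every tap coefficient rather than to one weight per connection.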
Information Processing Capabilities in Performers Differing in Levels of Motor Skill
1979-01-01
Cites Craik, F. I. M., & Lockhart, R. S., "Levels of processing: A framework for memory research," Journal of Verbal Learning and Verbal Behavior, 1972, 11, 671-684. ARI Technical Report: Information Processing Capabilities in Performers Differing in Levels of Motor Skill, by Robert N. Singer. The introduction draws on the human behaving systems model developed by Singer, Gerson, and colleagues.
2009-09-01
ASD(NII)/CIO: Assistant Secretary of Defense for Networks and Information Integration / Chief Information Officer; CMMI: Capability Maturity Model Integration. Describes a Web-based portal to share knowledge about software process-related methodologies, such as the SEI's Capability Maturity Model Integration (CMMI), the SEI's IDEAL model, and Lean Six Sigma. For example, the portal features content areas such as software acquisition management and the SEI CMMI.
Mitropoulos, Panagiotis Takis; Cupido, Gerardo
2009-01-01
In construction, the challenge for researchers and practitioners is to develop work systems (production processes and teams) that can achieve high productivity and high safety at the same time. However, construction accident causation models ignore the role of work practices and teamwork. This study investigates the mechanisms by which production and teamwork practices affect the likelihood of accidents. The paper synthesizes a new model for construction safety based on the cognitive perspective (Fuller's Task-Demand-Capability Interface model, 2005) and then presents an exploratory case study. The case study investigates and compares the work practices of two residential framing crews: a 'High Reliability Crew' (HRC), that is, a crew with exceptional productivity and safety over several years, and an average-performing crew from the same company. The model explains how the production and teamwork practices generate the work situations that workers face (the task demands) and affect the workers' ability to cope (capabilities). The case study indicates that the work practices of the HRC directly influence the task demands and match them with the applied capabilities. These practices were guided by the 'principle' of avoiding errors and rework and included work planning and preparation, work distribution, managing the production pressures, and quality and behavior monitoring. The Task Demand-Capability model links construction research to a cognitive model of accident causation and provides a new way to conceptualize safety as an emergent property of the production practices and teamwork processes. The empirical evidence indicates that the crews' work practices and team processes strongly affect the task demands, the applied capabilities, and the match between demands and capabilities. The proposed model and the exploratory case study will guide further discovery of work practices and teamwork processes that can increase both productivity and safety in construction operations. Such understanding will enable training of construction foremen and crews in these practices to systematically develop high reliability crews.
Performance analysis of different tuning rules for an isothermal CSTR using integrated EPC and SPC
NASA Astrophysics Data System (ADS)
Roslan, A. H.; Karim, S. F. Abd; Hamzah, N.
2018-03-01
This paper demonstrates the integration of Engineering Process Control (EPC) and Statistical Process Control (SPC) for the control of product concentration in an isothermal CSTR. The objectives of this study are to evaluate the performance of the Ziegler-Nichols (Z-N), Direct Synthesis (DS) and Internal Model Control (IMC) tuning methods and to determine the most effective method for this process. The simulation model was obtained from past literature and re-constructed in SIMULINK MATLAB to evaluate the process response. Additionally, the process stability, capability and normality were analyzed using Process Capability Sixpack reports in Minitab. Based on the results, DS displays the best response, having the smallest rise time, settling time, overshoot, undershoot, Integral Time Absolute Error (ITAE) and Integral Square Error (ISE). Also, based on the statistical analysis, DS emerges as the best tuning method, as it exhibits the highest process stability and capability.
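For reference, the two error integrals cited above can be approximated directly from sampled closed-loop response data; the following is a minimal sketch (the signal and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def itae_ise(t, setpoint, response):
    """Approximate ITAE and ISE from a sampled closed-loop response."""
    error = setpoint - response
    itae = np.trapz(t * np.abs(error), t)   # integral of time-weighted |error|
    ise = np.trapz(error ** 2, t)           # integral of squared error
    return itae, ise

# Hypothetical first-order response to a unit step change in setpoint.
t = np.linspace(0.0, 10.0, 501)
y = 1.0 - np.exp(-t / 1.5)
print(itae_ise(t, setpoint=1.0, response=y))
```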
A Maturity Model for Assessing the Use of ICT in School Education
ERIC Educational Resources Information Center
Solar, Mauricio; Sabattin, Jorge; Parada, Victor
2013-01-01
This article describes an ICT-based, capability-driven model for assessing the ICT-in-education capabilities and maturity of schools. The proposed model, called ICTE-MM (ICT in School Education Maturity Model), has three elements supporting educational processes: information criteria, ICT resources, and leverage domains. Changing the traditional…
Process Improvement Should Link to Security: SEPG 2007 Security Track Recap
2007-09-01
Covers the Systems Security Engineering Capability Maturity Model (SSE-CMM / ISO 21827) and its use in system and software development; panel questions such as "In what ways should process improvement support security in the software development life cycle (SDLC)?"; and project management and support practices through the use of capability maturity models, including the CMMI and the Systems Security Engineering CMM.
NASA Technical Reports Server (NTRS)
1981-01-01
The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the Advanced System for Process Engineering (ASPEN) computer program were selected from available steady-state and dynamic models. The MPPM was selected to serve as the basis for the system-level design model structure because it provided the capability for process block material and energy balances and for high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While the ASPEN physical property calculation routines are capable of generating the physical properties required for process simulation, not all required physical property data are available; missing data must be user-entered.
USDA-ARS?s Scientific Manuscript database
Increasing urbanization changes runoff patterns to be flashy and instantaneous with decreased base flow. A model with the ability to simulate sub-daily rainfall–runoff processes and continuous simulation capability is required to realistically capture the long-term flow and water quality trends in w...
Climbing the ladder: capability maturity model integration level 3
NASA Astrophysics Data System (ADS)
Day, Bryce; Lutteroth, Christof
2011-02-01
This article details the attempt to form a complete workflow model for an information and communication technologies (ICT) company in order to achieve a capability maturity model integration (CMMI) maturity rating of 3. During this project, business processes across the company's core and auxiliary sectors were documented and extended using modern enterprise modelling tools and The Open Group Architecture Framework (TOGAF) methodology. Different challenges were encountered with regard to process customisation and tool support for enterprise modelling. In particular, there were problems with the reuse of process models, the integration of different project management methodologies and the integration of the Rational Unified Process development process framework that had to be solved. We report on these challenges and the perceived effects of the project on the company. Finally, we point out research directions that could help to improve the situation in the future.
NASA Astrophysics Data System (ADS)
McAllister, M.; Gochis, D.; Dugger, A. L.; Karsten, L. R.; McCreight, J. L.; Pan, L.; Rafieeinasab, A.; Read, L. K.; Sampson, K. M.; Yu, W.
2017-12-01
The community WRF-Hydro modeling system is publicly available and provides researchers and operational forecasters a flexible and extensible capability for multi-scale, multi-physics hydrologic modeling that can be run independently of, or fully interactively with, the WRF atmospheric model. The core WRF-Hydro physics model contains very high-resolution descriptions of terrestrial hydrologic process representations such as land-atmosphere exchanges of energy and moisture, snowpack evolution, infiltration, terrain routing, channel routing, basic reservoir representation and hydrologic data assimilation. Complementing the core physics components of WRF-Hydro is an ecosystem of pre- and post-processing tools that facilitate the preparation of terrain and meteorological input data, an open-source hydrologic model evaluation toolset (Rwrfhydro), hydrologic data assimilation capabilities with DART and advanced model visualization capabilities. The National Center for Atmospheric Research (NCAR), through collaborative support from the National Science Foundation and other funding partners, provides community support for the entire WRF-Hydro system through a variety of mechanisms. This presentation summarizes the enhanced user support capabilities that are being developed for the community WRF-Hydro modeling system. These products and services include a new website, open-source code repositories, documentation and user guides, test cases, online training materials, live hands-on training sessions, an email listserv, and individual user support via email through a new help desk ticketing system. The WRF-Hydro modeling system and supporting tools, which now include re-gridding scripts and model calibration, have recently been updated to Version 4 and are converging toward the capabilities of the National Water Model.
NASA Astrophysics Data System (ADS)
Rimland, Jeffrey; McNeese, Michael; Hall, David
2013-05-01
Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state of the art still falls short of creating computer systems capable of autonomously making complex decisions and judgments in many domains where data are nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention on relevant information elements, based on both a priori knowledge of the analyst's goals and the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's Recognition-Primed Decision (RPD) model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high-performance, data-driven nature of CEP techniques provides a natural complement to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability as well as enabling collaborative context-aware reasoning in both human teams and hybrid human/software agent teams.
VS2DRTI: Simulating Heat and Reactive Solute Transport in Variably Saturated Porous Media.
Healy, Richard W; Haile, Sosina S; Parkhurst, David L; Charlton, Scott R
2018-01-29
Variably saturated groundwater flow, heat transport, and solute transport are important processes in environmental phenomena, such as the natural evolution of water chemistry of aquifers and streams, the storage of radioactive waste in a geologic repository, the contamination of water resources from acid-rock drainage, and the geologic sequestration of carbon dioxide. Up to now, our ability to simulate these processes simultaneously with fully coupled reactive transport models has been limited to complex and often difficult-to-use models. To address the need for a simple and easy-to-use model, the VS2DRTI software package has been developed for simulating water flow, heat transport, and reactive solute transport through variably saturated porous media. The underlying numerical model, VS2DRT, was created by coupling the flow and transport capabilities of the VS2DT and VS2DH models with the equilibrium and kinetic reaction capabilities of PhreeqcRM. Flow capabilities include two-dimensional, constant-density, variably saturated flow; transport capabilities include both heat and multicomponent solute transport; and the reaction capabilities are a complete implementation of geochemical reactions of PHREEQC. The graphical user interface includes a preprocessor for building simulations and a postprocessor for visual display of simulation results. To demonstrate the simulation of multiple processes, the model is applied to a hypothetical example of injection of heated waste water to an aquifer with temperature-dependent cation exchange. VS2DRTI is freely available public domain software. © 2018, National Ground Water Association.
NASA Technical Reports Server (NTRS)
Peabody, Hume L.
2017-01-01
This presentation is meant to be an overview of the model building process. It is based on typical techniques (Monte Carlo ray tracing for radiation exchange, lumped-parameter finite difference for the thermal solution) used by the aerospace industry. This is not intended to be a "How to Use ThermalDesktop" course; it is intended to be a "How to Build Thermal Models" course, and the techniques will be demonstrated using the capabilities of ThermalDesktop (TD). Other codes may or may not have similar capabilities. The general model building process can be broken into four top-level steps: 1. Build Model; 2. Check Model; 3. Execute Model; 4. Verify Results.
Supporting the Use of CERT (registered trademark) Secure Coding Standards in DoD Acquisitions
2012-07-01
Capability Maturity Model Integration (CMMI) [Davis 2009]. Team Software Process (TSP) and Capability Maturity Model Integration are service marks of Carnegie Mellon University. Acronyms: STP, Software Test Plan; TEP, Test and Evaluation Plan; TSP, Team Software Process; V&V, verification and validation. CMU/SEI-2012-TN-016. Supporting the Use of CERT Secure Coding Standards in DoD Acquisitions, by Tim Morrow (Software Engineering Institute) and Robert Seacord (Software Engineering Institute).
JIMM: the next step for mission-level models
NASA Astrophysics Data System (ADS)
Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.
2001-09-01
The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for doing SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic approach to representing simulation entities, its data analysis capability, and its robust configuration management process, JIMM can be used to support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). An MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.
Data Visualization and Animation Lab (DVAL) overview
NASA Technical Reports Server (NTRS)
Stacy, Kathy; Vonofenheim, Bill
1994-01-01
The general capabilities of the Langley Research Center Data Visualization and Animation Laboratory are described. These capabilities include digital image processing, 3-D interactive computer graphics, data visualization and analysis, video-rate acquisition and processing of video images, photo-realistic modeling and animation, video report generation, and color hardcopies. A specialized video image processing system is also discussed.
NASA Astrophysics Data System (ADS)
Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.
2017-12-01
Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological and remote sensing datasets be assimilated, processed and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), visualization (VisIt, Paraview, D3, QGIS), as well as numerous tools written in Python and R for petrophysical mapping, stochastic modeling, data analysis and so on. These capabilities use data collected with sensors and analytical tools developed by multiple manufacturers which produce many different measurements. While scientists obviously leverage tools, capabilities and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands of) sites. This requires that monitoring can be done in a near-real-time, affordable, auditable and essentially autonomous manner. For this we have developed a modular, vertically integrated, cloud-based software framework designed from the ground up for effective site and process monitoring. This software framework (PAF - Predictive Assimilation Framework) is multitenant software and provides automation of data ingestion, processing and visualization of hydrological, geochemical and geophysical (ERT/DTS) data. The core organizational element of PAF is a project/user one, in which the capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components. PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows for automation of electrical geophysical ingestion and processing and the ability for co-analysis and visualization of the raw and processed data with other data of interest (e.g. soil temperature, soil moisture, precipitation). We will demonstrate current PAF capabilities and discuss future efforts.
2009-10-01
Current M&S covering support to operations, human behavior representation, asymmetric warfare, and defense against terrorism… methods, tools, data, intellectual capital, and processes to address these capability requirements. Fourth, there is a need to compare capability requirements to current capabilities to identify gaps that may be addressed with DoD HSCB methods, tools, data, intellectual capital, and processes.
Microgrid Design Toolkit (MDT) User Guide Software v1.2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eddy, John P.
2017-08-01
The Microgrid Design Toolkit (MDT) supports decision analysis for new ("greenfield") microgrid designs as well as microgrids with existing infrastructure. The current version of MDT includes two main capabilities. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new, grid-connected microgrid in the early stages of the design process. MSC is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on designing a microgrid for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and the Performance Reliability Model (PRM).
Multidimensional Data Modeling for Business Process Analysis
NASA Astrophysics Data System (ADS)
Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.
The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution for converging these models.
NASA Astrophysics Data System (ADS)
Gunawan, D.; Amalia, A.; Rahmat, R. F.; Muchtar, M. A.; Siregar, I.
2018-02-01
Identification of the software maturity level is a technique to determine the quality of software. By identifying the software maturity level, the weaknesses of the software can be observed, and the resulting recommendations can serve as a reference for future software maintenance and development. This paper discusses software Capability Level (CL) with a case study on the Quality Management Unit (Unit Manajemen Mutu) of the University of Sumatera Utara (UMM-USU). This research utilized the Standard CMMI Appraisal Method for Process Improvement class C (SCAMPI C) model with continuous representation. This model focuses on activities for developing quality products and services. The observation is done in three process areas: Project Planning (PP), Project Monitoring and Control (PMC), and Requirements Management (REQM). According to the measurement of the software capability level for the UMM-USU software, it turns out that the capability level for the observed process areas is in the range of CL1 and CL2. Project Planning (PP) is the only process area which reaches capability level 2; meanwhile, PMC and REQM are still at CL1, the performed level. This research reveals several weaknesses of the existing UMM-USU software. Therefore, this study proposes several recommendations for UMM-USU to improve the capability level of the observed process areas.
ERIC Educational Resources Information Center
Serido, Joyce; Shim, Soyeon; Tang, Chuanyi
2013-01-01
This study proposes a developmental model of financial capability to understand the process by which young adults acquire the financial knowledge and behaviors needed to manage full-time adult social roles and responsibilities. The model integrates financial knowledge, financial self-beliefs, financial behavior, and well-being into a single…
Differential Equation Models for Sharp Threshold Dynamics
2012-08-01
Considers an S-I-R epidemic of infection, where a detection event drastically changes dynamics, and the Lanchester model of armed conflict, where the loss of a key capability drastically changes dynamics. We derive and demonstrate a step approach to modeling such sharp-threshold dynamics using differential equations. Subject terms: Differential Equations, Markov Population Process, S-I-R Epidemic, Lanchester Model.
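For context, the classical aimed-fire Lanchester equations referenced above take the textbook form (this is the standard formulation, not necessarily the exact variant derived in the report):

$$\frac{dx}{dt} = -\beta\, y, \qquad \frac{dy}{dt} = -\alpha\, x$$

Here x(t) and y(t) are the opposing force strengths and alpha, beta > 0 are attrition coefficients; a sharp threshold arises when, for example, one side's coefficient drops discontinuously at the moment a key capability is lost.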
Simulator for concurrent processing data flow architectures
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.; Stoughton, John W.; Mielke, Roland R.
1992-01-01
A software simulator capable of simulating the execution of an algorithm graph on a given system under the Algorithm to Architecture Mapping Model (ATAMM) rules is presented. ATAMM is capable of modeling the execution of large-grained algorithms on distributed data flow architectures. Investigating the behavior and determining the performance of an ATAMM-based system requires the aid of software tools. The ATAMM Simulator presented is capable of determining the performance of a system without having to build a hardware prototype. Case studies are performed on four algorithms to demonstrate the capabilities of the ATAMM Simulator. Simulated results are shown to be comparable to the experimental results of the Advanced Development Model System.
Del Rio-Chanona, Ehecatl A; Liu, Jiao; Wagner, Jonathan L; Zhang, Dongda; Meng, Yingying; Xue, Song; Shah, Nilay
2018-02-01
Biodiesel produced from microalgae has been extensively studied due to its potentially outstanding advantages over traditional transportation fuels. In order to facilitate its industrialization and improve the process profitability, it is vital to construct highly accurate models capable of predicting the complex behavior of the investigated biosystem for process optimization and control, which forms the current research goal. Three original contributions are described in this paper. Firstly, a dynamic model is constructed to simulate the complicated effects of light intensity, nutrient supply and light attenuation on both biomass growth and biolipid production. Secondly, chlorophyll fluorescence, an instantly measurable variable and indicator of photosynthetic activity, is embedded into the model to monitor and update model accuracy, especially for the purpose of future process optimal control, and its correlation with intracellular nitrogen content is quantified, which to the best of our knowledge has never been addressed so far. Thirdly, a thorough experimental verification is conducted under different scenarios, including both continuous illumination and light/dark cycle conditions, to test the model's predictive capability, particularly for long-term operation; it is concluded that the current model is characterized by a high level of predictive capability. Based on the model, the optimal light intensity for algal biomass growth and lipid synthesis is estimated. This work therefore paves the way for future process design and real-time optimization. © 2017 Wiley Periodicals, Inc.
Integration of a three-dimensional process-based hydrological model into the Object Modeling System
USDA-ARS?s Scientific Manuscript database
The integration of a spatial process model into an environmental modelling framework can enhance the model’s capabilities. We present the integration of the GEOtop model into the Object Modeling System (OMS) version 3.0 and illustrate its application in a small watershed. GEOtop is a physically base...
2008-03-01
Enables it to strike targets with minimal collateral damage from a range of 15 kilometers; this stand-off type of attack is made possible by the ATL. As excited atoms drop to lower energy levels they release a photon, or quantum, of light; this process continues until the light wave's strength builds and passes through the medium. … mission-level model. Lastly, these models are classified by durability as standing models or legacy models.
Metrics for Business Process Models
NASA Astrophysics Data System (ADS)
Mendling, Jan
Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument implies that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores to what extent certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.
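As an informal illustration of what such metrics can look like (the measures below are generic structural indicators, not the specific metrics evaluated in [275]), simple complexity figures of a process graph can be computed directly from its node and edge sets:

```python
import networkx as nx

def simple_process_metrics(g: nx.DiGraph) -> dict:
    """Compute a few structural complexity indicators for a process graph."""
    n, m = g.number_of_nodes(), g.number_of_edges()
    return {
        "size": n,                                   # number of nodes
        "density": m / (n * (n - 1)) if n > 1 else 0.0,
        "avg_degree": (2 * m / n) if n else 0.0,     # mean connections per node
    }

# Toy model: start -> task A -> XOR split -> (task B | task C) -> end
g = nx.DiGraph()
g.add_edges_from([("start", "A"), ("A", "xor"), ("xor", "B"),
                  ("xor", "C"), ("B", "end"), ("C", "end")])
print(simple_process_metrics(g))
```

Metrics of this kind can then be correlated with observed error frequencies across a model collection to test whether complexity drives error probability.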
NASA Astrophysics Data System (ADS)
Talamonti, James J.; Kay, Richard B.; Krebs, Danny J.
1996-05-01
A numerical model was developed to emulate the capabilities of systems performing noncontact absolute distance measurements. The model incorporates known methods to minimize signal processing and digital sampling errors and evaluates the accuracy limitations imposed by spectral peak isolation using Hanning, Blackman, and Gaussian windows in the fast Fourier transform technique. We applied this model to the specific case of measuring the relative lengths of a compound Michelson interferometer. By processing computer-simulated data through our model, we project the ultimate precision for ideal data and for data containing AM-FM noise. The precision is shown to be limited by nonlinearities in the laser scan. Keywords: absolute distance, interferometer.
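To illustrate the windowing step mentioned above (a generic sketch, not the authors' code; the signal parameters are hypothetical), applying a Hanning or Blackman window before the FFT reduces spectral leakage and sharpens peak isolation:

```python
import numpy as np

def windowed_spectrum(signal, fs, window="hann"):
    """Return frequency bins and magnitude spectrum of a windowed signal."""
    n = len(signal)
    win = {"hann": np.hanning(n),          # Hanning (Hann) window
           "blackman": np.blackman(n)}[window]
    spectrum = np.abs(np.fft.rfft(signal * win))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, spectrum

# Hypothetical beat signal at 1.23 kHz sampled at 100 kHz.
fs, n = 100_000.0, 4096
t = np.arange(n) / fs
sig = np.sin(2 * np.pi * 1230.0 * t)
freqs, spec = windowed_spectrum(sig, fs)
print(freqs[np.argmax(spec)])   # frequency of the isolated spectral peak
```

The choice of window trades main-lobe width against side-lobe suppression, which is exactly the accuracy limitation the paper quantifies.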
New reflective symmetry design capability in the JPL-IDEAS Structure Optimization Program
NASA Technical Reports Server (NTRS)
Strain, D.; Levy, R.
1986-01-01
The JPL-IDEAS antenna structure analysis and design optimization computer program was modified to process half-structure models of symmetric structures subjected to arbitrary external static loads, synthesize the performance, and optimize the design of the full structure. Significant savings in computation time and cost (more than 50%) were achieved compared to the cost of full-model computer runs. The addition of the new reflective symmetry analysis and design capabilities to the IDEAS program allows processing of structure models whose size would otherwise prevent automated design optimization. The new program produced synthesized full-model iterative design results identical to those of actual full-model program executions at substantially reduced cost, time, and computer storage.
NASA Astrophysics Data System (ADS)
Wrożyna, Andrzej; Pernach, Monika; Kuziak, Roman; Pietrzyk, Maciej
2016-04-01
Due to their exceptional strength combined with good workability, Advanced High-Strength Steels (AHSS) are commonly used in the automotive industry. Manufacturing of these steels is a complex process which requires precise control of technological parameters during thermo-mechanical treatment. Design of these processes can be significantly improved by numerical models of phase transformations. Evaluation of the predictive capabilities of such models, as far as their applicability to the simulation of thermal cycles for AHSS is concerned, was the objective of the paper. Two models were considered. The former was an upgrade of the JMAK equation, while the latter was an upgrade of the Leblond model. The models can be applied to any AHSS, though the examples quoted in the paper refer to a Dual Phase (DP) steel. Three series of experimental simulations were performed. The first included various thermal cycles going beyond the limitations of continuous annealing lines; the objective was to validate the models' behavior in more complex cooling conditions. The second set of tests included experimental simulations of the thermal cycle characteristic of continuous annealing lines, and the capability of the models to properly describe phase transformations in this process was evaluated. The third set included data from an industrial continuous annealing line. Validation and verification of the models confirmed their good predictive capabilities. Since it does not require application of the additivity rule, the upgrade of the Leblond model was selected as the better one for simulation of industrial processes in AHSS production.
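For reference, the classical JMAK (Johnson-Mehl-Avrami-Kolmogorov) relation on which the first model builds gives the isothermally transformed phase fraction as

$$X(t) = 1 - \exp\left(-k\,t^{n}\right)$$

where X is the transformed fraction, k a temperature-dependent rate coefficient and n the Avrami exponent. Applying this relation to non-isothermal cycles normally requires the additivity rule, which is precisely the limitation that the Leblond-type model avoids.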
Functional Fault Model Development Process to Support Design Analysis and Operational Assessment
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.
2016-01-01
A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.
NASA Astrophysics Data System (ADS)
Ozkat, Erkan Caner; Franciosa, Pasquale; Ceglarek, Dariusz
2017-08-01
Remote laser welding technology offers opportunities for high production throughput at a competitive cost. However, the remote laser welding process of zinc-coated sheet metal parts in lap joint configuration poses a challenge due to the difference between the melting temperature of the steel (∼1500 °C) and the vapourizing temperature of the zinc (∼907 °C). In fact, the zinc layer at the faying surface is vapourized and the vapour might be trapped within the melting pool, leading to weld defects. Various solutions have been proposed to overcome this problem over the years. Among them, laser dimpling has been adopted by manufacturers because of its flexibility and effectiveness, along with its cost advantages. In essence, the dimple works as a spacer between the two sheets in the lap joint and allows the zinc vapour to escape during the welding process, thereby preventing weld defects. However, there is a lack of comprehensive characterization of the dimpling process for effective implementation in a real manufacturing system, taking into consideration inherent changes in the variability of process parameters. This paper introduces a methodology to develop (i) a surrogate model for dimpling process characterization considering a multiple-input (i.e. key control characteristics) and multiple-output (i.e. key performance indicators) system by conducting physical experimentation and using multivariate adaptive regression splines; (ii) a process capability space (Cp-Space) based on the developed surrogate model that allows the estimation of a desired process fallout rate in the case of violation of process requirements in the presence of stochastic variation; and (iii) selection and optimization of the process parameters based on the process capability space. The proposed methodology provides a unique capability to: (i) simulate the effect of process variation as generated by the manufacturing process; (ii) model multiple and coupled quality requirements; and (iii) optimize process parameters under competing quality requirements, such as maximizing the dimple height while minimizing the dimple lower surface area.
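As a simple companion to the process-capability idea above (these are the standard textbook indices, not the paper's Cp-Space construction; spec limits and data are hypothetical), the fallout risk of a dimple-height requirement can be gauged with Cp/Cpk estimated from measured samples:

```python
import numpy as np

def capability_indices(samples, lsl, usl):
    """Estimate Cp and Cpk for a characteristic with lower/upper spec limits."""
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    cp = (usl - lsl) / (6.0 * sigma)               # potential capability
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)  # capability including centering
    return cp, cpk

# Hypothetical dimple-height measurements (mm) against 0.20-0.40 mm spec limits.
heights = np.random.default_rng(1).normal(0.31, 0.025, size=200)
print(capability_indices(heights, lsl=0.20, usl=0.40))
```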
An overview of three main types of simulation approach (explanatory, abstraction, and estimation) is presented, along with a discussion of their capabilities, limitations, and the steps required for their validation. A process model being developed through the Forest Response Prog...
NASA Technical Reports Server (NTRS)
Chien, Steve; Kandt, R. Kirk; Roden, Joseph; Burleigh, Scott; King, Todd; Joy, Steve
1992-01-01
Scientific data preparation is the process of extracting usable scientific data from raw instrument data. This task involves noise detection (and subsequent noise classification and flagging or removal), extracting data from compressed forms, and construction of derivative or aggregate data (e.g. spectral densities or running averages). A software system called PIPE provides intelligent assistance to users developing scientific data preparation plans using a programming language called Master Plumber. PIPE provides this assistance capability by using a process description to create a dependency model of the scientific data preparation plan. This dependency model can then be used to verify syntactic and semantic constraints on processing steps to perform limited plan validation. PIPE also provides capabilities for using this model to assist in debugging faulty data preparation plans. In this case, the process model is used to focus the developer's attention upon those processing steps and data elements that were used in computing the faulty output values. Finally, the dependency model of a plan can be used to perform plan optimization and runtime estimation. These capabilities allow scientists to spend less time developing data preparation procedures and more time on scientific analysis tasks. Because the scientific data processing modules (called fittings) evolve to match scientists' needs, issues regarding maintainability are of prime importance in PIPE. This paper describes the PIPE system and describes how issues in maintainability affected the knowledge representation used in PIPE to capture knowledge about the behavior of fittings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long-term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation. The main computational objectives were: 1. To develop computationally efficient, but physically based, parameterizations of estuary and continental shelf mixing processes for use in an Earth System Model (CESM). 2. To develop a two-way nested regional modeling framework in order to dynamically downscale the climate response of particular coastal ocean regions and to upscale the impact of the regional coastal processes to the global climate in an Earth System Model (CESM). 3. To develop computational infrastructure to enhance the efficiency of data transfer between specific sources and destinations, i.e., a point-to-point communication capability (used in objective 1), within POP, the ocean component of CESM.
Real-time face and gesture analysis for human-robot interaction
NASA Astrophysics Data System (ADS)
Wallhoff, Frank; Rehrl, Tobias; Mayer, Christoph; Radig, Bernd
2010-05-01
Human communication relies on a large number of different communication mechanisms like spoken language, facial expressions, or gestures. Facial expressions and gestures are among the main nonverbal communication mechanisms and pass large amounts of information between human dialog partners. Therefore, to allow for intuitive human-machine interaction, real-time capable processing and recognition of facial expressions and of hand and head gestures is of great importance. We present a system that tackles these challenges. The input features for the dynamic head gestures and facial expressions are obtained from a sophisticated three-dimensional model, which is fitted to the user in a real-time capable manner. Applying this model, different kinds of information are extracted from the image data and afterwards handed over to a real-time capable data-transferring framework, the so-called Real-Time DataBase (RTDB). In addition to the head- and facial-related features, low-level image features regarding the human hand (optical flow, Hu moments) are also stored in the RTDB for the evaluation of hand gestures. In general, the input of a single camera is sufficient for the parallel evaluation of the different gestures and facial expressions. The real-time capable recognition of the dynamic hand and head gestures is performed via different Hidden Markov Models, which have proven to be a quick and real-time capable classification method. For the facial expressions, on the other hand, classical decision trees or more sophisticated support vector machines are used for the classification process. The results of the classification processes are again handed over to the RTDB, where other processes (like a Dialog Management Unit) can easily access them without any blocking effects. In addition, an adjustable amount of history can be stored by the RTDB buffer unit.
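To sketch the HMM-based gesture classification described above (a generic illustration using the hmmlearn library; the feature dimensions, state counts, and gesture names are hypothetical), one HMM is trained per gesture and an observed sequence is assigned to the model with the highest log-likelihood:

```python
import numpy as np
from hmmlearn import hmm

def train_gesture_model(sequences, n_states=4):
    """Fit one Gaussian HMM per gesture class from per-frame feature vectors
    (e.g. optical-flow or Hu-moment features)."""
    X = np.vstack(sequences)
    lengths = [len(s) for s in sequences]
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    model.fit(X, lengths)
    return model

def classify(models, sequence):
    """Pick the gesture whose HMM gives the highest log-likelihood."""
    return max(models, key=lambda name: models[name].score(sequence))

# Hypothetical 2-D feature streams for two gestures ("wave", "nod").
rng = np.random.default_rng(0)
models = {
    "wave": train_gesture_model([rng.normal(0.0, 1.0, (30, 2)) for _ in range(5)]),
    "nod":  train_gesture_model([rng.normal(3.0, 1.0, (30, 2)) for _ in range(5)]),
}
print(classify(models, rng.normal(3.0, 1.0, (30, 2))))   # expected: "nod"
```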
2015-10-01
…capability to meet the task, to the standard, under the condition, nothing more or less, else the funding is wasted. Also, that funding for the… bins to segregate gaps qualitatively before the gap value model determined preference among gaps within the bins. Computation of a gap's… for communication, interpretation, or processing by humans or by automatic means (as it pertains to modeling and simulation). Delphi Method: a…
A conceptual framework and classification of capability areas for business process maturity
NASA Astrophysics Data System (ADS)
Van Looy, Amy; De Backer, Manu; Poels, Geert
2014-03-01
The article elaborates on business process maturity, which indicates how well an organisation can perform based on its business processes, i.e. on its way of working. This topic is of paramount importance for managers who try to excel in today's competitive world. Hence, business process maturity is an emerging research field. However, no consensus exists on the capability areas (or skills) needed to excel. Moreover, their theoretical foundation and synergies with other fields are frequently neglected. To overcome this gap, our study presents a conceptual framework with six main capability areas and 17 sub-areas. It draws on theories regarding the traditional business process lifecycle, which are supplemented by recognised organisation management theories. The comprehensiveness of this framework is validated by mapping 69 business process maturity models (BPMMs) to the identified capability areas, based on content analysis. Nonetheless, as no consensus exists among the collected BPMMs either, a classification of different maturity types is proposed, based on cluster analysis and discriminant analysis. Consequently, the findings contribute to the grounding of the business process literature. Possible future avenues are evaluating existing BPMMs, directing new BPMMs or investigating which combinations of capability areas (i.e. maturity types) contribute more to performance than others.
Point cloud modeling using the homogeneous transformation for non-cooperative pose estimation
NASA Astrophysics Data System (ADS)
Lim, Tae W.
2015-06-01
A modeling process to simulate the point cloud range data that a lidar (light detection and ranging) sensor produces is presented in this paper in order to support the development of non-cooperative pose (relative attitude and position) estimation approaches which will help improve proximity operation capabilities between two adjacent vehicles. The algorithms in the modeling process were based on the homogeneous transformation, which has been employed extensively in robotics and computer graphics, as well as in recently developed pose estimation algorithms. Using a flash lidar in a laboratory testing environment, point cloud data of a test article was simulated and compared against the measured point cloud data. The simulated and measured data sets match closely, validating the modeling process. The modeling capability enables close examination of the characteristics of point cloud images of an object as it undergoes various translational and rotational motions. Relevant characteristics that will be crucial in non-cooperative pose estimation were identified, such as shift, shadowing, perspective projection, jagged edges, and differential point cloud density. These characteristics will have to be considered in developing effective non-cooperative pose estimation algorithms. The modeling capability will allow extensive non-cooperative pose estimation performance simulations prior to field testing, saving development cost and providing performance metrics of the pose estimation concepts and algorithms under evaluation. The modeling process also provides the "truth" pose of the test objects with respect to the sensor frame so that the pose estimation error can be quantified.
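To make the homogeneous-transformation step concrete (a generic sketch; the paper's simulator additionally models sensor effects such as shadowing, perspective projection, and jagged edges), points are expressed in homogeneous coordinates and mapped into the sensor frame with a single 4x4 matrix:

```python
import numpy as np

def transform_points(points, rotation, translation):
    """Map Nx3 points into another frame using a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = rotation          # 3x3 rotation (relative attitude)
    T[:3, 3] = translation        # 3x1 relative position
    homo = np.hstack([points, np.ones((len(points), 1))])   # Nx4 homogeneous coords
    return (homo @ T.T)[:, :3]

# Hypothetical example: rotate a small cloud 90 degrees about z and offset it.
cloud = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
print(transform_points(cloud, Rz, translation=[0.5, 0.0, 2.0]))
```

Because the same matrix defines the "truth" pose, applying its inverse to estimated poses gives the estimation error directly.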
A computer program for uncertainty analysis integrating regression and Bayesian methods
Lu, Dan; Ye, Ming; Hill, Mary C.; Poeter, Eileen P.; Curtis, Gary
2014-01-01
This work develops a new functionality in UCODE_2014 to evaluate Bayesian credible intervals using the Markov Chain Monte Carlo (MCMC) method. The MCMC capability in UCODE_2014 is based on the FORTRAN version of the differential evolution adaptive Metropolis (DREAM) algorithm of Vrugt et al. (2009), which estimates the posterior probability density function of model parameters in high-dimensional and multimodal sampling problems. The UCODE MCMC capability provides eleven prior probability distributions and three ways to initialize the sampling process. It evaluates parametric and predictive uncertainties and it has parallel computing capability based on multiple chains to accelerate the sampling process. This paper tests and demonstrates the MCMC capability using a 10-dimensional multimodal mathematical function, a 100-dimensional Gaussian function, and a groundwater reactive transport model. The use of the MCMC capability is made straightforward and flexible by adopting the JUPITER API protocol. With the new MCMC capability, UCODE_2014 can be used to calculate three types of uncertainty intervals, which all can account for prior information: (1) linear confidence intervals which require linearity and Gaussian error assumptions and typically 10s–100s of highly parallelizable model runs after optimization, (2) nonlinear confidence intervals which require a smooth objective function surface and Gaussian observation error assumptions and typically 100s–1,000s of partially parallelizable model runs after optimization, and (3) MCMC Bayesian credible intervals which require few assumptions and commonly 10,000s–100,000s or more partially parallelizable model runs. Ready access allows users to select methods best suited to their work, and to compare methods in many circumstances.
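The flavor of the MCMC capability can be illustrated with a minimal random-walk Metropolis sketch (not the DREAM algorithm used by UCODE_2014); the synthetic observations, likelihood, and proposal width are assumptions made for illustration.

```python
import numpy as np

# Minimal random-walk Metropolis sketch: MCMC samples of a model parameter
# yield a Bayesian credible interval once burn-in samples are discarded.
rng = np.random.default_rng(0)
obs = rng.normal(2.5, 0.3, size=20)            # synthetic observations (assumption)

def log_posterior(theta):
    # Gaussian likelihood with known error sigma and a flat prior on theta
    return -0.5 * np.sum((obs - theta) ** 2 / 0.3 ** 2)

chain = [1.0]
for _ in range(20000):
    proposal = chain[-1] + rng.normal(0, 0.1)   # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(chain[-1]):
        chain.append(proposal)                  # accept
    else:
        chain.append(chain[-1])                 # reject, keep current state

samples = np.array(chain[5000:])                # discard burn-in
lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"95% credible interval: [{lo:.3f}, {hi:.3f}]")
```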
Information processing of earth resources data
NASA Technical Reports Server (NTRS)
Zobrist, A. L.; Bryant, N. A.
1982-01-01
Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.
On-line identification of fermentation processes for ethanol production.
Câmara, M M; Soares, R M; Feital, T; Naomi, P; Oki, S; Thevelein, J M; Amaral, M; Pinto, J C
2017-07-01
A strategy for monitoring fermentation processes, specifically, simultaneous saccharification and fermentation (SSF) of corn mash, was developed. The strategy covered the development and use of a first-principles, semimechanistic and unstructured process model based on major kinetic phenomena, along with mass and energy balances. The model was then used as a reference model within an identification procedure capable of running on-line. The on-line identification procedure consists of updating the reference model through the estimation of corrective parameters for certain reaction rates using the most recent process measurements. The strategy makes use of standard laboratory measurements for sugars quantification and in situ temperature and liquid level data. The model, along with the on-line identification procedure, has been tested against real industrial data and has been able to accurately predict the main variables of operational interest, i.e., state variables and their dynamics, and key process indicators. The results demonstrate that the strategy is capable of monitoring, in real time, this complex industrial biomass fermentation. This new tool provides great support for decision-making and opens a new range of opportunities for industrial optimization.
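A hedged sketch of the general idea, not the published SSF model, is shown below: a reference kinetic model is corrected on-line by estimating a multiplicative rate factor from recent measurements; the kinetics, data, and bounds are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def simulate(alpha, x0, times, mu_ref=0.2):
    """Integrate dX/dt = alpha * mu_ref * X with a simple Euler scheme;
    alpha is the corrective factor applied to the reference rate."""
    x, out = x0, []
    for i, t in enumerate(times):
        if i > 0:
            x += alpha * mu_ref * x * (times[i] - times[i - 1])
        out.append(x)
    return np.array(out)

times = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
measured = np.array([1.0, 1.16, 1.35, 1.57, 1.82])   # recent measurements (synthetic)

# Estimate the corrective factor that best matches the latest measurements.
res = minimize_scalar(lambda a: np.sum((simulate(a, measured[0], times) - measured) ** 2),
                      bounds=(0.1, 3.0), method="bounded")
print("estimated corrective factor:", round(res.x, 3))
```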
People Capability Maturity Model. SM.
1995-09-01
People Capability Maturity Model (P-CMM). Bill Curtis, William E. Hefley, Sally Miller. ... The P-CMM adapts the architecture and the maturity framework underlying the CMM for use with people-related improvement issues. The CMM focuses on helping organizations improve their software development processes. By adapting the maturity framework and the CMM architecture
Development of a Rolling Process Design Tool for Use in Improving Hot Roll Slab Recovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Couch, R; Becker, R; Rhee, M
2004-09-24
Lawrence Livermore National Laboratory participated in a U.S. Department of Energy/Office of Industrial Technology sponsored research project, 'Development of a Rolling Process Design Tool for Use in Improving Hot Roll Slab Recovery', as a Cooperative Agreement TC-02028 with the Alcoa Technical Center (ATC). The objective of the joint project with Alcoa is to develop a numerical modeling capability to optimize the hot rolling process used to produce aluminum plate. Product lost in the rolling process and subsequent recycling wastes resources consumed in the energy-intensive steps of remelting and reprocessing the ingot. The modeling capability developed by the project partners will be used to produce plate more efficiently and with reduced product loss.
INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorensek, M.; Hamm, L.; Garcia, H.
2011-07-18
Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.
Corral framework: Trustworthy and fully functional data intensive parallel astronomical pipelines
NASA Astrophysics Data System (ADS)
Cabral, J. B.; Sánchez, B.; Beroiz, M.; Domínguez, M.; Lares, M.; Gurovich, S.; Granitto, P.
2017-07-01
Data processing pipelines represent an important slice of the astronomical software library that include chains of processes that transform raw data into valuable information via data reduction and analysis. In this work we present Corral, a Python framework for astronomical pipeline generation. Corral features a Model-View-Controller design pattern on top of an SQL Relational Database capable of handling custom data models, processing stages, and communication alerts, and also provides automatic quality and structural metrics based on unit testing. The Model-View-Controller provides concept separation between the user logic and the data models, delivering at the same time multi-processing and distributed computing capabilities. Corral represents an improvement over commonly found data processing pipelines in astronomy, since the design pattern relieves the programmer of dealing with processing flow and parallelization issues, allowing them to focus on the specific algorithms needed for the successive data transformations, and at the same time provides a broad measure of quality over the created pipeline. Corral and working examples of pipelines that use it are available to the community at https://github.com/toros-astro.
Banta, Edward R.; Poeter, Eileen P.; Doherty, John E.; Hill, Mary C.
2006-01-01
The Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER API) improves the computer programming resources available to those developing applications (computer programs) for model analysis. The JUPITER API consists of eleven Fortran-90 modules that provide for encapsulation of data and operations on that data. Each module contains one or more entities: data, data types, subroutines, functions, and generic interfaces. The modules do not constitute computer programs themselves; instead, they are used to construct computer programs. Such computer programs are called applications of the API. The API provides common modeling operations for use by a variety of computer applications. The models being analyzed are referred to here as process models, and may, for example, represent the physics, chemistry, and(or) biology of a field or laboratory system. Process models commonly are constructed using published models such as MODFLOW (Harbaugh et al., 2000; Harbaugh, 2005), MT3DMS (Zheng and Wang, 1996), HSPF (Bicknell et al., 1997), PRMS (Leavesley and Stannard, 1995), and many others. The process model may be accessed by a JUPITER API application as an external program, or it may be implemented as a subroutine within a JUPITER API application. In either case, execution of the model takes place in a framework designed by the application programmer. This framework can be designed to take advantage of any parallel processing capabilities possessed by the process model, as well as the parallel-processing capabilities of the JUPITER API. Model analyses for which the JUPITER API could be useful include, for example: comparing model results to observed values to determine how well the model reproduces system processes and characteristics; using sensitivity analysis to determine the information provided by observations to parameters and predictions of interest; determining the additional data needed to improve selected model predictions; using calibration methods to modify parameter values and other aspects of the model; comparing predictions to regulatory limits; quantifying the uncertainty of predictions based on the results of one or many simulations using inferential or Monte Carlo methods; and determining how to manage the system to achieve stated objectives. The capabilities provided by the JUPITER API include, for example, communication with process models, parallel computations, compressed storage of matrices, and flexible input capabilities. The input capabilities use input blocks suitable for lists or arrays of data. The input blocks needed for one application can be included within one data file or distributed among many files. Data exchange between different JUPITER API applications or between applications and other programs is supported by data-exchange files. The JUPITER API has already been used to construct a number of applications. Three simple example applications are presented in this report. More complicated applications include the universal inverse code UCODE_2005 (Poeter et al., 2005), the multi-model analysis MMA (Eileen P. Poeter, Mary C. Hill, E.R. Banta, S.W. Mehl, and Steen Christensen, written commun., 2006), and a code named OPR_PPR (Matthew J. Tonkin, Claire R. Tiedeman, Mary C. Hill, and D. Matthew Ely, written communication, 2006). This report describes a set of underlying organizational concepts and complete specifics about the JUPITER API.
While understanding the organizational concept presented is useful to understanding the modules, other organizational concepts can be used in applications constructed using the JUPITER API.
Software-as-a-Service Vendors: Are They Ready to Successfully Deliver?
NASA Astrophysics Data System (ADS)
Heart, Tsipi; Tsur, Noa Shamir; Pliskin, Nava
Software as a service (SaaS) is a software sourcing option that allows organizations to remotely access enterprise applications, without having to install the application in-house. In this work we study vendors' readiness to deliver SaaS, a topic scarcely studied before. The innovation classification (evolutionary vs. revolutionary) and a new Seven Fundamental Organizational Capabilities (FOCs) Model are used as the theoretical frameworks. The Seven FOCs model suggests a generic yet comprehensive set of capabilities that are required for organizational success: 1) sensing the stakeholders, 2) sensing the business environment, 3) sensing the knowledge environment, 4) process control, 5) process improvement, 6) new process development, and 7) appropriate resolution.
NASA Astrophysics Data System (ADS)
Bao, Yanli; Hua, Hefeng
2017-03-01
Network capability is the enterprise's capability to set up, manage, maintain and use a variety of relations between enterprises, and to obtain resources for improving competitiveness. Tourism in China is in a transformation period from sightseeing to leisure and vacation. Scenic spots as well as tourist enterprises can learn from other enterprises in the process of resource development, and build up their own network relations in order to obtain resources for their survival and development. Through the effective management of network relations, the performance of resource development will be improved. Through an analysis of the literature on network capability and a case analysis of Wuxi Huishan Ancient Town, the role of network capability in tourism resource development is explored and a resource development path is built from the perspective of network capability. Finally, a tourism resource development process model based on network capability is proposed. This model mainly includes setting up a network vision, resource identification, resource acquisition, resource utilization and tourism project development. In these steps, network construction, network management and improving network center status are key points.
A neural network model of foraging decisions made under predation risk.
Coleman, Scott L; Brown, Vincent R; Levine, Daniel S; Mellgren, Roger L
2005-12-01
This article develops the cognitive-emotional forager (CEF) model, a novel application of a neural network to dynamical processes in foraging behavior. The CEF is based on a neural network known as the gated dipole, introduced by Grossberg, which is capable of representing short-term affective reactions in a manner similar to Solomon and Corbit's (1974) opponent process theory. The model incorporates a trade-off between approach toward food and avoidance of predation under varying levels of motivation induced by hunger. The results of simulations in a simple patch selection paradigm, using a lifetime fitness criterion for comparison, indicate that the CEF model is capable of nearly optimal foraging and outperforms a run-of-luck rule-of-thumb model. Models such as the one presented here can illuminate the underlying cognitive and motivational components of animal decision making.
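The opponent-process behavior at the heart of the gated dipole can be sketched with a heavily simplified two-channel habituation model; the equations and constants below are illustrative assumptions, not Grossberg's full gated dipole equations or the CEF model itself.

```python
# Minimal opponent-process sketch: each channel has a habituating transmitter
# gate, and the output is the gated difference between the "on" channel
# (phasic input plus tonic drive) and the "off" channel (tonic drive only).
# When the input is removed, the depleted on-gate produces a transient
# rebound of opposite sign, as in opponent process theory.
dt, tonic = 0.01, 0.5
z_on = z_off = 1.0                                # transmitter gates start full
for step in range(3000):
    t = step * dt
    J = 1.0 if 5.0 <= t < 15.0 else 0.0           # phasic input (e.g., a food cue)
    s_on, s_off = tonic + J, tonic
    z_on += dt * (0.05 * (1.0 - z_on) - 0.5 * s_on * z_on)    # habituation
    z_off += dt * (0.05 * (1.0 - z_off) - 0.5 * s_off * z_off)
    output = s_on * z_on - s_off * z_off          # net affective response
    if step % 500 == 0:
        print(f"t={t:5.1f}  output={output:+.3f}")
```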
Adaptive Multiscale Modeling of Geochemical Impacts on Fracture Evolution
NASA Astrophysics Data System (ADS)
Molins, S.; Trebotich, D.; Steefel, C. I.; Deng, H.
2016-12-01
Understanding fracture evolution is essential for many subsurface energy applications, including subsurface storage, shale gas production, fracking, CO2 sequestration, and geothermal energy extraction. Geochemical processes in particular play a significant role in the evolution of fractures through dissolution-driven widening, fines migration, and/or fracture sealing due to precipitation. One obstacle to understanding and exploiting geochemical fracture evolution is that it is a multiscale process. However, current geochemical modeling of fractures cannot capture this multi-scale nature of geochemical and mechanical impacts on fracture evolution, and is limited to either a continuum or pore-scale representation. Conventional continuum-scale models treat fractures as preferential flow paths, with their permeability evolving as a function (often, a cubic law) of the fracture aperture. This approach has the limitation that it oversimplifies flow within the fracture in its omission of pore scale effects while also assuming well-mixed conditions. More recently, pore-scale models along with advanced characterization techniques have allowed for accurate simulations of flow and reactive transport within the pore space (Molins et al., 2014, 2015). However, these models, even with high performance computing, are currently limited in their ability to treat tractable domain sizes (Steefel et al., 2013). Thus, there is a critical need to develop an adaptive modeling capability that can account for separate properties and processes, emergent and otherwise, in the fracture and the rock matrix at different spatial scales. Here we present an adaptive modeling capability that treats geochemical impacts on fracture evolution within a single multiscale framework. Model development makes use of the high performance simulation capability, Chombo-Crunch, leveraged by high resolution characterization and experiments. The modeling framework is based on the adaptive capability in Chombo, which not only enables mesh refinement but also refinement of the model (pore scale or continuum Darcy scale) in a dynamic way such that the appropriate model is used only when and where it is needed. Explicit flux matching provides coupling between the scales.
NASA Astrophysics Data System (ADS)
Zhou, S.; Tao, W. K.; Li, X.; Matsui, T.; Sun, X. H.; Yang, X.
2015-12-01
A cloud-resolving model (CRM) is an atmospheric numerical model that can numerically resolve clouds and cloud systems at 0.25 to 5 km horizontal grid spacings. The main advantage of the CRM is that it can allow explicit interactive processes between microphysics, radiation, turbulence, surface, and aerosols without subgrid cloud fraction, overlapping and convective parameterization. Because of their fine resolution and complex physical processes, it is challenging for the CRM community to i) visualize/inter-compare CRM simulations, ii) diagnose key processes for cloud-precipitation formation and intensity, and iii) evaluate against NASA's field campaign data and L1/L2 satellite data products due to large data volume (~10 TB) and complexity of CRM's physical processes. We have been building the Super Cloud Library (SCL) upon a Hadoop framework, capable of CRM database management, distribution, visualization, subsetting, and evaluation in a scalable way. The current SCL capability includes: (1) an SCL data model that enables various CRM simulation outputs in NetCDF, including the NASA-Unified Weather Research and Forecasting (NU-WRF) and Goddard Cumulus Ensemble (GCE) model, to be accessed and processed by Hadoop; (2) a parallel NetCDF-to-CSV converter that supports NU-WRF and GCE model outputs; (3) a technique that visualizes Hadoop-resident data with IDL; (4) a technique that subsets Hadoop-resident data, compliant to the SCL data model, with HIVE or Impala via HUE's Web interface; (5) a prototype that enables a Hadoop MapReduce application to dynamically access and process data residing in a parallel file system, PVFS2 or CephFS, where high performance computing (HPC) simulation outputs such as NU-WRF's and GCE's are located. We are testing Apache Spark to speed up SCL data processing and analysis. With the SCL capabilities, SCL users can conduct large-domain on-demand tasks without downloading voluminous CRM datasets and various observations from NASA field campaigns and satellite data to a local computer, and inter-compare CRM output and data with GCE and NU-WRF.
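As an illustration of the NetCDF-to-CSV step, a minimal sketch is given below; the variable name, file layout, and helper function are assumptions, not the actual NU-WRF/GCE output conventions or the SCL converter.

```python
import csv
import numpy as np
from netCDF4 import Dataset

def netcdf_to_csv(nc_path, var_name, csv_path):
    """Flatten a 3-D NetCDF field into rows that a Hadoop/Hive/Impala table
    could ingest. Dimension order (e.g. z, y, x) is an assumption."""
    with Dataset(nc_path) as ds:
        data = np.asarray(ds.variables[var_name][:])
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["k", "j", "i", var_name])
        for idx, value in np.ndenumerate(data):
            writer.writerow(list(idx) + [float(value)])

# Example call (paths and variable name are hypothetical):
# netcdf_to_csv("gce_output.nc", "cloud_water", "cloud_water.csv")
```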
50 Years of Army Computing From ENIAC to MSRC
2000-09-01
processing capability. The scientific visualization program was started in 1984 to provide tools and expertise to help researchers graphically...and materials, forces modeling, nanoelectronics, electromagnetics and acoustics, signal image processing, and simulation and modeling. The ARL...mechanical and electrical calculating equipment, punch card data processing equipment, analog computers, and early digital machines. Before beginning, we
NASA Technical Reports Server (NTRS)
Cole, Stanley R.; Garcia, Jerry L.
2000-01-01
The NASA Langley Transonic Dynamics Tunnel (TDT) has provided a unique capability for aeroelastic testing for forty years. The facility has a rich history of significant contributions to the design of many United States commercial transports, military aircraft, launch vehicles, and spacecraft. The facility has many features that contribute to its uniqueness for aeroelasticity testing, perhaps the most important feature being the use of a heavy gas test medium to achieve higher test densities. Higher test medium densities substantially improve model-building requirements and therefore simplify the fabrication process for building aeroelastically scaled wind tunnel models. Aeroelastic scaling for the heavy gas results in lower model structural frequencies. Lower model frequencies tend to make aeroelastic testing safer. This paper will describe major developments in the testing capabilities at the TDT throughout its history, the current status of the facility, and planned additions and improvements to its capabilities in the near future.
Diversity's Impact on the Executive Coaching Process
ERIC Educational Resources Information Center
Maltbia, Terrence E.; Power, Anne
2005-01-01
This paper presents a conceptual model intended to expand existing executive coaching processes used in organizations by building the strategic learning capabilities needed to integrate a diversity perspective into this emerging field of HRD practice. This model represents the early development of results from a Diversity Practitioner Study…
Systems Security Engineering Capability Maturity Model SSE-CMM Model Description Document
1999-04-01
management is the process of assessing and quantifying risk, and establishing an acceptable level of risk for the organization. Managing risk is an...Process of assessing and quantifying risk and establishing an acceptable level of risk for the organization. [IEEE 13335-1:1996] Security Engineering
Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing
NASA Technical Reports Server (NTRS)
Ordaz, Irian
2011-01-01
Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.
LES ARM Symbiotic Simulation and Observation (LASSO) Implementation Strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gustafson Jr., WI; Vogelmann, AM
2015-09-01
This document illustrates the design of the Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) workflow to provide a routine, high-resolution modeling capability to augment the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility’s high-density observations. LASSO will create a powerful new capability for furthering ARM’s mission to advance understanding of cloud, radiation, aerosol, and land-surface processes. The combined observational and modeling elements will enable a new level of scientific inquiry by connecting processes and context to observations and providing needed statistics for details that cannot be measured. The result will be improved process understanding that facilitates concomitant improvements in climate model parameterizations. The initial LASSO implementation will be for ARM’s Southern Great Plains site in Oklahoma and will focus on shallow convection, which is poorly simulated by climate models due in part to clouds’ typically small spatial scale compared to model grid spacing, and because the convection involves complicated interactions of microphysical and boundary layer processes.
Conceptual models of information processing
NASA Technical Reports Server (NTRS)
Stewart, L. J.
1983-01-01
The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.
Acuña, Gonzalo; Ramirez, Cristian; Curilem, Millaray
2014-01-01
The lack of sensors for some relevant state variables in fermentation processes can be addressed by developing appropriate software sensors. In this work, NARX-ANN, NARMAX-ANN, NARX-SVM and NARMAX-SVM models are compared when acting as software sensors of biomass concentration for a solid substrate cultivation (SSC) process. Results show that NARMAX-SVM outperforms the other models, with an SMAPE index under 9 for 20% amplitude noise. In addition, NARMAX models perform better than NARX models under the same noise conditions because of their better predictive capabilities, as they include prediction errors as inputs. In the case of perturbation of the initial conditions of the autoregressive variable, NARX models exhibited better convergence capabilities. This work also confirms that a difficult-to-measure variable, like biomass concentration, can be estimated on-line from easy-to-measure variables like CO₂ and O₂ using an adequate software sensor based on computational intelligence techniques.
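A minimal NARX-style software-sensor sketch using a support vector regressor is shown below; the synthetic signals, lag depth, and SVR settings are assumptions for illustration and do not reproduce the published SSC models.

```python
import numpy as np
from sklearn.svm import SVR

# NARX-style regression: biomass is regressed on lagged values of itself and
# of easy-to-measure inputs (CO2, O2). All signals here are synthetic.
rng = np.random.default_rng(0)
n = 200
co2 = np.cumsum(rng.normal(0.05, 0.01, n))
o2 = 21.0 - 0.5 * co2 + rng.normal(0, 0.05, n)
biomass = 1.0 + 0.8 * co2 + rng.normal(0, 0.02, n)   # synthetic "true" biomass

lags = 2
X, y = [], []
for t in range(lags, n):
    # lagged outputs plus lagged-and-current exogenous inputs
    X.append(np.r_[biomass[t - lags:t], co2[t - lags:t + 1], o2[t - lags:t + 1]])
    y.append(biomass[t])
X, y = np.array(X), np.array(y)

split = int(0.7 * len(y))
model = SVR(C=10.0, epsilon=0.01).fit(X[:split], y[:split])
pred = model.predict(X[split:])
smape = 100 * np.mean(2 * np.abs(pred - y[split:]) / (np.abs(pred) + np.abs(y[split:])))
print(f"SMAPE on held-out data: {smape:.2f}%")
```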
Computer simulation: A modern day crystal ball?
NASA Technical Reports Server (NTRS)
Sham, Michael; Siprelle, Andrew
1994-01-01
It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling, and discusses the capabilities of Extend. Extend is an iconic-driven Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet it takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.
2011-03-01
This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most of the existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. The accessibility to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast/robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that the users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements.
Based on the gap analysis results, we have made the following recommendations for the code selection and code development for the NEAMS Waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and (3) build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.
Legacy model integration for enhancing hydrologic interdisciplinary research
NASA Astrophysics Data System (ADS)
Dozier, A.; Arabi, M.; David, O.
2013-12-01
Many challenges are introduced to interdisciplinary research in and around the hydrologic science community due to advances in computing technology and modeling capabilities in different programming languages, across different platforms and frameworks by researchers in a variety of fields with a variety of experience in computer programming. Many new hydrologic models as well as optimization, parameter estimation, and uncertainty characterization techniques are developed in scripting languages such as Matlab, R, Python, or in newer languages such as Java and the .Net languages, whereas many legacy models have been written in FORTRAN and C, which complicates inter-model communication for two-way feedbacks. However, most hydrologic researchers and industry personnel have little knowledge of the computing technologies that are available to address the model integration process. Therefore, the goal of this study is to address these new challenges by utilizing a novel approach based on a publish-subscribe-type system to enhance modeling capabilities of legacy socio-economic, hydrologic, and ecologic software. Enhancements include massive parallelization of executions and access to legacy model variables at any point during the simulation process by another program without having to compile all the models together into an inseparable 'super-model'. Thus, this study provides two-way feedback mechanisms between multiple different process models that can be written in various programming languages and can run on different machines and operating systems. Additionally, a level of abstraction is given to the model integration process that allows researchers and other technical personnel to perform more detailed and interactive modeling, visualization, optimization, calibration, and uncertainty analysis without requiring deep understanding of inter-process communication. To be compatible, a program must be written in a programming language with bindings to a common implementation of the message passing interface (MPI), which includes FORTRAN, C, Java, the .NET languages, Python, R, Matlab, and many others. The system is tested on a longstanding legacy hydrologic model, the Soil and Water Assessment Tool (SWAT), to observe and enhance speed-up capabilities for various optimization, parameter estimation, and model uncertainty characterization techniques, which is particularly important for computationally intensive hydrologic simulations. Initial results indicate that the legacy extension system significantly decreases developer time, computation time, and the cost of purchasing commercial parallel processing licenses, while enhancing interdisciplinary research by providing detailed two-way feedback mechanisms between various process models with minimal changes to legacy code.
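A minimal mpi4py sketch of the two-way feedback idea is given below; the variable names, coupling rules, and two-rank layout are illustrative assumptions, not the project's actual publish-subscribe framework.

```python
from mpi4py import MPI

# Two "legacy models" run as separate MPI ranks and exchange state variables
# each step instead of being compiled into a single super-model: rank 0 plays
# a hydrologic model and rank 1 an economic model (both are stand-ins).
comm = MPI.COMM_WORLD
rank = comm.Get_rank()

state = {"streamflow": 100.0} if rank == 0 else {"water_demand": 20.0}

for step in range(3):
    if rank == 0:
        comm.send(state["streamflow"], dest=1, tag=step)
        demand = comm.recv(source=1, tag=step)
        state["streamflow"] -= 0.5 * demand          # feedback from economics
    elif rank == 1:
        flow = comm.recv(source=0, tag=step)
        comm.send(state["water_demand"], dest=0, tag=step)
        state["water_demand"] = 0.1 * flow           # feedback from hydrology

print(f"rank {rank}, final state: {state}")
# Run with two processes, e.g.:  mpiexec -n 2 python coupled_sketch.py
```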
Improvements to information management systems simulator
NASA Technical Reports Server (NTRS)
Bilek, R. W.
1972-01-01
Work performed to augment and improve the interactive IMSIM information management simulation model is summarized. With this augmented model, NASA now has even greater capabilities for the simulation of computer system configurations, data processing loads imposed on these configurations, and executive software to control system operations. Through these simulations, NASA has an extremely cost-effective capability for the design and analysis of computer-based data management systems.
Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley; Lung, Shun-fat
2008-01-01
An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on an optimization approach have been successfully integrated into the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulation for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.
Self-conscious robotic system design process--from analysis to implementation.
Chella, Antonio; Cossentino, Massimo; Seidita, Valeria
2011-01-01
Developing robotic systems endowed with self-conscious capabilities means realizing complex sub-systems that need ad-hoc software engineering techniques for their modelling, analysis and implementation. In this chapter the whole process (from analysis to implementation) used to model the development of self-conscious robotic systems is presented, and the newly created design process, PASSIC, which supports each part of it, is fully illustrated.
Applying PCI in Combination Swivel Head Wrench
NASA Astrophysics Data System (ADS)
Chen, Tsang-Chiang; Yang, Chun-Ming; Hsu, Chang-Hsien; Hung, Hsiang-Wen
2017-09-01
Taiwan's traditional industries face competition in an era of globalization and environmental change, and the industry is under economic pressure and shock. To remain sustainable, businesses must continually improve production efficiency and the quality of their technology in order to stabilize their market position and achieve high market share. This study uses process capability indices to monitor the quality of a combination swivel head (dual-use ratchet) wrench: the key functional characteristics of the wrench are identified, actual measurement data are collected, and the process capability index Cpk is analyzed to produce a Process Capability Analysis Chart model. Finally, the study examines the current situation of this case and proposes improvement methods to raise overall quality and thereby strengthen the industry as a whole.
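For reference, the process capability indices mentioned above can be computed as in the hedged sketch below; the specification limits and sample data are hypothetical, not the study's measurements.

```python
import numpy as np

def capability_indices(samples, lsl, usl):
    """Cp compares specification width to process spread; Cpk additionally
    penalizes an off-center process mean."""
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

rng = np.random.default_rng(42)
torque = rng.normal(50.2, 0.8, size=120)      # hypothetical measured torque values
cp, cpk = capability_indices(torque, lsl=48.0, usl=53.0)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```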
GWM-VI: groundwater management with parallel processing for multiple MODFLOW versions
Banta, Edward R.; Ahlfeld, David P.
2013-01-01
Groundwater Management–Version Independent (GWM–VI) is a new version of the Groundwater Management Process of MODFLOW. The Groundwater Management Process couples groundwater-flow simulation with a capability to optimize stresses on the simulated aquifer based on an objective function and constraints imposed on stresses and aquifer state. GWM–VI extends prior versions of Groundwater Management in two significant ways—(1) it can be used with any version of MODFLOW that meets certain requirements on input and output, and (2) it is structured to allow parallel processing of the repeated runs of the MODFLOW model that are required to solve the optimization problem. GWM–VI uses the same input structure for files that describe the management problem as that used by prior versions of Groundwater Management. GWM–VI requires only minor changes to the input files used by the MODFLOW model. GWM–VI uses the Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER-API) to implement both version independence and parallel processing. GWM–VI communicates with the MODFLOW model by manipulating certain input files and interpreting results from the MODFLOW listing file and binary output files. Nearly all capabilities of prior versions of Groundwater Management are available in GWM–VI. GWM–VI has been tested with MODFLOW-2005, MODFLOW-NWT (a Newton formulation for MODFLOW-2005), MF2005-FMP2 (the Farm Process for MODFLOW-2005), SEAWAT, and CFP (Conduit Flow Process for MODFLOW-2005). This report provides sample problems that demonstrate a range of applications of GWM–VI and the directory structure and input information required to use the parallel-processing capability.
Enterprise and system of systems capability development life-cycle processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, David Franklin
2014-08-01
This report and set of appendices are a collection of memoranda originally drafted circa 2007-2009 for the purpose of describing and detailing a models-based systems engineering approach for satisfying enterprise and system-of-systems life cycle process requirements. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. The main thrust of the material presents a rational exposé of a structured enterprise development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of standard systems engineering processes. While the approach described invokes application of the Department of Defense Architectural Framework (DoDAF), it is suitable for use with other architectural description frameworks.
NASA Technical Reports Server (NTRS)
Crisp, David; Komar, George (Technical Monitor)
2001-01-01
Advancement of our predictive capabilities will require new scientific knowledge, improvement of our modeling capabilities, and new observation strategies to generate the complex data sets needed by coupled modeling networks. New observation strategies must support remote sensing from a variety of vantage points and will include "sensorwebs" of small satellites in low Earth orbit, large aperture sensors in Geostationary orbits, and sentinel satellites at L1 and L2 to provide day/night views of the entire globe. Onboard data processing and high speed computing and communications will enable near real-time tailoring and delivery of information products (i.e., predictions) directly to users.
NASA Technical Reports Server (NTRS)
Lahoti, G. D.; Akgerman, N.; Altan, T.
1978-01-01
Mild steel (AISI 1018) was selected as the model cold rolling material, and Ti-6Al-4V and Inconel 718 were selected as typical hot rolling and cold rolling alloys, respectively. The flow stress and workability of these alloys were characterized, and the friction factor at the roll/workpiece interface was determined at their respective working conditions by conducting ring tests. Computer-aided mathematical models for predicting metal flow and stresses, and for simulating the shape rolling process, were developed. These models utilized the upper bound and the slab methods of analysis, and were capable of predicting the lateral spread, roll separating force, roll torque, and local stresses, strains and strain rates. This computer-aided design system was also capable of simulating the actual rolling process, and thereby designing the roll pass schedule in rolling of an airfoil or a similar shape.
NASA Technical Reports Server (NTRS)
Solloway, C. B.; Wakeland, W.
1976-01-01
First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
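A minimal first-order Markov sketch of the idea, with hypothetical states and transition probabilities rather than the program's actual population model, is shown below.

```python
import numpy as np

# First-order Markov propagation: the next state of a population member
# depends only on the current state. States and probabilities are hypothetical.
states = ["healthy", "ill", "recovered"]
P = np.array([[0.90, 0.08, 0.02],
              [0.00, 0.70, 0.30],
              [0.05, 0.00, 0.95]])
assert np.allclose(P.sum(axis=1), 1.0)          # basic input check on rows

counts = np.array([1000, 0, 0], dtype=float)    # initial population by state
for step in range(10):
    counts = counts @ P                          # expected counts after one step
print(dict(zip(states, counts.round(1))))
```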
An AI approach for scheduling space-station payloads at Kennedy Space Center
NASA Technical Reports Server (NTRS)
Castillo, D.; Ihrie, D.; Mcdaniel, M.; Tilley, R.
1987-01-01
The Payload Processing for Space-Station Operations (PHITS) is a prototype modeling tool capable of addressing many Space Station related concerns. The system's object-oriented design approach, coupled with a powerful user interface, provides the user with capabilities to easily define and model many applications. PHITS differs from many artificial intelligence based systems in that it couples scheduling and goal-directed simulation to ensure that on-orbit requirement dates are satisfied.
ISO 9000 and/or Systems Engineering Capability Maturity Model?
NASA Technical Reports Server (NTRS)
Gholston, Sampson E.
2002-01-01
For businesses and organizations to remain competitive today, they must have processes and systems in place that will allow them to first identify customer needs and then develop products/processes that will meet or exceed the customers' needs and expectations. Customer needs, once identified, are normally stated as requirements. Designers can then develop products/processes that will meet these requirements. Several functions, such as quality management and systems engineering management, are used to assist product development teams in the development process. Both functions exist in all organizations and both have a similar objective, which is to ensure that developed processes will meet customer requirements. Are efforts in these organizations being duplicated? Are both functions needed by organizations? What are the similarities and differences between the functions listed above? ISO 9000 is an international standard for goods and services. It sets broad requirements for the assurance of quality and for management's involvement. It requires organizations to document their processes and to follow these documented processes. ISO 9000 gives customers assurance that the suppliers have control of the process for product development. Systems engineering can broadly be defined as a discipline that seeks to ensure that all requirements for a system are satisfied throughout the life of the system by preserving their interrelationship. The key activities of systems engineering include requirements analysis, functional analysis/allocation, design synthesis and verification, and system analysis and control. The systems engineering process, when followed properly, will lead to higher-quality products, lower-cost products, and shorter development cycles. The Systems Engineering Capability Maturity Model (SE-CMM) will allow companies to measure their systems engineering capability and continuously improve those capabilities. ISO 9000 and SE-CMM seem to have a similar objective, which is to document the organization's processes and certify to potential customers the capability of a supplier to control the processes that determine the quality of the product or services being produced. The remaining sections of this report examine the differences and similarities between ISO 9000 and SE-CMM and make recommendations for implementation.
Simulation of Healing Threshold in Strain-Induced Inflammation Through a Discrete Informatics Model.
Ibrahim, Israr Bin M; Sarma O V, Sanjay; Pidaparti, Ramana M
2018-05-01
Respiratory diseases such as asthma and acute respiratory distress syndrome, as well as acute lung injury, involve inflammation at the cellular level. The inflammation process is very complex and is characterized by the emergence of cytokines along with other changes in cellular processes. Due to the complexity of the various constituents that make up the inflammation dynamics, it is necessary to develop models that can complement experiments to fully understand inflammatory diseases. In this study, we developed a discrete informatics model based on a cellular automata (CA) approach to investigate the influence of the elastic field (stretch/strain) on the dynamics of inflammation and to account for probabilistic adaptation based on statistical interpretation of existing experimental data. Our simulation model investigated the effects of low, medium, and high strain conditions on inflammation dynamics. Results suggest that the model is able to indicate the threshold of innate healing of tissue as a response to strain experienced by the tissue. When strain is under the threshold, the tissue is still capable of adapting its structure to heal the damaged part. However, there exists a strain threshold where healing capability breaks down. The results obtained demonstrate that the developed discrete informatics based CA model is capable of modeling and giving insights into inflammation dynamics parameters under various mechanical strain/stretch environments.
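A minimal cellular-automaton sketch of the healing-threshold idea follows; the update rules and probabilities are illustrative assumptions, not the published model.

```python
import numpy as np

def simulate(strain, steps=200, size=50, seed=0):
    """Damaged cells heal with a probability that falls as strain rises and
    spread with a probability that grows with strain, so above some strain the
    damaged region grows instead of closing."""
    rng = np.random.default_rng(seed)
    grid = np.zeros((size, size), dtype=int)          # 0 = healthy, 1 = damaged
    grid[size // 2 - 3:size // 2 + 3, size // 2 - 3:size // 2 + 3] = 1
    p_heal = max(0.0, 0.25 - 0.3 * strain)            # healing weakens with strain
    p_spread = 0.05 + 0.2 * strain                    # damage spreads with strain
    for _ in range(steps):
        damaged = grid == 1
        neighbors = sum(np.roll(grid, s, axis=a) for s in (-1, 1) for a in (0, 1))
        spread = (~damaged) & (neighbors > 0) & (rng.random(grid.shape) < p_spread)
        heal = damaged & (rng.random(grid.shape) < p_heal)
        grid[spread], grid[heal] = 1, 0
    return grid.sum()

for strain in (0.1, 0.4, 0.8):                        # low, medium, high strain
    print(f"strain={strain:.1f}  damaged cells after simulation: {simulate(strain)}")
```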
Space environment and lunar surface processes
NASA Technical Reports Server (NTRS)
Comstock, G. M.
1979-01-01
The development of a general rock/soil model capable of simulating, in a self-consistent manner, the mechanical and exposure history of an assemblage of solid and loose material from submicron to planetary size scales, applicable to lunar and other space-exposed planetary surfaces, is discussed. The model was incorporated into a computer code called MESS.2 (model for the evolution of space exposed surfaces). MESS.2, which represents a considerable increase in sophistication and scope over previous soil and rock surface models, is described. The capabilities of previous models for near-surface soil and rock surfaces are compared with those of the rock/soil model, MESS.2.
Top-level modeling of an als system utilizing object-oriented techniques
NASA Astrophysics Data System (ADS)
Rodriguez, L. F.; Kang, S.; Ting, K. C.
The possible configuration of an Advanced Life Support (ALS) System capable of supporting human life for long-term space missions continues to evolve as researchers investigate potential technologies and configurations. To facilitate the decision process, the development of acceptable, flexible, and dynamic mathematical computer modeling tools capable of system-level analysis is desirable. Object-oriented techniques have been adopted to develop a dynamic top-level model of an ALS system. This approach has several advantages; among these, object-oriented abstractions of systems are inherently modular in architecture. Thus, models can initially be somewhat simplistic, while allowing for adjustments and improvements. In addition, by coding the model in Java, the model can be implemented via the World Wide Web, greatly encouraging the utilization of the model. Systems analysis is further enabled with the utilization of a readily available backend database containing information supporting the model. The subsystem models of the ALS system model include Crew, Biomass Production, Waste Processing and Resource Recovery, Food Processing and Nutrition, and the Interconnecting Space. Each subsystem model and an overall model have been developed. Presented here are the procedure utilized to develop the modeling tool, the vision of the modeling tool, and the current focus for each of the subsystem models.
Multi-Hypothesis Modelling Capabilities for Robust Data-Model Integration
NASA Astrophysics Data System (ADS)
Walker, A. P.; De Kauwe, M. G.; Lu, D.; Medlyn, B.; Norby, R. J.; Ricciuto, D. M.; Rogers, A.; Serbin, S.; Weston, D. J.; Ye, M.; Zaehle, S.
2017-12-01
Large uncertainty is often inherent in model predictions due to imperfect knowledge of how to describe the mechanistic processes (hypotheses) that a model is intended to represent. Yet this model hypothesis uncertainty (MHU) is often overlooked or informally evaluated, as methods to quantify and evaluate MHU are limited. MHU increases as models become more complex because each additional process added to a model comes with inherent MHU as well as parametric uncertainty. With the current trend of adding more processes to Earth System Models (ESMs), we are adding uncertainty, which can be quantified for parameters but not for MHU. Model inter-comparison projects do allow for some consideration of hypothesis uncertainty, but in an ad hoc and non-independent fashion. This has stymied efforts to evaluate ecosystem models against data and interpret the results mechanistically, because it is not simple to interpret exactly why a model is producing the results it does and to identify which model assumptions are key, as models combine many sub-systems and processes, each of which may be conceptualised and represented mathematically in various ways. We present a novel modelling framework, the multi-assumption architecture and testbed (MAAT), that automates the combination, generation, and execution of a model ensemble built with different representations of process. We will present the argument that multi-hypothesis modelling needs to be considered in conjunction with other capabilities (e.g. the Predictive Ecosystem Analyser, PEcAn) and statistical methods (e.g. sensitivity analysis, data assimilation) to aid efforts in robust data-model integration and to enhance our predictive understanding of biological systems.
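A minimal sketch of the multi-assumption idea follows; the process representations and parameter values are illustrative assumptions, not MAAT's actual hypotheses.

```python
import itertools

# Every combination of alternative hypotheses for two processes is run,
# making hypothesis uncertainty explicit in the spread of the outputs.
def photosynthesis_michaelis(par):        # alternative 1 for process A
    return 20.0 * par / (par + 300.0)

def photosynthesis_linear(par):           # alternative 2 for process A
    return min(0.04 * par, 20.0)

def respiration_q10(temp):                # alternative 1 for process B
    return 2.0 * 2.0 ** ((temp - 25.0) / 10.0)

def respiration_linear(temp):             # alternative 2 for process B
    return 0.08 * temp

hypotheses = {
    "photosynthesis": [photosynthesis_michaelis, photosynthesis_linear],
    "respiration": [respiration_q10, respiration_linear],
}

par, temp = 800.0, 20.0
for photo, resp in itertools.product(*hypotheses.values()):
    net = photo(par) - resp(temp)
    print(f"{photo.__name__:26s} + {resp.__name__:18s} -> net flux {net:6.2f}")
```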
Development of an Aeroelastic Modeling Capability for Transient Nozzle Side Load Analysis
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen
2013-01-01
Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development. Currently there is no fully coupled computational tool to analyze this fluid/structure interaction process. The objective of this study was to develop a fully coupled aeroelastic modeling capability to describe the fluid/structure interaction process during transient nozzle operations. The aeroelastic model is composed of three components: the computational fluid dynamics component, based on an unstructured-grid, pressure-based computational fluid dynamics formulation; the computational structural dynamics component, developed in the framework of modal analysis; and the fluid-structural interface component. The developed aeroelastic model was applied to the transient nozzle startup process of the Space Shuttle Main Engine at sea level. The computed nozzle side loads and the axial nozzle wall pressure profiles from the aeroelastic nozzle are compared with those of the published rigid nozzle results, and the impact of the fluid/structure interaction on nozzle side loads is interrogated and presented.
Phenomenological modelling of self-healing polymers based on integrated healing agents
NASA Astrophysics Data System (ADS)
Mergheim, Julia; Steinmann, Paul
2013-09-01
The present contribution introduces a phenomenological model for self-healing polymers. Self-healing polymers are a promising class of materials which mimic nature through their capability to autonomously heal micro-cracks. This self-healing is accomplished by the integration of microcapsules containing a healing agent and a dispersed catalyst into the matrix material. Propagating microcracks may then break the capsules, which releases the healing agent into the microcracks, where it polymerizes with the catalyst, closes the crack and 'heals' the material. The present modelling approach treats these processes at the macroscopic scale; the microscopic details of crack propagation and healing are thus described by means of continuous damage and healing variables. The formulation of the healing model accounts for the fact that healing is directly associated with the curing process of the healing agent and catalyst. The model is implemented and its capabilities are studied by means of numerical examples.
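A phenomenological sketch of interacting damage and healing variables is given below; the evolution equations, thresholds, and rates are illustrative assumptions, not the paper's constitutive model.

```python
# A scalar damage variable d grows when strain exceeds a threshold, while a
# healing variable h, driven by the curing of released healing agent, reduces
# the effective damage over time. All constants are illustrative.
dt, strain_threshold = 0.01, 0.02
d, h = 0.0, 0.0                                   # damage and healing variables
for step in range(1000):
    t = step * dt
    strain = 0.03 if t < 2.0 else 0.005           # overload followed by rest
    if strain > strain_threshold:
        d = min(1.0, d + 5.0 * (strain - strain_threshold) * dt)   # damage growth
    h = min(d, h + 0.2 * (d - h) * dt)            # curing-limited healing kinetics
    d_eff = d - h                                 # effective (net) damage
    if step % 200 == 0:
        print(f"t={t:4.1f}  damage={d:.3f}  healed={h:.3f}  effective={d_eff:.3f}")
```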
LinkEHR-Ed: a multi-reference model archetype editor based on formal semantics.
Maldonado, José A; Moner, David; Boscá, Diego; Fernández-Breis, Jesualdo T; Angulo, Carlos; Robles, Montserrat
2009-08-01
To develop a powerful archetype editing framework capable of handling multiple reference models and oriented towards the semantic description and standardization of legacy data. The main prerequisite for implementing tools providing enhanced support for archetypes is the clear specification of archetype semantics. We propose a formalization of the definition section of archetypes based on types over tree-structured data. It covers the specialization of archetypes, the relationship between reference models and archetypes and conformance of data instances to archetypes. LinkEHR-Ed, a visual archetype editor based on the former formalization with advanced processing capabilities that supports multiple reference models, the editing and semantic validation of archetypes, the specification of mappings to data sources, and the automatic generation of data transformation scripts, is developed. LinkEHR-Ed is a useful tool for building, processing and validating archetypes based on any reference model.
NASA Astrophysics Data System (ADS)
Combeau, Hervé; Založnik, Miha; Bedel, Marie
2016-08-01
Prediction of solidification defects, such as macrosegregation and inhomogeneous microstructures, constitutes a key issue for industry. The development of models of casting processes needs to account for several imbricated length scales and different physical phenomena. For example, the kinetics of the growth of microstructures needs to be coupled with the multiphase flow at the process scale. We introduce such a state-of-the-art model and outline its principles. We present the most recent applications of the model to the casting of a heavy steel ingot and to direct chill casting of a large Al alloy sheet ingot. The ability of the model to help in the understanding of complex phenomena, such as the competition between nucleation and growth of grains in the presence of liquid convection and grain motion, is shown, and its predictive capabilities are discussed. Key issues for future developments and research are addressed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.
Experience Transitioning Models and Data at the NOAA Space Weather Prediction Center
NASA Astrophysics Data System (ADS)
Berger, Thomas
2016-07-01
The NOAA Space Weather Prediction Center has a long history of transitioning research data and models into operations, along with the validation activities that this requires. The first stage in this process involves demonstrating that the capability has sufficient value to customers to justify the cost needed to transition it and to run it continuously and reliably in operations. Once the overall value is demonstrated, a substantial effort is then required to develop the operational software from the research codes. The next stage is to implement and test the software and product generation on the operational computers. Finally, effort must be devoted to establishing long-term measures of performance, maintaining the software, and working with forecasters, customers, and researchers to improve the operational capabilities over time. This multi-stage process of identifying, transitioning, and improving operational space weather capabilities will be discussed using recent examples. Plans for future activities will also be described.
New single-aircraft integrated atmospheric observation capabilities
NASA Astrophysics Data System (ADS)
Wang, Z.
2011-12-01
Improving current weather and climate model capabilities requires a better understanding of many atmospheric processes. Thus, advancing atmospheric observation capabilities has been regarded as one of the highest imperatives for advancing atmospheric science in the 21st century. Under NSF CAREER support, we focus on developing new airborne observation capabilities through the development of new instrumentation and the single-aircraft integration of multiple remote sensors with in situ probes. Two compact Wyoming cloud lidars were built to work together with a 183 GHz microwave radiometer, a multi-beam Wyoming cloud radar, and in situ probes for cloud studies. The synergy of these remote sensor measurements allows us to better resolve the vertical structure of cloud microphysical properties and cloud-scale dynamics. Together with detailed in situ data for aerosol, cloud, water vapor and dynamics, we developed the most advanced observational capability to study cloud-scale properties and processes from a single aircraft. A compact Raman lidar was also built to work together with in situ sampling to characterize boundary layer aerosol and water vapor distributions for studies of many important atmospheric processes, such as air-sea interaction and convective initiation. Case studies will be presented to illustrate these new observation capabilities.
NASA Technical Reports Server (NTRS)
Abbott, J. M.; Deidrich, J. H.; Groeneweg, J. F.; Povinelli, L. A.; Reid, L.; Reinmann, J. J.; Szuch, J. R.
1985-01-01
An effort is currently underway at the NASA Lewis Research Center to rehabilitate and extend the capabilities of the Altitude Wind Tunnel (AWT). This extended capability will include a maximum test section Mach number of about 0.9 at an altitude of 55,000 ft and a -20 F stagnation temperature (octagonal test section, 20 ft across the flats). In addition, the AWT will include an icing and acoustic research capability. In order to ensure a technically sound design, an AWT modeling program (both analytical and physical) was initiated to provide essential input to the AWT final design process. This paper describes the modeling program, including the rationale and criteria used in program definition, and presents some early program results.
Gaussian Process Regression (GPR) Representation in Predictive Model Markup Language (PMML)
Lechevalier, D.; Ak, R.; Ferguson, M.; Law, K. H.; Lee, Y.-T. T.; Rachuri, S.
2017-01-01
This paper describes Gaussian process regression (GPR) models presented in the Predictive Model Markup Language (PMML). PMML is an Extensible Markup Language (XML)-based standard language used to represent data-mining and predictive analytic models, as well as pre- and post-processed data. The previous PMML version, PMML 4.2, did not provide capabilities for representing probabilistic (stochastic) machine-learning algorithms that are widely used for constructing predictive models that take the associated uncertainties into consideration. The newly released PMML version 4.3, which includes the GPR model, provides new features: confidence bounds and distributions for the predictive estimations. Both features are needed to establish the foundation for uncertainty quantification analysis. Among various probabilistic machine-learning algorithms, GPR has been widely used for approximating a target function because of its capability of representing complex input and output relationships without predefining a set of basis functions, and of predicting a target output with uncertainty quantification. GPR is being employed in various manufacturing data-analytics applications, which necessitates representing this model in a standardized form for easy and rapid employment. In this paper, we present a GPR model and its representation in PMML. Furthermore, we demonstrate a prototype using a real data set in the manufacturing domain.
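As a hedged illustration of the confidence-bound feature described above, the sketch below fits a GPR model with scikit-learn and returns predictive means with standard deviations. It does not produce PMML itself; the kernel choice and data are purely illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Illustrative training data (a noisy response measured against one input).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, size=(30, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(30)

# GPR with an RBF kernel; alpha adds observation noise to the covariance diagonal.
kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)
gpr = GaussianProcessRegressor(kernel=kernel, alpha=1e-2, normalize_y=True)
gpr.fit(X, y)

# Predictions with uncertainty: mean and standard deviation per test point.
X_test = np.linspace(0.0, 10.0, 5).reshape(-1, 1)
mean, std = gpr.predict(X_test, return_std=True)
for x, m, s in zip(X_test.ravel(), mean, std):
    print(f"x={x:4.1f}  mean={m:+.3f}  95% bound=±{1.96 * s:.3f}")
```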
Gaussian Process Regression (GPR) Representation in Predictive Model Markup Language (PMML).
Park, J; Lechevalier, D; Ak, R; Ferguson, M; Law, K H; Lee, Y-T T; Rachuri, S
2017-01-01
This paper describes Gaussian process regression (GPR) models presented in the Predictive Model Markup Language (PMML). PMML is an Extensible Markup Language (XML)-based standard language used to represent data-mining and predictive analytic models, as well as pre- and post-processed data. The previous PMML version, PMML 4.2, did not provide capabilities for representing probabilistic (stochastic) machine-learning algorithms that are widely used for constructing predictive models that take the associated uncertainties into consideration. The newly released PMML version 4.3, which includes the GPR model, provides new features: confidence bounds and distributions for the predictive estimations. Both features are needed to establish the foundation for uncertainty quantification analysis. Among various probabilistic machine-learning algorithms, GPR has been widely used for approximating a target function because of its capability of representing complex input and output relationships without predefining a set of basis functions, and of predicting a target output with uncertainty quantification. GPR is being employed in various manufacturing data-analytics applications, which necessitates representing this model in a standardized form for easy and rapid employment. In this paper, we present a GPR model and its representation in PMML. Furthermore, we demonstrate a prototype using a real data set in the manufacturing domain.
Computer-aided light sheet flow visualization using photogrammetry
NASA Technical Reports Server (NTRS)
Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.
1994-01-01
A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and a visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) results, was chosen to interactively display the reconstructed light sheet images with the numerical surface geometry for the model or aircraft under study. The photogrammetric reconstruction technique and the image processing and computer graphics techniques and equipment are described. Results of the computer-aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images with CFD solutions in the same graphics environment is also demonstrated.
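A hedged sketch of the underlying geometric idea of the photogrammetric reconstruction: given camera intrinsics, camera pose, and the plane of the light sheet, a pixel can be mapped to a 3D point by intersecting the camera ray through that pixel with the light-sheet plane. The matrices and plane values below are illustrative assumptions, not the system described in the report.

```python
import numpy as np

# Assumed camera intrinsics (pinhole model) and pose (illustrative values).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                      # camera-to-world rotation
c = np.array([0.0, 0.0, -2.0])     # camera center in world coordinates

# Light-sheet plane: points p with n.p = d (illustrative values).
n = np.array([0.0, 0.0, 1.0])
d = 0.0

def pixel_to_sheet_point(u, v):
    """Intersect the camera ray through pixel (u, v) with the light-sheet plane."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])   # direction in camera frame
    ray_world = R @ ray_cam                               # direction in world frame
    t = (d - n @ c) / (n @ ray_world)                     # ray parameter at the plane
    return c + t * ray_world

print(pixel_to_sheet_point(320.0, 240.0))   # 3D point illuminated by the sheet
```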
Computer-Aided Light Sheet Flow Visualization
NASA Technical Reports Server (NTRS)
Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.
1993-01-01
A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) data sets, was chosen to interactively display the reconstructed light sheet images, along with the numerical surface geometry for the model or aircraft under study. A description is provided of the photogrammetric reconstruction technique, and the image processing and computer graphics techniques and equipment. Results of the computer aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images and CFD solutions in the same graphics environment is also demonstrated.
Neurokernel: An Open Source Platform for Emulating the Fruit Fly Brain
2016-01-01
We have developed an open software platform called Neurokernel for collaborative development of comprehensive models of the brain of the fruit fly Drosophila melanogaster and their execution and testing on multiple Graphics Processing Units (GPUs). Neurokernel provides a programming model that capitalizes upon the structural organization of the fly brain into a fixed number of functional modules to distinguish between these modules’ local information processing capabilities and the connectivity patterns that link them. By defining mandatory communication interfaces that specify how data is transmitted between models of each of these modules regardless of their internal design, Neurokernel explicitly enables multiple researchers to collaboratively model the fruit fly’s entire brain by integration of their independently developed models of its constituent processing units. We demonstrate the power of Neurokernel’s model integration by combining independently developed models of the retina and lamina neuropils in the fly’s visual system and by demonstrating their neuroinformation processing capability. We also illustrate Neurokernel’s ability to take advantage of direct GPU-to-GPU data transfers with benchmarks that demonstrate scaling of Neurokernel’s communication performance both over the number of interface ports exposed by an emulation’s constituent modules and over the total number of modules comprising an emulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James
2012-01-01
Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.
ERIC Educational Resources Information Center
Keen, John
2017-01-01
This article outlines some cognitive process models of writing composition. Possible reasons why students' writing capabilities do not match their abilities in some other school subjects are explored. Research findings on the efficacy of process approaches to teaching writing are presented and potential shortcomings are discussed. Product-based…
Identity in agent-based models : modeling dynamic multiscale social processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ozik, J.; Sallach, D. L.; Macal, C. M.
Identity-related issues play central roles in many current events, including those involving factional politics, sectarianism, and tribal conflicts. Two popular models from the computational-social-science (CSS) literature - the Threat Anticipation Program and SharedID models - incorporate notions of identity (individual and collective) and processes of identity formation. A multiscale conceptual framework that extends some ideas presented in these models and draws other capabilities from the broader CSS literature is useful in modeling the formation of political identities. The dynamic, multiscale processes that constitute and transform social identities can be mapped to expressive structures of the framework.
NASA Astrophysics Data System (ADS)
Aburas, Maher Milad; Ho, Yuek Ming; Ramli, Mohammad Firuz; Ash'aari, Zulfa Hanan
2017-07-01
The creation of an accurate simulation of future urban growth is considered one of the most important challenges in urban studies that involve spatial modeling. The purpose of this study is to improve the simulation capability of an integrated CA-Markov Chain (CA-MC) model using CA-MC based on the Analytical Hierarchy Process (AHP) and CA-MC based on Frequency Ratio (FR), both applied in Seremban, Malaysia, as well as to compare the performance and accuracy of the traditional and hybrid models. Various physical, socio-economic, utilities, and environmental criteria were used as predictors, including elevation, slope, soil texture, population density, distance to commercial area, distance to educational area, distance to residential area, distance to industrial area, distance to roads, distance to highway, distance to railway, distance to power line, distance to stream, and land cover. For calibration, three models were applied to simulate urban growth trends in 2010; the actual data of 2010 were used for model validation utilizing the Relative Operating Characteristic (ROC) and Kappa coefficient methods. Consequently, future urban growth maps of 2020 and 2030 were created. The validation findings confirm that integrating the CA-MC model with the FR model and employing the significant driving forces of urban growth in the simulation process have improved the simulation capability of the CA-MC model. This study provides a novel approach for improving the CA-MC model based on FR, which will provide powerful support to planners and decision-makers in the development of future sustainable urban planning.
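As a hedged illustration of the Kappa-coefficient validation step mentioned above, the sketch below compares an "actual" and a "simulated" categorical land-cover map cell by cell; the arrays are synthetic placeholders, not the Seremban data.

```python
import numpy as np

def kappa(actual, simulated, n_classes):
    """Cohen's kappa between two categorical rasters of equal shape."""
    a, s = actual.ravel(), simulated.ravel()
    confusion = np.zeros((n_classes, n_classes))
    for i, j in zip(a, s):
        confusion[i, j] += 1
    total = confusion.sum()
    po = np.trace(confusion) / total                              # observed agreement
    pe = (confusion.sum(0) * confusion.sum(1)).sum() / total**2   # chance agreement
    return (po - pe) / (1.0 - pe)

rng = np.random.default_rng(1)
actual = rng.integers(0, 3, size=(100, 100))        # synthetic reference map, 3 classes
simulated = actual.copy()
flip = rng.random(actual.shape) < 0.15              # 15% disagreement for illustration
simulated[flip] = rng.integers(0, 3, size=flip.sum())

print("kappa:", round(kappa(actual, simulated, 3), 3))
```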
Executable Architectures for Modeling Command and Control Processes
2006-06-01
of introducing new NCES capabilities (such as the Federated Search) to the 'To Be' model. ...Federated Search - this capability provides a way to search enterprise contents across various search-enabled
Pre- and post-processing for Cosmic/NASTRAN on personal computers and mainframes
NASA Technical Reports Server (NTRS)
Kamel, H. A.; Mobley, A. V.; Nagaraj, B.; Watkins, K. W.
1986-01-01
An interface between Cosmic/NASTRAN and GIFTS has recently been released, combining the powerful pre- and post-processing capabilities of GIFTS with Cosmic/NASTRAN's analysis capabilities. The interface operates on a wide range of computers, even linking Cosmic/NASTRAN and GIFTS when the two are on different computers. GIFTS offers a wide range of elements for use in model construction, each translated by the interface into the nearest Cosmic/NASTRAN equivalent; and the options of automatic or interactive modelling and loading in GIFTS make pre-processing easy and effective. The interface itself includes the programs GFTCOS, which creates the Cosmic/NASTRAN input deck (and, if desired, control deck) from the GIFTS Unified Data Base; COSGFT, which translates the displacements from the Cosmic/NASTRAN analysis back into GIFTS; and HOSTR, which handles stress computations for a few higher-order elements available in the interface but not supported by the GIFTS processor STRESS. Finally, the versatile display options in GIFTS post-processing allow the user to examine the analysis results through an especially wide range of capabilities, including such possibilities as creating composite loading cases, plotting in color, and animating the analysis.
NASA Technical Reports Server (NTRS)
Johnson, Donald R.
1998-01-01
The goal of this research is the continued development and application of global isentropic modeling and analysis capabilities to describe hydrologic processes and energy exchange in the climate system, and discern regional climate change. This work involves a combination of modeling and analysis efforts involving 4DDA datasets and simulations from the University of Wisconsin (UW) hybrid isentropic-sigma (theta-sigma) coordinate model and the GEOS GCM.
NASA Astrophysics Data System (ADS)
Marin, I. S.; Molson, J. W.
2013-05-01
Petroleum hydrocarbons (PHCs) are a major source of groundwater contamination and a well-known, worldwide problem. They are formed by a complex mixture of hundreds of organic compounds (including BTEX - benzene, toluene, ethylbenzene and xylenes), many of which are toxic and persistent in the subsurface and capable of creating a serious risk to human health. Several remediation technologies can be used to clean up PHC contamination. In-situ chemical oxidation (ISCO) and intrinsic bioremediation (IBR) are two promising techniques that can be applied in this case. However, the interaction of these processes with the background aquifer geochemistry and the design of an efficient treatment present a challenge. Here we show the development and application of BIONAPL/Phreeqc, a modeling tool capable of simulating groundwater flow and contaminant transport with coupled biological and geochemical processes in porous or fractured porous media. BIONAPL/Phreeqc is based on the well-tested BIONAPL/3D model, using a powerful finite element simulation engine capable of simulating non-aqueous phase liquid (NAPL) dissolution and density-dependent advective-dispersive transport, and solving the geochemical and kinetic processes with the Phreeqc library. To validate the model, we compared BIONAPL/Phreeqc with results from the literature for different biodegradation processes and different geometries, with good agreement. We then used the model to simulate the behavior of sodium persulfate (Na2S2O8) as an oxidant for BTEX degradation, coupled with sequential biodegradation in a 2D case, and to evaluate the effect of inorganic geochemical reactions. The results show the advantages of a treatment-train remediation scheme based on ISCO and IBR. The numerical performance and stability of the integrated BIONAPL/Phreeqc model were also verified.
NASA Technical Reports Server (NTRS)
Mcmanus, Shawn; Mcdaniel, Michael
1989-01-01
Planning for processing payloads has always been difficult and time-consuming. With the advent of Space Station Freedom and its capability to support a myriad of complex payloads, the planning to support this ground processing maze involves thousands of man-hours of often tedious data manipulation. To provide the capability to analyze various processing schedules, an object-oriented, knowledge-based simulation environment called the Advanced Generic Accommodations Planning Environment (AGAPE) is being developed. Having nearly completed the baseline system, the emphasis in this paper is directed toward rule definition and its relation to model development and simulation. The focus is specifically on the methodologies implemented during knowledge acquisition, analysis, and representation within the AGAPE rule structure. A model is provided to illustrate the concepts presented. The approach demonstrates a framework for AGAPE rule development to assist expert system development.
Test Capability Enhancements to the NASA Langley 8-Foot High Temperature Tunnel
NASA Technical Reports Server (NTRS)
Harvin, S. F.; Cabell, K. F.; Gallimore, S. D.; Mekkes, G. L.
2006-01-01
The NASA Langley 8-Foot High Temperature Tunnel produces true-enthalpy environments simulating flight from Mach 4 to Mach 7, primarily for airbreathing propulsion and aerothermal/thermo-structural testing. Flow conditions are achieved through a methane-air heater and nozzles that produce aerodynamic Mach numbers of 4, 5, or 7 and have exit diameters of 8 feet or 4.5 feet. The 12-ft long free-jet test section, housed inside a 26-ft vacuum sphere, accommodates large test articles. Recently, the facility underwent significant upgrades to support hydrocarbon-fueled scramjet engine testing and to expand flight simulation capability. The upgrades were required to meet engine system development and flight clearance verification requirements originally defined by the joint NASA-Air Force X-43C Hypersonic Flight Demonstrator Project and now the Air Force X-51A Program. Enhancements to the 8-Ft. HTT were made in four areas: 1) hydrocarbon fuel delivery; 2) flight simulation capability; 3) controls and communication; and 4) data acquisition/processing. The upgrades include the addition of systems to supply ethylene and liquid JP-7 to test articles; a Mach 5 nozzle with dynamic pressure simulation capability up to 3200 psf; a real-time model angle-of-attack system; a new programmable logic controller sub-system to improve process controls and communication with model controls; the addition of MIL-STD-1553B and high-speed data acquisition systems; and a classified data processing environment. These additions represent a significant increase to the already unique test capability and flexibility of the facility, and complement the existing array of test support hardware such as a model injection system, radiant heaters, a six-component force measurement system, and optical flow field visualization hardware. The new systems support complex test programs that require sophisticated test sequences and precise management of process fluids. Furthermore, the new systems, such as the real-time angle-of-attack system and the new programmable logic controller, enhance the test efficiency of the facility. The motivation for the upgrades and the expanded capabilities is described here.
Semi-empirical master curve concept describing the rate capability of lithium insertion electrodes
NASA Astrophysics Data System (ADS)
Heubner, C.; Seeba, J.; Liebmann, T.; Nickol, A.; Börner, S.; Fritsch, M.; Nikolowski, K.; Wolter, M.; Schneider, M.; Michaelis, A.
2018-03-01
A simple semi-empirical master curve concept, describing the rate capability of porous insertion electrodes for lithium-ion batteries, is proposed. The model is based on the evaluation of the time constants of lithium diffusion in the liquid electrolyte and the solid active material. This theoretical approach is successfully verified by comprehensive experimental investigations of the rate capability of a large number of porous insertion electrodes with various active materials and design parameters. It turns out that the rate capability of all investigated electrodes follows a simple master curve governed by the time constant of the rate-limiting process. We demonstrate that the master curve concept can be used to determine optimum design criteria meeting specific requirements in terms of maximum gravimetric capacity for a desired rate capability. The model further reveals practical limits of the electrode design, confirming the empirically well-known and inevitable trade-off between energy and power density.
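A hedged sketch of the time-constant comparison described above: diffusion time constants for the electrolyte across the electrode thickness and for lithium in the active-material particles are estimated as a length squared over a diffusivity, and the larger one is taken as rate limiting. All dimensions and diffusivities below are illustrative assumptions, not values from the paper.

```python
# Illustrative electrode parameters (assumed, not from the paper).
L_electrode = 70e-6        # electrode thickness, m
D_electrolyte = 2.5e-10    # effective electrolyte diffusivity, m^2/s
r_particle = 5e-6          # active-material particle radius, m
D_solid = 1e-14            # solid-state diffusivity, m^2/s

tau_liquid = L_electrode**2 / D_electrolyte   # liquid-phase diffusion time constant
tau_solid = r_particle**2 / D_solid           # solid-phase diffusion time constant

limiting = "solid-state diffusion" if tau_solid > tau_liquid else "electrolyte diffusion"
print(f"tau_liquid = {tau_liquid:.1f} s, tau_solid = {tau_solid:.1f} s")
print("rate-limiting process:", limiting)
```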
User's guide to the Variably Saturated Flow (VSF) process to MODFLOW
Thoms, R. Brad; Johnson, Richard L.; Healy, Richard W.
2006-01-01
A new process for simulating three-dimensional (3-D) variably saturated flow (VSF) using Richards' equation has been added to the 3-D modular finite-difference ground-water model MODFLOW. Five new packages are presented here as part of the VSF Process--the Richards' Equation Flow (REF1) Package, the Seepage Face (SPF1) Package, the Surface Ponding (PND1) Package, the Surface Evaporation (SEV1) Package, and the Root Zone Evapotranspiration (RZE1) Package. Additionally, a new Adaptive Time-Stepping (ATS1) Package is presented for use by both the Ground-Water Flow (GWF) Process and VSF. The VSF Process allows simulation of flow in unsaturated media above the ground-water zone and facilitates modeling of ground-water/surface-water interactions. Model performance is evaluated by comparison to an analytical solution for one-dimensional (1-D) constant-head infiltration (Dirichlet boundary condition), field experimental data for a 1-D constant-head infiltration, laboratory experimental data for two-dimensional (2-D) constant-flux infiltration (Neumann boundary condition), laboratory experimental data for 2-D transient drainage through a seepage face, and numerical model results (VS2DT) of a 2-D flow-path simulation using realistic surface boundary conditions. A hypothetical 3-D example case also is presented to demonstrate the new capability using periodic boundary conditions (for example, daily precipitation) and varied surface topography over a larger spatial scale (0.133 square kilometer). The new model capabilities retain the modular structure of the MODFLOW code and preserve MODFLOW's existing capabilities as well as compatibility with commercial pre-/post-processors. The overall success of the VSF Process in simulating mixed boundary conditions and variable soil types demonstrates its utility for future hydrologic investigations. This report presents a new flow package implementing the governing equations for variably saturated ground-water flow, four new boundary condition packages unique to unsaturated flow, the Adaptive Time-Stepping Package for use with both the GWF Process and the new VSF Process, detailed descriptions of the input and output files for each package, and six simulation examples verifying model performance.
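As a hedged illustration of the kind of analytical benchmark a variably saturated flow code can be compared against for 1-D ponded infiltration, the sketch below solves the classic Green-Ampt cumulative-infiltration equation by fixed-point iteration. Green-Ampt is not necessarily the analytical solution used in the report, and the soil parameters are illustrative.

```python
import numpy as np

# Illustrative Green-Ampt parameters for constant-head (ponded) infiltration.
K_s = 1.0e-6        # saturated hydraulic conductivity, m/s
psi_f = 0.11        # wetting-front suction head, m
d_theta = 0.30      # moisture deficit (porosity minus initial water content)

def cumulative_infiltration(t, tol=1e-10):
    """Solve F - psi*d_theta*ln(1 + F/(psi*d_theta)) = K_s*t for F by iteration."""
    c = psi_f * d_theta
    F = K_s * t
    for _ in range(200):
        F_new = K_s * t + c * np.log(1.0 + F / c)
        if abs(F_new - F) < tol:
            break
        F = F_new
    return F

for hours in (1, 6, 24):
    t = hours * 3600.0
    print(f"t = {hours:2d} h : cumulative infiltration = {cumulative_infiltration(t)*1000:.2f} mm")
```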
Modeling atmospheric effects - an assessment of the problems
Douglas G. Fox
1976-01-01
Our ability to simulate atmospheric processes that affect the life cycle of pollution is reviewed. The transport process is considered on three scales (a) the near-source or single-plume dispersion problem, (b) the multiple-source dispersion problem, and (c) the long-range transport. Modeling the first of these is shown to be well within the capability of generally...
User's guide to the Parallel Processing Extension of the Prognosis Model
Nicholas L. Crookston; Albert R. Stage
1991-01-01
The Parallel Processing Extension (PPE) of the Prognosis Model was designed to analyze responses of numerous stands to coordinated management and pest impacts that operate at the landscape level of forests. Vegetation-related resource supply analysis can be readily performed for a thousand or more sample stands for projections 400 years into the future. Capabilities...
ERIC Educational Resources Information Center
Geiger, Vince; Mulligan, Joanne; Date-Huxtable, Liz; Ahlip, Rehez; Jones, D. Heath; May, E. Julian; Rylands, Leanne; Wright, Ian
2018-01-01
In this article we describe and evaluate processes utilized to develop an online learning module on mathematical modelling for pre-service teachers. The module development process involved a range of professionals working within the STEM disciplines including mathematics and science educators, mathematicians, scientists, in-service and pre-service…
Modeling of pulsed propellant reorientation
NASA Technical Reports Server (NTRS)
Patag, A. E.; Hochstein, J. I.; Chato, D. J.
1989-01-01
Optimization of the propellant reorientation process can provide increased payload capability and extend the service life of spacecraft. The use of pulsed propellant reorientation to optimize the reorientation process is proposed. The ECLIPSE code was validated for modeling the reorientation process and is used to study pulsed reorientation in small-scale and full-scale propellant tanks. A dimensional analysis of the process is performed and the resulting dimensionless groups are used to present and correlate the computational predictions for reorientation performance.
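A hedged sketch of the kind of dimensionless groups typically used in such an analysis: a Bond number comparing acceleration-induced to capillary forces and a Weber number comparing inertial to capillary forces. The fluid properties, tank scale, and settling acceleration below are illustrative assumptions, not the values used with ECLIPSE.

```python
# Illustrative cryogenic-propellant-like properties and tank/acceleration scales.
rho = 70.0          # liquid density, kg/m^3
sigma = 2.0e-3      # surface tension, N/m
L = 1.5             # characteristic tank dimension, m
a = 1.0e-3 * 9.81   # settling acceleration from a low-thrust burn, m/s^2
v = 0.05            # characteristic liquid velocity, m/s

bond = rho * a * L**2 / sigma     # acceleration-induced vs. capillary forces
weber = rho * v**2 * L / sigma    # inertial vs. capillary forces

print(f"Bond number  Bo = {bond:.1f}")
print(f"Weber number We = {weber:.1f}")
```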
9th Annual Science and Engineering Technology Conference
2008-04-17
Topics include: disks and composite technology; titanium aluminides (processing, microstructure, properties, curve generator); go-forward integrated materials and process models; and current DPA/T3 initiatives, including atomic layer deposition (ALD) hermetic coatings - domestic ALD for electronic components with transition to the fabrication process, production windows estimated, process capability fully established, production specifications in place, and supply chain established.
Chemical vapor deposition fluid flow simulation modelling tool
NASA Technical Reports Server (NTRS)
Bullister, Edward T.
1992-01-01
Accurate numerical simulation of chemical vapor deposition (CVD) processes requires a general purpose computational fluid dynamics package combined with specialized capabilities for high temperature chemistry. In this report, we describe the implementation of these specialized capabilities in the spectral element code NEKTON. The thermal expansion of the gases involved is shown to be accurately approximated by the low Mach number perturbation expansion of the incompressible Navier-Stokes equations. The radiative heat transfer between multiple interacting radiating surfaces is shown to be tractable using the method of Gebhart. The disparate rates of reaction and diffusion in CVD processes are calculated via a point-implicit time integration scheme. We demonstrate the use of the above capabilities on prototypical CVD applications.
Moment-Based Physical Models of Broadband Clutter due to Aggregations of Fish
2013-09-30
statistical models for signal-processing algorithm development. These in turn will help to develop a capability to statistically forecast the impact of...aggregations of fish based on higher-order statistical measures describable in terms of physical and system parameters. Environmentally, these models...processing. In this experiment, we had good ground truth on (1) and (2), and had control over (3) and (4) except for environmentally-imposed restrictions
Metabolic Modeling of the Last Universal Common Ancestor
NASA Astrophysics Data System (ADS)
Broddrick, J. T.; Yurkovich, J. T.; Palsson, B. O.
2017-07-01
The origin and diversity of life on earth are intimately linked to metabolic processes. Using recent assessments of early metabolic capabilities, we construct a metabolic model of a primordial organism that could be representative of the LUCA.
EPA's Models-3 CMAQ system is intended to provide a community modeling paradigm that allows continuous improvement of the one-atmosphere modeling capability in a unified fashion. CMAQ's modular design promotes incorporation of several sets of science process modules representing ...
A technology path to tactical agent-based modeling
NASA Astrophysics Data System (ADS)
James, Alex; Hanratty, Timothy P.
2017-05-01
Wargaming is a process of thinking through and visualizing events that could occur during a possible course of action. Over the past 200 years, wargaming has matured into a set of formalized processes. One area of growing interest is the application of agent-based modeling. Agent-based modeling and its additional supporting technologies have the potential to introduce a third-generation wargaming capability to the Army, creating a positive overmatch decision-making capability. In its simplest form, agent-based modeling is a computational technique that helps the modeler understand and simulate how the "whole of a system" responds to change over time. It provides a decentralized method of looking at situations where individual agents are instantiated within an environment, interact with each other, and are empowered to make their own decisions. However, this technology is not without its own risks and limitations. This paper explores a technology roadmap, identifying research topics that could realize agent-based modeling within a tactical wargaming context.
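A minimal, hedged sketch of agent-based modeling in the generic sense described above: agents hold a state, observe their neighbors on a grid, and update their own decisions each step. The rules are purely illustrative and are not a wargaming model.

```python
import random

GRID, STEPS = 20, 50
random.seed(0)

# Each agent holds a binary stance; it adopts the majority stance of its 8 neighbors.
state = [[random.randint(0, 1) for _ in range(GRID)] for _ in range(GRID)]

def neighbors(i, j):
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if (di, dj) != (0, 0):
                yield state[(i + di) % GRID][(j + dj) % GRID]

for _ in range(STEPS):
    new_state = [row[:] for row in state]
    for i in range(GRID):
        for j in range(GRID):
            votes = sum(neighbors(i, j))
            if votes > 4:          # majority of the 8 neighbors holds stance 1
                new_state[i][j] = 1
            elif votes < 4:        # majority holds stance 0; ties leave the agent unchanged
                new_state[i][j] = 0
    state = new_state

print("fraction of agents in state 1:", sum(map(sum, state)) / GRID**2)
```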
ModelMate - A graphical user interface for model analysis
Banta, Edward R.
2011-01-01
ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.
NASA Technical Reports Server (NTRS)
Hase, Chris
2010-01-01
In August 2003, the Secretary of Defense (SECDEF) established the Adaptive Planning (AP) initiative [1] with an objective of reducing the time necessary to develop and revise Combatant Commander (COCOM) contingency plans and increasing SECDEF plan visibility. In addition to reducing the traditional plan development timeline from twenty-four months to less than twelve months (with a goal of six months) [2], AP increased plan visibility to Department of Defense (DoD) leadership through In-Progress Reviews (IPRs). The IPR process, as well as the increased number of campaign and contingency plans COCOMs had to develop, increased the workload while the number of planners remained fixed. Several efforts, from collaborative planning tools to streamlined processes, were initiated to compensate for the increased workload, enabling COCOMs to better meet shorter planning timelines. This paper examines the Joint Strategic Capabilities Plan (JSCP) directed contingency planning and staffing requirements assigned to a combatant commander staff through the lens of modeling and simulation. The dynamics of developing a COCOM plan are captured with an ExtendSim [3] simulation. The resulting analysis provides a quantifiable means by which to measure a combatant commander staff's workload associated with developing and staffing JSCP [4] directed contingency plans against COCOM capability/capacity. Modeling and simulation bring significant opportunities in measuring the sensitivity of key variables in the assessment of workload against capability/capacity. Gaining an understanding of the relationship between plan complexity, number of plans, planning processes, and number of planners, and the time required for plan development, provides valuable information to DoD leadership. Through modeling and simulation, AP leadership can gain greater insight for making key decisions on where to best allocate scarce resources in an effort to meet DoD planning objectives.
RFI and SCRIMP Model Development and Verification
NASA Technical Reports Server (NTRS)
Loos, Alfred C.; Sayre, Jay
2000-01-01
Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as in infrastructure. A great deal of work still needs to be done to reduce the costly trial-and-error methods of VARTM processing that are currently in practice today. A computer simulation model of the VARTM process would provide a cost-effective tool for the manufacturing of composites utilizing this technique. Therefore, the objective of this research was to modify an existing three-dimensional Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify this model with the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool that would enable the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible for all cases. Finally, the VARTM model was used as a process analysis tool. This enabled the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeability media. A process for a three-stiffener composite panel was proposed. This configuration evolved from the variation of the process constraints in the modeling of several different composite panels. The configuration was proposed by considering such factors as infiltration time, the number of vacuum ports, and possible areas of void entrapment.
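A hedged back-of-envelope sketch of how capillary pressure shortens a Darcy-flow fill time in one dimension. The permeability, viscosity, and pressures below are illustrative, and the expression is the standard 1-D constant-pressure fill estimate, not the report's three-dimensional model.

```python
# Illustrative 1-D resin infiltration under a constant pressure difference (Darcy flow).
phi = 0.5           # preform porosity
mu = 0.2            # resin viscosity, Pa*s
K = 1.0e-10         # preform permeability, m^2
L = 0.5             # flow length, m
dP = 1.0e5          # applied (vacuum) pressure difference, Pa
Pc = 6.0e3          # assumed capillary pressure aiding the flow, Pa

def fill_time(driving_pressure):
    # t = phi * mu * L^2 / (2 * K * dP) for a 1-D constant-pressure fill.
    return phi * mu * L**2 / (2.0 * K * driving_pressure)

t_no_cap = fill_time(dP)
t_cap = fill_time(dP + Pc)
print(f"fill time without capillary pressure: {t_no_cap/60:.1f} min")
print(f"fill time with capillary pressure:    {t_cap/60:.1f} min "
      f"({100*(t_no_cap - t_cap)/t_no_cap:.1f}% reduction)")
```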
NASA Astrophysics Data System (ADS)
Guan, Mingfu; Ahilan, Sangaralingam; Yu, Dapeng; Peng, Yong; Wright, Nigel
2018-01-01
Fine sediment plays crucial and multiple roles in the hydrological, ecological and geomorphological functioning of river systems. This study employs a two-dimensional (2D) numerical model to track the hydro-morphological processes dominated by fine suspended sediment, including the prediction of sediment concentration in flow bodies, and erosion and deposition caused by sediment transport. The model is governed by the 2D full shallow water equations, with which an advection-diffusion equation for fine sediment is coupled. Bed erosion and sedimentation are updated by a bed deformation model based on local sediment entrainment and settling flux in flow bodies. The model is initially validated with three laboratory-scale experimental events where suspended load plays a dominant role. Satisfactory simulation results confirm the model's capability of capturing hydro-morphodynamic processes dominated by fine suspended sediment at laboratory scale. Applications to sedimentation in a stormwater pond are conducted to develop a process-based understanding of fine sediment dynamics over a variety of flow conditions. Urban flows with 5-year, 30-year and 100-year return periods and the extreme flood event of 2012 are simulated. The modelled results deliver a step change in understanding fine sediment dynamics in stormwater ponds. The model is capable of quantitatively simulating and qualitatively assessing the performance of a stormwater pond in managing urban water quantity and quality.
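A hedged 1-D sketch of the coupled transport and bed-exchange idea: a suspended sediment concentration is advected and diffused, while local entrainment and settling fluxes update both the concentration and the bed. The explicit scheme and all parameters are illustrative, not the 2-D shallow-water model described above.

```python
import numpy as np

# 1-D advection-diffusion of depth-averaged concentration with bed exchange.
nx, dx, dt, nt = 200, 1.0, 0.2, 500
u, D = 0.5, 0.05               # flow velocity (m/s) and diffusivity (m^2/s)
E, w_s = 1e-4, 5e-4            # entrainment flux (kg/m^2/s) and settling velocity (m/s)
h = 1.0                        # flow depth, m

c = np.zeros(nx)               # suspended sediment concentration, kg/m^3
bed = np.ones(nx)              # bed sediment mass per unit area, kg/m^2
c[:10] = 1.0                   # initial upstream sediment pulse

for _ in range(nt):
    adv = -u * (c - np.roll(c, 1)) / dx                            # upwind advection
    dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2     # diffusion
    entrain = np.where(bed > 0.0, E, 0.0)    # erosion only where bed material remains
    settle = w_s * c                          # deposition flux
    c += dt * (adv + dif + (entrain - settle) / h)
    bed += dt * (settle - entrain)
    c = np.clip(c, 0.0, None)

print("max concentration:", c.max(), " net bed change at outlet:", bed[-1] - 1.0)
```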
Development of a New VLBI Data Analysis Software
NASA Technical Reports Server (NTRS)
Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.
2010-01-01
We present an overview of a new VLBI analysis software package under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.
NASA Astrophysics Data System (ADS)
Neill, Aaron; Reaney, Sim
2015-04-01
Fully-distributed, physically-based rainfall-runoff models attempt to capture some of the complexity of the runoff processes that operate within a catchment, and have been used to address a variety of issues including water quality and the effect of climate change on flood frequency. Two key issues are prevalent, however, which call into question the predictive capability of such models. The first is the issue of parameter equifinality, which can be responsible for large amounts of uncertainty. The second is whether such models make the right predictions for the right reasons - are the processes operating within a catchment correctly represented, or do the predictive abilities of these models result only from the calibration process? The use of additional data sources, such as environmental tracers, has been shown to help address both of these issues, by allowing multi-criteria model calibration to be undertaken, and by permitting a greater understanding of the processes operating in a catchment and hence a more thorough evaluation of how well catchment processes are represented in a model. Using discharge and oxygen-18 data sets, the ability of the fully-distributed, physically-based CRUM3 model to represent the runoff processes in three sub-catchments in Cumbria, NW England has been evaluated. These catchments (Morland, Dacre and Pow) are part of the River Eden demonstration test catchment project. The oxygen-18 data set was first used to derive transit-time distributions and mean residence times of water for each of the catchments to gain an integrated overview of the types of processes that were operating. A generalised likelihood uncertainty estimation procedure was then used to calibrate the CRUM3 model for each catchment based on a single discharge data set from each catchment. Transit-time distributions and mean residence times of water obtained from the model using the top 100 behavioural parameter sets for each catchment were then compared to those derived from the oxygen-18 data to see how well the model captured catchment dynamics. The value of incorporating the oxygen-18 data set, as well as discharge data sets from multiple as opposed to single gauging stations in each catchment, into the calibration process to improve the predictive capability of the model was then investigated. This was achieved by assessing how much the identifiability of the model parameters and the ability of the model to represent the runoff processes operating in each catchment improved with the inclusion of the additional data sets, with respect to the likely costs that would be incurred in obtaining the data sets themselves.
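A hedged sketch of the generalised likelihood uncertainty estimation (GLUE) procedure referred to above: parameter sets are sampled, the model (here a placeholder function) is run for each, an informal likelihood such as the Nash-Sutcliffe efficiency is computed against observations, and only sets above a behavioural threshold are retained. The model, parameter ranges, and threshold are all illustrative, not those used with CRUM3.

```python
import numpy as np

rng = np.random.default_rng(42)
obs = np.sin(np.linspace(0, 6, 100)) + 1.5           # placeholder "observed discharge"

def run_model(params):
    """Placeholder rainfall-runoff model: two parameters scaling a response shape."""
    a, b = params
    return a * np.sin(np.linspace(0, 6, 100)) + b

def nse(sim, obs):
    """Nash-Sutcliffe efficiency as an informal likelihood measure."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# GLUE: Monte Carlo sampling, informal likelihood, behavioural threshold.
n_samples, threshold = 5000, 0.7
samples = rng.uniform([0.2, 0.0], [2.0, 3.0], size=(n_samples, 2))
likelihoods = np.array([nse(run_model(p), obs) for p in samples])
behavioural = samples[likelihoods > threshold]

print(f"{len(behavioural)} behavioural parameter sets out of {n_samples}")
print("behavioural parameter ranges:", behavioural.min(0), behavioural.max(0))
```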
Composing Models of Geographic Physical Processes
NASA Astrophysics Data System (ADS)
Hofer, Barbara; Frank, Andrew U.
Processes are central to geographic information science; yet geographic information systems (GIS) lack capabilities to represent process-related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, which is shown by means of an example.
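As a hedged illustration of the "partial difference equation" view of a geographic physical process, the sketch below advances a generic diffusion-like process (for example, spreading of a quantity over a grid) one step at a time with an explicit difference scheme. The process and coefficients are generic examples, not the paper's description language.

```python
import numpy as np

# Diffusion as a partial difference equation on a regular grid:
# the next value of u[i, j] depends on its four neighbours at the current step.
n, steps, k = 50, 200, 0.2    # grid size, time steps, diffusion coefficient (k <= 0.25 for stability)
u = np.zeros((n, n))
u[n // 2, n // 2] = 100.0     # initial point source

for _ in range(steps):
    u[1:-1, 1:-1] += k * (u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:]
                          + u[1:-1, :-2] - 4.0 * u[1:-1, 1:-1])

print("total quantity (approximately conserved):", u.sum())
```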
NASA Astrophysics Data System (ADS)
Lenkiewicz, Przemyslaw; Pereira, Manuela; Freire, Mário M.; Fernandes, José
2013-12-01
In this article, we propose a novel image segmentation method called the whole mesh deformation (WMD) model, which aims at addressing the problems of modern medical imaging. Such problems have arisen from the combination of several factors: (1) significant growth of medical image volume sizes due to the increasing capabilities of medical acquisition devices; (2) the desire to increase the complexity of image processing algorithms in order to explore new functionality; (3) a change in processor development, turning towards multiple processing units instead of growing bus speeds and the number of operations per second of a single processing unit. Our solution is based on the concept of deformable models and is characterized by a very effective and precise segmentation capability. The proposed WMD model uses a volumetric mesh instead of a contour or a surface to represent the segmented shapes of interest, which allows exploiting more information in the image and obtaining results in shorter times, independently of image contents. The model also offers a good ability to handle topology changes and allows effective parallelization of the workflow, which makes it a very good choice for large datasets. We present a precise model description, followed by experiments on artificial images and real medical data.
The ends of uncertainty: Air quality science and planning in Central California
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fine, James
Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990's. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions amongst organizations are diagramed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by their uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly, but were used not purely for their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders instead of managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review of model uncertainty analysis capabilities, I recommend modifying the planning process to allow for the development and incorporation of uncertainty information, while addressing the need for inclusive and meaningful public participation. By documenting an actual air quality planning process these findings provide insights about the potential for using new scientific information and understanding to achieve environmental goals, most notably the analysis of uncertainties in modeling applications. Concurrently, needed uncertainty information is identified and capabilities to produce it are assessed. Practices to facilitate incorporation of uncertainty information are suggested based on research findings, as well as theory from the literatures of the policy sciences, decision sciences, science and technology studies, consensus-based and communicative planning, and modeling.
The Integrated Landscape Modeling partnership - Current status and future directions
Mushet, David M.; Scherff, Eric J.
2016-01-28
The Integrated Landscape Modeling (ILM) partnership is an effort by the U.S. Geological Survey (USGS) and U.S. Department of Agriculture (USDA) to identify, evaluate, and develop models to quantify services derived from ecosystems, with a focus on wetland ecosystems and conservation effects. The ILM partnership uses the Integrated Valuation of Ecosystem Services and Tradeoffs (InVEST) modeling platform to facilitate regional quantifications of ecosystem services under various scenarios of land-cover change that are representative of differing conservation program and practice implementation scenarios. To date, the ILM InVEST partnership has resulted in capabilities to quantify carbon stores, amphibian habitat, plant-community diversity, and pollination services. Work to include waterfowl and grassland bird habitat quality is in progress. Initial InVEST modeling has been focused on the Prairie Pothole Region (PPR) of the United States; future efforts might encompass other regions as data availability and knowledge increase as to how functions affecting ecosystem services differ among regions.The ILM partnership is also developing the capability for field-scale process-based modeling of depressional wetland ecosystems using the Agricultural Policy/Environmental Extender (APEX) model. Progress was made towards the development of techniques to use the APEX model for closed-basin depressional wetlands of the PPR, in addition to the open systems that the model was originally designed to simulate. The ILM partnership has matured to the stage where effects of conservation programs and practices on multiple ecosystem services can now be simulated in selected areas. Future work might include the continued development of modeling capabilities, as well as development and evaluation of differing conservation program and practice scenarios of interest to partner agencies including the USDA’s Farm Service Agency (FSA) and Natural Resources Conservation Service (NRCS). When combined, the ecosystem services modeling capabilities of InVEST and the process-based abilities of the APEX model should provide complementary information needed to meet USDA and the Department of the Interior information needs.
Management Sciences Division Annual Report (9th)
1992-01-01
...Information System (WSMIS). Dyna-METRIC is used for wartime supply support capability assessments. The Aircraft Sustainability Model (ASM) is the
Automated extraction of knowledge for model-based diagnostics
NASA Technical Reports Server (NTRS)
Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.
1990-01-01
The concept of accessing computer-aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.
Fast Segmentation From Blurred Data in 3D Fluorescence Microscopy.
Storath, Martin; Rickert, Dennis; Unser, Michael; Weinmann, Andreas
2017-10-01
We develop a fast algorithm for segmenting 3D images from linear measurements based on the Potts model (or piecewise constant Mumford-Shah model). To that end, we first derive suitable space discretizations of the 3D Potts model, which are capable of dealing with 3D images defined on non-cubic grids. Our discretization allows us to utilize a specific splitting approach, which results in decoupled subproblems of moderate size. The crucial point in the 3D setup is that the number of independent subproblems is so large that we can reasonably exploit the parallel processing capabilities of graphics processing units (GPUs). Our GPU implementation is up to 18 times faster than the sequential CPU version. This allows even large volumes to be processed in acceptable runtimes. As a further contribution, we extend the algorithm to deal with non-negativity constraints. We demonstrate the efficiency of our method for combined image deconvolution and segmentation on simulated data and on real 3D wide-field fluorescence microscopy data.
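The 3D GPU algorithm itself is involved, but the piecewise-constant (Potts) principle can be illustrated in one dimension with a standard dynamic program: find the piecewise-constant signal minimizing squared error plus a jump penalty gamma. This is a generic 1D sketch, not the authors' 3D splitting scheme, and the test signal is synthetic.

```python
import numpy as np

def potts_1d(y, gamma):
    """Exact 1D Potts segmentation: minimize sum (u - y)^2 + gamma * (number of jumps)."""
    n = len(y)
    # Prefix sums allow O(1) evaluation of the squared error of a constant segment.
    s1 = np.concatenate(([0.0], np.cumsum(y)))
    s2 = np.concatenate(([0.0], np.cumsum(y ** 2)))

    def seg_err(l, r):          # best squared error of fitting y[l:r] by its mean
        m, ssum, ssq = r - l, s1[r] - s1[l], s2[r] - s2[l]
        return ssq - ssum ** 2 / m

    best = np.full(n + 1, np.inf)       # best[r] = optimal cost of y[:r]
    best[0] = -gamma                    # so the first segment pays no jump penalty
    last_jump = np.zeros(n + 1, dtype=int)
    for r in range(1, n + 1):
        for l in range(r):
            cost = best[l] + gamma + seg_err(l, r)
            if cost < best[r]:
                best[r], last_jump[r] = cost, l
    # Backtrack segment boundaries and reconstruct the piecewise-constant signal.
    u, r = np.empty(n), n
    while r > 0:
        l = last_jump[r]
        u[l:r] = y[l:r].mean()
        r = l
    return u

y = np.concatenate([np.full(30, 1.0), np.full(40, 4.0), np.full(30, 2.0)])
y += 0.3 * np.random.default_rng(0).standard_normal(y.size)
print(np.round(potts_1d(y, gamma=2.0), 2))
```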
A thermo-chemo-mechanically coupled constitutive model for curing of glassy polymers
NASA Astrophysics Data System (ADS)
Sain, Trisha; Loeffel, Kaspar; Chester, Shawn
2018-07-01
Curing of a polymer is the process through which a polymer liquid transitions into a solid polymer, capable of bearing mechanical loads. The curing process is a coupled thermo-chemo-mechanical conversion process which requires a thorough understanding of the system behavior to predict the cure-dependent mechanical behavior of the solid polymer. In this paper, a thermodynamically consistent, frame-indifferent, thermo-chemo-mechanically coupled continuum-level constitutive framework is proposed for thermally cured glassy polymers. The constitutive framework considers the thermodynamics of chemical reactions, as well as the material behavior for a glassy polymer. A stress-free intermediate configuration is introduced within a finite deformation setting to capture the formation of the network in a stress-free configuration. This work considers a definition for the degree of cure based on the chemistry of the curing reactions. A simplified version of the proposed model has been numerically implemented, and simulations are used to understand the capabilities of the model and framework.
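As a point of reference only (the paper defines its degree of cure from the chemistry of the specific curing reactions), a widely used phenomenological definition takes the degree of cure as the fraction of reaction enthalpy released, evolving with Arrhenius-type kinetics:

    \alpha(t) = \frac{H(t)}{H_{\mathrm{tot}}}, \qquad
    \frac{d\alpha}{dt} = A \exp\!\left(-\frac{E_a}{R T}\right) f(\alpha),

where H(t) is the heat released up to time t, H_tot is the total reaction enthalpy, E_a is an activation energy, and f(α) is a reaction model such as (1 - α)^n.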
Sulis, William H
2017-10-01
Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear, correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.
Hammer, Michael
2007-04-01
Few executives question the idea that by redesigning business processes--work that runs from end to end across an enterprise--they can achieve extraordinary improvements in cost, quality, speed, profitability, and other key areas. Yet in spite of their intentions and investments, many executives flounder, unsure about what exactly needs to be changed, by how much, and when. As a result, many organizations make little progress--if any at all--in their attempts to transform business processes. Michael Hammer has spent the past five years working with a group of leading companies to develop the Process and Enterprise Maturity Model (PEMM), a new framework that helps executives comprehend, formulate, and assess process-based transformation efforts. He has identified two distinct groups of characteristics that are needed for business processes to perform exceptionally well over a long period of time. Process enablers, which affect individual processes, determine how well a process is able to function. They are mutually interdependent--if any are missing, the others will be ineffective. However, enablers are not enough to develop high-performance processes; they only provide the potential to deliver high performance. A company must also possess or establish organizational capabilities that allow the business to offer a supportive environment. Together, the enablers and the capabilities provide an effective way for companies to plan and evaluate process-based transformations. PEMM is different from other frameworks, such as Capability Maturity Model Integration (CMMI), because it applies to all industries and all processes. The author describes how several companies--including Michelin, CSAA, Tetra Pak, Shell, Clorox, and Schneider National--have successfully used PEMM in various ways and at different stages to evaluate the progress of their process-based transformation efforts.
Ball milling: An experimental support to the energy transfer evaluated by the collision model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Magini, M.; Iasonna, A.; Padella, F.
1996-01-01
In recent years several attempts have been made in order to understand the fundamentals of the ball milling process. The aim of these approaches is to establish predictive capabilities for this process, i.e. the possibility of obtaining a given product by suitably choosing the proper milling conditions. Maurice and Courtney have modeled ball milling in a planetary and in a vibratory mill including parameters like impact times, areas of the colliding surfaces (derived from hertzian collision theory), powder strain rates and pressure peak during collision. Burgio et al derived the kinematic equations of a ball moving on a planetary mill and the consequent ball-to-powder energy transfer occurring in a single collision event. The fraction of input energy transferred to the powder was subsequently estimated by an analysis of the collision event. Finally an energy map was constructed which was the basis for a model with predictive capabilities. The aim of the present article is to show that the arguments used to construct the model of the milling process have substantial experimental support.
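In heavily simplified, illustrative terms (the published kinematic models include additional hertzian-contact and mill-geometry terms beyond this), the bookkeeping behind such an energy map can be written as

    \Delta E \approx \tfrac{1}{2}\, m_b v_b^{2}, \qquad
    D(t) \approx \frac{\phi\, f\, \Delta E\, t}{m_p},

where m_b and v_b are the ball mass and impact velocity, φ is the fraction of the impact energy actually transferred to the trapped powder, f is the collision frequency, m_p is the powder charge, and D(t) is the cumulative specific energy dose delivered after milling time t.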
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rutqvist, Jonny; Blanco Martin, Laura; Mukhopadhyay, Sumit
The modeling efforts in support of the field test planning conducted at LBNL leverage recent developments of tools for modeling coupled thermal-hydrological-mechanical-chemical (THMC) processes in salt and their effect on brine migration at high temperatures. This work includes development related to, and implementation of, essential capabilities, as well as testing the model against relevant information and published experimental data related to the fate and transport of water. These are modeling capabilities that will be suitable for assisting in the design of the field experiment, especially related to multiphase flow processes coupled with mechanical deformations at high temperature. In this report, we first examine previous generic repository modeling results, focusing on the first 20 years to investigate the expected evolution of the different processes that could be monitored in a full-scale heater experiment, and then present new results from ongoing modeling of the Thermal Simulation for Drift Emplacement (TSDE) experiment, a heater experiment on the in-drift emplacement concept at the Asse Mine, Germany, and provide an update on the ongoing model developments for modeling brine migration. LBNL also supported field test planning activities via contributions to and technical review of framework documents and test plans, as well as participation in workshops associated with field test planning.
NASA Technical Reports Server (NTRS)
Conway, R.; Matuck, G. N.; Roe, J. M.; Taylor, J.; Turner, A.
1975-01-01
A vortex information display system is described which provides flexible control through system-user interaction for collecting wing-tip-trailing vortex data, processing this data in real time, displaying the processed data, storing raw data on magnetic tape, and post processing raw data. The data is received from two asynchronous laser Doppler velocimeters (LDV's) and includes position, velocity, and intensity information. The raw data is written onto magnetic tape for permanent storage and is also processed in real time to locate vortices and plot their positions as a function of time. The interactive capability enables the user to make real time adjustments in processing data and provides a better definition of vortex behavior. Displaying the vortex information in real time produces a feedback capability to the LDV system operator allowing adjustments to be made in the collection of raw data. Both raw data and processing can be continually upgraded during flyby testing to improve vortex behavior studies. The post-analysis capability permits the analyst to perform in-depth studies of test data and to modify vortex behavior models to improve transport predictions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crabtree, George; Glotzer, Sharon; McCurdy, Bill
This report is based on an SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness.
The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following:
- Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration.
- Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies.
- Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. Similarly challenging is coupling of analytical data from multiple instruments and techniques that are required to link these length and time scales.
- Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex.
- Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure. To validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding. An integrated approach for managing this infrastructure is essential.
- Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.
Hormone Purification by Isoelectric Focusing
NASA Technical Reports Server (NTRS)
Bier, M.
1985-01-01
Various ground-based research approaches are being applied to a more definitive evaluation of the natures and degrees of electroosmosis effects on the separation capabilities of the Isoelectric Focusing (IEF) process. A primary instrumental system for this work involves rotationally stabilized, horizontal electrophoretic columns specially adapted for the IEF process. Representative adaptations include segmentation, baffles/screens, and surface coatings. Comparative performance and development testing are pursued against the type of column or cell established as an engineering model. Previously developed computer simulation capabilities are used to predict low-gravity behavior patterns and performance for IEF apparatus geometries of direct project interest. Three existing mathematical models plus potential new routines for particular aspects of simulating instrument fluid patterns with varied wall electroosmosis influences are being exercised.
NASA Technical Reports Server (NTRS)
Pototzky, Anthony; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek
1991-01-01
Described here are the development and implementation of an on-line, near-real-time controller performance evaluation (CPE) capability. Briefly discussed are the structure of data flow, the signal processing methods used to process the data, and the software developed to generate the transfer functions. This methodology is generic in nature and can be used in any type of multi-input/multi-output (MIMO) digital controller application, including digital flight control systems, digitally controlled spacecraft structures, and actively controlled wind tunnel models. Results of applying the CPE methodology to evaluate (in near real time) MIMO digital flutter suppression systems being tested on the Rockwell Active Flexible Wing (AFW) wind tunnel model are presented to demonstrate the CPE capability.
Aviation System Analysis Capability Executive Assistant Design
NASA Technical Reports Server (NTRS)
Roberts, Eileen; Villani, James A.; Osman, Mohammed; Godso, David; King, Brent; Ricciardi, Michael
1998-01-01
In this technical document, we describe the design developed for the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC). We describe the genesis and role of the ASAC system, discuss the objectives of the ASAC system and provide an overview of components and models within the ASAC system, and describe the design process and the results of the ASAC EA POC system design. We also describe the evaluation process and results for applicable COTS software. The document has six chapters, a bibliography, three appendices and one attachment.
Workshop on the Thermophysical Properties of Molten Materials
NASA Technical Reports Server (NTRS)
1993-01-01
The role of accurate thermophysical property data in the process design and modeling of solidification processes was the subject of a workshop held on 22-23 Oct. 1992 in Cleveland, Ohio. The workshop was divided into three sequential sessions dealing with (1) industrial needs and priorities for thermophysical data, (2) experimental capabilities for measuring the necessary data, and (3) theoretical capabilities for predicting the necessary data. In addition, a 2-hour panel discussion of the salient issues was featured as well as a 2-hour caucus that assessed priorities and identified action plans.
Aviation System Analysis Capability Executive Assistant Development
NASA Technical Reports Server (NTRS)
Roberts, Eileen; Villani, James A.; Anderson, Kevin; Book, Paul
1999-01-01
In this technical document, we describe the development of the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC) and Beta version. We describe the genesis and role of the ASAC system, discuss the objectives of the ASAC system and provide an overview of components and models in the ASAC system, and describe the design process and the results of the ASAC EA POC and Beta system development. We also describe the evaluation process and results for applicable COTS software. The document has seven chapters, a bibliography, and two appendices.
Seismographs, sensors, and satellites: Better technology for safer communities
Groat, C.G.
2004-01-01
In the past 25 years, our ability to measure, monitor, and model the processes that lead to natural disasters has increased dramatically. Equally important has been the improvement in our technological capability to communicate information about hazards to those whose lives may be affected. These innovations in tracking and communicating the changes-floods, earthquakes, wildfires, volcanic eruptions-in our dynamic planet, supported by a deeper understanding of earth processes, enable us to expand our predictive capabilities and point the way to a safer future. © 2004 Elsevier Ltd. All rights reserved.
Error Detection Processes during Observational Learning
ERIC Educational Resources Information Center
Badets, Arnaud; Blandin, Yannick; Wright, David L.; Shea, Charles H.
2006-01-01
The purpose of this experiment was to determine whether a faded knowledge of results (KR) frequency during observation of a model's performance enhanced error detection capabilities. During the observation phase, participants observed a model performing a timing task and received KR about the model's performance on each trial or on one of two…
ERIC Educational Resources Information Center
Marceau, Kristine; Ram, Nilam; Houts, Renate M.; Grimm, Kevin J.; Susman, Elizabeth J.
2011-01-01
Pubertal development is a nonlinear process progressing from prepubescent beginnings through biological, physical, and psychological changes to full sexual maturity. To tether theoretical concepts of puberty with sophisticated longitudinal, analytical models capable of articulating pubertal development more accurately, we used nonlinear…
Rational Approximations to Rational Models: Alternative Algorithms for Category Learning
ERIC Educational Resources Information Center
Sanborn, Adam N.; Griffiths, Thomas L.; Navarro, Daniel J.
2010-01-01
Rational models of cognition typically consider the abstract computational problems posed by the environment, assuming that people are capable of optimally solving those problems. This differs from more traditional formal models of cognition, which focus on the psychological processes responsible for behavior. A basic challenge for rational models…
ERIC Educational Resources Information Center
Hannon, Cliona; Faas, Daniel; O'Sullivan, Katriona
2017-01-01
Widening participation programmes aim to increase the progression of students from low socio-economic status (SES) groups to higher education. This research proposes that the human capabilities approach is a good justice-based framework within which to consider the social and cultural capital processes that impact upon the educational capabilities…
Integrated Process Modeling-A Process Validation Life Cycle Companion.
Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph
2017-10-17
During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we want to present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore makes it possible to anticipate out-of-specification (OOS) events, identify critical process parameters, and take risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
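A minimal sketch of the Monte Carlo idea behind an integrated process model is shown below. The distributions, transfer functions, and specification limit are hypothetical placeholders invented for illustration, not values or relationships from the study; the point is only how parameter variation is propagated through stacked unit operations to an estimated out-of-specification probability.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Hypothetical process-parameter distributions (illustrative only).
    load_density = rng.normal(30.0, 2.0, n)      # unit operation 1 parameter
    elution_ph   = rng.normal(7.2, 0.1, n)       # unit operation 2 parameter

    # Hypothetical unit-operation transfer functions linking the PPs to an
    # intermediate attribute and then to the final CQA (e.g., an impurity level).
    intermediate = 5.0 + 0.08 * (load_density - 30.0) + rng.normal(0, 0.2, n)
    final_cqa    = intermediate - 1.5 * (elution_ph - 7.0) + rng.normal(0, 0.3, n)

    # Process capability: fraction of simulated batches outside specification.
    usl = 6.0  # hypothetical upper specification limit for the CQA
    p_oos = np.mean(final_cqa > usl)
    print(f"Estimated out-of-specification probability: {p_oos:.4f}")

In a real IPM the transfer functions would come from fitted unit-operation models, and the simulated OOS probability would feed the risk and criticality assessments described above.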
Summary of the key features of seven biomathematical models of human fatigue and performance.
Mallis, Melissa M; Mejdal, Sig; Nguyen, Tammy T; Dinges, David F
2004-03-01
Biomathematical models that quantify the effects of circadian and sleep/wake processes on the regulation of alertness and performance have been developed in an effort to predict the magnitude and timing of fatigue-related responses in a variety of contexts (e.g., transmeridian travel, sustained operations, shift work). This paper summarizes key features of seven biomathematical models reviewed as part of the Fatigue and Performance Modeling Workshop held in Seattle, WA, on June 13-14, 2002. The Workshop was jointly sponsored by the National Aeronautics and Space Administration, U.S. Department of Defense, U.S. Army Medical Research and Materiel Command, Office of Naval Research, Air Force Office of Scientific Research, and U.S. Department of Transportation. An invitation was sent to developers of seven biomathematical models that were commonly cited in scientific literature and/or supported by government funding. On acceptance of the invitation to attend the Workshop, developers were asked to complete a survey of the goals, capabilities, inputs, and outputs of their biomathematical models of alertness and performance. Data from the completed surveys were summarized and juxtaposed to provide a framework for comparing features of the seven models. Survey responses revealed that models varied greatly relative to their reported goals and capabilities. While all modelers reported that circadian factors were key components of their capabilities, they differed markedly with regard to the roles of sleep and work times as input factors for prediction: four of the seven models had work time as their sole input variable(s), while the other three models relied on various aspects of sleep timing for model input. Models also differed relative to outputs: five sought to predict results from laboratory experiments, field, and operational data, while two models were developed without regard to predicting laboratory experimental results. All modelers provided published papers describing their models, with three of the models being proprietary. Although all models appear to have been fundamentally influenced by the two-process model of sleep regulation by Borbély, there is considerable diversity among them in the number and type of input and output variables, and their stated goals and capabilities.
Summary of the key features of seven biomathematical models of human fatigue and performance
NASA Technical Reports Server (NTRS)
Mallis, Melissa M.; Mejdal, Sig; Nguyen, Tammy T.; Dinges, David F.
2004-01-01
BACKGROUND: Biomathematical models that quantify the effects of circadian and sleep/wake processes on the regulation of alertness and performance have been developed in an effort to predict the magnitude and timing of fatigue-related responses in a variety of contexts (e.g., transmeridian travel, sustained operations, shift work). This paper summarizes key features of seven biomathematical models reviewed as part of the Fatigue and Performance Modeling Workshop held in Seattle, WA, on June 13-14, 2002. The Workshop was jointly sponsored by the National Aeronautics and Space Administration, U.S. Department of Defense, U.S. Army Medical Research and Materiel Command, Office of Naval Research, Air Force Office of Scientific Research, and U.S. Department of Transportation. METHODS: An invitation was sent to developers of seven biomathematical models that were commonly cited in scientific literature and/or supported by government funding. On acceptance of the invitation to attend the Workshop, developers were asked to complete a survey of the goals, capabilities, inputs, and outputs of their biomathematical models of alertness and performance. Data from the completed surveys were summarized and juxtaposed to provide a framework for comparing features of the seven models. RESULTS: Survey responses revealed that models varied greatly relative to their reported goals and capabilities. While all modelers reported that circadian factors were key components of their capabilities, they differed markedly with regard to the roles of sleep and work times as input factors for prediction: four of the seven models had work time as their sole input variable(s), while the other three models relied on various aspects of sleep timing for model input. Models also differed relative to outputs: five sought to predict results from laboratory experiments, field, and operational data, while two models were developed without regard to predicting laboratory experimental results. All modelers provided published papers describing their models, with three of the models being proprietary. CONCLUSIONS: Although all models appear to have been fundamentally influenced by the two-process model of sleep regulation by Borbely, there is considerable diversity among them in the number and type of input and output variables, and their stated goals and capabilities.
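For orientation, the Borbély two-process framework that the surveyed models build on is often written, in a simplified textbook form rather than any model's proprietary parameterization, as a homeostatic sleep-pressure process S interacting with a circadian process C:

    S_{\mathrm{wake}}(t) = 1 - (1 - S_0)\, e^{-t/\tau_r}, \qquad
    S_{\mathrm{sleep}}(t) = S_0\, e^{-t/\tau_d}, \qquad
    C(t) = \sin\!\left(\frac{2\pi (t - \phi)}{24}\right),

where S rises toward saturation during wakefulness and decays during sleep with time constants τ_r and τ_d, φ is the circadian phase, and predicted alertness is often taken as a weighted difference of the two processes, e.g. A(t) ∝ C(t) - S(t).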
Enhancing GIS Capabilities for High Resolution Earth Science Grids
NASA Astrophysics Data System (ADS)
Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.
2017-12-01
Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. In addition, we'll provide a general overview of OCGIS's parallel subsetting capabilities including challenges in the design and implementation of a scientific data subsetter.
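The chunked-subsetting strategy can be illustrated with a toy example: process a high-resolution field in horizontal strips so that only one strip is ever resident in memory at a time. The block averaging below is a deliberately simple stand-in for a conservative regrid of a tile, and the tiling loop a stand-in for spatial subsetting; it is not the actual ESMF or OCGIS API.

    import numpy as np

    def block_average_tile(src_tile, factor):
        """Coarsen one tile of a high-resolution grid by block averaging
        (a simple stand-in for conservatively regridding that tile)."""
        ny, nx = src_tile.shape
        return src_tile.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

    def chunked_regrid(src, factor, tile_rows):
        """Process the source grid in horizontal strips so peak memory scales
        with one strip rather than the full high-resolution grid."""
        out = []
        for r0 in range(0, src.shape[0], tile_rows):
            out.append(block_average_tile(src[r0:r0 + tile_rows, :], factor))
        return np.vstack(out)

    # Illustrative use: a synthetic 4800 x 9600 "high resolution" field coarsened 4x.
    src = np.random.rand(4800, 9600).astype(np.float32)
    coarse = chunked_regrid(src, factor=4, tile_rows=480)
    print(coarse.shape)  # (1200, 2400)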
NASA Technical Reports Server (NTRS)
Johnson, Donald R.
2001-01-01
This research was directed to the development and application of global isentropic modeling and analysis capabilities to describe hydrologic processes and energy exchange in the climate system, and discern regional climate change. An additional objective was to investigate the accuracy and theoretical limits of global climate predictability which are imposed by the inherent limitations of simulating trace constituent transport and the hydrologic processes of condensation, precipitation and cloud life cycles.
Capability maturity models for offshore organisational management.
Strutt, J E; Sharp, J V; Terry, E; Miles, R
2006-12-01
The goal setting regime imposed by the UK safety regulator has important implications for an organisation's ability to manage health and safety related risks. Existing approaches to safety assurance based on risk analysis and formal safety assessments are increasingly considered unlikely to create the step change improvement in safety to which the offshore industry aspires and alternative approaches are being considered. One approach, which addresses the important issue of organisational behaviour and which can be applied at a very early stage of design, is the capability maturity model (CMM). The paper describes the development of a design safety capability maturity model, outlining the key processes considered necessary to safety achievement, definition of maturity levels and scoring methods. The paper discusses how CMM is related to regulatory mechanisms and risk based decision making together with the potential of CMM to environmental risk management.
NASA Astrophysics Data System (ADS)
Xu, Fei; Zhang, Yaning; Jin, Guangri; Li, Bingxi; Kim, Yong-Song; Xie, Gongnan; Fu, Zhongbin
2018-04-01
A three-phase model capable of predicting the heat transfer and moisture migration for the soil freezing process was developed based on the Shen-Chen model and the mechanisms of heat and mass transfer in unsaturated soil freezing. The pre-melted film was taken into consideration, and the relationship between film thickness and soil temperature was used to calculate the liquid water fraction in both the frozen zone and the freezing fringe. The force that causes the moisture migration was calculated by the sum of several interactive forces, and the suction in the pre-melted film was regarded as an interactive force between ice and water. Two kinds of resistance were regarded as a kind of body force related to the water films between the ice grains and soil grains, and a block force instead of gravity was introduced to keep balance with gravity before soil freezing. The lattice Boltzmann method was used in the simulation, and the input variables for the simulation included the size of the computational domain, obstacle fraction, liquid water fraction, air fraction and soil porosity. The model is capable of predicting the water content distribution along soil depth and variations in water content and temperature during the soil freezing process.
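For reference, the pseudopotential (Shan-Chen-type) interaction force that underlies this class of lattice Boltzmann models is commonly written in the generic form below; the study adds further interactive, resistance, and block forces as described above, so this is an orientation rather than the model's full force balance:

    \mathbf{F}(\mathbf{x}) = -G\, \psi(\mathbf{x}) \sum_{i} w_i\, \psi(\mathbf{x} + \mathbf{e}_i \Delta t)\, \mathbf{e}_i,

where ψ is the pseudopotential (an effective density), G sets the interaction strength, and the sum runs over the discrete lattice velocities e_i with weights w_i.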
Micromechanics Analysis Code Post-Processing (MACPOST) User Guide. 1.0
NASA Technical Reports Server (NTRS)
Goldberg, Robert K.; Comiskey, Michele D.; Bednarcyk, Brett A.
1999-01-01
As advanced composite materials have gained wider usage, the need for analytical models and computer codes to predict the thermomechanical deformation response of these materials has increased significantly. Recently, a micromechanics technique called the generalized method of cells (GMC) has been developed, which has the capability to fulfill this goal. To provide a framework for GMC, the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) has been developed. As MAC/GMC has been updated, significant improvements have been made to the post-processing capabilities of the code. Through the MACPOST program, which operates directly within the MSC/PATRAN graphical pre- and post-processing package, a direct link between the analysis capabilities of MAC/GMC and the post-processing capabilities of MSC/PATRAN has been established. MACPOST has simplified the production, printing, and exportation of results for unit cells analyzed by MAC/GMC. MACPOST allows different micro-level quantities to be plotted quickly and easily in contour plots. In addition, meaningful data for X-Y plots can be examined. MACPOST thus serves as an important analysis and visualization tool for the macro- and micro-level data generated by MAC/GMC. This report serves as the user's manual for the MACPOST program.
High-Resolution Characterization of UMo Alloy Microstructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Devaraj, Arun; Kovarik, Libor; Joshi, Vineet V.
2016-11-30
This report highlights the capabilities and procedure for high-resolution characterization of UMo fuels in PNNL. Uranium-molybdenum (UMo) fuel processing steps, from casting to forming final fuel, directly affect the microstructure of the fuel, which in turn dictates the in-reactor performance of the fuel under irradiation. In order to understand the influence of processing on UMo microstructure, microstructure characterization techniques are necessary. Higher-resolution characterization techniques like transmission electron microscopy (TEM) and atom probe tomography (APT) are needed to interrogate the details of the microstructure. The findings from TEM and APT are also directly beneficial for developing predictive multiscale modeling tools that can predict the microstructure as a function of process parameters. This report provides background on focused-ion-beam-based TEM and APT sample preparation, TEM and APT analysis procedures, and the unique information achievable through such advanced characterization capabilities for UMo fuels, from a fuel fabrication capability viewpoint.
NASA Astrophysics Data System (ADS)
Biermann, D.; Gausemeier, J.; Heim, H.-P.; Hess, S.; Petersen, M.; Ries, A.; Wagner, T.
2014-05-01
In this contribution a framework for the computer-aided planning and optimisation of functionally graded components is presented. The framework is divided into three modules - the "Component Description", the "Expert System" for the synthetisation of several process chains and the "Modelling and Process Chain Optimisation". The Component Description module enhances a standard computer-aided design (CAD) model by a voxel-based representation of the graded properties. The Expert System synthesises process steps stored in the knowledge base to generate several alternative process chains. Each process chain is capable of producing components according to the enhanced CAD model and usually consists of a sequence of heating, cooling, and forming processes. The dependencies between the component and the applied manufacturing processes as well as between the processes themselves need to be considered. The Expert System utilises an ontology for that purpose. The ontology represents all dependencies in a structured way and connects the information of the knowledge base via relations. The third module performs the evaluation of the generated process chains. To accomplish this, the parameters of each process are optimised with respect to the component specification, whereby the result of the best parameterisation is used as the representative value. Finally, the process chain which is capable of manufacturing a functionally graded component in an optimal way with regard to the property distributions of the component description is presented by means of a dedicated specification technique.
Readability and Recall of Short Prose Passages: A Theoretical Analysis.
ERIC Educational Resources Information Center
Miller, James R.; Kintsch, Walter
1980-01-01
To support the view of readability as an interaction between a text and the reader's prose-processing capabilities, this article applies an extended and formalized version of the Kintsch and van Dijk prose-processing model to 20 texts of varying readability. (Author/GSK)
76 FR 63613 - Environmental Management Site-Specific Advisory Board, Hanford
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-13
... membership reappointment process. DOE Presentation: Tank Vapor Monitoring. Pacific Northwest National Laboratory Presentation: Advanced Simulation Capability for EM/Groundwater Modeling. Board Business: ...
Development of constraint-based system-level models of microbial metabolism.
Navid, Ali
2012-01-01
Genome-scale models of metabolism are valuable tools for using genomic information to predict microbial phenotypes. System-level mathematical models of metabolic networks have been developed for a number of microbes and have been used to gain new insights into the biochemical conversions that occur within organisms and permit their survival and proliferation. Utilizing these models, computational biologists can (1) examine network structures, (2) predict metabolic capabilities and resolve unexplained experimental observations, (3) generate and test new hypotheses, (4) assess the nutritional requirements of the organism and approximate its environmental niche, (5) identify missing enzymatic functions in the annotated genome, and (6) engineer desired metabolic capabilities in model organisms. This chapter details the protocol for developing genome-scale models of metabolism in microbes as well as tips for accelerating the model building process.
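To make the constraint-based formulation concrete, here is a self-contained toy flux balance analysis: a three-reaction network solved as a linear program with SciPy. The network, bounds, and objective are invented for illustration and are not taken from any published genome-scale model.

    import numpy as np
    from scipy.optimize import linprog

    # Toy network with one metabolite A:
    #   R1: uptake -> A,   R2: A -> biomass,   R3: A -> byproduct
    # Steady state requires S @ v = 0 for the stoichiometric matrix S.
    S = np.array([[1.0, -1.0, -1.0]])          # rows: metabolites, cols: reactions
    bounds = [(0, 10), (0, None), (0, None)]   # uptake flux capped at 10 units

    # Maximize the biomass flux v2 (linprog minimizes, so negate the objective).
    c = np.array([0.0, -1.0, 0.0])
    res = linprog(c, A_eq=S, b_eq=np.zeros(1), bounds=bounds, method="highs")

    print("optimal fluxes:", res.x)            # expected: [10, 10, 0]
    print("max biomass flux:", -res.fun)

Real genome-scale models differ only in scale: thousands of reactions and metabolites, the same steady-state equality constraints, flux bounds, and a linear objective.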
Advanced sensor-simulation capability
NASA Astrophysics Data System (ADS)
Cota, Stephen A.; Kalman, Linda S.; Keller, Robert A.
1990-09-01
This paper provides an overview of an advanced simulation capability currently in use for analyzing visible and infrared sensor systems. The software system, called VISTAS (VISIBLE/INFRARED SENSOR TRADES, ANALYSES, AND SIMULATIONS) combines classical image processing techniques with detailed sensor models to produce static and time dependent simulations of a variety of sensor systems including imaging, tracking, and point target detection systems. Systems modelled to date include space-based scanning line-array sensors as well as staring 2-dimensional array sensors which can be used for either imaging or point source detection.
Waves at Navigation Structures
2014-10-27
upgrades the Coastal Modeling System's (CMS) wave model CMS-Wave, a phase-averaged spectral wave model, and BOUSS-2D, a Boussinesq-type nonlinear wave...nearshore wave processes in practical applications. These capabilities facilitate optimization of innovative infrastructure for navigation systems to...navigation systems. The advanced models develop probabilistic engineering design estimates for rehabilitation of coastal structures to evaluate the
Learning the Norm of Internality: NetNorm, a Connectionist Model
ERIC Educational Resources Information Center
Thierry, Bollon; Adeline, Paignon; Pascal, Pansu
2011-01-01
The objective of the present article is to show that connectionist simulations can be used to model some of the socio-cognitive processes underlying the learning of the norm of internality. For our simulations, we developed a connectionist model which we called NetNorm (based on Dual-Network formalism). This model is capable of simulating the…
Model-Based Infrared Metrology for Advanced Technology Nodes and 300 mm Wafer Processing
NASA Astrophysics Data System (ADS)
Rosenthal, Peter A.; Duran, Carlos; Tower, Josh; Mazurenko, Alex; Mantz, Ulrich; Weidner, Peter; Kasic, Alexander
2005-09-01
The use of infrared spectroscopy for production semiconductor process monitoring has evolved recently from primarily unpatterned, i.e. blanket test wafer measurements in a limited historical application space of blanket epitaxial, BPSG, and FSG layers to new applications involving patterned product wafer measurements, and new measurement capabilities. Over the last several years, the semiconductor industry has adopted a new set of materials associated with copper/low-k interconnects, and new structures incorporating exotic materials including silicon germanium, SOI substrates and high aspect ratio trenches. The new device architectures and more chemically sophisticated materials have raised new process control and metrology challenges that are not addressed by current measurement technology. To address the challenges we have developed a new infrared metrology tool designed for emerging semiconductor production processes, in a package compatible with modern production and R&D environments. The tool incorporates recent advances in reflectance instrumentation including highly accurate signal processing, optimized reflectometry optics, and model-based calibration and analysis algorithms. To meet the production requirements of the modern automated fab, the measurement hardware has been integrated with a fully automated 300 mm platform incorporating front opening unified pod (FOUP) interfaces, automated pattern recognition and high throughput ultra clean robotics. The tool employs a suite of automated dispersion-model analysis algorithms capable of extracting a variety of layer properties from measured spectra. The new tool provides excellent measurement precision, tool matching, and a platform for deploying many new production and development applications. In this paper we will explore the use of model based infrared analysis as a tool for characterizing novel bottle capacitor structures employed in high density dynamic random access memory (DRAM) chips. We will explore the capability of the tool for characterizing multiple geometric parameters associated with the manufacturing process that are important to the yield and performance of advanced bottle DRAM devices.
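For orientation, the kind of thin-film optics that a model-based (dispersion-model) analysis inverts is illustrated by the standard single-layer reflectance result below; production film stacks and bottle structures involve many more layers and geometric parameters than this idealization.

    r = \frac{r_{01} + r_{12}\, e^{-2 i \beta}}{1 + r_{01} r_{12}\, e^{-2 i \beta}}, \qquad
    \beta = \frac{2\pi\, \tilde{n}_1 d \cos\theta_1}{\lambda}, \qquad R = |r|^{2},

where r_{01} and r_{12} are the Fresnel coefficients at the ambient/film and film/substrate interfaces, ñ_1 and d are the film's complex refractive index and thickness, and the measured reflectance spectrum R(λ) is fit to recover the thickness and dispersion parameters.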
Liu, Jessica; Oakley, Clyde; Shandas, Robin
2009-01-01
The objective of this work is to construct capacitive micromachined ultrasound transducers (cMUTs) using the multi-user MEMS (microelectromechanical systems) process (MUMPs) and to analyze the capability of this process relative to the customized processes commonly in use. The MUMPs process has the advantages of low cost and accessibility to general users, since it is not necessary to have access to customized fabrication capability such as wafer-bonding and sacrificial release processes. While other researchers have reported fabricating cMUTs using the MUMPs process, none has reported the limitations in the process that arise due to the use of standard design rules that place limitations on the material thicknesses, gap thicknesses, and materials that may be used. In this paper we explain these limitations and analyze the capabilities using 1D modeling, finite element analysis, and experimental devices. We show that one of the limitations is that collapse voltage and center frequency cannot be controlled independently. However, center frequencies up to 9 MHz can be achieved with collapse voltages of less than 200 volts, making such devices suitable for medical and non-destructive evaluation imaging applications. Since the membrane and base electrodes are made of polysilicon, there is a larger series resistance than that resulting from processes that use metal electrodes. We show that the series resistance is not a significant problem. The conductive polysilicon can also destroy the cMUT if the top membrane is pulled in to the bottom. As a solution we propose the application of an additional dielectric layer. Finally we demonstrate a device built with a novel beam construction that produces a transmitted pressure pulse into air with 6% bandwidth and agrees reasonably well with the 1D model. We conclude that cMUTs made with the MUMPs process have some limitations that are not present in customized processes. However, these limitations may be overcome with the proper design considerations that we have presented, putting a low cost, highly accessible means of making cMUT devices into the hands of academic and industrial researchers. PMID:19640557
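The coupling between collapse voltage and center frequency noted in the abstract can be seen from the standard lumped parallel-plate pull-in approximation, given here as a first-order idealization rather than the paper's full membrane model:

    V_{\mathrm{collapse}} = \sqrt{\frac{8\, k\, g_0^{3}}{27\, \varepsilon_0 A}}, \qquad
    f_0 \propto \sqrt{\frac{k}{m}},

where k is the effective membrane stiffness, g_0 the electrode gap, A the electrode area, and m the effective moving mass; because fixed design-rule layer and gap thicknesses tie k and g_0 together, the collapse voltage and center frequency cannot be tuned independently.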
A Context-Aware Model to Provide Positioning in Disaster Relief Scenarios
Moreno, Daniel; Ochoa, Sergio F.; Meseguer, Roc
2015-01-01
The effectiveness of the work performed during disaster relief efforts is highly dependent on the coordination of activities conducted by the first responders deployed in the affected area. Such coordination, in turn, depends on an appropriate management of geo-referenced information. Therefore, enabling first responders to count on positioning capabilities during these activities is vital to increase the effectiveness of the response process. The positioning methods used in this scenario must assume a lack of infrastructure-based communication and electrical energy, which usually characterizes affected areas. Although positioning systems such as the Global Positioning System (GPS) have been shown to be useful, we cannot assume that all devices deployed in the area (or most of them) will have positioning capabilities by themselves. Typically, many first responders carry devices that are not capable of performing positioning on their own, but that require such a service. In order to help increase the positioning capability of first responders in disaster-affected areas, this paper presents a context-aware positioning model that allows mobile devices to estimate their position based on information gathered from their surroundings. The performance of the proposed model was evaluated using simulations, and the obtained results show that mobile devices without positioning capabilities were able to use the model to estimate their position. Moreover, the accuracy of the positioning model has been shown to be suitable for conducting most first response activities. PMID:26437406
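As a deliberately simple illustration of cooperative, infrastructure-free positioning (not the paper's actual context-aware model), a device without GPS can form a weighted-centroid estimate from nearby devices that do report positions; the RSSI-based weighting below is a crude, hypothetical proximity proxy.

    def estimate_position(neighbors):
        """Weighted centroid from neighbors = [(x, y, rssi_dbm), ...].
        Stronger (less negative) RSSI is treated as closer and weighted more."""
        weights = [10 ** (rssi / 20.0) for _, _, rssi in neighbors]  # crude proximity proxy
        total = sum(weights)
        x = sum(w * nx for w, (nx, _, _) in zip(weights, neighbors)) / total
        y = sum(w * ny for w, (_, ny, _) in zip(weights, neighbors)) / total
        return x, y

    # Three first responders with GPS fixes heard by a device without one.
    print(estimate_position([(0.0, 0.0, -40), (100.0, 0.0, -70), (0.0, 80.0, -60)]))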
Ahluwalia, Sangeeta C; Harris, Benjamin J; Lewis, Valerie A; Colla, Carrie H
2018-06-01
To measure the extent to which accountable care organizations (ACOs) have adopted end-of-life (EOL) care planning processes and characterize those ACOs that have established processes related to EOL. This study uses data from three waves (2012-2015) of the National Survey of ACOs. Respondents were 397 ACOs participating in Medicare, Medicaid, and commercial ACO contracts. This is a cross-sectional survey study using multivariate ordered logit regression models. We measured the extent to which the ACO had adopted EOL care planning processes as well as organizational characteristics, including care management, utilization management, health informatics, and shared decision-making capabilities, palliative care, and patient-centered medical home experience. Twenty-one percent of ACOs had few or no EOL care planning processes, 60 percent had some processes, and 19.6 percent had advanced processes. ACOs with a hospital in their system (OR: 3.07; p = .01), and ACOs with advanced care management (OR: 1.43; p = .02), utilization management (OR: 1.58, p = .00), and shared decision-making capabilities (OR: 16.3, p = .000) were more likely to have EOL care planning processes than those with no hospital or few to no capabilities. There remains considerable room for today's ACOs to increase uptake of EOL care planning, possibly by leveraging existing care management, utilization management, and shared decision-making processes. © Health Research and Educational Trust.
The composite load spectra project
NASA Technical Reports Server (NTRS)
Newell, J. F.; Ho, H.; Kurth, R. E.
1990-01-01
Probabilistic methods and generic load models capable of simulating the load spectra that are induced in space propulsion system components are being developed. Four engine component types (the transfer ducts, the turbine blades, the liquid oxygen posts and the turbopump oxidizer discharge duct) were selected as representative hardware examples. The composite load spectra that simulate the probabilistic loads for these components are typically used as the input loads for a probabilistic structural analysis. The knowledge-based system approach used for the composite load spectra project provides an ideal environment for incremental development. The intelligent database paradigm employed in developing the expert system provides a smooth coupling between the numerical processing and the symbolic (information) processing. Large volumes of engine load information and engineering data are stored in database format and managed by a database management system. Numerical procedures for probabilistic load simulation and database management functions are controlled by rule modules. Rules were hard-wired as decision trees into rule modules to perform process control tasks. There are modules to retrieve load information and models. There are modules to select loads and models to carry out quick load calculations or make an input file for full duty-cycle time dependent load simulation. The composite load spectra expert system implemented to date is capable of performing intelligent rocket engine load spectra simulation. Further development of the expert system will provide tutorial capability for users to learn from it.
Testing Strategies for Model-Based Development
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.
2006-01-01
This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.
NASA Astrophysics Data System (ADS)
Pham, Binh Thai; Tien Bui, Dieu; Pourghasemi, Hamid Reza; Indra, Prakash; Dholakia, M. B.
2017-04-01
The objective of this study is to make a comparison of the prediction performance of three techniques, Functional Trees (FT), Multilayer Perceptron Neural Networks (MLP Neural Nets), and Naïve Bayes (NB), for landslide susceptibility assessment at the Uttarakhand Area (India). Firstly, a landslide inventory map with 430 landslide locations in the study area was constructed from various sources. Landslide locations were then randomly split into two parts: (i) 70% of the landslide locations were used for training the models and (ii) 30% were employed for the validation process. Secondly, a total of eleven landslide conditioning factors including slope angle, slope aspect, elevation, curvature, lithology, soil, land cover, distance to roads, distance to lineaments, distance to rivers, and rainfall were used in the analysis to elucidate the spatial relationship between these factors and landslide occurrences. Feature selection with the Linear Support Vector Machine (LSVM) algorithm was employed to assess the prediction capability of these conditioning factors on landslide models. Subsequently, the NB, MLP Neural Nets, and FT models were constructed using the training dataset. Finally, success-rate and prediction-rate curves were employed to validate and compare the predictive capability of the three models. Overall, all three models performed very well for landslide susceptibility assessment. Out of these models, the MLP Neural Nets and the FT models had almost the same predictive capability, whereas the MLP Neural Nets (AUC = 0.850) was slightly better than the FT model (AUC = 0.849). The NB model (AUC = 0.838) had the lowest predictive capability compared to the other models. Landslide susceptibility maps were then developed using these three models. These maps would be helpful to planners and engineers for development activities and land-use planning.
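The success-rate and prediction-rate comparison reported above reduces to computing an AUC against the training and validation landslide inventories; the sketch below uses scikit-learn on synthetic labels and scores purely to show the mechanics, not the study's data.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)

    # Hypothetical susceptibility scores produced by a fitted model.
    y_train = rng.integers(0, 2, 300)                    # 1 = landslide, 0 = stable
    score_train = y_train * 0.60 + rng.random(300) * 0.6  # scores correlated with truth
    y_valid = rng.integers(0, 2, 130)
    score_valid = y_valid * 0.55 + rng.random(130) * 0.6

    print("success rate AUC   :", roc_auc_score(y_train, score_train))
    print("prediction rate AUC:", roc_auc_score(y_valid, score_valid))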
Photoresist thin-film effects on alignment process capability
NASA Astrophysics Data System (ADS)
Flores, Gary E.; Flack, Warren W.
1993-08-01
Two photoresists were selected for alignment characterization based on their dissimilar coating properties and observed differences in alignment capability. The materials are Dynachem OFPR-800 and Shipley System 8. Both photoresists were examined on two challenging alignment levels in a submicron CMOS process, a nitride level and a planarized second-level metal. An Ultratech Stepper model 1500, which features a darkfield alignment system with a broadband green light for alignment signal detection, was used for this project. Initially, statistically designed linear screening experiments were performed to examine six process factors for each photoresist: viscosity, spin acceleration, spin speed, spin time, softbake time, and softbake temperature. Using the results derived from the screening experiments, a more thorough examination of the statistically significant process factors was performed. A full quadratic experimental design was conducted to examine the effects of viscosity, spin speed, and spin time coating properties on alignment. This included a characterization of both intra- and inter-wafer alignment control and alignment process capability. The different alignment behavior is analyzed in terms of photoresist material properties and the physical nature of the alignment detection system.
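The full quadratic experimental design referred to above fits a standard second-order response-surface model; written generically for the three retained factors (viscosity, spin speed, and spin time), it has the form

    y = \beta_0 + \sum_{i=1}^{3} \beta_i x_i + \sum_{i=1}^{3} \beta_{ii} x_i^{2} + \sum_{i<j} \beta_{ij} x_i x_j + \varepsilon,

where y is the alignment response, the x_i are the coded factor levels, and the β coefficients are estimated by least squares from the designed runs.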
Ahlfeld, David P.; Baker, Kristine M.; Barlow, Paul M.
2009-01-01
This report describes the Groundwater-Management (GWM) Process for MODFLOW-2005, the 2005 version of the U.S. Geological Survey modular three-dimensional groundwater model. GWM can solve a broad range of groundwater-management problems by combined use of simulation- and optimization-modeling techniques. These problems include limiting groundwater-level declines or streamflow depletions, managing groundwater withdrawals, and conjunctively using groundwater and surface-water resources. GWM was initially released for the 2000 version of MODFLOW. Several modifications and enhancements have been made to GWM since its initial release to increase the scope of the program's capabilities and to improve its operation and reporting of results. The new code, which is called GWM-2005, also was designed to support the local grid refinement capability of MODFLOW-2005. Local grid refinement allows for the simulation of one or more higher resolution local grids (referred to as child models) within a coarser grid parent model. Local grid refinement is often needed to improve simulation accuracy in regions where hydraulic gradients change substantially over short distances or in areas requiring detailed representation of aquifer heterogeneity. GWM-2005 can be used to formulate and solve groundwater-management problems that include components in both parent and child models. Although local grid refinement increases simulation accuracy, it can also substantially increase simulation run times.
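GWM couples simulation with optimization; the flavor of such a formulation can be illustrated with a small response-matrix linear program, sketched below. The drawdown-response coefficients, well capacities, and limits are invented for illustration; in GWM they come from MODFLOW-2005 simulations of the actual parent and child grids.

```python
# Minimal response-matrix sketch of the kind of management formulation GWM solves:
# maximize total withdrawal from candidate wells subject to drawdown limits at
# control locations. The response coefficients below are invented; GWM derives
# them from MODFLOW simulations of the actual parent/child model grids.
import numpy as np
from scipy.optimize import linprog

# drawdown_response[i, j] = drawdown (m) at control point i per unit rate (m3/d) at well j
drawdown_response = np.array([[0.004, 0.001, 0.002],
                              [0.001, 0.005, 0.002],
                              [0.002, 0.002, 0.003]])
max_drawdown = np.array([1.0, 1.5, 1.2])   # allowable decline at each control point (m)
well_capacity = [(0, 500.0)] * 3           # each well limited to 500 m3/d (assumed)

# linprog minimizes, so negate the objective to maximize total withdrawal.
res = linprog(c=-np.ones(3), A_ub=drawdown_response, b_ub=max_drawdown, bounds=well_capacity)
print("optimal rates (m3/d):", res.x, "total:", -res.fun)
```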
SAMM: a prototype southeast Alaska multiresource model.
Roger D. Fight; Lawrence D. Garrett; Dale L. Weyermann
1990-01-01
The adaptive environmental assessment method was used by an interdisciplinary team of forest specialists to gain an understanding of resource interactions and tradeoffs resulting from forest management activities in southeast Alaska. A forest multiresource projection model was developed in the process. The multiresource model, "SAMM," is capable of characterizing and...
Building more effective sea level rise models for coastal management
NASA Astrophysics Data System (ADS)
Kidwell, D.; Buckel, C.; Collini, R.; Meckley, T.
2017-12-01
For over a decade, increased attention on coastal resilience and adaptation to sea level rise has resulted in a proliferation of predictive models and tools. This proliferation has enhanced our understanding of our vulnerability to sea level rise, but has also led to stakeholder fatigue in trying to realize the value of each advancement. These models vary in type and complexity, ranging from GIS-based bathtub viewers to modeling systems that dynamically couple complex biophysical and geomorphic processes. These approaches and capabilities typically have the common purpose of using scenarios of global and regional sea level change to inform adaptation and mitigation. In addition, stakeholders are often presented a plethora of options to address sea level rise issues from a variety of agencies, academics, and consulting firms. All of this can result in confusion, misapplication of a specific model/tool, and stakeholder feedback of "no more new science or tools, just help me understand which one to use". Concerns from stakeholders have led to the question: how do we move forward with sea level rise modeling? This presentation will provide a synthesis of the experiences and feedback derived from NOAA's Ecological Effects of Sea Level Rise (EESLR) program to discuss the future of predictive sea level rise impact modeling. EESLR is an applied research program focused on the advancement of dynamic modeling capabilities in collaboration with local and regional stakeholders. Key concerns from stakeholder engagement include questions about model uncertainty, approaches for model validation, and a lack of cross-model comparisons. Effective communication of model/tool products, capabilities, and results is paramount to address these concerns. Looking forward, the most effective predictions of sea level rise impacts on our coast will be attained through a focus on coupled modeling systems, particularly those that connect natural processes and human response.
User Inspired Management of Scientific Jobs in Grids and Clouds
ERIC Educational Resources Information Center
Withana, Eran Chinthaka
2011-01-01
From time-critical, real time computational experimentation to applications which process petabytes of data there is a continuing search for faster, more responsive computing platforms capable of supporting computational experimentation. Weather forecast models, for instance, process gigabytes of data to produce regional (mesoscale) predictions on…
Mediation, Alignment, and Information Services for Semantic interoperability (MAISSI): A Trade Study
2007-06-01
Modeling Notation (BPMN) • Business Process Definition Metamodel (BPDM). A Business Process (BP) is a defined sequence of steps to be executed in...enterprise applications, to evaluate the capabilities of suppliers, and to compare against the competition. BPMN standardizes flowchart diagrams that
The Defense Industrial Base: Prescription for a Psychosomatic Ailment
1983-08-01
The Decision-Making Process ... FIGURE 4-1. The Decision Making Process ... the strategy and tactics process to make certain that we can attain our national security objectives. (IFP is also known as mobilization planning or...decision-making model that could improve the capacity and capability of the military-industrial complex, thereby increasing the probability of success
A model for process representation and synthesis. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Thomas, R. H.
1971-01-01
The problem of representing groups of loosely connected processes is investigated, and a model for process representation useful for synthesizing complex patterns of process behavior is developed. There are three parts. The first part isolates the concepts which form the basis for the process representation model by focusing on questions such as: What is a process? What is an event? Should one process be able to restrict the capabilities of another? The second part develops a model for process representation which captures the concepts and intuitions developed in the first part. The model presented is able to describe both the internal structure of individual processes and the interface structure between interacting processes. Much of the model's descriptive power derives from its use of the notion of process state as a vehicle for relating the internal and external aspects of process behavior. The third part demonstrates by example that the model for process representation is a useful one for synthesizing process behavior patterns. In it, the model is used to define a variety of interesting process behavior patterns. The dissertation closes by suggesting how the model could be used as a semantic base for a very potent language extension facility.
NASA Technical Reports Server (NTRS)
Leonard, Daniel; Parsons, Jeremy W.; Cates, Grant
2014-01-01
In May 2013, NASA's GSDO Program requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The study produced a validated DES model with Rockwell Arena that showed that Kennedy Space Center's ground systems were capable of supporting a 48-hour scrub turnaround for the SLS. The model will be maintained and updated to provide commodity consumption analysis of future ground system and SLS configurations.
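A toy discrete-event model of a single commodity during launch attempts and scrub turnarounds can convey the integrated-commodities idea. The sketch below uses SimPy purely as an illustrative stand-in for the Rockwell Arena model; the tank capacity, consumption per attempt, and resupply rates are placeholders, not KSC ground-system values.

```python
# Toy discrete-event sketch of one commodity (e.g., liquid hydrogen) across launch
# attempts and 48-hour scrub turnarounds, written with SimPy as a stand-in for the
# Rockwell Arena model in the study. All quantities are placeholder values.
import simpy

HOURS = 1.0

def launch_attempt(env, tank, demand_per_attempt):
    """Draw down the storage tank for each countdown attempt, then scrub."""
    while True:
        yield tank.get(demand_per_attempt)            # commodity consumed by countdown
        print(f"t={env.now:6.1f} h: attempt complete, {tank.level:.0f} units remain")
        yield env.timeout(48 * HOURS)                 # 48-hour scrub turnaround target

def resupply(env, tank, rate, interval):
    """Periodic deliveries replenish the tank between attempts."""
    while True:
        yield env.timeout(interval)
        amount = min(rate, tank.capacity - tank.level)
        if amount > 0:
            yield tank.put(amount)

env = simpy.Environment()
lh2 = simpy.Container(env, capacity=1000, init=1000)
env.process(launch_attempt(env, lh2, demand_per_attempt=400))
env.process(resupply(env, lh2, rate=150, interval=12 * HOURS))
env.run(until=200 * HOURS)
```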
Constraints and Approach for Selecting the Mars Surveyor '01 Landing Site
NASA Technical Reports Server (NTRS)
Golombek, M.; Bridges, N.; Gilmore, M.; Haldemann, A.; Parker, T.; Saunders, R.; Spencer, D.; Smith, J.; Weitz, C.
1999-01-01
There are many similarities between the Mars Surveyor '01 (MS '01) landing site selection process and that of Mars Pathfinder. The selection process includes two parallel activities in which engineers define and refine the capabilities of the spacecraft through design, testing and modeling and scientists define a set of landing site constraints based on the spacecraft design and landing scenario. As for Pathfinder, the safety of the site is without question the single most important factor, for the simple reason that failure to land safely yields no science and exposes the mission and program to considerable risk. The selection process must be thorough and defensible and capable of surviving multiple withering reviews similar to the Pathfinder decision. On Pathfinder, this was accomplished by attempting to understand the surface properties of sites using available remote sensing data sets and models based on them. Science objectives are factored into the selection process only after the safety of the site is validated. Finally, as for Pathfinder, the selection process is being done in an open environment with multiple opportunities for community involvement including open workshops, with education and outreach opportunities.
Constraints, Approach and Present Status for Selecting the Mars Surveyor 2001 Landing Site
NASA Technical Reports Server (NTRS)
Golombek, M.; Anderson, F.; Bridges, N.; Briggs, G.; Gilmore, M.; Gulick, V.; Haldemann, A.; Parker, T.; Saunders, R.; Spencer, D.;
1999-01-01
There are many similarities between the Mars Surveyor '01 (MS '01) landing site selection process and that of Mars Pathfinder. The selection process includes two parallel activities in which engineers define and refine the capabilities of the spacecraft through design, testing and modeling and scientists define a set of landing site constraints based on the spacecraft design and landing scenario. As for Pathfinder, the safety of the site is without question the single most important factor, for the simple reason that failure to land safely yields no science and exposes the mission and program to considerable risk. The selection process must be thorough, defensible and capable of surviving multiple withering reviews similar to the Pathfinder decision. On Pathfinder, this was accomplished by attempting to understand the surface properties of sites using available remote sensing data sets and models based on them. Science objectives are factored into the selection process only after the safety of the site is validated. Finally, as for Pathfinder, the selection process is being done in an open environment with multiple opportunities for community involvement including open workshops, with education and outreach opportunities.
Validation of a multi-phase plant-wide model for the description of the aeration process in a WWTP.
Lizarralde, I; Fernández-Arévalo, T; Beltrán, S; Ayesa, E; Grau, P
2018-02-01
This paper introduces a new mathematical model built under the PC-PWM methodology to describe the aeration process in a full-scale WWTP. This methodology enables a systematic and rigorous incorporation of chemical and physico-chemical transformations into biochemical process models, particularly for the description of liquid-gas transfer to describe the aeration process. The mathematical model constructed is able to reproduce biological COD and nitrogen removal, liquid-gas transfer and chemical reactions. The capability of the model to describe the liquid-gas mass transfer has been tested by comparing simulated and experimental results in a full-scale WWTP. Finally, an exploration by simulation has been undertaken to show the potential of the mathematical model.
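At the core of any description of aeration is the liquid-gas oxygen transfer term. The following sketch integrates a single-tank dissolved-oxygen balance, dC/dt = kLa (Csat - C) - OUR, with generic textbook parameter values; the plant-wide PC-PWM model in the paper resolves this transfer far more rigorously and alongside the biological and chemical transformations.

```python
# Minimal single-tank dissolved-oxygen balance illustrating the liquid-gas
# transfer term that the plant-wide model resolves far more rigorously:
#   dC/dt = kLa * (C_sat - C) - OUR
# kLa, saturation concentration, and oxygen uptake rate are generic textbook
# values, not parameters from the full-scale WWTP in the study.
import numpy as np
from scipy.integrate import solve_ivp

kla = 4.0        # 1/h, volumetric oxygen transfer coefficient (assumed)
c_sat = 9.1      # g O2 / m3, saturation concentration (assumed)
our = 30.0       # g O2 / (m3 h), oxygen uptake rate of the biomass (assumed)

def do_balance(t, c):
    return kla * (c_sat - c) - our

sol = solve_ivp(do_balance, t_span=(0.0, 4.0), y0=[2.0], dense_output=True)
t = np.linspace(0.0, 4.0, 9)
for ti, ci in zip(t, sol.sol(t)[0]):
    print(f"t = {ti:3.1f} h  DO = {ci:5.2f} g/m3")   # approaches c_sat - our/kla = 1.6
```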
The Evaluation of HOMER as a Marine Corps Expeditionary Energy Predeployment Tool
2010-09-01
experiment was used to ensure the HOMER models were accurate. Following the calibration, the concept of expeditionary energy density as it pertains to power ... process was used to analyze HOMER's modeling capability: • Conduct photovoltaic (PV) experiment, • Develop a calibration process to match the HOMER
The Evaluation of HOMER as a Marine Corps Expeditionary Energy Pre-deployment Tool
2010-11-21
used to ensure the HOMER models were accurate. Following the calibration, the concept of expeditionary energy density as it pertains to power ... process was used to analyze HOMER's modeling capability: • Conduct photovoltaic (PV) experiment, • Develop a calibration process to match the HOMER
Rastatter, M; Dell, C W; McGuire, R A; Loren, C
1987-03-01
Previous studies investigating hemispheric organization for processing concrete and abstract nouns have provided conflicting results. Using manual reaction time tasks some studies have shown that the right hemisphere is capable of analyzing concrete words but not abstract. Others, however, have inferred that the left hemisphere is the sole analyzer of both types of lexicon. The present study tested these issues further by measuring vocal reaction times of normal subjects to unilaterally presented concrete and abstract items. Results were consistent with a model of functional localization which suggests that the minor hemisphere is capable of differentially processing both types of lexicon in the presence of a dominant left hemisphere.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, Richard G.; Maniaci, David Charles; Naughton, Jonathan W.
2015-09-01
A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations including the Department of Energy, the National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.
NASA Technical Reports Server (NTRS)
Mainger, Steve
2004-01-01
As NASA speculates on and explores the future of aviation, the technological and physical aspects of our environment increasingly become hurdles that must be overcome for success. Research into methods for overcoming some of these selected hurdles has been proposed by several NASA research partners as concepts. The task of establishing a common evaluation environment was placed on NASA's Virtual Airspace Simulation Technologies (VAST) project (a sub-project of VAMS), and it responded with the development of the Airspace Concept Evaluation System (ACES). When the ACES environment is examined from a communication, navigation, or surveillance (CNS) perspective, the simulation parameters are built with assumed perfection in the transactions associated with CNS. To truly evaluate these concepts in a realistic sense, the contributions and effects of CNS must be part of ACES. NASA Glenn Research Center (GRC) has supported the Virtual Airspace Modeling and Simulation (VAMS) project through the continued development of CNS models and analysis capabilities that support the ACES environment. As part of this support, NASA GRC initiated the development of a communications traffic loading analysis tool, called the Future Aeronautical Sub-network Traffic Emulator for Communications, Navigation and Surveillance (FASTE-CNS). This tool allows for forecasting of communications load with the understanding that there is no single, common source for the loading models used to evaluate existing and planned communications channels, and that consensus and accuracy in the traffic load models are very important inputs to decisions on the acceptability of communication techniques used to fulfill aeronautical requirements. Leveraging the existing capabilities of the FASTE-CNS tool, GRC has called for FASTE-CNS to have the functionality to pre- and post-process the simulation runs of ACES to report on instances when traffic density, frequency congestion, or aircraft spacing/distance violations have occurred. The integration of these functions requires that the CNS models used to characterize these avionic systems be of higher fidelity and better consistency than is present in the FASTE-CNS system. This presentation will explore the capabilities of FASTE-CNS with renewed emphasis on the enhancements being added to perform these processing functions; the fidelity and reliability of CNS models necessary to make the enhancements work; and the benchmarking of FASTE-CNS results to improve confidence in the results of the new processing capabilities.
Rapid Automated Aircraft Simulation Model Updating from Flight Data
NASA Technical Reports Server (NTRS)
Brian, Geoff; Morelli, Eugene A.
2011-01-01
Techniques to identify aircraft aerodynamic characteristics from flight measurements and compute corrections to an existing simulation model of a research aircraft were investigated. The purpose of the research was to develop a process enabling rapid automated updating of aircraft simulation models using flight data and apply this capability to all flight regimes, including flight envelope extremes. The process presented has the potential to improve the efficiency of envelope expansion flight testing, revision of control system properties, and the development of high-fidelity simulators for pilot training.
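One common building block of such an update process is equation-error least squares on measured force or moment coefficients. The sketch below identifies lift-coefficient derivatives from synthetic "flight data" and forms a correction relative to an assumed baseline simulation value; the study's actual identification and model-update pipeline is more elaborate.

```python
# Equation-error least-squares sketch of updating a lift-coefficient model
#   CL = CL0 + CL_alpha * alpha + CL_de * delta_e
# from flight measurements. The "flight data" here are synthetic, and the baseline
# and true coefficients are assumed values for illustration only.
import numpy as np

rng = np.random.default_rng(1)
n = 200
alpha = rng.uniform(-2, 10, n) * np.pi / 180        # angle of attack (rad)
delta_e = rng.uniform(-5, 5, n) * np.pi / 180       # elevator deflection (rad)
true_coeffs = np.array([0.25, 5.3, 0.45])           # CL0, CL_alpha, CL_de (assumed truth)
X = np.column_stack([np.ones(n), alpha, delta_e])
CL_meas = X @ true_coeffs + rng.normal(0, 0.01, n)  # synthetic measured lift coefficient

# Identify coefficients from the measurements, then form the correction to an
# existing simulation model (the "delta" applied to the baseline database).
identified, *_ = np.linalg.lstsq(X, CL_meas, rcond=None)
baseline = np.array([0.20, 5.0, 0.40])              # existing simulation values (assumed)
print("identified coefficients:", identified)
print("correction to simulation model:", identified - baseline)
```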
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rider, William J.; Witkowski, Walter R.; Mousseau, Vincent Andrew
2016-04-13
The importance of credible, trustworthy numerical simulations is obvious, especially when the results are used for making high-consequence decisions. Determining the credibility of such numerical predictions is much more difficult and requires a systematic approach to assessing predictive capability, associated uncertainties, and overall confidence in the computational simulation process for the intended use of the model. This process begins with an evaluation of the computational modeling of the identified, important physics of the simulation for its intended use. This is commonly done through a Phenomena Identification Ranking Table (PIRT). Then an assessment of the evidence basis supporting the ability to computationally simulate these physics can be performed using various frameworks such as the Predictive Capability Maturity Model (PCMM). Several critical activities follow in the areas of code and solution verification, validation, and uncertainty quantification, which will be described in detail in the following sections. Here, we introduce the subject matter for general applications, but specifics are given for the failure prediction project. In addition, the first task that must be completed in the verification and validation procedure is to perform a credibility assessment to fully understand the requirements and limitations of the current computational simulation capability for the specific intended use of the application. The PIRT and PCMM are tools used at Sandia National Laboratories (SNL) to provide a consistent manner of performing such an assessment. Ideally, all stakeholders should be represented and contribute to an accurate credibility assessment. PIRTs and PCMMs are both described in brief detail below, and the resulting assessments for an example project are given.
Using Sensor Web Processes and Protocols to Assimilate Satellite Data into a Forecast Model
NASA Technical Reports Server (NTRS)
Goodman, H. Michael; Conover, Helen; Zavodsky, Bradley; Maskey, Manil; Jedlovec, Gary; Regner, Kathryn; Li, Xiang; Lu, Jessica; Botts, Mike; Berthiau, Gregoire
2008-01-01
The goal of the Sensor Management Applied Research Technologies (SMART) On-Demand Modeling project is to develop and demonstrate the readiness of the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) capabilities to integrate both space-based Earth observations and forecast model output into new data acquisition and assimilation strategies. The project is developing sensor web-enabled processing plans to assimilate Atmospheric Infrared Sounder (AIRS) satellite temperature and moisture retrievals into a regional Weather Research and Forecasting (WRF) model over the southeastern United States.
Identification of Biokinetic Models Using the Concept of Extents.
Mašić, Alma; Srinivasan, Sriniketh; Billeter, Julien; Bonvin, Dominique; Villez, Kris
2017-07-05
The development of a wide array of process technologies to enable the shift from conventional biological wastewater treatment processes to resource recovery systems is matched by an increasing demand for predictive capabilities. Mathematical models are excellent tools to meet this demand. However, obtaining reliable and fit-for-purpose models remains a cumbersome task due to the inherent complexity of biological wastewater treatment processes. In this work, we present a first study in the context of environmental biotechnology that adopts and explores the use of extents as a way to simplify and streamline the dynamic process modeling task. In addition, the extent-based modeling strategy is enhanced by optimal accounting for nonlinear algebraic equilibria and nonlinear measurement equations. Finally, a thorough discussion of our results explains the benefits of extent-based modeling and its potential to turn environmental process modeling into a highly automated task.
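For a simple batch system the core idea of extents can be shown in a few lines: measured concentrations satisfy c(t) = c0 + Nᵀx(t), so the extents x(t) are recovered by a linear transformation and each can then be modeled separately. The stoichiometry and "measurements" below are invented, and the paper's formulation additionally handles inlets, outlets, gas-liquid transfer, and equilibria.

```python
# Hedged sketch of the core idea behind extent-based modeling for a batch system:
# measured concentrations c(t) = c0 + N.T @ x(t), so the reaction extents x(t) can
# be recovered by a linear transformation and each extent modeled/identified
# separately. The stoichiometry and "measurements" are invented for illustration.
import numpy as np

# Two reactions, three species (rows = reactions, columns = species).
N = np.array([[-1.0,  1.0,  0.0],
              [ 0.0, -1.0,  1.0]])
c0 = np.array([1.0, 0.0, 0.0])                  # initial concentrations (mol/L), assumed

true_extents = np.array([0.4, 0.1])             # assumed "true" extents at some time t
c_meas = c0 + N.T @ true_extents + np.random.default_rng(2).normal(0, 1e-3, 3)

# Least-squares recovery of the extents from the concentration measurement.
x_hat, *_ = np.linalg.lstsq(N.T, c_meas - c0, rcond=None)
print("recovered extents:", x_hat)              # approximately [0.4, 0.1]
```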
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rutqvist, Jonny; Blanco Martin, Laura; Mukhopadhyay, Sumit
In this report, we present FY2014 progress by Lawrence Berkeley National Laboratory (LBNL) related to modeling of coupled thermal-hydrological-mechanical-chemical (THMC) processes in salt and their effect on brine migration at high temperatures. LBNL's work on the modeling of coupled THMC processes in salt was initiated in FY2012, focusing on exploring and demonstrating the capabilities of an existing LBNL modeling tool (TOUGH-FLAC) for simulating temperature-driven coupled flow and geomechanical processes in salt. This work includes development related to, and implementation of, essential capabilities, as well as testing the model against relevant information and published experimental data related to the fate and transport of water. We provide more details on the FY2014 work, first presenting updated tools and improvements made to the TOUGH-FLAC simulator, and the use of this updated tool in a new model simulation of long-term THM behavior within a generic repository in a salt formation. This is followed by a description of current benchmarking and validation efforts, including the TSDE experiment. We then present the current status of the development of constitutive relationships and the dual-continuum model for brine migration. We conclude with an outlook for FY2015, which will be much focused on model validation against field experiments and on the use of the model for design studies related to a proposed heater experiment.
A Theoretical and Experimental Analysis of the Outside World Perception Process
NASA Technical Reports Server (NTRS)
Wewerinke, P. H.
1978-01-01
The outside scene is often an important source of information for manual control tasks. Important examples of these are car driving and aircraft control. This paper deals with modelling this visual scene perception process on the basis of linear perspective geometry and the relative motion cues. Model predictions utilizing psychophysical threshold data from base-line experiments and literature of a variety of visual approach tasks are compared with experimental data. Both the performance and workload results illustrate that the model provides a meaningful description of the outside world perception process, with a useful predictive capability.
VIM: A Platform for Violent Intent Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; Schryver, Jack C.; Whitney, Paul D.
2009-03-31
Radical and contentious political/religious activism may or may not evolve into violent behavior depending on contextual factors related to social, political, cultural and infrastructural conditions. Significant theoretical advances have been made in understanding these contextual factors and the import of their interrelations. However, there has been relatively little progress in the development of processes and capabilities which leverage such theoretical advances to automate the anticipatory analysis of violent intent. In this paper, we describe a framework which implements such processes and capabilities, and discuss the implications of using the resulting system to assess the emergence of radicalization leading to violence.
Space Shuttle Orbiter oxygen partial pressure sensing and control system improvements
NASA Technical Reports Server (NTRS)
Frampton, Robert F.; Hoy, Dennis M.; Kelly, Kevin J.; Walleshauser, James J.
1992-01-01
A program aimed at developing a new PPO2 oxygen sensor and a replacement amplifier for the Space Shuttle Orbiter is described. Experimental design methodologies used in the test and modeling process made it possible to enhance the effectiveness of the program and to reduce its cost. Significant cost savings are due to the increased lifetime of the basic sensor cell, the maximization of useful sensor life through an increased amplifier gain adjustment capability, the use of streamlined production processes for the manufacture of the assemblies, and the refurbishment capability of the replacement sensor.
NASA Technical Reports Server (NTRS)
Baldridge, P. E.; Weber, C.; Schaal, G.; Wilhelm, C.; Wurelic, G. E.; Stephan, J. G.; Ebbert, T. F.; Smail, H. E.; Mckeon, J.; Schmidt, N. (Principal Investigator)
1977-01-01
The author has identified the following significant results. A current uniform land inventory was derived, in part, from LANDSAT data. The State has the ability to convert processed land information from LANDSAT to the Ohio Capability Analysis Program (OCAP). The OCAP is a computer information and mapping system comprised of various programs used to digitally store, analyze, and display land capability information. More accurate processing of LANDSAT data could lead to reasonably accurate, useful land allocation models. It was feasible to use LANDSAT data to investigate minerals, pollution, land use, and resource inventory.
Cost Model Comparison: A Study of Internally and Commercially Developed Cost Models in Use by NASA
NASA Technical Reports Server (NTRS)
Gupta, Garima
2011-01-01
NASA makes use of numerous cost models to accurately estimate the cost of various components of a mission - hardware, software, mission/ground operations - during the different stages of a mission's lifecycle. The purpose of this project was to survey these models and determine in which respects they are similar and in which they are different. The initial survey included a study of the cost drivers for each model, the form of each model (linear/exponential/other CER, range/point output, capable of risk/sensitivity analysis), and for what types of missions and for what phases of a mission lifecycle each model is capable of estimating cost. The models taken into consideration consisted of both those that were developed by NASA and those that were commercially developed: GSECT, NAFCOM, SCAT, QuickCost, PRICE, and SEER. Once the initial survey was completed, the next step in the project was to compare the cost models' capabilities in terms of Work Breakdown Structure (WBS) elements. This final comparison was then portrayed in a visual manner with Venn diagrams. All of the materials produced in the process of this study were then posted on the Ground Segment Team (GST) Wiki.
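One of the CER forms mentioned above, an exponential relationship cost = a · mass^b, can be fit by log-linear least squares. The sketch below uses synthetic mass/cost pairs and does not reproduce any of the surveyed tools (GSECT, NAFCOM, SCAT, QuickCost, PRICE, SEER).

```python
# Generic exponential cost-estimating relationship (CER) of the form
#   cost = a * mass^b,
# fitted by log-linear least squares. The data points are synthetic and the form
# is only one of the CER shapes mentioned above; it does not reproduce any of the
# internally or commercially developed models surveyed in the study.
import numpy as np

mass_kg = np.array([150, 300, 450, 800, 1200], dtype=float)       # assumed subsystem masses
cost_mil = np.array([12.0, 21.5, 30.0, 48.0, 66.0])               # assumed costs ($M)

A = np.column_stack([np.ones_like(mass_kg), np.log(mass_kg)])
(log_a, b), *_ = np.linalg.lstsq(A, np.log(cost_mil), rcond=None)
a = np.exp(log_a)

print(f"CER: cost = {a:.3f} * mass^{b:.3f}")
print(f"point estimate for 600 kg: ${a * 600**b:.1f}M")
```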
Because Zebrafish (Danio rerio) have become a popular and important model for scientific research, the capability to rear larval zebrafish to adulthood is of great importance. Recently, research examining the effects of diet (live versus processed) has been published. In the cu...
NASA Technical Reports Server (NTRS)
Lahoti, G. D.; Akgerman, N.; Altan, T.
1978-01-01
Mild steel (AISI 1018) was selected as model cold-rolling material and Ti-6Al-4V and INCONEL 718 were selected as typical hot-rolling and cold-rolling alloys, respectively. The flow stress and workability of these alloys were characterized and friction factor at the roll/workpiece interface was determined at their respective working conditions by conducting ring tests. Computer-aided mathematical models for predicting metal flow and stresses, and for simulating the shape-rolling process were developed. These models utilize the upper-bound and the slab methods of analysis, and are capable of predicting the lateral spread, roll-separating force, roll torque and local stresses, strains and strain rates. This computer-aided design (CAD) system is also capable of simulating the actual rolling process and thereby designing roll-pass schedule in rolling of an airfoil or similar shape. The predictions from the CAD system were verified with respect to cold rolling of mild steel plates. The system is being applied to cold and hot isothermal rolling of an airfoil shape, and will be verified with respect to laboratory experiments under controlled conditions.
Simulating the Composite Propellant Manufacturing Process
NASA Technical Reports Server (NTRS)
Williamson, Suzanne; Love, Gregory
2000-01-01
There is a strategic interest in understanding how the propellant manufacturing process contributes to military capabilities outside the United States. The paper will discuss how system dynamics (SD) has been applied to rapidly assess the capabilities and vulnerabilities of a specific composite propellant production complex. These facilities produce a commonly used solid propellant with military applications. The authors will explain how an SD model can be configured to match a specific production facility followed by a series of scenarios designed to analyze operational vulnerabilities. By using the simulation model to rapidly analyze operational risks, the analyst gains a better understanding of production complexities. There are several benefits of developing SD models to simulate chemical production. SD is an effective tool for characterizing complex problems, especially the production process where the cascading effect of outages quickly taxes common understanding. By programming expert knowledge into an SD application, these tools are transformed into a knowledge management resource that facilitates rapid learning without requiring years of experience in production operations. It also permits the analyst to rapidly respond to crisis situations and other time-sensitive missions. Most importantly, the quantitative understanding gained from applying the SD model lends itself to strategic analysis and planning.
Rawahi, Said Harith Al; Asimakopoulou, Koula; Newton, Jonathon Timothy
2018-01-01
To determine the barriers and enablers to behavioural change to reduce free sugar intake related to dental caries in a sample of UK adults who identify their ethnicity as White. Qualitative study comprising semi-structured interviews of 27 participants. Interviews were recorded, transcribed and analysed using thematic analysis methods. The Capability-Opportunity-Motivation-Behaviour model (COM-B) and the Theoretical Domains Framework (TDF) were used to guide the derivation of themes. Data saturation occurred at 27 interviews. The COM-B Model and TDF domains captured various factors that may influence the consumption of free sugar. TDF elements which are reflected in the study are: Knowledge; Psychological skills; Memory, attention, and decision processes; Behavioural regulation; Physical skills; Social influence; Environmental context and resources; Social and professional role and identity; Beliefs about capabilities; Beliefs about consequence; Intentions and goals reinforcement; and Emotions. COM-B Model elements which are reflected in the study are: psychological capabilities, physical capabilities, social opportunities, physical opportunities, reflective motivation, and automatic motivation. The COM-B model and TDF framework provided a comprehensive account of the barriers and facilitators of reducing sugar intake among white ethnic groups.
Structural equation modeling and natural systems
Grace, James B.
2006-01-01
This book, first published in 2006, presents an introduction to the methodology of structural equation modeling, illustrates its use, and goes on to argue that it has revolutionary implications for the study of natural systems. A major theme of this book is that we have, up to this point, attempted to study systems primarily using methods (such as the univariate model) that were designed only for considering individual processes. Understanding systems requires the capacity to examine simultaneous influences and responses. Structural equation modeling (SEM) has such capabilities. It also possesses many other traits that add strength to its utility as a means of making scientific progress. In light of the capabilities of SEM, it can be argued that much of ecological theory is currently locked in an immature state that impairs its relevance. It is further argued that the principles of SEM are capable of leading to the development and evaluation of multivariate theories of the sort vitally needed for the conservation of natural systems.
Extending BPM Environments of Your Choice with Performance Related Decision Support
NASA Astrophysics Data System (ADS)
Fritzsche, Mathias; Picht, Michael; Gilani, Wasif; Spence, Ivor; Brown, John; Kilpatrick, Peter
What-if Simulations have been identified as one solution for business performance related decision support. Such support is especially useful in cases where it can be automatically generated out of Business Process Management (BPM) Environments from the existing business process models and performance parameters monitored from the executed business process instances. Currently, some of the available BPM Environments offer basic-level performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance related decision support at business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations or a combination of such solutions into already existing BPM environments. The approach abstracts from process modelling techniques which enable automatic decision support spanning processes across numerous BPM Environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.
NASA Astrophysics Data System (ADS)
Kuras, P. K.; Weiler, M.; Alila, Y.; Spittlehouse, D.; Winkler, R.
2006-12-01
Hydrologic models have been increasingly used in forest hydrology to overcome the limitations of paired watershed experiments, where vegetative recovery and natural variability obscure the inferences and conclusions that can be drawn from such studies. Models, however, are also plagued by uncertainty stemming from a limited understanding of hydrological processes in forested catchments and parameter equifinality is a common concern. This has created the necessity to improve our understanding of how hydrological systems work, through the development of hydrological measures, analyses and models that address the question: are we getting the right answers for the right reasons? Hence, physically-based, spatially-distributed hydrologic models should be validated with high-quality experimental data describing multiple concurrent internal catchment processes under a range of hydrologic regimes. The distributed hydrology soil vegetation model (DHSVM) frequently used in forest management applications is an example of a process-based model used to address the aforementioned circumstances, and this study takes a novel approach at collectively examining the ability of a pre-calibrated model application to realistically simulate outlet flows along with the spatial-temporal variation of internal catchment processes including: continuous groundwater dynamics at 9 locations, stream and road network flow at 67 locations for six individual days throughout the freshet, and pre-melt season snow distribution. Model efficiency was improved over prior evaluations due to continuous efforts in improving the quality of meteorological data in the watershed. Road and stream network flows were very well simulated for a range of hydrological conditions, and the spatial distribution of the pre-melt season snowpack was in general agreement with observed values. The model was effective in simulating the spatial variability of subsurface flow generation, except at locations where strong stream-groundwater interactions existed, as the model is not capable of simulating such processes and subsurface flows always drain to the stream network. The model has proven overall to be quite capable in realistically simulating internal catchment processes in the watershed, which creates more confidence in future model applications exploring the effects of various forest management scenarios on the watershed's hydrological processes.
Graphical workstation capability for reliability modeling
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.; Koppen, Sandra V.; Haley, Pamela J.
1992-01-01
In addition to computational capabilities, software tools for estimating the reliability of fault-tolerant digital computer systems must also provide a means of interfacing with the user. Described here is the new graphical interface capability of the hybrid automated reliability predictor (HARP), a software package that implements advanced reliability modeling techniques. The graphics oriented (GO) module provides the user with a graphical language for modeling system failure modes through the selection of various fault-tree gates, including sequence-dependency gates, or by a Markov chain. By using this graphical input language, a fault tree becomes a convenient notation for describing a system. In accounting for any sequence dependencies, HARP converts the fault-tree notation to a complex stochastic process that is reduced to a Markov chain, which it can then solve for system reliability. The graphics capability is available for use on an IBM-compatible PC, a Sun, and a VAX workstation. The GO module is written in the C programming language and uses the graphical kernel system (GKS) standard for graphics implementation. The PC, VAX, and Sun versions of the HARP GO module are currently in beta-testing stages.
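The fault-tree-to-Markov-chain solution step can be illustrated on a tiny example: a two-unit standby system with imperfect coverage, whose continuous-time Markov chain is solved with a matrix exponential. The failure rate and coverage value below are assumptions for the demonstration; HARP constructs and solves such chains automatically from the graphical fault-tree input.

```python
# Tiny continuous-time Markov chain in the spirit of HARP's solution step: a
# two-unit standby system with imperfect fault coverage c. States: 0 = both good,
# 1 = one good, 2 = system failed (absorbing). Rates and coverage are illustrative
# only; HARP builds these chains automatically from the fault-tree input.
import numpy as np
from scipy.linalg import expm

lam = 1e-4      # per-unit failure rate (1/h), assumed
c = 0.99        # probability a unit failure is detected and covered, assumed

Q = np.array([
    [-2 * lam,  2 * lam * c,  2 * lam * (1 - c)],   # from state 0
    [0.0,       -lam,         lam              ],   # from state 1
    [0.0,       0.0,          0.0              ],   # failed (absorbing)
])

p0 = np.array([1.0, 0.0, 0.0])
t = 1000.0                                   # mission time (h), assumed
p_t = p0 @ expm(Q * t)
print(f"unreliability at {t:.0f} h: {p_t[2]:.3e}")
print(f"reliability   at {t:.0f} h: {1 - p_t[2]:.6f}")
```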
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.
1979-01-01
A model of a central processor (CPU) which services background applications in the presence of time-critical activity is presented. The CPU is viewed as an M/M/1 queueing system subject to periodic interrupts by a deterministic, time-critical process. The Laplace transform of the distribution of service times for the background applications is developed. The use of state-of-the-art queueing models for studying the background processing capability of time-critical computer systems is discussed, and the results of a model validation study which support this application of queueing models are presented.
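A crude way to see the effect of the periodic time-critical task on background capability is to scale the background service rate by the fraction of each period left over, then apply the standard M/M/1 formulas. The sketch below does exactly that; it is only a capacity-scaling approximation with assumed numbers, not the paper's exact Laplace-transform analysis.

```python
# Back-of-the-envelope illustration of background processing capability: a
# periodic, deterministic time-critical task consumes part of every period, and
# the background workload sees only the remaining capacity. This capacity-scaling
# approximation is NOT the paper's exact analysis; all numbers are assumed.
period = 0.100          # s, period of the time-critical interrupt
burst = 0.030           # s of CPU consumed by the time-critical task each period
frac_free = 1.0 - burst / period          # fraction of CPU left for background work

lam = 5.0               # background arrivals per second (assumed)
mu = 12.0               # background service rate with a dedicated CPU (jobs/s, assumed)
mu_eff = frac_free * mu                   # effective service rate seen by background jobs

rho = lam / mu_eff
if rho < 1.0:
    mean_response = 1.0 / (mu_eff - lam)  # standard M/M/1 mean response time
    print(f"effective utilization: {rho:.2f}")
    print(f"approx. mean background response time: {mean_response * 1e3:.1f} ms")
else:
    print("background queue is unstable at this interrupt loading")
```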
Dawn Orbit Determination Team: Trajectory Modeling and Reconstruction Processes at Vesta
NASA Technical Reports Server (NTRS)
Abrahamson, Matthew J.; Ardito, Alessandro; Han, Dongsuk; Haw, Robert; Kennedy, Brian; Mastrodemos, Nick; Nandi, Sumita; Park, Ryan; Rush, Brian; Vaughan, Andrew
2013-01-01
The Dawn spacecraft spent over a year in orbit around Vesta from July 2011 through August 2012. In order to maintain the designated science reference orbits and enable the transfers between those orbits, precise and timely orbit determination was required. Challenges included low-thrust ion propulsion modeling, estimation of relatively unknown Vesta gravity and rotation models, tracking data limitations, incorporation of real-time telemetry into dynamics model updates, and rapid maneuver design cycles during transfers. This paper discusses the dynamics models, filter configuration, and data processing implemented to deliver a rapid orbit determination capability to the Dawn project.
Human Support Technology Research to Enable Exploration
NASA Technical Reports Server (NTRS)
Joshi, Jitendra
2003-01-01
Contents include the following: Advanced life support. System integration, modeling, and analysis. Progressive capabilities. Water processing. Air revitalization systems. Why advanced CO2 removal technology? Solid waste resource recovery systems: lyophilization. ISRU technologies for Mars life support. Atmospheric resources of Mars. N2 consumable/make-up for Mars life. Integrated test beds. Monitoring and controlling the environment. Ground-based commercial technology. Optimizing size vs capability. Water recovery systems. Flight verification topics.
Material Stream Strategy for Lithium and Inorganics (U)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Safarik, Douglas Joseph; Dunn, Paul Stanton; Korzekwa, Deniece Rochelle
Design Agency Responsibilities: Manufacturing Support to meet Stockpile Stewardship goals for maintaining the nuclear stockpile through experimental and predictive modeling capability. Development and maintenance of Manufacturing Science expertise to assess material specifications and performance boundaries, and their relationship to processing parameters. Production Engineering Evaluations with competence in design requirements, material specifications, and manufacturing controls. Maintenance and enhancement of Aging Science expertise to support Stockpile Stewardship predictive science capability.
Modeling fire and other disturbance processes using LANDIS
Stephen R. Shifley; Jian Yang; Hong He
2009-01-01
LANDIS is a landscape decision support tool that models spatial relationships to help managers and planners examine the large-scale, long-term, cumulative effects of succession, harvesting, wildfire, prescribed fire, insects, and disease. It can operate on forest landscapes from a few thousand to a few million acres in extent. Fire modeling capabilities in LANDIS are...
USDA-ARS?s Scientific Manuscript database
Classical, one-dimensional, mobile bed, sediment-transport models simulate vertical channel adjustment, raising or lowering cross-section node elevations to simulate erosion or deposition. This approach does not account for bank erosion processes including toe scour and mass failure. In many systems...
Dai, Heng; Ye, Ming; Walker, Anthony P.; ...
2017-03-28
A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
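The spirit of the process sensitivity index can be conveyed with a toy double-loop Monte Carlo estimate, sketched below: the recharge process is frozen in an outer loop (model choice plus parameter), the geology process is sampled in an inner loop, and the variance of the conditional means is compared to the total output variance. The output function, model forms, and distributions are invented for illustration only.

```python
# Toy double-loop Monte Carlo estimate of a "process sensitivity index" in the
# spirit described above: the recharge process has two competing models and the
# geology process has two competing conductivity models, each with its own random
# parameter. The output function and all distributions are invented; the paper
# derives the index formally and applies it to a reactive-transport model.
import numpy as np

rng = np.random.default_rng(3)

def sample_recharge():
    """Pick one of two recharge models (equal weight) and its random parameter."""
    if rng.random() < 0.5:
        return 0.2 * rng.normal(300.0, 30.0)           # model R1: 20% of precipitation
    return 0.1 * rng.normal(300.0, 30.0) + 15.0        # model R2: 10% of precipitation + constant

def sample_geology():
    """Pick one of two hydraulic-conductivity models (equal weight) and its parameter."""
    if rng.random() < 0.5:
        return rng.lognormal(mean=0.0, sigma=0.3)      # model G1
    return rng.lognormal(mean=0.5, sigma=0.1)          # model G2

def head_output(recharge, conductivity):
    """Invented output: simulated head rises with recharge, falls with conductivity."""
    return recharge / conductivity

n_outer, n_inner = 400, 400
cond_means, all_outputs = [], []
for _ in range(n_outer):
    r = sample_recharge()                              # freeze the recharge process
    inner = [head_output(r, sample_geology()) for _ in range(n_inner)]
    cond_means.append(np.mean(inner))
    all_outputs.extend(inner)

ps_recharge = np.var(cond_means) / np.var(all_outputs)
print(f"process sensitivity index for recharge: {ps_recharge:.2f}")
```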
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Heng; Ye, Ming; Walker, Anthony P.
A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
Technical and economic assessment of processes for the production of butanol and acetone
NASA Technical Reports Server (NTRS)
1982-01-01
This report represents a preliminary technical and economic evaluation of a process which produces mixed solvents (butanol/acetone/ethanol) via fermentation of sugars derived from renewable biomass resources. The objective is to assess the technology of producing butanol/acetone from biomass, and select a viable process capable of serving as a base case model for technical and economic analysis. It is anticipated that the base case process developed herein can then be used as the basis for subsequent studies concerning biomass conversion processes capable of producing a wide range of chemicals. The general criteria utilized in determining the design basis for the process are profit potential and non-renewable energy displacement potential. The feedstock chosen, aspen wood, was selected from a number of potential renewable biomass resources as the most readily available in the United States and for its relatively large potential for producing reducing sugars.
Architecture for Survivable System Processing (ASSP)
NASA Astrophysics Data System (ADS)
Wood, Richard J.
1991-11-01
The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address the technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are being developed to apply new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, this program provides for regular interactions with standardization working groups, e.g., the International Standards Organization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.
Architecture for Survivable System Processing (ASSP)
NASA Technical Reports Server (NTRS)
Wood, Richard J.
1991-01-01
The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address the technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are being developed to apply new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, this program provides for regular interactions with standardization working groups, e.g., the International Standards Organization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.
NARSTO critical review of photochemical models and modeling
NASA Astrophysics Data System (ADS)
Russell, Armistead; Dennis, Robin
Photochemical air quality models play a central role both in the scientific investigation of how pollutants evolve in the atmosphere and in developing policies to manage air quality. In the past 30 years, these models have evolved from rather crude representations of the physics and chemistry impacting trace species to their current state: comprehensive, but not complete. The evolution has included advancements not only in the level of process descriptions, but also in the computational implementation, including numerical methods. As part of the NARSTO Critical Reviews, this article discusses the current strengths and weaknesses of air quality models and the modeling process. Current Eulerian models are found to represent well the primary processes impacting the evolution of trace species in most cases, though some exceptions may exist. For example, sub-grid-scale processes, such as concentrated power plant plumes, are treated only approximately. It is not apparent how much such approximations affect their results and the policies based upon those results. A significant weakness has been in how investigators have addressed, and communicated, such uncertainties. Studies find that major uncertainties are due to model inputs, e.g., emissions and meteorology, more so than to the model itself. One of the primary weaknesses identified is in the modeling process, not the models. Evaluation has been limited, due in part to data constraints: seldom is there ample observational data to conduct a detailed model intercomparison using consistent data (e.g., the same emissions and meteorology). Further model advancement, and the development of greater confidence in the use of models, is hampered by the lack of thorough evaluation and intercomparisons. Model advances are seen in the use of new tools for extending the interpretation of model results, e.g., process and sensitivity analysis, modeling systems to facilitate their use, and extension of model capabilities, e.g., aerosol dynamics capabilities and sub-grid-scale representations. Another possible direction is the development and widespread use of a community model acting as a platform for multiple groups and agencies to collaborate and progress more rapidly.
Verification of a Finite Element Model for Pyrolyzing Ablative Materials
NASA Technical Reports Server (NTRS)
Risch, Timothy K.
2017-01-01
Ablating thermal protection system (TPS) materials have been used in many reentering spacecraft and in other applications such as rocket nozzle linings, fire protection materials, and as countermeasures for directed energy weapons. The introduction of the finite element model to the analysis of ablation has arguably resulted in improved computational capabilities due to the flexibility and extended applicability of the method, especially to complex geometries. Commercial finite element codes often provide enhanced capability compared to custom, specially written programs based on versatility, usability, pre- and post-processing, grid generation, total life-cycle costs, and speed.
Using APEX to Model Anticipated Human Error: Analysis of a GPS Navigational Aid
NASA Technical Reports Server (NTRS)
VanSelst, Mark; Freed, Michael; Shefto, Michael (Technical Monitor)
1997-01-01
The interface development process can be dramatically improved by predicting design-facilitated human error at an early stage in the design process. The approach we advocate is to SIMULATE the behavior of a human agent carrying out tasks with a well-specified user interface, ANALYZE the simulation for instances of human error, and then REFINE the interface or protocol to minimize predicted error. This approach, incorporated into the APEX modeling architecture, differs from past approaches to human simulation in its emphasis on error rather than, e.g., learning rate or speed of response. The APEX model consists of two major components: (1) a powerful action selection component capable of simulating behavior in complex, multiple-task environments; and (2) a resource architecture which constrains cognitive, perceptual, and motor capabilities to within empirically demonstrated limits. The model mimics human errors arising from interactions between limited human resources and elements of the computer interface whose design fails to anticipate those limits. We analyze the design of a hand-held Global Positioning System (GPS) device used for tactical and navigational decisions in small yacht racing. The analysis demonstrates how human system modeling can be an effective design aid, helping to accelerate the process of refining a product (or procedure).
NASA Astrophysics Data System (ADS)
Villamil-Otero, G.; Zhang, J.; Yao, Y.
2017-12-01
The Antarctic Peninsula (AP) has long been the focus of climate change studies due to its rapid environmental changes, such as significantly increased glacier melt and retreat and ice-shelf break-up. Progress has been continuously made in the use of regional modeling to simulate surface mass changes over ice sheets. Most efforts, however, focus on the ice sheets of Greenland, with considerably fewer studies in Antarctica. In this study the Weather Research and Forecasting (WRF) model, which has been applied to the Antarctic region for weather modeling, is adopted to capture past and future surface mass balance changes over the AP. In order to enhance the capability of the WRF model to simulate surface mass balance over the ice surface, we implement various ice and snow processes within WRF and develop a new WRF suite (WRF-Ice). WRF-Ice includes a thermodynamic ice sheet model that improves the representation of internal melting and refreezing processes and the thermodynamic effects over the ice sheet. WRF-Ice also couples a thermodynamic sea ice model to improve the simulation of surface temperature and fluxes over sea ice. Lastly, complex snow processes are also taken into consideration, including the implementation of a snowdrift model that accounts for the redistribution of blowing snow as well as the thermodynamic impact of drifting-snow sublimation on the lower atmospheric boundary layer. Intensive testing of these ice and snow processes is performed to assess the capability of WRF-Ice in simulating surface mass balance changes over the AP.
Flexibility on storage-release based distributed hydrologic modeling with object-oriented approach
USDA-ARS?s Scientific Manuscript database
With the availability of advanced hydrologic data in the public domain such as remotely sensed and climate change scenario data, there is a need for a modeling framework that is capable of using these data to simulate and extend hydrologic processes with multidisciplinary approaches for sustainable ...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-22
... Competition Bureau seeks public input on additional questions relating to modeling voice capability and Annual... submitting comments and additional information on the rulemaking process, see the SUPPLEMENTARY INFORMATION section of this document. FOR FURTHER INFORMATION CONTACT: Katie King, Wireline Competition Bureau at (202...
Unified Simulation and Analysis Framework for Deep Space Navigation Design
NASA Technical Reports Server (NTRS)
Anzalone, Evan; Chuang, Jason; Olsen, Carrie
2013-01-01
As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability; this is needed to analyze the capability of competing navigation systems, to develop system requirements, and to determine the navigation system's effect on the sizing of the integrated vehicle. The development of such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and the possible state measurements and observations that feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions among spacecraft. This paper describes the development of this framework and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.
Electrophysiological models of neural processing.
Nelson, Mark E
2011-01-01
The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing, ranging from sub-millisecond temporal coding in the sound localization circuitry of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons.
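To make the Hodgkin-Huxley-type formulation mentioned above concrete, here is a minimal single-compartment sketch (not drawn from the review itself) using the classic squid-axon constants and simple forward-Euler integration:

```python
import numpy as np

# Minimal Hodgkin-Huxley single-compartment model (classic squid-axon parameters).
# Units: mV, ms, uA/cm^2, mS/cm^2, uF/cm^2.
C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.387

def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)

dt, T = 0.01, 50.0                      # time step and duration (ms)
V, m, h, n = -65.0, 0.05, 0.6, 0.32     # resting initial conditions
I_ext = 10.0                            # constant injected current (uA/cm^2)
trace = []
for _ in np.arange(0.0, T, dt):
    # Ionic currents
    I_Na = g_Na * m**3 * h * (V - E_Na)
    I_K = g_K * n**4 * (V - E_K)
    I_L = g_L * (V - E_L)
    # Membrane voltage and gating-variable updates (forward Euler)
    V += dt * (I_ext - I_Na - I_K - I_L) / C_m
    m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
    h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
    n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
    trace.append(V)
print("peak membrane potential: %.1f mV" % max(trace))
```

With this constant drive the model fires a regular spike train; the same gating-variable structure is what compartmental and network models build upon.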
NASA Astrophysics Data System (ADS)
Rinehart, A. J.; Vivoni, E. R.
2005-12-01
Snow processes play a significant role in the hydrologic cycle of mountainous and high-latitude catchments in the western United States. Snowmelt runoff contributes a large percentage of stream runoff while snow-covered regions remain highly localized to small portions of the catchment area. The appropriate representation of snow dynamics at a given range of spatial and temporal scales is critical for adequately predicting runoff responses in snowmelt-dominated watersheds. In particular, the accurate depiction of snow cover patterns is important, as a range of topographic, land-use and geographic parameters create zones of preferential snow accumulation or ablation that significantly affect the timing of a region's snowmelt and the persistence of a snowpack. In this study, we present the development and testing of a distributed snow model designed for simulations over complex terrain. The snow model is developed within the context of the TIN-based Real-time Integrated Basin Simulator (tRIBS), a fully-distributed watershed model capable of continuous simulations of coupled hydrological processes, including unsaturated-saturated zone dynamics, land-atmosphere interactions and runoff generation via multiple mechanisms. The use of triangulated irregular networks as a domain discretization allows tRIBS to accurately represent topography with a reduced number of computational nodes, as compared to traditional grid-based models. This representation is developed using a Delaunay optimization criterion that causes areas of topographic homogeneity to be represented at larger spatial scales than the original grid, while more heterogeneous areas are represented at higher resolutions. We utilize the TIN-based terrain representation to simulate microscale (10-m to 100-m) snowpack dynamics over a catchment. The model includes processes such as the snowpack energy balance, wind and bulk redistribution, and snow interception by vegetation. For this study, we present tests from a distributed one-layer energy balance model as applied to a northern New Mexico hillslope in a ponderosa pine forest using both synthetic and real meteorological forcing. We also provide tests of the model's capability to represent spatial patterns within a small watershed in the Jemez Mountain region. Finally, we discuss the interaction of the tested snow process module with existing components in the watershed model and additional applications and capabilities under development.
Markstrom, Steven L.
2012-01-01
A software program, called P2S, has been developed which couples the daily stream temperature simulation capabilities of the U.S. Geological Survey Stream Network Temperature model with the watershed hydrology simulation capabilities of the U.S. Geological Survey Precipitation-Runoff Modeling System. The Precipitation-Runoff Modeling System is a modular, deterministic, distributed-parameter, physical-process watershed model that simulates hydrologic response to various combinations of climate and land use. The Stream Network Temperature model was developed to help aquatic biologists and engineers predict the effects of changes in hydrology and energy on water temperatures. P2S will allow scientists and watershed managers to evaluate the effects of historical climate and projected climate change, landscape evolution, and resource management scenarios on watershed hydrology and in-stream water temperature.
Wiemuth, M; Junger, D; Leitritz, M A; Neumann, J; Neumuth, T; Burgert, O
2017-08-01
Medical processes can be modeled using different methods and notations. Currently used modeling systems like Business Process Model and Notation (BPMN) are not capable of describing highly flexible and variable medical processes in sufficient detail. We combined two modeling approaches, Business Process Management (BPM) and Adaptive Case Management (ACM), to be able to model non-deterministic medical processes. We used the new standards Case Management Model and Notation (CMMN) and Decision Model and Notation (DMN). First, we explain how CMMN, DMN and BPMN can be used to model non-deterministic medical processes. We applied this methodology to model 79 cataract operations provided by University Hospital Leipzig, Germany, and four cataract operations provided by University Eye Hospital Tuebingen, Germany. Our model consists of 85 tasks and about 20 decisions in BPMN. We were able to expand the system with more complex situations that might appear during an intervention. Effective modeling of the cataract intervention is possible using the combination of BPM and ACM. The combination makes it possible to depict complex processes with complex decisions, which offers a significant advantage for modeling perioperative processes.
ATLAS, an integrated structural analysis and design system. Volume 1: ATLAS user's guide
NASA Technical Reports Server (NTRS)
Dreisbach, R. L. (Editor)
1979-01-01
Some of the many analytical capabilities provided by the ATLAS Version 4.0 System are described in the logical sequence in which model-definition data are prepared and the subsequent computer job is executed. The example data presented and the fundamental technical considerations that are highlighted can be used as guides during the problem-solving process. This guide does not describe the details of the ATLAS capabilities, but introduces the new user of ATLAS to the level at which the complete array of capabilities described in the ATLAS User's Manual can be exploited fully.
Time Dependent Simulation of Turbopump Flows
NASA Technical Reports Server (NTRS)
Kiris, Cetin C.; Kwak, Dochan; Chan, William; Williams, Robert
2001-01-01
The objective of this viewgraph presentation is to enhance incompressible flow simulation capability for developing aerospace vehicle components, especially unsteady flow phenomena associated with high-speed turbopumps. Unsteady Space Shuttle Main Engine (SSME) rig-1 computations covering 1 1/2 rotations have been completed for the 34.3-million-grid-point model. The moving boundary capability is obtained by using the DCF module. MLP shared-memory parallelism has been implemented and benchmarked in INS3D. A scripting capability from CAD geometry to solution has been developed. Data compression is applied to reduce data size in post-processing, and fluid/structure coupling has been initiated.
Heterogeneous concurrent computing with exportable services
NASA Technical Reports Server (NTRS)
Sunderam, Vaidy
1995-01-01
Heterogeneous concurrent computing, based on the traditional process-oriented model, is approaching its functionality and performance limits. An alternative paradigm, based on the concept of services, supporting data-driven computation, and built on a lightweight process infrastructure, is proposed to enhance the functional capabilities and the operational efficiency of heterogeneous network-based concurrent computing. TPVM is an experimental prototype system supporting exportable services, thread-based computation, and remote memory operations that is built as an extension of and an enhancement to the PVM concurrent computing system. TPVM offers a significantly different computing paradigm for network-based computing, while maintaining a close resemblance to the conventional PVM model in the interest of compatibility and ease of transition. Preliminary experiences have demonstrated that the TPVM framework presents a natural yet powerful concurrent programming interface, while being capable of delivering performance improvements of up to thirty percent.
NSF's Perspective on Space Weather Research for Building Forecasting Capabilities
NASA Astrophysics Data System (ADS)
Bisi, M. M.; Pulkkinen, A. A.; Bisi, M. M.; Pulkkinen, A. A.; Webb, D. F.; Oughton, E. J.; Azeem, S. I.
2017-12-01
Space weather research at the National Science Foundation (NSF) is focused on scientific discovery and on deepening knowledge of the Sun-Geospace system. Maturation of this knowledge base is a requirement for the development of improved space weather forecast models and for the accurate assessment of potential mitigation strategies. Progress in space weather forecasting requires advancing in-depth understanding of the underlying physical processes, developing better instrumentation and measurement techniques, and capturing the advancements in understanding in large-scale physics-based models that span the entire chain of events from the Sun to the Earth. This presentation will provide an overview of current and planned programs pertaining to space weather research at NSF and discuss the recommendations of the Geospace Section portfolio review panel within the context of space weather forecasting capabilities.
The numerical modelling and process simulation for the fault diagnosis of rotary kiln incinerator.
Roh, S D; Kim, S W; Cho, W S
2001-10-01
The numerical modelling and process simulation for the fault diagnosis of a rotary kiln incinerator were accomplished. In the numerical modelling, the two models applied within the kiln are a combustion chamber model, including the mass and energy balance equations for two combustion chambers, and a 3D thermal model. The combustion chamber model predicts temperature within the kiln, flue gas composition, flux and heat of combustion. Using the combustion chamber model and the 3D thermal model, production rules for the process simulation can be obtained through interrelation analysis between control and operation variables. The process simulation of the kiln is operated with these production rules for automatic operation. The process simulation aims to provide fundamental solutions to problems in the incineration process by introducing an online expert control system that provides integrity in process control and management. Knowledge-based expert control systems use symbolic logic and heuristic rules to find solutions for various types of problems. The system was implemented as a hybrid intelligent expert control system by connecting it with the process control systems, providing capabilities for process diagnosis, analysis and control.
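The production-rule idea behind such a knowledge-based expert control loop can be sketched generically; the state variables, thresholds, and corrective actions below are hypothetical placeholders, not the rules derived in the study:

```python
# Minimal production-rule sketch for a knowledge-based process diagnosis loop.
# The state variables, rule conditions, and corrective actions are hypothetical.
rules = [
    (lambda s: s["kiln_temp"] > 1200.0, "reduce fuel feed rate"),
    (lambda s: s["kiln_temp"] < 850.0,  "increase fuel feed rate"),
    (lambda s: s["O2_percent"] < 6.0,   "increase combustion air flow"),
    (lambda s: s["CO_ppm"] > 100.0,     "flag incomplete combustion for operator review"),
]

def diagnose(state):
    """Return the list of recommended actions whose conditions fire on this state."""
    return [action for condition, action in rules if condition(state)]

state = {"kiln_temp": 1250.0, "O2_percent": 5.2, "CO_ppm": 40.0}
print(diagnose(state))
# ['reduce fuel feed rate', 'increase combustion air flow']
```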
2015-06-01
and tools, called model-integrated computing (MIC) [3], relies on the use of domain-specific modeling languages for creating models of the system to be... hence giving reflective capabilities to it. We have followed the MIC method here: we designed a domain-specific modeling language for modeling... are produced one-off and not for the mass market, the scope for price reduction based on market demands is non-existent. Processes to create
Valuation of Capabilities and System Architecture Options to Meet Affordability Requirement
2014-04-30
is an extension of the historic volatility and trend of the stock using Brownian motion. In finance, the Black-Scholes equation is used to value... the underlying asset whose value is modeled as a stochastic process. In finance, the underlying asset is a tradeable stock and the stochastic process
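To make the reference concrete, here is a minimal sketch (with made-up parameter values) of pricing a European call by the Black-Scholes formula and by Monte Carlo simulation of the underlying geometric Brownian motion:

```python
import math
import numpy as np

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S0, K, r, sigma, T):
    """Closed-form Black-Scholes price of a European call option."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S0 * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def monte_carlo_call(S0, K, r, sigma, T, n_paths=200_000, seed=0):
    """Price the same call by simulating terminal values of geometric Brownian motion."""
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * Z)
    payoff = np.maximum(ST - K, 0.0)
    return math.exp(-r * T) * payoff.mean()

# Hypothetical inputs: spot 100, strike 105, 2% rate, 25% volatility, 1-year maturity.
print(black_scholes_call(100, 105, 0.02, 0.25, 1.0))
print(monte_carlo_call(100, 105, 0.02, 0.25, 1.0))
```

The two prices agree to within Monte Carlo sampling error, which is the usual sanity check before applying the stochastic-process machinery to less tractable valuation problems.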
Meeting the needs of an ever-demanding market.
Rigby, Richard
2002-04-01
Balancing cost and performance in packaging is critical. This article outlines techniques to assist in this whilst delivering added value and product differentiation. The techniques include a rigorous statistical process capable of delivering cost reduction and improved quality and a computer modelling process that can save time when validating new packaging options.
ERIC Educational Resources Information Center
Matto, Holly C.; Hadjiyane, Maria C.; Kost, Michelle; Marshall, Jennifer; Wiley, Joseph; Strolin-Goltzman, Jessica; Khatiwada, Manish; VanMeter, John W.
2014-01-01
Objectives: Empirical evidence suggests substance dependence creates stress system dysregulation which, in turn, may limit the efficacy of verbal-based treatment interventions, as the recovering brain may not be functionally capable of executive level processing. Treatment models that target implicit functioning are necessary. Methods: An RCT was…
Deep Learning towards Expertise Development in a Visualization-Based Learning Environment
ERIC Educational Resources Information Center
Yuan, Bei; Wang, Minhong; Kushniruk, Andre W.; Peng, Jun
2017-01-01
With limited problem-solving capability and practical experience, novices have difficulties developing expert-like performance. It is important to make the complex problem-solving process visible to learners and provide them with necessary help throughout the process. This study explores the design and effects of a model-based learning approach…
High-fidelity simulation capability for virtual testing of seismic and acoustic sensors
NASA Astrophysics Data System (ADS)
Wilson, D. Keith; Moran, Mark L.; Ketcham, Stephen A.; Lacombe, James; Anderson, Thomas S.; Symons, Neill P.; Aldridge, David F.; Marlin, David H.; Collier, Sandra L.; Ostashev, Vladimir E.
2005-05-01
This paper describes development and application of a high-fidelity, seismic/acoustic simulation capability for battlefield sensors. The purpose is to provide simulated sensor data so realistic that they cannot be distinguished by experts from actual field data. This emerging capability provides rapid, low-cost trade studies of unattended ground sensor network configurations, data processing and fusion strategies, and signatures emitted by prototype vehicles. There are three essential components to the modeling: (1) detailed mechanical signature models for vehicles and walkers, (2) high-resolution characterization of the subsurface and atmospheric environments, and (3) state-of-the-art seismic/acoustic models for propagating moving-vehicle signatures through realistic, complex environments. With regard to the first of these components, dynamic models of wheeled and tracked vehicles have been developed to generate ground force inputs to seismic propagation models. Vehicle models range from simple, 2D representations to highly detailed, 3D representations of entire linked-track suspension systems. Similarly detailed models of acoustic emissions from vehicle engines are under development. The propagation calculations for both the seismics and acoustics are based on finite-difference, time-domain (FDTD) methodologies capable of handling complex environmental features such as heterogeneous geologies, urban structures, surface vegetation, and dynamic atmospheric turbulence. Any number of dynamic sources and virtual sensors may be incorporated into the FDTD model. The computational demands of 3D FDTD simulation over tactical distances require massively parallel computers. Several example calculations of seismic/acoustic wave propagation through complex atmospheric and terrain environments are shown.
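A finite-difference time-domain propagation scheme of the kind described can be illustrated in one dimension; the grid size, wave speed, and source used below are arbitrary, and the production codes referenced are 3-D and massively parallel:

```python
import numpy as np

# 1-D acoustic wave equation u_tt = c^2 u_xx solved with a second-order FDTD scheme.
nx, nt = 400, 800
dx, c = 1.0, 340.0                 # grid spacing (m) and wave speed (m/s)
dt = 0.5 * dx / c                  # time step satisfying the CFL stability limit
u_prev = np.zeros(nx)              # field at time step n-1
u_curr = np.zeros(nx)              # field at time step n
coeff = (c * dt / dx) ** 2

for n in range(nt):
    u_next = np.zeros(nx)
    # Interior update: standard three-point second difference in space.
    u_next[1:-1] = (2 * u_curr[1:-1] - u_prev[1:-1]
                    + coeff * (u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]))
    # Point source: a damped sinusoid injected near the left boundary.
    u_next[50] += np.exp(-0.002 * n) * np.sin(2 * np.pi * 25.0 * n * dt)
    u_prev, u_curr = u_curr, u_next

print("max field amplitude at final step: %.3e" % np.abs(u_curr).max())
```

Extending this stencil to 3-D, heterogeneous media, and many moving sources is what drives the computational demands and the need for massively parallel computers noted above.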
Improving measurement technology for the design of sustainable cities
NASA Astrophysics Data System (ADS)
Pardyjak, Eric R.; Stoll, Rob
2017-09-01
This review identifies and discusses measurement technology gaps that are currently preventing major science leaps from being realized in the study of urban environmental transport processes. These scientific advances are necessary to better understand the links between atmospheric transport processes in the urban environment, human activities, and potential management strategies. We propose that with various improved and targeted measurements, it will be possible to provide technically sound guidance to policy and decision makers for the design of sustainable cities. This review focuses on full-scale in situ and remotely sensed measurements of atmospheric winds, temperature, and humidity in cities and links measurements to current modeling and simulation needs. A key conclusion of this review is that there is a need for urban-specific measurement techniques including measurements of highly-resolved three-dimensional fields at sampling frequencies high enough to capture small-scale turbulence processes yet also capable of covering spatial extents large enough to simultaneously capture key features of urban heterogeneity and boundary layer processes while also supporting the validation of current and emerging modeling capabilities.
Business Models for Cost Sharing & Capability Sustainment
2012-08-18
digital technology into existing mechanical products and their supporting processes can only work correctly if the firm carrying it out changes its entire...
Meeting Capability Goals through Effective Modelling and Experimentation of C4ISTAR Options
2011-06-01
Key facts: 12 industry partners drawn from the major defence providers; ~80 associate members made up of small and medium sized... in the emergence of a number of effective monopolies. The UK Defence marketplace has become too small and the major equipment 'replacement' cycles too... (Figure 3: Environment for Capability Trading.) The environment is aligned with the MOD's strategy for Enterprise Architecture.
Rapid Prototyping: State of the Art Review
2003-10-23
Materials include steel (H13 tool steel), CP Ti and Ti-6Al-4V titanium, tungsten, copper, aluminum, and nickel... The company's LENS 750 and LENS 850 machines (both $440,000 to $640,000) are capable of producing parts in 16 stainless steel, H13 tool steel... machining. The Arcam EBM S12 model sells for $500,000 and is capable of processing two materials. One is H13 tool steel, while the other
Improving the Aircraft Design Process Using Web-Based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)
2000-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Improving the Aircraft Design Process Using Web-based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.
2003-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Using Petri Net Tools to Study Properties and Dynamics of Biological Systems
Peleg, Mor; Rubin, Daniel; Altman, Russ B.
2005-01-01
Petri Nets (PNs) and their extensions are promising methods for modeling and simulating biological systems. We surveyed PN formalisms and tools and compared them based on their mathematical capabilities as well as by their appropriateness to represent typical biological processes. We measured the ability of these tools to model specific features of biological systems and answer a set of biological questions that we defined. We found that different tools are required to provide all capabilities that we assessed. We created software to translate a generic PN model into most of the formalisms and tools discussed. We have also made available three models and suggest that a library of such models would catalyze progress in qualitative modeling via PNs. Development and wide adoption of common formats would enable researchers to share models and use different tools to analyze them without the need to convert to proprietary formats. PMID:15561791
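A minimal token-game sketch of a place/transition Petri net (not tied to any of the surveyed tools; the places, transitions, and initial marking are illustrative only):

```python
# Minimal place/transition Petri net: firing rule and a short simulation.
marking = {"substrate": 2, "enzyme": 1, "complex": 0, "product": 0}

transitions = {
    "bind":    ({"substrate": 1, "enzyme": 1}, {"complex": 1}),
    "release": ({"complex": 1},                {"enzyme": 1, "product": 1}),
}

def enabled(name, m):
    """A transition is enabled when every input place holds enough tokens."""
    pre, _ = transitions[name]
    return all(m[p] >= w for p, w in pre.items())

def fire(name, m):
    """Consume tokens from input places and produce tokens in output places."""
    pre, post = transitions[name]
    for p, w in pre.items():
        m[p] -= w
    for p, w in post.items():
        m[p] = m.get(p, 0) + w

# Fire any enabled transition until the net is dead.
while True:
    candidates = [t for t in transitions if enabled(t, marking)]
    if not candidates:
        break
    fire(candidates[0], marking)
print(marking)   # all substrate eventually converted to product
```

Qualitative questions such as reachability, deadlock, and invariants are posed against exactly this kind of marking evolution; the surveyed formalisms add timing, stochastic rates, or continuous quantities on top of it.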
The mathematical modeling of rapid solidification processing. Ph.D. Thesis. Final Report
NASA Technical Reports Server (NTRS)
Gutierrez-Miravete, E.
1986-01-01
The detailed formulation of and the results obtained from a continuum mechanics-based mathematical model of the planar flow melt spinning (PFMS) rapid solidification system are presented and discussed. The numerical algorithm proposed is capable of computing the cooling and freezing rates as well as the fluid flow and capillary phenomena which take place inside the molten puddle formed in the PFMS process. The FORTRAN listings of some of the most useful computer programs and a collection of appendices describing the basic equations used for the modeling are included.
Eigensystem realization algorithm user's guide for VAX/VMS computers: Version 931216
NASA Technical Reports Server (NTRS)
Pappa, Richard S.
1994-01-01
The eigensystem realization algorithm (ERA) is a multiple-input, multiple-output, time domain technique for structural modal identification and minimum-order system realization. Modal identification is the process of calculating structural eigenvalues and eigenvectors (natural vibration frequencies, damping, mode shapes, and modal masses) from experimental data. System realization is the process of constructing state-space dynamic models for modern control design. This user's guide documents VAX/VMS-based FORTRAN software developed by the author since 1984 in conjunction with many applications. It consists of a main ERA program and 66 pre- and post-processors. The software provides complete modal identification capabilities and most system realization capabilities.
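A compact numerical sketch of the core ERA computation (Hankel matrices from impulse-response Markov parameters, SVD, realized state-space matrices) on a synthetic single-input/single-output system; this is an illustration of the algorithm, not the NASA software:

```python
import numpy as np

# Hypothetical 2-state discrete-time system used to generate Markov parameters.
A_true = np.array([[0.9, 0.2], [-0.2, 0.9]])
B_true = np.array([[1.0], [0.0]])
C_true = np.array([[1.0, 0.0]])

N = 60
h = np.array([(C_true @ np.linalg.matrix_power(A_true, k) @ B_true).item()
              for k in range(N)])                  # h[k] = C A^k B (impulse response)

r = 20                                             # Hankel block dimension
H0 = np.array([[h[i + j]     for j in range(r)] for i in range(r)])
H1 = np.array([[h[i + j + 1] for j in range(r)] for i in range(r)])

U, s, Vt = np.linalg.svd(H0)
n = 2                                              # model order chosen from singular values
Un, Vn = U[:, :n], Vt[:n, :].T
S_half = np.diag(np.sqrt(s[:n]))
S_half_inv = np.diag(1.0 / np.sqrt(s[:n]))

A_hat = S_half_inv @ Un.T @ H1 @ Vn @ S_half_inv   # realized state matrix
B_hat = (S_half @ Vn.T)[:, :1]                     # first input column
C_hat = (Un @ S_half)[:1, :]                       # first output row

h_hat = [(C_hat @ np.linalg.matrix_power(A_hat, k) @ B_hat).item() for k in range(N)]
print("max impulse-response error:", np.max(np.abs(h - np.array(h_hat))))
```

The eigenvalues of the realized A matrix give the identified frequencies and damping; the same construction extends to multiple inputs and outputs by using block Hankel matrices.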
Materials science and engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lesuer, D.R.
1997-02-01
During FY-96, work within the Materials Science and Engineering Thrust Area was focused on material modeling. Our motivation for this work is to develop the capability to study the structural response of materials as well as material processing. These capabilities have been applied to a broad range of problems, in support of many programs at Lawrence Livermore National Laboratory. These studies are described in (1) Strength and Fracture Toughness of Material Interfaces; (2) Damage Evolution in Fiber Composite Materials; (3) Flashlamp Envelope Optical Properties and Failure Analysis; (4) Synthesis and Processing of Nanocrystalline Hydroxyapatite; and (5) Room Temperature Creep Compliance of Bulk Kel-F.
Frequency domain laser velocimeter signal processor
NASA Technical Reports Server (NTRS)
Meyers, James F.; Murphy, R. Jay
1991-01-01
A new scheme for processing signals from laser velocimeter systems is described. The technique utilizes the capabilities of advanced digital electronics to yield a signal processor capable of operating in the frequency domain maximizing the information obtainable from each signal burst. This allows a sophisticated approach to signal detection and processing, with a more accurate measurement of the chirp frequency resulting in an eight-fold increase in measurable signals over the present high-speed burst counter technology. Further, the required signal-to-noise ratio is reduced by a factor of 32, allowing measurements within boundary layers of wind tunnel models. Measurement accuracy is also increased up to a factor of five.
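The frequency-domain idea can be sketched with a synthetic Doppler burst: estimate the burst frequency from the peak of the FFT magnitude spectrum. The sample rate, burst frequency, and noise level below are illustrative, not those of the NASA processor:

```python
import numpy as np

# Frequency-domain estimation of a noisy laser-velocimeter burst frequency.
fs = 100e6                 # sample rate (Hz), illustrative
f_true = 12.5e6            # true Doppler burst frequency (Hz)
n = 256
t = np.arange(n) / fs

# Synthetic burst: Gaussian-windowed sinusoid plus additive noise (low SNR).
envelope = np.exp(-((t - t[n // 2]) / (n / 6 / fs)) ** 2)
rng = np.random.default_rng(1)
signal = envelope * np.cos(2 * np.pi * f_true * t) + 0.5 * rng.standard_normal(n)

# Estimate frequency from the peak of the zero-padded, windowed FFT magnitude spectrum.
spectrum = np.abs(np.fft.rfft(signal * np.hanning(n), n=4 * n))
freqs = np.fft.rfftfreq(4 * n, d=1.0 / fs)
f_est = freqs[np.argmax(spectrum)]
print("estimated burst frequency: %.2f MHz" % (f_est / 1e6))
```

Working on the whole spectrum rather than on zero crossings is what lets a frequency-domain processor keep estimating reliably as the signal-to-noise ratio drops.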
NASA Astrophysics Data System (ADS)
Toepfer, F.; Cortinas, J. V., Jr.; Kuo, W.; Tallapragada, V.; Stajner, I.; Nance, L. B.; Kelleher, K. E.; Firl, G.; Bernardet, L.
2017-12-01
NOAA develops, operates, and maintains an operational global modeling capability for weather, subseasonal and seasonal prediction for the protection of life and property and fostering the US economy. In order to substantially improve the overall performance and accelerate advancements of the operational modeling suite, NOAA is partnering with NCAR to design and build the Global Modeling Test Bed (GMTB). The GMTB has been established to provide a platform and a capability for researchers to contribute to these advancements, primarily through the development of physical parameterizations needed to improve operational NWP. The strategy to achieve this goal relies on effectively leveraging global expertise through a modern collaborative software development framework. This framework consists of a repository of vetted and supported physical parameterizations known as the Common Community Physics Package (CCPP), a common well-documented interface known as the Interoperable Physics Driver (IPD) for combining schemes into suites and for their configuration and connection to dynamic cores, and an open evidence-based governance process for managing the development and evolution of the CCPP. In addition, a physics test harness designed to work within this framework has been established in order to facilitate easier like-to-like comparison of physics advancements. This paper will present an overview of the design of the CCPP and test platform. Additionally, an overview of potential new opportunities for physics developers to engage in the process, from implementing code for CCPP/IPD compliance to testing their development within an operational-like software environment, will be presented. In addition, insight will be given as to how development gets elevated to CCPP-supported status, the precursor to broad availability and use within operational NWP. An overview of how the GMTB can be expanded to support other global or regional modeling capabilities will also be presented.
The Coastal Ocean Prediction Systems program: Understanding and managing our coastal ocean
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eden, H.F.; Mooers, C.N.K.
1990-06-01
The goal of COPS is to couple a program of regular observations to numerical models, through techniques of data assimilation, in order to provide a predictive capability for the US coastal ocean including the Great Lakes, estuaries, and the entire Exclusive Economic Zone (EEZ). The objectives of the program include: determining the predictability of the coastal ocean and the processes that govern the predictability; developing efficient prediction systems for the coastal ocean based on the assimilation of real-time observations into numerical models; and coupling the predictive systems for the physical behavior of the coastal ocean to predictive systems for biological, chemical, and geological processes to achieve an interdisciplinary capability. COPS will provide the basis for effective monitoring and prediction of coastal ocean conditions by optimizing the use of increased scientific understanding, improved observations, advanced computer models, and computer graphics to make the best possible estimates of sea level, currents, temperatures, salinities, and other properties of entire coastal regions.
Airborne Detection and Tracking of Geologic Leakage Sites
NASA Astrophysics Data System (ADS)
Jacob, Jamey; Allamraju, Rakshit; Axelrod, Allan; Brown, Calvin; Chowdhary, Girish; Mitchell, Taylor
2014-11-01
Safe storage of CO2 to reduce greenhouse gas emissions without adversely affecting energy use or hindering economic growth requires development of monitoring technology that is capable of validating storage permanence while ensuring the integrity of sequestration operations. Soil gas monitoring has difficulty accurately distinguishing gas flux signals related to leakage from those associated with meteorologically driven changes of soil moisture and temperature. Integrated ground and airborne monitoring systems are being deployed capable of directly detecting CO2 concentration in storage sites. Two complementary approaches to detecting leaks in carbon sequestration fields are presented. The first approach focuses on reducing the requisite network communication for fusing individual Gaussian Process (GP) CO2 sensing models into a global GP CO2 model. The GP fusion approach learns how to optimally allocate the static and mobile sensors. The second approach leverages a hierarchical GP-Sigmoidal Gaussian Cox Process for airborne predictive mission planning, optimally reducing the entropy of the global CO2 model. Results from the approaches will be presented.
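A minimal Gaussian process regression sketch of the kind underlying such CO2 sensing models, using scikit-learn; the sensor locations, leak position, and readings are synthetic stand-ins:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic CO2 concentration readings (ppm) at scattered sensor locations (km).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 5.0, size=(40, 2))                  # sensor (x, y) positions
leak = np.array([2.5, 2.5])                              # hypothetical leak location
y = (400.0 + 80.0 * np.exp(-np.sum((X - leak) ** 2, axis=1))
     + rng.normal(0.0, 2.0, size=40))                    # background + plume + noise

# Fit a GP with a squared-exponential kernel plus a white-noise term.
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=4.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Predict mean and uncertainty on a grid; high posterior variance marks where an
# additional (static or mobile) sensor observation would be most informative.
grid = np.array([[x, yy] for x in np.linspace(0, 5, 25) for yy in np.linspace(0, 5, 25)])
mean, std = gp.predict(grid, return_std=True)
print("most uncertain grid point:", grid[np.argmax(std)])
```

The posterior variance field is the quantity that entropy-reduction mission planning works against: new measurements are directed toward the regions the fused model knows least about.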
NASA Astrophysics Data System (ADS)
Zhang, Zhongyang; Nian, Qiong; Doumanidis, Charalabos C.; Liao, Yiliang
2018-02-01
Nanosecond pulsed laser shock processing (LSP) techniques, including laser shock peening, laser peen forming, and laser shock imprinting, have been employed for widespread industrial applications. In these processes, the main beneficial characteristic is the laser-induced shockwave with a high pressure (on the order of GPa), which leads to plastic deformation at an ultrahigh strain rate (10^5-10^6/s) on the surface of target materials. Although LSP processes have been extensively studied by experiments, few efforts have been devoted to elucidating the underlying process mechanisms through developing a physics-based process model. In particular, development of a first-principles model is critical for process optimization and novel process design. This work aims at introducing such a theoretical model for a fundamental understanding of process mechanisms in LSP. Emphasis is placed on the laser-matter interaction and plasma dynamics. This model is found to offer capabilities in predicting key parameters including electron and ion temperatures, plasma state variables (temperature, density, and pressure), and the propagation of the laser shockwave. The modeling results were validated by experimental data.
Lin, I-Chun; Xing, Dajun; Shapley, Robert
2014-01-01
One of the reasons the visual cortex has attracted the interest of computational neuroscience is that it has well-defined inputs. The lateral geniculate nucleus (LGN) of the thalamus is the source of visual signals to the primary visual cortex (V1). Most large-scale cortical network models approximate the spike trains of LGN neurons as simple Poisson point processes. However, many studies have shown that neurons in the early visual pathway are capable of spiking with high temporal precision and their discharges are not Poisson-like. To gain an understanding of how response variability in the LGN influences the behavior of V1, we study response properties of model V1 neurons that receive purely feedforward inputs from LGN cells modeled either as noisy leaky integrate-and-fire (NLIF) neurons or as inhomogeneous Poisson processes. We first demonstrate that the NLIF model is capable of reproducing many experimentally observed statistical properties of LGN neurons. Then we show that a V1 model in which the LGN input to a V1 neuron is modeled as a group of NLIF neurons produces higher orientation selectivity than the one with Poisson LGN input. The second result implies that statistical characteristics of LGN spike trains are important for V1's function. We conclude that physiologically motivated models of V1 need to include more realistic LGN spike trains that are less noisy than inhomogeneous Poisson processes. PMID:22684587
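A minimal sketch contrasting the two LGN input models discussed: a noisy leaky integrate-and-fire (NLIF) neuron and an inhomogeneous Poisson process driven by the same time-varying rate. All parameters are illustrative, not those fitted in the study:

```python
import numpy as np

rng = np.random.default_rng(0)
dt, T = 0.1e-3, 1.0                      # time step (s) and duration (s)
steps = int(T / dt)
rate = 40.0 + 30.0 * np.sin(2 * np.pi * 4.0 * np.arange(steps) * dt)  # Hz, time-varying

# (1) Inhomogeneous Poisson spike train: Bernoulli draw per small time bin.
poisson_spikes = rng.random(steps) < rate * dt

# (2) Noisy leaky integrate-and-fire neuron driven by a current proportional
#     to the same rate, plus white noise; spike and reset at threshold.
tau, V_rest, V_th, V_reset = 20e-3, -70e-3, -54e-3, -70e-3
V = V_rest
nlif_spikes = np.zeros(steps, dtype=bool)
for i in range(steps):
    drive = 20e-3 * rate[i] / 70.0       # input drive scaled from the rate (volts)
    noise = 2e-3 * np.sqrt(dt / tau) * rng.standard_normal()
    V += dt / tau * (-(V - V_rest) + drive) + noise
    if V >= V_th:
        nlif_spikes[i] = True
        V = V_reset

for name, s in [("Poisson", poisson_spikes), ("NLIF", nlif_spikes)]:
    counts = s.reshape(10, -1).sum(axis=1)           # spike counts in 100-ms windows
    print(name, "Fano factor:", counts.var() / counts.mean())
```

Comparing the count statistics of the two trains illustrates the point of the abstract: integrate-and-fire spiking is more regular than Poisson spiking, and that difference in input variability matters for the orientation selectivity of the downstream V1 model.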
Domínguez-Tello, Antonio; Arias-Borrego, Ana; García-Barrera, Tamara; Gómez-Ariza, José Luis
2017-10-01
The trihalomethanes (TTHMs) and other disinfection by-products (DBPs) are formed in drinking water by the reaction of chlorine with organic precursors contained in the source water, in two consecutive and linked stages: the first at the treatment plant, and the second along the distribution system (DS) by reaction of residual chlorine with organic precursors not removed. Following this approach, this study aimed at developing a two-stage empirical model for predicting the formation of TTHMs in the water treatment plant and subsequently their evolution along the water distribution system (WDS). The aim of the two-stage model was to improve the predictive capability for a wide range of scenarios of water treatments and distribution systems. The two-stage model was developed using multiple regression analysis from a database (January 2007 to July 2012) covering three different treatment processes (conventional and advanced) in the water supply system of the Aljaraque area (southwest of Spain). The new model was then validated using a recent database from the same water supply system (January 2011 to May 2015). The validation results indicated no significant difference between the predicted and observed values of TTHM (R^2 0.874, analytical variance <17%). The new model was applied to three different supply systems with different treatment processes and different characteristics. Acceptable predictions were obtained in the three distribution systems studied, proving the adaptability of the new model to the boundary conditions. Finally, the predictive capability of the new model was compared with 17 other models selected from the literature, showing satisfactory prediction results and excellent adaptability to treatment processes.
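A minimal sketch of the two-stage regression idea on synthetic data: one linear model for TTHM formed at the plant, a second for the additional formation along the distribution system. The predictors and generating coefficients are invented for illustration, not the fitted model:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 300
# Synthetic predictors: TOC (mg/L), chlorine dose (mg/L), temperature (C),
# residence time in the distribution system (h), residual chlorine (mg/L).
toc, dose, temp = rng.uniform(1, 5, n), rng.uniform(0.5, 3, n), rng.uniform(5, 30, n)
res_time, residual = rng.uniform(1, 48, n), rng.uniform(0.1, 1.5, n)

# Stage 1: TTHM leaving the treatment plant (invented generating model + noise).
tthm_plant = 5 + 8 * toc + 6 * dose + 0.8 * temp + rng.normal(0, 3, n)
# Stage 2: additional formation along the distribution system.
tthm_tap = tthm_plant + 0.6 * res_time * residual + 0.3 * temp + rng.normal(0, 3, n)

stage1 = LinearRegression().fit(np.column_stack([toc, dose, temp]), tthm_plant)
plant_pred = stage1.predict(np.column_stack([toc, dose, temp]))

stage2 = LinearRegression().fit(
    np.column_stack([plant_pred, res_time * residual, temp]), tthm_tap)
print("stage-1 R^2: %.3f" % stage1.score(np.column_stack([toc, dose, temp]), tthm_plant))
print("stage-2 R^2: %.3f" % stage2.score(
    np.column_stack([plant_pred, res_time * residual, temp]), tthm_tap))
```

Chaining the plant-level prediction into the distribution-system regression is what lets a single calibrated model adapt to plants with different treatments and networks with different residence times.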
NASA Astrophysics Data System (ADS)
Johnston, J. M.
2013-12-01
Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and land-use change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework 'iemWatersheds' has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil and Water Assessment Tool (SWAT) predicts surface water and sediment runoff and associated contaminants; the Watershed Mercury Model (WMM) predicts mercury runoff and loading to streams; the Water Quality Analysis Simulation Program (WASP) predicts water quality within the stream channel; the Habitat Suitability Index (HSI) model scores physicochemical habitat quality for individual fish species; and the Bioaccumulation and Aquatic System Simulator (BASS) predicts fish growth, population dynamics and bioaccumulation of toxic substances. The capability of the Framework to address cumulative impacts will be demonstrated for freshwater ecosystem services and mountaintop mining.
Tschechne, Stephan; Neumann, Heiko
2014-01-01
Visual structures in the environment are segmented into image regions and those combined to a representation of surfaces and prototypical objects. Such a perceptual organization is performed by complex neural mechanisms in the visual cortex of primates. Multiple mutually connected areas in the ventral cortical pathway receive visual input and extract local form features that are subsequently grouped into increasingly complex, more meaningful image elements. Such a distributed network of processing must be capable to make accessible highly articulated changes in shape boundary as well as very subtle curvature changes that contribute to the perception of an object. We propose a recurrent computational network architecture that utilizes hierarchical distributed representations of shape features to encode surface and object boundary over different scales of resolution. Our model makes use of neural mechanisms that model the processing capabilities of early and intermediate stages in visual cortex, namely areas V1–V4 and IT. We suggest that multiple specialized component representations interact by feedforward hierarchical processing that is combined with feedback signals driven by representations generated at higher stages. Based on this, global configurational as well as local information is made available to distinguish changes in the object's contour. Once the outline of a shape has been established, contextual contour configurations are used to assign border ownership directions and thus achieve segregation of figure and ground. The model, thus, proposes how separate mechanisms contribute to distributed hierarchical cortical shape representation and combine with processes of figure-ground segregation. Our model is probed with a selection of stimuli to illustrate processing results at different processing stages. We especially highlight how modulatory feedback connections contribute to the processing of visual input at various stages in the processing hierarchy. PMID:25157228
Constraints, Approach, and Status of Mars Surveyor 2001 Landing Site Selection
NASA Technical Reports Server (NTRS)
Golombek, M.; Bridges, N.; Briggs, G.; Gilmore, M.; Haldemann, A.; Parker, T.; Saunders, R.; Spencer, D.; Smith, J.; Soderblom, L.
1999-01-01
There are many similarities between the Mars Surveyor '01 (MS '01) landing site selection process and that of Mars Pathfinder. The selection process includes two parallel activities in which engineers define and refine the capabilities of the spacecraft through design, testing and modeling and scientists define a set of landing site constraints based on the spacecraft design and landing scenario. As for Pathfinder, the safety of the site is without question the single most important factor, for the simple reason that failure to land safely yields no science and exposes the mission and program to considerable risk. The selection process must be thorough and defensible and capable of surviving multiple withering reviews similar to the Pathfinder decision. On Pathfinder, this was accomplished by attempting to understand the surface properties of sites using available remote sensing data sets and models based on them. Science objectives are factored into the selection process only after the safety of the site is validated. Finally, as for Pathfinder, the selection process is being done in an open environment with multiple opportunities for community involvement including open workshops, with education and outreach opportunities. Additional information is contained in the original extended abstract.
Quality and noise measurements in mobile phone video capture
NASA Astrophysics Data System (ADS)
Petrescu, Doina; Pincenti, John
2011-02-01
The quality of videos captured with mobile phones has become increasingly important particularly since resolutions and formats have reached a level that rivals the capabilities available in the digital camcorder market, and since many mobile phones now allow direct playback on large HDTVs. The video quality is determined by the combined quality of the individual parts of the imaging system including the image sensor, the digital color processing, and the video compression, each of which has been studied independently. In this work, we study the combined effect of these elements on the overall video quality. We do this by evaluating the capture under various lighting, color processing, and video compression conditions. First, we measure full reference quality metrics between encoder input and the reconstructed sequence, where the encoder input changes with light and color processing modifications. Second, we introduce a system model which includes all elements that affect video quality, including a low light additive noise model, ISP color processing, as well as the video encoder. Our experiments show that in low light conditions and for certain choices of color processing the system level visual quality may not improve when the encoder becomes more capable or the compression ratio is reduced.
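A full-reference metric like the one applied between encoder input and reconstructed frames can be sketched as a per-frame PSNR computation; the frames here are random arrays standing in for real video:

```python
import numpy as np

def psnr(reference, reconstructed, max_value=255.0):
    """Peak signal-to-noise ratio (dB) between two frames of equal size."""
    mse = np.mean((reference.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_value ** 2 / mse)

# Stand-in frames: a reference frame and a "compressed" version with added error.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(720, 1280), dtype=np.uint8)
rec = np.clip(ref.astype(np.int16) + rng.integers(-4, 5, size=ref.shape), 0, 255)
print("frame PSNR: %.2f dB" % psnr(ref, rec))
```

Averaging such per-frame scores over a capture lets the effects of lighting, color processing, and compression settings on the end-to-end pipeline be compared on a single scale.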
Final Report for "Design calculations for high-space-charge beam-to-RF conversion".
DOE Office of Scientific and Technical Information (OSTI.GOV)
David N Smithe
2008-10-17
Accelerator facility upgrades, new accelerator applications, and future design efforts are leading to novel klystron and IOT device concepts, including multiple beam, high-order mode operation, and new geometry configurations of old concepts. At the same time, a new simulation capability, based upon finite-difference “cut-cell” boundaries, has emerged and is transforming the existing modeling and design capability with unparalleled realism, greater flexibility, and improved accuracy. This same new technology can also be brought to bear on a difficult-to-study aspect of the energy recovery linac (ERL), namely the accurate modeling of the exit beam, and design of the beam dump for optimum energy efficiency. We have developed new capability for design calculations and modeling of a broad class of devices which convert bunched beam kinetic energy to RF energy, including RF sources such as klystrons, gyro-klystrons, IOTs, TWTs, and other devices in which space-charge effects are important. Recent advances in geometry representation now permit very accurate representation of the curved metallic surfaces common to RF sources, resulting in unprecedented simulation accuracy. In the Phase I work, we evaluated and demonstrated the capabilities of the new geometry representation technology as applied to modeling and design of output cavity components of klystrons, IOTs, and energy recovery SRF cavities. We identified and prioritized which aspects of the design study process to pursue and improve in Phase II. The development and use of the new accurate geometry modeling technology on RF sources for DOE accelerators will help spark a new generation of modeling and design capability, free from many of the constraints and inaccuracy associated with the previous generation of “stair-step” geometry modeling tools. This new capability is ultimately expected to impact all fields with high power RF sources, including DOE fusion research, communications, radar and other defense applications.
Model of Fluidized Bed Containing Reacting Solids and Gases
NASA Technical Reports Server (NTRS)
Bellan, Josette; Lathouwers, Danny
2003-01-01
A mathematical model has been developed for describing the thermofluid dynamics of a dense, chemically reacting mixture of solid particles and gases. As used here, "dense" signifies having a large volume fraction of particles, as for example in a bubbling fluidized bed. The model is intended especially for application to fluidized beds that contain mixtures of carrier gases, biomass undergoing pyrolysis, and sand. So far, the design of fluidized beds and other gas/solid industrial processing equipment has been based on empirical correlations derived from laboratory- and pilot-scale units. The present mathematical model is a product of continuing efforts to develop a computational capability for optimizing the designs of fluidized beds and related equipment on the basis of first principles. Such a capability could eliminate the need for expensive, time-consuming predesign testing.
NASA Technical Reports Server (NTRS)
Williams-Byrd, Julie; Arney, Dale C.; Hay, Jason; Reeves, John D.; Craig, Douglas
2016-01-01
NASA is transforming human spaceflight. The Agency is shifting from an exploration-based program with human activities in low Earth orbit (LEO) and targeted robotic missions in deep space to a more sustainable and integrated pioneering approach. Through pioneering, NASA seeks to address national goals to develop the capacity for people to work, learn, operate, live, and thrive safely beyond Earth for extended periods of time. However, pioneering space involves daunting technical challenges of transportation, maintaining health, and enabling crew productivity for long durations in remote, hostile, and alien environments. Prudent investments in capability and technology developments, based on mission need, are critical for enabling a campaign of human exploration missions. There are a wide variety of capabilities and technologies that could enable these missions, so it is a major challenge for NASA's Human Exploration and Operations Mission Directorate (HEOMD) to make knowledgeable portfolio decisions. It is critical for this pioneering initiative that these investment decisions are informed with a prioritization process that is robust and defensible. It is NASA's role to invest in targeted technologies and capabilities that would enable exploration missions even though specific requirements have not been identified. To inform these investment decisions, NASA's HEOMD has supported a variety of analysis activities that prioritize capabilities and technologies. These activities are often based on input from subject matter experts within the NASA community who understand the technical challenges of enabling human exploration missions. This paper will review a variety of processes and methods that NASA has used to prioritize and rank capabilities and technologies applicable to human space exploration. The paper will show the similarities in the various processes and showcase instances where customer-specified priorities force modifications to the process. Specifically, this paper will describe the processes that the NASA Langley Research Center (LaRC) Technology Assessment and Integration Team (TAIT) has used for several years and how those processes have been customized to meet customer needs while staying robust and defensible. This paper will show how HEOMD uses these analysis results to assist with making informed portfolio investment decisions. The paper will also highlight which human exploration capabilities and technologies typically rank high regardless of the specific design reference mission. The paper will conclude by describing future capability and technology ranking activities that will continue to leverage subject matter expert (SME) input while also incorporating more model-based analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlson, Neil; Jibben, Zechariah; Brady, Peter
2017-06-28
Pececillo is a proxy-app for the open source Truchas metal processing code (LA-CC-15-097). It implements many of the physics models used in Truchas: free-surface, incompressible Navier-Stokes fluid dynamics (e.g., water waves); heat transport, material phase change, and view factor thermal radiation; species advection-diffusion; quasi-static, elastic/plastic solid mechanics with contact; and electromagnetics (Maxwell's equations). The models are simplified versions that retain the fundamental computational complexity of the Truchas models while omitting many non-essential features and modeling capabilities. The purpose is to expose Truchas algorithms in a greatly simplified context where computer science problems related to parallel performance on advanced architectures can be more easily investigated. While Pececillo is capable of performing simulations representative of typical Truchas metal casting, welding, and additive manufacturing simulations, it lacks many of the modeling capabilities needed for real applications.
A General Multivariate Latent Growth Model with Applications to Student Achievement
ERIC Educational Resources Information Center
Bianconcini, Silvia; Cagnone, Silvia
2012-01-01
The evaluation of the formative process in the University system has been assuming an ever increasing importance in the European countries. Within this context, the analysis of student performance and capabilities plays a fundamental role. In this work, the authors propose a multivariate latent growth model for studying the performances of a…
ERIC Educational Resources Information Center
Geiger, Vince; Date-Huxtable, Liz; Ahlip, Rehez; Herberstein, Marie; Jones, D. Heath; May, E. Julian; Rylands, Leanne; Wright, Ian; Mulligan, Joanne
2016-01-01
The purpose of this paper is to describe the processes utilised to develop an online learning module within the Opening Real Science (ORS) project--"Modelling the present: Predicting the future." The module was realised through an interdisciplinary collaboration, among mathematicians, scientists and mathematics and science educators that…
USDA-ARS?s Scientific Manuscript database
The Agricultural Policy/Environmental eXtender (APEX) is a watershed-scale water quality model that includes detailed representation of agricultural management but currently does not have microbial fate and transport simulation capabilities. The objective of this work was to develop a process-based ...
1998-06-01
process or plant can complete using a 24-hour, seven-day operation with zero waste, i.e., the maximum output capability, allowing no adjustment for...models: • Resource Effectiveness Model: > Analyzes economic impact of capacity management decisions > Assumes that "zero waste" is the goal > Supports
Using landscape disturbance and succession models to support forest management
Eric J. Gustafson; Brian R. Sturtevant; Anatoly S. Shvidenko; Robert M. Scheller
2010-01-01
Managers of forested landscapes must account for multiple, interacting ecological processes operating at broad spatial and temporal scales. These interactions can be of such complexity that predictions of future forest ecosystem states are beyond the analytical capability of the human mind. Landscape disturbance and succession models (LDSM) are predictive and...
ERIC Educational Resources Information Center
Grier, Betsy Chesno; Bradley-Klug, Kathy L.
2011-01-01
Medical technology continues to improve, increasing life expectancies and capabilities of children with chronic illnesses and disabilities. Pediatric health issues have an impact on children's academic, emotional, behavioral, and social functioning. This article reviews a consultative Biopsychoeducational Model, based on a problem-solving process,…
Kirtania, Kawnish; Bhattacharya, Sankar
2012-03-01
Apart from capturing carbon dioxide, fresh water algae can be used to produce biofuel. To assess the energy potential of Chlorococcum humicola, the alga's pyrolytic behavior was studied at heating rates of 5-20 K/min in a thermobalance. To model the weight loss characteristics, an algorithm was developed based on the distributed activation energy model and applied to experimental data to extract the kinetics of the decomposition process. When the kinetic parameters estimated by this method were applied to another set of experimental data which were not used to estimate the parameters, the model was capable of predicting the pyrolysis behavior in the new set of data with an R(2) value of 0.999479. The slow weight loss that took place at the end of the pyrolysis process was also accounted for by the proposed algorithm, which is capable of predicting the pyrolysis kinetics of C. humicola at different heating rates. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
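As a rough sketch of the distributed activation energy approach described above, the following Python fragment evaluates the unreacted volatile fraction for a Gaussian distribution of activation energies at a fixed heating rate. The kinetic parameters (k0, E0, sigma) and temperature range are illustrative placeholders, not the values fitted for C. humicola.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def daem_mass_fraction(T, beta, k0, E0, sigma, n_E=200):
    """Unreacted volatile fraction 1 - V/V* at temperatures T (K) for a
    Gaussian distributed activation energy model.
    beta: heating rate (K/s), k0: pre-exponential factor (1/s),
    E0, sigma: mean and spread of the activation energy (J/mol)."""
    E = np.linspace(E0 - 4 * sigma, E0 + 4 * sigma, n_E)
    f_E = np.exp(-(E - E0) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    out = np.empty_like(T, dtype=float)
    for i, Ti in enumerate(T):
        Tgrid = np.linspace(300.0, Ti, 400)
        # inner temperature integral, evaluated for every activation energy
        inner = np.trapz(np.exp(-E[None, :] / (R * Tgrid[:, None])), Tgrid, axis=0)
        out[i] = np.trapz(np.exp(-(k0 / beta) * inner) * f_E, E)
    return out

# illustrative parameters only (10 K/min heating rate expressed in K/s)
T = np.linspace(400.0, 1000.0, 50)
frac = daem_mass_fraction(T, beta=10.0 / 60.0, k0=1e13, E0=180e3, sigma=25e3)
```

Fitting would then amount to adjusting k0, E0 and sigma until the predicted curve matches the measured thermobalance weight-loss data.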
A Game-Theoretical Model to Improve Process Plant Protection from Terrorist Attacks.
Zhang, Laobing; Reniers, Genserik
2016-12-01
The New York City 9/11 terrorist attacks urged people from academia as well as from industry to pay more attention to operational security research. The required focus in this type of research is human intention. Unlike safety-related accidents, security-related accidents have a deliberate nature, and one has to face intelligent adversaries with characteristics that traditional probabilistic risk assessment techniques are not capable of dealing with. In recent years, the mathematical tool of game theory, being capable of handling intelligent players, has been used in a variety of ways in terrorism risk assessment. In this article, we analyze the general intrusion detection system in process plants, and propose a game-theoretical model for security management in such plants. Players in our model are assumed to be rational and they play the game with complete information. Both the pure strategy and the mixed strategy solutions are explored and explained. We illustrate our model with a case study, and find that in our case no pure strategy but, instead, a mixed strategy Nash equilibrium exists. © 2016 Society for Risk Analysis.
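The abstract does not give the game's payoff structure, but the flavor of a mixed-strategy equilibrium can be sketched by treating the defender-attacker interaction as a zero-sum matrix game and solving the defender's side by linear programming. The loss matrix below is an invented example, not the one analyzed in the article.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative defender-loss matrix: rows = defender patrol strategies,
# columns = attacker target choices (values are assumed, not from the paper).
L = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Defender minimizes the worst-case expected loss: min v  s.t.  L^T x <= v, sum(x) = 1.
m, n = L.shape
c = np.zeros(m + 1)
c[-1] = 1.0                                   # variables are [x_1..x_m, v]
A_ub = np.hstack([L.T, -np.ones((n, 1))])
b_ub = np.zeros(n)
A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])
b_eq = np.array([1.0])
bounds = [(0, None)] * m + [(None, None)]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
defender_mix, game_value = res.x[:m], res.x[-1]
print(defender_mix, game_value)               # mixed strategy and worst-case expected loss
```

For this invented matrix the equilibrium is strictly mixed (0.25/0.75), which mirrors the article's finding that only a mixed-strategy Nash equilibrium exists.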
Sahu, Sounak; Dattani, Anish; Aboobaker, A Aziz
2017-10-01
Understanding how some animals are immortal and avoid the ageing process is important. We currently know very little about how they achieve this. Research with genetic model systems has revealed the existence of conserved genetic pathways and molecular processes that affect longevity. Most of these established model organisms have relatively short lifespans. Here we consider the use of planarians, with an immortal life-history that is able to entirely avoid the ageing process. These animals are capable of profound feats of regeneration fueled by a population of adult stem cells called neoblasts. These cells are capable of indefinite self-renewal that has underpinned the evolution of animals that reproduce only by fission, having disposed of the germline, and must therefore be somatically immortal and avoid the ageing process. How they do this is only now starting to be understood. Here we suggest that the evidence so far supports the hypothesis that the lack of ageing is an emergent property of both being highly regenerative and the evolution of highly effective mechanisms for ensuring genome stability in the neoblast stem cell population. The details of these mechanisms could prove to be very informative in understanding how the causes of ageing can be avoided, slowed or even reversed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Mathematical Modeling of Diverse Phenomena
NASA Technical Reports Server (NTRS)
Howard, J. C.
1979-01-01
Tensor calculus is applied to the formulation of mathematical models of diverse phenomena. Aeronautics, fluid dynamics, and cosmology are among the areas of application. The feasibility of combining tensor methods and computer capability to formulate problems is demonstrated. The techniques described are an attempt to simplify the formulation of mathematical models by reducing the modeling process to a series of routine operations, which can be performed either manually or by computer.
Model-Based Trade Space Exploration for Near-Earth Space Missions
NASA Technical Reports Server (NTRS)
Cohen, Ronald H.; Boncyk, Wayne; Brutocao, James; Beveridge, Iain
2005-01-01
We developed a capability for model-based trade space exploration to be used in the conceptual design of Earth-orbiting space missions. We have created a set of reusable software components to model various subsystems and aspects of space missions. Several example mission models were created to test the tools and process. This technique and toolset have demonstrated themselves to be valuable for space mission architectural design.
High-Performance Modeling and Simulation of Anchoring in Granular Media for NEO Applications
NASA Technical Reports Server (NTRS)
Quadrelli, Marco B.; Jain, Abhinandan; Negrut, Dan; Mazhar, Hammad
2012-01-01
NASA is interested in designing a spacecraft capable of visiting a near-Earth object (NEO), performing experiments, and then returning safely. Certain periods of this mission would require the spacecraft to remain stationary relative to the NEO, in an environment characterized by very low gravity levels; such situations require an anchoring mechanism that is compact, easy to deploy, and upon mission completion, easy to remove. The design philosophy used in this task relies on the simulation capability of a high-performance multibody dynamics physics engine. On Earth, it is difficult to create low-gravity conditions, and testing in low-gravity environments, whether artificial or in space, can be costly and very difficult to achieve. Through simulation, the effect of gravity can be controlled with great accuracy, making it ideally suited to analyze the problem at hand. Using Chrono::Engine, a simulation package capable of utilizing massively parallel Graphic Processing Unit (GPU) hardware, several validation experiments were performed. Modeling of the regolith interaction has been carried out, after which the anchor penetration tests were performed and analyzed. The regolith was modeled by a granular medium composed of very large numbers of convex three-dimensional rigid bodies, subject to microgravity levels and interacting with each other with contact, friction, and cohesional forces. The multibody dynamics simulation approach used for simulating anchors penetrating a soil uses a differential variational inequality (DVI) methodology to solve the contact problem posed as a linear complementarity problem (LCP). Implemented within a GPU processing environment, collision detection is greatly accelerated compared to traditional CPU (central processing unit)-based collision detection. Hence, systems of millions of particles interacting with complex dynamic systems can be efficiently analyzed, and design recommendations can be made in a much shorter time. The figure shows an example of this capability where the Brazil Nut problem is simulated: as the container full of granular material is vibrated, the large ball slowly moves upwards. This capability was expanded to account for anchors of different shapes and penetration velocities, interacting with granular soils.
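A minimal illustration of posing contact as a linear complementarity problem (LCP) is a projected Gauss-Seidel sweep over contact impulses, sketched below. This is a generic textbook scheme with an assumed two-contact system, not the DVI solver implemented in Chrono::Engine.

```python
import numpy as np

def projected_gauss_seidel(M, q, iters=200):
    """Solve the LCP  w = M z + q,  z >= 0, w >= 0, z.w = 0
    with projected Gauss-Seidel; a simple fixed-point scheme often used for
    normal contact impulses (illustrative, not Chrono's DVI/GPU solver)."""
    n = len(q)
    z = np.zeros(n)
    for _ in range(iters):
        for i in range(n):
            # residual excluding the i-th unknown, then project onto z >= 0
            r = q[i] + M[i, :] @ z - M[i, i] * z[i]
            z[i] = max(0.0, -r / M[i, i])
    return z

# tiny assumed example: two contacts with coupled effective masses
M = np.array([[2.0, 0.5],
              [0.5, 1.5]])
q = np.array([-1.0, -0.3])       # negative entries = approaching/penetrating contacts
z = projected_gauss_seidel(M, q)
print(z, M @ z + q)              # contact impulses and the complementary velocities
```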
Southern Ocean bottom water characteristics in CMIP5 models
NASA Astrophysics Data System (ADS)
Heuzé, CéLine; Heywood, Karen J.; Stevens, David P.; Ridley, Jeff K.
2013-04-01
Southern Ocean deep water properties and formation processes in climate models are indicative of their capability to simulate future climate, heat and carbon uptake, and sea level rise. Southern Ocean temperature and density averaged over 1986-2005 from 15 CMIP5 (Coupled Model Intercomparison Project Phase 5) climate models are compared with an observed climatology, focusing on bottom water. Bottom properties are reasonably accurate for half the models. Ten models create dense water on the Antarctic shelf, but it mixes with lighter water and is not exported as bottom water as in reality. Instead, most models create deep water by open ocean deep convection, a process occurring rarely in reality. Models with extensive deep convection are those with strong seasonality in sea ice. Optimum bottom properties occur in models with deep convection in the Weddell and Ross Gyres. Bottom Water formation processes are poorly represented in ocean models and are a key challenge for improving climate predictions.
NASA Technical Reports Server (NTRS)
Penta, Bradley; Ko, D.; Gould, Richard W.; Arnone, Robert A.; Greene, R.; Lehrter, J.; Hagy, James; Schaeffer, B.; Murrell, M.; Kurtz, J.;
2009-01-01
We describe emerging capabilities to understand physical processes and biogeochemical cycles in coastal waters through the use of satellites, numerical models, and ship observations. Emerging capabilities provide significantly improved ability to model ecological systems and the impact of environmental management actions on them. The complex interaction of physical and biogeochemical processes responsible for hypoxic events requires an integrated approach to research, monitoring, and modeling in order to fully define the processes leading to hypoxia. Our effort characterizes the carbon cycle associated with river plumes and the export of organic matter and nutrients from coastal Louisiana wetlands and embayments in a spatially and temporally intensive manner previously not possible. Riverine nutrients clearly affect ecosystems in the northern Gulf of Mexico as evidenced in the occurrence of regional hypoxia events. Less well known and largely unquantified is the export of organic matter and nutrients from the large areas of disappearing coastal wetlands and large embayments adjacent to the Louisiana Continental Shelf. This project provides new methods to track the river plume along the shelf and to estimate the rate of export of suspended inorganic and organic particulate matter and dissolved organic matter from coastal habitats of south Louisiana.
CLAES Product Improvement by use of GSFC Data Assimilation System
NASA Technical Reports Server (NTRS)
Kumer, J. B.; Douglass, Anne (Technical Monitor)
2001-01-01
Recent developments in chemistry transport models (CTM) and in data assimilation systems (DAS) indicate impressive predictive capability for the movement of air parcels and the chemistry that goes on within them. This project was aimed at exploring the use of this capability to achieve improved retrieval of geophysical parameters from remote sensing data. The specific goal was to improve retrieval of the CLAES CH4 data obtained during the active north high latitude dynamics event of 18 to 25 February 1992. The model capabilities would be used (1) rather than climatology to improve on the first guess and the a-priori fields, and (2) to provide horizontal gradients to include in the retrieval forward model. The retrieval would be implemented with the first forward DAS prediction. The results would feed back to the DAS, and a second DAS prediction for first guess, a-priori and gradients would feed to the retrieval. The process would repeat to convergence and then proceed to the next day.
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence characteristic and capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source is capable of containing multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility of using different grouping strategies for uncertainty components. The variance-based sensitivity analysis thus is improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other different combinations of uncertainty components which can represent certain key model system processes (e.g., groundwater recharge process, flow reactive transport process). For test and demonstration purposes, the developed methodology was implemented into a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources which were formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
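For the variance-based (Sobol) part of such a framework, first-order indices are commonly estimated with a pick-freeze sampling scheme. The sketch below uses a toy three-input function as a stand-in for the groundwater reactive transport model, so the inputs, ranges, and model are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # toy stand-in for a groundwater response, nonlinear in three inputs
    return np.sin(x[:, 0]) + 0.7 * x[:, 1] ** 2 + 0.1 * x[:, 2]

n, d = 20000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

S1 = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                     # vary only the i-th input
    yABi = model(ABi)
    # Saltelli-style estimator of the first-order Sobol index
    S1.append(np.mean(yB * (yABi - yA)) / var_y)
print(S1)   # fraction of output variance attributable to each input alone
```

Grouping columns of inputs before the pick-freeze step is one simple way to obtain sensitivities for combined uncertainty components (e.g., all parameters of a recharge process at once), in the spirit of the flexible grouping strategies described above.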
NASA Technical Reports Server (NTRS)
1979-01-01
Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support design and implementation of coal based synthetic fuels complexes are identified. The potential market in the Southeast United States for coal based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for development of technology and data needed to improve gasification feasibility and economies are examined.
Toward an optimisation technique for dynamically monitored environment
NASA Astrophysics Data System (ADS)
Shurrab, Orabi M.
2016-10-01
The data fusion community has introduced multiple procedures for situational assessment in order to facilitate timely responses to emerging situations. More directly, the process refinement level of the Joint Directors of Laboratories (JDL) model is a meta-process to assess and improve the data fusion task during real-time operation. In other words, it is an optimisation technique to verify the overall data fusion performance and enhance it toward the top-level goals of the decision-making resources. This paper discusses the theoretical concept of prioritisation, where the analyst team is required to keep up to date with a dynamically changing environment spanning different domains such as air, sea, land, space and cyberspace. Furthermore, it demonstrates an illustrative example of how various tracking activities are ranked simultaneously into a predetermined order. Specifically, it presents a modelling scheme for a case-study-based scenario in which the real-time system reports different classes of prioritised events, followed by a performance metric for evaluating the prioritisation process in the situational awareness (SWA) domain. The proposed performance metric has been designed and evaluated using an analytical approach. The modelling scheme represents the situational awareness system outputs mathematically, in the form of a list of activities. Such methods allowed the evaluation process to conduct a rigorous analysis of the prioritisation process, despite any constraints related to a domain-specific configuration. After conducting three levels of assessment over three separate scenarios, the Prioritisation Capability Score (PCS) provided an appropriate scoring scheme for different ranking instances. Indeed, from the data fusion perspective, the proposed metric assessed real-time system performance adequately, and it is capable of conducting a verification process to direct the operator's attention to any issue concerning the prioritisation capability of the situational awareness domain.
Testing a Constrained MPC Controller in a Process Control Laboratory
ERIC Educational Resources Information Center
Ricardez-Sandoval, Luis A.; Blankespoor, Wesley; Budman, Hector M.
2010-01-01
This paper describes an experiment performed by the fourth year chemical engineering students in the process control laboratory at the University of Waterloo. The objective of this experiment is to test the capabilities of a constrained Model Predictive Controller (MPC) to control the operation of a Double Pipe Heat Exchanger (DPHE) in real time.…
NASA Astrophysics Data System (ADS)
Fauzi, Ilham; Muharram Hasby, Fariz; Irianto, Dradjad
2018-03-01
Although government is able to make mandatory standards that must be obeyed by the industry, the respective industries themselves often have difficulties to fulfil the requirements described in those standards. This is especially true in many small and medium sized enterprises that lack the required capital to invest in standard-compliant equipment and machineries. This study aims to develop a set of measurement tools for evaluating the level of readiness of production technology with respect to the requirements of a product standard based on the quality function deployment (QFD) method. By combining the QFD methodology, UNESCAP Technometric model [9] and Analytic Hierarchy Process (AHP), this model is used to measure a firm’s capability to fulfill government standard in the toy making industry. Expert opinions from both the governmental officers responsible for setting and implementing standards and the industry practitioners responsible for managing manufacturing processes are collected and processed to find out the technological capabilities that should be improved by the firm to fulfill the existing standard. This study showed that the proposed model can be used successfully to measure the gap between the requirements of the standard and the readiness of technoware technological component in a particular firm.
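One building block of such a QFD/AHP measurement tool is deriving priority weights from a pairwise comparison matrix and checking its consistency. The sketch below shows the standard eigenvector calculation with an invented three-criteria matrix, not the expert judgments collected in this study.

```python
import numpy as np

# assumed pairwise comparisons of three technology-readiness criteria (Saaty scale)
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # AHP priority weights

n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)      # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]       # Saaty's random index for n criteria
CR = CI / RI                              # judgments usually accepted if CR < ~0.1
print(w, CR)
```

In a QFD setting, weights derived this way for the "whats" (standard requirements) are propagated through the relationship matrix to rank the "hows" (technological capabilities to improve).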
Differential correction capability of the GTDS using TDRSS data
NASA Technical Reports Server (NTRS)
Liu, S. Y.; Soskey, D. G.; Jacintho, J.
1980-01-01
A differential correction (DC) capability was implemented in the Goddard Trajectory Determination System (GTDS) to process satellite tracking data acquired via the Tracking and Data Relay Satellite System (TDRSS). Configuration of the TDRSS is reviewed, observation modeling is presented, and major features of the capability are discussed. The following types of TDRSS data can be processed by GTDS: two way relay range and Doppler measurements, hybrid relay range and Doppler measurements, one way relay Doppler measurements, and differenced one way relay Doppler measurements. These data may be combined with conventional ground based direct tracking data. By using Bayesian weighted least squares techniques, the software allows the simultaneous determination of the trajectories of up to four different satellites - one user satellite and three relay satellites. In addition to satellite trajectories, the following parameters can optionally be solved for: drag coefficient, reflectivity of a satellite for solar radiation pressure, transponder delay, station position, and biases.
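A single Bayesian weighted least-squares differential correction step has the familiar normal-equation form. The sketch below illustrates it with placeholder partials and residuals; it is not the GTDS implementation.

```python
import numpy as np

def dc_update(x0, x_apriori, P0, H, W, residuals):
    """One Bayesian weighted-least-squares differential correction step.
    x0: current state estimate, x_apriori: a priori state, P0: a priori covariance,
    H: observation partials d(obs)/d(state), W: observation weights,
    residuals: observed-minus-computed measurements (all values here are assumed)."""
    P0_inv = np.linalg.inv(P0)
    N = H.T @ W @ H + P0_inv                          # normal matrix with a priori term
    b = H.T @ W @ residuals + P0_inv @ (x_apriori - x0)
    dx = np.linalg.solve(N, b)
    return x0 + dx, np.linalg.inv(N)                  # updated state and covariance

# toy two-state, three-observation example
x0 = np.array([1.0, 0.5]); xa = np.array([0.9, 0.6])
P0 = np.diag([0.1, 0.1]); W = np.eye(3) / 0.01
H = np.array([[1.0, 0.0], [0.5, 1.0], [0.0, 1.0]])
res = np.array([0.02, -0.01, 0.03])
x1, P1 = dc_update(x0, xa, P0, H, W, res)
```

In a multi-satellite solution the state vector simply concatenates the user and relay spacecraft states plus the optional solve-for parameters, and the same step is iterated to convergence.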
NASA Technical Reports Server (NTRS)
Patterson, Jonathan D.; Breckenridge, Jonathan T.; Johnson, Stephen B.
2013-01-01
Building upon the purpose, theoretical approach, and use of a Goal-Function Tree (GFT) being presented by Dr. Stephen B. Johnson, described in a related Infotech 2013 ISHM abstract titled "Goal-Function Tree Modeling for Systems Engineering and Fault Management", this paper will describe the core framework used to implement the GFT-based systems engineering process using the Systems Modeling Language (SysML). These two papers are ideally accepted and presented together in the same Infotech session. Statement of problem: SysML, as a tool, is currently not capable of implementing the theoretical approach described within the "Goal-Function Tree Modeling for Systems Engineering and Fault Management" paper cited above. More generally, SysML's current capabilities to model functional decompositions in the rigorous manner required in the GFT approach are limited. The GFT is a new Model-Based Systems Engineering (MBSE) approach to the development of goals and requirements, functions, and their linkage to design. As SysML is a growing standard for systems engineering, it is important to develop methods to implement GFT in SysML. Proposed Method of Solution: Many of the central concepts of the SysML language are needed to implement a GFT for large complex systems. In the implementation of those central concepts, the following will be described in detail: changes to the nominal SysML process, model view definitions and examples, diagram definitions and examples, and detailed SysML construct and stereotype definitions.
Integration of Engine, Plume, and CFD Analyses in Conceptual Design of Low-Boom Supersonic Aircraft
NASA Technical Reports Server (NTRS)
Li, Wu; Campbell, Richard; Geiselhart, Karl; Shields, Elwood; Nayani, Sudheer; Shenoy, Rajiv
2009-01-01
This paper documents an integration of engine, plume, and computational fluid dynamics (CFD) analyses in the conceptual design of low-boom supersonic aircraft, using a variable fidelity approach. In particular, the Numerical Propulsion Simulation System (NPSS) is used for propulsion system cycle analysis and nacelle outer mold line definition, and a low-fidelity plume model is developed for plume shape prediction based on NPSS engine data and nacelle geometry. This model provides a capability for the conceptual design of low-boom supersonic aircraft that accounts for plume effects. Then a newly developed process for automated CFD analysis is presented for CFD-based plume and boom analyses of the conceptual geometry. Five test cases are used to demonstrate the integrated engine, plume, and CFD analysis process based on a variable fidelity approach, as well as the feasibility of the automated CFD plume and boom analysis capability.
Virtual Observation System for Earth System Model: An Application to ACME Land Model Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Dali; Yuan, Fengming; Hernandez, Benjamin
Investigating and evaluating physical-chemical-biological processes within an Earth system model (ESM) can be very challenging due to the complexity of both model design and software implementation. A virtual observation system (VOS) is presented to enable interactive observation of these processes during system simulation. Based on advanced computing technologies, such as compiler-based software analysis, automatic code instrumentation, and high-performance data transport, the VOS provides run-time observation capability, in-situ data analytics for Earth system model simulation, and model behavior adjustment opportunities through simulation steering. A VOS for a terrestrial land model simulation within the Accelerated Climate Modeling for Energy model is also presented to demonstrate the implementation details and system innovations.
NASA Technical Reports Server (NTRS)
Hall, Laverne
1995-01-01
Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.
Amanzi: An Open-Source Multi-process Simulator for Environmental Applications
NASA Astrophysics Data System (ADS)
Moulton, J. D.; Molins, S.; Johnson, J. N.; Coon, E.; Lipnikov, K.; Day, M.; Barker, E.
2014-12-01
The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments begin with simplified models, and add geometric and geologic complexity as understanding is gained. The Platform toolset (Akuna) generates these conceptual models and Amanzi provides the computational engine to perform the simulations, returning the results for analysis and visualization. In this presentation we highlight key elements of the design, algorithms and implementations used in Amanzi. In particular, the hierarchical and modular design is aligned with the coupled processes being simulated, and naturally supports a wide range of model complexity. This design leverages a dynamic data manager and the synergy of two graphs (one from the high-level perspective of the models, the other from the dependencies of the variables in the model) to enable this flexible model configuration at run time. Moreover, to model sites with complex hydrostratigraphy, as well as engineered systems, we are developing a dual unstructured/structured capability. Recently, these capabilities have been collected in a framework named Arcos, and efforts have begun to improve interoperability between the unstructured and structured AMR approaches in Amanzi. To leverage a range of biogeochemistry capability from the community (e.g., CrunchFlow, PFLOTRAN, etc.), a biogeochemistry interface library was developed called Alquimia. To ensure that Amanzi is truly an open-source community code we require a completely open-source tool chain for our development. We will comment on elements of this tool chain, including the testing and documentation development tools such as docutils and Sphinx. Finally, we will show simulation results from our phased demonstrations, including the geochemically complex Savannah River F-Area seepage basins.
Application of the JDL data fusion process model for cyber security
NASA Astrophysics Data System (ADS)
Giacobe, Nicklaus A.
2010-04-01
A number of cyber security technologies have proposed the use of data fusion to enhance the defensive capabilities of the network and aid in the development of situational awareness for the security analyst. While there have been advances in fusion technologies and the application of fusion in intrusion detection systems (IDSs), in particular, additional progress can be made by gaining a better understanding of a variety of data fusion processes and applying them to the cyber security application domain. This research explores the underlying processes identified in the Joint Directors of Laboratories (JDL) data fusion process model and further describes them in a cyber security context.
Towards improved capability and confidence in coupled atmospheric and wildland fire modeling
NASA Astrophysics Data System (ADS)
Sauer, Jeremy A.
This dissertation work is aimed at improving the capability and confidence in a modernized and improved version of Los Alamos National Laboratory's coupled atmospheric and wildland fire dynamics model, Higrad-Firetec. Higrad is the hydrodynamics component of this large eddy simulation model that solves the three dimensional, fully compressible Navier-Stokes equations, incorporating a dynamic eddy viscosity formulation through a two-scale turbulence closure scheme. Firetec is the vegetation, drag forcing, and combustion physics portion that is integrated with Higrad. The modern version of Higrad-Firetec incorporates multiple numerical methodologies and high performance computing aspects which combine to yield a unique tool capable of augmenting theoretical and observational investigations in order to better understand the multi-scale, multi-phase, and multi-physics phenomena involved in coupled atmospheric and environmental dynamics. More specifically, the current work includes extended functionality and validation efforts targeting component processes in coupled atmospheric and wildland fire scenarios. Since observational data of sufficient quality and resolution to validate the fully coupled atmosphere-wildfire scenario simply does not exist, we instead seek to validate components of the full, prohibitively convoluted process. This manuscript provides, first, an introduction and background into the application space of Higrad-Firetec. Second, we document the model formulation, solution procedure, and a simple scalar transport verification exercise. Third, we validate model results against observational data for time averaged flow field metrics in and above four idealized forest canopies. Fourth, we carry out a validation effort for the non-buoyant jet in a crossflow scenario (to which an analogy can be made for atmosphere-wildfire interactions), comparing model results to laboratory data of both steady-in-time and unsteady-in-time metrics. Finally, an extension of model multi-phase physics is implemented, allowing for the representation of multiple collocated fuels as separately evolving constituents, leading to differences in the resulting rate of spread and total burned area. In combination these efforts demonstrate improved capability, increased validation of component functionality, and unique applicability of the Higrad-Firetec modeling framework. As a result, this work provides a substantially more robust foundation for new, more widely applicable investigations into the complexities of coupled atmospheric and wildland fire behavior.
Electrical features of new DNC, CNC system viewed
NASA Astrophysics Data System (ADS)
Fritzsch, W.; Kochan, D.; Schaller, J.; Zander, H. J.
1985-03-01
Control structures capable of solving the problems of a flexible, minimal-labor manufacturing process are analyzed. The present state of development of equipment technology is described, and possible ways of modeling control processes are surveyed. Concepts which are frequently interpreted differently in various specialized disciplines are systematized, with a view toward creating the prerequisites for interdisciplinary cooperation. Problems and information flow during the preparatory and performance phases of manufacturing are examined with respect to coupling CAD/CAM functions. Mathematical modeling for direct numerical control is explored.
Transforming Systems Engineering through Model-Centric Engineering
2018-02-28
intelligence (e.g., Artificial Intelligence, etc.), because they provide a means for representing knowledge. We see these capabilities coming to use in both...level, including: Performance is measured by degree of success of a mission Artificial Intelligence (AI) is applied to counterparties so that they...Modeling, Artificial Intelligence, Simulation and Modeling, 1989. [140] SAE ARP4761. Guidelines and Methods for Conducting the Safety Assessment Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gregory Reaman
The initiative will enable the COG Biopathology Center (Biospecimen Repository), the Molecular Genetics Laboratory and other participating reference laboratories to upload large data sets to the eRDES. The capability streamlines data currency and accuracy, allowing the centers to export data from local systems and import the defined data to the eRDES. The process will aid in the best practices which have been defined by the Office of Biorepository and Biospecimen Research (OBBR) and the Group Banking Committee (GBC). The initiative allows for batch import and export, a data validation process and reporting mechanism, and a model for other labs to incorporate. All objectives are complete. The solutions provided and the defined process eliminate dual data entry, resulting in data consistency. The audit trail capabilities allow for complete tracking of the data exchange between laboratories and the Statistical Data Center (SDC). The impact is directly on time and effort. In return, the process will save money and improve the data utilized by the COG. Ongoing efforts include implementing new technologies to further enhance the current solutions and process currently in place. Web Services and Reporting Services are technologies that have become industry standards and will allow for further harmonization with caBIG (cancer Bioinformatics Grid). Additional testing and implementation of the model for other laboratories is in process.
Implementation and benefits of advanced process control for lithography CD and overlay
NASA Astrophysics Data System (ADS)
Zavyalova, Lena; Fu, Chong-Cheng; Seligman, Gary S.; Tapp, Perry A.; Pol, Victor
2003-05-01
Due to the rapidly shrinking imaging process windows and increasingly stringent device overlay requirements, sub-130 nm lithography processes are more severely impacted than ever by systematic faults. Limits on critical dimension (CD) and overlay capability further challenge the operational effectiveness of a mix-and-match environment using multiple lithography tools, as such a mode additionally consumes the available error budgets. Therefore, a focus on advanced process control (APC) methodologies is key to gaining control in the lithographic modules for critical device levels, which in turn translates to accelerated yield learning, achieving time-to-market lead, and ultimately a higher return on investment. This paper describes the implementation and unique challenges of a closed-loop CD and overlay control solution in high volume manufacturing of leading edge devices. A particular emphasis has been placed on developing a flexible APC application capable of managing a wide range of control aspects such as process and tool drifts, single and multiple lot excursions, referential overlay control, 'special lot' handling, advanced model hierarchy, and automatic model seeding. Specific integration cases, including the multiple-reticle complementary phase shift lithography process, are discussed. A continuous improvement in the overlay and CD Cpk performance as well as the rework rate has been observed through the implementation of this system, and the results are studied.
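The paper does not disclose its control law, but run-to-run drift compensation of this kind is often illustrated with an exponentially weighted moving average (EWMA) controller. The sketch below is such a generic controller with an assumed linear process gain; it is not the authors' production APC application.

```python
class EwmaRunToRunController:
    """Minimal EWMA run-to-run controller of the kind often used to compensate
    lithography CD/overlay drift (illustrative sketch with assumed parameters)."""

    def __init__(self, target, gain, lam=0.3, intercept0=0.0):
        self.target = target      # desired CD or overlay value
        self.gain = gain          # assumed process sensitivity d(output)/d(input)
        self.lam = lam            # EWMA smoothing weight
        self.intercept = intercept0

    def recipe(self):
        # choose the tool input that hits the target given the current drift estimate
        return (self.target - self.intercept) / self.gain

    def update(self, measured, applied_input):
        # fold the latest lot measurement into the drift (intercept) estimate
        observed_intercept = measured - self.gain * applied_input
        self.intercept = self.lam * observed_intercept + (1 - self.lam) * self.intercept

# usage: compute a recipe for the next lot, then update with metrology feedback
ctl = EwmaRunToRunController(target=65.0, gain=2.0)
u = ctl.recipe()
ctl.update(measured=65.8, applied_input=u)
```

Single-lot excursions are typically handled outside the EWMA update (e.g., by discarding outlier measurements), which is one reason production systems layer additional logic on top of this basic loop.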
Synchronous orbit power technology needs
NASA Technical Reports Server (NTRS)
Slifer, L. W., Jr.; Billerbeck, W. J.
1979-01-01
The needs are defined for future geosynchronous orbit spacecraft power subsystem components, including power generation, energy storage, and power processing. A review of the rapid expansion of the satellite communications field provides a basis for projection into the future. Three projected models, a mission model, an orbit transfer vehicle model, and a mass model for power subsystem components are used to define power requirements and mass limitations for future spacecraft. Based upon these three models, the power subsystems for a 10 kw, 10 year life, dedicated spacecraft and for a 20 kw, 20 year life, multi-mission platform are analyzed in further detail to establish power density requirements for the generation, storage and processing components of power subsystems as related to orbit transfer vehicle capabilities. Comparison of these requirements to state of the art design values shows that major improvements, by a factor of 2 or more, are needed to accomplish the near term missions. However, with the advent of large transfer vehicles, these requirements are significantly reduced, leaving the long lifetime requirement, associated with reliability and/or refurbishment, as the primary development need. A few technology advances, currently under development, are noted with regard to their impacts on future capability.
Modelling the impacts of pests and diseases on agricultural systems.
Donatelli, M; Magarey, R D; Bregaglio, S; Willocquet, L; Whish, J P M; Savary, S
2017-07-01
The improvement and application of pest and disease models to analyse and predict yield losses, including those due to climate change, is still a challenge for the scientific community. Applied modelling of crop diseases and pests has mostly targeted the development of support capabilities to schedule scouting or pesticide applications. There is a need for research to both broaden the scope and evaluate the capabilities of pest and disease models. Key research questions not only involve the assessment of the potential effects of climate change on known pathosystems, but also on new pathogens which could alter the (still incompletely documented) impacts of pests and diseases on agricultural systems. Yield loss data collected in various current environments may no longer represent an adequate reference to develop tactical, decision-oriented models for plant diseases and pests and their impacts, because of the ongoing changes in climate patterns. Process-based agricultural simulation modelling, on the other hand, appears to represent a viable methodology to estimate the impacts of these potential effects. A new generation of tools based on state-of-the-art knowledge and technologies is needed to allow systems analysis including key processes and their dynamics over an appropriate range of environmental variables. This paper offers a brief overview of the current state of development in coupling pest and disease models to crop models, and discusses technical and scientific challenges. We propose a five-stage roadmap to improve the simulation of the impacts caused by plant diseases and pests: i) improve the quality and availability of data for model inputs; ii) improve the quality and availability of data for model evaluation; iii) improve the integration with crop models; iv) improve the processes for model evaluation; and v) develop a community of plant pest and disease modellers.
NASA Astrophysics Data System (ADS)
Chen, Y.; Li, J.; Xu, H.
2015-10-01
Physically based distributed hydrological models discretize the terrain of the whole catchment into a number of grid cells at fine resolution, assimilate different terrain data and precipitation to different cells, and are regarded as having the potential to improve catchment hydrological process simulation and prediction capability. In the early stage, physically based distributed hydrological models were assumed to derive model parameters from the terrain properties directly, so there was no need to calibrate model parameters; unfortunately, the uncertainties associated with this parameter derivation are very high, which has impacted their application in flood forecasting, so parameter optimization may also be necessary. There are two main purposes for this study: the first is to propose a parameter optimization method for physically based distributed hydrological models in catchment flood forecasting by using the PSO algorithm, to test its competence and to improve its performance; the second is to explore the possibility of improving the capability of physically based distributed hydrological models in catchment flood forecasting by parameter optimization. In this paper, based on the scalar concept, a general framework for parameter optimization of PBDHMs for catchment flood forecasting is first proposed that could be used for all PBDHMs. Then, with the Liuxihe model as the study model, which is a physically based distributed hydrological model proposed for catchment flood forecasting, an improved Particle Swarm Optimization (PSO) algorithm is developed for the parameter optimization of the Liuxihe model in catchment flood forecasting; the improvements include adopting the linear decreasing inertia weight strategy to change the inertia weight, and the arccosine function strategy to adjust the acceleration coefficients. This method has been tested in two catchments in southern China with different sizes, and the results show that the improved PSO algorithm can be used for Liuxihe model parameter optimization effectively and can largely improve the model capability in catchment flood forecasting, thus proving that parameter optimization is necessary to improve the flood forecasting capability of physically based distributed hydrological models. It has also been found that the appropriate particle number and maximum evolution number of the PSO algorithm used for Liuxihe model catchment flood forecasting are 20 and 30, respectively.
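A minimal sketch of PSO with a linearly decreasing inertia weight, using the reported swarm size of 20 and 30 iterations, is given below. The objective function is only a stand-in for a Liuxihe model flood-forecast error, and the arccosine acceleration schedule is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    # stand-in for the flood-forecast error of a Liuxihe model run (assumed)
    return np.sum((x - 0.3) ** 2, axis=1)

def pso(n_particles=20, n_iter=30, dim=4, w_max=0.9, w_min=0.4, c1=2.0, c2=2.0):
    x = rng.uniform(0.0, 1.0, (n_particles, dim))        # normalized parameters
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), objective(x)
    g = pbest[np.argmin(pbest_f)].copy()
    for t in range(n_iter):
        w = w_max - (w_max - w_min) * t / (n_iter - 1)    # linearly decreasing inertia
        r1, r2 = rng.uniform(size=x.shape), rng.uniform(size=x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, 0.0, 1.0)
        f = objective(x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

best_params, best_error = pso()
```

In the actual framework, each objective evaluation would run the distributed model for a flood event and score the simulated hydrograph against observations, which is why keeping the swarm and iteration counts small matters.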
Automation of energy demand forecasting
NASA Astrophysics Data System (ADS)
Siddique, Sanzad
Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometrics models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.
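The automated model search can be pictured as scoring each candidate forecaster by rolling one-step-ahead validation error and keeping the best. The candidate set and synthetic demand series below are assumptions used only to make the loop concrete; they are not the econometric or machine learning models of the thesis.

```python
import numpy as np

def rolling_validation_error(series, fit_predict, n_test=12):
    """Mean absolute error of one-step-ahead forecasts over the last n_test points."""
    errs = []
    for t in range(len(series) - n_test, len(series)):
        errs.append(abs(fit_predict(series[:t]) - series[t]))
    return float(np.mean(errs))

# two illustrative candidate forecasters (stand-ins for the real model space)
candidates = {
    "naive_last":   lambda h: h[-1],
    "moving_avg_6": lambda h: float(np.mean(h[-6:])),
}

# assumed monthly demand signal: seasonal cycle plus noise
demand = (100 + 10 * np.sin(np.arange(120) * 2 * np.pi / 12)
          + np.random.default_rng(2).normal(0, 2, 120))
scores = {name: rolling_validation_error(demand, f) for name, f in candidates.items()}
best = min(scores, key=scores.get)   # automatically selected model
```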
Technology evaluation, assessment, modeling, and simulation: the TEAMS capability
NASA Astrophysics Data System (ADS)
Holland, Orgal T.; Stiegler, Robert L.
1998-08-01
The United States Marine Corps' Technology Evaluation, Assessment, Modeling and Simulation (TEAMS) capability, located at the Naval Surface Warfare Center in Dahlgren, Virginia, provides an environment for detailed test, evaluation, and assessment of live and simulated sensor and sensor-to-shooter systems for the joint warfare community. Frequent use of modeling and simulation allows for cost effective testing, benchmarking, and evaluation of various levels of sensors and sensor-to-shooter engagements. Interconnectivity to live, instrumented equipment operating in real battle space environments and to remote modeling and simulation facilities participating in advanced distributed simulation (ADS) exercises is available to support a wide range of situational assessment requirements. TEAMS provides a valuable resource for a variety of users. Engineers, analysts, and other technology developers can use TEAMS to evaluate, assess and analyze tactically relevant phenomenological data on tactical situations. Expeditionary warfare and USMC concept developers can use the facility to support and execute advanced warfighting experiments (AWE) to better assess operational maneuver from the sea (OMFTS) concepts, doctrines, and technology developments. Developers can use the facility to support sensor system hardware, software and algorithm development as well as combat development, acquisition, and engineering processes. Test and evaluation specialists can use the facility to plan, assess, and augment their processes. This paper presents an overview of the TEAMS capability and focuses specifically on the technical challenges associated with the integration of live sensor hardware into a synthetic environment and how those challenges are being met. Existing sensors, recent experiments and facility specifications are featured.
Self-organized huddles of rat pups modeled by simple rules of individual behavior.
Schank, J C; Alberts, J R
1997-11-07
Starting at infancy and continuing throughout adult life, huddling is a major component of the behavioral repertoire of Norway rats (Rattus norvegicus). Huddling behavior maintains the cohesion of litters throughout early life, and in adulthood, it remains a consistent feature of social behavior of R. norvegicus. During infancy, rats have severely limited sensorimotor capabilities, and yet they are capable of aggregating and display a form of group regulatory behavior that conserves metabolic effort and augments body temperature regulation. The functions of huddling are generally understood as group adaptations, which are beyond the capabilities of the individual infant rat. We show, however, that huddling as aggregative or cohesive behavior can emerge as a self-organizing process from autonomous individuals following simple sensorimotor rules. In our model, two sets of sensorimotor parameters characterize the topotaxic responses and the dynamics of contact in 7-day-old rats. The first set of parameters are conditional probabilities of activity and inactivity given prior activity or inactivity and the second set are preferences for objects in the infant rat's environment. We found that the behavior of the model and of actual rat pups compare very favorably, demonstrating that the aggregative feature of huddling can emerge from the local sensorimotor interactions of individuals, and that complex group regulatory behaviors in infant rats may also emerge from self-organizing processes. We discuss the model and the underlying approach as a paradigm for investigating the dynamics of social interactions, group behavior, and developmental change.
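The flavor of the two-parameter-set model (activity-state transition probabilities plus contact preferences) can be sketched as a toy grid simulation. The probabilities below are invented for illustration and are not the parameters fitted to 7-day-old pups.

```python
import numpy as np

rng = np.random.default_rng(3)
N, L, STEPS = 8, 10, 500
# P(active at next step | currently inactive / active) -- assumed values
P_ACTIVE = {False: 0.2, True: 0.6}

pos = rng.integers(0, L, size=(N, 2))      # pup positions on a grid arena
active = np.ones(N, dtype=bool)

def contacts(i):
    d = np.abs(pos - pos[i]).sum(axis=1)   # Manhattan distance to the other pups
    return int(np.sum(d == 1))             # orthogonally adjacent pups

for _ in range(STEPS):
    for i in range(N):
        active[i] = rng.random() < P_ACTIVE[bool(active[i])]
        if not active[i]:
            continue                        # inactive pups stay put
        old = pos[i].copy()
        pos[i] = np.clip(old + rng.integers(-1, 2, size=2), 0, L - 1)
        # contact preference: usually revert a move that leaves the pup isolated
        if contacts(i) == 0 and rng.random() < 0.7:
            pos[i] = old

print(np.mean([contacts(i) for i in range(N)]))   # average contacts per pup
```

Even with such crude local rules, aggregation tends to emerge over time, which is the qualitative point of the self-organization argument above.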
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, D.R.; Hutchinson, J.L.
Eagle II is a prototype analytic model derived from the integration of the low resolution Eagle model with the high resolution SIMNET model. This integration promises a new capability to allow for a more effective examination of proposed or existing combat systems that could not be easily evaluated using either Eagle or SIMNET alone. In essence, Eagle II becomes a multi-resolution combat model in which simulated combat units can exhibit both high and low fidelity behavior at different times during model execution. This capability allows a unit to behave in a high-fidelity manner only when required, thereby reducing the overall computational and manpower requirements for a given study. In this framework, the SIMNET portion enables a highly credible assessment of the performance of individual combat systems under consideration, encompassing both engineering performance and crew capabilities. However, when the assessment being conducted goes beyond system performance and extends to questions of force structure balance and sustainment, then SIMNET results can be used to "calibrate" the Eagle attrition process appropriate to the study at hand. Advancing technologies, changes in the world-wide threat, requirements for flexible response, declining defense budgets, and down-sizing of military forces motivate the development of manpower-efficient, low-cost, responsive tools for combat development studies. Eagle and SIMNET both serve as credible and useful tools. The integration of these two models promises enhanced capabilities to examine the broader, deeper, more complex battlefield of the future with higher fidelity, greater responsiveness and low overall cost.
NASA Astrophysics Data System (ADS)
Chartosias, Marios
Acceptance of Carbon Fiber Reinforced Polymer (CFRP) structures requires a robust surface preparation method with improved process controls capable of ensuring high bond quality. Surface preparation in a production clean room environment prior to applying adhesive for bonding would minimize risk of contamination and reduce cost. Plasma treatment is a robust surface preparation process capable of being applied in a production clean room environment with process parameters that are easily controlled and documented. Repeatable and consistent processing is enabled through the development of a process parameter window utilizing techniques such as Design of Experiments (DOE) tailored to specific adhesive and substrate bonding applications. Insight from the respective plasma treatment Original Equipment Manufacturers (OEMs) and screening tests distinguished critical process factors from non-factors and set the associated factor levels prior to execution of the DOE. Results from mode I Double Cantilever Beam (DCB) testing per the ASTM D 5528 [1] standard and DOE statistical analysis software are used to produce a regression model and determine appropriate optimum settings for each factor.
Containerless processing of undercooled melts
NASA Technical Reports Server (NTRS)
Perepezko, J. H.
1993-01-01
The investigation focused on the control of microstructural evolution in Mn-Al, Fe-Ni, Ni-V, and Au-Pb-Sb alloys through the high undercooling levels provided by containerless processing, and provided fundamental new information on the control of nucleation. Solidification analysis was conducted by means of thermal analysis, x-ray diffraction, and metallographic characterization on samples processed in a laboratory scale drop tube system. The Mn-Al alloy system offers a useful model system with the capability of phase separation on an individual particle basis, thus permitting a more complete understanding of the operative kinetics and the key containerless processing variables. This system provided the opportunity of analyzing the nucleation rate as a function of processing conditions and allowed for the quantitative assessment of the relevant processing parameters. These factors are essential in the development of a containerless processing model which has a predictive capability. Similarly, Ni-V is a model system that was used to study duplex partitionless solidification, which is a structure possible only in high-undercooling solidification processes. Nucleation kinetics for the competing bcc and fcc phases were studied to determine how this structure can develop and the conditions under which it may occur. The Fe-Ni alloy system was studied to identify microstructural transitions with controlled variations in sample size and composition during containerless solidification. This work was carried forward to develop a microstructure map which delineates regimes of structural evolution and provides a unified analysis of experimental observations. The Au-Pb-Sb system was investigated to characterize the thermodynamic properties of the undercooled liquid phase and to characterize the glass transition under a variety of processing conditions. By analyzing key containerless processing parameters in a ground based drop tube study, a carefully designed flight experiment may be planned to utilize the extended duration microgravity conditions of orbiting spacecraft.
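For reference, nucleation-rate analyses of undercooled melts are usually framed in terms of classical nucleation theory, in which the rate depends steeply on undercooling. The standard form is shown below as context only, without implying it is the exact kinetic model used in this investigation.

```latex
J(\Delta T) \;=\; K_v \,\exp\!\left[-\,\frac{16\pi\,\sigma^{3}\, f(\theta)}{3\,k_{B} T\,\Delta G_v^{2}}\right],
\qquad
\Delta G_v \;\approx\; \frac{\Delta H_f\,\Delta T}{T_m},
```

where J is the nucleation rate, K_v a kinetic prefactor, sigma the solid-liquid interfacial energy, f(theta) the catalytic potency factor for heterogeneous nucleation, Delta H_f the volumetric heat of fusion, T_m the melting temperature, and Delta T = T_m - T the undercooling. The strong dependence on Delta G_v squared is what makes drop-tube undercooling such a sensitive probe of nucleation behavior.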
Painter, Scott L.; Coon, Ethan T.; Atchley, Adam L.; ...
2016-08-11
The need to understand potential climate impacts and feedbacks in Arctic regions has prompted recent interest in modeling of permafrost dynamics in a warming climate. A new fine-scale integrated surface/subsurface thermal hydrology modeling capability is described and demonstrated in proof-of-concept simulations. The new modeling capability combines a surface energy balance model with recently developed three-dimensional subsurface thermal hydrology models and new models for nonisothermal surface water flows and snow distribution in the microtopography. Surface water flows are modeled using the diffusion wave equation extended to include energy transport and phase change of ponded water. Variation of snow depth in the microtopography, physically the result of wind scour, is also modeled heuristically with a diffusion wave equation. The multiple surface and subsurface processes are implemented by leveraging highly parallel community software. Fully integrated thermal hydrology simulations on the tilted open book catchment, an important test case for integrated surface/subsurface flow modeling, are presented. Fine-scale 100-year projections of the integrated permafrost thermal hydrological system on an ice wedge polygon at Barrow, Alaska in a warming climate are also presented. Finally, these simulations demonstrate the feasibility of microtopography-resolving, process-rich simulations as a tool to help understand possible future evolution of the carbon-rich Arctic tundra in a warming climate.
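A one-dimensional explicit sketch of a diffusion-wave update for ponded-water depth, with a Manning-type conductance, is given below. The grid, time step, and roughness are assumed values, and the scheme only illustrates the equation class, not the discretization, energy transport, or phase-change treatment used in this work.

```python
import numpy as np

def diffusion_wave_step(h, z, dx, dt, n_manning=0.03, eps=1e-6):
    """One explicit 1D diffusion-wave update for ponded depth h (m) over bed
    elevation z (m), with a Manning-type conductance (illustrative only)."""
    eta = h + z                                     # water surface elevation
    slope = np.abs(np.diff(eta)) / dx + eps         # regularized face slopes
    h_face = 0.5 * (h[:-1] + h[1:])                 # depth at cell faces
    K = h_face ** (5.0 / 3.0) / (n_manning * np.sqrt(slope))
    flux = -K * np.diff(eta) / dx                   # discharge per unit width at faces
    dh = np.zeros_like(h)
    dh[:-1] -= flux * dt / dx                       # water leaving each cell rightward
    dh[1:] += flux * dt / dx                        # water entering its neighbor
    return np.maximum(h + dh, 0.0)

# toy setup: a shallow pond at the upslope end of a tilted plane
x = np.linspace(0.0, 10.0, 101)
z = 0.05 * (10.0 - x)
h = np.where(x < 2.0, 0.1, 0.0)
for _ in range(2000):                               # small dt keeps the explicit step stable
    h = diffusion_wave_step(h, z, dx=0.1, dt=5e-4)
```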
ERIC Educational Resources Information Center
Kolodny, Oren; Lotem, Arnon; Edelman, Shimon
2015-01-01
We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nandi, Taraj; Brasseur, James; Vijayakumar, Ganesh
2016-01-04
This study is aimed at gaining insight into the nonsteady transitional boundary layer dynamics of wind turbine blades and the predictive capabilities of URANS based transition and turbulence models for similar physics through the analysis of a controlled flow with similar nonsteady parameters.
Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2012)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, David C.; Syamlal, Madhava; Cottrell, Roger
2012-09-30
The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is organized into 8 technical elements that fall under two focus areas. The first focus area (Physicochemical Models and Data) addresses the steps necessary to model and simulate the various technologies and processes needed to bring a new Carbon Capture and Storage (CCS) technology into production. The second focus area (Analysis & Software) is developing the software infrastructure to integrate the various components and implement the tools that are needed to make quantifiable decisions regarding the viability of new CCS technologies. CCSI also has an Industry Advisory Board (IAB). By working closely with industry from the inception of the project to identify industrial challenge problems, CCSI ensures that the simulation tools are developed for the carbon capture technologies of most relevance to industry. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories' core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI's industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms. The CCSI's academic participants (Carnegie Mellon University, Princeton University, West Virginia University, and Boston University) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 12, CCSI released its first set of computational tools and models. This pre-release, a year ahead of the originally planned first release, is the result of intense industry interest in getting early access to the tools and the phenomenal progress of the CCSI technical team.
These initial components of the CCSI Toolset provide new models and computational capabilities that will accelerate the commercial development of carbon capture technologies as well as related technologies, such as those found in the power, refining, chemicals, and gas production industries. The release consists of new tools for process synthesis and optimization to help identify promising concepts more quickly, new physics-based models of potential capture equipment and processes that will reduce the time to design and troubleshoot new systems, a framework to quantify the uncertainty of model predictions, and various enabling tools that provide new capabilities such as creating reduced order models (ROMs) from reacting multiphase flow simulations and running thousands of process simulations concurrently for optimization and UQ.
An Investigation of the Influence of Waves on Sediment Processes in Skagit Bay
2012-09-30
parameterizations common to most surface wave models, including wave generation by wind, energy dissipation from whitecapping, and quadruplet wave-wave...supply and wind on tidal flat sediment transport. It will be used to evaluate the capabilities of state-of-the-art open source sediment models and to...N00014-08-1-1115 which supported the hydrodynamic model development. Wind forcing for the wave and hydrodynamic models for realistic experiments will
NASA Astrophysics Data System (ADS)
Li, Yutong; Wang, Yuxin; Duffy, Alex H. B.
2014-11-01
Computer-based conceptual design for routine design has made great strides, yet non-routine design has not been given due attention and is still poorly automated. Considering that the function-behavior-structure (FBS) model is widely used for modeling the conceptual design process, a computer-based creativity-enhanced conceptual design model (CECD) for non-routine design of mechanical systems is presented. In the model, the leaf functions in the FBS model are decomposed into and represented with fine-grain basic operation actions (BOA), and the corresponding BOA set in the function domain is then constructed. Choosing building blocks from the database and expressing their multiple functions with BOAs, the BOA set in the structure domain is formed. Through rule-based dynamic partition of the BOA set in the function domain, many variants of regenerated functional schemes are generated. To enhance the capability to introduce new design variables into the conceptual design process and to uncover more innovative physical structure schemes, an indirect function-structure matching strategy based on reconstructing combined structure schemes is adopted. By adjusting the tightness of the partition rules and the granularity of the divided BOA subsets, and making full use of the main function and secondary functions of each basic structure in the process of reconstructing the physical structures, new design variables and variants are introduced into the physical structure scheme reconstruction process, and a great number of simpler physical structure schemes that organically accomplish the overall function are identified. The creativity-enhanced conceptual design model presented has a strong capability to introduce new design variables in the function domain and to uncover simpler physical structures that accomplish the overall function; therefore it can be used to solve non-routine conceptual design problems.
An Environmental Management Maturity Model of Construction Programs Using the AHP-Entropy Approach.
Bai, Libiao; Wang, Hailing; Huang, Ning; Du, Qiang; Huang, Youdan
2018-06-23
The accelerating process of urbanization in China has led to considerable opportunities for the development of construction projects; however, environmental issues have become an important constraint on the implementation of these projects. To quantitatively describe the environmental management capabilities of such projects, this paper proposes a 2-dimensional Environmental Management Maturity Model of Construction Programs (EMMMCP) based on an analysis of existing projects, group management theory and a management maturity model. In this model, a synergetic process was included to compensate for the lack of consideration of synergies in previous studies, and it was involved in the construction of the first dimension, i.e., the environmental management index system. The second dimension, i.e., the maturity level of environmental management, was then constructed by redefining the hierarchical characteristics of construction program (CP) environmental management maturity. Additionally, a mathematical solution to the proposed model was derived via the Analytic Hierarchy Process (AHP)-entropy approach. To verify the effectiveness and feasibility of the proposed model, a computational experiment was conducted, and the results show that this approach can not only measure the individual levels of different processes, but also achieve the most important objective of providing a reference for stakeholders when making decisions on the environmental management of construction programs, which indicates that the model is reasonable for evaluating the level of environmental management maturity in CPs. To our knowledge, this paper is the first study to evaluate the environmental management maturity levels of CPs, which fills the gap between project program management and environmental management and provides a reference for relevant management personnel to enhance their environmental management capabilities.
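For readers unfamiliar with the entropy half of the AHP-entropy weighting step, a minimal sketch of the standard entropy-weight formulation (a generic textbook form, not the paper's exact combined AHP-entropy solution) is:

    e_j = -\frac{1}{\ln m} \sum_{i=1}^{m} p_{ij} \ln p_{ij}, \qquad w_j = \frac{1 - e_j}{\sum_{k=1}^{n} (1 - e_k)},

where p_ij is the normalized score of alternative i on indicator j, e_j is the information entropy of indicator j, and w_j is the resulting objective weight; indicators whose scores vary more across alternatives receive larger weights.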
Intertime jump statistics of state-dependent Poisson processes.
Daly, Edoardo; Porporato, Amilcare
2007-01-01
A method to obtain the probability distribution of the interarrival times of jump occurrences in systems driven by state-dependent Poisson noise is proposed. The method uses the survivor function obtained from a modified version of the master equation associated with the stochastic process under analysis. A model for the timing of human activities shows the capability of state-dependent Poisson noise to generate power-law distributions. The application of the method to a model for neuron dynamics and to a hydrological model accounting for land-atmosphere interaction elucidates the origin of characteristic recurrence intervals and possible persistence in state-dependent Poisson models.
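As a rough illustration of the quantity involved (a generic sketch, not the paper's modified master-equation derivation), consider a process whose jump rate \lambda depends on a state x(t) that evolves deterministically between jumps. The survivor function of the interarrival time \tau, starting from state x_0 just after a jump, is then

    S(\tau \mid x_0) = \exp\left( -\int_0^{\tau} \lambda\big(x(s; x_0)\big)\, ds \right),

and the interarrival-time density follows as f(\tau \mid x_0) = \lambda(x(\tau; x_0))\, S(\tau \mid x_0). The state dependence of \lambda is what allows the resulting distributions to deviate from the exponential form of a homogeneous Poisson process.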
A perspective on modeling the multiscale response of energetic materials
NASA Astrophysics Data System (ADS)
Rice, Betsy M.
2017-01-01
The response of an energetic material to insult is perhaps one of the most difficult processes to model due to concurrent chemical and physical phenomena occurring over scales ranging from atomistic to continuum. Unraveling the interdependencies of these complex processes across the scales through modeling can only be done within a multiscale framework. In this paper, I will describe progress in the development of a predictive, experimentally validated multiscale reactive modeling capability for energetic materials at the Army Research Laboratory. I will also describe new challenges and research opportunities that have arisen in the course of our development which should be pursued in the future.
Device design and signal processing for multiple-input multiple-output multimode fiber links
NASA Astrophysics Data System (ADS)
Appaiah, Kumar; Vishwanath, Sriram; Bank, Seth R.
2012-01-01
Multimode fibers (MMFs) are limited in data-rate capability owing to modal dispersion. However, their large core diameter simplifies alignment and packaging, and makes them attractive for short- and medium-length links. Recent research has shown that the use of signal processing and techniques such as multiple-input multiple-output (MIMO) can greatly improve the data-rate capabilities of multimode fibers. In this paper, we review recent experimental work using MIMO and signal processing for multimode fibers, and the improvements in data rates achievable with these techniques. We then present models to design as well as simulate the performance benefits obtainable with arrays of lasers and detectors in conjunction with MIMO, using channel capacity as the metric to optimize. We also discuss the complexity of the signal processing algorithms required and techniques for low-complexity implementation.
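For context, the channel-capacity metric referred to here is, in its standard MIMO form (a generic expression under an equal-power assumption, not the authors' specific fiber channel model),

    C = \log_2 \det\left( \mathbf{I}_{N_r} + \frac{\rho}{N_t} \mathbf{H}\mathbf{H}^{\dagger} \right) \ \text{bits/s/Hz},

where \mathbf{H} is the N_r \times N_t matrix coupling the N_t transmit lasers to the N_r detectors through the fiber's modes, and \rho is the signal-to-noise ratio; laser and detector array geometries that make \mathbf{H} better conditioned raise the achievable capacity.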
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.
This paper highlights the use of the CAPE-OPEN (CO) standard interfaces in the Advanced Process Engineering Co-Simulator (APECS) developed at the National Energy Technology Laboratory (NETL). The APECS system uses the CO unit operation, thermodynamic, and reaction interfaces to provide its plug-and-play co-simulation capabilities, including the integration of process simulation with computational fluid dynamics (CFD) simulation. APECS also relies heavily on the use of a CO COM/CORBA bridge for running process/CFD co-simulations on multiple operating systems. For process optimization in the face of multiple and sometimes conflicting objectives, APECS offers stochastic modeling and multi-objective optimization capabilities developed to comply with the CO software standard. At NETL, system analysts are applying APECS to a wide variety of advanced power generation systems, ranging from small fuel cell systems to commercial-scale power plants including the coal-fired, gasification-based FutureGen power and hydrogen production plant.
NASA Astrophysics Data System (ADS)
Aldrin, John C.; Annis, Charles; Sabbagh, Harold A.; Lindgren, Eric A.
2016-02-01
A comprehensive approach to NDE and SHM characterization error (CE) evaluation is presented that follows the framework of the `ahat-versus-a' regression analysis for POD assessment. Characterization capability evaluation is typically more complex than current POD evaluations and thus requires engineering and statistical expertise in the model-building process to ensure all key effects and interactions are addressed. Justifying the statistical model choice with its underlying assumptions is key. Several sizing case studies are presented with detailed evaluations of the most appropriate statistical model for each data set. A model-assisted approach is introduced to help assess the reliability of NDE and SHM characterization capability under a wide range of part, environmental and damage conditions. Best practices for using models are presented for both an eddy current NDE sizing case study and a vibration-based SHM case study. The results of these studies highlight the general protocol's feasibility, emphasize the importance of evaluating key application characteristics prior to the study, and demonstrate an approach to quantify the role of varying SHM sensor durability and environmental conditions on characterization performance.
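For orientation, the `ahat-versus-a' framework rests on a regression of the measured response \hat{a} against true flaw size a; a minimal sketch of the usual form (following MIL-HDBK-1823-style analysis, with the caveat that log transforms of either variable are often applied) is

    \hat{a} = \beta_0 + \beta_1 a + \varepsilon, \quad \varepsilon \sim N(0, \sigma^2), \qquad \mathrm{POD}(a) = \Phi\!\left( \frac{\beta_0 + \beta_1 a - a_{\mathrm{dec}}}{\sigma} \right),

where a_dec is the decision threshold applied to \hat{a} and \Phi is the standard normal CDF. Characterization (sizing) error analysis examines the same regression but focuses on the residual scatter and bias of \hat{a} itself rather than only on exceedance of a threshold.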
Lattice Boltzmann simulations of immiscible displacement process with large viscosity ratios
NASA Astrophysics Data System (ADS)
Rao, Parthib; Schaefer, Laura
2017-11-01
Immiscible displacement is a key physical mechanism involved in enhanced oil recovery and carbon sequestration processes. This multiphase flow phenomenon involves a complex interplay of viscous, capillary, inertial and wettability effects. The lattice Boltzmann (LB) method is an accurate and efficient technique for modeling and simulating multiphase/multicomponent flows, especially in complex flow configurations and media. In this presentation we present numerical simulation results of the displacement process in long, thin channels. The results are based on a new pseudo-potential multicomponent LB model with a multiple-relaxation-time (MRT) collision operator and an explicit forcing scheme. We demonstrate that the proposed model is capable of accurately simulating displacement involving fluids with a wide range of viscosity ratios (>100), and that it also leads to viscosity-independent interfacial tension and the reduction of some important numerical artifacts.
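For orientation, the interparticle force in a Shan-Chen-type pseudo-potential LB model is typically written (here in its generic single-component form; the multicomponent MRT variant used in the work above adds component indices and a different forcing treatment) as

    \mathbf{F}(\mathbf{x}) = -G\, \psi(\mathbf{x}) \sum_{i} w_i\, \psi(\mathbf{x} + \mathbf{e}_i \delta t)\, \mathbf{e}_i,

where \psi is the pseudo-potential (an effective density), G sets the interaction strength and hence the interfacial tension, w_i are the lattice weights and \mathbf{e}_i the discrete lattice velocities. How this force is incorporated into the collision step (the forcing scheme) strongly affects the viscosity dependence of the resulting interfacial tension, which is why the explicit forcing scheme matters for large viscosity ratios.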
Challenges in Developing Models Describing Complex Soil Systems
NASA Astrophysics Data System (ADS)
Simunek, J.; Jacques, D.
2014-12-01
Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools for integrating our understanding of complex soil systems, and the soil science community has often called for models that include a large number of these diverse processes. However, once attempts have been made to develop such models, the response from the community has not always been enthusiastic, especially once it became clear that such models are necessarily highly complex: they require a large number of parameters, not all of which can be easily (or at all) measured or identified and which are often associated with large uncertainties, and they demand from their users a deep knowledge of most or all of the implemented physical, mechanical, chemical and biological processes. The real or perceived complexity of these models then discourages users from applying them even to relatively simple problems for which they would be perfectly adequate. Because of the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code intercomparison, which is then likely the most suitable method to assess code capabilities and model performance, requires the existence of multiple models with similar or overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models remain an underutilized tool in soil science research. We will demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.
2015-01-08
VANDENBERG AIR FORCE BASE, Calif. – Inside the Astrotech payload processing facility at Vandenberg Air Force Base in California, engineers and technicians inspect NASA's Soil Moisture Active Passive mission, or SMAP, satellite. SMAP will provide global measurements of soil moisture and its freeze/thaw state. These measurements will be used to enhance understanding of processes that link the water, energy and carbon cycles, and to extend the capabilities of weather and climate prediction models. SMAP data also will be used to quantify net carbon flux in boreal landscapes and to develop improved flood prediction and drought monitoring capabilities. Launch is scheduled for Jan. 29, 2015. To learn more about SMAP, visit http://smap.jpl.nasa.gov Photo credit: Jeremy Moore, USAF Photo Squadron
NASA Technical Reports Server (NTRS)
Ido, Haisam; Burns, Rich
2015-01-01
The NASA Goddard Space Science Mission Operations project (SSMO) is performing a technical cost-benefit analysis for centralizing and consolidating operations of a diverse set of missions into a unified and integrated technical infrastructure. The presentation will focus on the notion of normalizing spacecraft operations processes, workflows, and tools. It will also show the processes of creating a standardized open architecture; common security models and implementations; interfaces, services, automation, notifications, alerts, and logging; and publish/subscribe and middleware capabilities. The presentation will also discuss how to leverage traditional capabilities along with virtualization, cloud computing services, control groups and containers, and possibly Big Data concepts.
Finite element modelling of crash response of composite aerospace sub-floor structures
NASA Astrophysics Data System (ADS)
McCarthy, M. A.; Harte, C. G.; Wiggenraad, J. F. M.; Michielsen, A. L. P. J.; Kohlgrüber, D.; Kamoulakos, A.
Composite energy-absorbing structures for use in aircraft are being studied within a European Commission research programme (CRASURV - Design for Crash Survivability). One of the aims of the project is to evaluate the current capabilities of crashworthiness simulation codes for composites modelling. This paper focuses on the computational analysis, using explicit finite element analysis, of a number of quasi-static and dynamic tests carried out within the programme. It describes the design of the structures, the analysis techniques used, and the results of the analyses in comparison to the experimental test results. It has been found that current multi-ply shell models are capable of modelling the main energy-absorbing processes at work in such structures. However, some deficiencies exist, particularly in modelling fabric composites. Developments within the finite element code are taking place as a result of this work which will enable better representation of composite fabrics.
Process-based tolerance assessment of connecting rod machining process
NASA Astrophysics Data System (ADS)
Sharma, G. V. S. S.; Rao, P. Srinivasa; Surendra Babu, B.
2016-06-01
Process tolerancing based on process capability studies is an optimistic and pragmatic approach to determining manufacturing process tolerances. On adopting the define-measure-analyze-improve-control approach, the process potential capability index (Cp) and the process performance capability index (Cpk) values of the identified process characteristics of the connecting rod machining process are found to be greater than the industry benchmark of 1.33, i.e., the four sigma level. The tolerance chain diagram methodology is applied to the connecting rod in order to verify the manufacturing process tolerances at various operations of the connecting rod manufacturing process. This paper bridges the gap between the existing dimensional tolerances obtained via tolerance charting and the process capability studies of the connecting rod component. Finally, the process tolerancing comparison has been done using tolerance capability expert software.
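For reference, the capability indices compared against the 1.33 benchmark are defined in the usual way as

    C_p = \frac{\mathrm{USL} - \mathrm{LSL}}{6\sigma}, \qquad C_{pk} = \min\!\left( \frac{\mathrm{USL} - \mu}{3\sigma},\ \frac{\mu - \mathrm{LSL}}{3\sigma} \right),

where USL and LSL are the upper and lower specification limits and \mu and \sigma are the process mean and standard deviation. C_p measures the process spread relative to the tolerance band, C_{pk} additionally penalizes off-center processes, and C_{pk} \geq 1.33 corresponds roughly to a four-sigma process.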
Razmilic, Valeria; Castro, Jean Franco; Marchant, Francisca; Asenjo, Juan A; Andrews, Barbara
2018-02-02
Metabolic modelling is a useful tool that enables the rational design of metabolic engineering experiments and the study of the unique capabilities of biotechnologically important microorganisms. The extreme abiotic conditions of the Atacama Desert have selected for microbial diversity with exceptional characteristics that can be applied in the mining industry for bioleaching processes and for the production of specialised metabolites with antimicrobial, antifungal, antiviral, antitumoral and other activities. In this review we summarise the scientific data available on the use of metabolic modelling and flux analysis to improve the performance of Atacama Desert microorganisms in biotechnological applications.
NASA Astrophysics Data System (ADS)
Schmidt, J. B.
1985-09-01
This thesis investigates ways of improving the real-time performance of the Stockpoint Logistics Integrated Communication Environment (SPLICE). Performance evaluation through continuous monitoring activities and performance studies are the principal vehicles discussed. The method for implementing this performance evaluation process is the measurement of predefined performance indexes. Performance indexes for SPLICE that would measure these areas are offered. Existing SPLICE capability to carry out performance evaluation is explored, and recommendations are made to enhance that capability.
Stochastic Feedforward Control Technique
NASA Technical Reports Server (NTRS)
Halyo, Nesim
1990-01-01
Class of commanded trajectories modeled as stochastic process. Advanced Transport Operating Systems (ATOPS) research and development program conducted by NASA Langley Research Center aimed at developing capabilities for increases in capacities of airports, safe and accurate flight in adverse weather conditions including wind shear, avoidance of wake vortexes, and reduced consumption of fuel. Advances in techniques for design of modern controls and increased capabilities of digital flight computers coupled with accurate guidance information from Microwave Landing System (MLS). Stochastic feedforward control technique developed within context of ATOPS program.
Study of CFB Simulation Model with Coincidence at Multi-Working Condition
NASA Astrophysics Data System (ADS)
Wang, Z.; He, F.; Yang, Z. W.; Li, Z.; Ni, W. D.
A circulating fluidized bed (CFB) two-stage simulation model was developed. To make the model results coincide with the design values or real operating values at specified working conditions, while retaining real-time calculation capability, only the main key processes were taken into account and the dominant factors were further abstracted out of these key processes. The simulation results showed sound agreement at multiple working conditions and confirmed the advantage of the two-stage model over the original single-stage simulation model. The combustion-support effect of secondary air was investigated using the two-stage model. This model provides a solid platform for investigating the pant-leg structured CFB furnace, which is now under design for a supercritical power plant.
NASA Astrophysics Data System (ADS)
Talamonti, James Joseph
1995-01-01
Future NASA proposals include the placement of optical interferometer systems in space for a wide variety of astrophysical studies, including a vastly improved deflection test of general relativity, a precise and direct calibration of the Cepheid distance scale, and the determination of stellar masses (Reasenberg et al., 1988). There are also plans for placing large array telescopes on the moon with the ultimate objective of being able to measure angular separations of less than 10 micro-arcseconds (Burns, 1990). These and other future projects will require interferometric measurement of the (baseline) distance between the optical elements comprising the systems. Eventually, space-qualifiable interferometers capable of picometer (10^-12 m) relative precision and nanometer (10^-9 m) absolute precision will be required. A numerical model was developed to emulate the capabilities of systems performing interferometric noncontact absolute distance measurements. The model incorporates known methods to minimize signal processing and digital sampling errors and evaluates the accuracy limitations imposed by spectral peak isolation using Hanning, Blackman, and Gaussian windows in the fast Fourier transform technique. We applied this model to the specific case of measuring the relative lengths of a compound Michelson interferometer using a frequency-scanned laser. By processing computer-simulated data through our model, the ultimate precision is projected for ideal data and for data containing AM/FM noise. The precision is shown to be limited by nonlinearities in the laser scan. A laboratory system was developed by implementing ultra-stable external cavity diode lasers into existing interferometric measuring techniques. The capabilities of the system were evaluated and increased by using the computer modeling results as guidelines for the data analysis. Experimental results measured 1-3 meter baselines with <20 micron precision. Comparison of the laboratory and modeling results showed that the laboratory precisions obtained were of the same order of magnitude as those predicted for computer-generated results under similar conditions. We believe that our model can be implemented as a tool in the design of new metrology systems capable of meeting the precisions required by space-based interferometers.
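As a purely illustrative sketch of the windowed spectral-peak isolation step described above (hypothetical sample rate, beat frequency and record length; not the author's code), one might compare window functions in Python as follows; a Gaussian window would come from scipy.signal rather than NumPy:

    import numpy as np

    # Hypothetical frequency-scanned interferometer record: the beat frequency
    # encodes the optical path difference to be measured.
    fs = 1.0e6                          # sample rate in Hz (assumed)
    n = 16384                           # record length (assumed)
    t = np.arange(n) / fs
    f_beat = 37.3e3                     # beat frequency in Hz (assumed)
    signal = np.cos(2 * np.pi * f_beat * t)

    # Compare spectral peak locations obtained with different FFT windows.
    for name, window in (("rectangular", np.ones(n)),
                         ("Hanning", np.hanning(n)),
                         ("Blackman", np.blackman(n))):
        spectrum = np.abs(np.fft.rfft(signal * window))
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        peak = freqs[np.argmax(spectrum)]
        print(f"{name:11s} window: peak at {peak:.1f} Hz")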
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schraad, Mark William; Luscher, Darby Jon
Additive Manufacturing techniques are presenting the Department of Energy and the NNSA Laboratories with new opportunities to consider novel component production and repair processes, and to manufacture materials with tailored response and optimized performance characteristics. Additive Manufacturing technologies already are being applied to primary NNSA mission areas, including Nuclear Weapons. These mission areas are adapting to these new manufacturing methods because of potential advantages, such as smaller manufacturing footprints, reduced needs for specialized tooling, an ability to embed sensing, novel part repair options, an ability to accommodate complex geometries, and lighter weight materials. To realize the full potential of Additive Manufacturing as a game-changing technology for the NNSA's national security missions, however, significant progress must be made in several key technical areas. In addition to advances in engineering design, process optimization and automation, and accelerated feedstock design and manufacture, significant progress must be made in modeling and simulation. First and foremost, a more mature understanding of the process-structure-property-performance relationships must be developed. Because Additive Manufacturing processes change the nature of a material's structure below the engineering scale, new models are required to predict materials response across the spectrum of relevant length scales, from the atomistic to the continuum. New diagnostics will be required to characterize materials response across these scales. And not just models, but advanced algorithms, next-generation codes, and advanced computer architectures will be required to complement the associated modeling activities. Based on preliminary work in each of these areas, a strong argument for the need for Exascale computing architectures can be made, if a legitimate predictive capability is to be developed.
NASA Astrophysics Data System (ADS)
Wiesmann, William P.; Pranger, L. Alex; Bogucki, Mary S.
1998-05-01
Remote monitoring of physiologic data from individual high-risk workers distributed over time and space is a considerable challenge. This is often due to an inadequate capability to accurately integrate large amounts of data into usable information in real time. In this report, we have used the vertical and horizontal organization of the 'fireground' as a framework to design a distributed network of sensors. In this system, sensor output is linked through a hierarchical object-oriented programming process to accurately interpret physiological data, incorporate these data into a synchronous model, and relay processed data, trends and predictions to members of the fire incident command structure. There are several unique aspects to this approach. The first includes a process to account for variability in vital parameter values across each individual's normal physiologic response by including an adaptive network in each data process. This information is used by the model in an iterative process to baseline a 'normal' physiologic response to a given stress for each individual and to detect deviations that indicate dysfunction or a significant insult. The second unique capability of the system is that it orders the information for each user, including the subject, local company officers, medical personnel and the incident commanders. Information can be retrieved and used for training exercises and after-action analysis. Finally, this system can easily be adapted to existing communication and processing links, while incorporating the best parts of current models through the use of object-oriented programming techniques. These modern software techniques are well suited to handling multiple data processes independently over time in a distributed network.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lechman, Jeremy B.; Battaile, Corbett Chandler.; Bolintineanu, Dan
This report summarizes a project in which the authors sought to develop and deploy: (i) experimental techniques to elucidate the complex, multiscale nature of thermal transport in particle-based materials; and (ii) modeling approaches to address current challenges in predicting performance variability of materials (e.g., identifying and characterizing physical-chemical processes and their couplings across multiple length and time scales, modeling information transfer between scales, and statically and dynamically resolving material structure and its evolution during manufacturing and device performance). Experimentally, several capabilities were successfully advanced. As discussed in Chapter 2, a flash diffusivity capability for measuring the homogeneous thermal conductivity of pyrotechnic powders (and beyond) was advanced, leading to enhanced characterization of pyrotechnic materials and properties impacting component development. Chapter 4 describes success for the first time, although preliminary, in resolving thermal fields at speeds and spatial scales relevant to energetic components. Chapter 7 summarizes the first ever (as far as the authors know) application of TDTR to actual pyrotechnic materials. This is the first attempt to actually characterize these materials at the interfacial scale. On the modeling side, new capabilities in image processing of experimental microstructures and direct numerical simulation on complicated structures were advanced (see Chapters 3 and 5). In addition, modeling work described in Chapter 8 led to improved prediction of interface thermal conductance from first principles calculations. Toward the second point, for a model system of packed particles, significant headway was made in implementing numerical algorithms and collecting data to justify the approach in terms of highlighting the phenomena at play and pointing the way forward in developing and informing the kind of modeling approach originally envisioned (see Chapter 6). In both cases much more remains to be accomplished.
USDA-ARS?s Scientific Manuscript database
In smoked fish processes, smoking is the only step that is capable of inactivating pathogens, such as Listeria monocytogenes, that contaminate the raw fish. The objectives of this study were to examine and develop a model to describe the survival of L. monocytogenes in salmon as affected by salt, s...
EUV mask manufacturing readiness in the merchant mask industry
NASA Astrophysics Data System (ADS)
Green, Michael; Choi, Yohan; Ham, Young; Kamberian, Henry; Progler, Chris; Tseng, Shih-En; Chiou, Tsann-Bim; Miyazaki, Junji; Lammers, Ad; Chen, Alek
2017-10-01
As nodes progress into the 7nm and below regime, extreme ultraviolet lithography (EUVL) becomes critical for all industry participants interested in remaining at the leading edge. One key cost driver for EUV in the supply chain is the reflective EUV mask. As of today, the relatively few end users of EUV consist primarily of integrated device manufacturers (IDMs) and foundries that have internal (captive) mask manufacturing capability. At the same time, strong and early participation in EUV by the merchant mask industry should bring value to these chip makers, aiding the wide-scale adoption of EUV in the future. For this, merchants need access to high-quality, representative test vehicles to develop and validate their own processes. This business circumstance provides the motivation for merchants to form Joint Development Partnerships (JDPs) with IDMs, foundries, Original Equipment Manufacturers (OEMs) and other members of the EUV supplier ecosystem that leverage complementary strengths. In this paper, we will show how, through a collaborative supplier JDP model between a merchant and an OEM, a novel, test-chip-driven strategy is applied to guide and validate mask-level process development. We demonstrate how an EUV test vehicle (TV) is generated for mask process characterization in advance of receiving chip-maker-specific designs. We utilize the TV to carry out mask process "stress testing" to define process boundary conditions, which can be used to create Mask Rule Check (MRC) rules as well as serve as baseline conditions for future process improvement. We utilize Advanced Mask Characterization (AMC) techniques to understand process capability on designs of varying complexity that include EUV OPC models with and without sub-resolution assist features (SRAFs). Through these collaborations, we demonstrate ways to develop EUV processes and reduce implementation risks for eventual mass production. By reducing these risks, we hope to expand access to EUV mask capability for the broadest community possible as the technology is implemented first within and then beyond the initial early adopters.
Advances in SiC/SiC Composites for Aerospace Applications
NASA Technical Reports Server (NTRS)
DiCarlo, James A.
2006-01-01
In recent years, supported by a variety of materials development programs, NASA Glenn Research Center has significantly increased the thermostructural capability of SiC/SiC composite materials for high-temperature aerospace applications. These state-of-the-art advances have occurred in every key constituent of the composite: fiber, fiber coating, matrix, and environmental barrier coating, as well as processes for forming the fiber architectures needed for complex-shaped components such as turbine vanes for gas turbine engines. This presentation will briefly elaborate on the nature of these advances in terms of performance data and underlying mechanisms. Based on a list of first-order property goals for typical high-temperature applications, key data from a variety of laboratory tests are presented which demonstrate that the NASA-developed constituent materials and processes do indeed result in SiC/SiC systems with the desired thermal and structural capabilities. Remaining process and microstructural issues for further property enhancement are discussed, as well as on-going approaches at NASA to solve these issues. NASA efforts to develop physics-based property models that can be used not only for component design and life modeling, but also for constituent material and process improvement will also be discussed.
2013-11-01
by existing cyber-attack detection tools far exceeds the analysts' cognitive capabilities. Grounded in perceptual and cognitive theory, many visual...Processes Inspired by the sense-making theory discussed earlier, we model the analytical reasoning process of cyber analysts using three key...analyst are called "working hypotheses"); each hypothesis could trigger further actions to confirm or disconfirm it. New actions will lead to new
Process Tailoring and the Software Capability Maturity Model(sm).
1995-11-01
A Discipline For Software Engineering, Addison-Wesley, 1995; Humphrey. This book summarizes the costs and benefits of a Personal Software Process (PSP)...1994. [Humphrey95] Humphrey, Watts S. A Discipline For Software Engineering. Reading, MA: Addison-Wesley Publishing Company, 1995. CMU/SEI-94-TR-24...practiced and institutionalized.
Integrated Optical Design Analysis (IODA): New Test Data and Modeling Features
NASA Technical Reports Server (NTRS)
Moore, Jim; Troy, Ed; Patrick, Brian
2003-01-01
A general overview of the capabilities of the IODA ("Integrated Optical Design Analysis") software is presented. IODA promotes efficient exchange of data and modeling results between the thermal, structures, optical design, and test engineering disciplines. This presentation focuses on new features added to the software that allow measured test data to be imported into the IODA environment for post-processing or comparison with pretest model predictions.
NASA Astrophysics Data System (ADS)
Mani, N. J.; Waliser, D. E.; Jiang, X.
2014-12-01
While the boreal summer monsoon intraseasonal variability (BSISV) exerts profound influence on the south Asian monsoon, the capability of present day dynamical models in simulating and predicting the BSISV is still limited. The global model evaluation project on vertical structure and diabatic processes of the Madden Julian Oscillations (MJO) is a joint venture, coordinated by the Working Group on Numerical Experimentation (WGNE) MJO Task Force and GEWEX Atmospheric System Study (GASS) program, for assessing the model deficiencies in simulating the ISV and for improving our understanding of the underlying processes. In this study the simulation of the northward propagating BSISV is investigated in 26 climate models with special focus on the vertical diabatic heating structure and clouds. Following parallel lines of inquiry as the MJO Task Force has done with the eastward propagating MJO, we utilize previously proposed and newly developed model performance metrics and process diagnostics and apply them to the global climate model simulations of BSISV.
NASA Astrophysics Data System (ADS)
Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara
2013-04-01
Analysis of the growing volume of climate-change-related data from sensors and model outputs requires collaborative multidisciplinary efforts from researchers. To do this in a timely and reliable way, a modern information-computational infrastructure supporting integrated studies in the environmental sciences is needed. The recently developed experimental software and hardware platform Climate (http://climate.scert.ru/) provides the required environment for regional climate change investigations. The platform combines a modern web 2.0 approach, GIS functionality, and capabilities to run climate and meteorological models, process large geophysical datasets and support the relevant analysis. It also supports joint software development by distributed research groups and the organization of thematic education for students and post-graduate students. In particular, the platform software includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Runs of the integrated WRF and «Planet Simulator» models, preprocessing of modeling results, and visualization are also provided. All functions of the platform are accessible through a web portal using a common graphical web browser via an interactive graphical user interface that provides, in particular, selection of a geographical region of interest (pan and zoom), data layer manipulation (order, enable/disable, feature extraction) and visualization of results. The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary studies. Using it, even an unskilled user without specific knowledge can perform reliable computational processing and visualization of large meteorological, climatic and satellite monitoring datasets through a unified graphical web interface. Partial support of RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2, Projects 69, 131 and 140, and the APN CBA2012-16NSY project is acknowledged.
Program for the feasibility of developing a high pressure acoustic levitator
NASA Technical Reports Server (NTRS)
Rey, Charles A.; Merkley, Dennis R.; Hammarlund, Gregory R.
1988-01-01
This is the final report for the program for the feasibility of developing a high-pressure acoustic levitator (HPAL). It includes work performed during the period from February 15, 1987 to October 26, 1987. The program was conducted for NASA under contract number NAS3-25115. The HPAL would be used for containerless processing of materials in the 1-g Earth environment. Results show that the use of increased gas pressure produces higher sound pressure levels. The harmonics produced by the acoustic source are also reduced. This provides an improvement in the capabilities of acoustic levitation in 1-g. The reported processing capabilities are directly limited by the design of the Medium Pressure Acoustic Levitator used for this study. Data show that sufficient acoustic intensities can be obtained to levitate and process a specimen of density 5 g/cu cm at 1500 C. However, it is recommended that a working engineering model of the HPAL be developed. The model would be used to establish the maximum operating parameters of furnace temperature and sample density.
Performance of the NEXT Engineering Model Power Processing Unit
NASA Technical Reports Server (NTRS)
Pinero, Luis R.; Hopson, Mark; Todd, Philip C.; Wong, Brian
2007-01-01
NASA's Evolutionary Xenon Thruster (NEXT) project is developing an advanced ion propulsion system for future NASA solar system exploration missions. An engineering model (EM) power processing unit (PPU) for the NEXT project was designed and fabricated by L-3 Communications under contract with NASA Glenn Research Center (GRC). This modular PPU is capable of processing from 0.5 to 7.0 kW of output power for the NEXT ion thruster. Its design includes many significant improvements for better performance over the state-of-the-art PPU. The most significant difference is the beam supply, which is comprised of six modules and is capable of very efficient operation through a wide voltage range because of innovative features like dual controls, module addressing, and a high current mode. The low voltage power supplies are based on elements of the previously validated NASA Solar Electric Propulsion Technology Application Readiness (NSTAR) PPU. The highly modular construction of the PPU resulted in improved manufacturability, simpler scalability, and lower cost. This paper describes the design of the EM PPU and the results of the bench-top performance tests.
CMMI Level 5 and the Team Software Process
2007-04-01
could meet the rigors of a CMMI assessment and achieve their group's goal of Level 5. Watts Humphrey, who is widely acknowledged as the founder of the...Capability Maturity Model® (CMM®) approach to improvement and who later created the Personal Software Process (PSP)SM and TSP, has noted that one of the...intents of PSP and TSP is to be an operational process enactment of CMM Level 5 processes at the personal and project levels respectively [1]. CMM
Landlab: an Open-Source Python Library for Modeling Earth Surface Dynamics
NASA Astrophysics Data System (ADS)
Gasparini, N. M.; Adams, J. M.; Hobley, D. E. J.; Hutton, E.; Nudurupati, S. S.; Istanbulluoglu, E.; Tucker, G. E.
2016-12-01
Landlab is an open-source Python modeling library that enables users to easily build unique models to explore earth surface dynamics. The Landlab library provides a number of tools and functionalities that are common to many earth surface models, thus eliminating the need for a user to recode fundamental model elements each time she explores a new problem. For example, Landlab provides a gridding engine so that a user can build a uniform or nonuniform grid in one line of code. The library has tools for setting boundary conditions, adding data to a grid, and performing basic operations on the data, such as calculating gradients and curvature. The library also includes a number of process components, which are numerical implementations of physical processes. To create a model, a user creates a grid and couples together process components that act on grid variables. The current library has components for modeling a diverse range of processes, from overland flow generation to bedrock river incision, from soil wetting and drying to vegetation growth, succession and death. The code is freely available for download (https://github.com/landlab/landlab) or can be installed as a Python package. Landlab models can also be built and run on Hydroshare (www.hydroshare.org), an online collaborative environment for sharing hydrologic data, models, and code. Tutorials illustrating a wide range of Landlab capabilities such as building a grid, setting boundary conditions, reading in data, plotting, using components and building models are also available (https://github.com/landlab/tutorials). The code is also comprehensively documented both online and natively in Python. In this presentation, we illustrate the diverse capabilities of Landlab. We highlight existing functionality by illustrating outcomes from a range of models built with Landlab - including applications that explore landscape evolution and ecohydrology. Finally, we describe the range of resources available for new users.
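To make the library's workflow concrete, here is a minimal sketch of building a grid, attaching a field, coupling a process component and advancing it in time. It is written against Landlab's documented interface at the time of writing (RasterModelGrid, add_zeros, calc_grad_at_link and the LinearDiffuser component); exact names and keywords may differ between releases, so treat it as illustrative rather than canonical.

    import numpy as np
    from landlab import RasterModelGrid            # gridding engine
    from landlab.components import LinearDiffuser  # one process component

    # Build a small raster grid and attach an elevation field to its nodes.
    grid = RasterModelGrid((20, 20))
    z = grid.add_zeros("topographic__elevation", at="node")
    z += np.random.rand(z.size)                    # random initial topography

    # Couple a hillslope-diffusion component to the grid and run it forward.
    diffuser = LinearDiffuser(grid, linear_diffusivity=0.01)
    for _ in range(100):
        diffuser.run_one_step(1000.0)              # advance in 1000-year steps

    # Basic grid operations, e.g. gradients on links, are one-liners.
    slopes = grid.calc_grad_at_link("topographic__elevation")
    print(slopes.max())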
Modules based on the geochemical model PHREEQC for use in scripting and programming languages
Charlton, Scott R.; Parkhurst, David L.
2011-01-01
The geochemical model PHREEQC is capable of simulating a wide range of equilibrium reactions between water and minerals, ion exchangers, surface complexes, solid solutions, and gases. It also has a general kinetic formulation that allows modeling of nonequilibrium mineral dissolution and precipitation, microbial reactions, decomposition of organic compounds, and other kinetic reactions. To facilitate use of these reaction capabilities in scripting languages and other models, PHREEQC has been implemented in modules that easily interface with other software. A Microsoft COM (component object model) has been implemented, which allows PHREEQC to be used by any software that can interface with a COM server—for example, Excel®, Visual Basic®, Python, or MATLAB". PHREEQC has been converted to a C++ class, which can be included in programs written in C++. The class also has been compiled in libraries for Linux and Windows that allow PHREEQC to be called from C++, C, and Fortran. A limited set of methods implements the full reaction capabilities of PHREEQC for each module. Input methods use strings or files to define reaction calculations in exactly the same formats used by PHREEQC. Output methods provide a table of user-selected model results, such as concentrations, activities, saturation indices, and densities. The PHREEQC module can add geochemical reaction capabilities to surface-water, groundwater, and watershed transport models. It is possible to store and manipulate solution compositions and reaction information for many cells within the module. In addition, the object-oriented nature of the PHREEQC modules simplifies implementation of parallel processing for reactive-transport models. The PHREEQC COM module may be used in scripting languages to fit parameters; to plot PHREEQC results for field, laboratory, or theoretical investigations; or to develop new models that include simple or complex geochemical calculations.
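As a minimal illustration of the scripting use described above, the following Windows-only Python sketch drives the PHREEQC COM module through pywin32. The COM ProgID, database path and input script are assumptions to be checked against the IPhreeqc documentation for a given installation; the method names (LoadDatabase, RunString, GetSelectedOutputArray) follow the module's published interface.

    # Hypothetical example of calling the PHREEQC COM module from Python
    # (Windows only; requires pywin32 and an IPhreeqcCOM installation).
    import win32com.client

    phreeqc = win32com.client.Dispatch("IPhreeqcCOM.Object")   # assumed ProgID
    phreeqc.LoadDatabase(r"C:\phreeqc\database\phreeqc.dat")   # assumed path

    input_string = """
    SOLUTION 1
        temp   25
        pH     7.0
        Ca     1.0
        Cl     2.0  charge
    SELECTED_OUTPUT
        -pH                  true
        -saturation_indices  Calcite
    END
    """
    phreeqc.RunString(input_string)                # run the reaction calculation
    results = phreeqc.GetSelectedOutputArray()     # table of user-selected output
    print(results)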
Modules based on the geochemical model PHREEQC for use in scripting and programming languages
Charlton, S.R.; Parkhurst, D.L.
2011-01-01
The geochemical model PHREEQC is capable of simulating a wide range of equilibrium reactions between water and minerals, ion exchangers, surface complexes, solid solutions, and gases. It also has a general kinetic formulation that allows modeling of nonequilibrium mineral dissolution and precipitation, microbial reactions, decomposition of organic compounds, and other kinetic reactions. To facilitate use of these reaction capabilities in scripting languages and other models, PHREEQC has been implemented in modules that easily interface with other software. A Microsoft COM (component object model) has been implemented, which allows PHREEQC to be used by any software that can interface with a COM server—for example, Excel®, Visual Basic®, Python, or MATLAB®. PHREEQC has been converted to a C++ class, which can be included in programs written in C++. The class also has been compiled in libraries for Linux and Windows that allow PHREEQC to be called from C++, C, and Fortran. A limited set of methods implements the full reaction capabilities of PHREEQC for each module. Input methods use strings or files to define reaction calculations in exactly the same formats used by PHREEQC. Output methods provide a table of user-selected model results, such as concentrations, activities, saturation indices, and densities. The PHREEQC module can add geochemical reaction capabilities to surface-water, groundwater, and watershed transport models. It is possible to store and manipulate solution compositions and reaction information for many cells within the module. In addition, the object-oriented nature of the PHREEQC modules simplifies implementation of parallel processing for reactive-transport models. The PHREEQC COM module may be used in scripting languages to fit parameters; to plot PHREEQC results for field, laboratory, or theoretical investigations; or to develop new models that include simple or complex geochemical calculations. © 2011.
A Cloud Based Framework For Monitoring And Predicting Subsurface System Behaviour
NASA Astrophysics Data System (ADS)
Versteeg, R. J.; Rodzianko, A.; Johnson, D. V.; Soltanian, M. R.; Dwivedi, D.; Dafflon, B.; Tran, A. P.; Versteeg, O. J.
2015-12-01
Subsurface system behavior is driven and controlled by the interplay of physical, chemical, and biological processes which occur at multiple temporal and spatial scales. Capabilities to monitor, understand and predict this behavior in an effective and timely manner are needed for both scientific purposes and for effective subsurface system management. Such capabilities require three elements: models, data, and an enabling cyberinfrastructure that allows users to apply these models and data effectively. Under a DOE Office of Science funded STTR award, Subsurface Insights and LBNL have designed and implemented a cloud-based predictive assimilation framework (PAF) which automatically ingests, quality controls and stores heterogeneous physical and chemical subsurface data and processes these data using different inversion and modeling codes to provide information on the current state and evolution of subsurface systems. PAF is implemented as a modular cloud-based software application with five components: (1) data acquisition, (2) data management, (3) data assimilation and processing, (4) visualization and result delivery and (5) orchestration. Server-side, PAF uses ZF2 (a PHP web application framework) and Python, and both open source (ODM2) and in-house developed data models. Client-side, PAF uses CSS and JS to allow for interactive data visualization and analysis. Client-side modularity (which allows for a responsive interface) is achieved by implementing each core capability of PAF (such as data visualization, user configuration and control, electrical geophysical monitoring and email/SMS alerts on data streams) as a SPA (Single Page Application). One of the recent enhancements is the full integration of a number of flow and mass transport and parameter estimation codes (e.g., MODFLOW, MT3DMS, PHT3D, TOUGH, PFLOTRAN) in this framework. This integration allows for autonomous and user-controlled modeling of hydrological and geochemical processes. In our presentation we will discuss our software architecture and present the results of using these codes and the overall performance of our framework using hydrological, geochemical and geophysical data from the LBNL SFA2 Rifle field site.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pinilla, Maria Isabel
This report seeks to study and benchmark code predictions against experimental data; determine parameters to match MCNP-simulated detector response functions to experimental stilbene measurements; add stilbene processing capabilities to DRiFT; and improve NEUANCE detector array modeling and analysis using new MCNP6 and DRiFT features.
Asynchronous Data Retrieval from an Object-Oriented Database
NASA Astrophysics Data System (ADS)
Gilbert, Jonathan P.; Bic, Lubomir
We present an object-oriented semantic database model which, similar to other object-oriented systems, combines the virtues of four concepts: the functional data model, a property inheritance hierarchy, abstract data types and message-driven computation. The main emphasis is on the last of these four concepts. We describe generic procedures that permit queries to be processed in a purely message-driven manner. A database is represented as a network of nodes and directed arcs, in which each node is a logical processing element, capable of communicating with other nodes by exchanging messages. This eliminates the need for shared memory and for centralized control during query processing. Hence, the model is suitable for implementation on a multiprocessor computer architecture, consisting of large numbers of loosely coupled processing elements.
New Decision Support for Landslide and Other Disaster Events
NASA Astrophysics Data System (ADS)
Nair, U. S.; Keiser, K.; Wu, Y.; Kaulfus, A.; Srinivasan, K.; Anderson, E. R.; McEniry, M.
2013-12-01
An Event-Driven Data Delivery (ED3) framework has been created that provides reusable services and configurations to support better data preparedness for decision support of disasters and other events by rapidly providing pre-planned access to data, special processing, modeling and other capabilities, all executed in response to criteria-based events. ED3 enables decision makers to plan, in advance of disasters and other types of events, for the data necessary for decisions and response activities. A layer of services provided in the ED3 framework allows systems to support user definition of subscriptions for data plans that will be triggered when events matching specified criteria occur. Pre-planning for data in response to events lessens the burden on decision makers in the aftermath of an event and allows planners to think through the desired processing for specialized data products. Additionally, the ED3 framework provides support for listening for event alerts and for multiple workflow managers that provide data and processing functionality in response to events. Landslides are often costly and, at times, deadly disaster events. Whereas intense and/or sustained rainfall is often the primary trigger for landslides, soil type and slope are also important factors in determining the location and timing of slope failure. Accounting for the substantial spatial variability of these factors is one of the major difficulties in predicting the timing and location of slope failures. A wireless sensor network (WSN), developed by NASA SERVIR and USRA, with peer-to-peer communication capability and low power consumption, is ideal for spatially dense in situ monitoring in remote locations. In collaboration with the University of Alabama in Huntsville, a WSN equipped with accelerometer, rainfall and soil moisture sensors is being integrated into an end-to-end landslide warning system. The WSN is being tested to ascertain communication capabilities and the density of nodes required depending upon the nature of the terrain and land cover. The performance of a water table model, to be utilized in the end-to-end system, is being evaluated by comparison against landslides that occurred during the 6th and 7th of May, 2003 and the 20th and 21st of April, 2011. The model provides a deterministic assessment of slope stability by evaluating horizontal and vertical transport of underground water and the associated weight-bearing capacity. In the proposed end-to-end system, the model will be coupled to the WSN, and the in situ data collected will be used to drive the model. The output from the model could be communicated back to the WSN, providing the capability of generating warnings of possible events to the ED3 framework to trigger additional data retrieval or the processing of additional models based on a decision maker's ED3 preparedness plans. NASA's Applied Science Program has funded a feasibility study of the ED3 technology and as a result the capability is on track to be integrated into existing decision support systems, with an initial reference implementation hosted at the Global Hydrology Resource Center, a NASA distributed active archive center (DAAC).
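As a generic illustration of the kind of slope-stability calculation such a water-table-driven assessment performs (the classical infinite-slope factor of safety, not necessarily the exact formulation used in the system described above):

    FS = \frac{c' + (\gamma z - m \gamma_w z) \cos^2\theta\, \tan\phi'}{\gamma z \sin\theta \cos\theta},

where c' is the effective cohesion, \phi' the friction angle, \gamma and \gamma_w the unit weights of soil and water, z the soil depth above the potential failure plane, \theta the slope angle, and m the saturated fraction of the soil column; in situ soil-moisture and rainfall data from the WSN would constrain m, and FS < 1 indicates predicted failure.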
Distributed collaborative environments for virtual capability-based planning
NASA Astrophysics Data System (ADS)
McQuay, William K.
2003-09-01
Distributed collaboration is an emerging technology that will significantly change how decisions are made in the 21st century. Collaboration involves two or more geographically dispersed individuals working together to share and exchange data, information, knowledge, and actions. The marriage of information, collaboration, and simulation technologies provides the decision maker with a collaborative virtual environment for planning and decision support. This paper reviews research focused on applying an open-standards, agent-based framework with integrated modeling and simulation to a new Air Force initiative in capability-based planning, and on the ability to implement it in a distributed virtual environment. The Virtual Capability Planning effort will provide decision-quality knowledge for Air Force resource allocation and investment planning, including examination of proposed capabilities and the cost of alternative approaches, the impact of technologies, identification of primary risk drivers, and creation of executable acquisition strategies. The transformed Air Force business processes are enabled by iterative use of constructive and virtual modeling, simulation, and analysis together with information technology. These tools are applied collaboratively via a technical framework by all the affected stakeholders - warfighter, laboratory, product center, logistics center, test center, and primary contractor.
Yang, Chen-Wei
2015-01-01
The main purpose of this study is to develop an innovation model for hospital organisations. For this purpose, this study explores and examines the determinants, capabilities and performance in the hospital sector. First, it discusses three categories of determinants that affect hospitals' innovative capabilities: (1) knowledge stock; (2) social ties; and (3) institutional pressures. Then, this study examines the idea of innovative hospital capabilities, defined as the ability of the hospital organisation to innovate their knowledge. Finally, the hospital evaluation rating, which identifies performance in the hospital sector, was examined. This study empirically tested the theoretical model at the organisation level. The findings suggest that a hospital's innovative capabilities are influenced by its knowledge stock, social ties and institutional pressures, and that these capabilities have an impact on hospital performance. However, in attempts to keep hospitals aligned with their highly institutionalised environments, it may prove necessary for hospital administrators to pay more attention to both existing knowledge stock and the process of innovation if the institutions are to survive. Finally, implications for theory and practitioners complete this study. Copyright © 2014 John Wiley & Sons, Ltd.
FACE-IT. A Science Gateway for Food Security Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montella, Raffaele; Kelly, David; Xiong, Wei
Progress in sustainability science is hindered by challenges in creating and managing complex data acquisition, processing, simulation, post-processing, and intercomparison pipelines. To address these challenges, we developed the Framework to Advance Climate, Economic, and Impact Investigations with Information Technology (FACE-IT) for crop and climate impact assessments. This integrated data processing and simulation framework enables data ingest from geospatial archives; data regridding, aggregation, and other processing prior to simulation; large-scale climate impact simulations with agricultural and other models, leveraging high-performance and cloud computing; and post-processing to produce aggregated yields and ensemble variables needed for statistics, for model intercomparison, and to connect biophysical models to global and regional economic models. FACE-IT leverages the capabilities of the Globus Galaxies platform to enable the capture of workflows and outputs in well-defined, reusable, and comparable forms. We describe FACE-IT and applications within the Agricultural Model Intercomparison and Improvement Project and the Center for Robust Decision-making on Climate and Energy Policy.
SpaceNet: Modeling and Simulating Space Logistics
NASA Technical Reports Server (NTRS)
Lee, Gene; Jordan, Elizabeth; Shishko, Robert; de Weck, Olivier; Armar, Nii; Siddiqi, Afreen
2008-01-01
This paper summarizes the current state of the art in interplanetary supply chain modeling and discusses SpaceNet as one particular method and tool to address space logistics modeling and simulation challenges. Fundamental upgrades to the interplanetary supply chain framework such as process groups, nested elements, and cargo sharing, enabled SpaceNet to model an integrated set of missions as a campaign. The capabilities and uses of SpaceNet are demonstrated by a step-by-step modeling and simulation of a lunar campaign.
Assessment of predictive capabilities for aerodynamic heating in hypersonic flow
NASA Astrophysics Data System (ADS)
Knight, Doyle; Chazot, Olivier; Austin, Joanna; Badr, Mohammad Ali; Candler, Graham; Celik, Bayram; Rosa, Donato de; Donelli, Raffaele; Komives, Jeffrey; Lani, Andrea; Levin, Deborah; Nompelis, Ioannis; Panesi, Marco; Pezzella, Giuseppe; Reimann, Bodo; Tumuklu, Ozgur; Yuceil, Kemal
2017-04-01
The capability for CFD prediction of hypersonic shock wave laminar boundary layer interaction was assessed for a double wedge model at Mach 7.1 in air and nitrogen at 2.1 MJ/kg and 8 MJ/kg. Simulations were performed by seven research organizations encompassing both Navier-Stokes and Direct Simulation Monte Carlo (DSMC) methods as part of the NATO STO AVT Task Group 205 activity. Comparison of the CFD simulations with experimental heat transfer and schlieren visualization suggests the need for accurate modeling of the tunnel startup process in short-duration hypersonic test facilities, and the importance of fully 3-D simulations of nominally 2-D (i.e., non-axisymmetric) experimental geometries.
NASA Technical Reports Server (NTRS)
Davis, George; Cary, Everett; Higinbotham, John; Burns, Richard; Hogie, Keith; Hallahan, Francis
2003-01-01
The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.
A Distributed Simulation Software System for Multi-Spacecraft Missions
NASA Technical Reports Server (NTRS)
Burns, Richard; Davis, George; Cary, Everett
2003-01-01
The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.
Above the cloud computing orbital services distributed data model
NASA Astrophysics Data System (ADS)
Straub, Jeremy
2014-05-01
Technology miniaturization and system architecture advancements have created an opportunity to significantly lower the cost of many types of space missions by sharing capabilities between multiple spacecraft. Historically, most spacecraft have been atomic entities that (aside from their communications with and tasking by ground controllers) operate in isolation. Several notable examples exist; however, these are purpose-designed systems that collaborate to perform a single goal. The above the cloud computing (ATCC) concept aims to create ad-hoc collaboration between service provider and consumer craft. Consumer craft can procure processing, data transmission, storage, imaging and other capabilities from provider craft. Because of onboard storage limitations, communications link capability limitations and limited windows of communication, data relevant to or required for various operations may span multiple craft. This paper presents a model for the identification, storage and accessing of this data. This model includes appropriate identification features for this highly distributed environment. It also deals with business model constraints such as data ownership, retention and the rights of the storing craft to access, resell, transmit or discard the data in its possession. The model ensures data integrity and confidentiality (to the extent applicable to a given data item), deals with unique constraints of the orbital environment and tags data with business model (contractual) obligation data.
Process Engineering Technology Center Initiative
NASA Technical Reports Server (NTRS)
Centeno, Martha A.
2001-01-01
NASA's Kennedy Space Center (KSC) is developing as a world-class Spaceport Technology Center (STC). From a process engineering (PE) perspective, the facilities used for flight hardware processing at KSC are NASA's premier factories. The products of these factories are safe, successful shuttle and expendable vehicle launches carrying state-of-the-art payloads. PE is devoted to process design, process management, and process improvement, rather than product design. PE also emphasizes the relationships of workers with systems and processes. Thus, it is difficult to speak of having a laboratory for PE at KSC because the entire facility is practically a laboratory when observed from a macro level perspective. However, it becomes necessary, at times, to show and display how KSC has benefited from PE and how KSC has contributed to the development of PE; hence, it has been proposed that a Process Engineering Technology Center (PETC) be developed to offer a place with a centralized focus on PE projects, and a place where KSC's PE capabilities can be showcased, and a venue where new Process Engineering technologies can be investigated and tested. Graphics for showcasing PE capabilities have been designed, and two initial test beds for PE technology research have been identified. Specifically, one test bed will look into the use of wearable computers with head mounted displays to deliver work instructions; the other test bed will look into developing simulation models that can be assembled into one to create a hierarchical model.
Process Engineering Technology Center Initiative
NASA Technical Reports Server (NTRS)
Centeno, Martha A.
2002-01-01
NASA's Kennedy Space Center (KSC) is developing as a world-class Spaceport Technology Center (STC). From a process engineering (PE) perspective, the facilities used for flight hardware processing at KSC are NASA's premier factories. The products of these factories are safe, successful shuttle and expendable vehicle launches carrying state-of-the-art payloads. PE is devoted to process design, process management, and process improvement, rather than product design. PE also emphasizes the relationships of workers with systems and processes. Thus, it is difficult to speak of having a laboratory for PE at KSC because the entire facility is practically a laboratory when observed from a macro level perspective. However, it becomes necessary, at times, to show and display how KSC has benefited from PE and how KSC has contributed to the development of PE; hence, it has been proposed that a Process Engineering Technology Center (PETC) be developed to offer a place with a centralized focus on PE projects, and a place where KSC's PE capabilities can be showcased, and a venue where new Process Engineering technologies can be investigated and tested. Graphics for showcasing PE capabilities have been designed, and two initial test beds for PE technology research have been identified. Specifically, one test bed will look into the use of wearable computers with head mounted displays to deliver work instructions; the other test bed will look into developing simulation models that can be assembled into one to create a hierarchical model.
Probabilistic Fracture Mechanics of Reactor Pressure Vessels with Populations of Flaws
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin; Backman, Marie; Williams, Paul
This report documents recent progress in developing a tool that uses the Grizzly and RAVEN codes to perform probabilistic fracture mechanics analyses of reactor pressure vessels in light water reactor nuclear power plants. The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. Because of the central role of the reactor pressure vessel (RPV) in a nuclear power plant, particular emphasis is being placed on developing capabilities to model fracture in embrittled RPVs to aid in the process surrounding decision making relating to life extension of existing plants. A typical RPV contains a large population of pre-existing flaws introduced during the manufacturing process. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation at one or more of these flaws during a transient event. This report documents development and initial testing of a capability to perform probabilistic fracture mechanics of large populations of flaws in RPVs using reduced order models to compute fracture parameters. The work documented here builds on prior efforts to perform probabilistic analyses of a single flaw with uncertain parameters, as well as earlier work to develop deterministic capabilities to model the thermo-mechanical response of the RPV under transient events, and compute fracture mechanics parameters at locations of pre-defined flaws. The capabilities developed as part of this work provide a foundation for future work, which will develop a platform that provides the flexibility needed to consider scenarios that cannot be addressed with the tools used in current practice.
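To make the probabilistic-fracture-mechanics idea concrete, the following minimal Monte Carlo sketch samples a population of flaws per vessel and compares a toy stress-intensity surrogate against a toy toughness curve; it is illustrative only (not the Grizzly/RAVEN implementation), and all distributions, formulas, and numbers are assumptions.

    # Illustrative Monte Carlo over a population of flaws (not the Grizzly/RAVEN implementation).
    import numpy as np

    rng = np.random.default_rng(0)

    def k_applied(depth_mm, stress_mpa):
        # Toy surrogate for a reduced-order fracture model: K_I ~ stress * sqrt(pi * a)
        a = depth_mm / 1000.0  # depth in meters
        return stress_mpa * np.sqrt(np.pi * a)

    def k_ic(temperature_c):
        # Toy embrittled-RPV toughness trend (illustrative only).
        return 40.0 + 0.5 * max(temperature_c, 0.0)

    n_vessels = 10_000
    flaws_per_vessel = rng.poisson(lam=50, size=n_vessels)
    initiations = 0
    for n_flaws in flaws_per_vessel:
        depths = rng.lognormal(mean=0.0, sigma=0.5, size=n_flaws)  # flaw depths, mm
        stress = rng.normal(200.0, 20.0)                           # transient stress, MPa
        temp = rng.normal(100.0, 30.0)                             # crack-tip temperature, deg C
        if np.any(k_applied(depths, stress) > k_ic(temp)):
            initiations += 1

    print(f"estimated P(crack initiation) = {initiations / n_vessels:.4f}")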
Model-Based Engineering Design for Trade Space Exploration throughout the Design Cycle
NASA Technical Reports Server (NTRS)
Lamassoure, Elisabeth S.; Wall, Stephen D.; Easter, Robert W.
2004-01-01
This paper presents ongoing work to standardize model-based system engineering as a complement to point design development in the conceptual design phase of deep space missions. It summarizes two first steps towards practical application of this capability within the framework of concurrent engineering design teams and their customers. The first step is standard generation of system sensitivity models as the output of concurrent engineering design sessions, representing the local trade space around a point design. A review of the chosen model development process, and the results of three case study examples, demonstrate that a simple update to the concurrent engineering design process can easily capture sensitivities to key requirements. It can serve as a valuable tool to analyze design drivers and uncover breakpoints in the design. The second step is development of rough-order-of-magnitude, broad-range-of-validity design models for rapid exploration of the trade space, before selection of a point design. At least one case study demonstrated the feasibility to generate such models in a concurrent engineering session. The experiment indicated that such a capability could yield valid system-level conclusions for a trade space composed of understood elements. Ongoing efforts are assessing the practicality of developing end-to-end system-level design models for use before even convening the first concurrent engineering session, starting with modeling an end-to-end Mars architecture.
NASA Technical Reports Server (NTRS)
Starr, D. O'C. (Editor); Melfi, S. Harvey (Editor)
1991-01-01
The proposed GEWEX Water Vapor Project (GVaP) addresses fundamental deficiencies in the present understanding of moist atmospheric processes and the role of water vapor in the global hydrologic cycle and climate. Inadequate knowledge of the distribution of atmospheric water vapor and its transport is a major impediment to progress in achieving a fuller understanding of various hydrologic processes and a capability for reliable assessment of potential climatic change on global and regional scales. GVaP will promote significant improvements in knowledge of atmospheric water vapor and moist processes as well as in present capabilities to model these processes on global and regional scales. GVaP complements a number of ongoing and planned programs focused on various aspects of the hydrologic cycle. The goal of GVaP is to improve understanding of the role of water vapor in meteorological, hydrological, and climatological processes through improved knowledge of water vapor and its variability on all scales. A detailed description of the GVaP is presented.
Digital Signal Processing and Control for the Study of Gene Networks
NASA Astrophysics Data System (ADS)
Shin, Yong-Jun
2016-04-01
Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.
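As a toy illustration of the digital-control viewpoint sketched above (not taken from the article), the following Python snippet drives a first-order discrete-time gene-expression model x[k+1] = a*x[k] + b*u[k] toward a setpoint with a digital PI controller; all parameter values are assumptions.

    # Minimal sketch: first-order discrete-time expression model under a digital PI controller.
    a, b = 0.9, 0.1          # decay and input gain (assumed values)
    kp, ki = 2.0, 0.5        # PI gains (assumed)
    setpoint = 1.0
    x, integral = 0.0, 0.0
    trajectory = []
    for k in range(100):
        error = setpoint - x
        integral += error
        u = max(kp * error + ki * integral, 0.0)  # inducer level cannot be negative
        x = a * x + b * u                         # expression dynamics
        trajectory.append(x)

    print(f"steady-state expression level ~ {trajectory[-1]:.3f}")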
Digital Signal Processing and Control for the Study of Gene Networks.
Shin, Yong-Jun
2016-04-22
Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.
Digital Signal Processing and Control for the Study of Gene Networks
Shin, Yong-Jun
2016-01-01
Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828
NASA Astrophysics Data System (ADS)
Preiss, Bruce; Greene, Lloyd; Kriebel, Jamie; Wasson, Robert
2006-05-01
The Air Force Research Laboratory utilizes a value model as a primary input for space technology planning and budgeting. The Space Sector at AFRL headquarters manages space technology investment across all the geographically disparate technical directorates and ensures that integrated planning is achieved across the space community. The space investment portfolio must ultimately balance near, mid, and far-term investments across all the critical space mission areas. Investment levels and growth areas can always be identified by a typical capability analysis or gap analysis, but the value model approach goes one step deeper and helps identify the potential payoff of technology investments by linking the technology directly to an existing or potential concept. The value of the technology is then viewed from the enabling performance perspective of the concept that ultimately fulfills the Air Force mission. The process of linking space technologies to future concepts and technology roadmaps will be reviewed in this paper, along with representative results from this planning cycle. The initial assumptions in this process will be identified along with the strengths and weaknesses of this planning methodology.
[Research on identification of species of fruit trees by spectral analysis].
Xing, Dong-Xing; Chang, Qing-Rui
2009-07-01
Using the spectral reflectance data (R(lambda)) of canopies, the present paper identifies seven species of fruit trees bearing fruit in the fruit mature period. Firstly, it compares the fruit tree species identification capability of six kinds of satellite sensors and four kinds of vegetation indexes through re-sampling the spectral data with six kinds of pre-defined filter functions and the related data processing of calculating vegetation indexes. Then, it constructs a BP neural network model for identifying seven species of fruit trees on the basis of choosing the best transformation of R(lambda) and optimizing the model parameters. The main conclusions are: (1) the order of the identification capability of the six kinds of satellite sensors from strong to weak is: MODIS, ASTER, ETM+, HRG, QUICKBIRD and IKONOS; (2) among the four kinds of vegetation indexes, the identification capability of RVI is the most powerful, the next is NDVI, while the identification capability of SAVI or DVI is relatively weak; (3) the identification capability of RVI and NDVI calculated with the reflectance of near-infrared and red channels of the ETM+ or MODIS sensor is relatively powerful; (4) among R(lambda) and its 22 kinds of transformation data, d1[log(1/R(lambda))] (derivative gap set to 9 nm) is the best transformation for constructing a BP neural network model; (5) the paper constructs a 3-layer BP neural network model for identifying seven species of fruit trees using the best transformation of R(lambda), which is d1[log(1/R(lambda))] (derivative gap set to 9 nm).
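A back-propagation (BP) network of the kind described is essentially a small multilayer perceptron classifier. The sketch below shows the general shape of such a 3-layer model with scikit-learn; the synthetic feature matrix stands in for derivative-spectrum features and is an assumption, not the paper's data.

    # Hedged sketch of a 3-layer BP (MLP) classifier for seven species from spectral features.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    X = rng.normal(size=(700, 20))      # 20 derivative-spectrum features (placeholder data)
    y = rng.integers(0, 7, size=700)    # labels for 7 fruit tree species (placeholder data)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = MLPClassifier(hidden_layer_sizes=(15,), max_iter=2000, random_state=0)
    model.fit(X_train, y_train)
    print("test accuracy:", model.score(X_test, y_test))  # near chance here, since the data are random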
Li, Liang; Wang, Yiying; Xu, Jiting; Flora, Joseph R V; Hoque, Shamia; Berge, Nicole D
2018-08-01
Hydrothermal carbonization (HTC) is a wet, low temperature thermal conversion process that continues to gain attention for the generation of hydrochar. The influence of specific process conditions and feedstock properties on hydrochar characteristics is not well understood. To evaluate this, linear and non-linear models were developed to describe hydrochar characteristics based on data collected from HTC-related literature. A Sobol analysis was subsequently conducted to identify parameters that most influence hydrochar characteristics. Results from this analysis indicate that for each investigated hydrochar property, the model fit and predictive capability associated with the random forest models is superior to both the linear and regression tree models. Based on results from the Sobol analysis, the feedstock properties and process conditions most influential on hydrochar yield, carbon content, and energy content were identified. In addition, a variational process parameter sensitivity analysis was conducted to determine how feedstock property importance changes with process conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
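The random-forest-plus-Sobol workflow can be sketched in a few lines with scikit-learn and SALib; the synthetic "literature" data, variable names, and bounds below are placeholders, not the study's dataset.

    # Sketch of a random-forest surrogate followed by Sobol sensitivity analysis (SALib assumed installed).
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    rng = np.random.default_rng(2)
    # Placeholder data: [temperature C, time h, feedstock carbon %] -> hydrochar yield %
    X = rng.uniform([180, 1, 30], [300, 24, 60], size=(500, 3))
    y = 80 - 0.1 * X[:, 0] - 0.5 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 1, 500)

    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

    problem = {"num_vars": 3,
               "names": ["temperature", "time", "feedstock_carbon"],
               "bounds": [[180, 300], [1, 24], [30, 60]]}
    samples = saltelli.sample(problem, 1024)       # Saltelli design over the input space
    Si = sobol.analyze(problem, rf.predict(samples))
    print(dict(zip(problem["names"], Si["ST"])))   # total-order sensitivity indices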
Somekh, Judith; Choder, Mordechai; Dori, Dov
2012-01-01
We propose a Conceptual Model-based Systems Biology framework for qualitative modeling, executing, and eliciting knowledge gaps in molecular biology systems. The framework is an adaptation of Object-Process Methodology (OPM), a graphical and textual executable modeling language. OPM enables concurrent representation of the system's structure—the objects that comprise the system, and behavior—how processes transform objects over time. Applying a top-down approach of recursively zooming into processes, we model a case in point—the mRNA transcription cycle. Starting with this high level cell function, we model increasingly detailed processes along with participating objects. Our modeling approach is capable of modeling molecular processes such as complex formation, localization and trafficking, molecular binding, enzymatic stimulation, and environmental intervention. At the lowest level, similar to the Gene Ontology, all biological processes boil down to three basic molecular functions: catalysis, binding/dissociation, and transporting. During modeling and execution of the mRNA transcription model, we discovered knowledge gaps, which we present and classify into various types. We also show how model execution enhances coherent model construction. Identifying and pinpointing knowledge gaps is an important feature of the framework, as it suggests where research should focus and whether conjectures about uncertain mechanisms fit into the already verified model. PMID:23308089
U-10Mo Baseline Fuel Fabrication Process Description
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hubbard, Lance R.; Arendt, Christina L.; Dye, Daniel F.
This document provides a description of the U.S. High Power Research Reactor (USHPRR) low-enriched uranium (LEU) fuel fabrication process. This document is intended to be used in conjunction with the baseline process flow diagram (PFD) presented in Appendix A. The baseline PFD is used to document the fabrication process, communicate gaps in technology or manufacturing capabilities, convey alternatives under consideration, and as the basis for a dynamic simulation model of the fabrication process. The simulation model allows for the assessment of production rates, costs, and manufacturing requirements (manpower, fabrication space, numbers and types of equipment, etc.) throughout the lifecycle of the USHPRR program. This document, along with the accompanying PFD, is updated regularly.
Developing a Model for Identifying Students at Risk of Failure in a First Year Accounting Unit
ERIC Educational Resources Information Center
Smith, Malcolm; Therry, Len; Whale, Jacqui
2012-01-01
This paper reports on the process involved in attempting to build a predictive model capable of identifying students at risk of failure in a first year accounting unit in an Australian university. Identifying attributes that contribute to students being at risk can lead to the development of appropriate intervention strategies and support…
Biosecurity through Public Health System Design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beyeler, Walter E.; Finley, Patrick D.; Arndt, William
We applied modeling and simulation to examine the real-world tradeoffs between developing-country public-health improvement and the need to improve the identification, tracking, and security of agents with bio-weapons potential. Traditionally, the international community has applied facility-focused strategies for improving biosecurity and biosafety. This work examines how system-level assessments and improvements can foster biosecurity and biosafety. We modeled medical laboratory resources and capabilities to identify scenarios where biosurveillance goals are transparently aligned with public health needs, and resources are distributed in a way that maximizes their ability to serve patients while minimizing security and safety risks. Our modeling platform simulates key processes involved in healthcare system operation, such as sample collection, transport, and analysis at medical laboratories. The research reported here extends the prior art by providing two key components for comparative performance assessment: a model of patient interaction dynamics, and the capability to perform uncertainty quantification. In addition, we have outlined a process for incorporating quantitative biosecurity and biosafety risk measures. Two test problems were used to exercise these research products, examining (a) systemic effects of technological innovation and (b) right-sizing of laboratory networks.
Lu, Sen; Ren, Tusheng; Lu, Yili; Meng, Ping; Sun, Shiyou
2014-01-01
Accurate estimation of the soil water retention curve (SWRC) at the dry region is required to describe the relation between soil water content and matric suction from saturation to oven dryness. In this study, the extrapolative capability of two models for predicting the complete SWRC from limited ranges of soil water retention data was evaluated. When the model parameters were obtained from SWRC data in the 0-1500 kPa range, the FX model (Fredlund and Xing, 1994) estimations agreed well with measurements from saturation to oven dryness, with RMSEs less than 0.01. The GG model (Groenevelt and Grant, 2004) produced larger errors at the dry region, with significantly larger RMSEs and MEs than the FX model. Further evaluations indicated that when SWRC measurements in the 0-100 kPa suction range were applied for model establishment, the FX model was capable of producing acceptable SWRCs across the entire water content range. For higher accuracy, the FX model requires soil water retention data at least in the 0-300 kPa range to extend the SWRC to oven dryness. Compared with the Khlosi et al. (2006) model, which requires measurements in the 0-500 kPa range to reproduce the complete SWRCs, the FX model has the advantage of requiring fewer SWRC measurements. Thus the FX modeling approach has the potential to eliminate the processes for measuring soil water retention in the dry range.
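The extrapolation idea can be illustrated by fitting a simplified Fredlund-Xing curve, theta(psi) = theta_s / [ln(e + (psi/a)^n)]^m, to limited-range data and evaluating it at high suction; the correction factor is omitted and the measurement values below are placeholders, not the study's soils.

    # Hedged sketch: fitting a simplified Fredlund-Xing retention curve with SciPy.
    import numpy as np
    from scipy.optimize import curve_fit

    def fx(psi, theta_s, a, n, m):
        return theta_s / np.log(np.e + (psi / a) ** n) ** m

    # Placeholder measurements in the 0-300 kPa range (suction in kPa, water content cm3/cm3)
    psi = np.array([1, 5, 10, 33, 100, 300], dtype=float)
    theta = np.array([0.42, 0.40, 0.37, 0.31, 0.25, 0.20])

    popt, _ = curve_fit(fx, psi, theta, p0=[0.42, 50, 1.5, 1.0], bounds=(1e-6, np.inf))
    print("fitted theta_s, a, n, m =", popt)
    print("extrapolated theta at 1e5 kPa (toward oven dryness):", fx(1e5, *popt))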
Advanced Machine Learning Emulators of Radiative Transfer Models
NASA Astrophysics Data System (ADS)
Camps-Valls, G.; Verrelst, J.; Martino, L.; Vicent, J.
2017-12-01
Physically-based model inversion methodologies are based on physical laws and established cause-effect relationships. A plethora of remote sensing applications rely on the physical inversion of a Radiative Transfer Model (RTM), which leads to physically meaningful bio-geo-physical parameter estimates. The process is however computationally expensive and needs expert knowledge for the selection of the RTM, its parametrization and the look-up table generation, as well as its inversion. Mimicking complex codes with statistical nonlinear machine learning algorithms has very recently become the natural alternative. Emulators are statistical constructs able to approximate the RTM at a fraction of the computational cost, providing an estimation of uncertainty and estimations of the gradient or finite integral forms. We review the field and recent advances in the emulation of RTMs with machine learning models. We posit Gaussian processes (GPs) as the proper framework to tackle the problem. Furthermore, we introduce an automatic methodology to construct emulators for costly RTMs. The Automatic Gaussian Process Emulator (AGAPE) methodology combines the interpolation capabilities of GPs with the accurate design of an acquisition function that favours sampling in low density regions and flatness of the interpolation function. We illustrate the good capabilities of our emulators in toy examples, leaf- and canopy-level PROSPECT and PROSAIL RTMs, and for the construction of an optimal look-up-table for atmospheric correction based on MODTRAN5.
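A minimal GP-emulation sketch (not the AGAPE implementation) follows: a cheap placeholder function stands in for an expensive RTM such as PROSAIL, a GP is trained on a small design of forward-model runs, and predictions come with uncertainty estimates.

    # Minimal Gaussian-process emulator sketch with scikit-learn.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    def expensive_rtm(x):
        # Placeholder forward model (e.g., reflectance vs. leaf area index and chlorophyll).
        return np.sin(3 * x[:, 0]) * np.exp(-0.1 * x[:, 1])

    rng = np.random.default_rng(3)
    X_train = rng.uniform(0, 5, size=(60, 2))        # small training design over the input space
    y_train = expensive_rtm(X_train)

    gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
    gp.fit(X_train, y_train)

    X_new = rng.uniform(0, 5, size=(5, 2))
    mean, std = gp.predict(X_new, return_std=True)   # emulator prediction plus uncertainty
    print(mean, std)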
Status Report on Modelling and Simulation Capabilities for Nuclear-Renewable Hybrid Energy Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rabiti, C.; Epiney, A.; Talbot, P.
This report summarizes the current status of the modeling and simulation capabilities developed for the economic assessment of Nuclear-Renewable Hybrid Energy Systems (N-R HES). The increasing penetration of variable renewables is altering the profile of the net demand, with which the other generators on the grid have to cope. N-R HES analyses are being conducted to determine the potential feasibility of mitigating the resultant volatility in the net electricity demand by adding industrial processes that utilize either thermal or electrical energy as stabilizing loads. This coordination of energy generators and users is proposed to mitigate the increase in electricity cost and cost volatility through the production of a saleable commodity. Overall, the financial performance of a system that is comprised of peaking units (i.e. gas turbine), baseload supply (i.e. nuclear power plant), and an industrial process (e.g. hydrogen plant) should be optimized under the constraint of satisfying an electricity demand profile with a certain level of variable renewable (wind) penetration. The optimization should entail both the sizing of the components/subsystems that comprise the system and the optimal dispatch strategy (output at any given moment in time from the different subsystems). Some of the capabilities described here have been reported separately in [1, 2, 3]. The purpose of this report is to provide an update on the improvement and extension of those capabilities and to illustrate their integrated application in the economic assessment of N-R HES.
Generic Raman-based calibration models enabling real-time monitoring of cell culture bioreactors.
Mehdizadeh, Hamidreza; Lauri, David; Karry, Krizia M; Moshgbar, Mojgan; Procopio-Melino, Renee; Drapeau, Denis
2015-01-01
Raman-based multivariate calibration models have been developed for real-time in situ monitoring of multiple process parameters within cell culture bioreactors. Developed models are generic, in the sense that they are applicable to various products, media, and cell lines based on Chinese Hamster Ovarian (CHO) host cells, and are scalable to large pilot and manufacturing scales. Several batches using different CHO-based cell lines and corresponding proprietary media and process conditions have been used to generate calibration datasets, and models have been validated using independent datasets from separate batch runs. All models have been validated to be generic and capable of predicting process parameters with acceptable accuracy. The developed models allow monitoring multiple key bioprocess metabolic variables, and hence can be utilized as an important enabling tool for Quality by Design approaches which are strongly supported by the U.S. Food and Drug Administration. © 2015 American Institute of Chemical Engineers.
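Multivariate Raman calibration of the kind described is commonly built with latent-variable regression such as partial least squares. The sketch below shows that general pattern with scikit-learn; the synthetic spectra and reference values are placeholders, not the proprietary CHO datasets.

    # Illustrative multivariate calibration with partial least squares regression.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)
    spectra = rng.normal(size=(120, 800))                        # 120 scans x 800 Raman shifts (placeholder)
    glucose = spectra[:, 100] * 2.0 + rng.normal(0, 0.1, 120)    # synthetic offline reference values

    pls = PLSRegression(n_components=5)
    print("cross-validated R^2:", cross_val_score(pls, spectra, glucose, cv=5).mean())
    pls.fit(spectra, glucose)                                    # final calibration model for in situ prediction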
The Transfer Function Model as a Tool to Study and Describe Space Weather Phenomena
NASA Technical Reports Server (NTRS)
Porter, Hayden S.; Mayr, Hans G.; Bhartia, P. K. (Technical Monitor)
2001-01-01
The Transfer Function Model (TFM) is a semi-analytical, linear model that is designed especially to describe thermospheric perturbations associated with magnetic storms and substorm activity. It is a multi-constituent model (N2, O, He, H, Ar) that accounts for wind-induced diffusion, which significantly affects not only the composition and mass density but also the temperature and wind fields. Because the TFM adopts a semi-analytic approach in which the geometry and temporal dependencies of the driving sources are removed through the use of height-integrated Green's functions, it provides physical insight into the essential properties of the processes being considered, uncluttered by the accidental complexities that arise from particular source geometries and time dependences. Extending from the ground to 700 km, the TFM eliminates spurious effects due to arbitrarily chosen boundary conditions. A database of transfer functions, computed only once, can be used to synthesize a wide range of spatial and temporal source dependencies. The response synthesis can be performed quickly in real time using only limited computing capabilities. These features make the TFM unique among global dynamical models. Given these desirable properties, a version of the TFM has been developed for personal computers (PC) using advanced platform-independent 3D visualization capabilities. We demonstrate the model capabilities with simulations for different auroral sources, including the response of ducted gravity wave modes that propagate around the globe. The thermospheric response is found to depend strongly on the spatial and temporal frequency spectra of the storm. Such varied behavior is difficult to describe in statistical empirical models. To improve the capability of space weather prediction, the TFM thus could be grafted naturally onto existing statistical models using data assimilation.
Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona
2012-01-01
Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities such as Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study will point to an area where the application of quality improvement and quality risk assessment principles for achievement of six sigma-capable processes is possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing of trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable process that is liable for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
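The capability analysis mentioned above reduces to the textbook Cp and Cpk indices; the short sketch below computes them from sample data, with the specification limits chosen purely as placeholders for a tablet attribute such as weight.

    # Standard process-capability indices from sample data (textbook formulas).
    import numpy as np

    def capability(values, lsl, usl):
        mu, sigma = np.mean(values), np.std(values, ddof=1)
        cp = (usl - lsl) / (6 * sigma)                 # potential capability
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)    # capability accounting for centering
        return cp, cpk

    weights = np.random.default_rng(5).normal(250.0, 2.0, size=200)   # tablet weights, mg (synthetic)
    cp, cpk = capability(weights, lsl=242.5, usl=257.5)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")   # Cpk around 2 is commonly associated with six-sigma capability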
MaMR: High-performance MapReduce programming model for material cloud applications
NASA Astrophysics Data System (ADS)
Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng
2017-02-01
With the increasing data size in materials science, existing programming models no longer satisfy the application requirements. MapReduce is a programming model that enables the easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related data, and the processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we defined a programming model for material cloud applications, called MaMR, that supports multiple different Map and Reduce functions running concurrently based on a hybrid shared-memory BSP model. An optimized data sharing strategy to supply the shared data to the different Map and Reduce stages was also designed. We added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework present effective performance improvements compared to previous work.
NASA Technical Reports Server (NTRS)
Aschwanden, Markus J.; Poland, Arthur I.; Rabin, Douglas M.; Fisher, Richard R. (Technical Monitor)
2001-01-01
We focus on new observational capabilities (Yohkoh, SoHO, TRACE), modeling approaches, and insights into physical processes of the solar corona. The most impressive new results and problems discussed in this article can be appreciated from the movies available on the Annual Reviews web site.
DOT National Transportation Integrated Search
2015-04-01
Research done through the Second Strategic Highway Research Program (SHRP 2) determined that agencies with the most effective transportation systems management and operations (TSM&O) activities were differentiated not by budgets or technical skills a...
Organizational Learning through Transformational Leadership
ERIC Educational Resources Information Center
Imran, Muhammad Kashif; Ilyas, Muhammad; Aslam, Usman; Ubaid-Ur-Rahman
2016-01-01
Purpose: The transformation of firms from resource-based-view to knowledge-based-view has extended the importance of organizational learning. Thus, this study aims to develop an organizational learning model through transformational leadership with indirect effect of knowledge management process capability and interactive role of…
Kim, Chang-Sei; Ansermino, J. Mark; Hahn, Jin-Oh
2016-01-01
The goal of this study is to derive a minimally complex but credible model of respiratory CO2 gas exchange that may be used in systematic design and pilot testing of closed-loop end-tidal CO2 controllers in mechanical ventilation. We first derived a candidate model that captures the essential mechanisms involved in the respiratory CO2 gas exchange process. Then, we simplified the candidate model to derive two lower-order candidate models. We compared these candidate models for predictive capability and reliability using experimental data collected from 25 pediatric subjects undergoing dynamically varying mechanical ventilation during surgical procedures. A two-compartment model equipped with transport delay to account for CO2 delivery between the lungs and the tissues showed modest but statistically significant improvement in predictive capability over the same model without transport delay. Aggregating the lungs and the tissues into a single compartment further degraded the predictive fidelity of the model. In addition, the model equipped with transport delay demonstrated superior reliability to the one without transport delay. Further, the respiratory parameters derived from the model equipped with transport delay, but not the one without transport delay, were physiologically plausible. The results suggest that gas transport between the lungs and the tissues must be taken into account to accurately reproduce the respiratory CO2 gas exchange process under wide-ranging and dynamically varying mechanical ventilation conditions. PMID:26870728
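To illustrate the structure of a two-compartment model with lung-tissue transport delay (and only the structure; the equations and parameter values below are rough assumptions, not the identified pediatric model), a discrete-time simulation can be written as follows.

    # Hedged sketch: two CO2 compartments coupled through a fixed transport delay.
    import numpy as np

    dt = 0.01                    # time step, min
    delay_steps = int(0.25 / dt) # ~15 s lung-tissue transport delay (assumed)
    V_lung, V_tissue = 3.0, 15.0 # effective compartment volumes (assumed, arbitrary units)
    Q = 3.0                      # perfusion-like exchange rate (assumed)
    k_vent = 0.2                 # ventilation washout gain (assumed)
    k_met = 5.0                  # metabolic CO2 production term (assumed)

    def simulate(minute_ventilation, t_end=20.0):
        n = int(t_end / dt)
        p_lung, p_tissue = 40.0, 46.0            # partial pressures, mmHg
        history = [p_tissue] * delay_steps       # buffer implementing the transport delay
        out = np.zeros(n)
        for k in range(n):
            delayed = history.pop(0)
            # Lungs: CO2 arrives with delayed venous return, is removed by ventilation.
            dp_lung = (Q * (delayed - p_lung) - k_vent * minute_ventilation * p_lung) / V_lung
            # Tissues: metabolic production plus exchange with arterial (lung) CO2.
            dp_tissue = (k_met + Q * (p_lung - p_tissue)) / V_tissue
            p_lung += dt * dp_lung
            p_tissue += dt * dp_tissue
            history.append(p_tissue)
            out[k] = p_lung
        return out

    print("end-tidal CO2 proxy after a ventilation step:", round(simulate(5.0)[-1], 1))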
New atmospheric sensor analysis study
NASA Technical Reports Server (NTRS)
Parker, K. G.
1989-01-01
The functional capabilities of the ESAD Research Computing Facility are discussed. The system is used in processing atmospheric measurements which are used in the evaluation of sensor performance, conducting design-concept simulation studies, and also in modeling the physical and dynamical nature of atmospheric processes. The results may then be evaluated to furnish inputs into the final design specifications for new space sensors intended for future Spacelab, Space Station, and free-flying missions. In addition, data gathered from these missions may subsequently be analyzed to provide better understanding of requirements for numerical modeling of atmospheric phenomena.
NASA Astrophysics Data System (ADS)
Hickmott, Curtis W.
Cellular core tooling is a new technology which has the capability to manufacture complex integrated monolithic composite structures. This novel tooling method utilizes thermoplastic cellular cores as inner tooling. The semi-rigid nature of the cellular cores makes them convenient for lay-up, and under autoclave temperature and pressure they soften and expand, providing uniform compaction on all surfaces including internal features such as ribs and spar tubes. This process has the capability of developing fully optimized aerospace structures by reducing or eliminating assembly using fasteners or bonded joints. The technology is studied in the context of evaluating its capabilities, advantages, and limitations in developing high quality structures. The complex nature of these parts has led to development of a model using the Finite Element Analysis (FEA) software Abaqus and the plug-in COMPRO Common Component Architecture (CCA) provided by Convergent Manufacturing Technologies. This model utilizes a "virtual autoclave" technique to simulate temperature profiles, resin flow paths, and ultimately deformation from residual stress. A model has been developed simulating the temperature profile during curing of composite parts made with the cellular core technology. While modeling of composites has been performed in the past, this project takes that existing knowledge and applies it to this new manufacturing method, which is capable of building more complex parts, and develops a model designed specifically for large, complex components with a high degree of accuracy. The model development has been carried out in conjunction with experimental validation. A double box beam structure was chosen for analysis to determine the effects of the technology on internal ribs and joints. Double box beams were manufactured and sectioned into T-joints for characterization. Mechanical characterization of the T-joints was performed using the T-joint pull-off test and compared to traditional tooling methods. Components made with the cellular core tooling method showed improved strength at the joints. It is expected that this knowledge will help optimize the processing of complex, integrated structures and benefit applications in aerospace where lighter, structurally efficient components would be advantageous.
Modeling a Civil Event Case Study for Consequence Management Using the IMPRINT Forces Module
NASA Technical Reports Server (NTRS)
Gacy, Marc; Gosakan, Mala; Eckdahl, Angela; Miller, Jeffrey R.
2012-01-01
A critical challenge in the Consequence Management (CM) domain is the appropriate allocation of necessary and skilled military and civilian personnel and materiel resources in unexpected emergencies. To aid this process we used the Forces module in the Improved Performance Research Integration Tool (IMPRINT). This module enables analysts to enter personnel and equipment capabilities, prioritized schedules and numbers available, along with unexpected emergency requirements in order to assess force response requirements. Using a suspected terrorist threat on a college campus, we developed a test case model which exercised the capabilities of the module, including the scope and scale of operations. The model incorporates data from multiple sources, including daily schedules and frequency of events such as fire calls. Our preliminary results indicate that the model can predict potential decreases in civilian emergency response coverage due to an involved unplanned incident requiring significant portions of police, fire and civil responses teams.
An assessment model for quality management
NASA Astrophysics Data System (ADS)
Völcker, Chr.; Cass, A.; Dorling, A.; Zilioli, P.; Secchi, P.
2002-07-01
SYNSPACE together with InterSPICE and Alenia Spazio is developing an assessment method to determine the capability of an organisation in the area of quality management. The method, sponsored by the European Space Agency (ESA), is called S9kS (SPiCE-9000 for SPACE). S9kS is based on ISO 9001:2000 with additions from the quality standards issued by the European Committee for Space Standardization (ECSS) and ISO 15504 - Process Assessments. The result is a reference model that supports the expansion of the generic process assessment framework provided by ISO 15504 to non-software areas. In order to be compliant with ISO 15504, requirements from ISO 9001 and ECSS-Q-20 and Q-20-09 have been turned into process definitions in terms of Purpose and Outcomes, supported by a list of detailed indicators such as Practices, Work Products and Work Product Characteristics. In coordination with this project, the capability dimension of ISO 15504 has been revised to be consistent with ISO 9001. As contributions from ISO 9001 and the space quality assurance standards are separable, the stripped-down version S9k offers organisations in all industries an assessment model based solely on ISO 9001, and is therefore interesting to all organisations which intend to improve their quality management system based on ISO 9001.
NASA Technical Reports Server (NTRS)
Jellicorse, John J.; Rahman, Shamin A.
2016-01-01
NASA is currently developing the next generation crewed spacecraft and launch vehicle for exploration beyond earth orbit, including returning to the Moon and making the transit to Mars. Managing the design integration of major hardware elements of a space transportation system is critical for overcoming both the technical and programmatic challenges in taking a complex system from concept to space operations. An established method of accomplishing this is formal interface management. In this paper we set forth an argument that the interface management process implemented by NASA between the Orion Multi-Purpose Crew Vehicle (MPCV) and the Space Launch System (SLS) achieves the Level 3 tier of the EIA 731.1 System Engineering Capability Model (SECM) for Generic Practices. We describe the relevant NASA systems and associated organizations, and define the EIA SECM Level 3 Generic Practices. We then provide evidence for our compliance with those practices. This evidence includes discussions of: the NASA Systems Engineering (SE) Interface Management standard process and best practices; the tailoring of that process for implementation on the Orion to SLS interface; changes made over time to improve the tailored process; and the opportunities to take the resulting lessons learned and propose improvements to our institutional processes and best practices. We compare this evidence against the practices to form the rationale for the declared SECM maturity level.
An Automatic Medium to High Fidelity Low-Thrust Global Trajectory Toolchain; EMTG-GMAT
NASA Technical Reports Server (NTRS)
Beeson, Ryne T.; Englander, Jacob A.; Hughes, Steven P.; Schadegg, Maximillian
2015-01-01
Solving the global optimization, low-thrust, multiple-flyby interplanetary trajectory problem with high-fidelity dynamical models requires an unreasonable amount of computational resources. A better approach, and one that is demonstrated in this paper, is a multi-step process whereby the aforementioned problem is solved at lower fidelity and this solution is used as an initial guess for a higher-fidelity solver. The framework presented in this work uses two tools developed by NASA Goddard Space Flight Center: the Evolutionary Mission Trajectory Generator (EMTG) and the General Mission Analysis Tool (GMAT). EMTG is a medium to medium-high fidelity low-thrust interplanetary global optimization solver, which now has the capability to automatically generate GMAT script files for seeding a high-fidelity solution using GMAT's local optimization capabilities. A discussion of the dynamical models as well as thruster and power modeling for both EMTG and GMAT is given in this paper. Current capabilities are demonstrated with examples that highlight the toolchain's ability to efficiently solve the difficult low-thrust global optimization problem with little human intervention.
Distributed intelligent monitoring and reporting facilities
NASA Astrophysics Data System (ADS)
Pavlou, George; Mykoniatis, George; Sanchez-P, Jorge-A.
1996-06-01
Distributed intelligent monitoring and reporting facilities are of paramount importance in both service and network management as they provide the capability to monitor quality of service and utilization parameters and notify degradation so that corrective action can be taken. By intelligent, we refer to the capability of performing the monitoring tasks in a way that has the smallest possible impact on the managed network, facilitates the observation and summarization of information according to a number of criteria and, in its most advanced form, permits the specification of these criteria dynamically to suit the particular policy at hand. In addition, intelligent monitoring facilities should minimize the design and implementation effort involved in such activities. The ISO/ITU Metric, Summarization and Performance management functions provide models that only partially satisfy the above requirements. This paper describes our extensions to the proposed models to support further capabilities, with the intention to eventually lead to fully dynamically defined monitoring policies. The concept of distributing intelligence is also discussed, including the consideration of security issues and the applicability of the model in ODP-based distributed processing environments.
Parametric Modeling in the CAE Process: Creating a Family of Models
NASA Technical Reports Server (NTRS)
Brown, Christopher J.
2011-01-01
This presentation is meant as an example: it gives ideas of approaches to use and shows the significant benefit of parametric, geometry-based modeling and the importance of planning before you build. It also showcases some NX capabilities: Mesh Controls, Associativity, Divide Face, and Offset Surface. As a reminder, this only had to be done once and can be used for any cabinet in that "family"; it saves a lot of time if pre-planned and allows re-use in the future.
Comprehensive analysis of a Metabolic Model for lipid production in Rhodosporidium toruloides.
Castañeda, María Teresita; Nuñez, Sebastián; Garelli, Fabricio; Voget, Claudio; Battista, Hernán De
2018-05-19
The yeast Rhodosporidium toruloides has been extensively studied for its application in biolipid production. Knowledge of its metabolic capabilities and the application of constraint-based flux analysis methodology provide useful information for process prediction and optimization. The accuracy of the resulting predictions is highly dependent on metabolic models. A metabolic reconstruction for R. toruloides metabolism has been recently published. On the basis of this model, we developed a curated version that unblocks the central nitrogen metabolism and, in addition, completes charge and mass balances in some reactions neglected in the former model. Then, a comprehensive analysis of network capability was performed with the curated model and compared with the published metabolic reconstruction. The flux distribution obtained by lipid optimization with Flux Balance Analysis was able to replicate the internal biochemical changes that lead to lipogenesis in oleaginous microorganisms. These results motivate the development of a genome-scale model for complete elucidation of R. toruloides metabolism. Copyright © 2018 Elsevier B.V. All rights reserved.
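A constraint-based (FBA) workflow of this kind is commonly run with COBRApy; the sketch below shows the general pattern, with the SBML file name and the lipid-objective reaction identifier being hypothetical placeholders rather than the paper's curated model.

    # Sketch of a flux balance analysis workflow with COBRApy (model file and reaction ID are hypothetical).
    from cobra.io import read_sbml_model

    model = read_sbml_model("rhodosporidium_toruloides.xml")    # hypothetical curated reconstruction
    model.objective = "TAG_synthesis"                           # hypothetical lipid-production reaction
    solution = model.optimize()                                 # flux balance analysis
    print("optimal lipid flux:", solution.objective_value)
    print(solution.fluxes.sort_values(ascending=False).head())  # largest internal fluxes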
Adaptation to Variance of Stimuli in Drosophila Larva Navigation
NASA Astrophysics Data System (ADS)
Wolk, Jason; Gepner, Ruben; Gershow, Marc
In order to respond to stimuli that vary over orders of magnitude while also being capable of sensing very small changes, neural systems must be capable of rapidly adapting to the variance of stimuli. We study this adaptation in Drosophila larvae responding to varying visual signals and optogenetically induced fictitious odors using an infrared-illuminated arena and custom computer vision software. Larval navigational decisions (when to turn) are modeled as the output of a linear-nonlinear Poisson process. The development of the nonlinear turn rate in response to changes in variance is tracked using an adaptive point process filter, determining the rate of adaptation to different stimulus profiles. Supported by NIH Grant 1DP2EB022359 and NSF Grant PHY-1455015.
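A linear-nonlinear Poisson (LNP) model of turn decisions can be sketched in a few lines: the stimulus is filtered by a linear kernel, passed through a static nonlinearity to give a turn rate, and turns are drawn as a Poisson process. The kernel shape, nonlinearity, and rate scale below are assumptions for illustration, not the fitted larval model.

    # Minimal LNP sketch for turn-decision modeling.
    import numpy as np

    rng = np.random.default_rng(6)
    dt = 0.1                                         # s
    stimulus = rng.normal(0, 1, size=3000)           # e.g., fictitious odor drive
    t_kernel = np.arange(0, 2, dt)
    kernel = -np.exp(-t_kernel / 0.5)                # turn more when the signal is decreasing (assumed shape)

    drive = np.convolve(stimulus, kernel, mode="full")[: len(stimulus)] * dt   # linear stage
    turn_rate = 0.2 * np.exp(drive)                  # exponential nonlinearity, turns/s
    turns = rng.poisson(turn_rate * dt)              # Poisson draws per time bin
    print("simulated number of turns:", turns.sum())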
NASA's Evolutionary Xenon Thruster (NEXT) Long-Duration Test as of 736 kg of Propellant Throughput
NASA Technical Reports Server (NTRS)
Shastry, Rohit; Herman, Daniel A.; Soulas, George C.; Patterson, Michael J.
2012-01-01
NASA's Evolutionary Xenon Thruster (NEXT) program is developing the next-generation solar-electric ion propulsion system with significant enhancements beyond the state-of-the-art NASA Solar Electric Propulsion Technology Application Readiness (NSTAR) ion propulsion system to provide future NASA science missions with enhanced mission capabilities. A Long-Duration Test (LDT) was initiated in June 2005 to validate the thruster service life modeling and to qualify the thruster propellant throughput capability. The thruster has set electric propulsion records for the longest operating duration, highest propellant throughput, and most total impulse demonstrated. At the time of this publication, the NEXT LDT has surpassed 42,100 h of operation, processed more than 736 kg of xenon propellant, and demonstrated greater than 28.1 MN s total impulse. Thruster performance has been steady with negligible degradation. The NEXT thruster design has mitigated several lifetime limiting mechanisms encountered in the NSTAR design, including the NSTAR first failure mode, thereby drastically improving thruster capabilities. Component erosion rates and the progression of the predicted life-limiting erosion mechanism for the thruster compare favorably to pretest predictions based upon semi-empirical ion thruster models used in the thruster service life assessment. Service life model validation has been accomplished by the NEXT LDT. Assuming full-power operation until test article failure, the models and extrapolated erosion data predict penetration of the accelerator grid grooves after more than 45,000 hours of operation while processing over 800 kg of xenon propellant. Thruster failure due to degradation of the accelerator grid structural integrity is expected after groove penetration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mou, J.I.; King, C.
The focus of this study is to develop a sensor-fused process modeling and control methodology to model, assess, and then enhance the performance of a hexapod machine for precision product realization. A deterministic modeling technique was used to derive models for machine performance assessment and enhancement. A sensor fusion methodology was adopted to identify the parameters of the derived models. Empirical models and computational algorithms were also derived and implemented to model, assess, and then enhance the machine performance. The developed sensor fusion algorithms can be implemented on a PC-based open architecture controller to receive information from various sensors, assess the status of the process, determine the proper action, and deliver the command to actuators for task execution. This will enhance a hexapod machine's capability to produce workpieces within the imposed dimensional tolerances.
A Simple and Accurate Rate-Driven Infiltration Model
NASA Astrophysics Data System (ADS)
Cui, G.; Zhu, J.
2017-12-01
In this study, we develop a novel Rate-Driven Infiltration Model (RDIMOD) for simulating infiltration into soils. Unlike traditional methods, RDIMOD avoids numerically solving the highly non-linear Richards equation or simply modeling with empirical parameters. RDIMOD employs the infiltration rate as model input to simulate the one-dimensional infiltration process by solving an ordinary differential equation. The model can simulate the evolution of the wetting front, infiltration rate, and cumulative infiltration on any surface slope, including vertical and horizontal directions. Compared to the results from the Richards equation for both vertical infiltration and horizontal infiltration, RDIMOD simply and accurately predicts infiltration processes for any type of soil and soil hydraulic model without numerical difficulty. Taking into account the accuracy, capability, and computational effectiveness and stability, RDIMOD can be used in large-scale hydrologic and land-atmosphere modeling.
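The governing ODE of RDIMOD is not reproduced in the abstract, but the rate-driven idea can be illustrated with a much simpler piston-flow approximation: take an infiltration-rate series as input, integrate it to cumulative infiltration, and track a sharp wetting front through the moisture deficit. The rate profile and soil water contents below are assumed values for illustration only.

```python
# Piston-flow sketch of a rate-driven infiltration calculation (illustrative,
# not the RDIMOD governing equation): z_f = I / (theta_s - theta_i).
import numpy as np

dt = 60.0                                    # time step (s)
t = np.arange(0, 6 * 3600, dt)
rate = 2e-6 * np.exp(-t / 7200.0) + 5e-7     # assumed infiltration rate input (m/s)

theta_s, theta_i = 0.45, 0.15                # saturated and initial water content
I = np.cumsum(rate) * dt                     # cumulative infiltration (m)
z_front = I / (theta_s - theta_i)            # piston-flow wetting-front depth (m)

print(f"cumulative infiltration after 6 h: {I[-1]*1000:.1f} mm")
print(f"wetting-front depth after 6 h: {z_front[-1]:.3f} m")
```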
Model-Based Fatigue Prognosis of Fiber-Reinforced Laminates Exhibiting Concurrent Damage Mechanisms
NASA Technical Reports Server (NTRS)
Corbetta, M.; Sbarufatti, C.; Saxena, A.; Giglio, M.; Goebel, K.
2016-01-01
Prognostics of large composite structures is a topic of increasing interest in the field of structural health monitoring for aerospace, civil, and mechanical systems. Along with recent advancements in real-time structural health data acquisition and processing for damage detection and characterization, model-based stochastic methods for life prediction are showing promising results in the literature. Among various model-based approaches, particle-filtering algorithms are particularly capable of coping with uncertainties associated with the process. These include uncertainties about information on the damage extent and the inherent uncertainties of the damage propagation process. Some efforts have shown successful applications of particle filtering-based frameworks for predicting the matrix crack evolution and structural stiffness degradation caused by repetitive fatigue loads. Effects of other damage modes such as delamination, however, are not incorporated in these works. It is well established that delamination and matrix cracks not only co-exist in most laminate structures during the fatigue degradation process but also affect each other's progression. Furthermore, delamination significantly alters the stress-state in the laminates and accelerates the material degradation leading to catastrophic failure. Therefore, the work presented herein proposes a particle filtering-based framework for predicting a structure's remaining useful life with consideration of multiple co-existing damage mechanisms. The framework uses an energy-based model from the composite modeling literature. The multiple damage-mode model has been shown to suitably estimate the energy release rate of cross-ply laminates as affected by matrix cracks and delamination modes. The model is also able to estimate the reduction in stiffness of the damaged laminate. This information is then used in the algorithms for life prediction capabilities. First, a brief summary of the energy-based damage model is provided. Then, the paper describes how the model is embedded within the prognostic framework and how the prognostics performance is assessed using observations from run-to-failure experiments.
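The particle-filtering workflow described above can be sketched with a bootstrap filter on a scalar damage state: propagate particles with an assumed growth law, weight them by noisy stiffness-loss observations, resample, and extrapolate each particle to a failure threshold for a remaining-useful-life (RUL) distribution. The growth law, noise levels, and observations below are illustrative assumptions, not the paper's energy-based multi-damage-mode model.

```python
# Minimal bootstrap particle filter sketch for damage prognosis.
import numpy as np

rng = np.random.default_rng(1)
n_particles, threshold = 2000, 0.6
C = rng.normal(0.35, 0.05, n_particles)           # uncertain growth-rate parameter
d = np.full(n_particles, 0.05)                    # initial damage (stiffness-loss fraction)

def step(d, C):                                   # assumed multiplicative growth per load block
    return d * (1.0 + C) + rng.normal(0, 1e-3, d.size)

for obs in [0.07, 0.10, 0.14]:                    # synthetic stiffness-loss observations
    d = step(d, C)
    w = np.exp(-0.5 * ((obs - d) / 0.01) ** 2)    # Gaussian likelihood
    w /= w.sum()
    idx = rng.choice(n_particles, n_particles, p=w)   # resampling
    d, C = d[idx], C[idx]

rul = np.zeros(n_particles)                       # extrapolate each particle to failure
for i in range(n_particles):
    di, n = d[i], 0
    while di < threshold and n < 10000:
        di *= (1.0 + C[i])
        n += 1
    rul[i] = n
print(f"median RUL: {np.median(rul):.0f} blocks, 90% interval: "
      f"[{np.percentile(rul, 5):.0f}, {np.percentile(rul, 95):.0f}]")
```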
Diabetes: Models, Signals and control
NASA Astrophysics Data System (ADS)
Cobelli, C.
2010-07-01
Diabetes and its complications impose significant economic consequences on individuals, families, health systems, and countries. The control of diabetes is an interdisciplinary endeavor, which includes significant components of modeling, signal processing and control. Models: first, I will discuss the minimal (coarse) models which describe the key components of the system functionality and are capable of measuring crucial processes of glucose metabolism and insulin control in health and diabetes; then, the maximal (fine-grain) models which include comprehensively all available knowledge about system functionality and are capable of simulating the glucose-insulin system in diabetes, thus making it possible to create simulation scenarios whereby cost-effective experiments can be conducted in silico to assess the efficacy of various treatment strategies; in particular, I will focus on the first in silico simulation model accepted by FDA as a substitute for animal trials in the quest for optimal diabetes control. Signals: I will review metabolic monitoring, with a particular emphasis on the new continuous glucose sensors, on the crucial role of models to enhance the interpretation of their time-series signals, and on the opportunities that they present for automation of diabetes control. Control: I will review control strategies that have been successfully employed in vivo or in silico, presenting a promise for the development of a future artificial pancreas and, in particular, I will discuss a modular architecture for building closed-loop control systems, including insulin delivery and patient safety supervision layers.
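A "minimal" model in the sense used above can be sketched as a Bergman-type two-equation system: glucose G is cleared at a rate modulated by remote insulin action X, which is driven by plasma insulin I(t). The parameter values and insulin profile below are illustrative, not fitted clinical values.

```python
# Bergman-type minimal glucose-insulin model sketch (illustrative parameters).
import numpy as np
from scipy.integrate import solve_ivp

p1, p2, p3 = 0.03, 0.02, 1e-5         # 1/min, 1/min, 1/min per (uU/mL)
Gb, Ib = 90.0, 10.0                    # basal glucose (mg/dL) and insulin (uU/mL)

def insulin(t):                        # assumed plasma-insulin profile after a bolus
    return Ib + 80.0 * np.exp(-t / 30.0)

def rhs(t, y):
    G, X = y
    dG = -(p1 + X) * G + p1 * Gb       # glucose cleared by insulin-independent + insulin action
    dX = -p2 * X + p3 * (insulin(t) - Ib)
    return [dG, dX]

sol = solve_ivp(rhs, [0, 240], [250.0, 0.0], max_step=1.0)   # start from 250 mg/dL
print(f"glucose after 4 h: {sol.y[0, -1]:.1f} mg/dL")
```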
Existing and Required Modeling Capabilities for Evaluating ATM Systems and Concepts
NASA Technical Reports Server (NTRS)
Odoni, Amedeo R.; Bowman, Jeremy; Delahaye, Daniel; Deyst, John J.; Feron, Eric; Hansman, R. John; Khan, Kashif; Kuchar, James K.; Pujet, Nicolas; Simpson, Robert W.
1997-01-01
ATM systems throughout the world are entering a period of major transition and change. The combination of important technological developments and of the globalization of the air transportation industry has necessitated a reexamination of some of the fundamental premises of existing Air Traffic Management (ATM) concepts. New ATM concepts have to be examined, concepts that may place more emphasis on: strategic traffic management; planning and control; partial decentralization of decision-making; and added reliance on the aircraft to carry out strategic ATM plans, with ground controllers confined primarily to a monitoring and supervisory role. 'Free Flight' is a case in point. In order to study, evaluate and validate such new concepts, the ATM community will have to rely heavily on models and computer-based tools/utilities, covering a wide range of issues and metrics related to safety, capacity and efficiency. The state of the art in such modeling support is adequate in some respects, but clearly deficient in others. It is the objective of this study to assist in: (1) assessing the strengths and weaknesses of existing fast-time models and tools for the study of ATM systems and concepts and (2) identifying and prioritizing the requirements for the development of additional modeling capabilities in the near future. A three-stage process has been followed to this purpose: 1. Through the analysis of two case studies involving future ATM system scenarios, as well as through expert assessment, modeling capabilities and supporting tools needed for testing and validating future ATM systems and concepts were identified and described. 2. Existing fast-time ATM models and support tools were reviewed and assessed with regard to the degree to which they offer the capabilities identified under Step 1. 3. The findings of 1 and 2 were combined to draw conclusions about (1) the best capabilities currently existing, (2) the types of concept testing and validation that can be carried out reliably with such existing capabilities and (3) the currently unavailable modeling capabilities that should receive high priority for near-term research and development. It should be emphasized that the study is concerned only with the class of 'fast time' analytical and simulation models. 'Real time' models, which typically involve humans-in-the-loop, comprise another extensive class which is not addressed in this report. However, the relationship between some of the fast-time models reviewed and a few well-known real-time models is identified in several parts of this report, and the potential benefits from the combined use of these two classes of models, a very important subject, are discussed in chapters 4 and 7.
Borsia, I.; Rossetto, R.; Schifani, C.; Hill, Mary C.
2013-01-01
In this paper two modifications to the MODFLOW code are presented. One concerns an extension of Local Grid Refinement (LGR) to Variable Saturated Flow process (VSF) capability. This modification allows the user to solve the 3D Richards’ equation only in selected parts of the model domain. The second modification introduces a new package, named CFL (Cascading Flow), which improves the computation of overland flow when ground surface saturation is simulated using either VSF or the Unsaturated Zone Flow (UZF) package. The modeling concepts are presented and demonstrated. Programmer documentation is included in appendices.
Modeling Production Plant Forming Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rhee, M; Becker, R; Couch, R
2004-09-22
Engineering has simulation tools and experience in modeling forming processes. Y-12 personnel have expressed interest in validating our tools and experience against their manufacturing process activities such as rolling, casting, and forging. We have demonstrated numerical capabilities in a collaborative DOE/OIT project with ALCOA that is nearing successful completion. The goal was to use ALE3D to model Alcoa's slab rolling process in order to demonstrate a computational tool that would allow Alcoa to define a rolling schedule that would minimize the probability of ingot fracture, thus reducing waste and energy consumption. It is intended to lead to long-term collaboration with Y-12 and perhaps involvement with other components of the weapons production complex. Using simulations to aid in design of forming processes can: decrease time to production; reduce forming trials and associated expenses; and guide development of products with greater uniformity and less scrap.
Assessment of Process Capability: the case of Soft Drinks Processing Unit
NASA Astrophysics Data System (ADS)
Sri Yogi, Kottala
2018-03-01
Process capability studies have a significant impact in investigating process variation, which is important in achieving product quality characteristics. Capability indices measure the inherent variability of a process and thus help to improve process performance radically. The main objective of this paper is to understand whether the process of a soft drinks processing unit, one of the premier brands marketed in India, is capable of producing within specification. A few selected critical parameters in soft drinks processing have been considered for this study: concentration of gas volume, concentration of brix, and torque of crock. Some relevant statistical parameters were assessed from a process capability indices perspective: short-term capability and long-term capability. For the assessment we used real-time data from a soft drinks bottling company located in the state of Chhattisgarh, India. The research output suggested reasons for variations in the process, which were validated using ANOVA; it also estimated the Taguchi cost function and the associated monetary waste, which the organization can use to improve process parameters. This research work has substantially benefitted the organization in understanding the variations of the selected critical parameters for achieving zero rejection.
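Short-term capability indices of the kind assessed above are computed from the sample mean and standard deviation against the specification limits. The sketch below shows the standard Cp/Cpk calculation; the specification limits and simulated gas-volume data are illustrative assumptions, not the bottling plant's actual values.

```python
# Process capability sketch: Cp and Cpk from sample data and spec limits.
import numpy as np

rng = np.random.default_rng(2)
gas_volume = rng.normal(3.9, 0.05, 200)      # simulated CO2 gas-volume measurements
LSL, USL = 3.7, 4.1                          # assumed lower/upper specification limits

mu, sigma = gas_volume.mean(), gas_volume.std(ddof=1)
Cp  = (USL - LSL) / (6 * sigma)              # potential capability (centered process)
Cpk = min(USL - mu, mu - LSL) / (3 * sigma)  # actual capability (accounts for off-center mean)
print(f"Cp = {Cp:.2f}, Cpk = {Cpk:.2f}")     # Cpk >= 1.33 is a common capability target
```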
A model of the human observer and decision maker
NASA Technical Reports Server (NTRS)
Wewerinke, P. H.
1981-01-01
The decision process is described in terms of classical sequential decision theory by considering the hypothesis that an abnormal condition has occurred by means of a generalized likelihood ratio test. For this, a sufficient statistic is provided by the innovation sequence, which is the result of the perception and information processing submodel of the human observer. On the basis of only two model parameters, the model predicts the decision speed/accuracy trade-off and various attentional characteristics. A preliminary test of the model for single-variable failure detection tasks resulted in a very good fit of the experimental data. In a formal validation program, a variety of multivariable failure detection tasks was investigated and the predictive capability of the model was demonstrated.
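The sequential test on the innovation sequence can be sketched as a Wald-type likelihood ratio test: under nominal conditions the innovations are zero-mean, and a failure is assumed to bias them. The bias size, error rates, and restart rule below are illustrative choices, not the paper's fitted observer parameters.

```python
# Sequential likelihood-ratio failure detection sketch on an innovation sequence.
import numpy as np

rng = np.random.default_rng(3)
innov = np.concatenate([rng.normal(0, 1, 100),      # nominal segment
                        rng.normal(0.8, 1, 100)])   # failure biases the innovations

mu1, sigma = 0.8, 1.0                               # assumed failure-mode mean shift
alpha, beta = 0.01, 0.01                            # target false-alarm / miss probabilities
A, B = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))   # Wald thresholds

llr = 0.0
for k, nu in enumerate(innov):
    llr += (mu1 * nu - 0.5 * mu1 ** 2) / sigma ** 2  # Gaussian log-likelihood ratio increment
    if llr >= A:
        print(f"failure declared at sample {k}")
        break
    if llr <= B:
        llr = 0.0                                    # accept "nominal" and restart the test
```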
Structural mode significance using INCA. [Interactive Controls Analysis computer program
NASA Technical Reports Server (NTRS)
Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.
1990-01-01
Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.
Additions and improvements to the high energy density physics capabilities in the FLASH code
NASA Astrophysics Data System (ADS)
Lamb, D.; Bogale, A.; Feister, S.; Flocke, N.; Graziani, C.; Khiar, B.; Laune, J.; Tzeferacos, P.; Walker, C.; Weide, K.
2017-10-01
FLASH is an open-source, finite-volume Eulerian, spatially-adaptive radiation magnetohydrodynamics code that has the capability to treat a broad range of physical processes. FLASH performs well on a wide range of computer architectures, and has a broad user base. Extensive high energy density physics (HEDP) capabilities exist in FLASH, which make it a powerful open toolset for the academic HEDP community. We summarize these capabilities, emphasizing recent additions and improvements. We describe several non-ideal MHD capabilities that are being added to FLASH, including the Hall and Nernst effects, implicit resistivity, and a circuit model, which will allow modeling of Z-pinch experiments. We showcase the ability of FLASH to simulate Thomson scattering polarimetry, which measures Faraday rotation due to the presence of magnetic fields, as well as proton radiography, proton self-emission, and Thomson scattering diagnostics. Finally, we describe several collaborations with the academic HEDP community in which FLASH simulations were used to design and interpret HEDP experiments. This work was supported in part at U. Chicago by DOE NNSA ASC through the Argonne Institute for Computing in Science under FWP 57789; DOE NNSA under NLUF Grant DE-NA0002724; DOE SC OFES Grant DE-SC0016566; and NSF Grant PHY-1619573.
STDP Installs in Winner-Take-All Circuits an Online Approximation to Hidden Markov Model Learning
Kappel, David; Nessler, Bernhard; Maass, Wolfgang
2014-01-01
In order to cross a street without being run over, we need to be able to extract very fast hidden causes of dynamically changing multi-modal sensory stimuli, and to predict their future evolution. We show here that a generic cortical microcircuit motif, pyramidal cells with lateral excitation and inhibition, provides the basis for this difficult but all-important information processing capability. This capability emerges in the presence of noise automatically through effects of STDP on connections between pyramidal cells in Winner-Take-All circuits with lateral excitation. In fact, one can show that these motifs endow cortical microcircuits with functional properties of a hidden Markov model, a generic model for solving such tasks through probabilistic inference. Whereas in engineering applications this model is adapted to specific tasks through offline learning, we show here that a major portion of the functionality of hidden Markov models arises already from online applications of STDP, without any supervision or rewards. We demonstrate the emergent computing capabilities of the model through several computer simulations. The full power of hidden Markov model learning can be attained through reward-gated STDP. This is due to the fact that these mechanisms enable a rejection sampling approximation to theoretically optimal learning. We investigate the possible performance gain that can be achieved with this more accurate learning method for an artificial grammar task. PMID:24675787
The HackensackUMC Value-Based Care Model: Building Essentials for Value-Based Purchasing.
Douglas, Claudia; Aroh, Dianne; Colella, Joan; Quadri, Mohammed
2016-01-01
The Affordable Care Act, 2010, and the subsequent shift from a quantity-focus to a value-centric reimbursement model led our organization to create the HackensackUMC Value-Based Care Model to improve our process capability and performance to meet and sustain the triple aims of value-based purchasing: higher quality, lower cost, and consumer perception. This article describes the basics of our model and illustrates how we used it to reduce the costs of our patient sitter program.
NASA Astrophysics Data System (ADS)
Sindiy, Oleg V.
This dissertation presents a model-based system-of-systems engineering (SoSE) approach as a design philosophy for architecting in system-of-systems (SoS) problems. SoS refers to a special class of systems in which numerous systems with operational and managerial independence interact to generate new capabilities that satisfy societal needs. Design decisions are more complicated in a SoS setting. A revised Process Model for SoSE is presented to support three phases in SoS architecting: defining the scope of the design problem, abstracting key descriptors and their interrelations in a conceptual model, and implementing computer-based simulations for architectural analyses. The Process Model enables improved decision support considering multiple SoS features and develops computational models capable of highlighting configurations of organizational, policy, financial, operational, and/or technical features. Further, processes for verification and validation of SoS models and simulations are also important due to potential impact on critical decision-making and, thus, are addressed. Two research questions frame the research efforts described in this dissertation. The first concerns how the four key sources of SoS complexity---heterogeneity of systems, connectivity structure, multi-layer interactions, and the evolutionary nature---influence the formulation of SoS models and simulations, trade space, and solution performance and structure evaluation metrics. The second question pertains to the implementation of SoSE architecting processes to inform decision-making for a subset of SoS problems concerning the design of information exchange services in space-based operations domain. These questions motivate and guide the dissertation's contributions. A formal methodology for drawing relationships within a multi-dimensional trade space, forming simulation case studies from applications of candidate architecture solutions to a campaign of notional mission use cases, and executing multi-purpose analysis studies is presented. These efforts are coupled to the generation of aggregate and time-dependent solution performance metrics via the hierarchical decomposition of objectives and the analytical recomposition of multi-attribute qualitative program drivers from quantifiable measures. This methodology was also applied to generate problem-specific solution structure evaluation metrics that facilitate the comparison of alternate solutions at a high level of aggregation, at lower levels of abstraction, and to relate options for design variables with associated performance values. For proof-of-capability demonstration, the selected application problem concerns the design of command, control, communication, and information (C3I) architecture services for a notional campaign of crewed and robotic lunar surface missions. The impetus for the work was the demonstration of using model-based SoSE for design of sustainable interoperability capabilities between all data and communication assets in extended lunar campaigns. A comprehensive Lunar C3I simulation tool was developed by a team of researchers at Purdue University in support of NASA's Constellation Program; the author of this dissertation was a key contributor to the creation of this tool and made modifications and extensions to key components relevant to the methodological concepts presented in this dissertation. The dissertation concludes with a presentation of example results based on the interrogation of the constructed Lunar C3I computational model. 
The results are based on a family of studies, structured around a trade-tree of architecture options, which were conducted to test the hypothesis that the SoSE approach is efficacious in information-exchange architecture design in the space exploration domain. Included in the family of proof-of-capability studies is a simulation of the Apollo 17 mission, which not only allows for partial verification and validation of the model, but also provides insights for prioritizing future model design iterations to make it a more realistic representation of the "real world." A caveat within the results presented is that they serve within the capacity of a proof-of-capability demonstration, and as such, they are a product of models and analyses that need further development before the tool's results can be employed for decision-making. Additional discussion is provided for how to further develop and validate the Lunar C3I tool and also to make it extensible to other SoS design problems of a similar nature in space exploration and other problem application domains.
Collaborative environments for capability-based planning
NASA Astrophysics Data System (ADS)
McQuay, William K.
2005-05-01
Distributed collaboration is an emerging technology for the 21st century that will significantly change how business is conducted in the defense and commercial sectors. Collaboration involves two or more geographically dispersed entities working together to create a "product" by sharing and exchanging data, information, and knowledge. A product is defined broadly to include, for example, writing a report, creating software, designing hardware, or implementing robust systems engineering and capability planning processes in an organization. Collaborative environments provide the framework and integrate models, simulations, domain-specific tools, and virtual test beds to facilitate collaboration between the multiple disciplines needed in the enterprise. The Air Force Research Laboratory (AFRL) is conducting a leading-edge program in developing distributed collaborative technologies targeted to the Air Force's implementation of systems engineering for simulation-aided acquisition and capability-based planning. The research is focusing on the open systems agent-based framework, product and process modeling, structural architecture, and the integration technologies, the glue that integrates the software components. In the past four years, two live assessment events have been conducted to demonstrate the technology in support of research for the Air Force Agile Acquisition initiatives. The AFRL Collaborative Environment concept will foster a major cultural change in how the acquisition, training, and operational communities conduct business.
Off-Gas Adsorption Model Capabilities and Recommendations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyon, Kevin L.; Welty, Amy K.; Law, Jack
2016-03-01
Off-gas treatment is required to reduce emissions from aqueous fuel reprocessing. Evaluating the products of innovative gas adsorption research requires increased computational simulation capability to more effectively transition from fundamental research to operational design. Early modeling efforts produced the Off-Gas SeParation and REcoverY (OSPREY) model that, while efficient in terms of computation time, was of limited value for complex systems. However, the computational and programming lessons learned in development of the initial model were used to develop Discontinuous Galerkin OSPREY (DGOSPREY), a more effective model. Initial comparisons between OSPREY and DGOSPREY show that, while OSPREY does reasonably well at capturing the initial breakthrough time, it displays far too much numerical dispersion to accurately capture the real shape of the breakthrough curves. DGOSPREY is a much better tool as it utilizes a more stable set of numerical methods. In addition, DGOSPREY has shown the capability to capture complex, multispecies adsorption behavior, while OSPREY currently only works for a single adsorbing species. This capability makes DGOSPREY ultimately a more practical tool for real world simulations involving many different gas species. While DGOSPREY has initially performed very well, there is still a need for improvement. The current state of DGOSPREY does not include any micro-scale adsorption kinetics and therefore assumes instantaneous adsorption. This is a major source of error in predicting water vapor breakthrough because the kinetics of that adsorption mechanism is particularly slow. However, this deficiency can be remedied by building kinetic kernels into DGOSPREY. Another source of error in DGOSPREY stems from data gaps in single-species isotherms, such as those for Kr and Xe. Since isotherm data for each gas is currently available at only a single temperature, the model is unable to predict adsorption at temperatures outside of the set of data currently available. Thus, in order to improve the predictive capabilities of the model, there is a need for more single-species adsorption isotherms at different temperatures, in addition to extending the model to include adsorption kinetics. This report provides background information about the modeling process and a path forward for further model improvement in terms of accuracy and user interface.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Liange; Li, Lianchong; Rutqvist, Jonny
Clay/shale has been considered a potential host rock for geological disposal of high-level nuclear waste throughout the world, because of its low permeability, low diffusion coefficient, high retention capacity for radionuclides, and capability to self-seal fractures induced by tunnel excavation. For example, Callovo-Oxfordian argillites at the Bure site, France (Fouche et al., 2004), Toarcian argillites at the Tournemire site, France (Patriarche et al., 2004), Opalinus Clay at the Mont Terri site, Switzerland (Meier et al., 2000), and Boom clay at the Mol site, Belgium (Barnichon and Volckaert, 2003) have all been under intensive scientific investigation (at both field and laboratory scales) for understanding a variety of rock properties and their relationships to flow and transport processes associated with geological disposal of nuclear waste. Clay/shale formations may be generally classified as indurated or plastic clays (Tsang and Hudson, 2010). The latter (including Boom clay) is a softer material without high cohesion; its deformation is dominantly plastic. During the lifespan of a clay repository, the repository performance is affected by complex thermal, hydrogeological, mechanical, chemical (THMC) processes, such as heat release due to radionuclide decay, multiphase flow, formation of damage zones, radionuclide transport, waste dissolution, and chemical reactions. All these processes are related to each other. An in-depth understanding of these coupled processes is critical for the performance assessment (PA) of the repository. These coupled processes may affect radionuclide transport by changing transport paths (e.g., formation and evolution of the excavation damaged zone (EDZ)) and altering flow, mineral, and mechanical properties that are related to radionuclide transport. While radionuclide transport in clay formations has been studied using laboratory tests (e.g., Appelo et al., 2010; Garcia-Gutierrez et al., 2008; Maes et al., 2008), short-term field tests (e.g., Garcia-Gutierrez et al., 2006; Soler et al., 2008; van Loon et al., 2004; Wu et al., 2009) and numerical modeling (de Windt et al., 2003, 2006), the effects of THMC processes on radionuclide transport have not been fully investigated. The objectives of the research activity documented in this report are to improve a modeling capability for coupled THMC processes and to use it to evaluate the THMC impacts on radionuclide transport. This research activity addresses several key Features, Events and Processes (FEPs), including FEP 2.2.08, Hydrologic Processes, FEP 2.2.07, Mechanical Processes and FEP 2.2.09, Chemical Process—Transport, by studying near-field coupled THMC processes in clay/shale repositories and their impacts on radionuclide transport. This report documents the progress that has been made in FY12. Section 2 discusses the development of THMC modeling capability. Section 3 reports modeling results of THMC impacts on radionuclide transport. Planned work for the remaining months of FY12 and proposed work for FY13 are presented in Section 4.
Key Practices of the Capability Maturity Model, Version 1.1
1993-02-01
Contents include: Interpreting the CMM; Interpreting the Key Practices; Interpreting the Common Features; Verifying Implementation; and Interpreting Software Process Definition.
Tutorial: Neural networks and their potential application in nuclear power plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uhrig, R.E.
A neural network is a data processing system consisting of a number of simple, highly interconnected processing elements in an architecture inspired by the structure of the cerebral cortex portion of the brain. Hence, neural networks are often capable of doing things which humans or animals do well but which conventional computers often do poorly. Neural networks have emerged in the past few years as an area of unusual opportunity for research, development and application to a variety of real world problems. Indeed, neural networks exhibit characteristics and capabilities not provided by any other technology. Examples include reading Japanese Kanji characters and human handwriting, reading a typewritten manuscript aloud, compensating for alignment errors in robots, interpreting very "noisy" signals (e.g. electroencephalograms), modeling complex systems that cannot be modelled mathematically, and predicting whether proposed loans will be good or fail. This paper presents a brief tutorial on neural networks and describes research on the potential applications to nuclear power plants.
A wetting and drying scheme for ROMS
Warner, John C.; Defne, Zafer; Haas, Kevin; Arango, Hernan G.
2013-01-01
The processes of wetting and drying have many important physical and biological impacts on shallow water systems. Inundation and dewatering effects on coastal mud flats and beaches occur on various time scales ranging from storm surge, periodic rise and fall of the tide, to infragravity wave motions. To correctly simulate these physical processes with a numerical model requires the capability of the computational cells to become inundated and dewatered. In this paper, we describe a method for wetting and drying based on an approach consistent with a cell-face blocking algorithm. The method allows water to always flow into any cell, but prevents outflow from a cell when the total depth in that cell is less than a user defined critical value. We describe the method, the implementation into the three-dimensional Regional Oceanographic Modeling System (ROMS), and exhibit the new capability under three scenarios: an analytical expression for shallow water flows, a dam break test case, and a realistic application to part of a wetland area along the Georgia Coast, USA.
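The cell-face blocking rule described above (inflow always allowed, outflow blocked below a critical depth) can be sketched on a one-dimensional toy grid; the ROMS implementation operates on the full three-dimensional model, so the helper below and its depth values are illustrative only.

```python
# Sketch of the wetting/drying cell-face blocking rule on a 1D toy grid.
import numpy as np

def apply_wet_dry_mask(depth, face_flux, d_crit=0.05):
    """depth: (n,) cell total depths [m]; face_flux: (n+1,) fluxes, positive rightward."""
    flux = face_flux.copy()
    for i in range(len(depth)):
        if depth[i] < d_crit:
            # block only OUTFLOW from the dry cell: positive flux through its
            # right face, negative flux through its left face
            if flux[i + 1] > 0:
                flux[i + 1] = 0.0
            if flux[i] < 0:
                flux[i] = 0.0
    return flux

depth = np.array([0.50, 0.02, 0.30])             # middle cell is below the critical depth
face_flux = np.array([0.0, 0.1, 0.1, 0.0])       # m^2/s per unit width
print(apply_wet_dry_mask(depth, face_flux))      # outflow from cell 1 is zeroed; inflow kept
```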
Novel application of DEM to modelling comminution processes
NASA Astrophysics Data System (ADS)
Delaney, Gary W.; Cleary, Paul W.; Sinnott, Matt D.; Morrison, Rob D.
2010-06-01
Comminution processes in which grains are broken down into smaller and smaller sizes represent a critical component in many industries including mineral processing, cement production, food processing and pharmaceuticals. We present a novel DEM implementation capable of realistically modelling such comminution processes. This extends a previous implementation of DEM particle breakage that utilized spherical particles. Our new extension uses super-quadric particles, where daughter fragments with realistic size and shape distributions are packed inside a bounding parent super-quadric. We demonstrate the flexibility of our approach in different particle breakage scenarios and examine the effect of the chosen minimum resolved particle size. This incorporation of the effect of particle shape in the breakage process allows for more realistic DEM simulations to be performed that can provide additional fundamental insights into comminution processes and into the behaviour of individual pieces of industrial machinery.
Sensor Management for Applied Research Technologies (SMART)-On Demand Modeling (ODM) Project
NASA Technical Reports Server (NTRS)
Goodman, M.; Blakeslee, R.; Hood, R.; Jedlovec, G.; Botts, M.; Li, X.
2006-01-01
NASA requires timely on-demand data and analysis capabilities to enable practical benefits of Earth science observations. However, a significant challenge exists in accessing and integrating data from multiple sensors or platforms to address Earth science problems because of the large data volumes, varying sensor scan characteristics, unique orbital coverage, and the steep learning curve associated with each sensor and data type. The development of sensor web capabilities to autonomously process these data streams (whether real-time or archived) provides an opportunity to overcome these obstacles and facilitate the integration and synthesis of Earth science data and weather model output. A three year project, entitled Sensor Management for Applied Research Technologies (SMART) - On Demand Modeling (ODM), will develop and demonstrate the readiness of Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) capabilities that integrate both Earth observations and forecast model output into new data acquisition and assimilation strategies. The advancement of SWE-enabled systems (i.e., use of SensorML, sensor planning services - SPS, sensor observation services - SOS, sensor alert services - SAS and common observation model protocols) will have practical and efficient uses in the Earth science community for enhanced data set generation, real-time data assimilation with operational applications, and for autonomous sensor tasking for unique data collection.
Cervera, Miguel; Tesei, Claudia
2017-01-01
In this paper, an energy-equivalent orthotropic d+/d− damage model for cohesive-frictional materials is formulated. Two essential mechanical features are addressed, the damage-induced anisotropy and the microcrack closure-reopening (MCR) effects, in order to provide an enhancement of the original d+/d− model proposed by Faria et al. 1998, while keeping its high algorithmic efficiency unaltered. First, in order to ensure the symmetry and positive definiteness of the secant operator, the new formulation is developed in an energy-equivalence framework. This proves thermodynamic consistency and allows one to describe a fundamental feature of the orthotropic damage models, i.e., the reduction of the Poisson’s ratio throughout the damage process. Secondly, a “multidirectional” damage procedure is presented to extend the MCR capabilities of the original model. The fundamental aspects of this approach, devised for generic cyclic conditions, lie in maintaining only two scalar damage variables in the constitutive law, while preserving memory of the degradation directionality. The enhanced unilateral capabilities are explored with reference to the problem of a panel subjected to in-plane cyclic shear, with or without vertical pre-compression; depending on the ratio between shear and pre-compression, an absent, a partial or a complete stiffness recovery is simulated with the new multidirectional procedure. PMID:28772793
NASA Astrophysics Data System (ADS)
Chen, Y.; Li, J.; Xu, H.
2016-01-01
Physically based distributed hydrological models (hereafter referred to as PBDHMs) divide the terrain of the whole catchment into a number of grid cells at fine resolution and assimilate different terrain data and precipitation to different cells. They are regarded as having the potential to improve catchment hydrological process simulation and prediction capability. Early on, physically based distributed hydrological models were assumed to derive model parameters from terrain properties directly, so that there would be no need to calibrate them. Unfortunately, the uncertainties associated with this parameter derivation are very high, which has limited their application in flood forecasting, so parameter optimization may also be necessary. There are two main purposes for this study: the first is to propose a parameter optimization method for physically based distributed hydrological models in catchment flood forecasting using the particle swarm optimization (PSO) algorithm, to test its competence, and to improve its performance; the second is to explore the possibility of improving physically based distributed hydrological model capability in catchment flood forecasting by parameter optimization. In this paper, based on the scalar concept, a general framework for parameter optimization of PBDHMs for catchment flood forecasting is first proposed that could be used for all PBDHMs. Then, with the Liuxihe model as the study model, which is a physically based distributed hydrological model proposed for catchment flood forecasting, an improved PSO algorithm is developed for parameter optimization of the Liuxihe model in catchment flood forecasting. The improvements include adoption of a linearly decreasing inertia weight strategy to change the inertia weight and an arccosine function strategy to adjust the acceleration coefficients. This method has been tested in two catchments of different sizes in southern China, and the results show that the improved PSO algorithm can be used effectively for Liuxihe model parameter optimization and can largely improve the model's capability in catchment flood forecasting, thus proving that parameter optimization is necessary to improve the flood forecasting capability of physically based distributed hydrological models. It has also been found that the appropriate particle number and maximum evolution number of the PSO algorithm for Liuxihe model catchment flood forecasting are 20 and 30, respectively.
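A PSO loop with the linearly decreasing inertia weight mentioned above can be sketched as follows. The objective here is a stand-in test function rather than a Liuxihe model run, the arccosine acceleration-coefficient schedule is replaced by constant coefficients, and the swarm size and iteration count simply mirror the 20 and 30 reported in the abstract.

```python
# Minimal PSO sketch with linearly decreasing inertia weight (illustrative objective).
import numpy as np

rng = np.random.default_rng(4)

def objective(x):                      # stand-in for the flood-forecast error metric
    return np.sum((x - 3.0) ** 2, axis=1)

n_particles, n_dim, n_iter = 20, 5, 30           # particle number / maximum evolution number
w_max, w_min, c1, c2 = 0.9, 0.4, 2.0, 2.0
x = rng.uniform(-10, 10, (n_particles, n_dim))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), objective(x)
gbest = pbest[np.argmin(pbest_f)]

for k in range(n_iter):
    w = w_max - (w_max - w_min) * k / (n_iter - 1)        # linearly decreasing inertia weight
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    f = objective(x)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)]

print("best parameters:", np.round(gbest, 3), "objective:", round(float(pbest_f.min()), 6))
```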
Marshall Space Flight Center Materials and Processes Laboratory
NASA Technical Reports Server (NTRS)
Tramel, Terri L.
2012-01-01
Marshall's Materials and Processes Laboratory has been a core capability for NASA for over fifty years. MSFC has a proven heritage and recognized expertise in materials and manufacturing that are essential to enable and sustain space exploration. Marshall provides a "systems-wise" capability for applied research, flight hardware development, and sustaining engineering. Our history of leadership and achievements in materials, manufacturing, and flight experiments includes Apollo, Skylab, Mir, Spacelab, Shuttle (Space Shuttle Main Engine, External Tank, Reusable Solid Rocket Motor, and Solid Rocket Booster), Hubble, Chandra, and the International Space Station. MSFC's National Center for Advanced Manufacturing, NCAM, facilitates major M&P advanced manufacturing partnership activities with academia, industry and other local, state and federal government agencies. The Materials and Processes Laboratory has principal competencies in metals, composites, ceramics, additive manufacturing, materials and process modeling and simulation, space environmental effects, non-destructive evaluation, and fracture and failure analysis; these competencies provide products ranging from materials research in space to fully integrated solutions for large complex systems challenges. Marshall's materials research, development and manufacturing capabilities assure that NASA and National missions have access to cutting-edge, cost-effective engineering design and production options that are frugal in using design margins and are verified as safe and reliable. These are all critical factors in both future mission success and affordability.
An intelligent approach to welding robot selection
NASA Astrophysics Data System (ADS)
Milano, J.; Mauk, S. D.; Flitter, L.; Morris, R.
1993-10-01
In a shipyard where multiple stationary and mobile workcells are employed in the fabrication of components of complex sub-assemblies, efficient operation requires an intelligent method of scheduling jobs and selecting workcells based on optimum throughput and cost. The achievement of this global solution requires the successful organization of resource availability, process requirements, and process constraints. The Off-line Planner (OLP) of the Programmable Automated Weld System (PAWS) is capable of advanced modeling of weld processes and environments as well as the generation of complete weld procedures. These capabilities involve the integration of advanced Computer Aided Design (CAD), path planning, and obstacle detection and avoidance techniques as well as the synthesis of complex design and process information. These existing capabilities provide the basis of the functionality required for the successful implementation of an intelligent weld robot selector and material flow planner. Current efforts are focused on robot selection via the dynamic routing of components to the appropriate workcells. It is proposed that this problem is a variant of the "Traveling Salesman Problem" (TSP) that has been proven to belong to a larger set of optimization problems termed nondeterministic polynomial complete (NP-complete). In this paper, a heuristic approach utilizing recurrent neural networks is explored as a rapid means of producing a near optimal, if not optimal, weld robot selection.
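The paper's heuristic uses recurrent neural networks; as a simpler point of reference for the TSP-style routing of components among workcells, a greedy nearest-neighbor baseline can be sketched as below. The workcell locations and travel costs are illustrative assumptions.

```python
# Nearest-neighbor baseline sketch for TSP-style routing among workcells.
import numpy as np

def nearest_neighbor_route(cost, start=0):
    """Greedy route over a symmetric cost matrix; returns visit order and total cost."""
    n = cost.shape[0]
    unvisited, route, total = set(range(n)) - {start}, [start], 0.0
    current = start
    while unvisited:
        nxt = min(unvisited, key=lambda j: cost[current, j])
        total += cost[current, nxt]
        route.append(nxt)
        unvisited.remove(nxt)
        current = nxt
    return route, total

rng = np.random.default_rng(5)
pts = rng.random((6, 2)) * 100                           # workcell locations (m), illustrative
cost = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
route, total = nearest_neighbor_route(cost)
print("visit order:", route, "total travel:", round(total, 1))
```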
Mallavarapu, Aneil; Thomson, Matthew; Ullian, Benjamin; Gunawardena, Jeremy
2008-01-01
Mathematical models are increasingly used to understand how phenotypes emerge from systems of molecular interactions. However, their current construction as monolithic sets of equations presents a fundamental barrier to progress. Overcoming this requires modularity, enabling sub-systems to be specified independently and combined incrementally, and abstraction, enabling generic properties of biological processes to be specified independently of specific instances. These, in turn, require models to be represented as programs rather than as datatypes. Programmable modularity and abstraction enables libraries of modules to be created, which can be instantiated and reused repeatedly in different contexts with different components. We have developed a computational infrastructure that accomplishes this. We show here why such capabilities are needed, what is required to implement them and what can be accomplished with them that could not be done previously. PMID:18647734
Microeconomics of yield learning and process control in semiconductor manufacturing
NASA Astrophysics Data System (ADS)
Monahan, Kevin M.
2003-06-01
Simple microeconomic models that directly link yield learning to profitability in semiconductor manufacturing have been rare or non-existent. In this work, we review such a model and provide links to inspection capability and cost. Using a small number of input parameters, we explain current yield management practices in 200mm factories. The model is then used to extrapolate requirements for 300mm factories, including the impact of technology transitions to 130nm design rules and below. We show that the dramatic increase in value per wafer at the 300mm transition becomes a driver for increasing metrology and inspection capability and sampling. These analyses correlate well with actual factory data and often identify millions of dollars in potential cost savings. We demonstrate this using the example of grating-based overlay metrology for the 65nm node.
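The yield-learning-to-profitability link can be sketched with an exponential learning curve and the revenue gained per wafer start when faster learning (for example, from better inspection capability) shortens the learning time constant. All numbers below are illustrative assumptions, not the paper's model parameters.

```python
# Hedged sketch of a yield-learning / revenue calculation (illustrative numbers).
import numpy as np

def yield_curve(t, y0=0.3, y_ss=0.9, tau=26.0):       # t in weeks
    return y_ss - (y_ss - y0) * np.exp(-t / tau)

weeks = np.arange(0, 104)
dies_per_wafer, price_per_die, starts_per_week = 500, 20.0, 5000

rev_base = (yield_curve(weeks, tau=26.0) * dies_per_wafer * price_per_die).sum() * starts_per_week
rev_fast = (yield_curve(weeks, tau=20.0) * dies_per_wafer * price_per_die).sum() * starts_per_week
print(f"revenue gain from faster yield learning: ${rev_fast - rev_base:,.0f}")
```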
Preliminary Assessment of Turbomachinery Codes
NASA Technical Reports Server (NTRS)
Mazumder, Quamrul H.
2007-01-01
This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. This report will consider the following codes: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code will be described separately in the following section with its current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published validations of the codes. However, the codes have been further developed to extend their capabilities.
Modeling a Hall Thruster from Anode to Plume Far Field
2005-01-01
This effort develops a comprehensive Hall thruster simulation capability that begins with propellant injection at the thruster anode and ends in the plume far field. The development of a comprehensive simulation capability is critical for a number of reasons. The main motivation stems from the need to directly couple simulation of the plasma discharge processes inside the thruster and the transport of the plasma to the plume far field. The simulation strategy will employ two existing codes, one for the Hall thruster device and one for the plume. The coupling will take place in the plume
Development of a material processing plant for lunar soil
NASA Technical Reports Server (NTRS)
Goettsch, Ulix; Ousterhout, Karl
1992-01-01
Currently there is considerable interest in developing in-situ materials processing plants for both the Moon and Mars. Two of the most important aspects of developing such a materials processing plant are the overall system design and the integration of the different technologies into a reliable, lightweight, and cost-effective unit. The concept of an autonomous materials processing plant that is capable of producing useful substances from lunar regolith was developed. In order for such a materials processing plant to be considered a viable option, it must be totally self-contained, able to operate autonomously, cost-effective, lightweight, and fault tolerant. In order to assess the impact of different technologies on the overall system design and integration, a one-half scale model was constructed that is capable of scooping up (or digging) lunar soil, transferring the soil to a solar furnace, heating the soil in the furnace to liberate the gases, and transferring the spent soil to a 'tile' processing center. All aspects of the control system are handled by a 386-class PC via D/A, A/D, and DSP (Digital Signal Processor) control cards.
Wieland, Birgit; Ropte, Sven
2017-01-01
The production of rotor blades for wind turbines is still a predominantly manual process. Process simulation is an adequate way of improving blade quality without a significant increase in production costs. This paper introduces a module for tolerance simulation for rotor-blade production processes. The investigation focuses on the simulation of temperature distribution for one-sided, self-heated tooling and thick laminates. Experimental data from rotor-blade production and down-scaled laboratory tests are presented. Based on influencing factors that are identified, a physical model is created and implemented as a simulation. This provides an opportunity to simulate temperature and cure-degree distribution for two-dimensional cross sections. The aim of this simulation is to support production processes. Hence, it is modelled as an in situ simulation with direct input of temperature data and real-time capability. A monolithic part of the rotor blade, the main girder, is used as an example for presenting the results. PMID:28981458
Wen J. Wang; Hong S. He; Frank R. Thompson; Jacob S. Fraser; William D. Dijak
2016-01-01
Tree species distribution and abundance are affected by forces operating at multiple scales. Niche and biophysical process models have been commonly used to predict climate change effects at regional scales; however, these models have limited capability to include site-scale population dynamics and landscape-scale disturbance and dispersal. We applied a landscape...
Code of Federal Regulations, 2010 CFR
2010-10-01
... originally manufactured for importation into and sale in the United States and of the same model year as the model for which petition is made, and is capable of being readily modified to conform to all applicable... standards, shall pay a fee based upon the direct and indirect costs of processing and acting upon such...
ERIC Educational Resources Information Center
GLOVER, J.H.
The chief objective of this study of speed-skill acquisition was to find a mathematical model capable of simple graphic interpretation for industrial training and production scheduling at the shop-floor level. Studies of middle-skill development in machine and vehicle assembly, aircraft production, spoolmaking, and the machining of parts confirmed…
F C Pan, Frank
2014-03-01
Nurses have long been relied upon as the major labor force in hospitals. Given their complicated and highly labor-intensive job requirements, multiple pressures from different sources are inevitable. Success in identifying stresses and coping with them accordingly is important for the job performance of nurses and the service quality of a hospital. The purpose of this research is to identify the determinants of nurses' capabilities. A modified Analytic Hierarchy Process (AHP) was adopted. Overall, 105 nurses from several randomly selected hospitals in southern Taiwan were surveyed to generate factors. Ten experienced practitioners were included as experts in the AHP to produce weights for each criterion. Six nurses from two regional hospitals were then selected to test the model. Four factors were identified at the second level of the hierarchy. The results show that the family factor is the most important, followed by personal attributes. The top three sub-criteria that contribute to a nurse's stress-coping capability are children's education, a good career plan, and a healthy family. The practical simulation provided evidence for the usefulness of this model. The study suggests incorporating these key determinants into human-resource management practice, restructuring the hospital's organization, and creating an employee-support system as well as a family-friendly working climate. The research provides evidence supporting the usefulness of AHP in identifying the key factors that help stabilize a nursing team.
F. C. PAN, Frank
2014-01-01
Abstract Background Nurses have long been relied upon as the major labor force in hospitals. Because their work is complicated and highly labor-intensive, multiple pressures from different sources are inevitable. Success in identifying stresses and coping with them accordingly is important for the job performance of nurses and for the service quality of a hospital. The purpose of this research is to identify the determinants of nurses' stress-coping capabilities. Methods A modified Analytic Hierarchy Process (AHP) was adopted. Overall, 105 nurses from several randomly selected hospitals in southern Taiwan were surveyed to generate factors. Ten experienced practitioners served as the experts in the AHP to produce the weight of each criterion. Six nurses from two regional hospitals were then selected to test the model. Results Four factors were identified at the second level of the hierarchy. The results show that the family factor is the most important, followed by personal attributes. The top three sub-criteria contributing to a nurse's stress-coping capability are children's education, a good career plan, and a healthy family. The practical simulation provided evidence of the usefulness of this model. Conclusion The study suggests including these key determinants in human-resource management practice, restructuring the hospital's organization, and creating an employee-support system as well as a family-friendly working climate. The research provides evidence that supports the usefulness of AHP in identifying the key factors that help stabilize a nursing team. PMID:25988086
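For readers unfamiliar with the weighting step referred to in the two abstracts above, the sketch below shows, under stated assumptions, how AHP priority weights and a consistency check are typically derived from a pairwise comparison matrix with numpy. The 4x4 matrix values and the criterion names are illustrative placeholders, not the study's data.

```python
# Minimal AHP weighting sketch (illustrative values, not the study's data).
import numpy as np

# Hypothetical pairwise comparison matrix for four criteria
# (family, personal attributes, organizational, social), Saaty 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0, 5.0],
    [1/3, 1.0, 3.0, 3.0],
    [1/5, 1/3, 1.0, 1.0],
    [1/5, 1/3, 1.0, 1.0],
])

# The principal right eigenvector gives the priority (weight) vector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio (CR) checks judgement coherence; random index RI = 0.90 for n = 4.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.90
print("weights:", np.round(w, 3), "CR:", round(cr, 3))
```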
Mathur, Rohit; Xing, Jia; Gilliam, Robert; Sarwar, Golam; Hogrefe, Christian; Pleim, Jonathan; Pouliot, George; Roselle, Shawn; Spero, Tanya L.; Wong, David C.; Young, Jeffrey
2018-01-01
The Community Multiscale Air Quality (CMAQ) modeling system is extended to simulate ozone, particulate matter, and related precursor distributions throughout the Northern Hemisphere. Modelled processes were examined and enhanced to suitably represent the extended space and time scales for such applications. Hemispheric scale simulations with CMAQ and the Weather Research and Forecasting (WRF) model are performed for multiple years. Model capabilities for a range of applications including episodic long-range pollutant transport, long-term trends in air pollution across the Northern Hemisphere, and air pollution-climate interactions are evaluated through detailed comparison with available surface, aloft, and remotely sensed observations. The expansion of CMAQ to simulate the hemispheric scales provides a framework to examine interactions between atmospheric processes occurring at various spatial and temporal scales with physical, chemical, and dynamical consistency. PMID:29681922
CATS - A process-based model for turbulent turbidite systems at the reservoir scale
NASA Astrophysics Data System (ADS)
Teles, Vanessa; Chauveau, Benoît; Joseph, Philippe; Weill, Pierre; Maktouf, Fakher
2016-09-01
The Cellular Automata for Turbidite systems (CATS) model is intended to simulate the fine architecture and facies distribution of turbidite reservoirs with a multi-event and process-based approach. The main processes of low-density turbulent turbidity flow are modeled: downslope sediment-laden flow, entrainment of ambient water, erosion and deposition of several distinct lithologies. This numerical model, derived from (Salles, 2006; Salles et al., 2007), proposes a new approach based on the Rouse concentration profile to consider the flow capacity to carry the sediment load in suspension. In CATS, the flow distribution on a given topography is modeled with local rules between neighboring cells (cellular automata) based on potential and kinetic energy balance and diffusion concepts. Input parameters are the initial flow parameters and a 3D topography at depositional time. An overview of CATS capabilities in different contexts is presented and discussed.
Collaborative Working Architecture for IoT-Based Applications.
Mora, Higinio; Signes-Pont, María Teresa; Gil, David; Johnsson, Magnus
2018-05-23
New sensing applications need enhanced computing capabilities to handle the requirements of complex and huge data processing. The Internet of Things (IoT) concept brings processing and communication features to devices. In addition, the Cloud Computing paradigm provides resources and infrastructures for performing the computations and outsourcing the work from the IoT devices. This scenario opens new opportunities for designing advanced IoT-based applications; however, there is still much research to be done to properly gear all the systems to work together. This work proposes a collaborative model and an architecture to take advantage of the available computing resources. The resulting architecture involves a novel network design with different levels which combines sensing and processing capabilities based on the Mobile Cloud Computing (MCC) paradigm. An experiment is included to demonstrate that this approach can be used in diverse real applications. The results show the flexibility of the architecture to perform complex computational tasks of advanced applications.
SMAP Spacecraft Rotate & Placed on Fixture
2014-10-16
Inside the Astrotech payload processing facility on Vandenberg Air Force Base in California, engineers and technicians rotate NASA's Soil Moisture Active Passive, or SMAP, spacecraft to begin processing. SMAP will launch on a Delta II 7320 configuration vehicle featuring a United Launch Alliance first stage booster powered by an Aerojet Rocketdyne RS-27A main engine and three Alliant Techsystems, or ATK, strap-on solid rocket motors. Once on station in Earth orbit, SMAP will provide global measurements of soil moisture and its freeze/thaw state. These measurements will be used to enhance understanding of processes that link the water, energy and carbon cycles, and to extend the capabilities of weather and climate prediction models. SMAP data also will be used to quantify net carbon flux in boreal landscapes and to develop improved flood prediction and drought monitoring capabilities. Launch from Space Launch Complex 2 is targeted for Jan. 29, 2015.
SMAP Spacecraft Rotate & Placed on Fixture
2014-10-16
Inside the Astrotech payload processing facility on Vandenberg Air Force Base in California, engineers and technicians begin processing of NASA's Soil Moisture Active Passive, or SMAP, spacecraft. SMAP will launch on a Delta II 7320 configuration vehicle featuring a United Launch Alliance first stage booster powered by an Aerojet Rocketdyne RS-27A main engine and three Alliant Techsystems, or ATK, strap-on solid rocket motors. Once on station in Earth orbit, SMAP will provide global measurements of soil moisture and its freeze/thaw state. These measurements will be used to enhance understanding of processes that link the water, energy and carbon cycles, and to extend the capabilities of weather and climate prediction models. SMAP data also will be used to quantify net carbon flux in boreal landscapes and to develop improved flood prediction and drought monitoring capabilities. Launch from Space Launch Complex 2 is targeted for Jan. 29, 2015.
SMAP Spacecraft Rotate & Placed on Fixture
2014-10-16
Inside the Astrotech payload processing facility on Vandenberg Air Force Base in California, engineers and technicians have rotated NASA's Soil Moisture Active Passive, or SMAP, spacecraft to begin processing. SMAP will launch on a Delta II 7320 configuration vehicle featuring a United Launch Alliance first stage booster powered by an Aerojet Rocketdyne RS-27A main engine and three Alliant Techsystems, or ATK, strap-on solid rocket motors. Once on station in Earth orbit, SMAP will provide global measurements of soil moisture and its freeze/thaw state. These measurements will be used to enhance understanding of processes that link the water, energy and carbon cycles, and to extend the capabilities of weather and climate prediction models. SMAP data also will be used to quantify net carbon flux in boreal landscapes and to develop improved flood prediction and drought monitoring capabilities. Launch from Space Launch Complex 2 is targeted for Jan. 29, 2015.
SMAP Spacecraft Rotate & Placed on Fixture
2014-10-16
Inside the Astrotech payload processing facility on Vandenberg Air Force Base in California, processing has begun on NASA's Soil Moisture Active Passive, or SMAP, spacecraft. SMAP will launch on a Delta II 7320 configuration vehicle featuring a United Launch Alliance first stage booster powered by an Aerojet Rocketdyne RS-27A main engine and three Alliant Techsystems, or ATK, strap-on solid rocket motors. Once on station in Earth orbit, SMAP will provide global measurements of soil moisture and its freeze/thaw state. These measurements will be used to enhance understanding of processes that link the water, energy and carbon cycles, and to extend the capabilities of weather and climate prediction models. SMAP data also will be used to quantify net carbon flux in boreal landscapes and to develop improved flood prediction and drought monitoring capabilities. Launch from Space Launch Complex 2 is targeted for Jan. 29, 2015.
Modeling rainfall-runoff process using soft computing techniques
NASA Astrophysics Data System (ADS)
Kisi, Ozgur; Shiri, Jalal; Tombul, Mustafa
2013-02-01
The rainfall-runoff process was modeled for a small catchment in Turkey, using 4 years (1987-1991) of measured rainfall and runoff values. The models used in the study were Artificial Neural Networks (ANNs), the Adaptive Neuro-Fuzzy Inference System (ANFIS) and Gene Expression Programming (GEP), which are Artificial Intelligence (AI) approaches. The applied models were trained and tested using various combinations of the independent variables. The goodness of fit for the models was evaluated in terms of the coefficient of determination (R2), root mean square error (RMSE), mean absolute error (MAE), coefficient of efficiency (CE) and scatter index (SI). A comparison was also made between these models and the traditional Multiple Linear Regression (MLR) model. The study provides evidence that GEP (with RMSE=17.82 l/s, MAE=6.61 l/s, CE=0.72 and R2=0.978) is capable of modeling the rainfall-runoff process and is a viable alternative to the other applied artificial intelligence and MLR time-series methods.
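The goodness-of-fit statistics listed in the abstract are straightforward to compute. The following is a minimal sketch in Python; the runoff values are placeholders, not the study's measurements.

```python
import numpy as np

def fit_metrics(obs, sim):
    """Goodness-of-fit statistics commonly used for rainfall-runoff models."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    err = sim - obs
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    # Nash-Sutcliffe coefficient of efficiency (CE)
    ce = 1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)
    # Coefficient of determination (squared Pearson correlation)
    r2 = np.corrcoef(obs, sim)[0, 1] ** 2
    # Scatter index: RMSE normalised by the observed mean
    si = rmse / obs.mean()
    return dict(RMSE=rmse, MAE=mae, CE=ce, R2=r2, SI=si)

# Placeholder runoff series (l/s); replace with measured and predicted values.
print(fit_metrics([12.0, 30.0, 55.0, 21.0], [15.0, 28.0, 50.0, 25.0]))
```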
Tomperi, Jani; Leiviskä, Kauko
2018-06-01
Traditionally, modelling of the activated sludge process has been based solely on process measurements, but as interest in optically monitoring wastewater samples to characterize floc morphology has increased, in recent years the results of image analyses have been utilized more frequently to predict the characteristics of wastewater. This study shows that neither the traditional process measurements nor the automated optical monitoring variables by themselves are capable of producing the best predictive models for treated wastewater quality in a full-scale wastewater treatment plant; by utilizing these variables together, however, optimal models that show the level of and changes in treated wastewater quality are achieved. With this early warning, process operation can be optimized to avoid environmental damage and economic losses. The study also shows that specific optical monitoring variables are important in modelling a certain quality parameter, regardless of the other input variables available.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen
2012-01-01
Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.
Spatio-Semantic Comparison of Large 3d City Models in Citygml Using a Graph Database
NASA Astrophysics Data System (ADS)
Nguyen, S. H.; Yao, Z.; Kolbe, T. H.
2017-10-01
A city may have multiple CityGML documents recorded at different times or surveyed by different users. To analyse the city's evolution over a given period of time, as well as to update or edit the city model without negating modifications made by other users, it is of utmost importance to first compare, detect and locate spatio-semantic changes between CityGML datasets. This is, however, difficult due to the fact that CityGML elements belong to a complex hierarchical structure containing multi-level deep associations, which can basically be considered as a graph. Moreover, CityGML allows multiple syntactic ways to define an object, leading to syntactic ambiguities in the exchange format. Furthermore, CityGML is capable of including not only 3D urban objects' graphical appearances but also their semantic properties. Since, to date, no known algorithm is capable of detecting spatio-semantic changes in CityGML documents, a frequent approach is to replace the older models completely with the newer ones, which not only costs computational resources, but also loses track of collaborative and chronological changes. Thus, this research proposes an approach capable of comparing two arbitrarily large-sized CityGML documents on both the semantic and geometric levels. Detected deviations are then attached to their respective sources and can easily be retrieved on demand. As a result, updating a 3D city model using this approach is much more efficient as only real changes are committed. To achieve this, the research employs a graph database as the main data structure for storing and processing CityGML datasets in three major steps: mapping, matching and updating. The mapping process transforms input CityGML documents into respective graph representations. The matching process compares these graphs and attaches edit operations on the fly. Found changes can then be executed using the Web Feature Service (WFS), the standard interface for updating geographical features across the web.
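The mapping-and-matching idea described above can be illustrated, in much simplified form, with ordinary attribute graphs. The sketch below uses networkx and hypothetical building attributes; it is not the paper's CityGML schema or its graph database back end, and it only flags attributes that are added or changed, not ones that are removed.

```python
import networkx as nx

def to_graph(city_objects):
    """Map a {id: {attr: value}} dict (a stand-in for parsed CityGML) to a graph."""
    g = nx.DiGraph()
    for oid, attrs in city_objects.items():
        g.add_node(oid, **attrs)
    return g

def diff(old, new):
    """Return per-object edit operations between two graph snapshots."""
    edits = {}
    for oid in set(old) | set(new):
        if oid not in old:
            edits[oid] = ("insert", dict(new.nodes[oid]))
        elif oid not in new:
            edits[oid] = ("delete", dict(old.nodes[oid]))
        else:
            changed = {k: (old.nodes[oid].get(k), v)
                       for k, v in new.nodes[oid].items()
                       if old.nodes[oid].get(k) != v}
            if changed:
                edits[oid] = ("update", changed)
    return edits

old = to_graph({"bldg_1": {"height": 10.0, "roofType": "flat"}})
new = to_graph({"bldg_1": {"height": 12.5, "roofType": "flat"},
                "bldg_2": {"height": 7.0}})
print(diff(old, new))
```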
Integrated analysis of error detection and recovery
NASA Technical Reports Server (NTRS)
Shin, K. G.; Lee, Y. H.
1985-01-01
An integrated modeling and analysis of error detection and recovery is presented. When fault latency and/or error latency exist, the system may suffer from multiple faults or error propagations which seriously deteriorate the fault-tolerant capability. Several detection models that enable analysis of the effect of detection mechanisms on the subsequent error handling operations and the overall system reliability were developed. Following detection of the faulty unit and reconfiguration of the system, the contaminated processes or tasks have to be recovered. The strategies of error recovery employed depend on the detection mechanisms and the available redundancy. Several recovery methods including the rollback recovery are considered. The recovery overhead is evaluated as an index of the capabilities of the detection and reconfiguration mechanisms.
NASA Technical Reports Server (NTRS)
Noll, Thomas E.
1990-01-01
The paper describes recent accomplishments and current research projects along four main thrusts in aeroservoelasticity at NASA Langley. One activity focuses on enhancing the modeling and analysis procedures to accurately predict aeroservoelastic interactions. Improvements to the minimum-state method of approximating unsteady aerodynamics are shown to provide precise low-order models for design and simulation tasks. Recent extensions in aerodynamic correction-factor methodology are also described. With respect to analysis procedures, the paper reviews novel enhancements to matched filter theory and random process theory for predicting the critical gust profile and the associated time-correlated gust loads for structural design considerations. Two research projects leading towards improved design capability are also summarized: (1) an integrated structure/control design capability and (2) procedures for obtaining low-order robust digital control laws for aeroelastic applications.
NASA Technical Reports Server (NTRS)
Haste, Deepak; Ghoshal, Sudipto; Johnson, Stephen B.; Moore, Craig
2018-01-01
This paper describes the theory and considerations in the application of model-based techniques to assimilate information from disjoint knowledge sources for performing NASA's Fault Management (FM)-related activities using the TEAMS® toolset. FM consists of the operational mitigation of existing and impending spacecraft failures. NASA's FM directives have both design-phase and operational-phase goals. This paper highlights recent studies by QSI and DST of the capabilities required in the TEAMS® toolset for conducting FM activities with the aim of reducing operating costs, increasing autonomy, and conforming to time schedules. These studies use and extend the analytic capabilities of QSI's TEAMS® toolset to conduct a range of FM activities within a centralized platform.
Molecular Sieve Bench Testing and Computer Modeling
NASA Technical Reports Server (NTRS)
Mohamadinejad, Habib; DaLee, Robert C.; Blackmon, James B.
1995-01-01
The design of an efficient four-bed molecular sieve (4BMS) CO2 removal system for the International Space Station depends on many mission parameters, such as duration, crew size, cost of power, volume, fluid interface properties, etc. A need for space vehicle CO2 removal system models capable of accurately performing extrapolated hardware predictions is inevitable due to the change of the parameters which influences the CO2 removal system capacity. The purpose is to investigate the mathematical techniques required for a model capable of accurate extrapolated performance predictions and to obtain test data required to estimate mass transfer coefficients and verify the computer model. Models have been developed to demonstrate that the finite difference technique can be successfully applied to sorbents and conditions used in spacecraft CO2 removal systems. The nonisothermal, axially dispersed, plug flow model with linear driving force for 5X sorbent and pore diffusion for silica gel are then applied to test data. A more complex model, a non-darcian model (two dimensional), has also been developed for simulation of the test data. This model takes into account the channeling effect on column breakthrough. Four FORTRAN computer programs are presented: a two-dimensional model of flow adsorption/desorption in a packed bed; a one-dimensional model of flow adsorption/desorption in a packed bed; a model of thermal vacuum desorption; and a model of a tri-sectional packed bed with two different sorbent materials. The programs are capable of simulating up to four gas constituents for each process, which can be increased with a few minor changes.
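As a rough illustration of the linear-driving-force (LDF) uptake concept used in such column models, the toy sketch below integrates a single well-mixed bed cell with a linear isotherm. All parameter values are assumed and the solid/gas volume factors are lumped into the coefficients, so this is a conceptual stand-in for the 1D/2D finite-difference FORTRAN programs described above, not a reimplementation of them.

```python
# Single well-mixed bed cell with a linear isotherm and a linear driving
# force (LDF) uptake model: dq/dt = k * (q_star - q).
k_ldf  = 0.002     # 1/s, lumped mass-transfer coefficient (assumed)
K_iso  = 50.0      # linear isotherm constant, q* = K * c (assumed)
c_in   = 4.0e-3    # inlet CO2 concentration, mol/m^3 (assumed)
tau    = 5.0       # gas residence time in the cell, s (assumed)
dt, t_end = 1.0, 3600.0

c, q = 0.0, 0.0
for step in range(int(t_end / dt)):
    q_star = K_iso * c                 # equilibrium loading for the current gas phase
    dq = k_ldf * (q_star - q)          # LDF uptake rate
    dc = (c_in - c) / tau - dq         # gas-phase balance (volume factors lumped)
    q += dq * dt                       # forward-Euler update of loading
    c += dc * dt                       # forward-Euler update of concentration

print(f"after {t_end:.0f} s: gas concentration {c:.2e} mol/m^3, loading {q:.3f}")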
Dogrul, Emin C.; Schmid, Wolfgang; Hanson, Randall T.; Kadir, Tariq; Chung, Francis
2016-01-01
Effective modeling of conjunctive use of surface and subsurface water resources requires simulation of land use-based root zone and surface flow processes as well as groundwater flows, streamflows, and their interactions. Recently, two computer models developed for this purpose, the Integrated Water Flow Model (IWFM) from the California Department of Water Resources and the MODFLOW with Farm Process (MF-FMP) from the US Geological Survey, have been applied to complex basins such as the Central Valley of California. As both IWFM and MF-FMP are publicly available for download and can be applied to other basins, there is a need to objectively compare the main approaches and features used in both models. This paper compares the concepts, as well as the method and simulation features, of each hydrologic model pertaining to groundwater, surface water, and landscape processes. The comparison is focused on the integrated simulation of water demand and supply, water use, and the flow between coupled hydrologic processes. The differences in the capabilities and features of these two models could affect the outcome and types of water resource problems that can be simulated.
NASA Astrophysics Data System (ADS)
Tian, Yingtao; Robson, Joseph D.; Riekehr, Stefan; Kashaev, Nikolai; Wang, Li; Lowe, Tristan; Karanika, Alexandra
2016-07-01
Laser welding of advanced Al-Li alloys has been developed to meet the increasing demand for light-weight and high-strength aerospace structures. However, welding of high-strength Al-Li alloys can be problematic due to the tendency for hot cracking. Finding suitable welding parameters and filler material for this combination currently requires extensive and costly trial and error experimentation. The present work describes a novel coupled model to predict hot crack susceptibility (HCS) in Al-Li welds. Such a model can be used to shortcut the weld development process. The coupled model combines finite element process simulation with a two-level HCS model. The finite element process model predicts thermal field data for the subsequent HCS prediction. The model can be used to predict the influences of filler wire composition and welding parameters on HCS. The modeling results have been validated by comparing predictions with results from fully instrumented laser welds performed under a range of process parameters and analyzed using high-resolution X-ray tomography to identify weld defects. It is shown that the model is capable of accurately predicting the thermal field around the weld and the trend of HCS as a function of process parameters.
Process improvement as an investment: Measuring its worth
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Jeletic, Kellyann
1993-01-01
This paper discusses return on investment (ROI) generated from software process improvement programs. It details the steps needed to compute ROI and compares these steps from the perspective of two process improvement approaches: the widely known Software Engineering Institute's capability maturity model and the approach employed by NASA's Software Engineering Laboratory (SEL). The paper then describes the specific investments made in the SEL over the past 18 years and discusses the improvements gained from this investment by the production organization in the SEL.
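The basic ROI arithmetic such studies build on can be expressed compactly. The figures in the sketch below are placeholders only, not the SEL's reported investments or savings.

```python
def process_improvement_roi(annual_benefit, annual_cost, years):
    """Simple ROI: (cumulative benefit - cumulative investment) / cumulative investment."""
    benefit = annual_benefit * years
    cost = annual_cost * years
    return (benefit - cost) / cost

# Placeholder figures only; the SEL's actual costs and savings are in the paper.
print(f"ROI = {process_improvement_roi(1.5e6, 0.5e6, 18):.1f}  (i.e., 200%)")
```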
A Brief Study of Software Engineering Professional Continuing Education in DoD Acquisition
2010-04-01
Survey responses by standard or model (number of respondents in parentheses): Software Lifecycle Processes (IEEE 12207) (810): 37%, 61%, 2%. Guide to the Software Engineering Body of Knowledge (SWEBOK) (804): 67%, 31%, 2%. Software Engineering - Software Measurement Process (ISO/IEC 15939) (797): 55%, 44%, 2%. Capability Maturity Model Integration (806): 17%, 81%, 2%. Six Sigma Process Improvement (804): 7%, 91%, 1%. ISO 9000 Quality Management Systems (803): 10%, 89%, 1%. Conclusions: significant problem areas include Requirements Management...
Seven Processes that Enable NASA Software Engineering Technologies
NASA Technical Reports Server (NTRS)
Housch, Helen; Godfrey, Sally
2011-01-01
This slide presentation reviews seven processes that NASA uses to ensure that software is developed, acquired and maintained as specified in the NPR 7150.2A requirement. The requirement is to ensure that all software is appraised against the Capability Maturity Model Integration (CMMI). The enumerated processes are: (7) Product Integration, (6) Configuration Management, (5) Verification, (4) Software Assurance, (3) Measurement and Analysis, (2) Requirements Management and (1) Planning & Monitoring. Each process is described along with the group(s) responsible for it.
NASA Astrophysics Data System (ADS)
Mohamed, Omar Ahmed; Hasan Masood, Syed; Lal Bhowmik, Jahar
2018-02-01
In the additive manufacturing (AM) market, the question is raised by industry and AM users of how reproducible and repeatable the fused deposition modeling (FDM) process is in providing good dimensional accuracy. This paper aims to investigate and evaluate the repeatability and reproducibility of the FDM process through a systematic approach to answer this frequently asked question. A case study based on the statistical gage repeatability and reproducibility (gage R&R) technique is proposed to investigate the dimensional variations in the printed parts of the FDM process. After running the simulation and analyzing the data, the FDM process capability is evaluated, which would help industry better understand the performance of FDM technology.
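For context, the sketch below shows one common ANOVA-based way to compute %GRR for a crossed parts-by-operators study. The variance-component formulas are the standard crossed-design ones; the measurement data are synthetic and not taken from the paper.

```python
import numpy as np

def gage_rr(x):
    """ANOVA-based gage R&R for a crossed study: x[part, operator, replicate]."""
    p, o, n = x.shape
    grand = x.mean()
    part_m = x.mean(axis=(1, 2))
    oper_m = x.mean(axis=(0, 2))
    cell_m = x.mean(axis=2)

    ss_part = o * n * np.sum((part_m - grand) ** 2)
    ss_oper = p * n * np.sum((oper_m - grand) ** 2)
    ss_int  = n * np.sum((cell_m - part_m[:, None] - oper_m[None, :] + grand) ** 2)
    ss_tot  = np.sum((x - grand) ** 2)
    ss_rep  = ss_tot - ss_part - ss_oper - ss_int

    ms_part, ms_oper = ss_part / (p - 1), ss_oper / (o - 1)
    ms_int, ms_rep = ss_int / ((p - 1) * (o - 1)), ss_rep / (p * o * (n - 1))

    v_rep  = ms_rep                                   # repeatability (equipment)
    v_int  = max(0.0, (ms_int - ms_rep) / n)
    v_oper = max(0.0, (ms_oper - ms_int) / (p * n))   # reproducibility (operator)
    v_part = max(0.0, (ms_part - ms_int) / (o * n))
    grr = v_rep + v_oper + v_int
    return 100.0 * np.sqrt(grr / (grr + v_part))      # %GRR of total variation

rng = np.random.default_rng(0)
# Synthetic study: 10 printed parts, 3 operators, 2 repeated measurements each.
true = rng.normal(25.0, 0.30, size=(10, 1, 1))
data = true + rng.normal(0.0, 0.02, size=(10, 3, 2))
print(f"%GRR = {gage_rr(data):.1f}%  (<10% is conventionally considered acceptable)")
```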
NASA Technical Reports Server (NTRS)
Gaston, S.; Wertheim, M.; Orourke, J. A.
1973-01-01
Summary, consolidation and analysis of specifications, manufacturing process and test controls, and performance results for OAO-2 and OAO-3 lot 20 Amp-Hr sealed nickel cadmium cells and batteries are reported. Correlation of improvements in control requirements with performance is a key feature. Updates for a cell/battery computer model to improve performance prediction capability are included. Applicability of regression analysis computer techniques to relate process controls to performance is checked.
Integrated Modeling and Analysis of Physical Oceanographic and Acoustic Processes
2015-09-30
The goal is to improve ocean physical state and acoustic state predictive capabilities. The goal fitting the scope of this project is the creation of... Project-scale objectives are to complete targeted studies of oceanographic processes in a few regimes, accompanied by studies of acoustic propagation... by the basic research efforts of this project. An additional objective is to develop improved computational tools for acoustics and for the
1987-05-06
Rational Environment A_9_5_2, Rational Architecture (R1000 Model 200). ...validation testing performed on the Rational Environment, A_9_5_2, using Version 1.8 of the Ada Compiler Validation Capability (ACVC). The Rational Environment is hosted on a Rational Architecture (R1000 Model 200) operating under Rational Environment, Release A_9_5_2. Programs processed by this
In-vehicle group activity modeling and simulation in sensor-based virtual environment
NASA Astrophysics Data System (ADS)
Shirkhodaie, Amir; Telagamsetti, Durga; Poshtyar, Azin; Chan, Alex; Hu, Shuowen
2016-05-01
Human group activity recognition is a very complex and challenging task, especially for Partially Observable Group Activities (POGA) that occur in confined spaces with limited visual observability and often under severe occlusion. In this paper, we present the IRIS Virtual Environment Simulation Model (VESM) for the modeling and simulation of dynamic POGA. More specifically, we address sensor-based modeling and simulation of a specific category of POGA, called In-Vehicle Group Activities (IVGA). In VESM, human-like animated characters, called humanoids, are employed to simulate complex in-vehicle group activities within the confined space of a modeled vehicle. Each articulated humanoid is kinematically modeled with comparable physical attributes and appearances that are linkable to its human counterpart. Each humanoid exhibits harmonious full-body motion - simulating human-like gestures and postures, facial expressions, and hand motions for coordinated dexterity. VESM facilitates the creation of interactive scenarios consisting of multiple humanoids with different personalities and intentions, which are capable of performing complicated human activities within the confined space inside a typical vehicle. In this paper, we demonstrate the efficiency and effectiveness of VESM in terms of its capabilities to seamlessly generate time-synchronized, multi-source, and correlated imagery datasets of IVGA, which are useful for the training and testing of multi-source full-motion video processing and annotation. Furthermore, we demonstrate full-motion video processing of such simulated scenarios under different operational contextual constraints.
High-resolution time-frequency representation of EEG data using multi-scale wavelets
NASA Astrophysics Data System (ADS)
Li, Yang; Cui, Wei-Gang; Luo, Mei-Lin; Li, Ke; Wang, Lina
2017-09-01
An efficient time-varying autoregressive (TVAR) modelling scheme that expands the time-varying parameters onto multi-scale wavelet basis functions is presented for modelling nonstationary signals, with applications to time-frequency analysis (TFA) of electroencephalogram (EEG) signals. In the new parametric modelling framework, the time-dependent parameters of the TVAR model are locally represented using a novel multi-scale wavelet decomposition scheme, which allows smooth trends in the time-varying parameters to be captured while abrupt changes are tracked simultaneously. A forward orthogonal least squares (FOLS) algorithm aided by mutual information criteria is then applied for sparse model term selection and parameter estimation. Two simulation examples illustrate that the proposed multi-scale wavelet basis functions outperform single-scale wavelet basis functions and the Kalman filter algorithm for many nonstationary processes. Furthermore, an application of the proposed method to a real EEG signal demonstrates that the new approach can provide highly time-dependent spectral resolution capability.
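The core idea of expanding time-varying AR coefficients onto basis functions can be sketched with plain least squares. The example below uses a crude Haar-like multi-scale basis and ordinary least squares in place of the paper's wavelet family and FOLS/mutual-information term selection, so it illustrates the modelling framework rather than the proposed algorithm itself.

```python
import numpy as np

def haar_basis(n, levels=3):
    """Crude multi-scale (Haar-like) basis sampled on n time points."""
    t = np.arange(n)
    cols = [np.ones(n)]
    for j in range(levels):
        width = n // (2 ** (j + 1))
        for k in range(2 ** (j + 1)):
            psi = np.zeros(n)
            seg = (t >= k * width) & (t < (k + 1) * width)
            psi[seg] = np.where(t[seg] - k * width < width / 2, 1.0, -1.0)
            cols.append(psi)
    return np.column_stack(cols)

def fit_tvar(y, order=2, levels=3):
    """TVAR(p): y[t] = sum_i a_i(t) y[t-i] + e,  with a_i(t) = sum_j c_ij * B_j(t)."""
    n = len(y)
    B = haar_basis(n, levels)                       # (n, m) basis matrix
    rows, targets = [], []
    for t in range(order, n):
        # regressors are lagged outputs modulated by the basis functions
        rows.append(np.concatenate([y[t - i] * B[t] for i in range(1, order + 1)]))
        targets.append(y[t])
    c, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    m = B.shape[1]
    return np.stack([B @ c[i * m:(i + 1) * m] for i in range(order)])   # a_i(t)

# Toy nonstationary AR(2) signal whose first coefficient jumps halfway through.
rng = np.random.default_rng(1)
n = 512
a1 = np.where(np.arange(n) < n // 2, 1.2, 0.6)
y = np.zeros(n)
for t in range(2, n):
    y[t] = a1[t] * y[t - 1] - 0.5 * y[t - 2] + 0.1 * rng.standard_normal()
a_hat = fit_tvar(y, order=2, levels=3)
print("estimated a1 before/after the jump:",
      a_hat[0, n // 4].round(2), a_hat[0, 3 * n // 4].round(2))
```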
Simulating human behavior for national security human interactions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernard, Michael Lewis; Hart, Dereck H.; Verzi, Stephen J.
2007-01-01
This 3-year research and development effort focused on what we believe is a significant technical gap in existing modeling and simulation capabilities: the representation of plausible human cognition and behaviors within a dynamic, simulated environment. Specifically, the intent of the "Simulating Human Behavior for National Security Human Interactions" project was to demonstrate initial simulated human modeling capability that realistically represents intra- and inter-group interaction behaviors between simulated humans and human-controlled avatars as they respond to their environment. Significant progress was made towards simulating human behaviors through the development of a framework that produces realistic characteristics and movement. The simulated humans were created from models designed to be psychologically plausible by being based on robust psychological research and theory. Progress was also made towards enhancing Sandia National Laboratories' existing cognitive models to support culturally plausible behaviors that are important in representing group interactions. These models were implemented in the modular, interoperable, and commercially supported Umbra® simulation framework.
NASA Technical Reports Server (NTRS)
Aquilina, Rudolph A.
2015-01-01
The SMART-NAS Testbed for Safe Trajectory Based Operations Project will deliver an evaluation capability, critical to the ATM community, allowing full NextGen and beyond-NextGen concepts to be assessed and developed. To meet this objective a strong focus will be placed on concept integration and validation to enable a gate-to-gate trajectory-based system capability that satisfies a full vision for NextGen. The SMART-NAS for Safe TBO Project consists of six sub-projects. Three of the sub-projects are focused on exploring and developing technologies, concepts and models for evolving and transforming air traffic management operations in the ATM+2 time horizon, while the remaining three sub-projects are focused on developing the tools and capabilities needed for testing these advanced concepts. Function Allocation, Networked Air Traffic Management and Trajectory Based Operations are developing concepts and models. SMART-NAS Test-bed, System Assurance Technologies and Real-time Safety Modeling are developing the tools and capabilities to test these concepts. Simulation and modeling capabilities will include the ability to assess multiple operational scenarios of the national airspace system, accept data feeds, allowing shadowing of actual operations in either real-time, fast-time and/or hybrid modes of operations in distributed environments, and enable integrated examinations of concepts, algorithms, technologies, and NAS architectures. An important focus within this project is to enable the development of a real-time, system-wide safety assurance system. The basis of such a system is a continuum of information acquisition, analysis, and assessment that enables awareness and corrective action to detect and mitigate potential threats to continuous system-wide safety at all levels. This process, which currently can only be done post operations, will be driven towards "real-time" assessments in the 2035 time frame.
Contract Management Process Maturity: Analysis of Recent Organizational Assessments
2009-04-22
Airman: The book (Special Issue, Vol. LI). Washington, DC: Air Force News Agency, Secretary of the Air Force Office of Public Affairs. Yeung, A.K. ... competence (Yeung, Ulrich, Nason, & von Glinow, 1999). • Capability maturity models have been successfully used in assessing software management and
Facilitating Employees' and Students' Process towards Nascent Entrepreneurship
ERIC Educational Resources Information Center
Hietanen, Lenita
2015-01-01
Purpose: The purpose of this paper is to investigate a model for facilitating employees' and full-time, non-business students' entrepreneurial capabilities during their optional entrepreneurship studies at one Finnish Open University. Design/methodology/approach: The case study investigates the course in which transitions from employees or…
Computer Applications in the Design Process.
ERIC Educational Resources Information Center
Winchip, Susan
Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…
Group-oriented coordination models for distributed client-server computing
NASA Technical Reports Server (NTRS)
Adler, Richard M.; Hughes, Craig S.
1994-01-01
This paper describes group-oriented control models for distributed client-server interactions. These models transparently coordinate requests for services that involve multiple servers, such as queries across distributed databases. Specific capabilities include: decomposing and replicating client requests; dispatching request subtasks or copies to independent, networked servers; and combining server results into a single response for the client. The control models were implemented by combining request broker and process group technologies with an object-oriented communication middleware tool. The models are illustrated in the context of a distributed operations support application for space-based systems.
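The decompose-dispatch-combine pattern described above maps naturally onto a scatter-gather sketch. The example below uses Python's asyncio with stand-in server calls; it is an illustration of the coordination idea, not the request broker or process group middleware discussed in the paper.

```python
import asyncio

async def query_server(server, sub_request):
    """Stand-in for a network call to one back-end server."""
    await asyncio.sleep(0.01)                 # simulated latency
    return {"server": server, "rows": [f"{sub_request}@{server}"]}

async def group_request(request, servers):
    # 1. Decompose the client request into one sub-task per server.
    subtasks = [query_server(s, f"{request}/part{i}") for i, s in enumerate(servers)]
    # 2. Dispatch the sub-tasks concurrently and wait for all replies.
    replies = await asyncio.gather(*subtasks)
    # 3. Combine the partial results into a single response for the client.
    return {"request": request,
            "rows": [row for reply in replies for row in reply["rows"]]}

result = asyncio.run(group_request("SELECT *", ["db-east", "db-west", "db-eu"]))
print(result)
```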
Firm profitability and the network of organizational capabilities
NASA Astrophysics Data System (ADS)
Wagner, Friedrich; Milaković, Mishael; Alfarano, Simone
2010-11-01
A Laplace distribution for firm profit rates (or returns on assets) can be obtained through the sum of many independent shocks if the number of shocks is Poisson distributed. Interpreting this as a linear chain of events, we generalize the process to a hierarchical network structure. The hierarchical model reproduces the observed distributional patterns of firm profitability, which crucially depend on the life span of firms. While the profit rates of long-lived firms obey a symmetric Laplacian, short-lived firms display a different behavior depending on whether they are capable of generating positive profits or not. Successful short-lived firms exhibit a symmetric yet more leptokurtic pdf than long-lived firms. Our model suggests that these firms are more dynamic in their organizational capabilities, but on average also face more risk than long-lived firms. Finally, short-lived firms that fail to generate positive profits have the most leptokurtic distribution among the three classes, and on average lose slightly more than their total assets within a year.
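A toy simulation of the compound-Poisson shock picture that the abstract starts from is shown below. For mean-zero Gaussian shocks the excess kurtosis of such a sum is 3/lambda, so the sketch illustrates the resulting leptokurtosis only; it does not implement the hierarchical network generalization that the paper argues is needed to reproduce the observed Laplace shape and firm life-span effects.

```python
import numpy as np

rng = np.random.default_rng(42)
n_firms, lam = 100_000, 4.0          # assumed firm count and mean shock count

# Each firm's profit rate is the sum of a Poisson-distributed number of
# independent Gaussian shocks (the linear chain of events in the abstract).
counts = rng.poisson(lam, n_firms)
profit = np.array([rng.normal(0.0, 1.0, k).sum() for k in counts])

z = (profit - profit.mean()) / profit.std()
excess_kurtosis = np.mean(z ** 4) - 3.0
print(f"excess kurtosis = {excess_kurtosis:.2f}  (0 for a Gaussian, 3 for a Laplacian)")
```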
Leavesley, G.H.; Markstrom, S.L.; Viger, R.J.
2004-01-01
The interdisciplinary nature and increasing complexity of water- and environmental-resource problems require the use of modeling approaches that can incorporate knowledge from a broad range of scientific disciplines. The many distributed hydrological and ecosystem models currently available are composed of a variety of different conceptualizations of the associated processes they simulate. Assessment of the capabilities of these distributed models requires evaluation of the conceptualizations of the individual processes, and the identification of which conceptualizations are most appropriate for various combinations of criteria, such as problem objectives, data constraints, and spatial and temporal scales of application. With this knowledge, "optimal" models for specific sets of criteria can be created and applied. The U.S. Geological Survey (USGS) Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide these model development and application capabilities. MMS supports the integration of models and tools at a variety of levels of modular design. These include individual process models, tightly coupled models, loosely coupled models, and fully-integrated decision support systems. A variety of visualization and statistical tools are also provided. MMS has been coupled with the Bureau of Reclamation (BOR) object-oriented reservoir and river-system modeling framework, RiverWare, under a joint USGS-BOR program called the Watershed and River System Management Program. MMS and RiverWare are linked using a shared relational database. The resulting database-centered decision support system provides tools for evaluating and applying optimal resource-allocation and management strategies to complex, operational decisions on multipurpose reservoir systems and watersheds. Management issues being addressed include efficiency of water-resources management, environmental concerns such as meeting flow needs for endangered species, and optimizing operations within the constraints of multiple objectives such as power generation, irrigation, and water conservation. This decision support system approach is being developed, tested, and implemented in the Gunnison, Yakima, San Juan, Rio Grande, and Truckee River basins of the western United States. Copyright ASCE 2004.
Toward a Capability Engineering Process
2004-12-01
M. Lizotte, F. Bernier, M. Mokhtari, M. Couture, G. Dussault, C. Lalancette, F. Lemieux. System of Systems... the US DoD 5000 acquisition strategies?; and (8) since a capability can be
Silicon web process development
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Blais, P. D.; Davis, J. R., Jr.
1977-01-01
Thirty-five (35) furnace runs were carried out during this quarter, of which 25 produced a total of 120 web crystals. The two main thermal models for the dendritic growth process were completed and are being used to assist the design of the thermal geometry of the web growth apparatus. The first model, a finite element representation of the susceptor and crucible, was refined to give greater precision and resolution in the critical central region of the melt. The second thermal model, which describes the dissipation of the latent heat to generate thickness-velocity data, was completed. Dendritic web samples were fabricated into solar cells using a standard configuration and a standard process for a N(+) -P-P(+) configuration. The detailed engineering design was completed for a new dendritic web growth facility of greater width capability than previous facilities.
NASA Astrophysics Data System (ADS)
Jin, Hu; Dong, Erbao; Xu, Min; Xia, Qirong; Liu, Shuai; Li, Weihua; Yang, Jie
2018-01-01
Many shape memory alloy (SMA)-based soft actuators have specific composite structures and manufacturing processes, and are therefore unique. However, these exclusive characteristics limit their capabilities and applications, so in this article a soft and smart digital structure (SDS) is proposed that acts like a modular unit to assemble soft actuators by a layered adhesive bonding process. The SDS is a fully soft structure that encapsulates a digital skeleton consisting of four groups of parallel and independently actuated SMA wires capable of outputting a four-channel tunable force. The layered adhesive bonding process modularly bonds several SDSs with an elastic backbone to fabricate a layered soft actuator, where the elastic backbone is used to recover the SDSs in a cooling process using the SMA wires. Two kinds of SDS-based soft actuators were modularly assembled: an actuator, SDS-I, with two-dimensional reciprocal motion, and an actuator, SDS-II, capable of bi-directional reciprocal motion. The thermodynamics and phase transformation modeling of the SDS-based actuator were analyzed. Several extensional soft actuators were also assembled by bonding the SDS with an anomalous elastic backbone or by modularly assembling the SDS-Is and SDS-IIs. These modularly assembled soft actuators delivered more output channels and complicated motion; e.g., an actinomorphic soft actuator with four SDS-Is jumps to a series of hierarchical heights and moves directionally by tuning the input channels of the SDSs. This result showed that the SDS can be used to modularly assemble multifarious soft actuators with diverse capabilities, steerability and tunable outputs.
Kang, Jian; Zhang, Jixin; Bai, Yongqiang
2016-12-15
An evaluation of the oil-spill emergency response capability (OS-ERC) currently in place in modern marine management is required to prevent pollution and loss accidents. The objective of this paper is to develop a novel OS-ERC evaluation model, the importance of which stems from the current lack of integrated approaches for interpreting, ranking and assessing OS-ERC performance factors. In the first part of this paper, the factors influencing OS-ERC are analyzed and classified to generate a global evaluation index system. Then, a semantic tree is adopted to illustrate linguistic variables in the evaluation process, followed by the application of a combination of Fuzzy Cognitive Maps (FCM) and the Analytic Hierarchy Process (AHP) to construct and calculate the weight distribution. Finally, considering that the OS-ERC evaluation process is a complex system, a fuzzy comprehensive evaluation (FCE) is employed to calculate the OS-ERC level. The entire evaluation framework obtains the overall level of OS-ERC, and also highlights the potential major issues concerning OS-ERC, as well as expert opinions for improving the feasibility of oil-spill accident prevention and protection. Copyright © 2016 Elsevier Ltd. All rights reserved.
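The final fuzzy comprehensive evaluation (FCE) step mentioned above, composing criterion weights with a fuzzy membership matrix, can be sketched in a few lines. The weights and membership degrees below are made-up placeholders, and the FCM/AHP weighting stages of the paper's framework are omitted.

```python
import numpy as np

# Hypothetical weights for four OS-ERC criteria (from the AHP/FCM stage).
w = np.array([0.40, 0.25, 0.20, 0.15])

# Membership matrix R: each row is a criterion's membership degrees over the
# evaluation grades (poor, fair, good, excellent), e.g. from expert scoring.
R = np.array([
    [0.10, 0.30, 0.40, 0.20],
    [0.05, 0.25, 0.50, 0.20],
    [0.20, 0.40, 0.30, 0.10],
    [0.10, 0.20, 0.40, 0.30],
])

# Weighted-average composition operator B = w . R, then normalise.
B = w @ R
B /= B.sum()
grades = ["poor", "fair", "good", "excellent"]
print(dict(zip(grades, B.round(3))), "->", grades[int(np.argmax(B))])
```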
Valdés, Julio J; Bonham-Carter, Graeme
2006-03-01
A computational intelligence approach is used to explore the problem of detecting internal state changes in time dependent processes described by heterogeneous, multivariate time series with imprecise data and missing values. Such processes are approximated by collections of time dependent non-linear autoregressive models represented by a special kind of neuro-fuzzy neural network. Grid and high throughput computing model mining procedures based on neuro-fuzzy networks and genetic algorithms generate: (i) collections of models composed of sets of time lag terms from the time series, and (ii) prediction functions represented by neuro-fuzzy networks. The composition of the models and their prediction capabilities allows the identification of changes in the internal structure of the process. These changes are associated with the alternation of steady and transient states, zones with abnormal behavior, instability, and other situations. This approach is general, and its sensitivity for detecting subtle changes of state is revealed by simulation experiments. Its potential in the study of complex processes in earth sciences and astrophysics is illustrated with applications using paleoclimate and solar data.
Signal Processing in Periodically Forced Gradient Frequency Neural Networks
Kim, Ji Chul; Large, Edward W.
2015-01-01
Oscillatory instability at the Hopf bifurcation is a dynamical phenomenon that has been suggested to characterize active non-linear processes observed in the auditory system. Networks of oscillators poised near Hopf bifurcation points and tuned to tonotopically distributed frequencies have been used as models of auditory processing at various levels, but systematic investigation of the dynamical properties of such oscillatory networks is still lacking. Here we provide a dynamical systems analysis of a canonical model for gradient frequency neural networks driven by a periodic signal. We use linear stability analysis to identify various driven behaviors of canonical oscillators for all possible ranges of model and forcing parameters. The analysis shows that canonical oscillators exhibit qualitatively different sets of driven states and transitions for different regimes of model parameters. We classify the parameter regimes into four main categories based on their distinct signal processing capabilities. This analysis will lead to deeper understanding of the diverse behaviors of neural systems under periodic forcing and can inform the design of oscillatory network models of auditory signal processing. PMID:26733858
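A single forced Hopf-type (Stuart-Landau) oscillator gives a minimal stand-in for one unit of the gradient-frequency networks analyzed above. The parameters below are assumed for illustration, and the full canonical model includes additional terms and network coupling that are not shown here.

```python
import numpy as np

# One Hopf-type oscillator: dz/dt = z*(alpha + i*omega + beta*|z|^2) + F*exp(i*omega_f*t)
alpha, beta = -0.1, -1.0                       # poised just below the Hopf bifurcation
omega, omega_f = 2 * np.pi * 1.0, 2 * np.pi * 1.05   # natural vs forcing frequency, rad/s
F, dt, T = 0.2, 1e-3, 20.0

z = 0.01 + 0j
amps = []
for step in range(int(T / dt)):
    t = step * dt
    dz = z * (alpha + 1j * omega + beta * abs(z) ** 2) + F * np.exp(1j * omega_f * t)
    z += dz * dt                               # forward-Euler integration
    amps.append(abs(z))

print(f"steady-state amplitude ~ {np.mean(amps[-2000:]):.3f}")
```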
Data near processing support for climate data analysis
NASA Astrophysics Data System (ADS)
Kindermann, Stephan; Ehbrecht, Carsten; Hempelmann, Nils
2016-04-01
Climate data repositories grow exponentially in size. Scalable data near processing capabilities are required to meet future data analysis requirements and to replace current "download and process at home" workflows and approaches. On the one hand, these processing capabilities should be accessible via standardized interfaces (e.g. OGC WPS); on the other hand, a large variety of processing tools, toolboxes and deployment alternatives have to be supported and maintained at the data/processing center. We present a community approach of a modular and flexible system supporting the development, deployment and maintenance of OGC-WPS-based web processing services. This approach is organized in an open source github project (called "bird-house") supporting individual processing services ("birds", e.g. climate index calculations, model data ensemble calculations), which rely on basic common infrastructural components (e.g. installation and deployment recipes, analysis code dependency management). To support easy deployment at data centers as well as home institutes (e.g. for testing and development), the system supports the management of the often very complex package dependency chain of climate data analysis packages as well as docker-based packaging and installation. We present a concrete deployment scenario at the German Climate Computing Center (DKRZ). DKRZ hosts, on the one hand, a multi-petabyte climate archive which is integrated, e.g., into the European ENES and worldwide ESGF data infrastructures and, on the other hand, an HPC center supporting (model) data production and data analysis. The deployment scenario also includes openstack-based data cloud services to support data import and data distribution for bird-house-based WPS web processing services. Current challenges for inter-institutional deployments of web processing services supporting the European and international climate modeling community as well as the climate impact community are highlighted. Aspects of future WPS-based cross-community usage scenarios supporting data reuse and data provenance are also reflected upon.
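For orientation, an OGC WPS 1.0.0 Execute request in key-value-pair form looks roughly like the sketch below. The endpoint URL and process identifier are purely illustrative; a real bird-house deployment advertises its own process identifiers through a GetCapabilities request.

```python
import requests

# Hypothetical WPS endpoint and process name, for illustration only.
endpoint = "https://example.org/wps"

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "ice_days",                              # illustrative climate-index process
    "DataInputs": "dataset=https://example.org/data/tasmax.nc",
}
response = requests.get(endpoint, params=params, timeout=60)
print(response.status_code)
print(response.text[:500])        # a WPS server returns an XML ExecuteResponse document
```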
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harben, P E; Harris, D; Myers, S
Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization and in full 3D finite difference modeling as well as statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications.
An Adaptive Technique for a Redundant-Sensor Navigation System. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chien, T. T.
1972-01-01
An on-line adaptive technique is developed to provide a self-contained redundant-sensor navigation system with the capability to utilize its full potential in reliability and performance. The gyro navigation system is modeled as a Gauss-Markov process, with degradation modes defined as changes in characteristics specified by parameters associated with the model. The adaptive system is formulated as a multistage stochastic process: (1) a detection system, (2) an identification system and (3) a compensation system. It is shown that the sufficient statistic for the partially observable process in the detection and identification systems is the posterior measure of the state of degradation, conditioned on the measurement history.
Physics-based interactive volume manipulation for sharing surgical process.
Nakao, Megumi; Minato, Kotaro
2010-05-01
This paper presents a new set of techniques by which surgeons can interactively manipulate patient-specific volumetric models for sharing surgical process. To handle physical interaction between the surgical tools and organs, we propose a simple surface-constraint-based manipulation algorithm to consistently simulate common surgical manipulations such as grasping, holding and retraction. Our computation model is capable of simulating soft-tissue deformation and incision in real time. We also present visualization techniques in order to rapidly visualize time-varying, volumetric information on the deformed image. This paper demonstrates the success of the proposed methods in enabling the simulation of surgical processes, and the ways in which this simulation facilitates preoperative planning and rehearsal.
Kennedy Space Center Launch and Landing Support
NASA Technical Reports Server (NTRS)
Wahlberg, Jennifer
2010-01-01
The presentation describes Kennedy Space Center (KSC) payload processing, facilities and capabilities, and research development and life science experience. Topics include launch site processing, payload processing, key launch site processing roles, leveraging KSC experience, Space Station Processing Facility and capabilities, Baseline Data Collection Facility, Space Life Sciences Laboratory and capabilities, research payload development, International Space Station research flight hardware, KSC flight payload history, and KSC life science expertise.
Stochastic simulation by image quilting of process-based geological models
NASA Astrophysics Data System (ADS)
Hoffimann, Júlio; Scheidt, Céline; Barfod, Adrian; Caers, Jef
2017-09-01
Process-based modeling offers a way to represent realistic geological heterogeneity in subsurface models. The main limitation lies in conditioning such models to data. Multiple-point geostatistics can use these process-based models as training images and address the data conditioning problem. In this work, we further develop image quilting as a method for 3D stochastic simulation capable of mimicking the realism of process-based geological models with minimal modeling effort (i.e., parameter tuning) and at the same time conditioning them to a variety of data. In particular, we develop a new probabilistic data aggregation method for image quilting that bypasses traditional ad hoc weighting of auxiliary variables. In addition, we propose a novel criterion for template design in image quilting that generalizes the entropy plot for continuous training images. The criterion is based on the new concept of voxel reuse, a stochastic and quilting-aware function of the training image. We compare our proposed method with other established simulation methods on a set of process-based training images of varying complexity, including a real-case example of stochastic simulation of the buried-valley groundwater system in Denmark.
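A toy one-dimensional sketch of the quilting step is given below, assuming a synthetic training image; the actual method operates on 3D process-based training images with probabilistic aggregation of auxiliary data, which this sketch omits.

```python
import numpy as np

# Toy 1D "quilting" sketch (illustrative, not the authors' 3D code): grow a
# simulation by repeatedly appending the training-image patch whose leading
# overlap best matches the tail of what has been simulated so far.
rng = np.random.default_rng(1)
ti = np.cumsum(rng.standard_normal(500))          # toy continuous training image
patch, overlap = 25, 5                            # assumed template sizes

cands = [ti[i:i + patch] for i in range(len(ti) - patch)]   # candidate patches
sim = list(cands[rng.integers(len(cands))])       # seed with a random patch
for _ in range(10):
    tail = np.array(sim[-overlap:])
    # mismatch between the existing tail and each candidate's leading overlap
    errs = [np.sum((c[:overlap] - tail) ** 2) for c in cands]
    best = cands[int(np.argmin(errs))]
    sim.extend(best[overlap:])                    # quilt past the overlap
print(len(sim))
```

In a conditional setting the mismatch term would also include a distance to hard and soft data inside the candidate patch, which is where the paper's aggregation method comes in.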
Detection and quantification of flow consistency in business process models.
Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel; Soffer, Pnina; Weber, Barbara
2018-01-01
Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics addressing these challenges, each following a different view of flow consistency. We then report the results of an empirical evaluation, which indicates which metric is more effective in predicting the human perception of this feature. Moreover, two other automatic evaluations describing the performance and the computational capabilities of our metrics are reported as well.
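One simple way to quantify flow consistency, shown below for illustration only (it is not necessarily any of the paper's three metrics), is the share of edges whose horizontal direction agrees with a left-to-right flow, given node coordinates from the model layout.

```python
# Illustrative flow-consistency metric: the fraction of edges pointing in the
# dominant (left-to-right) direction. Node names and coordinates are made up.
pos = {"start": (0, 0), "A": (1, 0), "B": (2, 1), "C": (2, -1), "end": (3, 0)}
edges = [("start", "A"), ("A", "B"), ("A", "C"), ("B", "end"), ("C", "end")]

def flow_consistency(pos, edges):
    dx = [pos[v][0] - pos[u][0] for u, v in edges]
    rightward = sum(1 for d in dx if d > 0)
    return rightward / len(edges)

print(flow_consistency(pos, edges))   # 1.0 for a fully left-to-right layout
```

The paper's challenge is precisely that such a naive count ignores edge bends, back edges in loops, and mixed layout directions, which is why several competing formalizations are compared.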
Future requirements in surface modeling and grid generation
NASA Technical Reports Server (NTRS)
Cosner, Raymond R.
1995-01-01
The past ten years have seen steady progress in surface modeling procedures, and wholesale changes in grid generation technology. Today, it seems fair to state that a satisfactory grid can be developed to model nearly any configuration of interest. The issues at present focus on operational concerns such as cost and quality. Continuing evolution of the engineering process is placing new demands on the technologies of surface modeling and grid generation. In the evolution toward a multidisciplinary, analysis-based design environment, methods developed for Computational Fluid Dynamics are finding acceptance in many additional applications. These two trends, the normal evolution of the process and a watershed shift toward concurrent and multidisciplinary analysis, will be considered in assessing current capabilities and needed technological improvements.
An agent-based model for queue formation of powered two-wheelers in heterogeneous traffic
NASA Astrophysics Data System (ADS)
Lee, Tzu-Chang; Wong, K. I.
2016-11-01
This paper presents an agent-based model (ABM) for simulating the queue formation of powered two-wheelers (PTWs) in heterogeneous traffic at a signalized intersection. The main novelty is that the proposed interaction rule describing the position choice behavior of PTWs when queuing in heterogeneous traffic can capture the stochastic nature of the decision making process. The interaction rule is formulated as a multinomial logit model, which is calibrated by using a microscopic traffic trajectory dataset obtained from video footage. The ABM is validated against the survey data for the vehicular trajectory patterns, queuing patterns, queue lengths, and discharge rates. The results demonstrate that the proposed model is capable of replicating the observed queue formation process for heterogeneous traffic.
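The position-choice rule can be sketched as multinomial-logit sampling over candidate queue positions; the attribute names and coefficients below are assumptions for illustration, not the values calibrated from the trajectory dataset.

```python
import numpy as np

# Hedged sketch of the multinomial-logit choice at the heart of the ABM: a PTW
# agent picks among candidate queue positions with probability proportional to
# exp(utility). Attributes and coefficients are illustrative assumptions.
def choose_position(candidates, beta, rng):
    """candidates: list of attribute dicts; beta: dict of utility coefficients."""
    utilities = np.array([sum(beta[k] * c[k] for k in beta) for c in candidates])
    p = np.exp(utilities - utilities.max())   # softmax with overflow protection
    p /= p.sum()
    return rng.choice(len(candidates), p=p)

beta = {"gap_width": 1.2, "dist_to_stopline": -0.8}      # illustrative coefficients
candidates = [{"gap_width": 0.9, "dist_to_stopline": 2.0},
              {"gap_width": 1.4, "dist_to_stopline": 5.0},
              {"gap_width": 0.6, "dist_to_stopline": 1.0}]
rng = np.random.default_rng(0)
print(choose_position(candidates, beta, rng))
```

Sampling from the logit probabilities, rather than always taking the highest-utility position, is what lets the simulated queues reproduce the stochastic variation seen in the video data.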
An architecture for the development of real-time fault diagnosis systems using model-based reasoning
NASA Technical Reports Server (NTRS)
Hall, Gardiner A.; Schuetzle, James; Lavallee, David; Gupta, Uday
1992-01-01
Presented here is an architecture for implementing real-time, telemetry-based diagnostic systems using model-based reasoning. First, we describe Paragon, a knowledge acquisition tool for offline entry and validation of physical system models. Paragon provides domain experts with a structured editing capability to capture a physical component's structure, behavior, and causal relationships. We next describe the architecture of the run-time diagnostic system. The diagnostic system, written entirely in Ada, uses the behavioral model developed offline with Paragon to simulate expected component states as reflected in the telemetry stream. The diagnostic algorithm traces causal relationships contained within the model to isolate system faults. Since the diagnostic process relies exclusively on the behavioral model and is implemented without heuristic rules, it can be used to isolate unpredicted faults in a wide variety of systems. Finally, we discuss the implementation of a prototype system constructed using this technique for diagnosing faults in a science instrument. The prototype demonstrates the use of model-based reasoning to develop maintainable systems with greater diagnostic capabilities at lower cost.
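A hedged sketch of the general model-based isolation idea (not the Paragon/Ada implementation): compare predicted and observed telemetry, then keep the components whose modeled influence covers every discrepant measurement. Component names, thresholds, and the influence map are assumptions for illustration.

```python
# Illustrative model-based fault isolation: a component is a candidate fault if
# its modeled influence set covers all telemetry points that disagree with the
# simulated expectation. The model and values below are made up for the sketch.
causes = {                      # component -> telemetry points it influences
    "power_supply": {"voltage", "detector_temp"},
    "heater": {"detector_temp"},
    "detector": {"count_rate"},
}

def isolate(predicted, observed, tol=0.05):
    discrepant = {k for k in predicted
                  if abs(predicted[k] - observed[k]) > tol * abs(predicted[k])}
    return [c for c, effects in causes.items() if discrepant and discrepant <= effects]

predicted = {"voltage": 28.0, "detector_temp": 20.0, "count_rate": 100.0}
observed  = {"voltage": 27.9, "detector_temp": 24.5, "count_rate": 101.0}
print(isolate(predicted, observed))   # components consistent with the discrepancy
```

Because the candidates fall out of the model's causal structure rather than pre-enumerated fault rules, previously unanticipated failures can still be localized, which is the property the abstract emphasizes.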
NASA Astrophysics Data System (ADS)
Brigatti, E.; Vieira, M. V.; Kajin, M.; Almeida, P. J. A. L.; de Menezes, M. A.; Cerqueira, R.
2016-02-01
We study the population size time series of a Neotropical small mammal with the intent of detecting and modelling population regulation processes generated by density-dependent factors and their possible delayed effects. The application of analysis tools based on principles of statistical generality is nowadays common practice for describing these phenomena, but such tools are generally better at producing a clear diagnosis than at providing valuable modelling. For this reason, in our approach, we detect the principal temporal structures on the basis of different correlation measures, and from these results we build an ad hoc minimalist autoregressive model that incorporates the main drivers of the dynamics. Surprisingly, our model reproduces the temporal patterns of the empirical series very well and, for the first time, clearly outlines the importance of the time of attaining sexual maturity as a central temporal scale for the dynamics of this species. An important advantage of this analysis scheme is that all the model parameters are directly biologically interpretable and potentially measurable, allowing a consistency check between model outputs and independent measurements.
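The delayed density dependence can be sketched as an autoregressive model with one lag at the age of sexual maturity; the coefficients and lag below are illustrative, not the values fitted to the empirical series.

```python
import numpy as np

# Hedged sketch of an autoregressive model with a delayed density-dependent term,
# in the spirit of the paper's minimalist model. The lag d stands for the time to
# reach sexual maturity; coefficients, lag, and noise level are assumptions.
def simulate(a1, ad, d, sigma, n, rng):
    """x_t = a1 * x_{t-1} + ad * x_{t-d} + noise, on (log-)abundance anomalies."""
    x = np.zeros(n)
    for t in range(d, n):
        x[t] = a1 * x[t - 1] + ad * x[t - d] + sigma * rng.standard_normal()
    return x

rng = np.random.default_rng(2)
series = simulate(a1=0.6, ad=-0.3, d=4, sigma=0.2, n=120, rng=rng)
print(series[-5:].round(2))
```

With a negative delayed coefficient the simulated abundance oscillates on a period set largely by the lag, which is how a maturation delay leaves a detectable signature in the correlation structure of the data.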
1985-05-01
... unit in the data base, with knowing one generic assembly language. The 5-tuple describing single operation execution time of the operations ... computing machinery capable of performing these tasks within a given time constraint. Because the majority of the available computing machinery is general ...