Incorporation of Electrical Systems Models Into an Existing Thermodynamic Cycle Code
NASA Technical Reports Server (NTRS)
Freeh, Josh
2003-01-01
Integration of the entire system includes: fuel cells, motors, propulsors, thermal/power management, compressors, etc. Use of existing, pre-developed NPSS capabilities includes: 1) Optimization tools; 2) Gas turbine models for hybrid systems; 3) Increased interplay between subsystems; 4) Off-design modeling capabilities; 5) Altitude effects; and 6) Existing transient modeling architecture. Other factors include: 1) Easier transfer between users and groups of users; 2) General aerospace industry acceptance and familiarity; and 3) Flexible analysis tool that can also be used for ground power applications.
Existing and Required Modeling Capabilities for Evaluating ATM Systems and Concepts
NASA Technical Reports Server (NTRS)
Odoni, Amedeo R.; Bowman, Jeremy; Delahaye, Daniel; Deyst, John J.; Feron, Eric; Hansman, R. John; Khan, Kashif; Kuchar, James K.; Pujet, Nicolas; Simpson, Robert W.
1997-01-01
ATM systems throughout the world are entering a period of major transition and change. The combination of important technological developments and of the globalization of the air transportation industry has necessitated a reexamination of some of the fundamental premises of existing Air Traffic Management (ATM) concepts. New ATM concepts have to be examined, concepts that may place more emphasis on: strategic traffic management; planning and control; partial decentralization of decision-making; and added reliance on the aircraft to carry out strategic ATM plans, with ground controllers confined primarily to a monitoring and supervisory role. 'Free Flight' is a case in point. In order to study, evaluate and validate such new concepts, the ATM community will have to rely heavily on models and computer-based tools/utilities, covering a wide range of issues and metrics related to safety, capacity and efficiency. The state of the art in such modeling support is adequate in some respects, but clearly deficient in others. It is the objective of this study to assist in: (1) assessing the strengths and weaknesses of existing fast-time models and tools for the study of ATM systems and concepts and (2) identifying and prioritizing the requirements for the development of additional modeling capabilities in the near future. A three-stage process has been followed to this purpose: 1. Through the analysis of two case studies involving future ATM system scenarios, as well as through expert assessment, modeling capabilities and supporting tools needed for testing and validating future ATM systems and concepts were identified and described. 2. Existing fast-time ATM models and support tools were reviewed and assessed with regard to the degree to which they offer the capabilities identified under Step 1. 3.
The findings of 1 and 2 were combined to draw conclusions about (1) the best capabilities currently existing, (2) the types of concept testing and validation that can be carried out reliably with such existing capabilities and (3) the currently unavailable modeling capabilities that should receive high priority for near-term research and development. It should be emphasized that the study is concerned only with the class of 'fast time' analytical and simulation models. 'Real time' models, that typically involve humans-in-the-loop, comprise another extensive class which is not addressed in this report. However, the relationship between some of the fast-time models reviewed and a few well-known real-time models is identified in several parts of this report and the potential benefits from the combined use of these two classes of models-a very important subject-are discussed in chapters 4 and 7.
SMP: A solid modeling program version 2.0
NASA Technical Reports Server (NTRS)
Randall, D. P.; Jones, K. H.; Vonofenheim, W. H.; Gates, R. L.; Matthews, C. G.
1986-01-01
The Solid Modeling Program (SMP) provides the capability to model complex solid objects through the composition of primitive geometric entities. In addition to the construction of solid models, SMP has extensive facilities for model editing, display, and analysis. The geometric model produced by the software system can be output in a format compatible with existing analysis programs such as PATRAN-G. The present version of the SMP software supports six primitives: boxes, cones, spheres, paraboloids, tori, and trusses. The details for creating each of the major primitive types are presented. The analysis capabilities of SMP, including interfaces to existing analysis programs, are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christensen, Craig
Opportunities for combining energy efficiency, demand response, and energy storage with PV are often missed, because the required knowledge and expertise for these different technologies exist in separate organizations or individuals. Furthermore, there is a lack of quantitative tools to optimize energy efficiency, demand response and energy storage with PV, especially for existing buildings. Our goal is to develop a modeling tool, BEopt-CA (Ex), with capabilities to facilitate identification and implementation of a balanced integration of energy efficiency (EE), demand response (DR), and energy storage (ES) with photovoltaics (PV) within the residential retrofit market. To achieve this goal, we will adapt and extend an existing tool -- BEopt -- that is designed to identify optimal combinations of efficiency and PV in new home designs. In addition, we will develop multifamily residential modeling capabilities for use in California, to facilitate integration of distributed solar power into the grid in order to maximize its value to California ratepayers. The project is follow-on research that leverages previous California Solar Initiative RD&D investment in the BEopt software. BEopt facilitates finding the least-cost combination of energy efficiency and renewables to support integrated DSM (iDSM) and Zero Net Energy (ZNE) in California residential buildings. However, BEopt is currently focused on modeling single-family houses and does not include satisfactory capabilities for modeling multifamily homes. The project brings BEopt's existing modeling and optimization capabilities to multifamily buildings, including duplexes, triplexes, townhouses, flats, and low-rise apartment buildings.
User's Guide To CHEAP0 II-Economic Analysis of Stand Prognosis Model Outputs
Joseph E. Horn; E. Lee Medema; Ervin G. Schuster
1986-01-01
CHEAP0 II provides supplemental economic analysis capability for users of version 5.1 of the Stand Prognosis Model, including recent regeneration and insect outbreak extensions. Although patterned after the old CHEAP0 model, CHEAP0 II has more features and analytic capabilities, especially for analysis of existing and uneven-aged stands....
A Transactional Model of Bullying and Victimization
ERIC Educational Resources Information Center
Georgiou, Stelios N.; Fanti, Kostas A.
2010-01-01
The purpose of the current study was to develop and test a transactional model, based on longitudinal data, capable of describing the existing interrelation between maternal behavior and child bullying and victimization experiences over time. The results confirmed the existence of such a model for bullying, but not for victimization in terms of…
Microgrid Design Toolkit (MDT) User Guide Software v1.2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eddy, John P.
2017-08-01
The Microgrid Design Toolkit (MDT) supports decision analysis for new ("greenfield") microgrid designs as well as microgrids with existing infrastructure. The current version of MDT includes two main capabilities. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new, grid-connected microgrid in the early stages of the design process. MSC is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on designing a microgrid for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and the Performance Reliability Model (PRM).
BBN technical memorandum W1291 infrasound model feasibility study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrell, T., BBN Systems and Technologies
1998-05-01
The purpose of this study is to determine the need and level of effort required to add existing atmospheric databases and infrasound propagation models to the DOE's Hydroacoustic Coverage Assessment Model (HydroCAM) [1,2]. The rationale for the study is that the performance of the infrasound monitoring network will be an important factor for both the International Monitoring System (IMS) and US national monitoring capability. Many of the technical issues affecting the design and performance of the infrasound network are directly related to the variability of the atmosphere and the corresponding uncertainties in infrasound propagation. It is clear that the study of these issues will be enhanced by the availability of software tools for easy manipulation and interfacing of various atmospheric databases and infrasound propagation models. In addition, since there are many similarities between propagation in the oceans and in the atmosphere, it is anticipated that much of the software infrastructure developed for hydroacoustic database manipulation and propagation modeling in HydroCAM will be directly extendible to an infrasound capability. The study approach was to talk to the acknowledged domain experts in the infrasound monitoring area to determine: 1. The major technical issues affecting infrasound monitoring network performance. 2. The need for an atmospheric database/infrasound propagation modeling capability similar to HydroCAM. 3. The state of existing infrasound propagation codes and atmospheric databases. 4. A recommended approach for developing the required capabilities. A list of the people who contributed information to this study is provided in Table 1. We also relied on our knowledge of oceanographic and meteorological data sources to determine the availability of atmospheric databases and the feasibility of incorporating this information into the existing HydroCAM geographic database software.
This report presents a summary of the need for an integrated infrasound modeling capability in Section 2.0. Section 3.0 provides a recommended approach for developing this capability in two stages: a basic capability and an extended capability. This section includes a discussion of the available static and dynamic databases, and the various modeling tools which are available or could be developed under such a task. The conclusions and recommendations of the study are provided in Section 4.0.
A Conceptual Measurement Model for eHealth Readiness: a Team Based Perspective
Phillips, James; Poon, Simon K.; Yu, Dan; Lam, Mary; Hines, Monique; Brunner, Melissa; Power, Emma; Keep, Melanie; Shaw, Tim; Togher, Leanne
2017-01-01
Despite the shift towards collaborative healthcare and the increase in the use of eHealth technologies, there does not currently exist a model for the measurement of eHealth readiness in interdisciplinary healthcare teams. This research aims to address this gap in the literature through the development of a three-phase methodology incorporating qualitative and quantitative methods. We propose a conceptual measurement model consisting of operationalized themes affecting readiness across four factors: (i) Organizational Capabilities, (ii) Team Capabilities, (iii) Patient Capabilities, and (iv) Technology Capabilities. The creation of this model will allow for the measurement of the readiness of interdisciplinary healthcare teams to use eHealth technologies to improve patient outcomes. PMID:29854207
Computer evaluation of existing and proposed fire lookouts
Romain M. Mees
1976-01-01
A computer simulation model has been developed for evaluating the fire detection capabilities of existing and proposed lookout stations. The model uses coordinate location of fires and lookouts, tower elevation, and topographic data to judge location of stations, and to determine where a fire can be seen. The model was tested by comparing it with manual detection on a...
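The visibility test at the core of such a lookout-evaluation model can be sketched as a line-of-sight check over a terrain profile. The function, profile, and tower heights below are illustrative assumptions, not details taken from the report:

```python
def fire_visible(elev, tower_idx, tower_height, fire_idx):
    """Return True if a fire at fire_idx is visible from a lookout tower.

    elev: terrain elevations sampled along the line between tower and fire.
    The fire is hidden if any intermediate ridge rises above the sight line.
    """
    x0, y0 = tower_idx, elev[tower_idx] + tower_height   # observer eye point
    x1, y1 = fire_idx, elev[fire_idx]                    # fire at ground level
    lo, hi = (x0, x1) if x0 < x1 else (x1, x0)
    for x in range(lo + 1, hi):
        # elevation of the straight sight line at position x
        sight = y0 + (y1 - y0) * (x - x0) / (x1 - x0)
        if elev[x] > sight:
            return False                                 # terrain blocks the view
    return True

profile = [100, 120, 150, 130, 110]                      # a ridge between tower and fire
print(fire_visible(profile, 0, 10, 4))    # short tower: ridge blocks -> False
print(fire_visible(profile, 0, 100, 4))   # taller tower sees over -> True
```

A real implementation would trace rays across a 2-D elevation grid, but the blocking test per ray is the same interpolation shown here.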
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.
2011-03-01
This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging.
The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most of the existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. The accessibility to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast/robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that the users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements. Based on the gap analysis results, we have made the following recommendations for the code selection and code development for the NEAMS Waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and (3) build a modular code architecture and key code modules for performance assessments.
The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.
Health Capability: Conceptualization and Operationalization
2010-01-01
Current theoretical approaches to bioethics and public health ethics propose varied justifications as the basis for health care and public health, yet none captures a fundamental reality: people seek good health and the ability to pursue it. Existing models do not effectively address these twin goals. The approach I espouse captures both of these orientations through a concept here called health capability. Conceptually, health capability illuminates the conditions that affect health and one's ability to make health choices. By respecting the health consequences individuals face and their health agency, health capability offers promise for finding a balance between paternalism and autonomy. I offer a conceptual model of health capability and present a health capability profile to identify and address health capability gaps. PMID:19965570
Business Models for Cost Sharing & Capability Sustainment
2012-08-18
…digital technology into existing mechanical products and their supporting processes can only work correctly if the firm carrying it out changes its entire…
CREME: The 2011 Revision of the Cosmic Ray Effects on Micro-Electronics Code
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.; Barghouty, Abdulnasser F.; Reed, Robert A.; Sierawski, Brian D.; Watts, John W., Jr.
2012-01-01
We describe a tool suite, CREME, which combines existing capabilities of CREME96 and CREME86 with new radiation environment models and new Monte Carlo computational capabilities for single event effects and total ionizing dose.
2005-12-31
A MANPADS missile is modeled using LS-DYNA. It has 187600 nodes, 52802 shell elements with 13 shell materials, 112200 solid elements with 1804 solid… model capability that includes impact, detonation, penetration, and wing flutter response. This work extends an existing body-on-body missile model… the missile as well as the expansion of the surrounding fluids was modeled in the Eulerian domain. The Jones-Wilkins-Lee (JWL) equation of state was…
NASA Astrophysics Data System (ADS)
Helmuth, Douglas B.; Bell, Raymond M.; Grant, David A.; Lentz, Christopher A.
2012-09-01
Architecting the operational Next Generation of earth monitoring satellites based on matured climate modeling, reuse of existing sensor and satellite capabilities, attention to affordability, and evolutionary improvements integrated with constellation efficiencies becomes our collective goal for an open architectural design forum. Understanding the earth's climate and collecting requisite signatures over the next 30 years is a shared mandate by many of the world's governments. But there remains a daunting challenge to bridge scientific missions to 'operational' systems that truly support the demands of decision makers, scientific investigators and global users' requirements for trusted data. In this paper we will suggest an architectural structure that takes advantage of current earth modeling examples, including cross-model verification and a first-order set of critical climate parameters and metrics; these, in turn, are matched up with existing space-borne collection capabilities and sensors. The tools used and the frameworks offered are designed to allow collaborative overlays by other stakeholders nominating different critical parameters and their own threaded connections to existing international collection experience. These aggregate design suggestions will be held up to group review and prioritized as potential constellation solutions, including incremental and spiral developments, with attention to cost benefits and organizational opportunities. This Part IV effort is focused on being an inclusive 'Next Gen Constellation' design discussion and is the natural extension to earlier papers.
On the Conditioning of Machine-Learning-Assisted Turbulence Modeling
NASA Astrophysics Data System (ADS)
Wu, Jinlong; Sun, Rui; Wang, Qiqi; Xiao, Heng
2017-11-01
Recently, several researchers have demonstrated that machine learning techniques can be used to improve the RANS-modeled Reynolds stress by training on available databases of high-fidelity simulations. However, obtaining an improved mean velocity field remains an unsolved challenge, restricting the predictive capability of current machine-learning-assisted turbulence modeling approaches. In this work we define a condition number to evaluate the model conditioning of data-driven turbulence modeling approaches, and propose a stability-oriented machine learning framework to model Reynolds stress. Two canonical flows, the flow in a square duct and the flow over periodic hills, are investigated to demonstrate the predictive capability of the proposed framework. The satisfactory prediction of the mean velocity field for both flows demonstrates the predictive capability of the proposed framework for machine-learning-assisted turbulence modeling. By demonstrating improved prediction of the mean flow field, the proposed stability-oriented machine learning framework bridges the gap between existing machine-learning-assisted turbulence modeling approaches and the predictive capability demanded of turbulence models in real applications.
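The notion of model conditioning can be illustrated with the ordinary matrix condition number, which bounds how much a relative error in the input (here, a machine-learned Reynolds stress) can be amplified in the output (the mean velocity solved from it). This is a generic numerical sketch, not the specific condition number the authors define:

```python
import numpy as np

# Well-conditioned system: input errors are barely amplified.
A_well = np.array([[2.0, 0.0],
                   [0.0, 1.0]])

# Ill-conditioned system: nearly dependent rows, so tiny input
# perturbations produce large changes in the solution.
A_ill = np.array([[1.0, 1.0],
                  [1.0, 1.0001]])

print(np.linalg.cond(A_well))   # ~2
print(np.linalg.cond(A_ill))    # ~4e4
```

The paper's point, in these terms, is that a data-driven closure can be accurate in the "input" (Reynolds stress) yet still yield a poor mean field if the propagation is ill-conditioned.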
Institutional Transformation Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-10-19
Reducing the energy consumption of large institutions with dozens to hundreds of existing buildings, while maintaining and improving existing infrastructure, is a critical economic and environmental challenge. Sandia National Laboratories' (SNL) Institutional Transformation (IX) work integrates facilities and infrastructure sustainability technology capabilities and collaborative decision support modeling approaches to help facilities managers at SNL simulate different future energy reduction strategies and meet long-term energy conservation goals.
Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing
NASA Technical Reports Server (NTRS)
Ordaz, Irian
2011-01-01
Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.
Simulation of Healing Threshold in Strain-Induced Inflammation Through a Discrete Informatics Model.
Ibrahim, Israr Bin M; Sarma O V, Sanjay; Pidaparti, Ramana M
2018-05-01
Respiratory diseases such as asthma and acute respiratory distress syndrome, as well as acute lung injury, involve inflammation at the cellular level. The inflammation process is very complex and is characterized by the emergence of cytokines along with other changes in cellular processes. Due to the complexity of the various constituents that make up the inflammation dynamics, it is necessary to develop models that can complement experiments to fully understand inflammatory diseases. In this study, we developed a discrete informatics model based on a cellular automata (CA) approach to investigate the influence of the elastic field (stretch/strain) on the dynamics of inflammation and to account for probabilistic adaptation based on statistical interpretation of existing experimental data. Our simulation model investigated the effects of low, medium, and high strain conditions on inflammation dynamics. Results suggest that the model is able to indicate the threshold of innate healing of tissue as a response to strain experienced by the tissue. When strain is under the threshold, the tissue is still capable of adapting its structure to heal the damaged part. However, there exists a strain threshold where healing capability breaks down. The results obtained demonstrate that the developed discrete informatics-based CA model is capable of modeling and giving insight into inflammation dynamics parameters under various mechanical strain/stretch environments.
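The strain-threshold behavior described above can be sketched with a minimal one-dimensional cellular automaton. The update rules, probabilities, and threshold value below are invented for illustration and are not the authors' model:

```python
import random

HEAL_THRESHOLD = 0.6   # illustrative strain threshold, not the paper's value

def step(cells, strain, rng):
    """One update of a 1-D automaton: 0 = healthy tissue, 1 = inflamed.

    Below the threshold, inflamed cells heal with high probability; at or
    above it, healing stops and inflammation spreads to healthy neighbours.
    """
    new = cells[:]
    for i, c in enumerate(cells):
        if c == 1 and strain < HEAL_THRESHOLD:
            if rng.random() < 0.8:
                new[i] = 0                      # innate healing wins
        elif c == 0 and strain >= HEAL_THRESHOLD:
            left = cells[i - 1] if i > 0 else 0
            right = cells[i + 1] if i < len(cells) - 1 else 0
            if left or right:
                new[i] = 1                      # inflammation spreads
    return new

def simulate(strain, steps=30, n=15, seed=0):
    """Run the automaton and return the final count of inflamed cells."""
    rng = random.Random(seed)
    cells = [0] * n
    cells[n // 2] = 1                           # a single inflamed site
    for _ in range(steps):
        cells = step(cells, strain, rng)
    return sum(cells)

print(simulate(0.3))   # below threshold: tissue heals -> 0
print(simulate(0.9))   # above threshold: inflammation fills the tissue -> 15
```

Even this toy rule set reproduces the qualitative finding: a sharp transition between self-healing and runaway inflammation as strain crosses a threshold.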
DOT National Transportation Integrated Search
2014-08-01
The travel demand models developed and applied by the Transportation Planning and Programming Division : (TPP) of the Texas Department of Transportation (TxDOT) are daily three-step models (i.e., trip generation, trip : distribution, and traffic assi...
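The distribution step of such a three-step model (generation, distribution, assignment) is commonly a gravity model. A minimal sketch with two hypothetical zones follows; all numbers and the impedance function are illustrative, not TxDOT's calibrated values:

```python
# Trips produced by each origin zone and attracted to each destination zone.
productions = [100, 200]
attractions = [150, 150]
cost = [[1, 4], [4, 1]]            # hypothetical zone-to-zone travel times

def friction(c):
    return 1.0 / c ** 2            # a common inverse-power impedance form

# Gravity model: split each zone's productions across destinations in
# proportion to attraction weighted by travel impedance.
trips = []
for i, p in enumerate(productions):
    weights = [attractions[j] * friction(cost[i][j]) for j in range(len(attractions))]
    total = sum(weights)
    trips.append([p * w / total for w in weights])

# Each origin's row sums to its production; the nearer zone draws more trips.
print([round(t, 1) for t in trips[0]])   # [94.1, 5.9]
```

The assignment step would then load these origin-destination flows onto the network's shortest (or congested-equilibrium) paths.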
A conceptual framework and classification of capability areas for business process maturity
NASA Astrophysics Data System (ADS)
Van Looy, Amy; De Backer, Manu; Poels, Geert
2014-03-01
The article elaborates on business process maturity, which indicates how well an organisation can perform based on its business processes, i.e. on its way of working. This topic is of paramount importance for managers who try to excel in today's competitive world. Hence, business process maturity is an emerging research field. However, no consensus exists on the capability areas (or skills) needed to excel. Moreover, their theoretical foundation and synergies with other fields are frequently neglected. To overcome this gap, our study presents a conceptual framework with six main capability areas and 17 sub-areas. It draws on theories regarding the traditional business process lifecycle, which are supplemented by recognised organisation management theories. The comprehensiveness of this framework is validated by mapping 69 business process maturity models (BPMMs) to the identified capability areas, based on content analysis. Nonetheless, since no consensus exists among the collected BPMMs either, a classification of different maturity types is proposed, based on cluster analysis and discriminant analysis. Consequently, the findings contribute to the grounding of business process literature. Possible future avenues are evaluating existing BPMMs, directing new BPMMs or investigating which combinations of capability areas (i.e. maturity types) contribute more to performance than others.
A study about the existence of the leverage effect in stochastic volatility models
NASA Astrophysics Data System (ADS)
Florescu, Ionuţ; Pãsãricã, Cristian Gabriel
2009-02-01
The empirical relationship between the return of an asset and the volatility of the asset has been well documented in the financial literature. Named the leverage effect, or sometimes the risk-premium effect, it is observed in real data that, when the return of the asset decreases, the volatility increases, and vice versa. Consequently, it is important to demonstrate that any formulated model for the asset price is capable of generating this effect observed in practice. Furthermore, we need to understand the conditions on the parameters present in the model that guarantee the emergence of the leverage effect. In this paper we analyze two general specifications of stochastic volatility models and their capability of generating the perceived leverage effect. We derive conditions for the emergence of the leverage effect in both of these stochastic volatility models. We exemplify using stochastic volatility models used in practice and we explicitly state the conditions for the existence of the leverage effect in these examples.
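The leverage effect can be checked empirically in a discretized stochastic volatility model by giving the return and volatility drivers a negative correlation. The log-volatility specification and parameter values below are a generic illustration, not the paper's two model classes:

```python
import math
import random

# Euler scheme for a simple stochastic volatility model with log-volatility
#   dY = kappa * (theta - Y) dt + xi dW2,   sigma = exp(Y),
#   log-return over dt:  r = sigma * sqrt(dt) * W1,
# where corr(dW1, dW2) = rho < 0 builds in the leverage effect.
rng = random.Random(42)
dt, kappa, theta, xi, rho = 1 / 252, 2.0, math.log(0.2), 0.3, -0.7
y = theta
returns, dvols = [], []
for _ in range(100_000):
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    w2 = rho * z1 + math.sqrt(1 - rho ** 2) * z2      # correlated vol driver
    r = math.exp(y) * math.sqrt(dt) * z1              # asset log-return
    dy = kappa * (theta - y) * dt + xi * math.sqrt(dt) * w2
    y += dy
    returns.append(r)
    dvols.append(dy)

def corrcoef(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (z - mb) for x, z in zip(a, b)) / n
    va = sum((x - ma) ** 2 for x in a) / n
    vb = sum((z - mb) ** 2 for z in b) / n
    return cov / math.sqrt(va * vb)

print(corrcoef(returns, dvols))   # clearly negative: the leverage effect
```

With rho = -0.7 the sample correlation between returns and volatility changes comes out strongly negative, which is exactly the pattern the parameter conditions in the paper are meant to guarantee.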
The NASA Severe Thunderstorm Observations and Regional Modeling (NASA STORM) Project
NASA Technical Reports Server (NTRS)
Schultz, Christopher J.; Gatlin, Patrick N.; Lang, Timothy J.; Srikishen, Jayanthi; Case, Jonathan L.; Molthan, Andrew L.; Zavodsky, Bradley T.; Bailey, Jeffrey; Blakeslee, Richard J.; Jedlovec, Gary J.
2016-01-01
The NASA Severe Thunderstorm Observations and Regional Modeling (NASA STORM) project enhanced NASA's severe weather research capabilities, building upon existing Earth Science expertise at NASA Marshall Space Flight Center (MSFC). During this project, MSFC extended NASA's ground-based lightning detection capacity to include a readily deployable lightning mapping array (LMA). NASA STORM also enabled NASA's Short-term Prediction and Research Transition (SPoRT) to add convection-allowing ensemble modeling to its portfolio of regional numerical weather prediction (NWP) capabilities. As a part of NASA STORM, MSFC developed new open-source capabilities for analyzing and displaying weather radar observations integrated from both research and operational networks. These accomplishments enabled by NASA STORM are a step towards enhancing NASA's capabilities for studying severe weather and position the agency for any future NASA-related severe storm field campaigns.
Retargeting of existing FORTRAN program and development of parallel compilers
NASA Technical Reports Server (NTRS)
Agrawal, Dharma P.
1988-01-01
The software models used in implementing the parallelizing compiler for the B-HIVE multiprocessor system are described. The various models and strategies used in the compiler development are: the flexible granularity model, which allows a compromise between two extreme granularity models; the communication model, which is capable of precisely describing interprocessor communication timings and patterns; the loop type detection strategy, which identifies different types of loops; the critical path with coloring scheme, which is a versatile scheduling strategy for any multicomputer with some associated communication costs; and the loop allocation strategy, which realizes optimum overlapped operations between computation and communication of the system. Using these models, several sample routines of the AIR3D package are examined and tested. It may be noted that the automatically generated codes are highly parallelized to provide the maximum degree of parallelism, obtaining speedups of up to 28 on a 32-processor system. A comparison of parallel codes for both the existing and proposed communication models is performed and the corresponding expected speedup factors are obtained. The experimentation shows that the B-HIVE compiler produces more efficient codes than existing techniques. Work is progressing well in completing the final phase of the compiler. Numerous enhancements are needed to improve the capabilities of the parallelizing compiler.
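A speedup of roughly 28 on a 32-processor system is consistent with Amdahl's law for a code whose serial fraction is under one percent. The parallel fraction below is inferred for illustration, not stated in the report:

```python
def amdahl_speedup(parallel_fraction, n_processors):
    """Ideal speedup of a program whose parallel fraction runs on n processors."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_processors)

print(round(amdahl_speedup(0.9954, 32), 1))   # ~28, matching the reported figure
print(amdahl_speedup(1.0, 32))                # perfectly parallel code: 32.0
```

Read the other way, the reported 28x implies the compiler leaves only about 0.5% of the work serialized, including any residual communication overhead.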
10 Steps to Building an Architecture for Space Surveillance Projects
NASA Astrophysics Data System (ADS)
Gyorko, E.; Barnhart, E.; Gans, H.
Space surveillance is an increasingly complex task, requiring the coordination of a multitude of organizations and systems, while dealing with competing capabilities, proprietary processes, differing standards, and compliance issues. In order to fully understand space surveillance operations, analysts and engineers need to analyze and break down their operations and systems using what are essentially enterprise architecture processes and techniques. These techniques can be daunting to the first-time architect. This paper provides a summary of simplified steps to analyze a space surveillance system at the enterprise level in order to determine capabilities, services, and systems. These steps form the core of an initial Model-Based Architecting process. For new systems, a well-defined, or well-architected, space surveillance enterprise leads to an easier transition from model-based architecture to model-based design and provides a greater likelihood that requirements are fulfilled the first time. Both new and existing systems benefit from being easier to manage, and can be sustained more easily using portfolio management techniques, based around capabilities documented in the model repository. The resulting enterprise model helps an architect avoid 1) costly, faulty portfolio decisions; 2) wasteful technology refresh efforts; 3) upgrade and transition nightmares; and 4) non-compliance with DoDAF directives. The Model-Based Architecting steps are based on a process that Harris Corporation has developed from practical experience architecting space surveillance systems and ground systems. Examples are drawn from current work on documenting space situational awareness enterprises. The process is centered on DoDAF 2 and its corresponding meta-model so that terminology is standardized and communicable across any disciplines that know DoDAF architecting, including acquisition, engineering and sustainment disciplines.
Each step provides a guideline for the type of data to collect, and also the appropriate views to generate. The steps include 1) determining the context of the enterprise, including active elements and high-level capabilities or goals; 2) determining the desired effects of the capabilities and mapping capabilities against the project plan; 3) determining operational performers and their inter-relationships; 4) building information and data dictionaries; 5) defining resources associated with capabilities; 6) determining the operational behavior necessary to achieve each capability; 7) analyzing existing or planned implementations to determine systems, services and software; 8) cross-referencing system behavior to operational behavior; 9) documenting system threads and functional implementations; and 10) creating any required textual documentation from the model.
A review of methods for predicting air pollution dispersion
NASA Technical Reports Server (NTRS)
Mathis, J. J., Jr.; Grose, W. L.
1973-01-01
Air pollution models and problem areas in air pollution dispersion modeling were surveyed. Emission source inventory, meteorological data, and turbulent diffusion are discussed in terms of developing a dispersion model. Existing mathematical models of urban air pollution, and highway and airport models, are discussed along with their limitations. Recommendations for improving modeling capabilities are included.
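As background to the dispersion models surveyed, the steady-state Gaussian plume formula (the workhorse of many urban and point-source dispersion models of this era) can be sketched as follows. The function and its parameter names are illustrative, not drawn from the report; the dispersion coefficients are assumed to be supplied for the downwind distance of interest.

```python
import math

def gaussian_plume(q, u, y, z, stack_height, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration at crosswind offset y (m)
    and height z (m), for emission rate q (g/s), mean wind speed u (m/s),
    and lateral/vertical dispersion coefficients sigma_y, sigma_z (m)
    evaluated at the downwind distance of interest.
    Includes the standard ground-reflection (image source) term."""
    lateral = math.exp(-y ** 2 / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-(z - stack_height) ** 2 / (2.0 * sigma_z ** 2))
                + math.exp(-(z + stack_height) ** 2 / (2.0 * sigma_z ** 2)))
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

The concentration falls off with crosswind distance and is symmetric about the plume centerline; richer models in the survey relax exactly these idealizations (steady wind, flat terrain, Gaussian turbulence).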
A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions
NASA Technical Reports Server (NTRS)
Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver
2016-01-01
Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. 
To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.
Simulating human behavior for national security human interactions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernard, Michael Lewis; Hart, Dereck H.; Verzi, Stephen J.
2007-01-01
This 3-year research and development effort focused on what we believe is a significant technical gap in existing modeling and simulation capabilities: the representation of plausible human cognition and behaviors within a dynamic, simulated environment. Specifically, the intent of the "Simulating Human Behavior for National Security Human Interactions" project was to demonstrate an initial simulated human modeling capability that realistically represents intra- and inter-group interaction behaviors between simulated humans and human-controlled avatars as they respond to their environment. Significant progress was made towards simulating human behaviors through the development of a framework that produces realistic characteristics and movement. The simulated humans were created from models designed to be psychologically plausible by being based on robust psychological research and theory. Progress was also made towards enhancing Sandia National Laboratories' existing cognitive models to support culturally plausible behaviors that are important in representing group interactions. These models were implemented in the modular, interoperable, and commercially supported Umbra® simulation framework.
ERIC Educational Resources Information Center
Barker, D. M.; Aggerholm, K.; Standal, O.; Larsson, H.
2018-01-01
Background: Physical educators currently have a number of pedagogical (or curricular) models at their disposal. While existing models have been well-received in educational contexts, these models seek to extend students' capacities within a limited number of "human activities" (Arendt, 1958). The activity of "human practising,"…
Industrial Sector Energy Efficiency Modeling (ISEEM) Framework Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karali, Nihan; Xu, Tengfang; Sathaye, Jayant
2012-12-12
The goal of this study is to develop a new bottom-up industry-sector energy modeling framework with an agenda of addressing least-cost regional and global carbon reduction strategies, improving on the capabilities and limitations of existing models, and allowing trading across regions and countries as an alternative.
van der Klink, Jac J L; Bültmann, Ute; Burdorf, Alex; Schaufeli, Wilmar B; Zijlstra, Fred R H; Abma, Femke I; Brouwer, Sandra; van der Wilt, Gert Jan
2016-01-01
The aim of this paper is to propose a new model of sustainable employability based on the capability approach, encompassing the complexity of contemporary work, and placing particular emphasis on work-related values. Having evaluated existing conceptual models of work, health, and employability, we concluded that prevailing models lack an emphasis on important work-related values. Amartya Sen's capability approach (CA) provides a framework that incorporates a focus on values and reflects the complexity of sustainable employability. We developed a model of sustainable employability based on the CA. This model can be used as a starting point for developing an assessment tool to investigate sustainable employability. A fundamental premise of the CA is that work should create value for the organization as well as for the worker. This approach challenges researchers, policy-makers, and practitioners to investigate what people find important and valuable--what they would like to achieve in a given (work) context--and moreover to ascertain whether people are able and enabled to do so. According to this approach, it is not only the individual who is responsible for achieving this; the work context is also important. Rather than merely describing relationships between variables, as existing descriptive models often do, the CA depicts a valuable goal: a set of capabilities that constitute valuable work. Moreover, the CA fits well with recent conceptions of health and modern insights into work, in which the individual works towards his or her own goals that s/he has to achieve within the broader goals of the organization.
Cryogenic Wind Tunnel Models. Design and Fabrication
NASA Technical Reports Server (NTRS)
Young, C. P., Jr. (Compiler); Gloss, B. B. (Compiler)
1983-01-01
The principal motivating factor was the National Transonic Facility (NTF). Since the NTF can achieve significantly higher Reynolds numbers at transonic speeds than other wind tunnels in the world, and will therefore occupy a unique position among ground test facilities, every effort is being made to ensure that model design and fabrication technology exists to allow researchers to take advantage of this high Reynolds number capability. Since a great deal of experience in designing and fabricating cryogenic wind tunnel models does not exist, and since the experience that does exist is scattered over a number of organizations, there is a need to bring existing experience in these areas together and share it among all interested parties. Representatives from government, the airframe industry, and universities are included.
Distributed Hydrologic Modeling Apps for Decision Support in the Cloud
NASA Astrophysics Data System (ADS)
Swain, N. R.; Latu, K.; Christiensen, S.; Jones, N.; Nelson, J.
2013-12-01
Advances in computation resources and greater availability of water resources data represent an untapped resource for addressing hydrologic uncertainties in water resources decision-making. The current practice of water authorities relies on empirical, lumped hydrologic models to estimate watershed response. These models are not capable of taking advantage of many of the spatial datasets that are now available. Physically-based, distributed hydrologic models are capable of using these data resources and providing better predictions through stochastic analysis. However, there exists a digital divide that discourages many science-minded decision makers from using distributed models. This divide can be spanned using a combination of existing web technologies. The purpose of this presentation is to present a cloud-based environment that will offer hydrologic modeling tools or 'apps' for decision support and the web technologies that have been selected to aid in its implementation. Compared to the more commonly used lumped-parameter models, distributed models, while being more intuitive, are still data intensive, computationally expensive, and difficult to modify for scenario exploration. However, web technologies such as web GIS, web services, and cloud computing have made the data more accessible, provided an inexpensive means of high-performance computing, and created an environment for developing user-friendly apps for distributed modeling. Since many water authorities are primarily interested in the scenario exploration exercises with hydrologic models, we are creating a toolkit that facilitates the development of a series of apps for manipulating existing distributed models. There are a number of hurdles that cloud-based hydrologic modeling developers face. One of these is how to work with the geospatial data inherent with this class of models in a web environment. 
Supporting geospatial data in a website is beyond the capabilities of standard web frameworks and it requires the use of additional software. In particular, there are at least three elements that are needed: a geospatially enabled database, a map server, and geoprocessing toolbox. We recommend a software stack for geospatial web application development comprising: MapServer, PostGIS, and 52 North with Python as the scripting language to tie them together. Another hurdle that must be cleared is managing the cloud-computing load. We are using HTCondor as a solution to this end. Finally, we are creating a scripting environment wherein developers will be able to create apps that use existing hydrologic models in our system with minimal effort. This capability will be accomplished by creating a plugin for a Python content management system called CKAN. We are currently developing cyberinfrastructure that utilizes this stack and greatly lowers the investment required to deploy cloud-based modeling apps. This material is based upon work supported by the National Science Foundation under Grant No. 1135482
IRTPRO 2.1 for Windows (Item Response Theory for Patient-Reported Outcomes)
ERIC Educational Resources Information Center
Paek, Insu; Han, Kyung T.
2013-01-01
This article reviews a new item response theory (IRT) model estimation program, IRTPRO 2.1, for Windows that is capable of unidimensional and multidimensional IRT model estimation for existing and user-specified constrained IRT models for dichotomously and polytomously scored item response data. (Contains 1 figure and 2 notes.)
Nested ocean models: Work in progress
NASA Technical Reports Server (NTRS)
Perkins, A. Louise
1991-01-01
The ongoing work of combining three existing software programs into a nested grid oceanography model is detailed. The HYPER domain decomposition program, the SPEM ocean modeling program, and a quasi-geostrophic model written in England are being combined into a general ocean modeling facility. This facility will be used to test the viability and the capability of two-way nested grids in the North Atlantic.
Performance Modeling of an Airborne Raman Water Vapor Lidar
NASA Technical Reports Server (NTRS)
Whiteman, D. N.; Schwemmer, G.; Berkoff, T.; Plotkin, H.; Ramos-Izquierdo, L.; Pappalardo, G.
2000-01-01
A sophisticated Raman lidar numerical model has been developed. The model has been used to simulate the performance of two ground-based Raman water vapor lidar systems. After tuning the model using these ground-based measurements, the model is used to simulate the water vapor measurement capability of an airborne Raman lidar under both day- and night-time conditions for a wide range of water vapor conditions. The results indicate that, under many circumstances, the daytime measurements possess resolution comparable to an existing airborne differential absorption water vapor lidar, while the nighttime measurements have higher resolution. In addition, a Raman lidar is capable of measurements not possible using a differential absorption system.
Diversity's Impact on the Executive Coaching Process
ERIC Educational Resources Information Center
Maltbia, Terrence E.; Power, Anne
2005-01-01
This paper presents a conceptual model intended to expand existing executive coaching processes used in organizations by building the strategic learning capabilities needed to integrate a diversity perspective into this emerging field of HRD practice. This model represents the early development of results from a Diversity Practitioner Study…
Environmental engineering calculations involving uncertainties, either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mon...
On the Need for Multidimensional Stirling Simulations
NASA Technical Reports Server (NTRS)
Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako
2005-01-01
Given the cost and complication of simulating Stirling convertors, do we really need multidimensional modeling when one-dimensional capabilities exist? This paper provides a comprehensive description of when and why multidimensional simulation is needed.
Survey of Existing Uncertainty Quantification Capabilities for Army Relevant Problems
2017-11-27
ARL-TR-8218, November 2017. US Army Research Laboratory. Survey of Existing Uncertainty Quantification Capabilities for Army-Relevant Problems, by James J Ramsey. Technical Report.
Mission Analysis, Operations, and Navigation Toolkit Environment (Monte) Version 040
NASA Technical Reports Server (NTRS)
Sunseri, Richard F.; Wu, Hsi-Cheng; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.
2012-01-01
Monte is a software set designed for use in mission design and spacecraft navigation operations. The system can process measurement data, design optimal trajectories and maneuvers, and do orbit determination, all in one application. For the first time, a single software set can be used for mission design and navigation operations. This eliminates problems due to different models and fidelities used in legacy mission design and navigation software. The unique features of Monte 040 include a blowdown thruster model for GRAIL (Gravity Recovery and Interior Laboratory) with associated pressure models, as well as an updated optimal-search capability (COSMIC) that facilitated mission design for ARTEMIS. Existing legacy software lacked the capabilities necessary for these two missions. There is also a mean orbital element propagator and an osculating to mean element converter that allows long-term orbital stability analysis for the first time in compiled code. The optimized trajectory search tool COSMIC allows users to place constraints and controls on their searches without any restrictions. Constraints may be user-defined and depend on trajectory information either forward or backwards in time. In addition, a long-term orbit stability analysis tool (morbiter) existed previously as a set of scripts on top of Monte. Monte is becoming the primary tool for navigation operations, a core competency at JPL. The mission design capabilities in Monte are becoming mature enough for use in project proposals as well as post-phase A mission design. Monte has three distinct advantages over existing software. First, it is being developed in a modern paradigm: object-oriented C++ and Python. Second, the software has been developed as a toolkit, which allows users to customize their own applications and allows the development team to implement requirements quickly, efficiently, and with minimal bugs. 
Finally, the software is managed in accordance with the CMMI (Capability Maturity Model Integration), where it has been appraised at maturity level 3.
USDA-ARS?s Scientific Manuscript database
Assessing the performance of Low Impact Development (LID) practices at a catchment scale is important in managing urban watersheds. Few modeling tools exist that are capable of explicitly representing the hydrological mechanisms of LIDs while considering the diverse land uses of urban watersheds. ...
LEWICE 2.2 Capabilities and Thermal Validation
NASA Technical Reports Server (NTRS)
Wright, William B.
2002-01-01
Computational models of bleed air anti-icing and electrothermal de-icing have been added to the LEWICE 2.0 software by integrating the capabilities of two previous programs, ANTICE and LEWICE/Thermal. This combined model has been released as LEWICE version 2.2. Several advancements have also been added to the previous capabilities of each module. This report will present the capabilities of the software package and provide results for both bleed air and electrothermal cases. A comprehensive validation effort has also been performed to compare the predictions to an existing electrothermal database. A quantitative comparison shows that for de-icing cases the average difference is 9.4 F (26%), compared to 3 F for the experimental data, while for evaporative cases the average difference is 2 F (32%), compared to an experimental error of 4 F.
Cogeneration computer model assessment: Advanced cogeneration research study
NASA Technical Reports Server (NTRS)
Rosenberg, L.
1983-01-01
Cogeneration computer simulation models were assessed in order to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.
NASA Technical Reports Server (NTRS)
Welp, D. W.; Brown, R. A.; Ullman, D. G.; Kuhner, M. B.
1974-01-01
A computer simulation program which models a commercial short-haul aircraft operating in the civil air system was developed. The purpose of the program is to evaluate the effect of a given aircraft avionics capability on the ability of the aircraft to perform on-time carrier operations. The program outputs consist primarily of those quantities which can be used to determine direct operating costs. These include: (1) schedule reliability or delays, (2) repairs/replacements, (3) fuel consumption, and (4) cancellations. More comprehensive models of the terminal area environment were added and a simulation of an existing airline operation was conducted to obtain a form of model verification. The capability of the program to provide comparative results (sensitivity analysis) was then demonstrated by modifying the aircraft avionics capability for additional computer simulations.
Capability maturity models for offshore organisational management.
Strutt, J E; Sharp, J V; Terry, E; Miles, R
2006-12-01
The goal-setting regime imposed by the UK safety regulator has important implications for an organisation's ability to manage health- and safety-related risks. Existing approaches to safety assurance based on risk analysis and formal safety assessments are increasingly considered unlikely to create the step-change improvement in safety to which the offshore industry aspires, and alternative approaches are being considered. One approach, which addresses the important issue of organisational behaviour and which can be applied at a very early stage of design, is the capability maturity model (CMM). The paper describes the development of a design safety capability maturity model, outlining the key processes considered necessary for safety achievement, the definition of maturity levels, and scoring methods. The paper discusses how CMM relates to regulatory mechanisms and risk-based decision-making, together with the potential of CMM for environmental risk management.
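The scoring idea behind a staged maturity model can be sketched in a few lines. The level names and the "weakest process caps the organisation" rule below follow the generic staged-CMM convention and are assumptions for illustration, not the paper's actual design-safety scoring method.

```python
# Generic staged capability-maturity levels (CMM convention; illustrative).
LEVEL_NAMES = {
    1: "Initial",
    2: "Managed",
    3: "Defined",
    4: "Quantitatively Managed",
    5: "Optimising",
}

def organisation_maturity(process_levels):
    """Overall maturity under the staged convention: the organisation is
    only as mature as its least mature key process area."""
    if not all(1 <= v <= 5 for v in process_levels.values()):
        raise ValueError("process levels must be in the range 1..5")
    return min(process_levels.values())
```

For example, an organisation scoring 4 on hazard identification but only 2 on design review would sit at level 2 overall, which is what makes the model useful for prioritising improvement effort.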
Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Danon, Yaron; Nazarewicz, Witold; Talou, Patrick
2013-02-18
This project addresses three important gaps in existing evaluated nuclear data libraries that represent a significant hindrance against highly advanced modeling and simulation capabilities for the Advanced Fuel Cycle Initiative (AFCI). This project will: develop advanced theoretical tools to compute prompt fission neutron and gamma-ray characteristics well beyond average spectra and multiplicity, and produce new evaluated files of U and Pu isotopes, along with some minor actinides; perform state-of-the-art fission cross-section modeling and calculations using global and microscopic model input parameters, leading to truly predictive fission cross-section capabilities, with consistent calculations performed for a suite of Pu isotopes; and implement innovative data assimilation tools, which will reflect the nuclear data evaluation process much more accurately and lead to a new generation of uncertainty quantification files. New covariance matrices will be obtained for Pu isotopes and compared to existing ones. The deployment of a fleet of safe and efficient advanced reactors that minimize radiotoxic waste and are proliferation-resistant is a clear and ambitious goal of AFCI. While in the past the design, construction and operation of a reactor were supported through empirical trials, this new phase in nuclear energy production is expected to rely heavily on advanced modeling and simulation capabilities. To be truly successful, a program for advanced simulations of innovative reactors will have to develop advanced multi-physics capabilities, to be run on massively parallel supercomputers, and to incorporate adequate and precise underlying physics; all of these areas have to be developed simultaneously to achieve those ambitious goals. Of particular interest are reliable fission cross-section uncertainty estimates (including important correlations) and evaluations of prompt fission neutron and gamma-ray spectra and uncertainties.
Space transportation activities in the United States
NASA Technical Reports Server (NTRS)
Gabris, Edward A.
1994-01-01
The status of the existing space transportation systems in the U.S. and options for increased capability are being examined in the context of mission requirements, options for new vehicles, cost to operate the existing vehicles, cost to develop new vehicles, and the capabilities and plans of other suppliers. This assessment addresses the need to build and resupply the space station, to maintain necessary military assets in a rapidly changing world, and to continue a competitive commercial space transportation industry. The Department of Defense (DOD) and NASA each conducted an 'access to space' study using a common mission model but with the emphasis on their unique requirements. Both studies considered three options: maintain and improve the existing capability, build a new launch vehicle using contemporary technology, and build a new launch vehicle using advanced technology. While no decisions have been made on a course of action, it will be influenced by the availability of funds in the U.S. budget, the changing need for military space assets, the increasing competition among space launch suppliers, and the emerging opportunity for an advanced technology, low cost system and international partnerships to develop it.
2015-06-01
and tools, called model-integrated computing (MIC) [3], relies on the use of domain-specific modeling languages for creating models of the system to be..., hence giving reflective capabilities to it. We have followed the MIC method here: we designed a domain-specific modeling language for modeling... are produced one-off and not for the mass market, the scope for price reduction based on market demands is non-existent. Processes to create
Refining the aggregate exposure pathway
Advancements in measurement technologies and modeling capabilities continue to result in an abundance of exposure information, adding to that currently in existence. However, fragmentation within the exposure science community acts as an obstacle for realizing the vision set fort...
Anomaly Detection in Power Quality at Data Centers
NASA Technical Reports Server (NTRS)
Grichine, Art; Solano, Wanda M.
2015-01-01
The goal during my internship at the National Center for Critical Information Processing and Storage (NCCIPS) is to implement an anomaly detection method through the StruxureWare SCADA Power Monitoring system. The benefit of the anomaly detection mechanism is to provide the capability to detect and anticipate equipment degradation by monitoring power quality prior to equipment failure. First, a study is conducted that examines the existing techniques of power quality management. Based on these findings, and the capabilities of the existing SCADA resources, recommendations are presented for implementing effective anomaly detection. Since voltage, current, and total harmonic distortion demonstrate Gaussian distributions, effective set-points are computed using this model, while maintaining a low false positive count.
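The Gaussian set-point idea described above can be sketched in a few lines of Python. The function names and the three-sigma default threshold are illustrative assumptions, not the NCCIPS/StruxureWare implementation; the premise, per the abstract, is that voltage, current, and total harmonic distortion are roughly Gaussian.

```python
import statistics

def gaussian_setpoints(samples, k=3.0):
    """Lower/upper alarm set-points as mean +/- k standard deviations,
    assuming the monitored quantity (voltage, current, THD) is roughly
    Gaussian. Larger k trades sensitivity for fewer false positives."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return mu - k * sigma, mu + k * sigma

def anomalies(samples, lo, hi):
    """Indices of readings that fall outside the set-points."""
    return [i for i, x in enumerate(samples) if not (lo <= x <= hi)]
```

In use, the set-points would be fitted on a window of known-good baseline readings and then applied to live data, so that a drifting reading trips an alarm before outright equipment failure.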
Modeling tools for the assessment of microbiological risks during floods: a review
NASA Astrophysics Data System (ADS)
Collender, Philip; Yang, Wen; Stieglitz, Marc; Remais, Justin
2015-04-01
Floods are a major, recurring source of harm to global economies and public health. Projected increases in the frequency and intensity of heavy precipitation events under future climate change, coupled with continued urbanization in areas with high risk of floods, may exacerbate future impacts of flooding. Improved flood risk management is essential to support global development, poverty reduction and public health, and is likely to be a crucial aspect of climate change adaptation. Importantly, floods can facilitate the transmission of waterborne pathogens by changing social conditions (overcrowding among displaced populations, interruption of public health services), imposing physical challenges to infrastructure (sewerage overflow, reduced capacity to treat drinking water), and altering fate and transport of pathogens (transport into waterways from overland flow, resuspension of settled contaminants) during and after flood conditions. Hydrological and hydrodynamic models are capable of generating quantitative characterizations of microbiological risks associated with flooding, while accounting for these diverse and at times competing physical and biological processes. Despite a few applications of such models to the quantification of microbiological risks associated with floods, there exists limited guidance as to the relative capabilities, and limitations, of existing modeling platforms when used for this purpose. Here, we review 17 commonly used flood and water quality modeling tools that have demonstrated or implicit capabilities of mechanistically representing and quantifying microbial risk during flood conditions. We compare models with respect to their capabilities of generating outputs that describe physical and microbial conditions during floods, such as concentration or load of non-cohesive sediments or pathogens, and the dynamics of high flow conditions. 
Recommendations are presented for the application of specific modeling tools for assessing particular flood-related microbial risks, and model improvements are suggested that may better characterize key microbial risks during flood events. The state of current tools is assessed in the context of a changing climate, where the frequency, intensity and duration of flooding are shifting in some areas.
A multidimensional stability model for predicting shallow landslide size and shape across landscapes
David G. Milledge; Dino Bellugi; Jim A. McKean; Alexander L. Densmore; William E. Dietrich
2014-01-01
The size of a shallow landslide is a fundamental control on both its hazard and geomorphic importance. Existing models are either unable to predict landslide size or are computationally intensive such that they cannot practically be applied across landscapes. We derive a model appropriate for natural slopes that is capable of predicting shallow landslide size but...
NASA Technical Reports Server (NTRS)
daSilva, Arlindo
2004-01-01
The first set of interoperability experiments illustrates the role ESMF can play in integrating the national Earth science resources. Using existing data assimilation technology from NCEP and the National Weather Service, the Community Atmosphere Model (CAM) was able to ingest conventional and remotely sensed observations, a capability that could open the door to using CAM for weather as well as climate prediction. CAM, which includes land surface capabilities, was developed by NCAR, with key components from GSFC. In this talk we will describe the steps necessary for achieving the coupling of these two systems.
NASA Technical Reports Server (NTRS)
Chow, Chuen-Yen; Ryan, James S.
1987-01-01
While the zonal grid system of Transonic Navier-Stokes (TNS) provides excellent modeling of complex geometries, improved shock capturing, and a higher Mach number range will be required if flows about hypersonic aircraft are to be modeled accurately. A computational fluid dynamics (CFD) code, the Compressible Navier-Stokes (CNS), is under development to combine the required high Mach number capability with the existing TNS geometry capability. One of several candidate flow solvers for inclusion in the CNS is that of F3D. This upwinding flow solver promises improved shock capturing, and more accurate hypersonic solutions overall, compared to the solver currently used in TNS.
NASA Astrophysics Data System (ADS)
Navaz, H. K.; Dang, A. L.; Atkinson, T.; Zand, A.; Nowakowski, A.; Kamensky, K.
2014-05-01
A general-purpose multi-phase and multi-component computer model capable of solving the complex problems encountered in agent-substrate interaction has been developed. The model solves the transient, time-accurate mass and momentum governing equations in three-dimensional space. Provisions for all inter-phase activities (solidification, evaporation, condensation, etc.) are included in the model. Chemical reactions are allowed among all phases, and reaction products may appear in all three phases. The impact of chemical reaction products on the transport properties of porous media, such as porosity, capillary pressure, and permeability, is considered. Numerous validations against laboratory and open-air data for simulants, agents, and pesticides are presented. Results are presented for chemical reactions in the presence of pre-existing water in porous materials, such as moisture, and for separated agent and water droplets on porous substrates. The model will greatly enhance capabilities for predicting the level of threat after the release of any chemical, such as Toxic Industrial Chemicals (TICs) and Toxic Industrial Materials (TIMs), on environmental substrates. The model's generality makes it suitable for both defense and pharmaceutical applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Y. Q.; Shemon, E. R.; Mahadevan, Vijay S.
SHARP, developed under the NEAMS Reactor Product Line, is an advanced modeling and simulation toolkit for the analysis of advanced nuclear reactors. SHARP currently comprises three physics modules: neutronics, thermal hydraulics, and structural mechanics. SHARP empowers designers to produce accurate results for modeling physical phenomena that have been identified as important for nuclear reactor analysis. SHARP can use existing physics codes and take advantage of existing infrastructure capabilities in the MOAB framework and the coupling driver/solver library, the Coupled Physics Environment (CouPE), which utilizes the widely used, scalable PETSc library. This report demonstrates the coupled-physics simulation capability of SHARP by introducing the demonstration example called sahex, in advance of the SHARP release expected by March 2016. sahex consists of 6 fuel pins with cladding, 1 control rod, sodium coolant, and an outer duct wall that encloses all the other components. This example is carefully chosen to demonstrate the proof of concept for solving more complex demonstration examples such as the EBR-II assembly and the ABTR full core. The workflow of preparing the input files, running the case, and analyzing the results is demonstrated in this report. Moreover, an extension of the sahex model called sahex_core, which adds six homogenized neighboring assemblies to the fully heterogeneous sahex model, is presented to test homogenization capabilities in both Nek5000 and PROTEUS. Some preliminary information on the configuration and build aspects of the SHARP toolkit, which includes the capability to auto-download dependencies and configure/install with optimal flags in an architecture-aware fashion, is also covered by this report. Step-by-step instructions are provided to help users create their own cases. Details on these processes will be provided in the SHARP user manual that will accompany the first release.
Model driver screening and evaluation program. Volume 3, Guidelines for motor vehicle administrators
DOT National Transportation Integrated Search
2003-05-01
These Guidelines present an update of report number DOT HS 807 853 published in August 1992. They reflect current understanding of the relationship between functional capabilities and driving impairment gained through review of existing medical revie...
Ascending Stairway Modeling: A First Step Toward Autonomous Multi-Floor Exploration
2012-10-01
Many robotics platforms are capable of ascending stairways, but all existing approaches for autonomous stair climbing use stairway detection as a...the rich potential of an autonomous ground robot that can climb stairs while exploring a multi-floor building. Our proposed solution to this problem is...over several steps. However, many ground robots are not capable of traversing tight spiral stairs, and so we do not focus on these types. The stairway is
RDHWT/MARIAH II Hypersonic Wind Tunnel Research Program
2008-09-01
Diagnostics Dr. Gary Brown – Gas Dynamics Dr. Ihab Girgis – Modeling Dr. Dennis Mansfield – Experimental Ring Technical Services Dr. Leon Ring – Systems...wind tunnel (MSHWT) with Mach 8 to 15, true-temperature flight test capabilities. This research program was initiated in fiscal year (FY) 1998 and is...Force test capabilities that exist today. Performance goals of the MSHWT are true temperature, Mach 8 to 15, dynamic pressure of 500 to 2000 psf (24 to
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin; Zhang, Yongfeng; Chakraborty, Pritam
2014-09-01
This report summarizes work during FY 2014 to develop capabilities to predict embrittlement of reactor pressure vessel steel and to assess the response of embrittled reactor pressure vessels to postulated accident conditions. This work has been conducted at three length scales. At the engineering scale, 3D fracture mechanics capabilities have been developed to calculate stress intensities and fracture toughnesses, to perform a deterministic assessment of whether a crack would propagate at the location of an existing flaw. This capability has been demonstrated on several types of flaws in a generic reactor pressure vessel model. Models have been developed at the scale of fracture specimens to determine how irradiation affects the fracture toughness of the material. Verification work has been performed on a previously developed model to determine the sensitivity of the model to specimen geometry and size effects, and the effects of irradiation on the parameters of this model have been investigated. At lower length scales, work has continued in an ongoing effort to understand how irradiation and thermal aging affect the microstructure and mechanical properties of reactor pressure vessel steel. Previously developed atomistic kinetic Monte Carlo models have been further developed and benchmarked against experimental data. Initial work has been performed to develop models of nucleation in a phase field model. Additional modeling work has also been performed to improve the fundamental understanding of the formation mechanisms and stability of matrix defects caused by irradiation.
Using Micro-Synchrophasor Data for Advanced Distribution Grid Planning and Operations Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Emma; Kiliccote, Sila; McParland, Charles
2014-07-01
This report reviews the potential for distribution-grid phase-angle data that will be available from new micro-synchrophasors (µPMUs) to be utilized in existing distribution-grid planning and operations analysis. This data could augment the current diagnostic capabilities of grid analysis software, used in both planning and operations for applications such as fault location, and provide data for more accurate modeling of the distribution system. µPMUs are new distribution-grid sensors that will advance measurement and diagnostic capabilities and provide improved visibility of the distribution grid, enabling analysis of the grid’s increasingly complex loads that include features such as large volumes of distributed generation (DG). Large volumes of DG lead to concerns about continued reliable operation of the grid, due to changing power-flow characteristics and active generation with its own protection and control capabilities. Using µPMU data on the change in voltage phase angle between two points, in conjunction with new and existing distribution-grid planning and operational tools, is expected to enable model validation, state estimation, fault location, and renewable resource/load characterization. Our findings include: data measurement is outstripping the processing capabilities of planning and operational tools; not every tool can visualize a voltage phase-angle measurement to the degree of accuracy measured by advanced sensors, and the degree of accuracy in measurement required for the distribution grid is not defined; solving methods cannot handle the high volumes of data generated by modern sensors, so new models and solving methods (such as graph trace analysis) are needed; and standardization of sensor-data communications platforms in planning and applications tools would allow integration of different vendors' sensors and advanced measurement devices.
In addition, data from advanced sources such as µPMUs could be used to validate models to improve and ensure accuracy, providing information on normally estimated values such as underground conductor impedance, and characterization of complex loads. Although the input of high-fidelity data to existing tools will be challenging, µPMU data on phase angle (as well as other data from advanced sensors) will be useful for basic operational decisions that are based on a trend of changing data.
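The practical value of µPMU phase-angle data follows from the near-direct relation, on a mostly inductive line, between real power flow and the voltage phase-angle difference across the line. A minimal sketch of that relation; the feeder voltage and reactance values below are illustrative assumptions, not figures from the report:

```python
import math

def line_power_flow(v1, v2, delta_deg, x_ohm):
    """Approximate real power (W) transferred across a mostly inductive
    line: P = V1 * V2 * sin(delta) / X, where delta is the voltage
    phase-angle difference, e.g., as measured between two uPMUs."""
    return v1 * v2 * math.sin(math.radians(delta_deg)) / x_ohm

# a 0.5-degree angle difference across a section of a 12.47 kV feeder,
# with an assumed 2-ohm line reactance
p_watts = line_power_flow(12470.0, 12470.0, 0.5, 2.0)
```

Even a fraction of a degree of angle difference corresponds to substantial power flow, which is why the measurement accuracy questions raised in the findings above matter for planning and operational tools.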
Verification of Modelica-Based Models with Analytical Solutions for Tritium Diffusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rader, Jordan D.; Greenwood, Michael Scott; Humrickhouse, Paul W.
2018-03-20
Here, tritium transport in metal and molten salt fluids combined with diffusion through high-temperature structural materials is an important phenomenon in both magnetic confinement fusion (MCF) and molten salt reactor (MSR) applications. For MCF, it is desirable to capture tritium for fusion fuel. For MSRs, uncaptured tritium can potentially be released to the environment. In either application, quantifying the time- and space-dependent tritium concentration in the working fluid(s) and structural components is necessary. Whereas capability exists specifically for calculating tritium transport in such systems (e.g., using TMAP for fusion reactors), it is desirable to unify the calculation of tritium transport with other system variables, such as dynamic fluid and structure temperatures, combined with control systems such as those that might be found in a system code. Some capability for radioactive trace substance transport exists in thermal-hydraulic systems codes (e.g., RELAP5-3D); however, this capability is not coupled to species diffusion through solids. Combined calculations of tritium transport and the thermal-hydraulic solution have been demonstrated with TRIDENT, but only for a specific type of MSR. Researchers at Oak Ridge National Laboratory have developed a set of Modelica-based dynamic system modeling tools called the TRANsient Simulation Framework Of Reconfigurable Models (TRANSFORM), used previously to model advanced fission reactors and associated systems. In this work, the TRANSFORM library has been augmented to include dynamically coupled fluid and solid trace substance transport and diffusion. Results from simulations are compared against analytical solutions for verification.
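The verification strategy this abstract describes, checking a dynamic transport model against a closed-form diffusion solution, can be illustrated with a short, self-contained sketch. The diffusivity, geometry, and boundary values below are arbitrary illustrative choices, not parameters from TRANSFORM or the report:

```python
import math

def analytic(x, t, D, c0):
    # Diffusion into a semi-infinite slab with fixed surface
    # concentration c0: C(x, t) = c0 * erfc(x / (2*sqrt(D*t)))
    return c0 * math.erfc(x / (2.0 * math.sqrt(D * t)))

def fd_profile(D=1e-9, c0=1.0, L=2e-3, nx=200, t_end=100.0):
    """Explicit finite-difference solution of dC/dt = D * d2C/dx2 with
    C(0) = c0 and C(L) = 0 (slab thick enough to look semi-infinite)."""
    dx = L / nx
    dt = 0.4 * dx * dx / D          # below the explicit stability limit of 0.5
    c = [0.0] * (nx + 1)
    c[0] = c0                        # fixed-concentration boundary
    steps = int(t_end / dt)
    for _ in range(steps):
        prev = c[:]
        for i in range(1, nx):
            c[i] = prev[i] + D * dt / dx**2 * (prev[i+1] - 2*prev[i] + prev[i-1])
    return c, dx, steps * dt

c, dx, t = fd_profile()
# worst disagreement with the analytic profile over the near-surface region
err = max(abs(c[i] - analytic(i * dx, t, 1e-9, 1.0)) for i in range(1, 100))
```

The erfc profile is the standard analytical solution for diffusion into a semi-infinite medium with a fixed surface concentration; agreement between the numerical and analytical curves is the kind of verification evidence the report describes.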
Approaches for scalable modeling and emulation of cyber systems : LDRD final report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayo, Jackson R.; Minnich, Ronald G.; Armstrong, Robert C.
2009-09-01
The goal of this research was to combine theoretical and computational approaches to better understand the potential emergent behaviors of large-scale cyber systems, such as networks of ~10^6 computers. The scale and sophistication of modern computer software, hardware, and deployed networked systems have significantly exceeded the computational research community's ability to understand, model, and predict current and future behaviors. This predictive understanding, however, is critical to the development of new approaches for proactively designing new systems or enhancing existing systems with robustness to current and future cyber threats, including distributed malware such as botnets. We have developed preliminary theoretical and modeling capabilities that can ultimately answer questions such as: How would we reboot the Internet if it were taken down? Can we change network protocols to make them more secure without disrupting existing Internet connectivity and traffic flow? We have begun to address these issues by developing new capabilities for understanding and modeling Internet systems at scale. Specifically, we have addressed the need for scalable network simulation by carrying out emulations of a network with ~10^6 virtualized operating system instances on a high-performance computing cluster - a 'virtual Internet'. We have also explored mappings between previously studied emergent behaviors of complex systems and their potential cyber counterparts. Our results provide foundational capabilities for further research toward understanding the effects of complexity in cyber systems, to allow anticipating and thwarting hackers.
NASA Technical Reports Server (NTRS)
Chan, David T.; Balakrishna, Sundareswara; Walker, Eric L.; Goodliff, Scott L.
2015-01-01
Recent data quality improvements at the National Transonic Facility have an intended goal of reducing the Mach number variation in a data point to within plus or minus 0.0005, with the ultimate goal of reducing the data repeatability of the drag coefficient for full-span subsonic transport models at transonic speeds to within half a drag count. This paper will discuss the Mach stability improvements achieved through the use of an existing second throat capability at the NTF to create a minimum area at the end of the test section. These improvements were demonstrated using both the NASA Common Research Model and the NTF Pathfinder-I model in recent experiments. Sonic conditions at the throat were verified using sidewall static pressure data. The Mach variation levels from both experiments in the baseline tunnel configuration and the choked tunnel configuration will be presented and the correlation between Mach number and drag will also be examined. Finally, a brief discussion is given on the consequences of using the second throat in its location at the end of the test section.
NASA Technical Reports Server (NTRS)
Chan, David T.
2015-01-01
Recent data quality improvements at the National Transonic Facility (NTF) have an intended goal of reducing the Mach number variation in a data point to within plus or minus 0.0005, with the ultimate goal of reducing the data repeatability of the drag coefficient for full-span subsonic transport models at transonic speeds to within half of a drag count. This paper will discuss the Mach stability improvements achieved through the use of an existing second throat capability at the NTF to create a minimum area at the end of the test section. These improvements were demonstrated using both the NASA Common Research Model and the NTF Pathfinder-I model in recent experiments. Sonic conditions at the throat were verified using sidewall static pressure data. The Mach variation levels from both experiments in the baseline tunnel configuration and the choked tunnel configuration will be presented. Finally, a brief discussion is given on the consequences of using the second throat in its location at the end of the test section.
NASA Astrophysics Data System (ADS)
Schmidt, J. B.
1985-09-01
This thesis investigates ways of improving the real-time performance of the Stockpoint Logistics Integrated Communication Environment (SPLICE). Performance evaluation through continuous monitoring activities and performance studies are the principal vehicles discussed. The method for implementing this performance evaluation process is the measurement of predefined performance indexes, and candidate performance indexes for SPLICE are proposed. Existing SPLICE capability to carry out performance evaluation is explored, and recommendations are made to enhance that capability.
A Three-Dimensional Linearized Unsteady Euler Analysis for Turbomachinery Blade Rows
NASA Technical Reports Server (NTRS)
Montgomery, Matthew D.; Verdon, Joseph M.
1996-01-01
A three-dimensional, linearized, Euler analysis is being developed to provide an efficient unsteady aerodynamic analysis that can be used to predict the aeroelastic and aeroacoustic response characteristics of axial-flow turbomachinery blading. The field equations and boundary conditions needed to describe nonlinear and linearized inviscid unsteady flows through a blade row operating within a cylindrical annular duct are presented. In addition, a numerical model for linearized inviscid unsteady flow, which is based upon an existing nonlinear, implicit, wave-split, finite volume analysis, is described. These aerodynamic and numerical models have been implemented into an unsteady flow code, called LINFLUX. A preliminary version of the LINFLUX code is applied herein to selected, benchmark three-dimensional, subsonic, unsteady flows, to illustrate its current capabilities and to uncover existing problems and deficiencies. The numerical results indicate that good progress has been made toward developing a reliable and useful three-dimensional prediction capability. However, some problems, associated with the implementation of an unsteady displacement field and numerical errors near solid boundaries, still exist. Also, accurate far-field conditions must be incorporated into the LINFLUX analysis, so that this analysis can be applied to unsteady flows driven by external aerodynamic excitations.
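The linearization at the heart of such an analysis can be summarized in one line. In generic notation (illustrative, not necessarily LINFLUX's), the unsteady flow is split into a steady background state and a small harmonic perturbation at frequency $\omega$:

```latex
\mathbf{U}(\mathbf{x},t) \;=\; \overline{\mathbf{U}}(\mathbf{x})
\;+\; \operatorname{Re}\!\left\{ \tilde{\mathbf{u}}(\mathbf{x})\, e^{i\omega t} \right\},
\qquad \lVert \tilde{\mathbf{u}} \rVert \ll \lVert \overline{\mathbf{U}} \rVert .
```

Substituting this decomposition into the nonlinear Euler equations and discarding terms quadratic in $\tilde{\mathbf{u}}$ yields a linear system $L(\overline{\mathbf{U}})\,\tilde{\mathbf{u}} = \mathbf{f}$ whose coefficients depend only on the steady solution, so a single background solve can support many perturbation analyses (blade vibration modes, incident gusts), which is the source of the efficiency claimed above.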
NASA Astrophysics Data System (ADS)
Kolb, Kimberly E.; Choi, Hee-sue S.; Kaur, Balvinder; Olson, Jeffrey T.; Hill, Clayton F.; Hutchinson, James A.
2016-05-01
The US Army's Communications Electronics Research, Development and Engineering Center (CERDEC) Night Vision and Electronic Sensors Directorate (referred to as NVESD) is developing a virtual detection, recognition, and identification (DRI) testing methodology using simulated imagery as a means of augmenting the field testing component of sensor performance evaluation, which is expensive, resource intensive, time consuming, and limited to the available target(s) and existing atmospheric visibility and environmental conditions at the time of testing. Existing simulation capabilities such as the Digital Imaging Remote Sensing Image Generator (DIRSIG) and NVESD's Integrated Performance Model Image Generator (NVIPM-IG) can be combined with existing detection algorithms to reduce cost/time, minimize testing risk, and allow virtual/simulated testing using full spectral and thermal object signatures, as well as those collected in the field. NVESD has developed an end-to-end capability to demonstrate the feasibility of this approach. Simple detection algorithms have been used on the degraded images generated by NVIPM-IG to determine the relative performance of the algorithms on both DIRSIG-simulated and collected images. Evaluating the degree to which the algorithm performance agrees between simulated versus field collected imagery is the first step in validating the simulated imagery procedure.
Women Moving Women: Going beyond the Awareness Stage.
ERIC Educational Resources Information Center
Gillen, Marie A.
A model was developed out of a need to help women move toward a more authentic existence. It was intended to help women go beyond the level of awareness in their present quest for self-determination, to make full use of their lives, to challenge their capabilities, and to reach for new horizons. The model involves both an unlearning and a…
Monte Carlo capabilities of the SCALE code system
Rearden, Bradley T.; Petrie, Jr., Lester M.; Peplow, Douglas E.; ...
2014-09-12
SCALE is a broadly used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. An overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.
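As a generic illustration of the analog Monte Carlo transport idea underlying solvers of this kind (textbook material, not SCALE's actual algorithms or nuclear data), the uncollided transmission through a purely absorbing slab can be estimated by sampling free paths from the exponential distribution:

```python
import math
import random

def transmission(sigma_t=1.0, thickness=2.0, n=200_000, seed=12345):
    """Analog Monte Carlo estimate of uncollided transmission through a
    purely absorbing slab: sample each particle's free path from the
    exponential distribution and count those exceeding the thickness."""
    rng = random.Random(seed)
    crossed = sum(1 for _ in range(n)
                  if -math.log(1.0 - rng.random()) / sigma_t > thickness)
    return crossed / n

t_mc = transmission()
t_exact = math.exp(-1.0 * 2.0)   # analytic answer exp(-sigma_t * thickness)
```

With a macroscopic cross section of 1 cm⁻¹ and a 2 cm slab, the estimate converges to the analytic answer exp(−2) as the particle count grows; real transport solvers add scattering physics, variance reduction, and continuous-energy data on top of this same sampling skeleton.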
LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory
NASA Astrophysics Data System (ADS)
Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.
2017-08-01
MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.
Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley; Lung, Shun-fat
2008-01-01
An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and to leverage existing commercial as well as in-house codes, enabling true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient optimization-based model tuning capabilities have been successfully integrated with the MDAO tool. Better synchronization among all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulation for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.
NASA Technical Reports Server (NTRS)
Mehra, Avichal; Anantharaj, Valentine; Payne, Steve; Kantha, Lakshmi
1996-01-01
This report documents an existing capability to produce operationally relevant products on sea level and currents from a tides/storm surge model for any coastal region around the world within 48 hours from the time of the request. The model is ready for transition to the Naval Oceanographic Office (NAVOCEANO) for potential contingency use anywhere around the world. A recent application to naval operations offshore Liberia illustrates this. Mississippi State University, in collaboration with the University of Colorado and NAVOCEANO, successfully deployed the Colorado University Rapidly Relocatable Nestable Tides and Storm Surge (CURReNTSS) model that predicts sea surface height, tidal currents and storm surge, and provided operational products on tidal sea level and currents in the littoral region off south-western coast of Africa. This report summarizes the results of this collaborative effort in an actual contingency use of the relocatable model, summarizes the lessons learned, and provides recommendations for further evaluation and transition of this modeling capability to operational use.
Stakeholder approach for evaluating organizational change projects.
Peltokorpi, Antti; Alho, Antti; Kujala, Jaakko; Aitamurto, Johanna; Parvinen, Petri
2008-01-01
This paper aims to create a model for evaluating organizational change initiatives from a stakeholder resistance viewpoint. The paper presents a model to evaluate change projects and their expected benefits. Factors affecting the challenge to implement change were defined based on stakeholder theory literature. The authors test the model's practical validity for screening change initiatives to improve operating room productivity. Change initiatives can be evaluated using six factors: the effect of the planned intervention on stakeholders' actions and position; stakeholders' capability to influence the project's implementation; motivation to participate; capability to change; change complexity; and management capability. The presented model's generalizability should be explored by filtering presented factors through a larger number of historical cases operating in different healthcare contexts. The link between stakeholders, the change challenge and the outcomes of change projects needs to be empirically tested. The proposed model can be used to prioritize change projects, manage stakeholder resistance and establish a better organizational and professional competence for managing healthcare organization change projects. New insights into existing stakeholder-related understanding of change project successes are provided.
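A minimal sketch of how the six evaluation factors listed in this abstract might be combined into a screening score for prioritizing change projects. The 1-5 rating scale, the equal weighting, and the example initiatives are assumptions for illustration only; the paper does not prescribe a specific scoring formula:

```python
from dataclasses import dataclass

@dataclass
class ChangeInitiative:
    """The six stakeholder-resistance factors from the model, each rated
    1-5 here. Higher always means 'more favorable for implementation'
    (so change complexity is entered inverted, as simplicity)."""
    name: str
    effect_on_stakeholders: int       # effect on stakeholders' actions/position
    influence_on_implementation: int  # stakeholders' capability to influence
    motivation: int                   # motivation to participate
    capability_to_change: int
    change_simplicity: int            # inverse of change complexity
    management_capability: int

    def score(self) -> float:
        factors = [self.effect_on_stakeholders, self.influence_on_implementation,
                   self.motivation, self.capability_to_change,
                   self.change_simplicity, self.management_capability]
        return sum(factors) / len(factors)

def prioritize(initiatives):
    # Rank candidate change projects, most promising first.
    return sorted(initiatives, key=lambda i: i.score(), reverse=True)

# hypothetical operating-room initiatives, ratings invented for illustration
a = ChangeInitiative("new OR scheduling", 4, 3, 4, 3, 2, 4)
b = ChangeInitiative("extended OR hours", 2, 2, 3, 3, 4, 3)
ranked = prioritize([a, b])
```

In practice the factors would be weighted and scored through stakeholder interviews rather than equally averaged; the sketch only shows how the six dimensions support a comparable screening score.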
Assessment of existing Sierra/Fuego capabilities related to grid-to-rod-fretting (GTRF).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, Daniel Zack; Rodriguez, Salvador B.
2011-06-01
The following report presents an assessment of existing capabilities in Sierra/Fuego applied to modeling several aspects of grid-to-rod fretting (GTRF), including fluid dynamics, heat transfer, and fluid-structure interaction. We compare the results of a number of Fuego simulations with relevant sources in the literature to evaluate the accuracy, efficiency, and robustness of using Fuego to model the aforementioned aspects. Comparisons between flow domains that include the full fuel-rod length and a subsection of the domain near the spacer show that tremendous efficiency gains can be obtained by truncating the domain without loss of accuracy. Thermal analysis reveals the extent to which heat transfer from the fuel rods to the coolant is improved by the swirling flow created by the mixing vanes. Lastly, coupled fluid-structure interaction analysis shows that the vibrational modes of the fuel rods filter out high-frequency turbulent pressure fluctuations. In general, these results point to interesting phenomena for which further investigation could be quite fruitful.
NASA Astrophysics Data System (ADS)
Kiekebusch, Mario J.; Di Lieto, Nicola; Sandrock, Stefan; Popovic, Dan; Chiozzi, Gianluca
2014-07-01
ESO is in the process of implementing a new development platform, based on PLCs, for upcoming VLT control systems (new instruments and refurbishing of existing systems to manage obsolescence issues). In this context, we have evaluated the integration and reuse of existing C++ libraries and Simulink models into the real-time environment of BECKHOFF Embedded PCs using the capabilities of the latest version of TwinCAT software and MathWorks Embedded Coder. While doing so the aim was to minimize the impact of the new platform by adopting fully tested solutions implemented in C++. This allows us to reuse the in house expertise, as well as extending the normal capabilities of the traditional PLC programming environments. We present the progress of this work and its application in two concrete cases: 1) field rotation compensation for instrument tracking devices like derotators, 2) the ESO standard axis controller (ESTAC), a generic model-based controller implemented in Simulink and used for the control of telescope main axes.
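For the first use case, field rotation compensation on an alt-azimuth telescope conventionally tracks the parallactic angle. The sketch below uses the standard textbook formula with illustrative coordinates; it is a generic formula, not ESO's ESTAC or derotator implementation:

```python
import math

def parallactic_angle(hour_angle_deg, dec_deg, lat_deg):
    """Parallactic angle q (degrees) for a target at the given hour angle
    and declination, seen from the given site latitude; a derotator must
    track this angle (up to sign and a constant offset) to keep the
    instrument field of view fixed on the sky."""
    H = math.radians(hour_angle_deg)
    d = math.radians(dec_deg)
    p = math.radians(lat_deg)
    q = math.atan2(math.sin(H),
                   math.tan(p) * math.cos(d) - math.sin(d) * math.cos(H))
    return math.degrees(q)

# a southern-hemisphere site latitude and declination, chosen for illustration
q0 = parallactic_angle(0.0, -30.0, -24.6)   # target on the meridian
```

A field-rotation tracking loop would evaluate this angle continuously as the hour angle advances and command the rotation stage accordingly, which is the kind of computation a Simulink model deployed via Embedded Coder could provide.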
INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorensek, M.; Hamm, L.; Garcia, H.
2011-07-18
Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.
Systems test facilities existing capabilities compilation
NASA Technical Reports Server (NTRS)
Weaver, R.
1981-01-01
Systems test facilities (STFs) to test total photovoltaic systems and their interfaces are described. The systems development (SD) plan is a compilation of existing and planned STFs, as well as subsystem and key component testing facilities. It is recommended that the existing capabilities compilation be updated annually to provide an assessment of STF activity and to disseminate STF capabilities, status, and availability to the photovoltaics program.
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1978-01-01
The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.
1976-03-01
…including the Advanced Prediction Model for the global atmosphere, as well as very fine grid cloud models and cloud probability models. Some of the new requirements that will be supported with this system are… Should the capability exist to interface raw ungridded data with the SID interface (input and output)?
Potential capabilities of lunar laser ranging for geodesy and relativity
NASA Technical Reports Server (NTRS)
Muller, Jurgen; Williams, James G.; Turyshev, Slava G.; Shelus, Peter J.
2005-01-01
Here, we review the LLR technique focusing on its impact on geodesy and relativity. We discuss the modern observational accuracy and the level of existing LLR modeling. We present the near-term objectives and emphasize improvements needed to fully utilize the scientific potential of LLR.
NASA Astrophysics Data System (ADS)
Jung, Yookyung; Klein, Oliver J.; Wang, Hequn; Evans, Conor L.
2016-06-01
Three-dimensional in vitro tumor models are highly useful tools for studying tumor growth and treatment response of malignancies such as ovarian cancer. Existing viability and treatment assessment assays, however, face shortcomings when applied to these large, complex, and heterogeneous culture systems. Optical coherence tomography (OCT) is a noninvasive, label-free, optical imaging technique that can visualize live cells and tissues over time with subcellular resolution and millimeters of optical penetration depth. Here, we show that OCT is capable of carrying out high-content, longitudinal assays of 3D culture treatment response. We demonstrate the usage and capability of OCT for the dynamic monitoring of individual and combination therapeutic regimens in vitro, including both chemotherapy drugs and photodynamic therapy (PDT) for ovarian cancer. OCT was validated against the standard LIVE/DEAD Viability/Cytotoxicity Assay in small tumor spheroid cultures, showing excellent correlation with existing standards. Importantly, OCT was shown to be capable of evaluating 3D spheroid treatment response even when traditional viability assays failed. OCT 3D viability imaging revealed synergy between PDT and the standard-of-care chemotherapeutic carboplatin that evolved over time. We believe the efficacy and accuracy of OCT in vitro drug screening will greatly contribute to the field of cancer treatment and therapy evaluation.
Extending BPM Environments of Your Choice with Performance Related Decision Support
NASA Astrophysics Data System (ADS)
Fritzsche, Mathias; Picht, Michael; Gilani, Wasif; Spence, Ivor; Brown, John; Kilpatrick, Peter
What-if simulations have been identified as one solution for business performance related decision support. Such support is especially useful in cases where it can be automatically generated out of Business Process Management (BPM) Environments from the existing business process models and performance parameters monitored from the executed business process instances. Currently, some of the available BPM Environments offer basic-level performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance related decision support at business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations or a combination of such solutions into already existing BPM environments. The approach abstracts from process modelling techniques, which enables automatic decision support spanning processes across numerous BPM Environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.
NASA Astrophysics Data System (ADS)
Guzmán, H. A.; Lárraga, M. E.; Alvarez-Icaza, L.; Carvajal, J.
2018-02-01
In this paper, a reliable cellular automata model is presented that faithfully reproduces deceleration and acceleration according to realistic driver reactions when vehicles with different deceleration capabilities are considered. The model focuses on describing complex traffic phenomena by coding in its rules the basic mechanisms of driver behavior, vehicle capabilities and kinetics, while preserving simplicity. In particular, vehicle kinetics is based on uniform accelerated motion, rather than on impulsive accelerated motion as in most existing CA models. Thus, the proposed model calculates in an analytic way three safety-preserving distances to determine the best action a follower vehicle can take under a worst-case scenario. Besides, the prediction analysis guarantees that under the proper assumptions, collisions between vehicles cannot happen at any future time. Simulation results indicate that all interactions of heterogeneous vehicles (i.e., car-truck, truck-car, car-car and truck-truck) are properly reproduced by the model. In addition, the model overcomes one of the major limitations of CA models for traffic modeling: the inability to perform a smooth approach to slower or stopped vehicles. Moreover, the model is also capable of reproducing most empirical findings, including the backward speed of the downstream front of the traffic jam and different congested traffic patterns induced by a system with open boundary conditions with an on-ramp. Like most CA models, integer values are used to make the model run faster, which makes the proposed model suitable for real-time traffic simulation of large networks.
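The worst-case safe-distance idea described above can be sketched in a few lines. This is a minimal illustration assuming uniform accelerated (constant-deceleration) motion and a one-step reaction time; the function names, parameter values, and the simple two-phase braking model are ours, not the authors' actual rule set:

```python
def stopping_distance(v, b):
    """Distance covered while braking uniformly from speed v (m/s) at rate b (m/s^2)."""
    return v * v / (2.0 * b)

def safe_gap(v_follower, b_follower, v_leader, b_leader, reaction_time=1.0):
    """Worst-case gap the follower must keep so no collision can occur even if the
    leader brakes at its maximum capability. The follower travels at constant speed
    during the reaction time, then brakes uniformly at b_follower."""
    d_follower = v_follower * reaction_time + stopping_distance(v_follower, b_follower)
    d_leader = stopping_distance(v_leader, b_leader)
    return max(0.0, d_follower - d_leader)

# Heterogeneous traffic: a truck (weak brakes) following a car (strong brakes)
# must keep a much larger gap than a car following a truck at the same speed.
gap_truck_behind_car = safe_gap(20.0, 2.0, 20.0, 6.0)
gap_car_behind_truck = safe_gap(20.0, 6.0, 20.0, 2.0)
```

The asymmetry between the two gaps is what lets such rules reproduce car-truck versus truck-car interactions differently.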
Probabilistic Fracture Mechanics of Reactor Pressure Vessels with Populations of Flaws
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin; Backman, Marie; Williams, Paul
This report documents recent progress in developing a tool that uses the Grizzly and RAVEN codes to perform probabilistic fracture mechanics analyses of reactor pressure vessels in light water reactor nuclear power plants. The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. Because of the central role of the reactor pressure vessel (RPV) in a nuclear power plant, particular emphasis is being placed on developing capabilities to model fracture in embrittled RPVs to aid in the process surrounding decision making relating to life extension of existing plants. A typical RPV contains a large population of pre-existing flaws introduced during the manufacturing process. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation at one or more of these flaws during a transient event. This report documents development and initial testing of a capability to perform probabilistic fracture mechanics of large populations of flaws in RPVs using reduced order models to compute fracture parameters. The work documented here builds on prior efforts to perform probabilistic analyses of a single flaw with uncertain parameters, as well as earlier work to develop deterministic capabilities to model the thermo-mechanical response of the RPV under transient events, and compute fracture mechanics parameters at locations of pre-defined flaws. The capabilities developed as part of this work provide a foundation for future work, which will develop a platform that provides the flexibility needed to consider scenarios that cannot be addressed with the tools used in current practice.
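The probabilistic treatment of a flaw population can be illustrated with a toy Monte Carlo sketch. The flaw-depth and toughness distributions, the K_I = Y·σ·√(πa) surrogate, and every number below are illustrative stand-ins, not Grizzly/RAVEN's actual reduced-order models or material data:

```python
import math
import random

def k_applied(stress_mpa, depth_m, geometry_factor=1.12):
    """Mode-I stress intensity factor K_I = Y * sigma * sqrt(pi * a), in MPa*sqrt(m)."""
    return geometry_factor * stress_mpa * math.sqrt(math.pi * depth_m)

def prob_crack_initiation(n_trials=2000, n_flaws=200, seed=1):
    """Monte Carlo estimate of the probability that at least one flaw in the
    vessel's flaw population initiates a crack during one transient event."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        for _ in range(n_flaws):
            depth = rng.expovariate(1.0 / 0.002)     # flaw depth (m), mean 2 mm (hypothetical)
            toughness = rng.gauss(60.0, 10.0)        # sampled K_Ic (MPa*sqrt(m), hypothetical)
            if k_applied(150.0, depth) > toughness:  # initiation criterion: K_I > K_Ic
                failures += 1
                break                                # one initiation is enough for this trial
    return failures / n_trials

p_init = prob_crack_initiation()
```

A real analysis would replace the closed-form K_I with reduced-order models evaluated against the transient thermo-mechanical state of the vessel wall.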
Integrating O/S models during conceptual design, part 1
NASA Technical Reports Server (NTRS)
Ebeling, Charles E.
1994-01-01
The University of Dayton is pleased to submit to the National Aeronautics and Space Administration (NASA) Langley Research Center this report, which integrates a set of models for determining operational capabilities and support requirements during the conceptual design of proposed space systems. This research provides for the integration of the reliability and maintainability (R&M) model, both new and existing simulation models, and existing operations and support (O&S) costing equations in arriving at a complete analysis methodology. Details concerning the R&M model and the O&S costing model may be found in previous reports accomplished under this grant (NASA Research Grant NAG1-1327). In the process of developing this comprehensive analysis approach, significant enhancements were made to the R&M model, updates to the O&S costing model were accomplished, and a new simulation model was developed. This is the first part of a three-part technical report.
ERIC Educational Resources Information Center
GLOVER, J.H.
The chief objective of this study of speed-skill acquisition was to find a mathematical model capable of simple graphic interpretation for industrial training and production scheduling at the shop floor level. Studies of middle skill development in machine and vehicle assembly, aircraft production, spoolmaking, and the machining of parts confirmed…
Software Analysis of New Space Gravity Data for Geophysics and Climate Research
NASA Technical Reports Server (NTRS)
Deese, Rupert; Ivins, Erik R.; Fielding, Eric J.
2012-01-01
Both the Gravity Recovery and Climate Experiment (GRACE) and Gravity field and steady-state Ocean Circulation Explorer (GOCE) satellites are returning rich data for the study of the solid earth, the oceans, and the climate. Current software analysis tools do not provide researchers with the ease and flexibility required to make full use of this data. We evaluate the capabilities and shortcomings of existing software tools including Mathematica, the GOCE User Toolbox, the ICGEM's (International Center for Global Earth Models) web server, and Tesseroids. Using existing tools as necessary, we design and implement software with the capability to produce gridded data and publication quality renderings from raw gravity data. The straightforward software interface marks an improvement over previously existing tools and makes new space gravity data more useful to researchers. Using the software we calculate Bouguer anomalies of the gravity tensor's vertical component in the Gulf of Mexico, Antarctica, and the 2010 Maule earthquake region. These maps identify promising areas of future research.
High Reynolds number turbulence model of rotating shear flows
NASA Astrophysics Data System (ADS)
Masuda, S.; Ariga, I.; Koyama, H. S.
1983-09-01
A Reynolds stress closure model for rotating turbulent shear flows is developed. Special attention is paid to keeping the model constants independent of rotation. First, general forms of the model of a Reynolds stress equation and a dissipation rate equation are derived, the only restrictions of which are high Reynolds number and incompressibility. The model equations are then applied to two-dimensional equilibrium boundary layers and the effects of Coriolis acceleration on turbulence structures are discussed. Comparisons with the experimental data and with previous results in other external force fields show that there exists a very close analogy between centrifugal, buoyancy and Coriolis force fields. Finally, the model is applied to predict the two-dimensional boundary layers on rotating plane walls. Comparisons with existing data confirmed its capability of predicting mean and turbulent quantities without employing any empirical relations in rotating fields.
Tracking trade transactions in water resource systems: A node-arc optimization formulation
NASA Astrophysics Data System (ADS)
Erfani, Tohid; Huskova, Ivana; Harou, Julien J.
2013-05-01
We formulate and apply a multicommodity network flow node-arc optimization model capable of tracking trade transactions in complex water resource systems. The model uses a simple node-to-node network connectivity matrix and does not require preprocessing of all possible flow paths in the network. We compare the proposed node-arc formulation with an existing arc-path (flow path) formulation and explain the advantages and difficulties of both approaches. We verify the proposed formulation on a hypothetical water distribution network. Results indicate the arc-path model solves the problem with fewer constraints, but the proposed formulation allows using a simple network connectivity matrix which simplifies modeling large or complex networks. The proposed algorithm allows converting existing node-arc hydroeconomic models that broadly represent water trading to ones that also track individual supplier-receiver relationships (trade transactions).
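The core ingredient of a node-arc formulation, a node-arc incidence matrix with one flow-conservation system per commodity (here, per trade transaction), can be sketched as follows. The network, node names, and flow values are hypothetical, and a real model would optimize the flows subject to these constraints rather than merely check them:

```python
import numpy as np

# Hypothetical 4-node network; arcs given as (tail, head) pairs.
nodes = ["res", "junc", "cityA", "cityB"]
arcs = [("res", "junc"), ("junc", "cityA"), ("junc", "cityB"), ("res", "cityB")]

# Node-arc incidence matrix: +1 where an arc leaves a node, -1 where it enters.
A = np.zeros((len(nodes), len(arcs)))
for j, (tail, head) in enumerate(arcs):
    A[nodes.index(tail), j] = 1.0
    A[nodes.index(head), j] = -1.0

# One commodity per trade transaction: b[i] is the net supply at node i
# (positive at the supplier, negative at the receiver).
b_trade1 = np.array([30.0, 0.0, -30.0, 0.0])  # reservoir sells 30 units to cityA
b_trade2 = np.array([20.0, 0.0, 0.0, -20.0])  # reservoir sells 20 units to cityB

# Per-commodity arc flows; conservation holds iff A @ x == b for each commodity.
x_trade1 = np.array([30.0, 30.0, 0.0, 0.0])
x_trade2 = np.array([0.0, 0.0, 0.0, 20.0])
total_arc_flow = x_trade1 + x_trade2  # physical flow is the sum over commodities
```

Because each transaction keeps its own conservation system, supplier-receiver relationships remain identifiable even after the flows are summed on shared arcs.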
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wnek, W.J.; Ramshaw, J.D.; Trapp, J.A.
1975-11-01
A mathematical model and a numerical solution scheme for thermal-hydraulic analysis of fuel rod arrays are given. The model alleviates the two major deficiencies of existing rod array analysis models: the lack of a correct transverse momentum equation and the inability to handle reversing and circulatory flows. Possible applications of the model include steady state and transient subchannel calculations as well as analysis of flows in heat exchangers, other engineering equipment, and porous media. (auth)
Modeling and Simulation Tools for Heavy Lift Airships
NASA Technical Reports Server (NTRS)
Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John
2016-01-01
For conventional fixed wing and rotary wing aircraft a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. A survey of the tools currently available will be conducted to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.
Finite element modelling of crash response of composite aerospace sub-floor structures
NASA Astrophysics Data System (ADS)
McCarthy, M. A.; Harte, C. G.; Wiggenraad, J. F. M.; Michielsen, A. L. P. J.; Kohlgrüber, D.; Kamoulakos, A.
Composite energy-absorbing structures for use in aircraft are being studied within a European Commission research programme (CRASURV - Design for Crash Survivability). One of the aims of the project is to evaluate the current capabilities of crashworthiness simulation codes for composites modelling. This paper focuses on the computational analysis, using explicit finite element analysis, of a number of quasi-static and dynamic tests carried out within the programme. It describes the design of the structures, the analysis techniques used, and the results of the analyses in comparison to the experimental test results. It has been found that current multi-ply shell models are capable of modelling the main energy-absorbing processes at work in such structures. However, some deficiencies exist, particularly in modelling fabric composites. Developments within the finite element code are taking place as a result of this work, which will enable better representation of composite fabrics.
Information Management for Unmanned Systems: Combining DL-Reasoning with Publish/Subscribe
NASA Astrophysics Data System (ADS)
Moser, Herwig; Reichelt, Toni; Oswald, Norbert; Förster, Stefan
Sharing capabilities and information between collaborating entities by using modern information and communication technology is a core principle in complex distributed civil or military mission scenarios. Previous work proved the suitability of Service-oriented Architectures for modelling and sharing the participating entities' capabilities. Albeit providing a satisfactory model for capabilities sharing, pure service-orientation curtails expressiveness for information exchange as opposed to dedicated data-centric communication principles. In this paper we introduce an Information Management System which combines OWL-Ontologies and automated reasoning with Publish/Subscribe-Systems, providing for a shared but decoupled data model. While confirming existing related research results, we emphasise the novel application and lack of practical experience of using Semantic Web technologies in areas other than originally intended. That is, aiding decision support and software design in the context of a mission scenario for an unmanned system. Experiments within a complex simulation environment show the immediate benefits of a semantic information-management and -dissemination platform: clear separation of concerns in code and data model, increased service re-usability and extensibility, as well as regulation of data flow and respective system behaviour through declarative rules.
NextGen Weather Processor Architecture Study
2010-03-09
…weather capabilities. These objectives are to be achieved in a staged fashion, ideally with new components coming on-line in time to replace existing capabilities prior to their end-of-life dates. As part of NWP Segment…
NASA Technical Reports Server (NTRS)
Burns, K. Lee; Altino, Karen
2008-01-01
The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Their existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability given an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to provide the Space Shuttle program the ability to estimate impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing site weather to produce landing availability, and concurrent analysis of multiple sites, to assist in operational landing site selection. In addition, the Constellation program has also expressed interest in the APRA tool, and has requested several additional capabilities to address some Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun for a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the currently existing tool.
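The APRA-style core computation, availability as the climatological probability of staying within constraint limits, can be sketched in a few lines. The samples, limits, and the independence assumption between constraints are ours for illustration; the real tool accounts for seasonal and diurnal variability and for correlated parameters:

```python
def exceedance_probability(samples, limit):
    """Empirical climatological probability that a parameter exceeds its constraint limit."""
    return sum(1 for s in samples if s > limit) / len(samples)

def launch_availability(sample_sets, limits):
    """Availability for a single launch attempt, assuming the constraints are
    statistically independent (a simplification; real constraints are correlated)."""
    p_go = 1.0
    for samples, limit in zip(sample_sets, limits):
        p_go *= 1.0 - exceedance_probability(samples, limit)
    return p_go

winds = [8, 12, 15, 22, 9, 30, 11, 14, 18, 7]  # knots, hypothetical climatology samples
rain = [0, 0, 1, 0, 2, 0, 0, 5, 0, 0]          # mm/hr, hypothetical
avail = launch_availability([winds, rain], [25, 1])
```

Evaluating the same quantity per season and per hour-of-day bin is what yields the seasonal and diurnal availability curves the abstract mentions.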
Improvement of High-Resolution Tropical Cyclone Structure and Intensity Forecasts using COAMPS-TC
2013-09-30
…scientific community, including the recent T-PARC/TCS08, ITOP, and HS3 field campaigns, to build upon the existing modeling capabilities. We will… heating and cooling rates in developing and non-developing tropical disturbances during TCS-08: radar-equivalent retrievals from mesoscale numerical…
Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engel, David W.; Dalton, Angela C.; Dale, Crystal
2014-06-01
Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.
Modeling a Hall Thruster from Anode to Plume Far Field
2005-01-01
This effort develops a Hall thruster simulation capability that begins with propellant injection at the thruster anode and ends in the plume far field. The development of a comprehensive simulation capability is critical for a number of reasons. The main motivation stems from the need to directly couple simulation of the plasma discharge processes inside the thruster and the transport of the plasma to the plume far field. The simulation strategy will employ two existing codes, one for the Hall thruster device and one for the plume. The coupling will take place in the plume.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tupek, Michael R.
2016-06-30
In recent years there has been a proliferation of modeling techniques for forward predictions of crack propagation in brittle materials, including: phase-field/gradient damage models, peridynamics, cohesive-zone models, and G/XFEM enrichment techniques. However, progress on the corresponding inverse problems has been relatively lacking. Taking advantage of key features of existing modeling approaches, we propose a parabolic regularization of Barenblatt cohesive models which borrows extensively from previous phase-field and gradient damage formulations. An efficient explicit time integration strategy for this type of nonlocal fracture model is then proposed and justified. In addition, we present a C++ computational framework for computing input parameter sensitivities efficiently for explicit dynamic problems using the adjoint method. This capability allows for solving inverse problems involving crack propagation to answer interesting engineering questions such as: 1) what is the optimal design topology and material placement for a heterogeneous structure to maximize fracture resistance, 2) what loads must have been applied to a structure for it to have failed in an observed way, 3) what are the existing cracks in a structure given various experimental observations, etc. In this work, we focus on the first of these engineering questions and demonstrate a capability to automatically and efficiently compute optimal designs intended to minimize crack propagation in structures.
Powathil, Gibin G; Swat, Maciej; Chaplain, Mark A J
2015-02-01
The multiscale complexity of cancer as a disease necessitates a corresponding multiscale modelling approach to produce truly predictive mathematical models capable of improving existing treatment protocols. To capture all the dynamics of solid tumour growth and its progression, mathematical modellers need to couple biological processes occurring at various spatial and temporal scales (from genes to tissues). Because effectiveness of cancer therapy is considerably affected by intracellular and extracellular heterogeneities as well as by the dynamical changes in the tissue microenvironment, any model attempt to optimise existing protocols must consider these factors, ultimately leading to improved multimodal treatment regimes. By improving existing and building new mathematical models of cancer, modellers can play an important role in preventing the use of potentially sub-optimal treatment combinations. In this paper, we analyse a multiscale computational mathematical model for cancer growth and spread, incorporating the multiple effects of radiation therapy and chemotherapy in the patient survival probability, and implement the model using two different cell-based modelling techniques. We show that the insights provided by such multiscale modelling approaches can ultimately help in designing optimal patient-specific multi-modality treatment protocols that may increase patients' quality of life. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Underwood, Lauren W.; Ryan, Robert E.
2007-01-01
This Candidate Solution uses NASA Earth science research on atmospheric ozone and aerosols data (1) to help improve the prediction capabilities of water runoff models that are used to estimate runoff pollution from retention ponds, and (2) to understand the pollutant removal contribution and potential of photocatalytically coated materials that could be used in these ponds. Models (the EPA's SWMM and the USGS SLAMM) exist that estimate the release of pollutants into the environment from storm-water-related retention pond runoff. UV irradiance data acquired from the satellite mission Aura and from the OMI Surface UV algorithm will be incorporated into these models to enhance their capabilities, not only by increasing the general understanding of retention pond function (both the efficacy and efficiency) but additionally by adding photocatalytic materials to these retention ponds, augmenting their performance. State and local officials who run pollution protection programs could then develop and implement photocatalytic technologies for water pollution control in retention ponds and use them in conjunction with existing runoff models. More effective decisions about water pollution protection programs could be made, the persistence and toxicity of waste generated could be minimized, and subsequently our natural water resources would be improved. This Candidate Solution is in alignment with the Water Management and Public Health National Applications.
Empirical testing of an analytical model predicting electrical isolation of photovoltaic modules
NASA Astrophysics Data System (ADS)
Garcia, A., III; Minning, C. P.; Cuddihy, E. F.
A major design requirement for photovoltaic modules is that the encapsulation system be capable of withstanding large DC potentials without electrical breakdown. Presented is a simple analytical model which can be used to estimate material thickness to meet this requirement for a candidate encapsulation system or to predict the breakdown voltage of an existing module design. A series of electrical tests to verify the model are described in detail. The results of these verification tests confirmed the utility of the analytical model for preliminary design of photovoltaic modules.
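A first-order version of such a sizing estimate treats breakdown as a simple field-strength limit. The sketch below is not the paper's model (which treats the actual layered encapsulation geometry and its verification tests); the dielectric-strength value and safety factor are illustrative assumptions:

```python
def min_encapsulant_thickness(system_voltage, dielectric_strength, safety_factor=2.0):
    """Minimum insulation thickness (mm) for a module encapsulant, treating breakdown
    as a uniform-field limit: t >= SF * V / E_bd. Inputs: volts and volts per mm."""
    return safety_factor * system_voltage / dielectric_strength

# e.g. an EVA-like encapsulant with ~20 kV/mm nominal strength at a 1500 V system voltage
t = min_encapsulant_thickness(1500.0, 20000.0)
```

Inverting the same relation gives a rough breakdown-voltage prediction for an existing design, which is the quantity the verification tests in the paper measure directly.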
Quantification of localized vertebral deformities using a sparse wavelet-based shape model.
Zewail, R; Elsafi, A; Durdle, N
2008-01-01
Medical experts often examine hundreds of spine x-ray images to determine existence of various pathologies. Common pathologies of interest are anterior osteophytes, disc space narrowing, and wedging. By careful inspection of the outline shapes of the vertebral bodies, experts are able to identify and assess vertebral abnormalities with respect to the pathology under investigation. In this paper, we present a novel method for quantification of vertebral deformation using a sparse shape model. Using wavelets and independent component analysis (ICA), we construct a sparse shape model that benefits from the approximation power of wavelets and the capability of ICA to capture higher order statistics in wavelet space. The new model is able to capture localized pathology-related shape deformations, hence it allows for quantification of vertebral shape variations. We investigate the capability of the model to predict localized pathology-related deformations. Next, using support-vector machines, we demonstrate the diagnostic capabilities of the method through the discrimination of anterior osteophytes in lumbar vertebrae. Experiments were conducted using a set of 150 contours from digital x-ray images of the lumbar spine. Each vertebra is labeled as normal or abnormal. Results reported in this work focus on anterior osteophytes as the pathology of interest.
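The pipeline this abstract outlines (wavelet features, then ICA, then an SVM) can be sketched as follows. This is a heavily simplified illustration on synthetic contours: a one-level Haar transform stands in for the paper's wavelet basis, and all sizes and parameters are assumptions, not the study's actual settings.

```python
# Sketch of the abstract's pipeline on synthetic data:
# contour samples -> Haar wavelet coefficients -> ICA components -> SVM.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def haar_level1(signal):
    """One-level Haar transform: approximation then detail coefficients."""
    even, odd = signal[..., 0::2], signal[..., 1::2]
    return np.concatenate([(even + odd) / 2.0, (even - odd) / 2.0], axis=-1)

# Synthetic "contours": 64 radial samples per shape; abnormal shapes get a
# localized bump, mimicking a localized deformation such as an osteophyte.
n, p = 150, 64
labels = rng.integers(0, 2, n)                 # 0 = normal, 1 = abnormal
contours = rng.normal(0.0, 0.1, (n, p)) + 1.0
bump = np.exp(-0.5 * ((np.arange(p) - 10) / 2.0) ** 2)
contours[labels == 1] += 0.8 * bump            # localized deformation

coeffs = haar_level1(contours)                 # sparse wavelet features
components = FastICA(n_components=8, random_state=0).fit_transform(coeffs)
clf = SVC(kernel="rbf").fit(components, labels)
acc = clf.score(components, labels)            # training accuracy
```

Because the deformation is localized, it concentrates in a few wavelet coefficients, which is what makes the ICA representation sparse and interpretable.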
Investigation and Development of Data-Driven D-Region Model for HF Systems Impacts
NASA Technical Reports Server (NTRS)
Eccles, J. V.; Rice, D.; Sojka, J. J.; Hunsucker, R. D.
2002-01-01
Space Environment Corporation (SEC) and RP Consultants (RPC) are to develop and validate a weather-capable D-region model for making High Frequency (HF) absorption predictions in support of the HF communications and radar communities. The weather-capable model will assimilate solar and earth space observations from NASA satellites. The model will account for solar-induced impacts on HF absorption, including X-rays, Solar Proton Events (SPEs), and auroral precipitation. The work plan includes: 1. Optimize D-region model to quickly obtain ion and electron densities for proper HF absorption calculations. 2. Develop indices-driven modules for D-region ionization sources for low, mid, & high latitudes including X-rays, cosmic rays, auroral precipitation, & solar protons. (Note: solar spectrum & auroral modules already exist). 3. Set up low-cost monitors of existing HF beacons and add one single-frequency beacon. 4. Use PENEX HF-link database with HF monitor data to validate D-region/HF absorption model using climatological ionization drivers. 5. Develop algorithms to assimilate NASA satellite data of solar, interplanetary, and auroral observations into ionization source modules. 6. Use PENEX HF-link & HF-beacon data for skill score comparison of assimilation versus climatological D-region/HF absorption model. Only some satellites are available for the PENEX time period; thus, HF-beacon data is necessary. 7. Use HF beacon monitors to develop HF-link data assimilation algorithms for regional improvement to the D-region/HF absorption model.
Automation life-cycle cost model
NASA Technical Reports Server (NTRS)
Gathmann, Thomas P.; Reeves, Arlinda J.; Cline, Rick; Henrion, Max; Ruokangas, Corinne
1992-01-01
The problem domain being addressed by this contractual effort can be summarized by the following list: Automation and Robotics (A&R) technologies appear to be viable alternatives to current, manual operations; Life-cycle cost models are typically judged with suspicion due to implicit assumptions and little associated documentation; and Uncertainty is a reality for increasingly complex problems and few models explicitly account for its effect on the solution space. The objectives for this effort range from the near-term (1-2 years) to far-term (3-5 years). In the near-term, the envisioned capabilities of the modeling tool are annotated. In addition, a framework is defined and developed in the Decision Modelling System (DEMOS) environment. Our approach is summarized as follows: Assess desirable capabilities (structure into near- and far-term); Identify useful existing models/data; Identify parameters for utility analysis; Define tool framework; Encode scenario thread for model validation; and Provide transition path for tool development. This report contains all relevant, technical progress made on this contractual effort.
Distributed generation capabilities of the national energy modeling system
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaCommare, Kristina Hamachi; Edwards, Jennifer L.; Marnay, Chris
2003-01-01
This report describes Berkeley Lab's exploration of how the National Energy Modeling System (NEMS) models distributed generation (DG) and presents possible approaches for improving how DG is modeled. The on-site electric generation capability has been available since the AEO2000 version of NEMS. Berkeley Lab has previously completed research on distributed energy resources (DER) adoption at individual sites and has developed a DER Customer Adoption Model called DER-CAM. Given interest in this area, Berkeley Lab set out to understand how NEMS models small-scale on-site generation to assess how adequately DG is treated in NEMS, and to propose improvements or alternatives. The goal is to determine how well NEMS models the factors influencing DG adoption and to consider alternatives to the current approach. Most small-scale DG adoption takes place in the residential and commercial modules of NEMS. Investment in DG ultimately offsets purchases of electricity, which also eliminates the losses associated with transmission and distribution (T&D). If the DG technology that is chosen is photovoltaics (PV), NEMS assumes renewable energy consumption replaces the energy input to electric generators. If the DG technology is fuel consuming, consumption of fuel in the electric utility sector is replaced by residential or commercial fuel consumption. The waste heat generated from thermal technologies can be used to offset the water heating and space heating energy uses, but there is no thermally activated cooling capability. This study consists of a review of model documentation and a paper by EIA staff, a series of sensitivity runs performed by Berkeley Lab that exercise selected DG parameters in the AEO2002 version of NEMS, and a scoping effort of possible enhancements and alternatives to NEMS's current DG capabilities. In general, the treatment of DG in NEMS is rudimentary.
The penetration of DG is determined by an economic cash-flow analysis that determines adoption based on the number of years to a positive cash flow. Some important technologies, e.g. thermally activated cooling, are absent, and ceilings on DG adoption are determined by somewhat arbitrary caps on the number of buildings that can adopt DG. These caps are particularly severe for existing buildings, where the maximum penetration for any one technology is 0.25 percent. On the other hand, competition among technologies is not fully considered, and this may result in double-counting for certain applications. A series of sensitivity runs show greater penetration with net metering enhancements and aggressive tax credits and a more limited response to lowered DG technology costs. Discussion of alternatives to the current code is presented in Section 4. Alternatives or improvements to how DG is modeled in NEMS cover three basic areas: expanding the existing total market for DG, both by changing existing parameters in NEMS and by adding new capabilities, such as for missing technologies; enhancing the cash flow analysis by incorporating aspects of DG economics that are not currently represented, e.g. complex tariffs; and using an external geographic information system (GIS) driven analysis that can better and more intuitively identify niche markets.
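The years-to-positive-cash-flow adoption test described in this abstract can be sketched as follows. The dollar figures and the adoption threshold are illustrative assumptions, not NEMS parameters.

```python
# Illustrative sketch of a cash-flow adoption test: DG is adopted if
# cumulative cash flow turns positive within a threshold number of years.

def years_to_positive_cash_flow(capital_cost, annual_savings, annual_om_cost):
    """Smallest whole year in which cumulative cash flow becomes positive,
    or None if it never does within 30 years."""
    net = annual_savings - annual_om_cost
    cumulative = -capital_cost
    for year in range(1, 31):
        cumulative += net
        if cumulative > 0:
            return year
    return None

def adopts(capital_cost, annual_savings, annual_om_cost, threshold_years=10):
    y = years_to_positive_cash_flow(capital_cost, annual_savings, annual_om_cost)
    return y is not None and y <= threshold_years

# Hypothetical example: $6000 system, $900/yr bill savings, $100/yr O&M
print(adopts(6000.0, 900.0, 100.0))  # 8-year payback -> True
```

A real treatment would discount future cash flows and model tariffs; the abstract's point is that even this simple test, combined with building-count caps, governs penetration in NEMS.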
Fontaine, Joseph J.; Jorgensen, Christopher; Stuber, Erica F.; Gruber, Lutz F.; Bishop, Andrew A.; Lusk, Jeffrey J.; Zach, Eric S.; Decker, Karie L.
2017-01-01
We know economic and social policy has implications for ecosystems at large, but the consequences for a given geographic area or specific wildlife population are more difficult to conceptualize and communicate. Species distribution models, which extrapolate species-habitat relationships across ecological scales, are capable of predicting population changes in distribution and abundance in response to management and policy, and thus, are an ideal means for facilitating proactive management within a larger policy framework. To illustrate the capabilities of species distribution modeling in scenario planning for wildlife populations, we projected an existing distribution model for ring-necked pheasants (Phasianus colchicus) onto a series of alternative future landscape scenarios for Nebraska, USA. Based on our scenarios, we qualitatively and quantitatively estimated the effects of agricultural policy decisions on pheasant populations across Nebraska, in specific management regions, and at wildlife management areas.
A Power Efficient Exaflop Computer Design for Global Cloud System Resolving Climate Models.
NASA Astrophysics Data System (ADS)
Wehner, M. F.; Oliker, L.; Shalf, J.
2008-12-01
Exascale computers would allow routine ensemble modeling of the global climate system at the cloud system resolving scale. Power and cost requirements of traditional architecture systems are likely to delay such capability for many years. We present an alternative route to the exascale using embedded processor technology to design a system optimized for ultra high resolution climate modeling. These power efficient processors, used in consumer electronic devices such as mobile phones, portable music players, cameras, etc., can be tailored to the specific needs of scientific computing. We project that a system capable of integrating a kilometer scale climate model a thousand times faster than real time could be designed and built in a five year time scale for US$75M with a power consumption of 3MW. This is cheaper, more power efficient and sooner than any other existing technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Amy Cha-Tien; Downes, Paula Sue; Heinen, Russell
Analysis of chemical supply chains is an inherently complex task, given the dependence of these supply chains on multiple infrastructure systems (e.g., the petroleum sector, transportation, etc.). This effort requires data and information at various levels of resolution, ranging from network-level distribution systems to individual chemical reactions. Sandia National Laboratories (Sandia) has integrated its existing simulation and infrastructure analysis capabilities with chemical data models to analyze the chemical supply chains of several nationally critical chemical commodities. This paper describes how Sandia models the ethylene supply chain, that is, the supply chain for the most widely used raw material for plastics production, including a description of the types of data and modeling capabilities that are required to represent the ethylene supply chain. The paper concludes with a description of Sandia's use of the model to project how the supply chain would be affected by and adapt to a disruptive hurricane scenario.
Yang, Chen-Wei
2015-01-01
The main purpose of this study is to develop an innovation model for hospital organisations. For this purpose, this study explores and examines the determinants, capabilities and performance in the hospital sector. First, it discusses three categories of determinants that affect hospitals' innovative capabilities: (1) knowledge stock; (2) social ties; and (3) institutional pressures. Then, this study examines the idea of innovative hospital capabilities, defined as the ability of the hospital organisation to innovate upon its knowledge. Finally, the hospital evaluation rating, which identifies performance in the hospital sector, was examined. This study empirically tested the theoretical model at the organisation level. The findings suggest that a hospital's innovative capabilities are influenced by its knowledge stock, social ties and institutional pressures, and that these capabilities in turn affect hospital performance. However, in attempts to keep hospitals aligned with their highly institutionalised environments, it may prove necessary for hospital administrators to pay more attention to both existing knowledge stock and the process of innovation if the institutions are to survive. Finally, implications for theory and practitioners complete this study. Copyright © 2014 John Wiley & Sons, Ltd.
Final Report for "Design calculations for high-space-charge beam-to-RF conversion".
DOE Office of Scientific and Technical Information (OSTI.GOV)
David N Smithe
2008-10-17
Accelerator facility upgrades, new accelerator applications, and future design efforts are leading to novel klystron and IOT device concepts, including multiple beam, high-order mode operation, and new geometry configurations of old concepts. At the same time, a new simulation capability, based upon finite-difference “cut-cell” boundaries, has emerged and is transforming the existing modeling and design capability with unparalleled realism, greater flexibility, and improved accuracy. This same new technology can also be brought to bear on a difficult-to-study aspect of the energy recovery linac (ERL), namely the accurate modeling of the exit beam, and design of the beam dump for optimum energy efficiency. We have developed new capability for design calculations and modeling of a broad class of devices which convert bunched beam kinetic energy to RF energy, including RF sources, as for example, klystrons, gyro-klystrons, IOTs, TWTs, and other devices in which space-charge effects are important. Recent advances in geometry representation now permit very accurate representation of the curved metallic surfaces common to RF sources, resulting in unprecedented simulation accuracy. In the Phase I work, we evaluated and demonstrated the capabilities of the new geometry representation technology as applied to modeling and design of output cavity components of klystrons, IOTs, and energy recovery srf cavities. We identified and prioritized which aspects of the design study process to pursue and improve in Phase II. The development and use of the new accurate geometry modeling technology on RF sources for DOE accelerators will help spark a new generation of modeling and design capability, free from many of the constraints and inaccuracy associated with the previous generation of “stair-step” geometry modeling tools.
This new capability is ultimately expected to impact all fields with high power RF sources, including DOE fusion research, communications, radar and other defense applications.
USDA-ARS?s Scientific Manuscript database
Land surface moisture measurements are central to our understanding of the earth’s water system, and are needed to produce accurate model-based weather/climate predictions. Currently, there exists no in-situ network capable of estimating wide-area soil moisture. In this paper, we explore an alterna...
Military Potential Test of the Model PA23-250B Fixed-Wing Instrument Trainer
1964-11-30
cabin heater was installed in the test airplane. Existing climatic conditions precluded actual tests to determine the capability of the heater to... housed within the engine control pedestal under the engine control levers. Hydraulic pressure is supplied to the control unit by an engine-driven
Study of an engine flow diverter system for a large scale ejector powered aircraft model
NASA Technical Reports Server (NTRS)
Springer, R. J.; Langley, B.; Plant, T.; Hunter, L.; Brock, O.
1981-01-01
Requirements were established for a conceptual design study to analyze and design an engine flow diverter system and to include accommodations for an ejector system in an existing 3/4 scale fighter model equipped with YJ-79 engines. Model constraints were identified and cost-effective limited modification was proposed to accept the ejectors, ducting and flow diverter valves. Complete system performance was calculated and a versatile computer program capable of analyzing any ejector system was developed.
Rapid Automated Aircraft Simulation Model Updating from Flight Data
NASA Technical Reports Server (NTRS)
Brian, Geoff; Morelli, Eugene A.
2011-01-01
Techniques to identify aircraft aerodynamic characteristics from flight measurements and compute corrections to an existing simulation model of a research aircraft were investigated. The purpose of the research was to develop a process enabling rapid automated updating of aircraft simulation models using flight data and apply this capability to all flight regimes, including flight envelope extremes. The process presented has the potential to improve the efficiency of envelope expansion flight testing, revision of control system properties, and the development of high-fidelity simulators for pilot training.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, D.R.; Hutchinson, J.L.
Eagle II is a prototype analytic model derived from the integration of the low resolution Eagle model with the high resolution SIMNET model. This integration promises a new capability to allow for a more effective examination of proposed or existing combat systems that could not be easily evaluated using either Eagle or SIMNET alone. In essence, Eagle II becomes a multi-resolution combat model in which simulated combat units can exhibit both high and low fidelity behavior at different times during model execution. This capability allows a unit to behave in a high-fidelity manner only when required, thereby reducing the overall computational and manpower requirements for a given study. In this framework, the SIMNET portion enables a highly credible assessment of the performance of individual combat systems under consideration, encompassing both engineering performance and crew capabilities. However, when the assessment being conducted goes beyond system performance and extends to questions of force structure balance and sustainment, then SIMNET results can be used to "calibrate" the Eagle attrition process appropriate to the study at hand. Advancing technologies, changes in the world-wide threat, requirements for flexible response, declining defense budgets, and down-sizing of military forces motivate the development of manpower-efficient, low-cost, responsive tools for combat development studies. Eagle and SIMNET both serve as credible and useful tools. The integration of these two models promises enhanced capabilities to examine the broader, deeper, more complex battlefield of the future with higher fidelity, greater responsiveness and low overall cost.
Integrated analysis of error detection and recovery
NASA Technical Reports Server (NTRS)
Shin, K. G.; Lee, Y. H.
1985-01-01
An integrated modeling and analysis of error detection and recovery is presented. When fault latency and/or error latency exist, the system may suffer from multiple faults or error propagations which seriously deteriorate the fault-tolerant capability. Several detection models that enable analysis of the effect of detection mechanisms on the subsequent error handling operations and the overall system reliability were developed. Following detection of the faulty unit and reconfiguration of the system, the contaminated processes or tasks have to be recovered. The strategies of error recovery employed depend on the detection mechanisms and the available redundancy. Several recovery methods including the rollback recovery are considered. The recovery overhead is evaluated as an index of the capabilities of the detection and reconfiguration mechanisms.
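The recovery-overhead index this abstract evaluates can be illustrated with a simple back-of-the-envelope model. This is not the paper's formulation; the checkpointing assumption and all numbers are hypothetical.

```python
# Minimal sketch: expected rollback-recovery overhead when an error is
# detected with latency L after the fault, checkpoints are taken every T
# seconds, and restoring a checkpoint costs R seconds. Work done since the
# last checkpoint, plus work done during the latent period, is discarded.

def expected_rollback_overhead(checkpoint_interval, restore_cost, detection_latency):
    # On average the fault strikes mid-interval, so T/2 of useful work is
    # lost; work performed while the error was latent must also be redone.
    return checkpoint_interval / 2.0 + detection_latency + restore_cost

print(expected_rollback_overhead(100.0, 5.0, 2.0))  # 57.0 seconds
```

The sketch makes the abstract's point concrete: longer fault/error latency directly inflates recovery overhead, and can also let errors propagate past the last consistent checkpoint.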
Hormone Purification by Isoelectric Focusing
NASA Technical Reports Server (NTRS)
Bier, M.
1985-01-01
Various ground-based research approaches are being applied to a more definitive evaluation of the natures and degrees of electroosmosis effects on the separation capabilities of the Isoelectric Focusing (IEF) process. A primary instrumental system for this work involves rotationally stabilized, horizontal electrophoretic columns specially adapted for the IEF process. Representative adaptations include segmentation, baffles/screens, and surface coatings. Comparative performance and development testing are pursued against the type of column or cell established as an engineering model. Previously developed computer simulation capabilities are used to predict low-gravity behavior patterns and performance for IEF apparatus geometries of direct project interest. Three existing mathematical models plus potential new routines for particular aspects of simulating instrument fluid patterns with varied wall electroosmosis influences are being exercised.
NASA Technical Reports Server (NTRS)
Haste, Deepak; Ghoshal, Sudipto; Johnson, Stephen B.; Moore, Craig
2018-01-01
This paper describes the theory and considerations in the application of model-based techniques to assimilate information from disjoint knowledge sources for performing NASA's Fault Management (FM)-related activities using the TEAMS® toolset. FM consists of the operational mitigation of existing and impending spacecraft failures. NASA's FM directives have both design-phase and operational-phase goals. This paper highlights recent studies by QSI and DST of the capabilities required in the TEAMS® toolset for conducting FM activities with the aim of reducing operating costs, increasing autonomy, and conforming to time schedules. These studies use and extend the analytic capabilities of QSI's TEAMS® toolset to conduct a range of FM activities within a centralized platform.
Microeconomics of yield learning and process control in semiconductor manufacturing
NASA Astrophysics Data System (ADS)
Monahan, Kevin M.
2003-06-01
Simple microeconomic models that directly link yield learning to profitability in semiconductor manufacturing have been rare or non-existent. In this work, we review such a model and provide links to inspection capability and cost. Using a small number of input parameters, we explain current yield management practices in 200mm factories. The model is then used to extrapolate requirements for 300mm factories, including the impact of technology transitions to 130nm design rules and below. We show that the dramatic increase in value per wafer at the 300mm transition becomes a driver for increasing metrology and inspection capability and sampling. These analyses correlate well with actual factory data and often identify millions of dollars in potential cost savings. We demonstrate this using the example of grating-based overlay metrology for the 65nm node.
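The yield-to-profit link this abstract describes can be sketched with a toy calculation. The functional form and every number below are illustrative assumptions, not the paper's model or data.

```python
# Toy microeconomic link between yield and profit: with value per wafer
# fixed, each point of yield improvement converts directly into revenue,
# which is what justifies added metrology/inspection sampling.

def wafer_profit(die_per_wafer, yield_fraction, price_per_die, wafer_cost):
    """Profit per wafer at a given line yield."""
    return die_per_wafer * yield_fraction * price_per_die - wafer_cost

def monthly_gain(wafer_starts, dy, die_per_wafer, price_per_die):
    """Extra profit per month from a yield gain dy, all else held fixed."""
    return wafer_starts * die_per_wafer * dy * price_per_die

# Hypothetical 300mm line: 20k wafer starts/month, 500 die/wafer, $20/die
print(monthly_gain(20000, 0.01, 500, 20.0))  # $2.0M/month per 1% yield
```

The same arithmetic explains why higher value per wafer at the 300mm transition raises the break-even point for inspection spending.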
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebert, D.
1997-07-01
This is a report on the CSNI Workshop on Transient Thermal-Hydraulic and Neutronic Codes Requirements held at Annapolis, Maryland, USA, November 5-8, 1996. This experts' meeting consisted of 140 participants from 21 countries; 65 invited papers were presented. The meeting was divided into five areas: (1) current and prospective plans of thermal-hydraulic codes development; (2) current and anticipated uses of thermal-hydraulic codes; (3) advances in modeling of thermal-hydraulic phenomena and associated additional experimental needs; (4) numerical methods in multi-phase flows; and (5) programming language, code architectures and user interfaces. The workshop consensus identified the following important action items to be addressed by the international community in order to maintain and improve the calculational capability: (a) preserve current code expertise and institutional memory, (b) preserve the ability to use the existing investment in plant transient analysis codes, (c) maintain essential experimental capabilities, (d) develop advanced measurement capabilities to support future code validation work, (e) integrate existing analytical capabilities so as to improve performance and reduce operating costs, (f) exploit the proven advances in code architecture, numerics, graphical user interfaces, and modularization in order to improve code performance and scrutability, and (g) more effectively utilize user experience in modifying and improving the codes.
Evaluating local indirect addressing in SIMD processors
NASA Technical Reports Server (NTRS)
Middleton, David; Tomboulian, Sherryl
1989-01-01
In the design of parallel computers, there exists a tradeoff between the number and power of individual processors. The single instruction stream, multiple data stream (SIMD) model of parallel computers lies at one extreme of the resulting spectrum. The available hardware resources are devoted to creating the largest possible number of processors, and consequently each individual processor must use the fewest possible resources. Disagreement exists as to whether SIMD processors should be able to generate addresses individually into their local data memory, or all processors should access the same address. The tradeoff is examined between the increased capability and the reduced number of processors that occurs in this single instruction stream, multiple, locally addressed, data (SIMLAD) model. Factors are assembled that affect this design choice, and the SIMLAD model is compared with the bare SIMD and the MIMD models.
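The addressing distinction at the heart of this abstract can be illustrated in a few lines. The array semantics below are an assumed idealization of the two machine models, not taken from the paper.

```python
# Illustrative contrast: in plain SIMD, every processing element (PE) reads
# the SAME broadcast address in its local memory; in the SIMLAD model each
# PE supplies its OWN address into its local memory.
import numpy as np

local_mem = np.arange(12).reshape(4, 3)   # 4 PEs, 3 words of local memory each

# SIMD: one broadcast address shared by all PEs
simd_read = local_mem[:, 1]               # every PE reads its word 1

# SIMLAD: a per-PE address register enables local indirect addressing
addr = np.array([0, 2, 1, 0])
simlad_read = local_mem[np.arange(4), addr]
```

Per-PE addressing enables table lookups and irregular data structures at the cost of extra address hardware in every processor, which is exactly the tradeoff the paper weighs against processor count.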
Above the cloud computing orbital services distributed data model
NASA Astrophysics Data System (ADS)
Straub, Jeremy
2014-05-01
Technology miniaturization and system architecture advancements have created an opportunity to significantly lower the cost of many types of space missions by sharing capabilities between multiple spacecraft. Historically, most spacecraft have been atomic entities that (aside from their communications with and tasking by ground controllers) operate in isolation. Several notable examples exist; however, these are purpose-designed systems that collaborate to perform a single goal. The above the cloud computing (ATCC) concept aims to create ad-hoc collaboration between service provider and consumer craft. Consumer craft can procure processing, data transmission, storage, imaging and other capabilities from provider craft. Because of onboard storage limitations, communications link capability limitations and limited windows of communication, data relevant to or required for various operations may span multiple craft. This paper presents a model for the identification, storage and accessing of this data. This model includes appropriate identification features for this highly distributed environment. It also deals with business model constraints such as data ownership, retention and the rights of the storing craft to access, resell, transmit or discard the data in its possession. The model ensures data integrity and confidentiality (to the extent applicable to a given data item), deals with unique constraints of the orbital environment and tags data with business model (contractual) obligation data.
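The data-tagging idea this abstract describes can be sketched as a record type. All field names and rights below are illustrative assumptions, not the paper's actual schema.

```python
# Minimal sketch of a distributed data item tagged with identification,
# integrity, and business-model (contractual) obligation metadata.
from dataclasses import dataclass, field

@dataclass
class OrbitalDataItem:
    item_id: str                  # globally unique across the constellation
    origin_craft: str             # craft that produced the data
    owner: str                    # business-model owner of the data
    integrity_hash: str           # supports end-to-end integrity checks
    confidential: bool = False
    retention_days: int = 30
    storing_craft_rights: set = field(default_factory=lambda: {"access"})

    def may(self, right: str) -> bool:
        """May the storing craft exercise this right (e.g. resell,
        transmit, discard) on the data in its possession?"""
        return right in self.storing_craft_rights

item = OrbitalDataItem("img-0042", "sat-A", "operator-1", "abc123",
                       storing_craft_rights={"access", "transmit"})
print(item.may("resell"))  # False: not granted by the contract tags
```

Carrying the obligation tags with each item lets a storing craft enforce ownership and retention rules locally, without contacting the owner during a limited communication window.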
Modular GIS Framework for National Scale Hydrologic and Hydraulic Modeling Support
NASA Astrophysics Data System (ADS)
Djokic, D.; Noman, N.; Kopp, S.
2015-12-01
Geographic information systems (GIS) have been extensively used for pre- and post-processing of hydrologic and hydraulic models at multiple scales. An extensible GIS-based framework was developed for characterization of drainage systems (stream networks, catchments, floodplain characteristics) and model integration. The framework is implemented as a set of free, open source, Python tools and builds on core ArcGIS functionality and uses geoprocessing capabilities to ensure extensibility. Utilization of COTS GIS core capabilities allows immediate use of model results in a variety of existing online applications and integration with other data sources and applications. The poster presents the use of this framework to downscale global hydrologic models to local hydraulic scale and post process the hydraulic modeling results and generate floodplains at any local resolution. Flow forecasts from ECMWF or WRF-Hydro are downscaled and combined with other ancillary data for input into the RAPID flood routing model. RAPID model results (stream flow along each reach) are ingested into a GIS-based scale dependent stream network database for efficient flow utilization and visualization over space and time. Once the flows are known at localized reaches, the tools can be used to derive the floodplain depth and extent for each time step in the forecast at any available local resolution. If existing rating curves are available they can be used to relate the flow to the depth of flooding, or synthetic rating curves can be derived using the tools in the toolkit and some ancillary data/assumptions. The results can be published as time-enabled spatial services to be consumed by web applications that use floodplain information as an input. Some of the existing online presentation templates can be easily combined with available online demographic and infrastructure data to present the impact of the potential floods on the local community through simple, end user products.
This framework has been successfully used both in data-rich environments and in locales with minimal available spatial and hydrographic data.
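The synthetic-rating-curve step mentioned in the abstract can be sketched as a power-law relation between discharge and depth. The coefficients below are hypothetical placeholders; real values come from channel geometry or measured ratings.

```python
# Sketch: where no measured rating curve exists, relate discharge Q to
# flow depth with a power law h = a * Q**b, then map forecast flows to
# depths reach by reach, one value per forecast time step.

def depth_from_flow(q_m3s, a=0.3, b=0.4):
    """Synthetic rating curve: flow depth (m) for discharge Q (m^3/s)."""
    return a * q_m3s ** b

# Depths for one reach across three forecast time steps
flows = [5.0, 50.0, 500.0]
depths = [round(depth_from_flow(q), 2) for q in flows]
```

Each depth is then intersected with local terrain to delineate the floodplain extent for that time step at whatever resolution the elevation data supports.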
NASA Technical Reports Server (NTRS)
Davis, George; Cary, Everett; Higinbotham, John; Burns, Richard; Hogie, Keith; Hallahan, Francis
2003-01-01
The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.
A Distributed Simulation Software System for Multi-Spacecraft Missions
NASA Technical Reports Server (NTRS)
Burns, Richard; Davis, George; Cary, Everett
2003-01-01
The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.
Implementation of Flow Tripping Capability in the USM3D Unstructured Flow Solver
NASA Technical Reports Server (NTRS)
Pandya, Mohagna J.; Abdol-Hamid, Khaled S.; Campbell, Richard L.; Frink, Neal T.
2006-01-01
A flow tripping capability is added to an established NASA tetrahedral unstructured parallel Navier-Stokes flow solver, USM3D. The capability is based on prescribing an appropriate profile of turbulence model variables to energize the boundary layer in a plane normal to a specified trip region on the body surface. We demonstrate this approach using the k-epsilon two-equation turbulence model of USM3D. Modification to the solution procedure primarily consists of developing a data structure to identify all unstructured tetrahedral grid cells located in the plane normal to a specified surface trip region and computing a function based on the mean flow solution to specify the modified profile of the turbulence model variables. We leverage this data structure and also show an adjunct approach that is based on enforcing a laminar flow condition on the otherwise fully turbulent flow solution in a user-specified region. The latter approach is applied for the solutions obtained using other one- and two-equation turbulence models of USM3D. A key ingredient of the present capability is the use of a graphical user-interface tool PREDISC to define a trip region on the body surface in an existing grid. Verification of the present modifications is demonstrated on three cases, namely, a flat plate, the RAE2822 airfoil, and the DLR F6 wing-fuselage configuration.
ALGE3D: A Three-Dimensional Transport Model
NASA Astrophysics Data System (ADS)
Maze, G. M.
2017-12-01
Of the top 10 most populated US cities from a 2015 US Census Bureau estimate, 7 of the cities are situated near the ocean, a bay, or on one of the Great Lakes. A contamination of the water ways in the United States could be devastating to the economy (through tourism and industries such as fishing), public health (from direct contact, or contaminated drinking water), and in some cases even infrastructure (water treatment plants). Current national response models employed by emergency response agencies have well developed models to simulate the effects of hazardous contaminants in riverine systems that are primarily driven by one-dimensional flows; however in more complex systems, such as tidal estuaries, bays, or lakes, a more complex model is needed. While many models exist, none are capable of quick deployment in emergency situations that could contain a variety of release situations including a mixture of both particulate and dissolved chemicals in a complex flow area. ALGE3D, developed at the Department of Energy's (DOE) Savannah River National Laboratory (SRNL), is a three-dimensional hydrodynamic code which solves the momentum, mass, and energy conservation equations to predict the movement and dissipation of thermal or dissolved chemical plumes discharged into cooling lakes, rivers, and estuaries. ALGE3D is capable of modeling very complex flows, including areas with tidal flows which include wetting and drying of land. Recent upgrades have increased the capabilities including the transport of particulate tracers, allowing for more complete modeling of the transport of pollutants. In addition the model is capable of coupling with a one-dimension riverine transport model or a two-dimension atmospheric deposition model in the event that a contamination event occurs upstream or upwind of the water body.
New Ground Truth Capability from InSAR Time Series Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckley, S; Vincent, P; Yang, D
2005-07-13
We demonstrate that next-generation interferometric synthetic aperture radar (InSAR) processing techniques applied to existing data provide rich InSAR ground truth content for exploitation in seismic source identification. InSAR time series analyses utilize tens of interferograms and can be implemented in different ways. In one such approach, conventional InSAR displacement maps are inverted in a final post-processing step. Alternatively, computationally intensive data reduction can be performed with specialized InSAR processing algorithms. The typical final result of these approaches is a synthesized set of cumulative displacement maps. Examples from our recent work demonstrate that these InSAR processing techniques can provide appealing new ground truth capabilities. We construct movies showing the areal and temporal evolution of deformation associated with previous nuclear tests. In other analyses, we extract time histories of centimeter-scale surface displacement associated with tunneling. The potential exists to identify millimeter per year surface movements when sufficient data exist for InSAR techniques to isolate and remove phase signatures associated with digital elevation model errors and the atmosphere.
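The post-processing inversion mentioned above, in which a stack of pairwise interferograms is reduced to a cumulative displacement history, can be sketched as a small least-squares problem. The following single-pixel illustration uses synthetic data; the function name and date pairs are ours, not from any particular InSAR package:

```python
import numpy as np

def invert_time_series(pairs, dphi, n_dates):
    """Least-squares inversion of pairwise interferograms for one pixel.

    pairs   : (i, j) acquisition-date indices (i < j) for each interferogram
    dphi    : unwrapped displacement of each interferogram (date j minus date i)
    n_dates : number of acquisitions; date 0 is the zero-displacement reference
    Returns the cumulative displacement at every date.
    """
    A = np.zeros((len(pairs), n_dates - 1))
    for row, (i, j) in enumerate(pairs):
        A[row, j - 1] = 1.0       # later date enters with +1
        if i > 0:
            A[row, i - 1] = -1.0  # earlier date enters with -1
    d, *_ = np.linalg.lstsq(A, np.asarray(dphi, float), rcond=None)
    return np.concatenate(([0.0], d))

# Synthetic check: steady subsidence of 1 unit per epoch over 4 dates.
true_disp = np.array([0.0, 1.0, 2.0, 3.0])
pairs = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
obs = [true_disp[j] - true_disp[i] for i, j in pairs]
est = invert_time_series(pairs, obs, 4)
```

With more interferograms than dates, the redundancy averages down noise, which is one reason time series methods outperform single-interferogram analysis.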
Boundary cooled rocket engines for space storable propellants
NASA Technical Reports Server (NTRS)
Kesselring, R. C.; Mcfarland, B. L.; Knight, R. M.; Gurnitz, R. N.
1972-01-01
An evaluation of an existing analytical heat transfer model was made to extend the technology of boundary film/conduction cooled rocket thrust chambers to the space storable propellant combination oxygen difluoride/diborane. Critical design parameters were identified and their importance determined. Test reduction methods were developed to enable data obtained from short duration hot firings with a thin-walled (calorimeter) chamber to be used to quantitatively evaluate the heat absorbing capability of the vapor film. The modification of the existing like-doublet injector was based on the results obtained from the calorimeter firings.
NASA Astrophysics Data System (ADS)
Floyd, I. E.; Downer, C. W.; Brown, G.; Pradhan, N. R.
2017-12-01
The Gridded Surface Subsurface Hydrologic Analysis (GSSHA) model is the US Army Corps of Engineers' (USACE's) only fully coupled overland/in-stream sediment transport model. While the overland sediment transport formulation in GSSHA is considered state of the art, the existing in-stream sediment transport formulation is less robust. A major omission in the formulation of the existing GSSHA in-stream model is the lack of in-stream sources of fine materials. In this effort, we enhanced the in-stream sediment transport capability of GSSHA by linking GSSHA to the SEDLIB sediment transport library. SEDLIB was developed at the Coastal and Hydraulics Laboratory (CHL) under the System Wide Water Resources Program (SWWRP) and Flood and Coastal (F&C) research program. It is designed to provide a library of sediment flux formulations for hydraulic and hydrologic models, such as GSSHA. This new version of GSSHA, with the updated in-stream sediment transport simulation capability afforded by the linkage to SEDLIB, was tested against observations in an experimental watershed that had previously been used as a test bed for GSSHA. The results show a significant improvement in the ability to model in-stream sources of fine sediment. This improved capability will broaden the applicability of GSSHA to larger watersheds and watersheds with complex sediment dynamics, such as those subjected to fire hydrology.
Web-based applications for building, managing and analysing kinetic models of biological systems.
Lee, Dong-Yup; Saha, Rajib; Yusufi, Faraaz Noor Khan; Park, Wonjun; Karimi, Iftekhar A
2009-01-01
Mathematical modelling and computational analysis play an essential role in improving our capability to elucidate the functions and characteristics of complex biological systems such as metabolic, regulatory and cell signalling pathways. The modelling and concomitant simulation render it possible to predict the cellular behaviour of systems under various genetically and/or environmentally perturbed conditions. This motivates systems biologists/bioengineers/bioinformaticians to develop new tools and applications, allowing non-experts to easily conduct such modelling and analysis. However, among a multitude of systems biology tools developed to date, only a handful of projects have adopted a web-based approach to kinetic modelling. In this report, we evaluate the capabilities and characteristics of current web-based tools in systems biology and identify desirable features, limitations and bottlenecks for further improvements in terms of usability and functionality. A short discussion on software architecture issues involved in web-based applications and the approaches taken by existing tools is included for those interested in developing their own simulation applications.
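As a minimal illustration of the kind of kinetic model such web-based tools build and simulate, consider a single Michaelis-Menten reaction integrated with a forward Euler step. The rate law is standard; the parameter values and function names below are arbitrary placeholders:

```python
def michaelis_menten_rate(s, vmax, km):
    """Michaelis-Menten rate law: v = vmax * s / (km + s)."""
    return vmax * s / (km + s)

def simulate_substrate(s0, vmax, km, dt, steps):
    """Forward-Euler integration of ds/dt = -v(s) for a single substrate."""
    s = s0
    trajectory = [s]
    for _ in range(steps):
        s += -michaelis_menten_rate(s, vmax, km) * dt
        trajectory.append(s)
    return trajectory
```

Real kinetic modelling tools assemble many such rate laws into coupled ODE systems and use stiff solvers rather than explicit Euler, but the building block is the same.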
A Generalized Framework for Modeling Next Generation 911 Implementations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelic, Andjelka; Aamir, Munaf Syed
This document summarizes the current state of Sandia 911 modeling capabilities and then addresses key aspects of Next Generation 911 (NG911) architectures for expansion of existing models. Analysis of three NG911 implementations was used to inform heuristics, associated key data requirements, and assumptions needed to capture NG911 architectures in the existing models. Modeling of NG911 necessitates careful consideration of its complexity and the diversity of implementations. Draft heuristics for constructing NG911 models are presented based on the analysis, along with a summary of current challenges and ways to improve future NG911 modeling efforts. We found that NG911 relies on Enhanced 911 (E911) assets such as 911 selective routers to route calls originating from traditional telephony service, which are a majority of 911 calls. We also found that the diversity and transitional nature of NG911 implementations necessitates significant and frequent data collection to ensure that adequate models are available for crisis action support.
Ainong Li; Chengquan Huang; Guoqing Sun; Hua Shi; Chris Toney; Zhiliang Zhu; Matthew G. Rollins; Samuel N. Goward; Jeffrey G. Masek
2011-01-01
Many forestry and earth science applications require spatially detailed forest height data sets. Among the various remote sensing technologies, lidar offers the most potential for obtaining reliable height measurement. However, existing and planned spaceborne lidar systems do not have the capability to produce spatially contiguous, fine resolution forest height maps...
ERIC Educational Resources Information Center
Jones, Sandra; Lefoe, Geraldine; Harvey, Marina; Ryland, Kevin
2012-01-01
New models of leadership are needed for the higher education sector to continue to graduate students with leading edge capabilities. While multiple theories of leadership exist, the higher education sector requires a less hierarchical approach that takes account of its specialised and professional context. Over the last decade the sector has…
Requirements Modeling with Agent Programming
NASA Astrophysics Data System (ADS)
Dasgupta, Aniruddha; Krishna, Aneesh; Ghose, Aditya K.
Agent-oriented conceptual modeling notations are highly effective in representing requirements from an intentional stance and answering questions such as what goals exist, how key actors depend on each other, and what alternatives must be considered. In this chapter, we review an approach to executing i* models by translating these into a set of interacting agents implemented in the CASO language, and suggest how we can perform reasoning with requirements (both functional and non-functional) modeled using i* models. We particularly incorporate deliberation into the agent design, which allows us to benefit from the complementary representational capabilities of the two frameworks.
Challenges in Developing Models Describing Complex Soil Systems
NASA Astrophysics Data System (ADS)
Simunek, J.; Jacques, D.
2014-12-01
Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools for integrating our understanding of complex soil systems, and the soil science community has often called for models that would include a large number of these diverse processes. However, once attempts have been made to develop such models, the response from the community has not always been overwhelming, especially once it was discovered that these models are consequently highly complex. Such models require a large number of parameters, not all of which can be easily (or at all) measured and/or identified, and which are often associated with large uncertainties; they also require from their users deep knowledge of all or most of the implemented physical, mechanical, chemical, and biological processes. Real, or perceived, complexity of these models then discourages users from applying them even for relatively simple applications, for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code inter-comparison, likely the most suitable method to assess code capabilities and model performance, requires the existence of multiple models of similar or overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models remain an underutilized tool in soil science research. We will demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.
Rail Inspection Systems Analysis and Technology Survey
DOT National Transportation Integrated Search
1977-09-01
The study was undertaken to identify existing rail inspection system capabilities and methods which might be used to improve these capabilities. Task I was a study to quantify existing inspection parameters and Task II was a cost effectiveness study ...
NASA Astrophysics Data System (ADS)
Gunawan, D.; Amalia, A.; Rahmat, R. F.; Muchtar, M. A.; Siregar, I.
2018-02-01
Identification of software maturity level is a technique to determine the quality of software. By identifying the software maturity level, the weaknesses of the software can be observed. As a result, the recommendations might serve as a reference for future software maintenance and development. This paper discusses the software Capability Level (CL) with case studies on the Quality Management Unit (Unit Manajemen Mutu) of the University of Sumatera Utara (UMM-USU). This research utilized the Standard CMMI Appraisal Method for Process Improvement class C (SCAMPI C) model with continuous representation. This model focuses on activities for developing quality products and services. The observation is done in three process areas: Project Planning (PP), Project Monitoring and Control (PMC), and Requirements Management (REQM). According to the measurement of software capability level for the UMM-USU software, it turns out that the capability level for the observed process areas is in the range of CL1 and CL2. Project Planning (PP) is the only process area which reaches capability level 2; PMC and REQM are still at CL1, the performed level. This research reveals several weaknesses of the existing UMM-USU software and therefore proposes several recommendations for UMM-USU to improve the capability level for the observed process areas.
Additions and improvements to the high energy density physics capabilities in the FLASH code
NASA Astrophysics Data System (ADS)
Lamb, D.; Bogale, A.; Feister, S.; Flocke, N.; Graziani, C.; Khiar, B.; Laune, J.; Tzeferacos, P.; Walker, C.; Weide, K.
2017-10-01
FLASH is an open-source, finite-volume Eulerian, spatially-adaptive radiation magnetohydrodynamics code that has the capabilities to treat a broad range of physical processes. FLASH performs well on a wide range of computer architectures, and has a broad user base. Extensive high energy density physics (HEDP) capabilities exist in FLASH, which make it a powerful open toolset for the academic HEDP community. We summarize these capabilities, emphasizing recent additions and improvements. We describe several non-ideal MHD capabilities that are being added to FLASH, including the Hall and Nernst effects, implicit resistivity, and a circuit model, which will allow modeling of Z-pinch experiments. We showcase the ability of FLASH to simulate Thomson scattering polarimetry, which measures Faraday rotation due to the presence of magnetic fields, as well as proton radiography, proton self-emission, and Thomson scattering diagnostics. Finally, we describe several collaborations with the academic HEDP community in which FLASH simulations were used to design and interpret HEDP experiments. This work was supported in part at U. Chicago by DOE NNSA ASC through the Argonne Institute for Computing in Science under FWP 57789; DOE NNSA under NLUF Grant DE-NA0002724; DOE SC OFES Grant DE-SC0016566; and NSF Grant PHY-1619573.
Development Of A Data Assimilation Capability For RAPID
NASA Astrophysics Data System (ADS)
Emery, C. M.; David, C. H.; Turmon, M.; Hobbs, J.; Allen, G. H.; Famiglietti, J. S.
2017-12-01
The global decline of in situ observations associated with the increasing ability to monitor surface water from space motivates the creation of data assimilation algorithms that merge computer models and space-based observations to produce consistent estimates of terrestrial hydrology that fill the spatiotemporal gaps in observations. RAPID is a routing model based on the Muskingum method that is capable of estimating river streamflow over large scales with a relatively short computing time. This model only requires limited inputs: a reach-based river network, and lateral surface and subsurface flow into the rivers. The relatively simple model physics imply that RAPID simulations could be significantly improved by including a data assimilation capability. Here we present the early developments of such a data assimilation approach in RAPID. Given the linear and matrix-based structure of the model, we chose to apply a direct Kalman filter, hence allowing for the preservation of high computational speed. We correct the simulated streamflows by assimilating streamflow observations, and our early results demonstrate the feasibility of the approach. Additionally, the limited coverage of in situ gauges at continental scales motivates the application of our new data assimilation scheme to altimetry measurements from existing (e.g. EnviSat, Jason 2) and upcoming satellite missions (e.g. SWOT), and ultimately applying the scheme globally.
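The structure of such a scheme can be illustrated with a scalar sketch: Muskingum routing for a single reach, with a Kalman-style update applied wherever an outflow observation is available. RAPID itself operates on matrices over an entire river network; the coefficients and error variances below are illustrative assumptions only:

```python
def muskingum_coeffs(K, X, dt):
    """Standard Muskingum routing coefficients (K and dt in consistent units)."""
    D = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / D
    c1 = (dt + 2.0 * K * X) / D
    c2 = (2.0 * K * (1.0 - X) - dt) / D
    return c0, c1, c2

def route_with_updates(inflow, obs, K=2.0, X=0.2, dt=1.0,
                       var_model=4.0, var_obs=1.0):
    """Route a single reach, applying a scalar Kalman update at observed steps.

    obs[t] is an observed outflow or None; var_model and var_obs are
    assumed (illustrative) forecast and observation error variances.
    """
    c0, c1, c2 = muskingum_coeffs(K, X, dt)
    out = [inflow[0]]  # assume the reach starts in steady state
    for t in range(1, len(inflow)):
        forecast = c0 * inflow[t] + c1 * inflow[t - 1] + c2 * out[-1]
        if obs[t] is not None:
            gain = var_model / (var_model + var_obs)   # scalar Kalman gain
            forecast += gain * (obs[t] - forecast)     # analysis update
        out.append(forecast)
    return out
```

Note that the three routing coefficients always sum to one, so a steady inflow is routed unchanged; the Kalman step simply pulls the forecast toward each observation in proportion to the relative error variances.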
Sensor Management for Applied Research Technologies (SMART)-On Demand Modeling (ODM) Project
NASA Technical Reports Server (NTRS)
Goodman, M.; Blakeslee, R.; Hood, R.; Jedlovec, G.; Botts, M.; Li, X.
2006-01-01
NASA requires timely on-demand data and analysis capabilities to enable practical benefits of Earth science observations. However, a significant challenge exists in accessing and integrating data from multiple sensors or platforms to address Earth science problems because of the large data volumes, varying sensor scan characteristics, unique orbital coverage, and the steep learning curve associated with each sensor and data type. The development of sensor web capabilities to autonomously process these data streams (whether real-time or archived) provides an opportunity to overcome these obstacles and facilitate the integration and synthesis of Earth science data and weather model output. A three year project, entitled Sensor Management for Applied Research Technologies (SMART) - On Demand Modeling (ODM), will develop and demonstrate the readiness of Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) capabilities that integrate both Earth observations and forecast model output into new data acquisition and assimilation strategies. The advancement of SWE-enabled systems (i.e., use of SensorML, sensor planning services - SPS, sensor observation services - SOS, sensor alert services - SAS and common observation model protocols) will have practical and efficient uses in the Earth science community for enhanced data set generation, real-time data assimilation with operational applications, and for autonomous sensor tasking for unique data collection.
The flow of power law fluids in elastic networks and porous media.
Sochi, Taha
2016-02-01
The flow of power law fluids, which include shear-thinning and shear-thickening fluids as well as Newtonian fluids as a special case, in networks of interconnected elastic tubes is investigated using a residual-based pore scale network modeling method with the employment of newly derived formulae. Two relations describing the mechanical interaction between the local pressure and local cross-sectional area in distensible tubes of elastic nature are considered in the derivation of these formulae. The model can be used to describe shear dependent flows of mainly viscous nature. The behavior of the proposed model is vindicated by several tests in a number of special and limiting cases where the results can be verified quantitatively or qualitatively. The model, which is the first of its kind, incorporates more than one major nonlinearity corresponding to the fluid rheology and conduit mechanical properties, that is, non-Newtonian effects and tube distensibility. The formulation, implementation, and performance indicate that the model enjoys certain advantages over the existing models such as being exact within the restricting assumptions on which the model is based, easy implementation, low computational costs, reliability, and smooth convergence. The proposed model can, therefore, be used as an alternative to the existing Newtonian distensible models; moreover, it stretches the capabilities of the existing modeling approaches to reach non-Newtonian rheologies.
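For reference, the volumetric flow rate of a power-law fluid through a single rigid tube follows from a generalized Poiseuille analysis; in the elastic-network model above the cross-sectional area additionally depends on local pressure. A sketch of the standard rigid-tube relation, which reduces to the Hagen-Poiseuille law when the flow behavior index is one:

```python
import math

def power_law_tube_flow(dP, L, R, k, n):
    """Volumetric flow rate of a power-law fluid in a rigid cylindrical tube.

    dP : pressure drop, L : tube length, R : tube radius
    k  : consistency coefficient, n : flow behavior index
         (n < 1 shear thinning, n = 1 Newtonian, n > 1 shear thickening)
    """
    return (math.pi * n * R**3 / (3.0 * n + 1.0)) * (R * dP / (2.0 * k * L)) ** (1.0 / n)
```

A network solver applies this relation on every edge and iterates the node pressures until mass is conserved at each junction, which is essentially what a residual-based pore scale method does.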
NASA Astrophysics Data System (ADS)
Bird, Adam; Murphy, Christophe; Dobson, Geoff
2017-09-01
RANKERN 16 is the latest version of the point-kernel gamma radiation transport Monte Carlo code from AMEC Foster Wheeler's ANSWERS Software Service. RANKERN is well established in the UK shielding community for radiation shielding and dosimetry assessments. Many important developments have been made available to users in this latest release of RANKERN. The existing general 3D geometry capability has been extended to include import of CAD files in the IGES format, providing efficient full CAD modelling capability without geometric approximation. Import of tetrahedral mesh and polygon surface formats has also been provided. An efficient voxel geometry type has been added suitable for representing CT data. There have been numerous input syntax enhancements and an extended actinide gamma source library. This paper describes some of the new features and compares the performance of the new geometry capabilities.
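The point-kernel method at the core of such codes attenuates each source point's flux exponentially along the line of sight and scales the scattered contribution with a buildup factor. A toy sketch of the kernel for a single point source; the linear buildup form used here is a simple stand-in for the tabulated, material-dependent factors a production code uses:

```python
import math

def point_kernel_flux(S, mu, r, buildup=True):
    """Photon flux at distance r from an isotropic point source.

    S  : source strength (photons/s)
    mu : linear attenuation coefficient of the shield (1/cm)
    r  : source-to-detector distance (cm)
    The buildup factor B = 1 + mu*r is an illustrative placeholder.
    """
    phi = S * math.exp(-mu * r) / (4.0 * math.pi * r ** 2)
    if buildup:
        phi *= 1.0 + mu * r
    return phi
```

A full code integrates this kernel over distributed sources and sums over geometry regions with different attenuation coefficients, which is why CAD-quality geometry import matters.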
The Transfer Function Model as a Tool to Study and Describe Space Weather Phenomena
NASA Technical Reports Server (NTRS)
Porter, Hayden S.; Mayr, Hans G.; Bhartia, P. K. (Technical Monitor)
2001-01-01
The Transfer Function Model (TFM) is a semi-analytical, linear model that is designed especially to describe thermospheric perturbations associated with magnetic storms and substorm activity. It is a multi-constituent model (N2, O, He, H, Ar) that accounts for wind induced diffusion, which significantly affects not only the composition and mass density but also the temperature and wind fields. Because the TFM adopts a semi-analytic approach in which the geometry and temporal dependencies of the driving sources are removed through the use of height-integrated Green's functions, it provides physical insight into the essential properties of processes being considered, which are uncluttered by the accidental complexities that arise from particular source geometries and time dependences. Extending from the ground to 700 km, the TFM eliminates spurious effects due to arbitrarily chosen boundary conditions. A database of transfer functions, computed only once, can be used to synthesize a wide range of spatial and temporal source dependencies. The response synthesis can be performed quickly in real-time using only limited computing capabilities. These features make the TFM unique among global dynamical models. Given these desirable properties, a version of the TFM has been developed for personal computers (PC) using advanced platform-independent 3D visualization capabilities. We demonstrate the model capabilities with simulations for different auroral sources, including the response of ducted gravity wave modes that propagate around the globe. The thermospheric response is found to depend strongly on the spatial and temporal frequency spectra of the storm. Such varied behavior is difficult to describe in statistical empirical models. To improve the capability of space weather prediction, the TFM thus could be grafted naturally onto existing statistical models using data assimilation.
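The synthesis step, combining stored transfer functions with an arbitrary source time history, is in essence a convolution, which is why it is cheap enough for real-time use. A schematic one-dimensional illustration with placeholder data of our own:

```python
import numpy as np

def synthesize_response(impulse_response, source, dt):
    """Convolve a precomputed impulse response (a transfer function in the
    time domain) with a source history to synthesize the model response."""
    return np.convolve(source, impulse_response)[: len(source)] * dt

# An impulsive source should reproduce the stored transfer function itself.
dt = 0.5
tf = np.exp(-0.3 * np.arange(8))        # decaying placeholder response
impulse = np.zeros(8)
impulse[0] = 1.0 / dt                   # discrete delta of unit area
resp = synthesize_response(tf, impulse, dt)
```

The expensive physics is paid for once, when the transfer functions are computed; every new storm scenario afterwards costs only a convolution per mode.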
Stem cells in genetically-engineered mouse models of prostate cancer
Shibata, Maho; Shen, Michael M.
2015-01-01
The cancer stem cell model proposes that tumors have a hierarchical organization in which tumorigenic cells give rise to non-tumorigenic cells, with only a subset of stem-like cells able to propagate the tumor. In the case of prostate cancer, recent analyses of genetically engineered mouse (GEM) models have provided evidence supporting the existence of cancer stem cells in vivo. These studies suggest that cancer stem cells capable of tumor propagation exist at various stages of tumor progression from prostatic intraepithelial neoplasia (PIN) to advanced metastatic and castration-resistant disease. However, studies of stem cells in prostate cancer have been limited by available approaches for evaluating their functional properties in cell culture and transplantation assays. Given the role of the tumor microenvironment and the putative cancer stem cell niche, future studies using GEM models to analyze cancer stem cells in their native tissue microenvironment are likely to be highly informative. PMID:26341780
NASA Technical Reports Server (NTRS)
Perkey, D. J.; Kreitzberg, C. W.
1984-01-01
The dynamic prediction model, along with its macro-processor capability and data flow system, from the Drexel Limited-Area and Mesoscale Prediction System (LAMPS) was converted and recoded for the Perkin-Elmer 3220. The previous version of this model was written for the Control Data Corporation 7600 and CRAY-1A computer environment which existed until recently at the National Center for Atmospheric Research (NCAR). The purpose of this conversion was to prepare LAMPS for porting to computer environments other than that encountered at NCAR. The emphasis was shifted from programming tasks to model simulation and evaluation tests.
Direct use of linear time-domain aerodynamics in aeroservoelastic analysis: Aerodynamic model
NASA Technical Reports Server (NTRS)
Woods, J. A.; Gilbert, Michael G.
1990-01-01
The work presented here is the first part of a continuing effort to expand existing capabilities in aeroelasticity by developing the methodology which is necessary to utilize unsteady time-domain aerodynamics directly in aeroservoelastic design and analysis. The ultimate objective is to define a fully integrated state-space model of an aeroelastic vehicle's aerodynamics, structure and controls which may be used to efficiently determine the vehicle's aeroservoelastic stability. Here, the current status of developing a state-space model for linear or near-linear time-domain indicial aerodynamic forces is presented.
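One standard way to cast linear time-domain indicial aerodynamics in state-space form is to fit the indicial response with a sum of exponentials and realize each exponential as one lag state. As a concrete, classical example (not necessarily the formulation of this paper), R. T. Jones' two-pole approximation of the Wagner function can be realized as follows:

```python
import numpy as np

# R. T. Jones' two-pole approximation of the Wagner indicial lift function,
#   phi(s) = 1 - 0.165*exp(-0.0455*s) - 0.335*exp(-0.3*s),
# where s is nondimensional time in semichords.
A_COEF = (0.165, 0.335)
B_COEF = (0.0455, 0.3)

def wagner_state_space():
    """Realize the exponential fit as a 2-state linear system (A, B, C, D)
    whose unit-step response reproduces phi(s)."""
    A = np.diag([-b for b in B_COEF])                       # lag-state poles
    B = np.ones((2, 1))
    C = np.array([[a * b for a, b in zip(A_COEF, B_COEF)]])
    D = np.array([[1.0 - sum(A_COEF)]])                     # phi(0) = 0.5
    return A, B, C, D

A, B, C, D = wagner_state_space()
# Steady-state step response y_inf = C*inv(-A)*B + D should equal phi(inf) = 1.
phi_inf = float(C @ np.linalg.inv(-A) @ B + D)
```

Once the aerodynamics is in (A, B, C, D) form, it can be appended directly to the structural and control states, which is exactly the integrated aeroservoelastic model the paragraph describes.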
Design-based modeling of magnetically actuated soft diaphragm materials
NASA Astrophysics Data System (ADS)
Jayaneththi, V. R.; Aw, K. C.; McDaid, A. J.
2018-04-01
Magnetic polymer composites (MPC) have shown promise for emerging biomedical applications such as lab-on-a-chip and implantable drug delivery. These soft material actuators are capable of fast response, large deformation and wireless actuation. Existing MPC modeling approaches are computationally expensive and unsuitable for rapid design prototyping and real-time control applications. This paper proposes a macro-scale 1-DOF model capable of predicting force and displacement of an MPC diaphragm actuator. Model validation confirmed both blocked force and displacement can be accurately predicted in a variety of working conditions i.e. different magnetic field strengths, static/dynamic fields, and gap distances. The contribution of this work includes a comprehensive experimental investigation of a macro-scale diaphragm actuator; the derivation and validation of a new phenomenological model to describe MPC actuation; and insights into the proposed model’s design-based functionality i.e. scalability and generalizability in terms of magnetic filler concentration and diaphragm diameter. Due to the lumped element modeling approach, the proposed model can also be adapted to alternative actuator configurations, and thus presents a useful tool for design, control and simulation of novel MPC applications.
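A macro-scale 1-DOF model of the kind described reduces the diaphragm to an effective mass, damping, and stiffness driven by the magnetic force. The sketch below is a generic lumped-element sketch under that assumption; the parameter values and function name are purely illustrative, not fitted to any actual MPC actuator:

```python
def simulate_diaphragm(f_mag, m=1e-3, c=0.05, k=50.0, dt=1e-4, steps=20000):
    """Semi-implicit Euler integration of m*x'' + c*x' + k*x = f_mag(t).

    m, c, k : effective mass (kg), damping (N s/m), stiffness (N/m) of the
    diaphragm; all values here are illustrative placeholders.
    Returns the final displacement (m).
    """
    x, v = 0.0, 0.0
    for i in range(steps):
        a = (f_mag(i * dt) - c * v - k * x) / m
        v += a * dt
        x += v * dt
    return x
```

Because the model is lumped, swapping in a different diaphragm diameter or filler concentration only rescales m, c, k, and the force term, which is what makes this style of model attractive for rapid design iteration and control.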
Modeling and Analysis of Wrinkled Membranes: An Overview
NASA Technical Reports Server (NTRS)
Yang, B.; Ding, H.; Lou, M.; Fang, H.; Broduer, Steve (Technical Monitor)
2001-01-01
Thin-film membranes are basic elements of a variety of space inflatable/deployable structures. Wrinkling degrades the performance and reliability of these membrane structures, and hence has been a topic of continued interest. Wrinkling analysis of membranes for general geometry and arbitrary boundary conditions is quite challenging. The objective of this presentation is two-fold. Firstly, the existing models of wrinkled membranes and related numerical solution methods are reviewed. The important issues to be discussed are: the capability of a membrane model to characterize taut, wrinkled and slack states of membranes in a consistent and physically reasonable manner; the ability of a wrinkling analysis method to predict the formation and growth of wrinkled regions, and to determine out-of-plane deformation and wrinkled waves; the convergence of a numerical solution method for wrinkling analysis; and the compatibility of a wrinkling analysis with general-purpose finite element codes. Based on this review, several open issues in modeling and analysis of wrinkled membranes to be addressed in future research are summarized. The second objective of this presentation is to introduce a newly developed membrane model with two variable parameters (2-VP model) and an associated parametric finite element method (PFEM) for wrinkling analysis. The innovations and advantages of the proposed membrane model and PFEM-based wrinkling analysis are: (1) via a unified stress-strain relation, the 2-VP model treats the taut, wrinkled, and slack states of membranes consistently; (2) the PFEM-based wrinkling analysis has guaranteed convergence; (3) the 2-VP model along with PFEM is capable of predicting membrane out-of-plane deformations; and (4) the PFEM can be integrated into any existing finite element code. Preliminary numerical examples are also included in this presentation to demonstrate the 2-VP model and the PFEM-based wrinkling analysis approach.
Integrated Workforce Modeling System
NASA Technical Reports Server (NTRS)
Moynihan, Gary P.
2000-01-01
There are several computer-based systems, currently in various phases of development at KSC, which encompass some component, aspect, or function of workforce modeling. These systems may offer redundant capabilities and/or incompatible interfaces. A systems approach to workforce modeling is necessary in order to identify and better address user requirements. This research has consisted of two primary tasks. Task 1 provided an assessment of existing and proposed KSC workforce modeling systems for their functionality and applicability to the workforce planning function. Task 2 resulted in the development of a proof-of-concept design for a systems approach to workforce modeling. The model incorporates critical aspects of workforce planning, including hires, attrition, and employee development.
Reusable Launch Vehicle (RLV) Market Analysis Model
NASA Technical Reports Server (NTRS)
Prince, Frank A.
1999-01-01
The RLV Market Analysis model is at best a rough-order approximation of actual market behavior. However, it does give a quick indication of whether the flights exist to enable an economically viable RLV, and of the assumptions necessary for the vehicle to capture those flights. Additional analysis, market research, and updating with the latest information on payloads and launches would improve the model. Plans are to update the model as new information becomes available and new requirements are levied. This tool will continue to be a vital part of NASA's RLV business analysis capability for the foreseeable future.
Integrated Modeling, Mapping, and Simulation (IMMS) framework for planning exercises.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman-Hill, Ernest J.; Plantenga, Todd D.
2010-06-01
The Integrated Modeling, Mapping, and Simulation (IMMS) program is designing and prototyping a simulation and collaboration environment for linking together existing and future modeling and simulation tools to enable analysts, emergency planners, and incident managers to more effectively, economically, and rapidly prepare, analyze, train, and respond to real or potential incidents. When complete, the IMMS program will demonstrate an integrated modeling and simulation capability that supports emergency managers and responders with (1) conducting 'what-if' analyses and exercises to address preparedness, analysis, training, operations, and lessons learned, and (2) effectively, economically, and rapidly verifying response tactics, plans and procedures.
The Business Case for Spiral Development in Heavy Lift Launch Vehicle Systems
NASA Technical Reports Server (NTRS)
Farr, Rebecca A.; Christensen, David L.; Keith, Edward L.
2005-01-01
Performance capabilities of a specific combination of the Space Shuttle external tank and various liquid engines in an in-line configuration, two-stage core vehicle with multiple redesigned solid rocket motor strap-ons are reexamined. This concept proposes using existing assets, hardware, and capabilities that are already crew-rated, flight certified, being manufactured under existing contracts, have a long history of component and system ground testing, and have been flown for over 20 years. This paper goes beyond describing the potential performance capabilities of specific components to discuss overall system feasibility, from end to end, describing the inherent cost advantages of the Spiral Development concept, which builds on existing capabilities and assets, as opposed to starting a "fresh sheet" heavy-lift launch vehicle program from scratch.
NASA Technical Reports Server (NTRS)
Montoya, R. J.; Jai, A. R.; Parker, C. D.
1979-01-01
A ground-based, general-purpose, real-time, digital control system simulator (CSS) is specified, developed, and integrated with the existing instrumentation van of the testing facility. This CSS is built around a PDP-11/55, and its operational software was developed to meet the dual goals of providing the immediate capability to represent the F-18 drop model control laws and the flexibility for expansion to represent more complex control laws typical of control-configured vehicles. The two CSSs developed are reviewed, as well as the overall system after their integration with the existing facility. The latest version of the F-18 drop model control laws (REV D) is also described, and the changes needed for its incorporation into the digital and analog CSSs are discussed.
NASA Technical Reports Server (NTRS)
Mainger, Steve
2004-01-01
As NASA speculates on and explores the future of aviation, the technological and physical aspects of our environment increasingly become hurdles that must be overcome for success. Research into methods for overcoming some of these hurdles has been proposed by several NASA research partners as concepts. The task of establishing a common evaluation environment was placed on NASA's Virtual Airspace Simulation Technologies (VAST) project (a sub-project of VAMS), which responded with the development of the Airspace Concept Evaluation System (ACES). When one examines the ACES environment from a communication, navigation, or surveillance (CNS) perspective, the simulation parameters are built with assumed perfection in the transactions associated with CNS. To truly evaluate these concepts in a realistic sense, the contributions and effects of CNS must be part of ACES. NASA Glenn Research Center (GRC) has supported the Virtual Airspace Modeling and Simulation (VAMS) project through the continued development of CNS models and analysis capabilities that support the ACES environment. As part of this support, NASA GRC initiated the development of a communications traffic loading analysis tool called the Future Aeronautical Sub-network Traffic Emulator for Communications, Navigation and Surveillance (FASTE-CNS). This tool allows forecasting of communications load, with the understanding that there is no single, common source for the loading models used to evaluate existing and planned communications channels, and that consensus and accuracy in the traffic load models are very important inputs to decisions on the acceptability of the communication techniques used to fulfill aeronautical requirements.
Leveraging the existing capabilities of the FASTE-CNS tool, GRC has called for FASTE-CNS to have the functionality to pre- and post-process the simulation runs of ACES to report on instances when traffic density, frequency congestion, or aircraft spacing/distance violations have occurred. The integration of these functions requires that the CNS models used to characterize these avionic systems be of higher fidelity and better consistency than is present in the FASTE-CNS system. This presentation will explore the capabilities of FASTE-CNS, with renewed emphasis on the enhancements being added to perform these processing functions; the fidelity and reliability of the CNS models necessary to make the enhancements work; and the benchmarking of FASTE-CNS results to improve confidence in the results of the new processing capabilities.
NASA Technical Reports Server (NTRS)
Nieten, Joseph L.; Seraphine, Kathleen M.
1991-01-01
Procedural modeling systems, rule-based modeling systems, and a method for converting a procedural model to a rule-based model are described. Simulation models are used to represent real-time engineering systems. A real-time system can be represented by a set of equations or functions connected so that they perform in the same manner as the actual system. Most modeling system languages are based on FORTRAN or some other procedural language; therefore, they must be enhanced with a reaction capability. Rule-based systems are reactive by definition. Once the engineering system has been decomposed into a set of calculations using only basic algebraic unary operations, a knowledge network of calculations and functions can be constructed. The knowledge network required by a rule-based system can be generated by a knowledge acquisition tool or a source-level compiler. The compiler would take an existing model source file, a syntax template, and a symbol table and generate the knowledge network. Thus, existing procedural models can be translated and executed by a rule-based system. Neural models can provide the high-capacity data manipulation required by the most complex real-time models.
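The decomposition-to-knowledge-network idea above can be sketched as a tiny reactive engine: each rule recomputes its output whenever one of its inputs changes. This is a minimal illustrative sketch, not the system described in the report; the class, method names, and propagation scheme are all assumptions.

```python
class KnowledgeNetwork:
    """Hypothetical reactive knowledge network: rules fire on input change."""
    def __init__(self):
        self.values = {}      # variable name -> current value
        self.dependents = {}  # variable name -> rules consuming it

    def add_rule(self, inputs, output, fn):
        rule = (tuple(inputs), output, fn)
        for name in inputs:
            self.dependents.setdefault(name, []).append(rule)

    def assert_value(self, name, value):
        """Assert a fact and reactively fire every rule that consumes it."""
        if self.values.get(name) == value:
            return  # unchanged, nothing to propagate
        self.values[name] = value
        for inputs, output, fn in self.dependents.get(name, []):
            if all(i in self.values for i in inputs):
                self.assert_value(output, fn(*(self.values[i] for i in inputs)))

# Example: the procedural statement F = m * a becomes one reactive rule.
net = KnowledgeNetwork()
net.add_rule(["m", "a"], "F", lambda m, a: m * a)
net.assert_value("m", 2.0)
net.assert_value("a", 9.5)
print(net.values["F"])   # 19.0
```

Re-asserting either input automatically recomputes `F`, which is the reactive behavior a procedural FORTRAN model lacks.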
ERIC Educational Resources Information Center
Eylon, Bat-Sheva; Bagno, Esther
2006-01-01
How can one increase the awareness of teachers to the existence and importance of knowledge gained through physics education research (PER) and provide them with capabilities to use it? How can one enrich teachers' physics knowledge and the related pedagogical content knowledge of topics singled out by PER? In this paper we describe a professional…
Process Time Refinement for Reusable Launch Vehicle Regeneration Modeling
2008-03-01
Leveraging annotation-based modeling with Jump.
Bergmayr, Alexander; Grossniklaus, Michael; Wimmer, Manuel; Kappel, Gerti
2018-01-01
The capability of UML profiles to serve as an annotation mechanism has been recognized in both research and industry. Today's modeling tools offer profiles specific to platforms, such as Java, as they facilitate model-based engineering approaches. However, considering the large number of possible annotations in Java, manually developing the corresponding profiles would be achievable only with huge development and maintenance effort. Thus, leveraging annotation-based modeling requires an automated approach capable of generating platform-specific profiles from Java libraries. To address this challenge, we present the fully automated transformation chain realized by Jump, thereby continuing existing mapping efforts between Java and UML with an emphasis on annotations and profiles. The evaluation of Jump shows that it scales for large Java libraries and generates profiles of equal or even improved quality compared to profiles currently used in practice. Furthermore, we demonstrate the practical value of Jump by contributing profiles that facilitate reverse- and forward-engineering processes for the Java platform, applying it to a modernization scenario.
NASA Technical Reports Server (NTRS)
Van Dresar, N. T.
1992-01-01
A review of technology, history, and current status for pressurized expulsion of cryogenic tankage is presented. Use of tank pressurization to expel cryogenic fluid will continue to be studied for future spacecraft applications over a range of operating conditions in the low-gravity environment. The review examines experimental test results and analytical model development for quiescent and agitated conditions in normal-gravity followed by a discussion of pressurization and expulsion in low-gravity. Validated, 1-D, finite difference codes exist for the prediction of pressurant mass requirements within the range of quiescent normal-gravity test data. To date, the effects of liquid sloshing have been characterized by tests in normal-gravity, but analytical models capable of predicting pressurant gas requirements remain unavailable. Efforts to develop multidimensional modeling capabilities in both normal and low-gravity have recently occurred. Low-gravity cryogenic fluid transfer experiments are needed to obtain low-gravity pressurized expulsion data. This data is required to guide analytical model development and to verify code performance.
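The pressurant-mass prediction problem described above can be illustrated with a lumped (0-D) constant-pressure expulsion model: pressurant gas enters the ullage while liquid leaves at a fixed volumetric rate, and the ullage exchanges heat with the tank wall. This is a didactic sketch, not one of the validated 1-D codes cited in the review; the helium gas constants are standard, but the tank geometry, heat-transfer coefficient `hA`, and operating conditions are invented for illustration.

```python
def pressurant_mass(P, Q, T_in, T_wall, hA, t_end, dt=0.01,
                    R=2077.0, cp=5193.0):        # helium, J/(kg.K)
    """March a lumped ullage energy balance at constant tank pressure P."""
    V, T = 0.5, T_in        # initial ullage volume [m^3] and temperature [K]
    m = P * V / (R * T)     # initial gas mass, ideal-gas law
    m_added, t = 0.0, 0.0
    while t < t_end:
        # inflow needed to hold P constant; second term replaces wall losses
        mdot = P * Q / (R * T_in) + hA * (T - T_wall) / (cp * T_in)
        Tdot = (P * Q - R * T * mdot) / (R * m)  # from d/dt of P*V = m*R*T
        m += mdot * dt
        m_added += mdot * dt
        T += Tdot * dt
        V += Q * dt          # ullage grows as liquid is expelled
        t += dt
    m_ideal = P * Q * t_end / (R * T_in)  # gas delivered and kept at T_in
    return m_added, m_added / m_ideal     # mass and "collapse factor"

_, cf_adiabatic = pressurant_mass(3.0e5, 0.01, 250.0, 90.0, hA=0.0, t_end=60.0)
_, cf_cold_wall = pressurant_mass(3.0e5, 0.01, 250.0, 90.0, hA=50.0, t_end=60.0)
print(cf_adiabatic, cf_cold_wall)  # ~1.0 with no wall heat loss; >1 otherwise
```

The ratio of actual to ideal pressurant mass is the quantity the 1-D codes predict; heat loss to a cold wall drives it above one.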
Argobots: A Lightweight Low-Level Threading and Tasking Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan
In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, are either too specific to applications or architectures or not sufficiently powerful or flexible. In this paper, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by end users or high-level programming models. We describe the design, implementation, and performance characterization of Argobots and present integrations with three high-level models: OpenMP, MPI, and colocated I/O services. Evaluations show that (1) Argobots, while providing richer capabilities, is competitive with existing simpler generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency-hiding capabilities; and (4) I/O services with Argobots reduce interference with colocated applications while achieving performance competitive with that of a Pthreads approach.
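The user-level tasking idea motivating frameworks like Argobots can be illustrated with a cooperative scheduler: many lightweight work units multiplexed on one OS-level context, yielding explicitly rather than being preempted. This sketch shows the execution model only; it is a Python analogy, not the Argobots C API, and the class and function names are invented.

```python
from collections import deque

class ExecutionStream:
    """One OS-level context multiplexing many user-level tasklets."""
    def __init__(self):
        self.pool = deque()

    def spawn(self, tasklet):
        self.pool.append(tasklet)

    def run(self):
        trace = []
        while self.pool:
            task = self.pool.popleft()
            try:
                trace.append(next(task))   # run until the next yield point
                self.pool.append(task)     # cooperative: requeue after yield
            except StopIteration:
                pass                       # tasklet finished, drop it
        return trace

def worker(name, steps):
    for i in range(steps):
        yield f"{name}:{i}"               # explicit yield point

xstream = ExecutionStream()
xstream.spawn(worker("A", 2))
xstream.spawn(worker("B", 2))
trace = xstream.run()
print(trace)   # ['A:0', 'B:0', 'A:1', 'B:1']
```

Because the tasklets yield voluntarily, context switches cost a function call rather than a kernel trap, which is the efficiency argument behind user-level threading.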
Learning Reverse Engineering and Simulation with Design Visualization
NASA Technical Reports Server (NTRS)
Hemsworth, Paul J.
2018-01-01
The Design Visualization (DV) group supports work at the Kennedy Space Center by utilizing metrology data with Computer-Aided Design (CAD) models and simulations to provide accurate visual representations that aid in decision-making. The capability to measure and simulate objects in real time helps to predict and avoid potential problems before they become expensive in addition to facilitating the planning of operations. I had the opportunity to work on existing and new models and simulations in support of DV and NASA’s Exploration Ground Systems (EGS).
Model compilation for real-time planning and diagnosis with feedback
NASA Technical Reports Server (NTRS)
Barrett, Anthony
2005-01-01
This paper describes MEXEC, an implemented micro executive that compiles a device model, which can include feedback, into a structure for subsequent evaluation. The system computes both the most likely current device mode from n sets of sensor measurements and the (n-1)-step reconfiguration plan that is most likely to reach a target mode, if such a plan exists. A user tunes the system by increasing n to improve capability at the cost of real-time performance.
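The mode-estimation half of such an executive can be sketched as a Viterbi-style pass over a small mode-transition model: given a sequence of sensor measurements, find the mode path of highest probability and report its final mode. The modes, probabilities, and observations below are invented for illustration and are not taken from MEXEC.

```python
def most_likely_mode(obs_seq, modes, trans, likelihood, prior):
    """Viterbi: track the probability of the best path ending in each mode."""
    prob = {m: prior[m] * likelihood[m][obs_seq[0]] for m in modes}
    for obs in obs_seq[1:]:
        prob = {m: max(prob[p] * trans[p][m] for p in modes)
                   * likelihood[m][obs]
                for m in modes}
    return max(prob, key=prob.get)

# Hypothetical two-mode device with noisy "ok"/"alarm" sensor readings.
modes = ["nominal", "degraded"]
prior = {"nominal": 0.9, "degraded": 0.1}
trans = {"nominal":  {"nominal": 0.95, "degraded": 0.05},
         "degraded": {"nominal": 0.10, "degraded": 0.90}}
likelihood = {"nominal":  {"ok": 0.9, "alarm": 0.1},
              "degraded": {"ok": 0.3, "alarm": 0.7}}

print(most_likely_mode(["ok", "alarm", "alarm"],
                       modes, trans, likelihood, prior))   # degraded
```

A single alarm is explained away as sensor noise, but repeated alarms shift the best path into the degraded mode, which mirrors the n-measurement trade-off described in the abstract.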
Competitive assessment of aerospace systems using system dynamics
NASA Astrophysics Data System (ADS)
Pfaender, Jens Holger
Aircraft design has recently experienced a trend away from performance-centric design towards a more balanced approach with increased emphasis on engineering an economically successful system. This approach focuses on bringing forward a comprehensive economic and life-cycle cost analysis. Since the success of any system also depends on many external factors outside the control of the designer, these factors have traditionally been modeled as noise affecting the uncertainty of the design. However, this approach currently lacks a strategic treatment of necessary early decisions affecting the probability of success of a given concept in a dynamic environment. This suggests that introducing a dynamic method into the life-cycle cost analysis should allow analysis of the future attractiveness of such a concept in the presence of uncertainty. One way of addressing this is through the use of a competitive market model. However, existing market models do not focus on the dynamics of the market; instead, they model and predict market share through logit regression models, which exhibit relatively poor predictive capabilities. The method proposed here focuses on a top-down approach that integrates a competitive model based on work in the field of system dynamics into the aircraft design process. Demonstrating such an integration, which has not previously been achieved, is one of the primary contributions of this work. The integration is accomplished through the use of surrogate models, in this case neural networks, which enabled not only the practical integration of the analysis techniques but also reduced the computational requirements so that interactive exploration as envisioned was actually possible. The example demonstration of this integration is built on the competition in the 250-seat large commercial aircraft market, exemplified by the Boeing 767-400ER and the Airbus A330-200.
Both aircraft models were calibrated to existing performance and certification data and then integrated into the system dynamics market model. The market model was then calibrated with historical market data. This calibration showed much improved predictive capability compared to the conventional logit regression models. An additional advantage of the dynamic model is that no additional explanatory variables were required to realize this improved capability. Furthermore, the resulting market model was integrated into a prediction profiler environment with a time-variant Monte Carlo analysis, resulting in a unique trade-off environment. This environment was shown to allow interactive trade-offs between aircraft design decisions and economic considerations while allowing exploration of potential market success under varying external market conditions and scenarios. The resulting method is capable of reducing decision-support uncertainty and of identifying design decisions that are robust in future scenarios with a high likelihood of occurrence, with special focus on the path-dependent nature of the future implications of decisions. It was also possible to demonstrate the increased importance of design and technology choices on competitiveness in scenarios with drastic increases in commodity prices during the time period modeled. A further use of the existing outputs of the Monte Carlo analysis was realized by showing them on a multivariate scatter plot. With appropriate grouping of variables, this plot was shown to enable the top-down definition of an aircraft design, also known as inverse design. In other words, the designer can define strategic market and return-on-investment goals for a number of scenarios, for example the development of fuel prices, and then directly see which specific aircraft designs meet these goals.
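The contrast between a static logit share model and a dynamic market model can be sketched in a few lines: the dynamic version drifts toward the logit target with a first-order delay instead of jumping to it, capturing the path dependence the abstract emphasizes. All coefficients here are illustrative assumptions, not calibrated values from the work described.

```python
import math

def logit_share(a_attr, b_attr, beta=1.0):
    """Static logit benchmark: share follows current attractiveness only."""
    ea, eb = math.exp(beta * a_attr), math.exp(beta * b_attr)
    return ea / (ea + eb)

def dynamic_share(a_attr, b_attr, years=10.0, dt=0.25, tau=3.0, share0=0.5):
    """Share drifts toward the logit target with time constant tau (years)."""
    share, history = share0, [share0]
    for _ in range(int(years / dt)):
        target = logit_share(a_attr, b_attr)
        share += dt / tau * (target - share)   # first-order adjustment
        history.append(share)
    return history

hist = dynamic_share(a_attr=1.2, b_attr=1.0)
# The dynamic path lags the static logit equilibrium rather than jumping to it.
print(round(logit_share(1.2, 1.0), 3), round(hist[-1], 3))
```

Making `a_attr` and `b_attr` functions of time (e.g. of fuel price scenarios) turns this into the scenario-exploration loop the thesis describes.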
Multidisciplinary analysis and design of printed wiring boards
NASA Astrophysics Data System (ADS)
Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin
1991-04-01
Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity, with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design built on a systematic, structured methodology.
Building HR capability in health care organizations.
Khatri, Naresh
2006-01-01
The current human resource (HR) management practices in health care are consistent with the industrial model of management. However, health care organizations are not factories. They are highly knowledge-intensive and service-oriented entities and thus require a different set of HR practices and systems to support them. Drawing from the resource-based theory, I argue that HRs are a potent weapon of competitive advantage for health care organizations and propose a five-dimensional conception of HR capability for harnessing HRs in health care organizations. The significant complementarities that exist between HRs and information technologies for delivering safer and better quality of patient care are also discussed.
Mosquito population dynamics from cellular automata-based simulation
NASA Astrophysics Data System (ADS)
Syafarina, Inna; Sadikin, Rifki; Nuraini, Nuning
2016-02-01
In this paper we present an innovative model for simulating mosquito-vector population dynamics. The simulation consists of two stages: demography and dispersal dynamics. For the demography simulation, we follow an existing model of the mosquito life cycle. For dispersal of the vector, we use a cellular automata-based model in which each individual is able to move to other grid cells via a random walk. Our model is also capable of representing an immunity factor for each grid cell. We simulate the model to evaluate its correctness. Based on the simulations, we conclude that the model behaves correctly; however, it still needs to be improved by finding realistic parameters that match real data.
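The dispersal stage described above can be sketched as a random walk on a bounded grid. The grid size, boundary handling (clamping at the edges), and step rule are illustrative assumptions, and the per-cell immunity factor mentioned in the abstract is omitted for brevity.

```python
import random

def step(positions, width, height, rng):
    """One dispersal step: 4-neighbour random walk, clamped at grid edges."""
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    out = []
    for x, y in positions:
        dx, dy = rng.choice(moves)
        out.append((min(max(x + dx, 0), width - 1),
                    min(max(y + dy, 0), height - 1)))
    return out

rng = random.Random(42)                 # fixed seed for repeatability
swarm = [(10, 10)] * 100                # 100 individuals released at centre
for _ in range(50):                     # 50 dispersal steps on a 20x20 grid
    swarm = step(swarm, 20, 20, rng)
print(len(set(swarm)))                  # number of distinct occupied cells
```

Coupling each cell's occupancy count to a demography update (births, deaths, maturation) would reproduce the two-stage structure of the full model.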
Advanced Computational Techniques for Hypersonic Propulsion
NASA Technical Reports Server (NTRS)
Povinelli, Louis A.
1996-01-01
CFD has played a major role in the resurgence of hypersonic flight, on the premise that numerical methods will allow us to perform simulations at conditions for which no ground test capability exists. Validation of CFD methods is being established using the experimental data base available, which is below Mach 8. It is important, however, to realize the limitations involved in the extrapolation process as well as the deficiencies that exist in numerical methods at the present time. Current features of CFD codes are examined for application to propulsion system components. The shortcomings in simulation and modeling are identified and discussed.
NASA Technical Reports Server (NTRS)
Balas, M. J.; Kaufman, H.; Wen, J.
1985-01-01
A command generator tracker approach to model-following control of linear distributed parameter systems (DPS) whose dynamics are described on infinite-dimensional Hilbert spaces is presented. This method generates finite-dimensional controllers capable of exponentially stable tracking of the reference trajectories when certain ideal trajectories are known to exist for the open-loop DPS; we present conditions for the existence of these ideal trajectories. An adaptive version of this type of controller is also presented and shown to achieve (in some cases asymptotically) stable finite-dimensional control of the infinite-dimensional DPS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, M; Kissel, L
2002-01-29
We are experimenting with a new computing model to be applied to a new computer dedicated to that model. Several LLNL science teams now have computational requirements, evidenced by the mature scientific applications developed over the past five-plus years, that far exceed the capability of the institution's computing resources. Thus, there is increased demand for dedicated, powerful parallel computational systems. Computation can, in the coming year, potentially field a capability system that is low cost because it will be based on a model that employs open-source software and because it will use PC (IA32-P4) hardware. This incurs significant computer-science risk regarding stability and system features but also presents great opportunity. We believe the risks can be managed, but the existence of risk cannot be ignored. In order to justify the budget for this system, we need to make the case that it serves science and, through serving science, serves the institution. That is the point of the meeting and the White Paper that we are proposing to prepare. The questions are listed and the responses received are in this report.
A Proposal for Modeling Real Hardware, Weather and Marine Conditions for Underwater Sensor Networks
Climent, Salvador; Capella, Juan Vicente; Blanc, Sara; Perles, Angel; Serrano, Juan José
2013-01-01
Network simulators are useful for researching protocol performance, appraising new hardware capabilities and evaluating real application scenarios. However, these tasks can only be achieved when using accurate models and real parameters that enable the extraction of trustworthy results and conclusions. This paper presents an underwater wireless sensor network ecosystem for the ns-3 simulator. This ecosystem is composed of a new energy-harvesting model and a low-cost, low-power underwater wake-up modem model that, alongside existing models, enables the performance of accurate simulations by providing real weather and marine conditions from the location where the real application is to be deployed. PMID:23748171
Evaluating Measurement of Dynamic Constructs: Defining a Measurement Model of Derivatives
Estabrook, Ryne
2015-01-01
While measurement evaluation has been embraced as an important step in psychological research, evaluating measurement structures with longitudinal data is fraught with limitations. This paper defines and tests a measurement model of derivatives (MMOD), which is designed to assess the measurement structure of latent constructs both for analyses of between-person differences and for the analysis of change. Simulation results indicate that MMOD outperforms existing models for multivariate analysis and provides equivalent fit to data generation models. Additional simulations show MMOD capable of detecting differences in between-person and within-person factor structures. Model features, applications and future directions are discussed. PMID:24364383
Advanced Post-Irradiation Examination Capabilities Alternatives Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Bryan; Bill Landman; Porter Hill
2012-12-01
An alternatives analysis was performed for the Advanced Post-Irradiation Examination Capabilities (APIEC) project in accordance with U.S. Department of Energy (DOE) Order DOE O 413.3B, "Program and Project Management for the Acquisition of Capital Assets." The analysis considered six major alternatives: (1) no action; (2) modify existing DOE facilities, with capabilities distributed among multiple locations; (3) modify existing DOE facilities, with capabilities consolidated at a few locations; (4) construct a new facility; (5) commercial partnership; and (6) international partnerships. Based on the alternatives analysis documented herein, it is recommended to DOE that the advanced post-irradiation examination capabilities be provided by a new facility constructed at the Materials and Fuels Complex at the Idaho National Laboratory.
NASA Technical Reports Server (NTRS)
Phillips, Shaun
1996-01-01
The Graphical Observation Scheduling System (GROSS) and its functionality and editing capabilities are reported on. The GROSS system was developed as a replacement for a suite of existing programs and associated processes with the aim of: providing a software tool that combines the functionality of several of the existing programs, and provides a Graphical User Interface (GUI) that gives greater data visibility and editing capabilities. It is considered that the improved editing capability provided by this approach enhanced the efficiency of the second astronomical Spacelab mission's (ASTRO-2) mission planning.
Development of a New VLBI Data Analysis Software
NASA Technical Reports Server (NTRS)
Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.
2010-01-01
We present an overview of a new VLBI analysis software package under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system and will also incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software and formulate its main goals. The software should be flexible and modular, so as to implement models and estimation techniques that exist now or will appear in the future. At the same time, it should be reliable and of production quality for processing standard VLBI sessions. It also needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.
Coupled Neutronics Thermal-Hydraulic Solution of a Full-Core PWR Using VERA-CS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clarno, Kevin T; Palmtag, Scott; Davidson, Gregory G
2014-01-01
The Consortium for Advanced Simulation of Light Water Reactors (CASL) is developing a core simulator called VERA-CS to model operating PWR reactors with high resolution. This paper describes how the development of VERA-CS is being driven by a set of progression benchmark problems that specify the delivery of useful capability in discrete steps. As part of this development, this paper describes the current capability of VERA-CS to perform a multiphysics simulation of an operating PWR at Hot Full Power (HFP) conditions using a set of existing computer codes coupled together in a novel method. Results for several single-assembly cases are shown that demonstrate coupling for different boron concentrations and power levels. Finally, high-resolution results are shown for a full-core PWR reactor modeled in quarter symmetry.
Implementing a Loosely Coupled Fluid Structure Interaction Finite Element Model in PHASTA
NASA Astrophysics Data System (ADS)
Pope, David
Fluid-structure interaction problems are an important class of multi-physics phenomena in the design of aerospace vehicles and other engineering applications. A variety of computational fluid dynamics solvers capable of resolving the fluid dynamics exist; PHASTA is one such solver. Enhancing the capability of PHASTA to resolve fluid-structure interaction first requires implementing a structural dynamics solver. The implementation also requires a correction of the mesh used to solve the fluid equations to account for the deformation of the structure. This results in mesh motion and necessitates an Arbitrary Lagrangian-Eulerian (ALE) modification to the fluid dynamics equations currently implemented in PHASTA. With the implementation of structural dynamics physics, mesh correction, and the ALE modification of the fluid dynamics equations, PHASTA is made capable of solving fluid-structure interaction problems.
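The loosely coupled (staggered) solution strategy can be illustrated with a one-degree-of-freedom toy problem: a spring-mounted piston closing a gas chamber. Each time step solves the "fluid" (chamber pressure) with the structure frozen, advances the structure under the new load, and then updates the moving boundary, mimicking the fluid-solve/structure-solve/mesh-update cycle. All values are invented for illustration; this is not PHASTA's formulation.

```python
def staggered_fsi(steps=20000, dt=1.0e-5):
    """Alternate fluid solve, structure solve, and boundary update."""
    m, c, k, A = 1.0, 200.0, 5.0e4, 0.01   # piston mass, damping, spring, area
    p_atm, gamma = 1.0e5, 1.4
    V0, p0 = 1.0e-3, 1.5e5                 # chamber reference volume/pressure
    x, v = 0.0, 0.0                        # piston position and velocity
    for _ in range(steps):
        V = V0 + A * x                     # "mesh" update: moving boundary
        p = p0 * (V0 / V) ** gamma         # fluid solve: adiabatic gas law
        a = ((p - p_atm) * A - c * v - k * x) / m   # structure solve
        v += a * dt                        # semi-implicit Euler step
        x += v * dt
    return x, p

x_end, p_end = staggered_fsi()
print(x_end, p_end)   # piston settles where gas pressure balances the spring
```

Because each field is advanced with the other held fixed, the coupling error is first order in the time step, which is the usual trade-off of loose coupling against a monolithic solve.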
Earth Observations for Global Water Security
NASA Technical Reports Server (NTRS)
Lawford, Richard; Strauch, Adrian; Toll, David; Fekete, Balazs; Cripe, Douglas
2013-01-01
The combined effects of population growth, increasing demands for water to support agriculture, energy security, and industrial expansion, and the challenges of climate change give rise to an urgent need to carefully monitor and assess trends and variations in water resources. Doing so will ensure that sustainable access to adequate quantities of safe and useable water will serve as a foundation for water security. Both satellite and in situ observations combined with data assimilation and models are needed for effective, integrated monitoring of the water cycle's trends and variability in terms of both quantity and quality. On the basis of a review of existing observational systems, we argue that a new integrated monitoring capability for water security purposes is urgently needed. Furthermore, the components for this capability exist and could be integrated through the cooperation of national observational programmes. The Group on Earth Observations should play a central role in the design, implementation, management and analysis of this system and its products.
Status of the AIAA Modeling and Simulation Format Standard
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce; Hildreth, Bruce L.
2008-01-01
The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.
NASA Astrophysics Data System (ADS)
Kim, Kunhwi; Rutqvist, Jonny; Nakagawa, Seiji; Birkholzer, Jens
2017-11-01
This paper presents coupled hydro-mechanical modeling of hydraulic fracturing processes in complex fractured media using a discrete fracture network (DFN) approach. The individual physical processes in the fracture propagation are represented by separate program modules: the TOUGH2 code for multiphase flow and mass transport based on the finite volume approach; and the rigid-body-spring network (RBSN) model for mechanical and fracture-damage behavior, which are coupled with each other. Fractures are modeled as discrete features, of which the hydrological properties are evaluated from the fracture deformation and aperture change. The verification of the TOUGH-RBSN code is performed against a 2D analytical model for single hydraulic fracture propagation. Subsequently, modeling capabilities for hydraulic fracturing are demonstrated through simulations of laboratory experiments conducted on rock-analogue (soda-lime glass) samples containing a designed network of pre-existing fractures. Sensitivity analyses are also conducted by changing the modeling parameters, such as viscosity of injected fluid, strength of pre-existing fractures, and confining stress conditions. The hydraulic fracturing characteristics attributed to the modeling parameters are investigated through comparisons of the simulation results.
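The abstract notes that hydrological properties are evaluated from fracture deformation and aperture change. A common way to do this, assumed here for illustration rather than taken from the paper, is the cubic law, which ties fracture transmissivity to the cube of the aperture:

```python
# Sketch of a cubic-law aperture update (an assumption for illustration;
# the standard choice for fracture flow, not necessarily the paper's
# exact formulation).

def fracture_transmissivity(aperture, viscosity=1.0e-3):
    """Cubic law: T = a^3 / (12 mu), per unit fracture width."""
    return aperture ** 3 / (12.0 * viscosity)

def updated_aperture(initial_aperture, normal_displacement):
    """Mechanical opening adds directly to the hydraulic aperture."""
    return max(initial_aperture + normal_displacement, 0.0)

a0 = 1.0e-4                            # 100 micron initial aperture
a1 = updated_aperture(a0, 5.0e-5)      # 50 microns of mechanical opening
ratio = fracture_transmissivity(a1) / fracture_transmissivity(a0)
# a 1.5x aperture change gives a 1.5**3 = 3.375x transmissivity change
```

The cubic sensitivity is why small mechanical aperture changes dominate the hydraulic response in coupled hydro-mechanical fracture models.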
Global Fleet Station: Station Ship Concept
2008-02-01
The basic ISO TEU containers can be designed for any number of configurations and provide many different capabilities. For example there are... Design Process: The ship was designed using an iterative weight and volume balancing method. This method assigns a weight and volume to each... from existing merchant ships. Different ship types are modeled in the algorithm through the selection of appropriate non-dimensional factors
ERIC Educational Resources Information Center
Suvannatsiri, Ratchasak; Santichaianant, Kitidech; Murphy, Elizabeth
2015-01-01
This paper reports on a project in which students designed, constructed and tested a model of an existing early warning system with simulation of debris flow in a context of a landslide. Students also assessed rural community members' knowledge of this system and subsequently taught them to estimate the time needed for evacuation of the community…
JPRS Report, Science & Technology, China
1992-06-18
The key to success of this model is the existence of a very effective basic investment capability, including an educational foundation and an... firm lodgement in international markets. We must use various kinds of effective measures to channel our scientific and technical strength toward the... effectively farmland currently in use. Bio-engineering technology should be used to develop new kinds of plants and animals, the report says. China
Dispersion of pollutants in densely populated urban areas is a research area of clear importance. Currently, few numerical tools exist that are capable of describing airflow and dispersion patterns in these complex regions in a time-efficient manner. The Quick Urban & Industrial C... (QUIC)
Installation and Testing of ITER Integrated Modeling and Analysis Suite (IMAS) on DIII-D
NASA Astrophysics Data System (ADS)
Lao, L.; Kostuk, M.; Meneghini, O.; Smith, S.; Staebler, G.; Kalling, R.; Pinches, S.
2017-10-01
A critical objective of the ITER Integrated Modeling Program is the development of IMAS to support ITER plasma operation and research activities. An IMAS framework has been established based on the earlier work carried out within the EU. It consists of a physics data model and a workflow engine. The data model is capable of representing both simulation and experimental data and is applicable to ITER and other devices. IMAS has been successfully installed on a local DIII-D server using a flexible installer capable of managing the core data access tools (Access Layer and Data Dictionary) and optionally the Kepler workflow engine and coupling tools. A general adaptor for OMFIT (a workflow engine) is being built for adaptation of any analysis code to IMAS using a new IMAS universal access layer (UAL) interface developed from an existing OMFIT EU Integrated Tokamak Modeling UAL. Ongoing work includes development of a general adaptor for EFIT and TGLF based on this new UAL that can be readily extended for other physics codes within OMFIT. Work supported by US DOE under DE-FC02-04ER54698.
NASA Astrophysics Data System (ADS)
Echavarria, E.; Tomiyama, T.; van Bussel, G. J. W.
2007-07-01
The objective of this on-going research is to develop a design methodology to increase the availability for offshore wind farms, by means of an intelligent maintenance system capable of responding to faults by reconfiguring the system or subsystems, without increasing service visits, complexity, or costs. The idea is to make use of the existing functional redundancies within the system and sub-systems to keep the wind turbine operational, even at a reduced capacity if necessary. Re-configuration is intended to be a built-in capability to be used as a repair strategy, based on these existing functionalities provided by the components. The possible solutions can range from using information from adjacent wind turbines, such as wind speed and direction, to setting up different operational modes, for instance re-wiring, re-connecting, changing parameters or control strategy. The methodology described in this paper is based on qualitative physics and consists of a fault diagnosis system based on a model-based reasoner (MBR), and on a functional redundancy designer (FRD). Both design tools make use of a function-behaviour-state (FBS) model. A design methodology based on the re-configuration concept to achieve self-maintained wind turbines is an interesting and promising approach to reduce stoppage rate, failure events, maintenance visits, and to maintain energy output possibly at reduced rate until the next scheduled maintenance.
A general-purpose machine learning framework for predicting properties of inorganic materials
Ward, Logan; Agrawal, Ankit; Choudhary, Alok; ...
2016-08-26
A very active area of materials research is to devise methods that use machine learning to automatically extract predictive models from existing materials data. While prior examples have demonstrated successful models for some applications, many more applications exist where machine learning can make a strong impact. To enable faster development of machine-learning-based models for such applications, we have created a framework capable of being applied to a broad range of materials data. Our method works by using a chemically diverse list of attributes, which we demonstrate are suitable for describing a wide variety of properties, and a novel method for partitioning the data set into groups of similar materials to boost the predictive accuracy. In this manuscript, we demonstrate how this new method can be used to predict diverse properties of crystalline and amorphous materials, such as band gap energy and glass-forming ability.
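The partition-then-predict strategy described above can be illustrated with a toy sketch. The attribute, the threshold split, and the two linear regimes below are invented for illustration; the actual framework uses a large, chemically diverse attribute set:

```python
# Toy sketch of partitioning a data set into groups of similar entries
# and fitting a separate model per group (all data here is synthetic).

def fit_linear(xs, ys):
    """Closed-form 1-D least squares: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def partitioned_fit(data, split):
    """One linear model per group, grouping by an attribute threshold."""
    models = {}
    for key in (True, False):
        group = [(x, y) for x, y in data if (x >= split) == key]
        models[key] = fit_linear(*zip(*group))
    return models

def predict(models, split, x):
    slope, intercept = models[x >= split]
    return slope * x + intercept

# Two regimes: y = 2x below the split, y = x + 5 above it.
data = [(x, 2 * x) for x in range(5)] + [(x, x + 5) for x in range(5, 10)]
models = partitioned_fit(data, split=5)
```

A single global linear fit would blur the two regimes together; the per-group models recover each regime exactly, which is the intuition behind partitioning to boost predictive accuracy.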
Graph-based real-time fault diagnostics
NASA Technical Reports Server (NTRS)
Padalkar, S.; Karsai, G.; Sztipanovits, J.
1988-01-01
A real-time fault detection and diagnosis capability is absolutely crucial in the design of large-scale space systems. Some of the existing AI-based fault diagnostic techniques, such as expert systems and qualitative modelling, are frequently ill-suited for this purpose. Expert systems are often inadequately structured, difficult to validate and suffer from knowledge acquisition bottlenecks. Qualitative modelling techniques sometimes generate a large number of failure source alternatives, thus hampering speedy diagnosis. In this paper we present a graph-based technique which is well suited for real-time fault diagnosis, structured knowledge representation and acquisition, and testing and validation. A Hierarchical Fault Model of the system to be diagnosed is developed. At each level of the hierarchy, there exist fault propagation digraphs denoting causal relations between failure modes of subsystems. The edges of such a digraph are weighted with fault propagation time intervals. Efficient and restartable graph algorithms are used for on-line speedy identification of failure source components.
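A minimal sketch of a fault propagation digraph with time-interval edge weights, and of checking which failure modes explain a set of timed alarms. The failure modes, intervals, and single-path simplification are invented for illustration; the paper's hierarchical, restartable algorithms are considerably more capable:

```python
# Toy fault propagation digraph: failure_mode -> [(downstream, t_min, t_max)].
# Assumes a tree-like (single-path) graph and a known fault onset at t = 0.

GRAPH = {
    "pump_fail":    [("low_pressure", 1, 3)],
    "low_pressure": [("overheat", 2, 5)],
    "sensor_fail":  [("overheat", 0, 1)],
}

def reach_windows(src, graph):
    """Time windows [lo, hi] at which each downstream mode can fail,
    measured from the moment src fails."""
    windows = {src: (0, 0)}
    stack = [src]
    while stack:
        node = stack.pop()
        lo0, hi0 = windows[node]
        for nxt, lo, hi in graph.get(node, []):
            w = (lo0 + lo, hi0 + hi)
            if windows.get(nxt) != w:
                windows[nxt] = w
                stack.append(nxt)
    return windows

def consistent_sources(alarms, graph):
    """Failure modes whose propagation windows explain every observed alarm;
    alarm times are measured from the (known) fault onset at t = 0."""
    out = []
    for src in graph:
        w = reach_windows(src, graph)
        if all(a in w and w[a][0] <= t <= w[a][1] for a, t in alarms.items()):
            out.append(src)
    return out
```

For alarms at `{"low_pressure": 2, "overheat": 6}`, only `pump_fail` propagates to both alarms within its intervals, so the search narrows the failure source to a single candidate.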
A general-purpose machine learning framework for predicting properties of inorganic materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, Logan; Agrawal, Ankit; Choudhary, Alok
A very active area of materials research is to devise methods that use machine learning to automatically extract predictive models from existing materials data. While prior examples have demonstrated successful models for some applications, many more applications exist where machine learning can make a strong impact. To enable faster development of machine-learning-based models for such applications, we have created a framework capable of being applied to a broad range of materials data. Our method works by using a chemically diverse list of attributes, which we demonstrate are suitable for describing a wide variety of properties, and a novel method for partitioning the data set into groups of similar materials to boost the predictive accuracy. In this manuscript, we demonstrate how this new method can be used to predict diverse properties of crystalline and amorphous materials, such as band gap energy and glass-forming ability.
Criteria for predicting the formation of single-phase high-entropy alloys
Troparevsky, M Claudia; Morris, James R.; Kent, Paul R.; ...
2015-03-15
High entropy alloys constitute a new class of materials whose very existence poses fundamental questions. Originally thought to be stabilized by the large entropy of mixing, these alloys have attracted attention due to their potential applications, yet no model capable of robustly predicting which combinations of elements will form a single phase currently exists. Here we propose a model that, through the use of high-throughput computation of the enthalpies of formation of binary compounds, is able to confirm all known high-entropy alloys while rejecting similar alloys that are known to form multiple phases. Despite the increasing entropy, our model predicts that the number of potential single-phase multicomponent alloys decreases with an increasing number of components: out of more than two million possible 7-component alloys considered, fewer than twenty single-phase alloys are likely.
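The screening idea, accepting a candidate alloy only when no binary pair either forms a strong compound or segregates, can be sketched as follows. The enthalpy values and the acceptance window are invented placeholders, not the paper's high-throughput DFT data:

```python
from itertools import combinations

# Toy screen in the spirit of the pair-enthalpy criterion. The enthalpies
# and the [-100, 50] meV/atom window below are illustrative, not real data.

H_BINARY = {               # formation enthalpy of each binary, meV/atom
    ("Co", "Cr"): -20, ("Co", "Fe"): -10, ("Co", "Ni"): -5,
    ("Cr", "Fe"): -15, ("Cr", "Ni"): -30, ("Fe", "Ni"): -12,
    ("Al", "Co"): -250, ("Al", "Cr"): -40, ("Al", "Fe"): -120,
    ("Al", "Ni"): -300,
}

def single_phase_likely(elements, h_min=-100, h_max=50):
    """Reject if any constituent pair forms a strong compound (H < h_min)
    or tends to segregate (H > h_max); otherwise predict single phase."""
    for pair in combinations(sorted(elements), 2):
        if not h_min <= H_BINARY[pair] <= h_max:
            return False
    return True
```

With these placeholder numbers, a CoCrFeNi-style alloy passes the screen while adding Al, whose binaries form strong compounds, fails it, mirroring the paper's accept/reject logic.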
NASA Technical Reports Server (NTRS)
1990-01-01
Lunar base projects, including a reconfigurable lunar cargo launcher, a thermal and micrometeorite protection system, a versatile lifting machine with robotic capabilities, a cargo transport system, the design of a road construction system for a lunar base, and the design of a device for removing lunar dust from material surfaces, are discussed. The emphasis of the Gulf of Mexico project was on the development of a computer simulation model for predicting vessel station keeping requirements. An existing code, used in predicting station keeping requirements for oil drilling platforms operating in North Shore (Alaska) waters, was used as a basis for the computer simulation, with modifications made to the existing code. The input into the model consists of satellite altimeter readings and water velocity readings from buoys stationed in the Gulf of Mexico. The satellite data consist of altimeter (wave height) readings taken during the spring of 1989. The simulation model predicts water velocity and direction, and wind velocity.
Genoviz Software Development Kit: Java tool kit for building genomics visualization applications.
Helt, Gregg A; Nicol, John W; Erwin, Ed; Blossom, Eric; Blanchard, Steven G; Chervitz, Stephen A; Harmon, Cyrus; Loraine, Ann E
2009-08-25
Visualization software can expose previously undiscovered patterns in genomic data and advance biological science. The Genoviz Software Development Kit (SDK) is an open source, Java-based framework designed for rapid assembly of visualization software applications for genomics. The Genoviz SDK framework provides a mechanism for incorporating adaptive, dynamic zooming into applications, a desirable feature of genome viewers. Visualization capabilities of the Genoviz SDK include automated layout of features along genetic or genomic axes; support for user interactions with graphical elements (Glyphs) in a map; a variety of Glyph sub-classes that promote experimentation with new ways of representing data in graphical formats; and support for adaptive, semantic zooming, whereby objects change their appearance depending on zoom level and zooming rate adapts to the current scale. Freely available demonstration and production quality applications, including the Integrated Genome Browser, illustrate Genoviz SDK capabilities. Separation between graphics components and genomic data models makes it easy for developers to add visualization capability to pre-existing applications or build new applications using third-party data models. Source code, documentation, sample applications, and tutorials are available at http://genoviz.sourceforge.net/.
Sculpting bespoke mountains: Determining free energies with basis expansions
NASA Astrophysics Data System (ADS)
Whitmer, Jonathan K.; Fluitt, Aaron M.; Antony, Lucas; Qin, Jian; McGovern, Michael; de Pablo, Juan J.
2015-07-01
The intriguing behavior of a wide variety of physical systems, ranging from amorphous solids or glasses to proteins, is a direct manifestation of underlying free energy landscapes riddled with local minima separated by large barriers. Exploring such landscapes has arguably become one of the great challenges of statistical physics. A new method is proposed here for uniform sampling of rugged free energy surfaces. The method, which relies on special Green's functions to approximate the Dirac delta function, improves significantly on existing simulation techniques by providing a boundary-agnostic approach that is capable of mapping complex features in multidimensional free energy surfaces. The usefulness of the proposed approach is established in the context of a simple model glass former and model proteins, demonstrating improved convergence and accuracy over existing methods.
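The core idea of approximating the Dirac delta to recover a free-energy surface from samples can be illustrated with plain Gaussian kernels standing in for the paper's Green's-function bases (a deliberately simplified sketch, not the authors' method):

```python
import math
import random

# Sketch of delta-function smoothing for a free-energy estimate: each
# sample contributes a narrow Gaussian approximating a Dirac delta, and
# F(x) = -kT ln P(x) up to an additive constant.

def density(samples, x, sigma=0.2):
    """Kernel estimate of P(x)."""
    norm = 1.0 / (len(samples) * sigma * math.sqrt(2 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - s) / sigma) ** 2) for s in samples)

def free_energy(samples, x, kT=1.0):
    """Free energy from the sampled density, up to a constant."""
    return -kT * math.log(density(samples, x))

random.seed(0)
# Samples from a harmonic well: exact F(x) = x^2 / 2 for kT = 1.
samples = [random.gauss(0.0, 1.0) for _ in range(20000)]
dF = free_energy(samples, 1.0) - free_energy(samples, 0.0)
# dF should land near the exact value 0.5, up to kernel bias and noise
```

The kernel width trades resolution against noise, which is precisely the difficulty the boundary-agnostic Green's-function construction in the paper is designed to handle more gracefully.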
Gardner, Anne; Gardner, Glenn; Coyer, Fiona; Gosby, Helen
2016-01-01
The nurse practitioner is a growing clinical role in Australia and internationally, with an expanded scope of practice including prescribing, referring and diagnosing. However, key gaps exist in nurse practitioner education regarding governance of specialty clinical learning and teaching. Specifically, there is no internationally accepted framework against which to measure the quality of clinical learning and teaching for advanced specialty practice. A case study design will be used to investigate educational governance and capability theory in nurse practitioner education. Nurse practitioner students, their clinical mentors and university academic staff, from an Australian university that offers an accredited nurse practitioner Master's degree, will be invited to participate in the study. Semi-structured interviews will be conducted with students and their respective clinical mentors and university academic staff to investigate learning objectives related to educational governance and attributes of capability learning. Limited demographic data on age, gender, specialty, education level and nature of the clinical healthcare learning site will also be collected. Episodes of nurse practitioner student specialty clinical learning will be observed and documentation from the students' healthcare learning sites will be collected. Descriptive statistics will be used to report age groups, areas of specialty and types of facilities where clinical learning and teaching is observed. Qualitative data from interviews, observations and student documents will be coded, aggregated and explored to inform a framework of educational governance, to confirm the existing capability framework and describe any additional characteristics of capability and capability learning. This research has widespread significance and will contribute to ongoing development of the Australian health workforce. 
Stakeholders from industry and academic bodies will be involved in shaping the framework that guides the quality and governance of clinical learning and teaching in specialty nurse practitioner practice. Through developing standards for advanced clinical learning and teaching, and furthering understanding of capability theory for advanced healthcare practitioners, this research will contribute to evidence-based models of advanced specialty postgraduate education.
Modular Architecture for Integrated Model-Based Decision Support.
Gaebel, Jan; Schreiber, Erik; Oeser, Alexander; Oeltze-Jafra, Steffen
2018-01-01
Model-based decision support systems promise to be a valuable addition to oncological treatments and the implementation of personalized therapies. For the integration and sharing of decision models, the involved systems must be able to communicate with each other. In this paper, we propose a modularized architecture of dedicated systems for the integration of probabilistic decision models into existing hospital environments. These systems interconnect via web services and provide model sharing and processing capabilities for clinical information systems. Along the lines of IHE integration profiles from other disciplines and the meaningful reuse of routinely recorded patient data, our approach aims for the seamless integration of decision models into hospital infrastructure and the physicians' daily work.
User's manual for the Composite HTGR Analysis Program (CHAP-1)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, J.S.; Secker, P.A. Jr.; Vigil, J.C.
1977-03-01
CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework.
Fault displacement hazard assessment for nuclear installations based on IAEA safety standards
NASA Astrophysics Data System (ADS)
Fukushima, Y.
2016-12-01
In the IAEA safety standard NS-R-3, surface fault displacement hazard assessment (FDHA) is required for the siting of nuclear installations. If any capable faults exist in the candidate site, the IAEA recommends the consideration of alternative sites. However, owing to progress in palaeoseismological investigations, capable faults may be found at an existing site. In such a case, the IAEA recommends evaluating safety using probabilistic FDHA (PFDHA), an empirical approach based on a still quite limited database; a basic and crucial improvement is therefore to enlarge the database. In 2015, the IAEA produced TecDoc-1767 on palaeoseismology as a reference for the identification of capable faults. IAEA Safety Report 85, on ground motion simulation based on fault rupture modelling, provides an annex introducing recent PFDHAs and fault displacement simulation methodologies. The IAEA expanded the FDHA project to cover both the probabilistic approach and physics-based fault rupture modelling. The first approach needs a refinement of the empirical methods by building a worldwide database, and the second needs to shift from a kinematic to a dynamic scheme. The two approaches can complement each other, since simulated displacements can fill the gaps of a sparse database and geological observations can be used to calibrate the simulations. The IAEA supported a workshop in October 2015 to discuss the existing databases with the aim of creating a common worldwide database, and a consensus on a unified database was reached. The next milestone is to fill the database with as many fault rupture data sets as possible. Another IAEA working group held a workshop in November 2015 to discuss state-of-the-art PFDHA and simulation methodologies. The two groups joined a consultancy meeting in February 2016, shared information, identified issues, discussed goals and outputs, and scheduled future meetings. The aim now is to coordinate activities across the full set of FDHA tasks.
Computer routine adds plotting capabilities to existing programs
NASA Technical Reports Server (NTRS)
Harris, J. C.; Linnekin, J. S.
1966-01-01
PLOTAN, a generalized plot analysis routine written for the IBM 7094 computer, minimizes the difficulties in adding plot capabilities to large existing programs. PLOTAN is used in conjunction with a binary tape writing routine and has the ability to plot any variable on the intermediate binary tape as a function of any other.
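The PLOTAN idea, pairing any recorded variable against any other from an intermediate record stream, reduces to something like the following sketch. Field names and records are invented; the real routine reads a binary tape on the IBM 7094:

```python
# Toy sketch of the PLOTAN concept: a program writes records to an
# intermediate stream during a run, and any variable can later be paired
# against any other for plotting. (All names here are illustrative.)

def record_stream():
    """Stand-in for the binary tape: one dict of variables per step."""
    for t in range(5):
        yield {"time": t, "altitude": 100 * t, "velocity": 9.8 * t}

def plot_pairs(records, x_name, y_name):
    """Pull (x, y) pairs for any two recorded variables."""
    return [(r[x_name], r[y_name]) for r in records]

pairs = plot_pairs(record_stream(), "time", "altitude")
# pairs -> [(0, 0), (1, 100), (2, 200), (3, 300), (4, 400)]
```

Because the selection happens after the run, the host program needs no plot-specific changes, which is the point of adding plotting to large existing codes this way.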
Concerted and mosaic evolution of functional modules in songbird brains
DeVoogd, Timothy J.
2017-01-01
Vertebrate brains differ in overall size, composition and functional capacities, but the evolutionary processes linking these traits are unclear. Two leading models offer opposing views: the concerted model ascribes major dimensions of covariation in brain structures to developmental events, whereas the mosaic model relates divergent structures to functional capabilities. The models are often cast as incompatible, but they must be unified to explain how adaptive changes in brain structure arise from pre-existing architectures and developmental mechanisms. Here we show that variation in the sizes of discrete neural systems in songbirds, a species-rich group exhibiting diverse behavioural and ecological specializations, supports major elements of both models. In accordance with the concerted model, most variation in nucleus volumes is shared across functional domains and allometry is related to developmental sequence. Per the mosaic model, residual variation in nucleus volumes is correlated within functional systems and predicts specific behavioural capabilities. These comparisons indicate that oscine brains evolved primarily as a coordinated whole but also experienced significant, independent modifications to dedicated systems from specific selection pressures. Finally, patterns of covariation between species and brain areas hint at underlying developmental mechanisms. PMID:28490627
Testing Strategies for Model-Based Development
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.
2006-01-01
This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.
Crew procedures and workload of retrofit concepts for microwave landing system
NASA Technical Reports Server (NTRS)
Summers, Leland G.; Jonsson, Jon E.
1989-01-01
Crew procedures and workload for Microwave Landing Systems (MLS) that could be retrofitted into existing transport aircraft were evaluated. Two MLS receiver concepts were developed. One is capable of capturing a runway centerline and the other is capable of capturing a segmented approach path. Crew procedures were identified and crew task analyses were performed using each concept. Crew workload comparisons were made between the MLS concepts and an ILS baseline using a task-timeline workload model. Workload indexes were obtained for each scenario. The results showed that workload was comparable to the ILS baseline for the MLS centerline capture concept, but significantly higher for the segmented path capture concept.
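A task-timeline workload model of the kind described can be sketched as a busy-time calculation over a scenario window. The task names, times, and scoring below are invented for illustration, not the study's actual timelines:

```python
# Toy task-timeline workload index: the fraction of a scenario window
# occupied by crew tasks, with overlapping tasks counted once.
# (Tasks and times are invented, not from the MLS/ILS study.)

def workload_index(tasks, window_start, window_end):
    """Busy-time union of (start, end) task intervals over the window."""
    events = sorted((max(s, window_start), min(e, window_end))
                    for s, e in tasks if e > window_start and s < window_end)
    busy, cur_end = 0.0, window_start
    for s, e in events:
        if s > cur_end:
            cur_end = s
        if e > cur_end:
            busy += e - cur_end
            cur_end = e
    return busy / (window_end - window_start)

# Approach phase, seconds from glideslope capture: (start, end) per task.
ils_tasks = [(0, 20), (15, 30), (50, 60)]
segmented_tasks = [(0, 25), (20, 45), (40, 70)]
```

Comparing the two indexes over the same window reproduces the shape of the study's finding: the segmented-path scenario leaves the crew busy for essentially the whole window, while the centerline-capture scenario leaves slack.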
Mapping annotations with textual evidence using an scLDA model.
Jin, Bo; Chen, Vicky; Chen, Lujia; Lu, Xinghua
2011-01-01
Most of the knowledge regarding genes and proteins is stored in biomedical literature as free text. Extracting information from complex biomedical texts demands techniques capable of inferring biological concepts from local text regions and mapping them to controlled vocabularies. To this end, we present a sentence-based correspondence latent Dirichlet allocation (scLDA) model which, when trained with a corpus of PubMed documents with known GO annotations, performs the following tasks: 1) learning major biological concepts from the corpus, 2) inferring the biological concepts existing within text regions (sentences), and 3) identifying the text regions in a document that provides evidence for the observed annotations. When applied to new gene-related documents, a trained scLDA model is capable of predicting GO annotations and identifying text regions as textual evidence supporting the predicted annotations. This study uses GO annotation data as a testbed; the approach can be generalized to other annotated data, such as MeSH and MEDLINE documents.
A Model-Free Scheme for Meme Ranking in Social Media.
He, Saike; Zheng, Xiaolong; Zeng, Daniel
2016-01-01
The prevalence of social media has greatly catalyzed the dissemination and proliferation of online memes (e.g., ideas, topics, melodies, tags, etc.). However, this information abundance is exceeding the capability of online users to consume it. Ranking memes based on their popularity could promote online advertisement and content distribution. Despite such importance, few existing works solve this problem well: they either rest on impractical assumptions or cannot characterize dynamic information. As such, in this paper, we elaborate a model-free scheme to rank online memes in the context of social media. This scheme is capable of characterizing the nonlinear interactions of online users, which mark the process of meme diffusion. Empirical studies on two large-scale, real-world datasets (one in English and one in Chinese) demonstrate the effectiveness and robustness of the proposed scheme. In addition, due to its fine-grained modeling of user dynamics, this ranking scheme can also be utilized to explain meme popularity through the lens of social influence.
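As a toy illustration of ranking memes from observed user interactions (the influence-weighted scoring rule below is a simple stand-in, not the paper's model-free scheme):

```python
# Toy meme ranking: score each meme by the total influence of the users
# who adopted it. (Adoption data and influence weights are invented.)

def rank_memes(adoptions, influence):
    """adoptions: meme -> set of adopting users; returns memes by score."""
    scores = {m: sum(influence.get(u, 1.0) for u in users)
              for m, users in adoptions.items()}
    return sorted(scores, key=scores.get, reverse=True)

adoptions = {
    "meme_a": {"u1", "u2"},
    "meme_b": {"u3"},
    "meme_c": {"u1", "u3", "u4"},
}
influence = {"u1": 5.0, "u2": 1.0, "u3": 2.0, "u4": 1.0}
ranking = rank_memes(adoptions, influence)
# meme_c (8.0) outranks meme_a (6.0), which outranks meme_b (2.0)
```

Note the connection to the abstract's closing point: because the score decomposes over users, the same quantity that ranks a meme also explains its popularity through the influence of its adopters.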
A Model-Free Scheme for Meme Ranking in Social Media
He, Saike; Zheng, Xiaolong; Zeng, Daniel
2015-01-01
The prevalence of social media has greatly catalyzed the dissemination and proliferation of online memes (e.g., ideas, topics, melodies, tags, etc.). However, this information abundance is exceeding the capability of online users to consume it. Ranking memes based on their popularity could promote online advertisement and content distribution. Despite such importance, few existing works solve this problem well: they either rest on impractical assumptions or cannot characterize dynamic information. As such, in this paper, we elaborate a model-free scheme to rank online memes in the context of social media. This scheme is capable of characterizing the nonlinear interactions of online users, which mark the process of meme diffusion. Empirical studies on two large-scale, real-world datasets (one in English and one in Chinese) demonstrate the effectiveness and robustness of the proposed scheme. In addition, due to its fine-grained modeling of user dynamics, this ranking scheme can also be utilized to explain meme popularity through the lens of social influence. PMID:26823638
Final report on LDRD project : coupling strategies for multi-physics applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hopkins, Matthew Morgan; Moffat, Harry K.; Carnes, Brian
Many current and future modeling applications at Sandia, including ASC milestones, will critically depend on the simultaneous solution of vastly different physical phenomena. Issues due to code coupling are often not addressed, understood, or even recognized. The objectives of the LDRD have been both theoretical and in code development. We show that we have provided a fundamental analysis of coupling, i.e., when a strong coupling vs. a successive substitution strategy is needed. We have enabled the implementation of tighter coupling strategies through additions to the NOX and Sierra code suites to make coupling strategies available now, leveraging existing functionality to do this. Specifically, we have built into NOX the capability to handle fully coupled simulations from multiple codes, as well as Jacobian-Free Newton-Krylov simulations that link multiple applications. We show how this capability may be accessed from within the Sierra Framework as well as from outside of Sierra. The critical impact from this LDRD is that we have shown how, and have delivered strategies for, enabling strong Newton-based coupling while respecting the modularity of existing codes. This will facilitate the use of these codes in a coupled manner to solve multi-physics applications.
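The Jacobian-free Newton-Krylov capability mentioned above rests on one kernel: the coupled Jacobian is never assembled, and J*v is approximated from residual evaluations alone, which is what lets separate codes participate through nothing more than their residuals. A minimal sketch with an invented two-equation coupled residual (not NOX or Sierra code):

```python
# Sketch of the Jacobian-free Newton-Krylov matrix-vector kernel:
# approximate J(u) @ v by a forward difference of the coupled residual.

def jfnk_matvec(residual, u, v, eps=1e-7):
    """Approximate J(u) @ v without ever forming J."""
    u_pert = [ui + eps * vi for ui, vi in zip(u, v)]
    return [(rp - r0) / eps
            for rp, r0 in zip(residual(u_pert), residual(u))]

# Toy coupled residual for two "codes" sharing the state (x, y):
#   code 1: x^2 + y - 3 = 0    code 2: x + y^2 - 5 = 0
def residual(u):
    x, y = u
    return [x ** 2 + y - 3, x + y ** 2 - 5]

u = [1.0, 2.0]
Jv = jfnk_matvec(residual, u, [1.0, 0.0])
# the exact J @ [1, 0] at (1, 2) is [2x, 1] = [2.0, 1.0]
```

Because the Krylov solver only ever asks for such matvecs, each physics code can remain a black box that returns its residual, which is how strong Newton-based coupling coexists with code modularity.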
Operating a terrestrial Internet router onboard and alongside a small satellite
NASA Astrophysics Data System (ADS)
Wood, L.; da Silva Curiel, A.; Ivancic, W.; Hodgson, D.; Shell, D.; Jackson, C.; Stewart, D.
2006-07-01
After twenty months of flying, testing and demonstrating a Cisco mobile access router, originally designed for terrestrial use, onboard the low-Earth-orbiting UK-DMC satellite as part of a larger merged ground/space IP-based internetwork, we use our experience to examine the benefits and drawbacks of integration and standards reuse for small satellite missions. Benefits include ease of operation and the ability to leverage existing systems and infrastructure designed for general use with a large set of latent capabilities to draw on when needed, as well as the familiarity that comes from reuse of existing, known, and well-understood security and operational models. Drawbacks include cases where integration work was needed to bridge the gaps in assumptions between different systems, and where performance considerations outweighed the benefits of reuse of pre-existing file transfer protocols. We find similarities with the terrestrial IP networks whose technologies have been taken to small satellites—and also some significant differences between the two in operational models and assumptions that must be borne in mind.
Generalized superradiant assembly for nanophotonic thermal emitters
NASA Astrophysics Data System (ADS)
Mallawaarachchi, Sudaraka; Gunapala, Sarath D.; Stockman, Mark I.; Premaratne, Malin
2018-03-01
Superradiance explains the collective enhancement of emission observed when nanophotonic emitters are arranged within subwavelength proximity and perfect symmetry. Thermal superradiant emitter assemblies with variable photon far-field coupling rates are known to be capable of outperforming their conventional, nonsuperradiant counterparts. However, because they cannot account for assemblies comprising emitters of various materials and dimensional configurations, existing thermal superradiant models are inadequate and incongruent. In this paper, a generalized thermal superradiant assembly for nanophotonic emitters is developed from first principles. Spectral analysis shows that the proposed model not only outperforms existing models in power delivery but also displays unforeseen and startling characteristics during emission. These electromagnetically-induced-transparency-like (EIT-like) and superscattering-like characteristics are reported here for a superradiant assembly, and the effects escalate as the emitters become increasingly disparate. The fact that the EIT-like characteristics are in close agreement with a recent experimental observation involving the superradiant decay of qubits strongly bolsters the validity of the proposed model.
MSC/NASTRAN Stress Analysis of Complete Models Subjected to Random and Quasi-Static Loads
NASA Technical Reports Server (NTRS)
Hampton, Roy W.
2000-01-01
Space payloads, such as those which fly on the Space Shuttle in Spacelab, are designed to withstand dynamic loads which consist of combined acoustic random loads and quasi-static acceleration loads. Methods for computing the payload stresses due to these loads are well known and appear in texts and NASA documents, but typically involve approximations such as the Miles' equation, as well as possible adjustments based on "modal participation factors." Alternatively, an existing capability in MSC/NASTRAN may be used to output exact root mean square [rms] stresses due to the random loads for any specified elements in the Finite Element Model. However, it is time consuming to use this methodology to obtain the rms stresses for the complete structural model and then combine them with the quasi-static loading induced stresses. Special processing was developed as described here to perform the stress analysis of all elements in the model using existing MSC/NASTRAN and MSC/PATRAN and UNIX utilities. Fail-safe and buckling analyses applications are also described.
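Miles' equation, mentioned above as the usual approximation, estimates the rms acceleration response of a lightly damped single-degree-of-freedom mode to a broadband random input:

```latex
\ddot{x}_{\mathrm{rms}} \approx \sqrt{\tfrac{\pi}{2}\, f_n \, Q \, W(f_n)}
```

where \(f_n\) is the natural frequency, \(Q = 1/(2\zeta)\) is the amplification factor, and \(W(f_n)\) is the input acceleration power spectral density at \(f_n\). The exact rms stresses that MSC/NASTRAN outputs come from the full random-response integral over all modes rather than from this single-mode approximation.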
Structure of turbulence in three-dimensional boundary layers
NASA Technical Reports Server (NTRS)
Subramanian, Chelakara S.
1993-01-01
This report provides an overview of the three dimensional turbulent boundary layer concepts and of the currently available experimental information for their turbulence modeling. It is found that more reliable turbulence data, especially of the Reynolds stress transport terms, is needed to improve the existing modeling capabilities. An experiment is proposed to study the three dimensional boundary layer formed by a 'sink flow' in a fully developed two dimensional turbulent boundary layer. Also, the mean and turbulence field measurement procedure using a three component laser Doppler velocimeter is described.
Summary of long-baseline systematics session at CETUP*2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cherdack, Daniel; Worcester, Elizabeth
2015-10-15
A session studying systematics in long-baseline neutrino oscillation physics was held July 14-18, 2014 as part of CETUP* 2014. Systematic effects from flux normalization and modeling, modeling of cross sections and nuclear interactions, and far detector effects were addressed. Experts presented the capabilities of existing and planned tools. A program of study to determine estimates of and requirements for the size of these effects was designed. This document summarizes the results of the CETUP* systematics workshop and the current status of systematic uncertainty studies in long-baseline neutrino oscillation measurements.
NASA Technical Reports Server (NTRS)
Wilson, William C.; Atkinson, Gary M.
2007-01-01
Integrated Vehicle Health Monitoring (IVHM) of aerospace vehicles requires rugged sensors having reduced volume, mass, and power that can be used to measure a variety of phenomena. Wireless systems are preferred when retro-fitting sensors onto existing vehicles. Surface Acoustic Wave (SAW) devices are capable of sensing: temperature, pressure, strain, chemical species, mass loading, acceleration, and shear stress. SAW technology is low cost, rugged, lightweight, and extremely low power. To aid in the development of SAW sensors for IVHM applications, a first order model of a SAW Delay line has been created.
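A first-order SAW delay-line model of the kind described can be sketched very compactly: the measurand perturbs the acoustic delay between the two interdigital transducers, which is read out as a phase shift. All numerical values below are typical textbook values for a temperature sensor on YZ-cut lithium niobate, assumed for illustration, and are not parameters from the paper:

```python
import math

# First-order SAW delay-line sensor sketch (illustrative values, not from the paper).
V_SAW = 3488.0    # m/s, SAW velocity on YZ-cut LiNbO3 (typical textbook value)
L = 1.0e-2        # m, transducer center-to-center separation (assumed)
TCD = 94e-6       # 1/K, temperature coefficient of delay for YZ LiNbO3 (approx.)

def delay(delta_t_k=0.0):
    """Acoustic delay between transducers, with first-order temperature dependence."""
    tau0 = L / V_SAW
    return tau0 * (1.0 + TCD * delta_t_k)

def phase_shift(delta_t_k, f_hz=100e6):
    """Phase change of a CW interrogation signal at f_hz for a temperature change."""
    return 2.0 * math.pi * f_hz * (delay(delta_t_k) - delay(0.0))
```

Because the readout is a phase (or equivalently a resonant frequency) rather than a voltage, the device itself needs no power, which is what makes SAW sensors attractive for the wireless retrofit applications the abstract mentions.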
Recent advances in hypersonic technology
NASA Technical Reports Server (NTRS)
Dwoyer, Douglas L.
1990-01-01
This paper will focus on recent advances in hypersonic aerodynamic prediction techniques. Current capabilities of existing numerical methods for predicting high Mach number flows will be discussed and shortcomings will be identified. Physical models available for inclusion into modern codes for predicting the effects of transition and turbulence will also be outlined and their limitations identified. Chemical reaction models appropriate to high-speed flows will be addressed, and the impact of their inclusion in computational fluid dynamics codes will be discussed. Finally, the problem of validating predictive techniques for high Mach number flows will be addressed.
ARES Modeling of High-foot Implosions (NNSA Milestone #5466)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurricane, O. A.
ARES “capsule only” simulations demonstrated the application of an ASC code to a suite of high-foot ICF implosion experiments. While a capability to apply an asymmetric FDS drive to the capsule-only model using add-on Python routines exists, it was not exercised here. The ARES simulation results resemble the results from HYDRA simulations documented in A. Kritcher, et al., Phys. Plasmas, 23, 052709 (2016); namely, 1D simulations and data are in reasonable agreement for the lowest-velocity experiments, but diverge from each other at higher velocities.
A history of presatellite investigations of the earth's radiation budget
NASA Technical Reports Server (NTRS)
Hunt, G. E.; Kandel, R.; Mecherikunnel, A. T.
1986-01-01
The history of radiation budget studies from the early twentieth century to the advent of the space age is reviewed. By the beginning of the 1960s, accurate radiative models had been developed that were capable of estimating the global and zonally averaged components of the radiation budget, though the derived parameters remained highly uncertain because of inaccuracies in the data describing the physical parameters used in the models, associated with clouds, solar radiation, and the gaseous atmospheric absorbers. Over this period, estimates of the planetary albedo had fallen from 89 to 30 percent.
NASA Technical Reports Server (NTRS)
Dowell, E. H.
1976-01-01
Internal sound fields are considered. Specifically, the interaction between the (acoustic) sound pressure field and the (elastic) flexible wall of an enclosure is discussed. Such problems frequently arise when the vibrating walls of a transportation vehicle induce a significant internal sound field. Cabin noise in various flight vehicles and the internal sound field in an automobile are representative examples. A mathematical model, simplified solutions, and numerical results and comparisons with representative experimental data are briefly considered. An overall conclusion is that reasonable grounds for optimism exist with respect to available theoretical models and their predictive capability.
Consolidation of data base for Army generalized missile model
NASA Technical Reports Server (NTRS)
Klenke, D. J.; Hemsch, M. J.
1980-01-01
Data from plume interaction tests, nose mounted canard configuration tests, and high angle of attack tests on the Army Generalized Missile model are consolidated in a computer program which makes them readily accessible for plotting, listing, and evaluation. The program is written in FORTRAN and will run on an ordinary minicomputer. It has the capability of retrieving any coefficient from the existing DATAMAN tapes and displaying it in tabular or plotted form. Comparisons of data taken in several wind tunnels and of data with the predictions of Program MISSILE2 are also presented.
A Study of Fan Stage/Casing Interaction Models
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Carney, Kelly; Gallardo, Vicente
2003-01-01
The purpose of the present study is to investigate the performance of several existing and new blade-case interaction modeling capabilities that are compatible with the large system simulations used to capture structural response during blade-out events. Three contact models are examined for simulating the interactions between a rotor bladed disk and a case: a radial gap element, a linear gap element, and a new element based on a hydrodynamic formulation. The first two models are currently available in commercial finite element codes such as NASTRAN and have been shown to perform adequately for simulating rotor-case interactions. The hydrodynamic model, although not readily available in commercial codes, may prove better able to characterize rotor-case interactions.
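At its core, a gap element of the kind available in NASTRAN reduces to a piecewise-linear contact law: zero force while the clearance is open, linear stiffness once it closes. A minimal sketch (the stiffness and clearance values in a real blade-out model would come from the structure, not from here):

```python
def gap_force(closure, clearance, k_contact):
    """Linear gap element: force engages only after the clearance is consumed.

    closure    -- relative approach of the two surfaces (m)
    clearance  -- initial gap (m)
    k_contact  -- contact stiffness once closed (N/m)
    """
    penetration = closure - clearance
    return k_contact * penetration if penetration > 0.0 else 0.0
```

The radial variant applies the same law along the instantaneous rotor-to-case radial direction; the hydrodynamic element discussed in the abstract replaces this kinematic law with a fluid-film force model.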
Enhancing the ABAQUS Thermomechanics Code to Simulate Steady and Transient Fuel Rod Behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. L. Williamson; D. A. Knoll
2009-09-01
A powerful multidimensional fuels performance capability, applicable to both steady and transient fuel behavior, is developed based on enhancements to the commercially available ABAQUS general-purpose thermomechanics code. Enhanced capabilities are described, including: UO2 temperature- and burnup-dependent thermal properties, solid and gaseous fission product swelling, fuel densification, fission gas release, cladding thermal and irradiation creep, cladding irradiation growth, gap heat transfer, and gap/plenum gas behavior during irradiation. The various modeling capabilities are demonstrated using a 2D axisymmetric analysis of the upper section of a simplified multi-pellet fuel rod, during both steady and transient operation. Computational results demonstrate the importance of a multidimensional, fully coupled thermomechanics treatment. Interestingly, many of the inherent deficiencies in existing fuel performance codes (e.g., 1D thermomechanics, loose thermo-mechanical coupling, separate steady and transient analysis, cumbersome pre- and post-processing) are, in fact, ABAQUS strengths.
CIRSS vertical data integration, San Bernardino study
NASA Technical Reports Server (NTRS)
Hodson, W.; Christenson, J.; Michel, R. (Principal Investigator)
1982-01-01
The creation and use of a vertically integrated data base, including LANDSAT data, for local planning purposes in a portion of San Bernardino County, California are described. The project illustrates that a vertically integrated approach can benefit local users, can be used to identify and rectify discrepancies in various data sources, and that the LANDSAT component can be effectively used to identify change, perform initial capability/suitability modeling, update existing data, and refine existing data in a geographic information system. Local analyses were developed which produced data of value to planners in the San Bernardino County Planning Department and the San Bernardino National Forest staff.
Design of a Ka-Band Propagation Terminal for Atmospheric Measurements in Polar Regions
NASA Technical Reports Server (NTRS)
Houts, Jacquelynne R.; Nessel, James A.; Zemba, Michael J.
2016-01-01
This paper describes the design and performance of a Ka-Band beacon receiver developed at NASA Glenn Research Center (GRC) that will be installed alongside an existing Ka-Band Radiometer [2] located at the east end of the Svalbard Near Earth Network (NEN) complex. The goal of this experiment is to characterize rain fade attenuation to improve the performance of existing statistical rain attenuation models. The ground terminal developed by NASA GRC utilizes an FFT-based frequency estimation [3] receiver capable of characterizing total path attenuation effects due to gaseous absorption, clouds, rain, and scintillation by directly measuring the propagated signal from the satellite Thor 7.
Should the United States Create an American Foreign Legion?
2011-06-01
…away to the point that it barely functions.” HVT programs also have their drawbacks, beginning with a history of questionable success. French… are some inherent drawbacks with their legion model, the most obvious being linguistic issues among a polyglot soldiery. While French… capability gaps that exist in the U.S. Military. However, the pendulum of support has been shifting away from these private military contractors since
Neuner, Matthias; Gamnitzer, Peter; Hofstetter, Günter
2017-01-01
The aims of the present paper are (i) to briefly review single-field and multi-field shotcrete models proposed in the literature; (ii) to propose the extension of a damage-plasticity model for concrete to shotcrete; and (iii) to evaluate the capabilities of the proposed extended damage-plasticity model for shotcrete by comparing the predicted response with experimental data for shotcrete and with the response predicted by shotcrete models, available in the literature. The results of the evaluation will be used for recommendations concerning the application and further improvements of the investigated shotcrete models and they will serve as a basis for the design of a new lab test program, complementing the existing ones. PMID:28772445
Data Requirements for Oceanic Processes in the Open Ocean, Coastal Zone, and Cryosphere
NASA Technical Reports Server (NTRS)
Nagler, R. G.; Mccandless, S. W., Jr.
1978-01-01
The type of information system that is needed to meet the requirements of ocean, coastal, and polar region users was examined. The requisite qualities of the system are: (1) availability, (2) accessibility, (3) responsiveness, (4) utility, (5) continuity, and (6) NASA participation. The system would not displace existing capabilities, but would have to integrate and expand the capabilities of existing systems and resolve the deficiencies that currently exist in producer-to-user information delivery options.
IoT-Based User-Driven Service Modeling Environment for a Smart Space Management System
Choi, Hoan-Suk; Rhee, Woo-Seop
2014-01-01
The existing Internet environment has been extended to the Internet of Things (IoT) as an emerging new paradigm. The IoT connects various physical entities. These entities have communication capability and deploy the observed information to various service areas such as building management, energy-saving systems, surveillance services, and smart homes. These services are designed and developed by professional service providers. Moreover, users' needs have become more complicated and personalized with the spread of user-participation services such as social media and blogging. Therefore, some active users want to create their own services to satisfy their needs, but the existing IoT service-creation environment is difficult for the non-technical user because it requires a programming capability to create a service. To solve this problem, we propose the IoT-based user-driven service modeling environment to provide an easy way to create IoT services. Also, the proposed environment deploys the defined service to another user. Through the personalization and customization of the defined service, the value and dissemination of the service is increased. This environment also provides the ontology-based context-information processing that produces and describes the context information for the IoT-based user-driven service. PMID:25420153
CometQuest: A Rosetta Adventure
NASA Technical Reports Server (NTRS)
Leon, Nancy J.; Fisher, Diane K.; Novati, Alexander; Chmielewski, Artur B.; Fitzpatrick, Austin J.; Angrum, Andrea
2012-01-01
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2012-01-01
This software is a higher-performance implementation of tiled WMS, with integral support for KML and time-varying data. This software is compliant with the Open Geospatial WMS standard, and supports KML natively as a WMS return type, including support for the time attribute. Regionated KML wrappers are generated that match the existing tiled WMS dataset. Ping and JPG formats are supported, and the software is implemented as an Apache 2.0 module that supports a threading execution model that is capable of supporting very high request rates. The module intercepts and responds to WMS requests that match certain patterns and returns the existing tiles. If a KML format that matches an existing pyramid and tile dataset is requested, regionated KML is generated and returned to the requesting application. In addition, KML requests that do not match the existing tile datasets generate a KML response that includes the corresponding JPG WMS request, effectively adding KML support to a backing WMS server.
Activity-Centered Domain Characterization for Problem-Driven Scientific Visualization
Marai, G. Elisabeta
2018-01-01
Although visualization design models exist in the literature in the form of higher-level methodological frameworks, these models do not present a clear methodological prescription for the domain characterization step. This work presents a framework and end-to-end model for requirements engineering in problem-driven visualization application design. The framework and model are based on the activity-centered design paradigm, which is an enhancement of human-centered design. The proposed activity-centered approach focuses on user tasks and activities, and allows an explicit link between the requirements engineering process with the abstraction stage—and its evaluation—of existing, higher-level visualization design models. In a departure from existing visualization design models, the resulting model: assigns value to a visualization based on user activities; ranks user tasks before the user data; partitions requirements in activity-related capabilities and nonfunctional characteristics and constraints; and explicitly incorporates the user workflows into the requirements process. A further merit of this model is its explicit integration of functional specifications, a concept this work adapts from the software engineering literature, into the visualization design nested model. A quantitative evaluation using two sets of interdisciplinary projects supports the merits of the activity-centered model. The result is a practical roadmap to the domain characterization step of visualization design for problem-driven data visualization. Following this domain characterization model can help remove a number of pitfalls that have been identified multiple times in the visualization design literature. PMID:28866550
A Game-Theoretical Model to Improve Process Plant Protection from Terrorist Attacks.
Zhang, Laobing; Reniers, Genserik
2016-12-01
The New York City 9/11 terrorist attacks urged people in academia as well as in industry to pay more attention to operational security research. The required focus in this type of research is human intention. Unlike safety-related accidents, security-related accidents have a deliberate nature, and one has to face intelligent adversaries with characteristics that traditional probabilistic risk assessment techniques are not capable of dealing with. In recent years, the mathematical tool of game theory, which is capable of handling intelligent players, has been used in a variety of ways in terrorism risk assessment. In this article, we analyze the general intrusion detection system in process plants and propose a game-theoretical model for security management in such plants. Players in our model are assumed to be rational, and they play the game with complete information. Both the pure-strategy and mixed-strategy solutions are explored and explained. We illustrate our model with a case study and find that, in our case, no pure-strategy but instead a mixed-strategy Nash equilibrium exists. © 2016 Society for Risk Analysis.
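The mixed-strategy outcome reported in the abstract is easy to illustrate on a 2x2 attacker-defender game. The payoff matrices below are invented for illustration (they are not the paper's case study); the best responses cycle, so no pure-strategy equilibrium exists, and each player's equilibrium mix follows from making the opponent indifferent:

```python
import numpy as np

# Rows: defender protects site A / site B.  Columns: attacker hits site A / site B.
# Payoffs are illustrative only.
D = np.array([[ 0.0, -4.0],
              [-5.0,  0.0]])   # defender's payoff
A = np.array([[-1.0,  4.0],
              [ 5.0, -1.0]])   # attacker's payoff

# Defender mixes (p, 1-p) over rows so the attacker is indifferent between columns:
#   p*A[0,0] + (1-p)*A[1,0] == p*A[0,1] + (1-p)*A[1,1]
p = (A[1, 1] - A[1, 0]) / (A[0, 0] - A[1, 0] - A[0, 1] + A[1, 1])

# Attacker mixes (q, 1-q) over columns so the defender is indifferent between rows:
q = (D[1, 1] - D[0, 1]) / (D[0, 0] - D[0, 1] - D[1, 0] + D[1, 1])
```

Here p = 6/11 and q = 4/9: the defender randomizes patrols, and the attacker randomizes targets, precisely the kind of unpredictable-inspection policy game-theoretic security models recommend.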
Vivek-Ananth, R P; Samal, Areejit
2016-09-01
A major goal of systems biology is to build predictive computational models of cellular metabolism. Availability of complete genome sequences and wealth of legacy biochemical information has led to the reconstruction of genome-scale metabolic networks in the last 15 years for several organisms across the three domains of life. Due to paucity of information on kinetic parameters associated with metabolic reactions, the constraint-based modelling approach, flux balance analysis (FBA), has proved to be a vital alternative to investigate the capabilities of reconstructed metabolic networks. In parallel, advent of high-throughput technologies has led to the generation of massive amounts of omics data on transcriptional regulation comprising mRNA transcript levels and genome-wide binding profile of transcriptional regulators. A frontier area in metabolic systems biology has been the development of methods to integrate the available transcriptional regulatory information into constraint-based models of reconstructed metabolic networks in order to increase the predictive capabilities of computational models and understand the regulation of cellular metabolism. Here, we review the existing methods to integrate transcriptional regulatory information into constraint-based models of metabolic networks. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
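Flux balance analysis as described above is a linear program: maximize an objective flux subject to steady-state mass balance S v = 0 and flux bounds. A toy network (entirely illustrative, not a reconstructed genome-scale model) makes the structure concrete:

```python
import numpy as np
from scipy.optimize import linprog

# Toy FBA problem. Reactions:
#   v0: -> A (uptake, capped at 10)   v1: A -> B
#   v2: A -> C                        v3: B + C -> biomass
# Rows of S are metabolites A, B, C; columns are reactions v0..v3.
S = np.array([[1, -1, -1,  0],
              [0,  1,  0, -1],
              [0,  0,  1, -1]])
bounds = [(0, 10), (0, None), (0, None), (0, None)]
c = [0, 0, 0, -1]  # linprog minimizes, so negate the biomass flux

res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
```

Mass balance forces v1 = v2 = v3 and v0 = 2*v3, so the uptake cap of 10 limits biomass flux to 5. The regulatory-integration methods reviewed in the abstract act on exactly this formulation, typically by tightening the flux bounds of reactions whose genes are repressed.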
Modeling and Evaluation of Miles-in-Trail Restrictions in the National Air Space
NASA Technical Reports Server (NTRS)
Grabbe, Shon; Sridhar, Banavar
2003-01-01
Miles-in-trail restrictions impact flights in the national air space on a daily basis and these restrictions routinely propagate between adjacent Air Route Traffic Control Centers. Since overly restrictive or ineffective miles-in-trail restrictions can reduce the overall efficiency of the national air space, decision support capabilities that model miles-in-trail restrictions should prove to be very beneficial. This paper presents both an analytical formulation and a linear programming approach for modeling the effects of miles-in-trail restrictions. A methodology for monitoring the conformance of an existing miles-in-trail restriction is also presented. These capabilities have been implemented in the Future ATM Concepts Evaluation Tool for testing purposes. To allow alternative restrictions to be evaluated in post-operations, a new mode of operation, which is referred to as the hybrid-playback mode, has been implemented in the simulation environment. To demonstrate the capabilities of these new algorithms, the miles-in-trail restrictions, which were in effect on June 27, 2002 in the New York Terminal Radar Approach Control, are examined. Results from the miles-in-trail conformance monitoring functionality are presented for the ELIOT, PARKE and WHITE departure fixes. In addition, the miles-in-trail algorithms are used to assess the impact of alternative restrictions at the PARKE departure fix.
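The kernel of any miles-in-trail model is converting a distance restriction into a minimum time separation at the fix and delaying crossings that violate it. A minimal greedy sketch (the fix name, restriction, and speed are illustrative, and this is not the FACET algorithm itself):

```python
def apply_mit(etas_s, mit_nmi=20.0, groundspeed_kts=480.0):
    """Delay crossing times over a fix so consecutive aircraft satisfy an MIT restriction.

    etas_s -- estimated times over the fix, in seconds.
    Returns controlled times in the same (sorted) order.
    """
    # 20 nmi at 480 kts groundspeed -> 150 s required time separation.
    t_sep = mit_nmi / groundspeed_kts * 3600.0
    controlled = []
    for eta in sorted(etas_s):
        t = eta if not controlled else max(eta, controlled[-1] + t_sep)
        controlled.append(t)
    return controlled
```

For example, aircraft estimated over the fix at 0, 60, and 400 seconds come out at roughly 0, 150, and 400 seconds: only the second aircraft absorbs delay. Conformance monitoring, as described in the abstract, is the inverse check, measuring whether observed in-trail spacings actually meet the restriction.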
The Temporal Morphology of Infrasound Propagation
NASA Astrophysics Data System (ADS)
Drob, Douglas P.; Garcés, Milton; Hedlin, Michael; Brachet, Nicolas
2010-05-01
Expert knowledge suggests that the performance of automated infrasound event association and source location algorithms could be greatly improved by the ability to continually update station travel-time curves to properly account for the hourly, daily, and seasonal changes of the atmospheric state. With the goal of reducing false alarm rates and improving network detection capability, we endeavor to develop, validate, and integrate this capability into infrasound processing operations at the International Data Centre of the Comprehensive Nuclear-Test-Ban Treaty Organization. Numerous studies have demonstrated that incorporation of hybrid ground-to-space (G2S) environmental specifications in numerical calculations of infrasound signal travel time and azimuth deviation yields significantly improved results over climatological atmospheric specifications, specifically for tropospheric and stratospheric modes. A robust infrastructure currently exists to generate hybrid G2S vector spherical harmonic coefficients on a real-time basis (every 3 to 6 hours), based on existing operational and empirical models (Drob et al., 2003). The next requirement in this endeavor is thus to refine numerical procedures to calculate infrasound propagation characteristics for robust automatic infrasound arrival identification and network detection, location, and characterization algorithms. We present results from a new code that integrates the local (range-independent) τp ray equations to provide travel time, range, turning point, and azimuth deviation for any location on the globe given a G2S vector spherical harmonic coefficient set. The code employs an accurate numerical technique capable of handling square-root singularities.
We investigate the seasonal variability of propagation characteristics over a five-year time series for two different stations within the International Monitoring System with the aim of understanding the capabilities of current working knowledge of the atmosphere and infrasound propagation models. The statistical behaviors or occurrence frequency of various propagation configurations are discussed. Representative examples of some of these propagation configuration states are also shown.
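The τp integrals referred to above can be sketched for a horizontally stratified atmosphere: for a ray with horizontal slowness p, travel time and range follow from integrating up to the turning altitude where p c(z) = 1. The linear effective-sound-speed profile and plain Riemann sum below are illustrative only; the integrand has a square-root singularity at the turning point, which the paper's code treats with an accurate numerical technique rather than this naive sum:

```python
import numpy as np

def c_of_z(z):
    """Illustrative effective sound speed (m/s) vs altitude (m); not a G2S profile."""
    return 340.0 + 0.004 * z

def ray_time_range(p, dz=1.0, z_max=120e3):
    """Up-and-down travel time (s) and ground range (m) of a refracted ray
    with horizontal slowness p (s/m), via the local tau-p integrals."""
    z = np.arange(0.0, z_max, dz)
    c = c_of_z(z)
    below = p * c < 1.0                    # keep only altitudes below the turning point
    c = c[below]
    cos_theta = np.sqrt(1.0 - (p * c) ** 2)
    t = 2.0 * np.sum(dz / (c * cos_theta))       # travel time
    x = 2.0 * np.sum(dz * p * c / cos_theta)     # ground range
    return t, x

t, x = ray_time_range(1.0 / 540.0)  # ray turning where c(z) = 540 m/s (~50 km here)
```

The ratio x/t is the celerity observed at the station, the quantity the updated travel-time curves described above are built from.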
Steinmo, Siri; Fuller, Christopher; Stone, Sheldon P; Michie, Susan
2015-08-08
Sepsis is a major cause of death from infection, with a mortality rate of 36 %. This can be halved by implementing the 'Sepsis Six' evidence-based care bundle within 1 h of presentation. A UK audit has shown that median implementation rates are 27-47 % and interventions to improve this have demonstrated minimal effects. In order to develop more effective implementation interventions, it is helpful to obtain detailed characterisations of current interventions and to draw on behavioural theory to identify mechanisms of change. The aim of this study was to illustrate this process by using the Behaviour Change Wheel; Behaviour Change Technique (BCT) Taxonomy; Capability, Opportunity, Motivation model of behaviour; and Theoretical Domains Framework to characterise the content and theoretical mechanisms of action of an existing intervention to implement Sepsis Six. Data came from documentary, interview and observational analyses of intervention delivery in several wards of a UK hospital. A broad description of the intervention was created using the Template for Intervention Description and Replication framework. Content was specified in terms of (i) component BCTs using the BCT Taxonomy and (ii) intervention functions using the Behaviour Change Wheel. Mechanisms of action were specified using the Capability, Opportunity, Motivation model and the Theoretical Domains Framework. The intervention consisted of 19 BCTs, with eight identified using all three data sources. The BCTs were delivered via seven functions of the Behaviour Change Wheel, with four ('education', 'enablement', 'training' and 'environmental restructuring') supported by the three data sources. The most frequent mechanisms of action were reflective motivation (especially 'beliefs about consequences' and 'beliefs about capabilities') and psychological capability (especially 'knowledge'). The intervention consisted of a wide range of BCTs targeting a wide range of mechanisms of action. 
This study demonstrates the utility of the Behaviour Change Wheel, the BCT Taxonomy and the Theoretical Domains Framework, tools recognised for providing guidance for intervention design, for characterising an existing intervention to implement evidence-based care.
WENESSA, Wide Eye-Narrow Eye Space Simulation for Situational Awareness
NASA Astrophysics Data System (ADS)
Albarait, O.; Payne, D. M.; LeVan, P. D.; Luu, K. K.; Spillar, E.; Freiwald, W.; Hamada, K.; Houchard, J.
In an effort to achieve timelier indications of anomalous object behaviors in geosynchronous earth orbit, a Planning Capability Concept (PCC) for a “Wide Eye-Narrow Eye” (WE-NE) telescope network has been established. The PCC addresses the problem of providing continuous and operationally robust, layered and cost-effective, Space Situational Awareness (SSA) that is focused on monitoring deep space for anomalous behaviors. It does this by first detecting the anomalies with wide field of regard systems, and then providing reliable handovers for detailed observational follow-up by another optical asset. WENESSA will explore the added value of such a system to the existing Space Surveillance Network (SSN). The study will assess and quantify the degree to which the PCC fulfills, improves, or augments the ability of current operational systems to address these deep-space knowledge deficiencies. In order to improve organic simulation capabilities, we will explore options for the federation of diverse community simulation approaches, while evaluating the efficiencies offered by a network of small- and larger-aperture ground-based telescopes. Existing Space Modeling and Simulation (M&S) tools designed for evaluating WENESSA-like problems will be taken into consideration as we proceed in defining and developing the tools needed to perform this study, leading to the creation of a unified Space M&S environment for the rapid assessment of new capabilities. The primary goal of this effort is to perform a utility assessment of the WE-NE concept. The assessment will explore the mission utility of various WE-NE concepts in discovering deep space anomalies in concert with the SSN. The secondary goal is to generate an enduring modeling and simulation environment to explore the utility of future proposed concepts and supporting technologies. 
Ultimately, our validated simulation framework would support the inclusion of other ground- and space-based SSA assets through integrated analysis. Options will be explored using at least two competing simulation capabilities, but emphasis will be placed on reasoned analyses as supported by the simulations.
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher; Moore, Lewis E., III
2014-01-01
Compared to conventional bearing materials (tool steel and ceramics), emerging Superelastic Intermetallic Materials (SIMs), such as 60NiTi, have significantly lower elastic modulus and enhanced strain capability. They are also immune to atmospheric corrosion (rusting). This offers the potential for increased resilience and superior ability to withstand static indentation load without damage. In this paper, the static load capacity of hardened 60NiTi 50-mm-bore ball bearing races is measured to correlate existing flat-plate indentation load capacity data to an actual bearing geometry through the Hertz stress relations. The results confirmed the validity of using the Hertz stress relations to model 60NiTi contacts; 60NiTi exhibits a static stress capability (approximately 3.1 GPa) between that of 440C (2.4 GPa) and REX20 (3.8 GPa) tool steel. When the reduced modulus and extended strain capability are taken into account, 60NiTi is shown to withstand higher loads than other bearing materials. To quantify this effect, a notional space mechanism, a 5-kg mass reaction wheel, was modeled with respect to launch load capability when supported on standard (catalogue geometry) design 440C, 60NiTi and REX20 tool steel bearings. For this application, the use of REX20 bearings increased the static load capability of the mechanism by a factor of three while the use of 60NiTi bearings resulted in an order of magnitude improvement compared to the baseline 440C stainless steel bearings.
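The flat-plate-to-bearing correlation described above rests on the classical Hertz point-contact relations. A minimal sketch for the simplest case, a ball pressed on a flat plate (the moduli, Poisson's ratios, ball radius, and load below are illustrative textbook-scale values, not the paper's measured data):

```python
import math

def effective_modulus(E1, nu1, E2, nu2):
    """Hertz effective (contact) modulus E* of the two contacting bodies."""
    return 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)

def hertz_peak_pressure(force, ball_radius, e_star):
    """Peak Hertz contact pressure for a sphere pressed on a flat plate."""
    a = (3 * force * ball_radius / (4 * e_star)) ** (1 / 3)  # contact radius, m
    return 3 * force / (2 * math.pi * a**2)                  # peak pressure, Pa

# Illustrative comparison: a lower-modulus material (60NiTi-like, ~114 GPa)
# sees lower peak stress than all-steel contact (~200 GPa) at the same load.
steel = effective_modulus(200e9, 0.30, 200e9, 0.30)
niti = effective_modulus(114e9, 0.33, 200e9, 0.30)
```

Because the peak pressure scales as E*^(2/3), a lower modulus spreads the contact patch and lowers peak stress at a given load, which is the mechanism behind the resilience advantage the abstract reports for 60NiTi.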
Study of tethered satellite active attitude control
NASA Technical Reports Server (NTRS)
Colombo, G.
1982-01-01
Existing software was adapted for the study of tethered subsatellite rotational dynamics: an analytic solution for a stable configuration of a tethered subsatellite was developed; the analytic and numerical integrator (computer) solutions for this test case were compared in a two-mass tether model program (DUMBEL); the existing multiple-mass tether model (SKYHOOK) was modified to include subsatellite rotational dynamics; the analytic test case was verified; and the use of the SKYHOOK rotational dynamics capability was demonstrated with a computer run showing the effect of a single off-axis thruster on the behavior of the subsatellite. Subroutines for specific attitude control systems are developed and applied to the study of the behavior of the tethered subsatellite under realistic on-orbit conditions. The effects of all tether inputs, including pendular oscillations, air drag, and electrodynamic interactions, on the dynamic behavior of the tether are included.
Using a hybrid neuron in physiologically inspired models of the basal ganglia.
Thibeault, Corey M; Srinivasa, Narayan
2013-01-01
Our current understanding of the basal ganglia (BG) has facilitated the creation of computational models that have contributed novel theories, explored new functional anatomy and demonstrated results complementing physiological experiments. However, the utility of these models extends beyond these applications, particularly in neuromorphic engineering, where the basal ganglia's role in computation is important for applications such as power-efficient autonomous agents and model-based control strategies. The neurons used in existing computational models of the BG, however, are not amenable for many low-power hardware implementations. Motivated by a need for more hardware-accessible networks, we replicate four published models of the BG, spanning single neurons and small networks, replacing the more computationally expensive neuron models with an Izhikevich hybrid neuron. This begins with a network modeling action-selection, where the basal activity levels and the ability to appropriately select the most salient input are reproduced. A Parkinson's disease model is then explored under normal conditions, Parkinsonian conditions and during subthalamic nucleus deep brain stimulation (DBS). The resulting network is capable of replicating the loss of thalamic relay capabilities in the Parkinsonian state and its return under DBS. This is also demonstrated using a network capable of action-selection. Finally, a study of correlation transfer under different patterns of Parkinsonian activity is presented. These networks successfully captured the significant results of the original studies. This not only creates a foundation for neuromorphic hardware implementations but may also support the development of large-scale biophysical models. The former potentially provides a way of improving the efficacy of DBS and the latter allows for the efficient simulation of larger, more comprehensive networks.
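The Izhikevich hybrid neuron that replaces the costlier models couples a fast voltage variable with a slow recovery variable and a discontinuous reset. A minimal forward-Euler sketch using the standard regular-spiking parameters from Izhikevich's published formulation (the input current, time step, and run length here are illustrative):

```python
def izhikevich_spikes(I, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.5, steps=2000):
    """Simulate an Izhikevich hybrid neuron under constant input current I.

    Dynamics: v' = 0.04 v^2 + 5 v + 140 - u + I, u' = a (b v - u);
    on v >= 30 mV, record a spike and reset v -> c, u -> u + d.
    Returns the list of spike times in ms.
    """
    v, u, spikes = c, b * c, []
    for k in range(steps):
        v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:               # spike threshold: record and reset
            spikes.append(k * dt)
            v, u = c, u + d
    return spikes
```

The two-equation form plus reset is what makes the model hardware-friendly: it reproduces a wide range of firing patterns at a fraction of the cost of conductance-based neurons.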
Dynamical properties of a prey-predator-scavenger model with quadratic harvesting
NASA Astrophysics Data System (ADS)
Gupta, R. P.; Chandra, Peeyush
2017-08-01
In this paper, we propose and analyze an extended model for the prey-predator-scavenger system in the presence of harvesting, to study the effects of harvesting the predator as well as the scavenger. The positivity, boundedness and persistence conditions are derived for the proposed model. The model undergoes a Hopf-bifurcation around the co-existing equilibrium point. It is also observed that the model is capable of exhibiting a period-doubling route to chaos. It is pointed out that a suitable amount of harvesting of the predator can control the chaotic dynamics and make the system stable. An extensive numerical simulation is performed to validate the analytic findings. The associated control problem for the proposed model has been analyzed for optimal harvesting.
Abiotic/biotic coupling in the rhizosphere: a reactive transport modeling analysis
Lawrence, Corey R.; Steefel, Carl; Maher, Kate
2014-01-01
A new generation of models is needed to adequately simulate patterns of soil biogeochemical cycling in response to changing global environmental drivers. For example, predicting the influence of climate change on soil organic matter storage and stability requires models capable of addressing complex biotic/abiotic interactions of rhizosphere and weathering processes. Reactive transport modeling provides a powerful framework for simulating these interactions and the resulting influence on soil physical and chemical characteristics. Incorporation of organic reactions in an existing reactive transport model framework has yielded novel insights into soil weathering and development, but much more work is required to adequately capture root and microbial dynamics in the rhizosphere. This endeavor provides many advantages over traditional soil biogeochemical models but also many challenges.
Energy savings modelling of re-tuning energy conservation measures in large office buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fernandez, Nick; Katipamula, Srinivas; Wang, Weimin
Today, many large commercial buildings use sophisticated building automation systems (BASs) to manage a wide range of building equipment. While the capabilities of BASs have increased over time, many buildings still do not fully use the BAS’s capabilities and are not properly commissioned, operated or maintained, which leads to inefficient operation, increased energy use, and reduced lifetimes of the equipment. This paper investigates the energy savings potential of several common HVAC system re-tuning measures on a typical large office building, using the Department of Energy’s building energy modeling software, EnergyPlus. The baseline prototype model uses roughly as much energy as an average large office building in existing building stock, but does not utilize any re-tuning measures. Individual re-tuning measures simulated against this baseline include automatic schedule adjustments, damper minimum flow adjustments, thermostat adjustments, as well as dynamic resets (set points that change continuously with building and/or outdoor conditions) to static pressure, supply-air temperature, condenser water temperature, chilled and hot water temperature, and chilled and hot water differential pressure set points. Six combinations of these individual measures have been formulated – each designed to conform to limitations to implementation of certain individual measures that might exist in typical buildings. All the individual measures and combinations were simulated in 16 climate locations representative of specific U.S. climate zones. The modeling results suggest that the most effective energy savings measures are those that affect the demand-side of the building (air-systems and schedules). Many of the demand-side individual measures were capable of reducing annual total HVAC system energy consumption by over 20% in most cities that were modeled. 
Supply side measures affecting HVAC plant conditions were only modestly successful (less than 5% annual HVAC energy savings for most cities for all measures). Combining many of the re-tuning measures revealed deep savings potential. Some of the more aggressive combinations revealed 35-75% reductions in annual HVAC energy consumption, depending on climate and building vintage.
Leith, William S.; Benz, Harley M.; Herrmann, Robert B.
2011-01-01
Evaluation of seismic monitoring capabilities in the central and eastern United States for critical facilities - including nuclear power plants - focused on specific improvements to understand better the seismic hazards in the region. The report is not an assessment of seismic safety at nuclear plants. To accomplish the evaluation and to provide suggestions for improvements using funding from the American Recovery and Reinvestment Act of 2009, the U.S. Geological Survey examined addition of new strong-motion seismic stations in areas of seismic activity and addition of new seismic stations near nuclear power plant locations, along with integration of data from the Transportable Array of some 400 mobile seismic stations. Some 38 and 68 stations, respectively, were suggested for addition in active seismic zones and near power plant locations. Expansion of databases for strong-motion and other earthquake source-characterization data also was evaluated. Recognizing pragmatic limitations of station deployment, augmentation of existing deployments provides improvements in source characterization by quantification of near-source attenuation in regions where larger earthquakes are expected. That augmentation also supports systematic data collection from existing networks. The report further applies modeling procedures and processing algorithms, with the additional stations and the improved seismic databases, to leverage the capabilities of existing and expanded seismic arrays.
Computer Models Simulate Fine Particle Dispersion
NASA Technical Reports Server (NTRS)
2010-01-01
Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.
Development of an Improved Simulator for Chemical and Microbial EOR Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, Gary A.; Sepehrnoori, Kamy; Delshad, Mojdeh
2000-09-11
The objective of this research was to extend the capability of an existing simulator (UTCHEM) to improved oil recovery methods that use surfactants, polymers, gels, alkaline chemicals, microorganisms and foam as well as various combinations of these in both conventional and naturally fractured oil reservoirs. Task 1 is the addition of a dual-porosity model for chemical improved oil recovery processes in naturally fractured oil reservoirs. Task 2 is the addition of a foam model. Task 3 addresses several numerical and coding enhancements that will greatly improve the versatility and performance of UTCHEM. Task 4 is the enhancement of physical property models.
USEEIO: a New and Transparent United States ...
National-scope environmental life cycle models of goods and services may be used for many purposes, including quantifying impacts of production and consumption of nations, assessing organization-wide impacts, identifying purchasing hot spots, analyzing environmental impacts of policies, and performing streamlined life cycle assessment. USEEIO is a new environmentally extended input-output model of the United States fit for such purposes and other sustainable materials management applications. USEEIO melds data on economic transactions between 389 industry sectors with environmental data for these sectors covering land, water, energy and mineral usage and emissions of greenhouse gases, criteria air pollutants, nutrients and toxics, to build a life cycle model of 385 US goods and services. In comparison with existing US input-output models, USEEIO is more current with most data representing year 2013, more extensive in its coverage of resources and emissions, more deliberate and detailed in its interpretation and combination of data sources, and includes formal data quality evaluation and description. USEEIO was assembled with a new Python module called the IO Model Builder capable of assembling and calculating results of user-defined input-output models and exporting the models into LCA software. The model and data quality evaluation capabilities are demonstrated with an analysis of the environmental performance of an average hospital in the US. All USEEIO f
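At the core of an environmentally extended input-output model such as USEEIO is the Leontief inverse: total sector output x = (I − A)⁻¹ y for a direct-requirements matrix A and final demand y, with impacts e = B x for an environmental satellite matrix B. A toy two-sector sketch (all coefficients are invented for illustration and are not USEEIO data):

```python
def leontief_impacts(A, B, y):
    """Impacts e = B (I - A)^-1 y for a toy 2-sector input-output model.

    A: 2x2 direct-requirements matrix; B: rows of impact-per-dollar-output
    coefficients; y: final demand by sector.
    """
    # Invert (I - A) directly for the 2x2 case.
    a, b = 1 - A[0][0], -A[0][1]
    c, d = -A[1][0], 1 - A[1][1]
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    # Total output x = (I - A)^-1 y, then impacts e = B x.
    x = [inv[i][0] * y[0] + inv[i][1] * y[1] for i in range(2)]
    return [sum(B[k][j] * x[j] for j in range(2)) for k in range(len(B))]
```

The Leontief inverse captures indirect (supply-chain) requirements, which is why total impacts computed this way exceed the impacts of the directly purchased goods alone.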
Queueing models for token and slotted ring networks. Thesis
NASA Technical Reports Server (NTRS)
Peden, Jeffery H.
1990-01-01
Currently the end-to-end delay characteristics of very high speed local area networks are not well understood. The transmission speed of computer networks is increasing, and local area networks especially are finding increasing use in real time systems. Ring network operation is generally well understood for both token rings and slotted rings. There is, however, a severe lack of queueing models for higher-layer operation. There are several factors which contribute to the processing delay of a packet, as opposed to the transmission delay, e.g., packet priority, its length, the user load, the processor load, the use of priority preemption, the use of preemption at packet reception, the number of processors, the number of protocol processing layers, the speed of each processor, and queue length limitations. Currently existing medium access queueing models are extended by adding modeling techniques which will handle exhaustive limited service both with and without priority traffic, and modeling capabilities are extended into the upper layers of the OSI model. Some of the models are parameterized solution methods, since it is shown that certain models do not admit closed-form parameterized solutions, but only solution methods.
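For context, the simplest closed-form result of the kind such queueing models generalize is the M/M/1 mean sojourn time, W = 1/(μ − λ). A minimal sketch (purely illustrative; the thesis's exhaustive-limited-service models with priorities are far more involved):

```python
def mm1_sojourn(lam, mu):
    """Mean time in system (waiting + service) for an M/M/1 queue: W = 1/(mu - lam)."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    return 1.0 / (mu - lam)

def mm1_mean_number(lam, mu):
    """Mean number in system via Little's law: L = lam * W."""
    return lam * mm1_sojourn(lam, mu)
```

The divergence of W as λ approaches μ is the basic reason end-to-end delay in heavily loaded protocol stacks is so sensitive to processor load.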
Friedberg, Mark W; Safran, Dana G; Coltin, Kathryn L; Dresser, Marguerite; Schneider, Eric C
2009-02-01
The Patient-Centered Medical Home (PCMH), a popular model for primary care reorganization, includes several structural capabilities intended to enhance quality of care. The extent to which different types of primary care practices have adopted these capabilities has not been previously studied. To measure the prevalence of recommended structural capabilities among primary care practices and to determine whether prevalence varies among practices of different size (number of physicians) and administrative affiliation with networks of practices. Cross-sectional analysis. One physician chosen at random from each of 412 primary care practices in Massachusetts was surveyed about practice capabilities during 2007. Practice size and network affiliation were obtained from an existing database. Presence of 13 structural capabilities representing 4 domains relevant to quality: patient assistance and reminders, culture of quality, enhanced access, and electronic health records (EHRs). Three hundred eight (75%) physicians responded, representing practices with a median size of 4 physicians (range 2-74). Among these practices, 64% were affiliated with 1 of 9 networks. The prevalence of surveyed capabilities ranged from 24% to 88%. Larger practice size was associated with higher prevalence for 9 of the 13 capabilities spanning all 4 domains (P < 0.05). Network affiliation was associated with higher prevalence of 5 capabilities (P < 0.05) in 3 domains. Associations were not substantively altered by statistical adjustment for other practice characteristics. Larger and network-affiliated primary care practices are more likely than smaller, non-affiliated practices to have adopted several recommended capabilities. In order to achieve PCMH designation, smaller non-affiliated practices may require the greatest investments.
Formulation of a parametric systems design framework for disaster response planning
NASA Astrophysics Data System (ADS)
Mma, Stephanie Weiya
The occurrence of devastating natural disasters in the past several years has prompted communities, responding organizations, and governments to seek ways to improve disaster preparedness capabilities locally, regionally, nationally, and internationally. A holistic approach to design used in the aerospace and industrial engineering fields enables efficient allocation of resources through applied parametric changes within a particular design to improve performance metrics to selected standards. In this research, this methodology is applied to disaster preparedness, using a community's time to restoration after a disaster as the response metric. A review of the responses from Hurricane Katrina and the 2010 Haiti earthquake, among other prominent disasters, provides observations leading to some current capability benchmarking. A need for holistic assessment and planning exists for communities, but the current response planning infrastructure lacks a standardized framework and standardized assessment metrics. Within the humanitarian logistics community, several different metrics exist, enabling quantification and measurement of a particular area's vulnerability. These metrics, combined with design and planning methodologies from related fields, such as engineering product design, military response planning, and business process redesign, provide insight and a framework from which to begin developing a methodology to enable holistic disaster response planning. The developed methodology was applied to the communities of Shelby County, TN and pre-Hurricane-Katrina Orleans Parish, LA. Available literature and reliable media sources provide information about the different values of system parameters within the decomposition of the community aspects and also about relationships among the parameters. 
The community was modeled as a system dynamics model and was tested in the implementation of two, five, and ten year improvement plans for Preparedness, Response, and Development capabilities, and combinations of these capabilities. For Shelby County and for Orleans Parish, the Response improvement plan reduced restoration time the most. For the combined capabilities, Shelby County experienced the greatest reduction in restoration time with the implementation of Development and Response capability improvements, and for Orleans Parish it was the Preparedness and Response capability improvements. Optimization of restoration time with community parameters was tested by using a Particle Swarm Optimization algorithm. Fifty different optimized restoration times were generated using the Particle Swarm Optimization algorithm and ranked using the Technique for Order Preference by Similarity to Ideal Solution. The optimization results indicate that the greatest reduction in restoration time for a community is achieved with a particular combination of different parameter values instead of the maximization of each parameter.
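The optimization step can be illustrated with a generic Particle Swarm Optimization loop. A minimal sketch minimizing a test function (the inertia/acceleration parameters and the quadratic objective are illustrative, not the dissertation's community model or its restoration-time objective):

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over box bounds with a basic global-best PSO."""
    rng = random.Random(seed)
    dim = len(bounds)
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]                 # per-particle best positions
    pbest_f = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]   # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d] + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] = min(max(xs[i][d] + vs[i][d], bounds[d][0]), bounds[d][1])
            fx = f(xs[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = xs[i][:], fx
    return gbest, gbest_f
```

Each candidate restoration-time evaluation would play the role of `f` here, with the community parameters as the search dimensions; the fifty optimized solutions mentioned above would come from repeated runs, later ranked by TOPSIS.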
NASA Astrophysics Data System (ADS)
Cui, Z.; Welty, C.; Maxwell, R. M.
2011-12-01
Lagrangian, particle-tracking models are commonly used to simulate solute advection and dispersion in aquifers. They are computationally efficient and suffer from much less numerical dispersion than grid-based techniques, especially in heterogeneous and advectively-dominated systems. Although particle-tracking models are capable of simulating geochemical reactions, these reactions are often simplified to first-order decay and/or linear, first-order kinetics. Nitrogen transport and transformation in aquifers involves both biodegradation and higher-order geochemical reactions. In order to take advantage of the particle-tracking approach, we have enhanced an existing particle-tracking code SLIM-FAST, to simulate nitrogen transport and transformation in aquifers. The approach we are taking is a hybrid one: the reactive multispecies transport process is operator split into two steps: (1) the physical movement of the particles including the attachment/detachment to solid surfaces, which is modeled by a Lagrangian random-walk algorithm; and (2) multispecies reactions including biodegradation are modeled by coupling multiple Monod equations with other geochemical reactions. The coupled reaction system is solved by an ordinary differential equation solver. In order to solve the coupled system of equations, after step 1, the particles are converted to grid-based concentrations based on the mass and position of the particles, and after step 2 the newly calculated concentration values are mapped back to particles. The enhanced particle-tracking code is capable of simulating subsurface nitrogen transport and transformation in a three-dimensional domain with variably saturated conditions. Potential application of the enhanced code is to simulate subsurface nitrogen loading to the Chesapeake Bay and its tributaries. 
Implementation details, verification results of the enhanced code with one-dimensional analytical solutions and other existing numerical models will be presented in addition to a discussion of implementation challenges.
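The hybrid operator-split scheme described above can be sketched in one dimension: a Lagrangian random-walk step moves the particles, then a binning-react-rescale step applies the Monod kinetics on the grid. A simplified illustration (the uniform-bin mapping, kinetics parameters, and single explicit-Euler reaction update are assumptions of this sketch; the enhanced SLIM-FAST couples multiple Monod equations through an ODE solver):

```python
import math, random

def transport_step(xs, v, D, dt, rng):
    """Step 1: random-walk advection-dispersion, x += v*dt + sqrt(2*D*dt)*N(0,1)."""
    return [x + v * dt + math.sqrt(2 * D * dt) * rng.gauss(0, 1) for x in xs]

def reaction_step(masses, xs, dx, nx, mu_max, Ks, dt):
    """Step 2: bin particle mass to grid concentrations, apply one explicit-Euler
    update of Monod decay dC/dt = -mu_max*C/(Ks + C), rescale particle masses."""
    bin_of = lambda x: min(max(int(x / dx), 0), nx - 1)
    conc = [0.0] * nx
    for x, m in zip(xs, masses):
        conc[bin_of(x)] += m / dx
    scale = [(C - dt * mu_max * C / (Ks + C)) / C if C > 0 else 1.0 for C in conc]
    return [m * scale[bin_of(x)] for x, m in zip(xs, masses)]
```

The particle-to-grid conversion before the reaction step and the mass rescaling after it mirror the two mappings the abstract describes.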
NASA Astrophysics Data System (ADS)
Khodayari, Arezoo; Wuebbles, Donald J.; Olsen, Seth C.; Fuglestvedt, Jan S.; Berntsen, Terje; Lund, Marianne T.; Waitz, Ian; Wolfe, Philip; Forster, Piers M.; Meinshausen, Malte; Lee, David S.; Lim, Ling L.
2013-08-01
This study evaluates the capabilities of the carbon cycle and energy balance treatments relative to the effect of aviation CO2 emissions on climate in several existing simplified climate models (SCMs) that are either being used or could be used for evaluating the effects of aviation on climate. Since these models are used in policy-related analyses, it is important that the capabilities of such models represent the state of understanding of the science. We compare the Aviation Environmental Portfolio Management Tool (APMT) Impacts climate model, two models used at the Center for International Climate and Environmental Research-Oslo (CICERO-1 and CICERO-2), the Integrated Science Assessment Model (ISAM) model as described in Jain et al. (1994), the simple Linear Climate response model (LinClim) and the Model for the Assessment of Greenhouse-gas Induced Climate Change version 6 (MAGICC6). In this paper we select scenarios to illustrate the behavior of the carbon cycle and energy balance models in these SCMs. This study is not intended to determine the absolute and likely range of the expected climate response in these models but to highlight specific features in model representations of the carbon cycle and energy balance models that need to be carefully considered in studies of aviation effects on climate. These results suggest that carbon cycle models that use linear impulse-response-functions (IRF) in combination with separate equations describing air-sea and air-biosphere exchange of CO2 can account for the dominant nonlinearities in the climate system that would otherwise not have been captured with an IRF alone, and hence, produce a close representation of more complex carbon cycle models. Moreover, results suggest that an energy balance model with a 2-box ocean sub-model and IRF tuned to reproduce the response of coupled Earth system models produces a close representation of the globally-averaged temperature response of more complex energy balance models.
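The 2-box ocean energy balance structure mentioned above can be sketched as a mixed-layer box exchanging heat with a deep-ocean box. A minimal forward-Euler illustration (the feedback, exchange, and heat-capacity values are typical textbook magnitudes, not the tuned parameters of any SCM compared in the study):

```python
def two_box_ebm(forcing, dt=0.1, lam=1.2, gamma=0.7, c_mix=8.0, c_deep=100.0):
    """Mixed-layer + deep-ocean energy balance model.

    forcing: radiative forcing series, W/m^2 (one value per dt, in years);
    lam: climate feedback, W/m^2/K; gamma: inter-box exchange, W/m^2/K;
    c_mix, c_deep: heat capacities, W*yr/m^2/K.
    Returns the mixed-layer temperature anomaly series (K).
    """
    T, Td, out = 0.0, 0.0, []
    for F in forcing:
        dT = (F - lam * T - gamma * (T - Td)) / c_mix   # mixed layer
        dTd = gamma * (T - Td) / c_deep                  # deep ocean
        T += dt * dT
        Td += dt * dTd
        out.append(T)
    return out
```

The slow deep-ocean box is what lets such a model reproduce the multi-decadal adjustment of coupled Earth system models that a single-timescale response function misses.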
Model-Based Fatigue Prognosis of Fiber-Reinforced Laminates Exhibiting Concurrent Damage Mechanisms
NASA Technical Reports Server (NTRS)
Corbetta, M.; Sbarufatti, C.; Saxena, A.; Giglio, M.; Goebel, K.
2016-01-01
Prognostics of large composite structures is a topic of increasing interest in the field of structural health monitoring for aerospace, civil, and mechanical systems. Along with recent advancements in real-time structural health data acquisition and processing for damage detection and characterization, model-based stochastic methods for life prediction are showing promising results in the literature. Among various model-based approaches, particle-filtering algorithms are particularly capable of coping with uncertainties associated with the process. These include uncertainties about information on the damage extent and the inherent uncertainties of the damage propagation process. Some efforts have shown successful applications of particle filtering-based frameworks for predicting the matrix crack evolution and structural stiffness degradation caused by repetitive fatigue loads. Effects of other damage modes such as delamination, however, are not incorporated in these works. It is well established that delamination and matrix cracks not only co-exist in most laminate structures during the fatigue degradation process but also affect each other's progression. Furthermore, delamination significantly alters the stress-state in the laminates and accelerates the material degradation leading to catastrophic failure. Therefore, the work presented herein proposes a particle filtering-based framework for predicting a structure's remaining useful life with consideration of multiple co-existing damage-mechanisms. The framework uses an energy-based model from the composite modeling literature. The multiple damage-mode model has been shown to suitably estimate the energy release rate of cross-ply laminates as affected by matrix cracks and delamination modes. The model is also able to estimate the reduction in stiffness of the damaged laminate. This information is then used in the algorithms for life prediction capabilities. 
First, a brief summary of the energy-based damage model is provided. Then, the paper describes how the model is embedded within the prognostic framework and how the prognostics performance is assessed using observations from run-to-failure experiments.
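A bootstrap particle filter of the kind underlying such prognostic frameworks alternates propagation through the damage model, likelihood weighting against observations, and resampling. A generic sketch with an invented noisy exponential-growth model standing in for the energy-based damage model (the growth rate and noise levels are illustrative):

```python
import math, random

def pf_step(particles, obs, growth, proc_sigma, obs_sigma, rng):
    """One predict-weight-resample cycle of a bootstrap particle filter."""
    # Predict: push each particle through the (noisy) damage-growth model.
    particles = [p * (1 + growth + rng.gauss(0, proc_sigma)) for p in particles]
    # Update: weight particles by the Gaussian likelihood of the observation.
    weights = [math.exp(-0.5 * ((obs - p) / obs_sigma) ** 2) for p in particles]
    # Resample: draw a new population proportionally to the weights.
    return rng.choices(particles, weights=weights, k=len(particles))
```

Remaining-useful-life prediction then amounts to propagating the resampled population forward without observations until each particle crosses a failure threshold, yielding a distribution rather than a point estimate.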
RFI and SCRIMP Model Development and Verification
NASA Technical Reports Server (NTRS)
Loos, Alfred C.; Sayre, Jay
2000-01-01
Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as infrastructure. A great deal of work still needs to be done on efforts to reduce the costly trial-and-error methods of VARTM processing that are currently in practice today. A computer simulation model of the VARTM process would provide a cost-effective tool in the manufacturing of composites utilizing this technique. Therefore, the objective of this research was to modify an existing three-dimensional, Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify this model with the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool, where this tool would enable the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. The gravity, on the other hand, was found to be negligible for all cases. Finally, the VARTM model was used as a process analysis tool. This enabled the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeable media. A process for a three-stiffener composite panel was proposed. 
This configuration evolved from the variation of the process constraints in the modeling of several different composite panels. The configuration was proposed by considering such factors as: infiltration time, the number of vacuum ports, and possible areas of void entrapment.
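The capillary-pressure effect described above can be illustrated with a one-dimensional Darcy-law estimate of infiltration time; this is a minimal sketch with invented material properties, not the study's three-dimensional model.

```python
# Illustrative 1-D Darcy-flow estimate of VARTM infiltration time.
# All symbols and numerical values below are hypothetical, not from the study.

def infiltration_time(length_m, porosity, viscosity_pa_s,
                      permeability_m2, vacuum_dp_pa, capillary_dp_pa=0.0):
    """Fill time for a 1-D rectilinear flow front via Darcy's law:
    t = porosity * mu * L^2 / (2 * K * (dP_vacuum + dP_capillary))."""
    driving_pressure = vacuum_dp_pa + capillary_dp_pa
    return (porosity * viscosity_pa_s * length_m ** 2
            / (2.0 * permeability_m2 * driving_pressure))

# Same hypothetical panel, with and without a capillary pressure term:
t_no_cap = infiltration_time(0.5, 0.5, 0.2, 1e-10, 9.0e4)
t_cap = infiltration_time(0.5, 0.5, 0.2, 1e-10, 9.0e4, capillary_dp_pa=5.0e3)
```

Because capillary pressure adds to the vacuum-driven pressure differential, the estimated fill time drops, consistent with the few-percent reduction reported above.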
Cargo launch vehicles to low earth orbit
NASA Technical Reports Server (NTRS)
Austin, Robert E.
1990-01-01
There are two primary space transportation capabilities required to support both base programs and expanded mission requirements: earth-to-orbit (ETO) transportation systems and space transfer vehicle systems. Existing and new ETO vehicles required to support mission requirements and planned robotic missions, along with currently planned ETO vehicles, are described. Lunar outposts, Mars outposts, base and expanded models, ETO vehicles, advanced avionics technologies, expert systems, network architecture and operations systems, and technology transfer are discussed.
2013-11-01
by existing cyber-attack detection tools far exceeds the analysts' cognitive capabilities. Grounded in perceptual and cognitive theory, many visual... Processes Inspired by the sense-making theory discussed earlier, we model the analytical reasoning process of cyber analysts using three key... analyst are called "working hypotheses"); each hypothesis could trigger further actions to confirm or disconfirm it. New actions will lead to new
Domestic Wind Energy Workforce; NREL (National Renewable Energy Laboratory)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tegen, Suzanne
2015-07-30
A robust workforce is essential to growing domestic wind manufacturing capabilities. NREL researchers conducted research to better understand today's domestic wind workforce, projected needs for the future, and how existing and new education and training programs can meet future needs. This presentation provides an overview of this research and the accompanying industry survey, as well as the Energy Department's Career Maps, Jobs & Economic Development Impacts models, and the Wind for Schools project.
Intelligent model-based diagnostics for vehicle health management
NASA Astrophysics Data System (ADS)
Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki
2003-08-01
The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.
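The graph-based dependency model mentioned above can be sketched as a mapping from faults to the tests they affect; the following is a minimal, hypothetical example (fault and test names are invented, not from the paper).

```python
# Minimal sketch of graph-based dependency-model diagnosis.
# Each fault maps to the set of tests it can cause to fail; a fault is a
# candidate diagnosis if the observed failures are a subset of its signature
# and it explains at least one failed test.

DEPENDENCY = {                 # fault -> affected tests (illustrative only)
    "fuel_pump": {"pressure_test", "flow_test"},
    "o2_sensor": {"emissions_test"},
    "injector":  {"flow_test", "emissions_test"},
}

def candidate_faults(failed_tests, model=DEPENDENCY):
    failed = set(failed_tests)
    return sorted(fault for fault, signature in model.items()
                  if failed <= signature and failed & signature)

print(candidate_faults({"flow_test"}))  # both fuel_pump and injector explain it
```

A real diagnostic engine would also weigh quantitative-model residuals and test reliabilities; this sketch shows only the dependency-graph consistency step.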
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-08-01
The study has been undertaken by the Glosten Associates, Inc., to evaluate the existing capability for emergency towing at Prince William Sound and to examine alternatives that could enhance the escort and assist capabilities for disabled tankers within the waterway from the Alyeska Oil Terminal at the Port of Valdez to the Gulf of Alaska outside Hinchinbrook Entrance. Part 1, reported herein, is an objective evaluation by an experienced salvage towing master of the existing tugs, emergency towing equipment, towing practices, and discussion of alternative tug types.
NEXT Ion Propulsion System Development Status and Capabilities
NASA Technical Reports Server (NTRS)
Patterson, Michael J.; Benson, Scott W.
2008-01-01
NASA's Evolutionary Xenon Thruster (NEXT) project is developing next-generation ion propulsion technologies to provide future NASA science missions with enhanced mission performance at a low total development cost. The objective of the NEXT project is to advance next-generation ion propulsion technology by producing engineering model system components, validating these through qualification-level and integrated system testing, and ensuring preparedness for transitioning to flight system development. As the NEXT technology program completes advanced development activities, it is advantageous to review the existing technology capabilities of the system under development. This paper describes the NEXT ion propulsion system's development status, characteristics, and performance. A review of mission analyses conducted to date using the NEXT system is also provided.
The Soldier Fitness Tracker: global delivery of Comprehensive Soldier Fitness.
Fravell, Mike; Nasser, Katherine; Cornum, Rhonda
2011-01-01
Carefully implemented technology strategies are vital to the success of large-scale initiatives such as the U.S. Army's Comprehensive Soldier Fitness (CSF) program. Achieving the U.S. Army's vision for CSF required a robust information technology platform that was scaled to millions of users and that leveraged the Internet to enable global reach. The platform needed to be agile, provide powerful real-time reporting, and have the capacity to quickly transform to meet emerging requirements. Existing organizational applications, such as "Single Sign-On," and authoritative data sources were exploited to the maximum extent possible. Development of the "Soldier Fitness Tracker" is the most recent, and possibly the best, demonstration of the potential benefits possible when existing organizational capabilities are married to new, innovative applications. Combining the capabilities of the extant applications with the newly developed applications expedited development, eliminated redundant data collection, exceeded program objectives, and produced a comfortable experience for the end user, all in less than six months. This is a model for future technology integration. (c) 2010 APA, all rights reserved.
Automated Software Development Workstation (ASDW)
NASA Technical Reports Server (NTRS)
Fridge, Ernie
1990-01-01
Software development is a serious bottleneck in the construction of complex automated systems. An increase of the reuse of software designs and components has been viewed as a way to relieve this bottleneck. One approach to achieving software reusability is through the development and use of software parts composition systems. A software parts composition system is a software development environment comprised of a parts description language for modeling parts and their interfaces, a catalog of existing parts, a composition editor that aids a user in the specification of a new application from existing parts, and a code generator that takes a specification and generates an implementation of a new application in a target language. The Automated Software Development Workstation (ASDW) is an expert system shell that provides the capabilities required to develop and manipulate these software parts composition systems. The ASDW is now in Beta testing at the Johnson Space Center. Future work centers on responding to user feedback for capability and usability enhancement, expanding the scope of the software lifecycle that is covered, and in providing solutions to handling very large libraries of reusable components.
Higher-order harmonics coupling in different free-electron laser codes
NASA Astrophysics Data System (ADS)
Giannessi, L.; Freund, H. P.; Musumeci, P.; Reiche, S.
2008-08-01
The capability to simulate free-electron laser dynamics, including the higher-order harmonics in linear undulators, exists in several codes such as MEDUSA [H.P. Freund, S.G. Biedron, and S.V. Milton, IEEE J. Quantum Electron. 27 (2000) 243; H.P. Freund, Phys. Rev. ST-AB 8 (2005) 110701] and PERSEO [L. Giannessi, Overview of Perseo, a system for simulating FEL dynamics in Mathcad, <http://www.jacow.org>, in: Proceedings of FEL 2006 Conference, BESSY, Berlin, Germany, 2006, p. 91], and has recently been implemented in GENESIS 1.3 [See <http://www.perseo.enea.it>]. MEDUSA and GENESIS also include the dynamics of even harmonics induced by coupling through the betatron motion. In addition, MEDUSA, which is based on a non-wiggler-averaged model, is capable of simulating the generation of even harmonics in the transversely cold beam regime, i.e. when the even-harmonic coupling arises from nonlinear effects associated with longitudinal particle dynamics rather than from a finite beam emittance. In this paper a comparison between the predictions of the codes under different conditions is given.
An eHealth Capabilities Framework for Graduates and Health Professionals: Mixed-Methods Study
McGregor, Deborah; Keep, Melanie; Janssen, Anna; Spallek, Heiko; Quinn, Deleana; Jones, Aaron; Tseris, Emma; Yeung, Wilson; Togher, Leanne; Solman, Annette; Shaw, Tim
2018-01-01
Background: The demand for an eHealth-ready and adaptable workforce is placing increasing pressure on universities to deliver eHealth education. At present, eHealth education is largely focused on components of eHealth rather than considering a curriculum-wide approach. Objective: This study aimed to develop a framework that could be used to guide health curriculum design based on current evidence and on stakeholder perceptions of the eHealth capabilities expected of tertiary health graduates. Methods: A 3-phase, mixed-methods approach incorporated the results of a literature review, focus groups, and a Delphi process to develop a framework of eHealth capability statements. Results: Participants (N=39) with expertise or experience in eHealth education, practice, or policy provided feedback on the proposed framework, and following the fourth iteration of this process, consensus was achieved. The final framework consisted of 4 higher-level capability statements that describe the learning outcomes expected of university graduates across the domains of (1) digital health technologies, systems, and policies; (2) clinical practice; (3) data analysis and knowledge creation; and (4) technology implementation and codesign. Across the capability statements are 40 performance cues that provide examples of how these capabilities might be demonstrated. Conclusions: The results of this study inform a cross-faculty eHealth curriculum that aligns with workforce expectations. There is a need for educational curricula to reinforce existing eHealth capabilities, adapt existing capabilities to make them transferable to novel eHealth contexts, and introduce new learning opportunities for interactions with technologies within education and practice encounters. As such, the capability framework developed may assist in the application of eHealth by emerging and existing health care professionals. Future research needs to explore the potential for integration of findings into workforce development programs. PMID:29764794
Feasibility study on the design of a probe for rectal cancer detection
NASA Technical Reports Server (NTRS)
Anselm, V. J.; Frazer, R. E.; Lecroisset, D. H.; Roseboro, J. A.; Smokler, M. I.
1977-01-01
Rectal examination techniques are considered in terms of detection capability, patient acceptance, and cost reduction. A review of existing clinical techniques and of relevant aerospace technology included evaluation of the applicability of visual, thermal, ultrasound, and radioisotope modalities of examination. The desired improvements can be obtained by redesigning the proctosigmoidoscope to have reduced size, additional visibility, and the capability of readily providing a color photograph of the entire rectosigmoid mucosa in a single composite view.
Digital Architecture – Results From a Gap Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxstrand, Johanna Helene; Thomas, Kenneth David; Fitzgerald, Kirk
The digital architecture is defined as a collection of IT capabilities needed to support and integrate a wide spectrum of real-time digital capabilities for nuclear power plant performance improvements. The digital architecture can be thought of as an integration of the separate I&C and information systems already in place in NPPs, brought together for the purpose of creating new levels of automation in NPP work activities. In some cases, it might be an extension of the current communication systems, to provide digital communications where they are currently analog only. This collection of IT capabilities must in turn be based on a set of user requirements that must be supported for the interconnected technologies to operate in an integrated manner. These requirements, simply put, are a statement of what sorts of digital work functions will be exercised in a fully implemented seamless digital environment and how much they will be used. The goal of the digital architecture research is to develop a methodology for mapping nuclear power plant operational and support activities into the digital architecture, which includes the development of a consensus model for advanced information and control architecture. The consensus model should be developed at a level of detail that is useful to the industry; in other words, not so detailed that it specifies particular protocols, yet not so vague that it only provides a high-level description of the technology. The next step toward the model development is to determine the current state of digital architecture at typical NPPs. To investigate the current state, the researchers conducted a gap analysis to determine to what extent the NPPs can support the future digital technology environment with their existing I&C and IT structure, and where gaps exist with respect to the full deployment of technology over time. The methodology, results, and conclusions from the gap analysis are described in this report.
Evaluation of new collision-pair selection models in DSMC
NASA Astrophysics Data System (ADS)
Akhlaghi, Hassan; Roohi, Ehsan
2017-10-01
The current paper investigates new collision-pair selection procedures in a direct simulation Monte Carlo (DSMC) method. Collision partner selection based on the random procedure from nearest neighbor particles and deterministic selection of nearest neighbor particles have already been introduced as schemes that provide accurate results in a wide range of problems. In the current research, new collision-pair selections based on the time spacing and direction of the relative movement of particles are introduced and evaluated. Comparisons between the new and existing algorithms are made considering appropriate test cases including fluctuations in homogeneous gas, 2D equilibrium flow, and Fourier flow problem. Distribution functions for number of particles and collisions in cell, velocity components, and collisional parameters (collision separation, time spacing, relative velocity, and the angle between relative movements of particles) are investigated and compared with existing analytical relations for each model. The capability of each model in the prediction of the heat flux in the Fourier problem at different cell numbers, numbers of particles, and time steps is examined. For new and existing collision-pair selection schemes, the effect of an alternative formula for the number of collision-pair selections and avoiding repetitive collisions are investigated via the prediction of the Fourier heat flux. The simulation results demonstrate the advantages and weaknesses of each model in different test cases.
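As a rough illustration of the nearest-neighbor collision-partner selection discussed above, the following sketch picks a first particle at random within a cell and pairs it with its nearest neighbor; the cell geometry and particle positions are invented for the example.

```python
# Sketch of one DSMC collision-pair selection rule: random first particle,
# nearest neighbour in the cell as its collision partner.

import math
import random

def nearest_neighbor_pair(positions, rng=random):
    """Return (i, j): i chosen at random, j = nearest neighbour of i."""
    i = rng.randrange(len(positions))
    j = min((k for k in range(len(positions)) if k != i),
            key=lambda k: math.dist(positions[i], positions[k]))
    return i, j

# Hypothetical 2-D particle positions inside one cell:
cell = [(0.0, 0.0), (0.1, 0.0), (0.9, 0.9), (0.15, 0.05)]
i, j = nearest_neighbor_pair(cell)
```

Time-spacing or relative-movement-direction criteria, as evaluated in the paper, would replace the `math.dist` key with a different figure of merit for ranking candidate partners.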
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Muth, Jr.; Jared Abodeely; Richard Nelson
Agricultural residues have significant potential as a feedstock for bioenergy production, but removing these residues can have negative impacts on soil health. Models and datasets that can support decisions about sustainable agricultural residue removal are available; however, no tools currently exist that are capable of simultaneously addressing all of the environmental factors that can limit availability of residue. The VE-Suite model integration framework has been used to couple a set of environmental process models to support agricultural residue removal decisions. The RUSLE2, WEPS, and Soil Conditioning Index models have been integrated, along with a disparate set of databases providing the soils, climate, and management practice data required to run these models. The integrated system has been demonstrated for two example cases. First, an assessment using high-spatial-fidelity crop yield data was run for a single farm. This analysis shows the significant variance in sustainably accessible residue across a single farm and crop year. The second example is an aggregate assessment of agricultural residues available in the state of Iowa. This implementation of the integrated systems model demonstrates the capability to run the vast range of scenarios required to represent a large geographic region.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Podestà, M.; Gorelenkova, M.; Gorelenkov, N. N.
2017-07-20
Alfvénic instabilities (AEs) are well known as a potential cause of enhanced fast ion transport in fusion devices. Given a specific plasma scenario, quantitative predictions of (i) the expected unstable AE spectrum and (ii) the resulting fast ion transport are required to prevent or mitigate the AE-induced degradation in fusion performance. Reduced models are becoming an attractive tool for analyzing existing scenarios as well as for scenario prediction in time-dependent simulations. In this work, a neutral-beam-heated NSTX discharge is used as a reference to illustrate the potential of a reduced fast ion transport model, known as the kick model, that has recently been implemented for interpretive and predictive analysis within the framework of the time-dependent tokamak transport code TRANSP. Predictive capabilities for AE stability and saturation amplitude are first assessed, based on given thermal plasma profiles only. Predictions are then compared to experimental results, and the interpretive capabilities of the model are further discussed. Overall, the reduced model captures the main properties of the instabilities and the associated effects on the fast ion population. Finally, additional information from the actual experiment enables further tuning of the model's parameters to achieve a close match with measurements.
Three-Dimensional Modeling of Aircraft High-Lift Components with Vehicle Sketch Pad
NASA Technical Reports Server (NTRS)
Olson, Erik D.
2016-01-01
Vehicle Sketch Pad (OpenVSP) is a parametric geometry modeler that has been used extensively for conceptual design studies of aircraft, including studies using higher-order analysis. OpenVSP can model flap and slat surfaces using simple shearing of the airfoil coordinates, which is an appropriate level of complexity for lower-order aerodynamic analysis methods. For three-dimensional analysis, however, there is not a built-in method for defining the high-lift components in OpenVSP in a realistic manner, or for controlling their complex motions in a parametric manner that is intuitive to the designer. This paper seeks instead to utilize OpenVSP's existing capabilities, and establish a set of best practices for modeling high-lift components at a level of complexity suitable for higher-order analysis methods. Techniques are described for modeling the flap and slat components as separate three-dimensional surfaces, and for controlling their motion using simple parameters defined in the local hinge-axis frame of reference. To demonstrate the methodology, an OpenVSP model for the Energy-Efficient Transport (EET) AR12 wind-tunnel model has been created, taking advantage of OpenVSP's Advanced Parameter Linking capability to translate the motions of the high-lift components from the hinge-axis coordinate system to a set of transformations in OpenVSP's frame of reference.
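The hinge-axis motion control described above amounts to rotating flap-surface points about an arbitrary hinge line. A generic sketch using Rodrigues' rotation formula follows; the hinge geometry and deflection are made up for illustration, not taken from the EET AR12 definition.

```python
# Rotate a point about an arbitrary hinge line (origin + axis direction)
# by a deflection angle, via Rodrigues' rotation formula.

import math

def rotate_about_axis(p, origin, axis, angle_deg):
    """Rotate point p about the hinge line through `origin` along `axis`."""
    ax, ay, az = axis
    n = math.sqrt(ax*ax + ay*ay + az*az)
    kx, ky, kz = ax/n, ay/n, az/n                    # unit hinge axis k
    vx, vy, vz = p[0]-origin[0], p[1]-origin[1], p[2]-origin[2]
    th = math.radians(angle_deg)
    c, s = math.cos(th), math.sin(th)
    # Rodrigues: v' = v*c + (k x v)*s + k*(k . v)*(1 - c)
    cx = ky*vz - kz*vy
    cy = kz*vx - kx*vz
    cz = kx*vy - ky*vx
    d = kx*vx + ky*vy + kz*vz
    return (origin[0] + vx*c + cx*s + kx*d*(1-c),
            origin[1] + vy*c + cy*s + ky*d*(1-c),
            origin[2] + vz*c + cz*s + kz*d*(1-c))

# Hypothetical 30-degree flap deflection about a spanwise (y) hinge at x = 2.0:
trailing_edge = rotate_about_axis((3.0, 0.0, 0.0), (2.0, 0.0, 0.0),
                                  (0.0, 1.0, 0.0), 30.0)
```

In an OpenVSP workflow, a parameter link would feed the resulting translations and rotations to the flap component's transform parameters in the model's frame of reference.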
IB2d: a Python and MATLAB implementation of the immersed boundary method.
Battista, Nicholas A; Strickland, W Christopher; Miller, Laura A
2017-03-29
The development of fluid-structure interaction (FSI) software involves trade-offs between ease of use, generality, performance, and cost. There are typically large learning curves when using low-level software to model the interaction of an elastic structure immersed in a uniform-density fluid. Many existing codes are not publicly available, and the commercial software that does exist usually requires expensive licenses and may not be as robust or as flexible as in-house codes. We present an open source immersed boundary software package, IB2d, with full implementations in both MATLAB and Python, that is capable of running a vast range of biomechanics models and is accessible to scientists with experience in high-level programming environments. IB2d contains multiple options for constructing material properties of the fiber structure, as well as the advection-diffusion of a chemical gradient, muscle mechanics models, and artificial forcing to drive boundaries with a preferred motion.
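The core immersed-boundary operation, spreading a Lagrangian boundary force onto the Eulerian fluid grid through a regularized delta function, can be sketched in one dimension. This illustrates the idea only and is not IB2d's actual API.

```python
# Toy 1-D illustration of immersed-boundary force spreading with a
# Peskin-style regularized delta function (kernel support |r| < 2h).

import math

def delta_h(r, h):
    """Four-point cosine delta kernel."""
    x = abs(r) / h
    return (1.0 + math.cos(math.pi * x / 2.0)) / (4.0 * h) if x < 2.0 else 0.0

def spread_force(x_lag, f_lag, grid_x, h):
    """f_grid[i] = sum_k f_lag[k] * delta_h(grid_x[i] - x_lag[k])."""
    return [sum(f * delta_h(gx - xk, h) for xk, f in zip(x_lag, f_lag))
            for gx in grid_x]

h = 0.1
grid = [i * h for i in range(11)]            # Eulerian grid on [0, 1]
f_grid = spread_force([0.5], [1.0], grid, h)  # unit point force at x = 0.5
# The kernel conserves total force: sum(f_grid) * h recovers the unit force.
```

The same kernel is used in the reverse direction to interpolate the fluid velocity back onto the boundary points, which is what couples the structure to the flow.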
Basal and thermal control mechanisms of the Ragnhild glaciers, East Antarctica
NASA Astrophysics Data System (ADS)
Pattyn, Frank; de Brabander, Sang; Huyghe, Ann
The Ragnhild glaciers are three enhanced-flow features situated between the Sør Rondane and Yamato Mountains in eastern Dronning Maud Land, Antarctica. We investigate the glaciological mechanisms controlling their existence and behavior, using a three-dimensional numerical thermomechanical ice-sheet model including higher-order stress gradients. This model is further extended with a steady-state model of subglacial water flow, based on the hydraulic potential gradient. Both static and dynamic simulations are capable of reproducing the enhanced ice-flow features. Although basal topography is responsible for the existence of the flow pattern, thermomechanical effects and basal sliding seem to locally soften and lubricate the ice in the main trunks. Lateral drag is a contributing factor in balancing the driving stress, as shear margins can be traced over a distance of hundreds of kilometers along west Ragnhild glacier. Different basal sliding scenarios show that central Ragnhild glacier stagnates as west Ragnhild glacier accelerates and progressively drains the whole catchment area by ice and water piracy.
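The steady-state water-routing idea above can be sketched as follows: assuming subglacial water pressure equals the ice overburden, the hydraulic potential is phi = rho_w*g*z_b + rho_i*g*(z_s - z_b), and water drains down the potential gradient. The flowline values below are invented, not Ragnhild glacier data.

```python
# Hydraulic potential along a hypothetical 1-D flowline, assuming water
# pressure equals ice overburden (a common steady-state simplification).

RHO_W, RHO_I, G = 1000.0, 910.0, 9.81  # water/ice density (kg/m^3), gravity

def hydraulic_potential(surface_m, bed_m):
    """phi = rho_w*g*z_b + rho_i*g*(z_s - z_b), in Pa."""
    return RHO_W * G * bed_m + RHO_I * G * (surface_m - bed_m)

surface = [3000.0, 2800.0, 2500.0]   # ice surface elevation (m)
bed = [400.0, 500.0, 300.0]          # bed elevation (m), with an overdeepening
phi = [hydraulic_potential(s, b) for s, b in zip(surface, bed)]

# Water drains down-glacier only if phi decreases along the flowline;
# note the potential can fall even where the bed rises, because the
# surface slope term dominates (roughly 11x the bed slope term).
drains_downglacier = all(a > b for a, b in zip(phi, phi[1:]))
```

This is why water piracy between adjacent trunks, as suggested for central and west Ragnhild, is controlled jointly by surface and bed topography rather than by the bed alone.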
Conceptual Privacy Framework for Health Information on Wearable Device
Safavi, Seyedmostafa; Shukur, Zarina
2014-01-01
Wearable health technology gives doctors the ability to remotely supervise their patients' wellness, and makes it far easier than before to authorize someone else to take appropriate actions to ensure a person's wellness. Information technology may soon change the way medicine is practiced, improving performance while reducing the price of healthcare. We analyzed the privacy demands of wearable devices, including smartphones and smartwatches, and their computing techniques, which may soon change the way healthcare is provided. However, before this is adopted in practice, all devices must be equipped with sufficient privacy capabilities for healthcare services. In this paper, we formulate a new, improved conceptual framework for wearable healthcare systems. The framework consists of ten principles and nine checklists, capable of providing a complete privacy protection package to wearable device owners. We constructed the framework from an analysis of existing mobile technology, combined with existing security standards; the approach also incorporates the market-share percentage of every app and its respective OS. The framework is evaluated against the stringent CIA and HIPAA principles for information security. This evaluation is followed by testing the capability to revoke subjects' rights to access objects and the ability to determine the set of available permissions for a particular subject, for all models. Finally, we examine the complexity of the required initial setup. PMID:25478915
NASA Technical Reports Server (NTRS)
Callis, L. B.; Boughner, R. E.; Natarajan, M.
1983-01-01
The coupling that exists between infrared opacity changes and tropospheric (and to a lesser extent stratospheric) chemistry is explored in considerable detail, and the effects arising from various perturbations are examined. The studies are carried out with a fully coupled one-dimensional radiative-convective-photochemical model (RCP) that extends from the surface to 53.5 km and has the capability of calculating surface temperature changes due to both chemical and radiative perturbations. The model encompasses contemporary atmospheric chemistry and photochemistry involving the O(x), HO(x), NO(x), and Cl(x) species.
SBML and CellML translation in antimony and JSim.
Smith, Lucian P; Butterworth, Erik; Bassingthwaighte, James B; Sauro, Herbert M
2014-04-01
The creation and exchange of biologically relevant models is of great interest to many researchers. When multiple standards are in use, models are more readily used and re-used if robust translators exist between the various accepted formats. Antimony 2.4 and JSim 2.10 provide translation capabilities from their own formats to SBML and CellML. Each translation posed unique challenges, stemming from differences in the formats' inherent designs as well as differences in functionality. Both programs are available under BSD licenses; Antimony from http://antimony.sourceforge.net/ and JSim from http://physiome.org/jsim/. lpsmith@u.washington.edu.
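As a toy illustration of this kind of format translation, the sketch below turns a single Antimony-style reaction line into a minimal SBML-like XML fragment. It is deliberately simplified and is not the actual Antimony or JSim API.

```python
# Toy translator: one Antimony-style reaction line -> SBML-like XML element.
# Real SBML has a full schema (namespaces, MathML kinetic laws, etc.); this
# only shows the structural mapping the abstract refers to.

import xml.etree.ElementTree as ET

def antimony_reaction_to_sbml(line):
    """Translate 'A -> B; rate_expression' into a minimal <reaction> element."""
    eqn, rate = (part.strip() for part in line.split(";"))
    reactant, product = (s.strip() for s in eqn.split("->"))
    rxn = ET.Element("reaction")
    ET.SubElement(ET.SubElement(rxn, "listOfReactants"),
                  "speciesReference", species=reactant)
    ET.SubElement(ET.SubElement(rxn, "listOfProducts"),
                  "speciesReference", species=product)
    ET.SubElement(ET.SubElement(rxn, "kineticLaw"), "math").text = rate
    return ET.tostring(rxn, encoding="unicode")

sbml_fragment = antimony_reaction_to_sbml("S -> P; k * S")
```

Differences like this one, between a terse reaction grammar and a verbose XML schema, are exactly the design mismatches that make round-trip translation between formats challenging.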
Analysis of Bonded Joints Between the Facesheet and Flange of Corrugated Composite Panels
NASA Technical Reports Server (NTRS)
Yarrington, Phillip W.; Collier, Craig S.; Bednarcyk, Brett A.
2008-01-01
This paper outlines a method for the stress analysis of bonded composite corrugated panel facesheet to flange joints. The method relies on the existing HyperSizer Joints software, which analyzes the bonded joint, along with a beam analogy model that provides the necessary boundary loading conditions to the joint analysis. The method is capable of predicting the full multiaxial stress and strain fields within the flange to facesheet joint and thus can determine ply-level margins and evaluate delamination. Results comparing the method to NASTRAN finite element model stress fields are provided illustrating the accuracy of the method.
Human Germline CRISPR-Cas Modification: Toward a Regulatory Framework
Evitt, Niklaus H.; Mascharak, Shamik; Altman, Russ B.
2015-01-01
CRISPR germline editing therapies (CGETs) hold unprecedented potential to eradicate hereditary disorders. However, the prospect of altering the human germline has sparked a debate over the safety, efficacy, and morality of CGETs, triggering a funding moratorium by the NIH. There is an urgent need for practical paths for the evaluation of these capabilities. We propose a model regulatory framework for CGET research, clinical development, and distribution. Our model takes advantage of existing legal and regulatory institutions but adds elevated scrutiny at each stage of CGET development to accommodate the unique technical and ethical challenges posed by germline editing. PMID:26632357
Propulsion Ground Testing: Planning for the Future
NASA Technical Reports Server (NTRS)
Bruce, Robert
2003-01-01
Advanced planners are constantly asked to plan for the provision of future test capability. Historically, this capability is provided either by substantial investment in new test facilities or by substantial investment in the modification of pre-existing test capabilities. The key words in the previous sentence are "substantial investment." In an environment of increasingly constrained resources, how is an advanced planner to plan for the provision of such capabilities? Additionally, a conundrum exists: program formulation decisions are nominally based on life-cycle cost, yet the more immediate challenge of "front-end" capital investment is often the linchpin upon which early decisions are made. In such an environment, how are plans and decisions made? This paper cites examples of past decisions in both major test facility upgrades and major new test facility investment.
NASA Technical Reports Server (NTRS)
Schmalzel, John L.; Morris, Jon; Turowski, Mark; Figueroa, Fernando; Oostdyk, Rebecca
2008-01-01
There are a number of architecture models for implementing Integrated Systems Health Management (ISHM) capabilities, for example, approaches based on the OSA-CBM and OSA-EAI models, or specific architectures developed in response to local needs. NASA's John C. Stennis Space Center (SSC) has developed one such version of an extensible architecture in support of rocket engine testing that integrates a palette of functions in order to achieve an ISHM capability. Among the functional capabilities that are supported by the framework are: prognostic models, anomaly detection, a database of supporting health information, root cause analysis, intelligent elements, and integrated awareness. This paper focuses on the role that intelligent elements can play in ISHM architectures. We define an intelligent element as a smart element with sufficient computing capacity to support anomaly detection or other algorithms in support of ISHM functions. A smart element has the capability of supporting networked implementations of IEEE 1451.x smart sensor and actuator protocols. The ISHM group at SSC has been actively developing intelligent elements in conjunction with several partners at other Centers, universities, and companies as part of our ISHM approach to better supporting rocket engine testing. We have developed several implementations. Among the key features of these intelligent sensors is support for IEEE 1451.1 and incorporation of a suite of algorithms for determination of sensor health. Regardless of the potential advantages that can be achieved using intelligent sensors, existing large-scale systems are still based on conventional sensors and data acquisition systems. In order to bring the benefits of intelligent sensors to these environments, we have also developed virtual implementations of intelligent sensors.
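As an illustration only (not SSC's implementation; the class and channel names here are hypothetical), a "virtual intelligent sensor" of the kind described above can be approximated by wrapping a conventional data-acquisition channel with an on-board health check, such as a simple z-score anomaly test:

```python
from collections import deque
import statistics

class VirtualIntelligentSensor:
    """Wraps a conventional data-acquisition channel and adds a simple
    self-health check, mimicking the role of an intelligent element."""

    def __init__(self, name, window=50, z_threshold=4.0):
        self.name = name
        self.window = deque(maxlen=window)   # recent readings
        self.z_threshold = z_threshold       # anomaly cutoff in std deviations

    def ingest(self, value):
        """Return (value, healthy); healthy is False for statistical outliers."""
        healthy = True
        if len(self.window) >= 10:           # need history before judging
            mean = statistics.fmean(self.window)
            stdev = statistics.stdev(self.window)
            if stdev > 0 and abs(value - mean) / stdev > self.z_threshold:
                healthy = False
        self.window.append(value)
        return value, healthy
```

A real intelligent element would layer IEEE 1451.x smart-transducer protocol support and a fuller suite of health algorithms on top of this kind of self-assessment.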
NASA Astrophysics Data System (ADS)
Frederick, J. M.; Bull, D. L.; Jones, C.; Roberts, J.; Thomas, M. A.
2016-12-01
Arctic coastlines are receding at accelerated rates, putting existing and future activities in the developing coastal Arctic environment at extreme risk. For example, at Oliktok Long Range Radar Site, erosion that was not expected until 2040 was reached as of 2014 (Alaska Public Media). As the Arctic Ocean becomes increasingly ice-free, rates of coastal erosion will likely continue to increase as (a) increased ice-free waters generate larger waves, (b) sea levels rise, and (c) coastal permafrost soils warm and lose strength/cohesion. Due to the complex and rapidly varying nature of the Arctic region, little is known about the increasing waves, changing circulation, permafrost soil degradation, and the response of the coastline to changes in these combined conditions. However, as scientific focus has been shifting towards the polar regions, Arctic science is rapidly advancing, increasing our understanding of complex Arctic processes. Our present understanding allows us to begin to develop and evaluate the coupled models necessary for the prediction of coastal erosion in support of Arctic risk assessments. What are the best steps towards the development of a coupled model for Arctic coastal erosion? This work focuses on our current understanding of Arctic conditions and identifying the tools and methods required to develop an integrated framework capable of accurately predicting Arctic coastline erosion and assessing coastal risk and hazards. We will present a summary of the state-of-the-science, and identify existing tools and methods required to develop an integrated diagnostic and monitoring framework capable of accurately predicting and assessing Arctic coastline erosion, infrastructure risk, and coastal hazards. 
The summary will describe the key coastal processes to simulate, appropriate models to use, effective methods to couple existing models, and identify gaps in knowledge that require further attention to make progress in our understanding of Arctic coastal erosion. * Co-authors listed in alphabetical order. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Rate and time dependent behavior of structural adhesives. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Renieri, M. P.; Herakovich, C. T.; Brinson, H. F.
1976-01-01
Studies on two adhesives (Metlbond 1113 and 1113-2) identified as having applications in the bonding of composite materials are presented. Constitutive equations capable of describing changes in material behavior with strain rate are derived from various theoretical approaches. It is shown that certain unique relationships exist between these approaches. It is also shown that the constitutive equation derived from mechanical models can be used for creep and relaxation loading. A creep to failure phenomenon is shown to exist and is correlated with a delayed yield equation proposed by Crochet. Loading-unloading results are presented and are shown to correlate well with the proposed form of the loading-unloading equations for the modified Bingham model. Experimental results obtained for relaxation tests above and below the glass transition temperature are presented. It is shown that the adhesives obey the time-temperature superposition principle.
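For context, one widely used textbook statement of the modified Bingham model referenced above is the following (the thesis' exact formulation and notation may differ):

```latex
\dot{\varepsilon} =
\begin{cases}
  \dfrac{\dot{\sigma}}{E}, & \sigma < \theta \\[6pt]
  \dfrac{\dot{\sigma}}{E} + \dfrac{\sigma - \theta}{\mu}, & \sigma \ge \theta
\end{cases}
```

where E is the elastic modulus, mu the viscosity, and theta the yield stress: below yield the response is purely elastic, and above yield an additional viscous flow term activates, which is what allows such a model to describe loading-unloading and delayed-yield behavior.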
Object-oriented Approach to High-level Network Monitoring and Management
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi
2000-01-01
An absolute prerequisite for the management of large computer networks is the ability to measure their performance. Unless we monitor a system, we cannot hope to manage and control its performance. In this paper, we describe a network monitoring system that we are currently designing and implementing. Keeping in mind the complexity of the task and the required flexibility for future changes, we use an object-oriented design methodology. The system is built using the APIs offered by the HP OpenView system. We are investigating methods to build high-level monitoring systems on top of existing monitoring tools. Due to the heterogeneous nature of the underlying systems at NASA Langley Research Center, we use an object-oriented approach for the design: first, we use UML (Unified Modeling Language) to model users' requirements; second, we identify the existing capabilities of the underlying monitoring system; third, we try to map the former to the latter.
Nowicki, Dimitri; Siegelmann, Hava
2010-01-01
This paper introduces a new model of associative memory, capable of both binary and continuous-valued inputs. Based on kernel theory, the memory model is on one hand a generalization of Radial Basis Function networks and, on the other, analogous in feature space to a Hopfield network. Attractors can be added, deleted, and updated on-line simply, without harming existing memories, and the number of attractors is independent of input dimension. Input vectors do not have to adhere to a fixed or bounded dimensionality; they can increase or decrease it without relearning previous memories. A memory consolidation process enables the network to generalize concepts and form clusters of input data, which outperforms many unsupervised clustering techniques; this process is demonstrated on handwritten digits from MNIST. Another process, reminiscent of memory reconsolidation, is introduced, in which existing memories are refreshed and tuned with new inputs; this process is demonstrated on a series of morphed faces. PMID:20552013
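The attractor dynamic described above can be sketched as an RBF-kernel mean-shift recall. This assumes a Gaussian kernel and is far simpler than the paper's kernel-space construction, but it shows the core behavior: iterating a query toward the nearest stored memory.

```python
import numpy as np

def recall(query, memories, gamma=10.0, iters=50):
    """Drive a query vector toward the nearest stored memory by iterated
    RBF-kernel-weighted averaging (a mean-shift-like attractor dynamic)."""
    x = np.asarray(query, dtype=float)
    M = np.asarray(memories, dtype=float)
    for _ in range(iters):
        w = np.exp(-gamma * np.sum((M - x) ** 2, axis=1))  # kernel weights
        x = w @ M / w.sum()                                # weighted mean
    return x
```

Note that appending a new row to `memories` adds an attractor without disturbing the existing ones, mirroring the on-line add/delete property claimed in the abstract.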
Coupled electromechanical model of the heart: Parallel finite element formulation.
Lafortune, Pierre; Arís, Ruth; Vázquez, Mariano; Houzeaux, Guillaume
2012-01-01
In this paper, a highly parallel coupled electromechanical model of the heart is presented and assessed. The parallel coupled model is thoroughly discussed, with scalability proven up to hundreds of cores. This work focuses on the mechanical part, including the constitutive model (proposing some modifications to pre-existing models), the numerical scheme, and the coupling strategy. The model is next assessed through two examples. First, the simulation of a small piece of cardiac tissue is used to introduce the main features of the coupled model and calibrate its parameters against experimental evidence. Then, a more realistic problem is solved using those parameters, with a mesh of the Oxford ventricular rabbit model. The results of both examples demonstrate the capability of the model to run efficiently on hundreds of processors and to reproduce some basic characteristics of cardiac deformation.
Linking Assessment to Undergraduate Student Capabilities through Portfolio Examination
ERIC Educational Resources Information Center
O'Sullivan, Anthony J.; Harris, Peter; Hughes, Chris S.; Toohey, Susan M.; Balasooriya, Chinthaka; Velan, Gary; Kumar, Rakesh K.; McNeil, H. Patrick
2012-01-01
Portfolios are an established method of assessment, although concerns do exist around their validity for capabilities such as reflection and self-direction. This article describes an e-portfolio which closely aligns learning and reflection to graduate capabilities, incorporating features that address concerns about portfolios. Students are…
The Metadata Cloud: The Last Piece of a Distributed Data System Model
NASA Astrophysics Data System (ADS)
King, T. A.; Cecconi, B.; Hughes, J. S.; Walker, R. J.; Roberts, D.; Thieman, J. R.; Joy, S. P.; Mafi, J. N.; Gangloff, M.
2012-12-01
Distributed data systems have existed ever since systems were networked together. Over the years the model for distributed data systems has evolved from basic file transfer to client-server to multi-tiered to grid and finally to cloud-based systems. Initially metadata was tightly coupled to the data, either by embedding the metadata in the same file containing the data or by co-locating the metadata in commonly named files. As the sources of data have multiplied, data volumes have increased, and services have specialized to improve efficiency, a cloud system model has emerged. In a cloud system, computing and storage are provided as services, with accessibility emphasized over physical location. Computation and data clouds are common implementations. Effectively using the data and computation capabilities requires metadata. When metadata is stored separately from the data, a metadata cloud is formed. With a metadata cloud, information and knowledge about data resources can migrate efficiently from system to system, enabling services and allowing the data to remain efficiently stored until used. This is especially important with "Big Data", where movement of the data is limited by bandwidth. We examine how the metadata cloud completes a general distributed data system model, how standards play a role, and relate this to the existing types of cloud computing. We also look at the major science data systems in existence and compare each to the generalized cloud system model.
NASA Astrophysics Data System (ADS)
Preiss, Bruce; Greene, Lloyd; Kriebel, Jamie; Wasson, Robert
2006-05-01
The Air Force Research Laboratory utilizes a value model as a primary input for space technology planning and budgeting. The Space Sector at AFRL headquarters manages space technology investment across all the geographically disparate technical directorates and ensures that integrated planning is achieved across the space community. The space investment portfolio must ultimately balance near, mid, and far-term investments across all the critical space mission areas. Investment levels and growth areas can always be identified by a typical capability analysis or gap analysis, but the value model approach goes one step deeper and helps identify the potential payoff of technology investments by linking the technology directly to an existing or potential concept. The value of the technology is then viewed from the enabling performance perspective of the concept that ultimately fulfills the Air Force mission. The process of linking space technologies to future concepts and technology roadmaps will be reviewed in this paper, along with representative results from this planning cycle. The initial assumptions in this process will be identified along with the strengths and weaknesses of this planning methodology.
Initial Integration of Noise Prediction Tools for Acoustic Scattering Effects
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Burley, Casey L.; Tinetti, Ana; Rawls, John W.
2008-01-01
This effort provides an initial glimpse at NASA capabilities available in predicting the scattering of fan noise from a non-conventional aircraft configuration. The Aircraft NOise Prediction Program, Fast Scattering Code, and the Rotorcraft Noise Model were coupled to provide increased-fidelity models of scattering effects on engine fan noise sources. The integration of these codes led to the identification of several key issues entailed in applying such multi-fidelity approaches. In particular, for prediction at noise certification points, the inclusion of distributed sources leads to complications with the source semi-sphere approach. Computational resource requirements limit the use of the higher-fidelity scattering code to predict radiated sound pressure levels for full-scale configurations at relevant frequencies. Finally, the ability to more accurately represent complex shielding surfaces in current lower-fidelity models is necessary for general application to scattering predictions. This initial step in determining the potential benefits/costs of these new methods over the existing capabilities illustrates a number of the issues that must be addressed in the development of next-generation aircraft system noise prediction tools.
A Synthetic Vision Preliminary Integrated Safety Analysis
NASA Technical Reports Server (NTRS)
Hemm, Robert; Houser, Scott
2001-01-01
This report documents efforts to analyze a sample of aviation safety programs, using the LMI-developed integrated safety analysis tool to determine the change in system risk resulting from Aviation Safety Program (AvSP) technology implementation. Specifically, we have worked to modify existing system safety tools to address the safety impact of synthetic vision (SV) technology. Safety metrics include reliability, availability, and resultant hazard. This analysis of SV technology is intended to be part of a larger effort to develop a model that is capable of "providing further support to the product design and development team as additional information becomes available". The reliability analysis portion of the effort is complete and is fully documented in this report. The simulation analysis is still underway; it will be documented in a subsequent report. The specific goal of this effort is to apply the integrated safety analysis to SV technology. This report also contains a brief discussion of data necessary to expand the human performance capability of the model, as well as a discussion of human behavior and its implications for system risk assessment in this modeling environment.
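The reliability and availability metrics mentioned above reduce, in the simplest steady-state case, to standard MTBF/MTTR arithmetic. The sketch below uses only those generic formulas (it is not the LMI tool's actual model) to show how element availabilities combine for a serial chain such as sensor, processor, and display:

```python
def availability(mtbf_hours, mttr_hours):
    """Steady-state availability of a single element:
    fraction of time up = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series_availability(elements):
    """Availability of a chain of elements that must all be up.
    `elements` is a list of (MTBF, MTTR) pairs; availabilities multiply."""
    a = 1.0
    for mtbf, mttr in elements:
        a *= availability(mtbf, mttr)
    return a
```

For example, two elements each with an MTBF of 999 hours and an MTTR of 1 hour give individual availabilities of 0.999 and a chain availability of 0.999 squared, about 0.998.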
Technology evaluation, assessment, modeling, and simulation: the TEAMS capability
NASA Astrophysics Data System (ADS)
Holland, Orgal T.; Stiegler, Robert L.
1998-08-01
The United States Marine Corps' Technology Evaluation, Assessment, Modeling and Simulation (TEAMS) capability, located at the Naval Surface Warfare Center in Dahlgren, Virginia, provides an environment for detailed test, evaluation, and assessment of live and simulated sensor and sensor-to-shooter systems for the joint warfare community. Frequent use of modeling and simulation allows for cost-effective testing, benchmarking, and evaluation of various levels of sensors and sensor-to-shooter engagements. Interconnectivity to live, instrumented equipment operating in real battle space environments and to remote modeling and simulation facilities participating in advanced distributed simulation (ADS) exercises is available to support a wide range of situational assessment requirements. TEAMS provides a valuable resource for a variety of users. Engineers, analysts, and other technology developers can use TEAMS to evaluate, assess, and analyze tactically relevant phenomenological data on tactical situations. Expeditionary warfare and USMC concept developers can use the facility to support and execute advanced warfighting experiments (AWE) to better assess operational maneuver from the sea (OMFTS) concepts, doctrines, and technology developments. Developers can use the facility to support sensor system hardware, software, and algorithm development as well as combat development, acquisition, and engineering processes. Test and evaluation specialists can use the facility to plan, assess, and augment their processes. This paper presents an overview of the TEAMS capability and focuses specifically on the technical challenges associated with the integration of live sensor hardware into a synthetic environment and how those challenges are being met. Existing sensors, recent experiments, and facility specifications are featured.
NASA Technical Reports Server (NTRS)
Rubinstein, R. (Editor); Rumsey, C. L. (Editor); Salas, M. D. (Editor); Thomas, J. L. (Editor); Bushnell, Dennis M. (Technical Monitor)
2001-01-01
Advances in turbulence modeling are needed in order to calculate high Reynolds number flows near the onset of separation and beyond. To this end, the participants in this workshop made the following recommendations. (1) A national/international database and standards for turbulence modeling assessment should be established. Existing experimental data sets should be reviewed and categorized. Advantage should be taken of other efforts already underway, such as that of the European Research Community on Flow, Turbulence, and Combustion (ERCOFTAC) consortium. Carefully selected "unit" experiments will be needed, as well as advances in instrumentation, to fill the gaps in existing datasets. A high priority should be given to documenting existing turbulence model capabilities in a standard form, including numerical implementation issues such as grid quality and resolution. (2) NASA should support long-term research on Algebraic Stress Models and Reynolds Stress Models. The emphasis should be placed on improving the length-scale equation, since it is the least understood and is a key component of two-equation and higher models. Second priority should be given to the development of improved near-wall models. Direct Numerical Simulations (DNS) and Large Eddy Simulations (LES) would provide valuable guidance in developing and validating new Reynolds-averaged Navier-Stokes (RANS) models. Although not the focus of this workshop, DNS, LES, and hybrid methods currently represent viable approaches for analysis on a limited basis. Therefore, although computer limitations require the use of RANS methods for realistic configurations at high Reynolds number in the foreseeable future, a balanced effort in turbulence modeling development, validation, and implementation should include these approaches as well.
Afzali, Maryam; Fatemizadeh, Emad; Soltanian-Zadeh, Hamid
2015-09-30
Diffusion weighted imaging (DWI) is a non-invasive method for investigating the brain white matter structure and can be used to evaluate fiber bundles. However, due to practical constraints, DWI data acquired in clinics are low resolution. This paper proposes a method for interpolation of orientation distribution functions (ODFs). To this end, fuzzy clustering is applied to segment ODFs based on the principal diffusion directions (PDDs). Next, each cluster is modeled by a tensor so that an ODF is represented by a mixture of tensors. For interpolation, each tensor is rotated separately. The method is applied to synthetic and real DWI data of control and epileptic subjects. Both experiments illustrate the capability of the method to properly increase the spatial resolution of the data in the ODF field. The real dataset shows that the method is capable of reliably identifying differences between temporal lobe epilepsy (TLE) patients and normal subjects. The method is compared to existing methods. Comparison studies show that the proposed method generates smaller angular errors relative to the existing methods. Another advantage of the method is that it does not require an iterative algorithm to find the tensors. The proposed method is appropriate for increasing resolution in the ODF field and can be applied to clinical data to improve evaluation of white matter fibers in the brain. Copyright © 2015 Elsevier B.V. All rights reserved.
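A single-tensor analogue of the interpolation step above (rotating the principal frame rather than averaging tensor entries) can be sketched as follows. This is a simplified stand-in, not the paper's fuzzy-clustered mixture-of-tensors formulation:

```python
import numpy as np

def _axis_angle(R):
    """Axis and angle of a proper rotation matrix (angle assumed < pi)."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-12:
        return np.array([0.0, 0.0, 1.0]), 0.0
    v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return v / (2.0 * np.sin(theta)), theta

def _rodrigues(axis, theta):
    """Rotation matrix from axis-angle (Rodrigues' formula)."""
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def _principal_frame(D):
    """Eigen-decompose; eigenvalues descending, frame made a proper rotation."""
    w, V = np.linalg.eigh(D)
    w, V = w[::-1].copy(), V[:, ::-1].copy()
    if np.linalg.det(V) < 0:
        V[:, 2] *= -1
    return w, V

def _closest_frame(V0, V1):
    """Resolve eigenvector sign ambiguity: among the proper frames equivalent
    to V1 (even numbers of column flips), pick the one closest to V0."""
    best, best_tr = V1, np.trace(V1 @ V0.T)
    for i, j in [(0, 1), (0, 2), (1, 2)]:
        C = V1.copy()
        C[:, i] *= -1
        C[:, j] *= -1
        tr = np.trace(C @ V0.T)
        if tr > best_tr:
            best, best_tr = C, tr
    return best

def interp_tensor(D0, D1, t):
    """Interpolate two diffusion tensors by rotating the principal frame
    along the geodesic and blending eigenvalues linearly."""
    w0, V0 = _principal_frame(np.asarray(D0, float))
    w1, V1 = _principal_frame(np.asarray(D1, float))
    V1 = _closest_frame(V0, V1)
    axis, theta = _axis_angle(V1 @ V0.T)
    Vt = _rodrigues(axis, t * theta) @ V0
    wt = (1.0 - t) * w0 + t * w1
    return Vt @ np.diag(wt) @ Vt.T
```

Direct entry-wise averaging of tensors is known to distort anisotropy (the so-called swelling effect); rotating the frame separately, as in the paper's per-tensor rotation, avoids it.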
DFL, Canada's Space AIT Facilities - Current and Planned Capabilities
NASA Astrophysics Data System (ADS)
Singhal, R.; Mishra, S.; Choueiry, E.; Dumoulin, J.; Ahmed, S.
2004-08-01
The David Florida Laboratory (DFL) of the Canadian Space Agency is the Canadian national ISO 9001:2000 registered facility for the assembly, integration, and (environmental) testing of space hardware. This paper briefly describes the three main qualification facilities: Structural Qualification Facilities (SQF), Radio Frequency Qualification Facilities (RFQF), and Thermal Qualification Facilities (TQF). The paper also describes planned and new upgrades and improvements to the DFL's existing capabilities. These include: a cylindrical near-field antenna measurement system; current capabilities in multi-frequency, multi-band passive intermodulation (PIM) measurement; a combined thermal/vibration test facility; improvement in the efficiency and performance of the photogrammetry capability; acquisition of an additional mass properties measurement system for small and micro-satellites; a combined control and data acquisition system for all existing thermal vacuum facilities; plus a new automatic thermal control system and a hypobaric chamber.
National facilities study. Volume 4: Space operations facilities task group
NASA Technical Reports Server (NTRS)
1994-01-01
The principal objectives of the National Facilities Study (NFS) were to: (1) determine where U.S. facilities do not meet national aerospace needs; (2) define new facilities required to make U.S. capabilities 'world class' where such improvements are in the national interest; (3) define where consolidation and phase-out of existing facilities is appropriate; and (4) develop a long-term national plan for world-class facility acquisition and shared usage. The Space Operations Facilities Task Group defined discrete tasks to accomplish the above objectives within the scope of the study. An assessment of national space operations facilities was conducted to determine the nation's capability to meet the requirements of space operations during the next 30 years. The mission model used in the study to define facility requirements is described in Volume 3. Based on this model, the major focus of the Task Group was to identify any substantive overlap or underutilization of space operations facilities and to identify any facility shortfalls that would necessitate facility upgrades or new facilities. The focus of this initial study was directed toward facility recommendations related to consolidations, closures, enhancements, and upgrades considered necessary to efficiently and effectively support the baseline requirements model. Activities related to identifying facility needs or recommendations for enhancing U.S. international competitiveness and achieving world-class capability, where appropriate, were deferred to a subsequent study phase.
NASA Astrophysics Data System (ADS)
Abrams, T.; Ding, R.; Guo, H. Y.; Thomas, D. M.; Chrobak, C. P.; Rudakov, D. L.; McLean, A. G.; Unterberg, E. A.; Briesemeister, A. R.; Stangeby, P. C.; Elder, J. D.; Wampler, W. R.; Watkins, J. G.
2017-05-01
It is important to develop a predictive capability for the tungsten source rate near the strike points during H-mode operation in ITER and beyond. H-mode deuterium plasma exposures were performed on W-coated graphite and molybdenum substrates in the DIII-D divertor using DiMES. The W-I 400.9 nm spectral line was monitored by fast filtered diagnostics, cross-calibrated via a high-resolution spectrometer, to resolve inter-ELM W erosion. The effective ionizations per photon (S/XB) value was calibrated using a unique method developed on DIII-D based on surface analysis. Inferred S/XB values agree with an existing empirical scaling at low electron density (n_e) but diverge at higher densities, consistent with recent ADAS atomic physics modeling results. Edge modeling of the inter-ELM phase is conducted via OEDGE, utilizing the new capability for charge-state-resolved carbon impurity fluxes. ERO modeling is performed with the calculated main-ion and impurity plasma background from OEDGE. ERO results demonstrate the importance of a mixed-material surface model in the interpretation of W sourcing measurements. It is demonstrated that measured inter-ELM W erosion rates can be well explained by C→W sputtering only if a realistic mixed-material model is incorporated.
Argobots: A Lightweight Low-Level Threading and Tasking Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan
In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, are either too specific to applications or architectures or are not as powerful or flexible. In this paper, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by the user or high-level programming model. We describe the design, implementation, and optimization of Argobots and present integrations with three example high-level models: OpenMP, MPI, and a co-located I/O service. Evaluations show that (1) Argobots outperforms existing generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency hiding capabilities; and (4) the I/O service with Argobots reduces interference with co-located applications, achieving performance competitive with that of the Pthreads version.
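Argobots itself is a C library; purely to illustrate the execution model it provides (many cheap cooperative tasks multiplexed over a shared ready pool), here is a toy Python sketch with hypothetical names. This is not the Argobots API:

```python
from collections import deque

class Pool:
    """A FIFO pool of user-level tasks. Each `yield` in a task hands control
    back to the scheduler, mimicking cooperative ULT scheduling."""
    def __init__(self):
        self.ready = deque()

    def create(self, gen):
        self.ready.append(gen)  # creation cost: one queue append, no OS thread

    def run(self):
        while self.ready:
            task = self.ready.popleft()
            try:
                next(task)               # run the task until it yields
                self.ready.append(task)  # cooperative: requeue it
            except StopIteration:
                pass                     # task finished

log = []
def worker(name, steps):
    for i in range(steps):
        log.append((name, i))
        yield  # yield the execution stream to the next task

pool = Pool()
pool.create(worker("A", 2))
pool.create(worker("B", 2))
pool.run()
# round-robin interleaving: [("A", 0), ("B", 0), ("A", 1), ("B", 1)]
```

Real user-level threading frameworks add work stealing, multiple execution streams (one per core), and stack switching, but the cheap-creation and cooperative-yield ideas are the same.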
NASA Astrophysics Data System (ADS)
Podestà, M.; Gorelenkova, M.; Gorelenkov, N. N.; White, R. B.
2017-09-01
Alfvénic instabilities (AEs) are well known as a potential cause of enhanced fast ion transport in fusion devices. Given a specific plasma scenario, quantitative predictions of (i) expected unstable AE spectrum and (ii) resulting fast ion transport are required to prevent or mitigate the AE-induced degradation in fusion performance. Reduced models are becoming an attractive tool to analyze existing scenarios as well as for scenario prediction in time-dependent simulations. In this work, a neutral beam heated NSTX discharge is used as reference to illustrate the potential of a reduced fast ion transport model, known as kick model, that has been recently implemented for interpretive and predictive analysis within the framework of the time-dependent tokamak transport code TRANSP. Predictive capabilities for AE stability and saturation amplitude are first assessed, based on given thermal plasma profiles only. Predictions are then compared to experimental results, and the interpretive capabilities of the model further discussed. Overall, the reduced model captures the main properties of the instabilities and associated effects on the fast ion population. Additional information from the actual experiment enables further tuning of the model’s parameters to achieve a close match with measurements.
The search for extraterrestrial intelligence.
Wilson, T L
2001-02-22
As far as we know, humanity is alone in the Universe: there is no definite evidence for the existence of extraterrestrial life, let alone extraterrestrial civilizations (ETCs) capable of communicating or travelling over interstellar distances. Yet popular speculation about the existence of ETCs abounds, including reports of alien visitations either now or in the past. But there is a middle way. It is now possible to put limits on the existence of ETCs of varying capabilities, within arbitrary distances from the Solar System, and conceive of real-world strategies whereby we might communicate with ETCs, or they with us.
Leveraging existing information for use in a National Nuclear Forensics Library (NNFL)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davydov, Jerry; Dion, Heather; LaMont, Stephen
A National Nuclear Forensics Library (NNFL) assists a State in assessing whether nuclear material encountered out of regulatory control is of domestic or international origin. By leveraging nuclear material registries, nuclear enterprise records, and safeguards accountancy information, as well as existing domestic technical capability and subject-matter domain expertise, States can better assess the effort required to set up an NNFL. States that are largely recipients of nuclear and radiological materials and have no internal production capabilities may create an NNFL that relies on existing information rather than carrying out advanced analyses on domestic materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleese van Dam, Kerstin; Lansing, Carina S.; Elsethagen, Todd O.
2014-01-28
Modern workflow systems enable scientists to run ensemble simulations at unprecedented scales and levels of complexity, allowing them to study system sizes previously impossible to achieve, due to the inherent resource requirements needed for the modeling work. However, as a result of these new capabilities the science teams suddenly also face unprecedented data volumes that they are unable to analyze with their existing tools and methodologies in a timely fashion. In this paper we describe the ongoing development work to create an integrated data-intensive scientific workflow and analysis environment that offers researchers the ability to easily create and execute complex simulation studies and provides them with different scalable methods to analyze the resulting data volumes. The integration of simulation and analysis environments is not only a question of ease of use but also supports fundamental functions in the correlated analysis of simulation input, execution details, and derived results for multi-variant, complex studies. To this end the team extended and integrated the existing capabilities of the Velo data management and analysis infrastructure, the MeDICi data-intensive workflow system, and RHIPE, the R-for-Hadoop version of the well-known statistics package, as well as developing a new visual analytics interface for result exploitation by multi-domain users. The capabilities of the new environment are demonstrated on a use case that focuses on the Pacific Northwest National Laboratory (PNNL) building energy team, showing how they were able to take their previously local-scale simulations to a nationwide level by utilizing data-intensive computing techniques not only for their modeling work, but also for the subsequent analysis of their modeling results.
As part of the PNNL research initiative PRIMA (Platform for Regional Integrated Modeling and Analysis), the team performed an initial 3-year study of building energy demands for the US Eastern Interconnect domain, which they are now planning to extend to predict demand for the complete century. The initial study raised their data volumes from a few GB to 400 GB for the 3-year study, with tens of TB expected for the full century.
Maturing Weapon Systems for Improved Availability at Lower Costs
1994-01-01
Development of new measures of R&M performance and improved data collection and analysis capabilities, including innovations in automated data collection. Report sections include: Capabilities Required to Implement Maturation Development; Assess R&M Performance Accurately; Requirements Determination; Capabilities of the Best Existing Databases; Data Elements Needed.
NASA Astrophysics Data System (ADS)
Kapitan, Loginn
This research created a new model which provides an integrated approach to planning the effective selection and employment of airborne sensor systems in response to accidental or intentional chemical vapor releases. The approach taken was to use systems engineering and decision analysis methods to construct a model architecture with a modular structure for integrating both new and existing components into a logical procedure for assessing the application of airborne sensor systems to chemical vapor hazards. The resulting integrated process model includes an internal aggregation model which allows differentiation among alternative airborne sensor systems. Both models were developed and validated by experts and demonstrated using appropriate hazardous chemical release scenarios. The resulting prototype integrated process model fills a current gap in capability, allowing improved planning, training and exercising for HAZMAT teams and first responders when considering the selection and employment of airborne sensor systems. Through the research process, insights were generated into the current response structure and how current airborne capability may be used most effectively. Furthermore, the prototype system is tailorable for local, state, and federal application, and can potentially be modified to help evaluate investments in new airborne sensor technology and systems. Better planning, training and preparedness exercising hold the prospect of effective application of airborne assets for improved response to large-scale chemical release incidents. Improved response will result in fewer casualties and lives lost, reduced economic impact, and increased protection of critical infrastructure when faced with accidental or intentional terrorist release of hazardous industrial chemicals.
With the prospect of more airborne sensor systems becoming available, this prototype system integrates existing and new tools into an effective process for the selection and employment of airborne sensors to better plan, train and exercise ahead of potential chemical release events.
Kate's Model Verification Tools
NASA Technical Reports Server (NTRS)
Morgan, Steve
1991-01-01
Kennedy Space Center's Knowledge-based Autonomous Test Engineer (KATE) is capable of monitoring electromechanical systems, diagnosing their errors, and even repairing them when they crash. A survey of KATE's developer/modelers revealed that they were already using a sophisticated set of productivity enhancing tools. They did request five more, however, and those make up the body of the information presented here: (1) a transfer function code fitter; (2) a FORTRAN-Lisp translator; (3) three existing structural consistency checkers to aid in syntax checking their modeled device frames; (4) an automated procedure for calibrating knowledge base admittances to protect KATE's hardware mockups from inadvertent hand valve twiddling; and (5) three alternatives for the 'pseudo object', a programming patch that currently apprises KATE's modeling devices of their operational environments.
Commerce Laboratory: Mission analysis payload integration study
NASA Technical Reports Server (NTRS)
Bannister, T. C.
1984-01-01
A mission model which will accommodate commercial users and provide a basic data base for further mission planning is reported. The data bases to be developed are: (1) user requirements; (2) apparatus capabilities and availabilities; and (3) carrier capabilities. These data bases are synthesized in a trades and analysis phase along with the STS flight apparatus, and optimum missions will be identified. The completed work is reported. The user requirements data base was expanded to identify within the six scientific disciplines the areas of investigation, investigation categories and status, potential commercial application, interested parties, process, and experiment requirements. The scope of the apparatus data base was expanded to indicate apparatus status as to whether it is ground or flight equipment and, within both categories, whether the apparatus is: (1) existing, (2) under development, (3) planned, or (4) needed. Applications for the apparatus are listed. The methodology is revised in the areas of trades and analysis and mission planning. The carrier capabilities data base was updated and completed.
Data engineering systems: Computerized modeling and data bank capabilities for engineering analysis
NASA Technical Reports Server (NTRS)
Kopp, H.; Trettau, R.; Zolotar, B.
1984-01-01
The Data Engineering System (DES) is a computer-based system that organizes technical data and provides automated mechanisms for storage, retrieval, and engineering analysis. The DES combines the benefits of a structured data base system with automated links to large-scale analysis codes. While the DES provides the user with many of the capabilities of a computer-aided design (CAD) system, the systems are actually quite different in several respects. A typical CAD system emphasizes interactive graphics capabilities and organizes data in a manner that optimizes these graphics. On the other hand, the DES is a computer-aided engineering system intended for the engineer who must operationally understand an existing or planned design or who desires to carry out additional technical analysis based on a particular design. The DES emphasizes data retrieval in a form that not only provides the engineer access to search and display the data but also links the data automatically with the computer analysis codes.
Grayson, Richard; Kay, Paul; Foulger, Miles
2008-01-01
Diffuse pollution poses a threat to water quality and creates a need for treatment of potable water supplies which can prove costly. Within the Yorkshire region, UK, nitrates, pesticides and water colour present particular treatment problems. Catchment management techniques offer an alternative to 'end of pipe' solutions and allow resources to be targeted to the most polluting areas. This project attempted to identify such areas using GIS-based modelling approaches in catchments where water quality data were available. As no model exists to predict water colour, a model was created using an MCE method which is capable of predicting colour concentrations at the catchment scale. CatchIS was used to predict pesticide and nitrate N concentrations and was found to be generally capable of reliably predicting nitrate N loads at the catchment scale. The pesticide results did not match the historic data, possibly due to problems with the historic pesticide data and temporal and spatial variability in pesticide usage. The use of these models can be extended to predict water quality problems in catchments where water quality data are unavailable and to highlight areas of concern. IWA Publishing 2008.
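The colour model described above is built with a multi-criteria evaluation (MCE) method; in its simplest form this is a weighted linear combination of normalized GIS factor layers. A minimal sketch, with entirely hypothetical factor names, values, and weights (the study's actual criteria are not reproduced here):

```python
import numpy as np

def mce_score(factors, weights):
    """Weighted linear combination of min-max normalized factor rasters.

    factors: dict of name -> 2-D array (raw factor values)
    weights: dict of name -> weight; weights should sum to 1.
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    score = np.zeros_like(next(iter(factors.values())), dtype=float)
    for name, raster in factors.items():
        lo, hi = raster.min(), raster.max()
        # Normalize each factor to [0, 1] before weighting
        norm = (raster - lo) / (hi - lo) if hi > lo else np.zeros_like(raster, dtype=float)
        score += weights[name] * norm
    return score

# Hypothetical factor layers for a tiny 2x2 catchment grid
factors = {
    "peat_cover":    np.array([[0.9, 0.1], [0.5, 0.0]]),
    "slope":         np.array([[2.0, 10.0], [5.0, 1.0]]),
    "drain_density": np.array([[1.0, 0.2], [0.6, 0.1]]),
}
weights = {"peat_cover": 0.5, "slope": 0.2, "drain_density": 0.3}
risk = mce_score(factors, weights)   # higher score = higher predicted colour risk
```

Cells scoring highest would be the ones targeted by catchment management; the real model would of course use full raster layers and calibrated weights.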
NASA Technical Reports Server (NTRS)
Liever, Peter A.; West, Jeffrey S.
2016-01-01
A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.
Aerodynamic stability analysis of NASA J85-13/planar pressure pulse generator installation
NASA Technical Reports Server (NTRS)
Chung, K.; Hosny, W. M.; Steenken, W. G.
1980-01-01
A digital computer simulation model for the J85-13/Planar Pressure Pulse Generator (P3G) test installation was developed by modifying an existing General Electric compression system model. This modification included the incorporation of a novel method for describing the unsteady blade lift force, which significantly enhanced the capability of the model to handle unsteady flows. In addition, the frequency response characteristics of the J85-13/P3G test installation were analyzed in support of selecting instrumentation locations that avoid standing wave nodes within the test apparatus and, thus, low signal levels. The feasibility of employing explicit analytical expressions for surge prediction was also studied.
Aeroelastic modeling for the FIT (Functional Integration Technology) team F/A-18 simulation
NASA Technical Reports Server (NTRS)
Zeiler, Thomas A.; Wieseman, Carol D.
1989-01-01
As part of Langley Research Center's commitment to developing multidisciplinary integration methods to improve aerospace systems, the Functional Integration Technology (FIT) team was established to perform dynamics integration research using an existing aircraft configuration, the F/A-18. An essential part of this effort has been the development of a comprehensive simulation modeling capability that includes structural, control, and propulsion dynamics as well as steady and unsteady aerodynamics. The structural and unsteady aerodynamics contributions come from an aeroelastic model. Some details of the aeroelastic modeling done for the FIT team research are presented. Particular attention is given to work done in the area of correction factors to unsteady aerodynamics data.
Agent independent task planning
NASA Technical Reports Server (NTRS)
Davis, William S.
1990-01-01
Agent-Independent Planning is a technique that allows the construction of activity plans without regard to the agent that will perform them. Once generated, a plan is then validated and translated into instructions for a particular agent, whether a robot, crewmember, or software-based control system. Because Space Station Freedom (SSF) is planned for orbital operations for approximately thirty years, it will almost certainly experience numerous enhancements and upgrades, including upgrades in robotic manipulators. Agent-Independent Planning provides the capability to construct plans for SSF operations, independent of specific robotic systems, by combining techniques of object oriented modeling, nonlinear planning and temporal logic. Since a plan is validated using the physical and functional models of a particular agent, new robotic systems can be developed and integrated with existing operations in a robust manner. This technique also provides the capability to generate plans for crewmembers with varying skill levels, and later apply these same plans to more sophisticated robotic manipulators made available by evolutions in technology.
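The validation step described above, checking an agent-independent plan against the physical and functional model of a particular agent before translation, can be illustrated with a small sketch. Action names and capability sets here are hypothetical, invented purely for illustration:

```python
# An agent-independent plan names required capabilities, not agents.
PLAN = [
    {"action": "grasp_orbital_replacement_unit", "needs": {"grasp", "force_feedback"}},
    {"action": "translate_to_worksite",          "needs": {"mobility"}},
    {"action": "torque_bolt",                    "needs": {"grasp", "torque_control"}},
]

# Functional models of candidate agents (capability sets are invented).
AGENTS = {
    "crewmember":  {"grasp", "force_feedback", "mobility", "torque_control"},
    "early_robot": {"grasp", "mobility"},
}

def validate(plan, agent):
    """Return the plan steps the given agent cannot perform."""
    caps = AGENTS[agent]
    return [step["action"] for step in plan if not step["needs"] <= caps]

# A crewmember can execute the whole plan; an early manipulator cannot,
# but the same plan can be revalidated unchanged against an upgraded robot.
gaps_crew = validate(PLAN, "crewmember")
gaps_robot = validate(PLAN, "early_robot")
```

The point of the technique is exactly this separation: when a new manipulator arrives, only the agent model changes, not the plan.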
NASA'S SERVIR Gulf of Mexico Project: The Gulf of Mexico Regional Collaborative (GoMRC)
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Irwin, Daniel; Presson, Joan; Estes, Maury; Estes, Sue; Judd, Kathleen
2006-01-01
The Gulf of Mexico Regional Collaborative (GoMRC) is a NASA-funded project whose goal is to develop an integrated, working prototype IT infrastructure for Earth science data, knowledge and models for the five U.S. Gulf states and Mexico, and to demonstrate its ability to help decision-makers better understand critical Gulf-scale issues. Within this purview, the mission of the project is to provide a cross-cutting solution network and rapid prototyping capability for the Gulf of Mexico region, in order to demonstrate substantial, collaborative, multi-agency research and transitional capabilities using unique NASA data sets and models to address regional problems. SERVIR Mesoamerica is seen as an excellent existing framework that can be used to integrate observational and GIS databases and provide a sensor web interface, visualization and interactive analysis tools, archival functions, data dissemination and product generation within a Rapid Prototyping concept to assist decision-makers in better understanding Gulf-scale environmental issues.
Development of the 15 meter diameter hoop column antenna
NASA Technical Reports Server (NTRS)
1986-01-01
The building of a deployable 15-meter engineering model of the 100-meter antenna based on the point design of an earlier task of this contract, complete with an RF-capable surface, is described. The 15-meter diameter was selected so that the model could be tested in existing manufacturing, near-field RF, thermal vacuum, and structural dynamics facilities. The antenna was designed with four offset paraboloidal reflector surfaces with a focal length of 366.85 in and a primary surface accuracy goal of 0.069 in rms. Surface adjustment capability was provided by manually resetting the length of 96 surface control cords which emanated from the lower column extremity. A detailed description of the 15-meter Hoop/Column Antenna, its major subassemblies, and a history of its fabrication, assembly, deployment testing, and verification measurements are given. The deviation for one aperture surface (except the outboard extremity) was measured after adjustments in follow-on tests at the Martin Marietta Near-Field Facility to be 0.061 in; thus the primary surface goal was achieved.
Asteroid Crew Segment Mission Lean Development
NASA Technical Reports Server (NTRS)
Gard, Joseph; McDonald, Mark
2014-01-01
The Asteroid Retrieval Crewed Mission (ARCM) requires a minimum set of key capabilities, compared here in the context of the baseline EM-1/2 Orion and SLS capabilities. These include life support and human systems capabilities and mission kit capabilities, while minimizing the impact to the Orion and SLS development schedules and funding. Leveraging existing technology development efforts to develop the kits adds functionality to Orion while minimizing cost and mass impact.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1987-01-01
Topics addressed include: tracking and ground-based navigation; communications, spacecraft-ground; station control and system technology; capabilities for existing projects; network upgrade and sustaining; mission interface and support; and Ka-band capabilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shahidehpour, Mohammad
Integrating 20% or more wind energy into the system and transmitting large amounts of wind energy over long distances will require a decision-making capability that can handle very large scale power systems with tens of thousands of buses and lines. There is a need to explore innovative analytical and implementation solutions for continuing reliable operations with the most economical integration of additional wind energy in power systems. A number of wind integration solution paths involve the adoption of new operating policies, dynamic scheduling of wind power across interties, pooling integration services, and adopting new transmission scheduling practices. Such practices can be examined by the decision tool developed by this project. This project developed a very efficient decision tool called the Wind INtegration Simulator (WINS) and applied WINS to facilitate wind energy integration studies. WINS focused on augmenting existing power utility capabilities to support collaborative planning, analysis, and wind integration project implementations. WINS also has the capability of simulating energy storage facilities so that feasibility studies of integrated wind energy system applications can be performed for systems with high wind energy penetrations. The development of WINS represents a major expansion of a very efficient decision tool called the POwer Market Simulator (POMS), which was developed by IIT and has been used extensively for power system studies for decades.
Specifically, WINS provides the following advantages: (1) an integrated framework is included in WINS for the comprehensive modeling of DC transmission configurations, including mono-pole, bi-pole, tri-pole, back-to-back, and multi-terminal connections, as well as AC/DC converter models including current source converters (CSC) and voltage source converters (VSC); (2) an existing shortcoming of traditional decision tools for wind integration is the limited availability of user interfaces, i.e., decision results are often text-based demonstrations; WINS includes a powerful visualization tool and user interface capability for transmission analyses, planning, and assessment, which will be of great interest to power market participants, power system planners and operators, and state and federal regulatory entities; and (3) WINS can handle extended transmission models for wind integration studies. WINS models include limitations on transmission flow as well as bus voltage for analyzing power system states. Existing decision tools often consider transmission flow constraints (dc power flow) alone, which can result in the over-utilization of existing resources when analyzing wind integration. WINS can be used to assist power market participants, including transmission companies, independent system operators, power system operators in vertically integrated utilities, wind energy developers, and regulatory agencies, to analyze the economics, security, and reliability of various options for wind integration, including transmission upgrades and the planning of new transmission facilities. WINS can also be used by industry for the offline training of reliability and operation personnel in analyzing wind integration uncertainties, identifying critical spots in power system operation, analyzing power system vulnerabilities, and providing credible decisions for examining operation and planning options for wind integration.
Research in this project on wind integration included (1) Development of WINS; (2) Transmission Congestion Analysis in the Eastern Interconnection; (3) Analysis of 2030 Large-Scale Wind Energy Integration in the Eastern Interconnection; and (4) Large-Scale Analysis of 2018 Wind Energy Integration in the Eastern U.S. Interconnection. The research resulted in 33 papers, 9 presentations, 9 PhD degrees, 4 MS degrees, and 7 awards. The education activities in this project on wind energy included (1) Wind Energy Training Facility Development and (2) Wind Energy Course Development.
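The contrast drawn above with tools that consider dc power flow constraints alone can be made concrete: the dc approximation reduces network analysis to a single linear solve, B'θ = P, with line flows recovered from angle differences. A minimal sketch on a hypothetical 3-bus system (an illustration of the dc approximation itself, not a WINS interface):

```python
import numpy as np

# Hypothetical 3-bus network: bus 0 is slack; lines are (from, to, reactance p.u.)
lines = [(0, 1, 0.1), (1, 2, 0.1), (0, 2, 0.2)]
P = np.array([0.0, 0.5, -0.5])   # injections (p.u.): wind at bus 1, load at bus 2

# Assemble the susceptance (Laplacian) matrix B
n = 3
B = np.zeros((n, n))
for k, m, x in lines:
    b = 1.0 / x
    B[k, k] += b; B[m, m] += b
    B[k, m] -= b; B[m, k] -= b

# Solve the reduced system B'theta = P with the slack angle fixed at 0
theta = np.zeros(n)
theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])

# Line flows from angle differences; each can be checked against a thermal limit
flows = {(k, m): (theta[k] - theta[m]) / x for k, m, x in lines}
```

Tools such as WINS go beyond this by also modeling bus voltage limits, which the purely angle-based dc formulation above cannot capture.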
Open discovery: An integrated live Linux platform of Bioinformatics tools.
Vetrivel, Umashankar; Pilla, Kalabharath
2008-01-01
Historically, live Linux distributions for Bioinformatics have paved the way for portability of the Bioinformatics workbench in a platform-independent manner. However, most existing live Linux distributions limit their usage to sequence analysis and basic molecular visualization programs and are devoid of data persistence. Hence, Open Discovery, a live Linux distribution, has been developed with the capability to perform complex tasks like molecular modeling, docking and molecular dynamics in a swift manner. Furthermore, it is also equipped with a complete sequence analysis environment and is capable of running Windows executable programs in a Linux environment. Open Discovery portrays the advanced customizable configuration of Fedora, with data persistency accessible via USB drive or DVD. Open Discovery is distributed free under the Academic Free License (AFL) and can be downloaded from http://www.OpenDiscovery.org.in.
NASA Astrophysics Data System (ADS)
Bolon, Kevin M.
The lack of multi-day data for household travel and vehicle capability requirements is an impediment to evaluations of energy savings strategies, since (1) travel requirements vary from day-to-day, and (2) energy-saving transportation options often have reduced capability. This work demonstrates a survey methodology and modeling system for evaluating the energy-savings potential of household travel, considering multi-day travel requirements and capability constraints imposed by the available transportation resources. A stochastic scheduling model is introduced---the multi-day Household Activity Schedule Estimator (mPHASE)---which generates synthetic daily schedules based on "fuzzy" descriptions of activity characteristics using a finite-element representation of activity flexibility, coordination among household members, and scheduling conflict resolution. Results of a thirty-household pilot study are presented in which responses to an interactive computer assisted personal interview were used as inputs to the mPHASE model in order to illustrate the feasibility of generating complex, realistic multi-day household schedules. Study vehicles were equipped with digital cameras and GPS data acquisition equipment to validate the model results. The synthetically generated schedules captured an average of 60 percent of household travel distance, and exhibited many of the characteristics of complex household travel, including day-to-day travel variation, and schedule coordination among household members. Future advances in the methodology may improve the model results, such as encouraging more detailed and accurate responses by providing a selection of generated schedules during the interview. Finally, the Constraints-based Transportation Resource Assignment Model (CTRAM) is introduced. Using an enumerative optimization approach, CTRAM determines the energy-minimizing vehicle-to-trip assignment decisions, considering trip schedules, occupancy, and vehicle capability. 
Designed to accept either actual or synthetic schedules, results of an application of the optimization model to the 2001 and 2009 National Household Travel Survey data show that U.S. households can reduce energy use by 10 percent, on average, by modifying the assignment of existing vehicles to trips. Households in 2009 show a higher tendency to assign vehicles optimally than in 2001, and multi-vehicle households with diverse fleets have greater savings potential, indicating that fleet modification strategies may be effective, particularly under higher energy price conditions.
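CTRAM's enumerative optimization over vehicle-to-trip assignments can be sketched in a few lines. The vehicle parameters and trips below are invented for illustration, and schedule-overlap constraints (which the real model must handle) are omitted:

```python
from itertools import product

# Hypothetical household fleet: energy use = miles * rate (energy units per mile)
vehicles = {"sedan": {"rate": 0.30, "seats": 5},
            "suv":   {"rate": 0.45, "seats": 7}}
trips = [
    {"name": "commute", "miles": 30, "occupants": 1},
    {"name": "school",  "miles": 8,  "occupants": 2},
    {"name": "family",  "miles": 12, "occupants": 6},
]

def best_assignment(vehicles, trips):
    """Enumerate every vehicle-to-trip assignment; keep the feasible minimum."""
    best, best_energy = None, float("inf")
    for combo in product(vehicles, repeat=len(trips)):
        # Capability constraint: the assigned vehicle must seat all occupants
        if any(trip["occupants"] > vehicles[v]["seats"]
               for v, trip in zip(combo, trips)):
            continue
        energy = sum(trip["miles"] * vehicles[v]["rate"]
                     for v, trip in zip(combo, trips))
        if energy < best_energy:
            best, best_energy = combo, energy
    return best, best_energy

assignment, energy = best_assignment(vehicles, trips)
```

Here the six-occupant trip forces the SUV while the smaller trips take the efficient sedan, which is the kind of reassignment behind the reported 10 percent average savings.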
Towards improved capability and confidence in coupled atmospheric and wildland fire modeling
NASA Astrophysics Data System (ADS)
Sauer, Jeremy A.
This dissertation work is aimed at improving the capability and confidence in a modernized and improved version of Los Alamos National Laboratory's coupled atmospheric and wildland fire dynamics model, Higrad-Firetec. Higrad is the hydrodynamics component of this large eddy simulation model that solves the three-dimensional, fully compressible Navier-Stokes equations, incorporating a dynamic eddy viscosity formulation through a two-scale turbulence closure scheme. Firetec is the vegetation, drag forcing, and combustion physics portion that is integrated with Higrad. The modern version of Higrad-Firetec incorporates multiple numerical methodologies and high performance computing aspects which combine to yield a unique tool capable of augmenting theoretical and observational investigations in order to better understand the multi-scale, multi-phase, and multi-physics phenomena involved in coupled atmospheric and environmental dynamics. More specifically, the current work includes extended functionality and validation efforts targeting component processes in coupled atmospheric and wildland fire scenarios. Since observational data of sufficient quality and resolution to validate the fully coupled atmosphere-wildfire scenario simply do not exist, we instead seek to validate components of the full, prohibitively convoluted process. This manuscript provides, first, an introduction and background to the application space of Higrad-Firetec. Second, we document the model formulation, solution procedure, and a simple scalar transport verification exercise. Third, we validate model results against observational data for time-averaged flow field metrics in and above four idealized forest canopies. Fourth, we carry out a validation effort for the non-buoyant jet in a crossflow scenario (to which an analogy can be made for atmosphere-wildfire interactions), comparing model results to laboratory data of both steady-in-time and unsteady-in-time metrics.
Finally, an extension of the model's multi-phase physics is implemented, allowing for the representation of multiple collocated fuels as separately evolving constituents, leading to differences in the resulting rate of spread and total burned area. In combination, these efforts demonstrate improved capability, increased validation of component functionality, and the unique applicability of the Higrad-Firetec modeling framework. As a result, this work provides a substantially more robust foundation for future investigations into the complexities of coupled atmospheric and wildland fire behavior.
Numerical study of low-frequency discharge oscillations in a 5 kW Hall thruster
NASA Astrophysics Data System (ADS)
Le, YANG; Tianping, ZHANG; Juanjuan, CHEN; Yanhui, JIA
2018-07-01
A two-dimensional particle-in-cell plasma model is built in the R–Z plane to investigate the low-frequency plasma oscillations in the discharge channel of a 5 kW LHT-140 Hall thruster. In addition to the elastic, excitation, and ionization collisions between neutral atoms and electrons, the Coulomb collisions between electrons and electrons and between electrons and ions are analyzed. The sheath characteristic distortion is also corrected. Simulation results indicate the capability of the built model to reproduce the low-frequency oscillation with high accuracy. The oscillations of the discharge current and ion density produced by the model are consistent with the existing conclusions. The model predicts a frequency that is consistent with that calculated by the zero-dimensional theoretical model.
Development of a Solid-Oxide Fuel Cell/Gas Turbine Hybrid System Model for Aerospace Applications
NASA Technical Reports Server (NTRS)
Freeh, Joshua E.; Pratt, Joseph W.; Brouwer, Jacob
2004-01-01
Recent interest in fuel cell-gas turbine hybrid applications for the aerospace industry has led to the need for accurate computer simulation models to aid in system design and performance evaluation. To meet this requirement, solid oxide fuel cell (SOFC) and fuel processor models have been developed and incorporated into the Numerical Propulsion Systems Simulation (NPSS) software package. The SOFC and reformer models solve systems of equations governing steady-state performance using common theoretical and semi-empirical terms. An example hybrid configuration is presented that demonstrates the new capability as well as the interaction with pre-existing gas turbine and heat exchanger models. Finally, a comparison of calculated SOFC performance with experimental data is presented to demonstrate model validity.
Keywords: Solid Oxide Fuel Cell, Reformer, System Model, Aerospace, Hybrid System, NPSS
A summary of existing and planned experiment hardware for low-gravity fluids research
NASA Technical Reports Server (NTRS)
Hill, Myron E.; Omalley, Terence F.
1991-01-01
An overview is presented of (1) existing ground-based, low-gravity research facilities, with examples of hardware capabilities, and (2) existing and planned space-based research facilities, with examples of current and past flight hardware. Low-gravity ground-based facilities, such as drop towers and aircraft, provide the experimenter with quick turnaround time, easy access to equipment, gravity levels ranging from 10^-2 to 10^-6 g, and low-gravity durations ranging from 2 to 30 sec. Currently, the only operational space-based facility is the Space Shuttle. The Shuttle's payload bay and middeck facilities are described. Existing and planned low-gravity fluids research facilities are also described with examples of experiments and hardware capabilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rutqvist, Jonny; Blanco Martin, Laura; Mukhopadhyay, Sumit
In this report, we present FY2014 progress by Lawrence Berkeley National Laboratory (LBNL) related to modeling of coupled thermal-hydrological-mechanical-chemical (THMC) processes in salt and their effect on brine migration at high temperatures. LBNL's work on the modeling of coupled THMC processes in salt was initiated in FY2012, focusing on exploring and demonstrating the capabilities of an existing LBNL modeling tool (TOUGH-FLAC) for simulating temperature-driven coupled flow and geomechanical processes in salt. This work includes development related to, and implementation of, essential capabilities, as well as testing the model against relevant information and published experimental data related to the fate and transport of water. Here we provide more details on the FY2014 work, first presenting updated tools and improvements made to the TOUGH-FLAC simulator, and the use of this updated tool in a new model simulation of long-term THM behavior within a generic repository in a salt formation. This is followed by a description of current benchmarking and validation efforts, including the TSDE experiment. We then present the current status of the development of constitutive relationships and the dual-continuum model for brine migration. We conclude with an outlook for FY2015, which will focus on model validation against field experiments and on the use of the model for design studies related to a proposed heater experiment.
PhreeqcRM: A reaction module for transport simulators based on the geochemical model PHREEQC
Parkhurst, David L.; Wissmeier, Laurin
2015-01-01
PhreeqcRM is a geochemical reaction module designed specifically to perform equilibrium and kinetic reaction calculations for reactive transport simulators that use an operator-splitting approach. The basic function of the reaction module is to take component concentrations from the model cells of the transport simulator, run geochemical reactions, and return updated component concentrations to the transport simulator. If multicomponent diffusion is modeled (e.g., Nernst–Planck equation), then aqueous species concentrations can be used instead of component concentrations. The reaction capabilities are a complete implementation of the reaction capabilities of PHREEQC. In each cell, the reaction module maintains the composition of all of the reactants, which may include minerals, exchangers, surface complexers, gas phases, solid solutions, and user-defined kinetic reactants. PhreeqcRM assigns initial and boundary conditions for model cells based on standard PHREEQC input definitions (files or strings) of chemical compositions of solutions and reactants. Additional PhreeqcRM capabilities include methods to eliminate reaction calculations for inactive parts of a model domain, transfer concentrations and other model properties, and retrieve selected results. The module demonstrates good scalability for parallel processing by using multiprocessing with MPI (message passing interface) on distributed memory systems, and limited scalability using multithreading with OpenMP on shared memory systems. PhreeqcRM is written in C++, but interfaces allow methods to be called from C or Fortran. By using the PhreeqcRM reaction module, an existing multicomponent transport simulator can be extended to simulate a wide range of geochemical reactions. Results of the implementation of PhreeqcRM as the reaction engine for the transport simulators PHAST and FEFLOW are shown by using an analytical solution and the reactive transport benchmark of MoMaS.
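The operator-splitting loop that such a reaction module serves, a transport step followed by a cell-by-cell reaction step on the updated concentrations, can be sketched generically. The `react` function below is a placeholder first-order decay standing in for the geochemistry; it is not the real PhreeqcRM interface:

```python
def advect(conc, inflow, courant=0.5):
    """Toy 1-D upwind advection of one component (the transport step)."""
    out = conc[:]
    out[0] = conc[0] + courant * (inflow - conc[0])
    for i in range(1, len(conc)):
        out[i] = conc[i] + courant * (conc[i - 1] - conc[i])
    return out

def react(conc, k=0.2):
    """Placeholder reaction step: first-order decay in every cell.

    In a real coupling, this call would hand the cell concentrations to the
    reaction module and receive the equilibrated values back.
    """
    return [c * (1.0 - k) for c in conc]

cells = [0.0] * 5          # component concentration in each model cell
for _ in range(10):        # time loop: split transport and reaction
    cells = advect(cells, inflow=1.0)
    cells = react(cells)
```

The split keeps the transport code and the chemistry independent of each other, which is what lets one reaction module serve simulators as different as PHAST and FEFLOW.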
NASA Astrophysics Data System (ADS)
Zhang, Ming
Recent trends in the electric power industry have led to more attention to optimal operation of power transformers. In a deregulated environment, optimal operation means minimizing maintenance and extending the life of this critical and costly equipment for the purpose of maximizing profits. Optimal utilization of a transformer can be achieved through the use of dynamic loading. A benefit of dynamic loading is that it allows better utilization of the transformer capacity, thus increasing the flexibility and reliability of the power system. This document presents progress on a software application that can estimate the maximum time-varying loading capability of transformers. This information can be used to load devices closer to their limits without exceeding the manufacturer-specified operating limits. Maximally efficient dynamic loading of transformers requires a model that can accurately predict both top-oil temperatures (TOTs) and hottest-spot temperatures (HSTs). In previous work, two kinds of thermal models have been studied and used in the application: the IEEE TOT/HST models and the ASU TOT/HST models. Several metrics have been applied to evaluate model acceptability and to determine the most appropriate models for use in dynamic loading calculations. In this work, an investigation into improving the performance of the existing transformer thermal models is presented. Factors that may affect model performance, such as improper fan status and errors caused by the poor performance of the IEEE models, are discussed. Additional methods for determining the reliability of transformer thermal models, using metrics such as the time constant and the model parameters, are also provided. A new production-grade application for real-time dynamic loading operation is introduced. This application was developed using an existing planning application, TTeMP, which is designed for dispatchers and load specialists, as a starting point.
To overcome the limitations of TTeMP, the new application can perform dynamic loading calculations under emergency conditions, such as loss-of-transformer loading. It is also capable of determining the emergency rating of transformers in real time.
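The IEEE-style top-oil temperature prediction mentioned above can be sketched as a first-order exponential response driven by the per-unit load. This is a simplified sketch of the general form of such models; the parameter names and default values below are illustrative assumptions, not the application's actual settings.

```python
import math

def top_oil_rise(t_hours, K, R=4.0, dtheta_rated=45.0, n=0.9,
                 tau_hours=3.0, theta_initial=0.0):
    """Simplified IEEE-style top-oil temperature rise over ambient (deg C).

    K             per-unit load on the transformer
    R             ratio of rated load loss to no-load loss
    dtheta_rated  top-oil rise at rated load (deg C)
    n             empirical oil exponent
    tau_hours     top-oil time constant (hours)
    theta_initial top-oil rise at t = 0 (deg C)
    """
    # ultimate (steady-state) rise for this load level
    theta_ult = dtheta_rated * ((K ** 2 * R + 1.0) / (R + 1.0)) ** n
    # first-order exponential transition from the initial rise
    return theta_ult + (theta_initial - theta_ult) * math.exp(-t_hours / tau_hours)
```

At rated load (K = 1) the ultimate rise equals the rated rise, and after several time constants the computed rise approaches it, which is the behavior a dynamic loading application exploits when deciding how long a transformer can carry an overload.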
Ahmad, Zulfiqar; Ashraf, Arshad; Fryar, Alan; Akhter, Gulraiz
2011-02-01
The integration of Geographic Information Systems (GIS) with groundwater modeling and satellite remote sensing capabilities has provided an efficient way of analyzing and monitoring groundwater behavior and its associated land conditions. A 3-dimensional finite element model (Feflow) has been used for regional groundwater flow modeling of the Upper Chaj Doab in the Indus Basin, Pakistan. The approach of using GIS techniques to partially fulfill the data requirements and define the parameters of existing hydrologic models was adopted. The numerical groundwater flow model was developed to configure the groundwater equipotential surface and hydraulic head gradient, and to estimate the groundwater budget of the aquifer. GIS is used for spatial database development and for integration with remote sensing and numerical groundwater flow modeling capabilities. Thematic layers of soils, land use, hydrology, infrastructure, and climate were developed using GIS. The Arcview GIS software is used as a supporting tool to develop data for the numerical groundwater flow model and to integrate and present the image processing and modeling results. The calibrated groundwater flow model was then used to simulate changes in piezometric heads over the period 2006 to 2020. Different scenarios were developed to study the impact of extreme climatic conditions (drought/flood) and variable groundwater abstraction on the regional groundwater system. The model results indicated a significant response of the water table to these external factors. The developed model provides an effective tool for evaluating management options and monitoring future groundwater development in the study area.
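The head calculation at the core of such a flow model can be illustrated with a toy 1-D steady-state finite-difference solve for a homogeneous aquifer with fixed-head boundaries. This is a minimal sketch of the governing equation only, not Feflow's finite element formulation or the Upper Chaj Doab setup.

```python
import numpy as np

def steady_heads_1d(h_left, h_right, n_cells):
    """Solve d2h/dx2 = 0 on a 1-D homogeneous aquifer with fixed-head ends.

    Discretizing Laplace's equation gives h[i-1] - 2*h[i] + h[i+1] = 0
    at each interior node, a tridiagonal linear system.
    """
    A = (np.diag(-2.0 * np.ones(n_cells)) +
         np.diag(np.ones(n_cells - 1), 1) +
         np.diag(np.ones(n_cells - 1), -1))
    b = np.zeros(n_cells)
    b[0] -= h_left      # move known boundary heads to the right-hand side
    b[-1] -= h_right
    interior = np.linalg.solve(A, b)
    return np.concatenate(([h_left], interior, [h_right]))

# 100 m head upstream, 90 m downstream, 8 interior nodes
heads = steady_heads_1d(100.0, 90.0, 8)
```

For a homogeneous aquifer the solution is a linear head profile between the two boundary values; real models add heterogeneous conductivity, recharge, and abstraction terms to the same kind of linear system.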
Jones, James W; Antle, John M; Basso, Bruno; Boote, Kenneth J; Conant, Richard T; Foster, Ian; Godfray, H Charles J; Herrero, Mario; Howitt, Richard E; Janssen, Sander; Keating, Brian A; Munoz-Carpena, Rafael; Porter, Cheryl H; Rosenzweig, Cynthia; Wheeler, Tim R
2017-07-01
We review the current state of agricultural systems science, focusing in particular on the capabilities and limitations of agricultural systems models. We discuss the state of models relative to five different Use Cases spanning field, farm, landscape, regional, and global spatial scales and engaging questions in past, current, and future time periods. Contributions from multiple disciplines have made major advances relevant to a wide range of agricultural system model applications at various spatial and temporal scales. Although current agricultural systems models have features that are needed for the Use Cases, we found that all of them have limitations and need to be improved. We identified common limitations across all Use Cases, namely 1) a scarcity of data for developing, evaluating, and applying agricultural system models and 2) inadequate knowledge systems that effectively communicate model results to society. We argue that these limitations are greater obstacles to progress than gaps in conceptual theory or available methods for using system models. New initiatives on open data show promise for addressing the data problem, but there also needs to be a cultural change among agricultural researchers to ensure that data for addressing the range of Use Cases are available for future model improvements and applications. We conclude that multiple platforms and multiple models are needed for model applications for different purposes. The Use Cases provide a useful framework for considering capabilities and limitations of existing models and data.
Development of a standardized control module for dc-to-dc converters
NASA Technical Reports Server (NTRS)
Yu, Y.; Iwens, R. I.; Lee, F. C.; Inouye, L. Y.
1977-01-01
The electrical performance of a power processor depends on the quality of its control system. Most existing control circuits suffer from one or more of the following imperfections that tend to restrict their utility: (1) inability to perform different modes of duty cycle control; (2) lack of immunity to output filter parameter changes; and (3) lack of capability to provide power component stress limiting on an instantaneous basis. These three shortcomings of existing control circuits were used to define the major objectives of the current Standardized Control Module (SCM) Program. Detailed information is presented on the SCM functional block diagram, its universality and performance features, circuit description, test results, and modeling and analysis efforts.
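The stress-limiting idea in item (3) can be sketched as a duty-cycle command that is overridden whenever the instantaneous inductor current exceeds a limit. The control law, gains, and limits below are illustrative assumptions for a generic dc-to-dc converter, not the SCM's actual circuit behavior.

```python
def duty_command(v_ref, v_out, k_p, i_inductor, i_limit, d_max=0.95):
    """Proportional duty-cycle command with instantaneous current-stress limiting.

    v_ref/v_out  reference and measured output voltage
    k_p          proportional gain of the voltage loop
    i_inductor   instantaneous inductor current; i_limit is the stress limit
    """
    d = k_p * (v_ref - v_out)   # basic voltage-error control
    if i_inductor > i_limit:    # stress limiting: terminate the on-time
        d = 0.0
    return min(max(d, 0.0), d_max)  # clamp to a valid duty-cycle range
```

The point of limiting on an instantaneous basis is that the override acts within the current switching cycle, rather than waiting for an averaged current loop to respond.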
Radioactive threat detection using scintillant-based detectors
NASA Astrophysics Data System (ADS)
Chalmers, Alex
2004-09-01
This paper provides an update on the performance of AS&E's Radioactive Threat Detection (RTD) sensor technology. A model is presented detailing the components of the scintillant-based RTD system employed in AS&E products aimed at detecting radiological weapons of mass destruction (WMD). An overview of recent improvements in the sensors, electrical subsystems, and software algorithms is presented. The resulting improvements in performance are described, and sample results from existing systems are shown. Advanced and future capabilities are described, with an assessment of their feasibility and their application to Homeland Defense.
Simultaneous computation of jet turbulence and noise
NASA Technical Reports Server (NTRS)
Berman, C. H.; Ramos, J. I.
1989-01-01
The existing flow computation methods, wave computation techniques, and theories based on noise source models are reviewed in order to assess the capabilities of numerical techniques to compute jet turbulence noise and understand the physical mechanisms governing it over a range of subsonic and supersonic nozzle exit conditions. In particular, attention is given to (1) methods for extrapolating near field information, obtained from flow computations, to the acoustic far field and (2) the numerical solution of the time-dependent Lilley equation.
NASA Technical Reports Server (NTRS)
Hodges, D. H.; Hopkins, A. S.; Kunz, D. L.; Hinnant, H. E.
1986-01-01
The General Rotorcraft Aeromechanical Stability Program (GRASP), which is a hybrid between finite element programs and spacecraft-oriented multibody programs, is described in terms of its design and capabilities. Numerical results from GRASP are presented and compared with the results from an existing, special-purpose coupled rotor/body aeromechanical stability program and with experimental data of Dowell and Traybar (1975 and 1977) for large deflections of an end-loaded cantilevered beam. The agreement is excellent in both cases.
Surface colour photometry of galaxies with Schmidt telescopes.
NASA Technical Reports Server (NTRS)
Wray, J. D.
1972-01-01
A method is described which owes its practicality to the capability of Schmidt telescopes to record a number of galaxy images on a single plate, and to the existence of high-speed, computer-controlled, area-scanning precision microdensitometers such as the Photometric Data Systems model 1010. The analysis yields quantitative color-index information, displayed in a manner that allows any user to effectively study the morphological properties of the distribution of color index in galaxies.
NASA Astrophysics Data System (ADS)
Sachko, A. V.; Zakordonskii, V. P.; Voloshinovskii, A. S.
2013-03-01
Fluorescent spectroscopy is used to investigate the processes of intermolecular association in mixed solutions of polymethacrylic acid (PMAA) and anionic sodium dodecylbenzenesulfonate (SDBS). We propose a model for describing the stage-by-stage mechanism of association processes and conclude that the nature of intermolecular associates depends on the PMAA-SDBS concentration ratio in the solution. Studying the kinetics of fluorescence decay reveals the simultaneous existence of two types of formations capable of pyrene solubilization.
How informative is the mouse for human gut microbiota research?
Nguyen, Thi Loan Anh; Vieira-Silva, Sara; Liston, Adrian; Raes, Jeroen
2015-01-01
The microbiota of the human gut is gaining broad attention owing to its association with a wide range of diseases, ranging from metabolic disorders (e.g. obesity and type 2 diabetes) to autoimmune diseases (such as inflammatory bowel disease and type 1 diabetes), cancer and even neurodevelopmental disorders (e.g. autism). Having been increasingly used in biomedical research, mice have become the model of choice for most studies in this emerging field. Mouse models allow perturbations in gut microbiota to be studied in a controlled experimental setup, and thus help in assessing causality of the complex host-microbiota interactions and in developing mechanistic hypotheses. However, pitfalls should be considered when translating gut microbiome research results from mouse models to humans. In this Special Article, we discuss the intrinsic similarities and differences that exist between the two systems, and compare the human and murine core gut microbiota based on a meta-analysis of currently available datasets. Finally, we discuss the external factors that influence the capability of mouse models to recapitulate the gut microbiota shifts associated with human diseases, and investigate which alternative model systems exist for gut microbiota research. PMID:25561744
Chiu, Chia-Nan; Chen, Huei-Huang
2016-01-01
Many studies on the significance of knowledge management (KM) in the business world have been performed in recent years. Public sector KM is a research area of growing importance. Findings show that few authors specialize in the field and that there are several obstacles to developing a cohesive body of literature. To examine the effects of knowledge management capability [which consists of knowledge infrastructure capability (KIC) and knowledge process capability (KPC)] on organizational effectiveness (OE), this study conducted structural equation modeling to test its hypotheses with 302 questionnaires completed by Taipei Water Department staff in Taiwan. In exploring the model developed in this study, the findings show a significant relationship between KPC and OE, whereas the relationship between KIC and OE is insignificant. These results differ from earlier findings in the literature. Furthermore, this research proposed organizational commitment (OC) as a mediator. The findings suggest that OC has a significant mediating effect between KPC and OE, but not between KIC and OE. Notably, these findings suggest that managers, in addition to constructing knowledge infrastructure, should focus on social media tools on the Internet that engage knowledge workers in "peer-to-peer" knowledge sharing across organizational and company boundaries. The results are likely to help organizations (particularly public utilities) sharpen their knowledge management strategies. Academic and practical implications are drawn based on the findings.
An eHealth Capabilities Framework for Graduates and Health Professionals: Mixed-Methods Study.
Brunner, Melissa; McGregor, Deborah; Keep, Melanie; Janssen, Anna; Spallek, Heiko; Quinn, Deleana; Jones, Aaron; Tseris, Emma; Yeung, Wilson; Togher, Leanne; Solman, Annette; Shaw, Tim
2018-05-15
The demand for an eHealth-ready and adaptable workforce is placing increasing pressure on universities to deliver eHealth education. At present, eHealth education is largely focused on components of eHealth rather than considering a curriculum-wide approach. This study aimed to develop a framework that could be used to guide health curriculum design based on current evidence and on stakeholder perceptions of the eHealth capabilities expected of tertiary health graduates. A 3-phase, mixed-methods approach incorporated the results of a literature review, focus groups, and a Delphi process to develop a framework of eHealth capability statements. Participants (N=39) with expertise or experience in eHealth education, practice, or policy provided feedback on the proposed framework, and following the fourth iteration of this process, consensus was achieved. The final framework consisted of 4 higher-level capability statements that describe the learning outcomes expected of university graduates across the domains of (1) digital health technologies, systems, and policies; (2) clinical practice; (3) data analysis and knowledge creation; and (4) technology implementation and codesign. Across the capability statements are 40 performance cues that provide examples of how these capabilities might be demonstrated. The results of this study inform a cross-faculty eHealth curriculum that aligns with workforce expectations. There is a need for educational curricula to reinforce existing eHealth capabilities, adapt existing capabilities to make them transferable to novel eHealth contexts, and introduce new learning opportunities for interactions with technologies within education and practice encounters. As such, the capability framework developed may assist in the application of eHealth by emerging and existing health care professionals. Future research needs to explore the potential for integration of these findings into workforce development programs.
©Melissa Brunner, Deborah McGregor, Melanie Keep, Anna Janssen, Heiko Spallek, Deleana Quinn, Aaron Jones, Emma Tseris, Wilson Yeung, Leanne Togher, Annette Solman, Tim Shaw. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 15.05.2018.
2011-05-10
The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular tools that may be used in concert with existing surveillance applications, or en masse for an end-to-end biosurveillance capability. This flexibility addresses constraints such as limited public health resources and the costs of proprietary software.
NASA Technical Reports Server (NTRS)
Hendricks, Eric A.; Bell, Michael M.; Elsberry, Russell L.; Velden, Chris S.; Cecil, Dan
2016-01-01
Background: Initialization of tropical cyclones in numerical weather prediction (NWP) systems is a great challenge: mass-wind field balance; secondary circulation and heating; asymmetries. There can be large adjustments in structure and intensity in the first 24 hours if the initial vortex is not in balance: spurious gravity waves; spin-up (model and physics). Existing mesoscale NWP model TC (Tropical Cyclone) initialization strategies: bogus vortex, cold start from global analyses; 3DVAR or 4DVAR, possibly with synthetic observations; EnKF (Ensemble Kalman Filter); dynamic initialization. Dynamic initialization allows the vortex to have improved balance and physics spin-up at the initial time (e.g., Hendricks et al. 2013, 2011; Nguyen and Chen 2011; Fiorino and Warner 1981; Hoke and Anthes 1976). The Himawari-8 geostationary satellite has the capability of continuous (10-minute) imagery over the full disk; the new GOES-R satellites will have the same capability. This will allow for unprecedented observations of tropical cyclones. However, current data assimilation systems are not capable of ingesting such high-temporal-resolution observations (Atmospheric Motion Vectors, AMVs). Hourly AMVs are produced and thinned to 100-kilometer spacing in the horizontal. An entirely new data assimilation concept is required to utilize these observations.
Incompressible viscous flow simulations of the NFAC wind tunnel
NASA Technical Reports Server (NTRS)
Champney, Joelle Milene
1986-01-01
The capabilities of an existing 3-D incompressible Navier-Stokes flow solver, INS3D, are extended and improved to solve turbulent flows through the incorporation of zero- and two-equation turbulence models. The two-equation model equations are solved in their high-Reynolds-number form and utilize wall functions in the treatment of solid wall boundary conditions. The implicit approximate factorization scheme is modified to improve the stability of the two-equation solver. Applications to the 3-D viscous flow inside the 80- by 120-foot open-return wind tunnel of the National Full-Scale Aerodynamics Complex (NFAC) are described.
3D Microstructures for Materials and Damage Models
Livescu, Veronica; Bronkhorst, Curt Allan; Vander Wiel, Scott Alan
2017-02-01
Many challenges exist with regard to understanding and representing the complex physical processes involved in ductile damage and failure in polycrystalline metallic materials. Currently, the ability to accurately predict the macroscale ductile damage and failure response of metallic materials is lacking. Research at Los Alamos National Laboratory (LANL) is aimed at building a coupled experimental and computational methodology that supports the development of predictive damage capabilities by: capturing real distributions of microstructural features from real material and implementing them as digitally generated microstructures in damage model development; and distilling structure-property information to link microstructural details to damage evolution under a multitude of loading states.
Simulation of a tangential soft x-ray imaging system.
Battaglia, D J; Shafer, M W; Unterberg, E A; Bell, R E; Hillis, D L; LeBlanc, B P; Maingi, R; Sabbagh, S; Stratton, B C
2010-10-01
Tangentially viewing soft x-ray (SXR) cameras are capable of detecting nonaxisymmetric plasma structures in magnetically confined plasmas. They are particularly useful for studying stationary perturbations or phenomenon that occur on a timescale faster than the plasma rotation period. Tangential SXR camera diagnostics are planned for the DIII-D and NSTX tokamaks to elucidate the static edge magnetic structure during the application of 3D perturbations. To support the design of the proposed diagnostics, a synthetic diagnostic model was developed using the CHIANTI database to estimate the SXR emission. The model is shown to be in good agreement with the measurements from an existing tangential SXR camera diagnostic on NSTX.
Toward an integrated Volcanic Ash Observing System in Europe
NASA Astrophysics Data System (ADS)
Lee, Deborah; Lisk, Ian
2014-05-01
Volcanic ash from the Icelandic eruption of Eyjafjallajökull in April and May of 2010 resulted in the decision by many northern European countries to impose significant restrictions on the use of their airspace. The eruption, and the extent and persistence of the ash, revealed how reliant society now is on a safe and efficient air transport system, and the fragility of that system when affected by complex natural hazards. As part of an EC framework programme, the 2011-2013 WEZARD (WEather HaZARD for aeronautics) consortium conducted cross-industry volcanic ash capability and gap analyses, with the EUMETNET (network of 29 National Meteorological Services) led Work Package 3 focussing on a review of observational and monitoring capabilities, atmospheric dispersion modelling and data exchange. The review revealed a patchwork of independent observing capabilities for volcanic ash, with some countries investing and others not at all, and with most existing networks focused on space-based products. Existing capabilities do not provide the necessary detail on the geographical and vertical extent of volcanic ash and associated levels of contamination, which decision makers in the aviation industry require in order to decide where it is safe to fly. Accordingly, WEZARD Work Package 3 identified as a high priority an enhanced observational network of complementary monitoring systems needed to initialise, validate and verify volcanic ash dispersion model output and forecasts. Thus a key recommendation is to invest in a major pre-operational demonstrator "European volcanic ash observing network", focussing on distal monitoring and aiming to (a) fill R&D gaps identified in instrumentation and algorithms, and (b) integrate data, where possible in near-real-time, from a range of ground-based, airborne and space-based techniques.
Here we present a key WEZARD recommendation toward an integrated volcanic ash observing system in Europe, in context with other related projects and initiatives. We will also look to highlight the work underway by VAACs (Volcanic Ash Advisory Centres) and aviation regulatory authorities within the IAVWOPSG (International Airways Volcano Watch Operations Group) to develop the 'agreed in situ and/or remote sensing techniques' that underpin the newly approved definition of 'Discernible ash'.
NASA Astrophysics Data System (ADS)
Lepore, C.; Arnone, E.; Noto, L. V.; Sivandran, G.; Bras, R. L.
2013-09-01
This paper presents the development of a rainfall-triggered landslide module within an existing physically based spatially distributed ecohydrologic model. The model, tRIBS-VEGGIE (Triangulated Irregular Networks-based Real-time Integrated Basin Simulator and Vegetation Generator for Interactive Evolution), is capable of a sophisticated description of many hydrological processes; in particular, the soil moisture dynamics are resolved at a temporal and spatial resolution required to examine the triggering mechanisms of rainfall-induced landslides. The validity of the tRIBS-VEGGIE model to a tropical environment is shown with an evaluation of its performance against direct observations made within the study area of Luquillo Forest. The newly developed landslide module builds upon the previous version of the tRIBS landslide component. This new module utilizes a numerical solution to the Richards' equation (present in tRIBS-VEGGIE but not in tRIBS), which better represents the time evolution of soil moisture transport through the soil column. Moreover, the new landslide module utilizes an extended formulation of the factor of safety (FS) to correctly quantify the role of matric suction in slope stability and to account for unsaturated conditions in the evaluation of FS. The new modeling framework couples the capabilities of the detailed hydrologic model to describe soil moisture dynamics with the infinite slope model, creating a powerful tool for the assessment of rainfall-triggered landslide risk.
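The role of matric suction in the extended factor-of-safety (FS) formulation can be illustrated with a minimal infinite-slope calculation in which negative pore pressure raises the effective normal stress on the slip surface. This is a simplified textbook-style sketch with illustrative parameter values, not the module's actual FS expression.

```python
import math

def factor_of_safety(slope_deg, depth_m, gamma, cohesion, phi_deg, pore_pressure):
    """Infinite-slope factor of safety; pore_pressure < 0 represents suction.

    gamma:         soil unit weight (kN/m^3)
    cohesion:      effective cohesion (kPa)
    phi_deg:       effective friction angle (degrees)
    pore_pressure: pore water pressure at the slip surface (kPa)
    """
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    normal_stress = gamma * depth_m * math.cos(beta) ** 2  # total normal stress
    shear_stress = gamma * depth_m * math.sin(beta) * math.cos(beta)
    # suction (u_w < 0) increases effective stress and hence shear resistance
    return (cohesion + (normal_stress - pore_pressure) * math.tan(phi)) / shear_stress

fs_dry = factor_of_safety(35.0, 2.0, 19.0, 5.0, 30.0, pore_pressure=-10.0)  # unsaturated
fs_wet = factor_of_safety(35.0, 2.0, 19.0, 5.0, 30.0, pore_pressure=5.0)    # after rain
```

As rainfall infiltration destroys suction and builds positive pore pressure, FS drops toward unity, which is the triggering mechanism the coupled hydrologic-stability model resolves in time and space.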
Coupled nonlinear aeroelasticity and flight dynamics of fully flexible aircraft
NASA Astrophysics Data System (ADS)
Su, Weihua
This dissertation introduces an approach to effectively model and analyze the coupled nonlinear aeroelasticity and flight dynamics of highly flexible aircraft. A reduced-order, nonlinear, strain-based finite element framework is used, which is capable of assessing the fundamental impact of structural nonlinear effects in preliminary vehicle design and control synthesis. The cross-sectional stiffness and inertia properties of the wings are calculated along the wing span, and then incorporated into the one-dimensional nonlinear beam formulation. Finite-state unsteady subsonic aerodynamics is used to compute airloads along lifting surfaces. Flight dynamic equations are then introduced to complete the aeroelastic/flight dynamic system equations of motion. Instead of merely considering the flexibility of the wings, the current work allows all members of the vehicle to be flexible. Due to their characteristics of being slender structures, the wings, tail, and fuselage of highly flexible aircraft can be modeled as beams undergoing three dimensional displacements and rotations. New kinematic relationships are developed to handle the split beam systems, such that fully flexible vehicles can be effectively modeled within the existing framework. Different aircraft configurations are modeled and studied, including Single-Wing, Joined-Wing, Blended-Wing-Body, and Flying-Wing configurations. The Lagrange Multiplier Method is applied to model the nodal displacement constraints at the joint locations. Based on the proposed models, roll response and stability studies are conducted on fully flexible and rigidized models. The impacts of the flexibility of different vehicle members on flutter with rigid body motion constraints, flutter in free flight condition, and roll maneuver performance are presented. Also, the static stability of the compressive member of the Joined-Wing configuration is studied. 
A spatially-distributed discrete gust model is incorporated into the time simulation of the framework. Gust responses of the Flying-Wing configuration subject to stall effects are investigated. A bilinear torsional stiffness model is introduced to study the skin wrinkling due to large bending curvature of the Flying-Wing. The numerical studies illustrate the improvements of the existing reduced-order formulation with new capabilities of both structural modeling and coupled aeroelastic and flight dynamic analysis of fully flexible aircraft.
NASA Astrophysics Data System (ADS)
Talamonti, James Joseph
1995-01-01
Future NASA proposals include the placement of optical interferometer systems in space for a wide variety of astrophysical studies, including a vastly improved deflection test of general relativity, a precise and direct calibration of the Cepheid distance scale, and the determination of stellar masses (Reasenberg et al., 1988). There are also plans for placing large array telescopes on the moon with the ultimate objective of being able to measure angular separations of less than 10 micro-arcseconds (Burns, 1990). These and other future projects will require interferometric measurement of the (baseline) distance between the optical elements comprising the systems. Eventually, space-qualifiable interferometers capable of picometer (10^{-12} m) relative precision and nanometer (10^{-9} m) absolute precision will be required. A numerical model was developed to emulate the capabilities of systems performing interferometric noncontact absolute distance measurements. The model incorporates known methods to minimize signal processing and digital sampling errors and evaluates the accuracy limitations imposed by spectral peak isolation using Hanning, Blackman, and Gaussian windows in the fast Fourier transform technique. We applied this model to the specific case of measuring the relative lengths of a compound Michelson interferometer using a frequency-scanned laser. By processing computer-simulated data through our model, the ultimate precision is projected for ideal data and for data containing AM/FM noise. The precision is shown to be limited by non-linearities in the laser scan. A laboratory system was developed by implementing ultra-stable external cavity diode lasers into existing interferometric measuring techniques. The capabilities of the system were evaluated and increased by using the computer modeling results as guidelines for the data analysis. Experimental results measured 1-3 meter baselines with <20 micron precision.
Comparison of the laboratory and modeling results showed that the laboratory precisions obtained were of the same order of magnitude as those predicted for computer-generated results under similar conditions. We believe that our model can be implemented as a tool in the design of new metrology systems capable of meeting the precisions required by space-based interferometers.
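The spectral-peak-isolation step described in the abstract can be illustrated with a short sketch (illustrative only, not the authors' model; the signal parameters below are invented): a windowed FFT locates the dominant spectral peak, and the choice of window trades spectral leakage against peak width.

```python
import numpy as np

def peak_frequency(signal, fs, window):
    """Estimate the dominant frequency from an FFT peak,
    using a window function to suppress spectral leakage."""
    w = window(len(signal))
    spec = np.abs(np.fft.rfft(signal * w))
    k = int(np.argmax(spec))
    # refine the peak location by parabolic interpolation on the log spectrum
    if 0 < k < len(spec) - 1:
        a, b, c = np.log(spec[k - 1 : k + 2])
        k = k + 0.5 * (a - c) / (a - 2 * b + c)
    return k * fs / len(signal)

# synthetic single-tone test signal (parameters invented for illustration)
fs, n, f0 = 10000.0, 4096, 1234.5
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)
for win in (np.hanning, np.blackman):
    print(win.__name__, round(peak_frequency(x, fs, win), 2))
```

For a Gaussian window the log-magnitude of the transformed peak is approximately parabolic, which is why parabolic interpolation pairs naturally with the windows the abstract mentions.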
Design Methods and Practices for Fault Prevention and Management in Spacecraft
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.
2005-01-01
Integrated Systems Health Management (ISHM) is intended to become a critical capability for all space, lunar and planetary exploration vehicles and systems at NASA. Monitoring and managing the health state of diverse components, subsystems, and systems is a difficult task that will become more challenging when implemented for long-term, evolving deployments. A key technical challenge will be to ensure that the ISHM technologies are reliable, effective, and low cost, resulting in turn in safe, reliable, and affordable missions. To ensure safety and reliability, ISHM functionality, decisions and knowledge have to be incorporated into the product lifecycle as early as possible, and ISHM must be considered an essential element of models developed and used in various stages of system design. During early-stage design, many decisions and tasks are still open, including sensor and measurement point selection, modeling and model-checking, diagnosis, and signature and data fusion schemes, presenting the best opportunity to catch and prevent potential failures and anomalies in a cost-effective way. Using appropriate formal methods during early design, design teams can systematically explore risks without committing to design decisions too early. However, the nature of ISHM knowledge and data is detailed, relying on high-fidelity, detailed models, whereas the earlier stages of the product lifecycle utilize low-fidelity, high-level models of systems and their functionality. We currently lack the tools and processes necessary for integrating ISHM into the vehicle system/subsystem design. As a result, most existing ISHM-like technologies are retrofits that were done after the system design was completed. It is very expensive, and sometimes futile, to retrofit a system health management capability into existing systems.
Last-minute retrofits result in unreliable systems, ineffective solutions, and excessive costs (e.g., Space Shuttle TPS monitoring, which was considered only after 110 flights and the Columbia disaster). High false alarm or false negative rates due to substandard implementations hurt the credibility of the ISHM discipline. This paper presents an overview of the current state of ISHM design and a review of formal design methods, making recommendations about possible approaches for enabling ISHM capabilities to be designed in at the system level, from the very beginning of the vehicle design process.
NASA Technical Reports Server (NTRS)
Kulkarni, Sameer; Beach, Timothy A.; Jorgenson, Philip C.; Veres, Joseph P.
2017-01-01
A 24 foot diameter 3-stage axial compressor powered by variable-speed induction motors provides the airflow in the closed-return 11- by 11-Foot Transonic Wind Tunnel (11-Foot TWT) Facility at NASA Ames Research Center at Moffett Field, California. The facility is part of the Unitary Plan Wind Tunnel, which was completed in 1955. Since then, upgrades made to the 11-Foot TWT such as flow conditioning devices and instrumentation have increased blockage and pressure loss in the tunnel, somewhat reducing the peak Mach number capability of the test section. Due to erosion effects on the existing aluminum alloy rotor blades, fabrication of new steel rotor blades is planned. This presents an opportunity to increase the Mach number capability of the tunnel by redesigning the compressor for increased pressure ratio. Challenging design constraints exist for any proposed design, demanding the use of the existing driveline, rotor disks, stator vanes, and hub and casing flow paths, so as to minimize cost and installation time. The current effort was undertaken to characterize the performance of the existing compressor design using available design tools and computational fluid dynamics (CFD) codes and subsequently recommend a new compressor design to achieve higher pressure ratio, which directly correlates with increased test section Mach number. The constant cross-sectional area of the compressor leads to high diffusion factors, which present a challenge in simulating the existing design. The CFD code APNASA was used to simulate the aerodynamic performance of the existing compressor. The simulations were compared to performance predictions from the HT0300 turbomachinery design and analysis code, and to compressor performance data taken during a 1997 facility test. It was found that the CFD simulations were sensitive to endwall leakages associated with stator buttons, and to a lesser degree, under-stator-platform flow recirculation at the hub.
When stator button leakages were modeled, pumping capability increased by over 20% of pressure rise at design point due to a large reduction in aerodynamic blockage at the hub. Incorporating the stator button leakages was crucial to matching test data. Under-stator-platform flow recirculation was thought to be large due to a lack of seals. The effect of this recirculation was assessed with APNASA simulations recirculating 0.5%, 1%, and 2% of inlet flow about stators 1 and 2, modeled as axisymmetric mass flux boundary conditions on the hub before and after the vanes. The injection of flow ahead of the stators tended to re-energize the boundary layer and reduce hub separations, resulting in about 3% increased stall margin per 1% of inlet flow recirculated. In order to assess the value of the flow recirculation, a mixing plane simulation of the compressor which gridded the under-stator cavities was generated using the ADPAC CFD code. This simulation indicated that about 0.65% of the inlet flow is recirculated around each shrouded stator. This collective information was applied during the redesign of the compressor. A potential design was identified using HT0300 which improved overall pressure ratio by removing pre-swirl into rotor 1, replacing the existing NASA 65-series rotors with double circular arc sections, and re-staggering the rotors and the existing stators. The performance of the new design predicted by APNASA and HT0300 is compared to the existing design.
76 FR 70721 - Voltage Coordination on High Voltage Grids; Notice of Staff Workshop
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-15
... and the capability of existing and emerging software to improve coordination and optimization of transfer capability across the Bulk-Power System from a reliability and economic perspective. The agenda...
NASA Astrophysics Data System (ADS)
Phipps, Marja; Capel, David; Srinivasan, James
2014-06-01
Motion imagery capabilities within the Department of Defense/Intelligence Community (DoD/IC) have advanced significantly over the last decade, attempting to meet continuously growing data collection, video processing and analytical demands in operationally challenging environments. The motion imagery tradecraft has evolved accordingly, enabling teams of analysts to effectively exploit data and generate intelligence reports across multiple phases in structured Full Motion Video (FMV) Processing Exploitation and Dissemination (PED) cells. Yet now the operational requirements are drastically changing. The exponential growth in motion imagery data continues, but to this the community adds multi-INT data, interoperability with existing and emerging systems, expanded data access, nontraditional users, collaboration, automation, and support for ad hoc configurations beyond the current FMV PED cells. To break from the legacy system lifecycle, we look towards a technology application and commercial adoption model that will meet these future Intelligence, Surveillance and Reconnaissance (ISR) challenges. In this paper, we explore the application of cutting-edge computer vision technology to meet existing FMV PED shortfalls and address future capability gaps. For example, real-time georegistration services developed from computer-vision-based feature tracking, multiple-view geometry, and statistical methods allow the fusion of motion imagery with other georeferenced information sources, providing unparalleled situational awareness. We then describe how these motion imagery capabilities may be readily deployed in a dynamically integrated analytical environment, employing an extensible framework, leveraging scalable enterprise-wide infrastructure and following commercial best practices.
Design and Characterization of a Microfabricated Hydrogen Clearance Blood Flow Sensor
Walton, Lindsay R.; Edwards, Martin A.; McCarty, Gregory S.; Wightman, R. Mark
2016-01-01
Background: Modern cerebral blood flow (CBF) detection favors the use of either optical technologies that are limited to cortical brain regions, or expensive magnetic resonance. Decades ago, inhalation gas clearance was the choice method of quantifying CBF, but this suffered from poor temporal resolution. Electrolytic H2 clearance (EHC) generates and collects gas in situ at an electrode pair, which improves temporal resolution, but the probe size has prohibited meaningful subcortical use.
New Method: We microfabricated EHC electrodes an order of magnitude smaller than those existing, on the scale of 100 µm, to permit use deep within the brain.
Results: Novel EHC probes were fabricated. The devices offered exceptional signal-to-noise, achieved high collection efficiencies (40-50%) in vitro, and agreed with theoretical modeling. An in vitro chemical reaction model was used to confirm that our devices detected flow rates higher than those expected physiologically. Computational modeling that incorporated realistic noise levels demonstrated devices would be sensitive to physiological CBF rates.
Comparison with Existing Method: The reduced size of our arrays makes them suitable for subcortical EHC measurements, as opposed to the larger, existing EHC electrodes that would cause substantial tissue damage. Our array can collect multiple CBF measurements per minute, and can thus resolve physiological changes occurring on a shorter timescale than existing gas clearance measurements.
Conclusion: We present and characterize microfabricated EHC electrodes and an accompanying theoretical model to interpret acquired data. Microfabrication allows for the high-throughput production of reproducible devices that are capable of monitoring deep brain CBF with sub-minute resolution. PMID:27102042
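The clearance measurement rests on a simple washout model: after gas generation stops, the collected H2 signal decays roughly exponentially, and in the classic gas-clearance treatment the decay constant scales with local blood flow. A minimal sketch of that fitting step (the rate constant and noise level are hypothetical, and this is not the authors' analysis code):

```python
import math
import random

def clearance_rate(times, signals):
    """Estimate the washout rate constant k (per minute) from a clearance
    curve s(t) ~ s0 * exp(-k t) by log-linear least squares."""
    n = len(times)
    ys = [math.log(s) for s in signals]
    xbar = sum(times) / n
    ybar = sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(times, ys)) \
        / sum((x - xbar) ** 2 for x in times)
    return -slope  # decay constant; flow is proportional to this

# synthetic washout curve: k = 1.2 per minute, light multiplicative noise
rng = random.Random(1)
ts = [i * 0.05 for i in range(60)]                      # 3 minutes of samples
sig = [math.exp(-1.2 * t) * (1 + 0.01 * rng.uniform(-1, 1)) for t in ts]
k_est = clearance_rate(ts, sig)
```

Because several such curves can be acquired per minute with the microfabricated probes, repeating this fit yields the sub-minute CBF resolution the abstract claims.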
Continuing Development of a Hybrid Model (VSH) of the Neutral Thermosphere
NASA Technical Reports Server (NTRS)
Burns, Alan
1996-01-01
We propose to continue the development of a new operational model of neutral thermospheric density, composition, temperatures and winds to improve current engineering environment definitions of the neutral thermosphere. This model will be based on simulations made with the National Center for Atmospheric Research (NCAR) Thermosphere-Ionosphere-Electrodynamics General Circulation Model (TIEGCM) and on empirical data. It will be capable of using real-time geophysical indices or data from ground-based and satellite inputs and will provide neutral variables at specified locations and times. This "hybrid" model will be based on a Vector Spherical Harmonic (VSH) analysis technique, developed over the last 8 years at the University of Michigan, that permits the incorporation of the TIEGCM outputs and data into the model. The VSH model will be a more accurate version of existing models of the neutral thermosphere, and will thus improve density specification for satellites flying in low Earth orbit (LEO).
A system level model for preliminary design of a space propulsion solid rocket motor
NASA Astrophysics Data System (ADS)
Schumacher, Daniel M.
Preliminary design of space propulsion solid rocket motors entails a combination of components and subsystems. Expert design tools exist to find near-optimal performance of subsystems and components. Conversely, there is no system level preliminary design process for space propulsion solid rocket motors that is capable of synthesizing customer requirements into a high utility design for the customer. The preliminary design process for space propulsion solid rocket motors typically builds on existing designs and pursues a feasible rather than the most favorable design. Classical optimization is an extremely challenging method when dealing with the complex behavior of an integrated system. The complexity and combinations of system configurations make the number of design parameters to be traded off unmanageable when manual techniques are used. Existing multi-disciplinary optimization approaches generally address estimating ratios and correlations rather than utilizing mathematical models. The developed system level model utilizes the Genetic Algorithm to perform the necessary population searches to efficiently replace the human iterations required during a typical solid rocket motor preliminary design. This research augments, automates, and increases the fidelity of the existing preliminary design process for space propulsion solid rocket motors. The system level aspect of this preliminary design process, and the ability to synthesize space propulsion solid rocket motor requirements into a near-optimal design, are achievable. The process of developing the motor performance estimate and the system level model of a space propulsion solid rocket motor is described in detail. The results of this research indicate that the model is valid for use and able to manage a very large number of variable inputs and constraints in pursuit of the best possible design.
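The population-based search the abstract describes can be sketched with a minimal real-coded genetic algorithm. The utility function below is a hypothetical stand-in for the motor performance model, and all parameter names, bounds, and optima are invented for illustration; the dissertation's actual model is far richer:

```python
import random

def genetic_search(fitness, bounds, pop_size=40, generations=60,
                   mutation_rate=0.2, seed=0):
    """Minimal real-coded GA: truncation selection, blend crossover,
    Gaussian mutation, with the elite half carried over unchanged."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            child = [a + rng.random() * (b - a) for a, b in zip(p1, p2)]
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mutation_rate:
                    child[i] += rng.gauss(0, 0.1 * (hi - lo))
                child[i] = min(max(child[i], lo), hi)  # keep within bounds
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

def utility(x):
    """Hypothetical smooth utility standing in for a motor performance model:
    best at chamber pressure ~6.9 MPa and expansion ratio ~50."""
    p_c, eps = x
    return -((p_c - 6.9) ** 2 + 0.01 * (eps - 50.0) ** 2)

best = genetic_search(utility, bounds=[(2.0, 12.0), (10.0, 100.0)])
```

The appeal of this structure for system-level design is that the fitness function can wrap any black-box performance estimate, so adding design variables or constraints does not change the search loop.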
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom; Cristian Rabiti; Andrea Alfonsi
2012-10-01
PHISICS is a neutronics code system currently under development at the Idaho National Laboratory (INL). Its goal is to provide state-of-the-art simulation capability to reactor designers. The different modules of PHISICS currently under development are a nodal and semi-structured transport core solver (INSTANT), a depletion module (MRTAU) and a cross section interpolation (MIXER) module. The INSTANT module is the most developed of those mentioned above. Basic functionalities are ready to use, but the code is still in continuous development to extend its capabilities. This paper reports on the effort of coupling the nodal kinetics code package PHISICS (INSTANT/MRTAU/MIXER) to the thermal hydraulics system code RELAP5-3D, to enable full core and system modeling. This enables modeling of coupled (thermal-hydraulics and neutronics) problems with more options for 3D neutron kinetics, compared to the existing diffusion theory neutron kinetics module in RELAP5-3D (NESTLE). In the second part of the paper, an overview of the OECD/NEA MHTGR-350 MW benchmark is given. This benchmark has been approved by the OECD, and is based on the General Atomics 350 MW Modular High Temperature Gas Reactor (MHTGR) design. The benchmark includes coupled neutronics thermal hydraulics exercises that require more capabilities than RELAP5-3D with NESTLE offers. Therefore, the MHTGR benchmark makes extensive use of the new PHISICS/RELAP5-3D coupling capabilities. The paper presents the preliminary results of the three steady state exercises specified in Phase I of the benchmark using PHISICS/RELAP5-3D.
SPoRT - An End-to-End R2O Activity
NASA Technical Reports Server (NTRS)
Jedlovec, Gary J.
2009-01-01
Established in 2002 to demonstrate the weather and forecasting application of real-time EOS measurements, the Short-term Prediction Research and Transition (SPoRT) program has grown to be an end-to-end research to operations activity focused on the use of advanced NASA modeling and data assimilation approaches, nowcasting techniques, and unique high-resolution multispectral observational data applications from EOS satellites to improve short-term weather forecasts on a regional and local scale. SPoRT currently partners with several universities and other government agencies for access to real-time data and products, and works collaboratively with them and operational end users at 13 WFOs to develop and test the new products and capabilities in a "test-bed" mode. The test-bed simulates key aspects of the operational environment without putting constraints on the forecaster workload. Products and capabilities which show utility in the test-bed environment are then transitioned experimentally into the operational environment for further evaluation and assessment. SPoRT focuses on a suite of data and products from MODIS, AMSR-E, and AIRS on the NASA Terra and Aqua satellites, and total lightning measurements from ground-based networks. Some of the observations are assimilated into or used with various versions of the WRF model to provide supplemental forecast guidance to operational end users. SPoRT is enhancing partnerships with NOAA/NESDIS for new product development and data access to exploit the remote sensing capabilities of instruments on the NPOESS satellites to address short term weather forecasting problems. The VIIRS and CrIS instruments on the NPP and follow-on NPOESS satellites provide similar observing capabilities to the MODIS and AIRS instruments on Terra and Aqua. SPoRT will be transitioning existing and new capabilities into the AWIPS II environment to ensure the continuity of its activities.
Assessment of the Eulerian particle flamelet model for nonpremixed turbulent jet flames
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Seong-Ku; Kim, Yongmo
2008-07-15
Although the Eulerian particle flamelet model (EPFM) recently proposed by Barths et al. [Proc. Combust. Inst. 27 (1998) 1841-1847] has shown the potential to realistically predict detailed pollutant (NOx, soot) formation in turbulent reacting flows occurring within practical combustion devices, there still exists room to improve the predictive capability in terms of local flame structure and turbulence-chemistry interaction. In this study, the EPFM approach was applied to simulate two turbulent nonpremixed jet flames of CO/H2/N2 fuel having the same jet Reynolds number but different nozzle diameters, and its capability of predicting NOx formation, as well as both the similarity of major species and the sensitivity of minor species to fluid-dynamic scaling for the two flames, has been assessed in depth in terms of both conditional and unconditional mean structures. The present results indicate that the original EPFM substantially overpredicts the conditional scalar dissipation rate in the downstream region and consequently underpredicts the streamwise decay of superequilibrium radical concentrations to the equilibrium state. In order to correctly estimate the averaged conditional scalar dissipation rate, a new model of the conditional scalar dissipation rate, based on a least-squares fit through a mass-weighted spatial distribution, has been devised. In terms of both conditional and unconditional means, the EPFM utilizing this new procedure yields nearly the same results as the Lagrangian flamelet model and provides closer agreement with experimental data than the original EPFM approach.
MSFC Optical Metrology: A National Resource
NASA Technical Reports Server (NTRS)
Burdine, Robert
1998-01-01
A national need exists for Large Diameter Optical Metrology Services. These services include the manufacture, testing, and assurance of precision and control necessary to assure the success of large optical projects. "Best practices" are often relied on for manufacturing and quality controls, while optical projects grow increasingly demanding and complex. Marshall Space Flight Center (MSFC) has acquired unique optical measurement, testing and metrology capabilities through active participation in a wide variety of NASA optical programs. An overview of existing optical facilities and metrology capabilities is given with emphasis on use by other optical projects. Cost avoidance and project success are stressed through use of existing MSFC facilities and capabilities for measurement and metrology controls. Current issues in large diameter optical metrology are briefly reviewed. The need for a consistent and long-duration Large Diameter Optical Metrology Service Group is presented with emphasis on the establishment of a National Large Diameter Optical Standards Laboratory. Proposals are made to develop MSFC optical standards and metrology capabilities as the primary national standards resource, providing access to MSFC Optical Core Competencies for manufacturers and researchers. Plans are presented for the development of a national lending library of precision optical standards with emphasis on cost avoidance while improving measurement assurance.
A model of the wall boundary layer for ducted propellers
NASA Technical Reports Server (NTRS)
Eversman, Walter; Moehring, Willi
1987-01-01
The objective of the present study is to include a representation of a wall boundary layer in an existing finite element model of the propeller in the wind tunnel environment. The major consideration is that the new formulation should introduce only modest alterations in the numerical model and should still be capable of producing economical predictions of the radiated acoustic field. This is accomplished by using a stepped approximation in which the velocity profile is piecewise constant in layers. In the limit of infinitesimally thin layers, the velocity profile of the stepped approximation coincides with that of the continuous profile. The approach described here could also be useful in modeling the boundary layer in other duct applications, particularly in the computation of the radiated acoustic field for sources contained in a duct.
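The stepped approximation described above amounts to replacing the continuous boundary-layer velocity profile with a stack of piecewise-constant layers. A small sketch of that discretization (the 1/7th-power profile and dimensions are assumed for illustration, not taken from the paper):

```python
def stepped_profile(u, delta, n_layers):
    """Approximate a continuous boundary-layer profile u(y) on [0, delta]
    by n piecewise-constant layers, sampling each layer at its midpoint.
    Returns (y_lower, y_upper, layer_velocity) tuples."""
    h = delta / n_layers
    return [(i * h, (i + 1) * h, u((i + 0.5) * h)) for i in range(n_layers)]

# a 1/7th-power-law profile, a common turbulent boundary-layer model
# (free-stream speed and thickness are invented for illustration)
u_inf, delta = 1.0, 0.05
profile = lambda y: u_inf * (y / delta) ** (1 / 7)
layers = stepped_profile(profile, delta, 4)
```

As the layer count grows, the stepped velocities converge to the continuous profile, which is the limiting argument the paper uses to justify the approximation.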
Test Capability Enhancements to the NASA Langley 8-Foot High Temperature Tunnel
NASA Technical Reports Server (NTRS)
Harvin, S. F.; Cabell, K. F.; Gallimore, S. D.; Mekkes, G. L.
2006-01-01
The NASA Langley 8-Foot High Temperature Tunnel produces true enthalpy environments simulating flight from Mach 4 to Mach 7, primarily for airbreathing propulsion and aerothermal/thermo-structural testing. Flow conditions are achieved through a methane-air heater and nozzles that produce aerodynamic Mach numbers of 4, 5, or 7, with exit diameters of 8 feet or 4.5 feet. The 12-ft long free-jet test section, housed inside a 26-ft vacuum sphere, accommodates large test articles. Recently, the facility underwent significant upgrades to support hydrocarbon fueled scramjet engine testing and to expand flight simulation capability. The upgrades were required to meet engine system development and flight clearance verification requirements originally defined by the joint NASA-Air Force X-43C Hypersonic Flight Demonstrator Project and now the Air Force X-51A Program. Enhancements to the 8-Ft. HTT were made in four areas: 1) hydrocarbon fuel delivery; 2) flight simulation capability; 3) controls and communication; and 4) data acquisition/processing. The upgrades include the addition of systems to supply ethylene and liquid JP-7 to test articles; a Mach 5 nozzle with dynamic pressure simulation capability up to 3200 psf; a real-time model angle-of-attack system; a new programmable logic controller sub-system to improve process controls and communication with model controls; MIL-STD-1553B and high-speed data acquisition systems; and a classified data processing environment. These additions represent a significant increase to the already unique test capability and flexibility of the facility, and complement the existing array of test support hardware such as a model injection system, radiant heaters, six-component force measurement system, and optical flow field visualization hardware. The new systems support complex test programs that require sophisticated test sequences and precise management of process fluids.
Furthermore, new systems such as the real-time angle-of-attack system and the new programmable logic controller enhance the test efficiency of the facility. The motivation for the upgrades and the expanded capabilities is described here.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hyung Lee; Rich Johnson, Ph.D.; Kimberlyn C. Moussesau
2011-12-01
The Nuclear Energy - Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Oak Ridge National Laboratory, Utah State University and others. The objective of this consortium is to establish a comprehensive knowledge base to provide Verification and Validation (V&V), Uncertainty Quantification (UQ) and other resources for advanced modeling and simulation (M&S) in nuclear reactor design and analysis. NE-KAMS will become a valuable resource for the nuclear industry, the national laboratories, the U.S. NRC and the public to help ensure the safe operation of existing and future nuclear reactors. A survey and evaluation of the state-of-the-art of existing V&V and M&S databases, including Department of Energy and commercial databases, has been performed to ensure that the NE-KAMS effort will not duplicate existing resources and capabilities and to assess the scope of the effort required to develop and implement NE-KAMS. The survey and evaluation have indeed highlighted the unique set of value-added functionality and services that NE-KAMS will provide to its users. Additionally, the survey has helped develop a better understanding of the architecture and functionality of these data and knowledge bases that can be used to leverage the development of NE-KAMS.
Processing and Analysis of Mars Pathfinder Science Data at JPL's Science Data Processing Section
NASA Technical Reports Server (NTRS)
LaVoie, S.; Green, W.; Runkle, A.; Alexander, D.; Andres, P.; DeJong, E.; Duxbury, E.; Freda, D.; Gorjian, Z.; Hall, J.;
1998-01-01
The Mars Pathfinder mission required new capabilities and adaptation of existing capabilities in order to support science analysis and flight operations requirements imposed by the in-situ nature of the mission.
NASA Technical Reports Server (NTRS)
1994-01-01
This study provides a set of recommendations for improving the effectiveness of our nation's aeronautics and space facilities. The study plan considers current and future government and commercial needs as well as DOD and NASA mission requirements through the year 2023. It addresses shortfalls in existing capabilities, new facility requirements, upgrades, consolidations, and phase-out of existing facilities. If the recommendations are implemented, they will provide world-class capability where it is vital to our country's needs and make us more efficient in meeting future needs.
The integration of a LANDSAT analysis capability with a geographic information system
NASA Technical Reports Server (NTRS)
Nordstrand, E. A.
1981-01-01
The integration of LANDSAT data was achieved through the development of a flexible, compatible analysis tool and the use of an existing data base to select the usable data from a LANDSAT analysis. The software package allows manipulation of grid cell data plus the flexibility to allow the user to include FORTRAN statements for special functions. Using this combination of capabilities, the user can classify a LANDSAT image and then selectively merge the results with other data that may exist for the study area.
Transparent SiO2-Ag core-satellite nanoparticle assembled layer for plasmonic-based chemical sensors
NASA Astrophysics Data System (ADS)
Chen, Tsung-Han; Jean, Ren-Der; Chiu, Kuo-Chuang; Chen, Chun-Hua; Liu, Dean-Mo
2012-05-01
We discovered a promising sensing capability of SiO2@Ag core-satellite nanoparticles with respect to organic melamine when they were consolidated into a solid-type thin-film entity. A series of theoretical models were proposed which provided calculation outcomes superior to those of existing models for the localized surface plasmon resonance spectra of the solid-state assemblies. We envisioned not only that such a SiO2@Ag film is a potential candidate for a transparent solid-state optical nanosensor for the detection of organic molecules but also that the resulting plasmonic resonance model facilitates a better understanding of such a solid-state nanosensor used for a number of sensory applications.
Modeling Urban Energy Savings Scenarios Using Earth System Microclimate and Urban Morphology
NASA Astrophysics Data System (ADS)
Allen, M. R.; Rose, A.; New, J. R.; Yuan, J.; Omitaomu, O.; Sylvester, L.; Branstetter, M. L.; Carvalhaes, T. M.; Seals, M.; Berres, A.
2017-12-01
We analyze and quantify the relationships among climatic conditions, urban morphology, population, land cover, and energy use so that these relationships can be used to inform energy-efficient urban development and planning. We integrate different approaches across three research areas: earth system modeling; impacts, adaptation and vulnerability; and urban planning in order to address three major gaps in the existing capability in these areas: i) neighborhood resolution modeling and simulation of urban micrometeorological processes and their effect on and from regional climate; ii) projections for future energy use under urbanization and climate change scenarios identifying best strategies for urban morphological development and energy savings; iii) analysis and visualization tools to help planners optimally use these projections.
Scharm, Martin; Wolkenhauer, Olaf; Waltemath, Dagmar
2016-02-15
Repositories support the reuse of models and ensure transparency about results in publications linked to those models. With thousands of models available in repositories, such as the BioModels database or the Physiome Model Repository, a framework to track the differences between models and their versions is essential to compare and combine models. Difference detection not only allows users to study the history of models but also helps in the detection of errors and inconsistencies. Existing repositories lack algorithms to track a model's development over time. Focusing on SBML and CellML, we present an algorithm to accurately detect and describe differences between coexisting versions of a model with respect to (i) the models' encoding, (ii) the structure of biological networks and (iii) mathematical expressions. This algorithm is implemented in a comprehensive and open source library called BiVeS. BiVeS helps to identify and characterize changes in computational models and thereby contributes to the documentation of a model's history. Our work facilitates the reuse and extension of existing models and supports collaborative modelling. Finally, it contributes to better reproducibility of modelling results and to the challenge of model provenance. The workflow described in this article is implemented in BiVeS. BiVeS is freely available as source code and binary from sems.uni-rostock.de. The web interface BudHat demonstrates the capabilities of BiVeS at budhat.sems.uni-rostock.de. © The Author 2015. Published by Oxford University Press.
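The version-comparison idea can be illustrated with a much-simplified sketch. Plain Python dictionaries stand in for the SBML/CellML XML trees that BiVeS actually processes; the function name and model parameters are illustrative, not the BiVeS API:

```python
# Simplified sketch of model-version diffing (illustrative only; BiVeS
# operates on SBML/CellML XML trees, not flat parameter dictionaries).

def diff_versions(old, new):
    """Return entries added, removed, and changed between two versions."""
    added = {k: new[k] for k in new.keys() - old.keys()}
    removed = {k: old[k] for k in old.keys() - new.keys()}
    changed = {k: (old[k], new[k])
               for k in old.keys() & new.keys() if old[k] != new[k]}
    return added, removed, changed

# Two hypothetical versions of a kinetic model's parameters:
v1 = {"k1": 0.5, "k2": 1.2, "Vmax": 10.0}
v2 = {"k1": 0.5, "k2": 1.5, "Km": 0.8}
added, removed, changed = diff_versions(v1, v2)
print(added)    # {'Km': 0.8}
print(removed)  # {'Vmax': 10.0}
print(changed)  # {'k2': (1.2, 1.5)}
```

A real difference-detection algorithm must additionally align network structure and mathematical expressions, as the abstract notes, but the added/removed/changed classification is the same.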
Bringing computational models of bone regeneration to the clinic.
Carlier, Aurélie; Geris, Liesbet; Lammens, Johan; Van Oosterwyck, Hans
2015-01-01
Although the field of bone regeneration has experienced great advancements in the last decades, integrating all the relevant, patient-specific information into a personalized diagnosis and optimal treatment remains a challenging task due to the large number of variables that affect bone regeneration. Computational models have the potential to cope with this complexity and to improve the fundamental understanding of the bone regeneration processes as well as to predict and optimize the patient-specific treatment strategies. However, the current use of computational models in daily orthopedic practice is very limited or nonexistent. We have identified three key hurdles that limit the translation of computational models of bone regeneration from bench to bedside. First, there exists a clear mismatch between the scope of the existing and the clinically required models. Second, most computational models are confronted with limited quantitative information of insufficient quality, thereby hampering the determination of patient-specific parameter values. Third, current computational models are only corroborated with animal models, whereas a thorough (retrospective and prospective) assessment of the computational model will be crucial to convince health care providers of its capabilities. These challenges must be addressed so that computational models of bone regeneration can reach their true potential, resulting in the advancement of individualized care and reduction of the associated health care costs. © 2015 Wiley Periodicals, Inc.
Workshop on Computational Turbulence Modeling
NASA Technical Reports Server (NTRS)
Shabbir, A. (Compiler); Shih, T.-H. (Compiler); Povinelli, L. A. (Compiler)
1994-01-01
The purpose of this meeting was to discuss the current status and future development of turbulence modeling in computational fluid dynamics for aerospace propulsion systems. Various turbulence models have been developed and applied to different turbulent flows over the past several decades, and it is becoming more and more urgent to assess their performance in various complex situations. In order to help users select and implement appropriate models in their engineering calculations, it is important to identify the capabilities as well as the deficiencies of these models. This also benefits turbulence modelers by permitting them to further improve upon the existing models. This workshop was designed for exchanging ideas and enhancing collaboration between different groups in the Lewis community who are using turbulence models in propulsion-related CFD. In this respect, the workshop supports the Lewis goal of excelling in propulsion-related research. This meeting had seven sessions for presentations and one panel discussion over a period of two days. Each presentation session was assigned to one or two branches (or groups) to present their turbulence-related research work. Each group was asked to address at least the following points: current status of turbulence model applications and developments in the research; progress and existing problems; and requests about turbulence modeling. The panel discussion session was designed for organizing committee members to answer management and technical questions from the audience and to make concluding remarks.
Correlated Topic Vector for Scene Classification.
Wei, Pengxu; Qin, Fei; Wan, Fang; Zhu, Yi; Jiao, Jianbin; Ye, Qixiang
2017-07-01
Scene images usually involve semantic correlations, particularly when considering large-scale image data sets. This paper proposes a novel generative image representation, the correlated topic vector, to model such semantic correlations. Derived from the correlated topic model, the correlated topic vector naturally utilizes the correlations among topics, which are seldom considered in conventional feature encoding, e.g., the Fisher vector, but do exist in scene images. It is expected that the involvement of correlations can increase the discriminative capability of the learned generative model and consequently improve the recognition accuracy. Incorporated with the Fisher kernel method, the correlated topic vector inherits the advantages of the Fisher vector. The contributions of visual words to the topics are further employed within the Fisher kernel framework to indicate the differences among scenes. Combined with deep convolutional neural network (CNN) features and a Gibbs sampling solution, the correlated topic vector shows great potential when processing large-scale and complex scene image data sets. Experiments on two scene image data sets demonstrate that the correlated topic vector significantly improves on the deep CNN features and outperforms existing Fisher kernel-based features.
Divertor Coil Design and Implementation on Pegasus
NASA Astrophysics Data System (ADS)
Shriwise, P. C.; Bongard, M. W.; Cole, J. A.; Fonck, R. J.; Kujak-Ford, B. A.; Lewicki, B. T.; Winz, G. R.
2012-10-01
An upgraded divertor coil system is being commissioned on the Pegasus Toroidal Experiment in conjunction with power system upgrades in order to achieve higher β plasmas, reduce impurities, and possibly achieve H-mode operation. Design points for the divertor coil locations and estimates of their necessary current ratings were found using predictive equilibrium modeling based upon a 300 kA target plasma. This modeling represented existing Pegasus coil locations and current drive limits. The resultant design calls for 125 kA-turns from the divertor system to support the creation of a double null magnetic topology in plasmas with Ip<=300 kA. Initial experiments using this system will employ 900 V IGBT power supply modules to provide IDIV<=4 kA. The resulting 20 kA-turn capability of the existing divertor coil will be augmented by a new coil providing additional A-turns in series. Induced vessel wall current modeling indicates the time response of a 28 turn augmentation coil remains fast compared to the poloidal field penetration rate through the vessel. First results operating the augmented system are shown.
Chen, Vivian Yi-Ju; Yang, Tse-Chuan
2012-08-01
An increasing interest in exploring spatial non-stationarity has generated several specialized analytic software programs; however, few of these programs can be integrated natively into a well-developed statistical environment such as SAS. We not only developed a set of SAS macro programs to fill this gap, but also expanded the geographically weighted generalized linear modeling (GWGLM) by integrating the strengths of SAS into the GWGLM framework. Three features distinguish our work. First, the macro programs of this study provide more kernel weighting functions than the existing programs. Second, with our codes the users are able to better specify the bandwidth selection process compared to the capabilities of existing programs. Third, the development of the macro programs is fully embedded in the SAS environment, providing great potential for future exploration of complicated spatially varying coefficient models in other disciplines. We provided three empirical examples to illustrate the use of the SAS macro programs and demonstrated the advantages explained above. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
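The kernel weighting functions mentioned above follow standard geographically weighted regression forms. A minimal sketch of two common kernels (Gaussian and bisquare) in terms of distance d and bandwidth b; this illustrates the general formulas, not the SAS macro code itself:

```python
import math

def gaussian_kernel(d, bandwidth):
    """Gaussian distance-decay weight: w = exp(-(d/b)^2 / 2)."""
    return math.exp(-0.5 * (d / bandwidth) ** 2)

def bisquare_kernel(d, bandwidth):
    """Bisquare weight: w = (1 - (d/b)^2)^2 inside the bandwidth, 0 outside."""
    return (1.0 - (d / bandwidth) ** 2) ** 2 if d < bandwidth else 0.0

# Weight of a neighbouring observation 5 km away, bandwidth 10 km:
print(round(gaussian_kernel(5.0, 10.0), 4))  # 0.8825
print(bisquare_kernel(5.0, 10.0))            # 0.5625
```

In GWGLM these weights enter the local likelihood at each regression point, so the fitted coefficients vary over space; bandwidth selection (e.g., by cross-validation) controls how local the fit is.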
Augmenting the SCaN Link Budget Tool with Validated Atmospheric Propagation
NASA Technical Reports Server (NTRS)
Steinkerchner, Leo; Welch, Bryan
2017-01-01
In any Earth-Space or Space-Earth communications link, atmospheric effects cause significant signal attenuation. In order to develop a communications system that is cost effective while meeting appropriate performance requirements, it is important to accurately predict these effects for the given link parameters. This project aimed to develop a MATLAB (The MathWorks, Inc.) program that could augment the existing Space Communications and Navigation (SCaN) Link Budget Tool with accurate predictions of atmospheric attenuation of both optical and radio-frequency signals according to the SCaN Optical Link Assessment Model Version 5 and the International Telecommunication Union, Radiocommunication Sector (ITU-R) atmospheric propagation loss model, respectively. When compared to data collected from the Advanced Communications Technology Satellite (ACTS), the radio-frequency model predicted attenuation to within 1.3 dB of loss for 95% of measurements. Ultimately, this tool will be integrated into the SCaN Center for Engineering, Networks, Integration, and Communications (SCENIC) user interface in order to support analysis of existing SCaN systems and planning capabilities for future NASA missions.
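The link-budget arithmetic such a tool augments can be sketched in a few lines. This toy version combines free-space path loss with a single lumped atmospheric-attenuation term, which is a drastic simplification of the ITU-R and SCaN optical models; all function names and parameter values are illustrative:

```python
import math

C = 299792458.0  # speed of light, m/s

def free_space_path_loss_db(distance_m, freq_hz):
    """Free-space path loss: 20*log10(4*pi*d*f/c)."""
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / C)

def received_power_dbw(eirp_dbw, rx_gain_dbi, distance_m, freq_hz, atm_loss_db):
    """Toy link budget: EIRP + receive gain - path loss - atmospheric loss."""
    return (eirp_dbw + rx_gain_dbi
            - free_space_path_loss_db(distance_m, freq_hz) - atm_loss_db)

# Ka-band GEO example: path loss alone is roughly 210 dB.
print(round(free_space_path_loss_db(35786e3, 20e9), 1))
```

The whole point of validated atmospheric models is to replace the constant `atm_loss_db` with a frequency-, elevation-, and weather-dependent prediction.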
Optical systems integrated modeling
NASA Technical Reports Server (NTRS)
Shannon, Robert R.; Laskin, Robert A.; Brewer, SI; Burrows, Chris; Epps, Harlan; Illingworth, Garth; Korsch, Dietrich; Levine, B. Martin; Mahajan, Vini; Rimmer, Chuck
1992-01-01
An integrated modeling capability that provides the tools by which entire optical systems and instruments can be simulated and optimized is a key technology development, applicable to all mission classes, especially astrophysics. Many of the future missions require optical systems that are physically much larger than anything flown before and yet must retain the characteristic sub-micron diffraction-limited wavefront accuracy of their smaller precursors. It is no longer feasible to follow the path of 'cut and test' development; the sheer scale of these systems precludes many of the older techniques that rely upon ground evaluation of full-size engineering units. The ability to accurately model (by computer) and optimize the entire flight system's integrated structural, thermal, and dynamic characteristics is essential. Two distinct integrated modeling capabilities are required: an initial design capability and a detailed design and optimization system. The content of an initial design package is shown. It would be a modular, workstation-based code which allows preliminary integrated system analysis and trade studies to be carried out quickly by a single engineer or a small design team. A simple concept for a detailed design and optimization system is shown. This is an interface architecture that links existing large specialized optical, control, thermal, and structural design codes and allows efficient interchange of information among them. The computing environment would be a network of large mainframe machines, and its users would be project-level design teams. More advanced concepts for detailed design systems would support interaction between modules and automated optimization of the entire system. Technology assessment and development plans for an integrated package for initial design, interface development for detailed optimization, validation, and modeling research are presented.
MODFLOW-2005 : the U.S. Geological Survey modular ground-water model--the ground-water flow process
Harbaugh, Arlen W.
2005-01-01
This report presents MODFLOW-2005, which is a new version of the finite-difference ground-water model commonly called MODFLOW. Ground-water flow is simulated using a block-centered finite-difference approach. Layers can be simulated as confined or unconfined. Flow associated with external stresses, such as wells, areal recharge, evapotranspiration, drains, and rivers, also can be simulated. The report includes detailed explanations of physical and mathematical concepts on which the model is based, an explanation of how those concepts are incorporated in the modular structure of the computer program, instructions for using the model, and details of the computer code. The modular structure consists of a MAIN Program and a series of highly independent subroutines. The subroutines are grouped into 'packages.' Each package deals with a specific feature of the hydrologic system that is to be simulated, such as flow from rivers or flow into drains, or with a specific method of solving the set of simultaneous equations resulting from the finite-difference method. Several solution methods are incorporated, including the Preconditioned Conjugate-Gradient method. The division of the program into packages permits the user to examine specific hydrologic features of the model independently. This also facilitates development of additional capabilities because new packages can be added to the program without modifying the existing packages. The input and output systems of the computer program also are designed to permit maximum flexibility. The program is designed to allow other capabilities, such as transport and optimization, to be incorporated, but this report is limited to describing the ground-water flow capability. The program is written in Fortran 90 and will run without modification on most computers that have a Fortran 90 compiler.
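In the simplest case, the block-centered finite-difference approach reduces to repeated local averaging: for 1-D steady-state flow in a homogeneous confined aquifer with fixed heads at both ends and no stresses, each interior head satisfies the discrete Laplace equation. A toy relaxation sketch (not MODFLOW code, and omitting stress packages and solvers such as the Preconditioned Conjugate-Gradient method):

```python
# Toy 1-D steady-state groundwater flow by finite differences
# (homogeneous aquifer, fixed heads at both ends, no external stresses):
# each interior head relaxes to the average of its neighbours.

def solve_heads(h_left, h_right, n_cells, sweeps=5000):
    h = [h_left] + [(h_left + h_right) / 2.0] * (n_cells - 2) + [h_right]
    for _ in range(sweeps):
        for i in range(1, n_cells - 1):
            h[i] = 0.5 * (h[i - 1] + h[i + 1])  # discrete Laplace equation
    return h

heads = solve_heads(10.0, 2.0, 9)
print([round(x, 3) for x in heads])  # linear gradient from 10.0 down to 2.0
```

MODFLOW generalizes this to three dimensions with spatially varying conductivities, multiple stress packages, and far more efficient matrix solvers, but the cell-balance equation being solved is of this form.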
NASA Technical Reports Server (NTRS)
Noor, A. K.
1983-01-01
Advances in continuum modeling, progress in reduction methods, and analysis and modeling needs for large space structures are covered, with specific attention given to repetitive lattice trusses. As far as continuum modeling is concerned, an effective and verified analysis capability exists for linear thermoelastic stress, bifurcation buckling, and free vibration problems of repetitive lattices. However, application of continuum modeling to nonlinear analysis needs more development. Reduction methods are very effective for bifurcation buckling and static (steady-state) nonlinear analysis. However, more work is needed to realize their full potential for nonlinear dynamic and time-dependent problems. As far as analysis and modeling needs are concerned, three areas are identified: loads determination, modeling of nonclassical behavior characteristics, and computational algorithms. The impact of new advances in computer hardware, software, integrated analysis, CAD/CAM systems, and materials technology is also discussed.
Viability of Existing INL Facilities for Dry Storage Cask Handling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Randy Bohachek; Charles Park; Bruce Wallace
2013-04-01
This report evaluates existing capabilities at the INL to determine if a practical and cost-effective method could be developed for opening and handling full-sized dry storage casks. The Idaho Nuclear Technology and Engineering Center (INTEC) CPP-603, Irradiated Spent Fuel Storage Facility, provides the infrastructure to support handling and examining casks and their contents. Based on a reasonable set of assumptions, it is possible to receive, open, inspect, remove samples, close, and reseal large bolted-lid dry storage casks at the INL. The capability can also be used to open and inspect casks that were last examined at the TAN Hot Shop over ten years ago. The Castor V/21 and REA-2023 casks can provide additional confirmatory information regarding the extended performance of low-burnup (<45 GWD/MTU) used nuclear fuel. Once a dry storage cask is opened inside CPP-603, used fuel retrieved from the cask can be packaged in a shipping cask and sent to a laboratory for testing. Testing at the INL's Materials and Fuels Complex (MFC) can occur starting with shipment of samples from CPP-603 over an on-site road, avoiding the need to use public highways. This reduces cost and reduces the risk to the public. The full suite of characterization methods needed to establish the condition of the fuel exists at MFC. Many other testing capabilities also exist at MFC, but when those capabilities are not adequate, samples can be prepared and shipped to other laboratories for testing. This report discusses how the casks would be handled, what work needs to be done to ready the facilities/capabilities, and what the work will cost.
The Radiation, Interplanetary Shocks, and Coronal Sources (RISCS) Toolset
NASA Technical Reports Server (NTRS)
Zank, G. P.; Spann, J.
2014-01-01
We outline a plan to develop a physics-based predictive toolset, RISCS, to describe the interplanetary energetic particle and radiation environment throughout the inner heliosphere, including at the Earth. To forecast and "nowcast" the radiation environment requires the fusing of three components: 1) the ability to provide probabilities for incipient solar activity; 2) the use of these probabilities and daily coronal and solar wind observations to model the 3D spatial and temporal heliosphere, including magnetic field structure and transients, within 10 AU; and 3) the ability to model the acceleration and transport of energetic particles based on current and anticipated coronal and heliospheric conditions. We describe how to address 1) - 3) based on our existing, well developed, and validated codes and models. The goal of the RISCS toolset is to provide an operational forecast and "nowcast" capability that will predict a) solar energetic particle (SEP) intensities; b) spectra for protons and heavy ions; c) maximum energies and their duration; d) SEP composition; e) cosmic ray intensities; and f) plasma parameters, including shock arrival times, strength, and obliquity at any given heliospheric location and time. The toolset would have a 72-hour predictive capability, with associated probabilistic bounds, that would be updated hourly thereafter to improve the predicted event(s) and reduce the associated probability bounds. The RISCS toolset would be highly adaptable and portable, capable of running on a variety of platforms to accommodate various operational needs and requirements.
MAPGEN Planner: Mixed-Initiative Activity Planning for the Mars Exploration Rover Mission
NASA Technical Reports Server (NTRS)
Ai-Chang, Mitch; Bresina, John; Charest, Leonard; Hsu, Jennifer; Jonsson, Ari K.; Kanefsky, Bob; Maldague, Pierre; Morris, Paul; Rajan, Kanna; Yglesias, Jeffrey
2003-01-01
This document describes the Mixed-initiative Activity Plan Generation system MAPGEN. The system is being developed as one of the tools to be used during surface operations of NASA's Mars Exploration Rover (MER) mission. However, the core technology is general and can be adapted to different missions and applications. The motivation for the system is to better support users who need to rapidly build activity plans that have to satisfy complex rules and fit within resource limits. The system therefore combines an existing tool for activity plan editing and resource modeling with an advanced constraint-based reasoning and planning framework. The demonstration will show the key capabilities of the automated reasoning and planning component of the system, with emphasis on how these capabilities will be used during surface operations of the MER mission.
Open discovery: An integrated live Linux platform of Bioinformatics tools
Vetrivel, Umashankar; Pilla, Kalabharath
2008-01-01
Historically, live Linux distributions for Bioinformatics have paved the way for a platform-independent Bioinformatics workbench. However, most existing live Linux distributions limit their usage to sequence analysis and basic molecular visualization programs and lack data persistence. Hence, Open Discovery, a live Linux distribution, has been developed with the capability to perform complex tasks like molecular modeling, docking and molecular dynamics in a swift manner. Furthermore, it is also equipped with a complete sequence analysis environment and is capable of running Windows executable programs in a Linux environment. Open Discovery portrays an advanced customizable configuration of Fedora, with data persistence accessible via USB drive or DVD. Availability: Open Discovery is distributed free under the Academic Free License (AFL) and can be downloaded from http://www.OpenDiscovery.org.in PMID:19238235
A survey of electric and hybrid vehicle simulation programs
NASA Technical Reports Server (NTRS)
Bevan, J.; Heimburger, D. A.; Metcalfe, M. A.
1978-01-01
Results of a survey conducted within the United States to determine the extent of development and capabilities of automotive performance simulation programs suitable for electric and hybrid vehicle studies are summarized. Altogether, 111 programs were identified as being in a usable state. The complexity of the existing programs spans a range from a page of simple desktop calculator instructions to 300,000 lines of a high-level programming language. The capability to simulate electric vehicles was most common, heat-engines second, and hybrid vehicles least common. Batch-operated programs are slightly more common than interactive ones, and one-third can be operated in either mode. The most commonly used language was FORTRAN, the language typically used by engineers. The higher-level simulation languages (e.g. SIMSCRIPT, GPSS, SIMULA) used by "model builders" were conspicuously lacking.
Control system software, simulation, and robotic applications
NASA Technical Reports Server (NTRS)
Frisch, Harold P.
1991-01-01
All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculo-skeletal dynamics modeling. The program JACK will be used for estimating and animating whole-body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminate muscular load-sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capability is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.
In-Space Manufacturing Baseline Property Development
NASA Technical Reports Server (NTRS)
Stockman, Tom; Schneider, Judith; Prater, Tracie; Bean, Quincy; Werkheiser, Nicki
2016-01-01
The In-Space Manufacturing (ISM) project at NASA Marshall Space Flight Center currently operates a 3D FDM (fused deposition modeling) printer onboard the International Space Station. In order to enable utilization of this capability by designers, the project needs to establish characteristic material properties for materials produced using the process. This is difficult for additive manufacturing since standards and specifications do not yet exist for these technologies. Due to the limited availability of crew time, there are limitations on sample size, which in turn limit the application of the traditional design-allowables approaches to developing a materials property database for designers. In this study, various approaches to the development of material databases were evaluated for use by designers of space systems who wish to leverage in-space manufacturing capabilities. This study focuses on alternative statistical techniques for baseline property development to support in-space manufacturing.
A data management system for engineering and scientific computing
NASA Technical Reports Server (NTRS)
Elliot, L.; Kunii, H. S.; Browne, J. C.
1978-01-01
Data elements and relationship definition capabilities for this data management system are explicitly tailored to the needs of engineering and scientific computing. System design was based upon studies of data management problems currently being handled through explicit programming. The system-defined data element types include real scalar numbers, vectors, arrays and special classes of arrays such as sparse arrays and triangular arrays. The data model is hierarchical (tree structured). Multiple views of data are provided at two levels. Subschemas provide multiple structural views of the total data base and multiple mappings for individual record types are supported through the use of a REDEFINES capability. The data definition language and the data manipulation language are designed as extensions to FORTRAN. Examples of the coding of real problems taken from existing practice in the data definition language and the data manipulation language are given.
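The hierarchical (tree-structured) data model with multiple views can be illustrated with a small sketch. All names here are hypothetical, and the real system extended FORTRAN with a data definition/manipulation language rather than using Python; this toy only shows the tree-plus-path-lookup idea:

```python
# Hypothetical hierarchical engineering database (tree-structured),
# in the spirit of the data model described; all names are illustrative.

database = {
    "structure": {
        "nodes": [[0.0, 0.0], [1.0, 0.0]],
        "stiffness": {"format": "sparse", "entries": {(0, 0): 4.0, (0, 1): -2.0}},
    },
    "loads": {"case1": [0.0, -9.81]},
}

def lookup(db, path):
    """Resolve a slash-separated path against the hierarchy."""
    node = db
    for key in path.split("/"):
        node = node[key]
    return node

print(lookup(database, "loads/case1"))                 # [0.0, -9.81]
print(lookup(database, "structure/stiffness/format"))  # sparse
```

A subschema in this picture is a restricted set of such paths exposed to a particular user, and a REDEFINES-style capability corresponds to offering a second mapping (e.g., dense vs. sparse) over the same stored record.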
Strategies and Innovative Approaches for the Future of Space Weather Forecasting
NASA Astrophysics Data System (ADS)
Hoeksema, J. T.
2012-12-01
The real and potential impacts of space weather have been well documented, yet neither the required research and operations programs nor the data, modeling, and analysis infrastructure necessary to develop and sustain a reliable space weather forecasting capability for society are in place. The recently published decadal survey "Solar and Space Physics: A Science for a Technological Society" presents a vision for the coming decade and calls for a renewed national commitment to a comprehensive program in space weather and climatology. New resources are imperative. Particularly in the current fiscal environment, implementing a responsible strategy to address these needs will require broad participation across agencies and innovative approaches to make the most of existing resources, capitalize on current knowledge, span gaps in capabilities and observations, and focus resources on overcoming immediate roadblocks.
Terrestrial Planet Finder: Technology Development Plans
NASA Technical Reports Server (NTRS)
Lindensmith, Chris
2004-01-01
One of humanity's oldest questions is whether life exists elsewhere in the universe. The Terrestrial Planet Finder (TPF) mission will survey stars in our stellar neighborhood to search for planets and perform spectroscopic measurements to identify potential biomarkers in their atmospheres. In response to the recently published President's Plan for Space Exploration, TPF has plans to launch a visible-light coronagraph in 2014, and a separated-spacecraft infrared interferometer in 2016. Substantial funding has been committed to the development of the key technologies that are required to meet these goals for launch in the next decade. Efforts underway through industry and university contracts and at JPL include a number of system and subsystem testbeds, as well as components and numerical modeling capabilities. The science, technology, and design efforts are closely coupled to ensure that requirements and capabilities will be consistent and meet the science goals.
NASA Astrophysics Data System (ADS)
Zheng, J.; Zhu, J.; Wang, Z.; Fang, F.; Pain, C. C.; Xiang, J.
2015-06-01
A new anisotropic hr-adaptive mesh technique has been applied to the modelling of multiscale transport phenomena, based on a discontinuous Galerkin/control volume discretization on unstructured meshes. Compared with existing air quality models, which are typically based on static structured grids with a local nesting technique, the anisotropic hr-adaptive model has the advantage of being able to adapt the mesh according to the evolving pollutant distribution and flow features. That is, the mesh resolution can be adjusted dynamically to simulate the pollutant transport process accurately and effectively. To illustrate the capability of the anisotropic adaptive unstructured mesh model, three benchmark numerical experiments have been set up for two-dimensional (2-D) transport phenomena. Comparisons have been made between the results obtained using uniform-resolution meshes and anisotropic adaptive-resolution meshes.
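The h-refinement half of such adaptivity can be illustrated with a 1-D toy: insert midpoints wherever a solution-jump indicator exceeds a tolerance, so resolution concentrates near sharp fronts. This is only a sketch of the general idea, not the anisotropic discontinuous Galerkin/control volume scheme described above:

```python
# 1-D h-refinement sketch: insert a midpoint wherever the jump in the
# sampled solution between neighbouring points exceeds a tolerance.

def refine(xs, f, tol):
    out = [xs[0]]
    for a, b in zip(xs, xs[1:]):
        if abs(f(b) - f(a)) > tol:
            out.append((a + b) / 2.0)  # refine where the field varies sharply
        out.append(b)
    return out

step = lambda x: 0.0 if x < 0.5 else 1.0  # a sharp front at x = 0.5
mesh = refine([0.0, 0.25, 0.5, 0.75, 1.0], step, 0.5)
print(mesh)  # [0.0, 0.25, 0.375, 0.5, 0.75, 1.0]
```

An hr-adaptive scheme additionally relocates existing points (r-adaptivity) and, in the anisotropic case, stretches elements along directions where the solution varies slowly.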
Sahoo, Debasis; Deck, Caroline; Yoganandan, Narayan; Willinger, Rémy
2013-12-01
A composite material model for the skull, taking damage into account, is implemented in the Strasbourg University finite element head model (SUFEHM) in order to enhance the existing skull mechanical constitutive law. The skull behavior is validated in terms of fracture patterns and contact forces by reconstructing 15 experimental cases. The new SUFEHM skull model is capable of reproducing skull fracture precisely. The composite skull model is validated not only for maximum forces, but also, for the first time, against actual force-time curves from PMHS for lateral impact. Skull strain energy is found to be a pertinent parameter for predicting skull fracture, and based on statistical (binary logistic regression) analysis it is observed that a 50% risk of skull fracture occurs at a skull strain energy of 544.0 mJ. © 2013 Elsevier Ltd. All rights reserved.
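The reported 50% fracture risk at 544.0 mJ corresponds to the midpoint of a binary logistic risk curve. A sketch of such a curve, where the slope parameter `BETA` is an illustrative assumption (the fitted value is not given in this abstract):

```python
import math

E50 = 544.0   # skull strain energy (mJ) at 50% fracture risk, per the study
BETA = 0.01   # logistic slope (1/mJ) -- illustrative assumption only

def fracture_risk(strain_energy_mj):
    """Binary logistic risk curve centred on the reported 50% point."""
    return 1.0 / (1.0 + math.exp(-BETA * (strain_energy_mj - E50)))

print(fracture_risk(544.0))  # 0.5 by construction
```

Whatever the actual fitted slope, the model passes through 0.5 at 544.0 mJ and rises monotonically with strain energy.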
Anisotropic constitutive modeling for nickel-base single crystal superalloys. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Sheh, Michael Y.
1988-01-01
An anisotropic constitutive model was developed based on crystallographic slip theory for nickel-base single crystal superalloys. The constitutive equations developed utilize drag stress and back stress state variables to model the local inelastic flow. Specially designed experiments were conducted to evaluate the existence of back stress in the single crystal superalloy Rene N4 at 982 C. The results suggest that: (1) the back stress is orientation dependent; and (2) the back stress state variable is required for the current model to predict material anelastic recovery behavior. The model was evaluated for its predictive capability on single crystal material behavior, including orientation-dependent stress-strain response, tension/compression asymmetry, strain rate sensitivity, anelastic recovery behavior, cyclic hardening and softening, stress relaxation, creep, and associated crystal lattice rotation. Limitations and future development needs are discussed.
The Space Debris Environment for the ISS Orbit
NASA Technical Reports Server (NTRS)
Theall, Jeff; Liou, Jer-Chyi; Matney, Mark; Kessler, Don
2001-01-01
With thirty-five planned missions over the next five years, the International Space Station (ISS) will be the focus for manned space activity. At least six different vehicles will transport crew and supplies to and from the nominally 400 km, 51.6 degree orbit. When completed, the ISS will be the largest space structure ever assembled and hence the largest target for space debris. Recent work at the Johnson Space Center has focused on updating the existing space debris models. The Orbital Debris Engineering Model has been restructured to take advantage of state-of-the-art desktop computing capability and revised with recent measurements from the Haystack and Goldstone radars, additional analysis of LDEF and STS impacts, and the most recent SSN catalog. The new model also contains the capability to extrapolate the current environment to the year 2030. A revised meteoroid model based on the work of Divine, called the JSC Meteoroid Model, has also been developed. The new model defines flux on the target per unit angle per unit speed and, for Earth orbit, includes the meteor showers. This paper quantifies the space debris environment for the ISS orbit from natural and anthropogenic sources. Particle flux and velocity distributions as functions of size and angle are given for particles 10 microns and larger for altitudes from 350 to 450 km. The environment is projected forward in time until 2030.
Argobots: A Lightweight Low-Level Threading and Tasking Framework
Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan; ...
2017-10-24
In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, are either too specific to applications or architectures or are not as powerful or flexible. In this article, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by the user or high-level programming model. Here, we describe the design, implementation, and optimization of Argobots and present integrations with three example high-level models: OpenMP, MPI, and co-located I/O service. Evaluations show that (1) Argobots outperforms existing generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency hiding capabilities; and (4) I/O service with Argobots reduces interference with co-located applications, achieving performance competitive with that of the Pthreads version.
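The cost advantage that user-level tasking models exploit, context switches without kernel involvement, can be illustrated in miniature with Python generators standing in for user-level threads under a cooperative round-robin scheduler. This is a toy sketch of the concept only; it does not use or resemble the actual Argobots C API:

```python
from collections import deque

def scheduler(tasks):
    """Round-robin cooperative scheduler: each 'yield' is a context switch,
    performed entirely in user space with no OS involvement."""
    ready = deque(tasks)
    while ready:
        task = ready.popleft()
        try:
            next(task)          # run the task to its next yield point
            ready.append(task)  # still alive: requeue it
        except StopIteration:
            pass                # task finished

log = []

def worker(name, steps):
    for i in range(steps):
        log.append(f"{name}:{i}")
        yield  # cooperative yield, loosely analogous to a user-level thread yield

scheduler([worker("A", 2), worker("B", 3)])
print(log)  # tasks interleave: A:0 B:0 A:1 B:1 B:2
```

In a real runtime such as Argobots, the "tasks" are full user-level threads with their own stacks, multiplexed over a pool of execution streams rather than a single loop.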
Leveraging existing technology to boost revenue cycle performance.
Wagner, Karen
2012-09-01
Revenue cycle leaders can reduce the frequency or level of technology investment needed while maintaining strong service and payment accuracy by looking at four areas of opportunity: applying output from existing technology in new ways; seeking new functionality from existing systems; linking with external systems to provide greater capabilities; and supplementing limitations of existing technology with outside expertise.
A framework for modeling scenario-based barrier island storm impacts
Mickey, Rangley; Long, Joseph W.; Dalyander, P. Soupy; Plant, Nathaniel G.; Thompson, David M.
2018-01-01
Methods for investigating the vulnerability of existing or proposed coastal features to storm impacts often rely on simplified parametric models or one-dimensional process-based modeling studies that focus on changes to a profile across a dune or barrier island. These simple studies tend to neglect the impacts to curvilinear or alongshore varying island planforms, influence of non-uniform nearshore hydrodynamics and sediment transport, irregular morphology of the offshore bathymetry, and impacts from low magnitude wave events (e.g. cold fronts). Presented here is a framework for simulating regionally specific, low and high magnitude scenario-based storm impacts to assess the alongshore variable vulnerabilities of a coastal feature. Storm scenarios based on historic hydrodynamic conditions were derived and simulated using the process-based morphologic evolution model XBeach. Model results show that the scenarios predicted similar patterns of erosion and overwash when compared to observed qualitative morphologic changes from recent storm events that were not included in the dataset used to build the scenarios. The framework model simulations were capable of predicting specific areas of vulnerability in the existing feature and the results illustrate how this storm vulnerability simulation framework could be used as a tool to help inform the decision-making process for scientists, engineers, and stakeholders involved in coastal zone management or restoration projects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, L.T.; Hickey, M.
This paper summarizes the progress to date by CH2M HILL and the UKAEA in developing a parametric modelling capability for estimating the costs of large nuclear decommissioning projects in the United Kingdom (UK) and Europe. The ability to successfully apply parametric cost estimating techniques will be a key factor in commercial success in the UK and European multi-billion dollar waste management, decommissioning and environmental restoration markets. The most useful parametric models will be those that incorporate individual components representing major elements of work: reactor decommissioning, fuel cycle facility decommissioning, waste management facility decommissioning and environmental restoration. Models must be sufficiently robust to estimate indirect costs and overheads, permit pricing analysis and adjustment, and accommodate the intricacies of international monetary exchange, currency fluctuations and contingency. The development of a parametric cost estimating capability is also a key component in building a forward estimating strategy. The forward estimating strategy will enable the preparation of accurate and cost-effective out-year estimates, even when work scope is poorly defined or as yet indeterminate. Preparation of cost estimates for work outside the organization's current sites, for which detailed measurement is not possible and historical cost data do not exist, will also be facilitated. (authors)
Long-term archiving and data access: modelling and standardization
NASA Technical Reports Server (NTRS)
Hoc, Claude; Levoir, Thierry; Nonon-Latapie, Michel
1996-01-01
This paper reports on the multiple difficulties inherent in the long-term archiving of digital data, and in particular on the different possible causes of definitive data loss. It defines the basic principles which must be respected when creating long-term archives. Such principles concern both the archival systems and the data. The archival systems should have two primary qualities: independence of architecture with respect to technological evolution, and genericness, i.e., the capability of ensuring identical service for heterogeneous data. These characteristics are implicit in the Reference Model for Archival Services, currently being designed within an ISO-CCSDS framework. A system prototype has been developed at the French Space Agency (CNES) in conformance with these principles, and its main characteristics will be discussed in this paper. Moreover, the data archived should be capable of abstract representation regardless of the technology used, and should, to the extent that it is possible, be organized, structured and described with the help of existing standards. The immediate advantage of standardization is illustrated by several concrete examples. Both the positive facets and the limitations of this approach are analyzed. The advantages of developing an object-oriented data model within this context are then examined.
NASA Astrophysics Data System (ADS)
Fauzi, Ilham; Muharram Hasby, Fariz; Irianto, Dradjad
2018-03-01
Although the government is able to make mandatory standards that must be obeyed by industry, the respective industries themselves often have difficulty fulfilling the requirements described in those standards. This is especially true in many small and medium-sized enterprises that lack the capital required to invest in standard-compliant equipment and machinery. This study aims to develop a set of measurement tools for evaluating the readiness of production technology with respect to the requirements of a product standard, based on the quality function deployment (QFD) method. By combining the QFD methodology, the UNESCAP Technometric model [9], and the Analytic Hierarchy Process (AHP), the model is used to measure a firm's capability to fulfill a government standard in the toy-making industry. Expert opinions from both the governmental officers responsible for setting and implementing standards and the industry practitioners responsible for managing manufacturing processes are collected and processed to find the technological capabilities that should be improved by the firm to fulfill the existing standard. This study showed that the proposed model can be used successfully to measure the gap between the requirements of the standard and the readiness of the technoware technological component in a particular firm.
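The AHP stage of such a model turns expert pairwise comparisons into priority weights, conventionally via the principal eigenvector of the comparison matrix. A small sketch with hypothetical judgments (not the study's actual data):

```python
def ahp_weights(M, iters=100):
    """Priority weights of an AHP pairwise-comparison matrix M as its
    principal eigenvector, computed by power iteration and normalized
    to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Hypothetical judgments (NOT the study's data): criterion 1 judged 3x as
# important as criterion 2 and 5x as important as criterion 3, etc.
M = [[1.0,     3.0,     5.0],
     [1.0 / 3, 1.0,     2.0],
     [1.0 / 5, 1.0 / 2, 1.0]]

weights = ahp_weights(M)
print([round(w, 3) for w in weights])  # dominant weight on criterion 1
```

In the paper's setting, the resulting weights would be combined with QFD relationships to rank which technological capabilities most affect standard compliance.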
Cartographic symbol library considering symbol relations based on anti-aliasing graphic library
NASA Astrophysics Data System (ADS)
Mei, Yang; Li, Lin
2007-06-01
Cartographic visualization represents geographic information in map form, which enables us to retrieve useful geospatial information. In the digital environment, the cartographic symbol library is the basis of cartographic visualization and an essential component of a Geographic Information System as well. Existing cartographic symbol libraries have two flaws: one is display quality, and the other is the adjustment of symbol relations. Statistical data presented in this paper indicate that aliasing is a major factor in symbol display quality on graphic display devices. Therefore, effective graphic anti-aliasing methods based on a new anti-aliasing algorithm are presented and encapsulated in an anti-aliasing graphic library in the form of a Component Object Model. Furthermore, cartographic visualization should represent feature relations by correctly adjusting symbol relations, in addition to displaying individual features, but current cartographic symbol libraries do not have this capability. This paper creates a cartographic symbol design model to implement symbol-relation adjustment. Consequently, a cartographic symbol library based on this design model can provide cartographic visualization with relation-adjusting capability. The anti-aliasing graphic library and the cartographic symbol library are sampled, and the results prove that both libraries have good efficiency and effect.
Waffle mode error in the AEOS adaptive optics point-spread function
NASA Astrophysics Data System (ADS)
Makidon, Russell B.; Sivaramakrishnan, Anand; Roberts, Lewis C., Jr.; Oppenheimer, Ben R.; Graham, James R.
2003-02-01
Adaptive optics (AO) systems have improved astronomical imaging capabilities significantly over the last decade, and have the potential to revolutionize the kinds of science done with 4-5 m class ground-based telescopes. However, given sufficiently detailed study and analysis, existing AO systems can be improved beyond their original specified error budgets. Indeed, modeling AO systems has been a major activity in the past decade: sources of noise in the atmosphere and the wavefront sensing (WFS) control loop have received a great deal of attention, and many detailed and sophisticated control-theoretic and numerical models predicting AO performance are already in existence. However, in terms of AO system performance improvements, wavefront reconstruction (WFR) and wavefront calibration techniques have commanded relatively little attention. We elucidate the nature of some of these reconstruction problems and demonstrate their existence in data from the AEOS AO system. We simulate the AO correction of AEOS in the I-band and show that the magnitude of the 'waffle mode' error in the AEOS reconstructor is considerably larger than expected. We suggest ways of reducing the magnitude of this error and, in doing so, open up ways of understanding how wavefront reconstruction might handle bad actuators and partially illuminated WFS subapertures.
Stable modeling based control methods using a new RBF network.
Beyhan, Selami; Alci, Musa
2010-10-01
This paper presents a novel model with radial basis functions (RBFs), which is applied successively for online stable identification and control of nonlinear discrete-time systems. First, the proposed model is utilized for direct inverse modeling of the plant to generate the control input, where it is assumed that the inverse plant dynamics exist. Second, it is employed for system identification to generate a sliding-mode control input. Finally, the network is employed to tune PID (proportional-integral-derivative) controller parameters automatically. The adaptive learning rate (ALR), which is employed in the gradient descent (GD) method, provides global convergence of the modeling errors. Using the Lyapunov stability approach, the boundedness of the tracking errors and the system parameters is shown both theoretically and in real time. To show the superiority of the new model with RBFs, its tracking results are compared with those of a conventional sigmoidal multi-layer perceptron (MLP) neural network and the new model with sigmoid activation functions. To demonstrate the real-time capability of the new model, the proposed network is employed for online identification and control of a cascaded parallel two-tank liquid-level system. Even when there are large disturbances, the proposed model with RBFs generates a suitable control input to track the reference signal better than the other methods, in both simulations and real time. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.
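The identification role of such an RBF model can be sketched as online gradient-descent training of the output weights against streaming input-output data. This is a simplified, fixed-learning-rate sketch with fixed centers; the paper's adaptive learning rate and Lyapunov-based stability analysis are not reproduced here:

```python
import math, random

class RBFNet:
    """Minimal RBF network trained online by gradient descent on the
    output weights (Gaussian centers and widths held fixed)."""
    def __init__(self, centers, width=0.5, lr=0.2):
        self.c = centers
        self.width = width
        self.w = [0.0] * len(centers)
        self.lr = lr

    def phi(self, x):
        return [math.exp(-((x - c) ** 2) / (2 * self.width ** 2)) for c in self.c]

    def predict(self, x):
        return sum(wi * pi for wi, pi in zip(self.w, self.phi(x)))

    def update(self, x, y):
        """One online GD step on the squared error (y - yhat)^2."""
        p = self.phi(x)
        e = y - self.predict(x)
        self.w = [wi + self.lr * e * pi for wi, pi in zip(self.w, p)]
        return e

random.seed(0)
net = RBFNet(centers=[i * 0.5 for i in range(-4, 5)])
# identify the unknown static nonlinearity y = sin(2x) from streaming samples
for _ in range(5000):
    x = random.uniform(-2.0, 2.0)
    net.update(x, math.sin(2.0 * x))
print(abs(net.predict(1.0) - math.sin(2.0)))  # residual error after training
```

The inverse-modeling and sliding-mode uses in the paper wrap this same kind of online update inside a control loop.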
NASA Technical Reports Server (NTRS)
Lavelle, Tom
2003-01-01
The objective is to increase the usability of the current NPSS code/architecture by incorporating an advanced space transportation propulsion system capability into the existing NPSS code, to begin defining advanced capabilities for NPSS, and to provide an enhancement for the NPSS code/architecture.
NASA Astrophysics Data System (ADS)
Hickmott, Curtis W.
Cellular core tooling is a new technology with the capability to manufacture complex integrated monolithic composite structures. This novel tooling method utilizes thermoplastic cellular cores as inner tooling. The semi-rigid nature of the cellular cores makes them convenient for lay-up, and under autoclave temperature and pressure they soften and expand, providing uniform compaction on all surfaces, including internal features such as ribs and spar tubes. This process has the capability of developing fully optimized aerospace structures by reducing or eliminating assembly using fasteners or bonded joints. The technology is studied in the context of evaluating its capabilities, advantages, and limitations in developing high-quality structures. The complex nature of these parts has led to the development of a model using the Finite Element Analysis (FEA) software Abaqus and the plug-in COMPRO Common Component Architecture (CCA) provided by Convergent Manufacturing Technologies. This model utilizes a "virtual autoclave" technique to simulate temperature profiles, resin flow paths, and ultimately deformation from residual stress. A model has been developed simulating the temperature profile during curing of composite parts made with the cellular core technology. While modeling of composites has been performed in the past, this project applies that existing knowledge to a new manufacturing method capable of building more complex parts, and develops a model designed specifically for building large, complex components with a high degree of accuracy. The model development has been carried out in conjunction with experimental validation. A double box beam structure was chosen for analysis to determine the effects of the technology on internal ribs and joints. Double box beams were manufactured and sectioned into T-joints for characterization.
The mechanical behavior of the T-joints was characterized using the T-joint pull-off test and compared with traditional tooling methods. Components made with the cellular core tooling method showed improved strength at the joints. It is expected that this knowledge will help optimize the processing of complex, integrated structures and benefit aerospace applications where lighter, structurally efficient components would be advantageous.
Testing the Data Assimilation Capability of the Profiler Virtual Module
2016-02-01
ARL-TR-7601 ● FEB 2016 ● US Army Research Laboratory ● Testing the Data Assimilation Capability of the Profiler Virtual Module
Establishing NWP capabilities in African Small Island States (SIDs)
NASA Astrophysics Data System (ADS)
Rögnvaldsson, Ólafur
2017-04-01
Íslenskar orkurannsóknir (ÍSOR), in collaboration with Belgingur Ltd. and the United Nations Economic Commission for Africa (UNECA) signed a Letter of Agreement in 2015 regarding collaboration in the "Establishing Operational Capacity for Building, Deploying and Using Numerical Weather and Seasonal Prediction Systems in Small Island States in Africa (SIDs)" project. The specific objectives of the collaboration were the following: - Build capacity of National Meteorological and Hydrology Services (NMHS) staff on the use of the WRF atmospheric model for weather and seasonal forecasting, interpretation of model results, and the use of observations to verify and improve model simulations. - Establish a platform for integrating short to medium range weather forecasts, as well as seasonal forecasts, into already existing infrastructure at NMHS and Regional Climate Centres. - Improve understanding of existing model results and forecast verification, for improving decision-making on the time scale of days to weeks. To meet these challenges the operational Weather On Demand (WOD) forecasting system, developed by Belgingur, is being installed in a number of SIDs countries (Cabo Verde, Guinea-Bissau, and Seychelles), as well as being deployed for the Pan-Africa region, with forecasts being disseminated to collaborating NMHSs.
Intercomparison of land-surface parameterizations launched
NASA Astrophysics Data System (ADS)
Henderson-Sellers, A.; Dickinson, R. E.
One of the crucial tasks for climatic and hydrological scientists over the next several years will be validating the land surface process parameterizations used in climate models. There is not necessarily a unique set of parameters to be used; different scientists will want to capture processes through various methods [for example, Avissar and Verstraete, 1990]. Validation of some aspects of the performance of the available (and proposed) schemes is clearly required. It would also be valuable to compare the behavior of the existing schemes [for example, Dickinson et al., 1991; Henderson-Sellers, 1992a]. The WMO-CAS Working Group on Numerical Experimentation (WGNE) and the Science Panel of the GEWEX Continental-Scale International Project (GCIP) [for example, Chahine, 1992] have agreed to launch the joint WGNE/GCIP Project for Intercomparison of Land-Surface Parameterization Schemes (PILPS). The principal goal of this project is to achieve greater understanding of the capabilities and potential applications of existing and new land-surface schemes in atmospheric models. It is not anticipated that a single "best" scheme will emerge. Rather, the aim is to explore alternative models in ways compatible with their authors' or exploiters' goals and to increase understanding of the characteristics of these models in the scientific community.
A cooperative strategy for parameter estimation in large scale systems biology models.
Villaverde, Alejandro F; Egea, Jose A; Banga, Julio R
2012-06-22
Mathematical models play a key role in systems biology: they summarize the currently available knowledge in a way that allows experimentally verifiable predictions to be made. Model calibration consists of finding the parameters that give the best fit to a set of experimental data, which entails minimizing a cost function that measures the goodness of this fit. Most mathematical models in systems biology present three characteristics which make this problem very difficult to solve: they are highly non-linear, they have a large number of parameters to be estimated, and the information content of the available experimental data is frequently scarce. Hence, there is a need for global optimization methods capable of solving this problem efficiently. A new approach for parameter estimation of large scale models, called Cooperative Enhanced Scatter Search (CeSS), is presented. Its key feature is the cooperation between different programs ("threads") that run in parallel on different processors. Each thread implements a state-of-the-art metaheuristic, the enhanced Scatter Search algorithm (eSS). Cooperation, meaning information sharing between threads, modifies the systemic properties of the algorithm and speeds up performance. Two parameter estimation problems involving models related to the central carbon metabolism of E. coli, which include different regulatory levels (metabolic and transcriptional), are used as case studies. The performance and capabilities of the method are also evaluated using benchmark problems of large-scale global optimization, with excellent results. The cooperative CeSS strategy is a general-purpose technique that can be applied to any model calibration problem. Its capability has been demonstrated by calibrating two large-scale models of different characteristics, improving the performance of previously existing methods in both cases.
The cooperative metaheuristic presented here can be easily extended to incorporate other global and local search solvers and specific structural information for particular classes of problems.
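The cooperation mechanism in CeSS, parallel solvers that periodically publish and adopt the best-known solution, can be miniaturized with stdlib threads running independent random local searches around a shared incumbent. This is a toy stand-in for eSS on a toy cost function, not the actual algorithm:

```python
import random, threading

best = {"x": None, "f": float("inf")}
lock = threading.Lock()

def cost(x):
    # toy one-parameter "calibration" cost: minimum f = 0 at x = 3
    return (x - 3.0) ** 2

def search_thread(seed, iters=20000, share_every=500):
    """Random hill climber that periodically synchronizes with the
    shared incumbent -- the 'cooperation' step of the sketch."""
    rng = random.Random(seed)
    x = rng.uniform(-100.0, 100.0)
    for i in range(iters):
        cand = x + rng.gauss(0.0, 1.0)
        if cost(cand) < cost(x):
            x = cand
        if i % share_every == 0:
            with lock:
                if cost(x) < best["f"]:            # publish an improvement
                    best["x"], best["f"] = x, cost(x)
                elif cost(best["x"]) < cost(x):    # adopt the incumbent
                    x = best["x"]

threads = [threading.Thread(target=search_thread, args=(s,)) for s in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(best["x"], best["f"])
```

In CeSS proper, each thread runs a full eSS instance on the model-calibration cost function, and the shared information includes reference-set members rather than a single point.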
Electric Propulsion Interactions Code (EPIC): Recent Enhancements and Goals for Future Capabilities
NASA Technical Reports Server (NTRS)
Gardner, Barbara M.; Kuharski, Robert A.; Davis, Victoria A.; Ferguson, Dale C.
2007-01-01
The Electric Propulsion Interactions Code (EPIC) is the leading interactive computer tool for assessing the effects of electric thruster plumes on spacecraft subsystems. EPIC, developed by SAIC under the sponsorship of the Space Environments and Effects (SEE) Program at the NASA Marshall Space Flight Center, has three primary modules. One is PlumeTool, which calculates plumes of electrostatic thrusters and Hall-effect thrusters by modeling the primary ion beam as well as elastic scattering and charge-exchange of beam ions with thruster-generated neutrals. ObjectToolkit is a 3-D object definition and spacecraft surface modeling tool developed for use with several SEE Program codes. The main EPIC interface integrates the thruster plume into the 3-D geometry of the spacecraft and calculates interactions and effects of the plume with the spacecraft. Effects modeled include erosion of surfaces due to sputtering, re-deposition of sputtered materials, surface heating, torque on the spacecraft, and changes in surface properties due to erosion and deposition. In support of Prometheus I (JIMO), a number of new capabilities and enhancements were made to existing EPIC models. Enhancements to EPIC include adding the ability to scale and view individual plume components, to import a neutral plume associated with a thruster (to model a grid erosion plume, for example), and to calculate the plume from new initial beam conditions. Unfortunately, changes in program direction have left a number of desired enhancements undone. Variable gridding over a surface and re-sputtering of deposited materials, including multiple bounces and sticking coefficients, would significantly enhance the erosion/deposition model. Other modifications, such as improving the heating model and the PlumeTool neutral plume model, enabling time-dependent surface interactions, and including EMI and optical effects, would enable EPIC to better serve the aerospace engineer and electric propulsion systems integrator. We review EPIC's overall capabilities and recent modifications, and discuss directions for future enhancements.
Space Tug systems study. Volume 2: Compendium
NASA Technical Reports Server (NTRS)
1974-01-01
Possible storable propellant configurations and program plans are evaluated for the space tug. Alternatives examined include: use of existing expendable stages modified for use with the shuttle, followed by a space tug at a later date; use of a modified growth version of existing expendable stages for greater performance and potential reuse, followed by a space tug at a later date; use of a low-development-cost, reusable, interim space tug available at shuttle initial operational capability (IOC) that could be evolved to greater system capabilities at a later date; and use of a directly developed tug with maximum potential, available at some specified time after shuttle IOC. The capability options were narrowed down to three final options for detailed program definition.
Banger, Kamaljit; Yuan, Mingwei; Wang, Junming; Nafziger, Emerson D.; Pittelkow, Cameron M.
2017-01-01
Meeting crop nitrogen (N) demand while minimizing N losses to the environment has proven difficult despite significant field research and modeling efforts. To improve N management, several real-time N management tools have been developed with a primary focus on enhancing crop production. However, no coordinated effort exists to simultaneously address sustainability concerns related to N losses at field- and regional-scales. In this perspective, we highlight the opportunity for incorporating environmental effects into N management decision support tools for United States maize production systems by integrating publicly available crop models with grower-entered management information and gridded soil and climate data in a geospatial framework specifically designed to quantify environmental and crop production tradeoffs. To facilitate advances in this area, we assess the capability of existing crop models to provide in-season N recommendations while estimating N leaching and nitrous oxide emissions, discuss several considerations for initial framework development, and highlight important challenges related to improving the accuracy of crop model predictions. Such a framework would benefit the development of regional sustainable intensification strategies by enabling the identification of N loss hotspots which could be used to implement spatially explicit mitigation efforts in relation to current environmental quality goals and real-time weather conditions. Nevertheless, we argue that this long-term vision can only be realized by leveraging a variety of existing research efforts to overcome challenges related to improving model structure, accessing field data to enhance model performance, and addressing the numerous social difficulties in delivery and adoption of such tool by stakeholders. PMID:28804490
NASA Astrophysics Data System (ADS)
Cole, Christopher J. P.
Nuclear power has several unique advantages over other air independent energy sources for nuclear combat submarines. An inherently safe, small nuclear reactor, capable of supplying the hotel load of the Victoria Class submarines, has been conceptually developed. The reactor is designed to complement the existing diesel electric power generation plant presently onboard the submarine. The reactor, rated at greater than 1 MW thermal, will supply electricity to the submarine's batteries through an organic Rankine cycle energy conversion plant at 200 kW. This load will increase the operational envelope of the submarine by providing up to 28 continuous days submerged, allowing for an enhanced indiscretion ratio (ratio of time spent on the surface versus time submerged) and a limited under ice capability. The power plant can be fitted into the existing submarine by inserting a 6 m hull plug. With its simple design and inherent safety features, the reactor plant will require a minimal addition to the crew. The reactor employs TRISO fuel particles for increased safety. The light water coolant remains at atmospheric pressure, exiting the core at 96°C. Burn-up control and limiting of excess reactivity are achieved through movable reflector plates. Shutdown and regulatory control are achieved through the thirteen hafnium control rods. Inherent safety is achieved through the negative prompt and delayed temperature coefficients, as well as the negative void coefficient. During a transient, the boiling of the moderator results in a sudden drop in reactivity, essentially shutting down the reactor. It is this characteristic after which the reactor has been named. The design of the reactor was achieved through modelling using computer codes such as MCNP5, WIMS-AECL, FEMLAB, and MicroShield5, in addition to specially written software for kinetics, heat transfer and fission product poisoning calculations.
The work has covered a broad area of research and has highlighted additional areas that should be investigated. These include developing a detailed point kinetics model coupled with a finite-element heat transfer model, undertaking radiation protection shielding calculations in accordance with international and national regulations, and exploring the effects of advanced fuels.
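The point kinetics model mentioned above can be sketched in a few lines. The one-delayed-group formulation below uses illustrative parameter values (beta, Lam, lam, and the feedback coefficient alpha are placeholders, not design values of this reactor), with the negative temperature coefficient driving the self-shutdown behavior described in the abstract:

```python
def point_kinetics(rho0, t_end=5.0, dt=1e-4, beta=0.0065,
                   Lam=1e-4, lam=0.08, alpha=-1e-4):
    """One-delayed-group point kinetics with a crude negative temperature
    feedback (alpha < 0). Explicit Euler integration; all parameters are
    illustrative placeholders."""
    n = 1.0                      # normalized power
    C = beta * n / (Lam * lam)   # delayed-neutron precursors at equilibrium
    T = 0.0                      # temperature rise above nominal
    t = 0.0
    while t < t_end:
        rho = rho0 + alpha * T   # net reactivity including feedback
        dn = ((rho - beta) / Lam) * n + lam * C
        dC = (beta / Lam) * n - lam * C
        n += dt * dn
        C += dt * dC
        T += dt * n              # heat-up proportional to power (no cooling)
        t += dt
    return n
```

With a positive reactivity insertion, the feedback term pulls the power below the no-feedback case, illustrating the inherent-safety argument in miniature.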
NASA Technical Reports Server (NTRS)
Mikulas, Martin M., Jr.
1991-01-01
In many lunar construction scenarios, mechanical cranes in some form will be indispensable in moving large masses around with various degrees of fine positioning. While thorough experience exists in the use of terrestrial cranes, new thinking is required about the design of cranes to be used in extraterrestrial construction. The primary driving force for this new thinking is the need to automate the crane system so that space cranes can be operated as telerobotic machines with a large number of automatic capabilities. This is true because in extraterrestrial construction human resources will need to be critically rationed. The design problems of mechanisms and control systems for a lunar crane must deal with at least two areas of performance. First, the automated crane must be capable of maneuvering a large mass so that when the mass arrives at the target position there are only small vibrations. Second, any residual vibrations must be automatically damped out and fine positioning must be achieved. For extraterrestrial use there are additional challenges to a crane design - for example, to design a crane system so that it can be transformed for other construction uses. This initial project in crane design does not address such additional issues, although they may be the subject of future CSC research. To date, the Center has designed and analyzed many mechanisms. The fundamental problem of trade-offs between passively stabilizing the load and actively controlling the load by actuators was extensively studied. The capability of 3D dynamics modeling now exists for such studies. A scaled model of a lunar crane was set up and it has been most fruitful in providing basic understanding of lunar cranes. Due to an interesting scaling match-up, this scaled model exhibits the load vibration frequencies one would expect in the real lunar case.
Using the analytical results achieved to date, a laboratory crane system is now being developed as a test bed for verifying a wide variety of mechanisms and control designs. Future development will be aimed at making the crane system a telerobotic test bed into which external sensors such as computer vision systems, and other small robotic devices such as CSC lunar rovers, will be integrated.
NASA Astrophysics Data System (ADS)
McLarty, Dustin Fogle
Distributed energy systems are a promising means by which to reduce both emissions and costs. Continuous generators must be responsive and highly efficient to support building dynamics and intermittent on-site renewable power. Fuel cell -- gas turbine hybrids (FC/GT) are fuel-flexible generators capable of ultra-high efficiency, ultra-low emissions, and rapid power response. This work undertakes a detailed study of the electrochemistry, chemistry and mechanical dynamics governing the complex interaction between the individual systems in such a highly coupled hybrid arrangement. The mechanisms leading to the compressor stall/surge phenomena are studied for the increased risk posed to particular hybrid configurations. A novel fuel cell modeling method is introduced that captures various spatial resolutions, flow geometries, stack configurations and novel heat transfer pathways. Several promising hybrid configurations are analyzed throughout the work and a sensitivity analysis of seven design parameters is conducted. A simple estimating method is introduced for the combined system efficiency of a fuel cell and a turbine using component performance specifications. Existing solid oxide fuel cell technology is capable of hybrid efficiencies greater than 75% (LHV) operating on natural gas, and existing molten carbonate systems greater than 70% (LHV). A dynamic model is calibrated to accurately capture the physical coupling of a FC/GT demonstrator tested at UC Irvine. The 2900 hour experiment highlighted the sensitivity to small perturbations and a need for additional control development. Further sensitivity studies outlined the responsiveness and limits of different control approaches. The capability for substantial turn-down and load following through speed control and flow bypass with minimal impact on internal fuel cell thermal distribution is particularly promising to meet local demands or provide dispatchable support for renewable power.
Advanced control and dispatch heuristics are discussed using a case study of the UCI central plant. Thermal energy storage introduces a time horizon into the dispatch optimization which requires novel solution strategies. Highly efficient and responsive generators are required to meet the increasingly dynamic loads of today's efficient buildings and intermittent local renewable wind and solar power. Fuel cell gas turbine hybrids will play an integral role in the complex and ever-changing solution to local electricity production.
NASA Technical Reports Server (NTRS)
Liever, Peter A.; West, Jeffrey S.; Harris, Robert E.
2016-01-01
A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate Discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured mesh Discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.
Image-based modelling of skeletal muscle oxygenation
Clough, G. F.
2017-01-01
The supply of oxygen in sufficient quantity is vital for the correct functioning of all organs in the human body, in particular for skeletal muscle during exercise. Disease is often associated with an inhibition of the microvascular supply capability and is thought to relate to changes in the structure of blood vessel networks. Different methods exist to investigate the influence of the microvascular structure on tissue oxygenation, spanning a range of application areas: biological in vivo and in vitro experiments, imaging, and mathematical modelling. Ideally, all of these methods should be combined within the same framework in order to fully understand the processes involved. This review discusses the mathematical models of skeletal muscle oxygenation currently available that are based upon images taken of the muscle microvasculature in vivo and ex vivo. Imaging systems suitable for capturing the blood vessel networks are discussed and respective contrast methods are presented. The review further discusses the association between anatomical characteristics in health and disease. With this review we give the reader a tool for understanding and establishing the workflow of developing an image-based model of skeletal muscle oxygenation. Finally, we give an outlook on the improvements needed in measurement and imaging techniques to adequately investigate the microvascular capability for oxygen exchange. PMID:28202595
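A minimal example of the kind of model the review surveys is the classical Krogh cylinder, which gives the steady-state tissue oxygen tension around a single capillary in closed form. The parameter values below are illustrative only, not physiologically calibrated:

```python
import math

def krogh_po2(r, p_cap=40.0, rc=3e-4, R=3e-3, M=1e-4, K=3e-5):
    """Krogh-Erlang tissue PO2 (mmHg) at radius r around a capillary of
    radius rc supplying a tissue cylinder of radius R, with uniform O2
    consumption M and Krogh diffusion coefficient K (illustrative units).
    PO2 decreases monotonically from the capillary wall outward, with
    zero flux at the outer radius R."""
    return (p_cap
            + (M / (4.0 * K)) * (r ** 2 - rc ** 2)
            - (M * R ** 2 / (2.0 * K)) * math.log(r / rc))
```

Image-based models generalize exactly this calculation from one idealized cylinder to the measured capillary network geometry.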
NASA Astrophysics Data System (ADS)
Chakraborty, Amitav; Roy, Sumit; Banerjee, Rahul
2018-03-01
This experimental work highlights the inherent capability of an adaptive neuro-fuzzy inference system (ANFIS) based model to act as a robust system identification tool (SIT) in prognosticating the performance and emission parameters of an existing diesel engine running in diesel-LPG dual-fuel mode. The developed model proved its adeptness by successfully harnessing the effects of the input parameters of load, injection duration and LPG energy share on the output parameters of BSFCEQ, BTE, NOX, SOOT, CO and HC. Successive evaluation of the ANFIS model revealed high levels of resemblance with the previously forecasted ANN results for the same input parameters, and it was evident that, like the ANN, the ANFIS also has the innate ability to act as a robust SIT. The ANFIS-predicted data harmonized with the experimental data with high overall accuracy: the correlation coefficient (R) values ranged between 0.99207 and 0.999988, the mean absolute percentage error (MAPE) values were in the range of 0.02-0.173%, and the root mean square errors (RMSE) were within acceptable margins. Hence, the developed model is capable of emulating the actual engine parameters with commendable accuracy, which in turn would make it a robust prediction platform in future optimization studies.
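The goodness-of-fit statistics quoted above (R, MAPE, RMSE) can be reproduced from paired measured/predicted values with a generic helper such as the following (our sketch, not code from the study):

```python
import math

def fit_metrics(y_true, y_pred):
    """Correlation coefficient R, MAPE (%), and RMSE between measured and
    model-predicted values. Assumes nonzero, nonconstant y_true."""
    n = len(y_true)
    my = sum(y_true) / n
    mp = sum(y_pred) / n
    cov = sum((a - my) * (b - mp) for a, b in zip(y_true, y_pred))
    sy = math.sqrt(sum((a - my) ** 2 for a in y_true))
    sp = math.sqrt(sum((b - mp) ** 2 for b in y_pred))
    r = cov / (sy * sp)
    mape = 100.0 / n * sum(abs((a - b) / a) for a, b in zip(y_true, y_pred))
    rmse = math.sqrt(sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / n)
    return r, mape, rmse
```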
Using WNTR to Model Water Distribution System Resilience ...
The Water Network Tool for Resilience (WNTR) is a new open source Python package developed by the U.S. Environmental Protection Agency and Sandia National Laboratories to model and evaluate resilience of water distribution systems. WNTR can be used to simulate a wide range of disruptive events, including earthquakes, contamination incidents, floods, climate change, and fires. The software includes the EPANET solver as well as a WNTR solver with the ability to model pressure-driven demand hydraulics, pipe breaks, component degradation and failure, changes to supply and demand, and cascading failure. Damage to individual components in the network (e.g. pipes, tanks) can be selected probabilistically using fragility curves. WNTR can also simulate different types of resilience-enhancing actions, including scheduled pipe repair or replacement, water conservation efforts, addition of back-up power, and use of contamination warning systems. The software can be used to estimate potential damage in a network, evaluate preparedness, prioritize repair strategies, and identify worst-case scenarios. As a Python package, WNTR takes advantage of many existing Python capabilities, including parallel processing of scenarios and graphics capabilities. This presentation will outline the modeling components in WNTR, demonstrate their use, give the audience information on how to get started using the code, and invite others to participate in this open source project.
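The probabilistic damage selection mentioned above can be illustrated with a self-contained sketch of a lognormal fragility curve; the median and dispersion values here are illustrative, not WNTR defaults:

```python
import math
import random

def lognormal_fragility(pga, median=0.5, beta=0.4):
    """Probability of pipe damage given peak ground acceleration (g),
    from a lognormal fragility curve with illustrative parameters."""
    return 0.5 * (1.0 + math.erf(math.log(pga / median)
                                 / (beta * math.sqrt(2.0))))

def sample_damaged(pipe_pgas, seed=1):
    """Probabilistically select damaged pipes, in the spirit of WNTR's
    fragility-curve sampling. pipe_pgas maps pipe id -> PGA (g)."""
    rng = random.Random(seed)
    return [pid for pid, pga in pipe_pgas.items()
            if rng.random() < lognormal_fragility(pga)]
```

Pipes shaken near or above the median acceleration are likely to be sampled as damaged, while lightly shaken pipes almost never are.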
Cerretelli, Stefania; Poggio, Laura; Gimona, Alessandro; Yakob, Getahun; Boke, Shiferaw; Habte, Mulugeta; Coull, Malcolm; Peressotti, Alessandro; Black, Helaina
2018-07-01
Land degradation is a serious issue especially in dry and developing countries leading to ecosystem services (ESS) degradation due to soil functions' depletion. Reliably mapping land degradation spatial distribution is therefore important for policy decisions. The main objectives of this paper were to infer land degradation through ESS assessment and compare the modelling results obtained using different sets of data. We modelled important physical processes (sediment erosion and nutrient export) and the equivalent ecosystem services (sediment and nutrient retention) to infer land degradation in an area in the Ethiopian Great Rift Valley. To model soil erosion/retention capability, and nitrogen export/retention capability, two datasets were used: a 'global' dataset derived from existing global-coverage data and a hybrid dataset where global data were integrated with data from local surveys. The results showed that ESS assessments can be used to infer land degradation and identify priority areas for interventions. The comparison between the modelling results of the two different input datasets showed that caution is necessary if only global-coverage data are used at a local scale. In remote and data-poor areas, an approach that integrates global data with targeted local sampling campaigns might be a good compromise to use ecosystem services in decision-making. Copyright © 2018. Published by Elsevier B.V.
User’s Guide for Biodegradation Reactions in TMVOCBio
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jung, Yoojin; Battistelli, Alfredo
TMVOCBio is an extended version of the TMVOC numerical reservoir simulator, with the capability of simulating multiple biodegradation reactions mediated by different microbial populations or based on different redox reactions, thus involving different electron acceptors. This modeling feature is implemented within the existing TMVOC module in iTOUGH2. TMVOCBio, originally developed by Battistelli (2003; 2004), uses a general modified form of the Monod kinetic rate equation to simulate biodegradation reactions, which effectively simulates the uptake of a substrate while accounting for various limiting factors (i.e., the limitation by substrate, electron acceptor, or nutrients). Two approaches are included: 1) a multiple Monod kinetic rate equation, which assumes all the limiting factors simultaneously affect the substrate uptake rate, and 2) a minimum Monod model, which assumes that the substrate uptake rate is controlled by the most limiting factor among those acting for the specific substrate. As the limiting factors, biomass growth inhibition, toxicity effects, as well as competitive and non-competitive inhibition effects are included. The temperature and moisture dependence of biodegradation reactions is also considered. This report provides mathematical formulations and assumptions used for modeling the biodegradation reactions, and describes additional modeling capabilities. Detailed description of input format for biodegradation reactions is presented along with sample problems.
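The two rate formulations described above differ only in how the limiting factors combine; a minimal sketch with generic Monod terms (the half-saturation constants are placeholders, not TMVOCBio input parameters):

```python
def monod(c, k):
    """Single Monod limitation term for concentration c, half-saturation k."""
    return c / (k + c)

def uptake_multiple(q_max, S, A, N, kS=1.0, kA=0.5, kN=0.1):
    """Multiple-Monod rate: all limiting factors (substrate S, electron
    acceptor A, nutrient N) act on the uptake rate simultaneously."""
    return q_max * monod(S, kS) * monod(A, kA) * monod(N, kN)

def uptake_minimum(q_max, S, A, N, kS=1.0, kA=0.5, kN=0.1):
    """Minimum-Monod rate: only the most limiting factor controls uptake."""
    return q_max * min(monod(S, kS), monod(A, kA), monod(N, kN))
```

Because each Monod factor is at most one, the product form always predicts a rate no larger than the minimum form.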
Single tree biomass modelling using airborne laser scanning
NASA Astrophysics Data System (ADS)
Kankare, Ville; Räty, Minna; Yu, Xiaowei; Holopainen, Markus; Vastaranta, Mikko; Kantola, Tuula; Hyyppä, Juha; Hyyppä, Hannu; Alho, Petteri; Viitala, Risto
2013-11-01
Accurate forest biomass mapping methods would provide the means for e.g. detecting bioenergy potential, biofuel and forest-bound carbon. The demand for practical biomass mapping methods at all forest levels is growing worldwide, and viable options are being developed. Airborne laser scanning (ALS) is a promising forest biomass mapping technique, due to its capability of measuring the three-dimensional forest vegetation structure. The objective of the study was to develop new methods for tree-level biomass estimation using metrics derived from ALS point clouds and to compare the results with field references collected using destructive sampling and with existing biomass models. The study area was located in Evo, southern Finland. ALS data was collected in 2009 with a pulse density of approximately 10 pulses/m2. Linear models were developed for the following tree biomass components: total, stem wood, living branch and total canopy biomass. ALS-derived geometric and statistical point metrics were used as explanatory variables when creating the models. The total and stem biomass root mean square error percentages were 26.3% and 28.4% for Scots pine (Pinus sylvestris L.), and 36.8% and 27.6% for Norway spruce (Picea abies (L.) H. Karst.), respectively. The results showed that higher estimation accuracy for all biomass components can be achieved with the models created in this study than with existing allometric biomass models when ALS-derived height and diameter were used as input parameters. Best results were achieved when adding field-measured diameter and height as inputs in the existing biomass models. The only exceptions to this were the canopy and living branch biomass estimations for spruce. The achieved results are encouraging for the use of ALS-derived metrics in biomass mapping and for further development of the models.
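The reported relative errors can be computed with a simple helper. The one-predictor OLS fit below is a sketch only; the study's models used several ALS point metrics, not a single variable:

```python
import math

def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x, with x a single ALS metric
    (e.g. tree height) and y a biomass component."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def rmse_percent(ys, preds):
    """RMSE expressed as a percentage of the mean observed biomass,
    the accuracy measure quoted in the abstract."""
    n = len(ys)
    rmse = math.sqrt(sum((y - p) ** 2 for y, p in zip(ys, preds)) / n)
    return 100.0 * rmse / (sum(ys) / n)
```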
Modeling functional neuroanatomy for an anatomy information system.
Niggemann, Jörg M; Gebert, Andreas; Schulz, Stefan
2008-01-01
Existing neuroanatomical ontologies, databases and information systems, such as the Foundational Model of Anatomy (FMA), represent outgoing connections from brain structures, but cannot represent the "internal wiring" of structures and as such, cannot distinguish between different independent connections from the same structure. Thus, a fundamental aspect of Neuroanatomy, the functional pathways and functional systems of the brain such as the pupillary light reflex system, is not adequately represented. This article identifies underlying anatomical objects which are the source of independent connections (collections of neurons) and uses these as basic building blocks to construct a model of functional neuroanatomy and its functional pathways. The basic representational elements of the model are unnamed groups of neurons or groups of neuron segments. These groups, their relations to each other, and the relations to the objects of macroscopic anatomy are defined. The resulting model can be incorporated into the FMA. The capabilities of the presented model are compared to the FMA and the Brain Architecture Management System (BAMS). Internal wiring as well as functional pathways can correctly be represented and tracked. This model bridges the gap between representations of single neurons and their parts on the one hand and representations of spatial brain structures and areas on the other hand. It is capable of drawing correct inferences on pathways in a nervous system. The object and relation definitions are related to the Open Biomedical Ontology effort and its relation ontology, so that this model can be further developed into an ontology of neuronal functional systems.
NASA Astrophysics Data System (ADS)
Manger, Daniel; Metzler, Jürgen
2014-03-01
Military Operations in Urban Terrain (MOUT) require the capability to perceive and to analyze the situation around a patrol in order to recognize potential threats. A permanent monitoring of the surrounding area is essential in order to appropriately react to the given situation, where one relevant task is the detection of objects that can pose a threat. Especially the robust detection of persons is important, as in MOUT scenarios threats usually arise from persons. This task can be supported by image processing systems. However, depending on the scenario, person detection in MOUT can be challenging, e.g. persons are often occluded in complex outdoor scenes and the person detection also suffers from low image resolution. Furthermore, there are several requirements on person detection systems for MOUT such as the detection of non-moving persons, as they can be a part of an ambush. Existing detectors therefore have to operate on single images with low thresholds for detection in order to not miss any person. This, in turn, leads to a comparatively high number of false positive detections which renders an automatic vision-based threat detection system ineffective. In this paper, a hybrid detection approach is presented. A combination of a discriminative and a generative model is examined. The objective is to increase the accuracy of existing detectors by integrating a separate hypotheses confirmation and rejection step which is built by a discriminative and generative model. This enables the overall detection system to make use of both the discriminative power and the capability to detect partly hidden objects with the models. The approach is evaluated on benchmark data sets generated from real-world image sequences captured during MOUT exercises. The extension shows a significant improvement of the false positive detection rate.
Circular revisit orbits design for responsive mission over a single target
NASA Astrophysics Data System (ADS)
Li, Taibo; Xiang, Junhua; Wang, Zhaokui; Zhang, Yulin
2016-10-01
The responsive orbits play a key role in addressing the mission of Operationally Responsive Space (ORS) because of their capabilities. These capabilities are usually focused on supporting specific targets as opposed to providing global coverage. One subtype of responsive orbits is the repeat coverage orbit, which is nearly circular in most remote sensing applications. This paper deals with a special kind of repeating ground track orbit, referred to as the circular revisit orbit. Different from traditional repeat coverage orbits, a satellite on a circular revisit orbit can visit a target site at both the ascending and descending stages in one revisit cycle. This type of trajectory allows a halving of the traditional revisit time and facilitates acquiring useful information for responsive applications. However, previously reported numerical methods are often computationally expensive or fail to obtain such orbits. To overcome this difficulty, an analytical method to determine the existence conditions of the solutions to revisit orbits is presented in this paper. To this end, the mathematical model of the circular revisit orbit is established under the central gravity model and the J2 perturbation. A constraint function of the circular revisit orbit is introduced, and the monotonicity of that function has been studied. The existence conditions and the number of such orbits are naturally worked out. Taking the launch cost into consideration, an optimal design model of the circular revisit orbit is established to find the best orbit, one that visits a target twice a day, in the morning and in the afternoon, for several days. The result shows that it is effective to apply circular revisit orbits in responsive applications such as natural-disaster reconnaissance.
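The repeat condition underlying such orbits fixes the orbital period. A two-body sketch follows; the paper's analytical method additionally accounts for J2 drift of the node and perigee, which shifts the required semi-major axis slightly:

```python
import math

MU = 398600.4418           # Earth gravitational parameter, km^3/s^2
SIDEREAL_DAY = 86164.0905  # s

def repeat_orbit_sma(revs, days):
    """Semi-major axis (km) of a circular orbit whose ground track repeats
    after `revs` revolutions in `days` sidereal days (two-body sketch,
    ignoring J2 nodal regression)."""
    period = days * SIDEREAL_DAY / revs
    return (MU * (period / (2.0 * math.pi)) ** 2) ** (1.0 / 3.0)
```

As an illustrative choice, 29 revolutions in 2 sidereal days gives a low-Earth-orbit semi-major axis near 7100 km; more revolutions per day require a lower orbit.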
This manual is intended as a source document for individuals responsible for improving the performance of an existing, non-complying wastewater treatment facility. Described are: 1) methods to evaluate an existing facility's capability to achieve improved performance, 2) a ...
Soft mechanical metamaterials with unusual swelling behavior and tunable stress-strain curves
Guo, Xiaogang; Wu, Jun
2018-01-01
Soft adaptable materials that change their shapes, volumes, and properties in response to changes under ambient conditions have important applications in tissue engineering, soft robotics, biosensing, and flexible displays. Upon water absorption, most existing soft materials, such as hydrogels, show a positive volume change, corresponding to a positive swelling. By contrast, the negative swelling represents a relatively unusual phenomenon that does not exist in most natural materials. The development of material systems capable of large or anisotropic negative swelling remains a challenge. We combine analytic modeling, finite element analyses, and experiments to design a type of soft mechanical metamaterials that can achieve large effective negative swelling ratios and tunable stress-strain curves, with desired isotropic/anisotropic features. This material system exploits horseshoe-shaped composite microstructures of hydrogel and passive materials as the building blocks, which extend into a periodic network, following the lattice constructions. The building block structure leverages a sandwiched configuration to convert the hydraulic swelling deformations of hydrogel into bending deformations, thereby resulting in an effective shrinkage (up to around −47% linear strain) of the entire network. By introducing spatially heterogeneous designs, we demonstrated a range of unusual, anisotropic swelling responses, including those with expansion in one direction and, simultaneously, shrinkage along the perpendicular direction. The design approach, as validated by experiments, allows the determination of tailored microstructure geometries to yield desired length/area changes. These design concepts expand the capabilities of existing soft materials and hold promising potential for applications in a diverse range of areas.
Strategies for the coupling of global and local crystal growth models
NASA Astrophysics Data System (ADS)
Derby, Jeffrey J.; Lun, Lisa; Yeckel, Andrew
2007-05-01
The modular coupling of existing numerical codes to model crystal growth processes will provide for maximum effectiveness, capability, and flexibility. However, significant challenges are posed to make these coupled models mathematically self-consistent and algorithmically robust. This paper presents sample results from a coupling of the CrysVUn code, used here to compute furnace-scale heat transfer, and Cats2D, used to calculate melt fluid dynamics and phase-change phenomena, to form a global model for a Bridgman crystal growth system. However, the strategy used to implement the CrysVUn-Cats2D coupling is unreliable and inefficient. The implementation of under-relaxation within a block Gauss-Seidel iteration is shown to be ineffective for improving the coupling performance in a model one-dimensional problem representative of a melt crystal growth model. Ideas to overcome current convergence limitations using approximations to a full Newton iteration method are discussed.
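The block Gauss-Seidel coupling with under-relaxation can be illustrated on a toy pair of scalar "solvers" standing in for the two codes. This contractive example is chosen so the iteration converges, unlike the stiff coupled problem discussed in the paper:

```python
def gauss_seidel_coupling(f, g, x0, y0, omega=0.5, tol=1e-10, max_iter=500):
    """Block Gauss-Seidel iteration between two coupled solvers f and g
    (stand-ins for the furnace-scale and melt-scale codes), with
    under-relaxation factor omega applied to each block update."""
    x, y = x0, y0
    for i in range(max_iter):
        x_new = x + omega * (f(y) - x)       # solve block 1 with y frozen
        y_new = y + omega * (g(x_new) - y)   # solve block 2 with updated x
        if abs(x_new - x) + abs(y_new - y) < tol:
            return x_new, y_new, i
        x, y = x_new, y_new
    return x, y, max_iter

# Toy coupled pair with fixed point x = y = 1
f = lambda y: 0.5 * y + 0.5
g = lambda x: 0.8 * x + 0.2
```

When the coupled problem is not contractive, this iteration stagnates or diverges regardless of omega, which motivates the Newton-like acceleration the authors discuss.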
Examining the Relationships Between Education, Social Networks and Democratic Support With ABM
NASA Technical Reports Server (NTRS)
Drucker, Nick; Campbell, Kenyth
2011-01-01
This paper introduces an agent-based model that explores the relationships between education, social networks, and support for democratic ideals. This study examines two factors that affect democratic support: education and social networks. Current theory concerning these two variables suggests that positive relationships exist between education and democratic support and between social networks and the spread of ideas. The model contains multiple variables of democratic support, two of which are evaluated through experimentation. The model allows individual entities within the system to make "decisions" about their democratic support independently of one another. The agent-based approach also allows entities to utilize their social networks to spread ideas. Current theory supports the experimentation results. In addition, these results show the model is capable of reproducing real-world outcomes. This paper addresses the model creation process and the experimentation procedure, as well as future research avenues and potential shortcomings of the model.
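The flavor of such a model can be sketched in a few lines. This toy version is ours, not the paper's model: agents average their support toward random social-network contacts, and education nudges support upward:

```python
import random

def run_abm(n_agents=200, steps=50, seed=7):
    """Minimal agent-based sketch: each agent holds a democratic-support
    score in [0, 1]; education raises it directly, and each step agents
    move toward the view of a random network contact (idea diffusion)."""
    rng = random.Random(seed)
    education = [rng.random() for _ in range(n_agents)]
    support = [0.3 + 0.4 * e + 0.1 * rng.random() for e in education]
    for _ in range(steps):
        for i in range(n_agents):
            j = rng.randrange(n_agents)                    # random contact
            support[i] += 0.1 * (support[j] - support[i])  # diffusion
            support[i] += 0.01 * education[i]              # education effect
            support[i] = min(1.0, support[i])
    return sum(support) / n_agents
```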
Advanced Atmospheric Modeling for Emergency Response.
NASA Astrophysics Data System (ADS)
Fast, Jerome D.; O'Steen, B. Lance; Addis, Robert P.
1995-03-01
Atmospheric transport and diffusion models are an important part of emergency response systems for industrial facilities that have the potential to release significant quantities of toxic or radioactive material into the atmosphere. An advanced atmospheric transport and diffusion modeling system for emergency response and environmental applications, based upon a three-dimensional mesoscale model, has been developed for the U.S. Department of Energy's Savannah River Site so that complex, time-dependent flow fields not explicitly measured can be routinely simulated. To overcome some of the current computational demands of mesoscale models, two operational procedures for the advanced atmospheric transport and diffusion modeling system are described including 1) a semiprognostic calculation to produce high-resolution wind fields for local pollutant transport in the vicinity of the Savannah River Site and 2) a fully prognostic calculation to produce a regional wind field encompassing the southeastern United States for larger-scale pollutant problems. Local and regional observations and large-scale model output are used by the mesoscale model for the initial conditions, lateral boundary conditions, and four-dimensional data assimilation procedure. This paper describes the current status of the modeling system and presents two case studies demonstrating the capabilities of both modes of operation. While the results from the case studies shown in this paper are preliminary and certainly not definitive, they do suggest that the mesoscale model has the potential for improving the prognostic capabilities of atmospheric modeling for emergency response at the Savannah River Site. Long-term model evaluation will be required to determine under what conditions significant forecast errors exist.
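The four-dimensional data assimilation procedure mentioned above is typically Newtonian relaxation ("nudging"). A one-variable sketch, with an illustrative relaxation coefficient and time step:

```python
def nudge(u_model, u_obs, g=1.0 / 3600.0, dt=60.0, steps=60):
    """Newtonian relaxation of a model wind component u toward an observed
    value: du/dt includes the extra term g * (u_obs - u), with relaxation
    coefficient g (1/s). Integrated here with explicit Euler steps."""
    u = u_model
    for _ in range(steps):
        u += dt * g * (u_obs - u)
    return u
```

In a full mesoscale model the same term is added to each prognostic equation, weighted in space and time by the quality and proximity of the observations.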
Capability and opportunity in hot shooting performance: Evidence from top-scoring NBA leaders
2018-01-01
In basketball games, whenever players successfully shoot in streaks, they are expected to demonstrate heightened performance for a stretch of time. Streak shooting in basketball has been debated for more than three decades, but most studies have provided little significant statistical evidence and have labeled random subjective judgments the “hot hand fallacy.” To obtain a broader perspective of the hot hand phenomenon and its accompanying influences on the court, this study uses field goal records and optical tracking data from the official NBA database for the entire 2015–2016 season to analyze top-scoring leaders’ shooting performances. We first reflect on the meaning of “hot hand” and the “Matthew effect” in actual basketball competition. Second, this study employs statistical models to integrate three different shooting perspectives (field goal percentage, points scored, and attempts). This study’s findings shed new light not only on the existence or nonexistence of streaks, but on the roles of capability and opportunity in NBA hot shooting. Furthermore, we show how hot shooting performances resulting from capability and opportunity lead to actual differences for teams. PMID:29432458
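A minimal diagnostic in the spirit of hot-hand analyses (purely illustrative; not the statistical models the study employs, and the shot log below is hypothetical) compares the conditional make probability after a made shot with the overall field goal percentage:

```python
def hot_hand_stats(shots):
    """Compare P(make | previous make) with overall FG% for a shot sequence.
    shots: list of 1 (make) / 0 (miss)."""
    overall = sum(shots) / len(shots)
    after_make = [s for prev, s in zip(shots, shots[1:]) if prev == 1]
    p_after_make = sum(after_make) / len(after_make) if after_make else float("nan")
    return overall, p_after_make

shots = [1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 0]  # hypothetical game log
overall, cond = hot_hand_stats(shots)
print(round(overall, 3), round(cond, 3))
```

A "hot hand" in this naive sense would show `cond` well above `overall`; the literature's caution is that short sequences produce such gaps by chance alone.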
Capability and opportunity in hot shooting performance: Evidence from top-scoring NBA leaders.
Chang, Shun-Chuan
2018-01-01
In basketball games, whenever players successfully shoot in streaks, they are expected to demonstrate heightened performance for a stretch of time. Streak shooting in basketball has been debated for more than three decades, but most studies have provided little significant statistical evidence and have labeled random subjective judgments the "hot hand fallacy." To obtain a broader perspective of the hot hand phenomenon and its accompanying influences on the court, this study uses field goal records and optical tracking data from the official NBA database for the entire 2015-2016 season to analyze top-scoring leaders' shooting performances. We first reflect on the meaning of "hot hand" and the "Matthew effect" in actual basketball competition. Second, this study employs statistical models to integrate three different shooting perspectives (field goal percentage, points scored, and attempts). This study's findings shed new light not only on the existence or nonexistence of streaks, but on the roles of capability and opportunity in NBA hot shooting. Furthermore, we show how hot shooting performances resulting from capability and opportunity lead to actual differences for teams.
NASA Astrophysics Data System (ADS)
Barud-Zubillaga, Alberto
During the 2006 El Paso-Juarez flood there were many concerns regarding the capability of the existing stormwater system to handle 50- and 100-year flood events in the El Paso, Texas and Juarez, Mexico area. Moreover, 2008, a considerably wet year relative to El Paso's normal 223 mm of annual precipitation, demonstrated that the area could very well receive large amounts of precipitation in localized areas over short periods of time, representing a great flood threat to residents living in areas prone to flooding. Some climate change projections for the area match exactly what has occurred over the last two decades: an increased number of torrential rainstorms over smaller concentrated pieces of land, separated by longer years of drought between rainstorms. This study consisted of three projects focused on three critical regions within the El Paso-Juarez area that were greatly affected by the 2006 flood. The goal was to identify whether natural arroyos or the existing built stormwater system could properly manage the projected precipitation patterns. The three projects described in this dissertation touch on the following points: (a) the importance of a reliable precipitation model that can accurately describe precipitation patterns in the region under extreme drought and wet climate conditions; (b) differences in land use/land cover characteristics as factors promoting or disrupting the possibility of flooding; and (c) limitations and capabilities of existing stormwater systems and natural arroyos as means to control flooding. Conclusions and recommendations are shown below, which apply not only to each particular project, but also to all study areas and similar areas in the El Paso-Juarez region. Urbanization can improve or worsen a pre-existing natural stormwater system if built under its required capacity. 
Such capacity should be calculated considering extreme weather conditions, based on a denser network of precipitation stations to capture the various microclimates found in the region and taking into account climate change predictions. Development of new areas needs to consider not only the watershed of study but also its relation to the other watersheds around it. Basin parameters seemed to be of low impact when compared with precipitation rates. High-resolution DEMs, such as those derived from LiDAR, can dramatically improve the accuracy and reliability of a hydrological model. Hardware capabilities and limitations, however, should be considered. The overall recommendations derived from this dissertation are to direct new studies, policies and regulations at the three levels of government (local, state and federal) to: limit urban development to areas of no or low potential for flooding; implement some type of ecological or green corridors, or conservation easements, to preserve these areas; build semi-natural or hybrid stormwater infrastructure to slow down, collect, and ultimately transport runoff to the Rio Grande or any other waterway; consider extreme wet and dry scenarios for the designation of flood-prone areas and future construction of stormwater infrastructure; and design stormwater infrastructure to retrofit the existing natural and irrigation drains.
Photovoltaic Systems Test Facilities: Existing capabilities compilation
NASA Technical Reports Server (NTRS)
Volkmer, K.
1982-01-01
A general description of photovoltaic systems test facilities (PV-STFs) operated under the U.S. Department of Energy's photovoltaics program is given. Descriptions of a number of privately operated facilities having test capabilities appropriate to photovoltaic hardware development are given. A summary of specific, representative test capabilities at the system and subsystem level is presented for each listed facility. The range of system and subsystem test capabilities available to serve the needs of both the photovoltaics program and the private sector photovoltaics industry is given.
Forecasting disease risk for increased epidemic preparedness in public health
NASA Technical Reports Server (NTRS)
Myers, M. F.; Rogers, D. J.; Cox, J.; Flahault, A.; Hay, S. I.
2000-01-01
Emerging infectious diseases pose a growing threat to human populations. Many of the world's epidemic diseases (particularly those transmitted by intermediate hosts) are known to be highly sensitive to long-term changes in climate and short-term fluctuations in the weather. The application of environmental data to the study of disease offers the capability to demonstrate vector-environment relationships and potentially forecast the risk of disease outbreaks or epidemics. Accurate disease forecasting models would markedly improve epidemic prevention and control capabilities. This chapter examines the potential for epidemic forecasting and discusses the issues associated with the development of global networks for surveillance and prediction. Existing global systems for epidemic preparedness focus on disease surveillance using either expert knowledge or statistical modelling of disease activity and thresholds to identify times and areas of risk. Predictive health information systems would use monitored environmental variables, linked to a disease system, to be observed and provide prior information of outbreaks. The components and varieties of forecasting systems are discussed with selected examples, along with issues relating to further development.
AgBase: supporting functional modeling in agricultural organisms
McCarthy, Fiona M.; Gresham, Cathy R.; Buza, Teresia J.; Chouvarine, Philippe; Pillai, Lakshmi R.; Kumar, Ranjit; Ozkan, Seval; Wang, Hui; Manda, Prashanti; Arick, Tony; Bridges, Susan M.; Burgess, Shane C.
2011-01-01
AgBase (http://www.agbase.msstate.edu/) provides resources to facilitate modeling of functional genomics data and structural and functional annotation of agriculturally important animal, plant, microbe and parasite genomes. The website is redesigned to improve accessibility and ease of use, including improved search capabilities. Expanded capabilities include new dedicated pages for horse, cat, dog, cotton, rice and soybean. We currently provide 590 240 Gene Ontology (GO) annotations to 105 454 gene products in 64 different species, including GO annotations linked to transcripts represented on agricultural microarrays. For many of these arrays, this provides the only functional annotation available. GO annotations are available for download and we provide comprehensive, species-specific GO annotation files for 18 different organisms. The tools available at AgBase have been expanded and several existing tools improved based upon user feedback. One of seven new tools available at AgBase, GOModeler, supports hypothesis testing from functional genomics data. We host several associated databases and provide genome browsers for three agricultural pathogens. Moreover, we provide comprehensive training resources (including worked examples and tutorials) via links to Educational Resources at the AgBase website. PMID:21075795
NASA Astrophysics Data System (ADS)
Ma, Zhi-Sai; Liu, Li; Zhou, Si-Da; Yu, Lei; Naets, Frank; Heylen, Ward; Desmet, Wim
2018-01-01
The problem of parametric output-only identification of time-varying structures in a recursive manner is considered. A kernelized time-dependent autoregressive moving average (TARMA) model is proposed by expanding the time-varying model parameters onto the basis set of kernel functions in a reproducing kernel Hilbert space. An exponentially weighted kernel recursive extended least squares TARMA identification scheme is proposed, and a sliding-window technique is subsequently applied to fix the computational complexity for each consecutive update, allowing the method to operate online in time-varying environments. The proposed sliding-window exponentially weighted kernel recursive extended least squares TARMA method is employed for the identification of a laboratory time-varying structure consisting of a simply supported beam and a moving mass sliding on it. The proposed method is comparatively assessed against an existing recursive pseudo-linear regression TARMA method via Monte Carlo experiments and shown to be capable of accurately tracking the time-varying dynamics. Furthermore, the comparisons demonstrate the superior achievable accuracy, lower computational complexity and enhanced online identification capability of the proposed kernel recursive extended least squares TARMA approach.
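The exponentially weighted recursive least squares idea underlying the proposed scheme can be sketched for a plain (non-kernelized) time-varying AR(1) model. This is a simplified illustration with assumed parameter values, not the paper's kernelized TARMA method or its sliding-window variant:

```python
import math, random

def rls_update(theta, P, x, y, lam=0.98):
    """One exponentially weighted RLS update (forgetting factor lam).
    theta: parameter list, P: covariance matrix, x: regressor, y: observation."""
    n = len(theta)
    Px = [sum(P[i][j] * x[j] for j in range(n)) for i in range(n)]
    denom = lam + sum(x[i] * Px[i] for i in range(n))
    K = [p / denom for p in Px]                      # gain vector
    err = y - sum(x[i] * theta[i] for i in range(n)) # prediction error
    theta = [theta[i] + K[i] * err for i in range(n)]
    P = [[(P[i][j] - K[i] * Px[j]) / lam for j in range(n)] for i in range(n)]
    return theta, P

# track a slowly varying AR(1) coefficient a_t (synthetic signal, not the paper's data)
random.seed(0)
theta, P = [0.0], [[100.0]]
y_prev = 0.1
estimates = []
for t in range(1, 2000):
    a_t = 0.5 + 0.3 * math.sin(2 * math.pi * t / 1000)  # true time-varying parameter
    y = a_t * y_prev + 0.05 * random.gauss(0, 1)
    theta, P = rls_update(theta, P, [y_prev], y)
    estimates.append((a_t, theta[0]))
    y_prev = y

final_true, final_est = estimates[-1]
print(round(final_true, 2), round(final_est, 2))
```

The forgetting factor `lam` plays the role of the exponential weighting in the paper: older samples are discounted, so the estimator tracks the drifting coefficient at the cost of higher variance.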
Forecasting Disease Risk for Increased Epidemic Preparedness in Public Health
Myers, M.F.; Rogers, D.J.; Cox, J.; Flahault, A.; Hay, S.I.
2011-01-01
Emerging infectious diseases pose a growing threat to human populations. Many of the world’s epidemic diseases (particularly those transmitted by intermediate hosts) are known to be highly sensitive to long-term changes in climate and short-term fluctuations in the weather. The application of environmental data to the study of disease offers the capability to demonstrate vector–environment relationships and potentially forecast the risk of disease outbreaks or epidemics. Accurate disease forecasting models would markedly improve epidemic prevention and control capabilities. This chapter examines the potential for epidemic forecasting and discusses the issues associated with the development of global networks for surveillance and prediction. Existing global systems for epidemic preparedness focus on disease surveillance using either expert knowledge or statistical modelling of disease activity and thresholds to identify times and areas of risk. Predictive health information systems would use monitored environmental variables, linked to a disease system, to be observed and provide prior information of outbreaks. The components and varieties of forecasting systems are discussed with selected examples, along with issues relating to further development. PMID:10997211
Zooplankton can actively adjust their motility to turbulent flow
Michalec, François-Gaël; Fouxon, Itzhak
2017-01-01
Calanoid copepods are among the most abundant metazoans in the ocean and constitute a vital trophic link within marine food webs. They possess relatively narrow swimming capabilities, yet are capable of significant self-locomotion under strong hydrodynamic conditions. Here we provide evidence for an active adaptation that allows these small organisms to adjust their motility in response to background flow. We track simultaneously and in three dimensions the motion of flow tracers and planktonic copepods swimming freely at several intensities of quasi-homogeneous, isotropic turbulence. We show that copepods synchronize the frequency of their relocation jumps with the frequency of small-scale turbulence by performing frequent relocation jumps of low amplitude that seem unrelated to localized hydrodynamic signals. We develop a model of plankton motion in turbulence that shows excellent quantitative agreement with our measurements when turbulence is significant. We find that, compared with passive tracers, active motion enhances the diffusion of organisms at low turbulence intensity whereas it dampens diffusion at higher turbulence levels. The existence of frequent jumps in a motion that is otherwise dominated by turbulent transport allows for the possibility of active locomotion and hence to transition from being passively advected to being capable of controlling diffusion. This behavioral response provides zooplankton with the capability to retain the benefits of self-locomotion despite turbulence advection and may help these organisms to actively control their distribution in dynamic environments. Our study reveals an active adaptation that carries strong fitness advantages and provides a realistic model of plankton motion in turbulence. PMID:29229858
Friedberg, Mark W.; Safran, Dana G.; Coltin, Kathryn L.; Dresser, Marguerite
2008-01-01
Background The Patient-Centered Medical Home (PCMH), a popular model for primary care reorganization, includes several structural capabilities intended to enhance quality of care. The extent to which different types of primary care practices have adopted these capabilities has not been previously studied. Objective To measure the prevalence of recommended structural capabilities among primary care practices and to determine whether prevalence varies among practices of different size (number of physicians) and administrative affiliation with networks of practices. Design Cross-sectional analysis. Participants One physician chosen at random from each of 412 primary care practices in Massachusetts was surveyed about practice capabilities during 2007. Practice size and network affiliation were obtained from an existing database. Measurements Presence of 13 structural capabilities representing 4 domains relevant to quality: patient assistance and reminders, culture of quality, enhanced access, and electronic health records (EHRs). Main Results Three hundred eight (75%) physicians responded, representing practices with a median size of 4 physicians (range 2–74). Among these practices, 64% were affiliated with 1 of 9 networks. The prevalence of surveyed capabilities ranged from 24% to 88%. Larger practice size was associated with higher prevalence for 9 of the 13 capabilities spanning all 4 domains (P < 0.05). Network affiliation was associated with higher prevalence of 5 capabilities (P < 0.05) in 3 domains. Associations were not substantively altered by statistical adjustment for other practice characteristics. Conclusions Larger and network-affiliated primary care practices are more likely than smaller, non-affiliated practices to have adopted several recommended capabilities. In order to achieve PCMH designation, smaller non-affiliated practices may require the greatest investments. 
Electronic supplementary material The online version of this article (doi:10.1007/s11606-008-0856-x) contains supplementary material, which is available to authorized users. PMID:19050977
Implementing an error disclosure coaching model: A multicenter case study.
White, Andrew A; Brock, Douglas M; McCotter, Patricia I; Shannon, Sarah E; Gallagher, Thomas H
2017-01-01
National guidelines call for health care organizations to provide around-the-clock coaching for medical error disclosure. However, frontline clinicians may not always seek risk managers for coaching. As part of a demonstration project designed to improve patient safety and reduce malpractice liability, we trained multidisciplinary disclosure coaches at 8 health care organizations in Washington State. The training was highly rated by participants, although not all emerged confident in their coaching skill. This multisite intervention can serve as a model for other organizations looking to enhance existing disclosure capabilities. Success likely requires cultural change and repeated practice opportunities for coaches. © 2017 American Society for Healthcare Risk Management of the American Hospital Association.
Synergia: an accelerator modeling tool with 3-D space charge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amundson, James F.; Spentzouris, P.; /Fermilab
2004-07-01
High precision modeling of space-charge effects, together with accurate treatment of single-particle dynamics, is essential for designing future accelerators as well as optimizing the performance of existing machines. We describe Synergia, a high-fidelity parallel beam dynamics simulation package with fully three dimensional space-charge capabilities and a higher order optics implementation. We describe the computational techniques, the advanced human interface, and the parallel performance obtained using large numbers of macroparticles. We also perform code benchmarks comparing to semi-analytic results and other codes. Finally, we present initial results on particle tune spread, beam halo creation, and emittance growth in the Fermilab booster accelerator.
Acoustic Measurements of Small Solid Rocket Motor
NASA Technical Reports Server (NTRS)
Vargas, Magda B.; Kenny, R. Jeremy
2010-01-01
Rocket acoustic noise can induce loads and vibration on the vehicle as well as the surrounding structures. Models have been developed to predict these acoustic loads based on scaling existing solid rocket motor data. The NASA Marshall Space Flight Center acoustics team has measured several small solid rocket motors (thrust below 150,000 lbf) to anchor prediction models. This data will provide NASA the capability to predict the acoustic environments and consequent vibro-acoustic response of larger rockets (thrust above 1,000,000 lbf) such as those planned for the NASA Constellation program. This paper presents the methods used to measure acoustic data during the static firing of small solid rocket motors and the trends found in the data.
Information support of monitoring of technical condition of buildings in construction risk area
NASA Astrophysics Data System (ADS)
Skachkova, M. E.; Lepihina, O. Y.; Ignatova, V. V.
2018-05-01
The paper presents the results of research devoted to the development of a model of information support for monitoring the technical condition of buildings located in a construction risk area. As a result of a visual and instrumental survey, as well as an analysis of existing approaches and techniques, attributive and cartographic databases have been created. These databases allow monitoring of the defects and damage of buildings located in a 30-meter risk area from the object under construction. A classification of the structures and defects of the buildings under survey is presented. The functional capabilities of the developed model and the field of its practical application are determined.
Searching for long-lived particles: A compact detector for exotics at LHCb
NASA Astrophysics Data System (ADS)
Gligorov, Vladimir V.; Knapen, Simon; Papucci, Michele; Robinson, Dean J.
2018-01-01
We advocate for the construction of a new detector element at the LHCb experiment, designed to search for displaced decays of beyond Standard Model long-lived particles, taking advantage of a large shielded space in the LHCb cavern that is expected to soon become available. We discuss the general features and putative capabilities of such an experiment, as well as its various advantages and complementarities with respect to the existing LHC experiments and proposals such as SHiP and MATHUSLA. For two well-motivated beyond Standard Model benchmark scenarios—Higgs decay to dark photons and B meson decays via a Higgs mixing portal—the reach either complements or exceeds that predicted for other LHC experiments.
Manufacturing and control of the aspherical mirrors for the telescope of the satellite Pleiades
NASA Astrophysics Data System (ADS)
Ducollet, Hélène; du Jeu, Christian; Fermé, Jean-Jacques
2017-11-01
For the Pleiades space program, SESO was awarded the contract (fully completed) for the manufacturing of the whole set of telescope mirrors (4 mirrors, 2 flight models). This work also included the mechanical design, manufacturing and mounting of the attachment flexures between the mirrors and the telescope main structure. This presentation focuses on the different steps of lightweighting, polishing, integration and control of these mirrors, as well as a presentation of the existing SESO facilities and capabilities to produce such kinds of aspherical components/sub-assemblies.
Hyperbolic chaos in the klystron-type microwave vacuum tube oscillator
NASA Astrophysics Data System (ADS)
Emel'yanov, V. V.; Kuznetsov, S. P.; Ryskin, N. M.
2010-12-01
The ring-loop oscillator consisting of two coupled klystrons, which is capable of generating a hyperbolic chaotic signal in the microwave band, is considered. The system of delayed-differential equations describing the dynamics of the oscillator is derived. This system is further reduced to a two-dimensional return map under the assumption of instantaneous build-up of oscillations in the cavities. The results of detailed numerical simulation for both models are presented, showing that there exists a large enough range of control parameters where the sustained regime corresponds to structurally stable hyperbolic chaos.
NASA Technical Reports Server (NTRS)
Przekwas, A. J.; Singhal, A. K.; Tam, L. T.
1984-01-01
The capability of simulating three-dimensional two-phase reactive flows with combustion in liquid-fuelled rocket engines is demonstrated. This was accomplished by modifying an existing three-dimensional computer program (REFLAN3D) with an Eulerian-Lagrangian approach to simulate two-phase spray flow, evaporation and combustion. The modified code is referred to as REFLAN3D-SPRAY. The mathematical formulation of the fluid flow, heat transfer, combustion and two-phase flow interaction, as well as the numerical solution procedure, boundary conditions and their treatment, are described.
Semantic Interaction for Visual Analytics: Toward Coupling Cognition and Computation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander
2014-07-01
The dissertation discussed in this article [1] was written in the midst of an era of digitization. The world is becoming increasingly instrumented with sensors, monitoring, and other methods for generating data describing social, physical, and natural phenomena. Thus, data exist with the potential of being analyzed to uncover, or discover, the phenomena from which they were created. However, as the analytic models leveraged to analyze these data continue to increase in complexity and computational capability, how can visualizations and user interaction methodologies adapt and evolve to continue to foster discovery and sensemaking?
BIM-Based E-Procurement: An Innovative Approach to Construction E-Procurement
2015-01-01
This paper presents an innovative approach to e-procurement in construction, which uses building information models (BIM) to support the construction procurement process. The result is an integrated and electronic instrument connected to a rich knowledge base capable of advanced operations and able to strengthen transaction relationships and collaboration throughout the supply chain. The BIM-based e-procurement prototype has been developed using distinct existing electronic solutions and an IFC server and was tested in a pilot case study, which supported further discussions of the results of the research. PMID:26090518
NASA Technical Reports Server (NTRS)
Murray, John; Vernier, Jean-Paul; Fairlie, T. Duncan; Pavolonis, Michael; Krotkov, Nickolay A.; Lindsay, Francis; Haynes, John
2013-01-01
Although significant progress has been made in recent years, estimating volcanic ash concentration for the full extent of the airspace affected by volcanic ash remains a challenge. No single satellite, airborne or ground observing system currently exists that can sufficiently inform dispersion models to provide the degree of accuracy required to use them with a high degree of confidence for routing aircraft in and near volcanic ash. Toward this end, the detection and characterization of volcanic ash in the atmosphere may be substantially improved by integrating a wider array of observing systems and advancements in trajectory and dispersion modeling to help solve this problem. The qualitative aspect of this effort has advanced significantly in the past decade due to the increase of highly complementary observational and model data currently available. Satellite observations, especially when coupled with trajectory and dispersion models, can provide a very accurate picture of the 3-dimensional location of ash clouds. The accurate estimate of the mass loading at various locations throughout the entire plume, though improving, remains elusive. This paper examines the capabilities of various satellite observation systems and postulates that model-based volcanic ash concentration maps and forecasts might be significantly improved if the various extant satellite capabilities are used together with independent, accurate mass loading data from other observing systems available to calibrate (tune) ash concentration retrievals from the satellite systems.
Augmenting epidemiological models with point-of-care diagnostics data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pullum, Laura L.; Ramanathan, Arvind; Nutaro, James J.
Although adoption of newer Point-of-Care (POC) diagnostics is increasing, there is a significant challenge using POC diagnostics data to improve epidemiological models. In this work, we propose a method to process zip-code level POC datasets and apply these processed data to calibrate an epidemiological model. We specifically develop a calibration algorithm using simulated annealing and calibrate a parsimonious equation-based model of modified Susceptible-Infected-Recovered (SIR) dynamics. The results show that parsimonious models are remarkably effective in predicting the dynamics observed in the number of infected patients and our calibration algorithm is sufficiently capable of predicting peak loads observed in POC diagnostics data while staying within reasonable and empirical parameter ranges reported in the literature. Additionally, we explore the future use of the calibrated values by testing the correlation between peak load and population density from Census data. Our results show that linearity assumptions for the relationships among various factors can be misleading, therefore further data sources and analysis are needed to identify relationships between additional parameters and existing calibrated ones. As a result, calibration approaches such as ours can determine the values of newly added parameters along with existing ones and enable policy-makers to make better multi-scale decisions.
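A minimal sketch of this calibration idea, assuming a toy discrete-time SIR model and synthetic data in place of real POC counts (all parameter values, step sizes and the annealing schedule are illustrative, not those of the study):

```python
import math, random

def sir(beta, gamma, s0=0.99, i0=0.01, steps=60):
    """Discrete-time SIR model; returns the infected fraction over time."""
    s, i, r, out = s0, i0, 0.0, []
    for _ in range(steps):
        new_inf = beta * s * i
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        out.append(i)
    return out

def loss(params, observed):
    """Sum of squared errors between model output and observed infections."""
    return sum((p - o) ** 2 for p, o in zip(sir(*params), observed))

def anneal(observed, steps=5000, t0=1.0):
    """Toy simulated-annealing calibration of (beta, gamma) to observed data."""
    random.seed(0)
    best = cur = (0.2, 0.05)
    best_l = cur_l = loss(cur, observed)
    for k in range(steps):
        temp = t0 * (1 - k / steps) + 1e-6  # linear cooling schedule
        cand = tuple(max(1e-3, p + random.gauss(0, 0.02)) for p in cur)
        cand_l = loss(cand, observed)
        # accept improvements always, worse moves with Boltzmann probability
        if cand_l < cur_l or random.random() < math.exp((cur_l - cand_l) / temp):
            cur, cur_l = cand, cand_l
        if cur_l < best_l:
            best, best_l = cur, cur_l
    return best

observed = sir(0.5, 0.1)       # synthetic "observed" data from known parameters
beta, gamma = anneal(observed)
print(round(beta, 2), round(gamma, 2))
```

The annealer should recover parameters whose trajectory fits the synthetic data far better than the initial guess; with real POC data the loss would compare model output against zip-code level case counts instead.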
Augmenting epidemiological models with point-of-care diagnostics data
Pullum, Laura L.; Ramanathan, Arvind; Nutaro, James J.; ...
2016-04-20
Although adoption of newer Point-of-Care (POC) diagnostics is increasing, there is a significant challenge using POC diagnostics data to improve epidemiological models. In this work, we propose a method to process zip-code level POC datasets and apply these processed data to calibrate an epidemiological model. We specifically develop a calibration algorithm using simulated annealing and calibrate a parsimonious equation-based model of modified Susceptible-Infected-Recovered (SIR) dynamics. The results show that parsimonious models are remarkably effective in predicting the dynamics observed in the number of infected patients and our calibration algorithm is sufficiently capable of predicting peak loads observed in POC diagnostics data while staying within reasonable and empirical parameter ranges reported in the literature. Additionally, we explore the future use of the calibrated values by testing the correlation between peak load and population density from Census data. Our results show that linearity assumptions for the relationships among various factors can be misleading, therefore further data sources and analysis are needed to identify relationships between additional parameters and existing calibrated ones. As a result, calibration approaches such as ours can determine the values of newly added parameters along with existing ones and enable policy-makers to make better multi-scale decisions.
Abrams, Tyler; Ding, Rui; Guo, Houyang Y.; ...
2017-04-03
It is important to develop a predictive capability for the tungsten source rate near the strike points during H-mode operation in ITER and beyond. H-mode deuterium plasma exposures were performed on W-coated graphite and TZM molybdenum substrates in the DIII-D divertor using DiMES. The W-I 400.9 nm spectral line was monitored by fast filtered diagnostics cross calibrated via a high-resolution spectrometer to resolve inter-ELM W erosion. The effective ionization/photon (S/XB) was calibrated using a unique method developed on DIII-D based on surface analysis. Inferred S/XB values agree with an existing empirical scaling at low electron density (n_e) but diverge at higher densities, consistent with recent ADAS atomic physics modeling results. Edge modeling of the inter-ELM phase is conducted via OEDGE utilizing the new capability for charge-state resolved carbon impurity fluxes. ERO modeling is performed with the calculated main ion and impurity plasma background from OEDGE. ERO results demonstrate the importance of a mixed-material surface model in the interpretation of W sourcing measurements. As a result, it is demonstrated that measured inter-ELM W erosion rates can be well explained by C→W sputtering only if a realistic mixed material model is incorporated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abrams, Tyler; Ding, Rui; Guo, Houyang Y.
It is important to develop a predictive capability for the tungsten source rate near the strike points during H-mode operation in ITER and beyond. H-mode deuterium plasma exposures were performed on W-coated graphite and TZM molybdenum substrates in the DIII-D divertor using DiMES. The W-I 400.9 nm spectral line was monitored by fast filtered diagnostics cross calibrated via a high-resolution spectrometer to resolve inter-ELM W erosion. The effective ionization/photon (S/XB) was calibrated using a unique method developed on DIII-D based on surface analysis. Inferred S/XB values agree with an existing empirical scaling at low electron density (n_e) but diverge at higher densities, consistent with recent ADAS atomic physics modeling results. Edge modeling of the inter-ELM phase is conducted via OEDGE utilizing the new capability for charge-state resolved carbon impurity fluxes. ERO modeling is performed with the calculated main ion and impurity plasma background from OEDGE. ERO results demonstrate the importance of a mixed-material surface model in the interpretation of W sourcing measurements. As a result, it is demonstrated that measured inter-ELM W erosion rates can be well explained by C→W sputtering only if a realistic mixed material model is incorporated.
Progress in developing Poisson-Boltzmann equation solvers
Li, Chuan; Li, Lin; Petukh, Marharyta; Alexov, Emil
2013-01-01
This review outlines the recent progress made in developing more accurate and efficient solutions to model electrostatics in systems comprised of bio-macromolecules and nano-objects, the latter referring to objects that do not have a biological function themselves but are nowadays frequently used in biophysical and medical approaches in conjunction with bio-macromolecules. The problem of modeling macromolecular electrostatics is reviewed from two different angles: as a mathematical task, given the specific definition of the system to be modeled, and as a physical problem aiming to better capture the phenomena occurring in real experiments. In addition, specific attention is paid to methods to extend the capabilities of the existing solvers to model large systems toward applications of calculations of the electrostatic potential and energies in molecular motors, mitochondria complex, photosynthetic machinery and systems involving large nano-objects. PMID:24199185
Haro, Alexander J.; Dudley, Robert W.; Chelminski, Michael
2012-01-01
A two-dimensional computational fluid dynamics-habitat suitability (CFD–HSI) model was developed to identify potential zones of shallow depth and high water velocity that may present passage challenges for five anadromous fish species in the Penobscot River, Maine, upstream from two existing dams and as a result of the proposed future removal of the dams. Potential depth-challenge zones were predicted for larger species at the lowest flow modeled in the dam-removal scenario. Increasing flows under both scenarios increased the number and size of potential velocity-challenge zones, especially for smaller species. This application of the two-dimensional CFD–HSI model demonstrated its capabilities to estimate the potential effects of flow and hydraulic alteration on the passage of migratory fish.
NASA Technical Reports Server (NTRS)
Furukawa, S.
1975-01-01
Current applications of simulation models for clinical research include tilt-model simulation of orthostatic intolerance with hemorrhage and modeling of long-term circulatory regulation. Current capabilities include: (1) simulation of analogous pathological states and effects of abnormal environmental stressors by the manipulation of system variables and changing inputs in various sequences; (2) simulation of time courses of responses of controlled variables by the altered inputs and their relationships; (3) simulation of physiological responses to treatment such as isotonic saline transfusion; (4) simulation of the effectiveness of a treatment as well as the effects of complications superimposed on an existing pathological state; and (5) comparison of the effectiveness of various treatments/countermeasures for a given pathological state. The feasibility of applying simulation models to diagnostic and therapeutic research problems is assessed.
An ecohydrologic model for a shallow groundwater urban environment.
Arden, Sam; Ma, Xin Cissy; Brown, Mark
2014-01-01
The urban environment is a patchwork of natural and artificial surfaces that results in complex interactions with and impacts to natural hydrologic cycles. Evapotranspiration is a major hydrologic flow that is often altered through urbanization, although the mechanisms of change are sometimes difficult to tease out due to difficulty in effectively simulating soil-plant-atmosphere interactions. This paper introduces a simplified yet realistic model that is a combination of existing surface runoff and ecohydrology models designed to increase the quantitative understanding of complex urban hydrologic processes. Results demonstrate that the model is capable of simulating the long-term variability of major hydrologic fluxes as a function of impervious surface, temperature, water table elevation, canopy interception, soil characteristics, precipitation and complex mechanisms of plant water uptake. These understandings have potential implications for holistic urban water system management.
Microscopic modeling of multi-lane highway traffic flow
NASA Astrophysics Data System (ADS)
Hodas, Nathan O.; Jagota, Anand
2003-12-01
We discuss a microscopic model for the study of multi-lane highway traffic flow dynamics. Each car experiences a force resulting from a combination of the desire of the driver to attain a certain velocity, aerodynamic drag, and change of the force due to car-car interactions. The model also includes multi-lane simulation capability and the ability to add and remove obstructions. We implement the model via a Java applet, which is used to simulate traffic jam formation, the effect of bottlenecks on traffic flow, and the existence of light, medium, and heavy traffic flow. The simulations also provide insight into how the properties of individual cars result in macroscopic behavior. Because the investigation of emergent characteristics is so common in physics, the study of traffic in this manner sheds new light on how the micro-to-macro transition works in general.
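The per-car force balance described above (driver's desired speed, aerodynamic drag, car-car interaction) can be sketched as follows. The specific functional forms and all parameter values here are assumptions for illustration, not the paper's model:

```python
import math

def car_force(v, v_desired, gap, v_lead, m=1200.0, tau=2.0, c_drag=0.8,
              a_int=900.0, d0=2.0):
    """Net longitudinal force on one car (hypothetical functional forms).

    - Driver term: relaxation toward the desired speed over time tau.
    - Aerodynamic drag: quadratic in speed.
    - Car-car interaction: short-range repulsion that grows as the gap
      to the leading car closes, amplified by the closing speed.
    """
    drive = m * (v_desired - v) / tau          # desire to reach v_desired
    drag = -c_drag * v * abs(v)                # aerodynamic drag
    closing = max(v - v_lead, 0.0)             # only braking interactions
    interact = -a_int * math.exp(-(gap - d0)) * (1.0 + closing)
    return drive + drag + interact

# A car below its desired speed with a large gap mostly accelerates:
f_free = car_force(v=20.0, v_desired=30.0, gap=80.0, v_lead=25.0)
# The same car tailgating a slower leader feels a strong repulsion:
f_jam = car_force(v=20.0, v_desired=30.0, gap=3.0, v_lead=10.0)
# Very close behind a stopped car, the net force is hard braking:
f_brake = car_force(v=20.0, v_desired=30.0, gap=2.0, v_lead=0.0)
```

Integrating such a force for many cars (and adding lane-change rules) is what produces the emergent jam formation the applet demonstrates.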
NASA Astrophysics Data System (ADS)
Sahajpal, R.; Hurtt, G. C.; Fisk, J. P.; Izaurralde, R. C.; Zhang, X.
2012-12-01
While cellulosic biofuels are widely considered to be a low carbon energy source for the future, a comprehensive assessment of the environmental sustainability of existing and future biofuel systems is needed to assess their utility in meeting US energy and food needs without exacerbating environmental harm. To assess the carbon emission reduction potential of cellulosic biofuels, we need to identify lands that are initially not storing large quantities of carbon in soil and vegetation but are capable of producing abundant biomass with limited management inputs, and accurately model forest production rates and associated input requirements. Here we present modeled results for carbon emission reduction potential and cellulosic ethanol production of woody bioenergy crops replacing existing native prairie vegetation grown on marginal lands in the US Midwest. Marginal lands are selected based on soil properties describing use limitation, and are extracted from the SSURGO (Soil Survey Geographic) database. Yield estimates for existing native prairie vegetation on marginal lands, modeled using the process-based field-scale model EPIC (Environmental Policy Integrated Climate), amount to ~6.7 ± 2.0 Mg ha-1. To model woody bioenergy crops, the individual-based terrestrial ecosystem model ED (Ecosystem Demography) is initialized with the soil organic carbon stocks estimated at the end of the EPIC simulation. Four woody bioenergy crops (willow, southern pine, eucalyptus, and poplar) are parameterized in ED. Sensitivity analysis of model parameters and drivers is conducted to explore the range of carbon emission reduction possible with variation in woody bioenergy crop type and in spatial and temporal resolution. We hypothesize that growing cellulosic crops on these marginal lands can provide significant water quality, biodiversity and GHG emissions mitigation benefits, without accruing additional carbon costs from the displacement of food and feed production.
Low-Order Modeling of Dynamic Stall on Airfoils in Incompressible Flow
NASA Astrophysics Data System (ADS)
Narsipur, Shreyas
Unsteady aerodynamics has been a topic of research since the late 1930s and has increased in popularity among researchers studying dynamic stall in helicopters, insect/bird flight, micro air vehicles, wind-turbine aerodynamics, and flow-energy harvesting devices. Several experimental and computational studies have helped researchers gain a good understanding of the unsteady flow phenomena, but have proved to be expensive and time-intensive for rapid design and analysis purposes. Since the early 1970s, the push to develop low-order models to solve unsteady flow problems has resulted in several semi-empirical models capable of effectively analyzing unsteady aerodynamics in a fraction of the time required by high-order methods. However, due to the various complexities associated with time-dependent flows, the semi-empirical models require several empirical constants and curve fits derived from existing experimental and computational results to be effective analysis tools. The aim of the current work is to develop a low-order model capable of simulating incompressible dynamic-stall type flow problems, with a focus on accurately modeling the unsteady flow physics while reducing empirical dependencies. The lumped-vortex-element (LVE) algorithm is used as the baseline unsteady inviscid model to which augmentations are applied to model unsteady viscous effects. The current research is divided into two phases. The first phase focused on augmentations aimed at modeling pure unsteady trailing-edge boundary-layer separation and stall without leading-edge vortex (LEV) formation. The second phase is targeted at adding LEV-shedding capabilities to the LVE algorithm and combining them with the trailing-edge separation model from phase one to realize a holistic, optimized, and robust low-order dynamic stall model.
In phase one, initial augmentations to theory were focused on modeling the effects of steady trailing-edge separation by implementing a non-linear decambering flap to model the effect of the separated boundary layer. Unsteady RANS results for several pitch and plunge motions showed that the differences in aerodynamic loads between steady and unsteady flows can be attributed to the boundary-layer convection lag, which can be modeled by choosing an appropriate value of the time-lag parameter, tau2. In order to provide appropriate viscous corrections to inviscid unsteady calculations, the non-linear decambering flap is applied with a time lag determined by the tau2 value, which was found to be independent of motion kinematics for a given airfoil and Reynolds number. The predictions of the aerodynamic loads, unsteady stall, hysteresis loops, and flow reattachment from the low-order model agree well with CFD and experimental results, both for individual cases and for trends between motions. The model was also found to perform as well as existing semi-empirical models while using only a single empirically defined parameter. Inclusion of LEV-shedding capabilities and combining the resulting algorithm with phase one's trailing-edge separation model was the primary objective of phase two. Computational results at low and high Reynolds numbers were used to analyze the flow morphology of the LEV, to identify the common surface signature associated with LEV initiation at both low and high Reynolds numbers, and to relate it to the critical leading-edge suction parameter (LESP) that controls the initiation and termination of LEV shedding in the low-order model. The critical LESP, like the tau2 parameter, was found to be independent of motion kinematics for a given airfoil and Reynolds number. Results from the final low-order model compared excellently with CFD and experimental solutions, both in terms of aerodynamic loads and vortex flow pattern predictions.
Overall, the final combined dynamic stall model that resulted from the current research was successful in accurately modeling the physics of unsteady flow, thereby restricting the number of empirical coefficients to just two while successfully modeling the aerodynamic forces and flow patterns in a simple and precise manner.
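The boundary-layer convection lag described above is, in spirit, a first-order lag applied to a steady separation quantity. A minimal sketch under that assumption (the filter form, variable names, and numbers are illustrative, not the thesis's implementation):

```python
def lagged_separation(f_steady, dt, tau2, f0=1.0):
    """First-order lag applied to a steady separation location history.

    Integrates df/dt = (f_steady - f) / tau2 with forward Euler, so the
    effective (lagged) separation point trails the quasi-steady value,
    mimicking the hysteresis seen between pitch-up and pitch-down.
    """
    f = f0
    out = []
    for fs in f_steady:
        f += dt * (fs - f) / tau2
        out.append(f)
    return out

# Step change in the steady separation point from 1.0 (attached) toward
# 0.4: the lagged value approaches it gradually over a few tau2.
hist = lagged_separation([0.4] * 50, dt=0.01, tau2=0.1, f0=1.0)
```

The single empirical input is tau2; everything else follows from the inviscid model, which is what lets the thesis claim so few tuned coefficients.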
Virtual Habitat -a Dynamic Simulation of Closed Life Support Systems -Overall Status and Outlook
NASA Astrophysics Data System (ADS)
Zhukov, Anton; Schnaitmann, Jonas; Mecsaci, Ahmad; Bickel, Thomas; Markus Czupalla, M. Sc.
In order to optimize Life Support Systems (LSS) at the system level, stability questions and the grade of closure must be investigated. To do so, the exploration group of the Technical University of Munich (TUM) is developing the "Virtual Habitat" (V-HAB) dynamic LSS simulation software. The main advantages of dynamic simulation of LSS within V-HAB are the possibilities to compose different LSS configurations from the LSS subsystems, to conduct dynamic simulations that test their stability in different mission scenarios including emergency events, and to define the closure grade of the LSS. Additionally, optimization of the LSS based on different criteria will be possible. The Virtual Habitat simulation tool consists of four main modules: • Closed Environment Module (CEM) - monitoring of compounds in a closed environment • Crew Module (CM) - dynamic human simulation • P/C Systems Module (PCSM) - dynamic P/C subsystems • Plant Module (PM) - dynamic plant simulation Since the first idea and version, the V-HAB simulation has been significantly updated, increasing its capabilities and maturity. The updates introduced here concern all modules of V-HAB. In particular, significant progress has been made in the development of the human model: in addition to the existing human sub-models, three newly developed ones (thermal regulation, digestion, and a schedule controller) have been introduced and shall be presented. Regarding the Plant Module, a wheat plant model has been integrated into V-HAB and is being correlated against test data. Additionally, a first version of the algae bioreactor model has been developed and integrated. In terms of the P/C Systems Module, an innovative approach to P/C subsystem modelling has been developed and applied. The capabilities and features of the improved V-HAB models and the overall functionality of V-HAB are demonstrated in the form of meaningful test cases.
In addition to the presentation of the results, the correlation strategy for the Virtual Habitat simulation shall be introduced, assessing the model's current confidence level and giving an outlook on the future correlation strategy.
Thermal bioaerosol cloud tracking with Bayesian classification
NASA Astrophysics Data System (ADS)
Smith, Christian W.; Dupuis, Julia R.; Schundler, Elizabeth C.; Marinelli, William J.
2017-05-01
The development of a wide area, bioaerosol early warning capability employing existing uncooled thermal imaging systems used for persistent perimeter surveillance is discussed. The capability exploits thermal imagers with other available data streams including meteorological data and employs a recursive Bayesian classifier to detect, track, and classify observed thermal objects with attributes consistent with a bioaerosol plume. Target detection is achieved based on similarity to a phenomenological model which predicts the scene-dependent thermal signature of bioaerosol plumes. Change detection in thermal sensor data is combined with local meteorological data to locate targets with the appropriate thermal characteristics. Target motion is tracked utilizing a Kalman filter and nearly constant velocity motion model for cloud state estimation. Track management is performed using a logic-based upkeep system, and data association is accomplished using a combinatorial optimization technique. Bioaerosol threat classification is determined using a recursive Bayesian classifier to quantify the threat probability of each tracked object. The classifier can accept additional inputs from visible imagers, acoustic sensors, and point biological sensors to improve classification confidence. This capability was successfully demonstrated for bioaerosol simulant releases during field testing at Dugway Proving Grounds. Standoff detection at a range of 700m was achieved for as little as 500g of anthrax simulant. Developmental test results will be reviewed for a range of simulant releases, and future development and transition plans for the bioaerosol early warning platform will be discussed.
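The recursive Bayesian classifier at the heart of the threat assessment can be sketched as repeated prior-times-likelihood updates over the tracked object's class; the class names and likelihood values below are invented for illustration:

```python
def bayes_update(prior, likelihoods):
    """One recursive Bayes step: posterior ∝ prior × likelihood,
    renormalized over the candidate classes."""
    posterior = {c: prior[c] * likelihoods[c] for c in prior}
    z = sum(posterior.values())
    return {c: p / z for c, p in posterior.items()}

# Start undecided between a bioaerosol plume and a benign thermal object.
belief = {"bioaerosol": 0.5, "benign": 0.5}

# Each new frame contributes a likelihood for the observed attributes
# (thermal signature, motion consistent with wind, etc.); here three
# frames that each favor the bioaerosol hypothesis 2:1.
for frame_likelihood in [{"bioaerosol": 0.8, "benign": 0.4}] * 3:
    belief = bayes_update(belief, frame_likelihood)
```

Additional sensor modalities (visible imagery, acoustics, point biosensors) slot in as extra likelihood factors in the same update, which is why the abstract describes them as confidence-improving inputs.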
NASA Technical Reports Server (NTRS)
Perry, Bruce; Anderson, Molly
2015-01-01
The Cascade Distillation Subsystem (CDS) is a rotary multistage distiller being developed to serve as the primary processor for wastewater recovery during long-duration space missions. The CDS could be integrated with a system similar to the International Space Station (ISS) Water Processor Assembly (WPA) to form a complete Water Recovery System (WRS) for future missions. Independent chemical process simulations with varying levels of detail have previously been developed using Aspen Custom Modeler (ACM) to aid in the analysis of the CDS and several WPA components. The existing CDS simulation could not model behavior during thermal startup and lacked detailed analysis of several key internal processes, including heat transfer between stages. The first part of this paper describes modifications to the ACM model of the CDS that improve its capabilities and the accuracy of its predictions. Notably, the modified version of the model can accurately predict behavior during thermal startup for both NaCl solution and pretreated urine feeds. The model is used to predict how changing operating parameters and design features of the CDS affects its performance, and conclusions from these predictions are discussed. The second part of this paper describes the integration of the modified CDS model and the existing WPA component models into a single WRS model. The integrated model is used to demonstrate the effects that changes to one component can have on the dynamic behavior of the system as a whole.
Space Station Mission Planning System (MPS) development study. Volume 2
NASA Technical Reports Server (NTRS)
Klus, W. J.
1987-01-01
The process and existing software used for Spacelab payload mission planning were studied. A complete baseline definition of the Spacelab payload mission planning process was established, along with a definition of existing software capabilities for potential extrapolation to the Space Station. This information was used as a basis for defining system requirements to support Space Station mission planning. The Space Station mission planning concept was reviewed for the purpose of identifying areas where artificial intelligence concepts might offer substantially improved capability. Three specific artificial intelligence concepts were to be investigated for applicability: natural language interfaces; expert systems; and automatic programming. The advantages and disadvantages of interfacing an artificial intelligence language with existing FORTRAN programs or of converting totally to a new programming language were identified.
Do Open Source LMSs Support Personalization? A Comparative Evaluation
NASA Astrophysics Data System (ADS)
Kerkiri, Tania; Paleologou, Angela-Maria
A number of parameters that support the LMSs' capabilities towards content personalization are presented and substantiated. These parameters constitute critical criteria for an exhaustive investigation of the personalization capabilities of the most popular open source LMSs. Results are comparatively shown and commented upon, thus highlighting a course of conduct for the implementation of new personalization methodologies for these LMSs, aligned with their existing infrastructure, to maintain support of the numerous educational institutions entrusting major parts of their curricula to them. Meanwhile, new capabilities arise as drawn from a more efficient description of the existing resources - especially when organized into widely available repositories - that lead to qualitatively advanced learner-oriented courses which would ideally meet the challenge of combining personification of demand and personalization of thematic content at once.
40 CFR 165.87 - Design and capacity requirements for existing structures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Design and capacity requirements for... Structures § 165.87 Design and capacity requirements for existing structures. (a) For all existing... concrete or other rigid material capable of withstanding the full hydrostatic head, load and impact of any...
Characterization of structural connections using free and forced response test data
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Huckelbridge, Arthur A.
1989-01-01
The accurate prediction of system dynamic response often has been limited by deficiencies in existing capabilities to characterize connections adequately. Connections between structural components often are mechanically complex and difficult to model accurately in analysis. Improved analytical models for connections are needed to improve system dynamic predictions. A procedure for identifying physical connection properties from free and forced response test data is developed, then verified utilizing a system having both a linear and a nonlinear connection. Connection properties are computed in terms of physical parameters so that the physical characteristics of the connections can be better understood, in addition to providing improved input for the system model. The identification procedure is applicable to multi-degree-of-freedom systems, and does not require that the test data be measured directly at the connection locations.
Clinical governance is "ACE"--using the EFQM excellence model to support baseline assessment.
Holland, K; Fennell, S
2000-01-01
The introduction of clinical governance in the "new NHS" means that National Health Service (NHS) organisations are now accountable for the quality of the services they provide to their local communities. As part of the implementation of clinical governance in the NHS, Trusts and health authorities had to complete a baseline assessment of their capability and capacity by September 1999. Describes one Trust's approach to developing and implementing its baseline assessment tool, based upon its existing use of the European Foundation for Quality Management (EFQM) Excellence Model. An initial review of the process suggests that the model provides an adaptable framework for the development of a comprehensive and practical assessment tool and that self-assessment ensures ownership of action plans at service level.
NASA Astrophysics Data System (ADS)
Halkos, George E.; Tsilika, Kyriaki D.
2011-09-01
In this paper we examine the property of asymptotic stability in several dynamic economic systems, modeled as ordinary differential equations in the time parameter t. Asymptotic stability ensures intertemporal equilibrium for the economic quantity the solution stands for, regardless of what the initial conditions happen to be. Existence of economic equilibrium in continuous time models is checked via a symbolic language, the Xcas program editor. Using stability theorems of differential equations as background, a brief overview of the symbolic capabilities of the free software Xcas is given. We present computational experience with a programming style for stability results of ordinary linear and nonlinear differential equations. Numerical experiments on traditional applications of economic dynamics exhibit the simplicity, clarity, and brevity of the input and output of our computer codes.
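For linear systems, the stability checks the paper performs symbolically in Xcas reduce to the standard eigenvalue criterion. A numeric sketch of that criterion in Python/NumPy (a substitution of convenience; the paper itself works symbolically):

```python
import numpy as np

def asymptotically_stable(A, tol=1e-12):
    """x' = A x is asymptotically stable iff every eigenvalue of A
    has a strictly negative real part, so every solution decays to
    the equilibrium regardless of initial conditions."""
    eigvals = np.linalg.eigvals(np.asarray(A, dtype=float))
    return bool(np.all(eigvals.real < -tol))

# A damped system is asymptotically stable; a pure rotation (center)
# is only marginally stable, so the test rejects it.
stable = asymptotically_stable([[-1.0, 1.0], [0.0, -2.0]])
marginal = asymptotically_stable([[0.0, 1.0], [-1.0, 0.0]])
```

Nonlinear models are handled the same way locally, by applying the criterion to the Jacobian evaluated at the equilibrium (Hartman-Grobman), which is where symbolic differentiation earns its keep.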
1st Order Modeling of a SAW Delay Line using MathCAD(Registered)
NASA Technical Reports Server (NTRS)
Wilson, William C.; Atkinson, Gary M.
2007-01-01
To aid in the development of SAW sensors for Integrated Vehicle Health Monitoring applications, a first-order model of a SAW delay line has been created using MathCAD. The model implements the Impulse Response method to calculate the frequency response, impedance, and insertion loss. This paper presents the model and the results from the model for a SAW delay line design. Integrated Vehicle Health Monitoring (IVHM) of aerospace vehicles requires rugged sensors having reduced volume, mass, and power that can be used to measure a variety of phenomena. Wireless systems are preferred when retro-fitting sensors onto existing vehicles [1]. Surface Acoustic Wave (SAW) devices are capable of sensing temperature, pressure, strain, chemical species, mass loading, acceleration, and shear stress. SAW technology is low cost, rugged, lightweight, and extremely low power. Passive wireless sensors have been developed using SAW technology. For these reasons, new SAW sensors are being investigated for aerospace applications.
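The Impulse Response method's best-known result is the sinc-shaped frequency response of a uniform interdigital transducer. A sketch of that textbook form (the normalization and parameter names are assumptions; the paper's full model also computes impedance and insertion loss):

```python
import math

def idt_response(f, f0, n_p):
    """Normalized IDT magnitude response from the impulse response model:
    |H| = |sin(X)/X| with X = n_p * pi * (f - f0) / f0, where f0 is the
    synchronous frequency and n_p the number of finger pairs."""
    x = n_p * math.pi * (f - f0) / f0
    return 1.0 if x == 0.0 else abs(math.sin(x) / x)

# The response peaks at the synchronous frequency and rolls off away
# from it; more finger pairs narrow the main lobe.
peak = idt_response(100e6, 100e6, n_p=50)
off = idt_response(101e6, 100e6, n_p=50)
```

A delay line cascades two such transducers, so its overall response is the product of the input and output IDT responses times the propagation delay term.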
The graph neural network model.
Scarselli, Franco; Gori, Marco; Tsoi, Ah Chung; Hagenbuchner, Markus; Monfardini, Gabriele
2009-01-01
Many underlying relationships among data in several areas of science and engineering, e.g., computer vision, molecular chemistry, molecular biology, pattern recognition, and data mining, can be represented in terms of graphs. In this paper, we propose a new neural network model, called the graph neural network (GNN) model, that extends existing neural network methods for processing the data represented in graph domains. This GNN model, which can directly process most of the practically useful types of graphs, e.g., acyclic, cyclic, directed, and undirected, implements a function τ(G,n) ∈ ℝ^m that maps a graph G and one of its nodes n into an m-dimensional Euclidean space. A supervised learning algorithm is derived to estimate the parameters of the proposed GNN model. The computational cost of the proposed algorithm is also considered. Some experimental results are shown to validate the proposed learning algorithm, and to demonstrate its generalization capabilities.
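The GNN computes τ(G,n) by iterating node states to the fixed point of a contraction map over neighbor states. A toy instance of that propagation scheme (the tanh update, small weights, and mean aggregation are illustrative choices, not the paper's parameterization):

```python
import numpy as np

def gnn_states(adj, labels, W, b, iters=100):
    """Iterate x_n <- tanh(W @ mean(x_neighbors) + label_n * b) to a
    fixed point. With small weights, tanh keeps the update a contraction,
    which is the condition the GNN model imposes for a unique solution."""
    n = len(labels)
    dim = W.shape[0]
    x = np.zeros((n, dim))
    for _ in range(iters):
        agg = np.array([x[adj[i]].mean(axis=0) if adj[i] else np.zeros(dim)
                        for i in range(n)])
        x = np.tanh(agg @ W.T + np.outer(labels, b))
    return x

# A 3-node path graph with scalar node labels; nodes 0 and 2 are
# symmetric, so their converged states must coincide.
adj = {0: [1], 1: [0, 2], 2: [1]}
x = gnn_states(adj, labels=np.array([1.0, -1.0, 1.0]),
               W=0.1 * np.eye(2), b=np.array([0.5, -0.5]))
```

An output network g(x_n, label_n) applied to the converged state would then produce the m-dimensional value τ(G,n); the supervised learning algorithm in the paper trains both maps end to end.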
Communications network design and costing model technical manual
NASA Technical Reports Server (NTRS)
Logan, K. P.; Somes, S. S.; Clark, C. A.
1983-01-01
This computer model provides the capability for analyzing long-haul trunking networks comprising a set of user-defined cities, traffic conditions, and tariff rates. Networks may consist of all terrestrial connectivity, all satellite connectivity, or a combination of terrestrial and satellite connectivity. Network solutions provide the least-cost routes between all cities, the least-cost network routing configuration, and terrestrial and satellite service cost totals. The CNDC model allows analyses involving three specific FCC-approved tariffs, which are uniquely structured and representative of most existing service connectivity and pricing philosophies. User-defined tariffs that can be variations of these three tariffs are accepted as input to the model and allow considerable flexibility in network problem specification. The resulting model extends the domain of network analysis from traditional fixed link cost (distance-sensitive) problems to more complex problems involving combinations of distance and traffic-sensitive tariffs.
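Finding the least-cost route between city pairs under per-link tariffs is, at its core, a shortest-path computation. A sketch of that routing core (Dijkstra's algorithm, with hypothetical city names and rates; the CNDC model's actual tariff structures, including traffic-sensitive pricing, are far richer):

```python
import heapq

def least_cost_route(tariff, src, dst):
    """Least-cost path through a city graph whose edge weights are
    per-link tariff costs (Dijkstra's algorithm)."""
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, cost in tariff.get(u, {}).items():
            nd = d + cost
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dst]

# Hypothetical terrestrial links: the multi-hop land route undercuts
# the direct NYC-LA link priced at 25 units.
tariff = {"NYC": {"CHI": 9.0, "LA": 25.0},
          "CHI": {"DEN": 8.0},
          "DEN": {"LA": 7.0}}
route, cost = least_cost_route(tariff, "NYC", "LA")
```

Distance-sensitive tariffs make the edge weights static; traffic-sensitive tariffs make them depend on the flows themselves, which is what pushes the CNDC model beyond a single shortest-path pass.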
Functional Risk Modeling for Lunar Surface Systems
NASA Technical Reports Server (NTRS)
Thomson, Fraser; Mathias, Donovan; Go, Susie; Nejad, Hamed
2010-01-01
We introduce an approach to risk modeling that we call functional modeling, which we have developed to estimate the capabilities of a lunar base. The functional model tracks the availability of functions provided by systems, in addition to the operational state of those systems' constituent strings. By tracking functions, we are able to identify cases where identical functions are provided by elements (rovers, habitats, etc.) that are connected together on the lunar surface. We credit functional diversity in those cases, and in doing so compute more realistic estimates of operational mode availabilities. The functional modeling approach yields more realistic estimates of the availability of the various operational modes provided to astronauts by the ensemble of surface elements included in a lunar base architecture. By tracking functional availability, the effects of diverse backup, which often exists when two or more independent elements are connected together, are properly accounted for.
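Crediting functional diversity amounts to computing the availability of a function that several independent elements can each provide. A minimal sketch under an independence assumption (the element availability numbers are made up for illustration):

```python
def functional_availability(element_availabilities):
    """Availability of a function offered redundantly by independent
    elements: the function is down only if every provider is down,
    so A = 1 - prod(1 - a_i)."""
    p_all_down = 1.0
    for a in element_availabilities:
        p_all_down *= (1.0 - a)
    return 1.0 - p_all_down

# A function provided by both a habitat (a=0.95) and a docked rover
# (a=0.90) is far more available than either element alone.
a_joint = functional_availability([0.95, 0.90])
```

Tracking functions rather than elements is exactly what makes this credit visible: an element-level model would count the habitat outage as a loss of the function even while the rover still provides it.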
Modeling the Energy Use of a Connected and Automated Transportation System (Poster)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonder, J.; Brown, A.
Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.
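Substructuring rests on condensing each substructure's interior degrees of freedom onto its boundary so that only boundary DOFs couple substructures at the next level. A sketch of that static condensation step (the two-spring example is illustrative; dynamic substructuring additionally reduces the mass matrix):

```python
import numpy as np

def condense(K, keep):
    """Static condensation of a stiffness matrix onto the retained
    (boundary) DOFs:  K_red = K_bb - K_bi @ inv(K_ii) @ K_ib."""
    keep = list(keep)
    drop = [i for i in range(K.shape[0]) if i not in keep]
    K_bb = K[np.ix_(keep, keep)]
    K_bi = K[np.ix_(keep, drop)]
    K_ii = K[np.ix_(drop, drop)]
    # Solve instead of forming the inverse explicitly (K symmetric).
    return K_bb - K_bi @ np.linalg.solve(K_ii, K_bi.T)

# Two springs of stiffness k=2 in series; condensing out the middle
# node yields the series stiffness k/2 = 1 between the end nodes.
K = np.array([[2.0, -2.0, 0.0],
              [-2.0, 4.0, -2.0],
              [0.0, -2.0, 2.0]])
K_red = condense(K, keep=[0, 2])
```

Applying this recursively over levels is what keeps each solve small even when the assembled model would exceed the software's capacity.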
Model-based object classification using unification grammars and abstract representations
NASA Astrophysics Data System (ADS)
Liburdy, Kathleen A.; Schalkoff, Robert J.
1993-04-01
The design and implementation of a high level computer vision system which performs object classification is described. General object labelling and functional analysis require models of classes which display a wide range of geometric variations. A large representational gap exists between abstract criteria such as `graspable' and current geometric image descriptions. The vision system developed and described in this work addresses this problem and implements solutions based on a fusion of semantics, unification, and formal language theory. Object models are represented using unification grammars, which provide a framework for the integration of structure and semantics. A methodology for the derivation of symbolic image descriptions capable of interacting with the grammar-based models is described and implemented. A unification-based parser developed for this system achieves object classification by determining if the symbolic image description can be unified with the abstract criteria of an object model. Future research directions are indicated.
Identification of Defects in Piles Through Dynamic Testing
NASA Astrophysics Data System (ADS)
Liao, Shutao T.; Roesset, Jose M.
1997-04-01
The objective of this work was to evaluate the theoretical capabilities of the non-destructive impact-response method in detecting the existence of a single defect in a pile, its location and its length. The cross-section of the pile is assumed to be circular and the defects are assumed to be axisymmetric in geometry. As mentioned in the companion paper, special codes utilizing one-dimensional (1-D) and three-dimensional (3-D) axisymmetric finite element models were developed to simulate the responses of defective piles to an impact load. Extensive parametric studies were then performed. In each study, the results from the direct use of time histories of displacements or velocities and the mechanical admittance (or mobility) function were compared in order to assess their capabilities. The effects of the length and the width of a defect were also investigated using these methods. Int. J. Numer. Anal. Meth. Geomech., vol. 21, 277-291 (1997)
The distributed production system of the SuperB project: description and results
NASA Astrophysics Data System (ADS)
Brown, D.; Corvo, M.; Di Simone, A.; Fella, A.; Luppi, E.; Paoloni, E.; Stroili, R.; Tomassetti, L.
2011-12-01
The SuperB experiment needs large samples of Monte Carlo simulated events in order to finalize the detector design and to estimate data analysis performance. The requirements are beyond the capabilities of a single computing farm, so a distributed production model capable of exploiting the existing HEP worldwide distributed computing infrastructure is needed. In this paper we describe the set of tools that have been developed to manage the production of the required simulated events. The production of events follows three main phases: distribution of input data files to the remote site Storage Elements (SE); job submission, via the SuperB GANGA interface, to all available remote sites; and output file transfer to the CNAF repository. The job workflow includes procedures for consistency checking, monitoring, data handling and bookkeeping. A replication mechanism allows storing the job output on the local site SE. Results from the 2010 official productions are reported.
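The three production phases described above can be sketched as a driver loop. Site names, file names, and the checksum-based consistency check are invented for illustration; the real system uses GANGA submission and grid Storage Elements:

```python
import hashlib

def distribute(input_files, sites):
    """Phase 1: stage input files to each remote site's Storage Element."""
    return {site: list(input_files) for site in sites}

def submit(site, files):
    """Phase 2: one simulation job per staged input file (stubbed)."""
    return [f"{site}:{f}:done" for f in files]

def transfer(outputs):
    """Phase 3: ship outputs to the central repository, recording a
    content hash as a stand-in for the consistency/bookkeeping checks."""
    return {out: hashlib.md5(out.encode()).hexdigest() for out in outputs}

sites = ["site_a", "site_b"]
staged = distribute(["evts_001.root", "evts_002.root"], sites)
outputs = [job for s in sites for job in submit(s, staged[s])]
repo = transfer(outputs)
print(len(repo))  # 4: one output record per (site, file) job
```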
Solid propulsion advanced concepts
NASA Technical Reports Server (NTRS)
Nakamura, Y.; Shafer, J. I.
1972-01-01
The feasibility and application of a solid propulsion powered spacecraft concept to implement high energy missions independent of multiplanetary swingby opportunities are assessed and recommendations offered for future work. An upper stage, solid propulsion launch vehicle augmentation system was selected as the baseline configuration in view of the established program goals of low cost and high reliability. Spacecraft and propulsion system data that characterize mission performance capabilities were generated to serve as the basis for subsequent tradeoff studies. A cost effectiveness model was used for the preliminary feasibility assessment to provide a meaningful comparative effectiveness measure of the various candidate designs. The results substantiated the feasibility of the powered spacecraft concept when used in conjunction with several intermediate-sized launch vehicles as well as the existence of energy margins by which to exploit the attainment of extended mission capabilities. Additionally, in growth option applications, the employment of advanced propulsion systems and alternate spacecraft approaches appear promising.
Space shuttle main engine computed tomography applications
NASA Technical Reports Server (NTRS)
Sporny, Richard F.
1990-01-01
Over the past two years, the potential applications of computed tomography to the fabrication and overhaul of the Space Shuttle Main Engine were evaluated. Application tests were performed at various government and manufacturer facilities with equipment produced by four different manufacturers. The hardware scanned varied in size and complexity from a small temperature sensor and turbine blades to an assembled heat exchanger and main injector oxidizer inlet manifold. The evaluation of capabilities included the ability to identify and locate internal flaws, measure the depth of surface cracks, measure wall thickness, compare manifold design contours to actual part contours, perform automatic dimensional inspections, generate 3D computer models of actual parts, and image the relationship of the details in a complex assembly. The capabilities evaluated, with the exception of measuring the depth of surface flaws, demonstrated the existing and potential ability to perform many beneficial Space Shuttle Main Engine applications.
Liu, Qingshan; Guo, Zhishan; Wang, Jun
2012-02-01
In this paper, a one-layer recurrent neural network is proposed for solving pseudoconvex optimization problems subject to linear equality and bound constraints. Compared with the existing neural networks for optimization (e.g., the projection neural networks), the proposed neural network is capable of solving more general pseudoconvex optimization problems with equality and bound constraints. Moreover, it is capable of solving constrained fractional programming problems as a special case. The convergence of the state variables of the proposed neural network to achieve solution optimality is guaranteed as long as the designed parameters in the model are larger than the derived lower bounds. Numerical examples with simulation results illustrate the effectiveness and characteristics of the proposed neural network. In addition, an application for dynamic portfolio optimization is discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.
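The paper's model generalizes projection neural networks, whose dynamics drive the state toward the constrained optimum via dx/dt = -x + P(x - ∇f(x)), with P projecting onto the feasible set. A sketch of those dynamics, Euler-integrated for a simple bound-constrained convex quadratic rather than the paper's pseudoconvex problems with equality constraints:

```python
def solve(grad, lo, hi, x0, step=0.01, iters=5000):
    """Integrate dx/dt = -x + P(x - grad f(x)) with Euler steps,
    where P clips to the box [lo, hi]."""
    clip = lambda v: max(lo, min(hi, v))
    x = x0
    for _ in range(iters):
        x += step * (-x + clip(x - grad(x)))
    return x

# minimize f(x) = (x - 3)^2 subject to 0 <= x <= 2; the constrained
# optimum sits on the boundary at x = 2
grad = lambda x: 2.0 * (x - 3.0)
print(round(solve(grad, 0.0, 2.0, 0.0), 3))  # 2.0
```

The paper's contribution is precisely that convergence is guaranteed for the wider pseudoconvex class once the model's design parameters exceed derived lower bounds; this demo only illustrates the underlying dynamics.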
Computer Support Systems for Estimating Chemical Toxicity: Present Capabilities and Future Trends
A wide variety of computer-based artificial intelligence (AI) and decision support systems exist currently to aid in the assessment of toxicity for environmental chemicals. T...
Occupational Therapists' Views of Nussbaum's Life Capability: An Exploratory Study.
Mousavi, Tahmineh; Dharamsi, Shafik; Forwell, Susan; Dean, Elizabeth
2015-10-01
Life Capability is the first and most fundamental of Nussbaum's 10 Central Human Functional Capabilities (CHFCs). This capability refers to a person having a quality life of normal duration. The purpose of this study was to explore the views of occupational therapists about Life Capability, specifically, their perspectives of this capability and its perceived relevance to practice. Semi-structured interviews with 14 occupational therapists in British Columbia, Canada, were conducted and thematically analyzed. Within this Canadian context, three themes emerged regarding occupational therapists' views about Life Capability: basic human right, quality of life, and longevity. Occupational therapists appear to view Life Capability as being consistent with the values of the occupational therapy profession. Nussbaum's other CHFCs warrant study to explore the degree to which the Capabilities Approach could complement existing occupational therapy theories, science, and practice. © The Author(s) 2015.
NASA Technical Reports Server (NTRS)
Foster, John V.; Hartman, David C.
2017-01-01
The NASA Unmanned Aircraft System (UAS) Traffic Management (UTM) project is conducting research to enable civilian low-altitude airspace and UAS operations. A goal of this project is to develop probabilistic methods to quantify risk during failures and off-nominal flight conditions. An important part of this effort is the reliable prediction of feasible trajectories during off-nominal events such as control failure, atmospheric upsets, or navigation anomalies that can cause large deviations from the intended flight path or extreme vehicle upsets beyond the normal flight envelope. Few examples of high-fidelity modeling and prediction of off-nominal behavior for small UAS (sUAS) vehicles exist, and modeling requirements for accurately predicting flight dynamics for out-of-envelope or failure conditions are essentially undefined. In addition, the broad range of sUAS aircraft configurations already being fielded presents a significant modeling challenge, as these vehicles are often very different from one another, are likely to possess dramatically different flight dynamics and resultant trajectories, and may require different modeling approaches to capture off-nominal behavior. NASA has undertaken an extensive research effort to define sUAS flight dynamics modeling requirements and develop preliminary high-fidelity six-degree-of-freedom (6-DOF) simulations capable of more closely predicting off-nominal flight dynamics and trajectories. This research has included a literature review of existing sUAS modeling and simulation work as well as development of experimental testing methods to measure and model key components of propulsion, airframe and control characteristics. The ultimate objective of these efforts is to develop tools to support UTM risk analyses and the real-time prediction of off-nominal trajectories for use in the UTM Risk Assessment Framework (URAF).
This paper focuses on modeling and simulation efforts for a generic quad-rotor configuration typical of many commercial vehicles in use today. An overview of relevant off-nominal multi-rotor behaviors will be presented to define modeling goals and to identify the prediction capability lacking in simplified models of multi-rotor performance. A description of recent NASA wind tunnel testing of multi-rotor propulsion and airframe components will be presented illustrating important experimental and data acquisition methods, and a description of preliminary propulsion and airframe models will be presented. Lastly, examples of predicted off-nominal flight dynamics and trajectories from the simulation will be presented.
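A toy illustration of the kind of off-nominal trajectory prediction discussed above: a point-mass quad-rotor loses one of four rotors, available thrust drops to 3/4 of the hover value, and the vertical descent is integrated forward. All numbers are invented; a real 6-DOF model would include attitude dynamics, rotor aerodynamics, and airframe forces of the sort the NASA wind tunnel tests measure:

```python
g, mass = 9.81, 1.5            # m/s^2, kg (assumed vehicle mass)
hover_thrust = mass * g        # N, thrust needed to hover
thrust = 0.75 * hover_thrust   # one of four rotors failed

z, vz, dt, t = 50.0, 0.0, 0.01, 0.0  # altitude (m), speed, step, time
while z > 0.0:
    az = thrust / mass - g     # net vertical acceleration (negative)
    vz += az * dt
    z += vz * dt
    t += dt
print(round(t, 1), "s to ground, impact speed", round(-vz, 1), "m/s")
```

Even this point-mass sketch shows why simplified models are insufficient: it predicts time-to-ground but says nothing about the tumbling or lateral drift a full 6-DOF simulation would capture.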
Capabilities of the Large-Scale Sediment Transport Facility
2016-04-01
experiments in wave/current environments. INTRODUCTION: The LSTF (Figure 1) is a large-scale laboratory facility capable of simulating conditions...comparable to low-wave energy coasts. The facility was constructed to address deficiencies in existing methods for calculating longshore sediment transport. The LSTF consists of a 30 m wide, 50 m long, 1.4 m deep basin. Waves are generated by four digitally controlled wave makers capable of producing
Examining quality improvement programs: the case of Minnesota hospitals.
Olson, John R; Belohlav, James A; Cook, Lori S; Hays, Julie M
2008-10-01
To determine if there is a hierarchy of improvement program adoption by hospitals and outline that hierarchy. Primary data were collected in the spring of 2007 via e-survey from 210 individuals representing 109 Minnesota hospitals. Secondary data from 2006 were assembled from the Leapfrog database. As part of a larger survey, respondents were given a list of improvement programs and asked to identify those programs that are used in their hospital. Rasch model analysis was used to assess whether a unidimensional construct exists that defines a hospital's ability to implement performance improvement programs. Linear regression analysis was used to assess the relationship of the Rasch ability scores with Leapfrog Safe Practices Scores to validate the research findings. The results of the study show that hospitals have widely varying abilities in implementing improvement programs. In addition, improvement programs present differing levels of difficulty for hospitals trying to implement them. Our findings also indicate that the ability to adopt improvement programs is important to the overall performance of hospitals. There is a hierarchy of improvement programs in the health care context. A hospital's ability to successfully adopt improvement programs is a function of its existing capabilities. As a hospital's capability increases, the ability to successfully implement higher level programs also increases.
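The Rasch model underlying the analysis places hospitals (ability θ) and programs (difficulty b) on one scale: the probability that a hospital adopts a program is P = 1 / (1 + exp(-(θ - b))). A sketch with illustrative values, not the study's estimates:

```python
import math

def rasch_p(theta, b):
    """Probability that a hospital with ability theta successfully
    adopts an improvement program of difficulty b (Rasch model)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A hospital whose ability exactly matches a program's difficulty
# has even odds of adopting it:
print(rasch_p(0.0, 0.0))             # 0.5
# A much harder program is far less likely to be adopted:
print(round(rasch_p(0.0, 2.0), 3))   # 0.119
```

This is what makes the hierarchy finding possible: because all programs sit on the same difficulty scale, a hospital's ability score predicts which rungs of the ladder it can reach.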
The natural compound sanguinarine perturbs the regenerative capabilities of planarians.
Balestrini, Linda; Di Donfrancesco, Alessia; Rossi, Leonardo; Marracci, Silvia; Isolani, Maria E; Bianucci, Anna M; Batistoni, Renata
2017-01-01
The natural alkaloid sanguinarine has remarkable therapeutic properties and has been used for centuries as a folk remedy. This compound exhibits interesting anticancer properties and is currently receiving attention as a potential chemotherapeutic agent. Nevertheless, limited information exists regarding its safety for developing organisms. Planarians are an animal model known for their extraordinary stem cell-based regenerative capabilities and are increasingly used for toxicological and pharmacological studies. Here, we report that sanguinarine, at micromolar concentrations, perturbs the regeneration process in the planarian Dugesia japonica. We show that sanguinarine exposure causes defects during anterior regeneration and visual system recovery, as well as anomalous remodelling of pre-existing structures. Investigating the effects of sanguinarine on stem cells, we found that sanguinarine perturbs the transcriptional profile of early and late stem cell progeny markers. Our results indicate that sanguinarine exposure alters cell dynamics and induces apoptosis without affecting cell proliferation. Finally, sanguinarine exposure influences the expression level of the H+, K+-ATPase α subunit, a gene of the P-type-ATPase pump family which plays a crucial role during anterior regeneration in planaria. On the whole, our data reveal that sanguinarine perturbs multiple mechanisms which regulate regeneration dynamics and contribute to a better understanding of the safety profile of this alkaloid in developing organisms.
A Different Web-Based Geocoding Service Using Fuzzy Techniques
NASA Astrophysics Data System (ADS)
Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.
2015-12-01
Geocoding - the process of finding position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the concepts of proximity and nearness are not modelled appropriately, and these services search for an address only by matching it against descriptive data. In addition, there are limitations in displaying search results. Resolving these limitations can enhance the efficiency of the existing geocoding services. This paper proposes integrating fuzzy techniques with the geocoding process to resolve these limitations. A web-based system was designed to implement the proposed method. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides capabilities such as searching multi-part addresses, searching places based on their location, non-point representation of results, and displaying search results based on their priority.
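The fuzzy-nearness idea can be sketched as follows: each reference place contributes a membership map that decays with distance, and the maps are combined with a fuzzy overlay (here a minimum, i.e. a fuzzy AND over "near A and near B"). The coordinates, the 500 m decay scale, and the places themselves are invented for illustration; the paper builds full raster distance maps rather than point queries:

```python
import math

def nearness(point, place, scale=500.0):
    """Fuzzy membership in 'near place': 1 at the place itself,
    decaying exponentially with Euclidean distance."""
    return math.exp(-math.dist(point, place) / scale)

def fuzzy_and(*memberships):
    """Fuzzy overlay by minimum: the candidate is only as 'near both'
    as its weakest membership."""
    return min(memberships)

school, park = (0.0, 0.0), (600.0, 0.0)
candidate = (300.0, 0.0)  # midway between the two reference places
score = fuzzy_and(nearness(candidate, school), nearness(candidate, park))
print(round(score, 3))  # 0.549
```

Ranking candidates by such overlay scores is what lets the service return graded, prioritized results instead of a single exact-match point.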