Sample records for current modeling capabilities

  1. Upgrades, Current Capabilities and Near-Term Plans of the NASA ARC Mars Climate

    NASA Technical Reports Server (NTRS)

    Hollingsworth, J. L.; Kahre, Melinda April; Haberle, Robert M.; Schaeffer, James R.

    2012-01-01

    We describe and review recent upgrades to the ARC Mars climate modeling framework, in particular, with regards to physical parameterizations (i.e., testing, implementation, modularization and documentation); the current climate modeling capabilities; selected research topics regarding current/past climates; and then, our near-term plans related to the NASA ARC Mars general circulation modeling (GCM) project.

  2. Best Practices for Evaluating the Capability of Nondestructive Evaluation (NDE) and Structural Health Monitoring (SHM) Techniques for Damage Characterization (Post-Print)

    DTIC Science & Technology

    2016-02-10

    a wide range of part, environmental and damage conditions. Best practices of using models are presented for both an eddy current NDE sizing and ... to assess the reliability of NDE and SHM characterization capability ... EDDY CURRENT NDE CASE STUDY: An eddy current crack sizing case study is presented to highlight examples of some of these complex characteristics of ...

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rabiti, Cristian; Alfonsi, Andrea; Huang, Dongli

    This report collect the effort performed to improve the reliability analysis capabilities of the RAVEN code and explore new opportunity in the usage of surrogate model by extending the current RAVEN capabilities to multi physics surrogate models and construction of surrogate models for high dimensionality fields.

  4. Demonstration of a Real Time Capability to Produce Tidal Heights and Currents for Naval Operational Use: A Case Study for the West Coast of Africa (Liberia)

    NASA Technical Reports Server (NTRS)

    Mehra, Avichal; Anantharaj, Valentine; Payne, Steve; Kantha, Lakshmi

    1996-01-01

    This report documents an existing capability to produce operationally relevant products on sea level and currents from a tides/storm surge model for any coastal region around the world within 48 hours from the time of the request. The model is ready for transition to the Naval Oceanographic Office (NAVOCEANO) for potential contingency use anywhere around the world. A recent application to naval operations offshore Liberia illustrates this. Mississippi State University, in collaboration with the University of Colorado and NAVOCEANO, successfully deployed the Colorado University Rapidly Relocatable Nestable Tides and Storm Surge (CURReNTSS) model that predicts sea surface height, tidal currents and storm surge, and provided operational products on tidal sea level and currents in the littoral region off the south-western coast of Africa. This report summarizes the results of this collaborative effort in an actual contingency use of the relocatable model, summarizes the lessons learned, and provides recommendations for further evaluation and transition of this modeling capability to operational use.

  5. Eddy current inspection of graphite fiber components

    NASA Technical Reports Server (NTRS)

    Workman, G. L.; Bryson, C. C.

    1990-01-01

    The recognition of defects in materials properties still presents a number of problems for nondestructive testing in aerospace systems. This project attempts to utilize current capabilities in eddy current instrumentation, artificial intelligence, and robotics in order to provide insight into defining geometrical aspects of flaws in composite materials which are capable of being evaluated using eddy current inspection techniques. The unique capabilities of E-probes and horseshoe probes for inspecting graphite fiber materials were evaluated and appear to hold great promise once the technology development matures. The initial results of modeling eddy current interactions with certain flaws in graphite fiber samples are described.

  6. Urban public transit systems modeling capabilities

    DOT National Transportation Integrated Search

    1995-02-01

    Current national transportation policy places increasing emphasis on multi-modal solutions involving public transit and high-occupancy vehicle (HOV) facilities and services. Current traffic simulation/assignment models, however, have only limit...

  7. Microgrid Design Toolkit (MDT) User Guide Software v1.2.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eddy, John P.

    2017-08-01

    The Microgrid Design Toolkit (MDT) supports decision analysis for new ("greenfield") microgrid designs as well as microgrids with existing infrastructure. The current version of MDT includes two main capabilities. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new, grid connected microgrid in the early stages of the design process. MSC is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on designing a microgrid for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and the Performance Reliability Model (PRM).

  8. Simulation study of a new inverse-pinch high Coulomb transfer switch

    NASA Technical Reports Server (NTRS)

    Choi, S. H.

    1984-01-01

    A simulation study of a simplified model of a high coulomb transfer switch is performed. The switch operates in an inverse pinch geometry formed by an all metal chamber, which greatly reduces hot spot formation on the electrode surfaces. Advantages of the switch over conventional switches are longer useful life, higher current capability, and lower inductance, which improve the characteristics required for a high repetition rate switch. The simulation determines the design parameters by analytical computations and comparison with the experimentally measured risetime, current handling capability, electrode damage, and hold-off voltages. The parameters of the initial switch design can be determined for the anticipated switch performance. The results are in agreement with the experimental results. Although the model is simplified, the switch characteristics such as risetime, current handling capability, electrode damage, and hold-off voltages are accurately determined.

  9. Off-Gas Adsorption Model Capabilities and Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, Kevin L.; Welty, Amy K.; Law, Jack

    2016-03-01

    Off-gas treatment is required to reduce emissions from aqueous fuel reprocessing. Evaluating the products of innovative gas adsorption research requires increased computational simulation capability to more effectively transition from fundamental research to operational design. Early modeling efforts produced the Off-Gas SeParation and REcoverY (OSPREY) model that, while efficient in terms of computation time, was of limited value for complex systems. However, the computational and programming lessons learned in development of the initial model were used to develop Discontinuous Galerkin OSPREY (DGOSPREY), a more effective model. Initial comparisons between OSPREY and DGOSPREY show that, while OSPREY does reasonably well to capture the initial breakthrough time, it displays far too much numerical dispersion to accurately capture the real shape of the breakthrough curves. DGOSPREY is a much better tool as it utilizes a more stable set of numerical methods. In addition, DGOSPREY has shown the capability to capture complex, multispecies adsorption behavior, while OSPREY currently only works for a single adsorbing species. This capability makes DGOSPREY ultimately a more practical tool for real world simulations involving many different gas species. While DGOSPREY has initially performed very well, there is still need for improvement. The current state of DGOSPREY does not include any micro-scale adsorption kinetics and therefore assumes instantaneous adsorption. This is a major source of error in predicting water vapor breakthrough because the kinetics of that adsorption mechanism is particularly slow. However, this deficiency can be remedied by building kinetic kernels into DGOSPREY. Another source of error in DGOSPREY stems from data gaps in single species, such as Kr and Xe, isotherms. Since isotherm data for each gas is currently available at a single temperature, the model is unable to predict adsorption at temperatures outside of the set of data currently available. Thus, in order to improve the predictive capabilities of the model, there is a need for more single-species adsorption isotherms at different temperatures, in addition to extending the model to include adsorption kinetics. This report provides background information about the modeling process and a path forward for further model improvement in terms of accuracy and user interface.
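
    As context for the breakthrough-curve discussion above, a fixed-bed adsorption column is commonly described by a one-dimensional advection-dispersion balance coupled to a linear-driving-force (LDF) uptake law. The form below is a generic textbook sketch under isothermal, single-species assumptions, not the OSPREY or DGOSPREY equation set itself:

    $$
    \frac{\partial C}{\partial t} + v\,\frac{\partial C}{\partial z}
    = D_L\,\frac{\partial^2 C}{\partial z^2}
    - \frac{1-\varepsilon}{\varepsilon}\,\rho_p\,\frac{\partial q}{\partial t},
    \qquad
    \frac{\partial q}{\partial t} = k_{\mathrm{LDF}}\,\bigl(q^{*}(C) - q\bigr),
    $$

    where C is the gas-phase concentration, q the adsorbed loading, q*(C) the isotherm, v the interstitial velocity, D_L the axial dispersion coefficient, ε the bed voidage, ρ_p the particle density, and k_LDF a lumped mass-transfer coefficient. Assuming instantaneous adsorption, as the report notes the current DGOSPREY does, corresponds to letting k_LDF grow without bound so that q tracks q*(C).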

  10. A Conceptual Measurement Model for eHealth Readiness: a Team Based Perspective

    PubMed Central

    Phillips, James; Poon, Simon K.; Yu, Dan; Lam, Mary; Hines, Monique; Brunner, Melissa; Power, Emma; Keep, Melanie; Shaw, Tim; Togher, Leanne

    2017-01-01

    Despite the shift towards collaborative healthcare and the increase in the use of eHealth technologies, there does not currently exist a model for the measurement of eHealth readiness in interdisciplinary healthcare teams. This research aims to address this gap in the literature through the development of a three phase methodology incorporating qualitative and quantitative methods. We propose a conceptual measurement model consisting of operationalized themes affecting readiness across four factors: (i) Organizational Capabilities, (ii) Team Capabilities, (iii) Patient Capabilities, and (iv) Technology Capabilities. The creation of this model will allow for the measurement of the readiness of interdisciplinary healthcare teams to use eHealth technologies to improve patient outcomes. PMID:29854207

  11. Updraft Fixed Bed Gasification Aspen Plus Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2007-09-27

    The updraft fixed bed gasification model provides predictive modeling capabilities for updraft fixed bed gasifiers, when devolatilization data is available. The fixed bed model is constructed using Aspen Plus, process modeling software, coupled with a FORTRAN user kinetic subroutine. Current updraft gasification models created in Aspen Plus have limited predictive capabilities and must be "tuned" to reflect a generalized gas composition as specified in literature or by the gasifier manufacturer. This limits the applicability of the process model.

  12. The Ensemble Space Weather Modeling System (eSWMS): Status, Capabilities and Challenges

    NASA Astrophysics Data System (ADS)

    Fry, C. D.; Eccles, J. V.; Reich, J. P.

    2010-12-01

    Marking a milestone in space weather forecasting, the Space Weather Modeling System (SWMS) successfully completed validation testing in advance of operational testing at Air Force Weather Agency’s primary space weather production center. This is the first coupling of stand-alone, physics-based space weather models that are currently in operations at AFWA supporting the warfighter. Significant development effort went into ensuring the component models were portable and scalable while maintaining consistent results across diverse high performance computing platforms. Coupling was accomplished under the Earth System Modeling Framework (ESMF). The coupled space weather models are the Hakamada-Akasofu-Fry version 2 (HAFv2) solar wind model and GAIM1, the ionospheric forecast component of the Global Assimilation of Ionospheric Measurements (GAIM) model. The SWMS was developed by team members from AFWA, Explorations Physics International, Inc. (EXPI) and Space Environment Corporation (SEC). The successful development of the SWMS provides new capabilities beyond enabling extended lead-time, data-driven ionospheric forecasts. These include ingesting diverse data sets at higher resolution, incorporating denser computational grids at finer time steps, and performing probability-based ensemble forecasts. Work of the SWMS development team now focuses on implementing the ensemble-based probability forecast capability by feeding multiple scenarios of 5 days of solar wind forecasts to the GAIM1 model based on the variation of the input fields to the HAFv2 model. The ensemble SWMS (eSWMS) will provide the most-likely space weather scenario with uncertainty estimates for important forecast fields. The eSWMS will allow DoD mission planners to consider the effects of space weather on their systems with more advance warning than is currently possible. The payoff is enhanced, tailored support to the warfighter with improved capabilities, such as point-to-point HF propagation forecasts, single-frequency GPS error corrections, and high cadence, high-resolution Space Situational Awareness (SSA) products. We present the current status of eSWMS, its capabilities, limitations and path of transition to operational use.

  13. Health Capability: Conceptualization and Operationalization

    PubMed Central

    2010-01-01

    Current theoretical approaches to bioethics and public health ethics propose varied justifications as the basis for health care and public health, yet none captures a fundamental reality: people seek good health and the ability to pursue it. Existing models do not effectively address these twin goals. The approach I espouse captures both of these orientations through a concept here called health capability. Conceptually, health capability illuminates the conditions that affect health and one's ability to make health choices. By respecting the health consequences individuals face and their health agency, health capability offers promise for finding a balance between paternalism and autonomy. I offer a conceptual model of health capability and present a health capability profile to identify and address health capability gaps. PMID:19965570

  14. Mixed Phase Modeling in GlennICE with Application to Engine Icing

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Jorgenson, Philip C. E.; Veres, Joseph P.

    2011-01-01

    A capability for modeling ice crystals and mixed phase icing has been added to GlennICE. Modifications have been made to the particle trajectory algorithm and energy balance to model this behavior. This capability has been added as part of a larger effort to model ice crystal ingestion in aircraft engines. Comparisons have been made to four mixed phase ice accretions performed in the Cox icing tunnel in order to calibrate an ice erosion model. A sample ice ingestion case was performed using the Energy Efficient Engine (E3) model in order to illustrate current capabilities. Engine performance characteristics were supplied using the Numerical Propulsion System Simulation (NPSS) model for this test case.

  15. Developments in Coastal Ocean Modeling

    NASA Astrophysics Data System (ADS)

    Allen, J. S.

    2001-12-01

    Capabilities in modeling continental shelf flow fields have improved markedly in the last several years. Progress is being made toward the long term scientific goal of utilizing numerical circulation models to interpolate, or extrapolate, necessarily limited field measurements to provide additional full-field information describing the behavior of, and providing dynamical rationalizations for, complex observed coastal flow. The improvement in modeling capabilities has been due to several factors including an increase in computer power and, importantly, an increase in experience of modelers in formulating relevant numerical experiments and in analyzing model results. We demonstrate present modeling capabilities and limitations by discussion of results from recent studies of shelf circulation off Oregon and northern California (joint work with Newberger, Gan, Oke, Pullen, and Wijesekera). Strong interactions between wind-forced coastal currents and continental shelf topography characterize the flow regimes in these cases. Favorable comparisons of model and measured alongshore currents and other variables provide confidence in the model-produced fields. The dependence of the mesoscale circulation, including upwelling and downwelling fronts and flow instabilities, on the submodel used to parameterize the effects of small scale turbulence, is discussed. Analyses of model results to provide explanations for the observed, but previously unexplained, alongshore variability in the intensity of coastal upwelling, which typically results in colder surface water south of capes, and the observed development in some locations of northward currents near the coast in response to the relaxation of southward winds, are presented.

  16. Combustion system CFD modeling at GE Aircraft Engines

    NASA Technical Reports Server (NTRS)

    Burrus, D.; Mongia, H.; Tolpadi, Anil K.; Correa, S.; Braaten, M.

    1995-01-01

    This viewgraph presentation discusses key features of current combustion system CFD modeling capabilities at GE Aircraft Engines provided by the CONCERT code; CONCERT development history; modeling applied for designing engine combustion systems; modeling applied to improve fundamental understanding; CONCERT3D results for current production combustors; CONCERT3D model of NASA/GE E3 combustor; HYBRID CONCERT CFD/Monte-Carlo modeling approach; and future modeling directions.

  17. Combustion system CFD modeling at GE Aircraft Engines

    NASA Astrophysics Data System (ADS)

    Burrus, D.; Mongia, H.; Tolpadi, Anil K.; Correa, S.; Braaten, M.

    1995-03-01

    This viewgraph presentation discusses key features of current combustion system CFD modeling capabilities at GE Aircraft Engines provided by the CONCERT code; CONCERT development history; modeling applied for designing engine combustion systems; modeling applied to improve fundamental understanding; CONCERT3D results for current production combustors; CONCERT3D model of NASA/GE E3 combustor; HYBRID CONCERT CFD/Monte-Carlo modeling approach; and future modeling directions.

  18. A Capital or Capabilities Education Narrative in a World of Staggering Inequalities?

    ERIC Educational Resources Information Center

    Walker, Melanie

    2012-01-01

    In a world of tremendous inequalities, this paper explores two contrasting normative models for education policy, and the relationship of each to policy, practices and outcomes that can improve lives by reducing injustice and building societies which value capabilities for all. The first model is that of human capital which currently dominates…

  19. Space shuttle hypergolic bipropellant RCS engine design study, Bell model 8701

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A research program was conducted to define the level of the current technology base for reaction control system rocket engines suitable for space shuttle applications. The project consisted of engine analyses, design, fabrication, and tests. The specific objectives are: (1) extrapolating current engine design experience to design of an RCS engine with required safety, reliability, performance, and operational capability, (2) demonstration of multiple reuse capability, and (3) identification of current design and technology deficiencies and critical areas for future effort.

  20. User's instructions for the GE cardiovascular model to simulate LBNP and tilt experiments, with graphic capabilities

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The present form of this cardiovascular model simulates both 1-g and zero-g LBNP (lower body negative pressure) experiments and tilt experiments. In addition, the model simulates LBNP experiments at any body angle. The model is currently accessible on the Univac 1110 Time-Shared System in an interactive operational mode. Model output may be in tabular form and/or graphic form. The graphic capabilities are programmed for the Tektronix 4010 graphics terminal and the Univac 1110.

  1. Nonlinear Simulation of DIII-D Plasma and Poloidal Systems Using DINA and Simulink

    NASA Astrophysics Data System (ADS)

    Walker, M. L.; Leuer, J. A.; Deranian, R. D.; Humphreys, D. A.; Khayrutdinov, R. R.

    2002-11-01

    Hardware-in-the-loop simulation capability was developed previously for poloidal shape control testing using Matlab Simulink [1]. This has been upgraded by replacing a linearized plasma model with the DINA nonlinear plasma evolution code [2]. In addition to its use for shape control studies, this new capability will allow study of current profile control using the DINA model of electron cyclotron current drive (ECCD) and current profile information soon to be available from the Plasma Control System (PCS) real time EFIT [3] calculation. We describe the incorporation of DINA into the Simulink DIII-D tokamak systems model and results of validating this combined model against DIII-D data. [1] J.A. Leuer, et al., 18th IEEE/NPSS SOFE (1999), p. 531. [2] R.R. Khayrutdinov, V.E. Lukash, J. Comput. Phys. 109, 193 (1993). [3] J.R. Ferron, et al., Nucl. Fusion 38, 1055 (1998).

  2. On the Conditioning of Machine-Learning-Assisted Turbulence Modeling

    NASA Astrophysics Data System (ADS)

    Wu, Jinlong; Sun, Rui; Wang, Qiqi; Xiao, Heng

    2017-11-01

    Recently, several researchers have demonstrated that machine learning techniques can be used to improve the RANS-modeled Reynolds stress by training on available databases of high-fidelity simulations. However, obtaining an improved mean velocity field remains an unsolved challenge, restricting the predictive capability of current machine-learning-assisted turbulence modeling approaches. In this work we define a condition number to evaluate the model conditioning of data-driven turbulence modeling approaches, and propose a stability-oriented machine learning framework to model the Reynolds stress. Two canonical flows, the flow in a square duct and the flow over periodic hills, are investigated to demonstrate the predictive capability of the proposed framework. The satisfactory prediction of the mean velocity field for both flows demonstrates the predictive capability of the proposed framework for machine-learning-assisted turbulence modeling. By demonstrating improved prediction of the mean flow field, the proposed stability-oriented machine learning framework bridges the gap between existing machine-learning-assisted turbulence modeling approaches and the predictive capability demanded of turbulence models in real applications.
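
    For readers unfamiliar with the data-driven approach summarized above, the sketch below shows the generic training step: regressing a Reynolds-stress discrepancy against mean-flow features with an off-the-shelf learner. It is a minimal illustration only; the features and synthetic data are hypothetical, and it does not implement the authors' condition number or stability-oriented framework.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    # Hypothetical mean-flow features at N cells (e.g., strain-rate invariants,
    # wall-distance-based Reynolds number) and a Reynolds-stress discrepancy
    # target that would normally come from a high-fidelity (DNS/LES) database.
    N = 2000
    features = rng.normal(size=(N, 3))                 # stand-in feature vectors
    target = (0.5 * features[:, 0]
              - 0.2 * features[:, 1] ** 2
              + 0.05 * rng.normal(size=N))             # stand-in discrepancy

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(features, target)

    # At prediction time, the learned discrepancy would be added to the baseline
    # RANS Reynolds stress before solving for the mean velocity field.
    new_features = rng.normal(size=(10, 3))
    predicted_discrepancy = model.predict(new_features)
    print(predicted_discrepancy.shape)  # (10,)
    ```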

  3. Models for mapping potential habitat at landscape scales: an example using northern spotted owls.

    Treesearch

    William C. McComb; Michael T. McGrath; Thomas A. Spies; David Vesely

    2002-01-01

    We are assessing the potential for current and alternative policies in the Oregon Coast Range to affect habitat capability for a suite of forest resources. We provide an example of a spatially explicit habitat capability model for northern spotted owls (Strix occidentalis caurina) to illustrate the approach we are taking to assess potential changes...

  4. Comparison of dark energy models after Planck 2015

    NASA Astrophysics Data System (ADS)

    Xu, Yue-Yao; Zhang, Xin

    2016-11-01

    We make a comparison for ten typical, popular dark energy models according to their capabilities of fitting the current observational data. The observational data we use in this work include the JLA sample of type Ia supernovae observation, the Planck 2015 distance priors of cosmic microwave background observation, the baryon acoustic oscillations measurements, and the direct measurement of the Hubble constant. Since the models have different numbers of parameters, in order to make a fair comparison, we employ the Akaike and Bayesian information criteria to assess the worth of the models. The analysis results show that, according to the capability of explaining observations, the cosmological constant model is still the best one among all the dark energy models. The generalized Chaplygin gas model, the constant w model, and the α dark energy model are worse than the cosmological constant model, but still are good models compared to others. The holographic dark energy model, the new generalized Chaplygin gas model, and the Chevallier-Polarski-Linder model can still fit the current observations well, but from an economically feasible perspective, they are not so good. The new agegraphic dark energy model, the Dvali-Gabadadze-Porrati model, and the Ricci dark energy model are excluded by the current observations.
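
    Because the ten models carry different numbers of free parameters, the comparison above relies on information criteria that penalize model complexity. For reference, the standard definitions used in this kind of analysis are

    $$
    \mathrm{AIC} = \chi^2_{\min} + 2k,
    \qquad
    \mathrm{BIC} = \chi^2_{\min} + k \ln N,
    $$

    where χ²_min is the best-fit chi-square of the model, k the number of free parameters, and N the number of data points; models with larger ΔAIC and ΔBIC relative to the reference model (here the cosmological constant model) are less favored by the data.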

  5. Integrated Medical Model (IMM) 4.0 Verification and Validation (VV) Testing (HRP IWS 2016)

    NASA Technical Reports Server (NTRS)

    Walton, M; Kerstman, E.; Arellano, J.; Boley, L.; Reyes, D.; Young, M.; Garcia, Y.; Saile, L.; Myers, J.

    2016-01-01

    Timeline, partial treatment, and alternate medications were added to the IMM to improve the fidelity of this model to enhance decision support capabilities. Using standard design reference missions, IMM VV testing compared outputs from the current operational IMM (v3) with those from the model with added functionalities (v4). These new capabilities were examined in a comparative, stepwise approach as follows: a) comparison of the current operational IMM v3 with the enhanced functionality of timeline alone (IMM 4.T), b) comparison of IMM 4.T with the timeline and partial treatment (IMM 4.TPT), and c) comparison of IMM 4.TPT with timeline, partial treatment and alternative medication (IMM 4.0).

  6. AIR QUALITY MODELING OF PM AND AIR TOXICS AT NEIGHBORHOOD SCALES

    EPA Science Inventory

    The current interest in fine particles and toxic pollutants provides an impetus for extending air quality modeling capability towards improving exposure modeling and assessments. Human exposure models require information on concentration derived from interpolation of observati...

  7. Modified Johnson-Cook model incorporated with electroplasticity for uniaxial tension under a pulsed electric current

    NASA Astrophysics Data System (ADS)

    Kim, Moon-Jo; Jeong, Hye-Jin; Park, Ju-Won; Hong, Sung-Tae; Han, Heung Nam

    2018-01-01

    An empirical expression describing the electroplastic deformation behavior is suggested based on the Johnson-Cook (JC) model by adding several functions to consider both thermal and athermal electric current effects. Tensile deformation experiments are carried out for an AZ31 magnesium alloy and an Al-Mg-Si alloy under pulsed electric current at various current densities with a fixed duration of electric current. To describe the flow curves under electric current, a modified JC model is proposed to take the electric current effect into account. Phenomenological descriptions of the adopted parameters in the equation are made. The modified JC model suggested in the present study is capable of describing the tensile deformation behaviors under pulsed electric current reasonably well.
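
    For orientation, the unmodified Johnson-Cook flow stress that the paper builds on has the standard form below; the paper's electroplastic extension introduces additional current-dependent functions (thermal and athermal), which are not reproduced here:

    $$
    \sigma = \left(A + B\,\varepsilon_p^{\,n}\right)
    \left(1 + C \ln \frac{\dot{\varepsilon}}{\dot{\varepsilon}_0}\right)
    \left(1 - T^{*m}\right),
    \qquad
    T^{*} = \frac{T - T_{\mathrm{ref}}}{T_{\mathrm{melt}} - T_{\mathrm{ref}}},
    $$

    where A, B, n, C, and m are material constants, ε_p is the equivalent plastic strain, ε̇/ε̇₀ the normalized strain rate, and T* the homologous temperature.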

  8. A review of the ionospheric model for the long wave prediction capability

    NASA Astrophysics Data System (ADS)

    Ferguson, J. A.

    1992-11-01

    The Naval Command, Control, and Ocean Surveillance Center's Long Wave Prediction Capability (LWPC) has a built-in ionospheric model. The latter was defined after a review of the literature comparing measurements with calculations. Subsequent to this original specification of the ionospheric model in the LWPC, a new collection of data were obtained and analyzed. The new data were collected aboard a merchant ship named the Callaghan during a series of trans-Atlantic trips over a period of a year. This report presents a detailed analysis of the ionospheric model currently in use by the LWPC and the new model suggested by the shipboard measurements. We conclude that, although the fits to measurements are almost the same between the two models examined, the current LWPC model should be used because it is better than the new model for nighttime conditions at long ranges. This conclusion supports the primary use of the LWPC model for coverage assessment that requires a valid model at the limits of a transmitter's reception.

  9. Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian

    2011-01-01

    Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.

  10. Life modeling of thermal barrier coatings for aircraft gas turbine engines

    NASA Technical Reports Server (NTRS)

    Miller, Robert A.

    1988-01-01

    Thermal barrier coating life models developed under the NASA Lewis Research Center's Hot Section Technology (HOST) program are summarized. An initial laboratory model and three design-capable models are discussed. Current understanding of coating failure mechanisms are also summarized.

  11. Past, Present, and Future Capabilities of the Transonic Dynamics Tunnel from an Aeroelasticity Perspective

    NASA Technical Reports Server (NTRS)

    Cole, Stanley R.; Garcia, Jerry L.

    2000-01-01

    The NASA Langley Transonic Dynamics Tunnel (TDT) has provided a unique capability for aeroelastic testing for forty years. The facility has a rich history of significant contributions to the design of many United States commercial transports, military aircraft, launch vehicles, and spacecraft. The facility has many features that contribute to its uniqueness for aeroelasticity testing, perhaps the most important feature being the use of a heavy gas test medium to achieve higher test densities. Higher test medium densities substantially improve model-building requirements and therefore simplify the fabrication process for building aeroelastically scaled wind tunnel models. Aeroelastic scaling for the heavy gas results in lower model structural frequencies. Lower model frequencies tend to make aeroelastic testing safer. This paper will describe major developments in the testing capabilities at the TDT throughout its history, the current status of the facility, and planned additions and improvements to its capabilities in the near future.

  12. Transitioning Human, Social, Cultural Behavior (HSCB) Models and Simulations to the Operational User1

    DTIC Science & Technology

    2009-10-01

    ...current M&S covering support to operations, representation of human behavior, asymmetric warfare, defense against terrorism and ... methods, tools, data, intellectual capital, and processes to address these capability requirements. Fourth, there is a need to compare capability ... requirements to current capabilities to identify gaps that may be addressed with DoD HSCB methods, tools, data, intellectual capital, and process...

  13. A spatiotemporal data model for incorporating time in geographic information systems (GEN-STGIS)

    NASA Astrophysics Data System (ADS)

    Narciso, Flor Eugenia

    Temporal Geographic Information Systems (TGIS) is a new technology, which is being developed to work with Geographic Information Systems (GIS) that deal with geographic phenomena that change over time. The capabilities of TGIS depend on the underlying data model. However, a literature review of current spatiotemporal GIS data models has shown that they are not adequate for managing time when representing temporal data. In addition, the majority of these data models have been designed to support the requirements of specific-purpose applications. In an effort to resolve this problem, the related literature has been explored. A comparative investigation of the current spatiotemporal GIS data models has been made to identify their characteristics, advantages and disadvantages, similarities and differences, and to determine why they do not work adequately. A new object-oriented General-purpose Spatiotemporal GIS (GEN-STGIS) data model is proposed here. This model provides better representation, storage and management of data related to geographic phenomena that change over time and overcomes some of the problems detected in the reviewed data models. The proposed data model has four key benefits. First, it provides the capabilities of a standard vector-based GIS embedded in the 2-D Euclidean space. Second, it includes the two temporal dimensions, valid time and transaction time, supported by temporal databases. Third, it inherits, from the object oriented approach, the flexibility, modularity and ability to handle the complexities introduced by spatial and temporal dimensions. Fourth, it improves the geographic query capabilities of current TGIS with the introduction of the concept of bounding box while providing temporal and spatiotemporal query capabilities. The data model is then evaluated in order to assess its strengths and weaknesses as a spatiotemporal GIS data model, and to determine how well the model satisfies the requirements imposed by TGIS applications. The practicality of the data model is demonstrated by the creation of a TGIS example and the partial implementation of the model using the POET Java software for developing the object-oriented database.
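
    To make the bitemporal idea above concrete, the sketch below shows one possible representation of a feature carrying both valid time and transaction time together with a bounding-box query. The class and method names are hypothetical illustrations, not part of GEN-STGIS.

    ```python
    from dataclasses import dataclass
    from datetime import datetime
    from typing import List, Optional, Tuple

    BBox = Tuple[float, float, float, float]  # (min_x, min_y, max_x, max_y)


    @dataclass
    class STFeature:
        geometry_bbox: BBox          # 2-D Euclidean extent of the feature
        valid_from: datetime         # when the fact becomes true in the real world
        valid_to: Optional[datetime] # None = still valid
        tx_from: datetime            # when the fact was recorded in the database
        tx_to: Optional[datetime]    # None = current database version


    def _overlaps(a: BBox, b: BBox) -> bool:
        # Axis-aligned bounding-box intersection test.
        return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])


    def query(features: List[STFeature], bbox: BBox,
              valid_at: datetime, tx_at: datetime) -> List[STFeature]:
        """Spatiotemporal query: features inside bbox that were valid in the real
        world at `valid_at`, according to the database state recorded at `tx_at`."""
        results = []
        for f in features:
            spatial = _overlaps(f.geometry_bbox, bbox)
            valid = f.valid_from <= valid_at and (f.valid_to is None or valid_at < f.valid_to)
            tx = f.tx_from <= tx_at and (f.tx_to is None or tx_at < f.tx_to)
            if spatial and valid and tx:
                results.append(f)
        return results
    ```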

  14. Advanced simulation noise model for modern fighter aircraft

    NASA Astrophysics Data System (ADS)

    Ikelheimer, Bruce

    2005-09-01

    NoiseMap currently represents the state of the art for military airfield noise analysis. While this model is sufficient for the current fleet of aircraft, it has limits in its capability to model the new generation of fighter aircraft like the JSF and the F-22. These aircraft's high-powered engines produce noise with significant nonlinear content. Combining this with their ability to vector the thrust means they have noise characteristics that are outside of the basic modeling assumptions of the currently available noise models. Wyle Laboratories, Penn State University, and University of Alabama are in the process of developing a new noise propagation model for the Strategic Environmental Research and Development Program. Source characterization will be through complete spheres (or hemispheres if there is not sufficient data) for each aircraft state (including thrust vector angles). Fixed and rotor wing aircraft will be included. Broadband, narrowband, and pure tone propagation will be included. The model will account for complex terrain and weather effects, as well as the effects of nonlinear propagation. It will be a complete model capable of handling a range of noise sources from small subsonic general aviation aircraft to the latest fighter aircraft like the JSF.

  15. Finite element modelling of crash response of composite aerospace sub-floor structures

    NASA Astrophysics Data System (ADS)

    McCarthy, M. A.; Harte, C. G.; Wiggenraad, J. F. M.; Michielsen, A. L. P. J.; Kohlgrüber, D.; Kamoulakos, A.

    Composite energy-absorbing structures for use in aircraft are being studied within a European Commission research programme (CRASURV - Design for Crash Survivability). One of the aims of the project is to evaluate the current capabilities of crashworthiness simulation codes for composites modelling. This paper focuses on the computational analysis using explicit finite element analysis, of a number of quasi-static and dynamic tests carried out within the programme. It describes the design of the structures, the analysis techniques used, and the results of the analyses in comparison to the experimental test results. It has been found that current multi-ply shell models are capable of modelling the main energy-absorbing processes at work in such structures. However some deficiencies exist, particularly in modelling fabric composites. Developments within the finite element code are taking place as a result of this work which will enable better representation of composite fabrics.

  16. Best practices for evaluating the capability of nondestructive evaluation (NDE) and structural health monitoring (SHM) techniques for damage characterization

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Annis, Charles; Sabbagh, Harold A.; Lindgren, Eric A.

    2016-02-01

    A comprehensive approach to NDE and SHM characterization error (CE) evaluation is presented that follows the framework of the 'ahat-versus-a' regression analysis for POD assessment. Characterization capability evaluation is typically more complex with respect to current POD evaluations and thus requires engineering and statistical expertise in the model-building process to ensure all key effects and interactions are addressed. Justifying the statistical model choice with underlying assumptions is key. Several sizing case studies are presented with detailed evaluations of the most appropriate statistical model for each data set. The use of a model-assisted approach is introduced to help assess the reliability of NDE and SHM characterization capability under a wide range of part, environmental and damage conditions. Best practices of using models are presented for both an eddy current NDE sizing and vibration-based SHM case studies. The results of these studies highlight the general protocol feasibility, emphasize the importance of evaluating key application characteristics prior to the study, and demonstrate an approach to quantify the role of varying SHM sensor durability and environmental conditions on characterization performance.
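
    As a minimal numerical illustration of the 'ahat-versus-a' idea described above, the sketch below fits a linear regression of measured size against true size on synthetic data and reports the residual scatter that a characterization-error study would model. The data and coefficients are hypothetical, and this is not the authors' protocol.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical true crack sizes (mm) and corresponding measured sizes.
    a_true = rng.uniform(0.5, 5.0, size=60)
    a_hat = 0.2 + 0.9 * a_true + rng.normal(scale=0.25, size=a_true.size)

    # Ordinary least squares fit: a_hat = b0 + b1 * a_true + error
    b1, b0 = np.polyfit(a_true, a_hat, deg=1)
    residuals = a_hat - (b0 + b1 * a_true)
    sigma = residuals.std(ddof=2)  # residual standard deviation of the sizing error

    print(f"intercept={b0:.3f}, slope={b1:.3f}, residual sigma={sigma:.3f} mm")
    ```

    In a POD-style assessment the fitted slope, intercept, and residual scatter, together with a decision threshold, feed the detection and sizing-error statements; the characterization-error evaluation described in the abstract generalizes this to richer statistical models.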

  17. Thermal Effects Modeling Developed for Smart Structures

    NASA Technical Reports Server (NTRS)

    Lee, Ho-Jun

    1998-01-01

    Applying smart materials in aeropropulsion systems may improve the performance of aircraft engines through a variety of vibration, noise, and shape-control applications. To facilitate the experimental characterization of these smart structures, researchers have been focusing on developing analytical models to account for the coupled mechanical, electrical, and thermal response of these materials. One focus of current research efforts has been directed toward incorporating a comprehensive thermal analysis modeling capability. Typically, temperature affects the behavior of smart materials by three distinct mechanisms: (1) induction of thermal strains because of coefficient of thermal expansion mismatch; (2) pyroelectric effects on the piezoelectric elements; and (3) temperature-dependent changes in material properties. Previous analytical models only investigated the first two thermal effects mechanisms. However, since the material properties of piezoelectric materials generally vary greatly with temperature, incorporating temperature-dependent material properties will significantly affect the structural deflections, sensory voltages, and stresses. Thus, the current analytical model captures thermal effects arising from all three mechanisms through thermopiezoelectric constitutive equations. These constitutive equations were incorporated into a layerwise laminate theory with the inherent capability to model both the active and sensory response of smart structures in thermal environments. Corresponding finite element equations were formulated and implemented for both the beam and plate elements to provide a comprehensive thermal effects modeling capability.
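
    For reference, one common linear form of the coupled thermopiezoelectric constitutive equations referred to above is sketched below; sign conventions and symbols vary between references, so treat this as indicative rather than the exact equations used in the NASA model:

    $$
    \{\sigma\} = [c]\{\varepsilon\} - [e]^{T}\{E\} - \{\lambda\}\,\theta,
    \qquad
    \{D\} = [e]\{\varepsilon\} + [\epsilon]\{E\} + \{p\}\,\theta,
    $$

    where {σ} and {ε} are stress and strain, {E} and {D} the electric field and electric displacement, θ the temperature rise, [c] the elastic stiffness, [e] the piezoelectric coefficients, [ε] the permittivity, {λ} the thermal stress coefficients, and {p} the pyroelectric constants. Making [c], [e], and [ε] functions of temperature captures the third mechanism listed above.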

  18. Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide

    NASA Astrophysics Data System (ADS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.

    1993-03-01

    A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960's and early 1970's and are highly unique to that design or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS; presents an example problem; and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.

  19. Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide

    NASA Technical Reports Server (NTRS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.

    1993-01-01

    A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960's and early 1970's and are highly unique to that design or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS; presents an example problem; and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.

  20. Aviation Safety Program Atmospheric Environment Safety Technologies (AEST) Project

    NASA Technical Reports Server (NTRS)

    Colantonio, Ron

    2011-01-01

    Engine Icing: Characterization and Simulation Capability: Develop knowledge bases, analysis methods, and simulation tools needed to address the problem of engine icing; in particular, ice-crystal icing. Airframe Icing Simulation and Engineering Tool Capability: Develop and demonstrate 3-D capability to simulate and model airframe ice accretion and related aerodynamic performance degradation for current and future aircraft configurations in an expanded icing environment that includes freezing drizzle/rain. Atmospheric Hazard Sensing and Mitigation Technology Capability: Improve and expand remote sensing and mitigation of hazardous atmospheric environments and phenomena.

  1. Modelling the transient behaviour of pulsed current tungsten-inert-gas weldpools

    NASA Astrophysics Data System (ADS)

    Wu, C. S.; Zheng, W.; Wu, L.

    1999-01-01

    A three-dimensional model is established to simulate the pulsed current tungsten-inert-gas (TIG) welding process. The goal is to analyse the cyclic variation of fluid flow and heat transfer in weldpools under periodic arc heat input. To this end, an algorithm, which is capable of handling the transience, nonlinearity, multiphase and strong coupling encountered in this work, is developed. The numerical simulations demonstrate the transient behaviour of weldpools under pulsed current. Experimental data are compared with numerical results to show the effectiveness of the developed model.
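
    To make the periodic arc heat input concrete, a common way to drive such a weld-pool model is a Gaussian surface heat flux whose magnitude follows the pulsed current waveform. The form below is a generic sketch under typical assumptions, not necessarily the exact source term used by the authors:

    $$
    q(r,t) = \frac{\eta\,U\,I(t)}{2\pi r_0^{2}} \exp\!\left(-\frac{r^{2}}{2 r_0^{2}}\right),
    \qquad
    I(t) =
    \begin{cases}
    I_p, & \text{during the peak (pulse) period},\\
    I_b, & \text{during the base period},
    \end{cases}
    $$

    where η is the arc efficiency, U the arc voltage, r₀ an effective arc radius, and I_p, I_b the peak and base currents; the weld-pool flow and temperature fields then vary cyclically with this periodic input.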

  2. Venus Global Reference Atmospheric Model Status and Planned Updates

    NASA Astrophysics Data System (ADS)

    Justh, H. L.; Dwyer Cianciolo, A. M.

    2017-05-01

    Details the current status of the Venus Global Reference Atmospheric Model (Venus-GRAM). Provides new sources of data and upgrades that need to be incorporated to maintain credibility, and identifies options and features that could increase capability.

  3. Unified Deep Learning Architecture for Modeling Biology Sequence.

    PubMed

    Wu, Hongjie; Cao, Chengyuan; Xia, Xiaoyan; Lu, Qiang

    2017-10-09

    Prediction of the spatial structure or function of biological macromolecules based on their sequence remains an important challenge in bioinformatics. When modeling biological sequences using traditional sequencing models, characteristics, such as long-range interactions between basic units, the complicated and variable output of labeled structures, and the variable length of biological sequences, usually lead to different solutions on a case-by-case basis. This study proposed the use of bidirectional recurrent neural networks based on long short-term memory or a gated recurrent unit to capture long-range interactions by designing the optional reshape operator to adapt to the diversity of the output labels and implementing a training algorithm to support the training of sequence models capable of processing variable-length sequences. Additionally, the merge and pooling operators enhanced the ability to capture short-range interactions between basic units of biological sequences. The proposed deep-learning model and its training algorithm might be capable of solving currently known biological sequence-modeling problems through the use of a unified framework. We validated our model on one of the most difficult biological sequence-modeling problems currently known, with our results indicating the ability of the model to obtain predictions of protein residue interactions that exceeded the accuracy of current popular approaches by 10% based on multiple benchmarks.
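
    As a minimal, generic sketch of the architecture family described above (bidirectional recurrence over variable-length sequences with per-position labels), the PyTorch snippet below packs padded sequences so that padding does not contaminate the recurrent states. Layer sizes and names are illustrative assumptions, not the authors' exact model.

    ```python
    import torch
    import torch.nn as nn
    from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence


    class BiLSTMSequenceLabeler(nn.Module):
        """Bidirectional LSTM emitting one label score vector per sequence position."""

        def __init__(self, vocab_size: int, embed_dim: int, hidden_dim: int, num_labels: int):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
            self.out = nn.Linear(2 * hidden_dim, num_labels)

        def forward(self, tokens: torch.Tensor, lengths: torch.Tensor) -> torch.Tensor:
            x = self.embed(tokens)                                    # (batch, max_len, embed_dim)
            packed = pack_padded_sequence(x, lengths.cpu(), batch_first=True,
                                          enforce_sorted=False)       # skip padded positions
            packed_out, _ = self.lstm(packed)
            out, _ = pad_packed_sequence(packed_out, batch_first=True)
            return self.out(out)                                       # (batch, max_len, num_labels)


    # Toy usage: two sequences of different lengths, padded with index 0.
    model = BiLSTMSequenceLabeler(vocab_size=25, embed_dim=16, hidden_dim=32, num_labels=3)
    tokens = torch.tensor([[4, 7, 2, 9, 1, 3, 5],
                           [6, 2, 8, 1, 0, 0, 0]])
    lengths = torch.tensor([7, 4])
    scores = model(tokens, lengths)
    print(scores.shape)  # torch.Size([2, 7, 3])
    ```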

  4. Analytical and physical modeling program for the NASA Lewis Research Center's Altitude Wind Tunnel (AWT)

    NASA Technical Reports Server (NTRS)

    Abbott, J. M.; Deidrich, J. H.; Groeneweg, J. F.; Povinelli, L. A.; Reid, L.; Reinmann, J. J.; Szuch, J. R.

    1985-01-01

    An effort is currently underway at the NASA Lewis Research Center to rehabilitate and extend the capabilities of the Altitude Wind Tunnel (AWT). This extended capability will include a maximum test section Mach number of about 0.9 at an altitude of 55,000 ft and a -20 F stagnation temperature (octagonal test section, 20 ft across the flats). In addition, the AWT will include an icing and acoustic research capability. In order to insure a technically sound design, an AWT modeling program (both analytical and physical) was initiated to provide essential input to the AWT final design process. This paper describes the modeling program, including the rationale and criteria used in program definition, and presents some early program results.

  5. Dynamic modeling of green algae cultivation in a photobioreactor for sustainable biodiesel production.

    PubMed

    Del Rio-Chanona, Ehecatl A; Liu, Jiao; Wagner, Jonathan L; Zhang, Dongda; Meng, Yingying; Xue, Song; Shah, Nilay

    2018-02-01

    Biodiesel produced from microalgae has been extensively studied due to its potentially outstanding advantages over traditional transportation fuels. In order to facilitate its industrialization and improve the process profitability, it is vital to construct highly accurate models capable of predicting the complex behavior of the investigated biosystem for process optimization and control, which forms the current research goal. Three original contributions are described in this paper. Firstly, a dynamic model is constructed to simulate the complicated effect of light intensity, nutrient supply and light attenuation on both biomass growth and biolipid production. Secondly, chlorophyll fluorescence, an instantly measurable variable and indicator of photosynthetic activity, is embedded into the model to monitor and update model accuracy especially for the purpose of future process optimal control, and its correlation with intracellular nitrogen content is quantified, which to the best of our knowledge has never been addressed so far. Thirdly, a thorough experimental verification is conducted under different scenarios including both continuous illumination and light/dark cycle conditions to test the model's predictive capability, particularly for long-term operation, and it is concluded that the current model is characterized by a high level of predictive capability. Based on the model, the optimal light intensity for algal biomass growth and lipid synthesis is estimated. This work, therefore, paves the way for future process design and real-time optimization. © 2017 Wiley Periodicals, Inc.
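
    As background for the kind of dynamic model described above, a generic Droop-style (internal nitrogen quota) formulation with light limitation is sketched below. The symbols and functional forms are illustrative assumptions and do not reproduce the authors' calibrated model or its chlorophyll-fluorescence coupling:

    $$
    \mu = \mu_{\max}\Bigl(1 - \frac{q_0}{q}\Bigr)\frac{I}{I + K_I},
    \qquad
    \rho = \rho_{\max}\frac{N}{N + K_N},
    $$
    $$
    \frac{dX}{dt} = \mu X,
    \qquad
    \frac{dq}{dt} = \rho - \mu q,
    \qquad
    \frac{dN}{dt} = -\rho X,
    $$

    where X is the biomass concentration, q the intracellular nitrogen quota (with minimum q₀), N the external nitrogen concentration, I the local light intensity (attenuated with depth and biomass), and μ_max, ρ_max, K_I, K_N kinetic constants; lipid accumulation is typically driven by the nitrogen-starvation state implied by a low quota q.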

  6. Partial Validation of Multibody Program to Optimize Simulated Trajectories II (POST II) Parachute Simulation With Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Ben; Queen, Eric M.

    2002-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed. This capability uses the Program to Optimize Simulated Trajectories II (POST II). Previously, POST II had the ability to simulate multiple bodies without interacting forces. The current implementation is used for the simulation of parachute trajectories, in which the parachute and suspended bodies can be treated as rigid bodies. An arbitrary set of connecting lines can be included in the model and are treated as massless spring-dampers. This paper discusses details of the connection line modeling and results of several test cases used to validate the capability.
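
    As a point of reference for the massless spring-damper connection lines mentioned above, a typical tension-only line force can be written as below; the exact formulation used in POST II may differ, so this is an illustrative sketch:

    $$
    T = \max\!\bigl(0,\; k\,(\ell - \ell_0) + c\,\dot{\ell}\bigr),
    \qquad
    \mathbf{F}_1 = +\,T\,\hat{\mathbf{u}},
    \quad
    \mathbf{F}_2 = -\,T\,\hat{\mathbf{u}},
    $$

    where ℓ is the current line length, ℓ₀ its unstretched length, k and c the spring and damping coefficients, and û the unit vector from attachment point 1 toward attachment point 2; the max(0, ·) clamp reflects that a flexible line can pull the bodies together but cannot push them apart.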

  7. University of Washington/ Northwest National Marine Renewable Energy Center Tidal Current Technology Test Protocol, Instrumentation, Design Code, and Oceanographic Modeling Collaboration: Cooperative Research and Development Final Report, CRADA Number CRD-11-452

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Driscoll, Frederick R.

    The University of Washington (UW) - Northwest National Marine Renewable Energy Center (UW-NNMREC) and the National Renewable Energy Laboratory (NREL) will collaborate to advance research and development (R&D) of Marine Hydrokinetic (MHK) renewable energy technology, specifically renewable energy captured from ocean tidal currents. UW-NNMREC is endeavoring to establish infrastructure, capabilities and tools to support in-water testing of marine energy technology. NREL is leveraging its experience and capabilities in field testing of wind systems to develop protocols and instrumentation to advance field testing of MHK systems. Under this work, UW-NNMREC and NREL will work together to develop a common instrumentation system and testing methodologies, standards and protocols. UW-NNMREC is also establishing simulation capabilities for MHK turbine and turbine arrays. NREL has extensive experience in wind turbine array modeling and is developing several computer based numerical simulation capabilities for MHK systems. Under this CRADA, UW-NNMREC and NREL will work together to augment single device and array modeling codes. As part of this effort UW-NNMREC will also work with NREL to run simulations on NREL's high performance computer system.

  8. Thermal Properties Measurement Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmack, Jon; Braase, Lori; Papesch, Cynthia

    2015-08-01

    The Thermal Properties Measurement Report summarizes the research, development, installation, and initial use of significant experimental thermal property characterization capabilities at the INL in FY 2015. These new capabilities were used to characterize a U3Si2 (candidate Accident Tolerant) fuel sample fabricated at the INL. The ability to perform measurements at various length scales is important and provides additional data that is not currently in the literature. However, the real value of the data will be in accomplishing a phenomenological understanding of the thermal conductivity in fuels and the ties to predictive modeling. Thus, the MARMOT advanced modeling and simulation capability was utilized to illustrate how the microstructural data can be modeled and compared with bulk characterization data. A scientific method was established for thermal property measurement capability on irradiated nuclear fuel samples, which will be installed in the Irradiated Material Characterization Laboratory (IMCL).

  9. Review of the ionospheric model for the long wave prediction capability. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferguson, J.A.

    1992-11-01

    The Naval Command, Control and Ocean Surveillance Center's Long Wave Prediction Capability (LWPC) has a built-in ionospheric model. The latter was defined after a review of the literature comparing measurements with calculations. Subsequent to this original specification of the ionospheric model in the LWPC, a new collection of data were obtained and analyzed. The new data were collected aboard a merchant ship named the Callaghan during a series of trans-Atlantic trips over a period of a year. This report presents a detailed analysis of the ionospheric model currently in use by the LWPC and the new model suggested by the shipboard measurements. We conclude that, although the fits to measurements are almost the same between the two models examined, the current LWPC model should be used because it is better than the new model for nighttime conditions at long ranges. This conclusion supports the primary use of the LWPC model for coverage assessment that requires a valid model at the limits of a transmitter's reception. Keywords: Communications, very low frequency and low frequency, high voltage, antennas, measurement.

  10. Predicting the Performance of Radiant Technologies in Attics: Reducing the Discrepancies Between Attic Specific and Whole-Building Energy Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merket, Noel D; DeGraw, Jason W; Lee, Edwin S

    The use of radiant technology in attics aims to reduce the radiation component of heat transfer between the attic floor and the roof decks, gables, and eaves. Recently, it has been shown that EnergyPlus underestimates the savings from radiant technologies in attic spaces. The aim of this study is to understand why EnergyPlus underestimates the performance of radiant technologies and to provide a solution strategy that works within the current capabilities of EnergyPlus. The analysis uses three attic energy models as a baseline for comparison with EnergyPlus. Potential reasons for the discrepancies between the attic-specific energy models and EnergyPlus are isolated and individually tested. A solution strategy is proposed using the Energy Management System (EMS) capabilities within EnergyPlus. This solution strategy produces results similar to the other attic-specific energy models. This paper shows that the current capabilities of EnergyPlus are sufficient to simulate radiant technologies in attics. The methodology showcased in this paper serves as a guide for engineers and researchers who would like to predict the performance of radiant technologies in attics using the whole-building energy software EnergyPlus.

  11. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

    This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. The report considers the following codes: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code is described separately in the following section with its current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published validations of the codes; however, the codes have been further developed to extend their capabilities.

  12. Meeting the Growing Demand for Sustainability-Focused Management Education: A Case Study of a PRME Academic Institution

    ERIC Educational Resources Information Center

    Young, Suzanne; Nagpal, Swati

    2013-01-01

    The current business landscape has created the impetus to develop management graduates with capabilities that foster responsible leadership and sustainability. Through the lens of Gitsham's 3C Model (Complexity, Context and Connection) of graduate capabilities, this paper discusses the experience of implementing the United Nations Principles for…

  13. Estimating wildfire behavior and effects

    Treesearch

    Frank A. Albini

    1976-01-01

    This paper presents a brief survey of the research literature on wildfire behavior and effects and assembles formulae and graphical computation aids based on selected theoretical and empirical models. The uses of mathematical fire behavior models are discussed, and the general capabilities and limitations of currently available models are outlined.

  14. An application of the Multi-Purpose System Simulation /MPSS/ model to the Monitor and Control Display System /MACDS/ at the National Aeronautics and Space Administration /NASA/ Goddard Space Flight Center /GSFC/

    NASA Technical Reports Server (NTRS)

    Mill, F. W.; Krebs, G. N.; Strauss, E. S.

    1976-01-01

    The Multi-Purpose System Simulator (MPSS) model was used to investigate the current and projected performance of the Monitor and Control Display System (MACDS) at the Goddard Space Flight Center in processing and displaying launch data adequately. MACDS consists of two interconnected mini-computers with associated terminal input and display output equipment and a disk-stored data base. Three configurations of MACDS were evaluated via MPSS and their performances ascertained. First, the current version of MACDS was found inadequate to handle projected launch data loads because of unacceptable data backlogging. Second, the current MACDS hardware with enhanced software was capable of handling two times the anticipated data loads. Third, an up-graded hardware ensemble combined with the enhanced software was capable of handling four times the anticipated data loads.
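
    The capacity comparison reported here (the current configuration backlogs, enhanced software handles twice the anticipated load, and upgraded hardware with enhanced software handles four times) is at heart an arrival-rate-versus-service-rate question. A minimal sketch of that style of check is given below; the message rates and configuration capacities are invented for illustration and are not taken from the MPSS study.

        # Hypothetical sketch of a launch-data backlog check in the spirit of the MPSS
        # evaluation of MACDS; all rates below are illustrative, not from the report.

        def backlog_after(arrival_rate, service_rate, seconds):
            """Unprocessed messages left after a fixed interval of constant-rate traffic."""
            backlog = 0.0
            for _ in range(seconds):
                backlog = max(0.0, backlog + arrival_rate - service_rate)
            return backlog

        BASE_LOAD = 120.0  # hypothetical launch-data messages per second
        CAPACITY = {       # hypothetical processing rates for the three configurations
            "current hardware, current software": 100.0,
            "current hardware, enhanced software": 250.0,
            "upgraded hardware, enhanced software": 500.0,
        }

        for config, rate in CAPACITY.items():
            for multiple in (1, 2, 4):
                left = backlog_after(BASE_LOAD * multiple, rate, seconds=600)
                verdict = "keeps up" if left == 0 else f"backlogs {left:.0f} messages"
                print(f"{config} at {multiple}x load: {verdict}")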

  15. Validating a Finite Element Model of a Structure Subjected to Mine Blast with Experimental Modal Analysis

    DTIC Science & Technology

    2017-11-01

    The Under-body Blast Methodology (UBM) for the Test and Evaluation (T&E) program was established to provide a capability for the US Army Test and Evaluation Command to assess the vulnerability of vehicles to under-body blast. Finite element (FE) models are part of the current UBM for T&E methodology.

  16. Developing the Practising Model in Physical Education: An Expository Outline Focusing on Movement Capability

    ERIC Educational Resources Information Center

    Barker, D. M.; Aggerholm, K.; Standal, O.; Larsson, H.

    2018-01-01

    Background: Physical educators currently have a number of pedagogical (or curricular) models at their disposal. While existing models have been well-received in educational contexts, these models seek to extend students' capacities within a limited number of "human activities" (Arendt, 1958). The activity of "human practising,"…

  17. Threat radar system simulations

    NASA Astrophysics Data System (ADS)

    Miller, L.

    The capabilities, requirements, and goals of radar emitter simulators are discussed. Simulators are used to evaluate competing receiver designs, to quantify the performance envelope of a radar system, and to model the characteristics of a transmitted signal waveform. A database of candidate threat systems is developed and, in concert with intelligence data on a given weapons system, permits upgrading simulators to new projected threat capabilities. Four currently available simulation techniques are summarized, noting the usefulness of developing modular software for fast controlled-cost upgrades of simulation capabilities.

  18. Current status of one- and two-dimensional numerical models: Successes and limitations

    NASA Technical Reports Server (NTRS)

    Schwartz, R. J.; Gray, J. L.; Lundstrom, M. S.

    1985-01-01

    The capabilities of one- and two-dimensional numerical solar cell modeling programs (SCAP1D and SCAP2D) are described. The occasions when a two-dimensional model is required are discussed. The application of the models to design, analysis, and prediction is presented, along with a discussion of problem areas for solar cell modeling.

  19. INITIAL STUDY OF HPAC MODELED DISPERSION DRIVEN BY MM5 WITH AND WITHOUT URBAN CANOPY PARAMETERIZATIONS

    EPA Science Inventory

    Improving the accuracy and capability of transport and dispersion models in urban areas is essential for current and future urban applications. These models must reflect more realistically the presence and details of urban canopy features. Such features markedly influence the flo...

  20. Reactivity Insertion Accident (RIA) Capability Status in the BISON Fuel Performance Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williamson, Richard L.; Folsom, Charles Pearson; Pastore, Giovanni

    2016-05-01

    One of the Challenge Problems being considered within CASL relates to modelling and simulation of Light Water Reactor (LWR) fuel under Reactivity Insertion Accident (RIA) conditions. BISON is the fuel performance code used within CASL for LWR fuel under both normal operating and accident conditions, and thus must be capable of addressing the RIA challenge problem. This report outlines required BISON capabilities for RIAs and describes the current status of the code. Information on recent accident capability enhancements, application of BISON to a RIA benchmark exercise, and plans for validation against RIA behavior are included.

  1. Technical Study on Improvement of Endurance Capability of Limit Short-circuit Current of Charge Control SMART Meter

    NASA Astrophysics Data System (ADS)

    Li, W. W.; Du, Z. Z.; Yuan, R. m.; Xiong, D. Z.; Shi, E. W.; Lu, G. N.; Dai, Z. Y.; Chen, X. Q.; Jiang, Z. Y.; Lv, Y. G.

    2017-10-01

    Smart meters represent the future development direction of the energy-saving smart grid. The load switch, one of the core parts of the smart meter, must provide high reliability, safety, and the capability to endure the limit short-circuit current. To this end, this paper discusses a fast, iteration-free simulation of the relationship between the attraction and counterforce of the load switch, establishes a dual response surface model of attraction and counterforce, and optimizes the design scheme of the load switch for the charge control smart meter, thus increasing both the electromagnetic attraction and the spring counterforce. In this way, the paper puts forward a method to improve the capability to withstand the limit short-circuit current.
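
    As a rough illustration of the dual response surface idea, the sketch below fits second-order surfaces for attraction and counterforce against a single design variable (contact gap) and checks the force margin. The data points, variable choice, and coefficients are hypothetical; the paper's actual model is built from its own electromagnetic and spring analyses.

        # Hedged sketch: quadratic (response-surface) fits of electromagnetic attraction
        # and spring counterforce versus contact gap, followed by a force-margin check.
        # All sample values are invented for illustration.
        import numpy as np

        gap_mm       = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])
        attraction_N = np.array([9.5, 8.1, 6.9, 6.0, 5.3, 4.8])   # hypothetical
        counter_N    = np.array([4.0, 4.3, 4.7, 5.1, 5.6, 6.2])   # hypothetical

        p_attr = np.polyfit(gap_mm, attraction_N, deg=2)   # second-order response surface
        p_ctr  = np.polyfit(gap_mm, counter_N,  deg=2)

        grid   = np.linspace(gap_mm.min(), gap_mm.max(), 101)
        margin = np.polyval(p_attr, grid) - np.polyval(p_ctr, grid)
        worst  = grid[np.argmin(margin)]
        print(f"minimum attraction-minus-counterforce margin: {margin.min():.2f} N at {worst:.2f} mm gap")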

  2. Advanced space propulsion thruster research

    NASA Technical Reports Server (NTRS)

    Wilbur, P. J.

    1981-01-01

    Experiments showed that stray magnetic fields can adversely affect the capacity of a hollow cathode neutralizer to couple to an ion beam. Magnetic field strength at the neutralizer cathode orifice is a crucial factor influencing the coupling voltage. The effects of electrostatic accelerator grid aperture diameters on the ion current extraction capabilities were examined experimentally to describe the divergence, deflection, and current extraction capabilities of grids with the screen and accelerator apertures displaced relative to one another. Experiments performed in orificed, mercury hollow cathodes support the model of field-enhanced thermionic electron emission from cathode inserts. Tests supported the validity of a thermal model of the cathode insert. A theoretical justification of a Saha equation model relating cathode plasma properties is presented. Experiments suggest that ion loss rates to discharge chamber walls can be controlled. A series of new discharge chamber magnetic field configurations were generated in the flexible magnetic field thruster and their effect on performance was examined. A technique used in the thruster to measure ion currents to discharge chamber walls is described. Using these ion currents, the fraction of ions produced that are extracted from the discharge chamber and the energy cost of plasma ions are computed.

  3. Exploration Medical Capability System Engineering Overview

    NASA Technical Reports Server (NTRS)

    Mindock, J.; McGuire, K.

    2018-01-01

    Deep Space Gateway and Transport missions will change the way NASA currently practices medicine. The missions will require more autonomous capability compared to current low Earth orbit operations. For the medical system, lack of consumable resupply, evacuation opportunities, and real-time ground support are key drivers toward greater autonomy. Recognition of the limited mission and vehicle resources available to carry out exploration missions motivates the Exploration Medical Capability (ExMC) Element's approach to enabling the necessary autonomy. The ExMC Systems Engineering team's mission is to "Define, develop, validate, and manage the technical system design needed to implement exploration medical capabilities for Mars and test the design in a progression of proving grounds." The Element's work must integrate with the overall exploration mission and vehicle design efforts to successfully provide exploration medical capabilities. ExMC is using Model-Based System Engineering (MBSE) to accomplish its integrative goals. The MBSE approach to medical system design offers a paradigm shift toward greater integration between vehicle and the medical system, and directly supports the transition of Earth-reliant ISS operations to the Earth-independent operations envisioned for Mars. This talk will discuss how ExMC is using MBSE to define operational needs, decompose requirements and architecture, and identify medical capabilities needed to support human exploration. How MBSE is being used to integrate across disciplines and NASA Centers will also be described. The medical system being discussed in this talk is one system within larger habitat systems. Data generated within the medical system will be inputs to other systems and vice versa. This talk will also describe the next steps in model development that include: modeling the different systems that comprise the larger system and interact with the medical system, understanding how the various systems work together, and developing tools to support trade studies.

  4. Controlling Your Environment and Yourself: Implications for Career Success

    ERIC Educational Resources Information Center

    Converse, Patrick D.; Pathak, Jaya; DePaul-Haddock, Anne Marie; Gotlib, Tomer; Merbedone, Matthew

    2012-01-01

    Given the complex and rapidly changing nature of the current work environment, individuals' capabilities to effectively influence their environment and regulate their behavior may be critical to career success. Drawing from the model of emergent interactive agency (Bandura, 1989), the current research examines this perspective, focusing on…

  5. Development of a Semi-Span Test Capability at the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Gatlin, G. M.; Parker, P. A.; Owens, L. R., Jr.

    2001-01-01

    A need for low-speed, high Reynolds number test capabilities has been identified for the design and development of advanced subsonic transport high-lift systems. In support of this need, multiple investigations have been conducted in the National Transonic Facility (NTF) at the NASA Langley Research Center to develop a semi-span testing capability that will provide the low-speed, flight Reynolds number data currently unattainable using conventional sting-mounted, full-span models. Although a semi-span testing capability will effectively double the Reynolds number capability over full-span models, it does come at the expense of contending with the interaction of the flow over the model with the wind tunnel wall boundary layer. To address this issue the size and shape of the semi-span model mounting geometry have been investigated, and the results are presented herein. The cryogenic operating environment of the NTF produced another semi-span test technique issue in that varying thermal gradients have developed on the large semi-span balance. The suspected cause of these thermal gradients and methods to eliminate them are presented. Data are also presented that demonstrate the successful elimination of these varying thermal gradients during cryogenic operations.

  6. Adaptive Multiscale Modeling of Geochemical Impacts on Fracture Evolution

    NASA Astrophysics Data System (ADS)

    Molins, S.; Trebotich, D.; Steefel, C. I.; Deng, H.

    2016-12-01

    Understanding fracture evolution is essential for many subsurface energy applications, including subsurface storage, shale gas production, fracking, CO2 sequestration, and geothermal energy extraction. Geochemical processes in particular play a significant role in the evolution of fractures through dissolution-driven widening, fines migration, and/or fracture sealing due to precipitation. One obstacle to understanding and exploiting geochemical fracture evolution is that it is a multiscale process. However, current geochemical modeling of fractures cannot capture this multi-scale nature of geochemical and mechanical impacts on fracture evolution, and is limited to either a continuum or pore-scale representation. Conventional continuum-scale models treat fractures as preferential flow paths, with their permeability evolving as a function (often, a cubic law) of the fracture aperture. This approach has the limitation that it oversimplifies flow within the fracture in its omission of pore scale effects while also assuming well-mixed conditions. More recently, pore-scale models along with advanced characterization techniques have allowed for accurate simulations of flow and reactive transport within the pore space (Molins et al., 2014, 2015). However, these models, even with high performance computing, are currently limited in their ability to treat tractable domain sizes (Steefel et al., 2013). Thus, there is a critical need to develop an adaptive modeling capability that can account for separate properties and processes, emergent and otherwise, in the fracture and the rock matrix at different spatial scales. Here we present an adaptive modeling capability that treats geochemical impacts on fracture evolution within a single multiscale framework. Model development makes use of the high performance simulation capability, Chombo-Crunch, leveraged by high resolution characterization and experiments. The modeling framework is based on the adaptive capability in Chombo which not only enables mesh refinement, but also refinement of the model (pore scale or continuum Darcy scale) in a dynamic way such that the appropriate model is used only when and where it is needed. Explicit flux matching provides coupling between the scales.
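
    For reference, the cubic law mentioned above, for laminar flow between smooth parallel plates of aperture b and width w under a pressure gradient in a fluid of viscosity \mu, is conventionally written (standard form, not specific to Chombo-Crunch):

        Q = -\frac{w\,b^{3}}{12\,\mu}\,\frac{\partial p}{\partial x}, \qquad k_f = \frac{b^{2}}{12}

    so the fracture permeability grows with the square of the aperture and the volumetric flux with its cube, which is why aperture evolution dominates continuum-scale fracture models.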

  7. Crash Certification by Analysis - Are We There Yet?

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Fasanella, Edwin L.; Lyle, Karen H.

    2006-01-01

    This paper addresses the issue of crash certification by analysis. This broad topic encompasses many ancillary issues including model validation procedures, uncertainty in test data and analysis models, probabilistic techniques for test-analysis correlation, verification of the mathematical formulation, and establishment of appropriate qualification requirements. This paper will focus on certification requirements for crashworthiness of military helicopters; capabilities of the current analysis codes used for crash modeling and simulation, including some examples of simulations from the literature to illustrate the current approach to model validation; and future directions needed to achieve "crash certification by analysis."

  8. Control model design to limit DC-link voltage during grid fault in a DFIG variable speed wind turbine

    NASA Astrophysics Data System (ADS)

    Nwosu, Cajethan M.; Ogbuka, Cosmas U.; Oti, Stephen E.

    2017-08-01

    This paper presents a control model design capable of inhibiting the phenomenal rise in the DC-link voltage during grid-fault conditions in a variable speed wind turbine. Rather than relying on power circuit protection strategies with inherent limitations in fault ride-through capability, a control circuit algorithm is proposed that limits the rise of the DC-link voltage, whose dynamics have a direct influence on the characteristics of the rotor voltage, especially during grid faults. The model results so obtained compare favorably with the simulation results obtained in a MATLAB/SIMULINK environment. The generated model may therefore be used to predict, with near accuracy, the nature of DC-link voltage variations during a fault, given factors that include the speed and speed mode of operation and the value of the damping resistor relative to half the product of the inner-loop current-control bandwidth and the filter inductance.
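
    The comparison quantity named at the end of the abstract can be written out explicitly. With R_d the damping resistance, \alpha_c the inner-loop current-control bandwidth (rad/s), and L_f the filter inductance, the reference value against which the damping resistor is judged is, as read from the abstract (the symbols are our own notation, not the paper's):

        R_{d} \;\text{compared with}\; \tfrac{1}{2}\,\alpha_{c} L_{f}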

  9. Extending the Lunar Mapping and Modeling Portal - New Capabilities and New Worlds

    NASA Technical Reports Server (NTRS)

    Day, B.; Law, E.; Arevalo, E.; Bui, B.; Chang, G.; Dodge, K.; Kim, R.; Malhotra, S.; Sadaqathullah, S.; Schmidt, G.

    2015-01-01

    NASA's Lunar Mapping and Modeling Portal (LMMP) provides a web-based Portal and a suite of interactive visualization and analysis tools to enable mission planners, lunar scientists, and engineers to access mapped lunar data products from past and current lunar missions (http://lmmp.nasa.gov). During the past year, the capabilities and data served by LMMP have been significantly expanded. New interfaces are providing improved ways to access and visualize data. At the request of NASA's Science Mission Directorate, LMMP's technology and capabilities are now being extended to additional planetary bodies. New portals for Vesta and Mars are the first of these new products to be released. This presentation will provide an overview of LMMP, Vesta Trek, and Mars Trek, demonstrate their uses and capabilities, highlight new features, and preview coming enhancements.

  10. Extending the Lunar Mapping and Modeling Portal - New Capabilities and New Worlds

    NASA Astrophysics Data System (ADS)

    Day, B.; Law, E.; Arevalo, E.; Bui, B.; Chang, G.; Dodge, K.; Kim, R.; Malhotra, S.; Sadaqathullah, S.; Schmidt, G.; Bailey, B.

    2015-10-01

    NASA's Lunar Mapping and Modeling Portal (LMMP) provides a web-based Portal and a suite of interactive visualization and analysis tools to enable mission planners, lunar scientists, and engineers to access mapped lunar data products from past and current lunar missions (http://lmmp.nasa.gov). During the past year, the capabilities and data served by LMMP have been significantly expanded. New interfaces are providing improved ways to access and visualize data. At the request of NASA's Science Mission Directorate, LMMP's technology and capabilities are now being extended to additional planetary bodies. New portals for Vesta and Mars are the first of these new products to be released. This presentation will provide an overview of LMMP, Vesta Trek, and Mars Trek, demonstrate their uses and capabilities, highlight new features, and preview coming enhancements.

  11. Summary of 2016 Light Microscopy Module (LMM) Physical Science Experiments on ISS. Update of LMM Science Experiments and Facility Capabilities

    NASA Technical Reports Server (NTRS)

    Sicker, Ronald J.; Meyer, William V.; Foster, William M.; Fletcher, William A.; Williams, Stuart J.; Lee, Chang-Soo

    2016-01-01

    This presentation will feature a series of short, entertaining, and informative videos that describe the current status and science support for the Light Microscopy Module (LMM) facility on the International Space Station. These interviews will focus on current experiments and provide an overview of future capabilities. The recently completed experiments include nano-particle haloing, 3-D self-assembly with Janus particles and a model system for nano-particle drug delivery. The videos will share perspectives from the scientists, engineers, and managers working with the NASA Light Microscopy program.

  12. Existing and Required Modeling Capabilities for Evaluating ATM Systems and Concepts

    NASA Technical Reports Server (NTRS)

    Odoni, Amedeo R.; Bowman, Jeremy; Delahaye, Daniel; Deyst, John J.; Feron, Eric; Hansman, R. John; Khan, Kashif; Kuchar, James K.; Pujet, Nicolas; Simpson, Robert W.

    1997-01-01

    ATM systems throughout the world are entering a period of major transition and change. The combination of important technological developments and of the globalization of the air transportation industry has necessitated a reexamination of some of the fundamental premises of existing Air Traffic Management (ATM) concepts. New ATM concepts have to be examined, concepts that may place more emphasis on: strategic traffic management; planning and control; partial decentralization of decision-making; and added reliance on the aircraft to carry out strategic ATM plans, with ground controllers confined primarily to a monitoring and supervisory role. 'Free Flight' is a case in point. In order to study, evaluate and validate such new concepts, the ATM community will have to rely heavily on models and computer-based tools/utilities, covering a wide range of issues and metrics related to safety, capacity and efficiency. The state of the art in such modeling support is adequate in some respects, but clearly deficient in others. It is the objective of this study to assist in: (1) assessing the strengths and weaknesses of existing fast-time models and tools for the study of ATM systems and concepts and (2) identifying and prioritizing the requirements for the development of additional modeling capabilities in the near future. A three-stage process has been followed to this purpose: 1. Through the analysis of two case studies involving future ATM system scenarios, as well as through expert assessment, modeling capabilities and supporting tools needed for testing and validating future ATM systems and concepts were identified and described. 2. Existing fast-time ATM models and support tools were reviewed and assessed with regard to the degree to which they offer the capabilities identified under Step 1. 3. The findings of 1 and 2 were combined to draw conclusions about (1) the best capabilities currently existing, (2) the types of concept testing and validation that can be carried out reliably with such existing capabilities and (3) the currently unavailable modeling capabilities that should receive high priority for near-term research and development. It should be emphasized that the study is concerned only with the class of 'fast time' analytical and simulation models. 'Real time' models, that typically involve humans-in-the-loop, comprise another extensive class which is not addressed in this report. However, the relationship between some of the fast-time models reviewed and a few well-known real-time models is identified in several parts of this report and the potential benefits from the combined use of these two classes of models, a very important subject, are discussed in chapters 4 and 7.

  13. Model development and validation of geometrically complex eddy current coils using finite element methods

    NASA Astrophysics Data System (ADS)

    Brown, Alexander; Eviston, Connor

    2017-02-01

    Multiple FEM models of complex eddy current coil geometries were created and validated to calculate the change of impedance due to the presence of a notch. Realistic simulations of eddy current inspections are required for model-assisted probability of detection (MAPOD) studies, inversion algorithms, experimental verification, and tailored probe design for NDE applications. An FEM solver was chosen to model complex real-world situations, including varying probe dimensions and orientations along with complex probe geometries. This will also enable creation of a probe model library database with variable parameters. Verification and validation were performed using other commercially available eddy current modeling software as well as experimentally collected benchmark data. Data analysis and comparison showed that the created models were able to correctly model the probe and conductor interactions and accurately calculate the change in impedance for several experimental scenarios with acceptable error. The promising results of the models enabled the start of an eddy current probe model library to give experimenters easy access to powerful parameter-based eddy current models for other project applications.

  14. Structural Equation Modeling: Applications in ecological and evolutionary biology research

    USGS Publications Warehouse

    Pugesek, Bruce H.; von Eye, Alexander; Tomer, Adrian

    2003-01-01

    This book presents an introduction to the methodology of structural equation modeling, illustrates its use, and goes on to argue that it has revolutionary implications for the study of natural systems. A major theme of this book is that we have, up to this point, attempted to study systems primarily using methods (such as the univariate model) that were designed only for considering individual processes. Understanding systems requires the capacity to examine simultaneous influences and responses. Structural equation modeling (SEM) has such capabilities. It also possesses many other traits that add strength to its utility as a means of making scientific progress. In light of the capabilities of SEM, it can be argued that much of ecological theory is currently locked in an immature state that impairs its relevance. It is further argued that the principles of SEM are capable of leading to the development and evaluation of multivariate theories of the sort vitally needed for the conservation of natural systems. Supplementary information can be found at the author's website, http://www.jamesbgrace.com/. • Details why multivariate analyses should be used to study ecological systems • Exposes unappreciated weaknesses in many current popular analyses • Emphasizes the future methodological developments needed to advance our understanding of ecological systems.
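
    The "simultaneous influences and responses" credited to SEM here are expressed by the standard latent-variable equations; in conventional LISREL notation the structural and measurement parts read:

        \boldsymbol{\eta} = \mathbf{B}\boldsymbol{\eta} + \boldsymbol{\Gamma}\boldsymbol{\xi} + \boldsymbol{\zeta}, \qquad \mathbf{y} = \boldsymbol{\Lambda}_{y}\boldsymbol{\eta} + \boldsymbol{\varepsilon}, \qquad \mathbf{x} = \boldsymbol{\Lambda}_{x}\boldsymbol{\xi} + \boldsymbol{\delta}

    so several endogenous latent variables can influence one another (through B) while being driven by exogenous latent variables (through Gamma), which is precisely the simultaneity a univariate model cannot represent.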

  15. A Review and Analysis of Remote Sensing Capability for Air Quality Measurements as a Potential Decision Support Tool Conducted by the NASA DEVELOP Program

    NASA Technical Reports Server (NTRS)

    Ross, A.; Richards, A.; Keith, K.; Frew, C.; Boseck, J.; Sutton, S.; Watts, C.; Rickman, D.

    2007-01-01

    This project focused on a comprehensive utilization of air quality model products as decision support tools (DST) needed for public health applications. A review of past and future air quality measurement methods and their uncertainty, along with the relationship of air quality to national and global public health, is vital. This project described current and future NASA satellite remote sensing and ground sensing capabilities and the potential for using these sensors to enhance the prediction, prevention, and control of public health effects that result from poor air quality. The qualitative uncertainty of current satellite remotely sensed air quality, the ground-based remotely sensed air quality, the air quality/public health model, and the decision making process is evaluated in this study. Current peer-reviewed literature suggests that remotely sensed air quality parameters correlate well with ground-based sensor data. A complement of satellite and ground-based remotely sensed data is needed to enhance the models and tools used by policy makers for the protection of national and global public health communities.

  16. Modeling Forest Timber Productivity in the South: Where Are We Today?

    Treesearch

    V. Clark Baldwin; Quang V. Cao

    1999-01-01

    The current southern species growth and yield prediction capability, new techniques utilized, and modeling trends over the last 17 years were examined. Changing forest management objectives that emphasize more non-timber resources may have contributed to the continuing general lack of emphasis in modeling the timber productivity of the South's largest forest...

  17. A Transactional Model of Bullying and Victimization

    ERIC Educational Resources Information Center

    Georgiou, Stelios N.; Fanti, Kostas A.

    2010-01-01

    The purpose of the current study was to develop and test a transactional model, based on longitudinal data, capable of describing the existing interrelation between maternal behavior and child bullying and victimization experiences over time. The results confirmed the existence of such a model for bullying, but not for victimization in terms of…

  18. EOID System Model Validation, Metrics, and Synthetic Clutter Generation

    DTIC Science & Technology

    2003-09-30

    Our long-term goal is to accurately predict the capability of the current generation of laser-based underwater imaging sensors to perform Electro-Optic Identification (EOID) against relevant targets in a variety of realistic environmental conditions. The models will predict the impact of

  19. Curved Thermopiezoelectric Shell Structures Modeled by Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Lee, Ho-Jun

    2000-01-01

    "Smart" structures composed of piezoelectric materials may significantly improve the performance of aeropropulsion systems through a variety of vibration, noise, and shape-control applications. The development of analytical models for piezoelectric smart structures is an ongoing, in-house activity at the NASA Glenn Research Center at Lewis Field focused toward the experimental characterization of these materials. Research efforts have been directed toward developing analytical models that account for the coupled mechanical, electrical, and thermal response of piezoelectric composite materials. Current work revolves around implementing thermal effects into a curvilinear-shell finite element code. This enhances capabilities to analyze curved structures and to account for coupling effects arising from thermal effects and the curved geometry. The current analytical model implements a unique mixed multi-field laminate theory to improve computational efficiency without sacrificing accuracy. The mechanics can model both the sensory and active behavior of piezoelectric composite shell structures. Finite element equations are being implemented for an eight-node curvilinear shell element, and numerical studies are being conducted to demonstrate capabilities to model the response of curved piezoelectric composite structures (see the figure).

  20. Pressurization of cryogens - A review of current technology and its applicability to low-gravity conditions

    NASA Technical Reports Server (NTRS)

    Van Dresar, N. T.

    1992-01-01

    A review of technology, history, and current status for pressurized expulsion of cryogenic tankage is presented. Use of tank pressurization to expel cryogenic fluid will continue to be studied for future spacecraft applications over a range of operating conditions in the low-gravity environment. The review examines experimental test results and analytical model development for quiescent and agitated conditions in normal-gravity followed by a discussion of pressurization and expulsion in low-gravity. Validated, 1-D, finite difference codes exist for the prediction of pressurant mass requirements within the range of quiescent normal-gravity test data. To date, the effects of liquid sloshing have been characterized by tests in normal-gravity, but analytical models capable of predicting pressurant gas requirements remain unavailable. Efforts to develop multidimensional modeling capabilities in both normal and low-gravity have recently occurred. Low-gravity cryogenic fluid transfer experiments are needed to obtain low-gravity pressurized expulsion data. This data is required to guide analytical model development and to verify code performance.
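
    A zeroth-order feel for the pressurant-mass question treated by the 1-D codes can be had from a textbook ideal-gas estimate: fill the expelled liquid volume with pressurant at tank pressure and an assumed mean ullage temperature, then scale by an empirical collapse factor to account for pressurant cooling and condensation. The sketch below is such an estimate only, not the validated finite-difference analysis reviewed here, and every number in it is illustrative.

        # Hedged zeroth-order estimate of pressurant mass for a pressurized expulsion:
        # ideal-gas fill of the expelled volume at tank pressure and an assumed mean
        # ullage temperature, scaled by an empirical "collapse factor". Illustrative only.

        R_UNIVERSAL = 8.314  # J/(mol*K)

        def pressurant_mass(p_tank_pa, vol_expelled_m3, t_ullage_k,
                            molar_mass_kg_per_mol, collapse_factor=1.3):
            r_specific = R_UNIVERSAL / molar_mass_kg_per_mol
            ideal_mass = p_tank_pa * vol_expelled_m3 / (r_specific * t_ullage_k)
            return collapse_factor * ideal_mass

        # Example: gaseous helium pressurizing a liquid-hydrogen tank (hypothetical conditions).
        m_he = pressurant_mass(p_tank_pa=300e3, vol_expelled_m3=5.0,
                               t_ullage_k=150.0, molar_mass_kg_per_mol=4.0e-3)
        print(f"estimated helium requirement: {m_he:.1f} kg")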

  1. Pressurization of cryogens: A review of current technology and its applicability to low-gravity conditions

    NASA Technical Reports Server (NTRS)

    Vandresar, N. T.

    1992-01-01

    A review of technology, history, and current status for pressurized expulsion of cryogenic tankage is presented. Use of tank pressurization to expel cryogenic fluids will continue to be studied for future spacecraft applications over a range of operating conditions in the low-gravity environment. The review examines experimental test results and analytical model development for quiescent and agitated conditions in normal-gravity, followed by a discussion of pressurization and expulsion in low-gravity. Validated, 1-D, finite difference codes exist for the prediction of pressurant mass requirements within the range of quiescent normal-gravity test data. To date, the effects of liquid sloshing have been characterized by tests in normal-gravity, but analytical models capable of predicting pressurant gas requirements remain unavailable. Efforts to develop multidimensional modeling capabilities in both normal and low-gravity have recently occurred. Low-gravity cryogenic fluid transfer experiments are needed to obtain low-gravity pressurized expulsion data. This data is required to guide analytical model development and to verify code performance.

  2. Toward a new generation of agricultural system data, models, and knowledge products: State of agricultural systems science.

    PubMed

    Jones, James W; Antle, John M; Basso, Bruno; Boote, Kenneth J; Conant, Richard T; Foster, Ian; Godfray, H Charles J; Herrero, Mario; Howitt, Richard E; Janssen, Sander; Keating, Brian A; Munoz-Carpena, Rafael; Porter, Cheryl H; Rosenzweig, Cynthia; Wheeler, Tim R

    2017-07-01

    We review the current state of agricultural systems science, focusing in particular on the capabilities and limitations of agricultural systems models. We discuss the state of models relative to five different Use Cases spanning field, farm, landscape, regional, and global spatial scales and engaging questions in past, current, and future time periods. Contributions from multiple disciplines have made major advances relevant to a wide range of agricultural system model applications at various spatial and temporal scales. Although current agricultural systems models have features that are needed for the Use Cases, we found that all of them have limitations and need to be improved. We identified common limitations across all Use Cases, namely 1) a scarcity of data for developing, evaluating, and applying agricultural system models and 2) inadequate knowledge systems that effectively communicate model results to society. We argue that these limitations are greater obstacles to progress than gaps in conceptual theory or available methods for using system models. New initiatives on open data show promise for addressing the data problem, but there also needs to be a cultural change among agricultural researchers to ensure that data for addressing the range of Use Cases are available for future model improvements and applications. We conclude that multiple platforms and multiple models are needed for model applications for different purposes. The Use Cases provide a useful framework for considering capabilities and limitations of existing models and data.

  3. Toward a New Generation of Agricultural System Data, Models, and Knowledge Products: State of Agricultural Systems Science

    NASA Technical Reports Server (NTRS)

    Jones, James W.; Antle, John M.; Basso, Bruno; Boote, Kenneth J.; Conant, Richard T.; Foster, Ian; Godfray, H. Charles J.; Herrero, Mario; Howitt, Richard E.; Janssen, Sander

    2016-01-01

    We review the current state of agricultural systems science, focusing in particular on the capabilities and limitations of agricultural systems models. We discuss the state of models relative to five different Use Cases spanning field, farm, landscape, regional, and global spatial scales and engaging questions in past, current, and future time periods. Contributions from multiple disciplines have made major advances relevant to a wide range of agricultural system model applications at various spatial and temporal scales. Although current agricultural systems models have features that are needed for the Use Cases, we found that all of them have limitations and need to be improved. We identified common limitations across all Use Cases, namely 1) a scarcity of data for developing, evaluating, and applying agricultural system models and 2) inadequate knowledge systems that effectively communicate model results to society. We argue that these limitations are greater obstacles to progress than gaps in conceptual theory or available methods for using system models. New initiatives on open data show promise for addressing the data problem, but there also needs to be a cultural change among agricultural researchers to ensure that data for addressing the range of Use Cases are available for future model improvements and applications. We conclude that multiple platforms and multiple models are needed for model applications for different purposes. The Use Cases provide a useful framework for considering capabilities and limitations of existing models and data.

  4. Toward a new generation of agricultural system data, models, and knowledge products: State of agricultural systems science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, James W.; Antle, John M.; Basso, Bruno

    We review the current state of agricultural systems science, focusing in particular on the capabilities and limitations of agricultural systems models. We discuss the state of models relative to five different Use Cases spanning field, farm, landscape, regional, and global spatial scales and engaging questions in past, current, and future time periods. Contributions from multiple disciplines have made major advances relevant to a wide range of agricultural system model applications at various spatial and temporal scales. Although current agricultural systems models have features that are needed for the Use Cases, we found that all of them have limitations and need to be improved. We identified common limitations across all Use Cases, namely 1) a scarcity of data for developing, evaluating, and applying agricultural system models and 2) inadequate knowledge systems that effectively communicate model results to society. We argue that these limitations are greater obstacles to progress than gaps in conceptual theory or available methods for using system models. New initiatives on open data show promise for addressing the data problem, but there also needs to be a cultural change among agricultural researchers to ensure that data for addressing the range of Use Cases are available for future model improvements and applications. We conclude that multiple platforms and multiple models are needed for model applications for different purposes. The Use Cases provide a useful framework for considering capabilities and limitations of existing models and data.

  5. Study and development of techniques for automatic control of remote manipulators

    NASA Technical Reports Server (NTRS)

    Shaket, E.; Leal, A.

    1976-01-01

    An overall conceptual design for an autonomous control system of remote manipulators which utilizes feedback was constructed. The system consists of a description of the high-level capabilities of a model from which design algorithms are constructed. The autonomous capability is achieved through automatic planning and locally controlled execution of the plans. The operator gives his commands in high level task-oriented terms. The system transforms these commands into a plan. It uses built-in procedural knowledge of the problem domain and an internal model of the current state of the world.

  6. Advanced sensor-simulation capability

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Kalman, Linda S.; Keller, Robert A.

    1990-09-01

    This paper provides an overview of an advanced simulation capability currently in use for analyzing visible and infrared sensor systems. The software system, called VISTAS (VISIBLE/INFRARED SENSOR TRADES, ANALYSES, AND SIMULATIONS) combines classical image processing techniques with detailed sensor models to produce static and time dependent simulations of a variety of sensor systems including imaging, tracking, and point target detection systems. Systems modelled to date include space-based scanning line-array sensors as well as staring 2-dimensional array sensors which can be used for either imaging or point source detection.

  7. Creating Synthetic Coronal Observational Data From MHD Models: The Forward Technique

    NASA Technical Reports Server (NTRS)

    Rachmeler, Laurel A.; Gibson, Sarah E.; Dove, James; Kucera, Therese Ann

    2010-01-01

    We present a generalized forward code for creating simulated coronal observables off the limb from numerical and analytical MHD models. This generalized forward model is capable of creating emission maps in various wavelengths for instruments such as SXT, EIT, EIS, and coronagraphs, as well as spectropolarimetric images and line profiles. The inputs to our code can be analytic models (of which four come with the code) or 2.5D and 3D numerical datacubes. We present some examples of the observable data created with our code as well as its functional capabilities. This code is currently available for beta-testing (contact authors), with the ultimate goal of release as a SolarSoft package.

  8. Taking Wave Prediction to New Levels: Wavewatch 3

    DTIC Science & Technology

    2016-01-01

    features such as surf and rip currents, conditions that affect special operations, amphibious assaults, and logistics over the shore. Changes in... The Navy's current version of WAVEWATCH III features the capability of operating with gridded domains of multiple resolutions simultaneously, ranging... Netherlands. Its current form, WAVEWATCH III, was developed at NOAA's National Center for Environmental Prediction. The model is free and open source.

  9. Current and Future Development of a Non-hydrostatic Unified Atmospheric Model (NUMA)

    DTIC Science & Technology

    2010-09-09

    following capabilities: 1. Highly scalable on current and future computer architectures (exascale computing and beyond, and GPUs); 2. Flexibility... Exascale computing: 10 of the Top 500 are already in the petascale range; we should also keep our eyes on GPUs (e.g., Mare Nostrum). 2. Numerical

  10. Information processing of earth resources data

    NASA Technical Reports Server (NTRS)

    Zobrist, A. L.; Bryant, N. A.

    1982-01-01

    Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.

  11. Eddy Current Influences on the Dynamic Behaviour of Magnetic Suspension Systems

    NASA Technical Reports Server (NTRS)

    Britcher, Colin P.; Bloodgood, Dale V.

    1998-01-01

    This report will summarize some results from a multi-year research effort at NASA Langley Research Center aimed at the development of an improved capability for practical modelling of eddy current effects in magnetic suspension systems. Particular attention is paid to large-gap systems, although generic results applicable to both large-gap and small-gap systems are presented. It is shown that eddy currents can significantly affect the dynamic behavior of magnetic suspension systems, but that these effects can be amenable to modelling and measurement. Theoretical frameworks are presented, together with comparisons of computed and experimental data particularly related to the Large Angle Magnetic Suspension Test Fixture at NASA Langley Research Center, and the Annular Suspension and Pointing System at Old Dominion University. In both cases, practical computations are capable of providing reasonable estimates of important performance-related parameters. The most difficult case is seen to be that of eddy currents in highly permeable material, due to the low skin depths. Problems associated with specification of material properties and areas for future research are discussed.
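
    The difficulty with highly permeable material noted above follows directly from the skin-depth relation for a conductor of permeability \mu and conductivity \sigma at angular frequency \omega:

        \delta = \sqrt{\frac{2}{\omega\,\mu\,\sigma}}

    so eddy currents in high-permeability parts are confined to a thin surface layer, which makes their contribution to suspension dynamics harder both to compute and to measure.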

  12. Temperature-Dependent Short-Circuit Capability of Silicon Carbide Power MOSFETs

    DOE PAGES

    Wang, Zhiqiang; Shi, Xiaojie; Tolbert, Leon M.; ...

    2016-02-01

    Our paper presents a comprehensive short-circuit ruggedness evaluation and numerical investigation of up-to-date commercial silicon carbide (SiC) MOSFETs. The short-circuit capability of three types of commercial 1200-V SiC MOSFETs is tested under various conditions, with case temperatures from 25 to 200 degrees C and dc bus voltages from 400 to 750 V. It is found that the commercial SiC MOSFETs can withstand short-circuit current for only several microseconds with a dc bus voltage of 750 V and case temperature of 200 degrees C. Moreover, the experimental short-circuit behaviors are compared, and analyzed through numerical thermal dynamic simulation. Specifically, an electrothermal model is built to estimate the device internal temperature distribution, considering the temperature-dependent thermal properties of SiC material. Based on the temperature information, a leakage current model is derived to calculate the main leakage current components (i.e., thermal, diffusion, and avalanche generation currents). Finally, numerical results show that the short-circuit failure mechanisms of SiC MOSFETs can be thermal generation current induced thermal runaway or high-temperature-related gate oxide damage.
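
    The leakage-current decomposition described above (thermal generation, diffusion, and avalanche terms driven by the simulated internal temperature) can be sketched schematically as below. The temperature dependences follow generic textbook forms and every coefficient is invented for illustration; this is not the calibrated electrothermal model of the paper.

        # Hedged schematic of a temperature-dependent leakage-current sum: thermal (SRH)
        # generation ~ n_i, diffusion ~ n_i^2, and an empirical avalanche multiplication
        # factor. Prefactors and fitting constants are invented for illustration only.
        import math

        K_B = 8.617e-5          # Boltzmann constant, eV/K
        E_G_SIC = 3.26          # eV, 4H-SiC bandgap (room-temperature value)

        def intrinsic_density(t_k):
            # ~ T^{3/2} * exp(-Eg / 2kT); the prefactor is illustrative, not calibrated.
            return 1.7e16 * t_k**1.5 * math.exp(-E_G_SIC / (2 * K_B * t_k))

        def leakage_current(t_k, v_ds, a_gen=1e-12, a_diff=1e-30, bv=1500.0, n_aval=4.0):
            n_i = intrinsic_density(t_k)
            i_gen  = a_gen * n_i            # thermal generation in the depletion region
            i_diff = a_diff * n_i**2        # diffusion component from the neutral regions
            m_aval = 1.0 / (1.0 - min(v_ds / bv, 0.999)**n_aval)   # empirical multiplication
            return (i_gen + i_diff) * m_aval

        for t in (300, 500, 800, 1100):     # junction temperatures reached during short circuit
            print(f"T = {t:4d} K  ->  leakage ~ {leakage_current(t, v_ds=750.0):.3e} A")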

  13. COREBA (cognition-oriented emergent behavior architecture)

    NASA Astrophysics Data System (ADS)

    Kwak, S. David

    2000-06-01

    Currently, many behavior implementation technologies are available for modeling human behaviors in Department of Defense (DOD) computerized systems. However, it is commonly known that no single currently adopted behavior implementation technology can fully represent complex and dynamic human decision-making and cognitive behaviors. The author argues that the current situation can be greatly improved if multiple technologies are integrated within a well-designed overarching architecture that amplifies the merits of each of the participating technologies while suppressing the limitations that are inherent in each of them. COREBA uses an overarching behavior integration architecture that makes the multiple implementation technologies cooperate in a homogeneous environment while collectively transcending the limitations associated with the individual implementation technologies. Specifically, COREBA synergistically integrates Artificial Intelligence and Complex Adaptive Systems under the Rational Behavior Model multi-level, multi-paradigm behavior architecture. This paper describes the applicability of COREBA in the DOD domain, the behavioral capabilities and characteristics of COREBA, and how the COREBA architecture integrates various behavior implementation technologies.

  14. Integrated Medical Model (IMM) 4.0 Enhanced Functionalities

    NASA Technical Reports Server (NTRS)

    Young, M.; Keenan, A. B.; Saile, L.; Boley, L. A.; Walton, M. E.; Shah, R. V.; Kerstman, E. L.; Myers, J. G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic simulation model that uses input data on 100 medical conditions to simulate expected medical events, the resources required to treat them, and the resulting impact to the mission for specific crew and mission characteristics. The newest development version, IMM v4.0, adds capabilities that remove some of the conservative assumptions that underlie the current operational version, IMM v3. While IMM v3 provides the framework to simulate whether a medical event occurred, IMM v4.0 also simulates when the event occurred during a mission timeline. This allows for more accurate estimation of mission time lost and resource utilization. In addition to the mission timeline, IMM v4.0 features two enhancements that address IMM v3 assumptions regarding medical event treatment. Medical events in IMM v3 are assigned the untreated outcome if any resource required to treat the event is unavailable. IMM v4.0 allows for partially treated outcomes that are proportional to the amount of required resources available, thus removing the dichotomous treatment assumption. An additional capability of IMM v4.0 is the use of an alternative medical resource when the primary resource assigned to the condition is depleted, more accurately reflecting the real-world system. The additional capabilities defining IMM v4.0 (the mission timeline, partial treatment, and the alternate drug) result in more realistic predicted mission outcomes. The primary model outcomes of IMM v4.0 for the ISS6 mission, including mission time lost, probability of evacuation, and probability of loss of crew life, are compared to those produced by the current operational version of IMM to showcase the enhanced prediction capabilities.
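
    The two v4.0 enhancements, event timing on a mission timeline and partially treated outcomes proportional to remaining resources, lend themselves to a compact Monte Carlo illustration. The sketch below is schematic only: the condition names, incidence rates, resource counts, and outcome weights are invented and bear no relation to the IMM's actual condition data.

        # Hedged Monte Carlo sketch in the spirit of the IMM v4.0 enhancements: events are
        # placed on a mission timeline and outcomes are scaled by the fraction of required
        # resources still available. All inputs are invented for illustration.
        import random

        MISSION_DAYS = 180
        CONDITIONS = {   # name: (events per person-day, units needed, time lost if untreated)
            "skin rash":    (0.002,  1, 0.5),
            "back pain":    (0.004,  2, 1.0),
            "kidney stone": (0.0005, 4, 5.0),
        }
        SUPPLY = {"skin rash": 6, "back pain": 10, "kidney stone": 4}

        def simulate(crew=4, seed=None):
            rng = random.Random(seed)
            supply = dict(SUPPLY)
            time_lost = 0.0
            for day in range(MISSION_DAYS):
                for name, (rate, needed, lost_untreated) in CONDITIONS.items():
                    if rng.random() < rate * crew:             # did the event occur today?
                        available = min(needed, supply[name])
                        supply[name] -= available
                        treated_fraction = available / needed  # partial-treatment outcome
                        time_lost += lost_untreated * (1.0 - treated_fraction)
            return time_lost

        runs = [simulate(seed=i) for i in range(1000)]
        print(f"mean mission time lost: {sum(runs)/len(runs):.2f} crew-days")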

  15. Global Weather Prediction and High-End Computing at NASA

    NASA Technical Reports Server (NTRS)

    Lin, Shian-Jiann; Atlas, Robert; Yeh, Kao-San

    2003-01-01

    We demonstrate current capabilities of the NASA finite-volume General Circulation Model in high-resolution global weather prediction, and discuss its development path in the foreseeable future. This model can be regarded as a prototype of a future NASA Earth modeling system intended to unify development activities cutting across various disciplines within the NASA Earth Science Enterprise.

  16. Ignition behavior of live California chaparral leaves

    Treesearch

    J.D. Engstrom; J.K Butler; S.G. Smith; L.L. Baxter; T.H. Fletcher; D.R. Weise

    2004-01-01

    Current forest fire models are largely empirical correlations based on data from beds of dead vegetation. Improvement in model capabilities is sought by developing models of the combustion of live fuels. A facility was developed to determine the combustion behavior of small samples of live fuels, consisting of a flat-flame burner on a moveable platform. Qualitative and...

  17. Multisite evaluation of APEX for water quality: 1. Best professional judgement parameterization

    USDA-ARS?s Scientific Manuscript database

    The Agricultural and Policy Environmental Extender (APEX) model is capable of estimating edge-of-field water, nutrient, and sediment transport and is used to assess the environmental impacts of management practices. The current practice is to fully calibrate the model for each site simulation, a tas...

  18. Errors of Inference in Structural Equation Modeling

    ERIC Educational Resources Information Center

    McCoach, D. Betsy; Black, Anne C.; O'Connell, Ann A.

    2007-01-01

    Although structural equation modeling (SEM) is one of the most comprehensive and flexible approaches to data analysis currently available, it is nonetheless prone to researcher misuse and misconceptions. This article offers a brief overview of the unique capabilities of SEM and discusses common sources of user error in drawing conclusions from…

  19. Development of a New VLBI Data Analysis Software

    NASA Technical Reports Server (NTRS)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of a new VLBI analysis software package under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software and formulate its main goals. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  20. Modeling and strain gauging of eddy current repulsion deicing systems

    NASA Technical Reports Server (NTRS)

    Smith, Samuel O.

    1993-01-01

    Work described in this paper confirms and extends work done by Zumwalt et al. on a variety of in-flight deicing systems that use eddy current repulsion for repelling ice. Two such systems are known as electro-impulse deicing (EIDI) and the eddy current repulsion deicing strip (EDS). Mathematical models for these systems are discussed with regard to their capabilities and limitations. The author duplicates a particular model of the EDS. Theoretical voltage, current, and force results are compared directly to experimental results. Dynamic strain measurement results are presented for the EDS system. Dynamic strain measurements near EDS or EIDI coils are complicated by the high magnetic fields in the vicinity of the coils. High magnetic fields induce false voltage signals out of the gages.
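
    Both EIDI and EDS coils are driven by a capacitor discharge, so the theoretical coil current compared against measurements is, to first order, an underdamped series-RLC transient, with the repulsive force scaling roughly with the square of the coil current. The sketch below computes that transient for invented circuit values; it is a first-order illustration, not the model reproduced in the paper.

        # Hedged series-RLC discharge sketch for an eddy-current de-icing coil:
        # underdamped capacitor-discharge current and a force proxy ~ i(t)^2.
        # Circuit values are illustrative, not taken from the paper.
        import math

        C  = 400e-6   # discharge capacitor, farads
        L  = 30e-6    # coil inductance, henries
        R  = 0.05     # loop resistance, ohms
        V0 = 800.0    # initial capacitor voltage, volts

        alpha = R / (2 * L)
        w0 = 1.0 / math.sqrt(L * C)
        wd = math.sqrt(w0**2 - alpha**2)          # underdamped ringing frequency

        def coil_current(t):
            return (V0 / (wd * L)) * math.exp(-alpha * t) * math.sin(wd * t)

        peak_t = (1.0 / wd) * math.atan(wd / alpha)   # time of the first current peak
        i_pk = coil_current(peak_t)
        print(f"first current peak ~ {i_pk:.0f} A at {peak_t*1e6:.0f} us; force proxy ~ i^2 = {i_pk**2:.2e}")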

  1. Spike train generation and current-to-frequency conversion in silicon diodes

    NASA Technical Reports Server (NTRS)

    Coon, D. D.; Perera, A. G. U.

    1989-01-01

    A device physics model is developed to analyze spontaneous neuron-like spike train generation in current-driven silicon p(+)-n-n(+) devices in cryogenic environments. The model is shown to explain the very high dynamic range (10 to the 7th) of the current-to-frequency conversion and the experimental features of the spike train frequency as a function of input current. The devices are interesting components for implementation of parallel asynchronous processing adjacent to cryogenically cooled focal planes because of their extremely low current and power requirements, their electronic simplicity, and their pulse coding capability, and could be used to form the hardware basis for neural networks which employ biologically plausible means of information coding.
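
    The near-linear current-to-frequency conversion over many decades described here is the signature of integrate-and-fire behavior: the input current charges a capacitance until a threshold triggers a spike and a reset, giving f roughly equal to I/(C*V_th). The sketch below is an idealized illustration with hypothetical capacitance and threshold values, not the device physics model of the paper.

        # Hedged integrate-and-fire illustration of current-to-frequency conversion:
        # f ~ I / (C * V_th) over many decades of input current. Parameters are hypothetical.
        C_F  = 1e-12    # integration capacitance, farads
        V_TH = 1.0      # firing threshold, volts

        def spike_frequency(i_amps):
            """Idealized spike rate for a constant input current with instant reset."""
            return i_amps / (C_F * V_TH)

        for decade in range(-12, -4):          # 1 pA up to 10 uA: about seven decades of input
            i = 10.0 ** decade
            print(f"I = {i:.0e} A  ->  f = {spike_frequency(i):.3e} Hz")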

  2. The PLAID graphics analysis impact on the space program

    NASA Technical Reports Server (NTRS)

    Nguyen, Jennifer P.; Wheaton, Aneice L.; Maida, James C.

    1994-01-01

    An ongoing project design often requires visual verification at various stages. These requirements are critically important because the subsequent phases of that project might depend on the complete verification of a particular stage. Currently, there are several software packages at JSC that provide such simulation capabilities. We present the simulation capabilities of the PLAID modeling system used in the Flight Crew Support Division for human factors analyses. We summarize some ongoing studies in kinematics, lighting, and EVA, and discuss various applications in the mission planning of the current Space Shuttle flights and the assembly sequence of the Space Station Freedom with emphasis on the redesign effort.

  3. European Science Notes Information Bulletin. Report on Current European and Middle Eastern Science

    DTIC Science & Technology

    1992-10-01

    Keywords: HF radar systems; surface current determination; wave motions; Global Ocean Observing System; high-resolution model capabilities; ocean-atmosphere interface; Surface Density Depression Pool; forecasting.

  4. Space Weather Models at the CCMC And Their Capabilities

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Rastatter, Lutz; MacNeice, Peter; Kuznetsova, Masha

    2007-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. The second focus of CCMC activities is on validation and verification of space weather models, and on the transition of appropriate models to space weather forecast centers. As part of the latter activity, the CCMC develops real-time simulation systems that stress models through routine execution. A by-product of these real-time calculations is the ability to derive model products, which may be useful for space weather operators. In this presentation, we will provide an overview of the community-provided, space weather-relevant, model suite, which resides at CCMC. We will discuss current capabilities, and analyze expected future developments of space weather related modeling.

  5. Integrated simulations for fusion research in the 2030's time frame (white paper outline)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, Alex; LoDestro, Lynda L.; Parker, Jeffrey B.

    This white paper presents the rationale for developing a community-wide capability for whole-device modeling, and advocates for an effort with the expectation of persistence: a long-term programmatic commitment, and support for community efforts. Statement of 2030 goal (two suggestions): (a) Robust integrated simulation tools to aid real-time experimental discharges and reactor designs by employing a hierarchy in fidelity of physics models. (b) To produce by the early 2030s a capability for validated, predictive simulation via integration of a suite of physics models from moderate through high fidelity, to understand and plan full plasma discharges, aid in data interpretation, carry out discovery science, and optimize future machine designs. We can achieve this goal via a focused effort to extend current scientific capabilities and rigorously integrate simulations of disparate physics into a comprehensive set of workflows.

  6. Extension of HCDstruct for Transonic Aeroservoelastic Analysis of Unconventional Aircraft Concepts

    NASA Technical Reports Server (NTRS)

    Quinlan, Jesse R.; Gern, Frank H.

    2017-01-01

    A substantial effort has been made to implement an enhanced aerodynamic modeling capability in the Higher-fidelity Conceptual Design and structural optimization (HCDstruct) tool. This additional capability is needed for a rapid, physics-based method of modeling advanced aircraft concepts at risk of structural failure due to dynamic aeroelastic instabilities. To adequately predict these instabilities, in particular for transonic applications, a generalized aerodynamic matching algorithm was implemented to correct the doublet-lattice model available in Nastran using solution data from a priori computational fluid dynamics analysis. This new capability is demonstrated for two tube-and-wing aircraft configurations, including a Boeing 737-200 for implementation validation and the NASA D8 as a first use case. Results validate the current implementation of the aerodynamic matching utility and demonstrate the importance of using such a method for aircraft configurations featuring fuselage-wing aerodynamic interaction.
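
    The abstract does not spell out the matching algorithm, but one common correction-factor approach is to scale doublet-lattice panel pressures so they reproduce steady CFD pressures; the Python sketch below illustrates that generic idea with small hypothetical matrices and is not the HCDstruct implementation.

      # Generic correction-factor idea (hypothetical data, not the HCDstruct algorithm):
      # scale each doublet-lattice panel so its pressure matches the CFD solution.
      import numpy as np

      cp_dlm = np.array([0.40, 0.55, 0.30])   # panel pressure coefficients from DLM
      cp_cfd = np.array([0.46, 0.50, 0.33])   # pressures from an a priori CFD solution

      W = np.diag(cp_cfd / cp_dlm)            # one multiplicative correction per panel

      Q_dlm = np.array([[1.0, 0.2, 0.0],      # hypothetical DLM aerodynamic influence matrix
                        [0.1, 1.1, 0.2],
                        [0.0, 0.1, 0.9]])

      Q_corrected = W @ Q_dlm                 # corrected matrix used in the aeroelastic solution
      print(Q_corrected)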

  7. Exploration Medical Capability System Engineering Overview

    NASA Technical Reports Server (NTRS)

    McGuire, K.; Mindock, J.

    2018-01-01

    Deep Space Gateway and Transport missions will change the way NASA currently practices medicine. The missions will require more autonomous capability compared to current low Earth orbit operations. For the medical system, lack of consumable resupply, evacuation opportunities, and real-time ground support are key drivers toward greater autonomy. Recognition of the limited mission and vehicle resources available to carry out exploration missions motivates the Exploration Medical Capability (ExMC) Element's approach to enabling the necessary autonomy. The ExMC Systems Engineering team's mission is to "Define, develop, validate, and manage the technical system design needed to implement exploration medical capabilities for Mars and test the design in a progression of proving grounds." The Element's work must integrate with the overall exploration mission and vehicle design efforts to successfully provide exploration medical capabilities. ExMC is using Model-Based System Engineering (MBSE) to accomplish its integrative goals. The MBSE approach to medical system design offers a paradigm shift toward greater integration between vehicle and the medical system, and directly supports the transition of Earth-reliant ISS operations to the Earth-independent operations envisioned for Mars. This talk will discuss how ExMC is using MBSE to define operational needs, decompose requirements and architecture, and identify medical capabilities needed to support human exploration. How MBSE is being used to integrate across disciplines and NASA Centers will also be described. The medical system being discussed in this talk is one system within larger habitat systems. Data generated within the medical system will be inputs to other systems and vice versa. This talk will also describe the next steps in model development that include: modeling the different systems that comprise the larger system and interact with the medical system, understanding how the various systems work together, and developing tools to support trade studies.

  8. Hypersonic flow analysis

    NASA Technical Reports Server (NTRS)

    Chow, Chuen-Yen; Ryan, James S.

    1987-01-01

    While the zonal grid system of Transonic Navier-Stokes (TNS) provides excellent modeling of complex geometries, improved shock capturing and a higher Mach number range will be required if flows about hypersonic aircraft are to be modeled accurately. A computational fluid dynamics (CFD) code, the Compressible Navier-Stokes (CNS), is under development to combine the required high Mach number capability with the existing TNS geometry capability. One of several candidate flow solvers for inclusion in the CNS is that of F3D. This upwinding flow solver promises improved shock capturing, and more accurate hypersonic solutions overall, compared to the solver currently used in TNS.

  9. Guide to Modelling & Simulation (M&S) for NATO Network-Enabled Capability (M&S for NNEC) (Guide de la modelisation et de la simulation (M&S) pour las NATO network-enabled capability (M&S de la NNEC))

    DTIC Science & Technology

    2010-02-01

    The guide notes that assessing interdependencies and then modifying plans according to updated projections is currently an immature area where further research is required, and that the use of M&S to support complex operations, while immature, needs ongoing research if the optimum benefit is to be obtained. Cited work includes Zeigler, B.P. and Hammonds, P. (2007), "Modelling and Simulation-Based Data Engineering: Introducing Pragmatics and Ontologies".

  10. Validation of Multibody Program to Optimize Simulated Trajectories II Parachute Simulation with Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Queen, Eric M.; Hotchko, Nathaniel J.

    2009-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed, tested and validated. This capability uses the Program to Optimize Simulated Trajectories II (POST 2). The standard version of POST 2 allows trajectory simulation of multiple bodies without force interaction. In the current implementation, the force interaction between the parachute and the suspended bodies has been modeled using flexible lines, allowing accurate trajectory simulation of the individual bodies in flight. The POST 2 multibody capability is intended to be general purpose and applicable to any parachute entry trajectory simulation. This research paper explains the motivation for multibody parachute simulation, discusses implementation methods, and presents validation of this capability.
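
    As a rough illustration of the kind of line-force coupling the abstract describes (a minimal sketch only; the stiffness, damping, and unstretched length below are hypothetical, and the actual POST 2 line model is not reproduced here):

      # Tension-only spring-damper "flexible line" between a parachute node and a
      # suspended body. Hypothetical constants; illustrative of the coupling idea only.
      import numpy as np

      def line_force(x_chute, x_body, v_chute, v_body, l0=10.0, k=5.0e4, c=2.0e2):
          """Force (N) applied to the suspended body by one elastic line."""
          r = x_chute - x_body                 # vector from body attach point to chute
          dist = np.linalg.norm(r)
          if dist <= l0:                       # a slack line carries no load
              return np.zeros(3)
          u = r / dist                         # unit vector along the line
          stretch = dist - l0
          rel_vel = np.dot(v_chute - v_body, u)
          tension = k * stretch + c * rel_vel  # linear spring plus damping
          return max(tension, 0.0) * u         # tension only, pulling toward the chute

      print(line_force(np.array([0.0, 0.0, 12.0]), np.zeros(3),
                       np.zeros(3), np.array([0.0, 0.0, -5.0])))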

  11. NEXT Ion Thruster Performance Dispersion Analyses

    NASA Technical Reports Server (NTRS)

    Soulas, George C.; Patterson, Michael J.

    2008-01-01

    The NEXT ion thruster is a low specific mass, high performance thruster with a nominal throttling range of 0.5 to 7 kW. Numerous engineering model thrusters and one prototype model thruster have been manufactured and tested. Of significant importance to propulsion system performance are thruster-to-thruster performance dispersions. This type of information can provide a bandwidth of expected performance variations both on a thruster and a component level. Knowledge of these dispersions can be used to more conservatively predict thruster service life capability and thruster performance for mission planning, facilitate future thruster performance comparisons, and verify that power processor capabilities are compatible with the thruster design. This study compiles the test results of five engineering model thrusters and one flight-like thruster to determine unit-to-unit dispersions in thruster performance. Component level performance dispersion analyses will include discharge chamber voltages, currents, and losses; accelerator currents, electron backstreaming limits, and perveance limits; and neutralizer keeper and coupling voltages and the spot-to-plume mode transition flow rates. Thruster level performance dispersion analyses will include thrust efficiency.

  12. Development and Application of Wide Bandwidth Magneto-Resistive Sensor Based Eddy Current Probe

    NASA Technical Reports Server (NTRS)

    Wincheski, Russell A.; Simpson, John

    2010-01-01

    The integration of magneto-resistive sensors into eddy current probes can significantly expand the capabilities of conventional eddy current nondestructive evaluation techniques. The room temperature solid-state sensors have typical bandwidths in the megahertz range and resolutions of tens of microgauss. The low frequency sensitivity of magneto-resistive sensors has been capitalized upon in previous research to fabricate very low frequency eddy current sensors for deep flaw detection in multilayer conductors. In this work a modified probe design is presented to expand the capabilities of the device. The new probe design incorporates a dual induction source enabling operation from low frequency deep flaw detection to high frequency high resolution near surface material characterization. Applications of the probe for the detection of localized near surface conductivity anomalies are presented. Finite element modeling of the probe is shown to be in good agreement with experimental measurements.

  13. Structural equation modeling and natural systems

    USGS Publications Warehouse

    Grace, James B.

    2006-01-01

    This book, first published in 2006, presents an introduction to the methodology of structural equation modeling, illustrates its use, and goes on to argue that it has revolutionary implications for the study of natural systems. A major theme of this book is that we have, up to this point, attempted to study systems primarily using methods (such as the univariate model) that were designed only for considering individual processes. Understanding systems requires the capacity to examine simultaneous influences and responses. Structural equation modeling (SEM) has such capabilities. It also possesses many other traits that add strength to its utility as a means of making scientific progress. In light of the capabilities of SEM, it can be argued that much of ecological theory is currently locked in an immature state that impairs its relevance. It is further argued that the principles of SEM are capable of leading to the development and evaluation of multivariate theories of the sort vitally needed for the conservation of natural systems.

  14. A model to assess the Mars Telecommunications Network relay robustness

    NASA Technical Reports Server (NTRS)

    Girerd, Andre R.; Meshkat, Leila; Edwards, Charles D., Jr.; Lee, Charles H.

    2005-01-01

    The relatively long mission durations and compatible radio protocols of current and projected Mars orbiters have enabled the gradual development of a heterogeneous constellation providing proximity communication services for surface assets. The current and forecasted capability of this evolving network has reached the point that designers of future surface missions consider complete dependence on it. Such designers, along with those architecting network requirements, have a need to understand the robustness of projected communication service. A model has been created to identify the robustness of the Mars Network as a function of surface location and time. Due to the decade-plus time horizon considered, the network will evolve, with emerging productive nodes and nodes that cease or fail to contribute. The model is a flexible framework to holistically process node information into measures of capability robustness that can be visualized for maximum understanding. Outputs from JPL's Telecom Orbit Analysis Simulation Tool (TOAST) provide global telecom performance parameters for current and projected orbiters. Probabilistic estimates of orbiter fuel life are derived from orbit keeping burn rates, forecasted maneuver tasking, and anomaly resolution budgets. Orbiter reliability is estimated probabilistically. A flexible scheduling framework accommodates the projected mission queue as well as potential alterations.

  15. Payload Planning for the International Space Station

    NASA Technical Reports Server (NTRS)

    Johnson, Tameka J.

    1995-01-01

    A review of the evolution of the International Space Station (ISS) was performed for the purpose of understanding the project objectives. It was requested that an analysis of the current Office of Space Access and Technology (OSAT) Partnership Utilization Plan (PUP) traffic model be completed to monitor the process through which the scientific experiments called payloads are manifested for flight to the ISS. A viewing analysis of the ISS was also proposed to identify the capability to observe the United States Laboratory (US LAB) during the assembly sequence. Observations of the Drop-Tower experiment and nondestructive testing procedures were also performed to maximize the intern's technical experience. Contributions were made to the meeting in which the 1996 OSAT or Code X PUP traffic model was generated using the software tool, Filemaker Pro. The current OSAT traffic model satisfies the requirement for manifesting and delivering the proposed payloads to station. The current viewing capability of station provides the ability to view the US LAB during the station assembly sequence. The Drop Tower experiment successfully simulates the effect of microgravity and conveniently documents the results for later use. The nondestructive test proved effective in determining stress in the various components tested.

  16. Implementation of a tree algorithm in MCNP code for nuclear well logging applications.

    PubMed

    Li, Fusheng; Han, Xiaogang

    2012-07-01

    The goal of this paper is to develop some modeling capabilities that are missing in the current MCNP code. These capabilities can greatly aid the design of certain nuclear tools, such as a nuclear lithology/mineralogy spectroscopy tool. The new capabilities developed in this paper include the following: zone tally, neutron interaction tally, gamma-ray index tally and enhanced pulse-height tally. The patched MCNP code can also be used to compute the neutron slowing-down length and thermal neutron diffusion length.
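
    For reference, the standard one-group relations that such computed quantities would be compared against (textbook definitions, not the patched code's internal algorithm) are

      \[
        L_{\mathrm{th}} = \sqrt{\frac{D}{\Sigma_a}}, \qquad
        D \approx \frac{1}{3\Sigma_{tr}}, \qquad
        M^2 = L_s^2 + L_{\mathrm{th}}^2 ,
      \]

    where \(L_{\mathrm{th}}\) is the thermal diffusion length, \(\Sigma_a\) and \(\Sigma_{tr}\) are the macroscopic absorption and transport cross sections, \(L_s\) is the slowing-down length, and \(M\) is the migration length.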

  17. Advances in Computational Capabilities for Hypersonic Flows

    NASA Technical Reports Server (NTRS)

    Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip

    1997-01-01

    The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980's to the present day. The current status of the code development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is provided, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance accuracy, reliability, efficiency, and robustness of computational codes are discussed.

  18. Surrogate modeling of joint flood risk across coastal watersheds

    NASA Astrophysics Data System (ADS)

    Bass, Benjamin; Bedient, Philip

    2018-03-01

    This study discusses the development and performance of a rapid prediction system capable of representing the joint rainfall-runoff and storm surge flood response of tropical cyclones (TCs) for probabilistic risk analysis. Due to the computational demand required for accurately representing storm surge with the high-fidelity ADvanced CIRCulation (ADCIRC) hydrodynamic model and its coupling with additional numerical models to represent rainfall-runoff, a surrogate or statistical model was trained to represent the relationship between hurricane wind- and pressure-field characteristics and the peak joint flood response typically determined from physics-based numerical models. This builds upon past studies that have only evaluated surrogate models for predicting peak surge, and provides the first system capable of probabilistically representing joint flood levels from TCs. The utility of this joint flood prediction system is then demonstrated by improving upon probabilistic TC flood risk products, which currently account for storm surge but do not take into account TC-associated rainfall-runoff. Results demonstrate the source apportionment of rainfall-runoff versus storm surge and highlight that slight increases in flood risk levels may occur due to the interaction between rainfall-runoff and storm surge as compared to the Federal Emergency Management Agency's (FEMA's) current practices.
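
    To make the surrogate idea concrete, the sketch below fits a Gaussian-process regressor from a few storm descriptors to a peak flood response; the predictors, coefficients, and training data are synthetic stand-ins, whereas the study itself trains on coupled ADCIRC and rainfall-runoff simulations.

      # Hedged sketch of a storm-parameter -> peak-flood surrogate (synthetic data).
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      rng = np.random.default_rng(0)
      # columns: central pressure deficit (mb), radius of max winds (km),
      # forward speed (m/s), landfall offset from the watershed (km) -- all hypothetical
      X = rng.uniform([30, 20, 2, -100], [100, 80, 10, 100], size=(60, 4))
      y = (0.04 * X[:, 0] + 0.02 * X[:, 1] - 0.10 * X[:, 2]
           - 0.01 * np.abs(X[:, 3]) + rng.normal(0, 0.2, 60))   # stand-in peak stage (m)

      gp = GaussianProcessRegressor(
          kernel=ConstantKernel() * RBF(length_scale=[20.0, 20.0, 3.0, 50.0]),
          normalize_y=True).fit(X, y)

      mean, std = gp.predict(np.array([[70.0, 45.0, 5.0, 10.0]]), return_std=True)
      print(f"predicted peak flood stage: {mean[0]:.2f} m +/- {std[0]:.2f} m")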

  19. First-Principles-Driven Model-Based Optimal Control of the Current Profile in NSTX-U

    NASA Astrophysics Data System (ADS)

    Ilhan, Zeki; Barton, Justin; Wehner, William; Schuster, Eugenio; Gates, David; Gerhardt, Stefan; Kolemen, Egemen; Menard, Jonathan

    2014-10-01

    Regulation in time of the toroidal current profile is one of the main challenges toward the realization of the next-step operational goals for NSTX-U. A nonlinear, control-oriented, physics-based model describing the temporal evolution of the current profile is obtained by combining the magnetic diffusion equation with empirical correlations obtained at NSTX-U for the electron density, electron temperature, and non-inductive current drives. In this work, the proposed model is embedded into the control design process to synthesize a time-variant, linear-quadratic-integral, optimal controller capable of regulating the safety factor profile around a desired target profile while rejecting disturbances. Neutral beam injectors and the total plasma current are used as actuators to shape the current profile. The effectiveness of the proposed controller in regulating the safety factor profile in NSTX-U is demonstrated via closed-loop predictive simulations carried out in PTRANSP. Supported by PPPL.
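
    Schematically (simplified notation, not the exact NSTX-U model equations), the controlled plant is the poloidal-flux diffusion equation and the controller minimizes a linear-quadratic-integral cost:

      \[
        \frac{\partial \psi}{\partial t}
          \;\approx\; \frac{\eta(\hat\rho,t)}{\mu_0\,\rho_b^2\,\hat F^2}\,
            \frac{1}{\hat\rho}\frac{\partial}{\partial\hat\rho}\!\left(
              \hat\rho\,\hat F\hat G\hat H\,\frac{\partial \psi}{\partial\hat\rho}\right)
          \;+\; R_0 \hat H\,\eta\,\frac{\langle \bar j_{NI}\cdot\bar B\rangle}{B_{\phi,0}},
        \qquad
        J = \int_0^\infty \!\big(z^{\mathsf T} Q\, z + u^{\mathsf T} R\, u\big)\,dt,
      \]

    where \(\psi\) is the poloidal flux, \(\eta\) the plasma resistivity, \(\hat F,\hat G,\hat H\) geometric profile factors, \(j_{NI}\) the non-inductive current density, \(z\) the safety-factor tracking error augmented with its time integral, and \(u\) the actuator vector (beam powers and total plasma current).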

  20. Assessing the Current Status of Atmospheric Radiation Modelling: Progress, Challenges and the Needs for the Next Generation of Models

    NASA Astrophysics Data System (ADS)

    Joyce, C. J.; Tobiska, W. K.; Copeland, K.; Smart, D. F.; Shea, M. A.; Nowicki, S.; Atwell, W.; Benton, E. R.; Wilkins, R.; Hands, A.; Gronoff, G.; Meier, M. M.; Schwadron, N.

    2017-12-01

    Despite its potential for causing a wide range of harmful effects, including health hazards to airline passengers and damage to aircraft and satellite electronics, atmospheric radiation remains a relatively poorly defined risk, lacking sufficient measurements and modelling to fully evaluate the dangers posed. While our reliance on airline travel has increased dramatically over time, there remains an absence of international guidance and standards to protect aircraft passengers from potential health impacts due to radiation exposure. This subject has been gaining traction within the scientific community in recent years, with an expanding number of models with increasing capabilities being made available to evaluate atmospheric radiation hazards. We provide a general description of these modelling efforts, including the physics and methods used by the models, as well as their data inputs and outputs. We also discuss the current capacity for model validation via measurements and discuss the needs for the next generation of models, both in terms of their capabilities and the measurements required to validate them. This review of the status of atmospheric radiation modelling is part of a larger series of studies made as part of the SAFESKY program, with other efforts focusing on the underlying physics and implications, measurements and regulations/standards of atmospheric radiation.

  1. Graphical workstation capability for reliability modeling

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Koppen, Sandra V.; Haley, Pamela J.

    1992-01-01

    In addition to computational capabilities, software tools for estimating the reliability of fault-tolerant digital computer systems must also provide a means of interfacing with the user. Described here is the new graphical interface capability of the hybrid automated reliability predictor (HARP), a software package that implements advanced reliability modeling techniques. The graphics oriented (GO) module provides the user with a graphical language for modeling system failure modes through the selection of various fault-tree gates, including sequence-dependency gates, or by a Markov chain. By using this graphical input language, a fault tree becomes a convenient notation for describing a system. In accounting for any sequence dependencies, HARP converts the fault-tree notation to a complex stochastic process that is reduced to a Markov chain, which it can then solve for system reliability. The graphics capability is available for use on an IBM-compatible PC, a Sun, and a VAX workstation. The GO module is written in the C programming language and uses the Graphical Kernel System (GKS) standard for graphics implementation. The PC, VAX, and Sun versions of the HARP GO module are currently in beta-testing stages.
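
    As a toy illustration of the fault-tree-to-Markov-chain reduction (a minimal sketch, not the HARP solver; the failure rate is hypothetical and no sequence dependencies or repairs are modeled):

      # Two-unit parallel system, each unit failing at constant rate lam.
      # States: 0 = both units up, 1 = one unit up, 2 = system failed (absorbing).
      import numpy as np
      from scipy.linalg import expm

      lam = 1.0e-4                           # hypothetical per-hour failure rate
      Q = np.array([[-2 * lam, 2 * lam, 0.0],
                    [0.0,     -lam,     lam],
                    [0.0,      0.0,     0.0]])   # CTMC generator (rows sum to zero)

      p0 = np.array([1.0, 0.0, 0.0])         # start with both units healthy
      t = 1000.0                             # mission time in hours
      p_t = p0 @ expm(Q * t)                 # state probabilities at time t
      print(f"system reliability at {t:.0f} h: {1.0 - p_t[2]:.6f}")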

  2. Development and evaluation of the bacterial fate and transport module for the agricultural policy/environmental extender (APEX) model

    USDA-ARS?s Scientific Manuscript database

    The Agricultural Policy/Environmental eXtender (APEX) is a watershed-scale water quality model that includes detailed representation of agricultural management but currently does not have microbial fate and transport simulation capabilities. The objective of this work was to develop a process-based ...

  3. Teacher's Corner: Structural Equation Modeling with the Sem Package in R

    ERIC Educational Resources Information Center

    Fox, John

    2006-01-01

    R is free, open-source, cooperatively developed software that implements the S statistical programming language and computing environment. The current capabilities of R are extensive, and it is in wide use, especially among statisticians. The sem package provides basic structural equation modeling facilities in R, including the ability to fit…

  4. Improving component interoperability and reusability with the java connection framework (JCF): overview and application to the ages-w environmental model

    USDA-ARS?s Scientific Manuscript database

    Environmental modeling framework (EMF) design goals are multi-dimensional and often include many aspects of general software framework development. Many functional capabilities offered by current EMFs are closely related to interoperability and reuse aspects. For example, an EMF needs to support dev...

  5. Autonomous cloud based site monitoring through hydro geophysical data assimilation, processing and result delivery

    NASA Astrophysics Data System (ADS)

    Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.

    2017-12-01

    Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological and remote sensing datasets are assimilated, processed and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), visualization (VisIt, ParaView, D3, QGIS) as well as numerous tools written in Python and R for petrophysical mapping, stochastic modeling, data analysis and so on. These capabilities use data collected with sensors and analytical tools developed by multiple manufacturers, which produce many different measurements. While scientists obviously leverage tools, capabilities and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands) of sites. This requires that monitoring can be done in a near-real-time, affordable, auditable and essentially autonomous manner. For this we have developed a modular, vertically integrated, cloud-based software framework which was designed from the ground up for effective site and process monitoring. This software framework (PAF - Predictive Assimilation Framework) is multitenant software and provides automation of data ingestion, processing and visualization of hydrological, geochemical and geophysical (ERT/DTS) data. The core organizational element of PAF is a project/user one in which the capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components. PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows for automation of electrical geophysical ingestion and processing, as well as co-analysis and visualization of the raw and processed data with other data of interest (e.g. soil temperature, soil moisture, precipitation). We will demonstrate current PAF capabilities and discuss future efforts.

  6. Current and anticipated uses of thermal-hydraulic codes in NFI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsuda, K.; Takayasu, M.

    1997-07-01

    This paper presents the thermal-hydraulic codes currently used in NFI for LWR fuel development and licensing applications, including transient and design-basis-accident analyses of LWR plants. The current status of the codes is described in terms of code capability, modeling features, and experience with code application related to fuel development and licensing. Finally, the anticipated use of future thermal-hydraulic codes in NFI is briefly described.

  7. Structure and dynamics of the coronal magnetic field

    NASA Technical Reports Server (NTRS)

    VanHoven, Gerard; Schnack, Dalton D.

    1996-01-01

    The last few years have seen a marked increase in the sophistication of models of the solar corona. This has been brought about by a confluence of three key elements. First, the collection of high-resolution observations of the Sun, both in space and time, has grown tremendously. The SOHO (Solar Heliospheric Observatory) mission is providing additional correlated high-resolution magnetic, white-light and spectroscopic observations. Second, the power and availability of supercomputers has made two- and three-dimensional modeling routine. Third, the sophistication of the models themselves, both in their geometrical realism and in the detailed physics that has been included, has improved significantly. The support from our current Space Physics Theory grant has allowed us to exploit this confluence of capabilities. We have carried out direct comparisons between observations and models of the solar corona. The agreement between simulated coronal structure and observations has verified that the models are mature enough for detailed analysis, as we will describe. The development of this capability is especially timely, since observations obtained from three space missions that are underway (Ulysses, WIND and SOHO) offer an opportunity for significant advances in our understanding of the corona and heliosphere. Through this interplay of observations and theory we can improve our understanding of the Sun. Our achievements thus far include progress modeling the large-scale structure of the solar corona, three-dimensional models of active region fields, development of emerging flux and current, formation and evolution of coronal loops, and coronal heating by current filaments.

  8. Refining the aggregate exposure pathway

    EPA Science Inventory

    Advancements in measurement technologies and modeling capabilities continue to result in an abundance of exposure information, adding to that currently in existence. However, fragmentation within the exposure science community acts as an obstacle for realizing the vision set fort...

  9. A new mathematical model and control of a three-phase AC-DC voltage source converter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blasko, V.; Kaura, V.

    1997-01-01

    A new mathematical model of the power circuit of a three-phase voltage source converter (VSC) was developed in the stationary and synchronous reference frames. The mathematical model was then used to analyze and synthesize the voltage and current control loops for the VSC. Analytical expressions were derived for calculating the gains and time constants of the current and voltage regulators. The mathematical model was used to control a 140-kW regenerative VSC. The synchronous reference frame model was used to define feedforward signals in the current regulators to eliminate the cross coupling between the d and q phases. It allowed the reduction of the current control loops to first-order plants and improved their tracking capability. The bandwidths of the current and voltage-control loops were found to be approximately 20 and 60 times (respectively) smaller than the sampling frequency. All control algorithms were implemented in a digital-signal processor. All results of the analysis were experimentally verified.
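
    In one common textbook form consistent with the abstract (sign conventions vary, and this is not necessarily the paper's exact formulation), the synchronous-frame current dynamics and the decoupling feedforward are

      \[
        L\frac{di_d}{dt} = -R\,i_d + \omega L\,i_q + v_d - e_d, \qquad
        L\frac{di_q}{dt} = -R\,i_q - \omega L\,i_d + v_q - e_q,
      \]
      \[
        v_d^{*} = u_d - \omega L\,i_q + e_d, \qquad
        v_q^{*} = u_q + \omega L\,i_d + e_q,
      \]

    so that with the feedforward terms applied each axis reduces to the first-order plant \(L\,di/dt = -R\,i + u\), which the current regulator outputs \(u_d, u_q\) can track without d-q cross coupling. Here \(e_{d,q}\) are the grid voltages, \(v_{d,q}\) the converter terminal voltages, and \(R, L\) the line resistance and inductance.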

  10. Terrestrial gamma-ray flashes

    NASA Astrophysics Data System (ADS)

    Marisaldi, Martino; Fuschino, Fabio; Labanti, Claudio; Tavani, Marco; Argan, Andrea; Del Monte, Ettore; Longo, Francesco; Barbiellini, Guido; Giuliani, Andrea; Trois, Alessio; Bulgarelli, Andrea; Gianotti, Fulvio; Trifoglio, Massimo

    2013-08-01

    Lightning and thunderstorm systems in general have been recently recognized as powerful particle accelerators, capable of producing electrons, positrons, gamma-rays and neutrons with energies as high as several tens of MeV. In fact, these natural systems turn out to be the highest energy and most efficient natural particle accelerators on Earth. Terrestrial Gamma-ray Flashes (TGFs) are millisecond-long, very intense bursts of gamma-rays and are one of the most intriguing manifestations of these natural accelerators. Only three currently operative missions are capable of detecting TGFs from space: the RHESSI, Fermi and AGILE satellites. In this paper we review the characteristics of TGFs, including energy spectrum, timing structure, beam geometry and correlation with lightning, and the basic principles of the associated production models. Then we focus on the recent AGILE discoveries concerning the high energy extension of the TGF spectrum up to 100 MeV, which is difficult to reconcile with current theoretical models.

  11. Distributed generation capabilities of the national energy modeling system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaCommare, Kristina Hamachi; Edwards, Jennifer L.; Marnay, Chris

    2003-01-01

    This report describes Berkeley Lab's exploration of how the National Energy Modeling System (NEMS) models distributed generation (DG) and presents possible approaches for improving how DG is modeled. The on-site electric generation capability has been available since the AEO2000 version of NEMS. Berkeley Lab has previously completed research on distributed energy resources (DER) adoption at individual sites and has developed a DER Customer Adoption Model called DER-CAM. Given interest in this area, Berkeley Lab set out to understand how NEMS models small-scale on-site generation to assess how adequately DG is treated in NEMS, and to propose improvements or alternatives. The goal is to determine how well NEMS models the factors influencing DG adoption and to consider alternatives to the current approach. Most small-scale DG adoption takes place in the residential and commercial modules of NEMS. Investment in DG ultimately offsets purchases of electricity, which also eliminates the losses associated with transmission and distribution (T&D). If the DG technology that is chosen is photovoltaics (PV), NEMS assumes renewable energy consumption replaces the energy input to electric generators. If the DG technology is fuel consuming, consumption of fuel in the electric utility sector is replaced by residential or commercial fuel consumption. The waste heat generated from thermal technologies can be used to offset the water heating and space heating energy uses, but there is no thermally activated cooling capability. This study consists of a review of model documentation and a paper by EIA staff, a series of sensitivity runs performed by Berkeley Lab that exercise selected DG parameters in the AEO2002 version of NEMS, and a scoping effort of possible enhancements and alternatives to NEMS's current DG capabilities. In general, the treatment of DG in NEMS is rudimentary. The penetration of DG is determined by an economic cash-flow analysis that determines adoption based on the number of years to a positive cash flow. Some important technologies, e.g. thermally activated cooling, are absent, and ceilings on DG adoption are determined by somewhat arbitrary caps on the number of buildings that can adopt DG. These caps are particularly severe for existing buildings, where the maximum penetration for any one technology is 0.25 percent. On the other hand, competition among technologies is not fully considered, and this may result in double-counting for certain applications. A series of sensitivity runs show greater penetration with net metering enhancements and aggressive tax credits and a more limited response to lowered DG technology costs. Discussion of alternatives to the current code is presented in Section 4. Alternatives or improvements to how DG is modeled in NEMS cover three basic areas: expanding the existing total market for DG, both by changing existing parameters in NEMS and by adding new capabilities, such as for missing technologies; enhancing the cash flow analysis by incorporating aspects of DG economics that are not currently represented, e.g. complex tariffs; and using an external geographic information system (GIS) driven analysis that can better and more intuitively identify niche markets.
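
    The cash-flow test described above amounts to finding the first year in which cumulative savings exceed the investment; the sketch below shows that calculation with hypothetical numbers (the actual NEMS cash-flow algorithm includes financing, taxes, and fuel costs not modeled here).

      # Hedged sketch: "years to positive cumulative cash flow" with hypothetical inputs.
      def years_to_positive_cash_flow(capital_cost, annual_savings, annual_om, max_years=30):
          """Return the first year the cumulative cash flow turns positive, or None."""
          cumulative = -capital_cost                    # installed cost as a year-0 outflow
          for year in range(1, max_years + 1):
              cumulative += annual_savings - annual_om  # avoided purchases minus O&M
              if cumulative > 0:
                  return year
          return None

      # hypothetical 5 kW PV system: $20,000 installed, $1,800/yr avoided purchases, $100/yr O&M
      print(years_to_positive_cash_flow(20000.0, 1800.0, 100.0))   # -> 12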

  12. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorensek, M.; Hamm, L.; Garcia, H.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  13. Sharp burnout failure observed in high current-carrying double-walled carbon nanotube fibers

    NASA Astrophysics Data System (ADS)

    Song, Li; Toth, Geza; Wei, Jinquan; Liu, Zheng; Gao, Wei; Ci, Lijie; Vajtai, Robert; Endo, Morinobu; Ajayan, Pulickel M.

    2012-01-01

    We report on the current-carrying capability and the high-current-induced thermal burnout failure modes of 5-20 µm diameter double-walled carbon nanotube (DWNT) fibers made by an improved dry-spinning method. It is found that the electrical conductivity and maximum current-carrying capability for these DWNT fibers can reach up to 5.9 × 10⁵ S m⁻¹ and over 1 × 10⁵ A cm⁻² in air. In comparison, we observed that standard carbon fiber tended to be oxidized and burnt out into a cheese-like morphology when the maximum current was reached, while DWNT fiber showed a much slower breakdown behavior due to the gradual burnout of individual nanotubes. The electron microscopy observations further confirmed that the failure process of DWNT fibers occurs at localized positions, and while the individual nanotubes burn they also become aligned due to the local high temperature and electrostatic field. In addition, a finite element model was constructed to gain better understanding of the failure behavior of DWNT fibers.
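
    As a quick order-of-magnitude check of what the quoted current density implies for a mid-range (10 µm diameter) fiber:

      \[
        I = J\,A = \left(1\times10^{5}\ \mathrm{A\,cm^{-2}}\right)\,
            \pi\left(5\times10^{-4}\ \mathrm{cm}\right)^{2}
          \approx 7.9\times10^{-2}\ \mathrm{A} \approx 80\ \mathrm{mA},
      \]

    i.e., a single hair-thin fiber sustaining on the order of one hundred milliamperes in air.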

  14. Representing Geospatial Environment Observation Capability Information: A Case Study of Managing Flood Monitoring Sensors in the Jinsha River Basin

    PubMed Central

    Hu, Chuli; Guan, Qingfeng; Li, Jie; Wang, Ke; Chen, Nengcheng

    2016-01-01

    Sensor inquirers cannot understand comprehensive or accurate observation capability information because current observation capability modeling does not consider the union of multiple sensors nor the effect of geospatial environmental features on the observation capability of sensors. These limitations result in a failure to discover credible sensors or plan for their collaboration for environmental monitoring. The Geospatial Environmental Observation Capability (GEOC) is proposed in this study and can be used as an information basis for the reliable discovery and collaborative planning of multiple environmental sensors. A field-based GEOC (GEOCF) information representation model is built. Quintuple GEOCF feature components and two GEOCF operations are formulated based on the geospatial field conceptual framework. The proposed GEOCF markup language is used to formalize the proposed GEOCF. A prototype system called GEOCapabilityManager is developed, and a case study is conducted for flood observation in the lower reaches of the Jinsha River Basin. The applicability of the GEOCF is verified through the reliable discovery of flood monitoring sensors and planning for the collaboration of these sensors. PMID:27999247

  16. Proceedings of the OECD/CSNI workshop on transient thermal-hydraulic and neutronic codes requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ebert, D.

    1997-07-01

    This is a report on the CSNI Workshop on Transient Thermal-Hydraulic and Neutronic Codes Requirements held at Annapolis, Maryland, USA, November 5-8, 1996. This experts' meeting consisted of 140 participants from 21 countries; 65 invited papers were presented. The meeting was divided into five areas: (1) current and prospective plans of thermal-hydraulic code development; (2) current and anticipated uses of thermal-hydraulic codes; (3) advances in modeling of thermal-hydraulic phenomena and associated additional experimental needs; (4) numerical methods in multi-phase flows; and (5) programming languages, code architectures and user interfaces. The workshop consensus identified the following important action items to be addressed by the international community in order to maintain and improve the calculational capability: (a) preserve current code expertise and institutional memory, (b) preserve the ability to use the existing investment in plant transient analysis codes, (c) maintain essential experimental capabilities, (d) develop advanced measurement capabilities to support future code validation work, (e) integrate existing analytical capabilities so as to improve performance and reduce operating costs, (f) exploit the proven advances in code architecture, numerics, graphical user interfaces, and modularization in order to improve code performance and scrutability, and (g) more effectively utilize user experience in modifying and improving the codes.

  17. Current CFD Practices in Launch Vehicle Applications

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin

    2012-01-01

    The quest for sustained space exploration will require the development of advanced launch vehicles, and efficient and reliable operating systems. Development of launch vehicles via a test-fail-fix approach is very expensive and time consuming. For decision making, modeling and simulation (M&S) has played increasingly important roles in many aspects of launch vehicle development. It is therefore essential to develop and maintain the most advanced M&S capability. More specifically, computational fluid dynamics (CFD) has been providing critical data for developing launch vehicles, complementing expensive testing. During the past three decades CFD capability has increased remarkably along with advances in computer hardware and computing technology. However, most of the fundamental CFD capability in launch vehicle applications is derived from past advances. Specific gaps in the solution procedures are being filled primarily through "piggy-backed" efforts on various projects while solving today's problems. Therefore, some of the advanced capabilities are not readily available for various new tasks, and mission-support problems are often analyzed using ad hoc approaches. The current report is intended to present our view on the state of the art (SOA) in CFD and its shortcomings in support of space transport vehicle development. Best practices in solving current issues will be discussed using examples from ascending launch vehicles. Some of the pacing issues will be discussed in conjunction with these examples.

  18. Performance Analysis of a Ring Current Model Driven by Global MHD

    NASA Astrophysics Data System (ADS)

    Falasca, A.; Keller, K. A.; Fok, M.; Hesse, M.; Gombosi, T.

    2003-12-01

    Effectively modeling the high-energy particles in Earth's inner magnetosphere has the potential to improve safety in both manned and unmanned spacecraft. One model of this environment is the Fok Ring Current Model. This model can utilize as inputs both solar wind data, and empirical ionospheric electric field and magnetic field models. Alternatively, we have a procedure which allows the model to be driven by outputs from the BATS-R-US global MHD model. By using in-situ satellite data we will compare the predictive capability of this model in its original stand-alone form, to that of the model when driven by the BATS-R-US Global Magnetosphere Model. As a basis for comparison we use the April 2002 and May 2003 storms where suitable LANL geosynchronous data are available.

  19. Computational Combustion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel, and homogeneous charge compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations is described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

  20. Center of Excellence for Applied Mathematical and Statistical Research in support of development of multicrop production monitoring capability

    NASA Technical Reports Server (NTRS)

    Woodward, W. A.; Gray, H. L.

    1983-01-01

    Efforts in support of the development of multicrop production monitoring capability are reported. In particular, segment level proportion estimation techniques based upon a mixture model were investigated. Efforts have dealt primarily with evaluation of current techniques and development of alternative ones. A comparison of techniques is provided on both simulated and LANDSAT data along with an analysis of the quality of profile variables obtained from LANDSAT data.
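
    The general mixture formulation underlying such segment-level proportion estimators (shown here only in its generic form, not the specific estimators evaluated) models the density of a pixel observation \(x\) as

      \[
        p(x) \;=\; \sum_{k=1}^{K} \pi_k\, f_k(x \mid \theta_k),
        \qquad \pi_k \ge 0,\quad \sum_{k=1}^{K}\pi_k = 1,
      \]

    where \(f_k\) is the spectral distribution of crop class \(k\) and the mixing proportions \(\pi_k\), estimated for example by maximum likelihood over all pixels in a segment, serve directly as the segment-level crop proportion estimates.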

  1. Improving measurement technology for the design of sustainable cities

    NASA Astrophysics Data System (ADS)

    Pardyjak, Eric R.; Stoll, Rob

    2017-09-01

    This review identifies and discusses measurement technology gaps that are currently preventing major science leaps from being realized in the study of urban environmental transport processes. These scientific advances are necessary to better understand the links between atmospheric transport processes in the urban environment, human activities, and potential management strategies. We propose that with various improved and targeted measurements, it will be possible to provide technically sound guidance to policy and decision makers for the design of sustainable cities. This review focuses on full-scale in situ and remotely sensed measurements of atmospheric winds, temperature, and humidity in cities and links measurements to current modeling and simulation needs. A key conclusion of this review is that there is a need for urban-specific measurement techniques including measurements of highly-resolved three-dimensional fields at sampling frequencies high enough to capture small-scale turbulence processes yet also capable of covering spatial extents large enough to simultaneously capture key features of urban heterogeneity and boundary layer processes while also supporting the validation of current and emerging modeling capabilities.

  2. Economic modeling of fault tolerant flight control systems in commercial applications

    NASA Technical Reports Server (NTRS)

    Finelli, G. B.

    1982-01-01

    This paper describes the current development of a comprehensive model which will supply the assessment and analysis capability to investigate the economic viability of Fault Tolerant Flight Control Systems (FTFCS) for commercial aircraft of the 1990's and beyond. An introduction to the unique attributes of fault tolerance and how they will influence aircraft operations and consequent airline costs and benefits is presented. Specific modeling issues and elements necessary for accurate assessment of all costs affected by ownership and operation of FTFCS are delineated. Trade-off factors are presented, aimed at exposing economically optimal realizations of system implementations, resource allocation, and operating policies. A trade-off example is furnished to graphically display some of the analysis capabilities of the comprehensive simulation model now being developed.

  3. Implementation of a Tabulated Failure Model Into a Generalized Composite Material Model Suitable for Use in Impact Problems

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Carney, Kelly S.; Dubois, Paul; Hoffarth, Canio; Khaled, Bilal; Shyamsunder, Loukham; Rajan, Subramaniam; Blankenhorn, Gunther

    2017-01-01

    The need for accurate material models to simulate the deformation, damage and failure of polymer matrix composites under impact conditions is becoming critical as these materials are gaining increased use in the aerospace and automotive communities. The aerospace community has identified several key capabilities which are currently lacking in the available material models in commercial transient dynamic finite element codes. To attempt to improve the predictive capability of composite impact simulations, a next generation material model is being developed for incorporation within the commercial transient dynamic finite element code LS-DYNA. The material model, which incorporates plasticity, damage and failure, utilizes experimentally based tabulated input to define the evolution of plasticity and damage and the initiation of failure as opposed to specifying discrete input parameters such as modulus and strength. The plasticity portion of the orthotropic, three-dimensional, macroscopic composite constitutive model is based on an extension of the Tsai-Wu composite failure model into a generalized yield function with a non-associative flow rule. For the damage model, a strain equivalent formulation is used to allow for the uncoupling of the deformation and damage analyses. For the failure model, a tabulated approach is utilized in which a stress or strain based invariant is defined as a function of the location of the current stress state in stress space to define the initiation of failure. Failure surfaces can be defined with any arbitrary shape, unlike traditional failure models where the mathematical functions used to define the failure surface impose a specific shape on the failure surface. In the current paper, the complete development of the failure model is described and the generation of a tabulated failure surface for a representative composite material is discussed.
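
    For orientation, the classical Tsai-Wu criterion that the generalized yield function extends can be written (in contracted stress notation) as

      \[
        F_i\,\sigma_i + F_{ij}\,\sigma_i\sigma_j = 1, \qquad i,j = 1,\dots,6,
        \qquad\text{e.g.}\quad
        F_1 = \frac{1}{X_t} - \frac{1}{X_c},\quad
        F_{11} = \frac{1}{X_t X_c},
      \]

    with \(X_t, X_c\) the tensile and compressive strengths in the fiber direction; the model described above generalizes this quadratic form into a yield surface with a non-associative flow rule and tabulated hardening, the details of which are not reproduced here.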

  4. Development of task network models of human performance in microgravity

    NASA Technical Reports Server (NTRS)

    Diaz, Manuel F.; Adam, Susan

    1992-01-01

    This paper discusses the utility of task-network modeling for quantifying human performance variability in microgravity. The data are gathered for: (1) improving current methodologies for assessing human performance and workload in the operational space environment; (2) developing tools for assessing alternative system designs; and (3) developing an integrated set of methodologies for the evaluation of performance degradation during extended duration spaceflight. The evaluation entailed an analysis of the Remote Manipulator System payload-grapple task performed on many shuttle missions. Task-network modeling can be used as a tool for assessing and enhancing human performance in man-machine systems, particularly for modeling long-duration manned spaceflight. Task-network modeling can be directed toward improving system efficiency by increasing the understanding of basic capabilities of the human component in the system and the factors that influence these capabilities.

  5. A Geant4 model of backscatter security imaging systems

    NASA Astrophysics Data System (ADS)

    Leboffe, Eric Matthew

    The operating characteristics of x-ray security scanner systems that utilize the backscatter signal in order to distinguish person-borne threats have never been made fully available to the general public. By designing a model using Geant4, studies can be performed which shed light on systems such as security scanners and allow for analysis of the performance and safety of the system without access to any system data. Despite the fact that the systems are no longer in use at airports in the United States, the ability to design and validate detector models and phenomena is an important capability that can be applied to many current real-world applications. The model presented provides estimates for absorbed dose, effective dose and dose depth distribution that are comparable to previously published work and explores imaging capabilities for the system embodiment modeled.

  6. Clinical Summarization Capabilities of Commercially-available and Internally-developed Electronic Health Records

    PubMed Central

    Laxmisan, A.; McCoy, A.B.; Wright, A.; Sittig, D.F.

    2012-01-01

    Objective Clinical summarization, the process by which relevant patient information is electronically summarized and presented at the point of care, is of increasing importance given the increasing volume of clinical data in electronic health record systems (EHRs). There is a paucity of research on electronic clinical summarization, including the capabilities of currently available EHR systems. Methods We compared different aspects of general clinical summary screens used in twelve different EHR systems using a previously described conceptual model: AORTIS (Aggregation, Organization, Reduction, Interpretation and Synthesis). Results We found a wide variation in the EHRs’ summarization capabilities: all systems were capable of simple aggregation and organization of limited clinical content, but only one demonstrated an ability to synthesize information from the data. Conclusion Improvement of the clinical summary screen functionality for currently available EHRs is necessary. Further research should identify strategies and methods for creating easy to use, well-designed clinical summary screens that aggregate, organize and reduce all pertinent patient information as well as provide clinical interpretations and synthesis as required. PMID:22468161

  7. Enhancing the LVRT Capability of PMSG-Based Wind Turbines Based on R-SFCL

    NASA Astrophysics Data System (ADS)

    Xu, Lin; Lin, Ruixing; Ding, Lijie; Huang, Chunjun

    2018-03-01

    A novel low voltage ride-through (LVRT) scheme for PMSG-based wind turbines based on the Resistor Superconducting Fault Current Limiter (R-SFCL) is proposed in this paper. The LVRT scheme is mainly formed by R-SFCL in series between the transformer and the Grid Side Converter (GSC), and basic modelling has been discussed in detail. The proposed LVRT scheme is implemented to interact with PMSG model in PSCAD/EMTDC under three phase short circuit fault condition, which proves that the proposed scheme based on R-SFCL can improve the transient performance and LVRT capability to consolidate grid connection with wind turbines.
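
    As a rough, illustrative complement to the scheme described above (not the PSCAD/EMTDC model itself), the per-unit phasor sketch below shows the basic mechanism: a quenched series resistance between the transformer and the grid-side converter reduces the fault current and raises the voltage retained on the converter side during a three-phase fault. All impedance values are assumptions chosen only for demonstration.

        import numpy as np

        # Assumed per-unit values, chosen only to illustrate the trend.
        V_grid   = 1.0         # pre-fault grid voltage (pu)
        Z_source = 0.05j       # grid/transformer impedance (pu)
        Z_fault  = 0.02j       # impedance between the converter bus and the fault (pu)
        R_sfcl   = 0.10        # quenched R-SFCL resistance (pu); ~0 when superconducting

        def fault_current(r_sfcl):
            # Series path during the fault: source impedance + SFCL + fault impedance.
            return V_grid / (Z_source + r_sfcl + Z_fault)

        for label, r in [("without SFCL", 0.0), ("with SFCL", R_sfcl)]:
            i = fault_current(r)
            v_bus = i * (r + Z_fault)          # voltage retained at the converter bus
            print(f"{label:13s} |I| = {abs(i):5.1f} pu, retained |V| = {abs(v_bus):.2f} pu")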

  8. MESSOC capabilities and results. [Model for Estimating Space Station Operations Costs]

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    1990-01-01

    MESSOC (Model for Estimating Space Station Operations Costs) is the result of a multi-year effort by NASA to understand and model the mature operations cost of Space Station Freedom. This paper focuses on MESSOC's ability to contribute to life-cycle cost analyses through its logistics equations and databases. Together, these afford MESSOC the capability to project not only annual logistics costs for a variety of Space Station scenarios, but critical non-cost logistics results such as annual Station maintenance crewhours, upweight/downweight, and on-orbit sparing availability as well. MESSOC results using current logistics databases and baseline scenario have already shown important implications for on-orbit maintenance approaches, space transportation systems, and international operations cost sharing.

  9. Nonlinear static and dynamic finite element analysis of an eccentrically loaded graphite-epoxy beam

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Jackson, Karen E.; Jones, Lisa E.

    1991-01-01

    The Dynamic Crash Analysis of Structures (DYCAST) and NIKE3D nonlinear finite element codes were used to model the static and impulsive response of an eccentrically loaded graphite-epoxy beam. A 48-ply unidirectional composite beam was tested under an eccentric axial compressive load until failure. This loading configuration was chosen to highlight the capabilities of two finite element codes for modeling a highly nonlinear, large deflection structural problem which has an exact solution. These codes are currently used to perform dynamic analyses of aircraft structures under impact loads to study crashworthiness and energy absorbing capabilities. Both beam and plate element models were developed to compare with the experimental data using the DYCAST and NIKE3D codes.

  10. A Steady State and Quasi-Steady Interface Between the Generalized Fluid System Simulation Program and the SINDA/G Thermal Analysis Program

    NASA Technical Reports Server (NTRS)

    Schallhorn, Paul; Majumdar, Alok; Tiller, Bruce

    2001-01-01

    A general-purpose, one-dimensional fluid flow code is currently being interfaced with the thermal analysis program SINDA/G. The flow code, GFSSP, is capable of analyzing steady-state and transient flow in a complex network and of modeling several physical phenomena, including compressibility effects, phase changes, body forces (such as gravity and centrifugal forces), and mixture thermodynamics for multiple species. The addition of GFSSP to SINDA/G provides a significant improvement in convective heat transfer modeling for SINDA/G. The interface development is conducted in multiple phases. This paper describes the first phase of the interface, which allows for steady and quasi-steady (unsteady solid, steady fluid) conjugate heat transfer modeling.
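
    The quasi-steady idea (unsteady solid, steady fluid) can be sketched with a simple loop in which, at every time step, a steady fluid energy balance is solved for the current wall temperature and the resulting convective load is used to advance the transient solid node. The sketch below is a minimal single-node illustration under assumed parameters; it is not the GFSSP/SINDA/G interface.

        # Minimal quasi-steady conjugate heat transfer loop (unsteady solid, steady fluid).
        # All parameters are illustrative.
        m_cp    = 500.0    # solid thermal mass, J/K
        hA      = 20.0     # convective conductance h*A, W/K
        q_in    = 300.0    # heat load into the solid, W
        mdot_cp = 10.0     # fluid capacity rate, W/K
        T_inlet = 300.0    # fluid inlet temperature, K

        T_solid, dt = 300.0, 1.0
        for step in range(2000):
            # "Steady fluid" solve: outlet/film temperature for the current wall temperature.
            T_out = T_inlet + hA * (T_solid - T_inlet) / (mdot_cp + hA)
            T_film = 0.5 * (T_inlet + T_out)
            # "Unsteady solid" step: explicit update of the wall temperature.
            q_conv = hA * (T_solid - T_film)
            T_solid += dt * (q_in - q_conv) / m_cp

        print(f"wall temperature after {int(step * dt)} s: {T_solid:.1f} K")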

  11. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed-wing and rotary-wing aircraft, a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well-researched, open-source, high-fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  12. Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.; Dalton, Angela C.; Dale, Crystal

    2014-06-01

    Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.
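
    A toy Monte Carlo sketch of the kind of coupled, multi-domain comparison described above is shown below. The two notional technologies, the distributions, and every number in it are invented for illustration; the point is only the mechanics of propagating maturity and performance uncertainty into a profitability metric and comparing risk measures.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100_000

        def sample_npv(maturity_success_p, eff_mean, eff_sd, capex_low, capex_high):
            """Toy coupled sample: maturity -> performance -> profitability.
            All distributions and numbers are illustrative, not from the report."""
            succeeds = rng.random(N) < maturity_success_p                 # maturity/risk domain
            efficiency = rng.normal(eff_mean, eff_sd, N).clip(0.0, 1.0)   # performance domain
            capex = rng.uniform(capex_low, capex_high, N)                 # cost domain
            revenue = 400.0 * efficiency                                  # notional capture credit
            return np.where(succeeds, revenue - capex, -capex)

        npv_sorbent = sample_npv(0.7, 0.90, 0.05, 150.0, 250.0)
        npv_solvent = sample_npv(0.9, 0.85, 0.03, 200.0, 300.0)

        for name, npv in [("solid sorbent", npv_sorbent), ("liquid solvent", npv_solvent)]:
            print(f"{name:14s} mean NPV {npv.mean():7.1f}   P(loss) {np.mean(npv < 0):.2f}")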

  13. Approaches for scalable modeling and emulation of cyber systems : LDRD final report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, Jackson R.; Minnich, Ronald G.; Armstrong, Robert C.

    2009-09-01

    The goal of this research was to combine theoretical and computational approaches to better understand the potential emergent behaviors of large-scale cyber systems, such as networks of ~10^6 computers. The scale and sophistication of modern computer software, hardware, and deployed networked systems have significantly exceeded the computational research community's ability to understand, model, and predict current and future behaviors. This predictive understanding, however, is critical to the development of new approaches for proactively designing new systems or enhancing existing systems with robustness to current and future cyber threats, including distributed malware such as botnets. We have developed preliminary theoretical and modeling capabilities that can ultimately answer questions such as: How would we reboot the Internet if it were taken down? Can we change network protocols to make them more secure without disrupting existing Internet connectivity and traffic flow? We have begun to address these issues by developing new capabilities for understanding and modeling Internet systems at scale. Specifically, we have addressed the need for scalable network simulation by carrying out emulations of a network with ~10^6 virtualized operating system instances on a high-performance computing cluster - a 'virtual Internet'. We have also explored mappings between previously studied emergent behaviors of complex systems and their potential cyber counterparts. Our results provide foundational capabilities for further research toward understanding the effects of complexity in cyber systems, to allow anticipating and thwarting hackers.

  14. Development of a Three-Dimensional, Unstructured Material Response Design Tool

    NASA Technical Reports Server (NTRS)

    Schulz, Joseph C.; Stern, Eric C.; Muppidi, Suman; Palmer, Grant E.; Schroeder, Olivia

    2017-01-01

    A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material as a result of the radiative and convective surface heating imposed on the material from the surrounding high-enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be quickly added. This extensibility is critical since thermal protection systems are becoming increasingly complex, e.g. woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries. In this paper, the mathematical and numerical formulation is presented, followed by a discussion of the software architecture and some preliminary verification and validation studies.
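
    As a schematic illustration of the surface-heating problem the model solves (and only that; pyrolysis chemistry, ablation, and the unstructured three-dimensional mesh are omitted), the sketch below advances a one-dimensional explicit finite-volume conduction solution with an assumed convective-plus-radiative surface energy balance. All material properties and heating rates are invented.

        import numpy as np

        k, rho, cp = 0.5, 280.0, 1500.0        # W/m-K, kg/m^3, J/kg-K (assumed)
        sigma, eps = 5.67e-8, 0.85
        q_conv, q_rad_in = 5.0e5, 1.0e5        # imposed convective / radiative heating, W/m^2
        L, n = 0.02, 50
        dx = L / n
        alpha = k / (rho * cp)
        dt = 0.25 * dx**2 / alpha              # explicit stability margin

        T = np.full(n, 300.0)
        for _ in range(int(5.0 / dt)):         # march ~5 s
            Tn = T.copy()
            Tn[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0*T[1:-1] + T[:-2])
            # Surface cell: heating minus reradiation minus conduction into the solid.
            q_net = q_conv + q_rad_in - eps * sigma * T[0]**4 - k * (T[0] - T[1]) / dx
            Tn[0] += dt * q_net / (rho * cp * dx)
            Tn[-1] += alpha * dt / dx**2 * (T[-2] - T[-1])   # adiabatic back face
            T = Tn

        print(f"surface temperature after 5 s: {T[0]:.0f} K")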

  15. POST2 End-To-End Descent and Landing Simulation for the Autonomous Landing and Hazard Avoidance Technology Project

    NASA Technical Reports Server (NTRS)

    Fisher, Jody L.; Striepe, Scott A.

    2007-01-01

    The Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining the design and performance capability of lunar descent and landing system models and lunar environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. This POST2-based ALHAT simulation provides descent and landing simulation capability by integrating lunar environment and lander system models (including terrain, sensor, guidance, navigation, and control models), along with the data necessary to design and operate a landing system for robotic, human, and cargo lunar-landing success. This paper presents the current and planned development and model validation of the POST2-based end-to-end trajectory simulation used for the testing, performance and evaluation of ALHAT project system and models.

  16. Model driver screening and evaluation program. Volume 3, Guidelines for motor vehicle administrators

    DOT National Transportation Integrated Search

    2003-05-01

    These Guidelines present an update of report number DOT HS 807 853 published in August 1992. They reflect current understanding of the relationship between functional capabilities and driving impairment gained through review of existing medical revie...

  17. Icing simulation: A survey of computer models and experimental facilities

    NASA Technical Reports Server (NTRS)

    Potapczuk, M. G.; Reinmann, J. J.

    1991-01-01

    A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include computer code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focused on upgrading simulation fidelity of both experimental and computational methods is discussed. The need for increased understanding of the physical processes governing ice accretion, ice shedding, and iced airfoil aerodynamics is examined.

  18. Icing simulation: A survey of computer models and experimental facilities

    NASA Technical Reports Server (NTRS)

    Potapczuk, M. G.; Reinmann, J. J.

    1991-01-01

    A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include computer code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focused on upgrading simulation fidelity of both experimental and computational methods is discussed. The need for increased understanding of the physical processes governing ice accretion, ice shedding, and iced airfoil aerodynamics is examined.

  19. Delta Advanced Reusable Transport (DART): An alternative manned spacecraft

    NASA Astrophysics Data System (ADS)

    Lewerenz, T.; Kosha, M.; Magazu, H.

    Although the current U.S. Space Transportation System (STS) has proven successful in many applications, the truth remains that the space shuttle is not as reliable or economical as was once hoped. In fact, the Augustine Commission on the future of the U.S. Space Program has recommended that the space shuttle only be used on missions directly requiring human capabilities on-orbit and that the shuttle program should eventually be phased out. This poses a great dilemma since the shuttle provides the only current or planned U.S. means for human access to space at the same time that NASA is building toward a permanent manned presence. As a possible solution to this dilemma, it is proposed that the U.S. begin development of an Alternative Manned Spacecraft (AMS). This spacecraft would not only provide follow-on capability for maintaining human space flight, but would also provide redundancy and enhanced capability in the near future. Design requirements for the AMS studied include: (1) capability of launching on one of the current or planned U.S. expendable launch vehicles (baseline McDonnell Douglas Delta II model 7920 expendable booster); (2) application to a wide variety of missions including autonomous operations, space station support, and access to orbits and inclinations beyond those of the space shuttle; (3) low enough costing to fly regularly in augmentation of space shuttle capabilities; (4) production surge capabilities to replace the shuttle if events require it; (5) intact abort capability in all flight regimes since the planned launch vehicles are not man-rated; (6) technology cut-off date of 1990; and (7) initial operational capability in 1995. In addition, the design of the AMS would take advantage of scientific advances made in the 20 years since the space shuttle was first conceived. These advances are in such technologies as composite materials, propulsion systems, avionics, and hypersonics.

  20. Delta Advanced Reusable Transport (DART): An alternative manned spacecraft

    NASA Technical Reports Server (NTRS)

    Lewerenz, T.; Kosha, M.; Magazu, H.

    1991-01-01

    Although the current U.S. Space Transportation System (STS) has proven successful in many applications, the truth remains that the space shuttle is not as reliable or economical as was once hoped. In fact, the Augustine Commission on the future of the U.S. Space Program has recommended that the space shuttle only be used on missions directly requiring human capabilities on-orbit and that the shuttle program should eventually be phased out. This poses a great dilemma since the shuttle provides the only current or planned U.S. means for human access to space at the same time that NASA is building toward a permanent manned presence. As a possible solution to this dilemma, it is proposed that the U.S. begin development of an Alternative Manned Spacecraft (AMS). This spacecraft would not only provide follow-on capability for maintaining human space flight, but would also provide redundancy and enhanced capability in the near future. Design requirements for the AMS studied include: (1) capability of launching on one of the current or planned U.S. expendable launch vehicles (baseline McDonnell Douglas Delta II model 7920 expendable booster); (2) application to a wide variety of missions including autonomous operations, space station support, and access to orbits and inclinations beyond those of the space shuttle; (3) low enough costing to fly regularly in augmentation of space shuttle capabilities; (4) production surge capabilities to replace the shuttle if events require it; (5) intact abort capability in all flight regimes since the planned launch vehicles are not man-rated; (6) technology cut-off date of 1990; and (7) initial operational capability in 1995. In addition, the design of the AMS would take advantage of scientific advances made in the 20 years since the space shuttle was first conceived. These advances are in such technologies as composite materials, propulsion systems, avionics, and hypersonics.

  1. Current Testing Capabilities at the NASA Ames Ballistic Ranges

    NASA Technical Reports Server (NTRS)

    Ramsey, Alvin; Tam, Tim; Bogdanoff, David; Gage, Peter

    1999-01-01

    Capabilities for designing and performing ballistic range tests at the NASA Ames Research Center are presented. Computational tools to assist in designing and developing ballistic range models and to predict the flight characteristics of these models are described. A CFD code modeling two-stage gun performance is available, allowing muzzle velocity, maximum projectile base pressure, and gun erosion to be predicted. Aerodynamic characteristics such as drag and stability can be obtained at speeds ranging from 0.2 km/s to 8 km/s. The composition and density of the test gas can be controlled, which allows for an assessment of Reynolds number and specific heat ratio effects under conditions that closely match those encountered during planetary entry. Pressure transducers have been installed in the gun breech to record the time history of the pressure during launch, and pressure transducers have also been installed in the walls of the range to measure sonic boom effects. To illustrate the testing capabilities of the Ames ballistic ranges, an overview of some of the recent tests is given.

  2. NASA's supercomputing experience

    NASA Technical Reports Server (NTRS)

    Bailey, F. Ron

    1990-01-01

    A brief overview of NASA's recent experience in supercomputing is presented from two perspectives: early systems development and advanced supercomputing applications. NASA's role in supercomputing systems development is illustrated by discussion of activities carried out by the Numerical Aerodynamic Simulation Program. Current capabilities in advanced technology applications are illustrated with examples in turbulence physics, aerodynamics, aerothermodynamics, chemistry, and structural mechanics. Capabilities in science applications are illustrated by examples in astrophysics and atmospheric modeling. Future directions and NASA's new High Performance Computing Program are briefly discussed.

  3. Parameterization of the 3-PG model for Pinus elliottii stands using alternative methods to estimate fertility rating, biomass partitioning and canopy closure

    Treesearch

    Carlos A. Gonzalez-Benecke; Eric J. Jokela; Wendell P. Cropper; Rosvel Bracho; Daniel J. Leduc

    2014-01-01

    The forest simulation model, 3-PG, has been widely applied as a useful tool for predicting growth of forest species in many countries. The model has the capability to estimate the effects of management, climate and site characteristics on many stand attributes using easily available data. Currently, there is an increasing interest in estimating biomass and assessing...

  4. Integrated modeling of advanced optical systems

    NASA Astrophysics Data System (ADS)

    Briggs, Hugh C.; Needels, Laura; Levine, B. Martin

    1993-02-01

    This poster session paper describes an integrated modeling and analysis capability being developed at JPL under funding provided by the JPL Director's Discretionary Fund and the JPL Control/Structure Interaction Program (CSI). The posters briefly summarize the program capabilities and illustrate them with an example problem. The computer programs developed under this effort will provide an unprecedented capability for integrated modeling and design of high performance optical spacecraft. The engineering disciplines supported include structural dynamics, controls, optics and thermodynamics. Such tools are needed in order to evaluate the end-to-end system performance of spacecraft such as OSI, POINTS, and SMMM. This paper illustrates the proof-of-concept tools that have been developed to establish the technology requirements and demonstrate the new features of integrated modeling and design. The current program also includes implementation of a prototype tool based upon the CAESY environment being developed under the NASA Guidance and Control Research and Technology Computational Controls Program. This prototype will be available late in FY-92. The development plan proposes a major software production effort to fabricate, deliver, support and maintain a national-class tool from FY-93 through FY-95.

  5. Simulator Study of Indoor Annoyance Caused by Shaped Sonic Boom Stimuli With and Without Rattle Augmentation

    NASA Technical Reports Server (NTRS)

    Rathsam, Jonathan; Loubeau, Alexandra; Klos, Jacob

    2013-01-01

    The National Aeronautics and Space Administration's High Speed Project is developing a predictive capability for annoyance caused by shaped sonic booms transmitted indoors. The predictive capability is intended for use by aircraft designers as well as by aircraft noise regulators who are considering lifting the current prohibition on overland civil supersonic flight. The goal of the current study is to use an indoor simulator to validate two models developed using headphone tests for annoyance caused by sonic booms with and without rattle augmentation. The predictors in the proposed models include Moore and Glasberg's Stationary Loudness Level, the time derivative of Moore and Glasberg's time-varying short-term Loudness Level, and the difference between two weighted sound exposure levels, CSEL-ASEL. The indoor simulator provides a more realistic listening environment than headphones due to low-frequency sound reproduction down to 6 Hz, which also causes perceptible tactile vibration. The results of this study show that a model consisting of {PL + (CSEL-ASEL)} is a reliable predictor of annoyance caused by shaped sonic booms alone, rattle sounds alone, and shaped sonic booms and rattle sounds together.

  6. Rapid architecture alternative modeling (RAAM): A framework for capability-based analysis of system of systems architectures

    NASA Astrophysics Data System (ADS)

    Iacobucci, Joseph V.

    The research objective for this manuscript is to develop a Rapid Architecture Alternative Modeling (RAAM) methodology to enable traceable Pre-Milestone A decision making during the conceptual phase of design of a system of systems. Rather than following current trends that place an emphasis on adding more analysis which tends to increase the complexity of the decision making problem, RAAM improves on current methods by reducing both runtime and model creation complexity. RAAM draws upon principles from computer science, system architecting, and domain specific languages to enable the automatic generation and evaluation of architecture alternatives. For example, both mission dependent and mission independent metrics are considered. Mission dependent metrics are determined by the performance of systems accomplishing a task, such as Probability of Success. In contrast, mission independent metrics, such as acquisition cost, are solely determined and influenced by the other systems in the portfolio. RAAM also leverages advances in parallel computing to significantly reduce runtime by defining executable models that are readily amenable to parallelization. This allows the use of cloud computing infrastructures such as Amazon's Elastic Compute Cloud and the PASTEC cluster operated by the Georgia Institute of Technology Research Institute (GTRI). Also, the amount of data that can be generated when fully exploring the design space can quickly exceed the typical capacity of computational resources at the analyst's disposal. To counter this, specific algorithms and techniques are employed. Streaming algorithms and recursive architecture alternative evaluation algorithms are used that reduce computer memory requirements. Lastly, a domain specific language is created to provide a reduction in the computational time of executing the system of systems models. A domain specific language is a small, usually declarative language that offers expressive power focused on a particular problem domain by establishing an effective means to communicate the semantics from the RAAM framework. These techniques make it possible to include diverse multi-metric models within the RAAM framework in addition to system and operational level trades. A canonical example was used to explore the uses of the methodology. The canonical example contains all of the features of a full system of systems architecture analysis study but uses fewer tasks and systems. Using RAAM with the canonical example it was possible to consider both system and operational level trades in the same analysis. Once the methodology had been tested with the canonical example, a Suppression of Enemy Air Defenses (SEAD) capability model was developed. Due to the sensitive nature of analyses on that subject, notional data was developed. The notional data has similar trends and properties to realistic Suppression of Enemy Air Defenses data. RAAM was shown to be traceable and provided a mechanism for a unified treatment of a variety of metrics. The SEAD capability model demonstrated lower computer runtimes and reduced model creation complexity as compared to methods currently in use. To determine the usefulness of the implementation of the methodology on current computing hardware, RAAM was tested with system of system architecture studies of different sizes. This was necessary since system of systems may be called upon to accomplish thousands of tasks. 
It has been clearly demonstrated that RAAM is able to enumerate and evaluate the types of large, complex design spaces usually encountered in capability-based design, oftentimes providing the ability to efficiently search the entire decision space. The core algorithms for generation and evaluation of alternatives scale linearly with expected problem sizes. The SEAD capability model outputs prompted the discovery of a new issue, the data storage and manipulation requirements for an analysis. Two strategies were developed to counter large data sizes: the use of portfolio views and top 'n' analysis. This proved the usefulness of the RAAM framework and methodology during Pre-Milestone A capability-based analysis. (Abstract shortened by UMI.)
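
    The enumeration-and-evaluation core of such a capability-based analysis can be sketched in a few lines. The toy example below streams architecture alternatives (one candidate system per task) from a generator, scores each with a mission-dependent metric (probability of task success) and a mission-independent one (portfolio acquisition cost), and retains only a running top-n list, in the spirit of the streaming and top-'n' strategies mentioned above. Tasks, systems, and all numbers are invented; this is not the RAAM implementation.

        import heapq
        import itertools

        # (system, probability of task success, acquisition cost) per candidate, per task
        candidates = {
            "find":   [("uav_a", 0.80, 4.0), ("sat_b", 0.70, 2.5)],
            "fix":    [("uav_a", 0.75, 4.0), ("radar_c", 0.85, 3.0)],
            "engage": [("jet_d", 0.90, 6.0), ("ucav_e", 0.80, 4.5)],
        }

        def alternatives():
            """Generator: stream alternatives instead of materializing the whole space."""
            for combo in itertools.product(*candidates.values()):
                p, cost, systems = 1.0, 0.0, set()
                for name, p_task, c in combo:
                    p *= p_task                 # mission-dependent metric
                    if name not in systems:     # each system's cost counted once per portfolio
                        cost += c
                        systems.add(name)
                yield (p / cost, p, cost, tuple(name for name, _, _ in combo))

        for value, p, cost, arch in heapq.nlargest(3, alternatives()):
            print(f"P_success={p:.2f}  cost={cost:.1f}  value={value:.3f}  {arch}")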

  7. EOID Model Validation and Performance Prediction

    DTIC Science & Technology

    2002-09-30

    Our long-term goal is to accurately predict the capability of the current generation of laser-based underwater imaging sensors to perform Electro-Optic Identification (EOID) against relevant targets in a variety of realistic environmental conditions. The two most prominent technologies in this area

  8. Current capabilities and future directions in computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    1986-01-01

    A summary of significant findings is given, followed by specific recommendations for future directions of emphasis for computational fluid dynamics development. The discussion is organized into three application areas (external aerodynamics, hypersonics, and propulsion), followed by a turbulence modeling synopsis.

  9. Common world model for unmanned systems

    NASA Astrophysics Data System (ADS)

    Dean, Robert Michael S.

    2013-05-01

    The Robotic Collaborative Technology Alliance (RCTA) seeks to provide adaptive robot capabilities which move beyond traditional metric algorithms to include cognitive capabilities. Key to this effort is the Common World Model, which moves beyond the state-of-the-art by representing the world using metric, semantic, and symbolic information. It joins these layers of information to define objects in the world. These objects may be reasoned upon jointly using traditional geometric, symbolic cognitive algorithms and new computational nodes formed by the combination of these disciplines. The Common World Model must understand how these objects relate to each other. Our world model includes the concept of Self-Information about the robot. By encoding current capability, component status, task execution state, and histories we track information which enables the robot to reason and adapt its performance using Meta-Cognition and Machine Learning principles. The world model includes models of how aspects of the environment behave, which enable prediction of future world states. To manage complexity, we adopted a phased implementation approach to the world model. We discuss the design of "Phase 1" of this world model, and interfaces by tracing perception data through the system from the source to the meta-cognitive layers provided by ACT-R and SS-RICS. We close with lessons learned from implementation and how the design relates to Open Architecture.
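
    A minimal sketch of what 'joining' metric, semantic, and symbolic layers around an object, together with robot self-information, might look like as a data structure is given below. The class names and fields are assumptions made for illustration and are not the RCTA Common World Model itself.

        from dataclasses import dataclass, field

        @dataclass
        class WorldObject:
            uid: str
            pose_xyz: tuple            # metric layer: position estimate (m)
            covariance: tuple          # metric layer: uncertainty (diagonal, m^2)
            semantic_label: str        # semantic layer: e.g. "door", "vehicle"
            symbolic_relations: dict = field(default_factory=dict)  # symbolic layer

        @dataclass
        class SelfInformation:
            battery_pct: float
            active_task: str
            component_status: dict     # e.g. {"lidar": "ok", "arm": "degraded"}

        world = {
            "obj_17": WorldObject("obj_17", (12.4, 3.1, 0.0), (0.2, 0.2, 0.1),
                                  "door", {"part_of": "building_3", "state": "closed"}),
        }
        robot_self = SelfInformation(battery_pct=64.0, active_task="approach obj_17",
                                     component_status={"lidar": "ok", "arm": "degraded"})

        # A cognitive layer can reason jointly over metric, semantic, and self information,
        # e.g. only plan manipulation when the relevant component is healthy.
        target = world["obj_17"]
        if target.semantic_label == "door" and robot_self.component_status["arm"] == "ok":
            print("plan: open", target.uid)
        else:
            print("replan: arm degraded, request assistance for", target.uid)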

  10. Physics-based distributed snow models in the operational arena: Current and future challenges

    NASA Astrophysics Data System (ADS)

    Winstral, A. H.; Jonas, T.; Schirmer, M.; Helbig, N.

    2017-12-01

    The demand for modeling tools robust to climate change and weather extremes along with coincident increases in computational capabilities have led to an increase in the use of physics-based snow models in operational applications. Current operational applications include the WSL-SLF's across Switzerland, ASO's in California, and USDA-ARS's in Idaho. While the physics-based approaches offer many advantages there remain limitations and modeling challenges. The most evident limitation remains computation times that often limit forecasters to a single, deterministic model run. Other limitations however remain less conspicuous amidst the assumptions that these models require little to no calibration based on their foundation on physical principles. Yet all energy balance snow models seemingly contain parameterizations or simplifications of processes where validation data are scarce or present understanding is limited. At the research-basin scale where many of these models were developed these modeling elements may prove adequate. However when applied over large areas, spatially invariable parameterizations of snow albedo, roughness lengths and atmospheric exchange coefficients - all vital to determining the snowcover energy balance - become problematic. Moreover as we apply models over larger grid cells, the representation of sub-grid variability such as the snow-covered fraction adds to the challenges. Here, we will demonstrate some of the major sensitivities of distributed energy balance snow models to particular model constructs, the need for advanced and spatially flexible methods and parameterizations, and prompt the community for open dialogue and future collaborations to further modeling capabilities.
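
    One example of the parameterization sensitivity mentioned above is the bulk-aerodynamic turbulent exchange term. The sketch below evaluates a neutral-stability sensible heat flux, H = rho * cp * C_H * U * (T_air - T_snow), for a few roughness lengths; the numbers are illustrative, but they show how strongly a coefficient that is often held spatially constant can change the energy balance.

        import numpy as np

        rho, cp, kappa = 1.2, 1005.0, 0.4                  # air density, heat capacity, von Karman
        z_ref, U, T_air, T_snow = 2.0, 5.0, 2.0, 0.0       # m, m/s, deg C, deg C (assumed)

        for z0 in (1e-4, 1e-3, 1e-2):                      # smooth snow ... patchy/rough surface (m)
            C_H = (kappa / np.log(z_ref / z0)) ** 2        # neutral-stability bulk coefficient
            H = rho * cp * C_H * U * (T_air - T_snow)      # W/m^2, positive toward the snow
            print(f"z0={z0:6.4f} m  C_H={C_H:.4f}  H={H:5.1f} W/m^2")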

  11. Real-Time Aircraft Cosmic Ray Radiation Exposure Predictions from the NAIRAS Model

    NASA Astrophysics Data System (ADS)

    Mertens, C. J.; Tobiska, W.; Kress, B. T.; Xu, X.

    2012-12-01

    The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a prototype operational model for predicting commercial aircraft radiation exposure from galactic and solar cosmic rays. NAIRAS predictions are currently streaming live from the project's public website, and the exposure rate nowcast is also available on the SpaceWx smartphone app for iPhone, IPad, and Android. Cosmic rays are the primary source of human exposure to high linear energy transfer radiation at aircraft altitudes, which increases the risk of cancer and other adverse health effects. Thus, the NAIRAS model addresses an important national need with broad societal, public health and economic benefits. There is also interest in extending NAIRAS to the LEO environment to address radiation hazard issues for the emerging commercial spaceflight industry. The processes responsible for the variability in the solar wind, interplanetary magnetic field, solar energetic particle spectrum, and the dynamical response of the magnetosphere to these space environment inputs, strongly influence the composition and energy distribution of the atmospheric ionizing radiation field. Real-time observations are required at a variety of locations within the geospace environment. The NAIRAS model is driven by real-time input data from ground-, atmospheric-, and space-based platforms. During the development of the NAIRAS model, new science questions and observational data gaps were identified that must be addressed in order to obtain a more reliable and robust operational model of atmospheric radiation exposure. The focus of this talk is to present the current capabilities of the NAIRAS model, discuss future developments in aviation radiation modeling and instrumentation, and propose strategies and methodologies of bridging known gaps in current modeling and observational capabilities.

  12. Semantic Likelihood Models for Bayesian Inference in Human-Robot Interaction

    NASA Astrophysics Data System (ADS)

    Sweet, Nicholas

    Autonomous systems, particularly unmanned aerial systems (UAS), remain limited in autonomous capabilities largely due to a poor understanding of their environment. Current sensors simply do not match human perceptive capabilities, impeding progress towards full autonomy. Recent work has shown the value of humans as sources of information within a human-robot team; in target applications, communicating human-generated 'soft data' to autonomous systems enables higher levels of autonomy through large, efficient information gains. This requires development of a 'human sensor model' that allows soft data fusion through Bayesian inference to update the probabilistic belief representations maintained by autonomous systems. Current human sensor models that capture linguistic inputs as semantic information are limited in their ability to generalize likelihood functions for semantic statements: they may be learned from dense data; they do not exploit the contextual information embedded within groundings; and they often limit human input to restrictive and simplistic interfaces. This work provides mechanisms to synthesize human sensor models from constraints based on easily attainable a priori knowledge, develops compression techniques to capture information-dense semantics, and investigates the problem of capturing and fusing semantic information contained within unstructured natural language. A robotic experimental testbed is also developed to validate the above contributions.
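
    A minimal sketch of the kind of fusion being described, under assumed models, is shown below: a uniform grid belief over a target's position is updated with a soft semantic statement ('the target is near the tree') via Bayes' rule, using an invented distance-based likelihood. It illustrates the mechanism only, not the likelihood synthesis or compression techniques developed in this work.

        import numpy as np

        # Grid belief over target position and a soft semantic update (illustrative only).
        xs, ys = np.meshgrid(np.linspace(0.0, 10.0, 50), np.linspace(0.0, 10.0, 50))
        belief = np.ones_like(xs)
        belief /= belief.sum()                 # uniform prior

        tree = (7.0, 4.0)                      # grounding for "the tree" (assumed known)
        dist = np.hypot(xs - tree[0], ys - tree[1])

        # Assumed likelihood for the statement "the target is near the tree".
        likelihood = np.exp(-dist**2 / (2.0 * 2.0**2))

        posterior = belief * likelihood        # Bayes' rule (unnormalized) ...
        posterior /= posterior.sum()           # ... then normalize

        iy, ix = np.unravel_index(np.argmax(posterior), posterior.shape)
        print(f"MAP estimate after fusion: x = {xs[iy, ix]:.1f}, y = {ys[iy, ix]:.1f}")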

  13. Development of Integrated Magnetic and Kinetic Control-oriented Transport Model for q-profile Response Prediction in EAST Discharges

    NASA Astrophysics Data System (ADS)

    Wang, Hexiang; Schuster, Eugenio; Rafiq, Tariq; Kritz, Arnold; Ding, Siye

    2016-10-01

    Extensive research has been conducted to find high-performance operating scenarios characterized by high fusion gain, good confinement, plasma stability and possible steady-state operation. A key plasma property that is related to both the stability and performance of these advanced plasma scenarios is the safety factor profile. A key component of the EAST research program is the exploration of non-inductively driven steady-state plasmas with the recently upgraded heating and current drive capabilities that include lower hybrid current drive and neutral beam injection. Anticipating the need for tight regulation of the safety factor profile in these plasma scenarios, a first-principles-driven (FPD) control-oriented model is proposed to describe the safety factor profile evolution in EAST in response to the different actuators. The TRANSP simulation code is employed to tailor the FPD model to the EAST tokamak geometry and to convert it into a form suitable for control design. The FPD control-oriented model's prediction capabilities are demonstrated by comparing predictions with experimental data from EAST. Supported by the US DOE under DE-SC0010537, DE-FG02-92ER54141 and DE-SC0013977.

  14. The NASA Space Launch System Program Systems Engineering Approach for Affordability

    NASA Technical Reports Server (NTRS)

    Hutt, John J.; Whitehead, Josh; Hanson, John

    2017-01-01

    The National Aeronautics and Space Administration is currently developing the Space Launch System (SLS) to provide the United States with a capability to launch large payloads into low Earth orbit and deep space. One of the development tenets of the SLS Program is affordability. One initiative to enhance affordability is the SLS approach to requirements definition, verification, and system certification. The key aspects of this initiative include: 1) Minimizing the number of requirements, 2) Elimination of explicit verification requirements, 3) Use of certified models of subsystem capability in lieu of requirements when appropriate and 4) Certification of capability beyond minimum required capability. Implementation of each aspect is described and compared to a "typical" systems engineering implementation, including a discussion of relative risk. Examples of each implementation within the SLS Program are provided.

  15. SpaceNet: Modeling and Simulating Space Logistics

    NASA Technical Reports Server (NTRS)

    Lee, Gene; Jordan, Elizabeth; Shishko, Robert; de Weck, Olivier; Armar, Nii; Siddiqi, Afreen

    2008-01-01

    This paper summarizes the current state of the art in interplanetary supply chain modeling and discusses SpaceNet as one particular method and tool to address space logistics modeling and simulation challenges. Fundamental upgrades to the interplanetary supply chain framework such as process groups, nested elements, and cargo sharing, enabled SpaceNet to model an integrated set of missions as a campaign. The capabilities and uses of SpaceNet are demonstrated by a step-by-step modeling and simulation of a lunar campaign.

  16. Early experiences building a software quality prediction model

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.
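
    The mechanics of such a multivariate fit can be illustrated with synthetic data, as in the sketch below. The predictor names loosely follow the kinds of design measures mentioned (interconnectivity, visibility, reuse, change activity), but the data and coefficients are randomly generated for illustration and carry no empirical meaning.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 21                                        # subsystems, as in the study

        interconnectivity = rng.uniform(1.0, 10.0, n) # e.g. context couplings per compilation unit
        visibility = rng.uniform(0.0, 1.0, n)         # fraction of widely visible declarations
        reuse_fraction = rng.uniform(0.0, 0.6, n)
        changes = rng.poisson(15, n)

        error_density = (0.4 * interconnectivity + 2.0 * visibility
                         - 1.5 * reuse_fraction + 0.05 * changes
                         + rng.normal(0.0, 0.8, n))   # synthetic response with noise

        X = np.column_stack([np.ones(n), interconnectivity, visibility, reuse_fraction, changes])
        beta, *_ = np.linalg.lstsq(X, error_density, rcond=None)

        pred = X @ beta
        ss_res = np.sum((error_density - pred) ** 2)
        ss_tot = np.sum((error_density - error_density.mean()) ** 2)
        print("coefficients:", np.round(beta, 3))
        print(f"R^2 = {1.0 - ss_res / ss_tot:.2f}")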

  17. Application of information technology to the National Launch System

    NASA Technical Reports Server (NTRS)

    Mauldin, W. T.; Smith, Carolyn L.; Monk, Jan C.; Davis, Steve; Smith, Marty E.

    1992-01-01

    The approach to the development of the Unified Information System (UNIS), intended to provide in a timely manner all the information required to manage, design, manufacture, integrate, test, launch, operate, and support the National Launch System (NLS), is described, along with the current and planned capabilities. STESYM, a model of the Space Transportation Main Engine (STME) development program, comprises a collection of data models which can be grouped into two primary models: the Engine Infrastructure Model (ENGIM) and the Engine Integrated Cost Model (ENGICOM). ENGIM is an end-to-end model of the infrastructure needed to perform the fabrication, assembly, and testing of the STME and its components. Together, UNIS and STESYM are to provide NLS managers and engineers with the ability to access various types and files of data quickly and to use that data to assess the capabilities of the STME program.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutqvist, Jonny; Blanco Martin, Laura; Mukhopadhyay, Sumit

    In this report, we present FY2014 progress by Lawrence Berkeley National Laboratory (LBNL) related to modeling of coupled thermal-hydrological-mechanical-chemical (THMC) processes in salt and their effect on brine migration at high temperatures. LBNL’s work on the modeling of coupled THMC processes in salt was initiated in FY2012, focusing on exploring and demonstrating the capabilities of an existing LBNL modeling tool (TOUGH-FLAC) for simulating temperature-driven coupled flow and geomechanical processes in salt. This work includes development related to, and implementation of, essential capabilities, as well as testing the model against relevant information and published experimental data related to the fate and transport of water. We provide more details on the FY2014 work, first presenting updated tools and improvements made to the TOUGH-FLAC simulator, and the use of this updated tool in a new model simulation of long-term THM behavior within a generic repository in a salt formation. This is followed by the description of current benchmarking and validation efforts, including the TSDE experiment. We then present the current status in the development of constitutive relationships and the dual-continuum model for brine migration. We conclude with an outlook for FY2015, which will focus largely on model validation against field experiments and on the use of the model for design studies related to a proposed heater experiment.

  19. Targeting cancer’s weaknesses (not its strengths): Therapeutic strategies suggested by the atavistic model

    PubMed Central

    Lineweaver, Charles H.; Davies, Paul C.W.; Vincent, Mark D.

    2014-01-01

    In the atavistic model of cancer progression, tumor cell dedifferentiation is interpreted as a reversion to phylogenetically earlier capabilities. The more recently evolved capabilities are compromised first during cancer progression. This suggests a therapeutic strategy for targeting cancer: design challenges to cancer that can only be met by the recently evolved capabilities no longer functional in cancer cells. We describe several examples of this target-the-weakness strategy. Our most detailed example involves the immune system. The absence of adaptive immunity in immunosuppressed tumor environments is an irreversible weakness of cancer that can be exploited by creating a challenge that only the presence of adaptive immunity can meet. This leaves tumor cells more vulnerable than healthy tissue to pathogenic attack. Such a target-the-weakness therapeutic strategy has broad applications, and contrasts with current therapies that target the main strength of cancer: cell proliferation. PMID:25043755

  20. Assessing the Validity of the Simplified Potential Energy Clock Model for Modeling Glass-Ceramics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jamison, Ryan Dale; Grillet, Anne M.; Stavig, Mark E.

    Glass-ceramic seals may be the future of hermetic connectors at Sandia National Laboratories. They have been shown capable of surviving higher temperatures and pressures than amorphous glass seals. More advanced finite-element material models are required to enable model-based design and provide evidence that the hermetic connectors can meet design requirements. Glass-ceramics are composite materials with both crystalline and amorphous phases. The latter gives rise to (non-linearly) viscoelastic behavior. Given their complex microstructures, glass-ceramics may be thermorheologically complex, a behavior outside the scope of currently implemented constitutive models at Sandia. However, it was desired to assess if the Simplified Potential Energy Clock (SPEC) model is capable of capturing the material response. Available data for SL 16.8 glass-ceramic was used to calibrate the SPEC model. Model accuracy was assessed by comparing model predictions with shear moduli temperature dependence and high temperature 3-point bend creep data. It is shown that the model can predict the temperature dependence of the shear moduli and 3-point bend creep data. Analysis of the results is presented. Suggestions for future experiments and model development are presented. Though further calibration is likely necessary, SPEC has been shown capable of modeling glass-ceramic behavior in the glass transition region but requires further analysis below the transition region.

  1. Validation of a coupled core-transport, pedestal-structure, current-profile and equilibrium model

    NASA Astrophysics Data System (ADS)

    Meneghini, O.

    2015-11-01

    The first workflow capable of predicting the self-consistent solution to the coupled core-transport, pedestal structure, and equilibrium problems from first-principles and its experimental tests are presented. Validation with DIII-D discharges in high confinement regimes shows that the workflow is capable of robustly predicting the kinetic profiles from on axis to the separatrix and matching the experimental measurements to within their uncertainty, with no prior knowledge of the pedestal height nor of any measurement of the temperature or pressure. Self-consistent coupling has proven to be essential to match the experimental results, and capture the non-linear physics that governs the core and pedestal solutions. In particular, clear stabilization of the pedestal peeling ballooning instabilities by the global Shafranov shift and destabilization by additional edge bootstrap current, and subsequent effect on the core plasma profiles, have been clearly observed and documented. In our model, self-consistency is achieved by iterating between the TGYRO core transport solver (with NEO and TGLF for neoclassical and turbulent flux), and the pedestal structure predicted by the EPED model. A self-consistent equilibrium is calculated by EFIT, while the ONETWO transport package evolves the current profile and calculates the particle and energy sources. The capabilities of such workflow are shown to be critical for the design of future experiments such as ITER and FNSF, which operate in a regime where the equilibrium, the pedestal, and the core transport problems are strongly coupled, and for which none of these quantities can be assumed to be known. Self-consistent core-pedestal predictions for ITER, as well as initial optimizations, will be presented. Supported by the US Department of Energy under DE-FC02-04ER54698, DE-SC0012652.
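
    The self-consistent coupling can be caricatured as a fixed-point iteration between a pedestal module and a core-transport module, as in the toy sketch below. The algebraic relations are invented stand-ins (not EPED, TGYRO, or EFIT physics); the sketch only illustrates iterating the coupled modules until neither answer changes.

        # Schematic fixed-point iteration between a 'pedestal' module and a 'core transport'
        # module. The relations are invented stand-ins used only to show the coupling loop.
        def pedestal_height(core_pressure):
            # stand-in: pedestal grows weakly with core pressure (shift-like stabilization)
            return 20.0 + 0.08 * core_pressure        # kPa

        def core_pressure(ped_height):
            # stand-in: stiff core transport -> core pressure scales with the pedestal boundary
            return 4.5 * ped_height                   # kPa

        p_ped, p_core = 25.0, 100.0                   # initial guesses
        for it in range(50):
            p_ped_new = pedestal_height(p_core)
            p_core_new = core_pressure(p_ped_new)
            if abs(p_ped_new - p_ped) < 1e-6 and abs(p_core_new - p_core) < 1e-6:
                break
            p_ped, p_core = p_ped_new, p_core_new

        print(f"converged in {it} iterations: pedestal {p_ped:.2f} kPa, core {p_core:.1f} kPa")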

  2. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    NASA Astrophysics Data System (ADS)

    Pache, Charly

    2002-01-01

    One critical issue concerning distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between each group responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed for enabling satellite distributed design. SDM is actually used in the ongoing Student Space Exploration & Technology (SSETI) initiative (www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe for design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then the SDM's functionalities, developed in VBA scripts (Visual Basic for Applications), are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of the current version of SDM. Taking into account these capabilities and limitations, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Trade-off simulation capabilities, security, reliability, hardware and software issues will also be thoroughly discussed.

  3. OLTARIS - Overview and Recent Updates

    NASA Technical Reports Server (NTRS)

    Sandridge, C. A.

    2015-01-01

    The On-Line Tool for the Assessment of Radiation in Space (OLTARIS) is a web-based set of tools and models that allow engineers and scientists to assess the effects of space radiation in spacecraft, habitats, rovers, and spacesuits. The site is intended to be a design tool for those studying the effects of space radiation for current and future missions as well as a research tool for those developing advanced material and shielding concepts. The tools and models are built around the HZETRN radiation transport code and are currently focused on human-related responses. OLTARIS was deployed in 2008. Since that time, many improvements and additional capabilities have been added to the site. The purpose of this poster/presentation is to give an overview of the current capabilities of OLTARIS and focus on the updates to the site since the last workshop presentation in 2014. OLTARIS currently has 240 active accounts - 87 accounts are government (including NASA, ORNL, JPL, AFRL, and FAA), 76 are university professors/researchers/students, and 51 are industry (including Boeing, SpaceX, Lockheed Martin, ATK, Northrop Grumman, and Bigelow Aerospace). There have been 14,000 jobs run through OLTARIS since counting began in November 2009. ITAR restrictions were recently reversed, so the site is now available to registered users worldwide.

  4. 10 Steps to Building an Architecture for Space Surveillance Projects

    NASA Astrophysics Data System (ADS)

    Gyorko, E.; Barnhart, E.; Gans, H.

    Space surveillance is an increasingly complex task, requiring the coordination of a multitude of organizations and systems, while dealing with competing capabilities, proprietary processes, differing standards, and compliance issues. In order to fully understand space surveillance operations, analysts and engineers need to analyze and break down their operations and systems using what are essentially enterprise architecture processes and techniques. These techniques can be daunting to the first-time architect. This paper provides a summary of simplified steps to analyze a space surveillance system at the enterprise level in order to determine capabilities, services, and systems. These steps form the core of an initial Model-Based Architecting process. For new systems, a well-defined, or well-architected, space surveillance enterprise leads to an easier transition from model-based architecture to model-based design and provides a greater likelihood that requirements are fulfilled the first time. Both new and existing systems benefit from being easier to manage, and can be sustained more easily using portfolio management techniques, based around capabilities documented in the model repository. The resulting enterprise model helps an architect avoid 1) costly, faulty portfolio decisions; 2) wasteful technology refresh efforts; 3) upgrade and transition nightmares; and 4) non-compliance with DoDAF directives. The Model-Based Architecting steps are based on a process that Harris Corporation has developed from practical experience architecting space surveillance systems and ground systems. Examples are drawn from current work on documenting space situational awareness enterprises. The process is centered on DoDAF 2 and its corresponding meta-model so that terminology is standardized and communicable across any disciplines that know DoDAF architecting, including acquisition, engineering and sustainment disciplines. Each step provides a guideline for the type of data to collect, and also the appropriate views to generate. The steps include 1) determining the context of the enterprise, including active elements and high level capabilities or goals; 2) determining the desired effects of the capabilities and mapping capabilities against the project plan; 3) determining operational performers and their inter-relationships; 4) building information and data dictionaries; 5) defining resources associated with capabilities; 6) determining the operational behavior necessary to achieve each capability; 7) analyzing existing or planned implementations to determine systems, services and software; 8) cross-referencing system behavior to operational behavior; 9) documenting system threads and functional implementations; and 10) creating any required textual documentation from the model.

  5. Digital Architecture – Results From a Gap Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oxstrand, Johanna Helene; Thomas, Kenneth David; Fitzgerald, Kirk

    The digital architecture is defined as a collection of IT capabilities needed to support and integrate a wide spectrum of real-time digital capabilities for nuclear power plant performance improvements. The digital architecture can be thought of as an integration of the separate I&C and information systems already in place in NPPs, brought together for the purpose of creating new levels of automation in NPP work activities. In some cases, it might be an extension of the current communication systems, to provide digital communications where they are currently analog only. This collection of IT capabilities must in turn be based on a set of user requirements that must be supported for the interconnected technologies to operate in an integrated manner. These requirements, simply put, are a statement of what sorts of digital work functions will be exercised in a fully-implemented seamless digital environment and how much they will be used. The goal of the digital architecture research is to develop a methodology for mapping nuclear power plant operational and support activities into the digital architecture, which includes the development of a consensus model for advanced information and control architecture. The consensus model should be developed at a level of detail that is useful to the industry. In other words, not so detailed that it specifies specific protocols, and not so vague that it only provides a high-level description of technology. The next step towards the model development is to determine the current state of digital architecture at typical NPPs. To investigate the current state, the researchers conducted a gap analysis to determine to what extent the NPPs can support the future digital technology environment with their existing I&C and IT structure, and where gaps exist with respect to the full deployment of technology over time. The methodology, result, and conclusions from the gap analysis are described in this report.

  6. An Astrosociological Perspective on Space-Capable vs. Spacefaring Societies

    NASA Astrophysics Data System (ADS)

    Pass, J.

    As with any academic field, astrosociology allows for an endless number of competing theoretical models and hypotheses. One possible theoretical model is presented here that starts with the premise that even the most advanced societies today are extremely far from achieving a spacefaring status. The most advanced nation states are, in fact, space-capable societies because they have the capacity to send cargo and humans into low Earth orbit and beyond. However, their social structures and cultures lack fundamental characteristics that would allow for their designation as spacefaring societies. This article describes the characteristics of a theoretical spacefaring society and argues that getting there from our current status as space-capable societies is a long and arduous process whose outcome is by no means certain. While a continuum is offered, it represents an imprecise path that can retrograde or fall apart at any time. Thus, this theoretical model provides one possible unfolding of events that creates characteristics of the social fabric which may move a society along the continuum toward spacefaring status. Movement along the continuum results in an accumulation of coordinated spacefaring characteristics for a given society. Simultaneously, strictly terrestrial characteristics disappear or transform themselves into hybrid forms that include spacefaring features. This theoretical exercise has a number of benefits for astrosociologists conducting research in the area of spacefaring theory. Moreover, it makes the case for the idea that the study of the theoretical transformation from a space-capable to a spacefaring society includes implications for current and future 1) space policy in the public sector and 2) corporate decision-making related to space in the private sector.

  7. Rapid prototyping, astronaut training, and experiment control and supervision: distributed virtual worlds for COLUMBUS, the European Space Laboratory module

    NASA Astrophysics Data System (ADS)

    Freund, Eckhard; Rossmann, Juergen

    2002-02-01

    In 2004, the European COLUMBUS Module is to be attached to the International Space Station. On the way to the successful planning, deployment and operation of the module, computer generated and animated models are being used to optimize performance. Under contract of the German Space Agency DLR, it has become IRF's task to provide a Projective Virtual Reality System: a virtual world, built after the planned layout of the COLUMBUS module, that lets astronauts and experimenters practice operational procedures and the handling of experiments. The key features of the system currently being realized comprise the possibility for distributed multi-user access to the virtual lab and the visualization of real-world experiment data. Through the capabilities to share the virtual world, cooperative operations can be practiced easily, but also trainers and trainees can work together more effectively sharing the virtual environment. The capability to visualize real-world data will be used to introduce measured data of experiments into the virtual world online in order to realistically interact with the science-reference model hardware: The user's actions in the virtual world are translated into corresponding changes of the inputs of the science reference model hardware; the measured data is then fed back into the virtual world. During the operation of COLUMBUS, the capabilities for distributed access and the capabilities to visualize measured data through the use of metaphors and augmentations of the virtual world may be used to provide virtual access to the COLUMBUS module, e.g. via Internet. Currently, finishing touches are being put to the system. In November 2001 the virtual world shall be operational, so that besides the design and the key ideas, first experimental results can be presented.

  8. High fidelity studies of exploding foil initiator bridges, Part 1: Experimental method

    NASA Astrophysics Data System (ADS)

    Bowden, Mike; Neal, William

    2017-01-01

    Simulations of high voltage detonators, such as Exploding Bridgewire (EBW) and Exploding Foil Initiators (EFI), have historically been simple, often empirical, one-dimensional models capable of predicting parameters such as current, voltage and, in the case of EFIs, flyer velocity. Correspondingly, experimental methods have in general been limited to the same parameters. With the advent of complex, first principles magnetohydrodynamic codes such as ALEGRA and ALE-MHD, it is now possible to simulate these components in three dimensions, predicting a much greater range of parameters than before. A significant improvement in experimental capability was therefore required to ensure these simulations could be adequately validated. In this first paper of a three-part study, the experimental method for determining the current, voltage, flyer velocity and multi-dimensional profile of detonator components is presented. This improved capability, along with high fidelity simulations, offers an opportunity to gain a greater understanding of the processes behind the functioning of EBW and EFI detonators.

  9. Acoustic Prediction State of the Art Assessment

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.

    2007-01-01

    The acoustic assessment task for both the Subsonic Fixed Wing and the Supersonic projects under NASA's Fundamental Aeronautics Program was designed to assess the current state-of-the-art in noise prediction capability and to establish baselines for gauging future progress. The documentation of our current capabilities included quantifying the differences between predictions of noise from computer codes and measurements of noise from experimental tests. Quantifying the accuracy of both the computed and experimental results further enhanced the credibility of the assessment. This presentation gives sample results from codes representative of NASA's capabilities in aircraft noise prediction both for systems and components. These include semi-empirical, statistical, analytical, and numerical codes. System level results are shown for both aircraft and engines. Component level results are shown for a landing gear prototype, for fan broadband noise, for jet noise from a subsonic round nozzle, and for propulsion airframe aeroacoustic interactions. Additional results are shown for modeling of the acoustic behavior of duct acoustic lining and the attenuation of sound in lined ducts with flow.

  10. Making the Case for Reusable Booster Systems: The Operations Perspective

    NASA Technical Reports Server (NTRS)

    Zapata, Edgar

    2012-01-01

    Presentation to the Aeronautics and Space Engineering Board of the National Research Council's Reusable Booster System: Review and Assessment Committee. It addresses: the criteria and assumptions used in the formulation of current RBS plans; the methodologies used in the current cost estimates for RBS; the modeling methodology used to frame the business case for an RBS capability, including the data used in the analysis, the models' robustness if new data become available, and the impact of previously unavailable unclassified government data to be supplied by the USAF; and the technical maturity of key elements critical to RBS implementation and the ability of current technology development plans to meet technical readiness milestones.

  11. Interfacing a General Purpose Fluid Network Flow Program with the SINDA/G Thermal Analysis Program

    NASA Technical Reports Server (NTRS)

    Schallhorn, Paul; Popok, Daniel

    1999-01-01

    A general purpose, one dimensional fluid flow code is currently being interfaced with the thermal analysis program Systems Improved Numerical Differencing Analyzer/Gaski (SINDA/G). The flow code, the Generalized Fluid System Simulation Program (GFSSP), is capable of analyzing steady state and transient flow in a complex network and can model several physical phenomena, including compressibility effects, phase changes, body forces (such as gravity and centrifugal forces), and mixture thermodynamics for multiple species. The addition of GFSSP to SINDA/G provides a significant improvement in convective heat transfer modeling for SINDA/G. The interface development is conducted in multiple phases. This paper describes the first phase of the interface, which allows for steady and quasi-steady (unsteady solid, steady fluid) conjugate heat transfer modeling.
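
    The kind of network solution such a code performs can be illustrated with a minimal sketch (this is not GFSSP or its SINDA/G interface): a steady, incompressible two-branch network in which the interior node pressure is found by enforcing mass conservation, with each branch's friction lumped into an assumed quadratic resistance.

    ```python
    # Minimal 3-node / 2-branch steady flow network: fixed boundary pressures,
    # branch pressure drop modeled as dp = R * mdot * |mdot|, interior node
    # pressure found by a scalar Newton iteration on mass conservation.

    def branch_flow(p_up, p_dn, R):
        """Mass flow rate through a branch for a given pressure difference."""
        dp = p_up - p_dn
        return (abs(dp) / R) ** 0.5 * (1.0 if dp >= 0 else -1.0)

    def solve_interior_pressure(p_in, p_out, R1, R2, tol=1e-9, max_iter=50):
        """Find the interior node pressure p such that inflow equals outflow."""
        p = 0.5 * (p_in + p_out)                      # initial guess
        for _ in range(max_iter):
            res = branch_flow(p_in, p, R1) - branch_flow(p, p_out, R2)
            eps = 1e-6 * max(abs(p), 1.0)             # step for numerical derivative
            d_res = (branch_flow(p_in, p + eps, R1)
                     - branch_flow(p + eps, p_out, R2) - res) / eps
            step = res / d_res
            p -= step
            if abs(step) < tol:
                break
        return p

    p_mid = solve_interior_pressure(p_in=300e3, p_out=100e3, R1=2.0e6, R2=1.0e6)
    print("interior pressure [Pa]:", round(p_mid, 1))
    print("mass flow rate [kg/s]:", round(branch_flow(300e3, p_mid, 2.0e6), 4))
    ```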

  12. Study on model current predictive control method of PV grid- connected inverters systems with voltage sag

    NASA Astrophysics Data System (ADS)

    Jin, N.; Yang, F.; Shang, S. Y.; Tao, T.; Liu, J. S.

    2016-08-01

    Given the limitations of the low voltage ride through (LVRT) technology of traditional photovoltaic inverters, this paper proposes an LVRT control method based on model current predictive control (MCPC). This method can effectively improve the photovoltaic inverter output characteristics and response speed. In the MCPC method designed for the photovoltaic grid-connected inverter, the sum of the absolute values of the errors between the predicted currents and the reference currents is adopted as the cost function, and the optimal space voltage vector is selected by minimizing this cost. The inverter automatically switches between two control modes, prioritizing either active or reactive power control according to the operating state, which effectively improves its LVRT capability. The simulation and experimental results show that the proposed method is correct and effective.
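
    The cost-function idea above can be illustrated with a brief sketch of finite-control-set model predictive current control for a two-level grid-tied inverter with an L filter: each candidate switching state's voltage vector is used to predict the next-step current, and the state with the smallest sum of absolute current errors is applied. The DC-link voltage, filter parameters, sampling period, and forward-Euler prediction model below are illustrative assumptions, not values from the paper.

    ```python
    import math

    # Hypothetical parameters for a two-level grid-tied inverter with an L filter.
    VDC = 700.0        # DC-link voltage [V]
    L, R = 5e-3, 0.1   # filter inductance [H] and resistance [ohm]
    TS = 1e-4          # control period [s]

    def voltage_vectors(vdc):
        """Alpha-beta voltages of the 8 switching states of a two-level inverter."""
        vecs = []
        for sa in (0, 1):
            for sb in (0, 1):
                for sc in (0, 1):
                    va = vdc * (2 * sa - sb - sc) / 3.0
                    vb = vdc * (sb - sc) / math.sqrt(3.0)
                    vecs.append(((sa, sb, sc), (va, vb)))
        return vecs

    def predict_current(i_ab, v_inv, v_grid):
        """One-step forward-Euler prediction of the filter current (alpha-beta)."""
        ia, ib = i_ab
        dia = (v_inv[0] - v_grid[0] - R * ia) * TS / L
        dib = (v_inv[1] - v_grid[1] - R * ib) * TS / L
        return (ia + dia, ib + dib)

    def select_vector(i_meas, i_ref, v_grid):
        """Pick the switching state minimizing the sum of absolute current errors."""
        best_state, best_cost = None, float("inf")
        for state, v_inv in voltage_vectors(VDC):
            i_pred = predict_current(i_meas, v_inv, v_grid)
            cost = abs(i_ref[0] - i_pred[0]) + abs(i_ref[1] - i_pred[1])
            if cost < best_cost:
                best_state, best_cost = state, cost
        return best_state, best_cost

    state, cost = select_vector(i_meas=(5.0, 0.0), i_ref=(10.0, 0.0), v_grid=(325.0, 0.0))
    print("switching state:", state, "cost [A]:", round(cost, 3))
    ```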

  13. Improving Aerosol and Visibility Forecasting Capabilities Using Current and Future Generations of Satellite Observations

    DTIC Science & Technology

    2015-08-27

    and 2) preparing for the post-MODIS/MISR era using the Geostationary Operational Environmental Satellite (GOES). 3. Improve model representations of...meteorological property retrievals. In this study, using collocated data from Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and Geostationary

  14. Non-Solenoidal Startup Research Directions on the Pegasus Toroidal Experiment

    NASA Astrophysics Data System (ADS)

    Fonck, R. J.; Bongard, M. W.; Lewicki, B. T.; Reusch, J. A.; Winz, G. R.

    2017-10-01

    The Pegasus research program has been focused on developing a physical understanding and predictive models for non-solenoidal tokamak plasma startup using Local Helicity Injection (LHI). LHI employs strong localized electron currents injected along magnetic field lines in the plasma edge that relax through magnetic turbulence to form a tokamak-like plasma. Pending approval, the Pegasus program will address a broader, more comprehensive examination of non-solenoidal tokamak startup techniques. New capabilities may include: increasing the toroidal field to 0.6 T to support critical scaling tests to near-NSTX-U field levels; deploying internal plasma diagnostics; installing a coaxial helicity injection (CHI) capability in the upper divertor region; and deploying a modest (200-400 kW) electron cyclotron RF capability. These efforts will address scaling of relevant physics to higher BT, separate and comparative studies of helicity injection techniques, efficiency of handoff to consequent current sustainment techniques, and the use of ECH to synergistically improve the target plasma for consequent bootstrap and neutral beam current drive sustainment. This has an ultimate goal of validating techniques to produce a 1 MA target plasma in NSTX-U and beyond. Work supported by US DOE Grant DE-FG02-96ER54375.

  15. A Low-Signal-to-Noise-Ratio Sensor Framework Incorporating Improved Nighttime Capabilities in DIRSIG

    NASA Astrophysics Data System (ADS)

    Rizzuto, Anthony P.

    When designing new remote sensing systems, it is difficult to make apples-to-apples comparisons between designs because of the number of sensor parameters that can affect the final image. Using synthetic imagery and a computer sensor model allows for comparisons to be made between widely different sensor designs or between competing design parameters. Little work has been done in fully modeling low-SNR systems end-to-end for these types of comparisons. Currently DIRSIG has limited capability to accurately model nighttime scenes under new moon conditions or near large cities. An improved DIRSIG scene modeling capability is presented that incorporates all significant sources of nighttime radiance, including new models for urban glow and airglow, both taken from the astronomy community. A low-SNR sensor modeling tool is also presented that accounts for sensor components and noise sources to generate synthetic imagery from a DIRSIG scene. The various sensor parameters that affect SNR are discussed, and example imagery is shown with the new sensor modeling tool. New low-SNR detectors have recently been designed and marketed for remote sensing applications. A comparison of system parameters for a state-of-the-art low-SNR sensor is discussed, and a sample design trade study is presented for a hypothetical scene and sensor.
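
    A back-of-the-envelope version of the SNR bookkeeping such a sensor model performs is sketched below. This is not the DIRSIG tool or the sensor modeling tool described in the abstract; the photon flux, quantum efficiency, dark current, read noise, and frame count are illustrative assumptions.

    ```python
    import math

    def snr(photon_flux, qe, t_int, dark_rate, read_noise, n_frames=1):
        """Simple per-pixel SNR estimate with shot, dark, and read noise.

        photon_flux : scene photons reaching the pixel [photons/s]
        qe          : quantum efficiency [electrons/photon]
        t_int       : integration time per frame [s]
        dark_rate   : dark current [electrons/s]
        read_noise  : read noise per frame [electrons rms]
        n_frames    : co-added frames (noise terms add in quadrature)
        """
        signal = photon_flux * qe * t_int * n_frames
        noise = math.sqrt(signal
                          + dark_rate * t_int * n_frames
                          + n_frames * read_noise ** 2)
        return signal / noise

    # Example: a dim nighttime scene, short exposures, modest detector.
    print(round(snr(photon_flux=2.0e3, qe=0.6, t_int=0.02,
                    dark_rate=50.0, read_noise=3.0, n_frames=16), 2))
    ```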

  16. Used Nuclear Fuel-Storage, Transportation & Disposal Analysis Resource and Data System (UNF-ST&DARDS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banerjee, Kaushik; Clarity, Justin B; Cumberland, Riley M

    This will be licensed via RSICC. A new, integrated data and analysis system has been designed to simplify and automate the performance of accurate and efficient evaluations for characterizing the input to the overall nuclear waste management system - the UNF-Storage, Transportation & Disposal Analysis Resource and Data System (UNF-ST&DARDS). A relational database within UNF-ST&DARDS provides a standard means by which UNF-ST&DARDS can succinctly store and retrieve modeling and simulation (M&S) parameters for specific spent nuclear fuel analyses. A library of analysis model templates provides the ability to communicate the various sets of M&S parameters to the most appropriate M&S application. Interactive visualization capabilities facilitate data analysis and results interpretation. UNF-ST&DARDS current analysis capabilities include (1) assembly-specific depletion and decay, and (2) spent nuclear fuel cask-specific criticality and shielding. Currently, UNF-ST&DARDS uses the SCALE nuclear analysis code system for performing nuclear analysis.

  17. Expansion of space station diagnostic capability to include serological identification of viral and bacterial infections

    NASA Technical Reports Server (NTRS)

    Hejtmancik, Kelly E.

    1987-01-01

    It is necessary that an adequate microbiology capability be provided as part of the Health Maintenance Facility (HMF) to support expected microbial disease events during long periods of space flight. The applications of morphological and biochemical studies to confirm the presence of certain bacterial and fungal disease agents are currently available and under consideration. This confirmation would be greatly facilitated through employment of serological methods to aid in the identification of not only bacterial and fungal agents, but viruses as well. A number of serological approaches were considered, particularly the use of Enzyme Linked Immunosorbent Assays (ELISAs), which could be utilized during space flight conditions. A solid phase, membrane supported ELISA for the detection of Bordetella pertussis was developed to show a potential model system that would meet the HMF requirements and specifications for the future space station. A second model system for the detection of Legionella pneumophila, an expected bacterial disease agent, is currently under investigation.

  18. Source Physics Experiments at the Nevada Test Site

    DTIC Science & Technology

    2010-09-01

    ... seismograms through three-dimensional models of the earth will move monitoring science into a physics-based era. This capability should enable ... the advanced ability to model synthetic seismograms in three-dimensional earth models should also lead to advances in the ability to locate and ...

  19. From DNS to RANS: A Multi-model workflow to understand the Influence of Hurricanes on Generating Turbidity Currents in the Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Syvitski, J. P.; Arango, H.; Harris, C. K.; Meiburg, E. H.; Jenkins, C. J.; Auad, G.; Hutton, E.; Kniskern, T. A.; Radhakrishnan, S.

    2016-12-01

    A loosely coupled numerical workflow is developed to address land-sea pathways for sediment routing from terrestrial and coastal sources, across the continental shelf and ultimately down the continental slope canyon system of the northern Gulf of Mexico (GOM). Model simulations represent a range of environmental conditions that might lead to the generation of turbidity currents. The workflow comprises: 1) A simulator for the water and sediment discharged from rivers into the GOM using WMBsedv2, calibrated with USGS and USACE gauged river data; 2) Domain grids and bathymetry (ETOPO2) for the ocean models and realistic seabed sediment texture grids (dbSEABED) for the sediment transport models; 3) A spectral wave action simulator (10 km resolution) (WaveWatch III) driven by GFDL - GFS winds; 4) A simulator for ocean dynamics (ROMS) forced with ECMWF ERA winds; 5) A simulator for seafloor resuspension and transport (CSTMS); 6) Simulators (HurriSlip) of seafloor failure and flow ignition locations for boundary input to a turbidity current model; and 7) A RANS turbidity current model (TURBINS) to route sediment flows down GOM canyons, providing estimates of bottom shear stresses. TURBINS was developed first as a DNS model and then converted to an LES model wherein a dynamic turbulence closure scheme was employed. As in most DNS-to-LES model comparisons (performed here by the UCSB team), turbulence scaling allowed for higher-Re applications but was still found incapable of simulating field-scale (GOM continental canyon) environments. The LES model was next converted to a non-hydrostatic RANS model capable of field scale applications, but only with a daisy-chain approach to multiple model runs along the simulated canyon floor. These model adaptations allowed the workflow to be tested for the year 1-Oct-2007 to 30-Sep-2008, which included two hurricanes (Ike and Gustav) within the domain. The RANS version of TURBINS employed further boundary simplifications for both sediment erosion and deposition, in line with the ocean model ROMS-CSTMS.

  20. Adversarial reasoning: challenges and approaches

    NASA Astrophysics Data System (ADS)

    Kott, Alexander; Ownby, Michael

    2005-05-01

    This paper defines adversarial reasoning as computational approaches to inferring and anticipating an enemy's perceptions, intents and actions. It argues that adversarial reasoning transcends the boundaries of game theory and must also leverage such disciplines as cognitive modeling, control theory, AI planning and others. To illustrate the challenges of applying adversarial reasoning to real-world problems, the paper explores the lessons learned in the CADET -- a battle planning system that focuses on brigade-level ground operations and involves adversarial reasoning. From this example of current capabilities, the paper proceeds to describe RAID -- a DARPA program that aims to build capabilities in adversarial reasoning, and how such capabilities would address practical requirements in Defense and other application areas.

  1. Early Estimation of Solar Activity Cycle: Potential Capability and Limits

    NASA Technical Reports Server (NTRS)

    Kitiashvili, Irina N.; Collins, Nancy S.

    2017-01-01

    The variable solar magnetic activity known as the 11-year solar cycle has the longest history of solar observations. These cycles dramatically affect conditions in the heliosphere and the Earth's space environment. Our current understanding of the physical processes that make up global solar dynamics and the dynamo that generates the magnetic fields is sketchy, resulting in unrealistic descriptions in theoretical and numerical models of the solar cycles. The absence of long-term observations of solar interior dynamics and photospheric magnetic fields hinders development of accurate dynamo models and their calibration. In such situations, mathematical data assimilation methods provide an optimal approach for combining the available observational data and their uncertainties with theoretical models in order to estimate the state of the solar dynamo and predict future cycles. In this presentation, we will discuss the implementation and performance of an Ensemble Kalman Filter data assimilation method based on the Parker migratory dynamo model, complemented by the equation of magnetic helicity conservation and long-term sunspot data series. This approach has allowed us to reproduce the general properties of solar cycles and has already demonstrated a good predictive capability for the current cycle, 24. We will discuss further development of this approach, which includes a more sophisticated dynamo model and synoptic magnetogram data, and which employs DART, the Data Assimilation Research Testbed.
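
    The analysis step of such a data-assimilation approach can be illustrated with a generic stochastic ensemble Kalman filter update; this is only a sketch, not the actual DART/Parker-dynamo implementation, and the toy state vector, observation operator, and error levels below are assumptions.

    ```python
    import numpy as np

    def enkf_update(ensemble, obs, obs_err_std, H):
        """Stochastic EnKF analysis step.

        ensemble    : (n_members, n_state) forecast states
        obs         : (n_obs,) observations (e.g., an annual sunspot number)
        obs_err_std : observation error standard deviation
        H           : (n_obs, n_state) linear observation operator
        """
        n_members = ensemble.shape[0]
        X = ensemble - ensemble.mean(axis=0)              # state anomalies
        Y = X @ H.T                                       # observation-space anomalies
        P_yy = Y.T @ Y / (n_members - 1) + (obs_err_std ** 2) * np.eye(len(obs))
        P_xy = X.T @ Y / (n_members - 1)
        K = P_xy @ np.linalg.inv(P_yy)                    # Kalman gain
        # Perturb observations so the analysis ensemble keeps the right spread.
        obs_pert = obs + obs_err_std * np.random.randn(n_members, len(obs))
        innovations = obs_pert - ensemble @ H.T
        return ensemble + innovations @ K.T

    # Toy example: 2-component state, one observed component, 20 members.
    rng = np.random.default_rng(0)
    ens = rng.normal([1.0, 0.5], 0.3, size=(20, 2))
    H = np.array([[1.0, 0.0]])
    analysis = enkf_update(ens, obs=np.array([1.4]), obs_err_std=0.1, H=H)
    print("analysis mean:", analysis.mean(axis=0))
    ```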

  2. New Multi-group Transport Neutronics (PHISICS) Capabilities for RELAP5-3D and its Application to Phase I of the OECD/NEA MHTGR-350 MW Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom; Cristian Rabiti; Andrea Alfonsi

    2012-10-01

    PHISICS is a neutronics code system currently under development at the Idaho National Laboratory (INL). Its goal is to provide state-of-the-art simulation capability to reactor designers. The different modules for PHISICS currently under development are a nodal and semi-structured transport core solver (INSTANT), a depletion module (MRTAU) and a cross section interpolation (MIXER) module. The INSTANT module is the most developed of the modules mentioned above. Basic functionalities are ready to use, but the code is still in continuous development to extend its capabilities. This paper reports on the effort of coupling the nodal kinetics code package PHISICS (INSTANT/MRTAU/MIXER) to the thermal hydraulics system code RELAP5-3D, to enable full core and system modeling. This will make it possible to model coupled (thermal-hydraulics and neutronics) problems with more options for 3D neutron kinetics, compared to the existing diffusion theory neutron kinetics module in RELAP5-3D (NESTLE). In the second part of the paper, an overview of the OECD/NEA MHTGR-350 MW benchmark is given. This benchmark has been approved by the OECD, and is based on the General Atomics 350 MW Modular High Temperature Gas Reactor (MHTGR) design. The benchmark includes coupled neutronics/thermal-hydraulics exercises that require more capabilities than RELAP5-3D with NESTLE offers. Therefore, the MHTGR benchmark makes extensive use of the new PHISICS/RELAP5-3D coupling capabilities. The paper presents the preliminary results of the three steady state exercises specified in Phase I of the benchmark using PHISICS/RELAP5-3D.

  3. An overview of Space Shuttle anthropometry and biomechanics research with emphasis on STS/Mir recumbent seat system design

    NASA Technical Reports Server (NTRS)

    Klute, Glenn K.; Stoycos, Lara E.

    1994-01-01

    The Anthropometry and Biomechanics Laboratory (ABL) at JSC conducts multi-disciplinary research focusing on maximizing astronaut intravehicular (IVA) and extravehicular (EVA) capabilities to provide the most effective work conditions for manned space flight and exploration missions. Biomechanics involves the measurement and modeling of the strength characteristics of the human body. Current research for the Space Shuttle Program includes the measurement of torque wrench capability during weightlessness, optimization of foot restraint, and hand hold placement, measurements of the strength and dexterity of the pressure gloved hand to improve glove design, quantification of the ability to move and manipulate heavy masses (6672 N or 1500 lb) in weightlessness, and verification of the capability of EVA crewmembers to perform Hubble Space Telescope repair tasks. Anthropometry is the measurement and modeling of the dimensions of the human body. Current research for the Space Shuttle Program includes the measurement of 14 anthropometric parameters of every astronaut candidate, identification of EVA finger entrapment hazards by measuring the dimensions of the gloved hand, definition of flight deck reach envelopes during launch and landing accelerations, and measurement of anthropometric design parameters for the recumbent seat system required for the Shuttle/Mir mission (STS-71, Spacelab M) scheduled for Jun. 1995.

  4. Methods for Estimating Environmental Effects and Constraints on NexGen: High Density Case Study

    NASA Technical Reports Server (NTRS)

    Augustine, S.; Ermatinger, C.; Graham, M.; Thompson, T.

    2010-01-01

    This document provides a summary of the current methods developed by Metron Aviation for the estimate of environmental effects and constraints on the Next Generation Air Transportation System (NextGen). This body of work incorporates many of the key elements necessary to achieve such an estimate. Each section contains the background and motivation for the technical elements of the work, a description of the methods used, and possible next steps. The current methods described in this document were selected in an attempt to provide a good balance between accuracy and fairly rapid turn around times to best advance Joint Planning and Development Office (JPDO) System Modeling and Analysis Division (SMAD) objectives while also supporting the needs of the JPDO Environmental Working Group (EWG). In particular this document describes methods applied to support the High Density (HD) Case Study performed during the spring of 2008. A reference day (in 2006) is modeled to describe current system capabilities while the future demand is applied to multiple alternatives to analyze system performance. The major variables in the alternatives are operational/procedural capabilities for airport, terminal, and en route airspace along with projected improvements to airframe, engine and navigational equipment.

  5. Automation life-cycle cost model

    NASA Technical Reports Server (NTRS)

    Gathmann, Thomas P.; Reeves, Arlinda J.; Cline, Rick; Henrion, Max; Ruokangas, Corinne

    1992-01-01

    The problem domain being addressed by this contractual effort can be summarized by the following list: Automation and Robotics (A&R) technologies appear to be viable alternatives to current, manual operations; Life-cycle cost models are typically judged with suspicion due to implicit assumptions and little associated documentation; and Uncertainty is a reality for increasingly complex problems and few models explicitly account for its effect on the solution space. The objectives for this effort range from the near-term (1-2 years) to far-term (3-5 years). In the near-term, the envisioned capabilities of the modeling tool are annotated. In addition, a framework is defined and developed in the Decision Modelling System (DEMOS) environment. Our approach is summarized as follows: Assess desirable capabilities (structure into near- and far-term); Identify useful existing models/data; Identify parameters for utility analysis; Define tool framework; Encode scenario thread for model validation; and Provide transition path for tool development. This report contains all relevant, technical progress made on this contractual effort.

  6. Development of an Unstructured, Three-Dimensional Material Response Design Tool

    NASA Technical Reports Server (NTRS)

    Schulz, Joseph; Stern, Eric; Palmer, Grant; Muppidi, Suman; Schroeder, Olivia

    2017-01-01

    A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material as a result of the radiative and convective surface heating imposed on the material from the surrounding high enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be quickly added. The extensibility is critical since thermal protection systems are becoming increasingly complex, e.g. woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries as well as multi-dimensional physics, which have been shown to be important in some scenarios and are not captured by one-dimensional models. In this paper, the mathematical and numerical formulation is presented, followed by a discussion of the software architecture and some preliminary verification and validation studies.

  7. Metabolic modeling of dynamic 13C NMR isotopomer data in the brain in vivo: Fast screening of metabolic models using automated generation of differential equations

    PubMed Central

    Tiret, Brice; Shestov, Alexander A.; Valette, Julien; Henry, Pierre-Gilles

    2017-01-01

    Most current brain metabolic models are not capable of taking into account the dynamic isotopomer information available from fine structure multiplets in 13C spectra, due to the difficulty of implementing such models. Here we present a new approach that allows automatic implementation of multi-compartment metabolic models capable of fitting any number of 13C isotopomer curves in the brain. The new automated approach also makes it possible to quickly modify and test new models to best describe the experimental data. We demonstrate the power of the new approach by testing the effect of adding separate pyruvate pools in astrocytes and neurons, and adding a vesicular neuronal glutamate pool. Including both changes reduced the global fit residual by half and pointed to dilution of label prior to entry into the astrocytic TCA cycle as the main source of glutamine dilution. The glutamate-glutamine cycle rate was particularly sensitive to changes in the model. PMID:26553273
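
    The equation-generation idea can be illustrated with a minimal sketch that assembles label-enrichment ODEs from a flux table. The pool sizes and fluxes below are hypothetical, and a simple fractional-enrichment formulation (assuming constant pool sizes) stands in for the full 13C isotopomer/multiplet treatment described in the abstract.

    ```python
    from scipy.integrate import solve_ivp

    pools = {"Pyr": 0.2, "Glu": 10.0, "Gln": 4.0}      # pool sizes [umol/g]
    fluxes = [("label", "Pyr", 0.8),                   # (source, destination, flux)
              ("Pyr", "Glu", 0.8),
              ("Glu", "Gln", 0.4),
              ("Gln", "Glu", 0.4),
              ("Glu", "out", 0.8)]                     # efflux keeps pools at steady state
    enrich_in = {"label": 1.0}                         # fully labeled substrate

    names = list(pools)

    def rhs(t, e):
        """Assemble d(enrichment)/dt for every pool from the flux table."""
        e_map = dict(zip(names, e), **enrich_in)
        de = [0.0] * len(names)
        for src, dst, v in fluxes:
            if dst in pools:                           # efflux does not change enrichment
                i = names.index(dst)
                de[i] += v * (e_map[src] - e_map[dst]) / pools[dst]
        return de

    sol = solve_ivp(rhs, (0.0, 60.0), [0.0] * len(names), t_eval=[0, 15, 30, 60])
    for name, row in zip(names, sol.y):
        print(name, [round(x, 3) for x in row])
    ```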

  8. The Langley Research Center CSI phase-0 evolutionary model testbed-design and experimental results

    NASA Technical Reports Server (NTRS)

    Belvin, W. K.; Horta, Lucas G.; Elliott, K. B.

    1991-01-01

    A testbed for the development of Controls Structures Interaction (CSI) technology is described. The design philosophy, capabilities, and early experimental results are presented to introduce some of the ongoing CSI research at NASA-Langley. The testbed, referred to as the Phase 0 version of the CSI Evolutionary model (CEM), is the first stage of model complexity designed to show the benefits of CSI technology and to identify weaknesses in current capabilities. Early closed loop test results have shown non-model based controllers can provide an order of magnitude increase in damping in the first few flexible vibration modes. Model based controllers for higher performance will need to be robust to model uncertainty as verified by System ID tests. Data are presented that show finite element model predictions of frequency differ from those obtained from tests. Plans are also presented for evolution of the CEM to study integrated controller and structure design as well as multiple payload dynamics.

  9. Hierarchical equations of motion method applied to nonequilibrium heat transport in model molecular junctions: Transient heat current and high-order moments of the current operator

    NASA Astrophysics Data System (ADS)

    Song, Linze; Shi, Qiang

    2017-02-01

    We present a theoretical approach to study nonequilibrium quantum heat transport in molecular junctions described by a spin-boson type model. Based on the Feynman-Vernon path integral influence functional formalism, expressions for the average value and high-order moments of the heat current operators are derived, which are further obtained directly from the auxiliary density operators (ADOs) in the hierarchical equations of motion (HEOM) method. Distribution of the heat current is then derived from the high-order moments. As the HEOM method is nonperturbative and capable of treating non-Markovian system-environment interactions, the method can be applied to various problems of nonequilibrium quantum heat transport beyond the weak coupling regime.

  10. A comprehensive cost model for NASA data archiving

    NASA Technical Reports Server (NTRS)

    Green, J. L.; Klenk, K. F.; Treinish, L. A.

    1990-01-01

    A simple archive cost model has been developed to help predict NASA's archiving costs. The model covers data management activities from the beginning of the mission through launch, acquisition, and support of retrospective users by the long-term archive; it is capable of determining the life cycle costs for archived data depending on how the data need to be managed to meet user requirements. The model, which currently contains 48 equations with a menu-driven user interface, is available for use on an IBM PC or AT.
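
    As a rough illustration of what a life-cycle archive cost model computes (the actual model's 48 equations are not reproduced here), the sketch below sums hypothetical ingest, storage, staffing, and retrospective-user support costs over mission and post-mission archive phases.

    ```python
    def archive_life_cycle_cost(data_rate_gb_per_yr, mission_years, archive_years,
                                ingest_cost_per_gb=0.50, storage_cost_per_gb_yr=0.10,
                                support_staff=2, staff_cost_per_yr=150e3,
                                requests_per_yr=5000, cost_per_request=2.0):
        """Sum yearly archive costs over the mission and post-mission phases."""
        total = 0.0
        archived_gb = 0.0
        for year in range(mission_years + archive_years):
            if year < mission_years:                        # data still being acquired
                total += data_rate_gb_per_yr * ingest_cost_per_gb
                archived_gb += data_rate_gb_per_yr
            total += archived_gb * storage_cost_per_gb_yr   # holdings keep growing
            total += support_staff * staff_cost_per_yr      # archive operations staff
            total += requests_per_yr * cost_per_request     # retrospective user support
        return total

    print(f"${archive_life_cycle_cost(2000.0, mission_years=5, archive_years=10):,.0f}")
    ```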

  11. Heat analysis of thermal overload relays using 3-D finite element method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawase, Yoshihiro; Ichihashi, Takayuki; Ito, Shokichi

    1999-05-01

    In designing a thermal overload relay, it is necessary to analyze thermal characteristics of several trial models. Up to now, this has been done by measuring the temperatures on a number of positions in the trial models. This experimental method is undoubtedly expensive. In this paper, the temperature distribution of a thermal overload relay is obtained by using 3-D finite element analysis taking into account the current distribution in current-carrying conductors. It is shown that the 3-D analysis is capable of evaluating a new design of thermal overload relays.
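
    The coupled electro-thermal calculation described above can be illustrated, in a much reduced form, with a 1-D explicit finite-difference model of a current-carrying strip subject to Joule heating, conduction, and convection to ambient; the geometry and material values below are illustrative assumptions rather than those of the relay studied.

    ```python
    # 1-D transient heating of a current-carrying copper strip (toy stand-in
    # for the 3-D finite element analysis described in the abstract).
    n = 51                                  # nodes along the strip
    length = 0.05                           # strip length [m]
    dx = length / (n - 1)
    k, rho, cp = 380.0, 8900.0, 385.0       # copper conductivity, density, heat capacity
    resistivity = 1.7e-8                    # electrical resistivity [ohm m]
    area = 1e-6                             # cross-section [m2]
    h, perim, t_amb = 25.0, 4e-3, 20.0      # convection coeff, perimeter, ambient temp
    current = 30.0                          # carried current [A]

    q_joule = resistivity * (current / area) ** 2          # volumetric heating [W/m3]
    alpha = k / (rho * cp)                                 # thermal diffusivity
    dt = 0.4 * dx * dx / alpha                             # stable explicit time step

    T = [t_amb] * n
    for _ in range(int(5.0 / dt)):                         # simulate 5 seconds
        Tn = T[:]
        for i in range(1, n - 1):
            cond = alpha * (T[i + 1] - 2 * T[i] + T[i - 1]) / dx ** 2
            conv = h * perim * (T[i] - t_amb) / (rho * cp * area)
            Tn[i] = T[i] + dt * (cond + q_joule / (rho * cp) - conv)
        Tn[0] = Tn[-1] = t_amb                             # ends clamped at ambient
        T = Tn

    print("peak temperature [C]:", round(max(T), 1))
    ```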

  12. System Dynamics Aviation Readiness Modeling Demonstration

    DTIC Science & Technology

    2005-08-31

    requirements. It is recommended that the Naval Aviation Enterprise take a close look at the requirements i.e., performance measures, methodology ...unit’s capability to perform specific Joint Mission Essential Task List (JMETL) requirements now and in the future. This assessment methodology must...the time-associated costs. The new methodology must base decisions on currently available data and databases. A “useful” readiness model should be

  13. The Aircraft Industry

    DTIC Science & Technology

    2005-01-01

    market segment. Manufacturers are putting considerable effort into creating new models and upgrading current products for the high-end corporate...market share. Both competitors have new products entering the market with the Airbus A380 due around 2006, and the Boeing 787 scheduled for service in...commonality on all systems and technologies. First production models are expected to be delivered in 2008. Initial operating capability (IOC) for the U.S

  14. Modeling of Aerosols in Post-Combustor Flow Path and Sampling System

    NASA Technical Reports Server (NTRS)

    Wey, Thomas; Liu, Nan-Suey

    2006-01-01

    The development and application of a multi-dimensional capability for modeling and simulation of aviation-sourced particle emissions and their precursors are elucidated. Current focus is on the role of the flow and thermal environments. The cases investigated include a film cooled turbine blade, the first-stage of a high-pressure turbine, the sampling probes, the sampling lines, and a pressure reduction chamber.

  15. Assimilation of the seabird and ship drift data in the north-eastern sea of Japan into an operational ocean nowcast/forecast system

    PubMed Central

    Miyazawa, Yasumasa; Guo, Xinyu; Varlamov, Sergey M.; Miyama, Toru; Yoda, Ken; Sato, Katsufumi; Kano, Toshiyuki; Sato, Keiji

    2015-01-01

    At the present time, ocean currents are operationally monitored mainly by combined use of numerical ocean nowcast/forecast models and satellite remote sensing data. Improvement in the accuracy of the ocean current nowcast/forecast requires additional measurements with higher spatial and temporal resolution than expected from the current observation network. Here we show the feasibility of assimilating high-resolution seabird and ship drift data into an operational ocean forecast system. Data assimilation of the geostrophic current contained in the observed drift leads to refinement of the gyre mode events of the Tsugaru warm current in the north-eastern sea of Japan represented by the model. Fitting the observed drift to the model depends on the ability of the drift to represent the geostrophic current rather than directly wind-driven components. The preferable horizontal scale of 50 km indicated for the seabird drift data assimilation implies a capability of capturing eddies with smaller horizontal scale than the minimum scale of 100 km resolved by the satellite altimetry. The present study demonstrates that transdisciplinary approaches combining bio-/ship-logging and numerical modeling could be effective for enhancing the monitoring of ocean currents. PMID:26633309

  16. ALGE3D: A Three-Dimensional Transport Model

    NASA Astrophysics Data System (ADS)

    Maze, G. M.

    2017-12-01

    Of the top 10 most populated US cities from a 2015 US Census Bureau estimate, 7 are situated near the ocean, a bay, or on one of the Great Lakes. Contamination of the waterways in the United States could be devastating to the economy (through tourism and industries such as fishing), public health (from direct contact or contaminated drinking water), and in some cases even infrastructure (water treatment plants). Current national response models employed by emergency response agencies are well developed for simulating the effects of hazardous contaminants in riverine systems that are primarily driven by one-dimensional flows; however, in more complex systems, such as tidal estuaries, bays, or lakes, a more complex model is needed. While many models exist, none are capable of quick deployment in emergencies that may involve a variety of release scenarios, including a mixture of both particulate and dissolved chemicals in a complex flow area. ALGE3D, developed at the Department of Energy's (DOE) Savannah River National Laboratory (SRNL), is a three-dimensional hydrodynamic code which solves the momentum, mass, and energy conservation equations to predict the movement and dissipation of thermal or dissolved chemical plumes discharged into cooling lakes, rivers, and estuaries. ALGE3D is capable of modeling very complex flows, including areas with tidal flows that involve wetting and drying of land. Recent upgrades have increased its capabilities, including the transport of particulate tracers, allowing for more complete modeling of the transport of pollutants. In addition, the model is capable of coupling with a one-dimensional riverine transport model or a two-dimensional atmospheric deposition model in the event that a contamination event occurs upstream or upwind of the water body.

  17. A new bead-spring model for simulation of semi-flexible macromolecules

    NASA Astrophysics Data System (ADS)

    Saadat, Amir; Khomami, Bamin

    2016-11-01

    A bead-spring model for semi-flexible macromolecules is developed to overcome the deficiencies of the current coarse-grained bead-spring models. Specifically, model improvements are achieved through incorporation of a bending potential. The new model is designed to accurately describe the correlation along the backbone of the chain, segmental length, and force-extension behavior of the macromolecule even at the limit of 1 Kuhn step per spring. The relaxation time of different Rouse modes is used to demonstrate the capabilities of the new model in predicting chain dynamics.
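
    As an illustration of the bending-potential idea, the sketch below evaluates the common discrete worm-like-chain form E = kappa * sum(1 - cos(theta)) along a bead chain. The functional form, bead positions, and stiffness value are assumptions for illustration and may differ from the published model.

    ```python
    import numpy as np

    def bending_energy(beads, kappa):
        """Bending energy kappa * sum(1 - cos(theta_i)) over interior bead angles."""
        bonds = np.diff(beads, axis=0)                               # bond vectors
        unit = bonds / np.linalg.norm(bonds, axis=1, keepdims=True)  # unit bond vectors
        cos_theta = np.sum(unit[:-1] * unit[1:], axis=1)             # cosines of angles
        return kappa * np.sum(1.0 - cos_theta)

    # A gently curved 5-bead chain; stiffer chains (larger kappa) pay more energy.
    chain = np.array([[0.0, 0.0, 0.0],
                      [1.0, 0.0, 0.0],
                      [2.0, 0.2, 0.0],
                      [3.0, 0.5, 0.0],
                      [4.0, 1.0, 0.0]])
    print(round(bending_energy(chain, kappa=5.0), 4))
    ```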

  18. Analytical modeling of eddy-current losses caused by pulse-width-modulation switching in permanent-magnet brushless direct-current motors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, F.; Nehl, T.W.

    1998-09-01

    Because of their high efficiency and power density the PM brushless dc motor is a strong candidate for electric and hybrid vehicle propulsion systems. An analytical approach is developed to predict the inverter high frequency pulse width modulation (PWM) switching caused eddy-current losses in a permanent magnet brushless dc motor. The model uses polar coordinates to take curvature effects into account, and is also capable of including the space harmonic effect of the stator magnetic field and the stator lamination effect on the losses. The model was applied to an existing motor design and was verified with the finite elementmore » method. Good agreement was achieved between the two approaches. Hence, the model is expected to be very helpful in predicting PWM switching losses in permanent magnet machine design.« less

  19. A new software tool for computing Earth's atmospheric transmission of near- and far-infrared radiation

    NASA Technical Reports Server (NTRS)

    Lord, Steven D.

    1992-01-01

    This report describes a new software tool, ATRAN, which computes the transmittance of Earth's atmosphere at near- and far-infrared wavelengths. We compare the capabilities of this program with others currently available and demonstrate its utility for observational data calibration and reduction. The program employs current water-vapor and ozone models to produce fast and accurate transmittance spectra for wavelengths ranging from 0.8 microns to 10 mm.

  20. Thermo-Mechanical and Electrochemistry Modeling of Planar SOFC Stacks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khaleel, Mohammad A.; Recknagle, Kurtis P.; Lin, Zijing

    2002-12-01

    Modeling activities at PNNL support design and development of modular SOFC systems. The SOFC stack modeling capability at PNNL has developed to a level at which planar stack designs can be compared and optimized for startup performance. Thermal-fluids and stress modeling is being performed to predict the transient temperature distribution and to determine the thermal stresses based on the temperature distribution. Current efforts also include the development of a model for calculating current density, cell voltage, and heat production in SOFC stacks with hydrogen or other fuels. The model includes the heat generation from both Joule heating and chemical reactions. It also accounts for species production and destruction via mass balance. The model is being linked to the finite element code MARC to allow for the evaluation of temperatures and stresses during steady state operations.
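
    A zero-dimensional sketch of the voltage and heat bookkeeping such a stack model performs is given below (illustrative parameter values, not PNNL's model): the cell voltage is taken as the Nernst potential minus an ohmic (Joule) loss, and the heat release follows from the gap between the thermoneutral potential of the fuel oxidation reaction and the cell voltage.

    ```python
    F = 96485.0                    # Faraday constant [C/mol]

    def sofc_operating_point(i_dens, e_nernst=0.95, asr=0.25, e_thermoneutral=1.28):
        """0-D SOFC operating point for hydrogen fuel.

        i_dens          : current density [A/cm2]
        e_nernst        : Nernst (open-circuit) potential [V]
        asr             : area-specific resistance [ohm cm2], lumping ohmic losses
        e_thermoneutral : enthalpy potential of H2 oxidation [V]
        """
        v_cell = e_nernst - i_dens * asr                  # ohmic (Joule) loss
        p_elec = v_cell * i_dens                          # electrical power [W/cm2]
        q_heat = (e_thermoneutral - v_cell) * i_dens      # heat released [W/cm2]
        h2_rate = i_dens / (2.0 * F)                      # H2 consumption [mol/s/cm2]
        return v_cell, p_elec, q_heat, h2_rate

    v, p, q, h2 = sofc_operating_point(0.5)
    print(f"V = {v:.3f} V, P = {p:.3f} W/cm2, Q = {q:.3f} W/cm2, H2 = {h2:.2e} mol/s/cm2")
    ```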

  1. Managing Complexity in Next Generation Robotic Spacecraft: From a Software Perspective

    NASA Technical Reports Server (NTRS)

    Reinholtz, Kirk

    2008-01-01

    This presentation highlights the challenges in the design of software to support robotic spacecraft. Robotic spacecraft offer a higher degree of autonomy; however, more capabilities are currently required, primarily in the software, while providing the same or a higher degree of reliability. The complexity of designing such an autonomous system is great, particularly while attempting to address the needs for increased capabilities and high reliability without increased needs for time or money. The efforts to develop programming models for the new hardware and the integration of software architecture are highlighted.

  2. A Coupled Layerwise Analysis of the Thermopiezoelectric Response of Smart Composite Beams

    NASA Technical Reports Server (NTRS)

    Lee, H.-J.; Saravanos, D. A.

    1995-01-01

    Thermal effects are incorporated into previously developed discrete layer mechanics for piezoelectric composite beam structures. The updated mechanics explicitly account for the complete coupled thermoelectromechanical response of smart composite beams. This unified representation leads to an inherent capability to model both the sensory and actuator responses of piezoelectric composite beams in a thermal environment. Finite element equations are developed and numerical results are presented to demonstrate the capability of the current formulation to represent the behavior of both sensory and active smart structures under thermal loadings.

  3. Landscape capability models as a tool to predict fine-scale forest bird occupancy and abundance

    USGS Publications Warehouse

    Loman, Zachary G.; DeLuca, William; Harrison, Daniel J.; Loftin, Cynthia S.; Rolek, Brian W.; Wood, Petra B.

    2018-01-01

    ContextSpecies-specific models of landscape capability (LC) can inform landscape conservation design. Landscape capability is “the ability of the landscape to provide the environment […] and the local resources […] needed for survival and reproduction […] in sufficient quantity, quality and accessibility to meet the life history requirements of individuals and local populations.” Landscape capability incorporates species’ life histories, ecologies, and distributions to model habitat for current and future landscapes and climates as a proactive strategy for conservation planning.ObjectivesWe tested the ability of a set of LC models to explain variation in point occupancy and abundance for seven bird species representative of spruce-fir, mixed conifer-hardwood, and riparian and wooded wetland macrohabitats.MethodsWe compiled point count data sets used for biological inventory, species monitoring, and field studies across the northeastern United States to create an independent validation data set. Our validation explicitly accounted for underestimation in validation data using joint distance and time removal sampling.ResultsBlackpoll warbler (Setophaga striata), wood thrush (Hylocichla mustelina), and Louisiana (Parkesia motacilla) and northern waterthrush (P. noveboracensis) models were validated as predicting variation in abundance, although this varied from not biologically meaningful (1%) to strongly meaningful (59%). We verified all seven species models [including ovenbird (Seiurus aurocapilla), blackburnian (Setophaga fusca) and cerulean warbler (Setophaga cerulea)], as all were positively related to occupancy data.ConclusionsLC models represent a useful tool for conservation planning owing to their predictive ability over a regional extent. As improved remote-sensed data become available, LC layers are updated, which will improve predictions.

  4. Modeling users, context and devices for ambient assisted living environments.

    PubMed

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-03-17

    The participation of users within AAL environments is increasing thanks to the capabilities of the current wearable devices. Furthermore, the significance of considering users' preferences, context conditions and devices' capabilities helps smart environments to personalize services and resources for them. Being aware of different characteristics of the entities participating in these situations is vital for reaching the main goals of the corresponding systems efficiently. To collect different information from these entities, it is necessary to design several formal models which help designers to organize and give some meaning to the gathered data. In this paper, we analyze several literature solutions for modeling users, context and devices considering different approaches in the Ambient Assisted Living domain. In addition, we highlight different ongoing standardization efforts in this area. We also discuss the used techniques, modeled characteristics and the advantages and drawbacks of each approach to finally draw several conclusions about the reviewed works.

  5. Modeling Users, Context and Devices for Ambient Assisted Living Environments

    PubMed Central

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-01-01

    The participation of users within AAL environments is increasing thanks to the capabilities of the current wearable devices. Furthermore, the significance of considering users' preferences, context conditions and devices' capabilities helps smart environments to personalize services and resources for them. Being aware of different characteristics of the entities participating in these situations is vital for reaching the main goals of the corresponding systems efficiently. To collect different information from these entities, it is necessary to design several formal models which help designers to organize and give some meaning to the gathered data. In this paper, we analyze several literature solutions for modeling users, context and devices considering different approaches in the Ambient Assisted Living domain. In addition, we highlight different ongoing standardization efforts in this area. We also discuss the used techniques, modeled characteristics and the advantages and drawbacks of each approach to finally draw several conclusions about the reviewed works. PMID:24643006

  6. Benefits of a Unified LaSRS++ Simulation for NAS-Wide and High-Fidelity Modeling

    NASA Technical Reports Server (NTRS)

    Glaab, Patricia; Madden, Michael

    2014-01-01

    The LaSRS++ high-fidelity vehicle simulation was extended in 2012 to support a NAS-wide simulation mode. Since the initial proof-of-concept, the LaSRS++ NAS-wide simulation is maturing into a research-ready tool. A primary benefit of this new capability is the consolidation of the two modeling paradigms under a single framework to save cost, facilitate iterative concept testing between the two tools, and to promote communication and model sharing between user communities at Langley. Specific benefits of each type of modeling are discussed along with the expected benefits of the unified framework. Current capability details of the LaSRS++ NAS-wide simulations are provided, including the visualization tool, live data interface, trajectory generators, terminal routing for arrivals and departures, maneuvering, re-routing, navigation, winds, and turbulence. The plan for future development is also described.

  7. Development of a non-contextual model for determining the autonomy level of intelligent unmanned systems

    NASA Astrophysics Data System (ADS)

    Durst, Phillip J.; Gray, Wendell; Trentini, Michael

    2013-05-01

    A simple, quantitative measure for encapsulating the autonomous capabilities of unmanned systems (UMS) has yet to be established. Current models for measuring a UMS's autonomy level require extensive, operational level testing, and provide a means for assessing the autonomy level for a specific mission/task and operational environment. A more elegant technique for quantifying autonomy using component level testing of the robot platform alone, outside of mission and environment contexts, is desirable. Using a high level framework for UMS architectures, such a model for determining a level of autonomy has been developed. The model uses a combination of developmental and component level testing for each aspect of the UMS architecture to define a non-contextual autonomous potential (NCAP). The NCAP provides an autonomy level, ranging from fully non-autonomous to fully autonomous, in the form of a single numeric parameter describing the UMS's performance capabilities when operating at that level of autonomy.
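
    A minimal sketch of how component-level test scores might be rolled up into a single autonomy parameter, in the spirit of the NCAP described above, is shown below; the component names, weights, and 0-1 scoring scale are hypothetical and are not taken from the model.

    ```python
    def ncap_score(component_scores, weights=None):
        """Weighted mean of 0-1 component scores -> single autonomy parameter."""
        if weights is None:
            weights = {name: 1.0 for name in component_scores}
        total_weight = sum(weights[name] for name in component_scores)
        return sum(score * weights[name]
                   for name, score in component_scores.items()) / total_weight

    # Hypothetical component-level results from developmental testing.
    scores = {"perception": 0.7, "world_modeling": 0.4, "planning": 0.5,
              "behavior_generation": 0.6, "operator_independence": 0.3}
    print(round(ncap_score(scores), 3))   # 0 = fully non-autonomous, 1 = fully autonomous
    ```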

  8. Fisher-Level Decision Making to Participate in Fisheries Improvement Projects (FIPs) for Yellowfin Tuna in the Philippines

    PubMed Central

    Berentsen, Paul; Bush, Simon R.; Digal, Larry; Oude Lansink, Alfons

    2016-01-01

    This study identifies the capabilities needed by small-scale fishers to participate in Fishery Improvement Projects (FIPs) for yellowfin tuna in the Philippines. The current literature provides little empirical evidence on how different models, or types of FIPs, influence the participation of fishers in their programs and the degree to which FIPs are able to foster improvements in fishing practices. To address this literature gap, two different FIPs are empirically analysed, each with different approaches for fostering improvement. The first is the non-governmental organisation-led Partnership Programme Towards Sustainable Tuna, which adopts a bottom-up or development-oriented FIP model. The second is the private-led Artesmar FIP, which adopts a top-down or market-oriented FIP approach. The data were obtained from 350 fishers surveyed and were analysed using two separate models run in succession, taking into consideration full, partial, and non-participation in the two FIPs. The results demonstrate that different types of capabilities are required in order to participate in different FIP models. Individual firm capabilities are more important for fishers' participation in market-oriented FIPs, which use direct economic incentives to encourage improvements in fisher practices. Collective capabilities are more important for fishers to participate in development-oriented FIPs, which drive improvement by supporting fishers, fisher associations, and governments to move towards market requirements. PMID:27732607

  9. Fisher-Level Decision Making to Participate in Fisheries Improvement Projects (FIPs) for Yellowfin Tuna in the Philippines.

    PubMed

    Tolentino-Zondervan, Frazen; Berentsen, Paul; Bush, Simon R; Digal, Larry; Oude Lansink, Alfons

    2016-01-01

    This study identifies the capabilities needed by small-scale fishers to participate in Fishery Improvement Projects (FIPs) for yellowfin tuna in the Philippines. The current literature provides little empirical evidence on how different models, or types of FIPs, influence the participation of fishers in their programs and the degree to which FIPs are able to foster improvements in fishing practices. To address this literature gap, two different FIPs are empirically analysed, each with different approaches for fostering improvement. The first is the non-governmental organisation-led Partnership Programme Towards Sustainable Tuna, which adopts a bottom-up or development-oriented FIP model. The second is the private-led Artesmar FIP, which adopts a top-down or market-oriented FIP approach. The data were obtained from 350 fishers surveyed and were analysed using two separate models run in succession, taking into consideration full, partial, and non-participation in the two FIPs. The results demonstrate that different types of capabilities are required in order to participate in different FIP models. Individual firm capabilities are more important for fishers' participation in market-oriented FIPs, which use direct economic incentives to encourage improvements in fisher practices. Collective capabilities are more important for fishers to participate in development-oriented FIPs, which drive improvement by supporting fishers, fisher associations, and governments to move towards market requirements.

  10. Using metagenomic and metatranscriptomic observations to test a thermodynamic-based model of community metabolic expression over time and space

    NASA Astrophysics Data System (ADS)

    Vallino, J. J.; Huber, J. A.

    2016-02-01

    Marine biogeochemistry is orchestrated by a complex and dynamic community of microorganisms that attempt to maximize their own fecundity through a combination of competition and cooperation. At a systems level, the community can be described as a distributed metabolic network, where different species contribute their own unique set of metabolic capabilities. Our current project attempts to understand the governing principles that describe amplification or attenuation of metabolic pathways within the network through a combination of modeling and metagenomic, metatranscriptomic and biogeochemical observations. We will describe and present results from our thermodynamic-based model that determines optimal pathway expression from available resources based on the principle of maximum entropy production (MEP); that is, based on the hypothesis that non-equilibrium systems organize to maximize energy dissipation. The MEP model currently predicts metabolic pathway expression over time and one spatial dimension. Model predictions will be compared to biogeochemical observations and gene presence and expression from samples collected over time and space from a coastal meromictic basin (Siders Pond) located in Falmouth, MA, USA. Siders Pond's permanent stratification, caused by occasional seawater intrusion, results in steep chemoclines and redox gradients, which support both aerobic and anaerobic phototrophs as well as sulfur, Fe and Mn redox cycles. The diversity of metabolic capability and expression we have observed over depth makes it an ideal system to test our thermodynamic-based model.

  11. Progress in and prospects for fluvial flood modelling.

    PubMed

    Wheater, H S

    2002-07-15

    Recent floods in the UK have raised public and political awareness of flood risk. There is an increasing recognition that flood management and land-use planning are linked, and that decision-support modelling tools are required to address issues of climate and land-use change for integrated catchment management. In this paper, the scientific context for fluvial flood modelling is discussed, current modelling capability is considered and research challenges are identified. Priorities include (i) appropriate representation of spatial precipitation, including scenarios of climate change; (ii) development of a national capability for continuous hydrological simulation of ungauged catchments; (iii) improved scientific understanding of impacts of agricultural land-use and land-management change, and the development of new modelling approaches to represent those impacts; (iv) improved representation of urban flooding, at both local and catchment scale; (v) appropriate parametrizations for hydraulic simulation of in-channel and flood-plain flows, assimilating available ground observations and remotely sensed data; and (vi) a flexible decision-support modelling framework, incorporating developments in computing, data availability, data assimilation and uncertainty analysis.

  12. Background: Preflight Screening, In-flight Capabilities, and Postflight Testing

    NASA Technical Reports Server (NTRS)

    Gibson, Charles Robert; Duncan, James

    2009-01-01

    Recommendations for minimal in-flight capabilities: Retinal Imaging - provide in-flight capability for the visual monitoring of ocular health (specifically, imaging of the retina and optic nerve head) with the capability of downlinking video/still images. Tonometry - provide more accurate and reliable in-flight capability for measuring intraocular pressure. Ultrasound - explore capabilities of the current on-board system for monitoring ocular health. We currently have limited in-flight capabilities on board the International Space Station for performing an internal ocular health assessment: Visual Acuity, Direct Ophthalmoscope, Ultrasound, and Tonometry (Tonopen).

  13. Providing a parallel and distributed capability for JMASS using SPEEDES

    NASA Astrophysics Data System (ADS)

    Valinski, Maria; Driscoll, Jonathan; McGraw, Robert M.; Meyer, Bob

    2002-07-01

    The Joint Modeling And Simulation System (JMASS) is a Tri-Service simulation environment that supports engineering and engagement-level simulations. As JMASS is expanded to support other Tri-Service domains, the current set of modeling services must be expanded for High Performance Computing (HPC) applications by adding support for advanced time-management algorithms, parallel and distributed topologies, and high speed communications. By providing support for these services, JMASS can better address modeling domains requiring parallel, computationally intense calculations such as clutter, vulnerability and lethality calculations, and underwater-based scenarios. A risk reduction effort implementing some HPC services for JMASS using the SPEEDES (Synchronous Parallel Environment for Emulation and Discrete Event Simulation) Simulation Framework has recently concluded. As an artifact of the JMASS-SPEEDES integration, not only can HPC functionality be brought to the JMASS program through SPEEDES, but an additional HLA-based capability can be demonstrated that further addresses interoperability issues. The JMASS-SPEEDES integration provided a means of adding HLA capability to preexisting JMASS scenarios through an implementation of the standard JMASS port communication mechanism that allows players to communicate.

  14. An Automatic Medium to High Fidelity Low-Thrust Global Trajectory Toolchain; EMTG-GMAT

    NASA Technical Reports Server (NTRS)

    Beeson, Ryne T.; Englander, Jacob A.; Hughes, Steven P.; Schadegg, Maximillian

    2015-01-01

    Solving the global optimization, low-thrust, multiple-flyby interplanetary trajectory problem with high-fidelity dynamical models requires an unreasonable amount of computational resources. A better approach, and one that is demonstrated in this paper, is a multi-step process whereby the aforementioned problem is solved at lower fidelity and that solution is used as an initial guess for a higher-fidelity solver. The framework presented in this work uses two tools developed by NASA Goddard Space Flight Center: the Evolutionary Mission Trajectory Generator (EMTG) and the General Mission Analysis Tool (GMAT). EMTG is a medium to medium-high fidelity low-thrust interplanetary global optimization solver, which now has the capability to automatically generate GMAT script files for seeding a high-fidelity solution using GMAT's local optimization capabilities. A discussion of the dynamical models as well as thruster and power modeling for both EMTG and GMAT is given in this paper. Current capabilities are demonstrated with examples that highlight the toolchain's ability to efficiently solve the difficult low-thrust global optimization problem with little human intervention.

  15. Simulation for analysis and control of superplastic forming. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zacharia, T.; Aramayo, G.A.; Simunovic, S.

    1996-08-01

    A joint study was conducted by Oak Ridge National Laboratory (ORNL) and the Pacific Northwest Laboratory (PNL) for the U.S. Department of Energy-Lightweight Materials (DOE-LWM) Program. The purpose of the study was to assess and benchmark the current modeling capabilities with respect to accuracy of predictions and simulation time. Two simulation platforms were considered in this study: the LS-DYNA3D code installed on ORNL's high-performance computers and the finite element code MARC used at PNL. Both ORNL and PNL performed superplastic forming (SPF) analysis on a standard butter-tray geometry, which was defined by PNL, to better understand the capabilities of the respective models. The specific geometry was selected and formed at PNL, and the experimental results, such as forming time and thickness at specific locations, were provided for comparisons with numerical predictions. Furthermore, comparisons between the ORNL simulation results, using elasto-plastic analysis, and PNL's results, using rigid-plastic flow analysis, were performed.

  16. University Research in Support of TREAT Modeling and Simulation, FY 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeHart, Mark David

    Idaho National Laboratory is currently evolving the modeling and simulation (M&S) capability that will enable improved core operation as well as design and analysis of TREAT experiments. This M&S capability primarily uses MAMMOTH, a reactor physics application being developed under the Multi-physics Object Oriented Simulation Environment (MOOSE) framework. MAMMOTH allows the coupling of a number of other MOOSE-based applications. In support of this research, INL is working with four universities to explore advanced solution methods that will complement or augment capabilities in MAMMOTH. This report consists of a collection of year-end summaries of research from the universities performed in support of TREAT modeling and simulation. This research was led by Prof. Sedat Goluoglu at the University of Florida, Profs. Jim Morel and Jean Ragusa at Texas A&M University, Profs. Benoit Forget and Kord Smith at Massachusetts Institute of Technology, Prof. Leslie Kerby of Idaho State University and Prof. Barry Ganapol of University of Arizona. A significant number of students were supported at various levels through the projects and, for some, also as interns at INL.

  17. Mediterranea Forecasting System: a focus on wave-current coupling

    NASA Astrophysics Data System (ADS)

    Clementi, Emanuela; Delrosso, Damiano; Pistoia, Jenny; Drudi, Massimiliano; Fratianni, Claudia; Grandi, Alessandro; Pinardi, Nadia; Oddo, Paolo; Tonani, Marina

    2016-04-01

    The Mediterranean Forecasting System (MFS) is a numerical ocean prediction system that produces analyses, reanalyses and short term forecasts for the entire Mediterranean Sea and its adjacent Atlantic Ocean areas. MFS became operational in the late 90's, has been developed and continuously improved in the framework of a series of EU and nationally funded programs, and is now part of the Copernicus Marine Service. The MFS is composed of the hydrodynamic model NEMO (Nucleus for European Modelling of the Ocean) 2-way coupled with the third generation wave model WW3 (WaveWatchIII), implemented in the Mediterranean Sea at 1/16° horizontal resolution and forced by ECMWF atmospheric fields. The model solutions are corrected by the data assimilation system (a 3D variational scheme adapted to the oceanic assimilation problem) with a daily assimilation cycle, using a background error correlation matrix varying seasonally and in different sub-regions of the Mediterranean Sea. The focus of this work is to present the latest modelling system upgrades and the related improvements achieved. In order to evaluate the performance of the coupled system, a set of experiments has been built by coupling the wave and circulation models, which hourly exchange the following fields: the sea surface currents and air-sea temperature difference are transferred from the NEMO model to the WW3 model, modifying respectively the mean momentum transfer of waves and the wind-speed stability parameter, while the neutral drag coefficient computed by the WW3 model is passed to NEMO, which computes the turbulent component. In order to validate the modelling system, numerical results have been compared with in-situ and remote sensing data. This work suggests that a coupled model might be capable of a better description of wave-current interactions; in particular, feedback from the ocean to the waves might improve the prediction of wave characteristics, and it argues for proceeding toward a fully coupled modelling system in order to achieve stronger enhancements of the hydrodynamic fields.
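    The hourly two-way exchange described above involves only a handful of fields. The sketch below illustrates the general shape of such a coupling loop; the OceanModel/WaveModel stubs and their methods are placeholders, not the actual NEMO or WaveWatchIII interfaces.

```python
# Illustrative coupling loop for an hourly two-way wave-current exchange.
# The classes below are stand-ins, NOT the real NEMO/WW3 code.

class OceanModel:                       # stand-in for NEMO
    surface_currents = (0.10, 0.05)     # m/s, dummy values
    air_sea_temp_diff = -1.5            # K, dummy value
    def step(self, drag_coefficient):
        # would advance the hydrodynamics one hour using the wave-supplied
        # neutral drag coefficient in the turbulent surface-stress term
        pass

class WaveModel:                        # stand-in for WW3
    def step(self, currents, temp_diff):
        # would advance the wave spectrum one hour; currents modify the mean
        # momentum transfer, temp_diff adjusts the wind-speed stability term
        return 1.2e-3                   # dummy neutral drag coefficient

ocean, waves = OceanModel(), WaveModel()
drag = 1.0e-3                           # first-guess drag coefficient
for hour in range(24):                  # hourly exchanges over one day
    drag = waves.step(ocean.surface_currents, ocean.air_sea_temp_diff)
    ocean.step(drag)
```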

  18. Conserving analyst attention units: use of multi-agent software and CEP methods to assist information analysis

    NASA Astrophysics Data System (ADS)

    Rimland, Jeffrey; McNeese, Michael; Hall, David

    2013-05-01

    Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state-of-the-art still falls short of creating computer systems capable of autonomously making complex decisions and judgments in many domains where data are nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention on relevant information elements, based on both a priori knowledge of the analyst's goals and the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's Recognition-Primed Decision (RPD) model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high-performance, data-driven nature of CEP techniques provides a natural complement to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability as well as enabling collaborative context-aware reasoning in both human teams and hybrid human/software agent teams.

  19. A Feasibility Study on Numerical Modeling of Large-Scale Naval Fluid-Filled Structure: Contact-Impact Problems

    DTIC Science & Technology

    2011-02-01

    capabilities for airbags, sensors, and seatbelts have tailored the code for applications in the automotive industry. Currently the code contains...larger intervals. In certain contact scenarios where contacting parts are moving relative to each other in a rapid fashion, such as airbag deployment

  20. Performance Evaluation of the Honeywell GG1308 Miniature Ring Laser Gyroscope

    DTIC Science & Technology

    1993-01-01

    information. The final display line provides the current DSB configuration status. An external strobe was established between the Contraves motion...components and systems. The core of the facility is a Contraves -Goerz Model 57CD 2-axis motion simulator capable of highly precise position, rate and

  1. Thermal niche estimators and the capability of poor dispersal species to cope with climate change

    NASA Astrophysics Data System (ADS)

    Sánchez-Fernández, David; Rizzo, Valeria; Cieslak, Alexandra; Faille, Arnaud; Fresneda, Javier; Ribera, Ignacio

    2016-03-01

    For management strategies in the context of global warming, accurate predictions of species response are mandatory. However, to date most predictions are based on niche (bioclimatic) models that usually overlook biotic interactions, behavioral adjustments or adaptive evolution, and assume that species can disperse freely without constraints. The deep subterranean environment minimises these uncertainties, as it is simple, homogeneous and has constant environmental conditions. It is thus an ideal model system to study the effect of global change on species with poor dispersal capabilities. We assess the potential fate of a lineage of troglobitic beetles under global change predictions using different approaches to estimate their thermal niche: bioclimatic models, rates of thermal niche change estimated from a molecular phylogeny, and data from physiological studies. Using bioclimatic models, at most 60% of the species were predicted to have suitable conditions in 2080. Considering the rates of thermal niche change did not improve this prediction. However, physiological data suggest that subterranean species have a broad thermal tolerance, allowing them to withstand temperatures never experienced through their evolutionary history. These results stress the need for experimental approaches to assess the capability of poor dispersal species to cope with temperatures outside those they currently experience.

  2. RELAP-7 Closure Correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, Ling; Berry, R. A.; Martineau, R. C.

    The RELAP-7 code is the next generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The code is based on the INL's modern scientific software development framework, MOOSE (Multi-Physics Object Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's and TRACE's capabilities and extends their analysis capabilities for all reactor system simulation scenarios. The RELAP-7 code utilizes the well-posed 7-equation two-phase flow model for compressible two-phase flow. Closure models used in the TRACE code have been reviewed and selected to reflect the progress made during the past decades and to provide a basis for the closure correlations implemented in the RELAP-7 code. This document provides a summary of the closure correlations that are currently implemented in the RELAP-7 code. The closure correlations include sub-grid models that describe interactions between the fluids and the flow channel, and interactions between the two phases.

  3. Programming with models: modularity and abstraction provide powerful capabilities for systems biology

    PubMed Central

    Mallavarapu, Aneil; Thomson, Matthew; Ullian, Benjamin; Gunawardena, Jeremy

    2008-01-01

    Mathematical models are increasingly used to understand how phenotypes emerge from systems of molecular interactions. However, their current construction as monolithic sets of equations presents a fundamental barrier to progress. Overcoming this requires modularity, enabling sub-systems to be specified independently and combined incrementally, and abstraction, enabling generic properties of biological processes to be specified independently of specific instances. These, in turn, require models to be represented as programs rather than as datatypes. Programmable modularity and abstraction enable libraries of modules to be created, which can be instantiated and reused repeatedly in different contexts with different components. We have developed a computational infrastructure that accomplishes this. We show here why such capabilities are needed, what is required to implement them and what can be accomplished with them that could not be done previously. PMID:18647734
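    As a loose illustration of the general idea of programmable modularity (not the authors' actual modeling language or API), the sketch below builds a reusable reaction "module" and instantiates it twice with different components to form one combined kinetic model.

```python
# Generic sketch of modular model composition; not the paper's infrastructure.
from typing import Callable, Dict, List

State = Dict[str, float]
Module = Callable[[State], Dict[str, float]]

def mass_action_conversion(substrate: str, product: str, k: float) -> Module:
    """Reusable module template: substrate -> product at rate k*[substrate]."""
    def rates(state: State) -> Dict[str, float]:
        flux = k * state[substrate]
        return {substrate: -flux, product: +flux}
    return rates

def combine(modules: List[Module]) -> Module:
    """Compose independent modules by summing their rate contributions."""
    def rates(state: State) -> Dict[str, float]:
        total: Dict[str, float] = {}
        for module in modules:
            for species, rate in module(state).items():
                total[species] = total.get(species, 0.0) + rate
        return total
    return rates

# The same template is instantiated with different components and reused.
model = combine([mass_action_conversion("A", "B", k=0.5),
                 mass_action_conversion("B", "C", k=0.1)])
print(model({"A": 1.0, "B": 0.2, "C": 0.0}))
```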

  4. Microeconomics of yield learning and process control in semiconductor manufacturing

    NASA Astrophysics Data System (ADS)

    Monahan, Kevin M.

    2003-06-01

    Simple microeconomic models that directly link yield learning to profitability in semiconductor manufacturing have been rare or non-existent. In this work, we review such a model and provide links to inspection capability and cost. Using a small number of input parameters, we explain current yield management practices in 200mm factories. The model is then used to extrapolate requirements for 300mm factories, including the impact of technology transitions to 130nm design rules and below. We show that the dramatic increase in value per wafer at the 300mm transition becomes a driver for increasing metrology and inspection capability and sampling. These analyses correlate well with actual factory data and often identify millions of dollars in potential cost savings. We demonstrate this using the example of grating-based overlay metrology for the 65nm node.
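    The abstract does not reproduce the model's equations. Purely to illustrate the kind of yield-to-profit linkage described, a minimal sketch with a hypothetical linear cost structure and made-up numbers follows; it is not the paper's model.

```python
# Hypothetical sketch: link wafer yield to profit per wafer. The linear cost
# structure and all numbers are illustrative assumptions, not the paper's model.
def profit_per_wafer(yield_fraction, revenue_per_good_wafer,
                     wafer_cost, inspection_cost):
    """Profit = revenue on good wafers - processing cost - inspection cost."""
    return yield_fraction * revenue_per_good_wafer - wafer_cost - inspection_cost

# A one-point yield improvement is worth far more on a higher-value wafer,
# which is why larger wafers justify more metrology/inspection capability.
for value, label in [(3000.0, "200mm-class"), (12000.0, "300mm-class")]:
    base = profit_per_wafer(0.80, value, 1500.0, 50.0)
    improved = profit_per_wafer(0.81, value, 1500.0, 50.0)
    print(f"{label}: +${improved - base:.0f} per wafer per yield point")
```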

  5. Programming with models: modularity and abstraction provide powerful capabilities for systems biology.

    PubMed

    Mallavarapu, Aneil; Thomson, Matthew; Ullian, Benjamin; Gunawardena, Jeremy

    2009-03-06

    Mathematical models are increasingly used to understand how phenotypes emerge from systems of molecular interactions. However, their current construction as monolithic sets of equations presents a fundamental barrier to progress. Overcoming this requires modularity, enabling sub-systems to be specified independently and combined incrementally, and abstraction, enabling generic properties of biological processes to be specified independently of specific instances. These, in turn, require models to be represented as programs rather than as datatypes. Programmable modularity and abstraction enable libraries of modules to be created, which can be instantiated and reused repeatedly in different contexts with different components. We have developed a computational infrastructure that accomplishes this. We show here why such capabilities are needed, what is required to implement them and what can be accomplished with them that could not be done previously.

  6. NASA Stennis Space Center Integrated System Health Management Test Bed and Development Capabilities

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Holland, Randy; Coote, David

    2006-01-01

    Integrated System Health Management (ISHM) is a capability that focuses on determining the condition (health) of every element in a complex system (detect anomalies, diagnose causes, prognosis of future anomalies), and on providing data, information, and knowledge (DIaK), not just data, to control systems for safe and effective operation. This capability is currently provided by large teams of people, primarily on the ground, but needs to be embedded in on-board systems to a higher degree to enable NASA's new Exploration Mission (long-term travel and stay in space), while increasing safety and decreasing life cycle costs of spacecraft (vehicles; platforms; bases or outposts; and ground test, launch, and processing operations). The topics related to this capability include: 1) ISHM Related News Articles; 2) ISHM Vision For Exploration; 3) Layers Representing How ISHM is Currently Performed; 4) ISHM Testbeds & Prototypes at NASA SSC; 5) ISHM Functional Capability Level (FCL); 6) ISHM Functional Capability Level (FCL) and Technology Readiness Level (TRL); 7) Core Elements: Capabilities Needed; 8) Core Elements; 9) Open Systems Architecture for Condition-Based Maintenance (OSA-CBM); 10) Core Elements: Architecture, taxonomy, and ontology (ATO) for DIaK management; 11) Core Elements: ATO for DIaK Management; 12) ISHM Architecture Physical Implementation; 13) Core Elements: Standards; 14) Systematic Implementation; 15) Sketch of Work Phasing; 16) Interrelationship Between Traditional Avionics Systems, Time Critical ISHM and Advanced ISHM; 17) Testbeds and On-Board ISHM; 18) Testbed Requirements: RETS and ISS; 19) Sustainable Development and Validation Process; 20) Development of on-board ISHM; 21) Taxonomy/Ontology of Object Oriented Implementation; 22) ISHM Capability on the E1 Test Stand Hydraulic System; 23) Define Relationships to Embed Intelligence; 24) Intelligent Elements Physical and Virtual; 25) ISHM Testbeds and Prototypes at SSC Current Implementations; 26) Trailer-Mounted RETS; 27) Modeling and Simulation; 28) Summary ISHM Testbed Environments; 29) Data Mining - ARC; 30) Transitioning ISHM to Support NASA Missions; 31) Feature Detection Routines; 32) Sample Features Detected in SSC Test Stand Data; and 33) Health Assessment Database (DIaK Repository).

  7. Advanced Atmospheric Modeling for Emergency Response.

    NASA Astrophysics Data System (ADS)

    Fast, Jerome D.; O'Steen, B. Lance; Addis, Robert P.

    1995-03-01

    Atmospheric transport and diffusion models are an important part of emergency response systems for industrial facilities that have the potential to release significant quantities of toxic or radioactive material into the atmosphere. An advanced atmospheric transport and diffusion modeling system for emergency response and environmental applications, based upon a three-dimensional mesoscale model, has been developed for the U.S. Department of Energy's Savannah River Site so that complex, time-dependent flow fields not explicitly measured can be routinely simulated. To overcome some of the current computational demands of mesoscale models, two operational procedures for the advanced atmospheric transport and diffusion modeling system are described including 1) a semiprognostic calculation to produce high-resolution wind fields for local pollutant transport in the vicinity of the Savannah River Site and 2) a fully prognostic calculation to produce a regional wind field encompassing the southeastern United States for larger-scale pollutant problems. Local and regional observations and large-scale model output are used by the mesoscale model for the initial conditions, lateral boundary conditions, and four-dimensional data assimilation procedure. This paper describes the current status of the modeling system and presents two case studies demonstrating the capabilities of both modes of operation. While the results from the case studies shown in this paper are preliminary and certainly not definitive, they do suggest that the mesoscale model has the potential for improving the prognostic capabilities of atmospheric modeling for emergency response at the Savannah River Site. Long-term model evaluation will be required to determine under what conditions significant forecast errors exist.

  8. Progress in Unsteady Turbopump Flow Simulations Using Overset Grid Systems

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin C.; Chan, William; Kwak, Dochan

    2002-01-01

    This viewgraph presentation provides information on unsteady flow simulations for the Second Generation RLV (Reusable Launch Vehicle) baseline turbopump. Three impeller rotations were simulated using a 34.3-million-grid-point model. MPI/OpenMP hybrid parallelism and MLP shared memory parallelism have been implemented and benchmarked in INS3D, an incompressible Navier-Stokes solver. For RLV turbopump simulations, a speedup of more than 30 times has been obtained. Moving boundary capability is obtained by using the DCF module. Scripting capability from CAD geometry to solution has been developed. Unsteady flow simulations for the advanced consortium impeller/diffuser using a 39-million-grid-point model are currently underway; 1.2 impeller rotations have been completed. The fluid/structure coupling has been initiated.

  9. Aeroservoelasticity

    NASA Technical Reports Server (NTRS)

    Noll, Thomas E.

    1990-01-01

    The paper describes recent accomplishments and current research projects along four main thrusts in aeroservoelasticity at NASA Langley. One activity focuses on enhancing the modeling and analysis procedures to accurately predict aeroservoelastic interactions. Improvements to the minimum-state method of approximating unsteady aerodynamics are shown to provide precise low-order models for design and simulation tasks. Recent extensions in aerodynamic correction-factor methodology are also described. With respect to analysis procedures, the paper reviews novel enhancements to matched filter theory and random process theory for predicting the critical gust profile and the associated time-correlated gust loads for structural design considerations. Two research projects leading towards improved design capability are also summarized: (1) an integrated structure/control design capability and (2) procedures for obtaining low-order robust digital control laws for aeroelastic applications.

  10. NetMOD v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merchant, Bion J

    2015-12-22

    NetMOD is a tool to model the performance of global ground-based explosion monitoring systems. Version 2.0 of the software supports the simulation of seismic, hydroacoustic, and infrasonic detection capability. The tool provides a user interface to execute simulations based upon a hypothetical definition of the monitoring system configuration, geophysical properties of the Earth, and detection analysis criteria. NetMOD will be distributed with a project file defining the basic performance characteristics of the International Monitoring System (IMS), a network of sensors operated by the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Network modeling is needed to be able to assess and explain the potential effect of changes to the IMS, to prioritize station deployment and repair, and to assess the overall CTBTO monitoring capability currently and in the future. Currently the CTBTO uses version 1.0 of NetMOD, provided to them in early 2014. NetMOD will provide a modern tool that will cover all the simulations currently available and allow for the development of additional simulation capabilities of the IMS in the future. NetMOD simulates the performance of monitoring networks by estimating the relative amplitudes of the signal and noise measured at each of the stations within the network based upon known geophysical principles. From these signal and noise estimates, a probability of detection may be determined for each of the stations. The detection probabilities at each of the stations may then be combined to produce an estimate of the detection probability for the entire monitoring network.
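    A common way to combine independent per-station detection probabilities into a network-level estimate is the probability that at least k stations detect the event. The sketch below shows that generic combination only; NetMOD's exact rule may differ, and the numbers are invented.

```python
# Generic sketch: P(at least k of N independent stations detect an event).
# This is a common combination rule, not necessarily NetMOD's exact algorithm.
from itertools import combinations
from math import prod

def network_detection_probability(station_probs, k):
    """Sum the probabilities of every outcome with >= k detecting stations."""
    n = len(station_probs)
    total = 0.0
    for m in range(k, n + 1):
        for detecting in combinations(range(n), m):
            outcome = prod(station_probs[i] if i in detecting
                           else 1.0 - station_probs[i] for i in range(n))
            total += outcome
    return total

# Example: four stations with invented detection probabilities, and an event
# definition that requires detections at three or more stations.
print(network_detection_probability([0.9, 0.8, 0.7, 0.6], k=3))
```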

  11. Electrophysiological models of neural processing.

    PubMed

    Nelson, Mark E

    2011-01-01

    The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing, ranging from sub-millisecond temporal coding in the sound localization circuitry of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons.

  12. Resolving Orbital and Climate Keys of Earth and Extraterrestrial Environments with Dynamics (ROCKE-3D) 1.0: A General Circulation Model for Simulating the Climates of Rocky Planets

    NASA Astrophysics Data System (ADS)

    Way, M. J.; Aleinov, I.; Amundsen, David S.; Chandler, M. A.; Clune, T. L.; Del Genio, A. D.; Fujii, Y.; Kelley, M.; Kiang, N. Y.; Sohl, L.; Tsigaridis, K.

    2017-07-01

    Resolving Orbital and Climate Keys of Earth and Extraterrestrial Environments with Dynamics (ROCKE-3D) is a three-dimensional General Circulation Model (GCM) developed at the NASA Goddard Institute for Space Studies for the modeling of atmospheres of solar system and exoplanetary terrestrial planets. Its parent model, known as ModelE2, is used to simulate modern Earth and near-term paleo-Earth climates. ROCKE-3D is an ongoing effort to expand the capabilities of ModelE2 to handle a broader range of atmospheric conditions, including higher and lower atmospheric pressures, more diverse chemistries and compositions, larger and smaller planet radii and gravity, different rotation rates (from slower to more rapid than modern Earth’s, including synchronous rotation), diverse ocean and land distributions and topographies, and potential basic biosphere functions. The first aim of ROCKE-3D is to model planetary atmospheres on terrestrial worlds within the solar system such as paleo-Earth, modern and paleo-Mars, paleo-Venus, and Saturn’s moon Titan. By validating the model for a broad range of temperatures, pressures, and atmospheric constituents, we can then further expand its capabilities to those exoplanetary rocky worlds that have been discovered in the past, as well as those to be discovered in the future. We also discuss the current and near-future capabilities of ROCKE-3D as a community model for studying planetary and exoplanetary atmospheres.

  13. Resolving Orbital and Climate Keys of Earth and Extraterrestrial Environments with Dynamics (ROCKE-3D) 1.0: A General Circulation Model for Simulating the Climates of Rocky Planets

    NASA Technical Reports Server (NTRS)

    Way, M. J.; Aleinov, I.; Amundsen, David S.; Chandler, M. A.; Clune, T. L.; Del Genio, A.; Fujii, Y.; Kelley, M.; Kiang, N. Y.; Sohl, L.; hide

    2017-01-01

    Resolving Orbital and Climate Keys of Earth and Extraterrestrial Environments with Dynamics (ROCKE-3D) is a three-dimensional General Circulation Model (GCM) developed at the NASA Goddard Institute for Space Studies for the modeling of atmospheres of solar system and exoplanetary terrestrial planets. Its parent model, known as ModelE2, is used to simulate modern Earth and near-term paleo-Earth climates. ROCKE-3D is an ongoing effort to expand the capabilities of ModelE2 to handle a broader range of atmospheric conditions, including higher and lower atmospheric pressures, more diverse chemistries and compositions, larger and smaller planet radii and gravity, different rotation rates (from slower to more rapid than modern Earth's, including synchronous rotation), diverse ocean and land distributions and topographies, and potential basic biosphere functions. The first aim of ROCKE-3D is to model planetary atmospheres on terrestrial worlds within the solar system such as paleo-Earth, modern and paleo-Mars, paleo-Venus, and Saturn's moon Titan. By validating the model for a broad range of temperatures, pressures, and atmospheric constituents, we can then further expand its capabilities to those exoplanetary rocky worlds that have been discovered in the past, as well as those to be discovered in the future. We also discuss the current and near-future capabilities of ROCKE-3D as a community model for studying planetary and exoplanetary atmospheres.

  14. Extended MAGTF Operations - Tactical Chat

    DTIC Science & Technology

    2017-03-01

    vertical obstructions? Over what ranges might such a system maintain connectivity? E. ORGANIZATION OF THESIS This thesis is organized in the...likely future models of UAVs will likely be capable of providing a relay platform for a long-range communication system that can solve the shadowing...problem presented in this study. However, for reasons outlined in the remainder of this section, current models of UAVs do not appear to provide a

  15. Reliability of Next Generation Power Electronics Packaging Under Concurrent Vibration, Thermal and High Power Loads

    DTIC Science & Technology

    2008-02-01

    combined thermal g effect and initial current field. The model is implemented using Abaqus user element subroutine and verified against the experimental...Finite Element Formulation The proposed model is implemented with ABAQUS general purpose finite element program using thermal-displacement analysis...option. ABAQUS and other commercially available finite element codes do not have the capability to solve general electromigration problem directly. Thermal

  16. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  17. Prognostic and health management of active assets in nuclear power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Vivek; Lybeck, Nancy; Pham, Binh T.

    This study presents the development of diagnostic and prognostic capabilities for active assets in nuclear power plants (NPPs). The research was performed under the Advanced Instrumentation, Information, and Control Technologies Pathway of the Light Water Reactor Sustainability Program. Idaho National Laboratory researched, developed, implemented, and demonstrated diagnostic and prognostic models for generator step-up transformers (GSUs). The Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software developed by the Electric Power Research Institute was used to perform diagnosis and prognosis. As part of the research activity, Idaho National Laboratory implemented 22 GSU diagnostic models in the Asset Fault Signature Database and two well-established GSU prognostic models for the paper winding insulation in the Remaining Useful Life Database of the FW-PHM Suite. The implemented models, along with a simulated fault data stream, were used to evaluate the diagnostic and prognostic capabilities of the FW-PHM Suite. Knowledge of the operating condition of plant assets gained from diagnosis and prognosis is critical for the safe, productive, and economical long-term operation of the current fleet of NPPs. This research addresses some of the gaps in the current state of technology development and enables effective application of diagnostics and prognostics to nuclear plant assets.

  18. Prognostic and health management of active assets in nuclear power plants

    DOE PAGES

    Agarwal, Vivek; Lybeck, Nancy; Pham, Binh T.; ...

    2015-06-04

    This study presents the development of diagnostic and prognostic capabilities for active assets in nuclear power plants (NPPs). The research was performed under the Advanced Instrumentation, Information, and Control Technologies Pathway of the Light Water Reactor Sustainability Program. Idaho National Laboratory researched, developed, implemented, and demonstrated diagnostic and prognostic models for generator step-up transformers (GSUs). The Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software developed by the Electric Power Research Institute was used to perform diagnosis and prognosis. As part of the research activity, Idaho National Laboratory implemented 22 GSU diagnostic models in the Asset Fault Signature Database and two well-established GSU prognostic models for the paper winding insulation in the Remaining Useful Life Database of the FW-PHM Suite. The implemented models, along with a simulated fault data stream, were used to evaluate the diagnostic and prognostic capabilities of the FW-PHM Suite. Knowledge of the operating condition of plant assets gained from diagnosis and prognosis is critical for the safe, productive, and economical long-term operation of the current fleet of NPPs. This research addresses some of the gaps in the current state of technology development and enables effective application of diagnostics and prognostics to nuclear plant assets.

  19. Prediction of Acoustic Loads Generated by Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Perez, Linamaria; Allgood, Daniel C.

    2011-01-01

    NASA Stennis Space Center is one of the nation's premier facilities for conducting large-scale rocket engine testing. As liquid rocket engines vary in size, so do the acoustic loads that they produce. When these acoustic loads reach very high levels they may cause damage both to humans and to structures surrounding the testing area. To prevent such damage, prediction tools are used to estimate the spectral content and levels of the acoustics generated by the rocket engine plumes and to model their propagation through the surrounding atmosphere. Prior to the current work, two different acoustic prediction tools were being implemented at Stennis Space Center, each having its own advantages and disadvantages depending on the application. Therefore, a new prediction tool was created, using the NASA SP-8072 handbook as a guide, which would replicate the same prediction methods as the previous codes but eliminate the drawbacks the individual codes had. Aside from replicating the previous modeling capability in a single framework, additional modeling functions were added, thereby expanding the current modeling capability. To verify that the new code could reproduce the same predictions as the previous codes, two verification test cases were defined. These verification test cases also served as validation cases, as the predicted results were compared to actual test data.

  20. Status of NASA/Army rotorcraft research and development piloted flight simulation

    NASA Technical Reports Server (NTRS)

    Condon, Gregory W.; Gossett, Terrence D.

    1988-01-01

    The status of the major NASA/Army capabilities in piloted rotorcraft flight simulation is reviewed. The requirements for research and development piloted simulation are addressed, as well as the capabilities and technologies that are currently available or are being developed by NASA and the Army at Ames. The application of revolutionary advances (in visual scene, electronic cockpits, motion, and modelling of interactive mission environments and/or vehicle systems) to the NASA/Army facilities is also addressed. Particular attention is devoted to the major advances made in integrating these individual capabilities into a fully integrated simulation environment that was or is being applied to new rotorcraft mission requirements. The specific simulators discussed are the Vertical Motion Simulator and the Crew Station Research and Development Facility.

  1. Applying PCI in Combination Swivel Head Wrench

    NASA Astrophysics Data System (ADS)

    Chen, Tsang-Chiang; Yang, Chun-Ming; Hsu, Chang-Hsien; Hung, Hsiang-Wen

    2017-09-01

    Taiwan's traditional industries face competition in an era of globalization and environmental change, placing them under economic pressure; to remain sustainable, businesses must continually improve production efficiency and process quality in order to stabilize and grow their market share. This study uses process capability indices to monitor the quality of the dual-use ratchet wrench: the key functional characteristics of the wrench are identified, actual measurement data are collected, the process capability index Cpk is analysed, and a Process Capability Analysis Chart model is drawn. Finally, the study examines the current situation of this case, identifies shortcomings, and proposes improvement methods to raise overall quality and thereby strengthen the industry.
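    The Cpk index used in the study is the standard one-sided process capability measure. As a reminder of how it is computed, a short sketch with made-up specification limits and measurements (not the wrench data from the study) follows.

```python
# Standard Cpk computation from sample data; specification limits and
# measurements below are made-up illustrative values, not the study's data.
from statistics import mean, stdev

def cpk(measurements, lower_spec, upper_spec):
    """Cpk = min(USL - mu, mu - LSL) / (3 * sigma)."""
    mu = mean(measurements)
    sigma = stdev(measurements)
    return min(upper_spec - mu, mu - lower_spec) / (3.0 * sigma)

sample = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00]
print(f"Cpk = {cpk(sample, lower_spec=9.90, upper_spec=10.10):.2f}")
```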

  2. Enhancing seasonal climate prediction capacity for the Pacific countries

    NASA Astrophysics Data System (ADS)

    Kuleshov, Y.; Jones, D.; Hendon, H.; Charles, A.; Cottrill, A.; Lim, E.-P.; Langford, S.; de Wit, R.; Shelton, K.

    2012-04-01

    Seasonal and inter-annual climate variability is a major factor in determining the vulnerability of many Pacific Island Countries to climate change, and there is a need to improve weekly to seasonal range climate prediction capabilities beyond what is currently available from statistical models. Under the seasonal climate prediction component of the Australian Government's Pacific Adaptation Strategy Assistance Program (PASAP), we describe a comprehensive project to strengthen climate prediction capacity in the National Meteorological Services of 14 Pacific Island Countries and East Timor. The intent is particularly to reduce the vulnerability of current services to a changing climate, and to improve the overall level of information available to assist with managing climate variability. Statistical models cannot account for aspects of climate variability and change that are not represented in the historical record. In contrast, dynamical physics-based models implicitly include the effects of a changing climate whatever its character or cause and can predict outcomes not seen previously. The transition from a statistical to a dynamical prediction system provides more valuable and applicable climate information to a wide range of climate sensitive sectors throughout the countries of the Pacific region. In this project, we have developed seasonal climate outlooks which are based upon the current dynamical model POAMA (Predictive Ocean-Atmosphere Model for Australia) seasonal forecast system. At present, meteorological services of the Pacific Island Countries largely employ statistical models for seasonal outlooks. Outcomes of the PASAP project enhanced the seasonal prediction capabilities of the Pacific Island Countries, providing National Meteorological Services with an additional tool to analyse meteorological variables such as sea surface temperatures, air temperature, pressure and rainfall using POAMA outputs and to prepare more accurate seasonal climate outlooks.

  3. Understanding the Current 30-Year Shipbuilding Plan Through Three Models

    DTIC Science & Technology

    2014-12-01

    to what the Navy can ultimately build in ten years and beyond due to the fact that this plan is revised annually. In a political game this...the game and each one hoping for a different outcome. Though not currently possible with the FY2015 Long Range Plan it is possible to analyze the...to make a choice while the bounded rationality theory acknowledges the limits of human capabilities, knowledge, and capacity. Therefore, bounded

  4. Correlation study of theoretical and experimental results for spin tests of a 1/10 scale radio control model

    NASA Technical Reports Server (NTRS)

    Bihrle, W., Jr.

    1976-01-01

    A correlation study was conducted to determine the ability of current analytical spin prediction techniques to predict the flight motions of a current fighter airplane configuration during the spin entry, the developed spin, and the spin recovery motions. The airplane math model used aerodynamics measured on an exact replica of the flight test model using conventional static and forced-oscillation wind-tunnel test techniques and a recently developed rotation-balance test apparatus capable of measuring aerodynamics under steady spinning conditions. An attempt was made to predict the flight motions measured during stall/spin flight testing of an unpowered, radio-controlled model designed to be a 1/10 scale, dynamically-scaled model of a current fighter configuration. Comparison of the predicted and measured flight motions shows that while the post-stall and spin entry motions were not well predicted, the developed spinning motion (a steady flat spin) and the initial phases of the spin recovery motion were reasonably well predicted.

  5. Assessing Upper-Level Winds on Day-of-Launch

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III; Wheeler, Mark M.

    2012-01-01

    On the day-of-launch, the 45th Weather Squadron Launch Weather Officers (LWOs) monitor the upper-level winds for their launch customers, including NASA's Launch Services Program (LSP). During launch operations, the payload launch team sometimes asks the LWO if they expect the upper-level winds to change during the countdown, but the LWOs did not have the capability to quickly retrieve or display the upper-level observations and compare them to the numerical weather prediction model point forecasts. The LWOs requested the Applied Meteorology Unit (AMU) develop a capability in the form of a graphical user interface (GUI) that would allow them to plot upper-level wind speed and direction observations from the Kennedy Space Center Doppler Radar Wind Profilers and Cape Canaveral Air Force Station rawinsondes and then overlay model point forecast profiles on the observation profiles to assess the performance of these models and graphically display them to the launch team. The AMU developed an Excel-based capability for the LWOs to assess the model forecast upper-level winds and compare them to observations. They did so by creating a GUI in Excel that allows the LWOs to first initialize the models by comparing the 0-hour model forecasts to the observations and then to display model forecasts in 3-hour intervals from the current time through 12 hours.
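    The core comparison the GUI performs, overlaying a model point forecast on an observed wind profile and quantifying the difference at matching levels, can be illustrated generically. The sketch below is in Python rather than the AMU's Excel implementation, and all sample values are invented.

```python
# Loose sketch (Python, not the AMU's Excel tool): compare a model point
# forecast against observed upper-level winds at matching heights.
observed = {  # height (m) -> (speed m/s, direction deg); invented values
    3000: (18.0, 250.0), 6000: (32.0, 260.0), 9000: (45.0, 270.0)}
forecast = {
    3000: (16.5, 245.0), 6000: (35.0, 265.0), 9000: (43.0, 275.0)}

def direction_difference(d1, d2):
    """Smallest signed angular difference in degrees."""
    return (d1 - d2 + 180.0) % 360.0 - 180.0

for height in sorted(observed):
    obs_spd, obs_dir = observed[height]
    fc_spd, fc_dir = forecast[height]
    print(f"{height:5d} m: speed error {fc_spd - obs_spd:+.1f} m/s, "
          f"direction error {direction_difference(fc_dir, obs_dir):+.1f} deg")
```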

  6. Prospects for steady-state scenarios on JET

    NASA Astrophysics Data System (ADS)

    Litaudon, X.; Bizarro, J. P. S.; Challis, C. D.; Crisanti, F.; DeVries, P. C.; Lomas, P.; Rimini, F. G.; Tala, T. J. J.; Akers, R.; Andrew, Y.; Arnoux, G.; Artaud, J. F.; Baranov, Yu F.; Beurskens, M.; Brix, M.; Cesario, R.; DeLa Luna, E.; Fundamenski, W.; Giroud, C.; Hawkes, N. C.; Huber, A.; Joffrin, E.; Pitts, R. A.; Rachlew, E.; Reyes-Cortes, S. D. A.; Sharapov, S. E.; Zastrow, K. D.; Zimmermann, O.; JET EFDA contributors, the

    2007-09-01

    In the 2006 experimental campaign, progress has been made on JET to operate non-inductive scenarios at higher applied powers (31 MW) and density (n_l ~ 4 × 10^19 m^-3), with ITER-relevant safety factor (q_95 ~ 5) and plasma shaping, taking advantage of the new divertor capabilities. The extrapolation of the performance using transport modelling benchmarked on the experimental database indicates that the foreseen power upgrade (~45 MW) will allow the development of non-inductive scenarios where the bootstrap current is maximized together with the fusion yield and not, as in present-day experiments, at its expense. The tools for the long-term JET programme are the new ITER-like ICRH antenna (~15 MW), an upgrade of the NB power (35 MW/20 s or 17.5 MW/40 s), a new ITER-like first wall, a new pellet injector for edge localized mode control together with improved diagnostic and control capability. Operation with the new wall will set new constraints on non-inductive scenarios that are already addressed experimentally and in the modelling. The fusion performance and driven current that could be reached at high density and power have been estimated using either 0D or 1-1/2D validated transport models. In the high power case (45 MW), the calculations indicate the potential for the operational space of the non-inductive regime to be extended in terms of current (~2.5 MA) and density (n_l > 5 × 10^19 m^-3), with high β_N (β_N > 3.0) and a fraction of the bootstrap current within 60-70% at high toroidal field (~3.5 T).

  7. Is there something quantum-like about the human mental lexicon?

    PubMed Central

    Bruza, Peter; Kitto, Kirsty; Nelson, Douglas; McEvoy, Cathy

    2010-01-01

    Following an early claim by Nelson & McEvoy (35) suggesting that word associations can display ‘spooky action at a distance behaviour’, a serious investigation of the potentially quantum nature of such associations is currently underway. In this paper quantum theory is proposed as a framework suitable for modelling the human mental lexicon, specifically the results obtained from both intralist and extralist word association experiments. Some initial models exploring this hypothesis are discussed, and experiments capable of testing these models proposed. PMID:20224806

  8. TRIGRS Application for landslide susceptibility mapping

    NASA Astrophysics Data System (ADS)

    Sugiarti, K.; Sukristiyanti, S.

    2018-02-01

    Research on landslide susceptibility has been carried out using several different methods. TRIGRS is a program for modeling landslide susceptibility that considers pore-water pressure changes due to rainfall infiltration. This paper aims to present the current state of the art in the development and application of TRIGRS. Some limitations of TRIGRS, some developments made to improve its modeling capability, and some examples of the application of different versions of it to model the effect of rainfall variation on landslide susceptibility are reviewed and discussed.

  9. Web-based applications for building, managing and analysing kinetic models of biological systems.

    PubMed

    Lee, Dong-Yup; Saha, Rajib; Yusufi, Faraaz Noor Khan; Park, Wonjun; Karimi, Iftekhar A

    2009-01-01

    Mathematical modelling and computational analysis play an essential role in improving our capability to elucidate the functions and characteristics of complex biological systems such as metabolic, regulatory and cell signalling pathways. The modelling and concomitant simulation render it possible to predict the cellular behaviour of systems under various genetically and/or environmentally perturbed conditions. This motivates systems biologists/bioengineers/bioinformaticians to develop new tools and applications, allowing non-experts to easily conduct such modelling and analysis. However, among a multitude of systems biology tools developed to date, only a handful of projects have adopted a web-based approach to kinetic modelling. In this report, we evaluate the capabilities and characteristics of current web-based tools in systems biology and identify desirable features, limitations and bottlenecks for further improvements in terms of usability and functionality. A short discussion on software architecture issues involved in web-based applications and the approaches taken by existing tools is included for those interested in developing their own simulation applications.

  10. Firing patterns in the adaptive exponential integrate-and-fire model.

    PubMed

    Naud, Richard; Marcille, Nicolas; Clopath, Claudia; Gerstner, Wulfram

    2008-11-01

    For simulations of large spiking neuron networks, an accurate, simple and versatile single-neuron modeling framework is required. Here we explore the versatility of a simple two-equation model: the adaptive exponential integrate-and-fire neuron. We show that this model generates multiple firing patterns depending on the choice of parameter values, and present a phase diagram describing the transition from one firing type to another. We give an analytical criterion to distinguish between continuous adaptation, initial bursting, regular bursting and two types of tonic spiking. Also, we report that the deterministic model is capable of producing irregular spiking when stimulated with constant current, indicating low-dimensional chaos. Lastly, the simple model is fitted to recordings of real cortical neurons under step-current stimulation. The results provide support for the suitability of simple models such as the adaptive exponential integrate-and-fire neuron for large network simulations.
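    The two AdEx equations are standard: C dV/dt = -g_L(V - E_L) + g_L*Delta_T*exp((V - V_T)/Delta_T) - w + I and tau_w dw/dt = a(V - E_L) - w, with V reset and w incremented by b at each spike. A minimal forward-Euler sketch is given below; the parameter values are generic illustrative choices (roughly a tonic-spiking regime), not the fits reported in the paper.

```python
# Minimal forward-Euler sketch of the adaptive exponential integrate-and-fire
# (AdEx) neuron under a constant current step. Parameter values are generic
# illustrative choices, not the paper's fitted values.
import math

# Units: capacitance pF, conductance nS, voltage mV, time ms, current pA.
C, g_L, E_L = 281.0, 30.0, -70.6
V_T, Delta_T = -50.4, 2.0
tau_w, a, b = 144.0, 4.0, 80.5
V_reset, V_cut = -70.6, 0.0            # reset potential and spike cutoff

def simulate(I_ext=1000.0, t_max=500.0, dt=0.05):
    """Return spike times (ms) for a constant input current I_ext (pA)."""
    V, w, spikes = E_L, 0.0, []
    for step in range(int(t_max / dt)):
        exp_term = g_L * Delta_T * math.exp((V - V_T) / Delta_T)
        dV = (-g_L * (V - E_L) + exp_term - w + I_ext) / C
        dw = (a * (V - E_L) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= V_cut:                 # spike: reset voltage, bump adaptation
            spikes.append(step * dt)
            V = V_reset
            w += b
    return spikes

print("spike times (ms):", [round(t, 1) for t in simulate()])
```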

  11. C3 System Performance Simulation and User Manual. Getting Started: Guidelines for Users

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This document is a User's Manual describing the C3 Simulation capabilities. The subject work was designed to simulate the communications involved in the flight of a Remotely Operated Aircraft (ROA) using the Opnet software. Opnet provides a comprehensive development environment supporting the modeling of communication networks and distributed systems. It has tools for model design, simulation, data collection, and data analysis. Opnet models are hierarchical -- consisting of a project which contains node models which in turn contain process models. Nodes can be fixed, mobile, or satellite. Links between nodes can be physical or wireless. Communications are packet based. The model is very generic in its current form. Attributes such as frequency and bandwidth can easily be modified to better reflect a specific platform. The model is not fully developed at this stage -- there are still more enhancements to be added. Current issues are documented throughout this guide.

  12. Implementation of a Goal-Based Systems Engineering Process Using the Systems Modeling Language (SysML)

    NASA Technical Reports Server (NTRS)

    Patterson, Jonathan D.; Breckenridge, Jonathan T.; Johnson, Stephen B.

    2013-01-01

    Building upon the purpose, theoretical approach, and use of a Goal-Function Tree (GFT) being presented by Dr. Stephen B. Johnson, described in a related Infotech 2013 ISHM abstract titled "Goal-Function Tree Modeling for Systems Engineering and Fault Management", this paper will describe the core framework used to implement the GFT-based systems engineering process using the Systems Modeling Language (SysML). These two papers are ideally accepted and presented together in the same Infotech session. Statement of problem: SysML, as a tool, is currently not capable of implementing the theoretical approach described within the "Goal-Function Tree Modeling for Systems Engineering and Fault Management" paper cited above. More generally, SysML's current capabilities to model functional decompositions in the rigorous manner required in the GFT approach are limited. The GFT is a new Model-Based Systems Engineering (MBSE) approach to the development of goals, requirements, and functions, and their linkage to design. As SysML is a growing standard for systems engineering, it is important to develop methods to implement GFT in SysML. Proposed Method of Solution: Many of the central concepts of the SysML language are needed to implement a GFT for large complex systems. In the implementation of those central concepts, the following will be described in detail: changes to the nominal SysML process, model view definitions and examples, diagram definitions and examples, and detailed SysML construct and stereotype definitions.

  13. Current Capabilities at SNL for the Integration of Small Modular Reactors onto Smart Microgrids Using Sandia's Smart Microgrid Technology High Performance Computing and Advanced Manufacturing.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez, Salvador B.

    Smart grids are a crucial component for enabling the nation’s future energy needs, as part of a modernization effort led by the Department of Energy. Smart grids and smart microgrids are being considered in niche applications, and as part of a comprehensive energy strategy to help manage the nation’s growing energy demands, for critical infrastructures, military installations, small rural communities, and large populations with limited water supplies. As part of a far-reaching strategic initiative, Sandia National Laboratories (SNL) presents herein a unique, three-pronged approach to integrate small modular reactors (SMRs) into microgrids, with the goal of providing economically-competitive, reliable, and secure energy to meet the nation’s needs. SNL’s triad methodology involves an innovative blend of smart microgrid technology, high performance computing (HPC), and advanced manufacturing (AM). In this report, Sandia’s current capabilities in those areas are summarized, as well as paths forward that will enable DOE to achieve its energy goals. In the area of smart grid/microgrid technology, Sandia’s current computational capabilities can model the entire grid, including temporal aspects and cyber security issues. Our tools include system development, integration, testing and evaluation, monitoring, and sustainment.

  14. Operating capability and current status of the reactivated NASA Lewis Research Center Hypersonic Tunnel Facility

    NASA Technical Reports Server (NTRS)

    Thomas, Scott R.; Trefny, Charles J.; Pack, William D.

    1995-01-01

    The NASA Lewis Research Center's Hypersonic Tunnel Facility (HTF) is a free-jet, blowdown propulsion test facility that can simulate up to Mach-7 flight conditions with true air composition. Mach-5, -6, and -7 nozzles, each with a 42 inch exit diameter, are available. Previously obtained calibration data indicate that the test flow uniformity of the HTF is good. The facility, without modifications, can accommodate models approximately 10 feet long. The test gas is heated using a graphite core induction heater that generates a nonvitiated flow. The combination of clean-air, large-scale, and Mach-7 capabilities is unique to the HTF and enables an accurate propulsion performance determination. The reactivation of the HTF, in progress since 1990, includes refurbishing the graphite heater, the steam generation plant, the gaseous oxygen system, and all control systems. All systems were checked out and recertified, and environmental systems were upgraded to meet current standards. The data systems were also upgraded to current standards and a communication link with NASA-wide computers was added. In May 1994, the reactivation was complete, and an integrated systems test was conducted to verify facility operability. This paper describes the reactivation, the facility status, the operating capabilities, and specific applications of the HTF.

  15. Resource-Based Capability on Development Knowledge Management Capabilities of Coastal Community

    NASA Astrophysics Data System (ADS)

    Teniwut, Roberto M. K.; Hasyim, Cawalinya L.; Teniwut, Wellem A.

    2017-10-01

    Building sustainable knowledge management capabilities in a coastal area faces a whole new set of challenges, since many intangible factors are involved, from openness to new knowledge and the access and ability to use the latest technology, to the various forms of local wisdom still in place. The aim of this study was to identify and analyze the resource-based condition of the coastal community in this area, so as to obtain an empirical picture of the tangible and intangible infrastructure for developing knowledge management capability in the coastal community of Southeast Maluku, Indonesia. We used qualitative and quantitative analysis, collecting data through in-depth interviews and questionnaires, with multiple linear regression as our analysis method. The results provide information on the current state of resource-based capability of the coastal community in Southeast Maluku for building a sustainability model of knowledge management capabilities, especially for the utilization of marine and fisheries resources. The findings provide empirical information that government, NGOs and research institutions can use to guide their policies and programs for developing coastal community regions.
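
    As a rough illustration of the analysis method named above, the sketch below fits a multiple linear regression by ordinary least squares with NumPy. The three predictors, the sample size, and the data are hypothetical stand-ins for survey items; they are not the study's variables or results.

    ```python
    import numpy as np

    # Hypothetical example: regress a knowledge-management-capability score on
    # three resource-based predictors (e.g., technology access, openness to new
    # knowledge, local wisdom). Data are simulated for illustration only.
    rng = np.random.default_rng(0)
    n = 120
    X = rng.normal(size=(n, 3))                  # predictor scores
    y = 2.0 + X @ np.array([0.6, 0.3, -0.2]) + rng.normal(scale=0.5, size=n)

    X_design = np.column_stack([np.ones(n), X])  # prepend an intercept column
    coef, _, _, _ = np.linalg.lstsq(X_design, y, rcond=None)
    print("intercept and coefficients:", np.round(coef, 3))
    ```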

  16. Filling the Gaps: The Synergistic Application of Satellite Data for the Volcanic Ash Threat to Aviation

    NASA Technical Reports Server (NTRS)

    Murray, John; Vernier, Jean-Paul; Fairlie, T. Duncan; Pavolonis, Michael; Krotkov, Nickolay A.; Lindsay, Francis; Haynes, John

    2013-01-01

    Although significant progress has been made in recent years, estimating volcanic ash concentration for the full extent of the airspace affected by volcanic ash remains a challenge. No single satellite, airborne, or ground observing system currently exists which can sufficiently inform dispersion models to provide the degree of accuracy required to use them with a high degree of confidence for routing aircraft in and near volcanic ash. The detection and characterization of volcanic ash in the atmosphere may be substantially improved by integrating a wider array of observing systems with advancements in trajectory and dispersion modeling. The qualitative aspect of this effort has advanced significantly in the past decade due to the increase in highly complementary observational and model data now available. Satellite observations, especially when coupled with trajectory and dispersion models, can provide a very accurate picture of the 3-dimensional location of ash clouds. Accurate estimation of the mass loading at various locations throughout the plume, though improving, remains elusive. This paper examines the capabilities of various satellite observation systems and postulates that model-based volcanic ash concentration maps and forecasts might be significantly improved if the various extant satellite capabilities are used together with independent, accurate mass loading data from other observing systems available to calibrate (tune) ash concentration retrievals from the satellite systems.

  17. Life modeling of thermal barrier coatings for aircraft gas turbine engines

    NASA Technical Reports Server (NTRS)

    Miller, R. A.

    1989-01-01

    Thermal barrier coating life models developed under the NASA Lewis Research Center's Hot Section Technology (HOST) Program are summarized. An initial laboratory model and three design-capable models are discussed. Current understanding of coating failure mechanisms is also summarized. The materials and structural aspects of thermal barrier coatings have been successfully integrated under the HOST program to produce models which may now or in the near future be used in design. Efforts on this program continue at Pratt and Whitney Aircraft, where their model is being extended to the life prediction of physical vapor deposited thermal barrier coatings.

  18. Identity in agent-based models : modeling dynamic multiscale social processes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozik, J.; Sallach, D. L.; Macal, C. M.

    Identity-related issues play central roles in many current events, including those involving factional politics, sectarianism, and tribal conflicts. Two popular models from the computational-social-science (CSS) literature - the Threat Anticipation Program and SharedID models - incorporate notions of identity (individual and collective) and processes of identity formation. A multiscale conceptual framework that extends some ideas presented in these models and draws other capabilities from the broader CSS literature is useful in modeling the formation of political identities. The dynamic, multiscale processes that constitute and transform social identities can be mapped to expressive structures of the framework.

  19. Strategies and Innovative Approaches for the Future of Space Weather Forecasting

    NASA Astrophysics Data System (ADS)

    Hoeksema, J. T.

    2012-12-01

    The real and potential impacts of space weather have been well documented, yet neither the required research and operations programs nor the data, modeling, and analysis infrastructure necessary to develop and sustain a reliable space weather forecasting capability for society is in place. The recently published decadal survey "Solar and Space Physics: A Science for a Technological Society" presents a vision for the coming decade and calls for a renewed national commitment to a comprehensive program in space weather and climatology. New resources are imperative. Particularly in the current fiscal environment, implementing a responsible strategy to address these needs will require broad participation across agencies and innovative approaches to make the most of existing resources, capitalize on current knowledge, span gaps in capabilities and observations, and focus resources on overcoming immediate roadblocks.

  20. Comparison of nozzle and afterbody surface pressures from wind tunnel and flight test of the YF-17 aircraft

    NASA Technical Reports Server (NTRS)

    Lucas, E. J.; Fanning, A. E.; Steers, L. I.

    1978-01-01

    Results are reported from the initial phase of an effort to provide an adequate technical capability to accurately predict the full scale, flight vehicle, nozzle-afterbody performance of future aircraft based on partial scale, wind tunnel testing. The primary emphasis of this initial effort is to assess the current capability and identify the cause of limitations on this capability. A direct comparison of surface pressure data is made between the results from a 0.1-scale model wind tunnel investigation and a full-scale flight test program to evaluate the current subscale testing techniques. These data were acquired at Mach numbers 0.6, 0.8, 0.9, 1.2, and 1.5 on four nozzle configurations at various vehicle pitch attitudes. Support system interference increments were also documented during the wind tunnel investigation. In general, the results presented indicate a good agreement in trend and level of the surface pressures when corrective increments are applied for known effects and surface differences between the two articles under investigation.

  1. A solid phase enzyme-linked immunosorbent assay for the antigenic detection of Legionella pneumophila (serogroup 1): A complement for the space station diagnostic capability

    NASA Technical Reports Server (NTRS)

    Hejtmancik, Kelly E.

    1987-01-01

    It is necessary that an adequate microbiology capability be provided as part of the Health Maintenance Facility (HMF) to support expected microbial disease events and environmental monitoring during long periods of space flight. The application of morphological and biochemical studies to confirm the presence of certain bacterial and fungal disease agents is currently available and under consideration. This confirmation would be facilitated through employment of serological methods to aid in the identification of bacterial, fungal, and viral agents. A number of serological approaches are currently being considered, including the use of Enzyme Linked Immunosorbent Assay (ELISA) technology, which could be utilized under microgravity conditions. A solid phase, membrane supported ELISA for the detection of Legionella pneumophila, an expected disease agent, was developed to show a potential model system that would meet the HMF requirements and specifications for the future space station. These studies demonstrate the capability of membrane supported ELISA systems for identification of expected microbial disease agents as part of the HMF.

  2. NARSTO critical review of photochemical models and modeling

    NASA Astrophysics Data System (ADS)

    Russell, Armistead; Dennis, Robin

    Photochemical air quality models play a central role both in scientific investigation of how pollutants evolve in the atmosphere and in developing policies to manage air quality. In the past 30 years, these models have evolved from rather crude representations of the physics and chemistry impacting trace species to their current state: comprehensive, but not complete. The evolution has included advancements not only in the level of process descriptions, but also in the computational implementation, including numerical methods. As part of the NARSTO Critical Reviews, this article discusses the current strengths and weaknesses of air quality models and the modeling process. Current Eulerian models are found to represent well the primary processes impacting the evolution of trace species in most cases, though some exceptions may exist. For example, sub-grid-scale processes, such as concentrated power plant plumes, are treated only approximately. It is not apparent how much such approximations affect their results and the policies based upon those results. A significant weakness has been in how investigators have addressed, and communicated, such uncertainties. Studies find that major uncertainties are due to model inputs, e.g., emissions and meteorology, more so than the model itself. One of the primary weaknesses identified is in the modeling process, not the models. Evaluation has been limited, in large part due to data constraints: seldom is there ample observational data to conduct a detailed model intercomparison using consistent data (e.g., the same emissions and meteorology). Further model advancement, and development of greater confidence in the use of models, is hampered by the lack of thorough evaluation and intercomparisons. Model advances are seen in the use of new tools for extending the interpretation of model results (e.g., process and sensitivity analysis), modeling systems that facilitate their use, and extension of model capabilities (e.g., aerosol dynamics capabilities and sub-grid-scale representations). Another possible direction is the development and widespread use of a community model acting as a platform for multiple groups and agencies to collaborate and progress more rapidly.

  3. Verification of a Multiphysics Toolkit against the Magnetized Target Fusion Concept

    NASA Technical Reports Server (NTRS)

    Thomas, Scott; Perrell, Eric; Liron, Caroline; Chiroux, Robert; Cassibry, Jason; Adams, Robert B.

    2005-01-01

    In the spring of 2004 the Advanced Concepts team at MSFC embarked on an ambitious project to develop a suite of modeling routines that would interact with one another. The tools would each numerically model a portion of any advanced propulsion system. The tools were divided by physics categories, hence the name multiphysics toolset. Currently most of the anticipated modeling tools have been created and integrated. Results are given in this paper for both a quarter nozzle with chemically reacting flow and the interaction of two plasma jets representative of a Magnetized Target Fusion device. The results have not been calibrated against real data as of yet, but this paper demonstrates the current capability of the multiphysics tool and planned future enhancements.

  4. Studies and analyses of the space shuttle main engine

    NASA Technical Reports Server (NTRS)

    Tischer, Alan E.; Glover, R. C.

    1987-01-01

    The primary objectives were to: evaluate ways to maximize the information yield from the current Space Shuttle Main Engine (SSME) condition monitoring sensors, identify additional sensors or monitoring capabilities which would significantly improve SSME data, and provide continuing support of the Main Engine Cost/Operations (MECO) model. In the area of SSME condition monitoring, the principal tasks were a review of selected SSME failure data, a general survey of condition monitoring, and an evaluation of the current engine monitoring system. A computerized data base was developed to assist in modeling engine failure information propagations. Each of the above items is discussed in detail. Also included is a brief discussion of the activities conducted in support of the MECO model.

  5. An Assessment of Current Fan Noise Prediction Capability

    NASA Technical Reports Server (NTRS)

    Envia, Edmane; Woodward, Richard P.; Elliott, David M.; Fite, E. Brian; Hughes, Christopher E.; Podboy, Gary G.; Sutliff, Daniel L.

    2008-01-01

    In this paper, the results of an extensive assessment exercise carried out to establish the current state of the art for predicting fan noise at NASA are presented. Representative codes in the empirical, analytical, and computational categories were exercised and assessed against a set of benchmark acoustic data obtained from wind tunnel tests of three model scale fans. The chosen codes were ANOPP, representing an empirical capability, RSI, representing an analytical capability, and LINFLUX, representing a computational aeroacoustics capability. The selected benchmark fans cover a wide range of fan pressure ratios and fan tip speeds, and are representative of modern turbofan engine designs. The assessment results indicate that the ANOPP code can predict fan noise spectrum to within 4 dB of the measurement uncertainty band on a third-octave basis for the low and moderate tip speed fans except at extreme aft emission angles. The RSI code can predict fan broadband noise spectrum to within 1.5 dB of the experimental uncertainty band provided the rotor-only contribution is taken into account. The LINFLUX code can predict interaction tone power levels to within experimental uncertainties at low and moderate fan tip speeds, but could deviate by as much as 6.5 dB outside the experimental uncertainty band at the highest tip speeds in some cases.

  6. Wide-Area Soil Moisture Estimation Using the Propagation of Lightning Generated Low-Frequency Electromagnetic Signals 1977

    USDA-ARS?s Scientific Manuscript database

    Land surface moisture measurements are central to our understanding of the earth’s water system, and are needed to produce accurate model-based weather/climate predictions. Currently, there exists no in-situ network capable of estimating wide-area soil moisture. In this paper, we explore an alterna...

  7. Multi-site evaluation of APEX for crop and grazing land in the Heartland region of the US

    USDA-ARS?s Scientific Manuscript database

    The Agricultural and Policy Environmental Extender (APEX) is capable of estimating edge-of-field water, nutrient, and sediment transport and is used to assess the environmental impacts of management practices. Current practice is to fully calibrate the model for each site simulation, which requires ...

  8. Academic Airframe Icing Perspective

    NASA Technical Reports Server (NTRS)

    Bragg, Mike; Rothmayer, Alric; Thompson, David

    2009-01-01

    2-D ice accretion and aerodynamics are reasonably well understood for engineering applications. To significantly improve current capabilities, the 3-D problem must be understood: (a) important ice accretion physics and modeling are not well understood in 3-D, and (b) the aerodynamics are unsteady and 3-D, especially near stall. Larger systems issues are also important and require a multidisciplinary team approach.

  9. 9th Annual Science and Engineering Technology Conference

    DTIC Science & Technology

    2008-04-17

    Briefing excerpt (fragments): disks; composite technology; titanium aluminides (processing, microstructure, properties); curve generator; go-forward plan for integrated materials and process models [...]; initiatives; current DPA/T3s: atomic layer deposition hermetic coatings ([...] domestic ALD for electronic components; transition to fabrication process [...]); production windows estimated; process capability fully established; production specifications in place; supply chain established; all necessary property [...]

  10. Application of historical mobility testing to sensor-based robotic performance

    NASA Astrophysics Data System (ADS)

    Willoughby, William E.; Jones, Randolph A.; Mason, George L.; Shoop, Sally A.; Lever, James H.

    2006-05-01

    The USA Engineer Research and Development Center (ERDC) has conducted on-/off-road experimental field testing with full-sized and scale-model military vehicles for more than fifty years. Some 4000 acres of local terrain are available for tailored field evaluations or verification/validation of future robotic designs in a variety of climatic regimes. Field testing and data collection procedures, as well as techniques for quantifying terrain in engineering terms, have been developed and refined into algorithms and models for predicting vehicle-terrain interactions and resulting forces or speeds of military-sized vehicles. Based on recent experiments with Matilda, Talon, and Pacbot, these predictive capabilities appear to be relevant to most robotic systems currently in development. Utilization of current testing capabilities with sensor-based vehicle drivers, or use of the procedures for terrain quantification from sensor data, would immediately apply some fifty years of historical knowledge to the development, refinement, and implementation of future robotic systems. Additionally, translation of sensor-collected terrain data into engineering terms would allow assessment of robotic performance prior to deployment of the actual system and ensure maximum system performance in the theater of operation.

  11. Characteristic investigation and control of a modular multilevel converter-based HVDC system under single-line-to-ground fault conditions

    DOE PAGES

    Shi, Xiaojie; Wang, Zhiqiang; Liu, Bo; ...

    2014-05-16

    This paper presents the analysis and control of a modular multilevel converter (MMC)-based HVDC transmission system under three possible single-line-to-ground fault conditions, with special focus on the investigation of their different fault characteristics. Considering positive-, negative-, and zero-sequence components in both arm voltages and currents, the generalized instantaneous power of a phase unit is derived theoretically according to the equivalent circuit model of the MMC under unbalanced conditions. Based on this model, a novel double-line-frequency dc-voltage ripple suppression control is proposed. This controller, together with the negative- and zero-sequence current control, could enhance the overall fault-tolerant capability of the HVDC system without additional cost. To further improve the fault-tolerant capability, the operation performance of the HVDC system with and without single-phase switching is discussed and compared in detail. Lastly, simulation results from a three-phase MMC-HVDC system generated with MATLAB/Simulink are provided to support the theoretical analysis and proposed control schemes.
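
    The fault analysis above rests on decomposing unbalanced three-phase quantities into positive-, negative-, and zero-sequence components. The snippet below is a generic illustration of that decomposition (the Fortescue transform) applied to hypothetical per-unit phasors during a single-line-to-ground sag; it is not the paper's controller or converter model.

    ```python
    import numpy as np

    # Fortescue transform: a = 120-degree rotation operator; rows of A extract the
    # zero-, positive-, and negative-sequence phasors from the abc phasors.
    a = np.exp(2j * np.pi / 3)
    A = (1 / 3) * np.array([[1, 1,    1   ],
                            [1, a,    a**2],
                            [1, a**2, a   ]])

    # Hypothetical unbalanced condition: phase a sags to 0.4 pu, b and c nominal.
    V_abc = np.array([0.4,
                      1.0 * np.exp(-2j * np.pi / 3),
                      1.0 * np.exp(+2j * np.pi / 3)])

    V0, V1, V2 = A @ V_abc
    for name, v in zip(("zero", "positive", "negative"), (V0, V1, V2)):
        print(f"{name:8s} sequence: |V| = {abs(v):.3f} pu, "
              f"angle = {np.degrees(np.angle(v)):7.2f} deg")
    ```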

  12. A Generalized Decision Framework Using Multi-objective Optimization for Water Resources Planning

    NASA Astrophysics Data System (ADS)

    Basdekas, L.; Stewart, N.; Triana, E.

    2013-12-01

    Colorado Springs Utilities (CSU) is currently engaged in an Integrated Water Resource Plan (IWRP) to address the complex planning scenarios, across multiple time scales, currently faced by CSU. The modeling framework developed for the IWRP uses a flexible data-centered Decision Support System (DSS) with a MODSIM-based modeling system to represent the operation of the current CSU raw water system, coupled with a state-of-the-art multi-objective optimization algorithm. Three basic components are required for the framework, which can be implemented for planning horizons ranging from seasonal to interdecadal. First, a water resources system model is required that is capable of reasonable system simulation to resolve performance metrics at the appropriate temporal and spatial scales of interest. The system model should be an existing simulation model, or one developed during the planning process with stakeholders, so that 'buy-in' has already been achieved. Second, a hydrologic scenario tool (or tools) capable of generating a range of plausible inflows for the planning period of interest is required. This may include paleo-informed or climate-change-informed sequences. Third, a multi-objective optimization model that can be wrapped around the system simulation model is required. The new generation of multi-objective optimization models does not require parameterization, which greatly reduces problem complexity. Bridging the gap between research and practice, we use a case study from CSU's planning process to demonstrate this framework with specific competing water management objectives. Careful formulation of objective functions, choice of decision variables, and system constraints will be discussed. Rather than treating results as theoretically Pareto optimal in a planning process, we use the powerful multi-objective optimization models as tools to more efficiently and effectively move out of the inferior decision space. The use of this framework will help CSU evaluate tradeoffs in a continually changing world.
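
    A central step in any such multi-objective framework is separating the non-dominated (Pareto-efficient) alternatives from the inferior decision space. The sketch below shows that filtering step for made-up portfolio metrics (cost and expected shortage, both minimized); it is a generic illustration of non-dominated sorting, not CSU's IWRP tooling.

    ```python
    import numpy as np

    def pareto_front(objectives):
        """Return indices of non-dominated rows, assuming all objectives are minimized."""
        obj = np.asarray(objectives, dtype=float)
        keep = []
        for i in range(obj.shape[0]):
            dominated = np.any(np.all(obj <= obj[i], axis=1) &
                               np.any(obj < obj[i], axis=1))
            if not dominated:
                keep.append(i)
        return keep

    # Hypothetical portfolios: (annualized cost, expected shortage), both to minimize.
    candidates = [(120.0, 8.5), (150.0, 3.0), (110.0, 9.9),
                  (155.0, 3.5), (160.0, 2.9), (125.0, 9.0)]
    front = pareto_front(candidates)
    print("non-dominated portfolios:", [candidates[i] for i in front])
    ```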

  13. The Integrated Landscape Modeling partnership - Current status and future directions

    USGS Publications Warehouse

    Mushet, David M.; Scherff, Eric J.

    2016-01-28

    The Integrated Landscape Modeling (ILM) partnership is an effort by the U.S. Geological Survey (USGS) and U.S. Department of Agriculture (USDA) to identify, evaluate, and develop models to quantify services derived from ecosystems, with a focus on wetland ecosystems and conservation effects. The ILM partnership uses the Integrated Valuation of Ecosystem Services and Tradeoffs (InVEST) modeling platform to facilitate regional quantifications of ecosystem services under various scenarios of land-cover change that are representative of differing conservation program and practice implementation scenarios. To date, the ILM InVEST partnership has resulted in capabilities to quantify carbon stores, amphibian habitat, plant-community diversity, and pollination services. Work to include waterfowl and grassland bird habitat quality is in progress. Initial InVEST modeling has been focused on the Prairie Pothole Region (PPR) of the United States; future efforts might encompass other regions as data availability and knowledge increase as to how functions affecting ecosystem services differ among regions.The ILM partnership is also developing the capability for field-scale process-based modeling of depressional wetland ecosystems using the Agricultural Policy/Environmental Extender (APEX) model. Progress was made towards the development of techniques to use the APEX model for closed-basin depressional wetlands of the PPR, in addition to the open systems that the model was originally designed to simulate. The ILM partnership has matured to the stage where effects of conservation programs and practices on multiple ecosystem services can now be simulated in selected areas. Future work might include the continued development of modeling capabilities, as well as development and evaluation of differing conservation program and practice scenarios of interest to partner agencies including the USDA’s Farm Service Agency (FSA) and Natural Resources Conservation Service (NRCS). When combined, the ecosystem services modeling capabilities of InVEST and the process-based abilities of the APEX model should provide complementary information needed to meet USDA and the Department of the Interior information needs.

  14. SCISEAL: A CFD code for analysis of fluid dynamic forces in seals

    NASA Technical Reports Server (NTRS)

    Athavale, Mahesh; Przekwas, Andrzej

    1994-01-01

    A viewgraph presentation is made of the objectives, capabilities, and test results of the computer code SCISEAL. Currently, the seal code has: a finite volume, pressure-based integration scheme; colocated variables with strong conservation approach; high-order spatial differencing, up to third-order; up to second-order temporal differencing; a comprehensive set of boundary conditions; a variety of turbulence models and surface roughness treatment; moving grid formulation for arbitrary rotor whirl; rotor dynamic coefficients calculated by the circular whirl and numerical shaker methods; and small perturbation capabilities to handle centered and eccentric seals.

  15. Current Capabilities, Requirements and a Proposed Strategy for Interdependency Analysis in the UK

    NASA Astrophysics Data System (ADS)

    Bloomfield, Robin; Chozos, Nick; Salako, Kizito

    The UK government recently commissioned a research study to identify the state-of-the-art in Critical Infrastructure modelling and analysis, and the government/industry requirements for such tools and services. This study (Cetifs) concluded with a strategy aiming to bridge the gaps between the capabilities and requirements, which would establish interdependency analysis as a commercially viable service in the near future. This paper presents the findings of this study, which was carried out by CSR (City University London); Adelard LLP, a safety/security consultancy; and Cranfield University, defence academy of the UK.

  16. Characterization and imaging of nanostructured materials using tabletop extreme ultraviolet light sources

    NASA Astrophysics Data System (ADS)

    Karl, Robert; Knobloch, Joshua; Frazer, Travis; Tanksalvala, Michael; Porter, Christina; Bevis, Charles; Chao, Weilun; Abad Mayor, Begoña; Adams, Daniel; Mancini, Giulia F.; Hernandez-Charpak, Jorge N.; Kapteyn, Henry; Murnane, Margaret

    2018-03-01

    Using a tabletop coherent extreme ultraviolet source, we extend current nanoscale metrology capabilities with applications spanning from new models of nanoscale transport and materials to nanoscale device fabrication. We measure the ultrafast dynamics of acoustic waves in materials; by analyzing the material's response, we can extract elastic properties of films as thin as 11 nm. We extend this capability to a spatially resolved imaging modality by using coherent diffractive imaging to image the acoustic waves in nanostructures as they propagate. This will allow for spatially resolved characterization of the elastic properties of non-isotropic materials.

  17. Development of a multi-disciplinary ERTS user program in the state of Ohio. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Baldridge, P. E.; Weber, C.; Schaal, G.; Wilhelm, C.; Wurelic, G. E.; Stephan, J. G.; Ebbert, T. F.; Smail, H. E.; Mckeon, J.; Schmidt, N. (Principal Investigator)

    1977-01-01

    The author has identified the following significant results. A current uniform land inventory was derived, in part, from LANDSAT data. The State has the ability to convert processed land information from LANDSAT to the Ohio Capability Analysis Program (OCAP). The OCAP is a computer information and mapping system comprised of various programs used to digitally store, analyze, and display land capability information. More accurate processing of LANDSAT data could lead to reasonably accurate, useful land allocation models. It was feasible to use LANDSAT data to investigate minerals, pollution, land use, and resource inventory.

  18. MANPRINT Methods Monograph: Aiding the Development of Manned System Performance Criteria

    DTIC Science & Technology

    1989-06-01

    Excerpt (fragments): [...] the need for the new system; it may be necessary to derive these requirements from combat models, by modeling the capabilities of the current force [...]; the O&O Plan describes how a system will be integrated into the force structure, deployed, operated, and supported in peacetime and wartime [...]; [...] for evaluation during OT I; manpower/force structure assessment: estimate manpower requirements per system, using unit, and total Army by [...]

  19. Comparison of Hall Thruster Plume Expansion Model with Experimental Data (Preprint)

    DTIC Science & Technology

    2006-07-01

    Excerpt (fragments): [...] Cartesian mesh; AQUILA, the focus of this study, is a hybrid PIC model that tracks particles along an unstructured tetrahedral mesh; COLISEUM is capable [...]; measurements of the ion current density profile, ion energy distributions, and ion species fraction distributions using a nude Faraday probe [...]

  20. U.S. Nuclear Weapons Enterprise: A Strategic Past and Unknown Future

    DTIC Science & Technology

    2012-04-25

    Excerpt (fragments): [...] are left to base their planning assumptions, weapons designs and capabilities on outdated models; the likelihood of a large-scale nuclear war has [...]; [...] conduct any testing on nuclear weapons and must rely on computer modeling; while this may provide sufficient confidence in the current nuclear [...]; it is unlikely the world will be free of nuclear weapons.

  1. Connected Equipment Maturity Model Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butzbaugh, Joshua B.; Mayhorn, Ebony T.; Sullivan, Greg

    2017-05-01

    The Connected Equipment Maturity Model (CEMM) evaluates the high-level functionality and characteristics that enable equipment to provide the four categories of energy-related services through communication with other entities (e.g., equipment, third parties, utilities, and users). The CEMM will help the U.S. Department of Energy, industry, energy efficiency organizations, and research institutions benchmark the current state of connected equipment and identify capabilities that may be attained to reach a more advanced, future state.

  2. Autonomous Dynamically Self-Organizing and Self-Healing Distributed Hardware Architecture - the eDNA Concept

    NASA Technical Reports Server (NTRS)

    Boesen, Michael Reibel; Madsen, Jan; Keymeulen, Didier

    2011-01-01

    This paper presents the current state of the autonomous dynamically self-organizing and self-healing electronic DNA (eDNA) hardware architecture (patent pending). In its current prototype state, the eDNA architecture is capable of responding to multiple injected faults by autonomously reconfiguring itself to accommodate the fault and keep the application running. This paper will also disclose advanced features currently available in the simulation model only. These features are future work and will soon be implemented in hardware. Finally we will describe step-by-step how an application is implemented on the eDNA architecture.

  3. URBAN STORMWATER INVESTIGATIONS BY THE U. S. GEOLOGICAL SURVEY.

    USGS Publications Warehouse

    Jennings, Marshall E.

    1985-01-01

    Urban stormwater hydrology studies in the U. S. Geological Survey are currently focused on compilation of national data bases containing flood-peak and short time-interval rainfall, discharge and water-quality information for urban watersheds. Current data bases, updated annually, are nationwide in scope. Supplementing the national data files are published reports of interpretative analyses, a map report and research products including improved instrumentation and deterministic modeling capabilities. New directions of Survey investigations include gaging programs for very small catchments and for stormwater detention facilities.

  4. The need and approach for characterization - U.S. air force perspectives on materials state awareness

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Lindgren, Eric A.

    2018-04-01

    This paper expands on the objective and motivation for NDE-based characterization and includes a discussion of the current approach using model-assisted inversion being pursued within the Air Force Research Laboratory (AFRL). This includes a discussion of the multiple model-based methods that can be used, including physics-based models, deep machine learning, and heuristic approaches. The benefits and drawbacks of each method are reviewed and the potential to integrate multiple methods is discussed. Initial successes are included to highlight the ability to obtain quantitative values of damage. Additional steps remaining to realize this capability with statistical metrics of accuracy are discussed, and the use of these results to enable probabilistic life management is addressed. The outcome of this initiative will realize the long-term desired capability of NDE methods to provide quantitative characterization to accelerate certification of new materials and enhance life management of engineered systems.

  5. Dark traits and suicide: Associations between psychopathy, narcissism, and components of the interpersonal-psychological theory of suicide.

    PubMed

    Harrop, Tiffany M; Preston, Olivia C; Khazem, Lauren R; Anestis, Michael D; Junearick, Regis; Green, Bradley A; Anestis, Joye C

    2017-10-01

    Studies have identified independent relationships between psychopathy, narcissism, and suicidality. The current study expands upon the extant literature by exploring psychopathic and narcissistic personality traits and components of the interpersonal-psychological theory of suicide, utilizing a 3-factor model of psychopathy and 2-factor model of pathological narcissism in community, undergraduate, and military individuals. We hypothesized that the impulsive-antisocial facets of psychopathy would be related to suicidal desire, whereas all facets of psychopathy would relate to the capability for suicide. We anticipated an association between pathological narcissism, thwarted belongingness, and capability for suicide, but not perceived burdensomeness. We further hypothesized a relationship between physical pain tolerance and persistence and the affective (i.e., callousness) facet of psychopathy. Results partially supported these hypotheses and underscore the need for further examination of these associations utilizing contemporary models of psychopathy and narcissism. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Updates to the Generation of Physics Data Inputs for MAMMOTH Simulations of the Transient Reactor Test Facility - FY2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ortensi, Javier; Baker, Benjamin Allen; Schunert, Sebastian

    The INL is currently evolving the modeling and simulation (M&S) capability that will enable improved core operation as well as design and analysis of TREAT experiments. This M&S capability primarily uses MAMMOTH, a reactor physics application being developed under the Multi-physics Object Oriented Simulation Environment (MOOSE) framework. MAMMOTH allows the coupling of a number of other MOOSE-based applications. This second year of work has been devoted to the generation of a deterministic reference solution for the full core, the preparation of anisotropic diffusion coefficients, the testing of the SPH equivalence method, and the improvement of the control rod modeling. In addition, this report includes the progress made in the modeling of the M8 core configuration and experiment vehicle since January of this year.

  7. Multi-Instance Learning Models for Automated Support of Analysts in Simulated Surveillance Environments

    NASA Technical Reports Server (NTRS)

    Birisan, Mihnea; Beling, Peter

    2011-01-01

    New generations of surveillance drones are being outfitted with numerous high definition cameras. The rapid proliferation of fielded sensors and supporting capacity for processing and displaying data will translate into ever more capable platforms, but with increased capability comes increased complexity and scale that may diminish the usefulness of such platforms to human operators. We investigate methods for alleviating strain on analysts by automatically retrieving content specific to their current task using a machine learning technique known as Multi-Instance Learning (MIL). We use MIL to create a real time model of the analysts' task and subsequently use the model to dynamically retrieve relevant content. This paper presents results from a pilot experiment in which a computer agent is assigned analyst tasks such as identifying caravanning vehicles in a simulated vehicle traffic environment. We compare agent performance between MIL aided trials and unaided trials.
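
    Multi-instance learning treats each labeled unit as a bag of instances whose individual labels are unknown: a bag is positive if it contains at least one relevant instance. The sketch below is a common max-pooling baseline for that setting, trained on synthetic bags; it illustrates the general technique and is not the agent model used in the experiment above.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def make_bag(positive, n_inst=10, dim=5):
        X = rng.normal(size=(n_inst, dim))
        if positive:                       # hide one "relevant" instance in the bag
            X[rng.integers(n_inst)] += np.array([2.0, 2.0, 0.0, 0.0, 0.0])
        return X

    bags = [make_bag(i % 2 == 0) for i in range(60)]
    labels = np.array([1.0 if i % 2 == 0 else 0.0 for i in range(60)])

    # Linear scorer trained with a logistic loss on the max instance score per bag,
    # so the gradient flows only through the highest-scoring instance (max pooling).
    w, b, lr = np.zeros(5), 0.0, 0.1
    for epoch in range(200):
        for X, y in zip(bags, labels):
            scores = X @ w + b
            k = int(np.argmax(scores))
            p = 1.0 / (1.0 + np.exp(-scores[k]))
            w -= lr * (p - y) * X[k]
            b -= lr * (p - y)

    correct = sum(int((np.max(X @ w + b) > 0) == bool(y)) for X, y in zip(bags, labels))
    print(f"training-bag accuracy: {correct}/{len(bags)}")
    ```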

  8. EIT forward problem parallel simulation environment with anisotropic tissue and realistic electrode models.

    PubMed

    De Marco, Tommaso; Ries, Florian; Guermandi, Marco; Guerrieri, Roberto

    2012-05-01

    Electrical impedance tomography (EIT) is an imaging technology based on impedance measurements. To retrieve meaningful insights from these measurements, EIT relies on detailed knowledge of the underlying electrical properties of the body. This is obtained from numerical models of current flows therein. The nonhomogeneous and anisotropic electric properties of human tissues make accurate modeling and simulation very challenging, leading to a tradeoff between physical accuracy and technical feasibility, which at present severely limits the capabilities of EIT. This work presents a complete algorithmic flow for an accurate EIT modeling environment featuring high anatomical fidelity with a spatial resolution equal to that provided by an MRI and a novel realistic complete electrode model implementation. At the same time, we demonstrate that current graphics processing unit (GPU)-based platforms provide enough computational power that a domain discretized with five million voxels can be numerically modeled in about 30 s.
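
    At the core of any EIT forward solver is a solution of the conductivity equation div(sigma grad phi) = 0 over the modeled domain. The toy sketch below solves a coarse 2-D isotropic version by Jacobi iteration with two fixed-potential electrode patches; a realistic solver of the kind described above would be 3-D and anisotropic, use a complete electrode model, and run on a GPU.

    ```python
    import numpy as np

    n = 40
    sigma = np.ones((n, n))
    sigma[15:25, 15:25] = 0.2                    # low-conductivity inclusion

    phi = np.zeros((n, n))
    drive = np.zeros((n, n), dtype=bool); drive[0, 5:10] = True     # electrode at +1
    sink = np.zeros((n, n), dtype=bool);  sink[-1, 30:35] = True    # electrode at -1
    phi[drive], phi[sink] = 1.0, -1.0

    sp = np.pad(sigma, 1, mode="edge")           # ghost cells copy edge conductivity
    wN = 0.5 * (sp[1:-1, 1:-1] + sp[:-2, 1:-1])  # face conductivities, 5-point stencil
    wS = 0.5 * (sp[1:-1, 1:-1] + sp[2:, 1:-1])
    wW = 0.5 * (sp[1:-1, 1:-1] + sp[1:-1, :-2])
    wE = 0.5 * (sp[1:-1, 1:-1] + sp[1:-1, 2:])

    for _ in range(4000):                        # Jacobi sweeps
        pp = np.pad(phi, 1, mode="edge")         # edge padding -> zero-flux walls
        num = (wN * pp[:-2, 1:-1] + wS * pp[2:, 1:-1] +
               wW * pp[1:-1, :-2] + wE * pp[1:-1, 2:])
        phi = num / (wN + wS + wW + wE)
        phi[drive], phi[sink] = 1.0, -1.0        # re-impose electrode potentials

    print("computed potential spans", round(phi.min(), 3), "to", round(phi.max(), 3))
    ```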

  9. Examining the Relationships Between Education, Social Networks and Democratic Support With ABM

    NASA Technical Reports Server (NTRS)

    Drucker, Nick; Campbell, Kenyth

    2011-01-01

    This paper introduces an agent-based model that explores the relationships between education, social networks, and support for democratic ideals. This study examines two factors that affect democratic support: education and social networks. Current theory concerning these two variables suggests that positive relationships exist between education and democratic support and between social networks and the spread of ideas. The model contains multiple variables of democratic support, two of which are evaluated through experimentation. The model allows individual entities within the system to make "decisions" about their democratic support independent of one another. The agent-based approach also allows entities to utilize their social networks to spread ideas. The experimentation results are consistent with current theory. In addition, these results show the model is capable of reproducing real-world outcomes. This paper addresses the model creation process and the experimentation procedure, as well as future research avenues and potential shortcomings of the model.

  10. Argobots: A Lightweight Low-Level Threading and Tasking Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan

    In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, either are too specific to applications or architectures or are not as powerful or flexible. In this paper, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by end users or high-level programming models. We describe the design, implementation, and performance characterization of Argobots and present integrations with three high-level models: OpenMP, MPI, and colocated I/O services. Evaluations show that (1) Argobots, while providing richer capabilities, is competitive with existing simpler generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency-hiding capabilities; and (4) I/O services with Argobots reduce interference with colocated applications while achieving performance competitive with that of a Pthreads approach.

  11. THE IDEA IS TO USE MODIS IN CONJUNCTION WITH THE CURRENT LIMITED LANDSAT CAPABILITY, COMMERCIAL SATELLITES, AND UNMANNED AERIAL VEHICLES (UAV), IN A MULTI-STAGE APPROACH TO MEET EPA INFORMATION NEEDS. REMOTE SENSING OVERVIEW: EPA CAPABILITIES, PRIORITY AGENCY APPLICATIONS, SENSOR/AIRCRAFT CAPABILITIES, COST CONSIDERATIONS, SPECTRAL AND SPATIAL RESOLUTIONS, AND TEMPORAL CONSIDERATIONS

    EPA Science Inventory

    EPA remote sensing capabilities include applied research for priority applications and technology support for operational assistance to clients across the Agency. The idea is to use MODIS in conjunction with the current limited Landsat capability, commercial satellites, and Unma...

  12. Modelling the impacts of pests and diseases on agricultural systems.

    PubMed

    Donatelli, M; Magarey, R D; Bregaglio, S; Willocquet, L; Whish, J P M; Savary, S

    2017-07-01

    The improvement and application of pest and disease models to analyse and predict yield losses, including those due to climate change, is still a challenge for the scientific community. Applied modelling of crop diseases and pests has mostly targeted the development of support capabilities to schedule scouting or pesticide applications. There is a need for research to both broaden the scope and evaluate the capabilities of pest and disease models. Key research questions not only involve the assessment of the potential effects of climate change on known pathosystems, but also on new pathogens which could alter the (still incompletely documented) impacts of pests and diseases on agricultural systems. Yield loss data collected in various current environments may no longer represent an adequate reference to develop tactical, decision-oriented models for plant diseases and pests and their impacts, because of the ongoing changes in climate patterns. Process-based agricultural simulation modelling, on the other hand, appears to represent a viable methodology to estimate the impacts of these potential effects. A new generation of tools based on state-of-the-art knowledge and technologies is needed to allow systems analysis including key processes and their dynamics over an appropriate range of environmental variables. This paper offers a brief overview of the current state of development in coupling pest and disease models to crop models, and discusses technical and scientific challenges. We propose a five-stage roadmap to improve the simulation of the impacts caused by plant diseases and pests: (i) improve the quality and availability of data for model inputs; (ii) improve the quality and availability of data for model evaluation; (iii) improve the integration with crop models; (iv) improve the processes for model evaluation; and (v) develop a community of plant pest and disease modelers.

  13. The SMART-NAS Testbed

    NASA Technical Reports Server (NTRS)

    Aquilina, Rudolph A.

    2015-01-01

    The SMART-NAS Testbed for Safe Trajectory Based Operations Project will deliver an evaluation capability, critical to the ATM community, allowing full NextGen and beyond-NextGen concepts to be assessed and developed. To meet this objective a strong focus will be placed on concept integration and validation to enable a gate-to-gate trajectory-based system capability that satisfies a full vision for NextGen. The SMART-NAS for Safe TBO Project consists of six sub-projects. Three of the sub-projects are focused on exploring and developing technologies, concepts and models for evolving and transforming air traffic management operations in the ATM+2 time horizon, while the remaining three sub-projects are focused on developing the tools and capabilities needed for testing these advanced concepts. Function Allocation, Networked Air Traffic Management and Trajectory Based Operations are developing concepts and models. SMART-NAS Test-bed, System Assurance Technologies and Real-time Safety Modeling are developing the tools and capabilities to test these concepts. Simulation and modeling capabilities will include the ability to assess multiple operational scenarios of the national airspace system, accept data feeds, allowing shadowing of actual operations in either real-time, fast-time and/or hybrid modes of operations in distributed environments, and enable integrated examinations of concepts, algorithms, technologies, and NAS architectures. An important focus within this project is to enable the development of a real-time, system-wide safety assurance system. The basis of such a system is a continuum of information acquisition, analysis, and assessment that enables awareness and corrective action to detect and mitigate potential threats to continuous system-wide safety at all levels. This process, which currently can only be done post operations, will be driven towards "real-time" assessments in the 2035 time frame.

  14. BEopt-CA (Ex) -- A Tool for Optimal Integration of EE/DR/ES+PV in Existing California Homes. Cooperative Research and Development Final Report, CRADA Number CRD-11-429

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Craig

    Opportunities for combining energy efficiency, demand response, and energy storage with PV are often missed, because the required knowledge and expertise for these different technologies exist in separate organizations or individuals. Furthermore, there is a lack of quantitative tools to optimize energy efficiency, demand response and energy storage with PV, especially for existing buildings. Our goal is to develop a modeling tool, BEopt-CA (Ex), with capabilities to facilitate identification and implementation of a balanced integration of energy efficiency (EE), demand response (DR), and energy storage (ES) with photovoltaics (PV) within the residential retrofit market. To achieve this goal, we will adapt and extend an existing tool -- BEopt -- that is designed to identify optimal combinations of efficiency and PV in new home designs. In addition, we will develop multifamily residential modeling capabilities for use in California, to facilitate integration of distributed solar power into the grid in order to maximize its value to California ratepayers. The project is follow-on research that leverages previous California Solar Initiative RD&D investment in the BEopt software. BEopt facilitates finding the least cost combination of energy efficiency and renewables to support integrated DSM (iDSM) and Zero Net Energy (ZNE) in California residential buildings. However, BEopt is currently focused on modeling single-family houses and does not include satisfactory capabilities for modeling multifamily homes. The project brings BEopt's existing modeling and optimization capabilities to multifamily buildings, including duplexes, triplexes, townhouses, flats, and low-rise apartment buildings.

  15. A numerical model for thermal energy storage systems utilising encapsulated phase change materials

    NASA Astrophysics Data System (ADS)

    Jacob, Rhys; Saman, Wasim; Bruno, Frank

    2016-05-01

    In an effort to reduce the cost of thermal energy storage for concentrated solar power plants, a thermocline storage concept was investigated. Two systems were investigated: a sensible-only system and an encapsulated phase change system. Both systems have the potential to reduce the storage tank volume and/or reduce the cost of the filler material, thereby reducing the cost of the system when compared to current two-tank molten salt systems. The objective of the current paper is to create a numerical model capable of designing and simulating the aforementioned thermocline storage concepts in the open-source programming language Python. The results of the current study are compared to previous numerical results and are found to be in good agreement.
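
    In the same spirit as the model described above, the sketch below integrates a simple 1-D two-temperature (Schumann-type) packed-bed energy balance in Python with an explicit upwind scheme during charging. The filler is treated as sensible-only for brevity, and all property values are assumed order-of-magnitude numbers rather than the authors' inputs.

    ```python
    import numpy as np

    nx, L = 100, 10.0                  # grid cells, tank height [m]
    dx, dt, steps = L / nx, 50.0, 144  # 2 hours of charging
    u = 1.0e-3                         # interstitial fluid velocity [m/s]
    eps = 0.4                          # void fraction
    rho_cf, rho_cs = 2.7e6, 2.3e6      # volumetric heat capacities [J/m^3/K]
    h_v = 1000.0                       # volumetric heat transfer coeff [W/m^3/K]
    T_hot, T_cold = 565.0, 290.0       # charging inlet and initial temperatures [C]

    Tf = np.full(nx, T_cold)           # fluid temperature profile
    Ts = np.full(nx, T_cold)           # filler temperature profile

    for _ in range(steps):
        Tf_up = np.concatenate(([T_hot], Tf[:-1]))   # upwind neighbor, inlet at x = 0
        adv = -u * (Tf - Tf_up) / dx                 # advection of fluid temperature
        exch = h_v * (Ts - Tf) / (eps * rho_cf)      # fluid-filler heat exchange
        Ts += dt * h_v * (Tf - Ts) / ((1 - eps) * rho_cs)
        Tf += dt * (adv + exch)

    print("fluid temperature at 1/4, 1/2, 3/4 depth:",
          np.round(Tf[[nx // 4, nx // 2, 3 * nx // 4]], 1))
    ```

    The printed values give a coarse picture of the thermocline position after two hours of charging.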

  16. Distributed Hydrologic Modeling Apps for Decision Support in the Cloud

    NASA Astrophysics Data System (ADS)

    Swain, N. R.; Latu, K.; Christiensen, S.; Jones, N.; Nelson, J.

    2013-12-01

    Advances in computation resources and greater availability of water resources data represent an untapped resource for addressing hydrologic uncertainties in water resources decision-making. The current practice of water authorities relies on empirical, lumped hydrologic models to estimate watershed response. These models are not capable of taking advantage of many of the spatial datasets that are now available. Physically-based, distributed hydrologic models are capable of using these data resources and providing better predictions through stochastic analysis. However, there exists a digital divide that discourages many science-minded decision makers from using distributed models. This divide can be spanned using a combination of existing web technologies. The purpose of this presentation is to present a cloud-based environment that will offer hydrologic modeling tools or 'apps' for decision support and the web technologies that have been selected to aid in its implementation. Compared to the more commonly used lumped-parameter models, distributed models, while being more intuitive, are still data intensive, computationally expensive, and difficult to modify for scenario exploration. However, web technologies such as web GIS, web services, and cloud computing have made the data more accessible, provided an inexpensive means of high-performance computing, and created an environment for developing user-friendly apps for distributed modeling. Since many water authorities are primarily interested in scenario exploration exercises with hydrologic models, we are creating a toolkit that facilitates the development of a series of apps for manipulating existing distributed models. There are a number of hurdles that cloud-based hydrologic modeling developers face. One of these is how to work with the geospatial data inherent to this class of models in a web environment. Supporting geospatial data in a website is beyond the capabilities of standard web frameworks, and it requires the use of additional software. In particular, there are at least three elements that are needed: a geospatially enabled database, a map server, and a geoprocessing toolbox. We recommend a software stack for geospatial web application development comprising MapServer, PostGIS, and 52 North, with Python as the scripting language to tie them together. Another hurdle that must be cleared is managing the cloud-computing load. We are using HTCondor as a solution to this end. Finally, we are creating a scripting environment wherein developers will be able to create apps that use existing hydrologic models in our system with minimal effort. This capability will be accomplished by creating a plugin for a Python content management system called CKAN. We are currently developing cyberinfrastructure that utilizes this stack and greatly lowers the investment required to deploy cloud-based modeling apps. This material is based upon work supported by the National Science Foundation under Grant No. 1135482.
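
    To make the geospatial part of the stack concrete, the snippet below shows the kind of query such an app might send to the PostGIS layer through psycopg2: selecting model grid cells that intersect a user-drawn area of interest. The connection settings, table, and column names are hypothetical; only the psycopg2 calls and the PostGIS functions (ST_Intersects, ST_GeomFromText, ST_Area) are standard.

    ```python
    import psycopg2

    # Hypothetical database and schema: a table of model grid cells with a
    # geometry column "geom" stored in EPSG:4326.
    conn = psycopg2.connect(host="localhost", dbname="hydro", user="app", password="secret")
    try:
        with conn.cursor() as cur:
            # Area of interest drawn in the web client and sent as WKT.
            aoi_wkt = ("POLYGON((-111.7 40.2, -111.5 40.2, -111.5 40.4, "
                       "-111.7 40.4, -111.7 40.2))")
            cur.execute(
                """
                SELECT cell_id, ST_Area(geom::geography) AS area_m2
                FROM watershed_cells
                WHERE ST_Intersects(geom, ST_GeomFromText(%s, 4326))
                """,
                (aoi_wkt,),
            )
            for cell_id, area_m2 in cur.fetchall():
                print(cell_id, round(area_m2, 1))
    finally:
        conn.close()
    ```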

  17. Interactive design and analysis of future large spacecraft concepts

    NASA Technical Reports Server (NTRS)

    Garrett, L. B.

    1981-01-01

    An interactive computer-aided design program used to perform systems-level design and analysis of large spacecraft concepts is presented. Emphasis is on rapid design, analysis of integrated spacecraft, and automatic spacecraft modeling for lattice structures. Capabilities and performance of multidiscipline applications modules, the executive and data management software, and graphics display features are reviewed. A single user at an interactive terminal can create, design, analyze, and conduct parametric studies of Earth-orbiting spacecraft with relative ease. Data generated in the design, analysis, and performance evaluation of an Earth-orbiting large-diameter antenna satellite are used to illustrate current capabilities. Computer run time statistics for the individual modules quantify the speed at which modeling, analysis, and design evaluation of integrated spacecraft concepts are accomplished in a user-interactive computing environment.

  18. The Framework for 0-D Atmospheric Modeling (F0AM) v3.1

    NASA Technical Reports Server (NTRS)

    Wolfe, Glenn M.; Marvin, Margaret R.; Roberts, Sandra J.; Travis, Katherine R.; Liao, Jin

    2016-01-01

    The Framework for 0-D Atmospheric Modeling (F0AM) is a flexible and user-friendly MATLAB-based platform for simulation of atmospheric chemistry systems. The F0AM interface incorporates front-end configuration of observational constraints and model setups, making it readily adaptable to simulation of photochemical chambers, Lagrangian plumes, and steady-state or time-evolving solar cycles. Six different chemical mechanisms and three options for calculation of photolysis frequencies are currently available. Example simulations are presented to illustrate model capabilities and, more generally, highlight some of the advantages and challenges of 0-D box modeling.
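
    F0AM itself is MATLAB-based; purely to illustrate what a 0-D box model integrates, the hedged Python sketch below solves a toy NO-NO2-O3 photostationary-state system. The rate constant, photolysis frequency, and initial concentrations are representative placeholder values and are not taken from F0AM's mechanisms.

    ```python
    # Minimal 0-D chemical box model sketch (not F0AM): NO-NO2-O3 cycling.
    # Rate constant and photolysis frequency are order-of-magnitude placeholders.
    from scipy.integrate import solve_ivp

    K_NO_O3 = 1.8e-14   # cm^3 molecule^-1 s^-1, NO + O3 -> NO2 + O2 (approximate)
    J_NO2 = 8.0e-3      # s^-1, NO2 photolysis frequency (midday-like placeholder)

    def rhs(t, y):
        no, no2, o3 = y
        r1 = K_NO_O3 * no * o3   # NO + O3 -> NO2 + O2
        r2 = J_NO2 * no2         # NO2 + hv -> NO + O(3P), with O + O2 -> O3 assumed fast
        return [-r1 + r2, r1 - r2, -r1 + r2]

    y0 = [2.5e10, 1.0e10, 8.0e11]   # molecules cm^-3: NO, NO2, O3
    sol = solve_ivp(rhs, (0.0, 3600.0), y0, method="LSODA", rtol=1e-8)
    print("NO2/NO ratio after one hour:", sol.y[1, -1] / sol.y[0, -1])
    ```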

  19. Visualizing and Validating Metadata Traceability within the CDISC Standards.

    PubMed

    Hume, Sam; Sarnikar, Surendra; Becnel, Lauren; Bennett, Dorine

    2017-01-01

    The Food & Drug Administration has begun requiring that electronic submissions of regulated clinical studies utilize the Clinical Data Interchange Standards Consortium (CDISC) data standards. Within regulated clinical research, traceability is a requirement and indicates that the analysis results can be traced back to the original source data. Current solutions for clinical research data traceability are limited in terms of querying, validation and visualization capabilities. This paper describes (1) the development of metadata models to support computable traceability and traceability visualizations that are compatible with industry data standards for the regulated clinical research domain, (2) adaptation of graph traversal algorithms to make them capable of identifying traceability gaps and validating traceability across the clinical research data lifecycle, and (3) development of a traceability query capability for retrieval and visualization of traceability information.

  20. Visualizing and Validating Metadata Traceability within the CDISC Standards

    PubMed Central

    Hume, Sam; Sarnikar, Surendra; Becnel, Lauren; Bennett, Dorine

    2017-01-01

    The Food & Drug Administration has begun requiring that electronic submissions of regulated clinical studies utilize the Clinical Data Interchange Standards Consortium (CDISC) data standards. Within regulated clinical research, traceability is a requirement and indicates that the analysis results can be traced back to the original source data. Current solutions for clinical research data traceability are limited in terms of querying, validation and visualization capabilities. This paper describes (1) the development of metadata models to support computable traceability and traceability visualizations that are compatible with industry data standards for the regulated clinical research domain, (2) adaptation of graph traversal algorithms to make them capable of identifying traceability gaps and validating traceability across the clinical research data lifecycle, and (3) development of a traceability query capability for retrieval and visualization of traceability information. PMID:28815125
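
    As a rough illustration of the graph-traversal idea described in both records above (a sketch only, not the authors' algorithm; the node names are hypothetical), the Python snippet below walks a toy metadata graph upstream from an analysis result and flags any referenced element with no recorded provenance as a traceability gap:

    ```python
    # Hedged sketch: breadth-first traversal of a metadata traceability graph.
    # Keys are derived elements; values list the elements they were derived from.
    from collections import deque

    derived_from = {
        "Result.Table14-1": ["ADaM.ADAE"],
        "ADaM.ADAE": ["SDTM.AE", "ADaM.ADSL"],
        "ADaM.ADSL": ["SDTM.DM"],
        "SDTM.DM": [],   # source data: traceability terminates here
        "SDTM.AE": [],
    }

    def trace(start):
        """Return (visited, gaps): gaps are elements referenced but never defined."""
        visited, gaps, queue = set(), [], deque([start])
        while queue:
            node = queue.popleft()
            if node in visited:
                continue
            visited.add(node)
            parents = derived_from.get(node)
            if parents is None:
                gaps.append(node)   # broken traceability link
                continue
            queue.extend(parents)
        return visited, gaps

    nodes, gaps = trace("Result.Table14-1")
    print("traced:", sorted(nodes), "gaps:", gaps)
    ```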

  1. Integration of Multifidelity Multidisciplinary Computer Codes for Design and Analysis of Supersonic Aircraft

    NASA Technical Reports Server (NTRS)

    Geiselhart, Karl A.; Ozoroski, Lori P.; Fenbert, James W.; Shields, Elwood W.; Li, Wu

    2011-01-01

    This paper documents the development of a conceptual-level integrated process for design and analysis of efficient and environmentally acceptable supersonic aircraft. To overcome the technical challenges to achieving this goal, a conceptual design capability is needed that provides users with the ability to examine the integrated solution across all disciplines and facilitates the application of multidisciplinary design, analysis, and optimization on a scale greater than previously achieved. The described capability is both an interactive design environment and a high-powered optimization system, with a unique blend of low-, mixed-, and high-fidelity engineering tools combined in the software integration framework ModelCenter. The various modules are described and the capabilities of the system are demonstrated. The current limitations and proposed future enhancements are also discussed.

  2. On the application of quantum transport theory to electron sources.

    PubMed

    Jensen, Kevin L

    2003-01-01

    Electron sources (e.g., field emitter arrays, wide band-gap (WBG) semiconductor materials and coatings, carbon nanotubes, etc.) seek to exploit ballistic transport within the vacuum after emission from microfabricated structures. Regardless of kind, all sources strive to minimize the barrier to electron emission by engineering the material properties (work function/electron affinity) or physical geometry (field enhancement) of the cathode. The unique capabilities of cold cathodes, such as instant ON/OFF performance, high brightness, high current density, large transconductance-to-capacitance ratio, cold emission, small size, and/or low-voltage operation, commend their use in several advanced devices where physical size, weight, power consumption, beam current, and pulse repetition frequency are important, e.g., RF power amplifiers such as traveling wave tubes (TWTs) for radar and communications, electrodynamic tethers for satellite deboost/reboost, and electric propulsion systems such as Hall thrusters for small satellites. The theoretical program described herein is directed towards models for evaluating emission current from electron sources (in particular, emission from WBG and Spindt-type field emitters) in order to assess their utility, capabilities and performance characteristics. Modeling efforts particularly include: band bending, non-linear and resonant (Poole-Frenkel) potentials, the extension of one-dimensional theory to multi-dimensional structures, and emission site statistics due to variations in geometry and the presence of adsorbates. Two particular methodologies, namely the modified Airy approach and the metal-semiconductor statistical hyperbolic/ellipsoidal model, are described in detail in their present stage of development.
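
    For orientation only (the abstract's models are considerably more detailed), field emission current densities are conventionally benchmarked against Fowler-Nordheim-type expressions of the form

    \[
      J(F) \;=\; \frac{A\,F^{2}}{\phi}\,\exp\!\left(-\frac{B\,\phi^{3/2}}{F}\right),
    \]

    where $J$ is the emitted current density, $F$ the surface field, $\phi$ the work function, and $A$, $B$ constants; the band-bending, Poole-Frenkel, and modified Airy treatments described above refine this elementary picture for semiconductor and multi-dimensional emitters.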

  3. Resolving Orbital and Climate Keys of Earth and Extraterrestrial Environments with Dynamics (ROCKE-3D) 1.0: A General Circulation Model for Simulating the Climates of Rocky Planets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Way, M. J.; Aleinov, I.; Amundsen, David S.

    Resolving Orbital and Climate Keys of Earth and Extraterrestrial Environments with Dynamics (ROCKE-3D) is a three-dimensional General Circulation Model (GCM) developed at the NASA Goddard Institute for Space Studies for the modeling of atmospheres of solar system and exoplanetary terrestrial planets. Its parent model, known as ModelE2, is used to simulate modern Earth and near-term paleo-Earth climates. ROCKE-3D is an ongoing effort to expand the capabilities of ModelE2 to handle a broader range of atmospheric conditions, including higher and lower atmospheric pressures, more diverse chemistries and compositions, larger and smaller planet radii and gravity, different rotation rates (from slower to more rapid than modern Earth’s, including synchronous rotation), diverse ocean and land distributions and topographies, and potential basic biosphere functions. The first aim of ROCKE-3D is to model planetary atmospheres on terrestrial worlds within the solar system such as paleo-Earth, modern and paleo-Mars, paleo-Venus, and Saturn’s moon Titan. By validating the model for a broad range of temperatures, pressures, and atmospheric constituents, we can then further expand its capabilities to those exoplanetary rocky worlds that have been discovered in the past, as well as those to be discovered in the future. We also discuss the current and near-future capabilities of ROCKE-3D as a community model for studying planetary and exoplanetary atmospheres.

  4. An agile acquisition decision-support workbench for evaluating ISR effectiveness

    NASA Astrophysics Data System (ADS)

    Stouch, Daniel W.; Champagne, Valerie; Mow, Christopher; Rosenberg, Brad; Serrin, Joshua

    2011-06-01

    The U.S. Air Force is consistently evolving to support current and future operations through the planning and execution of intelligence, surveillance and reconnaissance (ISR) missions. However, it is a challenge to maintain a precise awareness of current and emerging ISR capabilities to properly prepare for future conflicts. We present a decision-support tool for acquisition managers to empirically compare ISR capabilities and approaches to employing them, thereby enabling the DoD to acquire ISR platforms and sensors that provide the greatest return on investment. We have developed an analysis environment to perform modeling and simulation-based experiments to objectively compare alternatives. First, the analyst specifies an operational scenario for an area of operations by providing terrain and threat information; a set of nominated collections; sensor and platform capabilities; and processing, exploitation, and dissemination (PED) capacities. Next, the analyst selects and configures ISR collection strategies to generate collection plans. The analyst then defines customizable measures of effectiveness or performance to compute during the experiment. Finally, the analyst empirically compares the efficacy of each solution and generates concise reports to document their conclusions, providing traceable evidence for acquisition decisions. Our capability demonstrates the utility of using a workbench environment for analysts to design and run experiments. Crafting impartial metrics enables the acquisition manager to focus on evaluating solutions based on specific military needs. Finally, the metric and collection plan visualizations provide an intuitive understanding of the suitability of particular solutions. This facilitates a more agile acquisition strategy that handles rapidly changing technology in response to current military needs.

  5. Leveraging annotation-based modeling with Jump.

    PubMed

    Bergmayr, Alexander; Grossniklaus, Michael; Wimmer, Manuel; Kappel, Gerti

    2018-01-01

    The capability of UML profiles to serve as an annotation mechanism has been recognized in both research and industry. Today's modeling tools offer profiles specific to platforms, such as Java, as they facilitate model-based engineering approaches. However, considering the large number of possible annotations in Java, manually developing the corresponding profiles would only be achievable with huge development and maintenance efforts. Thus, leveraging annotation-based modeling requires an automated approach capable of generating platform-specific profiles from Java libraries. To address this challenge, we present the fully automated transformation chain realized by Jump, thereby continuing existing mapping efforts between Java and UML with an emphasis on annotations and profiles. The evaluation of Jump shows that it scales for large Java libraries and generates profiles of equal or even improved quality compared to profiles currently used in practice. Furthermore, we demonstrate the practical value of Jump by contributing profiles that facilitate reverse engineering and forward engineering processes for the Java platform and by applying it to a modernization scenario.

  6. Productivity and injectivity of horizontal wells. Quarterly report, October 1--December 31, 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fayers, F.J.; Aziz, K.; Hewett, T.A.

    1993-03-10

    A number of activities have been carried out in the last three months. A list outlining these efforts is presented below, followed by a brief description of each activity in the subsequent sections of this report. Progress is being made on the development of a black oil three-phase simulator which will allow the use of a generalized Voronoi grid in the plane perpendicular to a horizontal well. The available analytical solutions in the literature for calculating productivity indices (inflow performance) of horizontal wells have been reviewed. The pseudo-steady state analytic model of Goode and Kuchuk has been applied to an example problem. A general mechanistic two-phase flow model is under development. The model is capable of predicting flow transition boundaries for a horizontal pipe at any inclination angle. It also has the capability of determining pressure drops and holdups for all the flow regimes. A large code incorporating all the features of the model has been programmed and is currently being tested.

  7. Development of a GIS-based spill management information system.

    PubMed

    Martin, Paul H; LeBoeuf, Eugene J; Daniel, Edsel B; Dobbins, James P; Abkowitz, Mark D

    2004-08-30

    Spill Management Information System (SMIS) is a geographic information system (GIS)-based decision support system designed to effectively manage the risks associated with accidental or intentional releases of a hazardous material into an inland waterway. SMIS provides critical planning and impact information to emergency responders in anticipation of, or following such an incident. SMIS couples GIS and database management systems (DBMS) with the 2-D surface water model CE-QUAL-W2 Version 3.1 and the air contaminant model Computer-Aided Management of Emergency Operations (CAMEO) while retaining full GIS risk analysis and interpretive capabilities. Live 'real-time' data links are established within the spill management software to utilize current meteorological information and flowrates within the waterway. Capabilities include rapid modification of modeling conditions to allow for immediate scenario analysis and evaluation of 'what-if' scenarios. The functionality of the model is illustrated through a case study of the Cheatham Reach of the Cumberland River near Nashville, TN.

  8. Development of an Aeroelastic Modeling Capability for Transient Nozzle Side Load Analysis

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen

    2013-01-01

    Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development. Currently there is no fully coupled computational tool to analyze this fluid/structure interaction process. The objective of this study was to develop a fully coupled aeroelastic modeling capability to describe the fluid/structure interaction process during transient nozzle operations. The aeroelastic model comprises three components: a computational fluid dynamics component based on an unstructured-grid, pressure-based computational fluid dynamics formulation; a computational structural dynamics component developed in the framework of modal analysis; and a fluid-structural interface component. The developed aeroelastic model was applied to the transient nozzle startup process of the Space Shuttle Main Engine at sea level. The computed nozzle side loads and the axial nozzle wall pressure profiles from the aeroelastic nozzle are compared with the published rigid-nozzle results, and the impact of the fluid/structure interaction on nozzle side loads is interrogated and presented.
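
    As a hedged sketch of the modal-analysis component mentioned above (the generic textbook form, not necessarily the exact formulation used in the cited work), each structural generalized coordinate $q_i$ is advanced under the projected fluid load:

    \[
      \ddot{q}_i + 2\zeta_i\omega_i\dot{q}_i + \omega_i^{2} q_i
        \;=\; \frac{1}{m_i}\int_{S} \boldsymbol{\phi}_i \cdot \mathbf{f}\,\mathrm{d}S ,
    \]

    where $\omega_i$, $\zeta_i$, $m_i$, and $\boldsymbol{\phi}_i$ are the modal frequency, damping ratio, generalized mass, and mode shape, and $\mathbf{f}$ is the wall load passed across the fluid-structural interface at each time step.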

  9. Thermal niche estimators and the capability of poor dispersal species to cope with climate change

    PubMed Central

    Sánchez-Fernández, David; Rizzo, Valeria; Cieslak, Alexandra; Faille, Arnaud; Fresneda, Javier; Ribera, Ignacio

    2016-01-01

    For management strategies in the context of global warming, accurate predictions of species response are mandatory. However, to date most predictions are based on niche (bioclimatic) models that usually overlook biotic interactions, behavioral adjustments or adaptive evolution, and assume that species can disperse freely without constraints. The deep subterranean environment minimises these uncertainties, as it is simple, homogeneous and has constant environmental conditions. It is thus an ideal model system to study the effect of global change on species with poor dispersal capabilities. We assess the potential fate of a lineage of troglobitic beetles under global change predictions using different approaches to estimate their thermal niche: bioclimatic models, rates of thermal niche change estimated from a molecular phylogeny, and data from physiological studies. Using bioclimatic models, at most 60% of the species were predicted to have suitable conditions in 2080. Considering the rates of thermal niche change did not improve this prediction. However, physiological data suggest that subterranean species have a broad thermal tolerance, allowing them to withstand temperatures never experienced through their evolutionary history. These results stress the need for experimental approaches to assess the capability of poor dispersal species to cope with temperatures outside those they currently experience. PMID:26983802

  10. Development of the NTF-117S Semi-Span Balance

    NASA Technical Reports Server (NTRS)

    Lynn, Keith C.

    2010-01-01

    A new high-capacity semi-span force and moment balance has recently been developed for use at the National Transonic Facility (NTF) at the NASA Langley Research Center. This new semi-span balance provides the NTF a new measurement capability that will support testing of semi-span models in transonic high-lift testing regimes. Future testing utilizing this new balance capability will include active circulation control and propulsion simulation testing of semi-span transonic wing models. The NTF has recently implemented a new high-pressure air delivery station that will provide both high and low mass flow pressure lines routed out to the semi-span models via a set of high/low-pressure bellows indirectly linked to the metric end of the NTF-117S balance. A new check-load stand is currently being developed to provide the NTF with an in-house capability for performing check-loads on the NTF-117S balance in order to determine the pressure tare effects on the overall performance of the balance. An experimental design is being developed that will allow for experimentally assessing the static pressure tare effects on balance performance.

  11. Study of eddy current probes

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Wang, Morgan

    1992-01-01

    The recognition of materials properties still presents a number of problems for nondestructive testing in aerospace systems. This project attempts to utilize current capabilities in eddy current instrumentation, artificial intelligence, and robotics in order to provide insight into defining geometrical aspects of flaws in composite materials which are capable of being evaluated using eddy current inspection techniques.

  12. Nonlinear feedback control for high alpha flight

    NASA Technical Reports Server (NTRS)

    Stalford, Harold

    1990-01-01

    Analytical aerodynamic models are derived from a high-alpha 6-DOF wind tunnel model. One detailed model requires some interpolation between nonlinear functions of alpha. One analytical model requires no interpolation and as such is a completely continuous model. Flight path optimization is conducted for the basic maneuvers: half-loop, 90-degree pitch-up, and level turn. The optimal control analysis uses the derived analytical model in the equations of motion and is based on both moment and force equations. The maximum principle solution for the half-loop is a poststall trajectory that performs the half-loop in 13.6 seconds. The agility provided by thrust vectoring capability had minimal effect on reducing the maneuver time. By means of thrust vectoring control, the 90-degree pitch-up maneuver can be executed within a small space over a short time interval. The agility capability of thrust vectoring is quite beneficial for pitch-up maneuvers. The level turn results are currently based only on outer-layer solutions of the singular perturbation analysis. Poststall solutions provide high turn rates but generate higher energy losses than classical sustained solutions.

  13. Requirements for multi-level systems pharmacology models to reach end-usage: the case of type 2 diabetes.

    PubMed

    Nyman, Elin; Rozendaal, Yvonne J W; Helmlinger, Gabriel; Hamrén, Bengt; Kjellsson, Maria C; Strålfors, Peter; van Riel, Natal A W; Gennemark, Peter; Cedersund, Gunnar

    2016-04-06

    We are currently in the middle of a major shift in biomedical research: unprecedented and rapidly growing amounts of data may be obtained today, from in vitro, in vivo and clinical studies, at molecular, physiological and clinical levels. To make use of these large-scale, multi-level datasets, corresponding multi-level mathematical models are needed, i.e. models that simultaneously capture multiple layers of the biological, physiological and disease-level organization (also referred to as quantitative systems pharmacology-QSP-models). However, today's multi-level models are not yet embedded in end-usage applications, neither in drug research and development nor in the clinic. Given the expectations and claims made historically, this slow adoption may seem surprising. Therefore, we herein consider a specific example-type 2 diabetes-and critically review the current status and identify key remaining steps for these models to become mainstream in the future. This overview reveals how, today, we may use models to ask scientific questions concerning, e.g., the cellular origin of insulin resistance, and how this translates to the whole-body level and short-term meal responses. However, before these multi-level models can become truly useful, they need to be linked with the capabilities of other important existing models, in order to make them 'personalized' (e.g. specific to certain patient phenotypes) and capable of describing long-term disease progression. To be useful in drug development, it is also critical that the developed models and their underlying data and assumptions are easily accessible. For clinical end-usage, in addition, model links to decision-support systems combined with the engagement of other disciplines are needed to create user-friendly and cost-efficient software packages.

  14. Array Effects in Large Wind Farms. Cooperative Research and Development Final Report, CRADA Number CRD-09-343

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moriarty, Patrick

    2016-02-23

    The effects of wind turbine wakes within operating wind farms have a substantial impact on the overall energy production of the farm. The current generation of models drastically underpredicts the impact of these wakes, leading to non-conservative estimates of energy capture and financial losses to wind farm operators and developers. To improve these models, detailed research on operating wind farms is necessary. Rebecca Barthelmie of Indiana University is a world leader in wind farm wake effects and would like to partner with NREL to help improve wind farm modeling by gathering additional wind farm data, developing better models, and increasing collaboration with European researchers working in the same area. This is currently an active area of research at NREL, and the capabilities of both parties should mesh nicely.

  15. Study report on combining diagnostic and therapeutic considerations with subsystem and whole-body simulation

    NASA Technical Reports Server (NTRS)

    Furukawa, S.

    1975-01-01

    Current applications of simulation models to clinical research are described, including tilt-model simulation of orthostatic intolerance with hemorrhage and modeling of long-term circulatory regulation. Current capabilities include: (1) simulation of analogous pathological states and the effects of abnormal environmental stressors by manipulating system variables and changing inputs in various sequences; (2) simulation of the time courses of responses of controlled variables to the altered inputs and their relationships; (3) simulation of physiological responses to treatment such as isotonic saline transfusion; (4) simulation of the effectiveness of a treatment as well as the effects of complications superimposed on an existing pathological state; and (5) comparison of the effectiveness of various treatments/countermeasures for a given pathological state. The feasibility of applying simulation models to diagnostic and therapeutic research problems is assessed.

  16. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  17. Bringing modeling to the masses: A web based system to predict potential species distributions

    USGS Publications Warehouse

    Graham, Jim; Newman, Greg; Kumar, Sunil; Jarnevich, Catherine S.; Young, Nick; Crall, Alycia W.; Stohlgren, Thomas J.; Evangelista, Paul

    2010-01-01

    Predicting current and potential species distributions and abundance is critical for managing invasive species, preserving threatened and endangered species, and conserving native species and habitats. Accurate predictive models are needed at local, regional, and national scales to guide field surveys, improve monitoring, and set priorities for conservation and restoration. Modeling capabilities, however, are often limited by access to software and environmental data required for predictions. To address these needs, we built a comprehensive web-based system that: (1) maintains a large database of field data; (2) provides access to field data and a wealth of environmental data; (3) accesses values in rasters representing environmental characteristics; (4) runs statistical spatial models; and (5) creates maps that predict the potential species distribution. The system is available online at www.niiss.org, and provides web-based tools for stakeholders to create potential species distribution models and maps under current and future climate scenarios.
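
    As an illustration of the kind of statistical spatial model such a system runs behind the scenes (a hedged sketch with synthetic data, not the site's actual implementation), the Python snippet below fits a logistic-regression presence/absence model to two environmental predictors and evaluates habitat suitability at a query location:

    ```python
    # Hedged sketch of a simple species distribution model: logistic regression
    # on environmental predictors (synthetic stand-ins for raster values).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 500
    temperature = rng.normal(12.0, 4.0, n)        # deg C, placeholder predictor
    precipitation = rng.normal(600.0, 150.0, n)   # mm/yr, placeholder predictor
    X = np.column_stack([temperature, precipitation])

    # Synthetic "truth": the species prefers warm, wet sites.
    logit = 0.6 * (temperature - 12.0) + 0.01 * (precipitation - 600.0)
    presence = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    model = LogisticRegression().fit(X, presence)
    suitability = model.predict_proba([[15.0, 700.0]])[0, 1]
    print(f"Predicted habitat suitability at the query site: {suitability:.2f}")
    ```

    In the deployed system, the predictor values would come from the environmental rasters described above rather than from random draws.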

  18. Report on results of current and future metal casting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unal, Cetin; Carlson, Neil N.

    2015-09-28

    New modeling capabilities needed to simulate the casting of metallic fuels are added to the Truchas code. In this report we summarize improvements made in FY2015 in three areas: (1) analysis of new casting experiments conducted with the BCS and EFL designs; (2) simulation of INL’s U-Zr casting experiments with the Flow3D computer program; and (3) implementation of a surface tension model in Truchas for the unstructured meshes required to run U-Zr casting simulations.

  19. Estimating the Uncertainty and Predictive Capabilities of Three-Dimensional Earth Models (Postprint)

    DTIC Science & Technology

    2012-03-22

    www.isc.ac.uk). This global database includes more than 7,000 events whose epicentral location accuracy is known to at least 5 km. GT events with...region, which illustrates the difficulty of validating a model with travel times alone. However, the IASPEI REL database is currently the highest... [Figure caption fragment: P and S (right) paths in the IASPEI REL ground-truth database; stations are represented by purple triangles and events by gray circles.]

  20. Common world model for unmanned systems: Phase 2

    NASA Astrophysics Data System (ADS)

    Dean, Robert M. S.; Oh, Jean; Vinokurov, Jerry

    2014-06-01

    The Robotics Collaborative Technology Alliance (RCTA) seeks to provide adaptive robot capabilities which move beyond traditional metric algorithms to include cognitive capabilities. Key to this effort is the Common World Model, which moves beyond the state-of-the-art by representing the world using semantic and symbolic as well as metric information. It joins these layers of information to define objects in the world. These objects may be reasoned upon jointly using traditional geometric, symbolic cognitive algorithms and new computational nodes formed by the combination of these disciplines to address Symbol Grounding and Uncertainty. The Common World Model must understand how these objects relate to each other. It includes the concept of Self-Information about the robot. By encoding current capability, component status, task execution state, and their histories, we track information which enables the robot to reason and adapt its performance using Meta-Cognition and Machine Learning principles. The world model also includes models of how entities in the environment behave which enable prediction of future world states. To manage complexity, we have adopted a phased implementation approach. Phase 1, published in these proceedings in 2013 [1], presented the approach for linking metric with symbolic information and interfaces for traditional planners and cognitive reasoning. Here we discuss the design of "Phase 2" of this world model, which extends the Phase 1 design's API and data structures and reviews the use of the Common World Model as part of a semantic navigation use case.

  1. Damage prognosis: the future of structural health monitoring.

    PubMed

    Farrar, Charles R; Lieven, Nick A J

    2007-02-15

    This paper concludes the theme issue on structural health monitoring (SHM) by discussing the concept of damage prognosis (DP). DP attempts to forecast system performance by assessing the current damage state of the system (i.e. SHM), estimating the future loading environments for that system, and predicting through simulation and past experience the remaining useful life of the system. The successful development of a DP capability will require the further development and integration of many technology areas including both measurement/processing/telemetry hardware and a variety of deterministic and probabilistic predictive modelling capabilities, as well as the ability to quantify the uncertainty in these predictions. The multidisciplinary and challenging nature of the DP problem, its current embryonic state of development, and its tremendous potential for life-safety and economic benefits qualify DP as a 'grand challenge' problem for engineers in the twenty-first century.

  2. Space Weather - Current Capabilities, Future Requirements, and the Path to Improved Forecasting

    NASA Astrophysics Data System (ADS)

    Mann, Ian

    2016-07-01

    We present an overview of Space Weather activities and future opportunities including assessments of current status and capabilities, knowledge gaps, and future directions in relation to both observations and modeling. The review includes input from the scientific community including from SCOSTEP scientific discipline representatives (SDRs), COSPAR Main Scientific Organizers (MSOs), and SCOSTEP/VarSITI leaders. The presentation also draws on results from the recent activities related to the production of the COSPAR-ILWS Space Weather Roadmap "Understanding Space Weather to Shield Society" [Schrijver et al., Advances in Space Research 55, 2745 (2015) http://dx.doi.org/10.1016/j.asr.2015.03.023], from the activities related to the United Nations (UN) Committee on the Peaceful Uses of Outer Space (COPUOS) actions in relation to the Long-term Sustainability of Outer Space (LTS), and most recently from the newly formed and ongoing efforts of the UN COPUOS Expert Group on Space Weather.

  3. Understanding the Flushing Capability of Bellingham Bay and Its Implication on Bottom Water Hypoxia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Taiping; Yang, Zhaoqing

    2015-05-05

    In this study, an unstructured-grid finite-volume coastal ocean model (FVCOM) was used to simulate hydrodynamic circulation and assess the flushing capability in Bellingham Bay, Washington, USA. The model was reasonably calibrated against field observations for water level, velocity and salinity, and was further used to calculate residence time distributions in the study site. The model results suggest that, despite the large tidal ranges (~4 m during spring tide), tidal currents are relatively weak in Bellingham Bay, with surface currents generally below 0.5 m/s. The local residence time in Bellingham Bay varies from near zero to as long as 15 days, depending on the location and river flow condition. In general, Bellingham Bay is a well-flushed coastal embayment affected by freshwater discharge, tides, wind, and density-driven circulation. The basin-wide global residence time ranges from 5 to 7 days. The model results also provide useful information on possible causes of the emerging summertime hypoxia problem in the north central region of Bellingham Bay. It was concluded that the bottom hypoxic water likely forms when oxygen in the already low-dissolved-oxygen bottom oceanic inflow is consumed at an increased rate by organic matter accumulated in regions characterized by relatively long residence times during summer months.

  4. Steel Shear Walls, Behavior, Modeling and Design

    NASA Astrophysics Data System (ADS)

    Astaneh-Asl, Abolhassan

    2008-07-01

    In recent years steel shear walls have become one of the more efficient lateral load resisting systems in tall buildings. The basic steel shear wall system consists of a steel plate welded to boundary steel columns and boundary steel beams. In some cases the boundary columns have been concrete-filled steel tubes. Seismic behavior of steel shear wall systems, observed during actual earthquakes and in laboratory cyclic tests, indicates that the systems are quite ductile and can be designed in an economical way to have sufficient stiffness, strength, ductility and energy dissipation capacity to resist the seismic effects of strong earthquakes. This paper, after summarizing the past research, presents the results of two tests of an innovative steel shear wall system in which the boundary elements are concrete-filled tubes. Then, a review of currently available analytical models of steel shear walls is provided, with a discussion of the capabilities and limitations of each model. We have observed that the tension-only "strip model", which forms the basis of the current AISC seismic design provisions for steel shear walls, is not capable of predicting the behavior of steel shear walls with length-to-thickness ratios less than about 600, which is the range most common in buildings. The main reason for this shortcoming of the AISC seismic design provisions is that the strip model ignores the compression field in the shear wall, which can be significant in typical shear walls. The AISC method also is not capable of incorporating stresses in the shear wall due to overturning moments. A more rational seismic design procedure for steel shear walls, proposed in 2000 by the author, is summarized in the paper. The design method, based on procedures used for the design of steel plate girders, takes into account both tension and compression stress fields and is applicable to all values of length-to-thickness ratio of steel shear walls. The method is also capable of including the effect of overturning moments and any normal forces that might act on the steel shear wall.

  5. Impact design methods for ceramic components in gas turbine engines

    NASA Technical Reports Server (NTRS)

    Song, J.; Cuccio, J.; Kington, H.

    1991-01-01

    Methods currently under development to design ceramic turbine components with improved impact resistance are presented. Two different modes of impact damage are identified and characterized, i.e., structural damage and local damage. The entire computation is incorporated into the EPIC computer code. Model capability is demonstrated by simulating instrumented plate impact and particle impact tests.

  6. The Development of the Non-hydrostatic Unified Model of the Atmosphere (NUMA)

    DTIC Science & Technology

    2011-09-19

    capabilities: (1) highly scalable on current and future computer architectures (exascale computing, meaning CPUs and GPUs); (2) flexibility to use a... [Slide residue: "From Terascale to Petascale/Exascale Computing"; 10 of the Top 500 systems are already in the petascale range, and 3 of the top 10 are GPU-based machines.]

  7. Southern Forest Resource Assessment and Linkages to the National RPA

    Treesearch

    Fredrick Cubbage; Jacek Siry; Steverson Moffat; David N. Wear; Robert Abt

    1998-01-01

    We developed a Southern Forest Resource Assessment Consortium (SOFAC) in 1994, which is designed to enhance our capabilities to analyze and model the southern forest and timber resources. Southern growth and yield analyses prepared for the RPA via SOFAC indicate that substantial increases in timber productivity can occur given current technology. A survey about NIPF...

  8. CFD for hypersonic airbreathing aircraft

    NASA Technical Reports Server (NTRS)

    Kumar, Ajay

    1989-01-01

    A general discussion is given on the use of advanced computational fluid dynamics (CFD) in analyzing the hypersonic flow field around an airbreathing aircraft. Unique features of the hypersonic flow physics are presented and an assessment is given of the current algorithms in terms of their capability to model hypersonic flows. Several examples of advanced CFD applications are then presented.

  9. Methods of treating complex space vehicle geometry for charged particle radiation transport

    NASA Technical Reports Server (NTRS)

    Hill, C. W.

    1973-01-01

    Current methods of treating complex geometry models for space radiation transport calculations are reviewed. The geometric techniques used in three computer codes are outlined. Evaluations of geometric capability and speed are provided for these codes. Although no code development work is included, several suggestions for significantly improving complex geometry codes are offered.

  10. Local interconnection neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang Jiajun; Zhang Li; Yan Dapen

    1993-06-01

    The idea of a local interconnection neural network (LINN) is presented and compared with the globally interconnected Hopfield model. Under the storage limit requirement, LINN is shown to offer the same associative memory capability as the global interconnection neural network while having a much smaller interconnection matrix. LINN can be readily implemented optically using currently available spatial light modulators. 15 refs.

  11. The Effect of Teachers' Shared Leadership Perception on Academic Optimism and Organizational Citizenship Behaviour: A Turkish Case

    ERIC Educational Resources Information Center

    Akin Kösterelioglu, Meltem

    2017-01-01

    Purpose: The present study investigates the capability of high school teachers' shared leadership perception to predict their academic optimism and organizational citizenship levels. Research methods: The population of the current descriptive study, which was conducted via the screening model, consists of 321 high school teachers working for Amasya…

  12. A fast, calibrated model for pyroclastic density currents kinematics and hazard

    NASA Astrophysics Data System (ADS)

    Esposti Ongaro, Tomaso; Orsucci, Simone; Cornolti, Fulvio

    2016-11-01

    Multiphase flow models represent valuable tools for the study of the complex, non-equilibrium dynamics of pyroclastic density currents (PDCs). Particle sedimentation, flow stratification and rheological changes, depending on the flow regime, interaction with topographic obstacles, turbulent air entrainment, buoyancy reversal, and other complex features of pyroclastic currents can be simulated in two and three dimensions, by exploiting efficient numerical solvers and the improved computational capability of modern supercomputers. However, numerical simulations of polydisperse gas-particle mixtures are quite computationally expensive, so that their use in hazard assessment studies (where there is the need to evaluate the probability of hazardous actions over hundreds of possible scenarios) is still challenging. To this aim, a simplified integral (box) model can be used, under the appropriate hypotheses, to describe the kinematics of pyroclastic density currents over a flat topography, their scaling properties and their depositional features. In this work, multiphase flow simulations are used to evaluate integral model approximations, to calibrate its free parameters and to assess the influence of the input data on the results. Two-dimensional numerical simulations describe the generation and decoupling of a dense, basal layer (formed by progressive particle sedimentation) from the dilute transport system. In the Boussinesq regime (i.e., for solid mass fractions below about 0.1), the current Froude number (i.e., the ratio between the current inertia and buoyancy) does not strongly depend on initial conditions and is consistent with that measured in laboratory experiments (i.e., between 1.05 and 1.2). For higher density ratios (solid mass fraction in the range 0.1-0.9) but still in a relatively dilute regime (particle volume fraction lower than 0.01), numerical simulations demonstrate that the box model is still applicable, but the Froude number depends on the reduced gravity. When the box model is appropriately calibrated with the numerical simulation results, the prediction of the flow runout is fairly accurate and the model predicts a rapid, non-linear decay of the flow kinetic energy (or dynamic pressure) with distance from the source. The capability of PDCs to overcome topographic obstacles can thus be analysed in the framework of the energy-conoid approach, in which the predicted kinetic energy of the flow front is compared with the potential energy jump associated with the elevated topography to derive a condition for blocking. Model results show that, although preferable to the energy-cone approach, the energy-conoid approach still has some serious limitations, mostly associated with the behaviour of the flow head. Implications of these outcomes are discussed in the context of probabilistic hazard assessment studies, in which a calibrated box model can be used as a fast pyroclastic density current emulator for Monte Carlo simulations.
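
    Written out in its standard form (notation ours; the assumptions are those stated in the abstract), the box-model scaling calibrated here relates the front velocity $U$ of a current of thickness $h$ to its reduced gravity:

    \[
      \mathrm{Fr} \;=\; \frac{U}{\sqrt{g'\,h}},
      \qquad
      g' \;=\; g\,\frac{\rho_c - \rho_a}{\rho_a},
    \]

    with $\rho_c$ and $\rho_a$ the bulk densities of the current and the ambient air; the simulations indicate $\mathrm{Fr}\approx 1.05$-$1.2$ in the Boussinesq regime, with an additional dependence on $g'$ at higher density ratios.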

  13. Integrated Workforce Modeling System

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.

    2000-01-01

    There are several computer-based systems, currently in various phases of development at KSC, which encompass some component, aspect, or function of workforce modeling. These systems may offer redundant capabilities and/or incompatible interfaces. A systems approach to workforce modeling is necessary in order to identify and better address user requirements. This research has consisted of two primary tasks. Task 1 provided an assessment of existing and proposed KSC workforce modeling systems for their functionality and applicability to the workforce planning function. Task 2 resulted in the development of a proof-of-concept design for a systems approach to workforce modeling. The model incorporates critical aspects of workforce planning, including hires, attrition, and employee development.

  14. A Methodology for Modeling the Flow of Military Personnel Across Air Force Active and Reserve Components

    DTIC Science & Technology

    2016-01-01

    user, the model will use surplus inventory in one category to fill shortfalls in other categories. The TFBL model also has the capability to allow...the total force. As shown in Table 2.1, we used five-character AFSCs (four digits plus the suffix) to break out pilots in the current force as... [Table 2.1 residue: pilot categories broken out by 4th digit and suffix, including a Training Unit category and whether a 4th digit of 2 or 3 matches (or does not match) a specific aircraft designation.]

  15. Direct use of linear time-domain aerodynamics in aeroservoelastic analysis: Aerodynamic model

    NASA Technical Reports Server (NTRS)

    Woods, J. A.; Gilbert, Michael G.

    1990-01-01

    The work presented here is the first part of a continuing effort to expand existing capabilities in aeroelasticity by developing the methodology which is necessary to utilize unsteady time-domain aerodynamics directly in aeroservoelastic design and analysis. The ultimate objective is to define a fully integrated state-space model of an aeroelastic vehicle's aerodynamics, structure and controls which may be used to efficiently determine the vehicle's aeroservoelastic stability. Here, the current status of developing a state-space model for linear or near-linear time-domain indicial aerodynamic forces is presented.
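
    A hedged sketch of the target formulation (the generic linear time-invariant form, not the paper's specific matrices): the integrated aeroservoelastic model is sought as a state-space system

    \[
      \dot{\mathbf{x}} \;=\; \mathbf{A}\,\mathbf{x} + \mathbf{B}\,\mathbf{u},
      \qquad
      \mathbf{y} \;=\; \mathbf{C}\,\mathbf{x} + \mathbf{D}\,\mathbf{u},
    \]

    in which the state vector $\mathbf{x}$ collects structural modal coordinates, aerodynamic lag states fitted to the time-domain indicial responses, and control-system states, so that aeroservoelastic stability can be assessed directly from the eigenvalues of $\mathbf{A}$.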

  16. The brain and the law.

    PubMed Central

    Chorvat, Terrence; McCabe, Kevin

    2004-01-01

    Much has been written about how law as an institution has developed to solve many problems that human societies face. Inherent in all of these explanations are models of how humans make decisions. This article discusses what current neuroscience research tells us about the mechanisms of human decision making of particular relevance to law. This research indicates that humans are both more capable of solving many problems than standard economic models predict, but also limited in ways those models ignore. This article discusses how law is both shaped by our cognitive processes and also shapes them. The article considers some of the implications of this research for improving our understanding of how our current legal regimes operate and how the law can be structured to take advantage of our neural mechanisms to improve social welfare. PMID:15590613

  17. Lunar exploration rover program developments

    NASA Technical Reports Server (NTRS)

    Klarer, P. R.

    1994-01-01

    The Robotic All Terrain Lunar Exploration Rover (RATLER) design concept began at Sandia National Laboratories in late 1991 with a series of small, proof-of-principle, working scale models. The models proved the viability of the concept for high mobility through mechanical simplicity, and eventually received internal funding at Sandia National Laboratories for full scale, proof-of-concept prototype development. Whereas the proof-of-principle models demonstrated the mechanical design's capabilities for mobility, the full scale proof-of-concept design currently under development is intended to support field operations for experiments in telerobotics, autonomous robotic operations, telerobotic field geology, and advanced man-machine interface concepts. The development program's current status is described, including an outline of the program's work over the past year, recent accomplishments, and plans for follow-on development work.

  18. A Generalized Framework for Modeling Next Generation 911 Implementations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelic, Andjelka; Aamir, Munaf Syed

    This document summarizes the current state of Sandia 911 modeling capabilities and then addresses key aspects of Next Generation 911 (NG911) architectures for expansion of existing models. Analysis of three NG911 implementations was used to inform heuristics, associated key data requirements, and assumptions needed to capture NG911 architectures in the existing models. Modeling of NG911 necessitates careful consideration of its complexity and the diversity of implementations. Draft heuristics for constructing NG911 models are presented based on the analysis, along with a summary of current challenges and ways to improve future NG911 modeling efforts. We found that NG911 relies on Enhanced 911 (E911) assets such as 911 selective routers to route calls originating from traditional telephony service, which account for a majority of 911 calls. We also found that the diversity and transitional nature of NG911 implementations necessitates significant and frequent data collection to ensure that adequate models are available for crisis action support.

  19. An integrated decision model for the application of airborne sensors for improved response to accidental and terrorist chemical vapor releases

    NASA Astrophysics Data System (ADS)

    Kapitan, Loginn

    This research created a new model that provides an integrated approach to planning the effective selection and employment of airborne sensor systems in response to accidental or intentional chemical vapor releases. The approach taken was to use systems engineering and decision analysis methods to construct a model architecture that produced a modular structure for integrating both new and existing components into a logical procedure to assess the application of airborne sensor systems to chemical vapor hazards. The resulting integrated process model includes an internal aggregation model which allowed differentiation among alternative airborne sensor systems. Both models were developed and validated by experts and demonstrated using appropriate hazardous chemical release scenarios. The resultant prototype integrated process model, or system, fills a current gap in capability, allowing improved planning, training, and exercises for HAZMAT teams and first responders when considering the selection and employment of airborne sensor systems. Through the research process, insights into the current response structure and how current airborne capability may be most effectively used were generated. Furthermore, the resultant prototype system is tailorable for local, state, and federal application, and can potentially be modified to help evaluate investments in new airborne sensor technology and systems. Better planning, training, and preparedness exercising hold the prospect for the effective application of airborne assets and improved response to large-scale chemical release incidents. Improved response will result in fewer casualties and lives lost, reduced economic impact, and increased protection of critical infrastructure when faced with accidental and intentional terrorist releases of hazardous industrial chemicals. With the prospect of more airborne sensor systems becoming available, this prototype system integrates existing and new tools into an effective process for the selection and employment of airborne sensors to better plan, train and exercise ahead of potential chemical release events.

  20. Top-level modeling of an als system utilizing object-oriented techniques

    NASA Astrophysics Data System (ADS)

    Rodriguez, L. F.; Kang, S.; Ting, K. C.

    The possible configuration of an Advanced Life Support (ALS) System capable of supporting human life for long-term space missions continues to evolve as researchers investigate potential technologies and configurations. To facilitate the decision process, the development of acceptable, flexible, and dynamic mathematical computer modeling tools capable of system-level analysis is desirable. Object-oriented techniques have been adopted to develop a dynamic top-level model of an ALS system. This approach has several advantages; among these, object-oriented abstractions of systems are inherently modular in architecture. Thus, models can initially be somewhat simplistic, while allowing for adjustments and improvements. In addition, by coding the model in Java, the model can be implemented via the World Wide Web, greatly encouraging its utilization. Systems analysis is further enabled with the utilization of a readily available backend database containing information supporting the model. The subsystem models of the ALS system model include Crew, Biomass Production, Waste Processing and Resource Recovery, Food Processing and Nutrition, and the Interconnecting Space. Each subsystem model and an overall model have been developed. Presented here are the procedure utilized to develop the modeling tool, the vision of the modeling tool, and the current focus of each of the subsystem models.
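
    The cited model is implemented in Java; purely to illustrate the modular, object-oriented subsystem structure it describes, the hedged Python sketch below gives each subsystem a common step interface and lets a top-level model advance them against shared stores. The class names follow the subsystems listed above, while the rates and store names are invented placeholders.

    ```python
    # Hedged sketch of a modular, object-oriented top-level ALS model
    # (not the cited Java implementation). Rates are arbitrary placeholders.
    class Crew:
        def __init__(self, size):
            self.size = size

        def step(self, hours, stores):
            stores["food_kg"] -= 0.62 * self.size * hours / 24.0
            stores["o2_kg"]   -= 0.84 * self.size * hours / 24.0
            stores["co2_kg"]  += 1.00 * self.size * hours / 24.0

    class BiomassProduction:
        def __init__(self, area_m2):
            self.area_m2 = area_m2

        def step(self, hours, stores):
            stores["co2_kg"]  -= 0.020 * self.area_m2 * hours / 24.0
            stores["o2_kg"]   += 0.015 * self.area_m2 * hours / 24.0
            stores["food_kg"] += 0.010 * self.area_m2 * hours / 24.0

    class ALSSystemModel:
        def __init__(self, subsystems, stores):
            self.subsystems = subsystems
            self.stores = dict(stores)

        def run(self, days, dt_hours=24):
            for _ in range(int(days * 24 / dt_hours)):
                for subsystem in self.subsystems:
                    subsystem.step(dt_hours, self.stores)
            return self.stores

    model = ALSSystemModel([Crew(4), BiomassProduction(40.0)],
                           {"food_kg": 200.0, "o2_kg": 150.0, "co2_kg": 5.0})
    print(model.run(days=30))
    ```

    The modularity claimed for the object-oriented approach shows up here as the ability to swap in additional subsystems (waste processing, food processing, interconnecting space) without touching the top-level driver.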

  1. Software For Computing Reliability Of Other Software

    NASA Technical Reports Server (NTRS)

    Nikora, Allen; Antczak, Thomas M.; Lyu, Michael

    1995-01-01

    Computer Aided Software Reliability Estimation (CASRE) computer program developed for use in measuring reliability of other software. Easier for non-specialists in reliability to use than many other currently available programs developed for same purpose. CASRE incorporates mathematical modeling capabilities of public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in Windows software environment. Provides menu-driven command interface; enabling and disabling of menu options guides user through (1) selection of set of failure data, (2) execution of mathematical model, and (3) analysis of results from model. Written in C language.
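
    To make the three-step workflow above concrete (a hedged sketch, not CASRE or SMERFS themselves), the Python snippet below fits one classical software reliability growth model, the Goel-Okumoto mean-value function m(t) = a(1 - exp(-b t)), to a hypothetical cumulative-failure dataset:

    ```python
    # Hedged sketch: fitting the Goel-Okumoto reliability growth model
    # m(t) = a * (1 - exp(-b * t)) to cumulative failure counts (synthetic data).
    import numpy as np
    from scipy.optimize import curve_fit

    def goel_okumoto(t, a, b):
        return a * (1.0 - np.exp(-b * t))

    # Hypothetical test history: weeks of testing vs. cumulative failures observed.
    weeks = np.arange(1, 11, dtype=float)
    cum_failures = np.array([9, 17, 23, 28, 32, 35, 37, 39, 40, 41], dtype=float)

    (a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, cum_failures, p0=(50.0, 0.2))
    print(f"Estimated total faults a = {a_hat:.1f}, detection rate b = {b_hat:.3f}")
    print(f"Estimated faults remaining: {a_hat - cum_failures[-1]:.1f}")
    ```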

  2. Computer modeling of heat pipe performance

    NASA Technical Reports Server (NTRS)

    Peterson, G. P.

    1983-01-01

    A parametric study of the defining equations which govern the steady state operational characteristics of the Grumman monogroove dual passage heat pipe is presented. These defining equations are combined to develop a mathematical model which describes and predicts the operational and performance capabilities of a specific heat pipe given the necessary physical characteristics and working fluid. Included is a brief review of the current literature, a discussion of the governing equations, and a description of both the mathematical and computer model. Final results of preliminary test runs of the model are presented and compared with experimental tests on actual prototypes.
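
    As a hedged indication of the governing balance such a model enforces (standard heat pipe theory in its generic form, not the specific Grumman monogroove equations), steady operation requires the capillary pressure rise available at the liquid-vapor meniscus to exceed the sum of the liquid, vapor, and gravitational pressure losses around the loop:

    \[
      \Delta P_{c,\max} \;=\; \frac{2\sigma}{r_c}
      \;\ge\; \Delta P_{l} + \Delta P_{v} + \Delta P_{g},
    \]

    where $\sigma$ is the surface tension of the working fluid and $r_c$ the effective capillary radius; the maximum heat transport capability is the load at which this inequality is first violated.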

  3. Incorporation of Failure Into an Orthotropic Three-Dimensional Model with Tabulated Input Suitable for Use in Composite Impact Problems

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Carney, Kelly S.; Dubois, Paul; Hoffarth, Canio; Khaled, Bilal; Shyamsunder, Loukham; Rajan, Subramaniam; Blankenhorn, Gunther

    2017-01-01

    The need for accurate material models to simulate the deformation, damage and failure of polymer matrix composites under impact conditions is becoming critical as these materials are gaining increased use in the aerospace and automotive communities. The aerospace community has identified several key capabilities which are currently lacking in the available material models in commercial transient dynamic finite element codes. To attempt to improve the predictive capability of composite impact simulations, a next generation material model is being developed for incorporation within the commercial transient dynamic finite element code LS-DYNA. The material model, which incorporates plasticity, damage and failure, utilizes experimentally based tabulated input to define the evolution of plasticity and damage and the initiation of failure as opposed to specifying discrete input parameters such as modulus and strength. The plasticity portion of the orthotropic, three-dimensional, macroscopic composite constitutive model is based on an extension of the Tsai-Wu composite failure model into a generalized yield function with a non-associative flow rule. For the damage model, a strain equivalent formulation is used to allow for the uncoupling of the deformation and damage analyses. In the damage model, a semi-coupled approach is employed where the overall damage in a particular coordinate direction is assumed to be a multiplicative combination of the damage in that direction resulting from the applied loads in various coordinate directions. For the failure model, a tabulated approach is utilized in which a stress or strain based invariant is defined as a function of the location of the current stress state in stress space to define the initiation of failure. Failure surfaces can be defined with any arbitrary shape, unlike traditional failure models where the mathematical functions used to define the failure surface impose a specific shape on the failure surface. In the current paper, the complete development of the failure model is described and the generation of a tabulated failure surface for a representative composite material is discussed.
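
    For context, the Tsai-Wu criterion that the generalized yield function extends can be written in contracted (Voigt) notation as

    \[
      f(\boldsymbol{\sigma}) \;=\; F_i\,\sigma_i \;+\; F_{ij}\,\sigma_i\,\sigma_j \;=\; 1,
      \qquad i, j = 1,\dots,6,
    \]

    where the coefficients $F_i$ and $F_{ij}$ are determined from the uniaxial and shear strengths of the orthotropic composite; in the tabulated model described above, these fixed coefficients are replaced by experimentally based tabulated input that evolves with the deformation state.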

  4. Coastal Zone Color Scanner

    NASA Technical Reports Server (NTRS)

    Johnson, B.

    1988-01-01

The Coastal Zone Color Scanner (CZCS) spacecraft ocean color instrument is capable of measuring and mapping global ocean surface chlorophyll concentration. It is a scanning radiometer with multiband capability. With new electronics and some mechanical and optical re-work, it can probably be made flight worthy. Some additional components of a second flight model are also available. An engineering study and further tests are necessary to determine exactly what effort is required to properly prepare the instrument for spaceflight and the nature of interfaces to prospective spacecraft. The CZCS provides operational instrument capability for monitoring of ocean productivity and currents. It could be a simple, low-cost alternative to developing new instruments for ocean color imaging. Researchers have determined that with global ocean color data they can: specify quantitatively the role of oceans in the global carbon cycle and other major biogeochemical cycles; determine the magnitude and variability of annual primary production by marine phytoplankton on a global scale; understand the fate of fluvial nutrients and their possible effect on carbon budgets; elucidate the coupling mechanism between upwelling and large scale patterns in ocean basins; answer questions concerning the large scale distribution and timing of spring blooms in the global ocean; acquire a better understanding of the processes associated with mixing along the edge of eddies, coastal currents, western boundary currents, etc.; and acquire global data on marine optical properties.

  5. New single-aircraft integrated atmospheric observation capabilities

    NASA Astrophysics Data System (ADS)

    Wang, Z.

    2011-12-01

Improving current weather and climate model capabilities requires a better understanding of many atmospheric processes. Thus, advancing atmospheric observation capabilities has been regarded as one of the highest imperatives for advancing atmospheric science in the 21st century. Under NSF CAREER support, we focus on developing new airborne observation capabilities through the development of new instrumentation and the single-aircraft integration of multiple remote sensors with in situ probes. Two compact Wyoming cloud lidars were built to work together with a 183 GHz microwave radiometer, a multi-beam Wyoming cloud radar and in situ probes for cloud studies. The synergy of these remote sensor measurements allows us to better resolve the vertical structure of cloud microphysical properties and cloud-scale dynamics. Together with detailed in situ data for aerosol, cloud, water vapor and dynamics, we developed the most advanced observational capability to study cloud-scale properties and processes from a single aircraft (Fig. 1). A compact Raman lidar was also built to work together with in situ sampling to characterize boundary layer aerosol and water vapor distributions for studies of many important atmospheric processes, such as air-sea interaction and convective initiation. Case studies will be presented to illustrate these new observation capabilities.

  6. Structure of High Latitude Currents in Magnetosphere-Ionosphere Models

    NASA Astrophysics Data System (ADS)

    Wiltberger, M.; Rigler, E. J.; Merkin, V.; Lyon, J. G.

    2017-03-01

Using three resolutions of the Lyon-Fedder-Mobarry global magnetosphere-ionosphere model (LFM) and the Weimer 2005 empirical model, we examine the structure of the high latitude field-aligned current patterns. Each resolution was run for the entire Whole Heliosphere Interval, which contained two high speed solar wind streams and modest interplanetary magnetic field strengths. Average states of the field-aligned current (FAC) patterns for 8 interplanetary magnetic field clock angle directions are computed using data from these runs. Generally speaking, the patterns obtained agree well with results from the Weimer 2005 model computed using the solar wind and IMF conditions that correspond to each bin. As the simulation resolution increases, the currents become more intense and narrow. A machine learning analysis of the FAC patterns shows that the ratio of Region 1 (R1) to Region 2 (R2) currents decreases as the simulation resolution increases. This brings the simulation results into better agreement with observational predictions and the Weimer 2005 model results. The increase in R2 current strengths also results in the cross polar cap potential (CPCP) pattern being concentrated at higher latitudes. Current-voltage relationships between the R1 currents and the CPCP are quite similar at the higher resolutions, indicating the simulation is converging on a common solution. We conclude that LFM simulations are capable of reproducing the statistical features of FAC patterns.
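
    A rough sketch of the binning and averaging step described above, assuming synthetic FAC maps and IMF data; the array names, shapes, and bin convention are illustrative, not those of the LFM analysis.

```python
# Bin simulated field-aligned current maps by IMF clock angle (atan2(By, Bz)) into
# 8 sectors and average within each sector. The data here are random stand-ins.
import numpy as np

def clock_angle_bins(by, bz, n_bins=8):
    """Return an integer sector index (0..n_bins-1) per sample from IMF By, Bz [nT]."""
    theta = np.degrees(np.arctan2(by, bz)) % 360.0      # 0 deg = purely northward IMF
    shifted = (theta + 360.0 / (2 * n_bins)) % 360.0    # center the bins on 0, 45, ...
    return (shifted // (360.0 / n_bins)).astype(int)

rng = np.random.default_rng(0)
n_steps, nlat, nmlt = 500, 24, 48
fac_maps = rng.normal(size=(n_steps, nlat, nmlt))       # stand-in for FAC maps [uA/m^2]
by, bz = rng.normal(size=n_steps), rng.normal(size=n_steps)

bins = clock_angle_bins(by, bz)
avg_fac = np.array([fac_maps[bins == k].mean(axis=0) for k in range(8)])
print(avg_fac.shape)    # (8, nlat, nmlt): one average FAC pattern per clock-angle sector
```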

  7. Structure of high latitude currents in global magnetospheric-ionospheric models

    USGS Publications Warehouse

    Wiltberger, M; Rigler, E. J.; Merkin, V; Lyon, J. G

    2016-01-01

Using three resolutions of the Lyon-Fedder-Mobarry global magnetosphere-ionosphere model (LFM) and the Weimer 2005 empirical model, we examine the structure of the high latitude field-aligned current patterns. Each resolution was run for the entire Whole Heliosphere Interval, which contained two high speed solar wind streams and modest interplanetary magnetic field strengths. Average states of the field-aligned current (FAC) patterns for 8 interplanetary magnetic field clock angle directions are computed using data from these runs. Generally speaking, the patterns obtained agree well with results from the Weimer 2005 model computed using the solar wind and IMF conditions that correspond to each bin. As the simulation resolution increases, the currents become more intense and narrow. A machine learning analysis of the FAC patterns shows that the ratio of Region 1 (R1) to Region 2 (R2) currents decreases as the simulation resolution increases. This brings the simulation results into better agreement with observational predictions and the Weimer 2005 model results. The increase in R2 current strengths also results in the cross polar cap potential (CPCP) pattern being concentrated at higher latitudes. Current-voltage relationships between the R1 currents and the CPCP are quite similar at the higher resolutions, indicating the simulation is converging on a common solution. We conclude that LFM simulations are capable of reproducing the statistical features of FAC patterns.

  8. Design of a nickel-hydrogen battery simulator for the NASA EOS testbed

    NASA Technical Reports Server (NTRS)

    Gur, Zvi; Mang, Xuesi; Patil, Ashok R.; Sable, Dan M.; Cho, Bo H.; Lee, Fred C.

    1992-01-01

    The hardware and software design of a nickel-hydrogen (Ni-H2) battery simulator (BS) with application to the NASA Earth Observation System (EOS) satellite is presented. The battery simulator is developed as a part of a complete testbed for the EOS satellite power system. The battery simulator involves both hardware and software components. The hardware component includes the capability of sourcing and sinking current at a constant programmable voltage. The software component includes the capability of monitoring the battery's ampere-hours (Ah) and programming the battery voltage according to an empirical model of the nickel-hydrogen battery stored in a computer.
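
    A minimal sketch of the software side of such a simulator, assuming simple coulomb counting and an interpolated state-of-charge to voltage table; the table values and cell count are illustrative, not the EOS Ni-H2 characterization.

```python
# Toy battery-simulator software component: integrate ampere-hours and program the
# terminal voltage from an empirical SOC-voltage table (values are illustrative).
import numpy as np

class BatterySimulator:
    def __init__(self, capacity_ah=50.0, soc=0.8):
        self.capacity_ah = capacity_ah
        self.soc = soc
        # Empirical open-circuit voltage vs. state of charge (per-cell volts x 22 cells).
        self._soc_pts = np.array([0.0, 0.2, 0.5, 0.8, 1.0])
        self._volt_pts = np.array([1.15, 1.22, 1.27, 1.32, 1.40]) * 22.0

    def step(self, current_a, dt_s):
        """Positive current = charge (battery sinks current), negative = discharge."""
        self.soc += current_a * dt_s / 3600.0 / self.capacity_ah
        self.soc = float(np.clip(self.soc, 0.0, 1.0))
        return float(np.interp(self.soc, self._soc_pts, self._volt_pts))

sim = BatterySimulator()
for minute in range(30):                    # 30 minutes of 20 A discharge
    v = sim.step(current_a=-20.0, dt_s=60.0)
print(f"SOC = {sim.soc:.3f}, programmed bus voltage = {v:.2f} V")
```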

  9. Building HR capability in health care organizations.

    PubMed

    Khatri, Naresh

    2006-01-01

    The current human resource (HR) management practices in health care are consistent with the industrial model of management. However, health care organizations are not factories. They are highly knowledge-intensive and service-oriented entities and thus require a different set of HR practices and systems to support them. Drawing from the resource-based theory, I argue that HRs are a potent weapon of competitive advantage for health care organizations and propose a five-dimensional conception of HR capability for harnessing HRs in health care organizations. The significant complementarities that exist between HRs and information technologies for delivering safer and better quality of patient care are also discussed.

  10. Development of the MCNPX depletion capability: A Monte Carlo linked depletion method that automates the coupling between MCNPX and CINDER90 for high fidelity burnup calculations

    NASA Astrophysics Data System (ADS)

    Fensin, Michael Lorne

Monte Carlo-linked depletion methods have gained recent interest due to the ability to more accurately model complex 3-dimensional geometries and better track the evolution of temporal nuclide inventory by simulating the actual physical process utilizing continuous energy coefficients. The integration of CINDER90 into the MCNPX Monte Carlo radiation transport code provides a high-fidelity, completely self-contained, Monte-Carlo-linked depletion capability in a well established, widely accepted Monte Carlo radiation transport code that is compatible with most nuclear criticality (KCODE) particle tracking features in MCNPX. MCNPX depletion tracks all necessary reaction rates and follows as many isotopes as cross section data permits in order to achieve a highly accurate temporal nuclide inventory solution. This work chronicles relevant nuclear history, surveys current methodologies of depletion theory, details the methodology applied in MCNPX and provides benchmark results for three independent OECD/NEA benchmarks. Relevant nuclear history, from the Oklo reactor two billion years ago to the current major United States nuclear fuel cycle development programs, is addressed in order to supply the motivation for the development of this technology. A survey of current reaction rate and temporal nuclide inventory techniques is then provided to offer justification for the depletion strategy applied within MCNPX. The MCNPX depletion strategy is then dissected and each code feature is detailed, chronicling the methodology development from the original linking of MONTEBURNS and MCNP to the most recent public release of the integrated capability (MCNPX 2.6.F). Calculation results of the OECD/NEA Phase IB benchmark, H. B. Robinson benchmark and OECD/NEA Phase IVB are then provided. The acceptable results of these calculations offer sufficient confidence in the predictive capability of the MCNPX depletion method. This capability sets up a significant foundation, in a well established and supported radiation transport code, for further development of a Monte Carlo-linked depletion methodology, which is essential to the future development of advanced reactor technologies that exceed the limitations of current deterministic based methods.
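
    To illustrate the kind of transmutation and decay step a depletion solver performs between transport solutions, here is a toy Bateman solve advanced with a matrix exponential; the three-nuclide chain and one-group rates are invented and bear no relation to CINDER90's nuclide set.

```python
# Toy depletion step: dN/dt = A N, advanced over one time step with a matrix
# exponential. Chain and rates are illustrative placeholders only.
import numpy as np
from scipy.linalg import expm

phi = 3e14                    # one-group flux [n/cm^2/s]
sig_a_u235 = 600e-24          # absorption cross section [cm^2]
lam_fp = 1e-6                 # fission-product decay constant [1/s]
yield_fp = 0.06               # production yield per absorption (illustrative)

# A[i][j] is the production rate of nuclide i from nuclide j (columns sum to <= 0).
A = np.array([
    [-sig_a_u235 * phi,            0.0,     0.0],   # U-235 removal
    [ yield_fp * sig_a_u235 * phi, -lam_fp, 0.0],   # FP production and decay
    [ 0.0,                          lam_fp, 0.0],   # stable daughter accumulation
])

N0 = np.array([1.0e21, 0.0, 0.0])       # initial number densities [1/cm^3]
dt = 30 * 24 * 3600.0                   # 30-day depletion step
N = expm(A * dt) @ N0
print(dict(zip(["U235", "FP", "daughter"], N)))
```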

  11. Structure of high latitude currents in magnetosphere-ionosphere models

    NASA Astrophysics Data System (ADS)

    Wiltberger, M. J.; Lyon, J.; Merkin, V. G.; Rigler, E. J.

    2016-12-01

Using three resolutions of the Lyon-Fedder-Mobarry global magnetosphere-ionosphere model (LFM) and the Weimer 2005 empirical model, the structure of the high latitude field-aligned current patterns is examined. Each LFM resolution was run for the entire Whole Heliosphere Interval (WHI), which contained two high-speed solar wind streams and modest interplanetary magnetic field strengths. Average states of the field-aligned current (FAC) patterns for 8 interplanetary magnetic field clock angle directions are computed using data from these runs. Generally speaking, the patterns obtained agree well with results from the Weimer 2005 model computed using the solar wind and IMF conditions that correspond to each bin. As the simulation resolution increases, the currents become more intense and confined. A machine learning analysis of the FAC patterns shows that the ratio of Region 1 (R1) to Region 2 (R2) currents decreases as the simulation resolution increases. This brings the simulation results into better agreement with observational predictions and the Weimer 2005 model results. The increase in R2 current strengths in the model also results in better shielding of the mid- and low-latitude ionosphere from the polar cap convection, also in agreement with observations. Current-voltage relationships between the R1 strength and the cross-polar cap potential (CPCP) are quite similar at the higher resolutions, indicating the simulation is converging on a common solution. We conclude that LFM simulations are capable of reproducing the statistical features of FAC patterns.

  12. NASA's Next Generation Space Geodesy Program

    NASA Technical Reports Server (NTRS)

Pearlman, M. R.; Frey, H. V.; Gross, R. S.; Lemoine, F. G.; Long, J. L.; Ma, C.; McGarry, J. F.; Merkowitz, S. M.; Noll, C. E.; Pavilis, E. C.; et al.

    2012-01-01

Requirements for the ITRF have increased dramatically since the 1980s. The most stringent requirement comes from critical sea level monitoring programs: a global accuracy of 1.0 mm, and 0.1mm/yr stability, a factor of 10 to 20 beyond current capability. Other requirements for the ITRF coming from ice mass change, ground motion, and mass transport studies are similar. Current and future satellite missions will have ever-increasing measurement capability and will lead to increasingly sophisticated models of these and other changes in the Earth system. Ground space geodesy networks with enhanced measurement capability will be essential to meeting the ITRF requirements and properly interpreting the satellite data. These networks must be globally distributed and built for longevity, to provide the robust data necessary to generate improved models for proper interpretation of the observed geophysical signals. NASA has embarked on a Space Geodesy Program with a long-range goal to build, deploy and operate a next generation NASA Space Geodetic Network (SGN). The plan is to build integrated, multi-technique next-generation space geodetic observing systems as the core contribution to a global network designed to produce the higher quality data required to maintain the Terrestrial Reference Frame and provide information essential for fully realizing the measurement potential of the current and coming generation of Earth Observing spacecraft. Phase 1 of this project has been funded to (1) Establish and demonstrate a next-generation prototype integrated Space Geodetic Station at Goddard's Geophysical and Astronomical Observatory (GGAO), including next-generation SLR and VLBI systems along with modern GNSS and DORIS; (2) Complete ongoing Network Design Studies that describe the appropriate number and distribution of next-generation Space Geodetic Stations for an improved global network; (3) Upgrade analysis capability to handle the next-generation data; (4) Implement a modern survey system to measure inter-technique vectors for co-location; and (5) Develop an Implementation Plan to build, deploy and operate a next-generation integrated NASA SGN that will serve as NASA's contribution to the international global geodetic network. An envisioned Phase 2 (which is not currently funded) would include the replication of up to ten such stations to be deployed either as integrated units or as a complement to already in-place components provided by other organizations. This talk will give an update on the activities underway and the plans for completion.

  13. NASA's Next Generation Space Geodesy Program

    NASA Technical Reports Server (NTRS)

Merkowitz, S. M.; Desai, S. D.; Gross, R. S.; Hillard, L. M.; Lemoine, F. G.; Long, J. L.; Ma, C.; McGarry, J. F.; Murphy, D.; Noll, C. E.; et al.

    2012-01-01

    Requirements for the ITRF have increased dramatically since the 1980s. The most stringent requirement comes from critical sea level monitoring programs: a global accuracy of 1.0 mm, and 0.1mm/yr stability, a factor of 10 to 20 beyond current capability. Other requirements for the ITRF coming from ice mass change, ground motion, and mass transport studies are similar. Current and future satellite missions will have ever-increasing measurement capability and will lead to increasingly sophisticated models of these and other changes in the Earth system. Ground space geodesy networks with enhanced measurement capability will be essential to meeting the ITRF requirements and properly interpreting the satellite data. These networks must be globally distributed and built for longevity, to provide the robust data necessary to generate improved models for proper interpretation of the observed geophysical signals. NASA has embarked on a Space Geodesy Program with a long-range goal to build, deploy and operate a next generation NASA Space Geodetic Network (SGN). The plan is to build integrated, multi-technique next-generation space geodetic observing systems as the core contribution to a global network designed to produce the higher quality data required to maintain the Terrestrial Reference Frame and provide information essential for fully realizing the measurement potential of the current and coming generation of Earth Observing spacecraft. Phase 1 of this project has been funded to (1) Establish and demonstrate a next-generation prototype integrated Space Geodetic Station at Goddard's Geophysical and Astronomical Observatory (GGAO), including next-generation SLR and VLBI systems along with modern GNSS and DORIS; (2) Complete ongoing Network Design Studies that describe the appropriate number and distribution of next-generation Space Geodetic Stations for an improved global network; (3) Upgrade analysis capability to handle the next-generation data; (4) Implement a modern survey system to measure inter-technique vectors for co-location; and (5) Develop an Implementation Plan to build, deploy and operate a next-generation integrated NASA SGN that will serve as NASA's contribution to the international global geodetic network. An envisioned Phase 2 (which is not currently funded) would include the replication of up to ten such stations to be deployed either as integrated units or as a complement to already in-place components provided by other organizations. This talk will give an update on the activities underway and the plans for completion.

  14. Integration of Error Compensation of Coordinate Measuring Machines into Feature Measurement: Part II—Experimental Implementation

    PubMed Central

    Calvo, Roque; D’Amato, Roberto; Gómez, Emilio; Domingo, Rosario

    2016-01-01

Coordinate measuring machines (CMM) are main instruments of measurement in laboratories and in industrial quality control. A compensation error model has been formulated (Part I); it integrates error and uncertainty in the feature measurement model. Experimental implementation for the verification of this model is carried out based on direct testing on a moving bridge CMM. The regression results by axis are quantified and compared to the CMM indication with respect to the assigned values of the measurand. Next, testing of selected measurements of length, flatness, dihedral angle, and roundness features is accomplished. The measurement of calibrated gauge blocks for length or angle, flatness verification of the CMM granite table and roundness of a precision glass hemisphere are presented under repeatability conditions. The results are analysed and compared with alternative methods of estimation. The overall performance of the model is endorsed through experimental verification, as well as its practical use and its capability to contribute to the improvement of current standard CMM measuring capabilities. PMID:27754441

  15. Optimization of a novel biophysical model using large scale in vivo antisense hybridization data displays improved prediction capabilities of structurally accessible RNA regions

    PubMed Central

    Vazquez-Anderson, Jorge; Mihailovic, Mia K.; Baldridge, Kevin C.; Reyes, Kristofer G.; Haning, Katie; Cho, Seung Hee; Amador, Paul; Powell, Warren B.

    2017-01-01

Current approaches to design efficient antisense RNAs (asRNAs) rely primarily on a thermodynamic understanding of RNA–RNA interactions. However, these approaches depend on structure predictions and have limited accuracy, arguably due to overlooking important cellular environment factors. In this work, we develop a biophysical model to describe asRNA–RNA hybridization that incorporates in vivo factors using large-scale experimental hybridization data for three model RNAs: a group I intron, CsrB and a tRNA. A unique element of our model is the estimation of the availability of the target region to interact with a given asRNA using a differential entropic consideration of suboptimal structures. We showcase the utility of this model by evaluating its prediction capabilities in four additional RNAs: a group II intron, Spinach II, 2-MS2 binding domain and glgC 5′ UTR. Additionally, we demonstrate the applicability of this approach to other bacterial species by predicting sRNA–mRNA binding regions in two newly discovered, though uncharacterized, regulatory RNAs. PMID:28334800
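
    A simplified sketch of the availability idea, assuming a small set of suboptimal dot-bracket structures weighted by their Boltzmann factors; the structures, energies, and target window are invented, and the paper's actual model is considerably richer.

```python
# Weight suboptimal structures by Boltzmann factors and score how often a target
# window is unpaired (a crude proxy for the "availability" of that region).
import math

def region_availability(structures, energies_kcal, start, end, temp_k=310.15):
    """structures: dot-bracket strings; returns weighted fraction of unpaired bases in [start, end)."""
    rt = 0.0019872 * temp_k                       # gas constant * T in kcal/mol
    weights = [math.exp(-e / rt) for e in energies_kcal]
    z = sum(weights)
    avail = 0.0
    for s, w in zip(structures, weights):
        window = s[start:end]
        avail += (w / z) * window.count('.') / len(window)
    return avail

suboptimal = ["((((....))))....", "((((....)))).().", "...(....)......."]
energies = [-8.0, -7.2, -5.5]                     # free energies in kcal/mol, illustrative
print(f"availability of window 12-16: {region_availability(suboptimal, energies, 12, 16):.2f}")
```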

  16. Coupled Neutronics Thermal-Hydraulic Solution of a Full-Core PWR Using VERA-CS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clarno, Kevin T; Palmtag, Scott; Davidson, Gregory G

    2014-01-01

The Consortium for Advanced Simulation of Light Water Reactors (CASL) is developing a core simulator called VERA-CS to model operating PWR reactors with high resolution. This paper describes how the development of VERA-CS is being driven by a set of progression benchmark problems that specify the delivery of useful capability in discrete steps. As part of this development, this paper will describe the current capability of VERA-CS to perform a multiphysics simulation of an operating PWR at Hot Full Power (HFP) conditions using a set of existing computer codes coupled together in a novel method. Results for several single-assembly cases are shown that demonstrate coupling for different boron concentrations and power levels. Finally, high-resolution results are shown for a full-core PWR reactor modeled in quarter-symmetry.

  17. NSF's Perspective on Space Weather Research for Building Forecasting Capabilities

    NASA Astrophysics Data System (ADS)

Bisi, M. M.; Pulkkinen, A. A.; Webb, D. F.; Oughton, E. J.; Azeem, S. I.

    2017-12-01

Space weather research at the National Science Foundation (NSF) is focused on scientific discovery and on deepening knowledge of the Sun-Geospace system. The maturation of this knowledge base is a requirement for the development of improved space weather forecast models and for the accurate assessment of potential mitigation strategies. Progress in space weather forecasting requires advancing in-depth understanding of the underlying physical processes, developing better instrumentation and measurement techniques, and capturing the advancements in understanding in large-scale physics-based models that span the entire chain of events from the Sun to the Earth. This presentation will provide an overview of current and planned programs pertaining to space weather research at NSF and discuss the recommendations of the Geospace Section portfolio review panel within the context of space weather forecasting capabilities.

  18. Vicarious social defeat stress: Bridging the gap between physical and emotional stress.

    PubMed

    Sial, Omar K; Warren, Brandon L; Alcantara, Lyonna F; Parise, Eric M; Bolaños-Guzmán, Carlos A

    2016-01-30

Animal models capable of differentiating the neurobiological intricacies between physical and emotional stress are scarce. Current models rely primarily on physical stressors (e.g., chronic unpredictable or mild stress, social defeat, learned helplessness), and neglect the impact of psychological stress alone. This is surprising given extensive evidence that a traumatic event need not be directly experienced to produce enduring perturbations on an individual's health and psychological well-being. Post-traumatic stress disorder (PTSD), a highly debilitating neuropsychiatric disorder characterized by intense fear of trauma-related stimuli, often occurs in individuals who have only witnessed a traumatic event. By modifying the chronic social defeat stress (CSDS) paradigm to include a witness component (witnessing the social defeat of another mouse), we demonstrate a novel behavioral paradigm capable of inducing a robust behavioral syndrome reminiscent of PTSD in emotionally stressed adult mice. We describe the vicarious social defeat stress (VSDS) model that is capable of inducing a host of behavioral deficits that include social avoidance and other depressive- and anxiety-like phenotypes in adult male mice. VSDS exposure induces weight loss and a spike in serum corticosterone (CORT) levels. A month after stress, these mice retain the social avoidant phenotype and have an increased CORT response when exposed to subsequent stress. The VSDS is a novel paradigm capable of inducing emotional stress by isolating physical stress/confrontation in mice. The VSDS model can be used to study the short- and long-term neurobiological consequences of exposure to emotional stress in mice.

  19. Implementing a Loosely Coupled Fluid Structure Interaction Finite Element Model in PHASTA

    NASA Astrophysics Data System (ADS)

    Pope, David

Fluid-Structure Interaction problems are an important multi-physics phenomenon in the design of aerospace vehicles and other engineering applications. A variety of computational fluid dynamics solvers capable of resolving the fluid dynamics exist; PHASTA is one such solver. Enhancing the capability of PHASTA to resolve Fluid-Structure Interaction first requires implementing a structural dynamics solver. The implementation also requires a correction of the mesh used to solve the fluid equations to account for the deformation of the structure. This results in mesh motion and creates the need for an Arbitrary Lagrangian-Eulerian modification to the fluid dynamics equations currently implemented in PHASTA. With the implementation of the structural dynamics solver, the mesh correction, and the Arbitrary Lagrangian-Eulerian modification of the fluid dynamics equations, PHASTA is made capable of solving Fluid-Structure Interaction problems.
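
    A toy staggered (loosely coupled) iteration in the spirit of the scheme described above, using a one-degree-of-freedom "structure" and an algebraic "fluid" load; all models and constants are stand-ins, not PHASTA.

```python
# Loosely coupled FSI sketch: at each step, evaluate the fluid load on the interface,
# advance a 1-DOF structure, then (in a real solver) move the fluid mesh (ALE step).
k, m, c, dt = 1000.0, 2.0, 20.0, 1e-3   # stiffness [N/m], mass [kg], damping [N s/m], step [s]
x, v = 0.0, 0.0                         # interface displacement and velocity
p_inf, alpha = 50.0, 400.0              # nominal fluid load [N] and geometric sensitivity [N/m]

for step in range(2000):                # 2 s of coupled time stepping
    # 1) "Fluid" solve: interface load, reduced as the gap opens (stand-in model).
    f_fluid = p_inf - alpha * x
    # 2) Structural solve: explicit update of the single-DOF structure.
    a = (f_fluid - k * x - c * v) / m
    v += a * dt
    x += v * dt
    # 3) Mesh motion / ALE update would deform the fluid mesh to the new interface here.

print(f"interface displacement after 2 s: {1000*x:.2f} mm "
      f"(static balance: {1000*p_inf/(k + alpha):.2f} mm)")
```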

  20. Reacting Multi-Species Gas Capability for USM3D Flow Solver

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Schuster, David M.

    2012-01-01

    The USM3D Navier-Stokes flow solver contributed heavily to the NASA Constellation Project (CxP) as a highly productive computational tool for generating the aerodynamic databases for the Ares I and V launch vehicles and Orion launch abort vehicle (LAV). USM3D is currently limited to ideal-gas flows, which are not adequate for modeling the chemistry or temperature effects of hot-gas jet flows. This task was initiated to create an efficient implementation of multi-species gas and equilibrium chemistry into the USM3D code to improve its predictive capabilities for hot jet impingement effects. The goal of this NASA Engineering and Safety Center (NESC) assessment was to implement and validate a simulation capability to handle real-gas effects in the USM3D code. This document contains the outcome of the NESC assessment.

  1. Uncertainty quantification's role in modeling and simulation planning, and credibility assessment through the predictive capability maturity model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rider, William J.; Witkowski, Walter R.; Mousseau, Vincent Andrew

    2016-04-13

The importance of credible, trustworthy numerical simulations is obvious, especially when using the results for making high-consequence decisions. Determining the credibility of such numerical predictions is much more difficult and requires a systematic approach to assessing predictive capability, associated uncertainties and overall confidence in the computational simulation process for the intended use of the model. This process begins with an evaluation of the computational modeling of the identified, important physics of the simulation for its intended use. This is commonly done through a Phenomena Identification Ranking Table (PIRT). Then an assessment of the evidence basis supporting the ability to computationally simulate these physics can be performed using various frameworks such as the Predictive Capability Maturity Model (PCMM). Several critical activities follow in the areas of code and solution verification, validation and uncertainty quantification, which are described in detail in the following sections. Here, we introduce the subject matter for general applications, but specifics are given for the failure prediction project. In addition, the first task that must be completed in the verification and validation procedure is to perform a credibility assessment to fully understand the requirements and limitations of the current computational simulation capability for the specific intended use of the application. The PIRT and PCMM are tools used at Sandia National Laboratories (SNL) to provide a consistent manner of performing such an assessment. Ideally, all stakeholders should be represented and should contribute to an accurate credibility assessment. PIRTs and PCMMs are both described in brief detail below, and the resulting assessments for an example project are given.

  2. Navigating the flow: individual and continuum models for homing in flowing environments

    PubMed Central

    Painter, Kevin J.; Hillen, Thomas

    2015-01-01

    Navigation for aquatic and airborne species often takes place in the face of complicated flows, from persistent currents to highly unpredictable storms. Hydrodynamic models are capable of simulating flow dynamics and provide the impetus for much individual-based modelling, in which particle-sized individuals are immersed into a flowing medium. These models yield insights on the impact of currents on population distributions from fish eggs to large organisms, yet their computational demands and intractability reduce their capacity to generate the broader, less parameter-specific, insights allowed by traditional continuous approaches. In this paper, we formulate an individual-based model for navigation within a flowing field and apply scaling to derive its corresponding macroscopic and continuous model. We apply it to various movement classes, from drifters that simply go with the flow to navigators that respond to environmental orienteering cues. The utility of the model is demonstrated via its application to ‘homing’ problems and, in particular, the navigation of the marine green turtle Chelonia mydas to Ascension Island. PMID:26538557
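
    A minimal individual-based sketch of the kind of navigator-in-a-flow model the paper scales up to a continuum description; the flow field, swim speed, goal, and noise level are illustrative only.

```python
# Individual-based navigation in a flow: each individual drifts with the current,
# swims with a heading biased toward a goal, and diffuses (all parameters invented).
import numpy as np

rng = np.random.default_rng(1)
n, dt, steps = 500, 0.1, 2000
goal = np.array([50.0, 0.0])
swim_speed, noise = 0.6, 0.3
pos = np.zeros((n, 2))                           # all individuals start at the origin

def flow(p):
    """A simple shear current: eastward speed grows with distance from y = 0."""
    u = 0.4 + 0.02 * np.abs(p[:, 1])
    return np.column_stack([u, np.zeros(len(p))])

for _ in range(steps):
    to_goal = goal - pos
    heading = to_goal / (np.linalg.norm(to_goal, axis=1, keepdims=True) + 1e-12)
    pos += (flow(pos) + swim_speed * heading) * dt
    pos += noise * np.sqrt(dt) * rng.standard_normal(pos.shape)

dist = np.linalg.norm(pos - goal, axis=1)
print(f"fraction within 5 units of the goal: {(dist < 5).mean():.2f}")
```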

  3. Genoviz Software Development Kit: Java tool kit for building genomics visualization applications.

    PubMed

    Helt, Gregg A; Nicol, John W; Erwin, Ed; Blossom, Eric; Blanchard, Steven G; Chervitz, Stephen A; Harmon, Cyrus; Loraine, Ann E

    2009-08-25

    Visualization software can expose previously undiscovered patterns in genomic data and advance biological science. The Genoviz Software Development Kit (SDK) is an open source, Java-based framework designed for rapid assembly of visualization software applications for genomics. The Genoviz SDK framework provides a mechanism for incorporating adaptive, dynamic zooming into applications, a desirable feature of genome viewers. Visualization capabilities of the Genoviz SDK include automated layout of features along genetic or genomic axes; support for user interactions with graphical elements (Glyphs) in a map; a variety of Glyph sub-classes that promote experimentation with new ways of representing data in graphical formats; and support for adaptive, semantic zooming, whereby objects change their appearance depending on zoom level and zooming rate adapts to the current scale. Freely available demonstration and production quality applications, including the Integrated Genome Browser, illustrate Genoviz SDK capabilities. Separation between graphics components and genomic data models makes it easy for developers to add visualization capability to pre-existing applications or build new applications using third-party data models. Source code, documentation, sample applications, and tutorials are available at http://genoviz.sourceforge.net/.

  4. Finite Element Analysis of Active and Sensory Thermopiezoelectric Composite Materials. Degree awarded by Northwestern Univ., Dec. 2000

    NASA Technical Reports Server (NTRS)

    Lee, Ho-Jun

    2001-01-01

    Analytical formulations are developed to account for the coupled mechanical, electrical, and thermal response of piezoelectric composite materials. The coupled response is captured at the material level through the thermopiezoelectric constitutive equations and leads to the inherent capability to model both the sensory and active responses of piezoelectric materials. A layerwise laminate theory is incorporated to provide more accurate analysis of the displacements, strains, stresses, electric fields, and thermal fields through-the-thickness. Thermal effects which arise from coefficient of thermal expansion mismatch, pyroelectric effects, and temperature dependent material properties are explicitly accounted for in the formulation. Corresponding finite element formulations are developed for piezoelectric beam, plate, and shell elements to provide a more generalized capability for the analysis of arbitrary piezoelectric composite structures. The accuracy of the current formulation is verified with comparisons from published experimental data and other analytical models. Additional numerical studies are also conducted to demonstrate additional capabilities of the formulation to represent the sensory and active behaviors. A future plan of experimental studies is provided to characterize the high temperature dynamic response of piezoelectric composite materials.

  5. Validation of NEOWAVE with Measurements from the 2011 Tohoku Tsunami

    NASA Astrophysics Data System (ADS)

    Cheung, K.; Yamazaki, Y.

    2012-12-01

An accurate and reliable numerical model is essential in mapping tsunami hazards for mitigation and preparedness. The model NEOWAVE (Non-hydrostatic Evolution of Ocean WAVEs) is being used for tsunami inundation mapping in Hawaii, American Samoa, the Gulf coast states, and Puerto Rico. In addition to the benchmarks established by the National Tsunami Hazard Mitigation Program, we have been conducting a thorough investigation of NEOWAVE's capability in reproducing the 2011 Tohoku tsunami and its impact across the Pacific. The shock-capturing non-hydrostatic model is well suited to handle tsunami conditions in a variety of coastal environments in the near and far field. It describes dispersive waves through non-hydrostatic pressure and vertical velocity, which also account for tsunami generation from time histories of seafloor deformation. The semi-implicit, staggered finite difference model captures flow discontinuities associated with bores or hydraulic jumps through a momentum conservation scheme. The model supports up to five levels of two-way nested grids in spherical coordinates to describe tsunami processes of varying time and spatial scales from the open ocean to the coast. We first define the source mechanism through forward modeling of the near-field tsunami recorded by coastal and deep-ocean buoys. A finite-fault solution based on teleseismic P-wave inversion serves as the starting point of the iterative process, in which the source parameters are systematically adjusted to achieve convergence of the computed tsunami with the near-field records. The capability of NEOWAVE in modeling propagation of the tsunami is evaluated with DART data across the Pacific as well as water-level and current measurements in Hawaii. These far-field water-level records, which are not considered in the forward modeling, also provide an independent assessment of the source model. The computed runup and inundation are compared with measurements along the coasts of northeastern Japan and the Hawaiian Island chain. These coastlines include shallow embayments with open plains, narrow estuaries with steep cliffs, and volcanic insular slopes with fringing reefs for full validation of the model in a single event. The Tohoku tsunami caused persistent oscillations and hazardous currents in coastal waters around Hawaii. Analysis of the computed surface elevation reveals complex resonance modes along the Hawaiian Island chain. Standing waves with period 16 min or shorter are able to form a series of nodes and antinodes over the reefs that result in strong currents and large drawdown responsible for the damage in harbors and marinas. The results provide insights into the effects of fringing reefs, which are present along 70% of Hawaii's coastlines, on tsunami transformation and runup processes. The case study improves our understanding of tsunamis in tropical island environments and validates the modeling capability to predict their impacts for hazard mitigation and emergency management.

  6. Modeling tools for the assessment of microbiological risks during floods: a review

    NASA Astrophysics Data System (ADS)

    Collender, Philip; Yang, Wen; Stieglitz, Marc; Remais, Justin

    2015-04-01

Floods are a major, recurring source of harm to global economies and public health. Projected increases in the frequency and intensity of heavy precipitation events under future climate change, coupled with continued urbanization in areas with high risk of floods, may exacerbate future impacts of flooding. Improved flood risk management is essential to support global development, poverty reduction and public health, and is likely to be a crucial aspect of climate change adaptation. Importantly, floods can facilitate the transmission of waterborne pathogens by changing social conditions (overcrowding among displaced populations, interruption of public health services), imposing physical challenges to infrastructure (sewerage overflow, reduced capacity to treat drinking water), and altering fate and transport of pathogens (transport into waterways from overland flow, resuspension of settled contaminants) during and after flood conditions. Hydrological and hydrodynamic models are capable of generating quantitative characterizations of microbiological risks associated with flooding, while accounting for these diverse and at times competing physical and biological processes. Despite a few applications of such models to the quantification of microbiological risks associated with floods, there exists limited guidance as to the relative capabilities, and limitations, of existing modeling platforms when used for this purpose. Here, we review 17 commonly used flood and water quality modeling tools that have demonstrated or implicit capabilities for mechanistically representing and quantifying microbial risk during flood conditions. We compare models with respect to their capabilities of generating outputs that describe physical and microbial conditions during floods, such as concentration or load of non-cohesive sediments or pathogens, and the dynamics of high flow conditions. Recommendations are presented for the application of specific modeling tools for assessing particular flood-related microbial risks, and model improvements are suggested that may better characterize key microbial risks during flood events. The state of current tools is assessed in the context of a changing climate where the frequency, intensity and duration of flooding are shifting in some areas.

  7. Direct model-based predictive control scheme without cost function for voltage source inverters with reduced common-mode voltage

    NASA Astrophysics Data System (ADS)

    Kim, Jae-Chang; Moon, Sung-Ki; Kwak, Sangshin

    2018-04-01

This paper presents a direct model-based predictive control scheme for voltage source inverters (VSIs) with reduced common-mode voltages (CMVs). The developed method directly finds optimal vectors without repetitive calculation of a cost function. To adjust the output currents while keeping the CMVs in the range of -Vdc/6 to +Vdc/6, the developed method uses voltage vectors, as finite control resources, excluding the zero voltage vectors, which produce CMVs of ±Vdc/2 in the VSI. In model-based predictive control (MPC), not using zero voltage vectors increases the output current ripple and the current error. To alleviate these problems, the developed method uses two non-zero voltage vectors in one sampling step. In addition, the voltage vectors scheduled to be used are directly selected at every sampling step once the developed method calculates the future reference voltage vector, saving the effort of repeatedly calculating the cost function. The two non-zero voltage vectors are then optimally allocated to make the output current approach the reference current as closely as possible. Thus, low CMV, rapid current-following capability and sufficient output current ripple performance are attained by the developed method. The results of a simulation and an experiment verify the effectiveness of the developed method.
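
    A sketch of the vector-selection idea under simple assumptions (two-level VSI, RL load, deadbeat voltage reference, least-squares dwell split between the two adjacent active vectors); the circuit values are illustrative and this is not the authors' exact allocation rule.

```python
# Compute a reference voltage vector from a deadbeat prediction of an RL load,
# then pick the two adjacent non-zero (active) vectors and their dwell fractions,
# avoiding the zero vectors that maximize the common-mode voltage.
import numpy as np

Vdc, R, L, Ts = 600.0, 0.5, 10e-3, 100e-6
active = (2/3) * Vdc * np.exp(1j * np.pi/3 * np.arange(6))   # active vectors V1..V6

def select_vectors(i_meas, i_ref, e_back=0.0):
    # Deadbeat reference voltage that would null the current error in one step.
    v_ref = L * (i_ref - i_meas) / Ts + R * i_meas + e_back
    ang = np.angle(v_ref) % (2 * np.pi)
    k = int(ang // (np.pi / 3))                              # sector index 0..5
    v_a, v_b = active[k], active[(k + 1) % 6]
    # Least-squares dwell fractions for the two active vectors over the sample period.
    A = np.column_stack([[v_a.real, v_a.imag], [v_b.real, v_b.imag]])
    d = np.linalg.lstsq(A, [v_ref.real, v_ref.imag], rcond=None)[0]
    return v_a, v_b, np.clip(d, 0.0, 1.0)

v_a, v_b, d = select_vectors(i_meas=10 + 2j, i_ref=12 + 0j)
print("selected vectors:", np.round(v_a), np.round(v_b), "dwell fractions:", np.round(d, 2))
```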

  8. Programmatic Perspectives on Using `Rapid Prototyping Capability' for Water Management Applications Using NASA Products

    NASA Astrophysics Data System (ADS)

    Toll, D.; Friedl, L.; Entin, J.; Engman, E.

    2006-12-01

The NASA Water Management Program addresses concerns and decision making related to water availability, water forecasting and water quality. The goal of the Water Management Program Element is to encourage water management organizations to use NASA Earth science data, models, products, technology and other capabilities in their decision support tools (DSTs) for problem solving. The goal of the NASA Rapid Prototyping Capability (RPC) is to speed the evaluation of these NASA products and technologies to improve current and future DSTs by reducing the time to access, configure, and assess the effectiveness of NASA products and technologies. The NASA Water Management Program Element partners with Federal agencies, academia, private firms, and may include international organizations. Currently, the NASA Water Management Program oversees eight application projects. However, water management is a very broad descriptor of a much larger number of activities that are carried out to ensure a safe and plentiful water supply for humans, industry and agriculture, promote environmental stewardship, and mitigate disasters such as floods and droughts. The goal of this presentation is to summarize how the RPC may further enhance the effectiveness of using NASA products for water management applications.

  9. The Experimental Measurement of Aerodynamic Heating About Complex Shapes at Supersonic Mach Numbers

    NASA Technical Reports Server (NTRS)

    Neumann, Richard D.; Freeman, Delma C.

    2011-01-01

In 2008 a wind tunnel test program was implemented to update the experimental data available for predicting protuberance heating at supersonic Mach numbers. For this test, the Langley Unitary Wind Tunnel was also used. The significant differences for this current test were the advances in the state of the art in model design, fabrication techniques, instrumentation and data acquisition capabilities. This current paper provides a focused discussion of the results of an in-depth analysis of unique measurements of recovery temperature obtained during the test.

  10. Current drive for stability of thermonuclear plasma reactor

    NASA Astrophysics Data System (ADS)

    Amicucci, L.; Cardinali, A.; Castaldo, C.; Cesario, R.; Galli, A.; Panaccione, L.; Paoletti, F.; Schettini, G.; Spigler, R.; Tuccillo, A.

    2016-01-01

Producing a sufficiently high fusion gain in a thermonuclear fusion reactor based on the tokamak concept, together with the stability necessary for operations, represents a major challenge that depends on the capability of driving non-inductive current in the hydrogen plasma. This requirement should be satisfied by radio-frequency (RF) power suitable for producing the lower hybrid current drive (LHCD) effect, recently demonstrated to occur successfully also at reactor-grade high plasma densities. An LHCD-based tool should in principle be capable of tailoring the plasma current density in the outer radial half of the plasma column, where other methods are much less effective, in order to ensure operations in the presence of unpredictable changes of the plasma pressure profiles. In the presence of high electron temperatures even at the periphery of the plasma column, as envisaged in the DEMO reactor, the penetration of the coupled RF power into the plasma core was long believed to be problematic; only recently have numerical modelling results based on standard plasma wave theory shown that this problem should be solved by using suitable parameters of the antenna power spectrum. We show here further information on the new understanding of the dependence of the RF power deposition profile on antenna parameters, which supports the conclusion that current can be actively driven over a broad layer of the outer radial half of the plasma column, thus enabling the current profile control necessary for the stability of a reactor.

  11. Nonlinear Dynamic Analysis of Disordered Bladed-Disk Assemblies

    NASA Technical Reports Server (NTRS)

    McGee, Oliver G., III

    1997-01-01

In an effort to address current needs for efficient air propulsion systems, we have developed new analytical predictive tools for understanding and alleviating aircraft engine instabilities, which have led to accelerated high cycle fatigue and catastrophic failures of these machines during flight. A frequent cause of failure in jet engines is excessive resonant vibration and stall flutter instability. The likelihood of these phenomena is reduced when designers employ the analytical models we have developed. These prediction models will ultimately increase the nation's competitiveness in producing high-performance jet engines with enhanced operability, energy economy, and safety. The objectives of our current threads of research in the final year are directed along two lines. First, we want to improve the current state of blade stress and aeromechanical reduced-order modeling of high-bypass engine fans. Specifically, a new reduced-order iterative redesign tool for passively controlling the mechanical authority of shroudless, wide-chord, laminated composite transonic bypass engine fans has been developed. Second, we aim to advance current understanding of aeromechanical feedback control of dynamic flow instabilities in axial flow compressors. A systematic theoretical evaluation of several approaches to aeromechanical feedback control of rotating stall in axial compressors has been conducted. Attached are abstracts of two papers under preparation for the 1998 ASME Turbo Expo in Stockholm, Sweden, sponsored under Grant No. NAG3-1571. Our goal during the final year under Grant No. NAG3-1571 is to enhance NASA's capabilities for forced response of turbomachines (such as NASA FREPS). We will continue our development of the reduced-order, three-dimensional component synthesis models for aeromechanical evaluation of integrated bladed-disk assemblies (i.e., the disk, non-identical blading, etc.). We will complete our development of component systems design optimization strategies for specified vibratory stresses and increased fatigue life prediction of assembly components, and for specified frequency margins on the Campbell diagrams of turbomachines. Finally, we will integrate the developed codes with NASA's turbomachinery aeromechanics prediction capability (such as NASA FREPS).

  12. Solar Occultation Retrieval Algorithm Development

    NASA Technical Reports Server (NTRS)

    Lumpe, Jerry D.

    2004-01-01

This effort addresses the comparison and validation of currently operational solar occultation retrieval algorithms, and the development of generalized algorithms for future application to multiple platforms. Work to date has focused on initial development of generalized forward model algorithms capable of simulating transmission data from the POAM II/III and SAGE II/III instruments. Work in the 2nd quarter will focus on completion of the forward model algorithms, including accurate spectral characteristics for all instruments, and comparison of simulated transmission data with actual level 1 instrument data for specific occultation events.
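
    A bare-bones occultation forward model of the type being generalized here: Beer-Lambert transmission along a limb path through spherical shells; the absorber profile and cross section are placeholder values, not POAM or SAGE characteristics.

```python
# Limb transmission through spherical shells: sum absorber column along the chord
# for a given tangent height, then apply Beer-Lambert attenuation.
import numpy as np

R_earth = 6371.0                                     # km
z_grid = np.arange(0.0, 60.0, 1.0)                   # shell bottom altitudes [km]
# Ozone-like absorber layer peaking near 25 km (illustrative numbers only).
n_abs = 5e12 * np.exp(-0.5 * ((z_grid - 25.0) / 8.0) ** 2)   # [molecules/cm^3]
sigma = 5e-21                                        # visible-band cross section [cm^2]

def slant_column(z_tangent):
    """Absorber column along a limb ray with the given tangent height, shell by shell."""
    r_t = R_earth + z_tangent
    col = 0.0
    for z_lo, dens in zip(z_grid, n_abs):
        z_hi = z_lo + 1.0
        if z_hi <= z_tangent:
            continue                                 # shell entirely below the ray
        r_lo, r_hi = R_earth + max(z_lo, z_tangent), R_earth + z_hi
        # Chord length inside this shell; factor 2 covers both sides of the tangent point.
        ds_km = 2.0 * (np.sqrt(r_hi**2 - r_t**2) - np.sqrt(r_lo**2 - r_t**2))
        col += dens * ds_km * 1e5                    # km -> cm
    return col

for zt in (15.0, 25.0, 35.0):
    print(f"tangent height {zt:4.1f} km: transmission = {np.exp(-sigma * slant_column(zt)):.3f}")
```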

  13. Model compilation for real-time planning and diagnosis with feedback

    NASA Technical Reports Server (NTRS)

    Barrett, Anthony

    2005-01-01

    This paper describes MEXEC, an implemented micro executive that compiles a device model that can have feedback into a structure for subsequent evaluation. This system computes both the most likely current device mode from n sets of sensor measurements and the n-1 step reconfiguration plan that is most likely to result in reaching a target mode - if such a plan exists. A user tunes the system by increasing n to improve system capability at the cost of real-time performance.
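
    A minimal sketch of the two functions described above, estimating the most likely mode from sensor measurements and searching for a shortest reconfiguration plan to a target mode, over a tiny invented device model; it is not the MEXEC compilation scheme itself.

```python
# Mode estimation by least-squares match to expected sensor readings, plus a
# breadth-first search for the shortest command sequence to a target mode.
from collections import deque

modes = {                            # per-mode expected sensor readings (invented)
    "OFF":     {"current": 0.0, "temp": 20.0},
    "STANDBY": {"current": 0.2, "temp": 25.0},
    "ON":      {"current": 1.5, "temp": 40.0},
    "FAULT":   {"current": 3.0, "temp": 70.0},
}
transitions = {                      # mode -> {command: next_mode}
    "OFF":     {"power_up": "STANDBY"},
    "STANDBY": {"activate": "ON", "power_down": "OFF"},
    "ON":      {"deactivate": "STANDBY"},
    "FAULT":   {"reset": "STANDBY"},
}

def most_likely_mode(measurements):
    """Pick the mode whose expected readings best match the measurements."""
    def misfit(mode):
        return sum((measurements[s] - v) ** 2 for s, v in modes[mode].items())
    return min(modes, key=misfit)

def plan(start, target):
    """Shortest command sequence from start to target mode (None if unreachable)."""
    queue, seen = deque([(start, [])]), {start}
    while queue:
        mode, cmds = queue.popleft()
        if mode == target:
            return cmds
        for cmd, nxt in transitions[mode].items():
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, cmds + [cmd]))
    return None

current = most_likely_mode({"current": 0.25, "temp": 26.0})
print(current, "->", plan(current, "ON"))        # expected: STANDBY -> ['activate']
```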

  14. Integrating emerging earth science technologies into disaster risk management: an enterprise architecture approach

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster risk management has grown to rely on earth observations, multi-source data analysis, numerical modeling, and interagency information sharing. The practice and outcomes of disaster risk management will likely undergo further change as several emerging earth science technologies come of age: mobile devices; location-based services; ubiquitous sensors; drones; small satellites; satellite direct readout; Big Data analytics; cloud computing; Web services for predictive modeling, semantic reconciliation, and collaboration; and many others. Integrating these new technologies well requires developing and adapting them to meet current needs; but also rethinking current practice to draw on new capabilities to reach additional objectives. This requires a holistic view of the disaster risk management enterprise and of the analytical or operational capabilities afforded by these technologies. One helpful tool for this assessment, the GEOSS Architecture for the Use of Remote Sensing Products in Disaster Management and Risk Assessment (Evans & Moe, 2013), considers all phases of the disaster risk management lifecycle for a comprehensive set of natural hazard types, and outlines common clusters of activities and their use of information and computation resources. We are using these architectural views, together with insights from current practice, to highlight effective, interrelated roles for emerging earth science technologies in disaster risk management. These roles may be helpful in creating roadmaps for research and development investment at national and international levels.

  15. The life and death of ATR/sensor fusion and the hope for resurrection

    NASA Astrophysics Data System (ADS)

    Rogers, Steven K.; Sadowski, Charles; Bauer, Kenneth W.; Oxley, Mark E.; Kabrisky, Matthew; Rogers, Adam; Mott, Stephen D.

    2008-04-01

    For over half a century, scientists and engineers have worked diligently to advance computational intelligence. One application of interest is how computational intelligence can bring value to our war fighters. Automatic Target Recognition (ATR) and sensor fusion efforts have fallen far short of the desired capabilities. In this article we review the capabilities requested by war fighters. When compared to our current capabilities, it is easy to conclude current Combat Identification (CID) as a Family of Systems (FoS) does a lousy job. The war fighter needed capable, operationalized ATR and sensor fusion systems ten years ago but it did not happen. The article reviews the war fighter needs and the current state of the art. The article then concludes by looking forward to where we are headed to provide the capabilities required.

  16. Coupling the System Analysis Module with SAS4A/SASSYS-1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fanning, T. H.; Hu, R.

    2016-09-30

SAS4A/SASSYS-1 is a simulation tool used to perform deterministic analysis of anticipated events as well as design basis and beyond design basis accidents for advanced reactors, with an emphasis on sodium fast reactors. SAS4A/SASSYS-1 has been under development and in active use for nearly forty-five years, and is currently maintained by the U.S. Department of Energy under the Office of Advanced Reactor Technology. Although SAS4A/SASSYS-1 contains a very capable primary and intermediate system modeling component, PRIMAR-4, it also has some shortcomings: outdated data management and code structure make extension of the PRIMAR-4 module somewhat difficult. The user input format for PRIMAR-4 also limits the number of volumes and segments that can be used to describe a given system. The System Analysis Module (SAM) is a fairly new code development effort being carried out under the U.S. DOE Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. SAM is being developed with advanced physical models, numerical methods, and software engineering practices; however, it is currently somewhat limited in the system components and phenomena that can be represented. For example, component models for electromagnetic pumps and multi-layer stratified volumes have not yet been developed. Nor is there support for a balance of plant model. Similarly, system-level phenomena such as control-rod driveline expansion and vessel elongation are not represented. This report documents fiscal year 2016 work that was carried out to couple the transient safety analysis capabilities of SAS4A/SASSYS-1 with the system modeling capabilities of SAM under the joint support of the ART and NEAMS programs. The coupling effort was successful and is demonstrated by evaluating an unprotected loss of flow transient for the Advanced Burner Test Reactor (ABTR) design. There are differences between the stand-alone SAS4A/SASSYS-1 simulations and the coupled SAS/SAM simulations, but these are mainly attributed to the limited maturity of the SAM development effort. The severe accident modeling capabilities in SAS4A/SASSYS-1 (sodium boiling, fuel melting and relocation) will continue to play a vital role for a long time. Therefore, the SAS4A/SASSYS-1 modernization effort should remain a high priority task under the ART program to ensure continued participation in domestic and international SFR safety collaborations and design optimizations. On the other hand, SAM provides an advanced system analysis tool, with improved numerical solution schemes, data management, code flexibility, and accuracy. SAM is still in early stages of development and will require continued support from NEAMS to fulfill its potential and to mature into a production tool for advanced reactor safety analysis. The effort to couple SAS4A/SASSYS-1 and SAM is the first step in the integration of these modeling capabilities.

  17. Mapping Directly Imaged Giant Exoplanets

    NASA Astrophysics Data System (ADS)

    Kostov, Veselin; Apai, Dániel

    2013-01-01

With the increasing number of directly imaged giant exoplanets, the current atmosphere models are often not capable of fully explaining the spectra and luminosity of the sources. A particularly challenging component of the atmosphere models is the formation and properties of condensate cloud layers, which fundamentally impact the energetics, opacity, and evolution of the planets. Here we present a suite of techniques that can be used to estimate the level of rotational modulations these planets may show. We propose that the time-resolved observations of such periodic photometric and spectroscopic variations of extrasolar planets due to their rotation can be used as a powerful tool to probe the heterogeneity of their optical surfaces. In this paper, we develop simulations to explore the capabilities of current and next-generation ground- and space-based instruments for this technique. We address and discuss the following questions: (1) what planet properties can be deduced from the light curve and/or spectra, and in particular can we determine rotation periods, spot coverage, spot colors, and spot spectra?; (2) what is the optimal configuration of instrument/wavelength/temporal sampling required for these measurements?; and (3) can principal component analysis be used to invert the light curve and deduce the surface map of the planet? Our simulations describe the expected spectral differences between homogeneous (clear or cloudy) and patchy atmospheres, outline the significance of the dominant absorption features of H2O, CH4, and CO, and provide a method to distinguish these two types of atmospheres. Assuming surfaces with and without clouds for most currently imaged planets, the current models predict the largest variations in the J band. Simulated photometry from current and future instruments is used to estimate the level of detectable photometric variations. We conclude that future instruments will be able to recover not only the rotation periods, cloud cover, cloud colors, and spectra but even cloud evolution. We also show that a longitudinal map of the planet's atmosphere can be deduced from its disk-integrated light curves.
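
    A toy simulation of the rotational-modulation signal discussed above: a longitude-only brightness map with a single bright patch, integrated over the visible hemisphere as the planet rotates; the map, period, and noise level are arbitrary.

```python
# Disk-integrated light curve of a rotating planet with one bright longitudinal patch.
import numpy as np

period_hr = 10.0
n_lon = 360
lon = np.radians(np.arange(n_lon))                  # longitude grid
surface = np.ones(n_lon)
surface[40:100] *= 1.25                             # a 25% brighter patch in longitude

def disk_flux(phase):
    """Integrate brightness over the visible hemisphere with a cosine projection weight."""
    visible_angle = (lon - phase + np.pi) % (2 * np.pi) - np.pi   # angle from sub-observer point
    weight = np.clip(np.cos(visible_angle), 0.0, None)
    return np.sum(surface * weight) / np.sum(weight)

t_hr = np.linspace(0.0, 30.0, 301)                  # three rotations
flux = np.array([disk_flux(2 * np.pi * t / period_hr) for t in t_hr])
flux += np.random.default_rng(2).normal(0.0, 0.002, flux.size)   # photometric noise

amp = (flux.max() - flux.min()) / flux.mean()
print(f"peak-to-peak modulation ~ {100*amp:.1f}% over a {period_hr:.0f} hr period")
```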

  18. Multisensor-integrated organs-on-chips platform for automated and continual in situ monitoring of organoid behaviors

    PubMed Central

    Zhang, Yu Shrike; Aleman, Julio; Shin, Su Ryon; Kim, Duckjin; Mousavi Shaegh, Seyed Ali; Massa, Solange; Riahi, Reza; Chae, Sukyoung; Hu, Ning; Avci, Huseyin; Zhang, Weijia; Silvestri, Antonia; Sanati Nezhad, Amir; Manbohi, Ahmad; De Ferrari, Fabio; Polini, Alessandro; Calzone, Giovanni; Shaikh, Noor; Alerasool, Parissa; Budina, Erica; Kang, Jian; Bhise, Nupura; Pourmand, Adel; Skardal, Aleksander; Shupe, Thomas; Bishop, Colin E.; Dokmeci, Mehmet Remzi; Atala, Anthony; Khademhosseini, Ali

    2017-01-01

    Organ-on-a-chip systems are miniaturized microfluidic 3D human tissue and organ models designed to recapitulate the important biological and physiological parameters of their in vivo counterparts. They have recently emerged as a viable platform for personalized medicine and drug screening. These in vitro models, featuring biomimetic compositions, architectures, and functions, are expected to replace the conventional planar, static cell cultures and bridge the gap between the currently used preclinical animal models and the human body. Multiple organoid models may be further connected together through the microfluidics in a manner similar to how they are arranged in vivo, providing the capability to analyze multiorgan interactions. Although a wide variety of human organ-on-a-chip models have been created, there are limited efforts on the integration of multisensor systems. However, continual in situ measurement is critical for precise assessment of the microenvironment parameters and the dynamic responses of the organs to pharmaceutical compounds over extended periods of time. In addition, an automated and noninvasive capability is strongly desired for long-term monitoring. Here, we report a fully integrated modular physical, biochemical, and optical sensing platform through a fluidics-routing breadboard, which operates organ-on-a-chip units in a continual, dynamic, and automated manner. We believe that this platform technology has paved a potential avenue to promote the performance of current organ-on-a-chip models in drug screening by integrating a multitude of real-time sensors to achieve automated in situ monitoring of biophysical and biochemical parameters. PMID:28265064

  19. Hidden Markov models and neural networks for fault detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic

    1994-01-01

    Neural networks plus hidden Markov models (HMM) can provide excellent detection and false alarm rate performance in fault detection applications, as shown in this viewgraph presentation. Modified models allow for novelty detection. Key contributions of neural network models are: (1) excellent nonparametric discrimination capability; (2) good estimation of posterior state probabilities, even in high dimensions, which can thus be embedded within an overall probabilistic model (HMM); and (3) simple implementation compared to other nonparametric models. The neural network/HMM monitoring model is currently being integrated with the new Deep Space Network (DSN) antenna controller software and will be on-line monitoring a new DSN 34-m antenna (DSS-24) by July 1994.
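
    A minimal sketch of this fusion idea, assuming made-up transition probabilities, class priors, and classifier outputs: the neural network supplies per-time-step state posteriors, and an HMM forward recursion smooths them so that a single noisy frame does not raise a false alarm.

        import numpy as np

        A = np.array([[0.99, 0.01],          # assumed transition probabilities
                      [0.05, 0.95]])         # states: 0 = nominal, 1 = fault
        pi = np.array([0.99, 0.01])          # assumed initial state distribution
        prior = np.array([0.5, 0.5])         # class prior used to train the NN (assumed)

        # Stand-in for the classifier: each row is P(state | features_t) from the NN.
        nn_posterior = np.array([[0.9, 0.1], [0.8, 0.2], [0.3, 0.7],   # one noisy frame
                                 [0.9, 0.1], [0.2, 0.8], [0.1, 0.9], [0.1, 0.9]])

        def forward_filter(posteriors):
            """Return P(state_t | observations up to t) for each time step t."""
            lik = posteriors / prior         # scaled likelihoods: posterior / prior
            alpha = pi * lik[0]
            alpha /= alpha.sum()
            out = [alpha]
            for l in lik[1:]:
                alpha = (alpha @ A) * l      # predict with A, then update and rescale
                alpha /= alpha.sum()
                out.append(alpha)
            return np.array(out)

        for step, (raw, f) in enumerate(zip(nn_posterior, forward_filter(nn_posterior))):
            print(f"t={step}  NN P(fault)={raw[1]:.2f}  filtered P(fault)={f[1]:.2f}")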

  20. Advanced techniques in reliability model representation and solution

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Nicol, David M.

    1992-01-01

    The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.
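
    For context, the kind of continuous-time Markov reliability model these tools generate and solve can be illustrated with a hypothetical duplex system; the failure and repair rates below are invented, and the transient solution uses a matrix exponential rather than the SURE/ASSURE semi-Markov bounding technique.

        import numpy as np
        from scipy.linalg import expm

        lam, mu = 1e-4, 1e-1      # per-hour failure and repair rates of one unit (assumed)

        # States: 0 = both units up, 1 = one unit failed, 2 = system failed (absorbing).
        Q = np.array([[-2 * lam,      2 * lam,  0.0],
                      [      mu, -(mu + lam),   lam],
                      [     0.0,          0.0,  0.0]])   # generator (rows sum to zero)

        p0 = np.array([1.0, 0.0, 0.0])
        for t in (10.0, 100.0, 1000.0):                  # mission times [h]
            p_t = p0 @ expm(Q * t)                       # p(t) = p(0) exp(Q t)
            print(f"t = {t:6.0f} h   unreliability = {p_t[2]:.3e}")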

  1. Current Challenges in the First Principle Quantitative Modelling of the Lower Hybrid Current Drive in Tokamaks

    NASA Astrophysics Data System (ADS)

    Peysson, Y.; Bonoli, P. T.; Chen, J.; Garofalo, A.; Hillairet, J.; Li, M.; Qian, J.; Shiraiwa, S.; Decker, J.; Ding, B. J.; Ekedahl, A.; Goniche, M.; Zhai, X.

    2017-10-01

    The Lower Hybrid (LH) wave is widely used in existing tokamaks for tailoring the current density profile or extending pulse duration to steady-state regimes. Its high efficiency makes it particularly attractive for a fusion reactor, leading to its consideration for this purpose in the ITER tokamak. Nevertheless, although the basics of the LH wave in tokamak plasmas are well known, quantitative modeling of experimental observations based on first principles remains a highly challenging exercise, despite the considerable numerical efforts made so far. In this context, a rigorous methodology must be carried out in the simulations to identify the minimum number of physical mechanisms that must be considered to reproduce experimental shot-to-shot observations and also scalings (density, power spectrum). Based on recent simulations carried out for EAST, Alcator C-Mod and Tore Supra tokamaks, the state of the art in LH modeling is reviewed. The capability of fast electron bremsstrahlung, internal inductance li and LH driven current at zero loop voltage to jointly constrain LH simulations is discussed, as well as the need for further improvements (diagnostics, codes, LH model) for robust interpretative and predictive simulations.

  2. A methodology for achieving high-speed rates for artificial conductance injection in electrically excitable biological cells.

    PubMed

    Butera, R J; Wilson, C G; Delnegro, C A; Smith, J C

    2001-12-01

    We present a novel approach to implementing the dynamic-clamp protocol (Sharp et al., 1993), commonly used in neurophysiology and cardiac electrophysiology experiments. Our approach is based on real-time extensions to the Linux operating system. Conventional PC-based approaches have typically utilized single-cycle computational rates of 10 kHz or slower. In this paper, we demonstrate reliable cycle-to-cycle rates as fast as 50 kHz. Our system, which we call model reference current injection (MRCI, pronounced "merci"), is also capable of episodic logging of internal state variables and interactive manipulation of model parameters. The limiting factor in achieving high speeds was not processor speed or model complexity, but cycle jitter inherent in the CPU/motherboard performance. We demonstrate these high speeds and flexibility with two examples: 1) adding action-potential ionic currents to a mammalian neuron under whole-cell patch-clamp and 2) altering a cell's intrinsic dynamics via MRCI while simultaneously coupling it via artificial synapses to an internal computational model cell. These higher rates greatly extend the applicability of this technique to the study of fast electrophysiological currents such as fast A-type currents and fast excitatory/inhibitory synapses.
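
    A software-only sketch of the per-cycle dynamic-clamp computation described above (artificial conductance times driving force), paired with a passive-cell emulation so it runs stand-alone; the 50 kHz cycle time, conductance, reversal potential, and cell parameters are illustrative assumptions, and the real MRCI system executes this loop on real-time Linux with data-acquisition hardware.

        DT = 2e-5                  # 20 us cycle, i.e. a 50 kHz update rate
        G_SYN, E_SYN = 5e-9, 0.0   # artificial synaptic conductance [S] and reversal [V] (assumed)

        def dynamic_clamp_cycle(v_measured, gate):
            """One clamp cycle: conductance * driving force -> injected current [A]."""
            return G_SYN * gate * (E_SYN - v_measured)

        # Emulate a passive "cell" receiving the injected current (assumed properties).
        c_m, g_leak, e_leak = 100e-12, 5e-9, -0.065
        v = e_leak
        for k in range(int(0.2 / DT)):                     # 200 ms of simulated time
            t = k * DT
            gate = 1.0 if 0.05 < t < 0.15 else 0.0         # square artificial-synapse activation
            i_inj = dynamic_clamp_cycle(v, gate)           # what the DAC would output
            v += DT * (g_leak * (e_leak - v) + i_inj) / c_m
        print(f"membrane potential 50 ms after the artificial EPSG ends: {v * 1e3:.1f} mV")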

  3. Climate Change and International Competition: the US Army in the Arctic Environment

    DTIC Science & Technology

    2015-05-21

    Capabilities are evaluated within the domains of the current US doctrinal definition of Doctrine, Organization, Training, Materiel, Leadership and Education... environment. Subject terms: US Army Cold Weather Doctrine; US Army Arctic Operational Capability; ULO; Mission Command; Arctic Council; UNCLOS...

  4. Evaluation of methods for characterizing surface topography of models for high Reynolds number wind-tunnels

    NASA Technical Reports Server (NTRS)

    Teague, E. C.; Vorburger, T. V.; Scire, F. E.; Baker, S. M.; Jensen, S. W.; Gloss, B. B.; Trahan, C.

    1982-01-01

    Current work by the National Bureau of Standards at the NASA National Transonic Facility (NTF) to evaluate the performance of stylus instruments for determining the topography of models under investigation is described along with instrumentation for characterization of the surface microtopography. Potential areas of surface effects are reviewed, and the need for finer-surfaced models for the NTF high Reynolds number flows is stressed. Current stylus instruments have tip radii as large as 25 microns, and three models with 4-6, 8-10, and 12-15 micro-in. rms surface finishes were fabricated for tests with a stylus with a tip radius of 1 micron and a 50 mg force. Work involving three-dimensional stylus profilometry is discussed in terms of stylus displacement being converted to digital signals, and the design of a light scattering instrument capable of measuring the surface finish on curved objects is presented.

  5. An experimental comparison of several current viscoplastic constitutive models at elevated temperature

    NASA Technical Reports Server (NTRS)

    James, G. H.; Imbrie, P. K.; Hill, P. S.; Allen, D. H.; Haisler, W. E.

    1988-01-01

    Four current viscoplastic models are compared experimentally for Inconel 718 at 593 C. This material system responds with apparent negative strain rate sensitivity, undergoes cyclic work softening, and is susceptible to low cycle fatigue. A series of tests were performed to create a data base from which to evaluate material constants. A method to evaluate the constants is developed which draws on common assumptions for this type of material, recent advances by other researchers, and iterative techniques. A complex history test, not used in calculating the constants, is then used to compare the predictive capabilities of the models. The combination of exponentially based inelastic strain rate equations and dynamic recovery is shown to model this material system with the greatest success. The method of constant calculation developed was successfully applied to the complex material response encountered. Backstress measuring tests were found to be invaluable and to warrant further development.
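
    To make the class of models concrete, the sketch below integrates an illustrative uniaxial unified viscoplastic law with an exponentially based inelastic strain-rate function and a backstress with dynamic recovery; it is not one of the four surveyed models, and all material constants are invented.

        import numpy as np

        E = 160e9             # elastic modulus [Pa] (assumed)
        A, B = 1e-8, 2.0e-8   # flow-rule constants (assumed)
        H, R = 60e9, 300.0    # backstress hardening and dynamic-recovery constants (assumed)

        def tension(strain_rate=1e-3, strain_max=0.01, dt=1e-3):
            """Strain-controlled tension; returns the (strain, stress) history."""
            eps = eps_in = alpha = 0.0
            hist = []
            while eps < strain_max:
                eps += strain_rate * dt
                sigma = E * (eps - eps_in)                        # elastic relation
                over = sigma - alpha                              # overstress
                dein = A * (np.exp(B * abs(over)) - 1.0) * np.sign(over)
                alpha += (H * dein - R * alpha * abs(dein)) * dt  # dynamic recovery
                eps_in += dein * dt                               # inelastic strain update
                hist.append((eps, sigma))
            return np.array(hist)

        h = tension()
        print(f"stress at {h[-1, 0]:.2%} strain: {h[-1, 1] / 1e6:.0f} MPa")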

  6. Browsing Space Weather Data and Models with the Integrated Space Weather Analysis (iSWA) System

    NASA Technical Reports Server (NTRS)

    Maddox, Marlo M.; Mullinix, Richard E.; Berrios, David H.; Hesse, Michael; Rastaetter, Lutz; Pulkkinen, Antti; Hourcle, Joseph A.; Thompson, Barbara J.

    2011-01-01

    The Integrated Space Weather Analysis (iSWA) System is a comprehensive web-based platform for space weather information that combines data from solar, heliospheric and geospace observatories with forecasts based on the most advanced space weather models. The iSWA system collects, generates, and presents a wide array of space weather resources in an intuitive, user-configurable, and adaptable format - thus enabling users to respond to current and future space weather impacts as well as enabling post-impact analysis. iSWA currently provides over 200 data and modeling products, and features a variety of tools that allow the user to browse, combine, and examine data and models from various sources. This presentation will consist of a summary of the iSWA products and an overview of the customizable user interfaces, and will feature several tutorial demonstrations highlighting the interactive tools and advanced capabilities.

  7. Exemplar for simulation challenges: Large-deformation micromechanics of Sylgard 184/glass microballoon syntactic foams.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Judith Alice; Long, Kevin Nicholas

    2018-05-01

    Sylgard® 184/Glass Microballoon (GMB) potting material is currently used in many NW systems. Analysts need a macroscale constitutive model that can predict material behavior under complex loading and damage evolution. To address this need, ongoing modeling and experimental efforts have focused on study of damage evolution in these materials. Micromechanical finite element simulations that resolve individual GMB and matrix components promote discovery and better understanding of the material behavior. With these simulations, we can study the role of the GMB volume fraction, time-dependent damage, behavior under confined vs. unconfined compression, and the effects of partial damage. These simulations are challenging and push the boundaries of capability even with the high performance computing tools available at Sandia. We summarize the major challenges and the current state of this modeling effort, as an exemplar of micromechanical modeling needs that can motivate advances in future computing efforts.

  8. An Investigation of Micro-Enterprise Capability-Building via Access and Use of Technology

    ERIC Educational Resources Information Center

    Good, Travis Godwin

    2011-01-01

    Micro-enterprises (businesses with one to five employees) lie at the heart of the American economy but are not well-researched. It is believed that technology adoption has the potential to spark strong growth among micro-enterprises, but current technology adoption models are tailored for large businesses and do not consider the human, social, and…

  9. Radiation-Hardened Electronics for the Space Environment

    NASA Technical Reports Server (NTRS)

    Keys, Andrew S.; Watson, Michael D.

    2007-01-01

    RHESE covers a broad range of technology areas and products: Radiation Hardened Electronics, High Performance Processing, Reconfigurable Computing, Radiation Environmental Effects Modeling, and Low Temperature Radiation Hardened Electronics. RHESE has aligned with currently defined customer needs and is leveraging and advancing state-of-the-art (SOA) space electronics rather than duplicating it: awareness of radiation-related activities throughout government and industry allows advancement rather than duplication of capabilities.

  10. Enterprise Architecture Tradespace Analysis

    DTIC Science & Technology

    2014-02-21

    The Department of Defense (DoD)’s Science & Technology (S&T) priority for Engineered Resilient Systems (ERS) calls for adaptable designs with diverse systems models that can easily be... [Holland, 2012]. Some explicit goals are: establish baseline resiliency of current capabilities; more complete and robust...

  11. Modeling and evaluation of the oil-spill emergency response capability based on linguistic variables.

    PubMed

    Kang, Jian; Zhang, Jixin; Bai, Yongqiang

    2016-12-15

    An evaluation of the oil-spill emergency response capability (OS-ERC) currently in place in modern marine management is required to prevent pollution and loss accidents. The objective of this paper is to develop a novel OS-ERC evaluation model, the importance of which stems from the current lack of integrated approaches for interpreting, ranking and assessing OS-ERC performance factors. In the first part of this paper, the factors influencing OS-ERC are analyzed and classified to generate a global evaluation index system. Then, a semantic tree is adopted to illustrate linguistic variables in the evaluation process, followed by the application of a combination of Fuzzy Cognitive Maps (FCM) and the Analytic Hierarchy Process (AHP) to construct and calculate the weight distribution. Finally, considering that the OS-ERC evaluation process is a complex system, a fuzzy comprehensive evaluation (FCE) is employed to calculate the OS-ERC level. The entire evaluation framework obtains the overall level of OS-ERC, and also highlights the potential major issues concerning OS-ERC, as well as expert opinions for improving the feasibility of oil-spill accident prevention and protection. Copyright © 2016 Elsevier Ltd. All rights reserved.
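
    A minimal sketch of the two numerical steps named above, with hypothetical judgments standing in for the paper's index system: AHP weights from the principal eigenvector of a pairwise-comparison matrix, followed by a fuzzy comprehensive evaluation that combines those weights with a grade-membership matrix.

        import numpy as np

        # Pairwise comparison of three OS-ERC factors on the Saaty 1-9 scale (assumed).
        P = np.array([[1.0,   3.0,   5.0],
                      [1/3.0, 1.0,   2.0],
                      [1/5.0, 1/2.0, 1.0]])

        eigvals, eigvecs = np.linalg.eig(P)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                        # AHP weight vector
        CI = (eigvals.real[k] - len(P)) / (len(P) - 1)      # consistency index
        print("weights:", np.round(w, 3), " CI = %.3f" % CI)

        # Membership of each factor in the grades (poor, fair, good), e.g. from expert
        # scoring of equipment, personnel, and drills (assumed values).
        R = np.array([[0.1, 0.3, 0.6],
                      [0.2, 0.5, 0.3],
                      [0.4, 0.4, 0.2]])
        B = w @ R                                           # fuzzy comprehensive evaluation
        print("grade membership (poor, fair, good):", np.round(B / B.sum(), 3))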

  12. Biometric identification: a holistic perspective

    NASA Astrophysics Data System (ADS)

    Nadel, Lawrence D.

    2007-04-01

    Significant advances continue to be made in biometric technology. However, the global war on terrorism and our increasingly electronic society have created the societal need for large-scale, interoperable biometric capabilities that challenge the capabilities of current off-the-shelf technology. At the same time, there are concerns that large-scale implementation of biometrics will infringe our civil liberties and offer increased opportunities for identity theft. This paper looks beyond the basic science and engineering of biometric sensors and fundamental matching algorithms and offers approaches for achieving greater performance and acceptability of applications enabled with currently available biometric technologies. The discussion focuses on three primary biometric system aspects: performance and scalability, interoperability, and cost benefit. Significant improvements in system performance and scalability can be achieved through careful consideration of the following elements: biometric data quality, human factors, operational environment, workflow, multibiometric fusion, and integrated performance modeling. Application interoperability hinges upon some of the factors noted above as well as adherence to interface, data, and performance standards. However, there are times when the price of conforming to such standards is a decrease in local system performance. The development of biometric performance-based cost benefit models can help determine realistic requirements and acceptable designs.

  13. The Requirements and Design of the Rapid Prototyping Capabilities System

    NASA Astrophysics Data System (ADS)

    Haupt, T. A.; Moorhead, R.; O'Hara, C.; Anantharaj, V.

    2006-12-01

    The Rapid Prototyping Capabilities (RPC) system will provide the capability to rapidly evaluate innovative methods of linking science observations. To this end, the RPC will provide the capability to integrate the software components and tools needed to evaluate the use of a wide variety of current and future NASA sensors, numerical models, and research results, model outputs, and knowledge, collectively referred to as "resources". It is assumed that the resources are geographically distributed, and thus RPC will provide the support for the location transparency of the resources. The RPC system requires providing support for: (1) discovery, semantic understanding, secure access and transport mechanisms for data products available from the known data provides; (2) data assimilation and geo- processing tools for all data transformations needed to match given data products to the model input requirements; (3) model management including catalogs of models and model metadata, and mechanisms for creation environments for model execution; and (4) tools for model output analysis and model benchmarking. The challenge involves developing a cyberinfrastructure for a coordinated aggregate of software, hardware and other technologies, necessary to facilitate RPC experiments, as well as human expertise to provide an integrated, "end-to-end" platform to support the RPC objectives. Such aggregation is to be achieved through a horizontal integration of loosely coupled services. The cyberinfrastructure comprises several software layers. At the bottom, the Grid fabric encompasses network protocols, optical networks, computational resources, storage devices, and sensors. At the top, applications use workload managers to coordinate their access to physical resources. Applications are not tightly bounded to a single physical resource. Instead, they bind dynamically to resources (i.e., they are provisioned) via a common grid infrastructure layer. For the RPC system, the cyberinfrastructure must support organizing computations (or "data transformations" in general) into complex workflows with resource discovery, automatic resource allocation, monitoring, preserving provenance as well as to aggregate heterogeneous, distributed data into knowledge databases. Such service orchestration is the responsibility of the "collective services" layer. For RPC, this layer will be based on Java Business Integration (JBI, [JSR-208]) specification which is a standards-based integration platform that combines messaging, web services, data transformation, and intelligent routing to reliably connect and coordinate the interaction of significant numbers of diverse applications (plug-in components) across organizational boundaries. JBI concept is a new approach to integration that can provide the underpinnings for loosely coupled, highly distributed integration network that can scale beyond the limits of currently used hub-and-spoke brokers. This presentation discusses the requirements, design and early prototype of the NASA-sponsored RPC system under development at Mississippi State University, demonstrating the integration of data provisioning mechanisms, data transformation tools and computational models into a single interoperable system enabling rapid execution of RPC experiments.

  14. Phenomenological Model of Current Sheet Canting in Pulsed Electromagnetic Accelerators

    NASA Technical Reports Server (NTRS)

    Markusic, Thomas; Choueiri, E. Y.

    2003-01-01

    The phenomenon of current sheet canting in pulsed electromagnetic accelerators is the departure of the plasma sheet (that carries the current) from a plane that is perpendicular to the electrodes to one that is skewed, or tipped. Review of pulsed electromagnetic accelerator literature reveals that current sheet canting is a ubiquitous phenomenon - occurring in all of the standard accelerator geometries. Developing an understanding of current sheet canting is important because it can detract from the propellant sweeping capabilities of current sheets and, hence, negatively impact the overall efficiency of pulsed electromagnetic accelerators. In the present study, it is postulated that depletion of plasma near the anode, which results from axial density gradient induced diamagnetic drift, occurs during the early stages of the discharge, creating a density gradient normal to the anode, with a characteristic length on the order of the ion skin depth. Rapid penetration of the magnetic field through this region ensues, due to the Hall effect, leading to a canted current front ahead of the initial current conduction channel. In this model, once the current sheet reaches appreciable speeds, entrainment of stationary propellant replenishes plasma in the anode region, inhibiting further Hall-convective transport of the magnetic field; however, the previously established tilted current sheet remains at a fairly constant canting angle for the remainder of the discharge cycle, exerting a transverse J x B force which drives plasma toward the cathode and accumulates it there. This proposed sequence of events has been incorporated into a phenomenological model. The model predicts that canting can be reduced by using low atomic mass propellants with high propellant loading number density; the model results are shown to give qualitative agreement with experimentally measured canting angle mass dependence trends.
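
    For reference, the ion skin depth invoked above as the characteristic gradient length has the standard definition (SI units), with n_i the ion number density, Z the charge state, and m_i the ion mass:

        d_i = \frac{c}{\omega_{pi}}, \qquad
        \omega_{pi} = \sqrt{\frac{n_i Z^2 e^2}{\varepsilon_0 m_i}}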

  15. Digital current regulator for proportional electro-hydraulic valves with unknown disturbance rejection.

    PubMed

    Canuto, Enrico; Acuña-Bravo, Wilber; Agostani, Marco; Bonadei, Marco

    2014-07-01

    Solenoid current regulation is well-known and standard in any proportional electro-hydraulic valve. The goal is to provide a wide-band transfer function from the reference to the measured current, thus making the solenoid a fast and ideal force actuator within the limits of the power supplier. The power supplier is usually a Pulse Width Modulation (PWM) amplifier fixing the voltage bound and the Nyquist frequency of the regulator. Typical analog regulators include three main terms: a feedforward channel, a proportional feedback channel and the electromotive force compensation. The latter compensation may be accomplished by integrative feedback. Here the problem is faced through a model-based design (Embedded Model Control), on the basis of a wide-band embedded model of the solenoid which includes the effect of eddy currents. To this end model parameters must be identified. The embedded model includes a stochastic disturbance dynamics capable of estimating and correcting the electromotive contribution together with parametric uncertainty, variability and state dependence. The embedded model, which is fed by the measured current and the supplied voltage, becomes a state predictor of the controllable and disturbance dynamics. The control law combines reference generator, state feedback and disturbance rejection to dispatch the PWM amplifier with the appropriate duty cycle. Modeling, identification and control design are outlined together with experimental results. Comparison with an existing analog regulator is also provided. © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
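
    A discrete-time sketch of the three classical regulator terms the paper compares against (feedforward, proportional feedback, and rejection of an estimated back-EMF disturbance), not the Embedded Model Control design itself; the solenoid parameters, gains, 20 kHz rate, and disturbance value are assumptions.

        import numpy as np

        R, L = 4.0, 8e-3            # solenoid resistance [ohm] and inductance [H] (assumed)
        V_BUS, TS = 24.0, 5e-5      # PWM supply voltage [V] and control period [s] (20 kHz)
        KP, K_OBS = 12.0, 200.0     # proportional gain and disturbance-observer gain (assumed)

        i = i_model = d_hat = 0.0   # measured current, observer current, estimated disturbance
        i_ref, back_emf = 1.0, 1.5  # set point [A] and unknown electromotive disturbance [V]

        for _ in range(4000):                               # 0.2 s of closed-loop operation
            u = R * i_ref + KP * (i_ref - i) + d_hat        # feedforward + feedback + rejection
            duty = float(np.clip(u / V_BUS, 0.0, 1.0))      # duty cycle handed to the PWM stage
            v = duty * V_BUS
            i += TS * (v - R * i - back_emf) / L            # plant: L di/dt = v - R i - e
            i_model += TS * (v - R * i_model - d_hat) / L   # internal model driven by d_hat
            d_hat += K_OBS * TS * (i_model - i)             # raise d_hat until the model matches

        print(f"steady-state current {i:.3f} A for a {i_ref:.1f} A set point "
              f"(estimated disturbance {d_hat:.2f} V, true {back_emf:.1f} V)")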

  16. Cygnus X-1: A Case for a Magnetic Accretion Disk?

    NASA Technical Reports Server (NTRS)

    Nowak, Michael A.; Vaughan, B. A.; Dove, J.; Wilms, J.

    1996-01-01

    With the advent of the Rossi X-ray Timing Explorer (RXTE), which is capable of broad spectral coverage and fast timing, as well as other instruments which are increasingly being used in multi-wavelength campaigns (via both space-based and ground-based observations), we must demand more of our theoretical models. No current model mimics all facets of a system as complex as an x-ray binary. However, a modern theory should qualitatively reproduce - or at the very least not fundamentally disagree with - all of Cygnus X-1's most basic average properties: energy spectrum (viewed within a broader framework of black hole candidate spectral behavior), power spectrum (PSD), and time delays and coherence between variability in different energy bands. Below we discuss each of these basic properties in turn, and we assess the health of one of the currently popular theories: Comptonization of photons from a cold disk. We find that the data pose substantial challenges for this theory, as well as for all other currently discussed models.

  17. Modeling Longitudinal Dynamics in the Fermilab Booster Synchrotron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostiguy, Jean-Francois; Bhat, Chandra; Lebedev, Valeri

    2016-06-01

    The PIP-II project will replace the existing 400 MeV linac with a new, CW-capable, 800 MeV superconducting one. With respect to current operations, a 50% increase in beam intensity in the rapid cycling Booster synchrotron is expected. Booster batches are combined in the Recycler ring; this process limits the allowed longitudinal emittance of the extracted Booster beam. To suppress eddy currents, the Booster has no beam pipe; magnets are evacuated, exposing the beam to core laminations and this has a substantial impact on the longitudinal impedance. Noticeable longitudinal emittance growth is already observed at transition crossing. Operation at higher intensity will likely necessitate mitigation measures. We describe systematic efforts to construct a predictive model for current operating conditions. A longitudinal only code including a laminated wall impedance model, space charge effects, and feedback loops is developed. Parameter validation is performed using detailed measurements of relevant beam, rf and control parameters. An attempt is made to benchmark the code at operationally favorable machine settings.

  18. Explicit wave action conservation for water waves on vertically sheared flows

    NASA Astrophysics Data System (ADS)

    Quinn, Brenda; Toledo, Yaron; Shrira, Victor

    2016-04-01

    Water waves almost always propagate on currents with a vertical structure, such as currents directed towards the beach accompanied by an under-current directed back toward the deep sea, or wind-induced currents which change magnitude with depth due to viscosity effects. On larger scales they also change their direction due to the Coriolis force as described by the Ekman spiral. This implies that the existing wave models, which assume vertically-averaged currents, rely on an approximation which is far from realistic. In recent years, ocean circulation models have significantly improved with the capability to model vertically-sheared current profiles in contrast with the earlier vertically-averaged current profiles. Further advancements have coupled wave action models to circulation models to relate the mutual effects between the two types of motion. Restricting wave models to vertically-averaged non-turbulent current profiles is obviously problematic in these cases and the primary goal of this work is to derive and examine a general wave action equation which accounts for these shortcomings. The formulation of the wave action conservation equation is made explicit by following the work of Voronovich (1976) and using known asymptotic solutions of the boundary value problem which exploit the smallness of the current magnitude compared to the wave phase velocity and/or its vertical shear and curvature. The adopted approximations are shown to be sufficient for most of the conceivable applications. This provides correction terms to the group velocity and wave action definition accounting for the shear effects, which are fitting for application to operational wave models. In the limit of vanishing current shear, the new formulation reduces to the commonly used Bretherton & Garrett (1968) no-shear wave action equation where the invariant is calculated with the current magnitude taken at the free surface. It is shown that in realistic oceanic conditions, the neglect of the vertical structure of the currents in wave modelling, which is currently universal, might lead to significant errors in wave amplitude and the predicted wave ray paths. An extension of the work toward the more complex case of turbulent currents will also be discussed.
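
    For context, the no-shear limit referred to above (Bretherton & Garrett 1968, with the current treated as depth-uniform) is the conservation of the action density A,

        \frac{\partial A}{\partial t} + \nabla \cdot \left[ (\mathbf{U} + \mathbf{c}_g)\, A \right] = 0,
        \qquad A = \frac{E}{\sigma}, \qquad \sigma = \omega - \mathbf{k} \cdot \mathbf{U},

    where E is the wave energy density, sigma the intrinsic frequency, c_g the intrinsic group velocity, and U the current (taken at the free surface in the no-shear limit); the corrections derived in this work modify A and c_g to account for the vertical shear.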

  19. Using a hybrid neuron in physiologically inspired models of the basal ganglia.

    PubMed

    Thibeault, Corey M; Srinivasa, Narayan

    2013-01-01

    Our current understanding of the basal ganglia (BG) has facilitated the creation of computational models that have contributed novel theories, explored new functional anatomy and demonstrated results complementing physiological experiments. However, the utility of these models extends beyond these applications, particularly in neuromorphic engineering, where the basal ganglia's role in computation is important for applications such as power-efficient autonomous agents and model-based control strategies. The neurons used in existing computational models of the BG, however, are not amenable to many low-power hardware implementations. Motivated by a need for more hardware accessible networks, we replicate four published models of the BG, spanning single neuron and small networks, replacing the more computationally expensive neuron models with an Izhikevich hybrid neuron. This begins with a network modeling action-selection, where the basal activity levels and the ability to appropriately select the most salient input are reproduced. A Parkinson's disease model is then explored under normal conditions, Parkinsonian conditions and during subthalamic nucleus deep brain stimulation (DBS). The resulting network is capable of replicating the loss of thalamic relay capabilities in the Parkinsonian state and its return under DBS. This is also demonstrated using a network capable of action-selection. Finally, a study of correlation transfer under different patterns of Parkinsonian activity is presented. These networks successfully captured the significant results of the original studies. This not only creates a foundation for neuromorphic hardware implementations but may also support the development of large-scale biophysical models. The former potentially provides a way of improving the efficacy of DBS, and the latter allows for the efficient simulation of larger, more comprehensive networks.
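
    For reference, a minimal sketch of the Izhikevich hybrid neuron used as the replacement unit, with the canonical regular-spiking parameters and an assumed step current; the replicated basal ganglia network structures are not reproduced here.

        a, b, c, d = 0.02, 0.2, -65.0, 8.0       # regular-spiking parameters
        dt, t_end = 0.5, 500.0                   # [ms]
        v, u = -70.0, 0.2 * -70.0                # start at rest
        spikes = []

        for step in range(int(t_end / dt)):
            t = step * dt
            I = 10.0 if t > 100.0 else 0.0       # step current injection (assumed)
            v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
            u += dt * a * (b * v - u)
            if v >= 30.0:                        # spike: apply the hybrid reset
                spikes.append(t)
                v, u = c, u + d

        rate = 1000.0 * len(spikes) / (t_end - 100.0)
        print(f"{len(spikes)} spikes, mean rate {rate:.1f} Hz during the current step")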

  20. Wind Field and Trajectory Models for Tornado-Propelled Objects

    NASA Technical Reports Server (NTRS)

    Redmann, G. H.; Radbill, J. R.; Marte, J. E.; Dergarabedian, P.; Fendell, F. E.

    1978-01-01

    A mathematical model to predict the trajectory of tornado-borne objects postulated to be in the vicinity of nuclear power plants is developed. An improved tornado wind field model satisfies the no-slip ground boundary condition of fluid mechanics and includes the functional dependence of eddy viscosity on altitude. Subscale wind tunnel data are obtained for all of the missiles currently specified for nuclear plant design. Confirmatory full-scale data are obtained for a 12 inch pipe and an automobile. The original six degree of freedom trajectory model is modified to include the improved wind field and to increase the range of body shapes and inertial characteristics that can be handled. The improved trajectory model is used to calculate maximum credible speeds, which for all of the heavy missiles are considerably less than those currently specified for design. Equivalent coefficients for use in three degree of freedom models are developed, and the sensitivity of range and speed to various trajectory parameters for the 12 inch diameter pipe is examined.
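
    A much-simplified three-degree-of-freedom sketch of the trajectory idea (a point mass with drag in a prescribed swirling wind field), not the report's six-degree-of-freedom model; the Rankine-vortex wind profile, drag area, mass, and release point are invented for illustration.

        import numpy as np

        RHO, G = 1.2, 9.81               # air density [kg/m^3], gravity [m/s^2]
        CD_A, MASS = 0.7 * 0.09, 150.0   # drag coefficient * area [m^2] and mass [kg] (assumed)

        def wind(pos):
            """Crude swirling wind field: a Rankine vortex with an 80 m/s peak at r = 50 m."""
            x, y, _ = pos
            r = np.hypot(x, y) + 1e-9
            v_tan = 80.0 * min(r / 50.0, 50.0 / r)
            return np.array([-v_tan * y / r, v_tan * x / r, 0.0])

        def step(pos, vel, dt=0.01):
            rel = wind(pos) - vel                           # wind relative to the missile
            acc = (0.5 * RHO * CD_A * np.linalg.norm(rel) * rel / MASS
                   + np.array([0.0, 0.0, -G]))              # drag plus gravity
            return pos + vel * dt, vel + acc * dt

        pos, vel = np.array([60.0, 0.0, 10.0]), np.zeros(3)
        while pos[2] > 0.0:                                 # integrate until ground impact
            pos, vel = step(pos, vel)
        print(f"impact speed {np.linalg.norm(vel):.1f} m/s at "
              f"{np.hypot(pos[0], pos[1]):.0f} m from the vortex axis")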

  1. Simulations of the cardiac action potential based on the Hodgkin-Huxley kinetics with the use of Microsoft Excel spreadsheets.

    PubMed

    Wu, Sheng-Nan

    2004-03-31

    The purpose of this study was to develop a method to simulate the cardiac action potential using a Microsoft Excel spreadsheet. The mathematical model contained voltage-gated ionic currents that were modeled using either Beeler-Reuter (B-R) or Luo-Rudy (L-R) phase 1 kinetics. The simulation protocol involves the use of in-cell formulas directly typed into a spreadsheet. The capability of spreadsheet iteration was used in these simulations. It does not require any prior knowledge of computer programming, although the use of the macro language can speed up the calculation. The normal configuration of the cardiac ventricular action potential can be well simulated in the B-R model that is defined by four individual ionic currents, each representing the diffusion of ions through channels in the membrane. The contribution of Na+ inward current to the rate of depolarization is reproduced in this model. After removal of Na+ current from the model, a constant current stimulus elicits an oscillatory change in membrane potential. In the L-R phase 1 model where six types of ionic currents were defined, the effect of extracellular K+ concentration on changes both in the time course of repolarization and in the time-independent K+ current can be demonstrated, when the solutions are implemented in Excel. Using the simulation protocols described here, the users can readily study and graphically display the underlying properties of ionic currents to see how changes in these properties determine the behavior of the heart cell. The method employed in these simulation protocols may also be extended or modified to other biological simulation programs.
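
    The same explicit-iteration scheme can be written outside a spreadsheet; the sketch below uses the classic Hodgkin-Huxley squid-axon kinetics (membrane potential measured as depolarization from rest) rather than the Beeler-Reuter or Luo-Rudy cardiac formulations used in the paper, so it illustrates only the numerical recipe.

        import numpy as np

        GNA, GK, GL = 120.0, 36.0, 0.3        # maximal conductances [mS/cm^2]
        ENA, EK, EL = 115.0, -12.0, 10.6      # reversal potentials [mV, relative to rest]
        DT = 0.01                             # time step [ms]

        def an(v): return 0.01 * (10 - v) / (np.exp((10 - v) / 10) - 1)
        def bn(v): return 0.125 * np.exp(-v / 80)
        def am(v): return 0.1 * (25 - v) / (np.exp((25 - v) / 10) - 1)
        def bm(v): return 4.0 * np.exp(-v / 18)
        def ah(v): return 0.07 * np.exp(-v / 20)
        def bh(v): return 1.0 / (np.exp((30 - v) / 10) + 1)

        v, n, m, h, spikes = 0.0, 0.318, 0.053, 0.596, 0   # approximate resting values
        for k in range(int(50.0 / DT)):                    # 50 ms of simulated time
            i_stim = 10.0 if k * DT > 5.0 else 0.0         # step stimulus [uA/cm^2]
            i_ion = GNA * m ** 3 * h * (v - ENA) + GK * n ** 4 * (v - EK) + GL * (v - EL)
            v_new = v + DT * (i_stim - i_ion)              # C_m = 1 uF/cm^2
            n += DT * (an(v) * (1 - n) - bn(v) * n)        # explicit Euler gate updates
            m += DT * (am(v) * (1 - m) - bm(v) * m)
            h += DT * (ah(v) * (1 - h) - bh(v) * h)
            if v < 50.0 <= v_new:                          # count upward threshold crossings
                spikes += 1
            v = v_new
        print(f"{spikes} action potentials in 50 ms at 10 uA/cm^2")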

  2. Space Weather Modeling Services at the Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Hesse, Michael

    2006-01-01

    The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership, which aims at the creation of next generation space weather models. The goal of the CCMC is to support the research and developmental work necessary to substantially increase the present-day modeling capability for space weather purposes, and to provide models for transition to the Rapid Prototyping Centers at the space weather forecast centers. This goal requires close collaborations with and substantial involvement of the research community. The physical regions to be addressed by CCMC-related activities range from the solar atmosphere to the Earth's upper atmosphere. The CCMC is an integral part of the National Space Weather Program Implementation Plan, of NASA's Living With a Star (LWS) initiative, and of the Department of Defense Space Weather Transition Plan. CCMC includes a facility at NASA Goddard Space Flight Center. CCMC also provides, to the research community, access to state-of-the-art space research models. In this paper we will provide a description of the current CCMC status, discuss current plans, research and development accomplishments and goals, and describe the model testing and validation process undertaken as part of the CCMC mandate. Special emphasis will be on solar and heliospheric models currently residing at CCMC, and on plans for validation and verification.

  3. The Coastal Ocean Prediction Systems program: Understanding and managing our coastal ocean

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eden, H.F.; Mooers, C.N.K.

    1990-06-01

    The goal of COPS is to couple a program of regular observations to numerical models, through techniques of data assimilation, in order to provide a predictive capability for the US coastal ocean including the Great Lakes, estuaries, and the entire Exclusive Economic Zone (EEZ). The objectives of the program include: determining the predictability of the coastal ocean and the processes that govern the predictability; developing efficient prediction systems for the coastal ocean based on the assimilation of real-time observations into numerical models; and coupling the predictive systems for the physical behavior of the coastal ocean to predictive systems for biological, chemical, and geological processes to achieve an interdisciplinary capability. COPS will provide the basis for effective monitoring and prediction of coastal ocean conditions by optimizing the use of increased scientific understanding, improved observations, advanced computer models, and computer graphics to make the best possible estimates of sea level, currents, temperatures, salinities, and other properties of entire coastal regions.

  4. Innovative Near Real-Time Data Dissemination Tools Developed by the Space Weather Research Center

    NASA Astrophysics Data System (ADS)

    Mullinix, R.; Maddox, M. M.; Berrios, D.; Kuznetsova, M.; Pulkkinen, A.; Rastaetter, L.; Zheng, Y.

    2012-12-01

    Space weather affects virtually all of NASA's endeavors, from robotic missions to human exploration. Knowledge and prediction of space weather conditions are therefore essential to NASA operations. The diverse nature of currently available space environment measurements and modeling products compels the need for a single access point to such information. The Integrated Space Weather Analysis (iSWA) System provides this single-point access along with the capability to collect and catalog a vast range of sources including both observational and model data. NASA Goddard Space Weather Research Center heavily utilizes the iSWA System daily for research, space weather model validation, and forecasting for NASA missions. iSWA provides the capabilities to view and analyze near real-time space weather data from anywhere in the world. This presentation will describe the technology behind the iSWA system and explain how to use the system for space weather research, forecasting, training, education, and sharing.

  5. Novel Threat-risk Index Using Probabilistic Risk Assessment and Human Reliability Analysis - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George A. Beitel

    2004-02-01

    In support of a national need to improve the current state-of-the-art in alerting decision makers to the risk of terrorist attack, a quantitative approach employing scientific and engineering concepts to develop a threat-risk index was undertaken at the Idaho National Engineering and Environmental Laboratory (INEEL). As a result of this effort, a set of models has been successfully integrated into a single comprehensive model known as Quantitative Threat-Risk Index Model (QTRIM), with the capability of computing a quantitative threat-risk index on a system level, as well as for the major components of the system. Such a threat-risk index could provide a quantitative variant or basis for either prioritizing security upgrades or updating the current qualitative national color-coded terrorist threat alert.

  6. Evaluation of Current Planetary Boundary Layer Retrieval Capabilities from Space

    NASA Technical Reports Server (NTRS)

    Santanello, Joseph A., Jr.; Schaefer, Alexander J.; Blaisdell, John; Yorks, John

    2016-01-01

    The PBL over land remains a significant gap in our water and energy cycle understanding from space. This work combines unique NASA satellite and model products to demonstrate the ability of current sensors (advanced IR sounding and lidar) to retrieve PBL properties and in turn their potential to be used globally to evaluate and improve weather and climate prediction models. While incremental progress has been made in recent AIRS retrieval versions, the vertical resolution remains insufficient for detecting PBL properties. Lidar shows promise in terms of detecting vertical gradients (and PBLh) in the lower troposphere, but daytime conditions over land remain a challenge due to noise, and coverage is limited by return times of approximately 2 weeks or longer.

  7. Next generation data systems and knowledge products to support agricultural producers and science-based policy decision making.

    PubMed

    Capalbo, Susan M; Antle, John M; Seavert, Clark

    2017-07-01

    Research on next generation agricultural systems models shows that the most important current limitation is data, both for on-farm decision support and for research investment and policy decision making. One of the greatest data challenges is to obtain reliable data on farm management decision making, both for current conditions and under scenarios of changed bio-physical and socio-economic conditions. This paper presents a framework for the use of farm-level and landscape-scale models and data to provide analysis that could be used in NextGen knowledge products, such as mobile applications or personal computer data analysis and visualization software. We describe two analytical tools - AgBiz Logic and TOA-MD - that demonstrate the current capability of farm-level and landscape-scale models. The use of these tools is explored with a case study of an oilseed crop, Camelina sativa, which could be used to produce jet aviation fuel. We conclude with a discussion of innovations needed to facilitate the use of farm and policy-level models to generate data and analysis for improved knowledge products.

  8. Low-Order Modeling of Dynamic Stall on Airfoils in Incompressible Flow

    NASA Astrophysics Data System (ADS)

    Narsipur, Shreyas

    Unsteady aerodynamics has been a topic of research since the late 1930's and has increased in popularity among researchers studying dynamic stall in helicopters, insect/bird flight, micro air vehicles, wind-turbine aerodynamics, and flow-energy harvesting devices. Several experimental and computational studies have helped researchers gain a good understanding of the unsteady flow phenomena, but have proved to be expensive and time-intensive for rapid design and analysis purposes. Since the early 1970's, the push to develop low-order models to solve unsteady flow problems has resulted in several semi-empirical models capable of effectively analyzing unsteady aerodynamics in a fraction of the time required by high-order methods. However, due to the various complexities associated with time-dependent flows, several empirical constants and curve fits derived from existing experimental and computational results are required by the semi-empirical models to be an effective analysis tool. The aim of the current work is to develop a low-order model capable of simulating incompressible dynamic-stall-type flow problems with a focus on accurately modeling the unsteady flow physics with the aim of reducing empirical dependencies. The lumped-vortex-element (LVE) algorithm is used as the baseline unsteady inviscid model to which augmentations are applied to model unsteady viscous effects. The current research is divided into two phases. The first phase focused on augmentations aimed at modeling pure unsteady trailing-edge boundary-layer separation and stall without leading-edge vortex (LEV) formation. The second phase is targeted at including LEV shedding capabilities to the LVE algorithm and combining with the trailing-edge separation model from phase one to realize a holistic, optimized, and robust low-order dynamic stall model. In phase one, initial augmentations to theory were focused on modeling the effects of steady trailing-edge separation by implementing a non-linear decambering flap to model the effect of the separated boundary-layer. Unsteady RANS results for several pitch and plunge motions showed that the differences in aerodynamic loads between steady and unsteady flows can be attributed to the boundary-layer convection lag, which can be modeled by choosing an appropriate value of the time lag parameter, tau2. In order to provide appropriate viscous corrections to inviscid unsteady calculations, the non-linear decambering flap is applied with a time lag determined by the tau2 value, which was found to be independent of motion kinematics for a given airfoil and Reynolds number. The predictions of the aerodynamic loads, unsteady stall, hysteresis loops, and flow reattachment from the low-order model agree well with CFD and experimental results, both for individual cases and for trends between motions. The model was also found to perform as well as existing semi-empirical models while using only a single empirically defined parameter. Inclusion of LEV shedding capabilities and combining the resulting algorithm with phase one's trailing-edge separation model was the primary objective of phase two. Computational results at low and high Reynolds numbers were used to analyze the flow morphology of the LEV to identify the common surface signature associated with LEV initiation at both low and high Reynolds numbers and relate it to the critical leading-edge suction parameter (LESP) to control the initiation and termination of LEV shedding in the low-order model. The critical LESP, like the tau2 parameter, was found to be independent of motion kinematics for a given airfoil and Reynolds number. Results from the final low-order model compared excellently with CFD and experimental solutions, both in terms of aerodynamic loads and vortex flow pattern predictions. Overall, the final combined dynamic stall model that resulted from the current research was successful in accurately modeling the physics of unsteady flow thereby helping restrict the number of empirical coefficients to just two variables while successfully modeling the aerodynamic forces and flow patterns in a simple and precise manner.

  9. Science based integrated approach to advanced nuclear fuel development - integrated multi-scale multi-physics hierarchical modeling and simulation framework Part III: cladding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tome, Carlos N; Caro, J A; Lebensohn, R A

    2010-01-01

    Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model the nuclear fuel systems to develop predictive tools is critical. Not only are fabrication and performance models needed to understand specific aspects of the nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding and predictive capability for simulating the phase and microstructural behavior of the nuclear fuel system materials and matrices. In this paper we review the current status of the advanced modeling and simulation of nuclear reactor cladding, with emphasis on what is available and what is to be developed in each scale of the project, how we propose to pass information from one scale to the next, and what experimental information is required for benchmarking and advancing the modeling at each scale level.

  10. ADPAC v1.0: User's Manual

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Heidegger, Nathan J.; Delaney, Robert A.

    1999-01-01

    The overall objective of this study was to evaluate the effects of turbulence models in a 3-D numerical analysis on the wake prediction capability. The current version of the computer code resulting from this study is referred to as ADPAC v7 (Advanced Ducted Propfan Analysis Codes -Version 7). This report is intended to serve as a computer program user's manual for the ADPAC code used and modified under Task 15 of NASA Contract NAS3-27394. The ADPAC program is based on a flexible multiple-block and discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. Turbulence models now available in the ADPAC code are: a simple mixing-length model, the algebraic Baldwin-Lomax model with user defined coefficients, the one-equation Spalart-Allmaras model, and a two-equation k-R model. The consolidated ADPAC code is capable of executing in either a serial or parallel computing mode from a single source code.

  11. Virtual Habitat -a dynamic simulation of closed life support systems -human model status

    NASA Astrophysics Data System (ADS)

    Markus Czupalla, M. Sc.; Zhukov, Anton; Hwang, Su-Au; Schnaitmann, Jonas

    In order to optimize Life Support Systems on a system level, stability questions must be investigated. To do so, the exploration group of the Technical University of Munich (TUM) is developing the "Virtual Habitat" (V-HAB) dynamic LSS simulation software. V-HAB shall provide the possibility to conduct dynamic simulations of entire mission scenarios for any given LSS configuration. The Virtual Habitat simulation tool consists of four main modules: • Closed Environment Module (CEM) - monitoring of compounds in a closed environment • Crew Module (CM) - dynamic human simulation • P/C Systems Module (PCSM) - dynamic P/C subsystems • Plant Module (PM) - dynamic plant simulation. The core module of the simulation is the dynamic and environment sensitive human module. Introduced in its basic version in 2008, the human module has been significantly updated since, increasing its capabilities and maturity. In this paper three newly added human model subsystems (thermal regulation, digestion and schedule controller) are introduced, touching also on the human stress subsystem which is currently under development. Upon the introduction of these new subsystems, the integration of these into the overall V-HAB human model is discussed, highlighting the impact on the most important I/F. The overall human model capabilities shall further be summarized and presented based on meaningful test cases. In addition to the presentation of the results, the correlation strategy for the Virtual Habitat human model shall be introduced, assessing the model's current confidence level and giving an outlook on the future correlation strategy. Last but not least, the remaining V-HAB modules shall be introduced briefly, showing how the human model is integrated into the overall simulation.

  12. An efficient and scalable graph modeling approach for capturing information at different levels in next generation sequencing reads

    PubMed Central

    2013-01-01

    Background Next generation sequencing technologies have greatly advanced many research areas of the biomedical sciences through their capability to generate massive amounts of genetic information at unprecedented rates. The advent of next generation sequencing has led to the development of numerous computational tools to analyze and assemble the millions to billions of short sequencing reads produced by these technologies. While these tools filled an important gap, current approaches for storing, processing, and analyzing short read datasets generally have remained simple and lack the complexity needed to efficiently model the produced reads and assemble them correctly. Results Previously, we presented an overlap graph coarsening scheme for modeling read overlap relationships on multiple levels. Most current read assembly and analysis approaches use a single graph or set of clusters to represent the relationships among a read dataset. Instead, we use a series of graphs to represent the reads and their overlap relationships across a spectrum of information granularity. At each information level our algorithm is capable of generating clusters of reads from the reduced graph, forming an integrated graph modeling and clustering approach for read analysis and assembly. Previously we applied our algorithm to simulated and real 454 datasets to assess its ability to efficiently model and cluster next generation sequencing data. In this paper we extend our algorithm to large simulated and real Illumina datasets to demonstrate that our algorithm is practical for both sequencing technologies. Conclusions Our overlap graph theoretic algorithm is able to model next generation sequencing reads at various levels of granularity through the process of graph coarsening. Additionally, our model allows for efficient representation of the read overlap relationships, is scalable for large datasets, and is practical for both Illumina and 454 sequencing technologies. PMID:24564333
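
    A toy sketch (not the authors' algorithm) of one coarsening level by heavy-edge matching, which is the basic operation behind a multi-level overlap-graph hierarchy: reads joined by their strongest overlap are merged into super-nodes and parallel edges are accumulated. The read names and overlap lengths are invented.

        def coarsen(edges):
            """edges: dict {(u, v): overlap_length}.  Returns (node_mapping, coarse_edges)."""
            matched, mapping = set(), {}
            # Visit edges from heaviest overlap to lightest and match greedily.
            for (u, v), w in sorted(edges.items(), key=lambda e: -e[1]):
                if u not in matched and v not in matched:
                    matched |= {u, v}
                    mapping[u] = mapping[v] = f"{u}+{v}"
            for u, v in edges:                     # unmatched reads survive unchanged
                mapping.setdefault(u, u)
                mapping.setdefault(v, v)
            coarse = {}
            for (u, v), w in edges.items():
                cu, cv = mapping[u], mapping[v]
                if cu != cv:                       # drop edges internal to a super-node
                    key = tuple(sorted((cu, cv)))
                    coarse[key] = coarse.get(key, 0) + w   # accumulate parallel edges
            return mapping, coarse

        overlaps = {("r1", "r2"): 90, ("r2", "r3"): 40, ("r3", "r4"): 85, ("r1", "r4"): 10}
        mapping, level1 = coarsen(overlaps)
        print(mapping)   # r1 and r2 merge, r3 and r4 merge
        print(level1)    # one coarse edge carrying the accumulated overlap weight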

  13. Development of Green Fuels From Algae - The University of Tulsa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crunkleton, Daniel; Price, Geoffrey; Johannes, Tyler

    The general public has become increasingly aware of the pitfalls encountered with the continued reliance on fossil fuels in the industrialized world. In response, the scientific community is in the process of developing non-fossil fuel technologies that can supply adequate energy while also being environmentally friendly. In this project, we concentrate on green fuels which we define as those capable of being produced from renewable and sustainable resources in a way that is compatible with the current transportation fuel infrastructure. One route to green fuels that has received relatively little attention begins with algae as a feedstock. Algae are a diverse group of aquatic, photosynthetic organisms, generally categorized as either macroalgae (i.e. seaweed) or microalgae. Microalgae constitute a spectacularly diverse group of prokaryotic and eukaryotic unicellular organisms and account for approximately 50% of global organic carbon fixation. The PI's have subdivided the proposed research program into three main research areas, all of which are essential to the development of commercially viable algae fuels compatible with current energy infrastructure. In the fuel development focus, catalytic cracking reactions of algae oils is optimized. In the species development project, genetic engineering is used to create microalgae strains that are capable of high-level hydrocarbon production. For the modeling effort, the construction of multi-scaled models of algae production was prioritized, including integrating small-scale hydrodynamic models of algae production and reactor design and large-scale design optimization models.

  14. Carrier velocity effect on carbon nanotube Schottky contact

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fathi, Amir, E-mail: fathi.amir@hotmail.com; Ahmadi, M. T., E-mail: mt.ahmadi@urmia.ac.ir; Ismail, Razali, E-mail: Razali@fke.utm.my

    One of the most important drawbacks that has pushed silicon-based technologies toward their technical limits is the instability of their products at the nano-level. In contrast, carbon-based materials such as the carbon nanotube (CNT) have attracted scientific effort as alternatives. Some of the important advantages of CNTs over silicon components are high mechanical strength, high sensing capability and a large surface-to-volume ratio. In this article, a model of the CNT Schottky transistor current under an externally applied voltage is employed. This model shows that the current has only a weak dependence on the thermal velocity at small applied voltages. The conditions are quite different at high bias voltages, where the current is independent of temperature. Our results indicate that the current increases with the Fermi velocity, but the I–V curves do not change considerably with the number of carriers; that is, the current does not rise sharply with voltage across different carrier numbers.

  15. Numerical modeling of the coupling of an ICRH antenna with a plasma with self-consistent antenna currents

    NASA Astrophysics Data System (ADS)

    Pécoul, S.; Heuraux, S.; Koch, R.; Leclert, G.

    2002-07-01

    A realistic modeling of ICRH antennas requires knowledge of the antenna currents. The code ICANT determines these currents self-consistently and, as a byproduct, the electrical characteristics of the antenna (radiated power, propagation constants on the straps, frequency response, etc.). The formalism allows for the description of three-dimensional antenna elements (for instance, finite-size thick screen blades). The results obtained for various cases where analytical results are available are discussed, as are the resonances appearing in the spectrum and the occurrence of unphysical resonant modes. The capability of this self-consistent method is illustrated by a number of examples, e.g., fully conducting thin or thick screen bars leading to magnetic shielding effects, the frequency response and resonances of an end-tuned antenna, and field distributions in front of a Tore-Supra type antenna with tilted screen blades.

  16. Use of artificial intelligence in supervisory control

    NASA Technical Reports Server (NTRS)

    Cohen, Aaron; Erickson, Jon D.

    1989-01-01

    Viewgraphs describing the design and testing of an intelligent decision support system called OFMspert are presented. In this expert system, knowledge about the human operator is represented through an operator/system model referred to as the OFM (Operator Function Model). OFMspert uses the blackboard model of problem solving to maintain a dynamic representation of operator goals, plans, tasks, and actions given previous operator actions and current system state. Results of an experiment to assess OFMspert's intent inferencing capability are outlined. Finally, the overall design philosophy for an intelligent tutoring system (OFMTutor) for operators of complex dynamic systems is summarized.

  17. OAST system technology planning

    NASA Technical Reports Server (NTRS)

    Sadin, S. R.

    1978-01-01

    The NASA Office of Aeronautics and Space Technology developed a planning model for space technology consisting of a space systems technology model, technology forecasts, and technology surveys. The technology model describes candidate space missions through the year 2000 and identifies their technology requirements. The technology surveys and technology forecasts provide, respectively, data on the current status and estimates of the projected status of relevant technologies. These tools are used to further the understanding of the activities and resources required to ensure the timely development of technological capabilities. Technology forecasting in the areas of information systems, spacecraft systems, transportation systems, and power systems is discussed.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Platania, P., E-mail: platania@ifp.cnr.it; Figini, L.; Farina, D.

    The purpose of this work is the optical modeling and physical performance evaluation of the JT-60SA ECRF launcher system. The beams have been simulated with the electromagnetic code GRASP® and used as input for ECCD calculations performed with the beam tracing code GRAY, which is capable of modeling propagation, absorption and current drive of an EC Gaussian beam with general astigmatism. Full details of the optical analysis have been taken into account to model the launched beams. Inductive and advanced reference scenarios have been analysed for physical evaluations in the full poloidal and toroidal steering ranges for two slightly different layouts of the launcher system.

  19. Communications processor for C3 analysis and wargaming

    NASA Astrophysics Data System (ADS)

    Clark, L. N.; Pless, L. D.; Rapp, R. L.

    1982-03-01

    This thesis developed a software capability to allow the investigation of C3 problems, procedures and methodologies. The resultant communications model, while independent of a specific wargame, is currently implemented in conjunction with the McClintic Theater Model. It provides a computerized message handling system (C3 Model) which allows simulation of communication links (circuits) with user-definable delays; garble and loss rates; and multiple circuit types, addresses, and levels of command. It is designed to be used for test and evaluation of command and control problems in the areas of organizational relationships, communication networks and procedures, and combat doctrine or tactics.
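    The circuit behaviour described above (a user-definable delay combined with garble and loss rates) can be sketched in a few lines. The Python fragment below is only an illustration of that idea, not code from the thesis; the message text and circuit parameters are hypothetical:

      import random

      def transmit(message, send_time, delay, garble_rate, loss_rate, rng):
          """Return (delivery_time, text) for a delivered message, or None if lost."""
          if rng.random() < loss_rate:
              return None                                  # message lost on this circuit
          if rng.random() < garble_rate:                   # corrupt one random character
              i = rng.randrange(len(message))
              message = message[:i] + "#" + message[i + 1:]
          return send_time + delay, message

      rng = random.Random(1)
      print(transmit("MOVE TO GRID 1234", send_time=0.0, delay=4.5,
                     garble_rate=0.2, loss_rate=0.05, rng=rng))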

  20. Thermofluid Modeling of Fuel Cells

    NASA Astrophysics Data System (ADS)

    Young, John B.

    2007-01-01

    Fuel cells offer the prospect of silent electrical power generation at high efficiency with near-zero pollutant emission. Many materials and fabrication problems have now been solved and attention has shifted toward system modeling, including the fluid flows that supply the cells with hydrogen and oxygen. This review describes the current thermofluid modeling capabilities for proton exchange membrane fuel cells (PEMFCs) and solid oxide fuel cells (SOFCs), the most promising candidates for commercial exploitation. Topics covered include basic operating principles and stack design, convective-diffusive flow in porous solids, special modeling issues for PEMFCs and SOFCs, and the use of computational fluid dynamics (CFD) methods.

  1. Toward Self-Referential Autonomous Learning of Object and Situation Models.

    PubMed

    Damerow, Florian; Knoblauch, Andreas; Körner, Ursula; Eggert, Julian; Körner, Edgar

    2016-01-01

    Most current approaches to scene understanding lack the capability to adapt object and situation models to behavioral needs not anticipated by the human system designer. Here, we give a detailed description of a system architecture for self-referential autonomous learning which enables the refinement of object and situation models during operation in order to optimize behavior. This includes structural learning of hierarchical models for situations and behaviors that is triggered by a mismatch between expected and actual action outcome. Besides proposing architectural concepts, we also describe a first implementation of our system within a simulated traffic scenario to demonstrate the feasibility of our approach.

  2. Study of optical techniques for the Ames unitary wind tunnels. Part 4: Model deformation

    NASA Technical Reports Server (NTRS)

    Lee, George

    1992-01-01

    A survey of systems capable of model deformation measurements was conducted. The survey included stereo-cameras, scanners, and digitizers. Moire, holographic, and heterodyne interferometry techniques were also examined. Stereo-cameras with passive or active targets are currently being deployed for model deformation measurements at NASA Ames and LaRC, Boeing, and ONERA. Scanners and digitizers are widely used in robotics, motion analysis, medicine, etc., and some of the scanners and digitizers can meet the model deformation requirements. Commercial stereo-cameras, scanners, and digitizers are being improved in accuracy, reliability, and ease of operation. A number of new systems are coming onto the market.

  3. MgB2-based superconductors for fault current limiters

    NASA Astrophysics Data System (ADS)

    Sokolovsky, V.; Prikhna, T.; Meerovich, V.; Eisterer, M.; Goldacker, W.; Kozyrev, A.; Weber, H. W.; Shapovalov, A.; Sverdun, V.; Moshchil, V.

    2017-02-01

    A promising solution to the fault current problem in power systems is the application of fast-operating nonlinear superconducting fault current limiters (SFCLs) with the capability of rapidly increasing their impedance, thus limiting high fault currents. We report the results of experiments with models of inductive (transformer-type) SFCLs based on ring-shaped bulk MgB2 prepared under high quasihydrostatic pressure (2 GPa) and by a hot-pressing technique (30 MPa). It was shown that the SFCLs meet the main requirements for fault current limiters: they possess low impedance in the nominal regime of the protected circuit and can rapidly increase their impedance, limiting both the transient and the steady-state fault currents. The study of the quenching currents of the MgB2 rings (the SFCL activation current) and of AC losses in the rings shows that the quenching current density and the critical current density determined from AC losses can be 10-20 times smaller than the critical current determined from magnetization experiments.

  4. Brugga basin's TACD Model Adaptation to current GIS PCRaster 4.1

    NASA Astrophysics Data System (ADS)

    Lopez Rozo, Nicolas Antonio; Corzo Perez, Gerald Augusto; Santos Granados, Germán Ricardo

    2017-04-01

    The process-oriented catchment model TACD (Tracer-Aided Catchment model - Distributed) was developed for the Brugga Basin (Black Forest, Germany) with a modular structure in the Geographic Information System PCRaster Version 2, in order to dynamically model the natural processes of a complex basin, such as rainfall, air temperature, solar radiation, evapotranspiration and flow routing, among others. Further research and applications have extended the model, for example by adapting it to other meso-scale basins and adding erosion processes to the hydrological model. However, the TACD model is computationally intensive, which makes it inefficient for large, finely discretized river basins. In addition, the current version is not compatible with the latest PCRaster Version 4.1, which offers support for 64-bit hardware architectures, improved hydraulic calculations and map creation, and various error and bug fixes. The current work studied and adapted the TACD model to the latest GIS PCRaster Version 4.1 by editing the original scripts and replacing deprecated functionality without compromising the correctness of the TACD model. The correctness of the adapted model was verified using the original Brugga Basin study case and comparing the adapted model results with the original results obtained by Stefan Roser in 2001. Small differences were found because some hydraulic and hydrological routines have been optimized since Version 2 of GIS PCRaster; the hydraulic and hydrological processes therefore remain well represented. With this new working model, further research and development on current topics such as uncertainty analysis, GCM downscaling techniques and spatio-temporal modelling are encouraged.

  5. Advanced Autonomous Systems for Space Operations

    NASA Astrophysics Data System (ADS)

    Gross, A. R.; Smith, B. D.; Muscettola, N.; Barrett, A.; Mjolssness, E.; Clancy, D. J.

    2002-01-01

    New missions of exploration and space operations will require unprecedented levels of autonomy to successfully accomplish their objectives. Inherently high levels of complexity, cost, and communication distances will preclude the degree of human involvement common to current and previous space flight missions. With exponentially increasing capabilities of computer hardware and software, including networks and communication systems, a new balance of work is being developed between humans and machines. This new balance holds the promise of not only meeting the greatly increased space exploration requirements, but simultaneously dramatically reducing the design, development, test, and operating costs. New information technologies, which take advantage of knowledge-based software, model-based reasoning, and high performance computer systems, will enable the development of a new generation of design and development tools, schedulers, and vehicle and system health management capabilities. Such tools will provide a degree of machine intelligence and associated autonomy that has previously been unavailable. These capabilities are critical to the future of advanced space operations, since the science and operational requirements specified by such missions, as well as the budgetary constraints, will limit the current practice of monitoring and controlling missions by a standing army of ground-based controllers. System autonomy capabilities have made great strides in recent years, for both ground and space flight applications. Autonomous systems have flown on advanced spacecraft, providing new levels of spacecraft capability and mission safety. Such on-board systems operate by utilizing model-based reasoning that provides the capability to work from high-level mission goals, while deriving the detailed system commands internally, rather than having such commands transmitted from Earth. This enables missions of a complexity and communication distance not otherwise possible, as well as many more efficient and low-cost applications. In addition, utilizing component and system modeling and reasoning capabilities, autonomous systems will play an increasing role in ground operations for space missions, where they will both reduce the human workload and provide greater levels of monitoring and system safety. This paper will focus specifically on new and innovative software for remote, autonomous, space systems flight operations. Topics to be presented will include a brief description of key autonomous control concepts, the Remote Agent program that commanded the Deep Space 1 spacecraft to new levels of system autonomy, recent advances in distributed autonomous system capabilities, and concepts for autonomous vehicle health management systems. A brief description of teaming spacecraft and rovers for complex exploration missions will also be provided. New on-board software for autonomous science data acquisition for planetary exploration will be described, as well as advanced systems for safe planetary landings. A new multi-agent architecture that addresses some of the challenges of autonomous systems will be presented. Autonomous operation of ground systems will also be considered, including software for autonomous in-situ propellant production and management, and closed-loop ecological life support systems (CELSS). Finally, plans and directions for the future will be discussed.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clausen, Jonathan R.; Brunini, Victor E.; Moffat, Harry K.

    We develop a capability to simulate reduction-oxidation (redox) flow batteries in the Sierra Multi-Mechanics code base. Specifically, we focus on all-vanadium redox flow batteries; however, the capability is general in implementation and could be adapted to other chemistries. The electrochemical and porous flow models follow those developed in the recent publication by [28]. We review the model implemented in this work and its assumptions, and we show several verification cases, including a binary electrolyte and a battery half-cell. Then, we compare our model implementation with the experimental results shown in [28], with good agreement seen. Next, a sensitivity study is conducted for the major model parameters, which is beneficial in targeting specific features of the redox flow cell for improvement. Lastly, we simulate a three-dimensional version of the flow cell to determine the impact of plenum channels on the performance of the cell. Such channels are frequently seen in experimental designs where the current collector plates are borrowed from fuel cell designs. These designs use a serpentine channel etched into a solid collector plate.

  7. ROMUSE 2.0 User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khuwaileh, Bassam; Turinsky, Paul; Williams, Brian J.

    2016-10-04

    ROMUSE (Reduced Order Modeling Based Uncertainty/Sensitivity Estimator) is an effort within the Consortium for Advanced Simulation of Light water reactors (CASL) to provide an analysis tool to be used in conjunction with reactor core simulators, especially the Virtual Environment for Reactor Applications (VERA). ROMUSE is written in C++ and is currently capable of performing various types of parameter perturbation, uncertainty quantification, surrogate model construction and subspace analysis. Version 2.0 has the capability to interface with DAKOTA, which gives ROMUSE access to the various algorithms implemented within DAKOTA. ROMUSE is mainly designed to interface with VERA and the Comprehensive Modeling and Simulation Suite for Nuclear Safety Analysis and Design (SCALE) [1,2,3]; however, ROMUSE can interface with any general model (e.g. Python and MATLAB) whose Input/Output (I/O) format follows the Hierarchical Data Format 5 (HDF5). In this brief user manual, the use of ROMUSE is overviewed and example problems are presented and briefly discussed. The algorithms provided here range from algorithms inspired by those discussed in Ref. [4] to nuclear-specific algorithms discussed in Ref. [3].

  8. DEF: an automated dead-end filling approach based on quasi-endosymbiosis.

    PubMed

    Liu, Lili; Zhang, Zijun; Sheng, Taotao; Chen, Ming

    2017-02-01

    Gap filling for the reconstruction of metabolic networks aims to restore the connectivity of metabolites by finding high-confidence reactions that may be missing in the target organism. Current gap-filling methods either rely on network topology alone or have limited capability in finding missing reactions that are indirectly related to dead-end metabolites but of biological importance to the target model. We present an automated dead-end filling (DEF) approach, inspired by endosymbiosis theory, which fills gaps by finding the most efficient dead-end utilization paths in a constructed quasi-endosymbiosis model. The recalls of reactions and of dead ends by DEF reach around 73% and 86%, respectively. This method is capable of finding indirectly dead-end-related reactions of biological importance for the target organism and is applicable to any given metabolic model. In the E. coli iJR904 model, for instance, about 42% of the dead-end metabolites were fixed by our proposed method. DEF is publicly available at http://bis.zju.edu.cn/DEF/. Contact: mchen@zju.edu.cn. Supplementary data are available at Bioinformatics online.
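    A small sketch of the dead-end notion used above may help: a dead-end metabolite is one that the model's reactions only produce or only consume. The following Python fragment is an illustration with a hypothetical toy stoichiometric matrix, not part of DEF itself:

      import numpy as np

      metabolites = ["A", "B", "C", "D"]
      # Rows = metabolites, columns = reactions; negative = consumed, positive = produced.
      S = np.array([[-1,  0,  0],    # A: only consumed  -> dead end
                    [ 1, -1,  0],    # B: produced and consumed
                    [ 0,  1, -1],    # C: produced and consumed
                    [ 0,  0,  1]])   # D: only produced  -> dead end

      def dead_ends(S, metabolites, reversible=None):
          reversible = reversible or [False] * S.shape[1]
          dead = []
          for i, m in enumerate(metabolites):
              row = S[i, :]
              produced = any(row[j] > 0 or (row[j] != 0 and reversible[j]) for j in range(len(row)))
              consumed = any(row[j] < 0 or (row[j] != 0 and reversible[j]) for j in range(len(row)))
              if not (produced and consumed):
                  dead.append(m)
          return dead

      print(dead_ends(S, metabolites))   # ['A', 'D']

    An approach such as DEF then searches for candidate reactions that let these flagged metabolites be both produced and consumed.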

  9. High fidelity studies of exploding foil initiator bridges, Part 3: ALEGRA MHD simulations

    NASA Astrophysics Data System (ADS)

    Neal, William; Garasi, Christopher

    2017-01-01

    Simulations of high-voltage detonators, such as Exploding Bridgewire (EBW) and Exploding Foil Initiator (EFI) devices, have historically relied on simple, often empirical, one-dimensional models capable of predicting parameters such as current, voltage, and, in the case of EFIs, flyer velocity. Experimental methods have correspondingly been limited to the same parameters. With the advent of complex, first-principles magnetohydrodynamic codes such as ALEGRA and ALE-MHD, it is now possible to simulate these components in three dimensions and predict a much greater range of parameters than before. A significant improvement in experimental capability was therefore required to ensure these simulations could be adequately verified. In this third paper of a three-part study, the experimental results presented in part 2 are compared against three-dimensional MHD simulations. This improved experimental capability, along with advanced simulations, offers an opportunity to gain a greater understanding of the processes behind the functioning of EBW and EFI detonators.

  10. Advances in computer-aided well-test interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horne, R.N.

    1994-07-01

    Despite the feeling expressed several times over the past 40 years that well-test analysis had reached its peak development, an examination of recent advances shows continuous expansion in capability, with future improvement likely. The expansion in interpretation capability over the past decade arose mainly from the development of computer-aided techniques, which, although introduced 20 years ago, have come into use only recently. The broad application of computer-aided interpretation originated with the improvement of the methodologies and continued with the expansion in computer access and capability that accompanied the explosive development of the microcomputer industry. This paper focuses on the different pieces of the methodology that combine to constitute a computer-aided interpretation and attempts to compare some of the approaches currently used. Future directions of the approach are also discussed. The separate areas discussed are deconvolution, pressure derivatives, model recognition, nonlinear regression, and confidence intervals.

  11. At the Edge of Translation – Materials to Program Cells for Directed Differentiation

    PubMed Central

    Arany, Praveen R; Mooney, David J

    2010-01-01

    The rapid advancement in basic biology knowledge, especially in the stem cell field, has created new opportunities to develop biomaterials capable of orchestrating the behavior of transplanted and host cells. Based on our current understanding of cellular differentiation, a conceptual framework for the use of materials to program cells in situ is presented, namely a domino versus a switchboard model, to highlight the use of single versus multiple cues in a controlled manner to modulate biological processes. Further, specific design principles of material systems to present soluble and insoluble cues that are capable of recruiting, programming and deploying host cells for various applications are presented. The evolution of biomaterials from simple inert substances used to fill defects, to the recent development of sophisticated material systems capable of programming cells in situ is providing a platform to translate our understanding of basic biological mechanisms to clinical care. PMID:20860763

  12. Final report for the endowment of simulator agents with human-like episodic memory LDRD.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Speed, Ann Elizabeth; Lippitt, Carl Edward; Thomas, Edward Victor

    This report documents work undertaken to endow the cognitive framework currently under development at Sandia National Laboratories with a human-like memory for specific life episodes. Capabilities have been demonstrated within the context of three separate problem areas. The first year of the project developed a capability whereby simulated robots were able to utilize a record of shared experience to perform surveillance of a building to detect a source of smoke. The second year focused on simulations of social interactions, providing a queriable record of interactions such that a time series of events could be constructed and reconstructed. The third year addressed tools to promote desktop productivity, creating a capability to query episodic logs in real time, allowing the model of a user to build on itself based on observations of the user's behavior.

  13. The Role 1 capability review: mitigation and innovation for Op HERRICK 18 and into contingency.

    PubMed

    Wheatley, Robert J

    2014-09-01

    The Role 1-orientated JRAMC issue of September 2012 was a welcome addition to the body of Role 1 literature. In particular, the Role 1 capability review by Hodgetts and Findlay detailed both current issues and future aspirations for Role 1 provision. This personal view considers issues still prevalent during Op HERRICK 18, namely the provision of primary healthcare by combat medical technicians on operations and the organisational issues that contribute to historical structural and attitudinal obstructions to the employment of combat medical technicians in firm-base primary healthcare. It also considers a dynamically updating dashboard capable of displaying risk across the Role 1 network, with the implied move to a model of continuous healthcare assurance. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  14. Simulation capability for dynamics of two-body flexible satellites

    NASA Technical Reports Server (NTRS)

    Austin, F.; Zetkov, G.

    1973-01-01

    An analysis and computer program were prepared to realistically simulate the dynamic behavior of a class of satellites consisting of two end bodies separated by a connecting structure. The shape and mass distribution of the flexible end bodies are arbitrary; the connecting structure is flexible but massless and is capable of deployment and retraction. Fluid flowing in a piping system and rigid moving masses, representing a cargo elevator or crew members, have been modeled. Connecting structure characteristics, control systems, and externally applied loads are modeled in easily replaced subroutines. Subroutines currently available include a telescopic beam-type connecting structure as well as attitude, deployment, spin and wobble control. In addition, a unique mass balance control system was developed to sense and balance mass shifts due to the motion of a cargo elevator. The mass of the cargo may vary through a large range. Numerical results are discussed for various types of runs.

  15. Evaluation of the immunogenic capability of the BCG strains BCGΔBCG1419c and BCGΔBCG1416c in a three-dimensional human lung tissue model.

    PubMed

    Parasa, Venkata Ramanarao; Rose, Jeronimo; Castillo-Diaz, Luis Alberto; Aceves-Sánchez, Michel de Jesús; Vega-Domínguez, Perla Jazmín; Lerm, Maria; Flores-Valdez, Mario Alberto

    2018-03-27

    Tuberculosis (TB) remains an unmet global threat. The current vaccine is not fully effective and novel alternatives are needed. Here, two vaccine candidate strains derived from BCG carrying deletions in the BCG1416c or BCG1419c genes were analysed for their capacity to modulate the cytokine/chemokine profile and granuloma formation in a human lung tissue model (LTM). We show that the clustering of monocytes, reminiscent of early granuloma formation, in LTMs infected with BCG strains was similar for all of them. However, BCGΔBCG1419c, like M. tuberculosis, was capable of inducing the production of IL-6, in contrast to the other BCG strains. This work suggests that the LTM could be a useful ex vivo assay to evaluate the potential immunogenicity of novel TB vaccine candidates. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. Artificial Immune System Approaches for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    KrishnaKumar, Kalmanje; Koga, Dennis (Technical Monitor)

    2002-01-01

    Artificial Immune Systems (AIS) combine a priori knowledge with the adaptive capabilities of the biological immune system to provide a powerful alternative to currently available techniques for pattern recognition, modeling, design, and control. Immunology is the science of the built-in defense mechanisms that are present in all living beings to protect against external attacks. A biological immune system can be thought of as a robust, adaptive system that is capable of dealing with an enormous variety of disturbances and uncertainties. Biological immune systems use a finite number of discrete "building blocks" to achieve this adaptiveness. These building blocks can be thought of as pieces of a puzzle which must be put together in a specific way to neutralize, remove, or destroy each unique disturbance the system encounters. In this paper, we outline AIS models that are immediately applicable to aerospace problems and identify application areas that need further investigation.

  17. A graphical language for reliability model generation

    NASA Technical Reports Server (NTRS)

    Howell, Sandra V.; Bavuso, Salvatore J.; Haley, Pamela J.

    1990-01-01

    A graphical interface capability of the hybrid automated reliability predictor (HARP) is described. The graphics-oriented (GO) module provides the user with a graphical language for modeling system failure modes through the selection of various fault tree gates, including sequence dependency gates, or by a Markov chain. With this graphical input language, a fault tree becomes a convenient notation for describing a system. In accounting for any sequence dependencies, HARP converts the fault-tree notation to a complex stochastic process that is reduced to a Markov chain which it can then solve for system reliability. The graphics capability is available for use on an IBM-compatible PC, a Sun, and a VAX workstation. The GO module is written in the C programming language and uses the Graphical Kernel System (GKS) standard for graphics implementation. The PC, VAX, and Sun versions of the HARP GO module are currently in beta-testing.
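    To make the fault-tree-to-Markov-chain step above concrete, the short Python sketch below solves a toy continuous-time Markov chain for system reliability via the matrix exponential. It is only an illustration of the underlying mathematics, not HARP code, and the two-unit redundancy and failure rate are assumptions:

      import numpy as np
      from scipy.linalg import expm

      lam = 1.0e-4                      # per-unit failure rate (1/hour), assumed
      # Generator matrix Q for states [both units up, one unit up, system failed].
      Q = np.array([[-2 * lam,  2 * lam,  0.0],
                    [     0.0,     -lam,  lam],
                    [     0.0,      0.0,  0.0]])

      p0 = np.array([1.0, 0.0, 0.0])    # start with both units operational
      t = 1000.0                        # mission time in hours
      p_t = p0 @ expm(Q * t)            # transient state probabilities at time t
      reliability = 1.0 - p_t[2]        # probability the failed state was not reached
      print(f"R({t:.0f} h) = {reliability:.6f}")

    Sequence-dependent fault trees differ only in the structure of Q; the transient solution step is the same.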

  18. Accurate Modeling of Dark-Field Scattering Spectra of Plasmonic Nanostructures.

    PubMed

    Jiang, Liyong; Yin, Tingting; Dong, Zhaogang; Liao, Mingyi; Tan, Shawn J; Goh, Xiao Ming; Allioux, David; Hu, Hailong; Li, Xiangyin; Yang, Joel K W; Shen, Zexiang

    2015-10-27

    Dark-field microscopy is a widely used tool for measuring the optical resonance of plasmonic nanostructures. However, current numerical simulations of dark-field scattering spectra are carried out with plane-wave illumination, either at normal incidence or at an oblique angle from a single direction. In actual experiments, light is focused onto the sample through an annular ring within a range of glancing angles. In this paper, we present a theoretical model capable of accurately simulating the dark-field light source as an annular ring. Simulations correctly reproduce a counterintuitive blue shift in the scattering spectra from gold nanodisks with diameters beyond 140 nm. We believe that our proposed simulation method can be applied as a general tool for simulating the dark-field scattering spectra of plasmonic nanostructures as well as other dielectric nanostructures with sizes beyond the quasi-static limit.

  19. Anomaly Detection in Power Quality at Data Centers

    NASA Technical Reports Server (NTRS)

    Grichine, Art; Solano, Wanda M.

    2015-01-01

    The goal during my internship at the National Center for Critical Information Processing and Storage (NCCIPS) is to implement an anomaly detection method through the StruxureWare SCADA Power Monitoring system. The benefit of the anomaly detection mechanism is to provide the capability to detect and anticipate equipment degradation by monitoring power quality prior to equipment failure. First, a study is conducted that examines the existing techniques of power quality management. Based on these findings, and the capabilities of the existing SCADA resources, recommendations are presented for implementing effective anomaly detection. Since voltage, current, and total harmonic distortion demonstrate Gaussian distributions, effective set-points are computed using this model, while maintaining a low false positive count.
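    The set-point idea described above can be written down in a few lines. The Python sketch below uses synthetic voltage readings and an assumed 3-sigma band; it illustrates the statistical rule only and is not the StruxureWare/SCADA configuration itself:

      import numpy as np

      rng = np.random.default_rng(0)
      voltage = rng.normal(loc=480.0, scale=1.5, size=10_000)   # simulated RMS voltage samples, V

      mu, sigma = voltage.mean(), voltage.std()
      low, high = mu - 3 * sigma, mu + 3 * sigma                 # anomaly set-points

      new_reading = 474.0
      is_anomaly = not (low <= new_reading <= high)
      print(f"set-points: [{low:.1f}, {high:.1f}] V, reading {new_reading} anomalous: {is_anomaly}")

    The same rule applies to current and total harmonic distortion channels; widening the band lowers the false-positive count at the cost of later detection.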

  20. Space Station Freedom electrical performance model

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Green, Robert D.; Kerslake, Thomas W.; Mckissock, David B.; Trudell, Jeffrey J.

    1993-01-01

    The baseline Space Station Freedom electric power system (EPS) employs photovoltaic (PV) arrays and nickel hydrogen (NiH2) batteries to supply power to housekeeping and user electrical loads via a direct current (dc) distribution system. The EPS was originally designed for an operating life of 30 years through orbital replacement of components. As the design and development of the EPS continues, accurate EPS performance predictions are needed to assess design options, operating scenarios, and resource allocations. To meet these needs, NASA Lewis Research Center (LeRC) has, over a 10 year period, developed SPACE (Station Power Analysis for Capability Evaluation), a computer code designed to predict EPS performance. This paper describes SPACE, its functionality, and its capabilities.

  1. Nonlinear finite element simulation of non-local tension softening for high strength steel material

    NASA Astrophysics Data System (ADS)

    Tong, F. M.

    The capability of current finite element software to simulate the stress-strain relation beyond the elastic-plastic region has been limited by its inability to handle non-positive-definite element stiffness matrices. Although analysis up to the peak stress has proved adequate for analysis and design, it provides no indication of the failure behaviour that follows. Therefore an attempt was made to develop a modelling technique capable of capturing the complete stress-deformation response in an analysis beyond the limit point. The proposed model characterizes a cyclic loading and unloading procedure, as observed in a typical laboratory uniaxial cyclic test, along with a series of material property updates. The Voce equation and a polynomial function were proposed to define the monotonic elastoplastic hardening and softening behaviour, respectively. A modified form of the Voce equation was used to capture the reloading response in the softening region. To accommodate the reduced load capacity of the material at each subsequent softening point, an optimization macro was written to determine the optimum load that the material could withstand. This preliminary study ignored geometrical effects and is thus incapable of capturing the localized necking phenomenon that accompanies many ductile materials; the current softening model is sufficient if a global measure is considered. Several validation cases were performed to investigate the feasibility of the modelling technique and the results proved satisfactory. The ANSYS finite element software is used as the platform on which the modelling technique operates.
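    For reference, the standard Voce hardening law mentioned above has the saturating form sigma(ep) = sigma_s - (sigma_s - sigma_0) * exp(-beta * ep), where ep is the equivalent plastic strain. The short Python sketch below evaluates it with illustrative parameter values, not those calibrated in the study, and the polynomial softening and modified reloading branches are omitted:

      import numpy as np

      def voce_stress(ep, sigma_0=460.0, sigma_s=600.0, beta=25.0):
          """Flow stress (MPa) as a function of equivalent plastic strain ep."""
          return sigma_s - (sigma_s - sigma_0) * np.exp(-beta * ep)

      for ep in (0.0, 0.01, 0.05, 0.2):
          print(f"ep = {ep:5.2f} -> sigma = {voce_stress(ep):6.1f} MPa")

    The flow stress starts at sigma_0 and saturates toward sigma_s, which is why a separate softening function is needed beyond the limit point.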

  2. On the Role of Global Magnetic Field Configuration in Affecting Ring Current Dynamics

    NASA Technical Reports Server (NTRS)

    Zheng, Y.; Zaharia, S. G.; Fok, M. H.

    2010-01-01

    Plasma and field interaction is one important aspect of inner magnetospheric physics. The magnetic field controls particle motion through gradient, curvature and E × B drifts. In this presentation, we show how the global magnetic field affects dynamics of the ring current through simulations of two moderate geomagnetic storms (20 November 2007 and 8-9 March 2008). Preliminary results of coupling the Comprehensive Ring Current Model (CRCM) with a three-dimensional plasma force balance code (to achieve self-consistency in both E and B fields) indicate that inclusion of self-consistency in B tends to mitigate the intensification of the ring current, as other similar coupling efforts have shown. In our approach, self-consistency in the electric field is already an existing capability of the CRCM. The magnetic self-consistency is achieved by computing the three-dimensional magnetic field in force balance with anisotropic ring current ion distributions. We discuss the coupling methodology and its further improvement. In addition, comparative studies using various magnetic field models will be shown. Simulation results will be put into a global context by analyzing the morphology of the ring current, its anisotropy and the characteristics of the interconnected Region 2 field-aligned currents.

  3. JPRS Report Science & Technology, Europe

    DTIC Science & Technology

    1991-10-25

    ...in HPV infection diagnosis, and vaccines and drugs that may be capable of fighting its effects. BIOTEC: If we were to meet again in five years... The aircraft engine business is extremely risky. The manufacturers are dependent on the success or failure of the aircraft model for which they... program currently covers four main research areas: agrobiology (with special emphasis on crop improvement); health (development of new vaccines...

  4. JPRS Report, Science & Technology, China

    DTIC Science & Technology

    1992-06-18

    The key to success of this model is the existence of a very effective basic investment capability, including an educational foundation and an... firm lodgement in international markets. We must use various kinds of effective measures to channel our scientific and technical strength toward the... effectively farmland currently in use. Bioengineering technology should be used to develop new kinds of plants and animals, the report says. China...

  5. Review of methods for developing regional probabilistic risk assessments, part 2: modeling invasive plant, insect, and pathogen species

    Treesearch

    P. B. Woodbury; D. A. Weinstein

    2010-01-01

    We reviewed probabilistic regional risk assessment methodologies to identify the methods that are currently in use and are capable of estimating threats to ecosystems from fire and fuels, invasive species, and their interactions with stressors. In a companion chapter, we highlight methods useful for evaluating risks from fire. In this chapter, we highlight methods...

  6. Monitoring Retroviral RNA Dimerization In Vivo via Hammerhead Ribozyme Cleavage

    PubMed Central

    Pal, Bijay K.; Scherer, Lisa; Zelby, Laurie; Bertrand, Edouard; Rossi, John J.

    1998-01-01

    We have used a strategy for colocalization of Psi (Ψ)-tethered ribozymes and targets to demonstrate that Ψ sequences are capable of specific interaction in the cytoplasm of both packaging and nonpackaging cells. These results indicate that current in vitro dimerization models may have in vivo counterparts. The methodology used may be applied to further genetic analyses on Ψ domain interactions in vivo. PMID:9733882

  7. A COMPARISON OF AIRFLOW PATTERNS FROM THE QUIC MODEL AND AN ATMOSPHERIC WIND TUNNEL FOR A TWO-DIMENSIONAL BUILDING ARRAY AND A MULTI-CITY BLOCK REGION NEAR THE WORLD TRADE CENTER SITE

    EPA Science Inventory

    Dispersion of pollutants in densely populated urban areas is a research area of clear importance. Currently, few numerical tools exist capable of describing airflow and dispersion patterns in these complex regions in a time efficient manner. (QUIC), Quick Urban & Industrial C...

  8. University NanoSat Program: AggieSat3

    DTIC Science & Technology

    2009-06-01

    ...commercially available product for stereo machine vision developed by Point Grey Research. The current binocular BumbleBee2® system incorporates two... and Fellow of the American Society of Mechanical Engineers (ASME) in 1997. She was awarded the 2007 J. Leland "Lee" Atwood Award from the ASEE... AggieSat2 satellite programs. Additional experience gained in the area of drawing standards, machining capabilities, solid modeling, safety...

  9. SHARP Multiphysics Tutorials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Y. Q.; Shemon, E. R.; Mahadevan, Vijay S.

    SHARP, developed under the NEAMS Reactor Product Line, is an advanced modeling and simulation toolkit for the analysis of advanced nuclear reactors. SHARP currently comprises three physics modules: neutronics, thermal hydraulics, and structural mechanics. SHARP empowers designers to produce accurate results when modeling physical phenomena that have been identified as important for nuclear reactor analysis. SHARP can use existing physics codes and take advantage of existing infrastructure capabilities in the MOAB framework and the coupling driver/solver library, the Coupled Physics Environment (CouPE), which utilizes the widely used, scalable PETSc library. This report aims at identifying the coupled-physics simulation capability of SHARP by introducing the demonstration example called sahex in advance of the SHARP release expected by March 2016. sahex consists of 6 fuel pins with cladding, 1 control rod, sodium coolant and an outer duct wall that encloses all the other components. This example is carefully chosen to demonstrate the proof of concept for solving more complex demonstration examples such as an EBR-II assembly and an ABTR full core. The workflow of preparing the input files, running the case and analyzing the results is demonstrated in this report. Moreover, an extension of the sahex model called sahex_core, which adds six homogenized neighboring assemblies to the fully heterogeneous sahex model, is presented to test homogenization capabilities in both Nek5000 and PROTEUS. Some primary information on the configuration and build aspects of the SHARP toolkit, which includes the capability to auto-download dependencies and configure/install with optimal flags in an architecture-aware fashion, is also covered by this report. Step-by-step instructions are provided to help users create their own cases. Details on these processes will be provided in the SHARP user manual that will accompany the first release.

  10. An IBM PC-based math model for space station solar array simulation

    NASA Technical Reports Server (NTRS)

    Emanuel, E. M.

    1986-01-01

    This report discusses and documents the design, development, and verification of a microcomputer-based solar cell math model for simulating the Space Station's solar array Initial Operational Capability (IOC) reference configuration. The array model is developed utilizing a linear solar cell dc math model requiring only five input parameters: short circuit current, open circuit voltage, maximum power voltage, maximum power current, and orbit inclination. The accuracy of this model is investigated using actual solar array on orbit electrical data derived from the Solar Array Flight Experiment/Dynamic Augmentation Experiment (SAFE/DAE), conducted during the STS-41D mission. This simulator provides real-time simulated performance data during the steady state portion of the Space Station orbit (i.e., array fully exposed to sunlight). Eclipse to sunlight transients and shadowing effects are not included in the analysis, but are discussed briefly. Integrating the Solar Array Simulator (SAS) into the Power Management and Distribution (PMAD) subsystem is also discussed.
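    As a rough illustration of what a simple dc cell model driven by the parameters named above might look like, the Python sketch below builds a piecewise-linear I-V characteristic from the four electrical inputs. This is an assumption made for illustration, not the documented SAS equations, and the orbit-inclination dependence is omitted; the numbers are hypothetical single-cell values:

      def cell_current(v, isc=2.65, voc=0.60, vmp=0.50, imp=2.45):
          """Cell current (A) at terminal voltage v (V) for a single cell."""
          if v <= 0.0:
              return isc
          if v < vmp:                   # constant-slope segment from (0, Isc) to (Vmp, Imp)
              return isc + (imp - isc) * v / vmp
          if v < voc:                   # segment from (Vmp, Imp) down to (Voc, 0)
              return imp * (voc - v) / (voc - vmp)
          return 0.0

      for v in (0.0, 0.25, 0.50, 0.55, 0.60):
          print(f"V = {v:.2f} V -> I = {cell_current(v):.2f} A")

    Scaling such a cell characteristic by the series/parallel layout of the array gives the array-level curve used in steady-state performance simulation.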

  11. Nowcasting Ground Magnetic Perturbations with the Space Weather Modeling Framework

    NASA Astrophysics Data System (ADS)

    Welling, D. T.; Toth, G.; Singer, H. J.; Millward, G. H.; Gombosi, T. I.

    2015-12-01

    Predicting ground-based magnetic perturbations is a critical step towards specifying and predicting geomagnetically induced currents (GICs) in high voltage transmission lines. Currently, the Space Weather Modeling Framework (SWMF), a flexible modeling framework for simulating the multi-scale space environment, is being transitioned from research to operational use (R2O) by NOAA's Space Weather Prediction Center. Upon completion of this transition, the SWMF will provide localized dB/dt predictions using real-time solar wind observations from L1 and the F10.7 proxy for EUV as model input. This presentation describes the operational SWMF setup and summarizes the changes made to the code to enable R2O progress. The framework's algorithm for calculating ground-based magnetometer observations will be reviewed. Metrics from data-model comparisons will be reviewed to illustrate predictive capabilities. Early data products, such as regional-K index and grids of virtual magnetometer stations, will be presented. Finally, early successes will be shared, including the code's ability to reproduce the recent March 2015 St. Patrick's Day Storm.
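    For readers unfamiliar with the predicted quantity, the short Python sketch below estimates the horizontal |dB/dt| from a magnetometer time series by finite differences and reports its peak over 20-minute windows. The series is synthetic and the sampling cadence and window length are illustrative assumptions, not the operational settings:

      import numpy as np

      dt = 60.0                                        # sampling interval, seconds
      t = np.arange(0, 3600, dt)
      bx = 50.0 * np.sin(2 * np.pi * t / 1800.0)       # synthetic north component, nT
      by = 20.0 * np.cos(2 * np.pi * t / 900.0)        # synthetic east component, nT

      dbx = np.gradient(bx, dt)                        # nT/s
      dby = np.gradient(by, dt)
      dbdt_h = np.hypot(dbx, dby)                      # horizontal |dB/dt|

      window = int(1200 / dt)                          # 20-minute window
      peak = max(dbdt_h[i:i + window].max() for i in range(0, len(dbdt_h), window))
      print(f"peak |dB/dt| = {peak * 60:.1f} nT/min")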

  12. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool

    PubMed Central

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-01-01

    Background: It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. The systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models.

    Results: This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis, and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, SOBOL's method, and the weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface.

    Conclusion: SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080
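    Of the global methods listed above, the partial rank correlation coefficient is easy to sketch: rank-transform the sampled parameters and the model output, regress the other parameters out of both, and correlate the residuals. The Python fragment below illustrates that calculation only, not SBML-SAT code, and the toy three-parameter "model" is an assumption for demonstration:

      import numpy as np
      from scipy.stats import rankdata

      def prcc(X, y):
          """X: (n_samples, n_params) parameter samples; y: (n_samples,) model outputs."""
          Xr = np.column_stack([rankdata(X[:, j]) for j in range(X.shape[1])])
          yr = rankdata(y)
          out = []
          for j in range(Xr.shape[1]):
              others = np.delete(Xr, j, axis=1)
              A = np.column_stack([np.ones(len(yr)), others])          # regress out the other parameters
              res_x = Xr[:, j] - A @ np.linalg.lstsq(A, Xr[:, j], rcond=None)[0]
              res_y = yr       - A @ np.linalg.lstsq(A, yr,       rcond=None)[0]
              out.append(np.corrcoef(res_x, res_y)[0, 1])
          return np.array(out)

      rng = np.random.default_rng(42)
      X = rng.uniform(0.0, 1.0, size=(500, 3))                          # three sampled parameters
      y = 5.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=500)    # toy model output
      print(np.round(prcc(X, y), 2))   # parameter 0 strongly positive, 1 negative, 2 near zero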

  14. Exploration Medical Capability System Engineering Introduction and Vision

    NASA Technical Reports Server (NTRS)

    Mindock, J.; Reilly, J.

    2017-01-01

    Human exploration missions to beyond low Earth orbit destinations such as Mars will require more autonomous capability compared to current low Earth orbit operations. For the medical system, lack of consumable resupply, evacuation opportunities, and real-time ground support are key drivers toward greater autonomy. Recognition of the limited mission and vehicle resources available to carry out exploration missions motivates the Exploration Medical Capability (ExMC) Element's approach to enabling the necessary autonomy. The Element's work must integrate with the overall exploration mission and vehicle design efforts to successfully provide exploration medical capabilities. ExMC is applying systems engineering principles and practices to accomplish its integrative goals. This talk will briefly introduce the discipline of systems engineering and key points in its application to exploration medical capability development. It will elucidate technical medical system needs to be met by the systems engineering work, and the structured and integrative science and engineering approach to satisfying those needs, including the development of shared mental and qualitative models within and external to the human health and performance community. These efforts are underway to ensure relevancy to exploration system maturation and to establish medical system development that is collaborative with vehicle and mission design and engineering efforts.

  15. Low Cost Sensors-Current Capabilities and Gaps

    EPA Science Inventory

    1. Present the findings from a recent technology review of gas and particulate phase sensors 2. Focus on the lower-cost sensors 3. Discuss current capabilities, estimated range of measurement, selectivity, deployment platforms, response time, and expected range of acceptabl...

  16. Will high-resolution global ocean models benefit coupled predictions on short-range to climate timescales?

    NASA Astrophysics Data System (ADS)

    Hewitt, Helene T.; Bell, Michael J.; Chassignet, Eric P.; Czaja, Arnaud; Ferreira, David; Griffies, Stephen M.; Hyder, Pat; McClean, Julie L.; New, Adrian L.; Roberts, Malcolm J.

    2017-12-01

    As the importance of the ocean in the weather and climate system is increasingly recognised, operational systems are now moving towards coupled prediction not only for seasonal to climate timescales but also for short-range forecasts. A three-way tension exists between the allocation of computing resources to refine model resolution, the expansion of model complexity/capability, and the increase of ensemble size. Here we review evidence for the benefits of increased ocean resolution in global coupled models, where the ocean component explicitly represents transient mesoscale eddies and narrow boundary currents. We consider lessons learned from forced ocean/sea-ice simulations; from studies concerning the SST resolution required to impact atmospheric simulations; and from coupled predictions. Impacts of the mesoscale ocean in western boundary current regions on the large-scale atmospheric state have been identified. Understanding of air-sea feedback in western boundary currents is modifying our view of the dynamics in these key regions. It remains unclear whether variability associated with open ocean mesoscale eddies is equally important to the large-scale atmospheric state. We include a discussion of what processes can presently be parameterised in coupled models with coarse resolution non-eddying ocean models, and where parameterizations may fall short. We discuss the benefits of resolution and identify gaps in the current literature that leave important questions unanswered.

  17. Numerical Device Modeling, Analysis, and Optimization of Extended-SWIR HgCdTe Infrared Detectors

    NASA Astrophysics Data System (ADS)

    Schuster, J.; DeWames, R. E.; DeCuir, E. A.; Bellotti, E.; Dhar, N.; Wijewarnasuriya, P. S.

    2016-09-01

    Imaging in the extended short-wavelength infrared (eSWIR) spectral band (1.7-3.0 μm) for astronomy applications is an area of significant interest. However, these applications require infrared detectors with extremely low dark current (less than 0.01 electrons per pixel per second for certain applications). In these detectors, the sources of dark current that may limit the overall system performance are fundamental and/or defect-related mechanisms. Non-optimized growth or device processing may introduce material point defects within the HgCdTe bandgap, leading to Shockley-Read-Hall-dominated dark current. While realizing contributions to the dark current from only fundamental mechanisms should be the goal for attaining optimal device performance, it may not be readily feasible with current technology and/or resources. In this regard, the U.S. Army Research Laboratory performed physics-based, two- and three-dimensional numerical modeling of HgCdTe photovoltaic infrared detectors designed for operation in the eSWIR spectral band. The underlying impetus for this capability and study originates with a desire to reach fundamental performance limits via intelligent device design.

  18. A Study of Fan Stage/Casing Interaction Models

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Carney, Kelly; Gallardo, Vicente

    2003-01-01

    The purpose of the present study is to investigate the performance of several existing and new blade-case interaction modeling capabilities that are compatible with the large system simulations used to capture structural response during blade-out events. Three contact models are examined for simulating the interactions between a rotor bladed disk and a case: a radial gap element, a linear gap element, and a new element based on a hydrodynamic formulation. The first two models are currently available in commercial finite element codes such as NASTRAN and have been shown to perform adequately for simulating rotor-case interactions. The hydrodynamic model, although not readily available in commercial codes, may prove to be better able to characterize rotor-case interactions.

  19. Anisotropic constitutive modeling for nickel base single crystal superalloys using a crystallographic approach

    NASA Technical Reports Server (NTRS)

    Stouffer, D. C.; Sheh, M. Y.

    1988-01-01

    A micromechanical model based on crystallographic slip theory was formulated for nickel-base single crystal superalloys. The current equations include both drag stress and back stress state variables to model the local inelastic flow. Specially designed experiments have been conducted to evaluate the effect of back stress in single crystals. The results showed that (1) the back stress is orientation dependent; and (2) the back stress state variable in the inelastic flow equation is necessary for predicting anelastic behavior of the material. The model also demonstrated improved fatigue predictive capability. Model predictions and experimental data are presented for single crystal superalloy Rene N4 at 982 C.

  20. Status of the AIAA Modeling and Simulation Format Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2008-01-01

    The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.

  1. Multiscale Modeling in the Clinic: Drug Design and Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clancy, Colleen E.; An, Gary; Cannon, William R.

    A wide range of length and time scales are relevant to pharmacology, especially in drug development, drug design and drug delivery. Therefore, multi-scale computational modeling and simulation methods and paradigms that advance the linkage of phenomena occurring at these multiple scales have become increasingly important. Multi-scale approaches present in silico opportunities to advance laboratory research to bedside clinical applications in pharmaceuticals research. This is achievable through the capability of modeling to reveal phenomena occurring across multiple spatial and temporal scales, which are not otherwise readily accessible to experimentation. The resultant models, when validated, are capable of making testable predictions to guide drug design and delivery. In this review we describe the goals, methods, and opportunities of multi-scale modeling in drug design and development. We demonstrate the impact of multiple scales of modeling in this field. We indicate the common mathematical techniques employed for multi-scale modeling approaches used in pharmacology and present several examples illustrating the current state-of-the-art regarding drug development for: Excitable Systems (Heart); Cancer (Metastasis and Differentiation); Cancer (Angiogenesis and Drug Targeting); Metabolic Disorders; and Inflammation and Sepsis. We conclude with a focus on barriers to successful clinical translation of drug development, drug design and drug delivery multi-scale models.

  2. Underlying Physics of Conductive Polymer Composites and Force Sensing Resistors (FSRs) under Static Loading Conditions

    PubMed Central

    2017-01-01

    Conductive polymer composites are manufactured by randomly dispersing conductive particles along an insulating polymer matrix. Several authors have attempted to model the piezoresistive response of conductive polymer composites. However, all the proposed models rely upon experimental measurements of the electrical resistance at rest state. Similarly, the models available in literature assume a voltage-independent resistance and a stress-independent area for tunneling conduction. With the aim of developing and validating a more comprehensive model, a test bench capable of exerting controlled forces has been developed. Commercially available sensors—which are manufactured from conductive polymer composites—have been tested at different voltages and stresses, and a model has been derived on the basis of equations for the quantum tunneling conduction through thin insulating film layers. The resistance contribution from the contact resistance has been included in the model together with the resistance contribution from the conductive particles. The proposed model embraces a voltage-dependent behavior for the composite resistance, and a stress-dependent behavior for the tunneling conduction area. The proposed model is capable of predicting sensor current based upon information from the sourcing voltage and the applied stress. This study uses a physical (non-phenomenological) approach for all the phenomena discussed here. PMID:28906467
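
    The sketch below illustrates the structure of such a prediction, combining a stress-dependent contact resistance with a voltage-dependent tunneling resistance whose effective area grows with stress; the functional forms and parameter values are illustrative assumptions, not the fitted equations from the study.

      # Illustrative structure only (assumed functional forms, placeholder
      # parameters): predict FSR current from sourcing voltage V and applied
      # stress sigma via contact resistance plus a tunneling resistance.
      import numpy as np

      def fsr_current(V, sigma, Rc0=10e3, k_c=0.8, Rt0=50e3, k_a=0.5, alpha=0.05):
          Rc = Rc0 / (1.0 + k_c * sigma)            # contact resistance falls with stress
          A_rel = 1.0 + k_a * sigma                 # effective tunneling area grows with stress
          Rt = (Rt0 / A_rel) * np.exp(-alpha * V)   # voltage-dependent tunneling term
          return V / (Rc + Rt)

      print(fsr_current(V=5.0, sigma=2.0))          # sigma in normalized stress units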

  3. Detection and Attribution of Regional Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bala, G; Mirin, A

    2007-01-19

    We developed a high resolution global coupled modeling capability to perform breakthrough studies of regional climate change. The atmospheric component in our simulation uses a 1° latitude x 1.25° longitude grid, which is the finest resolution ever used for the NCAR coupled climate model CCSM3. Substantial testing and slight retuning was required to get an acceptable control simulation. The major accomplishment is the validation of this new high resolution configuration of CCSM3. There are major improvements in our simulation of the surface wind stress and sea ice thickness distribution in the Arctic. Surface wind stress and ocean circulation in the Antarctic Circumpolar Current are also improved. Our results demonstrate that the FV version of the CCSM coupled model is a state of the art climate model whose simulation capabilities are in the class of those used for IPCC assessments. We have also provided 1000 years of model data to Scripps Institution of Oceanography to estimate the natural variability of stream flow in California. In the future, our global model simulations will provide boundary data to a high-resolution mesoscale model that will be used at LLNL. The mesoscale model would dynamically downscale the GCM climate to regional scale on climate time scales.

  4. USEEIO: a New and Transparent United States ...

    EPA Pesticide Factsheets

    National-scope environmental life cycle models of goods and services may be used for many purposes, including but not limited to quantifying impacts of production and consumption of nations, assessing organization-wide impacts, identifying purchasing hot spots, analyzing environmental impacts of policies, and performing streamlined life cycle assessment. USEEIO is a new environmentally extended input-output model of the United States fit for such purposes and other sustainable materials management applications. USEEIO melds data on economic transactions between 389 industry sectors with environmental data for these sectors covering land, water, energy and mineral usage and emissions of greenhouse gases, criteria air pollutants, nutrients and toxics, to build a life cycle model of 385 US goods and services. In comparison with existing US input-output models, USEEIO is more current with most data representing year 2013, more extensive in its coverage of resources and emissions, more deliberate and detailed in its interpretation and combination of data sources, and includes formal data quality evaluation and description. USEEIO was assembled with a new Python module called the IO Model Builder capable of assembling and calculating results of user-defined input-output models and exporting the models into LCA software. The model and data quality evaluation capabilities are demonstrated with an analysis of the environmental performance of an average hospital in the US. All USEEIO f
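
    The core calculation behind any such environmentally extended input-output model is the Leontief demand-pull computation sketched below; the matrices and toy numbers are generic illustrations, not the USEEIO data or the IO Model Builder API.

      # Generic environmentally extended input-output calculation (a sketch,
      # not the IO Model Builder API): A = direct requirements, B = satellite
      # matrix (flows per $ output), C = characterization factors, y = demand.
      import numpy as np

      def eeio_impacts(A, B, C, y):
          x = np.linalg.solve(np.eye(A.shape[0]) - A, y)   # total output (Leontief inverse)
          flows = B @ x                                    # emissions / resource use
          return C @ flows                                 # characterized impact results

      # Toy 2-sector example with one elementary flow and one impact category
      A = np.array([[0.1, 0.2], [0.3, 0.1]])
      B = np.array([[0.5, 1.2]])        # kg of flow per dollar of output
      C = np.array([[1.0]])             # impact per kg of flow
      print(eeio_impacts(A, B, C, y=np.array([1.0e6, 0.0])))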

  5. ISSM-SESAW v1.0: mesh-based computation of gravitationally consistent sea-level and geodetic signatures caused by cryosphere and climate driven mass change

    NASA Astrophysics Data System (ADS)

    Adhikari, Surendra; Ivins, Erik R.; Larour, Eric

    2016-03-01

    A classical Green's function approach for computing gravitationally consistent sea-level variations associated with mass redistribution on the earth's surface, employed in contemporary sea-level models, naturally suits spectral methods for numerical evaluation. The capability of these methods to resolve high wave number features such as small glaciers is limited by the need for large numbers of pixels and high-degree (associated Legendre) series truncation. Incorporating a spectral model into (components of) earth system models that generally operate on a mesh system also requires repetitive forward and inverse transforms. In order to overcome these limitations, we present a method that functions efficiently on an unstructured mesh, thus capturing the physics operating at kilometer scale yet capable of simulating geophysical observables that are inherently of global scale with minimal computational cost. The goal of the current version of this model is to provide high-resolution solid-earth, gravitational, sea-level and rotational responses for earth system models operating in the domain of the earth's outer fluid envelope on timescales less than about 1 century when viscous effects can largely be ignored over most of the globe. The model has numerous important geophysical applications. For example, we present time-varying computations of global geodetic and sea-level signatures associated with recent ice-sheet changes that are derived from space gravimetry observations. We also demonstrate the capability of our model to simultaneously resolve kilometer-scale sources of the earth's time-varying surface mass transport, derived from high-resolution modeling of polar ice sheets, and predict the corresponding local and global geodetic signatures.
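
    To indicate the flavor of the mesh-based evaluation, the sketch below sums Green's function contributions from a surface load over mesh vertices and enforces mass conservation with a uniform offset; it is a schematic under simplifying assumptions (no rotation, shoreline migration, or self-consistent load iteration, and a regularized Green's function), not the ISSM-SESAW implementation.

      # Schematic mesh-based Green's function evaluation (not ISSM-SESAW):
      # response at each vertex to a surface load, plus a uniform offset that
      # conserves mass over the mesh area.
      import numpy as np

      def surface_response(load_kg, area_m2, xyz_unit, green_of_angle):
          n = len(load_kg)
          resp = np.zeros(n)
          for i in range(n):
              cosg = np.clip(xyz_unit @ xyz_unit[i], -1.0, 1.0)
              resp[i] = np.sum(green_of_angle(np.arccos(cosg)) * load_kg)
          resp -= np.sum(resp * area_m2) / np.sum(area_m2)   # mass-conserving offset
          return resp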

  6. Model comparisons of the reactive burn model SURF in three ASC codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitley, Von Howard; Stalsberg, Krista Lynn; Reichelt, Benjamin Lee

    A study of the SURF reactive burn model was performed in FLAG, PAGOSA and XRAGE. In this study, three different shock-to-detonation transition experiments were modeled in each code. All three codes produced similar model results for all the experiments modeled and at all resolutions. Buildup-to-detonation time, particle velocities and resolution dependence of the models were notably similar between the codes. Given the current PBX 9502 equations of state and SURF calibrations, each code is equally capable of predicting the correct detonation time and distance when impacted by a 1D impactor at pressures ranging from 10-16 GPa, as long as the resolution of the mesh is not too coarse.

  7. Quantitative, steady-state properties of Catania's computational model of the operant reserve.

    PubMed

    Berg, John P; McDowell, J J

    2011-05-01

    Catania (2005) found that a computational model of the operant reserve (Skinner, 1938) produced realistic behavior in initial, exploratory analyses. Although Catania's operant reserve computational model demonstrated potential to simulate varied behavioral phenomena, the model was not systematically tested. The current project replicated and extended the Catania model, clarified its capabilities through systematic testing, and determined the extent to which it produces behavior corresponding to matching theory. Significant departures from both classic and modern matching theory were found in behavior generated by the model across all conditions. The results suggest that a simple, dynamic operant model of the reflex reserve does not simulate realistic steady state behavior. Copyright © 2011 Elsevier B.V. All rights reserved.
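
    For reference, the benchmarks against which such simulated behavior is judged are the classic (strict) and generalized matching relations, written below in standard notation (B_i are response rates, R_i reinforcement rates, a the sensitivity parameter, b the bias parameter).

      % Strict matching and the generalized matching law
      \frac{B_1}{B_2} = \frac{R_1}{R_2},
      \qquad
      \log\!\left(\frac{B_1}{B_2}\right) = a\,\log\!\left(\frac{R_1}{R_2}\right) + \log b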

  8. Intelligent Hardware-Enabled Sensor and Software Safety and Health Management for Autonomous UAS

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Y.; Schumann, Johann; Ippolito, Corey

    2015-01-01

    Unmanned Aerial Systems (UAS) can only be deployed if they can effectively complete their mission and respond to failures and uncertain environmental conditions while maintaining safety with respect to other aircraft as well as humans and property on the ground. We propose to design a real-time, onboard system health management (SHM) capability to continuously monitor essential system components such as sensors, software, and hardware systems for detection and diagnosis of failures and violations of safety or performance rules during the flight of a UAS. Our approach to SHM is three-pronged, providing: (1) real-time monitoring of sensor and software signals; (2) signal analysis, preprocessing, and advanced on-the-fly temporal and Bayesian probabilistic fault diagnosis; (3) an unobtrusive, lightweight, read-only, low-power hardware realization using Field Programmable Gate Arrays (FPGAs) in order to avoid overburdening limited computing resources or costly re-certification of flight software due to instrumentation. No currently available SHM capabilities (or combinations of currently existing SHM capabilities) come anywhere close to satisfying these three criteria, yet NASA will require such intelligent, hardware-enabled sensor and software safety and health management for introducing autonomous UAS into the National Airspace System (NAS). We propose a novel approach of creating modular building blocks for combining responsive runtime monitoring of temporal logic system safety requirements with model-based diagnosis and Bayesian network-based probabilistic analysis. Our proposed research program includes both developing this novel approach and demonstrating its capabilities using the NASA Swift UAS as a demonstration platform.
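
    To make the runtime-monitoring idea concrete, the toy sketch below checks a bounded-response safety rule of the form "whenever a trigger condition holds, a response must follow within N samples"; it is a software illustration of the monitoring concept, not the flight-qualified FPGA design or its specification language.

      # Toy bounded-response runtime monitor (illustration only): report a
      # violation if any trigger goes unanswered for more than n_steps samples.
      class BoundedResponseMonitor:
          def __init__(self, n_steps):
              self.n_steps = n_steps
              self.pending = []              # sample indices of unanswered triggers

          def step(self, t, trigger, response):
              if response:
                  self.pending.clear()
              if trigger:
                  self.pending.append(t)
              # safe unless the oldest unanswered trigger has passed its deadline
              return not (self.pending and t - self.pending[0] > self.n_steps)

      mon = BoundedResponseMonitor(n_steps=3)
      stream = [(0, True, False), (1, False, False), (2, False, False),
                (3, False, False), (4, False, False)]
      print([mon.step(t, trig, resp) for t, trig, resp in stream])   # ends with False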

  9. Maximizing MST's inductive capability with a Bp programmable power supply

    NASA Astrophysics Data System (ADS)

    Chapman, B. E.; Holly, D. J.; Jacobson, C. M.; McCollam, K. J.; Morin, J. C.; Sarff, J. S.; Squitieri, A.

    2016-10-01

    A major goal of the MST program is the advancement of inductive control for the development of both the RFP's fusion potential and, synergistically, the predictive capability of fusion science. This entails programmable power supplies (PPS's) for the Bt and Bp circuits. A Bt PPS is already in place, allowing advanced RFP operation and the production of tokamak plasmas, and a Bp PPS prototype is under construction. To explore some of the new capabilities to be provided by the Bp PPS, the existing Bt PPS has been temporarily connected to the Bp circuit. One key result is new-found access to very low Ip (20 kA) and very low Lundquist number, S (~10^4). At this low S, simulation of RFP plasmas with the MHD code NIMROD is readily achievable, and work toward validation of extended MHD models using NIMROD is underway with direct comparisons to these MST plasmas. The full Bp PPS will also provide higher Ip and S than presently possible, allowing MST to produce plasmas with S spanning as much as five orders of magnitude, a dramatic extension of MST's capability. In these initial tests, the PPS has also increased MST's Ip flattop duration five-fold, to about 100 ms. This, coupled with the recently demonstrated PPS ability to drive large-amplitude sinusoidal oscillations in Ip, will allow tests of extended-duration oscillating field current drive, the goal of which is ac sustainment of a quasi-dc plasma current. Work supported by US DOE.

  10. A portable monitor system for biology signal based on singlechip

    NASA Astrophysics Data System (ADS)

    Tu, Qiaoling; Guo, Jianhua; He, Li; Xu, Xia

    2005-12-01

    The objectives of this work are to improve the accuracy of the electrocardiogram and temperature signals, improve system stability and dynamic response, and decrease the power consumption and volume of the system. The approach makes use of the internal resources of the single-chip microcontroller, such as the precision constant-current source, hardware multiplier, and ADC. The microcontroller is a TI (Texas Instruments) MSP430F449. A simple integral-coefficient band-rejection digital filter was designed to filter the electrocardiogram signal. Temperature error caused by the decline of the battery voltage is compensated. An automatic discharge path was added to the circuit to improve its dynamic response. The results indicate that 50 Hz power-line interference and baseline drift are filtered out, the recorded waveform is clear, the temperature accuracy is 0.03 °C, and the supply current is less than 1.3 mA. The system meets the requirements of ward and surgical monitoring.
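
    As a concrete illustration of an integral-coefficient band-rejection stage (not necessarily the authors' design), the sketch below exploits a 200 Hz sampling rate so that H(z) = (1 + z^-2)/2 places a transfer-function zero exactly at 50 Hz.

      # Hedged sketch: integer-coefficient 50 Hz rejection at fs = 200 Hz.
      # y[n] = (x[n] + x[n-2]) / 2 has a zero at fs/4 = 50 Hz.
      import numpy as np

      def notch_50hz(x):
          y = np.array(x, dtype=float)
          y[2:] = 0.5 * (x[2:] + x[:-2])
          return y

      fs = 200.0
      t = np.arange(0, 1, 1 / fs)
      signal = np.sin(2 * np.pi * 1.0 * t)                   # slow ECG-like component
      x = signal + 0.5 * np.sin(2 * np.pi * 50.0 * t)        # 50 Hz mains interference
      print(np.max(np.abs(notch_50hz(x)[2:] - signal[2:])))  # small residual after startup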

  11. Small scale currents and ocean wave heights: from today's models to future satellite observations with CFOSAT and SKIM

    NASA Astrophysics Data System (ADS)

    Ardhuin, Fabrice; Gille, Sarah; Menemenlis, Dimitris; Rocha, Cesar; Rascle, Nicolas; Gula, Jonathan; Chapron, Bertrand

    2017-04-01

    Tidal currents and large oceanic currents, such as the Agulhas, Gulf Stream and Kuroshio, are known to modify ocean wave properties, causing extreme sea states that are a hazard to navigation. Recent advances in the understanding and modeling capability of ocean currents at scales of 10 km or less have revealed the ubiquitous presence of fronts and filaments. Based on realistic numerical models, we show that these structures can be the main source of variability in significant wave heights at scales less than 200 km, including important variations at 10 km. This current-induced variability creates gradients in wave heights that were previously overlooked and are relevant for extreme wave heights and remote sensing. The spectrum of significant wave heights is found to be of the order of 70⟨Hs⟩²/(g²⟨Tm0,-1⟩²) times the current spectrum, where ⟨Hs⟩ is the spatially-averaged significant wave height, ⟨Tm0,-1⟩ is the average energy period, and g is the gravity acceleration. This small-scale variability is consistent with Jason-3 and SARAL along-track variability. We will discuss how future satellite missions with wave spectrometers can help observe these wave-current interactions. CFOSAT is due for launch in 2018, and SKIM is a proposal for ESA Earth Explorer 9.
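
    Written out as an equation (with E_{H_s} and E_U denoting the wavenumber spectra of significant wave height and of the surface current, a notation assumed here), the abstract's scaling reads:

      E_{H_s}(k) \;\simeq\; \frac{70\,\langle H_s\rangle^{2}}{g^{2}\,\langle T_{m0,-1}\rangle^{2}}\; E_{U}(k)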

  12. Extending the Lunar Mapping and Modeling Portal - New Capabilities and New Worlds

    NASA Astrophysics Data System (ADS)

    Day, Brian; Law, Emily

    2015-11-01

    NASA’s Lunar Mapping and Modeling Portal (LMMP) provides a web-based Portal and a suite of interactive visualization and analysis tools to enable mission planners, lunar scientists, and engineers to access mapped lunar data products from past and current lunar missions (http://lmmp.nasa.gov). During the past year, the capabilities and data served by LMMP have been significantly expanded. New interfaces are providing improved ways to access and visualize data. Many of the recent enhancements to LMMP have been specifically in response to the requirements of NASA's proposed Resource Prospector lunar rover, and as such, provide an excellent example of the application of LMMP to mission planning. At the request of NASA’s Science Mission Directorate, LMMP’s technology and capabilities are now being extended to additional planetary bodies. New portals for Vesta and Mars are the first of these new products to be released. On March 31, 2015, the LMMP team released Vesta Trek (http://vestatrek.jpl.nasa.gov), a web-based application applying LMMP technology to visualize the asteroid Vesta. Data gathered from multiple instruments aboard Dawn have been compiled into Vesta Trek’s user-friendly set of tools, enabling users to study the asteroid’s features. Released on July 1, 2015, Mars Trek replicates the functionality of Vesta Trek for the surface of Mars. While the entire surface of Mars is covered, higher levels of resolution and greater numbers of data products are provided for special areas of interest. Early releases focus on past, current, and future robotic sites of operation. Future releases will add many new data products and analysis tools as Mars Trek has been selected for use in site selection for the Mars 2020 rover and in identifying potential human landing sites on Mars. Other destinations will follow soon. The Solar System Exploration Research Virtual Institute, which manages the project, invites the user community to provide suggestions and requests as the development team continues to expand the capabilities of these portals. This presentation will provide an overview of all three portals, demonstrate their uses and capabilities, highlight new features, and preview coming enhancements.

  13. Extending the Lunar Mapping and Modeling Portal - New Capabilities and New Worlds

    NASA Astrophysics Data System (ADS)

    Day, B. H.; Law, E.; Arevalo, E.; Bui, B.; Chang, G.; Dodge, K.; Kim, R. M.; Malhotra, S.; Sadaqathullah, S.

    2015-12-01

    NASA's Lunar Mapping and Modeling Portal (LMMP) provides a web-based Portal and a suite of interactive visualization and analysis tools to enable mission planners, lunar scientists, and engineers to access mapped lunar data products from past and current lunar missions (http://lmmp.nasa.gov). During the past year, the capabilities and data served by LMMP have been significantly expanded. New interfaces are providing improved ways to access and visualize data. Many of the recent enhancements to LMMP have been specifically in response to the requirements of NASA's proposed Resource Prospector lunar rover, and as such, provide an excellent example of the application of LMMP to mission planning. At the request of NASA's Science Mission Directorate, LMMP's technology and capabilities are now being extended to additional planetary bodies. New portals for Vesta and Mars are the first of these new products to be released. On March 31, 2015, the LMMP team released Vesta Trek (http://vestatrek.jpl.nasa.gov), a web-based application applying LMMP technology to visualizations of the asteroid Vesta. Data gathered from multiple instruments aboard Dawn have been compiled into Vesta Trek's user-friendly set of tools, enabling users to study the asteroid's features. With an initial release on July 1, 2015, Mars Trek replicates the functionality of Vesta Trek for the surface of Mars. While the entire surface of Mars is covered, higher levels of resolution and greater numbers of data products are provided for special areas of interest. Early releases focus on past, current, and future robotic sites of operation. Future releases will add many new data products and analysis tools as Mars Trek has been selected for use in site selection for the Mars 2020 rover and in identifying potential human landing sites on Mars. Other destinations will follow soon. The user community is invited to provide suggestions and requests as the development team continues to expand the capabilities of LMMP, its related products, and the range of data and tools that they provide. This presentation will provide an overview of LMMP, Vesta Trek, and Mars Trek, demonstrate their uses and capabilities, highlight new features, and preview coming enhancements.

  14. Chemical vapor deposition modeling: An assessment of current status

    NASA Technical Reports Server (NTRS)

    Gokoglu, Suleyman A.

    1991-01-01

    The shortcomings of earlier approaches that assumed thermochemical equilibrium and used chemical vapor deposition (CVD) phase diagrams are pointed out. Significant advancements in predictive capabilities due to recent computational developments, especially those for deposition rates controlled by gas phase mass transport, are demonstrated. The importance of using the proper boundary conditions is stressed, and the availability and reliability of gas phase and surface chemical kinetic information are emphasized as the most limiting factors. Future directions for CVD are proposed on the basis of current needs for efficient and effective progress in CVD process design and optimization.

  15. CFD Methods and Tools for Multi-Element Airfoil Analysis

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.; George, Michael W. (Technical Monitor)

    1995-01-01

    This lecture will discuss the computational tools currently available for high-lift multi-element airfoil analysis. It will present an overview of a number of different numerical approaches, their current capabilities, shortcomings, and computational costs. The lecture will be limited to viscous methods, including inviscid/boundary layer coupling methods, and incompressible and compressible Reynolds-averaged Navier-Stokes methods. Both structured and unstructured grid generation approaches will be presented. Two different structured grid procedures are outlined: one uses multi-block patched grids, the other overset chimera grids. Turbulence and transition modeling will be discussed.

  16. Sensitivity Study of the Wall Interference Correction System (WICS) for Rectangular Tunnels

    NASA Technical Reports Server (NTRS)

    Walker, Eric L.; Everhart, Joel L.; Iyer, Venkit

    2001-01-01

    An off-line version of the Wall Interference Correction System (WICS) has been implemented for the NASA Langley National Transonic Facility. The capability is currently restricted to solid-wall interference corrections in the model pitch plane for Mach numbers less than 0.45 due to a limitation in tunnel calibration data. A study to assess output sensitivity to measurement uncertainty was conducted to determine standard operational procedures and guidelines to ensure data quality during the testing process. Changes to the current facility setup and design recommendations for installing the WICS code into a new facility are reported.

  17. OceanNOMADS: Real-time and retrospective access to operational U.S. ocean prediction products

    NASA Astrophysics Data System (ADS)

    Harding, J. M.; Cross, S. L.; Bub, F.; Ji, M.

    2011-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Operational Model Archive and Distribution System (NOMADS) provides both real-time and archived atmospheric model output from servers at the National Centers for Environmental Prediction (NCEP) and National Climatic Data Center (NCDC) respectively (http://nomads.ncep.noaa.gov/txt_descriptions/marRutledge-1.pdf). The NOAA National Ocean Data Center (NODC) with NCEP is developing a complementary capability called OceanNOMADS for operational ocean prediction models. An NCEP ftp server currently provides real-time ocean forecast output (http://www.opc.ncep.noaa.gov/newNCOM/NCOM_currents.shtml) with retrospective access through NODC. A joint effort between the Northern Gulf Institute (NGI; a NOAA Cooperative Institute) and the NOAA National Coastal Data Development Center (NCDDC; a division of NODC) created the developmental version of the retrospective OceanNOMADS capability (http://www.northerngulfinstitute.org/edac/ocean_nomads.php) under the NGI Ecosystem Data Assembly Center (EDAC) project (http://www.northerngulfinstitute.org/edac/). Complementary funding support for the developmental OceanNOMADS from the U.S. Integrated Ocean Observing System (IOOS) through the Southeastern University Research Association (SURA) Model Testbed (http://testbed.sura.org/) this past year provided NODC the analogue that facilitated the creation of an NCDDC production version of OceanNOMADS (http://www.ncddc.noaa.gov/ocean-nomads/). Access tool development and storage of initial archival data sets occur on the NGI/NCDDC developmental servers with transition to NODC/NCDDC production servers as the model archives mature and operational space and distribution capability grow. Navy operational global ocean forecast subsets for U.S. waters comprise the initial ocean prediction fields resident on the NCDDC production server. The NGI/NCDDC developmental server currently includes the Naval Research Laboratory Intra-Americas Sea Nowcast/Forecast System over the Gulf of Mexico from 2004 to March 2011, the operational Naval Oceanographic Office (NAVOCEANO) regional USEast ocean nowcast/forecast system from early 2009 to present, and the NAVOCEANO operational regional AMSEAS (Gulf of Mexico/Caribbean) ocean nowcast/forecast system from its inception on 25 June 2010 to present. AMSEAS provided one of the real-time ocean forecast products accessed by NOAA's Office of Response and Restoration from the NGI/NCDDC developmental OceanNOMADS during the Deepwater Horizon oil spill last year. The developmental server also includes archived, real-time Navy coastal forecast products off coastal Japan in support of U.S./Japanese joint efforts following the 2011 tsunami. Real-time NAVOCEANO output from regional prediction systems off Southern California and around Hawaii, currently available on the NCEP ftp server, is scheduled for archival on the developmental OceanNOMADS by late 2011 along with the next generation Navy/NOAA global ocean prediction output. Accession and archival of additional regions are planned as server capacities increase.

  18. Biophysically based mathematical modeling of interstitial cells of Cajal slow wave activity generated from a discrete unitary potential basis.

    PubMed

    Faville, R A; Pullan, A J; Sanders, K M; Koh, S D; Lloyd, C M; Smith, N P

    2009-06-17

    Spontaneously rhythmic pacemaker activity produced by interstitial cells of Cajal (ICC) is the result of the entrainment of unitary potential depolarizations generated at intracellular sites termed pacemaker units. In this study, we present a mathematical modeling framework that quantitatively represents the transmembrane ion flows and intracellular Ca2+ dynamics from a single ICC operating over the physiological membrane potential range. The mathematical model presented here extends our recently developed biophysically based pacemaker unit modeling framework by including mechanisms necessary for coordinating unitary potential events, such as a T-Type Ca2+ current, Vm-dependent K+ currents, and global Ca2+ diffusion. Model simulations produce spontaneously rhythmic slow wave depolarizations with an amplitude of 65 mV at a frequency of 17.4 cpm. Our model predicts that activity at the spatial scale of the pacemaker unit is fundamental for ICC slow wave generation, and Ca2+ influx from activation of the T-Type Ca2+ current is required for unitary potential entrainment. These results suggest that intracellular Ca2+ levels, particularly in the region local to the mitochondria and endoplasmic reticulum, significantly influence pacing frequency and synchronization of pacemaker unit discharge. Moreover, numerical investigations show that our ICC model is capable of qualitatively replicating a wide range of experimental observations.
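
    For readers unfamiliar with the formulation style, the fragment below shows a generic Hodgkin-Huxley-type T-type Ca2+ current with activation and inactivation gates relaxing toward voltage-dependent steady states; the gating functions and parameter values are illustrative placeholders, not the published model's equations.

      # Generic HH-style T-type Ca2+ current sketch (placeholder parameters,
      # not the published ICC model): d activates, f inactivates.
      import numpy as np

      def i_cat(V, d, f, g_cat=0.1, E_ca=120.0):
          return g_cat * d * f * (V - E_ca)

      def gate_step(x, x_inf, tau, dt):
          return x + dt * (x_inf - x) / tau            # relax toward steady state

      V, d, f, dt = -50.0, 0.01, 0.9, 0.1              # mV, gate values, ms
      d = gate_step(d, 1 / (1 + np.exp(-(V + 45) / 5)), tau=2.0, dt=dt)
      f = gate_step(f, 1 / (1 + np.exp((V + 60) / 6)), tau=20.0, dt=dt)
      print(i_cat(V, d, f))                            # inward (negative) current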

  19. The impact of accelerating faster than exponential population growth on genetic variation.

    PubMed

    Reppell, Mark; Boehnke, Michael; Zöllner, Sebastian

    2014-03-01

    Current human sequencing projects observe an abundance of extremely rare genetic variation, suggesting recent acceleration of population growth. To better understand the impact of such accelerating growth on the quantity and nature of genetic variation, we present a new class of models capable of incorporating faster than exponential growth in a coalescent framework. Our work shows that such accelerated growth affects only the population size in the recent past and thus large samples are required to detect the models' effects on patterns of variation. When we compare models with fixed initial growth rate, models with accelerating growth achieve very large current population sizes and large samples from these populations contain more variation than samples from populations with constant growth. This increase is driven almost entirely by an increase in singleton variation. Moreover, linkage disequilibrium decays faster in populations with accelerating growth. When we instead condition on current population size, models with accelerating growth result in less overall variation and slower linkage disequilibrium decay compared to models with exponential growth. We also find that pairwise linkage disequilibrium of very rare variants contains information about growth rates in the recent past. Finally, we demonstrate that models of accelerating growth may substantially change estimates of present-day effective population sizes and growth times.
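
    One simple way to parameterize faster-than-exponential growth (an illustrative family, not necessarily the paper's coalescent parameterization) is to let the growth rate itself increase with time, as in the sketch below.

      # Illustrative growth curves: a = 0 gives constant-rate exponential
      # growth; a > 0 makes the growth rate increase with time, so most of the
      # size change is concentrated in the recent past.
      import numpy as np

      def pop_size(t, n0=1.0e4, r0=0.01, a=0.0):
          """N(t) = N0 * exp(r0*t + a*t**2/2), t in generations since growth onset."""
          return n0 * np.exp(r0 * t + 0.5 * a * t ** 2)

      t = np.array([50.0, 100.0, 200.0])
      print(pop_size(t))            # exponential growth
      print(pop_size(t, a=5e-4))    # accelerating, faster-than-exponential growth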

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P E; Harris, D; Myers, S

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design and utilization, in full 3D finite difference modeling, and in statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Test Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications.
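
    To illustrate the Karhunen-Loeve idea used for generating geological heterogeneity, the sketch below draws a correlated 1-D velocity-perturbation realization from an exponential covariance via its eigen-decomposition; the correlation length, variance, and mode count are placeholders rather than the project's calibrated statistics.

      # Hedged K-L sketch: sample a correlated velocity perturbation field
      # (placeholder statistics) that could perturb an E3D-style velocity model.
      import numpy as np

      def kl_realization(x, corr_len=100.0, sigma=0.05, n_modes=20, seed=0):
          cov = sigma ** 2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
          lam, phi = np.linalg.eigh(cov)
          lam, phi = lam[::-1], phi[:, ::-1]          # strongest modes first
          xi = np.random.default_rng(seed).standard_normal(n_modes)
          return phi[:, :n_modes] @ (np.sqrt(np.maximum(lam[:n_modes], 0.0)) * xi)

      x = np.linspace(0.0, 2000.0, 201)               # positions in meters
      dv_over_v = kl_realization(x)                   # fractional velocity perturbation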
