Capability Maturity Model (CMM) for Software Process Improvements
NASA Technical Reports Server (NTRS)
Ling, Robert Y.
2000-01-01
This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.
On the Conditioning of Machine-Learning-Assisted Turbulence Modeling
NASA Astrophysics Data System (ADS)
Wu, Jinlong; Sun, Rui; Wang, Qiqi; Xiao, Heng
2017-11-01
Recently, several researchers have demonstrated that machine learning techniques can be used to improve the RANS-modeled Reynolds stress by training on available databases of high-fidelity simulations. However, obtaining an improved mean velocity field remains an unsolved challenge, restricting the predictive capability of current machine-learning-assisted turbulence modeling approaches. In this work we define a condition number to evaluate the model conditioning of data-driven turbulence modeling approaches, and propose a stability-oriented machine learning framework to model Reynolds stress. Two canonical flows, the flow in a square duct and the flow over periodic hills, are investigated to demonstrate the predictive capability of the proposed framework. The satisfactory prediction performance of the mean velocity field for both flows demonstrates the predictive capability of the proposed framework for machine-learning-assisted turbulence modeling. By demonstrating improved prediction of the mean flow field, the proposed stability-oriented machine learning framework bridges the gap between existing machine-learning-assisted turbulence modeling approaches and the predictive capability demanded of turbulence models in real applications.
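As a rough illustration of the conditioning idea described above, the sketch below estimates how strongly perturbations in a modeled Reynolds stress field are amplified in the propagated mean velocity. This is not the authors' implementation; the linearized "propagation" operator and all numbers are assumptions chosen only to show the metric.

```python
import numpy as np

def estimate_condition_number(propagate, tau, eps=1e-3, n_trials=20, seed=0):
    """Estimate a conditioning metric for a data-driven turbulence model: the ratio
    of the relative change in the propagated mean velocity to the relative size of
    a random perturbation applied to the Reynolds stress field."""
    rng = np.random.default_rng(seed)
    u_ref = propagate(tau)
    worst = 0.0
    for _ in range(n_trials):
        d_tau = eps * np.linalg.norm(tau) * rng.standard_normal(tau.shape)
        u_pert = propagate(tau + d_tau)
        rel_out = np.linalg.norm(u_pert - u_ref) / np.linalg.norm(u_ref)
        rel_in = np.linalg.norm(d_tau) / np.linalg.norm(tau)
        worst = max(worst, rel_out / rel_in)
    return worst

# Toy stand-in for the mean-flow propagation step: U = A^-1 tau with a poorly
# conditioned linear operator A (purely illustrative).
A = np.diag(np.linspace(1.0, 1e-3, 50))
propagate = lambda tau: np.linalg.solve(A, tau)
print(f"estimated condition number: {estimate_condition_number(propagate, np.ones(50)):.1f}")
```

A large value indicates an ill-conditioned model, in which small errors in the learned Reynolds stress produce large mean-velocity errors; this is the situation the stability-oriented framework is designed to avoid.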
NASA Astrophysics Data System (ADS)
Chen, Y.; Li, J.; Xu, H.
2015-10-01
Physically based distributed hydrological models discretize the terrain of the whole catchment into a number of grid cells at fine resolution, assimilate different terrain data and precipitation to different cells, and are regarded to have the potential to improve the simulation and prediction of catchment hydrological processes. In the early stage, physically based distributed hydrological models were assumed to derive model parameters from the terrain properties directly, so there would be no need to calibrate model parameters; unfortunately, the uncertainties associated with this parameter derivation are very high, which has impacted their application in flood forecasting, so parameter optimization may also be necessary. There are two main purposes for this study: the first is to propose a parameter optimization method for physically based distributed hydrological models in catchment flood forecasting by using the PSO algorithm, to test its competence and to improve its performance; the second is to explore the possibility of improving the capability of physically based distributed hydrological models in catchment flood forecasting by parameter optimization. In this paper, based on the scalar concept, a general framework for parameter optimization of the PBDHMs for catchment flood forecasting is first proposed that could be used for all PBDHMs. Then, with the Liuxihe model as the study model, which is a physically based distributed hydrological model proposed for catchment flood forecasting, an improved Particle Swarm Optimization (PSO) algorithm is developed for the parameter optimization of the Liuxihe model in catchment flood forecasting; the improvements include adopting the linearly decreasing inertia weight strategy to change the inertia weight and the arccosine function strategy to adjust the acceleration coefficients. This method has been tested in two catchments of different sizes in southern China, and the results show that the improved PSO algorithm can be used for Liuxihe model parameter optimization effectively and can largely improve the model capability in catchment flood forecasting, thus proving that parameter optimization is necessary to improve the flood forecasting capability of physically based distributed hydrological models. It has also been found that the appropriate particle number and maximum evolution number of the PSO algorithm used for Liuxihe model catchment flood forecasting are 20 and 30, respectively.
NASA Astrophysics Data System (ADS)
Chen, Y.; Li, J.; Xu, H.
2016-01-01
Physically based distributed hydrological models (hereafter referred to as PBDHMs) divide the terrain of the whole catchment into a number of grid cells at fine resolution and assimilate different terrain data and precipitation to different cells. They are regarded to have the potential to improve the catchment hydrological process simulation and prediction capability. In the early stage, physically based distributed hydrological models are assumed to derive model parameters from the terrain properties directly, so there is no need to calibrate model parameters. However, unfortunately the uncertainties associated with this model derivation are very high, which impacted their application in flood forecasting, so parameter optimization may also be necessary. There are two main purposes for this study: the first is to propose a parameter optimization method for physically based distributed hydrological models in catchment flood forecasting by using particle swarm optimization (PSO) algorithm and to test its competence and to improve its performances; the second is to explore the possibility of improving physically based distributed hydrological model capability in catchment flood forecasting by parameter optimization. In this paper, based on the scalar concept, a general framework for parameter optimization of the PBDHMs for catchment flood forecasting is first proposed that could be used for all PBDHMs. Then, with the Liuxihe model as the study model, which is a physically based distributed hydrological model proposed for catchment flood forecasting, the improved PSO algorithm is developed for the parameter optimization of the Liuxihe model in catchment flood forecasting. The improvements include adoption of the linearly decreasing inertia weight strategy to change the inertia weight and the arccosine function strategy to adjust the acceleration coefficients. This method has been tested in two catchments in southern China with different sizes, and the results show that the improved PSO algorithm could be used for the Liuxihe model parameter optimization effectively and could improve the model capability largely in catchment flood forecasting, thus proving that parameter optimization is necessary to improve the flood forecasting capability of physically based distributed hydrological models. It also has been found that the appropriate particle number and the maximum evolution number of PSO algorithm used for the Liuxihe model catchment flood forecasting are 20 and 30 respectively.
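The two algorithmic improvements named in this abstract, a linearly decreasing inertia weight and arccosine-scheduled acceleration coefficients, can be sketched in a generic PSO loop as below. This is illustrative only; the constants, the exact arccosine schedule, and the toy objective are assumptions, not the Liuxihe calibration code.

```python
import numpy as np

def improved_pso(objective, bounds, n_particles=20, n_iter=30,
                 w_max=0.9, w_min=0.4, c_start=2.5, c_end=0.5, seed=0):
    """Generic PSO with (i) a linearly decreasing inertia weight and
    (ii) arccosine-shaped acceleration coefficients; constants are illustrative."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()

    for t in range(n_iter):
        frac = t / max(n_iter - 1, 1)
        w = w_max - (w_max - w_min) * frac            # linear decrease of inertia weight
        s = np.arccos(2 * frac - 1) / np.pi           # arccosine schedule: 1 -> 0
        c1 = c_end + (c_start - c_end) * s            # cognitive coefficient shrinks
        c2 = c_start - (c_start - c_end) * s          # social coefficient grows
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Example: calibrate two hypothetical parameters against a quadratic error surface.
best, err = improved_pso(lambda p: (p[0] - 3) ** 2 + (p[1] + 1) ** 2,
                         bounds=[(-10, 10), (-10, 10)])
print(best, err)
```

In the actual application the objective function would wrap a Liuxihe model run and return a flood-simulation error measure, with roughly 20 particles and 30 iterations as reported in the abstract.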
Information Quality Evaluation of C2 Systems at Architecture Level
2014-06-01
…based on architecture models of C2 systems, which can help to identify key factors impacting information quality and improve the system capability at the stage of architecture design of C2 systems. …capability evaluation of C2 systems at architecture level becomes necessary and important for improving the system capability at the stage of architecture design. This paper proposes a method for information quality evaluation of C2 systems at architecture level. First, the information quality model is…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rabiti, Cristian; Alfonsi, Andrea; Huang, Dongli
This report collects the effort performed to improve the reliability analysis capabilities of the RAVEN code and to explore new opportunities in the usage of surrogate models by extending the current RAVEN capabilities to multi-physics surrogate models and to the construction of surrogate models for high-dimensionality fields.
Rubenstein, Lisa V; Danz, Marjorie S; Crain, A Lauren; Glasgow, Russell E; Whitebird, Robin R; Solberg, Leif I
2014-12-02
Depression is a major cause of morbidity and cost in primary care patient populations. Successful depression improvement models, however, are complex. Based on organizational readiness theory, a practice's commitment to change and its capability to carry out the change are both important predictors of initiating improvement. We empirically explored the links between relative commitment (i.e., the intention to move forward within the following year) and implementation capability. The DIAMOND initiative administered organizational surveys to medical and quality improvement leaders from each of 83 primary care practices in Minnesota. Surveys preceded initiation of activities directed at implementation of a collaborative care model for improving depression care. To assess implementation capability, we developed composites of survey items for five types of organizational factors postulated to be collaborative care barriers and facilitators. To assess relative commitment for each practice, we averaged leader ratings on an identical survey question assessing practice priorities. We used multivariable regression analyses to assess the extent to which implementation capability predicted relative commitment. We explored whether relative commitment or implementation capability measures were associated with earlier initiation of DIAMOND improvements. All five implementation capability measures independently predicted practice leaders' relative commitment to improving depression care in the following year. These included the following: quality improvement culture and attitudes (p = 0.003), depression culture and attitudes (p <0.001), prior depression quality improvement activities (p <0.001), advanced access and tracking capabilities (p = 0.03), and depression collaborative care features in place (p = 0.03). Higher relative commitment (p = 0.002) and prior depression quality improvement activities appeared to be associated with earlier participation in the DIAMOND initiative. The study supports the concept of organizational readiness to improve quality of care and the use of practice leader surveys to assess it. Practice leaders' relative commitment to depression care improvement may be a useful measure of the likelihood that a practice is ready to initiate evidence-based depression care changes. A comprehensive organizational assessment of implementation capability for depression care improvement may identify specific barriers or facilitators to readiness that require targeted attention from implementers.
People Capability Maturity Model. SM.
1995-09-01
Curtis, Bill; Hefley, William E.; Miller, Sally
…The P-CMM adapts the architecture and the maturity framework underlying the CMM for use with people-related improvement issues. The CMM focuses on helping organizations improve their software development processes. By adapting the maturity framework and the CMM architecture…
Tolentino-Zondervan, Frazen; Berentsen, Paul; Bush, Simon R.; Digal, Larry; Oude Lansink, Alfons
2016-01-01
This study identifies the capabilities needed by small-scale fishers to participate in Fishery Improvement Projects (FIPs) for yellowfin tuna in the Philippines. The current literature provides little empirical evidence on how different models, or types of FIPs, influence the participation of fishers in their programs and the degree to which FIPs are able to foster improvements in fishing practices. To address this literature gap, two different FIPs are empirically analysed, each with different approaches for fostering improvement. The first is the non-governmental organisation-led Partnership Programme Towards Sustainable Tuna, which adopts a bottom-up or development-oriented FIP model. The second is the private-led Artesmar FIP, which adopts a top-down or market-oriented FIP approach. The data were obtained from 350 fishers surveyed and were analysed using two separate models run in succession, taking into consideration full, partial, and non-participation in the two FIPs. The results demonstrate that different types of capabilities are required in order to participate in different FIP models. Individual firm capabilities are more important for fishers' participation in market-oriented FIPs, which use direct economic incentives to encourage improvements in fisher practices. Collective capabilities are more important for fishers to participate in development-oriented FIPs, which drive improvement by supporting fishers, fisher associations, and governments to move towards market requirements. PMID:27732607
Tolentino-Zondervan, Frazen; Berentsen, Paul; Bush, Simon R; Digal, Larry; Oude Lansink, Alfons
2016-01-01
This study identifies the capabilities needed by small-scale fishers to participate in Fishery Improvement Projects (FIPs) for yellowfin tuna in the Philippines. The current literature provides little empirical evidence on how different models, or types of FIPs, influence the participation of fishers in their programs and the degree to which FIPs are able to foster improvements in fishing practices. To address this literature gap, two different FIPs are empirically analysed, each with different approaches for fostering improvement. The first is the non-governmental organisation-led Partnership Programme Towards Sustainable Tuna, which adopts a bottom-up or development-oriented FIP model. The second is the private-led Artesmar FIP, which adopts a top-down or market-oriented FIP approach. The data were obtained from 350 fishers surveyed and were analysed using two separate models run in succession, taking into consideration full, partial, and non-participation in the two FIPs. The results demonstrate that different types of capabilities are required in order to participate in different FIP models. Individual firm capabilities are more important for fishers' participation in market-oriented FIPs, which use direct economic incentives to encourage improvements in fisher practices. Collective capabilities are more important for fishers to participate in development-oriented FIPs, which drive improvement by supporting fishers, fisher associations, and governments to move towards market requirements.
The People Capability Maturity Model
ERIC Educational Resources Information Center
Wademan, Mark R.; Spuches, Charles M.; Doughty, Philip L.
2007-01-01
The People Capability Maturity Model[R] (People CMM[R]) advocates a staged approach to organizational change. Developed by the Carnegie Mellon University Software Engineering Institute, this model seeks to bring discipline to the people side of management by promoting a structured, repeatable, and predictable approach for improving an…
NASA Technical Reports Server (NTRS)
Cole, Stanley R.; Garcia, Jerry L.
2000-01-01
The NASA Langley Transonic Dynamics Tunnel (TDT) has provided a unique capability for aeroelastic testing for forty years. The facility has a rich history of significant contributions to the design of many United States commercial transports, military aircraft, launch vehicles, and spacecraft. The facility has many features that contribute to its uniqueness for aeroelasticity testing, perhaps the most important being the use of a heavy gas test medium to achieve higher test densities. Higher test medium densities substantially ease model-building requirements and therefore simplify the fabrication process for building aeroelastically scaled wind tunnel models. Aeroelastic scaling for the heavy gas results in lower model structural frequencies. Lower model frequencies tend to make aeroelastic testing safer. This paper will describe major developments in the testing capabilities at the TDT throughout its history, the current status of the facility, and planned additions and improvements to its capabilities in the near future.
Process Improvement Should Link to Security: SEPG 2007 Security Track Recap
2007-09-01
…the Systems Security Engineering Capability Maturity Model (SSE-CMM / ISO 21827) and its use in system software developments… software development life cycle (SDLC)? 6. In what ways should process improvement support security in the SDLC? …project management, and support practices through the use of the capability maturity models, including the CMMI and the Systems Security…
Improvements to information management systems simulator
NASA Technical Reports Server (NTRS)
Bilek, R. W.
1972-01-01
The performance of personnel in the augmentation and improvement of the interactive IMSIM information management simulation model is summarized. With this augmented model, NASA now has even greater capabilities for the simulation of computer system configurations, data processing loads imposed on these configurations, and executive software to control system operations. Through these simulations, NASA has an extremely cost effective capability for the design and analysis of computer-based data management systems.
CLAES Product Improvement by use of GSFC Data Assimilation System
NASA Technical Reports Server (NTRS)
Kumer, J. B.; Douglass, Anne (Technical Monitor)
2001-01-01
Recent developments in chemistry transport models (CTM) and in data assimilation systems (DAS) indicate impressive predictive capability for the movement of air parcels and the chemistry that goes on within them. This project was aimed at exploring the use of this capability to achieve improved retrieval of geophysical parameters from remote sensing data. The specific goal was to improve retrieval of the CLAES CH4 data obtained during the active north high-latitude dynamics event of 18 to 25 February 1992. The model capabilities would be used (1) in place of climatology to improve the first-guess and a priori fields, and (2) to provide horizontal gradients to include in the retrieval forward model. The retrieval would be implemented with the first forward DAS prediction. The results would feed back to the DAS, and a second DAS prediction of the first guess, a priori fields, and gradients would feed to the retrieval. The process would repeat to convergence and then proceed to the next day.
Automation of energy demand forecasting
NASA Astrophysics Data System (ADS)
Siddique, Sanzad
Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometrics models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.
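The model-space search described here can be sketched as a brute-force loop over candidate specifications scored on held-out data. The candidate set below (a lag order and a nonlinear input transform for a least-squares autoregressive model with exogenous inputs) and the synthetic demand series are illustrative assumptions; the thesis's econometric and machine learning candidates are richer.

```python
import itertools
import numpy as np

def search_forecast_model(y, X, transforms=(None, np.log1p, np.sqrt), max_lag=3):
    """Brute-force search over (input transform, lag order) candidates for a simple
    autoregressive model with exogenous inputs, scored by holdout RMSE."""
    split = int(0.8 * len(y))
    best = (None, np.inf)
    for tf, lag in itertools.product(transforms, range(1, max_lag + 1)):
        Xt = X if tf is None else tf(X)              # nonlinear input transformation
        # Design matrix rows: [y_{t-1}, ..., y_{t-lag}, X_t, 1]
        rows = [np.concatenate([y[t - lag:t][::-1], Xt[t], [1.0]])
                for t in range(lag, len(y))]
        A, target = np.array(rows), y[lag:]
        n_train = split - lag                        # rows whose target index is < split
        coef, *_ = np.linalg.lstsq(A[:n_train], target[:n_train], rcond=None)
        rmse = np.sqrt(np.mean((A[n_train:] @ coef - target[n_train:]) ** 2))
        if rmse < best[1]:
            best = ((tf.__name__ if tf else "identity", lag), rmse)
    return best

# Hypothetical demand series driven by temperature; the search picks transform and lag.
rng = np.random.default_rng(0)
temp = rng.uniform(0.0, 30.0, size=(300, 1))
demand = 50.0 + 2.0 * np.sqrt(temp[:, 0]) + rng.normal(0.0, 1.0, 300)
print(search_forecast_model(demand, temp))
```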
Systems Engineering | Wind | NREL
NREL leverages its research capabilities toward integrating wind plant engineering performance and cost models to better understand how to improve system-level performance and system-level cost.
NASA Astrophysics Data System (ADS)
Bao, Yanli; Hua, Hefeng
2017-03-01
Network capability is an enterprise's capability to set up, manage, maintain and use a variety of relations between enterprises, and to obtain resources for improving competitiveness. Tourism in China is in a transformation period from sightseeing to leisure and vacation. Scenic spots as well as tourist enterprises can learn from other enterprises in the process of resource development and build up their own network relations in order to obtain resources for their survival and development. Through the effective management of network relations, the performance of resource development can be improved. By reviewing the literature on network capability and through a case analysis of Wuxi Huishan Ancient Town, the role of network capability in tourism resource development is explored and a resource development path is built from the perspective of network capability. Finally, a tourism resource development process model based on network capability is proposed. This model mainly includes setting up a network vision, resource identification, resource acquisition, resource utilization and tourism project development. In these steps, network construction, network management and improving network center status are the key points.
CEASAW: A User-Friendly Computer Environment Analysis for the Sawmill Owner
Guillermo Mendoza; William Sprouse; Philip A. Araman; William G. Luppold
1991-01-01
Improved spreadsheet software capabilities have brought optimization to users with little or no background in mathematical programming. Better interface capabilities of spreadsheet models now make it possible to combine optimization models with a spreadsheet system. Sawmill production and inventory systems possess many features that make them suitable application...
A Conceptual Measurement Model for eHealth Readiness: a Team Based Perspective
Phillips, James; Poon, Simon K.; Yu, Dan; Lam, Mary; Hines, Monique; Brunner, Melissa; Power, Emma; Keep, Melanie; Shaw, Tim; Togher, Leanne
2017-01-01
Despite the shift towards collaborative healthcare and the increase in the use of eHealth technologies, there does not currently exist a model for the measurement of eHealth readiness in interdisciplinary healthcare teams. This research aims to address this gap in the literature through the development of a three phase methodology incorporating qualitative and quantitative methods. We propose a conceptual measurement model consisting of operationalized themes affecting readiness across four factors: (i) Organizational Capabilities, (ii) Team Capabilities, (iii) Patient Capabilities, and (iv) Technology Capabilities. The creation of this model will allow for the measurement of the readiness of interdisciplinary healthcare teams to use eHealth technologies to improve patient outcomes. PMID:29854207
NASA Technical Reports Server (NTRS)
Kornhauser, A. L.; Wilson, L. B.
1974-01-01
Potential economic benefits obtainable from a state-of-the-art ERS system in the resource area of intensive use of living resources, agriculture, are studied. A spectrum of equal capability (cost saving), increased capability, and new capability benefits are quantified. These benefits are estimated via ECON developed models of the agricultural marketplace and include benefits of improved production and distribution of agricultural crops. It is shown that increased capability benefits and new capability benefits result from a reduction of losses due to disease and insect infestation given ERS's capability to distinguish crop vigor and from the improvement in world trade negotiations given ERS's worldwide surveying capability.
USDA-ARS?s Scientific Manuscript database
The coupling of land surface models and hydrological models potentially improves the land surface representation, benefiting streamflow prediction capabilities as well as providing improved estimates of water and energy fluxes into the atmosphere. In this study, the simple biosphere model 2...
Research on Nonlinear Time Series Forecasting of Time-Delay NN Embedded with Bayesian Regularization
NASA Astrophysics Data System (ADS)
Jiang, Weijin; Xu, Yusheng; Xu, Yuhui; Wang, Jianmin
Based on the idea of nonlinear prediction via phase space reconstruction, this paper presents a time-delay BP neural network model whose generalization capability is improved by Bayesian regularization. Furthermore, the model is applied to forecast the import and export trades of one industry. The results show that the improved model has excellent generalization capability: it not only learned the historical curve but also efficiently predicted the business trend. Compared with common forecast evaluation, we conclude that nonlinear forecasting can not only focus on data combination and precision improvement but can also vividly reflect the nonlinear characteristics of the forecasting system. While analyzing the forecasting precision of the model, we give a model judgment by calculating the nonlinear characteristic values of the combined series and the original series, proving that the forecasting model can reasonably 'catch' the dynamic characteristics of the nonlinear system that produced the original series.
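A minimal sketch of the two ingredients named in the abstract follows: a time-delay (phase-space reconstruction) embedding of the series, and a small feed-forward network trained with a weight penalty. Here scikit-learn's fixed L2 penalty (`alpha`) stands in for full Bayesian regularization, and the synthetic series replaces the import/export trade data; both are assumptions made only for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def delay_embed(series, dim=4, tau=1):
    """Phase-space reconstruction: inputs [x_t, x_{t-tau}, ..., x_{t-(dim-1)tau}],
    target x_{t+1}."""
    start = (dim - 1) * tau
    X = np.array([series[t - start:t + 1:tau][::-1]
                  for t in range(start, len(series) - 1)])
    y = series[start + 1:]
    return X, y

# Synthetic nonlinear series standing in for the trade data.
t = np.arange(400)
series = np.sin(0.1 * t) + 0.1 * np.random.default_rng(0).standard_normal(400)

X, y = delay_embed(series)
split = int(0.8 * len(X))
net = MLPRegressor(hidden_layer_sizes=(10,), alpha=1e-2,   # alpha is the L2 weight penalty
                   max_iter=5000, random_state=0)
net.fit(X[:split], y[:split])
print("out-of-sample R^2:", round(net.score(X[split:], y[split:]), 3))
```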
Risk Quantification of Systems Engineering Documents Improves Probability of DOD Project Success
2009-09-01
…a comprehensive risk model for DoD milestone review documentation as well as recommended changes to the Capability Maturity Model Integration (CMMI) Project Planning and Risk Management process areas… Milestone Documentation, Project Planning, Rational Frame, Political Frame, CMMI Project Planning Process Area, CMMI Risk Management Process Area… The intent is to…
Development of 3D Oxide Fuel Mechanics Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, B. W.; Casagranda, A.; Pitts, S. A.
This report documents recent work to improve the accuracy and robustness of the mechanical constitutive models used in the BISON fuel performance code. These developments include migration of the fuel mechanics models to be based on the MOOSE Tensor Mechanics module, improving the robustness of the smeared cracking model, implementing a capability to limit the time step size based on material model response, and improving the robustness of the return mapping iterations used in creep and plasticity models.
A Capital or Capabilities Education Narrative in a World of Staggering Inequalities?
ERIC Educational Resources Information Center
Walker, Melanie
2012-01-01
In a world of tremendous inequalities, this paper explores two contrasting normative models for education policy, and the relationship of each to policy, practices and outcomes that can improve lives by reducing injustice and building societies which value capabilities for all. The first model is that of human capital which currently dominates…
Applying PCI in Combination Swivel Head Wrench
NASA Astrophysics Data System (ADS)
Chen, Tsang-Chiang; Yang, Chun-Ming; Hsu, Chang-Hsien; Hung, Hsiang-Wen
2017-09-01
Taiwan's traditional industries face competition in an era of globalization and environmental change, and the industry is under economic pressure and shock; a sustainable business can only survive by continuously improving production efficiency and the quality of its technology in order to stabilize its market and obtain a high market share. This study uses process capability indices to monitor the quality of the ratchet wrench: the key functions of the dual-use ratchet wrench are identified, actual measurement data are collected and analyzed with the process capability index Cpk, and a Process Capability Analysis Chart model is drawn. Finally, this study examines the current situation of this case and identifies deficiencies and improvement methods to raise overall quality and thereby enhance the overall industry.
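The Cpk index used in this study compares the process spread against the nearer specification limit. A minimal computation is sketched below, assuming approximately normal measurements; the torque values and specification limits are illustrative, not the study's data.

```python
import numpy as np

def cpk(samples, lsl, usl):
    """Process capability index Cpk = min((USL - mean)/(3*sigma), (mean - LSL)/(3*sigma))."""
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

# Hypothetical torque measurements (N*m) for the wrench against spec limits 48-52.
rng = np.random.default_rng(1)
print(round(cpk(rng.normal(50.2, 0.5, 200), lsl=48, usl=52), 2))
```

Conventional practice reads Cpk at or above about 1.33 as a capable process; values below 1 flag the characteristics this kind of study targets for improvement.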
Yap, Keem Siah; Lim, Chee Peng; Au, Mau Teng
2011-12-01
Generalized adaptive resonance theory (GART) is a neural network model that is capable of online learning and is effective in tackling pattern classification tasks. In this paper, we propose an improved GART model (IGART) and demonstrate its applicability to power systems. IGART enhances the dynamics of GART in several aspects, which include the use of the Laplacian likelihood function, a new vigilance function, a new match-tracking mechanism, an ordering algorithm for determining the sequence of training data, and a rule extraction capability to elicit if-then rules from the network. To assess the effectiveness of IGART and to compare its performance with that of other methods, three datasets related to power systems are employed. The experimental results demonstrate the usefulness of IGART with the rule extraction capability in undertaking classification problems in power systems engineering.
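One of the enhancements named above, replacing the usual Gaussian likelihood with a Laplacian one, can be sketched as follows; the category prototypes and scales are illustrative assumptions, not IGART's trained parameters.

```python
import numpy as np

def laplacian_log_likelihood(x, mu, b):
    """Log-likelihood of input x under an independent Laplacian model with
    per-feature location mu and scale b: sum of log(1/(2b)) - |x - mu|/b."""
    x, mu, b = map(np.asarray, (x, mu, b))
    return np.sum(-np.log(2.0 * b) - np.abs(x - mu) / b)

# Score an input against two hypothetical category prototypes; the larger wins.
x = np.array([0.3, 0.7])
cats = {"A": (np.array([0.2, 0.8]), np.array([0.1, 0.1])),
        "B": (np.array([0.9, 0.1]), np.array([0.2, 0.2]))}
print(max(cats, key=lambda k: laplacian_log_likelihood(x, *cats[k])))
```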
A variable capacitance based modeling and power capability predicting method for ultracapacitor
NASA Astrophysics Data System (ADS)
Liu, Chang; Wang, Yujie; Chen, Zonghai; Ling, Qiang
2018-01-01
Methods of accurate modeling and power capability prediction for ultracapacitors are of great significance in the management and application of lithium-ion battery/ultracapacitor hybrid energy storage systems. To overcome the simulation error coming from a constant-capacitance model, an improved ultracapacitor model based on variable capacitance is proposed, where the main capacitance varies with voltage according to a piecewise linear function. A novel state-of-charge calculation approach is developed accordingly. After that, a multi-constraint power capability prediction is developed for the ultracapacitor, in which a Kalman-filter-based state observer is designed for tracking the ultracapacitor's real-time behavior. Finally, experimental results verify the proposed methods. The accuracy of the proposed model is verified by terminal voltage simulation results under different temperatures, and the effectiveness of the designed observer is proved by various test conditions. Additionally, the power capability prediction results of different time scales and temperatures are compared to study their effects on the ultracapacitor's power capability.
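A minimal sketch of the variable-capacitance idea and the resulting state-of-charge definition follows. The piecewise linear C(V) breakpoints and the rated voltage are illustrative assumptions; the paper identifies its own segments from test data.

```python
import numpy as np

# Illustrative piecewise linear main capacitance C(V); breakpoints are assumed.
V_BREAKS = np.array([0.0, 1.0, 2.0, 2.7])           # V
C_VALUES = np.array([280.0, 300.0, 330.0, 350.0])   # F

def capacitance(v):
    """Main capacitance as a piecewise linear function of terminal voltage."""
    return np.interp(v, V_BREAKS, C_VALUES)

def stored_charge(v, dv=0.01):
    """Charge stored up to voltage v: the integral of C(V) dV (trapezoidal rule)."""
    grid = np.arange(0.0, v + dv, dv)
    c = capacitance(grid)
    return np.sum(0.5 * (c[1:] + c[:-1]) * np.diff(grid))

def soc(v, v_rated=2.7):
    """State of charge: stored charge relative to the charge at rated voltage."""
    return stored_charge(v) / stored_charge(v_rated)

print(f"SOC at 1.8 V: {soc(1.8):.1%}")
```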
Nuclear Engine System Simulation (NESS). Version 2.0: Program user's guide
NASA Technical Reports Server (NTRS)
Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman
1993-01-01
This Program User's Guide discusses the Nuclear Thermal Propulsion (NTP) engine system design features and capabilities modeled in the Nuclear Engine System Simulation (NESS): Version 2.0 program (referred to as NESS throughout the remainder of this document), as well as its operation. NESS was upgraded to include many new modeling capabilities not available in the original version delivered to NASA LeRC in Dec. 1991. NESS's new features include the following: (1) an improved input format; (2) an advanced solid-core NERVA-type reactor system model (ENABLER 2); (3) a bleed-cycle engine system option; (4) an axial-turbopump design option; (5) an automated pump-out turbopump assembly sizing option; (6) an off-design gas generator engine cycle design option; (7) updated hydrogen properties; (8) an improved output format; and (9) personal computer operation capability. Sample design cases are presented in the user's guide that demonstrate many of the new features associated with this upgraded version of NESS, as well as design modeling features associated with the original version of NESS.
2009-09-01
…(NII)/CIO: Assistant Secretary of Defense for Networks and Information Integration/Chief Information Officer; CMMI: Capability Maturity Model Integration. …a Web-based portal to share knowledge about software process-related methodologies, such as the SEI's Capability Maturity Model Integration (CMMI), the SEI's IDEAL model, and Lean Six Sigma. For example, the portal features content areas such as software acquisition management, the SEI CMMI…
The Advanced Modeling, Simulation and Analysis Capability Roadmap Vision for Engineering
NASA Technical Reports Server (NTRS)
Zang, Thomas; Lieber, Mike; Norton, Charles; Fucik, Karen
2006-01-01
This paper summarizes a subset of the Advanced Modeling Simulation and Analysis (AMSA) Capability Roadmap that was developed for NASA in 2005. The AMSA Capability Roadmap Team was chartered "to identify what is needed to enhance NASA's capabilities to produce leading-edge exploration and science missions by improving engineering system development, operations, and science understanding through broad application of advanced modeling, simulation and analysis techniques." The AMSA roadmap stressed the need for integration, not just within the science, engineering and operations domains themselves, but also across these domains. Here we discuss the roadmap element pertaining to integration within the engineering domain, with a particular focus on implications for future observatory missions. The AMSA products supporting the system engineering function are mission information, bounds on information quality, and system validation guidance. The Engineering roadmap element contains five sub-elements: (1) Large-Scale Systems Models, (2) Anomalous Behavior Models, (3) Advanced Uncertainty Models, (4) Virtual Testing Models, and (5) Space-Based Robotics Manufacture and Servicing Models.
Recent Improvements in Semi-Span Testing at the National Transonic Facility (Invited)
NASA Technical Reports Server (NTRS)
Gatlin, G. M.; Tomek, W. G.; Payne, F. M.; Griffiths, R. C.
2006-01-01
Three wind tunnel investigations of a commercial transport, high-lift, semi-span configuration have recently been conducted in the National Transonic Facility at the NASA Langley Research Center. Throughout the course of these investigations multiple improvements have been developed in the facility semi-span test capability. The primary purpose of the investigations was to assess Reynolds number scale effects on a modern commercial transport configuration up to full-scale flight test conditions (Reynolds numbers on the order of 27 million). The tests included longitudinal aerodynamic studies at subsonic takeoff and landing conditions across a range of Reynolds numbers from that available in conventional wind tunnels up to flight conditions. The purpose of this paper is to discuss lessons learned and improvements incorporated into the semi-span testing process. Topics addressed include enhanced thermal stabilization and moisture reduction procedures, assessments and improvements in model sealing techniques, compensation of model reference dimensions due to test temperature, significantly improved semi-span model access capability, and assessments of data repeatability.
NASA Astrophysics Data System (ADS)
Aburas, Maher Milad; Ho, Yuek Ming; Ramli, Mohammad Firuz; Ash'aari, Zulfa Hanan
2017-07-01
The creation of an accurate simulation of future urban growth is considered one of the most important challenges in urban studies that involve spatial modeling. The purpose of this study is to improve the simulation capability of an integrated CA-Markov Chain (CA-MC) model using CA-MC based on the Analytical Hierarchy Process (AHP) and CA-MC based on Frequency Ratio (FR), both applied in Seremban, Malaysia, as well as to compare the performance and accuracy between the traditional and hybrid models. Various physical, socio-economic, utilities, and environmental criteria were used as predictors, including elevation, slope, soil texture, population density, distance to commercial area, distance to educational area, distance to residential area, distance to industrial area, distance to roads, distance to highway, distance to railway, distance to power line, distance to stream, and land cover. For calibration, three models were applied to simulate urban growth trends in 2010; the actual data of 2010 were used for model validation utilizing the Relative Operating Characteristic (ROC) and Kappa coefficient methods. Consequently, future urban growth maps of 2020 and 2030 were created. The validation findings confirm that the integration of the CA-MC model with the FR model and the employment of the significant driving forces of urban growth in the simulation process have resulted in the improved simulation capability of the CA-MC model. This study has provided a novel approach for improving the CA-MC model based on FR, which will provide powerful support to planners and decision-makers in the development of future sustainable urban planning.
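The Frequency Ratio weighting used to couple driving factors to the CA-MC model can be sketched as below: for each class of a driver (for example, a distance-to-road band), FR is the share of observed urban growth cells in that class divided by the share of all cells in that class. The rasters here are synthetic placeholders, not the Seremban data.

```python
import numpy as np

def frequency_ratio(factor_class, growth_mask):
    """Frequency Ratio per factor class:
    (growth cells in class / all growth cells) / (cells in class / all cells).
    FR > 1 means the class is more prone to urban growth than average."""
    fr = {}
    total_cells = factor_class.size
    total_growth = growth_mask.sum()
    for cls in np.unique(factor_class):
        in_cls = factor_class == cls
        pct_growth = growth_mask[in_cls].sum() / total_growth
        pct_area = in_cls.sum() / total_cells
        fr[int(cls)] = pct_growth / pct_area
    return fr

# Illustrative rasters: 3 distance-to-road classes and a binary observed-growth map.
rng = np.random.default_rng(0)
classes = rng.integers(1, 4, size=(100, 100))
growth = rng.random((100, 100)) < np.where(classes == 1, 0.3, 0.05)  # growth favors class 1
print(frequency_ratio(classes, growth))
```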
Off-Gas Adsorption Model Capabilities and Recommendations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyon, Kevin L.; Welty, Amy K.; Law, Jack
2016-03-01
Off-gas treatment is required to reduce emissions from aqueous fuel reprocessing. Evaluating the products of innovative gas adsorption research requires increased computational simulation capability to more effectively transition from fundamental research to operational design. Early modeling efforts produced the Off-Gas SeParation and REcoverY (OSPREY) model that, while efficient in terms of computation time, was of limited value for complex systems. However, the computational and programming lessons learned in development of the initial model were used to develop Discontinuous Galerkin OSPREY (DGOSPREY), a more effective model. Initial comparisons between OSPREY and DGOSPREY show that, while OSPREY does reasonably well to capture the initial breakthrough time, it displays far too much numerical dispersion to accurately capture the real shape of the breakthrough curves. DGOSPREY is a much better tool as it utilizes a more stable set of numerical methods. In addition, DGOSPREY has shown the capability to capture complex, multispecies adsorption behavior, while OSPREY currently only works for a single adsorbing species. This capability makes DGOSPREY ultimately a more practical tool for real world simulations involving many different gas species. While DGOSPREY has initially performed very well, there is still need for improvement. The current state of DGOSPREY does not include any micro-scale adsorption kinetics and therefore assumes instantaneous adsorption. This is a major source of error in predicting water vapor breakthrough because the kinetics of that adsorption mechanism is particularly slow. However, this deficiency can be remedied by building kinetic kernels into DGOSPREY. Another source of error in DGOSPREY stems from data gaps in single species, such as Kr and Xe, isotherms. Since isotherm data for each gas is currently available at a single temperature, the model is unable to predict adsorption at temperatures outside of the set of data currently available. Thus, in order to improve the predictive capabilities of the model, there is a need for more single-species adsorption isotherms at different temperatures, in addition to extending the model to include adsorption kinetics. This report provides background information about the modeling process and a path forward for further model improvement in terms of accuracy and user interface.
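The numerical dispersion issue described above can be illustrated with a toy 1-D advection problem solved with a first-order upwind scheme: a sharp inlet step is smeared as it propagates, which is the same qualitative behavior that smears OSPREY's breakthrough fronts. This sketch is a generic transport illustration under assumed parameters, not the OSPREY or DGOSPREY adsorption physics.

```python
import numpy as np

def breakthrough_upwind(n_cells, velocity=1.0, length=1.0, t_end=2.0, cfl=0.9):
    """Advect a step in inlet concentration through a 1-D column with first-order
    upwind differencing and return the outlet history (the breakthrough curve).
    Coarser grids (smaller n_cells) smear the front via numerical dispersion."""
    dx = length / n_cells
    dt = cfl * dx / velocity
    c = np.zeros(n_cells)
    times, outlet = [], []
    t = 0.0
    while t < t_end:
        c[1:] -= velocity * dt / dx * (c[1:] - c[:-1])
        c[0] = 1.0                      # constant feed concentration at the inlet
        t += dt
        times.append(t)
        outlet.append(c[-1])
    return np.array(times), np.array(outlet)

coarse_t, coarse_c = breakthrough_upwind(n_cells=20)
fine_t, fine_c = breakthrough_upwind(n_cells=400)
# The fine-grid outlet curve stays close to a sharp step at t = L/v = 1.0;
# the coarse-grid curve is smeared well before and after that time.
```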
NASA Technical Reports Server (NTRS)
Chow, Chuen-Yen; Ryan, James S.
1987-01-01
While the zonal grid system of Transonic Navier-Stokes (TNS) provides excellent modeling of complex geometries, improved shock capturing, and a higher Mach number range will be required if flows about hypersonic aircraft are to be modeled accurately. A computational fluid dynamics (CFD) code, the Compressible Navier-Stokes (CNS), is under development to combine the required high Mach number capability with the existing TNS geometry capability. One of several candidate flow solvers for inclusion in the CNS is that of F3D. This upwinding flow solver promises improved shock capturing, and more accurate hypersonic solutions overall, compared to the solver currently used in TNS.
Developing Quality Improvement capacity and capability across the Children in Fife partnership.
Morris, Craig; Alexander, Ingrid
2016-01-01
A Project Manager from the Fife Early Years Collaborative facilitated a large-scale Quality Improvement (herein QI) project to build organisational capacity and capability across the Children in Fife partnership through three separate, eight-month training cohorts. This 18-month QI project enabled 32 practitioners to increase their skills, knowledge, and experience in a variety of QI tools, including the Model for Improvement, which then supported the delivery of high-quality improvement projects and improved outcomes for children and families, essentially growing the confidence and capability of practitioners to deliver sustainable QI. In total, 27 improvement projects were delivered, some leading to service redesign, reduced waiting times, increased uptake of health entitlements, and improved accessibility to front-line health services. Thirteen improvement projects spread or scaled beyond the initial site, and informal QI mentoring took place with peers in the respective agencies. Multiple PDSA cycles were conducted to test the most efficient and effective support mechanisms during and after training, maintaining regular contact and utilising social media to share progress and achievements.
Development of Semi-Span Model Test Techniques
NASA Technical Reports Server (NTRS)
Pulnam, L. Elwood (Technical Monitor); Milholen, William E., II; Chokani, Ndaona; McGhee, Robert J.
1996-01-01
A computational investigation was performed to support the development of a semi-span model test capability in the NASA Langley Research Center's National Transonic Facility. This capability is desirable for the testing of advanced subsonic transport aircraft at full-scale Reynolds numbers. A state-of-the-art three-dimensional Navier-Stokes solver was used to examine methods to improve the flow over a semi-span configuration. First, a parametric study is conducted to examine the influence of the stand-off height on the flow over the semi-span model. It is found that decreasing the stand-off height, below the maximum fuselage radius, improves the aerodynamic characteristics of the semi-span model. Next, active sidewall boundary layer control techniques are examined. Juncture region blowing jets, upstream tangential blowing, and sidewall suction are found to improve the flow over the aft portion of the semi-span model. Both upstream blowing and suction are found to reduce the sidewall boundary layer separation. The resulting near surface streamline patterns are improved, and found to be quite similar to the full-span results. Both techniques however adversely affect the pitching moment coefficient.
Development of Semi-Span Model Test Techniques
NASA Technical Reports Server (NTRS)
Milholen, William E., II; Chokani, Ndaona; McGhee, Robert J.
1996-01-01
A computational investigation was performed to support the development of a semispan model test capability in the NASA Langley Research Center's National Transonic Facility. This capability is desirable for the testing of advanced subsonic transport aircraft at full-scale Reynolds numbers. A state-of-the-art three-dimensional Navier-Stokes solver was used to examine methods to improve the flow over a semi-span configuration. First, a parametric study is conducted to examine the influence of the stand-off height on the flow over the semispan model. It is found that decreasing the stand-off height, below the maximum fuselage radius, improves the aerodynamic characteristics of the semi-span model. Next, active sidewall boundary layer control techniques are examined. Juncture region blowing jets, upstream tangential blowing, and sidewall suction are found to improve the flow over the aft portion of the semispan model. Both upstream blowing and suction are found to reduce the sidewall boundary layer separation. The resulting near surface streamline patterns are improved, and found to be quite similar to the full-span results. Both techniques however adversely affect the pitching moment coefficient.
AIR QUALITY MODELING OF PM AND AIR TOXICS AT NEIGHBORHOOD SCALES
The current interest in fine particles and toxic pollutants provides an impetus for extending air quality modeling capability towards improving exposure modeling and assessments. Human exposure models require information on concentration derived from interpolation of observati...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clemens, Noel
This project was a combined computational and experimental effort to improve predictive capability for boundary layer flashback of premixed swirl flames relevant to gas-turbine power plants operating with high-hydrogen-content fuels. During the course of this project, significant progress in modeling was made on four major fronts: 1) use of direct numerical simulation of turbulent flames to understand the coupling between the flame and the turbulent boundary layer; 2) improved modeling capability for flame propagation in stratified pre-mixtures; 3) improved portability of computer codes using the OpenFOAM platform to facilitate transfer to industry and other researchers; and 4) application of LES to flashback in swirl combustors, and a detailed assessment of its capabilities and limitations for predictive purposes. A major component of the project was an experimental program that focused on developing a rich experimental database of boundary layer flashback in swirl flames. Both methane and high-hydrogen fuels, including effects of elevated pressure (1 to 5 atm), were explored. For this project, a new model swirl combustor was developed. Kilohertz-rate stereoscopic PIV and chemiluminescence imaging were used to investigate the flame propagation dynamics. In addition to the planar measurements, a technique capable of detecting the instantaneous, time-resolved 3D flame front topography was developed and applied successfully to investigate the flow-flame interaction. The UT measurements and legacy data were used in a hierarchical validation approach where flows with increasingly complex physics were used for validation. First, component models were validated with DNS and literature data in simplified configurations, and this was followed by validation with the UT 1-atm flashback cases, and then the UT high-pressure flashback cases. The new models and portable code represent a major improvement over what was available before this project was initiated.
How to improve healthcare? Identify, nurture and embed individuals and teams with "deep smarts".
Eljiz, Kathy; Greenfield, David; Molineux, John; Sloan, Terry
2018-03-19
Purpose: Unlocking and transferring skills and capabilities in individuals to the teams they work within, and across, is the key to positive organisational development and improved patient care. Using the "deep smarts" model, the purpose of this paper is to examine these issues. Design/methodology/approach: The "deep smarts" model is described, reviewed and proposed as a way of transferring knowledge and capabilities within healthcare organisations. Findings: Effective healthcare delivery is achieved through, and continues to require, integrative care involving numerous, dispersed service providers. In the space of overlapping organisational boundaries, there is a need for "deep smarts" people who act as "boundary spanners". These are critical integrative, networking roles employing clinical, organisational and people skills across multiple settings. Research limitations/implications: Studies evaluating the barriers and enablers to the application of the deep smarts model and the 13 knowledge development strategies proposed are required. Such future research will empirically and contemporarily ground our understanding of organisational development in modern complex healthcare settings. Practical implications: An organisation with "deep smarts" people - in managerial, auxiliary and clinical positions - has a greater capacity for integration and achieving improved patient-centred care. Originality/value: In total, 13 developmental strategies, to transfer individual capabilities into organisational capability, are proposed. These strategies are applicable to different contexts and challenges faced by individuals and teams in complex healthcare organisations.
Wind-US Code Physical Modeling Improvements to Complement Hypersonic Testing and Evaluation
NASA Technical Reports Server (NTRS)
Georgiadis, Nicholas J.; Yoder, Dennis A.; Towne, Charles S.; Engblom, William A.; Bhagwandin, Vishal A.; Power, Greg D.; Lankford, Dennis W.; Nelson, Christopher C.
2009-01-01
This report gives an overview of physical modeling enhancements to the Wind-US flow solver which were made to improve the capabilities for simulation of hypersonic flows and the reliability of computations to complement hypersonic testing. The improvements include advanced turbulence models, a bypass transition model, a conjugate (or closely coupled to vehicle structure) conduction-convection heat transfer capability, and an upgraded high-speed combustion solver. A Mach 5 shock-wave boundary layer interaction problem is used to investigate the benefits of k-ε and k-ω based explicit algebraic stress turbulence models relative to linear two-equation models. The bypass transition model is validated using data from experiments for incompressible boundary layers and a Mach 7.9 cone flow. The conjugate heat transfer method is validated for a test case involving reacting H2-O2 rocket exhaust over cooled calorimeter panels. A dual-mode scramjet configuration is investigated using both a simplified 1-step kinetics mechanism and an 8-step mechanism. Additionally, variations in the turbulent Prandtl and Schmidt numbers are considered for this scramjet configuration.
Development of task network models of human performance in microgravity
NASA Technical Reports Server (NTRS)
Diaz, Manuel F.; Adam, Susan
1992-01-01
This paper discusses the utility of task-network modeling for quantifying human performance variability in microgravity. The data are gathered for: (1) improving current methodologies for assessing human performance and workload in the operational space environment; (2) developing tools for assessing alternative system designs; and (3) developing an integrated set of methodologies for the evaluation of performance degradation during extended duration spaceflight. The evaluation entailed an analysis of the Remote Manipulator System payload-grapple task performed on many shuttle missions. Task-network modeling can be used as a tool for assessing and enhancing human performance in man-machine systems, particularly for modeling long-duration manned spaceflight. Task-network modeling can be directed toward improving system efficiency by increasing the understanding of basic capabilities of the human component in the system and the factors that influence these capabilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turinsky, Paul J., E-mail: turinsky@ncsu.edu; Kothe, Douglas B., E-mail: kothe@ornl.gov
The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. To accomplish this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics “core simulator” based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL's evolving M&S capabilities, which is in progress, will assist in addressing long-standing and future operational and safety challenges of the nuclear industry. - Highlights: • Complexity of physics based modeling of light water reactor cores being addressed. • Capability developed to help address problems that have challenged the nuclear power industry. • Simulation capabilities that take advantage of high performance computing developed.
Final Report for "Design calculations for high-space-charge beam-to-RF conversion".
DOE Office of Scientific and Technical Information (OSTI.GOV)
David N Smithe
2008-10-17
Accelerator facility upgrades, new accelerator applications, and future design efforts are leading to novel klystron and IOT device concepts, including multiple beam, high-order mode operation, and new geometry configurations of old concepts. At the same time, a new simulation capability, based upon finite-difference “cut-cell” boundaries, has emerged and is transforming the existing modeling and design capability with unparalleled realism, greater flexibility, and improved accuracy. This same new technology can also be brought to bear on a difficult-to-study aspect of the energy recovery linac (ERL), namely the accurate modeling of the exit beam and design of the beam dump for optimum energy efficiency. We have developed new capability for design calculations and modeling of a broad class of devices which convert bunched beam kinetic energy to RF energy, including RF sources such as klystrons, gyro-klystrons, IOTs, TWTs, and other devices in which space-charge effects are important. Recent advances in geometry representation now permit very accurate representation of the curved metallic surfaces common to RF sources, resulting in unprecedented simulation accuracy. In the Phase I work, we evaluated and demonstrated the capabilities of the new geometry representation technology as applied to modeling and design of output cavity components of klystrons, IOTs, and energy recovery SRF cavities. We identified and prioritized which aspects of the design study process to pursue and improve in Phase II. The development and use of the new accurate geometry modeling technology on RF sources for DOE accelerators will help spark a new generation of modeling and design capability, free from many of the constraints and inaccuracy associated with the previous generation of “stair-step” geometry modeling tools. This new capability is ultimately expected to impact all fields with high power RF sources, including DOE fusion research, communications, radar and other defense applications.
Design and fabrication of robotic gripper for grasping in minimizing contact force
NASA Astrophysics Data System (ADS)
Heidari, Hamidreza; Pouria, Milad Jafary; Sharifi, Shahriar; Karami, Mahmoudreza
2018-03-01
This paper presents a new method to improve the kinematics of a robot gripper for grasping in unstructured environments, such as space operations. The robot gripper is inspired by the human hand, and the hand design is kept close to the structure of human fingers to provide successful grasping capabilities. The main goal is to improve the kinematic structure of the gripper to increase the capability of grasping large objects, decrease the contact forces, and make successful grasps of various objects in unstructured environments. This research describes the development of a self-adaptive and reconfigurable robotic hand for space operations through mechanical compliance which is versatile, robust and easy to control. Our model contains two fingers, one two-link and one three-link, combined with a kinematic model of the thumb index. Moreover, experimental tests are performed to examine the effectiveness of the fabricated hand in real, unstructured tasks. The results show that the successful grasp range is improved by about 30% and the contact forces are reduced by approximately 10% for a wide range of target object sizes. According to the obtained results, the proposed approach provides an accommodating kinematic model in which the finger geometries yield better grasping capability for the robot gripper.
Chopp-Hurley, Jaclyn N; Brookham, Rebecca L; Dickerson, Clark R
2016-12-01
Biomechanical models are often used to estimate the muscular demands of various activities. However, specific muscle dysfunctions typical of unique clinical populations are rarely considered. Due to iatrogenic tissue damage, pectoralis major capability is markedly reduced in breast cancer population survivors, which could influence arm internal and external rotation muscular strategies. Accordingly, an optimization-based muscle force prediction model was systematically modified to emulate breast cancer population survivors through adjusting pectoralis capability and enforcing an empirical muscular co-activation relationship. Model permutations were evaluated through comparisons between predicted muscle forces and empirically measured muscle activations in survivors. Similarities between empirical data and model outputs were influenced by muscle type, hand force, pectoralis major capability and co-activation constraints. Differences in magnitude were lower when the co-activation constraint was enforced (-18.4% [31.9]) than unenforced (-23.5% [27.6]) (p<0.0001). This research demonstrates that muscle dysfunction in breast cancer population survivors can be reflected through including a capability constraint for pectoralis major. Further refinement of the co-activation constraint for survivors could improve its generalizability across this population and activities. Improving biomechanical models to more accurately represent clinical populations can provide novel information that can help in the development of optimal treatment programs for breast cancer population survivors. Copyright © 2016 Elsevier Ltd. All rights reserved.
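A hedged sketch of the kind of optimization-based muscle force prediction described above (the muscle set, moment arms, capability caps, and co-activation rule here are illustrative placeholders, not the study's values): predicted forces minimize an activation cost subject to a joint-moment balance, the survivor condition is emulated by capping pectoralis major capability, and an extra inequality stands in for the empirical co-activation relationship.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative moment arms (m) about the humeral internal/external rotation
# axis and per-muscle capability caps (N); all values are hypothetical.
muscles    = ["pec_major", "subscap", "infraspinatus", "teres_minor"]
moment_arm = np.array([0.020, 0.018, -0.022, -0.015])
capability = np.array([400.0, 350.0, 300.0, 200.0])
capability[0] *= 0.4              # emulate reduced pectoralis major capability

target_moment = 6.0               # N*m internal-rotation demand (hypothetical)

def cost(f):                      # minimize the sum of squared activations
    return np.sum((f / capability) ** 2)

cons = [
    # joint moment balance
    {"type": "eq", "fun": lambda f: moment_arm @ f - target_moment},
    # illustrative co-activation constraint: antagonists produce at least
    # 10% of the agonist force (a stand-in for the empirical relationship)
    {"type": "ineq", "fun": lambda f: (f[2] + f[3]) - 0.10 * (f[0] + f[1])},
]
bounds = [(0.0, c) for c in capability]

res = minimize(cost, x0=np.full(4, 10.0), bounds=bounds,
               constraints=cons, method="SLSQP")
print(dict(zip(muscles, np.round(res.x, 1))))
```

Removing the co-activation constraint or restoring the full pectoralis capability changes the predicted force-sharing pattern, which is the comparison the study exploits.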
Towards improved capability and confidence in coupled atmospheric and wildland fire modeling
NASA Astrophysics Data System (ADS)
Sauer, Jeremy A.
This dissertation work is aimed at improving the capability of and confidence in a modernized and improved version of Los Alamos National Laboratory's coupled atmospheric and wildland fire dynamics model, Higrad-Firetec. Higrad is the hydrodynamics component of this large eddy simulation model that solves the three-dimensional, fully compressible Navier-Stokes equations, incorporating a dynamic eddy viscosity formulation through a two-scale turbulence closure scheme. Firetec is the vegetation, drag forcing, and combustion physics portion that is integrated with Higrad. The modern version of Higrad-Firetec incorporates multiple numerical methodologies and high-performance computing aspects which combine to yield a unique tool capable of augmenting theoretical and observational investigations in order to better understand the multi-scale, multi-phase, and multi-physics phenomena involved in coupled atmospheric and environmental dynamics. More specifically, the current work includes extended functionality and validation efforts targeting component processes in coupled atmospheric and wildland fire scenarios. Since observational data of sufficient quality and resolution to validate the fully coupled atmosphere-wildfire scenario simply do not exist, we instead seek to validate components of the full, prohibitively convoluted process. This manuscript provides, first, an introduction and background to the application space of Higrad-Firetec. Second, we document the model formulation, solution procedure, and a simple scalar transport verification exercise. Third, we validate model results against observational data for time-averaged flow field metrics in and above four idealized forest canopies. Fourth, we carry out a validation effort for the non-buoyant jet in a crossflow scenario (to which an analogy can be made for atmosphere-wildfire interactions), comparing model results to laboratory data of both steady-in-time and unsteady-in-time metrics. Finally, an extension of the model's multi-phase physics is implemented, allowing for the representation of multiple collocated fuels as separately evolving constituents, leading to differences in the resulting rate of spread and total burned area. In combination, these efforts demonstrate improved capability, increased validation of component functionality, and the unique applicability of the Higrad-Firetec modeling framework. As a result, this work provides a substantially more robust foundation for new, more widely applicable investigations into the complexities of coupled atmospheric and wildland fire behavior.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation. The main computational objectives were: 1. To develop computationally efficient, but physically based, parameterizations of estuary and continental shelf mixing processes for use in an Earth System Model (CESM). 2. To develop a two-way nested regional modeling framework in order to dynamically downscale the climate response of particular coastal ocean regions and to upscale the impact of the regional coastal processes to the global climate in an Earth System Model (CESM). 3. To develop computational infrastructure to enhance the efficiency of data transfer between specific sources and destinations, i.e., a point-to-point communication capability (used in objective 1), within POP, the ocean component of CESM.
A comparison of WEC control strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, David G.; Bacelli, Giorgio; Coe, Ryan Geoffrey
2016-04-01
The operation of Wave Energy Converter (WEC) devices can pose many challenging problems to the Water Power Community. A key research question is how to significantly improve the performance of these WEC devices through improving the control system design. This report summarizes an effort to analyze and improve the performance of WEC through the design and implementation of control systems. Controllers were selected to span the WEC control design space with the aim of building a more comprehensive understanding of different controller capabilities and requirements. To design and evaluate these control strategies, a model scale test-bed WEC was designed for both numerical and experimental testing (see Section 1.1). Seven control strategies have been developed and applied on a numerical model of the selected WEC. This model is capable of performing at a range of levels, spanning from a fully-linear realization to varying levels of nonlinearity. The details of this model and its ongoing development are described in Section 1.2.
Analysis of the In-Water and Sky Radiance Distribution Data Acquired During the Radyo Project
2013-09-30
radiative transfer to model the BRDF of particulate surfaces. OBJECTIVES The major objective of this research is to understand the downwelling...of image and radiative transfer models used in the ocean. My near term ocean optics objectives have been: 1) to improve the measurement capability...directional Reflectance Distribution Function (BRDF) of benthic surfaces in the ocean, and 4) to understand the capabilities and limitations of using
A Low-Signal-to-Noise-Ratio Sensor Framework Incorporating Improved Nighttime Capabilities in DIRSIG
NASA Astrophysics Data System (ADS)
Rizzuto, Anthony P.
When designing new remote sensing systems, it is difficult to make apples-to-apples comparisons between designs because of the number of sensor parameters that can affect the final image. Using synthetic imagery and a computer sensor model allows for comparisons to be made between widely different sensor designs or between competing design parameters. Little work has been done in fully modeling low-SNR systems end-to-end for these types of comparisons. Currently DIRSIG has limited capability to accurately model nighttime scenes under new moon conditions or near large cities. An improved DIRSIG scene modeling capability is presented that incorporates all significant sources of nighttime radiance, including new models for urban glow and airglow, both taken from the astronomy community. A low-SNR sensor modeling tool is also presented that accounts for sensor components and noise sources to generate synthetic imagery from a DIRSIG scene. The various sensor parameters that affect SNR are discussed, and example imagery is shown with the new sensor modeling tool. New low-SNR detectors have recently been designed and marketed for remote sensing applications. A comparison of system parameters for a state-of-the-art low-SNR sensor is discussed, and a sample design trade study is presented for a hypothetical scene and sensor.
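As a rough, textbook-style illustration of the sensor parameters that drive SNR in a low-light design (all numbers below are hypothetical and this is not the DIRSIG or sensor tool itself), the signal electrons collected from a scene radiance can be weighed against shot, dark, and read noise:

```python
import numpy as np

# Hypothetical low-light sensor and scene parameters
radiance      = 1e-5      # band-integrated scene radiance, W / (m^2 sr)
f_number      = 2.0
pixel_pitch   = 15e-6     # m
qe            = 0.85      # quantum efficiency
optics_trans  = 0.7
t_int         = 0.05      # integration time, s
dark_current  = 50.0      # e-/s per pixel
read_noise    = 3.0       # e- rms
photon_energy = 3.0e-19   # J (roughly, for ~650 nm light)

# Photons reaching a pixel: radiance * pixel area * collection solid angle
# * optics transmission * integration time, converted to photoelectrons.
area_pix    = pixel_pitch ** 2
solid_angle = np.pi / (4.0 * f_number ** 2)      # on-axis cone of the optics
signal_e = (qe * optics_trans * radiance * area_pix * solid_angle * t_int
            / photon_energy)

dark_e = dark_current * t_int
noise  = np.sqrt(signal_e + dark_e + read_noise ** 2)   # shot + dark + read
print(f"signal = {signal_e:.0f} e-, noise = {noise:.1f} e-, "
      f"SNR = {signal_e / noise:.1f}")
```

With these illustrative numbers the SNR lands in the single digits, which is the regime where the trade between read noise, dark current, and integration time dominates the design comparison.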
Experience Transitioning Models and Data at the NOAA Space Weather Prediction Center
NASA Astrophysics Data System (ADS)
Berger, Thomas
2016-07-01
The NOAA Space Weather Prediction Center has a long history of transitioning research data and models into operations and with the validation activities required. The first stage in this process involves demonstrating that the capability has sufficient value to customers to justify the cost needed to transition it and to run it continuously and reliably in operations. Once the overall value is demonstrated, a substantial effort is then required to develop the operational software from the research codes. The next stage is to implement and test the software and product generation on the operational computers. Finally, effort must be devoted to establishing long-term measures of performance, maintaining the software, and working with forecasters, customers, and researchers to improve over time the operational capabilities. This multi-stage process of identifying, transitioning, and improving operational space weather capabilities will be discussed using recent examples. Plans for future activities will also be described.
Developments in Coastal Ocean Modeling
NASA Astrophysics Data System (ADS)
Allen, J. S.
2001-12-01
Capabilities in modeling continental shelf flow fields have improved markedly in the last several years. Progress is being made toward the long term scientific goal of utilizing numerical circulation models to interpolate, or extrapolate, necessarily limited field measurements to provide additional full-field information describing the behavior of, and providing dynamical rationalizations for, complex observed coastal flow. The improvement in modeling capabilities has been due to several factors including an increase in computer power and, importantly, an increase in experience of modelers in formulating relevant numerical experiments and in analyzing model results. We demonstrate present modeling capabilities and limitations by discussion of results from recent studies of shelf circulation off Oregon and northern California (joint work with Newberger, Gan, Oke, Pullen, and Wijesekera). Strong interactions between wind-forced coastal currents and continental shelf topography characterize the flow regimes in these cases. Favorable comparisons of model and measured alongshore currents and other variables provide confidence in the model-produced fields. The dependence of the mesoscale circulation, including upwelling and downwelling fronts and flow instabilities, on the submodel used to parameterize the effects of small scale turbulence, is discussed. Analyses of model results to provide explanations for the observed, but previously unexplained, alongshore variability in the intensity of coastal upwelling, which typically results in colder surface water south of capes, and the observed development in some locations of northward currents near the coast in response to the relaxation of southward winds, are presented.
Modelling the impacts of pests and diseases on agricultural systems.
Donatelli, M; Magarey, R D; Bregaglio, S; Willocquet, L; Whish, J P M; Savary, S
2017-07-01
The improvement and application of pest and disease models to analyse and predict yield losses, including those due to climate change, is still a challenge for the scientific community. Applied modelling of crop diseases and pests has mostly targeted the development of support capabilities to schedule scouting or pesticide applications. There is a need for research to both broaden the scope and evaluate the capabilities of pest and disease models. Key research questions not only involve the assessment of the potential effects of climate change on known pathosystems, but also on new pathogens which could alter the (still incompletely documented) impacts of pests and diseases on agricultural systems. Yield loss data collected in various current environments may no longer represent an adequate reference to develop tactical, decision-oriented models for plant diseases and pests and their impacts, because of the ongoing changes in climate patterns. Process-based agricultural simulation modelling, on the other hand, appears to represent a viable methodology to estimate the impacts of these potential effects. A new generation of tools based on state-of-the-art knowledge and technologies is needed to allow systems analysis including key processes and their dynamics over an appropriate range of environmental variables. This paper offers a brief overview of the current state of development in coupling pest and disease models to crop models, and discusses technical and scientific challenges. We propose a five-stage roadmap to improve the simulation of the impacts caused by plant diseases and pests: i) improve the quality and availability of data for model inputs; ii) improve the quality and availability of data for model evaluation; iii) improve the integration with crop models; iv) improve the processes for model evaluation; and v) develop a community of plant pest and disease modelers.
NASA Technical Reports Server (NTRS)
Mannino, Antonio
2008-01-01
Understanding how the different components of seawater alter the path of incident sunlight through scattering and absorption is essential to using remotely sensed ocean color observations effectively. This is particularly apropos in coastal waters where the different optically significant components (phytoplankton, detrital material, inorganic minerals, etc.) vary widely in concentration, often independently from one another. Inherent Optical Properties (IOPs) form the link between these biogeochemical constituents and the Apparent Optical Properties (AOPs). Understanding this interrelationship is at the heart of successfully carrying out inversions of satellite-measured radiance to biogeochemical properties. While sufficient covariation of seawater constituents in case I waters typically allows empirical algorithms connecting AOPs and biogeochemical parameters to behave well, these empirical algorithms normally do not hold for case II regimes (Carder et al. 2003). Validation in the context of ocean color remote sensing refers to in-situ measurements used to verify or characterize algorithm products or any assumption used as input to an algorithm. In this project, validation capabilities are considered those measurement capabilities, techniques, methods, models, etc. that allow effective validation. Enhancing current validation capabilities by incorporating state-of-the-art IOP measurements and optical models is the purpose of this work. Involved in this pursuit is improving core IOP measurement capabilities (spectral, angular, spatio-temporal resolutions), improving our understanding of the behavior of analytical AOP-IOP approximations in complex coastal waters, and improving the spatial and temporal resolution of biogeochemical data for validation by applying biogeochemical-IOP inversion models so that these parameters can be computed from real-time IOP sensors with high sampling rates. Research cruises supported by this project provide for the collection and processing of seawater samples for biogeochemical (pigments, DOC and POC) and optical (CDOM and POM absorption coefficients) analyses to enhance our understanding of the linkages between in-water optical measurements (IOPs and AOPs) and biogeochemical constituents and to provide a more comprehensive suite of validation products.
Interfacing a General Purpose Fluid Network Flow Program with the SINDA/G Thermal Analysis Program
NASA Technical Reports Server (NTRS)
Schallhorn, Paul; Popok, Daniel
1999-01-01
A general purpose, one dimensional fluid flow code is currently being interfaced with the thermal analysis program Systems Improved Numerical Differencing Analyzer/Gaski (SINDA/G). The flow code, Generalized Fluid System Simulation Program (GFSSP), is capable of analyzing steady state and transient flow in a complex network. The flow code is capable of modeling several physical phenomena including compressibility effects, phase changes, body forces (such as gravity and centrifugal) and mixture thermodynamics for multiple species. The addition of GFSSP to SINDA/G provides a significant improvement in convective heat transfer modeling for SINDA/G. The interface development is conducted in multiple phases. This paper describes the first phase of the interface which allows for steady and quasi-steady (unsteady solid, steady fluid) conjugate heat transfer modeling.
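The quasi-steady coupling described above (unsteady solid, steady fluid) can be illustrated with a toy example that is not the GFSSP/SINDA interface itself: at each thermal time step the fluid temperatures along a pipe are marched from a steady energy balance using the current wall temperatures, and the resulting convective heat load then advances the wall nodes in time.

```python
import numpy as np

# Toy conjugate heat transfer: hot fluid flowing through a cooled pipe wall.
n      = 20          # axial nodes
mdot   = 0.05        # fluid mass flow, kg/s
cp     = 4180.0      # fluid specific heat, J/(kg K)
hA     = 40.0        # convective conductance per node, W/K
C_wall = 500.0       # wall node heat capacity, J/K
UA_env = 5.0         # wall-to-ambient conductance per node, W/K
T_in, T_amb = 360.0, 300.0
dt, nsteps = 1.0, 600

T_wall = np.full(n, T_amb)
for step in range(nsteps):
    # Steady fluid pass: march the energy balance along the flow direction,
    # holding the wall temperatures frozen for this time step.
    T_fluid, T_up = np.empty(n), T_in
    for i in range(n):
        T_fluid[i] = T_wall[i] + (T_up - T_wall[i]) * np.exp(-hA / (mdot * cp))
        T_up = T_fluid[i]
    # Unsteady solid update using the heat given up by the fluid in each node.
    T_prev = np.concatenate(([T_in], T_fluid[:-1]))
    q_to_wall = mdot * cp * (T_prev - T_fluid)          # W, positive into wall
    T_wall += dt * (q_to_wall - UA_env * (T_wall - T_amb)) / C_wall

print("outlet fluid T:", round(T_fluid[-1], 2), "K; wall T range:",
      round(T_wall.min(), 2), "-", round(T_wall.max(), 2), "K")
```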
Demonstrating the improvement of predictive maturity of a computational model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hemez, Francois M; Unal, Cetin; Atamturktur, Huriye S
2010-01-01
We demonstrate an improvement of predictive capability brought to a non-linear material model using a combination of test data, sensitivity analysis, uncertainty quantification, and calibration. A model that captures increasingly complicated phenomena, such as plasticity, temperature and strain rate effects, is analyzed. Predictive maturity is defined, here, as the accuracy of the model to predict multiple Hopkinson bar experiments. A statistical discrepancy quantifies the systematic disagreement (bias) between measurements and predictions. Our hypothesis is that improving the predictive capability of a model should translate into better agreement between measurements and predictions. This agreement, in turn, should lead to a smaller discrepancy. We have recently proposed to use discrepancy and coverage, that is, the extent to which the physical experiments used for calibration populate the regime of applicability of the model, as the basis to define a Predictive Maturity Index (PMI). It was shown that predictive maturity could be improved when additional physical tests are made available to increase coverage of the regime of applicability. This contribution illustrates how the PMI changes as 'better' physics are implemented in the model. The application is the non-linear Preston-Tonks-Wallace (PTW) strength model applied to Beryllium metal. We demonstrate that our framework tracks the evolution of maturity of the PTW model. Robustness of the PMI with respect to the selection of coefficients needed in its definition is also studied.
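The abstract does not give the PMI formula, but a purely illustrative combination of the two ingredients it names (coverage of the regime of applicability and measurement-prediction discrepancy) might look like the following sketch; the weights and functional form are assumptions, not the authors' definition.

```python
def predictive_maturity_index(coverage, discrepancy, disc_scale=1.0, w_cov=0.5):
    """Illustrative (not the authors') maturity index in [0, 1].

    coverage    -- fraction of the regime of applicability populated by
                   calibration/validation experiments (0..1)
    discrepancy -- magnitude of the systematic measurement-prediction bias
    disc_scale  -- discrepancy magnitude considered 'large', for normalization
    """
    agreement = 1.0 / (1.0 + discrepancy / disc_scale)   # -> 1 as bias -> 0
    return w_cov * coverage + (1.0 - w_cov) * agreement

# Adding experiments (coverage up) and better physics (discrepancy down)
print(predictive_maturity_index(coverage=0.3, discrepancy=0.8))   # ~0.43
print(predictive_maturity_index(coverage=0.6, discrepancy=0.3))   # ~0.69
```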
Landscape capability models as a tool to predict fine-scale forest bird occupancy and abundance
Loman, Zachary G.; DeLuca, William; Harrison, Daniel J.; Loftin, Cynthia S.; Rolek, Brian W.; Wood, Petra B.
2018-01-01
Context: Species-specific models of landscape capability (LC) can inform landscape conservation design. Landscape capability is “the ability of the landscape to provide the environment […] and the local resources […] needed for survival and reproduction […] in sufficient quantity, quality and accessibility to meet the life history requirements of individuals and local populations.” Landscape capability incorporates species’ life histories, ecologies, and distributions to model habitat for current and future landscapes and climates as a proactive strategy for conservation planning. Objectives: We tested the ability of a set of LC models to explain variation in point occupancy and abundance for seven bird species representative of spruce-fir, mixed conifer-hardwood, and riparian and wooded wetland macrohabitats. Methods: We compiled point count data sets used for biological inventory, species monitoring, and field studies across the northeastern United States to create an independent validation data set. Our validation explicitly accounted for underestimation in validation data using joint distance and time removal sampling. Results: Blackpoll warbler (Setophaga striata), wood thrush (Hylocichla mustelina), and Louisiana (Parkesia motacilla) and northern waterthrush (P. noveboracensis) models were validated as predicting variation in abundance, although this varied from not biologically meaningful (1%) to strongly meaningful (59%). We verified all seven species models [including ovenbird (Seiurus aurocapilla), blackburnian (Setophaga fusca) and cerulean warbler (Setophaga cerulea)], as all were positively related to occupancy data. Conclusions: LC models represent a useful tool for conservation planning owing to their predictive ability over a regional extent. As improved remote-sensed data become available, LC layers are updated, which will improve predictions.
APPLICATION OF A FULLY DISTRIBUTED WASHOFF AND TRANSPORT MODEL FOR A GULF COAST WATERSHED
Advances in hydrologic modeling have been shown to improve the accuracy of rainfall runoff simulation and prediction. Building on the capabilities of distributed hydrologic modeling, a water quality model was developed to simulate buildup, washoff, and advective transport of a co...
Improving Statewide Freight Routing Capabilities for Sub-National Commodity Flows
DOT National Transportation Integrated Search
2012-10-01
The ability to fully understand and accurately characterize freight vehicle route choices is important in helping to inform regional and state decisions. This project recommends improvements to WSDOT's Statewide Freight GIS Network Model to more ac...
Aviation Safety Program Atmospheric Environment Safety Technologies (AEST) Project
NASA Technical Reports Server (NTRS)
Colantonio, Ron
2011-01-01
Engine Icing Characterization and Simulation Capability: develop knowledge bases, analysis methods, and simulation tools needed to address the problem of engine icing, in particular ice-crystal icing. Airframe Icing Simulation and Engineering Tool Capability: develop and demonstrate 3-D capability to simulate and model airframe ice accretion and related aerodynamic performance degradation for current and future aircraft configurations in an expanded icing environment that includes freezing drizzle/rain. Atmospheric Hazard Sensing and Mitigation Technology Capability: improve and expand remote sensing and mitigation of hazardous atmospheric environments and phenomena.
Gabbay, Robert A.; Friedberg, Mark W.; Miller-Day, Michelle; Cronholm, Peter F.; Adelman, Alan; Schneider, Eric C.
2013-01-01
PURPOSE The medical home has gained national attention as a model to reorganize primary care to improve health outcomes. Pennsylvania has undertaken one of the largest state-based, multipayer medical home pilot projects. We used a positive deviance approach to identify and compare factors driving the care models of practices showing the greatest and least improvement in diabetes care in a sample of 25 primary care practices in southeast Pennsylvania. METHODS We ranked practices into improvement quintiles on the basis of the average absolute percentage point improvement from baseline to 18 months in 3 registry-based measures of performance related to diabetes care: glycated hemoglobin concentration, blood pressure, and low-density lipoprotein cholesterol level. We then conducted surveys and key informant interviews with leaders and staff in the 5 most and least improved practices, and compared their responses. RESULTS The most improved/higher-performing practices tended to have greater structural capabilities (eg, electronic health records) than the least improved/lower-performing practices at baseline. Interviews revealed striking differences between the groups in terms of leadership styles and shared vision; sense, use, and development of teams; processes for monitoring progress and obtaining feedback; and presence of technologic and financial distractions. CONCLUSIONS Positive deviance analysis suggests that primary care practices’ baseline structural capabilities and abilities to buffer the stresses of change may be key facilitators of performance improvement in medical home transformations. Attention to the practices’ structural capabilities and factors shaping successful change, especially early in the process, will be necessary to improve the likelihood of successful medical home transformation and better care. PMID:23690393
NASA Technical Reports Server (NTRS)
Arnold, William R.
2015-01-01
Since last year, a number of expanded capabilities have been added to the modeler. To support the integration with thermal modeling, the program can now produce simplified thermal models with the same geometric parameters as the more detailed dynamic and even more refined stress models. The local mesh refinement and mesh improvement tools have been expanded and made more user-friendly. The goal is to provide a means of evaluating both monolithic and segmented mirrors to the same level of fidelity and loading conditions at reasonable man-power effort. The paper will demonstrate most of these new capabilities.
Observational breakthroughs lead the way to improved hydrological predictions
NASA Astrophysics Data System (ADS)
Lettenmaier, Dennis P.
2017-04-01
New data sources are revolutionizing the hydrological sciences. The capabilities of hydrological models have advanced greatly over the last several decades, but until recently model capabilities have outstripped the spatial resolution and accuracy of model forcings (atmospheric variables at the land surface) and the hydrologic state variables (e.g., soil moisture; snow water equivalent) that the models predict. This has begun to change, as shown in two examples here: soil moisture and drought evolution over Africa as predicted by a hydrology model forced with satellite-derived precipitation, and observations of snow water equivalent at very high resolution over a river basin in California's Sierra Nevada.
NASA Astrophysics Data System (ADS)
Turinsky, Paul J.; Kothe, Douglas B.
2016-05-01
The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. To accomplish this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics “core simulator” based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL's evolving M&S capabilities, which is in progress, will assist in addressing long-standing and future operational and safety challenges of the nuclear industry.
Revised Reynolds Stress and Triple Product Models
NASA Technical Reports Server (NTRS)
Olsen, Michael E.; Lillard, Randolph P.
2017-01-01
Revised versions of Lag methodology Reynolds-stress and triple product models are applied to accepted test cases to assess the improvement, or lack thereof, in the prediction capability of the models. The Bachalo-Johnson bump flow is shown as an example for this abstract submission.
EPA's Models-3 CMAQ system is intended to provide a community modeling paradigm that allows continuous improvement of the one-atmosphere modeling capability in a unified fashion. CMAQ's modular design promotes incorporation of several sets of science process modules representing ...
Biosecurity through Public Health System Design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beyeler, Walter E.; Finley, Patrick D.; Arndt, William
We applied modeling and simulation to examine the real-world tradeoffs between developing-country public-health improvement and the need to improve the identification, tracking, and security of agents with bio-weapons potential. Traditionally, the international community has applied facility-focused strategies for improving biosecurity and biosafety. This work examines how system-level assessments and improvements can foster biosecurity and biosafety. We modeled medical laboratory resources and capabilities to identify scenarios where biosurveillance goals are transparently aligned with public health needs and resources are distributed in a way that maximizes their ability to serve patients while minimizing security and safety risks. Our modeling platform simulates key processes involved in healthcare system operation, such as sample collection, transport, and analysis at medical laboratories. The research reported here extends the prior art by providing two key components for comparative performance assessment: a model of patient interaction dynamics, and the capability to perform uncertainty quantification. In addition, we have outlined a process for incorporating quantitative biosecurity and biosafety risk measures. Two test problems were used to exercise these research products and examine (a) systemic effects of technological innovation and (b) right-sizing of laboratory networks.
NASA Technical Reports Server (NTRS)
Chan, David T.; Balakrishna, Sundareswara; Walker, Eric L.; Goodliff, Scott L.
2015-01-01
Recent data quality improvements at the National Transonic Facility have an intended goal of reducing the Mach number variation in a data point to within plus or minus 0.0005, with the ultimate goal of reducing the data repeatability of the drag coefficient for full-span subsonic transport models at transonic speeds to within half a drag count. This paper will discuss the Mach stability improvements achieved through the use of an existing second throat capability at the NTF to create a minimum area at the end of the test section. These improvements were demonstrated using both the NASA Common Research Model and the NTF Pathfinder-I model in recent experiments. Sonic conditions at the throat were verified using sidewall static pressure data. The Mach variation levels from both experiments in the baseline tunnel configuration and the choked tunnel configuration will be presented and the correlation between Mach number and drag will also be examined. Finally, a brief discussion is given on the consequences of using the second throat in its location at the end of the test section.
NASA Technical Reports Server (NTRS)
Chan, David T.
2015-01-01
Recent data quality improvements at the National Transonic Facility (NTF) have an intended goal of reducing the Mach number variation in a data point to within plus or minus 0.0005, with the ultimate goal of reducing the data repeatability of the drag coefficient for full-span subsonic transport models at transonic speeds to within half of a drag count. This paper will discuss the Mach stability improvements achieved through the use of an existing second throat capability at the NTF to create a minimum area at the end of the test section. These improvements were demonstrated using both the NASA Common Research Model and the NTF Pathfinder-I model in recent experiments. Sonic conditions at the throat were verified using sidewall static pressure data. The Mach variation levels from both experiments in the baseline tunnel configuration and the choked tunnel configuration will be presented. Finally, a brief discussion is given on the consequences of using the second throat in its location at the end of the test section.
Progress in and prospects for fluvial flood modelling.
Wheater, H S
2002-07-15
Recent floods in the UK have raised public and political awareness of flood risk. There is an increasing recognition that flood management and land-use planning are linked, and that decision-support modelling tools are required to address issues of climate and land-use change for integrated catchment management. In this paper, the scientific context for fluvial flood modelling is discussed, current modelling capability is considered and research challenges are identified. Priorities include (i) appropriate representation of spatial precipitation, including scenarios of climate change; (ii) development of a national capability for continuous hydrological simulation of ungauged catchments; (iii) improved scientific understanding of impacts of agricultural land-use and land-management change, and the development of new modelling approaches to represent those impacts; (iv) improved representation of urban flooding, at both local and catchment scale; (v) appropriate parametrizations for hydraulic simulation of in-channel and flood-plain flows, assimilating available ground observations and remotely sensed data; and (vi) a flexible decision-support modelling framework, incorporating developments in computing, data availability, data assimilation and uncertainty analysis.
NASA Astrophysics Data System (ADS)
Johns, Jesse M.; Burkes, Douglas
2017-07-01
In this work, a multilayered perceptron (MLP) network is used to develop predictive isothermal time-temperature-transformation (TTT) models covering a range of U-Mo binary and ternary alloys. The selected ternary alloys for model development are U-Mo-Ru, U-Mo-Nb, U-Mo-Zr, U-Mo-Cr, and U-Mo-Re. These models' ability to predict 'novel' U-Mo alloys is demonstrated quite well despite the discrepancies between literature sources for similar alloys, which likely arise from different thermal-mechanical processing conditions. These models are developed with the primary purpose of informing experimental decisions. Additional experimental insight is necessary in order to reduce the number of experiments required to isolate ideal alloys. These models allow test planners to evaluate areas of experimental interest; once initial tests are conducted, the model can be updated to further improve follow-on testing decisions. The model also improves analysis capabilities by reducing the number of data points necessary from any particular test. For example, if one or two isotherms are measured during a test, the model can construct the rest of the TTT curve over a wide range of temperature and time. This modeling capability reduces the cost of experiments while also improving the value of the results from the tests. The reduced costs could result in improved material characterization and therefore improved fundamental understanding of TTT dynamics. As additional understanding of phenomena driving TTTs is acquired, this type of MLP model can be used to populate unknowns (such as material impurity and other thermal-mechanical properties) from past literature sources.
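A minimal sketch of this style of MLP regression (the feature set, alloy values, and transformation times below are fabricated placeholders, not the paper's dataset): composition and temperature go in, the logarithm of the time to transformation onset comes out, and a full isothermal TTT estimate can then be reconstructed by sweeping temperature.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Placeholder training data: [wt% Mo, wt% ternary addition, temperature (C)]
X = np.array([[10, 0, 450], [10, 0, 500], [7, 2, 450],
              [7, 2, 525], [12, 1, 475], [8, 3, 500]], dtype=float)
# Placeholder targets: log10 of time (hours) to the onset of transformation
y = np.log10([2.0, 0.8, 5.0, 1.2, 3.5, 1.8])

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 16), solver="lbfgs",
                 max_iter=5000, random_state=0),
)
model.fit(X, y)

# Reconstruct an isothermal TTT estimate for a hypothetical U-8Mo-2X alloy
temps = np.arange(425, 551, 25)
query = np.column_stack([np.full_like(temps, 8.0, dtype=float),
                         np.full_like(temps, 2.0, dtype=float), temps])
for T, log_t in zip(temps, model.predict(query)):
    print(f"{T} C: ~{10 ** log_t:.1f} h to transformation onset")
```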
Snow on Sea Ice Workshop - An Icy Meeting of the Minds: Modelers and Measurers
2015-09-30
The purpose of the workshop was to promote more seamless and better integration between measurements and modeling of snow on sea ice, thereby improving our predictive...capabilities for sea ice. OBJECTIVES: The key objective was to improve the ability of modelers and measurers to work together closely. To that end, we
NASA Astrophysics Data System (ADS)
Gunawan, D.; Amalia, A.; Rahmat, R. F.; Muchtar, M. A.; Siregar, I.
2018-02-01
Identification of the software maturity level is a technique to determine the quality of the software. By identifying the software maturity level, the weaknesses of the software can be observed, and the resulting recommendations can serve as a reference for future software maintenance and development. This paper discusses software Capability Level (CL) with a case study on the Quality Management Unit (Unit Manajemen Mutu) of the University of Sumatera Utara (UMM-USU). This research utilized the Standard CMMI Appraisal Method for Process Improvement class C (SCAMPI C) model with continuous representation. This model focuses on activities for developing quality products and services. The observation covers three process areas: Project Planning (PP), Project Monitoring and Control (PMC), and Requirements Management (REQM). The measurement of the software capability level for the UMM-USU software shows that the capability levels for the observed process areas fall in the range of CL1 to CL2. Project Planning (PP) is the only process area that reaches capability level 2, while PMC and REQM remain at CL1, the performed level. This research reveals several weaknesses of the existing UMM-USU software and therefore proposes several recommendations for UMM-USU to improve the capability level of the observed process areas.
Additions and improvements to the high energy density physics capabilities in the FLASH code
NASA Astrophysics Data System (ADS)
Lamb, D.; Bogale, A.; Feister, S.; Flocke, N.; Graziani, C.; Khiar, B.; Laune, J.; Tzeferacos, P.; Walker, C.; Weide, K.
2017-10-01
FLASH is an open-source, finite-volume Eulerian, spatially-adaptive radiation magnetohydrodynamics code that has the capability to treat a broad range of physical processes. FLASH performs well on a wide range of computer architectures, and has a broad user base. Extensive high energy density physics (HEDP) capabilities exist in FLASH, which make it a powerful open toolset for the academic HEDP community. We summarize these capabilities, emphasizing recent additions and improvements. We describe several non-ideal MHD capabilities that are being added to FLASH, including the Hall and Nernst effects, implicit resistivity, and a circuit model, which will allow modeling of Z-pinch experiments. We showcase the ability of FLASH to simulate Thomson scattering polarimetry, which measures Faraday rotation due to the presence of magnetic fields, as well as proton radiography, proton self-emission, and Thomson scattering diagnostics. Finally, we describe several collaborations with the academic HEDP community in which FLASH simulations were used to design and interpret HEDP experiments. This work was supported in part at U. Chicago by DOE NNSA ASC through the Argonne Institute for Computing in Science under FWP 57789; DOE NNSA under NLUF Grant DE-NA0002724; DOE SC OFES Grant DE-SC0016566; and NSF Grant PHY-1619573.
Petri net modeling of high-order genetic systems using grammatical evolution.
Moore, Jason H; Hahn, Lance W
2003-11-01
Understanding how DNA sequence variations impact human health through a hierarchy of biochemical and physiological systems is expected to improve the diagnosis, prevention, and treatment of common, complex human diseases. We have previously developed a hierarchical dynamic systems approach based on Petri nets for generating biochemical network models that are consistent with genetic models of disease susceptibility. This modeling approach uses an evolutionary computation approach called grammatical evolution as a search strategy for optimal Petri net models. We have previously demonstrated that this approach routinely identifies biochemical network models that are consistent with a variety of genetic models in which disease susceptibility is determined by nonlinear interactions between two DNA sequence variations. In the present study, we evaluate whether the Petri net approach is capable of identifying biochemical networks that are consistent with disease susceptibility due to higher order nonlinear interactions between three DNA sequence variations. The results indicate that our model-building approach is capable of routinely identifying good, but not perfect, Petri net models. Ideas for improving the algorithm for this high-dimensional problem are presented.
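For readers unfamiliar with the formalism, a minimal Petri net simulator (generic, and entirely separate from the paper's grammatical-evolution machinery) needs only places with token counts and transitions with input/output arc weights; a transition fires when every input place holds enough tokens.

```python
import random

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)        # place -> token count
        self.transitions = []               # list of (inputs, outputs) dicts

    def add_transition(self, inputs, outputs):
        self.transitions.append((inputs, outputs))

    def enabled(self):
        return [t for t in self.transitions
                if all(self.marking[p] >= w for p, w in t[0].items())]

    def step(self):
        choices = self.enabled()
        if not choices:
            return False
        inputs, outputs = random.choice(choices)   # fire one enabled transition
        for p, w in inputs.items():
            self.marking[p] -= w
        for p, w in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + w
        return True

# Toy biochemical network: substrate S converted to product P via complex C
net = PetriNet({"S": 5, "E": 1, "C": 0, "P": 0})
net.add_transition({"S": 1, "E": 1}, {"C": 1})     # enzyme-substrate binding
net.add_transition({"C": 1}, {"E": 1, "P": 1})     # catalysis and release
while net.step():
    pass
print(net.marking)   # all substrate eventually converted to product
```

In the paper's approach, it is the net structure itself (places, arcs, weights) that is evolved so the resulting dynamics reproduce the penetrance patterns of a genetic model; the firing rule above is the fixed part.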
A review of methods for predicting air pollution dispersion
NASA Technical Reports Server (NTRS)
Mathis, J. J., Jr.; Grose, W. L.
1973-01-01
Air pollution modeling and problem areas in air pollution dispersion modeling were surveyed. Emission source inventory, meteorological data, and turbulent diffusion are discussed in terms of developing a dispersion model. Existing mathematical models of urban air pollution, as well as highway and airport models, are discussed along with their limitations. Recommendations for improving modeling capabilities are included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritchie, L.T.; Alpert, D.J.; Burke, R.P.
1984-03-01
The CRAC2 computer code is a revised version of CRAC (Calculation of Reactor Accident Consequences) which was developed for the Reactor Safety Study. This document provides an overview of the CRAC2 code and a description of each of the models used. Significant improvements incorporated into CRAC2 include an improved weather sequence sampling technique, a new evacuation model, and new output capabilities. In addition, refinements have been made to the atmospheric transport and deposition model. Details of the modeling differences between CRAC2 and CRAC are emphasized in the model descriptions.
Structural capabilities in small and medium-sized patient-centered medical homes.
Alidina, Shehnaz; Schneider, Eric C; Singer, Sara J; Rosenthal, Meredith B
2014-07-01
1) Evaluate structural capabilities associated with the patient-centered medical home (PCMH) model in PCMH pilots in Colorado, Ohio, and Rhode Island; 2) evaluate changes in capabilities over 2 years in the Rhode Island pilot; and 3) evaluate facilitators and barriers to the adoption of capabilities. We assessed structural capabilities in the 30 pilot practices using a cross-sectional study design and examined changes over 2 years in 5 Rhode Island practices using a pre/post design. We used National Committee for Quality Assurance's Physician Practice Connections-Patient-Centered Medical Home (PPC/PCMH) accreditation survey data to measure capabilities. We stratified by high and low performance based on total score and by practice size. We analyzed change from baseline to 24 months for the Rhode Island practices. We analyzed qualitative data from interviews with practice leaders to identify facilitators and barriers to building capabilities. On average, practices scored 73 points (out of 100 points) for structural capabilities. High and low performers differed most on electronic prescribing, patient self-management, and care-management standards. Rhode Island practices averaged 42 points at baseline, and reached 90 points by the end of year 2. Some of the key facilitators that emerged were payment incentives, "transformation coaches," learning collaboratives, and data availability supporting performance management and quality improvement. Barriers to improvement included the extent of transformation required, technology shortcomings, slow cultural change, change fatigue, and lack of broader payment reform. For these early adopters, prevalence of structural capabilities was high, and performance was substantially improved for practices with initially lower capabilities. We conclude that building capabilities requires payment reform, attention to implementation, and cultural change.
Equivalent plate modeling for conceptual design of aircraft wing structures
NASA Technical Reports Server (NTRS)
Giles, Gary L.
1995-01-01
This paper describes an analysis method that generates conceptual-level design data for aircraft wing structures. A key requirement is that this data must be produced in a timely manner so that it can be used effectively by multidisciplinary synthesis codes for performing systems studies. Such a capability is being developed by enhancing an equivalent plate structural analysis computer code to provide a more comprehensive, robust and user-friendly analysis tool. The paper focuses on recent enhancements to the Equivalent Laminated Plate Solution (ELAPS) analysis code that significantly expands the modeling capability and improves the accuracy of results. Modeling additions include use of out-of-plane plate segments for representing winglets and advanced wing concepts such as C-wings along with a new capability for modeling the internal rib and spar structure. The accuracy of calculated results is improved by including transverse shear effects in the formulation and by using multiple sets of assumed displacement functions in the analysis. Typical results are presented to demonstrate these new features. Example configurations include a C-wing transport aircraft, a representative fighter wing and a blended-wing-body transport. These applications are intended to demonstrate and quantify the benefits of using equivalent plate modeling of wing structures during conceptual design.
NASA Astrophysics Data System (ADS)
Danner, Travis W.
Developing technology systems requires all manner of investment---engineering talent, prototypes, test facilities, and more. Even for simple design problems the investment can be substantial; for complex technology systems, the development costs can be staggering. The profitability of a corporation in a technology-driven industry is crucially dependent on maximizing the effectiveness of research and development investment. Decision-makers charged with allocation of this investment are forced to choose between the further evolution of existing technologies and the pursuit of revolutionary technologies. At risk on the one hand is excessive investment in an evolutionary technology which has only limited availability for further improvement. On the other hand, the pursuit of a revolutionary technology may mean abandoning momentum and the potential for substantial evolutionary improvement resulting from the years of accumulated knowledge. The informed answer to this question, evolutionary or revolutionary, requires knowledge of the expected rate of improvement and the potential a technology offers for further improvement. This research is dedicated to formulating the assessment and forecasting tools necessary to acquire this knowledge. The same physical laws and principles that enable the development and improvement of specific technologies also limit the ultimate capability of those technologies. Researchers have long used this concept as the foundation for modeling technological advancement through extrapolation by analogy to biological growth models. These models are employed to depict technology development as it asymptotically approaches limits established by the fundamental principles on which the technological approach is based. This has proven an effective and accurate approach to modeling and forecasting simple single-attribute technologies. With increased system complexity and the introduction of multiple system objectives, however, the usefulness of this modeling technique begins to diminish. With the introduction of multiple objectives, researchers often abandon technology growth models for scoring models and technology frontiers. While both approaches possess advantages over current growth models for the assessment of multi-objective technologies, each lacks a necessary dimension for comprehensive technology assessment. By collapsing multiple system metrics into a single, non-intuitive technology measure, scoring models provide a succinct framework for multi-objective technology assessment and forecasting. Yet, with no consideration of physical limits, scoring models provide no insight as to the feasibility of a particular combination of system capabilities. They only indicate that a given combination of system capabilities yields a particular score. Conversely, technology frontiers are constructed with the distinct objective of providing insight into the feasibility of system capability combinations. Yet again, upper limits to overall system performance are ignored. Furthermore, the data required to forecast subsequent technology frontiers is often inhibitive. In an attempt to reincorporate the fundamental nature of technology advancement as bound by physical principles, researchers have sought to normalize multi-objective systems whereby the variability of a single system objective is eliminated as a result of changes in the remaining objectives. 
This drastically limits the applicability of the resulting technology model because it is only applicable for a single setting of all other system attributes. Attempts to maintain the interaction between the growth curves of each technical objective of a complex system have thus far been limited to qualitative and subjective consideration. This research proposes the formulation of multidimensional growth models as an approach to simulating the advancement of multi-objective technologies towards their upper limits. Multidimensional growth models were formulated by noticing and exploiting the correlation between technology growth models and technology frontiers. Both are frontiers in actuality. The technology growth curve is a frontier between capability levels of a single attribute and time, while a technology frontier is a frontier between the capability levels of two or more attributes. Multidimensional growth models are formulated by exploiting the mathematical significance of this correlation. The result is a model that can capture both the interaction between multiple system attributes and their expected rates of improvement over time. The fundamental nature of technology development is maintained, and interdependent growth curves are generated for each system metric with minimal data requirements. Being founded on the basic nature of technology advancement, relative to physical limits, the availability for further improvement can be determined for a single metric relative to other system measures of merit. A by-product of this modeling approach is a single n-dimensional technology frontier linking all n system attributes with time. This provides an environment capable of forecasting future system capability in the form of advancing technology frontiers. The ability of a multidimensional growth model to capture the expected improvement of a specific technological approach is dependent on accurately identifying the physical limitations to each pertinent attribute. This research investigates two potential approaches to identifying those physical limits, a physics-based approach and a regression-based approach. The regression-based approach has found limited acceptance among forecasters, although it does show potential for estimating upper limits with a specified degree of uncertainty. Forecasters have long favored physics-based approaches for establishing the upper limit to unidimensional growth models. The task of accurately identifying upper limits has become increasingly difficult with the extension of growth models into multiple dimensions. A lone researcher may be able to identify the physical limitation to a single attribute of a simple system; however, as system complexity and the number of attributes increases, the attention of researchers from multiple fields of study is required. Thus, limit identification is itself an area of research and development requiring some level of investment. Whether estimated by physics or regression-based approaches, predicted limits will always have some degree of uncertainty. This research takes the approach of quantifying the impact of that uncertainty on model forecasts rather than heavily endorsing a single technique to limit identification. In addition to formulating the multidimensional growth model, this research provides a systematic procedure for applying that model to specific technology architectures. 
Researchers and decision-makers are able to investigate the potential for additional improvement within that technology architecture and to estimate the expected cost of each incremental improvement relative to the cost of past improvements. In this manner, multidimensional growth models provide the necessary information to set reasonable program goals for the further evolution of a particular technological approach or to establish the need for revolutionary approaches in light of the constraining limits of conventional approaches.
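A single-attribute version of the growth-model idea described above can be sketched as a logistic curve approaching an explicit upper limit L; the capability history below is fabricated purely to show the mechanics, and in the multidimensional formulation L and the curve for each attribute would be coupled to the other attributes rather than treated independently.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, k, t0):
    """Technology capability approaching physical limit L over time."""
    return L / (1.0 + np.exp(-k * (t - t0)))

# Fabricated capability history for a single attribute (year, capability)
years = np.array([1990, 1994, 1998, 2002, 2006, 2010, 2014, 2018], dtype=float)
capability = np.array([1.2, 2.0, 3.4, 5.5, 7.8, 9.4, 10.4, 10.9])

# Case 1: physics-based limit supplied by the analyst (L fixed at 12.0)
(k, t0), _ = curve_fit(lambda t, k, t0: logistic(t, 12.0, k, t0),
                       years, capability, p0=[0.2, 2000.0])
# Case 2: regression-based limit (L estimated, with its own uncertainty)
(L_fit, k2, t02), cov = curve_fit(logistic, years, capability,
                                  p0=[12.0, 0.2, 2000.0])

remaining = 12.0 - logistic(2018.0, 12.0, k, t0)
print(f"physics limit: {remaining:.2f} units of improvement still available")
print(f"regression limit estimate: L = {L_fit:.1f} +/- {np.sqrt(cov[0, 0]):.1f}")
```

The gap between the current capability and the fitted limit is exactly the quantity a decision-maker needs when weighing further evolutionary investment against a switch to a revolutionary approach.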
Improvements and validation of the erythropoiesis control model for bed rest simulation
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1977-01-01
The most significant improvement in the model is the explicit formulation of separate elements representing erythropoietin production and red cell production. Other modifications include bone marrow time-delays, capability to shift oxyhemoglobin affinity and an algorithm for entering experimental data as time-varying driving functions. An area of model development is suggested by applying the model to simulating onset, diagnosis and treatment of a hematologic disorder. Recommendations for further improvements in the model and suggestions for experimental application are also discussed. A detailed analysis of the hematologic response to bed rest including simulation of the recent Baylor Medical College bed rest studies is also presented.
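To make the feedback structure concrete, here is a toy version, not the model described in the report, of the separated elements the abstract names: erythropoietin production driven by a tissue-oxygen imbalance, and red cell production that follows erythropoietin after a bone marrow time delay. All parameter values and the bed-rest stimulus are illustrative assumptions.

```python
import numpy as np

dt, days = 0.1, 120.0                  # time step and simulation length (days)
n = int(days / dt)
lag = int(4.0 / dt)                    # assumed bone marrow maturation delay

tau_epo = 0.3                          # erythropoietin turnover time, days
tau_rbc = 120.0                        # red cell lifespan, days
gain = 25.0                            # assumed EPO sensitivity to O2 imbalance

H = np.ones(n)                         # normalized red cell mass
E = np.ones(n)                         # normalized erythropoietin level

for i in range(1, n):
    t = i * dt
    H_set = 0.9 if t > 20 else 1.0     # crude stand-in for a bed-rest stimulus
    # EPO production rises/falls with the oxygen imbalance, cleared first-order
    E_target = max(1.0 + gain * (H_set - H[i - 1]), 0.0)
    E[i] = E[i - 1] + dt * (E_target - E[i - 1]) / tau_epo
    # Red cell mass: delayed EPO-driven production minus senescence
    E_delayed = E[max(i - lag, 0)]
    H[i] = H[i - 1] + dt * (E_delayed - H[i - 1]) / tau_rbc

print("normalized red cell mass after simulated bed rest:", round(H[-1], 3))
```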
Elysee, Gerald; Herrin, Jeph; Horwitz, Leora I
2017-10-01
Stagnation in hospitals' adoption of data integration functionalities coupled with reduction in the number of operational health information exchanges could become a significant impediment to hospitals' adoption of 3 critical capabilities: electronic health information exchange, interoperability, and medication reconciliation, in which electronic systems are used to assist with resolving medication discrepancies and improving patient safety. Against this backdrop, we assessed the relationships between the 3 capabilities. We conducted an observational study applying the partial least squares-structural equation modeling technique to 27 variables obtained from the 2013 American Hospital Association annual survey Information Technology (IT) supplement, which describes health IT capabilities. We included 1330 hospitals. In confirmatory factor analysis, out of the 27 variables, 15 achieved loading values greater than 0.548 at P < .001, and as such were validated as the building blocks of the 3 capabilities. Subsequent path analysis showed a significant, positive, and cyclic relationship between the capabilities, in that decreases in the hospitals' adoption of one would lead to decreases in the adoption of the others. These results show that capability for high quality medication reconciliation may be impeded by lagging adoption of interoperability and health information exchange capabilities. Policies focused on improving one or more of these capabilities may have ancillary benefits.
Development of a Rolling Process Design Tool for Use in Improving Hot Roll Slab Recovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Couch, R; Becker, R; Rhee, M
2004-09-24
Lawrence Livermore National Laboratory participated in a U.S. Department of Energy/Office of Industrial Technology sponsored research project, 'Development of a Rolling Process Design Tool for Use in Improving Hot Roll Slab Recovery', as a Cooperative Agreement TC-02028 with the Alcoa Technical Center (ATC). The objective of the joint project with Alcoa is to develop a numerical modeling capability to optimize the hot rolling process used to produce aluminum plate. Product lost in the rolling process and subsequently recycled wastes the resources consumed in the energy-intensive steps of remelting and reprocessing the ingot. The modeling capability developed by the project partners will be used to produce plate more efficiently and with reduced product loss.
NASA Technical Reports Server (NTRS)
Noll, Thomas E.
1990-01-01
The paper describes recent accomplishments and current research projects along four main thrusts in aeroservoelasticity at NASA Langley. One activity focuses on enhancing the modeling and analysis procedures to accurately predict aeroservoelastic interactions. Improvements to the minimum-state method of approximating unsteady aerodynamics are shown to provide precise low-order models for design and simulation tasks. Recent extensions in aerodynamic correction-factor methodology are also described. With respect to analysis procedures, the paper reviews novel enhancements to matched filter theory and random process theory for predicting the critical gust profile and the associated time-correlated gust loads for structural design considerations. Two research projects leading towards improved design capability are also summarized: (1) an integrated structure/control design capability and (2) procedures for obtaining low-order robust digital control laws for aeroelastic applications.
Comprehensive Micromechanics-Analysis Code - Version 4.0
NASA Technical Reports Server (NTRS)
Arnold, S. M.; Bednarcyk, B. A.
2005-01-01
Version 4.0 of the Micromechanics Analysis Code With Generalized Method of Cells (MAC/GMC) has been developed as an improved means of computational simulation of advanced composite materials. The previous version of MAC/GMC was described in "Comprehensive Micromechanics-Analysis Code" (LEW-16870), NASA Tech Briefs, Vol. 24, No. 6 (June 2000), page 38. To recapitulate: MAC/GMC is a computer program that predicts the elastic and inelastic thermomechanical responses of continuous and discontinuous composite materials with arbitrary internal microstructures and reinforcement shapes. The predictive capability of MAC/GMC rests on a model known as the generalized method of cells (GMC) - a continuum-based model of micromechanics that provides closed-form expressions for the macroscopic response of a composite material in terms of the properties, sizes, shapes, and responses of the individual constituents or phases that make up the material. Enhancements in version 4.0 include a capability for modeling thermomechanically and electromagnetically coupled ("smart") materials; a more-accurate (high-fidelity) version of the GMC; a capability to simulate discontinuous plies within a laminate; additional constitutive models of materials; expanded yield-surface-analysis capabilities; and expanded failure-analysis and life-prediction capabilities on both the microscopic and macroscopic scales.
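The closed-form expressions provided by GMC are far richer than what can be shown here, but as a reduced illustration of closed-form micromechanics, the sketch below computes Voigt and Reuss bounds on a composite's axial stiffness from assumed constituent properties; this is a generic textbook estimate, not the MAC/GMC formulation.

```python
# Minimal micromechanics sketch (Voigt/Reuss bounds, *not* the generalized
# method of cells): closed-form estimates of a composite's axial stiffness
# from constituent properties and fiber volume fraction (values assumed).
E_fiber, E_matrix = 230.0, 3.5   # GPa, hypothetical carbon fiber / epoxy
v_f = 0.6                        # fiber volume fraction

E_voigt = v_f * E_fiber + (1 - v_f) * E_matrix          # iso-strain (upper bound)
E_reuss = 1.0 / (v_f / E_fiber + (1 - v_f) / E_matrix)  # iso-stress (lower bound)
print(f"Axial modulus bounds: {E_reuss:.1f} - {E_voigt:.1f} GPa")
```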
Web-based applications for building, managing and analysing kinetic models of biological systems.
Lee, Dong-Yup; Saha, Rajib; Yusufi, Faraaz Noor Khan; Park, Wonjun; Karimi, Iftekhar A
2009-01-01
Mathematical modelling and computational analysis play an essential role in improving our capability to elucidate the functions and characteristics of complex biological systems such as metabolic, regulatory and cell signalling pathways. The modelling and concomitant simulation render it possible to predict the cellular behaviour of systems under various genetically and/or environmentally perturbed conditions. This motivates systems biologists/bioengineers/bioinformaticians to develop new tools and applications, allowing non-experts to easily conduct such modelling and analysis. However, among a multitude of systems biology tools developed to date, only a handful of projects have adopted a web-based approach to kinetic modelling. In this report, we evaluate the capabilities and characteristics of current web-based tools in systems biology and identify desirable features, limitations and bottlenecks for further improvements in terms of usability and functionality. A short discussion on software architecture issues involved in web-based applications and the approaches taken by existing tools is included for those interested in developing their own simulation applications.
Vertical resolution of baroclinic modes in global ocean models
NASA Astrophysics Data System (ADS)
Stewart, K. D.; Hogg, A. McC.; Griffies, S. M.; Heerdegen, A. P.; Ward, M. L.; Spence, P.; England, M. H.
2017-05-01
Improvements in the horizontal resolution of global ocean models, motivated by the horizontal resolution requirements for specific flow features, have advanced modelling capabilities into the dynamical regime dominated by mesoscale variability. In contrast, the vertical grid remains a subjective choice, and it is not clear that efforts to improve vertical resolution adequately support their horizontal counterparts. Indeed, considering that the bulk of the vertical ocean dynamics (including convection) are parameterized, it is not immediately obvious what the vertical grid is supposed to resolve. Here, we propose that the primary purpose of the vertical grid in a hydrostatic ocean model is to resolve the vertical structure of horizontal flows, rather than to resolve vertical motion. With this principle we construct vertical grids based on their abilities to represent baroclinic modal structures commensurate with the theoretical capabilities of a given horizontal grid. This approach is designed to ensure that the vertical grids of global ocean models complement (and, importantly, do not undermine) the resolution capabilities of the horizontal grid. We find that for z-coordinate global ocean models, at least 50 well-positioned vertical levels are required to resolve the first baroclinic mode, with an additional 25 levels per subsequent mode. High-resolution ocean-sea ice simulations are used to illustrate some of the dynamical enhancements gained by improving the vertical resolution of a 1/10° global ocean model. These enhancements include substantial increases in the sea surface height variance (∼30% increase south of 40°S), the barotropic and baroclinic eddy kinetic energies (up to 200% increase on and surrounding the Antarctic continental shelf and slopes), and the overturning streamfunction in potential density space (near-tripling of the Antarctic Bottom Water cell at 65°S).
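To make the notion of resolving the first few baroclinic modes concrete, the sketch below solves the standard flat-bottom, rigid-lid vertical-mode eigenproblem for an assumed constant stratification; the depth, buoyancy frequency, and grid are illustrative choices, not values from the paper.

```python
# Sketch (assumed setup): discrete vertical-mode eigenproblem
#   -d^2 w / dz^2 = (N^2 / c_n^2) w,   w(0) = w(-H) = 0,
# solved by finite differences for constant stratification N to estimate the
# gravity-wave speeds c_n of the first few baroclinic modes.
import numpy as np
from scipy.linalg import eigh_tridiagonal

H = 4000.0      # ocean depth [m] (assumed)
N = 2.0e-3      # buoyancy frequency [1/s] (assumed constant)
nz = 200        # number of interior grid points
dz = H / (nz + 1)

diag = np.full(nz, 2.0 / dz**2)
off = np.full(nz - 1, -1.0 / dz**2)
mu, modes = eigh_tridiagonal(diag, off)   # eigenvalues mu_n ~ (n*pi/H)^2

c = N / np.sqrt(mu[:3])                   # baroclinic mode speeds c_1..c_3
print("c_1..c_3 [m/s]:", np.round(c, 2))  # analytic N*H/(n*pi): 2.55, 1.27, 0.85
```

The columns of `modes` give the discrete vertical structures whose representation the paper's level-count guidance (about 50 levels for the first mode, plus roughly 25 per additional mode) is intended to guarantee.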
NASA Technical Reports Server (NTRS)
Cole, Stanley R.; Johnson, R. Keith; Piatak, David J.; Florance, Jennifer P.; Rivera, Jose A., Jr.
2003-01-01
The Langley Transonic Dynamics Tunnel (TDT) has provided a unique capability for aeroelastic testing for over forty years. The facility has a rich history of significant contributions to the design of many United States commercial transports, military aircraft, launch vehicles, and spacecraft. The facility has many features that contribute to its uniqueness for aeroelasticity testing, perhaps the most important feature being the use of a heavy gas test medium to achieve higher test densities compared to testing in air. Higher test medium densities substantially ease model-building requirements and therefore simplify the fabrication process for building aeroelastically scaled wind tunnel models. This paper describes TDT capabilities that make it particularly suited for aeroelasticity testing. The paper also discusses the nature of recent test activities in the TDT, including summaries of several specific tests. Finally, the paper documents recent facility improvement projects and the continuous statistical quality assessment effort for the TDT.
2015-08-27
and 2) preparing for the post-MODIS/MISR era using the Geostationary Operational Environmental Satellite (GOES). 3. Improve model representations of...meteorological property retrievals. In this study, using collocated data from Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and Geostationary
A CONSISTENT APPROACH FOR THE APPLICATION OF PHARMACOKINETIC MODELING IN CANCER RISK ASSESSMENT
Physiologically based pharmacokinetic (PBPK) modeling provides important capabilities for improving the reliability of the extrapolations across dose, species, and exposure route that are generally required in chemical risk assessment regardless of the toxic endpoint being consid...
Weather Research and Forecasting Model with Vertical Nesting Capability
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-08-01
The Weather Research and Forecasting (WRF) model with vertical nesting capability is an extension of the WRF model, which is available in the public domain from www.wrf-model.org. The new code modifies the nesting procedure, which passes lateral boundary conditions between computational domains in the WRF model. Previously, the same vertical grid was required on all domains, while the new code allows different vertical grids to be used on concurrently run domains. This new functionality improves WRF's ability to produce high-resolution simulations of the atmosphere by allowing a wider range of scales to be efficiently resolved and more accurate lateral boundary conditions to be provided through the nesting procedure.
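As a conceptual illustration of what passing boundary conditions between different vertical grids involves (this is not the WRF implementation, and all grids and values below are invented), a nest boundary column can be remapped from the parent's vertical levels onto a finer child grid by monotonic 1-D interpolation:

```python
# Conceptual sketch only (not WRF code): remap a lateral-boundary column from a
# parent domain's vertical grid onto a finer child vertical grid.
import numpy as np

parent_levels = np.linspace(0.0, 1.0, 35)          # normalized parent levels (assumed)
child_levels = np.linspace(0.0, 1.0, 70)           # finer child levels (assumed)
parent_theta = 300.0 + 30.0 * parent_levels**1.5   # synthetic boundary profile

child_theta = np.interp(child_levels, parent_levels, parent_theta)
print(child_theta[:5])
```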
Improving the accuracy and capability of transport and dispersion models in urban areas is essential for current and future urban applications. These models must reflect more realistically the presence and details of urban canopy features. Such features markedly influence the flo...
Industrial Sector Energy Efficiency Modeling (ISEEM) Framework Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karali, Nihan; Xu, Tengfang; Sathaye, Jayant
2012-12-12
The goal of this study is to develop a new bottom-up industry-sector energy-modeling framework for addressing least-cost regional and global carbon reduction strategies, improving on the capabilities and addressing the limitations of existing models by allowing trading across regions and countries as an alternative.
Extending the Lunar Mapping and Modeling Portal - New Capabilities and New Worlds
NASA Technical Reports Server (NTRS)
Day, B.; Law, E.; Arevalo, E.; Bui, B.; Chang, G.; Dodge, K.; Kim, R.; Malhotra, S.; Sadaqathullah, S.; Schmidt, G.;
2015-01-01
NASA's Lunar Mapping and Modeling Portal (LMMP) provides a web-based Portal and a suite of interactive visualization and analysis tools to enable mission planners, lunar scientists, and engineers to access mapped lunar data products from past and current lunar missions (http://lmmp.nasa.gov). During the past year, the capabilities and data served by LMMP have been significantly expanded. New interfaces are providing improved ways to access and visualize data. At the request of NASA's Science Mission Directorate, LMMP's technology and capabilities are now being extended to additional planetary bodies. New portals for Vesta and Mars are the first of these new products to be released. This presentation will provide an overview of LMMP, Vesta Trek, and Mars Trek, demonstrate their uses and capabilities, highlight new features, and preview coming enhancements.
Extending the Lunar Mapping and Modeling Portal - New Capabilities and New Worlds
NASA Astrophysics Data System (ADS)
Day, B.; Law, E.; Arevalo, E.; Bui, B.; Chang, G.; Dodge, K.; Kim, R.; Malhotra, S.; Sadaqathullah, S.; Schmidt, G.; Bailey, B.
2015-10-01
NASA's Lunar Mapping and Modeling Portal (LMMP) provides a web-based Portal and a suite of interactive visualization and analysis tools to enable mission planners, lunar scientists, and engineers to access mapped lunar data products from past and current lunar missions (http://lmmp.nasa.gov). During the past year, the capabilities and data served by LMMP have been significantly expanded. New interfaces are providing improved ways to access and visualize data. At the request of NASA's Science Mission Directorate, LMMP's technology and capabilities are now being extended to additional planetary bodies. New portals for Vesta and Mars are the first of these new products to be released. This presentation will provide an overview of LMMP, Vesta Trek, and Mars Trek, demonstrate their uses and capabilities, highlight new features, and preview coming enhancements.
Development of an Improved Simulator for Chemical and Microbial EOR Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, Gary A.; Sepehrnoori, Kamy; Delshad, Mojdeh
2000-09-11
The objective of this research was to extend the capability of an existing simulator (UTCHEM) to improved oil recovery methods that use surfactants, polymers, gels, alkaline chemicals, microorganisms and foam, as well as various combinations of these, in both conventional and naturally fractured oil reservoirs. Task 1 is the addition of a dual-porosity model for chemical improved oil recovery processes in naturally fractured oil reservoirs. Task 2 is the addition of a foam model. Task 3 addresses several numerical and coding enhancements that will greatly improve the versatility and performance of UTCHEM. Task 4 is the enhancement of physical property models.
Computational Modeling in Concert with Laboratory Studies: Application to B Cell Differentiation
Remediation is expensive, so accurate prediction of dose-response is important to help control costs. Dose response is a function of biological mechanisms. Computational models of these mechanisms improve the efficiency of research and provide the capability for prediction.
AGSM Functional Fault Models for Fault Isolation Project
NASA Technical Reports Server (NTRS)
Harp, Janicce Leshay
2014-01-01
This project implements functional fault models to automate the isolation of failures during ground systems operations. FFMs will also be used to recommend sensor placement to improve fault isolation capabilities. The project enables the delivery of system health advisories to ground system operators.
Formulation of a parametric systems design framework for disaster response planning
NASA Astrophysics Data System (ADS)
Mma, Stephanie Weiya
The occurrence of devastating natural disasters in the past several years has prompted communities, responding organizations, and governments to seek ways to improve disaster preparedness capabilities locally, regionally, nationally, and internationally. A holistic approach to design used in the aerospace and industrial engineering fields enables efficient allocation of resources through applied parametric changes within a particular design to improve performance metrics to selected standards. In this research, this methodology is applied to disaster preparedness, using a community's time to restoration after a disaster as the response metric. A review of the responses from Hurricane Katrina and the 2010 Haiti earthquake, among other prominent disasters, provides observations leading to some current capability benchmarking. A need for holistic assessment and planning exists for communities, but the current response planning infrastructure lacks a standardized framework and standardized assessment metrics. Within the humanitarian logistics community, several different metrics exist, enabling quantification and measurement of a particular area's vulnerability. These metrics, combined with design and planning methodologies from related fields, such as engineering product design, military response planning, and business process redesign, provide insight and a framework from which to begin developing a methodology to enable holistic disaster response planning. The developed methodology was applied to the communities of Shelby County, TN and pre-Hurricane-Katrina Orleans Parish, LA. Available literature and reliable media sources provide information about the different values of system parameters within the decomposition of the community aspects and also about relationships among the parameters. Each community was represented by a system dynamics model and was tested in the implementation of two-, five-, and ten-year improvement plans for Preparedness, Response, and Development capabilities, and combinations of these capabilities. For Shelby County and for Orleans Parish, the Response improvement plan reduced restoration time the most. For the combined capabilities, Shelby County experienced the greatest reduction in restoration time with the implementation of Development and Response capability improvements, and for Orleans Parish it was the Preparedness and Response capability improvements. Optimization of restoration time with community parameters was tested by using a Particle Swarm Optimization algorithm. Fifty different optimized restoration times were generated using the Particle Swarm Optimization algorithm and ranked using the Technique for Order Preference by Similarity to Ideal Solution. The optimization results indicate that the greatest reduction in restoration time for a community is achieved with a particular combination of different parameter values instead of the maximization of each parameter.
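Since the abstract leans on Particle Swarm Optimization without describing it, the following is a minimal, self-contained PSO sketch over an invented restoration-time surrogate; the objective function, bounds, and swarm settings are all assumptions, not the original study's system dynamics model.

```python
# Minimal particle swarm optimization sketch (assumed surrogate objective):
# minimize a restoration-time proxy over two community investment parameters.
import numpy as np

rng = np.random.default_rng(0)

def restoration_time(x):
    # Hypothetical surrogate: more preparedness/response investment -> shorter time
    prep, resp = x[..., 0], x[..., 1]
    return 100.0 - 25.0 * np.tanh(prep) - 35.0 * np.tanh(resp) + 5.0 * prep * resp

n_particles, n_iter, w, c1, c2 = 20, 30, 0.7, 1.5, 1.5
pos = rng.uniform(0.0, 2.0, (n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), restoration_time(pos)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 2.0)
    val = restoration_time(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best parameters:", gbest, "proxy restoration time:", pbest_val.min())
```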
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ortensi, Javier; Baker, Benjamin Allen; Schunert, Sebastian
The INL is currently evolving the modeling and simulation (M&S) capability that will enable improved core operation as well as design and analysis of TREAT experiments. This M&S capability primarily uses MAMMOTH, a reactor physics application being developed under the Multiphysics Object Oriented Simulation Environment (MOOSE) framework. MAMMOTH allows the coupling of a number of other MOOSE-based applications. This second year of work has been devoted to the generation of a deterministic reference solution for the full core, the preparation of anisotropic diffusion coefficients, the testing of the SPH equivalence method, and the improvement of the control rod modeling. In addition, this report includes the progress made in the modeling of the M8 core configuration and experiment vehicle since January of this year.
Combustion system CFD modeling at GE Aircraft Engines
NASA Technical Reports Server (NTRS)
Burrus, D.; Mongia, H.; Tolpadi, Anil K.; Correa, S.; Braaten, M.
1995-01-01
This viewgraph presentation discusses key features of current combustion system CFD modeling capabilities at GE Aircraft Engines provided by the CONCERT code; CONCERT development history; modeling applied for designing engine combustion systems; modeling applied to improve fundamental understanding; CONCERT3D results for current production combustors; CONCERT3D model of NASA/GE E3 combustor; HYBRID CONCERT CFD/Monte-Carlo modeling approach; and future modeling directions.
Combustion system CFD modeling at GE Aircraft Engines
NASA Astrophysics Data System (ADS)
Burrus, D.; Mongia, H.; Tolpadi, Anil K.; Correa, S.; Braaten, M.
1995-03-01
This viewgraph presentation discusses key features of current combustion system CFD modeling capabilities at GE Aircraft Engines provided by the CONCERT code; CONCERT development history; modeling applied for designing engine combustion systems; modeling applied to improve fundamental understanding; CONCERT3D results for current production combustors; CONCERT3D model of NASA/GE E3 combustor; HYBRID CONCERT CFD/Monte-Carlo modeling approach; and future modeling directions.
Capability maturity models for offshore organisational management.
Strutt, J E; Sharp, J V; Terry, E; Miles, R
2006-12-01
The goal setting regime imposed by the UK safety regulator has important implications for an organisation's ability to manage health and safety related risks. Existing approaches to safety assurance based on risk analysis and formal safety assessments are increasingly considered unlikely to create the step change improvement in safety to which the offshore industry aspires, and alternative approaches are being considered. One approach, which addresses the important issue of organisational behaviour and which can be applied at a very early stage of design, is the capability maturity model (CMM). The paper describes the development of a design safety capability maturity model, outlining the key processes considered necessary to safety achievement, the definition of maturity levels, and scoring methods. The paper discusses how CMM is related to regulatory mechanisms and risk-based decision making, together with the potential for applying CMM to environmental risk management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johns, Jesse M.; Burkes, Douglas
In this work, a multilayered perceptron (MLP) network is used to develop predictive isothermal time-temperature-transformation (TTT) models covering a range of U-Mo binary and ternary alloys. The selected ternary alloys for model development are U-Mo-Ru, U-Mo-Nb, U-Mo-Zr, U-Mo-Cr, and U-Mo-Re. These models' ability to predict 'novel' U-Mo alloys is demonstrated quite well despite the discrepancies between literature sources for similar alloys, which likely arise from different thermal-mechanical processing conditions. These models are developed with the primary purpose of informing experimental decisions. Additional experimental insight is necessary in order to reduce the number of experiments required to isolate ideal alloys. These models allow test planners to evaluate areas of experimental interest; once initial tests are conducted, the model can be updated and further improve follow-on testing decisions. The model also improves analysis capabilities by reducing the number of data points necessary from any particular test. For example, if one or two isotherms are measured during a test, the model can construct the rest of the TTT curve over a wide range of temperature and time. This modeling capability reduces the cost of experiments while also improving the value of the results from the tests. The reduced costs could result in improved material characterization and therefore improved fundamental understanding of TTT dynamics. As additional understanding of the phenomena driving TTTs is acquired, this type of MLP model can be used to populate unknowns (such as material impurity and other thermal-mechanical properties) from past literature sources.
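For readers unfamiliar with this class of model, the sketch below trains a small multilayer perceptron on synthetic composition/temperature data to predict a transformation-time surrogate; the features, target function, and network settings are invented for illustration and are not the report's alloy data or architecture.

```python
# Hedged sketch (synthetic data): an MLP regressor mapping alloy composition
# and isothermal hold temperature to log10(transformation start time), in the
# spirit of a data-driven TTT model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
# Features: [wt% Mo, wt% ternary addition, temperature in C] (synthetic)
X = np.column_stack([
    rng.uniform(6, 12, 500),      # Mo content
    rng.uniform(0, 4, 500),       # ternary alloying addition
    rng.uniform(350, 550, 500),   # isothermal temperature
])
# Synthetic target standing in for measured log10(time to transformation)
y = (2.0 + 0.2 * X[:, 0] - 0.3 * X[:, 1]
     - 0.0004 * (X[:, 2] - 450.0) ** 2 + rng.normal(0, 0.1, 500))

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
model.fit(X, y)
print(model.predict([[10.0, 2.0, 450.0]]))  # predicted log10(time) for a new alloy
```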
An Investigation of Bomb Cyclogenesis in NCEP's CFS Model
NASA Astrophysics Data System (ADS)
Alvarez, F. M.; Eichler, T.; Gottschalck, J.
2008-12-01
With the concerns, impacts and consequences of climate change increasing, the need for climate models to simulate daily weather is very important. Given the improvements in resolution and physical parameterizations, climate models are becoming capable of resolving extreme weather events. A particular type of extreme event which has large impacts on transportation, industry and the general public is a rapidly intensifying cyclone referred to as a "bomb." In this study, bombs are investigated using the National Centers for Environmental Prediction's (NCEP) Climate Forecast System (CFS) model. We generate storm tracks based on 6-hourly sea-level pressure (SLP) from long-term climate runs of the CFS model. Investigation of this dataset has revealed that the CFS model is capable of producing bombs. We show a case study of a bomb in the CFS model and demonstrate that it has characteristics similar to those observed. Since the CFS model is capable of producing bombs, future work will focus on trends in their frequency and intensity so that the potential role of the bomb in climate change can be assessed.
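The operational definition of a "bomb" that such storm-track studies typically use is the Sanders and Gyakum criterion: a central-pressure fall of at least 24 hPa in 24 hours, scaled by sin(latitude)/sin(60 degrees). The sketch below encodes that commonly cited threshold form; the sample numbers are invented.

```python
# Sketch of the latitude-adjusted explosive-deepening ("bomb") check.
import numpy as np

def is_bomb(slp_start_hpa, slp_end_hpa, lat_deg, hours=24.0):
    """Return True if the deepening rate meets the latitude-adjusted threshold."""
    deepening = (slp_start_hpa - slp_end_hpa) * (24.0 / hours)
    threshold = 24.0 * np.sin(np.radians(abs(lat_deg))) / np.sin(np.radians(60.0))
    return deepening >= threshold

print(is_bomb(996.0, 968.0, lat_deg=45.0))  # 28 hPa in 24 h at 45N -> True
```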
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Mangialardi, J. K.; Young, R.
1974-01-01
The capability of the basic automated Biowaste Sampling System (ABSS) hardware was extended and improved through the design, fabrication and test of breadboard hardware. A preliminary system design effort established the feasibility of integrating the breadboard concepts into the ABSS.
Curved Thermopiezoelectric Shell Structures Modeled by Finite Element Analysis
NASA Technical Reports Server (NTRS)
Lee, Ho-Jun
2000-01-01
"Smart" structures composed of piezoelectric materials may significantly improve the performance of aeropropulsion systems through a variety of vibration, noise, and shape-control applications. The development of analytical models for piezoelectric smart structures is an ongoing, in-house activity at the NASA Glenn Research Center at Lewis Field focused toward the experimental characterization of these materials. Research efforts have been directed toward developing analytical models that account for the coupled mechanical, electrical, and thermal response of piezoelectric composite materials. Current work revolves around implementing thermal effects into a curvilinear-shell finite element code. This enhances capabilities to analyze curved structures and to account for coupling effects arising from thermal effects and the curved geometry. The current analytical model implements a unique mixed multi-field laminate theory to improve computational efficiency without sacrificing accuracy. The mechanics can model both the sensory and active behavior of piezoelectric composite shell structures. Finite element equations are being implemented for an eight-node curvilinear shell element, and numerical studies are being conducted to demonstrate capabilities to model the response of curved piezoelectric composite structures (see the figure).
On central-difference and upwind schemes
NASA Technical Reports Server (NTRS)
Swanson, R. C.; Turkel, Eli
1990-01-01
A class of numerical dissipation models for central-difference schemes constructed with second- and fourth-difference terms is considered. The notion of matrix dissipation associated with upwind schemes is used to establish improved shock capturing capability for these models. In addition, conditions are given that guarantee that such dissipation models produce a Total Variation Diminishing (TVD) scheme. Appropriate switches for this type of model to ensure satisfaction of the TVD property are presented. Significant improvements in the accuracy of a central-difference scheme are demonstrated by computing both inviscid and viscous transonic airfoil flows.
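To show the flavor of blended second- and fourth-difference dissipation with a sensor switch, the scalar sketch below follows a JST-style construction; the coefficients and sensor are assumed, and this is not the matrix dissipation or TVD switch formulation described in the paper.

```python
# Scalar sketch of blended second/fourth-difference artificial dissipation
# (assumed coefficients; illustrative only).
import numpy as np

def dissipative_flux(u, k2=0.5, k4=1.0 / 32.0):
    """Interface dissipation d_{i+1/2} for a periodic 1-D field u."""
    up1 = np.roll(u, -1)
    up2 = np.roll(u, -2)
    um1 = np.roll(u, 1)
    # Sensor based on normalized second differences, active near discontinuities
    nu = np.abs(up1 - 2 * u + um1) / (np.abs(up1) + 2 * np.abs(u) + np.abs(um1) + 1e-12)
    eps2 = k2 * np.maximum(nu, np.roll(nu, -1))
    eps4 = np.maximum(0.0, k4 - eps2)   # fourth difference switched off near shocks
    return eps2 * (up1 - u) - eps4 * (up2 - 3 * up1 + 3 * u - um1)

u = np.where(np.arange(100) < 50, 1.0, 0.0)   # step profile
d = dissipative_flux(u)
residual = d - np.roll(d, 1)                  # net dissipation added per cell
print(residual[45:55].round(4))
```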
Computational Fluid Dynamics (CFD) simulations provide a number of unique opportunities for expanding and improving capabilities for modeling exposures to environmental pollutants. The US Environmental Protection Agency's National Exposure Research Laboratory (NERL) has been c...
Content-Aware Video Adaptation under Low-Bitrate Constraint
NASA Astrophysics Data System (ADS)
Hsiao, Ming-Ho; Chen, Yi-Wen; Chen, Hua-Tsung; Chou, Kuan-Hung; Lee, Suh-Yin
2007-12-01
With the development of wireless networks and the improvement of mobile device capabilities, video streaming is more and more widespread in such environments. Under conditions of limited resources and inherent constraints, appropriate video adaptation has become one of the most important and challenging issues in wireless multimedia applications. In this paper, we propose a novel content-aware video adaptation in order to effectively utilize resources and improve visual perceptual quality. First, the attention model is derived from analyzing the characteristics of brightness, location, motion vector, and energy features in the compressed domain to reduce computational complexity. Then, through the integration of the attention model, the capability of the client device, and a correlational statistic model, attractive regions of video scenes are derived. The information object- (IOB-) weighted rate distortion model is used for adjusting the bit allocation. Finally, the video adaptation scheme dynamically adjusts the video bitstream at the frame level and object level. Experimental results validate that the proposed scheme achieves better visual quality effectively and efficiently.
An electromagnetic induction method for underground target detection and characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartel, L.C.; Cress, D.H.
1997-01-01
An improved capability for subsurface structure detection is needed to support military and nonproliferation requirements for inspection and for surveillance of activities of threatening nations. As part of the DOE/NN-20 program to apply geophysical methods to detect and characterize underground facilities, Sandia National Laboratories (SNL) initiated an electromagnetic induction (EMI) project to evaluate low frequency electromagnetic (EM) techniques for subsurface structure detection. Low frequency, in this case, extended from kilohertz to hundreds of kilohertz. An EMI survey procedure had already been developed for borehole imaging of coal seams and had successfully been applied in a surface mode to detect a drug smuggling tunnel. The SNL project has focused on building upon the success of that procedure and applying it to surface and low altitude airborne platforms. Part of SNL's work has focused on improving that technology through improved hardware and data processing. The improved hardware development has been performed utilizing Laboratory Directed Research and Development (LDRD) funding. In addition, SNL's effort focused on: (1) improvements in modeling of the basic geophysics of the illuminating electromagnetic field and its coupling to the underground target (partially funded using LDRD funds) and (2) development of techniques for phase-based and multi-frequency processing and spatial processing to support subsurface target detection and characterization. The products of this project are: (1) an evaluation of an improved EM gradiometer, (2) an improved gradiometer concept for possible future development, (3) an improved modeling capability, (4) demonstration of an EM wave migration method for target recognition, and (5) a demonstration that the technology is capable of detecting targets to depths exceeding 25 meters.
Ahlfeld, David P.; Baker, Kristine M.; Barlow, Paul M.
2009-01-01
This report describes the Groundwater-Management (GWM) Process for MODFLOW-2005, the 2005 version of the U.S. Geological Survey modular three-dimensional groundwater model. GWM can solve a broad range of groundwater-management problems by combined use of simulation- and optimization-modeling techniques. These problems include limiting groundwater-level declines or streamflow depletions, managing groundwater withdrawals, and conjunctively using groundwater and surface-water resources. GWM was initially released for the 2000 version of MODFLOW. Several modifications and enhancements have been made to GWM since its initial release to increase the scope of the program's capabilities and to improve its operation and reporting of results. The new code, which is called GWM-2005, also was designed to support the local grid refinement capability of MODFLOW-2005. Local grid refinement allows for the simulation of one or more higher resolution local grids (referred to as child models) within a coarser grid parent model. Local grid refinement is often needed to improve simulation accuracy in regions where hydraulic gradients change substantially over short distances or in areas requiring detailed representation of aquifer heterogeneity. GWM-2005 can be used to formulate and solve groundwater-management problems that include components in both parent and child models. Although local grid refinement increases simulation accuracy, it can also substantially increase simulation run times.
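Groundwater-management formulations of the kind GWM solves are often posed as linear programs built on response coefficients; the sketch below is a simplified example in that spirit, with made-up response-matrix data, drawdown limits, and well capacities (it is not GWM itself).

```python
# Simplified response-matrix management problem: maximize total withdrawal
# from three wells subject to drawdown limits at two control locations.
import numpy as np
from scipy.optimize import linprog

# response[i, j]: drawdown at control point i per unit pumping at well j (assumed)
response = np.array([[0.8, 0.3, 0.1],
                     [0.2, 0.6, 0.5]])
max_drawdown = np.array([5.0, 4.0])        # allowable drawdown [m]
well_capacity = [(0.0, 10.0)] * 3          # per-well pumping bounds

# linprog minimizes, so negate the objective to maximize total withdrawal
res = linprog(c=[-1.0, -1.0, -1.0], A_ub=response, b_ub=max_drawdown,
              bounds=well_capacity, method="highs")
print("optimal pumping rates:", res.x, "total withdrawal:", -res.fun)
```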
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeni, Lorenzo; Hesselbæk, Bo; Bech, John
This article presents an example of the application of a modern test facility conceived for experiments on the integration of renewable energy in the power system. The capabilities of the test facility are used to validate dynamic simulation models of wind power plants and their controllers. The models are based on standard and generic blocks. The successful validation of events related to the control of active power (control phenomena in the <10 Hz range, including frequency control and power oscillation damping) is described, demonstrating the capabilities of the test facility and charting the course for future work and improvements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jurrus, Elizabeth; Engel, Dave; Star, Keith
The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has had an impact on the study of a broad range of chemical, biological, and biomedical applications. APBS addresses three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this manuscript, we discuss the models and capabilities that have recently been implemented within the APBS software package, including: a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics.
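For reference, the nonlinear Poisson-Boltzmann equation that solvers of this class discretize is commonly written as (notation assumed; a symmetric 1:1 salt is implied):

```latex
\nabla \cdot \left[\epsilon(\mathbf{r})\,\nabla\phi(\mathbf{r})\right]
  - \bar{\kappa}^{2}(\mathbf{r})\,\sinh\!\big(\phi(\mathbf{r})\big)
  = -4\pi\,\rho^{f}(\mathbf{r})
```

where phi is the (dimensionless) electrostatic potential, epsilon the position-dependent dielectric coefficient, kappa-bar squared the ion-accessibility-modified screening coefficient, and rho^f the fixed solute charge density; the linearized form replaces sinh(phi) with phi.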
Status Report on NEAMS System Analysis Module Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, R.; Fanning, T. H.; Sumner, T.
2015-12-01
Under the Reactor Product Line (RPL) of DOE-NE's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, an advanced SFR System Analysis Module (SAM) is being developed at Argonne National Laboratory. The goal of the SAM development is to provide fast-running, improved-fidelity, whole-plant transient analysis capabilities. SAM utilizes an object-oriented application framework (MOOSE), its underlying meshing and finite-element library (libMesh), and linear and nonlinear solvers (PETSc) to leverage modern advanced software environments and numerical methods. It also incorporates advances in physical and empirical models and seeks closure models based on information from high-fidelity simulations and experiments. This report provides an update on the SAM development and summarizes the activities performed in FY15 and the first quarter of FY16. The tasks include: (1) implement the support of 2nd-order finite elements in SAM components for improved accuracy and computational efficiency; (2) improve the conjugate heat transfer modeling and develop pseudo 3-D full-core reactor heat transfer capabilities; (3) perform verification and validation tests as well as demonstration simulations; (4) develop the coupling requirements for SAS4A/SASSYS-1 and SAM integration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin; Zhang, Yongfeng; Chakraborty, Pritam
2014-09-01
This report summarizes work during FY 2014 to develop capabilities to predict embrittlement of reactor pressure vessel steel and to assess the response of embrittled reactor pressure vessels to postulated accident conditions. This work has been conducted at three length scales. At the engineering scale, 3D fracture mechanics capabilities have been developed to calculate stress intensities and fracture toughnesses and to perform a deterministic assessment of whether a crack would propagate at the location of an existing flaw. This capability has been demonstrated on several types of flaws in a generic reactor pressure vessel model. Models have been developed at the scale of fracture specimens to develop a capability to determine how irradiation affects the fracture toughness of the material. Verification work has been performed on a previously developed model to determine the sensitivity of the model to specimen geometry and size effects. The effects of irradiation on the parameters of this model have been investigated. At lower length scales, work has continued in an ongoing effort to understand how irradiation and thermal aging affect the microstructure and mechanical properties of reactor pressure vessel steel. Previously developed atomistic kinetic Monte Carlo models have been further developed and benchmarked against experimental data. Initial work has been performed to develop models of nucleation in a phase field model. Additional modeling work has also been performed to improve the fundamental understanding of the formation mechanisms and stability of matrix defects caused by irradiation.
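For orientation, the deterministic screening at the engineering scale typically reduces to comparing a computed stress intensity against an irradiation-shifted toughness; a generic form of that comparison (standard fracture-mechanics notation, not taken from the report) is:

```latex
K_I = Y\,\sigma\sqrt{\pi a}
\quad \text{compared against} \quad
K_{Ic}\big(T - RT_{NDT}\big)
```

where K_I is the mode-I stress intensity for a flaw of depth a under applied stress sigma with geometry factor Y, and RT_NDT is the nil-ductility reference temperature whose irradiation-induced shift the embrittlement models are meant to predict; crack propagation is postulated when K_I exceeds the fracture toughness K_Ic.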
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebert, D.
1997-07-01
This is a report on the CSNI Workshop on Transient Thermal-Hydraulic and Neutronic Codes Requirements held at Annapolis, Maryland, USA, November 5-8, 1996. This experts' meeting consisted of 140 participants from 21 countries; 65 invited papers were presented. The meeting was divided into five areas: (1) current and prospective plans of thermal hydraulic codes development; (2) current and anticipated uses of thermal-hydraulic codes; (3) advances in modeling of thermal-hydraulic phenomena and associated additional experimental needs; (4) numerical methods in multi-phase flows; and (5) programming language, code architectures and user interfaces. The workshop consensus identified the following important action items to be addressed by the international community in order to maintain and improve the calculational capability: (a) preserve current code expertise and institutional memory, (b) preserve the ability to use the existing investment in plant transient analysis codes, (c) maintain essential experimental capabilities, (d) develop advanced measurement capabilities to support future code validation work, (e) integrate existing analytical capabilities so as to improve performance and reduce operating costs, (f) exploit the proven advances in code architecture, numerics, graphical user interfaces, and modularization in order to improve code performance and scrutability, and (g) more effectively utilize user experience in modifying and improving the codes.
Advanced Ground Systems Maintenance Functional Fault Models For Fault Isolation Project
NASA Technical Reports Server (NTRS)
Perotti, Jose M. (Compiler)
2014-01-01
This project implements functional fault models (FFM) to automate the isolation of failures during ground systems operations. FFMs will also be used to recommend sensor placement to improve fault isolation capabilities. The project enables the delivery of system health advisories to ground system operators.
Modeling tools for the assessment of microbiological risks during floods: a review
NASA Astrophysics Data System (ADS)
Collender, Philip; Yang, Wen; Stieglitz, Marc; Remais, Justin
2015-04-01
Floods are a major, recurring source of harm to global economies and public health. Projected increases in the frequency and intensity of heavy precipitation events under future climate change, coupled with continued urbanization in areas with high risk of floods, may exacerbate future impacts of flooding. Improved flood risk management is essential to support global development, poverty reduction and public health, and is likely to be a crucial aspect of climate change adaptation. Importantly, floods can facilitate the transmission of waterborne pathogens by changing social conditions (overcrowding among displaced populations, interruption of public health services), imposing physical challenges to infrastructure (sewerage overflow, reduced capacity to treat drinking water), and altering fate and transport of pathogens (transport into waterways from overland flow, resuspension of settled contaminants) during and after flood conditions. Hydrological and hydrodynamic models are capable of generating quantitative characterizations of microbiological risks associated with flooding, while accounting for these diverse and at times competing physical and biological processes. Despite a few applications of such models to the quantification of microbiological risks associated with floods, there exists limited guidance as to the relative capabilities, and limitations, of existing modeling platforms when used for this purpose. Here, we review 17 commonly used flood and water quality modeling tools that have demonstrated or implicit capabilities of mechanistically representing and quantifying microbial risk during flood conditions. We compare models with respect to their capabilities of generating outputs that describe physical and microbial conditions during floods, such as concentration or load of non-cohesive sediments or pathogens, and the dynamics of high flow conditions. Recommendations are presented for the application of specific modeling tools for assessing particular flood-related microbial risks, and model improvements are suggested that may better characterize key microbial risks during flood events. The state of current tools is assessed in the context of a changing climate where the frequency, intensity and duration of flooding are shifting in some areas.
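Many of the reviewed tools represent in-stream pathogen fate with some variant of an advection-dispersion-reaction balance; a generic one-dimensional form (notation assumed, not taken from any single tool) is:

```latex
\frac{\partial C}{\partial t} + u\,\frac{\partial C}{\partial x}
  = D\,\frac{\partial^{2} C}{\partial x^{2}} - k\,C + S(x, t)
```

where C is pathogen concentration, u the flow velocity, D a longitudinal dispersion coefficient, k a first-order die-off rate, and S sources such as sewer overflow or resuspension of settled contaminants.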
A Coupled Surface Nudging Scheme for use in Retrospective ...
A surface analysis nudging scheme coupling atmospheric and land surface thermodynamic parameters has been implemented into WRF v3.8 (the latest version) for use with retrospective weather and climate simulations, as well as for applications in air quality, hydrology, and ecosystem modeling. This scheme is known as the flux-adjusting surface data assimilation system (FASDAS) developed by Alapaty et al. (2008). This scheme provides continuous adjustments for soil moisture and temperature (via indirect nudging) and for surface air temperature and water vapor mixing ratio (via direct nudging). The simultaneous application of indirect and direct nudging maintains greater consistency between the soil temperature–moisture and the atmospheric surface layer mass-field variables. The new method, FASDAS, consistently improved the accuracy of the model simulations at weather prediction scales for different horizontal grid resolutions, as well as for high resolution regional climate predictions. This new capability has been released in WRF Version 3.8 as option grid_sfdda = 2. This new capability increased the accuracy of atmospheric inputs for use in air quality, hydrology, and ecosystem modeling research, improving the accuracy of the respective end-point research outcomes. IMPACT: A new method, FASDAS, was implemented into the WRF model to consistently improve the accuracy of the model simulations at weather prediction scales for different horizontal grid resolutions, as well as for high resolution regional climate predictions.
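The direct (analysis) nudging applied to the surface air variables has the generic Newtonian-relaxation form shown below; the coefficients are schematic and this is not the exact FASDAS formulation.

```latex
\frac{\partial \alpha}{\partial t}
  = F(\alpha, \mathbf{x}, t)
  + G_{\alpha}\, W(\mathbf{x}, t)\,\big(\alpha_{\mathrm{analysis}} - \alpha\big)
```

Here alpha is the nudged variable (for example, surface air temperature or water vapor mixing ratio), F represents the model's physical tendencies, G_alpha is a nudging coefficient, and W is a spatial and temporal weighting function; the indirect (soil) adjustments are then derived from the resulting surface-layer increments.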
CrossTalk: The Journal of Defense Software Engineering. Volume 20, Number 2
2007-02-01
article is to show that when an organization is already doing competent project management, the effort to benchmark that capability by using CMMI is ... process-improvement evolution, by Watts S. Humphrey, Dr. Michael D. Konrad, James W. Over, and William C. Peterson; and "The ImprovAbility Model," a model that helps ... [Remaining table-of-contents fragments, page numbers, and department listings omitted.]
LES ARM Symbiotic Simulation and Observation (LASSO) Implementation Strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gustafson Jr., WI; Vogelmann, AM
2015-09-01
This document illustrates the design of the Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) workflow to provide a routine, high-resolution modeling capability to augment the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility's high-density observations. LASSO will create a powerful new capability for furthering ARM's mission to advance understanding of cloud, radiation, aerosol, and land-surface processes. The combined observational and modeling elements will enable a new level of scientific inquiry by connecting processes and context to observations and providing needed statistics for details that cannot be measured. The result will be improved process understanding that facilitates concomitant improvements in climate model parameterizations. The initial LASSO implementation will be for ARM's Southern Great Plains site in Oklahoma and will focus on shallow convection, which is poorly simulated by climate models due in part to clouds' typically small spatial scale compared to model grid spacing, and because the convection involves complicated interactions of microphysical and boundary layer processes.
Optical Associative Memory Model With Threshold Modification Using Complementary Vector
NASA Astrophysics Data System (ADS)
Bian, Shaoping; Xu, Kebin; Hong, Jing
1989-02-01
A new criterion to evaluate the similarity between two vectors in associative memory is presented. Based on this criterion, an experimental investigation of an optical associative memory model with threshold modification using a complementary vector is carried out. This model is capable of eliminating the possibility of erroneous recall, thereby improving the accuracy of readout.
Ignition behavior of live California chaparral leaves
J.D. Engstrom; J.K Butler; S.G. Smith; L.L. Baxter; T.H. Fletcher; D.R. Weise
2004-01-01
Current forest fire models are largely empirical correlations based on data from beds of dead vegetation. Improvement in model capabilities is sought by developing models of the combustion of live fuels. A facility was developed to determine the combustion behavior of small samples of live fuels, consisting of a flat-flame burner on a moveable platform. Qualitative and...
NASA Technical Reports Server (NTRS)
Knezovich, F. M.
1976-01-01
A modular structured system of computer programs is presented utilizing earth and ocean dynamical data keyed to finitely defined parameters. The model is an assemblage of mathematical algorithms with an inherent capability of maturation with progressive improvements in observational data frequencies, accuracies and scopes. The EOM in its present state is a first-order approach to a geophysical model of the earth's dynamics.
The Deutsch Model--Institute for Developmental Studies.
ERIC Educational Resources Information Center
New York Univ., NY. Inst. for Developmental Studies.
The Deutsch intervention model is based on the theory that environment plays a major role in the development of cognitive skills and of functional use of intellectual capabilities. Disadvantaged children have intellectual deficits which may be overcome by use of matched remedial measures. Language skills and motivation can be improved by teaching…
Improved Financial Capability Can Reduce Material Hardship among Mothers.
Huang, Jin; Nam, Yunju; Sherraden, Michael; Clancy, Margaret M
2016-10-01
This study draws on the theoretical framework of financial capability in investigating whether financial access (that is, availability of financial products and services) and financial knowledge (that is, understanding of basic financial concepts) can influence the risk of material hardship. Authors examine the possibility of direct associations as well as of indirect ones in which financial management (that is, individual financial behaviors) serves as a mediator. The probability sample of mothers with young children born in Oklahoma during 2007 (N = 2,529) was selected from Oklahoma birth certificates. Results from structural equation modeling analyses show that financial access is positively associated with financial management (p < 0.001) but that financial knowledge is not; both financial access (p < 0.001) and financial management (p < 0.001) are negatively correlated with material hardship. Similar results are obtained from analyses with a subsample of low-income mothers. Findings suggest that financial capability, particularly the financial access component, is critical for improving financial management and reducing the risk of material hardship among mothers with young children, including low-income mothers. Efforts to promote financial capability offer social workers an important strategy for improving their clients’ economic well-being.
Pointer, William David; Baglietto, Emilio
2016-05-01
Here, in the effort to reinvigorate innovation in the way we design, build, and operate the nuclear power generating stations of today and tomorrow, nothing can be taken for granted. Not even the seemingly familiar physics of boiling water. The Consortium for the Advanced Simulation of Light Water Reactors, or CASL, is focused on the deployment of advanced modeling and simulation capabilities to enable the nuclear industry to reduce uncertainties in the prediction of multi-physics phenomena and continue to improve the performance of today's Light Water Reactors and their fuel. An important part of the CASL mission is the development of a next-generation thermal hydraulics simulation capability, integrating the history of engineering models based on experimental experience with the computing technology of the future.
Verification of a Finite Element Model for Pyrolyzing Ablative Materials
NASA Technical Reports Server (NTRS)
Risch, Timothy K.
2017-01-01
Ablating thermal protection system (TPS) materials have been used in many reentering spacecraft and in other applications such as rocket nozzle linings, fire protection materials, and as countermeasures for directed energy weapons. The introduction of the finite element method to the analysis of ablation has arguably resulted in improved computational capabilities due to the flexibility and extended applicability of the method, especially to complex geometries. Commercial finite element codes often provide enhanced capability compared to custom, specially written programs based on versatility, usability, pre- and post-processing, grid generation, total life-cycle costs, and speed.
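The in-depth decomposition that finite element ablation models of this type track is commonly represented by an Arrhenius rate law of the following generic form (standard in charring-ablator analysis, not necessarily the exact model verified in the paper):

```latex
\frac{\partial \rho}{\partial t}
  = -A\,\rho_{v}\left(\frac{\rho - \rho_{c}}{\rho_{v}}\right)^{n}
     \exp\!\left(-\frac{E_{a}}{R\,T}\right)
```

where rho is the local density decaying from the virgin value rho_v toward the char value rho_c, and A, E_a, and n are material-specific rate constants; the pyrolysis gas generated by this decomposition percolates toward the heated surface and contributes to the energy balance.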
NASA and USGS invest in invasive species modeling to evaluate habitat for Africanized Honey Bees
2009-01-01
Invasive non-native species, such as plants, animals, and pathogens, have long been of interest to the U.S. Geological Survey (USGS) and NASA. Invasive species cause harm to our economy (around $120 B/year), the environment (e.g., replacing native biodiversity, forest pathogens negatively affecting carbon storage), and human health (e.g., plague, West Nile virus). Five years ago, the USGS and NASA formed a partnership to improve ecological forecasting capabilities for the early detection and containment of the highest priority invasive species. Scientists from NASA Goddard Space Flight Center (GSFC) and the Fort Collins Science Center developed a long-term strategy to integrate remote sensing capabilities, high-performance computing capabilities and new spatial modeling techniques to advance the science of ecological invasions [Schnase et al., 2002].
Telecom Modeling with ChatterBell.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jrad, Ahmad M.; Kelic, Andjelka
This document provides a description and user manual for the ChatterBell voice telecom modeling and simulation capability. The intended audience consists of network planners and practitioners who wish to use the tool to model a particular voice network and analyze its behavior under varying assumptions and possible failure conditions. ChatterBell is built on top of the N-SMART voice simulation and visualization suite that was developed through collaboration between Sandia National Laboratories and Bell Laboratories of Lucent Technologies. The new and improved modeling and simulation tool has been modified and modernized to incorporate the latest developments in the telecom world, including the widespread use of VoIP technology. In addition, ChatterBell provides new commands and modeling capabilities that were not available in the N-SMART application.
An Operations Concept for Integrated Model-Centric Engineering at JPL
NASA Technical Reports Server (NTRS)
Bayer, Todd J.; Cooney, Lauren A.; Delp, Christopher L.; Dutenhoffer, Chelsea A.; Gostelow, Roli D.; Ingham, Michel D.; Jenkins, J. Steven; Smith, Brian S.
2010-01-01
As JPL's missions grow more complex, the need for improved systems engineering processes is becoming clear. Of significant promise in this regard is the move toward a more integrated and model-centric approach to mission conception, design, implementation and operations. The Integrated Model-Centric Engineering (IMCE) Initiative, now underway at JPL, seeks to lay the groundwork for these improvements. This paper will report progress on three fronts: articulating JPL's need for IMCE; characterizing the enterprise into which IMCE capabilities will be deployed; and constructing an operations concept for a flight project development in an integrated model-centric environment.
Integrated Medical Model (IMM) 4.0 Verification and Validation (VV) Testing (HRP IWS 2016)
NASA Technical Reports Server (NTRS)
Walton, M; Kerstman, E.; Arellano, J.; Boley, L.; Reyes, D.; Young, M.; Garcia, Y.; Saile, L.; Myers, J.
2016-01-01
Timeline, partial treatment, and alternate medications were added to the IMM to improve the fidelity of this model and enhance decision support capabilities. Using standard design reference missions, IMM VV testing compared outputs from the current operational IMM (v3) with those from the model with added functionalities (v4). These new capabilities were examined in a comparative, stepwise approach as follows: a) comparison of the current operational IMM v3 with the enhanced functionality of timeline alone (IMM 4.T), b) comparison of IMM 4.T with the timeline and partial treatment (IMM 4.TPT), and c) comparison of IMM 4.TPT with timeline, partial treatment and alternative medication (IMM 4.0).
INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorensek, M.; Hamm, L.; Garcia, H.
2011-07-18
Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.
Improvements to robotics-inspired conformational sampling in rosetta.
Stein, Amelie; Kortemme, Tanja
2013-01-01
To accurately predict protein conformations in atomic detail, a computational method must be capable of sampling models sufficiently close to the native structure. All-atom sampling is difficult because of the vast number of possible conformations and extremely rugged energy landscapes. Here, we test three sampling strategies to address these difficulties: conformational diversification, intensification of torsion and omega-angle sampling and parameter annealing. We evaluate these strategies in the context of the robotics-based kinematic closure (KIC) method for local conformational sampling in Rosetta on an established benchmark set of 45 12-residue protein segments without regular secondary structure. We quantify performance as the fraction of sub-Angstrom models generated. While improvements with individual strategies are only modest, the combination of intensification and annealing strategies into a new "next-generation KIC" method yields a four-fold increase over standard KIC in the median percentage of sub-Angstrom models across the dataset. Such improvements enable progress on more difficult problems, as demonstrated on longer segments, several of which could not be accurately remodeled with previous methods. Given its improved sampling capability, next-generation KIC should allow advances in other applications such as local conformational remodeling of multiple segments simultaneously, flexible backbone sequence design, and development of more accurate energy functions.
Nuclear Fuels & Materials Spotlight Volume 5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petti, David Andrew
2016-10-01
As the nation's nuclear energy laboratory, Idaho National Laboratory brings together talented people and specialized nuclear research capability to accomplish our mission. This edition of the Nuclear Fuels and Materials Division Spotlight provides an overview of some of our recent accomplishments in research and capability development. These accomplishments include: • Evaluation and modeling of light water reactor accident tolerant fuel concepts • Status and results of recent TRISO-coated particle fuel irradiations, post-irradiation examinations, high-temperature safety testing to demonstrate the accident performance of this fuel system, and advanced microscopy to improve the understanding of fission product transport in this fuel system. • Improvements in and applications of meso and engineering scale modeling of light water reactor fuel behavior under a range of operating conditions and postulated accidents (e.g., power ramping, loss of coolant accident, and reactivity initiated accidents) using the MARMOT and BISON codes. • Novel measurements of the properties of nuclear (actinide) materials under extreme conditions (e.g., high pressure, low/high temperatures, high magnetic field) to improve the scientific understanding of these materials. • Modeling reactor pressure vessel behavior using the GRIZZLY code. • New methods using sound to sense temperature inside a reactor core. • Improved experimental capabilities to study the response of fusion reactor materials to a tritium plasma. Throughout Spotlight, you'll find examples of productive partnerships with academia, industry, and government agencies that deliver high-impact outcomes. The work conducted at Idaho National Laboratory helps spur innovation in nuclear energy applications that drive economic growth and energy security. We appreciate your interest in our work here at Idaho National Laboratory, and hope that you find this issue informative.
Crowdsourced Contributions to the Nation's Geodetic Elevation Infrastructure
NASA Astrophysics Data System (ADS)
Stone, W. A.
2014-12-01
NOAA's National Geodetic Survey (NGS), a United States Department of Commerce agency, is engaged in providing the nation's fundamental positioning infrastructure - the National Spatial Reference System (NSRS) - which includes the framework for latitude, longitude, and elevation determination as well as various geodetic models, tools, and data. Capitalizing on Global Navigation Satellite System (GNSS) technology for improved access to the nation's precise geodetic elevation infrastructure requires use of a geoid model, which relates GNSS-derived heights (ellipsoid heights) with traditional elevations (orthometric heights). NGS is facilitating the use of crowdsourced GNSS observations collected at published elevation control stations by the professional surveying, geospatial, and scientific communities to help improve NGS' geoid modeling capability. This collocation of published elevation data and newly collected GNSS data integrates together the two height systems. This effort in turn supports enhanced access to accurate elevation information across the nation, thereby benefiting all users of geospatial data. By partnering with the public in this collaborative effort, NGS is not only helping facilitate improvements to the elevation infrastructure for all users but also empowering users of NSRS with the capability to do their own high-accuracy positioning. The educational outreach facet of this effort helps inform the public, including the scientific community, about the utility of various NGS tools, including the widely used Online Positioning User Service (OPUS). OPUS plays a key role in providing user-friendly and high accuracy access to NSRS, with optional sharing of results with NGS and the public. All who are interested in helping evolve and improve the nationwide elevation determination capability are invited to participate in this nationwide partnership and to learn more about the geodetic infrastructure which is a vital component of viable spatial data for many disciplines, including the geosciences.
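As a rough illustration of the height relationship that the geoid model provides, an orthometric height is obtained by subtracting the geoid undulation from the GNSS-derived ellipsoid height. The sketch below is a generic illustration in Python, not an NGS tool, and the numeric values are placeholders:

    # Sketch: relating GNSS ellipsoid heights to orthometric heights via a geoid model.
    # Values are hypothetical; NGS tools such as OPUS perform this conversion rigorously.
    def orthometric_height(ellipsoid_height_m, geoid_undulation_m):
        """H = h - N, where h is the GNSS ellipsoid height and N is the geoid height."""
        return ellipsoid_height_m - geoid_undulation_m

    # Example: h = 150.42 m from GNSS, N = -28.67 m from a geoid model.
    print(orthometric_height(150.42, -28.67))  # about 179.09 m orthometric height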
CoMD Implementation Suite in Emerging Programming Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haque, Riyaz; Reeve, Sam; Juallmes, Luc
CoMD-Em is a software implementation suite of the CoMD [4] proxy app using different emerging programming models. It is intended to analyze the features and capabilities of novel programming models that could help ensure code and performance portability and scalability across heterogeneous platforms while improving programmer productivity. Another goal is to provide the authors and vendors with meaningful feedback regarding the capabilities and limitations of their models. The actual application is a classical molecular dynamics (MD) simulation using either the Lennard-Jones method (LJ) or the embedded atom method (EAM) for primary particle interaction. The code can be extended to support alternate interaction models. The code is expected to run on a wide class of heterogeneous hardware configurations such as shared/distributed/hybrid memory, GPUs, and any other platform supported by the underlying programming model.
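For context, the Lennard-Jones interaction behind the LJ option reduces to a simple pair potential and force. The sketch below is a generic textbook implementation in Python, not CoMD-Em code, and the epsilon/sigma values are placeholders:

    def lj_potential_and_force(r, epsilon=1.0, sigma=1.0):
        """Lennard-Jones pair potential V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)
        and the radial force magnitude F(r) = -dV/dr."""
        sr6 = (sigma / r) ** 6
        potential = 4.0 * epsilon * (sr6 ** 2 - sr6)
        force = 24.0 * epsilon * (2.0 * sr6 ** 2 - sr6) / r
        return potential, force

    # The force vanishes at the potential minimum, r = 2**(1/6) * sigma:
    print(lj_potential_and_force(2 ** (1 / 6)))  # (-1.0, ~0.0)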
Improved alignment evaluation and optimization : final report.
DOT National Transportation Integrated Search
2007-09-11
This report outlines the development of an enhanced highway alignment evaluation and optimization : model. A GIS-based software tool is prepared for alignment optimization that uses genetic algorithms for : optimal search. The software is capable of ...
Advanced capability of air quality simulation models towards accurate performance at finer scales will be needed for such models to serve as tools for performing exposure and risk assessments in urban areas. It is recognized that the impact of urban features such as street and t...
Model Based Mission Assurance: Emerging Opportunities for Robotic Systems
NASA Technical Reports Server (NTRS)
Evans, John W.; DiVenti, Tony
2016-01-01
The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiencies across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability, and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, a structured hierarchical argument or model, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).
Predictive Capability Maturity Model for computational modeling and simulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.
2007-10-01
The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.
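As a toy illustration of how the six PCMM elements and four maturity levels could be tabulated during an assessment (the scores below are hypothetical and are not taken from the report):

    # Hypothetical PCMM assessment sketch: score each element on a maturity level 0-3.
    PCMM_ELEMENTS = [
        "representation and geometric fidelity",
        "physics and material model fidelity",
        "code verification",
        "solution verification",
        "model validation",
        "uncertainty quantification and sensitivity analysis",
    ]

    assessment = dict(zip(PCMM_ELEMENTS, [2, 1, 3, 2, 1, 0]))  # example levels only

    # The weakest element is often what limits the overall maturity judgment.
    print("per-element levels:", assessment)
    print("limiting element:", min(assessment, key=assessment.get))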
Miskovic, Ljubisa; Alff-Tuomala, Susanne; Soh, Keng Cher; Barth, Dorothee; Salusjärvi, Laura; Pitkänen, Juha-Pekka; Ruohonen, Laura; Penttilä, Merja; Hatzimanikatis, Vassily
2017-01-01
Recent advancements in omics measurement technologies have led to an ever-increasing amount of available experimental data that necessitate systems-oriented methodologies for efficient and systematic integration of data into consistent large-scale kinetic models. These models can help us to uncover new insights into cellular physiology and also to assist in the rational design of bioreactor or fermentation processes. The Optimization and Risk Analysis of Complex Living Entities (ORACLE) framework for the construction of large-scale kinetic models can be used as guidance for formulating alternative metabolic engineering strategies. We used ORACLE in a metabolic engineering problem: improvement of the xylose uptake rate during mixed glucose-xylose consumption in a recombinant Saccharomyces cerevisiae strain. Using the data from bioreactor fermentations, we characterized network flux and concentration profiles representing possible physiological states of the analyzed strain. We then identified enzymes that could lead to improved flux through xylose transporters (XTR). For some of the identified enzymes, including hexokinase (HXK), we could not deduce if their control over XTR was positive or negative. We thus performed a follow-up experiment, and we found that HXK2 deletion improves the xylose uptake rate. The data from the performed experiments were then used to prune the kinetic models, and the predictions of the pruned population of kinetic models were in agreement with the experimental data collected on the HXK2-deficient S. cerevisiae strain. We present a design-build-test cycle composed of modeling efforts and experiments with a glucose-xylose co-utilizing recombinant S. cerevisiae and its HXK2-deficient mutant that allowed us to uncover interdependencies between upper glycolysis and the xylose uptake pathway. Through this cycle, we also obtained kinetic models with improved prediction capabilities. The present study demonstrates the potential of integrated "modeling and experiments" systems biology approaches that can be applied for diverse applications ranging from biotechnology to drug discovery.
Monte Carlo capabilities of the SCALE code system
Rearden, Bradley T.; Petrie, Jr., Lester M.; Peplow, Douglas E.; ...
2014-09-12
SCALE is a broadly used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. Finally, an overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.
NASA Astrophysics Data System (ADS)
Mohd Yunos, Zuriahati; Shamsuddin, Siti Mariyam; Ismail, Noriszura; Sallehuddin, Roselina
2013-04-01
Artificial neural network (ANN) with the back propagation algorithm (BP) and ANFIS were chosen as alternative techniques for modeling motor insurance claims. In particular, the ANN and ANFIS techniques are applied to model and forecast Malaysian motor insurance data categorized into four claim types: third party property damage (TPPD), third party bodily injury (TPBI), own damage (OD) and theft. This study aims to determine whether an ANN or ANFIS model is capable of accurately predicting motor insurance claims. Changes to the network structure, such as the number of input nodes, the number of hidden nodes and the pre-processing techniques, are also examined, and a cross-validation technique is used to improve the generalization ability of the ANN and ANFIS models. Based on the empirical studies, the prediction performance of the ANN and ANFIS models is improved by using different numbers of input nodes and hidden nodes, and also various sizes of data. The experimental results reveal that the ANFIS model outperforms the ANN model. Both models are capable of producing a reliable prediction for Malaysian motor insurance claims and hence, the proposed method can be applied as an alternative to predict claim frequency and claim severity.
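A minimal sketch of the kind of neural-network regression with cross-validation described above, using scikit-learn rather than the authors' own ANN/ANFIS implementations; the data, network size, and scoring choice are placeholders:

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.random((200, 4))                          # placeholder rating factors
    y = 50 + 30 * X[:, 0] + rng.normal(0, 5, 200)     # placeholder claim frequency

    # One hidden layer of 8 nodes; varying hidden_layer_sizes mimics the structure study.
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error")
    print("cross-validated MSE per fold:", -scores)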
NASA Astrophysics Data System (ADS)
Carr, Michael J.; Gazel, Esteban
2017-04-01
We provide here an open version of Igpet software, called t-Igpet to emphasize its application for teaching and research in forward modeling of igneous geochemistry. There are three programs, a norm utility, a petrologic mixing program using least squares and Igpet, a graphics program that includes many forms of numerical modeling. Igpet is a multifaceted tool that provides the following basic capabilities: igneous rock identification using the IUGS (International Union of Geological Sciences) classification and several supplementary diagrams; tectonic discrimination diagrams; pseudo-quaternary projections; least squares fitting of lines, polynomials and hyperbolae; magma mixing using two endmembers, histograms, x-y plots, ternary plots and spider-diagrams. The advanced capabilities of Igpet are multi-element mixing and magma evolution modeling. Mixing models are particularly useful for understanding the isotopic variations in rock suites that evolved by mixing different sources. The important melting models include, batch melting, fractional melting and aggregated fractional melting. Crystallization models include equilibrium and fractional crystallization and AFC (assimilation and fractional crystallization). Theses, reports and proposals concerning igneous petrology are improved by numerical modeling. For reviewed publications some elements of modeling are practically a requirement. Our intention in providing this software is to facilitate improved communication and lower entry barriers to research, especially for students.
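As an example of the melting models listed above, the standard batch (equilibrium) melting relation for a trace element can be written directly. This is the generic textbook formula, not Igpet code, and the D and F values are illustrative:

    def batch_melting(c0, D, F):
        """Liquid concentration for batch melting: Cl = C0 / (D + F*(1 - D)),
        where D is the bulk partition coefficient and F is the melt fraction."""
        return c0 / (D + F * (1.0 - D))

    # An incompatible element (D = 0.01) is strongly enriched at low melt fractions:
    print(batch_melting(c0=10.0, D=0.01, F=0.05))  # roughly 168 ppm from a 10 ppm source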
NASA Technical Reports Server (NTRS)
Simmons, J.; Erlich, D.; Shockey, D.
2009-01-01
A team consisting of Arizona State University, Honeywell Engines, Systems & Services, the National Aeronautics and Space Administration Glenn Research Center, and SRI International collaborated to develop computational models and verification testing for designing and evaluating turbine engine fan blade fabric containment structures. This research was conducted under the Federal Aviation Administration Airworthiness Assurance Center of Excellence and was sponsored by the Aircraft Catastrophic Failure Prevention Program. The research was directed toward improving the modeling of a turbine engine fabric containment structure for an engine blade-out containment demonstration test required for certification of aircraft engines. The research conducted in Phase II began a new level of capability to design and develop fan blade containment systems for turbine engines. Significant progress was made in three areas: (1) further development of the ballistic fabric model to increase confidence and robustness in the material models for the Kevlar(TradeName) and Zylon(TradeName) material models developed in Phase I, (2) the capability was improved for finite element modeling of multiple layers of fabric using multiple layers of shell elements, and (3) large-scale simulations were performed. This report concentrates on the material model development and simulations of the impact tests.
NASA Astrophysics Data System (ADS)
Coyne, Kevin Anthony
The safe operation of complex systems such as nuclear power plants requires close coordination between the human operators and plant systems. In order to maintain an adequate level of safety following an accident or other off-normal event, the operators often are called upon to perform complex tasks during dynamic situations with incomplete information. The safety of such complex systems can be greatly improved if the conditions that could lead operators to make poor decisions and commit erroneous actions during these situations can be predicted and mitigated. The primary goal of this research project was the development and validation of a cognitive model capable of simulating nuclear plant operator decision-making during accident conditions. Dynamic probabilistic risk assessment methods can improve the prediction of human error events by providing rich contextual information and an explicit consideration of feedback arising from man-machine interactions. The Accident Dynamics Simulator paired with the Information, Decision, and Action in a Crew context cognitive model (ADS-IDAC) shows promise for predicting situational contexts that might lead to human error events, particularly knowledge driven errors of commission. ADS-IDAC generates a discrete dynamic event tree (DDET) by applying simple branching rules that reflect variations in crew responses to plant events and system status changes. Branches can be generated to simulate slow or fast procedure execution speed, skipping of procedure steps, reliance on memorized information, activation of mental beliefs, variations in control inputs, and equipment failures. Complex operator mental models of plant behavior that guide crew actions can be represented within the ADS-IDAC mental belief framework and used to identify situational contexts that may lead to human error events. This research increased the capabilities of ADS-IDAC in several key areas. The ADS-IDAC computer code was improved to support additional branching events and provide a better representation of the IDAC cognitive model. An operator decision-making engine capable of responding to dynamic changes in situational context was implemented. The IDAC human performance model was fully integrated with a detailed nuclear plant model in order to realistically simulate plant accident scenarios. Finally, the improved ADS-IDAC model was calibrated, validated, and updated using actual nuclear plant crew performance data. This research led to the following general conclusions: (1) A relatively small number of branching rules are capable of efficiently capturing a wide spectrum of crew-to-crew variabilities. (2) Compared to traditional static risk assessment methods, ADS-IDAC can provide a more realistic and integrated assessment of human error events by directly determining the effect of operator behaviors on plant thermal hydraulic parameters. (3) The ADS-IDAC approach provides an efficient framework for capturing actual operator performance data such as timing of operator actions, mental models, and decision-making activities.
Stakeholder approach for evaluating organizational change projects.
Peltokorpi, Antti; Alho, Antti; Kujala, Jaakko; Aitamurto, Johanna; Parvinen, Petri
2008-01-01
This paper aims to create a model for evaluating organizational change initiatives from a stakeholder resistance viewpoint. The paper presents a model to evaluate change projects and their expected benefits. Factors affecting the challenge to implement change were defined based on stakeholder theory literature. The authors test the model's practical validity for screening change initiatives to improve operating room productivity. Change initiatives can be evaluated using six factors: the effect of the planned intervention on stakeholders' actions and position; stakeholders' capability to influence the project's implementation; motivation to participate; capability to change; change complexity; and management capability. The presented model's generalizability should be explored by filtering presented factors through a larger number of historical cases operating in different healthcare contexts. The link between stakeholders, the change challenge and the outcomes of change projects needs to be empirically tested. The proposed model can be used to prioritize change projects, manage stakeholder resistance and establish a better organizational and professional competence for managing healthcare organization change projects. New insights into existing stakeholder-related understanding of change project successes are provided.
ERIC Educational Resources Information Center
Grier, Betsy Chesno; Bradley-Klug, Kathy L.
2011-01-01
Medical technology continues to improve, increasing life expectancies and capabilities of children with chronic illnesses and disabilities. Pediatric health issues have an impact on children's academic, emotional, behavioral, and social functioning. This article reviews a consultative Biopsychoeducational Model, based on a problem-solving process,…
USDA-ARS?s Scientific Manuscript database
Environmental modeling framework (EMF) design goals are multi-dimensional and often include many aspects of general software framework development. Many functional capabilities offered by current EMFs are closely related to interoperability and reuse aspects. For example, an EMF needs to support dev...
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Kabuchanga, Eric; Zavodsky, Bradley T.; Limaye, Ashutosh S.
2014-01-01
SPoRT/SERVIR/RCMRD/KMS Collaboration: Builds off strengths of each organization. SPoRT: Transition of satellite, modeling and verification capabilities; SERVIR-Africa/RCMRD: International capacity-building expertise; KMS: Operational organization with regional weather forecasting expertise in East Africa. Hypothesis: Improved land-surface initialization over Eastern Africa can lead to better temperature, moisture, and ultimately precipitation forecasts in NWP models. KMS currently initializes Weather Research and Forecasting (WRF) model with NCEP/Global Forecast System (GFS) model 0.5-deg initial / boundary condition data. LIS will provide much higher-resolution land-surface data at a scale more representative to regional WRF configuration. Future implementation of real-time NESDIS/VIIRS vegetation fraction to further improve land surface representativeness.
NASA Astrophysics Data System (ADS)
Toepfer, F.; Cortinas, J. V., Jr.; Kuo, W.; Tallapragada, V.; Stajner, I.; Nance, L. B.; Kelleher, K. E.; Firl, G.; Bernardet, L.
2017-12-01
NOAA develops, operates, and maintains an operational global modeling capability for weather, sub-seasonal and seasonal prediction for the protection of life and property and fostering the US economy. In order to substantially improve the overall performance and accelerate advancements of the operational modeling suite, NOAA is partnering with NCAR to design and build the Global Modeling Test Bed (GMTB). The GMTB has been established to provide a platform and a capability for researchers to contribute to the advancement primarily through the development of physical parameterizations needed to improve operational NWP. The strategy to achieve this goal relies on effectively leveraging global expertise through a modern collaborative software development framework. This framework consists of a repository of vetted and supported physical parameterizations known as the Common Community Physics Package (CCPP), a common well-documented interface known as the Interoperable Physics Driver (IPD) for combining schemes into suites and for their configuration and connection to dynamic cores, and an open evidence-based governance process for managing the development and evolution of CCPP. In addition, a physics test harness designed to work within this framework has been established in order to facilitate easier like-to-like comparison of physics advancements. This paper will present an overview of the design of the CCPP and test platform. Additionally, an overview of potential new opportunities for how physics developers can engage in the process, from implementing code for CCPP/IPD compliance to testing their development within an operational-like software environment, will be presented. In addition, insight will be given as to how development gets elevated to CCPP-supported status, the precursor to broad availability and use within operational NWP. An overview of how the GMTB can be expanded to support other global or regional modeling capabilities will also be presented.
Exhaust plumes and their interaction with missile airframes - A new viewpoint
NASA Technical Reports Server (NTRS)
Dash, S. M.; Sinha, N.
1992-01-01
The present, novel treatment of missile airframe-exhaust plume interactions emphasizes their simulation via a formal solution of the Reynolds-averaged Navier-Stokes (RNS) equation and is accordingly able to address the simulation requirements of novel missiles with nonconventional/integrated propulsion systems. The method is made possible by implicit RNS codes with improved artificial dissipation models, generalized geometric capabilities, and improved two-equation turbulence models, as well as by such codes' recent incorporation of plume thermochemistry and multiphase flow effects.
Telehealth and Indian healthcare: moving to scale and sustainability.
Carroll, Mark; Horton, Mark B
2013-05-01
Telehealth innovation has brought important improvements in access to quality healthcare for American Indian and Alaska Native communities. Despite these improvements, substantive work remains before telehealth capability can be more available and sustainable across Indian healthcare. Some of this work will rely on system change guided by new care model development. Such care model development depends on expansion of telehealth reimbursement. The U.S. Indian healthcare system is an ideal framework for implementing and evaluating large-scale change in U.S. telehealth reimbursement policy.
Strategies for using remotely sensed data in hydrologic models
NASA Technical Reports Server (NTRS)
Peck, E. L.; Keefer, T. N.; Johnson, E. R. (Principal Investigator)
1981-01-01
Present and planned remote sensing capabilities were evaluated. The usefulness of six remote sensing capabilities (soil moisture, land cover, impervious area, areal extent of snow cover, areal extent of frozen ground, and water equivalent of the snow cover) with seven hydrologic models (API, CREAMS, NWSRFS, STORM, STANFORD, SSARR, and NWSRFS Snowmelt) were reviewed. The results indicate remote sensing information has only limited value for use with the hydrologic models in their present form. With minor modifications to the models the usefulness would be enhanced. Specific recommendations are made for incorporating snow covered area measurements in the NWSRFS Snowmelt model. Recommendations are also made for incorporating soil moisture measurements in NWSRFS. Suggestions are made for incorporating snow covered area, soil moisture, and others in STORM and SSARR. General characteristics of a hydrologic model needed to make maximum use of remotely sensed data are discussed. Suggested goals for improvements in remote sensing for use in models are also established.
Reduced and Validated Kinetic Mechanisms for Hydrogen-CO-Air Combustion in Gas Turbines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yiguang Ju; Frederick Dryer
2009-02-07
Rigorous experimental, theoretical, and numerical investigation of various issues relevant to the development of reduced, validated kinetic mechanisms for synthetic gas combustion in gas turbines was carried out - including the construction of new radiation models for combusting flows, improvement of flame speed measurement techniques, measurements and chemical kinetic analysis of H2/CO/CO2/O2/diluent mixtures, revision of the H2/O2 kinetic model to improve flame speed prediction capabilities, and development of a multi-time scale algorithm to improve computational efficiency in reacting flow simulations.
Detection and Attribution of Regional Climate Change
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bala, G; Mirin, A
2007-01-19
We developed a high resolution global coupled modeling capability to perform breakthrough studies of regional climate change. The atmospheric component in our simulation uses a 1° latitude x 1.25° longitude grid, which is the finest resolution ever used for the NCAR coupled climate model CCSM3. Substantial testing and slight retuning was required to get an acceptable control simulation. The major accomplishment is the validation of this new high resolution configuration of CCSM3. There are major improvements in our simulation of the surface wind stress and sea ice thickness distribution in the Arctic. Surface wind stress and ocean circulation in the Antarctic Circumpolar Current are also improved. Our results demonstrate that the FV version of the CCSM coupled model is a state of the art climate model whose simulation capabilities are in the class of those used for IPCC assessments. We have also provided 1000 years of model data to Scripps Institution of Oceanography to estimate the natural variability of stream flow in California. In the future, our global model simulations will provide boundary data to a high-resolution mesoscale model that will be used at LLNL. The mesoscale model would dynamically downscale the GCM climate to regional scale on climate time scales.
High fidelity studies of exploding foil initiator bridges, Part 3: ALEGRA MHD simulations
NASA Astrophysics Data System (ADS)
Neal, William; Garasi, Christopher
2017-01-01
Simulations of high voltage detonators, such as Exploding Bridgewire (EBW) and Exploding Foil Initiators (EFI), have historically been simple, often empirical, one-dimensional models capable of predicting parameters such as current, voltage, and in the case of EFIs, flyer velocity. Experimental methods have correspondingly generally been limited to the same parameters. With the advent of complex, first principles magnetohydrodynamic codes such as ALEGRA and ALE-MHD, it is now possible to simulate these components in three dimensions, and predict a much greater range of parameters than before. A significant improvement in experimental capability was therefore required to ensure these simulations could be adequately verified. In this third paper of a three part study, the experimental results presented in part 2 are compared against 3-dimensional MHD simulations. This improved experimental capability, along with advanced simulations, offers an opportunity to gain a greater understanding of the processes behind the functioning of EBW and EFI detonators.
Advances in computer-aided well-test interpretation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horne, R.N.
1994-07-01
Despite the feeling, expressed several times over the past 40 years, that well-test analysis had reached its peak development, an examination of recent advances shows continuous expansion in capability, with future improvement likely. The expansion in interpretation capability over the past decade arose mainly from the development of computer-aided techniques, which, although introduced 20 years ago, have come into use only recently. The broad application of computer-aided interpretation originated with the improvement of the methodologies and continued with the expansion in computer access and capability that accompanied the explosive development of the microcomputer industry. This paper focuses on the different pieces of the methodology that combine to constitute a computer-aided interpretation and attempts to compare some of the approaches currently used. Future directions of the approach are also discussed. The separate areas discussed are deconvolution, pressure derivatives, model recognition, nonlinear regression, and confidence intervals.
NASA Technical Reports Server (NTRS)
Esgar, J. B.; Sokolowski, Daniel E.
1989-01-01
The Hot Section Technology (HOST) Project, which was initiated by NASA Lewis Research Center in 1980 and concluded in 1987, was aimed at improving advanced aircraft engine hot section durability through better technical understanding and more accurate design analysis capability. The project was a multidisciplinary, multiorganizational, focused research effort that involved 21 organizations and 70 research and technology activities and generated approximately 250 research reports. No major hardware was developed. To evaluate whether HOST had a significant impact on the overall aircraft engine industry in the development of new engines, interviews were conducted with 41 participants in the project to obtain their views. The summarized results of these interviews are presented. Emphasis is placed on results relative to three-dimensional inelastic structural analysis, thermomechanical fatigue testing, constitutive modeling, combustor aerothermal modeling, turbine heat transfer, protective coatings, computer codes, improved engine design capability, reduced engine development costs, and the impacts on technology transfer and the industry-government partnership.
Decision - making of Direct Customers Based on Available Transfer Capability
NASA Astrophysics Data System (ADS)
Quan, Tang; Zhaohang, Lin; Huaqiang, Li
2017-05-01
Direct power purchasing by large customers is a hot spot in the electricity market reform. In this paper, the author established an Available Transfer Capability (ATC) model that takes uncertain factors into account, applied the model to large customer direct-power-purchasing transactions, and improved the reliability of power supply during direct power purchasing by introducing insurance theory. The customers' losses from power interruptions were also considered when building the ATC model. A large customer decision model was then established that takes the quantities of power purchased from different power plants and the reserved capacity insurance as decision variables, targets minimum power interruption loss as the optimization goal, and is solved with a particle swarm algorithm to produce the optimal power purchasing decision for large customers. Finally, simulations on the IEEE 57-bus system proved that the method is effective.
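A bare-bones particle swarm update of the kind applied to the purchasing decision is sketched below; the objective function is a placeholder standing in for the interruption-loss-plus-cost model, not the authors' formulation, and the swarm parameters are generic defaults:

    import numpy as np

    def pso_minimize(objective, dim, n_particles=20, iters=30, w=0.7, c1=1.5, c2=1.5, seed=0):
        """Generic particle swarm minimization over the unit hypercube."""
        rng = np.random.default_rng(seed)
        x = rng.random((n_particles, dim))          # candidate purchase quantities (normalized)
        v = np.zeros_like(x)
        pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
        gbest = pbest[pbest_val.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, 0.0, 1.0)
            vals = np.array([objective(p) for p in x])
            improved = vals < pbest_val
            pbest[improved], pbest_val[improved] = x[improved], vals[improved]
            gbest = pbest[pbest_val.argmin()].copy()
        return gbest, pbest_val.min()

    # Placeholder objective standing in for expected interruption loss plus purchase cost.
    print(pso_minimize(lambda q: np.sum((q - 0.3) ** 2), dim=3))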
NASA Technical Reports Server (NTRS)
Perkins, Hugh Douglas
2010-01-01
In order to improve the understanding of particle vitiation effects in hypersonic propulsion test facilities, a quasi-one dimensional numerical tool was developed to efficiently model reacting particle-gas flows over a wide range of conditions. Features of this code include gas-phase finite-rate kinetics, a global porous-particle combustion model, mass, momentum and energy interactions between phases, and subsonic and supersonic particle drag and heat transfer models. The basic capabilities of this tool were validated against available data or other validated codes. To demonstrate the capabilities of the code a series of computations were performed for a model hypersonic propulsion test facility and scramjet. Parameters studied were simulated flight Mach number, particle size, particle mass fraction and particle material.
NASA Technical Reports Server (NTRS)
Murray, John; Vernier, Jean-Paul; Fairlie, T. Duncan; Pavolonis, Michael; Krotkov, Nickolay A.; Lindsay, Francis; Haynes, John
2013-01-01
Although significant progress has been made in recent years, estimating volcanic ash concentration for the full extent of the airspace affected by volcanic ash remains a challenge. No single satellite, airborne or ground observing system currently exists which can sufficiently inform dispersion models to provide the degree of accuracy required to use them with a high degree of confidence for routing aircraft in and near volcanic ash. Toward this end, the detection and characterization of volcanic ash in the atmosphere may be substantially improved by integrating a wider array of observing systems with advancements in trajectory and dispersion modeling. The qualitative aspect of this effort has advanced significantly in the past decade due to the increase of highly complementary observational and model data currently available. Satellite observations, especially when coupled with trajectory and dispersion models, can provide a very accurate picture of the 3-dimensional location of ash clouds. Accurate estimates of the mass loading at various locations throughout the plume, however, while improving, remain elusive. This paper examines the capabilities of various satellite observation systems and postulates that model-based volcanic ash concentration maps and forecasts might be significantly improved if the various extant satellite capabilities are used together with independent, accurate mass loading data from other observing systems available to calibrate (tune) ash concentration retrievals from the satellite systems.
Simulation study of a new inverse-pinch high Coulomb transfer switch
NASA Technical Reports Server (NTRS)
Choi, S. H.
1984-01-01
A simulation study of a simplified model of a high coulomb transfer switch is performed. The switch operates in an inverse pinch geometry formed by an all metal chamber, which greatly reduces hot spot formation on the electrode surfaces. Advantages of the switch over conventional switches are longer useful life, higher current capability and lower inductance, which improve the characteristics required for a high repetition rate switch. The simulation determines the design parameters by analytical computations and comparison with the experimentally measured risetime, current handling capability, electrode damage, and hold-off voltages. The parameters of the initial switch design can be determined for the anticipated switch performance. Results are in agreement with the experimental results. Although the model is simplified, the switch characteristics such as risetime, current handling capability, electrode damage, and hold-off voltages are accurately determined.
MODULES FOR EXPERIMENTS IN STELLAR ASTROPHYSICS (MESA): BINARIES, PULSATIONS, AND EXPLOSIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paxton, Bill; Bildsten, Lars; Cantiello, Matteo
We substantially update the capabilities of the open-source software instrument Modules for Experiments in Stellar Astrophysics (MESA). MESA can now simultaneously evolve an interacting pair of differentially rotating stars undergoing transfer and loss of mass and angular momentum, greatly enhancing the prior ability to model binary evolution. New MESA capabilities in fully coupled calculation of nuclear networks with hundreds of isotopes now allow MESA to accurately simulate the advanced burning stages needed to construct supernova progenitor models. Implicit hydrodynamics with shocks can now be treated with MESA, enabling modeling of the entire massive star lifecycle, from pre-main-sequence evolution to the onset of core collapse and nucleosynthesis from the resulting explosion. Coupling of the GYRE non-adiabatic pulsation instrument with MESA allows for new explorations of the instability strips for massive stars while also accelerating the astrophysical use of asteroseismology data. We improve the treatment of mass accretion, giving more accurate and robust near-surface profiles. A new MESA capability to calculate weak reaction rates “on-the-fly” from input nuclear data allows better simulation of accretion induced collapse of massive white dwarfs and the fate of some massive stars. We discuss the ongoing challenge of chemical diffusion in the strongly coupled plasma regime, and exhibit improvements in MESA that now allow for the simulation of radiative levitation of heavy elements in hot stars. We close by noting that the MESA software infrastructure provides bit-for-bit consistency for all results across all the supported platforms, a profound enabling capability for accelerating MESA's development.
Initialization and Setup of the Coastal Model Test Bed: STWAVE
2017-01-01
Laboratory (CHL) Field Research Facility (FRF) in Duck, NC. The improved evaluation methodology will promote rapid enhancement of model capability and focus... Blanton 2008) study. This regional digital elevation model (DEM), with a cell size of 10 m, was generated from numerous datasets collected at different...
NASA Technical Reports Server (NTRS)
1979-01-01
Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support design and implementation of coal based synthetic fuels complexes are identified. The potential market in the Southeast United States for coal based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for development of technology and data needed to improve gasification feasibility and economies are examined.
Enhancing the LVRT Capability of PMSG-Based Wind Turbines Based on R-SFCL
NASA Astrophysics Data System (ADS)
Xu, Lin; Lin, Ruixing; Ding, Lijie; Huang, Chunjun
2018-03-01
A novel low voltage ride-through (LVRT) scheme for PMSG-based wind turbines based on the Resistor Superconducting Fault Current Limiter (R-SFCL) is proposed in this paper. The LVRT scheme is mainly formed by an R-SFCL in series between the transformer and the Grid Side Converter (GSC), and the basic modelling is discussed in detail. The proposed LVRT scheme is implemented to interact with a PMSG model in PSCAD/EMTDC under a three-phase short circuit fault condition, which shows that the proposed scheme based on the R-SFCL can improve the transient performance and LVRT capability, thereby strengthening the grid connection of wind turbines.
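In simplified terms, the quenched R-SFCL adds a series resistance that reduces the fault current seen by the converter. The sketch below is an idealized single-phase estimate with placeholder impedance and voltage values, not the PSCAD/EMTDC model:

    def fault_current_rms(v_rms, grid_impedance_ohm, r_sfcl_ohm=0.0):
        """Idealized fault current magnitude with the R-SFCL resistance inserted in series."""
        return v_rms / abs(grid_impedance_ohm + r_sfcl_ohm)

    v = 690.0 / 3 ** 0.5                                 # placeholder phase voltage (V)
    print(fault_current_rms(v, 0.05))                    # without the limiter
    print(fault_current_rms(v, 0.05, r_sfcl_ohm=0.5))    # with the quenched R-SFCL inserted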
NASA Technical Reports Server (NTRS)
Crisp, David; Komar, George (Technical Monitor)
2001-01-01
Advancement of our predictive capabilities will require new scientific knowledge, improvement of our modeling capabilities, and new observation strategies to generate the complex data sets needed by coupled modeling networks. New observation strategies must support remote sensing from a variety of vantage points and will include "sensorwebs" of small satellites in low Earth orbit, large aperture sensors in Geostationary orbits, and sentinel satellites at L1 and L2 to provide day/night views of the entire globe. Onboard data processing and high speed computing and communications will enable near real-time tailoring and delivery of information products (i.e., predictions) directly to users.
Ground Contact Modeling for the Morpheus Test Vehicle Simulation
NASA Technical Reports Server (NTRS)
Cordova, Luis
2014-01-01
The Morpheus vertical test vehicle is an autonomous robotic lander being developed at Johnson Space Center (JSC) to test hazard detection technology. Because the initial ground contact simulation model was not very realistic, it was decided to improve the model without making it too computationally expensive. The first development cycle added capability to define vehicle attachment points (AP) and to keep track of their states in the lander reference frame (LFRAME). These states are used with a spring damper model to compute an AP contact force. The lateral force is then overwritten, if necessary, by the Coulomb static or kinetic friction force. The second development cycle added capability to use the PolySurface class as the contact surface. The class can load CAD data in STL (Stereo Lithography) format, and use the data to compute line of sight (LOS) intercepts. A polygon frame (PFRAME) is computed from the facet intercept normal and used to convert the AP state to PFRAME. Three flat plane tests validate the transitions from kinetic to static, static to kinetic, and vertical impact. The hazardous terrain test will be used to test for visual reasonableness. The improved model is numerically inexpensive, robust, and produces results that are reasonable.
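A simplified version of the spring-damper normal force with a Coulomb friction cap on the lateral force, as described above; the gains and friction coefficients are hypothetical, not the Morpheus values:

    def contact_force(penetration, penetration_rate, lateral_force_cmd,
                      k=5.0e4, c=2.0e3, mu_static=0.8, mu_kinetic=0.6, sliding=False):
        """Normal force from a spring-damper; lateral force clamped by Coulomb friction."""
        normal = max(0.0, k * penetration + c * penetration_rate)
        mu = mu_kinetic if sliding else mu_static
        lateral = max(-mu * normal, min(mu * normal, lateral_force_cmd))
        return normal, lateral

    # Example: 1 cm penetration closing at 0.2 m/s, with a 900 N lateral force request.
    print(contact_force(penetration=0.01, penetration_rate=0.2, lateral_force_cmd=900.0))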
Process improvement as an investment: Measuring its worth
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Jeletic, Kellyann
1993-01-01
This paper discusses return on investment (ROI) generated from software process improvement programs. It details the steps needed to compute ROI and compares these steps from the perspective of two process improvement approaches: the widely known Software Engineering Institute's capability maturity model and the approach employed by NASA's Software Engineering Laboratory (SEL). The paper then describes the specific investments made in the SEL over the past 18 years and discusses the improvements gained from this investment by the production organization in the SEL.
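In its simplest form, the ROI computation discussed here reduces to the familiar ratio of net benefit to cost; the numbers below are illustrative, not SEL data:

    def return_on_investment(benefit, cost):
        """Simple ROI: net benefit divided by the cost of the improvement program."""
        return (benefit - cost) / cost

    # Hypothetical example: $1.5M of measured savings against a $0.5M process investment.
    print(return_on_investment(benefit=1.5e6, cost=0.5e6))  # 2.0, i.e., 200% ROI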
Nuclear Power Plant Mechanical Component Flooding Fragility Experiments Status
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, C. L.; Savage, B.; Johnson, B.
This report describes progress on Nuclear Power Plant mechanical component flooding fragility experiments and supporting research. The progress includes execution of full scale fragility experiments using hollow-core doors, design of improvements to the Portal Evaluation Tank, equipment procurement and initial installation of PET improvements, designation of experiments exploiting the improved PET capabilities, fragility mathematical model development, Smoothed Particle Hydrodynamic simulations, wave impact simulation device research, and pipe rupture mechanics research.
Hypertext: Improved Capability for Shipboard Naval Messages
1989-09-01
message handling system; a complete working model of the system has not been developed. The "Paperless" Ship Initiative... work in tandem to improve afloat message handling procedures. The objective of the PCMT project is to develop a system that could be installed on... working group has identified a list of requirements to guide the DoD's progress towards improving its message communication system.
NASA Astrophysics Data System (ADS)
von Hillebrandt-Andrade, C.; Huerfano Moreno, V. A.; McNamara, D. E.; Saurel, J. M.
2014-12-01
The magnitude-9.3 Sumatra-Andaman Islands earthquake of December 26, 2004, increased global awareness of the destructive hazard of earthquakes and tsunamis. Post-event assessments of global coastline vulnerability highlighted the Caribbean as a region of high hazard and risk that was poorly monitored. Nearly 100 tsunamis have been reported for the Caribbean and adjacent regions in the past 500 years, and they continue to pose a threat to its nations, coastal areas along the Gulf of Mexico, and the Atlantic seaboard of North and South America. Significant efforts to improve monitoring capabilities have been undertaken since this time, including an expansion of the United States Geological Survey (USGS) Global Seismographic Network (GSN) (McNamara et al., 2006) and establishment of the United Nations Educational, Scientific and Cultural Organization (UNESCO) Intergovernmental Coordination Group (ICG) for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS). The minimum performance standards it recommended for initial earthquake locations include: 1) earthquake detection within 1 minute, 2) minimum magnitude threshold = M4.5, and 3) initial hypocenter error of <30 km. In this study, we assess current compliance with these performance standards and model improvements in earthquake and tsunami monitoring capabilities in the Caribbean region since the first meeting of the UNESCO ICG-CARIBE EWS in 2006. The three measures of network capability modeled in this study are: 1) minimum Mw detection threshold; 2) P-wave detection time of an automatic processing system; and 3) theoretical earthquake location uncertainty. By modeling these three measures of seismic network capability, we can optimize the distribution of ICG-CARIBE EWS seismic stations and select an international network contributed from existing real-time broadband national networks in the region. Sea level monitoring improvements both offshore and along the coast will also be addressed. With the support of Member States and other countries and organizations it has been possible to significantly expand the sea level network, thus reducing the amount of time it now takes to verify tsunamis.
Computer-aided communication satellite system analysis and optimization
NASA Technical Reports Server (NTRS)
Stagl, T. W.; Morgan, N. H.; Morley, R. E.; Singh, J. P.
1973-01-01
The capabilities and limitations of the various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. A Satellite Telecommunication Analysis and Modeling Program (STAMP) for costing and sensitivity analysis work in the application of communication satellites to educational development is given. The modifications made to STAMP include: extension of the six-beam capability to eight; addition of generation of multiple beams from a single reflector system with an array of feeds; an improved system costing to reflect the time value of money and growth in earth terminal population with time, and to account for various measures of system reliability; inclusion of a model for scintillation at microwave frequencies in the communication link loss model; and an updated technological environment.
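The time-value-of-money extension mentioned in the costing model amounts to discounting yearly costs to present value, as in this sketch; the discount rate and cash flows are placeholders, not STAMP outputs:

    def present_value(cash_flows, discount_rate):
        """Discount a list of yearly costs (year 0 first) to present value."""
        return sum(c / (1.0 + discount_rate) ** t for t, c in enumerate(cash_flows))

    # Hypothetical 5-year earth-terminal cost stream (arbitrary units) discounted at 8%.
    print(present_value([10.0, 4.0, 4.0, 4.0, 4.0], discount_rate=0.08))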
NASA Technical Reports Server (NTRS)
Schallhorn, Paul; Majumdar, Alok; Tiller, Bruce
2001-01-01
A general purpose, one-dimensional fluid flow code is currently being interfaced with the thermal analysis program SINDA/G. The flow code, GFSSP, is capable of analyzing steady state and transient flow in a complex network. The flow code is capable of modeling several physical phenomena, including compressibility effects, phase changes, body forces (such as gravity and centrifugal) and mixture thermodynamics for multiple species. The addition of GFSSP to SINDA/G provides a significant improvement in convective heat transfer modeling for SINDA/G. The interface development is conducted in multiple phases. This paper describes the first phase of the interface, which allows for steady and quasi-steady (unsteady solid, steady fluid) conjugate heat transfer modeling.
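The convective coupling that such an interface provides can be pictured as Newton's law of cooling applied at each wall/fluid node pair; the sketch below is a generic illustration with hypothetical values, not GFSSP or SINDA/G code:

    def convective_heat_rate(h_w_per_m2_k, area_m2, t_wall_k, t_fluid_k):
        """q = h * A * (T_wall - T_fluid): heat flow from the solid node into the fluid node."""
        return h_w_per_m2_k * area_m2 * (t_wall_k - t_fluid_k)

    # Hypothetical node pair: h = 250 W/m^2-K, A = 0.02 m^2, wall at 350 K, fluid at 300 K.
    print(convective_heat_rate(250.0, 0.02, 350.0, 300.0))  # 250 W transferred to the fluid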
The ASAC Air Carrier Investment Model (Third Generation)
NASA Technical Reports Server (NTRS)
Wingrove, Earl R., III; Gaier, Eric M.; Santmire, Tara E.
1998-01-01
To meet its objective of assisting the U.S. aviation industry with the technological challenges of the future, NASA must identify research areas that have the greatest potential for improving the operation of the air transportation system. To accomplish this, NASA is building an Aviation System Analysis Capability (ASAC). The ASAC differs from previous NASA modeling efforts in that the economic behavior of buyers and sellers in the air transportation and aviation industries is central to its conception. To link the economics of flight with the technology of flight, ASAC requires a parametrically based model with extensions that link airline operations and investments in aircraft with aircraft characteristics. This model also must provide a mechanism for incorporating air travel demand and profitability factors into the airlines' investment decisions. Finally, the model must be flexible and capable of being incorporated into a wide-ranging suite of economic and technical models that are envisioned for ASAC.
Climbing the ladder: capability maturity model integration level 3
NASA Astrophysics Data System (ADS)
Day, Bryce; Lutteroth, Christof
2011-02-01
This article details the attempt to form a complete workflow model for an information and communication technologies (ICT) company in order to achieve a Capability Maturity Model Integration (CMMI) maturity rating of 3. During this project, business processes across the company's core and auxiliary sectors were documented and extended using modern enterprise modelling tools and The Open Group Architecture Framework (TOGAF) methodology. Different challenges were encountered with regard to process customisation and tool support for enterprise modelling. In particular, there were problems with the reuse of process models, the integration of different project management methodologies and the integration of the Rational Unified Process development process framework that had to be solved. We report on these challenges and the perceived effects of the project on the company. Finally, we point out research directions that could help to improve the situation in the future.
NASA Astrophysics Data System (ADS)
Li, W. W.; Du, Z. Z.; Yuan, R. m.; Xiong, D. Z.; Shi, E. W.; Lu, G. N.; Dai, Z. Y.; Chen, X. Q.; Jiang, Z. Y.; Lv, Y. G.
2017-10-01
The smart meter represents the future development direction of the energy-saving smart grid. The load switch, one of the core parts of the smart meter, must offer high reliability, safety, and the capability to withstand the limit short-circuit current. To this end, this paper discusses a fast, iteration-free simulation of the relationship between the electromagnetic attraction and the counterforce of the load switch, establishes a dual response surface model of attraction and counterforce, and optimizes the design scheme of the load switch for the charge-control smart meter, thereby increasing both the electromagnetic attraction and the spring counterforce. In this way, the paper puts forward a method to improve the switch's capacity to withstand the limit short-circuit current.
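A response surface model of the kind mentioned is typically a low-order polynomial fitted to sampled design points by least squares. A minimal sketch follows, with two hypothetical design variables standing in for the switch's actual design parameters; one such surface would be fitted for the attraction and another for the counterforce.

```python
import numpy as np

# Illustrative quadratic response surface
#   y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2,
# fitted by least squares. x1 and x2 are hypothetical design variables
# (e.g., air gap and spring preload), not the paper's actual parameters.

def fit_quadratic_surface(x1, x2, y):
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

def eval_surface(coeffs, x1, x2):
    return (coeffs[0] + coeffs[1]*x1 + coeffs[2]*x2
            + coeffs[3]*x1**2 + coeffs[4]*x2**2 + coeffs[5]*x1*x2)
```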
Limiting the immediate and subsequent hazards associated with wildfires
DeGraff, Jerome V.; Cannon, Susan H.; Parise, Mario
2013-01-01
Similarly, our capability to limit impacts from post-fire debris flows is improving. Empirical models for estimating the probability of debris-flow occurrence, the volume of such an event, and the extent of the inundated area, linked with improved definitions of the rainfall conditions that trigger debris flows, can be used to provide critical information for post-fire hazard mitigation and emergency-response planning.
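Empirical post-fire debris-flow probability models of this general kind are commonly logistic regressions on basin, burn, and rainfall predictors. The sketch below shows only the logistic form; the predictors and coefficients are placeholders, not the published models.

```python
import math

# Illustrative logistic form P = 1 / (1 + exp(-z)). The predictors and the
# coefficients b0..b3 are hypothetical, chosen only to show the structure.
def debris_flow_probability(burned_area_fraction, mean_slope_deg,
                            storm_intensity_mm_per_h,
                            b0=-3.0, b1=2.0, b2=0.05, b3=0.08):
    z = (b0 + b1 * burned_area_fraction + b2 * mean_slope_deg
         + b3 * storm_intensity_mm_per_h)
    return 1.0 / (1.0 + math.exp(-z))

print(debris_flow_probability(0.6, 25.0, 20.0))
```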
DOT National Transportation Integrated Search
2011-01-01
To support improved analysis of the environmental impacts of proposed global aircraft operational changes, the United States Federal Aviation Administration recently worked with European academic partners to update the airport terminal area fuel co...
FY16 Analysis report: Financial systems dependency on communications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beyeler, Walter E.
Within the Department of Homeland Security (DHS), the Office of Cyber and Infrastructure Analysis (OCIA)'s National Infrastructure Simulation and Analysis Center (NISAC) develops capabilities to support the DHS mission and the resilience of the Nation’s critical infrastructure. At Sandia National Laboratories, under DHS/OCIA direction, NISAC is developing models of financial sector dependence on communications. This capability is designed to improve DHS's ability to assess potential impacts of communication disruptions to major financial services and the effectiveness of possible mitigations. This report summarizes findings and recommendations from the application of that capability as part of the FY2016 NISAC program plan.
ENHANCED STREAM WATER QUALITY MODELS QUAL2E AND QUAL2E-UNCAS: DOCUMENTATION AND USER MANUAL
The manual is a major revision of the original QUAL2E program documentation released in 1985. It includes a description of the recent modifications and improvements to the widely used water quality models QUAL-II and QUAL2E. The enhancements include an extensive capability for un...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonzalez Galdamez, Rinaldo A.; Recknagle, Kurtis P.
2012-04-30
This report provides an overview of the work performed for Solid Oxide Fuel Cell (SOFC) modeling during the 2012 Winter/Spring Science Undergraduate Laboratory Internship at Pacific Northwest National Laboratory (PNNL). A brief introduction on the concept, operation basics and applications of fuel cells is given for the general audience. Further details are given regarding the modifications and improvements of the Distributed Electrochemistry (DEC) Modeling tool developed by PNNL engineers to model SOFC long term performance. Within this analysis, a literature review on anode degradation mechanisms is explained and future plans of implementing these into the DEC modeling tool are also proposed.
Pyrolysis Model Development for a Multilayer Floor Covering
McKinnon, Mark B.; Stoliarov, Stanislav I.
2015-01-01
Comprehensive pyrolysis models that are integral to computational fire codes have improved significantly over the past decade as the demand for improved predictive capabilities has increased. High fidelity pyrolysis models may improve the design of engineered materials for better fire response, the design of the built environment, and may be used in forensic investigations of fire events. A major limitation to widespread use of comprehensive pyrolysis models is the large number of parameters required to fully define a material and the lack of effective methodologies for measurement of these parameters, especially for complex materials. The work presented here details a methodology used to characterize the pyrolysis of a low-pile carpet tile, an engineered composite material that is common in commercial and institutional occupancies. The studied material includes three distinct layers of varying composition and physical structure. The methodology utilized a comprehensive pyrolysis model (ThermaKin) to conduct inverse analyses on data collected through several experimental techniques. Each layer of the composite was individually parameterized to identify its contribution to the overall response of the composite. The set of properties measured to define the carpet composite were validated against mass loss rate curves collected at conditions outside the range of calibration conditions to demonstrate the predictive capabilities of the model. The mean error between the predicted curve and the mean experimental mass loss rate curve was calculated as approximately 20% on average for heat fluxes ranging from 30 to 70 kW·m−2, which is within the mean experimental uncertainty. PMID:28793556
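The validation statistic quoted (a mean error of roughly 20% between predicted and measured mass loss rate curves) can be computed as a mean relative difference over a common time grid. A minimal sketch, not ThermaKin's own post-processing, is shown below.

```python
import numpy as np

def mean_relative_error(predicted, measured, eps=1e-12):
    """Mean |predicted - measured| / measured over a mass-loss-rate curve,
    with both curves sampled on the same time grid."""
    predicted = np.asarray(predicted, dtype=float)
    measured = np.asarray(measured, dtype=float)
    return float(np.mean(np.abs(predicted - measured) / (measured + eps)))
```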
Impact of using scatterometer and altimeter data on storm surge forecasting
NASA Astrophysics Data System (ADS)
Bajo, Marco; De Biasio, Francesco; Umgiesser, Georg; Vignudelli, Stefano; Zecchetto, Stefano
2017-05-01
Satellite data are rarely used in storm surge models because of the lack of established methodologies. Nevertheless, they can provide useful information on surface wind and sea level, which can potentially improve the forecast. In this paper satellite wind data are used to correct the bias of wind originating from a global atmospheric model, while satellite sea level data are used to improve the initial conditions of the model simulations. In a first step, the capability of global winds (biased and unbiased) to adequately force a storm surge model are assessed against that of a high resolution local wind. Then, the added value of direct assimilation of satellite altimeter data in the storm surge model is tested. Eleven storm surge events, recorded in Venice from 2008 to 2012, are simulated using different configurations of wind forcing and altimeter data assimilation. Focusing on the maximum surge peak, results show that the relative error, averaged over the eleven cases considered, decreases from 13% to 7%, using both the unbiased wind and assimilating the altimeter data, while, if the high resolution local wind is used to force the hydrodynamic model, the altimeter data assimilation reduces the error from 9% to 6%. Yet, the overall capabilities in reproducing the surge in the first day of forecast, measured by the correlation and by the rms error, improve only with the use of the unbiased global wind and not with the use of high resolution local wind and altimeter data assimilation.
Capturing the Energy Absorbing Mechanisms of Composite Structures under Crash Loading
NASA Astrophysics Data System (ADS)
Wade, Bonnie
As fiber reinforced composite material systems become increasingly utilized in primary aircraft and automotive structures, the need to understand their contribution to the crashworthiness of the structure is of great interest to meet safety certification requirements. The energy absorbing behavior of a composite structure, however, is not easily predicted due to the great complexity of the failure mechanisms that occur within the material. Challenges arise both in the experimental characterization and in the numerical modeling of the material/structure combination. At present, there is no standardized test method to characterize the energy absorbing capability of composite materials to aid crashworthy structural design. In addition, although many commercial finite element analysis codes exist and offer a means to simulate composite failure initiation and propagation, these models are still under development and refinement. As more metallic structures are replaced by composite structures, the need for both experimental guidelines to characterize the energy absorbing capability of a composite structure and guidelines for using numerical tools to simulate composite materials in crash conditions has become a critical matter. This body of research addresses both the experimental characterization of the energy absorption mechanisms occurring in composite materials during crushing and the numerical simulation of composite materials undergoing crushing. In the experimental investigation, the specific energy absorption (SEA) of a composite material system is measured using a variety of test element geometries, such as corrugated plates and tubes. Results from several crush experiments reveal that SEA is not a constant material property for laminated composites, and varies significantly with the geometry of the test specimen used. The variation of SEA measured for a single material system requires that crush test data be generated for a range of different test geometries in order to define the range of its energy absorption capability. Further investigation from the crush tests has led to the development of a direct link between geometric features of the crush specimen and its resulting SEA. Through micrographic analysis, distinct failure modes are shown to be guided by the geometry of the specimen, and subsequently are shown to directly influence energy absorption. A new relationship between geometry, failure mode, and SEA has been developed. This relationship has allowed for the reduction of the element-level crush testing requirement to characterize the composite material energy absorption capability. In the numerical investigation, the LS-DYNA composite material model MAT54 is selected for its suitability to model composite materials beyond failure determination, as required by crush simulation, and its capability to remain within the scope of ultimately using this model for large-scale crash simulation. As a result of this research, this model has been thoroughly investigated for its capacity to simulate composite materials in crush, and results from several simulations of the element-level crush experiments are presented. A modeling strategy has been developed to use MAT54 for crush simulation, which involves using the experimental data collected from the coupon- and element-level crush tests to directly calibrate the crush damage parameter in MAT54 such that it may be used in higher-level simulations.
In addition, the source code of the material model is modified to improve upon its capability. The modifications include improving the elastic definition such that the elastic response to multi-axial load cases can be accurately portrayed simultaneously in each element, which is a capability not present in other composite material models. Modifications made to the failure determination and post-failure model have newly emphasized the post-failure stress degradation scheme rather than the failure criterion which is traditionally considered the most important composite material model definition for crush simulation. The modification efforts have also validated the use of the MAT54 failure criterion and post-failure model for crash modeling when its capabilities and limitations are well understood, and for this reason guidelines for using MAT54 for composite crush simulation are presented. This research has effectively (a) developed and demonstrated a procedure that defines a set of experimental crush results that characterize the energy absorption capability of a composite material system, (b) used the experimental results in the development and refinement of a composite material model for crush simulation, (c) explored modifying the material model to improve its use in crush modeling, and (d) provided experimental and modeling guidelines for composite structures under crush at the element-level in the scope of the Building Block Approach.
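For reference, specific energy absorption is conventionally the energy absorbed per unit mass of crushed material, obtained by integrating the crush load over the stroke and dividing by the crushed mass. A minimal sketch of that calculation (not the thesis' own data-reduction code) follows.

```python
import numpy as np

def specific_energy_absorption(load_N, displacement_m, crushed_mass_kg):
    """SEA in J/kg: area under the load-displacement curve divided by the
    mass of material crushed over that stroke."""
    energy_J = np.trapz(load_N, displacement_m)
    return energy_J / crushed_mass_kg
```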
Genetically Engineered Humanized Mouse Models for Preclinical Antibody Studies
Proetzel, Gabriele; Wiles, Michael V.; Roopenian, Derry C.
2015-01-01
The use of genetic engineering has vastly improved our capabilities to create animal models relevant in preclinical research. With the recent advances in gene-editing technologies, it is now possible to very rapidly create highly tunable mouse models as needs arise. Here, we provide an overview of genetic engineering methods, as well as the development of humanized neonatal Fc receptor (FcRn) models and their use for monoclonal antibody in vivo studies. PMID:24150980
2014-06-01
systems. It can model systems including both conventional, diesel powered generators and renewable power sources such as photovoltaic arrays and wind...conducted an experiment where he assessed the capabilities of the HOMER model in forecasting the power output of a solar panel at NPS [32]. In his ex...energy efficiency in expeditionary operations, the HOMER micropower optimization model provides potential to serve as a powerful tool for improving
AgBase: supporting functional modeling in agricultural organisms
McCarthy, Fiona M.; Gresham, Cathy R.; Buza, Teresia J.; Chouvarine, Philippe; Pillai, Lakshmi R.; Kumar, Ranjit; Ozkan, Seval; Wang, Hui; Manda, Prashanti; Arick, Tony; Bridges, Susan M.; Burgess, Shane C.
2011-01-01
AgBase (http://www.agbase.msstate.edu/) provides resources to facilitate modeling of functional genomics data and structural and functional annotation of agriculturally important animal, plant, microbe and parasite genomes. The website is redesigned to improve accessibility and ease of use, including improved search capabilities. Expanded capabilities include new dedicated pages for horse, cat, dog, cotton, rice and soybean. We currently provide 590 240 Gene Ontology (GO) annotations to 105 454 gene products in 64 different species, including GO annotations linked to transcripts represented on agricultural microarrays. For many of these arrays, this provides the only functional annotation available. GO annotations are available for download and we provide comprehensive, species-specific GO annotation files for 18 different organisms. The tools available at AgBase have been expanded and several existing tools improved based upon user feedback. One of seven new tools available at AgBase, GOModeler, supports hypothesis testing from functional genomics data. We host several associated databases and provide genome browsers for three agricultural pathogens. Moreover, we provide comprehensive training resources (including worked examples and tutorials) via links to Educational Resources at the AgBase website. PMID:21075795
NASA Astrophysics Data System (ADS)
Villamil-Otero, G.; Zhang, J.; Yao, Y.
2017-12-01
The Antarctic Peninsula (AP) has long been the focus of climate change studies due to its rapid environmental changes, such as significantly increased glacier melt and retreat, and ice-shelf break-up. Progress has been continuously made in the use of regional modeling to simulate surface mass changes over ice sheets. Most efforts, however, focus on the ice sheets of Greenland, with considerably fewer studies in Antarctica. In this study the Weather Research and Forecasting (WRF) model, which has been applied to the Antarctic region for weather modeling, is adopted to capture the past and future surface mass balance changes over the AP. In order to enhance the capability of the WRF model to simulate surface mass balance over the ice surface, we implement various ice and snow processes within WRF and develop a new WRF suite (WRF-Ice). The WRF-Ice includes a thermodynamic ice sheet model that improves the representation of internal melting and refreezing processes and the thermodynamic effects over ice sheet. WRF-Ice also couples a thermodynamic sea ice model to improve the simulation of surface temperature and fluxes over sea ice. Lastly, complex snow processes are also taken into consideration including the implementation of a snowdrift model that takes into account the redistribution of blowing snow as well as the thermodynamic impact of drifting snow sublimation on the lower atmospheric boundary layer. Intensive testing of these ice and snow processes is performed to assess the capability of WRF-Ice in simulating the surface mass balance changes over the AP.
Chen, Shangying; Zhang, Peng; Liu, Xin; Qin, Chu; Tao, Lin; Zhang, Cheng; Yang, Sheng Yong; Chen, Yu Zong; Chui, Wai Keung
2016-06-01
The overall efficacy and safety profile of a new drug is partially evaluated by the therapeutic index in clinical studies and by the protective index (PI) in preclinical studies. In-silico predictive methods may facilitate the assessment of these indicators. Although QSAR and QSTR models can be used for predicting PI, their predictive capability has not been evaluated. To test this capability, we developed QSAR and QSTR models for predicting the activity and toxicity of anticonvulsants at accuracy levels above the literature-reported threshold (LT) of good QSAR models as tested by both the internal 5-fold cross validation and external validation method. These models showed significantly compromised PI predictive capability due to the cumulative errors of the QSAR and QSTR models. Therefore, in this investigation a new quantitative structure-index relationship (QSIR) model was devised and it showed improved PI predictive capability that superseded the LT of good QSAR models. The QSAR, QSTR and QSIR models were developed using support vector regression (SVR) method with the parameters optimized by using the greedy search method. The molecular descriptors relevant to the prediction of anticonvulsant activities, toxicities and PIs were analyzed by a recursive feature elimination method. The selected molecular descriptors are primarily associated with the drug-like, pharmacological and toxicological features and those used in the published anticonvulsant QSAR and QSTR models. This study suggested that QSIR is useful for estimating the therapeutic index of drug candidates. Copyright © 2016. Published by Elsevier Inc.
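A pipeline in the spirit described, SVR with recursive feature elimination and 5-fold cross-validation, can be sketched with scikit-learn as below. This is an assumption-laden illustration, not the authors' code: a linear-kernel SVR is used for the RFE ranking step (RFE needs coefficient-based importances), and the greedy hyperparameter search is replaced here by a small grid search.

```python
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.svm import SVR

# Illustrative pipeline only. X is a descriptor matrix (n_molecules x
# n_descriptors, assumed to have more than n_features columns) and y holds
# the protective-index targets; both are supplied by the caller.
def build_qsir_model(X, y, n_features=20):
    pipeline = Pipeline([
        ("rfe", RFE(SVR(kernel="linear"), n_features_to_select=n_features)),
        ("svr", SVR(kernel="rbf")),
    ])
    search = GridSearchCV(
        pipeline,
        param_grid={"svr__C": [1, 10, 100], "svr__gamma": ["scale", 0.01, 0.1]},
        cv=5, scoring="r2",
    )
    search.fit(X, y)
    scores = cross_val_score(search.best_estimator_, X, y, cv=5, scoring="r2")
    return search.best_estimator_, float(np.mean(scores))
```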
NASA Astrophysics Data System (ADS)
Li, Xin; Li, Xingang; Xiao, Yao; Jia, Bin
2016-06-01
Real traffic is heterogeneous with car and truck. Due to mechanical restrictions, the car and the truck have different limited deceleration capabilities, which are important factors in safety driving. This paper extends the single lane safety driving (SD) model with limited deceleration capability to two-lane SD model, in which car-truck heterogeneous traffic is considered. A car has a larger limited deceleration capability while a heavy truck has a smaller limited deceleration capability as a result of loaded goods. Then the safety driving conditions are different as the types of the following and the leading vehicles vary. In order to eliminate the well-known plug in heterogeneous two-lane traffic, it is assumed that heavy truck has active deceleration behavior when the heavy truck perceives the forming plug. The lane-changing decisions are also determined by the safety driving conditions. The fundamental diagram, spatiotemporal diagram, and lane-changing frequency were investigated to show the effect of mechanical restriction on heterogeneous traffic flow. It was shown that there would be still three traffic phases in heterogeneous traffic condition; the active deceleration of the heavy truck could well eliminate the plug; the lane-changing frequency was low in synchronized flow; the flow and velocity would decrease as the proportion of heavy truck grows or the limited deceleration capability of heavy truck drops; and the flow could be improved with lane control measures.
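The safety-driving condition rests on standard stopping-distance kinematics: the follower, braking at its own deceleration limit, must be able to stop behind the leader braking at its limit. A minimal check of that form is sketched below; the exact rule set and parameter values of the paper's model are not reproduced.

```python
def gap_is_safe(v_follower, v_leader, b_follower, b_leader,
                current_gap, reaction_time=1.0):
    """Kinematic safety check: the follower (speed v_follower, limited
    deceleration b_follower) must be able to stop behind a leader braking
    at b_leader. Speeds in m/s, decelerations in m/s^2, gap in m."""
    stop_follower = v_follower * reaction_time + v_follower**2 / (2.0 * b_follower)
    stop_leader = v_leader**2 / (2.0 * b_leader)
    return current_gap + stop_leader - stop_follower >= 0.0

# Example (illustrative numbers): a truck with a smaller deceleration limit
# needs a larger gap than a car travelling at the same speed.
print(gap_is_safe(20.0, 20.0, 3.0, 6.0, 40.0))  # truck following a car
print(gap_is_safe(20.0, 20.0, 6.0, 3.0, 10.0))  # car following a truck
```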
NASA Astrophysics Data System (ADS)
Schmidt, J. B.
1985-09-01
This thesis investigates ways of improving the real-time performance of the Stockpoint Logistics Integrated Communication Environment (SPLICE). Performance evaluation through continuous monitoring activities and performance studies is the principal vehicle discussed. The method for implementing this performance evaluation process is the measurement of predefined performance indexes, and performance indexes that would measure these aspects of system performance are proposed for SPLICE. Existing SPLICE capability to carry out performance evaluation is explored, and recommendations are made to enhance that capability.
Determining your organization's 'risk capability'.
Hannah, Bill; Hancock, Melinda
2014-05-01
An assessment of a provider's level of risk capability should focus on three key elements: business intelligence, including sophisticated analytical models that can offer insight into the expected cost and quality of care for a given population; clinical enterprise maturity, marked by the ability to improve health outcomes and to manage utilization and costs to drive change; and revenue transformation, emphasizing the need for a revenue cycle platform that allows for risk acceptance and management and that provides incentives for performance against defined objectives.
Research notes : improving freight data collection methods.
DOT National Transportation Integrated Search
2004-07-01
The overall goal of this study was to identify data collection methods capable of generating the information at a level of detail that would better fill ODOT's modeling and freight planning needs at the metropolitan level. After a review of other r...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pinilla, Maria Isabel
This report seeks to study and benchmark code predictions against experimental data; determine parameters to match MCNP-simulated detector response functions to experimental stilbene measurements; add stilbene processing capabilities to DRiFT; and improve NEUANCE detector array modeling and analysis using new MCNP6 and DRiFT features.
The Ensemble Space Weather Modeling System (eSWMS): Status, Capabilities and Challenges
NASA Astrophysics Data System (ADS)
Fry, C. D.; Eccles, J. V.; Reich, J. P.
2010-12-01
Marking a milestone in space weather forecasting, the Space Weather Modeling System (SWMS) successfully completed validation testing in advance of operational testing at Air Force Weather Agency’s primary space weather production center. This is the first coupling of stand-alone, physics-based space weather models that are currently in operations at AFWA supporting the warfighter. Significant development effort went into ensuring the component models were portable and scalable while maintaining consistent results across diverse high performance computing platforms. Coupling was accomplished under the Earth System Modeling Framework (ESMF). The coupled space weather models are the Hakamada-Akasofu-Fry version 2 (HAFv2) solar wind model and GAIM1, the ionospheric forecast component of the Global Assimilation of Ionospheric Measurements (GAIM) model. The SWMS was developed by team members from AFWA, Explorations Physics International, Inc. (EXPI) and Space Environment Corporation (SEC). The successful development of the SWMS provides new capabilities beyond enabling extended lead-time, data-driven ionospheric forecasts. These include ingesting diverse data sets at higher resolution, incorporating denser computational grids at finer time steps, and performing probability-based ensemble forecasts. Work of the SWMS development team now focuses on implementing the ensemble-based probability forecast capability by feeding multiple scenarios of 5 days of solar wind forecasts to the GAIM1 model based on the variation of the input fields to the HAFv2 model. The ensemble SWMS (eSWMS) will provide the most-likely space weather scenario with uncertainty estimates for important forecast fields. The eSWMS will allow DoD mission planners to consider the effects of space weather on their systems with more advance warning than is currently possible. The payoff is enhanced, tailored support to the warfighter with improved capabilities, such as point-to-point HF propagation forecasts, single-frequency GPS error corrections, and high cadence, high-resolution Space Situational Awareness (SSA) products. We present the current status of eSWMS, its capabilities, limitations and path of transition to operational use.
Xyce Parallel Electronic Simulator : users' guide, version 2.0.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoekstra, Robert John; Waters, Lon J.; Rankin, Eric Lamont
2004-06-01
This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator capable of simulating electrical circuits at a variety of abstraction levels. Primarily, Xyce has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving upon the current state of the art in the following areas: the capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including most popular parallel and serial computers; improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques; device models specifically tailored to meet Sandia's needs, including many radiation-aware devices; a client-server or multi-tiered operating model wherein the numerical kernel can operate independently of the graphical user interface (GUI); and object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase: a message-passing parallel implementation that allows it to run efficiently on the widest possible range of computing platforms, including serial, shared-memory and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. One feature required by designers is the ability to add device models, many specific to the needs of Sandia, to the code. To this end, the device package in the Xyce Parallel Electronic Simulator is designed to support a variety of device model inputs. These input formats include standard analytical models, behavioral models, look-up tables, and mesh-level PDE device models. Combined with this flexible interface is an architectural design that greatly simplifies the addition of circuit models. One of the most important features of Xyce is in providing a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia now has an 'in-house' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods) research and development can be performed. Ultimately, these capabilities are migrated to end users.
Mars Global Reference Atmospheric Model (Mars-GRAM): Release No. 2 - Overview and applications
NASA Technical Reports Server (NTRS)
James, B.; Johnson, D.; Tyree, L.
1993-01-01
The Mars Global Reference Atmospheric Model (Mars-GRAM), a science and engineering model for empirically parameterizing the temperature, pressure, density, and wind structure of the Martian atmosphere, is described with particular attention to the model's newest version, Mars-GRAM, Release No. 2 and to the improvements incorporated into the Release No. 2 model as compared with the Release No. 1 version. These improvements include (1) an addition of a new capability to simulate local-scale Martian dust storms and the growth and decay of these storms; (2) an addition of the Zurek and Haberle (1988) wave perturbation model, for simulating tidal perturbation effects; and (3) a new modular version of Mars-GRAM, for incorporation as a subroutine into other codes.
Recent progress towards predicting aircraft ground handling performance
NASA Technical Reports Server (NTRS)
Yager, T. J.; White, E. J.
1981-01-01
The capability implemented for simulating aircraft ground handling performance is reviewed, and areas for further expansion and improvement are identified. Problems associated with providing the necessary simulator input data for adequate modeling of aircraft tire/runway friction behavior are discussed, and efforts to improve tire/runway friction definition and simulator fidelity are described. Aircraft braking performance data obtained on several wet runway surfaces are compared to ground vehicle friction measurements. Research to improve methods of predicting tire friction performance is discussed.
William Massman
2015-01-01
Increased use of prescribed fire by land managers and the increasing likelihood of wildfires due to climate change require an improved capability for modeling extreme heating of soils during fires. This issue is addressed here by developing and testing the soil heat-moisture-vapor (HMV) model, a 1-D (one-dimensional) non-equilibrium (liquid-vapor phase change)...
FireStem2D A two-dimensional heat transfer model for simulating tree stem injury in fires
Efthalia K. Chatziefstratiou; Gil Bohrer; Anthony S. Bova; Ravishankar Subramanian; Renato P.M. Frasson; Amy Scherzer; Bret W. Butler; Matthew B. Dickinson
2013-01-01
FireStem2D, a software tool for predicting tree stem heating and injury in forest fires, is a physically-based, two-dimensional model of stem thermodynamics that results from heating at the bark surface. It builds on an earlier one-dimensional model (FireStem) and provides improved capabilities for predicting fire-induced mortality and injury before a fire occurs by...
Hirani, Shela Akbar Ali; Richter, Solina
2017-02-21
The world is progressing in terms of communication, innovative technology and cure of various diseases through advanced pharmacological preparations. Unfortunately, populations are still struggling with ill-health, disabilities, poverty, hunger, inequality, gender disparities and conflicts. Several questions come to mind in this regard: why are prosperity, health, peace and progress not evenly distributed, and what is the best approach to address the issues associated with population health? The capability approach may offer a possible model. This approach is a blend of 5 key concepts: capabilities, functioning, agency, endowment, and conversion factors. It proposes an innovative approach to examine and enhance the quality of life and wellbeing of individuals. This reflective paper provides an overview of the capability approach, critically analyses population health from the theoretical lens of the capability approach and highlights the relevance of this approach to achieving the Sustainable Development Goals.
Hydropower Modeling Challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoll, Brady; Andrade, Juan; Cohen, Stuart
Hydropower facilities are important assets for the electric power sector and represent a key source of flexibility for electric grids with large amounts of variable generation. As variable renewable generation sources expand, understanding the capabilities and limitations of the flexibility from hydropower resources is important for grid planning. Appropriately modeling these resources, however, is difficult because of the wide variety of constraints these plants face that other generators do not. These constraints can be broadly categorized as environmental, operational, and regulatory. This report highlights several key issues involved in incorporating these constraints when modeling hydropower operations in terms of production cost and capacity expansion. Many of these challenges involve a lack of data to adequately represent the constraints or issues of model complexity and run time. We present several potential methods for improving the accuracy of hydropower representation in these models to allow for a better understanding of hydropower's capabilities.
Introducing WISDEM:An Integrated System Modeling for Wind Turbines and Plant (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dykes, K.; Graf, P.; Scott, G.
2015-01-01
The National Wind Technology Center wind energy systems engineering initiative has developed an analysis platform to leverage its research capabilities toward integrating wind energy engineering and cost models across wind plants. This Wind-Plant Integrated System Design & Engineering Model (WISDEM) platform captures the important interactions between various subsystems to achieve a better understanding of how to improve system-level performance and achieve system-level cost reductions. This work illustrates a few case studies with WISDEM that focus on the design and analysis of wind turbines and plants at different system levels.
NASA Technical Reports Server (NTRS)
Moore, James; Marty, Dave; Cody, Joe
2000-01-01
SRS and NASA/MSFC have developed software with unique capabilities to couple bearing kinematic modeling with high fidelity thermal modeling. The core thermomechanical modeling software was developed by SRS and others in the late 1980's and early 1990's under various different contractual efforts. SRS originally developed software that enabled SHABERTH (Shaft Bearing Thermal Model) and SINDA (Systems Improved Numerical Differencing Analyzer) to exchange data autonomously, allowing bearing component temperature effects to propagate into the steady-state bearing mechanical model. A separate contract was issued in 1990 to create a personal computer version of the software. At that time SRS performed major improvements to the code. Both SHABERTH and SINDA were independently ported to the PC and compiled. SRS then integrated the two programs into a single program that was named SINSHA. This was a major code improvement.
RELAP-7 Software Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL's modern scientific software development framework, MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios.
NASA Technical Reports Server (NTRS)
Kaul, Upender K. (Inventor)
2009-01-01
Modeling and simulation of free and forced structural vibrations is essential to an overall structural health monitoring capability. In the various embodiments, a first principles finite-difference approach is adopted in modeling a structural subsystem such as a mechanical gear by solving elastodynamic equations in generalized curvilinear coordinates. Such a capability to generate a dynamic structural response is widely applicable in a variety of structural health monitoring systems. This capability (1) will lead to an understanding of the dynamic behavior of a structural system and hence its improved design, (2) will generate a sufficiently large space of normal and damage solutions that can be used by machine learning algorithms to detect anomalous system behavior and achieve a system design optimization and (3) will lead to an optimal sensor placement strategy, based on the identification of local stress maxima all over the domain.
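As a minimal illustration of a first-principles finite-difference elastodynamic solve (in 1-D Cartesian coordinates only, not the generalized curvilinear, multi-dimensional formulation of the invention), the sketch below advances the explicit update for rho*u_tt = E*u_xx; all material values are illustrative.

```python
import numpy as np

# Explicit second-order finite-difference update for the 1-D elastic wave
# equation rho*u_tt = E*u_xx with clamped ends; purely illustrative.
def simulate_rod(n=200, steps=500, E=200e9, rho=7800.0, length=1.0):
    dx = length / (n - 1)
    c = np.sqrt(E / rho)                 # elastic wave speed
    dt = 0.9 * dx / c                    # CFL-limited time step
    u_prev = np.zeros(n)
    u = np.zeros(n)
    u[n // 2] = 1e-6                     # small initial displacement pulse
    r2 = (c * dt / dx) ** 2
    for _ in range(steps):
        u_next = np.zeros(n)
        u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                        + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
        u_prev, u = u, u_next            # ends remain clamped at zero
    return u

print(np.max(np.abs(simulate_rod())))
```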
NASA Astrophysics Data System (ADS)
Jonny, Zagloed, Teuku Yuri M.
2017-11-01
This paper aims to present an integrated health care model for the Indonesian health care industry. Based on previous research, there are two health care models in the industry: disease-centered and patient-centered care models. In their development, the patient-centered care model is widely applied due to its capability to reduce cost and improve quality simultaneously. However, there is still no comprehensive model resulting in cost reduction, quality improvement, patient satisfaction and hospital profitability simultaneously. Therefore, this research is intended to develop that model. In doing so, first, a conceptual model using Kano's Model, Quality Function Deployment (QFD) and the Balanced Scorecard (BSC) is developed to generate several important elements of the model as required by stakeholders. Then, a case study of an Indonesian hospital is presented to evaluate the validity of the model using correlation analysis. As a result, it can be concluded that the model is validated, implying several managerial insights among its elements: 1) leadership (r=0.85) and context of the organization (r=0.77) improve operations; 2) planning (r=0.96), support processes (r=0.87) and continual improvement (r=0.95) also improve operations; 3) operations improve customer satisfaction (r=0.89) and financial performance (r=0.93); and 4) customer satisfaction improves financial performance (r=0.98).
NASA Technical Reports Server (NTRS)
Case, Jonathan; Mungai, John; Sakwa, Vincent; Kabuchanga, Eric; Zavodsky, Bradley T.; Limaye, Ashutosh S.
2014-01-01
Flooding and drought are two key forecasting challenges for the Kenya Meteorological Department (KMD). Atmospheric processes leading to excessive precipitation and/or prolonged drought can be quite sensitive to the state of the land surface, which interacts with the boundary layer of the atmosphere providing a source of heat and moisture. The development and evolution of precipitation systems are affected by heat and moisture fluxes from the land surface within weakly-sheared environments, such as in the tropics and sub-tropics. These heat and moisture fluxes during the day can be strongly influenced by land cover, vegetation, and soil moisture content. Therefore, it is important to represent the land surface state as accurately as possible in numerical weather prediction models. Enhanced regional modeling capabilities have the potential to improve forecast guidance in support of daily operations and high-end events over east Africa. KMD currently runs a configuration of the Weather Research and Forecasting (WRF) model in real time to support its daily forecasting operations, invoking the Nonhydrostatic Mesoscale Model (NMM) dynamical core. They make use of the National Oceanic and Atmospheric Administration / National Weather Service Science and Training Resource Center's Environmental Modeling System (EMS) to manage and produce the WRF-NMM model runs on a 7-km regional grid over eastern Africa. Two organizations at the National Aeronautics and Space Administration Marshall Space Flight Center in Huntsville, AL, SERVIR and the Short-term Prediction Research and Transition (SPoRT) Center, have established a working partnership with KMD for enhancing its regional modeling capabilities. To accomplish this goal, SPoRT and SERVIR will provide experimental land surface initialization datasets and model verification capabilities to KMD. To produce a land-surface initialization more consistent with the resolution of the KMD-WRF runs, the NASA Land Information System (LIS) will be run at a comparable resolution to provide real-time, daily soil initialization data in place of interpolated Global Forecast System soil moisture and temperature data. Additionally, real-time green vegetation fraction data from the Visible Infrared Imaging Radiometer Suite will be incorporated into the KMD-WRF runs, once it becomes publicly available from the National Environmental Satellite Data and Information Service. Finally, model verification capabilities will be transitioned to KMD using the Model Evaluation Tools (MET) package, in order to quantify possible improvements in simulated temperature, moisture and precipitation resulting from the experimental land surface initialization. The transition of these MET tools will enable KMD to monitor model forecast accuracy in near real time. This presentation will highlight preliminary verification results of WRF runs over east Africa using the LIS land surface initialization.
NASA Astrophysics Data System (ADS)
Wismadi, Arif; Zuidgeest, Mark; Brussel, Mark; van Maarseveen, Martin
2014-01-01
To determine whether the inclusion of spatial neighbourhood comparison factors in Preference Modelling allows spatial decision support systems (SDSSs) to better address spatial equity, we introduce Spatial Preference Modelling (SPM). To evaluate the effectiveness of this model in addressing equity, various standardisation functions in both Non-Spatial Preference Modelling and SPM are compared. The evaluation involves applying the model to a resource location-allocation problem for transport infrastructure in the Special Province of Yogyakarta in Indonesia. We apply Amartya Sen's Capability Approach to define opportunity to mobility as a non-income indicator. Using the extended Moran's I interpretation for spatial equity, we evaluate the distribution output regarding, first, `the spatial distribution patterns of priority targeting for allocation' (SPT) and, second, `the effect of new distribution patterns after location-allocation' (ELA). The Moran's I index of the initial map and its comparison with six patterns for SPT as well as ELA consistently indicates that the SPM is more effective for addressing spatial equity. We conclude that the inclusion of spatial neighbourhood comparison factors in Preference Modelling improves the capability of SDSS to address spatial equity. This study thus proposes a new formal method for SDSS with specific attention on resource location-allocation to address spatial equity.
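The global Moran's I statistic referenced above can be computed directly from an indicator vector and a spatial weights matrix. The sketch below shows only the standard formula, not the paper's extended interpretation for spatial equity.

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I for values x (length n) and spatial weights W (n x n):
    I = (n / sum(W)) * sum_ij W_ij (x_i - xbar)(x_j - xbar) / sum_i (x_i - xbar)^2."""
    x = np.asarray(x, dtype=float)
    W = np.asarray(W, dtype=float)
    n = x.size
    z = x - x.mean()
    num = n * (W * np.outer(z, z)).sum()
    den = W.sum() * (z ** 2).sum()
    return num / den
```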
Thermal Effects Modeling Developed for Smart Structures
NASA Technical Reports Server (NTRS)
Lee, Ho-Jun
1998-01-01
Applying smart materials in aeropropulsion systems may improve the performance of aircraft engines through a variety of vibration, noise, and shape-control applications. To facilitate the experimental characterization of these smart structures, researchers have been focusing on developing analytical models to account for the coupled mechanical, electrical, and thermal response of these materials. Current research efforts have been directed toward incorporating a comprehensive thermal analysis modeling capability. Typically, temperature affects the behavior of smart materials by three distinct mechanisms: (1) induction of thermal strains because of coefficient of thermal expansion mismatch; (2) pyroelectric effects on the piezoelectric elements; and (3) temperature-dependent changes in material properties. Previous analytical models only investigated the first two mechanisms. However, since the material properties of piezoelectric materials generally vary greatly with temperature, incorporating temperature-dependent material properties will significantly affect the structural deflections, sensory voltages, and stresses. Thus, the current analytical model captures thermal effects arising from all three mechanisms through thermopiezoelectric constitutive equations. These constitutive equations were incorporated into a layerwise laminate theory with the inherent capability to model both the active and sensory response of smart structures in thermal environments. Corresponding finite element equations were formulated and implemented for both the beam and plate elements to provide a comprehensive thermal effects modeling capability.
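For context, the coupled thermopiezoelectric constitutive relations that such a formulation builds on can be written in their standard linear form as below (a textbook form, not necessarily the exact equations used in this work):

```latex
\begin{aligned}
\boldsymbol{S} &= \mathbf{s}^{E}\,\boldsymbol{T} + \mathbf{d}^{\mathsf{T}}\,\boldsymbol{E} + \boldsymbol{\alpha}\,\Delta\theta, \\
\boldsymbol{D} &= \mathbf{d}\,\boldsymbol{T} + \boldsymbol{\epsilon}^{T}\,\boldsymbol{E} + \mathbf{p}\,\Delta\theta,
\end{aligned}
```

where S is strain, T is stress, E is the electric field, D is the electric displacement, s^E is the compliance at constant field, d holds the piezoelectric coefficients, alpha the thermal expansion coefficients, epsilon^T the permittivity at constant stress, p the pyroelectric coefficients, and Delta-theta the temperature change. Temperature-dependent material properties enter by letting these coefficient matrices vary with temperature.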
Improvements to the APBS biomolecular solvation software suite.
Jurrus, Elizabeth; Engel, Dave; Star, Keith; Monson, Kyle; Brandi, Juan; Felberg, Lisa E; Brookes, David H; Wilson, Leighton; Chen, Jiahui; Liles, Karina; Chun, Minju; Li, Peter; Gohara, David W; Dolinsky, Todd; Konecny, Robert; Koes, David R; Nielsen, Jens Erik; Head-Gordon, Teresa; Geng, Weihua; Krasny, Robert; Wei, Guo-Wei; Holst, Michael J; McCammon, J Andrew; Baker, Nathan A
2018-01-01
The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages that have provided impact in the study of a broad range of chemical, biological, and biomedical applications. APBS addresses the three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this article, we discuss the models and capabilities that have recently been implemented within the APBS software package including a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics. © 2017 The Protein Society.
Perspectives On Dilution Jet Mixing
NASA Technical Reports Server (NTRS)
Holdeman, J. D.; Srinivasan, R.
1990-01-01
NASA recently completed a program of measurements and modeling of the mixing of transverse jets with a ducted crossflow, motivated by the need to design or tailor the temperature pattern at the combustor exit in gas turbine engines. The objectives of the program were to identify the dominant physical mechanisms governing mixing, to extend empirical models to provide near-term predictive capability, and to compare numerical code calculations with data to guide future analysis improvement efforts.
Institutional Transformation Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-10-19
Reducing the energy consumption of large institutions with dozens to hundreds of existing buildings while maintaining and improving existing infrastructure is a critical economic and environmental challenge. SNL's Institutional Transformation (IX) work integrates facilities and infrastructure sustainability technology capabilities and collaborative decision support modeling approaches to help facilities managers at Sandia National Laboratories (SNL) simulate different future energy reduction strategies and meet long term energy conservation goals.
Advancing investigation and physical modeling of first-order fire effects on soils
William J. Massman; John M. Frank; Sacha J. Mooney
2010-01-01
Heating soil during intense wildland fires or slash-pile burns can alter the soil irreversibly, resulting in many significant long-term biological, chemical, physical, and hydrological effects. To better understand these long-term effects, it is necessary to improve modeling capability and prediction of the more immediate, or first-order, effects that fire can have on...
ERIC Educational Resources Information Center
Robadue, Donald D., Jr.
2012-01-01
Those advocating for effective management of the use of coastal areas and ecosystems have long aspired for an approach to governance that includes information systems with the capability to predict the end results of various courses of action, monitor the impacts of decisions and compare results with those predicted by computer models in order to…
Advances in HYDRA and its application to simulations of Inertial Confinement Fusion targets
NASA Astrophysics Data System (ADS)
Marinak, M. M.; Kerbel, G. D.; Koning, J. M.; Patel, M. V.; Sepke, S. M.; Brown, P. N.; Chang, B.; Procassini, R.; Veitzer, S. A.
2008-11-01
We will outline new capabilities added to the HYDRA 2D/3D multiphysics ICF simulation code. These include a new SN multigroup radiation transport package (1D), constitutive models for elastic-plastic (strength) effects, and a mix model. A Monte Carlo burn package is being incorporated to model diagnostic signatures of neutrons, gamma rays and charged particles. A 3D MHD package that treats resistive MHD is available. Improvements to HYDRA's implicit Monte Carlo photonics package, including the addition of angular biasing, now enable integrated hohlraum simulations to complete in substantially shorter time. The heavy ion beam deposition package now includes a new model for ion stopping power developed by the Tech-X Corporation, with improved accuracy below the Bragg peak. Examples will illustrate HYDRA's enhanced capabilities to simulate various aspects of inertial confinement fusion targets. This work was performed under the auspices of the Lawrence Livermore National Security, LLC, (LLNS) under Contract No. DE-AC52-07NA27344. The work of Tech-X personnel was funded by the Department of Energy under Small Business Innovation Research Contract No. DE-FG02-03ER83797.
Multi-Scale Three-Dimensional Variational Data Assimilation System for Coastal Ocean Prediction
NASA Technical Reports Server (NTRS)
Li, Zhijin; Chao, Yi; Li, P. Peggy
2012-01-01
A multi-scale three-dimensional variational data assimilation system (MS-3DVAR) has been formulated and the associated software system has been developed for improving high-resolution coastal ocean prediction. This system helps improve coastal ocean prediction skill, and has been used in support of operational coastal ocean forecasting systems and field experiments. The system has been developed to improve the capability of data assimilation for assimilating, simultaneously and effectively, sparse vertical profiles and high-resolution remote sensing surface measurements into coastal ocean models, as well as constraining model biases. In this system, the cost function is decomposed into two separate units for the large- and small-scale components, respectively. As such, data assimilation is implemented sequentially from large to small scales, the background error covariance is constructed to be scale-dependent, and a scale-dependent dynamic balance is incorporated. This scheme then allows effective constraining large scales and model bias through assimilating sparse vertical profiles, and small scales through assimilating high-resolution surface measurements. This MS-3DVAR enhances the capability of the traditional 3DVAR for assimilating highly heterogeneously distributed observations, such as along-track satellite altimetry data, and particularly maximizing the extraction of information from limited numbers of vertical profile observations.
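In generic incremental form, a scale-separated 3DVAR cost function of the kind described can be written as below; the notation is standard and not necessarily the system's exact formulation:

```latex
J(\mathbf{x}) = J_L + J_S, \qquad
J_k = \tfrac{1}{2}\,(\mathbf{x}_k-\mathbf{x}^b_k)^{\mathsf T}\mathbf{B}_k^{-1}(\mathbf{x}_k-\mathbf{x}^b_k)
    + \tfrac{1}{2}\,\big(H_k(\mathbf{x}_k)-\mathbf{y}_k\big)^{\mathsf T}\mathbf{R}_k^{-1}\big(H_k(\mathbf{x}_k)-\mathbf{y}_k\big),
\quad k\in\{L,S\},
```

where the large-scale problem (k = L), constrained mainly by the sparse vertical profiles, is solved first, and the small-scale problem (k = S), constrained by the high-resolution surface measurements, is solved afterwards; B_k denotes the scale-dependent background error covariance and R_k the observation error covariance for that scale.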
DOT National Transportation Integrated Search
2015-04-01
Research done through the Second Strategic Highway Research Program (SHRP 2) determined that agencies with the most effective transportation systems management and operations (TSM&O) activities were differentiated not by budgets or technical skills a...
Jones, James W; Antle, John M; Basso, Bruno; Boote, Kenneth J; Conant, Richard T; Foster, Ian; Godfray, H Charles J; Herrero, Mario; Howitt, Richard E; Janssen, Sander; Keating, Brian A; Munoz-Carpena, Rafael; Porter, Cheryl H; Rosenzweig, Cynthia; Wheeler, Tim R
2017-07-01
We review the current state of agricultural systems science, focusing in particular on the capabilities and limitations of agricultural systems models. We discuss the state of models relative to five different Use Cases spanning field, farm, landscape, regional, and global spatial scales and engaging questions in past, current, and future time periods. Contributions from multiple disciplines have made major advances relevant to a wide range of agricultural system model applications at various spatial and temporal scales. Although current agricultural systems models have features that are needed for the Use Cases, we found that all of them have limitations and need to be improved. We identified common limitations across all Use Cases, namely 1) a scarcity of data for developing, evaluating, and applying agricultural system models and 2) inadequate knowledge systems that effectively communicate model results to society. We argue that these limitations are greater obstacles to progress than gaps in conceptual theory or available methods for using system models. New initiatives on open data show promise for addressing the data problem, but there also needs to be a cultural change among agricultural researchers to ensure that data for addressing the range of Use Cases are available for future model improvements and applications. We conclude that multiple platforms and multiple models are needed for model applications for different purposes. The Use Cases provide a useful framework for considering capabilities and limitations of existing models and data.
NASA Technical Reports Server (NTRS)
Jones, James W.; Antle, John M.; Basso, Bruno; Boote, Kenneth J.; Conant, Richard T.; Foster, Ian; Godfray, H. Charles J.; Herrero, Mario; Howitt, Richard E.; Janssen, Sander;
2016-01-01
We review the current state of agricultural systems science, focusing in particular on the capabilities and limitations of agricultural systems models. We discuss the state of models relative to five different Use Cases spanning field, farm, landscape, regional, and global spatial scales and engaging questions in past, current, and future time periods. Contributions from multiple disciplines have made major advances relevant to a wide range of agricultural system model applications at various spatial and temporal scales. Although current agricultural systems models have features that are needed for the Use Cases, we found that all of them have limitations and need to be improved. We identified common limitations across all Use Cases, namely 1) a scarcity of data for developing, evaluating, and applying agricultural system models and 2) inadequate knowledge systems that effectively communicate model results to society. We argue that these limitations are greater obstacles to progress than gaps in conceptual theory or available methods for using system models. New initiatives on open data show promise for addressing the data problem, but there also needs to be a cultural change among agricultural researchers to ensure that data for addressing the range of Use Cases are available for future model improvements and applications. We conclude that multiple platforms and multiple models are needed for model applications for different purposes. The Use Cases provide a useful framework for considering capabilities and limitations of existing models and data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, James W.; Antle, John M.; Basso, Bruno
Polarized BRDF for coatings based on three-component assumption
NASA Astrophysics Data System (ADS)
Liu, Hong; Zhu, Jingping; Wang, Kai; Xu, Rong
2017-02-01
A pBRDF (polarized bidirectional reflectance distribution function) model for coatings is presented based on a three-component reflection assumption, in order to improve polarized scattering simulation capability for space objects. In this model, the specular reflection is described using microfacet theory, while multiple reflection and volume scattering are modeled separately according to experimental results. The polarization of the specular reflection is derived from Fresnel's law, and both multiple reflection and volume scattering are assumed to be depolarized. Simulation and measurement results for two satellite coating samples, SR107 and S781, validate that the pBRDF modeling accuracy can be significantly improved by the three-component model given in this paper.
Airport Flight Departure Delay Model on Improved BN Structure Learning
NASA Astrophysics Data System (ADS)
Cao, Weidong; Fang, Xiangnong
A high-score-prior genetic simulated annealing Bayesian network structure learning algorithm (HSPGSA), which combines a genetic algorithm (GA) with a simulated annealing algorithm (SAA), is developed. The new algorithm provides both the strong global search capability of the GA and the strong local hill-climbing capability of the SAA. The structure with the highest score is selected preferentially, but structures with lower scores can also be chosen, which helps avoid the premature convergence that occurs when a high-scoring individual drives population growth in the wrong direction. The algorithm is applied to the analysis of flight departure delays at a large hub airport. A BN model is created from the flight data, and experiments show that the learned parameters can reflect departure delays.
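A minimal sketch of the hybrid search idea, assuming structures are encoded as acyclic adjacency matrices and scored by a user-supplied function (a simple sparsity score stands in for a BIC/BDeu network score here); the HSPGSA operators, scoring details, and flight-delay data are not reproduced.

    import numpy as np

    def random_dag(n, p=0.2, rng=None):
        """Random upper-triangular adjacency matrix (acyclic by construction)."""
        rng = rng or np.random.default_rng()
        return np.triu((rng.random((n, n)) < p).astype(int), k=1)

    def mutate(adj, rng):
        """Flip one potential edge above the diagonal (keeps the graph acyclic)."""
        child = adj.copy()
        i, j = sorted(rng.choice(adj.shape[0], size=2, replace=False))
        child[i, j] ^= 1
        return child

    def hybrid_ga_sa(score, n_nodes, pop_size=20, generations=50, t0=1.0, cooling=0.95):
        """GA population search with simulated-annealing acceptance of weaker offspring."""
        rng = np.random.default_rng(0)
        pop = [random_dag(n_nodes, rng=rng) for _ in range(pop_size)]
        temp = t0
        for _ in range(generations):
            scored = sorted(pop, key=score, reverse=True)
            elite = scored[: pop_size // 2]              # high-score structures kept preferentially
            children = []
            for parent in elite:
                child = mutate(parent, rng)
                delta = score(child) - score(parent)
                # SA rule: accept a lower-scoring child with probability exp(delta / temp)
                if delta >= 0 or rng.random() < np.exp(delta / temp):
                    children.append(child)
                else:
                    children.append(parent)
            pop = elite + children
            temp *= cooling
        return max(pop, key=score)

    # Illustrative placeholder score: prefer sparse graphs (stand-in for a BIC score on data).
    best_structure = hybrid_ga_sa(lambda a: -a.sum(), n_nodes=6)

The SA-style acceptance rule is what occasionally lets lower-scoring structures survive, which is the mechanism the abstract credits for avoiding premature convergence.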
Software-as-a-Service Vendors: Are They Ready to Successfully Deliver?
NASA Astrophysics Data System (ADS)
Heart, Tsipi; Tsur, Noa Shamir; Pliskin, Nava
Software as a service (SaaS) is a software sourcing option that allows organizations to remotely access enterprise applications without having to install them in-house. In this work we study vendors' readiness to deliver SaaS, a topic scarcely studied before. The innovation classification (evolutionary vs. revolutionary) and a new Seven Fundamental Organizational Capabilities (FOCs) model are used as the theoretical frameworks. The Seven FOCs model suggests a generic yet comprehensive set of capabilities that are required for organizational success: 1) sensing the stakeholders, 2) sensing the business environment, 3) sensing the knowledge environment, 4) process control, 5) process improvement, 6) new process development, and 7) appropriate resolution.
Advancing botnet modeling techniques for military and security simulations
NASA Astrophysics Data System (ADS)
Banks, Sheila B.; Stytz, Martin R.
2011-06-01
Simulation environments serve many purposes, but they are only as good as their content. One of the most challenging and pressing areas that call for improved content is the simulation of bot armies (botnets) and their effects upon networks and computer systems. Botnets are a new type of malware, a type that is more powerful and potentially dangerous than any other type of malware. A botnet's power derives from several capabilities, including the following: 1) the botnet's capability to be controlled and directed throughout all phases of its activity, 2) a command and control structure that grows increasingly sophisticated, and 3) the ability of a bot's software to be updated at any time by the owner of the bot (a person commonly called a bot master or bot herder). Not only is a bot army powerful and agile in its technical capabilities, it can also be extremely large, comprising tens of thousands, if not millions, of compromised computers, or it can be as small as a few thousand targeted systems. In all botnets, their members can surreptitiously communicate with each other and their command and control centers. In sum, these capabilities allow a bot army to execute attacks that are technically sophisticated, difficult to trace, tactically agile, massive, and coordinated. To improve our understanding of their operation and potential, we believe that it is necessary to develop computer security simulations that accurately portray bot army activities, with the goal of including bot army simulations within military simulation environments. In this paper, we investigate issues that arise when simulating bot armies and propose a combination of the biologically inspired MSEIR infection spread model coupled with the jump-diffusion infection spread model to portray botnet propagation.
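A minimal numerical sketch of the MSEIR compartmental component mentioned above, with invented rate constants and a simple Euler integration; the jump-diffusion coupling and any botnet-specific calibration are omitted.

    import numpy as np

    def mseir_step(state, dt, birth, delta, beta, sigma, gamma, mu):
        """One Euler step of the MSEIR equations (M: newly added hosts with temporary
        protection, S: susceptible, E: compromised but dormant, I: actively infectious
        bots, R: cleaned/removed hosts)."""
        M, S, E, I, R = state
        N = M + S + E + I + R
        dM = birth - (delta + mu) * M
        dS = delta * M - beta * S * I / N - mu * S
        dE = beta * S * I / N - (sigma + mu) * E
        dI = sigma * E - (gamma + mu) * I
        dR = gamma * I - mu * R
        return state + dt * np.array([dM, dS, dE, dI, dR])

    # Illustrative parameters: ~100k hosts, one seed bot; hosts enter protected, then become susceptible.
    state = np.array([5_000.0, 95_000.0, 0.0, 1.0, 0.0])
    for _ in range(2000):
        state = mseir_step(state, dt=0.01, birth=10.0, delta=0.05,
                           beta=0.8, sigma=0.3, gamma=0.1, mu=0.001)
    print("infectious bots after integration:", round(state[3]))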
NOAA Climate Program Office Contributions to National ESPC
NASA Astrophysics Data System (ADS)
Higgins, W.; Huang, J.; Mariotti, A.; Archambault, H. M.; Barrie, D.; Lucas, S. E.; Mathis, J. T.; Legler, D. M.; Pulwarty, R. S.; Nierenberg, C.; Jones, H.; Cortinas, J. V., Jr.; Carman, J.
2016-12-01
NOAA is one of five federal agencies (DOD, DOE, NASA, NOAA, and NSF) that signed an updated charter in 2016 to partner on the National Earth System Prediction Capability (ESPC). Situated within NOAA's Office of Oceanic and Atmospheric Research (OAR), NOAA Climate Program Office (CPO) programs contribute significantly to the National ESPC goals and activities. This presentation will provide an overview of CPO contributions to the National ESPC. First, we will discuss selected CPO research and transition activities that directly benefit the ESPC coupled model prediction capability, including: the North American Multi-Model Ensemble (NMME) seasonal prediction system; the Subseasonal Experiment (SubX) project to test real-time subseasonal ensemble prediction systems; and improvements to the NOAA operational Climate Forecast System (CFS), including software infrastructure and data assimilation. Next, we will show how CPO's foundational research activities are advancing future ESPC capabilities. Highlights will include: the Tropical Pacific Observing System (TPOS), which provides the basis for predicting climate on subseasonal to decadal timescales; Subseasonal-to-Seasonal (S2S) processes and predictability studies to improve understanding, modeling, and prediction of the MJO; an Arctic Research Program to address urgent needs for advancing monitoring and prediction capabilities in this major area of concern; and advances towards building an experimental multi-decadal prediction system through studies of the Atlantic Meridional Overturning Circulation (AMOC). Finally, CPO has embraced Integrated Information Systems (IISs) that build on the innovation of programs such as the National Integrated Drought Information System (NIDIS) to develop and deliver end-to-end environmental information for key societal challenges (e.g., extreme heat, coastal flooding). These contributions will help the National ESPC better understand and address societal needs and decision support requirements.
DEVELOPMENT OF AN IMPROVED SIMULATOR FOR CHEMICAL AND MICROBIAL IOR METHODS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gary A. Pope; Kamy Sepehrnoori; Mojdeh Delshad
2001-10-01
This is the final report of a three-year research project on further development of a chemical and microbial improved oil recovery reservoir simulator. The objective of this research was to extend the capability of an existing simulator (UTCHEM) to improved oil recovery methods which use surfactants, polymers, gels, alkaline chemicals, microorganisms and foam as well as various combinations of these in both conventional and naturally fractured oil reservoirs. The first task was the addition of a dual-porosity model for chemical IOR in naturally fractured oil reservoirs. They formulated and implemented a multiphase, multicomponent dual porosity model for enhanced oil recovery from naturally fractured reservoirs. The multiphase dual porosity model was tested against analytical solutions, coreflood data, and commercial simulators. The second task was the addition of a foam model. They implemented a semi-empirical surfactant/foam model in UTCHEM and validated the foam model by comparison with published laboratory data. The third task addressed several numerical and coding enhancements that will greatly improve its versatility and performance. Major enhancements were made in UTCHEM output files and memory management. A graphical user interface to set up the simulation input and to process the output data on a Windows PC was developed. New solvers for solving the pressure equation and geochemical system of equations were implemented and tested. A corner point grid geometry option for gridding complex reservoirs was implemented and tested. Enhancements of physical property models for both chemical and microbial IOR simulations were included in the final task of this proposal. Additional options for calculating the physical properties such as relative permeability and capillary pressure were added. A microbiological population model was developed and incorporated into UTCHEM. They have applied the model to microbial enhanced oil recovery (MEOR) processes by including the capability of permeability reduction due to biomass growth and retention. The formations of bio-products such as surfactant and polymer surfactant have also been incorporated.
NASA Technical Reports Server (NTRS)
Garland, D. B.
1980-01-01
Modifications were made to the model to improve longitudinal acceleration capability during transition from hovering to wing-borne flight. A rearward deflection of the fuselage augmentor thrust vector is shown to be beneficial in this regard. Other augmentor modifications were tested, notably the removal of both endplates, which improved acceleration performance at the higher transition speeds. The model tests again demonstrated minimal interference of the fuselage augmentor with aerodynamic lift. A flapped canard surface also shows negligible influence on the performance of the wing and of the fuselage augmentor.
DeMO: An Ontology for Discrete-event Modeling and Simulation.
Silver, Gregory A; Miller, John A; Hybinette, Maria; Baramidze, Gregory; York, William S
2011-09-01
Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community.
Software Surface Modeling and Grid Generation Steering Committee
NASA Technical Reports Server (NTRS)
Smith, Robert E. (Editor)
1992-01-01
It is a NASA objective to promote improvements in the capability and efficiency of computational fluid dynamics. Grid generation, the creation of a discrete representation of the solution domain, is an essential part of computational fluid dynamics. However, grid generation about complex boundaries requires sophisticated surface-model descriptions of the boundaries. The surface modeling and the associated computation of surface grids consume an extremely large percentage of the total time required for volume grid generation. Efficient and user friendly software systems for surface modeling and grid generation are critical for computational fluid dynamics to reach its potential. The papers presented here represent the state-of-the-art in software systems for surface modeling and grid generation. Several papers describe improved techniques for grid generation.
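As one concrete example of the algebraic grid-generation techniques surveyed in work of this kind, a minimal 2-D transfinite interpolation (TFI) sketch is given below; the boundary curves are invented for illustration and are not taken from any paper in the volume.

    import numpy as np

    def tfi_grid(bottom, top, left, right):
        """2-D transfinite interpolation: blend four boundary curves into an interior grid.
        bottom/top are (n, 2) arrays, left/right are (m, 2) arrays; corner points must match."""
        n, m = len(bottom), len(left)
        xi = np.linspace(0.0, 1.0, n)[:, None, None]     # shape (n, 1, 1)
        eta = np.linspace(0.0, 1.0, m)[None, :, None]    # shape (1, m, 1)
        B, T = bottom[:, None, :], top[:, None, :]       # (n, 1, 2)
        L, R = left[None, :, :], right[None, :, :]       # (1, m, 2)
        c00, c10, c01, c11 = bottom[0], bottom[-1], top[0], top[-1]
        return ((1 - eta) * B + eta * T + (1 - xi) * L + xi * R
                - ((1 - xi) * (1 - eta) * c00 + xi * (1 - eta) * c10
                   + (1 - xi) * eta * c01 + xi * eta * c11))   # shape (n, m, 2)

    # Illustrative boundaries: a unit square with a bulged top edge.
    s = np.linspace(0, 1, 21)
    bottom = np.stack([s, np.zeros_like(s)], axis=1)
    top    = np.stack([s, 1.0 + 0.2 * np.sin(np.pi * s)], axis=1)
    left   = np.stack([np.zeros_like(s), s], axis=1)
    right  = np.stack([np.ones_like(s), s], axis=1)
    xy = tfi_grid(bottom, top, left, right)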
DeMO: An Ontology for Discrete-event Modeling and Simulation
Silver, Gregory A; Miller, John A; Hybinette, Maria; Baramidze, Gregory; York, William S
2011-01-01
Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community. PMID:22919114
McHenry, John N; Vukovich, Jeffery M; Hsu, N Christina
2015-12-01
This two-part paper reports on the development, implementation, and improvement of a version of the Community Multi-Scale Air Quality (CMAQ) model that assimilates real-time remotely-sensed aerosol optical depth (AOD) information and ground-based PM2.5 monitor data in routine prognostic application. The model is being used by operational air quality forecasters to help guide their daily issuance of state or local-agency-based air quality alerts (e.g. action days, health advisories). Part 1 describes the development and testing of the initial assimilation capability, which was implemented offline in partnership with NASA and the Visibility Improvement State and Tribal Association of the Southeast (VISTAS) Regional Planning Organization (RPO). In the initial effort, MODIS-derived aerosol optical depth (AOD) data are input into a variational data-assimilation scheme using both the traditional Dark Target and relatively new "Deep Blue" retrieval methods. Evaluation of the developmental offline version, reported in Part 1 here, showed sufficient promise to implement the capability within the online, prognostic operational model described in Part 2. In Part 2, the addition of real-time surface PM2.5 monitoring data to improve the assimilation and an initial evaluation of the prognostic modeling system across the continental United States (CONUS) is presented. Air quality forecasts are now routinely used to understand when air pollution may reach unhealthy levels. For the first time, an operational air quality forecast model that includes the assimilation of remotely-sensed aerosol optical depth and ground based PM2.5 observations is being used. The assimilation enables quantifiable improvements in model forecast skill, which improves confidence in the accuracy of the officially-issued forecasts. This helps air quality stakeholders be more effective in taking mitigating actions (reducing power consumption, ride-sharing, etc.) and avoiding exposures that could otherwise result in more serious air quality episodes or more deleterious health effects.
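The assimilation step can be illustrated, in drastically simplified form, by a single optimal-interpolation (linear 3D-Var-style) analysis on a small gridded aerosol field; the grid, observation operator, and error covariances below are invented for the sketch and do not represent the CMAQ implementation.

    import numpy as np

    def oi_analysis(xb, y, H, B, R):
        """One optimal-interpolation (equivalently, linear 3D-Var) analysis step:
        xa = xb + B H^T (H B H^T + R)^-1 (y - H xb)."""
        K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
        return xb + K @ (y - H @ xb)

    # Illustrative 1-D "grid" of 10 cells of background PM2.5 (ug/m3) and 3 point observations.
    n = 10
    xb = np.full(n, 12.0)                                      # background forecast
    H = np.zeros((3, n)); H[0, 2] = H[1, 5] = H[2, 8] = 1.0    # observations at cells 2, 5, 8
    y = np.array([20.0, 18.0, 9.0])                            # monitor / retrieved values
    dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    B = 4.0 * np.exp(-dist / 2.0)                              # spatially correlated background errors
    R = 1.0 * np.eye(3)                                        # independent observation errors
    xa = oi_analysis(xb, y, H, B, R)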
Image-optimized Coronal Magnetic Field Models
NASA Astrophysics Data System (ADS)
Jones, Shaela I.; Uritsky, Vadim; Davila, Joseph M.
2017-08-01
We have reported previously on a new method we are developing for using image-based information to improve global coronal magnetic field models. In that work, we presented early tests of the method, which proved its capability to improve global models based on flawed synoptic magnetograms, given excellent constraints on the field in the model volume. In this follow-up paper, we present the results of similar tests given field constraints of a nature that could realistically be obtained from quality white-light coronagraph images of the lower corona. We pay particular attention to difficulties associated with the line-of-sight projection of features outside of the assumed coronagraph image plane and the effect on the outcome of the optimization of errors in the localization of constraints. We find that substantial improvement in the model field can be achieved with these types of constraints, even when magnetic features in the images are located outside of the image plane.
Image-Optimized Coronal Magnetic Field Models
NASA Technical Reports Server (NTRS)
Jones, Shaela I.; Uritsky, Vadim; Davila, Joseph M.
2017-01-01
We have reported previously on a new method we are developing for using image-based information to improve global coronal magnetic field models. In that work we presented early tests of the method which proved its capability to improve global models based on flawed synoptic magnetograms, given excellent constraints on the field in the model volume. In this follow-up paper we present the results of similar tests given field constraints of a nature that could realistically be obtained from quality white-light coronagraph images of the lower corona. We pay particular attention to difficulties associated with the line-of-sight projection of features outside of the assumed coronagraph image plane, and the effect on the outcome of the optimization of errors in localization of constraints. We find that substantial improvement in the model field can be achieved with this type of constraints, even when magnetic features in the images are located outside of the image plane.
NASA Technical Reports Server (NTRS)
Kvaternik, Raymond G.
1992-01-01
An overview is presented of government contributions to the program called Design Analysis Methods for Vibrations (DAMV), which attempted to develop finite-element-based analyses of rotorcraft vibrations. NASA initiated the program with a finite-element modeling program for the CH-47D tandem-rotor helicopter. The DAMV program emphasized four areas: airframe finite-element modeling, difficult-components studies, coupled rotor-airframe vibrations, and airframe structural optimization. Key accomplishments of the program include industrywide standards for modeling metal and composite airframes, improved industrial designs for vibrations, and the identification of critical structural contributors to airframe vibratory responses. The program also demonstrated the value of incorporating secondary modeling details in improving correlation, and the findings provide the basis for an improved finite-element-based dynamics design-analysis capability.
Designing teams of unattended ground sensors using genetic algorithms
NASA Astrophysics Data System (ADS)
Yilmaz, Ayse S.; McQuay, Brian N.; Wu, Annie S.; Sciortino, John C., Jr.
2004-04-01
Improvements in sensor capabilities have driven the need for automated sensor allocation and management systems. Such systems provide a penalty-free test environment and valuable input to human operators by offering candidate solutions. These abilities lead, in turn, to savings in manpower and time. Determining an optimal team of cooperating sensors for military operations is a challenging task. There is a tradeoff between the desire to decrease the cost and the need to increase the sensing capabilities of a sensor suite. This work focuses on unattended ground sensor networks consisting of teams of small, inexpensive sensors. Given a possible configuration of enemy radar, our goal is to generate sensor suites that monitor as many enemy radar as possible while minimizing cost. In previous work, we have shown that genetic algorithms (GAs) can be used to evolve successful teams of sensors for this problem. This work extends our previous work in two ways: we use an improved simulator containing a more accurate model of radar and sensor capabilities for our fitness evaluations, and we introduce two new genetic operators, insertion and deletion, that are expected to improve the GA's fine tuning abilities. Empirical results show that our GA approach produces near optimal results under a variety of enemy radar configurations using sensors with varying capabilities. Detection percentage remains stable regardless of changes in the enemy radar placements.
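A minimal sketch of a variable-length GA with insertion and deletion operators of the kind described above; the coverage-versus-cost fitness function and all parameters are invented stand-ins for the radar/sensor simulator.

    import random

    random.seed(1)
    RADARS = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(15)]
    SENSOR_RANGE, SENSOR_COST = 25.0, 1.0

    def fitness(team):
        """Reward radar coverage, penalize team cost (toy stand-in for the simulator)."""
        covered = sum(any((sx - rx) ** 2 + (sy - ry) ** 2 <= SENSOR_RANGE ** 2
                          for sx, sy in team) for rx, ry in RADARS)
        return covered - 0.1 * SENSOR_COST * len(team)

    def insert(team):   # insertion operator: add a random sensor
        return team + [(random.uniform(0, 100), random.uniform(0, 100))]

    def delete(team):   # deletion operator: drop a sensor (keep at least one)
        return team[:-1] if len(team) > 1 else team

    def mutate(team):   # jitter sensor positions
        return [(x + random.gauss(0, 2), y + random.gauss(0, 2)) for x, y in team]

    def evolve(pop_size=30, generations=100):
        pop = [insert([]) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]
            children = [random.choice([insert, delete, mutate])(random.choice(parents))
                        for _ in range(pop_size - len(parents))]
            pop = parents + children
        return max(pop, key=fitness)

    best_team = evolve()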
High fidelity studies of exploding foil initiator bridges, Part 1: Experimental method
NASA Astrophysics Data System (ADS)
Bowden, Mike; Neal, William
2017-01-01
Simulations of high voltage detonators, such as Exploding Bridgewire (EBW) and Exploding Foil Initiator (EFI) detonators, have historically been simple, often empirical, one-dimensional models capable of predicting parameters such as current, voltage and, in the case of EFIs, flyer velocity. Correspondingly, experimental methods have in general been limited to the same parameters. With the advent of complex, first principles magnetohydrodynamic codes such as ALEGRA and ALE-MHD, it is now possible to simulate these components in three dimensions, predicting a much greater range of parameters than before. A significant improvement in experimental capability was therefore required to ensure these simulations could be adequately validated. In this first paper of a three-part study, the experimental method for determining the current, voltage, flyer velocity and multi-dimensional profile of detonator components is presented. This improved capability, along with high fidelity simulations, offers an opportunity to gain a greater understanding of the processes behind the functioning of EBW and EFI detonators.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epiney, Aaron Simon; Chen, Jun; Rabiti, Cristian
Continued effort to design and build a modeling and simulation framework to assess the economic viability of Nuclear Hybrid Energy Systems (NHES) was undertaken in fiscal year (FY) 2016. The purpose of this report is to document the various tasks associated with the development of such a framework and to provide a status of their progress. Several tasks have been accomplished. First, a synthetic time history generator has been developed in RAVEN, which consists of a Fourier series and an autoregressive moving average model. The former is used to capture the seasonal trend in historical data, while the latter characterizes the autocorrelation in the residual time series (e.g., measurements with seasonal trends subtracted). As a demonstration, both synthetic wind speed and grid demand are generated, showing statistics that match the database. In order to build a design and operations optimizer in RAVEN, a new type of sampler has been developed with a highly object-oriented design. In particular, the simultaneous perturbation stochastic approximation algorithm is implemented. The optimizer is capable of driving the model to optimize a scalar objective function without constraints in the input space, while constraint handling is a work in progress and will be implemented to improve the optimization capability. Furthermore, a simplified cash flow model of the performance of an NHES in the electric market has been developed in Python and used as an external model in RAVEN to confirm expectations on the analysis capability of RAVEN to provide insight into system economics and to test the capability of RAVEN to identify limit surfaces. Finally, an example calculation is performed that shows the integration and proper data passing in RAVEN of the synthetic time history generator, the cash flow model and the optimizer. It has been shown that the developed Python models external to RAVEN are able to communicate with RAVEN and each other through the newly developed RAVEN capability called "EnsembleModel".
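A minimal sketch of the Fourier-plus-ARMA idea (generation only, not fitting): a deterministic seasonal Fourier component plus ARMA(1,1) residuals; the coefficients are invented and the RAVEN implementation is not reproduced.

    import numpy as np

    def synthetic_series(n_hours, rng=None):
        """Seasonal Fourier trend + ARMA(1,1) residuals, in the spirit of a synthetic
        time-history generator (illustrative coefficients only)."""
        rng = rng or np.random.default_rng(1)
        t = np.arange(n_hours)
        # Fourier part: daily and yearly cycles around a mean level of 100 (arbitrary units)
        trend = (100.0
                 + 15.0 * np.sin(2 * np.pi * t / 24.0)
                 + 10.0 * np.cos(2 * np.pi * t / (24.0 * 365)))
        # ARMA(1,1) residuals: r_t = phi * r_{t-1} + eps_t + theta * eps_{t-1}
        phi, theta, sigma = 0.8, 0.3, 2.0
        eps = rng.normal(0.0, sigma, n_hours)
        resid = np.zeros(n_hours)
        for i in range(1, n_hours):
            resid[i] = phi * resid[i - 1] + eps[i] + theta * eps[i - 1]
        return trend + resid

    demand = synthetic_series(24 * 30)   # one month of hourly values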
Sensing and Virtual Worlds - A Survey of Research Opportunities
NASA Technical Reports Server (NTRS)
Moore, Dana
2012-01-01
Virtual Worlds (VWs) have been used effectively in live and constructive military training. An area that remains fertile ground for exploration and a new vision involves integrating various traditional and now non-traditional sensors into virtual worlds. In this paper, we will assert that the benefits of this integration are several. First, we maintain that virtual worlds offer improved sensor deployment planning through improved visualization and stimulation of the model, using geo-specific terrain and structure. Secondly, we assert that VWs enhance the mission rehearsal process, and that using a mix of live avatars, non-player characters, and live sensor feeds (e.g. real time meteorology) can help visualization of the area of operations. Finally, tactical operations are improved via better collaboration and integration of real world sensing capabilities, and in most situations, 3D VWs improve the state of the art over current "dots on a map" 2D geospatial visualization. However, several capability gaps preclude a fuller realization of this vision. In this paper, we identify many of these gaps and suggest research directions.
Rethinking Strategy and Strategic Leadership in Schools.
ERIC Educational Resources Information Center
Davies, Brent
2003-01-01
Reviews nature of strategy and strategic leadership in schools. Considers how leaders can map and reconceptualize the nature of strategy and develop strategic capabilities for longer-term sustainability. Questions hierarchical models of leadership. Highlights three characteristics of strategically oriented schools; suggests ways to improve art of…
DOT National Transportation Integrated Search
2015-04-01
Research done through the Second Strategic Highway Research Program (SHRP 2) determined that agencies with the most effective transportation systems management and operations (TSM&O) activities were differentiated not by budgets or technical skills a...
NASA Technical Reports Server (NTRS)
Gaston, S.; Wertheim, M.; Orourke, J. A.
1973-01-01
Summary, consolidation and analysis of specifications, manufacturing process and test controls, and performance results for OAO-2 and OAO-3 lot 20 Amp-Hr sealed nickel cadmium cells and batteries are reported. Correlation of improvements in control requirements with performance is a key feature. Updates for a cell/battery computer model to improve performance prediction capability are included. Applicability of regression analysis computer techniques to relate process controls to performance is checked.
2012-09-30
improving forecast performance over cloudy regions using the Ozone Monitoring Instrument (OMI) Aerosol Index; and 2) preparing for the post-MODIS...meteorological fields, the International Geosphere-Biosphere Programme (IGBP) SW and LW surface characteristics, and an ozone climatology are used as...The primary impact of CALIOP assimilation on the model is the redistribution of mass toward the boundary layer from the free troposphere. For high
Integrated Modeling and Analysis of Physical Oceanographic and Acoustic Processes
2015-09-30
goal is to improve ocean physical state and acoustic state predictive capabilities. The goal fitting the scope of this project is the creation of... Project-scale objectives are to complete targeted studies of oceanographic processes in a few regimes, accompanied by studies of acoustic propagation...by the basic research efforts of this project. An additional objective is to develop improved computational tools for acoustics and for the
Extremum Seeking Control of Smart Inverters for VAR Compensation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arnold, Daniel; Negrete-Pincetic, Matias; Stewart, Emma
2015-09-04
Reactive power compensation is used by utilities to ensure customer voltages are within pre-defined tolerances and reduce system resistive losses. While much attention has been paid to model-based control algorithms for reactive power support and Volt Var Optimization (VVO), these strategies typically require relatively large communications capabilities and accurate models. In this work, a non-model-based control strategy for smart inverters is considered for VAR compensation. An Extremum Seeking control algorithm is applied to modulate the reactive power output of inverters based on real power information from the feeder substation, without an explicit feeder model. Simulation results using utility demand information confirm the ability of the control algorithm to inject VARs to minimize feeder head real power consumption. In addition, we show that the algorithm is capable of improving feeder voltage profiles and reducing reactive power supplied by the distribution substation.
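A minimal sketch of an extremum-seeking loop of the kind described above, driven by a toy feeder model in which head real power is a quadratic function of the inverter VAR setpoint; the plant, gains, and probe frequency are invented for illustration.

    import numpy as np

    def feeder_power(q_var):
        """Toy plant: feeder-head real power (kW) vs. inverter VAR setpoint (kVAR),
        minimized at q_var = 40 (illustrative only)."""
        return 500.0 + 0.05 * (q_var - 40.0) ** 2

    def extremum_seek(steps=20000, dt=0.01, a=2.0, omega=2 * np.pi, k=0.5, omega_hp=1.0):
        """Classic extremum-seeking loop: perturb the setpoint with a*sin(wt), wash out the
        DC part of the measured power, demodulate, and integrate the gradient estimate."""
        q_hat = 0.0
        eta = feeder_power(q_hat)              # low-pass state acting as a washout filter
        for i in range(steps):
            t = i * dt
            probe = a * np.sin(omega * t)
            y = feeder_power(q_hat + probe)            # measurement from the substation
            eta += omega_hp * (y - eta) * dt           # low-pass estimate of y
            q_hat -= k * (y - eta) * probe * dt        # demodulate and integrate
        return q_hat

    print("converged VAR setpoint ~", round(extremum_seek(), 1))

No feeder model appears in the controller itself; only the scalar power measurement is used, which is the non-model-based property the abstract emphasizes.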
Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control
NASA Technical Reports Server (NTRS)
Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.
2016-01-01
A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.
Modeling a Civil Event Case Study for Consequence Management Using the IMPRINT Forces Module
NASA Technical Reports Server (NTRS)
Gacy, Marc; Gosakan, Mala; Eckdahl, Angela; Miller, Jeffrey R.
2012-01-01
A critical challenge in the Consequence Management (CM) domain is the appropriate allocation of necessary and skilled military and civilian personnel and materiel resources in unexpected emergencies. To aid this process we used the Forces module in the Improved Performance Research Integration Tool (IMPRINT). This module enables analysts to enter personnel and equipment capabilities, prioritized schedules and numbers available, along with unexpected emergency requirements, in order to assess force response requirements. Using a suspected terrorist threat on a college campus, we developed a test case model which exercised the capabilities of the module, including the scope and scale of operations. The model incorporates data from multiple sources, including daily schedules and frequency of events such as fire calls. Our preliminary results indicate that the model can predict potential decreases in civilian emergency response coverage due to an involved unplanned incident requiring significant portions of police, fire, and civil response teams.
NASA Stennis Space Center integrated system health management test bed and development capabilities
NASA Astrophysics Data System (ADS)
Figueroa, Fernando; Holland, Randy; Coote, David
2006-05-01
Integrated System Health Management (ISHM) capability for rocket propulsion testing is rapidly evolving and promises substantial reduction in time and cost of propulsion systems development, with substantially reduced operational costs and evolutionary improvements in launch system operational robustness. NASA Stennis Space Center (SSC), along with partners that include NASA, contractors, and academia, is investigating and developing technologies to enable ISHM capability in SSC's rocket engine test stands (RETS). This will enable validation and experience capture over a broad range of rocket propulsion systems of varying complexity. This paper describes key components that constitute necessary ingredients to make implementation of credible ISHM capability possible in RETS, other NASA ground test and operations facilities, and ultimately spacecraft and space platforms and systems: (1) core technologies for ISHM, (2) RETS as ISHM testbeds, and (3) RETS systems models.
NASA Astrophysics Data System (ADS)
Rimland, Jeffrey; McNeese, Michael; Hall, David
2013-05-01
Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state-of-the-art still falls short of creating computer systems capable of autonomously making complex decisions and judgments in many domains where data is nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention on relevant information elements based on both a priori knowledge of the analyst's goals and the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's Recognition Primed Decision (RPD) model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high-performance, data-driven nature of CEP techniques provides a natural complement to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability as well as enabling the ability to perform collaborative context-aware reasoning in both human teams and hybrid human/software agent teams.
The HackensackUMC Value-Based Care Model: Building Essentials for Value-Based Purchasing.
Douglas, Claudia; Aroh, Dianne; Colella, Joan; Quadri, Mohammed
2016-01-01
The Affordable Care Act of 2010, and the subsequent shift from a quantity-focused to a value-centric reimbursement model, led our organization to create the HackensackUMC Value-Based Care Model to improve our process capability and performance to meet and sustain the triple aims of value-based purchasing: higher quality, lower cost, and consumer perception. This article describes the basics of our model and illustrates how we used it to reduce the costs of our patient sitter program.
NASA Technical Reports Server (NTRS)
Molthan, Andrew; Case, Jonathan; Venner, Jason; Moreno-Madrinan, Max J.; Delgado, Francisco
2012-01-01
Two projects at NASA Marshall Space Flight Center have collaborated to develop a high resolution weather forecast model for Mesoamerica: the NASA Short-term Prediction Research and Transition (SPoRT) Center, which integrates unique NASA satellite and weather forecast modeling capabilities into the operational weather forecasting community; and NASA's SERVIR Program, which integrates satellite observations, ground-based data, and forecast models to improve disaster response in Central America, the Caribbean, Africa, and the Himalayas.
Analysis of the In-Water and Sky Radiance Distribution Data Acquired During the RaDyO Project
2011-09-30
radiative transfer to model the BRDF of particulate surfaces. OBJECTIVES The major objective of this research is to understand the downwelling spectral...in the water, was also used by the two major modeling groups in RaDyO, to successfully validate their radiative transfer models. This work is...image and radiative transfer models used in the ocean. My near term ocean optics objectives have been: 1) to improve the measurement capability of
Next generation agricultural system data, models and knowledge products: Introduction.
Antle, John M; Jones, James W; Rosenzweig, Cynthia E
2017-07-01
Agricultural system models have become important tools to provide predictive and assessment capability to a growing array of decision-makers in the private and public sectors. Despite ongoing research and model improvements, many of the agricultural models today are direct descendants of research investments initially made 30-40 years ago, and many of the major advances in data, information and communication technology (ICT) of the past decade have not been fully exploited. The purpose of this Special Issue of Agricultural Systems is to lay the foundation for the next generation of agricultural systems data, models and knowledge products. The Special Issue is based on a "NextGen" study led by the Agricultural Model Intercomparison and Improvement Project (AgMIP) with support from the Bill and Melinda Gates Foundation.
Next Generation Agricultural System Data, Models and Knowledge Products: Introduction
NASA Technical Reports Server (NTRS)
Antle, John M.; Jones, James W.; Rosenzweig, Cynthia E.
2016-01-01
Agricultural system models have become important tools to provide predictive and assessment capability to a growing array of decision-makers in the private and public sectors. Despite ongoing research and model improvements, many of the agricultural models today are direct descendants of research investments initially made 30-40 years ago, and many of the major advances in data, information and communication technology (ICT) of the past decade have not been fully exploited. The purpose of this Special Issue of Agricultural Systems is to lay the foundation for the next generation of agricultural systems data, models and knowledge products. The Special Issue is based on a 'NextGen' study led by the Agricultural Model Intercomparison and Improvement Project (AgMIP) with support from the Bill and Melinda Gates Foundation.
A new bead-spring model for simulation of semi-flexible macromolecules
NASA Astrophysics Data System (ADS)
Saadat, Amir; Khomami, Bamin
2016-11-01
A bead-spring model for semi-flexible macromolecules is developed to overcome the deficiencies of the current coarse-grained bead-spring models. Specifically, model improvements are achieved through incorporation of a bending potential. The new model is designed to accurately describe the correlation along the backbone of the chain, segmental length, and force-extension behavior of the macromolecule even at the limit of 1 Kuhn step per spring. The relaxation time of different Rouse modes is used to demonstrate the capabilities of the new model in predicting chain dynamics.
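A minimal sketch of the bending-potential ingredient for a discrete bead-spring chain, using the common form U = kappa * sum(1 - cos(theta)); the stiffness and chain configuration are illustrative, and the spring, excluded-volume, and hydrodynamic terms of the full model are omitted.

    import numpy as np

    def bending_energy(positions, kappa):
        """U = kappa * sum(1 - cos(theta_i)) over interior beads, where theta_i is the
        angle between successive bond vectors; kappa sets the chain's semi-flexibility."""
        bonds = np.diff(positions, axis=0)                      # bond vectors between beads
        u = bonds / np.linalg.norm(bonds, axis=1, keepdims=True)
        cos_theta = np.sum(u[:-1] * u[1:], axis=1)              # cosine of the angle at each interior bead
        return kappa * np.sum(1.0 - cos_theta)

    # Illustrative 5-bead chain with a gentle kink; kappa in units of kT.
    chain = np.array([[0.0, 0, 0], [1.0, 0, 0], [2.0, 0.2, 0], [3.0, 0.5, 0], [4.0, 1.0, 0]])
    print("bending energy:", bending_energy(chain, kappa=10.0))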
Del Rio-Chanona, Ehecatl A; Liu, Jiao; Wagner, Jonathan L; Zhang, Dongda; Meng, Yingying; Xue, Song; Shah, Nilay
2018-02-01
Biodiesel produced from microalgae has been extensively studied due to its potentially outstanding advantages over traditional transportation fuels. In order to facilitate its industrialization and improve the process profitability, it is vital to construct highly accurate models capable of predicting the complex behavior of the investigated biosystem for process optimization and control, which forms the current research goal. Three original contributions are described in this paper. Firstly, a dynamic model is constructed to simulate the complicated effect of light intensity, nutrient supply and light attenuation on both biomass growth and biolipid production. Secondly, chlorophyll fluorescence, an instantly measurable variable and indicator of photosynthetic activity, is embedded into the model to monitor and update model accuracy, especially for the purpose of future process optimal control, and its correlation with intracellular nitrogen content is quantified, which to the best of our knowledge has never been addressed so far. Thirdly, a thorough experimental verification is conducted under different scenarios, including both continuous illumination and light/dark cycle conditions, to test the model's predictive capability, particularly for long-term operation, and it is concluded that the current model is characterized by a high level of predictive capability. Based on the model, the optimal light intensity for algal biomass growth and lipid synthesis is estimated. This work therefore paves the way for future process design and real-time optimization. © 2017 Wiley Periodicals, Inc.
Short National Early Warning Score - Developing a Modified Early Warning Score.
Luís, Leandro; Nunes, Carla
2017-12-11
Early Warning Score (EWS) systems have been developed for detecting hospital patients' clinical deterioration. Many studies show that a National Early Warning Score (NEWS) performs well in discriminating survival from death in acute medical and surgical hospital wards. NEWS is validated for Portugal and is available for use. A simpler EWS system may help to reduce the risk of error, as well as increase clinician compliance with the tool. The aim of the study was to evaluate whether a simplified NEWS model would improve use and data collection. We evaluated the ability of single and aggregated parameters from the NEWS model to detect patients' clinical deterioration in the 24 h prior to an outcome. There were two possible outcomes: survival versus unanticipated intensive care unit admission or death. We used binary logistic regression models and Receiver Operating Characteristic (ROC) curves to evaluate the parameters' performance in discriminating among the outcomes for a sample of patients from 6 Portuguese hospital wards. NEWS presented an excellent discriminating capability (area under the ROC curve (AUCROC) = 0.944). The temperature and systolic blood pressure (SBP) parameters did not contribute significantly to the model. We developed two different models, one without temperature, and the other removing both temperature and SBP (M2). Both models had an excellent discriminating capability (AUCROC: 0.965; 0.903, respectively) and good predictive power at the optimum threshold of the ROC curve. The three models revealed similar discriminant capabilities. Although the use of SBP is not clearly evident in the identification of clinical deterioration, it is recognized as an important vital sign. We recommend the use of the first new model, as its simplicity may help to improve adherence and use by health care workers. Copyright © 2017 Australian College of Critical Care Nurses Ltd. Published by Elsevier Ltd. All rights reserved.
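A minimal sketch of the modeling workflow (binary logistic regression on vital-sign parameters, then ROC/AUC comparison of full and reduced models) on synthetic data, assuming scikit-learn is available; the synthetic vitals and coefficients are invented and carry no clinical meaning.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 2000
    # Synthetic vital signs: respiratory rate, SpO2, temperature, systolic BP, heart rate
    X = np.column_stack([
        rng.normal(18, 4, n),      # respiratory rate
        rng.normal(96, 3, n),      # oxygen saturation
        rng.normal(36.8, 0.6, n),  # temperature
        rng.normal(125, 20, n),    # systolic blood pressure
        rng.normal(80, 15, n),     # heart rate
    ])
    # Synthetic outcome: deterioration driven mainly by RR, SpO2 and HR (illustrative only)
    logit = 0.25 * (X[:, 0] - 18) - 0.30 * (X[:, 1] - 96) + 0.05 * (X[:, 4] - 80) - 3.0
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    def auc_for(columns):
        model = LogisticRegression(max_iter=1000).fit(X[:, columns], y)
        return roc_auc_score(y, model.predict_proba(X[:, columns])[:, 1])

    print("full model AUC:          ", round(auc_for([0, 1, 2, 3, 4]), 3))
    print("without temperature:     ", round(auc_for([0, 1, 3, 4]), 3))
    print("without temperature/SBP: ", round(auc_for([0, 1, 4]), 3))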
Characterization of structural connections using free and forced response test data
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Huckelbridge, Arthur A.
1989-01-01
The accurate prediction of system dynamic response often has been limited by deficiencies in existing capabilities to characterize connections adequately. Connections between structural components often are complex mechanically and difficult to model accurately analytically. Improved analytical models for connections are needed to improve system dynamic predictions. A procedure for identifying physical connection properties from free and forced response test data is developed, then verified utilizing a system having both a linear and a nonlinear connection. Connection properties are computed in terms of physical parameters so that the physical characteristics of the connections can be better understood, in addition to providing improved input for the system model. The identification procedure is applicable to multi-degree-of-freedom systems and does not require that the test data be measured directly at the connection locations.
NASA Astrophysics Data System (ADS)
Jacox, M.; Edwards, C. A.; Kahru, M.; Rudnick, D. L.; Kudela, R. M.
2012-12-01
A 26-year record of depth integrated primary productivity (PP) in the Southern California Current System (SCCS) is analyzed with the goal of improving satellite net primary productivity (PP) estimates. The ratio of integrated primary productivity to surface chlorophyll correlates strongly to surface chlorophyll concentration (chl0). However, chl0 does not correlate to chlorophyll-specific productivity, and appears to be a proxy for vertical phytoplankton distribution rather than phytoplankton physiology. Modest improvements in PP model performance are achieved by tuning existing algorithms for the SCCS, particularly by empirical parameterization of photosynthetic efficiency in the Vertically Generalized Production Model. Much larger improvements are enabled by improving accuracy of subsurface chlorophyll and light profiles. In a simple vertically resolved production model, substitution of in situ surface data for remote sensing estimates offers only marginal improvements in model r2 and total log10 root mean squared difference, while inclusion of in situ chlorophyll and light profiles improves these metrics significantly. Autonomous underwater gliders, capable of measuring subsurface fluorescence on long-term, long-range deployments, significantly improve PP model fidelity in the SCCS. We suggest their use (and that of other autonomous profilers such as Argo floats) in conjunction with satellites as a way forward for improved PP estimation in coastal upwelling systems.
NASA Astrophysics Data System (ADS)
Jacox, Michael G.; Edwards, Christopher A.; Kahru, Mati; Rudnick, Daniel L.; Kudela, Raphael M.
2015-02-01
A 26-year record of depth integrated primary productivity (PP) in the Southern California Current System (SCCS) is analyzed with the goal of improving satellite net primary productivity (PP) estimates. Modest improvements in PP model performance are achieved by tuning existing algorithms for the SCCS, particularly by parameterizing carbon fixation rate in the vertically generalized production model as a function of surface chlorophyll concentration and distance from shore. Much larger improvements are enabled by improving the accuracy of subsurface chlorophyll and light profiles. In a simple vertically resolved production model for the SCCS (VRPM-SC), substitution of in situ surface data for remote sensing estimates offers only marginal improvements in model r2 (from 0.54 to 0.56) and total log10 root mean squared difference (from 0.22 to 0.21), while inclusion of in situ chlorophyll and light profiles improves these metrics to 0.77 and 0.15, respectively. Autonomous underwater gliders, capable of measuring subsurface properties on long-term, long-range deployments, significantly improve PP model fidelity in the SCCS. We suggest their use (and that of other autonomous profilers such as Argo floats) in conjunction with satellites as a way forward for large-scale improvements in PP estimation.
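For reference, a minimal sketch of the standard vertically generalized production model (VGPM) form that these studies start from; the example inputs are arbitrary, and the region-specific parameterizations developed in the papers are not included.

    def vgpm_pp(chl0, pb_opt, e0, z_eu, day_length):
        """Depth-integrated primary productivity (mg C m^-2 d^-1) from the standard VGPM:
        PP = 0.66125 * Pb_opt * E0/(E0 + 4.1) * chl0 * Z_eu * D.
        chl0: surface chlorophyll (mg m^-3); pb_opt: maximum carbon fixation rate
        (mg C (mg Chl)^-1 h^-1), normally modeled from sea surface temperature;
        e0: surface PAR (mol quanta m^-2 d^-1); z_eu: euphotic depth (m), normally
        derived from chl0; day_length: hours of daylight."""
        return 0.66125 * pb_opt * (e0 / (e0 + 4.1)) * chl0 * z_eu * day_length

    # Arbitrary illustrative inputs for a coastal upwelling pixel
    print(vgpm_pp(chl0=1.5, pb_opt=4.0, e0=45.0, z_eu=35.0, day_length=12.0))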
U.S. National / Naval Ice Center (NIC) Support to Naval and Maritime Operations
2011-06-20
States and Canadian governments. • International Arctic Buoy Programme (IABP): global participants working together to maintain a network of... [slide headings: Modeling, Surface Observations, Satellite, Air Recon, Data Fusion, Derived Data, Automation, Direct Data Dissemination; Today's Challenges] ...and AUVs • Improve modeling and forecasting capabilities (OTSR/WEAX) • More trained ice analysts, ice pilots, and Arctic marine weather forecasters
NASA Technical Reports Server (NTRS)
Mahoney, M. J.; Ismail, S.; Browell, E. V.; Ferrare, R. A.; Kooi, S. A.; Brasseur, L.; Notari, A.; Petway, L.; Brackett, V.; Clayton, M.;
2002-01-01
LASE measures high resolution moisture, aerosol, and cloud distributions not available from conventional observations. LASE water vapor measurements were compared with dropsondes to evaluate their accuracy. LASE water vapor measurements were used to assess the capability of hurricane models to improve their track accuracy by 100 km on 3 day forecasts using Florida State University models.
Development of Improved Algorithms and Multiscale Modeling Capability with SUNTANS
2013-09-30
solves the Navier-Stokes equations under the Boussinesq approximation (Fringer et al., 2006). The formulation is based on the method outlined by...stratified systems. Figure 4 shows a nonhydrostatic isopycnal simulation of oscillatory flow in a continuously stratified fluid over a Gaussian sill. This...Modeling the Earth System, Boulder (invited). Sankaranarayanan, S., and Fringer, O. B., 2013, "Dynamics of barotropic low-frequency fluctuations in
Approaches and possible improvements in the area of multibody dynamics modeling
NASA Technical Reports Server (NTRS)
Lips, K. W.; Singh, R.
1987-01-01
A wide ranging look is taken at issues involved in the dynamic modeling of complex, multibodied orbiting space systems. Capabilities and limitations of two major codes (DISCOS, TREETOPS) are assessed and possible extensions to the CONTOPS software are outlined. In addition, recommendations are made concerning the direction future development should take in order to achieve higher fidelity, more computationally efficient multibody software solutions.
Exploring cosmic origins with CORE: Inflation
NASA Astrophysics Data System (ADS)
Finelli, F.; Bucher, M.; Achúcarro, A.; Ballardini, M.; Bartolo, N.; Baumann, D.; Clesse, S.; Errard, J.; Handley, W.; Hindmarsh, M.; Kiiveri, K.; Kunz, M.; Lasenby, A.; Liguori, M.; Paoletti, D.; Ringeval, C.; Väliviita, J.; van Tent, B.; Vennin, V.; Ade, P.; Allison, R.; Arroja, F.; Ashdown, M.; Banday, A. J.; Banerji, R.; Bartlett, J. G.; Basak, S.; de Bernardis, P.; Bersanelli, M.; Bonaldi, A.; Borril, J.; Bouchet, F. R.; Boulanger, F.; Brinckmann, T.; Burigana, C.; Buzzelli, A.; Cai, Z.-Y.; Calvo, M.; Carvalho, C. S.; Castellano, G.; Challinor, A.; Chluba, J.; Colantoni, I.; Coppolecchia, A.; Crook, M.; D'Alessandro, G.; D'Amico, G.; Delabrouille, J.; Desjacques, V.; De Zotti, G.; Diego, J. M.; Di Valentino, E.; Feeney, S.; Fergusson, J. R.; Fernandez-Cobos, R.; Ferraro, S.; Forastieri, F.; Galli, S.; García-Bellido, J.; de Gasperis, G.; Génova-Santos, R. T.; Gerbino, M.; González-Nuevo, J.; Grandis, S.; Greenslade, J.; Hagstotz, S.; Hanany, S.; Hazra, D. K.; Hernández-Monteagudo, C.; Hervias-Caimapo, C.; Hills, M.; Hivon, E.; Hu, B.; Kisner, T.; Kitching, T.; Kovetz, E. D.; Kurki-Suonio, H.; Lamagna, L.; Lattanzi, M.; Lesgourgues, J.; Lewis, A.; Lindholm, V.; Lizarraga, J.; López-Caniego, M.; Luzzi, G.; Maffei, B.; Mandolesi, N.; Martínez-González, E.; Martins, C. J. A. P.; Masi, S.; McCarthy, D.; Matarrese, S.; Melchiorri, A.; Melin, J.-B.; Molinari, D.; Monfardini, A.; Natoli, P.; Negrello, M.; Notari, A.; Oppizzi, F.; Paiella, A.; Pajer, E.; Patanchon, G.; Patil, S. P.; Piat, M.; Pisano, G.; Polastri, L.; Polenta, G.; Pollo, A.; Poulin, V.; Quartin, M.; Ravenni, A.; Remazeilles, M.; Renzi, A.; Roest, D.; Roman, M.; Rubiño-Martin, J. A.; Salvati, L.; Starobinsky, A. A.; Tartari, A.; Tasinato, G.; Tomasi, M.; Torrado, J.; Trappe, N.; Trombetti, T.; Tucci, M.; Tucker, C.; Urrestilla, J.; van de Weygaert, R.; Vielva, P.; Vittorio, N.; Young, K.; Zannoni, M.
2018-04-01
We forecast the scientific capabilities to improve our understanding of cosmic inflation of CORE, a proposed CMB space satellite submitted in response to the ESA fifth call for a medium-size mission opportunity. The CORE satellite will map the CMB anisotropies in temperature and polarization in 19 frequency channels spanning the range 60–600 GHz. CORE will have an aggregate noise sensitivity of 1.7 μK·arcmin and an angular resolution of 5' at 200 GHz. We explore the impact of telescope size and noise sensitivity on the inflation science return by making forecasts for several instrumental configurations. This study assumes that the lower and higher frequency channels suffice to remove foreground contaminations and complements other related studies of component separation and systematic effects, which will be reported in other papers of the series "Exploring Cosmic Origins with CORE." We forecast the capability to determine key inflationary parameters, to lower the detection limit for the tensor-to-scalar ratio down to the 10^-3 level, to chart the landscape of single field slow-roll inflationary models, to constrain the epoch of reheating, thus connecting inflation to the standard radiation-matter dominated Big Bang era, to reconstruct the primordial power spectrum, to constrain the contribution from isocurvature perturbations to the 10^-3 level, to improve constraints on the cosmic string tension to a level below the presumptive GUT scale, and to improve the current measurements of primordial non-Gaussianities down to the f_NL^local < 1 level. For all the models explored, CORE alone will improve significantly on the present constraints on the physics of inflation. Its capabilities will be further enhanced by combining with complementary future cosmological observations.
Coupling of TRAC-PF1/MOD2, Version 5.4.25, with NESTLE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knepper, P.L.; Hochreiter, L.E.; Ivanov, K.N.
1999-09-01
A three-dimensional (3-D) spatial kinetics capability within a thermal-hydraulics system code provides a more correct description of the core physics during reactor transients that involve significant variations in the neutron flux distribution. Coupled codes provide the ability to forecast safety margins in a best-estimate manner. The behavior of a reactor core and the feedback to the plant dynamics can be accurately simulated. For each time step, coupled codes are capable of resolving system interaction effects on neutronics feedback and are capable of describing local neutronics effects caused by the thermal hydraulics and neutronics coupling. With the improvements in computational technology, modeling complex reactor behaviors with coupled thermal hydraulics and spatial kinetics is feasible. Previously, reactor analysis codes were limited to either a detailed thermal-hydraulics model with simplified kinetics or multidimensional neutron kinetics with a simplified thermal-hydraulics model. The authors discuss the coupling of the Transient Reactor Analysis Code (TRAC)-PF1/MOD2, Version 5.4.25, with the NESTLE code.
University Research in Support of TREAT Modeling and Simulation, FY 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeHart, Mark David
Idaho National Laboratory is currently evolving the modeling and simulation (M&S) capability that will enable improved core operation as well as design and analysis of TREAT experiments. This M&S capability primarily uses MAMMOTH, a reactor physics application being developed under the Multi-physics Object Oriented Simulation Environment (MOOSE) framework. MAMMOTH allows the coupling of a number of other MOOSE-based applications. In support of this research, INL is working with four universities to explore advanced solution methods that will complement or augment capabilities in MAMMOTH. This report consists of a collection of year-end summaries of research from the universities performed in support of TREAT modeling and simulation. This research was led by Prof. Sedat Goluoglu at the University of Florida, Profs. Jim Morel and Jean Ragusa at Texas A&M University, Profs. Benoit Forget and Kord Smith at Massachusetts Institute of Technology, Prof. Leslie Kerby of Idaho State University and Prof. Barry Ganapol of University of Arizona. A significant number of students were supported at various levels through the projects and, for some, also as interns at INL.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Chao; Singh, Vijay P.; Mishra, Ashok K.
2013-02-06
This paper presents an improved bivariate mixed distribution, which is capable of modeling the dependence of daily rainfall from two distinct sources (e.g., rainfall from two stations, two consecutive days, or two instruments such as satellite and rain gauge). The distribution couples an existing framework for building a bivariate mixed distribution, the theory of copulae, and a hybrid marginal distribution. Contributions of the improved distribution are twofold. One is the appropriate selection of the bivariate dependence structure from a wider admissible choice (10 candidate copula families). The other is the introduction of a marginal distribution capable of better representing low to moderate values as well as extremes of daily rainfall. Among several applications of the improved distribution, particularly presented here is its utility for single-site daily rainfall simulation. Rather than simulating rainfall occurrences and amounts separately, the developed generator unifies the two processes by generalizing daily rainfall as a Markov process with autocorrelation described by the improved bivariate mixed distribution. The generator is first tested on a sample station in Texas. Results reveal that the simulated and observed sequences are in good agreement with respect to essential characteristics. Then, extensive simulation experiments are carried out to compare the developed generator with three other alternative models: the conventional two-state Markov chain generator, the transition probability matrix model, and the semi-parametric Markov chain model with kernel density estimation for rainfall amounts. Analyses establish that overall the developed generator is capable of reproducing characteristics of historical extreme rainfall events and is apt at extrapolating rare values beyond the upper range of available observed data. Moreover, it automatically captures the persistence of rainfall amounts on consecutive wet days in a relatively natural and easy way. Another interesting observation is that the recognized 'overdispersion' problem in daily rainfall simulation is attributable more to the loss of rainfall extremes than to the under-representation of first-order persistence. The developed generator appears to be a sound option for daily rainfall simulation, especially in particular hydrologic planning situations when rare rainfall events are of great importance.
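The core mechanism described above can be illustrated with a minimal sketch: a latent Gaussian AR(1) process plays the role of the Gaussian-copula dependence between consecutive days, and a mixed marginal (a dry-day atom at zero plus an exponential tail) stands in for the paper's hybrid marginal and 10-family copula selection. The dry-day probability, wet-day scale, and copula correlation below are illustrative assumptions, not fitted values from the study.

```python
# Minimal sketch of a copula-driven daily rainfall generator, assuming a
# Gaussian copula between consecutive days and a simple mixed marginal
# (probability-of-dry atom at zero plus an exponential tail).
import numpy as np
from scipy.stats import norm, expon

rng = np.random.default_rng(42)

P_DRY = 0.6      # assumed probability of a dry day
SCALE = 8.0      # assumed mean wet-day rainfall (mm)
RHO = 0.5        # assumed Gaussian-copula correlation between consecutive days

def mixed_ppf(u):
    """Inverse CDF of the mixed marginal: 0 with prob P_DRY, exponential above."""
    u = np.asarray(u)
    wet = u > P_DRY
    amounts = np.zeros_like(u, dtype=float)
    # rescale the uniform to (0, 1) on the wet branch before inverting
    amounts[wet] = expon.ppf((u[wet] - P_DRY) / (1.0 - P_DRY), scale=SCALE)
    return amounts

def simulate(n_days):
    """Markov simulation: today's latent Gaussian depends on yesterday's."""
    z = np.empty(n_days)
    z[0] = rng.standard_normal()
    for t in range(1, n_days):
        z[t] = RHO * z[t - 1] + np.sqrt(1.0 - RHO**2) * rng.standard_normal()
    return mixed_ppf(norm.cdf(z))   # map the latent AR(1) to rainfall via the copula

rain = simulate(3650)
print(f"wet fraction: {np.mean(rain > 0):.2f}, mean wet amount: {rain[rain > 0].mean():.1f} mm")
```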
The influence of institutional pressures on hospital electronic health record presence.
Fareed, Naleef; Bazzoli, Gloria J; Farnsworth Mick, Stephen S; Harless, David W
2015-05-01
Electronic health records (EHR) are a promising form of health information technology that could help US hospitals improve their quality of care and costs. During the study period explored (2005-2009), high expectations for EHR diffused across institutional stakeholders in the healthcare environment, which may have pressured hospitals to have EHR capabilities even in the presence of weak technical rationale for the technology. Using an extensive set of organizational theory-specific predictors, this study explored whether five factors - cause, constituents, content, context, and control - that reflect the nature of institutional pressures for EHR capabilities motivated hospitals to comply with these pressures. Using information from several national databases, an ordered probit regression model was estimated. The resulting predicted probabilities of EHR capabilities from the empirical model's estimates were used to test the study's five hypotheses, of which three were supported. When the underlying cause, dependence on constituents, or influence of control were high and potential countervailing forces were low, hospitals were more likely to employ strategic responses that were compliant with the institutional pressures for EHR capabilities. In light of these pressures, hospitals may have acquiesced, by having comprehensive EHR capabilities, or compromised, by having intermediate EHR capabilities, in order to maintain legitimacy in their environment. The study underscores the importance of our assessment for theory and policy development, and provides suggestions for future research. Copyright © 2015 Elsevier Ltd. All rights reserved.
Thermal niche estimators and the capability of poor dispersal species to cope with climate change
NASA Astrophysics Data System (ADS)
Sánchez-Fernández, David; Rizzo, Valeria; Cieslak, Alexandra; Faille, Arnaud; Fresneda, Javier; Ribera, Ignacio
2016-03-01
For management strategies in the context of global warming, accurate predictions of species response are mandatory. However, to date most predictions are based on niche (bioclimatic) models that usually overlook biotic interactions, behavioral adjustments or adaptive evolution, and assume that species can disperse freely without constraints. The deep subterranean environment minimises these uncertainties, as it is simple, homogeneous, and has constant environmental conditions. It is thus an ideal model system to study the effect of global change on species with poor dispersal capabilities. We assess the potential fate of a lineage of troglobitic beetles under global change predictions using different approaches to estimate their thermal niche: bioclimatic models, rates of thermal niche change estimated from a molecular phylogeny, and data from physiological studies. Using bioclimatic models, at most 60% of the species were predicted to have suitable conditions in 2080. Considering the rates of thermal niche change did not improve this prediction. However, physiological data suggest that subterranean species have a broad thermal tolerance, allowing them to withstand temperatures never experienced in their evolutionary history. These results stress the need for experimental approaches to assess the capability of poor dispersal species to cope with temperatures outside those they currently experience.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zou, Ling; Berry, R. A.; Martineau, R. C.
The RELAP-7 code is the next-generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multi-Physics Object Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's and TRACE's capabilities and extends their analysis capabilities for all reactor system simulation scenarios. The RELAP-7 code utilizes the well-posed 7-equation two-phase flow model for compressible two-phase flow. Closure models used in the TRACE code have been reviewed and selected to reflect the progress made during the past decades and to provide a basis for the closure correlations implemented in the RELAP-7 code. This document provides a summary of the closure correlations that are currently implemented in the RELAP-7 code. The closure correlations include sub-grid models that describe interactions between the fluids and the flow channel, and interactions between the two phases.
Cryogenic wind tunnels: Unique capabilities for the aerodynamicist
NASA Technical Reports Server (NTRS)
Hall, R. M.
1976-01-01
The cryogenic wind-tunnel concept is reviewed as a practical means of improving ground simulation of transonic flight conditions. The Langley 1/3-meter transonic cryogenic tunnel is operational, and the design of a cryogenic National Transonic Facility is under way. A review of some of the unique capabilities of cryogenic wind tunnels is presented. In particular, the advantages of having independent control of tunnel Mach number, total pressure, and total temperature are highlighted. This separate control over the three tunnel parameters will open new frontiers in Mach number, Reynolds number, aeroelastic, and model-tunnel interaction studies.
Seismographs, sensors, and satellites: Better technology for safer communities
Groat, C.G.
2004-01-01
In the past 25 years, our ability to measure, monitor, and model the processes that lead to natural disasters has increased dramatically. Equally important has been the improvement in our technological capability to communicate information about hazards to those whose lives may be affected. These innovations in tracking and communicating the changes (floods, earthquakes, wildfires, volcanic eruptions) in our dynamic planet, supported by a deeper understanding of earth processes, enable us to expand our predictive capabilities and point the way to a safer future. © 2004 Elsevier Ltd. All rights reserved.
Distributed generation capabilities of the national energy modeling system
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaCommare, Kristina Hamachi; Edwards, Jennifer L.; Marnay, Chris
2003-01-01
This report describes Berkeley Lab's exploration of how the National Energy Modeling System (NEMS) models distributed generation (DG) and presents possible approaches for improving how DG is modeled. The on-site electric generation capability has been available since the AEO2000 version of NEMS. Berkeley Lab has previously completed research on distributed energy resources (DER) adoption at individual sites and has developed a DER Customer Adoption Model called DER-CAM. Given interest in this area, Berkeley Lab set out to understand how NEMS models small-scale on-site generation, to assess how adequately DG is treated in NEMS, and to propose improvements or alternatives. The goal is to determine how well NEMS models the factors influencing DG adoption and to consider alternatives to the current approach. Most small-scale DG adoption takes place in the residential and commercial modules of NEMS. Investment in DG ultimately offsets purchases of electricity, which also eliminates the losses associated with transmission and distribution (T&D). If the DG technology that is chosen is photovoltaics (PV), NEMS assumes renewable energy consumption replaces the energy input to electric generators. If the DG technology is fuel consuming, consumption of fuel in the electric utility sector is replaced by residential or commercial fuel consumption. The waste heat generated from thermal technologies can be used to offset water heating and space heating energy uses, but there is no thermally activated cooling capability. This study consists of a review of model documentation and a paper by EIA staff, a series of sensitivity runs performed by Berkeley Lab that exercise selected DG parameters in the AEO2002 version of NEMS, and a scoping effort of possible enhancements and alternatives to NEMS' current DG capabilities. In general, the treatment of DG in NEMS is rudimentary. The penetration of DG is determined by an economic cash-flow analysis that determines adoption based on the number of years to a positive cash flow. Some important technologies, e.g., thermally activated cooling, are absent, and ceilings on DG adoption are determined by somewhat arbitrary caps on the number of buildings that can adopt DG. These caps are particularly severe for existing buildings, where the maximum penetration for any one technology is 0.25 percent. On the other hand, competition among technologies is not fully considered, and this may result in double-counting for certain applications. A series of sensitivity runs shows greater penetration with net metering enhancements and aggressive tax credits and a more limited response to lowered DG technology costs. Discussion of alternatives to the current code is presented in Section 4. Alternatives or improvements to how DG is modeled in NEMS cover three basic areas: expanding the existing total market for DG, both by changing existing parameters in NEMS and by adding new capabilities, such as for missing technologies; enhancing the cash-flow analysis by incorporating aspects of DG economics that are not currently represented, e.g., complex tariffs; and using an external geographic information system (GIS)-driven analysis that can better and more intuitively identify niche markets.
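The cash-flow test mentioned above (adoption keyed to the number of years until cumulative cash flow turns positive) can be sketched as follows; the function and all numbers are illustrative assumptions, not NEMS inputs or documented NEMS behavior.

```python
# Illustrative payback calculation in the spirit of the NEMS cash-flow test:
# adoption depends on how many years it takes for cumulative cash flow to
# turn positive. All figures are assumptions, for illustration only.
def years_to_positive_cash_flow(capital_cost, annual_savings, annual_om, lifetime):
    """Return the first year the cumulative cash flow is positive, or None."""
    cumulative = -capital_cost
    for year in range(1, lifetime + 1):
        cumulative += annual_savings - annual_om
        if cumulative > 0:
            return year
    return None

# Hypothetical 5 kW PV system: $15,000 installed, $1,800/yr avoided purchases,
# $100/yr O&M, 30-year lifetime.
print(years_to_positive_cash_flow(15_000, 1_800, 100, 30))  # -> 9
```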
The Transfer Function Model as a Tool to Study and Describe Space Weather Phenomena
NASA Technical Reports Server (NTRS)
Porter, Hayden S.; Mayr, Hans G.; Bhartia, P. K. (Technical Monitor)
2001-01-01
The Transfer Function Model (TFM) is a semi-analytical, linear model that is designed especially to describe thermospheric perturbations associated with magnetic storm and substorm activity. It is a multi-constituent model (N2, O, He, H, Ar) that accounts for wind-induced diffusion, which significantly affects not only the composition and mass density but also the temperature and wind fields. Because the TFM adopts a semi-analytic approach in which the geometry and temporal dependencies of the driving sources are removed through the use of height-integrated Green's functions, it provides physical insight into the essential properties of the processes being considered, uncluttered by the accidental complexities that arise from particular source geometries and time dependences. Extending from the ground to 700 km, the TFM eliminates spurious effects due to arbitrarily chosen boundary conditions. A database of transfer functions, computed only once, can be used to synthesize a wide range of spatial and temporal source dependencies. The response synthesis can be performed quickly in real time using only limited computing capabilities. These features make the TFM unique among global dynamical models. Given these desirable properties, a version of the TFM has been developed for personal computers (PC) using advanced platform-independent 3D visualization capabilities. We demonstrate the model capabilities with simulations for different auroral sources, including the response of ducted gravity wave modes that propagate around the globe. The thermospheric response is found to depend strongly on the spatial and temporal frequency spectra of the storm. Such varied behavior is difficult to describe in statistical empirical models. To improve the capability of space weather prediction, the TFM thus could be grafted naturally onto existing statistical models using data assimilation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thorne, P.D.; Chamness, M.A.; Vermeul, V.R.
This report documents work conducted during fiscal year 1994 to develop an improved three-dimensional conceptual model of ground-water flow in the unconfined aquifer system across the Hanford Site for the Ground-Water Surveillance Project, which is managed by Pacific Northwest Laboratory. The main objective of the ongoing effort to develop an improved conceptual model of ground-water flow is to provide the basis for improved numerical models that will be capable of accurately predicting the movement of radioactive and chemical contaminant plumes in the aquifer beneath Hanford. More accurate ground-water flow models will also be useful in assessing the impacts of changes in facilities and operations. For example, decreasing volumes of operational waste-water discharge are resulting in a declining water table in parts of the unconfined aquifer. In addition to supporting numerical modeling, the conceptual model also provides a qualitative understanding of the movement of ground water and contaminants in the aquifer.
Evaluation of airborne lidar data to predict vegetation Presence/Absence
Palaseanu-Lovejoy, M.; Nayegandhi, A.; Brock, J.; Woodman, R.; Wright, C.W.
2009-01-01
This study evaluates the capabilities of the Experimental Advanced Airborne Research Lidar (EAARL) in delineating vegetation assemblages in Jean Lafitte National Park, Louisiana. Five-meter-resolution grids of bare earth, canopy height, canopy-reflection ratio, and height of median energy were derived from EAARL data acquired in September 2006. Ground-truth data were collected along transects to assess species composition, canopy cover, and ground cover. To decide which model is more accurate, comparisons of general linear models and generalized additive models were conducted using conventional evaluation methods (i.e., sensitivity, specificity, Kappa statistics, and area under the curve) and two new indexes, net reclassification improvement and integrated discrimination improvement. Generalized additive models were superior to general linear models in modeling presence/absence in training vegetation categories, but no statistically significant differences between the two models were achieved in determining the classification accuracy at validation locations using conventional evaluation methods, although statistically significant improvements in net reclassifications were observed. © 2009 Coastal Education and Research Foundation.
NASA Astrophysics Data System (ADS)
Cai, X.; Yang, Z.-L.; Fisher, J. B.; Zhang, X.; Barlage, M.; Chen, F.
2016-01-01
Climate and terrestrial biosphere models consider nitrogen an important factor in limiting plant carbon uptake, while operational environmental models view nitrogen as the leading pollutant causing eutrophication in water bodies. The community Noah land surface model with multi-parameterization options (Noah-MP) is unique in that it is the next-generation land surface model for the Weather Research and Forecasting meteorological model and for the operational weather/climate models in the National Centers for Environmental Prediction. In this study, we add a capability to Noah-MP to simulate nitrogen dynamics by coupling the Fixation and Uptake of Nitrogen (FUN) plant model and the Soil and Water Assessment Tool (SWAT) soil nitrogen dynamics. This model development incorporates FUN's state-of-the-art concept of carbon cost theory and SWAT's strength in representing the impacts of agricultural management on the nitrogen cycle. Parameterizations for direct root and mycorrhizal-associated nitrogen uptake, leaf retranslocation, and symbiotic biological nitrogen fixation are employed from FUN, while parameterizations for nitrogen mineralization, nitrification, immobilization, volatilization, atmospheric deposition, and leaching are based on SWAT. The coupled model is then evaluated at the Kellogg Biological Station - a Long Term Ecological Research site within the US Corn Belt. Results show that the model performs well in capturing the major nitrogen state/flux variables (e.g., soil nitrate and nitrate leaching). Furthermore, the addition of nitrogen dynamics improves the modeling of net primary productivity and evapotranspiration. The model improvement is expected to advance the capability of Noah-MP to simultaneously predict weather and water quality in fully coupled Earth system models.
Forecasting Hourly Water Demands With Seasonal Autoregressive Models for Real-Time Application
NASA Astrophysics Data System (ADS)
Chen, Jinduan; Boccelli, Dominic L.
2018-02-01
Consumer water demands are not typically measured at temporal or spatial scales adequate to support real-time decision making, and recent approaches for estimating unobserved demands using observed hydraulic measurements are generally not capable of forecasting demands and uncertainty information. While time series modeling has shown promise for representing total system demands, these models have generally not been evaluated at spatial scales appropriate for representative real-time modeling. This study investigates the use of a double-seasonal time series model to capture daily and weekly autocorrelations to both total system demands and regional aggregated demands at a scale that would capture demand variability across a distribution system. Emphasis was placed on the ability to forecast demands and quantify uncertainties with results compared to traditional time series pattern-based demand models as well as nonseasonal and single-seasonal time series models. Additional research included the implementation of an adaptive-parameter estimation scheme to update the time series model when unobserved changes occurred in the system. For two case studies, results showed that (1) for the smaller-scale aggregated water demands, the log-transformed time series model resulted in improved forecasts, (2) the double-seasonal model outperformed other models in terms of forecasting errors, and (3) the adaptive adjustment of parameters during forecasting improved the accuracy of the generated prediction intervals. These results illustrate the capabilities of time series modeling to forecast both water demands and uncertainty estimates at spatial scales commensurate for real-time modeling applications and provide a foundation for developing a real-time integrated demand-hydraulic model.
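As a rough illustration of the double-seasonal idea, the sketch below fits an autoregressive model with lags at 1 h, 24 h (daily cycle), and 168 h (weekly cycle) to log-transformed demand by ordinary least squares. It is a simplified stand-in for the paper's seasonal time series formulation, not its exact specification; the toy demand series and lag choices are assumptions.

```python
# Minimal sketch of a double-seasonal autoregressive forecast for hourly
# demand: lags at 1 h, 24 h (daily), and 168 h (weekly) fitted by least
# squares on log demand.
import numpy as np

def fit_double_seasonal_ar(y, lags=(1, 24, 168)):
    logy = np.log(y)
    p = max(lags)
    # design matrix: intercept plus log demand at each seasonal lag
    X = np.column_stack([np.ones(len(y) - p)] + [logy[p - L:len(y) - L] for L in lags])
    beta, *_ = np.linalg.lstsq(X, logy[p:], rcond=None)
    return beta, lags

def forecast_one_step(y, beta, lags):
    logy = np.log(y)
    x = np.concatenate(([1.0], [logy[-L] for L in lags]))
    return np.exp(x @ beta)   # back-transform to the original scale

# toy hourly series with daily and weekly cycles plus noise
t = np.arange(24 * 7 * 8)
demand = (100 + 20 * np.sin(2 * np.pi * t / 24) + 10 * np.sin(2 * np.pi * t / 168)
          + np.random.default_rng(0).normal(0, 2, t.size))
beta, lags = fit_double_seasonal_ar(demand)
print(f"next-hour forecast: {forecast_one_step(demand, beta, lags):.1f}")
```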
Aligning business strategy of incubator center and tenants
NASA Astrophysics Data System (ADS)
Prasetyawan, Yudha; Agustiani, Elly; Jumayla, Sari
2017-06-01
Incubator centers are developed to help particular groups of small business players achieve the expected business growth. In such a center, business players, often called tenants, receive assistance pertaining to space, professional networks, marketing, investment or funding, and training to improve their business capability. There are three types of incubator center, namely universities that help their alumni or business people in the surrounding area, companies that support small business as corporate social responsibility, and independent organizations that specialize in business development. Some may succeed in increasing the capacity of their tenants, while others have difficulty improving even the simplest business capability, e.g., defining the production cost needed to measure profit. This study proposes a model to align the business strategy of an incubator center and its tenants. Sales and profit growth are the main priorities for the tenants, together with their business capability and sustainability. The proposed alignment model provides measurement tools that link the motivation of tenants for joining the incubation process with the mission of the incubator center. The linkage covers the key performance indicators (KPI), the steps to achieve the targets, and evaluation tools to improve on current handicaps. An experiment on four diverse business fields of an incubator center's tenants was performed to test the model. As a result, an increase in the incubator center's KPI simultaneously yields higher tenant sales.
Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation
NASA Technical Reports Server (NTRS)
Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael
2011-01-01
Model evaluation and verification are key to improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in situ, remotely sensed, and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics, and scale decomposition techniques that provide novel ways of performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization, and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model-data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
Improving mixing efficiency of a polymer micromixer by use of a plastic shim divider
NASA Astrophysics Data System (ADS)
Li, Lei; Lee, L. James; Castro, Jose M.; Yi, Allen Y.
2010-03-01
In this paper, a critical modification to an affordable polymer-based split-and-recombination static micromixer is described. To evaluate the improvement, both the original and the modified design were carefully investigated using an experimental setup and a numerical modeling approach. The structure of the micromixer was designed to take advantage of the process capabilities of both ultraprecision micromachining and microinjection molding. Specifically, the original and the modified design were numerically simulated using the commercial finite element method software ANSYS CFX to assist the redesign of the micromixers. The simulation results showed that both designs are capable of performing mixing, while the modified design has a much improved performance. Mixing experiments with two different fluids carried out using the original and the modified mixers again showed significantly improved mixing uniformity for the latter. The measured mixing coefficient for the original design was 0.11, and for the improved design it was 0.065. The developed manufacturing process, based on ultraprecision machining and microinjection molding for device fabrication, has the advantages of high dimensional precision, low cost, and manufacturing flexibility.
Rolling scheduling of electric power system with wind power based on improved NNIA algorithm
NASA Astrophysics Data System (ADS)
Xu, Q. S.; Luo, C. J.; Yang, D. J.; Fan, Y. H.; Sang, Z. X.; Lei, H.
2017-11-01
This paper puts forth a rolling modification strategy for day-ahead scheduling of an electric power system with wind power, which takes the unit operation cost increment and the grid's curtailed wind power as its two modification objectives. Additionally, an improved Nondominated Neighbor Immune Algorithm (NNIA) is proposed for its solution. The proposed rolling scheduling model further reduces the system operation cost in the intra-day generation process, enhances the system's capacity to accommodate wind power, and modifies key transmission section power flows in a rolling manner to satisfy grid security constraints. The improved NNIA defines an antibody preference relation model based on the equal incremental rate, regulation deviation constraints, and the maximum and minimum technical outputs of units. This model noticeably guides the direction of antibody evolution, significantly speeds up convergence to the final solution, and enhances local search capability.
USDA-ARS?s Scientific Manuscript database
Multimodeling (MM) has been developed during the last decade to improve prediction capability of hydrological models. The MM combined with the pedotransfer functions (PTFs) was successfully applied to soil water flow simulations. This study examined the uncertainty in water content simulations assoc...
Potential capabilities of lunar laser ranging for geodesy and relativity
NASA Technical Reports Server (NTRS)
Muller, Jurgen; Williams, James G.; Turyshev, Slava G.; Shelus, Peter J.
2005-01-01
Here, we review the LLR technique focusing on its impact on Geodesy and Relativity. We discuss the modern observational accuracy and the level of existing LLR modeling. We present the near-term objectives and emphasize improvements needed to fully utilize the scientific potential of LLR.
Cai, Longyan; He, Hong S.; Wu, Zhiwei; Lewis, Benard L.; Liang, Yu
2014-01-01
Understanding the fire prediction capabilities of fuel models is vital to forest fire management. Various fuel models have been developed in the Great Xing'an Mountains in Northeast China. However, the performances of these fuel models have not been tested for historical occurrences of wildfires. Consequently, the applicability of these models requires further investigation. Thus, this paper aims to develop standard fuel models. Seven vegetation types were combined into three fuel models according to potential fire behaviors, which were clustered using Euclidean distance algorithms. Fuel model parameter sensitivity was analyzed by the Morris screening method. Results showed that the fuel model parameters 1-hour time-lag loading, dead heat content, live heat content, 1-hour time-lag SAV (surface-area-to-volume ratio), live shrub SAV, and fuel bed depth have high sensitivity. The two most sensitive fuel parameters, 1-hour time-lag loading and fuel bed depth, were selected as adjustment parameters because of their high spatio-temporal variability. The FARSITE model was then used to test the fire prediction capabilities of the combined fuel models (uncalibrated fuel models). FARSITE was shown to yield an unrealistic prediction of the historical fire. However, the calibrated fuel models significantly improved the capabilities of the fuel models to predict the actual fire with an accuracy of 89%. Validation results also showed that the model can estimate the actual fires with an accuracy exceeding 56% by using the calibrated fuel models. Therefore, these fuel models can be efficiently used to calculate fire behaviors, which can be helpful in forest fire management. PMID:24714164
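For readers unfamiliar with Morris screening, the sketch below shows a typical workflow using the SALib package (an assumption; the authors' tooling is not stated). The placeholder fire-behaviour function and parameter ranges are invented for illustration and do not represent the Great Xing'an Mountains fuel models.

```python
# Sketch of Morris elementary-effects screening in the spirit of the abstract,
# using the SALib package (an assumption). The surrogate function and bounds
# are placeholders, not real fuel-model data.
import numpy as np
from SALib.sample.morris import sample as morris_sample
from SALib.analyze.morris import analyze as morris_analyze

problem = {
    "num_vars": 3,
    "names": ["one_hr_loading", "fuel_bed_depth", "dead_heat_content"],
    "bounds": [[0.1, 2.0], [0.05, 1.0], [15000.0, 22000.0]],
}

def toy_spread_rate(x):
    """Placeholder surrogate for a fire-behaviour model output."""
    loading, depth, heat = x
    return heat * loading * depth / (1.0 + loading)

X = morris_sample(problem, N=100, num_levels=4)          # Morris trajectories
Y = np.apply_along_axis(toy_spread_rate, 1, X)           # model evaluations
results = morris_analyze(problem, X, Y, num_levels=4)    # elementary effects
for name, mu_star in zip(problem["names"], results["mu_star"]):
    print(f"{name}: mu* = {mu_star:.1f}")
```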
Improving short-term air quality predictions over the U.S. using chemical data assimilation
NASA Astrophysics Data System (ADS)
Kumar, R.; Delle Monache, L.; Alessandrini, S.; Saide, P.; Lin, H. C.; Liu, Z.; Pfister, G.; Edwards, D. P.; Baker, B.; Tang, Y.; Lee, P.; Djalalova, I.; Wilczak, J. M.
2017-12-01
State and local air quality forecasters across the United States use air quality forecasts from the National Air Quality Forecasting Capability (NAQFC) at the National Oceanic and Atmospheric Administration (NOAA) as one of the key tools to protect the public from adverse air pollution-related health effects by dispensing timely information about air pollution episodes. This project, funded by the National Aeronautics and Space Administration (NASA), aims to enhance the decision-making process by improving the accuracy of NAQFC short-term predictions of ground-level particulate matter of less than 2.5 µm in diameter (PM2.5) by exploiting NASA Earth Science Data with chemical data assimilation. The NAQFC is based on the Community Multiscale Air Quality (CMAQ) model. To improve the initialization of PM2.5 in CMAQ, we developed a new capability in the community Gridpoint Statistical Interpolation (GSI) system to assimilate Terra/Aqua Moderate Resolution Imaging Spectroradiometer (MODIS) aerosol optical depth (AOD) retrievals in CMAQ. Specifically, we developed new capabilities within GSI to read/write CMAQ data, a forward operator that calculates AOD at 550 nm from CMAQ aerosol chemical composition, and an adjoint of the forward operator that translates the changes in AOD to aerosol chemical composition. A generalized background error covariance program called "GEN_BE" has been extended to calculate background error covariance using CMAQ output. The background error variances are generated using a combination of both emissions and meteorological perturbations to better capture sources of uncertainties in PM2.5 simulations. The newly developed CMAQ-GSI system is used to perform daily 24-h PM2.5 forecasts with and without data assimilation from 15 July to 14 August 2014, and the resulting forecasts are compared against AirNOW PM2.5 measurements at 550 stations across the U.S. We find that the assimilation of MODIS AOD retrievals improves initialization of the CMAQ model in terms of a higher correlation coefficient and reduced bias. However, we notice a large bias in nighttime PM2.5 simulations, which is primarily associated with a very shallow boundary layer in the model. The developments and results will be discussed in detail during the presentation.
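The assimilation step described above ultimately reduces to a Kalman-style analysis update in which the state holds aerosol concentrations, the observation is an AOD value, and a linearized forward operator maps one to the other. The numpy sketch below illustrates that generic update; it is not the GSI/CMAQ implementation, and the species, covariances, and sensitivities are assumed values.

```python
# Generic Kalman-style analysis update of the kind underlying AOD assimilation:
# x is a vector of aerosol concentrations, y an observed AOD, and H a
# linearized forward operator mapping concentrations to AOD.
import numpy as np

def analysis_update(x_b, B, y, R, H):
    """x_a = x_b + K (y - H x_b), with K = B H^T (H B H^T + R)^-1."""
    innovation = y - H @ x_b
    S = H @ B @ H.T + R
    K = B @ H.T @ np.linalg.inv(S)
    return x_b + K @ innovation

# assumed 3-species background (ug/m3), background error covariance,
# one AOD observation with its error variance, and linearized AOD sensitivities
x_b = np.array([5.0, 2.0, 1.0])
B = np.diag([1.0, 0.5, 0.2])
H = np.array([[0.02, 0.05, 0.01]])   # d(AOD)/d(concentration), assumed
y = np.array([0.35])
R = np.array([[0.02**2]])

print(analysis_update(x_b, B, y, R, H))
```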
21st Century Power Partnership: September 2016 Fellowship Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reber, Timothy J.; Rambau, Prudence; Mdhluli, Sipho
This report details the 21st Century Power Partnership fellowship from September 2016. This Fellowship is a follow-up to the Technical Audit of Eskom's Medium- and Long-term Modelling Capabilities, conducted by U.S. National Renewable Energy Laboratory (NREL) in April 2016. The prospect and role of variable renewable energy (vRE) in South Africa poses new modelling-related challenges that Eskom is actively working to address by improving the fidelity of PLEXOS LT and ST models.
Liu, Tao; Zhu, Guanghu; Lin, Hualiang; Zhang, Yonghui; He, Jianfeng; Deng, Aiping; Peng, Zhiqiang; Xiao, Jianpeng; Rutherford, Shannon; Xie, Runsheng; Zeng, Weilin; Li, Xing; Ma, Wenjun
2017-01-01
Background: Dengue fever (DF) in Guangzhou, Guangdong province in China is an important public health issue. The problem was highlighted in 2014 by a large, unprecedented outbreak. In order to respond in a more timely manner and hence better control such potential outbreaks in the future, this study develops an early warning model that integrates internet-based query data into traditional surveillance data. Methodology and principal findings: A Dengue Baidu Search Index (DBSI) was collected from the Baidu website for developing a predictive model of dengue fever in combination with meteorological and demographic factors. Generalized additive models (GAM) with or without DBSI were established. The generalized cross validation (GCV) score and deviance explained indexes, and the intraclass correlation coefficient (ICC) and root mean squared error (RMSE), were respectively applied to measure the fitness and the prediction capability of the models. Our results show that the DBSI with a one-week lag has a positive linear relationship with local DF occurrence, and the model with DBSI (ICC: 0.94 and RMSE: 59.86) has a better prediction capability than the model without DBSI (ICC: 0.72 and RMSE: 203.29). Conclusions: Our study suggests that DBSI combined with traditional disease surveillance and meteorological data can improve the dengue early warning system in Guangzhou. PMID:28263988
NASA Astrophysics Data System (ADS)
Kunnath-Poovakka, A.; Ryu, D.; Renzullo, L. J.; George, B.
2016-04-01
Calibration of spatially distributed hydrologic models is frequently limited by the availability of ground observations. Remotely sensed (RS) hydrologic information provides an alternative source of observations to inform models and extend modelling capability beyond the limits of ground observations. This study examines the capability of RS evapotranspiration (ET) and soil moisture (SM) in calibrating a hydrologic model and its efficacy to improve streamflow predictions. SM retrievals from the Advanced Microwave Scanning Radiometer-EOS (AMSR-E) and daily ET estimates from the CSIRO MODIS ReScaled potential ET (CMRSET) are used to calibrate a simplified Australian Water Resource Assessment - Landscape model (AWRA-L) for a selection of parameters. The Shuffled Complex Evolution (SCE-UA) algorithm is employed for parameter estimation at eleven catchments in eastern Australia. A subset of parameters for calibration is selected based on the variance-based Sobol' sensitivity analysis. The efficacy of 15 objective functions for calibration is assessed based on streamflow predictions relative to control cases, and the relative merits of each are discussed. Synthetic experiments were conducted to examine the effect of bias in RS ET observations on calibration. The objective function containing the root mean square deviation (RMSD) of ET results in the best streamflow predictions, and its efficacy is superior for catchments with medium to high average runoff. Synthetic experiments revealed that an accurate ET product can improve streamflow predictions in catchments with low average runoff.
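A hedged sketch of this kind of RS-driven calibration loop is given below: the objective is the RMSD between simulated and remotely sensed ET, and scipy's differential_evolution stands in for SCE-UA (a deliberate substitution). The run_awral-style wrapper and the toy two-parameter model are hypothetical.

```python
# Sketch of an RS-based calibration objective: RMSD between simulated and
# remotely sensed ET, minimized with a global optimizer. differential_evolution
# is a stand-in for SCE-UA; run_awral is a hypothetical model wrapper.
import numpy as np
from scipy.optimize import differential_evolution

def rmsd(sim, obs):
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def make_objective(et_obs, run_awral):
    def objective(params):
        et_sim = run_awral(params)   # hypothetical wrapper returning daily ET
        return rmsd(et_sim, et_obs)
    return objective

# toy stand-in for the model: two parameters scaling a seasonal ET signal
t = np.arange(365)
et_obs = 3.0 + 1.5 * np.sin(2 * np.pi * t / 365)
toy_model = lambda p: p[0] + p[1] * np.sin(2 * np.pi * t / 365)

result = differential_evolution(make_objective(et_obs, toy_model),
                                bounds=[(0.0, 10.0), (0.0, 5.0)], seed=1)
print(result.x, result.fun)
```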
Development of a Higher Fidelity Model for the Cascade Distillation Subsystem (CDS)
NASA Technical Reports Server (NTRS)
Perry, Bruce; Anderson, Molly
2014-01-01
Significant improvements have been made to the ACM model of the CDS, enabling accurate predictions of dynamic operations with fewer assumptions. The model has been utilized to predict how CDS performance would be impacted by changing operating parameters, revealing performance trade-offs and possibilities for improvement. CDS efficiency is driven by the THP coefficient of performance, which in turn is dependent on heat transfer within the system. Based on the remaining limitations of the simulation, priorities for further model development include: relaxing the assumption of total condensation; incorporating dynamic simulation capability for the buildup of dissolved inert gases in condensers; examining CDS operation with more complex feeds; and extending heat transfer analysis to all surfaces.
NASA Technical Reports Server (NTRS)
Mog, Robert A.
1999-01-01
Unique and innovative graph theory, neural network, organizational modeling, and genetic algorithms are applied to the design and evolution of programmatic and organizational architectures. Graph theory representations of programs and organizations increase modeling capabilities and flexibility, while illuminating preferable programmatic/organizational design features. Treating programs and organizations as neural networks results in better system synthesis, and more robust data modeling. Organizational modeling using covariance structures enhances the determination of organizational risk factors. Genetic algorithms improve programmatic evolution characteristics, while shedding light on rulebase requirements for achieving specified technological readiness levels, given budget and schedule resources. This program of research improves the robustness and verifiability of systems synthesis tools, including the Complex Organizational Metric for Programmatic Risk Environments (COMPRE).
Gerrits, Esther G; Alkhalaf, Alaa; Landman, Gijs W D; van Hateren, Kornelis J J; Groenier, Klaas H; Struck, Joachim; Schulte, Janin; Gans, Reinold O B; Bakker, Stephan J L; Kleefstra, Nanne; Bilo, Henk J G
2014-01-01
Oxidative stress plays an underlying pathophysiologic role in the development of diabetes complications. The aim of this study was to investigate peroxiredoxin 4 (Prx4), a proposed novel biomarker of oxidative stress, and its association with and capability as a biomarker in predicting (cardiovascular) mortality in type 2 diabetes mellitus. Prx4 was assessed in baseline serum samples of 1161 type 2 diabetes patients. Cox proportional hazards models were used to evaluate the relationship between Prx4 and (cardiovascular) mortality. Risk prediction capabilities of Prx4 for (cardiovascular) mortality were assessed with Harrell's C statistic, the integrated discrimination improvement, and the net reclassification improvement. Mean age was 67 years and the median diabetes duration was 4.0 years. After a median follow-up period of 5.8 years, 327 patients died; 137 were cardiovascular deaths. Prx4 was associated with (cardiovascular) mortality. The Cox proportional hazards models added the variables Prx4 (model 1); age and gender (model 2); and BMI, creatinine, smoking, diabetes duration, systolic blood pressure, cholesterol-HDL ratio, history of macrovascular complications, and albuminuria (model 3). Hazard ratios (HR) (95% CI) for cardiovascular mortality were 1.93 (1.57-2.38), 1.75 (1.39-2.20), and 1.63 (1.28-2.09) for models 1, 2, and 3, respectively. HR for all-cause mortality were 1.73 (1.50-1.99), 1.50 (1.29-1.75), and 1.44 (1.23-1.67) for models 1, 2, and 3, respectively. Addition of Prx4 to the traditional risk factors slightly improved risk prediction of (cardiovascular) mortality. Prx4 is independently associated with (cardiovascular) mortality in type 2 diabetes patients. After addition of Prx4 to the traditional risk factors, there was a slight improvement in risk prediction of (cardiovascular) mortality in this patient group.
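The incremental Cox modelling strategy above can be sketched with the lifelines package (an assumption; the paper does not state its software). The data frame below is synthetic and serves only to show how a biomarker-only model and a model adding age and sex would be structured.

```python
# Sketch of incremental Cox proportional hazards modelling with a biomarker,
# using the lifelines package (an assumption). All data are synthetic.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "prx4": rng.lognormal(0.0, 0.5, n),       # biomarker, arbitrary units
    "age": rng.normal(67, 8, n),
    "male": rng.integers(0, 2, n),
    "followup_years": rng.exponential(5.8, n),
    "died": rng.integers(0, 2, n),
})

# Model 1: biomarker only; Model 2 adds age and sex, mirroring the abstract.
m1 = CoxPHFitter().fit(df[["prx4", "followup_years", "died"]],
                       duration_col="followup_years", event_col="died")
m2 = CoxPHFitter().fit(df, duration_col="followup_years", event_col="died")
print(m1.hazard_ratios_["prx4"], m2.concordance_index_)
```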
NASA Astrophysics Data System (ADS)
Kapitan, Loginn
This research created a new model that provides an integrated approach to planning the effective selection and employment of airborne sensor systems in response to accidental or intentional chemical vapor releases. The approach taken was to use systems engineering and decision analysis methods to construct a model architecture that produced a modular structure for integrating both new and existing components into a logical procedure to assess the application of airborne sensor systems to chemical vapor hazards. The resulting integrated process model includes an internal aggregation model that allowed differentiation among alternative airborne sensor systems. Both models were developed and validated by experts and demonstrated using appropriate hazardous chemical release scenarios. The resultant prototype integrated process model, or system, fills a current gap in capability, allowing improved planning, training, and exercises for HAZMAT teams and first responders when considering the selection and employment of airborne sensor systems. Through the research process, insights into the current response structure and into how current airborne capability may be most effectively used were generated. Furthermore, the resultant prototype system is tailorable for local, state, and federal application, and can potentially be modified to help evaluate investments in new airborne sensor technology and systems. Better planning, training, and preparedness exercising hold the prospect of the effective application of airborne assets for improved response to large-scale chemical release incidents. Improved response will result in fewer casualties and lives lost, reduced economic impact, and increased protection of critical infrastructure when faced with accidental or intentional terrorist releases of hazardous industrial chemicals. With the prospect of more airborne sensor systems becoming available, this prototype system integrates existing and new tools into an effective process for the selection and employment of airborne sensors to better plan, train, and exercise ahead of potential chemical release events.
Including resonances in the multiperipheral model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pinsky, S.S.; Snider, D.R.; Thomas, G.H.
1973-10-01
A simple generalization of the multiperipheral model (MPM) and the Mueller-Regge model (MRM) is given which has improved phenomenological capabilities by explicitly incorporating resonance phenomena, yet is still simple enough to be an important theoretical laboratory. The model is discussed both with and without charge. In addition, the one-channel, two-channel, three-channel, and N-channel cases are explicitly treated. Particular attention is paid to the constraints of charge conservation and positivity in the MRM. The recently proven equivalence between the MRM and MPM is extended to this model and is used extensively.
An advanced terrain modeler for an autonomous planetary rover
NASA Technical Reports Server (NTRS)
Hunter, E. L.
1980-01-01
A roving vehicle capable of autonomously exploring the surface of an alien world is under development and an advanced terrain modeler to characterize the possible paths of the rover as hazardous or safe is presented. This advanced terrain modeler has several improvements over the Troiani modeler that include: a crosspath analysis, better determination of hazards on slopes, and methods for dealing with missing returns at the extremities of the sensor field. The results from a package of programs to simulate the roving vehicle are then examined and compared to results from the Troiani modeler.
Advanced capabilities for materials modelling with Quantum ESPRESSO
NASA Astrophysics Data System (ADS)
Giannozzi, P.; Andreussi, O.; Brumme, T.; Bunau, O.; Buongiorno Nardelli, M.; Calandra, M.; Car, R.; Cavazzoni, C.; Ceresoli, D.; Cococcioni, M.; Colonna, N.; Carnimeo, I.; Dal Corso, A.; de Gironcoli, S.; Delugas, P.; DiStasio, R. A., Jr.; Ferretti, A.; Floris, A.; Fratesi, G.; Fugallo, G.; Gebauer, R.; Gerstmann, U.; Giustino, F.; Gorni, T.; Jia, J.; Kawamura, M.; Ko, H.-Y.; Kokalj, A.; Küçükbenli, E.; Lazzeri, M.; Marsili, M.; Marzari, N.; Mauri, F.; Nguyen, N. L.; Nguyen, H.-V.; Otero-de-la-Roza, A.; Paulatto, L.; Poncé, S.; Rocca, D.; Sabatini, R.; Santra, B.; Schlipf, M.; Seitsonen, A. P.; Smogunov, A.; Timrov, I.; Thonhauser, T.; Umari, P.; Vast, N.; Wu, X.; Baroni, S.
2017-11-01
Quantum ESPRESSO is an integrated suite of open-source computer codes for quantum simulations of materials using state-of-the-art electronic-structure techniques, based on density-functional theory, density-functional perturbation theory, and many-body perturbation theory, within the plane-wave pseudopotential and projector-augmented-wave approaches. Quantum ESPRESSO owes its popularity to the wide variety of properties and processes it allows users to simulate, to its performance on an increasingly broad array of hardware architectures, and to a community of researchers that rely on its capabilities as a core open-source development platform to implement their ideas. In this paper we describe recent extensions and improvements, covering new methodologies and property calculators, improved parallelization, code modularization, and extended interoperability both within the distribution and with external software.
Advanced capabilities for materials modelling with Quantum ESPRESSO.
Giannozzi, P; Andreussi, O; Brumme, T; Bunau, O; Buongiorno Nardelli, M; Calandra, M; Car, R; Cavazzoni, C; Ceresoli, D; Cococcioni, M; Colonna, N; Carnimeo, I; Dal Corso, A; de Gironcoli, S; Delugas, P; DiStasio, R A; Ferretti, A; Floris, A; Fratesi, G; Fugallo, G; Gebauer, R; Gerstmann, U; Giustino, F; Gorni, T; Jia, J; Kawamura, M; Ko, H-Y; Kokalj, A; Küçükbenli, E; Lazzeri, M; Marsili, M; Marzari, N; Mauri, F; Nguyen, N L; Nguyen, H-V; Otero-de-la-Roza, A; Paulatto, L; Poncé, S; Rocca, D; Sabatini, R; Santra, B; Schlipf, M; Seitsonen, A P; Smogunov, A; Timrov, I; Thonhauser, T; Umari, P; Vast, N; Wu, X; Baroni, S
2017-10-24
Quantum ESPRESSO is an integrated suite of open-source computer codes for quantum simulations of materials using state-of-the-art electronic-structure techniques, based on density-functional theory, density-functional perturbation theory, and many-body perturbation theory, within the plane-wave pseudopotential and projector-augmented-wave approaches. Quantum ESPRESSO owes its popularity to the wide variety of properties and processes it allows users to simulate, to its performance on an increasingly broad array of hardware architectures, and to a community of researchers that rely on its capabilities as a core open-source development platform to implement their ideas. In this paper we describe recent extensions and improvements, covering new methodologies and property calculators, improved parallelization, code modularization, and extended interoperability both within the distribution and with external software.
Advanced capabilities for materials modelling with Quantum ESPRESSO.
Andreussi, Oliviero; Brumme, Thomas; Bunau, Oana; Buongiorno Nardelli, Marco; Calandra, Matteo; Car, Roberto; Cavazzoni, Carlo; Ceresoli, Davide; Cococcioni, Matteo; Colonna, Nicola; Carnimeo, Ivan; Dal Corso, Andrea; de Gironcoli, Stefano; Delugas, Pietro; DiStasio, Robert; Ferretti, Andrea; Floris, Andrea; Fratesi, Guido; Fugallo, Giorgia; Gebauer, Ralph; Gerstmann, Uwe; Giustino, Feliciano; Gorni, Tommaso; Jia, Junteng; Kawamura, Mitsuaki; Ko, Hsin-Yu; Kokalj, Anton; Küçükbenli, Emine; Lazzeri, Michele; Marsili, Margherita; Marzari, Nicola; Mauri, Francesco; Nguyen, Ngoc Linh; Nguyen, Huy-Viet; Otero-de-la-Roza, Alberto; Paulatto, Lorenzo; Poncé, Samuel; Giannozzi, Paolo; Rocca, Dario; Sabatini, Riccardo; Santra, Biswajit; Schlipf, Martin; Seitsonen, Ari Paavo; Smogunov, Alexander; Timrov, Iurii; Thonhauser, Timo; Umari, Paolo; Vast, Nathalie; Wu, Xifan; Baroni, Stefano
2017-09-27
Quantum ESPRESSO is an integrated suite of open-source computer codes for quantum simulations of materials using state-of-the-art electronic-structure techniques, based on density-functional theory, density-functional perturbation theory, and many-body perturbation theory, within the plane-wave pseudopotential and projector-augmented-wave approaches. Quantum ESPRESSO owes its popularity to the wide variety of properties and processes it allows users to simulate, to its performance on an increasingly broad array of hardware architectures, and to a community of researchers that rely on its capabilities as a core open-source development platform to implement their ideas. In this paper we describe recent extensions and improvements, covering new methodologies and property calculators, improved parallelization, code modularization, and extended interoperability both within the distribution and with external software. © 2017 IOP Publishing Ltd.
Development Of A Data Assimilation Capability For RAPID
NASA Astrophysics Data System (ADS)
Emery, C. M.; David, C. H.; Turmon, M.; Hobbs, J.; Allen, G. H.; Famiglietti, J. S.
2017-12-01
The global decline of in situ observations associated with the increasing ability to monitor surface water from space motivates the creation of data assimilation algorithms that merge computer models and space-based observations to produce consistent estimates of terrestrial hydrology that fill the spatiotemporal gaps in observations. RAPID is a routing model based on the Muskingum method that is capable of estimating river streamflow over large scales with a relatively short computing time. This model only requires limited inputs: a reach-based river network, and lateral surface and subsurface flow into the rivers. The relatively simple model physics imply that RAPID simulations could be significantly improved by including a data assimilation capability. Here we present the early developments of such data assimilation approach into RAPID. Given the linear and matrix-based structure of the model, we chose to apply a direct Kalman filter, hence allowing for the preservation of high computational speed. We correct the simulated streamflows by assimilating streamflow observations and our early results demonstrate the feasibility of the approach. Additionally, the use of in situ gauges at continental scales motivates the application of our new data assimilation scheme to altimetry measurements from existing (e.g. EnviSat, Jason 2) and upcoming satellite missions (e.g. SWOT), and ultimately apply the scheme globally.
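A toy illustration of the proposed scheme is sketched below: a linear Muskingum step routes flow through a short chain of reaches, and a Kalman-style update corrects the routed outflows where a gauge observation exists. The coefficients, network, and observation are invented; this is not the RAPID code.

```python
# Toy illustration of Muskingum routing plus a Kalman-style streamflow update.
import numpy as np

C1, C2, C3 = 0.2, 0.3, 0.5            # assumed Muskingum coefficients (sum to 1)

def muskingum_step(Q_prev, I_prev, I_now):
    """Outflow update Q_t = C1*I_t + C2*I_(t-1) + C3*Q_(t-1) for each reach."""
    return C1 * I_now + C2 * I_prev + C3 * Q_prev

def kalman_update(Q, P, obs, obs_idx, r):
    """Update routed flows Q (covariance P) with one gauge reading."""
    H = np.zeros((1, Q.size)); H[0, obs_idx] = 1.0
    S = H @ P @ H.T + r
    K = P @ H.T / S
    Q_a = Q + (K * (obs - Q[obs_idx])).ravel()
    P_a = (np.eye(Q.size) - K @ H) @ P
    return Q_a, P_a

# three reaches in series: upstream outflow becomes downstream inflow
Q = np.array([10.0, 9.0, 8.5])        # prior outflows (m3/s)
I_prev = np.array([12.0, 10.0, 9.0])
I_now = np.array([14.0, Q[0], Q[1]])
P = np.diag([4.0, 4.0, 4.0])          # assumed error covariance of routed flows

Q_f = muskingum_step(Q, I_prev, I_now)
Q_a, P_a = kalman_update(Q_f, P, obs=11.2, obs_idx=2, r=0.25)
print(Q_f, Q_a)
```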
Examining quality improvement programs: the case of Minnesota hospitals.
Olson, John R; Belohlav, James A; Cook, Lori S; Hays, Julie M
2008-10-01
To determine if there is a hierarchy of improvement program adoption by hospitals and outline that hierarchy. Primary data were collected in the spring of 2007 via e-survey from 210 individuals representing 109 Minnesota hospitals. Secondary data from 2006 were assembled from the Leapfrog database. As part of a larger survey, respondents were given a list of improvement programs and asked to identify those programs that are used in their hospital. Data collection/data extraction: Rasch model analysis was used to assess whether a unidimensional construct exists that defines a hospital's ability to implement performance improvement programs. Linear regression analysis was used to assess the relationship of the Rasch ability scores with Leapfrog Safe Practices Scores to validate the research findings. Principal findings: The results of the study show that hospitals have widely varying abilities in implementing improvement programs. In addition, improvement programs present differing levels of difficulty for hospitals trying to implement them. Our findings also indicate that the ability to adopt improvement programs is important to the overall performance of hospitals. There is a hierarchy of improvement programs in the health care context. A hospital's ability to successfully adopt improvement programs is a function of its existing capabilities. As a hospital's capability increases, the ability to successfully implement higher level programs also increases.
NASA Astrophysics Data System (ADS)
Hu, Jiexiang; Zhou, Qi; Jiang, Ping; Shao, Xinyu; Xie, Tingli
2018-01-01
Variable-fidelity (VF) modelling methods have been widely used in complex engineering system design to mitigate the computational burden. Building a VF model generally includes two parts: design of experiments and metamodel construction. In this article, an adaptive sampling method based on improved hierarchical kriging (ASM-IHK) is proposed to refine the VF model. First, an improved hierarchical kriging model is developed as the metamodel, in which the low-fidelity model is varied through a polynomial response surface function to capture the characteristics of the high-fidelity model. Secondly, to reduce local approximation errors, an active learning strategy based on a sequential sampling method is introduced to make full use of the information already acquired at the current sampling points and to guide the sampling process of the high-fidelity model. Finally, two numerical examples and the modelling of the aerodynamic coefficient of an aircraft are provided to demonstrate the approximation capability of the proposed approach in comparison with three other metamodelling methods and two sequential sampling methods. The results show that ASM-IHK provides a more accurate metamodel at the same simulation cost, which is very important in metamodel-based engineering design problems.
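The hierarchical-kriging idea can be sketched as follows: the low-fidelity response, passed through a polynomial trend, provides the mean of a Gaussian process fitted to sparse high-fidelity samples. scikit-learn's GaussianProcessRegressor stands in for the authors' IHK formulation, the test functions are invented, and the adaptive sampling criterion is not reproduced.

```python
# Sketch of a hierarchical-kriging-style variable-fidelity model: polynomial
# trend in the low-fidelity output plus a GP on the residual to high-fidelity
# data. A simplified stand-in, not the ASM-IHK method.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f_low(x):   # cheap, biased model (assumed)
    return 0.5 * np.sin(8 * x) + 0.3 * x

def f_high(x):  # expensive "truth" (assumed)
    return np.sin(8 * x) + x

x_hf = np.array([[0.05], [0.3], [0.55], [0.8], [0.95]])
y_hf = f_high(x_hf).ravel()

# polynomial trend in the low-fidelity response: y_hf ~ a0 + a1*f_low + a2*f_low^2
Z = np.vander(f_low(x_hf).ravel(), N=3, increasing=True)
coef, *_ = np.linalg.lstsq(Z, y_hf, rcond=None)
trend = lambda x: np.vander(f_low(x).ravel(), N=3, increasing=True) @ coef

# GP on the residual between high-fidelity data and the scaled low-fidelity trend
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-8)
gp.fit(x_hf, y_hf - trend(x_hf))

x_test = np.linspace(0, 1, 5).reshape(-1, 1)
vf_pred = trend(x_test) + gp.predict(x_test)
print(np.round(vf_pred, 3), np.round(f_high(x_test).ravel(), 3))
```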
NASA Technical Reports Server (NTRS)
Orme, John S.; Schkolnik, Gerard S.
1995-01-01
Performance Seeking Control (PSC), an onboard, adaptive, real-time optimization algorithm, relies upon an onboard propulsion system model. Flight results illustrated propulsion system performance improvements as calculated by the model. These improvements were subject to uncertainty arising from modeling error. Thus to quantify uncertainty in the PSC performance improvements, modeling accuracy must be assessed. A flight test approach to verify PSC-predicted increases in thrust (FNP) and absolute levels of fan stall margin is developed and applied to flight test data. Application of the excess thrust technique shows that increases of FNP agree to within 3 percent of full-scale measurements for most conditions. Accuracy to these levels is significant because uncertainty bands may now be applied to the performance improvements provided by PSC. Assessment of PSC fan stall margin modeling accuracy was completed with analysis of in-flight stall tests. Results indicate that the model overestimates the stall margin by between 5 to 10 percent. Because PSC achieves performance gains by using available stall margin, this overestimation may represent performance improvements to be recovered with increased modeling accuracy. Assessment of thrust and stall margin modeling accuracy provides a critical piece for a comprehensive understanding of PSC's capabilities and limitations.
Gong, H; Pishgar, R; Tay, J H
2018-04-27
Aerobic granulation is a recent technology with a high level of complexity and sensitivity to environmental and operational conditions. Artificial neural networks (ANNs), computational tools capable of describing complex non-linear systems, are well suited to simulating aerobic granular bioreactors. In this study, two feedforward backpropagation ANN models were developed to predict the chemical oxygen demand (Model I) and total nitrogen (Model II) removal efficiencies of aerobic granulation technology under steady-state conditions. Fundamentals of ANN models and the steps to create them were briefly reviewed. The models were fed with 205 and 136 data points, respectively, collected from laboratory-, pilot-, and full-scale studies on aerobic granulation technology reported in the literature. Initially, 60%, 20%, and 20% of the points for Model I and 80%, 10%, and 10% for Model II were randomly chosen for training, testing, and validation, respectively. The overall coefficient of determination (R^2) and mean squared error (MSE) of the two models were initially 0.49 and 15.5, and 0.37 and 408, respectively. To improve model performance, two data division methods were used. While one method is generic and potentially applicable to other fields, the other can only be applied to modelling the performance of aerobic granular reactors. The R^2 and MSE improved to 0.90 and 2.54, and 0.81 and 121.56, respectively, after applying the new data division methods. The results demonstrate that ANN-based models are a capable simulation approach for predicting a complicated process like aerobic granulation.
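A minimal sketch of such a removal-efficiency predictor is shown below, using scikit-learn's MLPRegressor as a stand-in for the feedforward backpropagation networks described; the synthetic data, feature names, and network size are assumptions.

```python
# Minimal sketch of a feedforward network for removal-efficiency prediction,
# with MLPRegressor standing in for the models described. Data are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(1)
n = 205
# hypothetical operating variables: influent COD, OLR, cycle time, SRT
X = rng.uniform([200, 1.0, 3.0, 5.0], [1500, 8.0, 12.0, 30.0], size=(n, 4))
y = 70 + 0.01 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 2, n)  # toy removal %

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.4, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print(f"R2 = {r2_score(y_te, pred):.2f}, MSE = {mean_squared_error(y_te, pred):.2f}")
```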
NASA Technical Reports Server (NTRS)
Shen, Bo-Wen; Tao, Wei-Kuo; Chern, Jiun-Dar
2007-01-01
Improving our understanding of hurricane inter-annual variability and the impact of climate change (e.g., doubling CO2 and/or global warming) on hurricanes brings both scientific and computational challenges to researchers. As hurricane dynamics involves multiscale interactions among synoptic-scale flows, mesoscale vortices, and small-scale cloud motions, an ideal numerical model suitable for hurricane studies should demonstrate its capabilities in simulating these interactions. The newly-developed multiscale modeling framework (MMF, Tao et al., 2007) and the substantial computing power of the NASA Columbia supercomputer show promise for pursuing such studies, as the MMF inherits the advantages of two NASA state-of-the-art modeling components: the GEOS4/fvGCM and 2D GCEs. This article focuses on the computational issues and proposes a revised methodology to improve the MMF's performance and scalability. It is shown that this prototype implementation enables 12-fold performance improvements with 364 CPUs, thereby making it more feasible to study hurricane climate.
Synthesising empirical results to improve predictions of post-wildfire runoff and erosion response
Shakesby, Richard A.; Moody, John A.; Martin, Deborah A.; Robichaud, Peter R.
2016-01-01
Advances in research into wildfire impacts on runoff and erosion have demonstrated increasing complexity of controlling factors and responses, which, combined with changing fire frequency, present challenges for modellers. We convened a conference attended by experts and practitioners in post-wildfire impacts, meteorology and related research, including modelling, to focus on priority research issues. The aim was to improve our understanding of controls and responses and the predictive capabilities of models. This conference led to the eight selected papers in this special issue. They address aspects of the distinctiveness in the controls and responses among wildfire regions, spatiotemporal rainfall variability, infiltration, runoff connectivity, debris flow formation and modelling applications. Here we summarise key findings from these papers and evaluate their contribution to improving understanding and prediction of post-wildfire runoff and erosion under changes in climate, human intervention and population pressure on wildfire-prone areas.
Modeling Lost-Particle Backgrounds in PEP-II Using LPTURTLE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fieguth, T.; /SLAC; Barlow, R.
2005-05-17
Background studies during the design, construction, commissioning, operation and improvement of BaBar and PEP-II have been greatly influenced by results from a program referred to as LPTURTLE (Lost Particle TURTLE) which was originally conceived for the purpose of studying gas background for SLC. This venerable program is still in use today. We describe its use, capabilities and improvements and refer to current results now being applied to BaBar.
2017-03-01
... determine the optimum required operational capability of the unmanned aerial vehicles to support Korean rear area operations. ... Through further experimentations and analyses, we were able to find the optimum characteristics of an improved unmanned aerial ... operations. We use Map Aware Non-Uniform Automata, an agent-based simulation software platform for computational experiments. The study models a scenario ...
A breakthrough for experiencing and understanding simulated physics
NASA Technical Reports Server (NTRS)
Watson, Val
1988-01-01
The use of computer simulation in physics research is discussed, focusing on improvements to graphic workstations. Simulation capabilities and applications of enhanced visualization tools are outlined. The elements of an ideal computer simulation are presented and the potential for improving various simulation elements is examined. The human-computer interface and simulation models are considered. Recommendations are made for changes in computer simulation practices and for applications of simulation technology in education.
Mediterranea Forecasting System: a focus on wave-current coupling
NASA Astrophysics Data System (ADS)
Clementi, Emanuela; Delrosso, Damiano; Pistoia, Jenny; Drudi, Massimiliano; Fratianni, Claudia; Grandi, Alessandro; Pinardi, Nadia; Oddo, Paolo; Tonani, Marina
2016-04-01
The Mediterranean Forecasting System (MFS) is a numerical ocean prediction system that produces analyses, reanalyses and short term forecasts for the entire Mediterranean Sea and its adjacent Atlantic Ocean areas. MFS became operational in the late 1990s, has been developed and continuously improved in the framework of a series of EU and nationally funded programs, and is now part of the Copernicus Marine Service. The MFS is composed of the hydrodynamic model NEMO (Nucleus for European Modelling of the Ocean) 2-way coupled with the third-generation wave model WW3 (WaveWatchIII), implemented in the Mediterranean Sea at 1/16° horizontal resolution and forced by ECMWF atmospheric fields. The model solutions are corrected by the data assimilation system (a 3D variational scheme adapted to the oceanic assimilation problem) with a daily assimilation cycle, using a background error correlation matrix varying seasonally and in different sub-regions of the Mediterranean Sea. The focus of this work is to present the latest modelling system upgrades and the improvements achieved. In order to evaluate the performance of the coupled system, a set of experiments has been built in which the wave and circulation models exchange the following fields hourly: the sea surface currents and the air-sea temperature difference are transferred from NEMO to WW3, modifying respectively the mean momentum transfer of waves and the wind speed stability parameter, while the neutral drag coefficient computed by WW3 is passed to NEMO, which computes the turbulent component. In order to validate the modelling system, numerical results have been compared with in-situ and remote sensing data. This work suggests that a coupled model may be capable of a better description of wave-current interactions; in particular, feedback from the ocean to the waves may improve the prediction of wave characteristics, and it motivates moving toward a fully coupled modelling system in order to achieve stronger enhancements of the hydrodynamic fields.
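A toy coupling driver is sketched below to make the hourly exchange concrete: the ocean side supplies the surface current (and the air-sea temperature difference enters the wave side), while the wave side returns a neutral drag coefficient that is used in a bulk momentum-flux formula. The stand-in wave_model and ocean_model functions and all numerical values are illustrative assumptions and bear no relation to the actual NEMO or WW3 code.

```python
import numpy as np

RHO_AIR = 1.225                                   # air density, kg m-3

def wave_model(u10_rel, dT_air_sea):
    """Stand-in for WW3: return a neutral drag coefficient. A crude
    wind-speed-dependent bulk value with a token stability tweak is used
    here; the real model derives it from the simulated sea state."""
    u_eff = u10_rel * (1.0 - 0.01 * dT_air_sea)   # token stability adjustment
    return 1.0e-3 * (0.8 + 0.065 * abs(u_eff))

def ocean_model(tau):
    """Stand-in for NEMO: return a surface current from the wind stress
    (an arbitrary toy response replaces the real ocean dynamics)."""
    return 0.5 * tau                              # m s-1 per N m-2, toy value

u10, t_air, sst = 12.0, 290.0, 292.0              # illustrative scalars
u_surf = 0.0
for hour in range(24):                            # hourly coupling cycle
    u10_rel = u10 - u_surf                        # NEMO -> WW3: surface current
    cd = wave_model(u10_rel, t_air - sst)         # WW3 -> NEMO: neutral drag coeff.
    tau = RHO_AIR * cd * u10_rel * abs(u10_rel)   # bulk momentum flux, N m-2
    u_surf = ocean_model(tau)
print(f"Cd = {cd:.2e}, tau = {tau:.3f} N m-2, u_surf = {u_surf:.2f} m s-1")
```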
Taylor, M J; Arriscado, D; Vlaev, I; Taylor, D; Gately, P; Darzi, A
2016-01-01
According to the COM-B ('Capability', 'Opportunity', 'Motivation' and 'Behaviour') model of behaviour, three factors are essential for behaviour to occur: capability, opportunity and motivation. Obese children are less likely to feel capable of exercising. A new methodological approach to investigating the relationship between perceived exercise capability (PEC) and childhood obesity was implemented, which involved creating a new instrument and demonstrating how it can be used to measure obesity intervention outcomes. A questionnaire aiming to measure perceived exercise capability, opportunity and motivation was systematically constructed using the COM-B model and administered to 71 obese children (aged 9-17 years (12.24±2.01), body mass index (BMI) standard deviation scores (SDS) 2.80±0.660) at a weight-management camp in northern England. Scale validity and reliability were assessed. Relationships between PEC, as measured by the questionnaire, and BMI SDS were investigated for the children at the weight-management camp, and for 45 Spanish schoolchildren (aged 9-13 years (10.52±1.23), BMI SDS 0.80±0.99). A pilot study, demonstrating how the questionnaire can be used to measure the effectiveness of an intervention aiming to bring about improved PEC for weight-management camp attendees, was conducted. No participants withdrew from these studies. The questionnaire domain (exercise capability, opportunity and motivation) composite scales were found to have adequate internal consistency (α=0.712-0.796) and construct validity (χ²/degrees of freedom=1.55, root mean square error of approximation=0.072, comparative fit index=0.92). Linear regression revealed that low PEC was associated with higher baseline BMI SDS for both UK (b=-0.289, P=0.010) and Spanish (b=-0.446, P=0.047) participants. Pilot study findings provide preliminary evidence that PEC improvements through intervention are achievable, and measurable using the questionnaire. Evidence is presented for the reliability and validity of the questionnaire, and for the feasibility of its use in the context of a childhood obesity intervention. Future research could investigate the link between PEC and childhood obesity further.
NEW IMPROVEMENTS TO MFIRE TO ENHANCE FIRE MODELING CAPABILITIES.
Zhou, L; Smith, A C; Yuan, L
2016-06-01
NIOSH's mine fire simulation program, MFIRE, is widely accepted as a standard for assessing and predicting the impact of a fire on the mine ventilation system and the spread of fire contaminants in coal and metal/nonmetal mines, and has been used by U.S. and international companies to simulate fires for planning and response purposes. MFIRE is a dynamic, transient-state, mine ventilation network simulation program that performs normal planning calculations. It can also be used to analyze ventilation networks under thermal and mechanical influences such as changes in ventilation parameters, external influences such as changes in temperature, and internal influences such as a fire. The program output can be used to analyze the effects of these influences on the ventilation system. Since its original development by Michigan Technological University for the Bureau of Mines in the 1970s, several updates have been released over the years. In 2012, NIOSH completed a major redesign and restructuring of the program with the release of MFIRE 3.0. MFIRE's outdated FORTRAN programming language was replaced with an object-oriented C++ implementation packaged into a dynamic link library (DLL). However, the MFIRE 3.0 release made no attempt to change or improve the fire modeling algorithms inherited from its previous version, MFIRE 2.20. This paper reports on improvements that have been made to the fire modeling capabilities of MFIRE 3.0 since its release. These improvements include the addition of fire source models for the t-squared fire and for heat release rate curves read from a data file, the addition of a moving fire source for conveyor belt fire simulations, improvement of the fire location algorithm, and the identification and prediction of smoke rollback phenomena. All the improvements discussed in this paper are collectively designated MFIRE 3.1, which will be released by NIOSH in the near future.
Wind Field and Trajectory Models for Tornado-Propelled Objects
NASA Technical Reports Server (NTRS)
Redmann, G. H.; Radbill, J. R.; Marte, J. E.; Dergarabedian, P.; Fendell, F. E.
1978-01-01
A mathematical model to predict the trajectory of tornado-borne objects postulated to be in the vicinity of nuclear power plants is developed. An improved tornado wind field model satisfies the no-slip ground boundary condition of fluid mechanics and includes the functional dependence of eddy viscosity on altitude. Subscale wind tunnel data are obtained for all of the missiles currently specified for nuclear plant design. Confirmatory full-scale data are obtained for a 12-inch pipe and an automobile. The original six-degree-of-freedom trajectory model is modified to include the improved wind field and increased capability as to the body shapes and inertial characteristics that can be handled. The improved trajectory model is used to calculate maximum credible speeds, which for all of the heavy missiles are considerably less than those currently specified for design. Equivalent coefficients for use in three-degree-of-freedom models are developed, and the sensitivity of range and speed to various trajectory parameters is examined for the 12-inch-diameter pipe.
EPA's National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than tr...
A remote sensing-based dry and wet limit-reference evapotranspiration model for water use monitoring
USDA-ARS?s Scientific Manuscript database
With increasing growth in human population, the demand for greater food production has exceeded the capability to provide a sustainable water supply for agriculture. This is exacerbated in areas suffering from prolonged drought conditions, particularly in water limited regions. Improving the managem...
Improvement of High-Resolution Tropical Cyclone Structure and Intensity Forecasts using COAMPS-TC
2013-09-30
... scientific community, including the recent T-PARC/TCS08, ITOP, and HS3 field campaigns, to build upon the existing modeling capabilities. We will ... heating and cooling rates in developing and non-developing tropical disturbances during TCS-08: radar-equivalent retrievals from mesoscale numerical ...
A Summative Report of the Leadership Training Program.
ERIC Educational Resources Information Center
Buikema, Lolita; Many, Wesley
An ESEA Title III program to improve leadership capabilities of educators was conducted in both actual and model school settings during 1966-69. Participants included staff personnel, consultants, administrative and teaching personnel from cooperating school districts, and board of education members from a consortium school. This report discusses…
NASA Technical Reports Server (NTRS)
Case, Johnathan L.; Mungai, John; Sakwa, Vincent; Kabuchanga, Eric; Zavodsky, Bradley T.; Limaye, Ashutosh S.
2014-01-01
Flooding and drought are two key forecasting challenges for the Kenya Meteorological Service (KMS). Atmospheric processes leading to excessive precipitation and/or prolonged drought can be quite sensitive to the state of the land surface, which interacts with the planetary boundary layer (PBL) of the atmosphere providing a source of heat and moisture. The development and evolution of precipitation systems are affected by heat and moisture fluxes from the land surface, particularly within weakly-sheared environments such as in the tropics and sub-tropics. These heat and moisture fluxes during the day can be strongly influenced by land cover, vegetation, and soil moisture content. Therefore, it is important to represent the land surface state as accurately as possible in land surface and numerical weather prediction (NWP) models. Enhanced regional modeling capabilities have the potential to improve forecast guidance in support of daily operations and high-impact weather over eastern Africa. KMS currently runs a configuration of the Weather Research and Forecasting (WRF) NWP model in real time to support its daily forecasting operations, making use of the NOAA/National Weather Service (NWS) Science and Training Resource Center's Environmental Modeling System (EMS) to manage and produce the KMS-WRF runs on a regional grid over eastern Africa. Two organizations at the NASA Marshall Space Flight Center in Huntsville, AL, SERVIR and the Short-term Prediction Research and Transition (SPoRT) Center, have established a working partnership with KMS for enhancing its regional modeling capabilities through new datasets and tools. To accomplish this goal, SPoRT and SERVIR are providing enhanced, experimental land surface initialization datasets and model verification capabilities to KMS as part of this collaboration. To produce a land-surface initialization more consistent with the resolution of the KMS-WRF runs, the NASA Land Information System (LIS) is run at a comparable resolution to provide real-time, daily soil initialization data in place of data interpolated from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) model soil moisture and temperature fields. Additionally, real-time green vegetation fraction (GVF) data from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (Suomi-NPP) satellite will be incorporated into the KMS-WRF runs, once they become publicly available from the National Environmental Satellite Data and Information Service (NESDIS). Finally, model verification capabilities will be transitioned to KMS using the Model Evaluation Tools (MET; Brown et al. 2009) package in conjunction with a dynamic scripting package developed by SPoRT (Zavodsky et al. 2014), to help quantify possible improvements in simulated temperature, moisture and precipitation resulting from the experimental land surface initialization. Furthermore, the transition of these MET tools will enable KMS to monitor model forecast accuracy in near real time. This paper presents preliminary efforts to improve land surface model initialization over eastern Africa in support of operations at KMS.
The remainder of this extended abstract is organized as follows: The collaborating organizations involved in the project are described in Section 2; background information on LIS and the configuration for eastern Africa is presented in Section 3; the WRF configuration used in this modeling experiment is described in Section 4; sample experimental WRF output with and without LIS initialization data are given in Section 5; a summary is given in Section 6 followed by acknowledgements and references.
Integration of Linear Dynamic Emission and Climate Models with Air Traffic Simulations
NASA Technical Reports Server (NTRS)
Sridhar, Banavar; Ng, Hok K.; Chen, Neil Y.
2012-01-01
Future air traffic management systems are required to balance the conflicting objectives of maximizing safety and efficiency of traffic flows while minimizing the climate impact of aviation emissions and contrails. Integrating emission and climate models together with air traffic simulations improves the understanding of the complex interaction between the physical climate system, carbon and other greenhouse gas emissions, and aviation activity. This paper integrates a national-level air traffic simulation and optimization capability with simple climate models, carbon cycle models, and climate metrics to assess the impact of aviation on climate. The capability can be used to make trade-offs between extra fuel cost and reduction in global surface temperature change. The parameters in the simulation can be used to evaluate the effect of various uncertainties in emission models and contrails and the impact of different decision horizons. Alternatively, the optimization results from the simulation can be used as inputs to other tools that monetize global climate impacts, like the FAA's Aviation Environmental Portfolio Management Tool for Impacts.
Equilibrium cycle pin by pin transport depletion calculations with DeCART
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kochunas, B.; Downar, T.; Taiwo, T.
As the Advanced Fuel Cycle Initiative (AFCI) program has matured, it has become more important to utilize more advanced simulation methods. The work reported here was performed as part of the AFCI fellowship program to develop and demonstrate the capability of performing high-fidelity equilibrium cycle calculations. As part of this work, a new multi-cycle analysis capability was implemented in the DeCART code, which included modifying the depletion modules to perform nuclide decay calculations, implementing an assembly shuffling pattern description, and modifying iteration schemes. During the work, stability issues were uncovered with respect to simultaneously converging the neutron flux, isotopics, and fluid density and temperature distributions in 3-D. Relaxation factors were implemented which considerably improved the stability of the convergence. To demonstrate the capability, two core designs were utilized: a reference UOX core and a CORAIL core. Full-core equilibrium cycle calculations were performed on both cores and the discharge isotopics were compared. From this comparison it was noted that the improved modeling capability was not drastically different in its prediction of the discharge isotopics when compared to 2-D single-assembly or 2-D core models. For fissile isotopes such as U-235, Pu-239, and Pu-241 the relative differences were 1.91%, 1.88%, and 0.59%, respectively. While these differences may not seem large, they translate to mass differences on the order of tens of grams per assembly, which may be significant for the purposes of accounting of special nuclear material. (authors)
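The relaxation factors mentioned above amount to under-relaxed fixed-point iteration between the coupled fields; a generic sketch (not DeCART code) is shown below, where the assumed scalar update stands in for the coupled flux/isotopics/thermal-hydraulics update.

```python
import numpy as np

def picard_solve(update, x0, omega=0.5, tol=1e-8, max_iter=200):
    """Fixed-point (Picard) iteration with under-relaxation:
    x_new = x_old + omega * (update(x_old) - x_old).
    omega < 1 damps oscillations between coupled field updates."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        x_star = update(x)                 # e.g. new density/temperature field
        x_new = x + omega * (x_star - x)   # relaxed update
        if np.max(np.abs(x_new - x)) < tol:
            return x_new, k
        x = x_new
    raise RuntimeError("coupled iteration did not converge")

# Toy coupled update that diverges for omega = 1 but converges when relaxed.
g = lambda x: np.array([2.0 - 1.5 * x[0]])
x, iters = picard_solve(g, [0.0], omega=0.5)
print(x, iters)
```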
Multiscale Modelling of the 2011 Tohoku Tsunami with Fluidity: Coastal Inundation and Run-up.
NASA Astrophysics Data System (ADS)
Hill, J.; Martin-Short, R.; Piggott, M. D.; Candy, A. S.
2014-12-01
Tsunami-induced flooding represents one of the most dangerous natural hazards to coastal communities around the world, as exemplified by the Tohoku tsunami of March 2011. In order to further understand this hazard and to design appropriate mitigation it is necessary to develop versatile, accurate software capable of simulating large-scale tsunami propagation and interaction with coastal geomorphology on a local scale. One such software package is Fluidity, an open-source, finite-element, multiscale code that is capable of solving the fully three-dimensional Navier-Stokes equations on unstructured meshes. Such meshes are significantly better at representing complex coastline shapes than structured meshes and have the advantage of allowing variation in element size across a domain. Furthermore, Fluidity incorporates a novel wetting and drying algorithm, which enables accurate, efficient simulation of tsunami run-up over complex, multiscale topography. Fluidity has previously been demonstrated to accurately simulate the 2011 Tohoku tsunami (Oishi et al., 2013), but its wetting and drying facility has not yet been tested on a geographical scale. This study makes use of Fluidity to simulate the 2011 Tohoku tsunami and its interaction with Japan's eastern shoreline, including coastal flooding. The results are validated against observations made by survey teams, aerial photographs and previous modelling efforts in order to evaluate Fluidity's current capabilities and suggest methods of future improvement. The code is shown to perform well at simulating flooding along the topographically complex Tohoku coast of Japan, with major deviations between model and observation arising mainly from limitations imposed by bathymetry resolution, which could be improved in future. In theory, Fluidity is capable of full multiscale tsunami modelling, thus enabling researchers to understand both wave propagation across ocean basins and flooding of coastal landscapes down to interaction with individual defence structures. This makes the code an exciting candidate for use in future studies aiming to investigate tsunami risk elsewhere in the world. Oishi, Y. et al. Three-dimensional tsunami propagation simulations using an unstructured mesh finite element model. J. Geophys. Res. Solid Earth 118, 2998-3018 (2013).
Cryogenic, high speed, turbopump bearing cooling requirements
NASA Technical Reports Server (NTRS)
Dolan, Fred J.; Gibson, Howard G.; Cannon, James L.; Cody, Joe C.
1988-01-01
Although the Space Shuttle Main Engine (SSME) has repeatedly demonstrated the capability to perform during launch, the High Pressure Oxidizer Turbopump (HPOTP) main shaft bearings have not met their 7.5-hour life requirement. A tester is being employed to provide the capability of subjecting full-scale bearings and seals to speeds, loads, propellants, temperatures, and pressures which simulate engine operating conditions. The tester design permits much more elaborate instrumentation and diagnostics than could be accommodated in an SSME turbopump. Tests were made to demonstrate the facility's and the device's capabilities, to verify the instrumentation in its operating environment, and to establish a performance baseline for the flight-type SSME HPOTP turbine bearing design. Bearing performance data from tests are being utilized to generate: (1) a high-speed, cryogenic turbopump bearing computer mechanical model, and (2) a much improved, very detailed thermal model to better understand bearing internal operating conditions. Parametric tests were also made to determine the effects of speed, axial loads, coolant flow rate, and surface finish degradation on bearing performance.
NASA Astrophysics Data System (ADS)
Wrożyna, Andrzej; Pernach, Monika; Kuziak, Roman; Pietrzyk, Maciej
2016-04-01
Due to their exceptional strength properties combined with good workability, Advanced High-Strength Steels (AHSS) are commonly used in the automotive industry. Manufacturing of these steels is a complex process which requires precise control of technological parameters during thermo-mechanical treatment. Design of these processes can be significantly improved by numerical models of phase transformations. The objective of the paper was the evaluation of the predictive capabilities of such models as regards their applicability to the simulation of thermal cycles for AHSS. Two models were considered: the former was an upgrade of the JMAK equation, while the latter was an upgrade of the Leblond model. The models can be applied to any AHSS, though the examples quoted in the paper refer to a Dual Phase (DP) steel. Three series of experimental simulations were performed. The first included various thermal cycles going beyond the limitations of continuous annealing lines; the objective was to validate the models' behavior in more complex cooling conditions. The second set of tests included experimental simulations of the thermal cycle characteristic of continuous annealing lines, and the capability of the models to properly describe phase transformations in this process was evaluated. The third set included data from an industrial continuous annealing line. Validation and verification of the models confirmed their good predictive capabilities. Since it does not require application of the additivity rule, the upgrade of the Leblond model was selected as the better one for simulation of industrial processes in AHSS production.
Optimization of Rei-mullite Physical Properties
NASA Technical Reports Server (NTRS)
Tanzilli, R. A.; Musikant, S.; Bolinger, P. N.; Brazel, J. P.
1973-01-01
Micromechanical and thermal modeling studies show that the ceramic-fiber mullite material is the only system capable of providing shuttle thermal protection to 1644 K. Hafnia-pigmented mullite surface coatings meet both orbital and reentry thermal radiative requirements for reuse without refurbishment. Thermal and mechanical models show growth potential associated with the mullite system for a factor-of-2 improvement in mechanical properties and a factor-of-2-to-3 reduction in thermal conductivity.
High-End Climate Science: Development of Modeling and Related Computing Capabilities
2000-12-01
... toward strengthening research on key scientific issues. The Program has supported research that has led to substantial increases in knowledge, improved ... provides overall direction and executive oversight of the USGCRP. Within this framework, agencies manage and coordinate Federally supported scientific ... critical for the U.S. Global Change Research Program. Such models can be used to look backward to test the consistency of our knowledge of the Earth system ...
High Fidelity Ion Beam Simulation of High Dose Neutron Irradiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Was, Gary; Wirth, Brian; Motta, Arthur
The objective of this proposal is to demonstrate the capability to predict the evolution of microstructure and properties of structural materials in-reactor and at high doses, using ion irradiation as a surrogate for reactor irradiations. “Properties” includes both physical properties (irradiated microstructure) and the mechanical properties of the material. Demonstration of the capability to predict properties has two components. One is ion irradiation of a set of alloys to yield an irradiated microstructure and corresponding mechanical behavior that are substantially the same as results from neutron exposure in the appropriate reactor environment. Second is the capability to predict the irradiated microstructure and corresponding mechanical behavior on the basis of improved models, validated against both ion and reactor irradiations and verified against ion irradiations. Taken together, achievement of these objectives will yield an enhanced capability for simulating the behavior of materials in reactor irradiations.
Shao, Yu; Chang, Chip-Hong
2007-08-01
We present a new speech enhancement scheme for a single-microphone system to meet the demand for quality noise reduction algorithms capable of operating at a very low signal-to-noise ratio. A psychoacoustic model is incorporated into the generalized perceptual wavelet denoising method to reduce the residual noise and improve the intelligibility of speech. The proposed method is a generalized time-frequency subtraction algorithm, which advantageously exploits the wavelet multirate signal representation to preserve the critical transient information. Simultaneous masking and temporal masking of the human auditory system are modeled by the perceptual wavelet packet transform via the frequency and temporal localization of speech components. The wavelet coefficients are used to calculate the Bark spreading energy and temporal spreading energy, from which a time-frequency masking threshold is deduced to adaptively adjust the subtraction parameters of the proposed method. An unvoiced speech enhancement algorithm is also integrated into the system to improve the intelligibility of speech. Through rigorous objective and subjective evaluations, it is shown that the proposed speech enhancement system is capable of reducing noise with little speech degradation in adverse noise environments and the overall performance is superior to several competitive methods.
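A bare-bones wavelet-packet subtraction along these lines is sketched below using PyWavelets; it omits the psychoacoustic masking model and the unvoiced-speech stage, estimates the per-subband noise level from an assumed noise-only lead-in segment, and uses illustrative over-subtraction and floor parameters.

```python
import numpy as np
import pywt

def wp_subtract(noisy, fs, wavelet="db8", level=5, alpha=2.0, floor=0.05,
                noise_seconds=0.25):
    """Bare-bones time-frequency subtraction on a wavelet packet tree.

    The first `noise_seconds` of the signal are assumed noise-only and are
    used to estimate a per-subband noise standard deviation, which is then
    over-subtracted (factor alpha) from the coefficient magnitudes, subject
    to a spectral floor."""
    n_noise = int(noise_seconds * fs)
    wp = pywt.WaveletPacket(noisy, wavelet, mode="symmetric", maxlevel=level)
    wp_noise = pywt.WaveletPacket(noisy[:n_noise], wavelet, mode="symmetric",
                                  maxlevel=level)
    for node, noise_node in zip(wp.get_level(level, order="natural"),
                                wp_noise.get_level(level, order="natural")):
        sigma = np.std(noise_node.data)              # subband noise estimate
        c = node.data
        mag = np.maximum(np.abs(c) - alpha * sigma, floor * np.abs(c))
        node.data = np.sign(c) * mag                 # shrunken coefficients
    return wp.reconstruct(update=True)[: len(noisy)]

# Usage: a 1 kHz tone in white noise, 8 kHz sampling (illustrative signal).
fs = 8000
t = np.arange(fs) / fs
clean = np.sin(2 * np.pi * 1000 * t) * (t > 0.3)     # silence, then tone
noisy = clean + 0.3 * np.random.default_rng(0).normal(size=clean.size)
enhanced = wp_subtract(noisy, fs)
```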
Feature Extraction with GMDH-Type Neural Networks for EEG-Based Person Identification.
Schetinin, Vitaly; Jakaite, Livija; Nyah, Ndifreke; Novakovic, Dusica; Krzanowski, Wojtek
2018-08-01
The brain activity observed on EEG electrodes is influenced by volume conduction and functional connectivity of a person performing a task. When the task is a biometric test, the EEG signals represent the unique "brain print", which is defined by the functional connectivity that is represented by the interactions between electrodes, whilst the conduction components cause trivial correlations. Orthogonalization using autoregressive modeling minimizes the conduction components, and then the residuals are related to features correlated with the functional connectivity. However, the orthogonalization can be unreliable for high-dimensional EEG data. We have found that the dimensionality can be significantly reduced if the baselines required for estimating the residuals can be modeled by using relevant electrodes. In our approach, the required models are learnt by a Group Method of Data Handling (GMDH) algorithm which we have made capable of discovering reliable models from multidimensional EEG data. In our experiments on the EEG-MMI benchmark data, which include 109 participants, the proposed method has correctly identified all the subjects and provided a statistically significant improvement in the identification accuracy. The experiments have shown that the proposed GMDH method can learn new features from multi-electrode EEG data, which are capable of improving the accuracy of biometric identification.
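The baseline-and-residual idea can be illustrated with the simple least-squares sketch below, in which each electrode is regressed on the remaining electrodes and the residual time series is kept as the connectivity-related feature; this is only an illustration of the orthogonalization step, not the authors' GMDH algorithm, which additionally learns which electrodes are relevant.

```python
import numpy as np

def residual_features(eeg):
    """eeg: (n_channels, n_samples). For every channel, fit a linear baseline
    from all *other* channels by least squares and keep the residual time
    series; the shared (volume-conducted) component is largely removed."""
    n_ch, n_s = eeg.shape
    residuals = np.empty_like(eeg)
    for ch in range(n_ch):
        others = np.delete(np.arange(n_ch), ch)
        X = np.vstack([eeg[others], np.ones(n_s)]).T   # regressors + intercept
        beta, *_ = np.linalg.lstsq(X, eeg[ch], rcond=None)
        residuals[ch] = eeg[ch] - X @ beta             # baseline removed
    return residuals

# Illustrative use on synthetic data: a common source plus channel-specific parts.
rng = np.random.default_rng(0)
common = rng.normal(size=2000)
eeg = 0.8 * common + 0.3 * rng.normal(size=(8, 2000))  # 8 synthetic channels
feats = residual_features(eeg)
print(feats.shape)
```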
NASA Technical Reports Server (NTRS)
Underwood, Lauren W.; Ryan, Robert E.
2007-01-01
This Candidate Solution uses NASA Earth science research on atmospheric ozone and aerosols data (1) to help improve the prediction capabilities of water runoff models that are used to estimate runoff pollution from retention ponds, and (2) to understand the pollutant removal contribution and potential of photocatalytically coated materials that could be used in these ponds. Models (the EPA's SWMM and the USGS SLAMM) exist that estimate the release of pollutants into the environment from storm-water-related retention pond runoff. UV irradiance data acquired from the satellite mission Aura and from the OMI Surface UV algorithm will be incorporated into these models to enhance their capabilities, not only by increasing the general understanding of retention pond function (both the efficacy and efficiency) but additionally by adding photocatalytic materials to these retention ponds, augmenting their performance. State and local officials who run pollution protection programs could then develop and implement photocatalytic technologies for water pollution control in retention ponds and use them in conjunction with existing runoff models. More effective decisions about water pollution protection programs could be made, the persistence and toxicity of waste generated could be minimized, and subsequently our natural water resources would be improved. This Candidate Solution is in alignment with the Water Management and Public Health National Applications.
Improvements in Virtual Sensors: Using Spatial Information to Estimate Remote Sensing Spectra
NASA Technical Reports Server (NTRS)
Oza, Nikunj C.; Srivastava, Ashok N.; Stroeve, Julienne
2005-01-01
Various instruments are used to create images of the Earth and other objects in the universe in a diverse set of wavelength bands with the aim of understanding natural phenomena. Sometimes these instruments are built in a phased approach, with additional measurement capabilities added in later phases. In other cases, technology may mature to the point that the instrument offers new measurement capabilities that were not planned in the original design of the instrument. In still other cases, high resolution spectral measurements may be too costly to perform on a large sample and therefore lower resolution spectral instruments are used to take the majority of measurements. Many applied science questions that are relevant to the earth science remote sensing community require analysis of enormous amounts of data that were generated by instruments with disparate measurement capabilities. In past work [1], we addressed this problem using Virtual Sensors: a method that uses models trained on spectrally rich (high spectral resolution) data to "fill in" unmeasured spectral channels in spectrally poor (low spectral resolution) data. We demonstrated this method by using models trained on the high spectral resolution Terra MODIS instrument to estimate what the equivalent of the MODIS 1.6 micron channel would be for the NOAA AVHRR2 instrument. The scientific motivation for the simulation of the 1.6 micron channel is to improve the ability of the AVHRR2 sensor to detect clouds over snow and ice. This work contains preliminary experiments demonstrating that the use of spatial information can improve our ability to estimate these spectra.
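A minimal sketch of the regression idea, with a spatial neighborhood of pixels added to the spectral predictors, is given below; the synthetic arrays, window radius, and random-forest regressor are illustrative assumptions standing in for the MODIS training data and the trained Virtual Sensor.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def window_features(img, radius=1):
    """Stack each pixel's (2*radius+1)^2 spatial neighborhood of every band
    into a feature vector; edges are handled by reflection padding."""
    bands, H, W = img.shape
    padded = np.pad(img, ((0, 0), (radius, radius), (radius, radius)), mode="reflect")
    feats = [padded[:, dy:dy + H, dx:dx + W]
             for dy in range(2 * radius + 1) for dx in range(2 * radius + 1)]
    return np.concatenate(feats, axis=0).reshape(-1, H * W).T   # (pixels, features)

# Synthetic stand-ins: 4 shared channels plus a 5th "1.6 micron" channel that
# is only available on the richer instrument (purely illustrative data).
rng = np.random.default_rng(1)
shared = rng.uniform(size=(4, 64, 64))
target = 0.6 * shared[1] - 0.3 * shared[3] + 0.05 * rng.normal(size=(64, 64))

X = window_features(shared, radius=1)          # spatial + spectral predictors
y = target.ravel()
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Apply to "spectrally poor" imagery that has only the shared channels.
poor = rng.uniform(size=(4, 64, 64))
estimated_16um = model.predict(window_features(poor, radius=1)).reshape(64, 64)
```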
Modeling the viscosity of polydisperse suspensions: Improvements in prediction of limiting behavior
NASA Astrophysics Data System (ADS)
Mwasame, Paul M.; Wagner, Norman J.; Beris, Antony N.
2016-06-01
The present study develops a fully consistent extension of the approach pioneered by Farris ["Prediction of the viscosity of multimodal suspensions from unimodal viscosity data," Trans. Soc. Rheol. 12, 281-301 (1968)] to describe the viscosity of polydisperse suspensions significantly improving upon our previous model [P. M. Mwasame, N. J. Wagner, and A. N. Beris, "Modeling the effects of polydispersity on the viscosity of noncolloidal hard sphere suspensions," J. Rheol. 60, 225-240 (2016)]. The new model captures the Farris limit of large size differences between consecutive particle size classes in a suspension. Moreover, the new model includes a further generalization that enables its application to real, complex suspensions that deviate from ideal non-colloidal suspension behavior. The capability of the new model to predict the viscosity of complex suspensions is illustrated by comparison against experimental data.
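As background, a back-of-the-envelope version of the Farris-type construction that the model generalizes can be written down directly: when consecutive size classes are widely separated, each class is treated as suspended in the liquid plus all smaller classes, and the total relative viscosity is the product of the unimodal values. The sketch below uses the Krieger-Dougherty expression as the unimodal curve with illustrative parameters; it is not the new model of the paper.

```python
import numpy as np

def eta_unimodal(phi, phi_max=0.64, intrinsic=2.5):
    """Krieger-Dougherty relative viscosity for a monodisperse suspension."""
    return (1.0 - phi / phi_max) ** (-intrinsic * phi_max)

def eta_farris(phis):
    """Farris-type product rule for widely separated size classes.
    `phis` are volume fractions (of total suspension volume) ordered from the
    smallest to the largest particle class; each class treats the liquid plus
    all smaller classes as its suspending medium."""
    phis = np.asarray(phis, float)
    medium = 1.0 - phis.sum()          # liquid volume fraction
    eta_r = 1.0
    for phi in phis:
        eta_r *= eta_unimodal(phi / (medium + phi))
        medium += phi                  # this class joins the medium for larger ones
    return eta_r

# Bimodal example: 30 % fines + 30 % coarse vs. 60 % of a single size.
print(eta_farris([0.30, 0.30]))        # much lower relative viscosity...
print(eta_unimodal(0.60))              # ...than the same total loading, one size
```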
Double multiple streamtube model with recent improvements
NASA Astrophysics Data System (ADS)
Paraschivoiu, I.; Delclaux, F.
1983-06-01
The objective of the present paper is to show the new capabilities of the double multiple streamtube (DMS) model for predicting the aerodynamic loads and performance of the Darrieus vertical-axis turbine. The original DMS model has been improved (DMSV model) by considering the variation in the upwind and downwind induced velocities as a function of the azimuthal angle for each streamtube. A comparison is made of the rotor performance for several blade geometries (parabola, catenary, troposkien, and Sandia shape). A new formulation is given for an approximate troposkien shape by considering the effect of the gravitational field. The effects of three NACA symmetrical profiles, 0012, 0015 and 0018, on the aerodynamic performance of the turbine are shown. Finally, a semiempirical dynamic-stall model has been incorporated and a better approximation obtained for modeling the local aerodynamic forces and performance for a Darrieus rotor.
Xiang, Junxi; Liu, Peng; Zheng, Xinglong; Dong, Dinghui; Fan, Shujuan; Dong, Jian; Zhang, Xufeng; Liu, Xuemin; Wang, Bo; Lv, Yi
2017-10-01
Weak mechanical properties and an unstable degradation rate have limited the application of decellularized liver matrix in tissue engineering. The aim of this study was to explore a new method for improving the mechanical properties, anti-degeneration and angiogenic capability of decellularized liver matrix. This was achieved by a novel approach using riboflavin/ultraviolet A treatment to induce collagen cross-linking of the decellularized matrix. Histological staining and scanning electron microscopy showed that the diameter of cross-linked fibers significantly increased compared with the control group. The average peak load and Young's modulus of the decellularized matrix were markedly improved after cross-linking. We then implanted the modified matrix into a rat hepatic injury model to test the anti-degeneration and angiogenic capability of riboflavin/UVA cross-linked decellularized liver scaffolds in vivo. The results indicated that cross-linked scaffolds degrade more slowly than those in the control group. In the experimental group, the average microvessel density in the implanted matrix was higher than in the control group from the first week after implantation. In conclusion, we introduced a method to improve the biomechanical properties of decellularized liver scaffolds by riboflavin/UVA cross-linking and, more importantly, identified its improvement of anti-degeneration and angiogenesis. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part A: 105A: 2662-2669, 2017.
Bill spurs efforts to improve forecasting of inland flooding from tropical storms
NASA Astrophysics Data System (ADS)
Showstack, Randy
Newly enacted U.S. legislation to reduce the threat of inland flooding from tropical storms could provide a "laser beam" focus on dealing with this natural hazard, according to Rep. Bob Etheridge (D-N.C.), the chief sponsor of the bill. The Tropical Cyclone Inland Forecasting Improvement and Warning System Development Act (P.L. 107-253), signed into law on 29 October, authorizes the National Oceanic and Atmospheric Administration's U.S. Weather Research Program (USWRP) to improve the capability to accurately forecast inland flooding from tropical storms through research and modeling.
An Update on Improvements to NiCE Support for PROTEUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, Andrew; McCaskey, Alexander J.; Billings, Jay Jay
2015-09-01
The Department of Energy Office of Nuclear Energy's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program has supported the development of the NEAMS Integrated Computational Environment (NiCE), a modeling and simulation workflow environment that provides services and plugins to facilitate tasks such as code execution, model input construction, visualization, and data analysis. This report details the development of workflows for the reactor core neutronics application, PROTEUS. This advanced neutronics application (primarily developed at Argonne National Laboratory) aims to improve nuclear reactor design and analysis by providing an extensible and massively parallel, finite-element solver for current and advanced reactor fuel neutronics modeling. The integration of PROTEUS-specific tools into NiCE is intended to make the advanced capabilities that PROTEUS provides more accessible to the nuclear energy research and development community. This report will detail the work done to improve existing PROTEUS workflow support in NiCE. We will demonstrate and discuss these improvements, including the development of flexible IO services, an improved interface for input generation, and the addition of advanced Fortran development tools natively in the platform.
Continuing Development of a Hybrid Model (VSH) of the Neutral Thermosphere
NASA Technical Reports Server (NTRS)
Burns, Alan
1996-01-01
We propose to continue the development of a new operational model of neutral thermospheric density, composition, temperatures and winds to improve current engineering environment definitions of the neutral thermosphere. This model will be based on simulations made with the National Center for Atmospheric Research (NCAR) Thermosphere-Ionosphere-Electrodynamic General Circulation Model (TIEGCM) and on empirical data. It will be capable of using real-time geophysical indices or data from ground-based and satellite inputs, and will provide neutral variables at specified locations and times. This "hybrid" model will be based on a Vector Spherical Harmonic (VSH) analysis technique developed (over the last 8 years) at the University of Michigan that permits the incorporation of the TIGCM outputs and data into the model. The VSH model will be a more accurate version of existing models of the neutral thermosphere, and will thus improve density specification for satellites flying in low Earth orbit (LEO).
Computation of confined coflow jets with three turbulence models
NASA Technical Reports Server (NTRS)
Zhu, J.; Shih, T. H.
1993-01-01
A numerical study of confined jets in a cylindrical duct is carried out to examine the performance of two recently proposed turbulence models: an RNG-based K-epsilon model and a realizable Reynolds stress algebraic equation model. The former is of the same form as the standard K-epsilon model but has different model coefficients. The latter uses an explicit quadratic stress-strain relationship to model the turbulent stresses and is capable of ensuring the positivity of each turbulent normal stress. The flow considered involves recirculation with unfixed separation and reattachment points and severe adverse pressure gradients, thereby providing a valuable test of the predictive capability of the models for complex flows. Calculations are performed with a finite-volume procedure. Numerical credibility of the solutions is ensured by using second-order accurate differencing schemes and sufficiently fine grids. Calculations with the standard K-epsilon model are also made for comparison. Detailed comparisons with experiments show that the realizable Reynolds stress algebraic equation model consistently works better than does the standard K-epsilon model in capturing the essential flow features, while the RNG-based K-epsilon model does not seem to give improvements over the standard K-epsilon model under the flow conditions considered.
Yu, Lu; Mo, Lin; Tang, Yan; Huang, Xiaoyan; Tan, Juan
2014-06-01
The objectives of this study are to compare the effects of two nursing intervention models on the ability of preschool children with malignant tumors to socialize and to determine if these interventions improved their social adaption capability (SAC) and quality of life. Inpatient preschool children with malignant tumors admitted to the hospital between December 2009 and March 2012 were recruited and randomized into either the experimental or the control group. The control group received routine nursing care, and the experimental group received family-centered nursing care, including physical, psychological, and social interventions. The Infants-Junior Middle School Student's Social-Life Abilities Scale was used to evaluate the SAC development of participants. Participants (n = 240) were recruited and randomized into two groups. After the intervention, the excellent and normal SAC rates were 27.5% and 55% in the experimental group, respectively, compared with 2.5% and 32.5% in the control group (p < 0.001). After the intervention, SAC in the experimental group was improved compared with before the intervention (54.68 ± 10.85 vs 79.9 ± 22.3, p < 0.001). However, no differences in SAC were observed between baseline and after the intervention in the control group (54.70 ± 11.47 vs. 52 ± 15.8, p = 0.38). The family-centered nursing care model that included physical, psychological, and social interventions improved the SAC of children with malignancies compared with children receiving routine nursing care. Establishing a standardized family-school-community-hospital hierarchical multi-management intervention model for children is important to the efficacy of long-term interventions and to the improvement of the SAC of children with malignancies. Copyright © 2014 John Wiley & Sons, Ltd.
Shuttle TPS thermal performance and analysis methodology
NASA Technical Reports Server (NTRS)
Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.
1983-01-01
Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mold line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.
NSR&D FY17 Report: CartaBlanca Capability Enhancements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, Christopher Curtis; Dhakal, Tilak Raj; Zhang, Duan Zhong
Over the last several years, particle technology in the CartaBlanca code has matured and has been successfully applied to a wide variety of physical problems. It has been shown that particle methods, especially Los Alamos's dual domain material point method, are capable of computing many problems involving complex physics and chemistry accompanied by large material deformations, where traditional finite element or Eulerian methods encounter significant difficulties. In FY17, the CartaBlanca code was enhanced with new physical models and numerical algorithms. We started out to compute penetration and HE safety problems. Most of the year we focused on improving the TEPLA model and testing it against the sweeping wave experiment by Gray et al., because it was found that pore growth and material failure are essential for our tasks and needed to be understood for modeling the penetration and can experiments efficiently. We extended the TEPLA model from the point of view of ensemble phase averaging to include the effects of finite deformation. It is shown that the assumed pore growth model in TEPLA is actually an exact result from the theory. Along this line, we then generalized the model to include finite deformations to consider the nonlinear dynamics of large deformation. The interaction between the HE product gas and the solid metal is based on the multi-velocity formulation. Our preliminary numerical results suggest good agreement between the experiment and the simulation, pending further verification. To improve the parallel processing capabilities of the CartaBlanca code, we are actively working with the Next Generation Code (NGC) project to rewrite selected packages using C++. This work is expected to continue in the following years. This effort also makes the particle technology developed with the CartaBlanca project available to other parts of the laboratory. Working with the NGC project and rewriting some parts of the code has also given us an opportunity to improve our numerical implementation of the method and to take advantage of recent advances in numerical methods, such as multiscale algorithms.
[Soil infiltration characteristics under main vegetation types in Anji County of Zhejiang Province].
Liu, Dao-Ping; Chen, San-Xiong; Zhang, Jin-Chi; Xie, Li; Jiang, Jiang
2007-03-01
The study of soil infiltration under the main vegetation types in Anji County of Zhejiang Province showed that the characteristics of soil infiltration differed significantly with land use type, and that the eight tested vegetation types could be classified into four groups based on soil infiltration capability. The first group, deciduous broadleaved forest, had the strongest soil infiltration capability, and the second group, with a stronger soil infiltration capability, was composed of grass, pine forest, shrub community and tea bush. Bamboo and evergreen broadleaved forest were classified into the third group, with a relatively strong soil infiltration capability, while bare land belonged to the fourth group because of its poor soil structure and the poorest soil infiltration capability. The comprehensive parameters of soil infiltration (alpha) and root (beta) were obtained by principal component analysis, and the regression model relating them could be described as alpha = 0.1708e^beta - 0.3122. Soil infiltration capability was greatly affected by soil physical and chemical characteristics and by the root system. Fine roots (< or = 1 mm in diameter) played an effective role in the improvement of soil physical and chemical properties, and the increase of soil infiltration capability was closely related to the amount of these fine roots.
Rapid Automated Aircraft Simulation Model Updating from Flight Data
NASA Technical Reports Server (NTRS)
Brian, Geoff; Morelli, Eugene A.
2011-01-01
Techniques to identify aircraft aerodynamic characteristics from flight measurements and compute corrections to an existing simulation model of a research aircraft were investigated. The purpose of the research was to develop a process enabling rapid automated updating of aircraft simulation models using flight data and apply this capability to all flight regimes, including flight envelope extremes. The process presented has the potential to improve the efficiency of envelope expansion flight testing, revision of control system properties, and the development of high-fidelity simulators for pilot training.
Integrated Modeling of Optical Systems (IMOS): An Assessment and Future Directions
NASA Technical Reports Server (NTRS)
Moore, Gregory; Broduer, Steve (Technical Monitor)
2001-01-01
Integrated Modeling of Optical Systems (IMOS) is a finite element-based code combining structural, thermal, and optical ray-tracing capabilities in a single environment for analysis of space-based optical systems. We'll present some recent examples of IMOS usage and discuss future development directions. Due to increasing model sizes and a greater emphasis on multidisciplinary analysis and design, much of the anticipated future work will be in the areas of improved architecture, numerics, and overall performance and analysis integration.
TRIGRS Application for landslide susceptibility mapping
NASA Astrophysics Data System (ADS)
Sugiarti, K.; Sukristiyanti, S.
2018-02-01
Research on landslide susceptibility has been carried out using several different methods. TRIGRS is a program that models landslide susceptibility by considering pore-water pressure changes due to rainfall infiltration. This paper aims to present the current state of the art in the development and application of TRIGRS. Limitations of TRIGRS, developments made to improve its modeling capability, and examples of the application of several of its versions to model the effect of rainfall variation on landslide susceptibility are reviewed and discussed.
The Temporal Morphology of Infrasound Propagation
NASA Astrophysics Data System (ADS)
Drob, Douglas P.; Garcés, Milton; Hedlin, Michael; Brachet, Nicolas
2010-05-01
Expert knowledge suggests that the performance of automated infrasound event association and source location algorithms could be greatly improved by the ability to continually update station travel-time curves to properly account for the hourly, daily, and seasonal changes of the atmospheric state. With the goal of reducing false alarm rates and improving network detection capability, we endeavor to develop, validate, and integrate this capability into infrasound processing operations at the International Data Centre of the Comprehensive Nuclear-Test-Ban Treaty Organization. Numerous studies have demonstrated that incorporation of hybrid ground-to-space (G2S) environmental specifications in numerical calculations of infrasound signal travel time and azimuth deviation yields significantly improved results over those of climatological atmospheric specifications, specifically for tropospheric and stratospheric modes. A robust infrastructure currently exists to generate hybrid G2S vector spherical harmonic coefficients on a real-time basis (every 3 to 6 hours), based on existing operational and empirical models (Drob et al., 2003). Thus the next requirement in this endeavor is to refine numerical procedures to calculate infrasound propagation characteristics for robust automatic infrasound arrival identification and network detection, location, and characterization algorithms. We present results from a new code that integrates the local (range-independent) τ-p ray equations to provide travel time, range, turning point, and azimuth deviation for any location on the globe given a G2S vector spherical harmonic coefficient set. The code employs an accurate numerical technique capable of handling square-root singularities. We investigate the seasonal variability of propagation characteristics over a five-year time series for two different stations within the International Monitoring System, with the aim of understanding the capabilities of current working knowledge of the atmosphere and infrasound propagation models. The statistical behaviors or occurrence frequencies of various propagation configurations are discussed. Representative examples of some of these propagation configuration states are also shown.
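A minimal numerical version of the range-independent τ-p integrals is sketched below for a stratified effective-sound-speed profile; unlike the code described above, it simply truncates the quadrature just below the turning altitude rather than treating the square-root singularity, and the toy profile and launch slowness are illustrative assumptions.

```python
import numpy as np

def tau_p_hop(c, z, p):
    """Range X and travel time T of one ground-to-ground hop for a ray of
    horizontal slowness p (s/km) in a stratified medium with effective sound
    speed c(z) (km/s) on altitude grid z (km), from the tau-p integrals
        X(p) = 2 * int p / sqrt(u^2 - p^2) dz,
        T(p) = 2 * int u^2 / sqrt(u^2 - p^2) dz,   u(z) = 1/c(z),
    integrated up to the turning altitude (u = p).  A plain trapezoid rule
    truncated just below the turning point is used; the square-root
    singularity itself is not treated, unlike the production code."""
    u = 1.0 / np.asarray(c, float)
    z = np.asarray(z, float)
    i_turn = np.argmax(u <= p)            # first level the ray cannot reach
    if i_turn == 0:
        return np.nan, np.nan             # ray never turns (or cannot launch)
    uu, zz = u[:i_turn], z[:i_turn]
    q = np.sqrt(uu**2 - p**2)
    trap = lambda f: float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(zz)))
    return 2.0 * trap(p / q), 2.0 * trap(uu**2 / q)

# Illustrative effective-sound-speed profile with a stratospheric duct.
z = np.linspace(0.0, 60.0, 601)                           # km
c = 0.34 + 0.08 * np.exp(-((z - 50.0) / 10.0) ** 2)       # km/s (toy profile)
X, T = tau_p_hop(c, z, p=1.0 / 0.36)
print(f"range ~ {X:.0f} km, travel time ~ {T:.0f} s, celerity ~ {X / T:.2f} km/s")
```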
Computational Simulations of the NASA Langley HyMETS Arc-Jet Facility
NASA Technical Reports Server (NTRS)
Brune, A. J.; Bruce, W. E., III; Glass, D. E.; Splinter, S. C.
2017-01-01
The Hypersonic Materials Environmental Test System (HyMETS) arc-jet facility located at the NASA Langley Research Center in Hampton, Virginia, is primarily used for the research, development, and evaluation of high-temperature thermal protection systems for hypersonic vehicles and reentry systems. In order to improve testing capabilities and knowledge of the test article environment, an effort is underway to computationally simulate the flow-field using computational fluid dynamics (CFD). A detailed three-dimensional model of the arc-jet nozzle and free-jet portion of the flow-field has been developed and compared to calibration probe Pitot pressure and stagnation-point heat flux for three test conditions at low, medium, and high enthalpy. The CFD model takes into account uniform pressure and non-uniform enthalpy profiles at the nozzle inlet as well as catalytic recombination efficiency effects at the probe surface. Comparing the CFD results and test data indicates an effective catalytic recombination efficiency of about 10% for the copper surface of the heat flux probe and a 2-3 kPa pressure drop from the arc heater bore, where the pressure is measured, to the plenum section prior to the nozzle. With these assumptions, the CFD results are well within the uncertainty of the stagnation pressure and heat flux measurements. The conditions at the nozzle exit were also compared with radial and axial velocimetry. This simulation capability will be used to evaluate various three-dimensional models that are tested in the HyMETS facility. An end-to-end aerothermal and thermal simulation of HyMETS test articles will follow this work to provide a better understanding of the test environment and test results, and to aid in test planning. Additional flow-field diagnostic measurements will also be considered to improve the modeling capability.
A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions
NASA Technical Reports Server (NTRS)
Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver
2016-01-01
Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.
JIMM: the next step for mission-level models
NASA Astrophysics Data System (ADS)
Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.
2001-09-01
The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product are done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for doing SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic representation of simulation entities, its data analysis capability, and its robust configuration management process, JIMM can be used to support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). An MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.
NASA Astrophysics Data System (ADS)
Li, Ji; Chen, Yangbo; Wang, Huanyu; Qin, Jianming; Li, Jie; Chiao, Sen
2017-03-01
Long lead time flood forecasting is very important for large watershed flood mitigation as it provides more time for flood warning and emergency responses. The latest numerical weather forecast models can provide 1-15-day quantitative precipitation forecasting products in grid format, and coupling these products with a distributed hydrological model can produce long lead time watershed flood forecasting products. This paper studied the feasibility of coupling the Liuxihe model with the Weather Research and Forecasting quantitative precipitation forecast (WRF QPF) for large watershed flood forecasting in southern China. The WRF QPF products have three lead times, 24, 48 and 72 h, with a grid resolution of 20 km × 20 km. The Liuxihe model is set up with freely downloaded terrain properties; the model parameters were first optimized with rain gauge observed precipitation and then re-optimized with the WRF QPF. Results show that the WRF QPF is biased relative to the rain gauge precipitation, and a post-processing method is proposed to correct the WRF QPF products, which improves the flood forecasting capability. The model's performance also improves with parameter re-optimization, which suggests that the model parameters should be optimized with the QPF rather than with the rain gauge precipitation. As the lead time increases, the accuracy of the WRF QPF decreases, as does the flood forecasting capability. Flood forecasting products produced by coupling the Liuxihe model with the WRF QPF provide a good reference for large watershed flood warning because of their long lead time and reasonable results.
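The post-processing method itself is not described in the abstract. The sketch below shows one common, much simpler possibility (a single multiplicative bias correction of the QPF field against gauge totals) purely as an illustration of the idea; the function name bias_correct and all numbers are hypothetical, not the scheme used with the Liuxihe/WRF coupling.

```python
# Illustrative only: multiplicative bias correction of a gridded QPF field
# against rain gauge observations (all values in mm of accumulated rainfall).
import numpy as np

def bias_correct(qpf_grid, qpf_at_gauges, gauge_obs, eps=0.1):
    """Scale the whole QPF field by the ratio of observed to forecast
    accumulation at the gauge locations; eps avoids division by zero."""
    factor = (np.sum(gauge_obs) + eps) / (np.sum(qpf_at_gauges) + eps)
    return qpf_grid * factor

# hypothetical 20 km x 20 km cells and two gauges
qpf_grid = np.array([[12.0, 8.0], [5.0, 20.0]])
corrected = bias_correct(qpf_grid,
                         qpf_at_gauges=np.array([10.0, 18.0]),
                         gauge_obs=np.array([14.0, 22.0]))
print(corrected)
```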
Modelling Solar Energetic Particle Events Using the iPATH Model
NASA Astrophysics Data System (ADS)
Li, G.; Hu, J.; Ao, X.; Zank, G. P.; Verkhoglyadova, O. P.
2016-12-01
Solar Energetic Particle (SEP) events are the No. 1 space weather hazard. Understanding how particles are energized and propagated in these events is of practical concern to manned space missions. In particular, both the radial evolution and the longitudinal extent of a gradual SEP event are central topics for space weather forecasting. In this talk, I discuss the improved Particle Acceleration and Transport in the Heliosphere (iPATH) model. The iPATH model consists of three parts: (1) an updated ZEUS3D V3.5 MHD module that models the background solar wind and the initiation of a CME in a 2D domain; (2) an updated shock acceleration module in which we investigate particle acceleration at different longitudinal locations along the surface of a CME-driven shock, where accelerated particle spectra are obtained at the shock under the diffusive shock acceleration mechanism and shock parameters and particle distributions are recorded as inputs for the next part; and (3) an updated transport module in which we follow the transport of accelerated particles from the shock to any destination (e.g., Earth and/or Mars) using a Monte Carlo method. Both pitch angle scattering due to MHD turbulence and perpendicular diffusion across the magnetic field are included. Our iPATH model is therefore intrinsically 2D in nature. The model is capable of generating time-intensity profiles and instantaneous particle spectra at various locations and can greatly improve our current space weather forecasting capability.
Structure and dynamics of the coronal magnetic field
NASA Technical Reports Server (NTRS)
VanHoven, Gerard; Schnack, Dalton D.
1996-01-01
The last few years have seen a marked increase in the sophistication of models of the solar corona. This has been brought about by a confluence of three key elements. First, the collection of high-resolution observations of the Sun, both in space and time, has grown tremendously. The SOHO (Solar Heliospheric Observatory) mission is providing additional correlated high-resolution magnetic, white-light and spectroscopic observations. Second, the power and availability of supercomputers has made two- and three-dimensional modeling routine. Third, the sophistication of the models themselves, both in their geometrical realism and in the detailed physics that has been included, has improved significantly. The support from our current Space Physics Theory grant has allowed us to exploit this confluence of capabilities. We have carried out direct comparisons between observations and models of the solar corona. The agreement between simulated coronal structure and observations has verified that the models are mature enough for detailed analysis, as we will describe. The development of this capability is especially timely, since observations obtained from three space missions that are underway (Ulysses, WIND and SOHO) offer an opportunity for significant advances in our understanding of the corona and heliosphere. Through this interplay of observations and theory we can improve our understanding of the Sun. Our achievements thus far include progress modeling the large-scale structure of the solar corona, three-dimensional models of active region fields, development of emerging flux and current, formation and evolution of coronal loops, and coronal heating by current filaments.
Automatic inference of multicellular regulatory networks using informative priors.
Sun, Xiaoyun; Hong, Pengyu
2009-01-01
To fully understand the mechanisms governing animal development, computational models and algorithms are needed to enable quantitative studies of the underlying regulatory networks. We developed a mathematical model based on dynamic Bayesian networks to model multicellular regulatory networks that govern cell differentiation processes. A machine-learning method was developed to automatically infer such a model from heterogeneous data. We show that the model inference procedure can be greatly improved by incorporating interaction data across species. The proposed approach was applied to C. elegans vulval induction to reconstruct a model capable of simulating C. elegans vulval induction under 73 different genetic conditions.
A qualitative study of vortex trapping capability for lift enhancement on unconventional wing
NASA Astrophysics Data System (ADS)
Salleh, M. B.; Kamaruddin, N. M.; Mohamed-Kassim, Z.
2018-05-01
Lift enhancement using a passive vortex trapping technique offers a great advantage in small aircraft design as it can improve aerodynamic performance and reduce the weight of the wing. To achieve this aim, a qualitative study of the flow structures across wing models with cavities has been performed using the smoke-wire visualisation technique. An experiment was conducted at a low Reynolds number of 26,000 with angles of attack (α) of 0°, 5°, 10° and 15° to investigate the vortex trapping capability of semi-circular leading edge (SCLE) and elliptical leading edge (ELE) flat-plate wing models with cavities. Results from the qualitative study revealed distinct flow structures for the tested wing models. The SCLE wing models were able to trap stable rotating vortices for α ≤ 10°, whereas the ability of the ELE wing models to suppress flow separation allowed stable clockwise vortices to be trapped inside the cavities even at α > 10°. The trapped vortices were found to have the potential to increase lift on the unconventional wing models.
Learning reliable manipulation strategies without initial physical models
NASA Technical Reports Server (NTRS)
Christiansen, Alan D.; Mason, Matthew T.; Mitchell, Tom M.
1990-01-01
A description is given of a robot, possessing limited sensory and effectory capabilities but no initial model of the effects of its actions on the world, that acquires such a model through exploration, practice, and observation. By acquiring an increasingly correct model of its actions, it generates increasingly successful plans to achieve its goals. In an apparently nondeterministic world, achieving reliability requires the identification of reliable actions and a preference for using such actions. Furthermore, by selecting its training actions carefully, the robot can significantly improve its learning rate.
AROME-Arctic: New operational NWP model for the Arctic region
NASA Astrophysics Data System (ADS)
Süld, Jakob; Dale, Knut S.; Myrland, Espen; Batrak, Yurii; Homleid, Mariken; Valkonen, Teresa; Seierstad, Ivar A.; Randriamampianina, Roger
2016-04-01
In the frame of the EU-funded project ACCESS (Arctic Climate Change, Economy and Society), MET Norway aimed 1) to describe the present monitoring and forecasting capabilities in the Arctic; and 2) to identify the key factors limiting the forecasting capabilities and to give recommendations on key areas to improve the forecasting capabilities in the Arctic. We have observed that NWP forecast quality is lower in the Arctic than in the regions further south. Earlier research indicated that one of the factors behind this is the composition of the observing system in the Arctic, in particular the scarcity of conventional observations. To further assess possible strategies for alleviating the situation and propose scenarios for a future Arctic observing system, we have performed a set of experiments to gain a more detailed insight into the contribution of the components of the present observing system in a regional state-of-the-art non-hydrostatic NWP model using the AROME physics (Seity et al., 2011) at 2.5 km horizontal resolution - AROME-Arctic. Our observing system experiments showed that conventional observations (Synop, Buoys) can play an important role in correcting the surface state of the model, but also that the present upper-air conventional observations (Radiosondes, Aircraft) in the area are too scarce to have a significant effect on forecasts. We demonstrate that satellite sounding data play an important role in improving forecast quality. This is the case for satellite temperature sounding data (AMSU-A, IASI) as well as for satellite moisture sounding data (AMSU-B/MHS, IASI). With these sets of observations, AROME-Arctic clearly performs better in forecasting extreme events such as polar lows. For more details, see the presentation by Randriamampianina et al. in this session. The encouraging performance of AROME-Arctic led us to implement it, with more observations and improved settings, in daily runs with the objective of replacing our current operational Arctic mesoscale HIRLAM (High Resolution Limited Area Model) NWP model. This presentation will discuss in detail the operational implementation of the AROME-Arctic model together with post-processing methods. Services planned for the Arctic region covered by the model, such as online weather forecasting (yr.no) and tracking of polar lows (barentswatch.no), are also discussed.
Concerns over modeling and warning capabilities in wake of Tohoku Earthquake and Tsunami
NASA Astrophysics Data System (ADS)
Showstack, Randy
2011-04-01
Improved earthquake models, better tsunami modeling and warning capabilities, and a review of nuclear power plant safety are all greatly needed following the 11 March Tohoku earthquake and tsunami, according to scientists at the European Geosciences Union's (EGU) General Assembly, held 3-8 April in Vienna, Austria. EGU quickly organized a morning session of oral presentations and an afternoon panel discussion less than 1 month after the earthquake and the tsunami and the resulting crisis at Japan's Fukushima nuclear power plant, which has now been identified as having reached the same level of severity as the 1986 Chernobyl disaster. Many of the scientists at the EGU sessions expressed concern about the inability to have anticipated the size of the earthquake and the resulting tsunami, which appears likely to have caused most of the fatalities and damage, including damage to the nuclear plant.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hathaway, M.D.; Wood, J.R.
1997-10-01
CFD codes capable of utilizing multi-block grids provide the capability to analyze the complete geometry of centrifugal compressors. Attendant with this increased capability is potentially increased grid setup time and more computational overhead with the resultant increase in wall clock time to obtain a solution. If the increase in difficulty of obtaining a solution significantly improves the solution from that obtained by modeling the features of the tip clearance flow or the typical bluntness of a centrifugal compressor's trailing edge, then the additional burden is worthwhile. However, if the additional information obtained is of marginal use, then modeling of certain features of the geometry may provide reasonable solutions for designers to make comparative choices when pursuing a new design. In this spirit a sequence of grids were generated to study the relative importance of modeling versus detailed gridding of the tip gap and blunt trailing edge regions of the NASA large low-speed centrifugal compressor for which there is considerable detailed internal laser anemometry data available for comparison. The results indicate: (1) There is no significant difference in predicted tip clearance mass flow rate whether the tip gap is gridded or modeled. (2) Gridding rather than modeling the trailing edge results in better predictions of some flow details downstream of the impeller, but otherwise appears to offer no great benefits. (3) The pitchwise variation of absolute flow angle decreases rapidly up to 8% impeller radius ratio and much more slowly thereafter. Although some improvements in prediction of flow field details are realized as a result of analyzing the actual geometry there is no clear consensus that any of the grids investigated produced superior results in every case when compared to the measurements. However, if a multi-block code is available, it should be used, as it has the propensity for enabling better predictions than a single block code.
NASA Astrophysics Data System (ADS)
Peters-Lidard, C. D.; Kumar, S. V.; Santanello, J. A.; Tian, Y.; Rodell, M.; Mocko, D.; Reichle, R.
2008-12-01
The Land Information System (LIS; http://lis.gsfc.nasa.gov; Kumar et al., 2006; Peters-Lidard et al., 2007) is a flexible land surface modeling framework that has been developed with the goal of integrating satellite- and ground-based observational data products and advanced land surface modeling techniques to produce optimal fields of land surface states and fluxes. The LIS software was the co-winner of NASA's 2005 Software of the Year award. LIS facilitates the integration of observations from Earth-observing systems and predictions and forecasts from Earth System and Earth science models into the decision-making processes of partner agencies and national organizations. Due to its flexible software design, LIS can serve both as a Problem Solving Environment (PSE) for hydrologic research to enable accurate global water and energy cycle predictions, and as a Decision Support System (DSS) to generate useful information for application areas including disaster management, water resources management, agricultural management, numerical weather prediction, air quality and military mobility assessment. LIS has evolved from two earlier efforts, the North American Land Data Assimilation System (NLDAS; Mitchell et al. 2004) and the Global Land Data Assimilation System (GLDAS; Rodell et al. 2004), that focused primarily on improving numerical weather prediction skill by improving the characterization of land surface conditions. Both of these systems now use specific configurations of the LIS software in their current implementations. LIS not only consolidates the capabilities of these two systems, but also enables a much larger variety of configurations with respect to horizontal spatial resolution, input datasets and choice of land surface model through 'plugins'. In addition to these capabilities, LIS has also been demonstrated for parameter estimation (Peters-Lidard et al., 2008; Santanello et al., 2007) and data assimilation (Kumar et al., 2008). Examples and case studies demonstrating the capabilities and impacts of LIS for hydrometeorological modeling, land data assimilation and parameter estimation will be presented.
77 FR 16585 - Open Meeting of the President's Advisory Council on Financial Capability
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-21
... private sector in improving financial capability. DATES: The meeting will be held on April 9, 2012, at 10... federal government and the private sector can work together to improve the financial capability of...
ERIC Educational Resources Information Center
Aufa, Mahrani; Saragih, Sahat; Minarni, Ani
2016-01-01
The purposes of this study were: 1) to develop problem-based learning tools in the cultural context of Aceh (PBM-BKBA) that meet the criteria of being valid, practical and effective; 2) to describe the improvement in the mathematical communication capabilities and social skills of students using the developed PBM-BKBA; and 3) to describe the process of student…
Cai, X.; Yang, Z. -L.; Fisher, J. B.; ...
2016-01-15
Climate and terrestrial biosphere models consider nitrogen an important factor in limiting plant carbon uptake, while operational environmental models view nitrogen as the leading pollutant causing eutrophication in water bodies. The community Noah land surface model with multi-parameterization options (Noah-MP) is unique in that it is the next-generation land surface model for the Weather Research and Forecasting meteorological model and for the operational weather/climate models in the National Centers for Environmental Prediction. Here in this study, we add a capability to Noah-MP to simulate nitrogen dynamics by coupling the Fixation and Uptake of Nitrogen (FUN) plant model and the Soil and Water Assessment Tool (SWAT) soil nitrogen dynamics. This model development incorporates FUN's state-of-the-art concept of carbon cost theory and SWAT's strength in representing the impacts of agricultural management on the nitrogen cycle. Parameterizations for direct root and mycorrhizal-associated nitrogen uptake, leaf retranslocation, and symbiotic biological nitrogen fixation are employed from FUN, while parameterizations for nitrogen mineralization, nitrification, immobilization, volatilization, atmospheric deposition, and leaching are based on SWAT. The coupled model is then evaluated at the Kellogg Biological Station – a Long Term Ecological Research site within the US Corn Belt. Results show that the model performs well in capturing the major nitrogen state/flux variables (e.g., soil nitrate and nitrate leaching). Furthermore, the addition of nitrogen dynamics improves the modeling of net primary productivity and evapotranspiration. The model improvement is expected to advance the capability of Noah-MP to simultaneously predict weather and water quality in fully coupled Earth system models.
Mechanics of airflow in the human nasal airways.
Doorly, D J; Taylor, D J; Schroter, R C
2008-11-30
The mechanics of airflow in the human nasal airways is reviewed, drawing on the findings of experimental and computational model studies. Modelling inevitably requires simplifications and assumptions, particularly given the complexity of the nasal airways. The processes entailed in modelling the nasal airways (from defining the model, to its production and, finally, validating the results) are critically examined, both for physical models and for computational simulations. Uncertainty still surrounds the appropriateness of the various assumptions made in modelling, particularly with regard to the nature of flow. New results are presented in which high-speed particle image velocimetry (PIV) and direct numerical simulation are applied to investigate the development of flow instability in the nasal cavity. These illustrate some of the improved capabilities afforded by technological developments for future model studies. The need for further improvements in characterising airway geometry and flow, together with promising new methods, is briefly discussed.
Plant water potential improves prediction of empirical stomatal models.
Anderegg, William R L; Wolf, Adam; Arango-Velez, Adriana; Choat, Brendan; Chmura, Daniel J; Jansen, Steven; Kolb, Thomas; Li, Shan; Meinzer, Frederick; Pita, Pilar; Resco de Dios, Víctor; Sperry, John S; Wolfe, Brett T; Pacala, Stephen
2017-01-01
Climate change is expected to lead to increases in drought frequency and severity, with deleterious effects on many ecosystems. Stomatal responses to changing environmental conditions form the backbone of all ecosystem models, but are based on empirical relationships and are not well-tested during drought conditions. Here, we use a dataset of 34 woody plant species spanning global forest biomes to examine the effect of leaf water potential on stomatal conductance and test the predictive accuracy of three major stomatal models and a recently proposed model. We find that current leaf-level empirical models have consistent biases of over-prediction of stomatal conductance during dry conditions, particularly at low soil water potentials. Furthermore, the recently proposed stomatal conductance model yields increases in predictive capability compared to current models, with particular improvement during drought conditions. Our results reveal that including stomatal sensitivity to declining water potential and consequent impairment of plant water transport will improve predictions during drought conditions and show that many biomes contain a diversity of plant stomatal strategies that range from risky to conservative stomatal regulation during water stress. Such improvements in stomatal simulation are greatly needed to help unravel and predict the response of ecosystems to future climate extremes.
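As an illustration of how water-potential sensitivity can be grafted onto an empirical stomatal model, the sketch below applies a Weibull-type downregulation factor to a Medlyn-type conductance. This is a generic example under assumed parameter values and function names, not necessarily the model proposed or tested in the paper.

```python
# Illustrative only: empirical stomatal conductance with an added
# water-potential downregulation factor beta(psi).
import numpy as np

def medlyn_gs(A, ca, vpd_kpa, g0=0.01, g1=4.0):
    """Medlyn-type stomatal conductance (mol m-2 s-1) without water stress.
    A in umol m-2 s-1, ca in umol mol-1, vapor pressure deficit in kPa."""
    return g0 + 1.6 * (1.0 + g1 / np.sqrt(vpd_kpa)) * A / ca

def beta(psi_leaf, psi50=-2.0, shape=3.0):
    """Weibull-type downregulation in [0, 1] from leaf water potential (MPa);
    equals 0.5 at psi50 by construction."""
    return np.exp(-np.log(2.0) * (psi_leaf / psi50) ** shape)

A, ca, vpd = 12.0, 400.0, 1.5
for psi in (-0.5, -1.5, -2.5):
    print(psi, medlyn_gs(A, ca, vpd) * beta(psi))
```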
Image-optimized Coronal Magnetic Field Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Shaela I.; Uritsky, Vadim; Davila, Joseph M., E-mail: shaela.i.jones-mecholsky@nasa.gov, E-mail: shaela.i.jonesmecholsky@nasa.gov
We have reported previously on a new method we are developing for using image-based information to improve global coronal magnetic field models. In that work, we presented early tests of the method, which proved its capability to improve global models based on flawed synoptic magnetograms, given excellent constraints on the field in the model volume. In this follow-up paper, we present the results of similar tests given field constraints of a nature that could realistically be obtained from quality white-light coronagraph images of the lower corona. We pay particular attention to difficulties associated with the line-of-sight projection of features outside of the assumed coronagraph image plane and the effect on the outcome of the optimization of errors in the localization of constraints. We find that substantial improvement in the model field can be achieved with these types of constraints, even when magnetic features in the images are located outside of the image plane.
Space Shuttle Orbiter oxygen partial pressure sensing and control system improvements
NASA Technical Reports Server (NTRS)
Frampton, Robert F.; Hoy, Dennis M.; Kelly, Kevin J.; Walleshauser, James J.
1992-01-01
A program aimed at developing a new PPO2 oxygen sensor and a replacement amplifier for the Space Shuttle Orbiter is described. Experimental design methodologies used in the test and modeling process made it possible to enhance the effectiveness of the program and to reduce its cost. Significant cost savings are due to the increased lifetime of the basic sensor cell, the maximization of useful sensor life through an increased amplifier gain adjustment capability, the use of streamlined production processes for the manufacture of the assemblies, and the refurbishment capability of the replacement sensor.
NASA Technical Reports Server (NTRS)
Conway, R.; Matuck, G. N.; Roe, J. M.; Taylor, J.; Turner, A.
1975-01-01
A vortex information display system is described which provides flexible control through system-user interaction for collecting wing-tip-trailing vortex data, processing this data in real time, displaying the processed data, storing raw data on magnetic tape, and post processing raw data. The data is received from two asynchronous laser Doppler velocimeters (LDV's) and includes position, velocity, and intensity information. The raw data is written onto magnetic tape for permanent storage and is also processed in real time to locate vortices and plot their positions as a function of time. The interactive capability enables the user to make real time adjustments in processing data and provides a better definition of vortex behavior. Displaying the vortex information in real time produces a feedback capability to the LDV system operator allowing adjustments to be made in the collection of raw data. Both raw data and processing can be continually upgraded during flyby testing to improve vortex behavior studies. The post-analysis capability permits the analyst to perform in-depth studies of test data and to modify vortex behavior models to improve transport predictions.
USM3D Analysis of Low Boom Configuration
NASA Technical Reports Server (NTRS)
Carter, Melissa B.; Campbell, Richard L.; Nayani, Sudheer N.
2011-01-01
In the past few years considerable improvement has been made in NASA's in-house boom prediction capability. As part of this improved capability, the USM3D Navier-Stokes flow solver, when combined with a suitable unstructured grid, went from accurately predicting boom signatures at 1 body length to 10 body lengths. Since that time, the research emphasis has shifted from analysis to the design of supersonic configurations with boom signature mitigation. In order to design an aircraft, techniques for accurately predicting boom and drag need to be determined. This paper compares CFD results with wind tunnel experimental results for a Gulfstream reduced boom and drag configuration. Two different wind-tunnel models were designed and tested for drag and boom data. The goal of this study was to assess USM3D's capability for predicting both boom and drag characteristics. Overall, USM3D coupled with a grid that was sheared and stretched was able to reasonably predict the boom signature. The computational drag polar matched the experimental results for lift coefficients above 0.1, despite some mismatch in the predicted lift-curve slope.
The Defense Threat Reduction Agency's Technical Nuclear Forensics Research and Development Program
NASA Astrophysics Data System (ADS)
Franks, J.
2015-12-01
The Defense Threat Reduction Agency (DTRA) Technical Nuclear Forensics (TNF) Research and Development (R&D) Program's overarching goal is to design, develop, demonstrate, and transition advanced technologies and methodologies that improve the interagency operational capability to provide forensics conclusions after the detonation of a nuclear device. This goal is attained through the execution of three focus areas covering the span of the TNF process to enable strategic decision-making (attribution): Nuclear Forensic Materials Exploitation - development of targeted technologies, methodologies and tools enabling the timely collection, analysis and interpretation of detonation materials; Prompt Nuclear Effects Exploitation - improvement of ground-based capabilities to collect prompt nuclear device outputs and effects data for rapid, complementary and corroborative information; and Nuclear Forensics Device Characterization - development of a validated and verified capability to reverse model a nuclear device with high confidence from observables (e.g., prompt diagnostics, sample analysis, etc.) seen after an attack. This presentation will outline DTRA's TNF R&D strategy and current investments, with efforts focusing on: (1) introducing new technical data collection capabilities (e.g., ground-based prompt diagnostics sensor systems; innovative debris collection and analysis); (2) developing new TNF process paradigms and concepts of operations to decrease timelines and uncertainties, and increase results confidence; (3) enhanced validation and verification (V&V) of capabilities through technology evaluations and demonstrations; and (4) updated weapon output predictions to account for the modern threat environment. A key challenge to expanding these efforts to a global capability is the need for increased post-detonation TNF international cooperation, collaboration and peer reviews.
Panel management, team culture, and worklife experience.
Willard-Grace, Rachel; Dubé, Kate; Hessler, Danielle; O'Brien, Bridget; Earnest, Gillian; Gupta, Reena; Shunk, Rebecca; Grumbach, Kevin
2015-09-01
Burnout and professional dissatisfaction are threats to the primary care workforce. We investigated the relationship between panel management capability, team culture, cynicism, and perceived "do-ability" of primary care among primary care providers (PCPs) and staff in primary care practices. We surveyed 326 PCPs and 142 staff members in 10 county-administered, 6 university-run, and 3 Veterans Affairs primary care clinics in a large urban area in 2013. Predictor variables included capability for performing panel management and perception of team culture. Outcome variables included 2 work experience measures--the Maslach Burnout Inventory cynicism scale and a 1-item measure of the "do-ability" of primary care this year compared with last year. Generalized estimating equation (GEE) models were used to account for clustering at the clinic level. Greater panel management capability and higher team culture were associated with lower cynicism among PCPs and staff and higher reported "do-ability" of primary care among PCPs. Panel management capability and team culture interacted to predict the 2 work experience outcomes. Among PCPs and staff reporting high team culture, there was little association between panel management capability and the outcomes, which were uniformly positive. However, there was a strong relationship between greater panel management capability and improved work experience outcomes for PCPs and staff reporting low team culture. Team-based processes of care such as panel management may be an important strategy to protect against cynicism and dissatisfaction in primary care, particularly in settings that are still working to improve their team culture.
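A hedged sketch of the kind of clustered analysis described (a GEE with exchangeable within-clinic correlation and a panel-management by team-culture interaction) is given below; the column names and toy data frame are hypothetical stand-ins, not the study dataset.

```python
# Sketch only: GEE with exchangeable correlation within clinics and an
# interaction term, mirroring the analysis design described in the abstract.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "cynicism": rng.normal(2.5, 1.0, n),            # hypothetical outcome scale
    "panel_capability": rng.uniform(1, 5, n),
    "team_culture": rng.uniform(1, 5, n),
    "clinic": rng.integers(0, 19, n),               # clustering unit (19 clinics)
})

model = smf.gee("cynicism ~ panel_capability * team_culture",
                groups="clinic", data=df,
                cov_struct=sm.cov_struct.Exchangeable(),
                family=sm.families.Gaussian())
print(model.fit().summary())
```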
Edge printability: techniques used to evaluate and improve extreme wafer edge printability
NASA Astrophysics Data System (ADS)
Roberts, Bill; Demmert, Cort; Jekauc, Igor; Tiffany, Jason P.
2004-05-01
The economics of semiconductor manufacturing have forced process engineers to develop techniques to increase wafer yield. Improvements in process controls and uniformities in all areas of the fab have reduced film thickness variations at the very edge of the wafer surface. This improved uniformity has provided the opportunity to consider decreasing edge exclusions, and now the outermost extents of the wafer must be considered in the yield model and expectations. These changes have increased the requirements on lithography to improve wafer edge printability in areas that previously were not even coated. This has taxed all software and hardware components used in defining the optical focal plane at the wafer edge. We have explored techniques to determine the capabilities of extreme wafer edge printability and the components of the systems that influence this printability. We will present current capabilities and new detection techniques and the influence that the individual hardware and software components have on edge printability. We will show effects of focus sensor designs, wafer layout, utilization of dummy edge fields, the use of non-zero overlay targets and chemical/optical edge bead optimization.
Improving Distributed Diagnosis Through Structural Model Decomposition
NASA Technical Reports Server (NTRS)
Bregon, Anibal; Daigle, Matthew John; Roychoudhury, Indranil; Biswas, Gautam; Koutsoukos, Xenofon; Pulido, Belarmino
2011-01-01
Complex engineering systems require efficient fault diagnosis methodologies, but centralized approaches do not scale well, and this motivates the development of distributed solutions. This work presents an event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, by using the structural model decomposition capabilities provided by Possible Conflicts. We develop a distributed diagnosis algorithm that uses residuals computed by extending Possible Conflicts to build local event-based diagnosers based on global diagnosability analysis. The proposed approach is applied to a multitank system, and results demonstrate an improvement in the design of local diagnosers. Since local diagnosers use only a subset of the residuals, and use subsystem models to compute residuals (instead of the global system model), the local diagnosers are more efficient than previously developed distributed approaches.
NASA Technical Reports Server (NTRS)
Garland, D. B.; Harris, J. L.
1980-01-01
Static and forward speed tests were made in a 40- by 80-foot wind tunnel of a large-scale, ejector-powered V/STOL aircraft model. Modifications were made to the model following earlier tests primarily to improve longitudinal acceleration capability during transition from hovering to wingborne flight. A rearward deflection of the fuselage augmentor thrust vector was shown to be beneficial in this regard. Other augmentor modifications were tested, notably the removal of both endplates, which improved acceleration performance at the higher transition speeds. The model tests again demonstrated minimal interference of the fuselage augmentor on aerodynamic lift. A flapped canard surface also showed negligible influence on the performance of the wing and of the fuselage augmentor.
Christensen, Nikolaj K; Minsley, Burke J.; Christensen, Steen
2017-01-01
We present a new methodology to combine spatially dense high-resolution airborne electromagnetic (AEM) data and sparse borehole information to construct multiple plausible geological structures using a stochastic approach. The method developed allows for quantification of the performance of groundwater models built from different geological realizations of structure. Multiple structural realizations are generated using geostatistical Monte Carlo simulations that treat sparse borehole lithological observations as hard data and dense geophysically derived structural probabilities as soft data. Each structural model is used to define 3-D hydrostratigraphical zones of a groundwater model, and the hydraulic parameter values of the zones are estimated by using nonlinear regression to fit hydrological data (hydraulic head and river discharge measurements). Use of the methodology is demonstrated for a synthetic domain having structures of categorical deposits consisting of sand, silt, or clay. It is shown that using dense AEM data with the methodology can significantly improve the estimated accuracy of the sediment distribution as compared to when borehole data are used alone. It is also shown that this use of AEM data can improve the predictive capability of a calibrated groundwater model that uses the geological structures as zones. However, such structural models will always contain errors because even with dense AEM data it is not possible to perfectly resolve the structures of a groundwater system. It is shown that when using such erroneous structures in a groundwater model, they can lead to biased parameter estimates and biased model predictions, therefore impairing the model's predictive capability.
Stoto, Michael A; Nelson, Christopher; Savoia, Elena; Ljungqvist, Irina; Ciotti, Massimo
2017-01-01
Improving preparedness in the European region requires a clear understanding of what European Union (EU) member states should be able to do, whether acting internally or in cooperation with each other or the EU and other multilateral organizations. We have developed a preparedness logic model that specifies the aims and objectives of public health preparedness, as well as the response capabilities and preparedness capacities needed to achieve them. The capabilities, which describe the ability to effectively use capacities to identify, characterize, and respond to emergencies, are organized into 5 categories. The first 3 categories—(1) assessment; (2) policy development, adaptation, and implementation; and (3) prevention and treatment services in the health sector—represent what the public health system must accomplish to respond effectively. The fourth and fifth categories represent a series of interrelated functions needed to ensure that the system fulfills its assessment, policy development, and prevention and treatment roles: (4) coordination and communication regards information sharing within the public health system, incident management, and leadership, and (5) emergency risk communication focuses on communication with the public. This model provides a framework for identifying what to measure in capacity inventories, exercises, critical incident analyses, and other approaches to assessing public health emergency preparedness, not how to measure them. Focusing on a common set of capacities and capabilities to measure allows for comparisons both over time and between member states, which can enhance learning and sharing results and help identify both strengths and areas for improvement of public health emergency preparedness in the EU. PMID:29058967
NASA Tools for Climate Impacts on Water Resources
NASA Technical Reports Server (NTRS)
Toll, David; Doorn, Brad
2010-01-01
Climate and environmental change are expected to fundamentally alter the nation's hydrological cycle and water availability. Satellite instruments provide global or near-global coverage, allowing for consistent, well-calibrated, and equivalent-quality data of the Earth system. A major goal of NASA climate and environmental change research is to create multi-instrument data sets that span the multi-decadal time scales of climate change and to combine these data with those from modeling and surface-based observing systems to improve process understanding and predictions. NASA Earth science data and analyses will ultimately enable more accurate climate prediction and characterization of uncertainties. NASA's Applied Sciences Program works with other groups, including other federal agencies, to transition demonstrated observational capabilities to operational capabilities. A summary of some of NASA's tools for improved water resources management will be presented.
NASA Astrophysics Data System (ADS)
Kim, G. H.; Kim, A. R.; Kim, S.; Park, M.; Yu, I. K.; Seong, K. C.; Won, Y. J.
2011-11-01
A superconducting magnetic energy storage (SMES) system is a DC-current-driven device and can be utilized to improve power quality, particularly in connection with renewable energy sources, due to its higher efficiency and faster response than other devices. This paper suggests a novel connection topology for SMES which can smooth the output power flow of a wind power generation system (WPGS). The structure of the proposed system is cost-effective because it requires one less power converter than a conventional SMES application. A further advantage of SMES in the proposed system is that it improves the low voltage ride through (LVRT) capability of the permanent magnet synchronous generator (PMSG) type WPGS. The proposed system, including the SMES, has been modeled and analyzed in PSCAD/EMTDC. The simulation results show the effectiveness of the novel SMES application strategy in not only smoothing the output power of the PMSG but also improving the LVRT capability of the PMSG-type WPGS.
A portable monitor system for biology signal based on singlechip
NASA Astrophysics Data System (ADS)
Tu, Qiaoling; Guo, Jianhua; He, Li; Xu, Xia
2005-12-01
The objectives of this paper are to improve the accuracy of the electrocardiogram and temperature signals, improve the system's stability and dynamic response capability, and decrease the power consumption and volume of the system. The basic method is to make use of the on-chip resources of the single-chip microcontroller, such as the precise constant-current source, hardware multiplier, and ADC. The microcontroller is a TI (Texas Instruments) MSP430F449. A simple integral-coefficient band-rejection digital filter was designed for processing the electrocardiogram signal. The temperature error caused by the decline of the battery voltage is compensated for. An automatic discharge path was designed into the circuit to improve its dynamic response capability. The results indicate that the 50 Hz power-line interference and the baseline drift are filtered out, the waveform is clear, the accuracy of the temperature measurement is 0.03 °C, and the current consumption is less than 1.3 mA. The system can meet the requirements of ward monitoring and surgical monitoring.
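The exact filter coefficients are not given in the abstract; the sketch below only illustrates the general idea of integer-coefficient filtering for 50 Hz rejection plus baseline removal, with an assumed sampling rate and window lengths, and is not the authors' implementation.

```python
# Illustrative sketch: integer-coefficient filters of the kind often used on
# low-power MCUs for ECG preprocessing. Assumes a sampling rate that is an
# integer multiple of 50 Hz (hypothetical values throughout).
import numpy as np

FS = 400             # Hz, assumed; one 50 Hz period = 8 samples
N_NOTCH = FS // 50   # moving-average length that nulls 50 Hz and its harmonics
N_BASE = 256         # long window whose mean tracks slow baseline drift

def comb_notch(x):
    """One-period moving average: unity gain at DC, zeros at 50, 100, ... Hz."""
    kernel = np.ones(N_NOTCH) / N_NOTCH   # unit taps, single divide (shift on MCU)
    return np.convolve(x, kernel, mode="same")

def remove_baseline(x):
    """High-pass by subtracting a long moving average of the signal."""
    baseline = np.convolve(x, np.ones(N_BASE) / N_BASE, mode="same")
    return x - baseline

t = np.arange(4 * FS) / FS
ecg_like = np.sin(2 * np.pi * 1.2 * t)                          # stand-in signal
noisy = ecg_like + 0.3 * np.sin(2 * np.pi * 50 * t) + 0.5 * t / t[-1]
clean = remove_baseline(comb_notch(noisy))
```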
NASA Technical Reports Server (NTRS)
Starr, D. OC. (Editor); Melfi, S. Harvey (Editor)
1991-01-01
The proposed GEWEX Water Vapor Project (GVaP) addresses fundamental deficiencies in the present understanding of moist atmospheric processes and the role of water vapor in the global hydrologic cycle and climate. Inadequate knowledge of the distribution of atmospheric water vapor and its transport is a major impediment to progress in achieving a fuller understanding of various hydrologic processes and a capability for reliable assessment of potential climatic change on global and regional scales. GVaP will promote significant improvements in knowledge of atmospheric water vapor and moist processes as well as in present capabilities to model these processes on global and regional scales. GVaP complements a number of ongoing and planned programs focused on various aspects of the hydrologic cycle. The goal of GVaP is to improve understanding of the role of water vapor in meteorological, hydrological, and climatological processes through improved knowledge of water vapor and its variability on all scales. A detailed description of the GVaP is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Irminger, Philip; Starke, Michael R; Dimitrovski, Aleksandar D
2014-01-01
Power system equipment manufacturers and researchers continue to experiment with novel overhead electric conductor designs that support better conductor performance and address congestion issues. To address the technology gap in testing these novel designs, Oak Ridge National Laboratory constructed the Powerline Conductor Accelerated Testing (PCAT) facility to evaluate the performance of novel overhead conductors in an accelerated fashion in a field environment. Additionally, PCAT has the capability to test advanced sensors and measurement methods for assessing overhead conductor performance and condition. Equipped with extensive measurement and monitoring devices, PCAT provides a platform to improve/validate conductor computer models and assess the performance of novel conductors. The PCAT facility and its testing capabilities are described in this paper.
Integrated Medical Model (IMM) Optimization Version 4.0 Functional Improvements
NASA Technical Reports Server (NTRS)
Arellano, John; Young, M.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Goodenow, D. A.; Myers, J. G.
2016-01-01
The IMM's ability to assess mission outcome risk levels relative to available resources provides a unique capability for guiding the selection of optimal operational medical kit and vehicle resources. Post-processing optimization allows IMM to optimize essential resources to improve a specific model outcome, such as maximization of the Crew Health Index (CHI) or minimization of the probability of evacuation (EVAC) or of the loss of crew life (LOCL). Mass and/or volume constrain the optimized resource set. The IMM's probabilistic simulation uses input data on one hundred medical conditions to simulate medical events that may occur in spaceflight, the resources required to treat those events, and the resulting impact to the mission based on specific crew and mission characteristics. Because IMM version 4.0 provides for partial treatment of medical events, IMM Optimization 4.0 scores resources at the individual resource unit increment level, as opposed to the full condition-specific treatment set level used in version 3.0. This allows the inclusion of as many resources as possible in the event that an entire set of resources called out for treatment cannot satisfy the constraints. IMM Optimization version 4.0 adds capabilities that increase efficiency by creating multiple resource sets based on differing constraints and priorities: CHI, EVAC, or LOCL. It also provides sets of resources that improve mission-related IMM v4.0 outputs, with improved performance compared to the prior optimization. The new optimization offers much improved fidelity, increasing the utility of IMM 4.0 for decision support.
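As a rough illustration of constrained resource selection of this kind, the sketch below greedily packs hypothetical resource units under mass and volume budgets. It is not the IMM optimization code; the items, scores, and greedy heuristic are assumptions made for the example.

```python
# Sketch only: greedy selection of resource units under mass and volume
# constraints, standing in for the kind of post-processing optimization
# described. Scores and items are hypothetical.
from dataclasses import dataclass

@dataclass
class ResourceUnit:
    name: str
    score: float   # assumed marginal improvement to the chosen outcome (e.g., CHI)
    mass: float    # kg
    volume: float  # liters

def select(units, mass_budget, volume_budget):
    """Greedy by score per combined unit of mass and volume; returns chosen units."""
    chosen, mass, vol = [], 0.0, 0.0
    ranked = sorted(units, key=lambda u: u.score / (u.mass + u.volume), reverse=True)
    for u in ranked:
        if mass + u.mass <= mass_budget and vol + u.volume <= volume_budget:
            chosen.append(u)
            mass, vol = mass + u.mass, vol + u.volume
    return chosen

kit = [ResourceUnit("analgesic dose", 2.0, 0.02, 0.01),
       ResourceUnit("IV fluid bag", 5.0, 1.0, 1.0),
       ResourceUnit("splint", 3.0, 0.4, 0.8)]
print([u.name for u in select(kit, mass_budget=1.2, volume_budget=1.5)])
```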
Technical and Scientific Support for Passive Acoustic Monitoring in the Research Cruise MED09
2009-09-30
The sound analysis workstation developed for previous Sirena cruises was further improved with 8-channel recording capability at 192 kHz and 2... This dataset will be compared with the one produced in Sirena 08 (Alboran Sea only) and included in distribution/density models being developed at
Academic Airframe Icing Perspective
NASA Technical Reports Server (NTRS)
Bragg, Mike; Rothmayer, Alric; Thompson, David
2009-01-01
2-D ice accretion and aerodynamics are reasonably well understood for engineering applications. To significantly improve our current capabilities we need to understand 3-D effects: a) important ice accretion physics and modeling are not well understood in 3-D; and b) aerodynamics are unsteady and 3-D, especially near stall. Larger systems issues are also important and require a multidisciplinary team approach.
A Conceptual Design Model for CBT Development: A NATO Case Study
ERIC Educational Resources Information Center
Kok, Ayse
2014-01-01
CBT (computer-based training) can benefit from modern multimedia tools combined with network capabilities to overcome the limitations of traditional education. This paper focuses on CBT development to improve strategic decision-making with regard to the air command and control system for NATO staff in a virtual environment. A conceptual design for…
DEVELOPMENT WORK FOR IMPROVED HEAVY-DUTY VEHICLE MODELING CAPABILITY DATA MINING--FHWA DATASETS
A heavy-duty vehicle can produce 10 to 100 times the emissions (of NOx and PM emissions especially) of a light-duty vehicle, so heavy-duty vehicle activity needs to be well characterized. Key uncertainties with the use of MOBILE6 regarding heavy-duty vehicle emissions include th...
Meeting the needs of an ever-demanding market.
Rigby, Richard
2002-04-01
Balancing cost and performance in packaging is critical. This article outlines techniques to assist in this whilst delivering added value and product differentiation. The techniques include a rigorous statistical process capable of delivering cost reduction and improved quality and a computer modelling process that can save time when validating new packaging options.
Initial Implementation of Transient VERA-CS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerlach, Andrew; Kochunas, Brendan; Salko, Robert
In this milestone the capabilities of both CTF and MPACT were extended to perform coupled transient calculations. This required several small changes in MPACT to set up the problems correctly, perform the edits correctly, and call the appropriate CTF interfaces in the right order. For CTF, revisions and corrections to the transient timestepping algorithm were made, as well as the addition of a new interface subroutine to allow MPACT to drive CTF at each timestep. With the modifications completed, the initial coupled capability was demonstrated on some problems used for code verification, a hypothetical small mini-core, and a Watts Bar demonstration problem. For each of these cases the results showed good agreement with the previous MPACT internal TH feedback model that relied on a simplified fuel heat conduction model and simplified coolant treatment. After the pulse the results are notably different as expected, where the effects of convection of heat to the coolant can be observed. Areas for future work were discussed, including assessment and development of the CTF dynamic fuel deformation and gap conductance models, addition of suitable transient boiling and CHF models for the rapid heating and cooling rates seen in RIAs, additional validation and demonstration work, and areas for improvement to the code input and output capabilities.
Thermal niche estimators and the capability of poor dispersal species to cope with climate change
Sánchez-Fernández, David; Rizzo, Valeria; Cieslak, Alexandra; Faille, Arnaud; Fresneda, Javier; Ribera, Ignacio
2016-01-01
For management strategies in the context of global warming, accurate predictions of species response are mandatory. However, to date most predictions are based on niche (bioclimatic) models that usually overlook biotic interactions, behavioral adjustments or adaptive evolution, and assume that species can disperse freely without constraints. The deep subterranean environment minimises these uncertainties, as it is simple, homogeneous and has constant environmental conditions. It is thus an ideal model system to study the effect of global change on species with poor dispersal capabilities. We assess the potential fate of a lineage of troglobitic beetles under global change predictions using different approaches to estimate their thermal niche: bioclimatic models, rates of thermal niche change estimated from a molecular phylogeny, and data from physiological studies. Using bioclimatic models, at most 60% of the species were predicted to have suitable conditions in 2080. Considering the rates of thermal niche change did not improve this prediction. However, physiological data suggest that subterranean species have a broad thermal tolerance, allowing them to withstand temperatures never experienced through their evolutionary history. These results stress the need for experimental approaches to assess the capability of poor dispersal species to cope with temperatures outside those they currently experience. PMID:26983802
Incompressible viscous flow simulations of the NFAC wind tunnel
NASA Technical Reports Server (NTRS)
Champney, Joelle Milene
1986-01-01
The capabilities of an existing 3-D incompressible Navier-Stokes flow solver, INS3D, are extended and improved to solve turbulent flows through the incorporation of zero- and two-equation turbulence models. The two-equation model equations are solved in their high Reynolds number form and utilize wall functions in the treatment of solid wall boundary conditions. The implicit approximate factorization scheme is modified to improve the stability of the two-equation solver. Applications to the 3-D viscous flow inside the 80 by 120 feet open return wind tunnel of the National Full Scale Aerodynamics Complex (NFAC) are discussed and described.
NASA Technical Reports Server (NTRS)
Zhu, Dongming
2017-01-01
Environmental barrier coatings (EBCs) are considered technologically important because of the critical needs and their ability to effectively protect the turbine hot-section SiC/SiC ceramic matrix composite (CMC) components in harsh engine combustion environments. The development of NASA's advanced environmental barrier coatings has aimed at significantly improving the coating system temperature capability, stability, erosion-impact resistance, and CMAS resistance for SiC/SiC turbine airfoil and combustor component applications. The NASA environmental barrier coating developments have also emphasized thermo-mechanical creep and fatigue resistance in simulated engine heat flux and environments. Experimental results and models for advanced EBC systems will be presented to help establish advanced EBC composition design methodologies, performance modeling, and life prediction for achieving prime-reliant, durable environmental coating systems for 2700-3000 F engine component applications. Major technical barriers in developing environmental barrier coating systems and in coating integration with next-generation composites having further improved temperature capability, environmental stability, and EBC-CMC fatigue-environment system durability will be discussed.
NASA Astrophysics Data System (ADS)
Taori, Alok; Raghunath, Karnam; Jayaraman, Achuthan
We use a combination of simultaneous measurements made with a Rayleigh lidar and O2 airglow monitoring to improve the lidar investigation capability to cover a higher altitude range. We feed instantaneous O2 airglow temperatures, instead of model values, at the top altitude for the subsequent integration method of temperature retrieval using Rayleigh lidar backscattered signals. Using this method, errors in the lidar temperature estimates converge at higher altitudes, indicating better altitude coverage compared to regular methods where model temperatures are used instead of real-time measurements. This improvement enables the measurement of short-period waves at upper mesospheric altitudes (~90 km). With two case studies, we show that above 60 km the amplitudes of a few short-period waves increase drastically, while some of the short-period waves show either damping or saturation. We claim that by using such combined measurements, significant and cost-effective progress can be made in the understanding of short-period wave processes, which are important for the coupling across the different atmospheric regions.
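As background on the top-down integration method mentioned here, the following is a minimal sketch of the standard hydrostatic/ideal-gas retrieval (not the authors' exact implementation; the constants and the synthetic profile are illustrative). Seeding the top-altitude temperature with the O2 airglow value rather than a model value propagates into every level below:

```python
import numpy as np

def rayleigh_temperature(z, rho_rel, T_top, g=9.5, M=0.02897, R=8.314):
    """Top-down temperature retrieval from a relative density profile.
    z in m (ascending), rho_rel relative density, T_top seed temperature [K]
    at z[-1]; g, M, R are illustrative mesospheric constants."""
    T = np.zeros_like(rho_rel)
    T[-1] = T_top                        # here: O2 airglow temperature, not a model value
    p = rho_rel[-1] * R * T_top / M      # pressure consistent with the seed (relative units)
    for i in range(len(z) - 2, -1, -1):
        dz = z[i + 1] - z[i]
        # hydrostatic build-up of pressure while integrating downward
        p += 0.5 * (rho_rel[i] + rho_rel[i + 1]) * g * dz
        T[i] = p * M / (rho_rel[i] * R)
    return T

# Synthetic check: an isothermal atmosphere should be recovered as isothermal.
z = np.arange(30e3, 90e3, 1e3)
H = 8.314 * 200.0 / (0.02897 * 9.5)      # scale height for T = 200 K
rho = np.exp(-z / H)
print(rayleigh_temperature(z, rho, T_top=200.0)[:3])   # ~200 K everywhere
```

Because the seed error decays as the pressure integral accumulates downward, a more accurate top boundary (here, the airglow temperature) directly reduces the altitude at which the retrieval becomes reliable, which is the improvement the abstract reports.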
Predictive protocol of flocks with small-world connection pattern.
Zhang, Hai-Tao; Chen, Michael Z Q; Zhou, Tao
2009-01-01
By introducing a predictive mechanism with small-world connections, we propose a new motion protocol for self-driven flocks. The small-world connections are implemented by randomly adding long-range interactions from the leader to a few distant agents, namely, pseudoleaders. The leader can directly affect the pseudoleaders, thereby influencing all the other agents through them efficiently. Moreover, these pseudoleaders are able to predict the leader's motion several steps ahead and use this information in decision making towards coherent flocking with more stable formation. It is shown that drastic improvement can be achieved in terms of both the consensus performance and the communication cost. From the engineering point of view, the current protocol allows for a significant improvement in the cohesion and rigidity of the formation at a fairly low cost of adding a few long-range links embedded with predictive capabilities. Significantly, this work uncovers an important feature of flocks that predictive capability and long-range links can compensate for the insufficiency of each other. These conclusions are valid for both the attractive and repulsive swarm model and the Vicsek model.
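As a minimal, illustrative alignment-only sketch of a leader/pseudoleader flock (not the authors' exact protocol: the neighbourhood rule, noise, and prediction horizon are simplified, and with a constant-velocity leader the prediction reduces to adopting the current leader velocity):

```python
import numpy as np

# Agent 0 is the leader; a few randomly chosen pseudoleaders receive the leader
# state over long-range links; all other agents average their neighbours' velocities.
rng = np.random.default_rng(1)
N, steps, dt, radius = 40, 300, 0.1, 2.5
pos = rng.uniform(0.0, 10.0, (N, 2))
vel = rng.normal(0.0, 0.2, (N, 2))
leader_vel = np.array([1.0, 0.4])                     # leader moves at constant velocity
pseudo = rng.choice(np.arange(1, N), size=4, replace=False)

for _ in range(steps):
    vel[0] = leader_vel
    # Constant-velocity prediction: the leader's extrapolated velocity equals its
    # current velocity, so pseudoleaders simply adopt it ahead of their neighbours.
    vel[pseudo] = leader_vel
    dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    neighbours = dist < radius
    mean_vel = np.stack([vel[neighbours[i]].mean(axis=0) for i in range(N)])
    followers = np.setdiff1d(np.arange(1, N), pseudo)
    vel[followers] = mean_vel[followers]
    pos += vel * dt

print("mean follower misalignment:",
      np.linalg.norm(vel[1:] - leader_vel, axis=1).mean())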
Performance of the NEXT Engineering Model Power Processing Unit
NASA Technical Reports Server (NTRS)
Pinero, Luis R.; Hopson, Mark; Todd, Philip C.; Wong, Brian
2007-01-01
NASA's Evolutionary Xenon Thruster (NEXT) project is developing an advanced ion propulsion system for future NASA missions for solar system exploration. An engineering model (EM) power processing unit (PPU) for the NEXT project was designed and fabricated by L-3 Communications under contract with NASA Glenn Research Center (GRC). This modular PPU is capable of processing from 0.5 to 7.0 kW of output power for the NEXT ion thruster. Its design includes many significant improvements for better performance over the state-of-the-art PPU. The most significant difference is the beam supply, which comprises six modules and is capable of very efficient operation through a wide voltage range because of innovative features like dual controls, module addressing, and a high current mode. The low voltage power supplies are based on elements of the previously validated NASA Solar Electric Propulsion Technology Application Readiness (NSTAR) PPU. The highly modular construction of the PPU resulted in improved manufacturability, simpler scalability, and lower cost. This paper describes the design of the EM PPU and the results of the bench-top performance tests.
Improving measurement technology for the design of sustainable cities
NASA Astrophysics Data System (ADS)
Pardyjak, Eric R.; Stoll, Rob
2017-09-01
This review identifies and discusses measurement technology gaps that are currently preventing major science leaps from being realized in the study of urban environmental transport processes. These scientific advances are necessary to better understand the links between atmospheric transport processes in the urban environment, human activities, and potential management strategies. We propose that with various improved and targeted measurements, it will be possible to provide technically sound guidance to policy and decision makers for the design of sustainable cities. This review focuses on full-scale in situ and remotely sensed measurements of atmospheric winds, temperature, and humidity in cities and links measurements to current modeling and simulation needs. A key conclusion of this review is that there is a need for urban-specific measurement techniques including measurements of highly-resolved three-dimensional fields at sampling frequencies high enough to capture small-scale turbulence processes yet also capable of covering spatial extents large enough to simultaneously capture key features of urban heterogeneity and boundary layer processes while also supporting the validation of current and emerging modeling capabilities.
SmaggIce 2.0: Additional Capabilities for Interactive Grid Generation of Iced Airfoils
NASA Technical Reports Server (NTRS)
Kreeger, Richard E.; Baez, Marivell; Braun, Donald C.; Schilling, Herbert W.; Vickerman, Mary B.
2008-01-01
The Surface Modeling and Grid Generation for Iced Airfoils (SmaggIce) software toolkit has been extended to allow interactive grid generation for multi-element iced airfoils. The essential phases of an icing effects study include geometry preparation, block creation and grid generation. SmaggIce Version 2.0 now includes these main capabilities for both single and multi-element airfoils, plus an improved flow solver interface and a variety of additional tools to enhance the efficiency and accuracy of icing effects studies. An overview of these features is given, especially the new multi-element blocking strategy using the multiple wakes method. Examples are given which illustrate the capabilities of SmaggIce for conducting an icing effects study for both single and multi-element airfoils.
The Evolution of Medical Training Simulation in the U.S. Military.
Linde, Amber S; Kunkler, Kevin
2016-01-01
The United States has been at war since 2003. During that time, training using medical simulation technology has been developed and integrated into military medical training for combat medics, nurses and surgeons. Efforts stemming from the Joint Programmatic Committee-1 (JPC-1) Medical Simulation and Training Portfolio have allowed for the improvement and advancement of military medical training by focusing on research in simulation training technology. Based upon lessons learned, capability gaps have been identified concerning the necessity to validate and enhance combat medical training simulators. These capability gaps include 1) Open Source/Open Architecture; 2) Modularity and Interoperability; and 3) Material and Virtual Reality (VR) Models. Using the capability gaps, JPC-1 has identified important research endeavors that need to be explored.
NASA Astrophysics Data System (ADS)
Prasad, K.; Thorpe, A. K.; Duren, R. M.; Thompson, D. R.; Whetstone, J. R.
2016-12-01
The National Institute of Standards and Technology (NIST) has supported the development and demonstration of a measurement capability to accurately locate greenhouse gas sources and measure their flux to the atmosphere over urban domains. However, uncertainties in transport models, which form the basis of all top-down approaches, can significantly affect our capability to attribute sources and predict their flux to the atmosphere. Reducing uncertainties between bottom-up and top-down models will require high resolution transport models as well as validation and verification of dispersion models over an urban domain. Tracer experiments involving the release of Perfluorocarbon Tracers (PFTs) at known flow rates offer the best approach for validating dispersion/transport models. However, tracer experiments are limited by cost, the ability to make continuous measurements, and environmental concerns. Natural tracer experiments, such as the leak from the Aliso Canyon underground storage facility, offer a unique opportunity to improve and validate high resolution transport models, test leak hypotheses, and estimate the amount of methane released. High spatial resolution (10 m) Large Eddy Simulations (LES) coupled with WRF atmospheric transport models were performed to simulate the dynamics of the Aliso Canyon methane plume and to quantify the source. High resolution forward simulation results were combined with aircraft and tower based in-situ measurements as well as data from NASA airborne imaging spectrometers. Comparison of simulation results with measurement data demonstrates the capability of the LES models to accurately model transport and dispersion of methane plumes over urban domains.
Lozito, Thomas P; Tuan, Rocky S
2017-03-01
The ability to regenerate damaged or lost tissues has remained the lofty goal of regenerative medicine. Unfortunately, humans, like most mammals, suffer from very minimal natural regenerative capabilities. Certain non-mammalian animal species, however, are not so limited in their healing capabilities, and several have attracted the attention of researchers hoping to recreate enhanced healing responses in humans. This review focuses on one such animal group with remarkable regenerative abilities, the lizards. As the closest relatives of mammals that exhibit enhanced regenerative abilities as adults, lizards potentially represent the most relevant model for direct comparison and subsequent improvement of mammalian healing. Lizards are able to regenerate amputated tails and exhibit adaptations that both limit tissue damage in response to injury and initiate coordinated regenerative responses. This review summarizes the salient aspects of lizard tail regeneration as they relate to the overall regenerative process and also presents the relevant information pertaining to regrowth of specific tissues, including skeletal, muscular, nervous, and vascular tissues. The goal of this review is to introduce the topic of lizard tail regeneration to new audiences with the hope of expanding the knowledge base of this underutilized but potentially powerful model organism.
NASA Astrophysics Data System (ADS)
Cao, Duc; Moses, Gregory; Delettrez, Jacques
2015-08-01
An implicit, non-local thermal conduction algorithm based on the algorithm developed by Schurtz, Nicolai, and Busquet (SNB) [Schurtz et al., Phys. Plasmas 7, 4238 (2000)] for non-local electron transport is presented and has been implemented in the radiation-hydrodynamics code DRACO. To study the model's effect on DRACO's predictive capability, simulations of shot 60 303 from OMEGA are completed using the iSNB model, and the computed shock speed vs. time is compared to experiment. Temperature outputs from the iSNB model are compared with the non-local transport model of Goncharov et al. [Phys. Plasmas 13, 012702 (2006)]. Effects on adiabat are also examined in a polar drive surrogate simulation. Results show that the iSNB model is not only capable of flux-limitation but also preheat prediction while remaining numerically robust and sacrificing little computational speed. Additionally, the results provide strong incentive to further modify key parameters within the SNB theory, namely, the newly introduced non-local mean free path. This research was supported by the Laboratory for Laser Energetics of the University of Rochester.
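For context only: the abstract notes that the iSNB model retains flux limitation while adding preheat prediction. The general idea of flux limitation (shown here as standard background, not the SNB multigroup formulation itself) is to cap the magnitude of the Spitzer-Härm conductive flux at a fraction $f$ of the free-streaming flux:

$$
q \;=\; \min\!\left(\lvert q_{\mathrm{SH}}\rvert,\; f\,q_{\mathrm{fs}}\right),
\qquad
q_{\mathrm{SH}} = -\kappa_{\mathrm{SH}}\,\nabla T_e,
\qquad
q_{\mathrm{fs}} = n_e k_B T_e \sqrt{k_B T_e/m_e},
$$

where $f$ is the flux-limiter coefficient (values of order 0.05-0.15 are commonly quoted for ICF-scale simulations; this range is a general assumption, not a value from the paper). Non-local models such as SNB go beyond this sharp cutoff by solving for the deviation of the electron distribution, which is what also allows preheat ahead of the heat front to be captured.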
NASA Astrophysics Data System (ADS)
Ajani, Penelope; Larsson, Michaela E.; Rubio, Ana; Bush, Stephen; Brett, Steve; Farrell, Hazel
2016-12-01
Dinoflagellates belonging to the toxigenic genus Dinophysis are increasing in abundance in the Hawkesbury River, south-eastern Australia. This study investigates a twelve-year time series of abundance and physico-chemical data to model these blooms. Four species were reported over the sampling campaign - Dinophysis acuminata, Dinophysis caudata, Dinophysis fortii and Dinophysis tripos - with D. acuminata and D. caudata being most abundant. The highest abundance of D. acuminata occurred in the austral spring (max. abundance 4500 cells l-1), whilst the highest abundance of D. caudata occurred in the summer to autumn (max. 12,000 cells l-1). Generalised additive models revealed that the abundance of D. acuminata was significantly linked to season, thermal stratification and nutrients, whilst D. caudata was associated with nutrients, salinity and dissolved oxygen. The models' predictive capability was up to 60% for D. acuminata and 53% for D. caudata. Altering sampling strategies during blooms, accompanied by in situ high resolution monitoring, will further improve Dinophysis bloom prediction capability.
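As an illustrative sketch of fitting a generalised additive model to abundance data of this kind (not the authors' model specification: the predictor names, the log transform, the synthetic data, and the choice of the pygam library are all assumptions):

```python
import numpy as np
from pygam import LinearGAM, s

# Hypothetical predictors: temperature, salinity, nitrate; response is a
# log-transformed cell abundance. Everything here is synthetic.
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([rng.uniform(12, 26, n),     # temperature (deg C)
                     rng.uniform(20, 36, n),     # salinity (PSU)
                     rng.uniform(0.0, 2.0, n)])  # nitrate (umol/L)
y = 2.0 + 0.15 * X[:, 0] - 0.05 * X[:, 1] + rng.normal(0, 0.3, n)

# One smooth term per predictor; partial-dependence plots of the fitted
# smooths are what reveal season/stratification/nutrient effects in practice.
gam = LinearGAM(s(0) + s(1) + s(2)).fit(X, y)
gam.summary()
```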
Leveraging annotation-based modeling with Jump.
Bergmayr, Alexander; Grossniklaus, Michael; Wimmer, Manuel; Kappel, Gerti
2018-01-01
The capability of UML profiles to serve as an annotation mechanism has been recognized in both research and industry. Today's modeling tools offer profiles specific to platforms, such as Java, as they facilitate model-based engineering approaches. However, considering the large number of possible annotations in Java, manually developing the corresponding profiles would only be achievable with huge development and maintenance efforts. Thus, leveraging annotation-based modeling requires an automated approach capable of generating platform-specific profiles from Java libraries. To address this challenge, we present the fully automated transformation chain realized by Jump, thereby continuing existing mapping efforts between Java and UML with an emphasis on annotations and profiles. The evaluation of Jump shows that it scales for large Java libraries and generates profiles of equal or even improved quality compared to profiles currently used in practice. Furthermore, we demonstrate the practical value of Jump by contributing profiles that facilitate reverse engineering and forward engineering processes for the Java platform and by applying it to a modernization scenario.
NASA Technical Reports Server (NTRS)
Roberge, Aki; Rizzo, Maxime J.; Lincowski, Andrew P.; Arney, Giada N.; Stark, Christopher C.; Robinson, Tyler D.; Snyder, Gregory F.; Pueyo, Laurent; Zimmerman, Neil T.; Jansen, Tiffany;
2017-01-01
We present two state-of-the-art models of the solar system, one corresponding to the present day and one to the Archean Eon 3.5 billion years ago. Each model contains spatial and spectral information for the star, the planets, and the interplanetary dust, extending to 50 au from the Sun and covering the wavelength range 0.3-2.5 micron. In addition, we created a spectral image cube representative of the astronomical backgrounds that will be seen behind deep observations of extrasolar planetary systems, including galaxies and Milky Way stars. These models are intended as inputs to high-fidelity simulations of direct observations of exoplanetary systems using telescopes equipped with high-contrast capability. They will help improve the realism of observation and instrument parameters that are required inputs to statistical observatory yield calculations, as well as guide development of post-processing algorithms for telescopes capable of directly imaging Earth-like planets.
Point cloud modeling using the homogeneous transformation for non-cooperative pose estimation
NASA Astrophysics Data System (ADS)
Lim, Tae W.
2015-06-01
A modeling process to simulate point cloud range data that a lidar (light detection and ranging) sensor produces is presented in this paper in order to support the development of non-cooperative pose (relative attitude and position) estimation approaches which will help improve proximity operation capabilities between two adjacent vehicles. The algorithms in the modeling process were based on the homogeneous transformation, which has been employed extensively in robotics and computer graphics, as well as in recently developed pose estimation algorithms. Using a flash lidar in a laboratory testing environment, point cloud data of a test article was simulated and compared against the measured point cloud data. The simulated and measured data sets match closely, validating the modeling process. The modeling capability enables close examination of the characteristics of point cloud images of an object as it undergoes various translational and rotational motions. Relevant characteristics that will be crucial in non-cooperative pose estimation were identified such as shift, shadowing, perspective projection, jagged edges, and differential point cloud density. These characteristics will have to be considered in developing effective non-cooperative pose estimation algorithms. The modeling capability will allow extensive non-cooperative pose estimation performance simulations prior to field testing, saving development cost and providing performance metrics of the pose estimation concepts and algorithms under evaluation. The modeling process also provides "truth" pose of the test objects with respect to the sensor frame so that the pose estimation error can be quantified.
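As a minimal sketch of the homogeneous-transformation step described here (the pose values and the pinhole projection parameters are illustrative, not the paper's flash-lidar sensor model):

```python
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative pose: 30 deg rotation about z, translation along x and z.
theta = np.radians(30.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
T_obj_to_sensor = homogeneous(R, np.array([0.5, 0.0, 5.0]))

# Object-frame points (homogeneous coordinates) transformed into the sensor frame.
pts_obj = np.array([[0.0, 0.0, 0.0, 1.0],
                    [1.0, 0.0, 0.0, 1.0],
                    [0.0, 1.0, 0.0, 1.0]]).T
pts_sensor = T_obj_to_sensor @ pts_obj

# Simple perspective projection onto a virtual focal plane (focal length assumed).
f = 0.01
u = f * pts_sensor[0] / pts_sensor[2]
v = f * pts_sensor[1] / pts_sensor[2]
print(np.round(np.column_stack([u, v]), 5))
```

Sweeping the translation and rotation in such a transform is what exposes the shift, shadowing, perspective-projection, and point-density effects the paper identifies as critical for pose estimation.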
Development of an engineering model atmosphere for Mars
NASA Technical Reports Server (NTRS)
Justus, C. G.
1988-01-01
An engineering model atmosphere for Mars is being developed with many of the same features and capabilities as the highly successful Global Reference Atmospheric Model (GRAM) program for Earth's atmosphere. As an initial approach, the model is being built around the Martian atmosphere model computer subroutine (ATMOS) of Culp and Stewart (1984). In a longer-term program of research, additional refinements and modifications will be included. ATMOS includes parameterizations to simulate the effects of solar activity, seasonal variation, diurnal variation magnitude, dust storm effects, and effects due to the orbital position of Mars. One of the current shortcomings of ATMOS is the neglect of surface variation effects. The longer-term period of research and model building is to address some of these problem areas and provide further improvements in the model (including improved representation of near-surface variations, improved latitude-longitude gradient representation, effects of the large annual variation in surface pressure due to differential condensation/sublimation of the CO2 atmosphere in the polar caps, and effects of Martian atmospheric wave perturbations on the magnitude of the expected density perturbation).
NASA Astrophysics Data System (ADS)
Floyd, I. E.; Downer, C. W.; Brown, G.; Pradhan, N. R.
2017-12-01
The Gridded Surface Subsurface Hydrologic Analysis (GSSHA) model is the US Army Corps of Engineers' (USACE) only fully coupled overland/in-stream sediment transport model. While the overland sediment transport formulation in GSSHA is considered state of the art, the existing in-stream sediment transport formulation is less robust. A major omission in the formulation of the existing GSSHA in-stream model is the lack of in-stream sources of fine materials. In this effort, we enhanced the in-stream sediment transport capacity of GSSHA by linking GSSHA to the SEDLIB sediment transport library. SEDLIB was developed at the Coastal and Hydraulics Laboratory (CHL) under the System Wide Water Resources Program (SWWRP) and the Flood and Coastal (F&C) research program. It is designed to provide a library of sediment flux formulations for hydraulic and hydrologic models, such as GSSHA. This new version of GSSHA, with the updated in-stream sediment transport simulation capability afforded by the linkage to SEDLIB, was tested against observations in an experimental watershed that had previously been used as a test bed for GSSHA. The results show a significant improvement in the ability to model in-stream sources of fine sediment. This improved capability will broaden the applicability of GSSHA to larger watersheds and to watersheds with complex sediment dynamics, such as those subjected to fire hydrology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santi, Peter Angelo; Cutler, Theresa Elizabeth; Favalli, Andrea
In order to improve the accuracy and capabilities of neutron multiplicity counting, additional quantifiable information is needed in order to address the assumptions that are present in the point model. Extracting and utilizing higher order moments (Quads and Pents) from the neutron pulse train represents the most direct way of extracting additional information from the measurement data to allow for an improved determination of the physical properties of the item of interest. The extraction of higher order moments from a neutron pulse train required the development of advanced dead time correction algorithms which could correct for dead time effects in all of the measurement moments in a self-consistent manner. In addition, advanced analysis algorithms have been developed to address specific assumptions that are made within the current analysis model, namely that all neutrons are created at a single point within the item of interest, and that all neutrons that are produced within an item are created with the same energy distribution. This report will discuss the current status of implementation and initial testing of the advanced dead time correction and analysis algorithms that have been developed in an attempt to utilize higher order moments to improve the capabilities of correlated neutron measurement techniques.
Examining Quality Improvement Programs: The Case of Minnesota Hospitals
Olson, John R; Belohlav, James A; Cook, Lori S; Hays, Julie M
2008-01-01
Objective To determine if there is a hierarchy of improvement program adoption by hospitals and outline that hierarchy. Data Sources Primary data were collected in the spring of 2007 via e-survey from 210 individuals representing 109 Minnesota hospitals. Secondary data from 2006 were assembled from the Leapfrog database. Study Design As part of a larger survey, respondents were given a list of improvement programs and asked to identify those programs that are used in their hospital. Data Collection/Data Extraction Rasch Model Analysis was used to assess whether a unidimensional construct exists that defines a hospital's ability to implement performance improvement programs. Linear regression analysis was used to assess the relationship of the Rasch ability scores with Leapfrog Safe Practices Scores to validate the research findings. Principal Findings The results of the study show that hospitals have widely varying abilities in implementing improvement programs. In addition, improvement programs present differing levels of difficulty for hospitals trying to implement them. Our findings also indicate that the ability to adopt improvement programs is important to the overall performance of hospitals. Conclusions There is a hierarchy of improvement programs in the health care context. A hospital's ability to successfully adopt improvement programs is a function of its existing capabilities. As a hospital's capability increases, the ability to successfully implement higher level programs also increases. PMID:18761677
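For reference, the dichotomous Rasch model underlying such an analysis expresses the probability that hospital n successfully implements program i as a function of the hospital's ability θ_n and the program's difficulty δ_i (a standard statement of the model, not a detail taken from the study itself):

$$
P(X_{ni}=1 \mid \theta_n, \delta_i) \;=\; \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}.
$$

Ordering programs by their estimated δ_i is what yields the hierarchy of improvement programs reported in the findings, while the θ_n ability scores are the quantities regressed against the Leapfrog Safe Practices Scores.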
Prediction of Agglomeration, Fouling, and Corrosion Tendency of Fuels in CFB Co-Combustion
NASA Astrophysics Data System (ADS)
Barišć, Vesna; Zabetta, Edgardo Coda; Sarkki, Juha
Prediction of agglomeration, fouling, and corrosion tendency of fuels is essential to the design of any CFB boiler. Over the years, tools have been successfully developed at Foster Wheeler to help with such predictions for most commercial fuels. However, changes in the fuel market and the ever-growing demand for co-combustion capabilities pose a continuous need for development. This paper presents results from recently upgraded models used at Foster Wheeler to predict agglomeration, fouling, and corrosion tendency of a variety of fuels and mixtures. The models, the subject of this paper, are semi-empirical computer tools that combine the theoretical basics of agglomeration/fouling/corrosion phenomena with empirical correlations. Correlations are derived from Foster Wheeler's experience in fluidized beds, including nearly 10,000 fuel samples and over 1,000 tests in about 150 CFB units. In these models, fuels are evaluated based on their classification and their chemical and physical properties from standard analyses (proximate, ultimate, fuel ash composition, etc.) alongside Foster Wheeler's own characterization methods. Mixtures are then evaluated taking into account the component fuels. This paper presents the predictive capabilities of the agglomeration/fouling/corrosion probability models for selected fuels and mixtures fired at full scale. The selected fuels include coals and different types of biomass. The models are capable of predicting the behavior of most fuels and mixtures, but also offer possibilities for further improvement.
Powathil, Gibin G; Swat, Maciej; Chaplain, Mark A J
2015-02-01
The multiscale complexity of cancer as a disease necessitates a corresponding multiscale modelling approach to produce truly predictive mathematical models capable of improving existing treatment protocols. To capture all the dynamics of solid tumour growth and its progression, mathematical modellers need to couple biological processes occurring at various spatial and temporal scales (from genes to tissues). Because the effectiveness of cancer therapy is considerably affected by intracellular and extracellular heterogeneities as well as by dynamical changes in the tissue microenvironment, any model attempt to optimise existing protocols must consider these factors, ultimately leading to improved multimodal treatment regimes. By improving existing and building new mathematical models of cancer, modellers can play an important role in preventing the use of potentially sub-optimal treatment combinations. In this paper, we analyse a multiscale computational mathematical model for cancer growth and spread, incorporating the multiple effects of radiation therapy and chemotherapy on patient survival probability, and implement the model using two different cell-based modelling techniques. We show that the insights provided by such multiscale modelling approaches can ultimately help in designing optimal patient-specific multi-modality treatment protocols that may increase patients' quality of life. Copyright © 2014 Elsevier Ltd. All rights reserved.
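Multiscale radiotherapy models of this kind commonly drive cell kill with the linear-quadratic survival model, shown here as general background (the abstract does not state which dose-response law the authors use):

$$
S(D) \;=\; \exp\!\left[-\left(\alpha D + \beta D^{2}\right)\right],
$$

where D is the delivered dose per fraction and α, β are tissue- and cell-line-specific radiosensitivity parameters; in a multiscale setting the cell-level survival probability S(D) is typically modulated by local microenvironmental factors such as oxygenation before being applied to individual cells in the tissue-scale simulation.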
Advanced Wavefront Control Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivier, S S; Brase, J M; Avicola, K
2001-02-21
Programs at LLNL that involve large laser systems--ranging from the National Ignition Facility to new tactical laser weapons--depend on the maintenance of laser beam quality through precise control of the optical wavefront. This can be accomplished using adaptive optics, which compensate for time-varying aberrations that are often caused by heating in a high-power laser system. Over the past two decades, LLNL has developed a broad capability in adaptive optics technology for both laser beam control and high-resolution imaging. This adaptive optics capability has been based on thin deformable glass mirrors with individual ceramic actuators bonded to the back. In the case of high-power lasers, these adaptive optics systems have successfully improved beam quality. However, as we continue to extend our applications requirements, the existing technology base for wavefront control cannot satisfy them. To address this issue, this project studied improved modeling tools to increase our detailed understanding of the performance of these systems, and evaluated novel approaches to low-order wavefront control that offer the possibility of reduced cost and complexity. We also investigated improved beam control technology for high-resolution wavefront control. Many high-power laser systems suffer from high-spatial-frequency aberrations that require control of hundreds or thousands of phase points to provide adequate correction. However, the cost and size of current deformable mirrors can become prohibitive for applications requiring more than a few tens of phase control points. New phase control technologies are becoming available which offer control of many phase points with small low-cost devices. The goal of this project was to expand our wavefront control capabilities with improved modeling tools, new devices that reduce system cost and complexity, and extensions to high spatial and temporal frequencies using new adaptive optics technologies. In FY 99, the second year of this project, work was performed in four areas: (1) advanced modeling tools for deformable mirrors, (2) low-order wavefront correctors with Alvarez lenses, (3) a direct phase-measuring heterodyne wavefront sensor, and (4) high-spatial-frequency wavefront control using spatial light modulators.
Challenges and Progress in Aerodynamic Design of Hybrid Wingbody Aircraft with Embedded Engines
NASA Technical Reports Server (NTRS)
Liou, Meng-Sing; Kim, Hyoungjin; Liou, May-Fun
2016-01-01
We summarize the contributions to high-fidelity capabilities for analysis and design of hybrid wingbody (HWB) configurations considered by NASA. Specifically, we focus on the embedded propulsion concepts of the N2-B and N3-X configurations, some of the future concepts seriously investigated by the NASA Fixed Wing Project. The objective is to develop the capability to compute the integrated propulsion and airframe system realistically in geometry and accurately in flow physics. In particular, the propulsion system (including the entire engine core-compressor, combustor, and turbine stages) is vastly more difficult and costly to simulate with the same level of fidelity as the external aerodynamics. Hence, we develop an accurate modeling approach that retains important physical parameters relevant to aerodynamic and propulsion analyses for evaluating the HWB concepts. Having the analytical capabilities at our disposal, concerns and issues that were considered to be critical for the HWB concepts can now be assessed reliably and systematically; assumptions invoked by previous studies were found to have serious consequences in our study. During this task, we establish firmly that aerodynamic analysis of a HWB concept without including installation of the propulsion system is far from realistic and can be misleading. Challenges in delivering the often-cited advantages that belong to the HWB are the focus of our study and are emphasized in this report. We have attempted to address these challenges and have had successes, which are summarized here. Some can have broad implications, such as the concept of flow conditioning for reducing flow distortion and the modeling of fan stages. The design optimization capability developed for improving the aerodynamic characteristics of the baseline HWB configurations is general and can be employed for other applications. Further improvement of the N3-X configuration can be expected by expanding the design space. Finally, the support of the System Analysis and Integration Element under the NASA Fixed Wing Project has enabled the development and helped deployment of the capabilities shown in this report.
NASA Astrophysics Data System (ADS)
Neill, Aaron; Reaney, Sim
2015-04-01
Fully-distributed, physically-based rainfall-runoff models attempt to capture some of the complexity of the runoff processes that operate within a catchment, and have been used to address a variety of issues including water quality and the effect of climate change on flood frequency. Two key issues are prevalent, however, which call into question the predictive capability of such models. The first is the issue of parameter equifinality, which can be responsible for large amounts of uncertainty. The second is whether such models make the right predictions for the right reasons - are the processes operating within a catchment correctly represented, or do the predictive abilities of these models result only from the calibration process? The use of additional data sources, such as environmental tracers, has been shown to help address both of these issues, by allowing multi-criteria model calibration to be undertaken, and by permitting a greater understanding of the processes operating in a catchment and hence a more thorough evaluation of how well catchment processes are represented in a model. Using discharge and oxygen-18 data sets, the ability of the fully-distributed, physically-based CRUM3 model to represent the runoff processes in three sub-catchments in Cumbria, NW England has been evaluated. These catchments (Morland, Dacre and Pow) are part of the River Eden demonstration test catchment project. The oxygen-18 data set was first used to derive transit-time distributions and mean residence times of water for each of the catchments to gain an integrated overview of the types of processes that were operating. A generalised likelihood uncertainty estimation (GLUE) procedure was then used to calibrate the CRUM3 model for each catchment based on a single discharge data set from each catchment. Transit-time distributions and mean residence times of water obtained from the model using the top 100 behavioural parameter sets for each catchment were then compared to those derived from the oxygen-18 data to see how well the model captured catchment dynamics. The value of incorporating the oxygen-18 data set, as well as discharge data sets from multiple as opposed to single gauging stations in each catchment, into the calibration process to improve the predictive capability of the model was then investigated. This was achieved by assessing how much the identifiability of the model parameters and the ability of the model to represent the runoff processes operating in each catchment improved with the inclusion of the additional data sets, with respect to the likely costs that would be incurred in obtaining the data sets themselves.
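A minimal sketch of the GLUE procedure referred to above (the toy model, parameter ranges, and behavioural threshold are placeholders, not the CRUM3 setup):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "observed" discharge from a toy two-parameter linear-store model,
# standing in for CRUM3 (placeholders only; the real model is far richer).
rain = rng.gamma(0.3, 5.0, size=365)

def toy_model(k, frac, rain):
    store, q = 0.0, np.zeros_like(rain)
    for t, r in enumerate(rain):
        store += frac * r          # fraction of rainfall entering the store
        q[t] = k * store           # linear outflow
        store -= q[t]
    return q

obs = toy_model(0.2, 0.6, rain) + rng.normal(0, 0.05, size=rain.size)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, used here as the informal GLUE likelihood."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# GLUE: sample parameter sets uniformly, keep those above a behavioural threshold.
n_samples, threshold, behavioural = 5000, 0.7, []
for _ in range(n_samples):
    k, frac = rng.uniform(0.01, 0.9), rng.uniform(0.1, 1.0)
    score = nse(toy_model(k, frac, rain), obs)
    if score > threshold:
        behavioural.append((score, k, frac))

top = sorted(behavioural, reverse=True)[:100]   # best behavioural parameter sets
print(f"{len(behavioural)} behavioural sets; best: {top[0] if top else None}")
```

Multi-criteria calibration of the kind discussed in the abstract simply adds further acceptance criteria (e.g. a tracer-based transit-time measure alongside NSE), which tightens the behavioural set and improves parameter identifiability.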
Recent Progress Towards Predicting Aircraft Ground Handling Performance
NASA Technical Reports Server (NTRS)
Yager, T. J.; White, E. J.
1981-01-01
The significant progress which has been achieved in the development of aircraft ground handling simulation capability is reviewed and additional improvements in software modeling are identified. The problem associated with providing the necessary simulator input data for adequate modeling of aircraft tire/runway friction behavior is discussed, and efforts to improve this complex model, and hence simulator fidelity, are described. Aircraft braking performance data obtained on several wet runway surfaces are compared to ground vehicle friction measurements and, by use of empirically derived methods, good agreement between actual and estimated aircraft braking friction from ground vehicle data is shown. A relatively new friction measuring device, the friction tester, showed great promise in providing data applicable to aircraft friction performance. Additional research efforts to improve methods of predicting tire friction performance are discussed, including use of an instrumented tire test vehicle to expand the tire friction data bank and a study of surface texture measurement techniques.
Airborne Cloud Computing Environment (ACCE)
NASA Technical Reports Server (NTRS)
Hardman, Sean; Freeborn, Dana; Crichton, Dan; Law, Emily; Kay-Im, Liz
2011-01-01
Airborne Cloud Computing Environment (ACCE) is JPL's internal investment to improve the return on airborne missions by improving the development performance of the data system and the return on the captured science data. The investment is to develop a common science data system capability for airborne instruments that encompasses the end-to-end lifecycle, covering planning, provisioning of data system capabilities, and support for scientific analysis, in order to improve the quality, cost effectiveness, and capabilities that enable new scientific discovery and research in earth observation.
Air Quality Modeling Using the NASA GEOS-5 Multispecies Data Assimilation System
NASA Technical Reports Server (NTRS)
Keller, Christoph A.; Pawson, Steven; Wargan, Krzysztof; Weir, Brad
2018-01-01
The NASA Goddard Earth Observing System (GEOS) data assimilation system (DAS) has been expanded to include chemically reactive tropospheric trace gases including ozone (O3), nitrogen dioxide (NO2), and carbon monoxide (CO). This system combines model analyses from the GEOS-5 model with detailed atmospheric chemistry and observations from MLS (O3), OMI (O3 and NO2), and MOPITT (CO). We show results from a variety of assimilation test experiments, highlighting the improvements in the representation of model species concentrations by up to 50% compared to an assimilation-free control experiment. Taking into account the rapid chemical cycling of NO2 when applying the assimilation increments greatly improves assimilation skills for NO2 and provides large benefits for model concentrations near the surface. Analysis of the geospatial distribution of the assimilation increments suggest that the free-running model overestimates biomass burning emissions but underestimates lightning NOx emissions by 5-20%. We discuss the capability of the chemical data assimilation system to improve atmospheric composition forecasts through improved initial value and boundary condition inputs, particularly during air pollution events. We find that the current assimilation system meaningfully improves short-term forecasts (1-3 day). For longer-term forecasts more emphasis on updating the emissions instead of initial concentration fields is needed.
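For orientation, the generic analysis step common to such assimilation systems (not the specific GEOS-5 solver) adds an increment to the model background state x_b:

$$
\mathbf{x}_a \;=\; \mathbf{x}_b + \mathbf{K}\left(\mathbf{y} - H\mathbf{x}_b\right),
\qquad
\mathbf{K} \;=\; \mathbf{B}H^{\mathsf T}\left(H\mathbf{B}H^{\mathsf T} + \mathbf{R}\right)^{-1},
$$

where y holds the MLS/OMI/MOPITT retrievals, H is the observation operator, and B and R are the background- and observation-error covariances. The abstract's point about NO2 is that this increment must be applied consistently with the species' rapid chemical cycling, which is what yields the large improvement near the surface.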
Galerkin CFD solvers for use in a multi-disciplinary suite for modeling advanced flight vehicles
NASA Astrophysics Data System (ADS)
Moffitt, Nicholas J.
This work extends existing Galerkin CFD solvers for use in a multi-disciplinary suite. The suite is proposed as a means of modeling advanced flight vehicles, which exhibit strong coupling between aerodynamics, structural dynamics, controls, rigid body motion, propulsion, and heat transfer. Such applications include aeroelastics, aeroacoustics, stability and control, and other highly coupled applications. The suite uses NASA STARS for modeling structural dynamics and heat transfer. Aerodynamics, propulsion, and rigid body dynamics are modeled in one of the five CFD solvers below. Euler2D and Euler3D are Galerkin CFD solvers created at OSU by Cowan (2003). These solvers are capable of modeling compressible inviscid aerodynamics with modal elastics and rigid body motion. This work reorganized these solvers to improve efficiency during editing and at run time. Simple and efficient propulsion models were added, including rocket, turbojet, and scramjet engines. Viscous terms were added to the previous solvers to create NS2D and NS3D. The viscous contributions were demonstrated in the inertial and non-inertial frames. Variable viscosity (Sutherland's equation) and heat transfer boundary conditions were added to both solvers but not verified in this work. Two turbulence models were implemented in NS2D and NS3D: the Spalart-Allmaras (SA) model of Deck, et al. (2002) and Menter's SST model (1994). A rotation correction term (Shur, et al., 2000) was added to the production of turbulence. Local time stepping and artificial dissipation were adapted to each model. CFDsol is a Taylor-Galerkin solver with an SA turbulence model. This work improved the time accuracy, far field stability, viscous terms, Sutherland's equation, and SA model with NS3D as a guideline and added the propulsion models from Euler3D to CFDsol. Simple geometries were demonstrated to utilize current meshing and processing capabilities. Air-breathing hypersonic flight vehicles (AHFVs) represent the ultimate application of the suite. The current models are accurate at low supersonic speeds and reasonable for engineering approximation at hypersonic speeds. Improvements to extend the models fully into the hypersonic regime are given in the Recommendations section.
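The variable-viscosity addition mentioned here is Sutherland's law; a small sketch with standard air constants follows (the constants are textbook values, not taken from the solvers themselves):

```python
def sutherland_viscosity(T, mu_ref=1.716e-5, T_ref=273.15, S=110.4):
    """Dynamic viscosity of air [Pa*s] from Sutherland's law.
    mu_ref is the reference viscosity at T_ref [K]; S is the Sutherland
    constant [K]. Defaults are commonly used textbook values for air."""
    return mu_ref * (T / T_ref) ** 1.5 * (T_ref + S) / (T + S)

# Example: viscosity at a typical boundary-layer edge temperature.
print(sutherland_viscosity(500.0))   # ~2.67e-5 Pa*s
```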
Radioactive threat detection using scintillant-based detectors
NASA Astrophysics Data System (ADS)
Chalmers, Alex
2004-09-01
An update on the performance of AS&E's Radioactive Threat Detection (RTD) sensor technology is given. A model is presented detailing the components of the scintillant-based RTD system employed in AS&E products aimed at detecting radiological WMD. An overview of recent improvements in the sensors, electrical subsystems and software algorithms is presented. The resulting improvements in performance are described and sample results shown from existing systems. Advanced and future capabilities are described with an assessment of their feasibility and their application to Homeland Defense.
Abstracting data warehousing issues in scientific research.
Tews, Cody; Bracio, Boris R
2002-01-01
This paper presents the design and implementation of the Idaho Biomedical Data Management System (IBDMS). This system preprocesses biomedical data from the IMPROVE (Improving Control of Patient Status in Critical Care) library via an Open Database Connectivity (ODBC) connection. The ODBC connection allows local and remote simulations to access filtered, joined, and sorted data using the Structured Query Language (SQL). The tool is capable of providing an overview of available data in addition to user-defined data subsets for verification of models of the human respiratory system.
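A minimal sketch of the kind of ODBC access described (the DSN, credentials, table, and column names are hypothetical, not the actual IBDMS schema):

```python
import pyodbc

# DSN, credentials, table and column names below are placeholders.
conn = pyodbc.connect("DSN=IBDMS;UID=analyst;PWD=secret")
cursor = conn.cursor()

# A filtered, joined, and sorted subset handed to a local or remote simulation.
cursor.execute(
    """
    SELECT s.patient_id, s.sample_time, s.spo2, v.tidal_volume
    FROM respiratory_signals AS s
    JOIN ventilator_settings AS v ON v.patient_id = s.patient_id
    WHERE s.spo2 < ?
    ORDER BY s.sample_time
    """,
    92,
)
rows = cursor.fetchall()
conn.close()
```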
Top-level modeling of an als system utilizing object-oriented techniques
NASA Astrophysics Data System (ADS)
Rodriguez, L. F.; Kang, S.; Ting, K. C.
The possible configuration of an Advanced Life Support (ALS) System capable of supporting human life for long-term space missions continues to evolve as researchers investigate potential technologies and configurations. To facilitate the decision process, the development of acceptable, flexible, and dynamic mathematical computer modeling tools capable of system-level analysis is desirable. Object-oriented techniques have been adopted to develop a dynamic top-level model of an ALS system. This approach has several advantages; among these, object-oriented abstractions of systems are inherently modular in architecture. Thus, models can initially be somewhat simplistic, while allowing for adjustments and improvements. In addition, by coding the model in Java, the model can be implemented via the World Wide Web, greatly encouraging its utilization. Systems analysis is further enabled by the utilization of a readily available backend database containing information supporting the model. The subsystem models of the ALS system model include Crew, Biomass Production, Waste Processing and Resource Recovery, Food Processing and Nutrition, and the Interconnecting Space. Each subsystem model and an overall model have been developed. Presented here are the procedure utilized to develop the modeling tool, the vision of the modeling tool, and the current focus for each of the subsystem models.
NASA Technical Reports Server (NTRS)
Hall, Callie; Arnone, Robert
2006-01-01
The NASA Applied Sciences Program seeks to transfer NASA data, models, and knowledge into the hands of end-users by forming links with partner agencies and associated decision support tools (DSTs). Through the NASA REASoN (Research, Education and Applications Solutions Network) Cooperative Agreement, the Oceanography Division of the Naval Research Laboratory (NRLSSC) is developing new products through the integration of data from NASA Earth-Sun System assets with coastal ocean forecast models and other available data to enhance coastal management in the Gulf of Mexico. The recipient federal agency for this research effort is the National Oceanic and Atmospheric Administration (NOAA). The contents of this report detail the effort to further the goals of the NASA Applied Sciences Program by demonstrating the use of NASA satellite products combined with data-assimilating ocean models to provide near real-time information to maritime users and coastal managers of the Gulf of Mexico. This effort provides new and improved capabilities for monitoring, assessing, and predicting the coastal environment. Coastal managers can exploit these capabilities through enhanced DSTs at federal, state and local agencies. The project addresses three major issues facing coastal managers: 1) Harmful Algal Blooms (HABs); 2) hypoxia; and 3) freshwater fluxes to the coastal ocean. A suite of ocean products capable of describing Ocean Weather is assembled on a daily basis as the foundation for this semi-operational multiyear effort. This continuous real-time capability brings decision makers a new ability to monitor both normal and anomalous coastal ocean conditions with a steady flow of satellite and ocean model conditions. Furthermore, as the baseline data sets are used more extensively and the customer list increases, customer feedback is obtained and additional customized products are developed and provided to decision makers. Continual customer feedback and responses with new, improved products are required between the researchers and the customers. This document details the methods by which these coastal ocean products are produced, including the data flow, distribution, and verification. Product applications and the degree to which these products are used successfully within NOAA and coordinated with the Mississippi Department of Marine Resources (MDMR) are benchmarked.
Using Micro-Synchrophasor Data for Advanced Distribution Grid Planning and Operations Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Emma; Kiliccote, Sila; McParland, Charles
2014-07-01
This report reviews the potential for distribution-grid phase-angle data that will be available from new micro-synchrophasors (µPMUs) to be utilized in existing distribution-grid planning and operations analysis. These data could augment the current diagnostic capabilities of grid analysis software, used in both planning and operations for applications such as fault location, and provide data for more accurate modeling of the distribution system. µPMUs are new distribution-grid sensors that will advance measurement and diagnostic capabilities and provide improved visibility of the distribution grid, enabling analysis of the grid's increasingly complex loads that include features such as large volumes of distributed generation. Large volumes of DG lead to concerns about continued reliable operation of the grid, due to changing power flow characteristics and active generation, with its own protection and control capabilities. Using µPMU data on the change in voltage phase angle between two points in conjunction with new and existing distribution-grid planning and operational tools is expected to enable model validation, state estimation, fault location, and renewable resource/load characterization. Our findings include: data measurement is outstripping the processing capabilities of planning and operational tools; not every tool can visualize a voltage phase-angle measurement to the degree of accuracy measured by advanced sensors, and the degree of accuracy in measurement required for the distribution grid is not defined; solving methods cannot handle the high volumes of data generated by modern sensors, so new models and solving methods (such as graph trace analysis) are needed; standardization of sensor-data communications platforms in planning and applications tools would allow integration of different vendors' sensors and advanced measurement devices. In addition, data from advanced sources such as µPMUs could be used to validate models to improve/ensure accuracy, providing information on normally estimated values such as underground conductor impedance, and characterization of complex loads. Although the input of high-fidelity data to existing tools will be challenging, µPMU data on phase angle (as well as other data from advanced sensors) will be useful for basic operational decisions that are based on a trend of changing data.
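The reason the voltage phase-angle difference is so informative is the classical power-angle relation for a predominantly inductive line between two buses; shown here as background rather than as part of the report's analysis:

$$
P_{12} \;\approx\; \frac{V_1 V_2}{X}\,\sin(\delta_1 - \delta_2),
$$

so even the small fractions of a degree resolvable by a µPMU translate into measurable changes in real-power flow, which is what makes angle data useful for model validation, state estimation, and fault location. On distribution circuits the line resistance is not negligible relative to X, so the relation is only approximate there, which is one reason high angular accuracy matters.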
Berridge, Brian R; Schultze, A Eric; Heyen, Jon R; Searfoss, George H; Sarazan, R Dustan
2016-12-01
Cardiovascular (CV) safety liabilities are significant concerns for drug developers and preclinical animal studies are predominately where those liabilities are characterized before patient exposures. Steady progress in technology and laboratory capabilities is enabling a more refined and informative use of animals in those studies. The application of surgically implantable and telemetered instrumentation in the acute assessment of drug effects on CV function has significantly improved historical approaches that involved anesthetized or restrained animals. More chronically instrumented animals and application of common clinical imaging assessments like echocardiography and MRI extend functional and in-life structural assessments into the repeat-dose setting. A growing portfolio of circulating CV biomarkers is allowing longitudinal and repeated measures of cardiac and vascular injury and dysfunction better informing an understanding of temporal pathogenesis and allowing earlier detection of undesirable effects. In vitro modeling systems of the past were limited by their lack of biological relevance to the in vivo human condition. Advances in stem cell technology and more complex in vitro modeling platforms are quickly creating more opportunity to supplant animals in our earliest assessments for liabilities. Continuing improvement in our capabilities in both animal and nonanimal modeling should support a steady decrease in animal use for primary liability identification and optimize the translational relevance of the animal studies we continue to do. © The Author 2016. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Wang, Chongjian; Wei, Sheng; Xiang, Hao; Wu, Jing; Xu, Yihua; Liu, Li; Nie, Shaofa
2008-10-30
Since the 9/11 attack and severe acute respiratory syndrome (SARS), the development of qualified and able public health leaders has become a new urgency in building the infrastructure needed to address public health emergencies. Although previous studies have reported that the training of individual leaders is an important approach, systemic and scientific training models need further improvement and development. The purpose of this study was to develop, deliver, and evaluate a participatory leadership training program for emergency response. Forty-one public health leaders (N = 41) from five provinces completed the entire emergency preparedness training program in China. The program was evaluated by anonymous questionnaires and semi-structured interviews held prior to training, immediately post-training and 12 months after training (follow-up). The emergency preparedness training resulted in positive shifts in knowledge and self-assessed skills for public health leaders. More than ninety-five percent of participants reported that the training model was scientific and feasible. Moreover, the response of participants in the program to the avian influenza outbreak, as well as the planned evaluations for this leadership training program, further demonstrated both the successful approaches and methods and the positive impact of this integrated leadership training initiative. The emergency preparedness training program met its aims and objectives satisfactorily, and improved the emergency capability of public health leaders. This suggests that the leadership training model was effective and feasible in improving emergency preparedness capability.
A spatiotemporal data model for incorporating time in geographic information systems (GEN-STGIS)
NASA Astrophysics Data System (ADS)
Narciso, Flor Eugenia
Temporal Geographic Information Systems (TGIS) is a new technology, which is being developed to work with Geographic Information Systems (GIS) that deal with geographic phenomena that change over time. The capabilities of TGIS depend on the underlying data model. However, a literature review of current spatiotemporal GIS data models has shown that they are not adequate for managing time when representing temporal data. In addition, the majority of these data models have been designed to support the requirements of specific-purpose applications. In an effort to resolve this problem, the related literature has been explored. A comparative investigation of the current spatiotemporal GIS data models has been made to identify their characteristics, advantages and disadvantages, similarities and differences, and to determine why they do not work adequately. A new object-oriented General-purpose Spatiotemporal GIS (GEN-STGIS) data model is proposed here. This model provides better representation, storage and management of data related to geographic phenomena that change over time and overcomes some of the problems detected in the reviewed data models. The proposed data model has four key benefits. First, it provides the capabilities of a standard vector-based GIS embedded in the 2-D Euclidean space. Second, it includes the two temporal dimensions, valid time and transaction time, supported by temporal databases. Third, it inherits, from the object-oriented approach, the flexibility, modularity and ability to handle the complexities introduced by spatial and temporal dimensions. Fourth, it improves the geographic query capabilities of current TGIS with the introduction of the concept of bounding box while providing temporal and spatiotemporal query capabilities. The data model is then evaluated in order to assess its strengths and weaknesses as a spatiotemporal GIS data model, and to determine how well the model satisfies the requirements imposed by TGIS applications. The practicality of the data model is demonstrated by the creation of a TGIS example and the partial implementation of the model using the POET Java software for developing the object-oriented database.
An improved gravity model for Mars: Goddard Mars Model-1 (GMM-1)
NASA Technical Reports Server (NTRS)
Smith, D. E.; Lerch, F. J.; Nerem, R. S.; Zuber, M. T.; Patel, G. B.; Fricke, S. K.; Lemoine, F. G.
1993-01-01
Doppler tracking data of three orbiting spacecraft have been reanalyzed to develop a new gravitational field model for the planet Mars, GMM-1 (Goddard Mars Model-1). This model employs nearly all available data, consisting of approximately 1100 days of S-band tracking data collected by NASA's Deep Space Network from the Mariner 9, Viking 1 and Viking 2 spacecraft, in seven different orbits, between 1971 and 1979. GMM-1 is complete to spherical harmonic degree and order 50, which corresponds to a half-wavelength spatial resolution of 200-300 km where the data permit. GMM-1 represents satellite orbits with considerably better accuracy than previous Mars gravity models and shows greater resolution of identifiable geological structures. The notable improvement in GMM-1 over previous models is a consequence of several factors: improved computational capabilities, the use of optimum weighting and least-squares collocation solution techniques which stabilized the behavior of the solution at high degree and order, and the use of longer satellite arcs than employed in previous solutions that were made possible by improved force and measurement models. The inclusion of X-band tracking data from the 379-km altitude, near-polar orbiting Mars Observer spacecraft should provide a significant improvement over GMM-1, particularly at high latitudes where current data poorly resolve the gravitational signature of the planet.
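For context, the quoted resolution can be checked against the standard rule of thumb relating maximum spherical harmonic degree to half-wavelength spatial scale; taking a mean Mars radius of roughly 3390 km (an assumed value, not stated in the abstract):

```latex
\lambda_{1/2} \;\approx\; \frac{\pi R_{\mathrm{Mars}}}{l_{\max}}
\;\approx\; \frac{\pi \times 3390\ \mathrm{km}}{50}
\;\approx\; 213\ \mathrm{km},
```

which is consistent with the 200-300 km figure quoted where the data permit.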
Microburst vertical wind estimation from horizontal wind measurements
NASA Technical Reports Server (NTRS)
Vicroy, Dan D.
1994-01-01
The vertical wind or downdraft component of a microburst-generated wind shear can significantly degrade airplane performance. Doppler radar and lidar are two sensor technologies being tested to provide flight crews with early warning of the presence of hazardous wind shear. An inherent limitation of Doppler-based sensors is the inability to measure velocities perpendicular to the line of sight, which results in an underestimate of the total wind shear hazard. One solution to the line-of-sight limitation is to use a vertical wind model to estimate the vertical component from the horizontal wind measurement. The objective of this study was to assess the ability of simple vertical wind models to improve the hazard prediction capability of an airborne Doppler sensor in a realistic microburst environment. Both simulation and flight test measurements were used to test the vertical wind models. The results indicate that in the altitude region of interest (at or below 300 m), the simple vertical wind models improved the hazard estimate. The radar simulation study showed that the magnitude of the performance improvement was altitude dependent. The altitude of maximum performance improvement occurred at about 300 m.
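The abstract does not give the form of the simple vertical wind models tested; purely as an illustration, one common way to estimate a downdraft from horizontal wind measurements is through mass continuity, integrating the horizontal divergence upward from the ground (an assumption here, not necessarily the models used in the study):

```latex
w(z) \;\approx\; -\int_{0}^{z} \left( \frac{\partial u}{\partial x} + \frac{\partial v}{\partial y} \right) \mathrm{d}z',
\qquad w(0) = 0,
```

with the radial shear along the Doppler line of sight serving as a proxy for the horizontal divergence.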
New single-aircraft integrated atmospheric observation capabilities
NASA Astrophysics Data System (ADS)
Wang, Z.
2011-12-01
Improving current weather and climate model capabilities requires better understandings of many atmospheric processes. Thus, advancing atmospheric observation capabilities has been regarded as the highest imperatives to advance the atmospheric science in the 21st century. Under the NSF CAREER support, we focus on developing new airborne observation capabilities through the developments of new instrumentations and the single-aircraft integration of multiple remote sensors with in situ probes. Two compact Wyoming cloud lidars were built to work together with a 183 GHz microwave radiometer, a multi-beam Wyoming cloud radar and in situ probes for cloud studies. The synergy of these remote sensor measurements allows us to better resolve the vertical structure of cloud microphysical properties and cloud scale dynamics. Together with detailed in situ data for aerosol, cloud, water vapor and dynamics, we developed the most advanced observational capability to study cloud-scale properties and processes from a single aircraft (Fig. 1). A compact Raman lidar was also built to work together with in situ sampling to characterize boundary layer aerosol and water vapor distributions for many important atmospheric processes studies, such as, air-sea interaction and convective initialization. Case studies will be presented to illustrate these new observation capabilities.
An improved water-filled impedance tube.
Wilson, Preston S; Roy, Ronald A; Carey, William M
2003-06-01
A water-filled impedance tube capable of improved measurement accuracy and precision is reported. The measurement instrument employs a variation of the standardized two-sensor transfer function technique. Performance improvements were achieved through minimization of elastic waveguide effects and through the use of sound-hard wall-mounted acoustic pressure sensors. Acoustic propagation inside the water-filled impedance tube was found to be well described by a plane wave model, which is a necessary condition for the technique. Measurements of the impedance of a pressure-release terminated transmission line, and the reflection coefficient from a water/air interface, were used to verify the system.
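As background on the referenced technique: the two-sensor transfer-function method recovers the normal-incidence reflection coefficient of the termination from the measured transfer function H_12 between the two pressure sensors. A standard form of the relation (given here as general background, not necessarily the exact expression used in the paper) is

```latex
R \;=\; \frac{H_{12} - e^{-jks}}{e^{jks} - H_{12}}\; e^{\,2jk(l+s)},
\qquad
Z \;=\; \rho c\,\frac{1+R}{1-R},
```

where k is the wavenumber in the water, s the sensor spacing, and l the distance from the termination to the nearer sensor.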
Modeling occupants in far-side impacts.
Douglas, Clay; Fildes, Brian; Gibson, Tom
2011-10-01
Far-side impacts are not part of any regulated NCAP, FMVSS, or similar test regime despite accounting for 43 percent of the seriously injured persons and 30 percent of the harm in U.S. side impact crashes. Furthermore, injuries to the head and thorax account for over half of the serious injuries sustained by occupants in far-side crashes. Despite this, there is no regulated or well-accepted anthropomorphic test device (ATD) or computer model available to investigate far-side impacts. As such, this presents an opportunity to assess a computer model that can be used to measure the effect of varying restraint parameters on occupant biomechanics in far-side impacts. This study sets out to demonstrate the modified TASS human facet model's (MOTHMO) capabilities in modeling whole-body response in far-side impacts. MOTHMO's dynamic response was compared to that of postmortem human subjects (PMHS), WorldSID, and Thor-NT in a series of far-side sled tests. The advantages, disadvantages, and differences of using MOTHMO compared to ATDs were highlighted and described in terms of model design and instrumentation. Potential applications and improvements for MOTHMO were also recommended. The results showed that MOTHMO is capable of replicating the seat belt-to-shoulder complex interaction, pelvis impacts, head displacement, neck and shoulder belt loading from inboard mounted belts, and impacts from multiple directions. Overall, the model performed better than Thor-NT and at least as well as WorldSID when compared to PMHS results. Though WorldSID and Thor-NT ATDs were capable of reproducing many of these impact loads, measuring the seat belt-to-shoulder complex interaction and thoracic deflection at multiple sites and directions was less accurately handled. This study demonstrated that MOTHMO is capable of modeling whole-body response in far-side impacts. Furthermore, MOTHMO can be used as a virtual design tool to explore the effect of varying restraint parameters on occupant kinematics in far-side crash configurations.
Public health surveillance and infectious disease detection.
Morse, Stephen S
2012-03-01
Emerging infectious diseases, such as HIV/AIDS, SARS, and pandemic influenza, and the anthrax attacks of 2001, have demonstrated that we remain vulnerable to health threats caused by infectious diseases. The importance of strengthening global public health surveillance to provide early warning has been the primary recommendation of expert groups for at least the past 2 decades. However, despite improvements in the past decade, public health surveillance capabilities remain limited and fragmented, with uneven global coverage. Recent initiatives provide hope of addressing this issue, and new technological and conceptual advances could, for the first time, place capability for global surveillance within reach. Such advances include the revised International Health Regulations (IHR 2005) and the use of new data sources and methods to improve global coverage, sensitivity, and timeliness, which show promise for providing capabilities to extend and complement the existing infrastructure. One example is syndromic surveillance, using nontraditional and often automated data sources. Over the past 20 years, other initiatives, including ProMED-mail, GPHIN, and HealthMap, have demonstrated new mechanisms for acquiring surveillance data. In 2009 the U.S. Agency for International Development (USAID) began the Emerging Pandemic Threats (EPT) program, which includes the PREDICT project, to build global capacity for surveillance of novel infections that have pandemic potential (originating in wildlife and at the animal-human interface) and to develop a framework for risk assessment. Improved understanding of factors driving infectious disease emergence and new technological capabilities in modeling, diagnostics and pathogen identification, and communications, such as using the increasing global coverage of cellphones for public health surveillance, can further enhance global surveillance.
Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Danon, Yaron; Nazarewicz, Witold; Talou, Patrick
2013-02-18
This project addresses three important gaps in existing evaluated nuclear data libraries that represent a significant hindrance against highly advanced modeling and simulation capabilities for the Advanced Fuel Cycle Initiative (AFCI). This project will: develop advanced theoretical tools to compute prompt fission neutron and gamma-ray characteristics well beyond average spectra and multiplicity, and produce new evaluated files of U and Pu isotopes, along with some minor actinides; perform state-of-the-art fission cross-section modeling and calculations using global and microscopic model input parameters, leading to truly predictive fission cross-section capabilities, with consistent calculations performed for a suite of Pu isotopes; and implement innovative data assimilation tools, which will reflect the nuclear data evaluation process much more accurately, and lead to a new generation of uncertainty quantification files. New covariance matrices will be obtained for Pu isotopes and compared to existing ones. The deployment of a fleet of safe and efficient advanced reactors that minimize radiotoxic waste and are proliferation-resistant is a clear and ambitious goal of AFCI. While in the past the design, construction and operation of a reactor were supported through empirical trials, this new phase in nuclear energy production is expected to rely heavily on advanced modeling and simulation capabilities. To be truly successful, a program for advanced simulations of innovative reactors will have to develop advanced multi-physics capabilities, to be run on massively parallel supercomputers, and to incorporate adequate and precise underlying physics. All these areas have to be developed simultaneously to achieve those ambitious goals. Of particular interest are reliable fission cross-section uncertainty estimates (including important correlations) and evaluations of prompt fission neutron and gamma-ray spectra and uncertainties.
Decompression models: review, relevance and validation capabilities.
Hugon, J
2014-01-01
For more than a century, several types of mathematical models have been proposed to describe tissue desaturation mechanisms in order to limit decompression sickness. These models are statistically assessed by DCS cases, and, over time, have gradually included bubble formation biophysics. This paper proposes to review this evolution and discuss its limitations. This review is organized around the comparison of decompression model biophysical criteria and theoretical foundations. Then, the DCS-predictive capability was analyzed to assess whether it could be improved by combining different approaches. Most of the operational decompression models have a neo-Haldanian form. Nevertheless, bubble modeling has been gaining popularity, and the circulating bubble amount has become a major output. By merging both views, it seems possible to build a relevant global decompression model that intends to simulate bubble production while predicting DCS risks for all types of exposures and decompression profiles. A statistical approach combining both DCS and bubble detection databases has to be developed to calibrate a global decompression model. Doppler ultrasound and DCS data are essential: i. to make correlation and validation phases reliable; ii. to adjust biophysical criteria to fit at best the observed bubble kinetics; and iii. to build a relevant risk function.
NASA Astrophysics Data System (ADS)
Paxton, Bill; Schwab, Josiah; Bauer, Evan B.; Bildsten, Lars; Blinnikov, Sergei; Duffell, Paul; Farmer, R.; Goldberg, Jared A.; Marchant, Pablo; Sorokina, Elena; Thoul, Anne; Townsend, Richard H. D.; Timmes, F. X.
2018-02-01
We update the capabilities of the software instrument Modules for Experiments in Stellar Astrophysics (MESA) and enhance its ease of use and availability. Our new approach to locating convective boundaries is consistent with the physics of convection, and yields reliable values of the convective-core mass during both hydrogen- and helium-burning phases. Stars with M< 8 M⊙ become white dwarfs and cool to the point where the electrons are degenerate and the ions are strongly coupled, a realm now available to study with MESA due to improved treatments of element diffusion, latent heat release, and blending of equations of state. Studies of the final fates of massive stars are extended in MESA by our addition of an approximate Riemann solver that captures shocks and conserves energy to high accuracy during dynamic epochs. We also introduce a 1D capability for modeling the effects of Rayleigh-Taylor instabilities that, in combination with the coupling to a public version of the STELLA radiation transfer instrument, creates new avenues for exploring Type II supernova properties. These capabilities are exhibited with exploratory models of pair-instability supernovae, pulsational pair-instability supernovae, and the formation of stellar-mass black holes. The applicability of MESA is now widened by the capability to import multidimensional hydrodynamic models into MESA. We close by introducing software modules for handling floating point exceptions and stellar model optimization, as well as four new software tools - MESA-Web, MESA-Docker, pyMESA, and mesastar.org - to enhance MESA's education and research impact.
NASA Technical Reports Server (NTRS)
1971-01-01
The analytical models developed for the Space Propulsion Automated Synthesis Modeling (SPASM) program are presented. Weight scaling laws developed during this study are incorporated into the program's scaling data bank. A detail listing, logic diagram and input/output formats are supplied for the SPASM program. Two test examples for one to four-stage vehicles performing different types of missions are shown to demonstrate the program's capability and versatility.
An Improved Maintenance Model for the Simulation of Strategic Airlift Capability.
1982-03-01
developed using SLAM as the primary simulation language. Maintenance manning is modeled at the Air Force Specialty Code level, to allow the possibility of...Atlantic Treaty Organization (NATO) allies is one of our primary national objectives, but recent increases in Soviet ground and air forces (Ref 5:100) have...arrive from the United States. Consequently, the primary objective of the United States Air Force mobility program is to be able, by 1982, to double the
Improving the Analysis Capabilities of the Synthetic Theater Operations Research Model (STORM)
2014-09-01
course of action CSG carrier strike group DMSO defense modeling and simulation DOD Department of Defense DOE design of experiments ESG...development of an overall objective or end-state; a ways ( courses of action); and a means (available resources). STORM is a campaign analysis tool that...refers to the courses of action (COA) that are carefully planned out in advance by individuals relevant to a specific campaign (such as N81). For
Report on results of current and future metal casting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Unal, Cetin; Carlson, Neil N.
2015-09-28
New modeling capabilities needed to simulate the casting of metallic fuels have been added to the Truchas code. In this report we summarize improvements made in FY2015 in three areas: (1) analysis of new casting experiments conducted with the BCS and EFL designs; (2) simulation of INL’s U-Zr casting experiments with the Flow3D computer program; and (3) implementation of a surface tension model in Truchas for the unstructured meshes required to run U-Zr casting simulations.
Davies, Julie; Sampson, Mark; Beesley, Frank; Smith, Debra; Baldwin, Victoria
2014-05-01
5 Boroughs Partnership NHS Foundation Trust, in the Northwest of England, has trained over 500 staff in the Knowledge and Understanding Framework, level 1 personality disorder awareness training. This is a 3-day nationally devised training programme delivered via an innovative co-production model (i.e. co-delivery and partnership working with service users who have lived experience). This paper provides quantitative and qualitative information on the effectiveness of training delivery and also serves to provide some insight into the impact of service-user involvement via such a co-production model. Information on 162 participants using the Knowledge and Understanding Framework bespoke questionnaire (Personality Disorder Knowledge, Attitudes and Skills Questionnaire) suggests that the training can be effectively delivered by and within a local NHS Mental Health Trust. Results immediately post-training suggest an improvement in levels of understanding and capability efficacy and a reduction in negative emotional reactions. Indications from a 3-month follow-up suggest that while understanding and emotional reaction remain improved, capability efficacy regresses back to pre-training levels, suggesting the need for ongoing supervision and/or support to consolidate skills. Discussion includes guidelines for the implementation of a truly integrated co-production model of training provision, as well as advice relating to the maximization of long-term benefits. Copyright © 2014 John Wiley & Sons, Ltd.
Master Middle Ware: A Tool to Integrate Water Resources and Fish Population Dynamics Models
NASA Astrophysics Data System (ADS)
Yi, S.; Sandoval Solis, S.; Thompson, L. C.; Kilduff, D. P.
2017-12-01
Linking models that investigate separate components of ecosystem processes has the potential to unify messages regarding management decisions by evaluating potential trade-offs in a cohesive framework. This project aimed to improve the ability of riparian resource managers to forecast future water availability conditions and resultant fish habitat suitability, in order to better inform their management decisions. To accomplish this goal, we developed a middleware tool that is capable of linking and overseeing the operations of two existing models, a water resource planning tool Water Evaluation and Planning (WEAP) model and a habitat-based fish population dynamics model (WEAPhish). First, we designed the Master Middle Ware (MMW) software in Visual Basic for Applications® in one Excel® file that provided a familiar framework for both data input and output. Second, MMW was used to link and jointly operate WEAP and WEAPhish, using Visual Basic for Applications (VBA) macros to implement system-level calls to run the models. To demonstrate the utility of this approach, hydrological, biological, and middleware model components were developed for the Butte Creek basin. This tributary of the Sacramento River, California is managed for both hydropower and the persistence of a threatened population of spring-run Chinook salmon (Oncorhynchus tschawytscha). While we have demonstrated the use of MMW for a particular watershed and fish population, MMW can be customized for use with different rivers and fish populations, assuming basic data requirements are met. This model integration improves on ad hoc linkages for managing data transfer between software programs by providing a consistent, user-friendly, and familiar interface across different model implementations. Furthermore, the data-viewing capabilities of MMW facilitate the rapid interpretation of model results by hydrologists, fisheries biologists, and resource managers, in order to accelerate learning and management decision making.
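The abstract describes MMW as a thin layer that launches the two models through system-level calls and shuttles data between them. A minimal Python sketch of that pattern is shown below; the executables, file names, and CSV hand-off format are hypothetical (the actual tool is implemented in Excel/VBA):

```python
import csv
import subprocess
from pathlib import Path

def run_model(executable, workdir):
    """Launch an external model via a system-level call and wait for completion."""
    subprocess.run([executable], cwd=workdir, check=True)

def read_flows(csv_path):
    """Read the flow series written by the water-resources model (hypothetical format)."""
    with open(csv_path, newline="") as f:
        return [float(row["flow_cms"]) for row in csv.DictReader(f)]

def write_flows(csv_path, flows):
    """Hand the flow series to the fish population model as its input file."""
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestep", "flow_cms"])
        writer.writerows(enumerate(flows))

if __name__ == "__main__":
    base = Path("butte_creek_run")                    # hypothetical working directory
    run_model("weap_cli.exe", base / "weap")          # hypothetical WEAP launcher
    flows = read_flows(base / "weap" / "flows.csv")   # hypothetical WEAP output file
    write_flows(base / "fish" / "flows_in.csv", flows)
    run_model("weaphish.exe", base / "fish")          # hypothetical WEAPhish launcher
```

The design choice mirrors the one described: the middleware owns the run order and the data hand-off, so either model can be swapped out without changing the other.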
Rapid performance modeling and parameter regression of geodynamic models
NASA Astrophysics Data System (ADS)
Brown, J.; Duplyakin, D.
2016-12-01
Geodynamic models run in a parallel environment have many parameters with complicated effects on performance and scientifically-relevant functionals. Manually choosing an efficient machine configuration and mapping out the parameter space requires a great deal of expert knowledge and time-consuming experiments. We propose an active learning technique based on Gaussian Process Regression to automatically select experiments to map out the performance landscape with respect to scientific and machine parameters. The resulting performance model is then used to select optimal experiments for improving the accuracy of a reduced order model per unit of computational cost. We present the framework and evaluate its quality and capability using popular lithospheric dynamics models.
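A minimal sketch of the described loop, assuming scikit-learn; the candidate parameters (e.g. core count, mesh resolution, solver tolerance) and the stand-in run-time function are illustrative assumptions, not the authors' actual setup:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def run_experiment(params):
    # Placeholder for launching the geodynamic model and timing the run (assumption).
    return float(np.sum(params ** 2))

rng = np.random.default_rng(0)
candidates = rng.uniform(0.0, 1.0, size=(200, 3))  # scaled (cores, mesh size, tolerance)
X = candidates[:5].copy()                          # a few initial experiments
y = np.array([run_experiment(p) for p in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(20):                                # active-learning loop
    gp.fit(X, y)
    mean, std = gp.predict(candidates, return_std=True)
    nxt = candidates[np.argmax(std)]               # run where the model is least certain
    X = np.vstack([X, nxt])
    y = np.append(y, run_experiment(nxt))
```

Selecting the most uncertain candidate is only one possible acquisition rule; a cost-aware criterion, as the abstract implies, would weight the expected accuracy gain by the predicted run cost.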
An improved numerical model for wave rotor design and analysis
NASA Technical Reports Server (NTRS)
Paxson, Daniel E.; Wilson, Jack
1993-01-01
A numerical model has been developed which can predict both the unsteady flows within a wave rotor and the steady averaged flows in the ports. The model is based on the assumptions of one-dimensional, unsteady, and perfect gas flow. Besides the dominant wave behavior, it is also capable of predicting the effects of finite tube opening time, leakage from the tube ends, and viscosity. The relative simplicity of the model makes it useful for design, optimization, and analysis of wave rotor cycles for any application. This paper discusses some details of the model and presents comparisons between the model and two laboratory wave rotor experiments.
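For context, a model built on those assumptions solves the one-dimensional unsteady Euler equations for a perfect gas, with source terms standing in for the additional modeled effects (finite passage opening time, end leakage, viscous losses); schematically (the specific source terms used in the paper are not reproduced here):

```latex
\frac{\partial}{\partial t}
\begin{pmatrix} \rho \\ \rho u \\ \rho e \end{pmatrix}
+
\frac{\partial}{\partial x}
\begin{pmatrix} \rho u \\ \rho u^{2} + p \\ (\rho e + p)\,u \end{pmatrix}
=
\mathbf{S}(x,t),
\qquad
p = (\gamma - 1)\left(\rho e - \tfrac{1}{2}\rho u^{2}\right).
```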
An in-depth review of photovoltaic system performance models
NASA Technical Reports Server (NTRS)
Smith, J. H.; Reiter, L. R.
1984-01-01
The features, strong points and shortcomings of 10 numerical models commonly applied to assessing photovoltaic performance are discussed. The models range in capabilities from first-order approximations to full circuit level descriptions. Account is taken, at times, of the cell and module characteristics, the orientation and geometry, array-level factors, the power-conditioning equipment, the overall plant performance, O and M effects, and site-specific factors. Areas of improvement and/or necessary extensions are identified for several of the models. Although the simplicity of a model was found not necessarily to affect the accuracy of the data generated, the use of any one model was dependent on the application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leung, Ruby
2017-05-01
Internationally recognized climate scientist Ruby Leung is a cloud gazer. But rather than looking for shapes, Ruby’s life’s calling is to develop regional atmospheric models to better predict and understand the effects of global climate change at scales relevant to humans and the environment. Ruby’s accomplishments include developing novel methods for modeling mountain clouds and precipitation in climate models, and improving understanding of hydroclimate variability and change. She also has led efforts to develop regional climate modeling capabilities in the Weather Research and Forecasting model that is widely adopted by scientists worldwide. Ruby is part of a team of PNNL researchers studying the impacts of global warming.
Modeling air concentration over macro roughness conditions by Artificial Intelligence techniques
NASA Astrophysics Data System (ADS)
Roshni, T.; Pagliara, S.
2018-05-01
Aeration in rivers is improved by the turbulence created in flow over macro and intermediate roughness conditions. Macro and intermediate roughness flow conditions are generated by flows over block ramps or rock chutes. The measurements are taken in the uniform flow region. Applications of soft computing methods to modeling hydraulic parameters are not yet common. In this study, the modeling efficiencies of the MPMR and FFNN models are evaluated for estimating the air concentration over block ramps under macro roughness conditions. The experimental data are used for the training and testing phases. The potential capability of the MPMR and FFNN models in estimating air concentration is demonstrated through this study.
Banta, Edward R.; Poeter, Eileen P.; Doherty, John E.; Hill, Mary C.
2006-01-01
The Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER API) improves the computer programming resources available to those developing applications (computer programs) for model analysis. The JUPITER API consists of eleven Fortran-90 modules that provide for encapsulation of data and operations on that data. Each module contains one or more entities: data, data types, subroutines, functions, and generic interfaces. The modules do not constitute computer programs themselves; instead, they are used to construct computer programs. Such computer programs are called applications of the API. The API provides common modeling operations for use by a variety of computer applications. The models being analyzed are referred to here as process models, and may, for example, represent the physics, chemistry, and(or) biology of a field or laboratory system. Process models commonly are constructed using published models such as MODFLOW (Harbaugh et al., 2000; Harbaugh, 2005), MT3DMS (Zheng and Wang, 1996), HSPF (Bicknell et al., 1997), PRMS (Leavesley and Stannard, 1995), and many others. The process model may be accessed by a JUPITER API application as an external program, or it may be implemented as a subroutine within a JUPITER API application. In either case, execution of the model takes place in a framework designed by the application programmer. This framework can be designed to take advantage of any parallel processing capabilities possessed by the process model, as well as the parallel-processing capabilities of the JUPITER API. Model analyses for which the JUPITER API could be useful include, for example: comparing model results to observed values to determine how well the model reproduces system processes and characteristics; using sensitivity analysis to determine the information provided by observations to parameters and predictions of interest; determining the additional data needed to improve selected model predictions; using calibration methods to modify parameter values and other aspects of the model; comparing predictions to regulatory limits; quantifying the uncertainty of predictions based on the results of one or many simulations using inferential or Monte Carlo methods; and determining how to manage the system to achieve stated objectives. The capabilities provided by the JUPITER API include, for example, communication with process models, parallel computations, compressed storage of matrices, and flexible input capabilities. The input capabilities use input blocks suitable for lists or arrays of data. The input blocks needed for one application can be included within one data file or distributed among many files. Data exchange between different JUPITER API applications or between applications and other programs is supported by data-exchange files. The JUPITER API has already been used to construct a number of applications. Three simple example applications are presented in this report. More complicated applications include the universal inverse code UCODE_2005 (Poeter et al., 2005), the multi-model analysis MMA (Eileen P. Poeter, Mary C. Hill, E.R. Banta, S.W. Mehl, and Steen Christensen, written commun., 2006), and a code named OPR_PPR (Matthew J. Tonkin, Claire R. Tiedeman, Mary C. Hill, and D. Matthew Ely, written commun., 2006). This report describes a set of underlying organizational concepts and complete specifics about the JUPITER API.
While understanding the organizational concept presented is useful to understanding the modules, other organizational concepts can be used in applications constructed using the JUPITER API.
The Finite Strain Johnson Cook Plasticity and Damage Constitutive Model in ALEGRA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanchez, Jason James
A finite strain formulation of the Johnson Cook plasticity and damage model and its numerical implementation into the ALEGRA code is presented. The goal of this work is to improve the predictive material failure capability of the Johnson Cook model. The new implementation consists of a coupling of damage and the stored elastic energy, as well as the minimum failure strain criterion for spall included in the original model development. This effort establishes the necessary foundation for a thermodynamically consistent and complete continuum solid material model, for which all intensive properties derive from a common energy. The motivation for developing such a model is to improve upon ALEGRA's present combined model framework. Several applications of the new Johnson Cook implementation are presented. Deformation-driven loading paths demonstrate the basic features of the new model formulation. Use of the model produces good comparisons with experimental Taylor impact data. Localized deformation leading to fragmentation is produced for expanding ring and exploding cylinder applications.
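For reference, the flow-stress portion of the standard Johnson Cook model, which the finite strain formulation builds upon, is commonly written as

```latex
\sigma_y \;=\; \left( A + B\,\varepsilon_p^{\,n} \right)
\left( 1 + C \ln \dot{\varepsilon}^{*} \right)
\left( 1 - T^{*\,m} \right),
\qquad
T^{*} = \frac{T - T_{\mathrm{ref}}}{T_{\mathrm{melt}} - T_{\mathrm{ref}}},
```

where ε_p is the equivalent plastic strain, the starred strain rate is the plastic strain rate normalized by a reference rate, and A, B, C, n, and m are material constants; the companion damage model accumulates failure against a strain-to-failure that depends on stress triaxiality, strain rate, and temperature.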
1990-09-01
experience in using both the KC-135A/E/R database model and other mathematical models. A statistical analysis of survey ... will be ... statistic. Consequently, differences of opinion among respondents will be amplified. Summary: The research methodology provides a sequential set of ... Cost Accounting Directorate (AFLC/ACC). Though used for cost accounting purposes, the VAMOSC system has the capability of cross-referencing a WUC
Modeling Tumor Clonal Evolution for Drug Combinations Design.
Zhao, Boyang; Hemann, Michael T; Lauffenburger, Douglas A
2016-03-01
Cancer is a clonal evolutionary process. This presents challenges for effective therapeutic intervention, given the constant selective pressure towards drug resistance. Mathematical modeling from population genetics, evolutionary dynamics, and engineering perspectives is being increasingly employed to study tumor progression, intratumoral heterogeneity, drug resistance, and rational drug scheduling and combinations design. In this review, we discuss the promising opportunities these interdisciplinary approaches hold for advances in cancer biology and treatment. We propose that quantitative modeling perspectives can complement emerging experimental technologies to facilitate enhanced understanding of disease progression and improved capabilities for therapeutic drug regimen design.
A new nuclide transport model in soil in the GENII-LIN health physics code
NASA Astrophysics Data System (ADS)
Teodori, F.
2017-11-01
The nuclide soil transfer model originally included in the GENII-LIN software system was intended for residual contamination from long-term activities and from waste form degradation. Short-lived nuclides were assumed to be absent or at equilibrium with long-lived parents. Here we present an enhanced soil transport model in which short-lived nuclide contributions are correctly accounted for. This improvement extends the code's capabilities to handle incidental releases of contaminants to soil, by evaluating exposure from the very beginning of the contamination event, before the radioactive decay chain reaches equilibrium.
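Accounting for short-lived progeny before chain equilibrium amounts to solving the full decay-chain (Bateman-type) balance rather than assuming equilibrium with the parent; schematically, for chain member i with decay constant λ_i fed by its parent and subject to environmental removal,

```latex
\frac{\mathrm{d}N_i}{\mathrm{d}t}
\;=\; \lambda_{i-1} N_{i-1} \;-\; \lambda_i N_i \;-\; k_i N_i ,
```

where k_i collects the nuclide-dependent soil removal rates (e.g., leaching); this is a schematic statement of the governing balance, not the code's exact formulation.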
NASA Astrophysics Data System (ADS)
Raz-Yaseef, N.; Sonnentag, O.; Kobayashi, H.; Chen, J. M.; Verfaillie, J. G.; Ma, S.; Baldocchi, D. D.
2011-12-01
Semi-arid climates experience large seasonal and inter-annual variability in radiation and precipitation, creating natural conditions adequate to study how year-to-year changes affect atmosphere-biosphere fluxes. In particular, savanna ecosystems, which combine tree and below-canopy components, create a unique environment in which phenology changes dramatically between seasons. We used a 10-year flux database in order to define the seasonal and interannual variability of climatic inputs and fluxes, and to evaluate model capability to reproduce the observed variability. This is based on the perception that model capability to reproduce the deviation, and not just the average, is important in order to correctly predict ecosystem sensitivity to climate change. Our research site is a low-density, low-LAI (0.8) semi-arid savanna located at Tonzi Ranch, Northern California. In this system, trees are active during the warm season (Mar - Oct), and grasses are active during the wet season (Dec - May). Measurements of carbon and water fluxes above and below the tree canopy using eddy covariance and supplementary measurements have been made since 2001. Fluxes were simulated using bio-meteorological process-oriented ecosystem models: BEPS and 3D-CANOAK. Models were partly capable of reproducing fluxes on daily scales (R2=0.66). We then compared model outputs for different ecosystem components and seasons, and found distinct seasons with high correlations while other seasons were poorly represented. Agreement was much higher for ET than for GPP. The understory was better simulated than the overstory. CANOAK overestimated spring understory fluxes, probably due to its capability to directly calculate 3D radiative transfer. BEPS underestimated spring understory fluxes, following the prescribed grass die-off. Both models underestimated peak spring overstory fluxes. During winter tree dormancy, modeled fluxes were null, but occasional high fluxes of both ET and GPP were measured following precipitation events, likely produced by an adverse measurement effect. This analysis enabled us to pinpoint specific areas where the models break down, and stresses that model capability to reproduce fluxes varies among seasons and ecosystem components. The combined response was such that agreement decreased when ecosystem fluxes were partitioned between overstory and understory fluxes. Model performance decreases with time scale; while performance was high for some seasons, models were less capable of reproducing the high variability in understory fluxes vs. the conservative overstory fluxes on annual scales. Discrepancies were not always a result of model faults; agreement largely improved when measurements of overstory fluxes during precipitation events were excluded. Conclusions from this research help answer the critical question of the level and type of detail needed in order to correctly predict ecosystem response to environmental and climatic change.
NASA Technical Reports Server (NTRS)
Tikidjian, Raffi; Mackey, Ryan
2008-01-01
The DSN Array Simulator (wherein 'DSN' signifies NASA's Deep Space Network) is an updated version of software previously denoted the DSN Receive Array Technology Assessment Simulation. This software is used for computational modeling of a proposed DSN facility comprising user-defined arrays of antennas and transmitting and receiving equipment for microwave communication with spacecraft on interplanetary missions. The simulation includes variations in spacecraft tracked and communication demand changes for up to several decades of future operation. Such modeling is performed to estimate facility performance, evaluate requirements that govern facility design, and evaluate proposed improvements in hardware and/or software. The updated version of this software affords enhanced capability for characterizing facility performance against user-defined mission sets. The software includes a Monte Carlo simulation component that enables rapid generation of key mission-set metrics (e.g., numbers of links, data rates, and data volumes), and statistical distributions thereof as functions of time. The updated version also offers expanded capability for mixed-asset network modeling--for example, for running scenarios that involve user-definable mixtures of antennas having different diameters (in contradistinction to a fixed number of antennas having the same fixed diameter). The improved version also affords greater simulation fidelity, sufficient for validation by comparison with actual DSN operations and analytically predictable performance metrics.
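A minimal sketch of the kind of Monte Carlo mission-set evaluation described, written in Python rather than the simulator's own implementation; the mission list, distributions, and metrics are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(42)
N_TRIALS = 1000
SECONDS_PER_WEEK = 7 * 24 * 3600

# Hypothetical mission set: (name, mean simultaneous links, mean data rate in Mb/s)
missions = [("mars_orbiter", 2.0, 6.0), ("outer_planets", 1.0, 0.5), ("lunar_relay", 3.0, 20.0)]

link_counts, weekly_volumes_gb = [], []
for _ in range(N_TRIALS):
    total_links, total_volume_gb = 0, 0.0
    for _, mean_links, rate_mbps in missions:
        links = rng.poisson(mean_links)                           # links active this week
        rate = max(rng.normal(rate_mbps, 0.2 * rate_mbps), 0.0)   # jittered data rate
        total_links += links
        total_volume_gb += links * rate * SECONDS_PER_WEEK / 1e3  # Mb -> Gb
    link_counts.append(total_links)
    weekly_volumes_gb.append(total_volume_gb)

print("mean simultaneous links:", np.mean(link_counts))
print("95th percentile weekly volume (Gb):", np.percentile(weekly_volumes_gb, 95))
```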
An Introduction to a Scientific Research Program on Chinese Drought
NASA Astrophysics Data System (ADS)
Li, Y.
2014-12-01
Drought is one of the major meteorological disasters, with high frequency, wide distribution and serious consequences. It has one of the biggest impacts on global agricultural production, the ecological environment and sustainable socioeconomic development. China in particular is one of the countries in the world with serious drought disasters. The goal of this project is to improve capabilities in drought monitoring and forecasting based on an in-depth theory of drought. The project will be implemented in a typical extreme drought area based on a comprehensive and systematic observation network and numerical experiments. It will characterize the complete feedback mechanism among the atmospheric, water, biological and other spheres involved in forming drought. First, the atmospheric droughts that lead to agricultural and hydrologic drought, and the possible causes for these disasters, will be explored using our observation data sets. Second, the capability of monitoring, forecasting and early warning for drought will be developed with numerical models (regional climate model and land surface model, etc.). Last but not least, evaluation approaches for the risk of drought and strategies for predicting and preventing drought at the regional scale will be proposed. Meanwhile, a service system and information-sharing platform for drought monitoring and early warning will be established to improve the technical level of drought disaster preparedness and response in China.
Maturing Weapon Systems for Improved Availability at Lower Costs
1994-01-01
development of new measures of R&M performance and improved data collection and analysis capabilities. Innovations in automated data collection, including the... [contents fragments: Capabilities Required to Implement Maturation Development; Assess R&M Performance Accurately; Identify... Requirements Determination; Capabilities of the Best Existing Databases; Data Elements Needed]
Calvo, Roque; D’Amato, Roberto; Gómez, Emilio; Domingo, Rosario
2016-01-01
Coordinate measuring machines (CMM) are main instruments of measurement in laboratories and in industrial quality control. A compensation error model has been formulated (Part I). It integrates error and uncertainty in the feature measurement model. Experimental implementation for the verification of this model is carried out based on direct testing on a moving bridge CMM. The regression results by axis are quantified and compared to the CMM indication with respect to the assigned values of the measurand. Next, testing of selected measurements of length, flatness, dihedral angle, and roundness features is accomplished. The measurement of calibrated gauge blocks for length or angle, flatness verification of the CMM granite table and roundness of a precision glass hemisphere are presented under a setup of repeatability conditions. The results are analysed and compared with alternative methods of estimation. The overall performance of the model is endorsed through experimental verification, as well as the practical use and the model's capability to contribute to the improvement of current standard CMM measuring capabilities. PMID:27754441
Vazquez-Anderson, Jorge; Mihailovic, Mia K.; Baldridge, Kevin C.; Reyes, Kristofer G.; Haning, Katie; Cho, Seung Hee; Amador, Paul; Powell, Warren B.
2017-01-01
Current approaches to design efficient antisense RNAs (asRNAs) rely primarily on a thermodynamic understanding of RNA–RNA interactions. However, these approaches depend on structure predictions and have limited accuracy, arguably due to overlooking important cellular environment factors. In this work, we develop a biophysical model to describe asRNA–RNA hybridization that incorporates in vivo factors using large-scale experimental hybridization data for three model RNAs: a group I intron, CsrB and a tRNA. A unique element of our model is the estimation of the availability of the target region to interact with a given asRNA using a differential entropic consideration of suboptimal structures. We showcase the utility of this model by evaluating its prediction capabilities in four additional RNAs: a group II intron, Spinach II, 2-MS2 binding domain and glgC 5΄ UTR. Additionally, we demonstrate the applicability of this approach to other bacterial species by predicting sRNA–mRNA binding regions in two newly discovered, though uncharacterized, regulatory RNAs. PMID:28334800
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moriarty, Patrick
2016-02-23
The effects of wind turbine wakes within operating wind farms have a substantial impact on the overall energy production from the farm. The current generation of models drastically underpredicts the impact of these wakes, leading to non-conservative estimates of energy capture and financial losses to wind farm operators and developers. To improve these models, detailed research on operating wind farms is necessary. Rebecca Barthelmie of Indiana University is a world leader in wind farm wake effects and would like to partner with NREL to help improve wind farm modeling by gathering additional wind farm data, developing better models and increasing collaboration with European researchers working in the same area. This is currently an active area of research at NREL and the capabilities of both parties should mesh nicely.
The Defense Industrial Base: Prescription for a Psychosomatic Ailment
1983-08-01
The Decision-Making Process; Notes; Figure 4-1. The Decision-Making Process...the strategy and tactics process to make certain that we can attain our national security objectives. (IFP is also known as mobilization planning or...decision-making model that could improve the capacity and capability of the military-industrial complex, thereby increasing the probability of success
Impact design methods for ceramic components in gas turbine engines
NASA Technical Reports Server (NTRS)
Song, J.; Cuccio, J.; Kington, H.
1991-01-01
Methods currently under development to design ceramic turbine components with improved impact resistance are presented. Two different modes of impact damage are identified and characterized, i.e., structural damage and local damage. The entire computation is incorporated into the EPIC computer code. Model capability is demonstrated by simulating instrumented plate impact and particle impact tests.
ERIC Educational Resources Information Center
Berardi, Victor L.
2012-01-01
Using information systems to solve business problems is increasingly required of everyone in an organization, not just technical specialists. In the operations management class, spreadsheet usage has intensified with the focus on building decision models to solve operations management concerns such as forecasting, process capability, and inventory…
Greg C. Liknes; Christopher W. Woodall; Charles H. Perry
2009-01-01
Climate information frequently is included in geospatial modeling efforts to improve the predictive capability of other data sources. The selection of an appropriate climate data source requires consideration given the number of choices available. With regard to climate data, there are a variety of parameters (e.g., temperature, humidity, precipitation), time intervals...
Methods of treating complex space vehicle geometry for charged particle radiation transport
NASA Technical Reports Server (NTRS)
Hill, C. W.
1973-01-01
Current methods of treating complex geometry models for space radiation transport calculations are reviewed. The geometric techniques used in three computer codes are outlined. Evaluations of geometric capability and speed are provided for these codes. Although no code development work is included, several suggestions for significantly improving complex geometry codes are offered.
ERIC Educational Resources Information Center
Pusey, Portia; Sadera, William A.
2012-01-01
In teacher education programs, preservice teachers learn about strategies to appropriately integrate computer-related and Internet-capable technologies into instructional settings to improve student learning. Many presume that preservice teachers have the knowledge to competently model and teach issues of safety when working with these devices as…
2012-09-30
[contents fragments: Development of Sand Properties; Advanced Modeling Dataset; High Strength Low Alloy (HSLA) Steels; Steel Casting and Engineering Support] ...to achieve the performance goals required for new systems. The dramatic reduction in weight and increase in capability will require high performance...for improved weapon system reliability. SFSA developed innovative casting design and manufacturing processes for high performance parts. SFSA is
The ASAC Flight Segment and Network Cost Models
NASA Technical Reports Server (NTRS)
Kaplan, Bruce J.; Lee, David A.; Retina, Nusrat; Wingrove, Earl R., III; Malone, Brett; Hall, Stephen G.; Houser, Scott A.
1997-01-01
To assist NASA in identifying research areas with the greatest potential for improving the air transportation system, two models were developed as part of its Aviation System Analysis Capability (ASAC). The ASAC Flight Segment Cost Model (FSCM) is used to predict aircraft trajectories, resource consumption, and variable operating costs for one or more flight segments. The Network Cost Model can either summarize the costs for a network of flight segments processed by the FSCM or be used to independently estimate the variable operating costs of flying a fleet of equipment, given the number of departures and average flight stage lengths.
Description of the NCAR Community Climate Model (CCM3). Technical note
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kiehl, J.T.; Hack, J.J.; Bonan, G.B.
This report presents the details of the governing equations, physical parameterizations, and numerical algorithms defining the version of the NCAR Community Climate Model designated CCM3. The material provides an overview of the major model components and the way in which they interact as the numerical integration proceeds. This version of the CCM incorporates significant improvements to the physics package, new capabilities such as the incorporation of a slab ocean component, and a number of enhancements to the implementation (e.g., the ability to integrate the model on parallel distributed-memory computational platforms).
Predictive Behavior of a Computational Foot/Ankle Model through Artificial Neural Networks.
Chande, Ruchi D; Hargraves, Rosalyn Hobson; Ortiz-Robinson, Norma; Wayne, Jennifer S
2017-01-01
Computational models are useful tools to study the biomechanics of human joints. Their predictive performance is heavily dependent on bony anatomy and soft tissue properties. Imaging data provides anatomical requirements while approximate tissue properties are implemented from literature data, when available. We sought to improve the predictive capability of a computational foot/ankle model by optimizing its ligament stiffness inputs using feedforward and radial basis function neural networks. While the former demonstrated better performance than the latter per mean square error, both networks provided reasonable stiffness predictions for implementation into the computational model.
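A minimal sketch of the feedforward half of that idea: train a network on runs of the computational model so that it maps kinematic outputs back to the ligament stiffness inputs that produced them, then query it with measured kinematics. The data files, feature choices, and network size below are hypothetical:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Rows: kinematic outputs from individual runs of the foot/ankle model (assumed format).
# Targets: the ligament stiffness values used for those runs.
kinematics = np.loadtxt("model_kinematics.csv", delimiter=",")      # hypothetical file
stiffnesses = np.loadtxt("ligament_stiffness.csv", delimiter=",")   # hypothetical file

scaler = StandardScaler().fit(kinematics)
net = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=10000, random_state=0)
net.fit(scaler.transform(kinematics), stiffnesses)

# Query with experimentally measured kinematics to obtain optimized stiffness inputs.
measured = np.loadtxt("measured_kinematics.csv", delimiter=",")     # hypothetical file
optimized_stiffness = net.predict(scaler.transform(np.atleast_2d(measured)))
print(optimized_stiffness)
```

A radial basis function network, the other architecture compared in the study, would follow the same train-then-invert pattern with a different hidden-layer form.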
Reusable Launch Vehicle (RLV) Market Analysis Model
NASA Technical Reports Server (NTRS)
Prince, Frank A.
1999-01-01
The RLV Market Analysis model is at best a rough order approximation of actual market behavior. However, it does give a quick indication of whether the flights exist to enable an economically viable RLV, and of the assumptions necessary for the vehicle to capture those flights. Additional analysis, market research, and updating with the latest information on payloads and launches would improve the model. Plans are to update the model as new information becomes available and new requirements are levied. This tool will continue to be a vital part of NASA's RLV business analysis capability for the foreseeable future.
Modeling hydraulic regenerative hybrid vehicles using AMESim and Matlab/Simulink
NASA Astrophysics Data System (ADS)
Lynn, Alfred; Smid, Edzko; Eshraghi, Moji; Caldwell, Niall; Woody, Dan
2005-05-01
This paper presents an overview of the simulation modeling of a hydraulic system with regenerative braking used to improve vehicle emissions and fuel economy. Two simulation software packages were used together to enhance the simulation capability for fuel economy results and for development of the vehicle and hybrid control strategy. AMESim, a hydraulic simulation software package, modeled the complex hydraulic circuit and component hardware and was interlinked with a Matlab/Simulink model of the vehicle, engine and the control strategy required to operate the vehicle and the hydraulic hybrid system through various North American and European drive cycles.
NASA Technical Reports Server (NTRS)
Kubat, Gregory
2016-01-01
This report provides a description and performance characterization of the large-scale, relay-architecture UAS communications simulation capability developed for the NASA GRC UAS in the NAS Project. The system uses a validated model of the GRC Gen5 CNPC flight-test radio. The report contains a description of the simulation system and its model components, recent changes made to the system to improve performance, descriptions and objectives of sample simulations used for test and verification, and a sampling of results and performance data with observations.
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general-purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general-purpose structural software system is presented.
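The core reduction step in substructuring can be illustrated with static (Guyan) condensation, which eliminates a substructure's interior degrees of freedom in favor of its boundary ones. The sketch below uses a small illustrative stiffness matrix; it shows only this one step, not the full multilevel dynamic formulation of the report.

```python
# Static condensation of interior DOFs onto boundary DOFs: K_red = K_bb - K_bi K_ii^{-1} K_ib.
import numpy as np

def condense(K, boundary, interior):
    """Return the boundary-reduced stiffness matrix of one substructure."""
    K_bb = K[np.ix_(boundary, boundary)]
    K_bi = K[np.ix_(boundary, interior)]
    K_ib = K[np.ix_(interior, boundary)]
    K_ii = K[np.ix_(interior, interior)]
    return K_bb - K_bi @ np.linalg.solve(K_ii, K_ib)

# Small symmetric example: DOFs 0-1 lie on the boundary, DOFs 2-3 are interior.
K = np.array([[ 4., -1., -1.,  0.],
              [-1.,  4.,  0., -1.],
              [-1.,  0.,  4., -1.],
              [ 0., -1., -1.,  4.]])
K_reduced = condense(K, boundary=[0, 1], interior=[2, 3])
print(K_reduced)   # the substructure now couples to the rest of the model only via its boundary
```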
Assessment of State-of-the-Art Dust Emission Scheme in GEOS
NASA Technical Reports Server (NTRS)
Darmenov, Anton; Liu, Xiaohong; Prigent, Catherine
2017-01-01
The GEOS modeling system has been extended with a state-of-the-art parameterization of dust emissions based on the vertical flux formulation described in Kok et al. (2014). The new dust scheme was coupled with the GOCART and MAM aerosol models. In the present study we compare dust emissions, aerosol optical depth (AOD), and radiative fluxes from GEOS experiments with the standard and new dust emissions. AOD from the model experiments is also compared with AERONET and satellite-based data. Based on this comparative analysis we conclude that the new parameterization improves the GEOS capability to model dust aerosols originating from African sources; however, it leads to overestimation of dust emissions from Asian and Arabian sources. Further regional tuning of key parameters controlling the threshold friction velocity may be required in order to achieve a more definitive and uniform improvement in the dust modeling skill.
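To make the role of the threshold friction velocity concrete, the sketch below gates a dust flux on the friction velocity exceeding a tunable threshold. The functional form and constants are illustrative assumptions only; this is not the Kok et al. (2014) vertical-flux formulation used in GEOS.

```python
# Illustrative threshold-friction-velocity gating of dust emission (not the Kok et al. 2014 scheme).
import numpy as np

def dust_flux(u_star, u_star_t, scale=1.0e-5):
    """Vertical dust flux: zero below the threshold friction velocity, increasing above it."""
    excess = np.maximum(u_star**2 - u_star_t**2, 0.0)
    return scale * u_star * excess   # assumed functional form for illustration

u_star = np.array([0.2, 0.35, 0.5, 0.7])   # friction velocity (m/s)
print(dust_flux(u_star, u_star_t=0.3))      # regional tuning would adjust u_star_t
```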
A New Model for the Estimation of Cell Proliferation Dynamics Using CFSE Data
Banks, H.T.; Sutton, Karyn L.; Thompson, W. Clayton; Bocharov, Gennady; Doumic, Marie; Schenkel, Tim; Argilaguet, Jordi; Giest, Sandra; Peligero, Cristina; Meyerhans, Andreas
2011-01-01
CFSE analysis of a proliferating cell population is a popular tool for the study of cell division and division-linked changes in cell behavior. Recently [13, 43, 45], a partial differential equation (PDE) model to describe lymphocyte dynamics in a CFSE proliferation assay was proposed. We present a significant revision of this model which improves the physiological understanding of several parameters. Namely, the parameter γ used previously as a heuristic explanation for the dilution of CFSE dye by cell division is replaced with a more physical component, cellular autofluorescence. The rate at which label decays is also quantified using a Gompertz decay process. We then demonstrate a revised method of fitting the model to the commonly used histogram representation of the data. It is shown that these improvements result in a model with a strong physiological basis which is fully capable of replicating the behavior observed in the data. PMID:21889510
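The two model ingredients named above, Gompertz decay of the label and an additive cellular autofluorescence, can be sketched as follows. The parameter values, time scale, and function names are illustrative assumptions, not the fitted quantities of the cited model.

```python
# Sketch: Gompertz label decay plus cellular autofluorescence (illustrative parameters).
import numpy as np

def label_intensity(t, x0=1.0e4, c=0.3, k=0.05):
    """Gompertz decay: dx/dt = -c e^{-k t} x  =>  x(t) = x0 exp(-(c/k)(1 - e^{-k t}))."""
    return x0 * np.exp(-(c / k) * (1.0 - np.exp(-k * t)))

def measured_fluorescence(t, autofluorescence=150.0, **kwargs):
    """Observed intensity is the decaying label plus a cellular autofluorescence floor."""
    return label_intensity(t, **kwargs) + autofluorescence

t = np.linspace(0.0, 120.0, 5)   # hours
print(measured_fluorescence(t))
```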
Parallelization of the Coupled Earthquake Model
NASA Technical Reports Server (NTRS)
Block, Gary; Li, P. Peggy; Song, Yuhe T.
2007-01-01
This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, tsunami prediction over the Internet had not been done before. This new code directly couples the earthquake model and the ocean model on parallel computers and improves simulation speed. Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.
Device design and signal processing for multiple-input multiple-output multimode fiber links
NASA Astrophysics Data System (ADS)
Appaiah, Kumar; Vishwanath, Sriram; Bank, Seth R.
2012-01-01
Multimode fibers (MMFs) are limited in data rate capabilities owing to modal dispersion. However, their large core diameter simplifies alignment and packaging, and makes them attractive for short and medium length links. Recent research has shown that the use of signal processing and techniques such as multiple-input multiple-output (MIMO) can greatly improve the data rate capabilities of multimode fibers. In this paper, we review recent experimental work using MIMO and signal processing for multimode fibers, and the improvements in data rates achievable with these techniques. We then present models to design as well as simulate the performance benefits obtainable with arrays of lasers and detectors in conjunction with MIMO, using channel capacity as the metric to optimize. We also discuss some aspects related to complexity of the algorithms needed for signal processing and discuss techniques for low complexity implementation.
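The capacity metric mentioned above for an array of lasers and detectors can be written down directly. The sketch below assumes equal power allocation across transmitters (channel unknown at the transmitter) and models the modal channel simply as an i.i.d. complex Gaussian matrix; the array size and SNR are illustrative.

```python
# MIMO channel capacity with equal power per transmitter: C = log2 det(I + (SNR/Nt) H H^H).
import numpy as np

def mimo_capacity(H, snr):
    """Capacity in bits/s/Hz for channel matrix H (Nr x Nt) at the given linear SNR."""
    nr, nt = H.shape
    gram = H @ H.conj().T
    return float(np.log2(np.linalg.det(np.eye(nr) + (snr / nt) * gram).real))

rng = np.random.default_rng(1)
# Illustrative 4x4 laser/detector array; the modal channel is modeled as i.i.d. complex Gaussian.
H = (rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))) / np.sqrt(2)
print(mimo_capacity(H, snr=100.0))   # SNR = 20 dB
```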
Dryland pasture and crop conditions as seen by HCMM. [Washita River watershed, Oklahoma]
NASA Technical Reports Server (NTRS)
Rosenthal, W. D.; Harlan, J. C.; Blanchard, B. J. (Principal Investigator)
1980-01-01
Ground truth, aircraft, and satellite data were examined in order to: (1) assess the capability for determining wheat and pasture canopy temperatures in a dryland farming region from HCMM data; (2) assess the capability for determining soil moisture from HCMM data in dryland crops (winter wheat) and adjacent range lands; and (3) determine the relationship of HCMM-derived soil moisture and canopy temperature values with the condition of winter wheat and dryland farming areas during the principal growth stages. The IR data were screened to include only areas with greater than 60% pasture, surface temperatures were recalculated using the atmospheric correction factor calculated by the modified RADTRA model, and the July 29, 1978 IR data were analyzed. Screening the IR data improved the July 24/July 13 and October 7/August 31 temperature/API relationships; however, the coefficient of determination was not improved for the July 29/July 13 relationship.
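The screening-plus-regression analysis described above can be sketched as below. The arrays, the simple linear relation, and the noise are hypothetical placeholders; the actual processing used HCMM imagery corrected with the modified RADTRA model.

```python
# Sketch: keep pixels with >60% pasture, then regress corrected surface temperature on API.
import numpy as np

rng = np.random.default_rng(2)
pasture_fraction = rng.uniform(0.0, 1.0, 500)               # fraction of pasture per pixel
api = rng.uniform(0.0, 80.0, 500)                            # antecedent precipitation index (mm)
surface_temp = 45.0 - 0.15 * api + rng.normal(0, 2, 500)     # corrected surface temperature (deg C)

mask = pasture_fraction > 0.60                               # screening step
slope, intercept = np.polyfit(api[mask], surface_temp[mask], 1)
r2 = np.corrcoef(api[mask], surface_temp[mask])[0, 1] ** 2   # coefficient of determination
print(f"T = {slope:.2f}*API + {intercept:.1f}, r^2 = {r2:.2f}")
```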
FY17 Status Report on NEAMS Neutronics Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C. H.; Jung, Y. S.; Smith, M. A.
2017-09-30
Under the U.S. DOE NEAMS program, the high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, the in-house mesh generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel-based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.
Research for new UAV capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Canavan, G.H.; Leadabrand, R.
1996-07-01
This paper discusses research for new Unmanned Aerial Vehicle (UAV) capabilities. Findings indicate that UAV performance could be greatly enhanced by modest research. Improved sensors and communications enhance near-term cost effectiveness; improved engines, platforms, and stealth improve long-term effectiveness.
3D Protein structure prediction with genetic tabu search algorithm
2010-01-01
Background: Protein structure prediction (PSP) has important applications in different fields, such as drug design and disease prediction. In protein structure prediction there are two important issues: the design of the structure model and the design of the optimization technique. Because of the complexity of realistic protein structures, the structure model adopted in this paper is a simplified one, the off-lattice AB model. Once the structure model is assumed, an optimization technique is needed to search for the best conformation of a protein sequence under that model. However, PSP is an NP-hard problem even when the simplest model is assumed, and many algorithms have been developed to solve the global optimization problem. In this paper, a hybrid algorithm that combines a genetic algorithm (GA) and tabu search (TS) is developed to complete this task.

Results: To obtain an efficient optimization algorithm, several improved strategies are developed for the proposed genetic tabu search algorithm, and their combined use improves the efficiency of the algorithm. Tabu search introduced into the crossover and mutation operators improves the local search capability, the variable population size strategy maintains the diversity of the population, and the ranking selection strategy increases the probability that an individual with a low energy value enters the next generation. Experiments are performed with Fibonacci sequences and real protein sequences. Experimental results show that the lowest energy obtained by the proposed GATS algorithm is lower than that obtained by previous methods.

Conclusions: The hybrid algorithm combines the advantages of the genetic algorithm and tabu search. It makes use of the multiple search points of the genetic algorithm and overcomes the poor hill-climbing capability of the conventional genetic algorithm by using the flexible memory functions of TS. Compared with previous algorithms, the GATS algorithm has better performance in global optimization and can predict 3D protein structures more effectively. PMID:20522256
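The sketch below illustrates the general shape of such a hybrid: a genetic algorithm with ranking selection whose mutation step is filtered by a tabu-search-style memory, applied to the 2D off-lattice AB energy. The energy follows the standard 2D AB model form; all algorithmic parameters (population size, tabu tenure, step sizes) and the tiny test sequence are assumptions, not those of the GATS algorithm in the paper.

```python
# Illustrative GA + tabu-memory skeleton for the 2D off-lattice AB model (not the paper's GATS).
import numpy as np

def ab_energy(angles, seq):
    """2D AB off-lattice energy: bend term plus species-dependent Lennard-Jones term."""
    n = len(seq)
    theta = np.concatenate(([0.0], np.cumsum(angles)))               # bond directions
    steps = np.column_stack((np.cos(theta), np.sin(theta)))
    xy = np.vstack(([0.0, 0.0], np.cumsum(steps, axis=0)))           # residue coordinates
    e = 0.25 * np.sum(1.0 - np.cos(angles))
    for i in range(n - 2):
        for j in range(i + 2, n):
            r = np.linalg.norm(xy[i] - xy[j])
            c = 1.0 if seq[i] == seq[j] == 'A' else (0.5 if seq[i] == seq[j] else -0.5)
            e += 4.0 * (r**-12 - c * r**-6)
    return e

def gats(seq, pop_size=20, generations=200, tabu_tenure=50, seed=0):
    rng = np.random.default_rng(seed)
    n_angles = len(seq) - 2
    pop = rng.uniform(-np.pi, np.pi, (pop_size, n_angles))
    tabu, best, best_e = [], None, np.inf
    for _ in range(generations):
        energies = np.array([ab_energy(ind, seq) for ind in pop])
        order = np.argsort(energies)                                  # ranking selection
        if energies[order[0]] < best_e:
            best, best_e = pop[order[0]].copy(), energies[order[0]]
        parents = pop[order[: pop_size // 2]]
        children = []
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_angles)
            child = np.concatenate((a[:cut], b[cut:]))                # one-point crossover
            child += rng.normal(0.0, 0.1, n_angles)                   # mutation
            key = tuple(np.round(child, 1))
            if key in tabu:                                           # tabu memory rejects recent moves
                continue
            tabu.append(key)
            tabu = tabu[-tabu_tenure:]
            children.append(child)
        pop = np.array(children)
    return best, best_e

_, energy = gats("ABBAB")
print(f"lowest energy found: {energy:.3f}")
```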