Radiation Environment Modeling for Spacecraft Design: New Model Developments
NASA Technical Reports Server (NTRS)
Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray
2006-01-01
A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.
How Qualitative Methods Can be Used to Inform Model Development.
Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna
2017-06-01
Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.
A toolbox and a record for scientific model development
NASA Technical Reports Server (NTRS)
Ellman, Thomas
1994-01-01
Scientific computation can benefit from software tools that facilitate construction of computational models, control the application of models, and aid in revising models to handle new situations. Existing environments for scientific programming provide only limited means of handling these tasks. This paper describes a two-pronged approach for handling these tasks: (1) designing a 'Model Development Toolbox' that includes a basic set of model constructing operations; and (2) designing a 'Model Development Record' that is automatically generated during model construction. The record is subsequently exploited by tools that control the application of scientific models and revise models to handle new situations. Our two-pronged approach is motivated by our belief that the model development toolbox and record should be highly interdependent. In particular, a suitable model development record can be constructed only when models are developed using a well defined set of operations. We expect this research to facilitate rapid development of new scientific computational models, to help ensure appropriate use of such models and to facilitate sharing of such models among working computational scientists. We are testing this approach by extending SIGMA, an existing knowledge-based scientific software design tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Zili; Nordhaus, William
2009-03-19
Over the course of this project, we completed the main tasks set out in the initial proposal. These tasks include: setting up the basic platform in the GAMS language for the new RICE 2007 model; testing various model structures of RICE 2007; incorporating the PPP data set into the new RICE model; and developing a gridded data set for IA modeling.
2017-06-01
This research expands the modeling and simulation (M and S) body of knowledge through the development of an Implicit Model Development Process (IMDP...When augmented to traditional Model Development Processes (MDP), the IMDP enables the development of models that can address a broader array of...where a broader, more holistic approach of defining a model's referent is achieved. Next, the IMDP codifies the process for implementing the improved model
JEDI Natural Gas Model | Jobs and Economic Development Impact Models | NREL
Natural Gas Model JEDI Natural Gas Model The Jobs and Economic Development Impacts (JEDI) Natural Gas model allows users to estimate economic development impacts from natural gas power generation -specific data should be used to obtain the best estimate of economic development impacts. This model has
JEDI International Model | Jobs and Economic Development Impact Models |
NREL International Model JEDI International Model The Jobs and Economic Development Impacts (JEDI) International Model allows users to estimate economic development impacts from international
Ontology for Life-Cycle Modeling of Water Distribution Systems: Model View Definition
2013-06-01
Research and Development Center, Construction Engineering Research Laboratory (ERDC-CERL) to develop a life-cycle building model have resulted in the definition of a “core” building information model that contains...developed experimental BIM models using commercial off-the-shelf (COTS) software. Those models represent three types of typical low-rise Army
JEDI Geothermal Model | Jobs and Economic Development Impact Models | NREL
Geothermal Model JEDI Geothermal Model The Jobs and Economic Development Impacts (JEDI) Geothermal Model allows users to estimate economic development impacts from geothermal projects and includes
JEDI Biofuels Models | Jobs and Economic Development Impact Models | NREL
Biofuels Models JEDI Biofuels Models The Jobs and Economic Development Impacts (JEDI) biofuel models allow users to estimate economic development impacts from biofuel projects and include default
JEDI Petroleum Model | Jobs and Economic Development Impact Models | NREL
Petroleum Model JEDI Petroleum Model The Jobs and Economic Development Impacts (JEDI) Petroleum Model allows users to estimate economic development impacts from petroleum projects and includes default
Adolescent Psychosocial Development: A Review of Longitudinal Models and Research
ERIC Educational Resources Information Center
Meeus, Wim
2016-01-01
This review used 4 types of longitudinal models (descriptive models, prediction models, developmental sequence models and longitudinal mediation models) to identify regular patterns of psychosocial development in adolescence. Eight patterns of adolescent development were observed across countries: (1) adolescent maturation in multiple…
Second Generation Crop Yield Models Review
NASA Technical Reports Server (NTRS)
Hodges, T. (Principal Investigator)
1982-01-01
Second generation yield models, including crop growth simulation models and plant process models, may be suitable for large area crop yield forecasting in the yield model development project. Subjective and objective criteria for model selection are defined and models which might be selected are reviewed. Models may be selected to provide submodels as input to other models; for further development and testing; or for immediate testing as forecasting tools. A plant process model may range in complexity from several dozen submodels simulating (1) energy, carbohydrates, and minerals; (2) change in biomass of various organs; and (3) initiation and development of plant organs, to a few submodels simulating key physiological processes. The most complex models cannot be used directly in large area forecasting but may provide submodels which can be simplified for inclusion into simpler plant process models. Both published and unpublished models which may be used for development or testing are reviewed. Several other models, currently under development, may become available at a later date.
A Model-Driven Development Method for Management Information Systems
NASA Astrophysics Data System (ADS)
Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki
Traditionally, a Management Information System (MIS) has been developed without using formal methods. With informal methods, the MIS is developed over its lifecycle without having any models, which causes many problems, such as a lack of reliability in system design specifications. In order to overcome these problems, a model theory approach was proposed. The approach is based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly accommodate changes in business logic or implementation technologies. In model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies the model-driven development method to a component of the model theory approach. The experiment showed that the effort saved amounts to more than 30% of the total development effort.
ASTP ranging system mathematical model
NASA Technical Reports Server (NTRS)
Ellis, M. R.; Robinson, L. H.
1973-01-01
A mathematical model of the VHF ranging system is presented to analyze the performance of the Apollo-Soyuz Test Project (ASTP). The system was adapted for use in the ASTP. The ranging system mathematical model is presented in block diagram form, and a brief description of the overall model is also included. A procedure for implementing the math model is presented, along with a discussion of the validation of the math model and the overall summary and conclusions of the study effort. Detailed appendices cover the five study tasks: early-late gate model development, unlock probability development, system error model development, probability of acquisition model development, and math model validation testing.
DOT National Transportation Integrated Search
2014-02-01
A mathematical model was developed for the purpose of providing students with data acquisition and engine modeling experience at the University of Idaho. In developing the model, multiple heat transfer and emissions models were researched and com...
Distributed geospatial model sharing based on open interoperability standards
Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin
2009-01-01
Numerous geospatial computational models have been developed based on sound principles and published in journals or presented at conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared as model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
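The WPS-based sharing described in this abstract comes down to issuing standard Execute requests against a published process. Below is a minimal sketch of building a WPS 1.0.0 Execute request URL in KVP encoding; the endpoint URL, process identifier, and input names are illustrative placeholders, not the actual service from the paper:

```python
from urllib.parse import urlencode

def wps_execute_url(base_url, process_id, inputs):
    """Build a WPS 1.0.0 Execute request URL (KVP encoding).

    inputs: dict of input name -> value, joined per the DataInputs syntax.
    """
    datainputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    query = urlencode({
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": process_id,
        "datainputs": datainputs,
    })
    return f"{base_url}?{query}"

# Hypothetical usage: invoke a wetland hydrology process on a WPS endpoint.
url = wps_execute_url("http://example.org/wps", "WetlandHydrology",
                      {"basin": "pothole1"})
```

A client compliant with the WPS standard would then issue an HTTP GET on this URL and parse the ExecuteResponse document.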
JEDI Coal Model | Jobs and Economic Development Impact Models | NREL
Coal Model JEDI Coal Model The Jobs and Economic Development Impacts (JEDI) Coal Model allows users to estimate economic development impacts from coal projects and includes default information that can
NASA Astrophysics Data System (ADS)
Robins, N. S.; Rutter, H. K.; Dumpleton, S.; Peach, D. W.
2005-01-01
Groundwater investigation has long depended on the process of developing a conceptual flow model as a precursor to developing a mathematical model, which in turn may lead in complex aquifers to the development of a numerical approximation model. The assumptions made in the development of the conceptual model depend heavily on the geological framework defining the aquifer, and if the conceptual model is inappropriate then subsequent modelling will also be incorrect. Paradoxically, the development of a robust conceptual model remains difficult, not least because this 3D paradigm is usually reduced to 2D plans and sections. 3D visualisation software is now available to facilitate the development of the conceptual model, to make the model more robust and defensible and to assist in demonstrating the hydraulics of the aquifer system. Case studies are presented to demonstrate the role and cost-effectiveness of the visualisation process.
Hybrid LCA model for assessing the embodied environmental impacts of buildings in South Korea
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jang, Minho, E-mail: minmin40@hanmail.net; Hong, Taehoon, E-mail: hong7@yonsei.ac.kr; Ji, Changyoon, E-mail: chnagyoon@yonsei.ac.kr
2015-01-15
The assessment of the embodied environmental impacts of buildings can help decision-makers plan environment-friendly buildings and reduce environmental impacts. For a more comprehensive assessment of the embodied environmental impacts of buildings, a hybrid life cycle assessment model was developed in this study. The developed model can assess the embodied environmental impacts (global warming, ozone layer depletion, acidification, eutrophication, photochemical ozone creation, abiotic depletion, and human toxicity) generated directly and indirectly in the material manufacturing, transportation, and construction phases. To demonstrate the application and validity of the developed model, the environmental impacts of an elementary school building were assessed using the developed model and compared with the results of a previous model used in a case study. The embodied environmental impacts from the previous model were lower than those from the developed model by 4.6–25.2%. In particular, the human toxicity potential calculated by the previous model (13 kg C₆H₆ eq.) was much lower than that calculated by the developed model (1965 kg C₆H₆ eq.). The results indicated that the developed model can quantify the embodied environmental impacts of buildings more comprehensively, and can be used by decision-makers as a tool for selecting environment-friendly buildings. Highlights: • The model was developed to assess the embodied environmental impacts of buildings. • The model evaluates GWP, ODP, AP, EP, POCP, ADP, and HTP as environmental impacts. • The model presents more comprehensive results than the previous model by 4.6–100%. • The model can present the HTP of buildings, which the previous models cannot do. • Decision-makers can use the model for selecting environment-friendly buildings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tonks, Michael R; Zhang, Yongfeng; Bai, Xianming
2014-06-01
This report summarizes development work funded by the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program's Fuels Product Line (FPL) to develop a mechanistic model for the average grain size in UO₂ fuel. The model is developed using a multiscale modeling and simulation approach involving atomistic simulations, as well as mesoscale simulations using INL's MARMOT code.
Replicating Health Economic Models: Firm Foundations or a House of Cards?
Bermejo, Inigo; Tappenden, Paul; Youn, Ji-Hee
2017-11-01
Health economic evaluation is a framework for the comparative analysis of the incremental health gains and costs associated with competing decision alternatives. The process of developing health economic models is usually complex, financially expensive and time-consuming. For these reasons, model development is sometimes based on previous model-based analyses; this endeavour is usually referred to as model replication. Such model replication activity may involve the comprehensive reproduction of an existing model or 'borrowing' all or part of a previously developed model structure. Generally speaking, the replication of an existing model may require substantially less effort than developing a new de novo model by bypassing, or undertaking in only a perfunctory manner, certain aspects of model development such as the development of a complete conceptual model and/or comprehensive literature searching for model parameters. A further motivation for model replication may be to draw on the credibility or prestige of previous analyses that have been published and/or used to inform decision making. The acceptability and appropriateness of replicating models depends on the decision-making context: there exists a trade-off between the 'savings' afforded by model replication and the potential 'costs' associated with reduced model credibility due to the omission of certain stages of model development. This paper provides an overview of the different levels of, and motivations for, replicating health economic models, and discusses the advantages, disadvantages and caveats associated with this type of modelling activity. Irrespective of whether replicated models should be considered appropriate or not, complete replicability is generally accepted as a desirable property of health economic models, as reflected in critical appraisal checklists and good practice guidelines. 
To this end, the feasibility of comprehensive model replication is explored empirically across a small number of recent case studies. Recommendations are put forward for improving reporting standards to enhance comprehensive model replicability.
Electric Propulsion System Modeling for the Proposed Prometheus 1 Mission
NASA Technical Reports Server (NTRS)
Fiehler, Douglas; Dougherty, Ryan; Manzella, David
2005-01-01
The proposed Prometheus 1 spacecraft would utilize nuclear electric propulsion to propel the spacecraft to its ultimate destination where it would perform its primary mission. As part of the Prometheus 1 Phase A studies, system models were developed for each of the spacecraft subsystems that were integrated into one overarching system model. The Electric Propulsion System (EPS) model was developed using data from the Prometheus 1 electric propulsion technology development efforts. This EPS model was then used to provide both performance and mass information to the Prometheus 1 system model for total system trades. Development of the EPS model is described, detailing both the performance calculations as well as its evolution over the course of Phase A through three technical baselines. Model outputs are also presented, detailing the performance of the model and its direct relationship to the Prometheus 1 technology development efforts. These EP system model outputs are also analyzed chronologically showing the response of the model development to the four technical baselines during Prometheus 1 Phase A.
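To give a flavor of the performance calculations an EP system model of this kind performs, the sketch below implements the standard idealized relations between thruster input power, efficiency, specific impulse, thrust, and propellant flow. These are textbook relations, not the actual Prometheus 1 model, and the numbers in the usage example are purely illustrative:

```python
G0 = 9.80665  # standard gravity, m/s^2

def thrust_n(power_w, isp_s, efficiency):
    """Idealized electric-thruster thrust: T = 2*eta*P / (g0*Isp), in newtons."""
    return 2.0 * efficiency * power_w / (G0 * isp_s)

def propellant_flow_kg_s(thrust, isp_s):
    """Propellant mass flow rate: mdot = T / (g0*Isp), in kg/s."""
    return thrust / (G0 * isp_s)

# Illustrative numbers only: a 20 kW thruster at 6000 s Isp, 65% efficiency.
t = thrust_n(20000.0, 6000.0, 0.65)
mdot = propellant_flow_kg_s(t, 6000.0)
```

A system-level model would evaluate such relations across throttle points and sum thruster, PPU, and feed-system masses to feed performance and mass totals up to the spacecraft model.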
Clinical Predictive Modeling Development and Deployment through FHIR Web Services.
Khalilia, Mohammed; Choi, Myung; Henderson, Amelia; Iyengar, Sneha; Braunstein, Mark; Sun, Jimeng
2015-01-01
Clinical predictive modeling involves two challenging tasks: model development and model deployment. In this paper we demonstrate a software architecture for developing and deploying clinical predictive models using web services via the Health Level 7 (HL7) Fast Healthcare Interoperability Resources (FHIR) standard. The services enable model development using electronic health records (EHRs) stored in OMOP CDM databases and model deployment for scoring individual patients through FHIR resources. The MIMIC2 ICU dataset and a synthetic outpatient dataset were transformed into OMOP CDM databases for predictive model development. The resulting predictive models are deployed as FHIR resources, which receive requests of patient information, perform prediction against the deployed predictive model and respond with prediction scores. To assess the practicality of this approach we evaluated the response and prediction time of the FHIR modeling web services. We found the system to be reasonably fast with one second total response time per patient prediction.
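To illustrate the request/response pattern such a scoring service implies, the sketch below assembles a FHIR-style Parameters resource for a prediction request and extracts a score from a response. The parameter names and payload shapes are hypothetical and are not the actual resources defined by the paper's services:

```python
import json

def build_score_request(patient_id, observations):
    """Assemble a minimal Parameters-style request for a hypothetical
    prediction endpoint (names are illustrative, not the paper's API)."""
    return {
        "resourceType": "Parameters",
        "parameter": [
            {"name": "patientId", "valueString": patient_id},
            *({"name": k, "valueDecimal": v} for k, v in observations.items()),
        ],
    }

def parse_score_response(body):
    """Extract the prediction score from a Parameters-style JSON response."""
    doc = json.loads(body)
    for p in doc.get("parameter", []):
        if p.get("name") == "score":
            return p.get("valueDecimal")
    return None
```

In the architecture described above, a request like this would be POSTed to the FHIR service, which runs the deployed model against the patient's data and returns the score in the response resource.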
1990-09-01
The COCOMO model, which stands for COnstructive COst MOdel, was developed by Barry W. Boehm. This work describes a cost estimation model that uses an expert system to automate the Intermediate COnstructive Cost Estimation MOdel (COCOMO).
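The Intermediate COCOMO referenced above estimates effort as E = a · KLOC^b · EAF person-months, where EAF is the product of the cost-driver multipliers. A minimal sketch with the published coefficients for a semi-detached project follows; the defaults are illustrative and are not the expert-system values from this work:

```python
def cocomo_effort_pm(kloc, eaf=1.0, a=3.0, b=1.12):
    """Intermediate COCOMO effort estimate in person-months.

    E = a * KLOC^b * EAF, with a=3.0, b=1.12 for a semi-detached project;
    eaf is the product of the 15 cost-driver multipliers.
    """
    return a * kloc ** b * eaf

# Example: a 10 KLOC semi-detached project with nominal cost drivers (EAF=1.0).
effort = cocomo_effort_pm(10.0)
```

The expert-system layer described in the abstract would replace the manually chosen EAF by inferring cost-driver ratings from project answers.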
Modeling Rabbit Responses to Single and Multiple Aerosol ...
Journal Article Survival models are developed here to predict response and time-to-response for mortality in rabbits following exposures to single or multiple aerosol doses of Bacillus anthracis spores. Hazard function models were developed for a multiple dose dataset to predict the probability of death through specifying dose-response functions and the time between exposure and the time-to-death (TTD). Among the models developed, the best-fitting survival model (baseline model) has an exponential dose-response model with a Weibull TTD distribution. Alternative models assessed employ different underlying dose-response functions and use the assumption that, in a multiple dose scenario, earlier doses affect the hazard functions of each subsequent dose. In addition, published mechanistic models are analyzed and compared with models developed in this paper. None of the alternative models that were assessed provided a statistically significant improvement in fit over the baseline model. The general approach utilizes simple empirical data analysis to develop parsimonious models with limited reliance on mechanistic assumptions. The baseline model predicts TTDs consistent with reported results from three independent high-dose rabbit datasets. More accurate survival models depend upon future development of dose-response datasets specifically designed to assess potential multiple dose effects on response and time-to-response. The process used in this paper to dev
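The baseline model described above combines an exponential dose-response function with a Weibull time-to-death distribution. A minimal sketch of that structure is shown below; the parameter values are placeholders for illustration, not the fitted values from the paper:

```python
import math

# Illustrative placeholder parameters (not the paper's fitted values).
K = 1e-6          # exponential dose-response slope, per spore
TTD_SHAPE = 2.0   # Weibull shape parameter for time-to-death
TTD_SCALE = 4.0   # Weibull scale parameter for time-to-death, days

def p_death(dose):
    """Exponential dose-response: probability the exposure is ultimately lethal."""
    return 1.0 - math.exp(-K * dose)

def p_death_by(dose, t_days):
    """Probability of death by time t: lethality times the Weibull TTD CDF."""
    ttd_cdf = 1.0 - math.exp(-((t_days / TTD_SCALE) ** TTD_SHAPE))
    return p_death(dose) * ttd_cdf
```

The multiple-dose variants in the paper extend this structure by letting earlier doses modify the hazard contributed by each subsequent dose.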
Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1991-01-01
The primary tasks performed are: (1) the development of a second order local thermodynamic nonequilibrium (LTNE) model for atoms; (2) the continued development of vibrational nonequilibrium models; and (3) the development of a new multicomponent diffusion model. In addition, studies comparing these new models with previous models and results were conducted and reported.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenquist, Ian; Tonks, Michael
2016-10-01
Light water reactor fuel pellets are fabricated using sintering to final densities of 95% or greater. During reactor operation, the porosity remaining in the fuel after fabrication decreases further due to irradiation-assisted densification. While empirical models have been developed to describe this densification process, a mechanistic model is needed as part of the ongoing work by the NEAMS program to develop a more predictive fuel performance code. In this work we will develop a phase field model of sintering of UO₂ in the MARMOT code, and validate it by comparing to published sintering data. We will then add the capability to capture irradiation effects into the model, and use it to develop a mechanistic model of densification that will go into the BISON code and add another essential piece to the microstructure-based materials models. The final step will be to add the effects of applied fields, to model field-assisted sintering of UO₂. The results of the phase field model will be validated by comparing to data from field-assisted sintering. Tasks over three years: (1) develop a sintering model for UO₂ in MARMOT; (2) expand the model to account for irradiation effects; (3) develop a mechanistic macroscale model of densification for BISON.
Thermal Model Development for Ares I-X
NASA Technical Reports Server (NTRS)
Amundsen, Ruth M.; DelCorso, Joe
2008-01-01
Thermal analysis for the Ares I-X vehicle has involved extensive thermal model integration, since thermal models of vehicle elements came from several different NASA and industry organizations. Many valuable lessons were learned in terms of model integration and validation. Modeling practices such as submodel, analysis group and symbol naming were standardized to facilitate the later model integration. Upfront coordination of coordinate systems, timelines, units, symbols and case scenarios was very helpful in minimizing integration rework. A process for model integration was developed that included pre-integration runs and basic checks of both models, and a step-by-step process to efficiently integrate one model into another. Extensive use of model logic was used to create scenarios and timelines for avionics and air flow activation. Efficient methods of model restart between case scenarios were developed. Standardization of software version and even compiler version between organizations was found to be essential. An automated method for applying aeroheating to the full integrated vehicle model, including submodels developed by other organizations, was developed.
Development and application of air quality models at the US ...
Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computational Exposure Division (CED) of the National Exposure Research Laboratory (NERL). This presentation will provide a simple overview of air quality model development and application geared toward a non-technical student audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
ERIC Educational Resources Information Center
Paul, Kristina Ayers; Seward, Kristen K.
2016-01-01
The place-based investment model (PBIM) of talent development is a programming model for developing talents of high-potential youth in ways that could serve as an investment in the community. In this article, we discuss the PBIM within rural contexts. The model is grounded in three theories--Moon's personal talent development theory, Sternberg's…
Assessing the Development of Cross-Cultural Competence in Soldiers
2010-11-01
five stages of CQ development based on models from developmental psychology including Piaget's Model of Cognitive Development (Piaget, 1985) and...and competence development, the Stage Model of Cognitive Skill Acquisition (Ross et al., 2005), and the Bennett Developmental Model of...developmental stages of proficiency and expertise (see the Stage Model of Cognitive Development). At this level, cross-cultural competence is highly refined and
Class Model Development Using Business Rules
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Gudas, Saulius
New developments in the area of computer-aided system engineering (CASE) greatly improve processes of the information systems development life cycle (ISDLC). Much effort is put into quality improvement issues, but IS development projects still suffer from the poor quality of models during the system analysis and design cycles. To some degree, the quality of models that are developed using CASE tools can be assured using various automated model comparison and syntax checking procedures. It is also reasonable to check these models against the business domain knowledge, but the domain knowledge stored in the repository of a CASE tool (enterprise model) is insufficient (Gudas et al. 2004). Involvement of business domain experts in these processes is complicated because non-IT people often find it difficult to understand models that were developed by IT professionals using some specific modeling language.
ERIC Educational Resources Information Center
Idawati; Mahmud, Alimuddin; Dirawan, Gufran Darma
2016-01-01
The purpose of this research was to determine the effectiveness of a community-based training model for capacity building of women's entrepreneurship. The research followed a Research and Development approach, referring to the development research model of Romiszowki (1996) combined with the development model of Sugiono (2011); it was…
A Framework for Developing the Structure of Public Health Economic Models.
Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P
2016-01-01
A conceptual modeling framework is a methodology that assists modelers through the process of developing a model structure. Public health interventions tend to operate in dynamically complex systems. Modeling public health interventions requires broader considerations than clinical ones. Inappropriately simple models may lead to poor validity and credibility, resulting in suboptimal allocation of resources. This article presents the first conceptual modeling framework for public health economic evaluation. The framework presented here was informed by literature reviews of the key challenges in public health economic modeling and existing conceptual modeling frameworks; qualitative research to understand the experiences of modelers when developing public health economic models; and piloting a draft version of the framework. The conceptual modeling framework comprises four key principles of good practice and a proposed methodology. The key principles are that 1) a systems approach to modeling should be taken; 2) a documented understanding of the problem is imperative before and alongside developing and justifying the model structure; 3) strong communication with stakeholders and members of the team throughout model development is essential; and 4) a systematic consideration of the determinants of health is central to identifying the key impacts of public health interventions. The methodology consists of four phases: phase A, aligning the framework with the decision-making process; phase B, identifying relevant stakeholders; phase C, understanding the problem; and phase D, developing and justifying the model structure. Key areas for further research involve evaluation of the framework in diverse case studies and the development of methods for modeling individual and social behavior. This approach could improve the quality of public health economic models, supporting efficient allocation of scarce resources.
Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Wang, Wenyi; Kim, Marlene T.; Sedykh, Alexander
2015-01-01
Purpose Experimental Blood–Brain Barrier (BBB) permeability models for drug molecules are expensive and time-consuming. As alternative methods, several traditional Quantitative Structure-Activity Relationship (QSAR) models have been developed previously. In this study, we aimed to improve the predictivity of traditional QSAR BBB permeability models by employing relevant public bio-assay data in the modeling process. Methods We compiled a BBB permeability database consisting of 439 unique compounds from various resources. The database was split into a modeling set of 341 compounds and a validation set of 98 compounds. Consensus QSAR modeling workflow was employed on the modeling set to develop various QSAR models. A five-fold cross-validation approach was used to validate the developed models, and the resulting models were used to predict the external validation set compounds. Furthermore, we used previously published membrane transporter models to generate relevant transporter profiles for target compounds. The transporter profiles were used as additional biological descriptors to develop hybrid QSAR BBB models. Results The consensus QSAR models have R2=0.638 for fivefold cross-validation and R2=0.504 for external validation. The consensus model developed by pooling chemical and transporter descriptors showed better predictivity (R2=0.646 for five-fold cross-validation and R2=0.526 for external validation). Moreover, several external bio-assays that correlate with BBB permeability were identified using our automatic profiling tool. Conclusions The BBB permeability models developed in this study can be useful for early evaluation of new compounds (e.g., new drug candidates). The combination of chemical and biological descriptors shows a promising direction to improve the current traditional QSAR models. PMID:25862462
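The consensus workflow with five-fold cross-validation described above can be sketched as follows. This is a minimal illustration only: the descriptor matrix, response values, and choice of base learners are synthetic placeholders, not the chemical and transporter descriptors or models used in the study.

```python
# Sketch of a consensus QSAR workflow with five-fold cross-validation.
# X and y are synthetic stand-ins for computed descriptors and BBB permeability.
import numpy as np
from sklearn.model_selection import cross_val_predict, KFold
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(341, 20))  # modeling set: 341 compounds, 20 descriptors
y = X[:, 0] * 0.8 - X[:, 1] * 0.5 + rng.normal(scale=0.3, size=341)

cv = KFold(n_splits=5, shuffle=True, random_state=0)
models = [RandomForestRegressor(n_estimators=100, random_state=0),
          KNeighborsRegressor(n_neighbors=5)]

# Consensus prediction = average of the individual models' CV predictions.
preds = [cross_val_predict(m, X, y, cv=cv) for m in models]
consensus = np.mean(preds, axis=0)
print(f"five-fold consensus R2 = {r2_score(y, consensus):.3f}")
```

Adding the transporter-profile descriptors described in the abstract would simply widen `X` with extra biological columns before the same fitting step.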
Teachers' Development Model to Authentic Assessment by Empowerment Evaluation Approach
ERIC Educational Resources Information Center
Charoenchai, Charin; Phuseeorn, Songsak; Phengsawat, Waro
2015-01-01
The purposes of this study were 1) Study teachers authentic assessment, teachers comprehension of authentic assessment and teachers needs for authentic assessment development. 2) To create teachers development model. 3) Experiment of teachers development model. 4) Evaluate effectiveness of teachers development model. The research is divided into 4…
NASA Technical Reports Server (NTRS)
Kopasakis, George; Connolly, Joseph W.; Seidel, Jonathan
2016-01-01
A summary of the propulsion system modeling under NASA's High Speed Project (HSP) AeroPropulsoServoElasticity (APSE) task is provided with a focus on the propulsion system for the low-boom supersonic configuration developed by Lockheed Martin and referred to as the N+2 configuration. This summary includes details on the effort to date to develop computational models for the various propulsion system components. The objective of this paper is to summarize the model development effort in this task, while providing more detail in the modeling areas that have not been previously published. The purpose of the propulsion system modeling and the overall APSE effort is to develop an integrated dynamic vehicle model to conduct appropriate unsteady analysis of supersonic vehicle performance. This integrated APSE system model concept includes the propulsion system model and the vehicle structural-aerodynamics model. The development to date of such a preliminary integrated model will also be summarized in this report.
NASA Technical Reports Server (NTRS)
Kopasakis, George; Connolly, Joseph; Seidel, Jonathan
2014-01-01
A summary of the propulsion system modeling under NASA's High Speed Project (HSP) AeroPropulsoServoElasticity (APSE) task is provided with a focus on the propulsion system for the low-boom supersonic configuration developed by Lockheed Martin and referred to as the N+2 configuration. This summary includes details on the effort to date to develop computational models for the various propulsion system components. The objective of this paper is to summarize the model development effort in this task, while providing more detail in the modeling areas that have not been previously published. The purpose of the propulsion system modeling and the overall APSE effort is to develop an integrated dynamic vehicle model to conduct appropriate unsteady analysis of supersonic vehicle performance. This integrated APSE system model concept includes the propulsion system model and the vehicle structural-aerodynamics model. The development to date of such a preliminary integrated model, which couples the propulsion system dynamics, the structural dynamics, and aerodynamics, will also be summarized in this report.
NASA Technical Reports Server (NTRS)
Kopasakis, George; Connolly, Joseph W.; Seidel, Jonathan
2014-01-01
A summary of the propulsion system modeling under NASA's High Speed Project (HSP) AeroPropulsoServoElasticity (APSE) task is provided with a focus on the propulsion system for the low-boom supersonic configuration developed by Lockheed Martin and referred to as the N+2 configuration. This summary includes details on the effort to date to develop computational models for the various propulsion system components. The objective of this paper is to summarize the model development effort in this task, while providing more detail in the modeling areas that have not been previously published. The purpose of the propulsion system modeling and the overall APSE effort is to develop an integrated dynamic vehicle model to conduct appropriate unsteady analysis of supersonic vehicle performance. This integrated APSE system model concept includes the propulsion system model and the vehicle structural-aerodynamics model. The development to date of such a preliminary integrated model will also be summarized in this report.
Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1994-01-01
The primary accomplishments of the project were as follows: (1) From an overall standpoint, the primary accomplishment of this research was the development of a complete gasdynamic-radiatively coupled nonequilibrium viscous shock layer solution method for axisymmetric blunt bodies. This method can be used for rapid engineering modeling of nonequilibrium re-entry flowfields over a wide range of conditions. (2) Another significant accomplishment was the development of an air radiation model that included local thermodynamic nonequilibrium (LTNE) phenomena. (3) As part of this research, three electron-electronic energy models were developed. The first was a quasi-equilibrium electron (QEE) model which determined an effective free electron temperature and assumed that the electronic states were in equilibrium with the free electrons. The second was a quasi-equilibrium electron-electronic (QEEE) model which computed an effective electron-electronic temperature. The third model was a full electron-electronic (FEE) differential equation model which included convective, collisional, viscous, conductive, vibrational coupling, and chemical effects on electron-electronic energy. (4) Since vibration-dissociation coupling phenomena as well as vibrational thermal nonequilibrium phenomena are important in the nonequilibrium zone behind a shock front, a vibrational energy and vibration-dissociation coupling model was developed and included in the flowfield model. This model was a modified coupled vibrational dissociation vibrational (MCVDV) model and also included electron-vibrational coupling. (5) Another accomplishment of the project was the use of the developed models to investigate radiative heating. (6) A multi-component diffusion model, which properly represents the multi-component nature of diffusion in complex gas mixtures such as air, was developed and incorporated into the blunt body model.
(7) A model was developed to predict the magnitude and characteristics of the shock wave precursor ahead of vehicles entering the Earth's atmosphere. (8) Since considerable data exists for radiating nonequilibrium flow behind normal shock waves, a normal shock wave version of the blunt body code was developed. (9) By comparing predictions from the models and codes with available normal shock data and the flight data of Fire II, it is believed that the developed flowfield and nonequilibrium radiation models have been essentially validated for engineering applications.
NASA Technical Reports Server (NTRS)
Ensey, Tyler S.
2013-01-01
During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before they are implemented into hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). The model simulates the physical behavior (function, state, limits, and I/O) of each end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated through MATLAB's Simulink program. The intensive model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library components; implement library component references; implement subsystem components; develop a test script; run the test script; develop a user's guide; send the model out for peer review; the model is sent out for verification/validation; if there is empirical data, a validation data package is generated; if there is not empirical data, a verification package is generated; the test results are then reviewed; and finally, the user requests accreditation, and a statement of accreditation is prepared. Once each component model is reviewed and approved, they are intertwined together into one integrated model. This integrated model is then tested itself, through a test script and autotest, so that it can be concluded that all models work conjointly, for a single purpose.
The component I was assigned, specifically, was a fluid component, a discrete pressure switch. The switch takes a fluid pressure input, and if the pressure is greater than a designated cutoff pressure, the switch would stop fluid flow.
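The discrete pressure switch behavior described above can be sketched in a few lines. This is an illustrative stand-in, not the actual Simulink GSE component; the class and attribute names are hypothetical.

```python
# Sketch of the discrete pressure switch logic: flow is permitted only while
# the sensed pressure stays at or below the designated cutoff pressure.
# Names are illustrative, not the actual GSE model interface.
from dataclasses import dataclass

@dataclass
class DiscretePressureSwitch:
    cutoff_psi: float
    flow_enabled: bool = True

    def update(self, pressure_psi: float) -> bool:
        """Re-evaluate the switch against the current pressure reading."""
        self.flow_enabled = pressure_psi <= self.cutoff_psi
        return self.flow_enabled

switch = DiscretePressureSwitch(cutoff_psi=150.0)
print(switch.update(120.0))  # True: below cutoff, flow continues
print(switch.update(165.0))  # False: above cutoff, flow is stopped
```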
Development of Virtual Blade Model for Modelling Helicopter Rotor Downwash in OpenFOAM
2013-12-01
Stefano Wahono, Aerospace... Georgia Institute of Technology. The OpenFOAM predicted result was also shown to compare favourably with ANSYS Fluent predictions. ... Executive Summary: The Infrared...
Poss, J E
2001-06-01
This article discusses the development of a new model representing the synthesis of two models that are often used to study health behaviors: the Health Belief Model and the Theory of Reasoned Action. The new model was developed as the theoretic framework for an investigation of the factors affecting participation by Mexican migrant workers in tuberculosis screening. Development of the synthesized model evolved from the concern that models used to investigate health-seeking behaviors of mainstream Anglo groups in the United States might not be appropriate for studying migrant workers or persons from other cultural backgrounds.
Accuracy of eastern white pine site index models developed in the Southern Appalachian Mountains
W. Henry McNab
2002-01-01
Three older, anamorphic eastern white pine (Pinus strobus L.) site index models developed in the southern Appalachian Mountains between 1932 and 1962 were evaluated for accuracy and compared with a newer, polymorphic model developed in 1971. Accuracies of the older models were tested with data used in development of the 1971 model, in which actual...
NASA Astrophysics Data System (ADS)
Kennedy, J. H.; Bennett, A. R.; Evans, K. J.; Fyke, J. G.; Vargo, L.; Price, S. F.; Hoffman, M. J.
2016-12-01
Accurate representation of ice sheets and glaciers is essential for robust predictions of arctic climate within Earth System models. Verification and Validation (V&V) is a set of techniques used to quantify the correctness and accuracy of a model, which builds developer/modeler confidence, and can be used to enhance the credibility of the model. Fundamentally, V&V is a continuous process because each model change requires a new round of V&V testing. The Community Ice Sheet Model (CISM) development community is actively developing LIVVkit, the Land Ice Verification and Validation toolkit, which is designed to easily integrate into an ice-sheet model's development workflow (on both personal and high-performance computers) to provide continuous V&V testing. LIVVkit is a robust and extensible python package for V&V, which has components for both software V&V (construction and use) and model V&V (mathematics and physics). The model verification component is used, for example, to verify model results against community intercomparisons such as ISMIP-HOM. The model validation component is used, for example, to generate a series of diagnostic plots showing the differences between model results and observations for variables such as thickness, surface elevation, basal topography, surface velocity, surface mass balance, etc. Because many different ice-sheet models are under active development, new validation datasets are becoming available, and new methods of analysing these models are actively being researched, LIVVkit includes a framework to easily extend the model V&V analyses by ice-sheet modelers. This allows modelers and developers to develop evaluations of parameters, implement changes, and quickly see how those changes affect the ice-sheet model and earth system model (when coupled). Furthermore, LIVVkit outputs a portable hierarchical website allowing evaluations to be easily shared, published, and analysed throughout the arctic and Earth system communities.
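The kind of model-versus-observation diagnostic described above can be sketched as follows. This is not LIVVkit's actual API; the fields are synthetic and the statistics are the generic bias/RMSE summaries such a validation component would report.

```python
# Sketch of a model-vs-observation validation diagnostic for one variable
# (ice thickness on a synthetic grid). Fields are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
obs_thickness = rng.uniform(500.0, 3000.0, size=(50, 50))            # observed (m)
model_thickness = obs_thickness + rng.normal(0.0, 50.0, (50, 50))    # modeled (m)

# Difference field plus the summary statistics a diagnostic plot would show.
diff = model_thickness - obs_thickness
print(f"bias = {diff.mean():.1f} m, RMSE = {np.sqrt((diff**2).mean()):.1f} m")
```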
NASA Technical Reports Server (NTRS)
Sebok, Angelia; Wickens, Christopher; Sargent, Robert
2015-01-01
One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic, and they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience to be able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.
Gsflow-py: An integrated hydrologic model development tool
NASA Astrophysics Data System (ADS)
Gardner, M.; Niswonger, R. G.; Morton, C.; Henson, W.; Huntington, J. L.
2017-12-01
Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHM) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorological parameters over the model domain is difficult in complex terrain at the model resolution scale, but is necessary to drive realistic simulations. Historically, development of input data for IHMs has required extensive GIS and computer programming expertise, which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverages, and meteorological distribution over the model domain.
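One of the steps named above, distributing station meteorology over the model grid, can be sketched with simple inverse-distance weighting. This is an illustrative stand-in, not code from the GSFLOW toolkit itself; the function name and station data are invented.

```python
# Illustrative sketch: distribute point (station) meteorology over a regular
# model grid using inverse-distance weighting. Not the actual GSFLOW toolkit.
import numpy as np

def idw_distribute(station_xy, station_values, grid_x, grid_y, power=2.0):
    """Interpolate station values onto a grid by inverse-distance weighting."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    grid = np.zeros_like(gx, dtype=float)
    weight_sum = np.zeros_like(gx, dtype=float)
    for (sx, sy), val in zip(station_xy, station_values):
        dist = np.hypot(gx - sx, gy - sy)
        w = 1.0 / np.maximum(dist, 1e-6) ** power  # guard against zero distance
        grid += w * val
        weight_sum += w
    return grid / weight_sum

# Three hypothetical precipitation gauges: (x, y) locations and mm/day values.
stations = [(0.0, 0.0), (10.0, 0.0), (5.0, 10.0)]
values = [2.0, 4.0, 3.0]
field = idw_distribute(stations, values,
                       np.linspace(0, 10, 11), np.linspace(0, 10, 11))
print(field.min(), field.max())  # interpolated field stays within station range
```

Because IDW forms a convex combination of station values, the gridded field is bounded by the station minimum and maximum, a useful sanity check for driving data.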
Generic domain models in software engineering
NASA Technical Reports Server (NTRS)
Maiden, Neil
1992-01-01
This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.
NASA Technical Reports Server (NTRS)
Tijidjian, Raffi P.
2010-01-01
The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. In an effort to reduce the time spent in the manual process that each TEAMS modeler must perform in the preparation of reporting for model reviews, a new tool has been developed as an aid to models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model, and generate an input/output report pertaining to all of the components. Rules can be automatically validated against the model, with a report generated containing resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.
EPR-based material modelling of soils
NASA Astrophysics Data System (ADS)
Faramarzi, Asaad; Alani, Amir M.
2013-04-01
In the past few decades, as a result of the rapid developments in computational software and hardware, alternative computer aided pattern recognition approaches have been introduced to modelling many engineering problems, including constitutive modelling of materials. The main idea behind pattern recognition systems is that they learn adaptively from experience and extract various discriminants, each appropriate for its purpose. In this work an approach is presented for developing material models for soils based on evolutionary polynomial regression (EPR). EPR is a recently developed hybrid data mining technique that searches for structured mathematical equations (representing the behaviour of a system) using a genetic algorithm and the least-squares method. Stress-strain data from triaxial tests are used to train and develop EPR-based material models for soil. The developed models are compared with some of the well-known conventional material models and it is shown that EPR-based models can provide a better prediction for the behaviour of soils. The main benefits of using EPR-based material models are that they provide a unified approach to constitutive modelling of all materials (i.e., all aspects of material behaviour can be implemented within a unified environment of an EPR model) and that they do not require any arbitrary choice of constitutive (mathematical) models. In EPR-based material models there are no material parameters to be identified. As the model is trained directly from experimental data, EPR-based material models are therefore the shortest route from experimental research (data) to numerical modelling. Another advantage of EPR-based constitutive models is that as more experimental data become available, the quality of the EPR prediction can be improved by learning from the additional data, and therefore, the EPR model can become more effective and robust. The developed EPR-based material models can be incorporated in finite element (FE) analysis.
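The EPR idea combines two steps: a genetic algorithm searches over candidate polynomial term structures, and least squares fits the coefficients of each candidate. The sketch below shows only the least-squares step for one fixed candidate structure, fitted to synthetic "stress-strain" data; it is a simplified illustration, not the EPR implementation from the paper.

```python
# Simplified illustration of the least-squares step inside EPR: given one
# candidate polynomial structure, fit its coefficients to (synthetic) data.
# In full EPR, a genetic algorithm would search over many such structures.
import numpy as np

rng = np.random.default_rng(1)
strain = np.linspace(0.0, 0.1, 50)
stress = 200.0 * strain - 800.0 * strain**2 + rng.normal(0.0, 0.05, 50)

# Candidate structure: stress = a1*strain + a2*strain^2
A = np.column_stack([strain, strain**2])
coeffs, *_ = np.linalg.lstsq(A, stress, rcond=None)
print(f"fitted model: stress = {coeffs[0]:.1f}*strain + {coeffs[1]:.1f}*strain^2")
```

In the GA loop, each candidate structure would be scored by its residual error, and the best structures recombined and mutated until the fit stops improving.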
The Model Life-cycle: Training Module
Model Life-Cycle includes identification of problems & the subsequent development, evaluation, & application of the model. Objectives: define ‘model life-cycle’, explore stages of model life-cycle, & strategies for development, evaluation, & applications.
NASA Technical Reports Server (NTRS)
1977-01-01
The development of a framework and structure for shuttle era unmanned spacecraft projects and the development of a commonality evaluation model is documented. The methodology developed for model utilization in performing cost trades and comparative evaluations for commonality studies is discussed. The model framework consists of categories of activities associated with the spacecraft system's development process. The model structure describes the physical elements to be treated as separate identifiable entities. Cost estimating relationships for subsystem and program-level components were calculated.
Using the NPSS Environment to Model an Altitude Test Facility
NASA Technical Reports Server (NTRS)
Lavelle, Thomas M.; Owen, Albert K.; Huffman, Brian C.
2013-01-01
An altitude test facility was modeled using Numerical Propulsion System Simulation (NPSS). This altitude test facility model represents the most detailed facility model developed in the NPSS architecture. The current paper demonstrates the use of the NPSS system to define the required operating range of a component for the facility. A significant number of additional component models were easily developed to complete the model. Discussed in this paper are the additional components developed and what was done in the development of these components.
Lightweight approach to model traceability in a CASE tool
NASA Astrophysics Data System (ADS)
Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita
2017-07-01
A term "model-driven" is not at all a new buzzword within the ranks of system development community. Nevertheless, the ever increasing complexity of model-driven approaches keeps fueling all kinds of discussions around this paradigm and pushes researchers forward to research and develop new and more effective ways to system development. With the increasing complexity, model traceability, and model management as a whole, becomes indispensable activities of model-driven system development process. The main goal of this paper is to present a conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.
Program management model study
NASA Technical Reports Server (NTRS)
Connelly, J. J.; Russell, J. E.; Seline, J. R.; Sumner, N. R., Jr.
1972-01-01
Two models, a system performance model and a program assessment model, have been developed to assist NASA management in the evaluation of development alternatives for the Earth Observations Program. Two computer models were developed and demonstrated on the Goddard Space Flight Center Computer Facility. Procedures have been outlined to guide the user of the models through specific evaluation processes, and the preparation of inputs describing earth observation needs and earth observation technology. These models are intended to assist NASA in increasing the effectiveness of the overall Earth Observation Program by providing a broader view of system and program development alternatives.
Jobs and Economic Development Impact (JEDI) Model Geothermal User Reference Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, C.; Augustine, C.; Goldberg, M.
2012-09-01
The Geothermal Jobs and Economic Development Impact (JEDI) model, developed through the National Renewable Energy Laboratory (NREL), is one of a family of Excel-based, user-friendly tools that estimate the local economic impacts of power generation projects for a range of conventional and renewable energy technologies; the geothermal model covers constructing and operating hydrothermal and Enhanced Geothermal System (EGS) projects. The JEDI Model Geothermal User Reference Guide was developed to assist users in using and understanding the model. This guide provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted.
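Input-output impact models like JEDI scale direct project spending by economic multipliers to estimate earnings and total output. The sketch below illustrates that multiplier structure only; the function name and multiplier values are invented for illustration and are not taken from the actual JEDI geothermal model.

```python
# Hedged sketch of a multiplier-style economic impact calculation, the general
# structure behind input-output models like JEDI. Multipliers are hypothetical.
def economic_impact(direct_spending, earnings_multiplier, output_multiplier):
    """Scale direct project spending into earnings and total output impacts."""
    return {
        "direct": direct_spending,
        "earnings": direct_spending * earnings_multiplier,
        "total_output": direct_spending * output_multiplier,
    }

impact = economic_impact(direct_spending=50e6,       # $50M construction cost
                         earnings_multiplier=0.35,   # hypothetical value
                         output_multiplier=1.8)      # hypothetical value
print(impact)
```

In the real model, the multipliers come from regional input-output data and differ by cost category (labor, materials, services) rather than applying uniformly.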
Groundwater modeling in integrated water resources management--visions for 2020.
Refsgaard, Jens Christian; Højberg, Anker Lajer; Møller, Ingelise; Hansen, Martin; Søndergaard, Verner
2010-01-01
Groundwater modeling is undergoing a change from traditional stand-alone studies toward being an integrated part of holistic water resources management procedures. This is illustrated by the development in Denmark, where comprehensive national databases for geologic borehole data, groundwater-related geophysical data, geologic models, as well as a national groundwater-surface water model have been established and integrated to support water management. This has enhanced the benefits of using groundwater models. Based on insight gained from this Danish experience, a scientifically realistic scenario for the use of groundwater modeling in 2020 has been developed, in which groundwater models will be a part of sophisticated databases and modeling systems. The databases and numerical models will be seamlessly integrated, and the tasks of monitoring and modeling will be merged. Numerical models for atmospheric, surface water, and groundwater processes will be coupled in one integrated modeling system that can operate at a wide range of spatial scales. Furthermore, the management systems will be constructed with a focus on building credibility of model and data use among all stakeholders and on facilitating a learning process whereby data and models, as well as stakeholders' understanding of the system, are updated to currently available information. The key scientific challenges for achieving this are (1) developing new methodologies for integration of statistical and qualitative uncertainty; (2) mapping geological heterogeneity and developing scaling methodologies; (3) developing coupled model codes; and (4) developing integrated information systems, including quality assurance and uncertainty information that facilitate active stakeholder involvement and learning.
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
ERIC Educational Resources Information Center
Tumthong, Suwut; Piriyasurawong, Pullop; Jeerangsuwan, Namon
2016-01-01
This research proposes a functional competency development model for academic personnel based on international professional qualification standards in computing field and examines the appropriateness of the model. Specifically, the model consists of three key components which are: 1) functional competency development model, 2) blended training…
Quantifying Aluminum Crystal Size Part 2: The Model-Development Sequence
ERIC Educational Resources Information Center
Hjalmarson, Margret; Diefes-Dux, Heidi A.; Bowman, Keith; Zawojewski, Judith S.
2006-01-01
We have designed model-development sequences using a common context to provide authentic problem-solving experiences for first-year students. The model-development sequence takes a model-eliciting activity a step further by engaging students in the exploration and adaptation of a mathematical model (e.g., procedure, algorithm, method) for solving…
JEDI Publications | Jobs and Economic Development Impact Models | NREL
Publications that make use of, or discuss, the JEDI models and their application to economic impact analysis, including: JEDI: Jobs and Economic Development Impact Model (Factsheet), 2015, NREL/FS-5000-64129; Jobs and Economic Development Impact (JEDI) User Reference Guide: Fast Pyrolysis Biorefinery Model, NREL/TP-6A20
Self-Identity Development Model of Oppressed People: Inclusive Model for All?
ERIC Educational Resources Information Center
Highlen, Pamela S.; And Others
The Self-Identity Development Model of Oppressed People (SIDMOP) is a synthesis of several areas of psychology, including the developmental, cross-cultural, and spiritual literatures. SIDMOP provides an all-inclusive model of identity development for oppressed minorities in the United States, regardless of ethnicity. The model was formulated from the…
JEDI Wind Models | Jobs and Economic Development Impact Models | NREL
The Jobs and Economic Development Impacts (JEDI) Wind model allows the user to estimate economic development impacts from wind power generation projects. JEDI Wind includes default information that can be used to run a generic impacts analysis assuming wind industry averages.
Toward a Theoretical Model of Employee Turnover: A Human Resource Development Perspective
ERIC Educational Resources Information Center
Peterson, Shari L.
2004-01-01
This article sets forth the Organizational Model of Employee Persistence, influenced by traditional turnover models and a student attrition model. The model was developed to clarify the impact of organizational practices on employee turnover from a human resource development (HRD) perspective and provide a theoretical foundation for research on…
Risk prediction model: Statistical and artificial neural network approach
NASA Astrophysics Data System (ADS)
Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim
2017-04-01
Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand the suitable approach to, and the development and validation process of, a risk prediction model. A qualitative review of the aims, methods, and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was conducted. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the studies reviewed, artificial neural network approaches to developing prediction models were more accurate than statistical approaches. However, only limited published literature currently discusses which approach is more accurate for risk prediction model development.
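The statistical approach contrasted above can be illustrated on toy data. The following is a minimal sketch of developing and validating a logistic-regression risk model by stochastic gradient descent; all data, coefficients, and thresholds are hypothetical and are not taken from the reviewed studies.

```python
import math
import random

random.seed(0)

def make_data(n):
    # Hypothetical cohort: two risk factors, outcome probability rises with x1 + x2.
    rows = []
    for _ in range(n):
        x1, x2 = random.random(), random.random()
        p = 1 / (1 + math.exp(-(3 * x1 + 3 * x2 - 3)))
        rows.append(((x1, x2), 1 if random.random() < p else 0))
    return rows

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def train_logistic(data, lr=0.5, epochs=200):
    # Plain SGD on the log-loss; err is the gradient w.r.t. the linear score.
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            err = sigmoid(w1 * x1 + w2 * x2 + b) - y
            w1 -= lr * err * x1
            w2 -= lr * err * x2
            b -= lr * err
    return lambda x1, x2: sigmoid(w1 * x1 + w2 * x2 + b)

def accuracy(model, data):
    hits = sum(1 for (x1, x2), y in data if (model(x1, x2) >= 0.5) == (y == 1))
    return hits / len(data)

train, test = make_data(400), make_data(200)   # development and validation sets
model = train_logistic(train)
print(f"held-out accuracy: {accuracy(model, test):.2f}")
```

Validating on a held-out set, as in the final two lines, mirrors the split between model development and validation that the review emphasizes.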
ERIC Educational Resources Information Center
Cheng, Meng-Fei; Lin, Jang-Long
2015-01-01
Understanding the nature of models and engaging in modeling practice have been emphasized in science education. However, few studies discuss the relationships between students' views of scientific models and their ability to develop those models. Hence, this study explores the relationship between students' views of scientific models and their…
USDA-ARS?s Scientific Manuscript database
The environmental modeling community has historically been concerned with the proliferation of models and the effort associated with collective model development tasks (e.g., code generation, data provisioning and transformation, etc.). Environmental modeling frameworks (EMFs) have been developed to...
A demonstrative model of a lunar base simulation on a personal computer
NASA Technical Reports Server (NTRS)
1985-01-01
The initial demonstration model of a lunar base simulation is described. This initial model was developed at the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. Lotus Symphony Version 1.1 software was used to base the demonstration model on a personal computer with an MS-DOS operating system. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop. In addition, the personal computer-based demonstration model defined a modeling structure that could be employed on a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model are planned in the near future.
Janssen, Sander J C; Porter, Cheryl H; Moore, Andrew D; Athanasiadis, Ioannis N; Foster, Ian; Jones, James W; Antle, John M
2017-07-01
Agricultural modeling has long suffered from fragmentation in model implementation. Many models have been developed; there is much redundancy, models are often poorly coupled, model component re-use is rare, and it is frequently difficult to apply models to generate real solutions for the agricultural sector. To improve this situation, we argue that an open, self-sustained, and committed community is required to co-develop agricultural models and associated data and tools as a common resource. Such a community can benefit from recent developments in information and communications technology (ICT). We examine how such developments can be leveraged to design and implement the next generation of data, models, and decision support tools for agricultural production systems. Our objective is to assess relevant technologies for their maturity, expected development, and potential to benefit the agricultural modeling community. The technologies considered encompass methods for collaborative development and for involving stakeholders and users in development in a transdisciplinary manner. Our qualitative evaluation suggests that, as an overall research challenge, the interoperability of data sources, modular granular open models, reference data sets for applications, and specific user requirements analysis methodologies need to be addressed to allow agricultural modeling to enter the big data era. This will enable much higher analytical capacities and the integrated use of new data sources. Overall, agricultural systems modeling needs to rapidly adopt and absorb state-of-the-art data and ICT technologies with a focus on the needs of beneficiaries and on facilitating those who develop applications of their models. This adoption requires the widespread uptake of a set of best practices as standard operating procedures.
Scripting MODFLOW model development using Python and FloPy
Bakker, Mark; Post, Vincent E. A.; Langevin, Christian D.; Hughes, Joseph D.; White, Jeremy; Starn, Jeffrey; Fienen, Michael N.
2016-01-01
Graphical user interfaces (GUIs) are commonly used to construct and postprocess numerical groundwater flow and transport models. Scripting model development with the programming language Python is presented here as an alternative approach. One advantage of Python is that there are many packages available to facilitate the model development process, including packages for plotting, array manipulation, optimization, and data analysis. For MODFLOW-based models, the FloPy package was developed by the authors to construct model input files, run the model, and read and plot simulation results. Use of Python with the available scientific packages and FloPy facilitates data exploration, alternative model evaluations, and model analyses that can be difficult to perform with GUIs. Furthermore, Python scripts are a complete, transparent, and repeatable record of the modeling process. The approach is introduced with a simple FloPy example to create and postprocess a MODFLOW model. A more complicated capture-fraction analysis with a real-world model is presented to demonstrate the types of analyses that can be performed using Python and FloPy.
Artificial heart development program. Volume I. System development. Phase III summary report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1977-01-01
The report documents efforts and results in the development of the power system portions of a calf-implantable model of a nuclear-powered artificial heart. The primary objective in developing the implantable model was to solve the packaging problems for total system implantation. The power systems portion is physically that portion of the implantable model between the Pu-238 heat sources and the blood pump ventricles. The work performed had two parallel themes. The first of these was the development of an integrated implantable model for bench and animal experiments, plus design effort on a more advanced model. The second was research and development on components of the system, done in conjunction with the development of the implantable model, to provide technology for incorporation into advanced models, plus support to implantations, at the University of Utah, of the system's blood pumping elements when driven by electric motor. The efforts and results of implantable model development are covered mainly in the text of the report. The research and development efforts and results are reported primarily in the appendices (Vol. 2).
Developing Models to Convey Understanding Rather than Merely Knowledge of the Methods of Science.
ERIC Educational Resources Information Center
Pinet, Paul Raymond
1989-01-01
A teaching method in which students develop models of simple systems to learn critical thinking and scientific methodology is presented. Discussed are the stages of the development of a model and the possible outcomes of the use of model development for students. (CW)
Brake, M. R. W.
2015-02-17
Impact between metallic surfaces is a phenomenon that is ubiquitous in the design and analysis of mechanical systems. To model this phenomenon, a new formulation for frictional elastic–plastic contact between two surfaces is developed. The formulation considers both frictional, oblique contact (of which normal, frictionless contact is a limiting case) and strain hardening effects. The constitutive model for normal contact is developed as two contiguous loading domains: the elastic regime and a transitionary region in which the plastic response of the materials develops and the elastic response abates. For unloading, the constitutive model is based on an elastic process. Moreover, the normal contact model is assumed to couple only one-way with the frictional/tangential contact model, which results in the normal contact model being independent of the frictional effects. Frictional, tangential contact is modeled using a microslip model that is developed to consider the pressure distribution that develops from the elastic–plastic normal contact. This model is validated through comparisons with experimental results reported in the literature, and is demonstrated to be significantly more accurate than 10 other normal contact models and three other tangential contact models found in the literature.
NASA Astrophysics Data System (ADS)
Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.
2014-12-01
Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment if not designed for it. An example is implementing different types of models in the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise, and it may take a developer months to learn LIS and the model software structure. Debugging and testing of the model implementation is also time-consuming when LIS or the model is not fully understood. This time spent is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. A general model interface was designed to retrieve the forcing inputs, parameters, and state variables needed by the model and to provide state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development efforts need only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models, and code templates defined for this general model interface can be re-used with any specific model. Therefore, the model implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It accepts model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80-90% of the development load is reduced.
In this presentation, the automated model implementation approach is described along with the LIS programming interfaces, the general model interface, and five case studies: a regression model, Noah-MP, FASST, SAC-HTET/SNOW-17, and FLake. These models vary in complexity and software structure. We also describe how these complexities were overcome using this approach, along with results of model benchmarks within LIS.
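The general-model-interface idea above can be sketched compactly. The actual toolkit generates FORTRAN 90 wrappers for LIS; the sketch below re-expresses the same contract in Python with entirely hypothetical names (ModelWrapper, run_step, and the leaky-bucket stand-in model are illustrative, not the LIS interface).

```python
class ModelWrapper:
    """Uniform contract: take forcings, parameters, and state in;
    return updated state and outputs. The driver sees only this API."""
    def run_step(self, forcing, params, state):
        raise NotImplementedError

class BucketModel(ModelWrapper):
    """Trivial stand-in model: a leaky-bucket water balance."""
    def run_step(self, forcing, params, state):
        storage = state["storage"] + forcing["precip"]
        runoff = params["k"] * storage          # linear-reservoir outflow
        return {"storage": storage - runoff}, {"runoff": runoff}

def driver(model, forcings, params, state):
    # The same loop works for any wrapped model, which is what makes
    # template-generated wrappers sufficient for automatic integration.
    outputs = []
    for f in forcings:
        state, out = model.run_step(f, params, state)
        outputs.append(out["runoff"])
    return state, outputs

state, runoff = driver(BucketModel(),
                       [{"precip": 10.0}, {"precip": 0.0}],
                       {"k": 0.5}, {"storage": 0.0})
```

Because the driver depends only on the interface, swapping in a different wrapped model changes no driver code, which is the property the code-template approach exploits.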
Neural Networks for Hydrological Modeling Tool for Operational Purposes
NASA Astrophysics Data System (ADS)
Bhatt, Divya; Jain, Ashu
2010-05-01
Hydrological models are useful in many water resources applications such as flood control, irrigation and drainage, hydropower generation, water supply, and erosion and sediment control. Estimates of runoff are needed in many water resources planning, design, development, operation, and maintenance activities. Runoff is generally computed using rainfall-runoff models, and computer-based hydrologic models have become popular for obtaining hydrological forecasts and for managing water systems. The Rainfall-Runoff Library (RRL) is computer software developed by the Cooperative Research Centre for Catchment Hydrology (CRCCH), Australia, consisting of five different conceptual rainfall-runoff models, and it has been used in many water resources applications in Australia. Recently, soft artificial intelligence tools such as Artificial Neural Networks (ANNs) have become popular for research purposes but have not been adopted in operational hydrological forecasting. There is a strong need to develop ANN models based on real catchment data and compare them with the conceptual models actually in use in real catchments. In this paper, the results from an investigation on the use of the RRL and ANNs are presented. Of the five conceptual models in the RRL toolkit, the SimHyd model has been used. A Genetic Algorithm (GA) has been used as the optimizer in the RRL to calibrate the SimHyd model, with trial-and-error procedures employed to arrive at the best values of the various parameters involved in the GA optimizer. The results obtained from the best configuration of the SimHyd model are presented here. A feed-forward neural network structure trained by the back-propagation algorithm has been adopted to develop the ANN models. The daily rainfall and runoff data derived from Bird Creek Basin, Oklahoma, USA, have been employed to develop all the models included here.
A wide range of error statistics have been used to evaluate the performance of all the models developed in this study. The ANN models developed consistently outperformed the conceptual model developed in this study. The results obtained in this study indicate that the ANNs can be extremely useful tools for modeling the complex rainfall-runoff process in real catchments. The ANNs should be adopted in real catchments for hydrological modeling and forecasting. It is hoped that more research will be carried out to compare the performance of ANN model with the conceptual models actually in use at catchment scales. It is hoped that such efforts may go a long way in making the ANNs more acceptable by the policy makers, water resources decision makers, and traditional hydrologists.
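The feed-forward/back-propagation approach used above can be sketched in a few dozen lines. The following minimal network (one input, four tanh hidden units, linear output) is trained on a hypothetical nonlinear rainfall-runoff relation; the data, network size, and learning rate are illustrative and have no connection to the Bird Creek models.

```python
import math
import random

random.seed(1)

# Hypothetical rainfall -> runoff pairs: runoff responds nonlinearly to rainfall.
data = [(r / 10.0, (r / 10.0) ** 2) for r in range(11)]

H = 4                                               # hidden units
w1 = [random.uniform(-1, 1) for _ in range(H)]      # input -> hidden weights
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]      # hidden -> output weights
b2 = 0.0

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return h, sum(w2[j] * h[j] for j in range(H)) + b2

def mse():
    return sum((forward(x)[1] - y) ** 2 for x, y in data) / len(data)

before = mse()
lr = 0.05
for _ in range(2000):                               # plain back-propagation (SGD)
    for x, y in data:
        h, out = forward(x)
        d_out = 2 * (out - y)                       # d(loss)/d(output)
        for j in range(H):
            d_h = d_out * w2[j] * (1 - h[j] ** 2)   # tanh derivative
            w2[j] -= lr * d_out * h[j]
            w1[j] -= lr * d_h * x
            b1[j] -= lr * d_h
        b2 -= lr * d_out
after = mse()
```

Comparing `before` and `after` plays the role of the error statistics used in the study: the training loop should drive the mean squared error well below that of the untrained network.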
Van Holsbeke, C; Ameye, L; Testa, A C; Mascilini, F; Lindqvist, P; Fischerova, D; Frühauf, F; Fransis, S; de Jonge, E; Timmerman, D; Epstein, E
2014-05-01
To develop and validate strategies, using new ultrasound-based mathematical models, for the prediction of high-risk endometrial cancer and compare them with strategies using previously developed models or the use of preoperative grading only. Women with endometrial cancer were prospectively examined using two-dimensional (2D) and three-dimensional (3D) gray-scale and color Doppler ultrasound imaging. More than 25 ultrasound, demographic and histological variables were analyzed. Two logistic regression models were developed: one 'objective' model using mainly objective variables; and one 'subjective' model including subjective variables (i.e. subjective impression of myometrial and cervical invasion, preoperative grade and demographic variables). The following strategies were validated: a one-step strategy using only preoperative grading and two-step strategies using preoperative grading as the first step and one of the new models, subjective assessment or previously developed models as a second step. One-hundred and twenty-five patients were included in the development set and 211 were included in the validation set. The 'objective' model retained preoperative grade and minimal tumor-free myometrium as variables. The 'subjective' model retained preoperative grade and subjective assessment of myometrial invasion. On external validation, the performance of the new models was similar to that on the development set. Sensitivity for the two-step strategy with the 'objective' model was 78% (95% CI, 69-84%) at a cut-off of 0.50, 82% (95% CI, 74-88%) for the strategy with the 'subjective' model and 83% (95% CI, 75-88%) for that with subjective assessment. Specificity was 68% (95% CI, 58-77%), 72% (95% CI, 62-80%) and 71% (95% CI, 61-79%) respectively. The two-step strategies detected up to twice as many high-risk cases as preoperative grading only. The new models had a significantly higher sensitivity than did previously developed models, at the same specificity. 
Two-step strategies with 'new' ultrasound-based models predict high-risk endometrial cancers with good accuracy and do this better than do previously developed models. Copyright © 2013 ISUOG. Published by John Wiley & Sons Ltd.
A toolbox and record for scientific models
NASA Technical Reports Server (NTRS)
Ellman, Thomas
1994-01-01
Computational science presents a host of challenges for the field of knowledge-based software design. Scientific computation models are difficult to construct. Models constructed by one scientist are easily misapplied by other scientists to problems for which they are not well-suited. Finally, models constructed by one scientist are difficult for others to modify or extend to handle new types of problems. Construction of scientific models actually involves much more than the mechanics of building a single computational model. In the course of developing a model, a scientist will often test a candidate model against experimental data or against a priori expectations. Test results often lead to revisions of the model and a consequent need for additional testing. During a single model development session, a scientist typically examines a whole series of alternative models, each using different simplifying assumptions or modeling techniques. A useful scientific software design tool must support these aspects of the model development process as well. In particular, it should propose and carry out tests of candidate models. It should analyze test results and identify models and parts of models that must be changed. It should determine what types of changes can potentially cure a given negative test result. It should organize candidate models, test data, and test results into a coherent record of the development process. Finally, it should exploit the development record for two purposes: (1) automatically determining the applicability of a scientific model to a given problem; (2) supporting revision of a scientific model to handle a new type of problem. Existing knowledge-based software design tools must be extended in order to provide these facilities.
NASA Astrophysics Data System (ADS)
Bahtiar; Rahayu, Y. S.; Wasis
2018-01-01
This research aims to produce the P3E learning model to improve students' critical thinking skills. The developed model, named P3E, consists of four stages: organization, inquiry, presentation, and evaluation. This development research follows the development stages of Kemp. The design of the wide-scale try-out used a pretest-posttest group design, conducted in grade X of the 2016/2017 academic year. The analysis of the results of this development research covers three aspects: validity, practicality, and effectiveness of the developed model. The research results showed that: (1) the P3E learning model was valid according to experts, with an average value of 3.7; (2) completion of the syntax of the developed learning model reached 98.09% and 94.39% at the two schools, based on the observers' assessments, which shows that the developed model is practical to implement; and (3) the developed model is effective for improving students' critical thinking skills, although the n-gain of the students' critical thinking skills was 0.54, in the moderate category. Based on these results, it can be concluded that the developed P3E learning model is suitable for improving students' critical thinking skills.
Risk prediction models of breast cancer: a systematic review of model performances.
Anothaisintawee, Thunyarat; Teerawattananon, Yot; Wiratkapun, Chollathip; Kasamesup, Vijj; Thakkinstian, Ammarin
2012-05-01
An increasing number of risk prediction models have been developed for estimating the risk of breast cancer in individual women. However, the performance of these models is questionable. We therefore conducted a study to systematically review previous risk prediction models. The results of this review help to identify the most reliable model and indicate the strengths and weaknesses of each model to guide future model development. We searched MEDLINE (PubMed) from 1949 and EMBASE (Ovid) from 1974 until October 2010. Observational studies that constructed models using regression methods were selected, and information about model development and performance was extracted. Twenty-five out of 453 studies were eligible. Of these, 18 developed prediction models and 7 validated existing prediction models. Up to 13 variables were included in the models, and sample sizes for each study ranged from 550 to 2,404,636. Internal validation was performed for four models, while five models had external validation. The Gail and the Rosner and Colditz models were the significant models that were subsequently modified by other scholars. Calibration performance of most models was fair to good (expected/observed ratio: 0.87-1.12), but discriminatory accuracy was poor to fair in both internal validation (concordance statistic: 0.53-0.66) and external validation (concordance statistic: 0.56-0.63). Most models yielded relatively poor discrimination in both internal and external validation. This poor discriminatory accuracy might be due to a lack of knowledge about risk factors, heterogeneous subtypes of breast cancer, and different distributions of risk factors across populations. In addition, the concordance statistic itself is insensitive for measuring improvements in discrimination; newer methods such as the net reclassification index should therefore be considered to evaluate the improvement in performance of a newly developed model.
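The two performance measures reported in the review, calibration as an expected/observed (E/O) ratio and discrimination as a concordance statistic (c-statistic), can be computed in a few lines. The predictions and outcomes below are hypothetical, chosen only to make the arithmetic easy to check.

```python
def e_o_ratio(probs, outcomes):
    # Calibration: expected events (sum of predicted risks) over observed events.
    return sum(probs) / sum(outcomes)

def c_statistic(probs, outcomes):
    # Discrimination: the fraction of case/non-case pairs in which the case
    # received the higher predicted risk (ties count one half).
    cases = [p for p, y in zip(probs, outcomes) if y == 1]
    controls = [p for p, y in zip(probs, outcomes) if y == 0]
    wins = sum(1.0 if c > n else 0.5 if c == n else 0.0
               for c in cases for n in controls)
    return wins / (len(cases) * len(controls))

probs = [0.9, 0.7, 0.6, 0.4, 0.3, 0.1]   # hypothetical predicted risks
outcomes = [1, 1, 0, 1, 0, 0]            # hypothetical observed events
```

For this toy data the model is perfectly calibrated on average (E/O = 1.0) yet imperfectly discriminating (c = 8/9), illustrating why the review reports the two measures separately.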
Training Module on the Development of Best Modeling Practices
This module continues the fundamental concepts outlined in the previous modules. Objectives are to identify the ‘best modeling practices’ and strategies for the Development Stage of the model life-cycle and define the steps of model development.
SUMMA and Model Mimicry: Understanding Differences Among Land Models
NASA Astrophysics Data System (ADS)
Nijssen, B.; Nearing, G. S.; Ou, G.; Clark, M. P.
2016-12-01
Model inter-comparison and model ensemble experiments suffer from an inability to explain the mechanisms behind differences in model outcomes. We can clearly demonstrate that the models are different, but we cannot necessarily identify the reasons why, because most models exhibit myriad differences in process representations, model parameterizations, model parameters and numerical solution methods. This inability to identify the reasons for differences in model performance hampers our understanding and limits model improvement, because we cannot easily identify the most promising paths forward. We have developed the Structure for Unifying Multiple Modeling Alternatives (SUMMA) to allow for controlled experimentation with model construction, numerical techniques, and parameter values and therefore isolate differences in model outcomes to specific choices during the model development process. In developing SUMMA, we recognized that hydrologic models can be thought of as individual instantiations of a master modeling template that is based on a common set of conservation equations for energy and water. Given this perspective, SUMMA provides a unified approach to hydrologic modeling that integrates different modeling methods into a consistent structure with the ability to instantiate alternative hydrologic models at runtime. Here we employ SUMMA to revisit a previous multi-model experiment and demonstrate its use for understanding differences in model performance. Specifically, we implement SUMMA to mimic the spread of behaviors exhibited by the land models that participated in the Protocol for the Analysis of Land Surface Models (PALS) Land Surface Model Benchmarking Evaluation Project (PLUMBER) and draw conclusions about the relative performance of specific model parameterizations for water and energy fluxes through the soil-vegetation continuum. 
SUMMA's ability to mimic the spread of model ensembles and the behavior of individual models can be an important tool in focusing model development and improvement efforts.
A Knowledge-Based Representation Scheme for Environmental Science Models
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Dungan, Jennifer L.; Lum, Henry, Jr. (Technical Monitor)
1994-01-01
One of the primary methods available for studying environmental phenomena is the construction and analysis of computational models. We have been studying how artificial intelligence techniques can be applied to assist in the development and use of environmental science models within the context of NASA-sponsored activities. We have identified several high-utility areas as potential targets for research and development: model development; data visualization, analysis, and interpretation; model publishing and reuse, training and education; and framing, posing, and answering questions. Central to progress on any of the above areas is a representation for environmental models that contains a great deal more information than is present in a traditional software implementation. In particular, a traditional software implementation is devoid of any semantic information that connects the code with the environmental context that forms the background for the modeling activity. Before we can build AI systems to assist in model development and usage, we must develop a representation for environmental models that adequately describes a model's semantics and explicitly represents the relationship between the code and the modeling task at hand. We have developed one such representation in conjunction with our work on the SIGMA (Scientists' Intelligent Graphical Modeling Assistant) environment. The key feature of the representation is that it provides a semantic grounding for the symbols in a set of modeling equations by linking those symbols to an explicit representation of the underlying environmental scenario.
An Integrated High Resolution Hydrometeorological Modeling Testbed using LIS and WRF
NASA Technical Reports Server (NTRS)
Kumar, Sujay V.; Peters-Lidard, Christa D.; Eastman, Joseph L.; Tao, Wei-Kuo
2007-01-01
Scientists have made great strides in modeling physical processes that represent various weather and climate phenomena. Many modeling systems that represent the major earth system components (the atmosphere, land surface, and ocean) have been developed over the years. However, developing advanced Earth system applications that integrate these independently developed modeling systems has remained a daunting task due to limitations in computer hardware and software. Recently, efforts such as the Earth System Modeling Framework (ESMF) and Assistance for Land Modeling Activities (ALMA) have focused on developing standards, guidelines, and computational support for coupling earth system model components. In this article, the development of a coupled land-atmosphere hydrometeorological modeling system that adopts these community interoperability standards is described. The land component is represented by the Land Information System (LIS), developed by scientists at the NASA Goddard Space Flight Center. The Weather Research and Forecasting (WRF) model, a mesoscale numerical weather prediction system, is used as the atmospheric component. LIS includes several community land surface models that can be executed at spatial scales as fine as 1 km. The data management capabilities in LIS enable the direct use of high resolution satellite and observation data for modeling. Similarly, WRF includes several parameterizations and schemes for modeling radiation, microphysics, the PBL, and other processes. Thus the integrated LIS-WRF system facilitates several multi-model studies of land-atmosphere coupling that can be used to advance earth system studies.
Demonstration of reduced-order urban scale building energy models
Heidarinejad, Mohammad; Mattise, Nicholas; Dahlhausen, Matthew; ...
2017-09-08
The aim of this study is to demonstrate a developed framework to rapidly create urban scale reduced-order building energy models using a systematic summary of the simplifications required for the representation of building exteriors and thermal zones. These urban scale reduced-order models rely on the contribution of influential variables to the internal, external, and system thermal loads. The OpenStudio Application Programming Interface (API) serves as a tool to automate the process of model creation and demonstrate the developed framework. The results of this study show that the accuracy of the developed reduced-order building energy models varies only up to 10% with the selection of different thermal zones. In addition, to assess the complexity of the developed reduced-order building energy models, this study develops a novel framework to quantify the complexity of building energy models. Consequently, this study empowers building energy modelers to quantify their building energy models systematically in order to report model complexity alongside model accuracy. An exhaustive analysis of four university campuses suggests that urban neighborhood buildings lend themselves to simplified typical shapes. Specifically, building energy modelers can utilize the developed typical shapes to represent more than 80% of the U.S. buildings documented in the CBECS database. One main benefit of this developed framework is the opportunity for different models, including airflow and solar radiation models, to share the same exterior representation, allowing unified data exchange. Altogether, the results of this study have implications for large-scale modeling of buildings in support of urban energy consumption analyses or assessment of a large number of alternative solutions in support of retrofit decision-making in the building industry.
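To make the reduced-order idea concrete, the sketch below simulates a single-node lumped R-C zone model, the simplest member of the reduced-order family. It is written from first principles in Python and is not the authors' OpenStudio-based framework; the resistance, capacitance, internal gain, and outdoor temperatures are all hypothetical.

```python
import math

def simulate(t_out, t0=20.0, R=0.005, C=2.0e6, q_int=500.0, dt=3600.0):
    """Hourly zone air temperature from a single-capacitance energy balance:
       C * dT/dt = (T_out - T) / R + Q_internal, stepped with explicit Euler.
       R in K/W, C in J/K, q_int in W, dt in s."""
    temps, t = [], t0
    for to in t_out:
        t += dt / C * ((to - t) / R + q_int)
        temps.append(t)
    return temps

# 24 hours of hypothetical outdoor temperatures (sinusoidal day).
t_out = [10.0 + 5.0 * math.sin(2 * math.pi * h / 24) for h in range(24)]
temps = simulate(t_out)
```

With these values the explicit step is stable (dt / (R * C) = 0.36 < 1), and the zone cools from its 20 degree start toward the outdoor temperature offset by the internal gains, which is the qualitative behavior a reduced-order model is expected to reproduce.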
Demonstration of reduced-order urban scale building energy models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heidarinejad, Mohammad; Mattise, Nicholas; Dahlhausen, Matthew
The aim of this study is to demonstrate a developed framework to rapidly create urban scale reduced-order building energy models using a systematic summary of the simplifications required for the representation of building exterior and thermal zones. These urban scale reduced-order models rely on the contribution of influential variables to the internal, external, and system thermal loads. OpenStudio Application Programming Interface (API) serves as a tool to automate the process of model creation and demonstrate the developed framework. The results of this study show that the accuracy of the developed reduced-order building energy models varies only up to 10% withmore » the selection of different thermal zones. In addition, to assess complexity of the developed reduced-order building energy models, this study develops a novel framework to quantify complexity of the building energy models. Consequently, this study empowers the building energy modelers to quantify their building energy model systematically in order to report the model complexity alongside the building energy model accuracy. An exhaustive analysis on four university campuses suggests that the urban neighborhood buildings lend themselves to simplified typical shapes. Specifically, building energy modelers can utilize the developed typical shapes to represent more than 80% of the U.S. buildings documented in the CBECS database. One main benefits of this developed framework is the opportunity for different models including airflow and solar radiation models to share the same exterior representation, allowing a unifying exchange data. Altogether, the results of this study have implications for a large-scale modeling of buildings in support of urban energy consumption analyses or assessment of a large number of alternative solutions in support of retrofit decision-making in the building industry.« less
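The idea of reporting a complexity score alongside model accuracy can be illustrated with a minimal sketch. The weighted-count score, its weights, and the zone/surface/system counts below are invented for illustration and are not the framework actually developed in the study:

```python
def model_complexity(n_zones, n_surfaces, n_systems):
    """Toy complexity score for a building energy model:
    a weighted count of thermal zones, exterior surfaces,
    and HVAC systems (weights are illustrative assumptions)."""
    return 1.0 * n_zones + 0.1 * n_surfaces + 2.0 * n_systems

# A detailed model vs. a reduced-order model of the same building
detailed = model_complexity(n_zones=40, n_surfaces=600, n_systems=12)
reduced = model_complexity(n_zones=5, n_surfaces=6, n_systems=1)

print(detailed)  # 40 + 60 + 24 = 124.0
print(reduced)   # 5 + 0.6 + 2 = 7.6
```

Reporting such a score next to, say, a 10% accuracy band lets a reader judge whether a simpler model buys enough accuracy for its cost, which is the trade-off the study is quantifying.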
Software Tools For Building Decision-support Models For Flood Emergency Situations
NASA Astrophysics Data System (ADS)
Garrote, L.; Molina, M.; Ruiz, J. M.; Mosquera, J. C.
The SAIDA decision-support system was developed by the Spanish Ministry of the Environment to provide assistance to decision-makers during flood situations. SAIDA has been tentatively implemented in two test basins: Jucar and Guadalhorce, and the Ministry is currently planning to have it implemented in all major Spanish basins in a few years' time. During the development cycle of SAIDA, the need for providing assistance to end-users in model definition and calibration was clearly identified. System developers usually emphasise abstraction and generality with the goal of providing a versatile software environment. End users, on the other hand, require concretion and specificity to adapt the general model to their local basins. As decision-support models become more complex, the gap between model developers and users gets wider: who takes care of model definition, calibration and validation? Initially, model developers perform these tasks, but the scope is usually limited to a few small test basins. Before the model enters the operational stage, end users must get involved in model construction and calibration, in order to gain confidence in the model recommendations. However, getting the users involved in these activities is a difficult task. The goal of this research is to develop representation techniques for simulation and management models in order to define, develop and validate a mechanism, supported by a software environment, oriented to provide assistance to the end-user in building decision models for the prediction and management of river floods in real time. The system is based on three main building blocks: a library of simulators of the physical system, an editor to assist the user in building simulation models, and a machine learning method to calibrate decision models based on the simulation models provided by the user.
The increasing use of tissue dosimetry estimated using pharmacokinetic models in chemical risk assessments in multiple countries creates the need to develop internationally recognized good modelling practices. These practices would facilitate sharing of models and model eva...
Challenges and opportunities for integrating lake ecosystem modelling approaches
Mooij, Wolf M.; Trolle, Dennis; Jeppesen, Erik; Arhonditsis, George; Belolipetsky, Pavel V.; Chitamwebwa, Deonatus B.R.; Degermendzhy, Andrey G.; DeAngelis, Donald L.; Domis, Lisette N. De Senerpont; Downing, Andrea S.; Elliott, J. Alex; Ruberto, Carlos Ruberto; Gaedke, Ursula; Genova, Svetlana N.; Gulati, Ramesh D.; Hakanson, Lars; Hamilton, David P.; Hipsey, Matthew R.; Hoen, Jochem 't; Hulsmann, Stephan; Los, F. Hans; Makler-Pick, Vardit; Petzoldt, Thomas; Prokopkin, Igor G.; Rinke, Karsten; Schep, Sebastiaan A.; Tominaga, Koji; Van Dam, Anne A.; Van Nes, Egbert H.; Wells, Scott A.; Janse, Jan H.
2010-01-01
A large number and wide variety of lake ecosystem models have been developed and published during the past four decades. We identify two challenges for making further progress in this field. One such challenge is to avoid developing more models largely following the concept of others ('reinventing the wheel'). The other challenge is to avoid focusing on only one type of model, while ignoring new and diverse approaches that have become available ('having tunnel vision'). In this paper, we aim at improving the awareness of existing models and knowledge of concurrent approaches in lake ecosystem modelling, without covering all possible model tools and avenues. First, we present a broad variety of modelling approaches. To illustrate these approaches, we give brief descriptions of rather arbitrarily selected sets of specific models. We deal with static models (steady state and regression models), complex dynamic models (CAEDYM, CE-QUAL-W2, Delft 3D-ECO, LakeMab, LakeWeb, MyLake, PCLake, PROTECH, SALMO), structurally dynamic models and minimal dynamic models. We also discuss a group of approaches that could all be classified as individual based: super-individual models (Piscator, Charisma), physiologically structured models, stage-structured models and trait-based models. We briefly mention genetic algorithms, neural networks, Kalman filters and fuzzy logic. Thereafter, we zoom in, as an in-depth example, on the multi-decadal development and application of the lake ecosystem model PCLake and related models (PCLake Metamodel, Lake Shira Model, IPH-TRIM3D-PCLake). In the discussion, we argue that while the historical development of each approach and model is understandable given its 'leading principle', there are many opportunities for combining approaches. We take the point of view that a single 'right' approach does not exist and should not be strived for. 
Instead, multiple modelling approaches, applied concurrently to a given problem, can help develop an integrative view on the functioning of lake ecosystems. We end with a set of specific recommendations that may be of help in the further development of lake ecosystem models.
OBO to UML: Support for the development of conceptual models in the biomedical domain.
Waldemarin, Ricardo C; de Farias, Cléver R G
2018-04-01
A conceptual model abstractly defines a number of concepts and their relationships for the purposes of understanding and communication. Once a conceptual model is available, it can also be used as a starting point for the development of a software system. The development of conceptual models using the Unified Modeling Language (UML) facilitates the representation of modeled concepts and allows software developers to directly reuse these concepts in the design of a software system. The OBO Foundry represents the most relevant collaborative effort towards the development of ontologies in the biomedical domain. The development of UML conceptual models in the biomedical domain may benefit from the use of domain-specific semantics and notation. Further, the development of these models may also benefit from the reuse of knowledge contained in OBO ontologies. This paper investigates the support for the development of conceptual models in the biomedical domain using UML as a conceptual modeling language and using the support provided by the OBO Foundry for the development of biomedical ontologies, namely the entity kind and relationship type definitions provided by the Basic Formal Ontology (BFO) and the OBO Core Relations Ontology (OBO Core), respectively. Further, the paper investigates the support for the reuse of biomedical knowledge currently available in OBO flat file format ontologies in the development of these conceptual models. The paper describes a UML profile for the OBO Core Relations Ontology, which basically defines a number of stereotypes to represent BFO entity kinds and OBO Core relationship type definitions. The paper also presents a support toolset consisting of a graphical editor named OBO-RO Editor, which directly supports the development of UML models using the extensions defined by our profile, and a command-line tool named OBO2UML, which directly converts an OBO flat file format ontology into a UML model. Copyright © 2018 Elsevier Inc. All rights reserved.
Park, Byung-Jung; Lord, Dominique; Wu, Lingtao
2016-10-28
This study aimed to investigate the relative performance of two models (the negative binomial (NB) model and the two-component finite mixture of negative binomial models (FMNB-2)) in terms of developing crash modification factors (CMFs). Crash data on rural multilane divided highways in California and Texas were modeled with the two models, and crash modification functions (CMFunctions) were derived. The resultant CMFunction estimated from the FMNB-2 model showed several good properties over that from the NB model. First, the safety effect of a covariate was better reflected by the CMFunction developed using the FMNB-2 model, since the model takes into account the differential responsiveness of crash frequency to the covariate. Second, the CMFunction derived from the FMNB-2 model is able to capture nonlinear relationships between covariates and safety. Finally, following the same concept as for NB models, the combined CMFs of multiple treatments were estimated using the FMNB-2 model. The results indicated that they are not simply the product of the single-treatment CMFs (i.e., their safety effects are not independent under FMNB-2 models). Adjustment Factors (AFs) were then developed. It is revealed that the current Highway Safety Manual method could over- or under-estimate the combined CMFs under particular combinations of covariates. Safety analysts are encouraged to consider using FMNB-2 models for developing CMFs and AFs. Copyright © 2016 Elsevier Ltd. All rights reserved.
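The CMFunction concept for a count model can be sketched for a single covariate: for a fitted coefficient, the CMF is the multiplicative change in expected crash frequency relative to a baseline condition. The coefficient value and covariate values below are invented for illustration, not estimates from the study:

```python
import math

def cmfunction_nb(beta, x, x_base):
    """Crash modification function from a fitted count-model
    coefficient: CMF(x) = exp(beta * (x - x_base)), i.e. the
    multiplicative change in expected crashes vs. the baseline."""
    return math.exp(beta * (x - x_base))

# Illustrative (assumed) coefficient: each extra unit of the
# covariate reduces expected crashes by about 3% (beta = -0.03).
cmf = cmfunction_nb(beta=-0.03, x=8.0, x_base=6.0)
print(round(cmf, 3))  # exp(-0.06) ≈ 0.942
```

A finite-mixture model such as FMNB-2 yields component-specific coefficients, so its CMFunction is a weighted combination rather than a single exponential, which is how it captures differential responsiveness across crash-frequency regimes.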
Hybrid Model of IRT and Latent Class Models.
ERIC Educational Resources Information Center
Yamamoto, Kentaro
This study developed a hybrid of item response theory (IRT) models and latent class models, which combined the strengths of each type of model. The primary motivation for developing the new model is to describe characteristics of examinees' knowledge at the time of the examination. Hence, the application of the model lies mainly in so-called…
The Past, Present, and Future of Computational Models of Cognitive Development
ERIC Educational Resources Information Center
Schlesinger, Matthew; McMurray, Bob
2012-01-01
Does modeling matter? We address this question by providing a broad survey of the computational models of cognitive development that have been proposed and studied over the last three decades. We begin by noting the advantages and limitations of computational models. We then describe four key dimensions across which models of development can be…
Koh, Keumseok; Reno, Rebecca; Hyder, Ayaz
2018-04-01
Recent advances in computing resources have increased interest in systems modeling and population health. While group model building (GMB) has been effectively applied in developing system dynamics models (SD), few studies have used GMB for developing an agent-based model (ABM). This article explores the use of a GMB approach to develop an ABM focused on food insecurity. In our GMB workshops, we modified a set of the standard GMB scripts to develop and validate an ABM in collaboration with local experts and stakeholders. Based on this experience, we learned that GMB is a useful collaborative modeling platform for modelers and community experts to address local population health issues. We also provide suggestions for increasing the use of the GMB approach to develop rigorous, useful, and validated ABMs.
Turbulence Modeling: Progress and Future Outlook
NASA Technical Reports Server (NTRS)
Marvin, Joseph G.; Huang, George P.
1996-01-01
Progress in the development of the hierarchy of turbulence models for Reynolds-averaged Navier-Stokes codes used in aerodynamic applications is reviewed. Steady progress is demonstrated, but transfer of the modeling technology has not kept pace with the development and demands of the computational fluid dynamics (CFD) tools. An examination of the process of model development leads to recommendations for a mid-course correction involving close coordination between modelers, CFD developers, and application engineers. In instances where the old process is changed and cooperation enhanced, timely transfer is realized. A turbulence modeling information database is proposed to refine the process and open it to greater participation among modeling and CFD practitioners.
Stone, Wesley W.; Gilliom, Robert J.
2012-01-01
Watershed Regressions for Pesticides (WARP) models, previously developed for atrazine at the national scale, are improved for application to the United States (U.S.) Corn Belt region by developing region-specific models that include watershed characteristics that are influential in predicting atrazine concentration statistics within the Corn Belt. WARP models for the Corn Belt (WARP-CB) were developed for annual maximum moving-average (14-, 21-, 30-, 60-, and 90-day durations) and annual 95th-percentile atrazine concentrations in streams of the Corn Belt region. The WARP-CB models accounted for 53 to 62% of the variability in the various concentration statistics among the model-development sites. Model predictions were within a factor of 5 of the observed concentration statistic for over 90% of the model-development sites. The WARP-CB residuals and uncertainty are lower than those of the National WARP model for the same sites. Although atrazine-use intensity is the most important explanatory variable in the National WARP models, it is not a significant variable in the WARP-CB models. The WARP-CB models provide improved predictions for Corn Belt streams draining watersheds with atrazine-use intensities of 17 kg/km2 of watershed area or greater.
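The factor-of-5 accuracy criterion reported for the WARP-CB models is simple to state precisely: a prediction counts as acceptable if the ratio of predicted to observed concentration lies between 1/5 and 5. A minimal sketch (the prediction/observation pairs are invented):

```python
def within_factor(pred, obs, factor=5.0):
    """True if the prediction is within a multiplicative factor
    of the observation (both values must be positive)."""
    ratio = pred / obs
    return 1.0 / factor <= ratio <= factor

# Hypothetical (predicted, observed) concentration pairs
pairs = [(2.0, 1.0), (0.3, 1.0), (12.0, 2.0), (0.1, 1.0)]
hits = sum(within_factor(p, o) for p, o in pairs)
print(hits / len(pairs))  # 0.5 — 2 of 4 pairs within a factor of 5
```

A multiplicative criterion like this suits concentration data spanning orders of magnitude, where an additive error band would be meaningless at the low end.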
MOEMS Modeling Using the Geometrical Matrix Toolbox
NASA Technical Reports Server (NTRS)
Wilson, William C.; Atkinson, Gary M.
2005-01-01
New technologies such as MicroOptoElectro-Mechanical Systems (MOEMS) require new modeling tools. These tools must simultaneously model the optical, electrical, and mechanical domains and the interactions between these domains. To facilitate rapid prototyping of these new technologies an optical toolbox has been developed for modeling MOEMS devices. The toolbox models are constructed using MATLAB's dynamical simulator, Simulink. Modeling toolboxes will allow users to focus their efforts on system design and analysis as opposed to developing component models. This toolbox was developed to facilitate rapid modeling and design of a MOEMS based laser ultrasonic receiver system.
Developing physiologically-based pharmacokinetic (PBPK) models for chemicals can be resource-intensive, as neither chemical-specific parameters nor in vivo pharmacokinetic data are easily available for model construction. Previously developed, well-parameterized, and thoroughly-v...
Directory of Energy Information Administration model abstracts 1988
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-01-01
This directory contains descriptions about each basic and auxiliary model, including the title, acronym, purpose, and type, followed by more detailed information on characteristics, uses, and requirements. For developing models, limited information is provided. Sources for additional information are identified. Included in this directory are 44 EIA models active as of February 1, 1988; 16 of which operate on personal computers. Models that run on personal computers are identified by ''PC'' as part of the acronyms. The main body of this directory is an alphabetical listing of all basic and auxiliary EIA models. Appendix A identifies major EIA modeling systems and the models within these systems, and Appendix B identifies EIA models by type (basic or auxiliary). Appendix C lists developing models and contact persons for those models. A basic model is one designated by the EIA Administrator as being sufficiently important to require sustained support and public scrutiny. An auxiliary model is one designated by the EIA Administrator as being used only occasionally in analyses, and therefore requires minimal levels of documentation. A developing model is one designated by the EIA Administrator as being under development and yet of sufficient interest to require a basic level of documentation at a future date. EIA also leases models developed by proprietary software vendors. Documentation for these ''proprietary'' models is the responsibility of the companies from which they are leased. EIA has recently leased models from Chase Econometrics, Inc., Data Resources, Inc. (DRI), the Oak Ridge National Laboratory (ORNL), and Wharton Econometric Forecasting Associates (WEFA). Leased models are not abstracted here.
Sager, Jennifer E.; Yu, Jingjing; Ragueneau-Majlessi, Isabelle
2015-01-01
Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, has not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms “PBPK” and “physiologically based pharmacokinetic model” to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for development of best-practice guidelines. PMID:26296709
Geospatial Data Science Modeling
NREL uses geospatial data science modeling to develop innovative models and tools for energy professionals, project developers, and consumers.
Prognostic models for complete recovery in ischemic stroke: a systematic review and meta-analysis.
Jampathong, Nampet; Laopaiboon, Malinee; Rattanakanokchai, Siwanon; Pattanittum, Porjai
2018-03-09
Prognostic models have been increasingly developed to predict complete recovery in ischemic stroke. However, questions arise about the performance characteristics of these models. The aim of this study was to systematically review and synthesize performance of existing prognostic models for complete recovery in ischemic stroke. We searched journal publications indexed in PUBMED, SCOPUS, CENTRAL, ISI Web of Science and OVID MEDLINE from inception until 4 December, 2017, for studies designed to develop and/or validate prognostic models for predicting complete recovery in ischemic stroke patients. Two reviewers independently examined titles and abstracts, and assessed whether each study met the pre-defined inclusion criteria and also independently extracted information about model development and performance. We evaluated validation of the models by medians of the area under the receiver operating characteristic curve (AUC) or c-statistic and calibration performance. We used a random-effects meta-analysis to pool AUC values. We included 10 studies with 23 models developed from elderly patients with a moderately severe ischemic stroke, mainly in three high income countries. Sample sizes for each study ranged from 75 to 4441. Logistic regression was the only analytical strategy used to develop the models. The number of various predictors varied from one to 11. Internal validation was performed in 12 models with a median AUC of 0.80 (95% CI 0.73 to 0.84). One model reported good calibration. Nine models reported external validation with a median AUC of 0.80 (95% CI 0.76 to 0.82). Four models showed good discrimination and calibration on external validation. The pooled AUC of the two validation models of the same developed model was 0.78 (95% CI 0.71 to 0.85). The performance of the 23 models found in the systematic review varied from fair to good in terms of internal and external validation. 
Further models should be developed with internal and external validation in low- and middle-income countries.
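Pooling AUCs across validation studies, as done above for the two validations of the same developed model, is typically a random-effects meta-analysis. A minimal DerSimonian-Laird sketch follows; the AUC values and standard errors are invented for illustration, not taken from the review:

```python
def dersimonian_laird(effects, ses):
    """Random-effects pooled estimate (DerSimonian-Laird).
    effects: per-study estimates (e.g. AUCs); ses: their standard errors."""
    w = [1.0 / se ** 2 for se in ses]                  # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    # Cochran's Q heterogeneity statistic
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)      # between-study variance
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]      # random-effects weights
    return sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)

# Two hypothetical external validations of one prognostic model
pooled = dersimonian_laird([0.76, 0.80], [0.03, 0.04])
print(round(pooled, 3))
```

With only two studies the between-study variance estimate is unstable, which is one reason pooled AUCs from small meta-analyses carry wide confidence intervals.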
A Neuroconstructivist Model of Past Tense Development and Processing
ERIC Educational Resources Information Center
Westermann, Gert; Ruh, Nicolas
2012-01-01
We present a neural network model of learning and processing the English past tense that is based on the notion that experience-dependent cortical development is a core aspect of cognitive development. During learning the model adds and removes units and connections to develop a task-specific final architecture. The model provides an integrated…
Development of a National HRD Strategy Model: Cases of India and China
ERIC Educational Resources Information Center
Alagaraja, Meera; Wang, Jia
2012-01-01
National human resource development (NHRD) literature describes the importance of developing human resources at the national level and presents several models. These models are primarily concerned with the national contexts of developing and underdeveloped countries. In contrast, the NHRD models in the non-HRD literature focus primarily on…
NASA Technical Reports Server (NTRS)
Connelly, L. C.
1977-01-01
The mission planning processor is a user-oriented tool for consumables management and is part of the total consumables subsystem management concept. The approach to be used in developing a working model of the mission planning processor is documented. The approach includes top-down design, structured programming techniques, and application of NASA approved software development standards. This development approach: (1) promotes cost effective software development, (2) enhances the quality and reliability of the working model, (3) encourages the sharing of the working model through a standard approach, and (4) promotes portability of the working model to other computer systems.
User modeling for distributed virtual environment intelligent agents
NASA Astrophysics Data System (ADS)
Banks, Sheila B.; Stytz, Martin R.
1999-07-01
This paper emphasizes the requirement for user modeling by presenting the necessary information to motivate the need for and use of user modeling for intelligent agent development. The paper will present information on our current intelligent agent development program, the Symbiotic Information Reasoning and Decision Support (SIRDS) project. We then discuss the areas of intelligent agents and user modeling, which form the foundation of the SIRDS project. Included in the discussion of user modeling are its major components, which are cognitive modeling and behavioral modeling. We next motivate the need for and use of a methodology for developing user models that encompasses work within cognitive task analysis. We close the paper by drawing conclusions from our current intelligent agent research project and discuss avenues of future research in the utilization of user modeling for the development of intelligent agents for virtual environments.
Fundamental Travel Demand Model Example
NASA Technical Reports Server (NTRS)
Hanssen, Joel
2010-01-01
Instances of transportation models are abundant and detailed "how to" instruction is available in the form of transportation software help documentation. The purpose of this paper is to look at the fundamental inputs required to build a transportation model by developing an example passenger travel demand model. The example model reduces the scale to a manageable size for the purpose of illustrating the data collection and analysis required before the first step of the model begins. This aspect of the model development would not reasonably be discussed in software help documentation (it is assumed the model developer comes prepared). Recommendations are derived from the example passenger travel demand model to suggest future work regarding the data collection and analysis required for a freight travel demand model.
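A core input-to-output step in such a passenger travel demand model is trip distribution, commonly a gravity model: trips from each origin zone are allocated to destinations in proportion to their attractiveness, discounted by travel cost. A minimal sketch, with all zone productions, attractions, and costs invented for illustration:

```python
import math

def gravity_trips(productions, attractions, cost, beta=0.1):
    """Singly-constrained gravity model: trips from zone i to zone j
    are proportional to attractions[j] * exp(-beta * cost[i][j]),
    scaled so each origin's trips sum to its production."""
    trips = []
    for i, p in enumerate(productions):
        weights = [attractions[j] * math.exp(-beta * cost[i][j])
                   for j in range(len(attractions))]
        total = sum(weights)
        trips.append([p * w / total for w in weights])
    return trips

P = [100, 200]            # trips produced in each origin zone
A = [150, 150]            # attractiveness of each destination zone
C = [[5, 10], [10, 5]]    # travel cost (e.g. minutes) between zones
T = gravity_trips(P, A, C)

# Each row sums to that zone's production by construction
print([round(sum(row)) for row in T])  # [100, 200]
```

The data collection the paper emphasizes (zone definitions, trip generation rates, cost skims) is exactly what populates `P`, `A`, and `C` before this step can run.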
Guariguata, L; Guell, C; Samuels, T A; Rouwette, E A J A; Woodcock, J; Hambleton, I R; Unwin, N
2016-10-26
Diabetes is highly prevalent in the Caribbean, associated with a high morbidity and mortality and is a recognised threat to economic and social development. Heads of Government in the Caribbean Community came together in 2007 and declared their commitment to reducing the burden of non-communicable diseases (NCDs), including diabetes, by calling for a multi-sectoral, systemic response. To facilitate the development of effective policies, policymakers are being engaged in the development and use of a system dynamics (SD) model of diabetes for Caribbean countries. Previous work on a diabetes SD model from the United States of America (USA) is being adapted to a local context for three countries in the region using input from stakeholders, a review of existing qualitative and quantitative data, and collection of new qualitative data. Three country models will be developed using one-on-one stakeholder engagement and iterative revision. An inter-country model will also be developed following a model-building workshop. Models will be compared to each other and to the USA model. The inter-country model will be used to simulate policies identified as priorities by stakeholders and to develop targets for prevention and control. The model and model-building process will be evaluated by stakeholders and a manual developed for use in other high-burden developing regions. SD has been applied with success for health policy development in high-income country settings. The utility of SD in developing countries as an aid to policy decision-making related to NCDs has not been tested. This study represents the first of its kind.
Ahmad, Zulfiqar; Ashraf, Arshad; Fryar, Alan; Akhter, Gulraiz
2011-02-01
The integration of the Geographic Information System (GIS) with groundwater modeling and satellite remote sensing capabilities has provided an efficient way of analyzing and monitoring groundwater behavior and its associated land conditions. A 3-dimensional finite element model (Feflow) has been used for regional groundwater flow modeling of Upper Chaj Doab in Indus Basin, Pakistan. The approach of using GIS techniques that partially fulfill the data requirements and define the parameters of existing hydrologic models was adopted. The numerical groundwater flow model is developed to configure the groundwater equipotential surface, hydraulic head gradient, and estimation of the groundwater budget of the aquifer. GIS is used for spatial database development, integration with a remote sensing, and numerical groundwater flow modeling capabilities. The thematic layers of soils, land use, hydrology, infrastructure, and climate were developed using GIS. The Arcview GIS software is used as additive tool to develop supportive data for numerical groundwater flow modeling and integration and presentation of image processing and modeling results. The groundwater flow model was calibrated to simulate future changes in piezometric heads from the period 2006 to 2020. Different scenarios were developed to study the impact of extreme climatic conditions (drought/flood) and variable groundwater abstraction on the regional groundwater system. The model results indicated a significant response in watertable due to external influential factors. The developed model provides an effective tool for evaluating better management options for monitoring future groundwater development in the study area.
NASA Astrophysics Data System (ADS)
Malard, J. J.; Adamowski, J. F.; Wang, L. Y.; Rojas, M.; Carrera, J.; Gálvez, J.; Tuy, H. A.; Melgar-Quiñonez, H.
2015-12-01
The modelling of the impacts of climate change on agriculture requires the inclusion of socio-economic factors. However, while cropping models and economic models of agricultural systems are common, dynamically coupled socio-economic-biophysical models have not met with as much success. A promising methodology for modelling the socioeconomic aspects of coupled natural-human systems is participatory system dynamics modelling, in which stakeholders develop mental maps of the socio-economic system that are then turned into quantified simulation models. This methodology has been successful in the water resources management field. However, while the stocks and flows of water resources have also been represented within the system dynamics modelling framework and thus coupled to the socioeconomic portion of the model, cropping models are ill-suited for such reformulation. In addition, most of these system dynamics models were developed without stakeholder input, limiting the scope for the adoption and implementation of their results. We therefore propose a new methodology for analyzing the impacts of climate variability and change on agroecosystems, which uses dynamically coupled system dynamics (socio-economic) and biophysical (cropping) models to represent both physical and socioeconomic aspects of the agricultural system, using two case studies (intensive market-based agricultural development versus subsistence crop-based development) from rural Guatemala. The system dynamics model component is developed with relevant governmental and NGO stakeholders from rural and agricultural development in the case study regions and includes such processes as education, poverty and food security. Common variables with the cropping models (yield and agricultural management choices) are then used to dynamically couple the two models together, allowing for the analysis of the agroeconomic system's response to and resilience against various climatic and socioeconomic shocks.
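The dynamic coupling described above can be sketched as a time-stepped exchange of shared variables: each year a crop component maps rainfall and management inputs to yield, and a system-dynamics component turns yield into income, which feeds back into next year's inputs. Every relationship and number below is an invented toy, not the authors' model:

```python
def coupled_run(rain):
    """Toy coupled loop: a 'crop model' step computes yield from
    rainfall and inputs; an 'SD model' step updates an income stock
    from yield, which sets next year's input level (all assumed)."""
    inputs, income = 1.0, 0.0
    history = []
    for r in rain:
        yield_t = r * (1.0 + 0.5 * inputs)      # crop-model step
        income = 0.8 * income + 10.0 * yield_t  # SD stock update
        inputs = min(2.0, 0.05 * income)        # feedback to management
        history.append(round(yield_t, 2))
    return history

# Three years of normalized rainfall, including a dry final year
print(coupled_run([1.0, 1.0, 0.5]))
```

The essential design point is that yield and management are the only variables crossing the interface, which is exactly the coupling strategy the abstract proposes.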
Experimental psychiatric illness and drug abuse models: from human to animal, an overview.
Edwards, Scott; Koob, George F
2012-01-01
Preclinical animal models have supported much of the recent rapid expansion of neuroscience research and have facilitated critical discoveries that undoubtedly benefit patients suffering from psychiatric disorders. This overview serves as an introduction for the following chapters describing both in vivo and in vitro preclinical models of psychiatric disease components and briefly describes models related to drug dependence and affective disorders. Although there are no perfect animal models of any psychiatric disorder, models do exist for many elements of each disease state or stage. In many cases, the development of certain models is essentially restricted to the human clinical laboratory domain for the purpose of maximizing validity, whereas the use of in vitro models may best represent an adjunctive, well-controlled means to model specific signaling mechanisms associated with psychiatric disease states. The data generated by preclinical models are only as valid as the model itself, and the development and refinement of animal models for human psychiatric disorders continues to be an important challenge. Collaborative relationships between basic neuroscience and clinical modeling could greatly benefit the development of new and better models, in addition to facilitating medications development.
Computational Modeling of Space Physiology
NASA Technical Reports Server (NTRS)
Lewandowski, Beth E.; Griffin, Devon W.
2016-01-01
The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP developed models to provide insights into space flight related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the models.
Aspinall, Richard
2004-08-01
This paper develops an approach to modelling land use change that links model selection and multi-model inference with empirical models and GIS. Land use change is frequently studied, and understanding gained, through a process of modelling that is an empirical analysis of documented changes in land cover or land use patterns. The approach here is based on analysis and comparison of multiple models of land use patterns using model selection and multi-model inference. The approach is illustrated with a case study of rural housing as it has developed for part of Gallatin County, Montana, USA. A GIS contains the location of rural housing on a yearly basis from 1860 to 2000. The database also documents a variety of environmental and socio-economic conditions. A general model of settlement development describes the evolution of drivers of land use change and their impacts in the region. This model is used to develop a series of different models reflecting drivers of change at different periods in the history of the study area. These period-specific models represent a series of multiple working hypotheses describing (a) the effects of spatial variables as a representation of social, economic and environmental drivers of land use change, and (b) temporal changes in the effects of the spatial variables as the drivers of change evolve over time. Logistic regression is used to calibrate and interpret these models, and the models are then compared and evaluated with model selection techniques. Results show that different models are 'best' for the different periods. The different models for the different periods demonstrate that models are not invariant over time, which presents challenges for the validation and testing of empirical models.
The research demonstrates (i) model selection as a mechanism for discriminating among many plausible models that describe land cover or land use patterns, (ii) inference from a set of models rather than from a single model, (iii) that models can be developed from hypothesised relationships grounded in the underlying and proximate causes of change, and (iv) that models are not invariant over time.
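The model-selection and multi-model-inference step described above can be sketched with AIC scores and Akaike weights. The period names, log-likelihoods, and parameter counts below are hypothetical illustrations, not values from the study:

```python
import math

# Akaike-based multi-model inference over a set of candidate (e.g. logistic
# regression) models. All numbers here are made up for illustration.
models = {
    "settlement_1860_1900": {"loglik": -412.3, "k": 4},
    "ranching_1900_1950":   {"loglik": -398.7, "k": 6},
    "amenity_1950_2000":    {"loglik": -395.1, "k": 9},
}

def aic(loglik, k):
    """Akaike Information Criterion: AIC = 2k - 2 ln(L)."""
    return 2.0 * k - 2.0 * loglik

scores = {name: aic(m["loglik"], m["k"]) for name, m in models.items()}
best = min(scores.values())

# Akaike weights express the relative support for each candidate model,
# enabling inference from the whole model set rather than a single "best" one.
rel = {name: math.exp(-0.5 * (s - best)) for name, s in scores.items()}
total = sum(rel.values())
weights = {name: r / total for name, r in rel.items()}
```

The weights sum to one, and ranking by weight reproduces the AIC ranking; models within a few AIC units of the best retain non-negligible weight, which is what justifies inference from the set rather than from a single model.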
A Collective Study on Modeling and Simulation of Resistive Random Access Memory
NASA Astrophysics Data System (ADS)
Panda, Debashis; Sahu, Paritosh Piyush; Tseng, Tseung Yuen
2018-01-01
In this work, we provide a comprehensive discussion of the various models proposed for the design and description of resistive random access memory (RRAM), a nascent technology that relies heavily on accurate models to develop efficient working designs and to standardize its implementation across devices. This review provides detailed information regarding the various physical methodologies considered for developing models of RRAM devices. It covers all the important models reported to date and elucidates their features and limitations. Various additional effects and anomalies arising from memristive systems have been addressed, and the solutions provided by the models to these problems have been shown as well. All the fundamental concepts of RRAM model development, such as device operation, switching dynamics, and current-voltage relationships, are covered in detail in this work. Popular models proposed by Chua, HP Labs, Yakopcic, TEAM, Stanford/ASU, Ielmini, Berco-Tseng, and many others have been compared and analyzed extensively on various parameters. The working and implementations of window functions like Joglekar, Biolek, Prodromakis, etc. have been presented and compared as well. New, well-defined modeling concepts have been discussed which increase the applicability and accuracy of the models. The use of these concepts brings forth several improvements in the existing models, which have been enumerated in this work. Following the template presented, highly accurate models can be developed, which will vastly help future model developers and the modeling community.
Satoshi Hirabayashi; Chuck Kroll; David Nowak
2011-01-01
The Urban Forest Effects-Deposition model (UFORE-D) was developed with a component-based modeling approach. Functions of the model were separated into components that are responsible for user interface, data input/output, and core model functions. Taking advantage of the component-based approach, three UFORE-D applications were developed: a base application to estimate...
Son, Yeongkwon; Osornio-Vargas, Álvaro R; O'Neill, Marie S; Hystad, Perry; Texcalac-Sangrador, José L; Ohman-Strickland, Pamela; Meng, Qingyu; Schwander, Stephan
2018-05-17
The Mexico City Metropolitan Area (MCMA) is one of the largest and most populated urban environments in the world and experiences high air pollution levels. Our objective was to develop models that estimate pollutant concentrations at fine spatiotemporal scales and provide improved air pollution exposure assessments for health studies in Mexico City. We developed fine spatiotemporal land use regression (LUR) models for PM2.5, PM10, O3, NO2, CO and SO2 using mixed effect models with the Least Absolute Shrinkage and Selection Operator (LASSO). Hourly traffic density was included as a temporal variable in addition to meteorological and holiday variables. Models of hourly, daily, monthly, 6-monthly and annual averages were developed and evaluated using traditional and novel indices. The developed spatiotemporal LUR models yielded predicted concentrations in good spatial and temporal agreement with measured pollutant levels, except for hourly PM2.5, PM10 and SO2. Most of the LUR models met performance goals based on the standardized indices. LUR models with temporal scales greater than one hour were successfully developed using mixed effect models with LASSO and showed superior model performance compared to earlier LUR models, especially for time scales of a day or longer. The newly developed LUR models will be further refined with ongoing Mexico City air pollution sampling campaigns to improve personal exposure assessments. Copyright © 2018. Published by Elsevier B.V.
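The LASSO at the heart of the variable-selection step can be sketched as plain coordinate descent with soft-thresholding. This is a bare sketch of the penalty mechanism only, not the paper's mixed-effects LUR implementation, and the objective form below is one common convention:

```python
def soft_threshold(z, lam):
    """Proximal operator of the L1 penalty: shrink toward zero, clip at zero."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

def lasso(X, y, lam, n_iter=300):
    """Minimize (1/2n)||y - Xb||^2 + lam*||b||_1 by coordinate descent.
    X is a list of rows; predictors are assumed roughly standardized."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual leaving out feature j
            r = [y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            zj = sum(X[i][j] ** 2 for i in range(n)) / n
            b[j] = soft_threshold(rho, lam) / zj
    return b
```

The soft-threshold step is what makes LASSO a variable selector: a predictor whose partial correlation falls below the penalty is driven to exactly zero, while retained coefficients are only slightly shrunk.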
A model for a knowledge-based system's life cycle
NASA Technical Reports Server (NTRS)
Kiss, Peter A.
1990-01-01
The American Institute of Aeronautics and Astronautics has initiated a Committee on Standards for Artificial Intelligence. Presented here are the initial efforts of one of the working groups of that committee. The purpose here is to present a candidate model for the development life cycle of Knowledge Based Systems (KBS). The intent is for the model to be used by the Aerospace Community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are detailed, as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.
Quasi steady-state aerodynamic model development for race vehicle simulations
NASA Astrophysics Data System (ADS)
Mohrfeld-Halterman, J. A.; Uddin, M.
2016-01-01
Presented in this paper is a procedure to develop a high fidelity quasi steady-state aerodynamic model for use in race car vehicle dynamic simulations. Developed to fit quasi steady-state wind tunnel data, the aerodynamic model is regressed against three independent variables: front ground clearance, rear ride height, and yaw angle. An initial dual range model is presented and then further refined to reduce the model complexity while maintaining a high level of predictive accuracy. The model complexity reduction decreases the required amount of wind tunnel data thereby reducing wind tunnel testing time and cost. The quasi steady-state aerodynamic model for the pitch moment degree of freedom is systematically developed in this paper. This same procedure can be extended to the other five aerodynamic degrees of freedom to develop a complete six degree of freedom quasi steady-state aerodynamic model for any vehicle.
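The regression step can be sketched as an ordinary least-squares fit of a linear-in-parameters aerodynamic map against the three independent variables named above. The basis terms, the single-range form, and all numbers here are hypothetical placeholders; the paper's actual dual-range model uses its own term set:

```python
def features(hf, hr, yaw):
    """Candidate regression basis in front clearance (hf), rear ride height
    (hr), and yaw angle (illustrative terms, not the paper's)."""
    return [1.0, hf, hr, yaw * yaw, hf * hr]

def gauss_solve(A, rhs):
    """Solve A x = rhs by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit(samples):
    """Least squares via the normal equations: (F'F) b = F'y.
    Each sample is (hf, hr, yaw, measured_coefficient)."""
    F = [features(hf, hr, yaw) for hf, hr, yaw, _ in samples]
    y = [obs for _, _, _, obs in samples]
    p = len(F[0])
    FtF = [[sum(F[i][a] * F[i][c] for i in range(len(F))) for c in range(p)]
           for a in range(p)]
    Fty = [sum(F[i][a] * y[i] for i in range(len(F))) for a in range(p)]
    return gauss_solve(FtF, Fty)
```

Dropping basis terms from `features` is the kind of complexity reduction the paper describes: fewer terms mean fewer wind tunnel points are needed to identify the map.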
Artificial Intelligence Software Engineering (AISE) model
NASA Technical Reports Server (NTRS)
Kiss, Peter A.
1990-01-01
The American Institute of Aeronautics and Astronautics has initiated a committee on standards for Artificial Intelligence. Presented are the initial efforts of one of the working groups of that committee. A candidate model is presented for the development life cycle of knowledge based systems (KBSs). The intent is for the model to be used by the aerospace community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are shown and detailed as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.
2013-09-01
partner agencies and nations, detects, tracks, and interdicts illegal drug trafficking in this region. In this thesis, we develop probability models, based on intelligence inputs, to generate a spatial-temporal heat map specifying the … These more analytically tractable models also serve to complement and vet complicated simulations.
Mathematical modeling of efficacy and safety for anticancer drugs clinical development.
Lavezzi, Silvia Maria; Borella, Elisa; Carrara, Letizia; De Nicolao, Giuseppe; Magni, Paolo; Poggesi, Italo
2018-01-01
Drug attrition in oncology clinical development is higher than in other therapeutic areas. In this context, pharmacometric modeling represents a useful tool to explore drug efficacy in earlier phases of clinical development, anticipating overall survival using quantitative model-based metrics. Furthermore, modeling approaches can be used to characterize earlier the safety and tolerability profile of drug candidates, and, thus, the risk-benefit ratio and the therapeutic index, supporting the design of optimal treatment regimens and accelerating the whole process of clinical drug development. Areas covered: Herein, the most relevant mathematical models used in clinical anticancer drug development during the last decade are described. Less recent models were considered in the review if they represent a standard for the analysis of certain types of efficacy or safety measures. Expert opinion: Several mathematical models have been proposed to predict overall survival from earlier endpoints and validate their surrogacy in demonstrating drug efficacy in place of overall survival. An increasing number of mathematical models have also been developed to describe the safety findings. Modeling has been extensively used in anticancer drug development to individualize dosing strategies based on patient characteristics, and design optimal dosing regimens balancing efficacy and safety.
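A simplified tumor-growth-inhibition model of the kind used to link early tumor-size dynamics to efficacy endpoints can be sketched as follows. The structure (exponential growth plus an exponentially waning drug-kill term) follows a common published form, but every parameter value here is hypothetical, not from any cited trial:

```python
import math

def tumor_size(t_weeks, ts0=100.0, kl=0.02, kd0=0.06, lam=0.05, dt=0.01):
    """Euler integration of dTS/dt = KL*TS - KD0*exp(-lam*t)*TS, where the
    kill term decays as resistance to treatment develops."""
    ts, t = ts0, 0.0
    while t < t_weeks - 1e-12:
        ts += dt * (kl * ts - kd0 * math.exp(-lam * t) * ts)
        t += dt
    return ts
```

With these placeholder parameters the trajectory shows early shrinkage followed by regrowth, the profile from which model-based metrics (e.g. tumor size at a landmark time) are derived and then linked to overall survival.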
Development and calibration of the statewide land use-transport model
DOT National Transportation Integrated Search
1999-02-12
The TRANUS package has been used to develop an integrated land use-transport model at the statewide level. It is based upon a specific modeling approach described by de la Barra (1989,1995). For the prototype statewide model developed during this pro...
Ku-Band rendezvous radar performance computer simulation model
NASA Technical Reports Server (NTRS)
Magnusson, H. G.; Goff, M. F.
1984-01-01
All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescope.
Ku-Band rendezvous radar performance computer simulation model
NASA Astrophysics Data System (ADS)
Magnusson, H. G.; Goff, M. F.
1984-06-01
All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescope.
NASA Astrophysics Data System (ADS)
Okawa, Tsutomu; Kaminishi, Tsukasa; Hirabayashi, Syuichi; Suzuki, Ryo; Mitsui, Hiroyasu; Koizumi, Hisao
The business of an enterprise is so closely related to its information system that business activities are difficult without it. A system design technique is needed that considers the business process well and enables quick system development. In addition, demands on development cost are more severe than before. To cope with this situation, the modeling technology named BPM (Business Process Management/Modeling) is drawing attention and becoming important as a key technology. BPM is a technology for modeling business activities as business processes and visualizing them to improve business efficiency. However, no general methodology exists for developing an information system using the analysis results of BPM, and only a few development cases have been reported. This paper proposes an information system development method combining business process modeling with executable modeling. We describe a guideline that supports consistency and efficiency of development, and a framework that enables the information system to be developed from the model. We have prototyped an information system with the proposed method, and our experience has shown that the methodology is valuable.
ERIC Educational Resources Information Center
Doren, Bonnie; Flannery, K. Brigid; Lombardi, Allison
2012-01-01
The purpose of the study was to examine the potential efficacy of a professional development training model targeting IEP case managers of transition-age students. A training model was developed and a pilot study conducted to understand the promise of the model to improve the development of critical components within the IEP document that support…
Bornhorst, Ellen R; Tang, Juming; Sablani, Shyam S; Barbosa-Cánovas, Gustavo V; Liu, Fang
2017-07-01
Development and selection of model foods is a critical part of microwave thermal process development, simulation validation, and optimization. Previously developed model foods for pasteurization process evaluation utilized Maillard reaction products as the time-temperature integrators, which resulted in similar temperature sensitivity among the models. The aim of this research was to develop additional model foods based on different time-temperature integrators, determine their dielectric properties and color change kinetics, and validate the optimal model food in hot water and microwave-assisted pasteurization processes. Color, quantified using the a* value, was selected as the time-temperature indicator for green pea and garlic puree model foods. Results showed 915 MHz microwaves had a greater penetration depth into the green pea model food than into the garlic. a* value reaction rates for the green pea model were approximately 4 times slower than in the garlic model food; slower reaction rates were preferred for the application of the model food in this study, that is, quality evaluation for a target process of 90 °C for 10 min at the cold spot. Pasteurization validation used the green pea model food, and results showed quantifiable color differences between the unheated control, hot water pasteurization, and the microwave-assisted thermal pasteurization system. Both model foods developed in this research could be utilized for quality assessment and optimization of various thermal pasteurization processes. © 2017 Institute of Food Technologists®.
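A color-based time-temperature integrator of this kind is often described by first-order kinetics. The sketch below assumes that form at a constant hold temperature; the endpoint values and rate constants are illustrative only, while the roughly fourfold rate difference between the two foods follows the abstract:

```python
import math

def a_star(t_min, a0=-12.0, a_inf=2.0, k=0.20):
    """First-order color-change kinetics at constant temperature:
    a*(t) = a_inf + (a0 - a_inf) * exp(-k*t). Parameters are hypothetical."""
    return a_inf + (a0 - a_inf) * math.exp(-k * t_min)

K_GARLIC = 0.20               # hypothetical rate constant, 1/min
K_GREEN_PEA = K_GARLIC / 4.0  # approximately 4x slower, per the abstract
```

A slower rate constant leaves more unreacted dynamic range over a 90 °C / 10 min target process, which is consistent with the study's preference for the green pea model in quality evaluation.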
Scripting MODFLOW Model Development Using Python and FloPy.
Bakker, M; Post, V; Langevin, C D; Hughes, J D; White, J T; Starn, J J; Fienen, M N
2016-09-01
Graphical user interfaces (GUIs) are commonly used to construct and postprocess numerical groundwater flow and transport models. Scripting model development with the programming language Python is presented here as an alternative approach. One advantage of Python is that there are many packages available to facilitate the model development process, including packages for plotting, array manipulation, optimization, and data analysis. For MODFLOW-based models, the FloPy package was developed by the authors to construct model input files, run the model, and read and plot simulation results. Use of Python with the available scientific packages and FloPy facilitates data exploration, alternative model evaluations, and model analyses that can be difficult to perform with GUIs. Furthermore, Python scripts are a complete, transparent, and repeatable record of the modeling process. The approach is introduced with a simple FloPy example to create and postprocess a MODFLOW model. A more complicated capture-fraction analysis with a real-world model is presented to demonstrate the types of analyses that can be performed using Python and FloPy. © 2016, National Ground Water Association.
Development of an Instructional Quality Assurance Model in Nursing Science
ERIC Educational Resources Information Center
Ajpru, Haruthai; Pasiphol, Shotiga; Wongwanich, Suwimon
2011-01-01
The purpose of this study was to develop an instructional quality assurance model in nursing science. The study was divided into 3 phases: (1) to study the information for instructional quality assurance model development; (2) to develop an instructional quality assurance model in nursing science; and (3) to audit and assess the developed…
ERIC Educational Resources Information Center
Whitten-Andrews, Jeanie
2016-01-01
The social change model has proven to be an effective and widely utilized model for assisting college students in leadership development toward positive social change. However, while this particular model gives much-needed attention to the process of development leading to social change, it fails to acknowledge the external factors which significantly…
The Development of a New Model of Solar EUV Irradiance Variability
NASA Technical Reports Server (NTRS)
Warren, Harry; Wagner, William J. (Technical Monitor)
2002-01-01
The goal of this research project is the development of a new model of solar EUV (Extreme Ultraviolet) irradiance variability. The model is based on combining differential emission measure distributions derived from spatially and spectrally resolved observations of active regions, coronal holes, and the quiet Sun with full-disk solar images. An initial version of this model was developed with earlier funding from NASA. The new version of the model developed with this research grant will incorporate observations from SoHO as well as updated compilations of atomic data. These improvements will make the model calculations much more accurate.
NASA Astrophysics Data System (ADS)
Kuznetsova, Maria
The Community Coordinated Modeling Center (CCMC, http://ccmc.gsfc.nasa.gov) was established at the dawn of the new millennium as a long-term, flexible solution to the problem of transitioning progress in space environment modeling to operational space weather forecasting. CCMC hosts an expanding collection of state-of-the-art space weather models developed by the international space science community. Over the years the CCMC has acquired unique experience in preparing complex models and model chains for operational environments, and in developing and maintaining custom displays and powerful web-based systems and tools ready to be used by researchers, space weather service providers and decision makers. In support of the space weather needs of NASA users, CCMC is developing highly tailored applications and services that target specific orbits or locations in space, and partnering with NASA mission specialists on linking CCMC space environment modeling with impacts on biological and technological systems in space. Confidence assessment of model predictions is an essential element of space environment modeling. CCMC facilitates interaction between model owners and users in defining physical parameters and metrics formats relevant to specific applications, and leads community efforts to quantify the models' ability to simulate and predict space environment events. Interactive on-line model validation systems developed at CCMC make validation a seamless part of the model development cycle. The talk will showcase innovative solutions for space weather research, validation, anomaly analysis and forecasting, and review ongoing community-wide model validation initiatives enabled by CCMC applications.
General form of a cooperative gradual maximal covering location problem
NASA Astrophysics Data System (ADS)
Bagherinejad, Jafar; Bashiri, Mahdi; Nikzad, Hamideh
2018-07-01
Cooperative and gradual covering are two new methods for developing covering location models. In this paper, a cooperative maximal covering location-allocation model is developed (CMCLAP). In addition, both cooperative and gradual covering concepts are applied to the maximal covering location simultaneously (CGMCLP). Then, we develop an integrated form of a cooperative gradual maximal covering location problem, which is called a general CGMCLP. By setting the model parameters, the proposed general model can easily be transformed into other existing models, facilitating general comparisons. The proposed models are developed without allocation for physical signals and with allocation for non-physical signals in discrete location space. Comparison of the previously introduced gradual maximal covering location problem (GMCLP) and cooperative maximal covering location problem (CMCLP) models with our proposed CGMCLP model on similar data sets shows that the proposed model covers more demand and performs more efficiently. Sensitivity analyses are performed to show the effect of related parameters and the model's validity. Simulated annealing (SA) and tabu search (TS) are proposed as solution algorithms for the developed models for large-sized instances. The results show that the proposed algorithms are efficient solution approaches, considering solution quality and running time.
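The two covering notions the model combines can be sketched as follows. The linear decay form, the signal-summing aggregation, and all radii and thresholds here are illustrative; the paper's own decay and aggregation functions may differ:

```python
def gradual_coverage(d, r1, r2):
    """Gradual covering: full coverage within r1, none beyond r2,
    linear decay in between."""
    if d <= r1:
        return 1.0
    if d >= r2:
        return 0.0
    return (r2 - d) / (r2 - r1)

def covered_cooperatively(demand, facilities, r1, r2, threshold):
    """Cooperative + gradual: a demand point counts as covered when the
    summed partial signals from all facilities reach the threshold."""
    total = sum(gradual_coverage(_dist(demand, f), r1, r2) for f in facilities)
    return total >= threshold

def _dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
```

Two facilities each contributing a partial signal of 0.5 jointly cover a point that neither covers alone at a 0.8 threshold; this pooling of partial signals is why the combined CGMCLP model can cover more demand than either covering concept by itself.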
Development and Validation of Linear Alternator Models for the Advanced Stirling Convertor
NASA Technical Reports Server (NTRS)
Metscher, Jonathan F.; Lewandowski, Edward
2014-01-01
Two models of the linear alternator of the Advanced Stirling Convertor (ASC) have been developed using the Sage 1-D modeling software package. The first model relates the piston motion to electric current by means of a motor constant. The second uses electromagnetic model components to model the magnetic circuit of the alternator. The models are tuned and validated using test data and compared against each other. Results show both models can be tuned to achieve results within 7% of ASC test data under normal operating conditions. Using Sage enables a complete ASC model to be developed and simulations to be completed quickly compared to more complex multi-dimensional models. These models allow for better insight into overall Stirling convertor performance, aid with Stirling power system modeling, and in the future will support NASA mission planning for Stirling-based power systems.
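The first (motor-constant) model class can be sketched as EMF proportional to piston velocity, with the alternator driving a simple resistive load. All numbers below are illustrative, not ASC hardware values:

```python
import math

def alternator_current(t, k=60.0, x_amp=0.005, freq=100.0,
                       r_coil=2.0, r_load=50.0):
    """Motor-constant linear alternator sketch:
    piston x(t) = X*sin(2*pi*f*t); EMF = K*dx/dt; I = EMF/(R_coil + R_load).
    k is the motor constant (V per m/s), x_amp the stroke amplitude (m)."""
    omega = 2.0 * math.pi * freq
    emf = k * x_amp * omega * math.cos(omega * t)
    return emf / (r_coil + r_load)
```

This single-constant coupling is what makes the first model simple to tune against test data; the paper's second model replaces the constant with an explicit magnetic-circuit representation.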
Development and Validation of Linear Alternator Models for the Advanced Stirling Convertor
NASA Technical Reports Server (NTRS)
Metscher, Jonathan F.; Lewandowski, Edward J.
2015-01-01
Two models of the linear alternator of the Advanced Stirling Convertor (ASC) have been developed using the Sage 1-D modeling software package. The first model relates the piston motion to electric current by means of a motor constant. The second uses electromagnetic model components to model the magnetic circuit of the alternator. The models are tuned and validated using test data and also compared against each other. Results show both models can be tuned to achieve results within 7% of ASC test data under normal operating conditions. Using Sage enables a complete ASC model to be developed and simulations to be completed quickly compared to more complex multi-dimensional models. These models allow for better insight into overall Stirling convertor performance, aid with Stirling power system modeling, and in the future will support NASA mission planning for Stirling-based power systems.
A universal Model-R Coupler to facilitate the use of R functions for model calibration and analysis
Wu, Yiping; Liu, Shuguang; Yan, Wende
2014-01-01
Mathematical models are useful in various fields of science and engineering. However, it is a challenge to connect a model to the open and growing set of functions (e.g., model inversion) on the R platform, because doing so normally requires accessing and revising the model's source code. To overcome this barrier, we developed a universal tool that aims to convert a model developed in any computer language into an R function, using the template and instruction concept of the Parameter ESTimation program (PEST) and the operational structure of the R-Soil and Water Assessment Tool (R-SWAT). The developed tool (Model-R Coupler) is promising because users of any model can connect an external algorithm (written in R) with their model to implement various model behavior analyses (e.g., parameter optimization, sensitivity and uncertainty analysis, performance evaluation, and visualization) without accessing or modifying the model's source code.
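The PEST-style template-and-instruction idea the Coupler builds on can be sketched in a few lines: parameters are spliced into a model input file by marker substitution, and outputs are read back by position, so the model binary is never modified. The marker syntax and file layout here are hypothetical:

```python
def fill_template(template_text, params):
    """Replace #name# markers in a template with parameter values to
    produce a model input file."""
    text = template_text
    for name, value in params.items():
        text = text.replace("#%s#" % name, str(value))
    return text

def read_output(output_text, line_no, field_no):
    """Instruction-style extraction: pull a whitespace-delimited field
    from a model output file by line and field position."""
    return float(output_text.splitlines()[line_no].split()[field_no])
```

An external algorithm (in R, via the Coupler) would repeatedly call `fill_template`, run the model executable, and score the `read_output` values, which is how calibration and sensitivity analysis proceed without touching the model's source code.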
NASA Astrophysics Data System (ADS)
Utanto, Yuli; Widhanarto, Ghanis Putra; Maretta, Yoris Adi
2017-03-01
This study aims to develop a web-based portfolio model. The model developed in this study could reveal the effectiveness of the new model in experiments conducted with research respondents in the department of curriculum and educational technology, FIP Unnes. In particular, the further research objectives to be achieved through this development research are: (1) to describe the process of implementing a portfolio in a web-based model; and (2) to assess the effectiveness of the web-based portfolio model for the final task, especially in Web-Based Learning courses. This is development research; Borg and Gall (2008: 589) say that "educational research and development (R & D) is a process used to develop and validate educational products". The series of research and development steps carried out started with exploration and conceptual studies, followed by testing and evaluation, and then implementation. For the data analysis, the techniques used were simple descriptive analysis and analysis of learning mastery, followed by prerequisite tests for normality and homogeneity before performing a t-test. Based on the data analysis, it was concluded that: (1) a web-based portfolio model can be applied to the learning process in higher education; and (2) in the field data from the respondents of the large group trial (field trial), the number of respondents who reached learning mastery (a score of 60 or above) was 24 (92.3%), indicating that the web-based portfolio model is effective. The conclusion of this study is that a web-based portfolio model is effective. As an implication of this model development research, future researchers are expected to use the development guidelines from the research already conducted to develop the model for other subjects.
Microphysics in Multi-scale Modeling System with Unified Physics
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo
2012-01-01
Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE), (2) a regional-scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and explicit cloud-radiation and cloud-land surface interactive processes are applied throughout this multi-scale modeling system. The modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the microphysics development and its performance within the multi-scale modeling system will be presented.
Top-level modeling of an ALS system utilizing object-oriented techniques
NASA Astrophysics Data System (ADS)
Rodriguez, L. F.; Kang, S.; Ting, K. C.
The possible configuration of an Advanced Life Support (ALS) System capable of supporting human life for long-term space missions continues to evolve as researchers investigate potential technologies and configurations. To facilitate the decision process, the development of acceptable, flexible, and dynamic mathematical computer modeling tools capable of system-level analysis is desirable. Object-oriented techniques have been adopted to develop a dynamic top-level model of an ALS system. This approach has several advantages; among these, object-oriented abstractions of systems are inherently modular in architecture. Thus, models can initially be somewhat simplistic, while allowing for adjustments and improvements. In addition, by coding the model in Java, the model can be implemented via the World Wide Web, greatly encouraging its utilization. Systems analysis is further enabled by a readily available backend database containing information supporting the model. The subsystem models of the ALS system model include Crew, Biomass Production, Waste Processing and Resource Recovery, Food Processing and Nutrition, and the Interconnecting Space. Each subsystem model and an overall model have been developed. Presented here are the procedure used to develop the modeling tool, the vision of the modeling tool, and the current focus of each of the subsystem models.
NASA Technical Reports Server (NTRS)
Epperson, David L.; Davis, Jerry M.; Bloomfield, Peter; Karl, Thomas R.; Mcnab, Alan L.; Gallo, Kevin P.
1995-01-01
Multiple regression techniques were used to predict surface shelter temperatures for the period 1986-89, using upper-air data from the European Centre for Medium-Range Weather Forecasts (ECMWF) to represent the background climate and site-specific data to represent the local landscape. Global monthly mean temperature models were developed using data from over 5000 stations available in the Global Historical Climate Network (GHCN). Monthly maximum, mean, and minimum temperature models for the United States were also developed using data from over 1000 stations available in the U.S. Cooperative (COOP) Network, and comparative monthly mean temperature models were developed using over 1150 U.S. stations in the GHCN. Three-, six-, and full-variable models were developed for comparative purposes. Inferences about the variables selected for the various models were easier for the GHCN models, which displayed month-to-month consistency in the variables selected, than for the COOP models, which were assigned a different list of variables for nearly every month. These and other results suggest that global calibration is preferred, because data from the global spectrum of physical processes that control surface temperatures are incorporated in a global model. All of the models developed in this study validated relatively well, especially the global models. Recalibration of the models with validation data resulted in only slightly poorer regression statistics, indicating that the calibration list of variables was valid. Predictions using data from the validation dataset in the calibrated equation were better for the GHCN models, and the globally calibrated GHCN models generally provided better U.S. predictions than the U.S.-calibrated COOP models. Overall, the GHCN and COOP models explained approximately 64%-95% of the total variance of surface shelter temperatures, depending on the month and the number of model variables. In addition, root-mean-square errors (RMSEs) were over 3°C for GHCN models and over 2°C for COOP models in winter months, and near 2°C for GHCN models and near 1.5°C for COOP models in summer months.
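The calibrate-then-validate workflow this abstract describes can be sketched in miniature: fit a regression on a calibration sample, then score it on held-out validation data via RMSE. This is a one-variable toy with made-up numbers, not the study's multi-variable models.

```python
"""Illustrative calibration/validation sketch (synthetic data, not the
study's): fit y = a + b*x by ordinary least squares, report validation
RMSE on a held-out set."""
import math

def fit_line(xs, ys):
    """Closed-form OLS for a single predictor; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def rmse(xs, ys, a, b):
    """Root-mean-square error of predictions a + b*x against ys."""
    return math.sqrt(sum((y - (a + b * x)) ** 2
                         for x, y in zip(xs, ys)) / len(xs))

# synthetic "upper-air predictor" vs. shelter temperature
cal_x = [0.0, 1.0, 2.0, 3.0]; cal_y = [10.1, 12.0, 13.9, 16.0]  # calibration
val_x = [0.5, 2.5];           val_y = [11.0, 15.1]              # validation
a, b = fit_line(cal_x, cal_y)
print(round(rmse(val_x, val_y, a, b), 2))
```

A small validation RMSE relative to the calibration fit is the kind of evidence the study uses to conclude that the calibrated variable list remains valid on independent data.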
SSME structural dynamic model development
NASA Technical Reports Server (NTRS)
Foley, M. J.; Tilley, D. M.; Welch, C. T.
1983-01-01
A mathematical model of the Space Shuttle Main Engine (SSME) as a complete assembly, with detailed emphasis on the LOX and high-pressure fuel turbopumps, is developed. The advantages of both complete engine dynamics and high-fidelity modeling are incorporated. Development of this model, some results, and projected applications are discussed.
The stock-flow model of spatial data infrastructure development refined by fuzzy logic.
Abdolmajidi, Ehsan; Harrie, Lars; Mansourian, Ali
2016-01-01
The system dynamics technique has been demonstrated to be a proper method by which to model and simulate the development of spatial data infrastructures (SDI). An SDI is a collaborative effort to manage and share spatial data at different political and administrative levels. It is comprised of various dynamically interacting quantitative and qualitative (linguistic) variables. To incorporate linguistic variables and their joint effects in an SDI-development model more effectively, we suggest employing fuzzy logic. Not all fuzzy models are able to model the dynamic behavior of SDIs properly. Therefore, this paper aims to investigate different fuzzy models and their suitability for modeling SDIs. To that end, two inference and two defuzzification methods were used for the fuzzification of the joint effect of two variables in an existing SDI model. The results show that the Average-Average inference and Center of Area defuzzification can better model the dynamics of SDI development.
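The two methods the abstract singles out, Average-Average inference and Center-of-Area defuzzification, can be illustrated with a toy joint-effect calculation. The triangular membership functions, variable ranges, and rule set below are invented for illustration; only the two method names come from the abstract.

```python
"""Toy sketch of Average-Average inference plus Center-of-Area (CoA)
defuzzification for the joint effect of two fuzzy inputs. Membership
functions and rules are illustrative assumptions."""

def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def joint_effect(u, v):
    """Average-Average inference: the firing degree of each output set
    is the average of the two inputs' membership degrees."""
    low  = (tri(u, 0, 0.25, 0.5) + tri(v, 0, 0.25, 0.5)) / 2
    high = (tri(u, 0.5, 0.75, 1) + tri(v, 0.5, 0.75, 1)) / 2
    # Center of Area over a discretised output universe [0, 1]
    xs = [i / 100 for i in range(101)]
    mu = [max(min(low, tri(x, 0, 0.25, 0.5)),
              min(high, tri(x, 0.5, 0.75, 1))) for x in xs]
    return sum(x * m for x, m in zip(xs, mu)) / sum(mu)

print(round(joint_effect(0.2, 0.3), 2))  # → 0.25
```

Swapping the inference (e.g., minimum instead of average) or the defuzzifier (e.g., mean-of-maxima instead of CoA) changes the output trajectory, which is exactly the kind of sensitivity the paper investigates for SDI dynamics.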
A computer model for liquid jet atomization in rocket thrust chambers
NASA Astrophysics Data System (ADS)
Giridharan, M. G.; Lee, J. G.; Krishnan, A.; Yang, H. Q.; Ibrahim, E.; Chuech, S.; Przekwas, A. J.
1991-12-01
The process of atomization has been used as an efficient means of burning liquid fuels in rocket engines, gas turbine engines, internal combustion engines, and industrial furnaces. Despite its widespread application, this complex hydrodynamic phenomenon has not been well understood, and predictive models for this process are still in their infancy. The difficulty in simulating the atomization process arises from the relatively large number of parameters that influence it, including the details of the injector geometry, liquid and gas turbulence, and the operating conditions. In this study, numerical models are developed from first principles, to quantify factors influencing atomization. For example, the surface wave dynamics theory is used for modeling the primary atomization and the droplet energy conservation principle is applied for modeling the secondary atomization. The use of empirical correlations has been minimized by shifting the analyses to fundamental levels. During applications of these models, parametric studies are performed to understand and correlate the influence of relevant parameters on the atomization process. The predictions of these models are compared with existing experimental data. The main tasks of this study were the following: development of a primary atomization model; development of a secondary atomization model; development of a model for impinging jets; development of a model for swirling jets; and coupling of the primary atomization model with a CFD code.
Global Analysis, Interpretation and Modelling: An Earth Systems Modelling Program
NASA Technical Reports Server (NTRS)
Moore, Berrien, III; Sahagian, Dork
1997-01-01
The Goal of the GAIM is: To advance the study of the coupled dynamics of the Earth system using as tools both data and models; to develop a strategy for the rapid development, evaluation, and application of comprehensive prognostic models of the Global Biogeochemical Subsystem which could eventually be linked with models of the Physical-Climate Subsystem; to propose, promote, and facilitate experiments with existing models or by linking subcomponent models, especially those associated with IGBP Core Projects and with WCRP efforts. Such experiments would be focused upon resolving interface issues and questions associated with developing an understanding of the prognostic behavior of key processes; to clarify key scientific issues facing the development of Global Biogeochemical Models and the coupling of these models to General Circulation Models; to assist the Intergovernmental Panel on Climate Change (IPCC) process by conducting timely studies that focus upon elucidating important unresolved scientific issues associated with the changing biogeochemical cycles of the planet and upon the role of the biosphere in the physical-climate subsystem, particularly its role in the global hydrological cycle; and to advise the SC-IGBP on progress in developing comprehensive Global Biogeochemical Models and to maintain scientific liaison with the WCRP Steering Group on Global Climate Modelling.
Microwave scattering models and basic experiments
NASA Technical Reports Server (NTRS)
Fung, Adrian K.
1989-01-01
Progress is summarized which has been made in four areas of study: (1) scattering model development for sparsely populated media, such as a forested area; (2) scattering model development for dense media, such as a sea ice medium or a snow covered terrain; (3) model development for randomly rough surfaces; and (4) design and conduct of basic scattering and attenuation experiments suitable for the verification of theoretical models.
Yield model development project implementation plan
NASA Technical Reports Server (NTRS)
Ambroziak, R. A.
1982-01-01
Tasks remaining to be completed are summarized for the following major project elements: (1) evaluation of crop yield models; (2) crop yield model research and development; (3) data acquisition, processing, and storage; (4) related yield research: defining spectral and/or remote sensing data requirements, developing input for driving and testing crop growth/yield models, and real-time testing of wheat plant process models; and (5) project management and support.
Proposal for constructing an advanced software tool for planetary atmospheric modeling
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.
1990-01-01
Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.
Sonic Boom Modeling Technical Challenge
NASA Technical Reports Server (NTRS)
Sullivan, Brenda M.
2007-01-01
This viewgraph presentation reviews the technical challenges in modeling sonic booms. The goal of this program is to develop knowledge, capabilities and technologies to enable overland supersonic flight. The specific objectives of the modeling are: (1) Develop and validate sonic boom propagation model through realistic atmospheres, including effects of turbulence (2) Develop methods enabling prediction of response of and acoustic transmission into structures impacted by sonic booms (3) Develop and validate psychoacoustic model of human response to sonic booms under both indoor and outdoor listening conditions, using simulators.
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wilson, Christopher D.
2003-01-01
The development of a pressure-dependent constitutive model with combined multilinear kinematic and isotropic hardening is presented. The constitutive model is developed using the ABAQUS user material subroutine (UMAT). First the pressure-dependent plasticity model is derived. Following this, the combined bilinear and combined multilinear hardening equations are developed for von Mises plasticity theory. The hardening rule equations are then modified to include pressure dependency. The method for implementing the new constitutive model into ABAQUS is given.
Chilcott, J; Tappenden, P; Rawdin, A; Johnson, M; Kaltenthaler, E; Paisley, S; Papaioannou, D; Shippam, A
2010-05-01
Health policy decisions must be relevant, evidence-based and transparent. Decision-analytic modelling supports this process but its role is reliant on its credibility. Errors in mathematical decision models or simulation exercises are unavoidable but little attention has been paid to processes in model development. Numerous error avoidance/identification strategies could be adopted but it is difficult to evaluate the merits of strategies for improving the credibility of models without first developing an understanding of error types and causes. The study aims to describe the current comprehension of errors in the HTA modelling community and generate a taxonomy of model errors. Four primary objectives are to: (1) describe the current understanding of errors in HTA modelling; (2) understand current processes applied by the technology assessment community for avoiding errors in development, debugging and critically appraising models for errors; (3) use HTA modellers' perceptions of model errors with the wider non-HTA literature to develop a taxonomy of model errors; and (4) explore potential methods and procedures to reduce the occurrence of errors in models. It also describes the model development process as perceived by practitioners working within the HTA community. A methodological review was undertaken using an iterative search methodology. Exploratory searches informed the scope of interviews; later searches focused on issues arising from the interviews. Searches were undertaken in February 2008 and January 2009. In-depth qualitative interviews were performed with 12 HTA modellers from academic and commercial modelling sectors. All qualitative data were analysed using the Framework approach. 
Descriptive and explanatory accounts were used to interrogate the data within and across themes and subthemes: organisation, roles and communication; the model development process; definition of error; types of model error; strategies for avoiding errors; strategies for identifying errors; and barriers and facilitators. There was no common language in the discussion of modelling errors and there was inconsistency in the perceived boundaries of what constitutes an error. Asked about the definition of model error, there was a tendency for interviewees to exclude matters of judgement from being errors and focus on 'slips' and 'lapses', but discussion of slips and lapses comprised less than 20% of the discussion on types of errors. Interviewees devoted 70% of the discussion to softer elements of the process of defining the decision question and conceptual modelling, mostly the realms of judgement, skills, experience and training. The original focus concerned model errors, but it may be more useful to refer to modelling risks. Several interviewees discussed concepts of validation and verification, with notable consistency in interpretation: verification meaning the process of ensuring that the computer model correctly implemented the intended model, whereas validation means the process of ensuring that a model is fit for purpose. Methodological literature on verification and validation of models makes reference to the Hermeneutic philosophical position, highlighting that the concept of model validation should not be externalized from the decision-makers and the decision-making process. Interviewees demonstrated examples of all major error types identified in the literature: errors in the description of the decision problem, in model structure, in use of evidence, in implementation of the model, in operation of the model, and in presentation and understanding of results. 
The HTA error classifications were compared against existing classifications of model errors in the literature. A range of techniques and processes are currently used to avoid errors in HTA models: engaging with clinical experts, clients and decision-makers to ensure mutual understanding, producing written documentation of the proposed model, explicit conceptual modelling, stepping through skeleton models with experts, ensuring transparency in reporting, adopting standard housekeeping techniques, and ensuring that those parties involved in the model development process have sufficient and relevant training. Clarity and mutual understanding were identified as key issues. However, their current implementation is not framed within an overall strategy for structuring complex problems. Some of the questioning may have biased interviewees' responses, but as all interviewees were represented in the analysis, no rebalancing of the report was deemed necessary. A potential weakness of the literature review was its focus on spreadsheet and program development rather than specifically on model development. It should also be noted that the identified literature concerning programming errors was very narrow despite broad searches being undertaken. Published definitions of overall model validity comprising conceptual model validation, verification of the computer model, and operational validity of the use of the model in addressing the real-world problem are consistent with the views expressed by the HTA community and are therefore recommended as the basis for further discussions of model credibility. Such discussions should focus on risks, including errors of implementation, errors in matters of judgement and violations. Discussions of modelling risks should reflect the potentially complex network of cognitive breakdowns that lead to errors in models, and existing research on the cognitive basis of human error should be included in an examination of modelling errors.
There is a need to develop a better understanding of the skills requirements for the development, operation and use of HTA models. Interaction between modeller and client in developing mutual understanding of a model establishes that model's significance and its warranty. This highlights that model credibility is the central concern of decision-makers using models so it is crucial that the concept of model validation should not be externalized from the decision-makers and the decision-making process. Recommendations for future research would be studies of verification and validation; the model development process; and identification of modifications to the modelling process with the aim of preventing the occurrence of errors and improving the identification of errors in models.
Advanced very high resolution radiometer
NASA Technical Reports Server (NTRS)
1976-01-01
The advanced very high resolution radiometer development program is considered. The program covered the design, construction, and test of a breadboard model, engineering model, protoflight model, mechanical structural model, and a life test model. Special bench test and calibration equipment was also developed for use on the program.
Software Framework for Advanced Power Plant Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
John Widmann; Sorin Munteanu; Aseem Jain
2010-08-01
This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.
Non-equilibrium dog-flea model
NASA Astrophysics Data System (ADS)
Ackerson, Bruce J.
2017-11-01
We develop the open dog-flea model to serve as a check on proposed non-equilibrium theories of statistical mechanics. The model is developed in detail and then applied to four recent models for non-equilibrium statistical mechanics. Comparing the dog-flea solution with these different models allows their claims to be checked and provides a concrete example for the theoretical models.
Artificial intelligence support for scientific model-building
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1992-01-01
Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we overview our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.
[Chest modelling and automotive accidents].
Trosseille, Xavier
2011-11-01
Automobile development is increasingly based on mathematical modeling. Accurate models of the human body are now available and serve to develop new means of protection. These models used to consist of rigid, articulated bodies but are now made of several million finite elements. They are now capable of predicting some risks of injury. To develop these models, sophisticated tests were conducted on human cadavers. For example, chest modeling started with material characterization and led to complete validation in the automobile environment. Model personalization, based on medical imaging, will permit studies of the behavior and tolerances of the entire population.
Proceedings of the Workshop on Government Oil Spill Modeling
NASA Technical Reports Server (NTRS)
Bishop, J. M. (Compiler)
1980-01-01
Oil spill model users and modelers were brought together for the purpose of fostering joint communication and increasing understanding of mutual problems. The workshop concentrated on defining user needs, presentations on ongoing modeling programs, and discussions of supporting research for these modeling efforts. Specific user recommendations include the development of an oil spill model user library which identifies and describes available models. The development of models for the long-term fate and effect of spilled oil was examined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin
2010-02-23
This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model, as it explores the concept of anisotropic rotary diffusion to capture the fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project no standard model had been developed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementations of the models in AMI, and the conclusions drawn from this work are presented in Section 8.
Wen, Zhang; Guo, Ya; Xu, Banghao; Xiao, Kaiyin; Peng, Tao; Peng, Minhao
2016-04-01
Postoperative pancreatic fistula is still a major complication after pancreatic surgery, despite improvements in surgical technique and perioperative management. We sought to systematically review and critically assess the conduct and reporting of methods used to develop risk prediction models for predicting postoperative pancreatic fistula. We conducted a systematic search of the PubMed and EMBASE databases to identify articles published before January 1, 2015, which described the development of models to predict the risk of postoperative pancreatic fistula. We extracted information on the development of each prediction model, including study design, sample size and number of events, definition of postoperative pancreatic fistula, risk predictor selection, missing data, model-building strategies, and model performance. Seven studies developing seven risk prediction models were included. In three studies (42%), the number of events per variable was less than 10. The number of candidate risk predictors ranged from 9 to 32. Five studies (71%) reported using univariate screening, which is not recommended when building a multivariable model, to reduce the number of risk predictors. Six risk prediction models (86%) were developed by categorizing all continuous risk predictors. The treatment and handling of missing data were not mentioned in any of the studies. We found use of inappropriate methods that could undermine model development, including univariate pre-screening of variables, categorization of continuous risk predictors, and inadequate model validation. The use of inappropriate methods affects the reliability and the accuracy of the probability estimates for predicting postoperative pancreatic fistula.
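The events-per-variable (EPV) check applied by this review is simple to compute. The sketch below uses the abstract's threshold of 10 events per candidate predictor; the sample counts are invented for illustration, not taken from the reviewed studies.

```python
"""Sketch of the events-per-variable (EPV) rule of thumb used when
appraising risk prediction models: fewer than ~10 events per candidate
predictor signals a risk of overfitting. Example numbers are made up."""

def events_per_variable(n_events, n_candidate_predictors):
    """EPV = observed outcome events divided by candidate predictors."""
    return n_events / n_candidate_predictors

def epv_warning(n_events, n_candidate_predictors, threshold=10):
    """True if the model-building sample falls below the EPV threshold."""
    return events_per_variable(n_events, n_candidate_predictors) < threshold

# e.g. 45 fistulas observed against 9 candidate predictors -> EPV = 5, flagged
print(events_per_variable(45, 9), epv_warning(45, 9))  # → 5.0 True
```

A study reporting, say, 200 events for 10 predictors (EPV = 20) would pass this check, which is why sample size and event counts are among the items the review extracts.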
Genetic Programming as Alternative for Predicting Development Effort of Individual Software Projects
Chavoya, Arturo; Lopez-Martin, Cuauhtemoc; Andalon-Garcia, Irma R.; Meda-Campaña, M. E.
2012-01-01
Statistical and genetic programming techniques have been used to predict the software development effort of large software projects. In this paper, a genetic programming model was used for predicting the effort required in individually developed projects. Accuracy obtained from a genetic programming model was compared against one generated from the application of a statistical regression model. A sample of 219 projects developed by 71 practitioners was used for generating the two models, whereas another sample of 130 projects developed by 38 practitioners was used for validating them. The models used two kinds of lines of code as well as programming language experience as independent variables. Accuracy results from the model obtained with genetic programming suggest that it could be used to predict the software development effort of individual projects when these projects have been developed in a disciplined manner within a development-controlled environment. PMID:23226305
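The abstract compares prediction accuracy between the genetic programming and regression models without naming a criterion. Two criteria commonly used in software effort prediction, MMRE and Pred(25), are sketched below as one plausible way such a comparison is scored; the criteria choice and the sample figures are assumptions, not the paper's.

```python
"""Sketch of two standard accuracy criteria for effort-prediction
models: MMRE and Pred(l). Sample effort figures are illustrative."""

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error across projects."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

def pred(actual, predicted, level=0.25):
    """Fraction of projects whose relative error is within `level`."""
    hits = sum(1 for a, p in zip(actual, predicted)
               if abs(a - p) / a <= level)
    return hits / len(actual)

# hypothetical development efforts (person-hours) for four projects
actual    = [10.0, 20.0, 40.0, 80.0]
predicted = [12.0, 18.0, 30.0, 84.0]
print(round(mmre(actual, predicted), 3), pred(actual, predicted))  # → 0.15 1.0
```

Under these criteria, a lower MMRE and a higher Pred(25) on the validation sample would favor one model over the other.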
Performance of fire behavior fuel models developed for the Rothermel Surface Fire Spread Model
Robert Ziel; W. Matt Jolly
2009-01-01
In 2005, 40 new fire behavior fuel models were published for use with the Rothermel Surface Fire Spread Model. These new models are intended to augment the original 13 developed in 1972 and 1976. As a compiled set of quantitative fuel descriptions that serve as input to the Rothermel model, the selected fire behavior fuel model has always been critical to the resulting...
The Analysis of Weak Rock Using the Pressuremeter
NASA Astrophysics Data System (ADS)
Dafni, Jacob
The pressuremeter is a versatile in situ testing instrument capable of testing a large range of materials, from very soft clay to weak rock. Due to limitations of other testing devices, the pressuremeter is one of the few instruments capable of capturing the stiffness and strength properties of weak rock. However, the data collected are only useful if the material tested is properly modeled and the desired material properties can be obtained. While constitutive models with various flow rules have been developed for pressuremeter analysis in soil, less research has been directed at model development for pressuremeter tests in weak rock. As a result, pressuremeter data collected in rock are typically analyzed using models designed for soil. The aim of this study was to explore constitutive rock models for development into a pressuremeter framework. Three models were considered, with two of the three implemented for pressuremeter analysis. A Mohr-Coulomb model with a tensile cutoff developed by Haberfield (1987) and a Hoek-Brown model initiated by Yang et al. (2011) and further developed by the author were implemented and calibrated against a data set of pressuremeter tests from 5 project test sites, comprising a total of 115 pressuremeter tests in a number of different rock formations. Development of a multiscale damage model established by Kondo et al. (2008) was explored; however, this model requires further development before it can be used for pressuremeter data analysis.
Developing rural palliative care: validating a conceptual model.
Kelley, Mary Lou; Williams, Allison; DeMiglio, Lily; Mettam, Hilary
2011-01-01
The purpose of this research was to validate a conceptual model for developing palliative care in rural communities. This model articulates how local rural healthcare providers develop palliative care services according to four sequential phases. The model has roots in concepts of community capacity development, evolves from collaborative, generalist rural practice, and utilizes existing health services infrastructure. It addresses how rural providers manage challenges, specifically those related to: lack of resources, minimal community understanding of palliative care, health professionals' resistance, the bureaucracy of the health system, and the obstacles of providing services in rural environments. Seven semi-structured focus groups were conducted with interdisciplinary health providers in 7 rural communities in two Canadian provinces. Using a constant comparative analysis approach, focus group data were analyzed by examining participants' statements in relation to the model and comparing emerging themes in the development of rural palliative care to the elements of the model. The data validated the conceptual model as the model was able to theoretically predict and explain the experiences of the 7 rural communities that participated in the study. New emerging themes from the data elaborated existing elements in the model and informed the requirement for minor revisions. The model was validated and slightly revised, as suggested by the data. The model was confirmed as being a useful theoretical tool for conceptualizing the development of rural palliative care that is applicable in diverse rural communities.
Development of estrogen receptor beta binding prediction model using large sets of chemicals.
Sakkiah, Sugunadevi; Selvaraj, Chandrabose; Gong, Ping; Zhang, Chaoyang; Tong, Weida; Hong, Huixiao
2017-11-03
We developed an ER β binding prediction model to facilitate, together with our previously developed ER α binding model, the identification of chemicals that specifically bind ER β or ER α . Decision Forest was used to train the ER β binding prediction model on a large set of compounds obtained from EADB. Model performance was estimated through 1000 iterations of 5-fold cross validations. Prediction confidence was analyzed using predictions from the cross validations. Informative chemical features for ER β binding were identified through analysis of the frequency data of chemical descriptors used in the models in the 5-fold cross validations. 1000 permutations were conducted to assess chance correlation. The average accuracy of the 5-fold cross validations was 93.14% with a standard deviation of 0.64%. Prediction confidence analysis indicated that the higher the prediction confidence, the more accurate the predictions. Permutation testing revealed that the prediction model is unlikely to have been generated by chance. Eighteen informative descriptors were identified as important to ER β binding prediction. Application of the prediction model to data from the ToxCast project yielded a very high sensitivity of 90-92%. Our results demonstrated that ER β binding of chemicals can be accurately predicted using the developed model. Coupled with our previously developed ER α prediction model, this model can be expected to facilitate drug development through identification of chemicals that specifically bind ER β or ER α .
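The validation scheme described above, repeated k-fold cross-validation plus a label-permutation test for chance correlation, can be sketched as follows. This is a generic illustration on synthetic data, with a simple nearest-centroid classifier standing in for Decision Forest; nothing here reproduces the study's compounds or descriptors.

```python
# Sketch of the validation scheme described above: 5-fold cross-validation
# plus a label-permutation test for chance correlation. A nearest-centroid
# classifier stands in for Decision Forest; all data are synthetic.
import numpy as np

def cv_accuracy(X, y, k=5):
    """k-fold cross-validated accuracy of a nearest-centroid classifier."""
    idx = np.arange(len(y))
    accs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        c0 = X[train][y[train] == 0].mean(axis=0)   # centroid of non-binders
        c1 = X[train][y[train] == 1].mean(axis=0)   # centroid of binders
        pred = (((X[fold] - c1) ** 2).sum(axis=1)
                < ((X[fold] - c0) ** 2).sum(axis=1)).astype(int)
        accs.append((pred == y[fold]).mean())
    return float(np.mean(accs))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                 # 10 synthetic "descriptors"
y = (X[:, 0] + X[:, 1] > 0).astype(int)        # signal lives in the first two

real_acc = cv_accuracy(X, y)                   # well above chance
perm_acc = cv_accuracy(X, rng.permutation(y))  # collapses toward 0.5
```

On shuffled labels the cross-validated accuracy collapses toward 0.5, which is the signature of a model that is not merely a chance correlation.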
Supporting Collaborative Model and Data Service Development and Deployment with DevOps
NASA Astrophysics Data System (ADS)
David, O.
2016-12-01
Adopting DevOps practices for model service development and deployment enables a community to engage in service-oriented modeling and data management. The Cloud Services Integration Platform (CSIP), developed over the last 5 years at Colorado State University, provides for collaborative integration of environmental models into scalable model and data services as a micro-services platform with API and deployment infrastructure. Originally developed to support USDA natural resource applications, it proved suitable for a wider range of applications in the environmental modeling domain. While extending its scope and visibility, it became apparent that community integration and adequate workflow support through the full model development and application cycle drove successful outcomes. DevOps provides best practices, tools, and organizational structures to optimize the transition from model service development to deployment by minimizing the (i) operational burden and (ii) turnaround time for modelers. We have developed and implemented a methodology to fully automate a suite of applications for application lifecycle management, version control, continuous integration, container management, and container scaling, enabling model and data service developers in various institutions to collaboratively build, run, deploy, test, and scale services within minutes. To date more than 160 model and data services are available for applications in hydrology (PRMS, Hydrotools, CFA, ESP), water and wind erosion prediction (WEPP, WEPS, RUSLE2), soil quality trends (SCI, STIR), water quality analysis (SWAT-CP, WQM, CFA, AgES-W), stream degradation assessment (SWAT-DEG), hydraulics (cross-section), and grazing management (GRAS). In addition, supporting data services include soil (SSURGO), ecological site (ESIS), climate (CLIGEN, WINDGEN), land management and crop rotations (LMOD), and pesticides (WQM), developed using this workflow automation and decentralized governance.
A Case Study of Teachers' Development of Well-Structured Mathematical Modelling Activities
ERIC Educational Resources Information Center
Stohlmann, Micah; Maiorca, Cathrine; Allen, Charlie
2017-01-01
This case study investigated how three teachers developed mathematical modelling activities integrated with content standards through participation in a course on mathematical modelling. The class activities involved experiencing a mathematical modelling activity, reading and rating example mathematical modelling activities, reading articles about…
DOT National Transportation Integrated Search
2008-04-01
The objective of this research is to develop alternative time-dependent travel demand models of hurricane evacuation travel and to compare the performance of these models with each other and with the state-of-the-practice models in current use. Speci...
Cavuşoğlu, M Cenk; Göktekin, Tolga G; Tendick, Frank
2006-04-01
This paper presents the architectural details of an evolving open source/open architecture software framework for developing organ-level surgical simulations. Our goal is to facilitate shared development of reusable models, to accommodate heterogeneous models of computation, and to provide a framework for interfacing multiple heterogeneous models. The framework provides an application programming interface for interfacing dynamic models defined over spatial domains. It is specifically designed to be independent of the specifics of the modeling methods used, and therefore facilitates seamless integration of heterogeneous models and processes. Furthermore, each model has separate geometries for visualization, simulation, and interfacing, allowing the model developer to choose the most natural geometric representation for each case. Input/output interfaces for visualization and haptics for real-time interactive applications have also been provided.
2011-01-01
used in efforts to develop QSAR models. Measurement of Repellent Efficacy Screening for Repellency of Compounds with Unknown Toxicology In screening...CPT) were used to develop Quantitative Structure Activity Relationship (QSAR) models to predict repellency. Successful prediction of novel...acylpiperidine QSAR models employed 4 descriptors to describe the relationship between structure and repellent duration. The ANN model of the carboxamides did not
ERIC Educational Resources Information Center
Retnaningsih, Woro; Djatmiko; Sumarlam
2017-01-01
The research objective is to develop a model of Assessment for Learning (AFL) in the Pragmatics course at IAIN Surakarta. The research problems are as follows: How did the lecturer develop a model of AFL? What form of assessment information was used in the model of AFL? What were the results of the implementation of the assessment model? The…
A user-friendly model for spray drying to aid pharmaceutical product development.
Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L J; Frijlink, Henderik W
2013-01-01
The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to the production by spray drying of glassy sugars, which are often used as excipients in formulations of biopharmaceuticals. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach.
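The core of such a spray dryer model is a steady-state energy balance: the sensible heat given up by the drying air pays for evaporating the feed water, which fixes the outlet temperature. The sketch below is a minimal version of that balance; the property values are textbook figures and the operating point is an illustrative assumption, not the fitted Büchi B-290 parameters.

```python
# Minimal steady-state energy balance of the kind such a spreadsheet model
# builds on: heat lost by the drying air evaporates the feed water.
# Operating-point numbers are illustrative, not calibrated to a Buchi B-290.

CP_AIR = 1.01e3   # J/(kg K), specific heat of drying air
DH_VAP = 2.26e6   # J/kg, latent heat of water evaporation

def outlet_temperature(t_in_c, air_kg_s, feed_water_kg_s):
    """Predict outlet temperature (degC) from an adiabatic energy balance."""
    dt = feed_water_kg_s * DH_VAP / (air_kg_s * CP_AIR)
    return t_in_c - dt

t_out = outlet_temperature(150.0, 0.012, 2.0e-4)  # ~112.7 degC
```

A full model would add the feed's sensible heat, dryer heat losses, and the outlet relative-humidity prediction described in the abstract, but the same balance structure carries through.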
NASA Technical Reports Server (NTRS)
Nguyen, Han; Mazurkivich, Pete
2006-01-01
A pressurization system model was developed for a crossfeed subscale water test article using the EASY5 modeling software. The model consisted of an integrated tank pressurization and pressurization line model. The tank model was developed using the general purpose library, while the line model was assembled from the gas dynamic library. The pressurization system model was correlated to water test data obtained from nine test runs conducted on the crossfeed subscale test article. The model was first correlated to a representative test run and frozen. The correlated model was then used to predict the tank pressures and compared with the test data for eight other runs. The model prediction showed excellent agreement with the test data, allowing it to be used in a later study to analyze the pressurization system performance of a full-scale bimese vehicle with cryogenic propellants.
Construction of an advanced software tool for planetary atmospheric modeling
NASA Technical Reports Server (NTRS)
Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.
1993-01-01
Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.
Biopharma business models in Canada.
March-Chordà, I; Yagüe-Perales, R M
2011-08-01
This article provides new insights into the different strategy paths or business models currently being implemented by Canadian biopharma companies. Through a case-study methodology, seven biopharma companies pertaining to three business models were analyzed, leading to a broad set of results emerging from the following areas: activity, business model and strategy; management and human resources; and R&D, technology and innovation strategy. The three business models represented were: model 1 (conventional biotech oriented to new drug development, radical innovation and search for discoveries); model 2 (development of a technology platform, usually in proteomics and bioinformatics); and model 3 (incremental innovation, with shorter and less risky development timelines). Copyright © 2011 Elsevier Ltd. All rights reserved.
Construction of an advanced software tool for planetary atmospheric modeling
NASA Technical Reports Server (NTRS)
Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.
1992-01-01
Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.
Ebben, Matthew R; Narizhnaya, Mariya; Krieger, Ana C
2017-05-01
Numerous mathematical formulas have been developed to determine continuous positive airway pressure (CPAP) without an in-laboratory titration study. Recent studies have shown that style of CPAP mask can affect the optimal pressure requirement. However, none of the current models take mask style into account. Therefore, the goal of this study was to develop new predictive models of CPAP that take into account the style of mask interface. Data from 200 subjects with attended CPAP titrations during overnight polysomnograms using nasal masks and 132 subjects using oronasal masks were randomized and split into either a model development or validation group. Predictive models were then created in each model development group and the accuracy of the models was then tested in the model validation groups. The correlation between our new oronasal model and laboratory determined optimal CPAP was significant, r = 0.61, p < 0.001. Our nasal formula was also significantly related to laboratory determined optimal CPAP, r = 0.35, p < 0.001. The oronasal model created in our study significantly outperformed the original CPAP predictive model developed by Miljeteig and Hoffstein, z = 1.99, p < 0.05. The predictive performance of our new nasal model did not differ significantly from Miljeteig and Hoffstein's original model, z = -0.16, p < 0.90. The best predictors for the nasal mask group were AHI, lowest SaO2, and neck size, whereas the top predictors in the oronasal group were AHI and lowest SaO2. Our data show that predictive models of CPAP that take into account mask style can significantly improve the formula's accuracy. Most of the past models likely focused on model development with nasal masks (mask style used for model development was not typically reported in previous investigations) and are not well suited for patients using an oronasal interface. 
Our new oronasal CPAP prediction equation produced significantly improved performance compared to the well-known Miljeteig and Hoffstein formula in patients titrated on CPAP with an oronasal mask and was also significantly related to laboratory determined optimal CPAP.
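The general recipe behind such predictive equations is ordinary least squares regression on a handful of polysomnography predictors. The sketch below fits a model of that form to synthetic data; the generating coefficients, variable ranges, and noise level are invented for illustration and are not the study's (or Miljeteig and Hoffstein's) fitted values.

```python
# Generic form of a CPAP prediction equation: ordinary least squares on a few
# polysomnography predictors. Generating coefficients, ranges, and noise are
# invented for illustration; they are not the study's fitted values.
import numpy as np

rng = np.random.default_rng(1)
n = 200
ahi = rng.uniform(5, 80, n)        # apnea-hypopnea index (events/h)
min_sao2 = rng.uniform(70, 95, n)  # lowest oxygen saturation (%)
neck = rng.uniform(34, 48, n)      # neck circumference (cm)

# Assumed "true" relationship, used only to generate a synthetic target:
cpap = (4.0 + 0.05 * ahi - 0.04 * (min_sao2 - 90.0)
        + 0.10 * (neck - 40.0) + rng.normal(0.0, 0.5, n))

X = np.column_stack([np.ones(n), ahi, min_sao2, neck])
coef, *_ = np.linalg.lstsq(X, cpap, rcond=None)  # [intercept, b_ahi, b_sao2, b_neck]

r = np.corrcoef(X @ coef, cpap)[0, 1]            # in-sample correlation
```

Fitting separate coefficient sets to nasal-mask and oronasal-mask titration data, as the study did, is simply this procedure run twice on the two subgroups.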
NASA Astrophysics Data System (ADS)
Bennett, A.; Nijssen, B.; Chegwidden, O.; Wood, A.; Clark, M. P.
2017-12-01
Model intercomparison experiments have been conducted to quantify the variability introduced during the model development process, but have had limited success in identifying the sources of this model variability. The Structure for Unifying Multiple Modeling Alternatives (SUMMA) has been developed as a framework which defines a general set of conservation equations for mass and energy as well as a common core of numerical solvers along with the ability to set options for choosing between different spatial discretizations and flux parameterizations. SUMMA can be thought of as a framework for implementing meta-models which allows for the investigation of the impacts of decisions made during the model development process. Through this flexibility we develop a hierarchy of definitions which allows for models to be compared to one another. This vocabulary allows us to define the notion of weak equivalence between model instantiations. Through this weak equivalence we develop the concept of model mimicry, which can be used to investigate the introduction of uncertainty and error during the modeling process as well as provide a framework for identifying modeling decisions which may complement or negate one another. We instantiate SUMMA instances that mimic the behaviors of the Variable Infiltration Capacity (VIC) model and the Precipitation Runoff Modeling System (PRMS) by choosing modeling decisions which are implemented in each model. We compare runs from these models and their corresponding mimics across the Columbia River Basin located in the Pacific Northwest of the United States and Canada. From these comparisons, we are able to determine the extent to which model implementation has an effect on the results, as well as determine the changes in sensitivity of parameters due to these implementation differences. 
By examining these changes in results and sensitivities we can attempt to postulate changes in the modeling decisions which may provide better estimation of state variables.
NASA Astrophysics Data System (ADS)
Vanderborght, Jan; Priesack, Eckart
2017-04-01
The Soil Model Development and Intercomparison Panel (SoilMIP) is an initiative of the International Soil Modeling Consortium. Its mission is to foster the further development of soil models that can predict soil functions and their changes (i) due to soil use and land management and (ii) due to external impacts of climate change and pollution. Since soil functions and soil threats are diverse but linked with each other, the overall aim is to develop holistic models that represent the key functions of the soil system and the links between them. These models should be scaled up and integrated in terrestrial system models that describe the feedbacks between processes in the soil and the other terrestrial compartments. We propose and illustrate a few steps that could be taken to achieve these goals. A first step is the development of scenarios that compare simulations by models that predict the same or different soil services. Scenarios can be considered at three different levels of comparisons: scenarios that compare the numerics (accuracy but also speed) of models, scenarios that compare the effect of differences in process descriptions, and scenarios that compare simulations with experimental data. A second step involves the derivation of metrics or summary statistics that effectively compare model simulations and disentangle parameterization from model concept differences. These metrics can be used to evaluate how more complex model simulations can be represented by simpler models using an appropriate parameterization. A third step relates to the parameterization of models. Application of simulation models implies that appropriate model parameters have to be defined for a range of environmental conditions and locations. Spatial modelling approaches are used to derive parameter distributions. 
Considering that soils and their properties emerge from the interaction between physical, chemical and biological processes, the combination of spatial models with process models would lead to consistent parameter distributions and correlations and could potentially represent self-organizing processes in soils and landscapes.
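For the second step, one concrete example of a summary statistic widely used when comparing model simulations with experimental data is the Nash-Sutcliffe efficiency (1 is a perfect match; 0 means no better than predicting the observed mean). Its use here is illustrative; the panel does not prescribe a specific metric.

```python
# Nash-Sutcliffe efficiency: a standard summary statistic for comparing
# simulated and observed series (1 = perfect, 0 = no better than the mean).
import numpy as np

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

nse_perfect = nash_sutcliffe([1, 2, 3, 4], [1, 2, 3, 4])  # 1.0
nse_mean = nash_sutcliffe([1, 2, 3, 4], [2.5] * 4)        # 0.0
```

Because the denominator is the variance of the observations, the metric partly disentangles model-concept error from a site simply being hard to predict, which is the spirit of the panel's second step.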
Next-generation concurrent engineering: developing models to complement point designs
NASA Technical Reports Server (NTRS)
Morse, Elizabeth; Leavens, Tracy; Cohanim, Babak; Harmon, Corey; Mahr, Eric; Lewis, Brian
2006-01-01
Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a 'next-generation CED': in addition to a point design, the Team develops a model of the local trade space. The process is a balance between the power of model-development tools and the creativity of human experts, enabling the development of a variety of trade models for any space mission. This paper reviews the modeling method and its practical implementation in the CED environment. Example results illustrate the benefit of this approach.
Shaw, Tim; Barnet, Stewart; Mcgregor, Deborah; Avery, Jennifer
2015-01-01
Online learning is a primary delivery method for continuing health education programs. It is critical that programs have curricula objectives linked to educational models that support learning. Using a proven educational modelling process ensures that curricula objectives are met and a solid basis for learning and assessment is achieved. To develop an educational design model that produces an educationally sound program development plan for use by anyone involved in online course development. We have described the development of a generic educational model designed for continuing health education programs. The Knowledge, Process, Practice (KPP) model is founded on recognised educational theory and online education practice. This paper presents a step-by-step guide on using this model for program development that supports reliable learning and evaluation. The model supports a three-step approach, KPP, based on learning outcomes and supporting appropriate assessment activities. It provides a program structure for online or blended learning that is explicit, educationally defensible, and supports multiple assessment points for health professionals. The KPP model is based on best practice educational design, using a structure that can be adapted for a variety of online or flexibly delivered postgraduate medical education programs.
Thermal performance modeling of NASA s scientific balloons
NASA Astrophysics Data System (ADS)
Franco, H.; Cathey, H.
The flight performance of a scientific balloon is highly dependent on the interaction between the balloon and its environment. The balloon is a thermal vehicle. Modeling a scientific balloon's thermal performance has proven to be a difficult analytical task. Most previous thermal models have attempted these analyses by using either a bulk thermal model approach or simplified representations of the balloon. These approaches have to date provided reasonable, but not very accurate, results. Improvements have been made in recent years using thermal analysis tools developed for the thermal modeling of spacecraft and other sophisticated heat transfer problems. These tools, which now allow for accurate modeling of highly transmissive materials, have been applied to the thermal analysis of NASA's scientific balloons. A research effort has been started that utilizes the "Thermal Desktop" addition to AutoCAD. This paper will discuss the development of thermal models for both conventional and Ultra Long Duration super-pressure balloons. This research effort has focused on incremental analysis stages of development to assess the accuracy of the tool and the required model resolution to produce usable data. The first-stage balloon thermal analyses started with simple spherical balloon models with a limited number of nodes, and expanded the number of nodes to determine required model resolution. These models were then modified to include additional details such as load tapes. The second-stage analyses looked at natural-shaped Zero Pressure balloons. Load tapes were then added to these shapes, again with the goal of determining the required modeling accuracy by varying the number of gores. The third stage, following the same steps as the Zero Pressure balloon efforts, was directed at modeling super-pressure pumpkin-shaped balloons.
The results were then used to develop analysis guidelines and an approach for modeling balloons for both simple first order estimates and detailed full models. The development of the radiative environment and program input files, the development of the modeling techniques for balloons, and the development of appropriate data output handling techniques for both the raw data and data plots will be discussed. A general guideline to match predicted balloon performance with known flight data will also be presented. One long-term goal of this effort is to develop simplified approaches and techniques to include results in performance codes being developed.
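The "bulk thermal model approach" the abstract contrasts with detailed Thermal Desktop models reduces, at its simplest, to a single-node radiative balance: absorbed solar plus absorbed Earth infrared equals emitted infrared. The sketch below uses illustrative film properties, fluxes, and view factors, not values from the NASA analyses.

```python
# Single-node ("bulk") radiative balance for a balloon film: absorbed solar
# plus absorbed Earth IR equals emitted IR. Film properties, Earth IR flux,
# and the 1/2 Earth view factor are illustrative assumptions.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def equilibrium_temp(alpha_s, emissivity, solar=1361.0, earth_ir=240.0):
    """Equilibrium film temperature (K) of a sphere, per unit surface area.
    Projected/surface area ratio = 1/4; Earth view factor assumed 1/2."""
    absorbed = alpha_s * solar / 4.0 + emissivity * earth_ir / 2.0
    return (absorbed / (emissivity * SIGMA)) ** 0.25

t_day = equilibrium_temp(alpha_s=0.05, emissivity=0.1)  # a few hundred kelvin
```

Even this one-node balance shows why film temperature is so sensitive to optical properties; the detailed models refine it with transmissive films, gores, and load tapes.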
Payne, Courtney E; Wolfrum, Edward J
2015-01-01
Obtaining accurate chemical composition and reactivity (measures of carbohydrate release and yield) information for biomass feedstocks in a timely manner is necessary for the commercialization of biofuels. Our objective was to use near-infrared (NIR) spectroscopy and partial least squares (PLS) multivariate analysis to develop calibration models to predict the feedstock composition and the release and yield of soluble carbohydrates generated by a bench-scale dilute acid pretreatment and enzymatic hydrolysis assay. Major feedstocks included in the calibration models are corn stover, sorghum, switchgrass, perennial cool season grasses, rice straw, and miscanthus. We present individual model statistics to demonstrate model performance and validation samples to more accurately measure predictive quality of the models. The PLS-2 model for composition predicts glucan, xylan, lignin, and ash (wt%) with uncertainties similar to primary measurement methods. A PLS-2 model was developed to predict glucose and xylose release following pretreatment and enzymatic hydrolysis. An additional PLS-2 model was developed to predict glucan and xylan yield. PLS-1 models were developed to predict the sum of glucose/glucan and xylose/xylan for release and yield (grams per gram). The release and yield models have higher uncertainties than the primary methods used to develop the models. It is possible to build effective multispecies feedstock models for composition, as well as carbohydrate release and yield. The model for composition is useful for predicting glucan, xylan, lignin, and ash with good uncertainties. The release and yield models have higher uncertainties; however, these models are useful for rapidly screening sample populations to identify unusual samples.
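The chemometric core of this work, partial least squares regression on NIR spectra, can be sketched with a minimal NIPALS PLS1 implementation fitted to synthetic "spectra". The band shape, noise level, and glucan range below are invented for illustration and do not reproduce the study's calibration data.

```python
# Minimal NIPALS PLS1 regression, the chemometric core of NIR calibration.
# Band shape, noise, and the "glucan" range are invented for illustration.
import numpy as np

def pls1_fit(X, y, n_comp):
    """Fit PLS1; return regression coefficients plus centering terms."""
    Xc, yc = X - X.mean(axis=0), y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)       # weight vector
        t = Xc @ w                   # scores
        tt = t @ t
        p = Xc.T @ t / tt            # X loadings
        q = (yc @ t) / tt            # y loading
        Xc = Xc - np.outer(t, p)     # deflate X and y
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    B = W @ np.linalg.inv(P.T @ W) @ np.array(Q)
    return B, X.mean(axis=0), y.mean()

rng = np.random.default_rng(2)
band = np.exp(-((np.linspace(0, 1, 100) - 0.4) ** 2) / 0.01)  # absorption band
y = rng.uniform(20, 45, 60)                           # "glucan" content, wt%
X = np.outer(y, band) + rng.normal(0, 0.5, (60, 100))  # synthetic spectra

B, xm, ym = pls1_fit(X, y, n_comp=3)
rmse = float(np.sqrt(np.mean(((X - xm) @ B + ym - y) ** 2)))  # small here
```

A real calibration would add spectral preprocessing, cross-validated selection of the number of latent variables, and the independent validation samples the study emphasizes.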
Development and Implementation of CFD-Informed Models for the Advanced Subchannel Code CTF
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blyth, Taylor S.; Avramova, Maria
The research described in this PhD thesis contributes to the development of efficient methods for utilization of high-fidelity models and codes to inform low-fidelity models and codes in the area of nuclear reactor core thermal-hydraulics. The objective is to increase the accuracy of predictions of quantities of interest using high-fidelity CFD models while preserving the efficiency of low-fidelity subchannel core calculations. An original methodology named Physics-based Approach for High-to-Low Model Information has been further developed and tested. The overall physical phenomena and corresponding localized effects introduced by the presence of spacer grids in light water reactor (LWR) cores are dissected into four corresponding basic building processes, and the corresponding models are informed using high-fidelity CFD codes. These models are a spacer grid-directed cross-flow model, a grid-enhanced turbulent mixing model, a heat transfer enhancement model, and a spacer grid pressure loss model. The localized CFD models are developed and tested using the CFD code STAR-CCM+, and the corresponding global model development and testing in subchannel formulation is performed in the thermal-hydraulic subchannel code CTF. The improved CTF simulations utilize data files derived from CFD STAR-CCM+ simulation results covering the spacer grid design desired for inclusion in the CTF calculation. The current implementation of these models is examined and possibilities for improvement and further development are suggested. The validation experimental database is extended by including the OECD/NRC PSBT benchmark data. The outcome is an enhanced accuracy of CTF predictions while preserving the computational efficiency of a low-fidelity subchannel code.
Development and Implementation of CFD-Informed Models for the Advanced Subchannel Code CTF
NASA Astrophysics Data System (ADS)
Blyth, Taylor S.
The research described in this PhD thesis contributes to the development of efficient methods for utilization of high-fidelity models and codes to inform low-fidelity models and codes in the area of nuclear reactor core thermal-hydraulics. The objective is to increase the accuracy of predictions of quantities of interest using high-fidelity CFD models while preserving the efficiency of low-fidelity subchannel core calculations. An original methodology named Physics-based Approach for High-to-Low Model Information has been further developed and tested. The overall physical phenomena and corresponding localized effects introduced by the presence of spacer grids in light water reactor (LWR) cores are dissected into four corresponding basic building processes, and the corresponding models are informed using high-fidelity CFD codes. These models are a spacer grid-directed cross-flow model, a grid-enhanced turbulent mixing model, a heat transfer enhancement model, and a spacer grid pressure loss model. The localized CFD models are developed and tested using the CFD code STAR-CCM+, and the corresponding global model development and testing in subchannel formulation is performed in the thermal-hydraulic subchannel code CTF. The improved CTF simulations utilize data files derived from CFD STAR-CCM+ simulation results covering the spacer grid design desired for inclusion in the CTF calculation. The current implementation of these models is examined and possibilities for improvement and further development are suggested. The validation experimental database is extended by including the OECD/NRC PSBT benchmark data. The outcome is an enhanced accuracy of CTF predictions while preserving the computational efficiency of a low-fidelity subchannel code.
Multifidelity, Multidisciplinary Design Under Uncertainty with Non-Intrusive Polynomial Chaos
NASA Technical Reports Server (NTRS)
West, Thomas K., IV; Gumbert, Clyde
2017-01-01
The primary objective of this work is to develop an approach for multifidelity uncertainty quantification and to lay the framework for future design-under-uncertainty efforts. In this study, multifidelity is used to describe both the fidelity of the modeling of the physical systems and the difference in the uncertainty in each of the models. For computational efficiency, a multifidelity surrogate modeling approach based on non-intrusive polynomial chaos using the point-collocation technique is developed for the treatment of both multifidelity modeling and multifidelity uncertainty modeling. Two stochastic model problems are used to demonstrate the developed methodologies: a transonic airfoil model and a multidisciplinary aircraft analysis model. The results of both showed that the multifidelity modeling approach was able to reproduce the output uncertainty predicted by the high-fidelity model at a significant reduction in computational cost.
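Non-intrusive polynomial chaos with point collocation, named above, reduces to three steps: sample the uncertain input, evaluate the model at those samples, and fit orthogonal-polynomial coefficients by least squares. The sketch below does this in one dimension with a cheap quadratic stand-in model; it illustrates the technique, not the paper's multifidelity implementation.

```python
# 1-D non-intrusive polynomial chaos by point collocation: sample the input,
# evaluate the model, fit Hermite coefficients by least squares. The "model"
# here is a cheap quadratic stand-in for an expensive simulation.
import math
import numpy as np

def hermite_prob(x, n):
    """Probabilists' Hermite polynomial He_n(x) via the three-term recurrence."""
    h_prev, h = np.ones_like(x), x
    if n == 0:
        return h_prev
    for k in range(1, n):
        h_prev, h = h, x * h - k * h_prev  # He_{k+1} = x He_k - k He_{k-1}
    return h

def model(xi):
    return 1.0 + 2.0 * xi + 0.5 * xi ** 2  # stand-in simulation

rng = np.random.default_rng(3)
xi = rng.standard_normal(200)              # collocation points, xi ~ N(0, 1)
order = 3
Psi = np.column_stack([hermite_prob(xi, n) for n in range(order + 1)])
coef, *_ = np.linalg.lstsq(Psi, model(xi), rcond=None)

pce_mean = coef[0]                         # = 1.5 for this stand-in
pce_var = sum(coef[n] ** 2 * math.factorial(n)
              for n in range(1, order + 1))  # = 4.5 for this stand-in
```

Because the stand-in is a quadratic, the fitted expansion is exact; the output mean and variance follow directly from the coefficients, which is the efficiency the abstract refers to.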
Continuous Improvement of a Groundwater Model over a 20-Year Period: Lessons Learned.
Andersen, Peter F; Ross, James L; Fenske, Jon P
2018-04-17
Groundwater models developed for specific sites generally become obsolete within a few years due to changes in: (1) modeling technology; (2) site/project personnel; (3) project funding; and (4) modeling objectives. Consequently, new models are sometimes developed for the same sites using the latest technology and data, but without potential knowledge gained from the prior models. When it occurs, this practice is particularly problematic because, although technology, data, and observed conditions change, development of the new numerical model may not consider the conceptual model's underpinnings. As a contrary situation, we present the unique case of a numerical flow and trichloroethylene (TCE) transport model that was first developed in 1993 and since revised and updated annually by the same personnel. The updates are prompted by an increase in the amount of data, exposure to a wider range of hydrologic conditions over increasingly longer timeframes, technological advances, evolving modeling objectives, and revised modeling methodologies. The history of updates shows smooth, incremental changes in the conceptual model and modeled aquifer parameters that result from both increase and decrease in complexity. Myriad modeling objectives have included demonstrating the ineffectiveness of a groundwater extraction/injection system, evaluating potential TCE degradation, locating new monitoring points, and predicting likelihood of exceedance of groundwater standards. The application emphasizes an original tenet of successful groundwater modeling: iterative adjustment of the conceptual model based on observations of actual vs. model response. © 2018, National Ground Water Association.
ERIC Educational Resources Information Center
Justi, Rosaria; van Driel, Jan
2006-01-01
Models play an important role in science education. However, previous research has revealed that science teachers' content knowledge, curricular knowledge, and pedagogical content knowledge on models and modelling are often incomplete or inadequate. From this perspective, a research project was designed which aimed at the development of beginning…
USDA-ARS?s Scientific Manuscript database
Modeling routines of the Integrated Farm System Model (IFSM version 4.2) and Dairy Gas Emission Model (DairyGEM version 3.2), two whole-farm simulation models developed and maintained by USDA-ARS, were revised with new components for: (1) simulation of ammonia (NH3) and greenhouse gas emissions gene...
A New Model for the Integration of Science and Mathematics: The Balance Model
ERIC Educational Resources Information Center
Kiray, S. Ahmet
2012-01-01
The aim of this study is to develop an integrated science and mathematics model suited to the background of Turkish teachers. The dimensions of the model are presented and compared with previously developed models and with the findings of earlier studies on the topic. The model is called the balance, reflecting the…
NASA Astrophysics Data System (ADS)
Aydogan Yenmez, Arzu; Erbas, Ayhan Kursat; Cakiroglu, Erdinc; Alacaci, Cengiz; Cetinkaya, Bulent
2017-08-01
Applications and modelling have gained a prominent role in mathematics education reform documents and curricula. Thus, there is a growing need for studies focusing on the effective use of mathematical modelling in classrooms. Assessment is an integral part of using modelling activities in classrooms, since it allows teachers to identify and manage problems that arise in various stages of the modelling process. However, teachers' difficulties in assessing student modelling work are a challenge to be considered when implementing modelling in the classroom. Thus, the purpose of this study was to investigate how teachers' knowledge of generating criteria for assessing student competence in mathematical modelling evolved through a professional development programme based on a lesson study approach and a modelling perspective. The data were collected from four teachers at two public high schools over a five-month period. The professional development programme included a cyclical process, with each cycle consisting of an introductory meeting, the implementation of a model-eliciting activity with students, and a follow-up meeting. The results showed that the professional development programme contributed to teachers' knowledge for generating assessment criteria on the products and on the observable actions that affect the modelling cycle.
Dynamic Simulation of Human Gait Model With Predictive Capability.
Sun, Jinming; Wu, Shaoli; Voglewede, Philip A
2018-03-01
In this paper, it is proposed that the central nervous system (CNS) controls human gait using a predictive control approach in conjunction with classical feedback control, rather than exclusively classical feedback control, which acts only on past error. To validate this proposition, a dynamic model of human gait is developed using a novel predictive approach to investigate the principles of the CNS. The model developed includes two parts: a plant model that represents the dynamics of human gait and a controller that represents the CNS. The plant model is a seven-segment, six-joint model that has nine degrees of freedom (DOF). The plant model is validated using data collected from able-bodied human subjects. The proposed controller utilizes model predictive control (MPC). MPC uses an internal model to predict the output in advance, compares the predicted output to the reference, and optimizes the control input so that the predicted error is minimal. To decrease the complexity of the model, two joints are controlled using a proportional-derivative (PD) controller. The developed predictive human gait model is validated by simulating able-bodied human gait. The simulation results show that the developed model is able to simulate kinematic output close to experimental data.
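The receding-horizon idea described in this abstract (predict ahead, optimize future inputs against a reference, apply only the first input) can be sketched on a toy plant. The double-integrator "joint", horizon length, and weights below are illustrative assumptions, not the paper's nine-DOF gait model.

```python
import numpy as np

# Discrete double integrator standing in for one controlled joint:
# state x = [position, velocity], input u = acceleration
dt = 0.05
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt ** 2], [dt]])
C = np.array([[1.0, 0.0]])

def mpc_step(x, ref, horizon=20, u_weight=1e-2):
    """One receding-horizon step: express predicted positions as a linear
    function of the stacked future inputs, solve the unconstrained
    least-squares tracking problem, and return only the first input."""
    powers = [np.eye(2)]
    for _ in range(horizon):
        powers.append(A @ powers[-1])       # powers[i] = A^i
    # Free response and input-to-position prediction matrices
    F = np.array([[(C @ powers[k + 1] @ x).item()] for k in range(horizon)])
    G = np.zeros((horizon, horizon))
    for k in range(horizon):
        for j in range(k + 1):
            G[k, j] = (C @ powers[k - j] @ B).item()
    # minimize ||G u + F - ref||^2 + u_weight * ||u||^2
    H = G.T @ G + u_weight * np.eye(horizon)
    u = np.linalg.solve(H, G.T @ (ref - F))
    return float(u[0, 0])

# Track a step reference of 1.0 (e.g. a target joint position)
x = np.zeros((2, 1))
ref = np.ones((20, 1))
for _ in range(200):
    u = mpc_step(x, ref)
    x = A @ x + B * u
```

Re-solving the optimization at every step is what distinguishes this from a fixed feedback law computed once from past error.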
The Unified Plant Growth Model (UPGM): software framework overview and model application
USDA-ARS?s Scientific Manuscript database
Since the Environmental Policy Integrated Climate (EPIC) model was developed in 1989, the EPIC plant growth component has been incorporated into other erosion and crop management models (e.g., WEPS, WEPP, SWAT, ALMANAC, and APEX) and modified to meet model developer research objectives. This has re...
An analytical framework to assist decision makers in the use of forest ecosystem model predictions
USDA-ARS?s Scientific Manuscript database
The predictions of most terrestrial ecosystem models originate from deterministic simulations. Relatively few uncertainty evaluation exercises in model outputs are performed by either model developers or users. This issue has important consequences for decision makers who rely on models to develop n...
An approach to developing user interfaces for space systems
NASA Astrophysics Data System (ADS)
Shackelford, Keith; McKinney, Karen
1993-08-01
Inherent weaknesses in the traditional waterfall model of software development have led to the definition of the spiral model. The spiral model, however, has not been applied to NASA projects. This paper describes its use in developing real-time user interface software for an Environmental Control and Life Support System (ECLSS) Process Control Prototype at NASA's Marshall Space Flight Center.
Linking the Leadership Identity Development Model to Collegiate Recreation and Athletics.
Hall, Stacey L
2015-01-01
The Leadership Identity Development (LID) Model (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005) provides a stage leadership development model for college students that can be applied to collegiate recreation student staff, volunteers, participants, and varsity student-athletes. This chapter provides guidance to implement the model in these settings and to create environments that support development. © 2015 Wiley Periodicals, Inc., A Wiley Company.
Directory of Energy Information Administration model abstracts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-08-11
This report contains brief statements from the model managers about each model's title, acronym, purpose, and status, followed by more detailed information on characteristics, uses, and requirements. Sources for additional information are identified. All models "active" through March 1987 are included. The main body of this directory is an alphabetical list of all active EIA models. Appendix A identifies major EIA modeling systems and the models within these systems, and Appendix B identifies active EIA models by type (basic, auxiliary, and developing). A basic model is one designated by the EIA Administrator as being sufficiently important to require sustained support and public scrutiny. An auxiliary model is one designated by the EIA Administrator as being used only occasionally in analyses, and therefore requiring minimal levels of documentation. A developing model is one designated by the EIA Administrator as being under development and yet of sufficient interest to require a basic level of documentation at a future date. EIA also leases models developed by proprietary software vendors. Documentation for these "proprietary" models is the responsibility of the companies from which they are leased. EIA has recently leased models from Chase Econometrics, Inc., Data Resources, Inc. (DRI), the Oak Ridge National Laboratory (ORNL), and Wharton Econometric Forecasting Associates (WEFA). Leased models are not abstracted here. The directory is intended for the use of energy and energy-policy analysts in the public and private sectors.
Markstrom, Steven L.; Niswonger, Richard G.; Regan, R. Steven; Prudic, David E.; Barlow, Paul M.
2008-01-01
The need to assess the effects of variability in climate, biota, geology, and human activities on water availability and flow requires the development of models that couple two or more components of the hydrologic cycle. An integrated hydrologic model called GSFLOW (Ground-water and Surface-water FLOW) was developed to simulate coupled ground-water and surface-water resources. The new model is based on the integration of the U.S. Geological Survey Precipitation-Runoff Modeling System (PRMS) and the U.S. Geological Survey Modular Ground-Water Flow Model (MODFLOW). Additional model components were developed, and existing components were modified, to facilitate integration of the models. Methods were developed to route flow among the PRMS Hydrologic Response Units (HRUs) and between the HRUs and the MODFLOW finite-difference cells. This report describes the organization, concepts, design, and mathematical formulation of all GSFLOW model components. An important aspect of the integrated model design is its ability to conserve water mass and to provide comprehensive water budgets for a location of interest. This report includes descriptions of how water budgets are calculated for the integrated model and for individual model components. GSFLOW provides a robust modeling system for simulating flow through the hydrologic cycle, while allowing for future enhancements to incorporate other simulation techniques.
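The comprehensive water budgets described in this abstract reduce to a mass-balance check: inflows minus outflows minus the change in storage should be near zero. The sketch below shows that check with invented component names and numbers; it is not GSFLOW's actual budget routine.

```python
def budget_error_percent(inflows, outflows, storage_change):
    """Percent mass-balance error for a budget region: the residual of
    inflows - outflows - storage change, expressed relative to total inflow.
    Component names and values are illustrative only."""
    residual = sum(inflows) - sum(outflows) - storage_change
    return 100.0 * residual / max(sum(inflows), 1e-12)

err = budget_error_percent(
    inflows=[120.0, 30.0],          # e.g. precipitation, boundary inflow
    outflows=[70.0, 45.0, 20.0],    # e.g. ET, streamflow, pumping
    storage_change=15.0,            # net storage gain over the period
)
```

A coupled model that conserves mass should report a residual near zero for every region and for the domain as a whole.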
A review of unmanned aircraft system ground risk models
NASA Astrophysics Data System (ADS)
Washington, Achim; Clothier, Reece A.; Silva, Jose
2017-11-01
There is much effort being directed towards the development of safety regulations for unmanned aircraft systems (UAS). National airworthiness authorities have advocated the adoption of a risk-based approach, whereby regulations are driven by the outcomes of a systematic process to assess and manage identified safety risks. Subsequently, models characterising the primary hazards associated with UAS operations have now become critical to the development of regulations and in turn, to the future of the industry. Key to the development of airworthiness regulations for UAS is a comprehensive understanding of the risks UAS operations pose to people and property on the ground. A comprehensive review of the literature identified 33 different models (and component sub models) used to estimate ground risk posed by UAS. These models comprise failure, impact location, recovery, stress, exposure, incident stress and harm sub-models. The underlying assumptions and treatment of uncertainties in each of these sub-models differ significantly between models, which can have a significant impact on the development of regulations. This paper reviews the state-of-the-art in research into UAS ground risk modelling, discusses how the various sub-models relate to the different components of the regulation, and explores how model-uncertainties potentially impact the development of regulations for UAS.
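Many of the ground risk models surveyed compose their sub-models multiplicatively: a failure rate, an exposed population under the impact footprint, and a harm probability. The sketch below shows that generic composition; the function name, parameters, and values are invented for illustration and do not reproduce any specific published model.

```python
def ground_casualty_rate(failure_rate_per_hr, lethal_area_m2,
                         pop_density_per_km2, shelter_factor):
    """Illustrative chain of sub-models: failure -> impact -> exposure -> harm.
    Returns expected ground casualties per flight hour. The product form and
    all parameter names are assumptions for this sketch."""
    pop_density_per_m2 = pop_density_per_km2 / 1e6
    exposed = lethal_area_m2 * pop_density_per_m2   # expected people in the impact area
    return failure_rate_per_hr * exposed * (1.0 - shelter_factor)

# e.g. 1e-4 failures/hr, 10 m^2 lethal area, 500 people/km^2, 70% sheltered
rate = ground_casualty_rate(1e-4, 10.0, 500.0, 0.7)
```

Because the result is a product, uncertainty in any one sub-model (e.g. the shelter factor) propagates directly into the risk estimate, which is why the review emphasizes the treatment of sub-model uncertainty.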
A Model of the Temporal Dynamics of Knowledge Brokerage in Sustainable Development
ERIC Educational Resources Information Center
Hukkinen, Janne I.
2016-01-01
I develop a conceptual model of the temporal dynamics of knowledge brokerage for sustainable development. Brokerage refers to efforts to make research and policymaking more accessible to each other. The model enables unbiased and systematic consideration of knowledge brokerage as part of policy evolution. The model is theoretically grounded in…
ERIC Educational Resources Information Center
Edinger, Matthew J.
2017-01-01
This article theoretically develops and examines the outcomes of a pilot study that evaluates the PACKaGE Model of online Teacher Professional Development (the Model). The Model was created to facilitate positive pedagogical change within gifted education teachers' practice, attitude, collaboration, content knowledge, and goal effectiveness.…
Learning Together: The Role of the Online Community in Army Professional Education
2005-05-26
Kolb, Experiential Learning: Experience as the Source of Learning and Development... Experiential Learning. One model frequently discussed is experiential learning.15 Kolb develops this model through analysis of older models. One of the... observations about the experience. Kolb develops several characteristics of adult learning. Kolb discusses his model of experiential learning
Experiences in Teaching a Graduate Course on Model-Driven Software Development
ERIC Educational Resources Information Center
Tekinerdogan, Bedir
2011-01-01
Model-driven software development (MDSD) aims to support the development and evolution of software intensive systems using the basic concepts of model, metamodel, and model transformation. In parallel with the ongoing academic research, MDSD is more and more applied in industrial practices. After being accepted both by a broad community of…
A Leadership Identity Development Model: Applications from a Grounded Theory
ERIC Educational Resources Information Center
Komives, Susan R.; Mainella, Felicia C.; Longerbeam, Susan D.; Osteen, Laura; Owen, Julie E.
2006-01-01
This article describes a stage-based model of leadership identity development (LID) that resulted from a grounded theory study on developing a leadership identity (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005). The LID model expands on the leadership identity stages, integrates the categories of the grounded theory into the LID model, and…
Career Planning: Towards a More Inclusive Model for Women and Diverse Individuals
ERIC Educational Resources Information Center
Banks, Claretha H.
2006-01-01
Since the 1953 introduction of Super's model of career development, many publications regarding career development and career planning have been developed. However, career planning models for women and diverse individuals are not prevalent. This paper contains a literature review of various well-known models that have few specific applications for…
Use of hydrologic and hydrodynamic modeling for ecosystem restoration
Obeysekera, J.; Kuebler, L.; Ahmed, S.; Chang, M.-L.; Engel, V.; Langevin, C.; Swain, E.; Wan, Y.
2011-01-01
Planning and implementation of unprecedented projects for restoring the greater Everglades ecosystem are underway, and hydrologic and hydrodynamic modeling of restoration alternatives has become essential to the success of restoration efforts. In view of the complex nature of the South Florida water resources system, regional-scale (system-wide) hydrologic models have been developed and used extensively for the development of the Comprehensive Everglades Restoration Plan. In addition, numerous subregional-scale hydrologic and hydrodynamic models have been developed and are being used for evaluating project-scale water management plans associated with urban, agricultural, and inland coastal ecosystems. The authors provide a comprehensive summary of models of all scales, as well as the next-generation models under development to meet the future needs of ecosystem restoration efforts in South Florida. The multiagency efforts to develop and apply models have allowed the agencies to understand the complex hydrologic interactions, quantify appropriate performance measures, and use new technologies in simulation algorithms, software development, and GIS/database techniques to meet the future modeling needs of the ecosystem restoration programs. © 2011 Taylor & Francis Group, LLC.
Animal models of neoplastic development.
Pitot, H C
2001-01-01
The basic animal model for neoplastic development used by regulatory agencies is the two-year chronic bioassay developed more than 30 years ago and based on the presumed mechanism of action of a few potential chemical carcinogens. Since that time, a variety of other model carcinogenic systems have been developed, usually involving shorter duration, single organ endpoints, multistage models, and those in genetically-engineered mice. The chronic bioassay is still the "gold standard" of regulatory agencies despite a number of deficiencies, while in this country the use of shorter term assays based on single organ endpoints has not been popular. The multistage model of carcinogenesis in mouse epidermis actually preceded the development of the chronic two-year bioassay, but it was not until multistage models in other organ systems were developed that the usefulness of such systems became apparent. Recently, several genetically-engineered mouse lines involving mutations in proto-oncogenes and tumour suppressor genes have been proposed as additional model systems for use in regulatory decisions. It is likely that a combination of several of these model systems may be most useful in both practical and basic applications of cancer prevention and therapy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin; Zhang, Yongfeng; Chakraborty, Pritam
2014-09-01
This report summarizes work during FY 2014 to develop capabilities to predict embrittlement of reactor pressure vessel steel and to assess the response of embrittled reactor pressure vessels to postulated accident conditions. This work has been conducted at three length scales. At the engineering scale, 3D fracture mechanics capabilities have been developed to calculate stress intensities and fracture toughnesses and to perform a deterministic assessment of whether a crack would propagate at the location of an existing flaw. This capability has been demonstrated on several types of flaws in a generic reactor pressure vessel model. Models have been developed at the scale of fracture specimens to determine how irradiation affects the fracture toughness of the material. Verification work has been performed on a previously developed model to determine the sensitivity of the model to specimen geometry and size effects, and the effects of irradiation on the parameters of this model have been investigated. At lower length scales, work has continued in an ongoing effort to understand how irradiation and thermal aging affect the microstructure and mechanical properties of reactor pressure vessel steel. Previously developed atomistic kinetic Monte Carlo models have been further developed and benchmarked against experimental data. Initial work has been performed to develop models of nucleation in a phase field model. Additional modeling work has also been performed to improve the fundamental understanding of the formation mechanisms and stability of matrix defects.
The Modular Modeling System (MMS): A toolbox for water- and environmental-resources management
Leavesley, G.H.; Markstrom, S.L.; Viger, R.J.; Hay, L.E.; ,
2005-01-01
The increasing complexity of water- and environmental-resource problems requires modeling approaches that incorporate knowledge from a broad range of scientific and software disciplines. To address this need, the U.S. Geological Survey (USGS) has developed the Modular Modeling System (MMS). MMS is an integrated system of computer software for model development, integration, and application. Its modular design allows a high level of flexibility and adaptability to enable modelers to incorporate their own software into a rich array of built-in models and modeling tools. These include individual process models, tightly coupled models, loosely coupled models, and fully integrated decision support systems. A geographic information system (GIS) interface, the USGS GIS Weasel, has been integrated with MMS to enable spatial delineation and characterization of basin and ecosystem features, and to provide objective parameter-estimation methods for models using available digital data. MMS provides optimization and sensitivity-analysis tools to analyze model parameters and evaluate the extent to which uncertainty in model parameters affects uncertainty in simulation results. MMS has been coupled with the Bureau of Reclamation object-oriented reservoir and river-system modeling framework, RiverWare, to develop models to evaluate and apply optimal resource-allocation and management strategies to complex, operational decisions on multipurpose reservoir systems and watersheds. This decision support system approach has been developed, tested, and implemented in the Gunnison, Yakima, San Joaquin, Rio Grande, and Truckee River basins of the western United States. MMS is currently being coupled with the U.S. Forest Service model SIMulating Patterns and Processes at Landscape Scales (SIMPPLLE) to assess the effects of alternative vegetation-management strategies on a variety of hydrological and ecological responses.
Initial development and testing of the MMS-SIMPPLLE integration is being conducted on the Colorado Plateau region of the western United States.
Comparison of Conventional and ANN Models for River Flow Forecasting
NASA Astrophysics Data System (ADS)
Jain, A.; Ganti, R.
2011-12-01
Hydrological models are useful in many water resources applications such as flood control, irrigation and drainage, hydropower generation, water supply, and erosion and sediment control. Estimates of runoff are needed in many water resources planning, design, development, operation, and maintenance activities. River flow is generally estimated using time series or rainfall-runoff models. Recently, soft artificial intelligence tools such as Artificial Neural Networks (ANNs) have become popular for research purposes but have not been extensively adopted in operational hydrological forecasts. There is a strong need to develop ANN models based on real catchment data and compare them with the conventional models. In this paper, a comparative study has been carried out for river flow forecasting using conventional and ANN models. Among the conventional models, multiple linear and nonlinear regression models and time series models of the autoregressive (AR) type have been developed. A feed-forward neural network structure trained using the back-propagation algorithm, a gradient search method, was adopted. The daily river flow data derived from the Godavari Basin at Polavaram, Andhra Pradesh, India have been employed to develop all the models included here. Two inputs, the flows at the two past time steps Q(t-1) and Q(t-2), were selected using partial autocorrelation analysis for forecasting the flow at time t, Q(t). A wide range of error statistics has been used to evaluate the performance of all the models developed in this study. It was found that the regression and AR models performed comparably, and the ANN model performed the best among all the models investigated. It is concluded that the ANN model should be adopted in real catchments for hydrological modeling and forecasting.
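The conventional AR baseline described in this abstract, with the same two lagged inputs Q(t-1) and Q(t-2), can be sketched with ordinary least squares. The synthetic seasonal flow series below is a stand-in for the real catchment data, which this sketch does not reproduce.

```python
import numpy as np

def fit_ar2(flow):
    """Fit Q(t) = a1*Q(t-1) + a2*Q(t-2) + c by ordinary least squares,
    mirroring the two lagged inputs selected in the study."""
    X = np.column_stack([flow[1:-1], flow[:-2], np.ones(len(flow) - 2)])
    y = flow[2:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast_ar2(coef, q1, q2):
    a1, a2, c = coef
    return a1 * q1 + a2 * q2 + c

# Synthetic daily flow: seasonal cycle plus noise (illustrative, not Godavari data)
rng = np.random.default_rng(1)
t = np.arange(730)
flow = 100 + 40 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 5, t.size)

coef = fit_ar2(flow[:600])     # train on the first ~20 months
preds = [forecast_ar2(coef, flow[i - 1], flow[i - 2]) for i in range(600, 730)]
rmse = float(np.sqrt(np.mean((np.array(preds) - flow[600:730]) ** 2)))
```

An ANN comparison would use the same two lagged inputs and the same holdout split, replacing only the linear map with a trained network.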
Craven, Stephen; Shirsat, Nishikant; Whelan, Jessica; Glennon, Brian
2013-01-01
A Monod kinetic model, logistic equation model, and statistical regression model were developed for a Chinese hamster ovary cell bioprocess operated under three different modes of operation (batch, bolus fed-batch, and continuous fed-batch) and grown on two different bioreactor scales (3 L bench-top and 15 L pilot-scale). The Monod kinetic model was developed for all modes of operation under study and predicted cell density and glucose, glutamine, lactate, and ammonia concentrations well for the bioprocess. However, it was computationally demanding due to the large number of parameters necessary to produce a good model fit. The transferability of the Monod kinetic model structure and parameter set across bioreactor scales and modes of operation was investigated and a parameter sensitivity analysis performed. The experimentally determined parameters had the greatest influence on model performance. They changed with scale and mode of operation, but were easily calculated. The remaining parameters, which were fitted using a differential evolutionary algorithm, were not as crucial. Logistic equation and statistical regression models were investigated as alternatives to the Monod kinetic model. They were less computationally intensive to develop due to the absence of a large parameter set. However, modeling of the nutrient and metabolite concentrations proved to be troublesome due to the logistic equation model structure and the inability of both models to incorporate a feed. The complexity, computational load, and effort required for model development have to be balanced against the necessary level of model sophistication when choosing which model type to develop for a particular application. Copyright © 2012 American Institute of Chemical Engineers (AIChE).
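The logistic equation alternative mentioned in this abstract has a closed form and only a few parameters, which is why it is cheap to fit. The sketch below fits just the growth rate by grid search on synthetic data; all values and the fixed parameters are illustrative assumptions, not the study's CHO data.

```python
import numpy as np

def logistic_cells(t, x0, xmax, k):
    """Closed-form solution of logistic growth dX/dt = k*X*(1 - X/xmax)."""
    return xmax * x0 * np.exp(k * t) / (xmax - x0 + x0 * np.exp(k * t))

# Synthetic viable-cell-density data (1e6 cells/mL) for a batch culture
t = np.linspace(0, 120, 25)                     # hours
rng = np.random.default_rng(2)
data = logistic_cells(t, 0.3, 8.0, 0.08) + rng.normal(0, 0.1, t.size)

# Fit only the growth rate k by coarse grid search (x0, xmax assumed known)
ks = np.linspace(0.01, 0.2, 200)
sse = [np.sum((logistic_cells(t, 0.3, 8.0, k) - data) ** 2) for k in ks]
k_fit = float(ks[int(np.argmin(sse))])
```

The small parameter set is also the model's weakness noted in the abstract: there is no term through which a feed stream can enter the equation.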
Stone, Wesley W.; Gilliom, Robert J.; Crawford, Charles G.
2008-01-01
Regression models were developed for predicting annual maximum and selected annual maximum moving-average concentrations of atrazine in streams using the Watershed Regressions for Pesticides (WARP) methodology developed by the National Water-Quality Assessment Program (NAWQA) of the U.S. Geological Survey (USGS). The current effort builds on the original WARP models, which were based on the annual mean and selected percentiles of the annual frequency distribution of atrazine concentrations. Estimates of annual maximum and annual maximum moving-average concentrations for selected durations are needed to characterize the levels of atrazine and other pesticides for comparison to specific water-quality benchmarks for evaluation of potential concerns regarding human health or aquatic life. Separate regression models were derived for the annual maximum and annual maximum 21-day, 60-day, and 90-day moving-average concentrations. Development of the regression models used the same explanatory variables, transformations, model development data, model validation data, and regression methods as those used in the original development of WARP. The models accounted for 72 to 75 percent of the variability in the concentration statistics among the 112 sampling sites used for model development. Predicted concentration statistics from the four models were within a factor of 10 of the observed concentration statistics for most of the model development and validation sites. Overall, performance of the models for the development and validation sites supports the application of the WARP models for predicting annual maximum and selected annual maximum moving-average atrazine concentration in streams and provides a framework to interpret the predictions in terms of uncertainty. 
For streams with inadequate direct measurements of atrazine concentrations, the WARP model predictions for the annual maximum and the annual maximum moving-average atrazine concentrations can be used to characterize the probable levels of atrazine for comparison to specific water-quality benchmarks. Sites with a high probability of exceeding a benchmark for human health or aquatic life can be prioritized for monitoring.
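The concentration statistics the WARP models predict can be illustrated directly: the annual maximum and the annual maximum 21-day moving average of a daily series. The synthetic atrazine pulse below is invented for illustration and is not NAWQA data.

```python
import numpy as np

def annual_max_moving_avg(daily_conc, window):
    """Annual maximum of the `window`-day moving average of a daily series,
    the kind of concentration statistic the WARP models are fitted to."""
    kernel = np.ones(window) / window
    moving = np.convolve(daily_conc, kernel, mode="valid")
    return float(moving.max())

# One synthetic year: a spring pulse over a low baseline (µg/L, illustrative)
days = np.arange(365)
conc = 0.05 + 3.0 * np.exp(-0.5 * ((days - 140) / 10.0) ** 2)

peak_1day = float(conc.max())
peak_21day = annual_max_moving_avg(conc, 21)
```

The moving-average maximum is always at or below the 1-day maximum, which is why separate regressions are needed for each averaging duration when comparing against duration-specific benchmarks.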
Kropf, Stefan; Chalopin, Claire; Lindner, Dirk; Denecke, Kerstin
2017-06-28
Access to patient data within a hospital or between hospitals is still problematic, since a variety of information systems is in use, applying different vendor-specific terminologies and underlying knowledge models. Moreover, the development of electronic health record systems (EHRSs) is time- and resource-consuming. Thus, there is a substantial need for a development strategy for standardized EHRSs. We apply a reuse-oriented process model and demonstrate its feasibility and realization on a practical medical use case: an EHRS holding all relevant data arising in the context of treatment of tumors of the sella region. In this paper, we describe the development process and our practical experiences. Requirements for the development of the EHRS were collected through interviews with a neurosurgeon and analysis of patient data. For modelling of patient data, we selected openEHR as the standard and exploited the software tools provided by the openEHR foundation. The patient information model forms the core of the development process, which comprises the EHR generation and the implementation of an EHRS architecture. Moreover, a reuse-oriented process model from the business domain was adapted to the development of the EHRS. The reuse-oriented process model provides a suitable abstraction of both the modeling and the development of an EHR-centered EHRS. The information modeling process resulted in 18 archetypes that were aggregated in a template and formed the basis of the model-driven development. The EHRs and the EHRS were developed using openEHR and W3C standards, tightly supported by well-established XML techniques. The GUI of the final EHRS integrates and visualizes information from various examinations, medical reports, findings, and laboratory test results. We conclude that the development of a standardized overarching EHR and an EHRS is feasible using openEHR and W3C standards, enabling a high degree of semantic interoperability.
The standardized representation visualizes data and can in this way support the decision process of clinicians.
Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1993-01-01
This report covers the period from January 1993 through August 1993. The primary tasks during this period were the development of a single- and multi-vibrational-temperature preferential vibration-dissociation coupling model, the development of a normal-shock nonequilibrium radiation-gasdynamic coupling model based upon the blunt body model, and the comparison of results obtained with these models with experimental data. In addition, an extensive series of computations was conducted using the blunt body model to develop a set of reference results covering a wide range of vehicle sizes, altitudes, and entry velocities.
The development of advanced manufacturing systems
NASA Astrophysics Data System (ADS)
Doumeingts, Guy; Vallespir, Bruno; Darricau, Didier; Roboam, Michel
Various methods for the design of advanced manufacturing systems (AMSs) are reviewed. The specifications for AMSs and problems inherent in their development are first discussed. Three models, the Computer Aided Manufacturing-International model, the National Bureau of Standards model, and the GRAI model, are considered in detail. Hierarchical modeling tools such as structured analysis and design techniques, Petri nets, and the ICAM definition method are used in the development of integrated manufacturing models. Finally, the GRAI method is demonstrated in the design of specifications for the production management system of the Snecma AMS.
Dantigny, Philippe; Guilmart, Audrey; Bensoussan, Maurice
2005-04-15
For over 20 years, predictive microbiology has focused on food-pathogenic bacteria. Few studies have concerned modelling fungal development. On the one hand, most food mycologists are not familiar with modelling techniques; on the other hand, people involved in modelling are developing tools dedicated to bacteria. Therefore, there is a tendency to extend the use of models that were developed for bacteria to moulds. However, some mould specificities should be taken into account. The use of specific models for predicting germination and growth of fungi was advocated previously []. This paper provides a short review of fungal modelling studies.
Road safety forecasts in five European countries using structural time series models.
Antoniou, Constantinos; Papadimitriou, Eleonora; Yannis, George
2014-01-01
Modeling road safety development is a complex task that needs to consider both the quantifiable impact of specific parameters and the underlying trends that cannot always be measured or observed. The objective of this research is to apply structural time series models for obtaining reliable medium- to long-term forecasts of road traffic fatality risk using data from 5 countries with different characteristics from all over Europe (Cyprus, Greece, Hungary, Norway, and Switzerland). Two structural time series models are considered: (1) the local linear trend model and (2) the latent risk time series model. Furthermore, a structured decision tree for the selection of the applicable model for each situation (developed within the Road Safety Data, Collection, Transfer and Analysis [DaCoTA] research project, cofunded by the European Commission) is outlined. First, the fatality and exposure data that are used for the development of the models are presented and explored. Then, the modeling process is presented, including the model selection process, introduction of intervention variables, and development of mobility scenarios. The forecasts using the developed models appear to be realistic and within acceptable confidence intervals. The proposed methodology proved very efficient for handling different cases of data availability and quality, providing an appropriate alternative from the family of structural time series models for each country. A concluding section providing perspectives and directions for future research is presented.
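The local linear trend model named in this abstract can be sketched with a small Kalman filter: an unobserved level and slope evolve with process noise, and forecasts extrapolate the last filtered state. The noise variances below are fixed by hand rather than estimated by maximum likelihood, and the declining fatality series is synthetic.

```python
import numpy as np

def local_linear_trend_forecast(y, q_level=0.01, q_slope=0.001, r=1.0, steps=5):
    """Kalman filter for the local linear trend model
        y_t = level_t + eps,  level_{t+1} = level_t + slope_t + xi,
        slope_{t+1} = slope_t + zeta,
    then extrapolate the last filtered state `steps` periods ahead."""
    T = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
    Z = np.array([[1.0, 0.0]])               # observation matrix
    Q = np.diag([q_level, q_slope])          # process noise covariance
    x = np.array([[y[0]], [0.0]])            # initial state [level, slope]
    P = np.eye(2) * 10.0                      # diffuse-ish initial covariance
    for obs in y:
        x = T @ x                             # predict
        P = T @ P @ T.T + Q
        S = (Z @ P @ Z.T).item() + r          # update
        K = P @ Z.T / S
        x = x + K * (obs - (Z @ x).item())
        P = P - K @ Z @ P
    return [float(x[0, 0] + (k + 1) * x[1, 0]) for k in range(steps)]

# Synthetic declining fatality-risk series, forecast 5 periods ahead
y = np.array([100.0, 96, 93, 90, 88, 85, 82, 80, 77, 75])
fc = local_linear_trend_forecast(y, steps=5)
```

Because the slope is itself a filtered state, the forecast continues the estimated downward trend rather than flattening at the last observation.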
Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data
NASA Astrophysics Data System (ADS)
Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai
2017-11-01
Accurate load models are critical for power system analysis and operation. A large amount of research has been done on load modeling, but most of it focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis, and not all aspects of the model V&V problem have been addressed by existing approaches. To complement existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides model users with a confidence level for the developed load model. The analysis results can also be used to calibrate load models. The proposed framework can serve as guidance for utility engineers and researchers in systematically examining load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
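A minimal sketch of the statistical-analysis idea: replacing a visual check with quantitative error statistics and a confidence interval on the model-measurement mismatch. The data, tolerance, and pass criterion below are invented for illustration and are far simpler than the paper's framework:

```python
import math
import statistics

def validate_model(measured, simulated, tol_rmse):
    """Quantify load-model accuracy against field measurements:
    RMSE as the accuracy metric, plus a confidence interval on bias."""
    errors = [s - m for m, s in zip(measured, simulated)]
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    bias = statistics.mean(errors)
    # 95% confidence interval on the mean error (normal approximation).
    half = 1.96 * statistics.stdev(errors) / math.sqrt(len(errors))
    return {"rmse": rmse, "bias": bias,
            "bias_ci": (bias - half, bias + half),
            "valid": rmse <= tol_rmse}

# Hypothetical feeder power measurements (MW) vs. load-model output.
measured  = [10.2, 11.0, 12.5, 11.8, 10.9, 11.4]
simulated = [10.0, 11.2, 12.3, 12.0, 10.7, 11.5]
print(validate_model(measured, simulated, tol_rmse=0.5))
```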
Program of research in severe storms
NASA Technical Reports Server (NTRS)
1979-01-01
Two modeling areas, the development of a mesoscale chemistry-meteorology interaction model and the development of a combined urban chemical kinetics-transport model, are examined. The problems associated with developing a three-dimensional combined meteorological-chemical kinetics computer program package are defined. A similar three-dimensional hydrostatic real-time model which solves the fundamental Navier-Stokes equations for nonviscous flow is described. An urban air quality simulation model, developed to predict the temporal and spatial distribution of reactive and nonreactive gases in and around an urban area and to support a remote sensor evaluation program, is reported.
Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1990-01-01
The primary tasks during January 1990 to June 1990 were the development and evaluation of various electron and electron-electronic energy equation models, the continued development of improved nonequilibrium radiation models for molecules and atoms, and the continued development and investigation of precursor models and their effects. In addition, work was initiated to develop a vibrational model for the viscous shock layer (VSL) nonequilibrium chemistry blunt body engineering code. Also, an effort was started on the effects of including carbon species, such as from an ablator, in the flowfield.
Logic models as a tool for sexual violence prevention program development.
Hawkins, Stephanie R; Clinton-Sherrod, A Monique; Irvin, Neil; Hart, Laurie; Russell, Sarah Jane
2009-01-01
Sexual violence is a growing public health problem, and there is an urgent need to develop sexual violence prevention programs. Logic models have emerged as a vital tool in program development. The Centers for Disease Control and Prevention funded an empowerment evaluation designed to work with programs focused on the prevention of first-time male perpetration of sexual violence, and it included, as one of its goals, the development of program logic models. Two case studies are presented that describe how significant positive changes can be made to programs as a result of developing logic models that accurately describe desired outcomes. The first case study describes how the logic model development process made an organization aware of the importance of a program's environmental context for program success; the second demonstrates how developing a program logic model can elucidate gaps in organizational programming and suggest ways to close those gaps.
CFD Code Development for Combustor Flows
NASA Technical Reports Server (NTRS)
Norris, Andrew
2003-01-01
During the lifetime of this grant, work was performed in the areas of model development, code development, code validation, and code application. Model development included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics, and assumed-PDF work. Many of these models were then implemented in the code, and many improvements were made to the code, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models, and turbulence-chemistry interaction methodology. Validation of all new models and code improvements was also performed, and the code was applied to the ZCET program and the NPSS GEW combustor program. Several important items remain under development, including NOx post-processing, assumed-PDF model development, and chemical kinetics development. It is expected that this work will continue under the new grant.
Statistically Qualified Neuro-Analytic system and Method for Process Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
1998-11-04
An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.
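A toy sketch of the deterministic model adaptation step: an analytic model of known process physics is augmented with a small learned correction fitted to the residual. A single linear term here stands in for the patent's neural network, and plain squared-error gradient descent for its scaled-error technique; all values are invented:

```python
def train_neuro_analytic(xs, ys, analytic, epochs=2000, lr=0.05):
    """Augment an analytic model with a learned correction (w*x + b)
    trained on the residual between model output and measurements."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            err = (analytic(x) + w * x + b) - y
            w -= lr * err * x  # gradient of squared error w.r.t. w
            b -= lr * err      # gradient of squared error w.r.t. b
    return lambda x: analytic(x) + w * x + b

# Known physics: output ~ 2*x; an unmodeled effect adds 0.5*x + 1.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [2.0 * x + 0.5 * x + 1.0 for x in xs]
model = train_neuro_analytic(xs, ys, analytic=lambda x: 2.0 * x)
```

The stochastic adaptation step (likelihood function and sequential probability ratio test) would then be layered on top of this trained hybrid model.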
Timmis, J; Alden, K; Andrews, P; Clark, E; Nellis, A; Naylor, B; Coles, M; Kaye, P
2017-03-01
This tutorial promotes good practice for exploring the rationale of systems pharmacology models. A safety systems engineering inspired notation approach provides much needed rigor and transparency in development and application of models for therapeutic discovery and design of intervention strategies. Structured arguments over a model's development, underpinning biological knowledge, and analyses of model behaviors are constructed to determine the confidence that a model is fit for the purpose for which it will be applied. © 2016 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
Overcoming limitations of model-based diagnostic reasoning systems
NASA Technical Reports Server (NTRS)
Holtzblatt, Lester J.; Marcotte, Richard A.; Piazza, Richard L.
1989-01-01
The development of a model-based diagnostic system to overcome the limitations of model-based reasoning systems is discussed. It is noted that model-based reasoning techniques can be used to analyze the failure behavior and diagnosability of system and circuit designs as part of the design process itself. One goal of current research is the development of a diagnostic algorithm that can reason efficiently about large numbers of diagnostic suspects and can handle both combinational and sequential circuits. A second goal is to address the model-creation problem by developing an approach for using design models to construct the GMODS model in an automated fashion.
Developing Cognitive Models for Social Simulation from Survey Data
NASA Astrophysics Data System (ADS)
Alt, Jonathan K.; Lieberman, Stephen
The representation of human behavior and cognition continues to challenge the modeling and simulation community. The use of survey and polling instruments to inform belief states, issue stances, and action choice models provides a compelling means of developing models and simulations with empirical data. Using these types of data to populate social simulations can greatly enhance the feasibility of validation efforts, the reusability of social and behavioral modeling frameworks, and the testable reliability of simulations. We provide a case study demonstrating these effects, document the use of survey data to develop cognitive models, and suggest future paths forward for social and behavioral modeling.
The Use of DNS in Turbulence Modeling
NASA Technical Reports Server (NTRS)
Mansour, Nagi N.; Merriam, Marshal (Technical Monitor)
1997-01-01
The use of direct numerical simulation (DNS) data in developing and testing turbulence models is reviewed. The data are used to test turbulence models at all levels: algebraic, one-equation, two-equation, and full Reynolds stress models. Particular examples of the development of models for the dissipation rate equation are presented. Homogeneous flows are used to test new scaling arguments for the various terms in the dissipation rate equation. The channel flow data are used to develop modifications to the equation model that take into account near-wall effects. DNS data for compressible flows under mean compression are used to test new compressible modifications to the two-equation models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
James Francfort; Kevin Morrow; Dimitri Hochard
2007-02-01
This report documents efforts to develop a computer tool for modeling the economic payback of comparable airport ground support equipment (GSE) propelled either by electric motors or by gasoline and diesel engines. The types of GSE modeled are pushback tractors, baggage tractors, and belt loaders. The GSE modeling tool includes an emissions module that estimates the amount of tailpipe emissions saved by replacing internal combustion engine GSE with electric GSE. This report contains modeling assumptions, methodology, a user's manual, and modeling results. The model was developed based on the operations of two airlines at four United States airports.
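The core payback comparison reduces to a one-line calculation; the function and dollar figures below are illustrative stand-ins, since the actual tool models duty cycles, fuel prices, and maintenance in much more detail:

```python
def payback_years(extra_capital, annual_savings):
    """Simple payback for replacing a diesel GSE unit with an electric
    one: extra purchase cost divided by annual operating savings."""
    if annual_savings <= 0:
        return float("inf")  # the electric option never pays back
    return extra_capital / annual_savings

# Hypothetical baggage tractor: electric costs $30k more up front,
# but saves $8k/yr in fuel plus $4k/yr in maintenance.
years = payback_years(30_000, 8_000 + 4_000)
print(years)  # 2.5
```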
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common information systems development practice, business modeling activities are still of a mostly empirical nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
Theory, development, and applicability of the surface water hydrologic model CASC2D
NASA Astrophysics Data System (ADS)
Downer, Charles W.; Ogden, Fred L.; Martin, William D.; Harmon, Russell S.
2002-02-01
Numerical tests indicate that Hortonian runoff mechanisms benefit from scaling effects that non-Hortonian runoff mechanisms do not share. This potentially makes Hortonian watersheds more amenable to physically based modelling provided that the physically based model employed properly accounts for rainfall distribution and initial soil moisture conditions, to which these types of model are highly sensitive. The distributed Hortonian runoff model CASC2D has been developed and tested for the US Army over the past decade. The purpose of the model is to provide the Army with superior predictions of runoff and stream-flow compared with the standard lumped parameter model HEC-1. The model is also to be used to help minimize negative effects on the landscape caused by US armed forces training activities. Development of the CASC2D model is complete and the model has been tested and applied at several locations. These applications indicate that the model can realistically reproduce hydrographs when properly applied. These applications also indicate that there may be many situations where the model is inadequate. Because of this, the Army is pursuing development of a new model, GSSHA, that will provide improved numerical stability and incorporate additional stream-flow-producing mechanisms and improved hydraulics.
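Hortonian (infiltration-excess) runoff, the mechanism CASC2D simulates, occurs when rainfall intensity exceeds the soil's infiltration capacity. A minimal sketch using the classic Horton decay equation, with all parameter values illustrative rather than taken from the model, is:

```python
import math

def horton_runoff(rain_rate, duration, f0=3.0, fc=0.5, k=2.0, dt=0.01):
    """Accumulate infiltration-excess runoff: rain falling faster than
    the Horton capacity f(t) = fc + (f0 - fc)*exp(-k*t) runs off.
    Units: cm/h for rates, hours for times; returns runoff depth (cm)."""
    runoff, t = 0.0, 0.0
    while t < duration:
        capacity = fc + (f0 - fc) * math.exp(-k * t)
        runoff += max(rain_rate - capacity, 0.0) * dt
        t += dt
    return runoff

print(horton_runoff(rain_rate=2.0, duration=1.0))
```

Note how runoff only begins once the decaying capacity drops below the rain rate, which is why these models are so sensitive to initial soil moisture.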
Tellurium notebooks-An environment for reproducible dynamical modeling in systems biology.
Medley, J Kyle; Choi, Kiri; König, Matthias; Smith, Lucian; Gu, Stanley; Hellerstein, Joseph; Sealfon, Stuart C; Sauro, Herbert M
2018-06-01
The considerable difficulty encountered in reproducing the results of published dynamical models limits validation, exploration and reuse of this increasingly large biomedical research resource. To address this problem, we have developed Tellurium Notebook, a software system for model authoring, simulation, and teaching that facilitates building reproducible dynamical models and reusing models by 1) providing a notebook environment which allows models, Python code, and narrative to be intermixed, 2) supporting the COMBINE archive format during model development for capturing model information in an exchangeable format and 3) enabling users to easily simulate and edit public COMBINE-compliant models from public repositories to facilitate studying model dynamics, variants and test cases. Tellurium Notebook, a Python-based Jupyter-like environment, is designed to seamlessly inter-operate with these community standards by automating conversion between COMBINE standards formulations and corresponding in-line, human-readable representations. Thus, Tellurium brings to systems biology the strategy used by other literate notebook systems such as Mathematica. These capabilities allow users to edit every aspect of the standards-compliant models and simulations, run the simulations in-line, and re-export to standard formats. We provide several use cases illustrating the advantages of our approach and how it allows development and reuse of models without requiring technical knowledge of standards. Adoption of Tellurium should accelerate model development, reproducibility and reuse.
Model-Based Development of Automotive Electronic Climate Control Software
NASA Astrophysics Data System (ADS)
Kakade, Rupesh; Murugesan, Mohan; Perugu, Bhupal; Nair, Mohanan
With the increasing complexity of software in today's products, writing and maintaining thousands of lines of code is a tedious task; instead, an alternative methodology must be employed. Model-based development is one candidate that offers several benefits and allows engineers to focus on their domain of expertise rather than on writing large volumes of code. In this paper, we discuss the application of model-based development to the electronic climate control software of vehicles. A back-to-back testing approach is presented that ensures a flawless and smooth transition from legacy designs to model-based development. The Simulink Report Generator, used to create design documents from the models, is presented along with its use in running the simulation model and capturing the results in the test report. Test automation using a model-based development tool that supports a unique set of test cases across several testing levels, with a test procedure independent of the software and hardware platform, is also presented.
The development of a classification system for maternity models of care.
Donnolley, Natasha; Butler-Henderson, Kerryn; Chapman, Michael; Sullivan, Elizabeth
2016-08-01
A lack of standard terminology or means to identify and define models of maternity care in Australia has prevented accurate evaluations of outcomes for mothers and babies in different models of maternity care. As part of the Commonwealth-funded National Maternity Data Development Project, a classification system was developed utilising a data set specification that defines the characteristics of models of maternity care. The Maternity Care Classification System, or MaCCS, was developed using a participatory action research design that built upon the published and grey literature. The study identified the characteristics that differentiate models of care and classified models into eleven different Major Model Categories. The MaCCS will enable individual health services, local health districts (networks), and jurisdictional and national health authorities to make better informed decisions for planning, policy development, and delivery of maternity services in Australia. © The Author(s) 2016.
Single Plant Root System Modeling under Soil Moisture Variation
NASA Astrophysics Data System (ADS)
Yabusaki, S.; Fang, Y.; Chen, X.; Scheibe, T. D.
2016-12-01
A prognostic Virtual Plant-Atmosphere-Soil System (vPASS) model is being developed that integrates comprehensively detailed mechanistic single-plant modeling with microbial, atmospheric, and soil system processes in the plant's immediate environment. Three broad areas of process module development are targeted: (1) incorporating models for root growth and function, rhizosphere interactions with bacteria and other organisms, and litter decomposition and soil respiration into established porous media flow and reactive transport models; (2) incorporating root/shoot transport, growth, photosynthesis, and carbon allocation process models into an integrated plant physiology model; and (3) incorporating transpiration, volatile organic compound (VOC) emission, particulate deposition, and local atmospheric processes into a coupled plant/atmosphere model. The integrated plant ecosystem simulation capability is being developed as open source process modules and associated interfaces under a modeling framework. The initial focus addresses the coupling of root growth, the vascular transport system, and soil under drought scenarios. Two types of root water uptake modeling approaches are tested: continuous root distribution and constitutive root system architecture. The continuous root distribution models are based on spatially averaged root development process parameters, which are relatively straightforward to accommodate in the continuum soil flow and reactive transport module. Conversely, the constitutive root system architecture models use root growth rates, root growth direction, and root branching to evolve explicit root geometries; the branching topologies require more complex data structures and additional input parameters. Preliminary results are presented for root model development and the vascular response to temporal and spatial variations in soil conditions.
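A minimal sketch of the continuous root distribution approach: a spatially averaged root density profile (here a simple exponential with made-up parameters) partitions the plant's transpiration demand into per-layer sink terms for the soil flow module:

```python
import math

def root_water_uptake(transpiration, depths, dz, lam=0.3):
    """Partition transpiration demand (mm/day) across soil layers in
    proportion to an exponential root density beta(z) ~ exp(-z/lam).
    Parameter values are illustrative, not vPASS calibrations."""
    density = [math.exp(-z / lam) for z in depths]
    total = sum(d * dz for d in density)
    # Per-layer sink term; sums back to the total transpiration demand.
    return [transpiration * d * dz / total for d in density]

depths = [0.05, 0.15, 0.25, 0.35, 0.45]  # layer midpoints (m)
uptake = root_water_uptake(transpiration=5.0, depths=depths, dz=0.1)
print(uptake, sum(uptake))
```

A constitutive root-architecture model would replace the fixed density profile with explicit, growing root geometry.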
A musculoskeletal foot model for clinical gait analysis.
Saraswat, Prabhav; Andersen, Michael S; Macwilliams, Bruce A
2010-06-18
Several full body musculoskeletal models have been developed for research applications and these models may potentially be developed into useful clinical tools to assess gait pathologies. Existing full-body musculoskeletal models treat the foot as a single segment and ignore the motions of the intrinsic joints of the foot. This assumption limits the use of such models in clinical cases with significant foot deformities. Therefore, a three-segment musculoskeletal model of the foot was developed to match the segmentation of a recently developed multi-segment kinematic foot model. All the muscles and ligaments of the foot spanning the modeled joints were included. Muscle pathways were adjusted with an optimization routine to minimize the difference between the muscle flexion-extension moment arms from the model and moment arms reported in literature. The model was driven by walking data from five normal pediatric subjects (aged 10.6+/-1.57 years) and muscle forces and activation levels required to produce joint motions were calculated using an inverse dynamic analysis approach. Due to the close proximity of markers on the foot, small marker placement error during motion data collection may lead to significant differences in musculoskeletal model outcomes. Therefore, an optimization routine was developed to enforce joint constraints, optimally scale each segment length and adjust marker positions. To evaluate the model outcomes, the muscle activation patterns during walking were compared with electromyography (EMG) activation patterns reported in the literature. Model-generated muscle activation patterns were observed to be similar to the EMG activation patterns. Published by Elsevier Ltd.
Lu, Jingtao; Goldsmith, Michael-Rock; Grulke, Christopher M; Chang, Daniel T; Brooks, Raina D; Leonard, Jeremy A; Phillips, Martin B; Hypes, Ethan D; Fair, Matthew J; Tornero-Velez, Rogelio; Johnson, Jeffre; Dary, Curtis C; Tan, Yu-Mei
2016-02-01
Developing physiologically-based pharmacokinetic (PBPK) models for chemicals can be resource-intensive, as neither chemical-specific parameters nor in vivo pharmacokinetic data are easily available for model construction. Previously developed, well-parameterized, and thoroughly-vetted models can be a great resource for the construction of models pertaining to new chemicals. A PBPK knowledgebase was compiled and developed from existing PBPK-related articles and used to develop new models. From 2,039 PBPK-related articles published between 1977 and 2013, 307 unique chemicals were identified for use as the basis of our knowledgebase. Keywords related to species, gender, developmental stages, and organs were analyzed from the articles within the PBPK knowledgebase. A correlation matrix of the 307 chemicals in the PBPK knowledgebase was calculated based on pharmacokinetic-relevant molecular descriptors. Chemicals in the PBPK knowledgebase were ranked based on their correlation with ethylbenzene and gefitinib. Next, multiple chemicals were selected to represent exact matches, close analogues, or non-analogues of the target case study chemicals. Parameters, equations, or experimental data relevant to existing models for these chemicals and their analogues were used to construct new models, and model predictions were compared to observed values. This compiled knowledgebase provides a chemical structure-based approach for identifying PBPK models relevant to other chemical entities. Using suitable correlation metrics, we demonstrated that models of chemical analogues in the PBPK knowledgebase can guide the construction of PBPK models for other chemicals.
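The analogue-ranking step can be sketched as correlating the descriptor vector of a target chemical against each knowledgebase entry; the chemical names and descriptor values below are fabricated placeholders, not data from the actual knowledgebase:

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length descriptor vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def rank_analogues(target, knowledgebase):
    """Rank knowledgebase chemicals by descriptor correlation with the
    target, most similar first (the PBPK-analogue selection idea)."""
    scored = [(name, pearson(target, desc))
              for name, desc in knowledgebase.items()]
    return sorted(scored, key=lambda t: t[1], reverse=True)

kb = {"chemA": [1.0, 2.0, 3.0, 4.0],
      "chemB": [4.0, 3.0, 2.0, 1.0],
      "chemC": [1.1, 2.2, 2.9, 4.2]}
print(rank_analogues([1.0, 2.1, 3.0, 3.9], kb))
```

The top-ranked analogues would then supply candidate parameters and equations for the new chemical's PBPK model.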
Body composition analysis: Cellular level modeling of body component ratios.
Wang, Z; Heymsfield, S B; Pi-Sunyer, F X; Gallagher, D; Pierson, R N
2008-01-01
During the past two decades, a major outgrowth of efforts by our research group at St. Luke's-Roosevelt Hospital has been the development of body composition models that include cellular level models, models based on body component ratios, total body potassium models, multi-component models, and resting energy expenditure-body composition models. This review summarizes these models with emphasis on the component ratios that we believe are fundamental to understanding human body composition during growth and development and in response to disease and treatments. In vivo measurements reveal that in healthy adults some component ratios show minimal variability and are relatively 'stable', for example total body water/fat-free mass and fat-free mass density. These ratios can be effectively applied in developing body composition methods. In contrast, other ratios, such as total body potassium/fat-free mass, are highly variable in vivo and therefore less useful for developing body composition models. In order to understand the mechanisms governing the variability of these component ratios, we have developed eight cellular level ratio models and from them derived simplified models that share, as a major determining factor, the ratio of extracellular to intracellular water (E/I). The E/I value varies widely among adults. Model analysis reveals that the magnitude and variability of each body component ratio can be predicted by correlating the cellular level model with the E/I value. Our approach thus provides new insights into and improved understanding of body composition ratios in adults.
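The practical use of a 'stable' ratio can be illustrated with the classic total body water/fat-free mass hydration constant (about 0.732 in healthy adults); the subject values below are hypothetical:

```python
def ffm_from_tbw(total_body_water, hydration=0.732):
    """Estimate fat-free mass (kg) from total body water (kg) using the
    relatively stable TBW/FFM hydration ratio (~0.732 in healthy
    adults), the classic reference constant noted in the review."""
    return total_body_water / hydration

def fat_mass(body_weight, total_body_water):
    """Fat mass as the remainder after fat-free mass is estimated."""
    return body_weight - ffm_from_tbw(total_body_water)

# Example: a hypothetical 70 kg adult with 42 kg total body water.
print(round(ffm_from_tbw(42.0), 1), round(fat_mass(70.0, 42.0), 1))
```

A highly variable ratio such as TBK/FFM would not support this kind of two-step estimate, which is the review's central distinction.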
Hunt, R.J.; Anderson, M.P.; Kelson, V.A.
1998-01-01
This paper demonstrates that analytic element models have potential as powerful screening tools that can facilitate or improve calibration of more complicated finite-difference and finite-element models. We demonstrate how a two-dimensional analytic element model was used to identify errors in a complex three-dimensional finite-difference model caused by incorrect specification of boundary conditions. An improved finite-difference model was developed using boundary conditions developed from a far-field analytic element model. Calibration of a revised finite-difference model was achieved using fewer zones of hydraulic conductivity and lake bed conductance than the original finite-difference model. Calibration statistics were also improved in that simulated base-flows were much closer to measured values. The improved calibration is due mainly to improved specification of the boundary conditions made possible by first solving the far-field problem with an analytic element model.
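The far-field screening idea can be illustrated with the simplest analytic elements, superposed steady pumping wells (Thiem solutions); the transmissivity, discharges, and reference radius below are invented for illustration and are not from the paper's models:

```python
import math

def head(x, y, wells, T=100.0, h0=20.0, R=1000.0):
    """Head (m) at (x, y) from superposed Thiem well solutions: a
    minimal flavor of an analytic element far-field model. T is
    transmissivity (m^2/d), h0 ambient head, R a reference radius."""
    h = h0
    for xw, yw, Q in wells:  # Q > 0 means pumping (causes drawdown)
        r = max(math.hypot(x - xw, y - yw), 0.1)  # avoid log(0) at well
        h -= Q / (2 * math.pi * T) * math.log(R / r)
    return h

wells = [(0.0, 0.0, 500.0), (300.0, 0.0, 250.0)]
print(head(50.0, 50.0, wells))    # drawdown near the stronger well
print(head(900.0, 900.0, wells))  # near-ambient head far away
```

Heads sampled from such a far-field solution along a grid edge are exactly the kind of boundary conditions the paper feeds to the finite-difference model.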
JEDI Marine and Hydrokinetic Power Model | Jobs and Economic Development Model
The Jobs and Economic Development Impacts (JEDI) Marine Hydrokinetic Model allows users to estimate economic development impacts from marine hydrokinetic projects and includes default information
Atmospheric Models for Aerocapture
NASA Technical Reports Server (NTRS)
Justus, C. G.; Duvall, Aleta L.; Keller, Vernon W.
2004-01-01
There are eight destinations in the Solar System with sufficient atmosphere for aerocapture to be a viable aeroassist option: Venus, Earth, Mars, Jupiter, Saturn and its moon Titan, Uranus, and Neptune. Engineering-level atmospheric models for four of these targets (Earth, Mars, Titan, and Neptune) have been developed for NASA to support systems analysis studies of potential future aerocapture missions. Development of a similar atmospheric model for Venus has recently commenced. An important capability of all of these models is their ability to simulate quasi-random density perturbations for Monte Carlo analyses in developing guidance, navigation, and control algorithms, and for thermal systems design. Similarities and differences among these atmospheric models are presented, with emphasis on the recently developed Neptune model and on planned characteristics of the Venus model. Example applications for aerocapture are also presented and illustrated. Recent updates to the Titan atmospheric model are discussed, in anticipation of applications for trajectory and atmospheric reconstruction of the Huygens probe entry at Titan.
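The quasi-random density perturbation capability can be sketched, in a far simpler form than the engineering models described here, as scaling the mean density profile by a Gaussian factor in each Monte Carlo sample (the density value and sigma below are illustrative):

```python
import random

def perturbed_density(mean_density, sigma=0.03, rng=None):
    """One quasi-random density sample for a Monte Carlo trajectory run:
    the mean atmosphere scaled by a Gaussian factor. Real engineering
    models use spatially correlated perturbations, not this white noise."""
    rng = rng or random.Random()
    return mean_density * (1.0 + rng.gauss(0.0, sigma))

rng = random.Random(42)  # fixed seed for repeatable Monte Carlo batches
samples = [perturbed_density(1.2e-2, sigma=0.03, rng=rng)
           for _ in range(1000)]
mean = sum(samples) / len(samples)
print(mean)  # close to the unperturbed 1.2e-2 kg/m^3
```

Each guidance, navigation, and control run in a Monte Carlo batch would draw its own perturbed profile in this way.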
NASA Astrophysics Data System (ADS)
Roushangar, Kiyoumars; Mehrabani, Fatemeh Vojoudi; Shiri, Jalal
2014-06-01
This study presents artificial intelligence (AI)-based modeling of total bed material load that aims to improve the accuracy of the predictions of traditional models. Gene expression programming (GEP) and adaptive neuro-fuzzy inference system (ANFIS)-based models were developed and validated for the estimations. Sediment data from the Qotur River (northwestern Iran) were used for developing and validating the applied techniques. In order to assess the applied techniques relative to traditional models, stream power-based and shear stress-based physical models were also applied to the studied case. The obtained results reveal that the developed AI-based models, using a minimum number of dominant factors, give more accurate results than the other applied models. It was also revealed that the k-fold test is a practical but high-cost technique for completely scanning the applied data and avoiding over-fitting.
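The k-fold test used to scan the whole data set can be sketched as plain index bookkeeping; any GEP or ANFIS model would then be fitted on each training split and scored on the held-out fold:

```python
def k_fold_indices(n, k):
    """Split n sample indices into k disjoint train/test folds so every
    record is held out exactly once (the 'complete scan' of the data)."""
    folds = []
    base, extra = divmod(n, k)  # distribute any remainder across folds
    start = 0
    for i in range(k):
        size = base + (1 if i < extra else 0)
        test = list(range(start, start + size))
        train = [j for j in range(n) if j < start or j >= start + size]
        folds.append((train, test))
        start += size
    return folds

folds = k_fold_indices(10, 3)
print(folds[0])
```

The cost the authors note comes from refitting the model k times, once per fold.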
Technical Review of the CENWP Computational Fluid Dynamics Model of the John Day Dam Forebay
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rakowski, Cynthia L.; Serkowski, John A.; Richmond, Marshall C.
The US Army Corps of Engineers Portland District (CENWP) has developed a computational fluid dynamics (CFD) model of the John Day forebay on the Columbia River to aid in the development and design of alternatives to improve juvenile salmon passage at the John Day Project. At the request of CENWP, the Pacific Northwest National Laboratory (PNNL) Hydrology Group has conducted a technical review of CENWP's CFD model, run in the CFD solver software STAR-CD. PNNL has extensive experience developing and applying 3D CFD models run in STAR-CD for Columbia River hydroelectric projects. The John Day forebay model developed by CENWP is adequately configured and validated. The model is ready for use simulating forebay hydraulics for structural and operational alternatives. The approach and method are sound; however, CENWP has identified some improvements that need to be made for future models and for modifications to this existing model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zapol, Peter; Bourg, Ian; Criscenti, Louise Jacqueline
2011-10-01
This report summarizes research performed for the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Subcontinuum and Upscaling Task. The work conducted focused on developing a roadmap to include molecular-scale, mechanistic information in continuum-scale models of nuclear waste glass dissolution. This information is derived from molecular-scale modeling efforts that are validated through comparison with experimental data. In addition to developing a master plan to incorporate a subcontinuum mechanistic understanding of glass dissolution into continuum models, methods were developed to generate constitutive dissolution rate expressions from quantum calculations, force field models were selected to generate multicomponent glass structures and gel layers, classical molecular modeling was used to study diffusion through nanopores analogous to those in the interfacial gel layer, and a micro-continuum model (KµC) was developed to study coupled diffusion and reaction at the glass-gel-solution interface.
Reusable Component Model Development Approach for Parallel and Distributed Simulation
Zhu, Feng; Yao, Yiping; Chen, Huilong; Yao, Feng
2014-01-01
Model reuse is a key issue to be resolved in parallel and distributed simulation at present. However, component models built by different domain experts usually have diversiform interfaces, couple tightly, and bind closely with simulation platforms. As a result, they are difficult to reuse across different simulation platforms and applications. To address the problem, this paper first proposes a reusable component model framework. Based on this framework, our reusable model development approach is then elaborated, which contains two phases: (1) domain experts create simulation computational modules following three principles to achieve their independence; (2) the model developer encapsulates these simulation computational modules with six standard service interfaces to improve their reusability. The case study of a radar model indicates that a model developed using our approach has good reusability and is easy to use in different simulation platforms and applications. PMID:24729751
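The abstract does not enumerate the six standard service interfaces, so the names below (`initialize`, `set_input`, `get_output`, `advance`, `save_state`, `restore_state`) are assumptions chosen for illustration. The sketch only shows the general idea: a domain module such as a radar model is encapsulated behind a platform-neutral interface so any simulation engine can drive it.

```python
from abc import ABC, abstractmethod

class SimComponent(ABC):
    """Platform-neutral component wrapper; the six service names
    are illustrative, not the paper's actual interface set."""

    @abstractmethod
    def initialize(self, config): ...
    @abstractmethod
    def set_input(self, name, value): ...
    @abstractmethod
    def get_output(self, name): ...
    @abstractmethod
    def advance(self, dt): ...
    @abstractmethod
    def save_state(self): ...
    @abstractmethod
    def restore_state(self, state): ...

class RadarModel(SimComponent):
    """Toy radar component: reports targets within a configured range."""
    def initialize(self, config):
        self.max_range = config["max_range"]
        self.targets = []
    def set_input(self, name, value):
        if name == "targets":
            self.targets = value
    def get_output(self, name):
        if name == "detections":
            return [t for t in self.targets if t <= self.max_range]
    def advance(self, dt):
        pass  # no internal dynamics in this toy model
    def save_state(self):
        return {"targets": list(self.targets)}
    def restore_state(self, state):
        self.targets = list(state["targets"])

radar = RadarModel()
radar.initialize({"max_range": 100.0})
radar.set_input("targets", [50.0, 150.0, 99.0])
detections = radar.get_output("detections")
```

Because the engine only touches the abstract interface, the same `RadarModel` could be hosted by different simulation platforms without modification, which is the reusability property the paper targets.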
Comparison of CdZnTe neutron detector models using MCNP6 and Geant4
NASA Astrophysics Data System (ADS)
Wilson, Emma; Anderson, Mike; Prendergasty, David; Cheneler, David
2018-01-01
The production of accurate detector models is of high importance in the development and use of detectors. Initially, MCNP and Geant were developed to specialise in neutral particle models and accelerator models, respectively; there is now a greater overlap of the capabilities of both, and it is therefore useful to produce comparative models to evaluate detector characteristics. In a collaboration between Lancaster University, UK, and Innovative Physics Ltd., UK, models have been developed in both MCNP6 and Geant4 of Cadmium Zinc Telluride (CdZnTe) detectors developed by Innovative Physics Ltd. Herein, a comparison is made of the relative strengths of MCNP6 and Geant4 for modelling neutron flux and secondary γ-ray emission. Given the increasing overlap of the modelling capabilities of MCNP6 and Geant4, it is worthwhile to comment on differences in results for simulations which have similarities in terms of geometries and source configurations.
Results from a workshop on research needs for modeling aquifer thermal energy storage systems
NASA Astrophysics Data System (ADS)
Drost, M. K.
1990-08-01
A workshop on aquifer thermal energy storage (ATES) system modeling was conducted by Pacific Northwest Laboratory (PNL). The goal of the workshop was to develop a list of high-priority research activities that would facilitate the commercial success of ATES. During the workshop, participants reviewed currently available modeling tools for ATES systems and produced a list of significant issues related to modeling ATES systems. Participants assigned a priority to each issue on the list by voting and developed a list of research needs for each of four high-priority research areas: the need for a feasibility study model, the need for engineering design models, the need for aquifer characterization, and the need for an economic model. The workshop participants concluded that ATES commercialization can be accelerated by aggressive development of ATES modeling tools and made specific recommendations for that development.
Quality tracing in meat supply chains
Mack, Miriam; Dittmer, Patrick; Veigt, Marius; Kus, Mehmet; Nehmiz, Ulfert; Kreyenschmidt, Judith
2014-01-01
The aim of this study was the development of a quality tracing model for vacuum-packed lamb that is applicable in different meat supply chains. Based on the development of relevant sensory parameters, the predictive model was developed by combining a linear primary model and the Arrhenius model as the secondary model. A process analysis was then conducted to define general requirements for the implementation of the temperature-based model in a meat supply chain. The required hardware and software for continuous temperature monitoring were developed in order to use the model under practical conditions. A decision support tool was also elaborated so that the model, in combination with the temperature monitoring equipment, serves as an effective instrument for improving quality and storage management within the meat logistics network. Over the long term, this overall procedure will support the reduction of food waste and will improve the resource efficiency of food production. PMID:24797136
Development of a model to assess environmental performance, concerning HSE-MS principles.
Abbaspour, M; Hosseinzadeh Lotfi, F; Karbassi, A R; Roayaei, E; Nikoomaram, H
2010-06-01
The main objective of the present study was to develop a valid and appropriate model to evaluate companies' efficiency and environmental performance with respect to health, safety, and environmental management system principles. The proposed model overcomes the shortcomings of previous models developed in this area. This model has been designed on the basis of a mathematical method known as Data Envelopment Analysis (DEA). In order to differentiate high-performing companies from weak ones, one of the nonradial DEA models, the enhanced Russell graph efficiency measure, has been applied. Since some of the environmental performance indicators cannot be controlled by companies' managers, it was necessary to develop the model so that it could be applied when discretionary and/or nondiscretionary factors were involved. The model was then applied to a real case comprising 12 oil and gas general contractors. The results showed the relative efficiency, the sources of inefficiency, and the ranking of the contractors.
Development of a new global radiation belt model
NASA Astrophysics Data System (ADS)
Sicard, Angelica; Boscher, Daniel; Bourdarie, Sébastien; Lazaro, Didier; Maget, Vincent; Ecoffet, Robert; Rolland, Guy; Standarovski, Denis
2017-04-01
The well-known AP8 and AE8 NASA models are commonly used in industry to specify the radiation belt environment. Unfortunately, there are some limitations in the use of these models, first due to the covered energy range, but also because in some regions of space there are discrepancies between the predicted average values and the measurements. Therefore, our aim is to develop a radiation belt model covering a large region of space and energy, from LEO altitudes to GEO and above, and from plasma to relativistic particles. The aim of the first version of this new model is to correct the AP8 and AE8 models where they are deficient or not defined. For geostationary orbit, we developed the IGE-2006 electron model ten years ago; it was proven to be more accurate than AE8, is commonly used in industry, and covers a broad energy range, from 1 keV to 5 MeV. Since then, a proton model for geostationary orbit was also developed for material applications, followed by the OZONE model covering a narrower energy range but the whole outer electron belt, and a SLOT model to assess average electron values for 2
Next-generation concurrent engineering: developing models to complement point designs
NASA Technical Reports Server (NTRS)
Morse, Elizabeth; Leavens, Tracy; Cohanim, Barbak; Harmon, Corey; Mahr, Eric; Lewis, Brian
2006-01-01
Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a next-generation CED team; in addition to a point design, the team develops a model of the local trade space. The process is a balance between the power of model-developing tools and the creativity of human experts, enabling the development of a variety of trade models for any space mission.
Long-term athletic development- part 1: a pathway for all youth.
Lloyd, Rhodri S; Oliver, Jon L; Faigenbaum, Avery D; Howard, Rick; De Ste Croix, Mark B A; Williams, Craig A; Best, Thomas M; Alvar, Brent A; Micheli, Lyle J; Thomas, D Phillip; Hatfield, Disa L; Cronin, John B; Myer, Gregory D
2015-05-01
The concept of developing talent and athleticism in youth is the goal of many coaches and sports systems. Consequently, an increasing number of sporting organizations have adopted long-term athletic development models in an attempt to provide a structured approach to the training of youth. It is clear that maximizing sporting talent is an important goal of long-term athletic development models. However, ensuring that youth of all ages and abilities are provided with a strategic plan for the development of their health and physical fitness is also important to maximize physical activity participation rates, reduce the risk of sport- and activity-related injury, and to ensure long-term health and well-being. Critical reviews of independent models of long-term athletic development are already present within the literature; however, to the best of our knowledge, a comprehensive examination and review of the most prominent models does not exist. Additionally, considerations of modern day issues that may impact on the success of any long-term athletic development model are lacking, as are proposed solutions to address such issues. Therefore, within this 2-part commentary, Part 1 provides a critical review of existing models of practice for long-term athletic development and introduces a composite youth development model that includes the integration of talent, psychosocial and physical development across maturation. Part 2 identifies limiting factors that may restrict the success of such models and offers potential solutions.
Functional Fault Modeling of a Cryogenic System for Real-Time Fault Detection and Isolation
NASA Technical Reports Server (NTRS)
Ferrell, Bob; Lewis, Mark; Perotti, Jose; Oostdyk, Rebecca; Brown, Barbara
2010-01-01
The purpose of this paper is to present the model development process used to create a Functional Fault Model (FFM) of a liquid hydrogen (LH2) system that will be used for real-time fault isolation in a Fault Detection, Isolation and Recovery (FDIR) system. The paper explains the steps in the model development process and the data products required at each step, including examples of how the steps were performed for the LH2 system. It also shows the relationship between the FDIR requirements and the steps in the model development process. The paper concludes with a description of a demonstration of the LH2 model developed using the process and future steps for integrating the model in a live operational environment.
Toolkit of Available EPA Green Infrastructure Modeling ...
This webinar will present a toolkit consisting of five EPA green infrastructure models and tools, along with communication material. This toolkit can be used as a teaching and quick reference resource for use by planners and developers when making green infrastructure implementation decisions. It can also be used for low impact development design competitions. Models and tools included: Green Infrastructure Wizard (GIWiz), Watershed Management Optimization Support Tool (WMOST), Visualizing Ecosystem Land Management Assessments (VELMA) Model, Storm Water Management Model (SWMM), and the National Stormwater Calculator (SWC).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; White, Amanda M.; Whitney, Paul D.
2013-06-04
The Multi-Source Signatures for Nuclear Programs project, part of Pacific Northwest National Laboratory’s (PNNL) Signature Discovery Initiative, seeks to computationally capture expert assessment of multi-type information such as text, sensor output, imagery, or audio/video files, to assess nuclear activities through a series of Bayesian network (BN) models. These models incorporate knowledge from a diverse range of information sources in order to help assess a country’s nuclear activities. The models span engineering topic areas, state-level indicators, and facility-specific characteristics. To illustrate the development, calibration, and use of BN models for multi-source assessment, we present a model that predicts a country’s likelihood to participate in the international nuclear nonproliferation regime. We validate this model by examining the extent to which the model assists non-experts in arriving at conclusions similar to those provided by nuclear proliferation experts. We also describe the PNNL-developed software used throughout the lifecycle of the Bayesian network model development.
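As a minimal illustration of the kind of Bayesian network assessment described, the sketch below marginalizes a single "participates in the nonproliferation regime" node over two hypothetical parent indicators by enumeration. All node names and probabilities here are invented for illustration and have no connection to the actual PNNL models.

```python
# Hypothetical priors for two parent indicator nodes.
p_treaty = 0.7   # P(country has signed related treaties) -- illustrative
p_trade = 0.5    # P(open nuclear trade record) -- illustrative

# Conditional probability table P(participate | treaty, trade);
# the numbers are illustrative only.
cpt = {(True, True): 0.95, (True, False): 0.7,
       (False, True): 0.5, (False, False): 0.1}

def p_participate():
    """Marginalize the two parent nodes by full enumeration."""
    total = 0.0
    for treaty in (True, False):
        for trade in (True, False):
            pt = p_treaty if treaty else 1 - p_treaty
            pr = p_trade if trade else 1 - p_trade
            total += pt * pr * cpt[(treaty, trade)]
    return total

prob = p_participate()
```

Real BN tools replace this brute-force enumeration with efficient inference algorithms, but the semantics (priors on indicators combined through CPTs) are the same.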
NASA Astrophysics Data System (ADS)
Sadi, Maryam
2018-01-01
In this study a group method of data handling (GMDH) model has been successfully developed to predict the heat capacity of ionic liquid based nanofluids, considering the reduced temperature, acentric factor, and molecular weight of the ionic liquids, and the nanoparticle concentration as input parameters. In order to accomplish the modeling, 528 experimental data points extracted from the literature have been divided into training and testing subsets. The training set has been used to estimate the model coefficients and the testing set has been applied for model validation. The ability and accuracy of the developed model have been evaluated by comparing model predictions with experimental values using different statistical parameters such as the coefficient of determination, mean square error, and mean absolute percentage error. The mean absolute percentage errors of the developed model for the training and testing sets are 1.38% and 1.66%, respectively, which indicates excellent agreement between model predictions and experimental data. Also, the results estimated by the developed GMDH model exhibit higher accuracy than the available theoretical correlations.
NASA Technical Reports Server (NTRS)
Flowers, George T.
1994-01-01
Progress over the past year includes the following: A simplified rotor model with a flexible shaft and backup bearings has been developed. A simple rotor model which includes a flexible disk and bearings with clearance has been developed and the dynamics of the model investigated. A rotor model based upon the T-501 engine has been developed which includes backup bearing effects. Parallel simulation runs are being conducted using an ANSYS based finite element model of the T-501. The magnetic bearing test rig is currently floating and dynamics/control tests are being conducted. A paper has been written that documents the work using the T-501 engine model. Work has continued with the simplified model. The finite element model is currently being modified to include the effects of foundation dynamics. A literature search for material on foil bearings has been conducted. A finite element model is being developed for a magnetic bearing in series with a foil backup bearing.
Stochastic models for inferring genetic regulation from microarray gene expression data.
Tian, Tianhai
2010-03-01
Microarray expression profiles are inherently noisy and many different sources of variation exist in microarray experiments. It is still a significant challenge to develop stochastic models that capture the noise in microarray expression profiles, which has a profound influence on the reverse engineering of genetic regulation. Using the target genes of the tumour suppressor gene p53 as the test problem, we developed stochastic differential equation models and established the relationship between the noise strength of the stochastic models and the parameters of an error model describing the distribution of the microarray measurements. Numerical results indicate that the simulated variance from stochastic models with a stochastic degradation process can be represented by a monomial in terms of the hybridization intensity, and the order of the monomial depends on the type of stochastic process. The developed stochastic models with multiple stochastic processes generated simulations whose variance is consistent with the prediction of the error model. This work also established a general method for developing stochastic models from experimental information.
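A stochastic degradation process of the kind discussed can be sketched with an Euler-Maruyama integration of a birth-death SDE whose noise term scales with the state (intrinsic noise). The equation form and all parameter values below are illustrative assumptions, not the paper's fitted models.

```python
import random
import statistics

def simulate(k=10.0, d=0.1, sigma=0.5, dt=0.01, steps=2000, seed=1):
    """Euler-Maruyama path of dX = (k - d*X) dt + sigma*sqrt(X) dW:
    constant production, first-order degradation, intrinsic noise."""
    rng = random.Random(seed)
    x = k / d  # start at the deterministic steady state
    path = []
    for _ in range(steps):
        dw = rng.gauss(0.0, dt ** 0.5)  # Wiener increment ~ N(0, dt)
        x += (k - d * x) * dt + sigma * max(x, 0.0) ** 0.5 * dw
        path.append(x)
    return path

path = simulate()
mean_x = statistics.fmean(path)
var_x = statistics.variance(path)
```

Repeating such simulations across noise-strength settings and comparing the simulated variance against the measurement error model is, in outline, the calibration link the paper establishes.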
NASA Astrophysics Data System (ADS)
Zurweni, Wibawa, Basuki; Erwin, Tuti Nurian
2017-08-01
The framework for teaching and learning in the 21st century was prepared with the 4Cs criteria. Collaborative learning provides an opportunity for the optimal development of students' creative skills: learners are challenged to compete, to work independently toward individual or group excellence, and to master the learning material. A virtual laboratory, implemented as a computer simulation application, is used as the medium for Instrumental Analytical Chemistry (Vis, UV-Vis, AAS, etc.) lectures and as a substitute for the laboratory when the equipment and instruments are not available. This research aims to design and develop a collaborative-creative learning model using virtual laboratory media for Instrumental Analytical Chemistry lectures, and to determine the effectiveness of this design, which adapts the Dick & Carey and Hannafin & Peck models. The development steps of this model are: needs analysis, design of collaborative-creative learning, virtual laboratory media using Macromedia Flash, formative evaluation, and a test of learning model effectiveness. The stages of the collaborative-creative learning model itself are: apperception, exploration, collaboration, creation, evaluation, and feedback. The developed model can be used to improve the quality of learning in the classroom and to overcome the limited availability of lab instruments for real instrumental analysis. Formative test results show that the collaborative-creative learning model developed meets the requirements. The effectiveness test on students' pretest and posttest scores proves significant at the 95% confidence level, with the computed t-statistic exceeding the critical value. It can be concluded that this learning model is effective for Instrumental Analytical Chemistry lectures.
Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.
2015-05-01
The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.
Claggett, Peter; Jantz, Claire A.; Goetz, S.J.; Bisland, C.
2004-01-01
Natural resource lands in the Chesapeake Bay watershed are increasingly susceptible to conversion into developed land uses, particularly as the demand for residential development grows. We assessed development pressure in the Baltimore-Washington, DC region, one of the major urban and suburban centers in the watershed. We explored the utility of two modeling approaches for forecasting future development trends and patterns by comparing results from a cellular automata model, SLEUTH (slope, land use, excluded land, urban extent, transportation), and a supply/demand/allocation model, the Western Futures Model. SLEUTH can be classified as a land-cover change model and produces projections on the basis of historic trends of changes in the extent and patterns of developed land and future land protection scenarios. The Western Futures Model derives forecasts from historic trends in housing units, a U.S. Census variable, and exogenously supplied future population projections. Each approach has strengths and weaknesses, and combining the two has advantages and limitations.
Modeling wind adjustment factor and midflame wind speed for Rothermel's surface fire spread model
Patricia L. Andrews
2012-01-01
Rothermel's surface fire spread model was developed to use a value for the wind speed that affects surface fire, called midflame wind speed. Models have been developed to adjust 20-ft wind speed to midflame wind speed for sheltered and unsheltered surface fuel. In this report, Wind Adjustment Factor (WAF) model equations are given, and the BehavePlus fire modeling...
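The relationship the report tabulates reduces to scaling a 20-ft wind speed down to midflame height by a wind adjustment factor (WAF). The WAF values below are illustrative placeholders, not the report's equations; only the scaling step is shown.

```python
def midflame_wind_speed(ws20, waf):
    """Scale a 20-ft wind speed to midflame height with a
    wind adjustment factor in (0, 1]."""
    if not 0.0 < waf <= 1.0:
        raise ValueError("WAF must be in (0, 1]")
    return waf * ws20

# Illustrative (not official) WAF values: unsheltered fuels see more
# of the 20-ft wind than fuels sheltered under a timber canopy.
waf_examples = {"unsheltered_grass": 0.4, "sheltered_timber": 0.1}
midflame = midflame_wind_speed(10.0, waf_examples["unsheltered_grass"])
```

The actual WAF equations in the report depend on fuel bed depth and canopy sheltering; this sketch only conveys how the factor enters Rothermel's spread calculation.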
Modeling of Spacecraft Advanced Chemical Propulsion Systems
NASA Technical Reports Server (NTRS)
Benfield, Michael P. J.; Belcher, Jeremy A.
2004-01-01
This paper outlines the development of the Advanced Chemical Propulsion System (ACPS) model for Earth and Space Storable propellants. This model was developed by the System Technology Operation of SAIC-Huntsville for the NASA MSFC In-Space Propulsion Project Office. Each subsystem of the model is described. Selected model results will also be shown to demonstrate the model's ability to evaluate technology changes in chemical propulsion systems.
Development and verification of an agent-based model of opinion leadership.
Anderson, Christine A; Titler, Marita G
2014-09-27
The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model performed consistently with the posited relationships in the underlying model. Nurse opinion leaders act on the strength of their beliefs and as a result become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists.
The development and testing of agent-based models is an iterative process. The opinion leader model presented here provides a basic structure for continued model development, ongoing verification, and the establishment of validation procedures, including empirical data collection.
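A toy version of the opinion-seeking dynamics described above can be written directly. The rules and parameters below (credibility drawn uniformly, a fixed chance that a consultation resolves the seeker's uncertainty) are assumptions for illustration, not the published model.

```python
import random

class Nurse:
    """An agent with a fixed credibility and an uncertainty state."""
    def __init__(self, credibility, certain):
        self.credibility = credibility
        self.certain = certain
        self.times_consulted = 0

def run(n=30, n_certain=9, steps=50, seed=2):
    """Uncertain agents consult the most credible opinion holder;
    repeated consultation marks an emergent opinion leader."""
    rng = random.Random(seed)
    agents = [Nurse(rng.random(), i < n_certain) for i in range(n)]
    for _ in range(steps):
        holders = [a for a in agents if a.certain]
        for a in agents:
            if not a.certain:
                leader = max(holders, key=lambda c: c.credibility)
                leader.times_consulted += 1
                # a consultation may resolve the seeker's uncertainty
                if rng.random() < 0.1 * leader.credibility:
                    a.certain = True
    return agents

agents = run()
most_consulted = max(agents, key=lambda a: a.times_consulted)
```

Sweeping parameters such as `n_certain` or the resolution probability and tallying `times_consulted`, as the study's parameter sweep does at larger scale, shows which conditions let one agent emerge as the dominant opinion resource.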
A predictive model for biomimetic plate type broadband frequency sensor
NASA Astrophysics Data System (ADS)
Ahmed, Riaz U.; Banerjee, Sourav
2016-04-01
In this work, a predictive model for a bio-inspired broadband frequency sensor is developed. Broadband frequency sensing is essential in many domains of science and technology. A great example of such a sensor is the human cochlea, which senses a frequency band of 20 Hz to 20 kHz. Developing broadband sensors that adopt the physics of the human cochlea has attracted tremendous interest in recent years. Although a few experimental studies have been reported, a true predictive model for designing such sensors is missing. A predictive model is essential for the accurate design of selective broadband sensors that are capable of sensing a very selective band of frequencies. Hence, in this study, we propose a novel predictive model for the cochlea-inspired broadband sensor, aiming to select the frequency band and model parameters predictively. A tapered plate geometry is considered, mimicking the real shape of the basilar membrane in the human cochlea. The predictive model is designed to be flexible enough to be employed in a wide variety of scientific domains. To that end, the model can handle not only homogeneous but also arbitrary functionally graded model parameters, and it is capable of managing various types of boundary conditions. It has been found that, using homogeneous model parameters, it is possible to sense a specific frequency band from a specific portion (B) of the model length (L). It is also possible to alter the attributes of 'B' using functionally graded model parameters, which confirms the predictive frequency selection ability of the developed model.
Katsube, Takayuki; Ishibashi, Toru; Kano, Takeshi; Wajima, Toshihiro
2016-11-01
The aim of this study was to develop a population pharmacokinetic (PK)/pharmacodynamic (PD) model for describing plasma lusutrombopag concentrations and platelet response following oral lusutrombopag dosing and for evaluating covariates in the PK/PD profiles. A population PK/PD model was developed using a total of 2539 plasma lusutrombopag concentration data and 1408 platelet count data from 78 healthy adult subjects following oral single and multiple (14-day once-daily) dosing. Covariates in PK and PK/PD models were explored for subject age, body weight, sex, and ethnicity. A three-compartment model with first-order rate and lag time for absorption was selected as a PK model. A three-transit and one-platelet compartment model with a sigmoid E max model for drug effect and feedback of platelet production was selected as the PD model. The PK and PK/PD models well described the plasma lusutrombopag concentrations and the platelet response, respectively. Body weight was a significant covariate in PK. The bioavailability of non-Japanese subjects (White and Black/African American subjects) was 13 % lower than that of Japanese subjects, while the simulated platelet response profiles using the PK/PD model were similar between Japanese and non-Japanese subjects. There were no significant covariates of the tested background data including age, sex, and ethnicity (Japanese or non-Japanese) for the PD sensitivity. A population PK/PD model was developed for lusutrombopag and shown to provide good prediction for the PK/PD profiles. The model could be used as a basic PK/PD model in the drug development of lusutrombopag.
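A one-compartment model with first-order absorption and a lag time, a simpler relative of the three-compartment structure described, illustrates the concentration profile such PK models produce. All parameter values below are invented for illustration and are not lusutrombopag estimates.

```python
import math

def concentration(t, dose=25.0, f=0.9, ka=0.8, ke=0.1, v=50.0, tlag=0.5):
    """Plasma concentration for a one-compartment model with first-order
    absorption (ka), first-order elimination (ke), bioavailability f,
    volume of distribution v, and absorption lag time tlag."""
    if t <= tlag:
        return 0.0
    te = t - tlag
    return (f * dose * ka / (v * (ka - ke))) * \
        (math.exp(-ke * te) - math.exp(-ka * te))

times = [i * 0.5 for i in range(49)]   # 0 to 24 h in half-hour steps
curve = [concentration(t) for t in times]
cmax = max(curve)
tmax = times[curve.index(cmax)]
```

In a population analysis, parameters like `ka`, `ke`, and `f` get between-subject variability and covariate effects (here, body weight on PK and ethnicity on bioavailability); this sketch shows only the structural concentration equation that sits underneath.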
Cammarota, M; Huppes, V; Gaia, S; Degoulet, P
1998-01-01
The development of Health Information Systems is widely determined by the establishment of the underlying information models. An Object-Oriented Matrix Model (OOMM) is described whose aim is to facilitate the integration of the overall health system. The model is based on information modules named micro-databases that are structured in a three-dimensional network: planning, health structures and information systems. The modelling tool has been developed as a layer on top of a relational database system. A visual browser facilitates the development and maintenance of the information model. The modelling approach has been applied to the Brasilia University Hospital since 1991. The extension of the modelling approach to the Brasilia regional health system is considered.
Crop weather models of barley and spring wheat yield for agrophysical units in North Dakota
NASA Technical Reports Server (NTRS)
Leduc, S. (Principal Investigator)
1982-01-01
Models based on multiple regression were developed to estimate barley yield and spring wheat yield from weather data for Agrophysical Units (APU) in North Dakota. The predictor variables are derived from monthly average temperature and monthly total precipitation data at meteorological stations in the cooperative network. The models are similar in form to the previous models developed for Crop Reporting Districts (CRD). The trends and derived variables were the same, and the approach used to select the significant predictors was similar to that used in developing the CRD models. The APU models show slight improvements in some of the model statistics, e.g., explained variation. These models are to be independently evaluated and compared to the previously evaluated CRD models. The comparison will indicate the preferred model area for this application, i.e., APU or CRD.
NASA Technical Reports Server (NTRS)
Antle, John M.; Basso, Bruno; Conant, Richard T.; Godfray, H. Charles J.; Jones, James W.; Herrero, Mario; Howitt, Richard E.; Keating, Brian A.; Munoz-Carpena, Rafael; Rosenzweig, Cynthia
2016-01-01
This paper presents ideas for a new generation of agricultural system models that could meet the needs of a growing community of end-users exemplified by a set of Use Cases. We envision new data, models and knowledge products that could accelerate the innovation process needed to achieve sustainable local, regional and global food security. We identify desirable features for models, and describe some of the potential advances that we envisage for model components and their integration. We propose an implementation strategy that would link a "pre-competitive" space for model development to a "competitive space" for knowledge product development, and through private-public partnerships for new data infrastructure. Specific model improvements would be based on further testing and evaluation of existing models, the development and testing of modular model components and integration, and linkages of model integration platforms to new data management and visualization tools.
Antle, John M; Basso, Bruno; Conant, Richard T; Godfray, H Charles J; Jones, James W; Herrero, Mario; Howitt, Richard E; Keating, Brian A; Munoz-Carpena, Rafael; Rosenzweig, Cynthia; Tittonell, Pablo; Wheeler, Tim R
2017-07-01
This paper presents ideas for a new generation of agricultural system models that could meet the needs of a growing community of end-users exemplified by a set of Use Cases. We envision new data, models and knowledge products that could accelerate the innovation process needed to achieve sustainable local, regional and global food security. We identify desirable features for models, and describe some of the potential advances that we envisage for model components and their integration. We propose an implementation strategy that would link a "pre-competitive" space for model development to a "competitive space" for knowledge product development, and through private-public partnerships for new data infrastructure. Specific model improvements would be based on further testing and evaluation of existing models, the development and testing of modular model components and integration, and linkages of model integration platforms to new data management and visualization tools.
Consumer Protection, Leadership Models, and Training and Development.
ERIC Educational Resources Information Center
Coleman, Donald G.; Heun, Richard E.
This paper examines several models of leadership style that are widely used in staff development workshops and seminars for managers. The discussion focuses mainly on the implications and possible dangers for training subjects of these models, which include two-factor models, such as those of Barnard and Halpin, and orthogonal grid models, such as…
A PBPK model for TCE with specificity for the male LE rat that accurately predicts TCE tissue time-course data has not been developed, although other PBPK models for TCE exist. Development of such a model was the present aim. The PBPK model consisted of 5 compartments: fat; slowl...
Systems Modelling and the Development of Coherent Understanding of Cell Biology
ERIC Educational Resources Information Center
Verhoeff, Roald P.; Waarlo, Arend Jan; Boersma, Kerst Th.
2008-01-01
This article reports on educational design research concerning a learning and teaching strategy for cell biology in upper-secondary education introducing "systems modelling" as a key competence. The strategy consists of four modelling phases in which students subsequently develop models of free-living cells, a general two-dimensional model of…
Retrofitting Non-Cognitive-Diagnostic Reading Assessment under the Generalized DINA Model Framework
ERIC Educational Resources Information Center
Chen, Huilin; Chen, Jinsong
2016-01-01
Cognitive diagnosis models (CDMs) are psychometric models developed mainly to assess examinees' specific strengths and weaknesses in a set of skills or attributes within a domain. By adopting the Generalized-DINA model framework, the recently developed general modeling framework, we attempted to retrofit the PISA reading assessments, a…
Combining Simulation and Optimization Models for Hardwood Lumber Production
G.A. Mendoza; R.J. Meimban; W.G. Luppold; Philip A. Araman
1991-01-01
Published literature contains a number of optimization and simulation models dealing with the primary processing of hardwood and softwood logs. Simulation models have been developed primarily as descriptive models for characterizing the general operations and performance of a sawmill. Optimization models, on the other hand, were developed mainly as analytical tools for...
Animal models of ocular angiogenesis: from development to pathologies.
Liu, Chi-Hsiu; Wang, Zhongxiao; Sun, Ye; Chen, Jing
2017-11-01
Pathological angiogenesis in the eye is an important feature in the pathophysiology of many vision-threatening diseases, including retinopathy of prematurity, diabetic retinopathy, and age-related macular degeneration, as well as corneal diseases with abnormal angiogenesis. Development of reproducible and reliable animal models of ocular angiogenesis has advanced our understanding of both the normal development and the pathobiology of ocular neovascularization. These models have also proven to be valuable experimental tools with which to easily evaluate potential antiangiogenic therapies beyond eye research. This review summarizes the current available animal models of ocular angiogenesis. Models of retinal and choroidal angiogenesis, including oxygen-induced retinopathy, laser-induced choroidal neovascularization, and transgenic mouse models with deficient or spontaneous retinal/choroidal neovascularization, as well as models with induced corneal angiogenesis, are widely used to investigate the molecular and cellular basis of angiogenic mechanisms. Theoretical concepts and experimental protocols of these models are outlined, as well as their advantages and potential limitations, which may help researchers choose the most suitable models for their investigative work.-Liu, C.-H., Wang, Z., Sun, Y., Chen, J. Animal models of ocular angiogenesis: from development to pathologies. © FASEB.
Acoustic backscatter models of fish: Gradual or punctuated evolution
NASA Astrophysics Data System (ADS)
Horne, John K.
2004-05-01
Sound-scattering characteristics of aquatic organisms are routinely investigated using theoretical and numerical models. Development of the inverse approach by van Holliday and colleagues in the 1970s catalyzed the development and validation of backscatter models for fish and zooplankton. As the understanding of biological scattering properties increased, so did the number and computational sophistication of backscatter models. The complexity of data used to represent modeled organisms has also evolved in parallel with model development. Simple geometric shapes representing body components or the whole organism have been replaced by anatomically accurate representations derived from imaging sensors such as computer-aided tomography (CAT) scans. In contrast, Medwin and Clay (1998) recommend that fish and zooplankton should be described by simple theories and models, without acoustically superfluous extensions. Since van Holliday's early work, how have data and computational complexity influenced the accuracy and precision of model predictions? How has the understanding of aquatic organism scattering properties increased? Significant steps in the history of model development will be identified, and changes in model results will be characterized and compared. [Work supported by ONR and the Alaska Fisheries Science Center.]
Lung Cancer Screening Participation: Developing a Conceptual Model to Guide Research
Carter-Harris, Lisa; Davis, Lorie L.; Rawl, Susan M.
2017-01-01
Purpose To describe the development of a conceptual model to guide research focused on lung cancer screening participation from the perspective of the individual in the decision-making process. Methods Based on a comprehensive review of empirical and theoretical literature, a conceptual model was developed linking key psychological variables (stigma, medical mistrust, fatalism, worry, and fear) to the health belief model and precaution adoption process model. Results Proposed model concepts have been examined in prior research of either lung or other cancer screening behavior. To date, a few studies have explored a limited number of variables that influence screening behavior in lung cancer specifically. Therefore, relationships among concepts in the model have been proposed and future research directions presented. Conclusion This proposed model is an initial step to support theoretically based research. As lung cancer screening becomes more widely implemented, it is critical to theoretically guide research to understand variables that may be associated with lung cancer screening participation. Findings from future research guided by the proposed conceptual model can be used to refine the model and inform tailored intervention development. PMID:28304262
Lung Cancer Screening Participation: Developing a Conceptual Model to Guide Research.
Carter-Harris, Lisa; Davis, Lorie L; Rawl, Susan M
2016-11-01
To describe the development of a conceptual model to guide research focused on lung cancer screening participation from the perspective of the individual in the decision-making process. Based on a comprehensive review of empirical and theoretical literature, a conceptual model was developed linking key psychological variables (stigma, medical mistrust, fatalism, worry, and fear) to the health belief model and precaution adoption process model. Proposed model concepts have been examined in prior research of either lung or other cancer screening behavior. To date, a few studies have explored a limited number of variables that influence screening behavior in lung cancer specifically. Therefore, relationships among concepts in the model have been proposed and future research directions presented. This proposed model is an initial step to support theoretically based research. As lung cancer screening becomes more widely implemented, it is critical to theoretically guide research to understand variables that may be associated with lung cancer screening participation. Findings from future research guided by the proposed conceptual model can be used to refine the model and inform tailored intervention development.
Tests of Hypotheses Arising In the Correlated Random Coefficient Model*
Heckman, James J.; Schmierer, Daniel
2010-01-01
This paper examines the correlated random coefficient model. It extends the analysis of Swamy (1971), who pioneered the uncorrelated random coefficient model in economics. We develop the properties of the correlated random coefficient model and derive a new representation of the variance of the instrumental variable estimator for that model. We develop tests of the validity of the correlated random coefficient model against the null hypothesis of the uncorrelated random coefficient model. PMID:21170148
Seven challenges for metapopulation models of epidemics, including households models.
Ball, Frank; Britton, Tom; House, Thomas; Isham, Valerie; Mollison, Denis; Pellis, Lorenzo; Scalia Tomba, Gianpaolo
2015-03-01
This paper considers metapopulation models in the general sense, i.e. where the population is partitioned into sub-populations (groups, patches,...), irrespective of the biological interpretation they have, e.g. spatially segregated large sub-populations, small households or hosts themselves modelled as populations of pathogens. This framework has traditionally provided an attractive approach to incorporating more realistic contact structure into epidemic models, since it often preserves analytic tractability (in stochastic as well as deterministic models) but also captures the most salient structural inhomogeneity in contact patterns in many applied contexts. Despite the progress that has been made in both the theory and application of such metapopulation models, we present here several major challenges that remain for future work, focusing on models that, in contrast to agent-based ones, are amenable to mathematical analysis. The challenges range from clarifying the usefulness of systems of weakly-coupled large sub-populations in modelling the spread of specific diseases to developing a theory for endemic models with household structure. They include also developing inferential methods for data on the emerging phase of epidemics, extending metapopulation models to more complex forms of human social structure, developing metapopulation models to reflect spatial population structure, developing computationally efficient methods for calculating key epidemiological model quantities, and integrating within- and between-host dynamics in models. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
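A minimal instance of the metapopulation framework discussed above is a deterministic SIR model on two weakly-coupled sub-populations: infection spreads strongly within a patch and weakly between patches. All rates below are illustrative, not values from the paper.

```python
def metapop_sir(beta_within=0.3, beta_between=0.03, gamma=0.1,
                days=200, dt=0.1):
    """Deterministic SIR dynamics on two coupled sub-populations.

    Each patch is normalized to population 1; the epidemic is seeded
    in patch 0 only, and weak coupling carries it to patch 1.
    Returns the final epidemic size (recovered fraction) per patch."""
    s = [0.999, 1.0]
    i = [0.001, 0.0]
    r = [0.0, 0.0]
    for _ in range(int(days / dt)):
        # force of infection: strong within-patch term, weak cross-patch term
        force = [beta_within * i[0] + beta_between * i[1],
                 beta_within * i[1] + beta_between * i[0]]
        for k in (0, 1):
            new_inf = force[k] * s[k] * dt
            rec = gamma * i[k] * dt
            s[k] -= new_inf
            i[k] += new_inf - rec
            r[k] += rec
    return r

final_sizes = metapop_sir()
```

Household models and host-as-population models fit the same template, with many small patches and patch-specific within-group transmission rates in place of the two symmetric patches used here.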
Cognitive Components Underpinning the Development of Model-Based Learning
Potter, Tracey C.S.; Bryce, Nessa V.; Hartley, Catherine A.
2016-01-01
Reinforcement learning theory distinguishes “model-free” learning, which fosters reflexive repetition of previously rewarded actions, from “model-based” learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9–25, we examined whether the abilities to infer sequential regularities in the environment (“statistical learning”), maintain information in an active state (“working memory”) and integrate distant concepts to solve problems (“fluid reasoning”) predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning. PMID:27825732
Cognitive components underpinning the development of model-based learning.
Potter, Tracey C S; Bryce, Nessa V; Hartley, Catherine A
2017-06-01
Reinforcement learning theory distinguishes "model-free" learning, which fosters reflexive repetition of previously rewarded actions, from "model-based" learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9-25, we examined whether the abilities to infer sequential regularities in the environment ("statistical learning"), maintain information in an active state ("working memory") and integrate distant concepts to solve problems ("fluid reasoning") predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
Open data models for smart health interconnected applications: the example of openEHR.
Demski, Hans; Garde, Sebastian; Hildebrand, Claudia
2016-10-22
Smart Health is known as a concept that enhances networking, intelligent data processing and combining patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common practice software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standard based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.
ERIC Educational Resources Information Center
Arnold, Holly Weber
2013-01-01
This study examines the relationship between delivery models (the class size reduction model and the sheltered instruction model) and language development levels on the grade-level reading development of sixth-grade English learners (ELs) attending public middle schools in metro Atlanta, Georgia. The instrument used to measure grade-level mastery…
Modeling of the silane FBR system
NASA Technical Reports Server (NTRS)
Dudokovic, M. P.; Ramachandran, P. A.; Lai, S.
1984-01-01
Development of a mathematical model for fluidized bed pyrolysis of silane that relates production rate and product properties (size, size distribution, presence or absence of fines) with bed size and operating conditions (temperature, feed concentration, flow rate, seed size, etc.) and development of user oriented algorithm for the model are considered. A parameter sensitivity study of the model was also developed.
A Conceptual Model of Career Development to Enhance Academic Motivation
ERIC Educational Resources Information Center
Collins, Nancy Creighton
2010-01-01
The purpose of this study was to develop, refine, and validate a conceptual model of career development to enhance the academic motivation of community college students. To achieve this end, a straw model was built from the theoretical and empirical research literature. The model was then refined and validated through three rounds of a Delphi…
ERIC Educational Resources Information Center
De La Lama, Luisa Batthyany; De La Lama, Luis; Wittgenstein, Ariana
2012-01-01
This article presents the integrative soul mates relationship development model, which provides the helping professionals with a conceptual map for couples' relationship development from dating, to intimacy, to soul mating, and long-term flourishing. This model is informed by a holistic, a developmental, and a positive psychology conceptualization…
Highly Relevant Mentoring (HRM) as a Faculty Development Model for Web-Based Instruction
ERIC Educational Resources Information Center
Carter, Lorraine; Salyers, Vincent; Page, Aroha; Williams, Lynda; Albl, Liz; Hofsink, Clarence
2012-01-01
This paper describes a faculty development model called the highly relevant mentoring (HRM) model; the model includes a framework as well as some practical strategies for meeting the professional development needs of faculty who teach web-based courses. The paper further emphasizes the need for faculty and administrative buy-in for HRM and…
The complexity of air quality modeling systems, air quality monitoring data make ad-hoc systems for model evaluation important aids to the modeling community. Among those are the ENSEMBLE system developed by the EC-Joint Research Center, and the AMET software developed by the US-...
Assessing the feasibility, cost, and utility of developing models of human performance in aviation
NASA Technical Reports Server (NTRS)
Stillwell, William
1990-01-01
The purpose of the effort outlined in this briefing was to determine whether models exist or can be developed that can be used to address aviation automation issues. A multidisciplinary team has been assembled to undertake this effort, including experts in human performance, team/crew, and aviation system modeling, and aviation data used as input to such models. The project consists of two phases, a requirements assessment phase that is designed to determine the feasibility and utility of alternative modeling efforts, and a model development and evaluation phase that will seek to implement the plan (if a feasible cost effective development effort is found) that results from the first phase. Viewgraphs are given.
Self-calibrating models for dynamic monitoring and diagnosis
NASA Technical Reports Server (NTRS)
Kuipers, Benjamin
1994-01-01
The present goal in qualitative reasoning is to develop methods for automatically building qualitative and semiquantitative models of dynamic systems and to use them for monitoring and fault diagnosis. The qualitative approach to modeling provides a guarantee of coverage while our semiquantitative methods support convergence toward a numerical model as observations are accumulated. We have developed and applied methods for automatic creation of qualitative models, developed two methods for obtaining tractable results on problems that were previously intractable for qualitative simulation, and developed more powerful methods for learning semiquantitative models from observations and deriving semiquantitative predictions from them. With these advances, qualitative reasoning comes significantly closer to realizing its aims as a practical engineering method.
Numerical Modeling of River Ice Processes on the Lower Nelson River
NASA Astrophysics Data System (ADS)
Malenchak, Jarrod Joseph
Water resource infrastructure in cold regions of the world can be significantly impacted by the existence of river ice. Major engineering concerns related to river ice include ice jam flooding, the design and operation of hydropower facilities and other hydraulic structures, water supplies, as well as ecological, environmental, and morphological effects. The use of numerical simulation models has been identified as one of the most efficient means by which river ice processes can be studied and the effects of river ice evaluated. The continued advancement of these simulation models will help to develop new theories and evaluate potential mitigation alternatives for these ice issues. In this thesis, a literature review of existing river ice numerical models, of anchor ice formation and modeling studies, and of aufeis formation and modeling studies is conducted. A high-level summary of the two-dimensional CRISSP numerical model is presented, as well as the developed freeze-up model with a focus specifically on the anchor ice and aufeis growth processes. This model includes development in the detailed heat transfer calculations, an improved surface ice mass exchange model which includes the rapids entrainment process, and an improved dry bed treatment model along with the expanded anchor ice and aufeis growth model. The developed sub-models are tested in an ideal channel setting as a form of model confirmation. A case study of significant anchor ice and aufeis growth on the Nelson River in northern Manitoba, Canada, will be the primary field test case for the anchor ice and aufeis model. A second case study on the same river will be used to evaluate the surface ice components of the model in a field setting. The results from these case studies will be used to highlight the capabilities and deficiencies in the numerical model and to identify areas of further research and model development.
Development and Application of a Process-based River System Model at a Continental Scale
NASA Astrophysics Data System (ADS)
Kim, S. S. H.; Dutta, D.; Vaze, J.; Hughes, J. D.; Yang, A.; Teng, J.
2014-12-01
Existing global and continental scale river models, mainly designed for integration with global climate models, have very coarse spatial resolutions and lack many important hydrological processes, such as overbank flow, irrigation diversion, and groundwater seepage/recharge, which operate at much finer resolution. Thus, these models are not suitable for producing streamflow forecasts at fine spatial resolution and water accounts at sub-catchment levels, which are important for water resources planning and management at regional and national scales. A large-scale river system model has been developed and implemented for water accounting in Australia as part of the Water Information Research and Development Alliance between Australia's Bureau of Meteorology (BoM) and CSIRO. The model, developed using a node-link architecture, includes all major hydrological processes, anthropogenic water utilisation, and storage routing that influence streamflow in both regulated and unregulated river systems. It includes an irrigation model to compute water diversion for irrigation use and the associated fluxes and stores, and a storage-based floodplain inundation model to compute overbank flow from river to floodplain and the associated floodplain fluxes and stores. An auto-calibration tool has been built within the modelling system to automatically calibrate the model in large river systems using the Shuffled Complex Evolution optimiser and user-defined objective functions. The auto-calibration tool makes the model computationally efficient and practical for large basin applications. The model has been implemented in several large basins in Australia, including the Murray-Darling Basin, covering more than 2 million km². The results of calibration and validation of the model show highly satisfactory performance. The model has been operationalised in BoM for producing various fluxes and stores for national water accounting.
This paper introduces the newly developed river system model, describing the conceptual hydrological framework, the methods used for representing different hydrological processes in the model, and the results and evaluation of the model's performance. The operational implementation of the model for water accounting is discussed.
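The node-link storage-routing idea can be sketched as a chain of linear reservoirs, one per river reach. This is a generic illustration of storage routing, not the CSIRO/BoM model's actual routing scheme, and the residence times are hypothetical.

```python
def route_chain(inflow, k_values, dt=1.0):
    """Route an inflow hydrograph through a chain of linear reservoirs
    (one per node-link). Each reservoir releases outflow S/k, where S
    is its storage and k its residence time; the outflow of one reach
    is the inflow to the next."""
    storages = [0.0 for _ in k_values]
    out = []
    for q_in in inflow:
        q = q_in
        for j, k in enumerate(k_values):
            # explicit storage update: dS = (inflow - outflow) * dt
            storages[j] += (q - storages[j] / k) * dt
            q = storages[j] / k  # this reach's outflow feeds the next
        out.append(q)
    return out

# A pulse of inflow attenuates and delays as it passes down three reaches
hydrograph = route_chain([0, 10, 10, 0, 0, 0, 0, 0, 0, 0], [2.0, 2.0, 2.0])
```

The downstream peak is lower and later than the inflow peak, which is the qualitative behaviour storage routing is meant to capture; a real node-link model layers diversions, losses, and floodplain exchange onto each node.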
Novel residual-based large eddy simulation turbulence models for incompressible magnetohydrodynamics
NASA Astrophysics Data System (ADS)
Sondak, David
The goal of this work was to develop, introduce, and test a promising computational paradigm for the development of turbulence models for incompressible magnetohydrodynamics (MHD). MHD governs the behavior of an electrically conducting fluid in the presence of an external electromagnetic (EM) field. The incompressible MHD model is used in many engineering and scientific disciplines, from the development of nuclear fusion as a sustainable energy source to the study of space weather and solar physics. Many interesting MHD systems exhibit the phenomenon of turbulence, which remains an elusive problem from all scientific perspectives. This work focuses on the computational perspective and proposes techniques that enable the study of systems involving MHD turbulence. Direct numerical simulation (DNS) is not a feasible approach for studying MHD turbulence. In this work, turbulence models for incompressible MHD were developed from the variational multiscale (VMS) formulation, wherein the solution fields were decomposed into resolved and unresolved components. The unresolved components were modeled with a term that is proportional to the residual of the resolved scales. Two additional MHD models were developed based on the VMS formulation: a residual-based eddy viscosity (RBEV) model and a mixed model that partners the VMS formulation with the RBEV model. These models are endowed with several special numerical and physics features. Included in the numerical features is the internal numerical consistency of each of the models. Physically, the new models are able to capture desirable MHD physics such as the inverse cascade of magnetic energy and the subgrid dynamo effect. The models were tested with a Fourier-spectral numerical method and the finite element method (FEM). The primary test problem was the Taylor-Green vortex. Results comparing the performance of the new models to DNS were obtained.
The performance of the new models was compared to classic and cutting-edge dynamic Smagorinsky eddy viscosity (DSEV) models. The new models typically outperform the classical models.
Further Development of the PCRTM Model and RT Model Inter Comparison
NASA Technical Reports Server (NTRS)
Yang, Qiguang; Liu, Xu; Wu, Wan; Kizer, Susan
2015-01-01
New results for the development of the PCRTM model are presented. The new results were used for IASI retrieval validation intercomparison, and better results were obtained compared to other fast radiative transfer models.
An Introduction to the Partial Credit Model for Developing Nursing Assessments.
ERIC Educational Resources Information Center
Fox, Christine
1999-01-01
Demonstrates how the partial credit model, a variation of the Rasch Measurement Model, can be used to develop performance-based assessments for nursing education. Applies the model using the Practical Knowledge Inventory for Nurses. (SK)
Technical solutions to overcrowded park and ride facilities.
DOT National Transportation Integrated Search
2004-02-01
This report presents two transportation models that could be used for modeling park : and ride intermodal travelers. The first model developed is a static intermodal-planning : model that was developed for the New Jersey I80 transportation corridor a...
Payne, Courtney E.; Wolfrum, Edward J.
2015-03-12
Obtaining accurate chemical composition and reactivity (measures of carbohydrate release and yield) information for biomass feedstocks in a timely manner is necessary for the commercialization of biofuels. Our objective was to use near-infrared (NIR) spectroscopy and partial least squares (PLS) multivariate analysis to develop calibration models to predict the feedstock composition and the release and yield of soluble carbohydrates generated by a bench-scale dilute acid pretreatment and enzymatic hydrolysis assay. Major feedstocks included in the calibration models are corn stover, sorghum, switchgrass, perennial cool season grasses, rice straw, and miscanthus. We present individual model statistics to demonstrate model performance and validation samples to more accurately measure the predictive quality of the models. The PLS-2 model for composition predicts glucan, xylan, lignin, and ash (wt%) with uncertainties similar to primary measurement methods. A PLS-2 model was developed to predict glucose and xylose release following pretreatment and enzymatic hydrolysis. An additional PLS-2 model was developed to predict glucan and xylan yield. PLS-1 models were developed to predict the sum of glucose/glucan and xylose/xylan for release and yield (grams per gram). The release and yield models have higher uncertainties than the primary methods used to develop the models. In conclusion, it is possible to build effective multispecies feedstock models for composition, as well as carbohydrate release and yield. The model for composition is useful for predicting glucan, xylan, lignin, and ash with good uncertainties. The release and yield models have higher uncertainties; however, these models are useful for rapidly screening sample populations to identify unusual samples.
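The abstract names the technique (PLS calibration) but not the algorithm. A common way to build such models is the NIPALS iteration; the minimal PLS1 sketch below uses synthetic stand-in "spectra" rather than real NIR data, and the response is a hypothetical constituent, not the paper's glucan measurements:

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """PLS1 regression via NIPALS: returns a coefficient vector b and the
    column/response means, so that y ~= (X - x_mean) @ b + y_mean."""
    Xc = X - X.mean(0)
    yc = y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc                 # weight: covariance direction
        w = w / np.linalg.norm(w)
        t = Xc @ w                    # score
        tt = t @ t
        p = Xc.T @ t / tt             # loading
        q = (yc @ t) / tt             # regression of y on the score
        Xc = Xc - np.outer(t, p)      # deflate
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    b = W @ np.linalg.solve(P.T @ W, Q)
    return b, X.mean(0), y.mean()

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))               # stand-in for NIR absorbances
beta = np.array([1.0, -2.0, 0.5, 0.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta                               # stand-in constituent (e.g., wt%)
# With as many components as predictors, PLS reproduces the least-squares fit
b, xm, ym = pls1_fit(X, y, n_comp=8)
y_hat = (X - xm) @ b + ym
```

A real calibration would keep far fewer latent variables than predictors, chosen by cross-validation; the full-rank fit here is only to make the sketch verifiable.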
NASA Astrophysics Data System (ADS)
Campbell, Eleanor E.; Paustian, Keith
2015-12-01
Soil organic matter (SOM) is an important natural resource. It is fundamental to soil and ecosystem functions across a wide range of scales, from site-specific soil fertility and water holding capacity to global biogeochemical cycling. It is also a highly complex material that is sensitive to direct and indirect human impacts. In SOM research, simulation models play an important role by providing a mathematical framework to integrate, examine, and test the understanding of SOM dynamics. Simulation models of SOM are also increasingly used in more ‘applied’ settings to evaluate human impacts on ecosystem function, and to manage SOM for greenhouse gas mitigation, improved soil health, and sustainable use as a natural resource. Within this context, there is a need to maintain a robust connection between scientific developments in SOM modeling approaches and SOM model applications. This need forms the basis of this review. In this review we first provide an overview of SOM modeling, focusing on SOM theory, data-model integration, and model development as evidenced by a quantitative review of SOM literature. Second, we present the landscape of SOM model applications, focusing on examples in climate change policy. We conclude by discussing five areas of recent developments in SOM modeling including: (1) microbial roles in SOM stabilization; (2) modeling SOM saturation kinetics; (3) temperature controls on decomposition; (4) SOM dynamics in deep soil layers; and (5) SOM representation in earth system models. Our aim is to comprehensively connect SOM model development to its applications, revealing knowledge gaps in need of focused interdisciplinary attention and exposing pitfalls that, if avoided, can lead to best use of SOM models to support policy initiatives and sustainable land management solutions.
Development of a Linear Stirling Model with Varying Heat Inputs
NASA Technical Reports Server (NTRS)
Regan, Timothy F.; Lewandowski, Edward J.
2007-01-01
The linear model of the Stirling system developed by NASA Glenn Research Center (GRC) has been extended to include a user-specified heat input. Previously developed linear models were limited to the Stirling convertor and electrical load. They represented the thermodynamic cycle with pressure factors that remained constant. The numerical values of the pressure factors were generated by linearizing GRC's nonlinear System Dynamic Model (SDM) of the convertor at a chosen operating point. The pressure factors were fixed for that operating point; thus, the model lost accuracy if a transition to a different operating point were simulated. Although the previous linear model was used in developing controllers that manipulated current, voltage, and piston position, it could not be used in the development of control algorithms that regulated hot-end temperature. This basic model was extended to include the thermal dynamics associated with a hot-end temperature that varies over time in response to external changes as well as to changes in the Stirling cycle. The linear model described herein includes not only the dynamics of the piston, displacer, gas, and electrical circuit, but also the transient effects of the heater head thermal inertia. The linear version algebraically couples two separate linear dynamic models, one of the Stirling convertor and one of the thermal system, through the pressure factors. The thermal system model includes the heat flow of the heat transfer fluid, insulation loss, and temperature drops from the heat source to the Stirling convertor expansion space. The linear model was compared to a nonlinear model, and performance was very similar. The resulting linear model can be implemented in a variety of computing environments and is suitable for analysis with classical and state-space controls analysis techniques.
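Schematically, a coupled linear model of this kind is a state-space system x' = Ax + Bu in which one state is the hot-end temperature and u is the user-specified heat input. The matrices below are illustrative placeholders, not values from GRC's SDM linearization:

```python
import numpy as np

# Illustrative 3-state linear model: piston velocity, piston position,
# and hot-end temperature. A and B are made-up values, not SDM data.
A = np.array([[-0.5, -4.0,  0.1],    # velocity: damping, spring, thermal drive
              [ 1.0,  0.0,  0.0],    # position integrates velocity
              [ 0.0,  0.0, -0.05]])  # temperature relaxes (thermal inertia)
B = np.array([0.0, 0.0, 0.02])       # heat input enters the thermal state only

def simulate(x0, u, dt=0.01, steps=5000):
    """Forward-Euler integration of x' = A x + B u."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (A @ x + B * u)
    return x

# With no heat input, a perturbed piston state decays toward equilibrium
x_final = simulate([1.0, 0.0, 0.0], u=0.0)
```

Because the model is linear, the same matrices can be handed directly to classical or state-space controls tools (pole placement, LQR, frequency response), which is the point the abstract makes about the model's suitability for control design.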
Functional structure and dynamics of the human nervous system
NASA Technical Reports Server (NTRS)
Lawrence, J. A.
1981-01-01
The status of an effort to define the directions that need to be taken in extending pilot models is reported. These models are needed to perform closed-loop (man-in-the-loop) feedback flight control system designs and to develop cockpit display requirements. The approach taken is to develop a hypothetical working model of the human nervous system by reviewing the current literature in neurology and psychology, and then to develop a computer model of this hypothetical working model.
A User-Friendly Model for Spray Drying to Aid Pharmaceutical Product Development
Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L. J.; Frijlink, Henderik W.
2013-01-01
The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product, by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to produce glassy sugars by spray drying, an often used excipient in formulations of biopharmaceuticals. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach. PMID:24040240
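The abstract does not give the model equations; outlet temperature in such spray dryer models is commonly estimated from a steady-state energy balance on the drying gas and feed. The sketch below is a minimal version of that idea with hypothetical parameter values, not the published Büchi B-290 fit:

```python
def outlet_temperature(t_in, m_gas, cp_gas, f_liq, cp_liq, t_feed, h_vap):
    """Steady-state energy balance for a co-current spray dryer.

    Heat given up by the drying gas evaporates the liquid feed and raises
    it to the outlet temperature (heat losses neglected):
        m_gas*cp_gas*(t_in - t_out) = f_liq*(cp_liq*(t_out - t_feed) + h_vap)
    solved here for t_out.
    """
    num = m_gas * cp_gas * t_in + f_liq * cp_liq * t_feed - f_liq * h_vap
    den = m_gas * cp_gas + f_liq * cp_liq
    return num / den

# Hypothetical bench-scale settings: ~0.01 kg/s air at 150 degC inlet,
# 5 g/min aqueous feed at 20 degC
t_out = outlet_temperature(t_in=150.0, m_gas=0.01, cp_gas=1007.0,
                           f_liq=8.3e-5, cp_liq=4184.0, t_feed=20.0,
                           h_vap=2.26e6)
```

The paper's extension to predict outlet relative humidity would add a matching mass balance on the evaporated water; that step is omitted here.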
Thermal modeling of NiH2 batteries
NASA Technical Reports Server (NTRS)
Ponthus, Agnes-Marie; Alexandre, Alain
1994-01-01
The following are discussed: the NiH2 battery mission and environment; NiH2 cell heat dissipation; the Nodal software; the general philosophy of model development; NiH2 battery model development; and NiH2 experimental developments.
The Modular Modeling System (MMS): User's Manual
Leavesley, G.H.; Restrepo, Pedro J.; Markstrom, S.L.; Dixon, M.; Stannard, L.G.
1996-01-01
The Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide the research and operational framework needed to support development, testing, and evaluation of physical-process algorithms and to facilitate integration of user-selected sets of algorithms into operational physical-process models. MMS uses a module library that contains modules for simulating a variety of water, energy, and biogeochemical processes. A model is created by selectively coupling the most appropriate modules from the library to create a 'suitable' model for the desired application. Where existing modules do not provide appropriate process algorithms, new modules can be developed. The MMS user's manual provides installation instructions and a detailed discussion of system concepts, module development, and model development and application using the MMS graphical user interface.
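The module-library idea behind MMS, a library of process algorithms from which a model is assembled by coupling a selected subset, can be sketched as follows. This is a toy illustration, not MMS code; the module names and state variables are hypothetical:

```python
# Toy illustration of the MMS idea: a library of process modules and a
# model built by coupling a user-selected subset of them.

def snowmelt(state):
    melt = min(state["snowpack"], 0.3 * max(state["temp_c"], 0.0))
    state["snowpack"] -= melt
    state["soil_water"] += melt
    return state

def evapotranspiration(state):
    state["soil_water"] -= 0.1 * state["soil_water"]
    return state

def runoff(state):
    q = 0.2 * state["soil_water"]
    state["soil_water"] -= q
    state["streamflow"] = q
    return state

MODULE_LIBRARY = {"snowmelt": snowmelt,
                  "et": evapotranspiration,
                  "runoff": runoff}

def build_model(module_names):
    """Couple the selected modules into a model run once per time step."""
    modules = [MODULE_LIBRARY[name] for name in module_names]
    def model(state, steps):
        for _ in range(steps):
            for m in modules:
                state = m(state)
        return state
    return model

model = build_model(["snowmelt", "et", "runoff"])
final = model({"snowpack": 50.0, "soil_water": 10.0,
               "temp_c": 5.0, "streamflow": 0.0}, steps=10)
```

Where an application needs a process algorithm the library lacks, a new function with the same state-in/state-out signature can be registered, which mirrors the manual's point that new modules can be developed when existing ones do not fit.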
Transmission Line Jobs and Economic Development Impact (JEDI) Model User Reference Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, M.; Keyser, D.
The Jobs and Economic Development Impact (JEDI) models, developed through the National Renewable Energy Laboratory (NREL), are freely available, user-friendly tools that estimate the potential economic impacts of constructing and operating power generation projects for a range of conventional and renewable energy technologies. The Transmission Line JEDI model can be used to field questions about the economic impacts of transmission lines in a given state, region, or local community. This Transmission Line JEDI User Reference Guide was developed to provide basic instruction on operating the model and understanding the results. This guide also provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data contained in the model.
Model-Based Reasoning in Upper-division Lab Courses
NASA Astrophysics Data System (ADS)
Lewandowski, Heather
2015-05-01
Modeling, which includes developing, testing, and refining models, is a central activity in physics. Well-known examples from AMO physics include everything from the Bohr model of the hydrogen atom to the Bose-Hubbard model of interacting bosons in a lattice. Modeling, while typically considered a theoretical activity, is most fully represented in the laboratory, where measurements of real phenomena intersect with theoretical models, leading to refinement of models and experimental apparatus. However, experimental physicists use models in complex ways, and the process is often not made explicit in physics laboratory courses. We have developed a framework to describe the modeling process in physics laboratory activities. The framework attempts to abstract and simplify the complex modeling process undertaken by expert experimentalists. The framework can be applied to understand typical processes such as the modeling of measurement tools, modeling "black boxes," and signal processing. We demonstrate that the framework captures several important features of model-based reasoning in a way that can reveal common student difficulties in the lab and guide the development of curricula that emphasize modeling in the laboratory. We also use the framework to examine troubleshooting in the lab and guide students to effective methods and strategies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hobbs, Michael L.
We previously developed a PETN thermal decomposition model that accurately predicts thermal ignition and detonator failure [1]. This model was originally developed for CALORE [2] and required several complex user subroutines. Recently, a simplified version of the PETN decomposition model was implemented into ARIA [3] using a general chemistry framework without need for user subroutines. Detonator failure was also predicted with this new model using ENCORE. The model was simplified by 1) basing the model on moles rather than mass, 2) simplifying the thermal conductivity model, and 3) implementing ARIA's new phase change model. This memo briefly describes the model, implementation, and validation.
Disentangling the Role of Domain-Specific Knowledge in Student Modeling
NASA Astrophysics Data System (ADS)
Ruppert, John; Duncan, Ravit Golan; Chinn, Clark A.
2017-08-01
This study explores the role of domain-specific knowledge in students' modeling practice and how this knowledge interacts with two domain-general modeling strategies: use of evidence and developing a causal mechanism. We analyzed models made by middle school students who had a year of intensive model-based instruction. These models were made to explain a familiar but unstudied biological phenomenon: late onset muscle pain. Students were provided with three pieces of evidence related to this phenomenon and asked to construct a model to account for this evidence. Findings indicate that domain-specific resources play a significant role in the extent to which the models accounted for provided evidence. On the other hand, familiarity with the situation appeared to contribute to the mechanistic character of models. Our results indicate that modeling strategies alone are insufficient for the development of a mechanistic model that accounts for provided evidence and that, while learners can develop a tentative model with a basic familiarity of the situation, scaffolding certain domain-specific knowledge is necessary to assist students with incorporating evidence in modeling tasks.
The modeling of a standalone solid-oxide fuel cell auxiliary power unit
NASA Astrophysics Data System (ADS)
Lu, N.; Li, Q.; Sun, X.; Khaleel, M. A.
In this research, a Simulink model of a standalone vehicular solid-oxide fuel cell (SOFC) auxiliary power unit (APU) is developed. The SOFC APU model consists of three major components: a controller model; a power electronics system model; and an SOFC plant model, including an SOFC stack module, two heat exchanger modules, and a combustor module. This paper discusses the development of the nonlinear dynamic models for the SOFC stacks, the heat exchangers and the combustors. When coupling with a controller model and a power electronic circuit model, the developed SOFC plant model is able to model the thermal dynamics and the electrochemical dynamics inside the SOFC APU components, as well as the transient responses to the electric loading changes. It has been shown that having such a model for the SOFC APU will help design engineers to adjust design parameters to optimize the performance. The modeling results of the SOFC APU heat-up stage and the output voltage response to a sudden load change are presented in this paper. The fuel flow regulation based on fuel utilization is also briefly discussed.
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
2002-03-02
An apparatus and method for monitoring a process involve the development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the adapted deterministic model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation-error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
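The core two-stage idea, an analytic model of the known physics augmented by a trained correction for the unknown part, then a residual statistic for monitoring, can be sketched compactly. In this sketch a polynomial least-squares fit stands in for the patent's neural network, and the "process" is synthetic; none of the numbers come from the patent:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "process": a known linear part plus an unknown quadratic effect
x = np.linspace(0.0, 2.0, 80)
y_meas = 3.0 * x + 0.4 * x**2 + rng.normal(0.0, 0.01, x.size)

# Stage 1a: analytic model of the known process characteristics
def analytic(x):
    return 3.0 * x

# Stage 1b: augment with a data-driven term trained on the residual
# (a degree-2 least-squares polynomial stands in for the neural network)
residual = y_meas - analytic(x)
coeffs = np.polyfit(x, residual, deg=2)

def neuro_analytic(x):
    return analytic(x) + np.polyval(coeffs, x)

# Stage 2 (schematic): quantify the remaining uncertainty, which a
# sequential probability ratio test would then monitor on-line
sigma = float(np.std(y_meas - neuro_analytic(x)))
```

The combined model's residual spread sigma is far smaller than that of the analytic model alone, which is exactly what makes the qualified statistic useful for detecting process change.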
NASA Astrophysics Data System (ADS)
Gosselin, Marie-Christine; Neufeld, Esra; Moser, Heidi; Huber, Eveline; Farcito, Silvia; Gerber, Livia; Jedensjö, Maria; Hilber, Isabel; Di Gennaro, Fabienne; Lloyd, Bryn; Cherubini, Emilio; Szczerba, Dominik; Kainz, Wolfgang; Kuster, Niels
2014-09-01
The Virtual Family computational whole-body anatomical human models were originally developed for electromagnetic (EM) exposure evaluations, in particular to study how absorption of radiofrequency radiation from external sources depends on anatomy. However, the models immediately garnered much broader interest and are now applied by over 300 research groups, many from medical applications research fields. In a first step, the Virtual Family was expanded to the Virtual Population to provide considerably broader population coverage with the inclusion of models of both sexes ranging in age from 5 to 84 years old. Although these models have proven to be invaluable for EM dosimetry, it became evident that significantly enhanced models are needed for reliable effectiveness and safety evaluations of diagnostic and therapeutic applications, including medical implants safety. This paper describes the research and development performed to obtain anatomical models that meet the requirements necessary for medical implant safety assessment applications. These include implementation of quality control procedures, re-segmentation at higher resolution, more-consistent tissue assignments, enhanced surface processing and numerous anatomical refinements. Several tools were developed to enhance the functionality of the models, including discretization tools, posing tools to expand the posture space covered, and multiple morphing tools, e.g., to develop pathological models or variations of existing ones. A comprehensive tissue properties database was compiled to complement the library of models. The results are a set of anatomically independent, accurate, and detailed models with smooth, yet feature-rich and topologically conforming surfaces. The models are therefore suited for the creation of unstructured meshes, and the possible applications of the models are extended to a wider range of solvers and physics. 
The impact of these improvements is shown for the MRI exposure of an adult woman with an orthopedic spinal implant. Future developments include the functionalization of the models for specific physical and physiological modeling tasks.
Evaluation of Proteus as a Tool for the Rapid Development of Models of Hydrologic Systems
NASA Astrophysics Data System (ADS)
Weigand, T. M.; Farthing, M. W.; Kees, C. E.; Miller, C. T.
2013-12-01
Models of modern hydrologic systems can be complex and involve a variety of operators with varying character. The goal is to implement approximations of such models that are both efficient for the developer and computationally efficient, which is a set of naturally competing objectives. Proteus is a Python-based toolbox that supports prototyping of model formulations as well as a wide variety of modern numerical methods and parallel computing. We used Proteus to develop numerical approximations for three models: Richards' equation, a brine flow model derived using the Thermodynamically Constrained Averaging Theory (TCAT), and a multiphase TCAT-based tumor growth model. For Richards' equation, we investigated discontinuous Galerkin solutions with higher order time integration based on the backward difference formulas. The TCAT brine flow model was implemented using Proteus and a variety of numerical methods were compared to hand coded solutions. Finally, an existing tumor growth model was implemented in Proteus to introduce more advanced numerics and allow the code to be run in parallel. From these three example models, Proteus was found to be an attractive open-source option for rapidly developing high quality code for solving existing and evolving computational science models.
AgMIP Training in Multiple Crop Models and Tools
NASA Technical Reports Server (NTRS)
Boote, Kenneth J.; Porter, Cheryl H.; Hargreaves, John; Hoogenboom, Gerrit; Thornburn, Peter; Mutter, Carolyn
2015-01-01
The Agricultural Model Intercomparison and Improvement Project (AgMIP) has the goal of using multiple crop models to evaluate climate impacts on agricultural production and food security in developed and developing countries. There are several major limitations that must be overcome to achieve this goal, including the need to train AgMIP regional research team (RRT) crop modelers to use models other than the ones they are currently familiar with, plus the need to harmonize and interconvert the disparate input file formats used for the various models. Two activities were followed to address these shortcomings among AgMIP RRTs to enable them to use multiple models to evaluate climate impacts on crop production and food security. We designed and conducted courses in which participants trained on two different sets of crop models, with emphasis on the model of least experience. In a second activity, the AgMIP IT group created templates for inputting data on soils, management, weather, and crops into AgMIP harmonized databases, and developed translation tools for converting the harmonized data into files that are ready for multiple crop model simulations. The strategies for creating and conducting the multi-model course and developing entry and translation tools are reviewed in this chapter.
NASA Technical Reports Server (NTRS)
Pace, Dale K.
2000-01-01
A simulation conceptual model is a simulation developer's way of translating modeling requirements (i.e., what is to be represented by the simulation or its modification) into a detailed design framework (i.e., how it is to be done), from which the software, hardware, networks (in the case of distributed simulation), and systems/equipment that will make up the simulation can be built or modified. A conceptual model is the collection of information which describes a simulation developer's concept about the simulation and its pieces. That information consists of assumptions, algorithms, characteristics, relationships, and data. Taken together, these describe how the simulation developer understands what is to be represented by the simulation (entities, actions, tasks, processes, interactions, etc.) and how that representation will satisfy the requirements to which the simulation responds. Thus the conceptual model is the basis for judgment about simulation fidelity and validity for any condition that is not specifically tested. The more perspicuous and precise the conceptual model, the more likely it is that the simulation development will both fully satisfy requirements and allow demonstration that the requirements are satisfied (i.e., validation). Methods used in simulation conceptual model development have significant implications for simulation management and for assessment of simulation uncertainty. This paper suggests how to develop and document a simulation conceptual model so that simulation fidelity and validity can be most effectively determined. These ideas for conceptual model development apply to all simulation varieties. The paper relates these ideas to uncertainty assessments as they relate to simulation fidelity and validity. The paper also explores implications for simulation management from conceptual model development methods, especially relative to reuse of simulation components.
ERIC Educational Resources Information Center
Foust, Gretchen E.; Goslee, Patricia A.
2014-01-01
The Professional Development School (PDS) model, a successful collaborative partnership model between university teacher education programs and P-12 schools, focuses on "preparing future educators, providing current educators with ongoing professional development, encouraging joint school-university faculty investigation of education-related…
Development of a finite element model of the Thor crash test dummy
DOT National Transportation Integrated Search
2000-03-06
The paper describes the development of a detailed finite element model of the new advanced frontal crash test dummy, Thor. The Volpe Center is developing the model for LS-DYNA in collaboration with GESAC, the dummy hardware developer, under the direc...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allgor, R.J.; Feehery, W.F.; Tolsma, J.E.
The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models needed to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.
NASA Technical Reports Server (NTRS)
Simmons, J.; Erlich, D.; Shockey, D.
2009-01-01
A team consisting of Arizona State University, Honeywell Engines, Systems & Services, the National Aeronautics and Space Administration Glenn Research Center, and SRI International collaborated to develop computational models and verification testing for designing and evaluating turbine engine fan blade fabric containment structures. This research was conducted under the Federal Aviation Administration Airworthiness Assurance Center of Excellence and was sponsored by the Aircraft Catastrophic Failure Prevention Program. The research was directed toward improving the modeling of a turbine engine fabric containment structure for an engine blade-out containment demonstration test required for certification of aircraft engines. The research conducted in Phase II established a new level of capability to design and develop fan blade containment systems for turbine engines. Significant progress was made in three areas: (1) the ballistic fabric model was developed further to increase confidence and robustness in the Kevlar(TradeName) and Zylon(TradeName) material models developed in Phase I, (2) the capability for finite element modeling of multiple layers of fabric using multiple layers of shell elements was improved, and (3) large-scale simulations were performed. This report concentrates on the material model development and simulations of the impact tests.
A Review of Surface Water Quality Models
Li, Shibei; Jia, Peng; Qi, Changjun; Ding, Feng
2013-01-01
Surface water quality models can be useful tools to simulate and predict the levels, distributions, and risks of chemical pollutants in a given water body. The modeling results from these models under different pollution scenarios are very important components of environmental impact assessment and can provide a basis and technical support for environmental management agencies to make sound decisions. Whether the model results are correct or not can affect the soundness and scientific validity of approved construction projects and the effectiveness of pollution control measures. We review the development of surface water quality models in three stages and analyze the suitability, precision, and methods of different models. Standardization of water quality models can help environmental management agencies guarantee consistency in the application of water quality models for regulatory purposes. We summarize the status of standardization of these models in developed countries and put forward feasible measures for the standardization of surface water quality models, especially in developing countries. PMID:23853533
Wolfs, Vincent; Villazon, Mauricio Florencio; Willems, Patrick
2013-01-01
Applications such as real-time control, uncertainty analysis, and optimization require an extensive number of model iterations. Full hydrodynamic sewer models are not sufficient for these applications because of their excessive computation time, so simplifications are required. A lumped conceptual modelling approach results in much faster calculation. The process of identifying and calibrating the conceptual model structure can, however, be time-consuming, and many conceptual models lack accuracy or do not account for backwater effects. To overcome these problems, a modelling methodology was developed which is suited for semi-automatic calibration. The methodology is tested on the sewer system of the city of Geel in the Grote Nete river basin in Belgium, using both synthetic design storm events and long time series of rainfall input. A MATLAB/Simulink® tool was developed to guide the modeller through the step-wise model construction, significantly reducing the time required for the conceptual modelling process.
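The lumped conceptual approach can be illustrated with a minimal linear-reservoir cascade; the reservoir constant, cascade length, and rainfall series below are hypothetical placeholders, not the calibrated Geel model structure.

```python
def linear_reservoir_cascade(rainfall, k=0.5, n=2):
    """Route a rainfall series through n linear reservoirs in series.

    Each reservoir releases outflow = k * storage per time step, a
    common building block of lumped conceptual runoff models.
    """
    inflow = rainfall
    for _ in range(n):
        storage = 0.0
        outflow = []
        for q_in in inflow:
            storage += q_in
            q_out = k * storage
            storage -= q_out
            outflow.append(q_out)
        inflow = outflow
    return outflow

# A single rainfall pulse is attenuated and delayed by the cascade
hydrograph = linear_reservoir_cascade([10.0, 0.0, 0.0, 0.0])
```

Each reservoir attenuates and delays the inflow pulse, which is why a short calibrated cascade can mimic a full hydrodynamic model at a tiny fraction of the computational cost.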
THE EARTH SYSTEM PREDICTION SUITE: Toward a Coordinated U.S. Modeling Capability
Theurich, Gerhard; DeLuca, C.; Campbell, T.; Liu, F.; Saint, K.; Vertenstein, M.; Chen, J.; Oehmke, R.; Doyle, J.; Whitcomb, T.; Wallcraft, A.; Iredell, M.; Black, T.; da Silva, AM; Clune, T.; Ferraro, R.; Li, P.; Kelley, M.; Aleinov, I.; Balaji, V.; Zadeh, N.; Jacob, R.; Kirtman, B.; Giraldo, F.; McCarren, D.; Sandgathe, S.; Peckham, S.; Dunlap, R.
2017-01-01
The Earth System Prediction Suite (ESPS) is a collection of flagship U.S. weather and climate models and model components that are being instrumented to conform to interoperability conventions, documented to follow metadata standards, and made available either under open source terms or to credentialed users. The ESPS represents a culmination of efforts to create a common Earth system model architecture, and the advent of increasingly coordinated model development activities in the U.S. ESPS component interfaces are based on the Earth System Modeling Framework (ESMF), community-developed software for building and coupling models, and the National Unified Operational Prediction Capability (NUOPC) Layer, a set of ESMF-based component templates and interoperability conventions. This shared infrastructure simplifies the process of model coupling by guaranteeing that components conform to a set of technical and semantic behaviors. The ESPS encourages distributed, multi-agency development of coupled modeling systems, controlled experimentation and testing, and exploration of novel model configurations, such as those motivated by research involving managed and interactive ensembles. ESPS codes include the Navy Global Environmental Model (NavGEM), HYbrid Coordinate Ocean Model (HYCOM), and Coupled Ocean Atmosphere Mesoscale Prediction System (COAMPS®); the NOAA Environmental Modeling System (NEMS) and the Modular Ocean Model (MOM); the Community Earth System Model (CESM); and the NASA ModelE climate model and GEOS-5 atmospheric general circulation model. PMID:29568125
2013-06-01
Building information exchange (COBie), Building Information Modeling (BIM) … to develop a life-cycle building model have resulted in the definition of a "core" building information model that contains general information … develop an information-exchange Model View Definition (MVD) for building electrical systems. The objective of the current work was to document the…
A Generic Modeling Process to Support Functional Fault Model Development
NASA Technical Reports Server (NTRS)
Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.
2016-01-01
Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure-effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system, including its design, operation, and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and its impact discussed.
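The core FFM idea of propagating failure effects from failure modes to observation points amounts to reachability in a directed graph. A minimal sketch of that qualitative diagnostic, with a hypothetical valve-failure graph (not NASA's actual FFM tooling or conventions):

```python
from collections import defaultdict

def reachable_observations(edges, failure_mode):
    """Return the observation points a failure mode can affect, by
    following failure-effect propagation paths in a directed graph."""
    graph = defaultdict(list)
    for src, dst in edges:
        graph[src].append(dst)
    seen, stack = set(), [failure_mode]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(graph[node])
    return {n for n in seen if n.startswith("obs:")}

# Hypothetical stuck-valve failure propagating to two sensors
edges = [("fm:valve_stuck", "pressure_drop"),
         ("pressure_drop", "obs:pressure_sensor"),
         ("pressure_drop", "flow_loss"),
         ("flow_loss", "obs:flow_meter")]
```

Comparing the predicted observation set against the sensors that actually alarm is the essence of the diagnostic an FFM provides; a generic component library standardizes how those edges are written.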
Challenges in the development of chronic pulmonary hypertension models in large animals
Rothman, Abraham; Wiencek, Robert G.; Davidson, Stephanie; Evans, William N.; Restrepo, Humberto; Sarukhanov, Valeri; Mann, David
2017-01-01
Pulmonary hypertension (PH) results in significant morbidity and mortality. Chronic PH animal models may advance the study of PH’s mechanisms, evolution, and therapy. In this report, we describe the challenges and successes in developing three models of chronic PH in large animals: two models (one canine and one swine) utilized repeated infusions of ceramic microspheres into the pulmonary vascular bed, and the third model employed a surgical aorto-pulmonary shunt. In the canine model, seven dogs underwent microsphere infusions that resulted in progressive elevation of pulmonary arterial pressure over a few months. In this model, pulmonary endoarterial tissue was obtained for histology. In the aorto-pulmonary shunt swine model, 17 pigs developed systemic level pulmonary pressures after 2–3 months. In this model, pulmonary endoarterial tissue was sequentially obtained to assess for changes in gene and microRNA expression. In the swine microsphere infusion model, three pigs developed only a modest chronic increase in pulmonary arterial pressure, despite repeated infusions of microspheres (up to 40 in one animal). The main purpose of this model was for vasodilator testing, which was performed successfully immediately after acute microsphere infusions. Chronic PH in large animal models can be successfully created; however, a model’s characteristics need to match the investigational goals. PMID:28680575
Specifications of insilicoML 1.0: a multilevel biophysical model description language.
Asai, Yoshiyuki; Suzuki, Yasuyuki; Kido, Yoshiyuki; Oka, Hideki; Heien, Eric; Nakanishi, Masao; Urai, Takahito; Hagihara, Kenichi; Kurachi, Yoshihisa; Nomura, Taishin
2008-12-01
An extensible markup language (XML) format, insilicoML (ISML), version 0.1, for describing multilevel biophysical models has been developed and made available in the public domain. ISML is fully compatible with CellML 1.0, a model description standard developed by the IUPS Physiome Project, for enhancing knowledge integration and model sharing. This article illustrates the new specifications of ISML 1.0, which largely extend the capability of ISML 0.1. ISML 1.0 can describe various types of mathematical models, including ordinary/partial differential/difference equations representing the dynamics of physiological functions and the geometry of living organisms underlying the functions. ISML 1.0 describes a model using a set of functional elements (modules), each of which can specify mathematical expressions of the functions. Structural and logical relationships between any two modules are specified by edges, which allow modular, hierarchical, and/or network representations of the model. The role of edge relationships is enriched by keywords for use in constructing a physiological ontology. The ontology is further improved by the traceability of the history of the model's development and by linking between different ISML models stored in the model database using meta-information. ISML 1.0 is designed to operate with a model database and integrated environments for model development and simulation, for knowledge integration and discovery.
DATA FOR ENVIRONMENTAL MODELING: AN OVERVIEW
The objective of the project described here, entitled Data for Environmental Modeling (D4EM), is the development of a comprehensive set of software tools that allow an environmental model developer to automatically populate model input files with environmental data available from...
DOT National Transportation Integrated Search
2011-11-01
This report summarizes the technical work performed developing and incorporating Metropolitan Planning : Organization sub-models into the existing Texas Revenue Estimator and Needs Determination System : (TRENDS) model. Additionally, this report expl...
Development of weekend travel demand and mode choice models : final report, June 2009.
DOT National Transportation Integrated Search
2010-06-30
Travel demand models are widely used for forecasting and analyzing policies for automobile and transit travel. However, these models are typically developed for average weekday travel when regular activities are routine. The weekday models focus prim...
Development of the NASA Digital Astronaut Project Muscle Model
NASA Technical Reports Server (NTRS)
Lewandowski, Beth E.; Pennline, James A.; Thompson, W. K.; Humphreys, B. T.; Ryder, J. W.; Ploutz-Snyder, L. L.; Mulugeta, L.
2015-01-01
This abstract describes development work performed on the NASA Digital Astronaut Project (DAP) Muscle Model. Muscle atrophy is a known physiological response to exposure to a low-gravity environment. The DAP muscle model computationally predicts the change in muscle structure and function versus time in a reduced-gravity environment. The spaceflight muscle model can then be used in biomechanical models of exercise countermeasures and spaceflight tasks to: 1) develop site-specific bone loading input to the DAP bone adaptation model over the course of a mission; 2) predict astronaut performance of spaceflight tasks; and 3) inform the effectiveness of new exercise countermeasure concepts.
NASA Technical Reports Server (NTRS)
Briggs, Maxwell H.
2011-01-01
The Fission Power System (FPS) project is developing a Technology Demonstration Unit (TDU) to verify the performance and functionality of a subscale version of the FPS reference concept in a relevant environment, and to verify component and system models. As hardware is developed for the TDU, component and system models must be refined to include the details of specific component designs. This paper describes the development of a Sage-based pseudo-steady-state Stirling convertor model and its implementation into a system-level model of the TDU.
Automatic inference of multicellular regulatory networks using informative priors.
Sun, Xiaoyun; Hong, Pengyu
2009-01-01
To fully understand the mechanisms governing animal development, computational models and algorithms are needed to enable quantitative studies of the underlying regulatory networks. We developed a mathematical model based on dynamic Bayesian networks to model multicellular regulatory networks that govern cell differentiation processes. A machine-learning method was developed to automatically infer such a model from heterogeneous data. We show that the model inference procedure can be greatly improved by incorporating interaction data across species. The proposed approach was applied to C. elegans vulval induction to reconstruct a model capable of simulating C. elegans vulval induction under 73 different genetic conditions.
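The dynamic-Bayesian-network formulation can be pictured as a per-time-slice propagation of activation probabilities. A toy sketch follows; the two-gene network and conditional probabilities are illustrative placeholders, not the C. elegans vulval-induction model:

```python
def dbn_step(probs, cpt):
    """Propagate marginal activation probabilities one time slice forward.

    probs: dict gene -> P(active at t-1)
    cpt:   dict gene -> (parent, P(active | parent active),
                                 P(active | parent inactive))
    Treats each gene's single parent marginal independently -- a
    simplification of full dynamic Bayesian network inference.
    """
    new_probs = {}
    for gene, (parent, p_on, p_off) in cpt.items():
        p_parent = probs[parent]
        new_probs[gene] = p_parent * p_on + (1.0 - p_parent) * p_off
    return new_probs

# Hypothetical two-gene cascade: a self-sustaining signal activates a target
cpt = {"signal": ("signal", 0.9, 0.1),
       "target": ("signal", 0.8, 0.05)}
state = {"signal": 1.0, "target": 0.0}
state = dbn_step(state, cpt)
```

Learning such a model from data then amounts to estimating the conditional probability tables and network structure, which is where the paper's informative cross-species priors come in.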
SABRINA - an interactive geometry modeler for MCNP
DOE Office of Scientific and Technical Information (OSTI.GOV)
West, J.T.; Murphy, J.
One of the most difficult tasks when analyzing a complex three-dimensional system with Monte Carlo is geometry model development. SABRINA attempts to make the modeling process more user-friendly and less of an obstacle. It accepts both combinatorial solid bodies and MCNP surfaces and produces MCNP cells. The model development process in SABRINA is highly interactive and gives the user immediate feedback on errors. Users can view their geometry from arbitrary perspectives while the model is under development and interactively find and correct modeling errors. An example of a SABRINA display is shown. It represents a complex three-dimensional shape.
Animal models in the research of abdominal aortic aneurysms development.
Patelis, N; Moris, D; Schizas, D; Damaskos, C; Perrea, D; Bakoyiannis, C; Liakakos, T; Georgopoulos, S
2017-12-20
Abdominal aortic aneurysm (AAA) is a prevalent and potentially life-threatening disease. Many animal models have been developed to simulate the natural history of the disease or to test preclinical endovascular devices and surgical procedures. The aim of this review is to describe different methods of AAA induction in animal models and report on the effectiveness of the methods described in inducing an analogue of a human AAA. The PubMed database was searched for publications with titles containing the following terms "animal" or "animal model(s)" and keywords "research", "aneurysm(s)", "aorta", "pancreatic elastase", "Angiotensin", "AngII", "calcium chloride" or "CaCl(2)". The starting date for this search was set to 2004, since earlier bibliography was already covered by the review of Daugherty and Cassis (2004). We focused on animal studies that reported a model of aneurysm development and progression. A number of different approaches to AAA induction in animal models have been developed, used, and combined since the first report in the 1960s. Although specific methods are successful in AAA induction in animal models, it is necessary that these methods and their respective results are in line with the pathophysiology and the mechanisms involved in human AAA development. A researcher should know the advantages and disadvantages of each animal model and choose the appropriate model.
NASA Astrophysics Data System (ADS)
Abbasi Baharanchi, Ahmadreza
This dissertation focused on the development and utilization of numerical and experimental approaches to improve the CFD modeling of the fluidization flow of cohesive micron-size particles. The specific objectives of this research were: (1) developing a cluster prediction mechanism applicable to Two-Fluid Modeling (TFM) of gas-solid systems; (2) developing more accurate drag models for TFM of gas-solid fluidization flow in the presence of cohesive interparticle forces; (3) using the developed models to explore the improvement of accuracy of TFM in simulation of fluidization flow of cohesive powders; (4) understanding the causes and influential factors that led to improvements, and quantifying those improvements; and (5) gathering data from a fast fluidization flow and using these data for benchmark validations. Simulation results with the two developed cluster-aware drag models showed that cluster prediction could effectively influence the results in both the first and second cluster-aware models. It was shown that the improvement in accuracy of TFM modeling using three versions of the first hybrid model was significant, and the best improvements were obtained by using the smallest values of the switch parameter, which captured the smallest chances of cluster prediction. In the case of the second hybrid model, the dependence of the critical model parameter on Reynolds number alone meant that the improvement in accuracy was significant only in the dense section of the fluidized bed. This finding suggests that a more sophisticated particle-resolved DNS model, which can span a wide range of solid volume fractions, could be used in the formulation of the cluster-aware drag model. The results of experiments using high-speed imaging indicated the presence of particle clusters in the fluidization flow of FCC inside the riser of the FIU-CFB facility.
In addition, pressure data were successfully captured along the fluidization column of the facility and used as benchmark validation data for the second hybrid model developed in the present dissertation. It was shown that the second hybrid model could predict the pressure data in the dense section of the fluidization column with better accuracy.
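A cluster-aware hybrid drag closure of the general kind described can be sketched as a switch between correlations. The Schiller-Naumann base correlation below is standard, but the cluster correction factor and switch threshold are illustrative placeholders, not the dissertation's actual formulation:

```python
def hybrid_drag_coefficient(re_p, cluster_detected, switch=100.0):
    """Particle drag coefficient with a crude cluster correction.

    Uses the standard Schiller-Naumann correlation for Re_p < 1000 and
    applies an illustrative reduction factor when a cluster is predicted,
    mimicking the reduced effective drag inside particle clusters.
    """
    cd = 24.0 / re_p * (1.0 + 0.15 * re_p ** 0.687) if re_p < 1000.0 else 0.44
    if cluster_detected and re_p < switch:
        cd *= 0.5   # placeholder cluster correction factor (hypothetical)
    return cd
```

In a real TFM closure the correction would be a continuous function fit to resolved simulations rather than a constant factor, but the switch-on-flow-state structure is the same.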
Development of an Integrated Hydrologic Modeling System for Rainfall-Runoff Simulation
NASA Astrophysics Data System (ADS)
Lu, B.; Piasecki, M.
2008-12-01
This paper aims to present the development of an integrated hydrological model which involves functionalities of digital watershed processing, online data retrieval, hydrologic simulation, and post-event analysis. The proposed system is intended to work as a back end to the CUAHSI HIS cyberinfrastructure developments. As a first step in developing this system, a physics-based distributed hydrologic model, PIHM (Penn State Integrated Hydrologic Model), is wrapped into the OpenMI (Open Modeling Interface and Environment) environment so as to interact seamlessly with OpenMI-compliant meteorological models. The graphical user interface is being developed from the open GIS application MapWindow, which permits functionality expansion through the addition of plug-ins. Modules required to set up through the GUI workboard include those for retrieving meteorological data from existing databases or meteorological prediction models, obtaining geospatial data from the output of digital watershed processing, and importing initial and boundary conditions. They are connected to the OpenMI-compliant PIHM to simulate rainfall-runoff processes, and a module automatically displays output after the simulation. Online databases are accessed through the WaterOneFlow web services, and the retrieved data are either stored in an observation database (OD) following the schema of the Observation Data Model (ODM), in the case of time-series support, or in a grid-based storage facility, which may be a format like netCDF or a grid-based database schema. Specific development steps include the creation of a bridge to overcome the interoperability issue between PIHM and the ODM, as well as the embedding of TauDEM (Terrain Analysis Using Digital Elevation Models) into the model. This module is responsible for developing the watershed and stream network from digital elevation models.
Visualizing and editing geospatial data is achieved by using MapWinGIS, an ActiveX control developed by the MapWindow team. After application to a practical watershed, the performance of the model can be tested by the post-event analysis module.
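OpenMI-style coupling is pull-driven: a downstream component requests values from the component it is linked to at each time step. A minimal sketch of that pattern follows; the toy components and runoff coefficient are hypothetical, and the real OpenMI standard defines a much richer interface contract:

```python
class RainfallComponent:
    """Toy upstream component supplying rainfall at requested times."""
    def get_values(self, time):
        return 2.0 if 1 <= time <= 3 else 0.0

class RunoffComponent:
    """Toy downstream component that pulls rainfall from a linked
    source at each request, in the OpenMI pull-driven style."""
    def __init__(self, source, coeff=0.4):
        self.source, self.coeff, self.storage = source, coeff, 0.0

    def get_values(self, time):
        self.storage += self.source.get_values(time)  # pull from link
        flow = self.coeff * self.storage
        self.storage -= flow
        return flow

link = RunoffComponent(RainfallComponent())
series = [link.get_values(t) for t in range(6)]
```

Because each component only exposes `get_values`, a wrapped model such as PIHM can be swapped in behind the same call without the caller changing.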
Model based design introduction: modeling game controllers to microprocessor architectures
NASA Astrophysics Data System (ADS)
Jungwirth, Patrick; Badawy, Abdel-Hameed
2017-04-01
We present an introduction to model-based design. Model-based design is a visual representation, generally a block diagram, used to model and incrementally develop a complex system. It is a commonly used design methodology for digital signal processing, control systems, and embedded systems. Model-based design's philosophy is to solve a problem one step at a time; the approach can be compared to a series of steps that converge to a solution. A block diagram simulation tool allows a design to be simulated with real-world measurement data. For example, if an analog control system is being upgraded to a digital control system, the analog sensor input signals can be recorded. The digital control algorithm can then be simulated with the real-world sensor data, and the output from the simulated digital control system compared to the old analog-based control system. Model-based design can be compared to Agile software development. The Agile goal is to deliver working software in incremental steps, with progress measured in completed and tested code units; in model-based design, progress is measured in completed and tested blocks. We present a concept for a video game controller and then use model-based design to iterate the design toward a working system. We also describe a model-based design effort to develop an OS Friendly Microprocessor Architecture based on the RISC-V.
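The workflow described, replaying recorded analog sensor data through a candidate digital controller, can be sketched in a few lines; the PI gains and the recorded trace below are hypothetical, not values from the paper:

```python
def simulate_digital_pi(setpoint, recorded_sensor, kp=0.6, ki=0.2, dt=0.01):
    """Replay recorded sensor samples through a discrete PI controller
    and return the command it would have issued at each sample."""
    integral, commands = 0.0, []
    for measurement in recorded_sensor:
        error = setpoint - measurement
        integral += error * dt
        commands.append(kp * error + ki * integral)
    return commands

# Hypothetical recorded sensor trace approaching the setpoint
trace = [0.0, 0.5, 0.8, 0.95, 1.0]
cmds = simulate_digital_pi(setpoint=1.0, recorded_sensor=trace)
```

The simulated command history can then be compared sample-by-sample against the logged output of the legacy analog controller before any hardware is replaced.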
Sheriff, R; Banks, A
2001-01-01
Organization change efforts have led to critically examining the structure of education and development departments within hospitals. This qualitative study evaluated an education and development model in an academic health sciences center. The model combines centralization and decentralization. The study results can be used by staff development educators and administrators when organization structure is questioned. This particular model maximizes the benefits and minimizes the limitations of centralized and decentralized structures.
2013-10-01
…to Bench Modeling for Developing Treatment and Rehabilitation Strategies. Principal Investigator: Geoffrey Manley, MD, PhD. Grant number W81XWH-10-1-0912. … treatment of this "dual-diagnosis" are lacking. This project proposed using current clinical-practice evidence to guide development of an animal model to…
NASA Technical Reports Server (NTRS)
Kopasakis, George
2014-01-01
The presentation covers a recently developed methodology to model atmospheric turbulence as disturbances for aero-vehicle gust loads and for controls development, such as flutter and inlet shock position. The approach models atmospheric turbulence in its natural fractional-order form, which provides more accuracy compared to traditional methods like the Dryden model, especially for high-speed vehicles. The presentation provides a historical background on atmospheric turbulence modeling and the approaches utilized for air vehicles. This is followed by the motivation and the methodology utilized to develop the fractional-order atmospheric turbulence modeling approach. Some examples covering the application of this method are also provided, followed by concluding remarks.
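For context, the traditional Dryden model that the abstract contrasts against shapes white noise through a rational (integer-order) filter. A minimal sketch of the longitudinal channel follows; the parameter values are hypothetical and normalization conventions vary across references:

```python
import math
import random

def dryden_longitudinal_gust(n, dt, V=100.0, L=200.0, sigma=1.5, seed=1):
    """Generate a longitudinal gust series by driving the first-order
    Dryden shaping filter H(s) = sigma*sqrt(2L/(pi*V)) / (1 + (L/V)s)
    with Gaussian white noise (forward-Euler discretization).

    V: airspeed, L: turbulence scale length, sigma: turbulence intensity.
    """
    rng = random.Random(seed)
    tau = L / V                                  # filter time constant
    gain = sigma * math.sqrt(2.0 * L / (math.pi * V))
    u, series = 0.0, []
    for _ in range(n):
        w = rng.gauss(0.0, 1.0) / math.sqrt(dt)  # unit-intensity white noise
        u += dt * (gain * w - u) / tau
        series.append(u)
    return series

gust = dryden_longitudinal_gust(1000, 0.01)
```

A fractional-order model replaces the integer power in the filter denominator with a fractional exponent, which is what lets it match measured turbulence spectra more closely at high speed.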
Probabilistic Models for Solar Particle Events
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.; Xapsos, Michael
2009-01-01
Probabilistic models of Solar Particle Events (SPEs) are used in space mission design studies to describe the radiation environment that can be expected at a specified confidence level. The task of the designer is then to choose a design that will operate in the model radiation environment. Probabilistic models have already been developed for solar proton events that describe the peak flux, event-integrated fluence, and mission-integrated fluence. In addition, a probabilistic model has been developed that describes the mission-integrated fluence for the Z>2 elemental spectra. This talk will focus on completing this suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements.
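Using such a probabilistic specification boils down to evaluating a fitted distribution at the design confidence level. A minimal sketch, assuming a lognormal fit with hypothetical parameters (not the actual coefficients of any of the models named above):

```python
import math
from statistics import NormalDist

def fluence_at_confidence(median, sigma_ln, confidence):
    """Event fluence not exceeded at the given design confidence level,
    assuming ln(fluence) ~ Normal(ln(median), sigma_ln**2)."""
    z = NormalDist().inv_cdf(confidence)   # standard normal quantile
    return median * math.exp(sigma_ln * z)

# Hypothetical fit: median event fluence 1e9 protons/cm^2, sigma_ln = 1.2
design_fluence = fluence_at_confidence(1e9, 1.2, 0.95)
```

The designer then sizes shielding and part selection against the 95% (or other chosen confidence) environment rather than a single worst-case event.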
NASA Technical Reports Server (NTRS)
Santi, L. Michael; Helmicki, Arthur J.
1993-01-01
The objective of Phase I of this research effort was to develop an advanced mathematical-empirical model of SSME steady-state performance. Task 6 of Phase I was to develop a component-specific modification strategy for baseline-case influence coefficient matrices. This report describes the background of SSME performance characteristics and provides a description of the control-variable basis of three different gains models. The procedure used to establish influence coefficients for each of these three models is also described. Gains model analysis results are compared to Rocketdyne's power balance model (PBM).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stauffer, Philip H.; Levitt, Daniel G.; Miller, Terry Ann
2017-02-09
This report consists of four major sections, including this introductory section. Section 2 provides an overview of previous investigations related to the development of the current site-scale model. The methods and data used to develop the 3-D groundwater model and the techniques used to distill that model into a form suitable for use in the GoldSim models are discussed in Section 3. Section 4 presents the results of the model development effort and discusses some of the uncertainties involved. Three attachments that provide details about the components and data used in this groundwater pathway model are also included with this report.
NASA Astrophysics Data System (ADS)
Joshi, Nitin; Ojha, C. S. P.; Sharma, P. K.
2012-10-01
In this study a conceptual model that accounts for the effects of nonequilibrium contaminant transport in fractured porous media is developed. The present model accounts for both physical and sorption nonequilibrium. An analytical solution was developed using the Laplace transform technique, which was then numerically inverted to obtain the solute concentration in the fracture-matrix system. The semianalytical solution developed here can incorporate both semi-infinite and finite fracture-matrix extent. In addition, the model can account for flexible boundary conditions and nonzero initial conditions in the fracture-matrix system. The present semianalytical solution was validated against existing analytical solutions for the fracture-matrix system. In order to differentiate between the various sorption/transport mechanisms, different cases of sorption and mass transfer were analyzed by comparing breakthrough curves and temporal moments. It was found that significant differences in the signatures of sorption and mass transfer exist. The applicability of the developed model was evaluated by simulating published experimental data on calcium and strontium transport in a single fracture. The present model simulated the experimental data reasonably well in comparison to a model based on the equilibrium sorption assumption in the fracture-matrix system and a multirate mass transfer model.
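The "numerically inverted" step can be illustrated with the Gaver-Stehfest algorithm, a standard way to bring a Laplace-domain solution back to the time domain. The transform used below is a textbook test pair, not the fracture-matrix solution itself:

```python
import math

def stehfest_invert(F, t, N=12):
    """Gaver-Stehfest numerical inversion of a Laplace transform F(s).

    Approximates f(t) ~ (ln 2 / t) * sum_k V_k * F(k * ln 2 / t).
    N must be even; accuracy improves with N until round-off dominates.
    """
    a = math.log(2.0) / t
    total = 0.0
    for k in range(1, N + 1):
        v = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            v += (j ** (N // 2) * math.factorial(2 * j) /
                  (math.factorial(N // 2 - j) * math.factorial(j) *
                   math.factorial(j - 1) * math.factorial(k - j) *
                   math.factorial(2 * j - k)))
        total += (-1) ** (k + N // 2) * v * F(k * a)
    return a * total

# Known pair as a check: L{exp(-t)} = 1/(s+1)
approx = stehfest_invert(lambda s: 1.0 / (s + 1.0), t=1.0)
```

Stehfest only needs F(s) at real s, which is why it pairs naturally with semianalytical transport solutions that are easy to evaluate in the Laplace domain but hard to invert in closed form.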
Development of Models for Regional Cardiac Surgery Centers
Park, Choon Seon; Park, Nam Hee; Sim, Sung Bo; Yun, Sang Cheol; Ahn, Hye Mi; Kim, Myunghwa; Choi, Ji Suk; Kim, Myo Jeong; Kim, Hyunsu; Chee, Hyun Keun; Oh, Sanggi; Kang, Shinkwang; Lee, Sok-Goo; Shin, Jun Ho; Kim, Keonyeop; Lee, Kun Sei
2016-01-01
Background: This study aimed to develop models for regional cardiac surgery centers, which take regional characteristics into consideration, as a policy measure that could alleviate the concentration of cardiac surgery in the metropolitan area and enhance accessibility for patients who reside in the regions. Methods: To develop the models and set standards for the necessary personnel and facilities for the initial management plan, we held workshops, debates, and conference meetings with various experts. Results: After partitioning the plan into two parts (operational autonomy and functional comprehensiveness), three models were developed: the ‘independent regional cardiac surgery center’ model, the ‘satellite cardiac surgery center within hospitals’ model, and the ‘extended cardiac surgery department within hospitals’ model. Proposals on personnel and facility management for each of the models were also presented. A regional cardiac surgery center model that could be applied to each treatment area was proposed, developed based on the anticipated demand for cardiac surgery. The independent model or the satellite model was proposed for the Chungcheong, Jeolla, North Gyeongsang, and South Gyeongsang areas, where more than 500 cardiac surgeries are performed annually. The extended model was proposed as most effective for the Gangwon and Jeju areas, where more than 200 cardiac surgeries are performed annually. Conclusion: The operation of regional cardiac surgery centers with high-caliber professionals and quality resources, such as optimal equipment and facility size, should enhance regional healthcare accessibility and the quality of cardiac surgery in South Korea. PMID:28035295
Development of a rotorcraft. Propulsion dynamics interface analysis, volume 2
NASA Technical Reports Server (NTRS)
Hull, R.
1982-01-01
A study was conducted to establish a coupled rotor/propulsion analysis that would be applicable to a wide range of rotorcraft systems. The effort included the following tasks: (1) development of a model structure suitable for simulating a wide range of rotorcraft configurations; (2) definition of a methodology for parameterizing the model structure to represent a particular rotorcraft; (3) construction of a nonlinear coupled rotor/propulsion model as a test case for analyzing coupled system dynamics; and (4) an attempt to develop a mostly linear coupled model derived from the complete nonlinear simulations. Documentation of the computer models developed is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ernest A. Mancini; William C. Parcell; Bruce S. Hart
The principal research effort for Year 2 of the project is on stratigraphic model assessment and development. The research focus for the first six (6) months of Year 2 is on T-R cycle model development. The emphasis for the remainder of the year is on assessing the depositional model and developing and testing a sequence stratigraphy model. The development and testing of the sequence stratigraphy model has been accomplished through integrated outcrop, well log and seismic studies of Mesozoic strata in the Gulf of Mexico, North Atlantic and Rocky Mountain areas.
NASA Astrophysics Data System (ADS)
Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan
A relevant initiative from the software engineering community called Model Driven Engineering (MDE) is being developed in parallel with the Semantic Web (Mellor et al. 2003a). The MDE approach to software development suggests that one should first develop a model of the system under study, which is then transformed into the real thing (i.e., an executable software entity). The most important research initiative in this area is the Model Driven Architecture (MDA), which is being developed under the umbrella of the Object Management Group (OMG). This chapter describes the basic concepts of this software engineering effort.
Computational Aeroelastic Analyses of a Low-Boom Supersonic Configuration
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Sanetrik, Mark D.; Chwalowski, Pawel; Connolly, Joseph
2015-01-01
An overview of NASA's Commercial Supersonic Technology (CST) Aeroservoelasticity (ASE) element is provided with a focus on recent computational aeroelastic analyses of a low-boom supersonic configuration developed by Lockheed-Martin and referred to as the N+2 configuration. The overview includes details of the computational models developed to date including a linear finite element model (FEM), linear unsteady aerodynamic models, unstructured CFD grids, and CFD-based aeroelastic analyses. In addition, a summary of the work involving the development of aeroelastic reduced-order models (ROMs) and the development of an aero-propulso-servo-elastic (APSE) model is provided.
NASA Astrophysics Data System (ADS)
Barthel, Roland
2013-04-01
Participatory modeling and participatory scenario development have become an essential part of environmental impact assessment and planning in the field of water resources management. But even if most people agree that participation is required to solve environmental problems in a way that satisfies both environmental and societal needs, success stories are relatively rare, and many attempts to include stakeholders in the development of models are still reported to have failed. This paper proposes the hypothesis that the lack of success in participatory modeling can partly be attributed to a lack of attractiveness of participatory approaches for researchers from the natural sciences (subsequently called 'modelers'). It has to be pointed out that this discussion is mainly concerned with natural scientists in academia and not with modelers who develop models for commercial purposes or modelers employed by public agencies. The involvement of modelers and stakeholders in participatory modeling has been intensively studied in recent years. However, such analysis is rarely made from the viewpoint of the modelers themselves. Modelers usually do not see participatory modeling and scenario development as scientific targets in their own right, because the theoretical foundations of such processes usually lie far outside their own area of expertise. Thus, participatory processes are seen mainly as a means to attract funding or to facilitate access to data, or (relatively rarely) as a way to develop a research model into a commercial product. The majority of modelers very likely do not spend much time reflecting on whether their new tools are helpful for solving real-world problems or whether the results are understandable and acceptable to stakeholders. They consider their task completed when the model they developed satisfies the 'scientific requirements', which are essentially different from the requirements for satisfying a group of stakeholders.
Funding often stops before a newly developed model can actually be tested in a stakeholder process. Therefore the gap between stakeholders and modelers persists or is even growing. A main reason for this probably lies in the way that the work of scientists (modelers) is evaluated. What counts is the number of journal articles produced, while applicability or societal impact is still not a measure of scientific success. A good journal article on a model requires an exemplary validation, but only very rarely would a reviewer ask whether a model was accepted by stakeholders. So why should a scientist go through a tedious stakeholder process? The stakeholder process might be a requirement of the research grant, but whether this is taken seriously can be questioned as long as stakeholder dialogues do not lead to quantifiable scientific success. In particular, for researchers in early career stages who undergo typical, publication-based evaluation processes, participatory research is hardly beneficial. The discussion in this contribution is based on three pillars: (i) a comprehensive evaluation of the literature published on participatory modeling and scenario development, (ii) a case study involving the development of an integrated model for water and land use management, including an intensive stakeholder process, and (iii) unstructured personal communication, mainly with young scientists, about the attractiveness of multidisciplinary, applied research.
The use of direct numerical simulation data in turbulence modeling
NASA Technical Reports Server (NTRS)
Mansour, N. N.
1991-01-01
Direct numerical simulations (DNS) of turbulent flows provide a complete data base to develop and to test turbulence models. In this article, the progress made in developing models for the dissipation rate equation is reviewed. New scaling arguments for the various terms in the dissipation rate equation were tested using data from DNS of homogeneous shear flows. Modifications to the epsilon-equation model that take into account near-wall effects were developed using DNS of turbulent channel flows. Testing of new models for flows under mean compression was carried out using data from DNS of isotropically compressed turbulence. In all of these studies the data from the simulations was essential in guiding the model development. The next generation of DNS will be at higher Reynolds numbers, and will undoubtedly lead to improved models for computations of flows of practical interest.
ERIC Educational Resources Information Center
Luecht, Richard M.
2003-01-01
This article contends that the necessary links between constructs and test scores/decisions in language assessment must be established through principled design procedures that align three models: (1) a theoretical construct model; (2) a test development model; and (3) a psychometric scoring model. The theoretical construct model articulates the…
System Simulation Modeling: A Case Study Illustration of the Model Development Life Cycle
Janice K. Wiedenbeck; D. Earl Kline
1994-01-01
Systems simulation modeling techniques offer a method of representing the individual elements of a manufacturing system and their interactions. By developing and experimenting with simulation models, one can obtain a better understanding of the overall physical system. Forest products industries are beginning to understand the importance of simulation modeling to help...
A Model of International Communication Media Appraisal: Phase IV, Generalizing the Model to Film.
ERIC Educational Resources Information Center
Johnson, J. David
A study tested a causal model of international communication media appraisal using audience evaluations of tests of two films conducted in the Philippines. It was the fourth in a series of tests of the model in both developed and developing countries. In general the model posited determinative relationships between three exogenous variables…
Progress in the development of the GMM-2 gravity field model for Mars
NASA Technical Reports Server (NTRS)
Lemoine, F. G.; Smith, D. E.; Lerch, F. J.; Zuber, M. T.; Patel, G. B.
1994-01-01
Last year we published the GMM-1 (Goddard Mars Model-1) gravity model for Mars. We have completely re-analyzed the Viking and Mariner 9 tracking data in the development of the new field, designated GMM-2. The model is complete to degree and order 70. Various aspects of the model are discussed.
Developing a Learning Progression for Number Sense Based on the Rule Space Model in China
ERIC Educational Resources Information Center
Chen, Fu; Yan, Yue; Xin, Tao
2017-01-01
The current study focuses on developing the learning progression of number sense for primary school students, and it applies a cognitive diagnostic model, the rule space model, to data analysis. The rule space model analysis firstly extracted nine cognitive attributes and their hierarchy model from the analysis of previous research and the…
A Conceptual Model for Analysing Management Development in the UK Hospitality Industry
ERIC Educational Resources Information Center
Watson, Sandra
2007-01-01
This paper presents a conceptual, contingent model of management development. It explains the nature of the UK hospitality industry and its potential influence on MD practices, prior to exploring dimensions and relationships in the model. The embryonic model is presented as a model that can enhance our understanding of the complexities of the…
ERIC Educational Resources Information Center
Gong, Yu
2017-01-01
This study investigates how students can use "interactive example models" in inquiry activities to develop their conceptual knowledge about an engineering phenomenon like electromagnetic fields and waves. An interactive model, for example a computational model, could be used to develop and teach principles of dynamic complex systems, and…
ERIC Educational Resources Information Center
Moallem, Mahnaz; Earle, Rodney S.
1998-01-01
In an effort to connect current research findings on teacher thinking with components of instructional design models and principles, this article discusses a new contextual model for thinking about teaching and considers the implications of the model for instructional development of research in instructional design and teacher thinking. (Author)
ERIC Educational Resources Information Center
Chiu, Mei-Shiu
2012-01-01
The skill-development model contends that achievements have an effect on academic self-confidences, while the self-enhancement model contends that self-confidences have an effect on achievements. Differential psychological processes underlying the 2 models across the domains of mathematics and science were posited and examined with structural…
Comparison of different stomatal conductance algorithms for ozone flux modelling [Proceedings
P. Buker; L. D. Emberson; M. R. Ashmore; G. Gerosa; C. Jacobs; W. J. Massman; J. Muller; N. Nikolov; K. Novak; E. Oksanen; D. De La Torre; J. -P. Tuovinen
2006-01-01
The ozone deposition model (DO3SE) that has been developed and applied within the EMEP photooxidant model (Emberson et al., 2000; Simpson et al., 2003) currently estimates stomatal ozone flux using a stomatal conductance (gs) model based on the multiplicative algorithm initially developed by Jarvis (1976). This model links gs to environmental and phenological parameters...
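The multiplicative Jarvis-type algorithm mentioned above can be sketched as a maximum conductance scaled down by independent environmental response functions. The functional forms below follow the general shape of such schemes, but every parameter value (g_max, temperature limits, VPD thresholds, the f_min floor) is an illustrative placeholder, not a calibrated DO3SE value:

```python
from math import exp

def f_light(ppfd, alpha=0.006):
    # light response: saturating with photosynthetic photon flux density
    return 1.0 - exp(-alpha * ppfd)

def f_temp(T, T_min=0.0, T_opt=21.0, T_max=35.0):
    # bell-shaped temperature response, zero outside [T_min, T_max]
    if T <= T_min or T >= T_max:
        return 0.0
    bt = (T_max - T_opt) / (T_opt - T_min)
    return ((T - T_min) / (T_opt - T_min)) * ((T_max - T) / (T_max - T_opt)) ** bt

def f_vpd(vpd, vpd_max=1.0, vpd_min=3.2, floor=0.1):
    # linear decline with vapor pressure deficit (kPa): 1 below vpd_max,
    # dropping to a hypothetical floor at vpd_min
    if vpd <= vpd_max:
        return 1.0
    if vpd >= vpd_min:
        return floor
    return 1.0 - (1.0 - floor) * (vpd - vpd_max) / (vpd_min - vpd_max)

def g_sto(ppfd, T, vpd, g_max=450.0, f_min=0.1):
    # multiplicative model: g_max scaled by the limiting functions,
    # with f_min as a minimum relative conductance
    return g_max * f_light(ppfd) * max(f_min, f_temp(T) * f_vpd(vpd))

# near-optimal conditions: bright light, optimal temperature, low VPD
g = g_sto(ppfd=1000.0, T=21.0, vpd=0.8)
```

Under these near-optimal inputs the limiting functions approach 1 and g approaches g_max, which is the intended behavior of the multiplicative formulation.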
Web Instruction as Cultural Transformation: A Reeducation Model for Faculty Development.
ERIC Educational Resources Information Center
Fuller, Frank
This paper offers a model of faculty staff development for distance education that does not require, or permit, continuous change in instructional design. The model is based on the paradigm shift ideas of Thomas Kuhn and the reeducation model of Kurt Lewin. In the model offered reeducation implies not simply education or training, but involves…
Stephen N. Matthews; Louis R. Iverson; Anantha M. Prasad; Matthew P. Peters; Paul G. Rodewald
2011-01-01
Species distribution models (SDMs) to evaluate trees' potential responses to climate change are essential for developing appropriate forest management strategies. However, there is a great need to better understand these models' limitations and evaluate their uncertainties. We have previously developed statistical models of suitable habitat, based on both...
ERIC Educational Resources Information Center
Martin, Ian; Carey, John
2014-01-01
A logic model was developed based on an analysis of the 2012 American School Counselor Association (ASCA) National Model in order to provide direction for program evaluation initiatives. The logic model identified three outcomes (increased student achievement/gap reduction, increased school counseling program resources, and systemic change and…
A reexamination of age-related variation in body weight and morphometry of Maryland nutria
Sherfy, M.H.; Mollett, T.A.; McGowan, K.R.; Daugherty, S.L.
2006-01-01
Age-related variation in morphometry has been documented for many species. Knowledge of growth patterns can be useful for modeling energetics, detecting physiological influences on populations, and predicting age. These benefits have shown value in understanding population dynamics of invasive species, particularly in developing efficient control and eradication programs. However, development and evaluation of descriptive and predictive models is a critical initial step in this process. Accordingly, we used data from necropsies of 1,544 nutria (Myocastor coypus) collected in Maryland, USA, to evaluate the accuracy of previously published models for prediction of nutria age from body weight. Published models underestimated body weights of our animals, especially for ages <3. We used cross-validation procedures to develop and evaluate models for describing nutria growth patterns and for predicting nutria age. We derived models from a randomly selected model-building data set (n = 192-193 M, 217-222 F) and evaluated them with the remaining animals (n = 487-488 M, 642-647 F). We used nonlinear regression to develop Gompertz growth-curve models relating morphometric variables to age. Predicted values of morphometric variables fell within the 95% confidence limits of their true values for most age classes. We also developed predictive models for estimating nutria age from morphometry, using linear regression of log-transformed age on morphometric variables. The evaluation data set corresponded with 95% prediction intervals from the new models. Predictive models for body weight and length provided greater accuracy and less bias than models for foot length and axillary girth. Our growth models accurately described age-related variation in nutria morphometry, and our predictive models provided accurate estimates of ages from morphometry that will be useful for live-captured individuals. 
Our models offer better accuracy and precision than previously published models, providing a capacity for modeling energetics and growth patterns of Maryland nutria as well as an empirical basis for determining population age structure from live-captured animals.
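The Gompertz growth-curve approach described above can be sketched as follows. The parameter values are hypothetical illustrations, not the fitted Maryland nutria values; the inverse function shows how an age estimate can be recovered from a morphometric measurement once a growth curve is in hand:

```python
from math import exp, log

def gompertz(t, A=9.0, b=2.5, k=0.8):
    # Gompertz growth curve: weight (kg) at age t (years),
    # with asymptote A and hypothetical shape parameters b, k
    return A * exp(-b * exp(-k * t))

def age_from_weight(w, A=9.0, b=2.5, k=0.8):
    # algebraic inverse of the curve above, valid for 0 < w < A:
    # t = -ln(-ln(w/A) / b) / k
    return -log(-log(w / A) / b) / k

w2 = gompertz(2.0)        # predicted weight at age 2
t_back = age_from_weight(w2)  # recover the age from that weight
```

In practice the curve parameters would be estimated by nonlinear regression on the model-building data set, and prediction intervals checked against the held-out evaluation animals, as the authors did via cross-validation.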
ERIC Educational Resources Information Center
Freeman, Thomas J.
This paper discusses six different models of organizational structure and leadership, including the scalar chain or pyramid model, the continuum model, the grid model, the linking pin model, the contingency model, and the circle or democratic model. Each model is examined in a separate section that describes the model and its development, lists…
Curriculum Development for Business and Industry.
ERIC Educational Resources Information Center
Stolovitch, Harold D.; Keeps, Erica J.
1988-01-01
Defines the concept of curriculum for industrial personnel development needs, explains the concept of professionalism, and presents a model for developing curricula for business and industry called the Professional Development Curriculum (PDC) model. Training needs are discussed and two applications of the model in General Motors are described.…
Development and application of air quality models at the U.S. EPA
Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computation Exposure Division (CED) of the National Exposure Resear...
Adult Development. Trends and Issues Alert No. 22.
ERIC Educational Resources Information Center
Imel, Susan
Theories about adult development have been grouped into four models: biological, psychological, sociocultural, and integrative. Biological models (those that are concerned with how physical changes affect development) and psychological models (those that view development as either sequential--defined by life events-- or as a series of transitions…
Employer-Sponsored Career Development Programs. Information Series No. 231.
ERIC Educational Resources Information Center
Lancaster, Anita Sklare; Berne, Richard R.
This monograph presents an overview of employer-sponsored career development programs. It is divided into four sections. The "Adult Development" and "Adult Career Development" sections review pertinent theories and research (basic concepts, task model, transition model, theme model, adult career stages, career anchors approach, career development…
Regional analyses of labor markets and demography: a model based Norwegian example.
Stambol, L S; Stolen, N M; Avitsland, T
1998-01-01
The authors discuss the regional REGARD model, developed by Statistics Norway to analyze the regional implications of macroeconomic development of employment, labor force, and unemployment. "In building the model, empirical analyses of regional producer behavior in manufacturing industries have been performed, and the relation between labor market development and regional migration has been investigated. Apart from providing a short description of the REGARD model, this article demonstrates the functioning of the model, and presents some results of an application." excerpt
A Methodology for Cybercraft Requirement Definition and Initial System Design
2008-06-01
the software development concepts of the SDLC: requirements, use cases and domain modeling. It ... collectively as Software Development Life Cycle (SDLC) models. While there are numerous models that fit under the SDLC definition, all are based on ... developed that provided expanded understanding of the domain, it is necessary to either update an existing domain model or create another domain
Development of the Play Experience Model to Enhance Desirable Qualifications of Early Childhood
ERIC Educational Resources Information Center
Panpum, Watchara; Soonthornrojana, Wimonrat; Nakunsong, Thatsanee
2015-01-01
The objectives of this research were to develop the play experience model and to study the effects of using the play experience model to enhance early childhood students' desirable qualifications. There were 3 phases of research: 1) the document and context in experience management were studied, 2) the play experience model was developed, and 3) the…
The Marshall Engineering Thermosphere (MET) Model. Volume 1; Technical Description
NASA Technical Reports Server (NTRS)
Smith, R. E.
1998-01-01
Volume 1 presents a technical description of the Marshall Engineering Thermosphere (MET) model atmosphere and a summary of its historical development. Various programs developed to augment the original capability of the model are discussed in detail. The report also describes each of the individual subroutines developed to enhance the model. Computer codes for these subroutines are contained in four appendices.
ERIC Educational Resources Information Center
Wichadee, Saovapa
2011-01-01
The purposes of this study were to develop the instructional model for enhancing self-directed learning skills of Bangkok University students, study the impacts of the model on their English reading comprehension and self-directed learning ability, and explore their opinions toward self-directed learning. The model development process…
Modeling, analysis, and simulation of the co-development of road networks and vehicle ownership
NASA Astrophysics Data System (ADS)
Xu, Mingtao; Ye, Zhirui; Shan, Xiaofeng
2016-01-01
A two-dimensional logistic model is proposed to describe the co-development of road networks and vehicle ownership. The endogenous interaction between road networks and vehicle ownership, and how natural market forces and policies are transformed into their co-development, are considered jointly in this model. If the involved parameters satisfy a certain condition, the proposed model arrives at a steady equilibrium level and the final development scale remains within the maximum capacity of an urban traffic system; otherwise, the co-development process is unstable and may even exhibit chaotic behavior. Sensitivity tests are then conducted to determine proper values for a series of parameters in this model. Finally, a case study using Beijing as an example is conducted to explore the applicability of the proposed model to real-world conditions. Results demonstrate that the proposed model can effectively simulate the co-development of the road network and vehicle ownership for Beijing. Furthermore, the results indicate that the two development processes will reach stable equilibrium levels in the years 2040 and 2045, respectively, with equilibrium values within the maximum capacity of the system.
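A minimal sketch of such a coupled two-dimensional logistic system is given below, using simple Euler integration. The equations are one plausible mutualistic form, and all parameter values are invented for illustration (not the calibrated Beijing values); the point is only to show the stable case, where both quantities settle at their capacities:

```python
def simulate(r1=0.08, r2=0.10, k1=25000.0, k2=6.0e6,
             a12=0.2, a21=0.3, R0=1000.0, V0=1.0e5,
             dt=0.1, years=200.0):
    # Euler integration of a mutualistic two-dimensional logistic system:
    #   dR/dt = r1 R (1 - R/K1) (1 + a12 V/K2)   road network (km)
    #   dV/dt = r2 V (1 - V/K2) (1 + a21 R/K1)   vehicle ownership
    # Each variable's own logistic term caps it at its capacity, while
    # the cross term lets growth in one accelerate growth in the other.
    R, V = R0, V0
    for _ in range(int(years / dt)):
        dR = r1 * R * (1 - R / k1) * (1 + a12 * V / k2)
        dV = r2 * V * (1 - V / k2) * (1 + a21 * R / k1)
        R += dt * dR
        V += dt * dV
    return R, V

R_final, V_final = simulate()
```

With these (stable) parameters the cross terms only speed up the approach to equilibrium; the unstable or chaotic regimes the paper mentions would arise under different parameter conditions.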
NASA Astrophysics Data System (ADS)
Dalton, Rebecca Marie
The development of students' mental models of chemical substances and processes at the molecular level was studied in a three-phase project. Animations produced in the VisChem project were used as an integral part of the chemistry instruction to help students develop their mental models. Phase one of the project involved examining the effectiveness of using animations to help first-year university chemistry students develop useful mental models of chemical phenomena. Phase two explored factors affecting the development of students' mental models, analysing results in terms of a proposed model of the perceptual processes involved in interpreting an animation. Phase three involved four case studies that served to confirm and elaborate on the effects of prior knowledge and disembedding ability on students' mental model development, and to support the influence of study style on learning outcomes. Recommendations for use of the VisChem animations, based on the above findings, include: considering the prior knowledge of students; focusing attention on relevant features; encouraging a deep approach to learning; using animation to teach visual concepts; presenting ideas visually, verbally and conceptually; establishing 'animation literacy'; minimising cognitive load; using animation as feedback; using student drawings; repeating animations; and discussing 'scientific modelling'.
NASA Software Cost Estimation Model: An Analogy Based Estimation Model
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James
2015-01-01
The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DOD and NASA, especially as software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of its software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to COCOMO II, linear regression, and K-nearest neighbor prediction model performance on the same data set.
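Analogy-based estimation of the general kind described above can be sketched as a k-nearest-neighbor lookup over historical projects: the new project's effort is predicted from the most similar completed projects. This is a generic illustration, not the NASA model itself, and the feature names and numbers are invented:

```python
def knn_estimate(target, projects, k=2):
    # projects: list of (feature_vector, actual_effort) pairs,
    # with features pre-scaled to comparable ranges.
    # Predict effort as the mean effort of the k nearest analogues
    # under Euclidean distance.
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    ranked = sorted(projects, key=lambda p: dist(p[0], target))
    return sum(effort for _, effort in ranked[:k]) / k

# hypothetical history: (size, complexity) -> effort in person-months
history = [
    ((0.20, 0.10), 120.0),
    ((0.25, 0.15), 150.0),
    ((0.80, 0.90), 900.0),
    ((0.70, 0.85), 800.0),
]
estimate = knn_estimate((0.22, 0.12), history, k=2)
```

A clustering-based variant would first group the historical projects and estimate from the target's cluster; both approaches trade the global functional form of parametric models like COCOMO II for local similarity.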
Gradient Models in Molecular Biophysics: Progress, Challenges, Opportunities
Bardhan, Jaydeep P.
2014-01-01
In the interest of developing a bridge between researchers modeling materials and those modeling biological molecules, we survey recent progress in developing nonlocal-dielectric continuum models for studying the behavior of proteins and nucleic acids. As in other areas of science, continuum models are essential tools when atomistic simulations (e.g. molecular dynamics) are too expensive. Because biological molecules are essentially all nanoscale systems, the standard continuum model, involving local dielectric response, has basically always been dubious at best. The advanced continuum theories discussed here aim to remedy these shortcomings by adding features such as nonlocal dielectric response, and nonlinearities resulting from dielectric saturation. We begin by describing the central role of electrostatic interactions in biology at the molecular scale, and motivate the development of computationally tractable continuum models using applications in science and engineering. For context, we highlight some of the most important challenges that remain and survey the diverse theoretical formalisms for their treatment, highlighting the rigorous statistical mechanics that support the use and improvement of continuum models. We then address the development and implementation of nonlocal dielectric models, an approach pioneered by Dogonadze, Kornyshev, and their collaborators almost forty years ago. The simplest of these models is just a scalar form of gradient elasticity, and here we use ideas from gradient-based modeling to extend the electrostatic model to include additional length scales. The paper concludes with a discussion of open questions for model development, highlighting the many opportunities for the materials community to leverage its physical, mathematical, and computational expertise to help solve one of the most challenging questions in molecular biology and biophysics. PMID:25505358
Gradient Models in Molecular Biophysics: Progress, Challenges, Opportunities.
Bardhan, Jaydeep P
2013-12-01
In the interest of developing a bridge between researchers modeling materials and those modeling biological molecules, we survey recent progress in developing nonlocal-dielectric continuum models for studying the behavior of proteins and nucleic acids. As in other areas of science, continuum models are essential tools when atomistic simulations (e.g. molecular dynamics) are too expensive. Because biological molecules are essentially all nanoscale systems, the standard continuum model, involving local dielectric response, has basically always been dubious at best. The advanced continuum theories discussed here aim to remedy these shortcomings by adding features such as nonlocal dielectric response, and nonlinearities resulting from dielectric saturation. We begin by describing the central role of electrostatic interactions in biology at the molecular scale, and motivate the development of computationally tractable continuum models using applications in science and engineering. For context, we highlight some of the most important challenges that remain and survey the diverse theoretical formalisms for their treatment, highlighting the rigorous statistical mechanics that support the use and improvement of continuum models. We then address the development and implementation of nonlocal dielectric models, an approach pioneered by Dogonadze, Kornyshev, and their collaborators almost forty years ago. The simplest of these models is just a scalar form of gradient elasticity, and here we use ideas from gradient-based modeling to extend the electrostatic model to include additional length scales. The paper concludes with a discussion of open questions for model development, highlighting the many opportunities for the materials community to leverage its physical, mathematical, and computational expertise to help solve one of the most challenging questions in molecular biology and biophysics.
Raabe, Margaret E.; Chaudhari, Ajit M.W.
2016-01-01
The ability of a biomechanical simulation to produce results that can translate to real-life situations is largely dependent on the physiological accuracy of the musculoskeletal model. There are a limited number of freely-available, full-body models that exist in OpenSim, and those that do exist are very limited in terms of trunk musculature and degrees of freedom in the spine. Properly modeling the motion and musculature of the trunk is necessary to most accurately estimate lower extremity and spinal loading. The objective of this study was to develop and validate a more physiologically accurate OpenSim full-body model. By building upon three previously developed OpenSim models, the Full-Body Lumbar Spine (FBLS) model, comprised of 21 segments, 30 degrees-of-freedom, and 324 musculotendon actuators, was developed. The five lumbar vertebrae were modeled as individual bodies, and coupled constraints were implemented to describe the net motion of the spine. The eight major muscle groups of the lumbar spine were modeled (rectus abdominis, external and internal obliques, erector spinae, multifidus, quadratus lumborum, psoas major, and latissimus dorsi), and many of these muscle groups were modeled as multiple fascicles allowing the large muscles to act in multiple directions. The resulting FBLS model's trunk muscle geometry, maximal isometric joint moments, and simulated muscle activations compare well to experimental data. The FBLS model will be made freely available (https://simtk.org/home/fullbodylumbar) for others to perform additional analyses and develop simulations investigating full-body dynamics and contributions of the trunk muscles to dynamic tasks. PMID:26947033
Development and validation of a 10-year-old child ligamentous cervical spine finite element model.
Dong, Liqiang; Li, Guangyao; Mao, Haojie; Marek, Stanley; Yang, King H
2013-12-01
Although a number of finite element (FE) adult cervical spine models have been developed to understand the injury mechanisms of the neck in automotive-related crash scenarios, there have been fewer efforts to develop a child neck model. In this study, a 10-year-old ligamentous cervical spine FE model was developed for application in the improvement of pediatric safety related to motor vehicle crashes. The model geometry was obtained from medical scans and meshed using a multi-block approach. Appropriate properties, based on a review of the literature in conjunction with scaling, were assigned to different parts of the model. Child tensile force-deformation data in three segments, Occipital-C2 (C0-C2), C4-C5 and C6-C7, were used to validate the cervical spine model and predict failure forces and displacements. Design of computer experiments was performed to determine failure properties for intervertebral discs and ligaments needed to set up the FE model. The model-predicted ultimate displacements and forces were within the experimental range. The cervical spine FE model was validated in flexion and extension against the child experimental data in three segments, C0-C2, C4-C5 and C6-C7. Other model predictions were found to be consistent with the experimental responses scaled from adult data. The whole cervical spine model was also validated in tension, flexion and extension against the child experimental data. This study provided methods for developing a child ligamentous cervical spine FE model and for predicting soft tissue failures in tension.
Gradient models in molecular biophysics: progress, challenges, opportunities
NASA Astrophysics Data System (ADS)
Bardhan, Jaydeep P.
2013-12-01
In the interest of developing a bridge between researchers modeling materials and those modeling biological molecules, we survey recent progress in developing nonlocal-dielectric continuum models for studying the behavior of proteins and nucleic acids. As in other areas of science, continuum models are essential tools when atomistic simulations (e.g., molecular dynamics) are too expensive. Because biological molecules are essentially all nanoscale systems, the standard continuum model, involving local dielectric response, has basically always been dubious at best. The advanced continuum theories discussed here aim to remedy these shortcomings by adding nonlocal dielectric response. We begin by describing the central role of electrostatic interactions in biology at the molecular scale, and motivate the development of computationally tractable continuum models using applications in science and engineering. For context, we highlight some of the most important challenges that remain, and survey the diverse theoretical formalisms for their treatment, highlighting the rigorous statistical mechanics that support the use and improvement of continuum models. We then address the development and implementation of nonlocal dielectric models, an approach pioneered by Dogonadze, Kornyshev, and their collaborators almost 40 years ago. The simplest of these models is just a scalar form of gradient elasticity, and here we use ideas from gradient-based modeling to extend the electrostatic model to include additional length scales. The review concludes with a discussion of open questions for model development, highlighting the many opportunities for the materials community to leverage its physical, mathematical, and computational expertise to help solve one of the most challenging questions in molecular biology and biophysics.
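The "scalar form of gradient elasticity" mentioned above can be made concrete. A standard Lorentzian form of the nonlocal permittivity, used throughout the Dogonadze-Kornyshev line of models (the notation here is generic, not taken from the paper), is

```latex
% Lorentzian nonlocal dielectric response in Fourier space:
% eps_s  = bulk (static) permittivity, recovered at long wavelengths (k -> 0)
% eps_oo = reduced short-wavelength permittivity (k -> infinity)
% lambda = correlation length of the solvent's orientational structure
\varepsilon(k) \;=\; \varepsilon_\infty \;+\; \frac{\varepsilon_s - \varepsilon_\infty}{1 + \lambda^{2} k^{2}}
```

Because the denominator is quadratic in k, transforming back to real space turns the single nonlocal integral equation into a pair of coupled local PDEs, one of which is a screened (Yukawa-type) equation for an auxiliary potential. That second-gradient structure is exactly why the simplest nonlocal dielectric model reads as a scalar analogue of gradient elasticity, and why the gradient-modeling toolkit from materials science transfers directly.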
Precipitation-runoff modeling system; user's manual
Leavesley, G.H.; Lichty, R.W.; Troutman, B.M.; Saindon, L.G.
1983-01-01
The concepts, structure, theoretical development, and data requirements of the precipitation-runoff modeling system (PRMS) are described. The precipitation-runoff modeling system is a modular-design, deterministic, distributed-parameter modeling system developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow, sediment yields, and general basin hydrology. Basin response to normal and extreme rainfall and snowmelt can be simulated to evaluate changes in water balance relationships, flow regimes, flood peaks and volumes, soil-water relationships, sediment yields, and groundwater recharge. Parameter-optimization and sensitivity analysis capabilities are provided to fit selected model parameters and evaluate their individual and joint effects on model output. The modular design provides a flexible framework for continued model system enhancement and hydrologic modeling research and development. (Author's abstract)
Three-Dimension Visualization for Primary Wheat Diseases Based on Simulation Model
NASA Astrophysics Data System (ADS)
Shijuan, Li; Yeping, Zhu
Crop simulation models have become central to agricultural production management and resource optimization management. Visualizing the crop growth process lets users observe crop growth and development intuitively. Based on an understanding of the occurrence conditions, epidemic seasons, and key impact factors of the main wheat diseases (stripe rust, leaf rust, stem rust, head blight, and powdery mildew), drawn from research material and the literature, we designed a 3D visualization model for wheat growth and disease occurrence. The model system will help farmers, technicians, and decision-makers use crop growth simulation models more effectively and will provide decision-making support. A 3D visualization model for wheat growth based on the simulation model has now been developed, and the visualization model for the primary wheat diseases is under development.
Zhang, Xin; Liu, Pan; Chen, Yuguang; Bai, Lu; Wang, Wei
2014-01-01
The primary objective of this study was to identify whether the frequency of traffic conflicts at signalized intersections can be modeled. The opposing left-turn conflicts were selected for the development of conflict predictive models. Using data collected at 30 approaches at 20 signalized intersections, the underlying distributions of the conflicts under different traffic conditions were examined. Different conflict-predictive models were developed to relate the frequency of opposing left-turn conflicts to various explanatory variables. The models considered include a linear regression model, a negative binomial model, and separate models developed for four traffic scenarios. The prediction performance of different models was compared. The frequency of traffic conflicts follows a negative binomial distribution. The linear regression model is not appropriate for the conflict frequency data. In addition, drivers behaved differently under different traffic conditions. Accordingly, the effects of conflicting traffic volumes on conflict frequency vary across different traffic conditions. The occurrences of traffic conflicts at signalized intersections can be modeled using generalized linear regression models. The use of conflict predictive models has potential to expand the uses of surrogate safety measures in safety estimation and evaluation.
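The preference for a negative binomial model over a linear (or Poisson) one rests on overdispersion in the conflict counts. A minimal sketch of that check, using method-of-moments estimation of the dispersion parameter on hypothetical counts (not the study's data; a full NB regression would also include the conflicting-volume covariates):

```python
# Hypothetical opposing left-turn conflict counts per observation period.
conflicts = [3, 0, 7, 2, 12, 5, 1, 9, 4, 15, 6, 2]

n = len(conflicts)
mean = sum(conflicts) / n
var = sum((c - mean) ** 2 for c in conflicts) / (n - 1)

# Negative binomial variance function: Var(Y) = mu + alpha * mu^2.
# alpha > 0 signals overdispersion relative to Poisson (Var = mu),
# which is what justifies the NB likelihood over least squares.
alpha = (var - mean) / mean ** 2
```

With these counts the sample variance far exceeds the mean, so the method-of-moments dispersion estimate is positive, consistent with the negative binomial finding reported above.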
Predictive QSAR modeling workflow, model applicability domains, and virtual screening.
Tropsha, Alexander; Golbraikh, Alexander
2007-01-01
Quantitative Structure Activity Relationship (QSAR) modeling has traditionally been applied as an evaluative approach, i.e., with the focus on developing retrospective and explanatory models of existing data. Model extrapolation was considered, if at all, only in a hypothetical sense, in terms of potential modifications of known biologically active chemicals that could improve compounds' activity. This critical review re-examines the strategy and the output of modern QSAR modeling approaches. We provide examples and arguments suggesting that current methodologies may afford robust and validated models capable of accurate prediction of compound properties for molecules not included in the training sets. We discuss a data-analytical modeling workflow developed in our laboratory that incorporates modules for combinatorial QSAR model development (i.e., using all possible binary combinations of available descriptor sets and statistical data modeling techniques), rigorous model validation, and virtual screening of available chemical databases to identify novel biologically active compounds. Our approach places particular emphasis on model validation as well as the need to define model applicability domains in the chemistry space. We present examples of studies where the application of rigorously validated QSAR models to virtual screening identified computational hits that were confirmed by subsequent experimental investigations. The emerging focus of QSAR modeling on target property forecasting brings it forward as a predictive, as opposed to evaluative, modeling approach.
Modelling soil water retention using support vector machines with genetic algorithm optimisation.
Lamorski, Krzysztof; Sławiński, Cezary; Moreno, Felix; Barna, Gyöngyi; Skierucha, Wojciech; Arrue, José L
2014-01-01
This work presents point pedotransfer function (PTF) models of the soil water retention curve. The developed models allow estimation of the soil water content at specified soil water potentials (-0.98, -3.10, -9.81, -31.02, -491.66, and -1554.78 kPa) from the following soil characteristics: granulometric composition, total porosity, and bulk density. Support Vector Machines (SVM) methodology was used for model development, and a new methodology for elaborating retention function models is proposed. In contrast to previous attempts known from the literature, the ν-SVM method was used for model development, and the results were compared with the formerly used C-SVM method. Genetic algorithms were used as the optimisation framework for searching model parameters. A new form of the aim function for the parameter search is proposed, which allowed the development of models with better prediction capabilities; it avoids the model overestimation typically encountered when root mean squared error is used as the aim function. The elaborated models showed good agreement with measured soil water retention data, with coefficients of determination in the range 0.67-0.92. The study demonstrated the usability of the ν-SVM methodology together with genetic algorithm optimisation for retention modelling, which gave better-performing models than the other tested approaches.
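The genetic-algorithm parameter search described above can be sketched generically. The toy below fits a straight line to invented retention-like data (water content vs log-scaled potential) via truncation selection, averaging crossover, and Gaussian mutation; it stands in for the SVM hyperparameter search, and the aim function here is plain RMSE rather than the paper's new aim function:

```python
import random

random.seed(0)

# Invented data: water content vs log10(|potential|), for illustration only.
xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.7, 3.2]
ys = [0.42, 0.39, 0.35, 0.30, 0.24, 0.15, 0.10]

def rmse(params):
    a, b = params
    return (sum((a * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)) ** 0.5

def evolve(pop_size=40, generations=60):
    pop = [(random.uniform(-1, 1), random.uniform(0, 1)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=rmse)
        parents = pop[: pop_size // 2]        # truncation selection (elitist)
        children = []
        for _ in range(pop_size - len(parents)):
            (a1, b1), (a2, b2) = random.sample(parents, 2)
            a = 0.5 * (a1 + a2) + random.gauss(0, 0.02)  # crossover + mutation
            b = 0.5 * (b1 + b2) + random.gauss(0, 0.02)
            children.append((a, b))
        pop = parents + children
    return min(pop, key=rmse)

best = evolve()
```

Because the parents are carried over each generation, the best candidate never degrades, and the population converges on the least-squares line (negative slope, as retention data demand).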
NASA Astrophysics Data System (ADS)
Jonny; Zagloel, Teuku Yuri M.
2017-11-01
This paper presents an integrated health care model for the Indonesian health care industry. Based on previous research, two health care models exist in the industry: the disease-centered and the patient-centered care model. Of the two, the patient-centered care model is more widely applied because of its capability to reduce cost and improve quality simultaneously. However, there is still no comprehensive model that yields cost reduction, quality improvement, patient satisfaction, and hospital profitability simultaneously; this research is intended to develop such a model. First, a conceptual model using Kano's Model, Quality Function Deployment (QFD), and the Balanced Scorecard (BSC) is developed to generate the elements of the model required by stakeholders. Then, a case study of an Indonesian hospital is presented to evaluate the validity of the model using correlation analysis. The model is validated, implying several managerial insights among its elements: 1) leadership (r=0.85) and context of the organization (r=0.77) improve operations; 2) planning (r=0.96), support processes (r=0.87), and continual improvement (r=0.95) also improve operations; 3) operations improve customer satisfaction (r=0.89) and financial performance (r=0.93); and 4) customer satisfaction improves financial performance (r=0.98).
Hierarchical animal movement models for population-level inference
Hooten, Mevin B.; Buderman, Frances E.; Brost, Brian M.; Hanks, Ephraim M.; Ivans, Jacob S.
2016-01-01
New methods for modeling animal movement based on telemetry data are developed regularly. With advances in telemetry capabilities, animal movement models are becoming increasingly sophisticated. Despite a need for population-level inference, animal movement models are still predominantly developed for individual-level inference. Most efforts to upscale the inference to the population level are either post hoc or complicated enough that only the developer can implement the model. Hierarchical Bayesian models provide an ideal platform for the development of population-level animal movement models but can be challenging to fit due to computational limitations or extensive tuning required. We propose a two-stage procedure for fitting hierarchical animal movement models to telemetry data. The two-stage approach is statistically rigorous and allows one to fit individual-level movement models separately, then resample them using a secondary MCMC algorithm. The primary advantages of the two-stage approach are that the first stage is easily parallelizable and the second stage is completely unsupervised, allowing for an automated fitting procedure in many cases. We demonstrate the two-stage procedure with two applications of animal movement models. The first application involves a spatial point process approach to modeling telemetry data, and the second involves a more complicated continuous-time discrete-space animal movement model. We fit these models to simulated data and real telemetry data arising from a population of monitored Canada lynx in Colorado, USA.
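The two-stage idea can be illustrated with a deliberately simplified sketch: stage 1 draws per-individual posterior samples independently (the parallelizable step), and stage 2 reweights and resamples those draws under a population distribution, tying individuals together without refitting stage 1. All numbers and distributions below are invented for illustration; the paper's second stage is a full MCMC algorithm, not plain importance resampling:

```python
import math
import random

random.seed(2)

def normal_pdf(x, mu, sd):
    # Density of N(mu, sd^2), used as the population-level weight.
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Stage 1 (run separately per animal): fake "posterior draws" of a movement
# parameter for three individuals, centered on individual-level truths.
stage1 = {i: [random.gauss(truth, 0.3) for _ in range(500)]
          for i, truth in enumerate([1.0, 1.4, 0.8])}

# Stage 2: weight each stage-1 draw under a population model and resample.
pop_mu, pop_sd = 1.1, 0.4
stage2 = {}
for i, draws in stage1.items():
    weights = [normal_pdf(d, pop_mu, pop_sd) for d in draws]
    stage2[i] = random.choices(draws, weights=weights, k=500)

raw = sum(stage1[1]) / 500      # individual 1's mean before pooling
shrunk = sum(stage2[1]) / 500   # after resampling: pulled toward pop_mu
```

The resampled mean for individual 1 shrinks from its raw value toward the population mean, the hallmark of hierarchical pooling, while stage 1 never had to know about the other animals.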
Gu, Yingxin; Wylie, Bruce K.; Boyte, Stephen; Picotte, Joshua J.; Howard, Danny; Smith, Kelcy; Nelson, Kurtis
2016-01-01
Regression tree models have been widely used for remote sensing-based ecosystem mapping. Improper use of the sample data (model training and testing data) may cause overfitting and underfitting effects in the model. The goal of this study is to develop an optimal sampling data usage strategy for any dataset and identify an appropriate number of rules in the regression tree model that will improve its accuracy and robustness. Landsat 8 data and Moderate-Resolution Imaging Spectroradiometer-scaled Normalized Difference Vegetation Index (NDVI) were used to develop regression tree models. A Python procedure was designed to generate random replications of model parameter options across a range of model development data sizes and rule number constraints. The mean absolute difference (MAD) between the predicted and actual NDVI (scaled NDVI, value from 0–200) and its variability across the different randomized replications were calculated to assess the accuracy and stability of the models. In our case study, a six-rule regression tree model developed from 80% of the sample data had the lowest MAD (MADtraining = 2.5 and MADtesting = 2.4), which was suggested as the optimal model. This study demonstrates how the training data and rule number selections impact model accuracy and provides important guidance for future remote-sensing-based ecosystem modeling.
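The training-fraction experiment above reduces, in miniature, to fitting a tree on a sampled subset and comparing mean absolute difference (MAD) on the training and testing portions. The sketch below uses a one-rule regression stump on synthetic data standing in for the Landsat predictors and scaled NDVI (the study used multi-rule regression tree models, not a single stump):

```python
import random

random.seed(1)

# Synthetic (predictor, scaled-NDVI) pairs, invented for illustration.
data = [(x, 100 + 0.5 * x + random.gauss(0, 2)) for x in range(200)]

def mad(pairs, predict):
    # Mean absolute difference between predicted and actual values.
    return sum(abs(predict(x) - y) for x, y in pairs) / len(pairs)

def fit_stump(train):
    # One-rule regression tree: the single threshold split minimizing train MAD.
    best = None
    thresholds = sorted({x for x, _ in train})
    for t in thresholds[1:]:
        left = [y for x, y in train if x < t]
        right = [y for x, y in train if x >= t]
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        predict = lambda x, t=t, l=lmean, r=rmean: l if x < t else r
        err = mad(train, predict)
        if best is None or err < best[0]:
            best = (err, predict)
    return best[1]

random.shuffle(data)
split = int(0.8 * len(data))              # 80% training data, as in the study
train, test = data[:split], data[split:]
model = fit_stump(train)
mad_train, mad_test = mad(train, model), mad(test, model)
```

Repeating this loop over many random replications and over a grid of training fractions and rule counts, then examining the spread of MAD values, is exactly the stability analysis the Python procedure above automates.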
Development of a dynamic computational model of social cognitive theory.
Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C
2016-12-01
Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
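A fluid-analogy model of the kind described treats theory constructs as coupled first-order "inventories" whose levels fill and drain over time. The toy simulation below wires two such inventories (self-efficacy and behavior) into a reciprocal-determinism loop; the gains and time constants are invented for illustration and do not reproduce the model of Riley et al.:

```python
# Hypothetical fluid-analogy sketch: self-efficacy SE and behavior B feed
# each other through first-order dynamics (Euler integration). Because the
# loop gain (0.6 * 0.8 = 0.48) is below 1, the system settles to a fixed point.
def simulate(steps=300, dt=0.1, cue=1.0):
    se, b = 0.0, 0.0
    history = []
    for _ in range(steps):
        dse = (-se + 0.6 * b + 0.5 * cue) / 2.0  # mastery via behavior + external cue
        db = (-b + 0.8 * se) / 1.0               # behavior driven by self-efficacy
        se += dt * dse
        b += dt * db
        history.append((se, b))
    return history

traj = simulate()
se_final, b_final = traj[-1]
```

Solving the steady-state equations by hand (SE = 0.6 B + 0.5, B = 0.8 SE) gives SE = 0.5/0.52 and B = 0.4/0.52, which the simulation converges to; this is the sense in which a dynamical formulation yields precise, testable quantitative predictions.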
Parametric Modelling of As-Built Beam Framed Structure in Bim Environment
NASA Astrophysics Data System (ADS)
Yang, X.; Koehl, M.; Grussenmeyer, P.
2017-02-01
A complete documentation and conservation of a historic timber roof requires the integration of geometry modelling, attribute and dynamic information management, and results of structural analysis. The recently developed as-built Building Information Modelling (BIM) technique has the potential to provide a uniform platform that integrates traditional geometry modelling, parametric element management, and structural analysis. The main objective of the project presented in this paper is to develop a parametric modelling tool for a timber roof structure whose elements are leaning and crossing beams. Since Autodesk Revit, as typical BIM software, provides the platform for parametric modelling and information management, an API plugin was developed that automatically creates the parametric beam elements and links them together with strict relationships. The plugin under development, introduced in this paper, can obtain the parametric beam model via the Autodesk Revit API from total station points and terrestrial laser scanning data. The results show the potential of automating parametric modelling through interactive API development in a BIM environment, integrating separate data processing steps and different platforms into the single Revit platform.
Geospatial Modelling Approach for 3d Urban Densification Developments
NASA Astrophysics Data System (ADS)
Koziatek, O.; Dragićević, S.; Li, S.
2016-06-01
With growing populations, economic pressures, and the need for sustainable practices, many urban regions are rapidly densifying developments in the vertical built dimension with mid- and high-rise buildings. The location of these buildings can be projected based on key factors that are attractive to urban planners, developers, and potential buyers. Current research in this area includes various modelling approaches, such as cellular automata and agent-based modelling, but the results are mostly linked to raster grids as the smallest spatial units that operate in two spatial dimensions. Therefore, the objective of this research is to develop a geospatial model that operates on irregular spatial tessellations to model mid- and high-rise buildings in three spatial dimensions (3D). The proposed model is based on the integration of GIS, fuzzy multi-criteria evaluation (MCE), and 3D GIS-based procedural modelling. Part of the City of Surrey, within the Metro Vancouver Region, Canada, has been used to present the simulations of the generated 3D building objects. The proposed 3D modelling approach was developed using ESRI's CityEngine software and the Computer Generated Architecture (CGA) language.
How can animal models inform on the transition to chronic symptoms in whiplash?
Winkelstein, Beth A.
2011-01-01
Study Design A non-systematic review of the literature. Objective The objective was to present a general schema for mechanisms of whiplash pain and to review the role of animal models in understanding the development of chronic pain from whiplash injury. Summary of Background Data Extensive biomechanical and clinical studies of whiplash have been performed to understand the injury mechanisms and symptoms of whiplash injury. However, only recently have animal models of this painful disorder been developed, based on other pain models in the literature. Methods A non-systematic review was performed, and findings were integrated to formulate a generalized picture of the mechanisms by which chronic whiplash pain develops from mechanical tissue injuries. Results The development of chronic pain from tissue injuries in the neck due to whiplash involves complex interactions between the injured tissue and spinal neuroimmune circuits. A variety of animal models are beginning to define these mechanisms. Conclusion Continued work is needed in developing appropriate animal models to investigate chronic pain from whiplash injuries, and care must be taken to determine whether such models aim to model the injury event or the pain symptom. PMID:22020616
Conceptual Development of a National Volcanic Hazard Model for New Zealand
NASA Astrophysics Data System (ADS)
Stirling, Mark; Bebbington, Mark; Brenna, Marco; Cronin, Shane; Christophersen, Annemarie; Deligne, Natalia; Hurst, Tony; Jolly, Art; Jolly, Gill; Kennedy, Ben; Kereszturi, Gabor; Lindsay, Jan; Neall, Vince; Procter, Jonathan; Rhoades, David; Scott, Brad; Shane, Phil; Smith, Ian; Smith, Richard; Wang, Ting; White, James D. L.; Wilson, Colin J. N.; Wilson, Tom
2017-06-01
We provide a synthesis of a workshop held in February 2016 to define the goals, challenges and next steps for developing a national probabilistic volcanic hazard model for New Zealand. The workshop involved volcanologists, statisticians, and hazards scientists from GNS Science, Massey University, University of Otago, Victoria University of Wellington, University of Auckland, and University of Canterbury. We also outline key activities that will develop the model components, define procedures for periodic update of the model, and effectively articulate the model to end-users and stakeholders. The development of a National Volcanic Hazard Model is a formidable task that will require long-term stability in terms of team effort, collaboration and resources. Development of the model in stages or editions that are modular will make the process a manageable one that progressively incorporates additional volcanic hazards over time, and additional functionalities (e.g. short-term forecasting). The first edition is likely to be limited to updating and incorporating existing ashfall hazard models, with the other hazards associated with lahar, pyroclastic density currents, lava flow, ballistics, debris avalanche, and gases/aerosols being considered in subsequent updates.
Marcilio, Izabel; Hajat, Shakoor; Gouveia, Nelson
2013-08-01
This study aimed to develop different models to forecast the daily number of patients seeking emergency department (ED) care in a general hospital according to calendar variables and ambient temperature readings and to compare the models in terms of forecasting accuracy. The authors developed and tested six different models of ED patient visits using total daily counts of patient visits to an ED in Sao Paulo, Brazil, from January 1, 2008, to December 31, 2010. The first 33 months of the data set were used to develop the ED patient visits forecasting models (the training set), leaving the last 3 months to measure each model's forecasting accuracy by the mean absolute percentage error (MAPE). Forecasting models were developed using three different time-series analysis methods: generalized linear models (GLM), generalized estimating equations (GEE), and seasonal autoregressive integrated moving average (SARIMA). For each method, models were explored with and without the effect of mean daily temperature as a predictive variable. The daily mean number of ED visits was 389, ranging from 166 to 613. Data showed a weekly seasonal distribution, with highest patient volumes on Mondays and lowest patient volumes on weekends. There was little variation in daily visits by month. GLM and GEE models showed better forecasting accuracy than SARIMA models. For instance, the MAPEs from GLM models and GEE models at the first month of forecasting (October 2010) were 11.5 and 10.8% (models with and without control for the temperature effect, respectively), while the MAPEs from SARIMA models were 12.8 and 11.7%. For all models, controlling for the effect of temperature resulted in worse or similar forecasting ability than models with calendar variables alone, and forecasting accuracy was better for the short-term horizon (7 days in advance) than for the longer term (30 days in advance).
This study indicates that time-series models can be developed to provide forecasts of daily ED patient visits, and forecasting ability was dependent on the type of model employed and the length of the time horizon being predicted. In this setting, GLM and GEE models showed better accuracy than SARIMA models. Including information about ambient temperature in the models did not improve forecasting accuracy. Forecasting models based on calendar variables alone did in general detect patterns of daily variability in ED volume and thus could be used for developing an automated system for better planning of personnel resources. © 2013 by the Society for Academic Emergency Medicine.
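The MAPE measure used to rank the models above is simple to state. The sketch below computes it for two hypothetical 7-day forecasts: one calendar-informed, and a naive baseline that just repeats the reported training-set daily mean of 389 visits (the visit counts themselves are invented):

```python
def mape(actual, forecast):
    # Mean absolute percentage error, in percent.
    return 100.0 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical daily ED visit counts for one forecast week.
actual = [380, 410, 395, 450, 360, 300, 290]
calendar_model = [390, 400, 400, 430, 350, 310, 300]  # tracks the weekly cycle
naive = [389] * 7                                     # flat daily-mean forecast

mape_calendar = mape(actual, calendar_model)
mape_naive = mape(actual, naive)
```

A forecast that captures the Monday peak and weekend dip scores a much lower MAPE than the flat baseline, which is the same sense in which the GLM and GEE models outperform SARIMA in the study.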
Use case driven approach to develop simulation model for PCS of APR1400 simulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong Wook, Kim; Hong Soo, Kim; Hyeon Tae, Kang
2006-07-01
The full-scope simulator is being developed to evaluate specific design features and to support the iterative design and validation in the Man-Machine Interface System (MMIS) design of Advanced Power Reactor (APR) 1400. The simulator consists of a process model, control logic model, and MMI for the APR1400 as well as the Power Control System (PCS). In this paper, a use case driven approach is proposed to develop a simulation model for the PCS. In this approach, a system is considered from the point of view of its users. The user's view of the system is based on interactions with the system and the resultant responses. In the use case driven approach, we initially consider the system as a black box and look at its interactions with the users. From these interactions, use cases of the system are identified. Then the system is modeled using these use cases as functions. Lower levels expand the functionalities of each of these use cases. Hence, starting from the topmost level view of the system, we proceeded down to the lowest level (the internal view of the system). The model of the system thus developed is use case driven. This paper introduces the functionality of the PCS simulation model, including a requirement analysis based on use cases and the validation result of the PCS model development. The PCS simulation model using use cases will be first used during full-scope simulator development for a nuclear power plant and will be supplied to the Shin-Kori 3 and 4 plants. The use case based simulation model development can be useful for the design and implementation of simulation models. (authors)
A canopy architectural model to study the competitive ability of chickpea with sowthistle.
Cici, S-Zahra-Hosseini; Adkins, Steve; Hanan, Jim
2008-06-01
Improving the competitive ability of crops is a sustainable method of weed management. This paper shows how a virtual plant model of competition between chickpea (Cicer arietinum) and sowthistle (Sonchus oleraceus) can be used as a framework for discovering and/or developing more competitive chickpea cultivars. The virtual plant models were developed using the L-systems formalism, parameterized according to measurements taken on plants at intervals during their development. A quasi-Monte Carlo light-environment model was used to model the effect of chickpea canopy on the development of sowthistle. The chickpea-light environment-sowthistle model (CLES model) captured the hypothesis that the architecture of chickpea plants modifies the light environment inside the canopy and determines sowthistle growth and development pattern. The resulting CLES model was parameterized for different chickpea cultivars (viz. 'Macarena', 'Bumper', 'Jimbour' and '99071-1001') to compare their competitive ability with sowthistle. To validate the CLES model, an experiment was conducted using the same four chickpea cultivars as different treatments with a sowthistle growing under their canopy. The growth of sowthistle, both in silico and in glasshouse experiments, was reduced most by '99071-1001', a cultivar with a short phyllochron. The second rank of competitive ability belonged to 'Macarena' and 'Bumper', while 'Jimbour' was the least competitive cultivar. The architecture of virtual chickpea plants modified the light inside the canopy, which influenced the growth and development of the sowthistle plants in response to different cultivars. This is the first time that a virtual plant model of a crop-weed interaction has been developed. This virtual plant model can serve as a platform for a broad range of applications in the study of chickpea-weed interactions and their environment.
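Virtual plant models of this kind are built on the L-systems formalism: parallel string rewriting where each symbol (internode, branch point, turn) is replaced by a successor string at every developmental step. The sketch below shows a minimal rewriting engine with a textbook branching rule, not the chickpea parameterization from the study:

```python
# Minimal L-system derivation engine. Symbols: F = internode (grow forward),
# [ and ] = push/pop a branch, + and - = turn. A turtle interpreter would
# render the resulting string as plant geometry.
def derive(axiom, rules, steps):
    s = axiom
    for _ in range(steps):
        # Parallel rewriting: every symbol is replaced in the same pass.
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

rules = {"F": "F[+F]F[-F]F"}   # classic branching production
gen1 = derive("F", rules, 1)
gen2 = derive("F", rules, 2)
```

Each derivation step multiplies the number of internodes fivefold under this rule, so canopy "size" grows geometrically with developmental stage; parameterizing such productions from interval measurements on real plants is what the CLES model's construction describes.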
NASA Astrophysics Data System (ADS)
Lafontaine, J.; Hay, L.; Markstrom, S. L.
2016-12-01
The United States Geological Survey (USGS) has developed a National Hydrologic Model (NHM) to support coordinated, comprehensive and consistent hydrologic model development, and facilitate the application of hydrologic simulations within the conterminous United States (CONUS). As many stream reaches in the CONUS are either not gaged, or are substantially impacted by water use or flow regulation, ancillary information must be used to determine reasonable parameter estimations for streamflow simulations. Hydrologic models for 1,576 gaged watersheds across the CONUS were developed to test the feasibility of improving streamflow simulations linking physically-based hydrologic models with remotely-sensed data products (i.e. snow water equivalent). Initially, the physically-based models were calibrated to measured streamflow data to provide a baseline for comparison across multiple calibration strategy tests. In addition, not all ancillary datasets are appropriate for application to all parts of the CONUS (e.g. snow water equivalent in the southeastern U.S., where snow is a rarity). As it is not expected that any one data product or model simulation will be sufficient for representing hydrologic behavior across the entire CONUS, a systematic evaluation of which data products improve hydrologic simulations for various regions across the CONUS was performed. The resulting portfolio of calibration strategies can be used to guide selection of an appropriate combination of modeled and measured information for hydrologic model development and calibration. In addition, these calibration strategies have been developed to be flexible so that new data products can be assimilated. This analysis provides a foundation to understand how well models work when sufficient streamflow data are not available and could be used to further inform hydrologic model parameter development for ungaged areas.
Advanced Mirror & Modelling Technology Development
NASA Technical Reports Server (NTRS)
Effinger, Michael; Stahl, H. Philip; Abplanalp, Laura; Maffett, Steven; Egerman, Robert; Eng, Ron; Arnold, William; Mosier, Gary; Blaurock, Carl
2014-01-01
The 2020 Decadal technology survey is starting in 2018. Technology on the shelf at that time will help guide selection to future low risk and low cost missions. The Advanced Mirror Technology Development (AMTD) team has identified development priorities based on science goals and engineering requirements for Ultraviolet Optical near-Infrared (UVOIR) missions in order to contribute to the selection process. One key development identified was lightweight mirror fabrication and testing. A monolithic, stacked, deep core mirror was fused and replicated twice to achieve the desired radius of curvature. It was subsequently successfully polished and tested. A recently awarded second phase to the AMTD project will develop larger mirrors to demonstrate the lateral scaling of the deep core mirror technology. Another key development was rapid modeling for the mirror. One model focused on generating optical and structural model results in minutes instead of months. Many variables could be accounted for regarding the core, face plate and back structure details. A portion of a spacecraft model was also developed. The spacecraft model incorporated direct integration to transform optical path difference to Point Spread Function (PSF) and from PSF to modulation transfer function. The second phase of the project will take the results of the rapid mirror modeler and integrate them into the rapid spacecraft modeler.
Zhang, Qi; Kindig, Matthew; Li, Zuoping; Crandall, Jeff R; Kerrigan, Jason R
2014-08-22
Clavicle injuries are frequently observed in automotive side and frontal crashes. Finite element (FE) models have been developed to understand the injury mechanism, although no clavicle loading response corridors yet exist in the literature to ensure model response biofidelity. Moreover, the typically developed structural-level (e.g., force-deflection) response corridors were shown to be insufficient for verifying the injury prediction capacity of FE models, which is usually based on strain-related injury criteria. Therefore, the purpose of this study was to develop both structural-level (force vs. deflection) and material-level (strain vs. force) clavicle response corridors for validating FE models for injury risk modeling. Twenty clavicles were loaded to failure under loading conditions representative of side and frontal crashes: half in axial compression and half in three-point bending. Both structural and material response corridors were developed for each loading condition. An FE model that can accurately predict both structural response and strain level provides a more useful tool in injury risk modeling and prediction. The corridor development method in this study could also be extended to develop corridors for other components of the human body. Copyright © 2014 Elsevier Ltd. All rights reserved.
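A common way to build such corridors is to resample each specimen's response curve onto a shared abscissa and take the mean plus or minus one standard deviation across specimens. A minimal sketch under that assumption (the averaging scheme and the specimen values are illustrative; the paper's exact corridor method is not reproduced here):

```python
import statistics

def corridor(curves):
    """curves: one list of force samples per specimen, already
    interpolated onto a common deflection grid. Returns the
    (mean - SD, mean, mean + SD) curves across specimens."""
    mean = [statistics.mean(col) for col in zip(*curves)]
    sd = [statistics.stdev(col) for col in zip(*curves)]
    lower = [m - s for m, s in zip(mean, sd)]
    upper = [m + s for m, s in zip(mean, sd)]
    return lower, mean, upper

# Three hypothetical specimens sampled at four deflection points
specimens = [[0, 10, 22, 30], [0, 12, 20, 34], [0, 11, 21, 32]]
lo, mid, hi = corridor(specimens)
```

An FE model's simulated force-deflection trace would then be judged acceptable if it stays within the lower and upper bounds.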
High-level PC-based laser system modeling
NASA Astrophysics Data System (ADS)
Taylor, Michael S.
1991-05-01
Since the inception of the Strategic Defense Initiative (SDI), there have been a multitude of comparison studies done in an attempt to evaluate the effectiveness and relative sizes of complementary, and sometimes competitive, laser weapon systems. It became increasingly apparent that what the systems analyst needed was not only a fast but also a cost-effective way to perform high-level trade studies. In the present investigation, a general procedure is presented for the development of PC-based algorithmic systems models for laser systems. This procedure points out all of the major issues that should be addressed in the design and development of such a model. Issues addressed include defining the problem to be modeled, defining a strategy for development, and finally, effective use of the model once developed. Being a general procedure, it will allow a systems analyst to develop a model to meet specific needs. To illustrate this method of model development, a description of the Strategic Defense Simulation - Design To (SDS-DT) model developed and used by Science Applications International Corporation (SAIC) is presented. SDS-DT is a menu-driven, fast-executing, PC-based program that can be used either to calculate performance, weight, volume, and cost values for a particular design or, alternatively, to run parametrics on particular system parameters to perhaps optimize a design.
Alternatives for jet engine control
NASA Technical Reports Server (NTRS)
Sain, M. K.
1983-01-01
Tensor model order reduction, recursive tensor model identification, input design for tensor model identification, software development for nonlinear feedback control laws based upon tensors, and development of the CATNAP software package for tensor modeling, identification, and simulation were studied. The last of these is discussed.
Development and Application of a Simple Hydrogeomorphic Model for Headwater Catchments
We developed a catchment model based on a hydrogeomorphic concept that simulates discharge from channel-riparian complexes, zero-order basins (ZOB, basins ZB and FA), and hillslopes. Multitank models simulate ZOB and hillslope hydrological response, while kinematic wave models pr...
DOT National Transportation Integrated Search
2013-12-01
Travel forecasting models predict travel demand based on the present transportation system and its use. Transportation modelers must develop, validate, and calibrate models to ensure that predicted travel demand is as close to reality as possible. Mo...
Raabe, Margaret E; Chaudhari, Ajit M W
2016-05-03
The ability of a biomechanical simulation to produce results that can translate to real-life situations is largely dependent on the physiological accuracy of the musculoskeletal model. There are a limited number of freely available full-body models in OpenSim, and those that do exist are very limited in terms of trunk musculature and degrees of freedom in the spine. Properly modeling the motion and musculature of the trunk is necessary to most accurately estimate lower extremity and spinal loading. The objective of this study was to develop and validate a more physiologically accurate OpenSim full-body model. By building upon three previously developed OpenSim models, the full-body lumbar spine (FBLS) model, comprising 21 segments, 30 degrees of freedom, and 324 musculotendon actuators, was developed. The five lumbar vertebrae were modeled as individual bodies, and coupled constraints were implemented to describe the net motion of the spine. The eight major muscle groups of the lumbar spine were modeled (rectus abdominis, external and internal obliques, erector spinae, multifidus, quadratus lumborum, psoas major, and latissimus dorsi), and many of these muscle groups were modeled as multiple fascicles, allowing the large muscles to act in multiple directions. The resulting FBLS model's trunk muscle geometry, maximal isometric joint moments, and simulated muscle activations compare well to experimental data. The FBLS model will be made freely available (https://simtk.org/home/fullbodylumbar) for others to perform additional analyses and develop simulations investigating full-body dynamics and contributions of the trunk muscles to dynamic tasks. Copyright © 2016 Elsevier Ltd. All rights reserved.
Modelling vortex-induced fluid-structure interaction.
Benaroya, Haym; Gabbai, Rene D
2008-04-13
The principal goal of this research is developing physics-based, reduced-order, analytical models of nonlinear fluid-structure interactions associated with offshore structures. Our primary focus is to generalize Hamilton's variational framework so that systems of flow-oscillator equations can be derived from first principles. This is an extension of earlier work that led to a single energy equation describing the fluid-structure interaction. It is demonstrated here that flow-oscillator models are a subclass of the general, physics-based framework. A flow-oscillator model is a reduced-order mechanical model, generally comprising two mechanical oscillators, one modelling the structural oscillation and the other a nonlinear oscillator representing the fluid behaviour coupled to the structural motion. Reduced-order analytical model development continues to be carried out using a Hamilton's-principle-based variational approach. This provides flexibility in the long run for generalizing the modelling paradigm to complex, three-dimensional problems with multiple degrees of freedom, although such extension is very difficult. As both experimental and analytical capabilities advance, the critical research path to developing and implementing fluid-structure interaction models entails: formulating generalized equations of motion, as a superset of the flow-oscillator models; and developing experimentally derived, semi-analytical functions to describe key terms in the governing equations of motion. The developed variational approach yields a system of governing equations that will allow modelling of multiple degree-of-freedom systems. The extensions derived generalize Hamilton's variational formulation for such problems. The Navier-Stokes equations are derived and coupled to the structural oscillator. This general model has been shown to be a superset of the flow-oscillator model. Based on different assumptions, one can derive a variety of flow-oscillator models.
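A minimal flow-oscillator sketch in the spirit described above: a linear structural oscillator coupled to a van der Pol wake oscillator. All parameter values are hypothetical, chosen only to illustrate the two-oscillator structure, not taken from the paper:

```python
# Hypothetical parameters: structural damping/frequency, wake nonlinearity,
# and the two coupling constants A (structure -> wake) and M (wake -> structure).
ZETA, WS = 0.05, 2.0
EPS, A, M = 0.3, 0.5, 0.1

def rhs(t, u):
    """Structure:        y'' + 2*ZETA*WS*y' + WS^2*y = M*q
    Wake (van der Pol):  q'' - EPS*(1 - q^2)*q' + q   = A*y''"""
    y, yd, q, qd = u
    ydd = -2 * ZETA * WS * yd - WS**2 * y + M * q
    qdd = EPS * (1 - q**2) * qd - q + A * ydd
    return [yd, ydd, qd, qdd]

def rk4(f, u, t, h):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(t, u)
    k2 = f(t + h / 2, [ui + h / 2 * ki for ui, ki in zip(u, k1)])
    k3 = f(t + h / 2, [ui + h / 2 * ki for ui, ki in zip(u, k2)])
    k4 = f(t + h, [ui + h * ki for ui, ki in zip(u, k3)])
    return [ui + h / 6 * (a + 2 * b + 2 * c + d)
            for ui, a, b, c, d in zip(u, k1, k2, k3, k4)]

# Integrate from a small wake perturbation; the van der Pol term drives q
# toward a self-sustained limit cycle that in turn forces the structure.
u, h, qmax = [0.0, 0.0, 0.1, 0.0], 0.01, 0.0
for step in range(20000):
    u = rk4(rhs, u, step * h, h)
    if step > 15000:
        qmax = max(qmax, abs(u[2]))
```

The variational framework in the paper would supply the form of these equations and the semi-analytical coefficient functions; this sketch only shows the reduced-order two-oscillator skeleton.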
A composite computational model of liver glucose homeostasis. I. Building the composite model.
Hetherington, J; Sumner, T; Seymour, R M; Li, L; Rey, M Varela; Yamaji, S; Saffrey, P; Margoninski, O; Bogle, I D L; Finkelstein, A; Warner, A
2012-04-07
A computational model of the glucagon/insulin-driven liver glucohomeostasis function, focusing on the buffering of glucose into glycogen, has been developed. The model exemplifies an 'engineering' approach to modelling in systems biology, and was produced by linking together seven component models of separate aspects of the physiology. The component models use a variety of modelling paradigms and degrees of simplification. Model parameters were determined by an iterative hybrid of fitting to high-scale physiological data, and determination from small-scale in vitro experiments or molecular biological techniques. The component models were not originally designed for inclusion within such a composite model, but were integrated, with modification, using our published modelling software and computational frameworks. This approach facilitates the development of large and complex composite models, although, inevitably, some compromises must be made when composing the individual models. Composite models of this form have not previously been demonstrated.
VMT-based traffic impact assessment : development of a trip length model.
DOT National Transportation Integrated Search
2010-06-01
This report develops models that relate the trip-lengths to the land-use characteristics at : the trip-ends (both production- and attraction-ends). Separate models were developed by trip : purpose. The results indicate several statistically significa...
Evolving Non-Dominated Parameter Sets for Computational Models from Multiple Experiments
NASA Astrophysics Data System (ADS)
Lane, Peter C. R.; Gobet, Fernand
2013-03-01
Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Particular problems are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the 'speciated non-dominated sorting genetic algorithm' for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high quality models, adapted to provide a good fit to all available data.
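Non-dominated sorting, the core of the NSGA family on which the proposed algorithm builds, can be sketched as follows (a naive O(n^2) version for clarity; the speciation step of the authors' algorithm is not shown, and the error values are made up):

```python
def dominates(a, b):
    """a dominates b when a is no worse on every objective (minimized here)
    and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(scores):
    """Peel off successive Pareto fronts; returns lists of indices into scores."""
    fronts, remaining = [], list(range(len(scores)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(scores[j], scores[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Hypothetical fit errors of four parameter sets against two datasets:
# the first front holds the trade-off solutions no other set improves on.
errors = [(1, 2), (2, 1), (3, 3), (2, 2)]
fronts = non_dominated_sort(errors)
```

In the multi-dataset setting of the paper, each objective would be a model's fit error on one experimental dataset, so the first front is exactly "the full range of acceptable model parameters" no single dataset can rule out.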
DOE Office of Scientific and Technical Information (OSTI.GOV)
Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban
Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or manual comparison of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.
Static Aeroelastic Scaling and Analysis of a Sub-Scale Flexible Wing Wind Tunnel Model
NASA Technical Reports Server (NTRS)
Ting, Eric; Lebofsky, Sonia; Nguyen, Nhan; Trinh, Khanh
2014-01-01
This paper presents an approach to the development of a scaled wind tunnel model for static aeroelastic similarity with a full-scale wing model. The full-scale aircraft model is based on the NASA Generic Transport Model (GTM) with flexible wing structures referred to as the Elastically Shaped Aircraft Concept (ESAC). The baseline stiffness of the ESAC wing represents a conventionally stiff wing model. Static aeroelastic scaling is conducted on the stiff wing configuration to develop the wind tunnel model, but additional tailoring is also conducted such that the wind tunnel model achieves a 10% wing tip deflection at the wind tunnel test condition. An aeroelastic scaling procedure and analysis is conducted, and a sub-scale flexible wind tunnel model based on the full-scale model's undeformed jig shape is developed. Optimization of the flexible wind tunnel model's undeflected twist along the span, or pre-twist or wash-out, is then conducted for the design test condition. The resulting wind tunnel model is an aeroelastic model designed for the wind tunnel test condition.
Reusing models of actors and services in smart homecare to improve sustainability.
Walderhaug, Ståle; Stav, Erlend; Mikalsen, Marius
2008-01-01
Industrial countries are faced with a growing elderly population. Homecare systems with assistive smart house technology enable the elderly to live independently at home. Development of such smart homecare systems is complex and expensive, and there is no common reference model that can facilitate service reuse. This paper proposes reusable actor and service models based on a model-driven development process in which end-user organizations and domain healthcare experts from four European countries have been involved. The models, specified using UML, can be reused actively as assets in the system design and development process and can reduce development costs and improve interoperability and sustainability of systems. The models are being evaluated in the European IST project MPOWER.
An overview of the CellML API and its implementation
2010-01-01
Background: CellML is an XML-based language for representing mathematical models, in a machine-independent form which is suitable for their exchange between different authors, and for archival in a model repository. Allowing for the exchange and archival of models in a computer-readable form is a key strategic goal in bioinformatics, because of the associated improvements in scientific record accuracy, the faster iterative process of scientific development, and the ability to combine models into large integrative models. However, for CellML models to be useful, tools which can process them correctly are needed. Due to some of the more complex features present in CellML models, such as imports, developing code ab initio to correctly process models can be an onerous task. For this reason, there is a clear and pressing need for an application programming interface (API), and a good implementation of that API, upon which tools can base their support for CellML. Results: We developed an API which allows the information in CellML models to be retrieved and/or modified. We also developed a series of optional extension APIs, for tasks such as simplifying the handling of connections between variables, dealing with physical units, validating models, and translating models into different procedural languages. We have also provided a Free/Open Source implementation of this application programming interface, optimised to achieve good performance. Conclusions: Tools have been developed using the API which are mature enough for widespread use. The API has the potential to accelerate the development of additional tools capable of processing CellML, and ultimately lead to an increased level of sharing of mathematical model descriptions. PMID:20377909
An overview of the CellML API and its implementation.
Miller, Andrew K; Marsh, Justin; Reeve, Adam; Garny, Alan; Britten, Randall; Halstead, Matt; Cooper, Jonathan; Nickerson, David P; Nielsen, Poul F
2010-04-08
CellML is an XML-based language for representing mathematical models, in a machine-independent form which is suitable for their exchange between different authors, and for archival in a model repository. Allowing for the exchange and archival of models in a computer-readable form is a key strategic goal in bioinformatics, because of the associated improvements in scientific record accuracy, the faster iterative process of scientific development, and the ability to combine models into large integrative models. However, for CellML models to be useful, tools which can process them correctly are needed. Due to some of the more complex features present in CellML models, such as imports, developing code ab initio to correctly process models can be an onerous task. For this reason, there is a clear and pressing need for an application programming interface (API), and a good implementation of that API, upon which tools can base their support for CellML. We developed an API which allows the information in CellML models to be retrieved and/or modified. We also developed a series of optional extension APIs, for tasks such as simplifying the handling of connections between variables, dealing with physical units, validating models, and translating models into different procedural languages. We have also provided a Free/Open Source implementation of this application programming interface, optimised to achieve good performance. Tools have been developed using the API which are mature enough for widespread use. The API has the potential to accelerate the development of additional tools capable of processing CellML, and ultimately lead to an increased level of sharing of mathematical model descriptions.
Kinetic Model of Growth of Arthropoda Populations
NASA Astrophysics Data System (ADS)
Ershov, Yu. A.; Kuznetsov, M. A.
2018-05-01
Kinetic equations were derived for calculating the growth of crustacean populations (Crustacea) based on the biological growth model suggested earlier, using shrimp (Caridea) populations as an example. The development cycle of successive stages for populations can be represented in the form of quasi-chemical equations. The kinetic equations that describe the development cycle of crustaceans allow quantitative prediction of the development of populations depending on conditions. In contrast to extrapolation-simulation models, in the developed kinetic model of biological growth the kinetic parameters are the experimental characteristics of population growth. Verification and parametric identification of the developed model on the basis of the experimental data showed agreement with experiment within the error of the measurement technique.
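A quasi-chemical stage scheme of the kind described can be sketched as a chain of first-order transitions with stage-specific mortality. All rate constants below are hypothetical, for illustration only; they are not the paper's fitted values:

```python
# Stages egg -> larva -> juvenile -> adult written as quasi-chemical steps:
#   N_i --K[i]--> N_{i+1}    (development to the next stage)
#   N_i --MU[i]--> (death)   (stage-specific mortality)
K = [0.5, 0.3, 0.2]            # per-day transition rates between stages
MU = [0.05, 0.04, 0.03, 0.01]  # per-day mortality for each stage

def derivs(n):
    """Mass-action rate equations for the stage populations n."""
    dn = []
    for i in range(len(n)):
        inflow = K[i - 1] * n[i - 1] if i > 0 else 0.0
        outflow = (K[i] if i < len(K) else 0.0) * n[i] + MU[i] * n[i]
        dn.append(inflow - outflow)
    return dn

# Forward-Euler integration of an initial cohort of 1000 eggs over 100 days.
n, h = [1000.0, 0.0, 0.0, 0.0], 0.1
for _ in range(1000):
    d = derivs(n)
    n = [ni + h * di for ni, di in zip(n, d)]
```

With no reproduction term the total population can only decline; adding a birth term proportional to the adult stage would close the cycle described in the abstract.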
NASA Technical Reports Server (NTRS)
Adams, Neil S.; Bollenbacher, Gary
1992-01-01
This report discusses the development and underlying mathematics of a rigid-body computer model of a proposed cryogenic on-orbit liquid depot storage, acquisition, and transfer spacecraft (COLD-SAT). This model, referred to in this report as the COLD-SAT dynamic model, consists of both a trajectory model and an attitudinal model. All disturbance forces and torques expected to be significant for the actual COLD-SAT spacecraft are modeled to the required degree of accuracy. Control and experimental thrusters are modeled, as well as fluid slosh. The model also computes microgravity disturbance accelerations at any specified point in the spacecraft. The model was developed by using the Boeing EASY5 dynamic analysis package and will run on Apollo, Cray, and other computing platforms.
Hyper-Resolution Groundwater Modeling using MODFLOW 6
NASA Astrophysics Data System (ADS)
Hughes, J. D.; Langevin, C.
2017-12-01
MODFLOW 6 is the latest version of the U.S. Geological Survey's modular hydrologic model. MODFLOW 6 was developed to synthesize many of the recent versions of MODFLOW into a single program, improve the way different process models are coupled, and provide an object-oriented framework for adding new types of models and packages. The object-oriented framework and underlying numerical solver make it possible to tightly couple any number of hyper-resolution models within coarser regional models. The hyper-resolution models can be used to evaluate local-scale groundwater issues that may be affected by regional-scale forcings. In MODFLOW 6, hyper-resolution meshes can be maintained as separate model datasets, similar to MODFLOW-LGR, which simplifies embedding hyper-resolution models within a coarse regional model. For example, the South Atlantic Coastal Plain regional water availability model was converted from a MODFLOW-2000 model to a MODFLOW 6 model. The horizontal discretization of the original model is approximately 3,218 m x 3,218 m. Hyper-resolution models of the Aiken and Sumter County water budget areas in South Carolina, with a horizontal discretization of approximately 322 m x 322 m, were developed and tightly coupled to a modified version of the original coarse regional model that excluded these areas. Hydraulic property and aquifer geometry data from the coarse model were mapped to the hyper-resolution models. The discretization of the hyper-resolution models is fine enough to permit detailed analyses of the effect that changes in groundwater withdrawals in the production aquifers have on the water table and surface-water/groundwater interactions. The approach used in this analysis could be applied to other regional water availability models developed by the U.S. Geological Survey to evaluate local-scale groundwater issues.
Modeling of turbulence and transition
NASA Technical Reports Server (NTRS)
Shih, Tsan-Hsing
1992-01-01
The first objective is to evaluate current two-equation and second order closure turbulence models using available direct numerical simulations and experiments, and to identify the models which represent the state of the art in turbulence modeling. The second objective is to study the near-wall behavior of turbulence, and to develop reliable models for an engineering calculation of turbulence and transition. The third objective is to develop a two-scale model for compressible turbulence.
NASA Technical Reports Server (NTRS)
Hashemi-Kia, Mostafa; Toossi, Mostafa
1990-01-01
A computational procedure for the reduction of large finite element models was developed. This procedure is used to obtain a significantly reduced model while retaining the essential global dynamic characteristics of the full-size model. This reduction procedure is applied to the airframe finite element model of the AH-64A Attack Helicopter. The resulting reduced model is then validated by application to a vibration reduction study.
Enabling Accessibility Through Model-Based User Interface Development.
Ziegler, Daniel; Peissner, Matthias
2017-01-01
Adaptive user interfaces (AUIs) can increase the accessibility of interactive systems. They provide personalized display and interaction modes to fit individual user needs. Most AUI approaches rely on model-based development, which is considered relatively demanding. This paper explores strategies to make model-based development more attractive for mainstream developers.
Kohlberg's Moral Development Model: Cohort Influences on Validity.
ERIC Educational Resources Information Center
Bechtel, Ashleah
An overview of Kohlberg's theory of moral development is presented; three interviews regarding the theory are reported, and the author's own moral development is compared to the model; finally, a critique of the theory is presented along with recommendations for future enhancement. Lawrence Kohlberg's model of moral development, also referred to…
NASA Technical Reports Server (NTRS)
Flowers, George T.
1994-01-01
Substantial progress has been made toward the goals of this research effort in the past six months. A simplified rotor model with a flexible shaft and backup bearings has been developed. The model is based upon the work of Ishii and Kirk. Parameter studies of the behavior of this model are currently being conducted. A simple rotor model which includes a flexible disk and bearings with clearance has been developed and the dynamics of the model investigated. The study consists of simulation work coupled with experimental verification. The work is documented in the attached paper. A rotor model based upon the T-501 engine has been developed which includes backup bearing effects. The dynamics of this model are currently being studied with the objective of verifying the conclusions obtained from the simpler models. Parallel simulation runs are being conducted using an ANSYS-based finite element model of the T-501.
An introduction to the partial credit model for developing nursing assessments.
Fox, C
1999-11-01
The partial credit model, which is a special case of the Rasch measurement model, was presented as a useful way to develop and refine complex nursing assessments. The advantages of the Rasch model over the classical psychometric model were presented including the lack of bias in the measurement process, the ability to highlight those items in need of refinement, the provision of information on congruence between the data and the model, and feedback on the usefulness of the response categories. The partial credit model was introduced as a way to develop complex nursing assessments such as performance-based assessments, because of the model's ability to accommodate a variety of scoring procedures. Finally, an application of the partial credit model was illustrated using the Practical Knowledge Inventory for Nurses, a paper-and-pencil instrument that measures on-the-job decision-making for nurses.
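In the partial credit model, the probability that a person scores in category k of an item is driven by the cumulative sum of (theta - delta_j) over the step difficulties passed. A minimal sketch of the category probabilities (the step difficulty values below are made up for illustration; they are not from the Practical Knowledge Inventory for Nurses):

```python
import math

def pcm_probs(theta, deltas):
    """Partial credit model category probabilities P(X = 0..m) for a person
    with ability `theta` on an item with step difficulties `deltas`."""
    logits = [0.0]
    for d in deltas:                      # cumulative sum of (theta - delta_j)
        logits.append(logits[-1] + (theta - d))
    expv = [math.exp(v) for v in logits]
    z = sum(expv)                         # normalizing constant
    return [e / z for e in expv]

# A three-category item (scores 0, 1, 2) with hypothetical step difficulties:
# a more able respondent shifts probability mass toward higher categories.
probs_able = pcm_probs(2.0, [-0.5, 0.5])
probs_less = pcm_probs(-2.0, [-0.5, 0.5])
```

This category structure is what lets the model accommodate the varied scoring procedures of performance-based assessments, since each item can carry its own number of steps and step difficulties.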
Modeling of near wall turbulence and modeling of bypass transition
NASA Technical Reports Server (NTRS)
Yang, Z.
1992-01-01
The objectives for this project are as follows: (1) Modeling of near-wall turbulence: We aim to develop a second-order closure for near-wall turbulence. As a first step of this project, we try to develop a kappa-epsilon model for near-wall turbulence. We require the resulting model to be able to handle both near-wall turbulence and turbulent flows away from the wall, to be computationally robust, and to be applicable to complex flow situations, flow with separation, for example; and (2) Modeling of bypass transition: We aim to develop a bypass transition model which contains the effect of intermittency. Thus, the model can be used for both transitional boundary layers and turbulent boundary layers. We require the resulting model to give a good prediction of momentum and heat transfer within the transitional boundary layer and a good prediction of the effect of freestream turbulence on transitional boundary layers.
Modeling methods for high-fidelity rotorcraft flight mechanics simulation
NASA Technical Reports Server (NTRS)
Mansur, M. Hossein; Tischler, Mark B.; Chaimovich, Menahem; Rosen, Aviv; Rand, Omri
1992-01-01
The cooperative effort being carried out under the agreements of the United States-Israel Memorandum of Understanding is discussed. Two different models of the AH-64 Apache Helicopter, which differ in their approach to modeling the main rotor, are presented. The first model, the Blade Element Model for the Apache (BEMAP), was developed at Ames Research Center, and is the only model of the Apache to employ a direct blade element approach to calculating the coupled flap-lag motion of the blades and the rotor force and moment. The second model was developed at the Technion-Israel Institute of Technology and uses a harmonic approach to analyze the rotor. The approach allows two different levels of approximation, ranging from the 'first harmonic' (similar to a tip-path-plane model) to 'complete high harmonics' (comparable to a blade element approach). The development of the two models is outlined and the two are compared using available flight test data.
Extracting scene feature vectors through modeling, volume 3
NASA Technical Reports Server (NTRS)
Berry, J. K.; Smith, J. A.
1976-01-01
The remote estimation of the leaf area index of winter wheat at Finney County, Kansas was studied. The procedure developed consists of three activities: (1) field measurements; (2) model simulations; and (3) response classifications. The first activity is designed to identify model input parameters and develop a model evaluation data set. A stochastic plant canopy reflectance model is employed to simulate reflectance in the LANDSAT bands as a function of leaf area index for two phenological stages. An atmospheric model is used to translate these surface reflectances into simulated satellite radiance. A divergence classifier determines the relative similarity between model derived spectral responses and those of areas with unknown leaf area index. The unknown areas are assigned the index associated with the closest model response. This research demonstrated that the SRVC canopy reflectance model is appropriate for wheat scenes and that broad categories of leaf area index can be inferred from the procedure developed.
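The final classification step, assigning to each unknown area the leaf area index of the closest model-derived spectral response, can be sketched as a nearest-response assignment. This toy version substitutes Euclidean distance for the divergence statistic, and the band reflectance values are invented for illustration:

```python
def spectral_distance(a, b):
    # Euclidean distance stands in for the divergence statistic here.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def assign_lai(unknown_response, model_responses):
    """Assign the LAI of the closest model-derived response.
    model_responses: {lai_value: [band1, band2, ...]} (hypothetical layout)."""
    return min(model_responses,
               key=lambda lai: spectral_distance(unknown_response,
                                                 model_responses[lai]))

# Hypothetical simulated LANDSAT-band responses for three LAI categories
model_responses = {1.0: [0.10, 0.20, 0.30, 0.40],
                   3.0: [0.08, 0.25, 0.20, 0.55],
                   5.0: [0.06, 0.30, 0.12, 0.65]}
lai = assign_lai([0.07, 0.28, 0.15, 0.60], model_responses)
```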
Gottlieb, Sami L; Giersing, Birgitte; Boily, Marie-Claude; Chesson, Harrell; Looker, Katharine J; Schiffer, Joshua; Spicknall, Ian; Hutubessy, Raymond; Broutet, Nathalie
2017-06-21
Development of a vaccine against herpes simplex virus (HSV) is an important goal for global sexual and reproductive health. In order to more precisely define the health and economic burden of HSV infection and the theoretical impact and cost-effectiveness of an HSV vaccine, in 2015 the World Health Organization convened an expert consultation meeting on HSV vaccine impact modelling. The experts reviewed existing model-based estimates and dynamic models of HSV infection to outline critical future modelling needs to inform development of a comprehensive business case and preferred product characteristics for an HSV vaccine. This article summarizes key findings and discussions from the meeting on modelling needs related to HSV burden, costs, and vaccine impact, essential data needs to carry out those models, and important model components and parameters.
The Earth System Prediction Suite: Toward a Coordinated U.S. Modeling Capability
Theurich, Gerhard; DeLuca, C.; Campbell, T.; ...
2016-08-22
The Earth System Prediction Suite (ESPS) is a collection of flagship U.S. weather and climate models and model components that are being instrumented to conform to interoperability conventions, documented to follow metadata standards, and made available either under open-source terms or to credentialed users. Furthermore, the ESPS represents a culmination of efforts to create a common Earth system model architecture, and the advent of increasingly coordinated model development activities in the United States. ESPS component interfaces are based on the Earth System Modeling Framework (ESMF), community-developed software for building and coupling models, and the National Unified Operational Prediction Capability (NUOPC) Layer, a set of ESMF-based component templates and interoperability conventions. Our shared infrastructure simplifies the process of model coupling by guaranteeing that components conform to a set of technical and semantic behaviors. The ESPS encourages distributed, multiagency development of coupled modeling systems; controlled experimentation and testing; and exploration of novel model configurations, such as those motivated by research involving managed and interactive ensembles. ESPS codes include the Navy Global Environmental Model (NAVGEM), the Hybrid Coordinate Ocean Model (HYCOM), and the Coupled Ocean–Atmosphere Mesoscale Prediction System (COAMPS); the NOAA Environmental Modeling System (NEMS) and the Modular Ocean Model (MOM); the Community Earth System Model (CESM); and the NASA ModelE climate model and the Goddard Earth Observing System Model, version 5 (GEOS-5), atmospheric general circulation model.
Cunningham, Albert R.; Trent, John O.
2012-01-01
Structure–activity relationship (SAR) models are powerful tools to investigate the mechanisms of action of chemical carcinogens and to predict the potential carcinogenicity of untested compounds. We describe the use of a traditional fragment-based SAR approach along with a new virtual ligand-protein interaction-based approach for modeling of nonmutagenic carcinogens. The ligand-based SAR models used descriptors derived from computationally calculated ligand-binding affinities for learning set agents to 5495 proteins. Two learning sets were developed. One set was from the Carcinogenic Potency Database, where chemicals tested for rat carcinogenesis along with Salmonella mutagenicity data were provided. The second was from Malacarne et al. who developed a learning set of nonalerting compounds based on rodent cancer bioassay data and Ashby’s structural alerts. When the rat cancer models were categorized based on mutagenicity, the traditional fragment model outperformed the ligand-based model. However, when the learning sets were composed solely of nonmutagenic or nonalerting carcinogens and noncarcinogens, the fragment model demonstrated a concordance of near 50%, whereas the ligand-based models demonstrated a concordance of 71% for nonmutagenic carcinogens and 74% for nonalerting carcinogens. Overall, these findings suggest that expert system analysis of virtual chemical protein interactions may be useful for developing predictive SAR models for nonmutagenic carcinogens. Moreover, a more practical approach for developing SAR models for carcinogenesis may include fragment-based models for chemicals testing positive for mutagenicity and ligand-based models for chemicals devoid of DNA reactivity. PMID:22678118
Development and evaluation of thermal model reduction algorithms for spacecraft
NASA Astrophysics Data System (ADS)
Deiml, Michael; Suderland, Martin; Reiss, Philipp; Czupalla, Markus
2015-05-01
This paper is concerned with the reduction of thermal models of spacecraft. The work presented here has been conducted in cooperation with the company OHB AG, formerly Kayser-Threde GmbH, and the Institute of Astronautics at Technische Universität München, with the goal of shortening and automating the time-consuming, manual process of thermal model reduction. The reduction of thermal models can be divided into the simplification of the geometry model, for calculation of external heat flows and radiative couplings, and the reduction of the underlying mathematical model. For simplification, a method has been developed which approximates the reduced geometry model with the help of an optimization algorithm. Different linear and nonlinear model reduction techniques have been evaluated for their applicability to reduction of the mathematical model. Compatibility with the thermal analysis tool ESATAN-TMS is of major concern here and restricts the useful application of these methods. Additional model reduction methods have been developed which account for these constraints. The Matrix Reduction method allows the approximation of the differential equation to reference values exactly, except for numerical errors. The summation method enables a useful, applicable reduction of thermal models that can be used in industry. In this work, a framework for thermal model reduction has been created, which can be used together with a newly developed graphical user interface in industry.
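The summation-style reduction described above can be illustrated on a lumped-parameter thermal network: nodes merged into a reduced node have their capacitances summed, couplings between merged groups are summed, and couplings internal to a group drop out. This sketch assumes a simple dictionary representation, not the ESATAN-TMS data model, and all names are illustrative:

```python
def reduce_by_summation(capacitances, couplings, groups):
    """Summation reduction of a lumped-parameter thermal network (sketch).
    capacitances: {node: C}; couplings: {(i, j): G} conductive couplings;
    groups: {reduced_node: [original_nodes]}."""
    node_to_group = {n: g for g, nodes in groups.items() for n in nodes}
    # Reduced capacitance is the sum over the merged nodes.
    red_caps = {g: sum(capacitances[n] for n in nodes)
                for g, nodes in groups.items()}
    red_coups = {}
    for (i, j), g_val in couplings.items():
        gi, gj = node_to_group[i], node_to_group[j]
        if gi == gj:
            continue  # couplings internal to a group vanish in the reduced model
        key = tuple(sorted((gi, gj)))
        red_coups[key] = red_coups.get(key, 0.0) + g_val
    return red_caps, red_coups

caps, coups = reduce_by_summation(
    {"a": 1.0, "b": 2.0, "c": 3.0},
    {("a", "b"): 0.5, ("b", "c"): 0.2},
    {"AB": ["a", "b"], "C": ["c"]})
```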
Zheng, Jianqiu; Thornton, Peter; Painter, Scott; Gu, Baohua; Wullschleger, Stan; Graham, David
2018-06-13
This anaerobic carbon decomposition model is developed with explicit representation of fermentation, methanogenesis, and iron reduction by combining three well-known modeling approaches developed in different disciplines: a pool-based model to represent upstream carbon transformations and replenishment of the DOC pool, a thermodynamically based model to calculate rate kinetics and biomass growth for methanogenesis and Fe(III) reduction, and a humic ion-binding model for aqueous-phase speciation and pH calculation. These are implemented in the open-source geochemical model PHREEQC (v3.0). Installation of PHREEQC is required to run this model.
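The thermodynamically based rate kinetics can be sketched as a Monod substrate term multiplied by a thermodynamic potential factor (in the style of Jin and Bethke) that throttles the reaction as its free-energy yield approaches the minimum needed for growth. The exact rate expressions in the released PHREEQC model may differ, and every parameter value below is illustrative:

```python
import math

R_GAS = 8.314  # J/(mol K)

def methanogenesis_rate(v_max, doc, k_s, dG, dG_min, temp_k=298.15):
    """Thermodynamically limited Monod rate (sketch of the modeling idea).
    dG: reaction free energy (J/mol, negative = energy yielding);
    dG_min: minimum energy the organisms must conserve.
    F_T = max(0, 1 - exp((dG - dG_min)/(chi*R*T))) multiplies the Monod term."""
    chi = 2.0  # average stoichiometric number (assumed)
    monod = doc / (k_s + doc)
    f_t = max(0.0, 1.0 - math.exp((dG - dG_min) / (chi * R_GAS * temp_k)))
    return v_max * monod * f_t

# Energy-yielding conditions give a positive, throttled rate
r = methanogenesis_rate(v_max=1.0, doc=10.0, k_s=5.0, dG=-30e3, dG_min=-15e3)
```

When dG rises above dG_min (too little energy to conserve), the factor clamps the rate to zero, which is the qualitative behavior the thermodynamic formulation is meant to capture.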
NASA Astrophysics Data System (ADS)
Butler, Samuel D.; Marciniak, Michael A.
2014-09-01
Since the development of the Torrance-Sparrow bidirectional reflectance distribution function (BRDF) model in 1967, several BRDF models have been created. Previous attempts to categorize BRDF models have relied upon somewhat vague descriptors, such as empirical, semi-empirical, and experimental. Our approach is to instead categorize BRDF models based on functional form: microfacet normal distribution, geometric attenuation, directional-volumetric and Fresnel terms, and cross section conversion factor. Several popular microfacet models are compared to a standardized notation for a microfacet BRDF model. A library of microfacet model components is developed, allowing for creation of unique microfacet models driven by experimentally measured BRDFs.
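The component-library idea can be sketched by assembling a BRDF from interchangeable D (microfacet normal distribution), G (geometric attenuation), and F (Fresnel) terms plus the cross-section conversion factor. The particular choices below (Beckmann distribution, V-groove attenuation, Schlick Fresnel) are standard components from the literature, not necessarily the paper's parameterization:

```python
import math

def beckmann_ndf(cos_h, roughness):
    """Beckmann microfacet normal distribution (one library choice)."""
    c2 = cos_h * cos_h
    t2 = (1.0 - c2) / c2          # tan^2 of the half-vector angle
    m2 = roughness * roughness
    return math.exp(-t2 / m2) / (math.pi * m2 * c2 * c2)

def schlick_fresnel(cos_ih, f0):
    """Schlick approximation to the Fresnel term."""
    return f0 + (1.0 - f0) * (1.0 - cos_ih) ** 5

def v_groove_attenuation(cos_i, cos_o, cos_h, cos_ih):
    """Torrance-Sparrow / Cook-Torrance shadowing-masking term."""
    return min(1.0, 2.0 * cos_h * cos_i / cos_ih, 2.0 * cos_h * cos_o / cos_ih)

def microfacet_brdf(cos_i, cos_o, cos_h, cos_ih, roughness, f0):
    """Assemble a BRDF as D*G*F with the 1/(4 cos_i cos_o)
    cross-section conversion factor."""
    d = beckmann_ndf(cos_h, roughness)
    g = v_groove_attenuation(cos_i, cos_o, cos_h, cos_ih)
    f = schlick_fresnel(cos_ih, f0)
    return d * g * f / (4.0 * cos_i * cos_o)

# Near-specular geometry, moderate roughness, dielectric-like f0
val = microfacet_brdf(cos_i=0.9, cos_o=0.9, cos_h=1.0, cos_ih=0.9,
                      roughness=0.3, f0=0.04)
```

Swapping any of the three component functions yields a different microfacet model, which is the point of the library structure.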
NASA Astrophysics Data System (ADS)
Tallapragada, V.
2017-12-01
NOAA's Next Generation Global Prediction System (NGGPS) has provided the unique opportunity to develop and implement a non-hydrostatic global model based on Geophysical Fluid Dynamics Laboratory (GFDL) Finite Volume Cubed Sphere (FV3) Dynamic Core at National Centers for Environmental Prediction (NCEP), making a leap-step advancement in seamless prediction capabilities across all spatial and temporal scales. Model development efforts are centralized with unified model development in the NOAA Environmental Modeling System (NEMS) infrastructure based on Earth System Modeling Framework (ESMF). A more sophisticated coupling among various earth system components is being enabled within NEMS following National Unified Operational Prediction Capability (NUOPC) standards. The eventual goal of unifying global and regional models will enable operational global models operating at convective resolving scales. Apart from the advanced non-hydrostatic dynamic core and coupling to various earth system components, advanced physics and data assimilation techniques are essential for improved forecast skill. NGGPS is spearheading ambitious physics and data assimilation strategies, concentrating on creation of a Common Community Physics Package (CCPP) and Joint Effort for Data Assimilation Integration (JEDI). Both initiatives are expected to be community developed, with emphasis on research transitioning to operations (R2O). The unified modeling system is being built to support the needs of both operations and research. Different layers of community partners are also established with specific roles/responsibilities for researchers, core development partners, trusted super-users, and operations. Stakeholders are engaged at all stages to help drive the direction of development, resources allocations and prioritization. 
This talk presents the current and future plans of unified model development at NCEP for weather, sub-seasonal, and seasonal climate prediction applications with special emphasis on implementation of NCEP FV3 Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) into operations by 2019.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sathaye, Jayant A.
2000-04-01
Integrated assessment (IA) modeling of climate policy is increasingly global in nature, with models incorporating regional disaggregation. The existing empirical basis for IA modeling, however, largely arises from research on industrialized economies. Given the growing importance of developing countries in determining long-term global energy and carbon emissions trends, filling this gap with improved statistical information on developing countries' energy and carbon-emissions characteristics is an important priority for enhancing IA modeling. Earlier research at LBNL on this topic has focused on assembling and analyzing statistical data on productivity trends and technological change in the energy-intensive manufacturing sectors of five developing countries: India, Brazil, Mexico, Indonesia, and South Korea. The proposed work will extend this analysis to the agriculture and electric power sectors in India, South Korea, and two other developing countries. They will also examine the impact of alternative model specifications on estimates of productivity growth and technological change for each of the three sectors, and estimate the contribution of various capital inputs--imported vs. indigenous, rigid vs. malleable--in contributing to productivity growth and technological change. The project has already produced a data resource on the manufacturing sector which is being shared with IA modelers. This will be extended to the agriculture and electric power sectors, which would also be made accessible to IA modeling groups seeking to enhance the empirical descriptions of developing country characteristics. The project will entail basic statistical and econometric analysis of productivity and energy trends in these developing country sectors, with parameter estimates also made available to modeling groups.
The parameter estimates will be developed using alternative model specifications that could be directly utilized by the existing IAMs for the manufacturing, agriculture, and electric power sectors.
An Immuno-epidemiological Model of Paratuberculosis
NASA Astrophysics Data System (ADS)
Martcheva, M.
2011-11-01
The primary objective of this article is to introduce an immuno-epidemiological model of paratuberculosis (Johne's disease). To develop the immuno-epidemiological model, we first develop an immunological model and an epidemiological model. Then, we link the two models through time-since-infection structure and parameters of the epidemiological model. We use the nested approach to compose the immuno-epidemiological model. Our immunological model captures the switch between the T-cell immune response and the antibody response in Johne's disease. The epidemiological model is a time-since-infection model and captures the variability of transmission rate and the vertical transmission of the disease. We compute the immune-response-dependent epidemiological reproduction number. Our immuno-epidemiological model can be used for investigation of the impact of the immune response on the epidemiology of Johne's disease.
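The nested approach can be sketched generically: a within-host (immunological) model produces a pathogen load at time-since-infection tau, and the epidemiological transmission rate and reproduction number are functions of that load. This toy version uses logistic within-host growth and a linear load-to-transmission link, not the article's Johne's disease equations; all parameter values are illustrative:

```python
import math

def pathogen_load(tau, p0=1.0, r=0.5, k=1e6):
    """Within-host pathogen load at time-since-infection tau:
    logistic growth from inoculum p0 toward carrying capacity k."""
    g = math.exp(r * tau)
    return k * p0 * g / (k + p0 * (g - 1.0))

def transmission_rate(tau, c=1e-7):
    """Epidemiological transmission rate linked to the immunological state:
    beta(tau) = c * P(tau). The linear link is an assumption of this sketch."""
    return c * pathogen_load(tau)

def reproduction_number(mu=0.1, horizon=60.0, dt=0.1, s0=100.0):
    """Immune-response-dependent reproduction number:
    R0 ~ S0 * integral of beta(tau) * exp(-mu*tau) dtau,
    approximated with a left-point sum (adequate for a sketch)."""
    total, tau = 0.0, 0.0
    while tau < horizon:
        total += transmission_rate(tau) * math.exp(-mu * tau) * dt
        tau += dt
    return s0 * total
```

The nesting is visible in the call chain: the epidemiological quantities never see the immunological state except through `pathogen_load`.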
1986-01-01
the information that has been determined experimentally. The Labyrinth Seal Analysis program was, therefore, directed to the development of an...labyrinth seal performance, the program included the development of an improved empirical design model to provide the calculation of the flow... program. Phase I was directed to the analytical development of both an "analysis" model and an improved empirical "design" model. Supporting rig tests
Development of an EUV Test Facility at the Marshall Space Flight Center
2011-08-22
Zemax model developed from beam size measurements that locates and determines the size of a copper gasket mounted to our pneumatic gate valve at the...the observed spectra. Therefore, a Zemax model of the source, transmission grating and the Andor camera had to be developed. Two models were developed...(see Figures 16, 17 and 18). The Zemax model including the NIST transmission data is in good agreement with the observed spectrum shown in Figure 18.
NASA Astrophysics Data System (ADS)
Begnaud, M. L.; Anderson, D. N.; Phillips, W. S.; Myers, S. C.; Ballard, S.
2016-12-01
The Regional Seismic Travel Time (RSTT) tomography model has been developed to improve travel time predictions for regional phases (Pn, Sn, Pg, Lg) in order to increase seismic location accuracy, especially for explosion monitoring. The RSTT model is specifically designed to exploit regional phases for location, especially when combined with teleseismic arrivals. The latest RSTT model (version 201404um) has been released (http://www.sandia.gov/rstt). Travel time uncertainty estimates for RSTT are determined using one-dimensional (1D), distance-dependent error models that have the benefit of being very fast to use in standard location algorithms, but do not account for path-dependent variations in error or for structural inadequacy of the RSTT model (i.e., model error). Although global in extent, the RSTT tomography model is only defined in areas where data exist. A simple 1D error model does not accurately model areas where RSTT has not been calibrated. We are developing and validating a new error model for RSTT phase arrivals by mathematically deriving this multivariate model directly from a unified model of RSTT embedded into a statistical random effects model that captures distance, path, and model error effects. An initial method developed is a two-dimensional path-distributed method using residuals. The goals for any RSTT uncertainty method are for it to be readily useful for the standard RSTT user and to improve travel time uncertainty estimates for location. We have successfully tested the new error model for Pn phases and will demonstrate the method and validation of the error model for Sn, Pg, and Lg phases.
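A 1D distance-dependent error model of the kind described is fast precisely because a location code can evaluate it with a simple interpolation against epicentral distance. The knot distances and sigma values below are invented for illustration, not RSTT calibration values:

```python
from bisect import bisect_right

def travel_time_sigma(distance_deg, knots, sigmas):
    """1D distance-dependent travel-time uncertainty: linear interpolation
    between calibration knots (degrees), clamped at both ends."""
    if distance_deg <= knots[0]:
        return sigmas[0]
    if distance_deg >= knots[-1]:
        return sigmas[-1]
    i = bisect_right(knots, distance_deg) - 1
    w = (distance_deg - knots[i]) / (knots[i + 1] - knots[i])
    return sigmas[i] * (1.0 - w) + sigmas[i + 1] * w

knots = [0.0, 5.0, 10.0, 15.0]   # epicentral distance, degrees (assumed)
sigmas = [0.8, 1.2, 1.6, 2.4]    # Pn travel-time sigma, seconds (assumed)
sigma = travel_time_sigma(7.5, knots, sigmas)
```

The path-dependent and model-error effects the abstract describes are exactly what this one-dimensional lookup cannot represent, which motivates the multivariate random-effects formulation.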
Nonlinear Dynamic Modeling and Controls Development for Supersonic Propulsion System Research
NASA Technical Reports Server (NTRS)
Connolly, Joseph W.; Kopasakis, George; Paxson, Daniel E.; Stuber, Eric; Woolwine, Kyle
2012-01-01
This paper covers the propulsion system component modeling and controls development of an integrated nonlinear dynamic simulation for an inlet and engine that can be used for an overall vehicle (APSE) model. The focus here is on developing a methodology for the propulsion model integration, which allows for controls design that prevents inlet instabilities and minimizes the thrust oscillation experienced by the vehicle. Limiting thrust oscillations will be critical to avoid exciting vehicle aeroelastic modes. Model development includes both inlet normal shock position control and engine rotor speed control for a potential supersonic commercial transport. A loop shaping control design process is used that has previously been developed for the engine and verified on linear models, while a simpler approach is used for the inlet control design. Verification of the modeling approach is conducted by simulating a two-dimensional bifurcated inlet and a representative J-85 jet engine previously used in a NASA supersonics project. Preliminary results are presented for the current supersonics project concept variable cycle turbofan engine design.
Mermis, Bernie J
2018-02-01
The present article concerns the development of a taxonomy model for organizing and classifying all aspects of rehabilitation psychology from an integrative level-dimensional conceptualization. This conceptualization is presented as an alternative to a primarily categorical approach to classification. It also assumes a continuity perspective for all aspects of behavior and experience. Development of this taxonomy model involves organizing information relevant to levels/domains of all aspects of behavior and experience, and to constructs describing their underlying components conceptually as well as dimensions which constitute the measurable basis of constructs. A taxonomy model with levels/domains, representative examples of constructs and dimensions is presented as a foundation for development of the present taxonomy model, with specific relevance to rehabilitation psychology. This integrative level-dimensional taxonomy model provides a structure for organizing all aspects of rehabilitation psychology relevant to understanding, assessing, and influencing the rehabilitation process. Suggestions for development and research are provided for the taxonomy model.
Variables influencing food perception reviewed for consumer-oriented product development.
Sijtsema, Siet; Linnemann, Anita; van Gaasbeek, Ton; Dagevos, Hans; Jongen, Wim
2002-01-01
Consumer wishes have to be translated into product characteristics to implement consumer-oriented product development. Before this step can be made, insight in food-related behavior and perception of consumers is necessary to make the right, useful, and successful translation. Food choice behavior and consumers' perception are studied in many disciplines. Models of food behavior and preferences therefore were studied from a multidisciplinary perspective. Nearly all models structure the determinants related to the person, the food, and the environment. Consequently, the overview of models was used as a basis to structure the variables influencing food perception into a model for consumer-oriented product development. To this new model, referred to as food perception model, other variables like time and place as part of consumption moment were added. These are important variables influencing consumers' perception, and therefore of increasing importance to consumer-oriented product development nowadays. In further research, the presented food perception model is used as a tool to implement successful consumer-oriented product development.
Modelling a single phase voltage controlled rectifier using Laplace transforms
NASA Technical Reports Server (NTRS)
Kraft, L. Alan; Kankam, M. David
1992-01-01
The development of a 20 kHz AC power system by NASA for large space projects has spurred a need to develop models for the equipment which will be used on these single phase systems. To date, models for the AC source (i.e., inverters) have been developed. It is the intent of this paper to develop a method to model the single phase voltage controlled rectifiers which will be attached to the AC power grid as an interface for connected loads. A modified version of EPRI's HARMFLO program is used as the shell for these models. The results obtained from the model developed in this paper are quite adequate for the analysis of problems such as voltage resonance. The unique technique presented in this paper uses Laplace transforms to determine the harmonic content of the load current of the rectifier rather than a curve fitting technique. The Laplace transforms yield the coefficients of the differential equations that model the line current to the rectifier directly.
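The quantity the Laplace-transform technique delivers in closed form, the harmonic content of the rectifier line current, can be illustrated numerically by projecting one period of the waveform onto sines and cosines. The idealized square-wave current below stands in for the actual rectifier waveform; it is purely illustrative:

```python
import math

def harmonic_coefficients(waveform, n_harmonics, samples=2048):
    """Numerically extract Fourier (harmonic) coefficients of a periodic
    line-current waveform over one period [0, 2*pi). The paper derives the
    harmonic content in closed form via Laplace transforms; this numeric
    version just shows what 'harmonic content' means."""
    coeffs = []
    for n in range(1, n_harmonics + 1):
        a = b = 0.0
        for k in range(samples):
            t = 2.0 * math.pi * k / samples
            y = waveform(t)
            a += y * math.cos(n * t)
            b += y * math.sin(n * t)
        coeffs.append((2.0 * a / samples, 2.0 * b / samples))
    return coeffs

# Idealized square-wave line current (conduction reversing each half cycle)
square = lambda t: 1.0 if t < math.pi else -1.0
coeffs = harmonic_coefficients(square, 5)
```

For the square wave the odd sine harmonics dominate (4/(n*pi) for odd n, zero for even n), the classic signature of a line current rich in low-order odd harmonics.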
Comparison of Conceptual and Neural Network Rainfall-Runoff Models
NASA Astrophysics Data System (ADS)
Vidyarthi, V. K.; Jain, A.
2014-12-01
A rainfall-runoff (RR) model is a key component of any water resources application. There are two types of techniques usually employed for RR modeling: physics-based and data-driven. Although the physics-based models have been used for operational purposes for a very long time, they provide only reasonable accuracy in modeling and forecasting. Artificial Neural Networks (ANNs), one of the data-driven techniques, became popular for efficient modeling of complex natural systems in the last couple of decades and have been reported to provide superior modeling performance; however, they have not been accepted by practitioners, decision makers, and water resources engineers as operational tools. In this paper, comparative results for conceptual and ANN models in RR modeling are presented. The conceptual models were developed using the rainfall-runoff library (RRL), and a genetic algorithm (GA) was used for their calibration. A feed-forward neural network structure trained by the Levenberg-Marquardt (LM) algorithm was adopted to develop all the ANN models. The daily rainfall, runoff, and various climatic data derived from the Bird Creek basin, Oklahoma, USA were employed to develop all the models included here. Daily potential evapotranspiration (PET), which was used in conceptual model development, was calculated with the Penman equation. The input variables were selected on the basis of correlation analysis. Performance evaluation statistics such as average absolute relative error (AARE), Pearson's correlation coefficient (R), and threshold statistics (TS) were used to assess the performance of all the models developed here. The results obtained in this study show that the ANN models outperform the conventional conceptual models due to their ability to learn the non-linearity and complexity inherent in rainfall-runoff data in a more efficient manner.
There is a strong need to carry out such studies to prove the superiority of ANN models over conventional methods in an attempt to make them acceptable by water resources community responsible for the operation of water resources systems.
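The performance statistics named here (AARE, Pearson's R, and threshold statistics) are straightforward to compute. A minimal sketch follows, with TS interpreted as the percentage of predictions whose absolute relative error falls below a threshold; the sample series are invented:

```python
def aare(obs, sim):
    """Average absolute relative error, in percent."""
    return 100.0 * sum(abs((o - s) / o) for o, s in zip(obs, sim)) / len(obs)

def pearson_r(obs, sim):
    """Pearson's correlation coefficient between observed and simulated flows."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    num = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    den = (sum((o - mo) ** 2 for o in obs)
           * sum((s - ms) ** 2 for s in sim)) ** 0.5
    return num / den

def threshold_stat(obs, sim, threshold_pct):
    """TS: percentage of predictions whose absolute relative error
    is below the given threshold (one common definition)."""
    errs = [100.0 * abs((o - s) / o) for o, s in zip(obs, sim)]
    return 100.0 * sum(e < threshold_pct for e in errs) / len(errs)

obs = [10.0, 20.0, 30.0, 40.0]  # observed runoff (illustrative)
sim = [11.0, 19.0, 33.0, 38.0]  # simulated runoff (illustrative)
```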
NASA Astrophysics Data System (ADS)
Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara
2015-09-01
Simulation of rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large scale climate signals have been proved to be effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps of rainfall forecasting, rainfall-runoff simulation and future runoff prediction. In the first step, data driven models are developed and used to forecast rainfall using large scale climate signals as rainfall predictors. Due to high effect of different sources of uncertainty on the output of hydrologic models, in the second step uncertainty associated with input data, model parameters and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by alternative rainfall-runoff models are combined, through developing a weighting method based on K-means clustering. Model parameters and input uncertainty are investigated using an adaptive Markov Chain Monte Carlo method. Finally, calibrated rainfall-runoff models are driven using the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of uncertainty analysis of rainfall-runoff modeling reveal that simultaneous estimation of model parameters and input uncertainty significantly changes the probability distribution of the model parameters. 
It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the watershed is remarkably improved up to 50% in comparison to the simulations by the individual models. Results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also adequate time for incorporating required mitigation measures in dealing with potentially extreme runoff events and flood hazard. Results of this study can be used in identification of the main factors affecting flood hazard analysis.
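The ensemble-combination idea, weighting the streamflows generated by alternative rainfall-runoff models, can be sketched with simple inverse-error weights. The study's K-means-based weighting scheme is more elaborate than this, and the error values below are hypothetical:

```python
def inverse_error_weights(errors):
    """Normalize weights proportional to 1/error (e.g. 1/RMSE per model)."""
    inv = [1.0 / e for e in errors]
    s = sum(inv)
    return [v / s for v in inv]

def combine(forecasts, weights):
    """Weighted ensemble streamflow: forecasts is a list of per-model
    output series; returns the combined series."""
    return [sum(w * f[t] for w, f in zip(weights, forecasts))
            for t in range(len(forecasts[0]))]

rmse = [2.0, 4.0, 8.0]           # hypothetical per-model errors
w = inverse_error_weights(rmse)  # better models get larger weights
combined = combine([[10, 12], [14, 16], [18, 20]], w)
```

Any weighting scheme of this shape, including one derived from clustering model outputs, plugs into `combine` the same way.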
Atmospheric Models for Aerocapture
NASA Technical Reports Server (NTRS)
Justus, C. G.; Duval, Aleta; Keller, Vernon W.
2003-01-01
There are eight destinations in the Solar System with sufficient atmosphere for aerocapture to be a viable aeroassist option: Venus, Earth, Mars, Jupiter, Saturn and its moon Titan, Uranus, and Neptune. Engineering-level atmospheric models for four of these targets (Earth, Mars, Titan, and Neptune) have been developed for NASA to support systems analysis studies of potential future aerocapture missions. Development of a similar atmospheric model for Venus has recently commenced. An important capability of all of these models is their ability to simulate quasi-random density perturbations for Monte Carlo analyses in developing guidance, navigation and control algorithms, and for thermal systems design. Similarities and differences among these atmospheric models are presented, with emphasis on the recently developed Neptune model and on planned characteristics of the Venus model. Example applications for aerocapture are also presented and illustrated. Recent updates to the Titan atmospheric model, in anticipation of applications for trajectory and atmospheric reconstruction of the Huygens Probe entry at Titan, are discussed. Recent updates to the Mars atmospheric model, in support of ongoing Mars aerocapture systems analysis studies, are also presented.
Status of DSMT research program
NASA Technical Reports Server (NTRS)
Mcgowan, Paul E.; Javeed, Mehzad; Edighoffer, Harold H.
1991-01-01
The status of the Dynamic Scale Model Technology (DSMT) research program is presented. DSMT is developing scale model technology for large space structures as part of the Control Structure Interaction (CSI) program at NASA Langley Research Center (LaRC). Under DSMT, a hybrid-scale structural dynamics model of Space Station Freedom was developed. Space Station Freedom was selected as the focus structure for DSMT since the station represents the first opportunity to obtain flight data on a complex, three-dimensional space structure. Included is an overview of DSMT, covering the development of the space station scale model and the resulting hardware. Scaling technology was developed for this model to achieve a ground test article that existing test facilities can accommodate while employing realistically scaled hardware. The model was designed and fabricated by the Lockheed Missiles and Space Co. and is assembled at LaRC for dynamic testing. Results from ground tests and analyses of the various model components are presented along with plans for future subassembly and mated model tests. Finally, utilization of the scale model for enhancing analysis verification of the full-scale space station is considered.
Wall-resolved spectral cascade-transport turbulence model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, C. S.; Shaver, D. R.; Lahey, R. T.
A spectral cascade-transport model has been developed and applied to turbulent channel flows (Reτ = 550, 950, and 2000, based on the friction velocity uτ; or Reδ = 8,500, 14,800, and 31,000, based on the mean velocity and channel half-width). This model is an extension of a spectral model previously developed for homogeneous single- and two-phase decay of isotropic turbulence and uniform shear flows, and of a spectral turbulence model for wall-bounded flows that does not resolve the boundary layer. Data from direct numerical simulation (DNS) of turbulent channel flow were used to help develop this model and to assess its performance in the 1D direction across the channel width. The resultant spectral model is capable of predicting the mean velocity, turbulent kinetic energy and energy spectrum distributions for single-phase wall-bounded flows all the way to the wall, where the model source terms have been developed to account for the wall influence. We implemented the model into the 3D multiphase CFD code NPHASE-CMFD, and the latest results are within reasonable error of the 1D predictions.
Wall-resolved spectral cascade-transport turbulence model
Brown, C. S.; Shaver, D. R.; Lahey, R. T.; ...
2017-07-08
A spectral cascade-transport model has been developed and applied to turbulent channel flows (Reτ = 550, 950, and 2000, based on the friction velocity uτ; or Reδ = 8,500, 14,800, and 31,000, based on the mean velocity and channel half-width). This model is an extension of a spectral model previously developed for homogeneous single- and two-phase decay of isotropic turbulence and uniform shear flows, and of a spectral turbulence model for wall-bounded flows that does not resolve the boundary layer. Data from direct numerical simulation (DNS) of turbulent channel flow were used to help develop this model and to assess its performance in the 1D direction across the channel width. The resultant spectral model is capable of predicting the mean velocity, turbulent kinetic energy and energy spectrum distributions for single-phase wall-bounded flows all the way to the wall, where the model source terms have been developed to account for the wall influence. We implemented the model into the 3D multiphase CFD code NPHASE-CMFD, and the latest results are within reasonable error of the 1D predictions.
A Systematic Review of Conceptual Frameworks of Medical Complexity and New Model Development.
Zullig, Leah L; Whitson, Heather E; Hastings, Susan N; Beadles, Chris; Kravchenko, Julia; Akushevich, Igor; Maciejewski, Matthew L
2016-03-01
Patient complexity is often operationalized by counting multiple chronic conditions (MCC) without considering contextual factors that can affect patient risk for adverse outcomes. Our objective was to develop a conceptual model of complexity addressing gaps identified in a review of published conceptual models. We searched for English-language MEDLINE papers published between 1 January 2004 and 16 January 2014. Two reviewers independently evaluated abstracts and all authors contributed to the development of the conceptual model in an iterative process. From 1606 identified abstracts, six conceptual models were selected. One additional model was identified through reference review. Each model had strengths, but several constructs were not fully considered: 1) contextual factors; 2) dynamics of complexity; 3) patients' preferences; 4) acute health shocks; and 5) resilience. Our Cycle of Complexity model illustrates relationships between acute shocks and medical events, healthcare access and utilization, workload and capacity, and patient preferences in the context of interpersonal, organizational, and community factors. This model may inform studies on the etiology of and changes in complexity, the relationship between complexity and patient outcomes, and intervention development to improve modifiable elements of complex patients.
Building new physiologically based pharmacokinetic (PBPK) models requires a lot of data, such as chemical-specific parameters and in vivo pharmacokinetic data. Previously developed, well-parameterized, and thoroughly vetted models can be a great resource for supporting the constr...
Mirfazaelian et al. (2006) developed a physiologically based pharmacokinetic (PBPK) model for the pyrethroid pesticide deltamethrin in the rat. This model describes gastrointestinal tract absorption as a saturable process mediated by phase III efflux transporters which pump delta...
DOT National Transportation Integrated Search
2013-06-01
This report summarizes a research project aimed at developing degradation models for bridge decks in the state of Michigan based on durability mechanics. A probabilistic framework to implement local-level mechanistic-based models for predicting the c...
To bridge the gaps between traditional mesoscale modelling and microscale modelling, the National Center for Atmospheric Research, in collaboration with other agencies and research groups, has developed an integrated urban modelling system coupled to the weather research and fore...
NEW DEVELOPMENTS IN THE COMMUNITY MULTISCALE AIR QUALITY (CMAQ) MODEL
CMAQ model research and development is currently following two tracks at the Atmospheric Modeling Division of the USEPA. Public releases of the community model system for research and policy analysis are continuing on an annual interval, with the latest release scheduled for Augus...
Mechanisms of Developmental Change in Infant Categorization
ERIC Educational Resources Information Center
Westermann, Gert; Mareschal, Denis
2012-01-01
Computational models are tools for testing mechanistic theories of learning and development. Formal models allow us to instantiate theories of cognitive development in computer simulations. Model behavior can then be compared to real performance. Connectionist models, loosely based on neural information processing, have been successful in…
Models of Purposive Human Organization: A Comparative Study
1984-02-01
develop techniques for organizational diagnosis with the Dinnat-Murphree (D-M) model, to be followed by intervention by S-T methodology. ...relational and object data for Dinnat-Murphree model construction. 2. Develop techniques for organizational diagnosis with the Dinnat-Murphree model
Dependability modeling and assessment in UML-based software development.
Bernardi, Simona; Merseguer, José; Petriu, Dorina C
2012-01-01
Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.
Development and parameter identification of a visco-hyperelastic model for the periodontal ligament.
Huang, Huixiang; Tang, Wencheng; Tan, Qiyan; Yan, Bin
2017-04-01
The present study developed and implemented a new visco-hyperelastic model that is capable of predicting the time-dependent biomechanical behavior of the periodontal ligament. The constitutive model has been implemented into the finite element package ABAQUS by means of a user-defined material subroutine (UMAT). The stress response is decomposed into two constitutive parts in parallel which are a hyperelastic and a time-dependent viscoelastic stress response. In order to identify the model parameters, the indentation equation based on the V-W hyperelastic model and the indentation creep model are developed. Then the parameters are determined by fitting them to the corresponding nanoindentation experimental data of the PDL. The nanoindentation experiment was simulated by finite element analysis to validate the visco-hyperelastic model. The simulated results are in good agreement with the experimental data, which demonstrates that the visco-hyperelastic model developed is able to accurately predict the time-dependent mechanical behavior of the PDL.
A Model-Based Expert System for Space Power Distribution Diagnostics
NASA Technical Reports Server (NTRS)
Quinn, Todd M.; Schlegelmilch, Richard F.
1994-01-01
When engineers diagnose system failures, they often use models to confirm system operation. This concept has produced a class of advanced expert systems that perform model-based diagnosis. A model-based diagnostic expert system for the Space Station Freedom electrical power distribution test bed is currently being developed at the NASA Lewis Research Center. The objective of this expert system is to autonomously detect and isolate electrical fault conditions. Marple, a software package developed at TRW, provides a model-based environment utilizing constraint suspension. Originally, constraint suspension techniques were developed for digital systems. However, Marple provides the mechanisms for applying this approach to analog systems such as the test bed, as well. The expert system was developed using Marple and Lucid Common Lisp running on a Sun Sparc-2 workstation. The Marple modeling environment has proved to be a useful tool for investigating the various aspects of model-based diagnostics. This report describes work completed to date and lessons learned while employing model-based diagnostics using constraint suspension within an analog system.
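The constraint-suspension idea the abstract describes can be illustrated with a toy constraint network. This is a generic sketch of the technique, not Marple's actual implementation; the component names, constraint rules, and observed values are invented for illustration.

```python
# Constraint suspension: model each component as a (multidirectional)
# constraint. A component is a fault candidate if suspending its constraint
# leaves the rest of the model consistent with the observations.

# Each rule is (output_var, input_vars, function).
components = {
    "M1": [("x", ("a", "c"), lambda a, c: a * c)],          # multiplier
    "M2": [("y", ("b", "d"), lambda b, d: b * d)],          # multiplier
    "M3": [("z", ("c", "e"), lambda c, e: c * e)],          # multiplier
    "A1": [("f", ("x", "y"), lambda x, y: x + y),           # adder (3 views)
           ("x", ("f", "y"), lambda f, y: f - y),
           ("y", ("f", "x"), lambda f, x: f - x)],
    "A2": [("g", ("y", "z"), lambda y, z: y + z),           # adder (3 views)
           ("y", ("g", "z"), lambda g, z: g - z),
           ("z", ("g", "y"), lambda g, y: g - y)],
}

def consistent(observed, suspended):
    """Propagate values through all non-suspended constraints;
    return False on any conflict with a known value."""
    vals = dict(observed)
    changed = True
    while changed:
        changed = False
        for name, rules in components.items():
            if name == suspended:
                continue
            for out, ins, fn in rules:
                if not all(i in vals for i in ins):
                    continue
                v = fn(*(vals[i] for i in ins))
                if out in vals:
                    if vals[out] != v:
                        return False      # conflict: fault lies elsewhere
                else:
                    vals[out] = v
                    changed = True
    return True

# Correct behavior would give x=6, y=6, z=6, f=12, g=12;
# the observed f=10 is faulty, g=12 agrees with the model.
obs = {"a": 3, "b": 2, "c": 2, "d": 3, "e": 3, "f": 10, "g": 12}
candidates = [n for n in components if consistent(obs, n)]
```

Suspending M1 or A1 restores consistency (either could explain the bad f), while M2, M3, and A2 are exonerated because the correct g reading constrains them, mirroring how constraint suspension isolates fault candidates in the test bed.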
Animal Models in Cardiovascular Research: Hypertension and Atherosclerosis
Ng, Chun-Yi; Jaarin, Kamsiah
2015-01-01
Hypertension and atherosclerosis are among the most common causes of mortality in both developed and developing countries. Experimental animal models of hypertension and atherosclerosis have become a valuable tool for providing information on etiology, pathophysiology, and complications of the disease and on the efficacy and mechanism of action of various drugs and compounds used in treatment. An animal model has been developed to study hypertension and atherosclerosis for several reasons. Compared to human models, an animal model is easily manageable, as compounding effects of dietary and environmental factors can be controlled. Blood vessels and cardiac tissue samples can be taken for detailed experimental and biomolecular examination. Choice of animal model is often determined by the research aim, as well as financial and technical factors. A thorough understanding of the animal models used and complete analysis must be validated so that the data can be extrapolated to humans. In conclusion, animal models for hypertension and atherosclerosis are invaluable in improving our understanding of cardiovascular disease and developing new pharmacological therapies. PMID:26064920
Dependability Modeling and Assessment in UML-Based Software Development
Bernardi, Simona; Merseguer, José; Petriu, Dorina C.
2012-01-01
Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results. PMID:22988428
Ground-water models for water resources planning
Moore, John E.
1980-01-01
In the past decade hydrologists have emphasized the development of computer-based mathematical models to aid in the understanding of flow, the transport of solutes, transport of heat, and deformation in the groundwater system. These models have been used to provide information and predictions for water managers. Too frequently, groundwater was neglected in water-resource planning because managers believed that it could not be adequately evaluated in terms of availability, quality, and effect of development on surface water supplies. Now, however, with newly developed digital groundwater models, effects of development can be predicted. Such models have been used to predict hydrologic and quality changes under different stresses. These models have grown in complexity over the last 10 years from simple one-layer flow models to three-dimensional simulations of groundwater flow which may include solute transport, heat transport, effects of land subsidence, and encroachment of salt water. This paper illustrates, through case histories, how predictive groundwater models have provided the information needed for the sound planning and management of water resources in the United States. (USGS)
Managing Analysis Models in the Design Process
NASA Technical Reports Server (NTRS)
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
Suits reflectance models for wheat and cotton - Theoretical and experimental tests
NASA Technical Reports Server (NTRS)
Chance, J. E.; Lemaster, E. W.
1977-01-01
Plant canopy reflectance models developed by Suits are tested for cotton and Penjamo winter wheat. Properties of the models are discussed, and the concept of model depth is developed. The models' predicted exchange symmetry for specular irradiance with respect to sun polar angle and observer polar angle agreed with field data for cotton and wheat. Model calculations and experimental data for wheat reflectance vs sun angle disagreed. Specular reflectance from 0.50 to 1.10 micron shows fair agreement between the model and wheat measurements. An Appendix includes the physical and optical parameters for wheat necessary to apply Suits' models.
Marchioro, C A; Krechemer, F S; de Moraes, C P; Foerster, L A
2015-12-01
The diamondback moth, Plutella xylostella (L.), is a cosmopolitan pest of brassicaceous crops occurring in regions with highly distinct climatic conditions. Several studies have investigated the relationship between temperature and P. xylostella development rate, providing degree-day models for populations from different geographical regions. However, no data are available to date to demonstrate the suitability of such models for making reliable projections of the development time of this species in field conditions. In the present study, 19 models available in the literature were tested for their ability to accurately predict the development time of two cohorts of P. xylostella under field conditions. Only 11 of the 19 models tested accurately predicted the development time for the first cohort of P. xylostella, and only seven for the second cohort. Five models correctly predicted the development time for both cohorts evaluated. Our data demonstrate that the accuracy of the models available for P. xylostella varies widely, and they should therefore be used with caution for pest management purposes.
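Degree-day models of the kind tested above generally accumulate thermal units above a lower developmental threshold until a species-specific thermal constant is reached. A minimal sketch follows; the threshold (7.3 °C) and thermal constant (283 degree-days) are illustrative values chosen for the example, not parameters from the study.

```python
# Hedged sketch of the standard linear degree-day approach: each day
# contributes max(0, T_mean - threshold) degree-days, and development
# completes when the accumulated total reaches the thermal constant.

def days_to_complete_development(daily_mean_temps, lower_threshold=7.3,
                                 thermal_constant=283.0):
    """Return the day on which accumulated degree-days first reach the
    thermal constant, or None if development is never completed."""
    accumulated = 0.0
    for day, temp in enumerate(daily_mean_temps, start=1):
        accumulated += max(0.0, temp - lower_threshold)
        if accumulated >= thermal_constant:
            return day
    return None

# At a constant 25.0 C mean, each day adds 17.7 degree-days,
# so 283 degree-days are reached on day 16 (16 * 17.7 = 283.2).
warm_days = days_to_complete_development([25.0] * 30)
cold_days = days_to_complete_development([5.0] * 30)   # below threshold
```

Field-testing such a model, as the study does, amounts to comparing the predicted completion day against the observed development time of each cohort.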
NASA Astrophysics Data System (ADS)
Sanchez, P.; Hinojosa, J.; Ruiz, R.
2005-06-01
Recently, neuromodeling methods for microwave devices have been developed. These methods are suitable for generating models of novel devices, and they allow fast and accurate simulations and optimizations. However, developing model libraries with these methods is a formidable task, since it requires massive input-output data provided by an electromagnetic simulator or by measurements, along with repeated artificial neural network (ANN) training. This paper presents a strategy that reduces the cost of library development while retaining the advantages of neuromodeling methods: high accuracy, a large range of geometrical and material parameters, and reduced CPU time. The library models are developed from a set of base prior knowledge input (PKI) models, which capture the characteristics common to all the models in the library, and high-level ANNs which produce the library model outputs from the base PKI models. This technique is illustrated for a microwave multiconductor tunable phase shifter using anisotropic substrates. Closed-form relationships have been developed and are presented in this paper. The results show good agreement with the expected values.
ERIC Educational Resources Information Center
Al-Dor, Nira
2006-01-01
The objective of this study is to present "The Spiral Model for the Development of Coordination" (SMDC), a learning model that reflects the complexity and possibilities embodied in the learning of movement notation Eshkol-Wachman (EWMN), an Israeli invention. This model constituted the infrastructure for a comprehensive study that examined the…
NASA Technical Reports Server (NTRS)
Colborn, B. L.; Armstrong, T. W.
1993-01-01
A three-dimensional geometry and mass model of the Long Duration Exposure Facility (LDEF) spacecraft and experiment trays was developed for use in predictions and data interpretation related to ionizing radiation measurements. The modeling approach, level of detail incorporated, example models for specific experiments and radiation dosimeters, and example applications of the model are described.
Cancer Model Development Centers
The Cancer Model Development Centers (CMDCs) are the NCI-funded contributors to the HCMI. They are tasked with producing next-generation cancer models from clinical samples. The cancer models will encompass tumor types that are rare, originate from patients from underrepresented populations, or lack precision therapy. These models will be annotated with clinical and genomic data and will become a community resource.
USDA-ARS?s Scientific Manuscript database
Validation of model predictions for independent variables not included in model development can save time and money by identifying conditions for which new models are not needed. A single strain of Salmonella Typhimurium DT104 was used to develop a general regression neural network model for growth...
ERIC Educational Resources Information Center
Koellner, Karen; Jacobs, Jennifer
2015-01-01
We posit that professional development (PD) models fall on a continuum from highly adaptive to highly specified, and that these constructs provide a productive way to characterize and distinguish among models. The study reported here examines the impact of an adaptive mathematics PD model on teachers' knowledge and instructional practices as well…
Developing Model-Making and Model-Breaking Skills Using Direct Measurement Video-Based Activities
ERIC Educational Resources Information Center
Vonk, Matthew; Bohacek, Peter; Militello, Cheryl; Iverson, Ellen
2017-01-01
This study focuses on student development of two important laboratory skills in the context of introductory college-level physics. The first skill, which we call model making, is the ability to analyze a phenomenon in a way that produces a quantitative multimodal model. The second skill, which we call model breaking, is the ability to critically…
Weck, Philippe F.; Kim, Eunja; Wang, Yifeng; ...
2017-08-01
Molecular structures of kerogen control hydrocarbon production in unconventional reservoirs. Significant progress has been made in developing model representations of various kerogen structures. These models have been widely used for the prediction of gas adsorption and migration in shale matrix. However, using density functional perturbation theory (DFPT) calculations and vibrational spectroscopic measurements, we here show that a large gap may still remain between the existing model representations and actual kerogen structures, therefore calling for new model development. Using DFPT, we calculated Fourier transform infrared (FTIR) spectra for six most widely used kerogen structure models. The computed spectra were then systematically compared to the FTIR absorption spectra collected for kerogen samples isolated from Mancos, Woodford and Marcellus formations representing a wide range of kerogen origin and maturation conditions. Limited agreement between the model predictions and the measurements highlights that the existing kerogen models may still miss some key features in structural representation. A combination of DFPT calculations with spectroscopic measurements may provide a useful diagnostic tool for assessing the adequacy of a proposed structural model as well as for future model development. This approach may eventually help develop comprehensive infrared (IR)-fingerprints for tracing kerogen evolution.
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Vartio, Eric; Shimko, Anthony; Kvaternik, Raymond G.; Eure, Kenneth W.; Scott, Robert C.
2007-01-01
Aeroservoelastic (ASE) analytical models of a SensorCraft wind-tunnel model are generated using measured data. The data was acquired during the ASE wind-tunnel test of the HiLDA (High Lift-to-Drag Active) Wing model, tested in the NASA Langley Transonic Dynamics Tunnel (TDT) in late 2004. Two time-domain system identification techniques are applied to the development of the ASE analytical models: impulse response (IR) method and the Generalized Predictive Control (GPC) method. Using measured control surface inputs (frequency sweeps) and associated sensor responses, the IR method is used to extract corresponding input/output impulse response pairs. These impulse responses are then transformed into state-space models for use in ASE analyses. Similarly, the GPC method transforms measured random control surface inputs and associated sensor responses into an AutoRegressive with eXogenous input (ARX) model. The ARX model is then used to develop the gust load alleviation (GLA) control law. For the IR method, comparison of measured with simulated responses are presented to investigate the accuracy of the ASE analytical models developed. For the GPC method, comparison of simulated open-loop and closed-loop (GLA) time histories are presented.
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Shimko, Anthony; Kvaternik, Raymond G.; Eure, Kenneth W.; Scott, Robert C.
2006-01-01
Aeroservoelastic (ASE) analytical models of a SensorCraft wind-tunnel model are generated using measured data. The data was acquired during the ASE wind-tunnel test of the HiLDA (High Lift-to-Drag Active) Wing model, tested in the NASA Langley Transonic Dynamics Tunnel (TDT) in late 2004. Two time-domain system identification techniques are applied to the development of the ASE analytical models: impulse response (IR) method and the Generalized Predictive Control (GPC) method. Using measured control surface inputs (frequency sweeps) and associated sensor responses, the IR method is used to extract corresponding input/output impulse response pairs. These impulse responses are then transformed into state-space models for use in ASE analyses. Similarly, the GPC method transforms measured random control surface inputs and associated sensor responses into an AutoRegressive with eXogenous input (ARX) model. The ARX model is then used to develop the gust load alleviation (GLA) control law. For the IR method, comparison of measured with simulated responses are presented to investigate the accuracy of the ASE analytical models developed. For the GPC method, comparison of simulated open-loop and closed-loop (GLA) time histories are presented.
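The ARX identification step used by the GPC method can be sketched for a toy first-order, single-input case. This is an illustrative least-squares fit of y[k] = a·y[k-1] + b·u[k-1], not the actual HiLDA model structure, which used many more lags and multiple sensor channels.

```python
# ARX identification sketch: regress the next output on the past output
# and past control input, solving the 2x2 normal equations by hand.

def fit_arx1(u, y):
    """Least-squares fit of y[k] = a*y[k-1] + b*u[k-1]."""
    syy = syu = suu = sy_t = su_t = 0.0
    for k in range(1, len(y)):
        phi1, phi2, target = y[k - 1], u[k - 1], y[k]
        syy += phi1 * phi1
        syu += phi1 * phi2
        suu += phi2 * phi2
        sy_t += phi1 * target
        su_t += phi2 * target
    det = syy * suu - syu * syu
    a = (sy_t * suu - su_t * syu) / det
    b = (su_t * syy - sy_t * syu) / det
    return a, b

# Simulate a known system y[k] = 0.8*y[k-1] + 0.5*u[k-1] from a control
# input sequence, then recover (a, b) from the input/output data alone.
u = [1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0]
y = [0.0]
for k in range(1, len(u)):
    y.append(0.8 * y[k - 1] + 0.5 * u[k - 1])
a, b = fit_arx1(u, y)
```

With noise-free data the fit recovers the true coefficients exactly; with measured wind-tunnel data, the same regression yields the ARX model from which a control law such as the GLA design can be derived.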
Weck, Philippe F; Kim, Eunja; Wang, Yifeng; Kruichak, Jessica N; Mills, Melissa M; Matteo, Edward N; Pellenq, Roland J-M
2017-08-01
Molecular structures of kerogen control hydrocarbon production in unconventional reservoirs. Significant progress has been made in developing model representations of various kerogen structures. These models have been widely used for the prediction of gas adsorption and migration in shale matrix. However, using density functional perturbation theory (DFPT) calculations and vibrational spectroscopic measurements, we here show that a large gap may still remain between the existing model representations and actual kerogen structures, therefore calling for new model development. Using DFPT, we calculated Fourier transform infrared (FTIR) spectra for six most widely used kerogen structure models. The computed spectra were then systematically compared to the FTIR absorption spectra collected for kerogen samples isolated from Mancos, Woodford and Marcellus formations representing a wide range of kerogen origin and maturation conditions. Limited agreement between the model predictions and the measurements highlights that the existing kerogen models may still miss some key features in structural representation. A combination of DFPT calculations with spectroscopic measurements may provide a useful diagnostic tool for assessing the adequacy of a proposed structural model as well as for future model development. This approach may eventually help develop comprehensive infrared (IR)-fingerprints for tracing kerogen evolution.
Artificial Intelligence Techniques for Predicting and Mapping Daily Pan Evaporation
NASA Astrophysics Data System (ADS)
Arunkumar, R.; Jothiprakash, V.; Sharma, Kirty
2017-09-01
In this study, Artificial Intelligence techniques such as Artificial Neural Networks (ANN), Model Trees (MT) and Genetic Programming (GP) are used to develop daily pan evaporation time-series (TS) prediction and cause-effect (CE) mapping models. Ten years of observed daily meteorological data, such as maximum temperature, minimum temperature, relative humidity, sunshine hours, dew point temperature and pan evaporation, are used for developing the models. For each technique, several models are developed by changing the number of inputs and other model parameters. The performance of each model is evaluated using standard statistical measures such as Mean Square Error, Mean Absolute Error, Normalized Mean Square Error and the correlation coefficient (R). The results showed that the daily TS-GP(4) model predicted better than the other TS models, with a correlation coefficient of 0.959. Among the various CE models, CE-ANN (6-10-1) performed better than the MT and GP models, with a correlation coefficient of 0.881. Because of the complex non-linear inter-relationships among the meteorological variables, the CE mapping models could not achieve the performance of the TS models. From this study, it was found that GP performs better for recognizing a single pattern (time-series modelling), whereas ANN is better for modelling multiple patterns (cause-effect modelling) in the data.
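The standard statistical measures named in the abstract can be sketched directly. The data values below are invented for illustration, and NMSE is assumed here to be the MSE normalized by the variance of the observations, which is one common definition.

```python
# Minimal sketch of the evaluation metrics: MSE, MAE, NMSE and the
# Pearson correlation coefficient R between observed and predicted series.
import math

def metrics(obs, pred):
    n = len(obs)
    mean_o = sum(obs) / n
    mean_p = sum(pred) / n
    mse = sum((p - o) ** 2 for o, p in zip(obs, pred)) / n
    mae = sum(abs(p - o) for o, p in zip(obs, pred)) / n
    var_o = sum((o - mean_o) ** 2 for o in obs) / n
    nmse = mse / var_o                      # assumed normalization
    cov = sum((o - mean_o) * (p - mean_p) for o, p in zip(obs, pred)) / n
    sd_p = math.sqrt(sum((p - mean_p) ** 2 for p in pred) / n)
    r = cov / (math.sqrt(var_o) * sd_p)
    return mse, mae, nmse, r

obs  = [3.1, 4.0, 5.2, 6.1, 4.8]   # e.g. observed pan evaporation (mm/day)
pred = [3.0, 4.2, 5.0, 6.3, 4.7]   # a hypothetical model's predictions
mse, mae, nmse, r = metrics(obs, pred)
```

Ranking candidate models by these four numbers, as the study does, is what allows a TS-GP or CE-ANN variant to be identified as the best performer.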
Global Environmental Multiscale model - a platform for integrated environmental predictions
NASA Astrophysics Data System (ADS)
Kaminski, Jacek W.; Struzewska, Joanna; Neary, Lori; Dearden, Frank
2017-04-01
The Global Environmental Multiscale model was developed by the Government of Canada as an operational weather prediction model in the mid-1990s. Subsequently, it was used as the host meteorological model for an on-line implementation of air quality chemistry and aerosols from the global to the meso-gamma scale. Further model developments led to the vertical extension of the modelling domain to include stratospheric chemistry, aerosols, and the formation of polar stratospheric clouds. In parallel, the modelling platform was used for planetary applications, where dynamical, radiative transfer and chemical processes in the atmosphere of Mars were successfully simulated. The developed platform is thus capable of seamless, coupled modelling of the dynamics and chemistry of planetary atmospheres. We will present modelling results for global, regional, and local air quality episodes and long-term air quality trends. Upper-troposphere and lower-stratosphere modelling results will be presented in terms of climate change and subsonic aviation emissions modelling. Model results for the atmosphere of Mars will be presented in the context of the 2016 ExoMars mission and the anticipated observations from the NOMAD instrument. We will also present plans and the design to extend the GEM model to the F region, with further coupling to a magnetospheric model that extends to 15 Re.
An Ontology of Power: Perception and Reality in Conflict
2016-12-01
synthetic model was developed as the constant comparative analysis was resumed through the application of selected theory toward the original source... The synthetic model represents a series of maxims for the analysis of a complex social system, developed through a study of contemporary national... and categories. A model of strategic agency is proposed as an alternative framework for developing security strategy. The strategic agency model draws
ERIC Educational Resources Information Center
Ismail, Yilmaz
2016-01-01
This study aims to develop a semiotic declarative knowledge model, which is a positive constructive behavior model that systematically facilitates understanding in order to ensure that learners think accurately and ask the right questions about a topic. The data used to develop the experimental model were obtained using four measurement tools…
Berrisford, Richard; Brunelli, Alessandro; Rocco, Gaetano; Treasure, Tom; Utley, Martin
2005-08-01
To identify pre-operative factors associated with in-hospital mortality following lung resection and to construct a risk model that could be used prospectively to inform decisions and retrospectively to enable fair comparisons of outcomes. Data were submitted to the European Thoracic Surgery Database from 27 units in 14 countries. We analysed data concerning all patients that had a lung resection. Logistic regression was used with a random sample of 60% of cases to identify pre-operative factors associated with in-hospital mortality and to build a model of risk. The resulting model was tested on the remaining 40% of patients. A second model based on age and ppoFEV1% was developed for risk of in-hospital death amongst tumour resection patients. Of the 3426 adult patients that had a first lung resection for whom mortality data were available, 66 died within the same hospital admission. Within the data used for model development, dyspnoea (according to the Medical Research Council classification), ASA (American Society of Anaesthesiologists) score, class of procedure and age were found to be significantly associated with in-hospital death in a multivariate analysis. The logistic model developed on these data displayed predictive value when tested on the remaining data. Two models of the risk of in-hospital death amongst adult patients undergoing lung resection have been developed. The models show predictive value and can be used to discern between high-risk and low-risk patients. Amongst the test data, the model developed for all diagnoses performed well at low risk, underestimated mortality at medium risk and overestimated mortality at high risk. The second model for resection of lung neoplasms was developed after establishing the performance of the first model and so could not be tested robustly. That said, we were encouraged by its performance over the entire range of estimated risk. 
The first of these two models could be regarded as an evaluation based on clinically available criteria while the second uses data obtained from objective measurement. We are optimistic that further model development and testing will provide a tool suitable for case mix adjustment.
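A logistic risk model of the kind described converts pre-operative factors into a predicted probability of in-hospital death via risk = 1 / (1 + exp(-(b0 + sum(bi * xi)))). The sketch below uses entirely hypothetical coefficients to show the mechanics; the actual coefficients were fitted to the European Thoracic Surgery Database and are not reproduced here.

```python
import math

# Hypothetical coefficients for illustration only; the study's model was
# fitted by logistic regression on 60% of cases and tested on the rest.
INTERCEPT = -7.0
COEFFS = {
    "age": 0.05,             # per year of age
    "dyspnoea_mrc": 0.40,    # MRC dyspnoea classification (0-4)
    "asa_score": 0.60,       # ASA (American Society of Anaesthesiologists) class
    "procedure_class": 0.30, # ordinal class of procedure
}

def predicted_risk(patient):
    """Logistic model: probability of in-hospital death for one patient."""
    z = INTERCEPT + sum(COEFFS[k] * patient[k] for k in COEFFS)
    return 1.0 / (1.0 + math.exp(-z))

# A low-risk and a high-risk profile, to show the model discerning between them.
low = predicted_risk({"age": 55, "dyspnoea_mrc": 0, "asa_score": 1,
                      "procedure_class": 1})
high = predicted_risk({"age": 80, "dyspnoea_mrc": 3, "asa_score": 4,
                       "procedure_class": 3})
```

Used prospectively, such a score can inform decisions; used retrospectively over a case mix, it enables fair comparisons of observed against expected mortality.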
NASA Astrophysics Data System (ADS)
Krishnasamy, M.; Qian, Feng; Zuo, Lei; Lenka, T. R.
2018-03-01
The charge cancellation due to the change of strain along a single continuous piezoelectric layer can remarkably affect the performance of a cantilever-based harvester. In this paper, analytical models using distributed parameters are developed that partly avert charge cancellation in a cantilever piezoelectric transducer by segmenting the piezoelectric layers at the strain nodes of the vibration mode of interest. In the first model (Model 1), the electrodes of the piezoelectric segments are connected in parallel with a single external resistive load, while in the second model (Model 2) each bimorph piezoelectric layer is connected in parallel with its own resistor to form an independent circuit. The analytical expressions of the closed-form electromechanical coupling responses in the frequency domain under harmonic base excitation are derived for both models based on the Euler-Bernoulli beam assumption. The developed analytical models are validated against COMSOL and experimental results. The results demonstrate that the energy harvesting performance of the developed segmented piezoelectric layer models is better than that of the traditional continuous-layer model.
Modelling Root Systems Using Oriented Density Distributions
NASA Astrophysics Data System (ADS)
Dupuy, Lionel X.
2011-09-01
Root architectural models are essential tools to understand how plants access and utilize soil resources during their development. However, root architectural models use complex geometrical descriptions of the root system, which limits the modelling of interactions with the soil. This paper presents the development of continuous models based on the concept of the oriented density distribution function. The growth of the root system is described by a hierarchical system of partial differential equations (PDEs) that incorporate single-root growth parameters, such as elongation rate, gravitropism and branching rate, which appear explicitly as coefficients of the PDEs. Acquisition and transport of nutrients are then modelled by extending Darcy's law to oriented density distribution functions. This framework was applied to build a model of the growth and water uptake of a barley root system. This study shows that simplified and computationally efficient continuous models of root system development can be constructed. Such models will allow the application of root growth models at field scale.
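A toy one-dimensional analogue conveys the flavour of such density-based growth models (this is not the paper's full oriented-density formulation): a root-tip density rho(z, t) is advected downward at the elongation rate e while branching multiplies it at rate b, i.e. d(rho)/dt + e*d(rho)/dz = b*rho. The parameter values below are assumptions for illustration.

```python
import numpy as np

e = 1.0        # elongation rate (cm/day), assumed value
b = 0.1        # branching rate (1/day), assumed value
dz, dt = 0.1, 0.05            # grid spacing and time step (CFL: e*dt/dz <= 1)
z = np.arange(0.0, 20.0, dz)  # soil depth (cm)
rho = np.zeros_like(z)
rho[0] = 1.0                  # tips initially concentrated at the surface

for _ in range(int(5.0 / dt)):  # simulate 5 days of growth
    # First-order upwind advection plus a branching source term; the RHS
    # is evaluated on the old field before assignment.
    rho[1:] = rho[1:] - e * dt / dz * (rho[1:] - rho[:-1]) + b * dt * rho[1:]
    rho[0] *= (1 + b * dt) * (1 - e * dt / dz)  # outflow at the surface node

front = z[np.argmax(rho)]  # depth of the advancing root front (~ e * t)
```

Because the state is a density field rather than an explicit root geometry, the cost is independent of the number of individual roots, which is what makes field-scale application feasible.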
NASA Technical Reports Server (NTRS)
Daiker, Ron; Schnell, Thomas
2010-01-01
A human motor model was developed on the basis of performance data collected in a flight simulator. The motor model is under consideration as one component of a virtual pilot model for the evaluation of NextGen crew alerting and notification systems in flight decks. This model may be used in a digital Monte Carlo simulation to compare flight deck layout design alternatives. The virtual pilot model is being developed as part of a NASA project to evaluate multiple crew alerting and notification flight deck configurations. Model parameters were derived from empirical distributions of pilot data collected in a flight simulator experiment. The goal of this model is to simulate pilot motor performance in the approach-to-landing task. The unique challenges associated with modeling the complex dynamics of humans interacting with the cockpit environment are discussed, along with the current state and future direction of the model.
An integrated mathematical model of the human cardiopulmonary system: model development.
Albanese, Antonio; Cheng, Limei; Ursino, Mauro; Chbat, Nicolas W
2016-04-01
Several cardiovascular and pulmonary models have been proposed in the last few decades. However, very few have addressed the interactions between these two systems. Our group has developed an integrated cardiopulmonary model (CP Model) that mathematically describes the interactions between the cardiovascular and respiratory systems, along with their main short-term control mechanisms. The model has been compared with human and animal data taken from the published literature. Due to the volume of the work, the paper is divided into two parts. The present paper is on model development and normophysiology, whereas the second is on the model's validation under hypoxic and hypercapnic conditions. The CP Model incorporates cardiovascular circulation, respiratory mechanics, tissue and alveolar gas exchange, as well as short-term neural control mechanisms acting on both the cardiovascular and the respiratory functions. The model is able to simulate physiological variables typically observed in adult humans under normal and pathological conditions and to explain the underlying mechanisms and dynamics. Copyright © 2016 the American Physiological Society.
Estimating Traffic Accidents in Turkey Using Differential Evolution Algorithm
NASA Astrophysics Data System (ADS)
Akgüngör, Ali Payıdar; Korkmaz, Ersin
2017-06-01
Estimating traffic accidents plays a vital role in applying road safety procedures. This study proposes Differential Evolution Algorithm (DEA) models to estimate the number of accidents in Turkey. In the model development, population (P) and the number of vehicles (N) are selected as model parameters. Three model forms, linear, exponential and semi-quadratic, are developed using the DEA with data covering the period from 2000 to 2014. The developed models are statistically compared to select the best-fit model. The results of the DE models show that the linear form is suitable for estimating the number of accidents; its statistics are better than those of the other forms in terms of the performance criteria, the Mean Absolute Percentage Error (MAPE) and the Root Mean Square Error (RMSE). To investigate the performance of the linear DE model for future estimations, a ten-year period from 2015 to 2024 is considered. The results obtained from the future estimations confirm the suitability of the DE method for road safety applications.
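Fitting a linear model form with differential evolution can be sketched as follows. This is a generic DE/rand/1/bin loop minimizing the RMSE of A = a*P + b*N + c, with invented data (not the Turkish statistics used in the study) and assumed control parameters, not the study's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative data only: population P (millions), vehicles N (millions),
# accidents A (thousands), constructed to follow a linear relationship.
P = np.array([62.0, 64.0, 66.0, 68.0, 70.0, 72.0])
N = np.array([8.0, 9.0, 10.0, 11.5, 13.0, 14.5])
A = np.array([470.0, 490.0, 510.0, 535.0, 560.0, 585.0])

def rmse(theta):
    """Root Mean Square Error of the linear model form A = a*P + b*N + c."""
    a, b, c = theta
    return np.sqrt(np.mean((a * P + b * N + c - A) ** 2))

# Minimal DE/rand/1/bin: mutation, binomial crossover, greedy selection.
pop_size, dim, F, CR = 20, 3, 0.8, 0.9
pop = rng.uniform(-20, 20, (pop_size, dim))
cost = np.array([rmse(x) for x in pop])
for _ in range(300):
    for i in range(pop_size):
        r1, r2, r3 = rng.choice(
            [j for j in range(pop_size) if j != i], 3, replace=False)
        mutant = pop[r1] + F * (pop[r2] - pop[r3])   # differential mutation
        cross = rng.random(dim) < CR
        cross[rng.integers(dim)] = True              # keep >= 1 mutant gene
        trial = np.where(cross, mutant, pop[i])      # binomial crossover
        tc = rmse(trial)
        if tc < cost[i]:                             # greedy selection
            pop[i], cost[i] = trial, tc

best = pop[np.argmin(cost)]  # fitted (a, b, c)
```

Once fitted, the same linear form can be extrapolated with projected P and N values to produce future estimates, as done in the study for 2015-2024.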