NASA Technical Reports Server (NTRS)
Pace, Dale K.
2000-01-01
A simulation conceptual model is a simulation developer's way of translating modeling requirements (i.e., what is to be represented by the simulation or its modification) into a detailed design framework (i.e., how it is to be done), from which the software, hardware, networks (in the case of distributed simulation), and systems/equipment that will make up the simulation can be built or modified. A conceptual model is the collection of information that describes a simulation developer's concept of the simulation and its pieces. That information consists of assumptions, algorithms, characteristics, relationships, and data. Taken together, these describe how the simulation developer understands what is to be represented by the simulation (entities, actions, tasks, processes, interactions, etc.) and how that representation will satisfy the requirements to which the simulation responds. The conceptual model is thus the basis for judgments about simulation fidelity and validity for any condition that is not specifically tested. The more perspicuous and precise the conceptual model, the more likely it is that the simulation development will both fully satisfy the requirements and allow demonstration that the requirements are satisfied (i.e., validation). Methods used in simulation conceptual model development have significant implications for simulation management and for assessment of simulation uncertainty. This paper suggests how to develop and document a simulation conceptual model so that simulation fidelity and validity can be determined most effectively. These ideas for conceptual model development apply to all varieties of simulation. The paper relates these ideas to uncertainty assessments as they bear on simulation fidelity and validity, and explores the implications of conceptual model development methods for simulation management, especially with respect to reuse of simulation components.
Assessment of Alternative Conceptual Models Using Reactive Transport Modeling with Monitoring Data
NASA Astrophysics Data System (ADS)
Dai, Z.; Price, V.; Heffner, D.; Hodges, R.; Temples, T.; Nicholson, T.
2005-12-01
Monitoring data proved very useful in evaluating alternative conceptual models, simulating contaminant transport behavior, and reducing uncertainty. A graded approach using three alternative conceptual site models was formulated to simulate a field case of tetrachloroethene (PCE) transport and biodegradation. These models ranged from simple to complex in their representation of subsurface heterogeneities. The simplest model was a single-layer homogeneous aquifer that employed an analytical reactive transport code, BIOCHLOR (Aziz et al., 1999). Due to over-simplification of the aquifer structure, this simulation could not reproduce the monitoring data. The second model consisted of a multi-layer conceptual model, in combination with numerical modules, MODFLOW and RT3D within GMS, to simulate flow and reactive transport. Although the simulation results from the second model were comparatively better than those from the simple model, they still did not adequately reproduce the monitoring well concentrations because the geological structures were still inadequately defined. Finally, a more realistic conceptual model was formulated that incorporated heterogeneities and geologic structures identified from well logs and seismic survey data using the Petra and PetraSeis software. This conceptual model included both a major channel and a younger channel that were detected in the PCE source area. In this model, these channels control the local ground-water flow direction and provide a preferential chemical transport pathway. Simulation results using this conceptual site model proved compatible with the monitoring concentration data. This study demonstrates that the bias and uncertainty from inadequate conceptual models are much larger than those introduced from an inadequate choice of model parameter values (Neuman and Wierenga, 2003; Meyer et al., 2004; Ye et al., 2004). 
This case study integrated conceptual and numerical models, based on interpreted local hydrogeologic and geochemical data, with detailed monitoring plume data. It provided key insights for confirming alternative conceptual site models and assessing the performance of monitoring networks. A monitoring strategy based on this graded approach for assessing alternative conceptual models can provide the technical bases for identifying critical monitoring locations, adequate monitoring frequency, and performance indicator parameters for performance monitoring involving ground-water levels and PCE concentrations.
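The simplest of the three conceptual models used an analytical screening code (BIOCHLOR). A calculation of that general kind can be sketched as a steady-state centerline solution for one-dimensional advection-dispersion with first-order biodegradation; this is an illustrative sketch, not the code used in the study, and all parameter values below are invented.

```python
import math

def steady_centerline_conc(c0, x, v, alpha_x, lam):
    """Steady-state centerline concentration for one-dimensional
    advection-dispersion with first-order decay (a Bear-type solution
    of the kind used in screening models such as BIOCHLOR).

    c0      source concentration (mg/L)
    x       distance downgradient of the source (m)
    v       seepage velocity (m/yr)
    alpha_x longitudinal dispersivity (m)
    lam     first-order biodegradation rate (1/yr)
    """
    return c0 * math.exp((x / (2.0 * alpha_x)) *
                         (1.0 - math.sqrt(1.0 + 4.0 * lam * alpha_x / v)))

# Hypothetical parameters: concentration declines monotonically downgradient.
profile = [steady_centerline_conc(10.0, x, 30.0, 10.0, 0.5) for x in (0.0, 50.0, 100.0)]
```

A homogeneous single-layer model of this sort has no way to represent the channel structures that controlled transport at the site, which is why it could not reproduce the monitoring data.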
A Scoping Review: Conceptualizations and Pedagogical Models of Learning in Nursing Simulation
ERIC Educational Resources Information Center
Poikela, Paula; Teräs, Marianne
2015-01-01
Simulations have been implemented globally in nursing education for years with diverse conceptual foundations. The aim of this scoping review is to examine the literature regarding the conceptualizations of learning and pedagogical models in nursing simulations. A scoping review of peer-reviewed articles published between 2000 and 2013 was…
Conceptualization of preferential flow for hillslope stability assessment
NASA Astrophysics Data System (ADS)
Kukemilks, Karlis; Wagner, Jean-Frank; Saks, Tomas; Brunner, Philip
2018-03-01
This study uses two approaches to conceptualize preferential flow with the goal to investigate their influence on hillslope stability. Synthetic three-dimensional hydrogeological models using dual-permeability and discrete-fracture conceptualization were subsequently integrated into slope stability simulations. The slope stability simulations reveal significant differences in slope stability depending on the preferential flow conceptualization applied, despite similar small-scale hydrogeological responses of the system. This can be explained by a local-scale increase of pore-water pressures observed in the scenario with discrete fractures. The study illustrates the critical importance of correctly conceptualizing preferential flow for slope stability simulations. It further demonstrates that the combination of the latest generation of physically based hydrogeological models with slope stability simulations allows for improvement to current modeling approaches through more complex consideration of preferential flow paths.
A conceptual modeling framework for discrete event simulation using hierarchical control structures.
Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D
2015-08-01
Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model's system behavior, control policies, and dispatching routines, and to their structured representation within a conceptual model. The framework guides the user step by step through the modeling process and is illustrated by a worked example.
Navigating Tensions Between Conceptual and Metaconceptual Goals in the Use of Models
NASA Astrophysics Data System (ADS)
Delgado, Cesar
2015-04-01
Science education involves learning about phenomena at three levels: concrete (facts and generalizations), conceptual (concepts and theories), and metaconceptual (epistemology) (Snir et al. in J Sci Educ Technol 2(2):373-388, 1993). Models are key components in science, can help build conceptual understanding, and may also build metaconceptual understanding. Technology can transform teaching and learning by turning models into interactive simulations that learners can investigate. This paper identifies four characteristics of models and simulations that support conceptual learning but misconstrue models and science at a metaconceptual level. Ahistorical models combine the characteristics of several historical models; they conveniently compile ideas but misrepresent the history of science (Gilbert in Int J Sci Math Educ 2(2):115-130, 2004). Teleological models explain behavior in terms of a final cause; they can lead to useful heuristics but imply purpose in processes driven by chance and probability (Talanquer in Int J Sci Educ 29(7):853-870, 2007). Epistemological overreach occurs when models or simulations imply greater certainty and knowledge about phenomena than is warranted; conceptualizing nature as well known (e.g., as having a mathematical structure) poses the danger of conflating model and reality, or data and theory. Finally, models are inevitably ontologically impoverished: real-world deviations and many variables are left out, as the role of models is to simplify, and models and simulations also lose much of the sensory data present in phenomena. Teachers, designers, and professional development designers and facilitators must thus navigate the tension between conceptual and metaconceptual learning when using models and simulations. For each characteristic, examples are provided along with recommendations for instruction and design, and prompts for explicit reflective activities around models.
Asymmetric Eyewall Vertical Motion in a High-Resolution Simulation of Hurricane Bonnie (1998)
NASA Technical Reports Server (NTRS)
Braun, Scott A.; Montgomery, Michael T.; Pu, Zhao-Xia
2003-01-01
This study examines a high-resolution simulation of Hurricane Bonnie. Results from the simulation will be compared to the conceptual model of Heymsfield et al. (2001) to determine the extent to which this conceptual model explains vertical motions and precipitation growth in the eyewall.
Conceptual Modeling Framework for E-Area PA HELP Infiltration Model Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dyer, J. A.
A conceptual modeling framework based on the proposed E-Area Low-Level Waste Facility (LLWF) closure cap design is presented for conducting Hydrologic Evaluation of Landfill Performance (HELP) model simulations of intact and subsided cap infiltration scenarios for the next E-Area Performance Assessment (PA).
Soulis, Konstantinos X; Valiantzas, John D; Ntoulas, Nikolaos; Kargas, George; Nektarios, Panayiotis A
2017-09-15
In spite of the well-known benefits of green roofs, their widespread adoption in urban drainage management practices requires adequate analytical and modelling tools. In the current study, green roof runoff modelling was accomplished by developing, testing, and jointly using a simple conceptual model and a physically based numerical simulation model utilizing the HYDRUS-1D software. This approach combines the advantages of the conceptual model, namely simplicity, low computational requirements, and ability to be easily integrated into decision support tools, with the capacity of the physically based simulation model to be easily transferred to conditions and locations other than those used for calibrating and validating it. The proposed approach was evaluated with an experimental dataset that included various green roof covers (either succulent plants - Sedum sediforme, or xerophytic plants - Origanum onites, or bare substrate without any vegetation) and two substrate depths (either 8 cm or 16 cm). Both the physically based and the conceptual models matched the observed hydrographs very closely. In general, the conceptual model performed better than the physically based simulation model, but the overall performance of both models was sufficient in most cases, as revealed by the Nash-Sutcliffe Efficiency index, which was generally greater than 0.70. Finally, it was showcased how a physically based and a simple conceptual model can be jointly used to extend the applicability of the simple conceptual model to a wider set of conditions than the available experimental data and to support green roof design.
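The Nash-Sutcliffe Efficiency index used above as the goodness-of-fit criterion has a simple closed form; a minimal sketch follows, with invented runoff values for illustration.

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 minus the ratio of the sum of squared
    model errors to the variance of the observations about their mean.
    1.0 is a perfect fit; values above ~0.70 are often taken as sufficient."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst

# Hypothetical runoff depths (mm): a close simulation scores near 1.0.
obs = [0.0, 2.1, 5.4, 3.2, 1.0]
sim = [0.1, 2.0, 5.0, 3.5, 0.8]
nse = nash_sutcliffe(obs, sim)
```

Note that a model that simply predicts the mean of the observations scores 0.0, which is why values well above zero (here, above 0.70) are required to claim skill.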
ERIC Educational Resources Information Center
Chen, Yu-Lung; Pan, Pei-Rong; Sung, Yao-Ting; Chang, Kuo-En
2013-01-01
Computer simulation has significant potential as a supplementary tool for effective conceptual-change learning based on the integration of technology and appropriate instructional strategies. This study elucidates misconceptions in learning on diodes and constructs a conceptual-change learning system that incorporates…
Conceptual modeling for Prospective Health Technology Assessment.
Gantner-Bär, Marion; Djanatliev, Anatoli; Prokosch, Hans-Ulrich; Sedlmayr, Martin
2012-01-01
Prospective Health Technology Assessment (ProHTA) is a new and innovative approach to analyzing and assessing new technologies, methods, and procedures in health care. Simulation processes are used to model innovations before the cost-intensive design and development phase. Thus, effects on patient care, the health care system, and health-economic aspects can be estimated. Generating simulation models requires a valid information base, for which conceptual modeling is well suited. Methods and characteristics of simulation modeling, improved specifically for the project, are combined in the ProHTA Conceptual Modeling Process and were initially implemented for acute ischemic stroke treatment in Germany. The project also aims at simulating other diseases and health care systems. ProHTA is an interdisciplinary research project within the Cluster of Excellence for Medical Technology - Medical Valley European Metropolitan Region Nuremberg (EMN), funded by the German Federal Ministry of Education and Research (BMBF), project grant No. 01EX1013B.
Evaluation of a distributed catchment scale water balance model
NASA Technical Reports Server (NTRS)
Troch, Peter A.; Mancini, Marco; Paniconi, Claudio; Wood, Eric F.
1993-01-01
The validity of some of the simplifying assumptions in a conceptual water balance model is investigated by comparing simulation results from the conceptual model with simulation results from a three-dimensional physically based numerical model and with field observations. We examine, in particular, assumptions and simplifications related to water table dynamics, vertical soil moisture and pressure head distributions, and subsurface flow contributions to stream discharge. The conceptual model relies on a topographic index to predict saturation excess runoff and on Philip's infiltration equation to predict infiltration excess runoff. The numerical model solves the three-dimensional Richards equation describing flow in variably saturated porous media, and handles seepage face boundaries, infiltration excess and saturation excess runoff production, and soil driven and atmosphere driven surface fluxes. The study catchments (a 7.2 sq km catchment and a 0.64 sq km subcatchment) are located in the North Appalachian ridge and valley region of eastern Pennsylvania. Hydrologic data collected during the MACHYDRO 90 field experiment are used to calibrate the models and to evaluate simulation results. It is found that water table dynamics as predicted by the conceptual model are close to the observations in a shallow water well; a linear relationship between a topographic index and the local water table depth therefore appears to be a reasonable assumption for catchment scale modeling. However, the hydraulic equilibrium assumption is not valid for the upper 100 cm layer of the unsaturated zone, and a conceptual model that incorporates a root zone is suggested. Furthermore, theoretical subsurface flow characteristics from the conceptual model are found to differ from field observations, numerical simulation results, and theoretical baseflow recession characteristics based on Boussinesq's groundwater equation.
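The linear relationship between topographic index and local water table depth that the conceptual model relies on can be sketched in a generic TOPMODEL-like form; this is an illustration with invented parameter values, not the study's calibrated model.

```python
def local_water_table_depth(ti, ti_mean, z_mean, f):
    """TOPMODEL-style linear relation: a point with a higher topographic
    index ti = ln(a / tan(beta)) than the catchment mean is predicted to
    have a shallower water table than the mean depth z_mean (m).
    f is a scaling parameter controlling how strongly depth varies with ti."""
    return z_mean - (ti - ti_mean) / f

# Hypothetical values: a convergent valley-bottom point (high ti) has a
# shallower predicted water table than a steep upslope point (low ti).
depth_valley = local_water_table_depth(ti=9.0, ti_mean=7.0, z_mean=1.5, f=4.0)
depth_ridge = local_water_table_depth(ti=5.0, ti_mean=7.0, z_mean=1.5, f=4.0)
```

The comparison against well observations reported above is essentially a test of whether this single linear mapping captures the spatial pattern of water table response.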
A Formal Modelling Language Extending SysML for Simulation of Continuous and Discrete System
2012-11-01
UNCLASSIFIED DSTO-GD-0734. A Formal Modelling Language Extending SysML for Simulation of Continuous and Discrete System – Mark Hodson and … be conceptual at some level because a one-to-one mapping with the real system will never exist. SysML is an extension and modification of UML that … simulation, which can provide great insights into the behaviour of complex systems. Although UML and SysML primarily support conceptual modelling, they …
Identifying Hydrologic Processes in Agricultural Watersheds Using Precipitation-Runoff Models
Linard, Joshua I.; Wolock, David M.; Webb, Richard M.T.; Wieczorek, Michael
2009-01-01
Understanding the fate and transport of agricultural chemicals applied to agricultural fields will assist in designing the most effective strategies to prevent water-quality impairments. At a watershed scale, the processes controlling the fate and transport of agricultural chemicals are generally understood only conceptually. To examine how well conceptual models reflect the processes actually occurring, two precipitation-runoff models - the Soil and Water Assessment Tool (SWAT) and the Water, Energy, and Biogeochemical Model (WEBMOD) - were applied in different agricultural settings of the contiguous United States. Each model, through different physical processes, simulated the transport of water to a stream from the surface, the unsaturated zone, and the saturated zone. Models were calibrated for watersheds in Maryland, Indiana, and Nebraska. The calibrated sets of input parameters for each model at each watershed are discussed, and the criteria used to validate the models are explained. The SWAT and WEBMOD model results at each watershed conformed to each other and to the processes identified in each watershed's conceptual hydrology. In Maryland, the conceptual understanding of the hydrology indicated that groundwater flow was the largest annual source of streamflow; the simulation results for the validation period confirm this. The dominant source of water to the Indiana watershed was thought to be tile drains. Although tile drains were not explicitly simulated in the SWAT model, a large component of streamflow was received as lateral flow, which could be attributed to tile drains. Because WEBMOD can explicitly account for tile drains, it indicated that water from tile drains constituted most of the annual streamflow in the Indiana watershed. The Nebraska models indicated that annual streamflow was composed primarily of perennial groundwater flow and infiltration-excess runoff, which conformed to the conceptual hydrology developed for that watershed.
The hydrologic processes represented in the parameter sets resulting from each model were comparable at individual watersheds, but varied between watersheds. The models were unable to show, however, whether hydrologic processes other than those included in the original conceptual models were major contributors to streamflow. Supplemental simulations of agricultural chemical transport could improve the ability to assess conceptual models.
2014-09-18
… and full-scale experimental verifications towards ground-satellite quantum key distribution … Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification – Dissertation, Jeffrey D. Morris. Presented to the Faculty, Department of Systems …
NASA Astrophysics Data System (ADS)
Chang, Yong; Wu, Jichun; Jiang, Guanghui; Kang, Zhiqiang
2017-05-01
Conceptual models often suffer from over-parameterization due to the limited data available for calibration. This leads to parameter non-uniqueness and equifinality, which can introduce considerable uncertainty into the simulation results. Identifying the model structure that is actually supported by the available data remains a major challenge in hydrological research. In this paper, we adopt a multi-model framework to identify the dominant hydrological processes and an appropriate model structure for a karst spring located in Guilin city, China. For this catchment, the spring discharge is the only data available for model calibration. The framework starts with a relatively complex conceptual model based on the perception of the catchment, which is then simplified into several different models by gradually removing model components. A multi-objective approach is used to compare the performance of these models, and regional sensitivity analysis (RSA) is used to investigate parameter identifiability. The results show that this karst spring is mainly controlled by two different hydrological processes, one of which is threshold-driven, consistent with the fieldwork investigation. However, the appropriate model structure for simulating the discharge of this spring is much simpler than the actual aquifer structure and the hydrological process understanding from the fieldwork investigation: a single linear reservoir with two different outlets is enough to simulate the spring discharge. The detailed runoff processes in the catchment are not needed in the conceptual model to simulate the spring discharge. A more complex model would require additional data to avoid serious deterioration of model predictions.
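A linear reservoir with two outlets of the kind found adequate here can be sketched as follows; the specific structure (a threshold-activated upper outlet) and all coefficients are illustrative assumptions, not the calibrated model from the paper.

```python
def simulate_spring(recharge, k_low, k_high, s_threshold, dt=1.0):
    """Single storage reservoir with two linear outlets: a lower outlet
    that is always active (q = k_low * S) and an upper outlet that only
    drains storage above a threshold, giving threshold-driven behaviour.
    Explicit-Euler stepping; units are arbitrary for illustration."""
    storage = 0.0
    discharge = []
    for r in recharge:
        q_low = k_low * storage
        q_high = k_high * max(storage - s_threshold, 0.0)
        storage = max(storage + (r - q_low - q_high) * dt, 0.0)
        discharge.append(q_low + q_high)
    return discharge

# Hypothetical daily recharge pulse: discharge rises sharply once storage
# exceeds the threshold, then recedes - the threshold-driven signature.
q = simulate_spring([5.0, 20.0, 0.0, 0.0, 0.0],
                    k_low=0.1, k_high=0.4, s_threshold=10.0)
```

The appeal of such a structure is exactly the paper's point: two outlet coefficients and one threshold are all the spring discharge record can constrain.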
Johnson, R.H.; Poeter, E.P.
2007-01-01
Perchloroethylene (PCE) saturations determined from GPR surveys were used as observations for inversion of multiphase flow simulations of a PCE injection experiment (the Borden 9 m cell), allowing estimation of optimal bulk intrinsic permeability values. The resulting fit statistics and analysis of residuals (observed minus simulated PCE saturations) were used to improve the conceptual model. These improvements included adjustment of the elevation of a permeability contrast, use of the van Genuchten versus Brooks-Corey capillary pressure-saturation curve, and a weighting scheme to account for greater measurement error at larger saturation values. A limitation in determining PCE saturations through one-dimensional GPR modeling is non-uniqueness when multiple GPR parameters are unknown (i.e., permittivity, depth, and gain function). Site knowledge, fixing the gain function, and multiphase flow simulations assisted in evaluating non-unique conceptual models of PCE saturation, where depth and layering were reinterpreted to provide alternate conceptual models. Remaining bias in the residuals is attributed to the violation of assumptions in the one-dimensional GPR interpretation (which assumes flat, infinite, horizontal layering) resulting from multidimensional influences that were not included in the conceptual model. While the limitations and errors in using GPR data as observations for inverse multiphase flow simulations are frustrating and difficult to quantify, simulation results indicate that the error and bias in the PCE saturation values are small enough to still provide reasonable optimal permeability values. The effort to improve model fit and reduce residual bias decreases simulation error even for an inversion based on biased observations and provides insight into alternate GPR data interpretations. Thus, this effort is warranted and provides information on bias in the observation data when this bias is otherwise difficult to assess.
Uses of Computer Simulation Models in Ag-Research and Everyday Life
USDA-ARS?s Scientific Manuscript database
When the news media talks about models they could be talking about role models, fashion models, conceptual models like the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...
Integrating Flight Dynamics & Control Analysis and Simulation in Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Lawrence, Ben; Berger, Tom; Tischler, Mark B.; Theodore, Colin R.; Elmore, Josh; Gallaher, Andrew; Tobias, Eric L.
2016-01-01
The development of a toolset, SIMPLI-FLYD ('SIMPLIfied FLight dynamics for conceptual Design') is described. SIMPLI-FLYD is a collection of tools that perform flight dynamics and control modeling and analysis of rotorcraft conceptual designs including a capability to evaluate the designs in an X-Plane-based real-time simulation. The establishment of this framework is now facilitating the exploration of this new capability, in terms of modeling fidelity and data requirements, and the investigation of which stability and control and handling qualities requirements are appropriate for conceptual design. Illustrative design variation studies for single main rotor and tiltrotor vehicle configurations show sensitivity of the stability and control characteristics and an approach to highlight potential weight savings by identifying over-design.
NASA Astrophysics Data System (ADS)
Krause, Lee S.; Burns, Carla L.
2000-06-01
This paper discusses the research currently in progress to develop the Conceptual Federation Object Model Design Tool. The objective of the Conceptual FOM (C-FOM) Design Tool effort is to provide domain and subject matter experts, such as scenario developers, with automated support for understanding and utilizing available HLA simulation and other simulation assets during HLA Federation development. The C-FOM Design Tool will import Simulation Object Models from HLA reuse repositories, such as the MSSR, to populate the domain space that will contain all the objects and their supported interactions. In addition, the C-FOM tool will support the conversion of non-HLA legacy models into HLA- compliant models by applying proven abstraction techniques against the legacy models. Domain experts will be able to build scenarios based on the domain objects and interactions in both a text and graphical form and export a minimal FOM. The ability for domain and subject matter experts to effectively access HLA and non-HLA assets is critical to the long-term acceptance of the HLA initiative.
Opportunities and Challenges in Supply-Side Simulation: Physician-Based Models
Gresenz, Carole Roan; Auerbach, David I; Duarte, Fabian
2013-01-01
Objective: To provide a conceptual framework and to assess the availability of empirical data for supply-side microsimulation modeling in the context of health care.
Data Sources: Multiple secondary data sources, including the American Community Survey, Health Tracking Physician Survey, and SK&A physician database.
Study Design: We apply our conceptual framework to one entity in the health care market (physicians) and identify, assess, and compare data available for physician-based simulation models.
Principal Findings: Our conceptual framework describes three broad types of data required for supply-side microsimulation modeling. Our assessment of available data for modeling physician behavior suggests broad comparability across various sources on several dimensions and highlights the need for significant integration of data across multiple sources to provide a platform adequate for modeling. A growing literature provides potential estimates for use as behavioral parameters that could serve as the models' engines. Sources of data for simulation modeling that account for the complex organizational and financial relationships among physicians and other supply-side entities are limited.
Conclusions: A key challenge for supply-side microsimulation modeling is optimally combining available data to harness their collective power. Several possibilities also exist for novel data collection. These have the potential to serve as catalysts for the next generation of supply-side-focused simulation models to inform health policy.
PMID: 23347041
ERIC Educational Resources Information Center
Yin, Chengjiu; Song, Yanjie; Tabata, Yoshiyuki; Ogata, Hiroaki; Hwang, Gwo-Jen
2013-01-01
This paper proposes a conceptual framework, scaffolding participatory simulation for mobile learning (SPSML), used on mobile devices for helping students learn conceptual knowledge in the classroom. As the pedagogical design, the framework adopts an experiential learning model, which consists of five sequential but cyclic steps: the initial stage,…
A Conceptual View of the Officer Procurement Model (TOPOPS). Technical Report No. 73-73.
ERIC Educational Resources Information Center
Akman, Allan; Nordhauser, Fred
This report presents the conceptual design of a computer-based linear programming model of the Air Force officer procurement system called TOPOPS. The TOPOPS model is an aggregate model that simulates officer accession and training and is directed at optimizing officer procurement in terms of either minimizing cost or maximizing accession quality…
NASA Astrophysics Data System (ADS)
Silva, F. E. O. E.; Naghettini, M. D. C.; Fernandes, W.
2014-12-01
This paper evaluated the uncertainties associated with estimating the parameters of a conceptual rainfall-runoff model, using Bayesian inference techniques with Monte Carlo simulation. The Pará River sub-basin, located in the upper São Francisco river basin in southeastern Brazil, was selected for the study. We used the Rio Grande conceptual hydrologic model (EHR/UFMG, 2001) and the Markov Chain Monte Carlo simulation method DREAM (VRUGT, 2008a). Two probabilistic models for the residuals were analyzed: (i) the classic normal likelihood [r ~ N(0, σ²)]; and (ii) a generalized likelihood (SCHOUPS & VRUGT, 2010), in which the differences between observed and simulated flows are assumed to be correlated, non-stationary, and distributed as a skew exponential power density. The assumptions made for both models were checked to ensure that the estimation of parameter uncertainties was not biased. The results showed that the Bayesian approach was adequate for the proposed objectives, reinforcing the importance of assessing the uncertainties associated with hydrological modeling.
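The classic normal-likelihood setup can be illustrated with a bare-bones random-walk Metropolis sampler (DREAM itself is a more sophisticated adaptive multi-chain algorithm); the toy one-parameter "model" and all numbers here are invented for illustration.

```python
import math
import random

def log_likelihood(theta, obs, model, sigma):
    """Classic choice: residuals r = obs - sim assumed i.i.d. N(0, sigma^2)."""
    return sum(-0.5 * ((o - s) / sigma) ** 2 for o, s in zip(obs, model(theta)))

def metropolis(obs, model, sigma, theta0, step, n_iter, seed=1):
    """Random-walk Metropolis over a single parameter with a flat prior."""
    rng = random.Random(seed)
    theta = theta0
    ll = log_likelihood(theta, obs, model, sigma)
    chain = []
    for _ in range(n_iter):
        proposal = theta + rng.gauss(0.0, step)
        ll_prop = log_likelihood(proposal, obs, model, sigma)
        if math.log(rng.random()) < ll_prop - ll:  # accept or reject
            theta, ll = proposal, ll_prop
        chain.append(theta)
    return chain

# Toy linear "rainfall-runoff" model q = k * p with true k = 2.0: the
# post-burn-in chain concentrates around the true parameter value.
rain = [1.0, 2.0, 3.0, 4.0]
flows = [2.0, 4.0, 6.0, 8.0]
chain = metropolis(flows, lambda k: [k * p for p in rain], sigma=0.2,
                   theta0=1.0, step=0.1, n_iter=3000)
```

The spread of the post-burn-in chain is exactly the parameter uncertainty the paper assesses; checking the residual assumptions matters because a misspecified likelihood biases that spread.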
Analyzing Interaction Patterns to Verify a Simulation/Game Model
ERIC Educational Resources Information Center
Myers, Rodney Dean
2012-01-01
In order for simulations and games to be effective for learning, instructional designers must verify that the underlying computational models being used have an appropriate degree of fidelity to the conceptual models of their real-world counterparts. A simulation/game that provides incorrect feedback is likely to promote misunderstanding and…
Towards improving software security by using simulation to inform requirements and conceptual design
Nutaro, James J.; Allgood, Glenn O.; Kuruganti, Teja
2015-06-17
We illustrate the use of modeling and simulation early in the system life-cycle to improve security and reduce costs. The models that we develop for this illustration are inspired by problems in reliability analysis and supervisory control, for which similar models are used to quantify failure probabilities and rates. In the context of security, we propose that models of this general type can be used to understand trades between risk and cost while writing system requirements and during conceptual design, and thereby significantly reduce the need for expensive security corrections after a system enters operation.
Development and testing of a fast conceptual river water quality model.
Keupers, Ingrid; Willems, Patrick
2017-04-15
Modern, model-based river quality management strongly relies on river water quality models to simulate the temporal and spatial evolution of pollutant concentrations in the water body. Such models are typically constructed by extending detailed hydrodynamic models with a component describing the advection-diffusion and water quality transformation processes in a detailed, physically based way. This approach is too computationally demanding, especially when simulating the long time periods needed for statistical analysis of the results, or when model sensitivity analysis, calibration and validation require a large number of model runs. To overcome this problem, a structure identification method for setting up a conceptual river water quality model has been developed. Instead of calculating the water quality concentrations at each water level and discharge node, the river branch is divided into conceptual reservoirs based on user information such as locations of interest and boundary inputs. These reservoirs are modelled as Plug Flow Reactors (PFRs) and Continuously Stirred Tank Reactors (CSTRs) to describe advection and diffusion processes. The same water quality transformation processes as in the detailed models are considered, but with adjusted residence times based on the hydrodynamic simulation results and calibrated to the detailed water quality simulation results. The developed approach allows for a much faster calculation time (by a factor of about 10^5) without significant loss of accuracy, making it feasible to perform time-demanding scenario runs.
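The CSTR half of this conceptual structure can be sketched in a few lines. The reactor count, residence time, and decay rate below are hypothetical placeholders rather than the paper's calibrated values; the point is only how a chain of stirred reactors delays, smooths, and attenuates a pollutant signal:

```python
import numpy as np

def cstr_chain(c_in, n_reactors, residence_time, k_decay, dt):
    """Route an input concentration series through n CSTRs in series.
    Each reactor obeys dC/dt = (C_in - C)/tau - k*C (first-order decay)."""
    tau = residence_time / n_reactors   # residence time per reactor
    c = np.zeros(n_reactors)
    out = []
    for cin in c_in:
        upstream = cin
        for i in range(n_reactors):
            # Explicit Euler step for one reactor, fed by the one upstream
            c[i] += dt * ((upstream - c[i]) / tau - k_decay * c[i])
            upstream = c[i]
        out.append(c[-1])
    return np.array(out)

# Step input: the chain delays and attenuates the signal; steady-state
# outflow is (1/(1 + k*tau))^n times the inflow concentration
inflow = np.ones(200)
outflow = cstr_chain(inflow, n_reactors=3, residence_time=5.0,
                     k_decay=0.1, dt=0.1)
```

Replacing the per-reactor update with a pure time delay would give the PFR behaviour; real conceptual models mix both to match the detailed simulation's residence-time distribution.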
Zyvoloski, G.; Kwicklis, E.; Eddebbarh, A.-A.; Arnold, B.; Faunt, C.; Robinson, B.A.
2003-01-01
This paper presents several different conceptual models of the Large Hydraulic Gradient (LHG) region north of Yucca Mountain and describes the impact of those models on groundwater flow near the potential high-level repository site. The results are based on a numerical model of the site-scale saturated zone beneath Yucca Mountain. This model is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. The numerical model is calibrated by matching available water level measurements using parameter estimation techniques, along with more informal comparisons of the model to hydrologic and geochemical information. The model software (the hydrologic simulation code FEHM and the parameter estimation software PEST) and model setup allow for efficient calibration of multiple conceptual models. Until now, the Large Hydraulic Gradient has been simulated using a low-permeability, east-west oriented feature, even though direct evidence for this feature is lacking. In addition to this model, we investigate and calibrate three additional conceptual models of the Large Hydraulic Gradient, all of which are based on a presumed zone of hydrothermal chemical alteration north of Yucca Mountain. After examining the heads and permeabilities obtained from the calibrated models, we present particle pathways from the potential repository that record differences in the predicted groundwater flow regime. The results show that the Large Hydraulic Gradient can be represented with the alternate conceptual models that include the hydrothermally altered zone. The predicted pathways are mildly sensitive to the choice of conceptual model and more sensitive to the quality of calibration in the vicinity of the repository. These differences are most likely due to different degrees of fit of model to data, and do not represent important differences in hydrologic conditions among the conceptual models.
On the Performance of Alternate Conceptual Ecohydrological Models for Streamflow Prediction
NASA Astrophysics Data System (ADS)
Naseem, Bushra; Ajami, Hoori; Cordery, Ian; Sharma, Ashish
2016-04-01
A merging of a lumped conceptual hydrological model with two conceptual dynamic vegetation models is presented to assess the performance of these models for simultaneous simulation of streamflow and leaf area index (LAI). Two conceptual dynamic vegetation models with differing representations of ecological processes are merged with a lumped conceptual hydrological model (HYMOD) to predict catchment-scale streamflow and LAI. The merged RR-LAI-I model computes relative leaf biomass based on transpiration rates, while the RR-LAI-II model computes above-ground green and dead biomass based on net primary productivity and water use efficiency in response to soil moisture dynamics. To assess the performance of these models, daily discharge and the 8-day MODIS LAI product for 27 catchments of 90-1600 km² located in the Murray-Darling Basin in Australia are used. Our results illustrate that when single-objective optimisation focussed on maximising the objective function for streamflow or LAI, the other, un-calibrated predicted outcome (LAI if streamflow is the focus) was consistently compromised. Thus, single-objective optimisation cannot take into account the essence of all processes in the conceptual ecohydrological models. However, multi-objective optimisation showed great strength for streamflow and LAI predictions. Both response outputs were better simulated by RR-LAI-II than RR-LAI-I due to the better representation of physical processes such as net primary productivity (NPP) in RR-LAI-II. Our results highlight that simultaneous calibration of streamflow and LAI using a multi-objective algorithm proves to be an attractive tool for improved streamflow predictions.
Eco-Logic: Logic-Based Approaches to Ecological Modelling
Daniel L. Schmoldt
1991-01-01
This paper summarizes the simulation research carried out during 1984-1989 at the University of Edinburgh. Two primary objectives of their research are 1) to provide tools for manipulating simulation models (i.e., implementation tools) and 2) to provide advice on conceptualizing real-world phenomena into an idealized representation for simulation (i.e., model design...
NASA Technical Reports Server (NTRS)
Strutzenberg, L. L.; Dougherty, N. S.; Liever, P. A.; West, J. S.; Smith, S. D.
2007-01-01
This paper details advances being made in the development of Reynolds-Averaged Navier-Stokes numerical simulation tools, models, and methods for the integrated Space Shuttle Vehicle at launch. The conceptual model and modeling approach described includes the development of multiple computational models to appropriately analyze the potential debris transport for critical debris sources at Lift-Off. The conceptual model described herein involves the integration of propulsion analysis for the nozzle/plume flow with the overall 3D vehicle flowfield at Lift-Off. Debris Transport Analyses are being performed using the Shuttle Lift-Off models to assess the risk to the vehicle from Lift-Off debris and to appropriately prioritize mitigation of potential debris sources, continuing to reduce vehicle risk. These integrated simulations are being used to evaluate plume-induced debris environments where the multi-plume interactions with the launch facility can potentially accelerate debris particles toward the vehicle.
Calibration of an Unsteady Groundwater Flow Model for a Complex, Strongly Heterogeneous Aquifer
NASA Astrophysics Data System (ADS)
Curtis, Z. K.; Liao, H.; Li, S. G.; Phanikumar, M. S.; Lusch, D.
2016-12-01
Modeling of groundwater systems characterized by complex three-dimensional structure and heterogeneity remains a significant challenge. Most of today's groundwater models are developed from relatively simple conceptual representations in favor of ease of calibration. As more complexities are modeled, e.g., by adding more layers and/or zones, or introducing transient processes, more parameters have to be estimated, and issues related to ill-posed groundwater problems and non-unique calibration arise. Here, we explore the use of an alternative conceptual representation for groundwater modeling that is fully three-dimensional and can capture complex 3D heterogeneity (both systematic and "random") without over-parameterizing the aquifer system. In particular, we apply Transition Probability (TP) geostatistics to high-resolution borehole data from a water well database to characterize the complex 3D geology. Different aquifer material classes, e.g., "AQ" (aquifer material), "MAQ" (marginal aquifer material), "PCM" (partially confining material), and "CM" (confining material), are simulated, with the hydraulic properties of each material type as tuning parameters during calibration. The TP-based approach is applied to simulate unsteady groundwater flow in a large, complex, and strongly heterogeneous glacial aquifer system in Michigan across multiple spatial and temporal scales. The resulting model is calibrated to observed static water level data over a time span of 50 years. The results show that the TP-based conceptualization enables much more accurate and robust calibration/simulation than that based on conventional deterministic layer/zone-based conceptual representations.
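The Markov-chain core of a transition-probability simulation can be sketched as follows. The transition matrix below is invented for illustration only; a real TP model derives transition rates from borehole lag statistics and simulates in 3D with conditioning data, not a single unconditioned column:

```python
import numpy as np

rng = np.random.default_rng(42)

# The four material classes from the abstract, with a hypothetical
# cell-to-cell vertical transition matrix (each row sums to 1)
facies = ["AQ", "MAQ", "PCM", "CM"]
P = np.array([
    [0.80, 0.10, 0.07, 0.03],   # from AQ
    [0.15, 0.70, 0.10, 0.05],   # from MAQ
    [0.05, 0.10, 0.75, 0.10],   # from PCM
    [0.02, 0.08, 0.15, 0.75],   # from CM
])

def simulate_column(n_cells, start=0):
    """Draw a facies column cell by cell from the Markov chain."""
    states = [start]
    for _ in range(n_cells - 1):
        states.append(rng.choice(4, p=P[states[-1]]))
    return [facies[s] for s in states]

column = simulate_column(50)
```

The diagonal entries control the mean thickness of each facies body (larger values give more persistent units), which is the property TP geostatistics fits to the borehole data.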
School Finance Reform: Decoding the Simulation Maze
ERIC Educational Resources Information Center
Jargowsky, Peter; And Others
1977-01-01
Demonstrates the mathematical equivalence of various school finance equalization formulas, describes the elements that complicate the preparation of a generalized simulation capability, and briefly presents a conceptualization of a generalized simulation model. (JG)
Thermohydrology of fractured geologic materials
NASA Astrophysics Data System (ADS)
Esh, David Whittaker
1998-11-01
Thermohydrological and thermohydrochemical modeling as applied to the disposal of radioactive materials in a geologic repository is presented. Site hydrology, chemistry, and mineralogy were summarized, and conceptual models of the fundamental system processes were developed. The numerical model TOUGH2 was used to complete computer simulations of thermohydrological processes in fractured geologic media. Sensitivity studies were developed to investigate the impact on thermohydrological response of dimensionality and of the different conceptual models used to represent fractures (ECM, DK, MINC). Sensitivity to parameter variation within a given conceptual model was also considered. The sensitivity of response was examined against thermohydrological metrics derived from the flow and redistribution of moisture. A simple thermohydrochemical model to investigate a three-process coupling (thermal-hydrological-chemical) was presented. The redistribution of chloride was evaluated because its chemical behavior is well known and defensible; in addition, chloride is very important to overall system performance. For all of the simulations completed, chloride was found to be extremely concentrated in the fluids that eventually return to the engineered barrier system. Chloride concentration and mass flux were increased from ambient by over a factor of 1000 for some simulations. Thermohydrology was found to have the potential to significantly alter chemistry from ambient conditions.
An Interactive Simulation Program for Exploring Computational Models of Auto-Associative Memory.
Fink, Christian G
2017-01-01
While neuroscience students typically learn about activity-dependent plasticity early in their education, they often struggle to conceptually connect modification at the synaptic scale with network-level neuronal dynamics, not to mention with their own everyday experience of recalling a memory. We have developed an interactive simulation program (based on the Hopfield model of auto-associative memory) that enables the user to visualize the connections generated by any pattern of neural activity, as well as to simulate the network dynamics resulting from such connectivity. An accompanying set of student exercises introduces the concepts of pattern completion, pattern separation, and sparse versus distributed neural representations. Results from a conceptual assessment administered before and after students worked through these exercises indicate that the simulation program is a useful pedagogical tool for illustrating fundamental concepts of computational models of memory.
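The pattern-completion behavior described above follows directly from the Hopfield model's Hebbian weight rule. The sketch below is not the authors' program, only a minimal illustration of the same idea: store a pattern as an outer-product weight matrix, then let the network dynamics complete a corrupted cue:

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: sum of outer products, zero self-connections."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / len(patterns)

def recall(W, state, max_steps=10):
    """Synchronous sign updates until the state stops changing."""
    s = state.copy()
    for _ in range(max_steps):
        new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    return s

# Store one +/-1 pattern, then complete a corrupted cue (pattern completion)
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train_hopfield(pattern[None, :])
cue = pattern.copy()
cue[:2] *= -1                # flip two bits
restored = recall(W, cue)    # network dynamics restore the stored pattern
```

Storing several overlapping patterns in the same matrix and probing with ambiguous cues would demonstrate the pattern-separation and sparse-versus-distributed contrasts mentioned in the exercises.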
The morphodynamics and sedimentology of large river confluences
NASA Astrophysics Data System (ADS)
Nicholas, Andrew; Sambrook Smith, Greg; Best, James; Bull, Jon; Dixon, Simon; Goodbred, Steven; Sarker, Mamin; Vardy, Mark
2017-04-01
Confluences are key locations within large river networks, yet surprisingly little is known about how they migrate and evolve through time. Moreover, because confluence sites are associated with scour pools that are typically several times the mean channel depth, the deposits associated with such scours should have a high potential for preservation within the rock record. However, paradoxically, such scours are rarely observed, and the sedimentological characteristics of such deposits are poorly understood. This study reports results from a physically-based morphodynamic model, which is applied to simulate the evolution and resulting alluvial architecture associated with large river junctions. Boundary conditions within the model simulation are defined to approximate the junction of the Ganges and Jamuna rivers, in Bangladesh. Model results are supplemented by geophysical datasets collected during boat-based surveys at this junction. Simulated deposit characteristics and geophysical datasets are compared with three existing and contrasting conceptual models that have been proposed to represent the sedimentary architecture of confluence scours. Results illustrate that existing conceptual models may be overly simplistic, although elements of each of the three conceptual models are evident in the deposits generated by the numerical simulation. The latter are characterised by several distinct styles of sedimentary fill, which can be linked to particular morphodynamic behaviours. However, the preserved characteristics of simulated confluence deposits vary substantially according to the degree of reworking by channel migration. This may go some way towards explaining the confluence scour paradox: while abundant large scours might be expected in the rock record, they are rarely reported.
NASA Astrophysics Data System (ADS)
Yang, J.; Zammit, C.; McMillan, H. K.
2016-12-01
As in most countries worldwide, water management in lowland areas is a major concern for New Zealand due to its economic importance for water-related human activities. As a result, the estimation of available water resources in these areas (e.g., for irrigation and water supply purposes) is crucial and often requires an understanding of complex hydrological processes, characterized by strong interactions between surface water and groundwater (usually expressed as losing and gaining rivers). These processes are often represented and simulated using integrated, physically based hydrological models. However, models with physically based groundwater modules typically require large amounts of geologic and aquifer information that are not readily available, and are computationally intensive. Instead, this paper presents a conceptual groundwater model that is fully integrated into New Zealand's national hydrological model TopNet, which is based on TopModel concepts (Beven, 1992). Within this conceptual framework, the integrated model can simulate not only surface processes, but also groundwater processes and surface water-groundwater interaction processes (including groundwater flow, river-groundwater interaction, and groundwater interaction with external watersheds). The developed model was applied to two New Zealand catchments with different hydro-geological and climate characteristics (Pareora catchment in the Canterbury Plains and Grey catchment on the West Coast). Previous studies have documented strong interactions between the river and groundwater, based on the analysis of a large number of concurrent flow measurements and associated information along the river main stem. Application of the integrated hydrological model indicates that flow simulations (compared to the original hydrological model conceptualisation) during low-flow conditions are significantly improved, and further insights on local river dynamics are gained.
Due to its conceptual characteristics and low data requirements, the integrated model could be used at local and national scales to improve the simulation of hydrological processes in non-topographically driven areas (where groundwater processes are important), and to assess the impact of climate change on the integrated hydrological cycle in these areas.
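The losing/gaining-river behaviour described above can be sketched with a single linear groundwater store coupled to a river stage. The functional form and coefficients below are hypothetical, not TopNet's actual formulation; they only illustrate how the exchange flux changes sign as the aquifer fills:

```python
import numpy as np

def run(recharge, river_stage, k_gw=0.2, k_exch=0.1):
    """Conceptual groundwater store S coupled to a river (hypothetical form):
    exchange = k_exch * (stage - S)  -> positive: losing river feeds aquifer
    baseflow = k_gw * S              -> groundwater discharge to the stream"""
    S, baseflow, exchange = 0.0, [], []
    for r, h in zip(recharge, river_stage):
        ex = k_exch * (h - S)
        S += r + ex            # recharge plus river leakage fill the store
        q = k_gw * S
        S -= q                 # drain the store as baseflow
        baseflow.append(q)
        exchange.append(ex)
    return np.array(baseflow), np.array(exchange)

# Constant forcing: the river starts out losing (positive exchange), then
# turns gaining once the groundwater store rises above the river stage
baseflow, exchange = run(np.ones(50), 0.5 * np.ones(50))
```

With these coefficients the store settles at S = 3, so baseflow converges to 0.75 and the steady exchange is negative, i.e., a gaining river sustained by groundwater.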
Conceptual strategies and inter-theory relations: The case of nanoscale cracks
NASA Astrophysics Data System (ADS)
Bursten, Julia R.
2018-05-01
This paper introduces a new account of inter-theory relations in physics, which I call the conceptual strategies account. Using the example of a multiscale computer simulation model of nanoscale crack propagation in silicon, I illustrate this account and contrast it with existing reductive, emergent, and handshaking approaches. The conceptual strategies account develops the notion that relations among physical theories, and among their models, are constrained but not dictated by limitations from physics, mathematics, and computation, and that conceptual reasoning within those limits is required both to generate and to understand the relations between theories. Conceptual strategies result in a variety of types of relations between theories and models. These relations are themselves epistemic objects, like theories and models, and as such are an under-recognized part of the epistemic landscape of science.
Macquarrie, K T B; Mayer, K U; Jin, B; Spiessl, S M
2010-03-01
Redox evolution in sparsely fractured crystalline rocks is a key, and largely unresolved, issue when assessing the geochemical suitability of deep geological repositories for nuclear waste. Redox zonation created by the influx of oxygenated waters has previously been simulated using reactive transport models that have incorporated a variety of processes, resulting in predictions of the depth of oxygen penetration that may vary greatly. An assessment and direct comparison of the various underlying conceptual models are therefore needed. In this work, a reactive transport model that considers multiple processes in an integrated manner is used to investigate the ingress of oxygen for both single-fracture and fracture-zone scenarios. It is shown that the depth of dissolved oxygen migration is greatly influenced by the a priori assumptions made in the conceptual models. For example, the ability of oxygen to access and react with minerals in the rock matrix may be of paramount importance for single-fracture conceptual models. For fracture-zone systems, the abundance and reactivity of minerals within the fractures and the thin matrix slabs between the fractures appear to provide key controls on O2 attenuation. The findings point to the need for improved understanding of the coupling between the key transport-reaction feedbacks, to determine which conceptual models are most suitable and to provide guidance on which parameters should be targeted in field and laboratory investigations.
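The basic control on oxygen penetration depth, the balance between advection along the fracture and consumption by fracture and matrix minerals, can be sketched with the steady-state first-order solution. This is a deliberately reduced caricature of a multi-component reactive transport model, with all parameter values invented for illustration:

```python
import numpy as np

def o2_profile(v, k, L, n):
    """Steady-state O2 along a fracture with first-order consumption:
    v dC/dx = -k C  =>  C(x) = C0 * exp(-k x / v), with C0 normalized to 1.
    k lumps the rate at which fracture and matrix minerals consume O2."""
    x = np.linspace(0.0, L, n)
    return x, np.exp(-k * x / v)

# A more reactive (or more accessible) rock matrix, i.e., a larger lumped
# rate k, sharply shortens the oxygen penetration depth
x, c_slow = o2_profile(v=1.0, k=0.05, L=100.0, n=101)
_, c_fast = o2_profile(v=1.0, k=0.5, L=100.0, n=101)
```

The tenfold change in k shifts the e-folding penetration depth from 20 length units to 2, which mirrors the abstract's point that a priori assumptions about matrix accessibility dominate the predicted O2 ingress.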
Techno-economic analysis: process model development for existing and conceptual processes; detailed heat integration; economic analysis of integrated processes; integration of process simulation learnings into control. Conceptual Process Design and Techno-Economic Assessment of Ex Situ Catalytic Fast Pyrolysis of Biomass: A…
Dini-Andreote, Francisco; Stegen, James C.; van Elsas, Jan D.; ...
2015-03-17
Despite growing recognition that deterministic and stochastic factors simultaneously influence bacterial communities, little is known about mechanisms shifting their relative importance. To better understand underlying mechanisms, we developed a conceptual model linking ecosystem development during primary succession to shifts in the stochastic/deterministic balance. To evaluate the conceptual model we coupled spatiotemporal data on soil bacterial communities with environmental conditions spanning 105 years of salt marsh development. At the local scale there was a progression from stochasticity to determinism due to Na accumulation with increasing ecosystem age, supporting a main element of the conceptual model. At the regional scale, soil organic matter (SOM) governed the relative influence of stochasticity and the type of deterministic ecological selection, suggesting scale-dependency in how deterministic ecological selection is imposed. Analysis of a new ecological simulation model supported these conceptual inferences. Looking forward, we propose an extended conceptual model that integrates primary and secondary succession in microbial systems.
Comparison of a Conceptual Groundwater Model and Physically Based Groundwater Model
NASA Astrophysics Data System (ADS)
Yang, J.; Zammit, C.; Griffiths, J.; Moore, C.; Woods, R. A.
2017-12-01
Groundwater is a vital resource for human activities, including agricultural practice and urban water demand. Hydrologic modelling is an important way to study groundwater recharge, movement and discharge, and its response to both human activity and climate change. To understand groundwater hydrologic processes nationally in New Zealand, we have developed a conceptually based groundwater flow model, which is fully integrated into a national surface-water model (TopNet) and able to simulate groundwater recharge, movement, and interaction with surface water. To demonstrate the capability of this groundwater model (TopNet-GW), we applied it to an irrigated area with water shortage and pollution problems in the upper Ruamahanga catchment in the Greater Wellington Region, New Zealand, and compared its performance with a physically based groundwater model (MODFLOW). The comparison includes river flow at flow gauging sites and the interaction between groundwater and the river. Results showed that TopNet-GW produced similar flow and groundwater interaction patterns to the MODFLOW model, but took less computation time. This shows that the conceptually based groundwater model has the potential to simulate groundwater processes nationally, and could be used as a surrogate for the more physically based model.
A Conceptual Framework for SAHRA Integrated Multi-resolution Modeling in the Rio Grande Basin
NASA Astrophysics Data System (ADS)
Liu, Y.; Gupta, H.; Springer, E.; Wagener, T.; Brookshire, D.; Duffy, C.
2004-12-01
The sustainable management of water resources in a river basin requires an integrated analysis of the social, economic, environmental and institutional dimensions of the problem. Numerical models are commonly used for integration of these dimensions and for communication of the analysis results to stakeholders and policy makers. The National Science Foundation Science and Technology Center for Sustainability of semi-Arid Hydrology and Riparian Areas (SAHRA) has been developing integrated multi-resolution models to assess impacts of climate variability and land use change on water resources in the Rio Grande Basin. These models not only couple natural systems such as surface and ground waters, but will also include engineering, economic and social components that may be involved in water resources decision-making processes. This presentation will describe the conceptual framework being developed by SAHRA to guide and focus the multiple modeling efforts and to assist the modeling team in planning, data collection and interpretation, communication, evaluation, etc. One of the major components of this conceptual framework is a Conceptual Site Model (CSM), which describes the basin and its environment based on existing knowledge and identifies what additional information must be collected to develop technically sound models at various resolutions. The initial CSM is based on analyses of basin profile information that has been collected, including a physical profile (e.g., topographic and vegetative features), a man-made facility profile (e.g., dams, diversions, and pumping stations), and a land use and ecological profile (e.g., demographics, natural habitats, and endangered species). Based on the initial CSM, a Conceptual Physical Model (CPM) is developed to guide and evaluate the selection of a model code (or numerical model) for each resolution to conduct simulations and predictions. 
A CPM identifies, conceptually, all the physical processes and engineering and socio-economic activities occurring (or to occur) in the real system that the corresponding numerical models are required to address, such as riparian evapotranspiration responses to vegetation change and groundwater pumping impacts on soil moisture contents. Simulation results from different resolution models and observations of the real system will then be compared to evaluate the consistency among the CSM, the CPMs, and the numerical models, and feedbacks will be used to update the models. In a broad sense, the evaluation of the models (conceptual or numerical), as well as the linkages between them, can be viewed as part of the overall conceptual framework. As new data are generated and understanding improves, the models will evolve and the overall conceptual framework will be refined; its development is thus an ongoing process. We will describe the current state of this framework and the open questions that have to be addressed in the future.
D. Caamano; P. Goodwin; J. M. Buffington
2010-01-01
Detailed field measurements and simulations of three-dimensional flow structure were used to develop a conceptual model to explain the sustainability of self-formed pool-riffle sequences in gravel-bed rivers. The analysis was conducted at the Red River Wildlife Management Area in Idaho, USA, and enabled characterization of the flow structure through two consecutive...
NASA Astrophysics Data System (ADS)
Lee, Hoon Hee; Koo, Cheol Hea; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok
2013-08-01
A conceptual study for a Korean lunar orbiter/lander prototype has been performed at the Korea Aerospace Research Institute (KARI). Across diverse space programs in European countries, a variety of simulation applications have been developed using the SMP2 (Simulation Modelling Platform) standard, which addresses portability and reuse of simulation models by various model users. KARI has not only first-hand experience in developing an SMP-compatible simulation environment but also an ongoing study to apply the SMP2 development process for simulation models to a simulator development project for lunar missions. KARI has tried to extend the coverage of the development domain based on the SMP2 standard across the whole simulation model life-cycle, from software design to validation, through a lunar exploration project. Figure 1 shows a snapshot from a visualization tool for the simulation of lunar lander motion. In reality, a demonstrator prototype shown on the right-hand side of the image was made and tested in 2012. In an early phase of simulator development, prior to a kick-off in the near future, the target hardware to be modelled was investigated and identified at the end of 2012. The architectural breakdown of the lunar simulator at system level was performed, and an architecture with a hierarchical tree of models, from the system down to parts at lower levels, was established. Finally, SMP documents such as the Catalogue, Assembly, and Schedule were converted using an XML (eXtensible Markup Language) converter. To obtain the benefits of the approaches and design mechanisms suggested in the SMP2 standard as far as possible, object-oriented and component-based design concepts were strictly followed throughout the whole model development process.
Bärgman, Jonas; Boda, Christian-Nils; Dozza, Marco
2017-05-01
As the development and deployment of in-vehicle intelligent safety systems (ISS) for crash avoidance and mitigation have rapidly increased in recent decades, the need to evaluate their prospective safety benefits before introduction has never been higher. Counterfactual simulations using relevant mathematical models (for vehicle dynamics, sensors, the environment, ISS algorithms, and driver behavior) have been identified as having high potential. However, although most of these models are relatively mature, models of driver behavior in the critical seconds before a crash are still relatively immature, and there are large conceptual differences between different driver models. The objective of this paper is, firstly, to demonstrate the importance of the choice of driver model when counterfactual simulations are used to evaluate two ISS: forward collision warning (FCW) and autonomous emergency braking (AEB). Secondly, the paper demonstrates how counterfactual simulations can be used to perform sensitivity analyses on parameter settings, both for driver behavior and for ISS algorithms. Finally, the paper evaluates the effect of the choice of glance distribution in the driver behavior model on the safety benefit estimation. The paper uses pre-crash kinematics and driver behavior from 34 rear-end crashes in the SHRP2 naturalistic driving study for the demonstrations. The results for FCW show a large difference in the percentage of avoided crashes between conceptually different models of driver behavior, while differences were small for conceptually similar models. As expected, the choice of driver behavior model did not affect the AEB benefit much. Based on our results, researchers and others who aim to evaluate ISS with the driver in the loop through counterfactual simulations should make deliberate and well-grounded choices of driver models: the choice of model matters.
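The counterfactual-simulation logic can be sketched with a purely kinematic rear-end scenario. The scenario numbers and the single-reaction-time driver model below are hypothetical, far simpler than the SHRP2-based reconstructions in the paper, but they show how the choice of driver model alone can flip the predicted outcome for the same FCW settings:

```python
def crash_avoided(v0, gap, warn_ttc, reaction_time, decel):
    """Counterfactual re-simulation of one rear-end scenario: stationary
    lead vehicle, constant deceleration once the driver responds. All
    numbers are hypothetical, not SHRP2 reconstructions.
    The FCW fires when time-to-collision (TTC) drops below warn_ttc."""
    ttc0 = gap / v0
    t_warning = max(ttc0 - warn_ttc, 0.0)   # when the FCW fires
    t_brake = t_warning + reaction_time     # when braking actually starts
    travelled = v0 * t_brake                # distance covered before braking
    braking_dist = v0**2 / (2.0 * decel)    # distance needed to stop
    return travelled + braking_dist < gap

# Same scenario and FCW settings; only the driver model's reaction time
# differs between the two counterfactual runs
fast = crash_avoided(v0=20.0, gap=60.0, warn_ttc=2.5,
                     reaction_time=1.0, decel=8.0)
slow = crash_avoided(v0=20.0, gap=60.0, warn_ttc=2.5,
                     reaction_time=2.0, decel=8.0)
print(fast, slow)   # True False: the driver model flips the predicted outcome
```

Sweeping `reaction_time` or `warn_ttc` over distributions, as the paper does for glance behavior and ISS parameters, turns this per-scenario check into a benefit estimate across a crash sample.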
ERIC Educational Resources Information Center
Chabalengula, Vivien; Fateen, Rasheta; Mumba, Frackson; Ochs, Laura Kathryn
2016-01-01
This study investigated the effect of an inquiry-based computer simulation modeling (ICoSM) instructional approach on pre-service science teachers' understanding of homeostasis and its related concepts, and their perceived design features of the ICoSM and simulation that enhanced their conceptual understanding of these concepts. Fifty pre-service…
Developing a Conceptual Architecture for a Generalized Agent-based Modeling Environment (GAME)
2008-03-01
Agent-based modeling toolkits surveyed include REPAST (Java, Python, C#, open source) and MASON, a multi-agent modeling extension of Swarm. Repast (Recursive Porous Agent Simulation Toolkit) was designed for building agent-based models and simulations; it makes it easy for inexperienced users to build models by including a built-in simple model and providing interfaces through menus and Python…
Integrating O/S models during conceptual design, part 1
NASA Technical Reports Server (NTRS)
Ebeling, Charles E.
1994-01-01
The University of Dayton is pleased to submit this report to the National Aeronautics and Space Administration (NASA), Langley Research Center, which integrates a set of models for determining operational capabilities and support requirements during the conceptual design of proposed space systems. This research provides for the integration of the reliability and maintainability (R&M) model, both new and existing simulation models, and existing operations and support (O&S) costing equations in arriving at a complete analysis methodology. Details concerning the R&M model and the O&S costing model may be found in previous reports accomplished under this grant (NASA Research Grant NAG1-1327). In the process of developing this comprehensive analysis approach, significant enhancements were made to the R&M model, updates to the O&S costing model were accomplished, and a new simulation model was developed. This is the first part of a three-part technical report.
NASA Astrophysics Data System (ADS)
Blessent, Daniela; Therrien, René; Lemieux, Jean-Michel
2011-12-01
This paper presents numerical simulations of a series of hydraulic interference tests conducted in crystalline bedrock at Olkiluoto (Finland), a potential site for the disposal of the Finnish high-level nuclear waste. The tests are in a block of crystalline bedrock of about 0.03 km3 that contains low-transmissivity fractures. Fracture density, orientation, and fracture transmissivity are estimated from Posiva Flow Log (PFL) measurements in boreholes drilled in the rock block. On the basis of those data, a geostatistical approach relying on transition-probability and Markov-chain models is used to define a conceptual model based on stochastic fractured rock facies. Four facies are defined, from sparsely fractured bedrock to highly fractured bedrock. Using this conceptual model, three-dimensional groundwater flow is then simulated to reproduce interference pumping tests in either open or packed-off boreholes. Hydraulic conductivities of the fracture facies are estimated through automatic calibration using either hydraulic heads or both hydraulic heads and PFL flow rates as targets for calibration. The latter option produces a narrower confidence interval for the calibrated hydraulic conductivities, therefore reducing the associated uncertainty and demonstrating the usefulness of the measured PFL flow rates. Furthermore, the stochastic facies conceptual model is a suitable alternative to discrete fracture network models to simulate fluid flow in fractured geological media.
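A minimal one-dimensional sketch of the transition-probability/Markov-chain idea is to draw a facies sequence along a borehole from a transition matrix whose rows give P(next facies | current facies). The four facies names and all matrix values below are hypothetical, not the calibrated Olkiluoto statistics.

```python
import random

FACIES = ["sparse", "low", "moderate", "high"]   # fracture-density classes (hypothetical)
P = [  # rows sum to 1; a strong diagonal yields spatially persistent facies
    [0.80, 0.15, 0.04, 0.01],
    [0.10, 0.75, 0.10, 0.05],
    [0.05, 0.10, 0.75, 0.10],
    [0.01, 0.04, 0.15, 0.80],
]

def simulate_column(n_cells, start=0, rng=None):
    """Draw a downhole facies sequence from the transition matrix."""
    rng = rng or random.Random(42)
    state, column = start, []
    for _ in range(n_cells):
        column.append(FACIES[state])
        state = rng.choices(range(4), weights=P[state])[0]
    return column

col = simulate_column(200)
print(col[:5], "highly fractured fraction:", col.count("high") / len(col))
```

In the study's full workflow such realizations are conditioned on PFL data and mapped to hydraulic conductivities; here the sequence merely shows how the transition matrix controls facies clustering.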
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sig Drellack, Lance Prothro
2007-12-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty, including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty, in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial extent of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. With multiple alternative flow models advanced, the sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations.
The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and in transport parameters, including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility, is considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
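The Monte Carlo sensitivity-ranking step can be sketched schematically. The toy plume-volume surrogate and parameter ranges below are invented, and a simple rank-by-correlation stands in for the stepwise regression and other global methods named above.

```python
import random, statistics

# Schematic Monte Carlo sensitivity sketch (toy surrogate, not the UGTA models):
# sample uncertain transport parameters, compute a plume-volume surrogate, and
# rank parameters by the strength of their correlation with the output.
rng = random.Random(0)
N = 2000
samples = {
    "porosity": [rng.uniform(0.001, 0.05) for _ in range(N)],
    "aperture": [rng.uniform(1e-4, 1e-3) for _ in range(N)],
    "sorption": [rng.uniform(0.0, 10.0) for _ in range(N)],
}
# Toy response: plume volume shrinks with porosity and sorption, grows slightly with aperture.
volume = [1.0 / (p * (1 + s)) + 500 * a
          for p, s, a in zip(samples["porosity"], samples["sorption"], samples["aperture"])]

def corr(x, y):
    """Pearson correlation coefficient."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

ranking = sorted(samples, key=lambda k: -abs(corr(samples[k], volume)))
print("sensitivity ranking:", ranking)
```

A real application would replace the surrogate with the field-scale transport model (or its particle-tracking/convolution emulator) and use regression-based rankings per conceptual model.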
AI and simulation: What can they learn from each other
NASA Technical Reports Server (NTRS)
Colombano, Silvano P.
1988-01-01
Simulation and Artificial Intelligence share a fertile common ground, both from a practical and from a conceptual point of view. Strengths and weaknesses of both Knowledge-Based Systems and Modeling and Simulation are examined, and three types of systems that combine the strengths of both technologies are discussed. These types of systems are a practical starting point; however, the real strengths of both technologies will be exploited only when they are combined in a common knowledge representation paradigm. From an even deeper conceptual point of view, one might argue that the ability to reason from a set of facts (i.e., an Expert System) is less representative of human reasoning than the ability to make a model of the world, change it as required, and derive conclusions about the expected behavior of world entities. This is a fundamental problem in AI, and Modeling Theory can contribute to its solution. The application of Knowledge Engineering technology to a Distributed Processing Network Simulator (DPNS) is discussed.
Numerical Simulation of the 9-10 June 1972 Black Hills Storm Using CSU RAMS
NASA Technical Reports Server (NTRS)
Nair, U. S.; Hjelmfelt, Mark R.; Pielke, Roger A., Sr.
1997-01-01
Strong easterly flow of low-level moist air over the eastern slopes of the Black Hills on 9-10 June 1972 generated a storm system that produced a flash flood, devastating the area. Based on observations from this storm event, and also from the similar Big Thompson 1976 storm event, conceptual models have been developed to explain the unusually high precipitation efficiency. In this study, the Black Hills storm is simulated using the Colorado State University Regional Atmospheric Modeling System. Simulations with homogeneous and inhomogeneous initializations and different grid structures are presented. The conceptual models of storm structure proposed by previous studies are examined in light of the present simulations. Both homogeneous and inhomogeneous initialization results capture the intense nature of the storm, but the inhomogeneous simulation produced a precipitation pattern closer to the observed pattern. The simulations point to stationary tilted updrafts, with precipitation falling out to the rear as the preferred storm structure. Experiments with different grid structures point to the importance of removing the lateral boundaries far from the region of activity. Overall, simulation performance in capturing the observed behavior of the storm system was enhanced by use of inhomogeneous initialization.
Reiter, Michael A; Saintil, Max; Yang, Ziming; Pokrajac, Dragoljub
2009-08-01
Conceptual modeling is a useful tool for identifying pathways between drivers, stressors, Valued Ecosystem Components (VECs), and services that are central to understanding how an ecosystem operates. The St. Jones River watershed, DE, is a complex ecosystem, and because management decisions must include ecological, social, political, and economic considerations, a conceptual model is a good tool for accommodating the full range of inputs. In 2002, a Four-Component, Level 1 conceptual model was formed for the key habitats of the St. Jones River watershed, but because the habitat level of resolution is too fine for some important watershed-scale issues, we developed a functional watershed-scale model from the existing narrowed habitat-scale models. The narrowed habitat-scale conceptual models and associated matrices developed by Reiter et al. (2006) were combined with data from the 2002 land use/land cover (LULC) GIS-based maps of Kent County, Delaware, to assemble a diagrammatic and numerical watershed-scale conceptual model incorporating the calculated weight of each habitat within the watershed. The numerical component of the assembled watershed model was subsequently subjected to the same Monte Carlo narrowing methodology used for the habitat versions to refine the diagrammatic component of the watershed-scale model. The narrowed numerical representation of the model was used to generate forecasts for changes in the parameters "Agriculture" and "Forest", showing that land use changes in these habitats propagated through the results of the model by the weighting factor. The narrowed watershed-scale conceptual model also identified some key parameters on which to focus research attention and management decisions at the watershed scale. The forecast and simulation results seemed to indicate that the watershed-scale conceptual model leads to different conclusions than the habitat-scale conceptual models for some issues at the larger watershed scale.
2013-06-01
18th ICCRTS: Using a Functional Simulation of Crisis Management to Test the C2 Agility Model Parameters on Key Performance Variables. Agility can be conceptualized at a number of different levels, for instance at the team…
CAMPUS-MINNESOTA User Information Manual. Project PRIME Report, Number 12.
ERIC Educational Resources Information Center
Andrew, Gary M.
The purpose of this report is to aid the use of the computer simulation model, CAMPUS-M, in 4 specific areas: (1) the conceptual modeling of the institution; (2) the preparation of machine readable input data; (3) the preparation of simulation and report commands for the model; and (4) the actual running of the program on a CDC 6600 computer.…
Conceptualization of Karstic Aquifer with Multiple Outlets Using a Dual Porosity Model.
Hosseini, Seiyed Mossa; Ataie-Ashtiani, Behzad
2017-07-01
In this study, two conceptual models, the classic reservoir (CR) model and an exchange-reservoirs model based on a dual porosity approach (DPR), are developed for simulating the functioning of a karst aquifer drained by multiple outlets. The performance of the two models is demonstrated on a less-developed karstic aquifer with three spring outlets in the Zagros Mountains in south-west Iran, using 22 years of daily data. During surface recharge, a production function based on a water mass balance is implemented to compute the time series of surface recharge to the karst formations. The efficiency of both models is assessed for simulating daily spring discharge during recession and also during surface-recharge periods. Results indicate that both the CR and DPR models are capable of simulating the ordinates of the spring hydrographs that drain the less-developed karstic aquifer. However, the goodness-of-fit criteria indicate that the DPR model outperforms the CR model for simulation of the total hydrograph ordinates. In addition, the DPR model can quantify the hydraulic properties of two hydrologically connected, overlapping continua, the conduit network and the fissured matrix, which lays important foundations for mining operations and water resource management, whereas homogeneous representations of the karstic subsurface (e.g., the CR model) do not work accurately in the karstic environment. © 2017, National Ground Water Association.
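The classic-reservoir idea can be sketched with linear reservoirs, Q = kV, whose drainage gives the exponential recession Q(t) = Q0·exp(−kt). The initial discharges and recession coefficients for the three outlets below are hypothetical, not the calibrated Zagros values.

```python
import math

def spring_discharge(q0, k, days):
    """Daily discharge series for a linear reservoir with no recharge: Q(t) = q0 * exp(-k t)."""
    return [q0 * math.exp(-k * t) for t in range(days)]

# Three hypothetical outlets draining the same aquifer with different recession coefficients.
outlets = {"spring_A": (2.0, 0.02), "spring_B": (1.2, 0.05), "spring_C": (0.5, 0.01)}
total = [sum(q) for q in zip(*(spring_discharge(q0, k, 60) for q0, k in outlets.values()))]
print(f"day 0 total: {total[0]:.2f} m3/s, day 59 total: {total[-1]:.2f} m3/s")
```

The DPR variant would add exchange terms between a fast conduit reservoir and a slow matrix reservoir; fitting the per-outlet coefficients to observed recessions is what the goodness-of-fit comparison in the study evaluates.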
NASA Astrophysics Data System (ADS)
Liu, Gang; Zhao, Rong; Liu, Jiping; Zhang, Qingpu
2007-06-01
The Lancang River Basin is narrow, and its hydrological and meteorological conditions are highly variable. Rainfall, evaporation, glacial meltwater, and groundwater all affect the runoff, and their contributions change notably with the season in different areas of the basin. The characteristics of different kinds of distributed models and conceptual hydrological models are analyzed. A semi-distributed hydrological model relating monthly runoff to rainfall, temperature, and soil type was built for Changdu County based on Visual Basic and ArcObjects. The model uses the discretization approach of distributed hydrological models while also taking the principles of conceptual models into account. The sub-catchment of Changdu is divided into regular cells, and the hydrological and meteorological information, land use classes, and slope extracted from 1:250000 digital elevation models are distributed to each cell. The model does not represent the physical rainfall-runoff process explicitly, but uses the conceptual model to simulate the total contribution to runoff over the area. The effects of evapotranspiration loss and groundwater are taken into account at the same time. The spatial distribution characteristics of the monthly runoff in the area are simulated and analyzed with a few parameters.
Computer-aided operations engineering with integrated models of systems and operations
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Ryan, Dan; Fleming, Land
1994-01-01
CONFIG 3 is a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, by supporting isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operation of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. Integration is supported among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. Support is provided for integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems. CONFIG supports abstracted qualitative and symbolic modeling, for early conceptual design. System models are component structure models with operating modes, with embedded time-related behavior models. CONFIG supports failure modeling and modeling of state or configuration changes that result in dynamic changes in dependencies among components. Operations and procedure models are activity structure models that interact with system models. CONFIG is designed to support evaluation of system operability, diagnosability and fault tolerance, and analysis of the development of system effects of problems over time, including faults, failures, and procedural or environmental difficulties.
Water-Balance Model to Simulate Historical Lake Levels for Lake Merced, California
NASA Astrophysics Data System (ADS)
Maley, M. P.; Onsoy, S.; Debroux, J.; Eagon, B.
2009-12-01
Lake Merced is a freshwater lake located in southwestern San Francisco, California. In the late 1980s and early 1990s, an extended, severe drought caused significant declines in Lake Merced lake levels, raising concerns about the long-term health of the lake. In response to these concerns, the Lake Merced Water Level Restoration Project was developed to evaluate an engineered solution to increase and maintain Lake Merced lake levels. The Lake Merced Lake-Level Model was developed to support the conceptual engineering design to restore lake levels. It is a spreadsheet-based water-balance model that performs monthly water-balance calculations based on the hydrological conceptual model. The model independently calculates each water-balance component based on available climate and hydrological data. The model objective was to develop a practical, rule-based approach for the water balance and to calibrate the model results to measured lake levels. The advantage of a rule-based approach is that once the rules are defined, they enhance the ability to adapt the model for use in future-case simulations. The model was calibrated to historical lake levels over a 70-year period from 1939 to 2009. Calibrating the model over this long historical range tested the model over a variety of hydrological conditions, including wet, normal, and dry precipitation years, flood events, and periods of high and low lake levels. The historical lake level range was over 16 feet. The model calibration of historical to simulated lake levels had a residual mean of 0.02 feet and an absolute residual mean of 0.42 feet. More importantly, the model demonstrated the ability to simulate both long-term and short-term trends, with a strong correlation of the magnitude of both annual and seasonal fluctuations in lake levels.
The calibration results demonstrate an improved conceptual understanding of the key hydrological factors that control lake levels, reduce uncertainty in the hydrological conceptual model, and increase confidence in the model’s ability to forecast future lake conditions. The Lake Merced Lake-Level Model will help decision-makers with a straightforward, practical analysis of the major contributions to lake-level declines that can be used to support engineering, environmental and other decisions.
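A monthly water balance of the kind described can be sketched as a storage update, storage(t+1) = storage(t) + precip + runoff + gw_in − evap − gw_out, with lake level derived from storage. All flux values and the simple stage-storage relation below are hypothetical, not the Lake Merced calibration.

```python
def simulate_levels(initial_storage, months, area_acres=300.0):
    """Monthly storage update; level (ft) approximated as storage / lake area (hypothetical)."""
    storage, levels = initial_storage, []
    for m in months:
        storage += m["precip"] + m["runoff"] + m["gw_in"] - m["evap"] - m["gw_out"]
        levels.append(storage / area_acres)
    return levels

# Hypothetical wet-season and dry-season months (fluxes in acre-feet per month).
wet = {"precip": 150, "runoff": 80, "gw_in": 40, "evap": 60, "gw_out": 30}
dry = {"precip": 5, "runoff": 2, "gw_in": 40, "evap": 120, "gw_out": 30}
levels = simulate_levels(6000.0, [wet] * 6 + [dry] * 6)
print(f"after wet season: {levels[5]:.2f} ft; after dry season: {levels[-1]:.2f} ft")
```

The rule-based part of the actual model would make each flux a function of climate inputs and lake stage; calibration then adjusts those rules until simulated levels track the 1939 to 2009 record.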
NASA Astrophysics Data System (ADS)
Chien, Cheng-Chih
In the past thirty years, the effectiveness of computer-assisted learning was found to vary across individual studies. Today, with dramatic technical improvements, computers have become widespread in schools and are used in a variety of ways. In this study, a design model involving educational technology, pedagogy, and content domain is proposed for effective use of computers in learning. Computer simulation, constructivist and Vygotskian perspectives, and circular motion are the three elements of the specific Chain Model for instructional design. The goal of the physics course is to help students discard ideas that are not consistent with the physics community and rebuild new knowledge. To achieve the learning goal, the strategies of using conceptual conflicts and using language to internalize specific tasks into mental functions were included. Computer simulations and accompanying worksheets were used to help students explore their own ideas and to generate questions for discussion. Using animated images to describe the dynamic processes involved in circular motion may reduce the complexity and possible miscommunication resulting from verbal explanations. The effectiveness of the instructional material on student learning was evaluated. The results of problem-solving activities show that students using computer simulations had significantly higher scores than students not using computer simulations. For conceptual understanding, students in the non-simulation group had significantly higher pretest scores than students in the simulation group; no significant difference was observed between the two groups on the posttest. The relations of gender, prior physics experience, and frequency of computer use outside the course to student achievement were also studied. There were fewer female students than male students, and fewer students using computer simulations than students not using them.
These characteristics affect the statistical power for detecting differences. For future research, additional simulation interventions could be introduced to explore the potential of computer simulation in helping students learn. A test of conceptual understanding with more problems and an appropriate difficulty level may also be needed.
Provencher, Louis; Frid, Leonardo; Czembor, Christina; Morisette, Jeffrey T.
2016-01-01
State-and-Transition Simulation Modeling (STSM) is a quantitative analysis method that can consolidate a wide array of resource management issues under a “what-if” scenario exercise. STSM can be seen as an ensemble of models, such as climate models, ecological models, and economic models that incorporate human dimensions and management options. This chapter presents STSM as a tool to help synthesize information on social–ecological systems and to investigate some of the management issues associated with exotic annual Bromus species, which have been described elsewhere in this book. Definitions, terminology, and perspectives on conceptual and computer-simulated stochastic state-and-transition models are given first, followed by a brief review of past STSM studies relevant to the management of Bromus species. A detailed case study illustrates the usefulness of STSM for land management. As a whole, this chapter is intended to demonstrate how STSM can help both managers and scientists: (a) determine efficient resource allocation for monitoring nonnative grasses; (b) evaluate sources of uncertainty in model simulation results involving expert opinion, and their consequences for management decisions; and (c) provide insight into the consequences of predicted local climate change effects on ecological systems invaded by exotic annual Bromus species.
A Simulation Model Articulation of the REA Ontology
NASA Astrophysics Data System (ADS)
Laurier, Wim; Poels, Geert
This paper demonstrates how the REA enterprise ontology can be used to construct simulation models for business processes, value chains and collaboration spaces in supply chains. These models support various high-level and operational management simulation applications, e.g. the analysis of enterprise sustainability and day-to-day planning. First, the basic constructs of the REA ontology and the ExSpect modelling language for simulation are introduced. Second, collaboration space, value chain and business process models and their conceptual dependencies are shown, using the ExSpect language. Third, an exhibit demonstrates the use of value chain models in predicting the financial performance of an enterprise.
ERIC Educational Resources Information Center
Rands, Sean A.
2012-01-01
Models are an important tool in science: not only do they act as a convenient device for describing a system or problem, but they also act as a conceptual tool for framing and exploring hypotheses. Models, and in particular computer simulations, are also an important education tool for training scientists, but it is difficult to teach students the…
Enhancing Simulation Learning with Team Mental Model Mapping
ERIC Educational Resources Information Center
Goltz, Sonia M.
2017-01-01
Simulations have been developed for many business courses because of enhanced student engagement and learning. A challenge for instructors using simulations is how to take this learning to the next level since student reflection and learning can vary. This article describes how to use a conceptual mapping game at the beginning and end of a…
NASA Astrophysics Data System (ADS)
Cao, Guoliang; Han, Dongmei; Currell, Matthew J.; Zheng, Chunmiao
2016-09-01
Groundwater flow in deep sedimentary basins results from complex evolution processes on geological timescales. Groundwater flow systems conceptualized according to topography and/or groundwater table configuration generally assume a near-equilibrium state with the modern landscape. However, the time to reach such a steady state, and more generally the timescales of groundwater flow system evolution, are key considerations for large sedimentary basins. This is true in the North China Basin (NCB), which has been studied for many years due to its importance as a groundwater supply. Despite many years of study, there remain contradictions between the generally accepted conceptual model of regional flow and environmental tracer data. We seek to reconcile these contradictions by conducting simulations of groundwater flow, age and heat transport in a three-dimensional model, using an alternative conceptual model based on geological, thermal, isotope and historical data. We infer flow patterns under modern hydraulic conditions using this new model and present the theoretical maximum groundwater ages under such a flow regime. The model results show that in contrast to previously accepted conceptualizations, most groundwater is discharged in the vicinity of the break-in-slope of topography at the boundary between the piedmont and central plain. Groundwater discharge to the ocean is in contrast small, and in general there are low rates of active flow in the eastern parts of the basin below the central and coastal plain. This conceptualization is more compatible with geochemical and geothermal data than the previous model. Simulated maximum groundwater ages of ∼1 Myr below the central and coastal plain indicate that residual groundwater may have been retained in the deep parts of the basin since being recharged during the last glacial period or earlier. The groundwater flow system has therefore probably not reached a new equilibrium state with modern-day hydraulic conditions.
The previous hypothesis that regional groundwater flow from the piedmont groundwater recharge zone predominantly discharges at the coastline may therefore be false. A more reliable alternative might be to conceptualize deep groundwater below the coastal plain as a hydrodynamically stagnant zone, responding gradually to landscape and hydrological change on geologic timescales. This study brings a new and original understanding of the groundwater flow system in an important regional basin, in the context of its geometry and evolution over geological timescales. There are important implications for the sustainability of the ongoing high rates of groundwater extraction in the NCB.
He, Yujie; Yang, Jinyan; Zhuang, Qianlai; McGuire, A. David; Zhu, Qing; Liu, Yaling; Teskey, Robert O.
2014-01-01
Conventional Q10 soil organic matter decomposition models and more complex microbial models are available for making projections of future soil carbon dynamics. However, it is unclear (1) how well the conceptually different approaches can simulate observed decomposition and (2) to what extent the trajectories of long-term simulations differ when using the different approaches. In this study, we compared three structurally different soil carbon (C) decomposition models (one Q10 and two microbial models of different complexity), each with one- and two-horizon versions. The models were calibrated and validated using 4 years of measurements of heterotrophic soil CO2 efflux from trenched plots in a Dahurian larch (Larix gmelinii Rupr.) plantation. All models reproduced the observed heterotrophic component of soil CO2 efflux, but the trajectories of soil carbon dynamics differed substantially in 100-year simulations with and without warming and increased litterfall input, with the microbial models producing better agreement with observed changes in soil organic C in long-term warming experiments. Our results also suggest that both constant and varying carbon use efficiency are plausible when modeling future decomposition dynamics and that a short-term (e.g., a few years) period of measurement is insufficient to adequately constrain model parameters that represent long-term responses of microbial thermal adaptation. These results highlight the need to reframe the representation of decomposition models and to constrain parameters with long-term observations and multiple data streams. We urge caution in interpreting future soil carbon responses derived from existing decomposition models because both conceptual and parameter uncertainties are substantial.
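The two model families being compared can be sketched side by side: a Q10 model scales a first-order decay rate with temperature, while a microbial model makes the flux depend on microbial biomass through Michaelis-Menten kinetics. The rate constants and pool sizes below are illustrative, not the values calibrated to the larch-plantation efflux data.

```python
def q10_flux(c_pool, temp, k_ref=0.001, q10=2.0, t_ref=10.0):
    """First-order decay with Q10 temperature scaling: F = k(T) * C."""
    return k_ref * q10 ** ((temp - t_ref) / 10.0) * c_pool

def microbial_flux(c_pool, biomass, temp, vmax_ref=0.01, km=500.0, q10=2.0, t_ref=10.0):
    """Michaelis-Menten microbial model: flux saturates as substrate is depleted."""
    vmax = vmax_ref * q10 ** ((temp - t_ref) / 10.0)
    return vmax * biomass * c_pool / (km + c_pool)

# Warming raises both fluxes, but the microbial flux also responds to biomass,
# which is the structural feature that lets microbial models adapt in long runs.
print(q10_flux(1000.0, 20.0), microbial_flux(1000.0, 50.0, 20.0))
```

In a century-scale simulation the biomass pool would itself be prognostic, so the microbial model's warming response can attenuate over time while the Q10 model's cannot.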
Antle, John M.; Stoorvogel, Jetse J.; Valdivia, Roberto O.
2014-01-01
This article presents conceptual and empirical foundations for new parsimonious simulation models that are being used to assess future food and environmental security of farm populations. The conceptual framework integrates key features of the biophysical and economic processes on which the farming systems are based. The approach represents a methodological advance by coupling important behavioural processes, for example, self-selection in adaptive responses to technological and environmental change, with aggregate processes, such as changes in market supply and demand conditions or environmental conditions as climate. Suitable biophysical and economic data are a critical limiting factor in modelling these complex systems, particularly for the characterization of out-of-sample counterfactuals in ex ante analyses. Parsimonious, population-based simulation methods are described that exploit available observational, experimental, modelled and expert data. The analysis makes use of a new scenario design concept called representative agricultural pathways. A case study illustrates how these methods can be used to assess food and environmental security. The concluding section addresses generalizations of parametric forms and linkages of regional models to global models. PMID:24535388
Coupling groundwater and riparian vegetation models to assess effects of reservoir releases
Springer, Abraham E.; Wright, Julie M.; Shafroth, Patrick B.; Stromberg, Juliet C.; Patten, Duncan T.
1999-01-01
Although riparian areas in the arid southwestern United States are critical for maintaining species diversity, their extent and health have been declining since Euro‐American settlement. The purpose of this study was to develop a methodology to evaluate the potential for riparian vegetation restoration and groundwater recharge. A numerical groundwater flow model was coupled with a conceptual riparian vegetation model to predict hydrologic conditions favorable to maintaining riparian vegetation downstream of a reservoir. A Geographic Information System (GIS) was used for this one‐way coupling. Constant and seasonally varying releases from the dam were simulated using volumes anticipated to be permitted by a regional water supplier. Simulations indicated that seasonally variable releases would produce surface flow 5.4–8.5 km below the dam in a previously dry reach. Using depth to groundwater simulations from the numerical flow model with conceptual models of depths to water necessary for maintenance of riparian vegetation, the GIS analysis predicted a 5‐ to 6.5‐fold increase in the area capable of sustaining riparian vegetation.
NASA Astrophysics Data System (ADS)
Byrne, Michael P.; O'Gorman, Paul A.
2016-12-01
Climate models simulate a strong land-ocean contrast in the response of near-surface relative humidity to global warming: relative humidity tends to increase slightly over oceans but decrease substantially over land. Surface energy balance arguments have been used to understand the response over ocean but are difficult to apply over more complex land surfaces. Here, a conceptual box model is introduced, involving moisture transport between the land and ocean boundary layers and evapotranspiration, to investigate the decreases in land relative humidity as the climate warms. The box model is applied to idealized and full-complexity (CMIP5) general circulation model simulations, and it is found to capture many of the features of the simulated changes in land relative humidity. The box model suggests there is a strong link between fractional changes in specific humidity over land and ocean, and the greater warming over land than ocean then implies a decrease in land relative humidity. Evapotranspiration is of secondary importance for the increase in specific humidity over land, but it matters more for the decrease in relative humidity. Further analysis shows there is a strong feedback between changes in surface-air temperature and relative humidity, and this can amplify the influence on relative humidity of factors such as stomatal conductance and soil moisture.
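A minimal version of such a box model can be written down directly: land specific humidity is set mainly by moisture transported from the ocean boundary layer, with a small evapotranspiration increment, and land relative humidity follows from dividing by the saturation value at the land temperature. The transport fraction, evapotranspiration contribution, and temperature changes below are assumptions for illustration, not CMIP5 values:

```python
import math

def e_sat(T):
    """Saturation vapour pressure (hPa), Magnus form, T in kelvin."""
    return 6.112 * math.exp(17.67 * (T - 273.15) / (T - 29.65))

def q_sat(T, p=1000.0):
    """Approximate saturation specific humidity (kg/kg) at pressure p (hPa)."""
    return 0.622 * e_sat(T) / p

def land_rh(T_ocean, T_land, rh_ocean=0.8, gamma=0.85, et_frac=0.1):
    """Box-model land relative humidity: land specific humidity is a
    fraction gamma of the ocean value (boundary-layer moisture transport)
    plus a small evapotranspiration increment et_frac.
    gamma and et_frac are illustrative, not fitted values."""
    q_ocean = rh_ocean * q_sat(T_ocean)
    q_land = gamma * q_ocean * (1.0 + et_frac)
    return q_land / q_sat(T_land)

rh_ctrl = land_rh(T_ocean=288.0, T_land=288.0)
rh_warm = land_rh(T_ocean=290.0, T_land=291.0)  # land warms more than ocean
```

Because land specific humidity is tied to the ocean value while saturation humidity rises with the larger land warming, `rh_warm` comes out below `rh_ctrl`, reproducing the qualitative mechanism in the abstract.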
Dieckmann, P; Rall, M; Ostergaard, D
2009-01-01
We describe how simulation and incident reporting can be used in combination to make the interaction between people, (medical) technology and organisation safer for patients and users. We provide the background rationale for our conceptual ideas and apply the concepts to the analysis of an actual incident report. Simulation can serve as a laboratory to analyse such cases and to create relevant and effective training scenarios based on such analyses. We describe a methodological framework for analysing simulation scenarios in a way that allows discovering and discussing mismatches between conceptual models of the device design and the mental models users hold about the device and its use. We further describe how incident reporting systems can be used as one source of data for the necessary needs analyses, both for training and for closer analysis of specific devices or of their special features and modes during usability analyses.
Conceptual Hierarchies in a Flat Attractor Network
O’Connor, Christopher M.; Cree, George S.; McRae, Ken
2009-01-01
The structure of people’s conceptual knowledge of concrete nouns has traditionally been viewed as hierarchical (Collins & Quillian, 1969). For example, superordinate concepts (vegetable) are assumed to reside at a higher level than basic-level concepts (carrot). A feature-based attractor network with a single layer of semantic features developed representations of both basic-level and superordinate concepts. No hierarchical structure was built into the network. In Experiment and Simulation 1, the graded structure of categories (typicality ratings) is accounted for by the flat attractor network. Experiment and Simulation 2 show that, as with basic-level concepts, such a network predicts feature verification latencies for superordinate concepts (vegetable).
NASA Astrophysics Data System (ADS)
Schoups, G.; Vrugt, J. A.; Fenicia, F.; van de Giesen, N. C.
2010-10-01
Conceptual rainfall-runoff models have traditionally been applied without paying much attention to numerical errors induced by temporal integration of water balance dynamics. Reliance on first-order, explicit, fixed-step integration methods leads to computationally cheap simulation models that are easy to implement. Computational speed is especially desirable for estimating parameter and predictive uncertainty using Markov chain Monte Carlo (MCMC) methods. Confirming earlier work of Kavetski et al. (2003), we show here that the computational speed of first-order, explicit, fixed-step integration methods comes at a cost: for a case study with a spatially lumped conceptual rainfall-runoff model, it introduces artificial bimodality in the marginal posterior parameter distributions, which is not present in numerically accurate implementations of the same model. The resulting effects on MCMC simulation include (1) inconsistent estimates of posterior parameter and predictive distributions, (2) poor performance and slow convergence of the MCMC algorithm, and (3) unreliable convergence diagnosis using the Gelman-Rubin statistic. We studied several alternative numerical implementations to remedy these problems, including various adaptive-step finite difference schemes and an operator splitting method. Our results show that adaptive-step, second-order methods, based on either explicit finite differencing or operator splitting with analytical integration, provide the best alternative for accurate and efficient MCMC simulation. Fixed-step or adaptive-step implicit methods may also be used for increased accuracy, but they cannot match the efficiency of adaptive-step explicit finite differencing or operator splitting. Of the latter two, explicit finite differencing is more generally applicable and is preferred if the individual hydrologic flux laws cannot be integrated analytically, as the splitting method then loses its advantage.
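The cost of first-order, explicit, fixed-step integration can be illustrated on a single linear reservoir, dS/dt = P - kS, which has an exact solution to compare against. The parameter values below are arbitrary, and this toy omits the MCMC layer entirely; it only shows the truncation error such schemes introduce:

```python
import math

# Linear reservoir dS/dt = P - k*S with exact solution
# S(t) = P/k + (S0 - P/k) * exp(-k*t).  Parameter values are arbitrary.
P, k, S0, T = 5.0, 0.8, 10.0, 10.0

def exact(t):
    return P / k + (S0 - P / k) * math.exp(-k * t)

def euler(dt):
    """First-order, explicit, fixed-step integration (forward Euler)."""
    S = S0
    for _ in range(int(round(T / dt))):
        S += dt * (P - k * S)
    return S

err_coarse = abs(euler(1.0) - exact(T))   # one step per unit time
err_fine = abs(euler(0.01) - exact(T))    # 100 steps per unit time
```

The coarse fixed step is cheap but carries a persistent bias relative to the exact trajectory; refining the step shrinks the error roughly linearly, which is the first-order behaviour the paper's adaptive, second-order alternatives are designed to escape.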
Exploring the Components of Dynamic Modeling Techniques
ERIC Educational Resources Information Center
Turnitsa, Charles Daniel
2012-01-01
Upon defining the terms modeling and simulation, it becomes apparent that there is a wide variety of different models, using different techniques, appropriate for different levels of representation for any one system to be modeled. Selecting an appropriate conceptual modeling technique from those available is an open question for the practitioner.…
Ground-water flow in the New Jersey Coastal Plain
Martin, Mary
1998-01-01
Ground-water flow in 10 aquifers and 9 intervening confining units of the New Jersey Coastal Plain was simulated as part of the Regional Aquifer System Analysis. Data on aquifer and confining unit characteristics and on pumpage and water levels from 1918 through 1980 were incorporated into a multilayer finite-difference model. The report describes the conceptual hydrogeologic model of the unstressed flow systems, the methods and approach used in simulating flow, and the results of the simulations.
NASA Astrophysics Data System (ADS)
Tang, Ting; Seuntjens, Piet; van Griensven, Ann; Bronders, Jan
2016-04-01
Urban areas can significantly contribute to pesticide contamination in surface water. However, pesticide behaviours in urban areas, particularly on hard surfaces, are far less studied than those in agricultural areas. Pesticide application on hard surfaces (e.g. roadsides and walkways) is of particular concern due to the high imperviousness and therefore high pesticide runoff potential. Experimental studies have shown that pesticide behaviours on and interactions with hard surfaces are important factors controlling the pesticide runoff potential, and therefore the magnitude and timing of peak concentrations in surface water. We conceptualized pesticide behaviours on hard surfaces and incorporated the conceptualization into a new pesticide runoff model. The pesticide runoff model was implemented in a catchment hydrological model WetSpa-Python (Water and Energy Transfer between Soil, Plants and Atmosphere, Python version). The conceptualization for pesticide processes on hard surfaces accounts for the differences in pesticide behaviour on different hard surfaces. Four parameters are used to describe the partitioning and wash-off of each pesticide on hard surfaces. We tested the conceptualization using an experimental dataset for five pesticides on two types of hard surfaces, namely concrete and asphalt. The conceptualization gave good performance in accounting for the wash-off pattern for the modelled pesticides and surfaces, according to quantitative evaluations using the Nash-Sutcliffe efficiency and percent bias. The resulting pesticide runoff model WetSpa-PST (WetSpa for PeSTicides) can simulate pesticides and their metabolites at the catchment scale.
Overall, it includes four groups of pesticide processes, namely pesticide application, pesticide interception by plant foliage, pesticide processes on land surfaces (including partitioning, degradation and wash-off on hard surfaces; partitioning, dissipation, infiltration and runoff in soil) and pesticide processes in depression storage (including degradation, infiltration and runoff). Processes on hard surfaces employ the conceptualization described above. The WetSpa-PST model can account for various spatial details of the urban features in a catchment, such as asphalt, concrete and roof areas. The distributed feature also allows users to input detailed pesticide application data of both non-point and point origins. Thanks to the Python modelling framework prototype used in the WetSpa-Python model, processes in the WetSpa-PST model can be simulated at different time steps depending on data availability and the characteristic temporal scale of each process. This helps to increase computational accuracy during heavy rainfall events, especially for the associated fast transport of pesticides into surface water. In sum, the WetSpa-PST model shows good potential for predicting effects of management options on pesticide releases from heavily urbanized catchments.
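A hedged sketch of the hard-surface partitioning and wash-off idea: one parameter sets the wash-off-available (weakly bound) fraction of applied mass, and one rate constant depletes that pool exponentially with cumulative rainfall. Neither the functional form nor the values below are taken from the WetSpa-PST calibration:

```python
import math

def washoff(mass, rain_mm, frac_available=0.6, k_washoff=0.05):
    """Exponential wash-off sketch for a pesticide on a hard surface.
    Only frac_available of the applied mass sits in the wash-off-available
    pool; that pool depletes exponentially with cumulative rainfall.
    Parameter values are illustrative, not the WetSpa-PST calibration."""
    available = frac_available * mass
    removed = available * (1.0 - math.exp(-k_washoff * rain_mm))
    return removed, mass - removed

removed, remaining = washoff(mass=100.0, rain_mm=20.0)
```

Distinct hard surfaces (concrete vs. asphalt) would simply carry distinct parameter pairs, which is how a surface-dependent conceptualization of wash-off can be expressed with very few degrees of freedom.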
SAINT: A combined simulation language for modeling man-machine systems
NASA Technical Reports Server (NTRS)
Seifert, D. J.
1979-01-01
SAINT (Systems Analysis of Integrated Networks of Tasks) is a network modeling and simulation technique for design and analysis of complex man-machine systems. SAINT provides the conceptual framework for representing systems that consist of discrete task elements, continuous state variables, and interactions between them. It also provides a mechanism for combining human performance models and dynamic system behaviors in a single modeling structure. The SAINT technique is described and applications of SAINT are discussed.
Mukherjee, Shayantani; Warshel, Arieh
2012-01-01
The molecular origin of the action of the F0 proton gradient-driven rotor presents a major puzzle despite significant structural advances. Although important conceptual models have provided guidelines of how such systems should work, it has been challenging to generate a structure-based molecular model using physical principles that will consistently lead to the unidirectional proton-driven rotational motion during ATP synthesis. This work uses a coarse-grained (CG) model to simulate the energetics of the F0-ATPase system in the combined space defined by the rotational coordinate and the proton transport (PTR) from the periplasmic side (P) to the cytoplasmic side (N). The model establishes the molecular origin of the rotation, showing that this effect is due to asymmetry in the energetics of the proton path rather than only the asymmetry of the interaction of the Asp on the c-ring helices and Arg on the subunit-a. The simulation provides a clear conceptual background for further exploration of the electrostatic basis of proton-driven mechanochemical systems. PMID:22927379
Intercomparison of 3D pore-scale flow and solute transport simulation methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Xiaofan; Mehmani, Yashar; Perkins, William A.
2016-09-01
Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include methods that 1) explicitly model the three-dimensional geometry of pore spaces and 2) those that conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of class 1, based on direct numerical simulation using computational fluid dynamics (CFD) codes, against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of class 1 based on the immersed-boundary method (IMB), lattice Boltzmann method (LBM), smoothed particle hydrodynamics (SPH), as well as a model of class 2 (a pore-network model or PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect of conceptualizing complex three-dimensional pore geometries on solute transport in the manner of PNMs has not been fully determined. We apply all four approaches (CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and nonreactive solute transport, and intercompare the model results with previously reported experimental observations. Experimental observations are limited to measured pore-scale velocities, so solute transport comparisons are made only among the various models. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations).
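The class-2 (pore-network) conceptualization replaces explicit pore geometry with pore bodies linked by throats of given hydraulic conductance; for throats in series the effective conductance combines like resistors in series. A toy sketch, with illustrative conductance values and the simplest possible (chain) topology:

```python
# Pore-network (class-2) sketch: pores joined by throats with hydraulic
# conductances g_i; for a chain of throats in series, the effective
# conductance obeys 1/g_eff = sum(1/g_i), by analogy with series
# resistors. Conductance values are illustrative.

def series_conductance(throats):
    return 1.0 / sum(1.0 / g for g in throats)

def network_flow(throats, dp):
    """Volumetric flow through a chain of throats under pressure drop dp."""
    return series_conductance(throats) * dp

g_eff = series_conductance([2.0, 4.0, 4.0])
q = network_flow([2.0, 4.0, 4.0], dp=10.0)
```

A real PNM solves a sparse linear system for pressures at every pore body of an irregular network; the series chain above is only the smallest instance of that idea, showing how geometry is collapsed into throat conductances.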
Abstract: Two physically based and deterministic models, CASC2-D and KINEROS, are evaluated and compared for their performance in modeling sediment movement on a small agricultural watershed over several events. Each model has a different conceptualization of the watershed. CASC...
Two physically based watershed models, GSSHA and KINEROS-2 are evaluated and compared for their performances on modeling flow and sediment movement. Each model has a different watershed conceptualization. GSSHA divides the watershed into cells, and flow and sediments are routed t...
Graphical Means for Inspecting Qualitative Models of System Behaviour
ERIC Educational Resources Information Center
Bouwer, Anders; Bredeweg, Bert
2010-01-01
This article presents the design and evaluation of a tool for inspecting conceptual models of system behaviour. The basis for this research is the Garp framework for qualitative simulation. This framework includes modelling primitives, such as entities, quantities and causal dependencies, which are combined into model fragments and scenarios.…
NASA Astrophysics Data System (ADS)
Maples, S.; Fogg, G. E.; Harter, T.
2015-12-01
Accurate estimation of groundwater (GW) budgets and effective management of agricultural GW pumping remains a challenge in much of California's Central Valley (CV) due to a lack of irrigation well metering. CVHM and C2VSim are two regional-scale integrated hydrologic models that provide estimates of historical and current CV distributed pumping rates. However, both models estimate GW pumping using conceptually different agricultural water models with uncertainties that have not been adequately investigated. Here, we evaluate differences in distributed agricultural GW pumping and recharge estimates related to important differences in the conceptual framework and model assumptions used to simulate surface water (SW) and GW interaction across the root zone. Differences in the magnitude and timing of GW pumping and recharge were evaluated for a subregion (~1000 mi2) coincident with Yolo County, CA, to provide similar initial and boundary conditions for both models. Synthetic, multi-year datasets of land-use, precipitation, evapotranspiration (ET), and SW deliveries were prescribed for each model to provide realistic end-member scenarios for GW-pumping demand and recharge. Results show differences in the magnitude and timing of GW-pumping demand, deep percolation, and recharge. Discrepancies are related, in large part, to model differences in the estimation of ET requirements and representation of soil-moisture conditions. CVHM partitions ET demand, while C2VSim uses a bulk ET rate, resulting in differences in both crop-water and GW-pumping demand. Additionally, CVHM assumes steady-state soil-moisture conditions, and simulates deep percolation as a function of irrigation inefficiencies, while C2VSim simulates deep percolation as a function of transient soil-moisture storage conditions. These findings show that estimates of GW-pumping demand are sensitive to these important conceptual differences, which can impact conjunctive-use water management decisions in the CV.
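The conceptual difference in deep-percolation estimation can be caricatured in a few lines: an inefficiency-based formulation (CVHM-style) percolates water in the same step it is applied, while a transient soil-moisture bucket (C2VSim-style) delays percolation until the store fills. Both functions are sketches under assumed parameter values, not the agencies' actual root-zone modules:

```python
def dp_inefficiency(applied_water, efficiency=0.7):
    """CVHM-style sketch: deep percolation as the inefficient fraction of
    applied irrigation water (steady-state soil moisture assumed)."""
    return applied_water * (1.0 - efficiency)

def dp_bucket(applied_water_series, capacity=50.0, storage=0.0):
    """C2VSim-style sketch: deep percolation as overflow of a transient
    soil-moisture store (evapotranspiration omitted for brevity).
    Capacity and initial storage are illustrative."""
    out = []
    for w in applied_water_series:
        storage += w
        spill = max(0.0, storage - capacity)
        storage -= spill
        out.append(spill)
    return out

series = [30.0, 30.0, 30.0]                # irrigation per time step (mm)
a = [dp_inefficiency(w) for w in series]   # percolates immediately
b = dp_bucket(series)                      # delayed until the store fills
```

The two sketches apply the same water yet disagree on both the timing and the per-step magnitude of recharge, which is the kind of conceptual discrepancy the study evaluates.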
Simulation of a Schema Theory-Based Knowledge Delivery System for Scientists.
ERIC Educational Resources Information Center
Vaughan, W. S., Jr.; Mavor, Anne S.
A future, automated, interactive, knowledge delivery system for use by researchers was tested using a manual cognitive model. Conceptualized from schema/frame/script theories in cognitive psychology and artificial intelligence, this hypothetical system was simulated by two psychologists who interacted with four researchers in microbiology to…
Digital Simulation Games for Social Studies Classrooms
ERIC Educational Resources Information Center
Devlin-Scherer, Roberta; Sardone, Nancy B.
2010-01-01
Data from ten teacher candidates studying teaching methods were analyzed to determine perceptions toward digital simulation games in the area of social studies. This research can be used as a conceptual model of how current teacher candidates react to new methods of instruction and determine how education programs might change existing curricula…
An Undergraduate Laboratory Activity on Molecular Dynamics Simulations
ERIC Educational Resources Information Center
Spitznagel, Benjamin; Pritchett, Paige R.; Messina, Troy C.; Goadrich, Mark; Rodriguez, Juan
2016-01-01
Vision and Change [AAAS, 2011] outlines a blueprint for modernizing biology education by addressing conceptual understanding of key concepts, such as the relationship between structure and function. The document also highlights skills necessary for student success in 21st century Biology, such as the use of modeling and simulation. Here we…
New Approaches to Motion Cuing in Flight Simulators
1991-09-01
[Front-matter table of contents; recoverable headings: 1.0 Introduction; 1.1 The Problem of Motion Cuing in Flight Simulation; 2.0 A Conceptual Model of Pilot Control; 3.4 Task Analysis]
NASA Astrophysics Data System (ADS)
Liu, Lei
The dissertation aims to achieve two goals. First, it attempts to establish a new theoretical framework, the collaborative scientific conceptual change model, which explicitly attends to social factors and the epistemic practices of science, to understand conceptual change. Second, it reports the findings of a classroom study investigating how to apply this framework to examine trajectories of collaborative scientific conceptual change in a CSCL environment, and draws pedagogical implications. Two simulations were designed to help students make connections between macroscopic substances and the aperceptual microscopic entities and underlying processes. The reported study focused on analyzing the aggregated data from all participants and the video and audio data from twenty focal groups' collaborative activities, following the process of their conceptual development in two classroom settings. Mixed quantitative and qualitative analyses were applied to the video/audio data. Overall, participants showed significant improvements from pretest to posttest on system understanding. Group and teacher effects, as well as group variability, were detected in both students' posttest performance and their collaborative activities, and variability emerged in group interaction. Multiple data analyses found that attributes of collaborative discourse and epistemic practices made a difference in student learning. Generating warranted claims in discourse, as well as predicting, coordinating theory and evidence, and modifying knowledge in epistemic practices, had an impact on students' conceptual understanding; modifying knowledge, however, was negatively related to students' learning. The case studies show how groups differed in using the computer tools as a medium for collaborative discourse and epistemic practices.
Only with certain combinations of discourse features and epistemic practices can group interaction lead to successful convergent understanding. The results of the study imply that the collaborative scientific conceptual change model is an effective framework for studying conceptual change, and that the simulation environment may mediate the development of successful collaborative interactions (including collaborative discourse and epistemic practices) that lead to collaborative scientific conceptual change.
RAETRAD MODEL OF RADON GAS GENERATION, TRANSPORT, AND INDOOR ENTRY
The report describes the theoretical basis, implementation, and validation of the Radon Emanation and Transport into Dwellings (RAETRAD) model, a conceptual and mathematical approach for simulating radon (222Rn) gas generation and transport from soils and building foundations to ...
Scale-dependency of effective hydraulic conductivity on fire-affected hillslopes
NASA Astrophysics Data System (ADS)
Langhans, Christoph; Lane, Patrick N. J.; Nyman, Petter; Noske, Philip J.; Cawson, Jane G.; Oono, Akiko; Sheridan, Gary J.
2016-07-01
Effective hydraulic conductivity (Ke) for Hortonian overland flow modeling has been defined as a function of rainfall intensity and runon infiltration, assuming a distribution of saturated hydraulic conductivities (Ks). But the surface boundary condition during infiltration and its interaction with the distribution of Ks are not well represented in models. As a result, the mean value of the Ks distribution (K̄s), which is the central parameter for Ke, varies between scales. Here we quantify this discrepancy with a large infiltration data set comprising four different methods and scales from fire-affected hillslopes in SE Australia, using a relatively simple yet widely used conceptual model of Ke. Ponded disk (0.002 m2) and ring infiltrometers (0.07 m2) were used at the small scales, and rainfall simulations (3 m2) and small catchments (ca 3000 m2) at the larger scales. We compared K̄s between methods measured at the same time and place. Disk and ring infiltrometer measurements had on average 4.8 times higher values of K̄s than rainfall simulations and catchment-scale estimates. Furthermore, the distribution of Ks was not clearly log-normal and scale-independent, as supposed in the conceptual model. In our interpretation, water repellency and preferential flow paths increase the variance of the measured distribution of Ks and bias ponding toward areas of very low Ks during rainfall simulations and small catchment runoff events, while areas with high preferential flow capacity remain water supply-limited more than the conceptual model of Ke predicts. The study highlights problems in the current theory of scaling runoff generation.
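One reason point-scale and rainfall-scale estimates diverge follows directly from the conceptual model: under rainfall of intensity r, a point with conductivity Ks infiltrates at min(Ks, r), so the areal effective value mean(min(Ks, r)) is necessarily below the point-scale mean of Ks. A sketch under an assumed (illustrative) log-normal distribution:

```python
import random

# Under rainfall of intensity r, a point with saturated conductivity Ks
# infiltrates at min(Ks, r); the areal effective conductivity
# Ke = mean(min(Ks, r)) is therefore <= mean(Ks), so point-scale
# infiltrometer means exceed rainfall-scale estimates.
# The log-normal parameters below are illustrative, not fitted.

random.seed(42)
ks = [random.lognormvariate(2.0, 1.0) for _ in range(10000)]  # mm/h

def effective_ke(ks_values, rain_intensity):
    return sum(min(k, rain_intensity) for k in ks_values) / len(ks_values)

mean_ks = sum(ks) / len(ks)
ke = effective_ke(ks, rain_intensity=20.0)  # 20 mm/h rainfall, assumed
```

This simple censoring effect alone produces a scale discrepancy in the same direction as the 4.8-fold one reported, though the study argues that water repellency and preferential flow add biases the conceptual model does not capture.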
NASA Astrophysics Data System (ADS)
Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.
2016-11-01
Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make, including model and criteria selection, can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
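The analysis-of-variance decomposition can be sketched for a balanced design with one simulation per (forcing realisation, parameter set) cell; with no replication the interaction term absorbs the residual. The error values below are made up for illustration:

```python
# Two-way ANOVA sketch: decompose variability of an ensemble of
# streamflow errors indexed by (forcing realisation i, parameter set j)
# into forcing, parameter, and interaction sums of squares.
# With one run per cell, the interaction term absorbs the residual.
# The numbers are illustrative, not study results.

errors = [
    [1.0, 2.0, 3.0],   # forcing realisation 1 x three parameter sets
    [2.0, 4.0, 3.0],   # forcing realisation 2
]

ni, nj = len(errors), len(errors[0])
grand = sum(sum(row) for row in errors) / (ni * nj)
row_mean = [sum(row) / nj for row in errors]
col_mean = [sum(errors[i][j] for i in range(ni)) / ni for j in range(nj)]

ss_total = sum((errors[i][j] - grand) ** 2
               for i in range(ni) for j in range(nj))
ss_forcing = nj * sum((m - grand) ** 2 for m in row_mean)     # (i)
ss_params = ni * sum((m - grand) ** 2 for m in col_mean)      # (ii)
ss_interact = ss_total - ss_forcing - ss_params               # (iii)
```

Dividing each sum of squares by `ss_total` gives the fractional contribution of each source, which is the quantity such studies compare across catchments and seasons.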
Nishikawa, Tracy
1997-01-01
Two alternative conceptual models of the physical processes controlling seawater intrusion in a coastal basin in California, USA, were tested to identify a likely principal pathway for seawater intrusion. The conceptual models were tested by using a two-dimensional, finite-element groundwater flow and transport model. This pathway was identified by the conceptual model that best replicated the historical data. The numerical model was applied in cross section to a submarine canyon that is a main avenue for seawater to enter the aquifer system underlying the study area. Both models are characterized by a heterogeneous, layered, water-bearing aquifer. However, the first model is characterized by flat-lying aquifer layers and by a high value of hydraulic conductivity in the basal aquifer layer, which is thought to be a principal conduit for seawater intrusion. The second model is characterized by offshore folding, which was modeled as a very nearshore outcrop, thereby providing a shorter path for seawater to intrude. General conclusions are that: 1) the aquifer system is best modeled as a flat, heterogeneous, layered system; 2) relatively thin basal layers with relatively high values of hydraulic conductivity are the principal pathways for seawater intrusion; and 3) continuous clay layers of low hydraulic conductivity play an important role in controlling the movement of seawater.
Integration of Engine, Plume, and CFD Analyses in Conceptual Design of Low-Boom Supersonic Aircraft
NASA Technical Reports Server (NTRS)
Li, Wu; Campbell, Richard; Geiselhart, Karl; Shields, Elwood; Nayani, Sudheer; Shenoy, Rajiv
2009-01-01
This paper documents an integration of engine, plume, and computational fluid dynamics (CFD) analyses in the conceptual design of low-boom supersonic aircraft, using a variable fidelity approach. In particular, the Numerical Propulsion Simulation System (NPSS) is used for propulsion system cycle analysis and nacelle outer mold line definition, and a low-fidelity plume model is developed for plume shape prediction based on NPSS engine data and nacelle geometry. This model provides a capability for the conceptual design of low-boom supersonic aircraft that accounts for plume effects. Then a newly developed process for automated CFD analysis is presented for CFD-based plume and boom analyses of the conceptual geometry. Five test cases are used to demonstrate the integrated engine, plume, and CFD analysis process based on a variable fidelity approach, as well as the feasibility of the automated CFD plume and boom analysis capability.
Runoff prediction is a cornerstone of water resources planning, and therefore modeling performance is a key issue. This paper investigates the comparative advantages of conceptual versus process-based models in predicting warm season runoff for upland, low-yield micro-catchments...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reed, M.; Bowles, A.E.; Anderson, E.L.
1984-08-01
Feasibility and design considerations for developing computer models of migratory bowhead and gray whales and linking such models to oil spill models for application in Alaskan Outer Continental Shelf areas were evaluated. All relevant bowhead and gray whale distributional and migration data were summarized and presented at monthly intervals. The data were, for the most part, deemed sufficient to prepare whale migration simulation models. A variety of whale migration conceptual models were devised, and ranking was achieved by means of a scaling-weighted protocol. Existing oil spill trajectory and fate models, as well as conceptual models, were similarly ranked.
ENZVU--An Enzyme Kinetics Computer Simulation Based upon a Conceptual Model of Enzyme Action.
ERIC Educational Resources Information Center
Graham, Ian
1985-01-01
Discusses a simulation on enzyme kinetics based upon the ability of computers to generate random numbers. The program includes: (1) enzyme catalysis in a restricted two-dimensional grid; (2) visual representation of catalysis; and (3) storage and manipulation of data. Suggested applications and conclusions are also discussed. (DH)
A physical data model for fields and agents
NASA Astrophysics Data System (ADS)
de Jong, Kor; de Bakker, Merijn; Karssenberg, Derek
2016-04-01
Two approaches exist in simulation modeling: agent-based and field-based modeling. In agent-based (or individual-based) simulation modeling, the entities representing the system's state are represented by objects, which are bounded in space and time. Individual objects, like an animal, a house, or a more abstract entity like a country's economy, have properties representing their state. In an agent-based model this state is manipulated. In field-based modeling, the entities representing the system's state are represented by fields. Fields capture the state of a continuous property within a spatial extent, examples of which are elevation, atmospheric pressure, and water flow velocity. With respect to the technology used to create these models, the domains of agent-based and field-based modeling have often been separate worlds. In environmental modeling, widely used logical data models include feature data models for point, line and polygon objects, and the raster data model for fields. Simulation models are often either agent-based or field-based, even though the modeled system might contain both entities that are better represented by individuals and entities that are better represented by fields. We think that the reason for this dichotomy in kinds of models might be that the traditional object and field data models underlying those models are relatively low level. We have developed a higher level conceptual data model for representing both non-spatial and spatial objects, and spatial fields (De Bakker et al. 2016). Based on this conceptual data model we designed a logical and physical data model for representing many kinds of data, including the kinds used in earth system modeling (e.g. hydrological and ecological models). The goal of this work is to be able to create high level code and tools for the creation of models in which entities are representable by both objects and fields. 
Our conceptual data model is capable of representing the traditional feature data models and the raster data model, among many other data models. Our physical data model is capable of storing a first set of kinds of data, like omnipresent scalars, mobile spatio-temporal points and property values, and spatio-temporal rasters. With our poster we will provide an overview of the physical data model expressed in HDF5 and show examples of how it can be used to capture both object- and field-based information. References De Bakker, M, K. de Jong, D. Karssenberg. 2016. A conceptual data model and language for fields and agents. European Geosciences Union, EGU General Assembly, 2016, Vienna.
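The object/field distinction described above can be sketched as two representations sharing one property-lookup interface, which is what lets higher-level modelling code treat agents and fields uniformly. The class and method names are hypothetical illustrations, not the API of De Bakker et al. (2016):

```python
# Sketch of a data model in which both agents (spatially bounded objects)
# and fields (continuous properties on a raster) answer the same query:
# "what is the value of property X at location (x, y)?"
# Class and method names are hypothetical, not the published API.

class Agent:
    def __init__(self, x, y, properties):
        self.x, self.y = x, y
        self.properties = properties

    def value(self, name, x=None, y=None):
        return self.properties[name]  # state is bound to the object

class Field:
    def __init__(self, name, raster, cell_size):
        self.name, self.raster, self.cell_size = name, raster, cell_size

    def value(self, name, x, y):
        # state is continuous: look up the raster cell containing (x, y)
        return self.raster[int(y // self.cell_size)][int(x // self.cell_size)]

animal = Agent(15.0, 5.0, {"energy": 3.2})
elevation = Field("elevation",
                  [[100.0, 110.0], [120.0, 130.0]], cell_size=10.0)
```

A model step can then ask `elevation.value(...)` at an agent's position and update the agent's own properties, mixing field-based and agent-based state in one simulation loop.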
Conceptual Model Scenarios for the Vapor Intrusion Pathway
This report provides simplified simulation examples to illustrate graphically how subsurface conditions and building-specific characteristics determine the chemical distribution and indoor air concentration relative to a source concentration.
Evaluating Conceptual Site Models with Multicomponent Reactive Transport Modeling
NASA Astrophysics Data System (ADS)
Dai, Z.; Heffner, D.; Price, V.; Temples, T. J.; Nicholson, T. J.
2005-05-01
Modeling ground-water flow and multicomponent reactive chemical transport is a useful approach for testing conceptual site models and assessing the design of monitoring networks. A graded approach with three conceptual site models is presented here with a field case of tetrachloroethene (PCE) transport and biodegradation near Charleston, SC. The first model assumed a one-layer homogeneous aquifer structure with semi-infinite boundary conditions, in which an analytical solution of the reactive solute transport can be obtained with BIOCHLOR (Aziz et al., 1999). Due to the over-simplification of the aquifer structure, this simulation cannot reproduce the monitoring data. In the second approach we used GMS to develop the conceptual site model, a layer-cake multi-aquifer system, and applied a numerical module (MODFLOW and RT3D within GMS) to solve the flow and reactive transport problem. The results were better than the first approach but still did not fit the plume well because the geological structures were still inadequately defined. In the third approach we developed a complex conceptual site model by interpreting log and seismic survey data with Petra and PetraSeis. We detected a major channel and a younger channel through the PCE source area. These channels control the local ground-water flow direction and provide a preferential chemical transport pathway. Results using the third conceptual site model agree well with the monitoring concentration data. This study confirms that the bias and uncertainty from inadequate conceptual models are much larger than those introduced from an inadequate choice of model parameter values (Neuman and Wierenga, 2003; Meyer et al., 2004). Numerical modeling in this case provides key insight into the hydrogeology and geochemistry of the field site for predicting contaminant transport in the future.
Finally, critical monitoring points and performance indicator parameters are selected for future monitoring to confirm system performance.
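The flavor of the first, analytical screening step can be sketched with the standard steady-state, one-dimensional advection-dispersion solution with first-order decay — the kind of closed-form expression behind screening tools such as BIOCHLOR. Parameter values below are illustrative, not the Charleston site data.

```python
import math

def steady_centerline(c0, x, v, alpha_x, lam):
    """Centerline concentration at distance x downgradient of a constant source.

    c0      source concentration (mg/L)
    v       seepage velocity (m/yr)
    alpha_x longitudinal dispersivity (m)
    lam     first-order decay rate (1/yr)
    """
    return c0 * math.exp((x / (2.0 * alpha_x)) *
                         (1.0 - math.sqrt(1.0 + 4.0 * lam * alpha_x / v)))

# With no decay the steady plume is undiminished along the centerline;
# with biodegradation it attenuates with distance.
c_nodecay = steady_centerline(c0=1.0, x=100.0, v=30.0, alpha_x=10.0, lam=0.0)
c_decay = steady_centerline(c0=1.0, x=100.0, v=30.0, alpha_x=10.0, lam=0.5)
```

A solution like this assumes a homogeneous, semi-infinite aquifer — exactly the over-simplification the abstract notes, and the reason the analytical model could not reproduce the monitoring data once channel structures controlled the flow.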
Sheets, Rodney A.; Bair, E. Scott; Rowe, Gary L.
1998-01-01
Combined use of the tritium/helium-3 (3H/3He) dating technique and particle-tracking analysis can improve flow-model calibration. As shown at two sites in the Great Miami buried-valley aquifer in southwestern Ohio, combining 3H/3He age dating with particle tracking led to lower mean absolute errors, relative to the original calibrated models, between measured and simulated heads and between simulated travel times and 3H/3He ages. Apparent groundwater ages were obtained for water samples collected from 44 wells at two locations where previously constructed finite-difference models of groundwater flow were available (Mound Plant and Wright-Patterson Air Force Base (WPAFB)). The two-layer Mound Plant model covers 11 km2 within the buried-valley aquifer. The WPAFB model has three layers and covers 262 km2 within the buried-valley aquifer and adjacent bedrock uplands. Sampled wells were chosen along flow paths determined from potentiometric maps or particle-tracking analyses. Water samples were collected at various depths within the aquifer. In the Mound Plant area, samples used for comparison of 3H/3He ages with simulated travel times were from wells completed in the uppermost model layer. Simulated travel times agreed well with 3H/3He ages; the mean absolute error (MAE) was 3.5 years. Agreement in ages at WPAFB decreased with increasing depth in the system: the MAEs were 1.63, 17.2, and 255 years for model layers 1, 2, and 3, respectively. Discrepancies between the simulated travel times and 3H/3He ages were assumed to be due to improper conceptualization or incorrect parameterization of the flow models. Selected conceptual and parameter modifications to the models resulted in improved agreement between 3H/3He ages and simulated travel times and between measured and simulated heads and flows.
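The calibration statistic used above is a plain mean absolute error between simulated travel times and apparent ages. The age pairs below are invented for illustration; they are not the Mound Plant or WPAFB data.

```python
def mean_absolute_error(simulated, observed):
    """MAE between paired simulated and observed values."""
    pairs = list(zip(simulated, observed))
    return sum(abs(s - o) for s, o in pairs) / len(pairs)

sim_years = [4.0, 12.0, 21.0]   # particle-tracking travel times (yr)
age_years = [6.0, 10.0, 23.0]   # 3H/3He apparent ages (yr)
mae = mean_absolute_error(sim_years, age_years)
```

In the study's workflow this number is recomputed after each conceptual or parameter modification; a drop in MAE (for both heads and ages) is the evidence that the change improved the model.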
How much expert knowledge is it worth to put in conceptual hydrological models?
NASA Astrophysics Data System (ADS)
Antonetti, Manuel; Zappa, Massimiliano
2017-04-01
Both modellers and experimentalists agree on using expert knowledge to improve conceptual hydrological simulations of ungauged basins. However, they use expert knowledge differently, both for hydrologically mapping the landscape and for parameterising a given hydrological model. Modellers generally use strongly simplified (e.g. topography-based) mapping approaches and invest most of their knowledge in constraining the model by defining parameter and process relational rules. In contrast, experimentalists tend to invest all their detailed, qualitative process knowledge in obtaining a spatial distribution of areas with different dominant runoff generation processes (DRPs) that is as realistic as possible, and in defining plausibly narrow value ranges for each model parameter. Since the modelling goal is usually just to simulate runoff at a specific site, even strongly simplified hydrological classifications can lead to satisfying results, owing to the equifinality of hydrological models, overfitting problems and the numerous uncertainty sources affecting runoff simulations. Therefore, to test the extent to which expert knowledge can improve simulation results under uncertainty, we applied a typical modellers' modelling framework, relying on parameter and process constraints defined from expert knowledge, to several catchments on the Swiss Plateau. To map the spatial distribution of the DRPs, mapping approaches with increasing involvement of expert knowledge were used. Simulation results highlighted the potential added value of using all the expert knowledge available on a catchment. Combinations of event types and landscapes where even a simplified mapping approach can lead to satisfying results were also identified. Finally, the uncertainty originating from the different mapping approaches was compared with that linked to meteorological input data and catchment initial conditions.
NASA Astrophysics Data System (ADS)
Bormann, H.; Diekkrüger, B.
2003-04-01
A conceptual model is presented to simulate the water fluxes of regional catchments in Benin (West Africa). The model is applied in the framework of the IMPETUS project (an integrated approach to the efficient management of scarce water resources in West Africa), which aims to assess the effects of environmental and anthropogenic changes on the regional hydrological processes and on the water availability in Benin. In order to assess the effects of decreasing precipitation and increasing human activities on the hydrological processes in the upper Ouémé valley, a scenario analysis is performed to predict possible changes. To this end a regional hydrological model is proposed which reproduces the recent hydrological processes and which is able to consider changes in landscape properties. The study aims to check the validity of the conceptual, lumped model under the conditions of the subhumid tree savannah and therefore analyses the importance of possible sources of uncertainty. The main focus is on the uncertainties caused by input data, model parameters and model structure. As the model simulates the water fluxes at the catchment outlet of the Térou river (3133 km2) with sufficient quality, first results of a scenario analysis are presented. Changes of interest are the expected future decrease in amount and change in temporal structure of the precipitation (e.g. X percent less precipitation over the whole season versus X percent less precipitation at the end of the rainy season), the decrease in soil water storage capacity caused by erosion, and the increasing consumption of ground water for drinking water and agricultural purposes. Based on the results obtained, the perspectives of lumped, conceptual models are discussed with special regard to the management options available with this kind of model. Advantages and disadvantages compared to alternative model approaches (process based, physics based) are discussed.
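A lumped conceptual catchment model of the kind discussed above can be reduced to a single-store ("bucket") sketch: precipitation fills a soil store, evapotranspiration and a linear outflow empty it. All parameter values are illustrative, not the Térou calibration.

```python
def run_bucket(precip, pet, capacity=100.0, k=0.1, s0=50.0):
    """Daily water balance of one soil-moisture store.

    precip, pet : daily precipitation and potential ET series (mm)
    capacity    : soil water storage capacity (mm)
    k           : linear outflow coefficient (1/day)
    Returns the daily runoff series (mm/day).
    """
    s, runoff = s0, []
    for p, e in zip(precip, pet):
        s += p
        s -= min(e, s)                      # actual ET limited by storage
        overflow = max(s - capacity, 0.0)   # saturation-excess runoff
        s -= overflow
        q = k * s                           # linear store outflow
        s -= q
        runoff.append(q + overflow)
    return runoff

q = run_bucket(precip=[20.0, 0.0, 60.0], pet=[5.0, 5.0, 5.0])
```

A scenario such as erosion-driven loss of soil water storage maps directly onto the `capacity` parameter: lowering it pushes more rainfall into saturation-excess runoff, which is the kind of sensitivity a lumped model makes easy to explore.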
A simplified model of precipitation enhancement over a heterogeneous surface
NASA Astrophysics Data System (ADS)
Cioni, Guido; Hohenegger, Cathy
2018-06-01
Soil moisture heterogeneities influence the onset of convection and the subsequent evolution of precipitating systems through the triggering of mesoscale circulations. However, local evaporation also plays a role in determining precipitation amounts. Here we aim to disentangle the effects of advection and evaporation on precipitation over the course of a diurnal cycle by formulating a simple conceptual model. The derivation of the model is inspired by the results of simulations performed with a high-resolution (250 m) large eddy simulation model over a surface with varying degrees of heterogeneity. A key element of the conceptual model is the representation of precipitation as a weighted sum of advection and evaporation, each weighted by its own efficiency. The model is then used to isolate the main parameters that control precipitation variations over a spatially drier patch. It is found that, surprisingly, these changes do not depend on soil moisture itself but purely on parameters that describe the atmospheric initial state. The likelihood of enhanced precipitation over drier soils is discussed based on these parameters. Additional experiments are used to test the validity of the model.
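The conceptual model's central closure — precipitation as a weighted sum of advected and locally evaporated moisture — can be written in a few lines. The function and the numbers below are an illustrative paraphrase, not the paper's calibrated efficiencies.

```python
def patch_precip(advection, evaporation, eff_adv, eff_evap):
    """P = eps_a * A + eps_e * E, all fluxes in mm/day (illustrative)."""
    return eff_adv * advection + eff_evap * evaporation

# Over a drier patch, weaker local evaporation can be more than offset by
# stronger advective moisture convergence from the mesoscale circulation,
# so precipitation need not decrease over dry soil.
p_wet = patch_precip(advection=2.0, evaporation=4.0, eff_adv=0.5, eff_evap=0.4)
p_dry = patch_precip(advection=5.0, evaporation=2.0, eff_adv=0.5, eff_evap=0.4)
```

The example makes the paper's point concrete: whether the dry patch rains more depends on the efficiencies and the advective term, not on soil moisture alone.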
A Generic Guidance and Control Structure for Six-Degree-of-Freedom Conceptual Aircraft Design
NASA Technical Reports Server (NTRS)
Cotting, M. Christopher; Cox, Timothy H.
2005-01-01
A control system framework is presented for both real-time and batch six-degree-of-freedom simulation. This framework allows stabilization and control with multiple command options, from body rate control to waypoint guidance. Also, pilot commands can be used to operate the simulation in a pilot-in-the-loop environment. This control system framework is created by using direct vehicle state feedback with nonlinear dynamic inversion. A direct control allocation scheme is used to command aircraft effectors. Online B-matrix estimation is used in the control allocation algorithm for maximum algorithm flexibility. Primary uses for this framework include conceptual design and early preliminary design of aircraft, where vehicle models change rapidly and a knowledge of vehicle six-degree-of-freedom performance is required. A simulated airbreathing hypersonic vehicle and a simulated high performance fighter are controlled to demonstrate the flexibility and utility of the control system.
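The core of a nonlinear-dynamic-inversion loop like the one described can be sketched on a toy scalar roll-rate model. This is a minimal illustration of the inversion idea, not the paper's controller; the dynamics and gains are invented.

```python
def dynamic_inversion_step(p, p_cmd, f_p, b_est, kp=2.0):
    """Effector command inverting the model dp/dt = f(p) + B * u.

    p      current roll rate (rad/s)
    p_cmd  commanded roll rate (rad/s)
    f_p    modeled natural roll acceleration f(p) (rad/s^2)
    b_est  online estimate of control effectiveness B (rad/s^2 per unit u)
    """
    pdot_des = kp * (p_cmd - p)        # desired closed-loop dynamics
    return (pdot_des - f_p) / b_est    # invert the model: u = B^-1 (v - f)

# Closed loop on the toy plant dp/dt = -0.5 p + 4 u, integrated with Euler:
# the rate converges to the command when the inversion model is accurate.
p, dt = 0.0, 0.01
for _ in range(1000):
    u = dynamic_inversion_step(p, p_cmd=1.0, f_p=-0.5 * p, b_est=4.0)
    p += (-0.5 * p + 4.0 * u) * dt
```

The role of online B-matrix estimation in the framework is visible even here: `b_est` is the only vehicle-specific control knowledge the inversion needs, which is what makes the structure reusable across rapidly changing conceptual designs.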
Reilly, Thomas E.; Harbaugh, Arlen W.
1993-01-01
Cylindrical (axisymmetric) flow to a well is an important specialized topic of ground-water hydraulics and has been applied by many investigators to determine aquifer properties and to compute heads and flows in the vicinity of the well. A recent modification to the U.S. Geological Survey Modular Three-Dimensional Finite-Difference Ground-Water Flow Model provides the opportunity to simulate axisymmetric flow to a well. The conceptualization involves a system of concentric shells whose areas decrease toward the well, which makes it possible to reproduce the large variations in gradient in the vicinity of the well. The computer program presented serves as a preprocessor to the U.S. Geological Survey model by creating the input data file needed to implement the axisymmetric conceptualization. Data input requirements for this preprocessor are described, and a comparison with a known analytical solution indicates that the model functions appropriately.
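The concentric-shell idea can be sketched with a geometrically spaced radial grid, which concentrates resolution where gradients are steep. This mimics the spirit of the preprocessor's layout; the spacing rule and numbers are illustrative, not the program's actual algorithm.

```python
def radial_shells(r_well, r_outer, n):
    """Return n+1 geometrically spaced shell boundaries from r_well to r_outer."""
    ratio = (r_outer / r_well) ** (1.0 / n)
    return [r_well * ratio ** i for i in range(n + 1)]

# 40 shells from a 0.1 m well radius out to a 1000 m boundary.
edges = radial_shells(r_well=0.1, r_outer=1000.0, n=40)
# Shell widths shrink toward the well, resolving the steep near-well gradient
# with few cells; far from the well the cells are coarse.
widths = [b - a for a, b in zip(edges, edges[1:])]
```

In the MODFLOW adaptation, each shell's decreasing area toward the well plays the role that these shrinking widths play here: the logarithmic head profile around a pumping well is captured without an enormous uniform grid.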
NASA Astrophysics Data System (ADS)
Knoben, Wouter; Woods, Ross; Freer, Jim
2016-04-01
Conceptual hydrologic models consist of a particular arrangement of stores, fluxes and transformation functions representing spatial and temporal dynamics, depending on the modeller's choices and intended use. They have the advantages of being computationally efficient, being relatively easy model structures to reconfigure and having relatively low input data demands. This makes them well suited for large-scale and large-sample hydrology, where appropriately representing the dominant hydrologic functions of a catchment is a main concern. Given these requirements, the number of parameters in the model cannot be too high, to avoid equifinality and identifiability issues. This limits the number and level of complexity of dominant hydrologic processes the model can represent. Specific purposes and places thus require a specific model, and this has led to an abundance of conceptual hydrologic models. No structured overview of these models exists and there is no clear method to select appropriate model structures for different catchments. This study is a first step towards creating an overview of the elements that make up conceptual models, which may later assist a modeller in finding an appropriate model structure for a given catchment. To this end, this study brings together over 30 past and present conceptual models. The reviewed model structures are simply different configurations of three basic model elements (stores, fluxes and transformation functions), depending on the hydrologic processes the models are intended to represent. Differences also exist in the inner workings of the stores, fluxes and transformations, i.e. the mathematical formulations that describe each model element's intended behaviour. We investigate the hypothesis that different model structures can produce similar behavioural simulations. This can clarify the overview of model elements by grouping elements which are similar, which can improve model structure selection.
Analyzing Robotic Kinematics Via Computed Simulations
NASA Technical Reports Server (NTRS)
Carnahan, Timothy M.
1992-01-01
Computing system assists in evaluation of kinematics of conceptual robot. Displays positions and motions of robotic manipulator within work cell. Also displays interactions between robotic manipulator and other objects. Results of simulation displayed on graphical computer workstation. System includes both off-the-shelf software originally developed for automotive industry and specially developed software. Simulation system also used to design human-equivalent hand, to model optical train in infrared system, and to develop graphical interface for teleoperator simulation system.
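The kind of computation such a kinematics system performs can be illustrated with the forward kinematics of a two-link planar manipulator: joint angles in, end-effector position in the work cell out. Link lengths and angles here are illustrative.

```python
import math

def forward_kinematics(l1, l2, theta1, theta2):
    """End-effector (x, y) of a two-link planar arm.

    l1, l2          link lengths
    theta1, theta2  joint angles in radians (theta2 relative to link 1)
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Fully extended along the x axis, the reach is simply l1 + l2.
x, y = forward_kinematics(1.0, 0.8, 0.0, 0.0)
```

Evaluating this map over a sweep of joint angles traces the manipulator's reachable workspace, which is exactly the information a designer needs when checking interactions between the arm and other objects in the cell.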
Rule based design of conceptual models for formative evaluation
NASA Technical Reports Server (NTRS)
Moore, Loretta A.; Chang, Kai; Hale, Joseph P.; Bester, Terri; Rix, Thomas; Wang, Yaowen
1994-01-01
A Human-Computer Interface (HCI) Prototyping Environment with embedded evaluation capability has been investigated. This environment will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. This environment, which allows for rapid prototyping and evaluation of graphical interfaces, includes the following four components: (1) a HCI development tool; (2) a low fidelity simulator development tool; (3) a dynamic, interactive interface between the HCI and the simulator; and (4) an embedded evaluator that evaluates the adequacy of a HCI based on a user's performance. The embedded evaluation tool collects data while the user is interacting with the system and evaluates the adequacy of an interface based on a user's performance. This paper describes the design of conceptual models for the embedded evaluation system using a rule-based approach.
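The rule-based evaluation described above can be sketched as a small ordered rule table mapping user-performance metrics to an adequacy verdict. The metric names and thresholds are invented for illustration; they are not the system's actual rules.

```python
# Ordered rules: the first condition that matches the collected metrics fires.
RULES = [
    (lambda m: m["error_rate"] > 0.2, "inadequate: high error rate"),
    (lambda m: m["task_time_s"] > 120, "marginal: slow task completion"),
    (lambda m: True, "adequate"),          # default rule
]

def evaluate_interface(metrics):
    """Return the adequacy verdict for one user's performance data."""
    for condition, verdict in RULES:
        if condition(metrics):
            return verdict

# Data collected while the user interacts with the prototype HCI:
verdict = evaluate_interface({"error_rate": 0.05, "task_time_s": 90})
```

Keeping the rules as data, separate from the evaluation loop, is what lets an embedded evaluator be refined during formative evaluation without touching the prototyping environment itself.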
A geostationary Earth orbit satellite model using Easy Java Simulation
NASA Astrophysics Data System (ADS)
Wee, Loo Kang; Hwee Goh, Giam
2013-01-01
We develop an Easy Java Simulation (EJS) model for students to visualize geostationary orbits near Earth, modelled using a Java 3D implementation of the EJS 3D library. The simplified physics model is described and simulated using a simple constant angular velocity equation. We discuss four computer model design ideas: (1) a simple and realistic 3D view and associated learning in the real world; (2) comparative visualization of permanent geostationary satellites; (3) examples of non-geostationary orbits of different rotation senses, periods and planes; and (4) an incorrect physics model for conceptual discourse. General feedback from the students has been relatively positive, and we hope teachers will find the computer model useful in their own classes.
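The "simple constant angular velocity equation" of the model amounts to a few lines: a geostationary satellite orbits at the Earth's sidereal rotation rate, which fixes its radius through GM = ω²r³. This is a language-agnostic sketch of that physics, not the EJS/Java code.

```python
import math

GM = 3.986004418e14    # Earth's gravitational parameter (m^3/s^2)
T_SIDEREAL = 86164.1   # sidereal day (s)

omega = 2.0 * math.pi / T_SIDEREAL          # satellite angular velocity (rad/s)
r_geo = (GM / omega ** 2) ** (1.0 / 3.0)    # geostationary radius, ~42,164 km

def position(t):
    """Satellite (x, y) in the equatorial plane at time t, starting on the x axis."""
    return r_geo * math.cos(omega * t), r_geo * math.sin(omega * t)
```

Animating `position(t)` alongside a rotating Earth reproduces the model's key visual: the satellite stays fixed over one equatorial point, while any other period, plane, or rotation sense breaks the lock.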
Fulton, John W.; Koerkle, Edward H.; McAuley, Steven D.; Hoffman, Scott A.; Zarr, Linda F.
2005-01-01
The Spring Creek Basin, Centre County, Pa., is experiencing some of the most rapid growth and development within the Commonwealth. This trend has resulted in land-use changes and increased water use, which will affect the quantity and quality of stormwater runoff, surface water, ground water, and aquatic resources within the basin. The U.S. Geological Survey (USGS), in cooperation with the ClearWater Conservancy (CWC), Spring Creek Watershed Community (SCWC), and Spring Creek Watershed Commission (SCWCm), has developed a Watershed Plan (Plan) to assist decision makers in water-resources planning. One element of the Plan is to provide a summary of the basin characteristics and a conceptual model that incorporates the hydrogeologic characteristics of the basin. This report presents hydrogeologic data for the basin and a conceptual model that can be used as the basis for simulating surface-water and ground-water flow within the basin. Basin characteristics; sources of data referenced in this text; physical characteristics such as climate, physiography, topography, and land use; hydrogeologic characteristics; and water-quality characteristics are discussed. A conceptual model is a simplified description of the physical components and interaction of the surface- and ground-water systems. The purpose of constructing a conceptual model is to simplify the problem and to organize the available data so that the system can be analyzed accurately. Simplification is necessary because a complete accounting of a system such as Spring Creek is not possible. The data and the conceptual model could be used in development of a fully coupled numerical model that dynamically links surface water, ground water, and land-use changes. The model could be used by decision makers to manage water resources within the basin and as a prototype that is transferable to other watersheds.
Intercomparison of 3D pore-scale flow and solute transport simulation methods
Mehmani, Yashar; Schoenherr, Martin; Pasquali, Andrea; ...
2015-09-28
Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include 1) methods that explicitly model the three-dimensional geometry of pore spaces and 2) methods that conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of the first type, using computational fluid dynamics (CFD) codes employing a standard finite volume method (FVM), against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of the first type based on the lattice Boltzmann method (LBM) and smoothed particle hydrodynamics (SPH), as well as a model of the second type, a pore-network model (PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect of conceptualizing complex three-dimensional pore geometries on solute transport in the manner of PNMs has not been fully determined. We apply all four approaches (FVM-based CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and (for capable codes) nonreactive solute transport, and intercompare the model results. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations). Generally good agreement was achieved among the various approaches, but some differences were observed depending on the model context. The intercomparison work was challenging because of variable capabilities of the codes, and inspired some code enhancements to allow consistent comparison of flow and transport simulations across the full suite of methods.
This paper provides support for confidence in a variety of pore-scale modeling methods and motivates further development and application of pore-scale simulation methods.
Intercomparison of 3D pore-scale flow and solute transport simulation methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Xiaofan; Mehmani, Yashar; Perkins, William A.
2016-09-01
Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include 1) methods that explicitly model the three-dimensional geometry of pore spaces and 2) methods that conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of the first type, using computational fluid dynamics (CFD) codes employing a standard finite volume method (FVM), against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of the first type based on the lattice Boltzmann method (LBM) and smoothed particle hydrodynamics (SPH), as well as a model of the second type, a pore-network model (PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect of conceptualizing complex three-dimensional pore geometries on solute transport in the manner of PNMs has not been fully determined. We apply all four approaches (FVM-based CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and (for capable codes) nonreactive solute transport, and intercompare the model results. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations). Generally good agreement was achieved among the various approaches, but some differences were observed depending on the model context. The intercomparison work was challenging because of variable capabilities of the codes, and inspired some code enhancements to allow consistent comparison of flow and transport simulations across the full suite of methods.
This study provides support for confidence in a variety of pore-scale modeling methods and motivates further development and application of pore-scale simulation methods.
Detecting hydrological changes through conceptual model
NASA Astrophysics Data System (ADS)
Viola, Francesco; Caracciolo, Domenico; Pumo, Dario; Francipane, Antonio; Valerio Noto, Leonardo
2015-04-01
Natural changes and human modifications in hydrological systems coevolve and interact in a coupled and interlinked way. If, on one hand, climatic changes are stochastic and non-steady and affect the hydrological systems, on the other hand, human-induced changes due to over-exploitation of soils and water resources modify the natural landscape, the water fluxes and their partitioning. Indeed, the traditional assumption of static systems in hydrological analysis, which has been adopted for a long time, fails whenever transient climatic conditions and/or land use changes occur. Time series analysis is a way to explore environmental changes together with societal changes; unfortunately, the inability to distinguish between causes restricts the scope of this method. In order to overcome this limitation, it is possible to couple time series analysis with a suitable hydrological model, such as a conceptual hydrological model, which offers a schematization of the complex dynamics acting within a basin. Assuming that model parameters represent morphological basin characteristics and that calibration is a way to detect the hydrological signature at a specific moment, it is possible to argue that calibrating the model over different time windows could be a method for detecting potential hydrological changes. In order to test the capability of a conceptual model to detect hydrological changes, this work presents different "in silico" experiments. A synthetic basin is forced with an ensemble of possible future scenarios generated with a stochastic weather generator able to simulate steady and non-steady climatic conditions. The experiments refer to a Mediterranean climate, which is characterized by marked seasonality, and consider the outcomes of the IPCC 5th report for describing climate evolution in the next century.
In particular, in order to generate future climate change scenarios, a stochastic downscaling in space and time is carried out using realizations of an ensemble of General Circulation Models (GCMs) for the future scenarios 2046-2065 and 2081-2100. Land use changes (i.e., changes in the fraction of impervious area due to increasing urbanization) are explicitly simulated, while the reference hydrological responses are assessed by the spatially distributed, process-based hydrological model tRIBS, the TIN-based Real-time Integrated Basin Simulator. Several scenarios have been created, describing hypothetical centuries with steady conditions, climate change conditions, land use change conditions and, finally, complex conditions involving both transient climatic modifications and gradual land use changes. A conceptual lumped model, the EHSM (EcoHydrological Streamflow Model), is calibrated for the above-mentioned scenarios over different time windows. The calibrated parameters show high sensitivity to anthropic variations in land use and/or climatic variability. Land use changes are clearly visible from the evolution of the parameters, especially when steady climatic conditions are considered. When the increase in urbanization is coupled with a rainfall reduction, the ability to detect human interventions through the analysis of conceptual model parameters is weakened.
NASA Astrophysics Data System (ADS)
Clark, Martyn P.; Kavetski, Dmitri
2010-10-01
A major neglected weakness of many current hydrological models is the numerical method used to solve the governing model equations. This paper thoroughly evaluates several classes of time stepping schemes in terms of numerical reliability and computational efficiency in the context of conceptual hydrological modeling. Numerical experiments are carried out using 8 distinct time stepping algorithms and 6 different conceptual rainfall-runoff models, applied in a densely gauged experimental catchment, as well as in 12 basins with diverse physical and hydroclimatic characteristics. Results show that, over vast regions of the parameter space, the numerical errors of fixed-step explicit schemes commonly used in hydrology routinely dwarf the structural errors of the model conceptualization. This substantially degrades model predictions, but also, disturbingly, generates fortuitously adequate performance for parameter sets where numerical errors compensate for model structural errors. Simply running fixed-step explicit schemes with shorter time steps provides a poor balance between accuracy and efficiency: in some cases daily-step adaptive explicit schemes with moderate error tolerances achieved comparable or higher accuracy than 15 min fixed-step explicit approximations but were nearly 10 times more efficient. From the range of simple time stepping schemes investigated in this work, the fixed-step implicit Euler method and the adaptive explicit Heun method emerge as good practical choices for the majority of simulation scenarios. In combination with the companion paper, where impacts on model analysis, interpretation, and prediction are assessed, this two-part study vividly highlights the impact of numerical errors on critical performance aspects of conceptual hydrological models and provides practical guidelines for robust numerical implementation.
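The schemes compared above can be demonstrated on a toy conceptual store dS/dt = P − kS, which has the exact solution S(t) = P/k + (S₀ − P/k)·exp(−kt). The sketch below contrasts fixed-step explicit and implicit Euler against that exact solution; the setup is illustrative, not one of the paper's 6 rainfall-runoff models.

```python
import math

P, k, s0, t_end = 2.0, 0.5, 10.0, 10.0   # inflow, outflow coeff., initial store

def explicit_euler(dt):
    """Fixed-step explicit Euler: S_new = S + dt * (P - k*S)."""
    s = s0
    for _ in range(round(t_end / dt)):
        s += dt * (P - k * s)
    return s

def implicit_euler(dt):
    """Fixed-step implicit Euler: solve S_new = S + dt * (P - k*S_new)."""
    s = s0
    for _ in range(round(t_end / dt)):
        s = (s + dt * P) / (1.0 + dt * k)
    return s

exact = P / k + (s0 - P / k) * math.exp(-k * t_end)
err_explicit = abs(explicit_euler(1.0) - exact)
err_implicit = abs(implicit_euler(1.0) - exact)
err_fine = abs(explicit_euler(0.01) - exact)   # shorter steps shrink the error
```

Even this linear case shows the paper's theme in miniature: at a daily step both fixed-step schemes carry a visible truncation error that a parameter calibration could silently absorb, which is exactly how numerical error comes to masquerade as (or compensate for) structural error.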
Detailed Modelling of Kinetic Biodegradation Processes in a Laboratory Microcosm
NASA Astrophysics Data System (ADS)
Watson, I.; Oswald, S.; Banwart, S.; Mayer, U.
2003-04-01
Biodegradation of organic contaminants in soil and groundwater usually takes place via different redox processes occurring sequentially as well as simultaneously. We used numerical modelling of a long-term lab microcosm experiment to simulate the dynamic behaviour of fermentation and respiration in the aqueous phase in contact with the sandstone material, and to develop a conceptual model describing these processes. Aqueous speciation, surface complexation, and mineral dissolution and precipitation were also taken into account. Fermentation can be the first step of the degradation process, producing intermediate species which are subsequently consumed by terminal electron-accepting processes (TEAPs). Microbial growth and substrate utilisation kinetics are coupled via a formulation that also includes aqueous speciation and other geochemical reactions. Competitive exclusion between TEAPs is integral to the conceptual model of the simulation, and the results indicate that exclusion is not complete; some overlap is found between TEAPs. The model was used to test approaches, like the partial equilibrium approach, that currently use hydrogen levels to diagnose prevalent TEAPs in groundwater. The observed patterns of hydrogen and acetate concentrations were reproduced well by the simulations, and the results show the relevance of kinetics, lag times and inhibition, and especially that intermediate products play a key role.
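The kind of rate law underlying such a model can be sketched as dual-Monod kinetics for a terminal electron-accepting process (TEAP), with an inhibition factor that partially suppresses it while a more favourable electron acceptor remains. The function form is a generic textbook formulation and the constants are illustrative, not the microcosm calibration.

```python
def teap_rate(v_max, donor, k_donor, acceptor, k_acceptor,
              inhibitor=0.0, k_inhibit=1.0):
    """Substrate utilisation rate with dual-Monod and inhibition factors.

    v_max                maximum rate (mol/L/s)
    donor, acceptor      electron donor / acceptor concentrations (mol/L)
    k_donor, k_acceptor  half-saturation constants (mol/L)
    inhibitor, k_inhibit competing acceptor concentration and its constant
    """
    monod_donor = donor / (k_donor + donor)
    monod_acceptor = acceptor / (k_acceptor + acceptor)
    inhibition = k_inhibit / (k_inhibit + inhibitor)   # -> 1 as inhibitor -> 0
    return v_max * monod_donor * monod_acceptor * inhibition

# e.g. sulphate reduction on acetate, partially suppressed while a more
# favourable acceptor is still present:
r_free = teap_rate(1e-8, donor=1e-4, k_donor=1e-5,
                   acceptor=5e-4, k_acceptor=1e-4)
r_inhib = teap_rate(1e-8, donor=1e-4, k_donor=1e-5,
                    acceptor=5e-4, k_acceptor=1e-4,
                    inhibitor=2e-4, k_inhibit=1e-4)
```

Because the inhibition factor is smooth rather than an on/off switch, the suppressed TEAP keeps a reduced but nonzero rate, which is consistent with the study's finding that competitive exclusion between TEAPs is not complete.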
NASA Astrophysics Data System (ADS)
Shafii, Mahyar; Basu, Nandita; Schiff, Sherry; Van Cappellen, Philippe
2017-04-01
Dramatic increase in nitrogen circulating in the biosphere due to anthropogenic activities has resulted in impairment of water quality in groundwater and surface water causing eutrophication in coastal regions. Understanding the fate and transport of nitrogen from landscape to coastal areas requires exploring the drivers of nitrogen processes in both time and space, as well as the identification of appropriate flow pathways. Conceptual models can be used as diagnostic tools to provide insights into such controls. However, diagnostic evaluation of coupled hydrological-biogeochemical models is challenging. This research proposes a top-down methodology utilizing hydrochemical signatures to develop conceptual models for simulating the integrated streamflow and nitrate responses while taking into account dominant controls on nitrate variability (e.g., climate, soil water content, etc.). Our main objective is to seek appropriate model complexity that sufficiently reproduces multiple hydrological and nitrate signatures. Having developed a suitable conceptual model for a given watershed, we employ it in sensitivity studies to demonstrate the dominant process controls that contribute to the nitrate response at scales of interest. We apply the proposed approach to nitrate simulation in a range of small to large sub-watersheds in the Grand River Watershed (GRW) located in Ontario. Such multi-basin modeling experiment will enable us to address process scaling and investigate the consequences of lumping processes in terms of models' predictive capability. The proposed methodology can be applied to the development of large-scale models that can help decision-making associated with nutrients management at regional scale.
Multi-model groundwater-management optimization: reconciling disparate conceptual models
NASA Astrophysics Data System (ADS)
Timani, Bassel; Peralta, Richard
2015-09-01
Disagreement among policymakers often involves policy issues and differences between the decision makers' implicit utility functions. Significant disagreement can also exist concerning conceptual models of the physical system. Disagreement on the validity of a single simulation model delays discussion on policy issues and prevents the adoption of consensus management strategies. For such a contentious situation, the proposed multi-conceptual model optimization (MCMO) can help stakeholders reach a compromise strategy. MCMO computes mathematically optimal strategies that simultaneously satisfy analogous constraints and bounds in multiple numerical models that differ in boundary conditions, hydrogeologic stratigraphy, and discretization. Shadow prices and trade-offs guide the process of refining the first MCMO-developed "multi-model" strategy into a realistic compromise management strategy. By employing automated cycling, MCMO is practical for linear and nonlinear aquifer systems. In this reconnaissance study, MCMO application to the multilayer Cache Valley (Utah and Idaho, USA) river-aquifer system employs two simulation models with analogous background conditions but different vertical discretization and boundary conditions. The objective is to maximize additional safe pumping (beyond current pumping), subject to constraints on groundwater head and seepage from the aquifer to surface waters. MCMO application reveals that, in order to protect the local ecosystem, increased groundwater pumping can satisfy only 40% of the projected increase in water demand. To explore the possibility of increasing that pumping while protecting the ecosystem, MCMO clearly identifies localities requiring additional field data. MCMO is applicable to areas and optimization problems other than those treated here. Steps to prepare comparable sub-models for MCMO use are area-dependent.
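The core MCMO idea, one pumping strategy constrained simultaneously by two disparate simulation models, can be reduced to a toy example using linear response matrices. Everything here is a hypothetical illustration: the two matrices, the drawdown limits, and the coarse grid search stand in for the actual simulation models and optimizer.

```python
# Drawdown per unit pumping at two control points, for two rival conceptual
# models of the same aquifer (illustrative coefficients, not Cache Valley data).
MODEL_A = [[0.8, 0.3], [0.2, 0.9]]
MODEL_B = [[1.0, 0.4], [0.3, 0.7]]
HEAD_LIMIT = [5.0, 5.0]   # allowable drawdown at each control point

def feasible(q, model):
    """A strategy q is feasible if predicted drawdowns stay within limits."""
    return all(sum(c * qi for c, qi in zip(row, q)) <= limit
               for row, limit in zip(model, HEAD_LIMIT))

def best_strategy(step=0.25, qmax=10.0):
    """Maximize total pumping feasible in BOTH models (coarse grid search)."""
    best, best_total = (0.0, 0.0), 0.0
    n = int(qmax / step) + 1
    for i in range(n):
        for j in range(n):
            q = (i * step, j * step)
            total = q[0] + q[1]
            if total > best_total and feasible(q, MODEL_A) and feasible(q, MODEL_B):
                best, best_total = q, total
    return best, best_total

q, total = best_strategy()
```

A strategy optimal for either model alone would typically violate the other model's constraints; requiring joint feasibility is what makes the result a defensible compromise.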
Conceptualizing Peatlands in a Physically-Based Spatially Distributed Hydrologic Model
NASA Astrophysics Data System (ADS)
Downer, Charles; Wahl, Mark
2017-04-01
As part of a research effort focused on climate change effects on permafrost near Fairbanks, Alaska, it became apparent that peat soils, overlain by thick sphagnum moss, had a considerable effect on the overall hydrology. Peatlands represent a confounding mixture of vegetation, soils, and water that presents challenges for conceptualizing and parametrizing hydrologic models. We employed the Gridded Surface Subsurface Hydrologic Analysis (GSSHA) model in our analysis of the Caribou Poker Creek Experimental Watershed (CPCRW). GSSHA is a physically-based, spatially distributed watershed model developed by the U.S. Army to simulate important streamflow-generating processes (Downer and Ogden, 2004). The model enables simulation of surface water and groundwater interactions, as well as soil temperature and frozen ground effects on subsurface water movement. The test site is a 104 km2 basin located in the Yukon-Tanana Uplands of the Northern Plateaus Physiographic Province centered on 65°10' N latitude and 147°30' W longitude. The area lies above the Chattanika River floodplain and is characterized by rounded hilltops with gentle slopes and alluvium-floored valleys having minimal relief (Wahrhaftig, 1965), underlain by a mica schist of the Birch Creek formation (Rieger et al., 1972). The region has a cold continental climate characterized by short warm summers and long cold winters. Observed stream flows indicated significant groundwater contribution, with sustained base flows even during dry periods. A site visit exposed the presence of surface water flows, indicating a mixed basin that would require both surface and subsurface simulation capability to properly capture the response. Soils in the watershed are predominately silt loam underlain by shallow fractured bedrock. Throughout much of the basin, a thick layer of live sphagnum moss and fine peat covers the ground surface. A restrictive layer of permafrost is found on north-facing slopes. 
The combination of thick moss and peat soils presented a conundrum in terms of conceptualizing the hydrology and identifying reasonable parameter ranges for physical properties. Various combinations of overland roughness, surface retention, and subsurface flow were used to represent the peatlands. This process produced some interesting results that may shed light on the dominant hydrologic processes associated with peatlands, as well as on which hydrologic conceptualizations, simulation tools, and approaches are applicable in modeling peatland hydrology. Downer, C.W., Ogden, F.L., 2004. GSSHA: Model to simulate diverse stream flow producing processes. J. Hydrol. Eng. 161-174. Rieger, S., Furbush, C.E., Schoephorster, D.B., Summerfield Jr., H., Geiger, L.C., 1972. Soils of the Caribou-Poker Creeks Research Watershed, Interior Alaska. Hanover, New Hampshire. Wahrhaftig, C., 1965. Physiographic Divisions of Alaska. Washington, DC.
Dynamic Simulation of Crime Perpetration and Reporting to Examine Community Intervention Strategies
ERIC Educational Resources Information Center
Yonas, Michael A.; Burke, Jessica G.; Brown, Shawn T.; Borrebach, Jeffrey D.; Garland, Richard; Burke, Donald S.; Grefenstette, John J.
2013-01-01
Objective: To develop a conceptual computational agent-based model (ABM) to explore community-wide versus spatially focused crime reporting interventions to reduce community crime perpetrated by youth. Method: Agents within the model represent individual residents and interact on a two-dimensional grid representing an abstract nonempirically…
DOT National Transportation Integrated Search
2002-08-01
Building upon the conceptual framework developed during our year one research, a container port and multimodal transportation demand simulation model is applied. The model selects the least-cost (vessel-port-rail-truck) route from sources to markets,...
A SCREENING MODEL FOR SIMULATING DNAPL FLOW AND TRANSPORT IN POROUS MEDIA: THEORETICAL DEVELOPMENT
There exists a need for a simple tool that will allow us to analyze a DNAPL contamination scenario from free-product release to transport of soluble constituents to downgradient receptor wells. The objective of this manuscript is to present the conceptual model and formulate the ...
Research in Distance Education: A System Modeling Approach.
ERIC Educational Resources Information Center
Saba, Farhad; Twitchell, David
This demonstration of the use of a computer simulation research method based on the System Dynamics modeling technique for studying distance education reviews research methods in distance education, including the broad categories of conceptual and case studies, and presents a rationale for the application of systems research in this area. The…
Kovalchuk, Sergey V; Funkner, Anastasia A; Metsker, Oleg G; Yakovlev, Aleksey N
2018-06-01
An approach to building a hybrid simulation of patient flow is introduced with a combination of data-driven methods for automation of model identification. The approach is described with a conceptual framework and basic methods for combining different techniques. The implementation of the proposed approach for simulation of the acute coronary syndrome (ACS) was developed and used in an experimental study. A combination of data, text, and process mining techniques and machine learning approaches for the analysis of electronic health records (EHRs), together with discrete-event simulation (DES) and queueing theory for the simulation of patient flow, was proposed. The performed analysis of EHRs for ACS patients enabled identification of several classes of clinical pathways (CPs), which were used to implement a more realistic simulation of the patient flow. The developed solution was implemented using Python libraries (SimPy, SciPy, and others). The proposed approach enables a more realistic and detailed simulation of the patient flow within a group of related departments. An experimental study shows improved simulation of length of stay for the ACS patient flow derived from EHRs of the Almazov National Medical Research Centre in Saint Petersburg, Russia. The proposed approach, methods, and solutions provide a conceptual, methodological, and programming framework for implementing simulations of complex and diverse scenarios within a flow of patients for different purposes: decision making, training, management optimization, and others. Copyright © 2018 Elsevier Inc. All rights reserved.
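The discrete-event core of such a patient-flow simulation can be illustrated with a single-department queue. This sketch deliberately uses only the standard library rather than SimPy (which the study used), and its arrival and service rates are illustrative assumptions, not values from the ACS data.

```python
import random

def simulate(n_patients=500, arrival_rate=1.0, service_rate=1.2, seed=42):
    """Single-server patient flow: returns mean length of stay (wait + service).

    Patients arrive as a Poisson process; service (treatment) times are
    exponential. A patient waits whenever the department is busy.
    """
    rng = random.Random(seed)
    t, free_at, total_los = 0.0, 0.0, 0.0
    for _ in range(n_patients):
        t += rng.expovariate(arrival_rate)   # next arrival time
        start = max(t, free_at)              # wait if the department is busy
        service = rng.expovariate(service_rate)
        free_at = start + service
        total_los += free_at - t             # length of stay for this patient
    return total_los / n_patients

mean_los = simulate()
```

Splitting patients into classes of clinical pathways, as the abstract describes, would amount to sampling the service-time parameters per class; queueing theory (here, the M/M/1 mean sojourn time 1/(μ - λ)) gives a sanity check on the simulated length of stay.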
NASA Astrophysics Data System (ADS)
Wu, Y.; Blodau, C.
2013-08-01
Elevated nitrogen deposition and climate change alter the vegetation communities and carbon (C) and nitrogen (N) cycling in peatlands. To address this issue we developed a new process-oriented biogeochemical model (PEATBOG) for analyzing coupled carbon and nitrogen dynamics in northern peatlands. The model consists of four submodels, which simulate: (1) daily water table depth and depth profiles of soil moisture, temperature and oxygen levels; (2) competition among three plant functional types (PFTs), plant production and litter production; (3) decomposition of peat; and (4) production, consumption, diffusion and export of dissolved C and N species in soil water. The model is novel in the integration of the C and N cycles, the explicit spatial resolution belowground, the consistent conceptualization of movement of water and solutes, the incorporation of stoichiometric controls on elemental fluxes and a consistent conceptualization of C and N reactivity in vegetation and soil organic matter. The model was evaluated for the Mer Bleue Bog, near Ottawa, Ontario, with regard to simulation of soil moisture and temperature and the most important processes in the C and N cycles. Model sensitivity was tested for nitrogen input, precipitation, and temperature, and the choices of the most uncertain parameters were justified. A simulation of nitrogen deposition over 40 yr demonstrates the advantages of the PEATBOG model in tracking biogeochemical effects and vegetation change in the ecosystem.
General-circulation-model simulations of future snowpack in the western United States
McCabe, G.J.; Wolock, D.M.
1999-01-01
April 1 snowpack accumulations measured at 311 snow courses in the western United States (U.S.) are grouped using a correlation-based cluster analysis. A conceptual snow accumulation and melt model and monthly temperature and precipitation for each cluster are used to estimate cluster-average April 1 snowpack. The conceptual snow model is subsequently used to estimate future snowpack by using changes in monthly temperature and precipitation simulated by the Canadian Centre for Climate Modeling and Analysis (CCC) and the Hadley Centre for Climate Prediction and Research (HADLEY) general circulation models (GCMs). Results for the CCC model indicate that although winter precipitation is estimated to increase in the future, increases in temperature will result in large decreases in April 1 snowpack for the entire western U.S. Results for the HADLEY model also indicate large decreases in April 1 snowpack for most of the western U.S., but the decreases are not as severe as those estimated using the CCC simulations. Although snowpack conditions are estimated to decrease for most areas of the western U.S., both GCMs estimate a general increase in winter precipitation toward the latter half of the next century. Thus, water quantity may increase in the western U.S.; however, the timing of runoff will be altered because precipitation will more frequently occur as rain rather than as snow.
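A conceptual snow accumulation and melt model of the kind described above is often a temperature-index (degree-day) scheme driven by monthly temperature and precipitation. The sketch below is a hypothetical minimal version; the snow/rain threshold and melt factor are illustrative assumptions, not calibrated values from the study.

```python
T_SNOW = 0.0        # deg C at or below which precipitation falls as snow
MELT_FACTOR = 15.0  # mm of melt per deg C above threshold, per month

def simulate_snowpack(temps, precips, swe=0.0):
    """Return snow water equivalent (mm) at the end of each month."""
    history = []
    for t, p in zip(temps, precips):
        if t <= T_SNOW:
            swe += p                                   # accumulate as snow
        else:
            swe = max(swe - MELT_FACTOR * (t - T_SNOW), 0.0)  # melt
        history.append(swe)
    return history

# A warmer climate shifts precipitation to rain and melts the pack earlier,
# so end-of-season snowpack shrinks even with 10% more precipitation:
cool = simulate_snowpack([-5, -2, 1, 6], [50, 60, 40, 30])
warm = simulate_snowpack([-2, 1, 4, 9], [55, 66, 44, 33])
```

This reproduces the abstract's qualitative finding: more winter precipitation, yet less April snowpack, because warming changes both the phase of precipitation and the timing of melt.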
Surface-water hydrology and runoff simulations for three basins in Pierce County, Washington
Mastin, M.C.
1996-01-01
The surface-water hydrology in Clear, Clarks, and Clover Creek Basins in central Pierce County, Washington, is described with a conceptual model of the runoff processes and then simulated with the Hydrological Simulation Program-FORTRAN (HSPF), a continuous, deterministic hydrologic model. The study area is currently undergoing a rapid conversion of rural, undeveloped land to urban and suburban land that often changes the flow characteristics of the streams that drain these lands. The complex interactions of land cover, climate, soils, topography, channel characteristics, and ground-water flow patterns determine the surface-water hydrology of the study area and require a complex numerical model to assess the impact of urbanization on streamflows. The U.S. Geological Survey completed this investigation in cooperation with the Storm Drainage and Surface Water Management Utility within the Pierce County Department of Public Works to describe the important rainfall-runoff processes within the study area and to develop a simulation model to be used as a tool to predict changes in runoff characteristics resulting from changes in land use. The conceptual model, a qualitative representation of the study basins, links the physical characteristics to the runoff process of the study basins. The model incorporates 11 generalizations identified by the investigation, eight of which describe runoff from hillslopes, and three that account for the effects of channel characteristics and ground-water flow patterns on runoff. Stream discharge was measured at 28 sites and precipitation was measured at six sites for 3 years in two overlapping phases during the period of October 1989 through September 1992 to calibrate and validate the simulation model. Comparison of rainfall data from October 1989 through September 1992 shows the data-collection period beginning with 2 wet water years followed by the relatively dry 1992 water year. 
Runoff was simulated with two basin models (the Clover Creek Basin model and the Clear-Clarks Basin model) by incorporating the generalizations of the conceptual model into the construction of two HSPF numerical models. Initially, the process-related parameters for runoff from glacial-till hillslopes were calibrated with numerical models for three catchment sites and one headwater basin where streamflows were continuously measured and little or no influence from ground water, channel storage, or channel losses affected runoff. At one of the catchments soil moisture was monitored and compared with simulated soil moisture. The values for these parameters were used in the basin models. Basin models were calibrated to the first year of observed streamflow data by adjusting other parameters in the numerical model that simulated channel losses, simulated channel storage in a few of the reaches in the headwaters and in the floodplain of the main stem of Clover Creek, and simulated volume and outflow of the ground-water reservoir representing the regional ground-water aquifers. The models were run for a second year without any adjustments, and simulated results were compared with observed results as a measure of validation of the models. The investigation showed the importance of defining the ground-water flow boundaries and demonstrated a simple method of simulating the influence of the regional ground-water aquifer on streamflows. In the Clover Creek Basin model, ground-water flow boundaries were used to define subbasins containing mostly glacial outwash soils and not containing any surface drainage channels. In the Clear-Clarks Basin model, ground-water flow boundaries outlined a recharge area outside the surface-water boundaries of the basin that was incorporated into the model in order to provide sufficient water to balance simulated ground-water outflows to the creeks. 
A simulated ground-water reservoir used to represent regional ground-water flow processes successfully provided the proper water balance of inflows and outfl
ERIC Educational Resources Information Center
Schoenfeld, Alan H.
1992-01-01
Reacts to Ohlsson, Ernst, and Rees' paper by initially discussing the costs of methodology that utilizes artificial intelligence (AI) to model cognitive processes. Raises three concerns with the paper: insufficient clarification of the meaning of conceptual versus procedural understanding of base-10 subtraction; realism of the learning model; and…
NASA Astrophysics Data System (ADS)
Jacques, Diederik
2017-04-01
As soil functions are governed by a multitude of interacting hydrological, geochemical and biological processes, simulation tools coupling mathematical models for interacting processes are needed. Coupled reactive transport models are a typical example of such coupled tools, mainly focusing on hydrological and geochemical coupling (see e.g. Steefel et al., 2015). Mathematical and numerical complexity, of both the tool itself and the specific conceptual model, can increase rapidly. Therefore, numerical verification of such models is a prerequisite for guaranteeing reliability and confidence and for qualifying simulation tools and approaches for any further model application. In 2011, a first SeSBench (Subsurface Environmental Simulation Benchmarking) workshop was held in Berkeley (USA), followed by four others. The objective is to benchmark subsurface environmental simulation models and methods, with a current focus on reactive transport processes. The final outcome was a special issue in Computational Geosciences (2015, issue 3 - Reactive transport benchmarks for subsurface environmental simulation) with a collection of 11 benchmarks. Benchmarks proposed by workshop participants should be relevant for environmental or geo-engineering applications (the latter mostly related to radioactive waste disposal issues), excluding benchmarks defined for purely mathematical reasons. Another important feature is the tiered approach within a benchmark, with the definition of a single principal problem and different subproblems. The subproblems typically benchmark individual or simplified processes (e.g. inert solute transport, a simplified geochemical conceptual model) or geometries (e.g. batch or one-dimensional, homogeneous). Finally, three codes should be involved in each benchmark. The SeSBench initiative contributes to confidence building for applying reactive transport codes. 
Furthermore, it illustrates the use of these types of models for different environmental and geo-engineering applications. SeSBench will organize new workshops to add new benchmarks in a new special issue. Steefel, C. I., et al. (2015). "Reactive transport codes for subsurface environmental simulation." Computational Geosciences 19: 445-478.
Modelling dwarf mistletoe at three scales: life history, ballistics and contagion
Donald C. E. Robinson; Brian W. Geils
2006-01-01
The epidemiology of dwarf mistletoe (Arceuthobium) is simulated for the reproduction, dispersal, and spatial patterns of these plant pathogens on conifer trees. A conceptual model for mistletoe spread and intensification is coded as sets of related subprograms that link to either of two individual-tree growth models (FVS and TASS) used by managers to develop...
An application of tensor ideas to nonlinear modeling of a turbofan jet engine
NASA Technical Reports Server (NTRS)
Klingler, T. A.; Yurkovich, S.; Sain, M. K.
1982-01-01
An application of tensor modelling to a digital simulation of NASA's Quiet, Clean, Short-haul Experimental (QCSE) gas turbine engine is presented. The results show that the tensor algebra offers a universal parametrization which is helpful in conceptualization and identification for plant modelling prior to feedback, or for representing scheduled controllers over an operating line.
A Generic Inner-Loop Control Law Structure for Six-Degree-of-Freedom Conceptual Aircraft Design
NASA Technical Reports Server (NTRS)
Cox, Timothy H.; Cotting, M. Christopher
2005-01-01
A generic control system framework for both real-time and batch six-degree-of-freedom simulations is presented. This framework uses a simplified dynamic inversion technique to allow for stabilization and control of any type of aircraft at the pilot interface level. The simulation, designed primarily for the real-time simulation environment, also can be run in a batch mode through a simple guidance interface. Direct vehicle-state acceleration feedback is required with the simplified dynamic inversion technique. The estimation of surface effectiveness within real-time simulation timing constraints also is required. The generic framework provides easily modifiable control variables, allowing flexibility in the variables that the pilot commands. A direct control allocation scheme is used to command aircraft effectors. Primary uses for this system include conceptual and preliminary design of aircraft, when vehicle models are rapidly changing and knowledge of vehicle six-degree-of-freedom performance is required. A simulated airbreathing hypersonic vehicle and simulated high-performance fighter aircraft are used to demonstrate the flexibility and utility of the control system.
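The simplified dynamic inversion with direct acceleration feedback described above can be sketched in scalar form: the surface increment is the commanded minus the measured acceleration, divided by an estimate of surface effectiveness. The plant, gains, and effectiveness values below are hypothetical illustrations, not the NASA framework's numbers.

```python
def inversion_step(u, a_cmd, a_meas, b_hat):
    """Increment control u so the achieved acceleration approaches a_cmd.

    delta-u = B_hat^-1 * (nu - a): the estimated effectiveness b_hat maps an
    acceleration error back to a control-surface increment.
    """
    return u + (a_cmd - a_meas) / b_hat

def plant_accel(u, b_true=2.0, bias=-0.5):
    """Toy plant: acceleration = true effectiveness * deflection + a trim bias."""
    return b_true * u + bias

u, a_cmd = 0.0, 1.0
b_hat = 1.8   # effectiveness estimate; need not match b_true exactly
for _ in range(20):
    u = inversion_step(u, a_cmd, plant_accel(u), b_hat)
```

Because the update uses measured acceleration rather than a full aerodynamic model, the loop converges even with an imperfect effectiveness estimate, which is why the scheme suits conceptual design, where vehicle models change rapidly.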
Thinking as the control of imagination: a conceptual framework for goal-directed systems.
Pezzulo, Giovanni; Castelfranchi, Cristiano
2009-07-01
This paper offers a conceptual framework which (re)integrates goal-directed control, motivational processes, and executive functions, and suggests a developmental pathway from situated action to higher level cognition. We first illustrate a basic computational (control-theoretic) model of goal-directed action that makes use of internal modeling. We then show that by adding the problem of selection among multiple action alternatives, motivation enters the scene, and that the basic mechanisms of executive functions such as inhibition, the monitoring of progress, and working memory are required for this system to work. Further, we elaborate on the idea that the off-line re-enactment of anticipatory mechanisms used for action control gives rise to (embodied) mental simulations, and propose that thinking consists essentially in controlling mental simulations rather than directly controlling behavior and perceptions. We conclude by sketching an evolutionary perspective on this process, proposing that anticipation leveraged cognition, and by highlighting specific predictions of our model.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-29
... aquifer (U.S. EPA, 1987, Sole Source Aquifer Designation Decision Process, Petition Review Guidance... the petition; U.S. Geological Survey, 2011, Conceptual Model and Numerical Simulation of the...
ERIC Educational Resources Information Center
Kangassalo, Marjatta
Using a pictorial computer simulation of a natural phenomenon, children's exploration processes and their construction of conceptual models were examined. The selected natural phenomenon was the variations of sunlight and heat of the sun experienced on the earth in relation to the positions of the earth and sun in space, and the subjects were…
ERIC Educational Resources Information Center
Chen, Baiyun; Wei, Lei; Li, Huihui
2016-01-01
Building a solid foundation of conceptual knowledge is critical for students in electrical engineering. This mixed-method case study explores the use of simulation videos to illustrate complicated conceptual knowledge in foundational communications and signal processing courses. Students found these videos to be very useful for establishing…
NASA Technical Reports Server (NTRS)
Bragg-Sitton, Shannon M.; Dickens, Ricky; Dixon, David; Kapernick, Richard
2007-01-01
Non-nuclear testing can be a valuable tool in the development of a space nuclear power system, providing system characterization data and allowing one to work through various fabrication, assembly and integration issues without the cost and time associated with a full ground nuclear test. In a non-nuclear test bed, electric heaters are used to simulate the heat from nuclear fuel. Testing with non-optimized heater elements allows one to assess thermal, heat transfer, and stress-related attributes of a given system, but fails to demonstrate the dynamic response that would be present in an integrated, fueled reactor system. High fidelity thermal simulators that match both the static and the dynamic fuel pin performance that would be observed in an operating, fueled nuclear reactor can vastly increase the value of non-nuclear test results. With optimized simulators, the integration of thermal hydraulic hardware tests with simulated neutronic response provides a bridge between electrically heated testing and fueled nuclear testing. By implementing a neutronic response model to simulate the dynamic response that would be expected in a fueled reactor system, one can better understand system integration issues, characterize integrated system response times and response characteristics, and assess potential design improvements at relatively small fiscal investment. Initial conceptual thermal simulator designs are determined by simple one-dimensional analysis at a single axial location and at steady state conditions; feasible concepts are then input into a detailed three-dimensional model for comparison to expected fuel pin performance. Static and dynamic fuel pin performance for a proposed reactor design is determined using SINDA/FLUINT thermal analysis software, and comparison is made between the expected nuclear performance and the performance of conceptual thermal simulator designs. 
Through a series of iterative analyses, a conceptual high fidelity design is developed; this is followed by engineering design, fabrication, and testing to validate the overall design process. Test results presented in this paper correspond to a "first cut" simulator design for a potential liquid metal (NaK) cooled reactor design that could be applied for Lunar surface power. Proposed refinements to this simulator design are also presented.
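The kind of one-dimensional, steady-state screening analysis mentioned above can be illustrated with the classic result for a uniformly heated rod: the centerline temperature exceeds the surface temperature by q'/(4πk). The numbers below are illustrative, not values from the NaK reactor design.

```python
import math

def centerline_temp(t_surface, linear_power, conductivity):
    """Centerline temperature (deg C) of a rod with uniform volumetric heating.

    T_center = T_surface + q' / (4 * pi * k), where q' is linear power (W/m)
    and k is thermal conductivity (W/m-K). Radius cancels out of the result.
    """
    return t_surface + linear_power / (4.0 * math.pi * conductivity)

# Illustrative screening check: does a candidate heater stay below its limit?
t_center = centerline_temp(t_surface=800.0, linear_power=5000.0, conductivity=20.0)
```

A candidate simulator concept passes this first cut if its predicted centerline temperature stays below the heater material limit while matching the fuel pin's surface temperature; survivors then go to the three-dimensional SINDA/FLUINT comparison.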
NASA Astrophysics Data System (ADS)
Tomshaw, Stephen G.
Physics education research has shown that students bring alternate conceptions to the classroom which can be quite resistant to traditional instruction methods (Clement, 1982; Halloun & Hestenes, 1985; McDermott, 1991). Microcomputer-based laboratory (MBL) experiments that employ an active-engagement strategy have been shown to improve student conceptual understanding in high school and introductory university physics courses (Thornton & Sokoloff, 1998). These MBL experiments require a specialized computer interface, type-specific sensors (e.g. motion detectors, force probes, accelerometers), and specialized software in addition to the standard physics experimental apparatus. Tao and Gunstone (1997) have shown that computer simulations used in an active engagement environment can also lead to conceptual change. This study investigated 69 secondary physics students' use of computer simulations of MBL activities in place of the hands-on MBL laboratory activities. The average normalized gain
2012-07-01
…of the modelling and simulation world and to provide it with implementation guidelines; and to provide … definition; relationship to standards; specification of the conceptual model (CM) management procedure; specification of the CM artefact. Important considerations … using this guideline as a reference. • VV&A (verification, validation, and acceptance) of CMs must be an integral part of
Using computer simulations to facilitate conceptual understanding of electromagnetic induction
NASA Astrophysics Data System (ADS)
Lee, Yu-Fen
This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education, categorized by three different learning frameworks, and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of the reviewed literature, I proposed effective approaches to integrating computer simulations in physics education. These approaches are consistent with well-established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). The research-based approaches to integrating computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research-based computer simulations developed by the physics education research group at the University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. 
After receiving model reasoning online, students were asked to submit their revised answers electronically. Students in the TRAD group were not granted access to the CLCS material and followed their normal classroom routine. At the end of the study, both the CLCS and TRAD students took a post-test. Questions on the post-test were divided into "what" questions, "how" questions, and an open-response question. Analysis of students' post-test performance showed mixed results. While the TRAD students scored higher on the "what" questions, the CLCS students scored higher on the "how" questions and the open-response question. This result suggests that more TRAD students knew what kinds of conditions may or may not cause electromagnetic induction without understanding how electromagnetic induction works. Analysis of the CLCS students' learning also suggested that frequent disruptions and technical trouble might pose threats to the effectiveness of the CLCS learning framework. The mixed post-test results also revealed some limitations of the CLCS learning framework in promoting conceptual understanding in physics. Improvements can be made by providing students with the background knowledge necessary to understand model reasoning and by combining the CLCS learning framework with other learning frameworks to promote integration of various physics concepts. In addition, the reflective questions in the CLCS learning framework may be refined to better address students' difficulties. Limitations of the study, as well as suggestions for future research, are also presented.
NASA Astrophysics Data System (ADS)
Catur Wibowo, Firmanul; Suhandi, Andi; Rusdiana, Dadi; Samsudin, Achmad; Rahmi Darman, Dina; Faizin, M. Noor; Wiyanto; Supriyatman; Permanasari, Anna; Kaniawati, Ida; Setiawan, Wawan; Karyanto, Yudi; Linuwih, Suharto; Fatah, Abdul; Subali, Bambang; Hasani, Aceng; Hidayat, Sholeh
2017-07-01
Electricity is an abstract concept that cannot be observed directly by eye; a person can feel an electric shock, for example, but cannot see the movement of electric current, which makes the topic difficult for students. A computer simulation was designed to improve understanding of how a dry cell (battery) works. This study was conducted with 82 students (aged 18-20 years) in an experimental group who learned using the Dry Cell Microscopic Simulation (DCMS). The results show that the improvement in students' conceptual-understanding scores on the post-test regarding the workings of batteries was statistically significant. The implication is that computer simulations designed to overcome difficulties in conceptual understanding can effectively help students by facilitating conceptual change.
Numerical simulation of hydrothermal circulation in the Cascade Range, north-central Oregon
Ingebritsen, S.E.; Paulson, K.M.
1990-01-01
Alternate conceptual models to explain near-surface heat-flow observations in the central Oregon Cascade Range involve (1) an extensive mid-crustal magmatic heat source underlying both the Quaternary arc and adjacent older rocks or (2) a narrower deep heat source which is flanked by a relatively shallow conductive heat-flow anomaly caused by regional ground-water flow (the lateral-flow model). Relative to the mid-crustal heat source model, the lateral-flow model suggests a more limited geothermal resource base, but a better-defined exploration target. We simulated ground-water flow and heat transport through two cross sections trending west from the Cascade Range crest in order to explore the implications of the two models. The thermal input for the alternate conceptual models was simulated by varying the width and intensity of a basal heat-flow anomaly and, in some cases, by introducing shallower heat sources beneath the Quaternary arc. Near-surface observations in the Breitenbush Hot Springs area are most readily explained in terms of lateral heat transport by regional ground-water flow; however, the deep thermal structure still cannot be uniquely inferred. The sparser thermal data set from the McKenzie River area can be explained either in terms of deep regional ground-water flow or in terms of a conduction-dominated system, with ground-water flow essentially confined to Quaternary rocks and fault zones.
Power, Gerald; Miller, Anne
2007-01-01
Cardiopulmonary bypass (CPB) is a complex task requiring high levels of practitioner expertise. Although some education standards exist, few are based on an analysis of perfusionists' problem-solving needs. This study shows the efficacy of work domain analysis (WDA) as a framework for analyzing perfusionists' conceptualization and problem-solving strategies. A WDA model of a CPB circuit was developed. A high-fidelity CPB simulator (Manbit) was used to present routine and oxygenator-failure scenarios to six proficient perfusionists. The video-cued recall technique was used to elicit perfusionists' conceptualization strategies. The resulting recall transcripts were coded using the WDA model and analyzed for associations between task completion times and patterns of conceptualization. The WDA model successfully accounted for and described the thought process followed by each participant. It was also shown that, although there was no correlation between experience with CPB and the ability to change an oxygenator, there was a link between specific thought patterns and efficiency in undertaking this task. Simulators are widely used in many fields of human endeavor, and in this research WDA was used to gain insights into the complexities of human thought during the complex task of conducting CPB. The assumption that experience equates with ability is challenged; rather, it is shown that thought process is a more significant determinant of success in complex tasks. WDA in combination with a CPB simulator may be used to elucidate successful strategies for completing complex tasks. PMID:17972450
Moving alcohol prevention research forward-Part I: introducing a complex systems paradigm.
Apostolopoulos, Yorghos; Lemke, Michael K; Barry, Adam E; Lich, Kristen Hassmiller
2018-02-01
The drinking environment is a complex system consisting of a number of heterogeneous, evolving and interacting components, which exhibit circular causality and emergent properties. These characteristics reduce the efficacy of commonly used research approaches, which typically do not account for the underlying dynamic complexity of alcohol consumption and the interdependent nature of diverse factors influencing misuse over time. We use alcohol misuse among college students in the United States as an example for framing our argument for a complex systems paradigm. A complex systems paradigm, grounded in socio-ecological and complex systems theories and computational modeling and simulation, is introduced. Theoretical, conceptual, methodological and analytical underpinnings of this paradigm are described in the context of college drinking prevention research. The proposed complex systems paradigm can transcend limitations of traditional approaches, thereby fostering new directions in alcohol prevention research. By conceptualizing student alcohol misuse as a complex adaptive system, computational modeling and simulation methodologies and analytical techniques can be used. Moreover, use of participatory model-building approaches to generate simulation models can further increase stakeholder buy-in, understanding and policymaking. A complex systems paradigm for research into alcohol misuse can provide a holistic understanding of the underlying drinking environment and its long-term trajectory, which can elucidate high-leverage preventive interventions. © 2017 Society for the Study of Addiction.
NASA Astrophysics Data System (ADS)
Thomsen, N. I.; Troldborg, M.; McKnight, U. S.; Binning, P. J.; Bjerg, P. L.
2012-04-01
Mass discharge estimates are increasingly being used in the management of contaminated sites. Such estimates have proven useful for supporting decisions related to the prioritization of contaminated sites in a groundwater catchment. Potential management options can be categorised as follows: (1) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper management decisions. If option 1 is incorrectly chosen, soil and water quality will decrease, threatening or destroying drinking water resources. The risk of choosing option 2 is spending money on remediating a site that does not pose a problem. Choosing option 3 will often be safest, but may not be the optimal economic solution. Quantification of the uncertainty in mass discharge estimates can therefore greatly improve the foundation for selecting the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level and account for both parameter and conceptual model uncertainty. We propose a method for quantifying the uncertainty of dynamic mass discharge estimates from contaminant point sources on the local scale. The method considers both parameter and conceptual uncertainty through a multi-model approach, which evaluates multiple conceptual models for the same site. The different conceptual models consider different source characterizations and hydrogeological descriptions. The idea is to include a set of essentially different conceptual models, where each model is believed to be a realistic representation of the given site based on the current level of information. Parameter uncertainty is quantified using Monte Carlo simulations. 
For each conceptual model we calculate a transient mass discharge estimate with uncertainty bounds resulting from the parametric uncertainty. To quantify the conceptual uncertainty for a given site, we combine the outputs from the different conceptual models using Bayesian model averaging. The weight for each model is obtained by integrating available data and expert knowledge using Bayesian belief networks. The multi-model approach is applied to a contaminated site where a DNAPL (dense non-aqueous phase liquid) spill consisting of PCE (perchloroethylene) has contaminated a fractured clay till aquitard overlying a limestone aquifer. The exact shape and nature of the source is unknown, as is the importance of transport in the fractures. The result of the multi-model approach is a visual representation of the uncertainty of the mass discharge estimates for the site, which can be used to support the choice among management options.
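The Monte Carlo and Bayesian model averaging steps described above can be sketched in a few lines. The sketch below is illustrative only: the model names, weights, and lognormal discharge distributions are invented stand-ins, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Monte Carlo samples of mass discharge (g/day) from three
# alternative conceptual models of the same site (illustrative numbers only).
samples = {
    "matrix_only": rng.lognormal(mean=2.0, sigma=0.5, size=10_000),
    "fracture_flow": rng.lognormal(mean=2.8, sigma=0.7, size=10_000),
    "mixed": rng.lognormal(mean=2.4, sigma=0.6, size=10_000),
}

# Model weights, e.g. elicited via a Bayesian belief network; must sum to 1.
weights = {"matrix_only": 0.2, "fracture_flow": 0.5, "mixed": 0.3}

# Bayesian model averaging: the combined predictive distribution is the
# weight-mixture of the per-model distributions, sampled here by drawing
# each model's samples in proportion to its weight.
combined = np.concatenate([
    rng.choice(samples[m], size=int(weights[m] * 10_000), replace=True)
    for m in samples
])

# Uncertainty bounds on mass discharge across conceptual models.
lo, med, hi = np.percentile(combined, [5, 50, 95])
print(f"5th/50th/95th percentile mass discharge: {lo:.1f}/{med:.1f}/{hi:.1f}")
```

In practice each model's samples would come from running the site model under sampled parameters, and the weights from the belief-network analysis; the mixture step itself is unchanged.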
Shallow groundwater in the Matanuska-Susitna Valley, Alaska—Conceptualization and simulation of flow
Kikuchi, Colin P.
2013-01-01
The Matanuska-Susitna Valley is in the Upper Cook Inlet Basin and is currently undergoing rapid population growth outside of municipal water and sewer service areas. In response to concerns about the effects of increasing water use on future groundwater availability, a study was initiated between the Alaska Department of Natural Resources and the U.S. Geological Survey. The goals of the study were (1) to compile existing data and collect new data to support hydrogeologic conceptualization of the study area, and (2) to develop a groundwater flow model to simulate flow dynamics important at the regional scale. The purpose of the groundwater flow model is to provide a scientific framework for analysis of regional-scale groundwater availability. To address the first study goal, subsurface lithologic data were compiled into a database and were used to construct a regional hydrogeologic framework model describing the extent and thickness of hydrogeologic units in the Matanuska-Susitna Valley. The hydrogeologic framework model synthesizes existing maps of surficial geology and conceptual geochronologies developed in the study area with the distribution of lithologies encountered in hundreds of boreholes. The geologic modeling package Geological Surveying and Investigation in Three Dimensions (GSI3D) was used to construct the hydrogeologic framework model. In addition to characterizing the hydrogeologic framework, major groundwater-budget components were quantified using several different techniques. A land-surface model known as the Deep Percolation Model was used to estimate in-place groundwater recharge across the study area. This model incorporates data on topography, soils, vegetation, and climate. Model-simulated surface runoff was consistent with observed streamflow at U.S. Geological Survey streamgages. Groundwater withdrawals were estimated on the basis of records from major water suppliers during 2004-2010. 
Fluxes between groundwater and surface water were estimated during field investigations on several small streams. Regional groundwater flow patterns were characterized by synthesizing previous water-table maps with a synoptic water-level measurement conducted during 2009. Time-series water-level data were collected at groundwater and lake monitoring stations over the study period (2009–present). Comparison of historical groundwater-level records with time-series groundwater-level data collected during this study showed similar patterns in groundwater-level fluctuation in response to precipitation. Groundwater-age data collected during previous studies show that water moves quickly through the groundwater system, suggesting that the system responds quickly to changes in climate forcing. Similarly, the groundwater system quickly returns to long-term average conditions following variability due to seasonal or interannual changes in precipitation. These analyses indicate that the groundwater system is in a state of dynamic equilibrium, characterized by water-level fluctuation about a constant average state, with no long-term trends in aquifer-system storage. To address the second study goal, a steady-state groundwater flow model was developed to simulate regional groundwater flow patterns. The groundwater flow model was bounded by physically meaningful hydrologic features, and appropriate internal model boundaries were specified on the basis of conceptualization of the groundwater system resulting in a three-layer model. Calibration data included 173 water‑level measurements and 18 measurements of streamflow gains and losses along small streams. Comparison of simulated and observed heads and flows showed that the model accurately simulates important regional characteristics of the groundwater flow system. This model is therefore appropriate for studying regional-scale groundwater availability. 
Mismatch between model-simulated and observed hydrologic quantities is likely because of the coarse grid size of the model and seasonal transient effects. Next steps towards model refinement include the development of a transient groundwater flow model that is suitable for analysis of seasonal variability in hydraulic heads and flows. In addition, several important groundwater budget components remain poorly quantified—including groundwater outflow to the Matanuska River, Little Susitna River, and Knik Arm.
Diffuse-flow conceptualization and simulation of the Edwards aquifer, San Antonio region, Texas
Lindgren, R.J.
2006-01-01
A numerical ground-water-flow model (hereinafter, the conduit-flow Edwards aquifer model) of the karstic Edwards aquifer in south-central Texas was developed for a previous study on the basis of a conceptualization emphasizing conduit development and conduit flow, and included simulating conduits as one-cell-wide, continuously connected features. Uncertainties regarding the degree to which conduits pervade the Edwards aquifer and influence ground-water flow, as well as other uncertainties inherent in simulating conduits, raised the question of whether a model based on the conduit-flow conceptualization was the optimum model for the Edwards aquifer. Accordingly, a model with an alternative hydraulic conductivity distribution without conduits was developed in a study conducted during 2004-05 by the U.S. Geological Survey, in cooperation with the San Antonio Water System. The hydraulic conductivity distribution for the modified Edwards aquifer model (hereinafter, the diffuse-flow Edwards aquifer model), based primarily on a conceptualization in which flow in the aquifer predominantly is through a network of numerous small fractures and openings, includes 38 zones, with hydraulic conductivities ranging from 3 to 50,000 feet per day. Revision of model input data for the diffuse-flow Edwards aquifer model was limited to changes in the simulated hydraulic conductivity distribution. The root-mean-square error for 144 target wells for the calibrated steady-state simulation for the diffuse-flow Edwards aquifer model is 20.9 feet. This error represents about 3 percent of the total head difference across the model area. The simulated springflows for Comal and San Marcos Springs for the calibrated steady-state simulation were within 2.4 and 15 percent of the median springflows for the two springs, respectively. 
The transient calibration period for the diffuse-flow Edwards aquifer model was 1947-2000, with 648 monthly stress periods, the same as for the conduit-flow Edwards aquifer model. The root-mean-square error for a period of drought (May-November 1956) for the calibrated transient simulation for 171 target wells is 33.4 feet, which represents about 5 percent of the total head difference across the model area. The root-mean-square error for a period of above-normal rainfall (November 1974-July 1975) for the calibrated transient simulation for 169 target wells is 25.8 feet, which represents about 4 percent of the total head difference across the model area. The root-mean-square error ranged from 6.3 to 30.4 feet in 12 target wells with long-term water-level measurements for varying periods during 1947-2000 for the calibrated transient simulation for the diffuse-flow Edwards aquifer model, and these errors represent 5.0 to 31.3 percent of the range in water-level fluctuations of each of those wells. The root-mean-square errors for the five major springs in the San Antonio segment of the aquifer for the calibrated transient simulation, as a percentage of the range of discharge fluctuations measured at the springs, varied from 7.2 percent for San Marcos Springs and 8.1 percent for Comal Springs to 28.8 percent for Leona Springs. The root-mean-square errors for hydraulic heads for the conduit-flow Edwards aquifer model are 27, 76, and 30 percent greater than those for the diffuse-flow Edwards aquifer model for the steady-state, drought, and above-normal rainfall synoptic time periods, respectively. The goodness-of-fit between measured and simulated springflows is similar for Comal, San Marcos, and Leona Springs for the diffuse-flow Edwards aquifer model and the conduit-flow Edwards aquifer model. 
The root-mean-square errors for Comal and Leona Springs were 15.6 and 21.3 percent less, respectively, whereas the root-mean-square error for San Marcos Springs was 3.3 percent greater for the diffuse-flow Edwards aquifer model compared to the conduit-flow Edwards aquifer model. The root-mean-square errors for San Antonio and San Pedro Springs were appreciably greater, 80.2 and 51.0 percent, respectively, for the diffuse-flow Edwards aquifer model. The simulated water budgets for the diffuse-flow Edwards aquifer model are similar to those for the conduit-flow Edwards aquifer model. Differences in percentage of total sources or discharges for a budget component are 2.0 percent or less for all budget components for the steady-state and transient simulations. The largest difference in terms of the magnitude of water budget components for the transient simulation for 1956 was a decrease of about 10,730 acre-feet per year (about 2 percent) in springflow for the diffuse-flow Edwards aquifer model compared to the conduit-flow Edwards aquifer model. This decrease in springflow (a water budget discharge) was largely offset by the decreased net loss of water from storage (a water budget source) of about 10,500 acre-feet per year.
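The error measures quoted above (root-mean-square error, and RMSE expressed as a percentage of the total head difference or fluctuation range) follow directly from their definitions. A minimal sketch with invented well data, not values from the Edwards aquifer study:

```python
import numpy as np

# Hypothetical simulated vs. observed heads (feet) at a few target wells.
observed = np.array([620.0, 655.5, 701.2, 688.4, 642.9])
simulated = np.array([612.3, 660.1, 695.8, 699.0, 640.2])

# Root-mean-square error of the head residuals.
rmse = np.sqrt(np.mean((simulated - observed) ** 2))

# Express RMSE as a percentage of the total head difference across the
# model area (taken here as the spread of the observed heads).
head_range = observed.max() - observed.min()
rmse_pct = 100.0 * rmse / head_range
print(f"RMSE = {rmse:.1f} ft ({rmse_pct:.1f}% of head range)")
```

The same percentage form applies to springflow by dividing the springflow RMSE by the range of measured discharge fluctuations instead of the head range.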
MCFire model technical description
David R. Conklin; James M. Lenihan; Dominique Bachelet; Ronald P. Neilson; John B. Kim
2016-01-01
MCFire is a computer program that simulates the occurrence and effects of wildfire on natural vegetation, as a submodel within the MC1 dynamic global vegetation model. This report is a technical description of the algorithms and parameter values used in MCFire, intended to encapsulate its design and features at a higher level that is more conceptual than the level...
A perspective on coherent structures and conceptual models for turbulent boundary layer physics
NASA Technical Reports Server (NTRS)
Robinson, Stephen K.
1990-01-01
Direct numerical simulations of turbulent boundary layers have been analyzed to develop a unified conceptual model for the kinematics of coherent motions in low Reynolds number canonical turbulent boundary layers. All classes of coherent motions are considered in the model, including low-speed streaks, ejections and sweeps, vortical structures, near-wall and outer-region shear layers, sublayer pockets, and large-scale outer-region eddies. The model reflects the conclusions from the study of the simulated boundary layer that vortical structures are directly associated with the production of turbulent shear stresses, entrainment, dissipation of turbulence kinetic energy, and the fluctuating pressure field. These results, when viewed from the perspective of the large body of published work on the subject of coherent motions, confirm that vortical structures may be considered the central dynamic element in the maintenance of turbulence in the canonical boundary layer. Vortical structures serve as a framework on which to construct a unified picture of boundary layer structure, providing a means to relate the many known structural elements in a consistent way.
Puig, V; Cembrano, G; Romera, J; Quevedo, J; Aznar, B; Ramón, G; Cabot, J
2009-01-01
This paper deals with the global control of the Riera Blanca catchment in the Barcelona sewer network using a predictive optimal control approach. The catchment has been modelled using a conceptual modelling approach based on decomposing the catchment into subcatchments and representing them as virtual tanks. This conceptual modelling approach allows real-time model calibration and control of the sewer network. The global control problem of the Riera Blanca catchment is solved using an optimal/predictive control algorithm. To implement the predictive optimal control of the Riera Blanca catchment, a software tool named CORAL is used. The on-line control is simulated by interfacing CORAL with a high-fidelity simulator of sewer networks (MOUSE). CORAL exchanges limnimeter readings and gate commands with MOUSE as if it were connected to the real SCADA system. Finally, the global control results obtained using predictive optimal control are presented and compared against the results obtained using the current local control system. The results obtained using global control are very satisfactory compared to those obtained using local control.
Linguistic geometry for technologies procurement
NASA Astrophysics Data System (ADS)
Stilman, Boris; Yakhnis, Vladimir; Umanskiy, Oleg; Boyd, Ron
2005-05-01
In the modern world of rapidly rising prices for new military hardware, the importance of Simulation Based Acquisition (SBA) is hard to overestimate. With SBA, DOD would be able to test, develop CONOPS for, debug, and evaluate new conceptual military equipment before actually building the expensive hardware. However, only recently have powerful tools for real SBA been developed. Linguistic Geometry (LG) permits full-scale modeling and evaluation of new military technologies, combinations of hardware systems, and concepts of their application. Using LG tools, analysts can create a gaming environment populated with Blue forces armed with the new conceptual hardware as well as with appropriate existing weapons and equipment. This environment will also contain an intelligent enemy with appropriate weaponry and, if desired, with conceptual counters to the new Blue weapons. Within such an LG gaming environment, the analyst can run various what-ifs, with the LG tools providing the simulated combatants with strategies and tactics that solve their goals with minimal resources spent.
Propulsion simulation for magnetically suspended wind tunnel models
NASA Technical Reports Server (NTRS)
Joshi, Prakash B.; Beerman, Henry P.; Chen, James; Krech, Robert H.; Lintz, Andrew L.; Rosen, David I.
1990-01-01
The feasibility of simulating propulsion-induced aerodynamic effects on scaled aircraft models in wind tunnels employing Magnetic Suspension and Balance Systems (MSBS) was investigated. The investigation concerned itself with techniques of generating exhaust jets of appropriate characteristics. The objectives were to: (1) define thrust and mass flow requirements of jets; (2) evaluate techniques for generating propulsive gas within the volume limitations imposed by magnetically suspended models; (3) conduct simple diagnostic experiments for techniques involving new concepts; and (4) recommend experiments for demonstration of propulsion simulation techniques. Various techniques of generating exhaust jets of appropriate characteristics were evaluated for scaled aircraft models in wind tunnels with MSBS. Four concepts of remotely operated propulsion simulators were examined. Three conceptual designs involving innovative adaptation of convenient technologies (compressed gas cylinders, liquid propellants, and solid propellants) were developed. The fourth innovative concept, the laser-assisted thruster, which can potentially simulate both inlet and exhaust flows, was found to require very high power levels for small thrust levels.
Tutorial: Parallel Computing of Simulation Models for Risk Analysis.
Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D
2016-10-01
Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
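The tutorial's worked examples are in MATLAB and R; as a rough analogue, an embarrassingly parallel set of independent simulation replications can be sketched in Python using only the standard library. The replication function below is a toy stand-in for a real risk-analysis simulation, and the seed-per-replication scheme is one common convention, not the tutorial's own code.

```python
import random
from multiprocessing import Pool

def one_replication(seed: int) -> float:
    """One independent simulation replication: here, a toy Monte Carlo
    average of 1,000 standard-normal draws. Each replication gets its own
    seed so runs are reproducible and independent across workers."""
    rng = random.Random(seed)
    return sum(rng.gauss(0.0, 1.0) for _ in range(1_000)) / 1_000

def run_parallel(n_reps: int = 100, workers: int = 4) -> list[float]:
    # Replications share no state, so they map cleanly onto a worker pool;
    # this is what makes the problem "embarrassingly" parallel.
    with Pool(processes=workers) as pool:
        return pool.map(one_replication, range(n_reps))

if __name__ == "__main__":
    results = run_parallel()
    print(f"{len(results)} replications, mean outcome {sum(results)/len(results):.4f}")
```

Speedup is bounded by the number of cores and the per-task overhead of distributing work, which is why the article pairs the parallelization discussion with hardware considerations.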
Garcea, Frank E.; Dombovy, Mary; Mahon, Bradford Z.
2013-01-01
A number of studies have observed that the motor system is activated when processing the semantics of manipulable objects. Such phenomena have been taken as evidence that simulation over motor representations is a necessary and intermediary step in the process of conceptual understanding. Cognitive neuropsychological evaluations of patients with impairments for action knowledge permit a direct test of the necessity of motor simulation in conceptual processing. Here, we report the performance of a 47-year-old male individual (Case AA) and six age-matched control participants on a number of tests probing action and object knowledge. Case AA had a large left-hemisphere frontal-parietal lesion and hemiplegia affecting his right arm and leg. Case AA presented with impairments for object-associated action production, and his conceptual knowledge of actions was severely impaired. In contrast, his knowledge of objects such as tools and other manipulable objects was largely preserved. The dissociation between action and object knowledge is difficult to reconcile with strong forms of the embodied cognition hypothesis. We suggest that these, and other similar findings, point to the need to develop tractable hypotheses about the dynamics of information exchange among sensory, motor and conceptual processes. PMID:23641205
CONFIG: Integrated engineering of systems and their operation
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Ryan, Dan; Fleming, Land
1994-01-01
This article discusses CONFIG 3, a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, by supporting isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operations of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. CONFIG supports integration among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. CONFIG is designed to support integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems.
Computer simulation of the cumulative effects of brushland fire-management policies
NASA Astrophysics Data System (ADS)
Bonnicksen, Thomas M.
1980-01-01
A mathematical model simulates the cumulative volume of debris produced from brushland watersheds. Application of this model to a 176-km2 (67.8-mi2) watershed along the southern flank of the Central San Gabriel Mountains permits assessment of expected debris production associated with alternative fire-management policies. The political implications of simulated debris production are evaluated through a conceptual model that links interest groups to particular successional stages in brushland watersheds by means of the resources claimed by each group. It is concluded that, in theory, a rotation burn policy would provide benefits to more interest groups concerned about southern California's brushland watersheds than does the current fire exclusion policy.
Xu, Zexuan; Hu, Bill X; Davis, Hal; Kish, Stephen
2015-11-01
In this study, groundwater flow cycling in a karst springshed and the interaction between two springs, Spring Creek Springs and Wakulla Springs, through a subsurface conduit network are numerically simulated using CFPv2, the latest research version of MODFLOW-CFP (Conduit Flow Process). Spring Creek Springs and Wakulla Springs, located in a marine estuary and 11 miles inland, respectively, are the two major groundwater discharge points in the Woodville Karst Plain (WKP), North Florida, USA. A three-phase conceptual model of groundwater flow cycling between the two springs and surface-water recharge from a major surface creek (Lost Creek) was proposed for various rainfall conditions. A highly permeable subsurface karst conduit network connecting the two springs was found by tracer tests and cave diving. The flow rate of discharge, salinity, sea level, and tide height at Spring Creek Springs can significantly affect groundwater discharge and water stage at Wakulla Springs simultaneously. Based on the conceptual model, a numerical hybrid discrete-continuum groundwater flow model was developed using CFPv2 and calibrated with field measurements. Non-laminar flow in conduits and flow exchange between conduits and the porous medium are implemented in the hybrid coupled numerical model. Time-variable salinity and equivalent freshwater head boundary conditions at the submarine spring, as well as changing recharge, have significant impacts on seawater/freshwater interaction and the springs' discharges. The developed numerical model is used to simulate the dynamic hydrological process and quantitatively represent the three-phase conceptual model from June 2007 to June 2010. Simulated discharges at the two springs match measurements reasonably well, with correlation coefficients of 0.891 and 0.866 at Spring Creek Springs and Wakulla Springs, respectively. 
The impacts of sea-level rise on the regional groundwater flow field and on the relationship between the inland and submarine springs are also evaluated in this study. Copyright © 2015 Elsevier B.V. All rights reserved.
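The goodness-of-fit figures quoted above (0.891 and 0.866) are Pearson correlation coefficients between simulated and observed spring discharge. A minimal sketch of that statistic is below; the discharge values are made-up illustrations, not the Wakulla or Spring Creek data:

```python
import math

def pearson_r(sim, obs):
    """Pearson correlation between simulated and observed discharge series."""
    n = len(sim)
    ms = sum(sim) / n
    mo = sum(obs) / n
    cov = sum((s - ms) * (o - mo) for s, o in zip(sim, obs))
    var_s = sum((s - ms) ** 2 for s in sim)
    var_o = sum((o - mo) ** 2 for o in obs)
    return cov / math.sqrt(var_s * var_o)

# Hypothetical daily discharge values (m3/s), for illustration only.
simulated = [10.2, 11.5, 9.8, 14.1, 12.3]
observed  = [10.0, 12.0, 9.5, 13.8, 12.9]
r = pearson_r(simulated, observed)
```

Note that a high correlation coefficient measures co-variation of the two series, not bias; calibration studies like this one usually report it alongside direct comparisons of simulated and measured heads or stages.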
Inverse modeling of BTEX dissolution and biodegradation at the Bemidji, MN crude-oil spill site
Essaid, H.I.; Cozzarelli, I.M.; Eganhouse, R.P.; Herkelrath, W.N.; Bekins, B.A.; Delin, G.N.
2003-01-01
The U.S. Geological Survey (USGS) solute transport and biodegradation code BIOMOC was used in conjunction with the USGS universal inverse modeling code UCODE to quantify field-scale hydrocarbon dissolution and biodegradation at the USGS Toxic Substances Hydrology Program crude-oil spill research site located near Bemidji, MN. This inverse modeling effort used the extensive historical data compiled at the Bemidji site from 1986 to 1997 and incorporated a multicomponent transport and biodegradation model. Inverse modeling was successful when coupled transport and degradation processes were incorporated into the model and a single dissolution rate coefficient was used for all BTEX components. Assuming a stationary oil body, we simulated benzene, toluene, ethylbenzene, m,p-xylene, and o-xylene (BTEX) concentrations in the oil and in the ground water, as well as dissolved oxygen. Dissolution from the oil phase and aerobic and anaerobic degradation processes were represented. The parameters estimated were the recharge rate, hydraulic conductivity, dissolution rate coefficient, individual first-order BTEX anaerobic degradation rates, and transverse dispersivity. Results were similar for simulations obtained using several alternative conceptual models of the hydrologic system and biodegradation processes. The dissolved BTEX concentration data were not sufficient to discriminate between these conceptual models. The calibrated simulations reproduced the general large-scale evolution of the plume, but did not reproduce the observed small-scale spatial and temporal variability in concentrations. The estimated anaerobic biodegradation rates for toluene and o-xylene were greater than the dissolution rate coefficient. However, the estimated anaerobic biodegradation rates for benzene, ethylbenzene, and m,p-xylene were less than the dissolution rate coefficient. The calibrated model was used to determine the BTEX mass balance in the oil body and groundwater plume.
Dissolution from the oil body was greatest for compounds with large effective solubilities (benzene) and with large degradation rates (toluene and o-xylene). Anaerobic degradation removed 77% of the BTEX that dissolved into the water phase and aerobic degradation removed 17%. Although goodness-of-fit measures for the alternative conceptual models were not significantly different, predictions made with the models were quite variable. © 2003 Elsevier Science B.V. All rights reserved.
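The competition the abstract describes, dissolution toward an effective solubility versus first-order biodegradation, can be sketched as a one-box ODE integrated with explicit Euler steps. The rate constants and solubility below are hypothetical placeholders, not the BIOMOC/UCODE estimates:

```python
def btex_concentration(c0, c_eq, k_diss, k_bio, dt, n_steps):
    """Explicit-Euler sketch of a dissolved-phase BTEX balance:
    dissolution toward effective solubility c_eq (rate k_diss, 1/d)
    competing with first-order biodegradation (rate k_bio, 1/d)."""
    c = c0
    series = []
    for _ in range(n_steps):
        c += dt * (k_diss * (c_eq - c) - k_bio * c)
        series.append(c)
    return series

# Hypothetical rates (1/d) and effective solubility (mg/L), illustration only.
series = btex_concentration(c0=0.0, c_eq=20.0, k_diss=0.05, k_bio=0.1, dt=1.0, n_steps=365)
# The balance settles toward k_diss * c_eq / (k_diss + k_bio).
```

The steady state makes the abstract's finding intuitive: when the biodegradation rate exceeds the dissolution rate coefficient (as for toluene and o-xylene), dissolved concentrations stay well below solubility.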
Multiporosity flow in fractured low-permeability rocks: Extension to shale hydrocarbon reservoirs
Kuhlman, Kristopher L.; Malama, Bwalya; Heath, Jason E.
2015-02-05
We present a multiporosity extension of classical double- and triple-porosity fractured-rock flow models for slightly compressible fluids. The multiporosity model is an adaptation of the multirate solute transport model of Haggerty and Gorelick (1995) to viscous flow in fractured rock reservoirs. It is a generalization of both the pseudo-steady-state and transient interporosity flow double-porosity models. The model includes a fracture continuum and an overlapping distribution of multiple rock matrix continua, whose fracture-matrix exchange coefficients are specified through a discrete probability mass function. Semianalytical cylindrically symmetric solutions to the multiporosity mathematical model are developed using the Laplace transform to illustrate its behavior. Furthermore, the multiporosity model presented here is conceptually simple, yet flexible enough to simulate common conceptualizations of double- and triple-porosity flow. This combination of generality and simplicity makes the multiporosity model a good choice for flow modelling in low-permeability fractured rocks.
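Semianalytical Laplace-domain solutions like those described above are typically brought back to the time domain with a numerical inversion scheme. One common choice (an illustration here, not necessarily the authors' method) is the Gaver-Stehfest algorithm, sketched below and sanity-checked against a known transform pair:

```python
import math

def stehfest_invert(F, t, N=12):
    """Numerically invert a Laplace-domain function F(s) at time t > 0
    using the Gaver-Stehfest algorithm (N must be even)."""
    ln2 = math.log(2.0)
    total = 0.0
    for k in range(1, N + 1):
        # Stehfest weight V_k, built from the standard factorial sum.
        v = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            v += (j ** (N // 2) * math.factorial(2 * j)) / (
                math.factorial(N // 2 - j) * math.factorial(j)
                * math.factorial(j - 1) * math.factorial(k - j)
                * math.factorial(2 * j - k))
        v *= (-1) ** (k + N // 2)
        total += v * F(k * ln2 / t)
    return ln2 / t * total

# Sanity check against a known pair: F(s) = 1/(s + 1)  <->  f(t) = exp(-t).
approx = stehfest_invert(lambda s: 1.0 / (s + 1.0), t=1.0)
```

Gaver-Stehfest works well for the smooth, monotone drawdown-type solutions typical of radial flow problems; oscillatory transforms require other inversion algorithms.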
Benchmarking hydrological model predictive capability for UK River flows and flood peaks.
NASA Astrophysics Data System (ADS)
Lane, Rosanna; Coxon, Gemma; Freer, Jim; Wagener, Thorsten
2017-04-01
Data and hydrological models are now available for national-scale hydrological analyses. However, hydrological model performance varies between catchments, and lumped conceptual models cannot produce adequate simulations everywhere. This study aims to benchmark hydrological model performance for catchments across the United Kingdom within an uncertainty analysis framework. We applied four hydrological models from the FUSE framework to 1128 catchments across the UK. All four are lumped models run at a daily time step, but they differ in structural architecture and process parameterisations, and therefore produce different but equally plausible simulations. We ran FUSE over the 20-year period 1988-2008, within a GLUE Monte Carlo uncertainty analysis framework. Model performance was evaluated for each catchment, model structure and parameter set using standard performance metrics, calculated both for the whole time series and seasonally to assess seasonal differences in model performance. The GLUE uncertainty analysis framework was then applied to produce simulated 5th and 95th percentile uncertainty bounds for the daily flow time series, and additionally annual maximum prediction bounds for each catchment. The results show that model performance varies significantly in space and time depending on catchment characteristics including climate, geology and human impact. We identify regions where models systematically fail to produce good results, and present reasons why this could be the case. We also identify regions or catchment characteristics where one model performs better than others, and have explored which structural components or parameterisations enable certain models to produce better simulations in these catchments. Model predictive capability was assessed for each catchment by examining the ability of the models to produce discharge prediction bounds that successfully bound the observed discharge.
These results improve our understanding of the predictive capability of simple conceptual hydrological models across the UK and help us to identify where further effort is needed to develop modelling approaches to better represent different catchment and climate typologies.
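The GLUE procedure described above condenses to a short recipe: sample parameter sets, retain the "behavioral" sets whose performance exceeds a threshold, and take percentiles of the retained simulations at each time step. The one-parameter linear reservoir below is a toy stand-in for the FUSE structures, and every number is illustrative:

```python
import random

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, a standard performance metric."""
    mo = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mo) ** 2 for o in obs)
    return 1.0 - num / den

def toy_model(rain, k):
    """Hypothetical one-parameter linear-reservoir stand-in for a FUSE structure."""
    store, flow = 0.0, []
    for r in rain:
        store += r
        q = k * store
        store -= q
        flow.append(q)
    return flow

random.seed(1)
rain = [random.uniform(0.0, 10.0) for _ in range(100)]
obs = toy_model(rain, 0.3)  # synthetic 'observations' from a known parameter

# GLUE: sample parameters, keep behavioral sets above an NSE threshold.
behavioral = []
for _ in range(500):
    k = random.uniform(0.05, 0.95)
    sim = toy_model(rain, k)
    if nse(sim, obs) > 0.7:
        behavioral.append(sim)

# 5th/95th percentile prediction bounds at each time step.
bounds = []
for t in range(len(rain)):
    vals = sorted(s[t] for s in behavioral)
    lo = vals[int(0.05 * (len(vals) - 1))]
    hi = vals[int(0.95 * (len(vals) - 1))]
    bounds.append((lo, hi))
```

The benchmarking question in the abstract is then whether observed discharge falls inside such bounds, catchment by catchment; in GLUE proper, the percentiles are usually likelihood-weighted rather than unweighted as in this sketch.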
ERIC Educational Resources Information Center
Windschitl, Mark
2001-01-01
Examines how academic assertiveness in junior high school students was related to conceptual change and the degree to which their assertiveness affected conceptual change in the partners paired with them for a series of activities using a simulation of the human cardiovascular system. Indicates that the assertiveness ratings of the individuals'…
ERIC Educational Resources Information Center
Huang, Kun; Ge, Xun; Eseryel, Deniz
2017-01-01
This study investigated the effects of metaconceptually-enhanced, simulation-based inquiry learning on eighth grade students' conceptual change in science and their development of science epistemic beliefs. Two experimental groups studied the topics of motion and force using the same computer simulations but with different simulation guides: one…
NASA Astrophysics Data System (ADS)
Guerrero, J.; Halldin, S.; Xu, C.; Lundin, L.
2011-12-01
Distributed hydrological models are important tools in water management, as they account for the spatial variability of hydrological data and can produce spatially distributed outputs; they can directly incorporate and assess potential changes in basin characteristics. A recognized problem for models in general is equifinality, which is only exacerbated for distributed models, which tend to have a large number of parameters. We need to deal with the fundamentally ill-posed nature of the problem such models force us to face: a large number of parameters and very few variables that can be used to constrain them, often only the catchment discharge. There is a growing but still limited literature showing how the internal states of a distributed model can be used to calibrate and validate its predictions. In this paper, a distributed version of WASMOD, a conceptual rainfall-runoff model with only three parameters, combined with a routing algorithm based on the high-resolution HydroSHEDS data, was used to simulate the discharge in the Paso La Ceiba basin in Honduras. The parameter space was explored using Monte Carlo simulations, and the region containing the parameter sets considered behavioral according to two different criteria was delimited using the geometric concept of alpha-shapes. Discharge data from five internal sub-basins were used to aid in the calibration of the model and to answer the following questions: Can this information improve the simulations at the outlet of the catchment, or decrease their uncertainty? And, after reducing the number of model parameters needing calibration through sensitivity analysis: Is it possible to relate them to basin characteristics? The analysis revealed that in most cases the internal discharge data can be used to reduce the uncertainty in the discharge at the outlet, albeit with little improvement in the overall simulation results.
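A compressed sketch of the behavioral-region idea: sweep a parameter space on a grid and retain only the sets that satisfy both acceptance criteria, analogous to the paper's dual-criteria delimitation (the alpha-shape step that bounds the retained cloud geometrically is omitted). The two-parameter power-law model, the criteria thresholds, and the synthetic data are all hypothetical, not the WASMOD equations:

```python
def volume_error(sim, obs):
    """Relative bias in total simulated volume."""
    return abs(sum(sim) - sum(obs)) / sum(obs)

def rmse(sim, obs):
    """Root-mean-square error between simulated and observed series."""
    return (sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs)) ** 0.5

def runoff(rain, a, b):
    """Hypothetical two-parameter runoff stand-in (not the WASMOD equations)."""
    return [a * r ** b for r in rain]

rain = [1.0 + 0.18 * i for i in range(50)]   # synthetic forcing, ~1-10 mm
obs = runoff(rain, 0.5, 1.2)                 # 'observations' from known parameters

# Grid-sweep the parameter space; a set is behavioral only if it passes BOTH criteria.
behavioral = []
for i in range(46):
    for j in range(41):
        a, b = 0.1 + 0.02 * i, 0.8 + 0.02 * j
        sim = runoff(rain, a, b)
        if volume_error(sim, obs) < 0.1 and rmse(sim, obs) < 1.0:
            behavioral.append((a, b))
```

The retained cloud is typically an elongated, correlated band rather than a box, which is why the study bounds it with alpha-shapes instead of simple per-parameter ranges.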
Processing Motion: Using Code to Teach Newtonian Physics
NASA Astrophysics Data System (ADS)
Massey, M. Ryan
Prior to instruction, students often hold a common-sense view of motion that is inconsistent with Newtonian physics. Effective physics lessons therefore involve conceptual change. To provide a theoretical explanation for concepts and how they change, the triangulation model brings together key attributes of prototypes, exemplars, theories, Bayesian learning, ontological categories, and the causal model theory. The triangulation model provides a theoretical rationale for why coding is a viable method for physics instruction. As an experiment, thirty-two adolescent students participated in summer coding academies to learn how to design Newtonian simulations. Conceptual and attitudinal data were collected using the Force Concept Inventory and the Colorado Learning Attitudes about Science Survey. Results suggest that coding is an effective means for teaching Newtonian physics.
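The kind of simulation such students build amounts to a semi-implicit Euler update applying Newton's second law each frame. The sketch below is a generic illustration (here in Python rather than Processing), not the curriculum's actual code:

```python
def simulate_motion(x, y, vx, vy, ax, ay, dt, steps):
    """Semi-implicit Euler kinematics: update velocity from acceleration,
    then position from the new velocity, once per simulated frame."""
    path = []
    for _ in range(steps):
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
        path.append((x, y))
    return path

# Ball launched horizontally from 100 m under gravity (SI units).
path = simulate_motion(x=0.0, y=100.0, vx=5.0, vy=0.0, ax=0.0, ay=-9.81, dt=0.01, steps=100)
```

Writing this loop forces students to confront the Newtonian content directly: horizontal velocity stays constant because no force acts on it, while the vertical velocity changes every frame, which is exactly the point where common-sense "impetus" intuitions break down.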
NASA Technical Reports Server (NTRS)
Cotton, W. R.; Tripoli, G. J.
1982-01-01
Observational requirements for predicting convective storm development and intensity, as suggested by recent numerical experiments, are examined. Recent 3D numerical experiments are interpreted with regard to the relationship between overshooting tops and surface wind gusts. The development of software for emulating satellite-inferred cloud properties using 3D cloud model predicted data, and the simulation of the Heymsfield (1981) Northern Illinois storm, are described, as well as the development of a conceptual/semi-quantitative model of eastward-propagating mesoscale convective complexes forming to the lee of the Rocky Mountains.
NASA Astrophysics Data System (ADS)
Roy, N.; Molson, J.; Lemieux, J.-M.; Van Stempvoort, D.; Nowamooz, A.
2016-07-01
Three-dimensional numerical simulations are used to provide insight into the behavior of methane as it migrates from a leaky decommissioned hydrocarbon well into a shallow aquifer. The conceptual model includes gas-phase migration from a leaky well, dissolution into groundwater, advective-dispersive transport, and biodegradation of the dissolved methane plume. Gas-phase migration is simulated using the DuMux multiphase simulator, while transport and fate of the dissolved phase is simulated using the BIONAPL/3D reactive transport model. Methane behavior is simulated for two conceptual models: first in a shallow confined aquifer containing a decommissioned leaky well, based on a monitored field site near Lindbergh, Alberta, Canada, and secondly in a representative unconfined aquifer based loosely on the Borden, Ontario, field site. The simulations show that the Lindbergh site confined aquifer data are generally consistent with a 2 year methane leak of 2-20 m³/d, assuming anaerobic (sulfate-reducing) methane oxidation with maximum oxidation rates of 1 × 10⁻⁵ to 1 × 10⁻³ kg/m³/d. Under the highest oxidation rate, dissolved methane decreased from solubility (110 mg/L) to the threshold concentration of 10 mg/L within 5 years. In the unconfined case with the same leakage rate, including both aerobic and anaerobic methane oxidation, the methane plume was less extensive than in the confined aquifer scenarios. Unconfined aquifers may therefore be less vulnerable to impacts from methane leaks along decommissioned wells. At other potential leakage sites, site-specific data on the natural background geochemistry would be necessary to make reliable predictions of the fate of methane in groundwater.
A high-resolution conceptual model for diffuse organic micropollutant loads in streams
NASA Astrophysics Data System (ADS)
Stamm, Christian; Honti, Mark; Ghielmetti, Nico
2013-04-01
The ecological state of surface waters has become the dominant aspect in water quality assessments. Toxicity is a key determinant of the ecological state, but organic micropollutants (OMPs) are seldom monitored with the same spatial and temporal frequency as, for example, nutrients, mainly because of demanding analytical methods and costs. However, diffuse transport pathways are at least as complex for OMPs as for nutrients, and significant knowledge gaps remain. Moreover, the concentrations of the different compounds would need to be known with fairly high temporal resolution because acute toxicity can be as important as chronic toxicity. Fully detailed mechanistic models of diffuse OMP loads require an immense amount of site-specific knowledge and are rarely applicable to catchments lacking exceptional monitoring coverage. Simple empirical methods are less demanding but usually work with more temporal aggregation, and therefore offer limited support for estimating the ecological state. This study presents a simple conceptual model that aims to simulate the concentrations of selected organic micropollutants at daily resolution at 11 locations in the stream network of a small catchment (46 km²). The prerequisites are a known hydrological and meteorological background (daily discharge, precipitation and air temperature time series), a land use map, and some historic measurements of the desired compounds. The model is conceptual in the sense that all important diffuse transport pathways are simulated separately, but each with a simple empirical process rate. Consequently, some site-specific observations are required to calibrate the model, but afterwards the model can be used for forecasting and scenario analysis, as the calibrated process rates typically describe invariant properties of the catchment. We simulated six different OMPs from the categories of agricultural and urban pesticides and urban biocides.
The application of agricultural pesticides was also simulated with the model, using a heat-sum approach. Calibration was carried out with weekly aggregated samples covering the growing season in two years. The model reproduced the observed OMP concentrations with varying success. Compounds that are less persistent in the environment and thus show dominant temporal dynamics (pesticides with a short half-life) could, in general, be simulated better than the persistent ones. For the latter group, the relatively stable available stock meant that there were no clear seasonal dynamics, which revealed that the transport processes are quite uncertain even when daily rainfall is used as the main driver. Nevertheless, the daily concentration distribution could still be simulated with higher accuracy than the individual peaks. Thus we can model the concentration-duration relationship at daily resolution in an acceptable way for each compound.
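A heat-sum (degree-day) approach like the one mentioned above accumulates daily temperature excess over a base temperature and triggers an event once a threshold is crossed. The function below is a generic sketch of that idea; the base temperature, threshold, and temperature series are illustrative assumptions, not values calibrated in the study:

```python
def heat_sum_application_day(daily_temp, base_temp, threshold):
    """Return the index of the first day on which accumulated degree-days
    above base_temp reach the threshold (the hypothetical application
    trigger), or None if it is never reached."""
    total = 0.0
    for day, t in enumerate(daily_temp):
        total += max(0.0, t - base_temp)
        if total >= threshold:
            return day
    return None

# Hypothetical spring warming series (deg C); parameters are illustrative.
temps = [2, 4, 7, 9, 12, 15, 14, 16, 18, 17]
day = heat_sum_application_day(temps, base_temp=5.0, threshold=40.0)
```

Driving application timing by accumulated heat rather than by calendar date lets the model shift the simulated pesticide pulses between cold and warm springs without re-calibration.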
E. L. Landguth; S. A. Cushman; M. A. Murphy; G. Luikart
2010-01-01
Linking landscape effects on gene flow to processes such as dispersal and mating is essential to provide a conceptual foundation for landscape genetics. It is particularly important to determine how classical population genetic models relate to recent individual-based landscape genetic models when assessing individual movement and its influence on population genetic...
A conceptual framework for using Doppler radar acquired atmospheric data for flight simulation
NASA Technical Reports Server (NTRS)
Campbell, W.
1983-01-01
A concept is presented which can permit turbulence simulation in the vicinity of microbursts. The method involves a large data base, but should be fast enough for use with flight simulators. The model permits any pilot to simulate any flight maneuver in any aircraft. The model simulates a wind field with three-component mean winds and three-component turbulent gusts, and gust variation over the body of an aircraft so that all aerodynamic loads and moments can be calculated. The time and space variation of mean winds and turbulent intensities associated with a particular atmospheric phenomenon such as a microburst is used in the model. In fact, Doppler radar data such as provided by JAWS is uniquely suited for use with the proposed model. The concept is completely general and is not restricted to microburst studies. Reentry and flight in terrestrial or planetary atmospheres could be realistically simulated if supporting data of sufficient resolution were available.
An Educational Model for Hands-On Hydrology Education
NASA Astrophysics Data System (ADS)
AghaKouchak, A.; Nakhjiri, N.; Habib, E. H.
2014-12-01
This presentation provides an overview of a hands-on modeling tool developed for students in civil engineering and earth science disciplines to help them learn the fundamentals of hydrologic processes, model calibration, sensitivity analysis, and uncertainty assessment, and to practice conceptual thinking in solving engineering problems. The toolbox includes two simplified hydrologic models, HBV-EDU and HBV-Ensemble, designed as a complement to theoretical hydrology lectures. The models provide an interdisciplinary, application-oriented learning environment that introduces hydrologic phenomena through the use of a simplified conceptual hydrologic model. The toolbox can be used for in-class lab practice, homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching more advanced topics, including uncertainty analysis and ensemble simulation. Both models have been used in a course for in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of hydrology.
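In the same spirit as HBV-EDU, the core of an HBV-type conceptual model can be sketched as a soil-moisture bucket with a beta-function runoff split feeding a linear outflow reservoir. The parameter names follow common HBV usage, but the simplifications and values below are our illustrative assumptions, not the toolbox's implementation:

```python
def hbv_like_bucket(precip, pet, fc=150.0, beta=2.0, k=0.05, sm0=50.0):
    """Single-bucket HBV-style sketch: soil-moisture accounting with a
    beta-function runoff split and linear-reservoir discharge.
    fc: field capacity (mm), beta: runoff-split shape, k: outflow rate (1/d)."""
    sm, store = sm0, 0.0
    q = []
    for p, e in zip(precip, pet):
        recharge = p * (sm / fc) ** beta          # wetter soil sheds more rain
        sm += p - recharge
        sm = max(0.0, sm - e * min(1.0, sm / fc))  # moisture-limited actual ET
        store += recharge
        out = k * store                            # linear-reservoir discharge
        store -= out
        q.append(out)
    return q

# Constant forcing (mm/d), illustration only: 30 days of rain and evaporation.
q = hbv_like_bucket(precip=[5.0] * 30, pet=[2.0] * 30)
```

Even this stripped-down version exposes the teaching points the abstract lists: changing `fc`, `beta`, or `k` and watching the hydrograph respond is a direct exercise in calibration and sensitivity analysis.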
Shuttle mission simulator software conceptual design
NASA Technical Reports Server (NTRS)
Burke, J. F.
1973-01-01
Software conceptual designs (SCD) are presented for meeting the simulator requirements for the shuttle missions. The major areas of the SCD discussed include: malfunction insertion, flight software, applications software, systems software, and computer complex.
NASA Astrophysics Data System (ADS)
Marchionda, Elisabetta; Deschamps, Rémy; Nader, Fadi H.; Ceriani, Andrea; Di Giulio, Andrea; Lawrence, David; Morad, Daniel J.
2017-04-01
The stratigraphic record of a carbonate system is the result of the interplay of several local and global factors that control the physical and biological responses within a basin. Conceptual models cannot be detailed enough to take into account all the processes that control the deposition of sediments. The key parameters controlling sedimentation can instead be investigated with stratigraphic forward models, which permit dynamic and quantitative simulations of the sedimentary basin infill. This work focuses on an onshore Abu Dhabi field (UAE) and aims to provide a complete picture of the stratigraphic evolution of the Upper Jurassic Arab Formation (Fm.). In this study, we started with the definition of a field-scale conceptual depositional model of the Formation, based on facies and well-log analysis of five wells. The Arab Fm. can be described as a shallow-marine carbonate ramp, ranging from outer-ramp deposits to supratidal/evaporitic facies associations (from bottom to top). Reconstruction of the sequence-stratigraphic pattern and several paleofacies maps made it possible to suggest multiple directions of progradation at local scale. A 3D forward modelling tool was then used to (i) identify and quantify the parameters controlling the geometries and facies distribution of the Arab Fm.; (ii) predict the stratigraphic architecture of the Arab Fm.; and (iii) integrate and validate the conceptual model. Numerous constraints were set during the different simulations, and sensitivity analyses were performed on carbonate production, eustatic oscillations and transport parameters. To verify geological consistency, the 3D forward model was calibrated against the available control points (five wells) in terms of thickness and facies distribution.
Ackerman, Daniel J.; Rousseau, Joseph P.; Rattray, Gordon W.; Fisher, Jason C.
2010-01-01
Three-dimensional steady-state and transient models of groundwater flow and advective transport in the eastern Snake River Plain aquifer were developed by the U.S. Geological Survey in cooperation with the U.S. Department of Energy. The steady-state and transient flow models cover an area of 1,940 square miles that includes most of the 890 square miles of the Idaho National Laboratory (INL). A 50-year history of waste disposal at the INL has resulted in measurable concentrations of waste contaminants in the eastern Snake River Plain aquifer. Model results can be used in numerical simulations to evaluate the movement of contaminants in the aquifer. Saturated flow in the eastern Snake River Plain aquifer was simulated using the MODFLOW-2000 groundwater flow model. Steady-state flow was simulated to represent conditions in 1980 with average streamflow infiltration from 1966-80 for the Big Lost River, the major variable inflow to the system. The transient flow model simulates groundwater flow between 1980 and 1995, a period that included a 5-year wet cycle (1982-86) followed by an 8-year dry cycle (1987-94). Specified flows into or out of the active model grid define the conditions on all boundaries except the southwest (outflow) boundary, which is simulated with head-dependent flow. In the transient flow model, streamflow infiltration was the major stress, and was variable in time and location. The models were calibrated by adjusting aquifer hydraulic properties to match simulated and observed heads or head differences using the parameter-estimation program incorporated in MODFLOW-2000. Various summary, regression, and inferential statistics, in addition to comparisons of model properties and simulated head to measured properties and head, were used to evaluate the model calibration. Model parameters estimated for the steady-state calibration included hydraulic conductivity for seven of nine hydrogeologic zones and a global value of vertical anisotropy. 
Parameters estimated for the transient calibration included specific yield for five of the seven hydrogeologic zones. The zones represent five rock units and parts of four rock units with abundant interbedded sediment. All estimates of hydraulic conductivity were nearly within 2 orders of magnitude of the maximum expected value in a range that exceeds 6 orders of magnitude. The estimate of vertical anisotropy was larger than the maximum expected value. All estimates of specific yield and their confidence intervals were within the ranges of values expected for aquifers, the range of values for porosity of basalt, and other estimates of specific yield for basalt. The steady-state model reasonably simulated the observed water-table altitude, orientation, and gradients. Simulation of transient flow conditions accurately reproduced observed changes in the flow system resulting from episodic infiltration from the Big Lost River and facilitated understanding and visualization of the relative importance of historical differences in infiltration in time and space. As described in a conceptual model, the numerical model simulations demonstrate flow that is (1) dominantly horizontal through interflow zones in basalt and vertical anisotropy resulting from contrasts in hydraulic conductivity of various types of basalt and the interbedded sediments, (2) temporally variable due to streamflow infiltration from the Big Lost River, and (3) moving downward downgradient of the INL. The numerical models were reparameterized, recalibrated, and analyzed to evaluate alternative conceptualizations or implementations of the conceptual model. The analysis of the reparameterized models revealed that little improvement in the model could come from alternative descriptions of sediment content, simulated aquifer thickness, streamflow infiltration, and vertical head distribution on the downgradient boundary. Of the alternative estimates of flow to or from the aquifer, only a 20 percent decrease in
Predicting herbicide and biocide concentrations in rivers across Switzerland
NASA Astrophysics Data System (ADS)
Wemyss, Devon; Honti, Mark; Stamm, Christian
2014-05-01
Pesticide concentrations vary strongly in space and time. Accordingly, intensive sampling is required to achieve a reliable quantification of pesticide pollution. As this requires substantial resources, loads and concentration ranges in many small and medium streams remain unknown. Here, we propose partially filling the information gap for herbicides and biocides by using a modelling approach that predicts stream concentrations without site-specific calibration simply based on generally available data like land use, discharge and nation-wide consumption data. The simple, conceptual model distinguishes herbicide losses from agricultural fields, private gardens and biocide losses from buildings (facades, roofs). The herbicide model is driven by river discharge and the applied herbicide mass; the biocide model requires precipitation and the footprint area of urban areas containing the biocide. The model approach allows for modelling concentrations across multiple catchments at the daily, or shorter, time scale and for small to medium-sized catchments (1 - 100 km2). Four high resolution sampling campaigns in the Swiss Plateau were used to calibrate the model parameters for six model compounds: atrazine, metolachlor, terbuthylazine, terbutryn, diuron and mecoprop. Five additional sampled catchments across Switzerland were used to directly compare the predicted to the measured concentrations. Analysis of the first results reveals a reasonable simulation of the concentration dynamics for specific rainfall events and across the seasons. Predicted concentration ranges are reasonable even without site-specific calibration. This indicates the transferability of the calibrated model directly to other areas. However, the results also demonstrate systematic biases in that the highest measured peaks were not attained by the model. Probable causes for these deviations are conceptual model limitations and input uncertainty (pesticide use intensity, local precipitation, etc.). 
Accordingly, the model will be conceptually improved. This presentation will show the model simulations and compare the performance of the original and the modified model versions. Finally, the model will be applied across approximately 50% of the catchments in the Swiss Plateau, where the necessary input data are available and where the model concept can reasonably be applied.
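The herbicide pathway described above (in-stream load driven by river discharge and applied mass) can be caricatured in a few lines. Everything here is a hypothetical sketch: the linear discharge scaling, `loss_frac`, and `q_ref` are illustrative assumptions, not the study's calibrated relationships:

```python
def herbicide_concentration(discharge, applied_mass, loss_frac=0.01, q_ref=1.0):
    """Conceptual sketch: the daily mobilised load scales with the mass
    applied in the catchment and with discharge (a proxy for runoff
    events), then is divided by the daily flow volume to give ug/L."""
    conc = []
    for q, m in zip(discharge, applied_mass):
        load = loss_frac * m * (q / q_ref)       # mobilised load, kg/d
        volume_l = q * 86400.0 * 1000.0          # daily flow volume, L/d
        conc.append(1e9 * load / volume_l if q > 0 else 0.0)  # kg -> ug
    return conc

# Three hypothetical days of rising discharge (m3/s) after a 100 kg application.
c = herbicide_concentration([0.5, 1.0, 2.0], [100.0, 100.0, 100.0])
```

A load that is strictly linear in discharge yields a concentration that is flat between application pulses, which is one reason such simple formulations underpredict the highest measured peaks, consistent with the systematic bias the abstract reports.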
Sage, Jérémie; El Oreibi, Elissar; Saad, Mohamed; Gromaire, Marie-Christine
2016-08-01
This study investigates the temporal variability of zinc concentrations in zinc roof runoff. The influence of rainfall characteristics and dry-period duration is evaluated by combining a laboratory experiment on small zinc sheets with in situ measurements under real weather conditions from a 1.6-m² zinc panel. A reformulation of a commonly used conceptual runoff quality model is introduced, and its ability to simulate the evolution of zinc concentrations is evaluated. A systematic and sharp decrease from initially high to relatively low and stable zinc concentrations after 0.5 to 2 mm of rainfall is observed in both experiments, suggesting that highly soluble corrosion products are removed at early stages of runoff. A moderate dependence between antecedent dry-period duration and the magnitude of zinc concentrations at the beginning of a rain event is evidenced. By contrast, the results indicate that concentrations are not significantly influenced by rainfall intensity. The simulated-rainfall experiment nonetheless suggests that a slight effect of rainfall intensity may be expected after the initial decrease in concentrations. Finally, this study shows that relatively simple conceptual runoff quality models can be adopted to simulate the variability of zinc concentrations during a rain event and from one rain event to another.
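The first-flush behavior described above, a sharp decay from initially high concentrations to a stable background within 0.5 to 2 mm of rain, is naturally captured by an exponential washoff curve over cumulative rainfall depth. The functional form and all parameter values below are illustrative assumptions, not the paper's reformulated model:

```python
import math

def zinc_concentration(cum_rain_mm, c_initial, c_stable, k_wash):
    """First-flush sketch: concentration decays exponentially from an
    initially high value (soluble corrosion products accumulated during
    the dry period) toward a stable background as cumulative rainfall
    depth increases. k_wash controls how fast the stock is depleted (1/mm)."""
    return c_stable + (c_initial - c_stable) * math.exp(-k_wash * cum_rain_mm)

# Concentration (mg/L, hypothetical) after 0, 0.5, 1 and 2 mm of rainfall.
profile = [zinc_concentration(d, c_initial=30.0, c_stable=3.0, k_wash=2.0)
           for d in (0.0, 0.5, 1.0, 2.0)]
```

In such a formulation the dry-period effect enters through `c_initial` (a longer antecedent dry period rebuilds a larger soluble stock), while rainfall intensity barely matters because the decay is driven by cumulative depth, mirroring the experimental findings.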
Vlasakakis, G; Comets, E; Keunecke, A; Gueorguieva, I; Magni, P; Terranova, N; Della Pasqua, O; de Lange, E C; Kloft, C
2013-01-01
Pharmaceutical sciences experts and regulators acknowledge that pharmaceutical development, as well as drug usage, requires more than scientific advancements to cope with current attrition rates and therapeutic failures. Drug disease modeling and simulation (DDM&S) creates a paradigm to enable an integrated, higher-level understanding of drugs, (diseased) systems, and their interactions (systems pharmacology) through mathematical/statistical models (pharmacometrics), thereby facilitating decision making during drug development and the therapeutic usage of medicines. To identify gaps and challenges in DDM&S, an inventory of skills and competencies currently available in academia, industry, and clinical practice was obtained through a survey. The survey outcomes revealed benefits, weaknesses, and hurdles for the implementation of DDM&S. In addition, the survey indicated that no consensus exists about the knowledge, skills, and attributes required to perform DDM&S activities effectively. Hence, a landscape of technical and conceptual requirements for DDM&S was identified and serves as a basis for developing a framework of competencies to guide future education and training in DDM&S. PMID:23887723
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1988-01-01
The Rubber Airplane program, which combines two symbolic processing techniques with a component-based database of design knowledge, is proposed as a computer aid for conceptual design. Object-oriented programming organizes the program around the objects and behaviors to be simulated, while constraint propagation lets declarative statements designate mathematical relationships among the equation variables. The additional level of organizational structure that results from arranging the design information in terms of design components is found to provide greater flexibility and convenience.
Parametric Model of an Aerospike Rocket Engine
NASA Technical Reports Server (NTRS)
Korte, J. J.
2000-01-01
A suite of computer codes was assembled to simulate the performance of an aerospike engine and to generate the engine input for the Program to Optimize Simulated Trajectories. First, an engine simulator module was developed that predicts the aerospike engine performance for a given mixture ratio, power level, thrust vectoring level, and altitude. This module was then used to rapidly generate the aerospike engine performance tables for axial thrust, normal thrust, pitching moment, and specific thrust. Parametric engine geometry was defined for use with the engine simulator module. The parametric model was also integrated into the iSIGHT multidisciplinary framework so that alternate designs could be determined. The computer codes were used to support in-house conceptual studies of reusable launch vehicle designs.
Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator.
Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus
2017-01-01
Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation.
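As a toy illustration of the time-continuous rate dynamics such a framework must integrate (not the paper's actual NEST implementation), consider two linearly coupled rate units stepped with forward Euler; the coupling weights, time constant, and drive are invented for the example.

```python
import numpy as np

# Two coupled rate units obeying  tau * dr/dt = -r + W @ r + ext,
# the canonical linear rate dynamics, integrated with forward Euler.
tau, dt, steps = 10.0, 0.1, 5000
W = np.array([[0.0, 0.4], [0.4, 0.0]])   # instantaneous mutual coupling
ext = np.array([1.0, 0.0])               # constant external drive
r = np.zeros(2)
for _ in range(steps):
    r = r + dt / tau * (-r + W @ r + ext)

# For this stable linear system the fixed point is r* = (I - W)^-1 @ ext,
# so the integration can be checked against the analytical solution.
r_star = np.linalg.solve(np.eye(2) - W, ext)
print(r, r_star)
```

In a spiking simulator these interactions would be communicated continuously between processes rather than as discrete events, which is the technical problem the paper addresses.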
A model for closing the inviscid form of the average-passage equation system
NASA Technical Reports Server (NTRS)
Adamczyk, J. J.; Mulac, R. A.; Celestina, M. L.
1985-01-01
A mathematical model is proposed for closing, or mathematically completing, the system of equations that describes the time-average flow field through the blade passages of multistage turbomachinery. These equations, referred to as the average-passage equation system, govern a conceptual model that has proven useful in turbomachinery aerodynamic design and analysis. The closure model is developed to ensure consistency between these equations and the axisymmetric through-flow equations. The closure model was incorporated into a computer code for use in simulating the flow field about a high-speed counter-rotating propeller and a high-speed fan stage. Results from these simulations are presented.
NASA Astrophysics Data System (ADS)
Sahoo, Sasmita; Jha, Madan K.
2017-12-01
Process-based groundwater models are useful to understand complex aquifer systems and make predictions about their response to hydrological changes. A conceptual model for evaluating responses to environmental changes is presented, considering the hydrogeologic framework, flow processes, aquifer hydraulic properties, boundary conditions, and sources and sinks of the groundwater system. Based on this conceptual model, a quasi-three-dimensional transient groundwater flow model was designed using MODFLOW to simulate the groundwater system of Mahanadi River delta, eastern India. The model was constructed in the context of an upper unconfined aquifer and lower confined aquifer, separated by an aquitard. Hydraulic heads of 13 shallow wells and 11 deep wells were used to calibrate transient groundwater conditions during 1997-2006, followed by validation (2007-2011). The aquifer and aquitard hydraulic properties were obtained by pumping tests and were calibrated along with the rainfall recharge. The statistical and graphical performance indicators suggested a reasonably good simulation of groundwater flow over the study area. Sensitivity analysis revealed that groundwater level is most sensitive to the hydraulic conductivities of both the aquifers, followed by vertical hydraulic conductivity of the confining layer. The calibrated model was then employed to explore groundwater-flow dynamics in response to changes in pumping and recharge conditions. The simulation results indicate that pumping has a substantial effect on the confined aquifer flow regime as compared to the unconfined aquifer. The results and insights from this study have important implications for other regional groundwater modeling studies, especially in multi-layered aquifer systems.
Simulation of ground-water flow and solute transport in the Glen Canyon aquifer, East-Central Utah
Freethey, Geoffrey W.; Stolp, Bernard J.
2010-01-01
The extraction of methane from coal beds in the Ferron coal trend in central Utah started in the mid-1980s. Beginning in 1994, water from the extraction process was pressure injected into the Glen Canyon aquifer. The lateral extent of the aquifer that could be affected by injection is about 7,600 square miles. To address regional-scale effects of injection over a decadal time frame, a conceptual model of ground-water movement and transport of dissolved solids was formulated. A numerical model that incorporates aquifer concepts was then constructed and used to simulate injection. The Glen Canyon aquifer within the study area is conceptualized in two parts—an active area of ground-water flow and solute transport that exists between recharge areas in the San Rafael Swell and Desert, Waterpocket Fold, and Henry Mountains and discharge locations along the Muddy, Dirty Devil, San Rafael, and Green Rivers. An area of little or negligible ground-water flow exists north of Price, Utah, and beneath the Wasatch Plateau. Pressurized injection of coal-bed methane production water occurs in this area where dissolved-solids concentrations can be more than 100,000 milligrams per liter. Injection has the potential to increase hydrologic interaction with the active flow area, where dissolved-solids concentrations are generally less than 3,000 milligrams per liter. Pressurized injection of coal-bed methane production water in 1994 initiated a net addition of flow and mass of solutes into the Glen Canyon aquifer. To better understand the regional-scale hydrologic interaction between the two areas of the Glen Canyon aquifer, pressurized injection was numerically simulated. Data constraints precluded development of a fully calibrated simulation; instead, an uncalibrated model was constructed that is a plausible representation of the conceptual flow and solute-transport processes. The amount of injected water over the 36-year simulation period is about 25,000 acre-feet.
As a result, simulated water levels in the injection areas increased by 50 feet and dissolved-solids concentrations increased by 100 milligrams per liter or more. These increases are accrued into aquifer storage and do not extend to the rivers during the 36-year simulation period. The amount of change in simulated discharge and solute load to the rivers is less than the resolution accuracy of the numerical simulation and is interpreted as no significant change over the considered time period.
Erdogan, Goker; Yildirim, Ilker; Jacobs, Robert A.
2015-01-01
People learn modality-independent, conceptual representations from modality-specific sensory signals. Here, we hypothesize that any system that accomplishes this feat will include three components: a representational language for characterizing modality-independent representations, a set of sensory-specific forward models for mapping from modality-independent representations to sensory signals, and an inference algorithm for inverting forward models—that is, an algorithm for using sensory signals to infer modality-independent representations. To evaluate this hypothesis, we instantiate it in the form of a computational model that learns object shape representations from visual and/or haptic signals. The model uses a probabilistic grammar to characterize modality-independent representations of object shape, uses a computer graphics toolkit and a human hand simulator to map from object representations to visual and haptic features, respectively, and uses a Bayesian inference algorithm to infer modality-independent object representations from visual and/or haptic signals. Simulation results show that the model infers identical object representations when an object is viewed, grasped, or both. That is, the model’s percepts are modality invariant. We also report the results of an experiment in which different subjects rated the similarity of pairs of objects in different sensory conditions, and show that the model provides a very accurate account of subjects’ ratings. Conceptually, this research significantly contributes to our understanding of modality invariance, an important type of perceptual constancy, by demonstrating how modality-independent representations can be acquired and used. Methodologically, it provides an important contribution to cognitive modeling, particularly an emerging probabilistic language-of-thought approach, by showing how symbolic and statistical approaches can be combined in order to understand aspects of human perception. PMID:26554704
Robert J. Luxmoore; William W. Hargrove; M. Lynn Tharp; Wilfred M. Post; Michael W. Berry; Karen S. Minser; Wendell P. Cropper; Dale W. Johnson; Boris Zeide; Ralph L. Amateis; Harold E. Burkhart; V. Clark Baldwin; Kelly D. Peterson
2000-01-01
Stochastic transfer of information in a hierarchy of simulators is offered as a conceptual approach for assessing forest responses to changing climate and air quality across 13 southeastern states of the USA. This assessment approach combines geographic information system and Monte Carlo capabilities with several scales of computer modeling for southern pine species...
Stocker, Martin; Burmester, Margarita; Allen, Meredith
2014-04-03
As a conceptual review, this paper debates relevant learning theories to inform the development, design and delivery of an effective educational programme for simulated team training relevant to health professionals. Kolb's experiential learning theory is used as the main conceptual framework to define the sequence of activities. Dewey's theory of reflective thought and action, Jarvis' modification of Kolb's learning cycle and Schön's reflection-on-action serve as a model to design scenarios for optimal concrete experience and debriefing that challenges participants' beliefs and habits. Bandura's theory of self-efficacy and newer socio-cultural learning models show that efficient team training must introduce the social-cultural context of a team. The ideal simulated team training programme needs a scenario for concrete experience, followed by a debriefing with a critically reflective observation and abstract conceptualisation phase, and ending with a second scenario for active experimentation. Let them re-experiment to optimise the effect of a simulated training session. Challenge them to the edge: the scenario needs to challenge participants to generate failures and feelings of inadequacy that drive and motivate team members to reflect critically and learn. Not experience itself but the inadequacy and contradictions of habitual experience serve as the basis for reflection. Facilitate critical reflection: facilitators and group members must guide and motivate individual participants through the debriefing session, inciting and empowering learners to challenge their own beliefs and habits. To do this, learners need to feel psychologically safe. Let the group talk and explore critically.
Motivate with reality and context: training with multidisciplinary team members of different levels of expertise, acting on physiological variables in their usual environment (in-situ simulation), is mandatory to introduce cultural context and social conditions into the learning experience. Embedding in situ team training sessions into a teaching programme, to enable repeated training and regular assessment of team performance, is mandatory for a cultural change of sustained improvement of team performance and patient safety.
Life cycle cost modeling of conceptual space vehicles
NASA Technical Reports Server (NTRS)
Ebeling, Charles
1993-01-01
This paper documents progress to date by the University of Dayton on the development of a life cycle cost model for use during the conceptual design of new launch vehicles and spacecraft. This research is being conducted under NASA Research Grant NAG-1-1327. This research effort changes the focus from that of the first two years in which a reliability and maintainability model was developed to the initial development of a life cycle cost model. Cost categories are initially patterned after NASA's three axis work breakdown structure consisting of a configuration axis (vehicle), a function axis, and a cost axis. The focus will be on operations and maintenance costs and other recurring costs. Secondary tasks performed concurrent with the development of the life cycle costing model include continual support and upgrade of the R&M model. The primary result of the completed research will be a methodology and a computer implementation of the methodology to provide for timely cost analysis in support of the conceptual design activities. The major objectives of this research are: to obtain and to develop improved methods for estimating manpower, spares, software and hardware costs, facilities costs, and other cost categories as identified by NASA personnel; to construct a life cycle cost model of a space transportation system for budget exercises and performance-cost trade-off analysis during the conceptual and development stages; to continue to support modifications and enhancements to the R&M model; and to continue to assist in the development of a simulation model to provide an integrated view of the operations and support of the proposed system.
Burow, Karen R.; Panshin, Sandra Y.; Dubrovsky, Neil H.; Vanbrocklin, David; Fogg, Graham E.
1999-01-01
A conceptual two-dimensional numerical flow and transport modeling approach was used to test hypotheses addressing dispersion, transformation rate, and in a relative sense, the effects of ground-water pumping and reapplication of irrigation water on DBCP concentrations in the aquifer. The flow and transport simulations, which represent hypothetical steady-state flow conditions in the aquifer, were used to refine the conceptual understanding of the aquifer system rather than to predict future concentrations of DBCP. Results indicate that dispersion reduces peak concentrations, but this process alone does not account for the apparent decrease in DBCP concentrations in ground water in the eastern San Joaquin Valley. Ground-water pumping and reapplication of irrigation water may affect DBCP concentrations to the extent that this process can be simulated indirectly using first-order decay. Transport simulation results indicate that the in situ 'effective' half-life of DBCP caused by processes other than dispersion and transformation to BAA could be on the order of 6 years.
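The first-order decay used to represent these combined processes reduces to a simple half-life relation. The sketch below uses the roughly 6-year effective half-life cited above; the initial concentration is a hypothetical placeholder.

```python
import math

# First-order decay: C(t) = C0 * exp(-k t), with k = ln(2) / half_life.
# The ~6-year figure is the in situ "effective" half-life reported for
# DBCP; the initial concentration here is arbitrary.

def first_order_decay(c0, half_life_yr, t_yr):
    k = math.log(2.0) / half_life_yr
    return c0 * math.exp(-k * t_yr)

c0 = 1.0  # hypothetical initial DBCP concentration (relative units)
print(first_order_decay(c0, 6.0, 6.0))   # one half-life -> 0.5 * c0
print(first_order_decay(c0, 6.0, 12.0))  # two half-lives -> 0.25 * c0
```

Fitting such a curve to observed concentration declines, after accounting for dispersion, is how an "effective" half-life of this kind is typically inferred.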
Factor Analysis via Components Analysis
ERIC Educational Resources Information Center
Bentler, Peter M.; de Leeuw, Jan
2011-01-01
When the factor analysis model holds, component loadings are linear combinations of factor loadings, and vice versa. This interrelation permits us to define new optimization criteria and estimation methods for exploratory factor analysis. Although this article is primarily conceptual in nature, an illustrative example and a small simulation show…
ERIC Educational Resources Information Center
Abdullah, Sopiah; Shariff, Adilah
2008-01-01
The purpose of the study was to investigate the effects of inquiry-based computer simulation with heterogeneous-ability cooperative learning (HACL) and inquiry-based computer simulation with friendship cooperative learning (FCL) on (a) scientific reasoning (SR) and (b) conceptual understanding (CU) among Form Four students in Malaysian Smart…
ERIC Educational Resources Information Center
Baser, Mustafa
2006-01-01
The objective of this research is to investigate the effects of simulations based on conceptual change conditions (CCS) and traditional confirmatory simulations (TCS) on pre-service elementary school teachers' understanding of direct current electric circuits. The data was collected from a sample consisting of 89 students; 48 students in the…
High Fidelity Thermal Simulators for Non-Nuclear Testing: Analysis and Initial Results
NASA Technical Reports Server (NTRS)
Bragg-Sitton, Shannon M.; Dickens, Ricky; Dixon, David
2007-01-01
Non-nuclear testing can be a valuable tool in the development of a space nuclear power system, providing system characterization data and allowing one to work through various fabrication, assembly and integration issues without the cost and time associated with a full ground nuclear test. In a non-nuclear test bed, electric heaters are used to simulate the heat from nuclear fuel. Testing with non-optimized heater elements allows one to assess thermal, heat transfer, and stress related attributes of a given system, but fails to demonstrate the dynamic response that would be present in an integrated, fueled reactor system. High fidelity thermal simulators that match both the static and the dynamic fuel pin performance that would be observed in an operating, fueled nuclear reactor can vastly increase the value of non-nuclear test results. With optimized simulators, the integration of thermal hydraulic hardware tests with simulated neutronic response provides a bridge between electrically heated testing and fueled nuclear testing, providing a better assessment of system integration issues, characterization of integrated system response times and response characteristics, and assessment of potential design improvements at a relatively small fiscal investment. Initial conceptual thermal simulator designs are determined by simple one-dimensional analysis at a single axial location and at steady-state conditions; feasible concepts are then input into a detailed three-dimensional model for comparison to expected fuel pin performance. Static and dynamic fuel pin performance for a proposed reactor design is determined using SINDA/FLUINT thermal analysis software, and a comparison is made between the expected nuclear performance and the performance of conceptual thermal simulator designs. Through a series of iterative analyses, a conceptual high fidelity design can be developed.
Test results presented in this paper correspond to a "first cut" simulator design for a potential liquid metal (NaK) cooled reactor design that could be applied for Lunar surface power. Proposed refinements to this simulator design are also presented.
Perceived game realism: a test of three alternative models.
Ribbens, Wannes
2013-01-01
Perceived realism is considered a key concept in explaining the mental processing of media messages and the societal impact of media. Despite its importance, little is known about its conceptualization and dimensional structure, especially with regard to digital games. The aim of this study was to test a six-factor model of perceived game realism comprised of simulational realism, freedom of choice, perceptual pervasiveness, social realism, authenticity, and character involvement and to assess it against an alternative single- and five-factor model. Data were collected from 380 male digital game users who judged the realism of the first-person shooter Half-Life 2 based upon their previous experience with the game. Confirmatory factor analysis was applied to investigate which model fits the data best. The results support the six-factor model over the single- and five-factor solutions. The study contributes to our knowledge of perceived game realism by further developing its conceptualization and measurement.
NASA Astrophysics Data System (ADS)
Wee, Loo Kang
2012-05-01
We develop an Easy Java Simulation (EJS) model for students to experience the physics of idealized one-dimensional collision carts. The physics model is described and simulated by both continuous dynamics and a discrete transition during collision. In designing the simulations, we briefly discuss three pedagogical considerations, namely (1) a consistent simulation world view with a pen-and-paper representation, (2) a data table, scientific graphs and symbolic mathematical representations for ease of data collection and multiple representational visualizations, and (3) a game for simple concept testing that can further support learning. We also suggest augmenting a physical setup with the simulation, highlighting three advantages of real collision-cart equipment: a tacit 3D experience, random errors in measurement, and the conceptual significance of conservation of momentum applied just before and after the collision. General feedback from the students has been relatively positive, and we hope teachers will find the simulation useful in their own classes.
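The discrete collision transition in such a model follows from momentum conservation plus a restitution coefficient. Here is a minimal sketch (in Python rather than EJS's Java) with illustrative masses and velocities.

```python
# One-dimensional collision of two carts: total momentum is conserved,
# and a restitution coefficient e spans perfectly elastic (e = 1) to
# perfectly inelastic (e = 0). Values below are illustrative.

def collide(m1, v1, m2, v2, e=1.0):
    """Return post-collision velocities for a 1D two-cart collision."""
    p = m1 * v1 + m2 * v2                   # total momentum (conserved)
    v_rel = v1 - v2                         # approach velocity
    v1p = (p - m2 * e * v_rel) / (m1 + m2)
    v2p = (p + m1 * e * v_rel) / (m1 + m2)
    return v1p, v2p

# Equal masses, elastic collision: the carts exchange velocities
print(collide(1.0, 2.0, 1.0, 0.0, e=1.0))   # -> (0.0, 2.0)
# Perfectly inelastic: the carts move off together
print(collide(1.0, 2.0, 1.0, 0.0, e=0.0))   # -> (1.0, 1.0)
```

Comparing momentum just before and just after the transition, as the abstract recommends with the real carts, verifies that the discrete update conserves it for any value of e.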
Using a Virtual Experiment to Analyze Infiltration Process from Point to Grid-cell Size Scale
NASA Astrophysics Data System (ADS)
Barrios, M. I.
2013-12-01
Hydrological science requires a consistent theoretical corpus describing the relationships between dominant physical processes at different spatial and temporal scales. However, the strong spatial heterogeneities and non-linearities of these processes make multiscale conceptualizations difficult to develop, so understanding scaling is a key issue for advancing the science. This work focuses on using virtual experiments to address the scaling of vertical infiltration from a physically based model at the point scale to a simplified, physically meaningful modeling approach at the grid-cell scale. Compared with field experimentation, numerical simulations have the advantage of handling a wide range of boundary and initial conditions. The aim of the work was to show the utility of numerical simulations for discovering relationships between the hydrological parameters at both scales, and to use this synthetic experience as a medium for teaching the complex nature of this hydrological process. The Green-Ampt model was used to represent vertical infiltration at the point scale, and a conceptual storage model was employed to simulate the infiltration process at the grid-cell scale. Lognormal and beta probability distribution functions were assumed to represent the heterogeneity of soil hydraulic parameters at the point scale. The linkages between point-scale and grid-cell-scale parameters were established by inverse simulations based on the mass balance equation and averaging of the flow at the point scale. The results show numerical stability issues under particular conditions and reveal the complex, non-linear relationships between the models' parameters at both scales, indicating that the parameterization of point-scale processes at the coarser scale is governed by the amplification of non-linear effects. The findings of these simulations have been used by the students to identify potential research questions on scale issues.
Moreover, the implementation of this virtual lab improved the students' ability to understand the rationale of these processes and how to transfer the mathematical models to computational representations.
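The point-scale Green-Ampt model used in the virtual experiment can be sketched compactly: cumulative infiltration F satisfies an implicit relation that is easily solved by fixed-point iteration. The soil parameter values below are illustrative textbook-style numbers, not those of the study.

```python
import math

# Green-Ampt cumulative infiltration under ponded conditions:
# F - S*ln(1 + F/S) = K*t, where S = psi * dtheta is the product of
# wetting-front suction head and moisture deficit. Solved here by
# fixed-point iteration F <- K*t + S*ln(1 + F/S).

def green_ampt_F(K, psi, dtheta, t, tol=1e-10):
    """Cumulative infiltration F (cm) after time t (h)."""
    S = psi * dtheta                 # suction-moisture term (cm)
    F = max(K * t, tol)              # starting guess
    for _ in range(200):
        F_new = K * t + S * math.log(1.0 + F / S)
        if abs(F_new - F) < tol:
            break
        F = F_new
    return F

# Illustrative sandy-loam-like parameters (cm/h, cm, dimensionless)
K, psi, dtheta = 0.65, 16.7, 0.34
F = green_ampt_F(K, psi, dtheta, t=2.0)
f = K * (1.0 + psi * dtheta / F)     # infiltration rate at t = 2 h
print(F, f)
```

Averaging F over many parameter draws from the assumed lognormal and beta distributions is, in spirit, how the grid-cell-scale behaviour emerges from point-scale heterogeneity in such a virtual experiment.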
NASA Astrophysics Data System (ADS)
Hong, E.; Park, Y.; Muirhead, R.; Jeong, J.; Pachepsky, Y. A.
2017-12-01
Pathogenic microorganisms in recreational and irrigation waters remain a subject of concern. Water quality models are used to estimate the microbial quality of water sources, evaluate microbial contamination-related risks, guide microbial water quality monitoring, and evaluate the effect of agricultural management on microbial water quality. The Agricultural Policy/Environmental eXtender (APEX) is a watershed-scale water quality model with a highly detailed representation of agricultural management, but it currently lacks microbial fate and transport simulation capabilities. The objective of this work was to develop the first APEX microbial fate and transport module, using the APEX conceptual model of manure removal together with recently introduced conceptualizations of in-stream microbial fate and transport. The module utilizes the manure erosion rates found in APEX. Bacteria survival in the soil-manure mixing layer was simulated with a two-stage survival model, with individual survival patterns simulated for each manure application date. Simulated in-stream microbial fate and transport processes included the reach-scale passive release of bacteria with resuspended bottom sediment during high-flow events, the transport of bacteria from bottom sediment due to hyporheic exchange during low-flow periods, deposition with settling sediment, and two-stage survival. Default parameter values were available from recently published databases. The APEX model with the newly developed microbial fate and transport module was applied to simulate seven years of monitoring data for the Toenepi watershed in New Zealand. Based on calibration and testing results, APEX with the microbe module reproduced the monitored pattern of E. coli concentrations at the watershed outlet well.
The APEX with the microbial fate and transport module will be utilized for predicting microbial quality of water under various agricultural practices, evaluating monitoring protocols, and supporting the selection of management practices based on regulations that rely on fecal indicator bacteria concentrations.
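A two-stage (biphasic) survival model of the kind described above treats the population as a fast-decaying subpopulation plus a persistent one. The sketch below uses hypothetical rates, not calibrated APEX values.

```python
import math

# Two-stage survival: N(t) = N0 * (f*exp(-k1*t) + (1-f)*exp(-k2*t)),
# with a labile fraction f decaying at rate k1 and a persistent
# fraction (1-f) decaying at the slower rate k2. Values illustrative.

def two_stage_survival(n0, f, k1, k2, t_days):
    return n0 * (f * math.exp(-k1 * t_days) + (1.0 - f) * math.exp(-k2 * t_days))

n0 = 1e6  # hypothetical initial E. coli count per gram
for t in (0, 5, 20, 60):
    print(f"day {t:2d}: {two_stage_survival(n0, 0.9, 0.5, 0.02, t):.3e}")
```

The characteristic signature is a steep initial die-off followed by a long, slowly declining tail, which a single first-order rate cannot reproduce; tracking one such curve per manure application date matches the module design described in the abstract.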
ERIC Educational Resources Information Center
Windschitl, Mark; Andre, Thomas
1998-01-01
Investigates the effects of a constructivist versus objectivist learning environment on college students' conceptual change using a computer simulation of the human cardiovascular system as an instructional tool. Contains 33 references. (DDR)
Simulated discharge trends indicate robustness of hydrological models in a changing climate
NASA Astrophysics Data System (ADS)
Addor, Nans; Nikolova, Silviya; Seibert, Jan
2016-04-01
Assessing the robustness of hydrological models under contrasting climatic conditions should be part of any hydrological model evaluation. Robust models are particularly important for climate impact studies, as models performing well under current conditions are not necessarily capable of correctly simulating hydrological perturbations caused by climate change. A pressing issue is the usually assumed stationarity of parameter values over time. Modeling experiments using conceptual hydrological models revealed that assuming transposability of parameter values across changing climatic conditions can lead to significant biases in discharge simulations. This raises the question of whether parameter values should be modified over time to reflect changes in hydrological processes induced by climate change. Such a question reflects a focus on the contribution of internal (catchment) processes to discharge generation. Here we adopt a different perspective and explore the contribution of external forcing (i.e., changes in precipitation and temperature) to changes in discharge. We argue that in a robust hydrological model, discharge variability should be induced by changes in the boundary conditions, not by changes in parameter values. In this study, we explore how well the conceptual hydrological model HBV captures transient changes in hydrological signatures over the period 1970-2009. Our analysis focuses on research catchments in Switzerland undisturbed by human activities. The precipitation and temperature forcing are extracted from recently released 2-km gridded data sets. We use a genetic algorithm to calibrate HBV for the whole 40-year period and for the eight successive 5-year periods to assess potential trends in parameter values. Model calibration is run multiple times to account for parameter uncertainty.
We find that in alpine catchments showing a significant increase of winter discharge, this trend can be captured reasonably well with constant parameter values over the whole reference period. Further, preliminary results suggest that some trends in parameter values do not reflect changes in hydrological processes, as reported by others previously, but instead might stem from a modeling artifact related to the parameterization of evapotranspiration, which is overly sensitive to temperature increase. We adopt a trading-space-for-time approach to better understand whether robust relationships between parameter values and forcing can be established, and to critically explore the rationale behind time-dependent parameter values in conceptual hydrological models.
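The genetic-algorithm calibration strategy described above can be sketched in miniature. The following Python example is a hypothetical illustration only: HBV is replaced by a one-parameter toy recession model, and all function names, population sizes, and mutation settings are illustrative assumptions, not the study's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(42)

t = np.linspace(0, 10, 50)
obs = np.exp(-0.3 * t)  # synthetic "observed discharge" (true parameter k = 0.3)

def simulate(k, t):
    # toy single-parameter recession model standing in for HBV
    return np.exp(-k * t)

def fitness(k):
    # negative mean squared error between simulated and observed discharge
    return -np.mean((simulate(k, t) - obs) ** 2)

# minimal genetic algorithm: elitist selection, averaging crossover, mutation
pop = rng.uniform(0.0, 1.0, 20)
for _ in range(40):
    scores = np.array([fitness(k) for k in pop])
    parents = pop[np.argsort(scores)[-10:]]               # keep the best half
    children = (parents + rng.permutation(parents)) / 2.0  # crossover by averaging
    children += rng.normal(0.0, 0.02, children.size)       # small mutations
    pop = np.concatenate([parents, children])

best = pop[np.argmax([fitness(k) for k in pop])]
```

Because the best parents survive each generation unchanged, the best fitness is monotonically non-decreasing, mirroring the repeated calibration runs used in the study to account for parameter uncertainty.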
Evaluation of a pumping test of the Snake River Plain aquifer using axial-flow numerical modeling
NASA Astrophysics Data System (ADS)
Johnson, Gary S.; Frederick, David B.; Cosgrove, Donna M.
2002-06-01
The Snake River Plain aquifer in southeast Idaho is hosted in a thick sequence of layered basalts and interbedded sediments. The degree to which the layering impedes vertical flow has not been well understood, yet is a feature that may exert a substantial control on the movement of contaminants. An axial-flow numerical model, RADFLOW, was calibrated to pumping test data collected by a straddle-packer system deployed at 23 depth intervals in four observation wells to evaluate conceptual models and estimate properties of the Snake River Plain aquifer at the Idaho National Engineering and Environmental Laboratory. A delayed water-table response observed in intervals beneath a sediment interbed was best reproduced with a three-layer simulation. The results demonstrate the hydraulic significance of this interbed as a semi-confining layer. Vertical hydraulic conductivity of the sediment interbed was estimated to be about three orders of magnitude less than vertical hydraulic conductivity of the lower basalt and upper basalt units. The numerical model was capable of representing aquifer conceptual models that could not be represented with any single analytical technique. The model proved to be a useful tool for evaluating alternative conceptual models and estimating aquifer properties in this application.
Large-Scale Brain Simulation and Disorders of Consciousness. Mapping Technical and Conceptual Issues
Farisco, Michele; Kotaleski, Jeanette H.; Evers, Kathinka
2018-01-01
Modeling and simulations have gained a leading position in contemporary attempts to describe, explain, and quantitatively predict the human brain’s operations. Computer models are highly sophisticated tools developed to achieve an integrated knowledge of the brain with the aim of overcoming the actual fragmentation resulting from different neuroscientific approaches. In this paper we investigate the plausibility of simulation technologies for emulation of consciousness and the potential clinical impact of large-scale brain simulation on the assessment and care of disorders of consciousness (DOCs), e.g., Coma, Vegetative State/Unresponsive Wakefulness Syndrome, Minimally Conscious State. Notwithstanding their technical limitations, we suggest that simulation technologies may offer new solutions to old practical problems, particularly in clinical contexts. We take DOCs as an illustrative case, arguing that the simulation of neural correlates of consciousness is potentially useful for improving treatments of patients with DOCs. PMID:29740372
ERIC Educational Resources Information Center
Twissell, Adrian
2018-01-01
Abstract electronics concepts are difficult to develop because the phenomena of interest cannot be readily observed. Visualisation skills support learning about electronics and can be applied at different levels of representation and understanding (observable, symbolic and abstract). Providing learners with opportunities to make transitions…
Simulating F-22 Heavy Maintenance and Modifications Workforce Multi-Skilling
2014-03-27
Model conceptualization starts by identifying the goal of the model, invoking Stephen Covey's Second Habit in The 7 Habits of Highly Effective People: "Begin with the end in mind" (Covey, S. R. (2004). The 7 habits of highly effective people: Restoring the character ethic. New York: Free Press).
The Living Dead: Transformative Experiences in Modelling Natural Selection
ERIC Educational Resources Information Center
Petersen, Morten Rask
2017-01-01
This study considers how students change their coherent conceptual understanding of natural selection through a hands-on simulation. The results show that most students change their understanding. In addition, some students also underwent a transformative experience and used their new knowledge in a leisure time activity. These transformative…
Saul, Katherine R.; Hu, Xiao; Goehler, Craig M.; Vidt, Meghan E.; Daly, Melissa; Velisar, Anca; Murray, Wendy M.
2014-01-01
Several open-source or commercially available software platforms are widely used to develop dynamic simulations of movement. While computational approaches are conceptually similar across platforms, technical differences in implementation may influence output. We present a new upper limb dynamic model as a tool to evaluate potential differences in predictive behavior between platforms. We evaluated to what extent differences in technical implementations in popular simulation software environments result in differences in kinematic predictions for single and multijoint movements using EMG- and optimization-based approaches for deriving control signals. We illustrate the benchmarking comparison using SIMM-Dynamics Pipeline-SD/Fast and OpenSim platforms. The most substantial divergence results from differences in muscle model and actuator paths. This model is a valuable resource and is available for download by other researchers. The model, data, and simulation results presented here can be used by future researchers to benchmark other software platforms and software upgrades for these two platforms. PMID:24995410
Pesticide fate on catchment scale: conceptual modelling of stream CSIA data
NASA Astrophysics Data System (ADS)
Lutz, Stefanie R.; van der Velde, Ype; Elsayed, Omniea F.; Imfeld, Gwenaël; Lefrancq, Marie; Payraudeau, Sylvain; van Breukelen, Boris M.
2017-10-01
Compound-specific stable isotope analysis (CSIA) has proven beneficial in the characterization of contaminant degradation in groundwater, but it has never been used to assess pesticide transformation at the catchment scale. This study presents concentration and carbon CSIA data of the herbicides S-metolachlor and acetochlor from three locations (plot, drain, and catchment outlets) in a 47 ha agricultural catchment (Bas-Rhin, France). Herbicide concentrations at the catchment outlet were highest (62 µg/L) in response to an intense rainfall event following herbicide application. An increase in δ13C values of S-metolachlor and acetochlor of more than 2 ‰ during the study period indicated herbicide degradation. To assist the interpretation of these data, discharge, concentrations, and δ13C values of S-metolachlor were modelled with a conceptual mathematical model using a transport formulation based on travel-time distributions. Testing of different model setups supported the assumption that degradation half-lives (DT50) increase with soil depth, which can be straightforwardly implemented in conceptual models using travel-time distributions. Moreover, model calibration yielded an estimate of a field-integrated isotopic enrichment factor, as opposed to laboratory-based assessments of enrichment factors in closed systems. Finally, we used the model to test the potential of the Rayleigh equation, commonly applied in groundwater studies, to quantify degradation at the catchment scale. The Rayleigh equation provided conservative estimates of the extent of degradation in stream samples; however, because these largely exceeded the simulated degradation within the entire catchment, they were not representative of overall degradation at the catchment scale. The conceptual modelling approach thus enabled us to upscale sample-based CSIA information on degradation to the catchment scale.
Overall, this study demonstrates the benefit of combining monitoring and conceptual modelling of concentration and CSIA data and advocates the use of travel-time distributions for assessing pesticide fate and transport at the catchment scale.
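As a concrete illustration of the Rayleigh approach mentioned above, the following Python sketch converts an observed δ13C shift into an estimated extent of degradation. The enrichment factor and δ13C values used here are illustrative assumptions, not the field-derived values from the study.

```python
import math

def rayleigh_degradation(delta0, delta_t, epsilon):
    """Estimate the fraction of a compound degraded from a shift in
    delta-13C using the simplified Rayleigh equation:

        delta_t - delta0 = epsilon * ln(f)

    where f is the fraction of the compound remaining and epsilon is
    the isotopic enrichment factor in permil (negative for normal
    isotope effects)."""
    f = math.exp((delta_t - delta0) / epsilon)
    return 1.0 - f  # extent of degradation

# an observed shift of +2 permil with an assumed enrichment factor of -1.5 permil
extent = rayleigh_degradation(-32.0, -30.0, -1.5)
```

With these illustrative numbers the estimated extent of degradation is roughly 74 %; a smaller-magnitude (less negative) enrichment factor would imply even more degradation for the same isotopic shift, which is why field-integrated enrichment factors matter for catchment-scale estimates.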
Shuttle mission simulator hardware conceptual design report
NASA Technical Reports Server (NTRS)
Burke, J. F.
1973-01-01
The detailed shuttle mission simulator hardware requirements are discussed. The conceptual design methods, or existing technology, whereby those requirements will be fulfilled are described. Information of a general nature on the total design problem plus specific details on how these requirements are to be satisfied are reported. The configuration of the simulator is described and the capabilities for various types of training are identified.
Deep arid system hydrodynamics 1. Equilibrium states and response times in thick desert vadose zones
Walvoord, Michelle Ann; Plummer, Mitchell A.; Phillips, Fred M.; Wolfsberg, Andrew V.
2002-01-01
Quantifying moisture fluxes through deep desert soils remains difficult because of the small magnitude of the fluxes and the lack of a comprehensive model to describe flow and transport through such dry material. A particular challenge for such a model is reproducing both observed matric potential and chloride profiles. We propose a conceptual model for flow in desert vadose zones that includes isothermal and nonisothermal vapor transport and the role of desert vegetation in supporting a net upward moisture flux below the root zone. Numerical simulations incorporating this conceptual model match typical matric potential and chloride profiles. The modeling approach thereby reconciles the paradox between the recognized importance of plants, upward driving forces, and vapor flow processes in desert vadose zones and the inadequacy of the downward-only liquid flow assumption of the conventional chloride mass balance approach. Our work shows that water transport in thick desert vadose zones at steady state is usually dominated by upward vapor flow and that long response times, on the order of 10^4-10^5 years, are required to equilibrate to existing arid surface conditions. Simulation results indicate that most thick desert vadose zones have been locked in slow drying transients that began in response to a climate shift and establishment of desert vegetation many thousands of years ago.
Deterministic modelling and stochastic simulation of biochemical pathways using MATLAB.
Ullah, M; Schmidt, H; Cho, K H; Wolkenhauer, O
2006-03-01
The analysis of complex biochemical networks is conducted in two popular conceptual frameworks for modelling. The deterministic approach requires the solution of ordinary differential equations (ODEs, reaction rate equations) with concentrations as continuous state variables. The stochastic approach involves the simulation of differential-difference equations (chemical master equations, CMEs) with probabilities as variables, generating counts of molecules for chemical species as realisations of random variables drawn from the probability distribution described by the CMEs. Although there are numerous tools available, many of them free, the modelling and simulation environment MATLAB is widely used in the physical and engineering sciences. We describe a collection of MATLAB functions to construct and solve ODEs for deterministic simulation and to implement realisations of CMEs for stochastic simulation using advanced MATLAB coding (Release 14). The program was successfully applied to pathway models from the literature for both cases. The results were compared to implementations using alternative tools for dynamic modelling and simulation of biochemical networks. The aim is to provide a concise set of MATLAB functions that encourage experimentation with systems biology models. All the script files are available from www.sbi.uni-rostock.de/publications_matlab-paper.html.
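The two frameworks can be contrasted on the simplest possible pathway, an irreversible decay reaction A → ∅ with rate constant k. The sketch below uses Python rather than the MATLAB functions the abstract describes (an illustrative substitution; the function names are hypothetical): the deterministic branch solves the reaction rate equation dA/dt = -kA, while the stochastic branch runs Gillespie's stochastic simulation algorithm for the corresponding CME.

```python
import numpy as np

def deterministic(a0, k, t_end, dt=0.01):
    # Reaction rate equation dA/dt = -k*A; for this linear reaction the
    # analytic solution a(t) = a0 * exp(-k*t) is exact.
    ts = np.arange(0.0, t_end, dt)
    return ts, a0 * np.exp(-k * ts)

def gillespie(a0, k, t_end, rng):
    # Stochastic simulation algorithm (Gillespie) for A -> 0:
    # with n molecules, the propensity is k*n and the waiting time to
    # the next decay event is exponentially distributed.
    t, n = 0.0, a0
    times, counts = [t], [n]
    while n > 0 and t < t_end:
        t += rng.exponential(1.0 / (k * n))
        n -= 1
        times.append(t)
        counts.append(n)
    return np.array(times), np.array(counts)

rng = np.random.default_rng(0)
ts, a = deterministic(100, 0.5, 10.0)   # continuous concentration trajectory
tt, nn = gillespie(100, 0.5, 10.0, rng)  # one discrete molecule-count realisation
```

Averaging many stochastic realisations recovers the deterministic curve for this linear system; for nonlinear networks the two approaches can diverge, which is why tools supporting both are useful.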
A model for simulating adaptive, dynamic flows on networks: Application to petroleum infrastructure
Corbet, Thomas F.; Beyeler, Walt; Wilson, Michael L.; ...
2017-10-03
Simulation models can greatly improve decisions meant to control the consequences of disruptions to critical infrastructures. We describe a dynamic flow model on networks intended to inform analyses by those concerned about the consequences of disruptions to infrastructures and to help policy makers design robust mitigations. We conceptualize the adaptive responses of infrastructure networks to perturbations as market transactions and business decisions of operators. We approximate commodity flows in these networks by a diffusion equation, with nonlinearities introduced to model capacity limits. To illustrate the behavior and scalability of the model, we show its application first on two simple networks, then on petroleum infrastructure in the United States, where we analyze the effects of a hypothesized earthquake.
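The core idea of diffusion-like commodity flow with capacity limits can be sketched as follows. This is a minimal hypothetical illustration, not the authors' model: edge flow is proportional to the difference in node levels (a discrete diffusion equation), and clipping the flow to a capacity introduces the nonlinearity the abstract mentions.

```python
import numpy as np

def step_flows(levels, adjacency, conductance, capacity, dt):
    """Advance diffusive commodity flows on a network by one explicit
    time step. Flow on each edge is proportional to the level
    difference between its endpoints, clipped to the edge capacity."""
    n = len(levels)
    new = levels.astype(float).copy()
    for i in range(n):
        for j in range(i + 1, n):
            if adjacency[i, j]:
                flow = np.clip(conductance * (levels[i] - levels[j]),
                               -capacity, capacity)
                new[i] -= flow * dt  # commodity leaves the fuller node...
                new[j] += flow * dt  # ...and arrives at the emptier one
    return new

# two connected nodes: commodity diffuses from the fuller node to the emptier
levels = np.array([1.0, 0.0])
adjacency = np.array([[False, True], [True, False]])
new = step_flows(levels, adjacency, conductance=1.0, capacity=10.0, dt=0.1)
```

Because each edge transfer is antisymmetric, the total commodity is conserved at every step; a disruption can then be modeled by lowering an edge's capacity and watching how flows reroute.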
NASA Astrophysics Data System (ADS)
Glesener, G. B.; Aurnou, J. M.
2010-12-01
The Modeling and Educational Demonstrations Laboratory (MEDL) at UCLA is developing a mantle convection physical model to assist educators with the pedagogy of Earth’s interior processes. Our design goal consists of two components to help the learner gain conceptual understanding by means of visual interactions without the burden of distracters, which may promote alternative conceptions. Distracters may be any feature of the conceptual model that causes the learner to use an inadequate mental artifact to understand what the conceptual model is intended to convey. The first component, and most important, is a psychological component that links properties of “everyday things” (Norman, 1988) to the natural phenomenon, mantle convection. Some examples of everyday things are heat rising from a freshly popped bag of popcorn, or cold humid air falling from an open freezer. The second component is the scientific accuracy of the conceptual model. We would like to simplify the concepts for the learner without sacrificing key information that is linked to other natural phenomena the learner will come across in future science lessons. By taking into account the learner’s mental artifacts in combination with a simplified, but accurate, representation of what scientists know of the Earth’s interior, we expect the learner to have the ability to create an adequate qualitative mental simulation of mantle convection. We will be presenting some of our prototypes of this mantle convection physical model at this year’s poster session and invite constructive input from our colleagues.
Incorporating Handling Qualities Analysis into Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Lawrence, Ben
2014-01-01
This paper describes the initial development of a framework to incorporate handling qualities analyses into a rotorcraft conceptual design process. In particular, the paper describes how rotorcraft conceptual design level data can be used to generate flight dynamics models for handling qualities analyses. Also, methods are described that couple a basic stability augmentation system to the rotorcraft flight dynamics model to extend the analysis beyond that of the bare airframe. A methodology for calculating the handling qualities characteristics of the flight dynamics models and for comparing the results to ADS-33E criteria is described. Preliminary results are shown from the application of the handling qualities analysis for variations in key rotorcraft design parameters: main rotor radius, blade chord, hub stiffness, and flap moment of inertia. Varying relationships, with counteracting trends for different handling qualities criteria and different flight speeds, are exhibited, with the action of the control system playing a complex part in the outcomes. Overall, the paper demonstrates how a broad array of technical issues across flight dynamics stability and control, simulation and modeling, control law design, and handling qualities testing and evaluation had to be confronted to implement even a moderately comprehensive handling qualities analysis of relatively low-fidelity models. A key outstanding issue is how to 'close the loop' with an overall design process, and options for exploring how to feed handling qualities results back into a conceptual design process are proposed for future work.
Dealing With Unexpected Events on the Flight Deck: A Conceptual Model of Startle and Surprise.
Landman, Annemarie; Groen, Eric L; van Paassen, M M René; Bronkhorst, Adelbert W; Mulder, Max
2017-12-01
A conceptual model is proposed in order to explain pilot performance in surprising and startling situations. Today's debate around loss of control following in-flight events and the implementation of upset prevention and recovery training has highlighted the importance of pilots' ability to deal with unexpected events. Unexpected events, such as technical malfunctions or automation surprises, potentially induce a "startle factor" that may significantly impair performance. Literature on surprise, startle, resilience, and decision making is reviewed, and findings are combined into a conceptual model. A number of recent flight incident and accident cases are then used to illustrate elements of the model. Pilot perception and actions are conceptualized as being guided by "frames," or mental knowledge structures that were previously learned. Performance issues in unexpected situations can often be traced back to insufficient adaptation of one's frame to the situation. It is argued that such sensemaking or reframing processes are especially vulnerable to issues caused by startle or acute stress. Interventions should focus on (a) increasing the supply and quality of pilot frames (e.g., through practicing a variety of situations), (b) increasing pilot reframing skills (e.g., through the use of unpredictability in training scenarios), and (c) improving pilot metacognitive skills, so that inappropriate automatic responses to startle and surprise can be avoided. The model can be used to explain pilot behavior in accident cases, to design experiments and training simulations, to teach pilots metacognitive skills, and to identify intervention methods.
Lihoreau, Mathieu; Buhl, Jerome; Charleston, Michael A; Sword, Gregory A; Raubenheimer, David; Simpson, Stephen J
2015-01-01
Over recent years, modelling approaches from nutritional ecology (known as Nutritional Geometry) have been increasingly used to describe how animals and some other organisms select foods and eat them in appropriate amounts in order to maintain a balanced nutritional state maximising fitness. These nutritional strategies profoundly affect the physiology, behaviour and performance of individuals, which in turn impact their social interactions within groups and societies. Here, we present a conceptual framework to study the role of nutrition as a major ecological factor influencing the development and maintenance of social life. We first illustrate some of the mechanisms by which nutritional differences among individuals mediate social interactions in a broad range of species and ecological contexts. We then explain how studying individual- and collective-level nutrition in a common conceptual framework derived from Nutritional Geometry can bring new fundamental insights into the mechanisms and evolution of social interactions, using a combination of simulation models and manipulative experiments. PMID:25586099
Hsieh, Paul A.
2001-01-01
This report serves as a user's guide for two computer models: TopoDrive and ParticleFlow. These two-dimensional models are designed to simulate two ground-water processes: topography-driven flow and advective transport of fluid particles. To simulate topography-driven flow, the user may specify the shape of the water table, which bounds the top of the vertical flow section. To simulate transport of fluid particles, the model domain is a rectangle with overall flow from left to right. In both cases, the flow is under steady state, and the distribution of hydraulic conductivity may be specified by the user. The models compute hydraulic head, ground-water flow paths, and the movement of fluid particles. An interactive visual interface enables the user to easily and quickly explore model behavior, and thereby better understand ground-water flow processes. In this regard, TopoDrive and ParticleFlow are not intended to be comprehensive modeling tools, but are designed for modeling at the exploratory or conceptual level, for visual demonstration, and for educational purposes.
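The advective particle transport that such models compute can be illustrated with a minimal sketch. This is not the programs' actual algorithm; the explicit Euler stepping and the uniform left-to-right velocity field are illustrative assumptions standing in for a computed steady-state flow field.

```python
import numpy as np

def track_particle(velocity, x0, dt, steps):
    # Advect a fluid particle through a steady velocity field using
    # explicit Euler steps: x_{k+1} = x_k + dt * v(x_k).
    path = [np.asarray(x0, dtype=float)]
    for _ in range(steps):
        x = path[-1]
        path.append(x + dt * np.asarray(velocity(x)))
    return np.array(path)

# uniform left-to-right flow, like the overall flow in the rectangular domain
uniform = lambda x: (1.0, 0.0)
path = track_particle(uniform, (0.0, 0.5), dt=0.1, steps=10)
```

With a spatially varying velocity field (e.g., derived from a hydraulic-head solution), the same loop traces curved flow paths; smaller time steps or higher-order integrators improve accuracy near strong conductivity contrasts.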
NASA Astrophysics Data System (ADS)
Pawar, R.; Dash, Z.; Sakaki, T.; Plampin, M. R.; Lassen, R. N.; Illangasekare, T. H.; Zyvoloski, G.
2011-12-01
One of the concerns related to geologic CO2 sequestration is potential leakage of CO2 and its subsequent migration to shallow groundwater resources, leading to geochemical impacts. Developing approaches to monitor CO2 migration in shallow aquifers and mitigate leakage impacts will require improving our understanding of gas phase formation and multi-phase flow subsequent to CO2 leakage in shallow aquifers. We are utilizing an integrated approach combining laboratory experiments and numerical simulations to characterize the multi-phase flow of CO2 in shallow aquifers. The laboratory experiments involve a series of highly controlled experiments in which water with dissolved CO2 is injected into homogeneous and heterogeneous soil columns and tanks. The experimental results are used to study the effects of soil properties, temperature, pressure gradients, and heterogeneities on gas formation and migration. We utilize the Finite Element Heat and Mass (FEHM) simulator (Zyvoloski et al., 2010) to numerically model the experimental results. The numerical models capture the physics of CO2 exsolution and multi-phase fluid flow as well as sand heterogeneity. Experimental observations of pressure, temperature, and gas saturations are used to develop and constrain conceptual models for CO2 gas-phase formation and multi-phase CO2 flow in porous media. This talk will provide details of the development of conceptual models based on experimental observations, the development of numerical models for the laboratory experiments, and modelling results.
Davis, Kyle W.; Putnam, Larry D.
2013-01-01
The Ogallala aquifer is an important water resource for the Rosebud Sioux Tribe in Gregory and Tripp Counties in south-central South Dakota and is used for irrigation, public supply, domestic, and stock water supplies. To better understand groundwater flow in the Ogallala aquifer, conceptual and numerical models of groundwater flow were developed for the aquifer. A conceptual model of the Ogallala aquifer was used to analyze groundwater flow and develop a numerical model to simulate groundwater flow in the aquifer. The MODFLOW–NWT model was used to simulate transient groundwater conditions for water years 1985–2009. The model was calibrated using statistical parameter estimation techniques. Potential future scenarios were simulated using the input parameters from the calibrated model for simulations of potential future drought and future increased pumping. Transient simulations were completed with the numerical model. A 200-year transient initialization period was used to establish starting conditions for the subsequent 25-year simulation of water years 1985–2009. The 25-year simulation was discretized into three seasonal stress periods per year and used to simulate transient conditions. A single-layer model was used to simulate flow and mass balance in the Ogallala aquifer with a grid of 133 rows and 282 columns and a uniform spacing of 500 meters (1,640 feet). Regional inflow and outflow were simulated along the western and southern boundaries using specified-head cells. All other boundaries were simulated using no-flow cells. Recharge to the aquifer occurs through precipitation on the outcrop area. Model calibration was accomplished using the Parameter Estimation (PEST) program that adjusted individual model input parameters and assessed the difference between estimated and model-simulated values of hydraulic head and base flow. 
This program was designed to estimate parameter values that are statistically the most likely set of values to result in the smallest differences between simulated and observed values, within a given set of constraints. The potentiometric surface of the aquifer calculated during the 200-year initialization period established initial conditions for the transient simulation. Water levels for 38 observation wells were used to calibrate the 25-year simulation. Simulated hydraulic heads for the transient simulation were within plus or minus 20 feet of observed values for 95 percent of observation wells, and the mean absolute difference was 5.1 feet. Calibrated hydraulic conductivity ranged from 0.9 to 227 feet per day (ft/d). The annual recharge rates for the transient simulation (water years 1985–2009) ranged from 0.60 to 6.96 inches, with a mean of 3.68 inches for the Ogallala aquifer. This represents a mean recharge rate of 280.5 ft3/s for the model area. Discharge from the aquifer occurs through evapotranspiration, discharge to streams through river leakage and flow from springs and seeps, and well withdrawals. Water is withdrawn from wells for irrigation, public supply, domestic, and stock uses. Simulated mean discharge rates for water years 1985–2009 were about 185 cubic feet per second (ft3/s) for evapotranspiration, 66.7 ft3/s for discharge to streams, and 5.48 ft3/s for well withdrawals. Simulated annual evapotranspiration rates ranged from about 128 to 254 ft3/s, and outflow to streams ranged from 52.2 to 79.9 ft3/s. A sensitivity analysis was used to examine the response of the calibrated model to changes in model parameters for horizontal hydraulic conductivity, recharge, evapotranspiration, and spring and riverbed conductance. The model was most sensitive to recharge and maximum potential evapotranspiration and least sensitive to riverbed and spring conductances. 
Two potential future scenarios were simulated: a potential drought scenario and a potential increased pumping scenario. To simulate a potential drought scenario, a synthetic drought record was created, the mean of which was equal to 60 percent of the mean estimated recharge rate for the 25-year simulation period. Compared with the results of the calibrated model (non-drought simulation), the simulation representing a potential drought scenario resulted in water-level decreases of as much as 30 feet for the Ogallala aquifer. To simulate the effects of potential future increases in pumping, well withdrawal rates were increased by 50 percent from those estimated for the 25-year simulation period. Compared with the results of the calibrated model, the simulation representing an increased pumping scenario resulted in water-level decreases of as much as 26 feet for the Ogallala aquifer. Groundwater budgets for the potential future scenario simulations were compared with the transient simulation representing water years 1985–2009. The simulation representing a potential drought scenario resulted in lower aquifer recharge from precipitation and decreased discharge from streams, springs, seeps, and evapotranspiration. The simulation representing a potential increased pumping scenario was similar to results from the transient simulation, with a slight increase in well withdrawals and a slight decrease in discharge from river leakage and evapotranspiration. This numerical model is suitable as a tool that could be used to better understand the flow system of the Ogallala aquifer, to approximate hydraulic heads in the aquifer, and to estimate discharge to rivers, springs, and seeps in the study area. The model also is useful to help assess the response of the aquifer to additional stresses, including potential drought conditions and increased well withdrawals.
Global Sensitivity Analysis for Process Identification under Model Uncertainty
NASA Astrophysics Data System (ADS)
Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.
2015-12-01
The environmental system consists of various physical, chemical, and biological processes, and environmental models are built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize them. While global sensitivity analysis has been widely used to identify important processes, process identification is typically based on a deterministic process conceptualization that uses a single model to represent each process. However, environmental systems are complex, and it often happens that a single process may be simulated by multiple alternative models. Ignoring model uncertainty in process identification may lead to biased identification, in that processes identified as important may not be so in the real world. This study addresses this problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concepts of Sobol sensitivity analysis and model averaging. Similar to the use of Sobol sensitivity analysis to identify important parameters, our new method evaluates the change in variance when a process is fixed at each of its alternative conceptualizations. The variance accounts for both parametric and model uncertainty using model averaging. The method is demonstrated in a synthetic groundwater modeling study that considers a recharge process and a parameterization process, each with two alternative models. Important processes of groundwater flow and transport are evaluated using our new method. The method is mathematically general and can be applied to a wide range of environmental problems.
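The idea of measuring process importance by fixing a process at each of its conceptualizations can be sketched numerically. The toy system below is entirely hypothetical (the two recharge models, the model-averaging weights, and the parameter ranges are all illustrative assumptions): the sensitivity of the output to the choice of process conceptualization is the share of total variance that disappears when that choice is fixed.

```python
import numpy as np

rng = np.random.default_rng(1)

def recharge_a(p):        # alternative conceptualization A of the recharge process
    return 2.0 * p

def recharge_b(p):        # alternative conceptualization B of the recharge process
    return p ** 2

def output(recharge, p, q):
    # toy system output: depends on the recharge process and a second parameter q
    return recharge(p) + 0.5 * q

n = 10_000
p = rng.uniform(0.0, 1.0, n)   # uncertain parameter of the recharge process
q = rng.uniform(0.0, 1.0, n)   # uncertain parameter of a second process
models = [recharge_a, recharge_b]
weights = [0.5, 0.5]           # model-averaging weights for A and B

# total variance: sample over parameters AND the model choice
choice = rng.choice(2, n, p=weights)
y_total = np.where(choice == 0,
                   output(recharge_a, p, q),
                   output(recharge_b, p, q))
var_total = y_total.var()

# expected remaining variance when the conceptualization is fixed
var_fixed = sum(w * output(m, p, q).var() for w, m in zip(weights, models))

# Sobol-style process sensitivity: share of variance explained by which
# conceptualization of the recharge process is chosen
s_process = 1.0 - var_fixed / var_total
```

A value of `s_process` near zero would mean the alternative conceptualizations are interchangeable for this output; a large value flags the process as one whose conceptual model uncertainty deserves characterization effort.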
NASA Astrophysics Data System (ADS)
Hartmann, A. J.; Ireson, A. M.
2017-12-01
Chalk aquifers represent an important source of drinking water in the UK. Because of their fractured-porous structure, Chalk aquifers are characterized by highly dynamic groundwater fluctuations that enhance the risk of groundwater flooding. This risk can be assessed with physically based groundwater models, but reliable results require a-priori information about the distribution of hydraulic conductivities and porosities, which is often not available. For that reason, conceptual simulation models are often used to predict groundwater behaviour. They commonly require calibration against historic groundwater observations, so their predictive performance may degrade significantly for system states that did not occur within the calibration time series. In this study, we calibrate a conceptual model to observed groundwater levels at several locations within a Chalk system in Southern England. No groundwater flooding occurred during the calibration period. We then apply our model to predict the groundwater dynamics of the system over a period that includes a groundwater flooding event. We show that the calibrated model provides reasonable predictions before and after the flooding event but overestimates groundwater levels during the event. After modifying the model structure to include topographic information, the model is capable of predicting the groundwater flooding event even though no flooding occurred in the calibration period. Although straightforward, our approach shows how conceptual process-based models can be applied to predict system states and dynamics that did not occur in the calibration period. We believe such an approach can be transferred to similar cases, especially to regions where rainfall intensities are expected to trigger processes and system states that may not yet have been observed.
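The structural modification can be illustrated with a deliberately simple toy, not the authors' model: a one-store conceptual groundwater reservoir whose simulated level, once topographic information is added, is capped at the ground surface so that excess recharge leaves as surface runoff instead of inflating the groundwater level. All parameters below are invented.

```python
# Minimal sketch (assumed parameters) of a linear-store groundwater model
# with and without a topographic cap on the simulated groundwater level.
def simulate(recharge, k=0.05, sy=0.02, h0=50.0, ground=None):
    h, levels = h0, []
    for r in recharge:
        h = h + r / sy - k * h           # recharge raises level, linear drainage lowers it
        if ground is not None and h > ground:
            h = ground                   # topographic control: excess becomes surface flow
        levels.append(h)
    return levels

# A wet spell embedded in low background recharge (m per time step).
recharge = [0.01] * 30 + [0.30] * 10 + [0.01] * 30
free = simulate(recharge)                 # original structure: level unbounded
capped = simulate(recharge, ground=90.0)  # modified structure with topography

print(max(free), max(capped))  # the cap limits the simulated flood peak
```

The uncapped run overshoots the ground surface during the wet spell, mimicking the overestimation the study reports for the uncorrected model.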
Jahn, Beate; Theurl, Engelbert; Siebert, Uwe; Pfeiffer, Karl-Peter
2010-01-01
In most decision-analytic models in health care, it is assumed that treatment occurs without delay and that all required resources are available. Therefore, waiting times caused by limited resources, and their impact on treatment effects and costs, often remain unconsidered. Queuing theory enables mathematical analysis and the derivation of several performance measures of queuing systems. Nevertheless, an analytical approach with closed formulas is not always possible, so simulation techniques such as discrete event simulation are used to evaluate systems that include queuing or waiting. Including queuing in decision-analytic models requires a basic knowledge of queuing theory and of the underlying interrelationships. This tutorial introduces queuing theory; analysts and decision-makers gain an understanding of queue characteristics, modeling features, and the strengths of the approach. Conceptual issues are covered, but the emphasis is on practical issues such as modeling the arrival of patients. The treatment of coronary artery disease with percutaneous coronary intervention including stent placement serves as an illustrative queuing example. Discrete event simulation is applied to explicitly model resource capacities and to incorporate waiting lines and queues in the decision-analytic modeling example.
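The contrast between closed formulas and simulation can be shown on the simplest queue. For an M/M/1 queue with utilization rho = lambda/mu, the mean number in system is L = rho/(1 - rho); a discrete-event simulation should recover the same value via Little's law, L = lambda * W. The rates below are illustrative, not from the tutorial.

```python
# M/M/1: analytic mean number in system vs. a discrete-event estimate.
import random

random.seed(7)
lam, mu = 0.8, 1.0                  # illustrative arrival and service rates
rho = lam / mu
L_analytic = rho / (1 - rho)        # closed formula for M/M/1

def simulate_mm1(n=100000):
    """Simulate n customers through a single FIFO server."""
    t_arrive, t_free, total_sojourn = 0.0, 0.0, 0.0
    for _ in range(n):
        t_arrive += random.expovariate(lam)    # next arrival
        start = max(t_arrive, t_free)          # wait if the server is busy
        t_free = start + random.expovariate(mu)
        total_sojourn += t_free - t_arrive     # time in system
    W = total_sojourn / n                      # mean time in system
    return lam * W                             # Little's law: L = lambda * W

L_sim = simulate_mm1()
print(L_analytic, L_sim)  # the simulated value approaches the analytic one
```

For systems without closed formulas (priority rules, capacity limits, balking), only the simulation route generalizes, which is the tutorial's point.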
Regionalized rainfall-runoff model to estimate low flow indices
NASA Astrophysics Data System (ADS)
Garcia, Florine; Folton, Nathalie; Oudin, Ludovic
2016-04-01
Estimating low flow indices is of paramount importance for managing water resources and assessing risk. These indices are derived from river discharges measured at gauged stations, but the lack of observations at ungauged sites makes it necessary to develop methods to estimate low flow indices from discharges observed in neighboring catchments and from catchment characteristics. Different estimation methods exist; regression or geostatistical methods applied directly to the low flow indices are the most common. A less common approach consists of regionalizing rainfall-runoff model parameters, from catchment characteristics or by spatial proximity, and estimating low flow indices from the simulated hydrographs. Irstea developed GR2M-LoiEau, a conceptual monthly rainfall-runoff model combined with a regionalized model of snow storage and melt. GR2M-LoiEau relies on only two parameters, which are regionalized and mapped throughout France, and it allows monthly reference low flow indices to be mapped. The input data come from SAFRAN, the distributed mesoscale atmospheric analysis system, which provides daily solid and liquid precipitation and temperature data for the entire French territory. To fully exploit these data and to estimate daily low flow indices, a new version of GR-LoiEau has been developed at a daily time step. The aim of this work is to develop and regionalize a GR-LoiEau model that can provide daily, monthly, or annual estimates of low flow indices while keeping only a few parameters, a major advantage for regionalization. This work includes two parts. On the one hand, a daily conceptual rainfall-runoff model is developed with only three parameters in order to simulate daily and monthly low flow indices, mean annual runoff, and seasonality.
On the other hand, different regionalization methods, based on spatial proximity and similarity, are tested to estimate the model parameters and to simulate low flow indices at ungauged sites. The analysis is carried out on 691 French catchments representative of various hydro-meteorological behaviors. The results are validated with a cross-validation procedure and compared with those obtained with GR4J, a conceptual rainfall-runoff model that already provides daily estimates but involves four parameters that cannot easily be regionalized.
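The structure of such a parsimonious daily model can be sketched generically. The code below is a hypothetical three-parameter stand-in (soil store capacity x1, percolation fraction x2, routing-store recession x3), not the actual GR-LoiEau equations, used to show how a low-flow index can be read off a simulated hydrograph.

```python
# Hypothetical three-parameter daily conceptual rainfall-runoff model:
# a soil moisture store feeding a linear routing store.
def run_model(precip, pet, x1=300.0, x2=0.3, x3=0.1):
    s, r, q = 0.5 * x1, 10.0, []           # soil store, routing store, discharge
    for p, e in zip(precip, pet):
        s = min(x1, s + p)                  # fill the soil store (spill ignored here)
        aet = min(s, e * s / x1)            # actual ET limited by soil moisture
        s -= aet
        perc = x2 * s                       # a fraction of storage percolates
        s -= perc
        r += perc
        out = x3 * r                        # linear routing store releases discharge
        r -= out
        q.append(out)
    return q

# Synthetic forcing: weekly rain pulses, constant potential ET (mm/day).
precip = [5.0 if d % 7 == 0 else 0.0 for d in range(365)]
pet = [2.0] * 365
q = run_model(precip, pet)
# A common low-flow index: minimum 7-day mean discharge.
q7_min = min(sum(q[i:i + 7]) / 7 for i in range(len(q) - 6))
print(round(q7_min, 3))
```

Regionalization would then consist of predicting (x1, x2, x3) at ungauged sites from catchment characteristics or neighboring calibrated catchments, which is tractable precisely because there are so few parameters.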
NASA Astrophysics Data System (ADS)
Xu, Zexuan; Hu, Bill
2016-04-01
Dual-permeability karst aquifers, in which porous media and conduit networks have significantly different hydrological characteristics, are widely distributed in the world. Discrete-continuum numerical models, such as MODFLOW-CFP and CFPv2, have been verified as appropriate approaches for simulating groundwater flow and solute transport in karst hydrogeology. Meanwhile, seawater intrusion, associated with contamination of fresh groundwater resources, has been observed and investigated in a number of coastal aquifers, especially under conditions of sea level rise. Density-dependent numerical models, including SEAWAT, can quantitatively evaluate seawater/freshwater interaction processes. A numerical model of variable-density flow and solute transport - conduit flow process (VDFST-CFP) is developed to provide a better description of seawater intrusion and submarine groundwater discharge in a coastal karst aquifer with conduits. The coupled discrete-continuum VDFST-CFP model applies the Darcy-Weisbach equation to simulate non-laminar groundwater flow in the conduit system, which is conceptualized and discretized as pipes, while the Darcy equation is still used in the continuum porous media. Density-dependent groundwater flow and solute transport equations with appropriate density terms in both the conduit and porous media systems are derived and numerically solved using a standard finite-difference method with an implicit iteration procedure. Synthetic horizontal and vertical benchmarks are created to validate the newly developed VDFST-CFP model by comparison with other numerical models: the variable-density SEAWAT, the coupled constant-density MODFLOW/MT3DMS, and the discrete-continuum CFPv2/UMT3D. The VDFST-CFP model improves the simulation of density-dependent seawater/freshwater mixing processes and of exchanges between conduit and matrix.
Continuum numerical models greatly overestimate the flow rate under turbulent flow conditions, whereas discrete-continuum models provide more accurate results. Parameter sensitivity analysis indicates that the conduit diameter and friction factor and the matrix hydraulic conductivity and porosity are important parameters that significantly affect the variable-density flow and solute transport simulation. The pros and cons of the model assumptions, conceptual simplifications, and numerical techniques in VDFST-CFP are discussed. In general, the VDFST-CFP model is a methodological innovation in numerical modeling and can be applied to quantitatively evaluate seawater/freshwater interaction in coastal karst aquifers. Keywords: Discrete-continuum numerical model; Variable density flow and transport; Coastal karst aquifer; Non-laminar flow
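The two flow laws that the discrete-continuum coupling combines can be compared directly. The sketch below uses assumed geometry and properties (diameter, friction factor, hydraulic conductivity) to contrast Darcy-Weisbach conduit flow with Darcy matrix flow over the same head drop; it illustrates the regime contrast, not the VDFST-CFP solver itself.

```python
# Illustrative comparison: non-laminar Darcy-Weisbach flow in a conduit
# versus laminar Darcy flow through an equal cross-section of matrix.
import math

g = 9.81                     # gravitational acceleration (m/s^2)
dh, length = 0.5, 1000.0     # head drop (m) over flow-path length (m)
d = 0.5                      # conduit diameter (m, assumed)
f = 0.03                     # Darcy-Weisbach friction factor (assumed)

# Conduit: h_f = f * (L/d) * v^2 / (2g)  ->  solve for velocity v.
v_conduit = math.sqrt(2 * g * dh * d / (f * length))
area = math.pi * d ** 2 / 4
q_conduit = v_conduit * area                # volumetric flux (m^3/s)

# Matrix: Darcy's law q = K * i * A over the same cross-section.
K = 1e-4                     # hydraulic conductivity (m/s, assumed)
q_matrix = K * (dh / length) * area

print(q_conduit, q_matrix)   # conduit flux dominates by orders of magnitude
```

The huge flux contrast under an identical gradient is why treating conduits as equivalent porous media (a pure continuum model) misrepresents turbulent conduit flow.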
Watershed Models for Decision Support for Inflows to Potholes Reservoir, Washington
Mastin, Mark C.
2009-01-01
A set of watershed models for four basins (Crab Creek, Rocky Ford Creek, Rocky Coulee, and Lind Coulee), draining into Potholes Reservoir in east-central Washington, was developed as part of a decision support system to aid the U.S. Department of the Interior, Bureau of Reclamation, in managing water resources in east-central Washington State. The project is part of the collaborative U.S. Geological Survey and Bureau of Reclamation Watershed and River Systems Management Program. A conceptual model of hydrology is outlined for the study area that highlights the processes that must be represented to accurately simulate discharge under a wide range of conditions. The conceptual model identified the following factors as significant for accurate discharge simulations: (1) influence of frozen ground on peak discharge, (2) evaporation and ground-water flow as major pathways in the system, (3) channel losses, and (4) influence of irrigation practices on reducing or increasing discharge. The Modular Modeling System was used to create a watershed model for the four study basins by combining standard Precipitation Runoff Modeling System modules with modified modules from a previous study and newly modified modules. The model proved unreliable in simulating peak-flow discharge because the index used to track frozen ground conditions was not reliable. Simulated mean monthly and mean annual discharges were more reliable. Data from seven USGS streamflow-gaging stations were compared with simulated discharge for model calibration and evaluation. Mean annual differences between simulated and observed discharge varied from 1.2 to 13.8 percent for all stations used in the comparisons except one station on a regional ground-water discharge stream.
Two-thirds of the mean monthly percent differences between simulated and observed mean discharge for these six stations were between -20 and 240 percent, or in absolute terms, between -0.8 and 11 cubic feet per second. A graphical user interface was developed so the user can easily run the model, make runoff forecasts, and evaluate the results. The models, however, are not reliable for managing short-term operations because of their demonstrated inability to match individual storm peaks and individual monthly discharge values. Short-term forecasting may be improved with real-time monitoring of the extent of frozen ground and the snow-water equivalent in the basin. Despite the models' unreliability for short-term runoff forecasts, they are useful in providing long-term, time-series discharge data where no observed data exist.
Improving the XAJ Model on the Basis of Mass-Energy Balance
NASA Astrophysics Data System (ADS)
Fang, Yuanhao; Corbari, Chiara; Zhang, Xingnan; Mancini, Marco
2014-11-01
The Xin'anjiang (XAJ) model is a conceptual model developed by the group led by Prof. Ren-Jun Zhao. It takes pan evaporation as one of its inputs and computes the effective evapotranspiration (ET) of the catchment by mass balance. Such a scheme ensures good discharge simulation performance but has obvious defects, one of which is that the effective ET is spatially constant over the computation unit, neglecting the spatial variation of the variables that influence it; as a result, the XAJ simulations of ET and soil moisture (SM) are less reliable than its discharge simulations. In this study, the XAJ model was improved to employ both energy and mass balance to compute ET, following the energy-mass balance scheme of the FEST-EWB model.
NASA Astrophysics Data System (ADS)
Liu, Z.; LU, G.; He, H.; Wu, Z.; He, J.
2017-12-01
Reliable drought prediction is fundamental for seasonal water management. Because drought development is closely related to the spatio-temporal evolution of large-scale circulation patterns, we develop a conceptual prediction model of seasonal drought processes based on atmospheric/oceanic Standardized Anomalies (SA). The model is essentially a synchronous stepwise regression between 90-day-accumulated atmospheric/oceanic SA-based predictors and the 3-month SPI updated daily (SPI3). It is forced with forecasted atmospheric and oceanic variables retrieved from seasonal climate forecast systems and, after year-to-year calibration, can make seamless drought predictions for operational use. Four severe seasonal regional drought processes in China were simulated and predicted using the NCEP/NCAR reanalysis datasets and the NCEP Climate Forecast System Version 2 (CFSv2) operational forecast datasets, respectively. With real-time correction for operational application, model application to these four recent events revealed that the model captures drought development well but is weak in predicting severity. Besides underestimating drought peaks, the model may mistakenly predict drought relief as drought recession; this weakness may be associated with the precipitation-causing weather patterns during drought relief. Initial analysis of the predicted 90-day SPI3 curves shows that the 2009/2010 drought in Southwest China and the 2014 drought in North China can be simulated and predicted well even at lead times of 1-75 days, whereas a lead time of 1-45 days is feasible and acceptable for the 2011 droughts in Southwest China and East China, beyond which the simulated and predicted developments clearly diverge.
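The standardized-anomaly building block behind the predictors can be sketched on synthetic data: accumulate a variable over a 90-day window, then standardize against the mean and standard deviation of that accumulation across years, so that strongly negative values flag dry years. The toy data and the -1.0 threshold below are illustrative assumptions.

```python
# Minimal sketch of a 90-day-accumulated standardized anomaly (SA).
import statistics

years, window = 30, 90
# Toy "rainfall": a 5-year cycle plus a within-window trend.
daily = [[(y % 5) + 0.1 * d for d in range(window)] for y in range(years)]
accum = [sum(series) for series in daily]     # 90-day accumulation per year

mu = statistics.mean(accum)
sigma = statistics.stdev(accum)
sa = [(a - mu) / sigma for a in accum]        # standardized anomalies

dry_years = [y for y, s in enumerate(sa) if s < -1.0]   # unusually dry years
print(dry_years)
```

In the paper's setup, such SA series (for atmospheric/oceanic fields rather than rainfall) serve as regression predictors of SPI3; the regression step itself is omitted here.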
Ihrke, Matthias; Brennen, Tim
2011-01-01
In this paper, three experiments and corresponding model simulations are reported that investigate the priming of famous name recognition in order to explore the structure of the part of the semantic system dealing with people. Consistent with empirical findings, novel computational simulations using Burton et al.'s interactive activation and competition model point to a conceptual distinction between how priming is initiated in single- and double-familiarity tasks, indicating that priming should be weaker or non-existent for the single-familiarity task. Experiment 1 demonstrates that, within a double-familiarity framework using famous names, categorical and associative priming are reliable effects. Pushing the model to the limit, it predicts that pairs of celebrities who are neither associatively nor categorically related but who share a single biographical feature, both having died in a car crash for example, should prime each other. Experiment 2 investigated this in a double-familiarity task but the effect was not observed. We therefore simulated and realized a pairwise learning task that was conceptually similar to the double-familiarity-decision task but allowed the underlying connections to be strengthened. Priming based on a single biographical feature was found both in the simulations and in the experiment. The effect was not due to visual or name similarity, which were controlled for, and participants did not report using the biographical links between the people to learn the pairs. The results are interpreted to lend further support to structural models of the memory for persons. Furthermore, the results are consistent with the idea that episodic features known about people are stored in semantic memory and are automatically activated when encountering that person. PMID:21687446
ERIC Educational Resources Information Center
Fan, Xinxin; Geelan, David; Gillies, Robyn
2018-01-01
This study investigated the effectiveness of a novel inquiry-based instructional sequence using interactive simulations for supporting students' development of conceptual understanding, inquiry process skills and confidence in learning. The study, conducted in Beijing, involved two teachers and 117 students in four classes. The teachers…
ERIC Educational Resources Information Center
Chambers, Jay G.; Parrish, Thomas B.
The Resource Cost Model (RCM) is a resource management system that combines the technical advantages of sophisticated computer simulation software with the practical benefits of group decision making to provide detailed information about educational program costs. The first section of this document introduces the conceptual framework underlying…
The Impact of Inflation on Endowment Assets
ERIC Educational Resources Information Center
Birkeland, Kathryn; Carr, David L.; Lavin, Angeline M.
2013-01-01
Maintaining spending power in real terms (current) while preserving an endowment's value in real terms (future) is the crux of intergenerational equity. Tobin's (1974) model provides the conceptual basis on which simulations were developed to study the impact of various inflation (0%, TIPS, CPI, HECA, and HEPI) and new giving scenarios ($0, $4…
USDA-ARS?s Scientific Manuscript database
Martini and Habeck (2014) correctly describe the conceptual simulation model of Byers (2013) where molecules in an odor filament pass by an antenna causing an electrophysiological antennographic (EAG) response that is proportional to how many of the receptors are hit at least once by a molecule. Inc...
Conceptualizing intragroup and intergroup dynamics within a controlled crowd evacuation.
Elzie, Terra; Frydenlund, Erika; Collins, Andrew J; Robinson, R Michael
2015-01-01
Social dynamics play a critical role in successful pedestrian evacuations. Crowd modeling research has made progress in capturing the way individual and group dynamics affect evacuations; however, few studies have simultaneously examined how individuals and groups interact with one another during egress. To address this gap, the researchers present a conceptual agent-based model (ABM) designed to study the ways in which autonomous, heterogeneous, decision-making individuals negotiate intragroup and intergroup behavior while exiting a large venue. A key feature of this proposed model is the examination of the dynamics among and between various groupings, where heterogeneity at the individual level dynamically affects group behavior and subsequently group/group interactions. ABM provides a means of representing the important social factors that affect decision making among diverse social groups. Expanding on the 2013 work of Vizzari et al., the researchers focus specifically on social factors and decision making at the individual/group and group/group levels to more realistically portray dynamic crowd systems during a pedestrian evacuation. By developing a model with individual, intragroup, and intergroup interactions, the ABM provides a more representative approximation of real-world crowd egress. The simulation will enable more informed planning by disaster managers, emergency planners, and other decision makers. This pedestrian behavioral concept is one piece of a larger simulation model. Future research will build toward an integrated model capturing decision-making interactions between pedestrians and vehicles that affect evacuation outcomes.
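The intragroup/intergroup idea can be illustrated with a deliberately small toy ABM, not the authors' model: each agent moves toward the exit while also being pulled toward its own group's centroid, so groups stay together as they egress. The weights and positions below are invented.

```python
# Toy agent-based sketch: goal attraction plus intragroup cohesion.
def step(agents, exit_pos=(0.0, 0.0), w_goal=0.6, w_group=0.4, dt=0.1):
    # Compute each group's centroid.
    groups = {}
    for x, y, gid in agents:
        groups.setdefault(gid, []).append((x, y))
    cent = {g: (sum(p[0] for p in ps) / len(ps),
                sum(p[1] for p in ps) / len(ps))
            for g, ps in groups.items()}
    new = []
    for x, y, gid in agents:
        gx, gy = exit_pos[0] - x, exit_pos[1] - y      # pull toward the exit
        cx, cy = cent[gid][0] - x, cent[gid][1] - y    # pull toward own group
        new.append((x + dt * (w_goal * gx + w_group * cx),
                    y + dt * (w_goal * gy + w_group * cy), gid))
    return new

# Two members of group 1 and a lone member of group 2.
agents = [(10.0, 0.0, 1), (12.0, 1.0, 1), (8.0, -4.0, 2)]
for _ in range(50):
    agents = step(agents)
print(agents)  # all agents converge on the exit; group 1 stays clustered
```

Intergroup effects (avoidance, competition for the exit) would enter as additional terms between group centroids; heterogeneity would make the weights agent-specific, which is the decision-making layer the abstract describes.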
Conceptualizing socio‐hydrological drought processes: The case of the Maya collapse
Kuil, Linda; Carr, Gemma; Viglione, Alberto; Prskawetz, Alexia; Blöschl, Günter
2016-01-01
With population growth, increasing water demands and climate change the need to understand the current and future pathways to water security is becoming more pressing. To contribute to addressing this challenge, we examine the link between water stress and society through socio‐hydrological modeling. We conceptualize the interactions between an agricultural society with its environment in a stylized way. We apply the model to the case of the ancient Maya, a population that experienced a peak during the Classic Period (AD 600–830) and then declined during the ninth century. The hypothesis that modest drought periods played a major role in the society's collapse is explored. Simulating plausible feedbacks between water and society we show that a modest reduction in rainfall may lead to an 80% population collapse. Population density and crop sensitivity to droughts, however, may play an equally important role. The simulations indicate that construction of reservoirs results in less frequent drought impacts, but if the reservoirs run dry, drought impact may be more severe and the population drop may be larger. PMID:27840455
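The water-society feedback described here can be caricatured in a few lines. The coefficients below are invented, and the feedbacks are linear, so this toy produces a drop roughly proportional to the rainfall deficit rather than the disproportionate collapse the nonlinear study model reports; it only illustrates the coupling structure.

```python
# Stylized sketch: population grows when rainfall-fed supply meets demand
# and declines under water stress. All coefficients are invented.
def simulate(rainfall, pop0=100.0, growth=0.02, sensitivity=0.5):
    pop, history = pop0, []
    for r in rainfall:
        demand = pop                   # water demand proportional to population
        supply = 120.0 * r             # supply scales with rainfall
        stress = max(0.0, 1.0 - supply / demand)
        pop += growth * pop * (1.0 - stress) - sensitivity * pop * stress
        history.append(pop)
    return history

normal = simulate([1.0] * 100)                             # stationary climate
drought = simulate([1.0] * 60 + [0.8] * 20 + [1.0] * 20)   # modest 20% deficit

drop = 1.0 - min(drought) / max(drought)
print(round(drop, 2))  # population falls during the deficit, then recovers
```

Adding reservoir storage would buffer short deficits but, as the abstract notes, could make the eventual impact more severe once storage is exhausted.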
Advanced Design Methodology for Robust Aircraft Sizing and Synthesis
NASA Technical Reports Server (NTRS)
Mavris, Dimitri N.
1997-01-01
Contract efforts are focused on refining the Robust Design Methodology for conceptual aircraft design. Robust Design Simulation (RDS) was developed earlier as a potential solution to the need for rapid trade-offs that account for risk, conflict, and uncertainty. The core of the simulation revolves around Response Surface Equations as approximations of bounded design spaces. An ongoing investigation concerns the advantages of using neural networks in conceptual design. Thought was also given to the development of a systematic way to choose or create a baseline configuration based on specific mission requirements. An expert system was developed that selects aerodynamics, performance, and weights models from several configurations based on the user's mission requirements for a subsonic civil transport. The research has also resulted in a step-by-step illustration of how to use the AMV method for distribution generation and for the search for robust design solutions to multivariate constrained problems.
Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S
2016-05-01
Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.
Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando
2017-01-01
Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) initial conceptual stage for mapping structural criteria and parameters of the simulation training framework and scenario development (n = 16), (2) training facility design using Distributed Simulation, (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for realism of the simulation training framework, realism of the training facility and relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally.
This corresponded with the qualitative component, which augmented the quantitative findings; trainees' user feedback was used to perform iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from conceptual, development, and implementation stages, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.
Linking stressors and ecological responses
Gentile, J.H.; Solomon, K.R.; Butcher, J.B.; Harrass, M.; Landis, W.G.; Power, M.; Rattner, B.A.; Warren-Hicks, W.J.; Wenger, R.; Foran, Jeffery A.; Ferenc, Susan A.
1999-01-01
To characterize risk, it is necessary to quantify the linkages and interactions between chemical, physical and biological stressors and endpoints in the conceptual framework for ecological risk assessment (ERA). This can present challenges in a multiple stressor analysis, and it will not always be possible to develop a quantitative stressor-response profile. This review commences with a conceptual representation of the problem of developing a linkage analysis for multiple stressors and responses. The remainder of the review surveys a variety of mathematical and statistical methods (e.g., ranking methods, matrix models, multivariate dose-response for mixtures, indices, visualization, simulation modeling and decision-oriented methods) for accomplishing the linkage analysis for multiple stressors. Describing the relationships between multiple stressors and ecological effects is a critical component of 'effects assessment' in the ecological risk assessment framework.
Lindner-Lunsford, J. B.; Ellis, S.R.
1987-01-01
Multievent, conceptually based models and a single-event, multiple linear-regression model for estimating storm-runoff quantity and quality from urban areas were calibrated and verified for four small (57 to 167 acres) basins in the Denver metropolitan area, Colorado. The basins represented different land-use types - light commercial, single-family housing, and multi-family housing. Both types of models were calibrated using the same data set for each basin. A comparison was made between the storm-runoff volume, peak flow, and storm-runoff loads of seven water quality constituents simulated by each of the models by use of identical verification data sets. The models studied were the U.S. Geological Survey 's Distributed Routing Rainfall-Runoff Model-Version II (DR3M-II) (a runoff-quantity model designed for urban areas), and a multievent urban runoff quality model (DR3M-QUAL). Water quality constituents modeled were chemical oxygen demand, total suspended solids, total nitrogen, total phosphorus, total lead, total manganese, and total zinc. (USGS)
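The single-event regression approach can be sketched on synthetic storms. The example below is an assumed illustration, not the USGS models: a log-log ordinary-least-squares fit of storm-runoff load against rainfall depth, the simplest form such a regression model takes.

```python
# Toy single-event regression: log(load) as a linear function of log(rainfall),
# fitted by ordinary least squares on synthetic storm data.
import math
import random

random.seed(3)
rain = [0.2 + 0.1 * i for i in range(20)]        # storm rainfall depths (inches)
# Synthetic "observed" loads generated from a known log-log relation plus noise.
load = [math.exp(0.5 + 1.2 * math.log(r) + random.gauss(0, 0.05)) for r in rain]

x = [math.log(r) for r in rain]
y = [math.log(l) for l in load]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
intercept = ybar - slope * xbar

print(round(slope, 2), round(intercept, 2))  # recovers roughly (1.2, 0.5)
```

A multievent conceptual model (like DR3M-II) instead simulates the physical rainfall-runoff process continuously, which is exactly the contrast the study evaluates; a real regression model would also include further predictors such as impervious area and antecedent conditions.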
OntoVIP: an ontology for the annotation of object models used for medical image simulation.
Gibaud, Bernard; Forestier, Germain; Benoit-Cattin, Hugues; Cervenansky, Frédéric; Clarysse, Patrick; Friboulet, Denis; Gaignard, Alban; Hugonnard, Patrick; Lartizien, Carole; Liebgott, Hervé; Montagnat, Johan; Tabary, Joachim; Glatard, Tristan
2014-12-01
This paper describes the creation of a comprehensive conceptualization of object models used in medical image simulation, suitable for major imaging modalities and simulators. The goal is to create an application ontology that can be used to annotate the models in a repository integrated in the Virtual Imaging Platform (VIP), to facilitate their sharing and reuse. Annotations make the anatomical, physiological and pathophysiological content of the object models explicit. In such an interdisciplinary context, we chose to rely on a common integration framework provided by a foundational ontology, which facilitates the consistent integration of the various modules extracted from several existing ontologies, i.e. FMA, PATO, MPATH, RadLex and ChEBI. Emphasis is put on the methodology for achieving this extraction and integration. The most salient aspects of the ontology are presented, especially the organization in model layers, as well as its use to browse and query the model repository. Copyright © 2014 Elsevier Inc. All rights reserved.
Assessing the future of air freight
NASA Technical Reports Server (NTRS)
Dajani, J. S.
1977-01-01
The role of air cargo in the current transportation system in the United States is explored. Methods for assessing the future role of this mode of transportation include the use of continuous-time recursive systems modeling for the simulation of different components of the air freight system, as well as for the development of alternative future scenarios which may result from different policy actions. A basic conceptual framework for conducting such a dynamic simulation is presented within the context of the air freight industry. Some needs for further research are identified. The benefits, limitations, pitfalls, and problems usually associated with large-scale systems models are examined.
Gray, Kathleen; Sockolow, Paulina
2016-02-24
Contributing to health informatics research means using conceptual models that are integrative and explain the research in terms of the two broad domains of health science and information science. However, it can be hard for novice health informatics researchers to find exemplars and guidelines in working with integrative conceptual models. The aim of this paper is to support the use of integrative conceptual models in research on information and communication technologies in the health sector, and to encourage discussion of these conceptual models in scholarly forums. A two-part method was used to summarize and structure ideas about how to work effectively with conceptual models in health informatics research that included (1) a selective review and summary of the literature of conceptual models; and (2) the construction of a step-by-step approach to developing a conceptual model. The seven-step methodology for developing conceptual models in health informatics research explained in this paper involves (1) acknowledging the limitations of health science and information science conceptual models; (2) giving a rationale for one's choice of integrative conceptual model; (3) explicating a conceptual model verbally and graphically; (4) seeking feedback about the conceptual model from stakeholders in both the health science and information science domains; (5) aligning a conceptual model with an appropriate research plan; (6) adapting a conceptual model in response to new knowledge over time; and (7) disseminating conceptual models in scholarly and scientific forums. Making explicit the conceptual model that underpins a health informatics research project can contribute to increasing the number of well-formed and strongly grounded health informatics research projects. This explication has distinct benefits for researchers in training, research teams, and researchers and practitioners in information, health, and other disciplines.
A-DROP: A predictive model for the formation of oil particle aggregates (OPAs).
Zhao, Lin; Boufadel, Michel C; Geng, Xiaolong; Lee, Kenneth; King, Thomas; Robinson, Brian; Fitzpatrick, Faith
2016-05-15
Oil-particle interactions play a major role in the removal of free oil from the water column. We present a new conceptual-numerical model, A-DROP, to predict the amount of oil trapped in oil-particle aggregates. A new conceptual formulation of oil-particle coagulation efficiency is introduced to account for the effects of oil stabilization by particles, particle hydrophobicity, and oil-particle size ratio on OPA formation. A-DROP was able to closely reproduce the oil trapping efficiency reported in experimental studies. The model was then used to simulate the OPA formation in a typical nearshore environment. Modeling results indicate that the increase of particle concentration in the swash zone would speed up the oil-particle interaction process; however, the amount of oil trapped in OPAs did not increase in proportion to particle concentration. The developed A-DROP model could become an important tool in understanding the natural removal of oil and developing oil spill countermeasures by means of oil-particle aggregation. Copyright © 2016 Elsevier Ltd. All rights reserved.
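The trapping kinetics summarized in this abstract can be illustrated with a minimal Smoluchowski-type sketch. Everything below is assumed for illustration: the coagulation efficiency `alpha`, collision kernel `beta`, and concentrations are hypothetical placeholders, not A-DROP's actual formulation.

```python
def trap_oil(c_oil, n_particle, alpha=0.3, beta=1e-9, dt=1.0, steps=3600):
    """Euler integration of a simple trapping law:
    d(free oil)/dt = -alpha * beta * n_particle * free_oil,
    where alpha is a coagulation efficiency and beta a collision kernel.
    Returns (free oil remaining, oil trapped in OPAs)."""
    free = c_oil
    for _ in range(steps):
        free -= alpha * beta * n_particle * free * dt
    return free, c_oil - free
```

Raising `n_particle` speeds up trapping, mirroring the swash-zone result, while the trapped amount saturates toward the initial oil mass rather than growing in proportion to particle load.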
Dealing With Unexpected Events on the Flight Deck: A Conceptual Model of Startle and Surprise
Landman, Annemarie; Groen, Eric L.; van Paassen, M. M. (René); Bronkhorst, Adelbert W.; Mulder, Max
2017-01-01
Objective: A conceptual model is proposed in order to explain pilot performance in surprising and startling situations. Background: Today’s debate around loss of control following in-flight events and the implementation of upset prevention and recovery training has highlighted the importance of pilots’ ability to deal with unexpected events. Unexpected events, such as technical malfunctions or automation surprises, potentially induce a “startle factor” that may significantly impair performance. Method: Literature on surprise, startle, resilience, and decision making is reviewed, and findings are combined into a conceptual model. A number of recent flight incident and accident cases are then used to illustrate elements of the model. Results: Pilot perception and actions are conceptualized as being guided by “frames,” or mental knowledge structures that were previously learned. Performance issues in unexpected situations can often be traced back to insufficient adaptation of one’s frame to the situation. It is argued that such sensemaking or reframing processes are especially vulnerable to issues caused by startle or acute stress. Conclusion: Interventions should focus on (a) increasing the supply and quality of pilot frames (e.g., through practicing a variety of situations), (b) increasing pilot reframing skills (e.g., through the use of unpredictability in training scenarios), and (c) improving pilot metacognitive skills, so that inappropriate automatic responses to startle and surprise can be avoided. Application: The model can be used to explain pilot behavior in accident cases, to design experiments and training simulations, to teach pilots metacognitive skills, and to identify intervention methods. PMID:28777917
Carbon Tetrachloride Flow and Transport in the Subsurface of the 216-Z-9 Trench at the Hanford Site
NASA Astrophysics Data System (ADS)
Oostrom, M.; Rockhold, M.; Truex, M.; Thorne, P.; Last, G.; Rohay, V.
2006-12-01
Three-dimensional modeling was conducted with layered and heterogeneous models to enhance the conceptual model of CT distribution in the vertical and lateral direction beneath the 216-Z-9 trench and to investigate the effects of soil vapor extraction (SVE). This work supports the U.S. Department of Energy's (DOE's) efforts to characterize the nature and distribution of CT in the 200 West Area and subsequently select an appropriate final remedy. Simulations targeted migration of dense, nonaqueous phase liquid (DNAPL) consisting of CT and co-disposed organics in the subsurface beneath the 216-Z-9 trench as a function of the properties and distribution of subsurface sediments and of the properties and disposal history of the waste. Simulations of CT migration were conducted using the Subsurface Transport Over Multiple Phases (STOMP) simulator. Simulation results support a conceptual model for CT distribution where CT in the DNAPL phase is expected to have migrated primarily in a vertical direction below the disposal trench. Presence of small-scale heterogeneities tends to limit the extent of vertical migration of CT DNAPL due to enhanced retention of DNAPL compared to more homogeneous conditions, but migration is still predominantly in the vertical direction. Results also show that the Cold Creek units retain more CT DNAPL within the vadose zone than other hydrologic units during SVE. A considerable amount of the disposed CT DNAPL may have partitioned to the vapor phase and subsequently to the aqueous and sorbed phases. Presence of small-scale heterogeneities tends to increase the amount of volatilization. Any continued migration of CT from the vadose zone to the groundwater is likely through interaction of vapor phase CT with the groundwater and not through continued DNAPL migration. The results indicated that SVE appears to be an effective technology for vadose zone remediation, but additional effort is needed to improve simulation of the SVE process.
2014-01-01
Background As a conceptual review, this paper will debate relevant learning theories to inform the development, design and delivery of an effective educational programme for simulated team training relevant to health professionals. Discussion Kolb’s experiential learning theory is used as the main conceptual framework to define the sequence of activities. Dewey’s theory of reflective thought and action, Jarvis’ modification of Kolb’s learning cycle and Schön’s reflection-on-action serve as a model to design scenarios for optimal concrete experience and debriefing for challenging participants’ beliefs and habits. Bandura’s theory of self-efficacy and newer socio-cultural learning models outline that for efficient team training, it is mandatory to introduce the social-cultural context of a team. Summary The ideal simulated team training programme needs a scenario for concrete experience, followed by a debriefing with a critical reflective observation and abstract conceptualisation phase, and ending with a second scenario for active experimentation. Let them re-experiment to optimise the effect of a simulated training session. Challenge them to the edge: The scenario needs to challenge participants to generate failures and feelings of inadequacy to drive and motivate team members to critically reflect and learn. Not experience itself but the inadequacy and contradictions of habitual experience serve as the basis for reflection. Facilitate critical reflection: Facilitators and group members must guide and motivate individual participants through the debriefing session, inciting and empowering learners to challenge their own beliefs and habits. To do this, learners need to feel psychologically safe. Let the group talk and critically explore. 
Motivate with reality and context: Training with multidisciplinary team members, with different levels of expertise, acting in their usual environment (in-situ simulation) with physiological variables is mandatory to introduce cultural context and social conditions to the learning experience. Embedding in situ team training sessions into a teaching programme to enable repeated training and to regularly assess team performance is mandatory for a cultural change toward sustained improvement of team performance and patient safety. PMID:24694243
NASA Technical Reports Server (NTRS)
Jackson, E. B.; Powell, Richard W.; Ragsdale, W. A.
1991-01-01
The role of simulations in the design of the HL-20, the crew-carrying unpowered lifting-body component of the NASA Personnel Launch System, is reviewed and illustrated with drawings and diagrams. Detailed consideration is given to the overall implementation of a real-time simulation of the HL-20 approach and landing phase, the baseline and experimental control laws used in the flight-control system, autoland guidance and control laws (vertical and lateral steering), the control-surface mixer and actuator model, and simulation results. The simulations allowed identification and correction of design problems with respect to the position of the landing gear and the original maximum L/D ratio of 3.2.
Perceptual Processing Affects Conceptual Processing
ERIC Educational Resources Information Center
van Dantzig, Saskia; Pecher, Diane; Zeelenberg, Rene; Barsalou, Lawrence W.
2008-01-01
According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task…
Van De Gucht, Tim; Saeys, Wouter; Van Meensel, Jef; Van Nuffel, Annelies; Vangeyte, Jurgen; Lauwers, Ludwig
2018-01-01
Although prototypes of automatic lameness detection systems for dairy cattle exist, information about their economic value is lacking. In this paper, a conceptual and operational framework for simulating the farm-specific economic value of automatic lameness detection systems was developed and tested on 4 system types: walkover pressure plates, walkover pressure mats, camera systems, and accelerometers. The conceptual framework maps essential factors that determine economic value (e.g., lameness prevalence, incidence and duration, lameness costs, detection performance, and their relationships). The operational simulation model links treatment costs and avoided losses with detection results and farm-specific information, such as herd size and lameness status. Results show that detection performance, herd size, discount rate, and system lifespan have a large influence on economic value. In addition, lameness prevalence influences the economic value, stressing the importance of an adequate prior estimation of the on-farm prevalence. The simulations provide first estimates of the upper limits on purchase prices of automatic detection systems. The framework allowed for identification of knowledge gaps obstructing more accurate economic value estimation. These include insights into cost reductions due to early detection and treatment, and links between specific lameness causes and their related losses. Because this model provides insight into the trade-offs between automatic detection systems' performance and investment price, it is a valuable tool to guide future research and developments. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
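The purchase-price upper limit described above is essentially a break-even net-present-value calculation. A minimal sketch, using hypothetical cost figures rather than values from the paper:

```python
def npv_detection_system(price, annual_avoided_loss, annual_cost,
                         lifespan_yr=10, discount=0.05):
    """Net present value of a detection system: discounted avoided
    lameness losses minus running costs, minus the purchase price."""
    npv = -price
    for year in range(1, lifespan_yr + 1):
        npv += (annual_avoided_loss - annual_cost) / (1 + discount) ** year
    return npv

def max_purchase_price(annual_avoided_loss, annual_cost,
                       lifespan_yr=10, discount=0.05):
    """Upper limit on the purchase price: the price at which NPV = 0."""
    return sum((annual_avoided_loss - annual_cost) / (1 + discount) ** y
               for y in range(1, lifespan_yr + 1))
```

Longer lifespans and lower discount rates raise the break-even price, which is consistent with the sensitivity of economic value to those two factors reported above.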
Mean Line Pump Flow Model in Rocket Engine System Simulation
NASA Technical Reports Server (NTRS)
Veres, Joseph P.; Lavelle, Thomas M.
2000-01-01
A mean line pump flow modeling method has been developed to provide a fast capability for modeling turbopumps of rocket engines. Based on this method, a mean line pump flow code PUMPA has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The pump code can model axial flow inducers, mixed-flow and centrifugal pumps. The code can model multistage pumps in series. The code features rapid input setup and computer run time, and is an effective analysis and conceptual design tool. The map generation capability of the code provides the map information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of the code permit parametric design space exploration of candidate pump configurations and provide pump performance data for engine system evaluation. The PUMPA code has been integrated with the Numerical Propulsion System Simulation (NPSS) code and an expander rocket engine system has been simulated. The mean line pump flow code runs as an integral part of the NPSS rocket engine system simulation and provides key pump performance information directly to the system model at all operating conditions.
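Off-design scaling of the kind a mean line code performs can be illustrated with the classical pump affinity laws. This is a generic sketch under the assumption of constant efficiency, not PUMPA's actual loss model: flow scales linearly with shaft speed and head with its square.

```python
def scale_operating_point(q_design, h_design, n_design, n):
    """Pump affinity laws: Q ~ N, H ~ N^2 (efficiency assumed constant)."""
    r = n / n_design
    return q_design * r, h_design * r ** 2

def speed_map(q_design, h_design, n_design, speeds):
    """Generate a crude performance map: one (Q, H) point per speed line."""
    return {n: scale_operating_point(q_design, h_design, n_design, n)
            for n in speeds}
```

A real mean line code replaces this constant-efficiency assumption with stage-by-stage velocity triangles and loss correlations, but the map-generation idea, one operating point per speed line handed to the engine system model, is the same.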
Aerodynamics model for a generic ASTOVL lift-fan aircraft
NASA Technical Reports Server (NTRS)
Birckelbaw, Lourdes G.; Mcneil, Walter E.; Wardwell, Douglas A.
1995-01-01
This report describes the aerodynamics model used in a simulation model of an advanced short takeoff and vertical landing (ASTOVL) lift-fan fighter aircraft. The simulation model was developed for use in piloted evaluations of transition and hover flight regimes, so that only low speed (M approximately 0.2) aerodynamics are included in the mathematical model. The aerodynamic model includes the power-off aerodynamic forces and moments and the propulsion system induced aerodynamic effects, including ground effects. The power-off aerodynamics data were generated using the U.S. Air Force Stability and Control Digital DATCOM program and a NASA Ames in-house graphics program called VORVIEW which allows the user to easily analyze arbitrary conceptual aircraft configurations using the VORLAX program. The jet-induced data were generated using the prediction methods of R. E. Kuhn et al., as referenced in this report.
NASA Astrophysics Data System (ADS)
Fovet, O.; Hrachowitz, M.; RUIZ, L.; Gascuel-odoux, C.; Savenije, H.
2013-12-01
While most hydrological models reproduce the general flow dynamics of a system, they frequently fail to adequately mimic system internal processes. This is likely to make them inadequate for simulating solute transport. For example, the hysteresis between storage and discharge, which is often observed in shallow hard-rock aquifers, is rarely well reproduced by models. One main reason is that this hysteresis has little weight in the calibration because objective functions are based on time series of individual variables. This reduces the ability of classical calibration/validation procedures to assess the relevance of the conceptual hypotheses associated with hydrological models. Calibrating models on variables derived from the combination of different individual variables (like stream discharge and groundwater levels) is a way to ensure that models will be accepted based on their consistency. Here we therefore test the value of this more systems-like approach to test different hypotheses about the behaviour of a small experimental lowland catchment in French Brittany (ORE AgrHys) where a high hysteresis is observed in the stream flow vs. shallow groundwater level relationship. Several conceptual models were applied to this site, and calibrated using objective functions based on metrics of this hysteresis. The tested model structures differed with respect to the storage function in each reservoir, the storage-discharge function in each reservoir, the deep loss expressions (as constant or variable fraction), the number of reservoirs (from 1 to 4) and their organization (parallel, series). The observed hysteretic groundwater level-discharge relationship was not satisfactorily reproduced by most of the tested models except for the most complex ones. These models were thus more consistent, and their underlying hypotheses are probably more realistic, even though their performance in simulating observed stream flow decreased. 
Selecting models based on such a systems-like approach is likely to improve their usefulness for environmental applications, e.g., solute transport issues. The next step would be to apply the same approach with metrics combining hydrological and biogeochemical variables.
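One simple metric of such a hysteresis (the paper's exact metrics are not given here, so this is an assumed formulation) is the signed area of the loop traced by normalized groundwater level against normalized discharge; a calibration objective can then penalize the mismatch between simulated and observed loop areas.

```python
def normalize(v):
    """Rescale a series to [0, 1]."""
    lo, hi = min(v), max(v)
    return [(x - lo) / (hi - lo) for x in v]

def loop_area(x, y):
    """Signed shoelace area of the closed loop (x[i], y[i])."""
    n = len(x)
    return 0.5 * sum(x[i] * y[(i + 1) % n] - x[(i + 1) % n] * y[i]
                     for i in range(n))

def hysteresis_objective(sim_h, sim_q, obs_h, obs_q):
    """Absolute mismatch between simulated and observed loop areas;
    to be minimized during calibration."""
    return abs(loop_area(normalize(sim_h), normalize(sim_q))
               - loop_area(normalize(obs_h), normalize(obs_q)))
```

A model that matches the discharge time series but traces no loop in the level-discharge plane scores poorly on this objective, which is exactly the consistency check the abstract argues for.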
Evaluating long-term cumulative hydrologic effects of forest management: a conceptual approach
Robert R. Ziemer
1992-01-01
It is impractical to address experimentally many aspects of cumulative hydrologic effects, since to do so would require studying large watersheds for a century or more. Monte Carlo simulations were conducted using three hypothetical 10,000-ha fifth-order forested watersheds. Most of the physical processes expressed by the model are transferable from temperate to...
NASA Astrophysics Data System (ADS)
Marsh, C.; Pomeroy, J. W.; Wheater, H. S.
2017-12-01
Accurate management of water resources is necessary for social, economic, and environmental sustainability worldwide. In locations with seasonal snowcovers, the accurate prediction of these water resources is further complicated due to frozen soils, solid-phase precipitation, blowing snow transport, and snowcover-vegetation-atmosphere interactions. Complex process interactions and feedbacks are a key feature of hydrological systems and may result in emergent phenomena, i.e., the arising of novel and unexpected properties within a complex system. One example is the feedback associated with blowing snow redistribution, which can lead to drifts that cause locally-increased soil moisture, thus increasing plant growth that in turn subsequently impacts snow redistribution, creating larger drifts. Attempting to simulate these emergent behaviours is a significant challenge, however, and there is concern that process conceptualizations within current models are too incomplete to represent the needed interactions. An improved understanding of the role of emergence in hydrological systems often requires high resolution distributed numerical hydrological models that incorporate the relevant process dynamics. The Canadian Hydrological Model (CHM) provides a novel tool for examining cold region hydrological systems. Key features include efficient terrain representation, allowing simulations at various spatial scales, reduced computational overhead, and a modular process representation allowing for an alternative-hypothesis framework. Using both physics-based and conceptual process representations sourced from long term process studies and the current cold regions literature allows for comparison of process representations and, importantly, their ability to produce emergent behaviours. Examining the system in a holistic, process-based manner can yield important insights and aid in the development of improved process representations.
O/S analysis of conceptual space vehicles. Part 1
NASA Technical Reports Server (NTRS)
Ebeling, Charles E.
1995-01-01
The application of recently developed computer models in determining operational capabilities and support requirements during the conceptual design of proposed space systems is discussed. The models used are the reliability and maintainability (R&M) model, the maintenance simulation model, and the operations and support (O&S) cost model. In the process of applying these models, the R&M and O&S cost models were updated. The more significant enhancements include (1) improved R&M equations for the tank subsystems, (2) the ability to allocate scheduled maintenance by subsystem, (3) redefined spares calculations, (4) computing a weighted average of the working days and mission days per month, (5) the use of a position manning factor, and (6) the incorporation into the O&S model of new formulas for computing depot and organizational recurring and nonrecurring training costs and documentation costs, and depot support equipment costs. The case study used is based upon a winged, single-stage, vertical-takeoff vehicle (SSV) designed to deliver a 25,000-lb payload, including passengers, to the Space Station Freedom (SSF) without a crew.
Misrepresentation and amendment of soil moisture in conceptual hydrological modelling
NASA Astrophysics Data System (ADS)
Zhuo, Lu; Han, Dawei
2016-04-01
Although many conceptual models are very effective in simulating river runoff, their soil moisture schemes generally do not represent reality well (i.e., they get the right answers for the wrong reasons). This study reveals two significant misrepresentations in those models through a case study using the Xinanjiang model, which is representative of many well-known conceptual hydrological models. The first is the setting of the upper limit of its soil moisture at the field capacity, due to the 'holding excess runoff' concept (i.e., runoff begins on repletion of its storage to the field capacity). The second is the neglect of capillary rise. A new scheme is therefore proposed to overcome those two issues. The amended model is as effective as its original form in flow modelling, but represents more logically realistic soil water processes. The purpose of the study is to enable the hydrological model to get the right answers for the right reasons. Therefore, the new model structure has a better capability in potentially assimilating soil moisture observations to enhance its real-time flood forecasting accuracy. The new scheme is evaluated in the Pontiac catchment of the USA through a comparison with satellite observed soil moisture. The correlation between the XAJ-simulated and the observed soil moisture is enhanced significantly from 0.64 to 0.70. In addition, a new soil moisture term called SMDS (Soil Moisture Deficit to Saturation) is proposed to complement the conventional SMD (Soil Moisture Deficit).
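The distinction between the two deficit terms can be stated in a few lines. This sketch assumes SMD is measured against field capacity and the proposed SMDS against saturation, an interpretation of the term's expansion rather than code from the paper:

```python
def smd(theta, field_capacity):
    """Conventional Soil Moisture Deficit: water depth needed to refill
    the store to field capacity (clips to 0 at or above it)."""
    return max(field_capacity - theta, 0.0)

def smds(theta, saturation):
    """Soil Moisture Deficit to Saturation: deficit measured against
    saturation, so storage between field capacity and saturation
    (e.g., supplied by capillary rise) is still resolved."""
    return max(saturation - theta, 0.0)
```

With `theta` below field capacity both deficits are positive; between field capacity and saturation the conventional SMD clips to zero while SMDS still registers the storage available before saturation.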
NASA Astrophysics Data System (ADS)
Soulsby, Chris; Dunn, Sarah M.
2003-02-01
Hydrochemical tracers (alkalinity and silica) were used in an end-member mixing analysis (EMMA) of runoff sources in the 10 km2 Allt a' Mharcaidh catchment. A three-component mixing model was used to separate the hydrograph and estimate, to a first approximation, the range of likely contributions of overland flow, shallow subsurface storm flow, and groundwater to the annual hydrograph. A conceptual, catchment-scale rainfall-runoff model (DIY) was also used to separate the annual hydrograph in an equivalent set of flow paths. The two approaches produced independent representations of catchment hydrology that exhibited reasonable agreement. This showed the dominance of overland flow in generating storm runoff and the important role of groundwater inputs throughout the hydrological year. Moreover, DIY was successfully adapted to simulate stream chemistry (alkalinity) at daily time steps. Sensitivity analysis showed that whilst a distinct groundwater source at the catchment scale could be identified, there was considerable uncertainty in differentiating between overland flow and subsurface storm flow in both the EMMA and DIY applications. Nevertheless, the study indicated that the complementary use of tracer analysis in EMMA can increase the confidence in conceptual model structure. However, conclusions are restricted to the specific spatial and temporal scales examined.
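A three-component separation from two tracers reduces to a 3x3 linear system: one water-balance equation plus one mixing equation per tracer. The sketch below uses hypothetical end-member concentrations, not the Allt a' Mharcaidh values:

```python
def det3(M):
    """Determinant of a 3x3 matrix (list of rows)."""
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

def emma_fractions(stream, members):
    """Three-component mixing via Cramer's rule.
    stream  = (tracer1, tracer2) in the stream sample;
    members = three (tracer1, tracer2) end-member pairs.
    Rows: water balance, tracer 1, tracer 2."""
    A = [[1.0, 1.0, 1.0],
         [members[0][0], members[1][0], members[2][0]],
         [members[0][1], members[1][1], members[2][1]]]
    b = [1.0, stream[0], stream[1]]
    D = det3(A)
    fracs = []
    for col in range(3):
        Ac = [row[:] for row in A]
        for r in range(3):
            Ac[r][col] = b[r]
        fracs.append(det3(Ac) / D)
    return fracs
```

For a stream sample at (60, 2.6) with end-members at (10, 1), (50, 3), and (200, 6), the recovered fractions are 0.5, 0.3, and 0.2. In practice the end-member concentrations are uncertain ranges, which is the source of the overland/subsurface ambiguity noted above.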
NASA Astrophysics Data System (ADS)
Bianchi Janetti, Emanuela; Riva, Monica; Guadagnini, Alberto
2017-04-01
We perform a variance-based global sensitivity analysis to assess the impact of the uncertainty associated with (a) the spatial distribution of hydraulic parameters, e.g., hydraulic conductivity, and (b) the conceptual model adopted to describe the system on the characterization of a regional-scale aquifer. We do so in the context of inverse modeling of the groundwater flow system. The study aquifer lies within the provinces of Bergamo and Cremona (Italy) and covers a planar extent of approximately 785 km2. Analysis of available sedimentological information allows identifying a set of main geo-materials (facies/phases) which constitute the geological makeup of the subsurface system. We parameterize the conductivity field following two diverse conceptual schemes. The first one is based on the representation of the aquifer as a Composite Medium. In this conceptualization the system is composed of distinct (five, in our case) lithological units. Hydraulic properties (such as conductivity) in each unit are assumed to be uniform. The second approach assumes that the system can be modeled as a collection of media coexisting in space to form an Overlapping Continuum. A key point in this model is that each point in the domain represents a finite volume within which each of the (five) identified lithofacies can be found with a certain volumetric percentage. Groundwater flow is simulated with the numerical code MODFLOW-2005 for each of the adopted conceptual models. We then quantify the relative contribution of the considered uncertain parameters, including boundary conditions, to the total variability of the piezometric level recorded in a set of 40 monitoring wells by relying on the variance-based Sobol indices. The latter are derived numerically for the investigated settings through the use of a model-order reduction technique based on the polynomial chaos expansion approach.
NASA Astrophysics Data System (ADS)
Caputo, Riccardo
2010-09-01
It is a commonplace field observation that extension fractures are more abundant than shear fractures. The questions of how much more abundant, and why, are posed in this paper and qualitative estimates of their ratio within a rock volume are made on the basis of field observations and mechanical considerations. A conceptual model is also proposed to explain the common range of ratios between extension and shear fractures, here called the j/ f ratio. The model considers three major genetic stress components originated from overburden, pore-fluid pressure and tectonics and assumes that some of the remote genetic stress components vary with time ( i.e. stress-rates are included). Other important assumptions of the numerical model are that: i) the strength of the sub-volumes is randomly attributed following a Weibull probabilistic distribution, ii) all fractures heal after a given time, thus simulating the cementation process, and therefore iii) both extensional jointing and shear fracturing could be recurrent events within the same sub-volume. As a direct consequence of these assumptions, the stress tensor at any point varies continuously in time and these variations are caused by both remote stresses and local stress drops associated with in-situ and neighbouring fracturing events. The conceptual model is implemented in a computer program to simulate layered carbonate rock bodies undergoing brittle deformation. The numerical results are obtained by varying the principal parameters, like depth ( viz. confining pressure), tensile strength, pore-fluid pressure and shape of the Weibull distribution function, in a wide range of values, therefore simulating a broad spectrum of possible mechanical and lithological conditions. 
The quantitative estimates of the j/f ratio confirm the general predominance of extensional failure events during brittle deformation in shallow crustal rocks and provide useful insights for better understanding the role played by the different parameters. For example, as a general trend the j/f ratio is inversely proportional to depth (viz. confining pressure) and directly proportional to pore-fluid pressure, while the stronger the rock, the wider the range of depths showing a finite value of the j/f ratio and, in general, the deeper the conditions where extension fractures can form. Moreover, the wider the strength variability of the rocks (i.e. the lower the m parameter of the Weibull probabilistic distribution function), the wider the depth range over which both fracture types can form, providing a finite value of the j/f ratio. Natural case studies from different geological and tectonic settings are also used to test the conceptual model and the numerical results, showing good agreement between measured and predicted j/f ratios.
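The random strength assignment assumed in the model can be sketched as follows. The scale strength sigma0 and the sample sizes are hypothetical; the snippet only illustrates how the Weibull shape parameter m controls strength variability (lower m, wider scatter), as discussed above.

```python
import numpy as np

def weibull_strengths(n, m, sigma0, rng):
    """Tensile strengths for n sub-volumes drawn from a Weibull distribution
    with shape m and (hypothetical) scale strength sigma0 (e.g., in MPa)."""
    return sigma0 * rng.weibull(m, size=n)

rng = np.random.default_rng(42)
narrow = weibull_strengths(100_000, m=10.0, sigma0=10.0, rng=rng)  # homogeneous rock
wide = weibull_strengths(100_000, m=1.5, sigma0=10.0, rng=rng)     # heterogeneous rock

def cv(x):
    """Coefficient of variation: relative scatter of the strengths."""
    return x.std() / x.mean()
```

A lower m produces a markedly larger coefficient of variation, which in the paper's model widens the depth range over which both extension and shear fractures coexist.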
The Analysis of Rush Orders Risk in Supply Chain: A Simulation Approach
NASA Technical Reports Server (NTRS)
Mahfouz, Amr; Arisha, Amr
2011-01-01
Satisfying customers by delivering demands at the agreed time, at competitive prices, and at a satisfactory quality level are crucial requirements for supply chain survival. The incidence of risks in a supply chain often causes sudden disruptions in its processes and consequently leads customers to lose trust in a company's competence. Rush orders are considered one of the main types of supply chain risk due to their negative impact on overall performance. Using integrated definition modeling approaches (i.e. IDEF0 and IDEF3) and simulation modeling techniques, a comprehensive integrated model has been developed to assess rush order risks and examine two risk mitigation strategies. Detailed function sequences and object flows were conceptually modeled to reflect the macro and micro levels of the studied supply chain. Discrete event simulation models were then developed to assess and investigate the mitigation strategies for rush order risks, with the objective of minimizing order cycle time and cost.
ERIC Educational Resources Information Center
Kumar, David Devraj; Thomas, P. V.; Morris, John D.; Tobias, Karen M.; Baker, Mary; Jermanovich, Trudy
2011-01-01
This study examined the impact of computer-simulation-supported science learning on teachers' understanding and conceptual knowledge of current electricity. Pre/post tests were used to measure the teachers' concept attainment. Overall, there was a significant and large knowledge difference effect from pre- to post-test. Two interesting…
ERIC Educational Resources Information Center
Olympiou, Georgios; Zacharias, Zacharia; deJong, Ton
2013-01-01
This study aimed to identify if complementing representations of concrete objects with representations of abstract objects improves students' conceptual understanding as they use a simulation to experiment in the domain of "Light and Color". Moreover, we investigated whether students' prior knowledge is a factor that must be considered in deciding…
ERIC Educational Resources Information Center
Dega, Bekele Gashe; Kriek, Jeanne; Mogese, Temesgen Fereja
2013-01-01
The purpose of this study was to investigate Ethiopian physics undergraduate students' conceptual change in the concepts of electric potential and energy (EPE) and electromagnetic induction (EMI). A quasi-experimental design was used to study the effect of cognitive perturbation using physics interactive simulations (CPS) in relation to cognitive…
ERIC Educational Resources Information Center
Wibowo, Firmanul Catur; Suhandi, Andi; Nahadi; Samsudin, Achmad; Darman, Dina Rahmi; Suherli, Zulmiswal; Hasani, Aceng; Leksono, Sroso Mukti; Hendrayana, Aan; Suherman; Hidayat, Soleh; Hamdani, Dede; Costu, Bayram
2017-01-01
Most students struggle to understand science concepts. Abstract concepts require visualization to help students build an understanding of the concept. The aim of this study was to develop a Virtual Microscopic Simulation (VMS) to encourage conceptual change and to promote its effectiveness connected to…
ERIC Educational Resources Information Center
Kumar, David Devraj; Sherwood, Robert D.
2007-01-01
A study of the effect of science teaching with a multimedia simulation on water quality, the "River of Life," on the science conceptual understanding of students (N = 83) in an undergraduate science education (K-9) course is reported. Teaching reality-based meaningful science is strongly recommended by the National Science Education Standards…
Guo, Xuezhen; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W
2014-06-01
Economic analysis of hazard surveillance in livestock production chains is essential for surveillance organizations (such as food safety authorities) when making scientifically based decisions on optimization of resource allocation. To enable this, quantitative decision support tools are required at two levels of analysis: (1) single-hazard surveillance system and (2) surveillance portfolio. This paper addresses the first level by presenting a conceptual approach for the economic analysis of single-hazard surveillance systems. The concept includes objective and subjective aspects of single-hazard surveillance system analysis: (1) a simulation part to derive an efficient set of surveillance setups based on the technical surveillance performance parameters (TSPPs) and the corresponding surveillance costs, i.e., objective analysis, and (2) a multi-criteria decision making model to evaluate the impacts of the hazard surveillance, i.e., subjective analysis. The conceptual approach was checked for (1) conceptual validity and (2) data validity. Issues regarding the practical use of the approach, particularly the data requirement, were discussed. We concluded that the conceptual approach is scientifically credible for economic analysis of single-hazard surveillance systems and that the practicability of the approach depends on data availability. Copyright © 2014 Elsevier B.V. All rights reserved.
Dark Matter and Super Symmetry: Exploring and Explaining the Universe with Simulations at the LHC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutsche, Oliver
The Large Hadron Collider (LHC) at CERN in Geneva, Switzerland, is one of the largest machines on this planet. It is built to smash protons into each other at unprecedented energies to reveal the fundamental constituents of our universe. The four detectors at the LHC record multi-petabyte datasets every year. The scientific analysis of this data requires equally large simulation datasets of the collisions based on the theory of particle physics, the Standard Model. The goal is to verify the validity of the Standard Model or of theories that extend it, such as Supersymmetry and explanations of Dark Matter. I will give an overview of the nature of the simulations needed to discover new particles like the Higgs boson in 2012, and review the different areas where simulations are indispensable: from the actual recording of the collisions, to the extraction of scientific results, to the conceptual design of improvements to the LHC and its experiments.
Lihoreau, Mathieu; Buhl, Jerome; Charleston, Michael A; Sword, Gregory A; Raubenheimer, David; Simpson, Stephen J
2015-03-01
Over recent years, modelling approaches from nutritional ecology (known as Nutritional Geometry) have been increasingly used to describe how animals and some other organisms select foods and eat them in appropriate amounts in order to maintain a balanced nutritional state maximising fitness. These nutritional strategies profoundly affect the physiology, behaviour and performance of individuals, which in turn impact their social interactions within groups and societies. Here, we present a conceptual framework to study the role of nutrition as a major ecological factor influencing the development and maintenance of social life. We first illustrate some of the mechanisms by which nutritional differences among individuals mediate social interactions in a broad range of species and ecological contexts. We then explain how studying individual- and collective-level nutrition in a common conceptual framework derived from Nutritional Geometry can bring new fundamental insights into the mechanisms and evolution of social interactions, using a combination of simulation models and manipulative experiments. © 2015 The Authors. Ecology Letters published by John Wiley & Sons Ltd and CNRS.
A Dynamic Simulation Model of Organizational Culture and Business Strategy Effects on Performance
NASA Astrophysics Data System (ADS)
Trivellas, Panagiotis; Reklitis, Panagiotis; Konstantopoulos, Nikolaos
2007-12-01
In the past two decades, the organizational culture literature has gained tremendous interest among both academics and practitioners. This is based not only on the suggestion that culture is related to performance, but also on the view that it is subject to direct managerial control and manipulation in a desired direction. In the present paper, we adopt the Competing Values Framework (CVF) to operationalize organizational culture and Porter's typology to conceptualize business strategy (cost leadership, innovative and marketing differentiation, and focus). Although simulation of social events is a difficult task, since so many considerations (not all well understood) are involved, in the present study we developed a dynamic model to simulate the effects of organizational culture and strategy on financial performance. Data obtained from a six-year survey in the banking sector of a developing European economy were used to develop the proposed dynamic model.
Xiao, Bo; Imel, Zac E.; Georgiou, Panayiotis; Atkins, David C.; Narayanan, Shrikanth S.
2017-01-01
Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation, and offer a series of open problems for future research. PMID:27017830
Gas Core Reactor Numerical Simulation Using a Coupled MHD-MCNP Model
NASA Technical Reports Server (NTRS)
Kazeminezhad, F.; Anghaie, S.
2008-01-01
Analysis is provided in this report of using two head-on magnetohydrodynamic (MHD) shocks to achieve supercritical nuclear fission in an axially elongated cylinder filled with UF4 gas as an energy source for deep space missions. The motivation for each aspect of the design is explained and supported by theory and numerical simulations. A subsequent report will provide detail on relevant experimental work to validate the concept. Here the focus is on the theory of and simulations for the proposed gas core reactor conceptual design from the onset of shock generations to the supercritical state achieved when the shocks collide. The MHD model is coupled to a standard nuclear code (MCNP) to observe the neutron flux and fission power attributed to the supercritical state brought about by the shock collisions. Throughout the modeling, realistic parameters are used for the initial ambient gaseous state and currents to ensure a resulting supercritical state upon shock collisions.
Aircraft Flight Modeling During the Optimization of Gas Turbine Engine Working Process
NASA Astrophysics Data System (ADS)
Tkachenko, A. Yu; Kuz'michev, V. S.; Krupenich, I. N.
2018-01-01
The article describes a method for simulating the flight of the aircraft along a predetermined path, establishing a functional connection between the parameters of the working process of gas turbine engine and the efficiency criteria of the aircraft. This connection is necessary for solving the optimization tasks of the conceptual design stage of the engine according to the systems approach. Engine thrust level, in turn, influences the operation of aircraft, thus making accurate simulation of the aircraft behavior during flight necessary for obtaining the correct solution. The described mathematical model of aircraft flight provides the functional connection between the airframe characteristics, working process of gas turbine engines (propulsion system), ambient and flight conditions and flight profile features. This model provides accurate results of flight simulation and the resulting aircraft efficiency criteria, required for optimization of working process and control function of a gas turbine engine.
Methodology and application of combined watershed and ground-water models in Kansas
Sophocleous, M.; Perkins, S.P.
2000-01-01
Increased irrigation in Kansas and other regions during the last several decades has caused serious water depletion, making the development of comprehensive strategies and tools to resolve such problems increasingly important. This paper makes the case for an intermediate complexity, quasi-distributed, comprehensive, large-watershed model, which falls between the fully distributed, physically based hydrological modeling system of the type of the SHE model and the lumped, conceptual rainfall-runoff modeling system of the type of the Stanford watershed model. This is achieved by integrating the quasi-distributed watershed model SWAT with the fully-distributed ground-water model MODFLOW. The advantage of this approach is the appreciably smaller input data requirements and the use of readily available data (compared to the fully distributed, physically based models), the statistical handling of watershed heterogeneities by employing the hydrologic-response-unit concept, and the significantly increased flexibility in handling stream-aquifer interactions, distributed well withdrawals, and multiple land uses. The mechanics of integrating the component watershed and ground-water models are outlined, and three real-world management applications of the integrated model from Kansas are briefly presented. Three different aspects of the integrated model are emphasized: (1) management applications of a Decision Support System for the integrated model (Rattlesnake Creek subbasin); (2) alternative conceptual models of spatial heterogeneity related to the presence or absence of an underlying aquifer with shallow or deep water table (Lower Republican River basin); and (3) the general nature of the integrated model linkage by employing a watershed simulator other than SWAT (Wet Walnut Creek basin). 
These applications demonstrate the practicality and versatility of this relatively simple and conceptually clear approach, making public acceptance of the integrated watershed modeling system much easier. This approach also enhances model calibration and thus the reliability of model results. (C) 2000 Elsevier Science B.V.
NASA Technical Reports Server (NTRS)
Rabadi, Ghaith
2005-01-01
A significant portion of lifecycle costs for launch vehicles is generated during the operations phase. Research indicates that operations costs can account for a large percentage of the total life-cycle costs of reusable space transportation systems. These costs are largely determined by decisions made early during conceptual design. Therefore, operational considerations are an important part of the vehicle design and concept analysis process that needs to be modeled and studied early in the design phase. However, this is a difficult and challenging task due to the uncertainty of operations definitions, the dynamic and combinatorial nature of the processes, the lack of analytical models, and the scarcity of historical data during the conceptual design phase. Ultimately, NASA would like to know the best mix of launch vehicle concepts that would meet the missions' launch dates at minimum cost. To answer this question, we first need a model to estimate the total cost, including the operational cost, of accomplishing this set of missions. In this project, we have developed and implemented a discrete-event simulation model using ARENA (a simulation modeling environment) to perform this cost assessment. Discrete-event simulation is widely used in modeling complex systems, including transportation systems, due to its flexibility and its ability to capture the dynamics of the system. The simulation model accepts manifest inputs including the set of missions that need to be accomplished over a period of time, the clients (e.g., NASA or DoD) who wish to transport payloads to space, the payload weights, and their destinations (e.g., International Space Station, LEO, or GEO). A user of the simulation model can define an architecture of reusable or expendable launch vehicles to achieve these missions. Launch vehicles may belong to different families, where each family may have its own set of resources, processing times, and cost factors.
The goal is to capture the required resource levels of the major launch elements and their required facilities. The model's output can show whether or not a certain architecture of vehicles can meet the launch dates and, if not, how large the delay cost would be. It also produces aggregate figures of mission cost based on element procurement cost, processing cost, cargo integration cost, delay cost, and mission support cost. One of the most useful features of this model is that it is stochastic: it accepts statistical distributions to represent processing times, mimicking the stochastic nature of real systems.
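The stochastic processing logic described above can be sketched with a minimal queueing recursion rather than a full ARENA model. A single processing facility, the exponential processing-time distribution, and all parameter values below are illustrative assumptions, not the project's calibrated inputs.

```python
import random

def simulate_missions(n_missions, launch_interval, proc_mean, seed=0):
    """Minimal discrete-event sketch of one processing facility serving a
    manifest of missions with stochastic (exponential) processing times.
    The delay of a mission is the time its processing finishes past the
    planned launch date; all parameters here are hypothetical."""
    rng = random.Random(seed)
    facility_free = 0.0            # time at which the facility becomes idle
    delays = []
    for i in range(n_missions):
        due = i * launch_interval                     # planned launch date
        start = max(due - proc_mean, facility_free)   # earliest sensible start
        finish = start + rng.expovariate(1.0 / proc_mean)
        facility_free = finish
        delays.append(max(0.0, finish - due))
    return delays

delays = simulate_missions(n_missions=12, launch_interval=30.0, proc_mean=5.0)
```

Replicating such a run many times with different seeds yields the distribution of delay costs for a candidate vehicle architecture, which is the kind of output the abstract describes.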
NASA Astrophysics Data System (ADS)
Masciopinto, Costantino; Volpe, Angela; Palmiotta, Domenico; Cherubini, Claudia
2010-09-01
A combination of a parallel fracture model with the PHREEQC-2 geochemical model was developed to simulate sequential flow and chemical transport with reactions in fractured media where both laminar and turbulent flows occur. The integration of non-laminar flow resistances in one model produced relevant effects on water flow velocities, thus improving model prediction capabilities on contaminant transport. The proposed conceptual model consists of 3D rock-blocks, separated by horizontal bedding plane fractures with variable apertures. Particle tracking solved the transport equations for conservative compounds and provided input for PHREEQC-2. For each cluster of contaminant pathways, PHREEQC-2 determined the concentration for mass-transfer, sorption/desorption, ion exchange, mineral dissolution/precipitation and biodegradation, under kinetically controlled reactive processes of equilibrated chemical species. Field tests have been performed for the code verification. As an example, the combined model has been applied to a contaminated fractured aquifer of southern Italy in order to simulate the phenol transport. The code correctly fitted the field available data and also predicted a possible rapid depletion of phenols as a result of an increased biodegradation rate induced by a simulated artificial injection of nitrates, upgradient to the sources.
Compressibility Effects on Particle-Fluid Interaction Force for Eulerian-Eulerian Simulations
NASA Astrophysics Data System (ADS)
Akiki, Georges; Francois, Marianne; Zhang, Duan
2017-11-01
Particle-fluid interaction forces are essential in modeling multiphase flows. Several models can be found in the literature based on empirical, numerical, and experimental results from various simplified flow conditions. Some of these models also account for finite Mach number effects. Using these models is relatively straightforward in Eulerian-Lagrangian calculations if the model for the total force on particles is used. In Eulerian-Eulerian simulations, however, there are pressure-gradient terms in the momentum equation for particles. For low Mach number flows, the pressure-gradient force is negligible if the particle density is much greater than that of the fluid. For supersonic flows where a standing shock is present, even for a steady and uniform flow, it is unclear whether the significant pressure-gradient force should be separated out from the particle force model. To answer this conceptual question, we perform single-sphere fully-resolved DNS simulations for a wide range of Mach numbers. We then examine whether the total force obtained from the DNS can be categorized into well-established models, such as the quasi-steady, added-mass, pressure-gradient, and history forces. Work sponsored by the Advanced Simulation and Computing (ASC) program of NNSA and LDRD-CNLS of LANL.
Simulation of Subsurface Multiphase Contaminant Extraction Using a Bioslurping Well Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matos de Souza, Michelle; Oostrom, Mart; White, Mark D.
2016-07-12
Subsurface simulation of multiphase extraction from wells is notoriously difficult. Explicit representation of well geometry requires small grid resolution, potentially leading to large computational demands. To reduce the problem dimensionality, multiphase extraction is mostly modeled using vertically-averaged approaches. In this paper, a multiphase well model approach is presented as an alternative to simplify the application. The well model, a multiphase extension of the classic Peaceman model, has been implemented in the STOMP simulator. The numerical solution approach accounts for local conditions and gradients in the exchange of fluids between the well and the aquifer. Advantages of this well model implementation include the option to simulate the effects of well characteristics and operation. Simulations were conducted investigating the effects of extraction location, applied vacuum pressure, and a number of hydraulic properties. The obtained results were all consistent and logical. A major outcome of the test simulations is that, in contrast with common recommendations to extract from either the gas-NAPL or the NAPL-aqueous phase interface, the optimum extraction location should be in between these two levels. The new model implementation was also used to simulate extraction at a field site in Brazil. The simulation shows a good match with the field data, suggesting that the new STOMP well module may correctly represent oil removal. The field simulations depend on the quality of the site conceptual model, including the porous media and contaminant properties and the boundary and extraction conditions adopted. The new module may potentially be used to design field applications and analyze extraction data.
PhET: Interactive Simulations for Teaching and Learning Physics
NASA Astrophysics Data System (ADS)
Perkins, Katherine; Adams, Wendy; Dubson, Michael; Finkelstein, Noah; Reid, Sam; Wieman, Carl; LeMaster, Ron
2006-01-01
The Physics Education Technology (PhET) project creates useful simulations for teaching and learning physics and makes them freely available from the PhET website (http://phet.colorado.edu). The simulations (sims) are animated, interactive, and game-like environments in which students learn through exploration. In these sims, we emphasize the connections between real-life phenomena and the underlying science, and seek to make the visual and conceptual models of expert physicists accessible to students. We use a research-based approach in our design—incorporating findings from prior research and our own testing to create sims that support student engagement with and understanding of physics concepts.
Mesoscopic model for binary fluids
NASA Astrophysics Data System (ADS)
Echeverria, C.; Tucci, K.; Alvarez-Llamoza, O.; Orozco-Guillén, E. E.; Morales, M.; Cosenza, M. G.
2017-10-01
We propose a model for studying binary fluids based on the mesoscopic molecular simulation technique known as multiparticle collision, where the space and state variables are continuous, and time is discrete. We include a repulsion rule to simulate segregation processes that does not require calculation of the interaction forces between particles, so binary fluids can be described on a mesoscopic scale. The model is conceptually simple and computationally efficient; it maintains Galilean invariance and conserves the mass and energy in the system at the micro- and macro-scale, whereas momentum is conserved globally. For a wide range of temperatures and densities, the model yields results in good agreement with the known properties of binary fluids, such as the density profile, interface width, phase separation, and phase growth. We also apply the model to the study of binary fluids in crowded environments with consistent results.
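A basic multiparticle-collision (stochastic rotation) step of the kind the model builds on can be sketched as below. The 2D setting, cell size, and rotation angle are illustrative, and the paper's repulsion rule for species segregation is omitted; the sketch only shows the core collision step, which conserves mass, momentum, and kinetic energy per cell by construction.

```python
import numpy as np

def srd_collision(pos, vel, cell_size, alpha, rng):
    """One multiparticle-collision (SRD) step in 2D: particles are binned into
    square cells and their velocities relative to the cell mean are rotated by
    +/- alpha. Mass, momentum and kinetic energy are conserved in each cell."""
    cells = np.floor(pos / cell_size).astype(int)
    keys = cells[:, 0] * 100003 + cells[:, 1]        # unique id per cell
    new_vel = vel.copy()
    for key in np.unique(keys):
        idx = np.where(keys == key)[0]
        u = vel[idx].mean(axis=0)                    # cell-mean velocity
        a = alpha if rng.random() < 0.5 else -alpha  # random rotation sense
        c, s = np.cos(a), np.sin(a)
        R = np.array([[c, -s], [s, c]])
        new_vel[idx] = u + (vel[idx] - u) @ R.T
    return new_vel

rng = np.random.default_rng(0)
pos = rng.random((500, 2)) * 10.0
vel = rng.normal(size=(500, 2))
vel2 = srd_collision(pos, vel, cell_size=1.0, alpha=np.pi / 2, rng=rng)
```

In a full simulation this collision step alternates with free streaming of the particles, and a grid shift is usually applied to restore Galilean invariance.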
Computer simulation of the probability that endangered whales will interact with oil spills
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reed, M.; Jayko, K.; Bowles, A.
1987-03-01
A numerical model system was developed to assess quantitatively the probability that endangered bowhead and gray whales will encounter spilled oil in Alaskan waters. Bowhead and gray whale migration and diving-surfacing models, and an oil-spill trajectory model comprise the system. The migration models were developed from conceptual considerations, then calibrated with and tested against observations. The movement of a whale point is governed by a random walk algorithm which stochastically follows a migratory pathway. The oil-spill model, developed under a series of other contracts, accounts for transport and spreading behavior in open water and in the presence of sea ice. Historical wind records and heavy, normal, or light ice cover data sets are selected at random to provide stochastic oil-spill scenarios for whale-oil interaction simulations.
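The random-walk migration algorithm described above can be sketched as a biased walk toward successive waypoints along a pathway. The step length, heading-noise level, and waypoints below are hypothetical, not the calibrated bowhead or gray whale parameters.

```python
import math
import random

def migrate(waypoints, step=1.0, noise=0.4, seed=1):
    """Biased random walk along a migratory pathway: each step heads toward
    the next waypoint with Gaussian heading noise. All parameters here are
    illustrative, not the calibrated whale-migration values."""
    rng = random.Random(seed)
    x, y = waypoints[0]
    track = [(x, y)]
    for wx, wy in waypoints[1:]:
        while math.hypot(wx - x, wy - y) > step:
            heading = math.atan2(wy - y, wx - x) + rng.gauss(0.0, noise)
            x += step * math.cos(heading)
            y += step * math.sin(heading)
            track.append((x, y))
    return track

track = migrate([(0.0, 0.0), (50.0, 0.0), (50.0, 50.0)])
```

Running many such tracks against stochastic oil-spill trajectories, and intersecting them in space and time, gives the kind of encounter-probability estimate the abstract describes.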
NASA Astrophysics Data System (ADS)
Leach, J.; Moore, D.
2015-12-01
Winter stream temperature of coastal mountain catchments influences fish growth and development. Transient snow cover and advection associated with lateral throughflow inputs are dominant controls on stream thermal regimes in these regions. Existing stream temperature models lack the ability to properly simulate these processes. Therefore, we developed and evaluated a conceptual-parametric catchment-scale stream temperature model that includes the role of transient snow cover and lateral advection associated with throughflow. The model provided reasonable estimates of observed stream temperature at three test catchments. We used the model to simulate winter stream temperature for virtual catchments located at different elevations within the rain-on-snow zone. The modelling exercise examined stream temperature response associated with interactions between elevation, snow regime, and changes in air temperature. Modelling results highlight that the sensitivity of winter stream temperature response to changes in climate may be dependent on catchment elevation and landscape position.
Dini-Andreote, Francisco; Stegen, James C; van Elsas, Jan Dirk; Salles, Joana Falcão
2015-03-17
Ecological succession and the balance between stochastic and deterministic processes are two major themes within microbial ecology, but these conceptual domains have mostly developed independent of each other. Here we provide a framework that integrates shifts in community assembly processes with microbial primary succession to better understand mechanisms governing the stochastic/deterministic balance. Synthesizing previous work, we devised a conceptual model that links ecosystem development to alternative hypotheses related to shifts in ecological assembly processes. Conceptual model hypotheses were tested by coupling spatiotemporal data on soil bacterial communities with environmental conditions in a salt marsh chronosequence spanning 105 years of succession. Analyses within successional stages showed community composition to be initially governed by stochasticity, but as succession proceeded, there was a progressive increase in deterministic selection correlated with increasing sodium concentration. Analyses of community turnover among successional stages (which provide a larger spatiotemporal scale relative to within-stage analyses) revealed that changes in the concentration of soil organic matter were the main predictor of the type and relative influence of determinism. Taken together, these results suggest scale-dependency in the mechanisms underlying selection. To better understand mechanisms governing these patterns, we developed an ecological simulation model that revealed how changes in selective environments cause shifts in the stochastic/deterministic balance. Finally, we propose an extended, and experimentally testable, conceptual model integrating ecological assembly processes with primary and secondary succession. This framework provides a priori hypotheses for future experiments, thereby facilitating a systematic approach to understand assembly and succession in microbial communities across ecosystems.
Hydrogeological Characterization of the Middle Magdalena Valley - Colombia
NASA Astrophysics Data System (ADS)
Arenas, Maria Cristina; Riva, Monica; Donado, Leonardo David; Guadagnini, Alberto
2017-04-01
We provide a detailed hydrogeological characterization of the complex aquifer system of the Middle Magdalena Valley, Colombia. The latter comprises 3 sub-basins within which 7 blocks have been identified for active exploration and potential production of oil and gas. As such, there is a critical need to establish modern water resources management practices in the area to accommodate the variety of social, environmental and industrial needs. We do so by starting from a detailed hydrogeological characterization of the system and focus on: (a) a detailed hydrogeological reconnaissance of the area leading to the definition of the main hydrogeological units; (b) the collection, organization and analysis of daily climatic data from 39 stations available in the region; and (c) the assessment of the groundwater flow circulation through the formulation of a conceptual and a mathematical model of the subsurface system. Groundwater flow is simulated in the SAM 1.1 aquifer located in the Middle Magdalena Valley with the objective of presenting and evaluating alternative conceptual hydrogeological modeling strategies. We focus here on modeling results at system equilibrium (i.e., under steady-state conditions) and assess the value of available information in the context of the candidate modeling strategies we consider. Results of our modeling effort are conducive to the characterization of the distributed hydrogeological budget and the assessment of critical areas as a function of the conceptualization of the system functioning and data availability.
Kumar, Ashish; Vercruysse, Jurgen; Vanhoorne, Valérie; Toiviainen, Maunu; Panouillot, Pierre-Emmanuel; Juuti, Mikko; Vervaet, Chris; Remon, Jean Paul; Gernaey, Krist V; De Beer, Thomas; Nopens, Ingmar
2015-04-25
Twin-screw granulation is a promising continuous alternative to traditional batchwise wet granulation processes. The twin-screw granulator (TSG) screws consist of transport and kneading element modules. Granulation is therefore governed to a large extent by the residence time distribution within each module, where different granulation rate processes dominate over others. Currently, experimental data are used to determine the residence time distributions. In this study, a conceptual model based on classical chemical engineering methods is proposed to better understand and simulate the residence time distribution in a TSG. The experimental data were compared with the most suitable proposed conceptual model to estimate the model parameters and to analyse and predict the effects of changes in the number of kneading discs and their stagger angle, screw speed and powder feed rate on residence time. The study established that the kneading block in the screw configuration acts as a plug-flow zone inside the granulator. Furthermore, it was found that a balance between the throughput force and conveying rate is required to obtain good axial mixing inside the twin-screw granulator. Although the granulation behaviour is different for other excipients, the experimental data collection and modelling methods applied in this study are generic and can be adapted to other excipients. Copyright © 2015 Elsevier B.V. All rights reserved.
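The abstract names only "classical chemical engineering methods" for the residence time distribution; one common form of such a conceptual model (assumed here for illustration, with invented parameter values, not the paper's fitted model) treats the kneading block as a plug-flow delay followed by N ideal stirred tanks in series:

```python
import math

def rtd_tanks_with_delay(t, n_tanks, tau_tank, plug_delay):
    """E(t) for a plug-flow zone (pure delay) followed by n_tanks ideal
    stirred tanks in series, each with mean residence time tau_tank.
    A textbook chemical-engineering RTD form, used here as a sketch."""
    ts = t - plug_delay
    if ts <= 0.0:
        return 0.0  # nothing exits before the plug-flow delay elapses
    return (ts ** (n_tanks - 1)
            / (math.factorial(n_tanks - 1) * tau_tank ** n_tanks)
            * math.exp(-ts / tau_tank))

# Mean residence time is plug_delay + n_tanks * tau_tank.
# Crude numerical check: E(t) should integrate to ~1.
dt = 0.01
area = sum(rtd_tanks_with_delay(k * dt, 4, 2.0, 5.0) * dt
           for k in range(40000))
```

Fitting such a form to measured tracer curves would yield the number of tanks and the plug-flow fraction, mirroring the paper's parameter-estimation step.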
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. A. Wasiolek
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).
Design Oriented Structural Modeling for Airplane Conceptual Design Optimization
NASA Technical Reports Server (NTRS)
Livne, Eli
1999-01-01
The main goal for research conducted with the support of this grant was to develop design oriented structural optimization methods for the conceptual design of airplanes. Traditionally, in conceptual design, airframe weight is estimated based on statistical equations developed over years of fitting airplane weight data in databases of similar existing airplanes. Utilization of such regression equations for the design of new airplanes can be justified only if the new airplanes use structural technology similar to the technology on the airplanes in those weight databases. If any new structural technology is to be pursued, or any new unconventional configurations designed, the statistical weight equations cannot be used. In such cases, any structural weight estimation must be based on rigorous "physics based" structural analysis and optimization of the airframes under consideration. Work under this grant progressed to explore airframe design-oriented structural optimization techniques along two lines of research: methods based on "fast" design oriented finite element technology and methods based on equivalent plate / equivalent shell models of airframes, in which the vehicle is modelled as an assembly of plate and shell components, each simulating a lifting surface or nacelle / fuselage pieces. Since responses to changes in geometry are essential in conceptual design of airplanes, as well as the capability to optimize the shape itself, research supported by this grant sought to develop efficient techniques for parametrization of airplane shape and sensitivity analysis with respect to shape design variables. Towards the end of the grant period a prototype automated structural analysis code designed to work with the NASA Aircraft Synthesis conceptual design code ACSYNT was delivered to NASA Ames.
Dissolution Rate, Weathering Mechanics, and Friability of TNT, Comp B, Tritonal, and Octol
2010-02-01
…second conceptual model also simulates dissolution of a particle that experiences constant soil moisture, such as one mixed in with the soil… or are mediated by moisture on the particle surface is not yet known. The identities of these red products are also unknown, as are their health… it using the outdoor data. The model assumes that raindrops intercepted by HE particles were fully saturated in HE as they dripped off. Particle…
System Level Uncertainty Assessment for Collaborative RLV Design
NASA Technical Reports Server (NTRS)
Charania, A. C.; Bradford, John E.; Olds, John R.; Graham, Matthew
2002-01-01
A collaborative design process utilizing Probabilistic Data Assessment (PDA) is showcased. Given the limitation of financial resources in both government and industry, strategic decision makers need more than just traditional point designs; they need to be aware of the likelihood that these future designs will meet their objectives. This uncertainty, an ever-present character in the design process, can be embraced through a probabilistic design environment. A conceptual design process is presented that encapsulates the major engineering disciplines for a Third Generation Reusable Launch Vehicle (RLV). Toolsets consist of aerospace industry standard tools in disciplines such as trajectory, propulsion, mass properties, cost, operations, safety, and economics. Variations of the design process are presented that use different fidelities of tools. The disciplinary engineering models are used in a collaborative engineering framework utilizing Phoenix Integration's ModelCenter and AnalysisServer environment. These tools allow the designer to join disparate models and simulations together in a unified environment wherein each discipline can interact with any other discipline. The design process also uses probabilistic methods to generate the system-level output metrics of interest for an RLV conceptual design. The specific system being examined is the Advanced Concept Rocket Engine 92 (ACRE-92) RLV. Previous experience and knowledge (in terms of input uncertainty distributions from experts and modeling and simulation codes) can be coupled with Monte Carlo processes to best predict the chances of program success.
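The Monte Carlo wrapper described above can be sketched in a few lines. Everything below is invented for illustration: the one-line surrogate stands in for the chained disciplinary tools, and the input distributions and budget threshold are placeholders, not ACRE-92 data:

```python
import random

def vehicle_cost_model(isp, dry_mass_factor, labor_rate):
    """Hypothetical surrogate for the disciplinary toolchain: maps
    uncertain inputs to a single system metric (say, a cost index).
    The real analysis chains trajectory, propulsion, mass, cost, etc."""
    return 1000.0 * dry_mass_factor + 0.5 * labor_rate - 0.8 * isp

random.seed(42)
samples = []
for _ in range(10000):
    isp = random.gauss(455.0, 5.0)           # expert-judged input distributions
    dmf = random.triangular(0.9, 1.2, 1.0)   # (all values illustrative)
    rate = random.uniform(90.0, 110.0)
    samples.append(vehicle_cost_model(isp, dmf, rate))

samples.sort()
median_cost = samples[len(samples) // 2]
p90_cost = samples[int(0.9 * len(samples))]
prob_under_budget = sum(s < 700.0 for s in samples) / len(samples)
```

The output is a distribution over the metric rather than a point design, so a decision maker can read off the probability of meeting a target, which is the "chances of program success" framing in the abstract.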
Bone fracture healing in mechanobiological modeling: A review of principles and methods.
Ghiasi, Mohammad S; Chen, Jason; Vaziri, Ashkan; Rodriguez, Edward K; Nazarian, Ara
2017-06-01
Bone fracture is a very common body injury. The healing process is physiologically complex, involving both biological and mechanical aspects. Following a fracture, cell migration, cell/tissue differentiation, tissue synthesis, and cytokine and growth factor release occur, regulated by the mechanical environment. Over the past decade, bone healing simulation and modeling has been employed to understand its details and mechanisms, to investigate specific clinical questions, and to design healing strategies. The goal of this effort is to review the history and the most recent work in bone healing simulations with an emphasis on both biological and mechanical properties. Therefore, we provide a brief review of the biology of bone fracture repair, followed by an outline of the key growth factors and mechanical factors influencing it. We then compare different methodologies of bone healing simulation, including conceptual modeling (qualitative modeling of bone healing to understand the general mechanisms), biological modeling (considering only the biological factors and processes), and mechanobiological modeling (considering both biological aspects and mechanical environment). Finally we evaluate different components and clinical applications of bone healing simulation such as mechanical stimuli, phases of bone healing, and angiogenesis.
Modeling radionuclide migration from underground nuclear explosions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harp, Dylan Robert; Stauffer, Philip H.; Viswanathan, Hari S.
2017-03-06
The travel time of radionuclide gases to the ground surface in fractured rock depends on many complex factors. Numerical simulators are the most complete repositories of knowledge of the complex processes governing radionuclide gas migration to the ground surface, allowing us to verify conceptualizations of physical processes against observations and to forecast radionuclide gas travel times to the ground surface and isotopic ratios.
Water management simulation games and the construction of knowledge
NASA Astrophysics Data System (ADS)
Rusca, M.; Heun, J.; Schwartz, K.
2012-03-01
In recent years simulations have become an important part of teaching activities. The reasons behind the popularity of simulation games are twofold. On the one hand, emerging theories on how people learn have called for an experienced-based learning approach. On the other hand, the demand for water management professionals has changed. Three important developments are having considerable consequences for water management programmes, which educate and train these professionals. These developments are the increasing emphasis on integration in water management, the characteristics and speed of reforms in the public sector and the shifting state-society relations in many countries. In response to these developments, demand from the labour market is oriented toward water professionals who need to have both a specialist in-depth knowledge in their own field, as well as the ability to understand and interact with other disciplines and interests. In this context, skills in negotiating, consensus building and working in teams are considered essential for all professionals. In this paper we argue that simulation games have an important role to play in (actively) educating students and training the new generation of water professionals to respond to the above-mentioned challenges. At the same time, simulations are not a panacea for learners and teachers. Challenges of using simulations games include the demands it places on the teacher. Setting up the simulation game, facilitating the delivery and ensuring that learning objectives are achieved requires considerable knowledge and experience as well as considerable time-inputs of the teacher. Moreover, simulation games usually incorporate a case-based learning model, which may neglect or underemphasize theories and conceptualization. For simulations to be effective they have to be embedded in this larger theoretical and conceptual framework. Simulations, therefore, complement rather than substitute traditional teaching methods.
Water management simulation games and the construction of knowledge
NASA Astrophysics Data System (ADS)
Rusca, M.; Heun, J.; Schwartz, K.
2012-08-01
In recent years, simulations have become an important part of teaching activities. The reasons behind the popularity of simulation games are twofold. On the one hand, emerging theories on how people learn have called for an experienced-based learning approach. On the other hand, the demand for water management professionals has changed. Three important developments are having considerable consequences for water management programmes, which educate and train these professionals. These developments are the increasing emphasis on integration in water management, the characteristics and speed of reforms in the public sector and the shifting state-society relations in many countries. In response to these developments, demand from the labour market is oriented toward water professionals who need to have both a specialist in-depth knowledge in their own field, as well as the ability to understand and interact with other disciplines and interests. In this context, skills in negotiating, consensus building and working in teams are considered essential for all professionals. In this paper, we argue that simulation games have an important role to play in (actively) educating students and training the new generation of water professionals to respond to the above-mentioned challenges. At the same time, simulations are not a panacea for learners and teachers. Challenges of using simulation games include the demands it places on the teacher. Setting up the simulation game, facilitating the delivery and ensuring that learning objectives are achieved require considerable knowledge and experience as well as considerable time-inputs of the teacher. Moreover, simulation games usually incorporate a case-based learning model, which may neglect or underemphasize theories and conceptualizations. For simulations to be effective, they have to be embedded in this larger theoretical and conceptual framework. 
Simulations, therefore, complement rather than substitute traditional teaching methods.
Overview of a simple model describing variation of dissolved organic carbon in an upland catchment
Boyer, Elizabeth W.; Hornberger, George M.; Bencala, Kenneth E.; McKnight, Diane M.
1996-01-01
Hydrological mechanisms controlling the variation of dissolved organic carbon (DOC) were investigated in the Deer Creek catchment located near Montezuma, CO. Patterns of DOC in streamflow suggested that increased flows through the upper soil horizon during snowmelt are responsible for flushing this DOC-enriched interstitial water to the streams. We examined possible hydrological mechanisms to explain the observed variability of DOC in Deer Creek by first simulating the hydrological response of the catchment using TOPMODEL and then routing the predicted flows through a simple model that accounted for temporal changes in DOC. Conceptually, the DOC model can be taken to represent a terrestrial (soil) reservoir in which DOC builds up during low flow periods and is flushed out when infiltrating meltwaters cause the water table to rise into this “reservoir”. Concentrations of DOC measured in the upper soil and in streamflow were compared to model simulations. The simulated DOC response provides a reasonable reproduction of the observed dynamics of DOC in the stream at Deer Creek.
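The build-up/flush concept can be sketched as a single terrestrial reservoir whose DOC export scales with both stored mass and soil-zone flow. The rate rule and all parameter values below are hypothetical stand-ins, not the authors' calibrated model:

```python
def doc_flushing_series(q_soil, k_build=1.0, k_flush=0.05, m0=10.0):
    """Simulate stream DOC load from one terrestrial reservoir:
    mass accumulates at rate k_build each step (e.g. daily) and is
    flushed in proportion to stored mass times soil-zone flow."""
    mass, loads = m0, []
    for q in q_soil:
        flushed = min(k_flush * mass * q, mass)  # cannot export more than stored
        mass = mass - flushed + k_build
        loads.append(flushed)
    return loads

# Low flows let DOC build up; a snowmelt pulse then flushes it out.
flows = [0.1] * 30 + [2.0] * 5 + [0.1] * 10
loads = doc_flushing_series(flows)
```

Driving such a reservoir with TOPMODEL-predicted soil-zone flows, as the paper does, reproduces the characteristic early-melt DOC peak followed by depletion.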
Developing a discrete event simulation model for university student shuttle buses
NASA Astrophysics Data System (ADS)
Zulkepli, Jafri; Khalid, Ruzelan; Nawawi, Mohd Kamal Mohd; Hamid, Muhammad Hafizan
2017-11-01
Providing shuttle buses for university students to attend their classes is crucial, especially when the number of students is large and the distances between their classes and residential halls are long. These factors, in addition to the non-optimal current bus services, typically require the students to wait longer, which eventually gives them cause to complain. To considerably reduce the waiting time, it is thus important to provide the optimal number of buses to transport students from location to location and effective route schedules that fulfil the students' demand in the relevant time ranges. The optimal bus number and schedules are to be determined and tested using a flexible decision platform. This paper thus models the current services of student shuttle buses in a university using a Discrete Event Simulation approach. The model can flexibly simulate whatever changes are configured to the current system and report their effects on the performance measures. How the model was conceptualized and formulated for future system configurations is the main interest of this paper.
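A minimal discrete-event sketch of the queueing idea follows. The real model's stops, routes, and demand patterns are not given in the abstract; the headway, capacity, and arrival times here are illustrative only:

```python
def simulate_shuttle(arrival_times, bus_headway, bus_capacity):
    """Students join a queue at given times (sorted, in minutes); a bus
    departs every bus_headway minutes and boards up to bus_capacity
    waiting students, first come first served.  Returns each boarded
    student's waiting time, the key performance measure."""
    waits, queue = [], []
    t, i = bus_headway, 0
    while i < len(arrival_times) or queue:
        # enqueue everyone who arrived before this bus departure
        while i < len(arrival_times) and arrival_times[i] <= t:
            queue.append(arrival_times[i])
            i += 1
        for arr in queue[:bus_capacity]:
            waits.append(t - arr)        # board; record waiting time
        queue = queue[bus_capacity:]     # the rest wait for the next bus
        t += bus_headway
    return waits

waits = simulate_shuttle([0, 1, 2, 14, 16], bus_headway=10, bus_capacity=2)
```

Changing `bus_headway` or `bus_capacity` and re-running is exactly the kind of "what-if" configuration test the paper's simulation platform is meant to support.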
Reilly, Thomas E.; Plummer, Niel; Phillips, Patrick J.; Busenberg, Eurybiades
1994-01-01
Measurements of the concentrations of chlorofluorocarbons (CFCs), tritium, and other environmental tracers can be used to calculate recharge ages of shallow groundwater and estimate rates of groundwater movement. Numerical simulation also provides quantitative estimates of flow rates, flow paths, and mixing properties of the groundwater system. The environmental tracer techniques and the hydraulic analyses each contribute to the understanding and quantification of the flow of shallow groundwater. However, when combined, the two methods provide feedback that improves the quantification of the flow system and provides insight into the processes that are the most uncertain. A case study near Locust Grove, Maryland, is used to investigate the utility of combining groundwater age dating, based on CFCs and tritium, and hydraulic analyses using numerical simulation techniques. The results of the feedback between an advective transport model and the estimates of groundwater ages determined by the CFCs improve a quantitative description of the system by refining the system conceptualization and estimating system parameters. The plausible system developed with this feedback between the advective flow model and the CFC ages is further tested using a solute transport simulation to reproduce the observed tritium distribution in the groundwater. The solute transport simulation corroborates the plausible system developed and also indicates that, for the system under investigation with the data obtained from 0.9-m-long (3-foot-long) well screens, the hydrodynamic dispersion is negligible. Together the two methods enable a coherent explanation of the flow paths and rates of movement while indicating weaknesses in the understanding of the system that will require future data collection and conceptual refinement of the groundwater system.
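The piston-flow "apparent age" idea behind CFC dating can be sketched as a lookup against the atmospheric history: find when the atmosphere last matched the water's equilibrium partial pressure. The curve below is a crude invented stand-in for the real atmospheric CFC-12 record, for illustration only:

```python
def cfc_recharge_year(measured_patm, atmos_history):
    """Piston-flow apparent recharge year from a measured CFC partial
    pressure: linearly interpolate within the monotonically rising
    atmospheric record to find the matching year."""
    years = sorted(atmos_history)
    for y0, y1 in zip(years, years[1:]):
        p0, p1 = atmos_history[y0], atmos_history[y1]
        if p0 <= measured_patm <= p1:
            return y0 + (y1 - y0) * (measured_patm - p0) / (p1 - p0)
    return None  # outside the recorded range; no apparent age

# Hypothetical, monotonically rising CFC-12 mixing ratios (pptv)
history = {1950: 10.0, 1960: 40.0, 1970: 120.0, 1980: 300.0, 1990: 480.0}
age_year = cfc_recharge_year(210.0, history)
```

In the case study, ages obtained this way are compared against travel times from the advective transport model, and disagreements feed back into the system conceptualization.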
An analogue conceptual rainfall-runoff model for educational purposes
NASA Astrophysics Data System (ADS)
Herrnegger, Mathew; Riedl, Michael; Schulz, Karsten
2016-04-01
Conceptual rainfall-runoff models, in which runoff processes are modelled with a series of connected linear and non-linear reservoirs, remain widely applied tools in science and practice. Additionally, the concept is appreciated in teaching due to its relative simplicity in explaining and exploring hydrological processes of catchments. However, when a series of reservoirs is used, the model system becomes highly parametrized and complex, and the model results become harder to trace and explain to an audience not accustomed to numerical modelling. Since the simulations are normally performed by digital code that is not visible, the results are also not easily comprehensible. This contribution therefore presents a liquid analogue model, in which a conceptual rainfall-runoff model is reproduced by a physical model. This consists of different acrylic glass containers representing different storage components within a catchment, e.g. soil water or groundwater storage. The containers are equipped and connected with pipes, in which water movement represents different flow processes, e.g. surface runoff, percolation or base flow. Water from a storage container is pumped to the upper part of the model and represents effective rainfall input. The water then flows by gravity through the different pipes and storages. Valves are used for controlling the flows within the analogue model, comparable to the parameterization procedure in numerical models. Additionally, an inexpensive microcontroller-based board and sensors are used to measure storage water levels, with online visualization of the states as time series data, building a bridge between the analogue and digital world. The ability to physically witness the different flows and water levels in the storages makes the analogue model attractive to the audience.
Hands-on experiments can be performed with students, in which different scenarios or catchment types can be simulated, not only with the analogue model but also in parallel with the digital model, thereby connecting the real world with science. The effects of different parameterization setups, which are important not only in the hydrological sciences, can be shown in a tangible way. The use of the analogue model in the context of "children meet University" events seems an attractive approach for showing a younger audience the basic ideas of catchment modelling concepts, which would otherwise not be possible.
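A digital counterpart of the analogue cascade might look as follows. This is a sketch under the assumption of purely linear reservoirs in series; the outflow coefficients `ks` (the digital analogue of the valve settings) are illustrative, not the authors' configuration:

```python
def linear_reservoir_cascade(rain, ks=(0.5, 0.2, 0.05)):
    """Route effective rainfall through a chain of linear reservoirs.
    Each storage drains as Q = k * S and spills into the next one down,
    mimicking the acrylic containers connected by pipes; the returned
    series is the outflow of the last (groundwater-like) storage."""
    storages = [0.0] * len(ks)
    outflow = []
    for p in rain:
        inflow = p
        for j, k in enumerate(ks):
            storages[j] += inflow
            q = k * storages[j]
            storages[j] -= q
            inflow = q           # percolates into the next storage down
        outflow.append(inflow)   # base flow leaving the last storage
    return outflow

# A single rainfall pulse produces a delayed, attenuated hydrograph.
hydrograph = linear_reservoir_cascade([10.0] + [0.0] * 9)
```

Running this next to the physical model, as the contribution proposes, lets students see the same rise-and-recession behaviour in both worlds and relate valve openings to the coefficients `k`.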
Development and verification of an agent-based model of opinion leadership.
Anderson, Christine A; Titler, Marita G
2014-09-27
The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model performed consistently with the posited relationships in the underlying model. Nurse opinion leaders act on the strength of their beliefs and, as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists.
The development and testing of agent-based models is an iterative process. The opinion leader model presented here provides a basic structure for continued model development, ongoing verification, and the establishment of validation procedures, including empirical data collection.
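The seek/provide dynamic described above can be caricatured in a few lines. The attributes, rules, and parameters here are invented for illustration; the published model is considerably richer:

```python
import random

def run_opinion_sim(n_agents=30, steps=200, seed=1):
    """Toy opinion-leadership loop: each step one agent may be
    uncertain and consults a colleague chosen with probability
    proportional to credibility.  Consultation counts reveal whether
    stable 'opinion resources' emerge over time."""
    rng = random.Random(seed)
    credibility = [rng.random() for _ in range(n_agents)]
    consultations = [0] * n_agents
    for _ in range(steps):
        seeker = rng.randrange(n_agents)
        if rng.random() < 0.5:          # this agent is uncertain this step
            weights = credibility[:]
            weights[seeker] = 0.0       # cannot consult oneself
            total = sum(weights)
            r, acc = rng.random() * total, 0.0
            for j, w in enumerate(weights):
                acc += w
                if acc >= r:
                    consultations[j] += 1
                    break
    return credibility, consultations

cred, consults = run_opinion_sim()
```

A verification sweep in the paper's spirit would vary `n_agents`, the uncertainty rate, and the credibility distribution, then check that consultation counts track credibility as the conceptual model posits.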
Validation of a DICE Simulation Against a Discrete Event Simulation Implemented Entirely in Code.
Möller, Jörgen; Davis, Sarah; Stevenson, Matt; Caro, J Jaime
2017-10-01
Modeling is an essential tool for health technology assessment, and various techniques for conceptualizing and implementing such models have been described. Recently, a new method has been proposed, the discretely integrated condition event (DICE) simulation, which enables frequently employed approaches to be specified using a common, simple structure that can be entirely contained and executed within widely available spreadsheet software. To assess whether a DICE simulation provides equivalent results to an existing discrete event simulation, a comparison was undertaken. A model of osteoporosis and its management, programmed entirely in Visual Basic for Applications and made public by the National Institute for Health and Care Excellence (NICE) Decision Support Unit, was downloaded and used to guide construction of its DICE version in Microsoft Excel®. The DICE model was then run using the same inputs and settings, and the results were compared. The DICE version produced results that are nearly identical to the original ones, with differences that would not affect the decision direction of the incremental cost-effectiveness ratios (<1% discrepancy), despite the stochastic nature of the models. The main limitation of the simple DICE version is its slow execution speed. DICE simulation did not alter the results and, thus, should provide a valid way to design and implement decision-analytic models without requiring specialized software or custom programming. Additional efforts need to be made to speed up execution.
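The core DICE notion of a model as a set of conditions updated by discretely integrated events can be sketched minimally. This is an assumed toy structure with invented event names and values, not the NICE osteoporosis model:

```python
def run_dice(conditions, events, horizon):
    """Minimal condition/event loop: `conditions` is the model state
    (a dict of values that persist between events), and `events` is a
    list of (time, update_function) pairs executed in time order up to
    the horizon."""
    for t, update in sorted(events, key=lambda e: e[0]):
        if t > horizon:
            break
        update(conditions, t)
    return conditions

state = {"alive": True, "cost": 0.0, "qaly": 0.0}
events = [
    (0.0, lambda c, t: c.update(cost=c["cost"] + 100.0)),  # start treatment
    (2.0, lambda c, t: c.update(qaly=c["qaly"] + 1.8)),    # accrue QALYs
    (5.0, lambda c, t: c.update(alive=False)),             # death event
]
final = run_dice(state, events, horizon=10.0)
```

Because the whole mechanism is a table of times and updates, it maps naturally onto spreadsheet rows, which is the portability argument the paper makes.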
Solar Eclipse Effect on Shelter Air Temperature
NASA Technical Reports Server (NTRS)
Segal, M.; Turner, R. W.; Prusa, J.; Bitzer, R. J.; Finley, S. V.
1996-01-01
Decreases in shelter temperature during eclipse events were quantified on the basis of observations, numerical model simulations, and complementary conceptual evaluations. Observations for the annular eclipse on 10 May 1994 over the United States are presented, and these provide insights into the temporal and spatial changes in the shelter temperature. The observations indicated near-surface temperature drops of as much as 6 °C. Numerical model simulations for this eclipse event, which provide a complementary evaluation of the spatial and temporal patterns of the temperature drops, predict similar decreases. Interrelationships between the temperature drop, degree of solar irradiance reduction, and timing of the peak eclipse are also evaluated for late spring, summer, and winter sun conditions. These simulations suggest that for total eclipses the drops in shelter temperature in midlatitudes can be as high as 7 °C for a spring morning eclipse.
Roux-Rouquié, Magali; Caritey, Nicolas; Gaubert, Laurent; Rosenthal-Sabroux, Camille
2004-07-01
One of the main issues in Systems Biology is to deal with semantic data integration. Previously, we examined the requirements for a reference conceptual model to guide semantic integration based on systemic principles. In the present paper, we examine the usefulness of the Unified Modelling Language (UML) to describe and specify biological systems and processes. This yields unambiguous representations of biological systems, suitable for translation into mathematical and computational formalisms, enabling analysis, simulation and prediction of these systems' behaviours.
ERIC Educational Resources Information Center
What Works Clearinghouse, 2014
2014-01-01
The 2014 study, "Conceptualizing Astronomical Scale: Virtual Simulations on Handheld Tablet Computers Reverse Misconceptions," examined the effects of using the true-to-scale (TTS) display mode versus the orrery display mode in the iPad's Solar Walk software application on students' knowledge of the Earth's place in the solar system. The…
Machine learning of fault characteristics from rocket engine simulation data
NASA Technical Reports Server (NTRS)
Ke, Min; Ali, Moonis
1990-01-01
Transformation of data into knowledge through conceptual induction has been the focus of the research described in this paper. We have developed a Machine Learning System (MLS) to analyze rocket engine simulation data. MLS provides its users with fault analyses, fault characteristics, conceptual descriptions of faults, and the relationships among attributes and sensors. All of these results are critically important in identifying faults.
UNSAT-H Version 2. 0: Unsaturated soil water and heat flow model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fayer, M.J.; Jones, T.L.
1990-04-01
This report documents UNSAT-H Version 2.0, a model for calculating water and heat flow in unsaturated media. The documentation includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plant transpiration, and the code listing. Waste management practices at the Hanford Site have included disposal of low-level wastes by near-surface burial. Predicting the future long-term performance of any such burial site in terms of migration of contaminants requires a model capable of simulating water flow in the unsaturated soils above the buried waste. The model currently used to meet this need is UNSAT-H. This model was developed at Pacific Northwest Laboratory to assess water dynamics of near-surface, waste-disposal sites at the Hanford Site. The code is primarily used to predict deep drainage as a function of such environmental conditions as climate, soil type, and vegetation. UNSAT-H is also used to simulate the effects of various practices to enhance isolation of wastes. 66 refs., 29 figs., 7 tabs.
Capacity of clinical pathways--a strategic multi-level evaluation tool.
Cardoen, Brecht; Demeulemeester, Erik
2008-12-01
In this paper we strategically evaluate the efficiency of clinical pathways and their complex interdependencies with respect to joint resource usage and patient throughput. We propose a discrete-event simulation approach that allows for the simultaneous evaluation of multiple clinical pathways and the inherent uncertainty (resource, duration and arrival) that accompanies medical processes. Both the consultation suite and the surgery suite may be modeled and examined in detail by means of sensitivity or scenario analyses. Since each medical facility can essentially be represented as a combination of clinical pathways (i.e., facilities are conceptually similar), the simulation model is generic in nature. Next to the formulation of the model, we illustrate its applicability by means of a case study conducted in a Belgian hospital.
NASA Astrophysics Data System (ADS)
Nikurashin, Maxim; Gunn, Andrew
2017-04-01
The meridional overturning circulation (MOC) is a planetary-scale oceanic flow which is of direct importance to the climate system: it transports heat meridionally and regulates the exchange of CO2 with the atmosphere. The MOC is forced by wind, heat and freshwater fluxes at the surface, and turbulent mixing in the ocean interior. A number of conceptual theories for the sensitivity of the MOC to changes in forcing have recently been developed and tested with idealized numerical models. However, how well these simple conceptual theories describe the MOC simulated with higher-complexity global models remains largely unknown. In this study, we present a systematic comparison of theoretical and modelled sensitivity of the MOC and associated deep ocean stratification to vertical mixing and southern hemisphere westerlies. The results show that theories that simplify the ocean into a single-basin, zonally symmetric box are generally in good agreement with a realistic, global ocean circulation model. Some disagreement occurs in the abyssal ocean, where complex bottom topography is not taken into account by simple theories. Distinct regimes, where the MOC has a different sensitivity to wind or mixing, as predicted by simple theories, are also clearly shown by the global ocean model. The sensitivity of the Indo-Pacific, Atlantic, and global basins is analysed separately to validate the conceptual understanding of the upper and lower overturning cells in the theory.
Wind-driven variations in an overturning circulation
NASA Astrophysics Data System (ADS)
Bringedal, Carina; Eldevik, Tor; Spall, Michael
2017-04-01
The Atlantic overturning circulation and poleward heat transport is balanced by northern heat loss to the atmosphere and corresponding water mass transformation. The structure of this circulation and transformation is particularly manifested - and observed - at the Greenland-Scotland ridge. There is however a rich variability in the exchanges across the ridge on seasonal and yearly time scales. This variability has been almost perfectly captured in atmospherically forced ocean GCMs (e.g. Olsen et al. 2008, Sandø et al. 2012), suggesting that on shorter time scales the variability of the exchanges is connected to sea level pressure and corresponding wind stress forcing. Focusing on seasonal and yearly time scales, we accordingly propose that the connection between the exchanges of overturning waters across the Greenland-Scotland ridge and the sea level pressure must be direct and simple, and we use idealized simulations to support this hypothesis. The mechanisms underlying the connection are formulated through conceptual models. Although the models and simulations are simplified with respect to bathymetry and hydrography, they reproduce the main features of the overturning circulation in the Nordic seas. In the observations, the variable exchanges can largely be related to sea level pressure variations and large-scale wind patterns, and the idealized simulations and accompanying conceptual models show how these impacts can manifest via coastal downwelling and gyre circulation. S. M. Olsen, B. Hansen, D. Quadfasel and S. Østerhus, Observed and modelled stability of overflow across the Greenland-Scotland ridge, Nature 455 (2008). A. B. Sandø, J. E. Ø. Nilsen, T. Eldevik and M. Bentsen, Mechanisms for variable North Atlantic-Nordic seas exchanges, Journal of Geophysical Research 117 (2012).
Using models to manage systems subject to sustainability indicators
Hill, M.C.
2006-01-01
Mathematical and numerical models can provide insight into sustainability indicators using relevant simulated quantities, which are referred to here as predictions. To be useful, several concerns need to be addressed. Four are discussed here: (a) the mathematical and numerical accuracy of the model; (b) the accuracy of the data used in model development; (c) the information observations provide about aspects of the model important to predictions of interest, as measured using sensitivity analysis; and (d) the existence of plausible alternative models for a given system. The four issues are illustrated using examples from conservative and transport modelling, and using conceptual arguments. Results suggest that ignoring these issues can produce misleading conclusions.
Empirical tools for simulating salinity in the estuaries in Everglades National Park, Florida
NASA Astrophysics Data System (ADS)
Marshall, F. E.; Smith, D. T.; Nickerson, D. M.
2011-12-01
Salinity in a shallow estuary is affected by upland freshwater inputs (surface runoff, stream/canal flows, groundwater), atmospheric processes (precipitation, evaporation), marine connectivity, and wind patterns. In Everglades National Park (ENP) in South Florida, the unique Everglades ecosystem exists as an interconnected system of fresh, brackish, and salt water marshes, mangroves, and open water. For this effort a coastal aquifer conceptual model of the Everglades hydrologic system was used with traditional correlation and regression hydrologic techniques to create a series of multiple linear regression (MLR) salinity models from observed hydrologic, marine, and weather data. The 37 ENP MLR salinity models cover most of the estuarine areas of ENP and produce daily salinity simulations that are capable of estimating 65-80% of the daily variability in salinity depending upon the model. The Root Mean Squared Error is typically about 2-4 salinity units, and there is little bias in the predictions. However, the absolute error of a model prediction in the nearshore embayments and the mangrove zone of Florida Bay may be relatively large for a particular daily simulation during the seasonal transitions. Comparisons show that the models group regionally by similar independent variables and salinity regimes. The MLR salinity models have approximately the same expected range of simulation accuracy and error as higher spatial resolution salinity models.
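A multiple linear regression of the kind used for the MLR salinity models can be sketched with ordinary least squares on synthetic data. The predictor names (upstream stage, coastal sea level), the coefficients, and the noise level below are illustrative assumptions, not values from the study.

```python
import random

def ols(X, y):
    """Solve the normal equations (X^T X) beta = X^T y by Gaussian
    elimination with partial pivoting. Each row of X starts with a 1
    for the intercept."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)] for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Synthetic daily data: salinity responds to upstream stage and coastal
# sea level (hypothetical predictors, not the study's variables).
rng = random.Random(0)
rows, obs = [], []
for _ in range(500):
    stage, sea = rng.uniform(0, 3), rng.uniform(-0.5, 0.5)
    rows.append([1.0, stage, sea])
    obs.append(28.0 - 6.0 * stage + 9.0 * sea + rng.gauss(0, 1.5))
beta = ols(rows, obs)
rmse = (sum((obs[i] - sum(c * x for c, x in zip(beta, rows[i]))) ** 2
            for i in range(len(obs))) / len(obs)) ** 0.5
```

With enough observations the fitted coefficients recover the assumed relationship, and the RMSE approaches the noise level, analogous to the 2-4 salinity-unit errors reported for the ENP models.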
The Oceanographic Multipurpose Software Environment (OMUSE v1.0)
NASA Astrophysics Data System (ADS)
Pelupessy, Inti; van Werkhoven, Ben; van Elteren, Arjen; Viebahn, Jan; Candy, Adam; Portegies Zwart, Simon; Dijkstra, Henk
2017-08-01
In this paper we present the Oceanographic Multipurpose Software Environment (OMUSE). OMUSE aims to provide a homogeneous environment for existing or newly developed numerical ocean simulation codes, simplifying their use and deployment. In this way, numerical experiments that combine ocean models representing different physics or spanning different ranges of physical scales can be easily designed. Rapid development of simulation models is made possible through the creation of simple high-level scripts. The low-level core of the abstraction in OMUSE is designed to deploy these simulations efficiently on heterogeneous high-performance computing resources. Cross-verification of simulation models with different codes and numerical methods is facilitated by the unified interface that OMUSE provides. Reproducibility in numerical experiments is fostered by allowing complex numerical experiments to be expressed in portable scripts that conform to a common OMUSE interface. Here, we present the design of OMUSE as well as the modules and model components currently included, which range from a simple conceptual quasi-geostrophic solver to the global circulation model POP (Parallel Ocean Program). The uniform access to the codes' simulation state and the extensive automation of data transfer and conversion operations aids the implementation of model couplings. We discuss the types of couplings that can be implemented using OMUSE. We also present example applications that demonstrate the straightforward model initialization and the concurrent use of data analysis tools on a running model. We give examples of multiscale and multiphysics simulations by embedding a regional ocean model into a global ocean model and by coupling a surface wave propagation model with a coastal circulation model.
Morgan, David S.; Hinkle, Stephen R.; Weick, Rodney J.
2007-01-01
This report presents the results of a study by the U.S. Geological Survey, done in cooperation with the Oregon Department of Environmental Quality and Deschutes County, to develop a better understanding of the effects of nitrogen from on-site wastewater disposal systems on the quality of ground water near La Pine in southern Deschutes County and northern Klamath County, Oregon. Simulation models were used to test the conceptual understanding of the system and were coupled with optimization methods to develop the Nitrate Loading Management Model, a decision-support tool that can be used to efficiently evaluate alternative approaches for managing nitrate loading from on-site wastewater systems. The conceptual model of the system is based on geologic, hydrologic, and geochemical data collected for this study, as well as previous hydrogeologic and water quality studies and field testing of on-site wastewater systems in the area by other agencies. On-site wastewater systems are the only significant source of anthropogenic nitrogen to shallow ground water in the study area. Between 1960 and 2005 estimated nitrate loading from on-site wastewater systems increased from 3,900 to 91,000 pounds of nitrogen per year. When all remaining lots are developed (in 2019 at current building rates), nitrate loading is projected to reach nearly 150,000 pounds of nitrogen per year. Low recharge rates (2-3 inches per year) and ground-water flow velocities generally have limited the extent of nitrate occurrence to discrete plumes within 20-30 feet of the water table; however, hydraulic-gradient and age data indicate that, given sufficient time and additional loading, nitrate will migrate to depths where many domestic wells currently obtain water. In 2000, nitrate concentrations greater than 4 milligrams nitrogen per liter (mg N/L) were detected in 10 percent of domestic wells sampled by Oregon Department of Environmental Quality. 
Numerical simulation models were constructed at transect (2.4 square miles) and study-area (247 square miles) scales to test the conceptual model and evaluate processes controlling nitrate concentrations in ground water and potential ground-water discharge of nitrate to streams. Simulation of water-quality conditions for a projected future build-out (base) scenario in which all existing lots are developed using conventional on-site wastewater systems indicates that, at equilibrium, average nitrate concentrations near the water table will exceed 10 mg N/L over areas totaling 9,400 acres. Other scenarios were simulated where future nitrate loading was reduced using advanced treatment on-site systems and a development transfer program. Seven other scenarios were simulated with total nitrate loading reductions ranging from 15 to 94 percent; simulated reductions in the area where average nitrate concentrations near the water table exceed 10 mg N/L range from 22 to 99 percent at equilibrium. Simulations also show that the ground-water system responds slowly to changes in nitrate loading due to low recharge rates and ground-water flow velocity. Consequently, reductions in nitrate loading will not immediately reduce average nitrate concentrations and the average concentration in the aquifer will continue to increase for 25-50 years depending on the level and timing of loading reduction. The capacity of the ground-water system to receive on-site wastewater system effluent, which is related to the density of homes, presence of upgradient residential development, ground-water recharge rate, ground-water flow velocity, and thickness of the oxic part of the aquifer, varies within the study area. Optimization capability was added to the study-area simulation model and the combined simulation-optimization model was used to evaluate alternative approaches to management of nitrate loading from on-site wastewater systems to the shallow alluvial aquifer. 
The Nitrate Loading Management Model (NLMM) was formulated to find the minimum red
NASA Astrophysics Data System (ADS)
Hong, Eun-Mi; Park, Yongeun; Muirhead, Richard; Pachepsky, Yakov
2017-04-01
Pathogenic microorganisms in recreational and irrigation waters remain a subject of concern. Water quality models are used to estimate the microbial quality of water sources, to evaluate microbial contamination-related risks, to guide microbial water quality monitoring, and to evaluate the effect of agricultural management on microbial water quality. The Agricultural Policy/Environmental eXtender (APEX) is a watershed-scale water quality model that includes a highly detailed representation of agricultural management. APEX currently does not have microbial fate and transport simulation capabilities. The objective of this work was to develop the first APEX microbial fate and transport module that could use the APEX conceptual model of manure removal together with recently introduced conceptualizations of in-stream microbial fate and transport. The module utilizes manure erosion rates found in APEX. The total number of removed bacteria was computed from the concentration of bacteria in the soil-manure mixing layer and the amount of eroded manure. Bacteria survival in the soil-manure mixing layer was simulated with a two-stage survival model. Individual survival patterns were simulated for each manure application date. Simulated in-stream microbial fate and transport processes included the reach-scale passive release of bacteria with resuspended bottom sediment during high-flow events, the transport of bacteria from bottom sediment due to hyporheic exchange during low-flow periods, deposition with settling sediment, and two-stage survival. Default parameter values were available from recently published databases. The APEX model with the newly developed microbial fate and transport module was applied to simulate seven years of monitoring data for the Toenepi watershed in New Zealand. The stream network of the watershed runs through grazing lands with daily bovine waste deposition.
Based on calibration and testing results, the APEX with the microbe module reproduced well the monitored pattern of E. coli concentrations at the watershed outlet. The APEX with the microbial fate and transport module will be utilized for predicting microbial quality of water under various agricultural practices (grazing, cropping, and manure application), evaluating monitoring protocols, and supporting the selection of management practices based on regulations that rely on fecal indicator bacteria concentrations. Future development should include modeling contributions of wildlife, manure weathering, and weather effects on manure-borne microorganism survival and release.
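Two-stage (biphasic) survival models are commonly written as the sum of a fast-decaying and a persistent subpopulation; a sketch under that assumption follows. The functional form and all parameter values are illustrative, and the exact APEX module formulation may differ.

```python
import math

def two_stage_survival(c0, t, f_fast=0.9, k_fast=0.4, k_slow=0.05):
    """Biphasic die-off of bacteria: a fast-decaying subpopulation
    (fraction f_fast, first-order rate k_fast per day) plus a more
    persistent one (rate k_slow per day). Values are illustrative."""
    return c0 * (f_fast * math.exp(-k_fast * t) +
                 (1.0 - f_fast) * math.exp(-k_slow * t))
```

Tracking one such curve per manure application date, as the module does, then gives the total surviving population as the sum over application dates.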
Hydrophobicity within the three-dimensional Mercedes-Benz model: potential of mean force.
Dias, Cristiano L; Hynninen, Teemu; Ala-Nissila, Tapio; Foster, Adam S; Karttunen, Mikko
2011-02-14
We use the three-dimensional Mercedes-Benz model for water and Monte Carlo simulations to study the structure and thermodynamics of the hydrophobic interaction. Radial distribution functions are used to classify different cases of the interaction, namely, contact configurations, solvent separated configurations, and desolvation configurations. The temperature dependence of these cases is shown to be in qualitative agreement with atomistic models of water. In particular, while the energy for the formation of contact configurations is favored by entropy, its strengthening with increasing temperature is accounted for by enthalpy. This is consistent with our simulated heat capacity. An important feature of the model is that it can be used to account for well-converged thermodynamics quantities, e.g., the heat capacity of transfer. Microscopic mechanisms for the temperature dependence of the hydrophobic interaction are discussed at the molecular level based on the conceptual simplicity of the model.
NASA Technical Reports Server (NTRS)
Sinacori, J. B.
1980-01-01
A conceptual design of a visual system for a rotorcraft flight simulator is presented, along with drive logic elements for a coupled motion base for such a simulator. The design is the result of an assessment of many potential arrangements of electro-optical elements and is a concept considered feasible for the application. The motion drive elements represent an example logic for a coupled motion base and are essentially an appeal to the designers of such logic to combine their washout and braking functions.
Flow Simulation of N2B Hybrid Wing Body Configuration
NASA Technical Reports Server (NTRS)
Kim, Hyoungjin; Liou, Meng-Sing
2012-01-01
The N2B hybrid wing body aircraft was conceptually designed to meet the environmental and performance goals for the N+2 generation transport set by the Subsonic Fixed Wing project. In this study, the flow field around the N2B configuration is simulated with a Reynolds-averaged Navier-Stokes flow solver on unstructured meshes. Boundary conditions at the engine fan face and nozzle exhaust planes are provided by response surfaces of the NPSS thermodynamic engine cycle model. The present flow simulations reveal challenging design issues arising from the boundary-layer-ingesting offset inlet and nacelle-airframe interference. The N2B configuration can be a good test bed for the application of multidisciplinary design optimization technology.
NASA Technical Reports Server (NTRS)
1989-01-01
The results of the refined conceptual design phase (task 5) of the Simulation Computer System (SCS) study are reported. The SCS is the computational portion of the Payload Training Complex (PTC), providing simulation-based training on payload operations of the Space Station Freedom (SSF). In task 4 of the SCS study, the range of architectures suitable for the SCS was explored. Identified system architectures, along with their relative advantages and disadvantages for the SCS, were presented in the Conceptual Design Report. Six integrated designs, combining the most promising features from the architectural formulations, were additionally identified in the report. The six integrated designs were evaluated further to distinguish the more viable designs to be refined as conceptual designs. The three designs that were selected represent distinct approaches to achieving a capable and cost-effective SCS configuration for the PTC. Here, the results of task 4 (input to this task) are briefly reviewed. Then, prior to describing individual conceptual designs, the PTC facility configuration and the SSF systems architecture that must be supported by the SCS are reviewed. Next, basic features of SCS implementation that have been incorporated into all selected SCS designs are considered. The details of the individual SCS designs are then presented before a final comparison of the three designs is made.
The Martian climate: Energy balance models with CO2/H2O atmospheres
NASA Technical Reports Server (NTRS)
Hoffert, M. I.
1984-01-01
Progress in the development of a multi-reservoir, time dependent energy balance climate model for Mars driven by prescribed insolation at the top of the atmosphere is reported. The first approximately half-year of the program was devoted to assembling and testing components of the full model. Specific accomplishments were made on a longwave radiation code, coupling seasonal solar input to a ground temperature simulation, and conceptualizing an approach to modeling the seasonal pressure waves that develop in the Martian atmosphere as a result of sublimation and condensation of CO2 in polar regions.
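The ground-temperature component of such an energy balance model can be illustrated with a zero-dimensional sketch in which an effective heat capacity times the temperature tendency balances absorbed insolation against thermal emission. The Mars-like parameter values below are illustrative assumptions, not values from the report.

```python
def step_ebm(T, S=590.0, albedo=0.25, emissivity=0.9, C=8.0e5, dt=86400.0):
    """One explicit Euler step of C dT/dt = S/4*(1-albedo) - emissivity*sigma*T^4.
    S is a Mars-like mean insolation (W m^-2); C is an effective ground heat
    capacity (J m^-2 K^-1). All values are illustrative."""
    sigma = 5.670e-8  # Stefan-Boltzmann constant (W m^-2 K^-4)
    absorbed = S / 4.0 * (1.0 - albedo)
    emitted = emissivity * sigma * T ** 4
    return T + dt * (absorbed - emitted) / C

T = 150.0                 # cold initial ground temperature (K)
for _ in range(2000):     # march forward ~2000 days
    T = step_ebm(T)
# T relaxes toward the radiative-equilibrium temperature
```

A time-dependent multi-reservoir model of the kind described adds prescribed seasonal insolation and exchange terms between atmosphere, ground, and polar caps on top of this balance.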
NASA Astrophysics Data System (ADS)
Wissing, Dennis Robert
The purpose of this research was to explore undergraduates' conceptual development for oxygen transport and utilization, as a component of a cardiopulmonary physiology and advanced respiratory care course in the allied health program. This exploration focused on the students' development of knowledge and the presence of alternative conceptions prior to, during, and after completing cardiopulmonary physiology and advanced respiratory care courses. Using the simulation program SimBioSys™ (Samsel, 1994), student-participants completed a series of laboratory exercises focusing on cardiopulmonary disease states. This study examined data gathered from: (1) a novice group receiving the simulation program prior to instruction, (2) a novice group that experienced the simulation program following course completion in cardiopulmonary physiology, and (3) an intermediate group who experienced the simulation program following completion of formal education in Respiratory Care. This research was based on the theory of Human Constructivism as described by Mintzes, Wandersee, and Novak (1997). Data-gathering techniques were based on theories supported by Novak (1984), Wandersee (1997), and Chi (1997). Data were generated by exams, interviews, verbal analysis (Chi, 1997), and concept mapping. Results suggest that simulation may be an effective instructional method for assessing conceptual development and diagnosing alternative conceptions in undergraduates enrolled in a cardiopulmonary science program. Use of simulation in conjunction with clinical interviews and concept mapping may assist in verifying gaps in learning and conceptual knowledge. This study found only limited evidence to support the use of computer simulation prior to lecture to augment learning. However, it was demonstrated that students' prelecture experience with the computer simulation helped the instructor assess what the learner knew so he or she could be taught accordingly.
In addition, use of computer simulation after formal instruction was shown to be useful in aiding students identified by the instructor as needing remediation.
Architectural Framework for Addressing Legacy Waste from the Cold War - 13611
DOE Office of Scientific and Technical Information (OSTI.GOV)
Love, Gregory A.; Glazner, Christopher G.; Steckley, Sam
We present an architectural framework for the use of a hybrid simulation model of enterprise-wide operations used to develop system-level insight into the U.S. Department of Energy's (DOE) environmental cleanup of legacy nuclear waste at the Savannah River Site. We use this framework for quickly exploring policy and architectural options, analyzing plans, addressing management challenges and developing mitigation strategies for the DOE Office of Environmental Management (EM). The socio-technical complexity of EM's mission compels the use of a qualitative approach to complement a more quantitative discrete-event modeling effort. We use this model-based analysis to pinpoint pressure and leverage points and to develop a shared conceptual understanding of the problem space and a platform for communication among stakeholders across the enterprise in a timely manner. This approach affords the opportunity to discuss problems using a unified conceptual perspective and is also general enough that it applies to a broad range of capital investment/production operations problems. (authors)
Moving horizon estimation for assimilating H-SAF remote sensing data into the HBV hydrological model
NASA Astrophysics Data System (ADS)
Montero, Rodolfo Alvarado; Schwanenberg, Dirk; Krahe, Peter; Lisniak, Dmytro; Sensoy, Aynur; Sorman, A. Arda; Akkol, Bulut
2016-06-01
Remote sensing information has been extensively developed over the past few years, including spatially distributed data for hydrological applications at high resolution. The implementation of these products in operational flow forecasting systems is still an active field of research, wherein data assimilation plays a vital role in the improvement of initial conditions of streamflow forecasts. We present a novel implementation of a variational method based on Moving Horizon Estimation (MHE), in application to the conceptual rainfall-runoff model HBV, to simultaneously assimilate remotely sensed snow covered area (SCA), snow water equivalent (SWE), soil moisture (SM) and in situ measurements of streamflow data using large assimilation windows of up to one year. This innovative application of the MHE approach simultaneously updates precipitation, temperature, soil moisture, and the upper and lower zone water storages of the conceptual model within the assimilation window, without an explicit formulation of error covariance matrices, and it enables a highly flexible formulation of distance metrics for the agreement of simulated and observed variables. The framework is tested in two data-dense sites in Germany and one data-sparse environment in Turkey. Results show a potential improvement of the lead time performance of streamflow forecasts by using perfect time series of state variables generated by the simulation of the conceptual rainfall-runoff model itself. The framework is also tested using new operational data products from the Satellite Application Facility on Support to Operational Hydrology and Water Management (H-SAF) of EUMETSAT. This study is the first application of H-SAF products to hydrological forecasting systems and it verifies their added value. Results from assimilating H-SAF observations lead to a slight reduction of the streamflow forecast skill in all three cases compared to the assimilation of streamflow data only.
On the other hand, the forecast skill of soil moisture shows a significant improvement.
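The essence of moving horizon estimation, adjusting model states so that simulated output matches observations over a finite window, can be shown with a toy linear reservoir standing in for HBV. The model, the window length, and the brute-force candidate search are illustrative assumptions; operational MHE uses gradient-based optimization.

```python
def simulate_reservoir(s0, precip, k=0.1):
    """Linear reservoir: q_t = k * s_t, then s_{t+1} = s_t + p_t - q_t."""
    s, flows = s0, []
    for p in precip:
        q = k * s
        flows.append(q)
        s = s + p - q
    return flows

def mhe_initial_storage(precip, observed, candidates):
    """Pick the initial storage minimizing the squared mismatch between
    simulated and observed flow over the whole window -- the essence of
    a variational, window-based estimator."""
    def cost(s0):
        sim = simulate_reservoir(s0, precip)
        return sum((a - b) ** 2 for a, b in zip(sim, observed))
    return min(candidates, key=cost)

precip = [5.0, 0.0, 2.0, 8.0, 1.0, 0.0, 0.0, 3.0]
truth = simulate_reservoir(42.0, precip)  # synthetic "observations"
best = mhe_initial_storage(precip, truth, [s / 2 for s in range(0, 201)])
```

The MHE formulation in the paper generalizes this idea to several states and forcings at once, with flexible distance metrics in place of the plain squared error.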
NASA Astrophysics Data System (ADS)
Couzo, Evan A.
Several factors combine to make ozone (O3) pollution in Houston, Texas, unique when compared to other metropolitan areas. These include complex meteorology, intense clustering of industrial activity, and significant precursor emissions from the heavily urbanized eight-county area. Decades of air pollution research have borne out two different causes, or conceptual models, of O3 formation. One conceptual model describes a gradual region-wide increase in O3 concentrations "typical" of many large U.S. cities. The other conceptual model links episodic emissions of volatile organic compounds to spatially limited plumes of high O3, which lead to large hourly increases that have exceeded 100 parts per billion (ppb) per hour. These large hourly increases are known to lead to violations of the federal O3 standard and impact Houston's status as a non-attainment area. There is a need to further understand and characterize the causes of peak O3 levels in Houston and simulate them correctly so that environmental regulators can find the most cost-effective pollution controls. This work provides a detailed understanding of unusually large O3 increases in the natural and modeled environments. First, we probe regulatory model simulations and assess their ability to reproduce the observed phenomenon. As configured for the purpose of demonstrating future attainment of the O3 standard, the model fails to predict the spatially limited O3 plumes observed in Houston. Second, we combine ambient meteorological and pollutant measurement data to identify the most likely geographic origins and preconditions of the concentrated O3 plumes. We find evidence that the O3 plumes are the result of photochemical activity accelerated by industrial emissions. And, third, we implement changes to the modeled chemistry to add missing formation mechanisms of nitrous acid, which is an important radical precursor.
Radicals control the chemical reactivity of atmospheric systems, and perturbations to radical budgets can shift chemical pathways. The mechanism additions increase the concentrations of nitrous acid, especially right after sunrise. The overall effect on O3 is small (up to three ppb), but we demonstrate the successful implementation of a surface sub-model that chemically processes adsorbed compounds. To our knowledge, this is the first time that chemical processing on surfaces has been used in a three-dimensional regulatory air quality model.
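The large hourly O3 increases central to the second conceptual model can be flagged directly from a station time series. The 40 ppb/h threshold below is an illustrative choice; the text notes observed increases exceeding 100 ppb per hour.

```python
def large_hourly_increases(o3_ppb, threshold=40.0):
    """Return (hour_index, delta) pairs where O3 rose by more than
    `threshold` ppb from one hour to the next. The threshold is an
    illustrative screening value."""
    return [(i, o3_ppb[i] - o3_ppb[i - 1])
            for i in range(1, len(o3_ppb))
            if o3_ppb[i] - o3_ppb[i - 1] > threshold]

# Hypothetical hourly O3 record (ppb) containing one sharp plume passage:
series = [35, 38, 42, 44, 150, 155, 120, 80]
spikes = large_hourly_increases(series)
```

Applied across a monitoring network, such a screen separates the gradual region-wide buildup from the spatially limited plume events described above.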
NASA Astrophysics Data System (ADS)
Sivapalan, Murugesu; Ruprecht, John K.; Viney, Neil R.
1996-03-01
A long-term water balance model has been developed to predict the hydrological effects of land-use change (especially forest clearing) in small experimental catchments in the south-west of Western Australia. This small catchment model has been used as the building block for the development of a large catchment-scale model, and has also formed the basis for a coupled water and salt balance model, developed to predict the changes in stream salinity resulting from land-use and climate change. The application of the coupled salt and water balance model to predict stream salinities in two small experimental catchments, and the application of the large catchment-scale model to predict changes in water yield in a medium-sized catchment that is being mined for bauxite, are presented in Parts 2 and 3, respectively, of this series of papers. The small catchment model has been designed as a simple, robust, conceptually based model of the basic daily water balance fluxes in forested catchments. The responses of the catchment to rainfall and pan evaporation are conceptualized in terms of three interdependent subsurface stores A, B and F. Store A depicts a near-stream perched aquifer system; B represents a deeper, permanent groundwater system; and F is an intermediate, unsaturated infiltration store. The responses of these stores are characterized by a set of constitutive relations which involves a number of conceptual parameters. These parameters are estimated by calibration by comparing observed and predicted runoff. The model has performed very well in simulations carried out on Salmon and Wights, two small experimental catchments in the Collie River basin in south-west Western Australia. The results from the application of the model to these small catchments are presented in this paper.
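The three-store structure (perched store A, deep groundwater B, unsaturated store F) can be caricatured with a daily bucket model. The linear constitutive relations and all parameter values below are illustrative stand-ins for the paper's calibrated relations.

```python
def daily_step(state, rain, pet, params):
    """One day of a toy three-store balance: F is an unsaturated
    infiltration store feeding B (deep groundwater); A is a near-stream
    perched store producing quickflow. Linear relations replace the
    paper's calibrated constitutive relations; units are mm/day."""
    A, B, F = state["A"], state["B"], state["F"]
    p = params
    infil = min(rain, p["fmax"])   # infiltration into F, capacity-limited
    A += rain - infil              # infiltration excess feeds perched store A
    et = min(pet, F)               # evapotranspiration drawn from F
    F += infil - et
    perc = p["kF"] * F             # percolation F -> B
    F -= perc
    B += perc
    qa = p["kA"] * A               # quickflow from A
    qb = p["kB"] * B               # baseflow from B
    A -= qa
    B -= qb
    state.update(A=A, B=B, F=F)
    return qa + qb                 # total streamflow

params = {"fmax": 15.0, "kF": 0.05, "kA": 0.5, "kB": 0.01}
state = {"A": 0.0, "B": 100.0, "F": 20.0}
flow = [daily_step(state, r, 4.0, params) for r in [0, 20, 5, 0, 0]]
```

Calibration, as in the paper, would adjust the store parameters until simulated flow matches the observed daily runoff record.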
Reducing structural uncertainty in conceptual hydrological modeling in the semi-arid Andes
NASA Astrophysics Data System (ADS)
Hublart, P.; Ruelland, D.; Dezetter, A.; Jourde, H.
2014-10-01
The use of lumped, conceptual models in hydrological impact studies requires placing more emphasis on the uncertainty arising from deficiencies and/or ambiguities in the model structure. This study provides an opportunity to combine a multiple-hypothesis framework with a multi-criteria assessment scheme to reduce structural uncertainty in the conceptual modeling of a meso-scale Andean catchment (1515 km2) over a 30 year period (1982-2011). The modeling process was decomposed into six model-building decisions related to the following aspects of the system behavior: snow accumulation and melt, runoff generation, redistribution and delay of water fluxes, and natural storage effects. Each of these decisions was provided with a set of alternative modeling options, resulting in a total of 72 competing model structures. These structures were calibrated using the concept of Pareto optimality with three criteria pertaining to streamflow simulations and one to the seasonal dynamics of snow processes. The results were analyzed in the four-dimensional space of performance measures using a fuzzy c-means clustering technique and a differential split sample test, leading to the identification of 14 equally acceptable model hypotheses. A filtering approach was then applied to these best-performing structures in order to minimize the overall uncertainty envelope while maximizing the number of enclosed observations. This led to the retention of 8 model hypotheses as a representation of the minimum structural uncertainty that could be obtained with this modeling framework. Future work to better consider model predictive uncertainty should include a proper assessment of parameter equifinality and data errors, as well as the testing of new or refined hypotheses to allow for the use of additional auxiliary observations.
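The Pareto-optimality step at the heart of this multi-criteria calibration can be illustrated with a dominance filter over performance-measure vectors, here taking every criterion as something to minimize. The example scores are hypothetical.

```python
def pareto_front(points):
    """Return the non-dominated subset, assuming every criterion is to
    be minimized. A point is dominated if another point is no worse in
    all criteria and strictly better in at least one."""
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b)) and
                any(x < y for x, y in zip(a, b)))
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Four hypothetical model structures scored on three error criteria,
# e.g. (1 - NSE, snow-process error, volume bias):
scores = [(0.20, 0.10, 0.05),
          (0.25, 0.08, 0.04),
          (0.30, 0.20, 0.10),   # dominated by the first structure
          (0.20, 0.10, 0.05)]   # tie with the first structure
front = pareto_front(scores)
```

Competing structures surviving this filter are "equally acceptable" in the paper's sense; further screening (clustering, split-sample testing) is then needed to narrow the set.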
Reducing structural uncertainty in conceptual hydrological modelling in the semi-arid Andes
NASA Astrophysics Data System (ADS)
Hublart, P.; Ruelland, D.; Dezetter, A.; Jourde, H.
2015-05-01
The use of lumped, conceptual models in hydrological impact studies requires placing more emphasis on the uncertainty arising from deficiencies and/or ambiguities in the model structure. This study provides an opportunity to combine a multiple-hypothesis framework with a multi-criteria assessment scheme to reduce structural uncertainty in the conceptual modelling of a mesoscale Andean catchment (1515 km2) over a 30-year period (1982-2011). The modelling process was decomposed into six model-building decisions related to the following aspects of the system behaviour: snow accumulation and melt, runoff generation, redistribution and delay of water fluxes, and natural storage effects. Each of these decisions was provided with a set of alternative modelling options, resulting in a total of 72 competing model structures. These structures were calibrated using the concept of Pareto optimality with three criteria pertaining to streamflow simulations and one to the seasonal dynamics of snow processes. The results were analyzed in the four-dimensional (4-D) space of performance measures using a fuzzy c-means clustering technique and a differential split sample test, leading to the identification of 14 equally acceptable model hypotheses. A filtering approach was then applied to these best-performing structures in order to minimize the overall uncertainty envelope while maximizing the number of enclosed observations. This led to the retention of eight model hypotheses as a representation of the minimum structural uncertainty that could be obtained with this modelling framework. Future work to better consider model predictive uncertainty should include a proper assessment of parameter equifinality and data errors, as well as the testing of new or refined hypotheses to allow for the use of additional auxiliary observations.
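The Pareto-optimality screening used to whittle 72 structures down to a set of equally acceptable hypotheses can be sketched with a minimal non-dominated filter. The score vectors below are invented for illustration; the study used three streamflow criteria plus one snow criterion.

```python
# Minimal sketch of Pareto-optimal selection over multiple performance
# criteria (all to be maximized), as used to screen competing model structures.

def dominates(a, b):
    """True if vector a is at least as good as b everywhere, better somewhere."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(scores):
    """Return indices of non-dominated score vectors."""
    return [i for i, a in enumerate(scores)
            if not any(dominates(b, a) for j, b in enumerate(scores) if j != i)]

# Each row: hypothetical (flow criterion 1, 2, 3, snow criterion) for one structure.
scores = [
    (0.80, 0.90, 0.60, 0.70),
    (0.85, 0.85, 0.65, 0.60),
    (0.70, 0.80, 0.50, 0.50),  # dominated by the first structure
    (0.60, 0.95, 0.70, 0.80),
]
print(pareto_front(scores))  # → [0, 1, 3]
```

Structures off the front are discarded; the survivors form the multi-criteria-equivalent set that is then clustered and filtered further.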
Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia
Segkouli, Sofia; Tzovaras, Dimitrios; Tsakiris, Thanos; Tsolaki, Magda; Karagiannidis, Charalampos
2015-01-01
Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI) such as usability and utility through a large number of analytic, usability-oriented approaches as cognitive models in order to provide users with experiences fitting their specific needs. However, there is demand for more specific modules embodied in cognitive architecture that will detect abnormal cognitive decline across new synthetic task environments. Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort for enhancing ICT products' accessibility for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied in cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users' cognition, elaborated their ability to perform multiple tasks, and monitored the performance of infotainment-related tasks. This provided more accurate simulation results on existing conceptual frameworks and enhanced predictive validity in interface design, with increased task complexity capturing a more detailed profile of users' capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data, to be used for more reliable interface evaluation through simulation on the basis of virtual models of MCI users. PMID:26339282
Bridging the gap: enhancing interprofessional education using simulation.
Robertson, James; Bandali, Karim
2008-10-01
Simulated learning and interprofessional education (IPE) are becoming increasingly prevalent in health care curricula. As the focus shifts to patient-centred care, health professionals will need to learn with, from and about one another in real-life settings in order to facilitate teamwork and collaboration. The provision of simulated learning in an interprofessional environment helps replicate these settings, thereby providing the traditional medical education model with opportunities for growth and innovation. Learning in context is an essential psychological and cognitive aspect of education. This paper offers a conceptual analysis of the salient issues related to IPE and medical simulation. In addition, the paper argues for the integration of simulation into IPE in order to develop innovative approaches for the delivery of education and improved clinical practice that may benefit students and all members of the health care team.
NASA Astrophysics Data System (ADS)
Politikos, D.; Somarakis, S.; Tsiaras, K. P.; Giannoulaki, M.; Petihakis, G.; Machias, A.; Triantafyllou, G.
2015-11-01
A 3-D full life cycle population model for the North Aegean Sea (NAS) anchovy stock is presented. The model is two-way coupled with a hydrodynamic-biogeochemical model (POM-ERSEM). The anchovy life span is divided into seven life stages/age classes. Embryos and early larvae are passive particles, but subsequent stages exhibit active horizontal movements based on specific rules. A bioenergetics model simulates the growth in both the larval and juvenile/adult stages, while the microzooplankton and mesozooplankton fields of the biogeochemical model provide the food for fish consumption. The super-individual approach is adopted for the representation of the anchovy population. A dynamic egg production module, with an energy allocation algorithm, is embedded in the bioenergetics equation and produces eggs based on a new conceptual model for anchovy vitellogenesis. A model simulation for the period 2003-2006 with realistic initial conditions reproduced well the magnitude of population biomass and daily egg production estimated from acoustic and daily egg production method (DEPM) surveys, carried out in the NAS during June 2003-2006. Model simulated adult and egg habitats were also in good agreement with observed spatial distributions of acoustic biomass and egg abundance in June. Sensitivity simulations were performed to investigate the effect of different formulations adopted for key processes, such as reproduction and movement. The effect of the anchovy population on plankton dynamics was also investigated, by comparing simulations adopting a two-way or a one-way coupling of the fish with the biogeochemical model.
Simplified contaminant source depletion models as analogs of multiphase simulators
NASA Astrophysics Data System (ADS)
Basu, Nandita B.; Fure, Adrian D.; Jawitz, James W.
2008-04-01
Four simplified dense non-aqueous phase liquid (DNAPL) source depletion models recently introduced in the literature are evaluated for the prediction of long-term effects of source depletion under natural gradient flow. These models are simple in form (a power function equation is an example) but are shown here to serve as mathematical analogs to complex multiphase flow and transport simulators. The spill and subsequent dissolution of DNAPLs was simulated in domains having different hydrologic characteristics (variance of the log conductivity field = 0.2, 1 and 3) using the multiphase flow and transport simulator UTCHEM. The dissolution profiles were fitted using four analytical models: the equilibrium streamtube model (ESM), the advection dispersion model (ADM), the power law model (PLM) and the Damkohler number model (DaM). All four models, though very different in their conceptualization, include two basic parameters that describe the mean DNAPL mass and the joint variability in the velocity and DNAPL distributions. The variability parameter was observed to be strongly correlated with the variance of the log conductivity field in the ESM and ADM but weakly correlated in the PLM and DaM. The DaM also includes a third parameter that describes the effect of rate-limited dissolution, but here this parameter was held constant as the numerical simulations were found to be insensitive to local-scale mass transfer. All four models were able to emulate the characteristics of the dissolution profiles generated from the complex numerical simulator, but the one-parameter PLM fits were the poorest, especially for the low heterogeneity case.
Simplified contaminant source depletion models as analogs of multiphase simulators.
Basu, Nandita B; Fure, Adrian D; Jawitz, James W
2008-04-28
Four simplified dense non-aqueous phase liquid (DNAPL) source depletion models recently introduced in the literature are evaluated for the prediction of long-term effects of source depletion under natural gradient flow. These models are simple in form (a power function equation is an example) but are shown here to serve as mathematical analogs to complex multiphase flow and transport simulators. The spill and subsequent dissolution of DNAPLs was simulated in domains having different hydrologic characteristics (variance of the log conductivity field=0.2, 1 and 3) using the multiphase flow and transport simulator UTCHEM. The dissolution profiles were fitted using four analytical models: the equilibrium streamtube model (ESM), the advection dispersion model (ADM), the power law model (PLM) and the Damkohler number model (DaM). All four models, though very different in their conceptualization, include two basic parameters that describe the mean DNAPL mass and the joint variability in the velocity and DNAPL distributions. The variability parameter was observed to be strongly correlated with the variance of the log conductivity field in the ESM and ADM but weakly correlated in the PLM and DaM. The DaM also includes a third parameter that describes the effect of rate-limited dissolution, but here this parameter was held constant as the numerical simulations were found to be insensitive to local-scale mass transfer. All four models were able to emulate the characteristics of the dissolution profiles generated from the complex numerical simulator, but the one-parameter PLM fits were the poorest, especially for the low heterogeneity case.
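Of the four analytical analogs named above, the one-parameter power law model (PLM) is the simplest to sketch: flux-averaged source concentration is assumed to scale with the remaining mass fraction, C/C0 = (M/M0)^Γ. The parameter values below are illustrative, not fitted values from the study.

```python
# Hedged sketch of the power law model (PLM) for DNAPL source depletion:
# C = c0 * (M/m0)**gamma, coupled with advective mass removal dM/dt = -q*C.
# All parameter values here are made up for illustration.

def plm_depletion(m0, c0, q, gamma, dt, t_end):
    """Explicit-Euler depletion of source mass M under dM/dt = -q*C,
    with C = c0*(M/m0)**gamma. Returns a list of (t, M, C) samples."""
    t, m, out = 0.0, m0, []
    while t <= t_end and m > 0.0:
        c = c0 * (m / m0) ** gamma
        out.append((t, m, c))
        m = max(m - q * c * dt, 0.0)
        t += dt
    return out

# gamma > 1 mimics high joint variability: concentration drops quickly
# while a long low-concentration tail of mass persists.
series = plm_depletion(m0=100.0, c0=50.0, q=0.01, gamma=1.5, dt=1.0, t_end=365.0)
```

The Γ (here `gamma`) parameter plays the role of the variability parameter discussed above; fitting it to UTCHEM-style dissolution profiles is what the comparison in the paper evaluates.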
Behavior-based network management: a unique model-based approach to implementing cyber superiority
NASA Astrophysics Data System (ADS)
Seng, Jocelyn M.
2016-05-01
Behavior-Based Network Management (BBNM) is a technological and strategic approach to mastering the identification and assessment of network behavior, whether human-driven or machine-generated. Recognizing that all five U.S. Air Force (USAF) mission areas rely on the cyber domain to support, enhance and execute their tasks, BBNM is designed to elevate awareness and improve the ability to better understand the degree of reliance placed upon a digital capability and the operational risk. Thus, the objective of BBNM is to provide a holistic view of the digital battle space to better assess the effects of security, monitoring, provisioning, utilization management, allocation to support mission sustainment and change control. Leveraging advances in conceptual modeling made possible by a novel advancement in software design and implementation known as Vector Relational Data Modeling (VRDM™), the BBNM approach entails creating a network simulation in which meaning can be inferred and used to manage network behavior according to policy, such as quickly detecting and countering malicious behavior. Initial research configurations have yielded executable BBNM models as combinations of conceptualized behavior within a network management simulation that includes only concepts of threats and definitions of "good" behavior. A proof-of-concept assessment called "Lab Rat" was designed to demonstrate the simplicity of network modeling and the ability to perform adaptation. The model was tested on real-world threat data and demonstrated adaptive and inferential learning behavior. Preliminary results indicate this is a viable approach towards achieving cyber superiority in today's volatile, uncertain, complex and ambiguous (VUCA) environment.
NASA Astrophysics Data System (ADS)
Dunn, S. M.; Lilly, A.
2001-10-01
There are now many examples of hydrological models that utilise the capabilities of Geographic Information Systems to generate spatially distributed predictions of behaviour. However, the spatial variability of hydrological parameters relating to distributions of soils and vegetation can be hard to establish. In this paper, the relationship between the Hydrology of Soil Types (HOST) soil hydrological classification and the spatial parameters of a conceptual catchment-scale model is investigated. A procedure involving inverse modelling using Monte Carlo simulations on two catchments is developed to identify relative values for soil-related parameters of the DIY model. The relative values determine the internal variability of hydrological processes as a function of the soil type. For three out of the four soil parameters studied, the variability between HOST classes was found to be consistent across the two catchments when tested independently. Problems in identifying values for the fourth 'fast response distance' parameter have highlighted a potential limitation with the present structure of the model. The present assumption that this parameter can be related simply to soil type rather than topography appears to be inadequate. With the exclusion of this parameter, calibrated parameter sets from one catchment can be converted into equivalent parameter sets for the other catchment on the basis of their HOST distributions, to give a reasonable simulation of flow. Following further testing on different catchments, and modifications to the definition of the fast response distance parameter, the technique provides a methodology whereby it is possible to directly derive spatial soil parameters for new catchments.
Port-O-Sim Object Simulation Application
NASA Technical Reports Server (NTRS)
Lanzi, Raymond J.
2009-01-01
Port-O-Sim is a software application that supports engineering modeling and simulation of launch-range systems and subsystems, as well as the vehicles that operate on them. It is flexible, distributed, object-oriented, and real-time. A scripting language is used to configure an array of simulation objects and link them together. The script is contained in a text file, but executed and controlled using a graphical user interface. A set of modules is defined, each with input variables, output variables, and settings. These engineering models can be either linked to each other or run standalone, and the settings can be modified during execution. Since 2001, this application has been used for pre-mission failure-mode training for many range safety scenarios. It contains range asset link analysis, develops look-angle data, supports sky-screen site selection, drives GPS (Global Positioning System) and IMU (Inertial Measurement Unit) simulators, and can support conceptual design efforts for multiple flight programs with its capacity for rapid six-degrees-of-freedom model development. The assembly of various object types into one application makes it applicable across a wide variety of launch-range problem domains.
Goode, Daniel J.; Cravotta, Charles A.; Hornberger, Roger J.; Hewitt, Michael A.; Hughes, Robert E.; Koury, Daniel J.; Eicholtz, Lee W.
2011-01-01
This report, prepared in cooperation with the Pennsylvania Department of Environmental Protection (PaDEP), the Eastern Pennsylvania Coalition for Abandoned Mine Reclamation, and the Dauphin County Conservation District, provides estimates of water budgets and groundwater volumes stored in abandoned underground mines in the Western Middle Anthracite Coalfield, which encompasses an area of 120 square miles in eastern Pennsylvania. The estimates are based on preliminary simulations using a groundwater-flow model and an associated geographic information system that integrates data on the mining features, hydrogeology, and streamflow in the study area. The Mahanoy and Shamokin Creek Basins were the focus of the study because these basins exhibit extensive hydrologic effects and water-quality degradation from the abandoned mines in their headwaters in the Western Middle Anthracite Coalfield. Proposed groundwater withdrawals from the flooded parts of the mines and stream-channel modifications in selected areas have the potential for altering the distribution of groundwater and the interaction between the groundwater and streams in the area. Preliminary three-dimensional, steady-state simulations of groundwater flow by the use of MODFLOW are presented to summarize information on the exchange of groundwater among adjacent mines and to help guide the management of ongoing data collection, reclamation activities, and water-use planning. The conceptual model includes high-permeability mine voids that are connected vertically and horizontally within multicolliery units (MCUs). MCUs were identified on the basis of mine maps, locations of mine discharges, and groundwater levels in the mines measured by PaDEP. The locations and integrity of mine barriers were determined from mine maps and groundwater levels. The permeability of intact barriers is low, reflecting the hydraulic characteristics of unmined host rock and coal. 
A steady-state model was calibrated to measured groundwater levels and stream base flow, the latter at many locations composed primarily of discharge from mines. Automatic parameter estimation used MODFLOW-2000 with manual adjustments to constrain parameter values to realistic ranges. The calibrated model supports the conceptual model of high-permeability MCUs separated by low-permeability barriers and streamflow losses and gains associated with mine infiltration and discharge. The simulated groundwater levels illustrate low groundwater gradients within an MCU and abrupt changes in water levels between MCUs. The preliminary model results indicate that the primary result of increased pumping from the mine would be reduced discharge from the mine to streams near the pumping wells. The intact barriers limit the spatial extent of mine dewatering. Considering the simulated groundwater levels, depth of mining, and assumed bulk porosity of 11 or 40 percent for the mined seams, the water volume in storage in the mines of the Western Middle Anthracite Coalfield was estimated to range from 60 to 220 billion gallons, respectively. Details of the groundwater-level distribution and the rates of some mine discharges are not simulated well using the preliminary model. Use of the model results should be limited to evaluation of the conceptual model and its simulation using porous-media flow methods, overall water budgets for the Western Middle Anthracite Coalfield, and approximate storage volumes. Model results should not be considered accurate for detailed simulation of flow within a single MCU or individual flooded mine. Although improvements in the model calibration were possible by introducing spatial variability in permeability parameters and adjusting barrier properties, more detailed parameterizations have increased uncertainty because of the limited data set. 
The preliminary identification of data needs includes continuous streamflow, mine discharge rate, and groundwater levels in the mines and adjacent areas. Data collected whe
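The reported 60-220 billion gallon storage range follows directly from the linear dependence of stored water on assumed bulk porosity: for a fixed mined bulk volume, the 11 and 40 percent cases should differ by a factor of 40/11 ≈ 3.6. A back-of-the-envelope check (the implied bulk volume below is derived, not a figure from the report):

```python
# Sanity check on the reported storage range: stored water = bulk volume x
# porosity, so the two estimates should scale as the porosity ratio 40/11.

low_estimate_gal = 60e9                       # reported storage at 11% porosity
implied_bulk_gal = low_estimate_gal / 0.11    # mined bulk volume this implies
high_estimate_gal = implied_bulk_gal * 0.40   # same bulk volume at 40% porosity

print(round(high_estimate_gal / 1e9))  # prints 218
```

The result (~218 billion gallons) is consistent with the ~220 billion gallons reported for the 40 percent case, confirming the two figures share one bulk-volume estimate.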
Quinn, T. Alexander; Kohl, Peter
2013-01-01
Since the development of the first mathematical cardiac cell model 50 years ago, computational modelling has become an increasingly powerful tool for the analysis of data and for the integration of information related to complex cardiac behaviour. Current models build on decades of iteration between experiment and theory, representing a collective understanding of cardiac function. All models, whether computational, experimental, or conceptual, are simplified representations of reality and, like tools in a toolbox, suitable for specific applications. Their range of applicability can be explored (and expanded) by iterative combination of ‘wet’ and ‘dry’ investigation, where experimental or clinical data are used to first build and then validate computational models (allowing integration of previous findings, quantitative assessment of conceptual models, and projection across relevant spatial and temporal scales), while computational simulations are utilized for plausibility assessment, hypothesis generation, and prediction (thereby defining further experimental research targets). When implemented effectively, this combined wet/dry research approach can support the development of a more complete and cohesive understanding of integrated biological function. This review illustrates the utility of such an approach, based on recent examples of multi-scale studies of cardiac structure and mechano-electric function. PMID:23334215
NASA Astrophysics Data System (ADS)
Steinschneider, S.; Wi, S.; Brown, C. M.
2013-12-01
Flood risk management performance is investigated within the context of integrated climate and hydrologic modeling uncertainty to explore system robustness. The research question investigated is whether structural and hydrologic parameterization uncertainties are significant relative to other uncertainties such as climate change when considering water resources system performance. Two hydrologic models are considered, a conceptual, lumped parameter model that preserves the water balance and a physically-based model that preserves both water and energy balances. In the conceptual model, parameter and structural uncertainties are quantified and propagated through the analysis using a Bayesian modeling framework with an innovative error model. Mean climate changes and internal climate variability are explored using an ensemble of simulations from a stochastic weather generator. The approach presented can be used to quantify the sensitivity of flood protection adequacy to different sources of uncertainty in the climate and hydrologic system, enabling the identification of robust projects that maintain adequate performance despite the uncertainties. The method is demonstrated in a case study for the Coralville Reservoir on the Iowa River, where increased flooding over the past several decades has raised questions about potential impacts of climate change on flood protection adequacy.
Dissolved oxygen in gravity sewers--measurement and simulation.
Gudjonsson, G; Vollertsen, J; Hvitved-Jacobsen, T
2002-01-01
Dissolved oxygen (DO) concentrations were measured continuously over 2 months in an intercepting sewer. Measurements were made upstream and downstream in a 3.6 km gravity sewer. DO showed significant diurnal variations, mainly caused by changes in the organic matter composition of the wastewater. At low temperatures the gravity sewer was strictly aerobic. However, towards the end of the measuring campaign, DO concentrations decreased as temperature increased and the sewer became anaerobic for part of the day. A conceptual model that takes into account bulk-water and biofilm DO uptake as well as reaeration was used to simulate the measured DO. Using measurements from the upstream station as input, the calibrated model yielded good validation results for DO at the downstream station.
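The conceptual balance described, reaeration pushing DO toward saturation while bulk-water and biofilm respiration remove it, can be sketched as a single ODE integrated along travel time. The rate constants below are illustrative, not the calibrated values from the study.

```python
# Hedged sketch of a conceptual sewer DO balance:
#   dC/dt = k_la*(C_sat - C) - r_bulk - r_biofilm
# with C clipped at zero (anaerobic conditions). Units assumed: hours, mg/L.
# All rate constants are illustrative placeholders.

def do_profile(c0, c_sat, k_la, r_bulk, r_biofilm, dt, t_end):
    """Explicit-Euler DO concentration over travel time in a gravity sewer."""
    c, t, out = c0, 0.0, []
    while t <= t_end:
        out.append((t, c))
        dc = k_la * (c_sat - c) - r_bulk - r_biofilm  # reaeration minus uptake
        c = max(c + dc * dt, 0.0)
        t += dt
    return out

profile = do_profile(c0=6.0, c_sat=9.0, k_la=0.5, r_bulk=1.5,
                     r_biofilm=1.0, dt=0.01, t_end=20.0)
```

With these numbers DO relaxes toward the steady state C* = C_sat - (r_bulk + r_biofilm)/k_la = 4.0 mg/L; higher temperatures (larger uptake rates) would push C* to zero, reproducing the observed anaerobic periods.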
"Observing" the Circumnuclear Stars and Gas in Disk Galaxy Simulations
NASA Astrophysics Data System (ADS)
Cook, Angela; Hicks, Erin K. S.
2018-06-01
We present simulations based on theoretical models of common disk processes designed to represent potential inflow observed within the central 500 pc of local Seyfert galaxies. Mock observations of these N-body plus smoothed-particle-hydrodynamics simulations provide the conceptual framework in which to identify the driving inflow mechanism, for example nuclear bars, and to quantify the inflow based on observable properties. From these mock observations the azimuthal averages of the flux distribution, velocity dispersion, and velocity of both the stars and interstellar medium on scales of 50 pc have been measured at a range of inclination angles. A comparison of the simulated disk galaxies with these observed azimuthal averages in 40 Seyfert galaxies measured as part of the KONA (Keck OSIRIS Nearby AGN) survey will be presented.
NASA Astrophysics Data System (ADS)
Hansen, A. L.; Donnelly, C.; Refsgaard, J. C.; Karlsson, I. B.
2018-01-01
This paper describes a modeling approach proposed to simulate the impact of local-scale, spatially targeted N-mitigation measures for the Baltic Sea Basin. Spatially targeted N-regulations aim at exploiting the considerable spatial differences in the natural N-reduction taking place in groundwater and surface water. While such measures can be simulated using local-scale physically-based catchment models, use of such detailed models for the 1.8 million km2 Baltic Sea basin is not feasible due to constraints on input data and computing power. Large-scale models that are able to simulate the Baltic Sea basin, on the other hand, do not have adequate spatial resolution to simulate some of the field-scale measures. Our methodology combines knowledge and results from two local-scale physically-based MIKE SHE catchment models, the large-scale and more conceptual E-HYPE model, and auxiliary data in order to enable E-HYPE to simulate how spatially targeted regulation of agricultural practices may affect N-loads to the Baltic Sea. We conclude that the use of E-HYPE with this upscaling methodology enables the simulation of the impact on N-loads of applying a spatially targeted regulation at the Baltic Sea basin scale to the correct order-of-magnitude. The E-HYPE model together with the upscaling methodology therefore provides a sound basis for large-scale policy analysis; however, we do not expect it to be sufficiently accurate to be useful for the detailed design of local-scale measures.
First status report on regional ground-water flow modeling for the Paradox Basin, Utah
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, R.W.
1984-05-01
Regional ground-water flow within the principal hydrogeologic units of the Paradox Basin is evaluated by developing a conceptual model of the flow regime in the shallow aquifers and the deep-basin brine aquifers and testing these models using a three-dimensional, finite-difference flow code. Semiquantitative sensitivity analysis (a limited parametric study) is conducted to define the system response to changes in hydrologic properties or boundary conditions. A direct method for sensitivity analysis using an adjoint form of the flow equation is applied to the conceptualized flow regime in the Leadville limestone aquifer. All steps leading to the final results and conclusions are incorporated in this report. The available data utilized in this study are summarized. The specific conceptual models, defining the areal and vertical averaging of lithologic units, aquifer properties, fluid properties, and hydrologic boundary conditions, are described in detail. Two models were evaluated in this study: a regional model encompassing the hydrogeologic units above and below the Paradox Formation/Hermosa Group and a refined-scale model which incorporated only the post-Paradox strata. The results are delineated by the simulated potentiometric surfaces and tables summarizing areal and vertical boundary fluxes, Darcy velocities at specific points, and ground-water travel paths. Results from the adjoint sensitivity analysis include importance functions and sensitivity coefficients, using heads or the average Darcy velocities to represent system response. The reported work is the first stage of an ongoing evaluation of the Gibson Dome area within the Paradox Basin as a potential repository for high-level radioactive wastes.
NASA Astrophysics Data System (ADS)
Molisee, D. D.; Germa, A.; Charbonnier, S. J.; Connor, C.
2017-12-01
Medicine Lake Volcano (MLV) is the most voluminous of all the Cascade volcanoes (~600 km3), and has the highest eruption frequency after Mount St. Helens. Detailed mapping by USGS colleagues has shown that during the last 500,000 years MLV erupted >200 lava flows ranging from basalt to rhyolite, produced at least one ash-flow tuff, one caldera-forming event, and at least 17 scoria cones. Underlying these units are 23 additional volcanic units that are considered to be pre-MLV in age. Despite the very high likelihood of future eruptions, fewer than 60 of the 250 mapped volcanic units (MLV and pre-MLV) have been dated reliably. A robust set of eruptive ages is key to understanding the history of the MLV system and to forecasting the future behavior of the volcano. The goals of this study are to 1) obtain additional radiometric ages from stratigraphically strategic units; 2) recalculate the recurrence rate of eruptions based on an augmented set of radiometric dates; and 3) use lava flow, PDC, ash fall-out, and lahar computational simulation models to assess the potential effects of discrete volcanic hazards locally and regionally. We identify undated target units (units in key stratigraphic positions to provide maximum chronological insight) and obtain field samples for radiometric dating (40Ar/39Ar and K/Ar) and petrology. Stratigraphic and radiometric data are then used together in the Volcano Event Age Model (VEAM) to identify changes in the rate and type of volcanic eruptions through time, with statistical uncertainty. These newly obtained datasets will be added to published data to build a conceptual model of volcanic hazards at MLV. Alternative conceptual models, for example, may be that the rate of MLV lava flow eruptions is nonstationary in time and/or space and/or volume. We explore the consequences of these alternative models on forecasting future eruptions. As different styles of activity have different impacts, we estimate these potential effects using simulation.
The results of this study will improve the existing MLV hazard assessment in hopes of mitigating casualties and social impact should an eruption occur at MLV.
Payne, Dorothy F.
2010-01-01
Saltwater intrusion of the Upper Floridan aquifer has been observed in the Hilton Head area, South Carolina since the late 1970s and currently affects freshwater supply. Rising sea level in the Hilton Head Island area may contribute to the occurrence of and affect the rate of saltwater intrusion into the Upper Floridan aquifer by increasing the hydraulic gradient and by inundating an increasing area with saltwater, which may then migrate downward into geologic units that presently contain freshwater. Rising sea level may offset any beneficial results from reductions in groundwater pumpage, and thus needs to be considered in groundwater-management decisions. A variable-density groundwater flow and transport model was modified from a previously existing model to simulate the effects of sea-level rise in the Hilton Head Island area. Specifically, the model was used to (1) simulate trends of saltwater intrusion from predevelopment to the present day (1885-2004) and evaluate the conceptual model, (2) project these trends from the present day into the future based on different potential rates of sea-level change, and (3) evaluate the relative influences of pumpage and sea-level rise on saltwater intrusion. Four scenarios were simulated for 2004-2104: (1) continuation of the estimated sea-level rise rate over the last century, (2) a doubling of the sea-level rise, (3) a cessation of sea-level rise, and (4) continuation of the rate over the last century coupled with an elimination of all pumpage. Results show that, if present-day (year 2004) pumping conditions are maintained, the extent of saltwater in the Upper Floridan aquifer will increase, whether or not sea level continues to rise. Furthermore, if all pumpage is eliminated and sea level continues to rise, the simulated saltwater extent in the Upper Floridan aquifer is reduced. These results indicate that pumpage is a strong driving force for simulated saltwater intrusion, more so than sea-level rise at current rates. 
However, results must be considered in light of limitations in the model, including, but not limited to uncertainty in field data, the conceptual model, the physical properties and representation of the hydrogeologic framework, and boundary and initial conditions, as well as uncertainty in future conditions, such as the rate of sea-level rise.
The coexistence of alternative and scientific conceptions in physics
NASA Astrophysics Data System (ADS)
Ozdemir, Omer F.
The purpose of this study was to inquire about the simultaneous coexistence of alternative and scientific conceptions in the domain of physics. This study was particularly motivated by several arguments put forward in opposition to the Conceptual Change Model. In the simplest form, these arguments state that people construct different domains of knowledge and different modes of perception in different situations. Therefore, holding different conceptualizations is unavoidable and expecting a replacement in an individual's conceptual structure is not plausible in terms of instructional practices. The following research questions were generated to inquire about this argument: (1) Do individuals keep their alternative conceptions after they have acquired scientific conceptions? (2) Assuming that individuals who acquired scientific conceptions also have alternative conceptions, how are these different conceptions nested in their conceptual structure? (3) What kind of knowledge, skills, and reasoning are necessary to transfer scientific principles instead of alternative ones in the construction of a valid model? Analysis of the data collected from the non-physics group indicated that the nature of alternative conceptions is framed by two types of reasoning: reasoning by mental simulation and semiformal reasoning. Analysis of the data collected from the physics group revealed that mental images or scenes feeding reasoning by mental simulation had not disappeared after the acquisition of scientific conceptions. The analysis of data also provided enough evidence to conclude that alternative principles feeding semiformal reasoning have not necessarily disappeared after the acquisition of scientific conceptions. 
However, in regard to semiformal reasoning, compartmentalization was not as clear as the case demonstrated in reasoning by mental simulation; instead semiformal and scientific reasoning are intertwined in a way that the components of semiformal reasoning can easily take their place among the components of scientific reasoning. In spite of the fact that the coexistence of multiple conceptions might obstruct the transfer of scientific conceptions in problem-solving situations, several factors stimulating the use of scientific conceptions were noticed explicitly. These factors were categorized as follows: (a) the level of individuals' domain specific knowledge in the corresponding field, (b) the level of individuals' knowledge about the process of science (how science generates its knowledge claims), (c) the level of individuals' awareness of different types of reasoning and conceptions, and (d) the context in which the problem is situated. (Abstract shortened by UMI.)
Fate and Transport of Nanoparticles in Porous Media: A Numerical Study
NASA Astrophysics Data System (ADS)
Taghavy, Amir
Understanding the transport characteristics of NPs in natural soil systems is essential to revealing their potential impact on the food chain and groundwater. In addition, many nanotechnology-based remedial measures require effective transport of NPs through soil, which necessitates an accurate understanding of their transport and retention behavior. Based upon conceptual knowledge of the environmental behavior of NPs, mathematical models can be developed to represent the coupling of processes that govern the fate of NPs in the subsurface, serving as effective tools for risk assessment and/or design of remedial strategies. This work presents an innovative hybrid Eulerian-Lagrangian modeling technique for simulating the simultaneous reactive transport of nanoparticles (NPs) and dissolved constituents in porous media. Governing mechanisms considered in the conceptual model include particle-soil grain, particle-particle, particle-dissolved constituent, and particle-oil/water interface interactions. The main advantage of this technique, compared to conventional Eulerian models, lies in its ability to address non-uniformity in physicochemical particle characteristics. The developed numerical simulator was applied to investigate the fate and transport of NPs in a number of practical problems relevant to the subsurface environment. These problems included: (1) reductive dechlorination of chlorinated solvents by zero-valent iron nanoparticles (nZVI) in dense non-aqueous phase liquid (DNAPL) source zones; (2) reactive transport of dissolving silver nanoparticles (nAg) and the dissolved silver ions; (3) particle-particle interactions and their effects on the particle-soil grain interactions; and (4) influence of particle-oil/water interface interactions on NP transport in porous media.
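The Lagrangian half of such a hybrid scheme can be sketched in a few lines: particles advect with the pore-water velocity and leave the mobile phase with a first-order attachment probability. This is a toy illustration with made-up parameter values, not the paper's simulator, which additionally couples dissolved-phase transport and particle-particle and oil/water-interface interactions.

```python
import random

def advect_particles(n, velocity, k_att, dt, steps, seed=0):
    """March n particles along a 1-D column: mobile particles advect with the
    pore-water velocity; each step they attach to soil grains with first-order
    probability k_att*dt (toy retention model, hypothetical parameters)."""
    rng = random.Random(seed)
    positions = [0.0] * n
    attached = [False] * n
    for _ in range(steps):
        for i in range(n):
            if not attached[i]:
                positions[i] += velocity * dt
                if rng.random() < k_att * dt:
                    attached[i] = True
    return positions, attached

pos, att = advect_particles(n=1000, velocity=1.0, k_att=0.05, dt=0.1, steps=100)
frac_attached = sum(att) / len(att)  # expected near 1 - exp(-k_att*dt*steps), roughly 0.4
```

Per-particle state like this is what lets a Lagrangian scheme carry non-uniform physicochemical properties that a single Eulerian concentration field cannot.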
NASA Astrophysics Data System (ADS)
Cota, Stephen A.; Lomheim, Terrence S.; Florio, Christopher J.; Harbold, Jeffrey M.; Muto, B. Michael; Schoolar, Richard B.; Wintz, Daniel T.; Keller, Robert A.
2011-10-01
In a previous paper in this series, we described how The Aerospace Corporation's Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) tool may be used to model space and airborne imaging systems operating in the visible to near-infrared (VISNIR). PICASSO is a systems-level tool, representative of a class of such tools used throughout the remote sensing community. It is capable of modeling systems over a wide range of fidelity, anywhere from the conceptual design level (where it can serve as an integral part of the systems engineering process) to as-built hardware (where it can serve as part of the verification process). In the present paper, we extend the discussion of PICASSO to the modeling of Thermal Infrared (TIR) remote sensing systems, presenting the equations and methods necessary for modeling in that regime.
Nodal failure index approach to groundwater remediation design
Lee, J.; Reeves, H.W.; Dowding, C.H.
2008-01-01
Computer simulations often are used to design and to optimize groundwater remediation systems. We present a new computationally efficient approach that calculates the reliability of a remedial design at every location in a model domain with a single simulation. The estimated reliability and other model information are used to select the best remedial option for given site conditions, conceptual model, and available data. To evaluate design performance, we introduce the nodal failure index (NFI) to determine the number of nodal locations at which the probability of success is below the design requirement. The strength of the NFI approach is that selected areas of interest can be specified for analysis and the best remedial design determined for this target region. An example application of the NFI approach using a hypothetical model shows how the spatial distribution of reliability can be used for a decision support system in groundwater remediation design. © 2008 ASCE.
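The NFI itself is a simple count, which the following sketch illustrates on a hypothetical grid of per-node success probabilities (the values and the 0.90 requirement are made up for illustration, not taken from the paper):

```python
import numpy as np

def nodal_failure_index(p_success, target_mask, reliability_req):
    """Count nodes in the target region whose probability of success
    falls below the design reliability requirement."""
    below = (p_success < reliability_req) & target_mask
    return int(np.count_nonzero(below))

# Hypothetical 4x4 domain of per-node success probabilities
p = np.array([[0.99, 0.95, 0.80, 0.60],
              [0.98, 0.92, 0.75, 0.55],
              [0.97, 0.90, 0.85, 0.70],
              [0.96, 0.94, 0.88, 0.82]])
target = np.ones_like(p, dtype=bool)  # analyze the whole domain here
nfi = nodal_failure_index(p, target, reliability_req=0.90)
```

Restricting `target` to a subregion is what lets the comparison focus on a specific area of interest, as the abstract describes.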
NASA Astrophysics Data System (ADS)
Rates, Christopher A.; Mulvey, Bridget K.; Feldon, David F.
2016-08-01
Components of complex systems apply across multiple subject areas, and teaching these components may help students build unifying conceptual links. Students, however, often have difficulty learning these components, and limited research exists to understand what types of interventions may best help improve understanding. We investigated 32 high school students' understandings of complex systems components and whether an agent-based simulation could improve their understandings. Pretest and posttest essays were coded for changes in six components to determine whether students showed more expert thinking about the complex system of the Chesapeake Bay watershed. Results showed significant improvement for the components Emergence (r = .26, p = .03), Order (r = .37, p = .002), and Tradeoffs (r = .44, p = .001). Implications include that the experiential nature of the simulation has the potential to support conceptual change for some complex systems components, presenting a promising option for complex systems instruction.
An ice sheet model validation framework for the Greenland ice sheet.
Price, Stephen F; Hoffman, Matthew J; Bonin, Jennifer A; Howat, Ian M; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P; Evans, Katherine J; Kennedy, Joseph H; Lenaerts, Jan; Lipscomb, William H; Perego, Mauro; Salinger, Andrew G; Tuminaro, Raymond S; van den Broeke, Michiel R; Nowicki, Sophie M J
2017-01-01
We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. 
The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.
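As a minimal illustration of the kind of whole-ice-sheet metric described, a mean model-minus-observation surface elevation difference can be computed as below. The arrays are synthetic; the CmCt applies such metrics to real altimetry and gravimetry products.

```python
import numpy as np

def mean_elevation_difference(model_dem, obs_dem):
    """Mean of model-minus-observation surface elevation over valid cells;
    NaNs (e.g., cells without observations) are ignored."""
    return float(np.nanmean(model_dem - obs_dem))

# Synthetic example: modeled surface uniformly 0.5 m above observations
obs = np.full((3, 3), 1500.0)  # observed surface elevation (m)
model = obs + 0.5              # modeled surface elevation (m)
bias = mean_elevation_difference(model, obs)
```

A basin-scale version would simply apply the same reduction within a basin mask rather than over the whole grid.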
ERIC Educational Resources Information Center
Echeverria, Alejandro; Barrios, Enrique; Nussbaum, Miguel; Amestica, Matias; Leclerc, Sandra
2012-01-01
Computer simulations combined with games have been successfully used to teach conceptual physics. However, there is no clear methodology for guiding the design of these types of games. To remedy this, we propose a structured methodology for the design of conceptual physics games that explicitly integrates the principles of the intrinsic…
Sepúlveda, Nicasio
2002-01-01
A numerical model of the intermediate and Floridan aquifer systems in peninsular Florida was used to (1) test and refine the conceptual understanding of the regional ground-water flow system; (2) develop a database to support subregional ground-water flow modeling; and (3) evaluate effects of projected 2020 ground-water withdrawals on ground-water levels. The four-layer model was based on the computer code MODFLOW-96, developed by the U.S. Geological Survey. The top layer consists of specified-head cells simulating the surficial aquifer system as a source-sink layer. The second layer simulates the intermediate aquifer system in southwest Florida and the intermediate confining unit where it is present. The third and fourth layers simulate the Upper and Lower Floridan aquifers, respectively. Steady-state ground-water flow conditions were approximated for time-averaged hydrologic conditions from August 1993 through July 1994 (1993-94). This period was selected based on data from Upper Floridan aquifer wells equipped with continuous water-level recorders. The grid used for the ground-water flow model was uniform and composed of square 5,000-foot cells, with 210 columns and 300 rows.
NASA Astrophysics Data System (ADS)
Baird, M. E.; Walker, S. J.; Wallace, B. B.; Webster, I. T.; Parslow, J. S.
2003-03-01
A simple model of estuarine eutrophication is built on biomechanical (or mechanistic) descriptions of a number of the key ecological processes in estuaries. Mechanistically described processes include the nutrient uptake and light capture of planktonic and benthic autotrophs, and the encounter rates of planktonic predators and prey. Other more complex processes, such as sediment biogeochemistry, detrital processes and phosphate dynamics, are modelled using empirical descriptions from the Port Phillip Bay Environmental Study (PPBES) ecological model. A comparison is made between the mechanistically determined rates of ecological processes and the analogous empirically determined rates in the PPBES ecological model. The rates generally agree, with a few significant exceptions. Model simulations were run at a range of estuarine depths and nutrient loads, with outputs presented as the annually averaged biomass of autotrophs. The simulations followed a simple conceptual model of eutrophication, suggesting a simple biomechanical understanding of estuarine processes can provide a predictive tool for ecological processes in a wide range of estuarine ecosystems.
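A biomechanical encounter-rate term of the kind referred to can be sketched as follows. The kernel below (swept cross-section times a relative-speed term times prey density) is a generic illustrative form with made-up values, not necessarily the exact formulation used in the paper:

```python
import math

def encounter_rate(radius, u_pred, v_prey, prey_density):
    """Encounters per predator per unit time: swept cross-section (pi*R^2)
    times a relative-speed kernel times prey number density (generic form)."""
    rel_speed = math.sqrt(u_pred ** 2 + v_prey ** 2)
    return math.pi * radius ** 2 * rel_speed * prey_density

# Hypothetical plankton values: 1 mm encounter radius, speeds in m/s, prey per m^3
rate = encounter_rate(radius=1e-3, u_pred=1e-3, v_prey=5e-4, prey_density=1e4)
```

The appeal of such mechanistic terms is that every factor is a measurable organism or environment property, in contrast to the empirically fitted rates of the PPBES model they are compared against.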
NASA Astrophysics Data System (ADS)
Congdon, R. D.
2012-12-01
There is frequently a need in land management agencies for a quick and easy method for estimating hydrogeologic conditions in a watershed for which there is very little subsurface information. Setting up a finite difference or finite element model takes valuable time that often is not available when decisions need to be made quickly. An analytic element model (AEM), GFLOW in this case, may enable the investigator to produce a preliminary steady-state model for a watershed and to easily evaluate variants of the conceptual model. Use of preexisting data, such as stream gage data or USGS reports, makes the job much easier. Solutions to analytic element models are obtained within seconds. The Eagle Creek watershed in central New Mexico is a site of local water supply issues in an area of volcanic and plutonic rocks. Parameters estimated by groundwater consultants and the USGS, and discharge data from three USGS stream gages, were used to set up the steady-state analytical model (GFLOW). Matching gage records with line-sink fluxes facilitated conceptualization of local groundwater flow and quick analysis of the effects of steady water supply pumping on Eagle Creek. Because of steep topography and limited access, a water supply well is located in the stream channel, within 20 meters of the creek, and it would be useful to evaluate the effects of the well on stream flow. A USGS report (SIR 2010-5205) revealed a section of Eagle Creek with a high vertical conductivity, which results in flow loss of up to 34 l/s (including flow to the water table and flow into alluvium) when the well is pumped and the water table is lowered below the channel bottom. The water supply well was simulated with a steady-state well pumping at the average and maximum rates of 12 l/s and 31 l/s. The initial simulation shows that pumping at these rates results in stream flow losses of 19% and 51%, respectively.
The simulation was conducted with average flow conditions, and this information will be important in planning for management during periods of drought as well as times of more normal precipitation, since water uses must be balanced with the needs of the existing ecosystem. Alternatives, such as low-conductivity blocks between stream channels and different volumetric and geographic pumping scenarios, may also be readily explored in an AEM. Exporting these scenarios into MODFLOW simulations will enable us to evaluate transient and cyclical pumping effects on the surface waters for each AEM conceptualization, as well as to simulate seasonal recharge. However, in many cases the use of MODFLOW may not be necessary, if the AEM proves sufficient to answer the relevant questions. Symbiotic use of GFLOW and MODFLOW will be an invaluable aid in evaluating groundwater and its uses in National Forest watersheds, especially in cases when time is a critical factor in informed decision-making.
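The reported depletion percentages are mutually consistent with a simple mass-balance reading: if pumping is supplied almost entirely by stream capture, the loss fraction is roughly the pumping rate divided by mean streamflow. The ~62 l/s mean flow below is inferred from the stated percentages, not given in the abstract, and this back-of-envelope check is not the GFLOW computation itself:

```python
def depletion_fraction(pump_rate_ls, streamflow_ls):
    """Fraction of stream flow lost if pumping is wholly supplied by
    stream capture (simple steady-state mass balance)."""
    return pump_rate_ls / streamflow_ls

MEAN_FLOW_LS = 62.0  # l/s; inferred from the reported percentages (assumption)
loss_avg = depletion_fraction(12.0, MEAN_FLOW_LS)  # roughly 0.19, matching the 19%
loss_max = depletion_fraction(31.0, MEAN_FLOW_LS)  # roughly 0.50, close to the 51%
```

That both reported percentages back out nearly the same streamflow suggests the simulated well captures almost all of its discharge from the creek.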
NASA Astrophysics Data System (ADS)
Wichmann, Volker
2017-09-01
The Gravitational Process Path (GPP) model can be used to simulate the process path and run-out area of gravitational processes based on a digital terrain model (DTM). The conceptual model combines several components (process path, run-out length, sink filling and material deposition) to simulate the movement of a mass point from an initiation site to the deposition area. For each component several modeling approaches are provided, which makes the tool configurable for different processes such as rockfall, debris flows or snow avalanches. The tool can be applied to regional-scale studies such as natural hazard susceptibility mapping but also contains components for scenario-based modeling of single events. Both the modeling approaches and precursor implementations of the tool have proven their applicability in numerous studies, also including geomorphological research questions such as the delineation of sediment cascades or the study of process connectivity. This is the first open-source implementation, completely re-written, extended and improved in many ways. The tool has been committed to the main repository of the System for Automated Geoscientific Analyses (SAGA) and thus will be available with every SAGA release.
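The process-path component can be illustrated with a minimal steepest-descent walk on a toy DTM. The actual GPP tool offers several configurable, partly stochastic path and run-out models, so this is only a schematic:

```python
import numpy as np

def steepest_descent_path(dem, start):
    """Follow the steepest downhill neighbor from `start` until a sink
    (no lower neighbor) is reached; returns the list of visited cells."""
    path = [start]
    r, c = start
    nrows, ncols = dem.shape
    while True:
        best, best_z = None, dem[r, c]
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if (dr, dc) != (0, 0) and 0 <= rr < nrows and 0 <= cc < ncols:
                    if dem[rr, cc] < best_z:
                        best, best_z = (rr, cc), dem[rr, cc]
        if best is None:
            return path  # sink: the deposition area in GPP terms
        r, c = best
        path.append(best)

# Tiny synthetic slope dipping toward the lower-right corner
dem = np.array([[9.0, 8.0, 7.0],
                [8.0, 6.0, 5.0],
                [7.0, 5.0, 3.0]])
path = steepest_descent_path(dem, (0, 0))
```

In the full model this path component is combined with run-out length, sink-filling, and deposition rules, which is what makes the same framework configurable for rockfall, debris flows, or snow avalanches.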
Multi Modal Anticipation in Fuzzy Space
NASA Astrophysics Data System (ADS)
Asproth, Viveca; Holmberg, Stig C.; Håkansson, Anita
2006-06-01
We are all stakeholders in the geographical space, which makes up our common living and activity space. This means that careful, creative, and anticipatory planning, design, and management of that space will be of paramount importance for our sustained life on earth. Here it is shown that the quality of such planning could be significantly increased with the help of a computer-based modelling and simulation tool. Further, the design and implementation of such a tool ought to be guided by the conceptual integration of some core concepts like anticipation and retardation, multi modal system modelling, fuzzy space modelling, and multi actor interaction.
Aubertot, Jean-Noël; Robin, Marie-Hélène
2013-01-01
The limitation of damage caused by pests (plant pathogens, weeds, and animal pests) in any agricultural crop requires integrated management strategies. Although significant efforts have been made to i) develop, and to a lesser extent ii) combine genetic, biological, cultural, physical and chemical control methods in Integrated Pest Management (IPM) strategies (vertical integration), there is a need for tools to help manage Injury Profiles (horizontal integration). Farmers design cropping systems according to their goals, knowledge, cognition and perception of socio-economic and technological drivers as well as their physical, biological, and chemical environment. In return, a given cropping system, in a given production situation will exhibit a unique injury profile, defined as a dynamic vector of the main injuries affecting the crop. This simple description of agroecosystems has been used to develop IPSIM (Injury Profile SIMulator), a modelling framework to predict injury profiles as a function of cropping practices, abiotic and biotic environment. Due to the tremendous complexity of agroecosystems, a simple holistic aggregative approach was chosen instead of attempting to couple detailed models. This paper describes the conceptual bases of IPSIM, an aggregative hierarchical framework and a method to help specify IPSIM for a given crop. A companion paper presents a proof of concept of the proposed approach for a single disease of a major crop (eyespot on wheat). In the future, IPSIM could be used as a tool to help design ex-ante IPM strategies at the field scale if coupled with a damage sub-model, and a multicriteria sub-model that assesses the social, environmental, and economic performances of simulated agroecosystems. In addition, IPSIM could also be used to help make diagnoses on commercial fields. It is important to point out that the presented concepts are not crop- or pest-specific and that IPSIM can be used on any crop. PMID:24019908
Hancock, G R; Verdon-Kidd, D; Lowry, J B C
2017-12-01
Landscape Evolution Modelling (LEM) technologies provide a means by which it is possible to simulate the long-term geomorphic stability of a conceptual rehabilitated landform. However, simulations rarely consider the potential effects of anthropogenic climate change and consequently risk not accounting for the range of rainfall variability that might be expected in both the near and far future. One issue is that high-resolution (both spatial and temporal) rainfall projections incorporating the potential effects of greenhouse forcing are required as input. However, projections of rainfall change are still highly uncertain for many regions, particularly at sub-annual/seasonal scales. This is the case for northern Australia, where a decrease or an increase in rainfall post-2030 is considered equally likely based on climate model simulations. The aim of this study is therefore to investigate a spatial analogue approach to develop point-scale hourly rainfall scenarios to be used as input to the CAESAR-Lisflood LEM to test the sensitivity of the geomorphic stability of a conceptual rehabilitated landform to potential changes in climate. Importantly, the scenarios incorporate the range of projected potential increase/decrease in rainfall for northern Australia and capture the expected envelope of erosion rates and erosion patterns (i.e. where erosion and deposition occur) over a 100-year modelled period. We show that all rainfall scenarios produce sediment output and gullying greater than that of the surrounding natural system; however, a 'wetter' future climate produces the highest output. Importantly, incorporating analogue rainfall scenarios into LEM has the capacity to both improve landform design and enhance the modelling software. Further, the method can be easily transferred to other sites (both nationally and internationally) where rainfall variability is significant and climate change impacts are uncertain.
An Agent-Based Model of New Venture Creation: Conceptual Design for Simulating Entrepreneurship
NASA Technical Reports Server (NTRS)
Provance, Mike; Collins, Andrew; Carayannis, Elias
2012-01-01
There is a growing debate over the means by which regions can foster the growth of entrepreneurial activity in order to stimulate recovery and growth of their economies. On one side, agglomeration theory suggests that regions grow because of strong clusters that foster knowledge spillover locally; on the other side, the entrepreneurial action camp argues that innovative business models are generated by entrepreneurs with unique market perspectives who draw on knowledge from more distant domains. We present the design for a novel agent-based model of new venture creation that will demonstrate the relationship between agglomeration and action. The primary focus of this model is information exchange as the medium for these agent interactions. Our modeling and simulation study proposes to reveal interesting relationships in these perspectives, offer a foundation on which these disparate theories from economics and sociology can find common ground, and expand the use of agent-based modeling into entrepreneurship research.
One-dimensional GIS-based model compared with a two-dimensional model in urban floods simulation.
Lhomme, J; Bouvier, C; Mignot, E; Paquier, A
2006-01-01
A GIS-based one-dimensional flood simulation model is presented and applied to the centre of the city of Nîmes (Gard, France), for mapping flow depths or velocities in the street network. The geometry of the one-dimensional elements is derived from the Digital Elevation Model (DEM). The flow is routed from one element to the next using the kinematic wave approximation. At the crossroads, the flows in the downstream branches are computed using a conceptual scheme. This scheme was previously designed to fit Y-shaped pipe junctions, and has been modified here to fit X-shaped crossroads. The results were compared with the results of a two-dimensional hydrodynamic model based on the full shallow water equations. The comparison shows that good agreement can be found in the steepest streets of the study zone, but differences may be important in the other streets. Some reasons that can explain the differences between the two models are given and some research possibilities are proposed.
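In its simplest form, kinematic-wave routing of a hydrograph along one street element reduces to pure advection with a travel-time lag, as in this sketch (illustrative only; the model solves the kinematic wave on DEM-derived element geometry and adds the crossroad flow-splitting scheme):

```python
def route_kinematic(inflow, celerity, dx, dt):
    """Pure-advection routing of a discharge series (m^3/s) along one street
    element: the hydrograph is delayed by the travel time dx/celerity,
    rounded to whole time steps (toy sketch of the kinematic limit)."""
    lag_steps = round(dx / celerity / dt)
    return [0.0] * lag_steps + list(inflow)

hydro = [0.0, 2.0, 5.0, 3.0, 1.0]  # inflow hydrograph at the element inlet
routed = route_kinematic(hydro, celerity=1.0, dx=20.0, dt=10.0)  # 2-step lag
```

Because the kinematic approximation only translates the wave downslope, it cannot represent backwater effects, which is one plausible reason the 1-D results diverge from the 2-D shallow-water model in the flatter streets.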
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reed, M.; Jayko, K.; Bowles, A.
1986-10-01
A numerical model system was developed to assess quantitatively the probability that endangered bowhead and gray whales will encounter spilled oil in Alaskan waters. Bowhead and gray whale migration and diving-surfacing models, together with an oil-spill-trajectory model, comprise the system. The migration models were developed from conceptual considerations, then calibrated with and tested against observations. The distribution of animals is represented in space and time by discrete points, each of which may represent one or more whales. The movement of a whale point is governed by a random-walk algorithm which stochastically follows a migratory pathway.
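The random-walk movement of a whale point can be sketched as a heading-perturbation walk: the point drifts along a mean migratory direction while its heading is jittered stochastically. The speed and noise level below are hypothetical; the actual models were calibrated against observations.

```python
import math
import random

def step_whale(x, y, heading, speed, dt, sigma, rng):
    """One random-walk step: perturb the heading with Gaussian noise, then
    advance the whale point along the new heading (hypothetical parameters)."""
    heading += rng.gauss(0.0, sigma)
    x += speed * dt * math.cos(heading)
    y += speed * dt * math.sin(heading)
    return x, y, heading

rng = random.Random(42)
x, y, h = 0.0, 0.0, 0.0  # start at origin, mean migratory heading due east
for _ in range(100):
    x, y, h = step_whale(x, y, h, speed=1.5, dt=1.0, sigma=0.3, rng=rng)
```

Running many such points against simulated spill trajectories is what yields the encounter probabilities the system was built to estimate.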
NASA Astrophysics Data System (ADS)
Kelleher, Christa A.; Shaw, Stephen B.
2018-02-01
Recent research has found that hydrologic modeling over decadal time periods often requires time variant model parameters. Most prior work has focused on assessing time variance in model parameters conceptualizing watershed features and functions. In this paper, we assess whether adding a time variant scalar to potential evapotranspiration (PET) can be used in place of time variant parameters. Using the HBV hydrologic model and four different simple but common PET methods (Hamon, Priestley-Taylor, Oudin, and Hargreaves), we simulated 60+ years of daily discharge on four rivers in New York state. Allowing all ten model parameters to vary in time achieved good model fits in terms of daily NSE and long-term water balance. However, allowing single model parameters to vary in time - including a scalar on PET - achieved nearly equivalent model fits across PET methods. Overall, varying a PET scalar in time is likely more physically consistent with known biophysical controls on PET as compared to varying parameters conceptualizing innate watershed properties related to soil properties such as wilting point and field capacity. This work suggests that the seeming need for time variance in innate watershed parameters may be due to overly simple evapotranspiration formulations that do not account for all factors controlling evapotranspiration over long time periods.
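A time-variant PET scalar of the kind assessed is straightforward to apply. The sketch below uses one common statement of the Hamon formulation (the constants differ slightly between implementations) with an assumed scalar value; it is an illustration, not the paper's HBV configuration:

```python
import math

def hamon_pet(temp_c, daylight_hr):
    """Hamon-type PET in mm/day: saturation vapor pressure scaled by
    daylight hours (one common parameterization)."""
    esat = 0.611 * math.exp(17.3 * temp_c / (temp_c + 237.3))  # kPa
    return 29.8 * daylight_hr * esat / (temp_c + 273.2)

def scaled_pet(temp_c, daylight_hr, scalar):
    """Apply a (possibly time-variant) multiplicative scalar to PET."""
    return scalar * hamon_pet(temp_c, daylight_hr)

pet_raw = hamon_pet(20.0, 14.0)        # summer-like day, mm/day
pet_adj = scaled_pet(20.0, 14.0, 0.9)  # assumed 10% downward adjustment
```

In practice the scalar would be re-estimated per period (e.g., per decade), which is the single time-variant degree of freedom the paper contrasts with varying soil-related parameters.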
Oldenburg, Curtis M.; Freifeld, Barry M.; Pruess, Karsten; Pan, Lehua; Finsterle, Stefan; Moridis, George J.
2012-01-01
In response to the urgent need for estimates of the oil and gas flow rate from the Macondo well MC252-1 blowout, we assembled a small team and carried out oil and gas flow simulations using the TOUGH2 codes over two weeks in mid-2010. The conceptual model included the oil reservoir and the well with a top boundary condition located at the bottom of the blowout preventer. We developed a fluid properties module (Eoil) applicable to a simple two-phase and two-component oil-gas system. The flow of oil and gas was simulated using T2Well, a coupled reservoir-wellbore flow model, along with iTOUGH2 for sensitivity analysis and uncertainty quantification. The most likely oil flow rate estimated from simulations based on the data available in early June 2010 was about 100,000 bbl/d (barrels per day) with a corresponding gas flow rate of 300 MMscf/d (million standard cubic feet per day) assuming the well was open to the reservoir over 30 m of thickness. A Monte Carlo analysis of reservoir and fluid properties provided an uncertainty distribution with a long tail extending down to 60,000 bbl/d of oil (170 MMscf/d of gas). The flow rate was most strongly sensitive to reservoir permeability. Conceptual model uncertainty was also significant, particularly with regard to the length of the well that was open to the reservoir. For fluid-entry interval length of 1.5 m, the oil flow rate was about 56,000 bbl/d. Sensitivity analyses showed that flow rate was not very sensitive to pressure-drop across the blowout preventer due to the interplay between gas exsolution and oil flow rate. PMID:21730177
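The Monte Carlo step can be sketched by propagating permeability uncertainty through a linear rate-permeability relation, the dominant sensitivity reported. The lognormal spread, sample size, and base rate below are illustrative stand-ins, not the study's actual priors or its coupled wellbore-reservoir simulations:

```python
import random
import statistics

def mc_flow_rates(n, base_rate=100_000.0, sigma=0.3, seed=1):
    """Sample oil flow rates (bbl/d) assuming rate scales linearly with
    reservoir permeability and permeability is lognormal about its best
    estimate (illustrative spread, not the study's priors)."""
    rng = random.Random(seed)
    return [base_rate * rng.lognormvariate(0.0, sigma) for _ in range(n)]

rates = mc_flow_rates(5000)
median_rate = statistics.median(rates)  # clusters near the 100,000 bbl/d base case
```

A linear-in-permeability response is why the sampled rate distribution inherits the skew of the permeability prior, giving the long-tailed uncertainty distribution the abstract describes.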
A multiscale interaction model for the origin of the tropospheric QBO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goswami, B.N.
1995-03-01
A conceptual model for the origin of the tropospheric quasi-biennial oscillation (QBO) is presented. It is argued that the tropospheric QBO may not be a fundamental mode of oscillation of the tropical coupled system. It is proposed that it may arise instead from multiscale interactions between high-frequency synoptic and intraseasonal oscillations of the atmosphere and a low-frequency oscillation of the coupled system in the presence of the annual cycle. This is demonstrated using a conceptual low-order system consisting of three variables representing the nonlinear atmospheric oscillations and a linear oscillator representing the low-frequency coupled mode. The annual cycle and coupling to the low-frequency linear oscillator provide slowly varying forcings for the atmospheric high-frequency oscillations. The atmospheric oscillations pass through a chaotic regime during part of the slowly varying forcing cycle. Such variable forcing introduces a low-frequency tail in the spectrum of the atmospheric high-frequency oscillations. The low-frequency tail resonantly interacts with the low-frequency oscillation and produces the QBO, in addition to broadening the spectrum of the low-frequency oscillator. The conceptual model simulates many observed features of the tropospheric QBO, but depends on the assumption that there is an inherent low-frequency El Nino-Southern Oscillation (ENSO) mode with a four-year period that occurs independently of the high-frequency forcing or the QBO.
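The multiscale mechanism can be caricatured in a few lines of code: fast chaotic variables (here a Lorenz-type triad, an assumption; the paper's own low-order system may differ) forced by an annual cycle and weakly coupled to a slow linear oscillator with an assumed four-year period.

```python
import math

def step(state, t, dt=0.005):
    """One Euler step of a toy multiscale system: fast Lorenz-type
    variables (x, y, z) with an annually varying forcing r(t), weakly
    coupled to a slow linear oscillator (u, v). All coefficients are
    illustrative, not taken from the paper."""
    x, y, z, u, v = state
    r = 28.0 + 4.0 * math.sin(2 * math.pi * t)   # annual-cycle forcing
    omega = 2 * math.pi / 4.0                    # assumed 4-year slow mode
    dx = 10.0 * (y - x)
    dy = x * (r - z) - y + 0.1 * u               # slow-to-fast coupling
    dz = x * y - (8.0 / 3.0) * z
    du = omega * v
    dv = -omega * u + 0.01 * x                   # fast-to-slow feedback
    return (x + dt * dx, y + dt * dy, z + dt * dz,
            u + dt * du, v + dt * dv)

state, t = (1.0, 1.0, 1.0, 1.0, 0.0), 0.0
for _ in range(20000):                           # integrate 100 "years"
    state = step(state, t)
    t += 0.005

print("slow-mode amplitude:", round(math.hypot(state[3], state[4]), 3))
```

The chaotic fast variables supply broadband forcing whose low-frequency tail can excite the slow mode, which is the resonance mechanism the abstract invokes.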
NASA Astrophysics Data System (ADS)
Karlsen, R. H.; Smits, F. J. C.; Stuyfzand, P. J.; Olsthoorn, T. N.; van Breukelen, B. M.
2012-08-01
This article describes the post audit and inverse modeling of a 1-D forward reactive transport model. The model simulates the changes in water quality following artificial recharge of pre-treated water from the river Rhine in the Amsterdam Water Supply Dunes, using the PHREEQC-2 numerical code. One observation dataset is used for model calibration, and another for validation of model predictions. The total simulation time of the model is 50 years, from 1957 to 2007, with recharge composition varying on a monthly basis; the post audit is performed 26 years after the former model simulation period. The post audit revealed that the original model could reasonably predict conservative transport and kinetic redox reactions (oxygen and nitrate reduction coupled to the oxidation of soil organic carbon), but showed discrepancies in the simulation of cation exchange. Conceptualizations of the former model were inadequate to accurately simulate water quality changes controlled by cation exchange, especially concerning the breakthrough of potassium and magnesium fronts. Changes in conceptualization and model design, including the addition of five flow paths (to a total of six) and the use of parameter estimation software (PEST), resulted in a better model-to-measurement fit and system representation. No unique parameter set could be found for the model, primarily due to high parameter correlations, so an assessment of the predictive error was made using a calibration-constrained Monte Carlo method and evaluated against field observations. The predictive error was found to be low for Na+ and Ca2+, except at greater travel times, while the K+ and Mg2+ error was restricted to the exchange fronts at some of the flow paths. Optimized cation exchange coefficients were relatively high, especially for potassium, but still within the range reported in the literature.
The exchange coefficient for potassium agrees with strong fixation on illite, a main clay mineral in the area. Optimized CEC values were systematically lower than clay and organic matter contents indicated, possibly reflecting preferential flow of groundwater through the more permeable but less reactive aquifer parts. Whereas the artificial recharge initially acted as an intrusion of relatively saline water triggering Na+ for Ca2+ exchange, further increasing total hardness of the recharged water, the gradual long-term reduction in salinity of the river Rhine since the mid 1970s has shifted to an intrusion of fresher water causing Ca2+ for Na+ exchange. As a result, seasonal and longer term reversal of the initial cation exchange processes was observed adding to the general long-term reduction in total hardness of the recharged Rhine water.
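The delayed potassium and magnesium fronts can be rationalized with the textbook linear retardation factor R = 1 + (rho_b / theta) * Kd. The Kd values below are hypothetical placeholders, not the study's optimized exchange coefficients; they simply encode the strong fixation of K+ on illite reported above.

```python
def retardation(kd, bulk_density=1.6, porosity=0.35):
    """Linear retardation factor R = 1 + (rho_b / theta) * Kd, i.e. the
    number of pore volumes flushed before a sorbing front arrives."""
    return 1 + (bulk_density / porosity) * kd

# Hypothetical distribution coefficients (consistent units assumed)
kd = {"Cl-": 0.0, "Na+": 0.2, "Mg2+": 0.6, "K+": 1.5}
arrival = {ion: retardation(v) for ion, v in kd.items()}
for ion, pv in sorted(arrival.items(), key=lambda kv: kv[1]):
    print(f"{ion}: front arrives after ~{pv:.1f} pore volumes")
```

A conservative tracer (Kd = 0) arrives after one pore volume, while strongly exchanged cations lag by a factor R, which is why the K+ front is the last to break through.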
Harte, P.T.; Mack, Thomas J.
1992-01-01
Hydrogeologic data collected since 1990 were assessed and a ground-water-flow model was refined in this study of the Milford-Souhegan glacial-drift aquifer in Milford, New Hampshire. The hydrogeologic data collected were used to refine estimates of hydraulic conductivity and saturated thickness of the aquifer, which were previously calculated during 1988-90. In October 1990, water levels were measured at 124 wells and piezometers, and at 45 stream-seepage sites on the main stem of the Souhegan River and on small tributary streams overlying the aquifer, to improve understanding of ground-water-flow patterns and stream-seepage gains and losses. Refinement of the ground-water-flow model included a reduction in the number of active cells in layer 2 in the central part of the aquifer, a revision of simulated hydraulic conductivity in the model layers representing the aquifer, incorporation of a new block-centered finite-difference ground-water-flow model, and incorporation of a new solution algorithm and solver (a preconditioned conjugate-gradient algorithm). Refinements to the model resulted in decreases in the difference between calculated and measured heads at 22 wells. The distribution of gains and losses of stream seepage calculated in simulation with the refined model is similar to that calculated in the previous model simulation. The contributing area to the Savage well, under average pumping conditions, decreased by 0.021 square miles from the area calculated in the previous model simulation. The small difference in the contributing recharge area indicates that the additional data did not substantially alter model simulation and that the conceptual framework of the previous model is accurate.
Teaching quantum physics by the sum over paths approach and GeoGebra simulations
NASA Astrophysics Data System (ADS)
Malgieri, M.; Onorato, P.; De Ambrosis, A.
2014-09-01
We present a research-based teaching sequence in introductory quantum physics using the Feynman sum over paths approach. Our reconstruction avoids the historical pathway, and starts by reconsidering optics from the standpoint of the quantum nature of light, analysing both traditional and modern experiments. The core of our educational path lies in the treatment of conceptual and epistemological themes, peculiar of quantum theory, based on evidence from quantum optics, such as the single photon Mach-Zehnder and Zhou-Wang-Mandel experiments. The sequence is supported by a collection of interactive simulations, realized in the open source GeoGebra environment, which we used to assist students in learning the basics of the method, and help them explore the proposed experimental situations as modeled in the sum over paths perspective. We tested our approach in the context of a post-graduate training course for pre-service physics teachers; according to the data we collected, student teachers displayed a greatly improved understanding of conceptual issues, and acquired significant abilities in using the sum over path method for problem solving.
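The computational core of the sum over paths method is phasor addition, which the GeoGebra simulations animate. A minimal sketch for a two-path (Mach-Zehnder-like) geometry follows; the path lengths and wavelength are illustrative values, not parameters from the teaching sequence.

```python
import cmath
import math

def detection_probability(path_lengths, wavelength):
    """Sum-over-paths estimate: each path contributes a unit phasor
    exp(2*pi*i*L/lambda); the detection probability is the squared
    modulus of the mean phasor (two-path toy model, equal amplitudes)."""
    amp = sum(cmath.exp(2j * math.pi * L / wavelength) for L in path_lengths)
    amp /= len(path_lengths)
    return abs(amp) ** 2

wl = 0.5e-6                                                # 500 nm light
p_equal = detection_probability([1.0, 1.0], wl)            # equal arms
p_offset = detection_probability([1.0, 1.0 + wl / 2], wl)  # half-wave offset
print(round(p_equal, 3), round(p_offset, 3))
```

Equal arms interfere constructively (probability near 1), while a half-wavelength path difference gives destructive interference (probability near 0), reproducing the single-photon Mach-Zehnder behaviour the sequence builds on.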
The long oasis: understanding and managing saline floodplains in southeastern Australia
NASA Astrophysics Data System (ADS)
Woods, J.; Green, G.; Laattoe, T.; Purczel, C.; Riches, V.; Li, C.; Denny, M.
2017-12-01
In a semi-arid region of southeastern Australia, the River Murray is the predominant source of freshwater for town water supply, irrigation, and floodplain ecosystems. The river interacts with aquifers where the salinity routinely exceeds 18,000 mg/l. River regulation, extraction, land clearance, and irrigation have reduced the size and frequency of floods while moving more salt into the floodplain. Floodplain ecosystem health has declined. Management options to improve floodplain health under these modified conditions include environmental watering, weirpool manipulation, and groundwater pumping. To benefit long-lived tree species, floodplain management needs to increase soil moisture availability. A conceptual model was developed of floodplain processes impacting soil moisture availability. The implications and limitations of the conceptualization were investigated using a series of numerical models, each of which simulated a subset of the processes under current and managed conditions. The aim was to determine what range of behaviors the models predicted, and to identify which parameters were key to accurately predicting the success of management options. Soil moisture availability was found to depend strongly on the properties of the floodplain clay, which controls vertical recharge during inundation. Groundwater freshening near surface water features depended on the riverbed conductivity and the penetration of the river into the floodplain sediments. Evapotranspiration is another critical process, and simulations revealed the limitations of standard numerical codes in environments where both evaporation and transpiration depend on salinity. 
Finally, maintenance of viable populations of floodplain trees is conceptually understood to rely on the persistence of adequate soil moisture availability over time, but thresholds for duration of exposure to low moisture availability that lead to decline and irreversible decline in tree condition are a major knowledge gap. The work identified critical data gaps which will be addressed in monitoring guidelines to improve management. This includes: hydrogeochemical sampling; in situ soil monitoring combined with tree health observations; monitoring of actual evapotranspiration; and monitoring of bores close to surface water sources.
Virtual experiments: a new approach for improving process conceptualization in hillslope hydrology
NASA Astrophysics Data System (ADS)
Weiler, Markus; McDonnell, Jeff
2004-01-01
We present an approach for process conceptualization in hillslope hydrology. We develop and implement a series of virtual experiments, whereby the interaction between water flow pathways, source and mixing at the hillslope scale is examined within a virtual experiment framework. We define these virtual experiments as 'numerical experiments with a model driven by collective field intelligence'. The virtual experiments explore the first-order controls in hillslope hydrology, where the experimentalist and modeler work together to cooperatively develop and analyze the results. Our hillslope model for the virtual experiments (HillVi) in this paper is based on conceptualizing the water balance within the saturated and unsaturated zone in relation to soil physical properties in a spatially explicit manner at the hillslope scale. We argue that a virtual experiment model needs to be able to capture all major controls on subsurface flow processes that the experimentalist might deem important, while at the same time being simple with few 'tunable parameters'. This combination makes the approach, and the dialog between experimentalist and modeler, a useful hypothesis testing tool. HillVi simulates mass flux for different initial conditions under the same flow conditions. We analyze our results in terms of an artificial line source and isotopic hydrograph separation of water and subsurface flow. Our results for this first set of virtual experiments showed how drainable porosity and soil depth variability exert a first order control on flow and transport at the hillslope scale. We found that high drainable porosity soils resulted in a restricted water table rise, resulting in more pronounced channeling of lateral subsurface flow along the soil-bedrock interface. This in turn resulted in a more anastomosing network of tracer movement across the slope. The virtual isotope hydrograph separation showed higher proportions of event water with increasing drainable porosity. 
When combined with previous experimental findings and conceptualizations, virtual experiments can be an effective way to isolate certain controls and examine their influence over a range of rainfall and antecedent wetness conditions.
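The virtual isotope hydrograph separation mentioned above rests on standard two-component mixing. A minimal sketch with illustrative delta-18O values (not data from the paper):

```python
def event_water_fraction(c_stream, c_event, c_prestorm):
    """Two-component isotope hydrograph separation:
    f_event = (C_stream - C_pre) / (C_event - C_pre)."""
    return (c_stream - c_prestorm) / (c_event - c_prestorm)

# Illustrative delta-18O values in permil
f = event_water_fraction(c_stream=-8.0, c_event=-12.0, c_prestorm=-6.0)
print(f"event water fraction: {f:.2f}")
```

In the virtual experiments, this fraction is computed from simulated rather than field-sampled concentrations, letting drainable porosity and soil depth be varied while the separation method stays fixed.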
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Heng; Ye, Ming; Walker, Anthony P.
Hydrological models are composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty and ignore the model uncertainty in process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance; the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
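The idea behind the index (lumping model choice and parameters into one "process" factor, then applying a variance decomposition) can be sketched on a toy multiplicative output. The two process models per process below are invented stand-ins, not the study's recharge or conductivity models, and the toy's independence and multiplicative structure are assumptions that make the first-order indices factor neatly.

```python
import random
import statistics

random.seed(1)

def draw(models):
    """One process realization: first pick a process model (model
    uncertainty), then draw its random parameter (parametric uncertainty)."""
    return random.choice(models)(random.random())

recharge = [lambda p: 0.2 * p, lambda p: 0.05 + 0.1 * p]
geology = [lambda p: 1.0 + p, lambda p: 2.0 - 0.5 * p]

N = 50000
r = [draw(recharge) for _ in range(N)]
g = [draw(geology) for _ in range(N)]
y = [ri * gi for ri, gi in zip(r, g)]

# For this multiplicative toy with independent processes,
# E[Y | recharge] = r * E[g], so Var(E[Y | process]) factors neatly:
var_y = statistics.pvariance(y)
S_recharge = statistics.fmean(g) ** 2 * statistics.pvariance(r) / var_y
S_geology = statistics.fmean(r) ** 2 * statistics.pvariance(g) / var_y
print(round(S_recharge, 2), round(S_geology, 2))
```

Each index captures variance contributed by both which model represents the process and that model's parameters, which is the essential extension over parameter-only sensitivity analysis.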
Simulating motivated cognition
NASA Technical Reports Server (NTRS)
Gevarter, William B.
1991-01-01
A research effort to develop a sophisticated computer model of human behavior is described. A computer framework of motivated cognition was developed. Motivated cognition focuses on the motivations or affects that provide the context and drive in human cognition and decision making. A conceptual architecture of the human decision-making approach, from the perspective of information processing in the human brain, is developed in diagrammatic form. A preliminary version of such a diagram is presented. This architecture is then used as a vehicle for successfully constructing a computer program simulating Dweck and Leggett's findings on how individuals' implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior.
Conceptual and logical level of database modeling
NASA Astrophysics Data System (ADS)
Hunka, Frantisek; Matula, Jiri
2016-06-01
Conceptual and logical levels form the topmost levels of database modeling. Usually, ORM (Object Role Modeling) and ER diagrams are utilized to capture the corresponding schema. The final aim of business process modeling is to store its results in the form of a database solution. For this reason, value-oriented business process modeling, which utilizes ER diagrams to express the modeled entities and the relationships between them, is used. However, ER diagrams form the logical level of a database schema. To extend the possibilities of different business process modeling methodologies, the conceptual level of database modeling is needed. The paper deals with the REA value modeling approach to business process modeling using ER diagrams, and derives a conceptual model utilizing the ORM modeling approach. The conceptual model extends the possibilities of value modeling to other business modeling approaches.
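One way to make the conceptual-to-logical step concrete is to map an REA pattern (Resource, Event, Agent) onto relational tables. The schema below is a hypothetical illustration sketched with the standard-library `sqlite3` module; the table and column names are not taken from the paper.

```python
import sqlite3

# Hypothetical logical schema derived from a conceptual REA model:
# economic Events link a Resource to a providing and a receiving Agent.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE agent    (agent_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE resource (resource_id INTEGER PRIMARY KEY, kind TEXT);
CREATE TABLE event (
    event_id    INTEGER PRIMARY KEY,
    resource_id INTEGER REFERENCES resource(resource_id),
    provider_id INTEGER REFERENCES agent(agent_id),
    receiver_id INTEGER REFERENCES agent(agent_id),
    occurred_at TEXT
);
""")
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)
```

The conceptual (ORM or REA) level would state the business facts and constraints; the DDL above is one possible logical realization of them, which is precisely the level distinction the paper draws.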
Development of a 3D Stream Network and Topography for Improved Large-Scale Hydraulic Modeling
NASA Astrophysics Data System (ADS)
Saksena, S.; Dey, S.; Merwade, V.
2016-12-01
Most digital elevation models (DEMs) used for hydraulic modeling do not include channel bed elevations. As a result, the DEMs must be complemented with additional bathymetric data for accurate hydraulic simulations. Existing methods to acquire bathymetric information, through field surveys or conceptual models, are limited to reach-scale applications. With an increasing focus on large-scale hydraulic modeling of rivers, a framework to estimate and incorporate bathymetry for an entire stream network is needed. This study proposes an interpolation-based algorithm to estimate bathymetry for a stream network by modifying the reach-based empirical River Channel Morphology Model (RCMM). The effect of a 3D stream network that includes river bathymetry is then investigated by creating a 1D hydraulic model (HEC-RAS) and a 2D hydrodynamic model (Integrated Channel and Pond Routing) for the Upper Wabash River Basin in Indiana, USA. Results show improved simulation of flood depths and storage in the floodplain. Similarly, the impact of incorporating river bathymetry is more significant in the 2D model than in the 1D model.
Miller, Brian W.; Symstad, Amy J.; Frid, Leonardo; Fisichelli, Nicholas A.; Schuurman, Gregor W.
2017-01-01
Simulation models can represent complexities of the real world and serve as virtual laboratories for asking “what if…?” questions about how systems might respond to different scenarios. However, simulation models have limited relevance to real-world applications when designed without input from people who could use the simulated scenarios to inform their decisions. Here, we report on a state-and-transition simulation model of vegetation dynamics that was coupled to a scenario planning process and co-produced by researchers, resource managers, local subject-matter experts, and climate change adaptation specialists to explore potential effects of climate scenarios and management alternatives on key resources in southwest South Dakota. Input from management partners and local experts was critical for representing key vegetation types, bison and cattle grazing, exotic plants, fire, and the effects of climate change and management on rangeland productivity and composition given the paucity of published data on many of these topics. By simulating multiple land management jurisdictions, climate scenarios, and management alternatives, the model highlighted important tradeoffs between grazer density and vegetation composition, as well as between the short- and long-term costs of invasive species management. It also pointed to impactful uncertainties related to the effects of fire and grazing on vegetation. More broadly, a scenario-based approach to model co-production bracketed the uncertainty associated with climate change and ensured that the most important (and impactful) uncertainties related to resource management were addressed. 
This cooperative study demonstrates six opportunities for scientists to engage users throughout the modeling process to improve model utility and relevance: (1) identifying focal dynamics and variables, (2) developing conceptual model(s), (3) parameterizing the simulation, (4) identifying relevant climate scenarios and management alternatives, (5) evaluating and refining the simulation, and (6) interpreting the results. We also reflect on lessons learned and offer several recommendations for future co-production efforts, with the aim of advancing the pursuit of usable science.
Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.
Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E
2016-01-01
Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model, is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model.
A Physically Based Coupled Chemical and Physical Weathering Model for Simulating Soilscape Evolution
NASA Astrophysics Data System (ADS)
Willgoose, G. R.; Welivitiya, D.; Hancock, G. R.
2015-12-01
A critical missing link in existing landscape evolution models is a dynamic soil evolution model in which soils co-evolve with the landform. Work by the authors over the last decade has demonstrated a computationally manageable model for soil profile evolution (soilscape evolution) based on physical weathering. For chemical weathering, it is clear that full geochemistry models such as CrunchFlow and PHREEQC are too computationally intensive to be coupled to existing soilscape and landscape evolution models. This paper presents a simplification of CrunchFlow chemistry and physics that makes the task feasible, and generalises it for hillslope geomorphology applications. Results from this simplified model will be compared with field data for soil pedogenesis. Other researchers have previously proposed a number of very simple weathering functions (e.g. exponential, humped, reverse exponential) as conceptual models of the in-profile weathering process. The paper will show that all of these functions are possible for specific combinations of in-soil environmental, geochemical, and geologic conditions, and the presentation will outline the key variables controlling which of these conceptual models can be realistic representations of in-profile processes and under what conditions. The presentation will finish by discussing the coupling of this model with a physical weathering model, and will show sample results from our SSSPAM soilscape evolution model to illustrate the implications of including chemical weathering in soilscape evolution.
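The three conceptual depth-dependence functions named above are easy to write down explicitly. The functional forms follow common usage in the soil-production literature, but the parameter values here are illustrative only.

```python
import math

def exponential(z, w0=1.0, b=2.0):
    """Weathering rate decays with depth z below the surface."""
    return w0 * math.exp(-b * z)

def humped(z, w0=1.0, b=2.0, c=8.0):
    """Rate peaks at intermediate depth (difference of two exponentials)."""
    return w0 * (math.exp(-b * z) - math.exp(-c * z))

def reverse_exponential(z, w0=1.0, b=2.0, zmax=1.0):
    """Rate increases toward the base of the profile at z = zmax."""
    return w0 * math.exp(-b * (zmax - z))

depths = [i / 10 for i in range(11)]
peak_depth = max(depths, key=humped)
print("humped profile peaks at depth", peak_depth)
```

The paper's claim can be read as: a sufficiently general geochemical model reduces, under different environmental conditions, to each of these simple shapes as a limiting case.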
Coupled stochastic soil moisture simulation-optimization model of deficit irrigation
NASA Astrophysics Data System (ADS)
Alizadeh, Hosein; Mousavi, S. Jamshid
2013-07-01
This study presents an explicit stochastic optimization-simulation model of short-term deficit irrigation management for large-scale irrigation districts. The model, a nonlinear nonconvex program with an economic objective function, is built on an agrohydrological simulation component. The simulation component integrates (1) an explicit stochastic model of soil moisture dynamics of the crop-root zone, considering the interaction of stochastic rainfall and irrigation with shallow water table effects, (2) a conceptual root zone salt balance model, and (3) the FAO crop yield model. A Particle Swarm Optimization algorithm, linked to the simulation component, solves the resulting nonconvex program with significantly better computational performance than a Monte Carlo-based implicit stochastic optimization model. The model was tested first by applying it to single-crop irrigation problems, through which the effects of the severity of water deficit on the objective function (net benefit), root-zone water balance, and irrigation water needs were assessed. The model was then applied to the Dasht-e-Abbas and Ein-khosh Fakkeh Irrigation Districts (DAID and EFID) of the Karkheh Basin in southwestern Iran. While the maximum net benefit was obtained for a stress-avoidance (SA) irrigation policy, the highest water profitability resulted when only about 60% of the water used in the SA policy was applied. The DAID, with 33% of the total cultivated area and 37% of the total applied water, produced only 14% of the total net benefit due to low-valued crops and adverse soil and shallow water table conditions.
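The contrast between a stress-avoidance policy and deficit irrigation can be caricatured with a daily bucket model of relative root-zone moisture. All rates, triggers, and rainfall statistics below are invented for illustration, not calibrated to the Karkheh districts, and the model omits the salt balance and water table effects of the actual study.

```python
import random

def simulate_season(trigger, target=0.8, days=120, seed=7):
    """Daily root-zone bucket model (relative moisture s in [0, 1]):
    random rainfall pulses, evapotranspiration proportional to s, and
    irrigation refilling to `target` whenever s drops below `trigger`.
    Returns total irrigation depth applied (relative units)."""
    rng = random.Random(seed)          # same rainfall for every policy
    s, irrigation = 0.6, 0.0
    for _ in range(days):
        rain = 0.15 if rng.random() < 0.2 else 0.0
        s = min(1.0, s + rain)
        s = max(0.0, s - 0.05 * s)     # evapotranspiration loss
        if s < trigger:
            irrigation += target - s
            s = target
    return irrigation

full = simulate_season(trigger=0.65)   # stress-avoidance policy
deficit = simulate_season(trigger=0.40)
print(f"stress-avoidance: {full:.2f}, deficit: {deficit:.2f}")
```

A high trigger keeps moisture comfortable but consumes far more water; the study's finding is that the water saved by a lower trigger can outweigh the yield penalty in profitability terms.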
NASA Technical Reports Server (NTRS)
Chatterjee, Sharmista
1993-01-01
Our first goal in this project was to perform a systems analysis of a closed-loop Environmental Control and Life Support System (ECLSS). This pertains to the development of a model of an existing real system from which to assess the state or performance of that system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulator tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. Use is made of the second law of thermodynamics to determine the amount of irreversibility, or energy loss, of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
Geostatistical borehole image-based mapping of karst-carbonate aquifer pores
Michael Sukop,; Cunningham, Kevin J.
2016-01-01
Quantification of the character and spatial distribution of porosity in carbonate aquifers is important as input into computer models used in the calculation of intrinsic permeability and for next-generation, high-resolution groundwater flow simulations. Digital, optical, borehole-wall image data from three closely spaced boreholes in the karst-carbonate Biscayne aquifer in southeastern Florida are used in geostatistical experiments to assess the capabilities of various methods to create realistic two-dimensional models of vuggy megaporosity and matrix-porosity distribution in the limestone that composes the aquifer. When the borehole image data alone were used as the model training image, multiple-point geostatistics failed to detect the known spatial autocorrelation of vuggy megaporosity and matrix porosity among the three boreholes, which were only 10 m apart. Variogram analysis and subsequent Gaussian simulation produced results that showed a realistic conceptualization of horizontal continuity of strata dominated by vuggy megaporosity and matrix porosity among the three boreholes.
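The variogram-analysis step that succeeded here can be illustrated on a one-dimensional porosity log. The synthetic values below are invented, not Biscayne aquifer image data.

```python
def empirical_variogram(values, max_lag):
    """Empirical semivariogram along a borehole log:
    gamma(h) = mean over pairs of 0.5 * (z[i+h] - z[i])**2."""
    gammas = []
    for h in range(1, max_lag + 1):
        diffs = [values[i + h] - values[i] for i in range(len(values) - h)]
        gammas.append(sum(0.5 * d * d for d in diffs) / len(diffs))
    return gammas

# Synthetic, autocorrelated porosity log (vuggy interval in the middle)
porosity = [0.10, 0.12, 0.15, 0.30, 0.35, 0.33, 0.20, 0.12, 0.10, 0.11]
g = empirical_variogram(porosity, 4)
print([round(x, 4) for x in g])
```

For autocorrelated strata the semivariance rises with lag; fitting a model to this curve supplies the spatial-continuity structure that Gaussian simulation then reproduces between boreholes.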
A Structural Equation Model of Conceptual Change in Physics
ERIC Educational Resources Information Center
Taasoobshirazi, Gita; Sinatra, Gale M.
2011-01-01
A model of conceptual change in physics was tested on introductory-level, college physics students. Structural equation modeling was used to test hypothesized relationships among variables linked to conceptual change in physics including an approach goal orientation, need for cognition, motivation, and course grade. Conceptual change in physics…
Design of the VISITOR Tool: A Versatile ImpulSive Interplanetary Trajectory OptimizeR
NASA Technical Reports Server (NTRS)
Corpaccioli, Luca; Linskens, Harry; Komar, David R.
2014-01-01
The design of trajectories for interplanetary missions represents one of the most complex and important problems to solve during conceptual space mission design. To facilitate conceptual mission sizing activities, it is essential to obtain sufficiently accurate trajectories in a fast and repeatable manner. To this end, the VISITOR tool was developed. This tool modularly augments a patched-conic MGA-1DSM model with a mass model, launch window analysis, and the ability to simulate more realistic arrival and departure operations. This was implemented in MATLAB, exploiting the built-in optimization tools and vector analysis routines. The chosen optimization strategy uses a grid search and pattern search, an iterative variable-grid method. A genetic algorithm can be selectively used to improve search-space pruning, at the cost of losing the repeatability of the results and increased computation time. The tool was validated against seven flown missions: the average total mission ΔV offset from the nominal trajectory was 9.1%, which was reduced to 7.3% when using the genetic algorithm at the cost of an increase in computation time by a factor of 5.7. It was found that VISITOR was well-suited for the conceptual design of interplanetary trajectories, while also facilitating future improvements due to its modular structure.
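The iterative variable-grid idea can be reduced to a few lines. The sketch below is a generic coordinate pattern search on a stand-in quadratic objective, not the VISITOR MATLAB implementation; the objective is a hypothetical proxy for total mission delta-V as a function of two trajectory parameters.

```python
def pattern_search(f, x0, step=1.0, tol=1e-6):
    """Minimal coordinate pattern search: probe +/- step along each axis,
    accept any improving move, and halve the step when a full sweep
    finds no improvement (the 'iterative variable grid' idea)."""
    x, fx = list(x0), f(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = x[:]
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step /= 2
    return x, fx

# Hypothetical smooth objective standing in for total mission delta-V
f = lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.0) ** 2
x, fx = pattern_search(f, [0.0, 0.0])
print([round(v, 3) for v in x], round(fx, 6))
```

Unlike a genetic algorithm, this procedure is deterministic, which is the repeatability property the abstract highlights.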
Coupling sensing to crop models for closed-loop plant production in advanced life support systems
NASA Astrophysics Data System (ADS)
Cavazzoni, James; Ling, Peter P.
1999-01-01
We present a conceptual framework for coupling sensing to crop models for closed-loop analysis of plant production for NASA's program in advanced life support. Crop status may be monitored through non-destructive observations, while models may be independently applied to crop production planning and decision support. To achieve coupling, environmental variables and observations are linked to model inputs and outputs, and monitoring results are compared with model predictions of plant growth and development. The information thus provided may be useful in diagnosing problems with the plant growth system, or as feedback to the model for evaluation of plant scheduling and potential yield. In this paper, we demonstrate this coupling using machine vision sensing of canopy height and top projected canopy area, and the CROPGRO crop growth model. Model simulations and scenarios are used for illustration. We also compare model predictions of the machine vision variables with data from soybean experiments conducted at the New Jersey Agricultural Experiment Station Horticulture Greenhouse Facility, Rutgers University. Model simulations produce reasonable agreement with the available data, supporting our illustration.
A network-based approach for resistance transmission in bacterial populations.
Gehring, Ronette; Schumm, Phillip; Youssef, Mina; Scoglio, Caterina
2010-01-07
Horizontal transfer of mobile genetic elements (conjugation) is an important mechanism whereby resistance is spread through bacterial populations. The aim of our work is to develop a mathematical model that quantitatively describes this process, and to use this model to optimize antimicrobial dosage regimens to minimize resistance development. The bacterial population is conceptualized as a compartmental mathematical model to describe changes in susceptible, resistant, and transconjugant bacteria over time. This model is combined with a compartmental pharmacokinetic model to explore the effect of different plasma drug concentration profiles. An agent-based simulation tool is used to account for resistance transfer occurring when two bacteria are adjacent or in close proximity. In addition, a non-linear programming optimal control problem is introduced to minimize bacterial populations as well as the drug dose. Simulation and optimization results suggest that the rapid death of susceptible individuals in the population is pivotal in minimizing the number of transconjugants in a population. This supports the use of potent antimicrobials that rapidly kill susceptible individuals and development of dosage regimens that maintain effective antimicrobial drug concentrations for as long as needed to kill off the susceptible population. Suggestions are made for experiments to test the hypotheses generated by these simulations.
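The susceptible/resistant/transconjugant compartmental structure described above can be sketched as a small ODE system integrated with Euler steps. Only the bacterial compartments are shown (not the coupled pharmacokinetic or agent-based parts), and all rate constants are hypothetical placeholders, not the paper's fitted values:

```python
# Hypothetical rate constants (1/h, 1/h, mL per cell-hour); illustrative
# placeholders, not the paper's fitted values.
GROWTH, KILL, CONJ = 0.3, 1.2, 1e-9
CAP = 1e9  # carrying capacity, cells/mL

def simulate(hours, dt=0.01, s0=1e7, r0=1e3):
    """Euler-integrate susceptible (s), resistant (r) and transconjugant
    (t) densities: the drug kills s, while conjugation with r or t
    converts s into new transconjugants."""
    s, r, t = s0, r0, 0.0
    for _ in range(int(hours / dt)):
        logistic = 1.0 - (s + r + t) / CAP
        ds = GROWTH * s * logistic - KILL * s - CONJ * s * (r + t)
        dr = GROWTH * r * logistic
        dt_pop = GROWTH * t * logistic + CONJ * s * (r + t)
        s, r, t = s + ds * dt, r + dr * dt, t + dt_pop * dt
    return s, r, t

s_end, r_end, t_end = simulate(24)  # 24 h of a potent, fast-killing drug
```

With a kill rate exceeding the growth rate, the susceptible pool collapses quickly, which caps transconjugant formation: the same qualitative conclusion the simulations in the abstract support.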
Impact of the calibration period on the conceptual rainfall-runoff model parameter estimates
NASA Astrophysics Data System (ADS)
Todorovic, Andrijana; Plavsic, Jasna
2015-04-01
A conceptual rainfall-runoff model is defined by its structure and parameters, which are commonly inferred through model calibration. Parameter estimates depend on the objective function(s), the optimisation method, and the calibration period. Model calibration over different periods may result in dissimilar parameter estimates, while model efficiency decreases outside the calibration period. The problem of model (parameter) transferability, which conditions the reliability of hydrologic simulations, has been investigated for decades. In this paper, the dependence of the parameter estimates and model performance on the calibration period is analysed. The main question addressed is: are there any changes in optimised parameters and model efficiency that can be linked to changes in hydrologic or meteorological variables (flow, precipitation and temperature)? The conceptual, semi-distributed HBV-light model is calibrated over five-year periods shifted by one year (sliding time windows). The length of the calibration periods is selected to enable identification of all parameters. One water year of model warm-up precedes every simulation, which starts at the beginning of a water year. The model is calibrated using the built-in GAP optimisation algorithm. The objective function used for calibration is composed of the Nash-Sutcliffe coefficient for flows, the Nash-Sutcliffe coefficient for logarithms of flows, and the volumetric error, all of which participate in the composite objective function with approximately equal weights. The same prior parameter ranges are used in all simulations. The model is calibrated against flows observed at the Slovac stream gauge on the Kolubara River in Serbia (records from 1954 to 2013). There are no trends in precipitation or flows; however, there is a statistically significant increasing trend in temperatures in this catchment. Parameter variability across the calibration periods is quantified in terms of standard deviations of normalised parameters, enabling detection of the most variable parameters. 
Correlation coefficients among the optimised model parameters and total precipitation P, mean temperature T and mean flow Q are calculated to give insight into parameter dependence on the hydrometeorological drivers. The results reveal high sensitivity of almost all model parameters to the calibration period. The highest variability is displayed by the refreezing coefficient, the water holding capacity, and the temperature gradient. The only statistically significant (decreasing) trend is detected in the evapotranspiration reduction threshold. Statistically significant correlation is detected between the precipitation gradient and precipitation depth, and between the time-area histogram base and flows. All other correlations are not statistically significant, implying that changes in optimised parameters cannot generally be linked to changes in P, T or Q. As for model performance, the model reproduces the observed runoff satisfactorily, though runoff is slightly overestimated in wet periods. The Nash-Sutcliffe efficiency coefficient (NSE) ranges from 0.44 to 0.79. Higher NSE values are obtained over wetter periods, which is supported by a statistically significant correlation between NSE and flows. Overall, no systematic variations in parameters or in model performance are detected. Parameter variability may therefore be attributed rather to errors in the data or to inadequacies in the model structure. Further research is required to examine the impact of the calibration strategy or model structure on the variability of optimised parameters in time.
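The composite objective used for the HBV-light calibration (Nash-Sutcliffe on flows, Nash-Sutcliffe on log-flows, and volumetric error, with approximately equal weights) can be sketched as follows. The exact one-third weighting and the small log offset `eps` are illustrative assumptions:

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency; 1 indicates a perfect fit."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def composite_objective(obs, sim, eps=1e-6):
    """Equally weighted NSE of flows, NSE of log-flows, and volumetric
    error; higher is better, and a perfect simulation scores 1."""
    log_obs = [math.log(q + eps) for q in obs]
    log_sim = [math.log(q + eps) for q in sim]
    vol_err = abs(sum(sim) - sum(obs)) / sum(obs)
    return (nse(obs, sim) + nse(log_obs, log_sim) + (1.0 - vol_err)) / 3.0
```

The log-flow term emphasises low-flow fit, while the volumetric term penalises systematic over- or under-estimation of total runoff.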
NASA Astrophysics Data System (ADS)
Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara
2015-09-01
Simulation of the rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large-scale climate signals have been proved to be effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps: rainfall forecasting, rainfall-runoff simulation, and future runoff prediction. In the first step, data-driven models are developed and used to forecast rainfall using large-scale climate signals as rainfall predictors. Because different sources of uncertainty strongly affect the output of hydrologic models, in the second step the uncertainty associated with input data, model parameters and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed to account for model conceptual (structural) uncertainty in real-time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by the alternative rainfall-runoff models are combined through a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven with the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of the uncertainty analysis reveal that simultaneous estimation of model parameter and input uncertainty significantly changes the probability distribution of the model parameters. 
It is also observed that combining the outputs of the hydrological models using the proposed clustering scheme improves the accuracy of runoff simulation in the watershed by up to 50% in comparison with the simulations of the individual models. Results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also adequate lead time for incorporating the mitigation measures required in dealing with potentially extreme runoff events and flood hazard. Results of this study can be used in identification of the main factors affecting flood hazard analysis.
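The clustering-based combination of model outputs can be illustrated with a deliberately simplified sketch: time steps are grouped by a one-dimensional k-means on observed flow, and within each cluster every model is weighted by its inverse mean squared error. The paper's actual weighting scheme may differ in detail; this only shows the regime-dependent weighting idea, with invented flow values:

```python
def kmeans_1d(values, k=2, iters=20):
    """Plain Lloyd's algorithm on scalars; returns a label per value."""
    lo, hi = min(values), max(values)
    centers = [lo + j * (hi - lo) / (k - 1) for j in range(k)]
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: (v - centers[c]) ** 2)
                  for v in values]
        for c in range(k):
            members = [v for v, lab in zip(values, labels) if lab == c]
            if members:
                centers[c] = sum(members) / len(members)
    return labels

def combine(obs, sims, k=2):
    """Cluster time steps by observed flow, weight each model within a
    cluster by inverse mean squared error, and blend the simulations."""
    labels = kmeans_1d(obs, k)
    out = [0.0] * len(obs)
    for c in range(k):
        idx = [i for i, lab in enumerate(labels) if lab == c]
        if not idx:
            continue
        w = [1.0 / (sum((obs[i] - s[i]) ** 2 for i in idx) / len(idx) + 1e-12)
             for s in sims]
        tot = sum(w)
        for i in idx:
            out[i] = sum(wj * s[i] for wj, s in zip(w, sims)) / tot
    return out

# Model A matches low flows, model B matches high flows; the blend
# should track the observations in both regimes.
obs = [1.0, 1.0, 1.0, 10.0, 10.0, 10.0]
blended = combine(obs, [[1.0, 1.0, 1.0, 5.0, 5.0, 5.0],
                        [3.0, 3.0, 3.0, 10.0, 10.0, 10.0]])
```

Because the weights are computed per flow regime, a model that is good only in one regime still contributes where it is skilful, which is the intuition behind the reported ensemble improvement.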
NASA Astrophysics Data System (ADS)
Blumberga, Andra; Timma, Lelde; Blumberga, Dagnija
2015-12-01
When renewable energy is used, the challenge is to match the supply of intermittent energy with the demand for energy; therefore, energy storage solutions should be used. This paper is dedicated to hydrogen accumulation from wind sources. The case study investigates a conceptual system that uses intermittent renewable energy resources to produce hydrogen (power-to-gas concept) and fuel (power-to-liquid concept). In this specific case study, hydrogen is produced from surplus electricity generated by a wind power plant through an electrolysis process, and fuel is obtained by upgrading biogas to biomethane using the hydrogen. A system dynamics model is created for this conceptual system. The developed system dynamics model has been used to simulate two different scenarios. The results show that in both scenarios the point at which all electricity needs of Latvia are covered is reached. Moreover, the system dynamics methodology used in this paper is a white-box approach, which allows the developed model to be applied to other case studies and/or modified as newer data become available. The developed model can be used both in scientific research and by policy makers to better understand the dynamic relations within the system and the response of the system to changes in internal and external factors.
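The stock-flow structure of such a system dynamics model can be illustrated with a minimal daily sketch. The electrolyser efficiency, storage capacity and flow series below are invented, and the real model's feedback loops are considerably richer:

```python
# Illustrative stock-flow sketch of the power-to-gas loop: surplus wind
# electricity becomes hydrogen via electrolysis, is stored, and is
# withdrawn for biogas upgrading. Efficiency and capacity are hypothetical.
ELECTROLYSER_EFF = 0.7   # MWh of hydrogen per MWh of electricity
STORE_CAP = 500.0        # hydrogen store capacity, MWh

def simulate(surplus_mwh, demand_mwh):
    """Daily stock update; returns the stock trajectory plus cumulative
    spilled production and unmet upgrading demand."""
    stock, spilled, unmet, traj = 0.0, 0.0, 0.0, []
    for surplus, demand in zip(surplus_mwh, demand_mwh):
        stock += surplus * ELECTROLYSER_EFF
        if stock > STORE_CAP:          # store full: curtail the excess
            spilled += stock - STORE_CAP
            stock = STORE_CAP
        draw = min(stock, demand)      # upgrading draws what is available
        unmet += demand - draw
        stock -= draw
        traj.append(stock)
    return traj, spilled, unmet

traj, spilled, unmet = simulate([100.0] * 10, [30.0] * 10)
```

The stock (store level) integrates the difference between inflow and outflow, the defining structure of a system dynamics model; scenario analysis then amounts to varying the flow series and capacities.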
Nyein, Michelle K; Jason, Amanda M; Yu, Li; Pita, Claudio M; Joannopoulos, John D; Moore, David F; Radovitzky, Raul A
2010-11-30
Blast-induced traumatic brain injury is the most prevalent military injury in Iraq and Afghanistan, yet little is known about the mechanical effects of blasts on the human head, and still less is known about how personal protective equipment affects the brain's response to blasts. In this study we investigated the effect of the Advanced Combat Helmet (ACH) and a conceptual face shield on the propagation of stress waves within the brain tissue following blast events. We used a sophisticated computational framework for simulating coupled fluid-solid dynamic interactions and a three-dimensional biofidelic finite element model of the human head and intracranial contents combined with a detailed model of the ACH and a conceptual face shield. Simulations were conducted in which the unhelmeted head, head with helmet, and head with helmet and face shield were exposed to a frontal blast wave with incident overpressure of 10 atm. Direct transmission of stress waves into the intracranial cavity was observed in the unprotected head and head with helmet simulations. Compared to the unhelmeted head, the head with helmet experienced slight mitigation of intracranial stresses. This suggests that the existing ACH does not significantly contribute to mitigating blast effects, but does not worsen them either. By contrast, the helmet and face shield combination impeded direct transmission of stress waves to the face, resulting in a delay in the transmission of stresses to the intracranial cavity and lower intracranial stresses. This suggests a possible strategy for mitigating blast waves often associated with military concussion.
Verifying different-modality properties for concepts produces switching costs.
Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W
2003-03-01
According to perceptual symbol systems, sensorimotor simulations underlie the representation of concepts. It follows that sensorimotor phenomena should arise in conceptual processing. Previous studies have shown that switching from one modality to another during perceptual processing incurs a processing cost. If perceptual simulation underlies conceptual processing, then verifying the properties of concepts should exhibit a switching cost as well. For example, verifying a property in the auditory modality (e.g., BLENDER-loud) should be slower after verifying a property in a different modality (e.g., CRANBERRIES-tart) than after verifying a property in the same modality (e.g., LEAVES-rustling). Only words were presented to subjects, and there were no instructions to use imagery. Nevertheless, switching modalities incurred a cost, analogous to the cost of switching modalities in perception. A second experiment showed that this effect was not due to associative priming between properties in the same modality. These results support the hypothesis that perceptual simulation underlies conceptual processing.
The added value of remote sensing products in constraining hydrological models
NASA Astrophysics Data System (ADS)
Nijzink, Remko C.; Almeida, Susana; Pechlivanidis, Ilias; Capell, René; Gustafsson, David; Arheimer, Berit; Freer, Jim; Han, Dawei; Wagener, Thorsten; Sleziak, Patrik; Parajka, Juraj; Savenije, Hubert; Hrachowitz, Markus
2017-04-01
The calibration of a hydrological model still depends on the availability of streamflow data, even though additional sources of information (i.e. remotely sensed data products) have become more widely available. In this research, the model parameters of four different conceptual hydrological models (HYPE, HYMOD, TUW, FLEX) were constrained with remotely sensed products. The models were applied over 27 catchments across Europe to cover a wide range of climates, vegetation and landscapes. The fluxes and states of the models were correlated with the relevant products (e.g. MOD10A snow with modelled snow states), after which new a posteriori parameter distributions were determined based on a weighting procedure using conditional probabilities. Briefly, each parameter was weighted with the coefficient of determination of the relevant regression between modelled states/fluxes and products. In this way, final feasible parameter sets were derived without the use of discharge time series. Initial results show that improvements in model performance, with regard to streamflow simulations, are obtained when the models are constrained with a set of remotely sensed products simultaneously. In addition, we present a more extensive analysis to assess a model's ability to reproduce a set of hydrological signatures, such as rising limb density or peak distribution. Eventually, this research will enhance our understanding and recommendations in the use of remotely sensed products for constraining conceptual hydrological modelling and improving predictive capability, especially for data-sparse regions.
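The weighting step, in which each candidate parameter set receives the coefficient of determination between its simulated state/flux series and the remote-sensing product, can be sketched as follows. The series values are invented for illustration:

```python
def r_squared(x, y):
    """Coefficient of determination of y regressed linearly on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

def posterior_weights(product, runs):
    """Weight each candidate parameter set by the r2 between its
    simulated series and the remote-sensing product, normalised to 1."""
    scores = [r_squared(product, run) for run in runs]
    total = sum(scores)
    return [s / total for s in scores]

# Invented series: run 0 tracks the product closely, run 1 does not.
product = [1.0, 2.0, 3.0, 4.0]
weights = posterior_weights(product, [[1.1, 2.0, 2.9, 4.2],
                                      [2.0, 1.0, 4.0, 3.0]])
```

Normalising the scores turns them into relative weights, so parameter sets whose dynamics correlate well with the product dominate the a posteriori distribution even though no discharge data enter the procedure.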
An undergraduate laboratory activity on molecular dynamics simulations.
Spitznagel, Benjamin; Pritchett, Paige R; Messina, Troy C; Goadrich, Mark; Rodriguez, Juan
2016-01-01
Vision and Change [AAAS, 2011] outlines a blueprint for modernizing biology education by addressing conceptual understanding of key concepts, such as the relationship between structure and function. The document also highlights skills necessary for student success in 21st century Biology, such as the use of modeling and simulation. Here we describe a laboratory activity that allows students to investigate the dynamic nature of protein structure and function through the use of a modeling technique known as molecular dynamics (MD). The activity takes place over two lab periods that are 3 hr each. The first lab period unpacks the basic approach behind MD simulations, beginning with the kinematic equations that all bioscience students learn in an introductory physics course. During this period students are taught rudimentary programming skills in Python while guided through simple modeling exercises that lead up to the simulation of the motion of a single atom. In the second lab period students extend concepts learned in the first period to develop skills in the use of expert MD software. Here students simulate and analyze changes in protein conformation resulting from temperature change, solvation, and phosphorylation. The article will describe how these activities can be carried out using free software packages, including Abalone and VMD/NAMD. © 2016 The International Union of Biochemistry and Molecular Biology.
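The first lab period's single-atom exercise can be sketched with a velocity-Verlet integrator, the standard scheme in MD codes. The harmonic force and unit choices here are illustrative, not the lab's exact worksheet:

```python
def velocity_verlet(x0, v0, k=1.0, m=1.0, dt=0.01, steps=1000):
    """Integrate one atom in a harmonic well, F = -k x, with the
    velocity-Verlet scheme used by MD codes."""
    x, v = x0, v0
    a = -k * x / m
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt   # position update
        a_new = -k * x / m                # force at the new position
        v += 0.5 * (a + a_new) * dt       # velocity from averaged forces
        a = a_new
    return x, v

x, v = velocity_verlet(1.0, 0.0)
energy = 0.5 * v * v + 0.5 * x * x  # should stay near the initial 0.5
```

The scheme builds directly on the kinematic equations from introductory physics, which is why it suits a bridge from simple Python exercises to expert MD packages such as NAMD.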
NASA Astrophysics Data System (ADS)
Pham, H. V.; Parashar, R.; Sund, N. L.; Pohlmann, K.
2017-12-01
Pahute Mesa, located in the north-western region of the Nevada National Security Site, is an area where numerous underground nuclear tests were conducted. The mesa contains several fractured aquifers that can potentially provide high-permeability pathways for migration of radionuclides away from testing locations. The BULLION Forced-Gradient Experiment (FGE) conducted on Pahute Mesa injected and pumped solute and colloid tracers from a system of three wells to obtain site-specific information about the transport of radionuclides in fractured rock aquifers. This study aims to develop reliable three-dimensional discrete fracture network (DFN) models to simulate the BULLION FGE as a means of computing realistic ranges of important parameters describing fractured rock. Multiple conceptual DFN models were developed using dfnWorks, a parallelized computational suite developed by Los Alamos National Laboratory, to simulate flow and conservative particle movement in subsurface fractured rocks downgradient from the BULLION test. The model domain is 100 x 200 x 100 m and includes the three tracer-test wells of the BULLION FGE and the Pahute Mesa Lava-flow aquifer. The model scenarios considered differ from each other in terms of boundary conditions and fracture density. For each conceptual model, a number of statistically equivalent fracture network realizations were generated using data from fracture characterization studies. We adopt the covariance matrix adaptation-evolution strategy (CMA-ES), a global-local stochastic derivative-free optimization method, to calibrate the DFN models using groundwater levels and tracer breakthrough data obtained from the three wells. Models of fracture apertures based on fracture type and size are proposed, and the aperture values in each model are estimated during model calibration. 
The ranges of fracture aperture values resulting from this study are expected to enhance understanding of radionuclide transport in fractured rocks and support development of improved large-scale flow and transport models for Pahute Mesa.
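CMA-ES itself adapts a full covariance matrix and is best used through an existing library; as a minimal stand-in that shows the same derivative-free calibration idea, here is a (1+1) evolution strategy with multiplicative step-size control, applied to a hypothetical two-parameter misfit (the "observed heads" and the linear toy model are invented, not BULLION data):

```python
import random

def one_plus_one_es(misfit, x0, sigma=1.0, iters=500, seed=42):
    """(1+1) evolution strategy with multiplicative step-size control:
    like CMA-ES, it needs only misfit evaluations, no derivatives."""
    rng = random.Random(seed)
    x, fx = list(x0), misfit(x0)
    for _ in range(iters):
        cand = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fc = misfit(cand)
        if fc < fx:
            x, fx = cand, fc
            sigma *= 1.5   # success: widen the search
        else:
            sigma *= 0.9   # failure: tighten it
    return x, fx

# Hypothetical misfit: squared mismatch between two "observed" heads and
# a toy model linear in two aperture-like parameters.
obs = [2.0, 3.0]
best, best_f = one_plus_one_es(
    lambda p: (p[0] - obs[0]) ** 2 + (p[1] - obs[1]) ** 2, [0.0, 0.0])
```

In the real workflow the misfit would wrap a full DFN flow-and-transport run against observed heads and tracer breakthrough, which is exactly why a derivative-free optimizer is the natural choice.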
M and S supporting unmanned autonomous systems (UAxS) concept development and experimentation
NASA Astrophysics Data System (ADS)
Biagini, Marco; Scaccianoce, Alfio; Corona, Fabio; Forconi, Sonia; Byrum, Frank; Fowler, Olivia; Sidoran, James L.
2017-05-01
The development of the next generation of multi-domain unmanned semi- and fully autonomous C4ISR systems involves a multitude of security concerns and interoperability challenges. Conceptual solutions to capability shortfalls and gaps can be identified through Concept Development and Experimentation (CD and E) cycles. Modelling and Simulation (M and S) is a key tool in supporting unmanned autonomous systems (UAxS) CD and E activities and addressing associated security challenges. This paper serves to illustrate the application of M and S to UAxS development and highlight initiatives made by the North Atlantic Treaty Organization (NATO) M and S Centre of Excellence (CoE) to facilitate interoperability. The NATO M and S CoE collaborates with other NATO and national bodies to develop UAxS projects such as the Allied Command Transformation Counter Unmanned Autonomous Systems (CUAxS) project or the work of the Science and Technology Organization (STO) panels. Some initiatives, such as the Simulated Interactive Robotics Initiative (SIRI), formed the baseline for further developments and for studying emerging technologies in the M and S and robotics fields. Artificial Intelligence algorithm modelling, Robot Operating Systems (ROS), network operations, cyber security, interoperable languages and related data models are some of the main aspects considered in this paper. In particular, the implementation of interoperable languages such as C-BML and NIEM MilOps is discussed in relation to a Command and Control - Simulation Interoperability (C2SIM) paradigm. 
All these technologies are used to build a conceptual architecture to support UAxS CD and E. In addition, other projects in which the NATO M and S CoE is involved, such as the NATO Urbanization Project, could provide credible future operational environments and benefit UAxS project development, given the dual application of UAxS technology in large urbanized areas. In conclusion, this paper contains a detailed overview of how applying Modelling and Simulation to support CD and E activities is a valid approach to developing and validating future capability requirements in general, and next-generation UAxS in particular.
Feature Statistics Modulate the Activation of Meaning During Spoken Word Processing.
Devereux, Barry J; Taylor, Kirsten I; Randall, Billi; Geertzen, Jeroen; Tyler, Lorraine K
2016-03-01
Understanding spoken words involves a rapid mapping from speech to conceptual representations. One distributed feature-based conceptual account assumes that the statistical characteristics of concepts' features--the number of concepts they occur in (distinctiveness/sharedness) and likelihood of co-occurrence (correlational strength)--determine conceptual activation. To test these claims, we investigated the role of distinctiveness/sharedness and correlational strength in speech-to-meaning mapping, using a lexical decision task and computational simulations. Responses were faster for concepts with higher sharedness, suggesting that shared features are facilitatory in tasks like lexical decision that require access to them. Correlational strength facilitated responses for slower participants, suggesting a time-sensitive co-occurrence-driven settling mechanism. The computational simulation showed similar effects, with early effects of shared features and later effects of correlational strength. These results support a general-to-specific account of conceptual processing, whereby early activation of shared features is followed by the gradual emergence of a specific target representation. Copyright © 2015 The Authors. Cognitive Science published by Cognitive Science Society, Inc.
Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model
ERIC Educational Resources Information Center
Berman, Jeanette; Smyth, Robyn
2015-01-01
This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…
Leavesley, G.H.; Markstrom, S.L.; Viger, R.J.
2004-01-01
The interdisciplinary nature and increasing complexity of water- and environmental-resource problems require the use of modeling approaches that can incorporate knowledge from a broad range of scientific disciplines. The large number of distributed hydrological and ecosystem models currently available are composed of a variety of different conceptualizations of the associated processes they simulate. Assessment of the capabilities of these distributed models requires evaluation of the conceptualizations of the individual processes, and the identification of which conceptualizations are most appropriate for various combinations of criteria, such as problem objectives, data constraints, and spatial and temporal scales of application. With this knowledge, "optimal" models for specific sets of criteria can be created and applied. The U.S. Geological Survey (USGS) Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide these model development and application capabilities. MMS supports the integration of models and tools at a variety of levels of modular design. These include individual process models, tightly coupled models, loosely coupled models, and fully-integrated decision support systems. A variety of visualization and statistical tools are also provided. MMS has been coupled with the Bureau of Reclamation (BOR) object-oriented reservoir and river-system modeling framework, RiverWare, under a joint USGS-BOR program called the Watershed and River System Management Program. MMS and RiverWare are linked using a shared relational database. The resulting database-centered decision support system provides tools for evaluating and applying optimal resource-allocation and management strategies to complex, operational decisions on multipurpose reservoir systems and watersheds. 
Management issues being addressed include efficiency of water-resources management, environmental concerns such as meeting flow needs for endangered species, and optimizing operations within the constraints of multiple objectives such as power generation, irrigation, and water conservation. This decision support system approach is being developed, tested, and implemented in the Gunnison, Yakima, San Juan, Rio Grande, and Truckee River basins of the western United States. Copyright ASCE 2004.
Evaluating the spatial distribution of water balance in a small watershed, Pennsylvania
NASA Astrophysics Data System (ADS)
Yu, Zhongbo; Gburek, W. J.; Schwartz, F. W.
2000-04-01
A conceptual water-balance model was modified from a point application to a distributed one for evaluating the spatial distribution of watershed water balance based on daily precipitation, temperature and other hydrological parameters. The model was calibrated by comparing simulated daily variation in soil moisture with field-observed data and with the results of another model that simulates vertical soil moisture flow by numerically solving Richards' equation. The impacts of soil and land use on the hydrological components of the water balance, such as evapotranspiration, soil moisture deficit, runoff and subsurface drainage, were evaluated with the calibrated model. Given the same meteorological conditions and land use, the soil moisture deficit, evapotranspiration and surface runoff increase, and subsurface drainage decreases, as the available water capacity of the soil increases. Among the various land uses, alfalfa produced a high soil moisture deficit and evapotranspiration and lower surface runoff and subsurface drainage, whereas soybeans produced the opposite trend. The simulated distribution of the various hydrological components shows the combined effect of soil and land use. Simulated hydrological components compare well with observed data. The study demonstrated that the distributed water-balance approach is efficient and has advantages over the traditional practice of using a single average value of hydrological variables or applying the model at a single point.
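The daily bucket logic underlying such a water-balance model can be sketched as follows; the soil capacity, runoff split and forcing series are illustrative, not the paper's calibrated values:

```python
# Minimal daily bucket sketch of the water-balance logic: rain fills a
# soil store, evapotranspiration draws it down in proportion to soil
# wetness, and excess above capacity splits into surface runoff and
# subsurface drainage. Parameter values are illustrative only.
AWC = 100.0         # available water capacity of the soil, mm
RUNOFF_FRAC = 0.6   # share of the excess leaving as surface runoff

def water_balance(precip_mm, pet_mm, sm0=50.0):
    sm, et, runoff, drainage = sm0, 0.0, 0.0, 0.0
    for p, pet in zip(precip_mm, pet_mm):
        sm += p
        e = min(pet * min(sm / AWC, 1.0), sm)  # moisture-limited ET
        sm -= e
        et += e
        if sm > AWC:                           # saturation excess
            excess = sm - AWC
            runoff += RUNOFF_FRAC * excess
            drainage += (1.0 - RUNOFF_FRAC) * excess
            sm = AWC
    return sm, et, runoff, drainage

sm, et, runoff, drainage = water_balance([20.0, 0.0, 150.0, 0.0],
                                         [3.0, 3.0, 3.0, 3.0])
```

Running this bucket independently in every grid cell, with cell-specific AWC and land-use parameters, is the essence of distributing a point water-balance model; every millimetre of input is accounted for among the stores and fluxes.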
Thackray, Debbie; Roberts, Lisa
2017-02-01
The ability of physiotherapists to make clinical decisions is a vital component of being an autonomous practitioner, yet this complex phenomenon has been under-researched in cardiorespiratory physiotherapy. The purpose of this study was to explore clinical decision-making (CDM) by experienced physiotherapists in a scenario of a simulated patient experiencing acute deterioration of their respiratory function. The main objective of this observational study was to identify the actions, thoughts, and behaviours used by experienced cardiorespiratory physiotherapists in their clinical decision-making processes. A mixed-methods (qualitative) design employing observation and think-aloud was adopted, using a computerised manikin in a simulated environment. The participants clinically assessed the manikin, programmed with the same clinical signs, under standardised conditions in the clinical skills practice suite, which was set up as a ward environment. Participants were experienced cardiorespiratory physiotherapists recruited from clinical practice within a 50-mile radius of the University. Participants were video-recorded throughout the assessment and treatment and asked to verbalise their thought processes using the 'think-aloud' method. The recordings were transcribed verbatim and managed using a Framework approach. Eight cardiorespiratory physiotherapists participated (mean 7 years of clinical experience, range 3.5-16 years). Their CDM was similar to the collaborative hypothetico-deductive model and the five-rights nursing model, and drew on reasoning strategies including inductive reasoning and pattern recognition. However, the CDM demonstrated by the physiotherapists was complex, interactive and iterative. Information processing occurred continuously throughout the whole interaction with the patient, and the specific cognitive skills of recognition, matching, discriminating, relating, inferring, synthesising and prediction were identified as being used sequentially. 
The findings from this study were used to develop a new conceptual model of clinical decision-making for cardiorespiratory physiotherapy. This conceptual model can be used to inform future educational strategies to prepare physiotherapists and nurses for working in acute respiratory care. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.
Active tensor magnetic gradiometer system final report for Project MM–1514
Smith, David V.; Phillips, Jeffrey D.; Hutton, S. Raymond
2014-01-01
An interactive computer simulation program, based on physical models of system sensors, platform geometry, Earth environment, and spheroidal magnetically-permeable targets, was developed to generate synthetic magnetic field data from a conceptual tensor magnetic gradiometer system equipped with an active primary field generator. The system sensors emulate the prototype tensor magnetic gradiometer system (TMGS) developed under a separate contract for unexploded ordnance (UXO) detection and classification. Time-series data from different simulation scenarios were analyzed to recover physical dimensions of the target source. Helbig-Euler simulations were run with rectangular and rod-like source bodies to determine whether such a system could separate the induced component of the magnetization from the remanent component for each target. This report concludes with an engineering assessment of a practical system design.
Turnaround Time Modeling for Conceptual Rocket Engines
NASA Technical Reports Server (NTRS)
Nix, Michael; Staton, Eric J.
2004-01-01
Recent years have brought about a paradigm shift within NASA and the Space Launch Community regarding the performance of conceptual design. Reliability, maintainability, supportability, and operability are no longer effects of design; they have moved to the forefront and are affecting design. A primary focus of this shift has been a planned decrease in vehicle turnaround time. Potential means of instituting this decrease include attacking the issues of removing, refurbishing, and replacing the engines after each flight. Nevertheless, it is important to understand the operational effects of an engine on turnaround time, ground support personnel and equipment. One tool for visualizing this relationship involves the creation of a Discrete Event Simulation (DES). A DES model can be used to run a series of trade studies to determine if the engine is meeting its requirements, and, if not, what can be altered to bring it into compliance. Using DES, it is possible to look at the ways in which labor requirements, parallel maintenance versus serial maintenance, and maintenance scheduling affect the overall turnaround time. A detailed DES model of the Space Shuttle Main Engines (SSME) has been developed. Trades may be performed using the SSME Processing Model to see where maintenance bottlenecks occur and what the benefits (if any) are of increasing the number of personnel or the number and location of facilities, in addition to the trades previously mentioned, all with the goal of optimizing the operational turnaround time and minimizing operational cost. The SSME Processing Model was developed in such a way that it can easily be used as a foundation for developing DES models of other operational or developmental reusable engines. Performing a DES on a developmental engine during the conceptual phase makes it easier to affect the design and make changes to bring about a decrease in turnaround time and costs.
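A minimal version of such a DES trade can be sketched by greedily scheduling maintenance tasks onto a pool of crews; the task durations and crew counts are hypothetical, but the serial-versus-parallel comparison mirrors the trades described:

```python
import heapq

def turnaround(task_hours, crews):
    """Greedy longest-task-first schedule of maintenance tasks onto a
    pool of crews; returns the resulting turnaround time in hours."""
    free_at = [0.0] * crews                 # when each crew is next free
    heapq.heapify(free_at)
    finish = 0.0
    for dur in sorted(task_hours, reverse=True):
        start = heapq.heappop(free_at)      # earliest-available crew
        end = start + dur
        finish = max(finish, end)
        heapq.heappush(free_at, end)
    return finish

# Hypothetical engine-maintenance tasks, in hours.
tasks = [8.0, 6.0, 4.0, 2.0]
serial = turnaround(tasks, crews=1)    # one crew: tasks run in series
parallel = turnaround(tasks, crews=4)  # enough crews: longest task governs
```

Sweeping the crew count between these extremes exposes the bottleneck: beyond a certain staffing level, turnaround is governed by the longest task, not by labor.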
Design of a Neurally Plausible Model of Fear Learning
Krasne, Franklin B.; Fanselow, Michael S.; Zelikowsky, Moriel
2011-01-01
A neurally oriented conceptual and computational model of fear conditioning manifested by freezing behavior (FRAT), which accounts for many aspects of delay and context conditioning, has been constructed. Conditioning and extinction are the result of neuromodulation-controlled LTP at synapses of thalamic, cortical, and hippocampal afferents on principal cells and inhibitory interneurons of lateral and basal amygdala. The phenomena accounted for by the model (and simulated by the computational version) include conditioning, secondary reinforcement, blocking, the immediate shock deficit, extinction, renewal, and a range of empirically valid effects of pre- and post-training ablation or inactivation of hippocampus or amygdala nuclei. PMID:21845175
Perceptual processing affects conceptual processing.
Van Dantzig, Saskia; Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W
2008-04-05
According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task in alternation. Responses on the property-verification task were slower for those trials that were preceded by a perceptual trial in a different modality than for those that were preceded by a perceptual trial in the same modality. This finding of a modality-switch effect across perceptual processing and conceptual processing supports the hypothesis that perceptual and conceptual representations are partially based on the same systems. 2008 Cognitive Science Society, Inc.
NASA Astrophysics Data System (ADS)
Rotzoll, K.; Izuka, S. K.; Nishikawa, T.; Fienen, M. N.; El-Kadi, A. I.
2015-12-01
The volcanic-rock aquifers of Kauai, Oahu, and Maui are heavily developed, leading to concerns related to the effects of groundwater withdrawals on saltwater intrusion and streamflow. A numerical modeling analysis using the most recently available data (e.g., information on recharge, withdrawals, hydrogeologic framework, and conceptual models of groundwater flow) will substantially advance current understanding of groundwater flow and provide insight into the effects of human activity and climate change on Hawaii's water resources. Three island-wide groundwater-flow models were constructed using MODFLOW 2005 coupled with the Seawater-Intrusion Package (SWI2), which simulates the transition between saltwater and freshwater in the aquifer as a sharp interface. This approach allowed relatively fast model run times without ignoring the freshwater-saltwater system at the regional scale. Model construction (FloPy3), automated-parameter estimation (PEST), and analysis of results were streamlined using Python scripts. Model simulations included pre-development (1870) and current (average of 2001-10) scenarios for each island. Additionally, scenarios for future withdrawals and climate change were simulated for Oahu. We present our streamlined approach and preliminary results showing estimated effects of human activity on the groundwater resource by quantifying decline in water levels, reduction in stream base flow, and rise of the freshwater-saltwater interface.
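The sharp-interface treatment of the freshwater-saltwater transition that SWI2 implements builds on the classical Ghyben-Herzberg relation, which can be sketched directly. The heads and densities below are illustrative textbook values, not results from the Hawaii island models.

```python
def interface_depth(head_m, rho_fresh=1000.0, rho_salt=1025.0):
    """Ghyben-Herzberg estimate of the depth (below sea level) of a sharp
    freshwater-saltwater interface, given the freshwater head above sea
    level. With typical densities the factor rho_f/(rho_s - rho_f) = 40."""
    return head_m * rho_fresh / (rho_salt - rho_fresh)

# Illustration: a 0.5 m head decline (e.g., from withdrawals) raises the
# static interface by about 20 m.
z_before = interface_depth(2.0)   # interface depth for 2.0 m of head
z_after = interface_depth(1.5)    # interface depth after head decline
```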
Modeling, simulation, and flight characteristics of an aircraft designed to fly at 100,000 feet
NASA Technical Reports Server (NTRS)
Sim, Alex G.
1991-01-01
A manned real-time simulation of a conceptual vehicle, the stratoplane, was developed to study the problems associated with the flight characteristics of a large, lightweight vehicle. Mathematical models of the aerodynamics, mass properties, and propulsion system were developed in support of the simulation and are presented. The simulation was first conducted without control augmentation to determine the needs for a control system. The unaugmented flying qualities were dominated by lightly damped Dutch roll oscillations, and constant pilot workload was needed at high altitudes. Control augmentation was studied using basic feedbacks. For the longitudinal axis, flight path angle and pitch rate feedback were sufficient to damp the phugoid mode and to provide good flying qualities. In the lateral-directional axis, bank angle, roll rate, and yaw rate feedbacks were sufficient to provide a safe vehicle with acceptable handling qualities. Intentionally stalling the stratoplane to very high angles of attack (deep stall) was studied as a means of enabling safe and rapid descent. It was concluded that the deep stall maneuver is viable for this class of vehicle.
Numerical model of water flow and solute accumulation in vertisols using HYDRUS 2D/3D code
NASA Astrophysics Data System (ADS)
Weiss, Tomáš; Dahan, Ofer; Turkeltub, Tuvia
2015-04-01
Keywords: desiccation-crack-induced salinization, preferential flow, conceptual model, numerical model, vadose zone, vertisols, soil water retention function, HYDRUS 2D/3D. Vertisols cover a hydrologically very significant area of semi-arid regions, through which water often infiltrates to groundwater aquifers. Understanding water flow and solute accumulation is thus very relevant to agricultural activity and water resources management. Previous works suggest a conceptual model of desiccation-crack-induced salinization, in which salinization of sediment in the deep section of the vadose zone (up to 4 m) is induced by subsurface evaporation due to convective air flow in the desiccation cracks. It suggests that the salinization is driven by the hydraulic gradient between the dry sediment in the vicinity of cracks (low potential) and the relatively wet sediment further from the main cracks (high potential). This paper presents a modification of that previously suggested conceptual model, together with a numerical model. The model uses a simple uniform-flow approach but unconventionally prescribes the boundary conditions and the hydraulic parameters of the soil. The numerical model is bound to one location close to a dairy-farm waste lagoon, but the application of the suggested conceptual model could possibly be extended to all semi-arid regions with vertisols. Simulations were conducted using several modeling approaches, with the ultimate goal of fitting the simulation results to the controlling variables measured in the field: temporal variation in water content across a thick layer of unsaturated clay sediment (>10 m), sediment salinity, and the salinity of the water draining down the vadose zone to the water table. The model was developed in several steps, all computed as forward solutions using a trial-and-error approach. The model suggests very deep instant infiltration of fresh water, up to 12 m, which is also supported by the field data.
The paper suggests prescribing a special atmospheric boundary on the wall of the crack (so that solute can accumulate due to evaporation on the crack block wall, and infiltrating fresh water can push the solute further down); to do so, the HYDRUS 2D/3D code had to be modified by its developers. Unconventionally, the main fitting parameters were the parameters α and n of the soil water retention curve and the saturated hydraulic conductivity. The amount of infiltrated water (within a reasonable range), the infiltration function in the crack, and the actual evaporation from the crack were also used as secondary fitting parameters. The model supports the previous findings that a significant amount (~90%) of water from rain events must infiltrate through the crack. It was also noted that infiltration from the crack has to increase with depth and that the highest infiltration rate should occur somewhere between 1 and 3 m. This paper suggests a new way to model vertisols in semi-arid regions. It also supports the previous findings about vertisols, especially the utmost importance of soil cracks as preferential pathways for water and contaminants and as deep evaporators.
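The soil water retention curve whose parameters were fitted above is, in HYDRUS, the van Genuchten (1980) function. A sketch of that function follows; the parameter values used here are generic clay-like illustrations, not the fitted values from this study.

```python
def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """van Genuchten (1980) retention function as used by HYDRUS:
    Se = [1 + (alpha*|h|)^n]^(-m), with m = 1 - 1/n, mapping pressure
    head h (negative when unsaturated) to volumetric water content."""
    if h >= 0:
        return theta_s                     # saturated soil
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * abs(h)) ** n) ** (-m)
    return theta_r + se * (theta_s - theta_r)

# Illustrative clay-like parameters (NOT the study's fitted values):
# theta_r=0.07, theta_s=0.45, alpha=0.005 1/cm, n=1.1
theta_wet = van_genuchten_theta(-10.0, 0.07, 0.45, 0.005, 1.1)
theta_dry = van_genuchten_theta(-1e4, 0.07, 0.45, 0.005, 1.1)
```

Increasing α or n steepens the drainage branch, which is why these two parameters carry so much of the fitting burden.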
An ice sheet model validation framework for the Greenland ice sheet
NASA Astrophysics Data System (ADS)
Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P.; Evans, Katherine J.; Kennedy, Joseph H.; Lenaerts, Jan; Lipscomb, William H.; Perego, Mauro; Salinger, Andrew G.; Tuminaro, Raymond S.; van den Broeke, Michiel R.; Nowicki, Sophie M. J.
2017-01-01
We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013, using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin-scale and whole-ice-sheet-scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of < 1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. 
The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate a predictive skill with respect to observed dynamic changes that have occurred on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.
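A basin-scale metric like the mean elevation difference reported above reduces to a simple comparison over grid cells. The sketch below is a simplified stand-in for the CmCt's metrics; all elevation values are hypothetical.

```python
def mean_elevation_difference(model, observed):
    """Mean difference between simulated and observed surface elevations
    over a set of grid cells: a minimal basin-scale bias metric in the
    spirit of the CmCt (simplified; no area weighting or masking)."""
    diffs = [m - o for m, o in zip(model, observed)]
    return sum(diffs) / len(diffs)

model_dem = [1500.2, 1498.7, 1502.1, 1499.5]   # hypothetical elevations, m
obs_dem = [1500.0, 1499.0, 1501.5, 1499.9]
bias = mean_elevation_difference(model_dem, obs_dem)   # mean model bias, m
```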
An ice sheet model validation framework for the Greenland ice sheet
Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P.; Evans, Katherine J.; Kennedy, Joseph H.; Lenaerts, Jan; Lipscomb, William H.; Perego, Mauro; Salinger, Andrew G.; Tuminaro, Raymond S.; van den Broeke, Michiel R.; Nowicki, Sophie M. J.
2018-01-01
We propose a new ice sheet model validation framework – the Cryospheric Model Comparison Tool (CmCt) – that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. 
The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation. PMID:29697704
An Ice Sheet Model Validation Framework for the Greenland Ice Sheet
NASA Technical Reports Server (NTRS)
Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas A.; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey R.; Chambers, Don P.; Evans, Katherine J.;
2017-01-01
We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013, using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin-scale and whole-ice-sheet-scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of less than 1 meter). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. 
The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate a predictive skill with respect to observed dynamic changes that have occurred on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.
Neurological evidence linguistic processes precede perceptual simulation in conceptual processing.
Louwerse, Max; Hutchinson, Sterling
2012-01-01
There is increasing evidence from response time experiments that language statistics and perceptual simulations both play a role in conceptual processing. In an EEG experiment we compared neural activity in cortical regions commonly associated with linguistic processing and visual perceptual processing to determine to what extent symbolic and embodied accounts of cognition applied. Participants were asked to determine the semantic relationship of word pairs (e.g., sky - ground) or to determine their iconic relationship (i.e., if the presentation of the pair matched their expected physical relationship). A linguistic bias was found toward the semantic judgment task and a perceptual bias was found toward the iconicity judgment task. More importantly, conceptual processing involved activation in brain regions associated with both linguistic and perceptual processes. When comparing the relative activation of linguistic cortical regions with perceptual cortical regions, the effect sizes for linguistic cortical regions were larger than those for the perceptual cortical regions early in a trial with the reverse being true later in a trial. These results map upon findings from other experimental literature and provide further evidence that processing of concept words relies both on language statistics and on perceptual simulations, whereby linguistic processes precede perceptual simulation processes.
Neurological Evidence Linguistic Processes Precede Perceptual Simulation in Conceptual Processing
Louwerse, Max; Hutchinson, Sterling
2012-01-01
There is increasing evidence from response time experiments that language statistics and perceptual simulations both play a role in conceptual processing. In an EEG experiment we compared neural activity in cortical regions commonly associated with linguistic processing and visual perceptual processing to determine to what extent symbolic and embodied accounts of cognition applied. Participants were asked to determine the semantic relationship of word pairs (e.g., sky – ground) or to determine their iconic relationship (i.e., if the presentation of the pair matched their expected physical relationship). A linguistic bias was found toward the semantic judgment task and a perceptual bias was found toward the iconicity judgment task. More importantly, conceptual processing involved activation in brain regions associated with both linguistic and perceptual processes. When comparing the relative activation of linguistic cortical regions with perceptual cortical regions, the effect sizes for linguistic cortical regions were larger than those for the perceptual cortical regions early in a trial with the reverse being true later in a trial. These results map upon findings from other experimental literature and provide further evidence that processing of concept words relies both on language statistics and on perceptual simulations, whereby linguistic processes precede perceptual simulation processes. PMID:23133427
Conceptual Change Texts in Chemistry Teaching: A Study on the Particle Model of Matter
ERIC Educational Resources Information Center
Beerenwinkel, Anne; Parchmann, Ilka; Grasel, Cornelia
2011-01-01
This study explores the effect of a conceptual change text on students' awareness of common misconceptions about the particle model of matter. The conceptual change text was designed based on principles of text comprehensibility, of conceptual change instruction, and of instructional approaches on how to introduce the particle model. It was evaluated in…
Validation of a model for investigating red cell mass changes during weightlessness
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1976-01-01
The model, both the conceptual model and the simulation model, provided a convenient framework on which to demonstrate the commonality between such diverse stresses as descent from altitude, red cell infusions, bed rest, and weightlessness. The results suggest that all of these stresses induce an increased blood hematocrit, leading to tissue hyperoxia and eventual inhibition of the erythrocyte-producing circuit until the hyperoxic condition is relieved. The erythropoietic system was acting, in these situations, as if it were a hematocrit sensor and regulator. In these terms, the decreases in red cell mass during Skylab may be explained as normal feedback regulation of the erythropoietic system in the face of sustained decreases in plasma volume.
Modeling and Prototyping of Automatic Clutch System for Light Vehicles
NASA Astrophysics Data System (ADS)
Murali, S.; Jothi Prakash, V. M.; Vishal, S.
2017-03-01
Nowadays, recycling or regenerating waste into something useful is appreciated all around the globe, as it reduces greenhouse gas emissions that contribute to global climate change. This study deals with the provision of an automatic clutch mechanism in vehicles to facilitate the smooth changing of gears. It proposes using the exhaust gases, normally expelled as waste from the turbocharger, to actuate the clutch mechanism. At present, clutches in four-wheelers are operated automatically by using an air compressor. In this study, a conceptual design is proposed in which the clutch is operated by the exhaust gas from the turbocharger, eliminating the air compressor used in the existing system and relieving riders of the need to operate the clutch manually. This work involved the development, analysis, and validation of the conceptual design through simulation software. The developed conceptual design of the automatic pneumatic clutch system was then tested with a prototype.
Simulating forest landscape disturbances as coupled human and natural systems
Wimberly, Michael; Sohl, Terry L.; Liu, Zhihua; Lamsal, Aashis
2015-01-01
Anthropogenic disturbances resulting from human land use affect forest landscapes over a range of spatial and temporal scales, with diverse influences on vegetation patterns and dynamics. These processes fall within the scope of the coupled human and natural systems (CHANS) concept, which has emerged as an important framework for understanding the reciprocal interactions and feedbacks that connect human activities and ecosystem responses. Spatial simulation modeling of forest landscape change is an important technique for exploring the dynamics of CHANS over large areas and long time periods. Landscape models for simulating interactions between human activities and forest landscape dynamics can be grouped into two main categories. Forest landscape models (FLMs) focus on landscapes where forests are the dominant land cover and simulate succession and natural disturbances along with forest management activities. In contrast, land change models (LCMs) simulate mosaics of different land cover and land use classes that include forests in addition to other land uses such as developed areas and agricultural lands. There are also several examples of coupled models that combine elements of FLMs and LCMs. These integrated models are particularly useful for simulating human–natural interactions in landscapes where human settlement and agriculture are expanding into forested areas. Despite important differences in spatial scale and disciplinary scope, FLMs and LCMs have many commonalities in conceptual design and technical implementation that can facilitate continued integration. The ultimate goal will be to implement forest landscape disturbance modeling in a CHANS framework that recognizes the contextual effects of regional land use and other human activities on the forest ecosystem while capturing the reciprocal influences of forests and their disturbances on the broader land use mosaic.
NASA Astrophysics Data System (ADS)
Hidayat, Iki; Sutopo; Pratama, Heru Berian
2017-12-01
The Kerinci geothermal field is a single-phase liquid reservoir system in the Kerinci District, in the western part of Jambi Province. In this field, there are geothermal prospects identified by heat-source upflow inside a National Park area. Pertamina Geothermal Energy plans to develop the Kerinci field with a 1×55 MWe unit. To characterize the reservoir, a numerical simulation of the Kerinci field was developed using the TOUGH2 software with information from the conceptual model. Pressure and temperature profiles from well KRC-B1 were validated against simulation data to reach a natural-state condition, with a good match. Based on the natural-state simulation, the resource of the Kerinci geothermal field was estimated using Monte Carlo simulation, with P10, P50, and P90 results of 49.4 MW, 64.3 MW, and 82.4 MW, respectively. This paper is the first study in which a resource assessment has been successfully estimated for the Kerinci Geothermal Field using numerical simulation coupled with Monte Carlo simulation.
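Percentile estimates like the P10/P50/P90 values above come from repeatedly sampling uncertain inputs and reading off the resulting capacity distribution. The sketch below uses a toy area × power-density capacity model with hypothetical distributions; it is not the actual Kerinci volumetric assessment.

```python
import random
import statistics

def monte_carlo_capacity(n_trials=20_000, seed=42):
    """Illustrative Monte Carlo resource assessment: sample uncertain
    inputs, compute a capacity per trial, and report the 10th, 50th, and
    90th percentiles (P10/P50/P90, low to high as in the abstract).
    Distributions and the capacity formula are hypothetical stand-ins."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_trials):
        area_km2 = rng.uniform(5.0, 15.0)              # prospect area
        power_density = rng.triangular(3.0, 8.0, 5.0)  # MWe per km^2
        samples.append(area_km2 * power_density)
    samples.sort()
    p10 = samples[int(0.10 * n_trials)]   # conservative estimate
    p50 = statistics.median(samples)
    p90 = samples[int(0.90 * n_trials)]   # optimistic estimate
    return p10, p50, p90

p10, p50, p90 = monte_carlo_capacity()
```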
Investigation of Transonic Wake Dynamics for Mechanically Deployable Entry Systems
NASA Technical Reports Server (NTRS)
Stern, Eric; Barnhardt, Michael; Venkatapathy, Ethiraj; Candler, Graham; Prabhu, Dinesh
2012-01-01
A numerical investigation of transonic flow around a mechanically deployable entry system being considered for a robotic mission to Venus has been performed, and preliminary results are reported. The flow around a conceptual representation of the vehicle geometry was simulated at discrete points along a ballistic trajectory using Detached Eddy Simulation (DES). The trajectory points selected span the low supersonic to transonic regimes, with freestream Mach numbers from 1.5 to 0.8 and freestream Reynolds numbers (based on diameter) between 2.09 × 10^6 and 2.93 × 10^6. Additionally, the Mach 0.8 case was simulated at angles of attack between 0° and 5°. Static aerodynamic coefficients obtained from the data show qualitative agreement with data from 70° sphere-cone wind tunnel tests performed for the Viking program. Finally, the effect of choices of models and numerical algorithms is addressed by comparing the DES results to those using a Reynolds-Averaged Navier-Stokes (RANS) model, as well as to results using a more dissipative numerical scheme.
Buckley, Thomas N; Roberts, David W
2006-02-01
Conventional wisdom holds that the ratio of leaf area to sapwood area (L/S) should decline during height (H) growth to maintain hydraulic homeostasis and prevent stomatal conductance (g(s)) from declining. We contend that L/S should increase with H based on a numerical simulation, a mathematical analysis and a conceptual argument: (1) numerical simulation--a tree growth model, DESPOT (Deducing Emergent Structure and Physiology Of Trees), in which carbon (C) allocation is regulated to maximize C gain, predicts L/S should increase during most of H growth; (2) mathematical analysis--the formal criterion for optimal C allocation, applied to a simplified analytical model of whole tree carbon-water balance, predicts L/S should increase with H if leaf-level gas exchange parameters including g(s) are conserved; and (3) conceptual argument--photosynthesis is limited by several substitutable resources (chiefly nitrogen (N), water and light) and H growth increases the C cost of water transport but not necessarily of N and light capture, so if the goal is to maximize C gain or growth, allocation should shift in favor of increasing photosynthetic capacity and irradiance, rather than sustaining g(s). Although many data are consistent with the prediction that L/S should decline with H, many others are not, and we discuss possible reasons for these discrepancies.
NASA Astrophysics Data System (ADS)
Nowak, W.; Koch, J.
2014-12-01
Predicting DNAPL fate and transport in heterogeneous aquifers is challenging and subject to an uncertainty that needs to be quantified. Models for this task need to be equipped with an accurate source zone description, i.e., the distribution of mass of all partitioning phases (DNAPL, water, and soil) in all possible states ((im)mobile, dissolved, and sorbed), mass-transfer algorithms, and the simulation of transport processes in the groundwater. Such detailed models tend to be computationally cumbersome when used for uncertainty quantification. Therefore, a selective choice of the relevant model states, processes, and scales is both delicate and indispensable. We investigate the questions: what is a meaningful level of model complexity, and how can an efficient model framework be obtained that is still physically and statistically consistent? In our proposed model, aquifer parameters and the contaminant source architecture are conceptualized jointly as random space functions. The governing processes are simulated in a three-dimensional, highly resolved, stochastic, and coupled model that can predict probability density functions of mass discharge and source depletion times. We apply a stochastic percolation approach as an emulator to simulate the contaminant source formation, a random walk particle tracking method to simulate DNAPL dissolution and solute transport within the aqueous phase, and a quasi-steady-state approach to solve for DNAPL depletion times. Using this novel model framework, we test whether and to what degree the desired model predictions are sensitive to simplifications often found in the literature. With this we identify that aquifer heterogeneity, groundwater flow irregularity, uncertain and physically based contaminant source zones, and their mutual interlinkages are indispensable components of a sound model framework.
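The random walk particle tracking method mentioned above advances each particle by an advective drift plus a Gaussian dispersive step with variance 2·D·Δt. A one-dimensional sketch follows; velocity, dispersion, and particle counts are illustrative, not parameters from this DNAPL study.

```python
import random

def random_walk_transport(n_particles=5000, n_steps=200, dt=0.1,
                          velocity=1.0, dispersion=0.05, seed=1):
    """1-D random walk particle tracking sketch of advection-dispersion:
    each particle drifts with the mean velocity and takes a dispersive
    step drawn from N(0, 2*D*dt). All parameter values are illustrative."""
    rng = random.Random(seed)
    step_sigma = (2.0 * dispersion * dt) ** 0.5
    positions = []
    for _ in range(n_particles):
        x = 0.0
        for _ in range(n_steps):
            x += velocity * dt + rng.gauss(0.0, step_sigma)
        positions.append(x)
    mean_x = sum(positions) / n_particles   # ~ velocity * n_steps * dt
    return mean_x, positions
```

A histogram of `positions` approximates the concentration profile; extending the step with a dissolution/sorption term is how such walkers are coupled to source-depletion calculations.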
Augmenting Parametric Optimal Ascent Trajectory Modeling with Graph Theory
NASA Technical Reports Server (NTRS)
Dees, Patrick D.; Zwack, Matthew R.; Edwards, Stephen; Steffens, Michael
2016-01-01
It has been well documented that decisions made in the early stages of Conceptual and Pre-Conceptual design commit up to 80% of total Life-Cycle Cost (LCC) while engineers know the least about the product they are designing [1]. Once within Preliminary and Detailed design, however, changes to the design become far more difficult to enact in both cost and schedule. Primarily this has been due to a lack of detailed data, which is usually uncovered later, during the Preliminary and Detailed design phases. In our current budget-constrained environment, making decisions within Conceptual and Pre-Conceptual design that minimize LCC while meeting requirements is paramount to a program's success. Within the arena of launch vehicle design, optimizing the ascent trajectory is critical for minimizing the costs present within such concerns as propellant, aerodynamic, aeroheating, and acceleration loads while meeting requirements such as payload delivered to a desired orbit. In order to optimize the vehicle design, its constraints and requirements must be known; however, as the design cycle proceeds it is all but inevitable that the conditions will change. Upon that change, the previously optimized trajectory may no longer be optimal, or may no longer meet design requirements. The current paradigm for adjusting to these updates is generating point solutions for every change in the design's requirements [2]. This can be a tedious, time-consuming task, as changes in virtually any piece of a launch vehicle's design can have a disproportionately large effect on the ascent trajectory, since the solution space of the trajectory optimization problem is both non-linear and multimodal [3]. In addition, an industry standard tool, Program to Optimize Simulated Trajectories (POST), requires an expert analyst to produce simulated trajectories that are feasible and optimal [4]. In a previous publication the authors presented a method for combating these challenges [5]. 
In order to bring more detailed information into Conceptual and Pre-Conceptual design, knowledge of the effects originating from changes to the vehicle must be calculated. To do this, a model capable of quantitatively describing any vehicle within the entire design space under consideration must be constructed. This model must be based upon analysis of acceptable fidelity, which in this work comes from POST. Design space interrogation can be achieved with surrogate modeling: a parametric, polynomial equation representing a tool. A surrogate model must be informed by data from the tool, with enough points to represent the solution space for the chosen number of variables at an acceptable level of error. Therefore, Design Of Experiments (DOE) is used to select points within the design space, maximizing the information gained about the design space while minimizing the number of data points required. To represent a design space with a non-trivial number of variable parameters, the number of points required still represents an amount of work that would take an inordinate amount of time under the current paradigm of manual analysis, and so an automated method was developed. The best practices of expert trajectory analysts working within NASA Marshall's Advanced Concepts Office (ACO) were implemented within a tool called multiPOST. These practices include how to use the output data from a previous run of POST to inform the next, determining whether a trajectory solution is feasible from a real-world perspective, and how to handle program execution errors. The tool was then augmented with multiprocessing capability to enable analysis of multiple trajectories simultaneously, allowing throughput to scale with available computational resources. In this update to the previous work, the authors discuss issues with the method and their solutions.
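The surrogate-over-DOE idea described above can be illustrated in miniature: sample an "expensive" analysis at a few design points, then query a cheap polynomial in its place. The toy function below stands in for POST; all names and values are hypothetical.

```python
def build_surrogate(design_points, tool):
    """Sample the expensive 'tool' at the DOE points, then return a cheap
    quadratic surrogate (Lagrange interpolation through three samples).
    A minimal sketch of the surrogate-modeling workflow, not multiPOST."""
    xs = list(design_points)
    ys = [tool(x) for x in xs]            # the only 'expensive' calls

    def surrogate(x):
        # Lagrange interpolation: exact at the sampled design points.
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            basis = 1.0
            for j, xj in enumerate(xs):
                if i != j:
                    basis *= (x - xj) / (xi - xj)
            total += yi * basis
        return total

    return surrogate

# Toy "expensive analysis": payload penalty vs. pitch-over angle (made up).
tool = lambda a: 2.0 + 0.5 * a - 0.1 * a * a
surrogate = build_surrogate([0.0, 2.0, 4.0], tool)
```

Because the toy tool is itself quadratic, three samples recover it exactly; real surrogates over POST-like tools trade some error for many fewer expensive runs.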
NASA Astrophysics Data System (ADS)
Sampath Kumar, Bharath
The purpose of this study is to examine the role of pairing a visualization tool, such as simulation, with instruction in developing students' concrete conceptual understanding of chemical equilibrium. Students find chemistry concepts abstract, especially at the microscopic level, and chemical equilibrium is one such topic. While research studies have explored the effectiveness of low-tech instructional strategies such as analogies, jigsaw, cooperative learning, and modeling blocks, fewer studies have explored the use of visualization tools such as simulations in the context of dynamic chemical equilibrium. Research studies have identified key reasons behind misconceptions, such as lack of systematic understanding of foundational chemistry concepts, failure to recognize that the system is dynamic, solving numerical problems on chemical equilibrium in an algorithmic fashion, and erroneous application of Le Chatelier's principle (LCP). Kress et al. (2001) suggested that external representation in the form of visualization is more than a tool for learning, because it enables learners to make meanings or express ideas that cannot readily be expressed through verbal representation alone. A mixed-methods design was used for data collection. The qualitative portion of the study aimed at understanding the change in students' mental models before and after the intervention. A quantitative instrument was developed based on common areas of misconception identified by research studies. A pilot study was conducted prior to the actual study to obtain feedback from students on the quantitative instrument and the simulation. Participants for the pilot study were sampled from a single general chemistry class. Following the pilot study, the research study was conducted with a total of 27 students (N=15 in the experimental group and N=12 in the control group). Prior to participating in the study, students had completed their midterm test on the topic of chemical equilibrium.
Pre- and post-intervention qualitative interviews revealed students' mental models and thought processes concerning chemical equilibrium. The simulations used in the study were developed on the SCRATCH software platform. In order to test the effect of the visualization tool on students' conceptual understanding of chemical equilibrium, an ANCOVA analysis was conducted. Results from a one-factor ANCOVA showed posttest scores were significantly higher for the experimental group (Mpostadj = 7.27, SDpost = 1.387) relative to the control group (Mpostadj = 2.67, SDpost = 1.371) after adjusting for pretest scores, F(1,24) = 71.82, MSE = 1.497, p = 0.03, eta2p = 0.75, d = 3.33. Cohen's d was converted to an attenuated effect size d* using the procedure outlined in Thompson (2006). The adjusted (for pretest scores) group mean difference estimate without measurement error correction for the posttest and pretest scores was 4.2, with a Cohen's d = 3.04. An alternate approach reported in Cho and Preacher (2015) was used to determine effect size. The adjusted group mean difference estimate with measurement error correction only for the posttest scores (but not for the pretest scores) was 4.99, with a Cohen's d = 3.61. Finally, the adjusted group mean difference estimate with measurement error correction for both pretest and posttest scores was 4.23, with a Cohen's d = 3.07. From a quantitative perspective, these effect sizes indicate a strong relationship between the experimental intervention and students' conceptual understanding of chemical equilibrium concepts; students who received the intervention scored markedly higher. KEYWORDS: Chemical Equilibrium, Visualization, Alternate Conceptions, Ontological Shift, Simulations.
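The two core computations reported here, an ANCOVA-style group difference adjusted for pretest scores and an attenuation-corrected effect size, can be sketched on synthetic data. The data, sample sizes, and the assumed reliability of 0.85 are illustrative, not the study's values or its exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 12
# Synthetic pretest scores and group labels (0 = control, 1 = experimental).
pre = rng.normal(5.0, 1.0, 2 * n)
group = np.repeat([0, 1], n)
# Synthetic posttest: depends on pretest plus a true group effect of 3.0.
post = 1.0 + 0.6 * pre + 3.0 * group + rng.normal(0.0, 0.5, 2 * n)

# ANCOVA expressed as a linear model: post = b0 + b1*group + b2*pre.
X = np.column_stack([np.ones(2 * n), group, pre])
b, *_ = np.linalg.lstsq(X, post, rcond=None)
adjusted_group_diff = b[1]          # group effect adjusted for pretest

# Attenuation correction: divide the observed standardized difference
# by the square root of the outcome measure's reliability (assumed 0.85).
resid = post - X @ b
sd = resid.std(ddof=3)              # residual SD, 3 parameters estimated
d_observed = adjusted_group_diff / sd
d_corrected = d_observed / np.sqrt(0.85)
```

The corrected d is always larger in magnitude than the observed d, since measurement error in the outcome attenuates standardized effect sizes.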
Effect of sub-pore scale morphology of biological deposits on porous media flow properties
NASA Astrophysics Data System (ADS)
Ghezzehei, T. A.
2012-12-01
Biological deposits often influence fluid flow by altering the pore space morphology and related hydrologic properties such as porosity, water retention characteristics, and permeability. In most coupled-processes models changes in porosity are inferred from biological process models using mass-balance. The corresponding evolution of permeability is estimated using (semi-) empirical porosity-permeability functions such as the Kozeny-Carman equation or power-law functions. These equations typically do not account for the heterogeneous spatial distribution and morphological irregularities of the deposits. As a result, predictions of permeability evolution are generally unsatisfactory. In this presentation, we demonstrate the significance of pore-scale deposit distribution on porosity-permeability relations using high resolution simulations of fluid flow through a single pore interspersed with deposits of varying morphologies. Based on these simulations, we present a modification to the Kozeny-Carman model that accounts for the shape of the deposits. Limited comparison with published experimental data suggests the plausibility of the proposed conceptual model.
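A minimal sketch of the classical Kozeny-Carman porosity-permeability scaling discussed above, plus a purely hypothetical shape-sensitive exponent standing in for the deposit-shape modification (the actual modification presented is not reproduced here):

```python
def porosity_ratio(phi, phi0):
    """Kozeny-Carman porosity factor, normalized to a reference phi0."""
    return (phi**3 / (1 - phi)**2) / (phi0**3 / (1 - phi0)**2)

def kozeny_carman(k0, phi0, phi):
    """Classical Kozeny-Carman: permeability scales as phi^3/(1-phi)^2."""
    return k0 * porosity_ratio(phi, phi0)

def kozeny_carman_shaped(k0, phi0, phi, beta=1.0):
    """Hypothetical shape-aware variant: an exponent beta > 1 mimics
    deposits that preferentially clog pore throats, so permeability
    falls faster than the classical model predicts for the same
    porosity loss. Illustrative assumption only."""
    return k0 * porosity_ratio(phi, phi0) ** beta
```

For a porosity reduction from 0.4 to 0.3, the shaped variant with beta > 1 predicts a lower permeability than the classical form, qualitatively mirroring the effect of morphologically irregular deposits.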
Validation of the Continuum of Care Conceptual Model for Athletic Therapy
Lafave, Mark R.; Butterwick, Dale; Eubank, Breda
2015-01-01
Utilization of conceptual models in field-based emergency care currently borrows from existing standards of medical and paramedical professions. The purpose of this study was to develop and validate a comprehensive conceptual model that could account for injuries ranging from nonurgent to catastrophic events including events that do not follow traditional medical or prehospital care protocols. The conceptual model should represent the continuum of care from the time of initial injury spanning to an athlete's return to participation in their sport. Finally, the conceptual model should accommodate both novices and experts in the AT profession. This paper chronicles the content validation steps of the Continuum of Care Conceptual Model for Athletic Therapy (CCCM-AT). The stages of model development were domain and item generation, content expert validation using a three-stage modified Ebel procedure, and pilot testing. Only the final stage of the modified Ebel procedure reached a priori 80% consensus on three domains of interest: (1) heading descriptors; (2) the order of the model; (3) the conceptual model as a whole. Future research is required to test the use of the CCCM-AT in order to understand its efficacy in teaching and practice within the AT discipline. PMID:26464897
Sensitivity Studies of 3D Reservoir Simulation at the I-Lan Geothermal Area in Taiwan Using TOUGH2
NASA Astrophysics Data System (ADS)
Kuo, C. W.; Song, S. R.
2014-12-01
A large-scale geothermal project conducted by the National Science Council was recently initiated in the southern I-Lan area of northeastern Taiwan. The goal of this national project is to generate at least 5 MW of electricity from geothermal energy. To achieve this goal, an integrated team drawing on various specialties was assembled to investigate the I-Lan area comprehensively. For example, geological data, petrophysical analysis, seismicity, temperature distribution, hydrology, geochemistry, and heat-source studies were used to build a large-scale 3D conceptual model of the geothermal potential sites. In addition, a 3000-m-deep well as well as several shallow wells are currently being drilled to provide accurate information about the deep subsurface. According to the current conceptual model, the target area is bounded by two main faults, the Jiaosi and Choshui faults. The geothermal gradient measured at one drilled well (1200 m) is about 49.1˚C/km. The geothermal reservoir is expected to occur in a fractured geological formation, the Siling sandstone layer. The preliminary results from all of these investigations are used as input parameters to create a realistic numerical reservoir model. This work uses the numerical simulator TOUGH2/EOS1 to study the geothermal energy potential of the I-Lan area. If we can successfully predict the geothermal energy potential in this area and generate 5 MW of electricity, we can apply a similar methodology to other potential sites in Taiwan and thereby increase the share of renewable energy in electricity generation. A large-scale three-dimensional subsurface geological model is built mainly from seismic exploration of the subsurface structure and well log data. The dimensions of the reservoir model in the x, y, and z coordinates are 20 x 10 x 5 km, respectively.
Once the conceptual model and the well locations are set up appropriately based on the field data, sensitivity studies on production and injection rates, heat source, fractures, and all other relevant parameters are performed to evaluate their effects on the reservoir temperature distribution over 30 years. Through these sensitivity studies, we can design a better geothermal system for the I-Lan area and reduce the risks of exploitation.
NASA Astrophysics Data System (ADS)
Riboust, Philippe; Thirel, Guillaume; Le Moine, Nicolas; Ribstein, Pierre
2016-04-01
A better knowledge of the snow accumulated on watersheds will help flood forecasting centres and hydro-power companies predict the amount of water released during spring snowmelt. Since precipitation gauges are sparse at high elevations and integrative measurements of the snow accumulated on a watershed surface are hard to obtain, snow models are an adequate way to estimate snow water equivalent (SWE) on watersheds. In addition to short-term prediction, accurately simulating SWE should have many advantages: validating the snow module on both SWE and snowmelt should give a more reliable model for climate change studies or for regionalization to ungauged watersheds. The aim of this study is to create a new snow module whose structure allows the use of measured snow data for calibration or assimilation. Energy balance modelling seems the logical choice for designing a model in which internal variables, such as SWE, can be compared to observations. Physical models, however, are complex, needing high computational resources and many different types of inputs that are not widely measured at meteorological stations. In contrast, simple conceptual degree-day models simulate snowmelt using only temperature and precipitation as inputs, with fast computation. Their major drawback is that they are empirical, i.e. they do not take into account all of the processes of the energy balance, which makes this kind of model more difficult to use when comparing SWE to observed measurements. In order to reach our objectives, we created a snow model structured as a simplified energy balance in which each process is empirically parameterized so that it can be calculated using only temperature, precipitation and cloud cover variables. This model's structure is similar to the one created by M.T. Walter (2005), where parameterizations from the literature were used to compute all of the processes of the energy balance.
The conductive fluxes into the snowpack were modelled using analytical solutions to the heat equation that take phase change into account. This approach has the advantage of using few forcing variables while taking into account all the processes of the energy balance. Indeed, the simulations should be quick enough to allow, for example, ensemble prediction or simulation of numerous basins more easily than physical snow models. The snow module formulation has been completed and is in its validation phase using data from the experimental station of Col de Porte, in the French Alps. Data from the US SNOTEL product will be used to test the model structure on a larger scale and to test diverse calibration procedures, since the aim is to use the module at the basin scale for discharge modelling purposes.
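For contrast with the simplified energy-balance approach, the conceptual degree-day model discussed above can be sketched in a few lines. Parameter values (degree-day factor, temperature threshold) are illustrative assumptions; this is not the module presented here.

```python
def degree_day_melt(temps_c, precip_mm, ddf=3.0, t_thresh=0.0):
    """Classical degree-day snow model at a daily time step.
    Precipitation accumulates as snow when T <= t_thresh (degC);
    otherwise melt = ddf * (T - t_thresh), with ddf in mm/degC/day.
    Returns the simulated SWE series in mm."""
    swe, series = 0.0, []
    for t, p in zip(temps_c, precip_mm):
        if t <= t_thresh:
            swe += p                                    # accumulate as snow
        else:
            swe = max(0.0, swe - ddf * (t - t_thresh))  # degree-day melt
        series.append(swe)
    return series
```

Only temperature and precipitation are needed, which is precisely the appeal of the approach, and its limitation: SWE here is an empirical bookkeeping variable rather than the outcome of an energy balance.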
NASA Astrophysics Data System (ADS)
Abitew, T. A.; Roy, T.; Serrat-Capdevila, A.; van Griensven, A.; Bauwens, W.; Valdes, J. B.
2016-12-01
The Tekeze Basin in northern Ethiopia hosts one of Africa's largest arch dams and plays a vital role in hydropower generation. However, little has been done on the hydrology of the basin due to limited in situ hydroclimatological data. Therefore, the main objective of this research is to simulate streamflow upstream of the Tekeze Dam using the Soil and Water Assessment Tool (SWAT) forced by bias-corrected multiple satellite rainfall products (CMORPH, TMPA and PERSIANN-CCS). This talk will present the potential as well as the skill of bias-corrected satellite rainfall products for streamflow prediction in tropical Africa. Additionally, the SWAT model results will be compared with previous conceptual hydrological models (HyMOD and HBV) from the SERVIR streamflow forecasting in African basins project (http://www.swaat.arizona.edu/index.html).
Nordqvist, R.; Voss, C.I.
1996-01-01
An approach to model discrimination and network design for evaluation of groundwater contamination risk is proposed and demonstrated by application to a site in a glaciofluvial aquifer in Sweden. The approach consists of first hypothesizing alternative conceptual models of hydrogeology at the site on the basis of both quantitative data and qualitative information. The conceptual models are then expressed as two-dimensional numerical models of groundwater flow and solute transport, and model attributes controlling risk to the water supply are determined by simulation. Model predictions of response to a specific field test are made with each model that affects risk. Regions for effective measurement networks are then identified. Effective networks are those that capture sufficient information to determine which of the hypothesized models best describes the system with a minimum of measurement points. For the example site in Sweden, the network is designed such that important system parameters may be accurately estimated at the same time as model discrimination is carried out. The site in Vansbro, Sweden, consists of a water-supply well in an esker separated (by 300m) from a wood preservation and treatment area on the esker flank by only a narrow inlet of a bordering stream. Application of the above-described risk analysis shows that, of all the hydrologic controls and parameters in the groundwater system, the only factor that controls the potential migration of wood-treatment contaminants to the well is whether the inlet's bed is pervious, creating a hydraulic barrier to lateral contaminant transport. Furthermore, the analysis localizes an area near the end of the inlet wherein the most effective measurements of drawdown would be made to discriminate between a permeable and impermeable bed. The location of this optimal area is not obvious prior to application of the above methodology.
Social determinants of health inequalities: towards a theoretical perspective using systems science.
Jayasinghe, Saroj
2015-08-25
A systems approach offers a novel conceptualization of natural and social systems. In recent years, this has led to perceiving population health outcomes as an emergent property of a dynamic and open, complex adaptive system. The current paper explores these themes further and applies the principles of the systems approach and complexity science (i.e. systems science) to conceptualize social determinants of health inequalities. The conceptualization can be done in two steps: viewing health inequalities from a systems approach and extending it to include complexity science. The systems approach views health inequalities as patterns within the larger rubric of other facets of the human condition, such as educational outcomes and economic development. This analysis requires more sophisticated models, such as system dynamics models. An extension of the approach is to view systems as complex adaptive systems, i.e. systems that are 'open' and adapt to the environment. They consist of dynamic adapting subsystems that exhibit non-linear interactions, while being 'open' to a similarly dynamic environment of interconnected systems. They exhibit emergent properties that cannot be estimated with precision from the known interactions among their components (such as economic development, political freedom, health system, culture, etc.). Different combinations of the same bundle of factors or determinants give rise to similar patterns or outcomes (i.e. the property of convergence), and minor variations in the initial conditions can give rise to widely divergent outcomes. Novel approaches using computer simulation models (e.g. agent-based models) would shed light on possible mechanisms by which factors or determinants interact and lead to emergent patterns of health inequalities in populations.
Megalla, Dina; Van Geel, Paul J; Doyle, James T
2016-09-01
A landfill gas to energy (LFGTE) facility in Ste. Sophie, Quebec was instrumented with sensors which measure temperature, oxygen, moisture content, settlement, total earth pressure, electrical conductivity and mounding of leachate. These parameters were monitored during the operating phase of the landfill in order to better understand the biodegradation and waste stabilization processes occurring within a LFGTE facility. Conceptual and numerical models were created to describe the heat transfer processes which occur within five waste lifts placed over a two-year period. A finite element model was created to simulate the temperatures within the waste and estimate the heat budget over a four and a half year period. The calibrated model was able to simulate the temperatures measured to date within the instrumented waste profile at the site. The model was used to evaluate the overall heat budget for the waste profile. The model simulations and heat budget provide a better understanding of the heat transfer processes occurring within the landfill and the relative impact of the various heat source/sink and storage terms. Aerobic biodegradation appears to play an important role in the overall heat budget at this site generating 36% of the total heat generated within the waste profile during the waste placement stages of landfill operations. Copyright © 2015 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Baser, Mustafa
2006-01-01
This paper reports upon an active learning approach that promotes conceptual change when studying direct current electricity circuits, using the free open source software "Qucs". The study involved a total of 102 prospective mathematics teachers. Prior to instruction, students' understanding of direct current electricity was…
Conceptual model for collision detection and avoidance for runway incursion prevention
NASA Astrophysics Data System (ADS)
Latimer, Bridgette A.
The Federal Aviation Administration (FAA), National Transportation and Safety Board (NTSB), National Aeronautics and Space Administration (NASA), numerous corporate entities, and research facilities have come together to determine ways to make air travel safer and more efficient. These efforts have resulted in the development of a concept known as the Next Generation (Next Gen) of Aircraft. The Next Gen concept promises to be a clear departure from the way in which aircraft operations are performed today. The Next Gen initiatives require that modifications be made to the existing National Airspace System (NAS) concept of operations, system-level requirements, software (SW) and hardware (HW) requirements, and SW and HW designs and implementations. A second example of the changes in the NAS is the shift away from air traffic controllers having the responsibility for separation assurance. In the proposed new scheme of free flight, each aircraft would be responsible for assuring that it is safely separated from surrounding aircraft. Free flight would allow the separation minima for enroute aircraft to be reduced from 2000 nautical miles (nm) to 1000 nm. Simply put, "Free Flight is a concept of air traffic management that permits pilots and controllers to share information and work together to manage air traffic from pre-flight through arrival without compromising safety [107]." The primary goal of this research project was to create a conceptual model that embodies the essential ingredients needed for a collision detection and avoidance system. This system was required to operate in two modes: the air traffic controller's perspective and the pilot's perspective. The secondary goal was to demonstrate that the technologies, procedures, and decision logic embedded in the conceptual model were able to effectively detect and avoid collision risks from both perspectives.
Embodied in the conceptual model are five distinct software modules: Data Acquisition, State Processor, Projection, Collision Detection, and Alerting and Resolution. The underlying algorithms in the Projection module are linear projection and Kalman filtering, which are used to estimate the future state of the aircraft. The Resolution and Alerting module comprises two algorithms: a generic alerting algorithm and the potential fields algorithm [71]. The conceptual model was created using Enterprise Architect™, and MATLAB™ was used to code the methods and to simulate conflict scenarios.
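The kind of Kalman filtering the Projection module relies on can be illustrated with a minimal one-dimensional constant-velocity predict-update cycle. The state, observation model, and noise covariances below are assumed for illustration; this is not the dissertation's actual implementation.

```python
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity state transition
H = np.array([[1.0, 0.0]])              # only position is observed
Q = 0.01 * np.eye(2)                    # process noise covariance (assumed)
R = np.array([[25.0]])                  # measurement noise covariance (assumed)

def kalman_step(x, P, z):
    """One predict-update cycle: project the state forward, then
    correct it with the position measurement z (a length-1 array)."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + (K @ (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x = np.array([0.0, 250.0])              # position (m), ground speed (m/s)
P = np.eye(2)
```

Running the predict step alone gives the linear projection used for look-ahead conflict detection; the update step fuses each new surveillance measurement into the estimate.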
Predator-prey models with component Allee effect for predator reproduction.
Terry, Alan J
2015-12-01
We present four predator-prey models with component Allee effect for predator reproduction. Using numerical simulation results for our models, we describe how the customary definitions of component and demographic Allee effects, which work well for single species models, can be extended to predators in predator-prey models by assuming that the prey population is held fixed. We also find that when the prey population is not held fixed, then these customary definitions may lead to conceptual problems. After this discussion of definitions, we explore our four models, analytically and numerically. Each of our models has a fixed point that represents predator extinction, which is always locally stable. We prove that the predator will always die out either if the initial predator population is sufficiently small or if the initial prey population is sufficiently small. Through numerical simulations, we explore co-existence fixed points. In addition, we demonstrate, by simulation, the existence of a stable limit cycle in one of our models. Finally, we derive analytical conditions for a co-existence trapping region in three of our models, and show that the fourth model cannot possess a particular kind of co-existence trapping region. We punctuate our results with comments on their real-world implications; in particular, we mention the possibility of prey resurgence from mortality events, and the possibility of failure in a biological pest control program.
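A generic model of this family can be sketched as follows; the functional forms and parameter values are illustrative assumptions, not the paper's four models. The factor P/(P + theta) multiplying predator reproduction is the component Allee effect: per-capita reproduction degrades when predators are scarce.

```python
def simulate(N0, P0, dt=0.001, steps=200000,
             r=1.0, K=10.0, a=0.2, c=0.5, m=0.4, theta=1.0):
    """Euler integration of logistic prey N and predator P with a
    component Allee effect in predator reproduction:
        dN/dt = r*N*(1 - N/K) - a*N*P
        dP/dt = c*a*N*P * (P/(P + theta)) - m*P
    All parameter values are illustrative. Returns final (N, P)."""
    N, P = N0, P0
    for _ in range(steps):
        dN = r * N * (1.0 - N / K) - a * N * P
        dP = c * a * N * P * (P / (P + theta)) - m * P
        N = max(N + dt * dN, 0.0)
        P = max(P + dt * dP, 0.0)
    return N, P
```

Consistent with the paper's result that predator extinction is always locally stable, a sufficiently small initial predator population in this sketch collapses toward zero while the prey recovers to its carrying capacity.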
Implementation of a 3d numerical model of a folded multilayer carbonate aquifer
NASA Astrophysics Data System (ADS)
Di Salvo, Cristina; Guyennon, Nicolas; Romano, Emanuele; Bruna Petrangeli, Anna; Preziosi, Elisabetta
2016-04-01
The main objective of this research is to present a case study of the numerical model implementation of a complex, structurally folded carbonate aquifer with a finite-difference, equivalent-porous-medium model. The case study aquifer (which extends over 235 km2 in the Apennine chain, Central Italy) provides a long-term average of 3.5 m3/s of good-quality groundwater to the surface river network, sustaining the minimum vital flow, and it is planned to be exploited in the coming years for public water supply. In the downstream part of the river in the study area, a "Site of Community Importance" includes the Nera River for its valuable aquatic fauna. The possible negative effects of the foreseen exploitation on groundwater-dependent ecosystems are therefore a great concern, and model-grounded scenarios are needed. This multilayer aquifer was conceptualized as five hydrostratigraphic units: three main aquifers (the uppermost unconfined, the central, and the deepest partly confined), separated by two locally discontinuous aquitards. The Nera River cuts through the two upper aquifers and acts as the main natural sink for groundwater. An equivalent porous medium approach was chosen. The complex tectonic structure of the aquifer requires several steps in defining the conceptual model; the presence of strongly dipping layers with very heterogeneous hydraulic conductivity results in different thicknesses of the saturated portions. Aquifers can have both unconfined and confined zones; drying and rewetting must be allowed when considering recharge/discharge cycles. All these characteristics can be included in the conceptual and numerical model; however, since flow and head calibration targets are scarce, over-parametrization of the model must be avoided.
Following the principle of parsimony, three steady-state numerical models were developed, starting from a simple model and then adding complexity: 2D (single layer), QUASI-3D (with a leakage term simulating flow through aquitards) and FULL-3D (with aquitards simulated explicitly and transient flow represented by the 3D governing equations). At first, steady-state simulations were run under average seasonal recharge. To overcome dry-cell problems in the FULL-3D model, the Newton-Raphson formulation for MODFLOW-2005 was invoked. Steady-state calibration was achieved mainly using annual average flow at four springs along the Nera riverbed and average water level data available in only two observation wells. Results show that a FULL-3D zoned model was required to match the observed distribution of river base flow. The FULL-3D model was then run in transient conditions (1990-2013) using monthly spatially distributed recharge estimated with the Thornthwaite-Mather method based on 60 years of climate data. The monitored flow of one spring, used for public water supply, served as proxy data to reconstruct the Nera River hydrograph; the proxy-based hydrograph was used for calibration of storage coefficients and further adjustment of model parameters. Once calibrated, the model was run under different aquifer management scenarios (i.e., pumping wells planned to be active for water supply); the related risk of depletion of spring discharge and of groundwater-surface water interaction was evaluated.
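The Thornthwaite-Mather recharge estimation mentioned above can be sketched roughly as follows: Thornthwaite's monthly potential evapotranspiration (here unadjusted for day length) feeding a simple soil-moisture bucket whose surplus becomes recharge. The soil store capacity is an assumed value, and the bucket is a simplification of the full method.

```python
def thornthwaite_pet(monthly_temps_c):
    """Monthly PET (mm) by Thornthwaite's formula, without the
    day-length correction factor. Months with T <= 0 get PET = 0."""
    I = sum((t / 5.0) ** 1.514 for t in monthly_temps_c if t > 0)
    a = 6.75e-7 * I**3 - 7.71e-5 * I**2 + 1.792e-2 * I + 0.49239
    return [16.0 * (10.0 * t / I) ** a if t > 0 else 0.0
            for t in monthly_temps_c]

def recharge(precip_mm, pet_mm, s_max=100.0):
    """Thornthwaite-Mather-style bucket: the soil store fills to
    s_max (assumed capacity, mm); any surplus becomes recharge,
    and deficit months draw the store down. Starts from a full store."""
    s, out = s_max, []
    for p, e in zip(precip_mm, pet_mm):
        s = s + p - e
        r = max(0.0, s - s_max)          # surplus leaves as recharge
        s = min(max(s, 0.0), s_max)      # clamp store to [0, s_max]
        out.append(r)
    return out
```

Driving the bucket with gridded monthly temperature and precipitation yields the spatially distributed recharge series used to force a transient groundwater model.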
Improved Conceptual Models Methodology (ICoMM) for Validation of Non-Observable Systems
2015-12-01
Approved for public release; distribution is unlimited. Dissertation by Sang M. Sok, December 2015. The improved conceptual model methodology (ICoMM) is developed in support of improving the structure of the conceptual model (CoM) for both face and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. Keating; W. Statham
2004-02-12
The purpose of this model report is to provide documentation of the conceptual and mathematical model (ASHPLUME) for atmospheric dispersal and subsequent deposition of ash on the land surface from a potential volcanic eruption at Yucca Mountain, Nevada. This report also documents the ash (tephra) redistribution conceptual model. The ASHPLUME conceptual model accounts for incorporation and entrainment of waste fuel particles associated with a hypothetical volcanic eruption through the Yucca Mountain repository and downwind transport of contaminated tephra. The ASHPLUME mathematical model describes the conceptual model in mathematical terms to allow for prediction of radioactive waste/ash deposition on the ground surface given that the hypothetical eruptive event occurs. This model report also describes the conceptual model for tephra redistribution from a basaltic cinder cone. Sensitivity analyses and model validation activities for the ash dispersal and redistribution models are also presented. Analyses documented in this model report will improve and clarify the previous documentation of the ASHPLUME mathematical model and its application to the Total System Performance Assessment (TSPA) for the License Application (TSPA-LA) igneous scenarios. This model report also documents the redistribution model product outputs based on analyses to support the conceptual model.
Recognising the Effects of Costing Assumptions in Educational Business Simulation Games
ERIC Educational Resources Information Center
Eckardt, Gordon; Selen, Willem; Wynder, Monte
2015-01-01
Business simulations are a powerful way to provide experiential learning that is focussed, controlled, and concentrated. Inherent in any simulation, however, are numerous assumptions that determine feedback, and hence the lessons learnt. In this conceptual paper we describe some common cost assumptions that are implicit in simulation design and…
Modelling strategies to predict the multi-scale effects of rural land management change
NASA Astrophysics Data System (ADS)
Bulygina, N.; Ballard, C. E.; Jackson, B. M.; McIntyre, N.; Marshall, M.; Reynolds, B.; Wheater, H. S.
2011-12-01
Changes to the rural landscape due to agricultural land management are ubiquitous, yet predicting the multi-scale effects of land management change on hydrological response remains an important scientific challenge. Much empirical research has been of little generic value due to inadequate design and funding of monitoring programmes, while the modelling issues challenge the capability of data-based, conceptual and physics-based modelling approaches. In this paper we report on a major UK research programme, motivated by a national need to quantify effects of agricultural intensification on flood risk. Working with a consortium of farmers in upland Wales, a multi-scale experimental programme (from experimental plots to 2nd order catchments) was developed to address issues of upland agricultural intensification. This provided data support for a multi-scale modelling programme, in which highly detailed physics-based models were conditioned on the experimental data and used to explore effects of potential field-scale interventions. A meta-modelling strategy was developed to represent detailed modelling in a computationally-efficient manner for catchment-scale simulation; this allowed catchment-scale quantification of potential management options. For more general application to data-sparse areas, alternative approaches were needed. Physics-based models were developed for a range of upland management problems, including the restoration of drained peatlands, afforestation, and changing grazing practices. Their performance was explored using literature and surrogate data; although subject to high levels of uncertainty, important insights were obtained, of practical relevance to management decisions. In parallel, regionalised conceptual modelling was used to explore the potential of indices of catchment response, conditioned on readily-available catchment characteristics, to represent ungauged catchments subject to land management change. 
Although based in part on speculative relationships, significant predictive power was derived from this approach. Finally, using a formal Bayesian procedure, these different sources of information were combined with local flow data in a catchment-scale conceptual model application, i.e. using small-scale physical properties, regionalised signatures of flow, and available flow measurements.
Vezzaro, L; Sharma, A K; Ledin, A; Mikkelsen, P S
2015-03-15
The estimation of micropollutant (MP) fluxes in stormwater systems is a fundamental prerequisite when preparing strategies to reduce stormwater MP discharges to natural waters. Dynamic integrated models can be important tools in this step, as they can be used to integrate the limited data provided by monitoring campaigns and to evaluate the performance of different strategies based on model simulation results. This study presents an example in which six different control strategies, including both source control and end-of-pipe treatment, were compared. The comparison focused on fluxes of heavy metals (copper, zinc) and organic compounds (fluoranthene). MP fluxes were estimated using an integrated dynamic model in combination with stormwater quality measurements. MP sources were identified using GIS land usage data, runoff quality was simulated using a conceptual accumulation/washoff model, and a stormwater retention pond was simulated using a dynamic treatment model based on MP inherent properties. Uncertainty in the results was estimated with a pseudo-Bayesian method. Despite the great uncertainty in the MP fluxes estimated by the runoff quality model, it was possible to compare the six scenarios in terms of discharged MP fluxes, compliance with water quality criteria, and sediment accumulation. Source-control strategies obtained better results in terms of reduction of MP emissions, but all the simulated strategies failed to fulfil the criteria based on emission limit values. The results presented in this study show how the efficiency of MP pollution control strategies can be quantified by combining advanced modeling tools (integrated stormwater quality model, uncertainty calibration). Copyright © 2014 Elsevier Ltd. All rights reserved.
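A conceptual accumulation/washoff model of the kind used here for runoff quality can be sketched as follows. The structure (exponential build-up toward an equilibrium, rating-curve wash-off) and all parameter names and values are illustrative assumptions, not the calibrated model used in the study.

```python
def buildup_washoff(rain_mm, dt_days=1.0, accu=0.5, disp=0.1,
                    wash_coef=0.05, wash_exp=1.1, m0=0.0):
    """Conceptual pollutant accumulation/washoff at a daily step.
    accu: dry-weather accumulation rate (mass/area/day, assumed)
    disp: first-order removal of accumulated mass (1/day, assumed)
    wash-off on a wet step: wash_coef * rain**wash_exp * mass,
    capped at the available mass.
    Returns (mass left on the surface, washed-off series)."""
    m, washed = m0, []
    for r in rain_mm:
        # build-up relaxes toward the equilibrium load accu/disp
        m += (accu - disp * m) * dt_days
        w = min(m, wash_coef * r ** wash_exp * m) if r > 0 else 0.0
        m -= w
        washed.append(w)
    return m, washed
```

The washed-off series, multiplied by an event-mean pollutant content, gives the MP load entering downstream treatment, which is where a pond model like the one in the study would take over.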
NASA Astrophysics Data System (ADS)
Mottes, Charles; Lesueur-Jannoyer, Magalie; Charlier, Jean-Baptiste; Carles, Céline; Guéné, Mathilde; Le Bail, Marianne; Malézieux, Eric
2015-10-01
Simulation of flows and pollutant transfers in heterogeneous media is widely recognized to be a remaining frontier in hydrology research. We present a new modeling approach to simulate agricultural pollution in watersheds: WATPPASS, a model for Watershed Agricultural Techniques and Pesticide Practices ASSessment. It is designed to assess mean pesticide concentrations and loads that result from the use of pesticides in horticultural watersheds located on heterogeneous subsoil. WATPPASS is suited for small watersheds with significant groundwater flows and complex aquifer systems. The model segments the watershed into fields with independent hydrological and pesticide transfers at the ground surface. Infiltrated water and pesticides are routed toward the outlet using a conceptual reservoir model. We applied WATPPASS to a heterogeneous tropical volcanic watershed of Martinique in the French West Indies. We carried out a hydrological analysis that defined modeling constraints: (i) a spatial variability of runoff/infiltration partitioning according to land use, and (ii) a predominance of groundwater flow paths in two overlapping aquifers under permeable soils (50-60% of annual flows). We carried out simulations over a 550-day period at a daily time step for hydrology (Nash-Sutcliffe efficiency on square-root-transformed flows > 0.75). Weekly concentrations and loads of a persistent organic pesticide (chlordecone) were simulated for 67 weeks to evaluate the modeling approach. Pesticide simulations without specific calibration detected the mean long-term measured concentration, leading to a good quantification of the cumulative loads (5% error), but failed to represent the concentration peaks with the correct timing. Nevertheless, we succeeded in adjusting the model structure to better represent the temporal dynamics of pesticide concentrations. This modification requires a proper evaluation on an independent dataset. 
Finally, WATPPASS is a compromise between complexity and ease of use that makes it well suited to cropping system assessment in complex pedological and geological environments.
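The conceptual reservoir routing that WATPPASS applies to infiltrated water can be illustrated with a single linear reservoir. The one-reservoir structure and the rate constant below are simplifying assumptions for illustration, not the model's actual configuration:

```python
def route_linear_reservoir(recharge, k, s0=0.0):
    """Route daily infiltration through one linear reservoir.
    Storage S (mm) drains as Q = k * S each day (0 < k < 1)."""
    s, flows = s0, []
    for r in recharge:
        s += r                # add the day's infiltrated water
        q = k * s             # outflow proportional to storage
        s -= q
        flows.append(q)
    return flows

# a 10 mm recharge pulse followed by two dry days, k = 0.5 per day
flows = route_linear_reservoir([10.0, 0.0, 0.0], k=0.5)
```

Real conceptual models typically chain or parallel several such reservoirs to represent overlapping aquifers, as the two-aquifer system described here would require.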
Conceptual achievement of 1GBq activity in a Plasma Focus driven system.
Tabbakh, Farshid; Sadat Kiai, Seyed Mahmood; Pashaei, Mohammad
2017-11-01
This work presents an approach to evaluating radioisotope production by means of typical dense plasma focus (DPF) devices. The production rates of the appropriate positron emitters F-18, N-13 and O-15 have been studied. The beam-target mechanism was simulated with the GEANT4 Monte Carlo toolkit using the QGSP_BIC and QGSP_INCLXX physics models for comparison. The results for positron emitters were evaluated against reported experimental data; the agreement between simulations and experiments supports the use of this code as a reliable tool for optimizing DPF-driven systems to achieve 1 GBq of activity of the produced radioisotope. Copyright © 2017 Elsevier Ltd. All rights reserved.
Improvements, testing and development of the ADM-τ sub-grid surface tension model for two-phase LES
NASA Astrophysics Data System (ADS)
Aniszewski, Wojciech
2016-12-01
In this paper, a specific subgrid term occurring in Large Eddy Simulation (LES) of two-phase flows is investigated. This and other subgrid terms are presented; we subsequently elaborate on the existing models for them and re-formulate the ADM-τ model for sub-grid surface tension previously published by these authors. This paper presents a substantial, conceptual simplification over the original model version, accompanied by a decrease in its computational cost. At the same time, it addresses the issues the original model version faced, e.g. it introduces non-isotropic applicability criteria based on the resolved interface's principal curvature radii. Additionally, this paper introduces more thorough testing of the ADM-τ, in both simple and complex flows.
Complex systems and health behavior change: insights from cognitive science.
Orr, Mark G; Plaut, David C
2014-05-01
To provide proof-of-concept that health behavior can be instantiated as a computational model that is informed by cognitive science, the Theory of Reasoned Action, and complex systems theory. We conducted a synthetic review of the intersection of health behavior change and cognitive science. We conducted simulations, using a computational model of health behavior (a constraint satisfaction artificial neural network), and tested whether the model exhibited complex dynamic behavior. The model exhibited clear signs of complex dynamic behavior. Health behavior can be conceptualized as constraint satisfaction: a negotiation between the current behavioral state and the social contexts in which it operates. We outlined implications for moving forward with computational models of health behavior.
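A constraint satisfaction artificial neural network of the kind described can be sketched as a small Hopfield-style network: units settle into a state that satisfies as many soft constraints (weights) as possible. The three-unit topology, weights, and biases below are invented for illustration and do not come from the paper:

```python
import numpy as np

def settle(weights, bias, state, steps=50):
    """Asynchronously update binary units until the network settles into
    a state consistent with the (symmetric) constraint weights."""
    state = state.copy()
    for _ in range(steps):
        for i in range(len(state)):
            net = weights[i] @ state + bias[i]
            state[i] = 1 if net > 0 else 0
    return state

# two mutually supportive units (e.g. intention and behavior),
# each inhibited by a third unit (e.g. an opposing social context)
W = np.array([[ 0.0,  1.0, -2.0],
              [ 1.0,  0.0, -2.0],
              [-2.0, -2.0,  0.0]])
b = np.array([0.5, 0.5, 0.1])
final = settle(W, b, np.array([1, 0, 0]))   # partial cue: only unit 0 active
```

Starting from the partial cue, the network completes the mutually consistent pair and keeps the inhibitory unit off, which is the pattern-completion behavior such models are used to demonstrate.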
Benoit, Richard; Mion, Lorraine
2012-08-01
This paper proposes a conceptual model to guide research on pressure ulcer risk in critically ill patients, who are at high risk for pressure ulcer development; however, no conceptual model currently exists to guide risk assessment in this population. Results from a review of prospective studies were evaluated for design quality and level of statistical reporting. Multivariate findings from studies having high or medium design quality by the National Institute of Health and Clinical Excellence standards were conceptually grouped. The conceptual groupings were integrated into Braden and Bergstrom's (Braden and Bergstrom [1987] Rehabilitation Nursing, 12, 8-12, 16) conceptual model, retaining their original constructs and augmenting their concept of intrinsic factors for tissue tolerance. The model could enhance consistency in research on pressure ulcer risk factors. Copyright © 2012 Wiley Periodicals, Inc.
Geza, Mengistu; Lowe, Kathryn S; Huntzinger, Deborah N; McCray, John E
2013-07-01
Onsite wastewater treatment systems are commonly used in the United States to reclaim domestic wastewater. A distinct biomat forms at the infiltrative surface, causing resistance to flow and decreasing soil moisture below the biomat. To simulate these conditions, previous modeling studies have used a two-layer approach: a thin biomat layer (1-5 cm thick) and the native soil layer below the biomat. However, the effect of wastewater application extends below the biomat layer. We used numerical modeling supported by experimental data to justify a new conceptual model that includes an intermediate zone (IZ) below the biomat. The conceptual model was set up using Hydrus 2D and calibrated against soil moisture and water flux measurements. The estimated hydraulic conductivity value for the IZ was between that of the biomat and that of the native soil. The IZ has important implications for wastewater treatment. When the IZ was not considered, a loading rate of 5 cm d⁻¹ resulted in 8.5 cm of ponding. With the IZ, the same loading rate resulted in 9.5 cm of ponding. Without the IZ, up to 3.1 cm d⁻¹ of wastewater could be applied without ponding; with the IZ, only up to 2.8 cm d⁻¹ could be applied without ponding. The IZ also plays a significant role in soil moisture distribution. Without the IZ, near-saturation conditions were observed only within the biomat, whereas near-saturation conditions extended below the biomat with the IZ. Accurate prediction of ponding is important to prevent surfacing of wastewater. The degree of water and air saturation influences pollutant treatment efficiency through residence time, volatility, and biochemical reactions. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
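One way to see why adding an intermediate zone changes ponding predictions is to compute the effective vertical conductivity of the layered profile: for flow perpendicular to layers, the standard result is the thickness-weighted harmonic mean, so any extra low-conductivity layer lowers the whole profile's capacity. The thicknesses and conductivities below are hypothetical placeholders, not the study's calibrated values:

```python
def effective_k_vertical(layers):
    """Harmonic-mean vertical hydraulic conductivity of a layered profile.
    layers: list of (thickness_cm, K_cm_per_day) tuples."""
    total = sum(t for t, _ in layers)
    return total / sum(t / k for t, k in layers)

# hypothetical profiles: biomat + native soil vs biomat + IZ + native soil
two_layer   = [(2.0, 0.2), (58.0, 100.0)]
three_layer = [(2.0, 0.2), (10.0, 5.0), (48.0, 100.0)]

k_two = effective_k_vertical(two_layer)
k_three = effective_k_vertical(three_layer)   # lower: the IZ adds resistance
```

With these placeholder numbers the three-layer profile transmits less water per unit gradient, consistent with the paper's finding that ignoring the IZ overestimates the allowable loading rate.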
Simulation-Based Analysis of Reentry Dynamics for the Sharp Atmospheric Entry Vehicle
NASA Technical Reports Server (NTRS)
Tillier, Clemens Emmanuel
1998-01-01
This thesis describes the analysis of the reentry dynamics of a high-performance lifting atmospheric entry vehicle through numerical simulation tools. The vehicle, named SHARP, is currently being developed by the Thermal Protection Materials and Systems branch of NASA Ames Research Center, Moffett Field, California. The goal of this project is to provide insight into trajectory tradeoffs and vehicle dynamics using simulation tools that are powerful, flexible, user-friendly and inexpensive. Implemented using MATLAB and SIMULINK, these tools are developed with an eye towards further use in the conceptual design of the SHARP vehicle's trajectory and flight control systems. A trajectory simulator is used to quantify the entry capabilities of the vehicle subject to various operational constraints. Using an aerodynamic database computed by NASA and a model of the Earth, the simulator generates the vehicle trajectory in three-dimensional space based on aerodynamic angle inputs. Requirements for entry along the SHARP aerothermal performance constraint are evaluated for different control strategies. The effect of vehicle mass on entry parameters is investigated, and the cross range capability of the vehicle is evaluated. Trajectory results are presented and interpreted. A six degree of freedom simulator builds on the trajectory simulator and provides attitude simulation for future entry controls development. A Newtonian aerodynamic model including control surfaces and a mass model are developed. A visualization tool for interpreting simulation results is described. Control surfaces are roughly sized. A simple controller is developed to fly the vehicle along its aerothermal performance constraint using aerodynamic flaps for control. This end-to-end demonstration proves the suitability of the 6-DOF simulator for future flight control system development. Finally, issues surrounding real-time simulation with hardware in the loop are discussed.
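The core of such a trajectory simulator is the integration of point-mass entry dynamics. A minimal planar sketch follows, in Python rather than the thesis's MATLAB/SIMULINK, with a simple exponential atmosphere and entirely hypothetical vehicle parameters (mass, reference area, Cd, Cl):

```python
import math

RE, G0, RHO0, HSCALE = 6.371e6, 9.81, 1.225, 7200.0  # round-Earth / atmosphere constants

def entry_step(h, v, gamma, m, s_ref, cd, cl, dt):
    """One Euler step of planar point-mass entry dynamics over a spherical,
    non-rotating Earth. gamma is the flight-path angle in radians."""
    g = G0 * (RE / (RE + h)) ** 2
    rho = RHO0 * math.exp(-h / HSCALE)            # exponential atmosphere
    q = 0.5 * rho * v * v                         # dynamic pressure
    drag, lift = q * s_ref * cd, q * s_ref * cl
    dv = -drag / m - g * math.sin(gamma)
    dgamma = lift / (m * v) + (v / (RE + h) - g / v) * math.cos(gamma)
    dh = v * math.sin(gamma)
    return h + dh * dt, v + dv * dt, gamma + dgamma * dt

# hypothetical vehicle: 1000 kg, 10 m^2 reference area, Cd = 1.0, Cl = 0.5
h, v, gamma = 80e3, 7500.0, math.radians(-1.0)
for _ in range(60):                               # one minute at 1 s steps
    h, v, gamma = entry_step(h, v, gamma, 1000.0, 10.0, 1.0, 0.5, 1.0)
```

A production simulator would replace the constant coefficients with an aerodynamic database lookup, use a higher-order integrator, and add the lateral channel for three-dimensional trajectories.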
An agent-based hydroeconomic model to evaluate water policies in Jordan
NASA Astrophysics Data System (ADS)
Yoon, J.; Gorelick, S.
2014-12-01
Modern water systems can be characterized by a complex network of institutional and private actors that represent competing sectors and interests. Identifying solutions to enhance water security in such systems calls for analysis that can adequately account for this level of complexity and interaction. Our work focuses on the development of a hierarchical, multi-agent, hydroeconomic model that attempts to realistically represent complex interactions between hydrologic and multi-faceted human systems. The model is applied to Jordan, one of the most water-poor countries in the world. In recent years, the water crisis in Jordan has escalated due to an ongoing drought and influx of refugees from regional conflicts. We adopt a modular approach in which biophysical modules simulate natural and engineering phenomena, and human modules represent behavior at multiple scales of decision making. The human modules employ agent-based modeling, in which agents act as autonomous decision makers at the transboundary, state, organizational, and user levels. A systematic nomenclature and conceptual framework is used to characterize model agents and modules. Concepts from the Unified Modeling Language (UML) are adopted to promote clear conceptualization of model classes and process sequencing, establishing a foundation for full deployment of the integrated model in a scalable object-oriented programming environment. Although the framework is applied to the Jordanian water context, it is generalizable to other regional human-natural freshwater supply systems.
NASA Astrophysics Data System (ADS)
Robins, N. S.; Rutter, H. K.; Dumpleton, S.; Peach, D. W.
2005-01-01
Groundwater investigation has long depended on the process of developing a conceptual flow model as a precursor to developing a mathematical model, which in turn, for complex aquifers, may lead to the development of a numerical approximation model. The assumptions made in the development of the conceptual model depend heavily on the geological framework defining the aquifer, and if the conceptual model is inappropriate then subsequent modelling will also be incorrect. Paradoxically, the development of a robust conceptual model remains difficult, not least because this 3D paradigm is usually reduced to 2D plans and sections. 3D visualisation software is now available to facilitate the development of the conceptual model, to make the model more robust and defensible and to assist in demonstrating the hydraulics of the aquifer system. Case studies are presented to demonstrate the role and cost-effectiveness of the visualisation process.
ERIC Educational Resources Information Center
Jaakkola, Tomi; Nurmi, Sami; Veermans, Koen
2011-01-01
The aim of this experimental study was to compare learning outcomes of students using a simulation alone (simulation environment) with outcomes of those using a simulation in parallel with real circuits (combination environment) in the domain of electricity, and to explore how learning outcomes in these environments are mediated by implicit (only…
NASA Astrophysics Data System (ADS)
Dakhlaoui, H.; Ruelland, D.; Tramblay, Y.; Bargaoui, Z.
2017-07-01
To evaluate the impact of climate change on water resources at the catchment scale, not only future projections of climate are necessary but also robust rainfall-runoff models that must be fairly reliable under changing climate conditions. The aim of this study was thus to assess the robustness of three conceptual rainfall-runoff models (GR4J, HBV and IHACRES) on five basins in northern Tunisia under long-term climate variability, in the light of available future climate scenarios for this region. The robustness of the models was evaluated using a differential split sample test based on a climate classification of the observation period that simultaneously accounted for precipitation and temperature conditions. The study catchments include the main hydrographical basins in northern Tunisia, which produce most of the surface water resources in the country. A 30-year period (1970-2000) was used to capture a wide range of hydro-climatic conditions. The calibration was based on the Kling-Gupta Efficiency (KGE) criterion, while model transferability was evaluated based on the Nash-Sutcliffe efficiency criterion and volume error. The three hydrological models were shown to behave similarly under climate variability. The models simulated the runoff pattern better when transferred to wetter and colder conditions than to drier and warmer ones. It was shown that their robustness became unacceptable when climate conditions involved a decrease of more than 25% in annual precipitation and an increase of more than 1.75 °C in annual mean temperatures. The reduction in model robustness may be partly due to the climate dependence of some parameters. When compared to precipitation and temperature projections in the region, the limits of transferability obtained in this study are generally respected in the short and medium term. 
For long-term projections under the most pessimistic greenhouse gas emission scenarios, the limits of transferability are generally not respected, which may hamper the use of conceptual models for hydrological projections in northern Tunisia.
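The calibration and evaluation criteria named above have standard closed forms; a minimal sketch of both (assuming the usual decomposition of KGE into correlation, variability ratio, and bias ratio):

```python
import numpy as np

def kge(sim, obs):
    """Kling-Gupta Efficiency: 1.0 is a perfect fit."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]        # linear correlation
    alpha = np.std(sim) / np.std(obs)      # variability ratio
    beta = np.mean(sim) / np.mean(obs)     # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1.0 is a perfect fit, 0.0 matches the mean."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - np.mean(obs)) ** 2)

q_obs = np.array([1.0, 2.0, 3.0, 4.0])     # toy discharge series
perfect = kge(q_obs, q_obs)
biased = kge(q_obs * 1.2, q_obs)           # 20% volume bias degrades KGE
```

In a split-sample test such as the one described, these scores would be computed on the validation sub-period after calibrating on a climatically different one.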
Spatial Learning and Computer Simulations in Science
ERIC Educational Resources Information Center
Lindgren, Robb; Schwartz, Daniel L.
2009-01-01
Interactive simulations are entering mainstream science education. Their effects on cognition and learning are often framed by the legacy of information processing, which emphasized amodal problem solving and conceptual organization. In contrast, this paper reviews simulations from the vantage of research on perception and spatial learning,…
2009-01-01
Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075
The Cancer Family Caregiving Experience: An Updated and Expanded Conceptual Model
Fletcher, Barbara Swore; Miaskowski, Christine; Given, Barbara; Schumacher, Karen
2011-01-01
Objective The decade from 2000–2010 was an era of tremendous growth in family caregiving research specific to the cancer population. This research has implications for how cancer family caregiving is conceptualized, yet the most recent comprehensive model of cancer family caregiving was published ten years ago. Our objective was to develop an updated and expanded comprehensive model of the cancer family caregiving experience, derived from concepts and variables used in research during past ten years. Methods A conceptual model was developed based on cancer family caregiving research published from 2000–2010. Results Our updated and expanded model has three main elements: 1) the stress process, 2) contextual factors, and 3) the cancer trajectory. Emerging ways of conceptualizing the relationships between and within model elements are addressed, as well as an emerging focus on caregiver-patient dyads as the unit of analysis. Conclusions Cancer family caregiving research has grown dramatically since 2000 resulting in a greatly expanded conceptual landscape. This updated and expanded model of the cancer family caregiving experience synthesizes the conceptual implications of an international body of work and demonstrates tremendous progress in how cancer family caregiving research is conceptualized. PMID:22000812
skeleSim: an extensible, general framework for population genetic simulation in R.
Parobek, Christian M; Archer, Frederick I; DePrenger-Levin, Michelle E; Hoban, Sean M; Liggins, Libby; Strand, Allan E
2017-01-01
Simulations are a key tool in molecular ecology for inference and forecasting, as well as for evaluating new methods. Due to growing computational power and a diversity of software with different capabilities, simulations are becoming increasingly powerful and useful. However, the widespread use of simulations by geneticists and ecologists is hindered by difficulties in understanding these software packages' complex capabilities, composing code and input files, a daunting bioinformatics barrier and a steep conceptual learning curve. skeleSim (an R package) guides users in choosing appropriate simulations, setting parameters, calculating genetic summary statistics and organizing data output, in a reproducible pipeline within the R environment. skeleSim is designed to be an extensible framework that can 'wrap' around any simulation software (inside or outside the R environment) and be extended to calculate and graph any genetic summary statistics. Currently, skeleSim implements coalescent and forward-time models available in the fastsimcoal2 and rmetasim simulation engines to produce null distributions for multiple population genetic statistics and marker types, under a variety of demographic conditions. skeleSim is intended to make simulations easier while still allowing full model complexity to ensure that simulations play a fundamental role in molecular ecology investigations. skeleSim can also serve as a teaching tool: demonstrating the outcomes of stochastic population genetic processes; teaching general concepts of simulations; and providing an introduction to the R environment with a user-friendly graphical user interface (using shiny). © 2016 John Wiley & Sons Ltd.
2007-12-01
Liberation Tigers of Tamil Eelam (LTTE), a rebel group that has been fighting for an independent Tamil homeland in the north of Sri Lanka since 1976. The...date, and has destroyed numerous boats, even a warship in the Sri Lankan navy (SLN) [13]. The Abu Sayyaf group (ASG) is an example of the several...and other environments. Terence Tan is studying the application of conceptual-blending theory to agents, for naval tactical-plan generation in
Rapid Human-Computer Interactive Conceptual Design of Mobile and Manipulative Robot Systems
2015-05-19
algorithm based on Age-Fitness Pareto Optimization (AFPO) ([9]) with an additional user preference objective and a neural network-based user model, we...greater than 40, which is about 5 times further than any robot traveled in our experiments. 3.3 Methods The algorithm uses a client-server computational...architecture. The client here is an interactive program which takes a pair of controllers as input, simulates two copies of the robot with
Design and simulation of a cable-pulley-based transmission for artificial ankle joints
NASA Astrophysics Data System (ADS)
Liu, Huaxin; Ceccarelli, Marco; Huang, Qiang
2016-06-01
In this paper, a mechanical transmission based on a cable pulley is proposed for human-like actuation in human-scale artificial ankle joints. The anatomical and articular characteristics of the human ankle are discussed for proper biomimetic inspiration in designing an accurate, efficient, and robust motion control of artificial ankle joint devices. The design procedure is presented through the inclusion of conceptual considerations and design details for an interactive solution of the transmission system. A mechanical design is elaborated for the ankle joint with pitch angular motion. A multi-body dynamic simulation model is elaborated accordingly and evaluated numerically in the ADAMS environment. Results of the numerical simulations are discussed to evaluate the dynamic performance of the proposed design solution and to investigate the feasibility of the proposed design in future applications for humanoid robots.
Costello, John P; Olivieri, Laura J; Krieger, Axel; Thabit, Omar; Marshall, M Blair; Yoo, Shi-Joon; Kim, Peter C; Jonas, Richard A; Nath, Dilip S
2014-07-01
The current educational approach for teaching congenital heart disease (CHD) anatomy to students involves instructional tools and techniques that have significant limitations. This study sought to assess the feasibility of utilizing present-day three-dimensional (3D) printing technology to create high-fidelity synthetic heart models with ventricular septal defect (VSD) lesions and applying these models to a novel, simulation-based educational curriculum for premedical and medical students. Archived, de-identified magnetic resonance images of five common VSD subtypes were obtained. These cardiac images were then segmented and built into 3D computer-aided design models using Mimics Innovation Suite software. An Objet500 Connex 3D printer was subsequently utilized to print a high-fidelity heart model for each VSD subtype. Next, a simulation-based educational curriculum using these heart models was developed and implemented in the instruction of 29 premedical and medical students. Assessment of this curriculum was undertaken with Likert-type questionnaires. High-fidelity VSD models were successfully created utilizing magnetic resonance imaging data and 3D printing. Following instruction with these high-fidelity models, all students reported significant improvement in knowledge acquisition (P < .0001), knowledge reporting (P < .0001), and structural conceptualization (P < .0001) of VSDs. It is feasible to use present-day 3D printing technology to create high-fidelity heart models with complex intracardiac defects. Furthermore, this tool forms the foundation for an innovative, simulation-based educational approach to teach students about CHD and creates a novel opportunity to stimulate their interest in this field. © The Author(s) 2014.
Initial Conceptualization and Application of the Alaska Thermokarst Model
NASA Astrophysics Data System (ADS)
Bolton, W. R.; Lara, M. J.; Genet, H.; Romanovsky, V. E.; McGuire, A. D.
2015-12-01
Thermokarst topography forms whenever ice-rich permafrost thaws and the ground subsides due to the volume loss when ground ice transitions to water. The Alaska Thermokarst Model (ATM) is a large-scale, state-and-transition model designed to simulate transitions between landscape units affected by thermokarst disturbance. The ATM uses a frame-based methodology to track transitions and the proportion of cohorts within a 1-km² grid cell. In the arctic tundra environment, the ATM tracks thermokarst-related transitions among wetland tundra, graminoid tundra, shrub tundra, and thermokarst lakes. In the boreal forest environment, the ATM tracks transitions among forested permafrost plateau, thermokarst lakes, collapse scar fens and bogs. The transition from one cohort to another due to thermokarst processes can take place if thaw reaches ice-rich ground layers either due to pulse disturbance (i.e. large precipitation event or fires), or due to gradual active layer deepening that eventually results in penetration of the protective layer. The protective layer buffers the ice-rich soils from the land surface and is critical to determine how susceptible an area is to thermokarst degradation. The rate of terrain transition in our model is determined by a set of rules that are based upon the ice content of the soil, the drainage efficiency (or the ability of the landscape to store or transport water), the cumulative probability of thermokarst initiation, distance from rivers, lake dynamics (increasing, decreasing, or stable), and other factors. Tundra types are allowed to transition from one type to another (for example, wetland tundra to graminoid tundra) under favorable climatic conditions. In this study, we present our conceptualization and initial simulation results from the arctic (the Barrow Peninsula) and boreal (the Tanana Flats) regions of Alaska.
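The frame-based, state-and-transition bookkeeping described can be sketched as annual rate-driven transfers between cohort fractions within a grid cell. The cohort names echo the abstract, but the rates and the simple rule structure are illustrative assumptions, not the ATM's actual rule set:

```python
def step_cell(cohorts, transitions):
    """One annual update of cohort fractions within a 1-km² grid cell.
    transitions: {(src, dst): rate}, each rate a fraction of src moved."""
    new = dict(cohorts)
    for (src, dst), rate in transitions.items():
        moved = cohorts[src] * rate       # rates read the start-of-year state
        new[src] -= moved
        new[dst] = new.get(dst, 0.0) + moved
    return new

cell = {"graminoid_tundra": 0.7, "wetland_tundra": 0.2, "thermokarst_lake": 0.1}
rules = {("graminoid_tundra", "wetland_tundra"): 0.05,  # thermokarst subsidence
         ("wetland_tundra", "thermokarst_lake"): 0.02}
cell = step_cell(cell, rules)
```

In the full model the rates would not be constants: they would be recomputed each year from ice content, drainage efficiency, protective-layer thickness, and the other factors the abstract lists.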
The Use of a Computer Simulation to Promote Conceptual Change: A Quasi-Experimental Study
ERIC Educational Resources Information Center
Trundle, Kathy Cabe; Bell, Randy L.
2010-01-01
This mixed-methods investigation compared the effectiveness of three instructional approaches in achieving desired conceptual change among early childhood preservice teachers (n = 157). Each of the three treatments employed inquiry-based instruction on moon phases using data collected from: (1) the planetarium software program, Starry Night[TM],…
ERIC Educational Resources Information Center
Rates, Christopher A.; Mulvey, Bridget K.; Feldon, David F.
2016-01-01
Components of complex systems apply across multiple subject areas, and teaching these components may help students build unifying conceptual links. Students, however, often have difficulty learning these components, and limited research exists to understand what types of interventions may best help improve understanding. We investigated 32 high…
A Hydrological Modeling Framework for Flood Risk Assessment for Japan
NASA Astrophysics Data System (ADS)
Ashouri, H.; Chinnayakanahalli, K.; Chowdhary, H.; Sen Gupta, A.
2016-12-01
Flooding has been the most frequent natural disaster that claims lives and imposes significant economic losses on human societies worldwide. Japan, with an annual rainfall of up to approximately 4000 mm, is extremely vulnerable to flooding. The focus of this research is to develop a macroscale hydrologic model for simulating flooding toward an improved understanding and assessment of flood risk across Japan. The framework employs a conceptual hydrological model, known as the Probability Distributed Model (PDM), as well as the Muskingum-Cunge flood routing procedure for simulating streamflow. In addition, a Temperature-Index model is incorporated to account for snowmelt and its contribution to streamflow. For an efficient calibration of the model, in terms of computational timing and convergence of the parameters, a set of a priori parameters is obtained based on the relationships between the model parameters and the physical properties of watersheds. In this regard, we have implemented a particle tracking algorithm and a statistical model which use high-resolution Digital Terrain Models to estimate different time-related parameters of the model, such as the time to peak of the unit hydrograph. In addition, global soil moisture and depth data are used to generate an a priori estimate of maximum soil moisture capacity, an important parameter of the PDM model. Once the model is calibrated, its performance is examined during Typhoon Nabi, which struck Japan in September 2005 and caused severe flooding throughout the country. The model is also validated for the extreme precipitation event in 2012 which affected Kyushu. In both cases, quantitative measures show that simulated streamflow depicts good agreement with gauge-based observations. The model is employed to simulate thousands of possible flood events for the entire country, which forms the basis for a comprehensive flood risk assessment and loss estimation for the flood insurance industry.
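The Temperature-Index snowmelt component mentioned above is conventionally a degree-day scheme: melt is proportional to the excess of air temperature over a base threshold. The degree-day factor and the two threshold temperatures below are assumed placeholder values, not the framework's calibrated parameters:

```python
def degree_day_melt(temps, precip, t_base=0.0, ddf=3.0, t_snow=1.0):
    """Daily temperature-index snow model (all depths in mm water equivalent).
    Precipitation accumulates as snow when T <= t_snow; the pack melts at
    ddf mm per °C-day above t_base, limited by the available snowpack."""
    pack, melt_out = 0.0, []
    for t, p in zip(temps, precip):
        if t <= t_snow:                       # cold day: precip falls as snow
            pack += p
        melt = min(pack, max(0.0, ddf * (t - t_base)))
        pack -= melt
        melt_out.append(melt)                 # melt feeds the runoff model
    return melt_out, pack

# two cold snowy days, then a warm dry day
melt, pack = degree_day_melt([-5.0, -2.0, 4.0], [10.0, 5.0, 0.0])
```

In the full framework the daily melt series would simply be added to effective rainfall before the PDM soil-moisture accounting and Muskingum-Cunge routing.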
Montgomery, Erwin B.; He, Huang
2016-01-01
The efficacy of Deep Brain Stimulation (DBS) for an expanding array of neurological and psychiatric disorders demonstrates directly that DBS affects the basic electroneurophysiological mechanisms of the brain. The increasing array of active electrode configurations, stimulation currents, pulse widths, frequencies, and pulse patterns provides valuable tools to probe electroneurophysiological mechanisms. The extension of basic electroneurophysiological and anatomical concepts using sophisticated computational modeling and simulation has provided relatively straightforward explanations of all the DBS parameters except frequency. This article summarizes current thought about frequency and the relevant observations. Current methodological and conceptual errors are critically examined in the hope that future work will not replicate them. One possible alternative theory is presented to provide a contrast to many current theories: DBS, conceptually, is a noisy discrete oscillator interacting with the basal ganglia–thalamic–cortical system of multiple re-entrant, discrete oscillators. Implications for positive and negative resonance, stochastic resonance and coherence, noisy synchronization, and holographic memory (related to movement generation) are presented. The time course of DBS neuronal responses demonstrates an evolution of the DBS response consistent with the dynamics of re-entrant mechanisms. Finally, computational modeling demonstrates dynamics identical to those seen in neuronal activities recorded from human and nonhuman primates, illustrating the differences between discrete and continuous harmonic oscillators and the power of conceptualizing the nervous system as composed of interacting discrete nonlinear oscillators. PMID:27548234
ERIC Educational Resources Information Center
Greca, Ileana M.; Seoane, Eugenia; Arriassecq, Irene
2014-01-01
Computers and simulations represent an undeniable aspect of daily scientific life, the use of simulations being comparable to the introduction of the microscope and the telescope in the development of knowledge. In science education, simulations have been proposed for over three decades as useful tools to improve the conceptual understanding of…
ERIC Educational Resources Information Center
Moizer, Jonathan; Lean, Jonathan
2010-01-01
This article presents a conceptual analysis of simulation game adoption and use across university faculty. The metaphor of epidemiology is used to characterize the diffusion of simulation games for teaching and learning. A simple stock-flow diagram is presented to illustrate this dynamic. Future scenarios for simulation game adoption are…
Weeks, Margaret R; Li, Jianghong; Lounsbury, David; Green, Helena Danielle; Abbott, Maryann; Berman, Marcie; Rohena, Lucy; Gonzalez, Rosely; Lang, Shawn; Mosher, Heather
2017-12-01
Achieving community-level goals to eliminate the HIV epidemic requires coordinated efforts through community consortia with a common purpose to examine and critique their own HIV testing and treatment (T&T) care system and build effective tools to guide their efforts to improve it. Participatory system dynamics (SD) modeling offers conceptual, methodological, and analytical tools to engage diverse stakeholders in systems conceptualization and visual mapping of dynamics that undermine community-level health outcomes and identify those that can be leveraged for systems improvement. We recruited and engaged a 25-member multi-stakeholder Task Force, whose members provide or utilize HIV-related services, to participate in SD modeling to examine and address problems of their local HIV T&T service system. Findings from the iterative model building sessions indicated Task Force members' increasingly complex understanding of the local HIV care system and demonstrated their improved capacity to visualize and critique multiple models of the HIV T&T service system and identify areas of potential leverage. Findings also showed members' enhanced communication and consensus in seeking deeper systems understanding and options for solutions. We discuss implications of using these visual SD models for subsequent simulation modeling of the T&T system and for other community applications to improve system effectiveness. © Society for Community Research and Action 2017.
Modeling intragranular diffusion in low-connectivity granular media
NASA Astrophysics Data System (ADS)
Ewing, Robert P.; Liu, Chongxuan; Hu, Qinhong
2012-03-01
Characterizing the diffusive exchange of solutes between bulk water in an aquifer and water in the intragranular pores of the solid phase is still challenging despite decades of study. Many disparities between observation and theory could be attributed to low connectivity of the intragranular pores. The presence of low connectivity indicates that a useful conceptual framework is percolation theory. The present study was initiated to develop a percolation-based finite difference (FD) model, and to test it rigorously against both random walk (RW) simulations of diffusion starting from nonequilibrium, and data on Borden sand published by Ball and Roberts (1991a,b) and subsequently reanalyzed by Haggerty and Gorelick (1995) using a multirate mass transfer (MRMT) approach. The percolation-theoretical model is simple and readily incorporated into existing FD models. The FD model closely matches the RW results using only a single fitting parameter, across a wide range of pore connectivities. Simulation of the Borden sand experiment without pore connectivity effects reproduced the MRMT analysis, but including low pore connectivity effects improved the fit. Overall, the theory and simulation results show that low intragranular pore connectivity can produce diffusive behavior that appears as if the solute had undergone slow sorption, despite the absence of any sorption process, thereby explaining some hitherto confusing aspects of intragranular diffusion.
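A minimal sketch of the finite difference machinery such a model builds on, diffusion into intragranular pores from a fixed bulk-water boundary, can be written as follows; the percolation-based connectivity correction that is the paper's contribution is not reproduced here, and the grid and diffusivity values are illustrative assumptions:

```python
# 1D explicit finite-difference diffusion: a generic sketch of the FD
# framework. Left boundary = bulk-water concentration (fixed), right
# boundary = no-flux grain center (symmetry).
import numpy as np

def diffuse(c, D, dx, dt, steps):
    """March dc/dt = D * d2c/dx2 forward in time with an explicit scheme."""
    r = D * dt / dx**2
    assert r <= 0.5, "explicit scheme stability limit violated"
    c = c.copy()
    for _ in range(steps):
        # update interior nodes from the standard 3-point stencil
        c[1:-1] = c[1:-1] + r * (c[2:] - 2 * c[1:-1] + c[:-2])
        c[-1] = c[-2]   # no-flux condition at the grain center
        # c[0] is left untouched: equilibrium with bulk water
    return c

c0 = np.zeros(21)
c0[0] = 1.0  # normalized bulk concentration
profile = diffuse(c0, D=1e-2, dx=0.05, dt=0.1, steps=200)
```

With low pore connectivity, the effective exchange would be slower than this well-connected baseline predicts, which is the behavior the percolation correction is designed to capture.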
The resilience and functional role of moss in boreal and arctic ecosystems.
Turetsky, M R; Bond-Lamberty, B; Euskirchen, E; Talbot, J; Frolking, S; McGuire, A D; Tuittila, E-S
2012-10-01
Mosses in northern ecosystems are ubiquitous components of plant communities, and strongly influence nutrient, carbon and water cycling. We use literature review, synthesis and model simulations to explore the role of mosses in ecological stability and resilience. Moss community responses to disturbance showed all possible responses (increases, decreases, no change) within most disturbance categories. Simulations from two process-based models suggest that northern ecosystems would need to experience extreme perturbation before mosses were eliminated. But simulations with two other models suggest that loss of moss will reduce soil carbon accumulation primarily by influencing decomposition rates and soil nitrogen availability. It seems clear that mosses need to be incorporated into models as one or more plant functional types, but more empirical work is needed to determine how to best aggregate species. We highlight several issues that have not been adequately explored in moss communities, such as functional redundancy and singularity, relationships between response and effect traits, and parameter vs conceptual uncertainty in models. Mosses play an important role in several ecosystem processes that play out over centuries - permafrost formation and thaw, peat accumulation, development of microtopography - and there is a need for studies that increase our understanding of slow, long-term dynamical processes. © 2012 The Authors. New Phytologist © 2012 New Phytologist Trust.
Why College Students Cheat: A Conceptual Model of Five Factors
ERIC Educational Resources Information Center
Yu, Hongwei; Glanzer, Perry L.; Johnson, Byron R.; Sriram, Rishi; Moore, Brandon
2018-01-01
Though numerous studies have identified factors associated with academic misconduct, few have proposed conceptual models that could make sense of multiple factors. In this study, we used structural equation modeling (SEM) to test a conceptual model of five factors using data from a relatively large sample of 2,503 college students. The results…
Modelling remediation scenarios in historical mining catchments.
Gamarra, Javier G P; Brewer, Paul A; Macklin, Mark G; Martin, Katherine
2014-01-01
Local remediation measures, particularly those undertaken in historical mining areas, can often be ineffective or even deleterious because erosion and sedimentation processes operate at spatial scales beyond those typically used in point-source remediation. Based on realistic simulations of a hybrid landscape evolution model combined with stochastic rainfall generation, we demonstrate that similar remediation strategies may result in differing effects across three contrasting European catchments depending on their topographic and hydrologic regimes. Based on these results, we propose a conceptual model of catchment-scale remediation effectiveness based on three basic catchment characteristics: the degree of contaminant source coupling, the ratio of contaminated to non-contaminated sediment delivery, and the frequency of sediment transport events.
NASA Technical Reports Server (NTRS)
Cotton, W. R.; Tripoli, G. J.
1980-01-01
Major research accomplishments which were achieved during the first year of the grant are summarized. The research concentrated in the following areas: (1) an examination of observational requirements for predicting convective storm development and intensity as suggested by recent numerical experiments; (2) interpretation of recent 3D numerical experiments with regard to the relationship between overshooting tops and surface wind gusts; (3) the development of software for emulating satellite-inferred cloud properties using 3D cloud model predicted data; and (4) the development of a conceptual/semi-quantitative model of eastward propagating, mesoscale convective complexes forming to the lee of the Rocky Mountains.
Fire in the Brazilian Amazon: A Spatially Explicit Model for Policy Impact Analysis
NASA Technical Reports Server (NTRS)
Arima, Eugenio Y.; Simmons, Cynthia S.; Walker, Robert T.; Cochrane, Mark A.
2007-01-01
This article implements a spatially explicit model to estimate the probability of forest and agricultural fires in the Brazilian Amazon. We innovate by using variables that reflect farmgate prices of beef and soy, and also provide a conceptual model of managed and unmanaged fires in order to simulate the impact of road paving, cattle exports, and conservation area designation on the occurrence of fire. Our analysis shows that fire is positively correlated with the price of beef and soy, and that the creation of new conservation units may offset the negative environmental impacts caused by the increasing number of fire events associated with early stages of frontier development.
Conceptualizing Telehealth in Nursing Practice: Advancing a Conceptual Model to Fill a Virtual Gap.
Nagel, Daniel A; Penner, Jamie L
2016-03-01
Increasingly, nurses use various telehealth technologies to deliver health care services; however, there has been a lag in research and in the generation of empirical knowledge to support nursing practice in this expanding field. One challenge to generating knowledge is the lack of a comprehensive conceptual model or theoretical framework illustrating the relationships of concepts and phenomena inherent in the adoption of a broad range of telehealth technologies in holistic nursing practice. A review of the literature revealed eight published conceptual models, theoretical frameworks, or similar entities applicable to nursing practice. Many of these models focus exclusively on the use of telephones, and four were generated from qualitative studies, but none comprehensively reflects the complexities of bridging the nursing process and elements of nursing practice into the use of telehealth. The purpose of this article is to present a review of existing conceptual models and frameworks, discuss the predominant themes and features of these models, and present a comprehensive conceptual model for telehealth nursing practice, synthesized from this literature, for consideration and further development. This conceptual model illustrates characteristics of, and relationships between, dimensions of telehealth practice to guide research and knowledge development in the provision of holistic, person-centered care delivered to individuals by nurses through telehealth technologies. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
Rolland, Colette; Yu, Eric; Salinesi, Camille; Castro, Jaelson
The use of intentional concepts, the notion of "goal" in particular, has been prominent in recent approaches to requirement engineering (RE). Goal-oriented frameworks and methods for requirements engineering (GORE) have been keynote topics in requirements engineering, conceptual modelling, and more generally in software engineering. What are the conceptual modelling foundations in these approaches? RIGiM (Requirements Intentions and Goals in Conceptual Modelling) aims to provide a forum for discussing the interplay between requirements engineering and conceptual modelling, and in particular, to investigate how goal- and intention-driven approaches help in conceptualising purposeful systems. What are the fundamental objectives and premises of requirements engineering and conceptual modelling respectively, and how can they complement each other? What are the demands on conceptual modelling from the standpoint of requirements engineering? What conceptual modelling techniques can be further taken advantage of in requirements engineering? What are the upcoming modelling challenges and issues in GORE? What are the unresolved open questions? What lessons are there to be learnt from industrial experiences? What empirical data are there to support the cost-benefit analysis when adopting GORE methods? Are there application domains or types of project settings for which goals and intentional approaches are particularly suitable or not suitable? What degree of formalization and automation, or interactivity is feasible and appropriate for what types of participants during requirements engineering?
NASA Astrophysics Data System (ADS)
Joshi, Nitin; Ojha, C. S. P.; Sharma, P. K.
2012-10-01
In this study, a conceptual model that accounts for the effects of nonequilibrium contaminant transport in fractured porous media is developed. The present model accounts for both physical and sorption nonequilibrium. An analytical solution was developed using the Laplace transform technique, which was then numerically inverted to obtain solute concentrations in the fracture-matrix system. The semianalytical solution developed here can incorporate both semi-infinite and finite fracture-matrix extents. In addition, the model can accommodate flexible boundary conditions and nonzero initial conditions in the fracture-matrix system. The present semianalytical solution was validated against existing analytical solutions for the fracture-matrix system. In order to differentiate between transport and sorption mechanisms, different cases of sorption and mass transfer were analyzed by comparing breakthrough curves and temporal moments; significant differences in the signatures of sorption and mass transfer were found. The applicability of the developed model was evaluated by simulating published experimental data on calcium and strontium transport in a single fracture. The present model simulated the experimental data reasonably well in comparison to a model based on the equilibrium sorption assumption in the fracture-matrix system and to a multirate mass transfer model.
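The numerical Laplace inversion step can be illustrated with the Gaver-Stehfest algorithm, a common choice for fracture-matrix transport solutions; the abstract does not name its inversion scheme, so this is an assumed example, demonstrated on a transform with a known closed-form inverse:

```python
# Gaver-Stehfest numerical inversion of a Laplace-domain solution.
# Demonstrated on F(s) = 1/(s+1), whose exact inverse is f(t) = exp(-t);
# the actual fracture-matrix transform is not reproduced here.
import math

def stehfest_weights(N):
    """Stehfest weights V_k for an even number of terms N."""
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            s += (j ** (N // 2) * math.factorial(2 * j)) / (
                math.factorial(N // 2 - j) * math.factorial(j)
                * math.factorial(j - 1) * math.factorial(k - j)
                * math.factorial(2 * j - k))
        V.append((-1) ** (k + N // 2) * s)
    return V

def invert(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s) at real s."""
    ln2 = math.log(2.0)
    V = stehfest_weights(N)
    return ln2 / t * sum(V[k - 1] * F(k * ln2 / t) for k in range(1, N + 1))

approx = invert(lambda s: 1.0 / (s + 1.0), t=1.0)
exact = math.exp(-1.0)  # 0.3678794...
```

The weights grow rapidly with N, so in double precision N is typically kept between 10 and 16; the method works well for smooth, non-oscillatory time behavior such as breakthrough curves.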
Synthetic calibration of a Rainfall-Runoff Model
Thompson, David B.; Westphal, Jerome A.; ,
1990-01-01
A method for synthetically calibrating storm-mode parameters for the U.S. Geological Survey's Precipitation-Runoff Modeling System is described. Synthetic calibration is accomplished by adjusting storm-mode parameters to minimize deviations between the pseudo-probability distributions represented by regional regression equations and the actual frequency distributions fitted to model-generated peak discharge and runoff volume. Results of modeling storm hydrographs using synthetic and analytic storm-mode parameters are presented. Comparisons are made between model results from both parameter sets and between model results and observed hydrographs. Although mean storm runoff is reproducible to within about 26 percent of the observed mean storm runoff for five or six parameter sets, runoff from individual storms is subject to large disparities. Predicted storm runoff volumes ranged from 2 percent to 217 percent of the commensurate observed values. Furthermore, simulation of peak discharges was poor: predicted peak discharges from individual storm events ranged from 2 percent to 229 percent of the commensurate observed values. The model was incapable of satisfactorily executing storm-mode simulations for the study watersheds. This result is not considered a particular fault of the model, but instead is indicative of deficiencies in similar conceptual models.
Van Oudenhove, Lukas; Cuypers, Stefaan
2014-05-01
Psychosomatic medicine, with its prevailing biopsychosocial model, aims to integrate human and exact sciences with their divergent conceptual models. Therefore, its own conceptual foundations, which often remain implicit and unknown, may be critically relevant. We defend the thesis that choosing between different metaphysical views on the 'mind-body problem' may have important implications for the conceptual foundations of psychosomatic medicine, and therefore potentially also for its methods, scientific status and relationship with the scientific disciplines it aims to integrate: biomedical sciences (including neuroscience), psychology and social sciences. To make this point, we introduce three key positions in the philosophical 'mind-body' debate (emergentism, reductionism, and supervenience physicalism) and investigate their consequences for the conceptual basis of the biopsychosocial model in general and its 'psycho-biological' part ('mental causation') in particular. Despite the clinical merits of the biopsychosocial model, we submit that it is conceptually underdeveloped or even flawed, which may hamper its use as a proper scientific model.
Intercultural Simulation Games: A Review (of the United States and beyond)
ERIC Educational Resources Information Center
Fowler, Sandra M.; Pusch, Margaret D.
2010-01-01
Intercultural simulations are instructional activities that engage and challenge participants with experiences integral to encounters between people of more than one cultural group. Simulations designed specifically to support intercultural encounters have been in use since the 1970s. This article examines the conceptual bases for intercultural…
Wolfs, Vincent; Villazon, Mauricio Florencio; Willems, Patrick
2013-01-01
Applications such as real-time control, uncertainty analysis and optimization require an extensive number of model iterations. Full hydrodynamic sewer models are not sufficient for these applications due to their excessive computation time, so simplifications are required. A lumped conceptual modelling approach results in much faster calculation. The process of identifying and calibrating the conceptual model structure can, however, be time-consuming, and many conceptual models lack accuracy or do not account for backwater effects. To overcome these problems, a modelling methodology was developed that is suited to semi-automatic calibration. The methodology is tested on the sewer system of the city of Geel in the Grote Nete river basin in Belgium, using both synthetic design storm events and long time series of rainfall input. A MATLAB/Simulink® tool was developed to guide the modeller through the step-wise model construction, significantly reducing the time required for the conceptual modelling process.
Pore space analysis of NAPL distribution in sand-clay media
Matmon, D.; Hayden, N.J.
2003-01-01
This paper introduces a conceptual model of clays and non-aqueous phase liquids (NAPLs) at the pore scale that has been developed from a mathematical unit cell model and from direct micromodel observation and measurement of clay-containing porous media. The mathematical model uses a unit cell concept with uniform spherical grains for simulating the sand in the sand-clay matrix (≤10% clay). Micromodels made with glass slides and containing different clay-bearing porous media were used to investigate the distribution of the two clays (kaolinite and montmorillonite) and of NAPL within the pore space. The results were used to understand the distribution of NAPL advancing into initially saturated sand and sand-clay media, and provided a detailed analysis of the pore-scale geometry, pore size distribution, NAPL entry pressures, and the effect of clay on this geometry. Distinctive NAPL saturation profiles were observed as a result of the complexity of the pore space geometry, with its different packing angles and the presence of clays. The unit cell approach has applications for enhancing the mechanistic understanding and conceptualization, both visually and mathematically, of pore-scale processes such as NAPL and clay distribution. © 2003 Elsevier Science Ltd. All rights reserved.
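The uniform-sphere unit cell admits simple closed-form porosities that depend on the packing angle; as an illustration (these limiting packings are standard results, not values taken from the paper), the two extremes give:

```python
# Porosity of unit cells of uniform spheres at two limiting packing angles.
# These are textbook geometric results used to illustrate how packing angle
# controls pore space in a spherical-grain idealization of sand.
import math

def porosity_simple_cubic():
    # one sphere of diameter d inscribed in a cube of side d:
    # solid fraction = (pi/6 d^3) / d^3
    return 1.0 - math.pi / 6.0

def porosity_rhombohedral():
    # densest packing of equal spheres: solid fraction = pi / (3 * sqrt(2))
    return 1.0 - math.pi / (3.0 * math.sqrt(2.0))

print(round(porosity_simple_cubic(), 3))   # loosest regular packing, ~0.476
print(round(porosity_rhombohedral(), 3))   # densest packing, ~0.260
```

Clay partially filling this pore space reduces the effective porosity further, which is the effect the micromodel observations quantify.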
Introduction of the 2nd Phase of the Integrated Hydrologic Model Intercomparison Project
NASA Astrophysics Data System (ADS)
Kollet, Stefan; Maxwell, Reed; Dages, Cecile; Mouche, Emmanuel; Mugler, Claude; Paniconi, Claudio; Park, Young-Jin; Putti, Mario; Shen, Chaopeng; Stisen, Simon; Sudicky, Edward; Sulis, Mauro; Ji, Xinye
2015-04-01
The 2nd Phase of the Integrated Hydrologic Model Intercomparison Project commenced in June 2013 with a workshop at Bonn University funded by the German Science Foundation and the US National Science Foundation. Three test cases, available online at www.hpsc-terrsys.de, were defined and compared: a tilted v-catchment case; a case called superslab, based on multiple slab heterogeneities in the hydraulic conductivity along a hillslope; and the Borden site case, based on a published field experiment. The goal of this phase is to further interrogate the coupling of surface-subsurface flow implemented in various integrated hydrologic models, and to understand and quantify the impact of differences in the conceptual and technical implementations on the simulation results, which may constitute an additional source of uncertainty. The focus has been broadened considerably to include, e.g., saturated and unsaturated subsurface storage, saturated surface area, and ponded surface storage, in addition to discharge and pressure/saturation profiles and cross-sections. Here, first results are presented and discussed, demonstrating the conceptual and technical challenges in implementing essentially the same governing equations describing highly non-linear moisture redistribution processes and surface-groundwater interactions.
Computational Plume Modeling of Conceptual ARES Vehicle Stage Tests
NASA Technical Reports Server (NTRS)
Allgood, Daniel C.; Ahuja, Vineet
2007-01-01
The plume-induced environment of a conceptual ARES V vehicle stage test at the NASA Stennis Space Center (NASA-SSC) was modeled using computational fluid dynamics (CFD). A full-scale multi-element grid was generated for the NASA-SSC B-2 test stand with the ARES V stage being located in a proposed off-center forward position. The plume produced by the ARES V main power plant (cluster of five RS-68 LOX/LH2 engines) was simulated using a multi-element flow solver - CRUNCH. The primary objective of this work was to obtain a fundamental understanding of the ARES V plume and its impingement characteristics on the B-2 flame-deflector. The location, size and shape of the impingement region were quantified along with the un-cooled deflector wall pressures, temperatures and incident heating rates. Issues with the proposed tests were identified and several of these addressed using the CFD methodology. The final results of this modeling effort will provide useful data and boundary conditions in upcoming engineering studies that are directed towards determining the required facility modifications for ensuring safe and reliable stage testing in support of the Constellation Program.
NASA Astrophysics Data System (ADS)
Teles, V.; de Marsily, G.; Delay, F.; Perrier, E.
Alluvial floodplains are extremely heterogeneous aquifers, whose three-dimensional structures are quite difficult to model. In general, when representing such structures, the medium heterogeneity is modeled with classical geostatistical or Boolean methods. Another approach, still in its infancy, is called the genetic method because it simulates the generation of the medium by reproducing sedimentary processes. We developed a new genetic model to obtain a realistic three-dimensional image of alluvial media. It does not simulate the hydrodynamics of sedimentation but uses semi-empirical and statistical rules to roughly reproduce fluvial deposition and erosion. The main processes, either at the stream scale or at the plain scale, are modeled by simple rules applied to "sediment" entities or to conceptual "erosion" entities. The model was applied to a several-kilometer-long portion of the Aube River floodplain (France) and reproduced the deposition and erosion cycles that occurred during the inferred climate periods (15,000 BP to present). A three-dimensional image of the aquifer was generated by extrapolating the two-dimensional information collected on a cross-section of the floodplain. Unlike geostatistical methods, this extrapolation does not use a statistical spatial analysis of the data, but a genetic analysis, which leads to a more realistic structure. Groundwater flow and transport simulations in the alluvium were carried out with a three-dimensional flow code (MODFLOW), using different representations of the alluvial reservoir of the Aube River floodplain: first an equivalent homogeneous medium, and then different heterogeneous media built either with the traditional geostatistical approach simulating the permeability distribution, or with the new genetic model presented here simulating sediment facies. In the latter case, each deposited entity of a given lithology was assigned a constant hydraulic conductivity value. Results of these models have been compared to assess the value of the genetic approach and will be presented.
NASA Astrophysics Data System (ADS)
Boo, Kyung-Jin
The primary purpose of this dissertation is to provide the groundwork for a sustainable energy future in Korea. For this purpose, a conceptual framework of sustainable energy development was developed to provide a deeper understanding of the interrelationships between energy, the economy, and the environment (E3). Based on this theoretical work, an empirical simulation model was developed to investigate the ways in which E3 interact. This dissertation attempts to develop a unified concept of sustainable energy development by surveying multiple efforts to integrate various definitions of sustainability. Sustainable energy development should be built on three principles: ecological carrying capacity, economic efficiency, and socio-political equity. Ecological carrying capacity delineates the earth's resource constraints as well as its ability to assimilate wastes. Socio-political equity implies an equitable distribution of the benefits and costs of energy consumption and of environmental burdens. Economic efficiency dictates the efficient allocation of scarce resources. The simulation model is composed of three modules: an energy module, an environmental module, and an economic module. Because the model is grounded in economic structural behaviorism, the dynamic nature of the current economy is effectively depicted and simulated by manipulating exogenous policy variables. This macro-economic model is used to simulate six major policy intervention scenarios. Major findings from these policy simulations were: (1) carbon taxes are the most effective means of reducing air-pollutant emissions; (2) sustainable energy development can be achieved through reinvestment of carbon taxes into energy efficiency and renewable energy programs; and (3) carbon taxes would increase a nation's welfare if reinvested in relevant areas.
The policy simulation model, because it is based on neoclassical economics, has limitations: it cannot fully account for socio-political realities (inter- and intra-generational equity), which are core features of sustainability. Thus, alternative approaches based on qualitative analysis, such as the multi-criteria approach, will be required to complement the current policy simulation model.
A Simple Climate Model Program for High School Education
NASA Astrophysics Data System (ADS)
Dommenget, D.
2012-04-01
The future climate change projections of the IPCC AR4 are based on GCM simulations, which give a distinct global warming pattern, with an arctic winter amplification, an equilibrium land-sea contrast, and an inter-hemispheric warming gradient. While these simulations are the most important tool of the IPCC predictions, a conceptual understanding of the predicted structures of climate change is very difficult to reach from these highly complex GCM simulations alone, and they are not accessible to ordinary people. In the study presented here, we introduce a very simple gridded, globally resolved energy balance model based on strongly simplified physical processes, which is capable of simulating the main characteristics of global warming. The model is intended as a bridge between one-dimensional energy balance models and fully coupled four-dimensional complex GCMs. It runs on standard PCs, computing globally resolved climate simulations at 2 years per second, or 100,000 years per day; typical global warming scenarios take a few minutes. The computer code is only 730 lines long, with formulations simple enough for high school students to understand. The simple model's climate sensitivity and the spatial structure of its warming pattern are within the uncertainties of the IPCC AR4 model simulations. It is capable of simulating the arctic winter amplification, the equilibrium land-sea contrast and the inter-hemispheric warming gradient in good agreement with the IPCC AR4 models in amplitude and structure. The program can be used for sensitivity studies in which students change something (e.g., reduce the solar radiation, take away the clouds, or make snow black) and see how it affects the climate, or the climate response to changes in greenhouse gases. The program is available to everyone and could be the basis for high school education. Partners for a high school project are wanted!
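The zero-dimensional ancestor of such a gridded energy balance model fits in a few lines; the albedo and effective emissivity below are textbook-style assumptions, not the program's actual parameters:

```python
# Zero-dimensional energy balance model: a minimal relative of the gridded
# EBM described above. The effective emissivity < 1 is a crude stand-in for
# the greenhouse effect; emissivity = 1 gives the bare-rock temperature.
def equilibrium_temperature(S0=1361.0, albedo=0.3, emissivity=0.61):
    """Solve (1 - albedo) * S0 / 4 = emissivity * sigma * T**4 for T (K)."""
    sigma = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
    absorbed = (1.0 - albedo) * S0 / 4.0
    return (absorbed / (emissivity * sigma)) ** 0.25

print(round(equilibrium_temperature(), 1))                 # ~288 K, near observed
print(round(equilibrium_temperature(emissivity=1.0), 1))   # ~255 K, bare rock
```

A gridded model of the kind described adds resolved geometry (latitude-dependent insolation, land-sea masks, ice-albedo feedback) on top of exactly this balance, which is what makes warming patterns rather than a single global temperature emerge.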
Thorndahl, S; Willems, P
2008-01-01
Failure of urban drainage systems may occur due to surcharge or flooding at specific manholes in the system, or due to overflows from combined sewer systems to receiving waters. To quantify the probability or return period of failure, standard approaches make use of the simulation of design storms or long historical rainfall series in a hydrodynamic model of the urban drainage system. In this paper, an alternative probabilistic method is investigated: the first-order reliability method (FORM). To apply this method, a long rainfall time series was divided into rainstorms (rain events), and each rainstorm was conceptualized as a synthetic rainfall hyetograph with a Gaussian shape parameterized by rainstorm depth, duration, and peak intensity. Probability distributions were calibrated for these three parameters and used as the basis for failure probability estimation, together with a hydrodynamic simulation model to determine the failure conditions for each parameter set. The method takes into account the uncertainties involved in the rainstorm parameterization. Comparison is made between the failure probability results of the FORM method, the standard method using long-term simulations, and alternative methods based on random sampling (Monte Carlo direct sampling and importance sampling). It is concluded that, without crucial influence on the modelling accuracy, FORM is very applicable as an alternative to traditional long-term simulations of urban drainage systems.
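FORM estimates a failure probability by locating the design point, the point on the limit-state surface g(u) = 0 closest to the origin in standard-normal space, whose distance β gives P_f ≈ Φ(−β). A minimal sketch of the classic Hasofer-Lind-Rackwitz-Fiessler (HL-RF) iteration is shown below; the limit-state function here is a made-up linear placeholder, not the paper's hydrodynamic drainage model.

```python
# Sketch of the first-order reliability method (FORM) via HL-RF iteration.
# The limit state g below is a hypothetical placeholder for illustration.
import math

def grad(g, u, h=1e-6):
    """Central-difference gradient of g at point u."""
    out = []
    for i in range(len(u)):
        up, um = list(u), list(u)
        up[i] += h
        um[i] -= h
        out.append((g(up) - g(um)) / (2 * h))
    return out

def form_beta(g, n, iters=50):
    """HL-RF iteration: find the design point and reliability index beta."""
    u = [0.0] * n
    for _ in range(iters):
        gv = g(u)
        dg = grad(g, u)
        norm2 = sum(d * d for d in dg)
        # Project onto the linearized limit-state surface:
        # u_next = ((dg . u - g(u)) / |dg|^2) * dg
        scale = (sum(d * ui for d, ui in zip(dg, u)) - gv) / norm2
        u = [scale * d for d in dg]
    return math.sqrt(sum(ui * ui for ui in u))

def failure_probability(beta):
    """P_f ~ Phi(-beta), via the complementary error function."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# Hypothetical limit state: failure when 3 - u1 - 0.5*u2 < 0
g = lambda u: 3.0 - u[0] - 0.5 * u[1]
beta = form_beta(g, n=2)
pf = failure_probability(beta)
```

For a linear limit state the iteration converges immediately and β matches the exact distance to the failure surface; in the paper's setting, each evaluation of g would involve a hydrodynamic simulation for a given rainstorm depth, duration, and peak intensity.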
Naranjo, Ramon C.
2017-01-01
Groundwater-flow models are often calibrated using a limited number of observations relative to the unknown inputs required by the model. This is especially true for models that simulate groundwater/surface-water interactions. In this case, subsurface temperature sensors can be an efficient means of collecting long-term data that capture the transient nature of physical processes such as seepage losses. A continuous and spatially dense network of diverse observation data can be used to improve knowledge of important physical drivers and to conceptualize and calibrate variably saturated groundwater-flow models. An example is presented in which the results of such an analysis were used to help guide irrigation districts and water-management decisions on costly upgrades to conveyance systems to improve water usage, farm productivity, and restoration efforts aimed at improving downstream water quality and ecosystems.
Optimization of a hydrodynamic separator using a multiscale computational fluid dynamics approach.
Schmitt, Vivien; Dufresne, Matthieu; Vazquez, Jose; Fischer, Martin; Morin, Antoine
2013-01-01
This article deals with the optimization of a hydrodynamic separator working on the tangential separation mechanism along a screen. The aim of this study is to optimize the shape of the device to avoid clogging. A multiscale approach is used. This methodology combines measurements and computational fluid dynamics (CFD). A local model enables us to observe the different phenomena occurring at the orifice scale, which shows the potential of expanded metal screens. A global model is used to simulate the flow within the device using a conceptual model of the screen (porous wall). After validation against the experimental measurements, the global model was used to investigate the influence of deflectors and disk plates in the structure.
Tenbus, Frederick J.; Fleck, William B.
2001-01-01
Military activity at Graces Quarters, a former open-air chemical-agent facility at Aberdeen Proving Ground, Maryland, has resulted in ground-water contamination by chlorinated hydrocarbons. As part of a ground-water remediation feasibility study, a three-dimensional model was constructed to simulate transport of four chlorinated hydrocarbons (1,1,2,2-tetrachloroethane, trichloroethene, carbon tetrachloride, and chloroform) that are components of a contaminant plume in the surficial and middle aquifers underlying the east-central part of Graces Quarters. The model was calibrated to steady-state hydraulic head at 58 observation wells and to the concentration of 1,1,2,2-tetrachloroethane in 58 observation wells and 101 direct-push probe samples from the mid-1990s. Simulations using the same basic model with minor adjustments were then run for each of the other plume constituents. The error statistics between the simulated and measured concentrations of each constituent compared favorably to those of the 1,1,2,2-tetrachloroethane calibration. Model simulations were used in conjunction with contaminant concentration data to examine the sources and degradation of the plume constituents. It was determined from this that mixed contaminant sources with no ambient degradation was the best approach for simulating multi-species solute transport at the site. Forward simulations were run to show potential solute transport 30 years and 100 years into the future, with and without source removal. Although forward simulations are subject to uncertainty, they can be useful for illustrating various aspects of the conceptual model and its implementation. The forward simulation with no source removal indicates that contaminants would spread throughout various parts of the surficial and middle aquifers, with the 100-year simulation showing potential discharge areas either in the marshes at the end of the Graces Quarters peninsula or just offshore in the estuaries. The simulation with source removal indicates that if the modeling assumptions are reasonable and ground-water cleanup within 30 years is important, source removal alone is not a sufficient remedy, and cleanup might not even occur within 100 years.
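The forward solute-transport simulations described above rest on the advection-dispersion equation. A minimal one-dimensional explicit finite-difference sketch is given below to illustrate the kind of calculation involved; the site model itself is three-dimensional and multi-species, and all numbers here are illustrative, not site parameters.

```python
# 1-D advection-dispersion sketch: upwind advection plus central-difference
# dispersion, with zero-gradient boundaries. Illustrative only; parameter
# values are hypothetical and chosen to satisfy explicit stability limits
# (Courant number v*dt/dx <= 1, diffusion number D*dt/dx^2 <= 0.5).
def advect_disperse(c, v=0.1, D=0.01, dx=1.0, dt=1.0, steps=100):
    """Advance a 1-D concentration profile c by `steps` time steps."""
    c = list(c)
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            adv = -v * (c[i] - c[i - 1]) / dx                 # upwind, v > 0
            disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx**2
            new[i] = c[i] + dt * (adv + disp)
        new[0], new[-1] = new[1], new[-2]                     # zero-gradient ends
        c = new
    return c
```

Starting from a pulse of contaminant, the profile advects downstream at velocity v while dispersion (plus the upwind scheme's numerical diffusion) spreads and attenuates the peak, the same qualitative behavior that drives the plume migration in the forward simulations.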