Framework for Uncertainty Assessment - Hanford Site-Wide Groundwater Flow and Transport Modeling
NASA Astrophysics Data System (ADS)
Bergeron, M. P.; Cole, C. R.; Murray, C. J.; Thorne, P. D.; Wurstner, S. K.
2002-05-01
Pacific Northwest National Laboratory is in the process of developing and implementing an uncertainty estimation methodology for use in future site assessments that addresses parameter uncertainty as well as uncertainties related to the groundwater conceptual model. The long-term goals of the effort are development and implementation of an uncertainty estimation methodology for use in future assessments and analyses being made with the Hanford site-wide groundwater model. The basic approach in the framework developed for uncertainty assessment consists of: 1) Alternate conceptual model (ACM) identification, to identify and document the major features and assumptions of each conceptual model; the process must also include a periodic review of the existing and proposed new conceptual models as data or understanding become available. 2) ACM development, in which each identified conceptual model is developed through inverse modeling with historical site data. 3) ACM evaluation, to identify which of the conceptual models are plausible and should be included in any subsequent uncertainty assessments. 4) ACM uncertainty assessments, carried out only for those ACMs determined to be plausible through comparison with historical observations and model structure identification measures. The parameter uncertainty assessment process generally involves: a) model complexity optimization, to identify the important or relevant parameters for the uncertainty analysis; b) characterization of parameter uncertainty, to develop the probability density functions (pdfs) for the important uncertain parameters, including identification of any correlations among parameters; and c) propagation of uncertainty, to propagate parameter uncertainties (e.g., by first-order second-moment methods if applicable, or by a Monte Carlo approach) through the model to determine the uncertainty in the model predictions of interest. 5) Estimation of combined ACM and scenario uncertainty by a double sum, with each component of the inner sum (an individual CCDF) representing parameter uncertainty associated with a particular scenario and ACM, and the outer sum enumerating the various plausible ACM and scenario combinations to represent the combined estimate of uncertainty (a family of CCDFs). A final important part of the framework is identification, enumeration, and documentation of all assumptions: those made during conceptual model development, those required by the mathematical model, those required by the numerical model, those made during the spatial and temporal discretization process, those needed to assign the statistical model and associated parameters that describe the uncertainty in the relevant input parameters, and finally those required by the propagation method. Pacific Northwest National Laboratory is operated for the U.S. Department of Energy under Contract DE-AC06-76RL01830.
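To make the propagation step concrete, here is a minimal Monte Carlo sketch that produces the family of CCDFs described above. The toy dose model, the parameter distributions, and the ACM/scenario labels are hypothetical stand-ins, not the Hanford model itself.

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_dose_model(k, n, scenario_flux):
    # Hypothetical stand-in for a groundwater model prediction: a peak
    # concentration as a function of two uncertain parameters and a
    # scenario-dependent source flux.
    return scenario_flux * k / (1.0 + n)

def ccdf(samples, thresholds):
    # Complementary CDF: P(prediction > threshold) for each threshold.
    return np.array([(samples > t).mean() for t in thresholds])

thresholds = np.linspace(0.0, 5.0, 51)
family = {}  # one CCDF per (ACM, scenario) combination: the "double sum"

for acm, (k_mu, k_sd) in {"ACM-1": (1.0, 0.3), "ACM-2": (1.5, 0.5)}.items():
    for scen, flux in {"low-flux": 1.0, "high-flux": 2.0}.items():
        k = rng.lognormal(np.log(k_mu), k_sd, 10_000)   # pdf of parameter k
        n = rng.uniform(0.1, 0.4, 10_000)               # pdf of parameter n
        family[(acm, scen)] = ccdf(toy_dose_model(k, n, flux), thresholds)

for combo, curve in family.items():
    print(combo, "P(C > 1.0) =", round(curve[10], 3))
```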
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vermeul, Vincent R.; Cole, Charles R.; Bergeron, Marcel P.
2001-08-29
The baseline three-dimensional transient inverse model for the estimation of site-wide-scale flow parameters, including their uncertainties, using data on the transient behavior of the unconfined aquifer system over the entire historical period of Hanford operations, has been modified to account for the effects of intercommunication between the Hanford unconfined aquifer and the underlying upper basalt confined aquifer. Both the baseline and alternative conceptual models (ACM-1) considered only the groundwater flow component and corresponding observational data in the 3-D transient inverse calibration efforts. Subsequent efforts will examine both groundwater flow and transport. Comparisons of goodness-of-fit measures and parameter estimation results for the ACM-1 transient inverse calibrated model with those from previous site-wide groundwater modeling efforts illustrate that the new 3-D transient inverse model approach will strengthen the technical defensibility of the final model(s) and provide the ability to incorporate uncertainty in predictions related to both conceptual model and parameter uncertainty. These results, however, indicate that additional improvements to the conceptual model framework are required. An investigation was initiated at the end of this inverse modeling effort to determine whether facies-based zonation would improve specific yield parameter estimation results (ACM-2). The justification and methodology used to develop this zonation are discussed.
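The inverse-calibration step can be illustrated with a heavily simplified analogue. The sketch below fits flow parameters to synthetic head observations by weighted least squares, in the spirit of a PEST-style calibration; the 1-D aquifer solution, parameter values, and observation points are all hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

# Steady 1-D confined aquifer with areal recharge R, transmissivity T,
# fixed head h0 at x=0 and no-flow at x=L:
#   T h'' + R = 0  =>  h(x) = h0 + (R/T) * (L*x - x**2 / 2)
# Heads alone constrain only the ratio R/T, so that ratio and the boundary
# head are the two parameters estimated here (a classic identifiability point).
L = 1000.0
x_obs = np.array([100.0, 300.0, 500.0, 700.0, 900.0])

def heads(params, x):
    h0, r_over_t = params
    return h0 + r_over_t * (L * x - 0.5 * x ** 2)

true = (20.0, 2.0e-6)
h_obs = heads(true, x_obs) + np.random.default_rng(1).normal(0.0, 0.02, x_obs.size)

weights = np.full(x_obs.size, 1.0 / 0.02)   # inverse of assumed head error (m)
fit = least_squares(lambda p: weights * (heads(p, x_obs) - h_obs),
                    x0=[10.0, 1.0e-7])
print("estimated h0, R/T:", fit.x, " true:", true)
```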
NASA Astrophysics Data System (ADS)
Arsenault, Richard; Poissant, Dominique; Brissette, François
2015-11-01
This paper evaluated the effects of parametric reduction of a hydrological model on five regionalization methods and 267 catchments in the province of Quebec, Canada. The Sobol' variance-based sensitivity analysis was used to rank the model parameters by their influence on the model results, and sequential parameter fixing was performed. The reduction in parameter correlations improved parameter identifiability; however, this improvement was minimal and did not carry over to regionalization. It was shown that 11 of the HSAMI model's 23 parameters could be fixed with little or no loss in regionalization skill. The main conclusions were that (1) the conceptual lumped models used in this study did not represent physical processes sufficiently well to warrant parameter reduction for physics-based regionalization methods for the Canadian basins examined, and (2) catchment descriptors did not adequately represent the relevant hydrological processes, namely snow accumulation and melt.
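A minimal sketch of the rank-and-fix workflow, using the SALib implementation of Sobol' indices on a hypothetical four-parameter toy model (HSAMI itself is not reproduced here); parameters whose total-order index falls below an arbitrary cutoff become candidates for fixing.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Toy stand-in for a lumped hydrological model: 4 parameters, of which
# x2 and x3 barely matter (hypothetical; HSAMI has 23 parameters).
problem = {
    "num_vars": 4,
    "names": ["x0", "x1", "x2", "x3"],
    "bounds": [[0, 1]] * 4,
}

def toy_model(p):
    return 5.0 * p[:, 0] + 2.0 * p[:, 1] ** 2 + 0.05 * p[:, 2] + 0.01 * p[:, 3]

X = saltelli.sample(problem, 1024)       # Saltelli sampling scheme
S = sobol.analyze(problem, toy_model(X)) # first/total-order Sobol' indices

# Fix (e.g., at mid-range) any parameter whose total-order index is tiny.
fixed = [n for n, st in zip(problem["names"], S["ST"]) if st < 0.01]
print("total-order indices:", dict(zip(problem["names"], S["ST"].round(3))))
print("candidates for fixing:", fixed)
```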
Dispersion Modeling in Complex Urban Systems
Models are used to represent real systems in an understandable way. They take many forms. A conceptual model explains the way a system works. In environmental studies, for example, a conceptual model may delineate all the factors and parameters for determining how a particle move...
Reiter, Michael A; Saintil, Max; Yang, Ziming; Pokrajac, Dragoljub
2009-08-01
Conceptual modeling is a useful tool for identifying pathways between drivers, stressors, Valued Ecosystem Components (VECs), and services that are central to understanding how an ecosystem operates. The St. Jones River watershed, DE, is a complex ecosystem, and because management decisions must include ecological, social, political, and economic considerations, a conceptual model is a good tool for accommodating the full range of inputs. In 2002, a Four-Component, Level 1 conceptual model was formed for the key habitats of the St. Jones River watershed, but since the habitat level of resolution is too fine for some important watershed-scale issues, we developed a functional watershed-scale model using the existing narrowed habitat-scale models. The narrowed habitat-scale conceptual models and associated matrices developed by Reiter et al. (2006) were combined with data from the 2002 land use/land cover (LULC) GIS-based maps of Kent County, Delaware, to assemble a diagrammatic and numerical watershed-scale conceptual model incorporating the calculated weight of each habitat within the watershed. The numerical component of the assembled watershed model was subsequently subjected to the same Monte Carlo narrowing methodology used for the habitat versions to refine the diagrammatic component of the watershed-scale model. The narrowed numerical representation of the model was used to generate forecasts for changes in the parameters "Agriculture" and "Forest", showing that land use changes in these habitats propagated through the model results according to the weighting factor. The narrowed watershed-scale conceptual model also identified some key parameters upon which to focus research attention and management decisions at the watershed scale. The forecast and simulation results seemed to indicate that the watershed-scale conceptual model does lead to different conclusions than the habitat-scale conceptual models for some issues at the larger watershed scale.
Assessment of Alternative Conceptual Models Using Reactive Transport Modeling with Monitoring Data
NASA Astrophysics Data System (ADS)
Dai, Z.; Price, V.; Heffner, D.; Hodges, R.; Temples, T.; Nicholson, T.
2005-12-01
Monitoring data proved very useful in evaluating alternative conceptual models, simulating contaminant transport behavior, and reducing uncertainty. A graded approach using three alternative conceptual site models was formulated to simulate a field case of tetrachloroethene (PCE) transport and biodegradation. These models ranged from simple to complex in their representation of subsurface heterogeneities. The simplest was a single-layer homogeneous aquifer model that employed an analytical reactive transport code, BIOCHLOR (Aziz et al., 1999). Due to over-simplification of the aquifer structure, this simulation could not reproduce the monitoring data. The second model consisted of a multi-layer conceptual model, in combination with numerical modules (MODFLOW and RT3D within GMS), to simulate flow and reactive transport. Although the simulation results from the second model were comparatively better than those from the simple model, they still did not adequately reproduce the monitoring well concentrations because the geological structures were still inadequately defined. Finally, a more realistic conceptual model was formulated that incorporated heterogeneities and geologic structures identified from well logs and seismic survey data using the Petra and PetraSeis software. This conceptual model included both a major channel and a younger channel that were detected in the PCE source area. In this model, these channels control the local ground-water flow direction and provide a preferential chemical transport pathway. Simulation results using this conceptual site model proved compatible with the monitoring concentration data. This study demonstrates that the bias and uncertainty from inadequate conceptual models are much larger than those introduced by an inadequate choice of model parameter values (Neuman and Wierenga, 2003; Meyer et al., 2004; Ye et al., 2004). This case study integrated conceptual and numerical models, based on interpreted local hydrogeologic and geochemical data, with detailed monitoring plume data. It provided key insights for confirming alternative conceptual site models and assessing the performance of monitoring networks. A monitoring strategy based on this graded approach for assessing alternative conceptual models can provide the technical basis for identifying critical monitoring locations, adequate monitoring frequency, and performance indicator parameters for performance monitoring involving ground-water levels and PCE concentrations.
NASA Astrophysics Data System (ADS)
Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.
2016-11-01
Predictions of river flow dynamics provide vital information for many aspects of water management, including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make, including model and criteria selection, can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and to the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters, and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
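The variance decomposition can be sketched as a two-way ANOVA over a matrix of simulations indexed by forcing ensemble member and parameter set. The numbers below are synthetic placeholders, not the Irish catchment results; with one simulation per cell, the interaction term absorbs the residual.

```python
import numpy as np

rng = np.random.default_rng(0)
n_forcing, n_params = 20, 30  # rainfall members x behavioural parameter sets

# Hypothetical matrix of simulated mean annual flows: row i uses forcing
# member i, column j uses parameter set j.
Q = (rng.normal(0, 1.0, (n_forcing, 1)) +          # forcing effect
     rng.normal(0, 0.5, (1, n_params)) +           # parameter effect
     rng.normal(0, 0.2, (n_forcing, n_params)))    # interaction/residual

grand = Q.mean()
ss_forcing = n_params * ((Q.mean(axis=1) - grand) ** 2).sum()
ss_params  = n_forcing * ((Q.mean(axis=0) - grand) ** 2).sum()
ss_total   = ((Q - grand) ** 2).sum()
ss_inter   = ss_total - ss_forcing - ss_params

for name, ss in [("forcing", ss_forcing), ("parameters", ss_params),
                 ("interaction", ss_inter)]:
    print(f"{name:12s} {100 * ss / ss_total:5.1f} % of total variance")
```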
NASA Astrophysics Data System (ADS)
Neuberg, J. W.; Thomas, M.; Pascal, K.; Karl, S.
2012-04-01
Geophysical datasets are essential for guiding short-term forecasting of volcanic activity in particular. Key parameters are derived from these datasets and interpreted in different ways; however, the biggest impact on the interpretation is determined not by the range of parameters but by the parameterisation and the underlying conceptual model of the volcanic process. On the other hand, the increasing number of sophisticated geophysical models needs to be constrained by monitoring data to transform a merely numerical exercise into a useful forecasting tool. We utilise datasets from the "big three" (seismology, deformation and gas emissions) to gain insight into the mutual relationship between conceptual models and constraining data. We show that, for example, the same seismic dataset can be interpreted with respect to a wide variety of different models with very different implications for forecasting. In turn, different data processing procedures lead to different outcomes even though they are based on the same conceptual model. Unsurprisingly, the most reliable interpretation will be achieved by employing multi-disciplinary models with overlapping constraints.
NASA Astrophysics Data System (ADS)
Thomsen, N. I.; Troldborg, M.; McKnight, U. S.; Binning, P. J.; Bjerg, P. L.
2012-04-01
Mass discharge estimates are increasingly being used in the management of contaminated sites. Such estimates have proven useful for supporting decisions related to the prioritization of contaminated sites in a groundwater catchment. Potential management options can be categorised as follows: (1) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper the management decisions. If option 1 is incorrectly chosen, soil and water quality will decrease, threatening or destroying drinking water resources. The risk of choosing option 2 is spending money on remediating a site that does not pose a problem. Choosing option 3 will often be safest, but may not be the optimal economic solution. Quantification of the uncertainty in mass discharge estimates can therefore greatly improve the foundation for selecting the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level, and account for both parameter and conceptual model uncertainty. We propose a method for quantifying the uncertainty of dynamic mass discharge estimates from contaminant point sources on the local scale. The method considers both parameter and conceptual uncertainty through a multi-model approach, which evaluates multiple conceptual models for the same site. The different conceptual models consider different source characterizations and hydrogeological descriptions. The idea is to include a set of essentially different conceptual models, where each model is believed to be a realistic representation of the given site based on the current level of information. Parameter uncertainty is quantified using Monte Carlo simulations. For each conceptual model we calculate a transient mass discharge estimate with uncertainty bounds resulting from the parametric uncertainty. To quantify the conceptual uncertainty for a given site, we combine the outputs from the different conceptual models using Bayesian model averaging. The weight for each model is obtained by integrating available data and expert knowledge using Bayesian belief networks. The multi-model approach is applied to a contaminated site, where a DNAPL (dense non-aqueous phase liquid) spill consisting of PCE (perchloroethylene) has contaminated a fractured clay till aquitard overlying a limestone aquifer. The exact shape and nature of the source are unknown, as is the importance of transport in the fractures. The result of the multi-model approach is a visual representation of the uncertainty of the mass discharge estimates for the site, which can be used to support the management options.
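A minimal sketch of the multi-model combination step: Monte Carlo ensembles of transient mass discharge from three hypothetical conceptual models are pooled with BMA weights (which, at a real site, would come from the Bayesian belief network rather than being assigned by hand, as done here).

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(0, 50)  # years

# Monte Carlo ensembles of mass discharge (g/day) from three hypothetical
# conceptual models (different source geometry / fracture assumptions).
ensembles = {
    "matrix-dominated":   10 * np.exp(-0.05 * t) * rng.lognormal(0, 0.3, (500, t.size)),
    "fracture-dominated": 15 * np.exp(-0.10 * t) * rng.lognormal(0, 0.4, (500, t.size)),
    "mixed":              12 * np.exp(-0.07 * t) * rng.lognormal(0, 0.3, (500, t.size)),
}
# Illustrative model weights, standing in for belief-network output.
weights = {"matrix-dominated": 0.2, "fracture-dominated": 0.5, "mixed": 0.3}

# BMA predictive distribution: resample each ensemble in proportion to
# its model weight, then pool.
pooled = np.vstack([
    ens[rng.integers(0, 500, int(1000 * weights[name]))]
    for name, ens in ensembles.items()
])
lo, med, hi = np.percentile(pooled, [5, 50, 95], axis=0)
print("year 0 mass discharge:  %.1f (%.1f-%.1f) g/day" % (med[0], lo[0], hi[0]))
print("year 49 mass discharge: %.2f (%.2f-%.2f) g/day" % (med[49], lo[49], hi[49]))
```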
NASA Astrophysics Data System (ADS)
Silva, F. E. O. E.; Naghettini, M. D. C.; Fernandes, W.
2014-12-01
This paper evaluated the uncertainties associated with the estimation of the parameters of a conceptual rainfall-runoff model through the use of Bayesian inference techniques and Monte Carlo simulation. The Pará River sub-basin, located in the upper São Francisco river basin in southeastern Brazil, was selected for the study. We used the Rio Grande conceptual hydrologic model (EHR/UFMG, 2001) and the Markov chain Monte Carlo simulation method DREAM (Vrugt, 2008a). Two probabilistic models for the residuals were analyzed: (i) the classic Normal likelihood, r ~ N(0, σ²); and (ii) a generalized likelihood (Schoups & Vrugt, 2010), in which the differences between observed and simulated flows are assumed to be correlated, non-stationary, and distributed as a Skew Exponential Power density. The assumptions made for both models were checked to ensure that the estimation of uncertainties in the parameters was not biased. The results showed that the Bayesian approach was adequate for the proposed objectives and reinforced the importance of assessing the uncertainties associated with hydrological modeling.
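The flavour of the MCMC step can be conveyed with a plain Metropolis random-walk sampler under the classical Normal-residual likelihood (i); DREAM's multi-chain adaptive proposals are not reproduced here, and the one-parameter recession model is a hypothetical stand-in for the Rio Grande model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical one-parameter "model": linear-reservoir recession Q = Q0*exp(-k t)
t = np.arange(60.0)
k_true, sigma = 0.08, 0.3
q_obs = 10 * np.exp(-k_true * t) + rng.normal(0, sigma, t.size)

def log_like(k):
    # Choice (i): independent Gaussian residuals, r ~ N(0, sigma^2).
    r = q_obs - 10 * np.exp(-k * t)
    return -0.5 * np.sum(r ** 2) / sigma ** 2

# Plain Metropolis random walk; flat prior on k > 0.
k, chain = 0.2, []
ll = log_like(k)
for _ in range(20_000):
    k_new = k + rng.normal(0, 0.01)
    ll_new = log_like(k_new) if k_new > 0 else -np.inf
    if np.log(rng.random()) < ll_new - ll:
        k, ll = k_new, ll_new
    chain.append(k)

post = np.array(chain[5000:])  # discard burn-in
print(f"posterior k: {post.mean():.4f} +/- {post.std():.4f} (true {k_true})")
```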
Zyvoloski, G.; Kwicklis, E.; Eddebbarh, A.-A.; Arnold, B.; Faunt, C.; Robinson, B.A.
2003-01-01
This paper presents several different conceptual models of the Large Hydraulic Gradient (LHG) region north of Yucca Mountain and describes the impact of those models on groundwater flow near the potential high-level repository site. The results are based on a numerical model of the site-scale saturated zone beneath Yucca Mountain. This model is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. The numerical model is calibrated by matching available water level measurements using parameter estimation techniques, along with more informal comparisons of the model to hydrologic and geochemical information. The model software (the hydrologic simulation code FEHM and the parameter estimation software PEST) and model setup allow for efficient calibration of multiple conceptual models. Until now, the Large Hydraulic Gradient has been simulated using a low-permeability, east-west oriented feature, even though direct evidence for this feature is lacking. In addition to this model, we investigate and calibrate three additional conceptual models of the Large Hydraulic Gradient, all of which are based on a presumed zone of hydrothermal chemical alteration north of Yucca Mountain. After examining the heads and permeabilities obtained from the calibrated models, we present particle pathways from the potential repository that record differences in the predicted groundwater flow regime. The results show that the Large Hydraulic Gradient can be represented with the alternate conceptual models that include the hydrothermally altered zone. The predicted pathways are mildly sensitive to the choice of conceptual model and more sensitive to the quality of calibration in the vicinity of the repository. These differences are most likely due to different degrees of fit of the model to data, and do not represent important differences in hydrologic conditions among the different conceptual models.
Tyler Jon Smith
2008-01-01
In Montana and much of the Rocky Mountain West, the single most important parameter in forecasting the controls on regional water resources is snowpack. Despite the heightened importance of snowpack, few studies have considered the representation of uncertainty in coupled snowmelt/hydrologic conceptual models. Uncertainty estimation provides a direct interpretation of...
NASA Astrophysics Data System (ADS)
Kelleher, Christa A.; Shaw, Stephen B.
2018-02-01
Recent research has found that hydrologic modeling over decadal time periods often requires time-variant model parameters. Most prior work has focused on assessing time variance in model parameters conceptualizing watershed features and functions. In this paper, we assess whether adding a time-variant scalar to potential evapotranspiration (PET) can be used in place of time-variant parameters. Using the HBV hydrologic model and four simple but common PET methods (Hamon, Priestley-Taylor, Oudin, and Hargreaves), we simulated 60+ years of daily discharge on four rivers in New York State. Allowing all ten model parameters to vary in time achieved good model fits in terms of daily NSE and long-term water balance. However, allowing single model parameters to vary in time, including a scalar on PET, achieved nearly equivalent model fits across PET methods. Overall, varying a PET scalar in time is likely more physically consistent with known biophysical controls on PET than varying parameters that conceptualize innate watershed properties such as wilting point and field capacity. This work suggests that the seeming need for time variance in innate watershed parameters may be due to overly simple evapotranspiration formulations that do not account for all factors controlling evapotranspiration over long time periods.
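A minimal sketch of the idea, assuming a synthetic daily PET series: a single piecewise-constant scalar (one value per five-year block, values invented here) is the only time-variant quantity, while all watershed parameters would stay fixed.

```python
import numpy as np

rng = np.random.default_rng(5)
years = np.repeat(np.arange(1950, 2010), 365)                    # 60 years, daily
pet_daily = np.clip(rng.normal(2.5, 1.0, years.size), 0, None)   # mm/day, synthetic

# One scalar per 5-year block; in practice each value would be calibrated.
blocks = (years - years.min()) // 5
scalars = rng.normal(1.0, 0.08, blocks.max() + 1)                # illustrative
pet_adjusted = scalars[blocks] * pet_daily                       # fed to the model

for b in range(0, blocks.max() + 1, 3):
    sel = blocks == b
    print(f"{1950 + 5 * b}-{1954 + 5 * b}: scalar={scalars[b]:.3f}, "
          f"mean PET {pet_daily[sel].mean():.2f} -> {pet_adjusted[sel].mean():.2f} mm/d")
```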
Evaluating Conceptual Site Models with Multicomponent Reactive Transport Modeling
NASA Astrophysics Data System (ADS)
Dai, Z.; Heffner, D.; Price, V.; Temples, T. J.; Nicholson, T. J.
2005-05-01
Modeling ground-water flow and multicomponent reactive chemical transport is a useful approach for testing conceptual site models and assessing the design of monitoring networks. A graded approach with three conceptual site models is presented here with a field case of tetrachloroethene (PCE) transport and biodegradation near Charleston, SC. The first model assumed a one-layer homogeneous aquifer structure with semi-infinite boundary conditions, in which an analytical solution of the reactive solute transport can be obtained with BIOCHLOR (Aziz et al., 1999). Due to the over-simplification of the aquifer structure, this simulation could not reproduce the monitoring data. In the second approach we used GMS to develop the conceptual site model, a layer-cake multi-aquifer system, and applied numerical modules (MODFLOW and RT3D within GMS) to solve the flow and reactive transport problem. The results were better than those of the first approach but still did not fit the plume well because the geological structures were still inadequately defined. In the third approach we developed a complex conceptual site model by interpreting log and seismic survey data with Petra and PetraSeis. We detected a major channel and a younger channel through the PCE source area. These channels control the local ground-water flow direction and provide a preferential chemical transport pathway. Results using the third conceptual site model agree well with the monitoring concentration data. This study confirms that the bias and uncertainty from inadequate conceptual models are much larger than those introduced by an inadequate choice of model parameter values (Neuman and Wierenga, 2003; Meyer et al., 2004). Numerical modeling in this case provides key insight into the hydrogeology and geochemistry of the field site for predicting contaminant transport in the future. Finally, critical monitoring points and performance indicator parameters are selected for future monitoring to confirm system performance.
Imposing constraints on parameter values of a conceptual hydrological model using baseflow response
NASA Astrophysics Data System (ADS)
Dunn, S. M.
Calibration of conceptual hydrological models is frequently limited by a lack of data about the area being studied. The result is that a broad range of parameter values can be identified that give an equally good calibration to the available observations, usually of stream flow. The use of total stream flow can bias analyses towards interpretation of rapid runoff, whereas water quality issues are more frequently associated with low-flow conditions. This paper demonstrates how model distinctions between surface and sub-surface runoff can be used to define a likelihood measure based on the sub-surface (or baseflow) response. This helps to provide more information about the model behaviour, constrain the acceptable parameter sets, and reduce uncertainty in streamflow prediction. A conceptual model, DIY, is applied to two contrasting catchments in Scotland, the Ythan and the Carron Valley. Parameter ranges and envelopes of prediction are identified using criteria based on total flow efficiency, baseflow efficiency, and combined efficiencies. The individual parameter ranges derived using the combined efficiency measures still cover relatively wide bands, but are better constrained for the Carron than for the Ythan. This reflects the fact that hydrological behaviour in the Carron is dominated by a much flashier surface response than in the Ythan. Hence, the total flow efficiency is more strongly controlled by surface runoff in the Carron and there is a greater contrast with the baseflow efficiency. Comparisons of the predictions using different efficiency measures for the Ythan also suggest that there is a danger of confusing parameter uncertainties with data and model error if inadequate likelihood measures are defined.
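The effect of adding a baseflow-based criterion can be sketched as follows. The synthetic flows and the two candidate parameter sets are hypothetical; set B matches total flow perfectly by misallocating water between the surface and sub-surface components, and only the baseflow efficiency exposes it.

```python
import numpy as np

def nse(obs, sim):
    # Nash-Sutcliffe efficiency
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(11)
n = 730
q_base_obs = 2 + np.sin(np.linspace(0, 4 * np.pi, n))   # slow (baseflow) component
q_fast_obs = rng.gamma(0.3, 2.0, n)                     # flashy surface component
q_obs = q_base_obs + q_fast_obs

# Suppose the model reports surface and sub-surface runoff separately (as
# DIY does); two candidate parameter sets, both plausible on total flow.
candidates = {
    "set A": (q_base_obs * 1.02, q_fast_obs * 0.95),
    "set B": (q_base_obs * 0.55, q_fast_obs + 0.45 * q_base_obs),
}
for name, (q_base_sim, q_fast_sim) in candidates.items():
    e_total = nse(q_obs, q_base_sim + q_fast_sim)
    e_base = nse(q_base_obs, q_base_sim)
    print(f"{name}: total-flow NSE={e_total:.3f}, baseflow NSE={e_base:.3f}, "
          f"combined={0.5 * (e_total + e_base):.3f}")
```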
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sig Drellack, Lance Prothro
2007-12-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty, including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty, in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial extent of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. For the multiple alternative flow models advanced, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and in transport parameters, including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility, is considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
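The particle-tracking/convolution step admits a compact sketch: in situ concentration is the discretized convolution of a source-release history with a travel-time distribution estimated from particle arrival times. All numbers below are invented, and units are schematic.

```python
import numpy as np

t = np.arange(0, 200.0)                        # years since test
release = np.where(t < 50, 1.0e3 * np.exp(-0.05 * t), 0.0)  # Bq/yr, toy history

# Travel-time pdf from particle arrival times (lognormal stand-in for the
# arrival-time histogram a particle-tracking run would produce).
arrivals = np.random.default_rng(2).lognormal(np.log(60), 0.4, 5000)
pdf, edges = np.histogram(arrivals, bins=t.size, range=(0, 200), density=True)

conc = np.convolve(release, pdf)[: t.size]     # discretized convolution integral
print("peak in situ value: %.1f at year %d" % (conc.max(), conc.argmax()))
```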
NASA Astrophysics Data System (ADS)
Clark, Martyn P.; Kavetski, Dmitri
2010-10-01
A major neglected weakness of many current hydrological models is the numerical method used to solve the governing model equations. This paper thoroughly evaluates several classes of time stepping schemes in terms of numerical reliability and computational efficiency in the context of conceptual hydrological modeling. Numerical experiments are carried out using 8 distinct time stepping algorithms and 6 different conceptual rainfall-runoff models, applied in a densely gauged experimental catchment, as well as in 12 basins with diverse physical and hydroclimatic characteristics. Results show that, over vast regions of the parameter space, the numerical errors of fixed-step explicit schemes commonly used in hydrology routinely dwarf the structural errors of the model conceptualization. This substantially degrades model predictions, but also, disturbingly, generates fortuitously adequate performance for parameter sets where numerical errors compensate for model structural errors. Simply running fixed-step explicit schemes with shorter time steps provides a poor balance between accuracy and efficiency: in some cases daily-step adaptive explicit schemes with moderate error tolerances achieved comparable or higher accuracy than 15 min fixed-step explicit approximations but were nearly 10 times more efficient. From the range of simple time stepping schemes investigated in this work, the fixed-step implicit Euler method and the adaptive explicit Heun method emerge as good practical choices for the majority of simulation scenarios. In combination with the companion paper, where impacts on model analysis, interpretation, and prediction are assessed, this two-part study vividly highlights the impact of numerical errors on critical performance aspects of conceptual hydrological models and provides practical guidelines for robust numerical implementation.
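A minimal sketch of the comparison on a toy nonlinear reservoir, contrasting a fixed-step explicit Euler scheme with an adaptive Heun scheme whose step size is controlled by the embedded Euler/Heun error estimate; the storage equation and tolerances are illustrative, not the paper's benchmark models.

```python
import numpy as np

# Toy nonlinear storage: dS/dt = P - k * S**1.5
P, k = 5.0, 0.3
dSdt = lambda S: P - k * S ** 1.5

def fixed_euler(S0, dt, T):
    S = S0
    for _ in range(int(T / dt)):
        S += dt * dSdt(S)       # fixed-step explicit Euler
    return S

def adaptive_heun(S0, T, tol=1e-4):
    S, t, dt = S0, 0.0, 0.1
    while t < T:
        dt = min(dt, T - t)
        f0 = dSdt(S)
        pred = S + dt * f0                         # Euler predictor
        S_new = S + 0.5 * dt * (f0 + dSdt(pred))   # Heun corrector
        err = abs(S_new - pred)                    # embedded error estimate
        if err < tol:                              # accept step
            S, t = S_new, t + dt
        dt *= min(2.0, max(0.2, 0.9 * (tol / (err + 1e-12)) ** 0.5))
    return S

print("fixed Euler, dt=1.0    :", fixed_euler(2.0, 1.0, 50))
print("fixed Euler, dt=0.01   :", fixed_euler(2.0, 0.01, 50))
print("adaptive Heun, tol=1e-4:", adaptive_heun(2.0, 50))
```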
Identifying Hydrologic Processes in Agricultural Watersheds Using Precipitation-Runoff Models
Linard, Joshua I.; Wolock, David M.; Webb, Richard M.T.; Wieczorek, Michael
2009-01-01
Understanding the fate and transport of agricultural chemicals applied to agricultural fields will assist in designing the most effective strategies to prevent water-quality impairments. At a watershed scale, the processes controlling the fate and transport of agricultural chemicals are generally understood only conceptually. To examine the applicability of conceptual models to the processes actually occurring, two precipitation-runoff models - the Soil and Water Assessment Tool (SWAT) and the Water, Energy, and Biogeochemical Model (WEBMOD) - were applied in different agricultural settings of the contiguous United States. Each model, through different physical processes, simulated the transport of water to a stream from the surface, the unsaturated zone, and the saturated zone. Models were calibrated for watersheds in Maryland, Indiana, and Nebraska. The calibrated sets of input parameters for each model at each watershed are discussed, and the criteria used to validate the models are explained. The SWAT and WEBMOD model results at each watershed conformed to each other and to the processes identified in each watershed's conceptual hydrology. In Maryland the conceptual understanding of the hydrology indicated groundwater flow was the largest annual source of streamflow; the simulation results for the validation period confirm this. The dominant source of water to the Indiana watershed was thought to be tile drains. Although tile drains were not explicitly simulated in the SWAT model, a large component of streamflow was received from lateral flow, which could be attributed to tile drains. Being able to explicitly account for tile drains, WEBMOD indicated water from tile drains constituted most of the annual streamflow in the Indiana watershed. The Nebraska models indicated annual streamflow was composed primarily of perennial groundwater flow and infiltration-excess runoff, which conformed to the conceptual hydrology developed for that watershed. The hydrologic processes represented in the parameter sets resulting from each model were comparable at individual watersheds, but varied between watersheds. The models were unable to show, however, whether hydrologic processes other than those included in the original conceptual models were major contributors to streamflow. Supplemental simulations of agricultural chemical transport could improve the ability to assess conceptual models.
Simulated discharge trends indicate robustness of hydrological models in a changing climate
NASA Astrophysics Data System (ADS)
Addor, Nans; Nikolova, Silviya; Seibert, Jan
2016-04-01
Assessing the robustness of hydrological models under contrasted climatic conditions should be part of any hydrological model evaluation. Robust models are particularly important for climate impact studies, as models performing well under current conditions are not necessarily capable of correctly simulating hydrological perturbations caused by climate change. A pressing issue is the usually assumed stationarity of parameter values over time. Modeling experiments using conceptual hydrological models have revealed that assuming transposability of parameter values under changing climatic conditions can lead to significant biases in discharge simulations. This raises the question of whether parameter values should be modified over time to reflect changes in hydrological processes induced by climate change. Such a question denotes a focus on the contribution of internal (i.e., catchment) processes to discharge generation. Here we adopt a different perspective and explore the contribution of external forcing (i.e., changes in precipitation and temperature) to changes in discharge. We argue that in a robust hydrological model, discharge variability should be induced by changes in the boundary conditions, and not by changes in parameter values. In this study, we explore how well the conceptual hydrological model HBV captures transient changes in hydrological signatures over the period 1970-2009. Our analysis focuses on research catchments in Switzerland undisturbed by human activities. The precipitation and temperature forcing are extracted from recently released 2-km gridded data sets. We use a genetic algorithm to calibrate HBV for the whole 40-year period and for the eight successive 5-year periods to assess potential trends in parameter values. Model calibration is run multiple times to account for parameter uncertainty. We find that in alpine catchments showing a significant increase in winter discharge, this trend can be captured reasonably well with constant parameter values over the whole reference period. Further, preliminary results suggest that some trends in parameter values do not reflect changes in hydrological processes, as reported by others previously, but instead might stem from a modeling artifact related to the parameterization of evapotranspiration, which is overly sensitive to temperature increase. We adopt a trading-space-for-time approach to better understand whether robust relationships between parameter values and forcing can be established, and to critically explore the rationale behind time-dependent parameter values in conceptual hydrological models.
Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.
2004-03-01
The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When there are minimal site-specific data, the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities, and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four projections, and the associated kriging variances, were averaged using the posterior model probabilities as weights. Finally, cross-validation was conducted by eliminating from consideration all data from one borehole at a time, repeating the above process, and comparing the predictive capability of the model-averaged result with that of each individual model. Using two quantitative measures of comparison, the model-averaged result was superior to any individual geostatistical model of log permeability considered.
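The averaging arithmetic can be sketched as follows, using BIC-style weights as a stand-in for the information criteria discussed in the report; model names, likelihoods, and kriging outputs are invented. The total variance combines the within-model kriging variance with the between-model spread.

```python
import numpy as np

# Hypothetical variogram-model alternatives with calibrated negative log
# likelihoods and parameter counts.
models = {
    "exponential": {"nll": 120.5, "k": 3},
    "spherical":   {"nll": 121.0, "k": 3},
    "power":       {"nll": 119.8, "k": 2},
    "exp + drift": {"nll": 118.9, "k": 5},
}
n_obs, prior = 60, 0.25  # equal prior model probabilities

# BIC = 2*NLL + k*ln(n); posterior weight proportional to prior * exp(-BIC/2)
bic = {m: 2 * v["nll"] + v["k"] * np.log(n_obs) for m, v in models.items()}
b0 = min(bic.values())
w = {m: prior * np.exp(-(b - b0) / 2) for m, b in bic.items()}
z = sum(w.values())
w = {m: wi / z for m, wi in w.items()}

# Model-averaged kriging prediction and variance at one location (toy values)
mean = {"exponential": 2.1, "spherical": 2.0, "power": 2.4, "exp + drift": 1.8}
var = {"exponential": 0.30, "spherical": 0.32, "power": 0.25, "exp + drift": 0.40}
mu = sum(w[m] * mean[m] for m in models)
# total variance = within-model variance + between-model spread
v_tot = sum(w[m] * (var[m] + (mean[m] - mu) ** 2) for m in models)
print({m: round(w[m], 3) for m in models}, "mu=%.2f var=%.2f" % (mu, v_tot))
```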
Force-directed visualization for conceptual data models
NASA Astrophysics Data System (ADS)
Battigaglia, Andrew; Sutter, Noah
2017-03-01
Conceptual data models are increasingly stored in an eXtensible Markup Language (XML) format because of its portability between different systems and the ability of databases to use this format for storing data. However, when attempting to capture business or design needs, an organized graphical format is preferred in order to facilitate communication and to receive as much input as possible from users and subject-matter experts. Existing methods of achieving this conversion suffer from two problems: they are not specific enough to capture all of the needs of conceptual data modeling, and they cannot handle a large number of relationships between entities. This paper describes an implementation of a modeling solution that clearly illustrates conceptual data models stored in XML formats in well-organized and structured diagrams. A force layout with several different parameters is applied to the diagram to create both compact and easily traversable relationships between entities.
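A minimal spring-embedder sketch of a force layout (Fruchterman-Reingold flavour): entities repel each other while relationships act as springs. The entity names and edges stand in for what would be parsed from the XML document; the force constants are illustrative.

```python
import numpy as np

entities = ["Customer", "Order", "Product", "Invoice", "Shipment"]
edges = [(0, 1), (1, 2), (1, 3), (1, 4), (3, 0)]  # relationships (toy)

rng = np.random.default_rng(4)
pos = rng.uniform(-1, 1, (len(entities), 2))
k = 0.5  # ideal edge length

for step in range(300):
    disp = np.zeros_like(pos)
    # repulsive force between every pair of entities
    for i in range(len(entities)):
        d = pos[i] - pos
        dist = np.linalg.norm(d, axis=1)
        dist[i] = np.inf                       # no self-repulsion
        disp[i] += ((d.T / dist * (k ** 2 / dist)).T).sum(axis=0)
    # attractive force along relationships
    for i, j in edges:
        d = pos[i] - pos[j]
        dist = np.linalg.norm(d) + 1e-9
        f = d / dist * (dist ** 2 / k)
        disp[i] -= f
        disp[j] += f
    # move each node a small, capped distance along its net force
    pos += 0.01 * disp / (np.linalg.norm(disp, axis=1, keepdims=True) + 1e-9)

for name, p in zip(entities, pos):
    print(f"{name:9s} ({p[0]:+.2f}, {p[1]:+.2f})")
```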
NASA Astrophysics Data System (ADS)
Dunn, S. M.; Lilly, A.
2001-10-01
There are now many examples of hydrological models that utilise the capabilities of Geographic Information Systems to generate spatially distributed predictions of behaviour. However, the spatial variability of hydrological parameters relating to distributions of soils and vegetation can be hard to establish. In this paper, the relationship between a soil hydrological classification, Hydrology of Soil Types (HOST), and the spatial parameters of a conceptual catchment-scale model is investigated. A procedure involving inverse modelling using Monte Carlo simulations on two catchments is developed to identify relative values for soil-related parameters of the DIY model. The relative values determine the internal variability of hydrological processes as a function of soil type. For three of the four soil parameters studied, the variability between HOST classes was found to be consistent across the two catchments when tested independently. Problems in identifying values for the fourth parameter, the 'fast response distance', have highlighted a potential limitation of the present structure of the model: the assumption that this parameter can be related simply to soil type rather than topography appears to be inadequate. With the exclusion of this parameter, calibrated parameter sets from one catchment can be converted into equivalent parameter sets for the other catchment on the basis of their HOST distributions, giving a reasonable simulation of flow. Following further testing on different catchments, and modifications to the definition of the fast response distance parameter, the technique provides a methodology whereby spatial soil parameters can be derived directly for new catchments.
2013-06-01
18th ICCRTS: Using a Functional Simulation of Crisis Management to Test the C2 Agility Model Parameters on Key Performance Variables. ... command in crisis management. C2 Agility Model: agility can be conceptualized at a number of different levels, for instance at the team ...
PESTAN: Pesticide Analytical Model Version 4.0 User's Guide
The principal objective of this User's Guide is to provide essential information on aspects such as model conceptualization, model theory, assumptions and limitations, determination of input parameters, analysis of results, and sensitivity analysis.
Impact of the calibration period on the conceptual rainfall-runoff model parameter estimates
NASA Astrophysics Data System (ADS)
Todorovic, Andrijana; Plavsic, Jasna
2015-04-01
A conceptual rainfall-runoff model is defined by its structure and parameters, the latter commonly inferred through model calibration. Parameter estimates depend on the objective function(s), the optimisation method, and the calibration period. Model calibration over different periods may result in dissimilar parameter estimates, while model efficiency decreases outside the calibration period. The problem of model (parameter) transferability, which conditions the reliability of hydrologic simulations, has been investigated for decades. In this paper, the dependence of parameter estimates and model performance on the calibration period is analysed. The main question addressed is: are there any changes in optimised parameters and model efficiency that can be linked to changes in hydrologic or meteorological variables (flow, precipitation and temperature)? The conceptual, semi-distributed HBV-light model is calibrated over five-year periods shifted by a year (sliding time windows). The length of the calibration periods is selected to enable identification of all parameters. One water year of model warm-up precedes every simulation, which starts at the beginning of a water year. The model is calibrated using the built-in GAP optimisation algorithm. The objective function used for calibration is composed of the Nash-Sutcliffe coefficient for flows and for logarithms of flows, and the volumetric error, all of which participate in the composite objective function with approximately equal weights. The same prior parameter ranges are used in all simulations. The model is calibrated against flows observed at the Slovac stream gauge on the Kolubara River in Serbia (records from 1954 to 2013). There are no trends in precipitation or in flows; however, there is a statistically significant increasing trend in temperatures in this catchment. Parameter variability across the calibration periods is quantified in terms of standard deviations of normalised parameters, enabling detection of the most variable parameters. Correlation coefficients among optimised model parameters and total precipitation P, mean temperature T and mean flow Q are calculated to give an insight into parameter dependence on the hydrometeorological drivers. The results reveal high sensitivity of almost all model parameters to the calibration period. The highest variability is displayed by the refreezing coefficient, water holding capacity, and temperature gradient. The only statistically significant (decreasing) trend is detected in the evapotranspiration reduction threshold. Statistically significant correlation is detected between the precipitation gradient and precipitation depth, and between the time-area histogram base and flows. All other correlations are not statistically significant, implying that changes in optimised parameters cannot generally be linked to changes in P, T or Q. As for model performance, the model reproduces the observed runoff satisfactorily, though runoff is slightly overestimated in wet periods. The Nash-Sutcliffe efficiency coefficient (NSE) ranges from 0.44 to 0.79. Higher NSE values are obtained over wetter periods, which is supported by a statistically significant correlation between NSE and flows. Overall, no systematic variations in parameters or in model performance are detected. Parameter variability may therefore be attributed to errors in data or inadequacies in the model structure.
Further research is required to examine the impact of the calibration strategy or model structure on the variability in optimised parameters in time.
A simplified model of precipitation enhancement over a heterogeneous surface
NASA Astrophysics Data System (ADS)
Cioni, Guido; Hohenegger, Cathy
2018-06-01
Soil moisture heterogeneities influence the onset of convection and the subsequent evolution of precipitating systems through the triggering of mesoscale circulations. However, local evaporation also plays a role in determining precipitation amounts. Here we aim at disentangling the effects of advection and evaporation on precipitation over the course of a diurnal cycle by formulating a simple conceptual model. The derivation of the model is inspired by the results of simulations performed with a high-resolution (250 m) large eddy simulation model over a surface with varying degrees of heterogeneity. A key element of the conceptual model is the representation of precipitation as a weighted sum of advection and evaporation, each weighted by its own efficiency. The model is then used to isolate the main parameters that control precipitation variations over a spatially drier patch. It is found that these changes surprisingly do not depend on soil moisture itself but purely on parameters that describe the initial state of the atmosphere. The likelihood of enhanced precipitation over drier soils is discussed based on these parameters. Additional experiments are used to test the validity of the model.
Calibration of an Unsteady Groundwater Flow Model for a Complex, Strongly Heterogeneous Aquifer
NASA Astrophysics Data System (ADS)
Curtis, Z. K.; Liao, H.; Li, S. G.; Phanikumar, M. S.; Lusch, D.
2016-12-01
Modeling of groundwater systems characterized by complex three-dimensional structure and heterogeneity remains a significant challenge. Most of today's groundwater models are developed from relatively simple conceptual representations that favor model calibratability. As more complexities are modeled, e.g., by adding more layers and/or zones, or by introducing transient processes, more parameters have to be estimated, and issues related to ill-posed groundwater problems and non-unique calibration arise. Here, we explore the use of an alternative conceptual representation for groundwater modeling that is fully three-dimensional and can capture complex 3D heterogeneity (both systematic and "random") without over-parameterizing the aquifer system. In particular, we apply Transition Probability (TP) geostatistics to high-resolution borehole data from a water well database to characterize the complex 3D geology. Different aquifer material classes, e.g., 'AQ' (aquifer material), 'MAQ' (marginal aquifer material), 'PCM' (partially confining material), and 'CM' (confining material), are simulated, with the hydraulic properties of each material type treated as tuning parameters during calibration. The TP-based approach is applied to simulate unsteady groundwater flow in a large, complex, and strongly heterogeneous glacial aquifer system in Michigan across multiple spatial and temporal scales. The resulting model is calibrated to observed static water level data over a time span of 50 years. The results show that the TP-based conceptualization enables much more accurate and robust calibration/simulation than conventional deterministic layer/zone-based conceptual representations.
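In one dimension (e.g., down a borehole), the transition-probability idea reduces to an embedded Markov chain over material classes, as sketched below with an invented transition matrix; full TP geostatistics (e.g., T-PROGS) extends this to 3-D cosimulation.

```python
import numpy as np

facies = ["AQ", "MAQ", "PCM", "CM"]
# P[i, j] = probability that the next 1-m interval is facies j given facies i
# (values are illustrative, not estimated from real logs).
P = np.array([
    [0.70, 0.15, 0.10, 0.05],
    [0.20, 0.60, 0.15, 0.05],
    [0.10, 0.15, 0.60, 0.15],
    [0.05, 0.05, 0.20, 0.70],
])
assert np.allclose(P.sum(axis=1), 1.0)  # each row is a conditional pmf

rng = np.random.default_rng(8)
column, state = [], 0  # start in 'AQ'
for _ in range(40):    # 40 one-metre intervals down the hole
    column.append(facies[state])
    state = rng.choice(4, p=P[state])

print(" ".join(column))
```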
Eberts, S.M.; Böhlke, J.K.; Kauffman, L.J.; Jurgens, B.C.
2012-01-01
Environmental age tracers have been used in various ways to help assess vulnerability of drinking-water production wells to contamination. The most appropriate approach will depend on the information that is available and that which is desired. To understand how the well will respond to changing nonpoint-source contaminant inputs at the water table, some representation of the distribution of groundwater ages in the well is needed. Such information for production wells is sparse and difficult to obtain, especially in areas lacking detailed field studies. In this study, age distributions derived from detailed groundwater-flow models with advective particle tracking were compared with those generated from lumped-parameter models to examine conditions in which estimates from simpler, less resource-intensive lumped-parameter models could be used in place of estimates from particle-tracking models. In each of four contrasting hydrogeologic settings in the USA, particle-tracking and lumped-parameter models yielded roughly similar age distributions and largely indistinguishable contaminant trends when based on similar conceptual models and calibrated to similar tracer data. Although model calibrations and predictions were variably affected by tracer limitations and conceptual ambiguities, results illustrated the importance of full age distributions, rather than apparent tracer ages or model mean ages, for trend analysis and forecasting.
Halford, Keith J.
2006-01-01
MODOPTIM is a non-linear ground-water model calibration and management tool that simulates flow with MODFLOW-96 as a subroutine. A weighted sum-of-squares objective function defines optimal solutions for calibration and management problems. Water levels, discharges, water quality, subsidence, and pumping-lift costs are the five direct observation types that can be compared in MODOPTIM. Differences between direct observations of the same type can be compared to fit temporal changes and spatial gradients. Water levels in pumping wells, wellbore storage in the observation wells, and rotational translation of observation wells also can be compared. Negative and positive residuals can be weighted unequally so inequality constraints such as maximum chloride concentrations or minimum water levels can be incorporated in the objective function. Optimization parameters are defined with zones and parameter-weight matrices. Parameter change is estimated iteratively with a quasi-Newton algorithm and is constrained to a user-defined maximum parameter change per iteration. Parameters that are less sensitive than a user-defined threshold are not estimated. MODOPTIM facilitates testing more conceptual models by expediting calibration of each conceptual model. Examples of applying MODOPTIM to aquifer-test analysis, ground-water management, and parameter estimation problems are presented.
NASA Astrophysics Data System (ADS)
Chang, Yong; Wu, Jichun; Jiang, Guanghui; Kang, Zhiqiang
2017-05-01
Conceptual models often suffer from over-parameterization due to limited data available for calibration. This leads to parameter non-uniqueness and equifinality, which may introduce considerable uncertainty into the simulation results. Identifying the appropriate model structure supported by the available data remains a major challenge in hydrological research. In this paper, we adopt a multi-model framework to identify the dominant hydrological processes and appropriate model structure of a karst spring located in Guilin city, China. For this catchment, the spring discharge is the only data available for model calibration. The framework starts with a relatively complex conceptual model based on the perception of the catchment, which is then simplified into several different models by gradually removing model components. A multi-objective approach is used to compare the performance of these models, and regional sensitivity analysis (RSA) is used to investigate parameter identifiability. The results show that this karst spring is mainly controlled by two different hydrological processes, one of which is threshold-driven, consistent with the fieldwork investigation. However, the appropriate model structure for simulating the spring discharge is much simpler than the actual aquifer structure and the process understanding derived from the fieldwork investigation. A simple linear reservoir with two different outlets is enough to simulate the spring discharge; the detailed runoff processes in the catchment are not needed in the conceptual model. A more complex model would require additional data to avoid serious deterioration of model predictions.
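The selected structure, a single linear reservoir with two outlets, one of them threshold-driven, is easy to state in code. The sketch below uses invented parameter values and synthetic recharge.

```python
import numpy as np

def two_outlet_reservoir(recharge, k1=0.05, k2=0.4, h_thr=20.0, dt=1.0):
    """k1: always-active (matrix) outlet; k2: outlet active above h_thr (mm)."""
    h, q = 0.0, []
    for r in recharge:
        q1 = k1 * h                            # slow, always-active outlet
        q2 = k2 * max(h - h_thr, 0.0)          # threshold-driven outlet
        h = max(h + dt * (r - q1 - q2), 0.0)   # storage update
        q.append(q1 + q2)
    return np.array(q)

rng = np.random.default_rng(9)
rain = np.where(rng.random(365) < 0.15, rng.gamma(2.0, 8.0, 365), 0.0)  # mm/d
q = two_outlet_reservoir(rain)
print(f"mean discharge {q.mean():.2f} mm/d, peak {q.max():.2f} mm/d")
```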
The added value of remote sensing products in constraining hydrological models
NASA Astrophysics Data System (ADS)
Nijzink, Remko C.; Almeida, Susana; Pechlivanidis, Ilias; Capell, René; Gustafsson, David; Arheimer, Berit; Freer, Jim; Han, Dawei; Wagener, Thorsten; Sleziak, Patrik; Parajka, Juraj; Savenije, Hubert; Hrachowitz, Markus
2017-04-01
The calibration of a hydrological model still depends on the availability of streamflow data, even though additional sources of information (i.e., remotely sensed data products) have become more widely available. In this research, the model parameters of four different conceptual hydrological models (HYPE, HYMOD, TUW, FLEX) were constrained with remotely sensed products. The models were applied over 27 catchments across Europe to cover a wide range of climates, vegetation and landscapes. The fluxes and states of the models were correlated with the relevant products (e.g., MOD10A snow with modelled snow states), after which new a posteriori parameter distributions were determined based on a weighting procedure using conditional probabilities. Briefly, each parameter was weighted with the coefficient of determination of the relevant regression between modelled states/fluxes and products. In this way, final feasible parameter sets were derived without the use of discharge time series. Initial results show that improvements in model performance, with regard to streamflow simulations, are obtained when the models are constrained with a set of remotely sensed products simultaneously. In addition, we present a more extensive analysis assessing each model's ability to reproduce a set of hydrological signatures, such as rising limb density or peak distribution. Eventually, this research will enhance our understanding and recommendations on the use of remotely sensed products for constraining conceptual hydrological models and improving predictive capability, especially for data-sparse regions.
He, Yujie; Yang, Jinyan; Zhuang, Qianlai; McGuire, A. David; Zhu, Qing; Liu, Yaling; Teskey, Robert O.
2014-01-01
Conventional Q10 soil organic matter decomposition models and more complex microbial models are both available for making projections of future soil carbon dynamics. However, it is unclear (1) how well the conceptually different approaches can simulate observed decomposition and (2) to what extent the trajectories of long-term simulations differ between approaches. In this study, we compared three structurally different soil carbon (C) decomposition models (one Q10 model and two microbial models of different complexity), each with one- and two-horizon versions. The models were calibrated and validated using 4 years of measurements of heterotrophic soil CO2 efflux from trenched plots in a Dahurian larch (Larix gmelinii Rupr.) plantation. All models reproduced the observed heterotrophic component of soil CO2 efflux, but the trajectories of soil carbon dynamics differed substantially in 100-year simulations with and without warming and increased litterfall input, with the microbial models producing better agreement with observed changes in soil organic C in long-term warming experiments. Our results also suggest that both constant and varying carbon use efficiency are plausible when modeling future decomposition dynamics, and that a short-term (e.g., a few years) period of measurement is insufficient to adequately constrain the model parameters that represent long-term responses of microbial thermal adaptation. These results highlight the need to reframe the representation of decomposition in models and to constrain parameters with long-term observations and multiple data streams. We urge caution in interpreting future soil carbon responses derived from existing decomposition models, because both conceptual and parameter uncertainties are substantial.
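For reference, the conventional Q10 formulation that the study contrasts with microbial models takes a standard textbook form; a minimal sketch, with illustrative parameter values, follows.

```python
import numpy as np

def q10_decomposition(C, T, k_ref=0.01, T_ref=10.0, Q10=2.0):
    """Conventional first-order decomposition with Q10 temperature scaling:
    dC/dt = -k_ref * Q10**((T - T_ref)/10) * C. Parameter values illustrative."""
    k = k_ref * Q10 ** ((T - T_ref) / 10.0)
    return -k * C

# One-year daily simulation of a single soil C pool under a sinusoidal
# temperature cycle (explicit Euler for brevity).
C, dt = 1000.0, 1.0
for day in range(365):
    T = 10.0 + 15.0 * np.sin(2 * np.pi * day / 365.0)
    C += q10_decomposition(C, T) * dt
```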
Prevention through Design Adoption Readiness Model (PtD ARM): An integrated conceptual model.
Weidman, Justin; Dickerson, Deborah E; Koebel, Charles T
2015-01-01
Prevention through Design (PtD), eliminating hazards at the design stage of tools and systems, is the optimal method of mitigating occupational health and safety risks. A recent National Institute for Occupational Safety and Health initiative has established a goal of increasing the adoption of PtD innovation in industry. The construction industry has traditionally lagged behind other sectors in the adoption of innovation in general, and of safety and health prevention innovation in particular. Therefore, as a first step toward improving adoption trends in this sector, a conceptual model was developed to describe the parameters and causal relationships that influence and predict construction stakeholders' "adoption readiness" for PtD technology innovation. The model was built upon three well-established theoretical frameworks: the Health Belief Model, the Diffusion of Innovation Model, and the Technology Acceptance Model. Earp and Ennett's model development methodology was employed to depict the key constructs and the directionality and magnitude of the relationships among them. Key constructs were identified from the literature associated with the three theoretical frameworks, with special emphasis given to studies related to construction or OHS technology adoption. A conceptual model is presented, and recommendations for future research are described, including confirmatory structural equation modeling of model parameters and relationships, additional descriptive investigation of barriers to adoption in some trade sectors, and the design and evaluation of an intervention strategy.
Abramoff, Rose; Xu, Xiaofeng; Hartman, Melannie; ...
2017-12-20
Soil organic carbon (SOC) can be defined by measurable chemical and physical pools, such as mineral-associated carbon, carbon physically entrapped in aggregates, dissolved carbon, and fragments of plant detritus. Yet, most soil models use conceptual rather than measurable SOC pools. What would the traditional pool-based soil model look like if it were built today, reflecting the latest understanding of biological, chemical, and physical transformations in soils? We propose a conceptual model—the Millennial model—that defines pools as measurable entities. First, we discuss relevant pool definitions conceptually and in terms of the measurements that can be used to quantify pool size, formation, and destabilization. Then, we develop a numerical model following the Millennial model conceptual framework to evaluate against the Century model, a widely-used standard for estimating SOC stocks across space and through time. The Millennial model predicts qualitatively similar changes in total SOC in response to single-factor perturbations when compared to Century, but different responses to multiple-factor perturbations. Finally, we review important conceptual and behavioral differences between the Millennial and Century modeling approaches, and the field and lab measurements needed to constrain parameter values. Here, we propose the Millennial model as a simple but comprehensive framework to model SOC pools and guide measurements for further model development.
Detecting hydrological changes through conceptual model
NASA Astrophysics Data System (ADS)
Viola, Francesco; Caracciolo, Domenico; Pumo, Dario; Francipane, Antonio; Valerio Noto, Leonardo
2015-04-01
Natural changes and human modifications in hydrological systems coevolve and interact in a coupled way. On one hand, climatic changes are stochastic and non-stationary and affect hydrological systems; on the other hand, human-induced changes due to over-exploitation of soils and water resources modify the natural landscape, the water fluxes and their partitioning. The traditional assumption of static systems in hydrological analysis, adopted for a long time, therefore fails whenever transient climatic conditions and/or land use changes occur. Time series analysis is one way to explore environmental changes together with societal changes; unfortunately, the inability to distinguish between causes restricts the scope of this method. To overcome this limitation, time series analysis can be coupled with a suitable hydrological model, such as a conceptual model, which offers a schematization of the complex dynamics acting within a basin. Assuming that model parameters represent morphological basin characteristics and that calibration captures the hydrological signature at a specific moment, calibrating the model over different time windows can serve as a method for detecting potential hydrological changes. To test the capability of a conceptual model to detect hydrological changes, this work presents several "in silico" experiments. A synthetic basin is forced with an ensemble of possible future scenarios generated with a stochastic weather generator able to simulate steady and non-steady climatic conditions. The experiments refer to a Mediterranean climate, characterized by marked seasonality, and consider the outcomes of the IPCC 5th Assessment Report to describe climate evolution over the next century. In particular, to generate future climate change scenarios, a stochastic downscaling in space and time is carried out using realizations of an ensemble of General Circulation Models (GCMs) for the periods 2046-2065 and 2081-2100. Land use changes (i.e., changes in the fraction of impervious area due to increasing urbanization) are explicitly simulated, while the reference hydrological responses are assessed with the spatially distributed, process-based hydrological model tRIBS, the TIN-based Real-time Integrated Basin Simulator. Several scenarios have been created, describing hypothetical centuries with steady conditions, climate change, land use change, and finally complex conditions involving both transient climatic modifications and gradual land use changes. A conceptual lumped model, the EcoHydrological Streamflow Model (EHSM), is calibrated for these scenarios over different time windows. The calibrated parameters show high sensitivity to anthropogenic changes in land use and/or to climatic variability. Land use changes are clearly visible in the evolution of the parameters, especially under steady climatic conditions. When the increase in urbanization is coupled with a rainfall reduction, the ability to detect human interventions through the analysis of conceptual model parameters is weakened.
A conceptual disease model for adult Pompe disease.
Kanters, Tim A; Redekop, W Ken; Rutten-Van Mölken, Maureen P M H; Kruijshaar, Michelle E; Güngör, Deniz; van der Ploeg, Ans T; Hakkaart, Leona
2015-09-15
Studies in orphan diseases are, by nature, confronted with small patient populations, meaning that randomized controlled trials will have limited statistical power. In order to estimate the effectiveness of treatments in orphan diseases and extrapolate effects into the future, alternative models might be needed. The purpose of this study is to develop a conceptual disease model for Pompe disease in adults (an orphan disease). This conceptual model describes the associations between the most important levels of health concepts for Pompe disease in adults, from biological parameters via physiological parameters, symptoms and functional indicators to health perceptions and final health outcomes as measured in terms of health-related quality of life. The structure of the Wilson-Cleary health outcomes model was used as a blueprint, and filled with clinically relevant aspects for Pompe disease based on literature and expert opinion. Multiple observations per patient from a Dutch cohort study in untreated patients were used to quantify the relationships between the different levels of health concepts in the model by means of regression analyses. Enzyme activity, muscle strength, respiratory function, fatigue, level of handicap, general health perceptions, mental and physical component scales and utility described the different levels of health concepts in the Wilson-Cleary model for Pompe disease. Regression analyses showed that functional status was affected by fatigue, muscle strength and respiratory function. Health perceptions were affected by handicap. In turn, self-reported quality of life was affected by health perceptions. We conceptualized a disease model that incorporated the mechanisms believed to be responsible for impaired quality of life in Pompe disease. The model provides a comprehensive overview of various aspects of Pompe disease in adults, which can be useful for both clinicians and policymakers to support their multi-faceted decision making.
UNCERTAINTY AND SENSITIVITY ANALYSES FOR VERY HIGH ORDER MODELS
While there may in many cases be high potential for exposure of humans and ecosystems to chemicals released from a source, the degree to which this potential is realized is often uncertain. Conceptually, uncertainties are divided among parameters, model, and modeler during simula...
USDA-ARS?s Scientific Manuscript database
Runoff travel time, which is a function of watershed and storm characteristics, is an important parameter affecting the prediction accuracy of hydrologic models. Although, time of concentration (tc) is a most widely used time parameter, it has multiple conceptual and computational definitions. Most ...
Hermeneutics of Sansei Management: Some Conceptual Parameters.
ERIC Educational Resources Information Center
Tanaka, Ronald
1988-01-01
Explores the managerial behavior of sansei, third-generation Japanese-Americans, which directly conforms to neither Western nor native Japanese models, but which contains identifiable elements of each. (BJV)
Taking the mystery out of mathematical model applications to karst aquifers—A primer
Kuniansky, Eve L.
2014-01-01
Advances in mathematical model applications toward the understanding of the complex flow, characterization, and water-supply management issues for karst aquifers have occurred in recent years. Different types of mathematical models can be applied successfully if appropriate information is available and the problems are adequately identified. The mathematical approaches discussed in this paper are divided into three major categories: 1) distributed parameter models, 2) lumped parameter models, and 3) fitting models. The modeling approaches are described conceptually with examples (but without equations) to help non-mathematicians understand the applications.
NASA Astrophysics Data System (ADS)
Kavetski, Dmitri; Clark, Martyn P.
2010-10-01
Despite the widespread use of conceptual hydrological models in environmental research and operations, they remain frequently implemented using numerically unreliable methods. This paper considers the impact of the time stepping scheme on model analysis (sensitivity analysis, parameter optimization, and Markov chain Monte Carlo-based uncertainty estimation) and prediction. It builds on the companion paper (Clark and Kavetski, 2010), which focused on numerical accuracy, fidelity, and computational efficiency. Empirical and theoretical analysis of eight distinct time stepping schemes for six different hydrological models in 13 diverse basins demonstrates several critical conclusions. (1) Unreliable time stepping schemes, in particular fixed-step explicit methods, suffer from troublesome numerical artifacts that severely deform the objective function of the model. These deformations are not rare isolated instances but can arise in any model structure, in any catchment, and under common hydroclimatic conditions. (2) Sensitivity analysis can be severely contaminated by numerical errors, often to the extent that it becomes dominated by the sensitivity of truncation errors rather than of the model equations. (3) Robust time stepping schemes generally produce "better behaved" objective functions, free of spurious local optima, and with sufficient numerical continuity to permit parameter optimization using efficient quasi-Newton methods. When implemented within a multistart framework, modern Newton-type optimizers are robust even when started far from the optima and provide valuable diagnostic insights not directly available from evolutionary global optimizers. (4) Unreliable time stepping schemes lead to inconsistent and biased inferences of the model parameters and internal states. (5) Even when interactions between hydrological parameters and numerical errors provide "the right result for the wrong reason" and the calibrated model performance appears adequate, unreliable time stepping schemes make the model unnecessarily fragile in predictive mode, undermining validation assessments and operational use. Erroneous or misleading conclusions of model analysis and prediction arising from numerical artifacts in hydrological models are intolerable, especially given that robust numerics are accepted as mainstream in other areas of science and engineering. We hope that these vivid empirical findings will encourage the conceptual hydrological community to close its Pandora's box of numerical problems, paving the way for more meaningful model application and interpretation.
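The core numerical pathology is easy to reproduce on a toy model. The sketch below, which is not the paper's experimental setup, compares a fixed-step explicit Euler implementation of a nonlinear reservoir against an adaptive solver across a parameter sweep; the fixed-step objective surface can develop the kind of spurious irregularities the paper describes.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rain(t):
    """Intermittent illustrative forcing."""
    return 5.0 * (np.sin(t / 5.0) > 0)

def reservoir_rhs(t, S, k):
    """Nonlinear reservoir: dS/dt = rain(t) - k * S**2."""
    return [rain(t) - k * S[0] ** 2]

def simulate_fixed_euler(k, dt=1.0, t_end=50.0):
    """Fixed-step explicit Euler (the numerically fragile scheme)."""
    S, out = 1.0, []
    for t in np.arange(0, t_end, dt):
        S = max(S + dt * (rain(t) - k * S ** 2), 0.0)
        out.append(S)
    return np.array(out)

def simulate_adaptive(k, t_end=50.0):
    """Adaptive-step Runge-Kutta via scipy (the robust reference)."""
    t_eval = np.arange(0, t_end, 1.0)
    sol = solve_ivp(reservoir_rhs, (0, t_end), [1.0], args=(k,),
                    t_eval=t_eval, rtol=1e-6, atol=1e-9)
    return sol.y[0]

# Sweep k and compare objective surfaces: the fixed-step curve can show
# artificial kinks absent from the adaptive one.
ks = np.linspace(0.01, 0.5, 50)
obs = simulate_adaptive(0.2)  # synthetic "observations" with true k = 0.2
sse_fixed = [np.sum((simulate_fixed_euler(k) - obs) ** 2) for k in ks]
sse_adapt = [np.sum((simulate_adaptive(k) - obs) ** 2) for k in ks]
```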
Geographic information system/watershed model interface
Fisher, Gary T.
1989-01-01
Geographic information systems allow for the interactive analysis of spatial data related to water-resources investigations. A conceptual design for an interface between a geographic information system and a watershed model includes functions for the estimation of model parameter values. Design criteria include ease of use, minimal equipment requirements, a generic data-base management system, and use of a macro language. An application is demonstrated for a 90.1-square-kilometer subbasin of the Patuxent River near Unity, Maryland, that performs automated derivation of watershed parameters for hydrologic modeling.
Modeling and Predicting Pesticide Exposures
Models provide a means for representing a real system in an understandable way. They take many forms, beginning with conceptual models that explain the way a system works, such as delineation of all the factors and parameters of how a pesticide particle moves in the air after a s...
Regionalized rainfall-runoff model to estimate low flow indices
NASA Astrophysics Data System (ADS)
Garcia, Florine; Folton, Nathalie; Oudin, Ludovic
2016-04-01
Estimating low flow indices is of paramount importance for water resource management and risk assessment. These indices are derived from river discharges measured at gauged stations. The lack of observations at ungauged sites, however, makes it necessary to develop methods to estimate these indices from discharges observed in neighboring catchments and from catchment characteristics. Different estimation methods exist: regressions or geostatistical methods performed directly on the low flow indices are the most common, while a less common approach consists in regionalizing rainfall-runoff model parameters, from catchment characteristics or by spatial proximity, and estimating low flow indices from the simulated hydrographs. Irstea developed GR2M-LoiEau, a conceptual monthly rainfall-runoff model combined with a regionalized model of snow storage and melt. GR2M-LoiEau relies on only two parameters, which are regionalized and mapped throughout France, and allows mapping monthly reference low flow indices. The input data come from SAFRAN, the distributed mesoscale atmospheric analysis system, which provides daily solid and liquid precipitation and temperature data everywhere in the French territory. To exploit these data fully and to estimate daily low flow indices, a new version of GR-LoiEau has been developed at a daily time step. The aim of this work is to develop and regionalize a GR-LoiEau model that can provide daily, monthly or annual estimates of low flow indices while keeping only a few parameters, which is a major advantage for regionalization. The work includes two parts. On the one hand, a daily conceptual rainfall-runoff model with only three parameters is developed to simulate daily and monthly low flow indices, mean annual runoff and seasonality. On the other hand, different regionalization methods, based on spatial proximity and similarity, are tested to estimate the model parameters and to simulate low flow indices at ungauged sites. The analysis is carried out on 691 French catchments representative of various hydro-meteorological behaviors. The results are validated with a cross-validation procedure and compared with those obtained with GR4J, a conceptual rainfall-runoff model that already provides daily estimates but involves four parameters that cannot easily be regionalized.
NASA Astrophysics Data System (ADS)
Marchionda, Elisabetta; Deschamps, Rémy; Nader, Fadi H.; Ceriani, Andrea; Di Giulio, Andrea; Lawrence, David; Morad, Daniel J.
2017-04-01
The stratigraphic record of a carbonate system is the result of the interplay of several local and global factors that control the physical and biological responses within a basin. Conceptual models cannot be detailed enough to take into account all the processes that control the deposition of sediments. The key parameters controlling sedimentation can instead be investigated with stratigraphic forward models, which permit dynamic and quantitative simulations of sedimentary basin infill. This work focuses on an onshore Abu Dhabi field (UAE) and aims to provide a complete picture of the stratigraphic evolution of the Upper Jurassic Arab Formation (Fm.). We started with the definition of a field-scale conceptual depositional model of the Formation, resulting from facies and well-log analysis based on five wells. The Arab Fm. can be described as a shallow marine carbonate ramp, ranging from outer ramp deposits to supratidal/evaporitic facies associations (from bottom to top). Reconstruction of the sequence stratigraphic pattern and several paleofacies maps made it possible to suggest multiple directions of progradation at local scale. A 3D forward modelling tool was then used to i) identify and quantify the parameters controlling the geometries and facies distribution of the Arab Fm.; ii) predict the stratigraphic architecture of the Arab Fm.; and iii) integrate and validate the conceptual model. Numerous constraints were set during the different simulations, and sensitivity analyses were performed testing carbonate production, eustatic oscillations and transport parameters. To verify geological consistency, the 3D forward model was calibrated against the available control points (five wells) in terms of thickness and facies distribution.
NASA Astrophysics Data System (ADS)
Versini, Pierre-Antoine; Tchiguirinskaia, Ioulia; Schertzer, Daniel
2017-04-01
Green roofs are commonly considered efficient tools to mitigate urban runoff, as they can store precipitation and consequently provide retention and detention performance. Designed as a compromise between water holding capacity, weight and hydraulic conductivity, their substrate is usually an artificial medium differing significantly from a traditional soil. Many models have been developed to assess the hydrological performance of green roofs. Classified into two categories (conceptual and physically based), they are usually applied to reproduce the discharge of a particular monitored green roof considered homogeneous. Although the resulting simulations can be satisfactory, the robustness and consistency of the calibrated parameters are often not addressed. Here, a modeling framework has been developed to assess the efficiency and robustness of both modelling approaches (conceptual and physically based) in reproducing green roof hydrological behaviour; the SWMM and VS2DT models have been used for this purpose. This work also benefits from an experimental setup in which several green roofs, differentiated by their substrate thickness and vegetation cover, are monitored. Based on the data collected for several rainfall events, we studied how the calibrated parameters are effectively linked to physical properties and how they vary from one green roof configuration to another. Although both models reproduce the observed discharges correctly in most cases, their calibrated parameters exhibit a high inconsistency. For a given green roof configuration, these parameters can vary significantly from one rainfall event to another, even though they are supposed to be linked to green roof characteristics (roughness or residual moisture content, for instance). They can also differ from one green roof configuration to another although the implemented substrate is the same. Finally, it appears very difficult to find any relationship between the calibrated parameters supposed to represent similar characteristics in the two models (porosity, hydraulic conductivity). These results illustrate the difficulty of reproducing the hydrological behaviour of the artificial media constituting green roof substrates, and they justify the development of new methods able to take into account, for instance, the spatial heterogeneity of the substrate.
Performance of a distributed semi-conceptual hydrological model under tropical watershed conditions
USDA-ARS?s Scientific Manuscript database
Many hydrologic models have been developed to help manage natural resources all over the world. Nevertheless, most models are highly complex in terms of database requirements, as well as the number of calibration parameters. This has resulted in serious difficulties for application in catchmen...
NASA Astrophysics Data System (ADS)
Bianchi Janetti, Emanuela; Riva, Monica; Guadagnini, Alberto
2017-04-01
We perform a variance-based global sensitivity analysis to assess the impact of the uncertainty associated with (a) the spatial distribution of hydraulic parameters, e.g., hydraulic conductivity, and (b) the conceptual model adopted to describe the system on the characterization of a regional-scale aquifer. We do so in the context of inverse modeling of the groundwater flow system. The study aquifer lies within the provinces of Bergamo and Cremona (Italy) and covers a planar extent of approximately 785 km². Analysis of available sedimentological information allows identifying a set of main geo-materials (facies/phases) which constitute the geological makeup of the subsurface system. We parameterize the conductivity field following two distinct conceptual schemes. The first is based on the representation of the aquifer as a Composite Medium: the system is composed of distinct (five, in our case) lithological units, and hydraulic properties (such as conductivity) within each unit are assumed to be uniform. The second approach assumes that the system can be modeled as a collection of media coexisting in space to form an Overlapping Continuum: each point in the domain represents a finite volume within which each of the (five) identified lithofacies can be found with a certain volumetric percentage. Groundwater flow is simulated with the numerical code MODFLOW-2005 for each of the adopted conceptual models. We then quantify the relative contribution of the considered uncertain parameters, including boundary conditions, to the total variability of the piezometric levels recorded in a set of 40 monitoring wells by relying on variance-based Sobol indices. The latter are derived numerically for the investigated settings through a model-order reduction technique based on the polynomial chaos expansion approach.
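Variance-based Sobol indices of the kind used here can be illustrated with a toy model. The sketch below assumes the SALib Python package and replaces the paper's polynomial chaos derivation with direct Saltelli sampling for brevity; the model and bounds are invented stand-ins.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Toy stand-in for the groundwater model: hydraulic head as a simple
# function of two conductivities and a boundary-condition parameter.
problem = {
    "num_vars": 3,
    "names": ["K1", "K2", "h_boundary"],
    "bounds": [[1e-5, 1e-3], [1e-6, 1e-4], [95.0, 105.0]],
}

def toy_head_model(x):
    K1, K2, hb = x
    return hb - 2.0 * np.log10(K1) + 0.5 * np.log10(K2)

X = saltelli.sample(problem, 1024)            # Saltelli sampling design
Y = np.apply_along_axis(toy_head_model, 1, X)
Si = sobol.analyze(problem, Y)                # first-order and total-order indices
print(Si["S1"], Si["ST"])
```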
Generalized Processing Tree Models: Jointly Modeling Discrete and Continuous Variables.
Heck, Daniel W; Erdfelder, Edgar; Kieslich, Pascal J
2018-05-24
Multinomial processing tree models assume that discrete cognitive states determine observed response frequencies. Generalized processing tree (GPT) models extend this conceptual framework to continuous variables such as response times, process-tracing measures, or neurophysiological variables. GPT models assume finite-mixture distributions, with weights determined by a processing tree structure, and continuous components modeled by parameterized distributions such as Gaussians with separate or shared parameters across states. We discuss identifiability, parameter estimation, model testing, a modeling syntax, and the improved precision of GPT estimates. Finally, a GPT version of the feature comparison model of semantic categorization is applied to computer-mouse trajectories.
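The finite-mixture structure of a GPT model can be written down directly for the simplest case. The sketch below, a two-state example with Gaussian response-time components and a single tree-determined branching probability, is illustrative and not the authors' software.

```python
import numpy as np
from scipy.stats import norm

def gpt_loglik(rt, p_state, mu, sigma):
    """Log-likelihood of response times under a two-state generalized
    processing tree: a finite mixture whose weights come from the tree
    (here a single branching probability p_state) and whose components
    are Gaussians with state-specific parameters."""
    dens = p_state * norm.pdf(rt, mu[0], sigma[0]) \
         + (1 - p_state) * norm.pdf(rt, mu[1], sigma[1])
    return np.sum(np.log(dens))

# Illustrative data: fast responses from one cognitive state, slow from the other.
rng = np.random.default_rng(0)
rt = np.concatenate([rng.normal(0.6, 0.1, 70), rng.normal(1.2, 0.2, 30)])
ll = gpt_loglik(rt, p_state=0.7, mu=(0.6, 1.2), sigma=(0.1, 0.2))
```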
Conceptual Study of Rotary-Wing Microrobotics
2008-03-27
[Extraction residue from the report's front matter; no abstract recovered. Surviving fragments: Table 8, "Wing-T design parameters compared to Tsuzuki's recommendations"; Table 13, "Summary of key parameters for a feasible rotary-wing MEMS robot design"; and an acronym list (DMFC: Direct Methanol Fuel Cell; DOF: Degrees of Freedom; DRIE: Deep Reactive Ion Etch; FEA: Finite Element Analysis; FEM: Finite Element Modeling; FOM: Figure of Merit).]
The ACTIVE conceptual framework as a structural equation model.
Gross, Alden L; Payne, Brennan R; Casanova, Ramon; Davoudzadeh, Pega; Dzierzewski, Joseph M; Farias, Sarah; Giovannetti, Tania; Ip, Edward H; Marsiske, Michael; Rebok, George W; Schaie, K Warner; Thomas, Kelsey; Willis, Sherry; Jones, Richard N
2018-01-01
Background/Study Context: Conceptual frameworks are analytic models at a high level of abstraction. Their operationalization can inform randomized trial design and sample size considerations. The Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) conceptual framework was empirically tested using structural equation modeling (N=2,802). ACTIVE was guided by a conceptual framework for cognitive training in which proximal cognitive abilities (memory, inductive reasoning, speed of processing) mediate treatment-related improvement in primary outcomes (everyday problem-solving, difficulty with activities of daily living, everyday speed, driving difficulty), which in turn lead to improved secondary outcomes (health-related quality of life, health service utilization, mobility). Measurement models for each proximal, primary, and secondary outcome were developed and tested using baseline data. Each construct was then combined in one model to evaluate fit (RMSEA, CFI, normalized residuals of each indicator). To expand the conceptual model and potentially inform future trials, evidence of modification of structural model parameters was evaluated by age, years of education, sex, race, and self-rated health status. Preconceived measurement models for memory, reasoning, speed of processing, everyday problem-solving, instrumental activities of daily living (IADL) difficulty, everyday speed, driving difficulty, and health-related quality of life each fit well to the data (all RMSEA < .05; all CFI > .95). Fit of the full model was excellent (RMSEA = .038; CFI = .924). In contrast with previous findings from ACTIVE regarding who benefits from training, interaction testing revealed associations between proximal abilities and primary outcomes are stronger on average by nonwhite race, worse health, older age, and less education (p < .005). Empirical data confirm the hypothesized ACTIVE conceptual model. Findings suggest that the types of people who show intervention effects on cognitive performance potentially may be different from those with the greatest chance of transfer to real-world activities.
NASA Astrophysics Data System (ADS)
Noh, Seong Jin; Tachikawa, Yasuto; Shiiba, Michiharu; Kim, Sunmin
Applications of data assimilation techniques have been widely used to improve the predictability of hydrologic modeling. Among the various data assimilation techniques, sequential Monte Carlo (SMC) filters, known as "particle filters," provide the capability to handle non-linear and non-Gaussian state-space models. This paper proposes a dual state-parameter updating scheme (DUS) based on SMC methods to estimate both the state and parameter variables of a hydrologic model. We introduce a kernel smoothing method for the robust estimation of uncertain model parameters in the DUS. The applicability of the dual updating scheme is illustrated by implementing the storage function model on a middle-sized Japanese catchment. We also compare the performance of DUS combined with various SMC methods, such as SIR, ASIR and RPF.
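A schematic reading of the dual updating scheme: one filter step that jitters parameter particles with a kernel-smoothing shrinkage rule (in the spirit of Liu and West, 2001), propagates state particles, and resamples both by observation likelihood. Function names and the shrinkage constant are assumptions, not the authors' implementation.

```python
import numpy as np

def dus_step(states, params, obs, f, h, obs_std, a=0.98, rng=None):
    """One dual state-parameter update with an SIR particle filter.
    states, params: (N,) particle arrays; f(state, param): state transition;
    h(states): observation operator (assumed vectorized); obs: scalar
    observation. Illustrative sketch only."""
    rng = rng or np.random.default_rng()
    # Kernel smoothing of parameters: shrink toward the mean, add jitter
    # so the ensemble mean and variance are approximately preserved.
    m = params.mean()
    params = (a * params + (1 - a) * m
              + np.sqrt(1 - a ** 2) * params.std() * rng.standard_normal(params.size))
    # Propagate states with each particle's own parameter value.
    states = np.array([f(s, p) for s, p in zip(states, params)])
    # Weight by Gaussian observation likelihood, then resample jointly.
    w = np.exp(-0.5 * ((obs - h(states)) / obs_std) ** 2)
    w /= w.sum()
    idx = rng.choice(states.size, size=states.size, p=w)
    return states[idx], params[idx]
```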
NASA Astrophysics Data System (ADS)
Knoben, Wouter; Woods, Ross; Freer, Jim
2016-04-01
Conceptual hydrologic models are particular arrangements of stores, fluxes and transformation functions that represent a catchment's spatial and temporal dynamics, depending on the modeller's choices and intended use. They have the advantages of being computationally efficient, of being relatively easy to reconfigure, and of having relatively low input data demands. This makes them well suited for large-scale and large-sample hydrology, where appropriately representing the dominant hydrologic functions of a catchment is a main concern. Given these requirements, the number of parameters in the model cannot be too high, to avoid equifinality and identifiability issues, which limits the number and complexity of dominant hydrologic processes the model can represent. Specific purposes and places thus require specific models, and this has led to an abundance of conceptual hydrologic models. No structured overview of these models exists, and there is no clear method for selecting appropriate model structures for different catchments. This study is a first step towards creating an overview of the elements that make up conceptual models, which may later assist a modeller in finding an appropriate model structure for a given catchment. To this end, the study brings together over 30 past and present conceptual models. The reviewed model structures are simply different configurations of three basic model elements (stores, fluxes and transformation functions), depending on the hydrologic processes the models are intended to represent. Differences also exist in the inner workings of the stores, fluxes and transformations, i.e. the mathematical formulations that describe each model element's intended behaviour. We investigate the hypothesis that different model structures can produce similar behavioural simulations. This can clarify the overview of model elements by grouping elements which are similar, which can improve model structure selection.
Conceptual Model Development for Sea Turtle Nesting Habitat: Support for USACE Navigation Projects
2015-08-01
regional values. • Beach Width: The width of the beach (m) defines the region from the shoreline to the dune toe. Loggerhead turtles tend to prefer...primary drivers of the model parameters. • Beach Elevation: Beach elevation (m) is measured from the shoreline to the dune toe. Elevation influences...mapping, and morphological features in combination with imagery-derived environmental parameters (i.e., dune vegetation) have not been attempted
NASA Astrophysics Data System (ADS)
Schoups, G.; Vrugt, J. A.; Fenicia, F.; van de Giesen, N. C.
2010-10-01
Conceptual rainfall-runoff models have traditionally been applied without paying much attention to numerical errors induced by temporal integration of water balance dynamics. Reliance on first-order, explicit, fixed-step integration methods leads to computationally cheap simulation models that are easy to implement. Computational speed is especially desirable for estimating parameter and predictive uncertainty using Markov chain Monte Carlo (MCMC) methods. Confirming earlier work of Kavetski et al. (2003), we show here that the computational speed of first-order, explicit, fixed-step integration methods comes at a cost: for a case study with a spatially lumped conceptual rainfall-runoff model, it introduces artificial bimodality in the marginal posterior parameter distributions, which is not present in numerically accurate implementations of the same model. The resulting effects on MCMC simulation include (1) inconsistent estimates of posterior parameter and predictive distributions, (2) poor performance and slow convergence of the MCMC algorithm, and (3) unreliable convergence diagnosis using the Gelman-Rubin statistic. We studied several alternative numerical implementations to remedy these problems, including various adaptive-step finite difference schemes and an operator splitting method. Our results show that adaptive-step, second-order methods, based on either explicit finite differencing or operator splitting with analytical integration, provide the best alternative for accurate and efficient MCMC simulation. Fixed-step or adaptive-step implicit methods may also be used for increased accuracy, but they cannot match the efficiency of adaptive-step explicit finite differencing or operator splitting. Of the latter two, explicit finite differencing is more generally applicable and is preferred if the individual hydrologic flux laws cannot be integrated analytically, as the splitting method then loses its advantage.
A constrained rasch model of trace redintegration in serial recall.
Roodenrys, Steven; Miller, Leonie M
2008-04-01
The notion that verbal short-term memory tasks, such as serial recall, make use of information in long-term as well as in short-term memory is instantiated in many models of these tasks. Such models incorporate a process in which degraded traces retrieved from a short-term store are reconstructed, or redintegrated (Schweickert, 1993), through the use of information in long-term memory. This article presents a conceptual and mathematical model of this process based on a class of item-response theory models. It is demonstrated that this model provides a better fit to three sets of data than does the multinomial processing tree model of redintegration (Schweickert, 1993) and that a number of conceptual accounts of serial recall can be related to the parameters of the model.
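At its core, an item-response formulation of redintegration makes the probability of successful trace reconstruction logistic in the difference between trace quality and item difficulty. A schematic sketch follows; the published model adds constraints not shown here.

```python
import numpy as np

def rasch_redintegration_prob(trace_quality, item_difficulty):
    """Rasch-type probability that a degraded trace is successfully
    redintegrated: logistic in (trace quality - item difficulty).
    Schematic only; parameter names are illustrative."""
    return 1.0 / (1.0 + np.exp(-(trace_quality - item_difficulty)))

# A high-quality trace of a well-supported (easy) item is almost always
# reconstructed; a degraded trace of a hard item rarely is.
print(rasch_redintegration_prob(2.0, -1.0))   # ~0.95
print(rasch_redintegration_prob(-1.0, 1.0))   # ~0.12
```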
Macquarrie, K T B; Mayer, K U; Jin, B; Spiessl, S M
2010-03-01
Redox evolution in sparsely fractured crystalline rocks is a key, and largely unresolved, issue when assessing the geochemical suitability of deep geological repositories for nuclear waste. Redox zonation created by the influx of oxygenated waters has previously been simulated using reactive transport models that have incorporated a variety of processes, resulting in predictions for the depth of oxygen penetration that may vary greatly. An assessment and direct comparison of the various underlying conceptual models are therefore needed. In this work a reactive transport model that considers multiple processes in an integrated manner is used to investigate the ingress of oxygen for both single fracture and fracture zone scenarios. It is shown that the depth of dissolved oxygen migration is greatly influenced by the a priori assumptions made in the conceptual models. For example, the ability of oxygen to access and react with minerals in the rock matrix may be of paramount importance for single fracture conceptual models. For fracture zone systems, the abundance and reactivity of minerals within the fractures and the thin matrix slabs between the fractures appear to provide the key controls on O2 attenuation. The findings point to the need for improved understanding of the coupling between the key transport-reaction feedbacks, to determine which conceptual models are most suitable and to provide guidance on which parameters should be targeted in field and laboratory investigations.
While there is a high potential for exposure of humans and ecosystems to chemicals released from hazardous waste sites, the degree to which this potential is realized is often uncertain. Conceptually divided among parameter, model, and modeler uncertainties imparted during simula...
NASA Astrophysics Data System (ADS)
Garavaglia, F.; Seyve, E.; Gottardi, F.; Le Lay, M.; Gailhard, J.; Garçon, R.
2014-12-01
MORDOR is a conceptual hydrological model extensively used in Électricité de France (EDF, the French electric utility company) operational applications: (i) hydrological forecasting, (ii) flood risk assessment, (iii) water balance and (iv) climate change studies. MORDOR is a lumped, reservoir-based, elevation-based model with hourly or daily areal rainfall and air temperature as the driving input data. The principal hydrological processes represented are evapotranspiration, direct and indirect runoff, groundwater, snow accumulation and melt, and routing. The model has been used intensively at EDF for more than 20 years, in particular for modeling French mountainous watersheds. For parameter calibration we propose and test alternative multi-criteria techniques based on two specific approaches: automatic calibration using single-objective functions, and a priori parameter calibration founded on hydrological watershed features. The automatic calibration approach uses single-objective functions, based on the Kling-Gupta efficiency, to quantify the agreement between simulated and observed runoff, focusing on four different runoff samples: (i) the time series itself, (ii) the annual hydrological regime, (iii) monthly cumulative distribution functions and (iv) recession sequences. The primary purpose of this study is to analyze the definition and sensitivity of MORDOR parameters by testing different calibration techniques in order to: (i) simplify the model structure, (ii) increase the calibration-validation performance of the model and (iii) reduce the equifinality problem of the calibration process. We propose an alternative calibration strategy that reaches these goals. The analysis is illustrated by calibrating the MORDOR model to daily data for 50 watersheds located in French mountainous regions.
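The Kling-Gupta efficiency underlying the single-objective functions has a standard closed form (Gupta et al., 2009); a straightforward implementation:

```python
import numpy as np

def kling_gupta_efficiency(sim, obs):
    """KGE = 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2), where r is the
    linear correlation between simulated and observed runoff, alpha the
    ratio of their standard deviations, and beta the ratio of their means."""
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = np.std(sim) / np.std(obs)
    beta = np.mean(sim) / np.mean(obs)
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)
```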
NASA Astrophysics Data System (ADS)
Guerrero, J.; Halldin, S.; Xu, C.; Lundin, L.
2011-12-01
Distributed hydrological models are important tools in water management as they account for the spatial variability of hydrological data and can produce spatially distributed outputs; they can directly incorporate and assess potential changes in the characteristics of our basins. A recognized problem for models in general is equifinality, which is only exacerbated for distributed models, which tend to have a large number of parameters. We need to deal with the fundamentally ill-posed nature of the problem such models force us to face: a large number of parameters and very few variables that can be used to constrain them, often only the catchment discharge. There is a growing but still limited literature showing how the internal states of a distributed model can be used to calibrate and validate its predictions. In this paper, a distributed version of WASMOD, a conceptual rainfall-runoff model with only three parameters, combined with a routing algorithm based on the high-resolution HydroSHEDS data, was used to simulate the discharge in the Paso La Ceiba basin in Honduras. The parameter space was explored using Monte Carlo simulations, and the region containing the parameter sets considered behavioural according to two different criteria was delimited using the geometric concept of alpha-shapes. Discharge data from five internal sub-basins were used to aid in the calibration of the model and to answer the following questions: Can this information improve the simulations at the outlet of the catchment, or decrease their uncertainty? And, after reducing the number of model parameters needing calibration through sensitivity analysis: Is it possible to relate them to basin characteristics? The analysis revealed that in most cases the internal discharge data can be used to reduce the uncertainty in the discharge at the outlet, albeit with little improvement in the overall simulation results.
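The Monte Carlo exploration step can be sketched as a GLUE-style behavioural filter: sample parameter sets uniformly, simulate, and retain the sets passing an efficiency threshold. The paper then delimits the retained region with alpha-shapes, a geometric step omitted here; the threshold and bounds below are illustrative.

```python
import numpy as np

def behavioural_sets(simulate, obs, bounds, n=10000, threshold=0.7, rng=None):
    """Sample parameter sets uniformly within bounds, run the model, and
    keep the sets whose Nash-Sutcliffe efficiency exceeds a behavioural
    threshold. `simulate(theta)` must return a series comparable to `obs`."""
    rng = rng or np.random.default_rng()
    lo, hi = np.array(bounds).T
    keep = []
    for _ in range(n):
        theta = rng.uniform(lo, hi)
        sim = simulate(theta)
        nse = 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
        if nse >= threshold:
            keep.append(theta)
    return np.array(keep)  # the behavioural region, to be delimited afterwards
```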
NASA Astrophysics Data System (ADS)
Sivapalan, Murugesu; Ruprecht, John K.; Viney, Neil R.
1996-03-01
A long-term water balance model has been developed to predict the hydrological effects of land-use change (especially forest clearing) in small experimental catchments in the south-west of Western Australia. This small catchment model has been used as the building block for the development of a large catchment-scale model, and has also formed the basis for a coupled water and salt balance model, developed to predict the changes in stream salinity resulting from land-use and climate change. The application of the coupled salt and water balance model to predict stream salinities in two small experimental catchments, and the application of the large catchment-scale model to predict changes in water yield in a medium-sized catchment that is being mined for bauxite, are presented in Parts 2 and 3, respectively, of this series of papers. The small catchment model has been designed as a simple, robust, conceptually based model of the basic daily water balance fluxes in forested catchments. The responses of the catchment to rainfall and pan evaporation are conceptualized in terms of three interdependent subsurface stores A, B and F. Store A depicts a near-stream perched aquifer system; B represents a deeper, permanent groundwater system; and F is an intermediate, unsaturated infiltration store. The responses of these stores are characterized by a set of constitutive relations which involves a number of conceptual parameters. These parameters are estimated by calibration by comparing observed and predicted runoff. The model has performed very well in simulations carried out on Salmon and Wights, two small experimental catchments in the Collie River basin in south-west Western Australia. The results from the application of the model to these small catchments are presented in this paper.
Wei Wu; James Clark; James Vose
2010-01-01
Hierarchical Bayesian (HB) modeling allows for multiple sources of uncertainty by factoring complex relationships into conditional distributions that can be used to draw inference and make predictions. We applied an HB model to estimate the parameters and state variables of a parsimonious hydrological model (GR4J) by coherently assimilating the uncertainties from the...
NASA Astrophysics Data System (ADS)
Steinschneider, S.; Wi, S.; Brown, C. M.
2013-12-01
Flood risk management performance is investigated within the context of integrated climate and hydrologic modeling uncertainty to explore system robustness. The research question investigated is whether structural and hydrologic parameterization uncertainties are significant relative to other uncertainties such as climate change when considering water resources system performance. Two hydrologic models are considered, a conceptual, lumped parameter model that preserves the water balance and a physically-based model that preserves both water and energy balances. In the conceptual model, parameter and structural uncertainties are quantified and propagated through the analysis using a Bayesian modeling framework with an innovative error model. Mean climate changes and internal climate variability are explored using an ensemble of simulations from a stochastic weather generator. The approach presented can be used to quantify the sensitivity of flood protection adequacy to different sources of uncertainty in the climate and hydrologic system, enabling the identification of robust projects that maintain adequate performance despite the uncertainties. The method is demonstrated in a case study for the Coralville Reservoir on the Iowa River, where increased flooding over the past several decades has raised questions about potential impacts of climate change on flood protection adequacy.
MCFire model technical description
David R. Conklin; James M. Lenihan; Dominique Bachelet; Ronald P. Neilson; John B. Kim
2016-01-01
MCFire is a computer program that simulates the occurrence and effects of wildfire on natural vegetation, as a submodel within the MC1 dynamic global vegetation model. This report is a technical description of the algorithms and parameter values used in MCFire, intended to encapsulate its design and features at a level that is more conceptual than the level...
Model Calibration in Watershed Hydrology
NASA Technical Reports Server (NTRS)
Yilmaz, Koray K.; Vrugt, Jasper A.; Gupta, Hoshin V.; Sorooshian, Soroosh
2009-01-01
Hydrologic models use relatively simple mathematical equations to conceptualize and aggregate the complex, spatially distributed, and highly interrelated water, energy, and vegetation processes in a watershed. A consequence of process aggregation is that the model parameters often do not represent directly measurable entities and must, therefore, be estimated using measurements of the system inputs and outputs. During this process, known as model calibration, the parameters are adjusted so that the behavior of the model approximates, as closely and consistently as possible, the observed response of the hydrologic system over some historical period of time. This Chapter reviews the current state-of-the-art of model calibration in watershed hydrology with special emphasis on our own contributions in the last few decades. We discuss the historical background that has led to current perspectives, and review different approaches for manual and automatic single- and multi-objective parameter estimation. In particular, we highlight the recent developments in the calibration of distributed hydrologic models using parameter dimensionality reduction sampling, parameter regularization and parallel computing.
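In its simplest automatic form, the calibration loop the chapter reviews adjusts parameters to minimize a measure of disagreement between simulated and observed responses. A minimal sketch using a local optimizer on a toy one-parameter model (all names illustrative):

```python
import numpy as np
from scipy.optimize import minimize

def calibrate(model, params0, forcing, observed):
    """Classical automatic calibration: adjust parameters so the simulated
    output matches the observed response, here by minimizing the sum of
    squared errors with a derivative-free local optimizer."""
    def objective(theta):
        sim = model(theta, forcing)
        return np.sum((observed - sim) ** 2)
    return minimize(objective, params0, method="Nelder-Mead")

# Toy "watershed": runoff = c * rain with true c = 0.42.
rain = np.random.default_rng(2).gamma(1.0, 5.0, 200)
obs = 0.42 * rain
res = calibrate(lambda th, x: th[0] * x, np.array([0.1]), rain, obs)
# res.x recovers approximately [0.42]
```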
Hydrological model parameter dimensionality is a weak measure of prediction uncertainty
NASA Astrophysics Data System (ADS)
Pande, S.; Arkesteijn, L.; Savenije, H.; Bastidas, L. A.
2015-04-01
This paper shows that the instability of a hydrological system representation in response to different pieces of information, and the associated prediction uncertainty, are functions of model complexity. After demonstrating the connection between unstable model representation and model complexity, complexity is analyzed step by step, by measuring differences between simulations of a model under different realizations of input forcings. Algorithms are then suggested to estimate model complexity. Model complexities of two model structures, SAC-SMA (Sacramento Soil Moisture Accounting) and its simplified version SIXPAR (Six Parameter Model), are computed on resampled input data sets from basins that span the continental US. The model complexities for SIXPAR are estimated for various parameter ranges. It is shown that the complexity of SIXPAR increases with lower storage capacity and/or higher recession coefficients. Thus a conceptually simple model structure, such as SIXPAR, can be more complex than an intuitively more complex model structure, such as SAC-SMA, for certain parameter ranges. We therefore contend that the magnitudes of feasible model parameters influence the complexity of the model selection problem just as parameter dimensionality (the number of parameters) does, and that parameter dimensionality alone is an incomplete indicator of the stability of hydrological model selection and prediction problems.
ERIC Educational Resources Information Center
Wright, John C.; And Others
A conceptual model of how children process televised information was developed with the goal of identifying those parameters of the process that are both measurable and manipulable in research settings. The model presented accommodates the nature of information processing both by the child and by the presentation by the medium. Presentation is…
Mohammadi, Mohammad Hossein; Vanclooster, Marnik
2012-05-01
Solute transport in partially saturated soils is largely affected by the fluid velocity distribution and the pore size distribution within the solute transport domain. Hence, it is possible to describe the solute transport process in terms of the pore size distribution of the soil, and indirectly in terms of the soil hydraulic properties. In this paper, we present a conceptual approach that allows predicting the parameters of the Convective Lognormal Transfer (CLT) model from knowledge of soil moisture and the Soil Moisture Characteristic (SMC), parameterized by means of the closed-form model of Kosugi (1996). It is assumed that in partially saturated conditions the air-filled pore volume acts as an inert solid phase, allowing the use of the pragmatic approach of Arya et al. (1999) to estimate solute travel time statistics from the saturation degree and SMC parameters. The approach is evaluated using a set of partially saturated transport experiments presented by Mohammadi and Vanclooster (2011). Experimental results showed that the mean solute travel time, μ(t), increases proportionally with depth (travel distance) and decreases with flow rate. The variance of solute travel time, σ²(t), first decreases with flow rate up to 0.4-0.6 Ks and subsequently increases. For all tested breakthrough curves, solute transport predicted with μ(t) estimated from the conceptual model performed much better than predictions with μ(t) and σ²(t) estimated from calibration of solute transport at shallow soil depths. The use of μ(t) estimated from the conceptual model therefore increases the robustness of the CLT model in predicting solute transport in heterogeneous soils at greater depths. Given that reasonable indirect estimates of the SMC can be made from basic soil properties using pedotransfer functions, the presented approach may be useful for predicting solute transport at field or watershed scales.
Majnarić-Trtica, Ljiljana; Vitale, Branko
2011-10-01
To introduce systems biology as a conceptual framework for research in family medicine, based on empirical data from a case study on the prediction of influenza vaccination outcomes. This concept is primarily oriented towards planning preventive interventions and includes systematic data recording, a multi-step research protocol and predictive modelling. Factors known to affect responses to influenza vaccination include older age, past exposure to influenza viruses, and chronic diseases; however, constructing useful prediction models remains a challenge, because of the need to identify health parameters that are appropriate for general use in modelling patients' responses. The sample consisted of 93 patients aged 50-89 years (median 69), with multiple medical conditions, who were vaccinated against influenza. Literature searches identified potentially predictive health-related parameters, including age, gender, diagnoses of the main chronic ageing diseases, anthropometric measures, and haematological and biochemical tests. By applying data mining algorithms, patterns were identified in the data set. Candidate health parameters, selected in this way, were then combined with information on past influenza virus exposure to build the prediction model using logistic regression. A highly significant prediction model was obtained, indicating that by using a systems biology approach it is possible to answer unresolved complex medical uncertainties. Adopting this systems biology approach can be expected to be useful in identifying the most appropriate target groups for other preventive programmes.
Aircraft Conceptual Design Using Vehicle Sketch Pad
NASA Technical Reports Server (NTRS)
Fredericks, William J.; Antcliff, Kevin R.; Costa, Guillermo; Deshpande, Nachiket; Moore, Mark D.; Miguel, Edric A. San; Snyder, Alison N.
2010-01-01
Vehicle Sketch Pad (VSP) is a parametric geometry modeling tool that is intended for use in the conceptual design of aircraft. The intent of this software is to rapidly model aircraft configurations without expending the expertise and time that is typically required for modeling with traditional Computer Aided Design (CAD) packages. VSP accomplishes this by using parametrically defined components, such as a wing that is defined by span, area, sweep, taper ratio, thickness to chord, and so on. During this phase of frequent design builds, changes to the model can be rapidly visualized along with the internal volumetric layout. Using this geometry-based approach, parameters such as wetted areas and chord lengths can be easily extracted for rapid external performance analyses, such as a parasite drag buildup. At the completion of the conceptual design phase, VSP can export its geometry to higher fidelity tools. This geometry tool was developed by NASA and is freely available to U.S. companies and universities. It has become integral to conceptual design in the Aeronautics Systems Analysis Branch (ASAB) here at NASA Langley Research Center and is currently being used at over 100 universities, aerospace companies, and other government agencies. This paper focuses on the use of VSP in recent NASA conceptual design studies to facilitate geometry-centered design methodology. Such a process is shown to promote greater levels of creativity, more rapid assessment of critical design issues, and improved ability to quickly interact with higher order analyses. A number of VSP vehicle model examples are compared to CAD-based conceptual design from a designer perspective; comparisons are also made of the time and expertise required to build the geometry representations.
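As an illustration of the parametric-component idea, the sketch below derives basic wing geometry and an approximate wetted area from the kinds of parameters VSP exposes. The relations are generic conceptual-design formulas (the wetted-area factor follows a common textbook approximation), not VSP's internal code, and the input values are illustrative.

```python
# Hypothetical sketch: deriving wing geometry from parametric inputs, in the
# spirit of a VSP-style component definition. Values are illustrative.

def wing_geometry(span, area, taper, t_over_c):
    """Chords, aspect ratio, and approximate wetted area of a trapezoidal wing."""
    aspect_ratio = span ** 2 / area
    c_root = 2.0 * area / (span * (1.0 + taper))
    c_tip = taper * c_root
    # mean aerodynamic chord of a trapezoidal planform
    mac = (2.0 / 3.0) * c_root * (1.0 + taper + taper ** 2) / (1.0 + taper)
    # common conceptual-design approximation for wetted area (valid for t/c > 0.05)
    s_wet = (1.977 + 0.52 * t_over_c) * area
    return aspect_ratio, c_root, c_tip, mac, s_wet

ar, cr, ct, mac, sw = wing_geometry(span=10.0, area=16.0, taper=0.45, t_over_c=0.12)
print(f"AR={ar:.2f}  c_root={cr:.2f} m  c_tip={ct:.2f} m  MAC={mac:.2f} m  S_wet={sw:.1f} m^2")
```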
Probabilistic graphs as a conceptual and computational tool in hydrology and water management
NASA Astrophysics Data System (ADS)
Schoups, Gerrit
2014-05-01
Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
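To make the first two components concrete, here is a toy sketch assuming a hypothetical one-parameter recharge model with a single noisy runoff observation; the model form, names, and numbers are illustrative, not taken from the presentation. The joint distribution is the product of two factors, a prior and a likelihood, and the posterior follows from the basic rules of probability.

```python
# Minimal sketch of the probabilistic-graph idea on a toy hydrological model:
# two factors, p(R) and p(Q | R), multiplied and normalized by exact enumeration.
import numpy as np

R = np.linspace(0.0, 10.0, 101)                  # candidate recharge values (mm/day)
prior = np.exp(-0.5 * ((R - 5.0) / 2.0) ** 2)    # Gaussian prior factor
prior /= prior.sum()

def likelihood(q_obs, R, sigma=1.0):
    """Factor p(Q | R): runoff modeled as 0.6*R plus Gaussian noise (assumed)."""
    return np.exp(-0.5 * ((q_obs - 0.6 * R) / sigma) ** 2)

q_obs = 4.2                                      # hypothetical observation
posterior = prior * likelihood(q_obs, R)         # multiply factors along the graph
posterior /= posterior.sum()                     # normalize

print("posterior mean recharge:", (R * posterior).sum())
```

With many variables, the same factor products are evaluated by general-purpose message-passing algorithms that exploit the graph structure rather than by full enumeration, which is the third component described above.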
NASA Astrophysics Data System (ADS)
Francés, Alain P.; Lubczynski, Maciek W.; Roy, Jean; Santos, Fernando A. M.; Mahmoudzadeh Ardekani, Mohammad R.
2014-11-01
Hard rock aquifers are highly heterogeneous and hydrogeologically complex. To contribute to the design of hydrogeological conceptual models of hard rock aquifers, we propose a multi-technique methodology based on a downward approach that combines remote sensing (RS), non-invasive hydrogeophysics and hydrogeological field data acquisition. The proposed methodology is particularly suitable for data-scarce areas. It was applied in the pilot research area of the Sardón catchment (80 km2) located west of Salamanca (Spain). The area was selected because of its hard-rock hydrogeology, semi-arid climate and scarcity of groundwater resources. The proposed methodology consisted of three main steps. First, we detected the main hydrogeological features at the catchment scale by processing: (i) a high-resolution digital terrain model to map lineaments and to outline fault zones; and (ii) high-resolution, multispectral satellite QuickBird and WorldView-2 images to map the outcropping granite. Second, we characterized at the local scale the hydrogeological features identified in step one with: (i) ground penetrating radar (GPR) to assess groundwater table depth, complementing the available monitoring network data; (ii) 2D electric resistivity tomography (ERT) and frequency domain electromagnetics (FDEM) to retrieve the hydrostratigraphy along selected survey transects; and (iii) magnetic resonance soundings (MRS) to retrieve the hydrostratigraphy and aquifer parameters at the selected survey sites. In the third step, we drilled 5 boreholes (25 to 48 m deep) and performed slug tests to verify the hydrogeophysical interpretation and to calibrate the MRS parameters. Finally, we compiled and integrated all acquired data to define the geometry and parameters of the Sardón aquifer at the catchment scale. In line with a general conceptual model of hard rock aquifers, we identified two main hydrostratigraphic layers: a saprolite layer and a fissured layer. Both layers were intersected and drained by fault zones that control the hydrogeology of the catchment. The spatial discontinuities of the saprolite layer were well defined by RS techniques, while the subsurface geometry and aquifer parameters were defined by hydrogeophysics. The GPR method was able to detect a shallow water table at depths between 1 and 3 m b.g.s. The hydrostratigraphy and parameterization of the fissured layer remained uncertain because the ERT and FDEM geophysical methods were quantitatively inconclusive, while MRS detectability was restricted by low volumetric water content. The proposed multi-technique methodology, integrating cost-efficient RS, hydrogeophysics and hydrogeological field investigations, allowed us to characterize the geometry and parameters of the Sardón hard rock aquifer system, facilitating the design of a hydrogeological conceptual model of the area.
A General and Flexible Approach to Estimating the Social Relations Model Using Bayesian Methods
ERIC Educational Resources Information Center
Ludtke, Oliver; Robitzsch, Alexander; Kenny, David A.; Trautwein, Ulrich
2013-01-01
The social relations model (SRM) is a conceptual, methodological, and analytical approach that is widely used to examine dyadic behaviors and interpersonal perception within groups. This article introduces a general and flexible approach to estimating the parameters of the SRM that is based on Bayesian methods using Markov chain Monte Carlo…
NASA Astrophysics Data System (ADS)
Zoccarato, C.; Baù, D.; Bottazzi, F.; Ferronato, M.; Gambolati, G.; Mantica, S.; Teatini, P.
2016-10-01
The geomechanical analysis of a highly compartmentalized reservoir is performed to simulate the seafloor subsidence due to gas production. The available observations over the hydrocarbon reservoir consist of bathymetric surveys carried out before and at the end of a 10-yr production life. The main goal is the calibration of the reservoir compressibility cM, that is, the main geomechanical parameter controlling the surface response. Two conceptual models are considered: in one (i) cM varies only with the depth and the vertical effective stress (heterogeneity due to lithostratigraphic variability); in another (ii) cM varies also in the horizontal plane, that is, it is spatially distributed within the reservoir stratigraphic units. The latter hypothesis accounts for a possible partitioning of the reservoir due to the presence of sealing faults and thrusts, which suggests the idea of a block-heterogeneous system with the number of reservoir blocks equal to the number of uncertain parameters. The method applied here relies on an ensemble-based data assimilation (DA) algorithm (i.e. the ensemble smoother, ES), which incorporates the information from the bathymetric measurements into the geomechanical model response to infer and reduce the uncertainty of the parameter cM. The outcome from conceptual model (i) indicates that DA is effective in reducing the cM uncertainty. However, the maximum settlement still remains underestimated, while the areal extent of the subsidence bowl is overestimated. We demonstrate that selecting the heterogeneous conceptual model (ii) allows the observations to be reproduced much better, thus removing a clear bias of the model structure. DA significantly reduces the cM uncertainty in the five blocks (out of seven) characterized by large volume and large pressure decline. Conversely, the assimilation of land displacements only partially constrains the prior cM uncertainty in the reservoir blocks that contribute only marginally to the cumulative seafloor subsidence, that is, blocks with low pressure.
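As an illustration of the ES update step described above, the sketch below applies the standard ensemble smoother equations to a stand-in linear forward model with seven block compressibilities. The operator G, ensemble size, and noise levels are assumptions made for the example, not the paper's geomechanical model.

```python
# Minimal sketch of an ensemble smoother (ES) update for block-wise reservoir
# compressibility cM, assuming a linear forward operator G mapping the seven
# block compressibilities to bathymetric displacement observations.
import numpy as np

rng = np.random.default_rng(0)
n_blocks, n_obs, n_ens = 7, 20, 200
G = rng.uniform(0.0, 1.0, (n_obs, n_blocks))        # stand-in geomechanical model
cm_true = rng.uniform(0.5, 2.0, n_blocks)
d_obs = G @ cm_true + rng.normal(0.0, 0.05, n_obs)  # bathymetric "measurements"

X = rng.uniform(0.1, 3.0, (n_blocks, n_ens))        # prior cM ensemble
Y = G @ X                                           # predicted subsidence ensemble
R = 0.05 ** 2 * np.eye(n_obs)                       # observation-error covariance

Xp = X - X.mean(axis=1, keepdims=True)              # ensemble anomalies
Yp = Y - Y.mean(axis=1, keepdims=True)
Cxy = Xp @ Yp.T / (n_ens - 1)                       # parameter-prediction covariance
Cyy = Yp @ Yp.T / (n_ens - 1)                       # prediction covariance

D = d_obs[:, None] + rng.normal(0.0, 0.05, (n_obs, n_ens))  # perturbed observations
X_post = X + Cxy @ np.linalg.solve(Cyy + R, D - Y)          # ES update

print("prior std:", X.std(axis=1).round(2))
print("post  std:", X_post.std(axis=1).round(2))    # uncertainty reduction per block
```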
Thermohydrology of fractured geologic materials
NASA Astrophysics Data System (ADS)
Esh, David Whittaker
1998-11-01
Thermohydrological and thermohydrochemical modeling as applied to the disposal of radioactive materials in a geologic repository is presented. Site hydrology, chemistry, and mineralogy were summarized, and conceptual models of the fundamental system processes were developed. The numerical model TOUGH2 was used to perform computer simulations of thermohydrological processes in fractured geologic media. Sensitivity studies were conducted to investigate the impact of dimensionality and of different conceptual models for representing fractures (ECM, DK, MINC) on the thermohydrological response. Sensitivity to parameter variation within a given conceptual model was also considered. The sensitivity of the response was examined against thermohydrological metrics derived from the flow and redistribution of moisture. A simple thermohydrochemical model to investigate a three-process coupling (thermal-hydrological-chemical) was presented. The redistribution of chloride was evaluated because its chemical behavior is well known and defensible. In addition, it is very important to overall system performance. For all of the simulations completed, chloride was found to become extremely concentrated in the fluids that eventually return to the engineered barrier system. Chloride concentration and mass flux were increased from ambient by over a factor of 1000 for some simulations. Thermohydrology was found to have the potential to significantly alter chemistry from ambient conditions.
Opportunities and Challenges in Supply-Side Simulation: Physician-Based Models
Gresenz, Carole Roan; Auerbach, David I; Duarte, Fabian
2013-01-01
Objective: To provide a conceptual framework and to assess the availability of empirical data for supply-side microsimulation modeling in the context of health care. Data Sources: Multiple secondary data sources, including the American Community Survey, Health Tracking Physician Survey, and SK&A physician database. Study Design: We apply our conceptual framework to one entity in the health care market—physicians—and identify, assess, and compare data available for physician-based simulation models. Principal Findings: Our conceptual framework describes three broad types of data required for supply-side microsimulation modeling. Our assessment of available data for modeling physician behavior suggests broad comparability across various sources on several dimensions and highlights the need for significant integration of data across multiple sources to provide a platform adequate for modeling. A growing literature provides potential estimates for use as behavioral parameters that could serve as the models' engines. Sources of data for simulation modeling that account for the complex organizational and financial relationships among physicians and other supply-side entities are limited. Conclusions: A key challenge for supply-side microsimulation modeling is optimally combining available data to harness their collective power. Several possibilities also exist for novel data collection. These have the potential to serve as catalysts for the next generation of supply-side-focused simulation models to inform health policy. PMID:23347041
NASA Astrophysics Data System (ADS)
Liu, Gang; Zhao, Rong; Liu, Jiping; Zhang, Qingpu
2007-06-01
The Lancang River Basin is narrow, and its hydrological and meteorological conditions are highly variable. Rainfall, evaporation, glacial meltwater and groundwater affect the runoff, whose sources of replenishment change notably with the season in different areas of the basin. The characteristics of different kinds of distributed models and conceptual hydrological models are analyzed. A semi-distributed hydrological model relating monthly runoff to rainfall, temperature and soil type has been built for Changdu County based on Visual Basic and ArcObject. The discretization approach of distributed hydrological models was used in the model, and the principles of conceptual models were taken into account. The sub-catchment of Changdu is divided into regular cells, and all kinds of hydrological and meteorological information, together with land use classes and slope extracted from 1:250000 digital elevation models, are distributed to each cell. The model does not represent the physical rainfall-runoff process explicitly but uses the conceptual approach to simulate the total contribution to the runoff of the area. The effects of evapotranspiration losses and groundwater are taken into account at the same time. The spatial distribution characteristics of the monthly runoff in the area are simulated and analyzed with a few parameters.
Renal parameter estimates in unrestrained dogs
NASA Technical Reports Server (NTRS)
Rader, R. D.; Stevens, C. M.
1974-01-01
A mathematical formulation has been developed to describe the hemodynamic parameters of a conceptualized kidney model. The model was developed by considering regional pressure drops and regional storage capacities within the renal vasculature. Estimation of renal artery compliance, pre- and postglomerular resistance, and glomerular filtration pressure is feasible by considering mean levels and time derivatives of abdominal aortic pressure and renal artery flow. Changes in the smooth muscle tone of the renal vessels induced by exogenous angiotensin amide, acetylcholine, and by the anaesthetic agent halothane were estimated by use of the model. By employing totally implanted telemetry, the technique was applied on unrestrained dogs to measure renal resistive and compliant parameters while the dogs were being subjected to obedience training, to avoidance reaction, and to unrestrained caging.
Johnson, Raymond H.
2007-01-01
In mountain watersheds, the increased demand for clean water resources has led to an increased need for an understanding of ground water flow in alpine settings. In Prospect Gulch, located in southwestern Colorado, understanding the ground water flow system is an important first step in addressing metal loads from acid-mine drainage and acid-rock drainage in an area with historical mining. Ground water flow modeling with sensitivity analyses is presented as a general tool to guide future field data collection, which is applicable to any ground water study, including mountain watersheds. For a series of conceptual models, the observation and sensitivity capabilities of MODFLOW-2000 are used to determine composite scaled sensitivities, dimensionless scaled sensitivities, and 1% scaled sensitivity maps of hydraulic head. These sensitivities determine the most important input parameter(s) along with the location of observation data that are most useful for future model calibration. The results are generally independent of the conceptual model and indicate recharge in a high-elevation recharge zone as the most important parameter, followed by the hydraulic conductivities in all layers and recharge in the next lower-elevation zone. The most important observation data in determining these parameters are hydraulic heads at high elevations, with a depth of less than 100 m being adequate. Evaluation of a possible geologic structure with a different hydraulic conductivity than the surrounding bedrock indicates that ground water discharge to individual stream reaches has the potential to identify some of these structures. Results of these sensitivity analyses can be used to prioritize data collection in an effort to reduce the time and money spent by collecting the most relevant model calibration data.
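For reference, the scaled sensitivities named above are conventionally defined as follows (this is the standard definition used in the MODFLOW-2000 parameter-estimation documentation, not a formula quoted from the paper), where y_i is the i-th simulated value, b_j the j-th parameter, ω_i the observation weight, and ND the number of observations:

```latex
% Dimensionless scaled sensitivity of observation i to parameter j:
dss_{ij} = \left(\frac{\partial y_i}{\partial b_j}\right) b_j \,\omega_i^{1/2}
% Composite scaled sensitivity of parameter j over ND observations:
css_j = \left[\frac{1}{ND}\sum_{i=1}^{ND} \left(dss_{ij}\right)^2\right]^{1/2}
```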
Numerical model of water flow and solute accumulation in vertisols using HYDRUS 2D/3D code
NASA Astrophysics Data System (ADS)
Weiss, Tomáš; Dahan, Ofer; Turkeltub, Tuvia
2015-04-01
Keywords: desiccation-crack-induced salinization, preferential flow, conceptual model, numerical model, vadose zone, vertisols, soil water retention function, HYDRUS 2D/3D. Vertisols cover a hydrologically very significant area of semi-arid regions, often one through which water infiltrates to groundwater aquifers. Understanding water flow and solute accumulation is thus very relevant to agricultural activity and water resources management. Previous works suggest a conceptual model of desiccation-crack-induced salinization, in which salinization of sediment in the deep section of the vadose zone (up to 4 m) is induced by subsurface evaporation due to convective air flow in the desiccation cracks. It suggests that the salinization is induced by the hydraulic gradient between the dry sediment in the vicinity of cracks (low potential) and the relatively wet sediment further from the main cracks (high potential). This paper presents a modification of the previously suggested conceptual model, together with a numerical model. The model uses a simple uniform flow approach but unconventionally prescribes the boundary conditions and the hydraulic parameters of the soil. The numerical model is bound to one location close to a dairy farm waste lagoon, but the application of the suggested conceptual model could possibly be extended to all semi-arid regions with vertisols. Simulations were conducted using several modeling approaches, with the ultimate goal of fitting the simulation results to the controlling variables measured in the field: temporal variation in water content across a thick layer of unsaturated clay sediment (>10 m), sediment salinity, and the salinity of the water draining down the vadose zone to the water table. The model was developed in several steps, all computed as forward solutions in a trial-and-error approach. The model suggests very deep instant infiltration of fresh water, down to 12 m, which is also supported by the field data. The paper suggests prescribing a special atmospheric boundary on the wall of the crack (so that solute can accumulate due to evaporation on the crack block wall, and infiltrating fresh water can push the solute further down); in order to do so, the HYDRUS 2D/3D code had to be modified by its developers. Unconventionally, the main fitting parameters were the parameters a and n of the soil water retention curve and the saturated hydraulic conductivity. The amount of infiltrated water (within a reasonable range), the infiltration function in the crack, and the actual evaporation from the crack were also used as secondary fitting parameters. The model supports the previous finding that a significant amount (~90%) of water from rain events must infiltrate through the crack. It was also noted that infiltration from the crack has to increase with depth and that the highest infiltration rate should occur somewhere between 1 and 3 m. This paper suggests a new way to model vertisols in semi-arid regions. It also supports previous findings about vertisols, especially the utmost importance of soil cracks as preferential pathways for water and contaminants and as deep evaporators.
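The retention-curve parameters a and n mentioned above are presumably the shape parameters of the van Genuchten model commonly used by HYDRUS; its standard closed form (stated here for orientation, not quoted from the paper) is:

```latex
% van Genuchten retention function; theta_r and theta_s are the residual and
% saturated water contents, h is suction head, and alpha ("a") and n are the
% fitted shape parameters:
S_e(h) = \frac{\theta(h)-\theta_r}{\theta_s-\theta_r}
       = \left[1 + (\alpha h)^n\right]^{-m}, \qquad m = 1 - \tfrac{1}{n}
```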
Psychological mechanisms in outdoor place and weather assessment: towards a conceptual model
NASA Astrophysics Data System (ADS)
Knez, Igor; Thorsson, Sofia; Eliasson, Ingegärd; Lindberg, Fredrik
2009-01-01
The general aim has been to illuminate the psychological mechanisms involved in outdoor place and weather assessment. This reasoning was conceptualized in a model, tentatively proposing direct and indirect links of influence in an outdoor place-human relationship. The model was subsequently tested by an empirical study, performed in a Nordic city, on the impact of weather and personal factors on participants’ perceptual and emotional estimations of outdoor urban places. In line with our predictions, we report significant influences of weather parameters (air temperature, wind, and cloudlessness) and personal factors (environmental attitude and age) on participants’ perceptual and emotional estimations of outdoor urban places. All this is a modest, yet significant, step towards an understanding of the psychology of outdoor place and weather assessment.
NASA Astrophysics Data System (ADS)
Rödiger, T.; Geyer, S.; Mallast, U.; Merz, R.; Krause, P.; Fischer, C.; Siebert, C.
2014-02-01
A key factor for sustainable management of groundwater systems is the accurate estimation of groundwater recharge. Hydrological models are common and widely used tools for such estimations. As such models need to be calibrated against measured values, the absence of adequate data can be problematic. We present a nested multi-response calibration approach for a semi-distributed hydrological model in the semi-arid catchment of Wadi al Arab in Jordan, with sparsely available runoff data. The basic idea of the calibration approach is to use diverse observations in a nested strategy, in which sub-parts of the model are calibrated to various observation data types in a consecutive manner. First, the available data sources have to be screened for their information content on processes, e.g. whether a data source contains information on mean values or on spatial or temporal variability, and whether it covers the entire catchment or only sub-catchments. In a second step, the information content has to be mapped to the relevant model components, which represent these processes. Then the data source is used to calibrate the respective subset of model parameters, while the remaining model parameters remain unchanged. This mapping is repeated for the other available data sources. In this study, the gauged spring discharge (GSD) method, flash flood observations, and data from the chloride mass balance (CMB) are used to derive plausible parameter ranges for the conceptual hydrological model J2000g. The water table fluctuation (WTF) method is used to validate the model. Results are compared with a benchmark obtained by modelling with a priori parameter values from the literature. The estimated recharge rates of the calibrated model deviate by less than ±10% from the estimates derived from the WTF method. Larger differences are visible in years with high uncertainties in the rainfall input data. The performance of the calibrated model during validation is better than that of the model with only a priori parameter values. The model with a priori parameter values from the literature tends to overestimate recharge rates by up to 30%, particularly in the wet winter of 1991/1992. An overestimation of groundwater recharge, and hence of available water resources, clearly endangers reliable water resource management in water-scarce regions. The proposed nested multi-response approach may help to better predict water resources despite data scarcity.
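For context, the chloride mass balance method referred to above estimates long-term recharge, in its simplest standard form (neglecting dry deposition and runoff chloride terms), as:

```latex
% Chloride mass balance: recharge R from mean precipitation P and the chloride
% concentrations of precipitation (Cl_P) and groundwater (Cl_{gw}):
R = P \,\frac{Cl_P}{Cl_{gw}}
```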
Using a Virtual Experiment to Analyze Infiltration Process from Point to Grid-cell Size Scale
NASA Astrophysics Data System (ADS)
Barrios, M. I.
2013-12-01
Hydrological science requires the emergence of a consistent theoretical corpus describing the relationships between dominant physical processes at different spatial and temporal scales. However, the strong spatial heterogeneities and non-linearities of these processes make the development of multiscale conceptualizations difficult. Understanding scaling is therefore a key issue in advancing this science. This work is focused on the use of virtual experiments to address the scaling of vertical infiltration from a physically based model at the point scale to a simplified, physically meaningful modeling approach at the grid-cell scale. Numerical simulations have the advantage over field experimentation of dealing with a wide range of boundary and initial conditions. The aim of the work was to show the utility of numerical simulations in discovering relationships between the hydrological parameters at both scales, and to use this synthetic experience as a medium for teaching the complex nature of this hydrological process. The Green-Ampt model was used to represent vertical infiltration at the point scale, and a conceptual storage model was employed to simulate the infiltration process at the grid-cell scale. Lognormal and beta probability distribution functions were assumed to represent the heterogeneity of soil hydraulic parameters at the point scale. The linkages between point-scale parameters and grid-cell-scale parameters were established by inverse simulations based on the mass balance equation and the averaging of the flow at the point scale. Results have shown numerical stability issues for particular conditions and have revealed the complex nature of the non-linear relationships between the models' parameters at both scales, indicating that the parameterization of point-scale processes at the coarser scale is governed by the amplification of non-linear effects. The findings of these simulations have been used by students to identify potential research questions on scale issues. Moreover, the implementation of this virtual lab improved their ability to understand the rationale of these processes and how to transfer the mathematical models to computational representations.
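A minimal sketch of the point-scale Green-Ampt computation used in such a virtual experiment is given below, assuming ponded conditions; the parameter values are illustrative, and the fixed-point solution of the implicit equation is one common numerical choice.

```python
# Minimal sketch of the point-scale Green-Ampt model: cumulative infiltration
# F(t) solves the implicit equation F - psi*dtheta*ln(1 + F/(psi*dtheta)) = Ks*t.
import numpy as np

def green_ampt_F(t, Ks, psi, dtheta, tol=1e-8):
    """Cumulative infiltration F(t), solved by fixed-point iteration."""
    c = psi * dtheta                           # suction head times moisture deficit
    F = max(Ks * t, tol)                       # initial guess
    for _ in range(100):
        F_new = Ks * t + c * np.log(1.0 + F / c)
        if abs(F_new - F) < tol:
            break
        F = F_new
    return F

Ks, psi, dtheta = 1.0, 11.0, 0.3               # cm/h, cm, (-): illustrative values
for t in (0.5, 1.0, 2.0):
    F = green_ampt_F(t, Ks, psi, dtheta)
    f = Ks * (1.0 + psi * dtheta / F)          # infiltration rate f = dF/dt
    print(f"t={t:.1f} h  F={F:.2f} cm  f={f:.2f} cm/h")
```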
How much expert knowledge is it worth to put in conceptual hydrological models?
NASA Astrophysics Data System (ADS)
Antonetti, Manuel; Zappa, Massimiliano
2017-04-01
Both modellers and experimentalists agree on using expert knowledge to improve conceptual hydrological simulations in ungauged basins. However, they use expert knowledge differently, both for hydrologically mapping the landscape and for parameterising a given hydrological model. Modellers generally use very simplified (e.g. topography-based) mapping approaches and put most of their knowledge into constraining the model by defining parameter and process relational rules. In contrast, experimentalists tend to invest all their detailed and qualitative knowledge about processes to obtain a spatial distribution of areas with different dominant runoff generation processes (DRPs) that is as realistic as possible, and to define plausible narrow value ranges for each model parameter. Since, most of the time, the modelling goal is exclusively to simulate runoff at a specific site, even strongly simplified hydrological classifications can lead to satisfying results due to the equifinality of hydrological models, overfitting problems and the numerous uncertainty sources affecting runoff simulations. Therefore, to test the extent to which expert knowledge can improve simulation results under uncertainty, we applied a typical modellers' framework, relying on parameter and process constraints defined from expert knowledge, to several catchments on the Swiss Plateau. To map the spatial distribution of the DRPs, mapping approaches with increasing involvement of expert knowledge were used. Simulation results highlighted the potential added value of using all the expert knowledge available on a catchment. Combinations of event types and landscapes where even a simplified mapping approach can lead to satisfying results were also identified. Finally, the uncertainty originating from the different mapping approaches was compared with that linked to meteorological input data and catchment initial conditions.
Clark, D Angus; Nuttall, Amy K; Bowles, Ryan P
2018-01-01
Latent change score (LCS) models are conceptually powerful tools for analyzing longitudinal data (McArdle & Hamagami, 2001). However, applications of these models typically include constraints on key parameters over time. Although practically useful, strict invariance over time in these parameters is unlikely in real data. This study investigates the robustness of LCS models when invariance over time is incorrectly imposed on key change-related parameters. Monte Carlo simulation methods were used to explore the impact of misspecification on parameter estimation, predicted trajectories of change, and model fit in the dual change score model, the foundational LCS model. When constraints were incorrectly applied, several parameters, most notably the slope (i.e., constant change) factor mean and the autoproportion coefficient, were severely and consistently biased, as were regression paths to the slope factor when external predictors of change were included. Standard fit indices indicated that the misspecified models fit well, partly because mean level trajectories over time were accurately captured. Loosening constraints improved the accuracy of parameter estimates, but estimates were more unstable, and models frequently failed to converge. Results suggest that potentially common sources of misspecification in LCS models can produce distorted impressions of developmental processes, and that identifying and rectifying the situation is a challenge.
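For orientation, the dual change score model combines the constant-change and proportional-change components named above; a common algebraic form (after McArdle & Hamagami, 2001, stated here as a sketch rather than the study's exact specification) is:

```latex
% Dual change score model: latent change combines constant change (slope
% factor s, weighted by alpha, often fixed to 1) and proportional change
% governed by the autoproportion coefficient beta:
\Delta y_t = y_t - y_{t-1} = \alpha\, s + \beta\, y_{t-1}
```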
Combining Deterministic structures and stochastic heterogeneity for transport modeling
NASA Astrophysics Data System (ADS)
Zech, Alraune; Attinger, Sabine; Dietrich, Peter; Teutsch, Georg
2017-04-01
Contaminant transport in highly heterogeneous aquifers is extremely challenging and a subject of current scientific debate. Tracer plumes often show non-symmetric, highly skewed plume shapes. Predicting such transport behavior using the classical advection-dispersion equation (ADE) in combination with a stochastic description of aquifer properties requires a dense measurement network. This is in contrast to the information available for most aquifers. A new conceptual aquifer structure model is presented which combines large-scale deterministic information with the stochastic approach for incorporating sub-scale heterogeneity. The conceptual model is designed to allow for a goal-oriented, site-specific transport analysis making use of as few data as possible. The basic idea is to reproduce highly skewed tracer plumes in heterogeneous media by incorporating deterministic contrasts and effects of connectivity instead of using unimodal heterogeneous models with high variances. The conceptual model consists of deterministic blocks of mean hydraulic conductivity, which might be measured by pumping tests, with values differing by orders of magnitude. A sub-scale heterogeneity is introduced within every block. This heterogeneity can be modeled as bimodal or log-normally distributed. The impact of input parameters, structure and conductivity contrasts is investigated in a systematic manner. Furthermore, a first successful implementation of the model was achieved for the well-known MADE site.
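The sketch below illustrates the conceptual structure under stated assumptions: deterministic blocks of mean conductivity with lognormal sub-scale heterogeneity superimposed. The block means, variance, and grid dimensions are illustrative, and no spatial correlation model is included.

```python
# Minimal sketch: deterministic conductivity blocks (e.g. from pumping tests)
# with lognormal sub-scale heterogeneity superimposed within each block.
import numpy as np

rng = np.random.default_rng(42)
nx, ny = 120, 60
block_means = [1e-5, 1e-3, 1e-4]               # m/s, differing by orders of magnitude
edges = [0, 40, 80, 120]                        # block boundaries along x

logK = np.empty((ny, nx))
for mean, x0, x1 in zip(block_means, edges[:-1], edges[1:]):
    # lognormal sub-scale heterogeneity within each deterministic block
    logK[:, x0:x1] = np.log(mean) + rng.normal(0.0, 0.5, (ny, x1 - x0))

K = np.exp(logK)
print("block-wise geometric means:",
      [f"{np.exp(logK[:, a:b].mean()):.1e}" for a, b in zip(edges[:-1], edges[1:])])
```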
Chronic fish toxicity is a key parameter for hazard classification and environmental risk assessment of chemicals, and the OECD 210 fish early life-stage (FELS) test is the primary guideline test used for various international regulatory programs. There exists a need to develop ...
Research on Product Conceptual Design Based on Integrated of TRIZ and HOQ
NASA Astrophysics Data System (ADS)
Xie, Jianmin; Tang, Xiaowo; Shao, Yunfei
Conceptual design determines the final product's quality and its competitiveness in the market. Determining the design parameters and finding an effective method to resolve parameter contradictions are the keys to success. In this paper, the House of Quality (HOQ) is used to determine the product design parameters, and the TRIZ contradiction matrix and inventive principles are then applied to resolve the contradictions among those parameters. Practice has shown this to be an effective method for obtaining the product conceptual design parameters and resolving parameter contradictions.
Besner, Marie-Claude; Prévost, Michèle; Regli, Stig
2011-01-01
Low and negative pressure events in drinking water distribution systems have the potential to result in intrusion of pathogenic microorganisms if an external source of contamination is present (e.g., nearby leaking sewer main) and there is a pathway for contaminant entry (e.g., leaks in drinking water main). While the public health risk associated with such events is not well understood, quantitative microbial risk assessment can be used to estimate such risk. A conceptual model is provided and the state of knowledge, current assumptions, and challenges associated with the conceptual model parameters are presented. This review provides a characterization of the causes, magnitudes, durations and frequencies of low/negative pressure events; pathways for pathogen entry; pathogen occurrence in external sources of contamination; volumes of water that may enter through the different pathways; fate and transport of pathogens from the pathways of entry to customer taps; pathogen exposure to populations consuming the drinking water; and risk associated with pathogen exposure. Copyright © 2010 Elsevier Ltd. All rights reserved.
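For the final risk step, quantitative microbial risk assessment commonly uses a simple dose-response relation such as the exponential model below; this is a standard illustration of that step, not necessarily the specific model adopted in the review.

```latex
% Exponential dose-response model: probability of infection from ingesting a
% mean pathogen dose d, with a pathogen-specific fitted parameter r:
P_{\mathrm{inf}}(d) = 1 - e^{-r d}
```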
Synthetic calibration of a Rainfall-Runoff Model
Thompson, David B.; Westphal, Jerome A.
1990-01-01
A method for synthetically calibrating storm-mode parameters for the U.S. Geological Survey's Precipitation-Runoff Modeling System is described. Synthetic calibration is accomplished by adjusting storm-mode parameters to minimize deviations between the pseudo-probability distributions represented by regional regression equations and actual frequency distributions fitted to model-generated peak discharge and runoff volume. Results of modeling storm hydrographs using synthetic and analytic storm-mode parameters are presented. Comparisons are made between model results from both parameter sets and between model results and observed hydrographs. Although mean storm runoff is reproducible to within about 26 percent of the observed mean storm runoff for five or six parameter sets, runoff from individual storms is subject to large disparities. Predicted storm runoff volume ranged from 2 percent to 217 percent of commensurate observed values. Furthermore, simulation of peak discharges was poor. Predicted peak discharges from individual storm events ranged from 2 percent to 229 percent of commensurate observed values. The model was incapable of satisfactorily executing storm-mode simulations for the study watersheds. This result is not considered a particular fault of the model, but instead is indicative of deficiencies in similar conceptual models.
Adaptive Parameter Optimization of a Grid-based Conceptual Hydrological Model
NASA Astrophysics Data System (ADS)
Samaniego, L.; Kumar, R.; Attinger, S.
2007-12-01
Any spatially explicit hydrological model at the mesoscale is a conceptual approximation of the hydrological cycle and its dominant processes occurring at this scale. Manual expert calibration of this type of model may become quite tedious, if not impossible, taking into account the enormous amount of data required by such models and the intrinsic uncertainty of both the data (input-output) and the model structure. Additionally, the model should be able to reproduce well several processes that are accounted for by a number of predefined objectives. As a consequence, some degree of automatic calibration is required to find "good" solutions, each one constituting a trade-off among all calibration criteria. In other words, it is very likely that a number of parameter sets fulfil the optimization criteria and thus can be considered model solutions. In this study, we dealt with two research questions: 1) How can the adequate level of model complexity be assessed so that model overparameterization is avoided? And 2) How can a good solution be found with a relatively low computational burden? In the present study, a grid-based conceptual hydrological model denoted HBV-UFZ, based on some of the original HBV concepts, was employed. This model was driven by 12 h precipitation, temperature, and PET grids which were acquired either from satellite products or from data of meteorological stations. In the latter case, the data were interpolated with external drift kriging. The first research question was addressed with the implementation of nonlinear transfer functions that regionalize most model parameters as functions of other spatially distributed observables such as land cover (time dependent) and time-independent basin characteristics such as soil type, slope, aspect, and geological formations, among others. The second question was addressed with an adaptive constrained optimization algorithm based on a parallel implementation of simulated annealing (SA). The main difference from standard SA is the parameter search routine, which uses adaptive heuristic rules to improve its efficiency. These rules are based on the relative behavior of the efficiency criteria. The efficiency of the model is evaluated with the Nash-Sutcliffe efficiency coefficient (NS) and the RMSE obtained for various short- and long-term runoff characteristics such as daily flows; semiannual high- and low-flow characteristics such as total drought duration and frequency of high flows; and annual specific discharge at various gauging stations. Additionally, the parameter search was constrained by the 95% confidence bands of the runoff characteristics mentioned above. The proposed method was calibrated in the Upper Neckar River basin, covering an area of approximately 4000 km2, during the period from 1961 to 1993. The spatial and temporal resolutions used were a grid size of (1000 × 1000) m and 12 h intervals, respectively. The results of the study indicate a significant improvement in model performance (e.g. Nash-Sutcliffe efficiencies of various runoff characteristics of about 0.8) and a significant reduction in computational burden of at least 25%.
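A minimal sketch of simulated annealing with an adaptive search heuristic is shown below, using a simple acceptance-rate-based step-size rule on a generic objective (e.g. 1 - NS). It illustrates the concept only; the authors' parallel algorithm and their specific heuristic rules are not reproduced here.

```python
# Minimal sketch: simulated annealing with an adaptive step-size heuristic.
import numpy as np

def anneal(objective, x0, bounds, n_iter=5000, T0=1.0, seed=1):
    rng = np.random.default_rng(seed)
    x, fx = np.array(x0, float), objective(x0)
    best_x, best_f = x.copy(), fx
    step = 0.1 * (bounds[:, 1] - bounds[:, 0])       # per-parameter step sizes
    accepted = 0
    for k in range(1, n_iter + 1):
        T = T0 / np.log(k + 1)                       # cooling schedule
        cand = np.clip(x + rng.normal(0, step), bounds[:, 0], bounds[:, 1])
        fc = objective(cand)
        if fc < fx or rng.random() < np.exp(-(fc - fx) / T):
            x, fx = cand, fc                         # Metropolis acceptance
            accepted += 1
        if fx < best_f:
            best_x, best_f = x.copy(), fx
        if k % 200 == 0:                             # adaptive rule: target ~40% acceptance
            step *= 1.2 if accepted / 200 > 0.4 else 0.8
            accepted = 0
    return best_x, best_f

bounds = np.array([[0.0, 10.0], [0.0, 1.0]])
best, f = anneal(lambda p: (p[0] - 3.0) ** 2 + (p[1] - 0.2) ** 2, [5.0, 0.5], bounds)
print(best, f)
```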
Simultaneous Semi-Distributed Model Calibration Guided by ...
Modelling approaches to transfer hydrologically relevant information from locations with streamflow measurements to locations without such measurements continue to be an active field of research for hydrologists. The Pacific Northwest Hydrologic Landscapes (PNW HL) provide a solid conceptual classification framework based on our understanding of dominant processes. A Hydrologic Landscape code (a 5-letter descriptor based on physical and climatic properties) describes each assessment unit area, and these units average 60 km2 in area. The core function of these HL codes is to relate and transfer hydrologically meaningful information between watersheds without the need for streamflow time series. We present a novel approach based on the HL framework to answer the question "How can we calibrate models across separate watersheds simultaneously, guided by our understanding of dominant processes?" We should be able to apply the same parameterizations to assessment units of common HL codes if 1) the Hydrologic Landscapes contain hydrologic information transferable between watersheds at a sub-watershed scale and 2) we use a conceptual hydrologic model and parameters that reflect the hydrologic behavior of a watershed. This work specifically tests the ability or inability to use HL codes to inform and share model parameters across watersheds in the Pacific Northwest. EPA's Western Ecology Division has published and is refining a framework for defining la
Forcing Regression through a Given Point Using Any Familiar Computational Routine.
1983-03-01
a linear model, Y = α + βX + ε (Model I), then adopt the principle of least squares and use sample data to estimate the unknown parameters, α and β. ... that ε has an expected value of zero indicates that the "average" response is considered linear. If ε varies widely, Model I, though conceptually correct, may ... relationship is linear from the maximum observed x to x = a, then Model II should be used. To proceed with the customary evaluation of Model I would be
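A minimal sketch of the underlying trick, assuming the goal is a least-squares line constrained to pass through a known point (x0, y0): translate the data so the point becomes the origin, fit a no-intercept model with any familiar routine, and translate back. Names and data are illustrative.

```python
# Minimal sketch: forcing a least-squares regression line through (x0, y0)
# by shifting coordinates and fitting a no-intercept model.
import numpy as np

def fit_through_point(x, y, x0, y0):
    """Slope of the least-squares line constrained to pass through (x0, y0)."""
    u, v = x - x0, y - y0                    # shift so the fixed point is the origin
    return (u @ v) / (u @ u)                 # no-intercept least-squares slope

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
b = fit_through_point(x, y, 0.0, 0.0)        # force the line through the origin
print(f"y = y0 + {b:.3f} * (x - x0)")
```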
A new technology for determining transport parameters in porous media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conca, J.L.; Wright, J.
The UFA Method can directly and rapidly measure transport parameters for any porous medium over a wide range of water contents and conditions. UFA results for subsurface sediments at a mixed-waste disposal site at the Hanford Site in Washington State provided the data necessary for detailed hydrostratigraphic mapping, subsurface flux and recharge distributions, and subsurface chemical mapping. Seven hundred unsaturated conductivity measurements along with pristine pore water extractions were obtained in only six months using the UFA. These data are used to provide realistic information to conceptual models, predictive models and restoration strategies.
Laboratory Experiments on Bentonite Samples: FY16 Progress
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tinnacher, Ruth M.; Tournassat, Christophe; Davis, James A.
2016-08-22
The primary goal of this study is to improve the understanding of U(VI) sorption and diffusion behavior in sodium-montmorillonite in order to support the development of realistic conceptual models describing these processes in performance assessment models while (1) accounting for potential changes in system conditions over time and space, (2) avoiding overly conservative transport predictions, and (3) using a minimum number of fitting parameters.
Incorporating Handling Qualities Analysis into Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Lawrence, Ben
2014-01-01
This paper describes the initial development of a framework to incorporate handling qualities analyses into a rotorcraft conceptual design process. In particular, the paper describes how rotorcraft conceptual design level data can be used to generate flight dynamics models for handling qualities analyses. Also, methods are described that couple a basic stability augmentation system to the rotorcraft flight dynamics model to extend the analysis beyond that of the bare airframe. A methodology for calculating the handling qualities characteristics of the flight dynamics models and for comparing the results to ADS-33E criteria is described. Preliminary results from the application of the handling qualities analysis for variations in key rotorcraft design parameters of main rotor radius, blade chord, hub stiffness and flap moment of inertia are shown. Varying relationships, with counteracting trends for different handling qualities criteria and different flight speeds, are exhibited, with the action of the control system playing a complex part in the outcomes. Overall, the paper demonstrates how a broad array of technical issues across flight dynamics stability and control, simulation and modeling, control law design, and handling qualities testing and evaluation had to be confronted to implement even a moderately comprehensive handling qualities analysis of relatively low fidelity models. A key outstanding issue is how to 'close the loop' with an overall design process, and options for exploring how to feed handling qualities results back into a conceptual design process are proposed for future work.
Conceptual design and analysis of a dynamic scale model of the Space Station Freedom
NASA Technical Reports Server (NTRS)
Davis, D. A.; Gronet, M. J.; Tan, M. K.; Thorne, J.
1994-01-01
This report documents the conceptual design study performed to evaluate design options for a subscale dynamic test model which could be used to investigate the expected on-orbit structural dynamic characteristics of the Space Station Freedom early build configurations. The baseline option was a 'near-replica' model of the SSF SC-7 pre-integrated truss configuration. The approach used to develop conceptual design options involved three sets of studies: evaluation of the full-scale design and analysis databases, scale factor trade studies, and design sensitivity studies. The scale factor trade study was conducted to develop a fundamental understanding of the key scaling parameters that drive the design, performance and cost of a SSF dynamic scale model. Four scale model options were evaluated: 1/4, 1/5, 1/7, and 1/10 scale. Prototype hardware was fabricated to assess producibility issues. Based on the results of the study, a 1/4-scale size is recommended because of the increased model fidelity associated with a larger scale factor. A design sensitivity study was performed to identify critical hardware component properties that drive dynamic performance. A total of 118 component properties were identified which require high-fidelity replication. Lower-fidelity dynamic similarity scaling can be used for non-critical components.
Continuous Improvement of a Groundwater Model over a 20-Year Period: Lessons Learned.
Andersen, Peter F; Ross, James L; Fenske, Jon P
2018-04-17
Groundwater models developed for specific sites generally become obsolete within a few years due to changes in: (1) modeling technology; (2) site/project personnel; (3) project funding; and (4) modeling objectives. Consequently, new models are sometimes developed for the same sites using the latest technology and data, but without potential knowledge gained from the prior models. When it occurs, this practice is particularly problematic because, although technology, data, and observed conditions change, development of the new numerical model may not consider the conceptual model's underpinnings. As a contrary situation, we present the unique case of a numerical flow and trichloroethylene (TCE) transport model that was first developed in 1993 and since revised and updated annually by the same personnel. The updates are prompted by an increase in the amount of data, exposure to a wider range of hydrologic conditions over increasingly longer timeframes, technological advances, evolving modeling objectives, and revised modeling methodologies. The history of updates shows smooth, incremental changes in the conceptual model and modeled aquifer parameters that result from both increase and decrease in complexity. Myriad modeling objectives have included demonstrating the ineffectiveness of a groundwater extraction/injection system, evaluating potential TCE degradation, locating new monitoring points, and predicting likelihood of exceedance of groundwater standards. The application emphasizes an original tenet of successful groundwater modeling: iterative adjustment of the conceptual model based on observations of actual vs. model response. © 2018, National Ground Water Association.
Mechanisms of Hydrocarbon Based Polymer Etch
NASA Astrophysics Data System (ADS)
Lane, Barton; Ventzek, Peter; Matsukuma, Masaaki; Suzuki, Ayuta; Koshiishi, Akira
2015-09-01
Dry etch of hydrocarbon based polymers is important for semiconductor device manufacturing. The etch mechanisms for oxygen-rich plasma etch of hydrocarbon based polymers have been studied, but the mechanism for lean chemistries has received little attention. We report on an experimental and analytic study of the mechanism for etching of a hydrocarbon based polymer using an Ar/O2 chemistry in a single-frequency 13.56 MHz test bed. The experimental study employs an analysis of transients from sequential oxidation and Ar sputtering steps, using OES and surface analytics to constrain conceptual models for the etch mechanism. The conceptual model is consistent with observations from MD studies and surface analysis performed by Vegh et al. and Oehrlein et al. and other similar studies. Parameters of the model are fit using published data and the experimentally observed time scales.
NASA Astrophysics Data System (ADS)
Barati, Reza
2017-07-01
Perumal et al. (2017) compared the performances of the variable parameter McCarthy-Muskingum (VPMM) model of Perumal and Price (2013) and the nonlinear Muskingum (NLM) model of Gill (1978) using hypothetical inflow hydrographs in an artificial channel. As input, the first model needs the initial condition, the upstream boundary condition, Manning's roughness coefficient, the length of the routing reach, the cross-sections of the river reach and the bed slope, while the latter requires the initial condition, the upstream boundary condition and the hydrologic parameters (three parameters which can be calibrated using flood hydrographs of the upstream and downstream sections). The VPMM model was examined with available Manning's roughness values, whereas the NLM model was tested in both calibration and validation steps. As a final conclusion, Perumal et al. (2017) claimed that the NLM model should be retired from the literature on the Muskingum model. While the authors' intention is laudable, this comment examines some important issues in the subject matter of the original study.
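For reference, the nonlinear Muskingum storage relation of Gill (1978) referred to above is usually written as below, with inflow I, outflow O, storage S, and the three calibration parameters K, x and m, coupled with the continuity equation:

```latex
% Nonlinear Muskingum model (Gill, 1978):
S = K\left[x I + (1 - x)\, O\right]^{m}, \qquad \frac{dS}{dt} = I - O
```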
A new Bayesian recursive technique for parameter estimation
NASA Astrophysics Data System (ADS)
Kaheil, Yasir H.; Gill, M. Kashif; McKee, Mac; Bastidas, Luis
2006-08-01
The performance of any model depends on how well its associated parameters are estimated. In the current application, a localized Bayesian recursive estimation (LOBARE) approach is devised for parameter estimation. The LOBARE methodology is an extension of the Bayesian recursive estimation (BARE) method. It is applied in this paper to two different types of models: an artificial intelligence (AI) model in the form of a support vector machine (SVM) application for forecasting soil moisture, and a conceptual rainfall-runoff (CRR) model represented by the Sacramento soil moisture accounting (SAC-SMA) model. Support vector machines, based on statistical learning theory (SLT), represent the modeling task as a quadratic optimization problem and have already been used in various applications in hydrology. They require estimation of three parameters. SAC-SMA is a very well known model that estimates runoff. It has a 13-dimensional parameter space. In the LOBARE approach presented here, Bayesian inference is used in an iterative fashion to estimate the parameter space that will most likely enclose a best parameter set. This is done by narrowing the sampling space through updating the "parent" bounds based on their fitness. These bounds are actually the parameter sets that were selected by BARE runs on subspaces of the initial parameter space. The new approach results in faster convergence toward the optimal parameter set using minimum training/calibration data and fewer sets of parameter values. The efficacy of the localized methodology is also compared with the previously used BARE algorithm.
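The bound-narrowing idea described above can be sketched as below under simplifying assumptions: sample the current parameter box, rank candidate sets by fitness, and shrink the "parent" bounds around the fittest sets before resampling. The objective and numbers are illustrative, and the actual LOBARE procedure uses Bayesian inference rather than this plain ranking.

```python
# Minimal sketch of iterative "parent" bound narrowing on a generic objective.
import numpy as np

def narrow_bounds(objective, bounds, n_samples=200, keep=20, n_rounds=5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0].copy(), bounds[:, 1].copy()
    for _ in range(n_rounds):
        samples = rng.uniform(lo, hi, (n_samples, len(lo)))  # sample current box
        scores = np.array([objective(s) for s in samples])
        elite = samples[np.argsort(scores)[:keep]]           # fittest parameter sets
        lo, hi = elite.min(axis=0), elite.max(axis=0)        # updated "parent" bounds
    return lo, hi

bounds = np.array([[0.0, 10.0], [0.0, 5.0]])
lo, hi = narrow_bounds(lambda p: (p[0] - 2.0) ** 2 + (p[1] - 1.0) ** 2, bounds)
print("narrowed bounds:", np.c_[lo, hi])
```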
Some issues in data model mapping
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Alsabbagh, Jamal R.
1985-01-01
Numerous data models have been reported in the literature since the early 1970s. They have been used as database interfaces and as conceptual design tools. The mapping between schemas expressed according to the same data model or according to different models is interesting for theoretical and practical purposes. This paper addresses some of the issues involved in such a mapping. Of special interest are the identification of the mapping parameters and some current approaches for handling the various situations that require a mapping.
Delineation, characterization, and classification of topographic eminences
NASA Astrophysics Data System (ADS)
Sinha, Gaurav
Topographic eminences are defined as upwardly rising, convex shaped topographic landforms that are noticeably distinct in their immediate surroundings. As opposed to everyday objects, the properties of a topographic eminence are dependent not only on how it is conceptualized, but are also intrinsically related to its spatial extent and its relative location in the landscape. In this thesis, a system for automated detection, delineation and characterization of topographic eminences based on an analysis of digital elevation models is proposed. Research has shown that conceptualization of eminences (and other landforms) is linked to the cultural and linguistic backgrounds of people. However, the perception of stimuli from our physical environment is not subject to cultural or linguistic bias. Hence, perceptually salient morphological and spatial properties of the natural landscape can form the basis for generically applicable detection and delineation of topographic eminences. Six principles of cognitive eminence modeling are introduced to develop the philosophical foundation of this research regarding eminence delineation and characterization. The first step in delineating eminences is to automatically detect their presence within digital elevation models. This is achieved by the use of quantitative geomorphometric parameters (e.g., elevation, slope and curvature) and qualitative geomorphometric features (e.g., peaks, passes, pits, ridgelines, and valley lines). The process of eminence delineation follows that of eminence detection. It is posited that eminences may be perceived either as monolithic terrain objects, or as composites of morphological parts (e.g., top, bottom, slope). Individual eminences may also simultaneously be conceived as comprising larger, higher order eminence complexes (e.g., mountain ranges). Multiple algorithms are presented for the delineation of simple and complex eminences, and the morphological parts of eminences. The proposed eminence detection and delineation methods are amenable to intuitive parameterization such that they can easily capture the multitude of eminence conceptualizations that people develop due to differences in terrain type and cultural and linguistic backgrounds. Eminence delineation is an important step in object based modeling of the natural landscape. However, mere 'geocoding' of eminences is not sufficient for modeling how people intuitively perceive and reason about eminences. Therefore, a comprehensive eminence parameterization system for characterizing the perceptual properties of eminences is also proposed in this thesis. Over 40 parameters are suggested for measuring the commonly perceived properties of eminences: size, shape, topology, proximity, and visibility. The proposed parameters describe different aspects of naive eminence perception. Quantitative analysis of eminence parameters using cluster analysis confirms that not only can eminences be parameterized as individual terrain objects, but that eminence (dis)similarities can be exploited to develop intuitive eminence classification systems. Eminence parameters are also shown to be essential for exploring the relationships between extracted eminences and natural language terms (e.g., hill, mount, mountain, peak) used commonly to refer to different types of eminences.
The results from this research confirm that object based modeling of the landscape is not only useful for terrain information system design, but is also essential for understanding how people commonly conceptualize their observations of and interactions with the natural landscape.
Strategies to Move From Conceptual Models to Quantifying Resilience in FEW Systems
NASA Astrophysics Data System (ADS)
Padowski, J.; Adam, J. C.; Boll, J.; Barber, M. E.; Cosens, B.; Goldsby, M.; Fortenbery, R.; Fowler, A.; Givens, J.; Guzman, C. D.; Hampton, S. E.; Harrison, J.; Huang, M.; Katz, S. L.; Kraucunas, I.; Kruger, C. E.; Liu, M.; Luri, M.; Malek, K.; Mills, A.; McLarty, D.; Pickering, N. B.; Rajagopalan, K.; Stockle, C.; Richey, A.; Voisin, N.; Witinok-Huber, B.; Yoder, J.; Yorgey, G.; Zhao, M.
2017-12-01
Understanding interdependencies within Food-Energy-Water (FEW) systems is critical to maintain FEW security. This project examines how coordinated management of physical (e.g., reservoirs, aquifers, and batteries) and non-physical (e.g., water markets, social capital, and insurance markets) storage systems across the three sectors promotes resilience. Coordination increases effective storage within the overall system and enhances buffering against shocks at multiple scales. System-wide resilience can be increased with innovations in technology (e.g., smart systems and energy storage) and institutions (e.g., economic systems and water law). Using the Columbia River Basin as our geographical study region, we use an integrated approach that includes a continuum of science disciplines, moving from theory to practice. In order to understand FEW linkages, we started with detailed, connected conceptual models of the food, energy, water, and social systems to identify where key interdependencies (i.e., overlaps, stocks, and flows) exist within and between systems. These are used to identify stress and opportunity points, develop innovation solutions across FEW sectors, remove barriers to the adoption of solutions, and quantify increases in system-wide resilience to regional and global change. The conceptual models act as a foundation from which we can identify key drivers, parameters, time steps, and variables of importance to build and improve existing systems dynamic and biophysical models. Our process of developing conceptual models and moving to integrated modeling is critical and serves as a foundation for coupling quantitative components with economic and social domain components and analyses of how these interact through time and space. This poster provides a description of this process that pulls together conceptual maps and integrated modeling output to quantify resilience across all three of the FEW sectors (a.k.a. "The Resilience Calculator"). Companion posters describe our case studies and our efforts in incorporating social systems into this resilience calculator.
NASA Astrophysics Data System (ADS)
Smith, T.; Marshall, L.
2007-12-01
In many mountainous regions, the single most important parameter in forecasting the controls on regional water resources is snowpack (Williams et al., 1999). In an effort to bridge the gap between theoretical understanding and functional modeling of snow-driven watersheds, a flexible hydrologic modeling framework is being developed. The aim is to create a suite of models that move from parsimonious structures, concentrated on aggregated watershed response, to those focused on representing finer-scale processes and distributed response. This framework will operate as a tool to investigate the link between hydrologic model predictive performance, uncertainty, model complexity, and observable hydrologic processes. Bayesian methods, and particularly Markov chain Monte Carlo (MCMC) techniques, are extremely useful in uncertainty assessment and parameter estimation of hydrologic models. However, these methods have some difficulties in implementation. In a traditional Bayesian setting, it can be difficult to reconcile multiple data types, particularly those offering different spatial and temporal coverage, depending on the model type. These difficulties are exacerbated by the sensitivity of MCMC algorithms to model initialization and by complex parameter interdependencies. As a way of circumventing some of the computational complications, adaptive MCMC algorithms have been developed to take advantage of the information gained from each successive iteration. Two adaptive algorithms are compared in this study: the Adaptive Metropolis (AM) algorithm, developed by Haario et al. (2001), and the Delayed Rejection Adaptive Metropolis (DRAM) algorithm, developed by Haario et al. (2006). While neither algorithm is truly Markovian, it has been proven that each satisfies the desired ergodicity and stationarity properties of Markov chains. Both algorithms were implemented as the uncertainty and parameter estimation framework for a conceptual rainfall-runoff model based on the Probability Distributed Model (PDM), developed by Moore (1985). We implement the modeling framework in the Stringer Creek watershed in the Tenderfoot Creek Experimental Forest (TCEF), Montana. The snowmelt-driven watershed offers the additional challenge of modeling snow accumulation and melt, and current efforts are aimed at developing a temperature- and radiation-index snowmelt model. Auxiliary data available from within TCEF's watersheds are used to support understanding of the information value of data as it relates to predictive performance. Because the model is based on lumped parameters, auxiliary data are hard to incorporate directly. However, these additional data offer benefits through their ability to inform the prior distributions of the lumped model parameters. By incorporating data offering different information into the uncertainty assessment process, a cross-validation technique is engaged to better ensure that modeled results reflect real process complexity.
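As an illustration of the adaptive idea described above, the following is a minimal sketch of an Adaptive Metropolis-style sampler in the spirit of Haario et al. (2001), not the study's own implementation; `log_post`, the adaptation start `t0`, and the tuning constants are illustrative assumptions.

```python
import numpy as np

def adaptive_metropolis(log_post, x0, n_iter=5000, t0=500, eps=1e-6):
    """Sketch of an Adaptive Metropolis (AM) sampler: after t0 iterations,
    the Gaussian proposal covariance is estimated from the chain history,
    scaled by 2.4^2/d, with a small eps*I regularization term."""
    d = len(x0)
    sd = 2.4 ** 2 / d
    chain = np.zeros((n_iter, d))
    x = np.asarray(x0, dtype=float)
    lp = log_post(x)
    cov = np.eye(d)  # fixed proposal covariance during the initial period
    for t in range(n_iter):
        if t > t0:
            cov = sd * (np.cov(chain[:t].T) + eps * np.eye(d))  # adapt from history
        prop = np.random.multivariate_normal(x, cov)
        lp_prop = log_post(prop)
        if np.log(np.random.rand()) < lp_prop - lp:  # Metropolis acceptance
            x, lp = prop, lp_prop
        chain[t] = x
    return chain
```

DRAM extends this idea by retrying a rejected proposal with a scaled-down covariance before moving to the next iteration.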
Automated Generation of Finite-Element Meshes for Aircraft Conceptual Design
NASA Technical Reports Server (NTRS)
Li, Wu; Robinson, Jay
2016-01-01
This paper presents a novel approach for automated generation of fully connected finite-element meshes for all internal structural components and skins of a given wing-body geometry model, controlled by a few conceptual-level structural layout parameters. Internal structural components include spars, ribs, frames, and bulkheads. Structural layout parameters include spar/rib locations in wing chordwise/spanwise direction and frame/bulkhead locations in longitudinal direction. A simple shell thickness optimization problem with two load conditions is used to verify versatility and robustness of the automated meshing process. The automation process is implemented in ModelCenter starting from an OpenVSP geometry and ending with a NASTRAN 200 solution. One subsonic configuration and one supersonic configuration are used for numerical verification. Two different structural layouts are constructed for each configuration and five finite-element meshes of different sizes are generated for each layout. The paper includes various comparisons of solutions of 20 thickness optimization problems, as well as discussions on how the optimal solutions are affected by the stress constraint bound and the initial guess of design variables.
Neutronic design studies of a conceptual DCLL fusion reactor for a DEMO and a commercial power plant
NASA Astrophysics Data System (ADS)
Palermo, I.; Veredas, G.; Gómez-Ros, J. M.; Sanz, J.; Ibarra, A.
2016-01-01
Neutronic analyses or, more widely, nuclear analyses have been performed for the development of a dual-coolant He/LiPb (DCLL) conceptual design reactor. A detailed three-dimensional (3D) model has been examined and optimized. The design is based on the plasma parameters and functional materials of the power plant conceptual studies (PPCS) model C. The initial radial-build for the detailed model has been determined according to the dimensions established in a previous work on an equivalent simplified homogenized reactor model. For optimization purposes, the initial specifications established over the simplified model have been refined on the detailed 3D design, modifying material and dimension of breeding blanket, shield and vacuum vessel in order to fulfil the priority requirements of a fusion reactor in terms of the fundamental neutronic responses. Tritium breeding ratio, energy multiplication factor, radiation limits in the TF coils, helium production and displacements per atom (dpa) have been calculated in order to demonstrate the functionality and viability of the reactor design in guaranteeing tritium self-sufficiency, power efficiency, plasma confinement, and re-weldability and structural integrity of the components. The paper describes the neutronic design improvements of the DCLL reactor, obtaining results for both DEMO and power plant operational scenarios.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Landkamer, Lee L.; Harvey, Ronald W.; Scheibe, Timothy D.
A new colloid transport model is introduced that is conceptually simple but captures the essential features of complicated attachment and detachment behavior of colloids when conditions of secondary minimum attachment exist. This model eliminates the empirical concept of collision efficiency; the attachment rate is computed directly from colloid filtration theory. Also, a new paradigm for colloid detachment based on colloid population heterogeneity is introduced. Assuming the dispersion coefficient can be estimated from tracer behavior, this model has only two fitting parameters: (1) the fraction of colloids that attach irreversibly and (2) the rate at which reversibly attached colloids leave the surface. These two parameters were correlated to physical parameters that control colloid transport such as the depth of the secondary minimum and pore water velocity. Given this correlation, the model serves as a heuristic tool for exploring the influence of physical parameters such as surface potential and fluid velocity on colloid transport. This model can be extended to heterogeneous systems characterized by both primary and secondary minimum deposition by simply increasing the fraction of colloids that attach irreversibly.
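A minimal numerical sketch of the two-parameter concept, for a well-mixed batch system only (the full model would couple these kinetics to advection and dispersion); `k_att`, `f_irr`, and `k_det` are illustrative names and values, with `k_att` standing in for the filtration-theory attachment rate.

```python
import numpy as np
from scipy.integrate import solve_ivp

def colloid_rhs(t, y, k_att, f_irr, k_det):
    """Mobile colloids c attach at rate k_att; a fraction f_irr of attachment
    is irreversible, the rest is reversible and detaches at rate k_det."""
    c, s_irr, s_rev = y
    attach = k_att * c
    detach = k_det * s_rev
    return [-attach + detach,                  # mobile concentration
            f_irr * attach,                    # irreversibly attached
            (1.0 - f_irr) * attach - detach]   # reversibly attached

# illustrative run: unit initial mobile concentration, no attached colloids
sol = solve_ivp(colloid_rhs, (0.0, 50.0), [1.0, 0.0, 0.0],
                args=(0.8, 0.3, 0.05), dense_output=True)
```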
Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)
DOE Office of Scientific and Technical Information (OSTI.GOV)
BABA,T.; ISHIGURO,K.; ISHIHARA,Y.
1999-08-30
Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.
Gray, Kathleen; Sockolow, Paulina
2016-02-24
Contributing to health informatics research means using conceptual models that are integrative and explain the research in terms of the two broad domains of health science and information science. However, it can be hard for novice health informatics researchers to find exemplars and guidelines in working with integrative conceptual models. The aim of this paper is to support the use of integrative conceptual models in research on information and communication technologies in the health sector, and to encourage discussion of these conceptual models in scholarly forums. A two-part method was used to summarize and structure ideas about how to work effectively with conceptual models in health informatics research that included (1) a selective review and summary of the literature of conceptual models; and (2) the construction of a step-by-step approach to developing a conceptual model. The seven-step methodology for developing conceptual models in health informatics research explained in this paper involves (1) acknowledging the limitations of health science and information science conceptual models; (2) giving a rationale for one's choice of integrative conceptual model; (3) explicating a conceptual model verbally and graphically; (4) seeking feedback about the conceptual model from stakeholders in both the health science and information science domains; (5) aligning a conceptual model with an appropriate research plan; (6) adapting a conceptual model in response to new knowledge over time; and (7) disseminating conceptual models in scholarly and scientific forums. Making explicit the conceptual model that underpins a health informatics research project can contribute to increasing the number of well-formed and strongly grounded health informatics research projects. This explication has distinct benefits for researchers in training, research teams, and researchers and practitioners in information, health, and other disciplines.
Natural language generation of surgical procedures.
Wagner, J C; Rogers, J E; Baud, R H; Scherrer, J R
1999-01-01
A number of compositional Medical Concept Representation systems are being developed. Although these provide for a detailed conceptual representation of the underlying information, they have to be translated back to natural language for use by end-users and applications. The GALEN programme has been developing one such representation, and we report here on a tool developed to generate natural language phrases from the GALEN conceptual representations. This tool can be adapted to different source modelling schemes and to different destination languages or sublanguages of a domain. It is based on a multilingual approach to natural language generation, realised through a clean separation of the domain model from the linguistic model and their link by well-defined structures. Specific knowledge structures and operations have been developed for bridging between the modelling 'style' of the conceptual representation and natural language. Using the example of the scheme developed for modelling surgical operative procedures within the GALEN-IN-USE project, we show how the generator is adapted to such a scheme. The basic characteristics of the surgical procedures scheme are presented together with the basic principles of the generation tool. Using worked examples, we discuss the transformation operations which change the initial source representation into a form which can more directly be translated to a given natural language. In particular, the linguistic knowledge which has to be introduced--such as definitions of concepts and relationships--is described. We explain the overall generator strategy and how particular transformation operations are triggered by language-dependent and conceptual parameters. Results are shown for generated French phrases corresponding to surgical procedures from the urology domain.
From Magma Fracture to a Seismic Magma Flow Meter
NASA Astrophysics Data System (ADS)
Neuberg, J. W.
2007-12-01
Seismic swarms of low-frequency events occur during periods of enhanced volcanic activity and have been related to the flow of magma at depth. Often they precede a dome collapse on volcanoes like Soufriere Hills, Montserrat, or Mt St Helens. This contribution is based on the conceptual model of magma rupture as a trigger mechanism. Several source mechanisms and radiation patterns at the focus of a single event are discussed. We investigate the accelerating event rate and seismic amplitudes during one swarm, as well as over a time period of several swarms. The seismic slip vector will be linked to magma flow parameters, resulting in estimates of magma flux for a variety of flow models such as plug flow, parabolic flow, or friction-controlled flow. In this way we try to relate conceptual models to quantitative estimates, which could lead to estimates of magma flux at depth from seismic low-frequency signals.
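As a hedged illustration of why the assumed flow profile matters for such a "flow meter" (symbols and geometry are assumptions here, not the paper's derivation): for a conduit of cross-sectional area A, a plug-flow model converts a slip-derived wall velocity directly into flux, whereas a parabolic (plane Poiseuille) profile with no-slip walls ties the mean velocity to the centerline maximum:

$$Q_{\mathrm{plug}} = A\,u_{\mathrm{slip}}, \qquad Q_{\mathrm{parabolic}} = A\,\bar{u} = \tfrac{2}{3}\,A\,u_{\max},$$

so the same seismically inferred velocity can imply substantially different flux estimates depending on the flow model adopted.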
Recent Papers in Parametric Modelling of Time Series.
1983-04-01
Tokamak experimental power reactor conceptual design. Volume II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1976-08-01
Volume II contains the following appendices: (1) summary of EPR design parameters, (2) impurity control, (3) plasma computational models, (4) structural support system, (5) materials considerations for the primary energy conversion system, (6) magnetics, (7) neutronics penetration analysis, (8) first wall stress analysis, (9) enrichment of isotopes of hydrogen by cryogenic distillation, and (10) noncircular plasma considerations. (MOW)
NASA Astrophysics Data System (ADS)
Zhang, Y.; Li, S.
2014-12-01
Geologic carbon sequestration (GCS) is proposed for the Nugget Sandstone in Moxa Arch, a regional saline aquifer with a large storage potential. For a proposed storage site, this study builds a suite of increasingly complex conceptual "geologic" model families, using subsets of the site characterization data: a homogeneous model family, a stationary petrophysical model family, a stationary facies model family with sub-facies petrophysical variability, and a non-stationary facies model family (with sub-facies variability) conditioned to soft data. These families, representing alternative conceptual site models built with increasing data, were simulated with the same CO2 injection test (50 years at 1/10 Mt per year), followed by 2950 years of monitoring. Using the Design of Experiments, an efficient sensitivity analysis (SA) is conducted for all families, systematically varying uncertain input parameters. Results are compared among the families to identify parameters that have a first-order impact on predicting the CO2 storage ratio (SR) at both the end of injection and the end of monitoring. At this site, geologic modeling factors do not significantly influence the short-term prediction of the storage ratio, although they become important over monitoring time, but only for those families where such factors are accounted for. Based on the SA, a response surface analysis is conducted to generate prediction envelopes of the storage ratio, which are compared among the families at both times. Results suggest a large uncertainty in the predicted storage ratio given the uncertainties in model parameters and modeling choices: SR varies from 5-60% (end of injection) to 18-100% (end of monitoring), although its variation among the model families is relatively minor. Moreover, long-term leakage risk is considered small at the proposed site. In the lowest-SR scenarios, all families predict gravity-stable supercritical CO2 migrating toward the bottom of the aquifer. In the highest-SR scenarios, supercritical CO2 footprints are relatively insignificant by the end of monitoring.
NASA Astrophysics Data System (ADS)
Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K.
2015-08-01
Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and the benefits gained through hydropower generation. Improving hourly reservoir inflow forecasts over a 24 h lead time is considered within the day-ahead (Elspot) market of the Nordic exchange market. The complementary modelling framework offers an approach for improving real-time forecasting without needing to modify the pre-existing forecasting model: an independent additive, or complementary, model is formulated to capture the structure the existing operational model may be missing. We present here the application of this principle for issuing improved hourly inflow forecasts into hydropower reservoirs over extended lead times, with the parameter estimation procedure reformulated to deal with bias, persistence and heteroscedasticity. The procedure presented comprises an error model added on top of an unalterable constant-parameter conceptual model. This procedure is applied in the 207 km2 Krinsvatn catchment in central Norway. The structure of the error model is established based on attributes of the residual time series from the conceptual model. Besides improving the forecast skill of operational models, the approach estimates the uncertainty in the complementary model structure and produces probabilistic inflow forecasts that provide suitable information for reducing uncertainty in decision-making processes in hydropower systems operation. Deterministic and probabilistic evaluations revealed an overall significant improvement in forecast accuracy for lead times up to 17 h. Evaluation of the percentage of observations bracketed in the forecasted 95 % confidence interval indicated that the degree of success in containing 95 % of the observations varies across seasons and hydrologic years.
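A minimal sketch of a complementary error model with the three ingredients named above (bias, persistence, heteroscedasticity); the AR(1) form and the names `mu`, `rho`, `sigma0`, `sigma1` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def complementary_forecast(q_model, q_obs_last, mu, rho, sigma0, sigma1, horizon):
    """Sketch of an additive error model on top of a fixed conceptual model:
    bias mu, AR(1) persistence rho, and heteroscedastic noise whose spread
    grows with the simulated flow (sigma0 + sigma1 * q). Returns one
    probabilistic ensemble member of the corrected forecast."""
    eta = q_obs_last - q_model[0]        # last known model residual
    q_corr = np.empty(horizon)
    for k in range(horizon):
        eta = mu + rho * (eta - mu)      # propagate the expected residual
        sd = sigma0 + sigma1 * q_model[k]
        q_corr[k] = q_model[k] + eta + sd * np.random.randn()
    return q_corr
```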
Johnson, R.H.; Poeter, E.P.
2007-01-01
Perchloroethylene (PCE) saturations determined from GPR surveys were used as observations for inversion of multiphase flow simulations of a PCE injection experiment (Borden 9 m cell), allowing for the estimation of optimal bulk intrinsic permeability values. The resulting fit statistics and analysis of residuals (observed minus simulated PCE saturations) were used to improve the conceptual model. These improvements included adjustment of the elevation of a permeability contrast, use of the van Genuchten versus Brooks-Corey capillary pressure-saturation curve, and a weighting scheme to account for greater measurement error with larger saturation values. A limitation in determining PCE saturations through one-dimensional GPR modeling is non-uniqueness when multiple GPR parameters are unknown (i.e., permittivity, depth, and gain function). Site knowledge, fixing the gain function, and multiphase flow simulations assisted in evaluating non-unique conceptual models of PCE saturation, where depth and layering were reinterpreted to provide alternate conceptual models. Remaining bias in the residuals is attributed to the violation of assumptions in the one-dimensional GPR interpretation (which assumes flat, infinite, horizontal layering) resulting from multidimensional influences that were not included in the conceptual model. While the limitations and errors in using GPR data as observations for inverse multiphase flow simulations are frustrating and difficult to quantify, simulation results indicate that the error and bias in the PCE saturation values are small enough to still provide reasonable optimal permeability values. The effort to improve model fit and reduce residual bias decreases simulation error even for an inversion based on biased observations and provides insight into alternate GPR data interpretations. Thus, this effort is warranted and provides information on bias in the observation data when this bias is otherwise difficult to assess. © 2006 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Caputo, Riccardo
2010-09-01
It is a commonplace field observation that extension fractures are more abundant than shear fractures. The questions of how much more abundant, and why, are posed in this paper, and qualitative estimates of their ratio within a rock volume are made on the basis of field observations and mechanical considerations. A conceptual model is also proposed to explain the common range of ratios between extension and shear fractures, here called the j/f ratio. The model considers three major genetic stress components originating from overburden, pore-fluid pressure and tectonics, and assumes that some of the remote genetic stress components vary with time (i.e. stress-rates are included). Other important assumptions of the numerical model are that: i) the strength of the sub-volumes is randomly attributed following a Weibull probabilistic distribution, ii) all fractures heal after a given time, thus simulating the cementation process, and therefore iii) both extensional jointing and shear fracturing could be recurrent events within the same sub-volume. As a direct consequence of these assumptions, the stress tensor at any point varies continuously in time, and these variations are caused by both remote stresses and local stress drops associated with in-situ and neighbouring fracturing events. The conceptual model is implemented in a computer program to simulate layered carbonate rock bodies undergoing brittle deformation. The numerical results are obtained by varying the principal parameters, like depth (viz. confining pressure), tensile strength, pore-fluid pressure and shape of the Weibull distribution function, over a wide range of values, therefore simulating a broad spectrum of possible mechanical and lithological conditions. The quantitative estimates of the j/f ratio confirm the general predominance of extensional failure events during brittle deformation in shallow crustal rocks and provide useful insights for better understanding the role played by the different parameters. For example, as a general trend it is observed that the j/f ratio is inversely proportional to depth (viz. confining pressure) and directly proportional to pore-fluid pressure, while the stronger the rock, the wider the range of depths showing a finite value of the j/f ratio and, in general, the deeper the conditions where extension fractures can form. Moreover, the wider the strength variability of rocks (i.e. the lower the m parameter of the Weibull probabilistic distribution function), the wider the depth range where both fractures can form, providing a finite value of the j/f ratio. Natural case studies from different geological and tectonic settings are also used to test the conceptual model and the numerical results, showing a good agreement between measured and predicted j/f ratios.
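A Monte Carlo sketch of the strength-heterogeneity idea, assuming a Griffith-type criterion (extension fracture when the differential stress at failure is below 4T, shear fracture otherwise) and Weibull-distributed tensile strengths; all parameter names and values are illustrative, and the paper's full machinery (healing, stress drops, time-varying remote stresses) is not reproduced.

```python
import numpy as np

def jf_ratio(n=100_000, m=5.0, t0=10.0, s_max=60.0):
    """Element tensile strengths T ~ Weibull (shape m, scale t0, in MPa);
    each failing element is assigned a differential stress at failure, and
    fractures in extension if (s1 - s3) < 4T (Griffith), in shear otherwise.
    Returns the resulting joint-to-fault (j/f) count ratio."""
    T = t0 * np.random.weibull(m, n)           # heterogeneous strengths
    s_diff = np.random.uniform(0.0, s_max, n)  # differential stress at failure
    joints = np.count_nonzero(s_diff < 4.0 * T)
    return joints / max(n - joints, 1)
```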
NASA Technical Reports Server (NTRS)
Gerberich, Matthew W.; Oleson, Steven R.
2013-01-01
The Collaborative Modeling for Parametric Assessment of Space Systems (COMPASS) team at Glenn Research Center has performed integrated system analysis of conceptual spacecraft mission designs since 2006 using a multidisciplinary concurrent engineering process. The set of completed designs was archived in a database to allow for the study of relationships between design parameters. Although COMPASS uses a parametric spacecraft costing model, this research investigated the possibility of using a top-down approach to rapidly estimate the overall vehicle costs. This paper presents the relationships between significant design variables, including breakdowns of dry mass, wet mass, and cost. It also develops a model for a broad estimate of these parameters through basic mission characteristics, including the target location distance, the payload mass, the duration, the delta-v requirement, and the type of mission, propulsion, and electrical power. Finally, this paper examines the accuracy of this model with regard to past COMPASS designs, with an assessment of outlying spacecraft, and compares the results to historical data of completed NASA missions.
Guo, Xuezhen; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W
2014-06-01
Economic analysis of hazard surveillance in livestock production chains is essential for surveillance organizations (such as food safety authorities) when making scientifically based decisions on optimization of resource allocation. To enable this, quantitative decision support tools are required at two levels of analysis: (1) single-hazard surveillance system and (2) surveillance portfolio. This paper addresses the first level by presenting a conceptual approach for the economic analysis of single-hazard surveillance systems. The concept includes objective and subjective aspects of single-hazard surveillance system analysis: (1) a simulation part to derive an efficient set of surveillance setups based on the technical surveillance performance parameters (TSPPs) and the corresponding surveillance costs, i.e., objective analysis, and (2) a multi-criteria decision making model to evaluate the impacts of the hazard surveillance, i.e., subjective analysis. The conceptual approach was checked for (1) conceptual validity and (2) data validity. Issues regarding the practical use of the approach, particularly the data requirement, were discussed. We concluded that the conceptual approach is scientifically credible for economic analysis of single-hazard surveillance systems and that the practicability of the approach depends on data availability. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Bormann, H.; Diekkrüger, B.
2003-04-01
A conceptual model is presented to simulate the water fluxes of regional catchments in Benin (West Africa). The model is applied in the framework of the IMPETUS project (an integrated approach to the efficient management of scarce water resources in West Africa), which aims to assess the effects of environmental and anthropogenic changes on the regional hydrological processes and on the water availability in Benin. In order to assess the effects of decreasing precipitation and increasing human activities on the hydrological processes in the upper Ouémé valley, a scenario analysis is performed to predict possible changes. Therefore a regional hydrological model is proposed which reproduces the recent hydrological processes and which is able to consider the changes of landscape properties. The study presented aims to check the validity of the conceptual and lumped model under the conditions of the subhumid tree savannah and therefore analyses the importance of possible sources of uncertainty. The main focus is on the uncertainties caused by input data, model parameters and model structure. As the model simulates the water fluxes at the catchment outlet of the Térou river (3133 km2) with sufficient quality, first results of a scenario analysis are presented. Changes of interest are the expected future decrease in amount and temporal structure of the precipitation (e.g. minus X percent precipitation during the whole season versus minus X percent precipitation at the end of the rainy season), the decrease in soil water storage capacity caused by erosion, and the increasing consumption of groundwater for drinking water and agricultural purposes. Based on the results obtained, the perspectives of lumped and conceptual models are discussed with special regard to the available management options of this kind of model. Advantages and disadvantages compared to alternative model approaches (process-based, physics-based) are discussed.
A modal parameter extraction procedure applicable to linear time-invariant dynamic systems
NASA Technical Reports Server (NTRS)
Kurdila, A. J.; Craig, R. R., Jr.
1985-01-01
Modal analysis has emerged as a valuable tool in many phases of the engineering design process. Complex vibration and acoustic problems in new designs can often be remedied through use of the method. Moreover, the technique has been used to enhance the conceptual understanding of structures by serving to verify analytical models. A new modal parameter estimation procedure is presented. The technique is applicable to linear, time-invariant systems and accommodates multiple input excitations. In order to provide a background for the derivation of the method, some modal parameter extraction procedures currently in use are described. Key features implemented in the new technique are elaborated upon.
Shimatani, Ichiro Ken; Yoda, Ken; Katsumata, Nobuhiro; Sato, Katsufumi
2012-01-01
To analyze an animal's movement trajectory, a basic model is required that satisfies the following conditions: (1) the model has an ecological basis and the parameters used in the model have ecological interpretations; (2) a broad range of movement patterns can be explained by the model; and (3) the equations and probability distributions in the model are mathematically tractable. Random walk models used in previous studies do not necessarily satisfy these requirements, partly because movement trajectories are often more oriented or tortuous than expected from the models. By improving the modeling of turning angles, this study aims to propose a basic movement model. On the basis of the recently developed circular auto-regressive model, we introduced a new movement model and extended its applicability to capture the asymmetric effects of external factors such as wind. The model was applied to GPS trajectories of a seabird (Calonectris leucomelas) to demonstrate its applicability to various movement patterns and to explain how the model parameters are ecologically interpreted under a general conceptual framework for movement ecology. Although it is based on a simple extension of a generalized linear model to circular variables, the proposed model enables us to evaluate the effects of external factors on movement separately from the animal's internal state. For example, maximum likelihood estimates and model selection suggested that in one homing flight section, the seabird intended to fly toward the island, but misjudged its navigation and was driven off-course by strong winds, while in the subsequent flight section, the seabird reset the focal direction, navigated the flight under strong wind conditions, and succeeded in approaching the island.
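A sketch of the circular autoregressive idea with an asymmetric external (wind) term: the heading at each step is von Mises-distributed around the vector resultant of the previous heading and the wind direction. `beta`, `kappa`, `wind_effect`, and the resultant-mean construction are illustrative assumptions, not the authors' exact parameterization.

```python
import numpy as np

def simulate_track(n, beta, kappa, wind_dir, wind_effect, step=1.0):
    """Simulate headings theta and positions xy: directional persistence
    (weight beta on the previous heading) is deflected toward wind_dir with
    weight wind_effect; kappa is the von Mises concentration (noise level)."""
    theta = np.zeros(n)
    xy = np.zeros((n, 2))
    for t in range(1, n):
        mu = np.arctan2(beta * np.sin(theta[t-1]) + wind_effect * np.sin(wind_dir),
                        beta * np.cos(theta[t-1]) + wind_effect * np.cos(wind_dir))
        theta[t] = np.random.vonmises(mu, kappa)
        xy[t] = xy[t-1] + step * np.array([np.cos(theta[t]), np.sin(theta[t])])
    return theta, xy
```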
A Conceptual Wing Flutter Analysis Tool for Systems Analysis and Parametric Design Study
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivek
2003-01-01
An interactive computer program was developed for wing flutter analysis in the conceptual design stage. The objective was to estimate flutter instability boundaries of a typical wing, when detailed structural and aerodynamic data are not available. Effects of change in key flutter parameters can also be estimated in order to guide the conceptual design. This user-friendly software was developed using MathCad and Matlab codes. The analysis method was based on non-dimensional parametric plots of two primary flutter parameters, namely Regier number and Flutter number, with normalization factors based on wing torsion stiffness, sweep, mass ratio, taper ratio, aspect ratio, center of gravity location and pitch-inertia radius of gyration. These parametric plots were compiled in a Chance-Vought Corporation report from a database of past experiments and wind tunnel test results. An example was presented for conceptual flutter analysis of the outer wing of a Blended-Wing-Body aircraft.
NASA Technical Reports Server (NTRS)
Ebeling, Charles
1991-01-01
The primary objective is to develop a methodology for predicting operational and support parameters and costs of proposed space systems. The first phase consists of: (1) the identification of data sources; (2) the development of a methodology for determining system reliability and maintainability parameters; (3) the implementation of the methodology through the use of prototypes; and (4) support in the development of an integrated computer model. The phase 1 results are documented and a direction is identified to proceed to accomplish the overall objective.
Esralew, Rachel A; Flint, Lorraine; Thorne, James H; Boynton, Ryan; Flint, Alan
2016-07-01
Climate-change adaptation planning for managed wetlands is challenging under uncertain futures when the impact of historic climate variability on wetland response is unquantified. We assessed vulnerability of Modoc National Wildlife Refuge (MNWR) through use of the Basin Characterization Model (BCM) landscape hydrology model, and six global climate models, representing projected wetter and drier conditions. We further developed a conceptual model that provides greater value for water managers by incorporating the BCM outputs into a conceptual framework that links modeled parameters to refuge management outcomes. This framework was used to identify landscape hydrology parameters that reflect refuge sensitivity to changes in (1) climatic water deficit (CWD) and recharge, and (2) the magnitude, timing, and frequency of water inputs. BCM outputs were developed for 1981-2100 to assess changes and forecast the probability of experiencing wet and dry water year types that have historically resulted in challenging conditions for refuge habitat management. We used a Yule's Q skill score to estimate the probability of modeled discharge that best represents historic water year types. CWD increased in all models across 72.3-100 % of the water supply basin by 2100. Earlier timing in discharge, greater cool season discharge, and lesser irrigation season water supply were predicted by most models. Under the worst-case scenario, moderately dry years increased from 10-20 to 40-60 % by 2100. MNWR could adapt by storing additional water during the cool season for later use and prioritizing irrigation of habitats during dry years.
A Bayesian alternative for multi-objective ecohydrological model specification
NASA Astrophysics Data System (ADS)
Tang, Yating; Marshall, Lucy; Sharma, Ashish; Ajami, Hoori
2018-01-01
Recent studies have identified the importance of vegetation processes in terrestrial hydrologic systems. Process-based ecohydrological models combine hydrological, physical, biochemical and ecological processes of the catchments, and as such are generally more complex and parametric than conceptual hydrological models. Thus, appropriate calibration objectives and model uncertainty analysis are essential for ecohydrological modeling. In recent years, Bayesian inference has become one of the most popular tools for quantifying the uncertainties in hydrological modeling with the development of Markov chain Monte Carlo (MCMC) techniques. The Bayesian approach offers an appealing alternative to traditional multi-objective hydrologic model calibrations by defining proper prior distributions that can be considered analogous to the ad-hoc weighting often prescribed in multi-objective calibration. Our study aims to develop appropriate prior distributions and likelihood functions that minimize the model uncertainties and bias within a Bayesian ecohydrological modeling framework based on a traditional Pareto-based model calibration technique. In our study, a Pareto-based multi-objective optimization and a formal Bayesian framework are implemented in a conceptual ecohydrological model that combines a hydrological model (HYMOD) and a modified Bucket Grassland Model (BGM). Simulations focused on one objective (streamflow/LAI) and multiple objectives (streamflow and LAI) with different emphasis defined via the prior distribution of the model error parameters. Results show more reliable outputs for both predicted streamflow and LAI using Bayesian multi-objective calibration with specified prior distributions for error parameters based on results from the Pareto front in the ecohydrological modeling. The methodology implemented here provides insight into the usefulness of multi-objective Bayesian calibration for ecohydrologic systems and the importance of appropriate prior distributions in such approaches.
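A sketch of how two objectives can enter one posterior: independent Gaussian likelihoods for streamflow and LAI, each with its own error variance, so that the priors placed on those variances play the role that ad-hoc weights play in multi-objective calibration. `sim`, `log_prior`, and the parameter layout are illustrative assumptions.

```python
import numpy as np

def log_posterior(params, sim, q_obs, lai_obs, log_prior):
    """Two-objective Bayesian sketch: `sim` maps model parameters theta to
    (q_sim, lai_sim); sig_q and sig_lai are the error standard deviations
    whose priors act as the relative objective weights."""
    theta, sig_q, sig_lai = params[:-2], params[-2], params[-1]
    q_sim, lai_sim = sim(theta)
    ll_q = -0.5 * np.sum((q_obs - q_sim) ** 2 / sig_q ** 2
                         + np.log(2 * np.pi * sig_q ** 2))
    ll_lai = -0.5 * np.sum((lai_obs - lai_sim) ** 2 / sig_lai ** 2
                           + np.log(2 * np.pi * sig_lai ** 2))
    return ll_q + ll_lai + log_prior(theta, sig_q, sig_lai)
```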
Kumar, Ashish; Vercruysse, Jurgen; Vanhoorne, Valérie; Toiviainen, Maunu; Panouillot, Pierre-Emmanuel; Juuti, Mikko; Vervaet, Chris; Remon, Jean Paul; Gernaey, Krist V; De Beer, Thomas; Nopens, Ingmar
2015-04-25
Twin-screw granulation is a promising continuous alternative for traditional batchwise wet granulation processes. The twin-screw granulator (TSG) screws consist of transport and kneading element modules. Therefore, the granulation to a large extent is governed by the residence time distribution within each module, where different granulation rate processes dominate over others. Currently, experimental data are used to determine the residence time distributions. In this study, a conceptual model based on classical chemical engineering methods is proposed to better understand and simulate the residence time distribution in a TSG. The experimental data were compared with the most suitable proposed conceptual model to estimate its parameters and to analyse and predict the effects of changes in the number of kneading discs and their stagger angle, screw speed and powder feed rate on residence time. The study established that the kneading block in the screw configuration acts as a plug-flow zone inside the granulator. Furthermore, it was found that a balance between the throughput force and conveying rate is required to obtain good axial mixing inside the twin-screw granulator. Although the granulation behaviour is different for other excipients, the experimental data collection and modelling methods applied in this study are generic and can be adapted to other excipients. Copyright © 2015 Elsevier B.V. All rights reserved.
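A classical chemical-engineering sketch consistent with the plug-flow finding above: the kneading block modeled as a pure delay in series with a cascade of ideal mixed tanks for the conveying sections. This specific composition is an illustrative assumption, not necessarily the paper's fitted model.

```python
import numpy as np
from scipy.special import gamma as gamma_fn

def rtd_delay_tanks(t, t_plug, tau, n_tanks):
    """E(t) for a plug-flow delay t_plug in series with n_tanks ideal mixed
    tanks of combined mean residence time tau: a shifted gamma density."""
    t = np.asarray(t, dtype=float)
    ts = np.maximum(t - t_plug, 0.0)
    k = n_tanks / tau  # rate constant of each individual tank
    E = k * (k * ts) ** (n_tanks - 1) * np.exp(-k * ts) / gamma_fn(n_tanks)
    return np.where(t <= t_plug, 0.0, E)
```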
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. A. Wasiolek
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).
NASA Astrophysics Data System (ADS)
Zhu, Gaofeng; Li, Xin; Ma, Jinzhu; Wang, Yunquan; Liu, Shaomin; Huang, Chunlin; Zhang, Kun; Hu, Xiaoli
2018-04-01
Sequential Monte Carlo (SMC) samplers have become increasingly popular for estimating the posterior parameter distribution with the non-linear dependency structures and multiple modes often present in hydrological models. However, the explorative capabilities and efficiency of the sampler depend strongly on the efficiency of the move step of the SMC sampler. In this paper we present a new SMC sampler entitled the Particle Evolution Metropolis Sequential Monte Carlo (PEM-SMC) algorithm, which is well suited to handle the unknown static parameters of hydrologic models. The PEM-SMC sampler is inspired by the works of Liang and Wong (2001) and operates by incorporating the strengths of the genetic algorithm, the differential evolution algorithm and the Metropolis-Hastings algorithm into the framework of SMC. We also prove that the sampler admits the target distribution as a stationary distribution. Two case studies, including a multi-dimensional bimodal normal distribution and a conceptual rainfall-runoff hydrologic model, first considering only parameter uncertainty and then simultaneously considering parameter and input uncertainty, show that the PEM-SMC sampler is generally superior to other popular SMC algorithms in handling high-dimensional problems. The study also indicated that it may be important to account for model structural uncertainty by using multiple different hydrological models in the SMC framework in future studies.
Retrofitting activated sludge systems to intermittent aeration for nitrogen removal.
Hanhan, O; Artan, N; Orhon, D
2002-01-01
The paper provides the basis and the conceptual approach for applying process kinetics and modelling to the design of alternating activated sludge systems, for retrofitting existing activated sludge plants to intermittent aeration for nitrogen removal. It shows, through model simulations, the significant role of two specific parameters, namely the aerated fraction and the cycle time ratio, in process performance, and proposes a way to incorporate them into a design procedure using process stoichiometry and mass balance. It illustrates the effect of these parameters, together with the sludge age, in establishing the balance between the denitrification potential and the available nitrogen created in the anoxic/aerobic sequences of system operation.
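The abstract does not define the two parameters, so the following definitions are assumptions for illustration: with $t_{\mathrm{aer}}$ the aerated and $t_{\mathrm{anox}}$ the non-aerated part of an aeration cycle, the aerated fraction and a cycle time ratio (cycle length relative to the hydraulic retention time $\theta_h$) can be written

$$f_A = \frac{t_{\mathrm{aer}}}{t_{\mathrm{aer}} + t_{\mathrm{anox}}}, \qquad r_C = \frac{t_{\mathrm{aer}} + t_{\mathrm{anox}}}{\theta_h}.$$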
ERIC Educational Resources Information Center
Koziol, Natalie A.
2016-01-01
Testlets, or groups of related items, are commonly included in educational assessments due to their many logistical and conceptual advantages. Despite their advantages, testlets introduce complications into the theory and practice of educational measurement. Responses to items within a testlet tend to be correlated even after controlling for…
NASA Astrophysics Data System (ADS)
Lewis, Elizabeth; Kilsby, Chris; Fowler, Hayley
2014-05-01
The impact of climate change on hydrological systems requires further quantification in order to inform water management. This study intends to conduct such analysis using hydrological models. Such models are of varying forms, of which conceptual, lumped parameter models and physically-based models are two important types. The majority of hydrological studies use conceptual models calibrated against measured river flow time series in order to represent catchment behaviour. This method often shows impressive results for specific problems in gauged catchments. However, the results may not be robust under non-stationary conditions such as climate change, as physical processes and relationships amenable to change are not accounted for explicitly. Moreover, conceptual models are less readily applicable to ungauged catchments, in which hydrological predictions are also required. As such, the physically based, spatially distributed model SHETRAN is used in this study to develop a robust and reliable framework for modelling historic and future behaviour of gauged and ungauged catchments across the whole of Great Britain. In order to achieve this, a large array of data completely covering Great Britain for the period 1960-2006 has been collated and efficiently stored ready for model input. The data processed include a DEM, rainfall, PE and maps of geology, soil and land cover. A desire to make the modelling system easy for others to work with led to the development of a user-friendly graphical interface. This allows non-experts to set up and run a catchment model in a few seconds, a process that can normally take weeks or months. The quality and reliability of the extensive dataset for modelling hydrological processes has also been evaluated. One aspect of this has been an assessment of error and uncertainty in rainfall input data, as well as the effects of temporal resolution in precipitation inputs on model calibration. SHETRAN has been updated to accept gridded rainfall inputs, and UKCP09 gridded daily rainfall data has been disaggregated using hourly records to analyse the implications of using realistic sub-daily variability. Furthermore, the development of a comprehensive dataset and computationally efficient means of setting up and running catchment models has allowed for examination of how a robust parameter scheme may be derived. This analysis has been based on collective parameterisation of multiple catchments in contrasting hydrological settings and subject to varied processes. 350 gauged catchments all over the UK have been simulated, and a robust set of parameters is being sought by examining the full range of hydrological processes and calibrating to a highly diverse flow data series. The modelling system will be used to generate flow time series based on historical input data and also downscaled Regional Climate Model (RCM) forecasts using the UKCP09 Weather Generator. This will allow for analysis of flow frequency and associated future changes, which cannot be determined from the instrumental record or from lumped parameter model outputs calibrated only to historical catchment behaviour. This work will be based on the existing and functional modelling system described following some further improvements to calibration, particularly regarding simulation of groundwater-dominated catchments.
Collaborative Research: Robust Climate Projections and Stochastic Stability of Dynamical Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ilya Zaliapin
This project focused on conceptual exploration of El Nino/Southern Oscillation (ENSO) variability and sensitivity using a Delay Differential Equation model developed in the project. We have (i) established the existence and continuous dependence of solutions of the model, (ii) explored multiple model solutions and the distribution of solution extrema, and (iii) established and explored the phase-locking phenomenon and the existence of multiple solutions for the same values of model parameters. In addition, we have applied to our model the concept of a pullback attractor, which greatly facilitated predictive understanding of the nonlinear model's behavior.
Is our medical school socially accountable? The case of Faculty of Medicine, Suez Canal University.
Hosny, Somaya; Ghaly, Mona; Boelen, Charles
2015-04-01
Faculty of Medicine, Suez Canal University (FOM/SCU) was established as a community-oriented school with innovative educational strategies. Social accountability represents the commitment of the medical school towards the community it serves. The objective was to assess FOM/SCU's compliance with social accountability using the "Conceptualization, Production, Usability" (CPU) model. FOM/SCU's practice was reviewed against the CPU model parameters. The CPU consists of three domains, 11 sections and 31 parameters. Data were collected through unstructured interviews with the main stakeholders and a review of documents from 2005 to 2013. FOM/SCU shows general compliance with the three domains of the CPU. Very good compliance was shown with the "P" domain of the model through FOM/SCU's innovative educational system, students and faculty members. More work is needed on the "C" and "U" domains. FOM/SCU complies with many parameters of the CPU model; however, more work should be accomplished to comply with some items in the C and U domains so that FOM/SCU can be recognized as a proactive socially accountable school.
Time-varying parameter models for catchments with land use change: the importance of model structure
NASA Astrophysics Data System (ADS)
Pathiraja, Sahani; Anghileri, Daniela; Burlando, Paolo; Sharma, Ashish; Marshall, Lucy; Moradkhani, Hamid
2018-05-01
Rapid population and economic growth in Southeast Asia has been accompanied by extensive land use change with consequent impacts on catchment hydrology. Modeling methodologies capable of handling changing land use conditions are therefore becoming ever more important and are receiving increasing attention from hydrologists. A recently developed data-assimilation-based framework that allows model parameters to vary through time in response to signals of change in observations is considered for a medium-sized catchment (2880 km2) in northern Vietnam experiencing substantial but gradual land cover change. We investigate the efficacy of the method as well as the importance of the chosen model structure in ensuring the success of a time-varying parameter method. The method was used with two lumped daily conceptual models (HBV and HyMOD) that gave good-quality streamflow predictions during pre-change conditions. Although both time-varying parameter models gave improved streamflow predictions under changed conditions compared to the time-invariant parameter model, persistent biases for low flows were apparent in the HyMOD case. It was found that HyMOD was not suited to representing the modified baseflow conditions, resulting in extreme and unrealistic time-varying parameter estimates. This work shows that the chosen model can be critical for ensuring the time-varying parameter framework successfully models streamflow under changing land cover conditions. It can also be used to determine whether land cover changes (and not just meteorological factors) contribute to the observed hydrologic changes in retrospective studies where the lack of a paired control catchment precludes such an assessment.
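A minimal sketch of one cycle of a data-assimilation scheme in which parameters are allowed to drift, assuming a particle-based filter with random-walk parameter evolution; the paper's specific framework is not reproduced, and `simulate`, `obs_err_sd`, and the jitter scale `s` are illustrative.

```python
import numpy as np

def assimilation_cycle(particles, weights, obs, simulate, obs_err_sd, s=0.02):
    """One cycle of a time-varying-parameter sketch: parameters evolve by a
    small random walk (allowing drift under land use change) and are
    reweighted against the newest streamflow observation; `simulate` maps a
    parameter vector to a predicted flow."""
    particles = particles + s * np.random.randn(*particles.shape)  # random walk
    q_pred = np.array([simulate(p) for p in particles])
    weights = weights * np.exp(-0.5 * ((obs - q_pred) / obs_err_sd) ** 2)
    weights /= weights.sum()
    # resample when the effective sample size collapses
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(weights):
        idx = np.random.choice(len(weights), len(weights), p=weights)
        particles = particles[idx]
        weights = np.full(len(weights), 1.0 / len(weights))
    return particles, weights
```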
The Freter model: a simple model of biofilm formation.
Jones, Don; Kojouharov, Hristo V; Le, Dung; Smith, Hal
2003-08-01
A simple, conceptual model of biofilm formation, due to R. Freter et al. (1983), is studied analytically and numerically in both a continuously stirred tank reactor (CSTR) and a plug-flow reactor (PFR). Two steady-state regimes are identified, namely, the complete washout of the microbes from the reactor and the successful colonization of both the wall and the bulk fluid. One of these is stable for any particular set of parameter values, and sharp, explicit conditions are given for the stability of each. The effects of adding an anti-microbial agent to the CSTR are examined.
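An illustrative ODE sketch in the spirit of a chemostat-with-wall-attachment model (not the Freter model's exact equations): substrate S, suspended bacteria u subject to washout, and wall-attached bacteria w, with attachment and wall growth limited by free wall sites. Parameter names and values are assumptions, and features such as shedding of excess wall growth into the fluid are omitted.

```python
import numpy as np
from scipy.integrate import solve_ivp

def wall_attachment_rhs(t, y, D, S0, mmax, K, gamma, alpha, beta, wmax):
    """S: substrate, u: suspended microbes (washed out at dilution rate D),
    w: wall-attached microbes; Monod growth f(S), attachment rate alpha
    limited by free wall sites, sloughing rate beta, yield gamma."""
    S, u, w = y
    f = mmax * S / (K + S)              # Monod specific growth rate
    free = max(1.0 - w / wmax, 0.0)     # fraction of unoccupied wall sites
    dS = D * (S0 - S) - f * (u + w) / gamma
    du = (f - D) * u - alpha * u * free + beta * w
    dw = alpha * u * free + f * w * free - beta * w
    return [dS, du, dw]

# illustrative run: washout vs. colonization depends on the parameter choice
sol = solve_ivp(wall_attachment_rhs, (0.0, 300.0), [1.0, 0.01, 0.0],
                args=(0.3, 1.0, 1.0, 0.2, 0.8, 0.1, 0.05, 1.0))
```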
Neural Networks for Hydrological Modeling Tool for Operational Purposes
NASA Astrophysics Data System (ADS)
Bhatt, Divya; Jain, Ashu
2010-05-01
Hydrological models are useful in many water resources applications such as flood control, irrigation and drainage, hydropower generation, water supply, and erosion and sediment control. Estimates of runoff are needed in many water resources planning, design, development, operation, and maintenance activities. Runoff is generally computed using rainfall-runoff models, and computer-based hydrologic models have become popular for obtaining hydrological forecasts and for managing water systems. The rainfall-runoff library (RRL) is computer software developed by the Cooperative Research Centre for Catchment Hydrology (CRCCH), Australia; it consists of five different conceptual rainfall-runoff models and has been used in many water resources applications in Australia. Recently, soft artificial intelligence tools such as Artificial Neural Networks (ANNs) have become popular for research purposes but have not been adopted in operational hydrological forecasting. There is a strong need to develop ANN models based on real catchment data and compare them with the conceptual models actually in use in real catchments. In this paper, the results from an investigation on the use of the RRL and ANNs are presented. Out of the five conceptual models in the RRL toolkit, the SimHyd model has been used. A genetic algorithm (GA) served as the optimizer in the RRL to calibrate the SimHyd model, and trial-and-error procedures were employed to arrive at the best values of the various parameters involved in the GA optimizer. The results obtained from the best configuration of the SimHyd model are presented here. A feed-forward neural network trained with the back-propagation algorithm was adopted to develop the ANN models. The daily rainfall and runoff data derived from Bird Creek Basin, Oklahoma, USA were employed to develop all the models included here, and a wide range of error statistics was used to evaluate their performance. The ANN models consistently outperformed the conceptual model developed in this study. These results indicate that ANNs can be extremely useful tools for modeling the complex rainfall-runoff process in real catchments and should be considered for operational hydrological modeling and forecasting. It is hoped that further comparisons of ANN models with the conceptual models actually in use at catchment scales will go a long way in making ANNs more acceptable to policy makers, water resources decision makers, and traditional hydrologists.
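The following sketch illustrates the kind of ANN rainfall-runoff setup described: lagged rainfall and lagged flow as inputs to a small feed-forward network (trained by gradient back-propagation via scikit-learn), scored with the Nash-Sutcliffe efficiency. The synthetic data, lag structure, and network size are assumptions, not the Bird Creek configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic daily data standing in for observed rainfall/runoff (illustrative).
n = 1000
rain = rng.gamma(0.4, 10.0, n)
flow = np.convolve(rain, [0.1, 0.3, 0.25, 0.15], mode="full")[:n]
flow += rng.normal(0, 0.2, n)

# Typical ANN rainfall-runoff inputs: current and lagged rainfall, lagged flow.
lags = 3
X = np.column_stack([rain[lags - i:n - i] for i in range(lags + 1)]
                    + [flow[lags - 1:n - 1]])
y = flow[lags:]

split = int(0.7 * len(y))
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000, random_state=0)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
obs = y[split:]
nse = 1 - np.sum((obs - pred) ** 2) / np.sum((obs - np.mean(obs)) ** 2)
print(f"Nash-Sutcliffe efficiency on held-out data: {nse:.3f}")
```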
NASA Astrophysics Data System (ADS)
Xu, Chong-yu; Tunemar, Liselotte; Chen, Yongqin David; Singh, V. P.
2006-06-01
The sensitivity of hydrological models to input data errors has been reported in the literature for particular models on a single catchment or a few catchments. A more important issue, namely how a model's response to input data errors changes as catchment conditions change, has not been addressed previously. This study investigates the seasonal and spatial effects of precipitation data errors on the performance of conceptual hydrological models. For this study, a monthly conceptual water balance model, NOPEX-6, was applied to 26 catchments in the Mälaren basin in Central Sweden. Both systematic and random errors were considered. For the systematic errors, 5-15% of mean monthly precipitation values were added to the original precipitation to form the corrupted input scenarios. Random values were generated by Monte Carlo simulation and were assumed to be (1) independent between months, and (2) distributed according to a Gaussian law of zero mean and constant standard deviation, taken as 5, 10, 15, 20, and 25% of the mean monthly standard deviation of precipitation. The results show that the response of the model parameters and model performance depends, among other factors, on the type of error, the magnitude of the error, the physical characteristics of the catchment, and the season of the year. In particular, the model appears less sensitive to random errors than to systematic errors. Catchments with smaller runoff coefficients were more influenced by input data errors than were catchments with higher values. Dry months were more sensitive to precipitation errors than were wet months. Recalibration of the model with erroneous data compensated in part for the data errors by altering the model parameters.
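A minimal sketch of the two input-corruption schemes described (systematic offsets of 5-15% of the mean, and independent Gaussian noise scaled to the standard deviation of monthly precipitation); the precipitation series itself is synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
# Monthly precipitation for one illustrative catchment (mm), 10 years.
p = rng.gamma(3.0, 20.0, 120)

# Systematic error: add a fixed percentage of the mean monthly value.
for bias in (0.05, 0.10, 0.15):
    p_sys = p + bias * p.mean()
    print(f"systematic +{bias:.0%}: mean shifts {p_sys.mean() - p.mean():.1f} mm")

# Random error: independent Gaussian noise, zero mean, standard deviation set
# as a fraction of the monthly standard deviation (as in the study design).
for frac in (0.05, 0.10, 0.25):
    p_rnd = p + rng.normal(0.0, frac * p.std(), size=p.size)
    print(f"random sd={frac:.0%}: corrupted sd {p_rnd.std():.1f} vs {p.std():.1f}")
```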
NASA Technical Reports Server (NTRS)
DeMott, Diana; Fuqua, Bryan; Wilson, Paul
2013-01-01
Once a project obtains approval, decision makers have to consider a variety of alternative paths for completing the project and meeting the project objectives. How decisions are made involves a variety of elements, including cost, experience, current technology, ideologies, politics, future needs and desires, capabilities, manpower, timing, and available information; for many ventures, management also needs to weigh risk against reward. The use of high-level Probabilistic Risk Assessment (PRA) models during conceptual design phases provides management with additional information during the decision-making process regarding the risk potential of proposed operations and design prototypes. The methodology can be used as a tool to: 1) allow trade studies to compare alternatives based on risk, 2) determine which elements (equipment, process, or operational parameters) drive the risk, and 3) provide information to mitigate or eliminate risks early in the conceptual design to lower costs. Creating system models using conceptual design proposals and generic key systems based on what is known today can provide an understanding of the magnitudes of proposed system and operational risks and facilitates trade study comparisons early in the decision-making process. Identifying the "best" way to achieve the desired results is difficult and generally occurs based on limited information. PRA provides a tool for decision makers to explore how some decisions will affect risk before the project is committed to that path, which can ultimately save time and money.
Plant–herbivore–decomposer stoichiometric mismatches and nutrient cycling in ecosystems
Cherif, Mehdi; Loreau, Michel
2013-01-01
Plant stoichiometry is thought to have a major influence on how herbivores affect nutrient availability in ecosystems. Most conceptual models predict that plants with high nutrient contents increase nutrient excretion by herbivores, in turn raising nutrient availability. To test this hypothesis, we built a stoichiometrically explicit model that includes a simple but thorough description of the processes of herbivory and decomposition. Our results challenge traditional views of herbivore impacts on nutrient availability in many ways. They show that the relationship between plant nutrient content and the impact of herbivores predicted by conceptual models holds only at high plant nutrient contents. At low plant nutrient contents, the impact of herbivores is mediated by the mineralization/immobilization of nutrients by decomposers and by the type of resource limiting the growth of decomposers. Both parameters are functions of the mismatch between plant and decomposer stoichiometries. Our work provides new predictions about the impacts of herbivores on ecosystem fertility that depend on critical interactions between plant, herbivore and decomposer stoichiometries in ecosystems. PMID:23303537
Reducing structural uncertainty in conceptual hydrological modeling in the semi-arid Andes
NASA Astrophysics Data System (ADS)
Hublart, P.; Ruelland, D.; Dezetter, A.; Jourde, H.
2014-10-01
The use of lumped, conceptual models in hydrological impact studies requires placing more emphasis on the uncertainty arising from deficiencies and/or ambiguities in the model structure. This study provides an opportunity to combine a multiple-hypothesis framework with a multi-criteria assessment scheme to reduce structural uncertainty in the conceptual modeling of a meso-scale Andean catchment (1515 km2) over a 30 year period (1982-2011). The modeling process was decomposed into six model-building decisions related to the following aspects of the system behavior: snow accumulation and melt, runoff generation, redistribution and delay of water fluxes, and natural storage effects. Each of these decisions was provided with a set of alternative modeling options, resulting in a total of 72 competing model structures. These structures were calibrated using the concept of Pareto optimality with three criteria pertaining to streamflow simulations and one to the seasonal dynamics of snow processes. The results were analyzed in the four-dimensional space of performance measures using a fuzzy c-means clustering technique and a differential split sample test, leading to the identification of 14 equally acceptable model hypotheses. A filtering approach was then applied to these best-performing structures in order to minimize the overall uncertainty envelope while maximizing the number of enclosed observations, leading to the retention of 8 model hypotheses as a representation of the minimum structural uncertainty that could be obtained with this modeling framework. Future work to better consider model predictive uncertainty should include a proper assessment of parameter equifinality and data errors, as well as the testing of new or refined hypotheses to allow for the use of additional auxiliary observations.
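As a rough illustration of the envelope-filtering step, the sketch below searches 8-member subsets of 14 candidate structures for a trade-off between observation coverage and envelope width; the synthetic data, the scoring weight, and the exhaustive search are assumptions for illustration, not the paper's actual procedure.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(7)
# Simulated flows from 14 acceptable structures (rows) plus observations.
n_models, n_t = 14, 200
obs = 10 + 3 * np.sin(np.linspace(0, 12, n_t)) + rng.normal(0, 0.5, n_t)
sims = obs + rng.normal(0, 1.0, (n_models, n_t)) + rng.normal(0, 0.5, (n_models, 1))

def score(subset):
    """Illustrative trade-off: coverage of obs minus mean envelope width."""
    lo, hi = sims[subset].min(axis=0), sims[subset].max(axis=0)
    coverage = np.mean((obs >= lo) & (obs <= hi))
    width = np.mean(hi - lo)
    return coverage - 0.1 * width, coverage, width

best = max((score(list(c)) + (c,) for c in combinations(range(n_models), 8)),
           key=lambda r: r[0])
print(f"best 8-model subset covers {best[1]:.0%} of obs, "
      f"mean envelope width {best[2]:.2f}")
```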
Reducing structural uncertainty in conceptual hydrological modelling in the semi-arid Andes
NASA Astrophysics Data System (ADS)
Hublart, P.; Ruelland, D.; Dezetter, A.; Jourde, H.
2015-05-01
The use of lumped, conceptual models in hydrological impact studies requires placing more emphasis on the uncertainty arising from deficiencies and/or ambiguities in the model structure. This study provides an opportunity to combine a multiple-hypothesis framework with a multi-criteria assessment scheme to reduce structural uncertainty in the conceptual modelling of a mesoscale Andean catchment (1515 km2) over a 30-year period (1982-2011). The modelling process was decomposed into six model-building decisions related to the following aspects of the system behaviour: snow accumulation and melt, runoff generation, redistribution and delay of water fluxes, and natural storage effects. Each of these decisions was provided with a set of alternative modelling options, resulting in a total of 72 competing model structures. These structures were calibrated using the concept of Pareto optimality with three criteria pertaining to streamflow simulations and one to the seasonal dynamics of snow processes. The results were analyzed in the four-dimensional (4-D) space of performance measures using a fuzzy c-means clustering technique and a differential split sample test, leading to the identification of 14 equally acceptable model hypotheses. A filtering approach was then applied to these best-performing structures in order to minimize the overall uncertainty envelope while maximizing the number of enclosed observations, leading to the retention of eight model hypotheses as a representation of the minimum structural uncertainty that could be obtained with this modelling framework. Future work to better consider model predictive uncertainty should include a proper assessment of parameter equifinality and data errors, as well as the testing of new or refined hypotheses to allow for the use of additional auxiliary observations.
NASA Astrophysics Data System (ADS)
Tang, Ting; Seuntjens, Piet; van Griensven, Ann; Bronders, Jan
2016-04-01
Urban areas can significantly contribute to pesticide contamination in surface water. However, pesticide behaviours in urban areas, particularly on hard surfaces, are far less studied than those in agricultural areas. Pesticide application on hard surfaces (e.g. roadsides and walkways) is of particular concern due to the high imperviousness and therefore high pesticide runoff potential. Experimental studies have shown that pesticide behaviours on and interactions with hard surfaces are important factors controlling the pesticide runoff potential, and therefore the magnitude and timing of peak concentrations in surface water. We conceptualized pesticide behaviours on hard surfaces and incorporated the conceptualization into a new pesticide runoff model. The pesticide runoff model was implemented in the catchment hydrological model WetSpa-Python (Water and Energy Transfer between Soil, Plants and Atmosphere, Python version). The conceptualization for pesticide processes on hard surfaces accounts for the differences in pesticide behaviour on different hard surfaces. Four parameters are used to describe the partitioning and wash-off of each pesticide on hard surfaces. We tested the conceptualization using an experimental dataset for five pesticides on two types of hard surfaces, namely concrete and asphalt. The conceptualization gave good performance in reproducing the wash-off pattern for the modelled pesticides and surfaces, according to quantitative evaluations using the Nash-Sutcliffe efficiency and percent bias. The resulting pesticide runoff model WetSpa-PST (WetSpa for PeSTicides) can simulate pesticides and their metabolites at the catchment scale. Overall, it includes four groups of pesticide processes, namely pesticide application, pesticide interception by plant foliage, pesticide processes on land surfaces (including partitioning, degradation and wash-off on hard surfaces; partitioning, dissipation, infiltration and runoff in soil) and pesticide processes in depression storage (including degradation, infiltration and runoff). Processes on hard surfaces employ the conceptualization described above. The WetSpa-PST model can account for various spatial details of the urban features in a catchment, such as asphalt, concrete and roof areas. The distributed feature also allows users to input detailed pesticide application data of both non-point and point origins. Thanks to the Python modelling framework prototype used in the WetSpa-Python model, processes in the WetSpa-PST model can be simulated at different time steps depending on data availability and the characteristic temporal scale of each process. This helps to increase the computational accuracy during heavy rainfall events, especially for the associated fast transport of pesticides into surface water. Overall, the WetSpa-PST model has good potential for predicting the effects of management options on pesticide releases from heavily urbanized catchments.
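A hedged sketch of a hard-surface wash-off formulation of the general kind described (a washable fraction set by partitioning, exponential wash-off driven by rainfall, first-order decay); the parameter names and values are illustrative assumptions, not the four WetSpa-PST parameters.

```python
import numpy as np

# Illustrative per-surface pesticide partitioning and wash-off sketch
# (names and values are assumptions, not the WetSpa-PST parameterization).
def washoff(mass_applied, rain, f_washable, k_washoff, k_decay, dt=1.0):
    """Track washable and bound pools on a hard surface through a rain series."""
    washable = f_washable * mass_applied        # partitioning at application
    bound = mass_applied - washable
    losses = []
    for r in rain:
        flux = washable * (1 - np.exp(-k_washoff * r * dt))  # event wash-off
        washable = (washable - flux) * np.exp(-k_decay * dt) # first-order decay
        bound *= np.exp(-k_decay * dt)
        losses.append(flux)
    return np.array(losses)

rain = np.array([0, 5, 0, 12, 2, 0, 8])                      # mm per step
print(washoff(100.0, rain, f_washable=0.6, k_washoff=0.08, k_decay=0.02).round(2))
```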
Computer simulation of storm runoff for three watersheds in Albuquerque, New Mexico
Knutilla, R.L.; Veenhuis, J.E.
1994-01-01
Rainfall-runoff data from three watersheds were selected for calibration and verification of the U.S. Geological Survey's Distributed Routing Rainfall-Runoff Model. The watersheds chosen are residentially developed. The conceptually based model uses an optimization process that adjusts selected parameters to achieve the best fit between measured and simulated runoff volumes and peak discharges. Three of these optimization parameters represent soil-moisture conditions, three represent infiltration, and one accounts for effective impervious area. Each watershed modeled was divided into overland-flow segments and channel segments. The overland-flow segments were further subdivided to reflect pervious and impervious areas. Each overland-flow and channel segment was assigned representative values of area, slope, percentage of imperviousness, and roughness coefficients. Rainfall-runoff data for each watershed were separated into two sets for use in calibration and verification. For model calibration, seven input parameters were optimized to attain a best fit of the data. For model verification, parameter values were set using values from model calibration. The standard error of estimate for calibration of runoff volumes ranged from 19 to 34 percent, and for peak discharge calibration ranged from 27 to 44 percent. The standard error of estimate for verification of runoff volumes ranged from 26 to 31 percent, and for peak discharge verification ranged from 31 to 43 percent.
Parametric study of a canard-configured transport using conceptual design optimization
NASA Technical Reports Server (NTRS)
Arbuckle, P. D.; Sliwa, S. M.
1985-01-01
Constrained-parameter optimization is used to perform optimal conceptual design of both canard and conventional configurations of a medium-range transport. A number of design constants and design constraints are systematically varied to compare the sensitivities of canard and conventional configurations to a variety of technology assumptions. Main-landing-gear location and canard surface high-lift performance are identified as critical design parameters for a statically stable, subsonic, canard-configured transport.
Trybula, Elizabeth M.; Cibin, Raj; Burks, Jennifer L.; ...
2014-06-13
The Soil and Water Assessment Tool (SWAT) is increasingly used to quantify hydrologic and water quality impacts of bioenergy production, but crop-growth parameters for candidate perennial rhizomatous grasses (PRG) Miscanthus × giganteus and upland ecotypes of Panicum virgatum (switchgrass) are limited by the availability of field data. Crop-growth parameter ranges and suggested values were developed in this study using agronomic and weather data collected at the Purdue University Water Quality Field Station in northwestern Indiana. During the process of parameterization, the comparison of measured data with conceptual representation of PRG growth in the model led to three changes in the SWAT 2009 code: the harvest algorithm was modified to maintain belowground biomass over winter, plant respiration was extended via modified-DLAI to better reflect maturity and leaf senescence, and nutrient uptake algorithms were revised to respond to temperature, water, and nutrient stress. Parameter values and changes to the model resulted in simulated biomass yield and leaf area index consistent with reported values for the region. Code changes in the SWAT model improved nutrient storage during dormancy period and nitrogen and phosphorus uptake by both switchgrass and Miscanthus.
A climate responsive urban design tool: a platform to improve energy efficiency in a dry hot climate
NASA Astrophysics Data System (ADS)
El Dallal, Norhan; Visser, Florentine
2017-09-01
In the Middle East and North Africa (MENA) region, new urban developments should address the climatic conditions to improve outdoor comfort and to reduce the energy consumption of buildings. This article describes a design tool that supports climate responsive design for a dry hot climate. The approach takes the climate as an initiator of the conceptual urban form, yielding a more energy-efficient urban morphology. The methodology relates the different passive strategies suitable for the major climate conditions in the MENA region (dry-hot) to the design parameters that create the urban form. This parametric design approach is the basis for a tool that generates conceptual climate responsive urban forms so as to assist the urban designer early in the design process. Various conceptual scenarios, generated by a computational model, are the results of the proposed platform. A practical application of the approach is conducted on a New Urban Community in Aswan (Egypt), demonstrating the economic feasibility of the resulting urban form and morphology and of the proposed tool.
Evolution of the conceptual model of unsaturated zone hydrology at Yucca Mountain, Nevada
NASA Astrophysics Data System (ADS)
Flint, Alan L.; Flint, Lorraine E.; Bodvarsson, Gudmundur S.; Kwicklis, Edward M.; Fabryka-Martin, June
2001-06-01
Yucca Mountain is an arid site proposed for consideration as the United States' first underground high-level radioactive waste repository. Low rainfall (approximately 170 mm/yr) and a thick unsaturated zone (500-1000 m) are important physical attributes of the site because the quantity of water likely to reach the waste and the paths and rates of movement of the water to the saturated zone under future climates would be major factors in controlling the concentrations and times of arrival of radionuclides at the surrounding accessible environment. The framework for understanding the hydrologic processes that occur at this site and that control how quickly water will penetrate through the unsaturated zone to the water table has evolved during the past 15 yr. Early conceptual models assumed that very small volumes of water infiltrated into the bedrock (0.5-4.5 mm/yr, or 2-3 percent of rainfall), that much of the infiltrated water flowed laterally within the upper nonwelded units because of capillary barrier effects, and that the remaining water flowed down faults with a small amount flowing through the matrix of the lower welded, fractured rocks. It was believed that the matrix had to be saturated for fractures to flow. However, accumulating evidence, such as infiltration modeling based on neutron borehole data, bomb-pulse isotopes deep in the mountain, perched water analyses, and thermal analyses, indicated that infiltration rates were higher than initially estimated. Mechanisms supporting lateral diversion did not apply at these higher fluxes, and the flux calculated in the lower welded unit exceeded the conductivity of the matrix, implying vertical flow of water in the high-permeability fractures of the potential repository host rock, and disequilibrium between matrix and fracture water potentials. The development of numerical modeling methods and parameter values evolved concurrently with the conceptual model in order to account for the observed field data, particularly fracture flow deep in the unsaturated zone. This paper presents the history of the evolution of conceptual models of hydrology and numerical models of unsaturated zone flow at Yucca Mountain, Nevada (Flint, A.L., Flint, L.E., Kwicklis, E.M., Bodvarsson, G.S., Fabryka-Martin, J.M., 2001. Hydrology of Yucca Mountain. Reviews of Geophysics, in press). This retrospective is the basis for recommendations for optimizing the efficiency with which a viable and robust conceptual model can be developed for a complex site.
Fienen, Michael N.; D'Oria, Marco; Doherty, John E.; Hunt, Randall J.
2013-01-01
The application bgaPEST is a highly parameterized inversion software package implementing the Bayesian Geostatistical Approach in a framework compatible with the parameter estimation suite PEST. Highly parameterized inversion refers to cases in which parameters are distributed in space or time and are correlated with one another. The Bayesian aspect of bgaPEST is related to Bayesian probability theory, in which prior information about parameters is formally revised on the basis of the calibration dataset used for the inversion. Conceptually, this approach formalizes the conditionality of estimated parameters on the specific data and model available. The geostatistical component of the method refers to the way in which prior information about the parameters is used: a geostatistical autocorrelation function is used to enforce structure on the parameters to avoid overfitting and unrealistic results. The Bayesian Geostatistical Approach is designed to provide the smoothest solution that is consistent with the data. Optionally, users can specify a level of fit or estimate a balance between fit and model complexity informed by the data. Groundwater and surface-water applications are used as examples in this text, but the possible uses of bgaPEST extend to any distributed parameter applications.
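The sketch below shows the core geostatistical-regularization idea in its simplest linear form: a prior covariance with an exponential autocorrelation function yields the smoothest parameter field consistent with the data. The forward model and all values are illustrative assumptions, not bgaPEST's implementation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Estimate a spatially distributed parameter field by balancing data misfit
# against a prior covariance enforcing spatial autocorrelation (illustrative).
nx = 50
x = np.linspace(0, 1, nx)
Q = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)   # exponential covariance
truth = np.linalg.cholesky(Q + 1e-8 * np.eye(nx)) @ rng.normal(size=nx)

H = rng.normal(size=(12, nx)) / np.sqrt(nx)          # linearized forward model
d = H @ truth + rng.normal(0, 0.05, 12)              # noisy observations
R = 0.05**2 * np.eye(12)

# Bayesian linear update (zero prior mean):
# posterior mean = Q H^T (H Q H^T + R)^{-1} d
m_post = Q @ H.T @ np.linalg.solve(H @ Q @ H.T + R, d)
print("worst data misfit:", np.round(np.abs(H @ m_post - d).max(), 3))
```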
Models based on value and probability in health improve shared decision making.
Ortendahl, Monica
2008-10-01
Diagnostic reasoning and treatment decisions are a key competence of doctors. A model based on values and probabilities provides a conceptual framework for clinical judgments and decisions, and also facilitates the integration of clinical and biomedical knowledge into a diagnostic decision. Both value and probability are usually estimates in clinical decision making. Therefore, model assumptions and parameter estimates should be continually assessed against data, and models should be revised accordingly. Introducing parameter estimates for both value and probability, as usually pertains in clinical work, yields the model known as subjective expected utility. Estimated values and probabilities are involved sequentially at every step in the decision-making process. Introducing decision-analytic modelling gives a more complete picture of the variables that influence the decisions carried out by the doctor and the patient. A model revised for the values and probabilities perceived by both the doctor and the patient could be used as a tool for engaging in a mutual and shared decision-making process in clinical work.
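A minimal worked example of the subjective-expected-utility calculation, with purely illustrative probabilities and values:

```python
# Choose the option with the highest sum of probability-weighted values.
options = {
    "treat":    [(0.70, 0.9), (0.30, 0.4)],   # (probability, value) pairs
    "watchful": [(0.95, 0.6), (0.05, 0.2)],
}

def expected_utility(outcomes):
    return sum(p * v for p, v in outcomes)

for name, outcomes in options.items():
    print(name, round(expected_utility(outcomes), 3))
# Revising p or v (the doctor's and the patient's estimates) can flip the
# preferred option, which is why the model supports shared decision making.
```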
Handling Qualities Optimization for Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Lawrence, Ben; Theodore, Colin R.; Berger, Tom
2016-01-01
Over the past decade, NASA, under a succession of rotary-wing programs, has been moving towards coupling multiple discipline analyses in a rigorous, consistent manner to evaluate rotorcraft conceptual designs. Handling qualities is one of the component analyses to be included in a future NASA Multidisciplinary Analysis and Optimization framework for conceptual design of VTOL aircraft. Similarly, the future vision for the capability of the Concept Design and Assessment Technology Area (CD&A-TA) of the U.S. Army Aviation Development Directorate also includes a handling qualities component. SIMPLI-FLYD is a tool jointly developed by NASA and the U.S. Army to perform modeling and analysis for the assessment of flight dynamics and control aspects of the handling qualities of rotorcraft conceptual designs. An exploration of handling qualities analysis has been carried out using SIMPLI-FLYD in illustrative scenarios of a tiltrotor in forward flight and a single-main-rotor helicopter in hover. Using SIMPLI-FLYD and the conceptual design tool NDARC integrated into a single process, the effects of variations of design parameters such as tail or rotor size were evaluated in the form of margins to fixed- and rotary-wing handling qualities metrics as well as the vehicle empty weight. The handling qualities design margins are shown to vary across the flight envelope due to both changing flight dynamic and control characteristics and changing handling qualities specification requirements. The current SIMPLI-FLYD capability and future developments are discussed in the context of an overall rotorcraft conceptual design process.
Two-compartment modeling of tissue microcirculation revisited.
Brix, Gunnar; Salehi Ravesh, Mona; Griebel, Jürgen
2017-05-01
Conventional two-compartment modeling of tissue microcirculation is used for tracer kinetic analysis of dynamic contrast-enhanced (DCE) computed tomography or magnetic resonance imaging studies, although it is well known that the underlying assumption of an instantaneous mixing of the administered contrast agent (CA) in capillaries is far from realistic. It was thus the aim of the present study to provide theoretical and computational evidence in favor of a conceptually alternative modeling approach that makes it possible to characterize the bias inherent to compartment modeling and, moreover, to approximately correct for it. Starting from a two-region distributed-parameter model that accounts for spatial gradients in CA concentrations within blood-tissue exchange units, a modified lumped two-compartment exchange model was derived. It has the same analytical structure as the conventional two-compartment model, but indicates that the apparent blood flow identifiable from measured DCE data is substantially overestimated, whereas the three other model parameters (i.e., the permeability-surface area product as well as the volume fractions of the plasma and interstitial distribution space) are unbiased. Furthermore, a simple formula was derived to approximately compute a bias-corrected flow from the estimates of the apparent flow and permeability-surface area product obtained by model fitting. To evaluate the accuracy of the proposed modeling and bias-correction method, representative noise-free DCE curves were analyzed. They were simulated for 36 microcirculation and four input scenarios by an axially distributed reference model. As analytically proven, the considered two-compartment exchange model is structurally identifiable from tissue residue data. The apparent flow values estimated for the 144 simulated tissue/input scenarios were considerably biased. After bias correction, the deviations between estimated and actual parameter values were (11.2 ± 6.4)% (vs. (105 ± 21)% without correction) for the flow, (3.6 ± 6.1)% for the permeability-surface area product, (5.8 ± 4.9)% for the vascular volume and (2.5 ± 4.1)% for the interstitial volume, with individual deviations of more than 20% being rare and only marginal. Increasing the duration of CA administration had statistically significant but opposite effects only on the accuracy of the estimated flow (which declined) and the intravascular volume (which improved). Physiologically well-defined tissue parameters are structurally identifiable and accurately estimable from DCE data by the conceptually modified two-compartment model in combination with the bias correction. The accuracy of the bias-corrected flow is nearly comparable to that of the three other (theoretically unbiased) model parameters. As compared to conventional two-compartment modeling, this feature constitutes a major advantage for tracer kinetic analysis of both preclinical and clinical DCE imaging studies.
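For orientation, a sketch of the analytical structure shared by such two-compartment exchange models: the tissue curve as a convolution of an arterial input with a biexponential impulse response. The input function and rate constants are assumed for illustration; the paper's bias-correction formula is not reproduced here.

```python
import numpy as np

# Tissue curve = arterial input function (AIF) convolved with a biexponential
# impulse response, the analytical form of two-compartment exchange models.
t = np.arange(0, 300, 1.0)                      # seconds, dt = 1 s
aif = (t / 10.0) * np.exp(-t / 12.0)            # assumed gamma-variate input

def biexp_irf(t, A1, k1, A2, k2):
    return A1 * np.exp(-k1 * t) + A2 * np.exp(-k2 * t)

irf = biexp_irf(t, A1=0.05, k1=0.05, A2=0.01, k2=0.003)  # illustrative rates
tissue = np.convolve(aif, irf)[: t.size]        # discrete convolution, dt = 1
print("peak tissue enhancement (a.u.):", tissue.max().round(3))
```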
A sediment graph model based on SCS-CN method
NASA Astrophysics Data System (ADS)
Singh, P. K.; Bhunya, P. K.; Mishra, S. K.; Chaube, U. C.
2008-01-01
This paper proposes new conceptual sediment graph models based on the coupling of popular and extensively used methods, viz., the Nash-model-based instantaneous unit sediment graph (IUSG), the Soil Conservation Service curve number (SCS-CN) method, and the power law. These models vary in their complexity, and this paper tests their performance using data from the Nagwan watershed (area = 92.46 km2) (India). The sensitivity of total sediment yield and peak sediment flow rate computations to model parameterisation is analysed. The exponent of the power law, β, is more sensitive than the other model parameters. The models are found to have substantial potential for computing sediment graphs (temporal sediment flow rate distributions) as well as total sediment yield.
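The SCS-CN component of such models rests on the runoff relation Q = (P - Ia)^2 / (P - Ia + S), with the initial abstraction Ia commonly taken as 0.2S; a minimal sketch with illustrative storm depths:

```python
# SCS-CN runoff: S from the curve number CN, Ia = 0.2 S, then
# Q = (P - Ia)^2 / (P - Ia + S) for P > Ia (all depths in mm).
def scs_cn_runoff(p_mm, cn):
    s = 25400.0 / cn - 254.0        # potential retention S (mm)
    ia = 0.2 * s                    # initial abstraction
    return 0.0 if p_mm <= ia else (p_mm - ia) ** 2 / (p_mm - ia + s)

for p in (20, 50, 100):             # storm depths (illustrative)
    print(p, "mm ->", round(scs_cn_runoff(p, cn=75), 1), "mm runoff")
```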
NASA Technical Reports Server (NTRS)
Willis, W. S.; Konarski, M.; Sutherland, M. V.
1982-01-01
Ejector concepts for use with a remote augmented lift system (RALS) exhaust nozzle were studied. A number of concepts were considered and three were selected as having the greatest promise of providing the desired aircraft and exhaust gas cooling and lift enhancement. A scale model test program is recommended to explore the effects of the more important parameters on ejector performance.
The ring-shaped thermal field of Stefanos crater, Nisyros Island: a conceptual model
NASA Astrophysics Data System (ADS)
Pantaleo, M.; Walter, T. R.
2013-11-01
Fumarole fields related to hydrothermal processes release the heat of the underground through permeable pathways. Thermal changes, therefore, are likely to depend also on the variation of these pathways. As these paths may affect or even control the temperature field at the surface, their understanding is relevant to applied and basic science alike. A common difficulty, however, in surface temperature field studies at active volcanoes is that the parameters controlling the ascending routes of fluids are generally poorly constrained. Here we analyze the crater of Stefanos, Nisyros (Greece), and highlight complexities in the spatial pattern of the fumarole field related to permeability conditions. There may be different explanations for the observed permeability changes, such as structural control, lithology, weathering, and heterogeneous sediment accumulation and erosion. We combine high resolution infrared mosaics and grain-size analysis of soils, aiming to elucidate the parameters controlling the appearance of the fumarole field. We find a ring-shaped thermal field located within the explosion crater, which is dependent on contrasts in the soil granulometry and the volcanotectonic history. We develop a conceptual model of how the ring-shaped thermal field has formed at the Stefanos crater and similarly at other volcanic edifices, highlighting the importance of local permeability contrasts that may increase or decrease the thermal fluid flux.
The ring-shaped thermal field of Stefanos crater, Nisyros Island: a conceptual model
NASA Astrophysics Data System (ADS)
Pantaleo, M.; Walter, T. R.
2014-04-01
Fumarole fields related to hydrothermal processes release the heat of the underground through permeable pathways. Thermal changes, therefore, are likely to depend also on the size and permeability variation of these pathways. There may be different explanations for the observed permeability changes, such as fault control, lithology, weathering/alteration, heterogeneous sediment accumulation/erosion and physical changes of the fluids (e.g., temperature and viscosity). A common difficulty, however, in surface temperature field studies at active volcanoes is that the parameters controlling the ascending routes of fluids are generally poorly constrained. Here we analyze the crater of Stefanos, Nisyros (Greece), and highlight complexities in the spatial pattern of the fumarole field related to permeability conditions. We combine high-resolution infrared mosaics and grain-size analysis of soils, aiming to elucidate the parameters controlling the appearance of the fumarole field. We find a ring-shaped thermal field located within the explosion crater, which we interpret to reflect near-surface contrasts in the soil granulometry and the volcanotectonic history at depth. We develop a conceptual model of how the ring-shaped thermal field formed at the Stefanos crater and similarly at other volcanic edifices, highlighting the importance of local permeability contrasts that may increase or decrease the thermal fluid flux.
Arctic melt ponds and bifurcations in the climate system
NASA Astrophysics Data System (ADS)
Sudakov, I.; Vakulenko, S. A.; Golden, K. M.
2015-05-01
Understanding how sea ice melts is critical to climate projections. In the Arctic, melt ponds that develop on the surface of sea ice floes during the late spring and summer largely determine their albedo - a key parameter in climate modeling. Here we explore the possibility of a conceptual sea ice climate model passing through a bifurcation point (an irreversible critical threshold) as the system warms, by incorporating geometric information about melt pond evolution. This study is based on a bifurcation analysis of an energy balance climate model with ice-albedo feedback as the key mechanism driving the system to bifurcation points.
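A minimal energy-balance sketch of the bifurcation mechanism: with a temperature-dependent ice albedo, sweeping the forcing up and then down exposes a hysteresis loop, the signature of a saddle-node bifurcation. All constants are illustrative, not the paper's melt-pond parameterization.

```python
import numpy as np

sigma = 5.67e-8  # Stefan-Boltzmann constant

def albedo(T):
    # Smooth step from icy (0.6) to open-water (0.3) albedo around 273 K.
    return 0.45 - 0.15 * np.tanh((T - 273.0) / 5.0)

def equilibrium_T(Q, T0):
    T = T0
    for _ in range(20000):          # relax dT/dt = absorbed - emitted
        T += 1e-3 * (Q * (1 - albedo(T)) - 0.61 * sigma * T**4)
    return T

Qs = np.linspace(250, 350, 21)
up = [equilibrium_T(Q, 230.0) for Q in Qs]        # start cold, warm up
down = [equilibrium_T(Q, 310.0) for Q in Qs[::-1]]
for Q, Tu, Td in zip(Qs, up, down[::-1]):
    flag = "  <- two stable states" if abs(Tu - Td) > 1 else ""
    print(f"Q={Q:5.1f}  T_up={Tu:6.1f}  T_down={Td:6.1f}{flag}")
```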
NASA Astrophysics Data System (ADS)
Koskela, J. J.; Croke, B. W. F.; Koivusalo, H.; Jakeman, A. J.; Kokkonen, T.
2012-11-01
Bayesian inference is used to study the effect of precipitation and model structural uncertainty on estimates of model parameters and confidence limits of predictive variables in a conceptual rainfall-runoff model in the snow-fed Rudbäck catchment (142 ha) in southern Finland. The IHACRES model is coupled with a simple degree-day model to account for snow accumulation and melt. The posterior probability distribution of the model parameters is sampled using the Differential Evolution Adaptive Metropolis (DREAM(ZS)) algorithm and the generalized likelihood function. Precipitation uncertainty is taken into account by introducing additional latent variables that are used as multipliers for individual storm events. Results suggest that occasional snow water equivalent (SWE) observations together with daily streamflow observations do not contain enough information to simultaneously identify model parameters, precipitation uncertainty and model structural uncertainty in the Rudbäck catchment. The addition of an autoregressive component to account for model structural error, together with latent variables having uniform priors to account for input uncertainty, led to dubious posterior distributions of model parameters. Thus our hypothesis that informative priors for latent variables could be replaced by additional SWE data could not be confirmed. The model was found to work adequately in 1-day-ahead simulation mode, but the results were poor in the batch simulation mode. This was caused by the interaction of parameters used to describe different sources of uncertainty. The findings may hold lessons for other cases where the number of parameters is similarly high in relation to the available prior information.
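To illustrate the storm-multiplier device, the toy sketch below jointly infers a runoff coefficient and one latent multiplier per storm by random-walk Metropolis; the linear runoff model, priors, and noise levels are assumptions chosen for brevity, not the IHACRES/DREAM(ZS) setup of the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy setup: runoff = c * (multiplier-corrected storm depth). We jointly infer
# the runoff coefficient c and a latent multiplier per storm (illustrative).
depth_obs = np.array([20.0, 35.0, 12.0, 50.0])          # gauge depths (mm)
m_true = np.array([1.2, 0.8, 1.0, 1.1])                 # true input errors
q_obs = 0.4 * depth_obs * m_true + rng.normal(0, 0.5, 4)

def log_post(c, m):
    if c <= 0 or np.any(m <= 0):
        return -np.inf
    like = -0.5 * np.sum(((q_obs - c * depth_obs * m) / 0.5) ** 2)
    prior_m = -0.5 * np.sum((np.log(m) / 0.2) ** 2)     # lognormal prior on m
    return like + prior_m

c, m = 0.5, np.ones(4)
samples = []
for _ in range(20000):                                   # random-walk Metropolis
    c_new = c + rng.normal(0, 0.02)
    m_new = m + rng.normal(0, 0.02, 4)
    if np.log(rng.uniform()) < log_post(c_new, m_new) - log_post(c, m):
        c, m = c_new, m_new
    samples.append(c)
print("posterior mean runoff coefficient:", np.round(np.mean(samples[5000:]), 3))
```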
NASA Astrophysics Data System (ADS)
Fan, Y. R.; Huang, G. H.; Baetz, B. W.; Li, Y. P.; Huang, K.
2017-06-01
In this study, a copula-based particle filter (CopPF) approach was developed for sequential hydrological data assimilation by considering parameter correlation structures. In CopPF, multivariate copulas are proposed to reflect parameter interdependence before the resampling procedure, with new particles then being sampled from the obtained copulas. Such a process can overcome both particle degeneration and sample impoverishment. The applicability of CopPF is illustrated with three case studies using a two-parameter simplified model and two conceptual hydrologic models. The results for the simplified model indicate that model parameters are highly correlated in the data assimilation process, suggesting a need for a full description of their dependence structure. Synthetic experiments on hydrologic data assimilation indicate that CopPF can rejuvenate particle evolution in large spaces and thus achieve good performance in low sample size scenarios. The applicability of CopPF is further illustrated through two real-case studies. It is shown that, compared with traditional particle filter (PF) and particle Markov chain Monte Carlo (PMCMC) approaches, the proposed method can provide more accurate results for both deterministic and probabilistic prediction with a sample size of 100. Furthermore, the sample size does not significantly influence the performance of CopPF. Also, the copula resampling approach dominates parameter evolution in CopPF, with more than 50% of particles sampled by copulas in most sample size scenarios.
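A rough sketch of the copula-resampling idea using a Gaussian copula (a stand-in for the multivariate copulas of the study): fit the dependence of the weighted particles in normal-score space, then sample new particles that preserve it. Weights are ignored in the back-transform for brevity; all values are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Weighted particles for two correlated parameters (synthetic stand-in).
particles = rng.multivariate_normal([0.5, 2.0], [[0.01, 0.012], [0.012, 0.04]], 500)
weights = rng.dirichlet(np.ones(500))

# 1) Weighted marginal ranks -> uniform scores -> normal scores.
z = np.empty_like(particles)
for j in range(2):
    order = np.argsort(particles[:, j])
    u = np.cumsum(weights[order]) - 0.5 * weights[order]
    z[order, j] = stats.norm.ppf(u)

# 2) Correlation of the normal scores defines the Gaussian copula.
R = np.corrcoef(z, rowvar=False)

# 3) Sample new scores and map back through the (approximate) marginals.
z_new = rng.multivariate_normal([0, 0], R, 500)
u_new = stats.norm.cdf(z_new)
new = np.column_stack([np.quantile(particles[:, j], u_new[:, j]) for j in range(2)])
print("resampled correlation:", np.round(np.corrcoef(new, rowvar=False)[0, 1], 2))
```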
Chong, Ket Hing; Zhang, Xiaomeng; Zheng, Jie
2018-01-01
Ageing is a natural phenomenon that is inherently complex and remains a mystery. A conceptual model of the cellular ageing landscape has been proposed for computational studies of ageing; however, a quantitative model of this landscape is lacking. This study aims to investigate the mechanism of cellular ageing in a theoretical model using the framework of Waddington's epigenetic landscape. We construct an ageing gene regulatory network (GRN) consisting of the core cell cycle regulatory genes (including p53). A model parameter (activation rate) is used as a measure of the accumulation of DNA damage. Using bifurcation diagrams to estimate the parameter values that lead to multi-stability, we obtained a conceptual model capturing three distinct stable steady states (or attractors) corresponding to homeostasis, cell cycle arrest, and senescence or apoptosis. In addition, we applied a Monte Carlo computational method to quantify the potential landscape, which displays: I) one homeostasis attractor for low accumulation of DNA damage; II) two attractors, for cell cycle arrest and senescence (or apoptosis), in response to high accumulation of DNA damage. Using Waddington's epigenetic landscape framework, the process of ageing can be characterized by state transitions from landscape I to II. By in silico perturbations, we identified the potential landscape of a perturbed network (inactivation of p53), and thereby demonstrated the emergence of a cancer attractor. The simulated dynamics of the perturbed network display a landscape with four basins of attraction: homeostasis, cell cycle arrest, senescence (or apoptosis) and cancer. Our analysis also showed that, for the same perturbed network with low DNA damage, the landscape displays only the homeostasis attractor. The mechanistic model offers theoretical insights that can facilitate the discovery of potential strategies for network medicine of ageing-related diseases such as cancer.
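The potential-landscape quantification can be sketched in one dimension: simulate noisy dynamics of a self-activating toy gene, histogram the long-run density P, and take U = -ln P so that basins of attraction appear as minima. The drift function and all rates are illustrative assumptions, not the paper's p53 network.

```python
import numpy as np

rng = np.random.default_rng(11)

def drift(x, a):
    # Self-activating gene with degradation; 'a' loosely plays the role of an
    # activation-rate parameter (assumed functional form, bistable for a=0.8).
    return a * x**2 / (1 + x**2) - 0.4 * x + 0.05

a, dt, steps, n_cells = 0.8, 0.01, 20000, 400
x = rng.uniform(0, 3, n_cells)
for _ in range(steps):                      # Euler-Maruyama integration
    x += drift(x, a) * dt + 0.05 * np.sqrt(dt) * rng.normal(size=n_cells)
    x = np.clip(x, 0.0, 5.0)

p, edges = np.histogram(x, bins=40, range=(0, 3), density=True)
U = -np.log(p + 1e-9)                       # quasi-potential U = -ln(P)
minima = [0.5 * (edges[i] + edges[i + 1])
          for i in range(1, 39) if U[i] < U[i - 1] and U[i] < U[i + 1]]
print("attractor locations (expression level):", np.round(minima, 2))
```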
NASA Astrophysics Data System (ADS)
Feyen, Luc; Caers, Jef
2006-06-01
In this work, we address the problem of characterizing the heterogeneity and uncertainty of hydraulic properties for complex geological settings. Here, we distinguish between two scales of heterogeneity, namely the hydrofacies structure and the intrafacies variability of the hydraulic properties. We employ multiple-point geostatistics to characterize the hydrofacies architecture. The multiple-point statistics are borrowed from a training image that is designed to reflect the prior geological conceptualization. The intrafacies variability of the hydraulic properties is represented using conventional two-point correlation methods, more precisely, spatial covariance models under a multi-Gaussian spatial law. We address the different levels and sources of uncertainty in characterizing the subsurface heterogeneity, and explore their effect on groundwater flow and transport predictions. Typically, uncertainty is assessed by way of many images, termed realizations, of a fixed statistical model. However, in many cases, sampling from a fixed stochastic model does not adequately represent the space of uncertainty, as it neglects the uncertainty related to the selection of the stochastic model and the estimation of its input parameters. We acknowledge the uncertainty inherent in the definition of the prior conceptual model of aquifer architecture and in the estimation of global statistics, anisotropy, and correlation scales. Spatial bootstrap is used to assess the uncertainty of the unknown statistical parameters. As an illustrative example, we employ a synthetic field that represents a fluvial setting consisting of an interconnected network of channel sands embedded within finer-grained floodplain material. For this highly non-stationary setting we quantify the groundwater flow and transport model prediction uncertainty for various levels of hydrogeological uncertainty. Results indicate the importance of accurately describing the facies geometry, especially for transport predictions.
Integrating O/S models during conceptual design, part 2
NASA Technical Reports Server (NTRS)
Ebeling, Charles E.
1994-01-01
This report documents the procedures for utilizing and maintaining the Reliability & Maintainability Model (RAM) developed by the University of Dayton for the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC) under NASA research grant NAG-1-1327. The purpose of the grant is to provide support to NASA in establishing operational and support parameters and costs of proposed space systems. As part of this research objective, the model described here was developed. Additional documentation concerning the development of this model may be found in Part 1 of this report. This is the 2nd part of a 3 part technical report.
Two statistics for evaluating parameter identifiability and error reduction
Doherty, John; Hunt, Randall J.
2009-01-01
Two statistics are presented that can be used to rank input parameters utilized by a model in terms of their relative identifiability based on a given or possible future calibration dataset. Identifiability is defined here as the capability of model calibration to constrain parameters used by a model. Both statistics require that the sensitivity of each model parameter be calculated for each model output for which there are actual or presumed field measurements. Singular value decomposition (SVD) of the weighted sensitivity matrix is then undertaken to quantify the relation between the parameters and observations that, in turn, allows selection of calibration solution and null spaces spanned by unit orthogonal vectors. The first statistic presented, "parameter identifiability", is quantitatively defined as the direction cosine between a parameter and its projection onto the calibration solution space. This varies between zero and one, with zero indicating complete non-identifiability and one indicating complete identifiability. The second statistic, "relative error reduction", indicates the extent to which the calibration process reduces error in estimation of a parameter from its pre-calibration level, where its value must be assigned purely on the basis of prior expert knowledge. This is more sophisticated than identifiability, in that it takes greater account of the noise associated with the calibration dataset. Like identifiability, it has a maximum value of one (which can only be achieved if there is no measurement noise). Conceptually it can fall to zero, and even below zero if a calibration problem is poorly posed. An example, based on a coupled groundwater/surface-water model, is included that demonstrates the utility of the statistics.
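A sketch of the SVD machinery behind the first statistic: truncate the SVD of the weighted sensitivity matrix to define the calibration solution space, then take each parameter's squared projection onto that space (the direction cosine is its square root). The Jacobian here is synthetic, with two parameters deliberately made nearly redundant or nearly insensitive; the truncation threshold is an arbitrary choice for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

J = rng.normal(size=(30, 8))                     # 30 observations, 8 parameters
J[:, 6] = J[:, 0] + 0.05 * rng.normal(size=30)   # nearly redundant with p0
J[:, 7] = 0.01 * rng.normal(size=30)             # nearly insensitive
w = np.ones(30)                                  # observation weights
U, s, Vt = np.linalg.svd(np.diag(w) @ J, full_matrices=False)

k = int(np.sum(s > 0.1 * s[0]))    # solution-space dimension (ad hoc cutoff)
V1 = Vt[:k].T                      # solution-space basis in parameter space
identifiability = np.sum(V1**2, axis=1)          # squared projection in [0, 1]
for i, d in enumerate(identifiability):
    print(f"parameter {i}: squared projection = {d:.2f}")
```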
Simplified contaminant source depletion models as analogs of multiphase simulators
NASA Astrophysics Data System (ADS)
Basu, Nandita B.; Fure, Adrian D.; Jawitz, James W.
2008-04-01
Four simplified dense non-aqueous phase liquid (DNAPL) source depletion models recently introduced in the literature are evaluated for the prediction of long-term effects of source depletion under natural gradient flow. These models are simple in form (a power function equation is an example) but are shown here to serve as mathematical analogs to complex multiphase flow and transport simulators. The spill and subsequent dissolution of DNAPLs were simulated in domains having different hydrologic characteristics (variance of the log conductivity field = 0.2, 1 and 3) using the multiphase flow and transport simulator UTCHEM. The dissolution profiles were fitted using four analytical models: the equilibrium streamtube model (ESM), the advection dispersion model (ADM), the power law model (PLM) and the Damkohler number model (DaM). All four models, though very different in their conceptualization, include two basic parameters that describe the mean DNAPL mass and the joint variability in the velocity and DNAPL distributions. The variability parameter was observed to be strongly correlated with the variance of the log conductivity field in the ESM and ADM but weakly correlated in the PLM and DaM. The DaM also includes a third parameter that describes the effect of rate-limited dissolution, but here this parameter was held constant as the numerical simulations were found to be insensitive to local-scale mass transfer. All four models were able to emulate the characteristics of the dissolution profiles generated from the complex numerical simulator, but the one-parameter PLM fits were the poorest, especially for the low heterogeneity case.
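As one concrete example of these simplified analogs, a power-law source model can be stepped forward under steady flow, with the exponent controlling how the flux-averaged concentration declines with the remaining mass fraction, C/C0 = (M/M0)^gamma; all constants below are illustrative.

```python
import numpy as np

def power_law_depletion(M0, C0, q, gamma, dt=1.0, n=500):
    """Step DNAPL mass forward; dissolved flux q*C depletes the source."""
    M, out = M0, []
    for _ in range(n):
        C = C0 * (M / M0) ** gamma if M > 0 else 0.0
        M = max(M - q * C * dt, 0.0)     # mass removed by dissolved flux
        out.append(C)
    return np.array(out)

for gamma in (0.5, 1.0, 2.0):            # heterogeneity shapes the exponent
    profile = power_law_depletion(M0=1000.0, C0=50.0, q=1.0, gamma=gamma)
    print(f"gamma={gamma}: C after 100 steps = {profile[100]:.2f}")
```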
Simplified contaminant source depletion models as analogs of multiphase simulators.
Basu, Nandita B; Fure, Adrian D; Jawitz, James W
2008-04-28
Four simplified dense non-aqueous phase liquid (DNAPL) source depletion models recently introduced in the literature are evaluated for the prediction of long-term effects of source depletion under natural gradient flow. These models are simple in form (a power function equation is an example) but are shown here to serve as mathematical analogs to complex multiphase flow and transport simulators. The spill and subsequent dissolution of DNAPLs were simulated in domains having different hydrologic characteristics (variance of the log conductivity field = 0.2, 1 and 3) using the multiphase flow and transport simulator UTCHEM. The dissolution profiles were fitted using four analytical models: the equilibrium streamtube model (ESM), the advection dispersion model (ADM), the power law model (PLM) and the Damkohler number model (DaM). All four models, though very different in their conceptualization, include two basic parameters that describe the mean DNAPL mass and the joint variability in the velocity and DNAPL distributions. The variability parameter was observed to be strongly correlated with the variance of the log conductivity field in the ESM and ADM but weakly correlated in the PLM and DaM. The DaM also includes a third parameter that describes the effect of rate-limited dissolution, but here this parameter was held constant as the numerical simulations were found to be insensitive to local-scale mass transfer. All four models were able to emulate the characteristics of the dissolution profiles generated from the complex numerical simulator, but the one-parameter PLM fits were the poorest, especially for the low heterogeneity case.
NASA Astrophysics Data System (ADS)
Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K. T.
2012-12-01
Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and significantly influences the operation of hydropower reservoirs. Improving hourly reservoir inflow forecasts over a 24-hour lead time is considered with the day-ahead (Elspot) market of the Nordic exchange market in perspective. The procedure presented comprises an error model added on top of an unalterable constant-parameter conceptual model, and a sequential data assimilation routine. The structure of the error model was investigated using freely available software for detecting mathematical relationships in a given dataset (EUREQA) and was kept at minimum complexity for computational reasons. As new streamflow data become available, the extra information manifested in the discrepancies between measurements and conceptual model outputs is extracted and assimilated into the forecasting system recursively using a sequential Monte Carlo technique. Besides improving forecast skill significantly, the probabilistic inflow forecasts provided by the present approach contain suitable information for reducing uncertainty in decision-making processes related to hydropower system operation. The potential of the current procedure for improving the accuracy of inflow forecasts at lead times of up to 24 hours and its reliability in different seasons of the year will be illustrated and discussed thoroughly.
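The error-model idea can be sketched with an AR(1) residual model updated as observations arrive, leaving the conceptual model itself untouched; the synthetic series and coefficients below are assumptions, and the study's EUREQA-derived error structure and sequential Monte Carlo routine are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(8)

# "Frozen" conceptual model output with a persistent bias (synthetic).
truth = 10 + 3 * np.sin(np.linspace(0, 20, 300))
model_out = truth + 1.5 + rng.normal(0, 0.4, 300)
resid = truth[:200] - model_out[:200]             # history of discrepancies

phi = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)  # AR(1) fit
mu = resid.mean() * (1 - phi)                                   # intercept

# One-step-ahead corrected forecast: model output + predicted residual.
e = resid[-1]
errors_raw, errors_corr = [], []
for t in range(200, 300):
    e_pred = mu + phi * e
    errors_raw.append(abs(truth[t] - model_out[t]))
    errors_corr.append(abs(truth[t] - (model_out[t] + e_pred)))
    e = truth[t] - model_out[t]                   # assimilate the new residual
print(f"mean abs error: raw {np.mean(errors_raw):.2f}, "
      f"corrected {np.mean(errors_corr):.2f}")
```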
Virtual experiments: a new approach for improving process conceptualization in hillslope hydrology
NASA Astrophysics Data System (ADS)
Weiler, Markus; McDonnell, Jeff
2004-01-01
We present an approach for process conceptualization in hillslope hydrology. We develop and implement a series of virtual experiments, whereby the interaction between water flow pathways, source and mixing at the hillslope scale is examined within a virtual experiment framework. We define these virtual experiments as 'numerical experiments with a model driven by collective field intelligence'. The virtual experiments explore the first-order controls in hillslope hydrology, where the experimentalist and modeler work together to cooperatively develop and analyze the results. Our hillslope model for the virtual experiments (HillVi) in this paper is based on conceptualizing the water balance within the saturated and unsaturated zone in relation to soil physical properties in a spatially explicit manner at the hillslope scale. We argue that a virtual experiment model needs to be able to capture all major controls on subsurface flow processes that the experimentalist might deem important, while at the same time being simple with few 'tunable parameters'. This combination makes the approach, and the dialog between experimentalist and modeler, a useful hypothesis testing tool. HillVi simulates mass flux for different initial conditions under the same flow conditions. We analyze our results in terms of an artificial line source and isotopic hydrograph separation of water and subsurface flow. Our results for this first set of virtual experiments showed how drainable porosity and soil depth variability exert a first order control on flow and transport at the hillslope scale. We found that high drainable porosity soils resulted in a restricted water table rise, resulting in more pronounced channeling of lateral subsurface flow along the soil-bedrock interface. This in turn resulted in a more anastomosing network of tracer movement across the slope. The virtual isotope hydrograph separation showed higher proportions of event water with increasing drainable porosity. When combined with previous experimental findings and conceptualizations, virtual experiments can be an effective way to isolate certain controls and examine their influence over a range of rainfall and antecedent wetness conditions.
Conceptual and logical level of database modeling
NASA Astrophysics Data System (ADS)
Hunka, Frantisek; Matula, Jiri
2016-06-01
Conceptual and logical levels form the topmost levels of database modeling. Usually, ORM (Object Role Modeling) and ER diagrams are utilized to capture the corresponding schema. The final aim of business process modeling is to store its results in the form of a database solution. For this reason, value-oriented business process modeling, which utilizes ER diagrams to express the modeled entities and the relationships between them, is used. However, ER diagrams form the logical level of the database schema. To extend the possibilities of different business process modeling methodologies, the conceptual level of database modeling is needed. The paper deals with the REA value modeling approach to business process modeling using ER diagrams, and derives a conceptual model utilizing the ORM modeling approach. The conceptual model extends the possibilities of value modeling to other business modeling approaches.
Asquith, William H.; Roussel, Meghan C.
2007-01-01
Estimation of representative hydrographs from design storms, which are known as design hydrographs, provides for cost-effective, risk-mitigated design of drainage structures such as bridges, culverts, roadways, and other infrastructure. During 2001-07, the U.S. Geological Survey (USGS), in cooperation with the Texas Department of Transportation, investigated runoff hydrographs, design storms, unit hydrographs, and watershed-loss models to enhance design hydrograph estimation in Texas. Design hydrographs ideally should mimic the general volume, peak, and shape of observed runoff hydrographs. Design hydrographs commonly are estimated in part by unit hydrographs. A unit hydrograph is defined as the runoff hydrograph that results from a unit pulse of excess rainfall uniformly distributed over the watershed at a constant rate for a specific duration. A time-distributed watershed-loss model is required for modeling by unit hydrographs. This report develops a specific time-distributed watershed-loss model known as an initial-abstraction, constant-loss model. For this watershed-loss model, a watershed is conceptualized to have the capacity to store or abstract an absolute depth of rainfall at and near the beginning of a storm. Depths of total rainfall less than this initial abstraction do not produce runoff. The watershed also is conceptualized to have the capacity to remove rainfall at a constant rate (loss) after the initial abstraction is satisfied. Additional rainfall inputs after the initial abstraction is satisfied contribute to runoff if the rainfall rate (intensity) is larger than the constant loss. The initial-abstraction, constant-loss model thus is a two-parameter model. The initial-abstraction, constant-loss model is investigated through detailed computational and statistical analysis of observed rainfall and runoff data for 92 USGS streamflow-gaging stations (watersheds) in Texas with contributing drainage areas from 0.26 to 166 square miles. The analysis is limited to a previously described, watershed-specific, gamma distribution model of the unit hydrograph. In particular, the initial-abstraction, constant-loss model is tuned to the gamma distribution model of the unit hydrograph. A complex computational analysis of observed rainfall and runoff for the 92 watersheds was done to determine, by storm, optimal values of initial abstraction and constant loss. Optimal parameter values for a given storm were defined as those values that produced a modeled runoff hydrograph with volume equal to the observed runoff hydrograph and that also minimized the residual sum of squares of the two hydrographs. Subsequently, the means of the optimal parameters were computed on a watershed-specific basis. These means for each watershed are considered the most representative, are tabulated, and are used in further statistical analyses. Statistical analyses of watershed-specific initial abstraction and constant loss include documentation of the distribution of each parameter using the generalized lambda distribution. The analyses show that watershed development has substantial influence on initial abstraction and limited influence on constant loss. The means and medians of the 92 watershed-specific parameters are tabulated with respect to watershed development; although they have considerable uncertainty, these parameters can be used for parameter prediction for ungaged watersheds.
The statistical analyses of watershed-specific initial abstraction and constant loss also include development of predictive procedures for estimation of each parameter for ungaged watersheds. Both regression equations and regression trees for estimation of initial abstraction and constant loss are provided. The watershed characteristics included in the regression analyses are (1) main-channel length, (2) a binary factor representing watershed development, (3) a binary factor representing watersheds with an abundance of rocky and thin-soiled terrain, and (4) curve number.
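As a concrete illustration of the two-parameter watershed-loss model described above, the following sketch computes excess rainfall from an initial abstraction and a constant loss; the storm depths and parameter values are hypothetical, not taken from the report.

def excess_rainfall(rain, ia, cl):
    """Initial-abstraction, constant-loss model (depths per time step).
    rain: rainfall depths; ia: initial abstraction (depth);
    cl: constant loss (depth per step)."""
    excess, stored = [], 0.0
    for r in rain:
        # Fill the initial abstraction first; no runoff until it is satisfied.
        fill = min(r, max(ia - stored, 0.0))
        stored += fill
        # After the abstraction is satisfied, losses occur at a constant
        # rate; any remainder becomes excess rainfall (runoff).
        excess.append(max(r - fill - cl, 0.0))
    return excess

# Hypothetical storm with 1-hour steps, ia = 25 mm, cl = 5 mm/h.
print(excess_rainfall([10.0, 20.0, 30.0, 5.0], ia=25.0, cl=5.0))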
Predicting herbicide and biocide concentrations in rivers across Switzerland
NASA Astrophysics Data System (ADS)
Wemyss, Devon; Honti, Mark; Stamm, Christian
2014-05-01
Pesticide concentrations vary strongly in space and time. Accordingly, intensive sampling is required to achieve a reliable quantification of pesticide pollution. As this requires substantial resources, loads and concentration ranges in many small and medium streams remain unknown. Here, we propose partially filling the information gap for herbicides and biocides by using a modelling approach that predicts stream concentrations without site-specific calibration, based simply on generally available data such as land use, discharge and nation-wide consumption data. The simple, conceptual model distinguishes herbicide losses from agricultural fields and private gardens from biocide losses from buildings (facades, roofs). The herbicide model is driven by river discharge and the applied herbicide mass; the biocide model requires precipitation and the footprint area of the urban areas containing the biocide. The model approach allows for modelling concentrations across multiple catchments at the daily, or shorter, time scale for small to medium-sized catchments (1-100 km2). Four high-resolution sampling campaigns in the Swiss Plateau were used to calibrate the model parameters for six model compounds: atrazine, metolachlor, terbuthylazine, terbutryn, diuron and mecoprop. Five additional sampled catchments across Switzerland were used to directly compare the predicted to the measured concentrations. Analysis of the first results reveals a reasonable simulation of the concentration dynamics for specific rainfall events and across the seasons. Predicted concentration ranges are reasonable even without site-specific calibration. This indicates the transferability of the calibrated model directly to other areas. However, the results also demonstrate systematic biases in that the highest measured peaks were not attained by the model. Probable causes for these deviations are conceptual model limitations and input uncertainty (pesticide use intensity, local precipitation, etc.). Accordingly, the model will be conceptually improved. This presentation covers the model simulations and compares the performance of the original and the modified model versions. Finally, the model will be applied across approximately 50% of the catchments in the Swiss Plateau, where the necessary input data are available and where the model concept can be reasonably applied.
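A heavily simplified sketch of the two loss pathways as described (herbicide losses driven by discharge and applied mass; biocide wash-off driven by precipitation and building footprint). The functional forms and coefficients below are invented for illustration and are not the authors' equations.

def herbicide_conc(q, q_base, m_applied, k=1e-4):
    """Schematic: the fast-flow fraction of discharge mobilises a share of
    the currently applied herbicide mass (k is a hypothetical coefficient)."""
    q_fast = max(q - q_base, 0.0)                  # event (fast) flow, m3/day
    load = k * m_applied * q_fast / max(q, 1e-9)   # kg/day, schematic
    return load / max(q, 1e-9)                     # concentration, kg/m3

def biocide_conc(precip, footprint_area, f_leach, q):
    """Schematic: biocide wash-off from building envelopes during rain.
    f_leach is a hypothetical leached mass per m2 per mm of rain."""
    load = f_leach * footprint_area * precip       # kg/day
    return load / max(q, 1e-9)                     # concentration, kg/m3

print(herbicide_conc(q=50_000.0, q_base=20_000.0, m_applied=120.0))
print(biocide_conc(precip=10.0, footprint_area=2e5, f_leach=1e-9, q=50_000.0))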
Scale-dependency of effective hydraulic conductivity on fire-affected hillslopes
NASA Astrophysics Data System (ADS)
Langhans, Christoph; Lane, Patrick N. J.; Nyman, Petter; Noske, Philip J.; Cawson, Jane G.; Oono, Akiko; Sheridan, Gary J.
2016-07-01
Effective hydraulic conductivity (Ke) for Hortonian overland flow modeling has been defined as a function of rainfall intensity and runon infiltration, assuming a distribution of saturated hydraulic conductivities (Ks). But the surface boundary condition during infiltration and its interactions with the distribution of Ks are not well represented in models. As a result, the mean value of the Ks distribution (K̄s), which is the central parameter for Ke, varies between scales. Here we quantify this discrepancy with a large infiltration data set comprising four different methods and scales from fire-affected hillslopes in SE Australia, using a relatively simple yet widely used conceptual model of Ke. Ponded disk (0.002 m2) and ring infiltrometers (0.07 m2) were used at the small scales, and rainfall simulations (3 m2) and small catchments (ca. 3000 m2) at the larger scales. We compared K̄s between methods measured at the same time and place. Disk and ring infiltrometer measurements had on average 4.8 times higher values of K̄s than rainfall simulations and catchment-scale estimates. Furthermore, the distribution of Ks was not clearly log-normal and scale-independent, as supposed in the conceptual model. In our interpretation, water repellency and preferential flow paths increase the variance of the measured distribution of Ks and bias ponding toward areas of very low Ks during rainfall simulations and small-catchment runoff events, while areas with high preferential flow capacity remain water supply-limited more than the conceptual model of Ke predicts. The study highlights problems in the current theory of scaling runoff generation.
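The scale effect can be illustrated with a standard conceptual calculation (not necessarily the exact model used in the paper): under rainfall intensity r, points with Ks above r absorb all rain while ponded points infiltrate at Ks, so the areal effective conductivity E[min(Ks, r)] falls below the point-scale mean of Ks.

import numpy as np

rng = np.random.default_rng(0)
# Point-scale Ks, log-normal with median 20 mm/h (illustrative values).
ks = rng.lognormal(mean=np.log(20.0), sigma=1.0, size=100_000)

def effective_k(ks, rain):
    # Points with Ks >= rain absorb all rain; ponded points infiltrate at Ks.
    return np.minimum(ks, rain).mean()

for rain in (10.0, 30.0, 100.0):
    print(f"rain={rain:6.1f}  Ke={effective_k(ks, rain):6.2f}  "
          f"mean Ks={ks.mean():6.2f}")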
A Hydrological Modeling Framework for Flood Risk Assessment for Japan
NASA Astrophysics Data System (ADS)
Ashouri, H.; Chinnayakanahalli, K.; Chowdhary, H.; Sen Gupta, A.
2016-12-01
Flooding has been the most frequent natural disaster that claims lives and imposes significant economic losses on human societies worldwide. Japan, with an annual rainfall of up to approximately 4000 mm, is extremely vulnerable to flooding. The focus of this research is to develop a macroscale hydrologic model for simulating flooding toward an improved understanding and assessment of flood risk across Japan. The framework employs a conceptual hydrological model, known as the Probability Distributed Model (PDM), as well as the Muskingum-Cunge flood routing procedure for simulating streamflow. In addition, a temperature-index model is incorporated to account for snowmelt and its contribution to streamflow. For an efficient calibration of the model, in terms of computational time and convergence of the parameters, a set of a priori parameters is obtained based on the relationships between the model parameters and the physical properties of watersheds. In this regard, we have implemented a particle tracking algorithm and a statistical model which use high-resolution digital terrain models to estimate different time-related parameters of the model, such as time to peak of the unit hydrograph. In addition, global soil moisture and depth data are used to generate a priori estimates of maximum soil moisture capacity, an important parameter of the PDM. Once the model is calibrated, its performance is examined for Typhoon Nabi, which struck Japan in September 2005 and caused severe flooding throughout the country. The model is also validated against the extreme precipitation event that affected Kyushu in 2012. In both cases, quantitative measures show that simulated streamflow depicts good agreement with gauge-based observations. The model is employed to simulate thousands of possible flood events for all of Japan, providing a basis for a comprehensive flood risk assessment and loss estimation for the flood insurance industry.
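A minimal sketch of the temperature-index (degree-day) snowmelt scheme mentioned above; the degree-day factor and threshold temperatures are placeholder values, not the calibrated ones.

def snowmelt(temps, precip, ddf=3.0, t_melt=0.0, t_snow=1.0):
    """Temperature-index (degree-day) snowmelt, a sketch with hypothetical
    parameters. ddf: degree-day factor (mm per degC per day); t_snow:
    temperature below which precipitation accumulates as snow."""
    swe, melt_series = 0.0, []
    for t, p in zip(temps, precip):
        if t <= t_snow:                   # precipitation falls as snow
            swe += p
        melt = min(swe, max(ddf * (t - t_melt), 0.0))
        swe -= melt                       # melt drains to the runoff model
        melt_series.append(melt)
    return melt_series

# Daily temperatures (degC) and precipitation (mm):
print(snowmelt([-2.0, 0.5, 3.0, 6.0], [10.0, 0.0, 0.0, 5.0]))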
Observing Inflationary Reheating
NASA Astrophysics Data System (ADS)
Martin, Jérôme; Ringeval, Christophe; Vennin, Vincent
2015-02-01
Reheating is the epoch which connects inflation to the subsequent hot big-bang phase. Conceptually very important, this era is, however, observationally poorly known. We show that the current Planck satellite measurements of the cosmic microwave background (CMB) anisotropies constrain the kinematic properties of the reheating era for most inflationary models. This result is obtained by deriving the marginalized posterior distributions of the reheating parameter for about 200 models of slow-roll inflation. Weighted by the statistical evidence of each model to explain the data, we show that the Planck 2013 measurements induce an average reduction of the posterior-to-prior volume by 40%. Making some additional assumptions on reheating, such as specifying a mean equation-of-state parameter, or focusing the analysis on particular scenarios, can enhance or reduce this constraint. Our study also indicates that the Bayesian evidence of a model can be substantially affected by the reheating properties. The precision of the current CMB data is therefore such that estimating the observational performance of a model now requires incorporating information about its reheating history.
A Structural Equation Model of Conceptual Change in Physics
ERIC Educational Resources Information Center
Taasoobshirazi, Gita; Sinatra, Gale M.
2011-01-01
A model of conceptual change in physics was tested on introductory-level, college physics students. Structural equation modeling was used to test hypothesized relationships among variables linked to conceptual change in physics including an approach goal orientation, need for cognition, motivation, and course grade. Conceptual change in physics…
A Bayesian Alternative for Multi-objective Ecohydrological Model Specification
NASA Astrophysics Data System (ADS)
Tang, Y.; Marshall, L. A.; Sharma, A.; Ajami, H.
2015-12-01
Process-based ecohydrological models combine the study of hydrological, physical, biogeochemical and ecological processes of catchments, and are usually more complex and more heavily parameterized than conceptual hydrological models. Thus, appropriate calibration objectives and model uncertainty analysis are essential for ecohydrological modeling. In recent years, Bayesian inference has become one of the most popular tools for quantifying the uncertainties in hydrological modeling, thanks to the development of Markov chain Monte Carlo (MCMC) techniques. Our study aims to develop appropriate prior distributions and likelihood functions that minimize the model uncertainties and bias within a Bayesian ecohydrological framework. In our study, a formal Bayesian approach is implemented in an ecohydrological model which combines a hydrological model (HyMOD) and a dynamic vegetation model (DVM). Simulations based on a single-objective likelihood (streamflow or LAI) and on multi-objective likelihoods (streamflow and LAI) with different weights are compared. Uniform, weakly informative and strongly informative prior distributions are used in different simulations. The Kullback-Leibler divergence (KLD) is used to measure the (dis)similarity between different priors and the corresponding posterior distributions in order to examine parameter sensitivity. Results show that different prior distributions can strongly influence the posterior distributions of parameters, especially when the available data are limited or parameters are insensitive to the available data. We demonstrate differences in optimized parameters and uncertainty limits between cases based on multi-objective likelihoods and those based on single-objective likelihoods. We also demonstrate the importance of appropriately defining the weights of objectives in multi-objective calibration according to the different data types.
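A histogram-based sketch of the KLD diagnostic described above: the further the posterior has moved from the prior, the more informative the data were for that parameter. The beta-distributed "posterior" below is a stand-in for an MCMC sample.

import numpy as np

def kld(prior_samples, post_samples, bins=50):
    """Histogram estimate of KL(posterior || prior); larger values indicate
    a parameter that the data constrained more strongly."""
    lo = min(prior_samples.min(), post_samples.min())
    hi = max(prior_samples.max(), post_samples.max())
    p, edges = np.histogram(post_samples, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(prior_samples, bins=edges, density=True)
    w = np.diff(edges)
    mask = (p > 0) & (q > 0)        # restrict to bins where both densities exist
    return np.sum(w[mask] * p[mask] * np.log(p[mask] / q[mask]))

rng = np.random.default_rng(1)
prior = rng.uniform(0.0, 1.0, 50_000)
posterior = rng.beta(8, 2, 50_000)  # stand-in for an MCMC posterior sample
print(kld(prior, posterior))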
Millimeter wave satellite concepts, volume 1
NASA Technical Reports Server (NTRS)
Hilsen, N. B.; Holland, L. D.; Thomas, R. E.; Wallace, R. W.; Gallagher, J. G.
1977-01-01
The identification of technologies necessary for development of millimeter spectrum communication satellites was examined from a system point of view. The goals of the program were the development of a methodology, based on the technical requirements of potential services that might be assigned to millimeter wave bands, for identifying the viable and appropriate technologies for future NASA millimeter research and development programs, and the testing of this methodology with selected user applications and services. The entire communications network, both ground and space subsystems, was studied. Cost, weight, and performance models for the subsystems, conceptual designs for point-to-point and broadcast communications satellites, and analytic relationships between subsystem parameters and overall link performance are discussed, along with baseline conceptual systems, sensitivity studies, model adjustment analyses, identification of critical technologies and their risks, and brief research and development program scenarios for the technologies judged to be moderate or extensive risks. Identification of technologies for millimeter satellite communication systems, and assessment of the relative risks of these technologies, was accomplished through subsystem modeling and link optimization for both point-to-point and broadcast applications.
NASA Astrophysics Data System (ADS)
Guenet, Bertrand; Esteban Moyano, Fernando; Peylin, Philippe; Ciais, Philippe; Janssens, Ivan A.
2016-03-01
Priming of soil carbon decomposition encompasses different processes through which the decomposition of native (already present) soil organic matter is amplified through the addition of new organic matter, with new inputs typically being more labile than the native soil organic matter. Evidence for priming comes from laboratory and field experiments, but to date there is no estimate of its impact at global scale and under the current anthropogenic perturbation of the carbon cycle. Current soil carbon decomposition models do not include priming mechanisms, thereby introducing uncertainty when extrapolating short-term local observations to ecosystem, regional and global scales. In this study we present a simple conceptual model of decomposition priming, called PRIM, able to reproduce laboratory (incubation) and field (litter manipulation) priming experiments. Parameters for this model were first optimized against data from 20 soil incubation experiments using a Bayesian framework. The optimized parameter values were evaluated against another set of soil incubation data, independent from the ones used for calibration, and the PRIM model reproduced these data better than the original, CENTURY-type soil decomposition model, whose decomposition equations are based only on first-order kinetics. We then compared the PRIM model and the standard first-order decay model incorporated into the global land biosphere model ORCHIDEE (Organising Carbon and Hydrology In Dynamic Ecosystems). A test of both models was performed at ecosystem scale using litter manipulation experiments from five sites. Although both versions were equally able to reproduce observed decay rates of litter, only ORCHIDEE-PRIM could simulate the observed priming (R2 = 0.54) in cases where litter was added or removed. This result suggests that a conceptually simple and numerically tractable representation of priming adapted to global models is able to capture the sign and magnitude of the priming of litter and soil organic matter.
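A generic sketch of a priming term of this kind (first-order decomposition amplified by fresh organic matter); the functional form and parameter values are illustrative and should not be read as the published PRIM equations.

import numpy as np

def soc_step(c_soil, fom, k, c_prime, dt=1.0):
    """One step of soil C decomposition with a schematic priming term.
    The first-order baseline k * c_soil is amplified by fresh organic
    matter (fom); c_prime sets the priming strength (hypothetical form)."""
    rate = k * c_soil * (1.0 + c_prime * (1.0 - np.exp(-fom)))
    return c_soil - rate * dt

c = 100.0
print(soc_step(c, fom=0.0, k=0.01, c_prime=0.8))  # no fresh input: pure first-order
print(soc_step(c, fom=5.0, k=0.01, c_prime=0.8))  # fresh input amplifies decomposition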
Hydrologic Model Selection using Markov chain Monte Carlo methods
NASA Astrophysics Data System (ADS)
Marshall, L.; Sharma, A.; Nott, D.
2002-12-01
Estimation of parameter uncertainty (and in turn model uncertainty) allows assessment of the risk in likely applications of hydrological models. Bayesian statistical inference provides an ideal means of assessing parameter uncertainty, whereby prior knowledge about the parameter is combined with information from the available data to produce a probability distribution (the posterior distribution) that describes uncertainty about the parameter and serves as a basis for selecting appropriate values for use in modelling applications. Widespread use of Bayesian techniques in hydrology has been hindered by difficulties in summarizing and exploring the posterior distribution. These difficulties have been largely overcome by recent advances in Markov chain Monte Carlo (MCMC) methods that involve random sampling of the posterior distribution. This study presents an adaptive MCMC sampling algorithm with characteristics that are well suited to model parameters with a high degree of correlation and interdependence, as is often evident in hydrological models. The MCMC sampling technique is used to compare six alternative configurations of a commonly used conceptual rainfall-runoff model, the Australian Water Balance Model (AWBM), using 11 years of daily rainfall-runoff data from the Bass River catchment in Australia. The alternative configurations considered fall into two classes - those that consider model errors to be independent of prior values, and those that model the errors as an autoregressive process. Each class consists of three formulations that represent increasing levels of complexity (and parameterisation) of the original model structure. The results from this study point both to the importance of using Bayesian approaches in evaluating model performance and to the simplicity of the MCMC sampling framework, which can bring such approaches within the reach of the applied hydrological community.
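For readers unfamiliar with the machinery, a minimal random-walk Metropolis sampler is sketched below; the study's adaptive, correlation-aware sampler is more sophisticated, and the Gaussian toy posterior merely stands in for an AWBM parameter posterior.

import numpy as np

def metropolis(log_post, theta0, step, n_iter=5000, seed=0):
    """Random-walk Metropolis: a minimal sketch of the MCMC machinery
    this kind of study builds on."""
    rng = np.random.default_rng(seed)
    theta = np.array(theta0, dtype=float)
    lp = log_post(theta)
    chain = []
    for _ in range(n_iter):
        prop = theta + rng.normal(0.0, step, size=theta.size)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)

# Toy posterior standing in for a rainfall-runoff parameter posterior.
log_post = lambda th: -0.5 * np.sum((th - 1.0) ** 2)
print(metropolis(log_post, [0.0, 0.0], step=0.5)[-5:])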
Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model
ERIC Educational Resources Information Center
Berman, Jeanette; Smyth, Robyn
2015-01-01
This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…
Bärgman, Jonas; Boda, Christian-Nils; Dozza, Marco
2017-05-01
As the development and deployment of in-vehicle intelligent safety systems (ISS) for crash avoidance and mitigation have rapidly increased in the last decades, the need to evaluate their prospective safety benefits before introduction has never been higher. Counterfactual simulations using relevant mathematical models (for vehicle dynamics, sensors, the environment, ISS algorithms, and driver behavior) have been identified as having high potential. However, although most of these models are relatively mature, models of driver behavior in the critical seconds before a crash are still relatively immature, and there are large conceptual differences between different driver models. The objective of this paper is, firstly, to demonstrate the importance of the choice of driver model when counterfactual simulations are used to evaluate two ISS: forward collision warning (FCW) and autonomous emergency braking (AEB). Secondly, the paper demonstrates how counterfactual simulations can be used to perform sensitivity analyses on parameter settings, both for driver behavior and for ISS algorithms. Finally, the paper evaluates the effect of the choice of glance distribution in the driver behavior model on the safety benefit estimation. The paper uses pre-crash kinematics and driver behavior from 34 rear-end crashes from the SHRP2 naturalistic driving study for the demonstrations. The results for FCW show a large difference in the percentage of avoided crashes between conceptually different models of driver behavior, while differences were small for conceptually similar models. As expected, the choice of driver behavior model did not affect the AEB benefit much. Based on our results, researchers and others who aim to evaluate ISS with the driver in the loop through counterfactual simulations should make deliberate and well-grounded choices of driver models: the choice of model matters.
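A schematic counterfactual of the kind described, reduced to one-dimensional kinematics: a lead vehicle brakes, an FCW fires at a time-to-collision threshold, and the driver responds after an eyes-off-road glance drawn from a glance distribution. All thresholds and decelerations below are illustrative, not SHRP2-derived.

def counterfactual_fcw(v0, gap0, lead_decel, glance_s, brake_decel=6.0,
                       warn_ttc=2.5, reaction_s=0.3, dt=0.01):
    """Schematic counterfactual: does an FCW-prompted brake avoid the crash?
    glance_s would be a draw from a driver-model glance distribution."""
    t, v, v_lead, gap = 0.0, v0, v0, gap0
    brake_time = None
    while gap > 0.0 and v > 0.0:
        ttc = gap / max(v - v_lead, 1e-9)
        if brake_time is None and ttc < warn_ttc:
            brake_time = t + glance_s + reaction_s  # eyes back on road, then react
        if brake_time is not None and t >= brake_time:
            v = max(v - brake_decel * dt, 0.0)
        v_lead = max(v_lead - lead_decel * dt, 0.0)
        gap += (v_lead - v) * dt
        t += dt
    return gap > 0.0                                # True: crash avoided

# Same conflict, short vs. long off-road glance:
print(counterfactual_fcw(15.0, 30.0, lead_decel=3.0, glance_s=0.2))
print(counterfactual_fcw(15.0, 30.0, lead_decel=3.0, glance_s=2.0))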
Geometric Model for a Parametric Study of the Blended-Wing-Body Airplane
NASA Technical Reports Server (NTRS)
Mastin, C. Wayne; Smith, Robert E.; Sadrehaghighi, Ideen; Wiese, Micharl R.
1996-01-01
A parametric model is presented for the blended-wing-body airplane, one concept being proposed for the next generation of large subsonic transports. The model is defined in terms of a small set of parameters, which facilitates analysis and optimization during the conceptual design process. The model is generated from a preliminary CAD geometry. From this geometry, airfoil cross sections are cut at selected locations and fitted with analytic curves. The airfoils are then used as boundaries for surfaces defined as the solution of partial differential equations. Both the airfoil curves and the surfaces are generated with free parameters selected to give a good representation of the original geometry. The original surface is compared with the parametric model, and solutions of the Euler equations for compressible flow are computed for both geometries. The parametric model is a good approximation of the CAD model, and the computed solutions are qualitatively similar. An optimal NURBS approximation is constructed and can be used by a CAD system for further refinement or modification of the original geometry.
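The curve-fitting step can be illustrated with a least-squares fit of an analytic basis to synthetic section points (a stand-in for the paper's airfoil fitting; the basis and data below are invented):

import numpy as np

# Fit an analytic curve to points sampled from an airfoil-like section cut.
x = np.linspace(0.0, 1.0, 60)
thickness = 0.6 * (np.sqrt(x) - x)       # synthetic half-thickness shape
y = thickness + 0.002 * np.sin(40 * x)   # plus CAD "faceting" noise

# Basis chosen so the fitted curve closes at x = 0 and x = 1:
A = np.column_stack([np.sqrt(x) - x, x * (1 - x), x**2 * (1 - x)])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
rms = np.sqrt(np.mean((A @ coeffs - y) ** 2))
print(coeffs, rms)   # few free parameters, small residual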
A Path Model for Evaluating Dosing Parameters for Children With Cerebral Palsy
Christy, Jennifer B.; Heathcock, Jill C.; Kolobe, Thubi H.A.
2014-01-01
Dosing of pediatric rehabilitation services for children with cerebral palsy (CP) has been identified as a national priority. Establishing dosing parameters for pediatric physical therapy interventions is critical for informing clinical decision making, health policy, and guidelines for reimbursement. The purpose of this perspective article is to describe a path model for evaluating dosing parameters of interventions for children with CP. The model is intended for dose-related and effectiveness studies of pediatric physical therapy interventions. The premise of the model is: Intervention type (focus on body structures, activity, or the environment) acts on a child first through the family, then through the dose (frequency, intensity, time), to yield structural and behavioral changes. As a result, these changes are linked to improvements in functional independence. Community factors affect dose as well as functional independence (performance and capacity), influencing the relationships between type of intervention and intervention responses. The constructs of family characteristics; child characteristics (eg, age, level of severity, comorbidities, readiness to change, preferences); plastic changes in bone, muscle, and brain; motor skill acquisition; and community access warrant consideration from researchers who are designing intervention studies. Multiple knowledge gaps are identified, and a framework is provided for conceptualizing dosing parameters for children with CP. PMID:24231231
An Improved Nested Sampling Algorithm for Model Selection and Assessment
NASA Astrophysics Data System (ADS)
Zeng, X.; Ye, M.; Wu, J.; Wang, D.
2017-12-01
The multimodel strategy is a general approach for treating model structure uncertainty in recent research. The unknown groundwater system is represented by several plausible conceptual models, and each alternative conceptual model is assigned a weight representing its plausibility. In the Bayesian framework, the posterior model weight is computed as the product of the model prior weight and the marginal likelihood (also termed model evidence). As a result, estimating marginal likelihoods is crucial for reliable model selection and assessment in multimodel analysis. The nested sampling estimator (NSE) is a newly proposed algorithm for marginal likelihood estimation. The implementation of NSE comprises searching the parameter space gradually from low-likelihood to high-likelihood regions, an evolution accomplished iteratively via a local sampling procedure. Thus, the efficiency of NSE is dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm and its variants are often used for local sampling in NSE. However, M-H is not an efficient sampling algorithm for high-dimensional or complex likelihood functions. To improve the performance of NSE, it is feasible to integrate a more efficient and elaborate sampling algorithm, DREAM(ZS), into the local sampling. In addition, to overcome the computational burden of the large number of repeated model executions in marginal likelihood estimation, an adaptive sparse grid stochastic collocation method is used to build surrogates for the original groundwater model.
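A minimal nested-sampling sketch follows; the constrained replacement step is done here by naive prior resampling, which is exactly the bottleneck the abstract proposes to strengthen with a better local sampler such as DREAM(ZS). The toy likelihood and prior are illustrative.

import numpy as np

def nested_sampling(log_like, prior_draw, n_live=100, n_iter=600, seed=0):
    """Minimal nested-sampling sketch for the marginal likelihood (evidence).
    Each step replaces the worst live point by a new prior draw with higher
    likelihood; the remaining live-point mass is neglected for brevity."""
    rng = np.random.default_rng(seed)
    live = [prior_draw(rng) for _ in range(n_live)]
    ll = np.array([log_like(th) for th in live])
    evidence, x_prev = 0.0, 1.0
    for i in range(n_iter):
        worst = int(np.argmin(ll))
        x_i = np.exp(-(i + 1) / n_live)     # expected prior-volume shrinkage
        evidence += np.exp(ll[worst]) * (x_prev - x_i)
        x_prev = x_i
        while True:                          # draw under the likelihood constraint
            th = prior_draw(rng)
            lt = log_like(th)
            if lt > ll[worst]:
                live[worst], ll[worst] = th, lt
                break
    return evidence

log_like = lambda th: -0.5 * np.sum(th ** 2)          # toy likelihood
prior_draw = lambda rng: rng.uniform(-5.0, 5.0, size=2)
print(nested_sampling(log_like, prior_draw))          # analytic value ~ 2*pi/100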
Modeling coupled sorption and transformation of 17β-estradiol-17-sulfate in soil-water systems
NASA Astrophysics Data System (ADS)
Bai, Xuelian; Shrestha, Suman L.; Casey, Francis X. M.; Hakk, Heldur; Fan, Zhaosheng
2014-11-01
Animal manure is the primary source of exogenous free estrogens in the environment, which are known endocrine-disrupting chemicals that can disrupt the reproductive systems of organisms. Conjugated estrogens can act as precursors to free estrogens, which may increase the total estrogenicity in the environment. In this study, a comprehensive model was used to simultaneously simulate the coupled sorption and transformation of a sulfate estrogen conjugate, 17β-estradiol-17-sulfate (E2-17S), in various soil-water systems (non-sterile/sterile; topsoil/subsoil). The simulated processes included multiple transformation pathways (i.e., hydroxylation, hydrolysis, and oxidation) and mass transfer between the aqueous, reversibly sorbed, and irreversibly sorbed phases of all soils for E2-17S and its metabolites. The conceptual model was conceived as a series of linear sorption and first-order transformation expressions. The model was inversely solved using finite differences to estimate process parameters. A global optimization method was applied for the inverse analysis, along with variable model restrictions, to estimate 36 parameters. The model provided a satisfactory simultaneous fit (adjusted R2 = 0.93 and d = 0.87) to all the experimental data and reliable parameter estimates. This modeling study improved understanding of the fate and transport of estrogen conjugates under various soil-water conditions.
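A schematic batch version of such a model (linear sorption entering as a retardation factor, plus first-order transformation and irreversible binding) is sketched below; the rate constants are hypothetical, not the 36 fitted parameters.

import numpy as np

def simulate(c0, kd, rho_b, theta, k_trans, k_irr, t_end=48.0, dt=0.01):
    """Schematic batch soil-water system: linear sorption (retardation)
    plus first-order transformation and irreversible binding."""
    R = 1.0 + rho_b * kd / theta            # retardation from linear sorption
    c, c_irr = c0, 0.0
    ts = np.arange(0.0, t_end, dt)
    out = []
    for _ in ts:
        c_irr += k_irr / R * c * dt         # irreversibly bound pool grows
        c += -(k_trans + k_irr) / R * c * dt  # losses act on mobile fraction
        out.append(c)
    return ts, np.array(out), c_irr

ts, c, c_irr = simulate(c0=1.0, kd=2.0, rho_b=1.4, theta=0.35,
                        k_trans=0.08, k_irr=0.01)
print(c[-1], c_irr)   # remaining aqueous and irreversibly sorbed fractions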
Hayenga, Heather N; Thorne, Bryan C; Peirce, Shayn M; Humphrey, Jay D
2011-11-01
There is a need to develop multiscale models of vascular adaptations to understand tissue-level manifestations of cellular level mechanisms. Continuum-based biomechanical models are well suited for relating blood pressures and flows to stress-mediated changes in geometry and properties, but less so for describing underlying mechanobiological processes. Discrete stochastic agent-based models are well suited for representing biological processes at a cellular level, but not for describing tissue-level mechanical changes. We present here a conceptually new approach to facilitate the coupling of continuum and agent-based models. Because of ubiquitous limitations in both the tissue- and cell-level data from which one derives constitutive relations for continuum models and rule-sets for agent-based models, we suggest that model verification should enforce congruency across scales. That is, multiscale model parameters initially determined from data sets representing different scales should be refined, when possible, to ensure that common outputs are consistent. Potential advantages of this approach are illustrated by comparing simulated aortic responses to a sustained increase in blood pressure predicted by continuum and agent-based models both before and after instituting a genetic algorithm to refine 16 objectively bounded model parameters. We show that congruency-based parameter refinement not only yielded increased consistency across scales, it also yielded predictions that are closer to in vivo observations.
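The congruency-based refinement can be sketched with any global, population-based optimizer; below, SciPy's differential evolution (a GA-like method) minimizes the disagreement between toy stand-ins for the continuum and agent-based outputs over bounded parameters (four here, not the paper's 16).

import numpy as np
from scipy.optimize import differential_evolution

# Toy stand-ins for the two scales' predictions of a shared output,
# e.g. wall thickening under a sustained pressure step.
def continuum_output(p):
    return p[0] * 1.5 + p[1]

def agent_based_output(p):
    return p[2] ** 2 + p[3]

def incongruency(p):
    # Penalize cross-scale disagreement in the common output.
    return (continuum_output(p) - agent_based_output(p)) ** 2

bounds = [(0.0, 2.0)] * 4   # objectively bounded model parameters
result = differential_evolution(incongruency, bounds, seed=0)
print(result.x, result.fun)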
Crystalline lens paradoxes revisited: significance of age-related restructuring of the GRIN.
Sheil, Conor J; Goncharov, Alexander V
2017-09-01
The accommodating volume-constant age-dependent optical (AVOCADO) model of the crystalline lens is used to explore the age-related changes in ocular power and spherical aberration. The additional parameter m in the GRIN lens model allows decoupling of the axial and radial GRIN profiles, and is used to stabilise the age-related change in ocular power. Data for age-related changes in ocular geometry and lens parameter P in the axial GRIN profile were taken from published experimental data. In our age-dependent eye model, the ocular refractive power shows behaviour similar to the previously unexplained "lens paradox". Furthermore, ocular spherical aberration agrees with the data average, in contrast to the proposed "spherical aberration paradox". The additional flexibility afforded by parameter m, which controls the ratio of the axial and radial GRIN profile exponents, has allowed us to study the restructuring of the lens GRIN medium with age, resulting in a new interpretation of the origin of the power and spherical aberration paradoxes. Our findings also contradict the conceptual idea that the ageing eye is similar to the accommodating eye.
A Fuzzy Cognitive Model of aeolian instability across the South Texas Sandsheet
NASA Astrophysics Data System (ADS)
Houser, C.; Bishop, M. P.; Barrineau, C. P.
2014-12-01
Characterization of aeolian systems is complicated by rapidly changing surface-process regimes, spatio-temporal scale dependencies, and subjective interpretation of imagery and spatial data. This paper describes the development and application of analytical reasoning to quantify instability of an aeolian environment using scale-dependent information coupled with conceptual knowledge of process and feedback mechanisms. Specifically, a simple Fuzzy Cognitive Model (FCM) for aeolian landscape instability was developed that represents conceptual knowledge of key biophysical processes and feedbacks. Model inputs include satellite-derived surface biophysical and geomorphometric parameters. FCMs are a knowledge-based Artificial Intelligence (AI) technique that merges fuzzy logic and neural computing in which knowledge or concepts are structured as a web of relationships that is similar to both human reasoning and the human decision-making process. Given simple process-form relationships, the analytical reasoning model is able to map the influence of land management practices and the geomorphology of the inherited surface on aeolian instability within the South Texas Sandsheet. Results suggest that FCMs can be used to formalize process-form relationships and information integration analogous to human cognition with future iterations accounting for the spatial interactions and temporal lags across the sand sheets.
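A minimal FCM iteration is sketched below: concept activations are repeatedly passed through the weighted causal links and a sigmoid squashing function. The concepts and weights are invented for illustration, not the paper's calibrated map.

import numpy as np

def fcm_run(W, a0, n_steps=25, lam=1.0):
    """Fuzzy Cognitive Map: activations evolve through the weighted web of
    causal links; W[i, j] is the influence of concept j on concept i."""
    a = np.array(a0, dtype=float)
    for _ in range(n_steps):
        a = 1.0 / (1.0 + np.exp(-lam * (W @ a)))  # sigmoid of weighted inputs
    return a

# Hypothetical concepts: [vegetation cover, grazing, sand supply, instability]
W = np.array([[ 0.0, -0.7,  0.0,  0.0],    # grazing suppresses vegetation
              [ 0.0,  0.0,  0.0,  0.0],    # grazing treated as external driver
              [-0.5,  0.0,  0.0,  0.0],    # vegetation limits sand supply
              [-0.8,  0.4,  0.6,  0.0]])   # instability: -vegetation, +grazing, +sand
print(fcm_run(W, [0.6, 0.8, 0.5, 0.2]))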
NASA Astrophysics Data System (ADS)
Cohen-Adad, Julien; Paul, Perrine; Morandi, Xavier; Jannin, Pierre
2006-03-01
During an image-guided neurosurgery procedure, the neuronavigation system is subject to inaccuracy because of anatomical deformations, which induce a gap between the preoperative images and their anatomical reality. Thus, the objective of many research teams is to quantify these deformations in order to update the preoperative images. Anatomical intraoperative deformations correspond to a complex spatio-temporal phenomenon. Our objective is to identify the parameters implicated in these deformations and to use these parameters as constraints for systems dedicated to updating preoperative images. In order to identify these parameters of deformation, we followed the iterative methodology used for cognitive system conception: identification, conceptualization, formalization, implementation and validation. A state of the art on cortical deformations was established in order to identify relevant parameters probably involved in the deformations. As a first step, 30 parameters were identified and described following an ontological approach. They were formalized into a Unified Modeling Language (UML) class diagram. We implemented that model into a web-based application in order to populate a database. Two surgical cases have been studied so far. After having entered enough surgical cases for data mining purposes, we expect to identify the most relevant and influential parameters and to gain a better ability to understand the deformation phenomenon. This original approach is part of a global system aiming at quantifying and correcting anatomical deformations.
Kwicklis, Edward M.; Wolfsberg, Andrew V.; Stauffer, Philip H.; Walvoord, Michelle Ann; Sully, Michael J.
2006-01-01
Multiphase, multicomponent numerical models of long-term unsaturated-zone liquid and vapor movement were created for a thick alluvial basin at the Nevada Test Site to predict present-day liquid and vapor fluxes. The numerical models are based on recently developed conceptual models of unsaturated-zone moisture movement in thick alluvium that explain present-day water potential and tracer profiles in terms of major climate and vegetation transitions that have occurred during the past 10 000 yr or more. The numerical models were calibrated using borehole hydrologic and environmental tracer data available from a low-level radioactive waste management site located in a former nuclear weapons testing area. The environmental tracer data used in the model calibration include tracers that migrate in both the liquid and vapor phases (δD, δ18O) and tracers that migrate solely as dissolved solutes (Cl), thus enabling the estimation of some gas-phase as well as liquid-phase transport parameters. Parameter uncertainties and correlations identified during model calibration were used to generate parameter combinations for a set of Monte Carlo simulations to more fully characterize the uncertainty in liquid and vapor fluxes. The calculated background liquid and vapor fluxes decrease as the estimated time since the transition to the present-day arid climate increases. However, on the whole, the estimated fluxes display relatively little variability because correlations among parameters tend to create parameter sets in which changes in some parameters offset the effects of others in the set. Independent estimates of the timing of the climate transition, established from packrat midden data, were essential for constraining the model calibration results. The study demonstrates the utility of environmental tracer data in developing numerical models of liquid- and gas-phase moisture movement and the importance of considering parameter correlations when using Monte Carlo analysis to characterize the uncertainty in moisture fluxes.
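The correlated Monte Carlo step can be sketched as follows: parameter sets are drawn from a multivariate normal whose covariance encodes the calibration-identified correlations, so offsetting parameter combinations are sampled together; all numbers below are illustrative.

import numpy as np

mean = np.array([0.15, 2.0])                  # e.g. porosity, log10 permeability
cov = np.array([[0.0004, -0.0035],
                [-0.0035, 0.09  ]])           # negative correlation offsets fluxes
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mean, cov, size=10_000)

# Toy flux surrogate standing in for the full multiphase model.
fluxes = 10.0 ** samples[:, 1] * samples[:, 0]
print(fluxes.mean(), np.percentile(fluxes, [5, 95]))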
Technical Note: Approximate Bayesian parameterization of a process-based tropical forest model
NASA Astrophysics Data System (ADS)
Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.
2014-02-01
Inverse parameter estimation of process-based models is a long-standing problem in many scientific disciplines. A key question for inverse parameter estimation is how to define the metric that quantifies how well model predictions fit to the data. This metric can be expressed by general cost or objective functions, but statistical inversion methods require a particular metric, the probability of observing the data given the model parameters, known as the likelihood. For technical and computational reasons, likelihoods for process-based stochastic models are usually based on general assumptions about variability in the observed data, and not on the stochasticity generated by the model. Only in recent years have new methods become available that allow the generation of likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional Markov chain Monte Carlo (MCMC) sampler, performs well in retrieving known parameter values from virtual inventory data generated by the forest model. We analyze the results of the parameter estimation, examine its sensitivity to the choice and aggregation of model outputs and observed data (summary statistics), and demonstrate the application of this method by fitting the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss how this approach differs from approximate Bayesian computation (ABC), another method commonly used to generate simulation-based likelihood approximations. Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can be successfully applied to process-based models of high complexity. The methodology is particularly suitable for heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models.
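The core idea, generating a parametric likelihood from repeated stochastic simulations, can be sketched as follows (a generic synthetic-likelihood construction, not the FORMIND-specific implementation):

import numpy as np

def log_synthetic_likelihood(theta, simulate, observed_stats, n_sim=100, seed=0):
    """Parametric likelihood approximation: run the stochastic model n_sim
    times at theta, fit a Gaussian to the simulated summary statistics, and
    evaluate the observed statistics under it."""
    rng = np.random.default_rng(seed)
    stats = np.array([simulate(theta, rng) for _ in range(n_sim)])
    mu = stats.mean(axis=0)
    cov = np.cov(stats.T) + 1e-9 * np.eye(stats.shape[1])  # regularize
    diff = observed_stats - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (diff @ np.linalg.solve(cov, diff) + logdet
                   + len(diff) * np.log(2 * np.pi))

# Toy stochastic "forest": summary stats are mean and s.d. of stem counts.
def simulate(theta, rng):
    x = rng.poisson(theta[0], size=50)
    return np.array([x.mean(), x.std()])

obs = np.array([20.0, 4.4])
print(log_synthetic_likelihood(np.array([20.0]), simulate, obs))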
Conceptual Change Texts in Chemistry Teaching: A Study on the Particle Model of Matter
ERIC Educational Resources Information Center
Beerenwinkel, Anne; Parchmann, Ilka; Grasel, Cornelia
2011-01-01
This study explores the effect of a conceptual change text on students' awareness of common misconceptions on the particle model of matter. The conceptual change text was designed based on principles of text comprehensibility, of conceptual change instruction and of instructional approaches how to introduce the particle model. It was evaluated in…
NASA Astrophysics Data System (ADS)
Blöcher, Johanna; Kuraz, Michal
2017-04-01
In this contribution we propose implementations of the dual permeability model with different inter-domain exchange descriptions, together with metaheuristic optimization algorithms for parameter identification and mesh optimization. We compare variants of the coupling term with different numbers of parameters to test whether a reduction of parameters is feasible. This can reduce parameter uncertainty in inverse modeling, but also allows for different conceptual models of the domain and matrix coupling. The different variants of the dual permeability model are implemented in 1D and 2D in the open-source object-oriented library DRUtES, written in Fortran 2003/2008. For parameter identification we use adaptations of particle swarm optimization (PSO) and teaching-learning-based optimization (TLBO), which are population-based metaheuristics with different learning strategies. These are high-level stochastic search algorithms that do not require gradient information or a convex search space. Despite increasing computing power and parallel processing, an overly fine mesh is not feasible for parameter identification. This creates the need to find a mesh that optimizes both accuracy and simulation time. We use a bi-objective PSO algorithm to generate a Pareto front of optimal meshes that accounts for both objectives. The dual permeability model and the optimization algorithms were tested on virtual data and on field TDR sensor readings. The TDR sensor readings showed a very steep increase during rapid rainfall events and a subsequent steep decrease. This was theorized to be an effect of artificial macroporous envelopes surrounding the TDR sensors, creating an anomalous region with distinct local soil hydraulic properties. One of our objectives is to test how well the dual permeability model can describe this infiltration behavior and which coupling term is most suitable.
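A minimal PSO sketch is given below to show the class of algorithm involved; the paper's PSO and TLBO adaptations differ in their learning strategies, and the quadratic toy objective stands in for a dual-permeability misfit function.

import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization: gradient-free and population-
    based, suited to non-convex inverse problems."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    g = pbest[np.argmin(pbest_f)]
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)               # keep particles in bounds
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)]            # global best so far
    return g, pbest_f.min()

print(pso(lambda p: np.sum((p - 0.3) ** 2), bounds=[(0.0, 1.0)] * 3))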
Evolution of the conceptual model of unsaturated zone hydrology at Yucca Mountain, Nevada
Flint, Alan L.; Flint, Lorraine E.; Bodvarsson, Gudmundur S.; Kwicklis, Edward M.; Fabryka-Martin, June
2001-01-01
Yucca Mountain is an arid site proposed for consideration as the United States' first underground high-level radioactive waste repository. Low rainfall (approximately 170 mm/yr) and a thick unsaturated zone (500-1000 m) are important physical attributes of the site because the quantity of water likely to reach the waste, and the paths and rates of movement of the water to the saturated zone under future climates, would be major factors controlling the concentrations and times of arrival of radionuclides at the surrounding accessible environment. The framework for understanding the hydrologic processes that occur at this site and that control how quickly water will penetrate through the unsaturated zone to the water table has evolved during the past 15 yr. Early conceptual models assumed that very small volumes of water infiltrated into the bedrock (0.5-4.5 mm/yr, or 2-3 percent of rainfall), that much of the infiltrated water flowed laterally within the upper nonwelded units because of capillary barrier effects, and that the remaining water flowed down faults, with a small amount flowing through the matrix of the lower welded, fractured rocks. It was believed that the matrix had to be saturated for fractures to flow. However, accumulating evidence (infiltration modeling based on neutron borehole data, bomb-pulse isotopes deep in the mountain, perched-water analyses, and thermal analyses) indicated that infiltration rates were higher than initially estimated. Mechanisms supporting lateral diversion did not apply at these higher fluxes, and the flux calculated in the lower welded unit exceeded the conductivity of the matrix, implying vertical flow of water in the high-permeability fractures of the potential repository host rock and disequilibrium between matrix and fracture water potentials. The development of numerical modeling methods and parameter values evolved concurrently with the conceptual model in order to account for the observed field data, particularly fracture flow deep in the unsaturated zone. This paper presents the history of the evolution of conceptual models of hydrology and numerical models of unsaturated zone flow at Yucca Mountain, Nevada (Flint, A.L., Flint, L.E., Kwicklis, E.M., Bodvarsson, G.S., Fabryka-Martin, J.M., 2001. Hydrology of Yucca Mountain. Reviews of Geophysics, in press). This retrospective is the basis for recommendations for optimizing the efficiency with which a viable and robust conceptual model can be developed for a complex site.
Hunt, R.J.; Feinstein, D.T.; Pint, C.D.; Anderson, M.P.
2006-01-01
As part of the USGS Water, Energy, and Biogeochemical Budgets project and the NSF Long-Term Ecological Research work, a parameter estimation code was used to calibrate a deterministic groundwater flow model of the Trout Lake Basin in northern Wisconsin. Observations included traditional calibration targets (head, lake stage, and baseflow observations) as well as unconventional targets such as groundwater flows to and from lakes, depth of a lake water plume, and time of travel. The unconventional data types were important for parameter estimation convergence and allowed the development of a more detailed parameterization capable of resolving model objectives with well-constrained parameter values. Independent estimates of groundwater inflow to lakes were most important for constraining lakebed leakance, and the depth of the lake water plume was important for determining hydraulic conductivity and conceptual aquifer layering. The most important target overall, however, was a conventional regional baseflow target that led to correct distribution of flow between sub-basins and the regional system during model calibration. The use of an automated parameter estimation code: (1) facilitated the calibration process by providing a quantitative assessment of the model's ability to match disparate observed data types; and (2) allowed assessment of the influence of observed targets on the calibration process. The model calibration required the use of a 'universal' parameter estimation code in order to include all types of observations in the objective function. The methods described in this paper help address issues of watershed complexity and non-uniqueness common to deterministic watershed models.
Equifinality and process-based modelling
NASA Astrophysics Data System (ADS)
Khatami, S.; Peel, M. C.; Peterson, T. J.; Western, A. W.
2017-12-01
Equifinality is understood as one of the fundamental difficulties in the study of open complex systems, including catchment hydrology. A review of the hydrologic literature reveals that the term equifinality has been widely used, but in many cases inconsistently and without coherent recognition of the various facets of equifinality, which can lead not only to ambiguity but also to methodological fallacies. Therefore, in this study we first characterise the term equifinality within the context of hydrological modelling by reviewing the genesis of the concept of equifinality and then presenting a theoretical framework. During past decades, equifinality has mainly been studied as a subset of aleatory uncertainty (arising due to randomness) and for the assessment of model parameter uncertainty. Although the connection between parameter uncertainty and equifinality is undeniable, we argue there is more to equifinality than just aleatory parameter uncertainty. That is, the importance of equifinality and epistemic uncertainty (arising due to lack of knowledge), and their implications, is overlooked in our current practice of model evaluation. Equifinality and epistemic uncertainty in studying, modelling, and evaluating hydrologic processes are treated as if they can simply be discussed in (or often reduced to) probabilistic terms (as for aleatory uncertainty). The deficiencies of this approach to conceptual rainfall-runoff modelling are demonstrated for selected Australian catchments by examination of parameter and internal flux distributions and interactions within SIMHYD. On this basis, we present a new approach that expands the equifinality concept beyond model parameters to inform epistemic uncertainty. The new approach potentially facilitates the identification and development of more physically plausible models and model evaluation schemes, particularly within the multiple working hypotheses framework, and is generalisable to other fields of environmental modelling as well.
Validation of the Continuum of Care Conceptual Model for Athletic Therapy
Lafave, Mark R.; Butterwick, Dale; Eubank, Breda
2015-01-01
Utilization of conceptual models in field-based emergency care currently borrows from existing standards of the medical and paramedical professions. The purpose of this study was to develop and validate a comprehensive conceptual model that could account for injuries ranging from nonurgent to catastrophic events, including events that do not follow traditional medical or prehospital care protocols. The conceptual model should represent the continuum of care from the time of initial injury to an athlete's return to participation in their sport. Finally, the conceptual model should accommodate both novices and experts in the athletic therapy (AT) profession. This paper chronicles the content validation steps of the Continuum of Care Conceptual Model for Athletic Therapy (CCCM-AT). The stages of model development were domain and item generation, content expert validation using a three-stage modified Ebel procedure, and pilot testing. Only the final stage of the modified Ebel procedure reached the a priori 80% consensus on the three domains of interest: (1) heading descriptors; (2) the order of the model; (3) the conceptual model as a whole. Future research is required to test the use of the CCCM-AT in order to understand its efficacy in teaching and practice within the AT discipline. PMID:26464897
Virus Neutralisation: New Insights from Kinetic Neutralisation Curves
Magnus, Carsten
2013-01-01
Antibodies binding to the surface of virions can lead to virus neutralisation. Different theories have been proposed to determine the number of antibodies that must bind to a virion for neutralisation. Early models are based on chemical binding kinetics; applying these models leads to very low estimates of the number of antibodies needed for neutralisation. In contrast, according to the more conceptual approach of stoichiometries in virology, a much higher number of antibodies is required for virus neutralisation. Here, we combine chemical binding kinetics with (virological) stoichiometries to better explain virus neutralisation by antibody binding. This framework is in agreement with published data on the neutralisation of the human immunodeficiency virus. Knowing antibody reaction constants, our model allows us to estimate stoichiometrical parameters from kinetic neutralisation curves. In addition, we can identify important parameters that will make further analysis of kinetic neutralisation curves more valuable in the context of estimating stoichiometries. Our model gives a more subtle explanation of kinetic neutralisation curves in terms of single-hit and multi-hit kinetics. PMID:23468602
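One simple way to combine binding kinetics with a stoichiometric threshold, in the spirit of (but not identical to) the paper's framework: each binding site is occupied with a kinetics-derived probability, and a virion is neutralised once a threshold number of antibodies is bound. All parameter values below are illustrative.

import numpy as np
from scipy.stats import binom

def frac_neutralised(n_sites, t, conc, k_on, threshold):
    """Schematic single-/multi-hit model: each of n_sites independently
    binds an antibody with probability p(t) from simple on-rate kinetics;
    neutralisation requires at least `threshold` bound antibodies."""
    p = 1.0 - np.exp(-k_on * conc * t)     # per-site occupancy probability
    return 1.0 - binom.cdf(threshold - 1, n_sites, p)

for thr in (1, 4, 19):                      # single-hit vs. multi-hit curves
    print(thr, [round(frac_neutralised(30, t, 1e-9, 1e7, thr), 3)
                for t in (60, 600, 3600)])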
Sensitivity Studies of 3D Reservoir Simulation at the I-Lan Geothermal Area in Taiwan Using TOUGH2
NASA Astrophysics Data System (ADS)
Kuo, C. W.; Song, S. R.
2014-12-01
A large-scale geothermal project conducted by the National Science Council was recently initiated in the I-Lan south area, northeastern Taiwan. The goal of this national project is to generate at least 5 MW of electricity from geothermal energy. To achieve this goal, an integrated team spanning various specialties was assembled to investigate the I-Lan area comprehensively. For example, studies of I-Lan geology, petrophysics, seismicity, temperature distribution, hydrology, geochemistry, and heat sources were performed to build a large-scale 3D conceptual model of the geothermal potential sites. In addition, a 3000-m-deep well as well as several shallow wells are currently being drilled to give us accurate information about the deep subsurface. According to the current conceptual model, the target area is bounded by two main faults, the Jiaosi and Choshui faults. The geothermal gradient measured at one drilled well (1200 m) is about 49.1 °C/km. The geothermal reservoir is expected to occur in a fractured geological formation, the Siling sandstone layer. The preliminary results from all the investigations of this area are used as input parameters to create a realistic numerical reservoir model. This work uses the numerical simulator TOUGH2/EOS1 to study the geothermal energy potential of the I-Lan area. Once we can successfully predict the geothermal energy potential in this area and generate 5 MW of electricity, we can apply a similar methodology to other potential sites in Taiwan, and thereby increase the share of renewable energy in the generation of electricity. A large-scale three-dimensional subsurface geological model is built mainly from seismic exploration of the subsurface structure and well log data. The dimensions of the reservoir model in the x, y, and z coordinates are 20x10x5 km, respectively. Once the conceptual model and the well locations are set up appropriately based on the field data, sensitivity studies on production and injection rates, heat source, fractures, and all other relevant parameters are performed to evaluate their effects on the reservoir temperature distribution over 30 years. Through these sensitivity studies, we can design a better geothermal system in the I-Lan area and reduce the risk of exploitation.
Electromagnetic Detection of a Perfect Carpet Cloak
Shi, Xihang; Gao, Fei; Lin, Xiao; Zhang, Baile
2015-01-01
It has been shown that a spherical invisibility cloak originally proposed by Pendry et al. can be electromagnetically detected by shooting a charged particle through it, whose underlying mechanism stems from the asymmetry of transformation optics applied to motions of photons and charges [PRL 103, 243901 (2009)]. However, the conceptual three-dimensional invisibility cloak that exactly follows specifications of transformation optics is formidably difficult to implement, while the simplified cylindrical cloak that has been experimentally realized is inherently visible. On the other hand, the recent carpet cloak model has acquired remarkable experimental development, including a recently demonstrated full-parameter carpet cloak without any approximation in the required constitutive parameters. In this paper, we numerically investigate the electromagnetic radiation from a charged particle passing through a perfect carpet cloak and propose an experimentally verifiable model to demonstrate symmetry breaking of transformation optics. PMID:25997798
Royle, J. Andrew; Chandler, Richard B.; Sollmann, Rahel; Gardner, Beth
2013-01-01
Spatial Capture-Recapture provides a revolutionary extension of traditional capture-recapture methods for studying animal populations using data from live trapping, camera trapping, DNA sampling, acoustic sampling, and related field methods. This book is a conceptual and methodological synthesis of spatial capture-recapture modeling. As a comprehensive how-to manual, this reference contains detailed examples of a wide range of relevant spatial capture-recapture models for inference about population size and spatial and temporal variation in demographic parameters. Practicing field biologists studying animal populations will find this book to be a useful resource, as will graduate students and professionals in ecology, conservation biology, and fisheries and wildlife management.
Models of dyadic social interaction.
Griffin, Dale; Gonzalez, Richard
2003-01-01
We discuss the logic of research designs for dyadic interaction and present statistical models with parameters that are tied to psychologically relevant constructs. Building on Karl Pearson's classic nineteenth-century statistical analysis of within-organism similarity, we describe several approaches to indexing dyadic interdependence and provide graphical methods for visualizing dyadic data. We also describe several statistical and conceptual solutions to the 'levels of analysis' problem in analysing dyadic data. These analytic strategies allow the researcher to examine and measure psychological questions of interdependence and social influence. We provide illustrative data from casually interacting and romantic dyads. PMID:12689382
Theory-Based Parameterization of Semiotics for Measuring Pre-literacy Development
NASA Astrophysics Data System (ADS)
Bezruczko, N.
2013-09-01
A probabilistic model was applied to the problem of measuring pre-literacy in young children. First, semiotic philosophy and contemporary cognition research were conceptually integrated to establish theoretical foundations for rating 14 characteristics of children's drawings and narratives (N = 120). Then ratings were transformed with a Rasch model, which estimated linear item parameter values that accounted for 79 percent of rater variance. Principal Components Analysis of the item residual matrix confirmed that the variance remaining after item calibration was largely unsystematic. Validation analyses found positive correlations between semiotic measures and preschool literacy outcomes. Practical implications of a semiotics dimension for preschool practice were discussed.
Nordqvist, R.; Voss, C.I.
1996-01-01
An approach to model discrimination and network design for evaluation of groundwater contamination risk is proposed and demonstrated by application to a site in a glaciofluvial aquifer in Sweden. The approach consists of first hypothesizing alternative conceptual models of hydrogeology at the site on the basis of both quantitative data and qualitative information. The conceptual models are then expressed as two-dimensional numerical models of groundwater flow and solute transport, and model attributes controlling risk to the water supply are determined by simulation. Model predictions of response to a specific field test are made with each model that affects risk. Regions for effective measurement networks are then identified. Effective networks are those that capture sufficient information to determine which of the hypothesized models best describes the system with a minimum of measurement points. For the example site in Sweden, the network is designed such that important system parameters may be accurately estimated at the same time as model discrimination is carried out. The site in Vansbro, Sweden, consists of a water-supply well in an esker separated (by 300 m) from a wood preservation and treatment area on the esker flank by only a narrow inlet of a bordering stream. Application of the above-described risk analysis shows that, of all the hydrologic controls and parameters in the groundwater system, the only factor that controls the potential migration of wood-treatment contaminants to the well is whether the inlet's bed is pervious, creating a hydraulic barrier to lateral contaminant transport. Furthermore, the analysis localizes an area near the end of the inlet wherein the most effective measurements of drawdown would be made to discriminate between a permeable and impermeable bed. The location of this optimal area is not obvious prior to application of the above methodology.
Improved Conceptual Models Methodology (ICoMM) for Validation of Non-Observable Systems
2015-12-01
Dissertation by Sang M. Sok, December 2015; distribution is unlimited. The Improved Conceptual Models Methodology (ICoMM) is developed in support of improving the structure of the conceptual model (CoM) for validation of non-observable systems.
NASA Astrophysics Data System (ADS)
Rivera, Diego; Rivas, Yessica; Godoy, Alex
2015-02-01
Hydrological models are simplified representations of natural processes and are subject to errors. Uncertainty bounds are a commonly used way to assess the impact of input or model-architecture uncertainty on model outputs. Different sets of parameters can yield equally robust goodness-of-fit indicators, a phenomenon known as equifinality. We assessed the outputs of a lumped conceptual hydrological model applied to an agricultural watershed in central Chile under strong interannual variability (coefficient of variation of 25%) by using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE methodology (Generalized Likelihood Uncertainty Estimation) were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillan River exhibits, at a first stage, equifinality. However, it was possible to narrow the range of the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m³ s⁻¹ after fixing the parameter controlling the areal precipitation over the watershed. This decrement is equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite criticisms of the GLUE methodology, such as its lack of statistical formality, it proved a useful tool for assisting the modeller with the identification of critical parameters.
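The mechanics of GLUE as used above are compact enough to sketch. The following minimal Python example samples parameter sets, retains the behavioural ones by a Nash-Sutcliffe threshold, and computes percentile uncertainty bounds around the simulated series; the stand-in model, forcing data, parameter ranges, and threshold are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def model(params, forcing):
    """Stand-in for a lumped conceptual rainfall-runoff model (hypothetical)."""
    a, b = params
    return a * forcing + b

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, used here as the GLUE likelihood measure."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

forcing = rng.gamma(2.0, 5.0, size=200)                  # synthetic forcing
observed = model((0.6, 1.0), forcing) + rng.normal(0, 2.0, size=200)

n_samples, threshold = 10_000, 0.7                       # behavioural cutoff
params = rng.uniform([0.0, -5.0], [2.0, 5.0], size=(n_samples, 2))
sims = np.array([model(p, forcing) for p in params])
scores = np.array([nse(s, observed) for s in sims])
behavioural = scores > threshold                         # equifinal sets

# 5-95% uncertainty bounds over the behavioural simulations
lower, upper = np.percentile(sims[behavioural], [5, 95], axis=0)
print(f"{behavioural.sum()} behavioural sets; "
      f"mean 5-95% bound width {np.mean(upper - lower):.2f}")
```

Narrowing the prior range of a sensitive parameter, as done for the areal precipitation parameter in the study, shrinks the bound width reported on the last line.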
Operational models of pharmacological agonism.
Black, J W; Leff, P
1983-12-22
The traditional receptor-stimulus model of agonism began with a description of drug action based on the law of mass action and has developed by a series of modifications, each accounting for new experimental evidence. By contrast, in this paper an approach to modelling agonism is taken that begins with the observation that experimental agonist-concentration effect, E/[A], curves are commonly hyperbolic and develops using the deduction that the relation between occupancy and effect must be hyperbolic if the law of mass action applies at the agonist-receptor level. The result is a general model that explicitly describes agonism by three parameters: an agonist-receptor dissociation constant, KA; the total receptor concentration, [R0]; and a parameter, KE, defining the transduction of agonist-receptor complex, AR, into pharmacological effect. The ratio, [R0]/KE, described here as the 'transducer ratio', tau, is a logical definition for the efficacy of an agonist in a system. The model may be extended to account for non-hyperbolic E/[A] curves with no loss of meaning. Analysis shows that an explicit formulation of the traditional receptor-stimulus model is one particular form of the general model but that it is not the simplest. An alternative model is proposed, representing the cognitive and transducer functions of a receptor, that describes agonist action with one fewer parameter than the traditional model. In addition, this model provides a chemical definition of intrinsic efficacy making this parameter experimentally accessible in principle. The alternative models are compared and contrasted with regard to their practical and conceptual utilities in experimental pharmacology.
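To make the hyperbolic E/[A] behaviour concrete, the short sketch below evaluates the standard operational-model equation E = Em·τ·[A] / (KA + (1 + τ)·[A]) for several transducer ratios; the parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def operational_effect(A, Em, KA, tau):
    """Operational model: E = Em*tau*[A] / (KA + (1 + tau)*[A])."""
    return Em * tau * A / (KA + (1.0 + tau) * A)

A = np.logspace(-9, -3, 7)                # agonist concentrations (M)
for tau in (0.1, 1.0, 10.0):              # low to high transducer ratio
    E = operational_effect(A, Em=100.0, KA=1e-6, tau=tau)
    print(f"tau={tau:5.1f}  effect at high [A] ~ {E[-1]:5.1f}  "
          f"(asymptote Em*tau/(1+tau) = {100.0 * tau / (1 + tau):5.1f})")
```

The printed asymptote, Em·τ/(1 + τ), shows directly why τ serves as an efficacy measure: a low-τ agonist cannot approach the system maximum no matter how high the concentration.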
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. Keating; W. Statham
2004-02-12
The purpose of this model report is to provide documentation of the conceptual and mathematical model (ASHPLUME) for atmospheric dispersal and subsequent deposition of ash on the land surface from a potential volcanic eruption at Yucca Mountain, Nevada. This report also documents the ash (tephra) redistribution conceptual model. The ASHPLUME conceptual model accounts for incorporation and entrainment of waste fuel particles associated with a hypothetical volcanic eruption through the Yucca Mountain repository and downwind transport of contaminated tephra. The ASHPLUME mathematical model describes the conceptual model in mathematical terms to allow for prediction of radioactive waste/ash deposition on the ground surface given that the hypothetical eruptive event occurs. This model report also describes the conceptual model for tephra redistribution from a basaltic cinder cone. Sensitivity analyses and model validation activities for the ash dispersal and redistribution models are also presented. Analyses documented in this model report will improve and clarify the previous documentation of the ASHPLUME mathematical model and its application to the Total System Performance Assessment (TSPA) for the License Application (TSPA-LA) igneous scenarios. This model report also documents the redistribution model product outputs based on analyses to support the conceptual model.
Benoit, Richard; Mion, Lorraine
2012-08-01
This paper presents a proposed conceptual model to guide research on pressure ulcer risk in critically ill patients, who are at high risk for pressure ulcer development yet for whom no conceptual model exists to guide risk assessment. Results from a review of prospective studies were evaluated for design quality and level of statistical reporting. Multivariate findings from studies with high or medium design quality according to the National Institute for Health and Clinical Excellence standards were conceptually grouped. The conceptual groupings were integrated into Braden and Bergstrom's (Braden and Bergstrom [1987] Rehabilitation Nursing, 12, 8-12, 16) conceptual model, retaining their original constructs and augmenting their concept of intrinsic factors for tissue tolerance. The model could enhance consistency in research on pressure ulcer risk factors. Copyright © 2012 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Robins, N. S.; Rutter, H. K.; Dumpleton, S.; Peach, D. W.
2005-01-01
Groundwater investigation has long depended on the process of developing a conceptual flow model as a precursor to developing a mathematical model, which, in complex aquifers, may in turn lead to the development of a numerical approximation model. The assumptions made in the development of the conceptual model depend heavily on the geological framework defining the aquifer, and if the conceptual model is inappropriate then subsequent modelling will also be incorrect. Paradoxically, the development of a robust conceptual model remains difficult, not least because this 3D paradigm is usually reduced to 2D plans and sections. 3D visualisation software is now available to facilitate the development of the conceptual model, to make the model more robust and defensible, and to assist in demonstrating the hydraulics of the aquifer system. Case studies are presented to demonstrate the role and cost-effectiveness of the visualisation process.
Buchwald, Peter
2017-06-01
A generalized model of receptor function is proposed that relies on the essential assumptions of the minimal two-state receptor theory (i.e., ligand binding followed by receptor activation), but uses a different parametrization and allows nonlinear response (transduction) for possible signal amplification. For the most general case, three parameters are used: Kd, the classic equilibrium dissociation constant to characterize binding affinity; ε, an intrinsic efficacy to characterize the ability of the bound ligand to activate the receptor (ranging from 0 for an antagonist to 1 for a full agonist); and γ, a gain (amplification) parameter to characterize the nonlinearity of postactivation signal transduction (ranging from 1 for no amplification to infinity). The obtained equation, E/Emax = εγ[L] / ((εγ + 1 − ε)[L] + Kd), resembles that of the operational (Black and Leff) or minimal two-state (del Castillo-Katz) models, E/Emax = τ[L] / ((τ + 1)[L] + Kd), with εγ playing a role somewhat similar to that of the τ efficacy parameter of those models, but has several advantages. Its parameters are more intuitive, as they are conceptually clearly related to the different steps of binding, activation, and signal transduction (amplification), and they are also better suited for optimization by nonlinear regression. It allows fitting of complex data where receptor binding and response are measured separately and the fractional occupancy and response are mismatched. Unlike the previous models, it is a true generalized model, as simplified forms can be reproduced with special cases of its parameters. Such simplified forms can be used on their own to characterize partial agonism, competing partial and full agonists, or signal amplification.
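With the central equation reconstructed above, a small numerical sketch can show how the three parameters separate binding, activation, and amplification; all parameter values below are invented for illustration.

```python
import numpy as np

def generalized_response(L, Kd, eps, gamma):
    """E/Emax = eps*gamma*[L] / ((eps*gamma + 1 - eps)*[L] + Kd)."""
    return eps * gamma * L / ((eps * gamma + 1.0 - eps) * L + Kd)

L = np.logspace(-9, -4, 6)                 # ligand concentrations (M)
for eps in (1.0, 0.5, 0.1):                # full to weak partial agonist
    E = generalized_response(L, Kd=1e-6, eps=eps, gamma=5.0)
    asym = eps * 5.0 / (eps * 5.0 + 1.0 - eps)   # high-[L] asymptote
    print(f"eps={eps:4.1f}  E/Emax at high [L] ~ {E[-1]:.3f} "
          f"(asymptote {asym:.3f})")
```

The high-concentration asymptote εγ/(εγ + 1 − ε) makes the separation visible: a full agonist (ε = 1) reaches the system maximum regardless of γ, while a partial agonist plateaus below it.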
NASA Technical Reports Server (NTRS)
Pace, Dale K.
2000-01-01
A simulation conceptual model is a simulation developer's way of translating modeling requirements (i.e., what is to be represented by the simulation or its modification) into a detailed design framework (i.e., how it is to be done), from which the software, hardware, networks (in the case of distributed simulation), and systems/equipment that will make up the simulation can be built or modified. A conceptual model is the collection of information which describes a simulation developer's concept about the simulation and its pieces. That information consists of assumptions, algorithms, characteristics, relationships, and data. Taken together, these describe how the simulation developer understands what is to be represented by the simulation (entities, actions, tasks, processes, interactions, etc.) and how that representation will satisfy the requirements to which the simulation responds. Thus the conceptual model is the basis for judgment about simulation fidelity and validity for any condition that is not specifically tested. The more perspicuous and precise the conceptual model, the more likely it is that the simulation development will both fully satisfy requirements and allow demonstration that the requirements are satisfied (i.e., validation). Methods used in simulation conceptual model development have significant implications for simulation management and for assessment of simulation uncertainty. This paper suggests how to develop and document a simulation conceptual model so that the simulation fidelity and validity can be most effectively determined. These ideas for conceptual model development apply to all simulation varieties. The paper relates these ideas to uncertainty assessments as they relate to simulation fidelity and validity. The paper also explores implications for simulation management from conceptual model development methods, especially relative to reuse of simulation components.
Development and verification of an agent-based model of opinion leadership.
Anderson, Christine A; Titler, Marita G
2014-09-27
The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model performed consistently with the relationships posited in the underlying model. Nurse opinion leaders act on the strength of their beliefs and, as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists. The development and testing of agent-based models is an iterative process. The opinion leader model presented here provides a basic structure for continued model development, ongoing verification, and the establishment of validation procedures, including empirical data collection.
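A toy sketch of such an agent-based parameter sweep is given below; the attribute names, probabilities, and adoption rule are hypothetical simplifications of the model described, intended only to illustrate how systematically varying inputs produces testable output patterns.

```python
import itertools
import random

def run(n_agents, credibility, uncertainty, steps=200, seed=0):
    """Uncertain agents seek opinions; a sought opinion is adopted with
    probability equal to the (uniform, invented) provider credibility."""
    rng = random.Random(seed)
    adopted = [False] * n_agents
    for _ in range(steps):
        seeker = rng.randrange(n_agents)
        if not adopted[seeker] and rng.random() < uncertainty:
            provider = rng.randrange(n_agents)
            if provider != seeker and rng.random() < credibility:
                adopted[seeker] = True
    return sum(adopted) / n_agents

# Parameter sweep: vary the two inputs systematically and observe adoption
for cred, unc in itertools.product((0.2, 0.5, 0.8), (0.3, 0.7)):
    print(f"credibility={cred} uncertainty={unc} "
          f"adoption={run(50, cred, unc):.2f}")
```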
NASA Astrophysics Data System (ADS)
Dakhlaoui, H.; Ruelland, D.; Tramblay, Y.; Bargaoui, Z.
2017-07-01
To evaluate the impact of climate change on water resources at the catchment scale, not only are future climate projections necessary, but also robust rainfall-runoff models that remain fairly reliable under changing climate conditions. The aim of this study was thus to assess the robustness of three conceptual rainfall-runoff models (GR4j, HBV and IHACRES) on five basins in northern Tunisia under long-term climate variability, in the light of available future climate scenarios for this region. The robustness of the models was evaluated using a differential split-sample test based on a climate classification of the observation period that simultaneously accounted for precipitation and temperature conditions. The study catchments include the main hydrographical basins in northern Tunisia, which produce most of the surface water resources in the country. A 30-year period (1970-2000) was used to capture a wide range of hydro-climatic conditions. Calibration was based on the Kling-Gupta Efficiency (KGE) criterion, while model transferability was evaluated based on the Nash-Sutcliffe efficiency criterion and the volume error. The three hydrological models were shown to behave similarly under climate variability. The models simulated the runoff pattern better when transferred to wetter and colder conditions than to drier and warmer ones. Their robustness became unacceptable when climate conditions involved a decrease of more than 25% in annual precipitation and an increase of more than 1.75 °C in annual mean temperatures. The reduction in model robustness may be partly due to the climate dependence of some parameters. When compared to precipitation and temperature projections for the region, the limits of transferability obtained in this study are generally respected in the short and medium term. For long-term projections under the most pessimistic emissions scenarios, the limits of transferability are generally not respected, which may hamper the use of conceptual models for hydrological projections in northern Tunisia.
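The criteria named above have standard definitions, which the following sketch computes for a synthetic simulated/observed pair (the data are random stand-ins): KGE combines correlation, a variability ratio, and a bias ratio, while NSE and the relative volume error are the transferability measures.

```python
import numpy as np

def kge(sim, obs):
    """Kling-Gupta Efficiency: correlation, variability and bias components."""
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = sim.std() / obs.std()          # variability ratio
    beta = sim.mean() / obs.mean()         # bias ratio
    return 1 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency."""
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def volume_error(sim, obs):
    """Relative volume (water balance) error."""
    return (sim.sum() - obs.sum()) / obs.sum()

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 10.0, 365)                       # synthetic daily flows
sim = 0.9 * obs + rng.normal(0, 3.0, 365)             # imperfect simulation
print(f"KGE={kge(sim, obs):.2f}  NSE={nse(sim, obs):.2f}  "
      f"VE={volume_error(sim, obs):+.2%}")
```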
Patyk, Kelly A; Helm, Julie; Martin, Michael K; Forde-Folle, Kimberly N; Olea-Popelka, Francisco J; Hokanson, John E; Fingerlin, Tasha; Reeves, Aaron
2013-07-01
Epidemiologic simulation modeling of highly pathogenic avian influenza (HPAI) outbreaks provides a useful conceptual framework with which to estimate the consequences of HPAI outbreaks and to evaluate disease control strategies. The purposes of this study were to establish detailed and informed input parameters for an epidemiologic simulation model of the H5N1 strain of HPAI among commercial and backyard poultry in the state of South Carolina in the United States using a highly realistic representation of this poultry population; to estimate the consequences of an outbreak of HPAI in this population with a model constructed from these parameters; and to briefly evaluate the sensitivity of model outcomes to several parameters. Parameters describing disease state durations; disease transmission via direct contact, indirect contact, and local-area spread; and disease detection, surveillance, and control were established through consultation with subject matter experts, a review of the current literature, and the use of several computational tools. The stochastic model constructed from these parameters produced simulated outbreaks ranging from 2 to 111 days in duration (median 25 days), during which 1 to 514 flocks were infected (median 28 flocks). Model results were particularly sensitive to the rate of indirect contact that occurs among flocks. The baseline model established in this study can be used in the future to evaluate various control strategies, as a tool for emergency preparedness and response planning, and to assess the costs associated with disease control and the economic consequences of a disease outbreak. Published by Elsevier B.V.
Technical Note: Approximate Bayesian parameterization of a complex tropical forest model
NASA Astrophysics Data System (ADS)
Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.
2013-08-01
Inverse parameter estimation of process-based models is a long-standing problem in ecology and evolution. A key problem of inverse parameter estimation is to define a metric that quantifies how well model predictions fit to the data. Such a metric can be expressed by general cost or objective functions, but statistical inversion approaches are based on a particular metric, the probability of observing the data given the model, known as the likelihood. Deriving likelihoods for dynamic models requires making assumptions about the probability for observations to deviate from mean model predictions. For technical reasons, these assumptions are usually derived without explicit consideration of the processes in the simulation. Only in recent years have new methods become available that allow generating likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional MCMC, performs well in retrieving known parameter values from virtual field data generated by the forest model. We analyze the results of the parameter estimation, examine the sensitivity towards the choice and aggregation of model outputs and observed data (summary statistics), and show results from using this method to fit the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss differences of this approach to Approximate Bayesian Computing (ABC), another commonly used method to generate simulation-based likelihood approximations. Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can successfully be applied to process-based models of high complexity. The methodology is particularly suited to heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models in ecology and evolution.
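A stripped-down sketch of a parametric likelihood approximation placed in a Metropolis sampler is shown below; the "model" is a trivial stochastic stand-in, not FORMIND, and the single summary statistic, proposal scale, and flat prior are invented. Each likelihood evaluation runs replicate simulations and fits a Gaussian to their summary statistic.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate(theta, n_rep=50):
    """Stochastic simulator stand-in returning replicate summary statistics."""
    return theta + rng.normal(0.0, 1.0, n_rep)

def log_synth_likelihood(theta, observed_stat):
    """Fit a Gaussian to replicate simulations, evaluate the observation."""
    sims = simulate(theta)
    mu, sigma = sims.mean(), sims.std(ddof=1)
    return -0.5 * ((observed_stat - mu) / sigma) ** 2 - np.log(sigma)

observed_stat = 3.2
theta, logl = 0.0, log_synth_likelihood(0.0, observed_stat)
chain = []
for _ in range(2000):                      # plain Metropolis, flat prior
    prop = theta + rng.normal(0.0, 0.5)
    logl_prop = log_synth_likelihood(prop, observed_stat)
    if np.log(rng.uniform()) < logl_prop - logl:
        theta, logl = prop, logl_prop
    chain.append(theta)
print(f"posterior mean ~ {np.mean(chain[500:]):.2f}")
```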
The Cancer Family Caregiving Experience: An Updated and Expanded Conceptual Model
Fletcher, Barbara Swore; Miaskowski, Christine; Given, Barbara; Schumacher, Karen
2011-01-01
Objective The decade from 2000-2010 was an era of tremendous growth in family caregiving research specific to the cancer population. This research has implications for how cancer family caregiving is conceptualized, yet the most recent comprehensive model of cancer family caregiving was published ten years ago. Our objective was to develop an updated and expanded comprehensive model of the cancer family caregiving experience, derived from concepts and variables used in research during the past ten years. Methods A conceptual model was developed based on cancer family caregiving research published from 2000-2010. Results Our updated and expanded model has three main elements: 1) the stress process, 2) contextual factors, and 3) the cancer trajectory. Emerging ways of conceptualizing the relationships between and within model elements are addressed, as well as an emerging focus on caregiver-patient dyads as the unit of analysis. Conclusions Cancer family caregiving research has grown dramatically since 2000, resulting in a greatly expanded conceptual landscape. This updated and expanded model of the cancer family caregiving experience synthesizes the conceptual implications of an international body of work and demonstrates tremendous progress in how cancer family caregiving research is conceptualized. PMID:22000812
Personalizing knowledge delivery services: a conceptual framework
NASA Technical Reports Server (NTRS)
Majchrzak, Ann; Chelleppa, Ramnath K.; Cooper, Lynne P.; Hars, Alexander
2003-01-01
Consistent with the call of the Minnesota Symposium for new theory in knowledge management, we offer a new conceptualization of Knowledge Management Systems (KMS) as a portfolio of personalized knowledge delivery services. Borrowing from research on online consumer behavior, we describe the challenges imposed by personalized knowledge delivery services, and suggest design parameters that can help to overcome these challenges. We develop our design constructs through a set of hypotheses and discuss the research implications of our new conceptualization. Finally, we describe practical implications suggested by our conceptualization - practical suggestions that we hope to gain some experience with as part of an ongoing action research project at our partner organization.
Geochemical Data Package for Performance Assessment Calculations Related to the Savannah River Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaplan, Daniel I.
The Savannah River Site (SRS) disposes of low-level radioactive waste (LLW) and stabilizes high-level radioactive waste (HLW) tanks in the subsurface environment. Calculations used to establish the radiological limits of these facilities are referred to as Performance Assessments (PA), Special Analyses (SA), and Composite Analyses (CA). The objective of this document is to revise existing geochemical input values used for these calculations. This work builds on earlier compilations of geochemical data (2007, 2010), referred to as geochemical data packages. This work is being conducted as part of the on-going maintenance program of the SRS PA programs that periodically updates calculations and data packages when new information becomes available. Because application of values without full understanding of their original purpose may lead to misuse, this document also provides the geochemical conceptual model, the approach used for selecting the values, the justification for selecting data, and the assumptions made to assure that the conceptual and numerical geochemical models are reasonably conservative (i.e., bias the recommended input values to reflect conditions that will tend to predict the maximum risk to the hypothetical recipient). This document provides 1088 input parameters for geochemical parameters describing transport processes for 64 elements (>740 radioisotopes) potentially occurring within eight subsurface disposal or tank closure areas: Slit Trenches (ST), Engineered Trenches (ET), Low Activity Waste Vault (LAWV), Intermediate Level Vaults (ILV), Naval Reactor Component Disposal Areas (NRCDA), Components-in-Grout (CIG) Trenches, Saltstone Facility, and Closed Liquid Waste Tanks. The geochemical parameters described here are the distribution coefficient (Kd value), the apparent solubility concentration (ks value), and the cementitious leachate impact factor.
Montgomery, Erwin B.; He, Huang
2016-01-01
The efficacy of Deep Brain Stimulation (DBS) for an expanding array of neurological and psychiatric disorders demonstrates directly that DBS affects the basic electroneurophysiological mechanisms of the brain. The increasing array of active electrode configurations, stimulation currents, pulse widths, frequencies, and pulse patterns provides valuable tools to probe electroneurophysiological mechanisms. The extension of basic electroneurophysiological and anatomical concepts using sophisticated computational modeling and simulation has provided relatively straightforward explanations of all the DBS parameters except frequency. This article summarizes current thought about frequency and relevant observations. Current methodological and conceptual errors are critically examined in the hope that future work will not replicate them. One possible alternative theory is presented to provide a contrast to many current theories. DBS, conceptually, is a noisy discrete oscillator interacting with the basal ganglia–thalamic–cortical system of multiple re-entrant, discrete oscillators. Implications for positive and negative resonance, stochastic resonance and coherence, noisy synchronization, and holographic memory (related to movement generation) are presented. The time course of DBS neuronal responses demonstrates evolution of the DBS response consistent with the dynamics of re-entrant mechanisms. Finally, computational modeling demonstrates dynamics identical to those seen in neuronal activities recorded from human and nonhuman primates, illustrating the differences between discrete and continuous harmonic oscillators and the power of conceptualizing the nervous system as composed of interacting discrete nonlinear oscillators. PMID:27548234
NASA Astrophysics Data System (ADS)
Wu, Y.; Blodau, C.
2013-08-01
Elevated nitrogen deposition and climate change alter the vegetation communities and carbon (C) and nitrogen (N) cycling in peatlands. To address this issue we developed a new process-oriented biogeochemical model (PEATBOG) for analyzing coupled carbon and nitrogen dynamics in northern peatlands. The model consists of four submodels, which simulate: (1) daily water table depth and depth profiles of soil moisture, temperature and oxygen levels; (2) competition among three plants functional types (PFTs), production and litter production of plants; (3) decomposition of peat; and (4) production, consumption, diffusion and export of dissolved C and N species in soil water. The model is novel in the integration of the C and N cycles, the explicit spatial resolution belowground, the consistent conceptualization of movement of water and solutes, the incorporation of stoichiometric controls on elemental fluxes and a consistent conceptualization of C and N reactivity in vegetation and soil organic matter. The model was evaluated for the Mer Bleue Bog, near Ottawa, Ontario, with regards to simulation of soil moisture and temperature and the most important processes in the C and N cycles. Model sensitivity was tested for nitrogen input, precipitation, and temperature, and the choices of the most uncertain parameters were justified. A simulation of nitrogen deposition over 40 yr demonstrates the advantages of the PEATBOG model in tracking biogeochemical effects and vegetation change in the ecosystem.
NASA Astrophysics Data System (ADS)
Elshafei, Y.; Sivapalan, M.; Tonts, M.; Hipsey, M. R.
2014-06-01
It is increasingly acknowledged that, in order to sustainably manage global freshwater resources, it is critical that we better understand the nature of human-hydrology interactions at the broader catchment system scale. Yet to date, a generic conceptual framework for building models of catchment systems that include adequate representation of socioeconomic systems - and the dynamic feedbacks between human and natural systems - has remained elusive. In an attempt to work towards such a model, this paper outlines a generic framework for models of socio-hydrology applicable to agricultural catchments, made up of six key components that combine to form the coupled system dynamics: namely, catchment hydrology, population, economics, environment, socioeconomic sensitivity and collective response. The conceptual framework posits two novel constructs: (i) a composite socioeconomic driving variable, termed the Community Sensitivity state variable, which seeks to capture the perceived level of threat to a community's quality of life, and acts as a key link tying together one of the fundamental feedback loops of the coupled system, and (ii) a Behavioural Response variable as the observable feedback mechanism, which reflects land and water management decisions relevant to the hydrological context. The framework makes a further contribution through the introduction of three macro-scale parameters that enable it to normalise for differences in climate, socioeconomic and political gradients across study sites. In this way, the framework provides for both macro-scale contextual parameters, which allow for comparative studies to be undertaken, and catchment-specific conditions, by way of tailored "closure relationships", in order to ensure that site-specific and application-specific contexts of socio-hydrologic problems can be accommodated. To demonstrate how such a framework would be applied, two socio-hydrological case studies, taken from the Australian experience, are presented and the parameterisation approach that would be taken in each case is discussed. Preliminary findings in the case studies lend support to the conceptual theories outlined in the framework. It is envisioned that the application of this framework across study sites and gradients will aid in developing our understanding of the fundamental interactions and feedbacks in such complex human-hydrology systems, and allow hydrologists to improve social-ecological systems modelling through better representation of human feedbacks on hydrological processes.
NASA Astrophysics Data System (ADS)
Neri, Mattia; Toth, Elena
2017-04-01
The study presents the implementation of different regionalisation approaches for the transfer of model parameters from similar and/or neighbouring gauged basins to an ungauged catchment; in particular, it uses a semi-distributed, continuously-simulating conceptual rainfall-runoff model for simulating daily streamflows. The case study refers to a set of Apennine catchments (in the Emilia-Romagna region, Italy) that, given their spatial proximity, are assumed to belong to the same hydrologically homogeneous region and are used, alternately, as donor and regionalised basins. The model is a semi-distributed version of the HBV model (TUWien model) in which the catchment is divided into zones of different altitude that contribute separately to the total outlet flow. The model includes a snow module, whose application in the Apennine area has so far been very limited, even though snow accumulation and melting phenomena do play an important role in the study basins. Two methods, both widely applied in the recent literature, are used for regionalising the model (and contrasted in the sketch below): i) "parameters averaging", where each parameter is obtained as a weighted mean of the parameters obtained, through calibration, on the donor catchments; ii) "output averaging", where the model is run over the ungauged basin using the entire parameter set of each donor basin and the simulated outputs are then averaged. In the first approach the parameters are regionalised independently from each other; in the second, the correlation among the parameters is maintained. Since the model is semi-distributed, with each elevation zone contributing separately, the study also proposes to test a modified version of the second approach, where each zone is considered an autonomous entity whose parameters are transposed to the corresponding elevation zone of the ungauged sub-basin. The study also explores the choice of the weights used for averaging the parameters (in the "parameters averaging" approach) or the simulated streamflow (in the "output averaging" approach): in particular, weights are estimated as a function of the similarity/distance of the ungauged basin/zone to the donors, on the basis of a set of geo-morphological catchment descriptors. The predictive accuracy of the different regionalisation methods is finally assessed by jack-knife cross-validation against the observed daily runoff for all the study catchments.
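The contrast between the two regionalisation strategies can be sketched in a few lines; the stand-in model, donor parameter sets, and similarity weights below are invented for illustration.

```python
import numpy as np

def run_model(params, forcing):
    """Stand-in for the semi-distributed rainfall-runoff model (hypothetical)."""
    a, b = params
    return np.maximum(a * forcing - b, 0.0)

rng = np.random.default_rng(3)
forcing = rng.gamma(2.0, 4.0, 365)                           # synthetic rainfall
donor_params = np.array([(0.5, 1.0), (0.7, 0.5), (0.6, 2.0)])  # calibrated donors
weights = np.array([0.5, 0.3, 0.2])                          # similarity weights

# i) "parameters averaging": blend parameters, run the model once
p_avg = np.average(donor_params, axis=0, weights=weights)
q_param_avg = run_model(p_avg, forcing)

# ii) "output averaging": run with each donor's full parameter set (keeping
# parameter correlation intact), then average the simulated flows
q_out_avg = np.average([run_model(p, forcing) for p in donor_params],
                       axis=0, weights=weights)

print(f"mean flow, parameter averaging: {q_param_avg.mean():.2f}")
print(f"mean flow, output averaging:    {q_out_avg.mean():.2f}")
```

The two estimates differ whenever the model is nonlinear in its parameters, which is precisely why the choice between the approaches matters.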
NASA Astrophysics Data System (ADS)
Hernández, Mario R.; Francés, Félix
2015-04-01
One phase of the hydrological model implementation process that contributes significantly to the uncertainty of hydrological predictions is the calibration phase, in which values of the unknown model parameters are tuned by optimizing an objective function. An unsuitable error model (e.g. Standard Least Squares, SLS) introduces noise into the estimation of the parameters. The main sources of this noise are input errors and structural deficiencies of the hydrological model. The biased calibrated parameters cause the model divergence phenomenon, in which the error variance of the (spatially and temporally) forecasted flows far exceeds the error variance in the fitting period, and provoke the loss of part or all of the physical meaning of the modeled processes; in other words, they yield a calibrated hydrological model that works well, but not for the right reasons. Moreover, an unsuitable error model yields an unreliable predictive uncertainty assessment. Hence, with the aim of preventing all these undesirable effects, this research focuses on the Bayesian joint inference (BJI) of both the hydrological and error model parameters, considering a general additive (GA) error model that allows for correlation, non-stationarity (in variance and bias) and non-normality of model residuals. As the hydrological model, we used a conceptual distributed model called TETIS, with a particular split structure of the effective model parameters. Bayesian inference was performed with the aid of a Markov chain Monte Carlo (MCMC) algorithm called DREAM-ZS, which quantifies the uncertainty of the hydrological and error model parameters by sampling the joint posterior probability distribution, conditioned on the observed flows. The BJI methodology is a very powerful and reliable tool, but it must be used correctly: that is, if non-stationarity in error variance and bias is modeled, the Total Laws must be taken into account. The results of this research show that the application of BJI with a GA error model improves the robustness of the hydrological parameters (diminishing the model divergence phenomenon) and improves the reliability of the streamflow predictive distribution, with respect to the results of an unsuitable error model such as SLS. Finally, the most likely prediction in a validation period shows similar performance for both the BJI+GA and SLS error models.
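A minimal sketch of the kind of non-stationary additive error model at issue is given below, assuming the residual standard deviation grows linearly with the simulated flow (sigma_t = s0 + s1·q_sim,t) and allowing a constant bias; parameter names and data are illustrative, and the full GA model in the study also handles residual correlation and non-normality.

```python
import numpy as np

def log_likelihood(q_obs, q_sim, s0, s1, mu_bias=0.0):
    """Gaussian log-likelihood with flow-dependent residual std
    sigma_t = s0 + s1 * q_sim_t and an optional constant bias."""
    sigma = s0 + s1 * q_sim
    resid = q_obs - q_sim - mu_bias
    return np.sum(-0.5 * np.log(2 * np.pi * sigma ** 2)
                  - 0.5 * (resid / sigma) ** 2)

rng = np.random.default_rng(5)
q_sim = rng.gamma(2.0, 10.0, 500)                    # simulated flows
q_obs = q_sim + rng.normal(0.0, 0.5 + 0.1 * q_sim)   # heteroscedastic truth
print(f"logL, heteroscedastic model: "
      f"{log_likelihood(q_obs, q_sim, 0.5, 0.1):.1f}")
print(f"logL, SLS-like constant sigma: "
      f"{log_likelihood(q_obs, q_sim, (q_obs - q_sim).std(), 0.0):.1f}")
```

In a joint inference, s0, s1 and the bias would be sampled together with the hydrological parameters rather than fixed as here.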
Why College Students Cheat: A Conceptual Model of Five Factors
ERIC Educational Resources Information Center
Yu, Hongwei; Glanzer, Perry L.; Johnson, Byron R.; Sriram, Rishi; Moore, Brandon
2018-01-01
Though numerous studies have identified factors associated with academic misconduct, few have proposed conceptual models that could make sense of multiple factors. In this study, we used structural equation modeling (SEM) to test a conceptual model of five factors using data from a relatively large sample of 2,503 college students. The results…
Variational Bayesian Parameter Estimation Techniques for the General Linear Model
Starke, Ludger; Ostwald, Dirk
2017-01-01
Variational Bayes (VB), variational maximum likelihood (VML), restricted maximum likelihood (ReML), and maximum likelihood (ML) are cornerstone parametric statistical estimation techniques in the analysis of functional neuroimaging data. However, the theoretical underpinnings of these model parameter estimation techniques are rarely covered in introductory statistical texts. Because of the widespread practical use of VB, VML, ReML, and ML in the neuroimaging community, we reasoned that a theoretical treatment of their relationships and their application in a basic modeling scenario may be helpful for both neuroimaging novices and practitioners alike. In this technical study, we thus revisit the conceptual and formal underpinnings of VB, VML, ReML, and ML and provide a detailed account of their mathematical relationships and implementational details. We further apply VB, VML, ReML, and ML to the general linear model (GLM) with non-spherical error covariance as commonly encountered in the first-level analysis of fMRI data. To this end, we explicitly derive the corresponding free energy objective functions and ensuing iterative algorithms. Finally, in the applied part of our study, we evaluate the parameter and model recovery properties of VB, VML, ReML, and ML, first in an exemplary setting and then in the analysis of experimental fMRI data acquired from a single participant under visual stimulation. PMID:28966572
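As a small concrete anchor for the GLM setting described, the sketch below computes the ML (generalized least squares) estimate of the regression coefficients for a known non-spherical error covariance; the AR(1)-like covariance and data are synthetic, and the paper's harder case of estimating covariance components via VB/VML/ReML is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 120
X = np.column_stack([np.ones(n), rng.normal(size=n)])    # design matrix
rho = 0.4                                                # AR(1)-like errors
V = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
y = X @ np.array([2.0, 1.5]) + np.linalg.cholesky(V) @ rng.normal(size=n)

# ML estimate of beta given known V coincides with generalized least squares
Vinv = np.linalg.inv(V)
beta_hat = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
resid = y - X @ beta_hat
s2_ml = resid @ Vinv @ resid / n     # ML variance estimate (biased);
                                     # ReML would divide by n - rank(X)
print(f"beta_hat = {beta_hat.round(2)}, s2_ML = {s2_ml:.2f}")
```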
A conceptual modeling framework for discrete event simulation using hierarchical control structures.
Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D
2015-08-01
Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.
Conceptualizing Telehealth in Nursing Practice: Advancing a Conceptual Model to Fill a Virtual Gap.
Nagel, Daniel A; Penner, Jamie L
2016-03-01
Increasingly nurses use various telehealth technologies to deliver health care services; however, there has been a lag in research and generation of empirical knowledge to support nursing practice in this expanding field. One challenge to generating knowledge is a gap in development of a comprehensive conceptual model or theoretical framework to illustrate relationships of concepts and phenomena inherent to adoption of a broad range of telehealth technologies to holistic nursing practice. A review of the literature revealed eight published conceptual models, theoretical frameworks, or similar entities applicable to nursing practice. Many of these models focus exclusively on use of telephones and four were generated from qualitative studies, but none comprehensively reflect complexities of bridging nursing process and elements of nursing practice into use of telehealth. The purpose of this article is to present a review of existing conceptual models and frameworks, discuss predominant themes and features of these models, and present a comprehensive conceptual model for telehealth nursing practice synthesized from this literature for consideration and further development. This conceptual model illustrates characteristics of, and relationships between, dimensions of telehealth practice to guide research and knowledge development in provision of holistic person-centered care delivery to individuals by nurses through telehealth technologies. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
Rolland, Colette; Yu, Eric; Salinesi, Camille; Castro, Jaelson
The use of intentional concepts, the notion of "goal" in particular, has been prominent in recent approaches to requirement engineering (RE). Goal-oriented frameworks and methods for requirements engineering (GORE) have been keynote topics in requirements engineering, conceptual modelling, and more generally in software engineering. What are the conceptual modelling foundations in these approaches? RIGiM (Requirements Intentions and Goals in Conceptual Modelling) aims to provide a forum for discussing the interplay between requirements engineering and conceptual modelling, and in particular, to investigate how goal- and intention-driven approaches help in conceptualising purposeful systems. What are the fundamental objectives and premises of requirements engineering and conceptual modelling respectively, and how can they complement each other? What are the demands on conceptual modelling from the standpoint of requirements engineering? What conceptual modelling techniques can be further taken advantage of in requirements engineering? What are the upcoming modelling challenges and issues in GORE? What are the unresolved open questions? What lessons are there to be learnt from industrial experiences? What empirical data are there to support the cost-benefit analysis when adopting GORE methods? Are there application domains or types of project settings for which goals and intentional approaches are particularly suitable or not suitable? What degree of formalization and automation, or interactivity is feasible and appropriate for what types of participants during requirements engineering?
NASA Astrophysics Data System (ADS)
Berg, Matthew; Hartley, Brian; Richters, Oliver
2015-01-01
By synthesizing stock-flow consistent models, input-output models, and aspects of ecological macroeconomics, a method is developed to simultaneously model monetary flows through the financial system, flows of produced goods and services through the real economy, and flows of physical materials through the natural environment. This paper highlights the linkages between the physical environment and the economic system by emphasizing the role of the energy industry. A conceptual model is developed in general form with an arbitrary number of sectors, while emphasizing connections with the agent-based, econophysics, and complexity economics literature. First, we use the model to challenge claims that 0% interest rates are a necessary condition for a stationary economy and conduct a stability analysis within the parameter space of interest rates and consumption parameters of an economy in stock-flow equilibrium. Second, we analyze the role of energy price shocks in contributing to recessions, incorporating several propagation and amplification mechanisms. Third, implied heat emissions from energy conversion and the effect of anthropogenic heat flux on climate change are considered in light of a minimal single-layer atmosphere climate model, although the model is only implicitly, not explicitly, linked to the economic model.
Van Oudenhove, Lukas; Cuypers, Stefaan
2014-05-01
Psychosomatic medicine, with its prevailing biopsychosocial model, aims to integrate human and exact sciences with their divergent conceptual models. Therefore, its own conceptual foundations, which often remain implicit and unknown, may be critically relevant. We defend the thesis that choosing between different metaphysical views on the 'mind-body problem' may have important implications for the conceptual foundations of psychosomatic medicine, and therefore potentially also for its methods, scientific status and relationship with the scientific disciplines it aims to integrate: biomedical sciences (including neuroscience), psychology and social sciences. To make this point, we introduce three key positions in the philosophical 'mind-body' debate (emergentism, reductionism, and supervenience physicalism) and investigate their consequences for the conceptual basis of the biopsychosocial model in general and its 'psycho-biological' part ('mental causation') in particular. Despite the clinical merits of the biopsychosocial model, we submit that it is conceptually underdeveloped or even flawed, which may hamper its use as a proper scientific model.
A Generalized QMRA Beta-Poisson Dose-Response Model.
Xie, Gang; Roiko, Anne; Stratton, Helen; Lemckert, Charles; Dunn, Peter K; Mengersen, Kerrie
2016-10-01
Quantitative microbial risk assessment (QMRA) is widely accepted for characterizing the microbial risks associated with food, water, and wastewater. Single-hit dose-response models are the most commonly used dose-response models in QMRA. Denoting PI(d) as the probability of infection at a given mean dose d, a three-parameter generalized QMRA beta-Poisson dose-response model, PI(d|α,β,r*), is proposed in which the minimum number of organisms required to cause infection, Kmin, is not fixed but is a random variable following a geometric distribution with parameter 0 < r* ≤ 1.
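For orientation, the familiar two-parameter approximate beta-Poisson model that this work generalizes is PI(d) = 1 − (1 + d/β)^(−α); the sketch below evaluates it at a few doses. The parameter values are invented, not pathogen-specific estimates, and the generalized model's single-hit limit (Kmin fixed at one organism) corresponds, as we read the abstract, to the geometric parameter r* approaching 1.

```python
def beta_poisson(dose, alpha, beta):
    """Approximate beta-Poisson: PI(d) = 1 - (1 + d/beta)**(-alpha)."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Illustrative (invented) parameter values
for d in (1, 10, 100, 1000):
    print(f"dose={d:5d}  P(infection) = {beta_poisson(d, 0.25, 16.0):.3f}")
```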
Improving inflow forecasting into hydropower reservoirs through a complementary modelling framework
NASA Astrophysics Data System (ADS)
Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K.
2014-10-01
Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and the benefits gained through hydropower generation. Improving hourly reservoir inflow forecasts over a 24 h lead time is considered within the day-ahead (Elspot) market of the Nordic exchange market. We present here a new approach for issuing hourly reservoir inflow forecasts that aims to improve on existing forecasting models that are in place operationally, without needing to modify the pre-existing approach, but instead formulating an additive or complementary model that is independent and captures the structure the existing model may be missing. Besides improving the forecast skill of operational models, the approach estimates the uncertainty in the complementary model structure and produces probabilistic inflow forecasts that carry suitable information for reducing uncertainty in the decision-making processes of hydropower systems operation. The procedure presented comprises an error model added on top of an unalterable constant-parameter conceptual model, the models being demonstrated with reference to the 207 km² Krinsvatn catchment in central Norway. The structure of the error model is established based on attributes of the residual time series from the conceptual model. Deterministic and probabilistic evaluations revealed an overall significant improvement in forecast accuracy for lead times up to 17 h. Season-based evaluations indicated that the improvement in inflow forecasts varies across seasons and that inflow forecasts in autumn and spring are less successful, with the 95% prediction interval bracketing less than 95% of the observations for lead times beyond 17 h.
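One simple way to realize such a complementary model is an autoregressive model fitted to the base model's residuals, as sketched below; the AR(1) form, the synthetic base forecast, and the data are assumptions for illustration, not the study's actual error-model structure.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 500
t = np.arange(n)
base_forecast = 50 + 10 * np.sin(t / 24)     # unaltered base model output
err = np.zeros(n)                            # AR(1) structure the base
for i in range(1, n):                        # model is assumed to miss
    err[i] = 0.8 * err[i - 1] + rng.normal(0.0, 1.0)
inflow = base_forecast + err                 # synthetic observations

# Fit the complementary AR(1) coefficient to the base-model residuals
resid = inflow - base_forecast
phi = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])

# The correction decays over the lead time; applied here to the last base
# value only, for simplicity
last_err = resid[-1]
for lead in (1, 6, 12, 24):
    corr = phi ** lead * last_err
    print(f"lead={lead:2d} h  correction={corr:+.2f}  "
          f"corrected forecast={base_forecast[-1] + corr:.2f}")
```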
Global Sensitivity Analysis for Process Identification under Model Uncertainty
NASA Astrophysics Data System (ADS)
Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.
2015-12-01
The environmental system consists of various physical, chemical, and biological processes, and environmental models are built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize them. While global sensitivity analysis has been widely used to identify important processes, the identification has typically been based on deterministic process conceptualization that uses a single model to represent a process. However, environmental systems are complex, and it often happens that a single process may be simulated by multiple alternative models. Ignoring model uncertainty in process identification may bias the identification, in that processes identified as important may not be so in the real world. This study addresses this problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concepts of Sobol sensitivity analysis and model averaging. Similar to Sobol sensitivity analysis for identifying important parameters, our new method evaluates the variance change when a process is fixed at each of its different conceptualizations. The variance accounts for both parametric and model uncertainty using the method of model averaging. The method is demonstrated using a synthetic groundwater modeling study that considers a recharge process and a parameterization process, each with two alternative models. Important processes of groundwater flow and transport are evaluated using our new method. The method is mathematically general and can be applied to a wide range of environmental problems.
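The variance-decomposition idea can be sketched compactly: fix a process at each of its alternative conceptualizations and measure how much of the total, model-averaged output variance is explained by that choice. Everything in the sketch below (the stand-in response, the two recharge alternatives, the equal model weights) is invented.

```python
import numpy as np

rng = np.random.default_rng(13)
n = 20_000

def system_output(recharge):
    """Stand-in flow/transport response with shared parameter uncertainty."""
    k = rng.lognormal(0.0, 0.5, n)
    return recharge * k

recharge_models = {"R1": rng.normal(1.0, 0.2, n),          # alternative
                   "R2": rng.gamma(9.0, 1.4 / 9.0, n)}     # conceptualizations
weights = {"R1": 0.5, "R2": 0.5}                           # model probabilities

outputs = {m: system_output(r) for m, r in recharge_models.items()}
grand_mean = sum(weights[m] * outputs[m].mean() for m in outputs)
total_var = sum(weights[m] * ((outputs[m] - grand_mean) ** 2).mean()
                for m in outputs)
# share of total variance explained by the choice of recharge model
between = sum(weights[m] * (outputs[m].mean() - grand_mean) ** 2
              for m in outputs)
print(f"recharge-process sensitivity index ~ {between / total_var:.3f}")
```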
Wolfs, Vincent; Villazon, Mauricio Florencio; Willems, Patrick
2013-01-01
Applications such as real-time control, uncertainty analysis and optimization require an extensive number of model iterations. Full hydrodynamic sewer models are not practical for these applications due to their excessive computation time, so simplifications are required. A lumped conceptual modelling approach results in much faster calculation. The process of identifying and calibrating the conceptual model structure can, however, be time-consuming. Moreover, many conceptual models lack accuracy, or do not account for backwater effects. To overcome these problems, a modelling methodology was developed which is suited for semi-automatic calibration. The methodology is tested on the sewer system of the city of Geel in the Grote Nete river basin in Belgium, using both synthetic design storm events and long time series of rainfall input. A MATLAB/Simulink(®) tool was developed to guide the modeller through the step-wise model construction, significantly reducing the time required for the conceptual modelling process.
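A lumped conceptual surrogate of the kind meant here can be as simple as a cascade of linear reservoirs, dS/dt = q_in − S/k; the sketch below routes a synthetic storm through two reservoirs. The structure and constants are illustrative and would in practice be identified and calibrated against full hydrodynamic simulations (the study's semi-automatic calibration step).

```python
import numpy as np

def linear_reservoir_cascade(rain, ks, dt=1.0):
    """Route a rainfall series through linear reservoirs: dS/dt = q_in - S/k."""
    q = np.asarray(rain, dtype=float)
    for k in ks:
        S, out = 0.0, np.empty_like(q)
        for i, q_in in enumerate(q):
            S += (q_in - S / k) * dt      # explicit Euler storage update
            out[i] = S / k
        q = out
    return q

rng = np.random.default_rng(2)
rain = np.concatenate([rng.gamma(2.0, 1.0, 24), np.zeros(48)])  # storm, then dry
outflow = linear_reservoir_cascade(rain, ks=[3.0, 6.0])
print(f"peak rain {rain.max():.2f}, peak outflow {outflow.max():.2f} "
      f"(attenuated and delayed)")
```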
Using 3D dynamic cartography and hydrological modelling for linear streamflow mapping
NASA Astrophysics Data System (ADS)
Drogue, G.; Pfister, L.; Leviandier, T.; Humbert, J.; Hoffmann, L.; El Idrissi, A.; Iffly, J.-F.
2002-10-01
This paper presents a regionalization methodology and an original representation of the downstream variation of daily streamflow using a conceptual rainfall-runoff model (HRM) and the 3D visualization tools of the GIS ArcView. The regionalization of the parameters of the HRM model was obtained by fitting simultaneously the runoff series from five sub-basins of the Alzette river basin (Grand Duchy of Luxembourg) according to the permeability of geological formations. After validating the transposability of the regional parameter values on five test basins, streamflow series were simulated with the model at ungauged sites in one medium-sized, geologically contrasted test basin and interpolated assuming a linear increase of streamflow between modelling points. 3D spatio-temporal cartography of mean annual and high raw and specific discharges is illustrated. During a severe flood, the propagation of the flood waves in the different parts of the stream network shows an important contribution from sub-basins lying on impervious geological formations (direct runoff) compared with those including permeable geological formations, which have a more contrasted hydrological response. The effect of the spatial variability of rainfall is clearly perceptible.
Use of constrained optimization in the conceptual design of a medium-range subsonic transport
NASA Technical Reports Server (NTRS)
Sliwa, S. M.
1980-01-01
Constrained parameter optimization was used to perform the optimal conceptual design of a medium-range transport configuration. The impact of choosing a given performance index was studied, and the required income for a 15 percent return on investment was proposed as a figure of merit. A number of design constants and constraint functions were systematically varied to document the sensitivities of the optimal design to a variety of economic and technological assumptions. A comparison was made for each of the parameter variations between the baseline configuration and the optimally redesigned configuration.
Alessandri, Elena; Williamson, Victoria J.; Eiholzer, Hubert; Williamon, Aaron
2015-01-01
Critical reviews offer rich data that can be used to investigate how musical experiences are conceptualized by expert listeners. However, these data also present significant challenges in terms of organization, analysis, and interpretation. This study presents a new systematic method for examining written responses to music, tested on a substantial corpus of music criticism. One hundred critical reviews of Beethoven’s piano sonata recordings, published in the Gramophone between August 1934 and July 2010, were selected using in-depth data reduction (qualitative/quantitative approach). The texts were then examined using thematic analysis in order to generate a visual descriptive model of expert critical review. This model reveals how the concept of evaluation permeates critical review. It also distinguishes between two types of descriptors. The first characterizes the performance in terms of specific actions or features of the musical sound (musical parameters, technique, and energy); the second appeals to higher-order properties (artistic style, character and emotion, musical structure, communicativeness) or assumed performer qualities (understanding, intentionality, spontaneity, sensibility, control, and care). The new model provides a methodological guide and conceptual basis for future studies of critical review in any genre.
Reliability and Maintainability model (RAM) user and maintenance manual. Part 2
NASA Technical Reports Server (NTRS)
Ebeling, Charles E.
1995-01-01
This report documents the procedures for utilizing and maintaining the Reliability and Maintainability Model (RAM) developed by the University of Dayton for the NASA Langley Research Center (LaRC). The RAM model predicts reliability and maintainability (R&M) parameters for conceptual space vehicles using parametric relationships between vehicle design and performance characteristics and subsystem mean time between maintenance actions (MTBM) and manhours per maintenance action (MH/MA). These parametric relationships were developed using aircraft R&M data from over thirty different military aircraft of all types. This report describes the general methodology used within the model, the execution and computational sequence, the input screens and data, the output displays and reports, and study analyses and procedures. A source listing is provided.
Leahy-Warren, Patricia; Mulcahy, Helen; Benefield, Lazelle; Bradley, Colin; Coffey, Alice; Donohoe, Ann; Fitzgerald, Serena; Frawley, Tim; Healy, Elizabeth; Healy, Maria; Kelly, Marcella; McCarthy, Bernard; McLoughlin, Kathleen; Meagher, Catherine; O'Connell, Rhona; O'Mahony, Aoife; Paul, Gillian; Phelan, Amanda; Stokes, Diarmuid; Walsh, Jessica; Savage, Eileen
2017-01-01
Successful models of nursing and midwifery in the community delivering healthcare throughout the lifespan and across a health and illness continuum are limited, yet necessary to guide global health services. Primary and community health services are the typical points of access for most people and the location where most care is delivered. The scope of primary healthcare is complex and multifaceted and therefore requires a practice framework with sound conceptual and theoretical underpinnings. The aim of this paper is to present a conceptual model informed by a scoping evidence review of the literature. A scoping evidence review of the literature was conducted using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Databases included CINAHL, MEDLINE, PsycINFO and SocINDEX using the EBSCO platform and the Cochrane Library, using the keywords: model, nursing, midwifery, community, primary care. Grey literature for selected countries was searched using the Google 'advanced' search interface. Data extraction and quality appraisal for both empirical and grey literature were conducted independently by two reviewers. From 127 empirical and 24 non-empirical papers, data extraction parameters, in addition to the usual methodological features, included: the nature of nursing and midwifery; the population group; interventions and main outcomes; and components of effective nursing and midwifery outcomes. The evidence was categorised into six broad areas and subsequently synthesised into four themes, which were not mutually exclusive: (1) Integrated and Collaborative Care; (2) Organisation and Delivery of Nursing and Midwifery Care in the Community; (3) Adjuncts to Nursing Care; and (4) Overarching Conceptual Model. It is the latter theme that is the focus of this paper. In essence, the model depicts a person/client on a lifespan and a preventative-curative trajectory. The health-related needs of the client, commensurate with their position on both trajectories, determine the nurse or midwife intervention. Consequently, it is this need that determines the discipline or speciality of the nurse or midwife with the most appropriate competencies. Use of a conceptual model of nursing and midwifery to inform decision-making in primary/community-based care ensures clinical outcomes are meaningful and more sustainable. Operationalising this model for nursing and midwifery in the community demands strong leadership and effective clinical governance.
Zhang, Z. Fred; White, Signe K.; Bonneville, Alain; ...
2014-12-31
Numerical simulations have been used for estimating CO2 injectivity, CO2 plume extent, pressure distribution, and Area of Review (AoR), and for the design of CO2 injection operations and the monitoring network for the FutureGen project. The simulation results are affected by uncertainties associated with numerous input parameters, the conceptual model, initial and boundary conditions, and factors related to injection operations. Furthermore, the uncertainties in the simulation results vary in space and time. The key need is to identify those uncertainties that critically impact the simulation results and to quantify their impacts. We introduce an approach to determine the local sensitivity coefficient (LSC), defined as the percent response of the output, to rank the importance of model inputs on outputs. The uncertainty of an input with higher sensitivity has a larger impact on the output. The LSC is scalable by the error of an input parameter, and the composite sensitivity of an output to a subset of inputs can be calculated by summing the individual LSC values. We applied the local sensitivity coefficient method to the FutureGen 2.0 site in Morgan County, Illinois, USA, to investigate the sensitivity of input parameters and initial conditions. The conceptual model for the site consists of 31 layers, each of which has a unique set of input parameters. The sensitivity of 11 parameters for each layer and 7 inputs as initial conditions was then investigated. For CO2 injectivity and plume size, about half of the uncertainty is due to only 4 or 5 of the 348 inputs, and three-quarters of the uncertainty is due to about 15 of the inputs. The initial conditions and the properties of the injection layer and its neighbouring layers contribute most of the sensitivity. Overall, the simulation outputs are very sensitive to only a small fraction of the inputs. However, the parameters that are important for controlling CO2 injectivity are not the same as those controlling the plume size. The three most sensitive inputs for injectivity were the horizontal permeability of Mt Simon 11 (the injection layer), the initial fracture-pressure gradient, and the residual aqueous saturation of Mt Simon 11, while those for the plume area were the initial salt concentration, the initial pressure, and the initial fracture-pressure gradient. The advantages of requiring only a single set of simulation results, scalability to the proper parameter errors, and easy calculation of composite sensitivities make this approach very cost-effective for estimating AoR uncertainty and for guiding cost-effective site characterization, injection well design, and monitoring network design for CO2 storage projects.
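A minimal sketch of an LSC computed by finite differences may clarify the definition; the two-input "model" and base values below are hypothetical stand-ins, not the FutureGen simulator:

```python
# Local sensitivity coefficient (LSC): percent change in output per percent
# perturbation of one input, evaluated by finite differences about a base case.
import numpy as np

def model(x):
    """Hypothetical model: output from inputs (e.g., permeability, pressure)."""
    k, p0 = x
    return k**0.5 * p0          # stand-in for CO2 injectivity

def local_sensitivity(model, x0, rel_step=0.01):
    x0 = np.asarray(x0, dtype=float)
    y0 = model(x0)
    lsc = np.empty_like(x0)
    for i in range(x0.size):
        x = x0.copy()
        x[i] *= 1.0 + rel_step                    # perturb one input by 1%
        lsc[i] = (model(x) - y0) / y0 / rel_step  # % output change per % input change
    return lsc

x0 = [1e-13, 2.0e7]                   # base permeability, base pressure (illustrative)
print(local_sensitivity(model, x0))   # approximately [0.5, 1.0]
# Composite sensitivity of a subset of inputs: sum of the individual LSC values.
```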
Conceptual Models and Guidelines for Clinical Assessment of Financial Capacity
Marson, Daniel
2016-01-01
The ability to manage financial affairs is a life skill of critical importance, and neuropsychologists are increasingly asked to assess financial capacity across a variety of settings. Sound clinical assessment of financial capacity requires knowledge and appreciation of applicable clinical conceptual models and principles. However, the literature has presented relatively little conceptual guidance for clinicians concerning financial capacity and its assessment. This article seeks to address this gap. The article presents six clinical models of financial capacity: (1) the early gerontological IADL model of Lawton, (2) the clinical skills model and (3) related cognitive psychological model developed by Marson and colleagues, (4) a financial decision-making model adapting earlier decisional capacity work of Appelbaum and Grisso, (5) a person-centered model of financial decision-making developed by Lichtenberg and colleagues, and (6) a recent model of financial capacity in the real world developed through the Institute of Medicine. Accompanying the presentation of the models is a discussion of the conceptual and practical perspectives they represent for clinical assessment. Based on the models, the article concludes by presenting a series of conceptually oriented guidelines for clinical assessment of financial capacity. In summary, sound assessment of financial capacity requires knowledge and appreciation of clinical conceptual models and principles. Awareness of such models, principles and guidelines will strengthen and advance clinical assessment of financial capacity.
Methods and conceptual models to guide the development of tools for diagnosing the causes of biological impairment within aquatic ecosystems of the United States are described in this report. The conceptual models developed here address nutrients, suspended and bedded sediments (...
Single particle momentum and angular distributions in hadron-hadron collisions at ultrahigh energies
NASA Technical Reports Server (NTRS)
Chou, T. T.; Chen, N. Y.
1985-01-01
The forward-backward charged multiplicity distribution P(n_F, n_B) of events in the 540 GeV antiproton-proton collider has been extensively studied by the UA5 Collaboration. It was pointed out that the distribution with respect to n = n_F + n_B satisfies approximate KNO scaling and that with respect to Z = n_F - n_B it is binomial. The geometrical model of hadron-hadron collision interprets the large multiplicity fluctuation as due to the widely different nature of collisions at different impact parameters b. For a single impact parameter b, the collision in the geometrical model should exhibit stochastic behavior. This separation of the stochastic and nonstochastic (KNO) aspects of multiparticle production processes gives conceptually a lucid and attractive picture of such collisions, leading to the concept of the partition temperature T_p and the single particle momentum spectrum to be discussed in detail.
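For orientation, the two distributional statements can be written out explicitly; this is a standard rendering of KNO scaling and of a symmetric binomial forward-backward split (my reconstruction, not taken from the paper itself):

```latex
% KNO scaling of the total multiplicity n = n_F + n_B,
% with psi a universal scaling function:
\langle n \rangle \, P_n \;\approx\; \psi\!\left( \frac{n}{\langle n \rangle} \right),
\qquad n = n_F + n_B ,
% and, at fixed n, a binomial distribution of Z = n_F - n_B
% (each particle goes forward or backward with probability 1/2):
P(Z \mid n) \;=\; \binom{n}{(n+Z)/2} \left( \tfrac{1}{2} \right)^{n},
\qquad Z = n_F - n_B .
```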
A new framework for comprehensive, robust, and efficient global sensitivity analysis: 2. Application
NASA Astrophysics Data System (ADS)
Razavi, Saman; Gupta, Hoshin V.
2016-01-01
Based on the theoretical framework for sensitivity analysis called "Variogram Analysis of Response Surfaces" (VARS), developed in the companion paper, we develop and implement a practical "star-based" sampling strategy (called STAR-VARS) for the application of VARS to real-world problems. We also develop a bootstrap approach to provide confidence-level estimates for the VARS sensitivity metrics and to evaluate the reliability of inferred factor rankings. The effectiveness, efficiency, and robustness of STAR-VARS are demonstrated via two real-data hydrological case studies (a 5-parameter conceptual rainfall-runoff model and a 45-parameter land surface scheme hydrology model), and a comparison with the "derivative-based" Morris and "variance-based" Sobol approaches is provided. Our results show that STAR-VARS provides reliable and stable assessments of "global" sensitivity across the full range of scales in the factor space, while being 1-2 orders of magnitude more efficient than the Morris or Sobol approaches.
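The core VARS quantity is easy to sketch. The code below (a toy response surface of my own, not the STAR-VARS implementation) estimates the directional variogram of the model response along one factor; integrating such variograms over a range of lags yields the scale-dependent sensitivity metrics the abstract refers to:

```python
# Variogram of a model response along one factor:
#   gamma(h) = 0.5 * E[(y(x1 + h, x2) - y(x1, x2))^2]
import numpy as np

rng = np.random.default_rng(1)

def response(x1, x2):
    """Hypothetical two-factor model response surface."""
    return np.sin(3 * x1) + 0.1 * x2**2

def variogram_along_x1(h, n=5000):
    x1 = rng.uniform(0, 1 - h, n)        # base points, kept inside the unit box
    x2 = rng.uniform(0, 1, n)
    d = response(x1 + h, x2) - response(x1, x2)
    return 0.5 * np.mean(d**2)

for h in (0.05, 0.1, 0.2):
    print(f"gamma_x1({h}) = {variogram_along_x1(h):.4f}")
# Larger variogram values across the lag range indicate a factor along which
# the response varies more strongly, i.e., a more influential factor.
```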
Pumping tests in non-uniform aquifers - the linear strip case
Butler, J.J.; Liu, W.Z.
1991-01-01
Many pumping tests are performed in geologic settings that can be conceptualized as a linear infinite strip of one material embedded in a matrix of differing flow properties. A semi-analytical solution is presented to aid the analysis of drawdown data obtained from pumping tests performed in settings that can be represented by such a conceptual model. Integral transform techniques are employed to obtain a solution in transform space that can be numerically inverted to real space. Examination of the numerically transformed solution reveals several interesting features of flow in this configuration. If the transmissivity of the strip is much higher than that of the matrix, linear and bilinear flow are the primary flow regimes during a pumping test. If the contrast between matrix and strip properties is not as extreme, then radial flow should be the primary flow mechanism. Sensitivity analysis is employed to develop insight into the controls on drawdown in this conceptual model and to demonstrate the importance of temporal and spatial placement of observations. Changes in drawdown are sensitive to the transmissivity of the strip for a limited time duration. After that time, only the total drawdown remains a function of strip transmissivity. In the case of storativity, both the total drawdown and changes in drawdown are sensitive to the storativity of the strip for a time of quite limited duration. After that time, essentially no information can be gained about the storage properties of the strip from drawdown data. An example analysis is performed using data previously presented in the literature to demonstrate the viability of the semi-analytical solution and to illustrate a general procedure for analysis of drawdown data in complex geologic settings. This example reinforces the importance of observation well placement and the time of data collection in constraining parameter correlation, a major source of the uncertainty that arises in the parameter estimation procedure. © 1991.
Evaluation of bacterial run and tumble motility parameters through trajectory analysis
NASA Astrophysics Data System (ADS)
Liang, Xiaomeng; Lu, Nanxi; Chang, Lin-Ching; Nguyen, Thanh H.; Massoudieh, Arash
2018-04-01
In this paper, a method for extracting the behavior parameters of bacterial migration based on the run-and-tumble conceptual model is described. The methodology is applied to microscopic images representing the motile movement of flagellated Azotobacter vinelandii. The bacterial cells are considered to change direction during both runs and tumbles, as is evident from the movement trajectories. An unsupervised cluster analysis was performed to fractionate each bacterial trajectory into run and tumble segments, and the distributions of parameters for each mode were then extracted by fitting the mathematical distributions best representing the data. A Gaussian copula was used to model the autocorrelation in swimming velocity. For both run and tumble modes, the Gamma distribution was found to fit the marginal velocity best, and the Logistic distribution was found to represent the deviation angle better than the other distributions considered. For the transition rate distribution, the log-logistic and log-normal distributions, respectively, were found to perform better than the traditionally assumed exponential distribution. A model was then developed to mimic the motility behavior of bacteria in the presence of flow. The model was applied to evaluate its ability to describe observed patterns of bacterial deposition on surfaces in a micro-model experiment with an approach velocity of 200 μm/s. It was found that the model can qualitatively reproduce the attachment results of the micro-model setting.
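One step of the described workflow, choosing the best-fitting marginal distribution for a segment population, can be sketched as follows (synthetic stand-in data and SciPy maximum-likelihood fits; not the authors' pipeline):

```python
# Fit candidate distributions to run-segment speeds and compare log-likelihoods.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
run_speeds = rng.gamma(shape=4.0, scale=5.0, size=500)  # stand-in data, um/s

candidates = {
    "gamma": stats.gamma,
    "lognorm": stats.lognorm,
    "expon": stats.expon,
}
for name, dist in candidates.items():
    params = dist.fit(run_speeds)                  # maximum-likelihood fit
    ll = np.sum(dist.logpdf(run_speeds, *params))  # total log-likelihood
    print(f"{name:8s} log-likelihood = {ll:.1f}")
# The best-supported distribution (here gamma, by construction) would then be
# carried into the motility simulation model.
```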
NASA Technical Reports Server (NTRS)
Ebeling, Charles; Beasley, Kenneth D.
1992-01-01
The first year of research to provide NASA support in predicting operational and support parameters and costs of proposed space systems is reported. Some of the specific research objectives were (1) to develop a methodology for deriving reliability and maintainability parameters and, based upon their estimates, determine the operational capability and support costs, and (2) to identify data sources and establish an initial data base to implement the methodology. Implementation of the methodology is accomplished through the development of a comprehensive computer model. While the model appears to work reasonably well when applied to aircraft systems, it was not accurate when used for space systems. The model is dynamic and should be updated as new data become available. It is particularly important to integrate the current aircraft data base with data obtained from the Space Shuttle and other space systems since subsystems unique to a space vehicle require data not available from aircraft. This research only addressed the major subsystems on the vehicle.
Beretta, Edoardo; Capasso, Vincenzo; Garao, Dario G
2018-06-01
In this paper a conceptual mathematical model of malaria transmission proposed in a previous paper is analyzed in deeper detail. Among the key epidemiological features of this model are two age classes (child and adult) and asymptomatic carriers. The extra mortality of mosquitoes due to the use of long-lasting insecticide-treated mosquito nets (LLINs) and Indoor Residual Spraying (IRS) has been included too. By taking advantage of the natural double time scale of the parasite and the human populations, it has been possible to provide interesting threshold results. In particular, it has been shown that key parameters can be identified such that below a threshold level built on these parameters the epidemic tends to extinction, while above another threshold level it tends to a nontrivial endemic state, for which an interval estimate has been provided. Numerical simulations confirm the analytical results. Copyright © 2018 Elsevier Inc. All rights reserved.
Conceptual design and structural analysis for an 8.4-m telescope
NASA Astrophysics Data System (ADS)
Mendoza, Manuel; Farah, Alejandro; Ruiz Schneider, Elfego
2004-09-01
This paper describes the conceptual design of the optics support structures of a telescope with an 8.4 m primary mirror, the same size as a Large Binocular Telescope (LBT) primary mirror. The design goal is to achieve a structure that supports the primary and secondary mirrors and keeps them joined as rigidly as possible. To this end, an optimization over several models was carried out. The iterative design process includes specifications development and concept generation and evaluation, using Finite Element Analysis (FEA) as well as other analytical calculations. A Quality Function Deployment (QFD) matrix was used to obtain the telescope tube and spider specifications. Eight spider and eleven tube geometric concepts were proposed and compared in decision matrices using performance indicators and parameters. Tubes and spiders then underwent an iterative optimization process; the best tube and spider concepts were assembled together, and all assemblies were compared and ranked according to their performance.
Borek, Aleksandra J; Abraham, Charles
2018-03-01
Small groups are used to promote health, well-being, and personal change by altering members' perceptions, beliefs, expectations, and behaviour patterns. An extensive cross-disciplinary literature has articulated and tested theories explaining how such groups develop, function, and facilitate change. Yet these theoretical understandings are rarely applied in the development, description, and evaluation of health-promotion, group-based, behaviour-change interventions. Medline database, library catalogues, search engines, specific journals and reference lists were searched for relevant texts. Texts were reviewed for explanatory concepts or theories describing change processes in groups, which were integrated into the developing conceptual structure. This was designed to be a parsimonious conceptual framework that could be applied to design and delivery. Five categories of interacting processes and concepts were identified and defined: (1) group development processes, (2) dynamic group processes, (3) social change processes, (4) personal change processes, and (5) group design and operating parameters. Each of these categories encompasses a variety of theorised mechanisms explaining individual change in small groups. The final conceptual model, together with the design issues and practical recommendations derived from it, provides a practical basis for linking research and theory explaining group functioning to optimal design of group-based, behaviour-change interventions. © 2018 The Authors. Applied Psychology: Health and Well-Being published by John Wiley & Sons Ltd on behalf of International Association of Applied Psychology.
A conceptual modeling framework for discrete event simulation using hierarchical control structures
Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.
2015-01-01
Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.
Can the Neuman Systems Model be adapted to the Malaysian nursing context?
Shamsudin, Nafsiah
2002-04-01
Nursing in Malaysia is still developing as a profession. Issues such as using nursing conceptual models or frameworks in the delivery of nursing care have not been addressed by the majority of nurses. One reason for this has been the level of education and preparation of nurses; another lies with the origins of existing nursing conceptual models. Most nursing conceptual models have their origins in North America, and their use by nurses of different cultures and academic preparation might not be appropriate. Nursing is a social activity, an interaction between the nurse and the patient, carried out in a social environment within a particular culture. Conceptual models developed in one culture might not be readily transplanted into another culture. This paper discusses how a conceptual model developed in North America, the Neuman Systems Model, can be adapted to the Malaysian nursing context.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arnett, Ronald Chester
Fate and transport model results are presented for the Texarkana Wood Preserving Company (TWPC) Superfund site. The conceptual model assumes two sources of contamination, specifically, the areas around the old and new process areas. Recent data show the presence of non-aqueous phase liquids (NAPL) in the aquifer that are also sources of dissolved contamination in the aquifer. A flow model was constructed and calibrated against measured hydraulic heads at permanent monitoring wells. Good matches were obtained between model-simulated heads and most measured heads. An unexplained exception occurs at monitoring well MW-13, down gradient of the site beyond the measured contaminant plume, where the model predicts heads that are more than 2 ft lower than reported field measurements. Adjusting hydraulic parameters in the model could not account for this anomaly and still preserve the head matches at other wells. There is likely a moderate deficiency in the conceptual model or perhaps a data error. Other information, such as substantial amounts of infiltrating surface water in the area or a correction in surveyed elevation, would improve the flow model. A particle tracking model calculated a travel time from the new process area to the Day's Creek discharge location on the order of 40 years. Travel times from the old process area to Day's Creek were calculated to be on the order of 80 years. While these calculations are subject to some uncertainty, travel times of decades are indicated.
NASA Astrophysics Data System (ADS)
Sun, Xiaobin; Xu, Yongxin; Lin, Lixiang
2015-05-01
Parameter estimates of artesian aquifers, where the piezometric head is above ground level, are largely made through free-flowing and recovery tests. The straight-line method proposed by Jacob and Lohman is often used for the interpretation of flow rates measured at flowing artesian boreholes. However, the approach fails to interpret the free-flowing test data from two artesian boreholes in the fractured-rock aquifer of the Table Mountain Group (TMG) of South Africa. The diagnostic plot method using the reciprocal rate derivative is adapted to evaluate the artesian aquifer properties. The variation of the derivative not only helps identify flow regimes and discern the boundary conditions, but also facilitates conceptualization of the aquifer system and selection of an appropriate model for subsequent data interpretation. Test data from two free-flowing tests conducted at different sites in the TMG are analysed using the diagnostic plot method. Based on the results, conceptual models and appropriate approaches are developed to evaluate the aquifer properties. The advantages and limitations of using the diagnostic plot method on free-flowing test data are discussed.
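The diagnostic quantity itself is straightforward to compute. The sketch below (synthetic rate decline, hypothetical units) forms the reciprocal rate 1/Q, an analogue of drawdown per unit discharge, and takes its logarithmic time derivative, whose shape on a log-log plot flags the flow regime:

```python
# Reciprocal-rate derivative d(1/Q)/d(ln t) for a free-flowing test.
import numpy as np

t = np.logspace(-1, 3, 40)            # elapsed time, hypothetical units
Q = 10.0 / np.log(1.0 + 4.0 * t)      # stand-in free-flowing rate decline

s = 1.0 / Q                           # reciprocal rate
lnt = np.log(t)
dsdlnt = np.gradient(s, lnt)          # derivative with respect to ln t

# A flat derivative at late time indicates radial flow; characteristic
# slopes (e.g., 1/2 for linear flow) flag other regimes or boundaries.
for ti, di in list(zip(t, dsdlnt))[::8]:
    print(f"t = {ti:8.2f}   d(1/Q)/dln t = {di:.4f}")
```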
Conceptual hydrogeological model of a coastal hydrosystem in the mediterranean
NASA Astrophysics Data System (ADS)
Mitropapas, Anastasios; Pouliaris, Christos; Apostolopoulos, Georgios; Vasileiou, Eleni; Schüth, Christoph; Vienken, Thomas; Dietrich, Peter; Kallioras, Andreas
2016-04-01
Groundwater resources management in the Mediterranean basin is an issue of paramount importance that becomes a necessity in the case of coastal hydrosystems. Coastal aquifers are considered very sensitive ecosystems that are subject to several stresses of natural or anthropogenic origin. The coastal hydrosystem of Lavrion can be used as a reference site that incorporates multi-disciplinary environmental problems typical for the Circum-Mediterranean. This study presents the synthesis of a wide range of field activities within the area of Lavrion, including the monitoring of water resources within all hydrologic zones (surface, unsaturated and saturated) and geophysical (invasive and non-invasive) surveys. Different monitoring approaches, targeting the collection of hydrochemical, geophysical, geological and hydrological data, were applied and proved to provide a sound characterization of the groundwater flows within the coastal karstic system in connection with the surrounding water bodies of the study area. The above are used as input parameters during the development of the conceptual model of the coastal hydrosystem of Lavrion. Keywords: coastal hydrosystems, Mediterranean basin, seawater intrusion
Conceptual Model Learning Objects and Design Recommendations for Small Screens
ERIC Educational Resources Information Center
Churchill, Daniel
2011-01-01
This article presents recommendations for the design of conceptual models for applications via handheld devices such as personal digital assistants and some mobile phones. The recommendations were developed over a number of years through experience that involves design of conceptual models, and applications of these multimedia representations with…
Semantic Description of Educational Adaptive Hypermedia Based on a Conceptual Model
ERIC Educational Resources Information Center
Papasalouros, Andreas; Retalis, Symeon; Papaspyrou, Nikolaos
2004-01-01
The role of conceptual modeling in Educational Adaptive Hypermedia Applications (EAHA) is especially important. A conceptual model of an educational application depicts the instructional solution that is implemented, containing information about concepts that must be acquired by learners, tasks in which learners must be involved and resources…
A Multivariate Model of Conceptual Change
ERIC Educational Resources Information Center
Taasoobshirazi, Gita; Heddy, Benjamin; Bailey, MarLynn; Farley, John
2016-01-01
The present study used the Cognitive Reconstruction of Knowledge Model (CRKM) model of conceptual change as a framework for developing and testing how key cognitive, motivational, and emotional variables are linked to conceptual change in physics. This study extends an earlier study developed by Taasoobshirazi and Sinatra ("J Res Sci…
Conceptual Model of Research to Reduce Stigma Related to Mental Disorders in Adolescents
Pinto-Foltz, Melissa D.; Logsdon, M. Cynthia
2010-01-01
Purpose: To explicate an initial conceptual model that is amenable to testing and guiding anti-stigma interventions with adolescents. Design/Sources Used: Multidisciplinary research and theoretical articles were reviewed. Conclusions: The conceptual model may guide anti-stigma interventions, and undergo testing and refinement in the future to reflect scientific advances in stigma reduction among adolescents. Use of a conceptual model enhances empirical evaluation of anti-stigma interventions, yielding a causal explanation for intervention effects, and enhances the clinical applicability of interventions across settings.
Senin, Tatjana; Meyer, Thorsten
2018-01-22
The aim was to gather theoretical knowledge about self-determination and to develop a conceptual model for medical rehabilitation, which serves as a basis for discussion. We performed a literature search in electronic databases. Various theories and research results were adopted and transferred to the context of medical rehabilitation and into a conceptual model. The conceptual model of self-determination reflects, on a continuum, which forms of self-determination may be present in situations of medical rehabilitation treatment. The location on the continuum depends theoretically on the manifestation of certain internal and external factors that may influence each other. The model provides a first conceptualization of self-determination focusing on medical rehabilitation, which should be further refined and tested empirically. © Georg Thieme Verlag KG Stuttgart · New York.
Data Modeling & the Infrastructural Nature of Conceptual Tools
ERIC Educational Resources Information Center
Lesh, Richard; Caylor, Elizabeth; Gupta, Shweta
2007-01-01
The goal of this paper is to demonstrate the infrastructural nature of many modern conceptual technologies. The focus of this paper is on conceptual tools associated with elementary types of data modeling. We intend to show a variety of ways in which these conceptual tools not only express thinking, but also mold and shape thinking. And those ways…
Randhawa, Gurprit K
2017-01-01
A conceptual model for exploring the relationship between end-user support (EUS) and electronic medical record (EMR) use by primary care physicians is presented. The model was developed following a review of conceptual and theoretical frameworks related to technology adoption/use and EUS. The model includes (a) one core construct (facilitating conditions), (b) four antecedents and one postcedent of facilitating conditions, and (c) four moderators. EMR use behaviour is the key outcome of the model. The proposed conceptual model should be tested. The model may be used to inform planning and decision-making for EMR implementations to increase EMR use for benefits realization.
Ward, Ryan D.; Gallistel, C.R.; Balsam, Peter D
2013-01-01
Learning in conditioning protocols has long been thought to depend on temporal contiguity between the conditioned stimulus and the unconditioned stimulus. This conceptualization has led to a preponderance of associative models of conditioning. We suggest that trial-based associative models that posit contiguity as the primary principle underlying learning are flawed, and provide a brief review of an alternative, information theoretic approach to conditioning. The information that a CS conveys about the timing of the next US can be derived from the temporal parameters of a conditioning protocol. According to this view, a CS will support conditioned responding if, and only if, it reduces uncertainty about the timing of the next US.
Schryver, Jack; Nutaro, James; Shankar, Mallikarjun
2015-10-30
An agent-based simulation model hierarchy emulating disease states and behaviors critical to the progression of type 2 diabetes was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. This model hierarchy, which mimics diabetes progression over an aggregated U.S. population, was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic the outputs of the system dynamics model. Moreover, the four estimated models attempted to replicate the stock counts representing disease states in the system dynamics model, while estimating the impacts of an elderliness factor, an obesity factor and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and the diffusion of social norms spreading over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translating complex system dynamics models into agent-based alternatives that are both conceptually simpler and capable of capturing the main effects of complex local agent-agent interactions.
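The behavioral component lends itself to a compact illustration. The sketch below (network structure, weights, and threshold are all hypothetical, not taken from the paper) couples a fixed individual attitude with social norms that diffuse over a random peer network, in the spirit of the simplified Theory of Planned Behavior described above:

```python
# Toy agent-based realization: behavior as attitude + diffusing social norm.
import numpy as np

rng = np.random.default_rng(3)
n_agents = 200

attitude = rng.uniform(0, 1, n_agents)   # fixed individual attitude
norm = rng.uniform(0, 1, n_agents)       # perceived social norm, evolves below
# Hypothetical random social network: each agent observes 5 random peers.
peers = np.array([rng.choice(n_agents, 5, replace=False) for _ in range(n_agents)])

w_att, w_norm = 0.6, 0.4                 # assumed weights
for step in range(50):
    # Norms diffuse: each agent's norm drifts toward the mean of its peers'.
    norm = 0.8 * norm + 0.2 * norm[peers].mean(axis=1)
    # Behavioral intention as a joint function of the two terms.
    intention = w_att * attitude + w_norm * norm
    behaves = intention > 0.5            # binary healthy behavior

print(f"share of agents adopting healthy behavior: {behaves.mean():.2f}")
```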
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vrugt, Jasper A; Robinson, Bruce A; Ter Braak, Cajo J F
In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. In particular, there is strong disagreement whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understanding and predicting the flow of water through catchments.
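To make the formal-Bayesian side concrete, here is a deliberately minimal sketch of MCMC parameter inference for a toy rainfall-runoff model. It uses a plain random-walk Metropolis sampler with a Gaussian likelihood; DREAM itself differs by running multiple chains with differential-evolution proposals, and every model and data choice below is illustrative:

```python
# Random-walk Metropolis sampling of a one-parameter watershed "model".
import numpy as np

rng = np.random.default_rng(4)

rain = rng.gamma(2.0, 3.0, 100)                   # synthetic forcing
def simulate(k):
    """Hypothetical runoff model: coefficient k maps rain to flow."""
    return k * rain

k_true, sigma = 0.35, 0.5
obs = simulate(k_true) + rng.normal(0, sigma, rain.size)

def log_post(k):
    if not (0.0 < k < 1.0):                       # uniform prior on (0, 1)
        return -np.inf
    resid = obs - simulate(k)
    return -0.5 * np.sum((resid / sigma) ** 2)    # Gaussian log-likelihood

k, lp = 0.5, log_post(0.5)
samples = []
for _ in range(20000):
    k_new = k + rng.normal(0, 0.02)               # random-walk proposal
    lp_new = log_post(k_new)
    if np.log(rng.uniform()) < lp_new - lp:       # Metropolis acceptance
        k, lp = k_new, lp_new
    samples.append(k)

post = np.array(samples[5000:])                   # discard burn-in
print(f"posterior mean k = {post.mean():.3f} +/- {post.std():.3f}")
```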
A conceptual model of the automated credibility assessment of the volunteered geographic information
NASA Astrophysics Data System (ADS)
Idris, N. H.; Jackson, M. J.; Ishak, M. H. I.
2014-02-01
The use of Volunteered Geographic Information (VGI) in collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance process of authoritative mapping data sources and to support the development of Digital Earth. The main barrier to the use of these data in supporting this bottom-up approach is the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible approach to assessing these data is to rely on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches to automatically assess the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. Two main components are proposed for assessment in the conceptual model: metadata and data. The metadata component comprises indicators for the hosting websites and the sources of data/information. The data component comprises indicators to assess absolute and relative data positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess both components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess correctness and consistency in the data component, we suggest a matching validation approach using emerging technologies from Linked Data infrastructures and third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality of data contributed by citizen web providers.
Estimating Model Probabilities using Thermodynamic Markov Chain Monte Carlo Methods
NASA Astrophysics Data System (ADS)
Ye, M.; Liu, P.; Beerli, P.; Lu, D.; Hill, M. C.
2014-12-01
Markov chain Monte Carlo (MCMC) methods are widely used to evaluate model probability for quantifying model uncertainty. In the general procedure, MCMC simulations are first conducted for each individual model, and the MCMC parameter samples are then used to approximate the marginal likelihood of the model by calculating the geometric mean of the joint likelihood of the model and its parameters. It has been found that this geometric-mean estimator suffers from a low convergence rate; a simple test case shows that even millions of MCMC samples are insufficient to yield an accurate estimate of the marginal likelihood. To resolve this problem, a thermodynamic method is used, in which multiple MCMC runs are performed with different values of a heating coefficient between zero and one. When the heating coefficient is zero, the MCMC run is equivalent to a random walk MC in the prior parameter space; when the heating coefficient is one, the MCMC run is the conventional one. For a simple case with an analytical form of the marginal likelihood, the thermodynamic method yields a more accurate estimate than the geometric-mean method. This is also demonstrated for a case of groundwater modeling with four alternative models postulated based on different conceptualizations of a confining layer. This groundwater example shows that model probabilities estimated using the thermodynamic method are more reasonable than those obtained using the geometric method. The thermodynamic method is general and can be used for a wide range of environmental problems requiring model uncertainty quantification.
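A compact illustration of the thermodynamic (power-posterior) idea, on a conjugate Gaussian toy problem where the marginal likelihood is known exactly (my construction, not the case studied in the abstract): the expected log-likelihood is computed under tempered posteriors for a grid of heating coefficients and integrated over [0, 1]:

```python
# Thermodynamic integration: log Z = integral over beta of E_beta[log L].
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(5)
y = rng.normal(0.7, 1.0, 20)                  # data: y_i ~ N(theta, 1)
n = y.size                                    # prior: theta ~ N(0, 1)

def expected_loglik(beta, n_samp=5000):
    """E[log L(theta)] under the power posterior prior(theta) * L(theta)^beta."""
    prec = 1.0 + beta * n                     # Gaussian conjugacy: exact sampling
    mean = beta * y.sum() / prec
    theta = rng.normal(mean, prec**-0.5, n_samp)
    ll = -0.5 * ((y[None, :] - theta[:, None]) ** 2).sum(axis=1) \
         - 0.5 * n * np.log(2 * np.pi)
    return ll.mean()

betas = np.linspace(0.0, 1.0, 21)             # grid of heating coefficients
path = np.array([expected_loglik(b) for b in betas])
log_z_ti = np.sum(0.5 * (path[1:] + path[:-1]) * np.diff(betas))  # trapezoid rule

# Exact evidence: y is jointly Gaussian with covariance I + 1 1^T.
log_z_exact = multivariate_normal.logpdf(y, np.zeros(n), np.eye(n) + np.ones((n, n)))
print(f"thermodynamic estimate: {log_z_ti:.3f}   exact: {log_z_exact:.3f}")
```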
Evaluation of a physically based quasi-linear and a conceptually based nonlinear Muskingum methods
NASA Astrophysics Data System (ADS)
Perumal, Muthiah; Tayfur, Gokmen; Rao, C. Madhusudana; Gurarslan, Gurhan
2017-03-01
Two variants of the Muskingum flood routing method formulated to account for the nonlinearity of the channel routing process are investigated in this study. These variants are: (1) the three-parameter conceptual Nonlinear Muskingum (NLM) method advocated by Gill in 1978, and (2) the Variable Parameter McCarthy-Muskingum (VPMM) method recently proposed by Perumal and Price in 2013. The VPMM method does not require the rigorous calibration and validation procedures required by the NLM method, owing to established relationships of its parameters with flow and channel characteristics based on hydrodynamic principles. The parameters of the conceptual nonlinear storage equation used in the NLM method were calibrated using Artificial Intelligence Application (AIA) techniques, such as the Genetic Algorithm (GA), Differential Evolution (DE), Particle Swarm Optimization (PSO) and Harmony Search (HS). The calibration was carried out on a given set of hypothetical flood events obtained by routing a given inflow hydrograph through a set of 40 km long prismatic channel reaches using the Saint-Venant (SV) equations. The validation of the calibrated NLM method was investigated using a different set of hypothetical flood hydrographs obtained in the same set of channel reaches used for the calibration studies. Both sets of solutions obtained in the calibration and validation cases using the NLM method were compared with the corresponding solutions of the VPMM method based on pertinent evaluation measures. The results of the study reveal that the physically based VPMM method is better able to account for the nonlinear characteristics of flood wave movement than the conceptually based NLM method, which requires tedious calibration and validation procedures.
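For readers unfamiliar with the NLM formulation, the following sketch implements the textbook three-parameter nonlinear storage relation S = K[xI + (1 - x)O]^m with a simple Euler time step; the parameter values and inflow hydrograph are hypothetical, and the paper's AIA calibration (GA, DE, PSO, HS) is not reproduced here:

```python
# Nonlinear Muskingum routing: S = K*[x*I + (1-x)*O]^m, dS/dt = I - O.
import numpy as np

K, x, m = 0.8, 0.3, 1.5        # hypothetical calibrated parameters
dt = 1.0                       # time step (h)

# Synthetic inflow hydrograph (m^3/s).
t = np.arange(0, 60, dt)
inflow = 20 + 80 * np.exp(-((t - 15) / 6.0) ** 2)

outflow = np.empty_like(inflow)
outflow[0] = inflow[0]                                # start at steady state
S = K * (x * inflow[0] + (1 - x) * outflow[0]) ** m   # initial storage
for i in range(len(t) - 1):
    S += dt * (inflow[i] - outflow[i])                # continuity equation
    # Invert the storage relation for the weighted flow, then for O.
    weighted = (S / K) ** (1.0 / m)
    outflow[i + 1] = max((weighted - x * inflow[i + 1]) / (1 - x), 0.0)

print(f"peak inflow  {inflow.max():.1f} m^3/s at t = {t[inflow.argmax()]:.0f} h")
print(f"peak outflow {outflow.max():.1f} m^3/s at t = {t[outflow.argmax()]:.0f} h")
```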
NASA Astrophysics Data System (ADS)
Seiller, G.; Anctil, F.; Roy, R.
2017-09-01
This paper outlines the design and experimentation of an Empirical Multistructure Framework (EMF) for lumped conceptual hydrological modeling. The concept is inspired by modular frameworks, empirical model development, and multimodel applications, and encompasses the overproduce-and-select paradigm. The EMF concept aims to reduce subjectivity in conceptual hydrological modeling practice and includes model selection in the optimisation steps, reducing initial assumptions about the prior perception of the dominant rainfall-runoff transformation processes. EMF generates thousands of new modeling options from, at present, twelve parent models that share their functional components and parameters. Optimisation resorts to ensemble calibration, ranking and selection of individual child time series based on optimal bias and reliability trade-offs, as well as accuracy and sharpness improvement of the ensemble. Results on 37 snow-dominated Canadian catchments and 20 climatically diversified American catchments reveal the excellent potential of the EMF in generating new individual model alternatives, with high respective performance values, that may be pooled efficiently into ensembles of seven to sixty constitutive members with low bias and high accuracy, sharpness, and reliability. A group of 1446 new models is highlighted as offering good potential on other catchments or applications, based on their individual and collective interest. An analysis of the preferred functional components reveals the importance of the production and total flow elements. Overall, results from this research confirm the added value of ensemble and flexible approaches for hydrological applications, especially in uncertain contexts, and open up new modeling possibilities.
Health literacy and public health: a systematic review and integration of definitions and models.
Sørensen, Kristine; Van den Broucke, Stephan; Fullam, James; Doyle, Gerardine; Pelikan, Jürgen; Slonska, Zofia; Brand, Helmut
2012-01-25
Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits the possibilities for measurement and comparison. The aim of the study is to review definitions and models on health literacy to develop an integrated definition and conceptual model capturing the most comprehensive evidence-based dimensions of health literacy. A systematic literature review was performed to identify definitions and conceptual frameworks of health literacy. A content analysis of the definitions and conceptual frameworks was carried out to identify the central dimensions of health literacy and develop an integrated model. The review resulted in 17 definitions of health literacy and 12 conceptual models. Based on the content analysis, an integrative conceptual model was developed containing 12 dimensions referring to the knowledge, motivation and competencies of accessing, understanding, appraising and applying health-related information within the healthcare, disease prevention and health promotion setting, respectively. Based upon this review, a model is proposed integrating medical and public health views of health literacy. The model can serve as a basis for developing health literacy enhancing interventions and provide a conceptual basis for the development and validation of measurement tools, capturing the different dimensions of health literacy within the healthcare, disease prevention and health promotion settings.
NASA Astrophysics Data System (ADS)
Klotz, Daniel; Herrnegger, Mathew; Schulz, Karsten
2015-04-01
A multi-scale parameter-estimation method, as presented by Samaniego et al. (2010), is implemented and extended for the conceptual hydrological model COSERO. COSERO is an HBV-type model specialized for alpine environments, but it has been applied over a wide range of basins all over the world (see Kling et al., 2014, for an overview). Within this methodology, available small-scale information (DEM, soil texture, land cover, etc.) is used to estimate the coarse-scale model parameters by applying a set of transfer functions (TFs) and subsequent averaging methods, whereby only the TF hyper-parameters are optimized against available observations (e.g., runoff data). The parameter regionalisation approach was extended to allow for a more meta-heuristic handling of the transfer functions. The two main novelties are: 1. An explicit introduction of constraints into the parameter estimation scheme: the constraint scheme replaces invalid parts of the transfer-function solution space with valid solutions. It is inspired by applications in evolutionary algorithms and related to the combination of learning and evolution. This allows the consideration of physical and numerical constraints as well as the incorporation of a priori modeller experience into the parameter estimation. 2. Spline-based transfer functions: spline-based functions enable arbitrary forms of transfer functions. This is important since in many cases the general relationship between sub-grid information and parameters is known, but not the form of the transfer function itself. The contribution presents the results and experiences with the adopted method and the introduced extensions. Simulations are performed for the pre-alpine/alpine Traisen catchment in Lower Austria. References: Samaniego, L., Kumar, R., Attinger, S. (2010): Multiscale parameter regionalization of a grid-based hydrologic model at the mesoscale, Water Resour. Res., doi:10.1029/2008WR007327. Kling, H., Stanzel, P., Fuchs, M., and Nachtnebel, H. P. (2014): Performance of the COSERO precipitation-runoff model under non-stationary conditions in basins with different climates, Hydrolog. Sci. J., doi:10.1080/02626667.2014.959956.
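The regionalization step can be illustrated compactly. In the sketch below (attribute grids, transfer-function form, hyper-parameter values, and the harmonic-mean upscaling operator are all assumptions for illustration, not the COSERO implementation), a transfer function maps sub-grid attributes to a fine-scale parameter field that is then averaged up to the model grid:

```python
# Multiscale parameter regionalization: TF on fine grid, then upscaling.
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical 100 m attribute grids within one coarse model cell.
sand_fraction = rng.uniform(0.1, 0.9, (50, 50))
slope = rng.uniform(0.0, 0.3, (50, 50))

def transfer_function(sand, slope, a, b):
    """Sub-scale storage-capacity parameter from terrain attributes."""
    return a * sand + b * np.exp(-slope)

def upscale(field):
    """Harmonic mean as one possible upscaling operator."""
    return field.size / np.sum(1.0 / field)

a, b = 40.0, 15.0                 # TF hyper-parameters: the calibration targets
fine = transfer_function(sand_fraction, slope, a, b)
coarse_param = upscale(fine)      # one value per coarse model cell
print(f"coarse-scale storage parameter: {coarse_param:.1f} mm")
```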
Multiparadigm Design Environments
1992-01-01
following results: 1. New methods for programming in terms of conceptual models 2. Design of object-oriented languages 3. Compiler optimization and... experimented with object-based methods for programming directly in terms of conceptual models, object-oriented language design, computer program... expect these results to have a strong influence on future...
ERIC Educational Resources Information Center
Urey, Mustafa; Calik, Muammer
2008-01-01
Since students' misconceptions are not completely remedied by means of only one conceptual change method, the authors assume that using different conceptual change methods embedded within the 5E model will not only be more effective in enhancing students' conceptual understanding, but may also eliminate all students' misconceptions. The aim of this study…
ERIC Educational Resources Information Center
Fortuin, Karen P. J.; van Koppen, C. S. A.; Leemans, Rik
2011-01-01
Conceptual models are useful for facing the challenges of environmental sciences curriculum and course developers and students. These challenges are inherent to the interdisciplinary and problem-oriented character of environmental sciences curricula. In this article, we review the merits of conceptual models in facing these challenges. These…
NASA Astrophysics Data System (ADS)
Rivera, A.; Pétré, M.
2013-12-01
The Milk River transboundary aquifer straddles southern Alberta (Canada) and northern Montana (United States), in a semi-arid region considered water-short. This confined sandstone aquifer is a source for municipal supply and agricultural uses on the Canadian side, as well as for secondary oil recovery on the US side of the border. The extensive use of this resource since the mid-1950s has led to a dramatic drop in the water level in some places, and concerns about the durability of the resource have arisen. The Milk River aquifer has been the object of many studies during the 20th century; however, most of them were limited by the US-Canada border, preventing a sound understanding of the global dynamics of the aquifer. The objectives of this transboundary study are to better understand the dynamics of the Milk River aquifer, following its natural limits, in order to make recommendations for its sustainable management and good governance by the two international jurisdictions, as recommended in UNGA resolution 63/124 on the Law of Transboundary Aquifers. Since 2009, the Milk River transboundary aquifer has been part of the inventory of the UNESCO ISARM-Americas initiative, which encourages riparian states to work cooperatively toward mutually beneficial and sustainable aquifer development. However, the use of this shared resource is not governed by any international agreement or convention between the USA and Canada. Stakeholders from the two countries have been involved, at various levels of jurisdiction (municipal, provincial, state, federal), to establish a strong cooperation. In this context, models can constitute useful tools for informed decisions. In the case of the Milk River aquifer, models could support scientists and managers from both countries in avoiding potential tensions linked to the water-shortage context of this region; they can determine the conditions of overexploitation and provide an assessment of a sustainable yield. A unified conceptual model of the Milk River aquifer has been built. This model follows the natural limits of the aquifer and is not interrupted by the US-Canada border. The conceptual model covers many aspects, such as the hydrostratigraphic 3D model, the groundwater flow, the recharge and discharge areas, the hydrogeological parameters, the pumping and observation wells, and the transboundary aspects. The model covers circa 55,000 km2. The study area is limited to the north/northeast and southeast by gas fields. This unified conceptual model will form the basis for a future 3D numerical hydrogeological model of groundwater flow in the Milk River aquifer across the Canada-US border.
Multiphase modeling of geologic carbon sequestration in saline aquifers.
Bandilla, Karl W; Celia, Michael A; Birkholzer, Jens T; Cihan, Abdullah; Leister, Evan C
2015-01-01
Geologic carbon sequestration (GCS) is being considered as a climate change mitigation option in many future energy scenarios. Mathematical modeling is routinely used to predict subsurface CO2 and resident brine migration for the design of injection operations, to demonstrate the permanence of CO2 storage, and to show that other subsurface resources will not be degraded. Many processes impact the migration of CO2 and brine, including multiphase flow dynamics, geochemistry, and geomechanics, along with the spatial distribution of parameters such as porosity and permeability. In this article, we review a set of multiphase modeling approaches with different levels of conceptual complexity that have been used to model GCS. Model complexity ranges from coupled multiprocess models to simplified vertical equilibrium (VE) models and macroscopic invasion percolation models. The goal of this article is to give a framework of conceptual model complexity, and to show the types of modeling approaches that have been used to address specific GCS questions. Application of the modeling approaches is shown using five ongoing or proposed CO2 injection sites. For the selected sites, the majority of GCS models follow a simplified multiphase approach, especially for questions related to injection and local-scale heterogeneity. Coupled multiprocess models are only applied in one case where geomechanics have a strong impact on the flow. Owing to their computational efficiency, VE models tend to be applied at large scales. A macroscopic invasion percolation approach was used to predict the CO2 migration at one site to examine details of CO2 migration under the caprock. © 2015, National Ground Water Association.
Improving and Evaluating Nested Sampling Algorithm for Marginal Likelihood Estimation
NASA Astrophysics Data System (ADS)
Ye, M.; Zeng, X.; Wu, J.; Wang, D.; Liu, J.
2016-12-01
With the growing impacts of climate change and human activities on the water cycle, an increasing number of studies focus on the quantification of modeling uncertainty. Bayesian model averaging (BMA) provides a popular framework for quantifying conceptual model and parameter uncertainty. The ensemble prediction is generated by combining the predictions of each plausible model, and each model carries a weight determined by its prior weight and marginal likelihood. The estimation of a model's marginal likelihood is thus crucial for reliable and accurate BMA prediction. The nested sampling estimator (NSE) is a newly proposed method for marginal likelihood estimation. NSE searches the parameter space gradually from low-likelihood to high-likelihood regions, and this evolution is accomplished iteratively via a local sampling procedure; the efficiency of NSE is therefore dominated by the strength of the local sampler. Currently, the Metropolis-Hastings (M-H) algorithm is often used for local sampling. However, M-H is not an efficient sampling algorithm for high-dimensional or complicated parameter spaces. To improve the efficiency of NSE, the robust and efficient DREAMzs sampling algorithm is incorporated into the local sampling step of NSE. The comparison results demonstrate that the improved NSE increases the efficiency of marginal likelihood estimation significantly. However, both the improved and the original NSE suffer from heavy instability. In addition, the heavy computational cost of the huge number of model executions is overcome by using adaptive sparse grid surrogates.
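To make the procedure concrete, the following minimal Python sketch estimates a log marginal likelihood by nested sampling, using a likelihood-constrained random-walk Metropolis step as the local sampler described above. The toy Gaussian likelihood, uniform prior box, and tuning constants are illustrative assumptions, not the authors' setup, and the final live-point correction is omitted for brevity.

import numpy as np

rng = np.random.default_rng(42)

def log_likelihood(theta):
    # Toy 2-D Gaussian likelihood centered at the origin (assumption).
    return -0.5 * np.sum(theta**2) - np.log(2 * np.pi)

def sample_prior(n):
    # Uniform prior on the box [-5, 5]^2 (assumption).
    return rng.uniform(-5, 5, size=(n, 2))

def metropolis_above(theta, logl_min, n_steps=20, step=0.5):
    # Local sampler: random-walk moves accepted only while the chain
    # stays above the current likelihood threshold (hard constraint).
    logl = log_likelihood(theta)
    for _ in range(n_steps):
        prop = theta + rng.normal(0, step, size=theta.shape)
        if np.all(np.abs(prop) <= 5):              # stay inside the prior box
            logl_prop = log_likelihood(prop)
            if logl_prop > logl_min:               # likelihood-constrained accept
                theta, logl = prop, logl_prop
    return theta, logl

def nested_sampling(n_live=100, n_iter=1000):
    live = sample_prior(n_live)
    live_logl = np.array([log_likelihood(t) for t in live])
    log_z, log_x_prev = -np.inf, 0.0               # running evidence, prior volume
    for i in range(1, n_iter + 1):
        worst = np.argmin(live_logl)
        log_x = -i / n_live                        # expected volume shrinkage
        log_w = np.log(np.exp(log_x_prev) - np.exp(log_x))
        log_z = np.logaddexp(log_z, live_logl[worst] + log_w)
        seed = live[rng.integers(n_live)].copy()   # evolve a random survivor
        live[worst], live_logl[worst] = metropolis_above(seed, live_logl[worst])
        log_x_prev = log_x
    return log_z                                   # final live-point term omitted

print("estimated log marginal likelihood:", nested_sampling())

In a BMA setting, this estimate would be computed once per plausible conceptual model and combined with the prior model weights to obtain the posterior weights.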
Characteristics of middle and upper tropospheric clouds as deduced from rawinsonde data
NASA Technical Reports Server (NTRS)
Starr, D. D. O.; Cox, S. K.
1982-01-01
The static environment of middle and upper tropospheric clouds is characterized. Computed relative humidity with respect to ice is used to diagnose the presence of a cloud layer. The deduced seasonal mean cloud cover estimates based on this technique are shown to be reasonable. The cases are stratified by season and pressure thickness, and the dry static stability, vertical wind speed shear, and Richardson number are computed for three layers in each case. Mean values of each parameter are presented for each stratification and layer. The relative frequency of occurrence of various structures is presented for each stratification. The observed values and structure of each parameter are quite variable. Structures corresponding to any of a number of different conceptual models may be found. Moist adiabatic conditions are not commonly observed, and the stratification based on thickness yields substantially different results for each group.
Entangled Parametric Hierarchies: Problems for an Overspecified Universal Grammar
Boeckx, Cedric; Leivada, Evelina
2013-01-01
This study addresses the feasibility of the classical notion of parameter in linguistic theory from the perspective of parametric hierarchies. A novel program-based analysis is implemented in order to show certain empirical problems related to these hierarchies. The program was developed on the basis of an enriched database spanning 23 contemporary and 5 ancient languages. The empirical issues uncovered cast doubt on classical parametric models of language acquisition as well as on the conceptualization of an overspecified Universal Grammar that has parameters among its primitives. Pinpointing these issues leads to the proposal that (i) the (bio)logical problem of language acquisition does not amount to a process of triggering innately pre-wired values of parameters and (ii) it paves the way for viewing epigenetic (‘parametric’) language variation as an externalization-related epiphenomenon, whose learning component may be more important than is sometimes assumed. PMID:24019867
OBO to UML: Support for the development of conceptual models in the biomedical domain.
Waldemarin, Ricardo C; de Farias, Cléver R G
2018-04-01
A conceptual model abstractly defines a number of concepts and their relationships for the purposes of understanding and communication. Once a conceptual model is available, it can also be used as a starting point for the development of a software system. The development of conceptual models using the Unified Modeling Language (UML) facilitates the representation of modeled concepts and allows software developers to directly reuse these concepts in the design of a software system. The OBO Foundry represents the most relevant collaborative effort towards the development of ontologies in the biomedical domain. The development of UML conceptual models in the biomedical domain may benefit from the use of domain-specific semantics and notation. Further, the development of these models may also benefit from the reuse of knowledge contained in OBO ontologies. This paper investigates the support for the development of conceptual models in the biomedical domain using UML as a conceptual modeling language, together with the support provided by the OBO Foundry for the development of biomedical ontologies, namely the entity kind and relationship type definitions provided by the Basic Formal Ontology (BFO) and the OBO Core Relations Ontology (OBO Core), respectively. Further, the paper investigates the support for reusing biomedical knowledge currently available in OBOFFF ontologies in the development of these conceptual models. The paper describes a UML profile for the OBO Core Relations Ontology, which basically defines a number of stereotypes to represent BFO entity kinds and OBO Core relationship types. The paper also presents a supporting toolset consisting of a graphical editor named OBO-RO Editor, which directly supports the development of UML models using the extensions defined by our profile, and a command-line tool named OBO2UML, which directly converts an OBOFFF ontology into a UML model. Copyright © 2018 Elsevier Inc. All rights reserved.
Symbolic Regression for the Estimation of Transfer Functions of Hydrological Models
NASA Astrophysics Data System (ADS)
Klotz, D.; Herrnegger, M.; Schulz, K.
2017-11-01
Current concepts for the parameter regionalization of spatially distributed rainfall-runoff models rely on the a priori definition of transfer functions that globally map land surface characteristics (such as soil texture, land use, and digital elevation) into the model parameter space. However, these transfer functions are often chosen ad hoc or derived from small-scale experiments. This study proposes and tests an approach for inferring the structure and parametrization of possible transfer functions from runoff data, to potentially circumvent these difficulties. The concept uses context-free grammars to generate candidate transfer functions. The resulting structures can then be parametrized with classical optimization techniques. Several virtual experiments are performed to examine the potential for an appropriate estimation of transfer functions, all of them using a very simple conceptual rainfall-runoff model with data from the Austrian Mur catchment. The results suggest that a priori defined transfer functions are in general well identifiable by the method. However, the deduction process can be inhibited, e.g., by noise in the runoff observation data, often leading to transfer function estimates of lower structural complexity.
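A minimal Python sketch of the grammar-based generation step may help: a context-free grammar proposes symbolic transfer-function structures whose coefficients (c0, c1) would then be calibrated against runoff data with a classical optimizer. The grammar rules and land-surface attribute names below are illustrative assumptions, not the study's actual grammar.

import random

GRAMMAR = {
    "<expr>": [["<expr>", "<op>", "<expr>"], ["<func>", "(", "<expr>", ")"],
               ["<attr>"], ["<const>"]],
    "<op>": [["+"], ["-"], ["*"]],
    "<func>": [["exp"], ["log"]],
    "<attr>": [["sand"], ["clay"], ["elevation"]],   # illustrative attributes
    "<const>": [["c0"], ["c1"]],                     # tunable coefficients
}

def expand(symbol, depth=0, max_depth=4):
    # Recursively expand a grammar symbol, forcing terminal rules for
    # <expr> once the maximum recursion depth is reached.
    if symbol not in GRAMMAR:
        return [symbol]
    rules = GRAMMAR[symbol]
    if symbol == "<expr>" and depth >= max_depth:
        rules = rules[2:]                            # <attr> or <const> only
    tokens = []
    for s in random.choice(rules):
        tokens.extend(expand(s, depth + 1, max_depth))
    return tokens

random.seed(1)
for _ in range(3):
    print(" ".join(expand("<expr>")))   # e.g. "exp ( clay ) * c0"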
A Common Core for Active Conceptual Modeling for Learning from Surprises
NASA Astrophysics Data System (ADS)
Liddle, Stephen W.; Embley, David W.
The new field of active conceptual modeling for learning from surprises (ACM-L) may be helpful in preserving life, protecting property, and improving quality of life. The conceptual modeling community has developed sound theory and practices for conceptual modeling that, if properly applied, could help analysts model and predict more accurately. In particular, we need to associate more semantics with links, and we need fully reified high-level objects and relationships that have a clear, formal underlying semantics that follows a natural, ontological approach. We also need to capture more dynamic aspects in our conceptual models to more accurately model complex, dynamic systems. These concepts already exist, and the theory is well developed; what remains is to link them with the ideas needed to predict system evolution, thus enabling risk assessment and response planning. No single researcher or research group will be able to achieve this ambitious vision alone. As a starting point, we recommend that the nascent ACM-L community agree on a common core model that supports all aspects—static and dynamic—needed for active conceptual modeling in support of learning from surprises. A common core will more likely gain the traction needed to sustain the extended ACM-L research effort that will yield the advertised benefits of learning from surprises.
STEWB - Simplified Transient Estimation of the Water Budget
NASA Astrophysics Data System (ADS)
Meyer, P. D.; Simmons, C. S.; Cady, R. E.; Gee, G. W.
2001-12-01
A simplified model describing the transient water budget of a shallow unsaturated soil profile is presented. This model was developed for the U.S. Nuclear Regulatory Commission to provide estimates of the time-varying net infiltration at sites containing residual levels of radioactive materials. Ease of use, computational efficiency, and use of standard parameters and available data were requirements of the model. The model's conceptualization imposes the following simplifications: a uniform soil profile, instantaneous redistribution of infiltrated water, drainage under a unit hydraulic gradient, and no drainage from the soil profile during infiltration. The model's formulation is a revision of that originally presented by Kim et al. [WRR, 32(12):3475-3484, 1996]. Daily meteorological data are required as input. Random durations for precipitation events are generated based on an estimate of the average number of exceedances per year for the specific daily rainfall depth observed. Snow accumulation and melt are described using empirical relationships. During precipitation or snowmelt, runoff is described using an infiltration equation for ponded conditions. When no water is being applied to the profile, evapotranspiration (ET) and drainage occur. The ET rate equals the potential evapotranspiration rate, PET, above a critical value of saturation, SC. Below this critical value, ET = PET*(S/SC)**p, where S is saturation and p is an empirical parameter. Drainage flux from the profile equals the hydraulic conductivity as represented by the Brooks-Corey model. The model has been implemented with an easy-to-use graphical interface and is available at http://nrc-hydro-uncert.pnl.gov/code.htm. Comparison of the model results with lysimeter measurements will be shown, including a 50-year record from the ARS-Coshocton site in Ohio. The interpretation of parameters and the sensitivity of the model to parameter values will be discussed.
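The following minimal Python sketch steps the described water budget through a few days using the quoted relations: ET = PET*(S/SC)**p below the critical saturation, and drainage equal to the Brooks-Corey (Campbell-form) hydraulic conductivity under a unit gradient. The parameter values are illustrative, not the NRC defaults.

def daily_step(S, infil, PET, SC=0.5, p=2.0, Ks=50.0, b=5.0,
               porosity=0.4, depth=1000.0):
    # Advance saturation S (0-1) by one day; fluxes in mm/day, profile
    # thickness 'depth' in mm. All parameter values are illustrative.
    ET = PET if S >= SC else PET * (S / SC) ** p   # ET = PET*(S/SC)**p below SC
    drainage = Ks * S ** (2 * b + 3)               # Campbell/Brooks-Corey K(S)
    storage = S * porosity * depth
    storage = max(storage + infil - ET - drainage, 0.0)
    return min(storage / (porosity * depth), 1.0), ET, drainage

S = 0.3
for day, (infil, pet) in enumerate([(10.0, 3.0), (0.0, 4.0), (0.0, 4.0)], start=1):
    S, ET, drainage = daily_step(S, infil, pet)
    print(f"day {day}: S={S:.3f}  ET={ET:.2f} mm  drainage={drainage:.2f} mm")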
Soulis, Konstantinos X; Valiantzas, John D; Ntoulas, Nikolaos; Kargas, George; Nektarios, Panayiotis A
2017-09-15
In spite of the well-known benefits of green roofs, their widespread adoption in the management practices of urban drainage systems requires adequate analytical and modelling tools. In the current study, green roof runoff modeling was accomplished by developing, testing, and jointly using a simple conceptual model and a physically based numerical simulation model built with the HYDRUS-1D software. This approach combines the advantages of the conceptual model, namely simplicity, low computational requirements, and the ability to be easily integrated into decision support tools, with the capacity of the physically based simulation model to be transferred to conditions and locations other than those used for calibrating and validating it. The proposed approach was evaluated with an experimental dataset that included various green roof covers (either succulent plants - Sedum sediforme, or xerophytic plants - Origanum onites, or bare substrate without any vegetation) and two substrate depths (either 8 cm or 16 cm). Both the physically based and the conceptual models matched the observed hydrographs very closely. In general, the conceptual model performed better than the physically based simulation model, but the overall performance of both models was sufficient in most cases, as revealed by Nash-Sutcliffe Efficiency values generally greater than 0.70. Finally, it was showcased how a physically based model and a simple conceptual model can be used jointly, extending the simple conceptual model to a wider set of conditions than the available experimental data in order to support green roof design. Copyright © 2017 Elsevier Ltd. All rights reserved.
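For reference, the goodness-of-fit measure cited above is straightforward to compute; the following minimal Python sketch evaluates the Nash-Sutcliffe Efficiency for a pair of hydrographs invented purely for illustration.

import numpy as np

def nse(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) \
               / np.sum((observed - observed.mean()) ** 2)

obs = [0.0, 0.2, 0.9, 1.4, 0.8, 0.3, 0.1]   # invented observed runoff
sim = [0.0, 0.1, 1.0, 1.3, 0.9, 0.4, 0.1]   # invented simulated runoff
print(f"NSE = {nse(obs, sim):.2f}")          # values > 0.70 met the study's bar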
NASA Technical Reports Server (NTRS)
Hilbert, E. E.; Carl, C.; Goss, W.; Hansen, G. R.; Olsasky, M. J.; Johnston, A. R.
1978-01-01
An integrated sensor for traffic surveillance on mainline sections of urban freeways is described. Applicable imaging and processor technology is surveyed and the functional requirements for the sensors and the conceptual design of the breadboard sensors are given. Parameters measured by the sensors include lane density, speed, and volume. The freeway image is also used for incident diagnosis.
Superhot fluids circulating close to magma intrusions: a contribution from analogue modelling
NASA Astrophysics Data System (ADS)
Montanari, Domenico; Agostini, Andrea; Bonini, Marco; Corti, Giacomo
2017-04-01
Magma overpressure at the time of emplacement at shallow crustal levels may lead to deformation (i.e., forced folding, fracturing and faulting) in the country rock, at both local and regional scales. To gain insight into this process, we reproduced and analysed in the laboratory the fracture/fault network associated with the emplacement of magma at shallow crustal levels. We used a mixture of quartz sand and K-feldspar fine sand as an analogue for the brittle crust, and polyglycerols for the magma. The models were able to reproduce complex 3D architectures of deformation resulting from magma emplacement, with different deformation patterns (invariably dominated by forced folding and associated brittle faulting/fracturing) resulting from variable parameters. These results provide useful insights for geothermal research. Fractures and faults associated with magma emplacement are indeed expected to significantly influence the distribution and migration of superhot geothermal fluids near the edge of the magma intrusion. These structures can therefore be considered potential targets for geothermal or mineral deposit exploration. In this perspective, the results of analogue models may provide useful geometric and conceptual constraints for field work, numerical modeling, and particularly seismic interpretation, helping to achieve a better understanding and tuning of the integrated conceptual model of supercritical fluid circulation. The research leading to these results has received funding from the European Community's Seventh Framework Programme under grant agreement No. 608553 (Project IMAGE).
Development of Conceptual Models for Internet Search: A Case Study.
ERIC Educational Resources Information Center
Uden, Lorna; Tearne, Stephen; Alderson, Albert
This paper describes the creation and evaluation of a World Wide Web-based courseware module, using conceptual models based on constructivism, that teaches novices how to use the Internet for searching. Questionnaires and interviews were used to understand the difficulties of a group of novices. The conceptual model of the experts for the task was…
Feasibility of Implementing an All-Volunteer Force for the ROK Armed Forces
2007-03-01
…Korea’s current military/economic/political/social factors for voluntary recruitment through an open-systems conceptual model. Results indicate that the draft should be maintained for the near future, but this does not…
Teacher Emotion Research: Introducing a Conceptual Model to Guide Future Research
ERIC Educational Resources Information Center
Fried, Leanne; Mansfield, Caroline; Dobozy, Eva
2015-01-01
This article reports on the development of a conceptual model of teacher emotion through a review of teacher emotion research published between 2003 and 2013. By examining 82 publications regarding teacher emotion, the main aim of the review was to identify how teacher emotion was conceptualised in the literature and develop a conceptual model to…
Showing Automatically Generated Students' Conceptual Models to Students and Teachers
ERIC Educational Resources Information Center
Perez-Marin, Diana; Pascual-Nieto, Ismael
2010-01-01
A student conceptual model can be defined as a set of interconnected concepts associated with an estimation value that indicates how well these concepts are used by the students. It can model just one student or a group of students, and can be represented as a concept map, conceptual diagram or one of several other knowledge representation…
Applying a Conceptual Model in Sport Sector Work- Integrated Learning Contexts
ERIC Educational Resources Information Center
Agnew, Deborah; Pill, Shane; Orrell, Janice
2017-01-01
This paper applies a conceptual model for work-integrated learning (WIL) in a multidisciplinary sports degree program. Two examples of WIL in sport will be used to illustrate how the conceptual WIL model is being operationalized. The implications for practice are that curriculum design must recognize a highly flexible approach to the nature of…
ERIC Educational Resources Information Center
Battisti, Bryce Thomas; Hanegan, Nikki; Sudweeks, Richard; Cates, Rex
2010-01-01
Concept inventories are often used to assess current student understanding although conceptual change models are problematic. Due to controversies with conceptual change models and the realities of student assessment, it is important that concept inventories are evaluated using a variety of theoretical models to improve quality. This study used a…
Inverse modeling of BTEX dissolution and biodegradation at the Bemidji, MN crude-oil spill site
Essaid, H.I.; Cozzarelli, I.M.; Eganhouse, R.P.; Herkelrath, W.N.; Bekins, B.A.; Delin, G.N.
2003-01-01
The U.S. Geological Survey (USGS) solute transport and biodegradation code BIOMOC was used in conjunction with the USGS universal inverse modeling code UCODE to quantify field-scale hydrocarbon dissolution and biodegradation at the USGS Toxic Substances Hydrology Program crude-oil spill research site located near Bemidji, MN. This inverse modeling effort used the extensive historical data compiled at the Bemidji site from 1986 to 1997 and incorporated a multicomponent transport and biodegradation model. Inverse modeling was successful when coupled transport and degradation processes were incorporated into the model and a single dissolution rate coefficient was used for all BTEX components. Assuming a stationary oil body, we simulated benzene, toluene, ethylbenzene, m,p-xylene, and o-xylene (BTEX) concentrations in the oil and ground water, as well as dissolved oxygen. Dissolution from the oil phase and aerobic and anaerobic degradation processes were represented. The parameters estimated were the recharge rate, hydraulic conductivity, dissolution rate coefficient, individual first-order BTEX anaerobic degradation rates, and transverse dispersivity. Results were similar for simulations obtained using several alternative conceptual models of the hydrologic system and biodegradation processes. The dissolved BTEX concentration data were not sufficient to discriminate between these conceptual models. The calibrated simulations reproduced the general large-scale evolution of the plume, but did not reproduce the observed small-scale spatial and temporal variability in concentrations. The estimated anaerobic biodegradation rates for toluene and o-xylene were greater than the dissolution rate coefficient, whereas those for benzene, ethylbenzene, and m,p-xylene were less than the dissolution rate coefficient. The calibrated model was used to determine the BTEX mass balance in the oil body and groundwater plume. Dissolution from the oil body was greatest for compounds with large effective solubilities (benzene) and with large degradation rates (toluene and o-xylene). Anaerobic degradation removed 77% of the BTEX that dissolved into the water phase, and aerobic degradation removed 17%. Although goodness-of-fit measures for the alternative conceptual models were not significantly different, predictions made with the models were quite variable. © 2003 Elsevier Science B.V. All rights reserved.
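A minimal Python sketch can illustrate the coupled process structure the calibration identified: a single dissolution rate coefficient shared by all BTEX compounds, feeding compound-specific first-order anaerobic losses. All rate values and masses below are invented for illustration and are not the calibrated Bemidji parameters.

# Shared dissolution rate coefficient and compound-specific first-order
# anaerobic degradation rates; all values are illustrative, not calibrated.
k_diss = 0.01                                                   # 1/day
k_deg = {"benzene": 0.003, "toluene": 0.02, "o-xylene": 0.015}  # 1/day
dt, days = 1.0, 365

for compound, k in k_deg.items():
    oil, water = 100.0, 0.0            # arbitrary initial masses per phase
    for _ in range(int(days / dt)):
        dissolved = k_diss * oil * dt  # transfer from the oil phase
        oil -= dissolved
        water += dissolved - k * water * dt   # first-order anaerobic loss
    print(f"{compound:9s} oil={oil:6.1f}  water={water:5.1f} after {days} d")

Compounds whose degradation rate exceeds the dissolution rate coefficient (here toluene and o-xylene) keep low aqueous masses, consistent with the paper's qualitative finding.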
Issues and recent advances in optimal experimental design for site investigation (Invited)
NASA Astrophysics Data System (ADS)
Nowak, W.
2013-12-01
This presentation provides an overview of issues and recent advances in model-based experimental design for site exploration. The issues and advances addressed are (1) how to provide an adequate envelope for prior uncertainty, (2) how to define the information needs in a task-oriented manner, (3) how to measure the expected impact of a data set that is not yet available but only planned to be collected, and (4) how best to perform the optimization of the data collection plan. Among other shortcomings of the state of the art, it is identified that there is a lack of demonstrator studies in which exploration schemes based on expert judgment are compared to exploration schemes obtained by optimal experimental design. Such studies will be necessary to address the often-voiced concern that experimental design is an academic exercise with little improvement potential over the well-trained gut feeling of field experts. When addressing this concern, a specific focus has to be given to uncertainty in model structure, parameterizations and parameter values, and to the related surprises that data often bring about in field studies, but never in synthetic-data-based studies. The background of this concern is that, initially, conceptual uncertainty may be so large that surprises are the rule rather than the exception. In such situations, field experts have a large body of experience in handling the surprises, and expert judgment may be good enough compared to meticulous optimization based on a model that is about to be falsified by the incoming data. In order to meet surprises adequately and adapt to them, there needs to be a sufficient representation of conceptual uncertainty within the models used. Also, it is useless to optimize an entire design under this initial range of uncertainty. Thus, the goal setting of the optimization should include the objective of reducing conceptual uncertainty. A possible way out is to upgrade experimental design theory towards real-time interaction with the ongoing site investigation, such that surprises in the data are immediately accounted for to restrict the conceptual uncertainty and update the optimization of the plan.
Klijs, Bart; Kibele, Eva U B; Ellwardt, Lea; Zuidersma, Marij; Stolk, Ronald P; Wittek, Rafael P M; Mendes de Leon, Carlos M; Smidt, Nynke
2016-08-11
Previous studies are inconclusive on whether poor socioeconomic conditions in the neighborhood are associated with major depressive disorder. Furthermore, conceptual models that relate neighborhood conditions to depressive disorder have not been evaluated using empirical data. In this study, we investigated whether neighborhood income is associated with major depressive episodes. We evaluated three conceptual models. Conceptual model 1: The association between neighborhood income and major depressive episodes is explained by diseases, lifestyle factors, stress and social participation. Conceptual model 2: A low individual income relative to the mean income in the neighborhood is associated with major depressive episodes. Conceptual model 3: A high income of the neighborhood buffers the effect of a low individual income on major depressive disorder. We used adult baseline data from the LifeLines Cohort Study (N = 71,058) linked with data on the participants' neighborhoods from Statistics Netherlands. The current presence of a major depressive episode was assessed using the MINI neuropsychiatric interview. The association between neighborhood income and major depressive episodes was assessed using a mixed effect logistic regression model adjusted for age, sex, marital status, education and individual (equalized) income. This regression model was sequentially adjusted for lifestyle factors, chronic diseases, stress, and social participation to evaluate conceptual model 1. To evaluate conceptual models 2 and 3, an interaction term for neighborhood income*individual income was included. Multivariate regression analysis showed that a low neighborhood income is associated with major depressive episodes (OR (95 % CI): 0.82 (0.73;0.93)). Adjustment for diseases, lifestyle factors, stress, and social participation attenuated this association (OR (95 % CI): 0.90 (0.79;1.01)). Low individual income was also associated with major depressive episodes (OR (95 % CI): 0.72 (0.68;0.76)). The interaction of individual income*neighborhood income on major depressive episodes was not significant (p = 0.173). Living in a low-income neighborhood is associated with major depressive episodes. Our results suggest that this association is partly explained by chronic diseases, lifestyle factors, stress and poor social participation, and thereby partly confirm conceptual model 1. Our results do not support conceptual models 2 and 3.
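As a rough illustration of the analysis design, the following Python sketch fits a logistic model with the cross-level interaction used to test conceptual models 2 and 3, approximating the mixed-effects structure with neighborhood-clustered standard errors via statsmodels. The simulated data, column names, and effect sizes are all invented.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "nbhd": rng.integers(0, 200, n),          # neighborhood identifier
    "ind_income": rng.normal(0, 1, n),        # standardized individual income
})
df["nbhd_income"] = df.groupby("nbhd")["ind_income"].transform("mean")
logit_p = -2 - 0.3 * df["ind_income"] - 0.2 * df["nbhd_income"]
df["depressed"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Main effects plus the cross-level interaction; clustering by neighborhood
# approximates the random-intercept structure of the mixed model.
m = smf.logit("depressed ~ ind_income * nbhd_income", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["nbhd"]}, disp=False)
print(m.summary().tables[1])  # the interaction row tests the buffering claim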
Health literacy and public health: A systematic review and integration of definitions and models
2012-01-01
Background Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits the possibilities for measurement and comparison. The aim of the study is to review definitions and models on health literacy to develop an integrated definition and conceptual model capturing the most comprehensive evidence-based dimensions of health literacy. Methods A systematic literature review was performed to identify definitions and conceptual frameworks of health literacy. A content analysis of the definitions and conceptual frameworks was carried out to identify the central dimensions of health literacy and develop an integrated model. Results The review resulted in 17 definitions of health literacy and 12 conceptual models. Based on the content analysis, an integrative conceptual model was developed containing 12 dimensions referring to the knowledge, motivation and competencies of accessing, understanding, appraising and applying health-related information within the healthcare, disease prevention and health promotion setting, respectively. Conclusions Based upon this review, a model is proposed integrating medical and public health views of health literacy. The model can serve as a basis for developing health literacy enhancing interventions and provide a conceptual basis for the development and validation of measurement tools, capturing the different dimensions of health literacy within the healthcare, disease prevention and health promotion settings. PMID:22276600
A Systematic Review of Conceptual Frameworks of Medical Complexity and New Model Development.
Zullig, Leah L; Whitson, Heather E; Hastings, Susan N; Beadles, Chris; Kravchenko, Julia; Akushevich, Igor; Maciejewski, Matthew L
2016-03-01
Patient complexity is often operationalized by counting multiple chronic conditions (MCC) without considering contextual factors that can affect patient risk for adverse outcomes. Our objective was to develop a conceptual model of complexity addressing gaps identified in a review of published conceptual models. We searched for English-language MEDLINE papers published between 1 January 2004 and 16 January 2014. Two reviewers independently evaluated abstracts and all authors contributed to the development of the conceptual model in an iterative process. From 1606 identified abstracts, six conceptual models were selected. One additional model was identified through reference review. Each model had strengths, but several constructs were not fully considered: 1) contextual factors; 2) dynamics of complexity; 3) patients' preferences; 4) acute health shocks; and 5) resilience. Our Cycle of Complexity model illustrates relationships between acute shocks and medical events, healthcare access and utilization, workload and capacity, and patient preferences in the context of interpersonal, organizational, and community factors. This model may inform studies on the etiology of and changes in complexity, the relationship between complexity and patient outcomes, and intervention development to improve modifiable elements of complex patients.
Open Vehicle Sketch Pad Aircraft Modeling Strategies
NASA Technical Reports Server (NTRS)
Hahn, Andrew S.
2013-01-01
Geometric modeling of aircraft during the Conceptual design phase is very different from that needed for the Preliminary or Detailed design phases. The Conceptual design phase is characterized by the rapid, multi-disciplinary analysis of many design variables by a small engineering team. The designer must walk a line between fidelity and productivity, picking tools and methods with the appropriate balance of characteristics to achieve the goals of the study, while staying within the available resources. Identifying geometric details that are important, and those that are not, is critical to making modeling and methodology choices. This is true for both the low-order analysis methods traditionally used in Conceptual design and the highest-order analyses available. This paper highlights some of the characteristics of Conceptual design that drive the designer's choices, as well as modeling examples for several aircraft configurations using the open source version of the Vehicle Sketch Pad (Open VSP) aircraft Conceptual design geometry modeler.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yidong Xia; Mitch Plummer; Robert Podgorney
2016-02-01
The performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desirable output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that 1) the fracture horizontal spacing has a profound effect on the long-term performance of heat production, 2) the downward deviation angle for the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and 3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite element based, fully implicit, fully coupled hydrothermal code, namely FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercially available, this new open-source code demonstrates a code development strategy that aims to provide unparalleled ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.
A flexible and qualitatively stable model for cell cycle dynamics including DNA damage effects.
Jeffries, Clark D; Johnson, Charles R; Zhou, Tong; Simpson, Dennis A; Kaufmann, William K
2012-01-01
This paper includes a conceptual framework for cell cycle modeling into which the experimenter can map observed data and evaluate mechanisms of cell cycle control. The basic model exhibits qualitative stability, meaning that regardless of magnitudes of system parameters its instances are guaranteed to be stable in the sense that all feasible trajectories converge to a certain trajectory. Qualitative stability can also be described by the signs of real parts of eigenvalues of the system matrix. On the biological side, the resulting model can be tuned to approximate experimental data pertaining to human fibroblast cell lines treated with ionizing radiation, with or without disabled DNA damage checkpoints. Together these properties validate a fundamental, first order systems view of cell dynamics. Classification Codes: 15A68.
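The spectral criterion mentioned above is easy to check numerically; the following minimal Python sketch tests whether a linear (or linearized) system dx/dt = Ax is stable by inspecting the signs of the real parts of the eigenvalues of A. The matrix is an illustrative example, not the paper's cell-cycle system, and checking one instance is weaker than qualitative stability, which holds regardless of parameter magnitudes for a given sign pattern.

import numpy as np

# Illustrative 3x3 system matrix (not the paper's); dx/dt = A x.
A = np.array([[-1.0,  0.5,  0.0],
              [ 0.0, -0.8,  0.4],
              [-0.3,  0.0, -0.5]])

eigvals = np.linalg.eigvals(A)
print("eigenvalues:", np.round(eigvals, 3))
print("stable:", bool(np.all(eigvals.real < 0)))  # all real parts negative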
Challenges in Requirements Engineering: A Research Agenda for Conceptual Modeling
NASA Astrophysics Data System (ADS)
March, Salvatore T.; Allen, Gove N.
Domains for which information systems are developed deal primarily with social constructions—conceptual objects and attributes created by human intentions and for human purposes. Information systems play an active role in these domains. They document the creation of new conceptual objects, record and ascribe values to their attributes, initiate actions within the domain, track activities performed, and infer conclusions based on the application of rules that govern how the domain is affected when socially-defined and identified causal events occur. Emerging applications of information technologies evaluate such business rules, learn from experience, and adapt to changes in the domain. Conceptual modeling grammars aimed at representing their system requirements must include conceptual objects, socially-defined events, and the rules pertaining to them. We identify challenges to conceptual modeling research and pose an ontology of the artificial as a step toward meeting them.
Application of the human needs conceptual model to dental hygiene practice.
Darby, M L; Walsh, M M
2000-01-01
The Human Needs Conceptual Model is relevant to dental hygiene because of the need for dental hygienists to be client focused, humanistic, and accountable in practice. Application of the Human Needs Conceptual Model provides a formal framework for identifying and understanding the unique needs of the client that can be met through dental hygiene care. Practitioners find that the Human Needs Conceptual Model can not only help them in assessment and diagnosis, but also in client education, decision-making, care implementation, and the evaluation of treatment outcomes. By using the model, the dental hygienist is able to manage client care humanistically and holistically, and ensure that care is client-centered rather than task-oriented. With the model, a professional practice can be made operational.
White-Means, S I
1995-01-01
There is no consensus on the appropriate conceptualization of race in economic models of health care. This is because race is rarely the primary focus for analysis of the market. This article presents an alternative framework for conceptualizing race in health economic models. A case study is analyzed to illustrate the value of the alternative conceptualization. The case study findings clearly document the importance of model stratification according to race. Moreover, the findings indicate that empirical results are improved when medical utilization models are refined in a way that reflects the unique experiences of the population that is studied. PMID:7721593
General Methodology for Designing Spacecraft Trajectories
NASA Technical Reports Server (NTRS)
Condon, Gerald; Ocampo, Cesar; Mathur, Ravishankar; Morcos, Fady; Senent, Juan; Williams, Jacob; Davis, Elizabeth C.
2012-01-01
A methodology for designing spacecraft trajectories in any gravitational environment within the solar system has been developed. The methodology facilitates modeling and optimization for problems ranging from that of a single spacecraft orbiting a single celestial body to that of a mission involving multiple spacecraft and multiple propulsion systems operating in gravitational fields of multiple celestial bodies. The methodology consolidates almost all spacecraft trajectory design and optimization problems into a single conceptual framework requiring solution of either a system of nonlinear equations or a parameter-optimization problem with equality and/or inequality constraints.
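As a toy illustration of the second problem class named above, the following Python sketch poses a parameter-optimization problem with one equality and one inequality constraint and solves it with SciPy's SLSQP solver; the objective and constraints are invented stand-ins, not the methodology's actual trajectory equations.

import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Stand-in cost, e.g. total impulsive velocity change (illustrative).
    return np.sum(x**2)

constraints = [
    {"type": "eq",   "fun": lambda x: x[0] + x[1] - 1.0},  # equality constraint
    {"type": "ineq", "fun": lambda x: 0.8 - x[0]},          # inequality, >= 0
]
res = minimize(objective, x0=[0.9, 0.1], constraints=constraints, method="SLSQP")
print("solution:", res.x, "cost:", round(res.fun, 4))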
Modeling aqueous ferrous iron chemistry at low temperatures with application to Mars
Marion, G.M.; Catling, D.C.; Kargel, J.S.
2003-01-01
Major uncertainties exist with respect to the aqueous geochemical evolution of the Martian surface. Considering the prevailing cryogenic climates and the abundance of salts and iron minerals on Mars, any attempt at comprehensive modeling of Martian aqueous chemistry should include iron chemistry and be valid at low temperatures and high solution concentrations. The objectives of this paper were to (1) estimate ferrous iron Pitzer-equation parameters and iron mineral solubility products at low temperatures (from < 0 °C to 25 °C), (2) incorporate these parameters and solubility products into the FREZCHEM model, and (3) use the model to simulate the surficial aqueous geochemical evolution of Mars. Ferrous iron Pitzer-equation parameters were derived in this work or taken from the literature. Six new iron minerals [FeCl2·4H2O, FeCl2·6H2O, FeSO4·H2O, FeSO4·7H2O, FeCO3, and Fe(OH)3] were added to the FREZCHEM model bringing the total solid phases to 56. Agreement between model predictions and experimental data are fair to excellent for the ferrous systems: Fe-Cl, Fe-SO4, Fe-HCO3, H-Fe-Cl, and H-Fe-SO4. We quantified a conceptual model for the aqueous geochemical evolution of the Martian surface. The five stages of the conceptual model are: (1) carbonic acid weathering of primary ferromagnesian minerals to form an initial magnesium-iron-bicarbonate-rich solution; (2) evaporation and precipitation of carbonates, including siderite (FeCO3), with evolution of the brine to a concentrated NaCl solution; (3) ferrous/ferric iron oxidation; (4) either evaporation or freezing of the brine to dryness; and (5) surface acidification. What began as a dilute Mg-Fe-HCO3 dominated leachate representing ferromagnesian weathering evolved into an Earth-like seawater composition dominated by NaCl, and finally into a hypersaline Mg-Na-SO4-Cl brine. Weathering appears to have taken place initially under conditions that allowed solution of ferrous iron [low O2(g)], but later caused oxidation of iron [high O2(g)]. Surface acidification and/or sediment burial can account for the minor amounts of Martian surface carbonates. This model rests on a large number of assumptions and is therefore speculative. Nevertheless, the model is consistent with current understanding concerning surficial salts and minerals based on Martian meteorites, Mars lander data, and remotely-sensed spectral analyses. © 2003 Elsevier Ltd.
A parsimonious dynamic model for river water quality assessment.
Mannina, Giorgio; Viviani, Gaspare
2010-01-01
Water quality modelling is of crucial importance for the assessment of physical, chemical, and biological changes in water bodies. Mathematical approaches to water modelling have become more prevalent over recent years. Different model types are available, ranging from detailed physical models to simplified conceptual models. A possible middle ground between detailed and simplified models is the parsimonious model, which represents the simplest approach that fits the application. The appropriate modelling approach depends on the research goal as well as on the data available for correct model application. When data are inadequate, it is mandatory to focus on a simple river water quality model rather than detailed ones. This study presents a parsimonious river water quality model for evaluating the propagation of pollutants in natural rivers. The model is made up of two sub-models: a quantity one and a quality one. The model employs a river schematisation that considers different stretches according to the geometric characteristics and the gradient of the river bed. Each stretch is represented with a conceptual model of a series of linear channels and reservoirs. The channels determine the delay in the pollution wave and the reservoirs cause its dispersion. To assess the river water quality, the model employs four state variables: DO, BOD, NH(4), and NO. The model was applied to the Savena River (Italy), which is the focus of a European-financed project in which quantity and quality data were gathered. A sensitivity analysis of the model output to the model input and parameters was carried out based on the Generalised Likelihood Uncertainty Estimation methodology. The results demonstrate the suitability of such a model as a tool for river water quality management.
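The routing element described above is simple to sketch; the following minimal Python example delays an inflow pulse (the linear channel) and then passes it through a cascade of linear reservoirs, which flattens and disperses the pollution wave. The residence time, cascade size, and pulse are illustrative assumptions.

import numpy as np

def route(inflow, k=3.0, n_reservoirs=2, delay=2, dt=1.0):
    # Linear channel: pure delay of the inflow series.
    q = np.concatenate([np.zeros(delay), np.asarray(inflow, float)])
    # Cascade of linear reservoirs: outflow = storage / k (explicit Euler).
    for _ in range(n_reservoirs):
        s, out = 0.0, []
        for qin in q:
            s += (qin - s / k) * dt
            out.append(s / k)
        q = np.array(out)
    return q

pulse = [0, 10, 20, 5, 0, 0, 0, 0, 0, 0]      # illustrative pollution wave
print(np.round(route(pulse), 2))               # delayed and dispersed wave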
Conceptual Models and Guidelines for Clinical Assessment of Financial Capacity.
Marson, Daniel
2016-09-01
The ability to manage financial affairs is a life skill of critical importance, and neuropsychologists are increasingly asked to assess financial capacity across a variety of settings. Sound clinical assessment of financial capacity requires knowledge and appreciation of applicable clinical conceptual models and principles. However, the literature has presented relatively little conceptual guidance for clinicians concerning financial capacity and its assessment. This article seeks to address this gap. The article presents six clinical models of financial capacity: (1) the early gerontological IADL model of Lawton, (2) the clinical skills model and (3) a related cognitive psychological model developed by Marson and colleagues, (4) a financial decision-making model adapting earlier decisional capacity work of Appelbaum and Grisso, (5) a person-centered model of financial decision-making developed by Lichtenberg and colleagues, and (6) a recent model of financial capacity in the real world developed through the Institute of Medicine. Accompanying presentation of the models is discussion of the conceptual and practical perspectives they represent for clinician assessment. Based on the models, the article concludes by presenting a series of conceptually oriented guidelines for clinical assessment of financial capacity. In summary, sound assessment of financial capacity requires knowledge and appreciation of clinical conceptual models and principles. Awareness of such models, principles and guidelines will strengthen and advance clinical assessment of financial capacity. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Using heat as a tracer to estimate spatially distributed mean residence times in the hyporheic zone
NASA Astrophysics Data System (ADS)
Naranjo, R. C.; Pohll, G. M.; Stone, M. C.; Niswonger, R. G.; McKay, W. A.
2013-12-01
Biogeochemical reactions that occur in the hyporheic zone are highly dependent on the time solutes are in contact with riverbed sediments. In this investigation, we developed a two-dimensional longitudinal flow and solute transport model to estimate the spatial distribution of mean residence time in the hyporheic zone along a riffle-pool sequence, in order to gain a better understanding of nitrogen reactions. The model was calibrated using observations of temperature and pressure, and the approach accounts for the mixing of ages under advection and dispersion. Uncertainty of flow and transport parameters was evaluated using standard Monte Carlo analysis and the generalized likelihood uncertainty estimation method. Results of parameter estimation indicate the presence of a low-permeability zone that induced horizontal flow at shallow depth within the riffle area. This establishes shallow and localized flow paths and limits deep vertical exchange. From the optimal model, mean residence times were found to be relatively long (9 - 40 days). The uncertainty of hydraulic conductivity resulted in a mean interquartile range of 13 days across all piezometers and was reduced by 24% with the inclusion of temperature and pressure observations. To a lesser extent, uncertainty in streambed porosity and dispersivity resulted in mean interquartile ranges of 2.2 and 4.7 days, respectively. Alternative conceptual models demonstrate the importance of accounting for the spatial distribution of hydraulic conductivity in simulating mean residence times in a riffle-pool sequence. It is demonstrated that spatially variable mean residence time beneath a riffle-pool system does not conform to simple conceptual models of hyporheic flow through a riffle-pool sequence. Rather, the mixing behavior between the river and the hyporheic flow is largely controlled by layered heterogeneity and anisotropy of the subsurface.
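A minimal Python sketch of the GLUE step described above: weight Monte Carlo parameter samples by an informal likelihood, retain the behavioral ones, and summarize the residence-time spread as an interquartile range. The one-parameter forward model, cutoff, and all numbers are stand-ins for the actual site model.

import numpy as np

rng = np.random.default_rng(7)

def forward_model(logK):
    # Stand-in: mean residence time (days) decreasing with conductivity.
    return 400.0 / np.exp(logK)

obs_rt, sigma = 20.0, 5.0                       # invented observation target
logK = rng.normal(3.0, 0.8, 5000)               # prior samples of ln(K)
rt = forward_model(logK)
like = np.exp(-0.5 * ((rt - obs_rt) / sigma) ** 2)   # informal likelihood
behavioral = like > 0.1 * like.max()                 # GLUE behavioral cutoff
w = like[behavioral] / like[behavioral].sum()

# Likelihood-weighted quartiles of residence time from the behavioral set.
order = np.argsort(rt[behavioral])
cum = np.cumsum(w[order])
q25, q75 = np.interp([0.25, 0.75], cum, rt[behavioral][order])
print(f"posterior interquartile range: {q75 - q25:.1f} days")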
An Empirical Study of Enterprise Conceptual Modeling
NASA Astrophysics Data System (ADS)
Anaby-Tavor, Ateret; Amid, David; Fisher, Amit; Ossher, Harold; Bellamy, Rachel; Callery, Matthew; Desmond, Michael; Krasikov, Sophia; Roth, Tova; Simmonds, Ian; de Vries, Jacqueline
Business analysts, business architects, and solution consultants use a variety of practices and methods in their quest to understand business. The resulting work products could end up being transitioned into the formal world of software requirement definitions or as recommendations for all kinds of business activities. We describe an empirical study about the nature of these methods, diagrams, and home-grown conceptual models as reflected in real practice at IBM. We identify the models as artifacts of "enterprise conceptual modeling". We study important features of these models, suggest practical classifications, and discuss their usage. Our survey shows that the "enterprise conceptual modeling" arena presents a variety of descriptive models, each used by a relatively small group of colleagues. Together they form a "long tail" that extends from "drawings" on one end to "standards" on the other.
Educational Criteria for Evaluating Simple Class Diagrams Made by Novices for Conceptual Modeling
ERIC Educational Resources Information Center
Kayama, Mizue; Ogata, Shinpei; Asano, David K.; Hashimoto, Masami
2016-01-01
Conceptual modeling is one of the most important learning topics for higher education and secondary education. The goal of conceptual modeling in this research is to draw a class diagram using given notation to satisfy the given requirements. In this case, the subjects are asked to choose concepts to satisfy the given requirements and to correctly…
ERIC Educational Resources Information Center
Lee, Heewon; Contento, Isobel R.; Koch, Pamela
2013-01-01
Objective: To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, "Choice, Control & Change", designed to promote dietary and physical activity behaviors that reduce obesity risk. Design: A process evaluation study based on a systematic conceptual model. Setting: Five…
Conceptual models of information processing
NASA Technical Reports Server (NTRS)
Stewart, L. J.
1983-01-01
The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.
Bromaghin, Jeffrey F.; McDonald, Trent L.; Amstrup, Steven C.
2013-01-01
Mark-recapture models are extensively used in quantitative population ecology, providing estimates of population vital rates, such as survival, that are difficult to obtain using other methods. Vital rates are commonly modeled as functions of explanatory covariates, adding considerable flexibility to mark-recapture models, but also increasing the subjectivity and complexity of the modeling process. Consequently, model selection and the evaluation of covariate structure remain critical aspects of mark-recapture modeling. The difficulties involved in model selection are compounded in Cormack-Jolly-Seber models because they are composed of separate sub-models for survival and recapture probabilities, which are conceptualized independently even though their parameters are not statistically independent. The construction of models as combinations of sub-models, together with multiple potential covariates, can lead to a large model set. Although desirable, estimation of the parameters of all models may not be feasible. Strategies to search a model space and base inference on a subset of all models exist and enjoy widespread use. However, even though the methods used to search a model space can be expected to influence parameter estimation, the assessment of covariate importance, and therefore the ecological interpretation of the modeling results, the performance of these strategies has received limited investigation. We present a new strategy for searching the space of a candidate set of Cormack-Jolly-Seber models and explore its performance relative to existing strategies using computer simulation. The new strategy provides an improved assessment of the importance of covariates and covariate combinations used to model survival and recapture probabilities, while requiring only a modest increase in the number of models on which inference is based in comparison to existing techniques.
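The combinatorial structure of the model space is easy to picture with a short Python sketch: every pairing of a survival (phi) sub-model with a recapture (p) sub-model is a candidate model. The fit_cjs function below is a stub standing in for an actual Cormack-Jolly-Seber likelihood fit, and the covariate lists are illustrative.

from itertools import product

phi_submodels = ["~1", "~time", "~sex", "~sex+time"]   # survival covariates
p_submodels = ["~1", "~time", "~effort"]               # recapture covariates

def fit_cjs(phi, p):
    # Stub: a real version would maximize the CJS likelihood and return
    # AICc; here a size-based score merely stands in for illustration.
    return 100.0 + 2 * (phi.count("+") + p.count("+") + 2)

models = [(fit_cjs(phi, p), phi, p)
          for phi, p in product(phi_submodels, p_submodels)]
for aicc, phi, p in sorted(models)[:5]:
    print(f"phi{phi:10s} p{p:8s} AICc={aicc:.1f}")
print(f"{len(models)} candidates from {len(phi_submodels)}x{len(p_submodels)} sub-models")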
Conceptual Models of Depression in Primary Care Patients: A Comparative Study
Karasz, Alison; Garcia, Nerina; Ferri, Lucia
2009-01-01
Conventional psychiatric treatment models are based on a biopsychiatric model of depression. A plausible explanation for low rates of depression treatment utilization among ethnic minorities and the poor is that members of these communities do not share the cultural assumptions underlying the biopsychiatric model. The study examined conceptual models of depression among depressed patients from various ethnic groups, focusing on the degree to which patients’ conceptual models ‘matched’ a biopsychiatric model of depression. The sample included 74 primary care patients from three ethnic groups screening positive for depression. We administered qualitative interviews assessing patients’ conceptual representations of depression. The analysis proceeded in two phases. The first phase involved a strategy called ‘quantitizing’ the qualitative data. A rating scheme was developed and applied to the data by a rater blind to study hypotheses. The data was subjected to statistical analyses. The second phase of the analysis involved the analysis of thematic data using standard qualitative techniques. Study hypotheses were largely supported. The qualitative analysis provided a detailed picture of primary care patients’ conceptual models of depression and suggested interesting directions for future research. PMID:20182550
A Conceptual Framework for SAHRA Integrated Multi-resolution Modeling in the Rio Grande Basin
NASA Astrophysics Data System (ADS)
Liu, Y.; Gupta, H.; Springer, E.; Wagener, T.; Brookshire, D.; Duffy, C.
2004-12-01
The sustainable management of water resources in a river basin requires an integrated analysis of the social, economic, environmental and institutional dimensions of the problem. Numerical models are commonly used for integration of these dimensions and for communication of the analysis results to stakeholders and policy makers. The National Science Foundation Science and Technology Center for Sustainability of semi-Arid Hydrology and Riparian Areas (SAHRA) has been developing integrated multi-resolution models to assess impacts of climate variability and land use change on water resources in the Rio Grande Basin. These models not only couple natural systems such as surface and ground waters, but will also include engineering, economic and social components that may be involved in water resources decision-making processes. This presentation will describe the conceptual framework being developed by SAHRA to guide and focus the multiple modeling efforts and to assist the modeling team in planning, data collection and interpretation, communication, evaluation, etc. One of the major components of this conceptual framework is a Conceptual Site Model (CSM), which describes the basin and its environment based on existing knowledge and identifies what additional information must be collected to develop technically sound models at various resolutions. The initial CSM is based on analyses of basin profile information that has been collected, including a physical profile (e.g., topographic and vegetative features), a man-made facility profile (e.g., dams, diversions, and pumping stations), and a land use and ecological profile (e.g., demographics, natural habitats, and endangered species). Based on the initial CSM, a Conceptual Physical Model (CPM) is developed to guide and evaluate the selection of a model code (or numerical model) for each resolution to conduct simulations and predictions. A CPM identifies, conceptually, all the physical processes and engineering and socio-economic activities occurring (or to occur) in the real system that the corresponding numerical models are required to address, such as riparian evapotranspiration responses to vegetation change and groundwater pumping impacts on soil moisture contents. Simulation results from different resolution models and observations of the real system will then be compared to evaluate the consistency among the CSM, the CPMs, and the numerical models, and feedbacks will be used to update the models. In a broad sense, the evaluation of the models (conceptual or numerical), as well as the linkages between them, can be viewed as a part of the overall conceptual framework. As new data are generated and understanding improves, the models will evolve, and the overall conceptual framework is refined. The development of the conceptual framework becomes an on-going process. We will describe the current state of this framework and the open questions that have to be addressed in the future.
Conceptual strategies and inter-theory relations: The case of nanoscale cracks
NASA Astrophysics Data System (ADS)
Bursten, Julia R.
2018-05-01
This paper introduces a new account of inter-theory relations in physics, which I call the conceptual strategies account. Using the example of a multiscale computer simulation model of nanoscale crack propagation in silicon, I illustrate this account and contrast it with existing reductive, emergent, and handshaking approaches. The conceptual strategies account develops the notion that relations among physical theories, and among their models, are constrained but not dictated by limitations from physics, mathematics, and computation, and that conceptual reasoning within those limits is required both to generate and to understand the relations between theories. Conceptual strategies result in a variety of types of relations between theories and models. These relations are themselves epistemic objects, like theories and models, and as such are an under-recognized part of the epistemic landscape of science.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Heng; Ye, Ming; Walker, Anthony P.
Hydrological models are composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty and ignore the model uncertainty in process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general and can be applied to a wide range of problems in hydrology and beyond.
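To make the variance decomposition concrete, the sketch below estimates a process sensitivity index for a toy recharge process with two alternative model forms. The model forms, prior weights, and parameter ranges are hypothetical stand-ins, not the study's actual models:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two hypothetical recharge process models: linear-fraction and threshold-excess.
def recharge_linear(precip, p):
    return p * precip

def recharge_threshold(precip, p):
    return np.maximum(precip - p, 0.0)

precip, n = 100.0, 20_000
weights = np.array([0.5, 0.5])          # prior model probabilities (assumed equal)

cond_means, samples = [], []
for model, lo, hi in [(recharge_linear, 0.1, 0.4), (recharge_threshold, 40.0, 80.0)]:
    p = rng.uniform(lo, hi, n)          # random parameters of this process model
    k = rng.lognormal(0.0, 0.5, n)      # uncertain conductivity ("geology" process)
    y = model(precip, p) / k            # toy head response
    cond_means.append(y.mean())
    samples.append(y)

cond_means = np.array(cond_means)
y_pooled = np.concatenate(samples)      # equal weights allow simple pooling

# Between-model variance of the conditional means, relative to total variance:
between = weights @ (cond_means - weights @ cond_means) ** 2
print(f"recharge process sensitivity ~ {between / y_pooled.var():.3f}")
```

The index reflects both model-form and parameter uncertainty in the recharge process; the remainder of the variance is attributable to the geology process and interactions.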
Kwasniok, Frank; Lohmann, Gerrit
2009-12-01
A method for systematically deriving simple nonlinear dynamical models from ice-core data is proposed. It offers a tool to integrate models and theories with paleoclimatic data. The method is based on the unscented Kalman filter, a nonlinear extension of the conventional Kalman filter. Here, we adopt the abstract conceptual model of stochastically driven motion in a potential that allows for two distinctly different states. The parameters of the model, the shape of the potential and the noise level, are estimated from a North Greenland ice-core record. For the glacial period from 70 to 20 ky before present, a potential is derived that is asymmetric and almost degenerate. There is a deep well corresponding to a cold stadial state and a very shallow well corresponding to a warm interstadial state.
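A minimal sketch of the underlying conceptual model, stochastically driven motion in an asymmetric double-well potential, integrated with Euler-Maruyama; the potential coefficients and noise level are illustrative, not the values fitted to the NGRIP record:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative coefficients of V(x) = c4 x^4 + c3 x^3 + c2 x^2 + c1 x, chosen to
# give an asymmetric double well (deep "stadial" well, shallow "interstadial"
# well) -- NOT the values estimated from the ice-core record.
c4, c3, c2, c1 = 0.25, 0.1, -1.0, 0.3
sigma = 0.8                          # noise level
dt, n_steps = 0.01, 50_000

def dVdx(x):
    return 4*c4*x**3 + 3*c3*x**2 + 2*c2*x + c1

x = np.empty(n_steps)
x[0] = -1.5                          # start in the deep (cold) well
for i in range(1, n_steps):          # Euler-Maruyama: dx = -V'(x) dt + sigma dW
    x[i] = x[i-1] - dVdx(x[i-1])*dt + sigma*np.sqrt(dt)*rng.standard_normal()

print("fraction of time in the warm state:", (x > 0).mean())
```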
NASA Astrophysics Data System (ADS)
Pandey, S.; Vesselinov, V. V.; O'Malley, D.; Karra, S.; Hansen, S. K.
2016-12-01
Models and data are used to characterize the extent of contamination and remediation, both of which depend on the complex interplay of processes ranging from geochemical reactions, microbial metabolism, and pore-scale mixing to heterogeneous flow and external forcings. Characterization is fraught with important uncertainties related to the model itself (e.g., conceptualization, model implementation, parameter values) and the data used for model calibration (e.g., sparsity, measurement errors). This research consists of two primary components: (1) developing numerical models that incorporate the complex hydrogeology and biogeochemistry that drive groundwater contamination and remediation; (2) utilizing novel techniques for data/model-based analyses (such as parameter calibration and uncertainty quantification) to aid in decision support for optimal uncertainty reduction related to characterization and remediation of contaminated sites. The reactive transport models are developed using PFLOTRAN and are capable of simulating a wide range of biogeochemical and hydrologic conditions that affect the migration and remediation of groundwater contaminants under diverse field conditions. Data/model-based analyses are achieved using MADS, which utilizes Bayesian methods and Information Gap theory to address the data/model uncertainties discussed above. We also use these tools to evaluate different models, which vary in complexity, in order to weigh and rank models based on model accuracy (in representation of existing observations), model parsimony (everything else being equal, models with a smaller number of parameters are preferred), and model robustness (related to model predictions of unknown future states). These analyses are carried out on synthetic problems but are directly related to real-world problems; for example, the modeled processes and data inputs are consistent with the conditions at the Los Alamos National Laboratory contamination sites (RDX and chromium).
Conceptual model of sediment processes in the upper Yuba River watershed, Sierra Nevada, CA
Curtis, J.A.; Flint, L.E.; Alpers, Charles N.; Yarnell, S.M.
2005-01-01
This study develops a conceptual model of sediment processes in the upper Yuba River watershed and hypothesizes how components of the conceptual model may be spatially distributed using a geographical information system (GIS). The conceptual model illustrates key processes controlling sediment dynamics in the upper Yuba River watershed and was tested and revised using field measurements, aerial photography, and low-elevation videography. Field reconnaissance included mass wasting and channel storage inventories, assessment of annual channel change in upland tributaries, and evaluation of the relative importance of sediment sources and transport processes. Hillslope erosion rates throughout the study area are relatively low compared to more rapidly eroding landscapes such as the Pacific Northwest, and notable hillslope sediment sources include highly erodible andesitic mudflows, serpentinized ultramafics, and unvegetated hydraulic mine pits. Mass wasting dominates surface erosion on the hillslopes; however, erosion of stored channel sediment is the primary contributor to annual sediment yield. We used GIS to spatially distribute the components of the conceptual model and created hillslope erosion potential and channel storage models. The GIS models exemplify the conceptual model in that landscapes with low potential evapotranspiration, sparse vegetation, steep slopes, erodible geology and soils, and high road densities display the greatest hillslope erosion potential, and channel storage increases with increasing stream order. In-channel storage in upland tributaries impacted by hydraulic mining is an exception. Reworking of stored hydraulic mining sediment in low-order tributaries continues to elevate upper Yuba River sediment yields. Finally, we propose that spatially distributing the components of a conceptual model in a GIS framework provides a guide for developing more detailed sediment budgets or numerical models, making it an inexpensive way to develop a roadmap for understanding sediment dynamics at a watershed scale.
Organizational intellectual capital and the role of the nurse manager: A proposed conceptual model.
Gilbert, Jason H; Von Ah, Diane; Broome, Marion E
Nurse managers must leverage both the human capital and social capital of the teams they lead in order to produce quality outcomes. Little is known about the relationship between human capital and social capital and how these concepts may work together to produce organizational outcomes through leadership of nurses. The purpose of this article was to explore the concepts of human capital and social capital as they relate to nursing leadership in health care organizations. Specific aims included (a) to synthesize the literature related to human capital and social capital in leadership, (b) to refine the conceptual definitions of human capital and social capital with associated conceptual antecedents and consequences, and (c) to propose a synthesized conceptual model guiding further empirical research of social capital and human capital in nursing leadership. A systematic integrative review of leadership literature using criteria informed by Whittemore and Knafl (2005) was completed. CINAHL Plus with Full Text, Academic Search Premier, Business Source Premier, Health Business FullTEXT, MEDLINE, and PsycINFO databases were searched for the years 1995 to 2016 using terms "human capital," "social capital," and "management." Analysis of conceptual definitions, theoretical and conceptual models, antecedents and consequences, propositions or hypotheses, and empirical support for 37 articles fitting review criteria resulted in the synthesis of the proposed Gilbert Conceptual Model of Organizational Intellectual Capital. The Gilbert Conceptual Model of Organizational Intellectual Capital advances the propositions of human capital theory and social capital theory and is the first model to conceptualize the direct and moderating effects that nurse leaders have on the human capital and social capital of the teams they lead. This model provides a framework for further empirical study and may have implications for practice, organizational policy, and education related to nursing leadership. Copyright © 2017 Elsevier Inc. All rights reserved.
Tolerance of uncertainty: Conceptual analysis, integrative model, and implications for healthcare.
Hillen, Marij A; Gutheil, Caitlin M; Strout, Tania D; Smets, Ellen M A; Han, Paul K J
2017-05-01
Uncertainty tolerance (UT) is an important, well-studied phenomenon in health care and many other important domains of life, yet its conceptualization and measurement by researchers in various disciplines have varied substantially and its essential nature remains unclear. The objectives of this study were to: 1) analyze the meaning and logical coherence of UT as conceptualized by developers of UT measures, and 2) develop an integrative conceptual model to guide future empirical research regarding the nature, causes, and effects of UT. A narrative review and conceptual analysis of 18 existing measures of Uncertainty and Ambiguity Tolerance was conducted, focusing on how measure developers in various fields have defined both the "uncertainty" and "tolerance" components of UT, both explicitly through their writings and implicitly through the items constituting their measures. Both explicit and implicit conceptual definitions of uncertainty and tolerance vary substantially and are often poorly and inconsistently specified. A logically coherent, unified understanding or theoretical model of UT is lacking. To address these gaps, we propose a new integrative definition and multidimensional conceptual model that construes UT as the set of negative and positive psychological responses (cognitive, emotional, and behavioral) provoked by the conscious awareness of ignorance about particular aspects of the world. This model synthesizes insights from various disciplines and provides an organizing framework for future research. We discuss how this model can facilitate further empirical and theoretical research to better measure and understand the nature, determinants, and outcomes of UT in health care and other domains of life. Uncertainty tolerance is an important and complex phenomenon requiring more precise and consistent definition. An integrative definition and conceptual model, intended as a tentative and flexible point of departure for future research, adds needed breadth, specificity, and precision to efforts to conceptualize and measure UT. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
On the Way to Appropriate Model Complexity
NASA Astrophysics Data System (ADS)
Höge, M.
2016-12-01
When statistical models are used to represent natural phenomena they are often too simple or too complex - this is known. But what exactly is model complexity? Among many other definitions, the complexity of a model can be conceptualized as a measure of statistical dependence between observations and parameters (Van der Linde, 2014). However, several issues remain when working with model complexity: a unique definition for model complexity is missing. Assuming a definition is accepted, how can model complexity be quantified? And how can we use quantified complexity to improve modeling? Generally defined, "complexity is a measure of the information needed to specify the relationships between the elements of organized systems" (Bawden & Robinson, 2015). The complexity of a system changes as the knowledge about the system changes. For models this means that complexity is not a static concept: with more data or higher spatio-temporal resolution of parameters, the complexity of a model changes. There are essentially three categories into which all commonly used complexity measures can be classified: (1) an explicit representation of model complexity as "degrees of freedom" of a model, e.g., the effective number of parameters; (2) model complexity as code length, a.k.a. "Kolmogorov complexity": the longer the shortest model code, the higher its complexity (e.g., in bits); (3) complexity defined via the information entropy of parametric or predictive uncertainty. Preliminary results show that Bayes' theorem allows for incorporating all parts of the non-static concept of model complexity, such as data quality and quantity or parametric uncertainty. Therefore, we test how different approaches for measuring model complexity perform in comparison to a fully Bayesian model selection procedure. Ultimately, we want to find a measure that helps to assess the most appropriate model.
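As an illustration of category (1), the effective number of parameters of a linear (ridge) smoother can be computed as the trace of its hat matrix. This is a minimal sketch with synthetic data, not tied to the presentation's test cases; the penalty lambda is a hypothetical knob showing how regularization lowers effective complexity:

```python
import numpy as np

rng = np.random.default_rng(1)

# Effective degrees of freedom of a ridge smoother: df = trace(H), with
# H = X (X'X + lam*I)^{-1} X'. With lam = 0 this equals the number of columns
# of X; increasing regularization shrinks the effective parameter count.
n, p = 50, 8
X = rng.standard_normal((n, p))

for lam in (0.0, 1.0, 10.0, 100.0):
    H = X @ np.linalg.inv(X.T @ X + lam * np.eye(p)) @ X.T
    print(f"lambda={lam:6.1f}  effective parameters={np.trace(H):.2f}")
```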
Analysis of Fractional Flow for Transient Two-Phase Flow in Fractal Porous Medium
NASA Astrophysics Data System (ADS)
Lu, Ting; Duan, Yonggang; Fang, Quantang; Dai, Xiaolu; Wu, Jinsui
2016-03-01
Prediction of fractional flow in fractal porous media is important for reservoir engineering and chemical engineering as well as hydrology. A physical conceptual fractional flow model of transient two-phase flow is developed for fractal porous media based on the fractal characteristics of the pore-size distribution and on the approximation that the porous medium consists of a bundle of tortuous capillaries. The analytical expression for the fractional flow of the wetting phase is presented, and the proposed expression is a function of structural parameters (such as tortuosity fractal dimension, pore fractal dimension, and maximum and minimum diameters of capillaries) and fluid properties (such as contact angle, viscosity and interfacial tension) of the fractal porous medium. The sensitive parameters that influence fractional flow and its derivative are formulated, and their impacts on fractional flow are discussed.
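For orientation, the generic (non-fractal) fractional-flow relation has the form f_w = 1 / (1 + (k_rn/mu_n)/(k_rw/mu_w)). The sketch below evaluates it with simple Corey-type power-law relative permeabilities as a stand-in for the paper's fractal-structure expression, which is not reproduced here; all parameter values are illustrative:

```python
import numpy as np

# Fractional flow of the wetting phase, neglecting gravity and capillarity:
# f_w = (k_rw/mu_w) / (k_rw/mu_w + k_rn/mu_n).
def frac_flow_wetting(Sw, mu_w=1.0e-3, mu_n=5.0e-3, n_w=3.0, n_n=2.0):
    krw = Sw ** n_w                  # Corey-type wetting relative permeability
    krn = (1.0 - Sw) ** n_n          # Corey-type non-wetting relative permeability
    return 1.0 / (1.0 + (krn / mu_n) / (krw / mu_w))

Sw = np.linspace(0.05, 0.95, 10)     # wetting-phase saturation
print(np.round(frac_flow_wetting(Sw), 3))   # S-shaped curve in saturation
```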
Threat to life and risk-taking behaviors: a review of empirical findings and explanatory models.
Ben-Zur, Hasida; Zeidner, Moshe
2009-05-01
This article reviews the literature focusing on the relationship between perceived threat to life and risk-taking behaviors. The review of empirical data, garnered from field studies and controlled experiments, suggests that personal threat to life results in elevated risk-taking behavior. To account for these findings, this review proposes a number of theoretical explanations. These frameworks are grounded in divergent conceptual models: coping with stress, emotion regulation, replenishing of lost resources through self-enhancement, modifications of key parameters of cognitive processing of risky outcomes, and neurocognitive mechanisms. The review concludes with a number of methodological considerations, as well as directions for future work in this promising area of research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rheinstaedter, Maikel C.; Enderle, Mechthild; Kloepperpieper, Axel
2005-01-01
Methanol-β-hydroquinone clathrate has been established as a model system for dielectric ordering and fluctuations and is conceptually close to magnetic spin systems. In x-ray and neutron diffraction experiments, we investigated the ordered structure, the one-dimensional (1D) and the three-dimensional critical scattering in the paraelectric phase, and the temperature dependence of the lattice constants. Our results can be explained by microscopic models of the methanol pseudospin in the hydroquinone cage network, consistent with previous dielectric investigations. A coupling of the 1D fluctuations to local strains leads to an anomalous temperature dependence of the 1D lattice parameter in the paraelectric regime.
Ward, Ryan D; Gallistel, C R; Balsam, Peter D
2013-05-01
Learning in conditioning protocols has long been thought to depend on temporal contiguity between the conditioned stimulus (CS) and the unconditioned stimulus (US). This conceptualization has led to a preponderance of associative models of conditioning. We suggest that trial-based associative models that posit contiguity as the primary principle underlying learning are flawed, and provide a brief review of an alternative, information-theoretic approach to conditioning. The information that a CS conveys about the timing of the next US can be derived from the temporal parameters of a conditioning protocol. According to this view, a CS will support conditioned responding if, and only if, it reduces uncertainty about the timing of the next US. Copyright © 2013 Elsevier B.V. All rights reserved.
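A hedged sketch of the kind of quantity this view emphasizes: for fixed-interval protocols, the information a CS conveys about US timing grows with the log ratio of the US-US interval C to the CS-US interval T (the informativeness ratio associated with Balsam and Gallistel's account); the numbers below are illustrative:

```python
import numpy as np

# Informativeness of a CS in a fixed-interval protocol, in bits: log2(C/T),
# where C is the US-US (cycle) interval and T the CS-US (trial) interval.
# A CS is predicted to support responding only when this ratio exceeds 1.
def cs_informativeness_bits(C, T):
    return np.log2(C / T)

print(cs_informativeness_bits(C=120.0, T=10.0))   # 12:1 ratio -> ~3.58 bits
```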
DOE Office of Scientific and Technical Information (OSTI.GOV)
C. Harrington
2004-10-25
The purpose of this model report is to provide documentation of the conceptual and mathematical model (Ashplume) for atmospheric dispersal and subsequent deposition of ash on the land surface from a potential volcanic eruption at Yucca Mountain, Nevada. This report also documents the ash (tephra) redistribution conceptual model. These aspects of volcanism-related dose calculation are described in the context of the entire igneous disruptive events conceptual model in ''Characterize Framework for Igneous Activity'' (BSC 2004 [DIRS 169989], Section 6.1.1). The Ashplume conceptual model accounts for incorporation and entrainment of waste fuel particles associated with a hypothetical volcanic eruption through the Yucca Mountain repository and downwind transport of contaminated tephra. The Ashplume mathematical model describes the conceptual model in mathematical terms to allow for prediction of radioactive waste/ash deposition on the ground surface given that the hypothetical eruptive event occurs. This model report also describes the conceptual model for tephra redistribution from a basaltic cinder cone. Sensitivity analyses and model validation activities for the ash dispersal and redistribution models are also presented. Analyses documented in this model report update the previous documentation of the Ashplume mathematical model and its application to the Total System Performance Assessment (TSPA) for the License Application (TSPA-LA) igneous scenarios. This model report also documents the redistribution model product outputs based on analyses to support the conceptual model. In this report, ''Ashplume'' is used when referring to the atmospheric dispersal model and ''ASHPLUME'' when referencing the code of that model. Two analysis and model reports provide direct inputs to this model report, namely ''Characterize Eruptive Processes at Yucca Mountain, Nevada'' and ''Number of Waste Packages Hit by Igneous Intrusion''. This model report provides direct inputs to the TSPA, which uses the ASHPLUME software described and used in this model report. Thus, ASHPLUME software inputs are inputs to this model report for ASHPLUME runs performed herein, but are outputs of this model report for ASHPLUME runs performed by the TSPA.
Bayesian comparison of conceptual models of abrupt climate changes during the last glacial period
NASA Astrophysics Data System (ADS)
Boers, Niklas; Ghil, Michael; Rousseau, Denis-Didier
2017-04-01
Records of oxygen isotope ratios and dust concentrations from the North Greenland Ice Core Project (NGRIP) provide accurate proxies for the evolution of Arctic temperature and atmospheric circulation during the last glacial period (12 ka to 100 ka b2k) [1]. The most distinctive features of these records are sudden transitions, called Dansgaard-Oeschger (DO) events, during which Arctic temperatures increased by up to 10 K within a few decades. These warming events are consistently followed by more gradual cooling in Antarctica [2]. The physical mechanisms responsible for these transitions and their out-of-phase relationship between the northern and southern hemispheres remain unclear. Substantial evidence hints at variations of the Atlantic Meridional Overturning Circulation as a key mechanism [2,3], but other mechanisms, such as variations of sea ice extent [4] or ice shelf coverage [5], may also play an important role. Here, we intend to shed more light on the relevance of the different mechanisms suggested to explain the abrupt climate changes and their inter-hemispheric coupling. For this purpose, several conceptual differential equation models are developed that represent the suggested physical mechanisms. Optimal parameters for each model candidate are then determined via maximum likelihood estimation with respect to the observed paleoclimatic data. Our approach is thus semi-empirical: while a model's general form is deduced from physical arguments about relevant climatic mechanisms — oceanic and atmospheric — its specific parameters are obtained by training the model on observed data. The distinct model candidates are evaluated by comparing statistical properties of time series simulated with these models to the observed statistics. In particular, Bayesian model selection criteria and maximum-likelihood ratio tests are used to obtain a hierarchy of the different candidates in terms of their likelihood, given the observed oxygen isotope and dust time series. [1] Kindler et al., Clim. Past (2014) [2] WAIS, Nature (2015) [3] Henry et al., Science (2016) [4] Gildor and Tziperman, Phil. Trans. R. Soc. (2003) [5] Petersen et al., Paleoceanography (2013)
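The sketch below mimics the semi-empirical workflow on mock data: two conceptual model candidates are fitted by maximum likelihood and ranked with BIC as a simple stand-in for the paper's selection criteria. The data, model forms, and starting values are all illustrative:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(3)

# Mock proxy series standing in for the ice-core records (not shipped here).
t = np.linspace(0.0, 10.0, 400)
data = np.sin(t) + 0.3 * rng.standard_normal(t.size)

# Two conceptual candidates: constant climate vs. oscillatory forcing.
def nll_const(theta):
    mu, log_s = theta
    return -norm.logpdf(data, mu, np.exp(log_s)).sum()

def nll_osc(theta):
    amp, freq, log_s = theta
    return -norm.logpdf(data, amp * np.sin(freq * t), np.exp(log_s)).sum()

fit1 = minimize(nll_const, x0=[0.0, 0.0], method="Nelder-Mead")
fit2 = minimize(nll_osc, x0=[0.5, 1.2, 0.0], method="Nelder-Mead")

# BIC approximates the log Bayesian model evidence; lower is better.
n = t.size
bic1 = 2 * fit1.fun + 2 * np.log(n)
bic2 = 2 * fit2.fun + 3 * np.log(n)
print(f"BIC constant = {bic1:.1f}, BIC oscillatory = {bic2:.1f}")
```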
Nonlocal gravity. Conceptual aspects and cosmological predictions
NASA Astrophysics Data System (ADS)
Belgacem, Enis; Dirian, Yves; Foffa, Stefano; Maggiore, Michele
2018-03-01
Even if the fundamental action of gravity is local, the corresponding quantum effective action, that includes the effect of quantum fluctuations, is a nonlocal object. These nonlocalities are well understood in the ultraviolet regime but much less in the infrared, where they could in principle give rise to important cosmological effects. Here we systematize and extend previous work of our group, in which it is assumed that a mass scale Λ is dynamically generated in the infrared, giving rise to nonlocal terms in the quantum effective action of gravity. We give a detailed discussion of conceptual aspects related to nonlocal gravity (including causality, degrees of freedom, ambiguities related to the boundary conditions of the nonlocal operator, scenarios for the emergence of a dynamical scale in the infrared) and of the cosmological consequences of these models. The requirement of providing a viable cosmological evolution severely restricts the form of the nonlocal terms, and selects a model (the so-called RR model) that corresponds to a dynamical mass generation for the conformal mode. For such a model: (1) there is a FRW background evolution, where the nonlocal term acts as an effective dark energy with a phantom equation of state, providing accelerated expansion without a cosmological constant. (2) Cosmological perturbations are well behaved. (3) Implementing the model in a Boltzmann code and comparing with observations we find that the RR model fits the CMB, BAO, SNe, structure formation data and local H0 measurements at a level statistically equivalent to ΛCDM. (4) Bayesian parameter estimation shows that the value of H0 obtained in the RR model is higher than in ΛCDM, reducing to 2.0σ the tension with the value from local measurements. (5) The RR model provides a prediction for the sum of neutrino masses that falls within the limits set by oscillation and terrestrial experiments (in contrast to ΛCDM, where letting the sum of neutrino masses vary as a free parameter within these limits, one hits the lower bound). (6) Gravitational waves propagate at the speed of light, complying with the limit from GW170817/GRB 170817A.
NASA Astrophysics Data System (ADS)
Alsing, Justin; Wandelt, Benjamin; Feeney, Stephen
2018-07-01
Many statistical models in cosmology can be simulated forwards but have intractable likelihood functions. Likelihood-free inference methods allow us to perform Bayesian inference from these models using only forward simulations, free from any likelihood assumptions or approximations. Likelihood-free inference generically involves simulating mock data and comparing to the observed data; this comparison in data space suffers from the curse of dimensionality and requires compression of the data to a small number of summary statistics to be tractable. In this paper, we use massive asymptotically optimal data compression to reduce the dimensionality of the data space to just one number per parameter, providing a natural and optimal framework for summary statistic choice for likelihood-free inference. Second, we present the first cosmological application of Density Estimation Likelihood-Free Inference (DELFI), which learns a parametrized model for the joint distribution of data and parameters, yielding both the parameter posterior and the model evidence. This approach is conceptually simple, requires less tuning than traditional Approximate Bayesian Computation approaches to likelihood-free inference and can give high-fidelity posteriors from orders of magnitude fewer forward simulations. As an additional bonus, it enables parameter inference and Bayesian model comparison simultaneously. We demonstrate DELFI with massive data compression on an analysis of the joint light-curve analysis supernova data, as a simple validation case study. We show that high-fidelity posterior inference is possible for full-scale cosmological data analyses with as few as ~10^4 simulations, with substantial scope for further improvement, demonstrating the scalability of likelihood-free inference to large and complex cosmological data sets.
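A minimal sketch of score-style compression in a Gaussian-linear toy problem, where N data points collapse to one summary per parameter without losing information; the design matrix, covariance, and fiducial point are invented for illustration, not taken from the paper's analysis:

```python
import numpy as np

rng = np.random.default_rng(7)

# Gaussian-linear toy problem: d = A @ theta + noise with known covariance C.
# Score compression t = (dmu/dtheta)^T C^{-1} (d - mu_fid) maps the N data
# points to one summary per parameter while preserving Fisher information.
N, var = 500, 0.5
A = rng.standard_normal((N, 2))                    # dmu/dtheta for a linear model
theta_true = np.array([1.0, -0.5])
d = A @ theta_true + np.sqrt(var) * rng.standard_normal(N)

theta_fid = np.array([0.8, -0.3])                  # fiducial expansion point
Cinv = np.eye(N) / var

t = A.T @ Cinv @ (d - A @ theta_fid)               # compressed summaries, shape (2,)
F = A.T @ Cinv @ A                                 # Fisher matrix of the full data

# For a linear model the compression is lossless: the exact MLE is recoverable.
print("theta_hat =", theta_fid + np.linalg.solve(F, t))
```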
NASA Astrophysics Data System (ADS)
Xu, Zexuan; Hu, Bill
2016-04-01
Dual-permeability karst aquifers of porous media and conduit networks with significantly different hydrological characteristics are widely distributed in the world. Discrete-continuum numerical models, such as MODFLOW-CFP and CFPv2, have been verified as appropriate approaches to simulate groundwater flow and solute transport in numerical modeling of karst hydrogeology. On the other hand, seawater intrusion associated with fresh groundwater resources contamination has been observed and investigated in a number of coastal aquifers, especially under conditions of sea level rise. Density-dependent numerical models, including SEAWAT, are able to quantitatively evaluate seawater/freshwater interaction processes. A numerical model of variable-density flow and solute transport - conduit flow process (VDFST-CFP) is developed to provide a better description of seawater intrusion and submarine groundwater discharge in a coastal karst aquifer with conduits. The coupled discrete-continuum VDFST-CFP model applies the Darcy-Weisbach equation to simulate non-laminar groundwater flow in the conduit system, which is conceptualized and discretized as pipes, while the Darcy equation is used in the continuum porous media. Density-dependent groundwater flow and solute transport equations with appropriate density terms in both the conduit and porous media systems are derived and numerically solved using a standard finite-difference method with an implicit iteration procedure. Synthetic horizontal and vertical benchmarks are created to validate the newly developed VDFST-CFP model by comparison with other numerical models, such as the variable-density SEAWAT, coupled constant-density groundwater flow and solute transport MODFLOW/MT3DMS, and discrete-continuum CFPv2/UMT3D models. The VDFST-CFP model improves the simulation of density-dependent seawater/freshwater mixing processes and exchanges between conduit and matrix. Continuum numerical models greatly overestimate the flow rate under turbulent flow conditions, whereas discrete-continuum models provide more accurate results. Parameter sensitivity analysis indicates that conduit diameter and friction factor, matrix hydraulic conductivity and porosity are important parameters that significantly affect variable-density flow and solute transport simulation. The pros and cons of model assumptions, conceptual simplifications and numerical techniques in VDFST-CFP are discussed. In general, the development of the VDFST-CFP model is an innovation in numerical modeling methodology and could be applied to quantitatively evaluate seawater/freshwater interaction in coastal karst aquifers. Keywords: Discrete-continuum numerical model; Variable density flow and transport; Coastal karst aquifer; Non-laminar flow
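The two flow laws coupled in such discrete-continuum models can be contrasted directly. The sketch below computes a Darcy matrix flux and a Darcy-Weisbach conduit velocity for the same head drop; all property values are illustrative rather than calibrated to any site:

```python
import numpy as np

# Laminar matrix flow (Darcy) vs. turbulent conduit flow (Darcy-Weisbach),
# the two regimes a discrete-continuum karst model couples.

def darcy_flux(K, dh, L):
    """Specific discharge q = K * dh / L (m/s) for hydraulic conductivity K."""
    return K * dh / L

def darcy_weisbach_velocity(f, D, dh, L, g=9.81):
    """Mean pipe velocity from head loss dh over length L:
    dh = f * (L/D) * v^2 / (2g)  =>  v = sqrt(2 g D dh / (f L))."""
    return np.sqrt(2 * g * D * dh / (f * L))

dh, L = 1.0, 1000.0                     # 1 m head drop over 1 km
q_matrix = darcy_flux(K=1e-5, dh=dh, L=L)
v_conduit = darcy_weisbach_velocity(f=0.03, D=0.5, dh=dh, L=L)
print(f"matrix q = {q_matrix:.2e} m/s, conduit v = {v_conduit:.3f} m/s")
```

The several-orders-of-magnitude gap between the two is exactly why continuum-only models misestimate conduit-dominated discharge.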
Quasi-steady aerodynamic model of clap-and-fling flapping MAV and validation using free-flight data.
Armanini, S F; Caetano, J V; Croon, G C H E de; Visser, C C de; Mulder, M
2016-06-30
Flapping-wing aerodynamic models that are accurate, computationally efficient and physically meaningful are challenging to obtain. Such models are essential to design flapping-wing micro air vehicles and to develop advanced controllers enhancing the autonomy of such vehicles. In this work, a phenomenological model is developed for the time-resolved aerodynamic forces on clap-and-fling ornithopters. The model is based on quasi-steady theory and accounts for inertial, circulatory, added-mass and viscous forces. It extends existing quasi-steady approaches by including a fling circulation factor to account for unsteady wing-wing interaction, considering real platform-specific wing kinematics, and covering different flight regimes. The model parameters are estimated from wind tunnel measurements conducted on a real test platform. Comparison to wind tunnel data shows that the model predicts the lift forces on the test platform accurately and accounts for wing-wing interaction effectively. Additionally, validation tests with real free-flight data show that lift forces can be predicted with considerable accuracy in different flight regimes. The complete parameter-varying model represents a wide range of flight conditions, is computationally simple, physically meaningful and requires few measurements. It is therefore potentially useful both for control design and for preliminary conceptual studies of new platforms.
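A hedged sketch of the quasi-steady decomposition (circulatory lift plus an added-mass term, two of the contributions such models account for); the coefficient form, kinematics, and platform numbers are placeholders, not the paper's estimated parameters, and the fling circulation factor is omitted:

```python
import numpy as np

rho, S, c = 1.225, 0.01, 0.05        # air density, wing area (m^2), mean chord (m)
f, R = 12.0, 0.1                     # flapping frequency (Hz), effective arm (m)
omega = 2 * np.pi * f

def lift(t):
    U = R * omega * np.cos(omega * t)             # translational wing speed
    dUdt = -R * omega**2 * np.sin(omega * t)      # translational acceleration
    alpha = np.pi / 4 * np.sin(omega * t)         # assumed pitching schedule
    # Circulatory (translational) lift with an illustrative CL(alpha) law:
    circulatory = 0.5 * rho * U**2 * S * 1.8 * np.sin(2 * alpha)
    # Added-mass force: flat-plate added mass per span times span times accel.
    added_mass = rho * np.pi * c**2 / 4 * (S / c) * dUdt
    return circulatory + added_mass

t = np.linspace(0.0, 1.0 / f, 9)     # one flap cycle
print(np.round(lift(t), 4))
```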
A conceptual snow model with an analytic resolution of the heat and phase change equations
NASA Astrophysics Data System (ADS)
Riboust, Philippe; Le Moine, Nicolas; Thirel, Guillaume; Ribstein, Pierre
2017-04-01
Compared to degree-day snow models, physically based snow models resolve more processes in an attempt to achieve a better representation of reality. Often these physically based models resolve the heat transport equations in snow using a vertical discretization of the snowpack. The snowpack is decomposed into several layers in which the mechanical and thermal states of the snow are calculated. A higher number of layers in the snowpack allows for better accuracy, but it also tends to increase the computational cost. In order to develop a snow model that estimates the temperature profile of snow at a lower computational cost, we used an analytical decomposition of the vertical profile using eigenfunctions (i.e., trigonometric functions adapted to the specific boundary conditions). The mass transfer of snowmelt has also been estimated using an analytical conceptualization of runoff fingering and matrix flow. As external meteorological forcing, the model uses solar and atmospheric radiation, air temperature, atmospheric humidity and precipitation. It has been tested and calibrated at point scale at two different stations in the Alps: Col de Porte (France, 1325 m) and Weissfluhjoch (Switzerland, 2540 m). A sensitivity analysis of model parameters and model inputs will be presented together with a comparison with measured snow surface temperature, SWE, snow depth, temperature profile and snowmelt data. The snow model is created in order to be ultimately coupled with hydrological models for rainfall-runoff modeling in mountainous areas. We hope to create a model that is faster than physically based models but capable of representing more physical processes than degree-day snow models. This should help to build a more reliable snow model capable of being easily calibrated by remote sensing and in situ observation, or of assimilating these data for forecasting purposes.
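A minimal sketch of the eigenfunction idea: the slab temperature profile is written as a steady linear profile plus sine modes whose amplitudes decay analytically in time, so no vertical grid marching is needed. Geometry and properties are illustrative, and the boundary-condition handling here is far simpler than in the actual model:

```python
import numpy as np

# Sine-series solution for the temperature in a snow slab of depth H with fixed
# boundary temperatures: steady linear profile + analytically decaying modes.
H, alpha = 1.0, 4e-7                        # depth (m), thermal diffusivity (m^2/s)
T_top, T_bot, T_init = -10.0, 0.0, -2.0     # boundary and initial temperatures (C)

z = np.linspace(0.0, H, 400)
dz = z[1] - z[0]
steady = T_bot + (T_top - T_bot) * z / H

k = np.arange(1, 51)[:, None] * np.pi / H   # eigenvalues, shape (modes, 1)
# Projection of the initial deviation onto each sine mode (Riemann approximation)
b = 2.0 / H * ((T_init - steady) * np.sin(k * z)).sum(axis=1) * dz

def temperature(t):
    """T(z, t) from the truncated eigenfunction series."""
    modes = b[:, None] * np.exp(-alpha * k**2 * t) * np.sin(k * z)
    return steady + modes.sum(axis=0)

print(f"mid-depth T after 1 h: {temperature(3600.0)[200]:.2f} C")
```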
NASA Technical Reports Server (NTRS)
Mack, Robert J.; Needleman, Kathy E.
1990-01-01
A method for designing wind tunnel models of conceptual, low-boom, supersonic cruise aircraft is presented. Also included is a review of the procedures used to design the conceptual low-boom aircraft. In the discussion, problems unique to, and encountered during, the design of both the conceptual aircraft and the wind tunnel models are outlined. The sensitivity of the design's low-boom characteristics to the control of the volume and lift equivalent area distributions is emphasized. Solutions to these problems are reported, especially the two that led to the design of the wind tunnel model support stings.
Finding Mass Constraints Through Third Neutrino Mass Eigenstate Decay
NASA Astrophysics Data System (ADS)
Gangolli, Nakul; de Gouvêa, André; Kelly, Kevin
2018-01-01
In this paper we aim to constrain the decay parameter for the third neutrino mass eigenstate, utilizing already accepted constraints on the other mixing parameters from the Pontecorvo-Maki-Nakagawa-Sakata (PMNS) matrix. The main purpose of this project is to determine the parameters that will allow the Jiangmen Underground Neutrino Observatory (JUNO) to observe a decay parameter with some statistical significance. Another goal is to determine the parameters that JUNO could detect in the case that the third neutrino mass is lighter than the first two neutrino species. We also replicate the results that were found in the JUNO Conceptual Design Report (CDR). Using chi-squared analysis, constraints have been placed on the mixing angles, mass-squared differences, and the third-neutrino decay parameter. These statistical tests take into account background noise and normalization corrections, and thus the finalized bounds are a good approximation of the true bounds that JUNO can detect. If the decay parameter is not included in our models, the 99% confidence interval lies within the bounds 0 s to 2.80x10^-12 s. However, if we account for a decay parameter of 3x10^-5 eV^2, the 99% confidence interval lies within 8.73x10^-12 s to 8.73x10^-11 s.
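The generic procedure behind such bounds is a chi-squared scan: evaluate chi-squared along a grid of the parameter and keep the region where the increase over the minimum stays below the 99% threshold. The sketch below does this for a one-bin mock measurement; the model, data, and units are entirely hypothetical, not JUNO's:

```python
import numpy as np

# One-parameter chi-squared scan for a 99% CL interval (1 dof: delta-chi2 < 6.63).
grid = np.linspace(0.0, 5.0, 501)           # decay parameter, arbitrary units

def prediction(tau):
    """Stand-in one-bin spectrum model depending on the scanned parameter."""
    return 10.0 * np.exp(-0.3 * tau)

data, sigma = 9.1, 0.4                       # mock measurement and its uncertainty
chi2 = ((prediction(grid) - data) / sigma) ** 2

delta = chi2 - chi2.min()
inside = grid[delta < 6.63]                  # 99% quantile of chi2(1 dof) ~ 6.63
print(f"99% interval: [{inside.min():.2f}, {inside.max():.2f}]")
```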
The Model-Building Process in Introductory College Geography: An Illustrative Example
ERIC Educational Resources Information Center
Cadwallader, Martin
1978-01-01
Illustrates the five elements of conceptual models by developing a model of consumer behavior in choosing among alternative supermarkets. The elements are: identifying the problem, constructing a conceptual model, translating it into a symbolic model, operationalizing the model, and testing. (Author/AV)
Blackmore, S; Pedretti, D; Mayer, K U; Smith, L; Beckie, R D
2018-05-30
Accurate predictions of solute release from waste-rock piles (WRPs) are paramount for decision making in mining-related environmental processes. Tracers provide information that can be used to estimate effective transport parameters and understand mechanisms controlling the hydraulic and geochemical behavior of WRPs. It is shown that internal tracers (i.e., initially present) together with external (i.e., applied) tracers provide complementary and quantitative information to identify transport mechanisms. The analysis focuses on two experimental WRPs, Pile 4 and Pile 5 at the Antamina Mine site (Peru), where both an internal chloride tracer and an externally applied bromide tracer were monitored in discharge over three years. The results suggest that external tracers provide insight into transport associated with relatively fast flow regions that are activated during higher-rate recharge events. In contrast, internal tracers provide insight into mechanisms controlling solute release from lower-permeability zones within the piles. Rate-limited diffusive processes, which can be mimicked by nonlocal mass-transfer models, affect both internal and external tracers. The sensitivity of the mass-transfer parameters to heterogeneity is higher for external tracers than for internal tracers, as indicated by the different mean residence times characterizing the flow paths associated with each tracer. The joint use of internal and external tracers provides a more comprehensive understanding of the transport mechanisms in WRPs. In particular, the tracer tests support the notion that a multi-porosity conceptualization of WRPs captures key mechanisms better than a dual-porosity conceptualization. Copyright © 2018 Elsevier B.V. All rights reserved.
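A minimal dual-porosity sketch of the tailing behavior these tracers diagnose: a mobile zone flushed by recharge exchanges mass with an immobile zone at a first-order rate (a multi-porosity model would add further immobile compartments, each with its own rate). Rates and the capacity ratio below are illustrative, not fitted to the Antamina piles:

```python
import numpy as np

dt, n = 0.1, 2000                    # time step (days), number of steps
q_over_V = 0.05                      # flushing rate of the mobile zone (1/day)
omega, beta = 0.01, 2.0              # mass-transfer rate (1/day), immobile/mobile capacity

c_m = np.zeros(n)                    # mobile-zone concentration
c_im = np.zeros(n)                   # immobile-zone concentration
c_m[0] = 1.0                         # internal tracer: initially resident in mobile water

for i in range(1, n):                # explicit Euler update of the two-compartment ODEs
    exch = omega * (c_m[i-1] - c_im[i-1])
    c_m[i] = c_m[i-1] + dt * (-q_over_V * c_m[i-1] - exch)
    c_im[i] = c_im[i-1] + dt * exch / beta

print("mobile conc. at t = 100 d:", c_m[int(100 / dt)])   # long tail vs. pure flushing
```

Compared with pure flushing (exp(-q_over_V * t)), the back-diffusing immobile mass sustains the late-time concentration, the signature that motivates nonlocal mass-transfer models.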
Theoretical results on the tandem junction solar cell based on its Ebers-Moll transistor model
NASA Technical Reports Server (NTRS)
Goradia, C.; Vaughn, J.; Baraona, C. R.
1980-01-01
A one-dimensional theoretical model of the tandem junction solar cell (TJC) with base resistivity greater than about 1 ohm-cm and under low-level injection has been derived. This model extends a previously published conceptual model which treats the TJC as an npn transistor. The model gives theoretical expressions for each of the Ebers-Moll type currents of the illuminated TJC and allows for the calculation of the spectral response, I(sc), V(oc), FF and eta under variation of one or more of the geometrical and material parameters and 1-MeV electron fluence. Results of computer calculations based on this model are presented and discussed. These results indicate that for space applications, both a high beginning-of-life efficiency, greater than 15% AM0, and a high radiation tolerance can be achieved only with thin (less than 50 microns) TJCs with high base resistivity (greater than 10 ohm-cm).
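For reference, the standard Ebers-Moll equations that the transistor analogy builds on are sketched below; the saturation currents and alphas are generic illustrative values (chosen to satisfy the reciprocity relation alpha_F*I_ES = alpha_R*I_CS), not the TJC model's parameters:

```python
import numpy as np

# Standard Ebers-Moll npn equations (sign conventions vary by text).
q_over_kT = 1.0 / 0.02585            # 1/V at ~300 K
I_ES, I_CS = 1e-14, 2e-14            # emitter/collector saturation currents (A)
alpha_F, alpha_R = 0.98, 0.49        # forward/reverse common-base gains

def ebers_moll(V_BE, V_BC):
    i_f = I_ES * (np.exp(q_over_kT * V_BE) - 1.0)   # forward diode current
    i_r = I_CS * (np.exp(q_over_kT * V_BC) - 1.0)   # reverse diode current
    I_E = i_f - alpha_R * i_r        # emitter current
    I_C = alpha_F * i_f - i_r        # collector current
    return I_E, I_C

print(ebers_moll(V_BE=0.6, V_BC=-5.0))   # forward-active operating point
```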
Preliminary results from the hydrodynamic element of the 1994 entrapment zone study
Burau, J.R.; Stacey, M.; Gartner, J.W.
1995-01-01
This article discusses preliminary results from analyses of USGS hydrodynamic data collected as part of the 1994 Interagency Ecological Program entrapment zone study. The USGS took part in three 30-hour cruises and deployed instruments for measuring currents and salinity from April to June. This article primarily focuses on the analysis of data from five Acoustic Doppler Current Profilers (ADCPs) deployed in Carquinez Strait, Suisun Bay, and the Western Delta. From these analyses a revised conceptual model of the hydrodynamics of the entrapment/null zone has evolved. The ideas discussed in this newsletter article are essentially working hypotheses, which are presented here to stimulate discussion and further analyses. In this article we discuss the currently held conceptual model of entrapment and present data that are inconsistent with this conceptual model. Finally, we suggest a revised conceptual model that is consistent with all of the hydrodynamic data collected to date and describe how the 1995 study incorporates our revised conceptual model into its design.
Dini-Andreote, Francisco; Stegen, James C.; van Elsas, Jan D.; ...
2015-03-17
Despite growing recognition that deterministic and stochastic factors simultaneously influence bacterial communities, little is known about mechanisms shifting their relative importance. To better understand underlying mechanisms, we developed a conceptual model linking ecosystem development during primary succession to shifts in the stochastic/deterministic balance. To evaluate the conceptual model we coupled spatiotemporal data on soil bacterial communities with environmental conditions spanning 105 years of salt marsh development. At the local scale there was a progression from stochasticity to determinism due to Na accumulation with increasing ecosystem age, supporting a main element of the conceptual model. At the regional scale, soil organic matter (SOM) governed the relative influence of stochasticity and the type of deterministic ecological selection, suggesting scale-dependency in how deterministic ecological selection is imposed. Analysis of a new ecological simulation model supported these conceptual inferences. Looking forward, we propose an extended conceptual model that integrates primary and secondary succession in microbial systems.
Navigating Tensions Between Conceptual and Metaconceptual Goals in the Use of Models
NASA Astrophysics Data System (ADS)
Delgado, Cesar
2015-04-01
Science education involves learning about phenomena at three levels: concrete (facts and generalizations), conceptual (concepts and theories), and metaconceptual (epistemology) (Snir et al. in J Sci Educ Technol 2(2):373-388, 1993). Models are key components in science, can help build conceptual understanding, and may also build metaconceptual understanding. Technology can transform teaching and learning by turning models into interactive simulations that learners can investigate. This paper identifies four characteristics of models and simulations that support conceptual learning but misconstrue models and science at a metaconceptual level. Ahistorical models combine the characteristics of several historical models; they conveniently compile ideas but misrepresent the history of science (Gilbert in Int J Sci Math Educ 2(2):115-130, 2004). Teleological models explain behavior in terms of a final cause; they can lead to useful heuristics but imply purpose in processes driven by chance and probability (Talanquer in Int J Sci Educ 29(7):853-870, 2007). Epistemological overreach occurs when models or simulations imply greater certainty and knowledge about phenomena than warranted; conceptualizing nature as being well known (e.g., having a mathematical structure) poses the danger of conflating model and reality or data and theory. Finally, models are inevitably ontologically impoverished. Real-world deviations and many variables are left out of models, as models' role is to simplify. Models and simulations also lose much of the sensory data present in phenomena. Teachers, designers, and professional development facilitators must thus navigate the tension between conceptual and metaconceptual learning when using models and simulations. For each characteristic, examples are provided, along with recommendations for instruction and design. Prompts for explicit reflective activities around models are provided for each characteristic.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wichlacz, Paul Louis; Orr, Brennan
2002-08-01
The National Research Council has defined a conceptual model as ''an evolving hypothesis identifying the important features, processes, and events controlling fluid flow and contaminant transport of consequence at a specific field site in the context of a recognized problem''. Presently, several subregional conceptual models are under development at the Idaho National Engineering and Environmental Laboratory (INEEL). Additionally, facility-specific conceptual models have been described as part of INEEL environmental restoration activities. Compilation of these models is required to develop a comprehensive conceptual model that can be used to strategically plan for future groundwater research activities at the INEEL. Conceptual models of groundwater flow and contaminant transport at the INEEL include the description of the geologic framework, matrix hydraulic properties, and inflows and outflows. They also include definitions of the contaminant source term and contaminant transport mechanisms. The geologic framework of the INEEL subregion is described by the geometry of the system, stratigraphic units within the system, and structural features that affect groundwater flow and contaminant transport. These elements define geohydrologic units that make up the Snake River Plain Aquifer (SRPA). The United States Geological Survey (USGS) conceptual model encompasses approximately 1,920 mi² of the eastern Snake River Plain. The Waste Area Group (WAG)-10 model includes the USGS area and additional areas to the northeast and southeast. Both conceptual models are bounded to the northwest by the Pioneer Mountains, Lost River Range, and Lemhi Mountains. They are bounded to the southeast by groundwater flow paths determined from aquifer water-level contours. The upgradient extent of the USGS model is a water-level contour that includes the northeastern boundary of the INEEL. The WAG-10 model includes more of the Mud Lake area to utilize previous estimates of underflow into the subregion. Both conceptual models extend approximately 25 miles to the southwest of the INEEL, a distance sufficient to include known concentrations of contaminant tracers. Several hypotheses have been developed concerning the effective thickness of the SRPA at the INEEL. The USGS model has defined the effective thickness from electrical resistivity and borehole data to be as much as 2,500 ft in the eastern part of the subregion and as much as 4,000 ft in the southwestern part. The WAG-10 model has developed two alternatives using aquifer-temperature and electrical resistivity data. The ''thick'' aquifer interpretation utilizes colder temperature data and includes a north-trending zone in which the thickness exceeds 1,300 ft, with a maximum thickness of 1,700 ft. The ''thin'' aquifer interpretation minimizes aquifer thickness, with thickness ranging from 328 to 1,300 ft. Facility-specific models generally have focused efforts on the upper 250 ft of saturation. Conceptual models have utilized a stratigraphic data set to define geohydrologic units within the INEEL subregion. This data set, compiled from geophysical logs and cores from boreholes, correlates the thick, complex stack of basalt flows across the subregion. Conceptual models generally concur that the upper geohydrologic unit consists of a section of highly fractured, multiple, thin basalt flows and sedimentary interbeds. Beneath this unit is an areally extensive, thick, unfractured basalt flow that rises above the water table southwest of the INEEL. The bottom unit consists of a thick section of slightly- to moderately-altered basalt. A key objective of the DOE water-integration project at the INEEL is to coordinate development of a subregional conceptual model of groundwater flow and contaminant transport that is based on the best available understanding of geologic and hydrologic features. The first step in this process is to compile and summarize the current conceptual models of groundwater flow and contaminant transport at the INEEL that have been developed from extensive geohydrologic studies conducted during the last 50 years.
MeProRisk - a Joint Venture for Minimizing Risk in Geothermal Reservoir Development
NASA Astrophysics Data System (ADS)
Clauser, C.; Marquart, G.
2009-12-01
Exploration and development of geothermal reservoirs for the generation of electric energy involve high engineering and economic risks due to the need for 3-D geophysical surface surveys and deep boreholes. The MeProRisk project provides a strategy guideline for reducing these risks by combining cross-disciplinary information from different specialists: scientists from three German universities and two private companies contribute new methods in seismic modeling and interpretation, numerical reservoir simulation, estimation of petrophysical parameters, and 3-D visualization. The approach chosen in MeProRisk consists of treating the prospecting and development of geothermal reservoirs as an iterative process. A first conceptual model for fluid flow and heat transport simulation can be developed based on the limited initial information available on geology and rock properties. In the next step, additional data are incorporated, based on (a) new seismic interpretation methods designed for delineating fracture systems, (b) statistical studies on large numbers of rock samples for estimating reliable rock parameters, and (c) in situ estimates of the hydraulic conductivity tensor. This results in a continuous refinement of the reservoir model, where inverse modelling of fluid flow and heat transport allows inferring the uncertainty and resolution of the model at each iteration step. This finally yields a calibrated reservoir model which may be used to direct further exploration by optimizing additional borehole locations, estimate the uncertainty of key operational and economic parameters, and optimize the long-term operation of a geothermal reservoir.
NASA Astrophysics Data System (ADS)
Hernández, María Isabel; Couso, Digna; Pintó, Roser
2015-04-01
The study we have carried out aims to characterize 15- to 16-year-old students' learning progressions throughout the implementation of a teaching-learning sequence on the acoustic properties of materials. Our purpose is to better understand students' modeling processes about this topic and to identify how the instructional design and actual enactment influence students' learning progressions. This article presents the design principles which elicit the structure and types of modeling and inquiry activities designed to promote students' development of three conceptual models. Some of these activities are enhanced by the use of ICT such as sound level meters connected to data capture systems, which facilitate the measurement of the intensity level of sound emitted by a sound source and transmitted through different materials. Framing this study within the design-based research paradigm, it consists of the experimentation of the designed teaching sequence with two groups of students (n = 29) in their science classes. The analysis of students' written productions together with classroom observations of the implementation of the teaching sequence allowed us to characterize students' development of the conceptual models. Moreover, we were able to show the influence of different modeling and inquiry activities on students' development of the conceptual models, identifying those that have a major impact on students' modeling processes. Having evidenced different levels of development of each conceptual model, our results have been interpreted in terms of the attributes of each conceptual model, the distance between students' preliminary mental models and the intended conceptual models, and the instructional design and enactment.
Global sensitivity analysis in stochastic simulators of uncertain reaction networks.
Navarro Jimenez, M; Le Maître, O P; Knio, O M
2016-12-28
Stochastic models of chemical systems are often subjected to uncertainties in kinetic parameters in addition to the inherent random nature of their dynamics. Uncertainty quantification in such systems is generally achieved by means of sensitivity analyses in which one characterizes the variability with the uncertain kinetic parameters of the first statistical moments of model predictions. In this work, we propose an original global sensitivity analysis method where the parametric and inherent variability sources are both treated through Sobol's decomposition of the variance into contributions from arbitrary subset of uncertain parameters and stochastic reaction channels. The conceptual development only assumes that the inherent and parametric sources are independent, and considers the Poisson processes in the random-time-change representation of the state dynamics as the fundamental objects governing the inherent stochasticity. A sampling algorithm is proposed to perform the global sensitivity analysis, and to estimate the partial variances and sensitivity indices characterizing the importance of the various sources of variability and their interactions. The birth-death and Schlögl models are used to illustrate both the implementation of the algorithm and the richness of the proposed analysis method. The output of the proposed sensitivity analysis is also contrasted with a local derivative-based sensitivity analysis method classically used for this type of systems.
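A hedged sketch of the pick-freeze idea in a noisy simulator: runs that share a kinetic parameter value but draw fresh inherent noise are correlated, and that correlation estimates the parameter's first-order index, with the remainder attributable to inherent stochasticity and interactions. The birth-death toy below is illustrative and is not the paper's random-time-change formulation:

```python
import numpy as np

rng = np.random.default_rng(11)

def simulator(k_birth, rng):
    """Toy stochastic birth-death realization: population after a fixed horizon."""
    x = 50
    for _ in range(100):
        x += rng.poisson(k_birth) - rng.poisson(0.1 * x)
    return x

n = 1000
kA = rng.uniform(2.0, 8.0, n)                 # parametric source: uncertain birth rate
kB = rng.uniform(2.0, 8.0, n)

yA = np.array([simulator(k, rng) for k in kA])
yB = np.array([simulator(k, rng) for k in kB])
yAB = np.array([simulator(k, rng) for k in kA])   # same parameters, fresh inherent noise

# Pick-freeze estimate of the first-order index of k_birth: runs sharing the
# parameter (yA, yAB) are correlated only through that parameter.
var_y = np.var(np.r_[yA, yB])
S_k = (np.mean(yA * yAB) - np.mean(yA) * np.mean(yB)) / var_y
print(f"S(k_birth) ~ {S_k:.2f}; the rest is inherent noise and interactions")
```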
Horobin, R W; Stockert, J C; Rashid-Doubell, F
2015-05-01
We discuss a variety of biological targets including generic biomembranes and the membranes of the endoplasmic reticulum, endosomes/lysosomes, Golgi body, mitochondria (outer and inner membranes) and the plasma membrane of usual fluidity. For each target, we discuss the access of probes to the target membrane, probe uptake into the membrane and the mechanism of selectivity of the probe uptake. A statement of the QSAR decision rule that describes the required physicochemical features of probes enabling selective staining is also provided, followed by comments on exceptions and limits. Examples of probes typically used to demonstrate each target structure are noted, and decision rule tabulations are provided for probes that localize in particular targets; these tabulations show the distribution of probes in the conceptual space defined by the relevant structure parameters ("parameter space"). Some general implications and limitations of the QSAR models for probe targeting are discussed, including the roles of certain cell and protocol factors that play significant roles in lipid staining. A case example illustrates the predictive ability of QSAR models. Key limiting values of the head-group hydrophilicity parameter associated with membrane-probe interactions are discussed in an appendix.
Eddebbarh, A.-A.; Zyvoloski, G.A.; Robinson, B.A.; Kwicklis, E.M.; Reimus, P.W.; Arnold, B.W.; Corbet, T.; Kuzio, S.P.; Faunt, C.
2003-01-01
The US Department of Energy is pursuing Yucca Mountain, Nevada, for the development of a geologic repository for the disposal of spent nuclear fuel and high-level radioactive waste, if the repository is able to meet applicable radiation protection standards established by the US Nuclear Regulatory Commission and the US Environmental Protection Agency (EPA). Effective performance of such a repository would rely on a number of natural and engineered barriers to isolate radioactive waste from the accessible environment. Groundwater beneath Yucca Mountain is the primary medium through which most radionuclides might move away from the potential repository. The saturated zone (SZ) system is expected to act as a natural barrier to this possible movement of radionuclides both by delaying their transport and by reducing their concentration before they reach the accessible environment. Information obtained from Yucca Mountain Site Characterization Project activities is used to estimate groundwater flow rates through the site-scale SZ flow and transport model area and to constrain general conceptual models of groundwater flow in the site-scale area. The site-scale conceptual model is a synthesis of what is known about flow and transport processes at the scale required for total system performance assessment of the site. This knowledge builds on and is consistent with knowledge that has accumulated at the regional scale but is more detailed because more data are available at the site-scale level. The mathematical basis of the site-scale model and the associated numerical approaches are designed to assist in quantifying the uncertainty in the permeability of rocks in the geologic framework model and to represent accurately the flow and transport processes included in the site-scale conceptual model. Confidence in the results of the mathematical model was obtained by comparing calculated to observed hydraulic heads, estimated to measured permeabilities, and lateral flow rates calculated by the site-scale model to those calculated by the regional-scale flow model. In addition, it was confirmed that the flow paths leaving the region of the potential repository are consistent with those inferred from gradients of measured head and those independently inferred from water-chemistry data. The general approach of the site-scale SZ flow and transport model analysis is to calculate unit breakthrough curves for radionuclides at the interface between the SZ and the biosphere using the three-dimensional site-scale SZ flow and transport model. Uncertainties are explicitly incorporated into the site-scale SZ flow and transport abstractions through key parameters and conceptual models. © 2002 Elsevier Science B.V. All rights reserved.
Handren, Lindsay; Crano, William D.
2018-01-01
Culturally, people tend to abstain from alcohol intake during the weekdays and wait to consume in greater frequency and quantity during the weekends. The current research sought to empirically justify the days representing weekday versus weekend alcohol consumption. In study 1 (N = 419), item response theory was applied to a two-parameter (difficulty and discrimination) model that evaluated the days of drinking (frequency) during the typical 7-day week. Item characteristic curves were most similar for Monday, Tuesday, and Wednesday (prototypical weekday) and for Friday and Saturday (prototypical weekend). Thursday and Sunday, however, exhibited item characteristics that bordered the properties of weekday and weekend consumption. In study 2 (N = 403), confirmatory factor analysis was applied to test six hypothesized measurement structures representing drinks per day (quantity) during the typical week. The measurement model producing the strongest fit indices was a correlated two-factor structure involving separate weekday and weekend factors that permitted Thursday and Sunday to double load on both dimensions. The proper conceptualization and accurate measurement of the days demarcating the normative boundaries of “dry” weekdays and “wet” weekends are imperative to inform research and prevention efforts targeting temporal alcohol intake patterns. PMID:27488456
Lac, Andrew; Handren, Lindsay; Crano, William D
2016-10-01
Culturally, people tend to abstain from alcohol intake during the weekdays and wait to consume in greater frequency and quantity during the weekends. The current research sought to empirically justify the days representing weekday versus weekend alcohol consumption. In study 1 (N = 419), item response theory was applied to a two-parameter (difficulty and discrimination) model that evaluated the days of drinking (frequency) during the typical 7-day week. Item characteristic curves were most similar for Monday, Tuesday, and Wednesday (prototypical weekday) and for Friday and Saturday (prototypical weekend). Thursday and Sunday, however, exhibited item characteristics that bordered the properties of weekday and weekend consumption. In study 2 (N = 403), confirmatory factor analysis was applied to test six hypothesized measurement structures representing drinks per day (quantity) during the typical week. The measurement model producing the strongest fit indices was a correlated two-factor structure involving separate weekday and weekend factors that permitted Thursday and Sunday to double load on both dimensions. The proper conceptualization and accurate measurement of the days demarcating the normative boundaries of "dry" weekdays and "wet" weekends are imperative to inform research and prevention efforts targeting temporal alcohol intake patterns.
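For readers unfamiliar with the two-parameter logistic (2PL) IRT model these studies rely on, here is a hedged sketch: the probability of endorsing drinking on a given day is a logistic function of a latent propensity theta, with a per-day discrimination a and difficulty b. The parameter values below are invented for illustration and are not the papers' estimates.

```python
import numpy as np

def icc(theta, a, b):
    """Item characteristic curve of a 2PL item: P(endorse | theta)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

days = {  # (discrimination a, difficulty b) -- hypothetical values
    "Mon": (1.8, 1.2), "Tue": (1.9, 1.3), "Wed": (1.8, 1.1),
    "Thu": (1.5, 0.6), "Fri": (1.2, -0.4), "Sat": (1.3, -0.6),
    "Sun": (1.4, 0.5),
}
theta = np.linspace(-3, 3, 7)  # latent drinking propensity grid
for day, (a, b) in days.items():
    print(day, np.round(icc(theta, a, b), 2))
```

In this toy parameterization, Thursday and Sunday sit between the weekday items (high difficulty) and the weekend items (low difficulty), mirroring the borderline item characteristics the study reports.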
Chen, Lei; Chen, Hongkun; Yang, Jun; Shu, Zhengyu; He, Huiwen; Shu, Xin
2016-01-01
The modified flux-coupling-type superconducting fault current limiter (SFCL) is a highly efficient auxiliary electrical device whose basic function is to suppress short-circuit currents by controlling the magnetic path through a high-speed switch. In this paper, the high-speed switch is based on an electromagnetic repulsion mechanism, and its conceptual design is carried out to promote the application of the modified SFCL. For the switch, which consists of a mobile copper disc and two fixed opening and closing coils, the computational method for the electromagnetic force is discussed, and the dynamic mathematical model, including the circuit equation, the magnetic field equation and the mechanical motion equation, is theoretically derived. Based on the mathematical modeling and calculation of characteristic parameters, a feasible design scheme is presented, and the high-speed switch's response time can be less than 0.5 ms. With the modified SFCL equipped with this high-speed switch, the SFCL's application in a 10 kV micro-grid system with multiple renewable energy sources is assessed in the MATLAB software. The simulations affirm the SFCL's performance behaviors.
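The coupled circuit/motion model suggests the following rough sketch, with the eddy-current repulsion approximated as F = k i(t)^2 and every component value (capacitance, coil inductance and resistance, force coefficient, moving mass, stroke) assumed for illustration; the magnetic field equation solved in the paper is omitted.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed component values for illustration only
C, L, R = 2e-3, 50e-6, 20e-3   # capacitor bank [F], coil [H], loop resistance [ohm]
k, m = 2e-3, 0.3               # repulsion-force coefficient [N/A^2], moving mass [kg]
V0 = 400.0                     # capacitor precharge voltage [V]

def rhs(t, y):
    q, i, x, v = y                     # charge, coil current, disc position, velocity
    didt = -(R * i + q / C) / L        # series RLC discharge loop
    return [i, didt, v, k * i**2 / m]  # F = k*i^2 approximates eddy-current repulsion

sol = solve_ivp(rhs, (0.0, 2e-3), [C * V0, 0.0, 0.0, 0.0], max_step=2e-6)
hit = np.flatnonzero(sol.y[2] >= 5e-3)   # samples past an assumed 5 mm stroke
if hit.size:
    print(f"time to 5 mm stroke: {sol.t[hit[0]] * 1e3:.2f} ms")
```

With these made-up values the disc covers the 5 mm stroke in well under a millisecond, which is at least consistent in order of magnitude with the sub-0.5 ms response the paper targets.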
A conceptual model for vision rehabilitation
Roberts, Pamela S.; Rizzo, John-Ross; Hreha, Kimberly; Wertheimer, Jeffrey; Kaldenberg, Jennifer; Hironaka, Dawn; Riggs, Richard; Colenbrander, August
2017-01-01
Vision impairments are highly prevalent after acquired brain injury (ABI). Conceptual models that focus on constructing intellectual frameworks greatly facilitate comprehension and implementation of practice guidelines in an interprofessional setting. The purpose of this article is to provide a review of the vision literature in ABI, describe a conceptual model for vision rehabilitation, explain its potential clinical inferences, and discuss its translation into rehabilitation across multiple practice settings and disciplines. PMID:27997671
A conceptual model for vision rehabilitation.
Roberts, Pamela S; Rizzo, John-Ross; Hreha, Kimberly; Wertheimer, Jeffrey; Kaldenberg, Jennifer; Hironaka, Dawn; Riggs, Richard; Colenbrander, August
2016-01-01
Vision impairments are highly prevalent after acquired brain injury (ABI). Conceptual models that focus on constructing intellectual frameworks greatly facilitate comprehension and implementation of practice guidelines in an interprofessional setting. The purpose of this article is to provide a review of the vision literature in ABI, describe a conceptual model for vision rehabilitation, explain its potential clinical inferences, and discuss its translation into rehabilitation across multiple practice settings and disciplines.
Navigating Tensions between Conceptual and Metaconceptual Goals in the Use of Models
ERIC Educational Resources Information Center
Delgado, Cesar
2015-01-01
Science education involves learning about phenomena at three levels: concrete (facts and generalizations), conceptual (concepts and theories), and metaconceptual (epistemology) (Snir et al. in "J Sci Educ Technol" 2(2):373-388, 1993). Models are key components in science, can help build conceptual understanding, and may also build…
A Scoping Review: Conceptualizations and Pedagogical Models of Learning in Nursing Simulation
ERIC Educational Resources Information Center
Poikela, Paula; Teräs, Marianne
2015-01-01
Simulations have been implemented globally in nursing education for years with diverse conceptual foundations. The aim of this scoping review is to examine the literature regarding the conceptualizations of learning and pedagogical models in nursing simulations. A scoping review of peer-reviewed articles published between 2000 and 2013 was…
NASA Astrophysics Data System (ADS)
Bachmann-Machnik, Anna; Meyer, Daniel; Waldhoff, Axel; Fuchs, Stephan; Dittmer, Ulrich
2018-04-01
Retention Soil Filters (RSFs), a form of vertical flow constructed wetlands specifically designed for combined sewer overflow (CSO) treatment, have proven to be an effective tool to mitigate negative impacts of CSOs on receiving water bodies. Long-term hydrologic simulations are used to predict the emissions from urban drainage systems during planning of stormwater management measures. So far no universally accepted model for RSF simulation exists. When simulating hydraulics and water quality in RSFs, an appropriate level of detail must be chosen for reasonable balancing between model complexity and model handling, considering the model input's level of uncertainty. The most crucial parameters determining the resultant uncertainties of the integrated sewer system and filter bed model were identified by evaluating a virtual drainage system with a Retention Soil Filter for CSO treatment. To determine reasonable parameter ranges for RSF simulations, data of 207 events from six full-scale RSF plants in Germany were analyzed. Data evaluation shows that even though different plants with varying loading and operation modes were examined, a simple model is sufficient to assess relevant suspended solids (SS), chemical oxygen demand (COD) and NH4 emissions from RSFs. Two conceptual RSF models with different degrees of complexity were assessed. These models were developed based on evaluation of data from full scale RSF plants and column experiments. Incorporated model processes are ammonium adsorption in the filter layer and degradation during subsequent dry weather period, filtration of SS and particulate COD (XCOD) to a constant background concentration and removal of solute COD (SCOD) by a constant removal rate during filter passage as well as sedimentation of SS and XCOD in the filter overflow. XCOD, SS and ammonium loads as well as ammonium concentration peaks are discharged primarily via RSF overflow not passing through the filter bed. Uncertainties of the integrated simulation of the sewer system and RSF model mainly originate from the model parameters of the hydrologic sewer system model.
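The simpler of the two conceptual models lends itself to a compact sketch. The following event-scale mass balance is a hedged illustration under assumed parameter values: filtration of SS/XCOD down to a background concentration, a constant SCOD removal rate, and NH4 adsorption to a finite capacity. The overflow/bypass pathway and dry-weather regeneration that the paper includes are omitted.

```python
def rsf_event(inflow_m3, c_in, sorbed_nh4_kg, params):
    """Route one CSO event through the filter bed; returns effluent quality."""
    c_out = {
        "SS":   params["c_bg_ss"],                      # filtration to background
        "XCOD": params["c_bg_xcod"],
        "SCOD": c_in["SCOD"] * (1 - params["r_scod"]),  # constant removal rate
    }
    # NH4: adsorb until the remaining sorption capacity is exhausted
    load_in = inflow_m3 * c_in["NH4"] / 1000.0          # g/m3 -> kg
    capacity_left = max(params["nh4_cap_kg"] - sorbed_nh4_kg, 0.0)
    adsorbed = min(load_in, capacity_left)
    c_out["NH4"] = (load_in - adsorbed) * 1000.0 / inflow_m3
    return c_out, sorbed_nh4_kg + adsorbed

# All numbers below are illustrative assumptions, not values from the paper
params = {"c_bg_ss": 10.0, "c_bg_xcod": 15.0, "r_scod": 0.5, "nh4_cap_kg": 80.0}
c_in = {"SS": 120.0, "XCOD": 90.0, "SCOD": 60.0, "NH4": 8.0}  # mg/L
print(rsf_event(5000.0, c_in, sorbed_nh4_kg=60.0, params=params))
```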
Developing International Guidelines on Volcanic Hazard Assessments for Nuclear Facilities
NASA Astrophysics Data System (ADS)
Connor, Charles
2014-05-01
Worldwide, tremendous progress has been made in recent decades in forecasting volcanic events, such as episodes of volcanic unrest, eruptions, and the potential impacts of eruptions. Generally these forecasts are divided into two categories. Short-term forecasts are prepared in response to unrest at volcanoes, rely on geophysical monitoring and related observations, and have the goal of forecasting events on timescales of hours to weeks to provide time for evacuation of people, shutdown of facilities, and implementation of related safety measures. Long-term forecasts are prepared to better understand the potential impacts of volcanism in the future and to plan for potential volcanic activity. Long-term forecasts are particularly useful to better understand and communicate the potential consequences of volcanic events for populated areas around volcanoes and for siting critical infrastructure, such as nuclear facilities. Recent work by an international team, through the auspices of the International Atomic Energy Agency, has focused on developing guidelines for long-term volcanic hazard assessments. These guidelines have now been implemented for hazard assessment for nuclear facilities in nations including Indonesia, the Philippines, Armenia, Chile, and the United States. On any time scale, all volcanic hazard assessments rely on a geologically reasonable conceptual model of volcanism. Such conceptual models are usually built upon years or decades of geological studies of specific volcanic systems, analogous systems, and development of a process-level understanding of volcanic activity. Conceptual models are used to bound potential rates of volcanic activity, potential magnitudes of eruptions, and to understand temporal and spatial trends in volcanic activity. It is these conceptual models that provide essential justification for assumptions made in statistical model development and the application of numerical models to generate quantitative forecasts. It is a tremendous challenge in quantitative volcanic hazard assessments to encompass alternative conceptual models, and to create models that are robust to evolving understanding of specific volcanic systems by the scientific community. A central question in volcanic hazards forecasts is quantifying rates of volcanic activity. Especially for long-dormant volcanic systems, data from the geologic record may be sparse, individual events may be missing or unrecognized in the geologic record, and patterns of activity may be episodic or otherwise nonstationary. This leads to uncertainty in forecasting long-term rates of activity. Hazard assessments strive to quantify such uncertainty, for example by comparing observed rates of activity with alternative parametric and nonparametric models. Numerical models are presented that characterize the spatial distribution of potential volcanic events. These spatial density models serve as the basis for application of numerical models of specific phenomena such as lava flow development, tephra fallout, and a host of other volcanic phenomena. Monte Carlo techniques (random sampling, stratified sampling, importance sampling) are methods used to sample vent location and other key eruption parameters, such as eruption volume, magma rheology, and eruption column height for probabilistic models. The development of coupled scenarios (e.g., the probability of tephra accumulation on a slope resulting in subsequent debris flows) is also assessed through these methods, usually with the aid of event trees.
The primary products of long-term forecasts are a statistical model of the conditional probability of the potential effects of volcanism, should an eruption occur, and the probability of such activity occurring. It is emphasized that hazard forecasting is an iterative process, and broad consideration must be given to alternative conceptual models of volcanism, weighting of volcanological data in the analyses, and alternative statistical and numerical models. This structure is amenable to expert elicitation in order to weight alternative models and to explore alternative scenarios.
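A hedged sketch of the Monte Carlo workflow described above: vent locations drawn from a kernel density fitted to (here, synthetic) past vents, eruption volume and column height drawn from assumed distributions, and one event-tree branch for a coupled debris-flow scenario. Every distribution and the facility location are invented.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)
# 40 made-up past vent locations [km]; a real assessment would use mapped vents
past_vents = rng.normal([0.0, 0.0], [5.0, 3.0], size=(40, 2))
kde = gaussian_kde(past_vents.T)                  # spatial density model

n = 10_000
vents = kde.resample(n).T                         # sampled future vent locations
log10_vol = rng.uniform(-3.0, 0.0, n)             # eruption volume, 1e-3..1 km3
col_height = rng.lognormal(np.log(8.0), 0.5, n)   # eruption column height [km]
coupled = rng.random(n) < 0.15                    # event-tree branch: debris flow

site = np.array([10.0, 2.0])                      # hypothetical facility [km]
near = np.linalg.norm(vents - site, axis=1) < 5.0
print("P(vent within 5 km of site)   :", near.mean())
print("P(vent nearby AND debris flow):", (near & coupled).mean())
```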
Conceptualization of preferential flow for hillslope stability assessment
NASA Astrophysics Data System (ADS)
Kukemilks, Karlis; Wagner, Jean-Frank; Saks, Tomas; Brunner, Philip
2018-03-01
This study uses two approaches to conceptualize preferential flow with the goal to investigate their influence on hillslope stability. Synthetic three-dimensional hydrogeological models using dual-permeability and discrete-fracture conceptualization were subsequently integrated into slope stability simulations. The slope stability simulations reveal significant differences in slope stability depending on the preferential flow conceptualization applied, despite similar small-scale hydrogeological responses of the system. This can be explained by a local-scale increase of pore-water pressures observed in the scenario with discrete fractures. The study illustrates the critical importance of correctly conceptualizing preferential flow for slope stability simulations. It further demonstrates that the combination of the latest generation of physically based hydrogeological models with slope stability simulations allows for improvement to current modeling approaches through more complex consideration of preferential flow paths.
Operations and Modeling Analysis
NASA Technical Reports Server (NTRS)
Ebeling, Charles
2005-01-01
The Reliability and Maintainability Analysis Tool (RMAT) provides NASA the capability to estimate reliability and maintainability (R&M) parameters and operational support requirements for proposed space vehicles based upon relationships established from both aircraft and Shuttle R&M data. RMAT has matured both in its underlying database and in its level of sophistication in extrapolating this historical data to satisfy proposed mission requirements, maintenance concepts and policies, and type of vehicle (i.e., ranging from aircraft-like to Shuttle-like). However, a companion analysis tool, the Logistics Cost Model (LCM), has not reached the same level of maturity as RMAT due, in large part, to nonexistent or outdated cost estimating relationships and underlying cost databases, and its almost exclusive dependence on Shuttle operations and logistics cost input parameters. As a result, the full capability of the RMAT/LCM suite of analysis tools to take a conceptual vehicle and derive its operations and support requirements, along with the resulting operating and support costs, has not been realized.
An auto-adaptive optimization approach for targeting nonpoint source pollution control practices.
Chen, Lei; Wei, Guoyuan; Shen, Zhenyao
2015-10-21
To address the computationally intensive and technically complex problem of nonpoint source pollution control, the traditional genetic algorithm was modified into an auto-adaptive pattern, and a new framework was proposed by integrating this new algorithm with a watershed model and an economic module. Although conceptually simple and comprehensive, the proposed algorithm searches automatically for Pareto-optimal solutions without a complex calibration of optimization parameters. The model was applied in a case study in a typical watershed of the Three Gorges Reservoir area, China. The results indicated that the evolutionary process of optimization was improved due to the incorporation of auto-adaptive parameters. In addition, the proposed algorithm outperformed state-of-the-art existing algorithms in terms of convergence ability and computational efficiency. At the same cost level, solutions with greater pollutant reductions could be identified. From a scientific viewpoint, the proposed algorithm could be extended to other watersheds to provide cost-effective configurations of BMPs.
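The abstract does not specify the adaptation rule, so the following is a minimal sketch of one plausible auto-adaptive scheme: crossover and mutation probabilities tied to the population's fitness spread, with a quadratic test objective standing in for the watershed and economic modules.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):                 # placeholder for the cost/pollution objective
    return -np.sum((x - 0.3) ** 2, axis=1)

def adaptive_rates(f):
    """Raise mutation when the population has converged (low fitness spread)."""
    spread = np.std(f) / (abs(np.mean(f)) + 1e-12)
    pm = np.clip(0.5 - spread, 0.01, 0.5)   # mutation grows as spread shrinks
    pc = np.clip(0.5 + spread, 0.5, 0.95)   # crossover grows with diversity
    return pc, pm

pop = rng.random((40, 5))
for gen in range(100):
    f = fitness(pop)
    pc, pm = adaptive_rates(f)
    parents = pop[np.argsort(f)[-20:]]                 # truncation selection
    i, j = rng.integers(0, 20, 40), rng.integers(0, 20, 40)
    alpha = rng.random((40, 5))
    kids = np.where(rng.random((40, 1)) < pc,
                    alpha * parents[i] + (1 - alpha) * parents[j],  # blend crossover
                    parents[i])
    mutate = rng.random(kids.shape) < pm
    kids[mutate] += rng.normal(0, 0.1, mutate.sum())
    pop = np.clip(kids, 0, 1)
print("best individual:", pop[np.argmax(fitness(pop))])
```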
Inviscid dynamics of a wet foam drop with monodisperse bubble size distribution
NASA Astrophysics Data System (ADS)
McDaniel, J. Gregory; Akhatov, Iskander; Holt, R. Glynn
2002-06-01
Motivated by recent experiments involving the acoustic levitation of foam drops, we develop a model for nonlinear oscillations of a spherical drop composed of monodisperse aqueous foam with void fraction below 0.1. The model conceptually divides a foam drop into many cells, each cell consisting of a spherical volume of liquid with a bubble at its center. By treating the liquid as incompressible and inviscid, a nonlinear equation is obtained for bubble motion due to a pressure applied at the outer radius of the liquid sphere. Upon linearizing this equation and connecting the cells at their outer radii, a wave equation is obtained with a dispersion relation for the sound waves in a bubbly liquid. For the spherical drop, this equation is solved by a normal mode expansion that yields the natural frequencies as functions of standard foam parameters. Numerical examples illustrate how the analysis may be used to extract foam parameters, such as void fraction and bubble radius, from the experimentally measured natural frequencies of a foam drop.
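As a rough companion to the abstract above, the sketch below uses Wood's low-frequency approximation for the sound speed in a bubbly liquid, together with the radial-mode frequencies f_n = n c / (2R) of a sphere with a pressure-release surface. This is a simplification of the paper's dispersion relation and normal-mode analysis, not a reproduction of it, and the drop radius is assumed.

```python
import numpy as np

def wood_speed(phi, rho_l=1000.0, c_l=1480.0, rho_g=1.2, c_g=340.0):
    """Effective mixture sound speed for gas void fraction phi (Wood's equation)."""
    rho = phi * rho_g + (1 - phi) * rho_l
    compressibility = phi / (rho_g * c_g**2) + (1 - phi) / (rho_l * c_l**2)
    return 1.0 / np.sqrt(rho * compressibility)

R_drop = 2e-3                     # foam drop radius [m] (assumed)
for phi in (0.01, 0.05, 0.10):
    c = wood_speed(phi)
    f1 = c / (2 * R_drop)         # fundamental radial mode [Hz]
    print(f"phi={phi:4.2f}  c={c:6.1f} m/s  f1={f1/1e3:6.1f} kHz")
```

The strong drop of the mixture sound speed with void fraction is what makes the measured natural frequencies informative about the foam parameters, which is the inversion idea the paper exploits.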
A New Method for Conceptual Modelling of Information Systems
NASA Astrophysics Data System (ADS)
Gustas, Remigijus; Gustiene, Prima
Service architecture is not necessarily bound to the technical aspects of information system development. It can be defined by using conceptual models that are independent of any implementation technology. Unfortunately, conventional information system analysis and design methods cover just a part of the modelling notations required for engineering service architectures. They do not provide effective support for maintaining semantic integrity between business processes and data. Service orientation is a paradigm that can be applied to conceptual modelling of information systems. The concept of service is rather well understood in different domains. It can be applied equally well for conceptualization of organizational and technical information system components. This chapter concentrates on analysis of the differences between service-oriented modelling and object-oriented modelling. The service-oriented method is used for semantic integration of the static and dynamic aspects of information systems.
Three-Dimensional Modeling of Aircraft High-Lift Components with Vehicle Sketch Pad
NASA Technical Reports Server (NTRS)
Olson, Erik D.
2016-01-01
Vehicle Sketch Pad (OpenVSP) is a parametric geometry modeler that has been used extensively for conceptual design studies of aircraft, including studies using higher-order analysis. OpenVSP can model flap and slat surfaces using simple shearing of the airfoil coordinates, which is an appropriate level of complexity for lower-order aerodynamic analysis methods. For three-dimensional analysis, however, there is not a built-in method for defining the high-lift components in OpenVSP in a realistic manner, or for controlling their complex motions in a parametric manner that is intuitive to the designer. This paper seeks instead to utilize OpenVSP's existing capabilities, and establish a set of best practices for modeling high-lift components at a level of complexity suitable for higher-order analysis methods. Techniques are described for modeling the flap and slat components as separate three-dimensional surfaces, and for controlling their motion using simple parameters defined in the local hinge-axis frame of reference. To demonstrate the methodology, an OpenVSP model for the Energy-Efficient Transport (EET) AR12 wind-tunnel model has been created, taking advantage of OpenVSP's Advanced Parameter Linking capability to translate the motions of the high-lift components from the hinge-axis coordinate system to a set of transformations in OpenVSP's frame of reference.
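As a hedged illustration of the hinge-axis parameterization described above (not actual OpenVSP Advanced Parameter Linking code), the sketch below deflects flap surface points about an arbitrary hinge line using Rodrigues' rotation formula; the hinge endpoints, surface points, and deflection angle are made up.

```python
import numpy as np

def rotate_about_axis(points, p0, p1, angle_deg):
    """Rotate Nx3 points about the line through p0 and p1 (right-hand rule)."""
    k = (p1 - p0) / np.linalg.norm(p1 - p0)        # unit hinge axis
    a = np.radians(angle_deg)
    v = points - p0                                # work relative to the hinge
    rotated = (v * np.cos(a)
               + np.cross(k, v) * np.sin(a)
               + np.outer(v @ k, k) * (1 - np.cos(a)))  # Rodrigues' formula
    return rotated + p0

hinge_root = np.array([2.0, 1.0, 0.0])             # assumed hinge endpoints [m]
hinge_tip  = np.array([2.3, 4.0, 0.1])
flap_pts = np.array([[2.4, 2.0, 0.0], [2.6, 2.5, 0.0]])
print(rotate_about_axis(flap_pts, hinge_root, hinge_tip, angle_deg=30.0))
```

Defining the deflection as a single angle about the hinge line, then mapping it to translations and rotations in the global frame, is the intuitive parameterization the paper advocates for the designer.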
Implementation of nursing conceptual models: observations of a multi-site research team.
Shea, H; Rogers, M; Ross, E; Tucker, D; Fitch, M; Smith, I
1989-01-01
The general acceptance by nursing of the nursing process as the methodology of practice enabled nurses to have a common grounding for practice, research and theory development in the 1970s. It has become clear, however, that the nursing process is just that--a process. What is sorely needed is the nursing content for that process, and consequently in the past 10 years nursing theorists have further developed their particular conceptual models (CMs). Three major teaching hospitals in Toronto have instituted a conceptual model of nursing as a basis of nursing practice. Mount Sinai Hospital has adopted Roy's adaptation model; Sunnybrook Medical Centre, King's goal attainment model; and Toronto General Hospital, Orem's self-care deficit theory model. All of these hospitals are affiliated through a series of cross appointments with the Faculty of Nursing at the University of Toronto. Two community hospitals, Mississauga and Scarborough General, have also adopted Orem's model and are related to the University through educational, community and interest groups. A group of researchers from these hospitals and the University of Toronto have proposed a collaborative project to determine what impact using a conceptual model will make on nursing practice. Discussions among the participants of this research group indicate that there are observations associated with instituting conceptual models that can be identified early in the process of implementation. These observations may be of assistance to others contemplating the implementation of conceptually based practice in their institution.
OWL references in ORM conceptual modelling
NASA Astrophysics Data System (ADS)
Matula, Jiri; Belunek, Roman; Hunka, Frantisek
2017-07-01
The Object Role Modelling (ORM) methodology is a fact-based type of conceptual modelling. The aim of the paper is to emphasize its close connection to OWL documents and the possible mutual cooperation between the two. The definition of entities or domain values is an indispensable part of the conceptual schema design procedure defined by the ORM methodology. Many of these entities are already defined in OWL documents. Therefore, it is not necessary to declare such entities again; instead, references to OWL documents can be utilized during the modelling of information systems.
NASA Astrophysics Data System (ADS)
Zeng, X.
2015-12-01
A large number of model executions are required to obtain alternative conceptual models' predictions and their posterior probabilities in Bayesian model averaging (BMA). The posterior model probability is estimated through each model's marginal likelihood and prior probability. The heavy computational burden hinders the implementation of BMA prediction, especially for elaborate marginal likelihood estimators. To overcome this burden, an adaptive sparse grid (SG) stochastic collocation method is used to build surrogates for alternative conceptual models through a numerical experiment with a synthetic groundwater model. BMA predictions depend on model posterior weights (or marginal likelihoods), and this study also evaluated four marginal likelihood estimators: the arithmetic mean estimator (AME), the harmonic mean estimator (HME), the stabilized harmonic mean estimator (SHME), and the thermodynamic integration estimator (TIE). The results demonstrate that TIE is accurate in estimating conceptual models' marginal likelihoods, and BMA-TIE has better predictive performance than the other BMA predictions. TIE is also highly stable: the marginal likelihoods repeatedly estimated by TIE show significantly less variability than those estimated by the other estimators. In addition, the SG surrogates are efficient in facilitating BMA predictions, especially for BMA-TIE. The number of model executions needed for building surrogates is 4.13%, 6.89%, 3.44%, and 0.43% of the model executions required by BMA-AME, BMA-HME, BMA-SHME, and BMA-TIE, respectively.
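To make the estimators concrete, here is a hedged toy comparison of AME, HME, and TIE on a conjugate normal model where the power posteriors can be sampled exactly (SHME is omitted for brevity). It illustrates the estimators only, not the groundwater model or the sparse-grid surrogates; all distributional choices are assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
sigma, tau = 1.0, 3.0                        # likelihood sd, prior sd (assumed)
y = rng.normal(0.7, sigma, size=20)          # synthetic data
n, ybar = len(y), y.mean()

def loglik(theta):
    """Log-likelihood of the full dataset at each theta (vectorized)."""
    return norm.logpdf(y[:, None], theta, sigma).sum(axis=0)

def sample_power_posterior(t, m=4000):
    """Exact draws from p(theta)*L(theta)^t for this conjugate normal model."""
    prec = 1 / tau**2 + t * n / sigma**2
    return rng.normal((t * n * ybar / sigma**2) / prec, 1 / np.sqrt(prec), m)

ame = np.log(np.mean(np.exp(loglik(rng.normal(0, tau, 20000)))))   # prior samples
hme = -np.log(np.mean(np.exp(-loglik(sample_power_posterior(1.0)))))
ts = np.linspace(0, 1, 21) ** 3              # temperature ladder, dense near t=0
vals = np.array([loglik(sample_power_posterior(t)).mean() for t in ts])
tie = float(np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(ts)))    # trapezoid rule
print(f"log marginal likelihood:  AME={ame:.2f}  HME={hme:.2f}  TIE={tie:.2f}")
```

The TIE path-sampling identity integrates the expected log-likelihood under the power posterior from t = 0 to 1, which is why it tends to be far more stable than the harmonic mean, consistent with the study's findings.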
NASA Astrophysics Data System (ADS)
Karlsen, R. H.; Smits, F. J. C.; Stuyfzand, P. J.; Olsthoorn, T. N.; van Breukelen, B. M.
2012-08-01
This article describes the post audit and inverse modeling of a 1-D forward reactive transport model. The model simulates the changes in water quality following artificial recharge of pre-treated water from the river Rhine in the Amsterdam Water Supply Dunes using the PHREEQC-2 numerical code. One observation dataset is used for model calibration, and another dataset for validation of model predictions. The total simulation time of the model is 50 years, from 1957 to 2007, with recharge composition varying on a monthly basis, and the post audit is performed 26 years after the former model simulation period. The post audit revealed that the original model could reasonably predict conservative transport and kinetic redox reactions (oxygen and nitrate reduction coupled to the oxidation of soil organic carbon), but showed discrepancies in the simulation of cation exchange. Conceptualizations of the former model were inadequate to accurately simulate water quality changes controlled by cation exchange, especially concerning the breakthrough of potassium and magnesium fronts. Changes in conceptualization and model design, including the addition of five flow paths, to a total of six, and the use of parameter estimation software (PEST), resulted in a better model-to-measurement fit and system representation. No unique parameter set could be found for the model, primarily due to high parameter correlations, and an assessment of the predictive error was made using a calibration-constrained Monte Carlo method, and evaluated against field observations. The predictive error was found to be low for Na+ and Ca2+, except for greater travel times, while the K+ and Mg2+ error was restricted to the exchange fronts at some of the flow paths. Optimized cation exchange coefficients were relatively high, especially for potassium, but still within the observed range in literature. The exchange coefficient for potassium agrees with strong fixation on illite, a main clay mineral in the area. Optimized CEC values were systematically lower than clay and organic matter contents indicated, possibly reflecting preferential flow of groundwater through the more permeable but less reactive aquifer parts. Whereas the artificial recharge initially acted as an intrusion of relatively saline water triggering Na+ for Ca2+ exchange, further increasing total hardness of the recharged water, the gradual long-term reduction in salinity of the river Rhine since the mid 1970s has shifted to an intrusion of fresher water causing Ca2+ for Na+ exchange. As a result, seasonal and longer term reversal of the initial cation exchange processes was observed, adding to the general long-term reduction in total hardness of the recharged Rhine water.
Saengsiri, Aem-orn; Hacker, Eileen Danaher
2015-01-01
Health-related quality of life is an important clinical outcome to measure in patients with cardiovascular disease. International nurse researchers with limited English skills and novice cardiovascular nurse researchers face numerous challenges when conducting quality of life research because of the conceptual ambiguity of the construct and subsequent operationalization issues as well as difficulty identifying conceptual models to guide their quality of life research. The overall purpose of this article was to provide guidance to cardiovascular nurse researchers (using Thailand as an example) who are interested in examining quality of life in their native country but lack access to quality of life conceptual models and instruments because of language barriers. This article will examine definitions of health-related quality of life, selection of a conceptual model to guide quality of life research, use of the conceptual model to guide selection and measurement of variables, and translation of instruments when reliable and valid instruments are not available in the native language. Ferrans' definition of quality of life and the Wilson and Cleary Revised Model of Patient Outcomes were selected to guide the research. Selection of variables/instruments flowed directly from the conceptualization of constructs identified in this model. Our study, "Examining HRQOL in Thai People With Coronary Artery Disease Following Percutaneous Coronary Intervention," serves as an exemplar to illustrate the conceptual and operational challenges associated with conducting quality of life research in Thailand. The ultimate goal of cardiovascular nursing is to help patients achieve their optimal quality of life. Thai clinicians implementing quality of life assessment in clinical practice face similar conceptual and operationalization issues, especially when using instruments that are not well established or easily interpreted. Although quality of life assessment in clinical practice improves communication between patients and healthcare providers, clear guidelines for making changes to treatment strategies based on changes in quality of life must be established.
a Bayesian Synthesis of Predictions from Different Models for Setting Water Quality Criteria
NASA Astrophysics Data System (ADS)
Arhonditsis, G. B.; Ecological Modelling Laboratory
2011-12-01
Skeptical views of the scientific value of modelling argue that there is no true model of an ecological system, but rather several adequate descriptions of different conceptual basis and structure. In this regard, rather than picking the single "best-fit" model to predict future system responses, we can use Bayesian model averaging to synthesize the forecasts from different models. Hence, by acknowledging that models from different areas of the complexity spectrum have different strengths and weaknesses, the Bayesian model averaging is an appealing approach to improve the predictive capacity and to overcome the ambiguity surrounding the model selection or the risk of basing ecological forecasts on a single model. Our study addresses this question using a complex ecological model, developed by Ramin et al. (2011; Environ Modell Softw 26, 337-353) to guide the water quality criteria setting process in the Hamilton Harbour (Ontario, Canada), along with a simpler plankton model that considers the interplay among phosphate, detritus, and generic phytoplankton and zooplankton state variables. This simple approach is more easily subjected to detailed sensitivity analysis and also has the advantage of fewer unconstrained parameters. Using Markov Chain Monte Carlo simulations, we calculate the relative mean standard error to assess the posterior support of the two models from the existing data. Predictions from the two models are then combined using the respective standard error estimates as weights in a weighted model average. The model averaging approach is used to examine the robustness of predictive statements made from our earlier work regarding the response of Hamilton Harbour to the different nutrient loading reduction strategies. The two eutrophication models are then used in conjunction with the SPAtially Referenced Regressions On Watershed attributes (SPARROW) watershed model. The Bayesian nature of our work is used: (i) to alleviate problems of spatiotemporal resolution mismatch between watershed and receiving waterbody models; and (ii) to overcome the conceptual or scale misalignment between processes of interest and supporting information. The proposed Bayesian approach provides an effective means of empirically estimating the relation between in-stream measurements of nutrient fluxes and the sources/sinks of nutrients within the watershed, while explicitly accounting for the uncertainty associated with the existing knowledge from the system along with the different types of spatial correlation typically underlying the parameter estimation of watershed models. Our modelling exercise offers the first estimates of the export coefficients and the delivery rates from the different subcatchments and thus generates testable hypotheses regarding the nutrient export "hot spots" in the studied watershed. Finally, we conduct modeling experiments that evaluate the potential improvement of the model parameter estimates and the decrease of the predictive uncertainty, if the uncertainty associated with the contemporary nutrient loading estimates is reduced. The lessons learned from this study will contribute towards the development of integrated modelling frameworks.
Thorndahl, S; Willems, P
2008-01-01
Failure of urban drainage systems may occur due to surcharge or flooding at specific manholes in the system, or due to overflows from combined sewer systems to receiving waters. To quantify the probability or return period of failure, standard approaches make use of the simulation of design storms or long historical rainfall series in a hydrodynamic model of the urban drainage system. In this paper, an alternative probabilistic method is investigated: the first-order reliability method (FORM). To apply this method, a long rainfall time series was divided into rainstorms (rain events), and each rainstorm was conceptualized as a synthetic rainfall hyetograph with a Gaussian shape parameterized by rainstorm depth, duration and peak intensity. Probability distributions were calibrated for these three parameters and used as the basis for the failure probability estimation, together with a hydrodynamic simulation model to determine the failure conditions for each parameter set. The method takes into account the uncertainties involved in the rainstorm parameterization. Comparison is made between the failure probability results of the FORM method, the standard method using long-term simulations, and alternative methods based on random sampling (Monte Carlo direct sampling and importance sampling). It is concluded that, without crucial influence on the modelling accuracy, FORM is very applicable as an alternative to traditional long-term simulations of urban drainage systems.
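A hedged sketch of the FORM idea described above: the rainstorm is reduced to (depth, duration, peak intensity) with assumed lognormal distributions, a cheap algebraic surrogate stands in for the hydrodynamic model, and the Hasofer-Lind reliability index is found by the standard HLRF iteration in standard-normal space, cross-checked against direct Monte Carlo.

```python
import numpy as np
from scipy.stats import norm

# Assumed lognormal rainstorm parameters: depth [mm], duration [h], peak [mm/h]
mu, sd = np.array([2.3, 1.0, 2.0]), np.array([0.5, 0.6, 0.4])

def g(u):
    """Limit state in standard-normal space; failure when g < 0."""
    depth, dur, peak = np.exp(mu + sd * u)          # map u to rainstorm parameters
    surcharge = 0.08 * depth * peak / np.sqrt(dur)  # toy stand-in for the hydro model
    return 15.0 - surcharge                         # assumed system capacity

def grad(u, h=1e-5):
    e = np.eye(3)
    return np.array([(g(u + h * e[i]) - g(u - h * e[i])) / (2 * h) for i in range(3)])

u = np.zeros(3)
for _ in range(100):                                # HLRF fixed-point iteration
    gv, gg = g(u), grad(u)
    u_new = (gg @ u - gv) * gg / (gg @ gg)
    if np.linalg.norm(u_new - u) < 1e-10:
        break
    u = u_new
beta = np.linalg.norm(u)                            # Hasofer-Lind reliability index
print(f"beta = {beta:.2f}, Pf(FORM) = {norm.cdf(-beta):.3e}")

samples = np.random.default_rng(0).standard_normal((50000, 3))
print("Pf(Monte Carlo) =", np.mean([g(s) < 0 for s in samples]))
```

Because this toy limit state is linear in u after a log transform, FORM and Monte Carlo agree closely here; the paper's point is that comparable agreement can hold for real hydrodynamic models at a fraction of the simulation cost.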
Surface-water hydrology and runoff simulations for three basins in Pierce County, Washington
Mastin, M.C.
1996-01-01
The surface-water hydrology in Clear, Clarks, and Clover Creek Basins in central Pierce County, Washington, is described with a conceptual model of the runoff processes and then simulated with the Hydrological Simulation Program-FORTRAN (HSPF), a continuous, deterministic hydrologic model. The study area is currently undergoing a rapid conversion of rural, undeveloped land to urban and suburban land that often changes the flow characteristics of the streams that drain these lands. The complex interactions of land cover, climate, soils, topography, channel characteristics, and ground- water flow patterns determine the surface-water hydrology of the study area and require a complex numerical model to assess the impact of urbanization on streamflows. The U.S. Geological Survey completed this investigation in cooperation with the Storm Drainage and Surface Water Management Utility within the Pierce County Department of Public Works to describe the important rainfall-runoff processes within the study area and to develop a simulation model to be used as a tool to predict changes in runoff characteristics resulting from changes in land use. The conceptual model, a qualitative representation of the study basins, links the physical characteristics to the runoff process of the study basins. The model incorporates 11 generalizations identified by the investigation, eight of which describe runoff from hillslopes, and three that account for the effects of channel characteristics and ground-water flow patterns on runoff. Stream discharge was measured at 28 sites and precipitation was measured at six sites for 3 years in two overlapping phases during the period of October 1989 through September 1992 to calibrate and validate the simulation model. Comparison of rainfall data from October 1989 through September 1992 shows the data-collection period beginning with 2 wet water years followed by the relatively dry 1992 water year. Runoff was simulated with two basin models-the Clover Creek Basin model and the Clear-Clarks Basin model-by incorporating the generalizations of the conceptual model into the construction of two HSPF numerical models. Initially, the process-related parameters for runoff from glacial-till hillslopes were calibrated with numerical models for three catchment sites and one headwater basin where streamflows were continuously measured and little or no influence from ground water, channel storage, or channel losses affected runoff. At one of the catchments soil moisture was monitored and compared with simulated soil moisture. The values for these parameters were used in the basin models. Basin models were calibrated to the first year of observed streamflow data by adjusting other parameters in the numerical model that simulated channel losses, simulated channel storage in a few of the reaches in the headwaters and in the floodplain of the main stem of Clover Creek, and simulated volume and outflow of the ground-water reservoir representing the regional ground-water aquifers. The models were run for a second year without any adjustments, and simulated results were compared with observed results as a measure of validation of the models. The investigation showed the importance of defining the ground-water flow boundaries and demonstrated a simple method of simulating the influence of the regional ground-water aquifer on streamflows. 
In the Clover Creek Basin model, ground-water flow boundaries were used to define subbasins containing mostly glacial outwash soils and not containing any surface drainage channels. In the Clear-Clarks Basin model, ground-water flow boundaries outlined a recharge area outside the surface-water boundaries of the basin that was incorporated into the model in order to provide sufficient water to balance simulated ground-water outflows to the creeks. A simulated ground-water reservoir used to represent regional ground-water flow processes successfully provided the proper water balance of inflows and outflows.
Chemical demulsification of petroleum emulsions using oil-soluble demulsifiers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krawczyk, M.A.; Wasan, D.T.; Shetty, C.S.
1991-02-01
This paper investigates the factors affecting the coalescence and interfacial behavior of water-in-crude-oil emulsions in the presence of oil-soluble demulsifiers. The emulsion-breaking characteristics and interfacial properties of East Texas Crude and a model system were compared. The variation of interfacial tension with demulsifier concentration for the model system was ascertained by measuring the interfacial tensions between the oil and water phase. Interfacial activity, adsorption kinetics, and partitioning were shown to be the most important parameters governing demulsifier performance. A conceptual model of the drop-drop coalescence process in demulsification was presented which indicates that the interfacial activity of the demulsifier must be high enough to suppress the interfacial tension gradient. This accelerates the rate of film drainage, thus promoting coalescence.
Mirus, B.B.; Perkins, K.S.; Nimmo, J.R.; Singha, K.
2009-01-01
To understand their relation to pedogenic development, soil hydraulic properties in the Mojave Desert were investigated for three deposit types: (i) recently deposited sediments in an active wash, (ii) a soil of early Holocene age, and (iii) a highly developed soil of late Pleistocene age. Effective parameter values were estimated for a simplified model based on Richards' equation using a flow simulator (VS2D), an inverse algorithm (UCODE-2005), and matric pressure and water content data from three ponded infiltration experiments. The inverse problem framework was designed to account for the effects of subsurface lateral spreading of infiltrated water. Although none of the inverse problems converged on a unique, best-fit parameter set, a minimum standard error of regression was reached for each deposit type. Parameter sets from the numerous inversions that reached the minimum error were used to develop probability distributions for each parameter and deposit type. Electrical resistance imaging obtained for two of the three infiltration experiments was used to independently test flow model performance. Simulations for the active wash and Holocene soil successfully depicted the lateral and vertical fluxes. Simulations of the more pedogenically developed Pleistocene soil did not adequately replicate the observed flow processes, which would require a more complex conceptual model to include smaller scale heterogeneities. The inverse-modeling results, however, indicate that with increasing age, the steep slope of the soil water retention curve shifts toward more negative matric pressures. Assigning effective soil hydraulic properties based on soil age provides a promising framework for future development of regional-scale models of soil moisture dynamics in arid environments for land-management applications. © Soil Science Society of America.
Scrutinizing UML Activity Diagrams
NASA Astrophysics Data System (ADS)
Al-Fedaghi, Sabah
Building an information system involves two processes: conceptual modeling of the “real world domain” and designing the software system. Object-oriented methods and languages (e.g., UML) are typically used for describing the software system. For the system analysis process that produces the conceptual description, object-oriented techniques or semantics extensions are utilized. Specifically, UML activity diagrams are the “flow charts” of object-oriented conceptualization tools. This chapter proposes an alternative to UML activity diagrams through the development of a conceptual modeling methodology based on the notion of flow.
Pursuing realistic hydrologic model under SUPERFLEX framework in a semi-humid catchment in China
NASA Astrophysics Data System (ADS)
Wei, Lingna; Savenije, Hubert H. G.; Gao, Hongkai; Chen, Xi
2016-04-01
Model realism is perpetually pursued by hydrologists for flood and drought prediction, integrated water resources management, and decision support for water security. "Physically based" distributed hydrologic models are being developed rapidly, but they also face considerable challenges, for instance long computation times, low efficiency, and parameter uncertainty. This study tested, step-wise, four conceptual hydrologic models under the SUPERFLEX framework in a small semi-humid catchment in the southern Huai River basin of China. The original lumped FLEXL hypothesizes a model structure of four reservoirs representing canopy interception, the unsaturated zone, fast and slow subsurface flow components, and base flow storage. Considering the spatially uneven rainfall, the second model (FLEXD) was developed with the same parameter set applied to the units controlled by different rain gauges. To reveal the effect of topography, the terrain descriptor height above the nearest drainage (HAND), combined with slope, was applied to classify the experimental catchment into two landscapes. The third model (FLEXTOPO) then builds different model blocks reflecting the dominant hydrologic process under each topographical condition. The fourth model, FLEXTOPOD, integrates the parallel framework of FLEXTOPO in the four gauge-controlled units to capture the spatial variability of rainfall patterns and topographic features. Through pairwise comparison, our results suggest that: (1) the semi-distributed models (FLEXD and FLEXTOPOD), which take the spatial heterogeneity of precipitation into account, improved model performance with a parsimonious parameter set, and (2) a hydrologic model architecture flexible enough to reflect the perceived dominant hydrologic processes can incorporate the local terrain circumstances of each landscape. The modeling choices thereby correspond to catchment behaviour and come closer to "reality". The presented methodology regards a hydrologic model as a tool to test hypotheses and deepen our understanding of hydrologic processes, which will help improve modeling realism.
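A minimal lumped sketch in the spirit of the FLEXL structure described above (interception, unsaturated-zone partitioning, parallel fast/slow linear reservoirs); the constitutive choices and parameter values are assumptions, not the authors' calibrated model.

```python
import numpy as np

def flex_like(P, Ep, pars):
    """Daily time step; returns simulated streamflow [mm/d]."""
    imax, sumax, beta, kf, ks, D = pars
    Si = Su = Sf = Ss = 0.0
    Q = []
    for p, ep in zip(P, Ep):
        inflow = Si + p
        Si = min(inflow, imax)                       # canopy interception storage
        pe = inflow - Si                             # throughfall
        Si = max(Si - ep, 0.0)                       # interception evaporation
        cr = 1 - (1 - min(Su / sumax, 1.0)) ** beta  # saturation-excess coefficient
        ru = pe * cr                                 # effective rainfall to runoff
        Su = min(Su + pe - ru, sumax)
        Su = max(Su - ep * Su / sumax, 0.0)          # transpiration (very crude)
        Sf += (1 - D) * ru                           # fast storage
        Ss += D * ru                                 # recharge to slow storage
        qf, qs = kf * Sf, ks * Ss                    # linear reservoir outflows
        Sf -= qf
        Ss -= qs
        Q.append(qf + qs)
    return np.array(Q)

rng = np.random.default_rng(3)
P = rng.gamma(0.6, 6.0, 365)                         # synthetic daily rainfall [mm]
Ep = np.full(365, 2.0)                               # potential evaporation [mm/d]
Q = flex_like(P, Ep, pars=(2.0, 100.0, 2.0, 0.3, 0.01, 0.4))
print("annual runoff ratio:", round(Q.sum() / P.sum(), 2))
```

The semi-distributed variants in the paper essentially run structures like this one in parallel per rain-gauge unit or per HAND-derived landscape, sharing one parameter set.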
The conceptualization model problem—surprise
NASA Astrophysics Data System (ADS)
Bredehoeft, John
2005-03-01
The foundation of model analysis is the conceptual model. Surprise is defined as new data that renders the prevailing conceptual model invalid; as defined here it represents a paradigm shift. Limited empirical data indicate that surprises occur in 20-30% of model analyses. These data suggest that groundwater analysts have difficulty selecting the appropriate conceptual model. There is no ready remedy to the conceptual model problem other than (1) to collect as much data as is feasible, using all applicable methods—a complementary data collection methodology can lead to new information that changes the prevailing conceptual model, and (2) for the analyst to remain open to the fact that the conceptual model can change dramatically as more information is collected. In the final analysis, the hydrogeologist makes a subjective decision on the appropriate conceptual model. The conceptualization problem does not render models unusable. The problem introduces an uncertainty that often is not widely recognized. Conceptual model uncertainty is exacerbated in making long-term predictions of system performance.
How far can we go in hydrological modelling without any knowledge of runoff formation processes?
NASA Astrophysics Data System (ADS)
Ayzel, Georgy
2016-04-01
Hydrological modelling has been a challenging scientific issue for the last 50 years, and it will remain one because of the great complexity of runoff formation processes across spatio-temporal scales. An enormous number of modelling-related papers are submitted to top-ranked journals every year, but in this race for publication speed authors pay increasing attention to the models and data themselves rather than to the underlying watershed processes. The great community effort in sharing free and open-source models, together with the high availability of hydrometeorological data sources, has shifted the paradigm of hydrological science in a technically oriented direction. In developing countries this shift is even clearer, owing to the absence of field studies and the obligatory requirement of practical significance for research supported by government funds. As a result, the discipline of hydrological modelling is closer to the aim of achieving high Nash-Sutcliffe efficiency (NSE) than to understanding watershed processes. The lumped physically based land-surface model SWAP (Soil Water - Atmosphere - Plants) and the SCE-UA technique (Shuffled Complex Evolution method developed at The University of Arizona) for robust model parameter search were used for runoff modelling of 323 MOPEX watersheds. No special data analysis or expert knowledge-based decisions were performed. The median NSE is 0.652, and 90% of the watersheds have an efficiency greater than 0.5. Thus, satisfactory modelling results were obtained without any information on the particular features of each watershed. To reinforce this conclusion, we built a conceptual rainfall-runoff model based on decision trees and adaptive boosting machine learning algorithms for one small watershed in the USA, again without any special data analysis or feature engineering. The results demonstrate high predictive power for both the learning and testing periods (NSE > 0.95). The way we obtained these results is clear and direct: we used open-source physically based and conceptual models coupled with open-access data. However, such results do not make a significant contribution to understanding hydrological cycle processes. Not hydrological modelling itself, but the reason why and for what we do it, is the most challenging issue for future research.
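A hedged sketch of the two ingredients named above, the NSE score and a boosted-regression-tree rainfall-runoff model, on synthetic data standing in for the MOPEX records; the lag structure and hyperparameters are arbitrary choices, not the author's setup.

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.tree import DecisionTreeRegressor

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the observed mean."""
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - np.mean(obs)) ** 2)

rng = np.random.default_rng(4)
rain = rng.gamma(0.5, 8.0, 3000)
flow = (np.convolve(rain, [0.1, 0.3, 0.25, 0.15, 0.1], mode="same")
        + rng.normal(0, 0.3, 3000))                 # toy rainfall-runoff relation

lags = np.column_stack([np.roll(rain, k) for k in range(6)])[6:]  # lagged rainfall
X_tr, X_te = lags[:2000], lags[2000:]
y_tr, y_te = flow[6:2006], flow[2006:]

model = AdaBoostRegressor(DecisionTreeRegressor(max_depth=6),
                          n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
print("NSE (test):", round(nse(y_te, model.predict(X_te)), 3))
```

Such a model can score very well on NSE while saying nothing about runoff formation processes, which is precisely the tension the abstract raises.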
A Multialgorithm Approach to Land Surface Modeling of Suspended Sediment in the Colorado Front Range
Stewart, J. R.; Kasprzyk, J. R.; Rajagopalan, B.; Minear, J. T.; Raseman, W. J.
2017-01-01
A new paradigm of simulating suspended sediment load (SSL) with a Land Surface Model (LSM) is presented here. Five erosion and SSL algorithms were applied within a common LSM framework to quantify uncertainties and evaluate predictability in two steep, forested catchments (>1,000 km2). The algorithms were chosen from among widely used sediment models, including empirically based: monovariate rating curve (MRC) and the Modified Universal Soil Loss Equation (MUSLE); stochastically based: the Load Estimator (LOADEST); conceptually based: the Hydrologic Simulation Program—Fortran (HSPF); and physically based: the Distributed Hydrology Soil Vegetation Model (DHSVM). The algorithms were driven by the hydrologic fluxes and meteorological inputs generated from the Variable Infiltration Capacity (VIC) LSM. A multiobjective calibration was applied to each algorithm and optimized parameter sets were validated over an excluded period, as well as in a transfer experiment to a nearby catchment to explore parameter robustness. Algorithm performance showed consistent decreases when parameter sets were applied to periods with greatly differing SSL variability relative to the calibration period. Of interest was a joint calibration of all sediment algorithm and streamflow parameters simultaneously, from which trade-offs between streamflow performance and partitioning of runoff and base flow to optimize SSL timing were noted, decreasing the flexibility and robustness of the streamflow to adapt to different time periods. Parameter transferability to another catchment was most successful in more process-oriented algorithms, the HSPF and the DHSVM. This first-of-its-kind multialgorithm sediment scheme offers a unique capability to portray acute episodic loading while quantifying trade-offs and uncertainties across a range of algorithm structures. PMID:29399268
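The simplest algorithm in the ensemble, the monovariate rating curve, admits a compact sketch: SSL = a * Q**b fitted as a log-log regression, here with Duan's smearing estimator for the retransformation bias. The data are synthetic placeholders for the VIC-driven fluxes, and the bias-correction choice is an assumption, not necessarily the authors'.

```python
import numpy as np

rng = np.random.default_rng(5)
Q = rng.lognormal(3.0, 0.8, 500)                   # discharge [m3/s] (synthetic)
ssl_true = 0.05 * Q ** 1.8
ssl_obs = ssl_true * rng.lognormal(0.0, 0.4, 500)  # multiplicative noise

b, log_a = np.polyfit(np.log(Q), np.log(ssl_obs), 1)
resid = np.log(ssl_obs) - (log_a + b * np.log(Q))
smear = np.mean(np.exp(resid))                     # Duan's smearing estimator
predict = lambda q: smear * np.exp(log_a) * q ** b
print(f"a={np.exp(log_a):.3f}  b={b:.2f}  smearing={smear:.3f}")
print("predicted SSL at Q=100:", round(predict(100.0), 1))
```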
Probabilistic margin evaluation on accidental transients for the ASTRID reactor project
NASA Astrophysics Data System (ADS)
Marquès, Michel
2014-06-01
ASTRID is a technological demonstrator of a Sodium cooled Fast Reactor (SFR) under development. The conceptual design studies are being conducted in accordance with the Generation IV reactor objectives, particularly in terms of improving safety. For the hypothetical events belonging to the accidental category "severe accident prevention situations", which have a very low frequency of occurrence, the safety demonstration is no longer based on a deterministic demonstration with conservative assumptions on models and parameters but on a "Best-Estimate Plus Uncertainty" (BEPU) approach. This BEPU approach is presented in this paper for an Unprotected Loss-of-Flow (ULOF) event. The Best-Estimate (BE) analysis of this ULOF transient is performed with the CATHARE2 code, which is the French reference system code for SFR applications. The objective of the BEPU analysis is twofold: first, evaluate the safety margin to sodium boiling taking into account the uncertainties on the input parameters of the CATHARE2 code (twenty-two uncertain input parameters have been identified, which can be classified into five groups: reactor power, accident management, pump characteristics, reactivity coefficients, thermal parameters and head losses); secondly, quantify the contribution of each input uncertainty to the overall uncertainty of the safety margins, in order to refocus R&D efforts on the most influential factors. This paper focuses on the methodological aspects of the evaluation of the safety margin. At least for the preliminary phase of the project (conceptual design), a probabilistic criterion has been fixed in the context of this BEPU analysis; this criterion is the value of the margin to sodium boiling which has a 95% probability of being exceeded, obtained with a confidence level of 95% (i.e., the M5,95 percentile of the margin distribution). This paper presents two methods used to assess this percentile: the Wilks method and the Bootstrap method; the effectiveness of the two methods is compared on the basis of 500 simulations performed with the CATHARE2 code. We conclude that, with only 100 simulations performed with the CATHARE2 code, which is a workable number of simulations in the conceptual design phase of the ASTRID project where the models and hypotheses are often modified, it is best to use the bootstrap method to evaluate the M5,95 percentile of the margin to sodium boiling, which will provide a slightly conservative result. On the other hand, in order to obtain an accurate estimation of the M5,95 percentile, for the safety report for example, it will be necessary to perform at least 300 simulations with the CATHARE2 code. In this case, both methods (Wilks and Bootstrap) would give equivalent results.
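The two percentile estimators compared in the paper can be sketched as follows; the "margin" sample is a synthetic placeholder for CATHARE2 runs, and the bootstrap variant shown (a percentile bootstrap on the empirical 5th percentile) is one plausible reading of the method, not necessarily the authors' exact formulation.

```python
import numpy as np
from scipy.stats import binom

def wilks_order(n, p=0.05, conf=0.95):
    """Largest order statistic m (1-based) whose value is a conservative
    lower bound on the p-quantile with the requested confidence."""
    for m in range(n, 0, -1):
        if 1.0 - binom.cdf(m - 1, n, p) >= conf:   # P[Binom(n,p) >= m]
            return m
    return None                                    # n too small (n < 59 here)

rng = np.random.default_rng(6)
margin = rng.normal(50.0, 12.0, 100)               # fake margins to boiling [K]

m = wilks_order(len(margin))
wilks_bound = np.sort(margin)[m - 1]               # m-th smallest sample

boot = np.array([np.percentile(rng.choice(margin, margin.size), 5)
                 for _ in range(5000)])
boot_bound = np.percentile(boot, 5)                # lower 95% bootstrap bound

print(f"Wilks (m={m}): {wilks_bound:.1f}   bootstrap: {boot_bound:.1f}")
```

With 100 samples the Wilks rule selects the second-smallest value, which is typically quite conservative; the bootstrap bound sits closer to the true 5th percentile, which matches the trade-off the paper describes.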
Payn, Robert A.; Hall, Robert O Jr.; Kennedy, Theodore A.; Poole, Geoff C; Marshall, Lucy A.
2017-01-01
Conventional methods for estimating whole-stream metabolic rates from measured dissolved oxygen dynamics do not account for the variation in solute transport times created by dynamic flow conditions. Changes in flow at hourly time scales are common downstream of hydroelectric dams (i.e. hydropeaking), and hydrologic limitations of conventional metabolic models have resulted in a poor understanding of the controls on biological production in these highly managed river ecosystems. To overcome these limitations, we coupled a two-station metabolic model of dissolved oxygen dynamics with a hydrologic river routing model. We designed calibration and parameter estimation tools to infer values for hydrologic and metabolic parameters based on time series of water quality data, achieving the ultimate goal of estimating whole-river gross primary production and ecosystem respiration during dynamic flow conditions. Our case study data for model design and calibration were collected in the tailwater of Glen Canyon Dam (Arizona, USA), a large hydropower facility where the mean discharge was 325 m3 s-1 and the average daily coefficient of variation of flow was 0.17 (i.e. the hydropeaking index averaged from 2006 to 2016). We demonstrate the coupled model's conceptual consistency with conventional models during steady flow conditions, and illustrate the potential bias in metabolism estimates with conventional models during unsteady flow conditions. This effort contributes an approach to solute transport modeling and parameter estimation that allows study of whole-ecosystem metabolic regimes across a more diverse range of hydrologic conditions commonly encountered in streams and rivers.
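The hydropeaking index quoted above (the daily coefficient of variation of flow, averaged over the record) is straightforward to compute from a discharge time series. A minimal sketch, using a synthetic 15-minute record with a daily peaking cycle rather than the Glen Canyon data:

```python
import numpy as np
import pandas as pd

# Hypothetical 15-minute discharge record (m3/s) with a daily peaking cycle.
idx = pd.date_range("2016-01-01", periods=96 * 30, freq="15min")
q = pd.Series(325 + 80 * np.sin(2 * np.pi * np.arange(idx.size) / 96), index=idx)

# Hydropeaking index: within-day std/mean, averaged across days.
daily = q.resample("D")
hp_index = (daily.std() / daily.mean()).mean()
print(f"hydropeaking index: {hp_index:.2f}")
```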
1990-10-03
4.1. Mapping the Conceptual Model to the Implementation; 4.2. Overview of ... browser-editor application. Finally, appendix A provides a detailed description of the AdaKNET conceptual model; users of AdaKNET should fami... provide a brief summary of the semantics of the underlying conceptual model implemented by AdaKNET, use of the AdaKNET ADT will require a more thorough
Constructing a Conceptual Model Linking Drivers and Ecosystem Services in Piedmont Streams
2011-04-01
to the Virginia-Maryland border and is bound by the Appalachian Mountains and Blue Ridge to the northwest and the Atlantic Coastal Plain to the south...demand on freshwater ecosystem services, and a growing appreciation for the value of functioning ecosystems, the Appalachian Piedmont has developed a...the model and how it can be adapted and applied for specific projects. A FRAMEWORK FOR CONCEPTUAL MODELING The general approach to conceptual
LaDeau, Shannon L; Glass, Gregory E; Hobbs, N Thompson; Latimer, Andrew; Ostfeld, Richard S
2011-07-01
Ecologists worldwide are challenged to contribute solutions to urgent and pressing environmental problems by forecasting how populations, communities, and ecosystems will respond to global change. Rising to this challenge requires organizing ecological information derived from diverse sources and formally assimilating data with models of ecological processes. The study of infectious disease has depended on strategies for integrating patterns of observed disease incidence with mechanistic process models since John Snow first mapped cholera cases around a London water pump in 1854. Still, zoonotic and vector-borne diseases increasingly affect human populations, and methods used to successfully characterize directly transmitted diseases are often insufficient. We use four case studies to demonstrate that advances in disease forecasting require better understanding of zoonotic host and vector populations, as well as of the dynamics that facilitate pathogen amplification and disease spillover into humans. In each case study, this goal is complicated by limited data, spatiotemporal variability in pathogen transmission and impact, and often, insufficient biological understanding. We present a conceptual framework for data-model fusion in infectious disease research that addresses these fundamental challenges using a hierarchical state-space structure to (1) integrate multiple data sources and spatial scales to inform latent parameters, (2) partition uncertainty in process and observation models, and (3) explicitly build upon existing ecological and epidemiological understanding. Given the constraints inherent in the study of infectious disease and the urgent need for progress, fusion of data and expertise via this type of conceptual framework should prove an indispensable tool.
NASA Technical Reports Server (NTRS)
Chwalowski, Pawel; Samareh, Jamshid A.; Horta, Lucas G.; Piatak, David J.; McGowan, Anna-Maria R.
2009-01-01
The conceptual and preliminary design processes for aircraft with large shape changes are generally difficult and time-consuming, and the processes are often customized for a specific shape change concept to streamline the vehicle design effort. Accordingly, several existing reports show excellent results of assessing a particular shape change concept or perturbations of a concept. The goal of the current effort was to develop a multidisciplinary analysis tool and process that would enable an aircraft designer to assess several very different morphing concepts early in the design phase and yet obtain second-order performance results so that design decisions can be made with better confidence. The approach uses an efficient parametric model formulation that allows automatic model generation for systems undergoing radical shape changes as a function of aerodynamic parameters, geometry parameters, and shape change parameters. In contrast to other more self-contained approaches, the approach utilizes off-the-shelf analysis modules to reduce development time and to make it accessible to many users. Because the analysis is loosely coupled, discipline modules like a multibody code can be easily swapped for other modules with similar capabilities. One of the advantages of this loosely coupled system is the ability to use the medium- to high-fidelity tools early in the design stages when the information can significantly influence and improve overall vehicle design. Data transfer among the analysis modules is based on an accurate and automated general purpose data transfer tool. In general, setup time for the integrated system presented in this paper is 2-4 days for simple shape change concepts and 1-2 weeks for more mechanically complicated concepts. Some of the key elements briefly described in the paper include parametric model development, aerodynamic database generation, multibody analysis, and the required software modules as well as examples for a telescoping wing, a folding wing, and a bat-like wing. The paper also includes the verification of a medium-fidelity aerodynamic tool used for the aerodynamic database generation with a steady and unsteady high-fidelity CFD analysis tool for a folding wing example.
ERIC Educational Resources Information Center
Vasilenko, Sara A.; Lefkowitz, Eva S.; Welsh, Deborah P.
2014-01-01
Although research has increasingly emphasized how adolescent sexual behavior may be associated with aspects of health beyond unwanted pregnancy and sexually transmitted infections, no current theoretical or conceptual model fully explains associations between sexual behavior and multiple facets of health. We provide a conceptual model that…
On the Performance of Alternate Conceptual Ecohydrological Models for Streamflow Prediction
NASA Astrophysics Data System (ADS)
Naseem, Bushra; Ajami, Hoori; Cordery, Ian; Sharma, Ashish
2016-04-01
A merging of a lumped conceptual hydrological model with two conceptual dynamic vegetation models is presented to assess the performance of these models for simultaneous simulations of streamflow and leaf area index (LAI). Two conceptual dynamic vegetation models with differing representation of ecological processes are merged with a lumped conceptual hydrological model (HYMOD) to predict catchment scale streamflow and LAI. The merged RR-LAI-I model computes relative leaf biomass based on transpiration rates while the RR-LAI-II model computes above ground green and dead biomass based on net primary productivity and water use efficiency in response to soil moisture dynamics. To assess the performance of these models, daily discharge and 8-day MODIS LAI product for 27 catchments of 90-1,600 km2 in size located in the Murray-Darling Basin in Australia are used. Our results illustrate that when single-objective optimisation was focussed on maximizing the objective function for streamflow or LAI, the other un-calibrated predicted outcome (LAI if streamflow is the focus) was consistently compromised. Thus, single-objective optimization cannot take into account the essence of all processes in the conceptual ecohydrological models. However, multi-objective optimisation showed great strength for streamflow and LAI predictions. Both response outputs were better simulated by RR-LAI-II than RR-LAI-I due to better representation of physical processes such as net primary productivity (NPP) in RR-LAI-II. Our results highlight that simultaneous calibration of streamflow and LAI using a multi-objective algorithm proves to be an attractive tool for improved streamflow predictions.
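A weighted aggregation of streamflow and LAI skill is one simple way to realize the multi-objective calibration described above. The sketch below is illustrative only: the `model` callable, the Nash-Sutcliffe objective, and the equal weighting are assumptions, not the study's actual setup.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 indicates a perfect fit."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def multi_objective(params, obs_q, obs_lai, model):
    """Aggregate streamflow and LAI skill into one score to maximize.

    `model` is a stand-in for an ecohydrological model returning simulated
    discharge and LAI for a parameter vector; the 50/50 weighting is an
    illustrative choice.
    """
    sim_q, sim_lai = model(params)
    return 0.5 * nse(obs_q, sim_q) + 0.5 * nse(obs_lai, sim_lai)
```

A genuinely multi-objective algorithm (e.g., NSGA-II) would instead search the Pareto front between the two criteria rather than fixing the weights in advance.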
Assessment of student learning with hypermedia tools in first-year college chemistry
NASA Astrophysics Data System (ADS)
Skov, Neil Martin
Learning chemistry is difficult for some students. In response to this difficulty, many educators argue that hypermedia technology can promote learning of abstract chemistry concepts. This research assesses learning outcomes and use patterns exhibited by first-year college general chemistry students using an instructional hypermedia system called Seeing Through Chemistry (STC) as part of their first course. STC was designed to help students with inadequate preparation for college chemistry. The assessment answers two questions: (a) do students learn from instructional hypermedia, and (b) what kind of students benefit from this medium? This non-experimental, quantitative research involved 82 student volunteers in their first college chemistry course. Data include SAT scores, high school science and mathematics grades, career orientation, chemistry placement score, motivation, laboratory and lecture section enrollment, and chemistry course grade. The investigation requires two specialized assessment tools: a measure of conceptual understanding of acids and bases, and a measure of cognitive engagement with hypermedia. Data analysis methods include two causal path models to examine hypermedia use and learning outcomes: one showing STC's effect on overall chemistry course performance, and the other demonstrating the effect of a single STC module on students' conceptual knowledge of acids and bases. Though there is no significant effect on course grade, the second analysis shows statistically significant learning from students' work with instructional hypermedia. Both causal models demonstrate that students with poorer preparation for college chemistry used STC more than students with better preparation, which matches the designers' intent. Some better prepared students were relatively more motivated to use the hypermedia system. Other findings show positive effects of high school science and college laboratory coursework on concept learning. This research informs the field of hypermedia design. Since STC's developers used particular parameters to guide their design, the medium's positive effect on learning indirectly supports the underlying design parameters. This research also demonstrates an effective method for assessing hypermedia learning in large course settings. In addition, the study exhibits a new tool for investigating conceptual understandings of large numbers of students, and a new way to measure cognitive engagement of students using instructional hypermedia.
Ackerman, Daniel J.; Rousseau, Joseph P.; Rattray, Gordon W.; Fisher, Jason C.
2010-01-01
Three-dimensional steady-state and transient models of groundwater flow and advective transport in the eastern Snake River Plain aquifer were developed by the U.S. Geological Survey in cooperation with the U.S. Department of Energy. The steady-state and transient flow models cover an area of 1,940 square miles that includes most of the 890 square miles of the Idaho National Laboratory (INL). A 50-year history of waste disposal at the INL has resulted in measurable concentrations of waste contaminants in the eastern Snake River Plain aquifer. Model results can be used in numerical simulations to evaluate the movement of contaminants in the aquifer. Saturated flow in the eastern Snake River Plain aquifer was simulated using the MODFLOW-2000 groundwater flow model. Steady-state flow was simulated to represent conditions in 1980 with average streamflow infiltration from 1966-80 for the Big Lost River, the major variable inflow to the system. The transient flow model simulates groundwater flow between 1980 and 1995, a period that included a 5-year wet cycle (1982-86) followed by an 8-year dry cycle (1987-94). Specified flows into or out of the active model grid define the conditions on all boundaries except the southwest (outflow) boundary, which is simulated with head-dependent flow. In the transient flow model, streamflow infiltration was the major stress, and was variable in time and location. The models were calibrated by adjusting aquifer hydraulic properties to match simulated and observed heads or head differences using the parameter-estimation program incorporated in MODFLOW-2000. Various summary, regression, and inferential statistics, in addition to comparisons of model properties and simulated head to measured properties and head, were used to evaluate the model calibration. Model parameters estimated for the steady-state calibration included hydraulic conductivity for seven of nine hydrogeologic zones and a global value of vertical anisotropy. Parameters estimated for the transient calibration included specific yield for five of the seven hydrogeologic zones. The zones represent five rock units and parts of four rock units with abundant interbedded sediment. All estimates of hydraulic conductivity were nearly within 2 orders of magnitude of the maximum expected value in a range that exceeds 6 orders of magnitude. The estimate of vertical anisotropy was larger than the maximum expected value. All estimates of specific yield and their confidence intervals were within the ranges of values expected for aquifers, the range of values for porosity of basalt, and other estimates of specific yield for basalt. The steady-state model reasonably simulated the observed water-table altitude, orientation, and gradients. Simulation of transient flow conditions accurately reproduced observed changes in the flow system resulting from episodic infiltration from the Big Lost River and facilitated understanding and visualization of the relative importance of historical differences in infiltration in time and space. As described in a conceptual model, the numerical model simulations demonstrate flow that is (1) dominantly horizontal through interflow zones in basalt and vertical anisotropy resulting from contrasts in hydraulic conductivity of various types of basalt and the interbedded sediments, (2) temporally variable due to streamflow infiltration from the Big Lost River, and (3) moving downward downgradient of the INL. 
The numerical models were reparameterized, recalibrated, and analyzed to evaluate alternative conceptualizations or implementations of the conceptual model. The analysis of the reparameterized models revealed that little improvement in the model could come from alternative descriptions of sediment content, simulated aquifer thickness, streamflow infiltration, and vertical head distribution on the downgradient boundary. Of the alternative estimates of flow to or from the aquifer, only a 20 percent decrease in
An open, object-based modeling approach for simulating subsurface heterogeneity
NASA Astrophysics Data System (ADS)
Bennett, J.; Ross, M.; Haslauer, C. P.; Cirpka, O. A.
2017-12-01
Characterization of subsurface heterogeneity with respect to hydraulic and geochemical properties is critical in hydrogeology as their spatial distribution controls groundwater flow and solute transport. Many approaches of characterizing subsurface heterogeneity do not account for well-established geological concepts about the deposition of the aquifer materials; those that do (i.e. process-based methods) often require forcing parameters that are difficult to derive from site observations. We have developed a new method for simulating subsurface heterogeneity that honors concepts of sequence stratigraphy, resolves fine-scale heterogeneity and anisotropy of distributed parameters, and resembles observed sedimentary deposits. The method implements a multi-scale hierarchical facies modeling framework based on architectural element analysis, with larger features composed of smaller sub-units. The Hydrogeological Virtual Reality simulator (HYVR) simulates distributed parameter models using an object-based approach. Input parameters are derived from observations of stratigraphic morphology in sequence type-sections. Simulation outputs can be used for generic simulations of groundwater flow and solute transport, and for the generation of three-dimensional training images needed in applications of multiple-point geostatistics. The HYVR algorithm is flexible and easy to customize. The algorithm was written in the open-source programming language Python, and is intended to form a code base for hydrogeological researchers, as well as a platform that can be further developed to suit investigators' individual needs. This presentation will encompass the conceptual background and computational methods of the HYVR algorithm, the derivation of input parameters from site characterization, and the results of groundwater flow and solute transport simulations in different depositional settings.
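The object-based idea behind such a simulator can be illustrated generically: geometric objects drawn from stochastic distributions are dropped into a 3D grid, with younger objects eroding older ones. The sketch below is not the HYVR code or its API, only a minimal stand-alone illustration with assumed object shapes and facies-to-conductivity values.

```python
import numpy as np

rng = np.random.default_rng(0)
nx, ny, nz = 100, 80, 40                    # grid cells
dx = dy = 1.0                               # horizontal cell size (m)
dz = 0.25                                   # vertical cell size (m)
facies = np.zeros((nx, ny, nz), dtype=int)  # 0 = background matrix

x, y, z = np.meshgrid(np.arange(nx) * dx, np.arange(ny) * dy,
                      np.arange(nz) * dz, indexing="ij")

# Drop truncated-ellipsoid "trough" objects; later objects erode earlier ones,
# mimicking the cut-and-fill architecture of coarse fluvial deposits.
for obj_id in range(1, 30):
    cx, cy = rng.uniform(0, nx * dx), rng.uniform(0, ny * dy)
    cz = rng.uniform(0.3 * nz * dz, nz * dz)           # trough top elevation
    a, b, c = rng.uniform(8, 20), rng.uniform(4, 10), rng.uniform(1, 3)
    inside = (((x - cx) / a) ** 2 + ((y - cy) / b) ** 2
              + ((z - cz) / c) ** 2 < 1.0) & (z < cz)  # keep lower half only
    facies[inside] = rng.integers(1, 4)                # one of 3 fill facies

# Map facies codes to hydraulic conductivity (m/s) for flow-model input.
K = np.choose(facies, [1e-5, 1e-3, 1e-4, 1e-6])
```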
How uncertain is model-based prediction of copper loads in stormwater runoff?
Lindblom, E; Ahlman, S; Mikkelsen, P S
2007-01-01
In this paper, we conduct a systematic analysis of the uncertainty related to estimating the total load of pollution (copper) from a separate stormwater drainage system, conditioned on a specific combination of input data, a dynamic conceptual pollutant accumulation-washout model and measurements (runoff volumes and pollutant masses). We use the generalized likelihood uncertainty estimation (GLUE) methodology and generate posterior parameter distributions that result in model outputs encompassing a significant number of the highly variable measurements. Given the applied pollution accumulation-washout model and a total of 57 measurements during one month, the total copper mass can be predicted within a range of +/-50% of the median value. The message is that this relatively large uncertainty should be acknowledged in connection with posting statements about micropollutant loads as estimated from dynamic models, even when calibrated with on-site concentration data.
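The GLUE procedure referenced above follows a standard recipe: sample parameter sets from broad priors, run the model, keep the "behavioral" sets whose likelihood measure exceeds a threshold, and read prediction bounds off the behavioral ensemble. A minimal sketch with a toy accumulation-washout model; the model form, priors, likelihood measure, and threshold are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def washout_model(params, rainfall):
    """Toy accumulation-washout model: build-up between events, first-order
    washout during runoff. A stand-in for the study's conceptual model."""
    accum_rate, washout_coef = params
    mass, loads = 0.0, []
    for r in rainfall:
        mass += accum_rate                          # dry-weather build-up (g)
        washed = mass * (1 - np.exp(-washout_coef * r))
        mass -= washed
        loads.append(washed)
    return np.array(loads)

rainfall = rng.exponential(2.0, size=57)
observed = washout_model((10.0, 0.3), rainfall) * rng.lognormal(0, 0.3, 57)

# GLUE: sample parameters from broad uniform priors, keep "behavioral" sets
# whose informal efficiency-type score exceeds a threshold (0.5 here).
samples = rng.uniform([1.0, 0.01], [50.0, 1.0], size=(20_000, 2))
sims = np.array([washout_model(p, rainfall) for p in samples])
score = 1 - ((sims - observed) ** 2).sum(axis=1) / ((observed - observed.mean()) ** 2).sum()
behavioral = sims[score > 0.5]

# Prediction bounds for the total load from the behavioral ensemble.
totals = behavioral.sum(axis=1)
print("total load, 5/50/95 percentiles:", np.percentile(totals, [5, 50, 95]))
```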
NASA Astrophysics Data System (ADS)
Hsu, Kuo-Lin; Gupta, Hoshin V.; Gao, Xiaogang; Sorooshian, Soroosh; Imam, Bisher
2002-12-01
Artificial neural networks (ANNs) can be useful in the prediction of hydrologic variables, such as streamflow, particularly when the underlying processes have complex nonlinear interrelationships. However, conventional ANN structures suffer from network training issues that significantly limit their widespread application. This paper presents a multivariate ANN procedure entitled self-organizing linear output map (SOLO), whose structure has been designed for rapid, precise, and inexpensive estimation of network structure/parameters and system outputs. More important, SOLO provides features that facilitate insight into the underlying processes, thereby extending its usefulness beyond forecast applications as a tool for scientific investigations. These characteristics are demonstrated using a classic rainfall-runoff forecasting problem. Various aspects of model performance are evaluated in comparison with other commonly used modeling approaches, including multilayer feedforward ANNs, linear time series modeling, and conceptual rainfall-runoff modeling.
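SOLO couples an input-space classification layer with a separate linear output mapping per node. The sketch below approximates that architecture, substituting k-means for the self-organizing map's classification step, so it illustrates the idea rather than reproducing SOLO; the data and sizes are synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Synthetic rainfall-runoff-like data: inputs X (e.g., lagged rainfall and
# flow), target y (next-step flow) with a nonlinear relationship.
X = rng.uniform(0, 1, size=(2000, 3))
y = 5 * X[:, 0] ** 2 + np.sin(3 * X[:, 1]) + 0.5 * X[:, 2] + rng.normal(0, 0.05, 2000)

# Step 1: partition input space into nodes (k-means stands in for the SOM).
n_nodes = 25
nodes = KMeans(n_clusters=n_nodes, n_init=10, random_state=0).fit(X)
labels = nodes.predict(X)

# Step 2: fit an independent linear output map on each node's members.
Xb = np.column_stack([X, np.ones(len(X))])   # add intercept column
coefs = {}
for k in range(n_nodes):
    m = labels == k
    coefs[k], *_ = np.linalg.lstsq(Xb[m], y[m], rcond=None)

# Predict: route each input through its node's linear map.
y_hat = np.array([Xb[i] @ coefs[labels[i]] for i in range(len(X))])
print("RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```

Because each node solves a small linear least-squares problem, training is fast and the per-node weights remain inspectable, which is the property the authors exploit for scientific insight.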
NASA Astrophysics Data System (ADS)
Cultrera, Matteo; Boaga, Jacopo; Di Sipio, Eloisa; Dalla Santa, Giorgia; De Seta, Massimiliano; Galgaro, Antonio
2018-05-01
Groundwater tracer tests are often used to improve aquifer characterization, but they present several disadvantages, such as the need to pour solutions or dyes into the aquifer system and alteration of the water's chemical properties. Thus, tracers can affect the groundwater flow mechanics and data interpretation becomes more complex, hindering effective study of ground heat pumps for low enthalpy geothermal systems. This paper presents a preliminary methodology based on a multidisciplinary application of heat as a tracer for defining the main parameters of shallow aquifers. The field monitoring techniques electrical resistivity tomography (ERT) and distributed temperature sensing (DTS) are noninvasive and were applied to a shallow-aquifer test site in northeast Italy. The combination of these measurement techniques supports the definition of the main aquifer parameters and therefore the construction of a reliable conceptual model, which is then described through the numerical code FEFLOW. This model is calibrated with DTS and validated by ERT outcomes. The reliability of the numerical model in terms of fate and transport is thereby enhanced, leading to the potential for better environmental management and protection of groundwater resources through more cost-effective solutions.
A statistical framework for genetic association studies of power curves in bird flight
Lin, Min; Zhao, Wei
2006-01-01
How the power required for bird flight varies as a function of forward speed can be used to predict the flight style and behavioral strategy of a bird for feeding and migration. A U-shaped curve between power and flight velocity has been observed in many birds, which is consistent with the theoretical prediction of aerodynamic models. In this article, we present a general genetic model for fine mapping of quantitative trait loci (QTL) responsible for power curves in a sample of birds drawn from a natural population. This model is developed within the maximum likelihood context, implemented with the EM algorithm for estimating the population genetic parameters of QTL and the simplex algorithm for estimating the QTL genotype-specific parameters of power curves. Using Monte Carlo simulation derived from empirical observations of power curves in the European starling (Sturnus vulgaris), we demonstrate how the underlying QTL for power curves can be detected from molecular markers and how the QTL detected affect the most appropriate flight speeds used to design an optimal migration strategy. The results from our model can be directly integrated into a conceptual framework for understanding flight origin and evolution. PMID:17066123
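The U-shaped power curve itself has a standard aerodynamic form that is linear in transformed speed variables, so it can be fitted by ordinary least squares; the QTL machinery in the paper then asks how the curve's coefficients vary with genotype. A minimal sketch of the curve-fitting step, with synthetic data rather than starling measurements; the a/v + b*v^3 form is a common approximation, not necessarily the exact model used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic power-speed observations for one bird: induced power falls as 1/v,
# parasite power rises as v^3, producing the U shape. Coefficients are
# illustrative only.
v = np.linspace(4, 14, 30)                       # flight speed (m/s)
power = 20.0 / v + 0.02 * v ** 3 + rng.normal(0, 0.3, v.size)

# The model is linear in (1/v, v^3), so ordinary least squares suffices.
A = np.column_stack([1.0 / v, v ** 3])
(a, b), *_ = np.linalg.lstsq(A, power, rcond=None)

# Minimum-power speed: dP/dv = -a/v**2 + 3*b*v**2 = 0  =>  v_mp = (a/(3*b))**0.25
v_mp = (a / (3 * b)) ** 0.25
print(f"a = {a:.2f}, b = {b:.4f}, minimum-power speed = {v_mp:.1f} m/s")
```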
NASA Astrophysics Data System (ADS)
Domènech, Cristina; Galí, Salvador; Villanova-de-Benavent, Cristina; Soler, Josep M.; Proenza, Joaquín A.
2017-10-01
Oxide-type Ni-laterite deposits are characterized by a dominant limonite zone with goethite as the economically most important Ni ore mineral and a thin zone of hydrous Mg silicate-rich saprolite beneath the magnesium discontinuity. Fe, less soluble, is mainly retained forming goethite, while Ni is redeposited at greater depth in a Fe(III) and Ni-rich serpentine (serpentine II) or in goethite, where it adsorbs or substitutes for Fe in the mineral structure. Here, a 1D reactive transport model, using CrunchFlow, of Punta Gorda oxide-type Ni-laterite deposit (Moa Bay, Cuba) formation is presented. The model reproduces the formation of the different laterite horizons in the profile from an initial, partially serpentinized peridotite, in 10^6 years, validating the conceptual model of the formation of this kind of deposit, in which a narrow saprolite horizon rich in Ni-bearing serpentine is formed above peridotite parent rock and a thick limonite horizon is formed over saprolite. Results also confirm that sorption of Ni onto goethite can explain the weight percent of Ni found in the Moa goethite. Sensitivity analyses accounting for the effect of key parameters (composition, dissolution rate, carbonate concentration, quartz precipitation) on the model results are also presented. It is found that aqueous carbonate concentration and quartz precipitation significantly affect the laterization rate, while the effect of the composition of secondary serpentine or of mineral dissolution rates is minor. The results of this reactive transport modeling have proven useful to validate the conceptual models derived from field observations.
NASA Astrophysics Data System (ADS)
Hugman, Rui; Stigter, Tibor; Monteiro, Jose Paulo
2015-04-01
The Albufeira-Ribeira de Quarteira aquifer system on the south coast of Portugal is an important source of groundwater for agriculture and tourism, as well as contributing to significant freshwater discharge along the coast in the form of inter- and sub-tidal springs and maintaining groundwater dependent ecosystems along the Quarteira stream. Submarine groundwater discharge (SGD) in the area was investigated within the scope of a multidisciplinary research project FREEZE (PTDC/MAR/102030/2008) which aimed to identify and characterize the effects of the hydrological/hydrogeological conditions on associated ecosystems. As well as near shore submarine springs, signs of SGD were found several kilometres from the shoreline during offshore CTD and geophysical surveys. On-land geophysical and offshore seismic surveys supplied data to update the 3D hydrogeological conceptual model of the aquifer system. Numerical models were applied to test the possibility of an offshore continuation of fresh groundwater over several kilometres under local conditions. Due to the high computational demand of variable density modelling, in an initial phase simplified 2D cross section models were used to test the conceptual model and reduce uncertainty in regards to model parameters. Results confirm the potential for SGD several kilometres from the coast within a range of acceptable values of hydraulic conductivity and recharge of the system. This represents the initial step in developing and calibrating a 3D regional scale model of the system, which aims to supply an estimate of the spatial distribution of SGD as well as serve as a decision support tool for the local water resources management agency.
Controls on the variability of net infiltration to desert sandstone
Heilweil, Victor M.; McKinney, Tim S.; Zhdanov, Michael S.; Watt, Dennis E.
2007-01-01
As populations grow in arid climates and desert bedrock aquifers are increasingly targeted for future development, understanding and quantifying the spatial variability of net infiltration becomes critically important for accurately inventorying water resources and mapping contamination vulnerability. This paper presents a conceptual model of net infiltration to desert sandstone and then develops an empirical equation for its spatial quantification at the watershed scale using linear least squares inversion methods for evaluating controlling parameters (independent variables) based on estimated net infiltration rates (dependent variables). Net infiltration rates used for this regression analysis were calculated from environmental tracers in boreholes and more than 3000 linear meters of vadose zone excavations in an upland basin in southwestern Utah underlain by Navajo sandstone. Soil coarseness, distance to upgradient outcrop, and topographic slope were shown to be the primary physical parameters controlling the spatial variability of net infiltration. Although the method should be transferable to other desert sandstone settings for determining the relative spatial distribution of net infiltration, further study is needed to evaluate the effects of other potential parameters such as slope aspect, outcrop parameters, and climate on absolute net infiltration rates.
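The regression step described above can be sketched directly: stack the candidate controls into a design matrix and solve for their coefficients by linear least squares. All values below are synthetic placeholders, not the Navajo sandstone data.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 60  # sites with tracer-based net infiltration estimates

# Hypothetical site attributes (the controls identified in the study): soil
# coarseness, distance to upgradient outcrop, and topographic slope.
coarseness = rng.uniform(0, 1, n)
distance = rng.uniform(0, 500, n)        # m
slope = rng.uniform(0, 30, n)            # degrees
infiltration = (40 * coarseness - 0.03 * distance - 0.5 * slope
                + 20 + rng.normal(0, 3, n))   # mm/yr, synthetic

# Linear least squares inversion for the control coefficients.
G = np.column_stack([coarseness, distance, slope, np.ones(n)])
m, *_ = np.linalg.lstsq(G, infiltration, rcond=None)
print("coefficients (coarseness, distance, slope, intercept):", m.round(3))
```

The fitted coefficients play the role of the empirical equation's weights; transferring the equation to another sandstone setting amounts to reusing the design-matrix structure with locally estimated coefficients.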
Conceptualizing a model: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force-2.
Roberts, Mark; Russell, Louise B; Paltiel, A David; Chambers, Michael; McEwan, Phil; Krahn, Murray
2012-01-01
The appropriate development of a model begins with understanding the problem that is being represented. The aim of this article is to provide a series of consensus-based best practices regarding the process of model conceptualization. For the purpose of this series of papers, the authors consider the development of models whose purpose is to inform medical decisions and health-related resource allocation questions. They specifically divide the conceptualization process into two distinct components: the conceptualization of the problem, which converts knowledge of the health care process or decision into a representation of the problem, followed by the conceptualization of the model itself, which matches the attributes and characteristics of a particular modeling type to the needs of the problem being represented. Recommendations are made regarding the structure of the modeling team, agreement on the statement of the problem, the structure, perspective and target population of the model, and the interventions and outcomes represented. Best practices relating to the specific characteristics of model structure, and which characteristics of the problem might be most easily represented in a specific modeling method, are presented. Each section contains a number of recommendations that were iterated among the authors, as well as the wider modeling taskforce, jointly set up by the International Society for Pharmacoeconomics and Outcomes Research and the Society for Medical Decision Making.
Conceptualizing Programme Evaluation
ERIC Educational Resources Information Center
Hassan, Salochana
2013-01-01
The main thrust of this paper deals with the conceptualization of theory-driven evaluation pertaining to a tutor training programme. Conceptualization of evaluation, in this case, is an integration between a conceptualization model as well as a theoretical framework in the form of activity theory. Existing examples of frameworks of programme…
A new zonation algorithm with parameter estimation using hydraulic head and subsidence observations.
Zhang, Meijing; Burbey, Thomas J; Nunes, Vitor Dos Santos; Borggaard, Jeff
2014-01-01
Parameter estimation codes such as UCODE_2005 are becoming well-known tools in groundwater modeling investigations. These programs estimate important parameter values such as transmissivity (T) and aquifer storage values (Sa) from known observations of hydraulic head, flow, or other physical quantities. One drawback inherent in these codes is that the parameter zones must be specified by the user. However, such knowledge is often unknown even if a detailed hydrogeological description is available. To overcome this deficiency, we present a discrete adjoint algorithm for identifying suitable zonations from hydraulic head and subsidence measurements, which are highly sensitive to both elastic (Sske) and inelastic (Sskv) skeletal specific storage coefficients. With the advent of interferometric synthetic aperture radar (InSAR), distributed spatial and temporal subsidence measurements can be obtained. A synthetic conceptual model containing seven transmissivity zones, one aquifer storage zone and three interbed zones for elastic and inelastic storage coefficients was developed to simulate drawdown and subsidence in an aquifer interbedded with clay that exhibits delayed drainage. Simulated delayed land subsidence and groundwater head data are assumed to be the observed measurements, to which the discrete adjoint algorithm is applied to create approximate spatial zonations of T, Sske, and Sskv. UCODE-2005 is then used to obtain the final optimal parameter values. Calibration results indicate that the estimated zonations calculated from the discrete adjoint algorithm closely approximate the true parameter zonations. This automation algorithm reduces the bias established by the initial distribution of zones and provides a robust parameter zonation distribution. © 2013, National Ground Water Association.
NASA Astrophysics Data System (ADS)
King, J. N.; Walsh, V.; Cunningham, K. J.; Evans, F. S.; Langevin, C. D.; Dausman, A.
2009-12-01
The Miami-Dade Water and Sewer Department (MDWASD) injects buoyant effluent from the North District Wastewater Treatment Plant (NDWWTP) through four Class I injection wells into the Boulder Zone---a saline (35 parts per thousand) and transmissive (10^5 to 10^6 square meters per day) hydrogeologic unit located approximately 1000 meters below land surface. Miami-Dade County is located in southeast Florida, U.S.A. Portions of the Floridan and Biscayne aquifers are located above the Boulder Zone. The Floridan and Biscayne aquifers---underground sources of drinking water---are protected by U.S. Federal Laws and Regulations, Florida Statutes, and Miami-Dade County ordinances. In 1998, MDWASD began to observe effluent constituents within the Floridan aquifer. Continuous-source and impulse-source analytical models for advective and diffusive transport of effluent are used in the present work to test contaminant flow-path hypotheses, suggest transport mechanisms, and estimate dispersivity. MDWASD collected data in the Floridan aquifer between 1996 and 2007. A parameter estimation code is used to optimize analytical model parameters by fitting model data to collected data. These simple models will be used to develop conceptual and numerical models of effluent transport at the NDWWTP, and in the vicinity of the NDWWTP.
Roeder, Ingo; Kamminga, Leonie M; Braesel, Katrin; Dontje, Bert; de Haan, Gerald; Loeffler, Markus
2005-01-15
Many current experimental results show the necessity of new conceptual approaches to understand hematopoietic stem cell organization. Recently, we proposed a novel theoretical concept and a corresponding quantitative model based on microenvironment-dependent stem cell plasticity. The objective of our present work is to subject this model to an experimental test for the situation of chimeric hematopoiesis. Investigating clonal competition processes in DBA/2-C57BL/6 mouse chimeras, we observed biphasic chimerism development with initially increasing but long-term declining DBA/2 contribution. These experimental results were used to select the parameters of the mathematical model. To validate the model beyond this specific situation, we fixed the obtained parameter configuration to simulate further experimental settings comprising variations of transplanted DBA/2-C57BL/6 proportions, secondary transplantations, and perturbation of stabilized chimeras by cytokine and cytotoxic treatment. We show that the proposed model is able to consistently describe the situation of chimeric hematopoiesis. Our results strongly support the view that the relative growth advantage of strain-specific stem cells is not a fixed cellular property but is sensitively dependent on the actual state of the entire system. We conclude that hematopoietic stem cell organization should be understood as a flexible, self-organized rather than a fixed, preprogrammed process.
On the generation of climate model ensembles
NASA Astrophysics Data System (ADS)
Haughton, Ned; Abramowitz, Gab; Pitman, Andy; Phipps, Steven J.
2014-10-01
Climate model ensembles are used to estimate uncertainty in future projections, typically by interpreting the ensemble distribution for a particular variable probabilistically. There are, however, different ways to produce climate model ensembles that yield different results, and therefore different probabilities for a future change in a variable. Perhaps equally importantly, there are different approaches to interpreting the ensemble distribution that lead to different conclusions. Here we use a reduced-resolution climate system model to compare three common ways to generate ensembles: initial conditions perturbation, physical parameter perturbation, and structural changes. Despite these three approaches conceptually representing very different categories of uncertainty within a modelling system, when comparing simulations to observations of surface air temperature they can be very difficult to separate. Using the twentieth century CMIP5 ensemble for comparison, we show that initial conditions ensembles, in theory representing internal variability, significantly underestimate observed variance. Structural ensembles, perhaps less surprisingly, exhibit over-dispersion in simulated variance. We argue that future climate model ensembles may need to include parameter or structural perturbation members in addition to perturbed initial conditions members to ensure that they sample uncertainty due to internal variability more completely. We note that where ensembles are over- or under-dispersive, such as for the CMIP5 ensemble, estimates of uncertainty need to be treated with care.
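The ensemble types contrasted above can be mimicked with any toy nonlinear model. The sketch below uses the Lorenz-63 system as a stand-in "climate model" to compare an initial-conditions ensemble with a parameter-perturbation ensemble; the perturbation sizes are arbitrary choices for illustration, and a structural ensemble would additionally swap model equations or components.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system (toy 'climate model')."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def run(state, steps=5000, **params):
    traj = [state]
    for _ in range(steps):
        traj.append(lorenz_step(traj[-1], **params))
    return np.array(traj)

rng = np.random.default_rng(5)
base = np.array([1.0, 1.0, 1.0])

# Initial-conditions ensemble: tiny perturbations to the starting state.
ic_ens = [run(base + rng.normal(0, 1e-3, 3)) for _ in range(10)]

# Parameter-perturbation ensemble: same start, perturbed model parameter.
pp_ens = [run(base, rho=28.0 + rng.normal(0, 1.0)) for _ in range(10)]

# Compare ensemble spread of a late-time statistic (analogue of simulated
# climate variance across members).
ic_spread = np.std([t[-1000:, 0].mean() for t in ic_ens])
pp_spread = np.std([t[-1000:, 0].mean() for t in pp_ens])
print(f"IC-ensemble spread: {ic_spread:.3f}, parameter-ensemble spread: {pp_spread:.3f}")
```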
Some applications of categorical data analysis to epidemiological studies.
Grizzle, J E; Koch, G G
1979-01-01
Several examples of categorized data from epidemiological studies are analyzed to illustrate that more informative analyses than tests of independence can be performed by fitting models. All of the analyses fit into a unified conceptual framework that can be implemented by weighted least squares. The methods presented show how to calculate point estimates of parameters, asymptotic variances, and asymptotically valid chi-square tests. The examples presented are analysis of relative risks estimated from several 2 x 2 tables, analysis of selected features of life tables, construction of synthetic life tables from cross-sectional studies, and analysis of dose-response curves. PMID:540590
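For the first of those examples, the relative risk from a single 2 x 2 table, together with its asymptotic variance via the delta method on log(RR), takes only a few lines; the counts below are hypothetical. Combining several strata by weighted least squares on log(RR) uses the same ingredients, weighting each stratum by the inverse of its variance.

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table: exposure status by disease status.
a, b = 30, 970   # exposed: cases, non-cases
c, d = 15, 985   # unexposed: cases, non-cases

rr = (a / (a + b)) / (c / (c + d))
# Asymptotic variance of log(RR) by the delta method.
var_log_rr = 1 / a - 1 / (a + b) + 1 / c - 1 / (c + d)
ci = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * np.sqrt(var_log_rr))
print(f"RR = {rr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")

# Asymptotically valid chi-square (Wald) test of log(RR) = 0.
chi2 = np.log(rr) ** 2 / var_log_rr
p = stats.chi2.sf(chi2, df=1)
print(f"chi-square = {chi2:.2f}, p = {p:.4f}")
```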
Parameter estimation uncertainty: Comparing apples and apples?
NASA Astrophysics Data System (ADS)
Hart, D.; Yoon, H.; McKenna, S. A.
2012-12-01
Given a highly parameterized ground water model in which the conceptual model of the heterogeneity is stochastic, an ensemble of inverse calibrations from multiple starting points (MSP) provides an ensemble of calibrated parameters and follow-on transport predictions. However, the multiple calibrations are computationally expensive. Parameter estimation uncertainty can also be modeled by decomposing the parameterization into a solution space and a null space. From a single calibration (single starting point) a single set of parameters defining the solution space can be extracted. The solution space is held constant while Monte Carlo sampling of the parameter set covering the null space creates an ensemble of the null space parameter set. A recently developed null-space Monte Carlo (NSMC) method combines the calibration solution space parameters with the ensemble of null space parameters, creating sets of calibration-constrained parameters for input to the follow-on transport predictions. Here, we examine the consistency between probabilistic ensembles of parameter estimates and predictions using the MSP calibration and the NSMC approaches. A highly parameterized model of the Culebra dolomite previously developed for the WIPP project in New Mexico is used as the test case. A total of 100 estimated fields are retained from the MSP approach and the ensemble of results defining the model fit to the data, the reproduction of the variogram model and prediction of an advective travel time are compared to the same results obtained using NSMC. We demonstrate that the NSMC fields based on a single calibration model can be significantly constrained by the calibrated solution space and the resulting distribution of advective travel times is biased toward the travel time from the single calibrated field. To overcome this, newly proposed strategies to employ a multiple calibration-constrained NSMC approach (M-NSMC) are evaluated. Comparison of the M-NSMC and MSP methods suggests that M-NSMC can provide a computationally efficient and practical solution for predictive uncertainty analysis in highly nonlinear and complex subsurface flow and transport models. This material is based upon work supported as part of the Center for Frontiers of Subsurface Energy Security, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences under Award Number DE-SC0001114. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
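The solution-space/null-space decomposition at the heart of NSMC can be shown on a toy linear inverse problem: the SVD of the Jacobian splits parameter space into directions the data constrain and directions they cannot, and Monte Carlo sampling of the latter yields calibration-constrained realizations from a single calibration. A minimal sketch under a linear stand-in model, not the Culebra model or any production implementation:

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy underdetermined inverse problem: 20 observations, 100 parameters.
n_obs, n_par = 20, 100
J = rng.normal(size=(n_obs, n_par))          # Jacobian (sensitivities)
p_true = rng.normal(size=n_par)
obs = J @ p_true + rng.normal(0, 0.01, n_obs)

# One calibration: minimum-norm least squares (solution space only).
p_cal, *_ = np.linalg.lstsq(J, obs, rcond=None)

# SVD separates the solution space (first r right singular vectors) from the
# null space (the remainder), which the data cannot constrain.
U, s, Vt = np.linalg.svd(J)
r = np.sum(s > 1e-8 * s[0])                  # numerical rank
V_null = Vt[r:].T                            # basis of the null space

# NSMC: add random null-space components to the single calibrated field;
# each realization fits the observations (almost) equally well.
ensemble = [p_cal + V_null @ rng.normal(0, 1.0, n_par - r) for _ in range(100)]
misfits = [np.linalg.norm(J @ p - obs) for p in ensemble]
print(f"max misfit over ensemble: {max(misfits):.4f}")
```

The bias discussed in the abstract follows directly from this construction: every realization shares the one calibrated solution-space component, so predictions cluster around the single calibrated field unless multiple calibrations (M-NSMC) are used.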
Modelling in Primary School: Constructing Conceptual Models and Making Sense of Fractions
ERIC Educational Resources Information Center
Shahbari, Juhaina Awawdeh; Peled, Irit
2017-01-01
This article describes sixth-grade students' engagement in two model-eliciting activities offering students the opportunity to construct mathematical models. The findings show that students utilized their knowledge of fractions including conceptual and procedural knowledge in constructing mathematical models for the given situations. Some students…
Conceptual Models and the Future of Special Education
ERIC Educational Resources Information Center
Kauffman, James M.
2007-01-01
A medical model has advantages over a legal model in thinking about special education, especially in responding supportively to difference, meeting individual needs, and practicing prevention. The legal conceptual model now dominates thinking about special education, but a medical model promises a brighter future for special education and for…
NASA Astrophysics Data System (ADS)
Li, Yutong; Wang, Yuxin; Duffy, Alex H. B.
2014-11-01
Computer-based conceptual design for routine design has made great strides, yet non-routine design has not been given due attention, and it is still poorly automated. Considering that the function-behavior-structure (FBS) model is widely used for modeling the conceptual design process, a computer-based creativity enhanced conceptual design model (CECD) for non-routine design of mechanical systems is presented. In the model, the leaf functions in the FBS model are decomposed into and represented with fine-grain basic operation actions (BOA), and the corresponding BOA set in the function domain is then constructed. Choosing building blocks from the database, and expressing their multiple functions with BOAs, the BOA set in the structure domain is formed. Through rule-based dynamic partition of the BOA set in the function domain, many variants of regenerated functional schemes are generated. For enhancing the capability to introduce new design variables into the conceptual design process, and to dig out more innovative physical structure schemes, the indirect function-structure matching strategy based on reconstructing the combined structure schemes is adopted. By adjusting the tightness of the partition rules and the granularity of the divided BOA subsets, and making full use of the main function and secondary functions of each basic structure in the process of reconstructing the physical structures, new design variables and variants are introduced into the physical structure scheme reconstructing process, and a great number of simpler physical structure schemes to accomplish the overall function organically are figured out. The creativity enhanced conceptual design model presented has a dominant capability in introducing new design variables in the function domain and digging out simpler physical structures to accomplish the overall function; therefore it can be utilized to solve non-routine conceptual design problems.
NASA Astrophysics Data System (ADS)
Hostache, Renaud; Rains, Dominik; Chini, Marco; Lievens, Hans; Verhoest, Niko E. C.; Matgen, Patrick
2017-04-01
Motivated by climate change and its impact on the scarcity or excess of water in many parts of the world, several agencies and research institutions have taken initiatives in monitoring and predicting the hydrologic cycle at a global scale. Such a monitoring/prediction effort is important for understanding the vulnerability to extreme hydrological events and for providing early warnings. This can be based on an optimal combination of hydro-meteorological models and remote sensing, in which satellite measurements can be used as forcing or calibration data or for regularly updating the model states or parameters. Many advances have been made in these domains and the near future will bring new opportunities with respect to remote sensing as a result of the increasing number of spaceborne sensors enabling the large scale monitoring of water resources. Besides these advances, there is currently a tendency to refine and further complicate physically-based hydrologic models to better capture the hydrologic processes at hand. However, this may not necessarily be beneficial for large-scale hydrology, as computational efforts are thereby increasing significantly. A novel thematic science question to be investigated is therefore whether a flexible conceptual model can match the performance of a complex physically-based model for hydrologic simulations at large scale. In this context, the main objective of this study is to investigate how innovative techniques that allow for the estimation of soil moisture from satellite data can help in reducing errors and uncertainties in large scale conceptual hydro-meteorological modelling. A spatially distributed conceptual hydrologic model has been set up based on recent developments of the SUPERFLEX modelling framework. As it requires limited computational efforts, this model enables early warnings for large areas. Using as forcings the ERA-Interim public dataset and coupled with the CMEM radiative transfer model, SUPERFLEX is capable of predicting runoff, soil moisture, and SMOS-like brightness temperature time series. Such a model is traditionally calibrated using only discharge measurements. In this study we designed a multi-objective calibration procedure based on both discharge measurements and SMOS-derived brightness temperature observations in order to evaluate the added value of remotely sensed soil moisture data in the calibration process. As a test case we set up the SUPERFLEX model for the large scale Murray-Darling catchment in Australia (about 1 million km2). When compared to in situ soil moisture time series, model predictions show good agreement, resulting in correlation coefficients exceeding 70 % and root mean squared errors below 1 %. When benchmarked with the physically based land surface model CLM, SUPERFLEX exhibits similar performance levels. By adapting the runoff routing function within the SUPERFLEX model, the predicted discharge results in a Nash-Sutcliffe efficiency exceeding 0.7 over both the calibration and the validation periods.
Thomas, Jonathan V.; Stanton, Gregory P.; Bumgarner, Johnathan R.; Pearson, Daniel K.; Teeple, Andrew; Houston, Natalie A.; Payne, Jason; Musgrove, MaryLynn
2013-01-01
Several previous studies have been done to compile or collect physical and chemical data, describe the hydrogeologic processes, and develop conceptual and numerical groundwater-flow models of the Edwards-Trinity aquifer in the Trans-Pecos region. Documented methods were used to compile and collect groundwater, surface-water, geochemical, geophysical, and geologic information that subsequently were used to develop this conceptual model.
Mirror neurons and imitation: a computationally guided review.
Oztop, Erhan; Kawato, Mitsuo; Arbib, Michael
2006-04-01
Neurophysiology reveals the properties of individual mirror neurons in the macaque while brain imaging reveals the presence of 'mirror systems' (not individual neurons) in the human. Current conceptual models attribute high level functions such as action understanding, imitation, and language to mirror neurons. However, only the first of these three functions is well-developed in monkeys. We thus distinguish current opinions (conceptual models) on mirror neuron function from more detailed computational models. We assess the strengths and weaknesses of current computational models in addressing the data and speculations on mirror neurons (macaque) and mirror systems (human). In particular, our mirror neuron system (MNS), mental state inference (MSI) and modular selection and identification for control (MOSAIC) models are analyzed in more detail. Conceptual models often overlook the computational requirements for posited functions, while too many computational models adopt the erroneous hypothesis that mirror neurons are interchangeable with imitation ability. Our meta-analysis underlines the gap between conceptual and computational models and points out the research effort required from both sides to reduce this gap.
An introduction to the multisystem model of knowledge integration and translation.
Palmer, Debra; Kramlich, Debra
2011-01-01
Many nurse researchers have designed strategies to assist health care practitioners to move evidence into practice. While many have been identified as "models," most do not have a conceptual framework. They are unidirectional, complex, and difficult for novice research users to understand. These models have focused on empirical knowledge and ignored the importance of practitioners' tacit knowledge. The Communities of Practice conceptual framework allows for the integration of tacit and explicit knowledge into practice. This article describes the development of a new translation model, the Multisystem Model of Knowledge Integration and Translation, supported by the Communities of Practice conceptual framework.
NASA Astrophysics Data System (ADS)
Ahmad, Sabrina; Jalil, Intan Ermahani A.; Ahmad, Sharifah Sakinah Syed
2016-08-01
It is seldom technical issues that impede the process of eliciting software requirements. The involvement of multiple stakeholders usually leads to conflicts, and therefore conflict detection and resolution effort is crucial. This paper forwards an improved conceptual model to assist the conflict detection and resolution effort, extending the ability of current models and improving overall performance. The significance of the new model is that it enables automated detection of conflicts and of their severity levels with rule-based reasoning.
NASA Astrophysics Data System (ADS)
Ryu, Suna; Han, Yuhwha; Paik, Seoung-Hey
2015-04-01
The present study explores how engaging in modeling practice, along with argumentation, leverages students' epistemic and conceptual understanding in an afterschool science/math class of 16 tenth graders. The study also explores how students used mobile Internet phones (smart phones) productively to support modeling practices. As the modeling practices became more challenging, student discussion occurred more often, from what to model to providing explanations for the phenomenon. Students came to argue about evidence that supported their model and how the model could explain target and related phenomena. This finding adds to the literature that modeling practice can help students improve conceptual understanding of subject knowledge as well as epistemic understanding.
CADDIS Volume 5. Causal Databases: Interactive Conceptual Diagrams (ICDs)
The Interactive Conceptual Diagram (ICD) section of CADDIS allows users to create conceptual model diagrams, search a literature-based evidence database, and then attach that evidence to their diagrams.
Evaluation of a distributed catchment scale water balance model
NASA Technical Reports Server (NTRS)
Troch, Peter A.; Mancini, Marco; Paniconi, Claudio; Wood, Eric F.
1993-01-01
The validity of some of the simplifying assumptions in a conceptual water balance model is investigated by comparing simulation results from the conceptual model with simulation results from a three-dimensional physically based numerical model and with field observations. We examine, in particular, assumptions and simplifications related to water table dynamics, vertical soil moisture and pressure head distributions, and subsurface flow contributions to stream discharge. The conceptual model relies on a topographic index to predict saturation excess runoff and on Philip's infiltration equation to predict infiltration excess runoff. The numerical model solves the three-dimensional Richards equation describing flow in variably saturated porous media, and handles seepage face boundaries, infiltration excess and saturation excess runoff production, and soil driven and atmosphere driven surface fluxes. The study catchments (a 7.2 sq km catchment and a 0.64 sq km subcatchment) are located in the North Appalachian ridge and valley region of eastern Pennsylvania. Hydrologic data collected during the MACHYDRO 90 field experiment are used to calibrate the models and to evaluate simulation results. Water table dynamics predicted by the conceptual model are close to the observations in a shallow water well; a linear relationship between a topographic index and local water table depth is therefore a reasonable assumption for catchment scale modeling. However, the hydraulic equilibrium assumption is not valid for the upper 100 cm layer of the unsaturated zone and a conceptual model that incorporates a root zone is suggested. Furthermore, theoretical subsurface flow characteristics from the conceptual model are found to be different from field observations, numerical simulation results, and theoretical baseflow recession characteristics based on Boussinesq's groundwater equation.
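The two runoff-production ingredients named above have compact standard forms: the TOPMODEL-style topographic index ln(a/tan(beta)) for saturation excess, and Philip's two-term equation I(t) = S*sqrt(t) + K*t for infiltration excess. A minimal sketch with illustrative parameter values, not the MACHYDRO 90 data:

```python
import numpy as np

def topographic_index(upslope_area, local_slope_deg):
    """TOPMODEL-style wetness index ln(a / tan(beta)); high values mark
    areas prone to saturation excess runoff."""
    return np.log(upslope_area / np.tan(np.radians(local_slope_deg)))

def philip_infiltration(t, sorptivity, k_sat):
    """Cumulative infiltration by Philip's two-term equation:
    I(t) = S * sqrt(t) + K * t."""
    return sorptivity * np.sqrt(t) + k_sat * t

# Illustrative values: 500 m2/m upslope area per unit contour length on a
# 5-degree slope, and loam-like sorptivity S and conductivity K.
print(f"topographic index: {topographic_index(500.0, 5.0):.2f}")
t = np.array([0.5, 1.0, 2.0])                              # hours
print(philip_infiltration(t, sorptivity=2.0, k_sat=0.5))   # cm
```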
An integrated conceptual framework for evaluating and improving 'understanding' in informed consent.
Bossert, Sabine; Strech, Daniel
2017-10-17
The development of understandable informed consent (IC) documents has proven to be one of the most important challenges in research with humans as well as in healthcare settings. Therefore, evaluating and improving understanding has been of increasing interest for empirical research on IC. However, several conceptual and practical challenges for the development of understandable IC documents remain unresolved. In this paper, we will outline and systematize some of these challenges. On the basis of our own experiences in empirical user testing of IC documents as well as the relevant literature on understanding in IC, we propose an integrated conceptual model for the development of understandable IC documents. The proposed conceptual model integrates different methods for the participatory improvement of written information, including IC, as well as quantitative methods for measuring understanding in IC. In most IC processes, understandable written information is an important prerequisite for valid IC. To improve the quality of IC documents, a conceptual model for participatory procedures of testing, revising, and retesting can be applied. However, the model presented in this paper needs further theoretical and empirical elaboration and clarification of several conceptual and practical challenges.
Conceptual astronomy: A novel model for teaching postsecondary science courses
NASA Astrophysics Data System (ADS)
Zeilik, Michael; Schau, Candace; Mattern, Nancy; Hall, Shannon; Teague, Kathleen W.; Bisard, Walter
1997-10-01
An innovative, conceptually based instructional model for teaching large undergraduate astronomy courses was designed, implemented, and evaluated in the Fall 1995 semester. This model was based on cognitive and educational theories of knowledge and, we believe, is applicable to other large postsecondary science courses. Major components were: (a) identification of the basic important concepts and their interrelationships that are necessary for connected understanding of astronomy in novice students; (b) use of these concepts and their interrelationships throughout the design, implementation, and evaluation stages of the model; (c) identification of students' prior knowledge and misconceptions; and (d) implementation of varied instructional strategies targeted toward encouraging conceptual understanding in students (i.e., instructional concept maps, cooperative small group work, homework assignments stressing concept application, and a conceptually based student assessment system). Evaluation included the development and use of three measures of conceptual understanding and one of attitudes toward studying astronomy. Over the semester, students showed very large increases in their understanding as assessed by a conceptually based multiple-choice measure of misconceptions, a select-and-fill-in concept map measure, and a relatedness-ratings measure. Attitudes, which were slightly positive before the course, changed slightly in a less favorable direction.
Zeng, Liang; Proctor, Robert W; Salvendy, Gavriel
2011-06-01
This research is intended to empirically validate a general model of creative product and service development proposed in the literature. A current research gap inspired construction of a conceptual model to capture fundamental phases and pertinent facilitating metacognitive strategies in the creative design process. The model also depicts the mechanism by which design creativity affects consumer behavior. The validity and assets of this model have not yet been investigated. Four laboratory studies were conducted to demonstrate the value of the proposed cognitive phases and associated metacognitive strategies in the conceptual model. Realistic product and service design problems were used in creativity assessment to ensure ecological validity. Design creativity was enhanced by explicit problem analysis, whereby one formulates problems from different perspectives and at different levels of abstraction. Remote association in conceptual combination spawned more design creativity than did near association. Abstraction led to greater creativity in conducting conceptual expansion than did specificity, which induced mental fixation. Domain-specific knowledge and experience enhanced design creativity, indicating that design can be of a domain-specific nature. Design creativity added integrated value to products and services and positively influenced customer behavior. The validity and value of the proposed conceptual model is supported by empirical findings. The conceptual model of creative design could underpin future theory development. Propositions advanced in this article should provide insights and approaches to facilitate organizations pursuing product and service creativity to gain competitive advantage.
Arabidopsis non-host resistance to powdery mildews.
Lipka, Ulrike; Fuchs, Rene; Lipka, Volker
2008-08-01
Immunity of an entire plant species against all genetic variants of a particular parasite is referred to as non-host resistance. Although non-host resistance represents the most common and durable form of plant resistance in nature, it has thus far been poorly understood at the molecular level. Recently, novel model systems have established the first mechanistic insights. The genetic dissection of Arabidopsis non-host resistance to non-adapted biotrophic powdery mildew fungi provided evidence for functionally redundant but operationally distinct pre- and post-invasion immune responses. Conceptually, these complex and successive defence mechanisms explain the durable and robust nature of non-host resistance. Pathogen lifestyle and infection biology, ecological parameters and the evolutionary relationship of the interaction partners determine differences and commonalities in other model systems.
Large-scale mapping of hard-rock aquifer properties applied to Burkina Faso.
Courtois, Nathalie; Lachassagne, Patrick; Wyns, Robert; Blanchin, Raymonde; Bougaïré, Francis D; Somé, Sylvain; Tapsoba, Aïssata
2010-01-01
A country-scale (1:1,000,000) methodology has been developed for hydrogeologic mapping of hard-rock aquifers (granitic and metamorphic rocks) of the type that underlie a large part of the African continent. The method is based on quantifying the "useful thickness" and hydrodynamic properties of such aquifers and uses a recent conceptual model developed for this hydrogeologic context. This model links hydrodynamic parameters (transmissivity, storativity) to lithology and the geometry of the various layers constituting a weathering profile. The country-scale hydrogeological mapping was implemented in Burkina Faso, where a recent 1:1,000,000-scale digital geological map and a database of some 16,000 water wells were used to evaluate the methodology.
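One simple reading of such a conceptual model is that transmissivity integrates hydraulic conductivity over the saturated thickness of each weathering-profile layer. The Python sketch below illustrates that idea; the layer names, conductivities, and thicknesses are illustrative placeholders, not values from the Burkina Faso mapping.

```python
# Hypothetical layer properties for a granite weathering profile (illustrative)
layers = [
    {"name": "saprolite",      "K_m_per_s": 1e-6, "thickness_m": 15.0},
    {"name": "fissured_layer", "K_m_per_s": 5e-5, "thickness_m": 25.0},
]

def transmissivity(layers, water_table_depth_m, profile_top_m=0.0):
    """T = sum(K_i * saturated thickness of layer i), integrating downward."""
    T, top = 0.0, profile_top_m
    for lyr in layers:
        bottom = top + lyr["thickness_m"]
        saturated = max(0.0, bottom - max(top, water_table_depth_m))
        T += lyr["K_m_per_s"] * saturated
        top = bottom
    return T

print(transmissivity(layers, water_table_depth_m=10.0))  # m^2/s
```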
An Integrative-Interactive Conceptual Model for Curriculum Development.
ERIC Educational Resources Information Center
Al-Ibrahim, Abdul Rahman H.
1982-01-01
The Integrative-Interactive Conceptual Model for Curriculum Development calls for curriculum reform and innovation to be cybernetic so that all aspects of curriculum planning get adequate attention. (CJ)
Evaluating the Functionality of Conceptual Models
NASA Astrophysics Data System (ADS)
Mehmood, Kashif; Cherfi, Samira Si-Said
Conceptual models serve as the blueprints of information systems, and their quality plays a decisive role in the success of the end system. The majority of IS change requests have been observed to result from deficient functionalities in information systems. Therefore, a good analysis and design method should ensure that conceptual models are functionally correct and complete, as they are the communicating mediator between the users and the development team. A conceptual model is said to be functionally complete if it represents all the relevant features of the application domain and covers all the specified requirements. Our approach evaluates the functional aspects at multiple levels of granularity, in addition to providing corrective actions or transformations for improvement. This approach has been empirically validated by practitioners through a survey.
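As a toy illustration of a functional completeness check at one level of granularity, the Python sketch below scores a model by the share of specified requirements it covers and returns the gaps as candidate corrective actions; the metric and the requirement names are hypothetical, not the authors' actual measures.

```python
def functional_completeness(requirements, covered_by_model):
    """Share of specified requirements represented in the conceptual model;
    the uncovered requirements are returned as corrective-action candidates."""
    missing = sorted(r for r in requirements if r not in covered_by_model)
    score = 1.0 - len(missing) / len(requirements)
    return score, missing

reqs = {"register_user", "place_order", "cancel_order", "refund"}
covered = {"register_user", "place_order"}
score, gaps = functional_completeness(reqs, covered)
print(f"completeness = {score:.0%}; add model elements for: {gaps}")
```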
Conceptual Model of Iodine Behavior in the Subsurface at the Hanford Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Truex, Michael J.; Lee, Brady D.; Johnson, Christian D.
The fate and transport of 129I in the environment and potential remediation technologies are currently being studied as part of environmental remediation activities at the Hanford Site. A conceptual model describing the nature and extent of subsurface contamination, factors that control plume behavior, and factors relevant to potential remediation processes is needed to support environmental remedy decisions. Because 129I is an uncommon contaminant, relevant remediation experience and scientific literature are limited. Thus, the conceptual model also needs to describe known contaminant and biogeochemical process information and to identify aspects about which additional information is needed to effectively support remedy decisions. This document summarizes the conceptual model of iodine behavior relevant to iodine in the subsurface environment at the Hanford Site.
Zarrinabadi, Zarrin; Isfandyari-Moghaddam, Alireza; Erfani, Nasrolah; Tahour Soltani, Mohsen Ahmadi
2018-01-01
INTRODUCTION: Given the research mission of the librarianship and information sciences field, students must be able to communicate constructively between information users and information resources; this is even more important in medical librarianship and information sciences because clinicians need quick access to information. The role of spiritual intelligence in the capability to establish effective and balanced communication makes it important to study this variable in librarianship and information students. One of the main factors that can affect the results of any research is the conceptual model used to measure its variables. Accordingly, the purpose of this study was the codification of a spiritual intelligence measurement model. METHODS: This correlational study was conducted using structural equation modeling; 270 students were selected from library and medical information students of nationwide medical universities by simple random sampling and responded to the King spiritual intelligence questionnaire (2008). Initially, based on the data, the model parameters were estimated using the maximum likelihood method; then, the spiritual intelligence measurement model was tested by fit indices. Data analysis was performed with Smart-Partial Least Squares software. RESULTS: Preliminary results showed that, given the positive indicators of predictive association and the t-test results for the spiritual intelligence parameters, the King measurement model has an acceptable fit, and the internal correlation of the questionnaire items was significant. Composite reliability and Cronbach's alpha of the parameters indicated high reliability of the spiritual intelligence model. CONCLUSIONS: The spiritual intelligence measurement model was evaluated, and the results showed that the model has a good fit; it is therefore recommended that domestic researchers use this questionnaire to assess spiritual intelligence. PMID:29922688
A conceptual network model of the air transportation system: the basic level 1 model.
DOT National Transportation Integrated Search
1971-04-01
A basic conceptual model of the entire Air Transportation System is being developed to serve as an analytical tool for studying the interactions among the system elements. The model is being designed to function in an interactive computer graphics en...
NASA Astrophysics Data System (ADS)
Kumar, J.; Jain, A.; Srivastava, R.
2005-12-01
The identification of pollution sources in aquifers is an important area of research, not only for hydrologists but also for local and Federal agencies and defense organizations. Once data in the form of pollutant concentration measurements at observation wells become available, it is important to identify the polluting industry in order to implement punitive or remedial measures. Traditionally, hydrologists have relied on conceptual methods for the identification of groundwater pollution sources. Identifying groundwater pollution sources with conceptual methods requires a thorough understanding of groundwater flow and contaminant transport processes and inverse modeling procedures that are highly complex and difficult to implement. Recently, soft computing techniques, such as artificial neural networks (ANNs) and genetic algorithms, have provided an attractive and easy-to-implement alternative for solving complex problems efficiently. Some researchers have used ANNs for the identification of pollution sources in aquifers. A major problem with most previous studies using ANNs has been the large size of the neural networks needed to model the inverse problem. The breakthrough curve at an observation well may consist of hundreds of concentration measurements, and presenting all of them to the input layer of an ANN not only results in very large networks but also requires large amounts of training and testing data to develop the ANN models. This paper presents the results of a study aimed at using certain characteristics of the breakthrough curves and ANNs to determine the distance of the pollution source from a given observation well. Two different neural network models are developed that differ in the manner of characterizing the breakthrough curves. The first ANN model uses five parameters, similar to synthetic unit hydrograph parameters, to characterize the breakthrough curves: the peak concentration, the time to peak concentration, the widths of the breakthrough curve at 50% and 75% of the peak concentration, and the time base of the breakthrough curve. The second ANN model employs only the first four parameters, leaving out the time base. The measurement of a breakthrough curve at an observation well involves very high costs in sample collection at suitable time intervals and analysis for various contaminants. The receding portions of breakthrough curves are normally very long, and excluding the time base from modeling would result in considerable cost savings. Feed-forward multi-layer perceptron (MLP) neural networks, trained using the back-propagation algorithm, are employed in this study. The ANN models for the two approaches were developed using simulated data generated for conservative pollutant transport through a homogeneous aquifer. A new approach to ANN training with back-propagation is employed that considers two different error statistics to prevent over-training and under-training of the ANNs. The preliminary results indicate that the ANNs are able to identify the location of the pollution source very efficiently from both methods of breakthrough curve characterization.
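A minimal sketch of the first approach is given below in Python, assuming scikit-learn is available: it extracts the five curve descriptors and trains a small MLP on synthetic advection-dispersion-like curves. The curve generator, network size, and all values are illustrative assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def characterize(t, c):
    """Five breakthrough-curve descriptors used as ANN inputs."""
    ipk = np.argmax(c)
    cpk, tpk = c[ipk], t[ipk]
    def width(frac):
        above = t[c >= frac * cpk]
        return above[-1] - above[0]
    base = t[c > 0.01 * cpk]
    return [cpk, tpk, width(0.5), width(0.75), base[-1] - base[0]]

# Synthetic training set: 1-D transport-like curves for sources at varying
# distances x from the well (illustrative, not site data).
rng = np.random.default_rng(0)
t = np.linspace(1, 400, 800)
X, y = [], []
for _ in range(300):
    x = rng.uniform(50, 500)                 # source-to-well distance (m)
    v, D = 1.0, rng.uniform(5, 20)           # velocity, dispersion (assumed)
    c = (x / np.sqrt(4 * np.pi * D * t**3)) * np.exp(-(x - v * t)**2 / (4 * D * t))
    X.append(characterize(t, c))
    y.append(x)

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0)
model.fit(np.array(X), np.array(y))

# Unseen curve from a source ~200 m away (v=1, D=10)
c_test = (200 / np.sqrt(4 * np.pi * 10 * t**3)) * np.exp(-(200 - t)**2 / (40 * t))
print("estimated distance:", model.predict([characterize(t, c_test)]))  # ~200 m
```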
DOE Office of Scientific and Technical Information (OSTI.GOV)
M.A. Wasiolek
Inhalation exposure pathway modeling has recently been investigated as one of the tasks of the BIOPROTA Project (BIOPROTA 2005). BIOPROTA was set up to address the key uncertainties in long-term assessments of contaminant releases into the environment arising from radioactive waste disposal. Participants of this international project include national authorities and agencies, both regulators and operators, with responsibility for achieving safe and acceptable radioactive waste management. The objective of the inhalation task was to investigate the calculation of doses arising from inhalation of particles suspended from soils within which long-lived radionuclides, particularly alpha emitters, had accumulated. It was recognized that site-specific conditions influence the choice of conceptual model and input parameter values. Therefore, one of the goals of the task was to identify the circumstances in which different processes included in specific inhalation exposure pathway models were important. This paper discusses the evaluation of processes and modeling assumptions specific to the proposed repository at Yucca Mountain as compared with the typical approaches and other models developed for different assessments and project-specific contexts. Inhalation of suspended particulates that originate from contaminated soil is an important exposure pathway, particularly for exposure to actinides such as uranium, neptunium and plutonium. Radionuclide accumulation in surface soil arises from irrigation of soil with contaminated water over many years. The level of radionuclide concentration in surface soil depends on the assumed duration of irrigation. Irrigation duration is one of the parameters used in biosphere models, and it depends on the specific assessment context. It is one of the parameters addressed in this paper from the point of view of the assessment context for the proposed repository at Yucca Mountain. The preferred model for the assessment of inhalation exposure uses an atmospheric mass loading approach, which is based on the mass of airborne particulates per unit volume of air inhaled by the receptor. This type of model was used by the majority of the BIOPROTA inhalation task participants and is also used in the Yucca Mountain model. Although the mass loading model is conceptually straightforward, there are some considerations that need to be included when using it. Small particles have a larger surface-to-volume ratio than large particles, and this ratio increases in inverse proportion to particle size. This is particularly important for elements such as plutonium, which have high sorption coefficients and thus attach preferentially to small soil particles. Suspended particulates originating from soil are composed of particles smaller than average soil particles and thus, on average, have a larger available surface area, and consequently activity, per unit mass than the underlying soil. The increase in radionuclide concentration of suspended particulates compared with that of the underlying soil is quantified in terms of the enhancement factor, which is included in the inhalation model for the Yucca Mountain repository. In this paper, the use of the enhancement factor in inhalation exposure models is discussed.
The enhancement factor values used in the Yucca Mountain model are then discussed from the perspective of site-specific conditions, as well as the microenvironmental approach to modeling inhalation exposure of the receptor: the receptor can spend specified time in several environments, each characterized by an occupancy time, suspended particulate level, enhancement factor, and breathing rate. The environment where inhalation exposure is highest is associated with the receptor being active outdoors and involved in activities that generate high levels of dust by using farm equipment, walking, or conducting other outdoor activities. In summary, it is important to recognize that site-specific conditions play an important role in constructing conceptual and mathematical models of inhalation exposure.
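A minimal sketch of a mass-loading dose calculation with an enhancement factor and microenvironments might look as follows in Python; every number here (soil concentration, dose coefficient, occupancies, mass loadings) is an illustrative placeholder, not a Yucca Mountain assessment value.

```python
# Illustrative mass-loading inhalation dose: D = Cs * EF * ML * BR * t * DC,
# summed over microenvironments (all values hypothetical).
soil_conc = 1.0          # Bq/g of soil (assumed)
dose_coeff = 5e-5        # Sv per Bq inhaled (illustrative, not an ICRP value)

environments = [
    # (name, hours/yr, mass loading g/m^3, enhancement factor, breathing m^3/h)
    ("active outdoors",   400,  1e-3, 4.0, 1.5),
    ("inactive outdoors", 1000, 1e-4, 2.0, 1.0),
    ("indoors",           7360, 3e-5, 1.0, 0.8),
]

annual_dose = sum(
    soil_conc * ef * ml * br * hours * dose_coeff
    for _, hours, ml, ef, br in environments
)
print(f"annual inhalation dose ~ {annual_dose:.2e} Sv")
```

The dusty "active outdoors" environment dominates the sum even with few hours, which is the qualitative point the abstract makes.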
Improving Conceptual Models Using AEM Data and Probability Distributions
NASA Astrophysics Data System (ADS)
Davis, A. C.; Munday, T. J.; Christensen, N. B.
2012-12-01
With emphasis being placed on uncertainty in groundwater modelling and prediction, coupled with questions concerning the value of geophysical methods in hydrogeology, it is important to ask meaningful questions of hydrogeophysical data and inversion results. For example, to characterise aquifers using electromagnetic (EM) data, we ask questions such as "Given that the electrical conductivity of aquifer 'A' is less than x, where is that aquifer elsewhere in the survey area?" The answer may be given by examining inversion models, selecting locations and layers that satisfy the condition 'conductivity <= x', and labelling them as aquifer 'A'. One difficulty with this approach is that the inversion model result is often considered to be the only model for the data. In reality it is just one image of the subsurface that, given the method and the regularisation imposed in the inversion, agrees with the measured data within a given error bound. We have no idea whether the final model realised by the inversion satisfies the global minimum error, or whether it is simply in a local minimum. There is a distribution of inversion models that satisfy the error tolerance condition: the final model is not the only one, nor is it necessarily the correct one. AEM inversions are often linearised in the calculation of the parameter sensitivity: we rely on the second derivatives in the Taylor expansion, so the minimum model has all layer parameters distributed about their mean value with well-defined variance. We investigate the validity of the minimum model, and its uncertainty, by examining the full posterior covariance matrix. We ask questions of the minimum model and answer them probabilistically. The simplest question we can pose is "What is the probability that all layer resistivity values are <= a cut-off value?", which we can calculate using the erf or erfc functions. The covariance values of the inversion are marginalised in the integration: only the main diagonal is used. Complications arise when we ask more specific questions, such as "What is the probability that the resistivity of layer 2 is <= x, given that layer 1 is <= y?" The probability then becomes conditional, the calculation includes covariance terms, the integration is taken over many dimensions, and the cross-correlation of parameters becomes important. To illustrate, we examine the inversion results of a Tempest AEM survey over the Uley Basin aquifers in the Eyre Peninsula, South Australia. Key aquifers include the unconfined Bridgewater Formation, which overlies the Uley and Wanilla Formations, containing Tertiary clays and Tertiary sandstone. These formations overlie weathered basement, which defines the lower bound of the Uley Basin aquifer systems. By correlating the conductivity of the subsurface formation types, we pose questions such as: "What is the probable depth of the Bridgewater Formation in the Uley South Basin?", "What is the thickness of the Uley Formation?" and "What is the most probable depth to basement?" We use these questions to generate improved conceptual hydrogeological models of the Uley Basin in order to develop better estimates of aquifer extent and the available groundwater resource.
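Under the linearised (multivariate normal) posterior described above, these probabilities can be computed directly; the Python sketch below, assuming SciPy, contrasts the marginal calculation (diagonal only) with the joint and conditional calculations that include covariance terms. The means, covariance, and cutoff are invented for illustration.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

# Hypothetical posterior for log10-resistivity of two layers (from an inversion)
mu = np.array([1.2, 1.8])                 # means, log10(ohm.m)
cov = np.array([[0.04, 0.03],
                [0.03, 0.09]])            # posterior covariance (correlated layers)
cut = np.log10(30.0)                      # cutoff resistivity of 30 ohm.m

# Marginal probabilities: only the diagonal of the covariance is used
p1 = norm.cdf(cut, mu[0], np.sqrt(cov[0, 0]))
p2 = norm.cdf(cut, mu[1], np.sqrt(cov[1, 1]))

# Joint probability P(rho1 <= cut AND rho2 <= cut): covariance term included
p_joint = multivariate_normal(mu, cov).cdf([cut, cut])

# Conditional probability P(rho2 <= cut | rho1 <= cut)
p_cond = p_joint / p1
print(p1, p2, p_joint, p_cond)
```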
NASA Astrophysics Data System (ADS)
Bizzi, S.; Surridge, B.; Lerner, D. N.
2009-04-01
River ecosystems represent complex networks of interacting biological, chemical and geomorphological processes. These processes generate spatial and temporal patterns in biological, chemical and geomorphological variables, and a growing number of these variables are now being used to characterise the status of rivers. However, integrated analyses of these biological-chemical-geomorphological networks have rarely been undertaken, and as a result our knowledge of the underlying processes and how they generate the resulting patterns remains weak. The apparent complexity of the networks involved, and the lack of coherent datasets, represent two key challenges to such analyses. In this paper we describe the application of a novel technique, Structural Equation Modelling (SEM), to the investigation of biological, chemical and geomorphological data collected from rivers across England and Wales. The SEM approach is a multivariate statistical technique enabling simultaneous examination of direct and indirect relationships across a network of variables. Further, SEM allows a priori conceptual or theoretical models to be tested against available data. This is a significant departure from the solely exploratory analyses that characterise other multivariate techniques. We took biological, chemical and river habitat survey data collected by the Environment Agency for 400 sites in rivers spread across England and Wales, and created a single, coherent dataset suitable for SEM analyses. Biological data cover benthic macroinvertebrates, chemical data relate to a range of standard parameters (e.g. BOD, dissolved oxygen and phosphate concentration), and geomorphological data cover factors such as river typology, substrate material and degree of physical modification. We developed a number of a priori conceptual models, reflecting current research questions or existing knowledge, and tested the ability of these conceptual models to explain the variance and covariance within the dataset. The conceptual models we developed were able to correctly explain the variance and covariance shown by the datasets, proving to be a relevant representation of the processes involved. The models explained 65% of the variance in indices describing benthic macroinvertebrate communities. Dissolved oxygen was of primary importance, but geomorphological factors, including river habitat type and degree of habitat degradation, also had significant explanatory power. The addition of spatial variables, such as latitude or longitude, did not provide additional explanatory power. This suggests that the variables already included in the models effectively represented the eco-regions across which our data were distributed. The models produced new insights into the relative importance of chemical and geomorphological factors for river macroinvertebrate communities. The SEM technique proved a powerful tool for exploring complex biological-chemical-geomorphological networks, for example in dealing with the co-correlations that are common in rivers due to multiple feedback mechanisms.
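SEM software estimates such direct and indirect paths simultaneously; as a stripped-down illustration of the idea, the Python sketch below fits a two-equation path model by ordinary least squares on synthetic data. The variable names and coefficients are hypothetical, not the study's Environment Agency data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
# Synthetic site data (names hypothetical): habitat quality affects dissolved
# oxygen, and both affect a macroinvertebrate index.
habitat = rng.normal(size=n)
do = 0.6 * habitat + rng.normal(scale=0.8, size=n)
invert = 0.5 * do + 0.3 * habitat + rng.normal(scale=0.7, size=n)

def ols(y, *xs):
    """Least-squares slopes of y on the given predictors (intercept dropped)."""
    X = np.column_stack([np.ones_like(y), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

(b_do_hab,) = ols(do, habitat)                  # habitat -> DO path
b_inv_do, b_inv_hab = ols(invert, do, habitat)  # DO -> index, habitat -> index

direct = b_inv_hab
indirect = b_do_hab * b_inv_do                  # product of paths, as in SEM
print(f"direct {direct:.2f}, indirect {indirect:.2f}, total {direct + indirect:.2f}")
```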
NASA Astrophysics Data System (ADS)
Sivapalan, Murugesu; Viney, Neil R.; Jeevaraj, Charles G.
1996-03-01
This paper presents an application of a long-term, large catchment-scale water balance model developed to predict the effects of forest clearing in the south-west of Western Australia. The conceptual model simulates the basic daily water balance fluxes in forested catchments before and after clearing. The large catchment is divided into a number of sub-catchments (1-5 km2 in area), which are taken as the fundamental building blocks of the large catchment model. The responses of the individual subcatchments to rainfall and pan evaporation are conceptualized in terms of three inter-dependent subsurface stores A, B and F, which are considered to represent the moisture states of the subcatchments. Details of the subcatchment-scale water balance model have been presented earlier in Part 1 of this series of papers. The response of any subcatchment is a function of its local moisture state, as measured by the local values of the stores. The variations of the initial values of the stores among the subcatchments are described in the large catchment model through simple, linear equations involving a number of similarity indices representing topography, mean annual rainfall and level of forest clearing. The model is applied to the Conjurunup catchment, a medium-sized (39.6 km2) catchment in the south-west of Western Australia. The catchment has been heterogeneously (in space and time) cleared for bauxite mining and subsequently rehabilitated. For this application, the catchment is divided into 11 subcatchments. The model parameters are estimated by calibration, comparing observed and predicted runoff values over an 18-year period for the large catchment and two of the subcatchments. Excellent fits are obtained.
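A much-simplified store-based daily water balance of this general flavour can be written in a few lines; the Python sketch below uses two interdependent stores with hypothetical parameters and is not the paper's actual A/B/F formulation.

```python
# Toy daily water balance with two conceptual stores (parameters hypothetical).
def step(S_shallow, S_deep, rain, pan_evap,
         k_drain=0.05, k_base=0.01, et_frac=0.7, S_max=200.0):
    et = min(S_shallow, et_frac * pan_evap)      # ET limited by available storage
    S_shallow += rain - et
    excess = max(0.0, S_shallow - S_max)         # saturation excess -> quickflow
    S_shallow -= excess
    drain = k_drain * S_shallow                  # recharge to the deep store
    S_shallow -= drain
    S_deep += drain
    baseflow = k_base * S_deep
    S_deep -= baseflow
    return S_shallow, S_deep, excess + baseflow  # storages and streamflow

Ss, Sd = 100.0, 500.0                            # initial states (mm), assumed
for rain, pe in [(0, 5), (30, 2), (80, 1), (0, 6)]:
    Ss, Sd, q = step(Ss, Sd, rain, pe)
    print(f"storages ({Ss:.1f}, {Sd:.1f}) mm, flow {q:.2f} mm/day")
```

In the paper's framework, the initial store values would additionally be set from the similarity indices (topography, rainfall, clearing level) for each subcatchment.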
Conceptual model for partnership and sustainability in global health.
Leffers, Jeanne; Mitchell, Emma
2011-01-01
Although nursing has a long history of service to the global community, the profession lacks a theoretical and empirical base for nurses to frame their global practice. A study using grounded theory methodology to investigate partnership and sustainability for global health led to the development of a conceptual model. Interviews were conducted with 13 global health nurse experts. Themes from the interviews were: components for engagement, mutual goal setting, cultural bridging, collaboration, capacity building, leadership, partnership, ownership, and sustainability. Next, the identified themes were reviewed in the literature in order to evaluate their conceptual relationships. Finally, careful comparison of the interview transcripts and the supporting literature led to the Conceptual Framework for Partnership and Sustainability in Global Health Nursing. The model posits that engagement and partnership must precede any planning and intervention in order to create sustainable interventions. This conceptual framework will offer nurses important guidance for global health nursing practice. © 2010 Wiley Periodicals, Inc.
Lung Cancer Screening Participation: Developing a Conceptual Model to Guide Research
Carter-Harris, Lisa; Davis, Lorie L.; Rawl, Susan M.
2017-01-01
Purpose To describe the development of a conceptual model to guide research focused on lung cancer screening participation from the perspective of the individual in the decision-making process. Methods Based on a comprehensive review of empirical and theoretical literature, a conceptual model was developed linking key psychological variables (stigma, medical mistrust, fatalism, worry, and fear) to the health belief model and precaution adoption process model. Results Proposed model concepts have been examined in prior research of either lung or other cancer screening behavior. To date, a few studies have explored a limited number of variables that influence screening behavior in lung cancer specifically. Therefore, relationships among concepts in the model have been proposed and future research directions presented. Conclusion This proposed model is an initial step to support theoretically based research. As lung cancer screening becomes more widely implemented, it is critical to theoretically guide research to understand variables that may be associated with lung cancer screening participation. Findings from future research guided by the proposed conceptual model can be used to refine the model and inform tailored intervention development. PMID:28304262
PDES Logical Layer Initiation Task.
1986-04-28
… surface. We have heard such expressions as "topology sits on top of geometry." We choose to avoid subordinating one to the other by bringing them together… a mapping from Discipline model to Global model. We have attempted to group based on the… PHASE 2: Conceptualization and Integration. In this phase, conceptual entities and relationships are discovered. An integrated conceptual model…
Nagy, Balázs; Setyawan, Juliana; Coghill, David; Soroncz-Szabó, Tamás; Kaló, Zoltán; Doshi, Jalpa A
2017-06-01
Models incorporating long-term outcomes (LTOs) are not available to assess the health economic impact of attention-deficit/hyperactivity disorder (ADHD). The objective was to develop a conceptual modelling framework capable of assessing the long-term economic impact of ADHD therapies. The literature was reviewed, and a conceptual structure for the long-term model was outlined with attention to disease characteristics and the potential impact of treatment strategies. The proposed model has four layers: i) a multi-state short-term framework to differentiate between ADHD treatments; ii) the multiple states merged into three core health states associated with LTOs; iii) a series of sub-models in which particular LTOs are depicted; and iv) outcomes collected to be either used directly for economic analyses or translated into other relevant measures. This conceptual model provides a framework to assess relationships between short- and long-term outcomes of the disease and its treatment, and to estimate the economic impact of ADHD treatments throughout the course of the disease.
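Layers i) and ii) amount to a multi-state cohort structure; a minimal Markov-style sketch in Python is shown below. The three states, transition probabilities, and costs are invented placeholders, not the framework's calibrated inputs.

```python
import numpy as np

# Hypothetical 3-state Markov cohort model (states and values illustrative only)
states = ["controlled", "symptomatic", "off_treatment"]
P = np.array([[0.80, 0.15, 0.05],
              [0.30, 0.60, 0.10],
              [0.05, 0.15, 0.80]])              # annual transition probabilities
annual_cost = np.array([800.0, 2500.0, 1200.0])  # per-state cost (assumed)

cohort = np.array([1.0, 0.0, 0.0])  # everyone starts in the controlled state
total_cost = 0.0
for year in range(10):
    total_cost += cohort @ annual_cost
    cohort = cohort @ P              # advance the cohort one annual cycle
print(f"10-year expected cost per patient: {total_cost:.0f}")
```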
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thorne, P.D.; Chamness, M.A.; Vermeul, V.R.
This report documents work conducted during fiscal year 1994 to develop an improved three-dimensional conceptual model of ground-water flow in the unconfined aquifer system across the Hanford Site for the Ground-Water Surveillance Project, which is managed by Pacific Northwest Laboratory. The main objective of the ongoing effort to develop an improved conceptual model of ground-water flow is to provide the basis for improved numerical models that will be capable of accurately predicting the movement of radioactive and chemical contaminant plumes in the aquifer beneath Hanford. More accurate ground-water flow models will also be useful in assessing the impacts of changes in facilities and operations. For example, decreasing volumes of operational waste-water discharge are resulting in a declining water table in parts of the unconfined aquifer. In addition to supporting numerical modeling, the conceptual model also provides a qualitative understanding of the movement of ground water and contaminants in the aquifer.
Walsh, M M; Darby, M
1993-01-01
In summary, the theories of Maslow and of Yura and Walsh have been highlighted as background for understanding the human needs conceptual model of dental hygiene. In addition, 11 human needs have been identified and defined as being especially related to dental hygiene care, and a sample evaluation tool for their clinical assessment and a dental hygiene care plan have been presented. The four concepts of client, environment, health/oral health, and dental hygiene actions, explained in terms of human need theory, together with the 11 human needs related to dental hygiene care, constitute the human needs conceptual model of dental hygiene. Within the framework of the human needs conceptual model of dental hygiene, the dental hygiene process is a systematic approach to dental hygiene care that involves assessment of the 11 human needs related to dental hygiene care; analysis of deficits in these needs; determination of the dental hygiene care plan based on identified deficits; implementation of dental hygiene interventions stated in the care plan; and evaluation of the effectiveness of dental hygiene interventions in achieving specific goals, including subsequent reassessment and revision of the dental hygiene care plan. This human needs conceptual model for dental hygiene provides a guide for comprehensive and humanistic client care. This model allows the dental hygienist to view each client (whether an individual or a group) holistically to prevent oral disease and to promote health and wellness. Dental hygiene theorists are encouraged to expand this model or to develop additional conceptual models based on dental hygiene's paradigm.
The Site-Scale Saturated Zone Flow Model for Yucca Mountain
NASA Astrophysics Data System (ADS)
Al-Aziz, E.; James, S. C.; Arnold, B. W.; Zyvoloski, G. A.
2006-12-01
This presentation provides a reinterpreted conceptual model of the Yucca Mountain site-scale flow system, subject to all quality assurance procedures. The results are based on a numerical model of the site-scale saturated zone beneath Yucca Mountain, which is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. This effort started from the ground up with a revised and updated hydrogeologic framework model, which incorporates the latest lithology data, and increased grid resolution that better resolves the hydrogeologic framework throughout the model domain. In addition, faults are much better represented using the 250 × 250 m2 spacing (compared with the previous model's 500 × 500 m2 spacing). Data collected since the previous model calibration effort have been included; they comprise all Nye County water-level data through Phase IV of their Early Warning Drilling Program. Target boundary fluxes are derived from the newest (2004) Death Valley Regional Flow System model from the U.S. Geological Survey. A consistent weighting scheme assigns importance to each measured water-level datum and boundary flux extracted from the regional model. The numerical model is calibrated by matching these weighted water-level measurements and boundary fluxes using parameter estimation techniques, along with more informal comparisons of the model to hydrologic and geochemical information. The model software (hydrologic simulation code FEHM v2.24 and parameter estimation software PEST v5.5) and model setup facilitate efficient calibration of multiple conceptual models. Analyses evaluate the impact of these updates and additional data on the modeled potentiometric surface and the flowpaths emanating from below the repository. After examining the heads and permeabilities obtained from the calibrated models, we present particle pathways from the proposed repository and compare them to those from the previous model calibration. Specific discharge at a point 5 km from the repository is also examined and found to be within acceptable uncertainty. The results show that the updated model yields a calibration with smaller residuals than the previous model revision while ensuring that flowpaths follow measured gradients and paths derived from hydrochemical analyses. This work was supported by the Yucca Mountain Site Characterization Office as part of the Civilian Radioactive Waste Management Program, which is managed by the U.S. Department of Energy, Yucca Mountain Site Characterization Project. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
Modeling in the Classroom: An Evolving Learning Tool
NASA Astrophysics Data System (ADS)
Few, A. A.; Marlino, M. R.; Low, R.
2006-12-01
Among the early programs (early 1990s) focused on teaching Earth System Science were the Global Change Instruction Program (GCIP) funded by NSF through UCAR and the Earth System Science Education Program (ESSE) funded by NASA through USRA. These two programs introduced modeling as a learning tool from the beginning, and they provided workshops, demonstrations and lectures for their participating universities. These programs were aimed at university-level education. Recently, classroom modeling is experiencing a revival of interest. Drs. John Snow and Arthur Few conducted two workshops on modeling at the ESSE21 meeting in Fairbanks, Alaska, in August 2005. The Digital Library for Earth System Education (DLESE) at http://www.dlese.org provides web access to STELLA models and tutorials, and UCAR's Education and Outreach (EO) program holds workshops that include training in modeling. An important innovation to the STELLA modeling software by isee systems, http://www.iseesystems.com, called "isee Player" is available as a free download. The Player allows users to view and run STELLA models, change model parameters, share models with colleagues and students, and make working models available on the web. This is important because the expert can create models, and the user can learn how the modeled system works. Another aspect of this innovation is that the educational benefits of modeling concepts can be extended throughout most of the curriculum. The procedure for building a working computer model of an Earth science system follows this general format: (1) carefully define the question(s) for which you seek the answer(s); (2) identify the interacting system components and inputs contributing to the system's behavior; (3) collect the information and data that will be required to complete the conceptual model; (4) construct a system diagram (graphic) that displays all of the system's central questions, components, relationships and required inputs. At this stage in the process the conceptual model of the system is complete and a clear understanding of how the system works is achieved. When appropriate software is available, advanced classes can proceed to (5) creating a computer model of the system and testing the conceptual model. Classes lacking these advanced capabilities may view and run models using the free isee Player and shared working models. In any event there is understanding to be gained in every step of the procedure outlined above. You can view some examples at http://www.ruf.rice.edu/~few/. We plan to populate this site with samples of Earth science systems for use in Earth system science education.
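Step (5) is essentially numerical integration of a stock-and-flow diagram. The Python sketch below shows the idea for an assumed example, a lake whose outflow is proportional to its volume; STELLA performs the same Euler-type stepping graphically.

```python
# Minimal stock-and-flow model of the kind step (5) produces (example assumed:
# a lake whose outflow is proportional to its volume).
dt, years = 0.1, 20
volume = 50.0            # stock: lake volume (km^3), assumed initial value
inflow = 5.0             # flow: river input (km^3/yr), assumed constant
k_out = 0.08             # outflow coefficient (1/yr), assumed

t = 0.0
while t < years:
    outflow = k_out * volume           # the flow depends on the stock
    volume += (inflow - outflow) * dt  # Euler integration step
    t += dt
print(f"volume = {volume:.1f} km^3 (steady state {inflow / k_out:.1f})")
```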
A Conceptual Model To Assist Educational Leaders Manage Change.
ERIC Educational Resources Information Center
Cochren, John R.
This paper presents a conceptual model to help school leaders manage change effectively. The model was developed from a literature review of theory development and model construction. Specifically, the paper identifies the major components that inhibit organizational change, and synthesizes the most salient features of these components through a…
Conceptual Modeling Framework for E-Area PA HELP Infiltration Model Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dyer, J. A.
A conceptual modeling framework based on the proposed E-Area Low-Level Waste Facility (LLWF) closure cap design is presented for conducting Hydrologic Evaluation of Landfill Performance (HELP) model simulations of intact and subsided cap infiltration scenarios for the next E-Area Performance Assessment (PA).
NASA Astrophysics Data System (ADS)
Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara
2015-09-01
Simulation of the rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large-scale climate signals have proven effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps: rainfall forecasting, rainfall-runoff simulation, and future runoff prediction. In the first step, data-driven models are developed and used to forecast rainfall using large-scale climate signals as rainfall predictors. Because different sources of uncertainty strongly affect the output of hydrologic models, in the second step uncertainty associated with input data, model parameters, and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real-time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by alternative rainfall-runoff models are combined through a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov Chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven with the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of the uncertainty analysis of rainfall-runoff modeling reveal that simultaneous estimation of model parameter and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the watershed is improved by up to 50% compared with the simulations by the individual models. Results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also adequate time for incorporating required mitigation measures in dealing with potentially extreme runoff events and flood hazard. Results of this study can be used in identifying the main factors affecting flood hazard analysis.
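The clustering-based combination step can be sketched as follows, assuming scikit-learn: time steps are clustered by flow regime, and each model is weighted by its within-cluster skill. The inverse-RMSE weighting rule and all data here are illustrative guesses at the method, since the abstract does not give the exact scheme.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical flows: observations and three rainfall-runoff model outputs
rng = np.random.default_rng(2)
obs = 10 + 5 * np.abs(np.sin(np.linspace(0, 6, 200))) + rng.normal(0, 0.5, 200)
sims = np.stack([obs + rng.normal(0, s, 200) for s in (0.8, 1.5, 3.0)])

# Cluster time steps by flow regime (e.g., low / medium / high flow)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    obs.reshape(-1, 1))

combined = np.zeros_like(obs)
for c in np.unique(labels):
    idx = labels == c
    rmse = np.sqrt(((sims[:, idx] - obs[idx]) ** 2).mean(axis=1))
    w = (1 / rmse) / (1 / rmse).sum()     # inverse-RMSE weights per regime
    combined[idx] = w @ sims[:, idx]
print("combined RMSE:", np.sqrt(((combined - obs) ** 2).mean()))
```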
Conceptual Design of Electron-Beam Generated Plasma Tools
NASA Astrophysics Data System (ADS)
Agarwal, Ankur; Rauf, Shahid; Dorf, Leonid; Collins, Ken; Boris, David; Walton, Scott
2015-09-01
Realization of the next generation of high-density nanostructured devices is predicated on etching features with atomic-layer resolution, no damage, and high selectivity. High-energy electron beams generate plasmas with unique features that make them attractive for applications requiring monolayer precision. In these plasmas, high-energy beam electrons ionize the background gas, and the resultant daughter electrons cool to low temperatures via collisions with gas molecules in the absence of accelerating fields. For example, an electron temperature of <0.6 eV with densities comparable to conventional plasma sources can be obtained in molecular gases. The chemistry in such plasmas can differ significantly from RF plasmas, as the ions/radicals are produced primarily by beam electrons rather than by those in the tail of a low-energy distribution. In this work, we will discuss the conceptual design of an electron-beam-based plasma processing system. Plasma properties will be discussed for Ar, Ar/N2, and O2 plasmas using a computational plasma model, with comparisons made to experiments. The fluid plasma model is coupled to a Monte Carlo kinetic model for beam electrons, which considers gas-phase collisions and the effect of electric and magnetic fields on electron motion. The impact of critical operating parameters such as magnetic field, beam energy, and gas pressure on plasma characteristics in electron-beam plasma processing systems will be discussed. Partially supported by the NRL base program.
Conceptualizing a model: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force--2.
Roberts, Mark; Russell, Louise B; Paltiel, A David; Chambers, Michael; McEwan, Phil; Krahn, Murray
2012-01-01
The appropriate development of a model begins with understanding the problem that is being represented. The aim of this article was to provide a series of consensus-based best practices regarding the process of model conceptualization. For the purpose of this series of articles, we consider the development of models whose purpose is to inform medical decisions and health-related resource allocation questions. We specifically divide the conceptualization process into two distinct components: the conceptualization of the problem, which converts knowledge of the health care process or decision into a representation of the problem, followed by the conceptualization of the model itself, which matches the attributes and characteristics of a particular modeling type with the needs of the problem being represented. Recommendations are made regarding the structure of the modeling team, agreement on the statement of the problem, the structure, perspective, and target population of the model, and the interventions and outcomes represented. Best practices relating to the specific characteristics of model structure and which characteristics of the problem might be most easily represented in a specific modeling method are presented. Each section contains a number of recommendations that were iterated among the authors, as well as among the wider modeling taskforce, jointly set up by the International Society for Pharmacoeconomics and Outcomes Research and the Society for Medical Decision Making. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
An, Gary
2009-01-01
The sheer volume of biomedical research threatens to overwhelm the capacity of individuals to effectively process this information. Adding to this challenge is the multiscale nature of both biological systems and the research community as a whole. Given this volume and rate of generation of biomedical information, the research community must develop methods for robust representation of knowledge in order for individuals, and the community as a whole, to "know what they know." Despite increasing emphasis on "data-driven" research, the fact remains that researchers guide their research using intuitively constructed conceptual models derived from knowledge extracted from publications, knowledge that is generally expressed qualitatively using natural language. Agent-based modeling (ABM) is a computational modeling method that is suited to translating the knowledge expressed in biomedical texts into dynamic representations of the conceptual models generated by researchers. The hierarchical object-class orientation of ABM maps well to biomedical ontological structures, facilitating the translation of ontologies into instantiated models. Furthermore, ABM is suited to producing the nonintuitive behaviors that often "break" conceptual models. Verification in this context is focused on determining the plausibility of a particular conceptual model, and qualitative knowledge representation is often sufficient for this goal. Thus, utilized in this fashion, ABM can provide a powerful adjunct to other computational methods within the research process, as well as providing a metamodeling framework to enhance the evolution of biomedical ontologies.
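As a minimal illustration of translating a qualitative statement into an executable ABM, the Python sketch below encodes a hypothetical rule ("an activated cell activates each neighbour with probability p") for agents on a ring; the rule and parameters are invented, not drawn from any particular ontology.

```python
import random

# Toy ABM encoding one qualitative rule from a hypothetical conceptual model:
# an activated cell activates each of its two ring neighbours with probability p.
random.seed(0)
N, p, steps = 100, 0.3, 20
state = [False] * N
state[0] = True                       # one initially activated agent

for _ in range(steps):
    nxt = state[:]
    for i, active in enumerate(state):
        if active:
            for j in ((i - 1) % N, (i + 1) % N):   # ring neighbours
                if random.random() < p:
                    nxt[j] = True
    state = nxt
print(f"activated agents after {steps} steps: {sum(state)}")
```

Running the model either reproduces the spreading behaviour the conceptual model implies or exposes a nonintuitive outcome that "breaks" it, which is the verification role described above.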
A Conceptual Model for Episodes of Acute, Unscheduled Care.
Pines, Jesse M; Lotrecchiano, Gaetano R; Zocchi, Mark S; Lazar, Danielle; Leedekerken, Jacob B; Margolis, Gregg S; Carr, Brendan G
2016-10-01
We engaged in a 1-year process to develop a conceptual model representing an episode of acute, unscheduled care. Acute, unscheduled care includes acute illnesses (eg, nausea and vomiting), injuries, or exacerbations of chronic conditions (eg, worsening dyspnea in congestive heart failure) and is delivered in emergency departments, urgent care centers, and physicians' offices, as well as through telemedicine. We began with a literature search to define an acute episode of care and to identify existing conceptual models used in health care. In accordance with this information, we then drafted a preliminary conceptual model and collected stakeholder feedback, using online focus groups and concept mapping. Two technical expert panels reviewed the draft model, examined the stakeholder feedback, and discussed ways the model could be improved. After integrating the experts' comments, we solicited public comment on the model and made final revisions. The final conceptual model includes social and individual determinants of health that influence the incidence of acute illness and injury, factors that affect care-seeking decisions, specific delivery settings where acute care is provided, and outcomes and costs associated with the acute care system. We end with recommendations for how researchers, policymakers, payers, patients, and providers can use the model to identify and prioritize ways to improve acute care delivery. Copyright © 2016 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
Conceptual design study of the moderate size superconducting spherical tokamak power plant
NASA Astrophysics Data System (ADS)
Gi, Keii; Ono, Yasushi; Nakamura, Makoto; Someya, Youji; Utoh, Hiroyasu; Tobita, Kenji; Ono, Masayuki
2015-06-01
A new conceptual design of a superconducting spherical tokamak (ST) power plant was proposed as an attractive choice for tokamak fusion reactors. We reassessed the feasibility of the ST as a power plant using the conservative reactor engineering constraints often used for conventional tokamak reactor design. An extensive parameter scan covering the full range of feasible superconducting ST reactors was completed, and five constraints, including plasma magnetohydrodynamic (MHD) and confinement parameters already achieved in ST experiments, were established for the purpose of choosing the optimum operation point. Based on comparison with the estimated future energy costs of electricity (COEs) in Japan, cost-effective ST reactors can be designed if their COEs are smaller than 120 mills kW-1 h-1 (2013). We selected the optimized design point: A = 2.0 and Rp = 5.4 m, after considering the maintenance scheme and TF ripple. A self-consistent free-boundary MHD equilibrium and poloidal field coil configuration of the ST reactor were designed by modifying the neutral beam injection system and plasma profiles. The MHD stability of the equilibrium was analysed, and a ramp-up scenario was considered to support the new ST design. The optimized moderate-size ST power plant conceptual design achieves realistic plasma and fusion engineering parameters while keeping its economic competitiveness against existing energy sources in Japan.
Modeling phytoremediation of nitrogen-polluted water using water hyacinth (Eichhornia crassipes)
NASA Astrophysics Data System (ADS)
Mayo, Aloyce W.; Hanai, Emmanuel E.
2017-08-01
Water hyacinth (Eichhornia crassipes) has a great potential for purification of wastewater through physical, chemical and biological mechanisms. In an attempt to improve the quality of effluents discharged from waste stabilization ponds at the University of Dar es Salaam, a pilot plant was constructed to test the effectiveness of this plant for the transformation and removal of nitrogen. Samples of wastewater were collected and examined for water quality parameters, including pH, temperature, dissolved oxygen, and various forms of nitrogen, which were used as input parameters in a kinetic mathematical model. A conceptual model was then developed to model various processes in the system using STELLA 6.0.1 software. The results show that total nitrogen was removed by 63.9%. Denitrification contributed 73.8% of the removed nitrogen. Other dominant nitrogen removal mechanisms are net sedimentation and uptake by water hyacinth, which contributed 16.7% and 9.5% of the removed nitrogen, respectively. The model indicated that in the presence of water hyacinth biofilm, about 1.26 g N m-2 day-1 of nitrogen was removed. However, in the absence of biofilm in the water hyacinth pond, the permanent nitrogen removal was only 0.89 g N m-2 day-1. This suggests that in the absence of water hyacinth, the efficiency of nitrogen removal would decrease by 29.4%.
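A stripped-down version of such a nitrogen box model can be written with first-order pathways; in the Python sketch below (assuming SciPy), the rate constants are back-calculated from the reported removal and pathway shares over an assumed 10-day residence time, while the study's STELLA model resolves many more species and processes.

```python
import numpy as np
from scipy.integrate import solve_ivp

# First-order nitrogen pathways; per-day rate constants back-calculated from
# the reported 63.9% removal and pathway shares, assuming a 10-day residence
# time (illustrative only).
k_denit, k_sed, k_uptake = 0.0752, 0.0170, 0.0097

def dN(t, y):
    # Total nitrogen decays through the three parallel removal pathways
    return [-(k_denit + k_sed + k_uptake) * y[0]]

N_in = 20.0                                # influent total N, mg/L (assumed)
sol = solve_ivp(dN, [0, 10], [N_in])       # integrate over the residence time
N_out = sol.y[0, -1]
print(f"removal = {1 - N_out / N_in:.1%}")                                 # ~63.9%
print(f"denitrification share = {k_denit / (k_denit + k_sed + k_uptake):.1%}")  # ~73.8%
```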
Pros, Cons, and Alternatives to Weight Based Cost Estimating
NASA Technical Reports Server (NTRS)
Joyner, Claude R.; Lauriem, Jonathan R.; Levack, Daniel H.; Zapata, Edgar
2011-01-01
Many cost estimating tools use weight as a major parameter in projecting cost. This is often combined with modifying factors such as complexity, technical maturity of the design, environment of operation, etc., to increase the fidelity of the estimate. For a set of conceptual designs, all meeting the same requirements, increased weight can be a major driver of increased cost. However, once a design is fixed, increased weight generally decreases cost, while decreased weight generally increases cost - and the relationship is not linear. Alternative approaches to estimating cost without using weight (except perhaps for materials costs) have been attempted to try to produce a tool usable throughout the design process - from concept studies through development. This paper will address the pros and cons of using weight-based models for cost estimating, using liquid rocket engines as the example. It will then examine approaches that minimize the impact of weight-based cost estimating. The Rocket Engine Cost Model (RECM) is an attribute-based model developed internally by Pratt & Whitney Rocketdyne for NASA. RECM will be presented primarily to show a successful method of using design and programmatic parameters instead of weight to estimate both design and development costs and production costs. An operations model developed by KSC, the Launch and Landing Effects Ground Operations model (LLEGO), will also be discussed.
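A generic weight-based cost estimating relationship (CER) has the power-law form cost = a * W^b times modifying factors. The Python sketch below shows why such a CER behaves sensibly across competing concepts but can mislead within a fixed design; the coefficients are arbitrary placeholders, not RECM or NASA values.

```python
# Generic weight-based CER: cost = a * W^b * f_complexity * f_maturity.
# Coefficients and factors are illustrative placeholders only.
def engine_dev_cost(dry_weight_kg, a=0.15, b=0.55,
                    complexity=1.0, maturity=1.0):
    """Development cost in $M; a, b, and the factors are assumed."""
    return a * dry_weight_kg**b * complexity * maturity

# Across concepts meeting the same requirements: heavier concept costs more
print(engine_dev_cost(3000.0), engine_dev_cost(3500.0))

# Within a fixed design, shaving weight usually *adds* cost (exotic materials,
# tighter margins), which a pure weight CER gets backwards; an attribute-based
# model captures this with non-weight drivers instead.
```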
Implications of conceptual channel representation on SWAT streamflow and sediment modeling
USDA-ARS?s Scientific Manuscript database
Hydrologic modeling outputs are influenced by how a watershed system is represented. Channel routing is a typical example of the mathematical conceptualization of watershed landscape and processes in hydrologic modeling. We investigated the sensitivity of accuracy, equifinality, and uncertainty of...
SITE CHARACTERIZATION TO SUPPORT CONCEPTUAL MODEL DEVELOPMENT FOR SUBSURFACE RADIONUCLIDE TRANSPORT
Remediation of radionuclide contaminants in ground water often begins with the development of conceptual and analytical models that guide our understanding of the processes controlling radionuclide transport. The reliability of these models is often predicated on the collection o...
Tutorial: Parallel Computing of Simulation Models for Risk Analysis.
Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D
2016-10-01
Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
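The article's worked examples are in MATLAB and R; the following minimal Python sketch shows the same embarrassingly parallel pattern, farming independent Monte Carlo replications out to worker processes and summarizing the output uncertainty.

    # Embarrassingly parallel Monte Carlo in the spirit of the tutorial.
    # The toy "simulation" stands in for an expensive risk-analysis model.
    import random
    from multiprocessing import Pool

    def one_replication(seed: int) -> float:
        """One independent model run; here a toy risk metric."""
        rng = random.Random(seed)
        # Stand-in for an expensive simulation: sum of random shocks
        return sum(rng.gauss(0.0, 1.0) for _ in range(10_000))

    if __name__ == "__main__":
        seeds = range(1_000)           # one seed per independent replication
        with Pool() as pool:           # defaults to all available cores
            results = pool.map(one_replication, seeds)
        # Quantify output uncertainty from the replications
        results.sort()
        print("5th/95th percentiles:",
              results[len(results) // 20], results[-len(results) // 20])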
A beginner's guide to writing the nursing conceptual model-based theoretical rationale.
Gigliotti, Eileen; Manister, Nancy N
2012-10-01
Writing the theoretical rationale for a study can be a daunting prospect for novice researchers. Nursing's conceptual models provide excellent frameworks for placement of study variables, but moving from the very abstract concepts of the nursing model to the less abstract concepts of the study variables is difficult. Similar to the five-paragraph essay used by writing teachers to assist beginning writers to construct a logical thesis, the authors of this column present guidelines that beginners can follow to construct their theoretical rationale. This guide can be used with any nursing conceptual model but Neuman's model was chosen here as the exemplar.
NASA Astrophysics Data System (ADS)
Kim, Y.; Suk, H.
2011-12-01
In this study, data from about 2,000 deep observation wells, together with stream and river distribution and density, were analyzed to identify regional groundwater flow trends, based on regional groundwater surveys of four major river watersheds in Korea: the Geum, Han, Youngsan-Seomjin, and Nakdong rivers. Hydrogeological data were collected to analyze regional groundwater flow characteristics according to geological units. Hydrological soil type data were collected to estimate direct runoff with the SCS-CN method, and temperature and precipitation data were used to quantify evaporation by the Thornthwaite method and to evaluate infiltration and groundwater recharge. Understanding regional groundwater characteristics requires a database of groundwater flow parameters, but most hydrogeological records contain only limited information, such as groundwater levels and well configurations. In this study, therefore, groundwater flow parameters such as hydraulic conductivities and transmissivities were estimated from observed groundwater levels by inverse modeling with PEST (Non-linear Parameter ESTimation). Because groundwater modeling involves uncertainties in data collection, conceptualization, and model results, model calibration must be performed; calibration can proceed manually, by changing parameters step by step, or automatically, with many parameters adjusted simultaneously by the PEST program. Here, both manual and automatic procedures were employed to calibrate and estimate hydraulic parameter distributions. In summary, regional groundwater survey data from the four major river watersheds and various hydrological, meteorological, geological, soil, and topographic data for Korea were used to estimate hydraulic conductivities with PEST. To estimate hydraulic conductivity effectively, it is important to group areas of the same or similar hydrogeological characteristics into zones. Keywords: regional groundwater, database, hydraulic conductivity, PEST, Korean peninsula. Acknowledgements: This work was supported by the Radioactive Waste Management program of the Korea Institute of Energy Technology Evaluation and Planning (KETEP), funded by the Korea government Ministry of Knowledge Economy (2011T100200152).
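The inverse-estimation idea behind this kind of PEST calibration can be illustrated with a minimal sketch: adjust zoned hydraulic conductivities until simulated heads match observations in a least-squares sense. The forward model, observed heads, and zonation below are toy stand-ins, not the study's actual groundwater model or PEST itself.

    # Sketch of the inverse-estimation idea behind PEST-style calibration:
    # tune zoned conductivities until simulated heads fit observed heads.
    import numpy as np
    from scipy.optimize import least_squares

    obs_heads = np.array([52.1, 48.7, 45.2, 41.9])  # hypothetical observed heads (m)

    def simulate_heads(log10_K):
        """Toy forward model: heads decline faster through low-K zones."""
        K = 10.0 ** np.asarray(log10_K)   # conductivity per zone (m/day)
        gradient = 1.0 / K                # crude head-loss proxy per zone
        return 55.0 - np.cumsum(gradient)

    def residuals(log10_K):
        return simulate_heads(log10_K) - obs_heads

    # Calibrate in log space, as is standard for conductivities
    fit = least_squares(residuals, x0=np.zeros(4), bounds=(-3, 3))
    print("Estimated K per zone (m/day):", 10.0 ** fit.x)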
Triad Issue Paper: Using Geophysical Tools to Develop the Conceptual Site Model
This technology bulletin explains how hazardous-waste site professionals can use geophysical tools to provide information about subsurface conditions to create a more representative conceptual site model (CSM).
Supporting user-defined granularities in a spatiotemporal conceptual model
Khatri, V.; Ram, S.; Snodgrass, R.T.; O'Brien, G. M.
2002-01-01
Granularities are integral to spatial and temporal data. A large number of applications require storage of facts along with their temporal and spatial context, which needs to be expressed in terms of appropriate granularities. For many real-world applications, a single granularity in the database is insufficient. In order to support any type of spatial or temporal reasoning, the semantics related to granularities needs to be embedded in the database. Specifying granularities related to facts is an important part of conceptual database design because under-specifying the granularity can restrict an application, affect the relative ordering of events and impact the topological relationships. Closely related to granularities is indeterminacy, i.e., an occurrence time or location associated with a fact that is not known exactly. In this paper, we present an ontology for spatial granularities that is a natural analog of temporal granularities. We propose an upward-compatible, annotation-based spatiotemporal conceptual model that can comprehensively capture the semantics related to spatial and temporal granularities, and indeterminacy without requiring new spatiotemporal constructs. We specify the formal semantics of this spatiotemporal conceptual model via translation to a conventional conceptual model. To underscore the practical focus of our approach, we describe an on-going case study. We apply our approach to a hydrogeologic application at the United States Geologic Survey and demonstrate that our proposed granularity-based spatiotemporal conceptual model is straightforward to use and is comprehensive.
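A minimal sketch of how granularity and indeterminacy annotations might attach to a stored fact is given below; the names and structure are hypothetical illustrations, not the authors' notation.

    # Hypothetical annotation-based representation of temporal granularity
    # and indeterminacy, in the spirit of the paper's conceptual model.
    from dataclasses import dataclass
    from enum import Enum

    class Granularity(Enum):
        DAY = "day"
        MONTH = "month"
        YEAR = "year"

    @dataclass
    class TemporalFact:
        value: str
        valid_time: str              # stored at the declared granularity
        granularity: Granularity
        indeterminate: bool = False  # True if the time is not known exactly

    # A hydrogeologic reading whose date is only known to the month
    reading = TemporalFact("water level = 12.3 m", "1998-06",
                           Granularity.MONTH, indeterminate=True)
    print(reading)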
A Conceptual Model for Analysing Management Development in the UK Hospitality Industry
ERIC Educational Resources Information Center
Watson, Sandra
2007-01-01
This paper presents a conceptual, contingent model of management development (MD). It explains the nature of the UK hospitality industry and its potential influence on MD practices before exploring the dimensions and relationships in the model. The embryonic model is presented as one that can enhance our understanding of the complexities of the…
ERIC Educational Resources Information Center
Tarhini, Ali; Elyas, Tariq; Akour, Mohammad Ali; Al-Salti, Zahran
2016-01-01
The main aim of this paper is to develop an amalgamated conceptual model of technology acceptance that explains how individual, social, cultural and organizational factors affect the students' acceptance and usage behaviour of the Web-based learning systems. More specifically, the proposed model extends the Technology Acceptance Model (TAM) to…
Validation of Groundwater Models: Meaningful or Meaningless?
NASA Astrophysics Data System (ADS)
Konikow, L. F.
2003-12-01
Although numerical simulation models are valuable tools for analyzing groundwater systems, their predictive accuracy is limited. People who apply groundwater flow or solute-transport models, as well as those who make decisions based on model results, naturally want assurance that a model is "valid." To many people, model validation implies some authentication of the truth or accuracy of the model. History matching is often presented as the basis for model validation. Although such model calibration is a necessary modeling step, it is simply insufficient for model validation. Because of parameter uncertainty and solution non-uniqueness, declarations of validation (or verification) of a model are not meaningful. Post-audits represent a useful means to assess the predictive accuracy of a site-specific model, but they require the existence of long-term monitoring data. Model testing may yield invalidation, but that is an opportunity to learn and to improve the conceptual and numerical models. Examples of post-audits and of the application of a solute-transport model to a radioactive waste disposal site illustrate deficiencies in model calibration, prediction, and validation.
[Impact of small-area context on health: proposing a conceptual model].
Voigtländer, S; Mielck, A; Razum, O
2012-11-01
Recent empirical studies stress the impact of features related to the small-area context on individual health. However, so far there exists no standard explanatory model that integrates the different kinds of such features and that conceptualises their relation to individual characteristics of social inequality. A review of theoretical publications on the relationship between social position and health as well as existing conceptual models for the impact of features related to the small-area context on health was undertaken. In the present article we propose a conceptual model for the health impact of the small-area context. This model conceptualises the location of residence as one dimension of social inequality that affects health through the resources as well as stressors which are inherent in the small-area context. The proposed conceptual model offers an orientation for future empirical studies and can serve as a basis for further discussions concerning the health relevance of the small-area context. © Georg Thieme Verlag KG Stuttgart · New York.
Benchmarking hydrological model predictive capability for UK River flows and flood peaks.
NASA Astrophysics Data System (ADS)
Lane, Rosanna; Coxon, Gemma; Freer, Jim; Wagener, Thorsten
2017-04-01
Data and hydrological models are now available for national-scale hydrological analyses. However, hydrological model performance varies between catchments, and lumped conceptual models cannot produce adequate simulations everywhere. This study benchmarks hydrological model performance for catchments across the United Kingdom within an uncertainty analysis framework. We applied four hydrological models from the FUSE framework to 1128 catchments across the UK. These models are all lumped and run at a daily timestep, but they differ in structural architecture and process parameterisations, producing different but equally plausible simulations. We applied FUSE over the 20-year period 1988-2008 within a GLUE Monte Carlo uncertainty analysis framework. Model performance was evaluated for each catchment, model structure, and parameter set using standard performance metrics, calculated both for the whole time series and seasonally to assess seasonal differences in performance. The GLUE uncertainty analysis framework was then applied to produce simulated 5th and 95th percentile uncertainty bounds for the daily flow time series and, additionally, annual maximum prediction bounds for each catchment. The results show that model performance varies significantly in space and time, depending on catchment characteristics including climate, geology, and human impact. We identify regions where models systematically fail to produce good results and present possible reasons. We also identify regions or catchment characteristics where one model performs better than others, and explore which structural components or parameterisations enable certain models to produce better simulations in these catchments. Predictive capability was assessed for each catchment by examining the ability of the models to produce discharge prediction bounds that successfully bracket the observed discharge. These results improve our understanding of the predictive capability of simple conceptual hydrological models across the UK and help identify where further effort is needed to develop modelling approaches that better represent different catchment and climate typologies.
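A minimal sketch of the GLUE bounding step described above: given an ensemble of simulations and a likelihood measure for each parameter set, retain the behavioural sets and compute weighted 5th/95th percentile bounds per timestep. The ensemble, likelihoods, and threshold below are invented placeholders.

    # GLUE-style prediction bounds, assuming behavioural parameter sets are
    # identified by a likelihood threshold (e.g. an NSE-based measure).
    import numpy as np

    n_sets, n_days = 500, 365
    rng = np.random.default_rng(42)
    simulated_flows = rng.lognormal(0.0, 0.5, size=(n_sets, n_days))  # stand-in runs
    likelihoods = rng.uniform(0.0, 1.0, size=n_sets)                  # placeholder

    # Retain behavioural sets above a threshold and normalise their weights
    behavioural = likelihoods > 0.3
    weights = likelihoods[behavioural] / likelihoods[behavioural].sum()
    flows = simulated_flows[behavioural]

    def weighted_quantile(values, q, w):
        order = np.argsort(values)
        cum = np.cumsum(w[order])
        return values[order][np.searchsorted(cum, q)]

    bounds = np.array([[weighted_quantile(flows[:, t], q, weights)
                        for q in (0.05, 0.95)] for t in range(n_days)])
    print("Day 0 5th/95th percentile bounds:", bounds[0])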
Technology needs of advanced Earth observation spacecraft
NASA Technical Reports Server (NTRS)
Herbert, J. J.; Postuchow, J. R.; Schartel, W. A.
1984-01-01
Remote sensing missions were synthesized which could contribute significantly to the understanding of global environmental parameters. Instruments capable of sensing important land and sea parameters are combined with a large antenna designed to passively quantify surface emitted radiation at several wavelengths. A conceptual design for this large deployable antenna was developed. All subsystems required to make the antenna an autonomous spacecraft were conceptually designed. The entire package, including necessary orbit transfer propulsion, is folded to package within the Space Transportation System (STS) cargo bay. After separation, the antenna, its integral feed mast, radiometer receivers, power system, and other instruments are automatically deployed and transferred to the operational orbit. The design resulted in an antenna with a major antenna dimension of 120 meters, weighing 7650 kilograms, and operating at an altitude of 700 kilometers.
The development of conceptual and predictive models is an important tool to guide site characterization in support of monitoring contaminants in ground water. The accuracy of predictive models is limited by the adequacy of the input data and the assumptions made to constrain mod...
Thoughts About Nursing Conceptual Models and the "Medical Model".
Fawcett, Jacqueline
2017-01-01
This essay, written to celebrate the 30th anniversary of Nursing Science Quarterly, focuses on the distinctions between the discipline of nursology and the trade of medicine. The distinctions are drawn from content found in nursing conceptual models and from literature about the elusive content of the so-called "medical model."
Models of borderline personality disorder: recent advances and new perspectives.
D'Agostino, Alessandra; Rossi Monti, Mario; Starcevic, Vladan
2018-01-01
The purpose of this article is to review the most relevant conceptual models of borderline personality disorder (BPD), with a focus on recent developments in this area. Several conceptual models have been proposed with the aim of better understanding BPD: the borderline personality organization, emotion dysregulation, reflective (mentalization) dysfunction, interpersonal hypersensitivity and hyperbolic temperament models. These models have all been supported to some extent and their common components include disorganized attachment and traumatic early experiences, emotion dysregulation, interpersonal sensitivity and difficulties with social cognition. An attempt to integrate some components of the conceptual models of BPD has resulted in an emerging new perspective, the interpersonal dysphoria model, which emphasizes dysphoria as an overarching phenomenon that connects the dispositional and situational aspects of BPD. Various conceptual models have expanded our understanding of BPD, but it appears that further development entails theoretical integration. More research is needed to better understand interactions between various components of BPD, including the situational factors that activate symptoms of BPD. This will help develop therapeutic approaches that are more tailored to the heterogeneous psychopathology of BPD.
Conceptual ecological models to guide integrated landscape monitoring of the Great Basin
Miller, D.M.; Finn, S.P.; Woodward, Andrea; Torregrosa, Alicia; Miller, M.E.; Bedford, D.R.; Brasher, A.M.
2010-01-01
The Great Basin Integrated Landscape Monitoring Pilot Project was developed in response to the need for a monitoring and predictive capability that addresses changes in broad landscapes and waterscapes. Human communities and needs are nested within landscapes formed by interactions among the hydrosphere, geosphere, and biosphere. Understanding the complex processes that shape landscapes and deriving ways to manage them sustainably while meeting human needs require sophisticated modeling and monitoring. This document summarizes current understanding of ecosystem structure and function for many of the ecosystems within the Great Basin using conceptual models. The conceptual ecosystem models identify key ecological components and processes, identify external drivers, develop a hierarchical set of models that address both site and landscape attributes, inform regional monitoring strategy, and identify critical gaps in our knowledge of ecosystem function. The report also illustrates an approach for temporal and spatial scaling from site-specific models to landscape models and for understanding cumulative effects. Eventually, conceptual models can provide a structure for designing monitoring programs, interpreting monitoring and other data, and assessing the accuracy of our understanding of ecosystem functions and processes.
[Design of a conceptual model on the transference of public health research results in Honduras].
Macías-Chapula, César A
2012-01-01
To design a conceptual model of the transfer of public health research results at the local, context level. Using systems thinking concepts, soft systems methodology (SSM) was applied to analyse and address what was perceived as a problem situation in the transfer of research results within the Honduran public health system. A bibliometric analysis was also conducted to enrich the problem situation. Six root definitions were defined and modeled as relevant to the expressed problem situation, which led to the development of the conceptual model. The model identified four levels of resolution derived from the human activities involved in the transfer of research results: 1) the researchers; 2) the information/documentation professionals; 3) health staff; and 4) the population/society. These actors/clients and their activities are essential to the functioning of the model, since they represent what the model is and does. SSM helped to design the conceptual model, and the bibliometric analysis was relevant to constructing the rich picture of the problem situation.
NASA Astrophysics Data System (ADS)
Liu, D.; Tian, F.; Lin, M.; Sivapalan, M.
2015-02-01
The complex interactions and feedbacks between humans and water are critically important issues but remain poorly understood in the newly proposed discipline of socio-hydrology (Sivapalan et al., 2012). An exploratory model with the appropriate level of simplification can be valuable for improving our understanding of the co-evolution and self-organization of socio-hydrological systems driven by interactions and feedbacks operating at different scales. In this study, a simplified conceptual socio-hydrological model based on logistic growth curves is developed for the Tarim River basin in western China and is used to illustrate the explanatory power of such a co-evolutionary model. The study area is the main stream of the Tarim River, which is divided into two modeling units. The socio-hydrological system is composed of four sub-systems, i.e., the hydrological, ecological, economic, and social sub-systems. In each modeling unit, the hydrological equation focusing on water balance is coupled to the other three evolutionary equations to represent the dynamics of the social sub-system (denoted by population), the economic sub-system (denoted by irrigated crop area ratio), and the ecological sub-system (denoted by natural vegetation cover), each of which is expressed in terms of a logistic growth curve. Four feedback loops are identified to represent the complex interactions among different sub-systems and different spatial units, of which two are inner loops occurring within each separate unit and the other two are outer loops linking the two modeling units. The feedback mechanisms are incorporated into the constitutive relations for model parameters, i.e., the colonization and mortality rates in the logistic growth curves that are jointly determined by the state variables of all sub-systems. The co-evolution of the Tarim socio-hydrological system is then analyzed with this conceptual model to gain insights into the overall system dynamics and its sensitivity to the external drivers and internal system variables. The results show a costly pendulum swing between a balanced distribution of socio-economic and natural ecologic resources among the upper and lower reaches and a highly skewed distribution towards the upper reach. This evolution is principally driven by the attitudinal changes occurring within water resources management policies that reflect the evolving community awareness of society to concerns regarding the ecology and environment.
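The coupled logistic structure described above can be sketched as a small ODE system; the coupling forms, rates, and carrying capacities below are illustrative placeholders, not the paper's calibrated Tarim parameters.

    # Sketch of the modelling idea: sub-system state variables that each
    # follow a logistic growth curve, with rates coupled across sub-systems.
    import numpy as np
    from scipy.integrate import solve_ivp

    def socio_hydro(t, y):
        pop, crop, veg = y          # population, irrigated-area ratio, vegetation
        water = 1.0 - 0.5 * crop    # toy water-balance feedback from irrigation
        # Logistic form dx/dt = r*x*(1 - x/K); r modulated by other sub-systems
        dpop = 0.03 * water * pop * (1 - pop / 1.0)
        dcrop = 0.05 * pop * crop * (1 - crop / 0.8)
        dveg = 0.04 * water * veg * (1 - veg / 1.0) - 0.02 * crop * veg
        return [dpop, dcrop, dveg]

    sol = solve_ivp(socio_hydro, (0, 200), y0=[0.1, 0.1, 0.6])
    print("Final state (pop, crop, veg):", sol.y[:, -1])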
NASA Astrophysics Data System (ADS)
Cullis, J. D.; Gillis, C.; Bothwell, M.; Kilroy, C.; Packman, A. I.; Hassan, M. A.
2010-12-01
The nuisance diatom Didymosphenia geminata (didymo) presents an ecological paradox: how can this benthic alga produce such large amounts of biomass in cold, fast-flowing, low-nutrient streams? The aim of this paper is to present a conceptual model for the growth, persistence, and blooming behavior of this benthic mat-forming diatom that may help to explain this paradox. The conceptual model highlights the importance of distinguishing between mat thickness and cell growth. It presents evidence gathered from a range of existing studies around the world to support the proposed relationships between growth and light, nutrients, and temperature, as well as the importance of flood events and bed disturbance in mat removal. It is anticipated that this conceptual model will not only help in identifying the key controlling variables and set a framework for future studies but also support the future management of this nuisance alga. [Figure caption: Summary of the conceptual model for didymo growth, showing the proposed relationships for the growth of cells and mats with nutrients, radiation, and water temperature, and the dependence of removal on bed shear stress and the potential for physical bed disturbance.]
Looking at the ICF and human communication through the lens of classification theory.
Walsh, Regina
2011-08-01
This paper explores the insights that classification theory can provide about the application of the International Classification of Functioning, Disability and Health (ICF) to communication. It first considers the relationship between conceptual models and classification systems, highlighting that classification systems in speech-language pathology (SLP) have not historically been based on conceptual models of human communication. It then overviews the key concepts and criteria of classification theory. Applying classification theory to the ICF and communication raises a number of issues, some previously highlighted through clinical application. Six focus questions from classification theory are used to explore these issues, and to propose the creation of an ICF-related conceptual model of communicating for the field of communication disability, which would address some of the issues raised. Developing a conceptual model of communication for SLP purposes closely articulated with the ICF would foster productive intra-professional discourse, while at the same time allow the profession to continue to use the ICF for purposes in inter-disciplinary discourse. The paper concludes by suggesting the insights of classification theory can assist professionals to apply the ICF to communication with the necessary rigour, and to work further in developing a conceptual model of human communication.
Simple and detailed conceptual model diagram and associated narrative for ammonia, dissolved oxygen, flow alteration, herbicides, insecticides, ionic strength, metals, nutrients, pH, physical habitat, sediments, temperature, and unspecified toxic chemicals.
Fulton, John W.; Koerkle, Edward H.; McAuley, Steven D.; Hoffman, Scott A.; Zarr, Linda F.
2005-01-01
The Spring Creek Basin, Centre County, Pa., is experiencing some of the most rapid growth and development within the Commonwealth. This trend has resulted in land-use changes and increased water use, which will affect the quantity and quality of stormwater runoff, surface water, ground water, and aquatic resources within the basin. The U.S. Geological Survey (USGS), in cooperation with the ClearWater Conservancy (CWC), Spring Creek Watershed Community (SCWC), and Spring Creek Watershed Commission (SCWCm), has developed a Watershed Plan (Plan) to assist decision makers in water-resources planning. One element of the Plan is to summarize the basin characteristics and provide a conceptual model that incorporates the hydrogeologic characteristics of the basin. The report presents hydrogeologic data for the basin and a conceptual model that can serve as the basis for simulating surface-water and ground-water flow within the basin. Basin characteristics; sources of data referenced in this text; physical characteristics such as climate, physiography, topography, and land use; hydrogeologic characteristics; and water-quality characteristics are discussed. A conceptual model is a simplified description of the physical components and interactions of the surface- and ground-water systems. The purpose of constructing a conceptual model is to simplify the problem and to organize the available data so that the system can be analyzed accurately; simplification is necessary because a complete accounting of a system such as Spring Creek is not possible. The data and the conceptual model could be used to develop a fully coupled numerical model that dynamically links surface water, ground water, and land-use changes, which decision makers could use to manage water resources within the basin and as a prototype transferable to other watersheds.
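As a minimal illustration of the bookkeeping a conceptual model organizes before any numerical simulation, the sketch below closes a simple annual water budget; all values are hypothetical, not Spring Creek data.

    # Minimal annual water-budget bookkeeping of the kind a conceptual
    # model organizes before numerical simulation. Values are hypothetical.
    precip_in = 40.0           # inches/yr of precipitation
    evapotranspiration = 22.0  # inches/yr lost to ET
    runoff = 6.0               # inches/yr leaving as direct runoff

    recharge = precip_in - evapotranspiration - runoff  # closes the balance
    print(f"Implied ground-water recharge: {recharge:.1f} in/yr")  # 12.0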
A Conceptual Framework for Curriculum Evaluation in Electrical Engineering Education
ERIC Educational Resources Information Center
Imansari, Nurulita; Sutadji, Eddy
2017-01-01
This paper presents a conceptual framework, analyzed in the hope that it can help research related to curriculum evaluation. The evaluation model used was the CIPPO model, which consists of "context," "input," "process," "product," and "outcomes." On the dimension of the…
NASA Astrophysics Data System (ADS)
Osezua Aikhuele, Daniel; Mohd Turan, Faiz
2016-02-01
Instability in today's market and emerging customer demands for mass-customized products are driving companies to seek cost-effective and time-efficient improvements in their production systems, and this has created real pressure to adopt new development architectures and operational parameters in order to remain competitive. Among the development architectures adopted is the integration of lean thinking into the product development process. However, due to a lack of clear understanding of lean performance and its measurement, many companies are unable to implement and fully integrate the lean principle into their product development process. Without proper performance measurement, the performance level of the organizational value stream remains unknown and the specific areas of improvement related to the LPD program cannot be tracked, resulting in poor decision making in the LPD implementation. This paper therefore presents a conceptual model for evaluating LPD performance by identifying and analysing the core existing LPD enablers (chief engineer, cross-functional teams, set-based engineering, poka-yoke (mistake-proofing), knowledge-based environment, value-focused planning and development, top management support, technology, supplier integration, workforce commitment, and continuous improvement culture) for assessing LPD performance.
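One simple way such enabler assessments could be aggregated is a weighted scoring index, as in the hypothetical sketch below; the enabler list follows the abstract, but the weights and scores are invented for illustration and are not the paper's method.

    # Hypothetical weighted-scoring aggregation of LPD enabler assessments.
    enabler_scores = {                 # scores 0-10 from an assessment team
        "Chief Engineer": 7, "Cross-functional teams": 8,
        "Set-based engineering": 5, "Poka-yoke": 6,
        "Knowledge-based environment": 7, "Value-focused planning": 6,
        "Top management support": 9, "Technology": 7,
        "Supplier integration": 4, "Workforce commitment": 8,
        "Continuous improvement culture": 6,
    }
    weights = {name: 1.0 for name in enabler_scores}  # equal weights as default

    index = (sum(weights[n] * s for n, s in enabler_scores.items())
             / sum(weights.values()))
    print(f"LPD performance index: {index:.2f} / 10")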
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takeuchi, Shinji; Takeuchi, Ryuji; Salden, Walter
2007-07-01
A hydrogeological conceptual model has been developed based on pressure responses observed at multilevel pressure monitoring zones in seven boreholes and on surface tilt data in and around the Mizunami Underground Research Laboratory site. Pressure changes caused by earthquakes, cross-hole hydraulic testing, and shaft excavation activities are considered. Surface tilt has been measured since midway through the shaft excavation phase. Excavation of two shafts (Main shaft and Ventilation shaft) commenced in July 2003. By the end of October 2005, discharge of water from the shafts had been halted at depths of 172 m and 191 m, respectively, to allow modifications to the water treatment facility because of excess F and B concentrations in the water; this resulted in recovery of the groundwater levels and flooding of the underground workings. In February 2006 pumping resumed and the underground workings were re-occupied. Continuous groundwater pressure and surface tilt measurements, together with numerical analysis during the shaft excavation phase, confirm the existence of the flow-barrier fault predicted from the surface-based investigation phase and constrain the hydraulic parameters around the shafts. (authors)
IDEAS: A multidisciplinary computer-aided conceptual design system for spacecraft
NASA Technical Reports Server (NTRS)
Ferebee, M. J., Jr.
1984-01-01
During the conceptual development of advanced aerospace vehicles, many compromises must be considered to balance economy and performance of the total system. Subsystem tradeoffs may need to be made in order to satisfy system-sensitive attributes. Due to the increasingly complex nature of aerospace systems, these trade studies have become more difficult and time-consuming to complete and involve interactions of ever-larger numbers of subsystems, components, and performance parameters. Current advances in computer-aided synthesis, modeling, and analysis techniques have greatly helped in the evaluation of competing design concepts. Langley Research Center's Space Systems Division is currently engaged in trade studies for a variety of systems, including advanced ground-launched space transportation systems, space-based orbital transfer vehicles, large space antenna concepts, and space stations. The need for engineering analysis tools to aid in the rapid synthesis and evaluation of spacecraft has led to the development of the Interactive Design and Evaluation of Advanced Spacecraft (IDEAS) computer-aided design system. The IDEAS system has been used to perform trade studies of competing technologies and requirements in order to pinpoint possible beneficial areas for research and development. IDEAS is presented as a multidisciplinary tool for the analysis of advanced space systems, with capabilities ranging from model generation and structural and thermal analysis to subsystem synthesis and performance analysis.
Thermal management of LEDs: package to system
NASA Astrophysics Data System (ADS)
Arik, Mehmet; Becker, Charles A.; Weaver, Stanton E.; Petroski, James
2004-01-01
Light-emitting diodes (LEDs) have historically been used as indicators and produced little heat. The introduction of high-brightness LEDs producing white light and monochromatic colors has led to a movement towards general illumination. The increased electrical currents used to drive the LEDs have focused more attention on the thermal paths in the development of LED power packaging. The luminous efficiency of LEDs is soon expected to exceed 80 lumens/W, approximately six times the efficiency of a conventional incandescent tungsten bulb. Thermal management for solid-state lighting applications is a key design parameter at both the package and system levels, and the two are discussed in separate sections. The effect of chip packaging on junction-to-board thermal resistance was compared for SiC and sapphire chips. The higher thermal conductivity of the SiC chip provided about two times better thermal performance; only an under-filled sapphire chip package approaches the SiC chip's performance. System-level thermal management was then studied with numerical models established for a conceptual solid-state lighting system: a conceptual LED illumination system was chosen, and CFD models were created to determine the availability and limitations of passive air cooling.
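Package-versus-system comparisons of this kind rest on first-order series thermal-resistance calculations, as sketched below; the resistances, power, and ambient temperature are illustrative values, not the paper's measurements.

    # Series thermal-resistance estimate of LED junction temperature.
    # All values are illustrative placeholders.
    power_w = 3.0            # electrical power dissipated as heat (W)
    r_junction_board = 8.0   # junction-to-board resistance (K/W), package
    r_board_ambient = 25.0   # board-to-ambient resistance (K/W), passive cooling
    t_ambient = 25.0         # ambient temperature (C)

    t_junction = t_ambient + power_w * (r_junction_board + r_board_ambient)
    print(f"Junction temperature: {t_junction:.0f} C")  # 124 C here

    # Halving the package resistance (e.g. a higher-conductivity chip)
    # lowers the junction temperature proportionally:
    t_better = t_ambient + power_w * (r_junction_board / 2 + r_board_ambient)
    print(f"With 2x better package: {t_better:.0f} C")  # 112 C here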
A model-averaging method for assessing groundwater conceptual model uncertainty.
Ye, Ming; Pohlmann, Karl F; Chapman, Jenny B; Pohll, Greg M; Reeves, Donald M
2010-01-01
This study evaluates alternative groundwater models with different recharge and geologic components at the northern Yucca Flat area of the Death Valley Regional Flow System (DVRFS), USA. Recharge over the DVRFS has been estimated using five methods, and five geological interpretations are available for the northern Yucca Flat area. Combining the recharge and geological components with additional modeling components that represent other hydrogeological conditions yields a total of 25 groundwater flow models. As all the models are plausible given available data and information, evaluating model uncertainty becomes inevitable. In addition, hydraulic parameters (e.g., hydraulic conductivity) are uncertain in each model, giving rise to parametric uncertainty. Propagation of the uncertainty in the models and model parameters through groundwater modeling causes predictive uncertainty in model predictions (e.g., hydraulic head and flow). Parametric uncertainty within each model is assessed using Monte Carlo simulation, and model uncertainty is evaluated using the model-averaging method; two model-averaging techniques (based on information criteria and on GLUE) are discussed. This study shows that the contribution of model uncertainty to predictive uncertainty is significantly larger than that of parametric uncertainty. For the recharge and geological components, uncertainty in the geological interpretations has a more significant effect on model predictions than uncertainty in the recharge estimates. In addition, weighted residuals vary more between the different geological models than between the different recharge models. Most of the calibration observations are not important for discriminating between the alternative models, because their weighted residuals vary only slightly from one model to another.
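The information-criterion averaging technique mentioned above can be sketched in a few lines: convert per-model criterion values into weights and average predictions across models. The AIC values and predictions below are invented placeholders, not the study's results.

    # Information-criterion model averaging: AIC differences -> model
    # weights -> weighted prediction and between-model variance.
    import numpy as np

    aic = np.array([210.3, 208.1, 215.7, 209.0])      # one value per model
    predictions = np.array([12.4, 11.9, 13.2, 12.1])  # e.g. predicted head (m)

    delta = aic - aic.min()          # AIC differences
    weights = np.exp(-0.5 * delta)
    weights /= weights.sum()         # model probabilities

    averaged = np.sum(weights * predictions)
    variance_between = np.sum(weights * (predictions - averaged) ** 2)
    print(f"Averaged prediction: {averaged:.2f} m")
    print(f"Between-model variance: {variance_between:.3f}")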
NASA Astrophysics Data System (ADS)
Wang, Yong; Tao, Zhengwu; Chen, Liang; Ma, Xin
2017-10-01
Carbonate reservoirs are among the most important reservoir types in the world. Because of their characteristics, horizontal wells have become a key technology for developing carbonate reservoirs efficiently. Establishing corresponding mathematical models and analyzing the transient pressure behavior of this type of well-reservoir configuration can provide a better understanding of fluid flow patterns in the formation as well as estimates of important parameters. A mathematical model for an oil-water two-phase-flow horizontal well in a triple-media carbonate reservoir, conceptualizing vugs as spherical, is presented in this article. A semi-analytical solution is obtained in the Laplace domain using source function theory, the Laplace transformation, and the superposition principle. Analysis of the transient pressure responses indicates that seven characteristic flow periods of a horizontal well in a triple-media carbonate reservoir can be identified. Parametric analysis shows that the water saturations of the matrix, vug, and fracture systems, the horizontal section length, and the horizontal well position can significantly influence the transient pressure responses. The model presented in this article can be applied to obtain important reservoir parameters by type-curve matching.
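Laplace-domain solutions of this kind are usually inverted numerically to obtain time-domain pressure responses; the sketch below implements the Stehfest algorithm, a standard choice in well-test analysis (the abstract does not state which inversion method the authors used), demonstrated on a transform with a known inverse.

    # Stehfest numerical Laplace inversion, demonstrated on F(s) = 1/s^2,
    # whose exact time-domain inverse is f(t) = t.
    from math import factorial, log

    def stehfest_invert(F, t: float, N: int = 12) -> float:
        """Approximate inverse Laplace transform of F(s) at time t (N even)."""
        ln2_t = log(2.0) / t
        total = 0.0
        for i in range(1, N + 1):
            v = 0.0
            for k in range((i + 1) // 2, min(i, N // 2) + 1):
                v += (k ** (N // 2) * factorial(2 * k)
                      / (factorial(N // 2 - k) * factorial(k)
                         * factorial(k - 1) * factorial(i - k)
                         * factorial(2 * k - i)))
            total += (-1) ** (N // 2 + i) * v * F(i * ln2_t)
        return ln2_t * total

    print(stehfest_invert(lambda s: 1.0 / s ** 2, t=3.0))  # ~3.0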
Reinventing The Design Process: Teams and Models
NASA Technical Reports Server (NTRS)
Wall, Stephen D.
1999-01-01
The future of space mission design will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return, with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels; thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.
NASA Technical Reports Server (NTRS)
Magee, J. P.; Clark, R. D.; Widdison, C. A.
1975-01-01
Conceptual design studies are summarized of tandem-rotor helicopter and tilt-rotor aircraft for a short haul transport mission in the 1985 time frame. Vertical takeoff designs of both configurations are discussed, and the impact of external noise criteria on the vehicle designs, performance, and costs are shown. A STOL design for the tilt-rotor configuration is reported, and the effect of removing the vertical takeoff design constraints on the design parameters, fuel economy, and operating cost is discussed.
Conceptual design of a piloted Mars sprint life support system
NASA Technical Reports Server (NTRS)
Cullingford, H. S.; Novara, M.
1988-01-01
This paper presents the conceptual design of a life support system sustaining a crew of six in a piloted Mars sprint. The requirements and constraints of the system are discussed along with its baseline performance parameters. An integrated operation is achieved with air, water, and waste processing and supplemental food production. The design philosophy includes maximized reliability considerations, regenerative operations, reduced expendables, and fresh harvest capability. The life support system performance will be described with characteristics of the associated physical-chemical subsystems and a greenhouse.
NASA Astrophysics Data System (ADS)
Dalla Valle, Nicolas; Wutzler, Thomas; Meyer, Stefanie; Potthast, Karin; Michalzik, Beate
2017-04-01
Dual-permeability models are widely used to simulate water fluxes and solute transport in structured soils. These models contain two spatially overlapping flow domains with different parameterizations or even entirely different conceptual descriptions of flow processes. They are usually able to capture preferential flow phenomena, but they need a large set of parameters, many of which are laborious to obtain or cannot be measured at all. Therefore, model inversions are often used to derive the necessary parameters. Although inversions require sufficient input data themselves, they can use measurements of state variables instead, which are often easier to obtain and can be monitored by automated measurement systems. In this work we show a method to estimate soil hydraulic parameters from high-frequency soil moisture time series gathered at two measurement depths by inversion of a simple one-dimensional dual-permeability model. The model uses an advection equation based on kinematic wave theory to describe flow in the fracture domain and a Richards equation for flow in the matrix domain. The soil moisture time series were measured in mesocosms during sprinkling experiments. The inversion consists of three consecutive steps. First, the parameters of the water retention function were assessed from vertical soil moisture profiles in hydraulic equilibrium, using two different exponential retention functions and the Campbell function. Second, the soil sorptivity and diffusivity functions were estimated from Boltzmann-transformed soil moisture data, which allowed calculation of the hydraulic conductivity function. Third, the parameters governing flow in the fracture domain were determined using the whole soil moisture time series. The resulting retention functions were within the range of values predicted by pedotransfer functions except under very dry conditions, where all retention functions predicted lower matric potentials. The diffusivity function predicted values in a range similar to those reported in other studies. Overall, the model was able to emulate the soil moisture time series at shallow measurement depths but deviated increasingly at greater depths, indicating that some model parameters are not constant throughout the profile; overall seepage fluxes were nevertheless predicted correctly. In the near future we will apply the inversion method to lower-frequency soil moisture data from different sites to evaluate the model's ability to predict preferential flow seepage fluxes at the field scale.
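Step one of the inversion can be illustrated by fitting the Campbell retention function named above, psi = psi_e * (theta/theta_s)^(-b), to equilibrium soil-moisture data; the data points and saturated water content below are hypothetical.

    # Fitting a Campbell-type retention function to equilibrium
    # soil-moisture data. The observations are invented placeholders.
    import numpy as np
    from scipy.optimize import curve_fit

    theta_s = 0.45                      # assumed saturated water content

    def campbell(theta, psi_e, b):
        """Matric potential magnitude (kPa) from water content."""
        return psi_e * (theta / theta_s) ** (-b)

    theta_obs = np.array([0.42, 0.35, 0.28, 0.22])  # hypothetical water contents
    psi_obs = np.array([2.2, 9.5, 45.0, 190.0])     # hypothetical |psi| (kPa)

    (psi_e, b), _ = curve_fit(campbell, theta_obs, psi_obs, p0=(1.0, 4.0))
    print(f"Air-entry potential: {psi_e:.2f} kPa, Campbell b: {b:.2f}")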
Valinia, Salar; Englund, Göran; Moldan, Filip; Futter, Martyn N; Köhler, Stephan J; Bishop, Kevin; Fölster, Jens
2014-09-01
Quantifying the effects of human activity on the natural environment depends on credible estimates of reference conditions that define the state of the environment before the onset of adverse human impacts. In Europe, emission controls aimed at restoring ecological status were based on hindcasts from process-based models or paleolimnological reconstructions; for instance, 1860 is used as the reference year for restoration from acidification with respect to biological and chemical parameters. A more practical problem is that the historical states of ecosystems and their function cannot be observed directly. Therefore, we (i) compare estimates of acidification based on long-term observations of roach (Rutilus rutilus) populations with pH hindcast by the hydrogeochemical model MAGIC; (ii) discuss policy implications and the possible scope for using long-term archival data to assess human impacts on the natural environment; and (iii) present a novel conceptual model for interpreting the importance of physico-chemical and ecological deviations from reference conditions. Of the 85 lakes studied, 78 were classified coherently by both methods. In 1980, 28 lakes were classified as acidified with the MAGIC model, yet roach were present in 14 of these. In 2010, MAGIC predicted chemical recovery in 50% of the lakes, but roach recolonized only five lakes after 1990, showing a lag between chemical and biological recovery. Ours is the first study of its kind to use long-term archival biological data in concert with hydrogeochemical modeling for regional assessments of anthropogenic acidification. Based on our results, we show how the conceptual model can be used to understand and prioritize management of the physico-chemical and ecological effects of anthropogenic stressors on surface water quality. © 2014 The Authors Global Change Biology Published by John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Heimberg, Richard G.
2009-01-01
Moscovitch's (2009) model of social phobia is put forth as an integration and extension of previous cognitive-behavioral models. The author asserts that his approach overcomes a number of shortcomings of previous models and will serve to better guide case conceptualization, treatment planning, and intervention implementation for clients with…
ERIC Educational Resources Information Center
Hrepic, Zdeslav; Zollman, Dean A.; Rebello, N. Sanjay
2010-01-01
We investigated introductory physics students' mental models of sound propagation. We used a phenomenographic method to analyze the data in the study. In addition to the scientifically accepted Wave model, students used the "Entity" model to describe the propagation of sound. In this latter model sound is a self-standing entity,…